The fundamental gap of a kind of two-dimensional sub-elliptic operator
This paper is concerned with the minimization of the fundamental gap for a
class of two-dimensional degenerate sub-elliptic operators. We establish
existence results for weak solutions, a Sobolev embedding theorem, and the
spectral theory of sub-elliptic operators. We provide existence and
characterization theorems for the extremizing potentials subject to a
norm constraint.
Efficient Semi-Supervised Federated Learning for Heterogeneous Participants
Federated Learning (FL) has emerged to allow multiple clients to
collaboratively train machine learning models on their private data. However,
training and deploying large-scale models on resource-constrained clients is
challenging. Fortunately, Split Federated Learning (SFL) offers a feasible
solution by alleviating the computation and/or communication burden on clients.
However, existing SFL works often assume sufficient labeled data on clients,
which is usually impractical. Besides, data non-IIDness across clients poses
another challenge to ensuring efficient model training. To the best of our
knowledge, these two issues have not been simultaneously addressed in SFL.
Herein, we
propose a novel Semi-SFL system, which incorporates clustering regularization
to perform SFL under the more practical scenario with unlabeled and non-IID
client data. Moreover, our theoretical and experimental investigations into
model convergence reveal that the inconsistent training processes on labeled
and unlabeled data have an influence on the effectiveness of clustering
regularization. To this end, we develop a control algorithm for dynamically
adjusting the global updating frequency, so as to mitigate the training
inconsistency and improve training performance. Extensive experiments on
benchmark models and datasets show that our system provides a 3.0x speed-up in
training time and reduces the communication cost by about 70.3% while reaching
the target accuracy, and achieves up to 5.1% improvement in accuracy under
non-IID scenarios compared to the state-of-the-art baselines.
Comment: 16 pages, 12 figures, conference
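The abstract names two mechanisms without detailing them: a clustering regularization term on unlabeled client data, and a controller that adapts the global updating frequency when labeled and unlabeled training diverge. A minimal sketch of how such mechanisms could look follows; the function names, the nearest-centroid penalty, the loss-gap threshold, and the halving/doubling policy are all illustrative assumptions, not the paper's actual algorithm.

```python
def clustering_regularization(features, centroids):
    # Mean squared distance from each feature vector to its nearest
    # cluster centroid; penalizing this term pulls the features of
    # unlabeled samples toward the existing clusters.
    total = 0.0
    for f in features:
        d2 = min(sum((fi - ci) ** 2 for fi, ci in zip(f, c))
                 for c in centroids)
        total += d2
    return total / len(features)

def adjust_update_frequency(freq, labeled_loss, unlabeled_loss,
                            tol=0.1, f_min=1, f_max=32):
    # When the losses on labeled and unlabeled data diverge beyond a
    # tolerance, aggregate the global model more often (halve the
    # interval); otherwise relax the interval (double it).
    if abs(labeled_loss - unlabeled_loss) > tol:
        return max(f_min, freq // 2)
    return min(f_max, freq * 2)
```

In this sketch the controller only reacts to the labeled/unlabeled loss gap; the paper's convergence analysis presumably motivates a more principled update rule.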
MergeSFL: Split Federated Learning with Feature Merging and Batch Size Regulation
Recently, federated learning (FL) has emerged as a popular technique for edge
AI to mine valuable knowledge in edge computing (EC) systems. To mitigate the
computing/communication burden on resource-constrained workers and protect
model privacy, split federated learning (SFL) has been proposed, integrating
both data and model parallelism. Beyond resource limitations, SFL still faces
two other critical challenges in EC, i.e., statistical heterogeneity and system
heterogeneity. To address these challenges, we propose a novel SFL framework,
termed MergeSFL, by incorporating feature merging and batch size regulation in
SFL. Concretely, feature merging aims to merge the features from workers into a
mixed feature sequence, which is approximately equivalent to the features
derived from IID data and is employed to promote model accuracy, while batch
size regulation aims to assign diverse and suitable batch sizes to
heterogeneous workers to improve training efficiency. Moreover, MergeSFL
jointly optimizes these two strategies, exploiting their coupled
relationship, to further enhance the performance of SFL. Extensive experiments
are conducted on a physical platform with 80 NVIDIA Jetson edge devices, and
the experimental results show that MergeSFL can improve the final model
accuracy by 5.82% to 26.22%, with a speedup of about 1.74x to 4.14x, compared
to the baselines.
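The two strategies above can be pictured with a short sketch: feature merging pools the smashed features uploaded by all workers and shuffles them into one mixed sequence, while batch size regulation splits a global batch across workers in proportion to their throughput. The function names, the uniform shuffle, and the proportional-allocation rule are illustrative assumptions, not MergeSFL's exact design.

```python
import random

def merge_features(worker_features, seed=0):
    # Pool the features from all workers and shuffle them, so the
    # mixed sequence approximates features drawn from IID data.
    merged = [f for feats in worker_features for f in feats]
    random.Random(seed).shuffle(merged)
    return merged

def regulate_batch_sizes(speeds, total_batch):
    # Assign each worker a batch size proportional to its measured
    # throughput, so heterogeneous workers finish a training round
    # in roughly the same wall-clock time.
    total = sum(speeds)
    return [max(1, round(total_batch * s / total)) for s in speeds]
```

For example, three workers with relative speeds 1, 1, and 2 would split a global batch of 64 as 16, 16, and 32; how the real system couples this allocation with feature merging is exactly the joint optimization the abstract refers to.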