
    FedClassAvg: Local Representation Learning for Personalized Federated Learning on Heterogeneous Neural Networks

    Personalized federated learning is aimed at allowing numerous clients to train personalized models while participating in collaborative training in a communication-efficient manner without exchanging private data. However, many personalized federated learning algorithms assume that clients share the same neural network architecture, and those for heterogeneous models remain understudied. In this study, we propose a novel personalized federated learning method called federated classifier averaging (FedClassAvg). Deep neural networks for supervised learning tasks consist of feature extractor and classifier layers. FedClassAvg aggregates classifier weights as an agreement on decision boundaries in feature spaces so that clients with data that are not independently and identically distributed (non-iid) can learn about scarce labels. In addition, local feature representation learning is applied to stabilize the decision boundaries and improve the local feature extraction capabilities of clients. While existing methods require the collection of auxiliary data or model weights to generate a counterpart, FedClassAvg only requires clients to communicate a couple of fully connected layers, which is highly communication-efficient. Moreover, FedClassAvg does not require solving extra optimization problems such as knowledge transfer, which incurs intensive computation overhead. We evaluated FedClassAvg through extensive experiments and demonstrated that it outperforms the current state-of-the-art algorithms on heterogeneous personalized federated learning tasks.
    Comment: Accepted to ICPP 2022. Code: https://github.com/hukla/fedclassav
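    The aggregation step described in the abstract, averaging only the classifier weights while heterogeneous feature extractors stay local, can be sketched as follows. This is a minimal illustration, not the authors' implementation; the function name and the flat-vector weight representation are assumptions for clarity.

    ```python
    # Minimal sketch of classifier-weight averaging across clients.
    # Each client's classifier is represented here as a flat list of floats;
    # in practice these would be the parameters of the shared FC layers.

    def average_classifiers(client_weights):
        """Element-wise mean of the classifier weight vectors from all clients.

        Assumes every client shares the same classifier shape, even though
        their feature extractors may be heterogeneous.
        """
        n = len(client_weights)
        return [sum(ws) / n for ws in zip(*client_weights)]
    ```

    The server would broadcast the averaged classifier back to all clients each round; only these few fully connected layers travel over the network, which is what makes the method communication-efficient.
    
    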

    Membership Feature Disentanglement Network

    © 2022 ACM. Membership inference (MI) determines whether a given data point was involved in the training of a target machine learning model. Thus, the notion of MI relies on both the data features and the model. Existing MI methods focus on the model only. We introduce a membership feature disentanglement network (MFDN) to approach MI from the perspective of data features. We assume that the data features can be disentangled into membership features and class features. The membership features are those that enable MI, and the class features are those that the network is trying to learn. MFDN disentangles these features through adversarial games between the encoders and auxiliary critic networks. It also visualizes the membership features using an inductive bias from the perspective of MI. We perform empirical evaluations to demonstrate that MFDN can disentangle membership features and class features.
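    The membership inference decision the abstract defines can be illustrated with the standard loss-threshold baseline (this is the classic confidence-based MI attack, not MFDN itself; the threshold value and function names are illustrative assumptions).

    ```python
    # Hedged sketch of a loss-threshold membership inference baseline:
    # points the model was trained on tend to have lower loss, so a
    # simple threshold on cross-entropy loss predicts membership.
    import math

    def cross_entropy(probs, label):
        """Cross-entropy loss of a predicted probability vector for one label."""
        return -math.log(max(probs[label], 1e-12))

    def is_member(probs, label, threshold=0.5):
        """Predict membership: low loss suggests the point was in training."""
        return cross_entropy(probs, label) < threshold
    ```

    MFDN goes further than such model-only baselines by asking which data features make this decision possible in the first place.
    
    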

    One-Step Deposition of Photovoltaic Layers Using Iodide Terminated PbS Quantum Dots

    We present a one-step layer deposition procedure employing ammonium iodide (NH<sub>4</sub>I) to achieve photovoltaic-quality PbS quantum dot (QD) layers. Ammonium iodide is used to replace the long alkyl organic native ligands binding to the QD surface, resulting in iodide-terminated QDs that are stabilized in polar solvents such as <i>N</i>,<i>N</i>-dimethylformamide without particle aggregation. We extensively characterized the iodide-terminated PbS QDs via UV–vis absorption, transmission electron microscopy (TEM), thermogravimetric analysis (TGA), FT-IR transmission spectroscopy, and X-ray photoelectron spectroscopy (XPS). Finally, we fabricated PbS QD photovoltaic cells that employ the iodide-terminated PbS QDs. The resulting QD-PV devices achieved a best power conversion efficiency of 2.36% under ambient conditions, which is limited by the layer thickness. The PV characteristics compare favorably to similar devices of similar layer thickness that were prepared using the standard layer-by-layer ethanedithiol (EDT) treatment.