    Kernel Normalized Convolutional Networks for Privacy-Preserving Machine Learning

    Normalization is an important but understudied challenge in privacy-related application domains such as federated learning (FL), differential privacy (DP), and differentially private federated learning (DP-FL). While batch normalization has already been shown to be unsuitable for these domains, the impact of other normalization methods on the performance of federated or differentially private models is not well understood. To address this, we compare the performance of layer normalization (LayerNorm), group normalization (GroupNorm), and the recently proposed kernel normalization (KernelNorm) in FL, DP, and DP-FL settings. Our results indicate that LayerNorm and GroupNorm provide no performance gain over the baseline (i.e., no normalization) for shallow models in FL and DP. They do, however, considerably enhance the performance of shallow models in DP-FL and of deeper models in FL and DP. KernelNorm, moreover, significantly outperforms its competitors in terms of accuracy and convergence rate (or communication efficiency) for both shallow and deeper models in all considered learning environments. Given these key observations, we propose a kernel normalized ResNet architecture called KNResNet-13 for differentially private learning. Using the proposed architecture, we provide new state-of-the-art accuracy values on the CIFAR-10 and Imagenette datasets when trained from scratch.
    Comment: To appear in the IEEE Conference on Secure and Trustworthy Machine Learning (SaTML), February 2023.
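
    As a concrete illustration of the kind of comparison the abstract describes, the following is a minimal sketch in which the same small CNN is built with a selectable normalization layer, so that only the normalization method varies across FL, DP, and DP-FL runs. The `make_cnn` helper, the architecture, and all layer sizes are illustrative assumptions, not the paper's actual models.

```python
import torch.nn as nn

def make_cnn(norm="none", channels=32, groups=8, num_classes=10):
    """Build the same small CNN with a selectable normalization layer.

    Hypothetical helper for a LayerNorm/GroupNorm/no-normalization
    comparison; the architecture is an assumption, not the paper's.
    """
    if norm == "group":
        norm_layer = nn.GroupNorm(groups, channels)
    elif norm == "layer":
        # GroupNorm with a single group normalizes over (C, H, W), which
        # acts as LayerNorm for conv feature maps of any spatial size.
        norm_layer = nn.GroupNorm(1, channels)
    else:
        norm_layer = nn.Identity()  # the "no normalization" baseline
    return nn.Sequential(
        nn.Conv2d(3, channels, 3, padding=1),
        norm_layer,
        nn.ReLU(),
        nn.AdaptiveAvgPool2d(1),
        nn.Flatten(),
        nn.Linear(channels, num_classes),
    )
```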

    Kernel Normalized Convolutional Networks

    Existing deep convolutional neural network (CNN) architectures frequently rely on batch normalization (BatchNorm) to effectively train the model. BatchNorm significantly improves model performance in centralized training, but it is unsuitable for federated learning and differential privacy settings. Even in centralized learning, BatchNorm performs poorly with smaller batch sizes. To address these limitations, we propose kernel normalization and kernel normalized convolutional layers, and incorporate them into kernel normalized convolutional networks (KNConvNets) as the main building blocks. We implement KNConvNets corresponding to state-of-the-art CNNs such as VGGNets and ResNets while forgoing BatchNorm layers. Through extensive experiments, we show that KNConvNets consistently outperform their batch, group, and layer normalized counterparts in terms of both accuracy and convergence rate in centralized, federated, and differentially private learning settings.
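
    The abstract does not spell out the layer itself, but the core idea, as we read it, is that each kernel-sized input patch is standardized by its own mean and variance before the dot product with the filter weights. Below is a minimal PyTorch sketch under that assumption; the `KNConv2d` name, the initialization, and all hyperparameters are ours, not the authors'.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class KNConv2d(nn.Module):
    """Sketch of a kernel normalized convolution: every kernel-sized input
    patch is standardized by its own mean/variance before being multiplied
    by the filter weights (our reading of the abstract, not the authors'
    reference implementation).
    """

    def __init__(self, in_ch, out_ch, kernel_size=3, stride=1, padding=1, eps=1e-5):
        super().__init__()
        self.weight = nn.Parameter(
            torch.randn(out_ch, in_ch, kernel_size, kernel_size) * 0.02
        )
        self.bias = nn.Parameter(torch.zeros(out_ch))
        self.k, self.stride, self.padding, self.eps = kernel_size, stride, padding, eps

    def forward(self, x):
        n, _, h, w = x.shape
        # Extract sliding kernel windows: (N, C*k*k, L), one column per window.
        patches = F.unfold(x, self.k, stride=self.stride, padding=self.padding)
        # Standardize each window by its own statistics (the kernel normalization).
        mu = patches.mean(dim=1, keepdim=True)
        var = patches.var(dim=1, unbiased=False, keepdim=True)
        patches = (patches - mu) / torch.sqrt(var + self.eps)
        # Convolution expressed as a matrix product with the flattened filters.
        w_flat = self.weight.view(self.weight.size(0), -1)   # (out_ch, C*k*k)
        out = w_flat @ patches + self.bias.view(1, -1, 1)    # (N, out_ch, L)
        h_out = (h + 2 * self.padding - self.k) // self.stride + 1
        w_out = (w + 2 * self.padding - self.k) // self.stride + 1
        return out.view(n, -1, h_out, w_out)
```

    A quick shape check: `KNConv2d(3, 16)(torch.randn(2, 3, 32, 32))` yields a `(2, 16, 32, 32)` tensor, matching a standard 3x3 convolution with padding 1. Note that, unlike BatchNorm, the statistics here depend only on the individual input, which is one reason such per-window normalization remains applicable in federated and differentially private settings.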

    sPLINK: a hybrid federated tool as a robust alternative to meta-analysis in genome-wide association studies

    Meta-analysis has been established as an effective approach to combining summary statistics of several genome-wide association studies (GWAS). However, the accuracy of meta-analysis can be attenuated in the presence of cross-study heterogeneity. We present sPLINK, a hybrid federated and user-friendly tool, which performs privacy-aware GWAS on distributed datasets while preserving the accuracy of the results. sPLINK is robust against heterogeneous distributions of data across cohorts, whereas meta-analysis loses considerable accuracy in such scenarios. sPLINK achieves practical runtime and acceptable network usage for chi-square and linear/logistic regression tests.
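
    The robustness to cross-study heterogeneity follows from the protocol's structure: cohorts exchange exact aggregate statistics rather than per-cohort effect estimates, so the coordinator can reconstruct the pooled test. Below is a toy sketch of this idea for the chi-square association test; the function names and the 2x3 genotype-by-status table are our assumptions, not sPLINK's actual implementation.

```python
import numpy as np
from scipy.stats import chi2

def local_counts(genotypes, phenotypes):
    """Per-cohort step: build a 2x3 contingency table of case/control
    status (0/1) versus minor-allele count (0/1/2). Only these aggregate
    counts leave the cohort; individual-level data never do.
    (Hypothetical sketch of the federated idea, not sPLINK's protocol.)
    """
    table = np.zeros((2, 3))
    for g, p in zip(genotypes, phenotypes):
        table[p, g] += 1
    return table

def federated_chi_square(cohort_tables):
    """Coordinator step: summing exact counts across cohorts reproduces
    the pooled chi-square test, so heterogeneous cohort distributions do
    not bias the result the way they can bias meta-analysis.
    """
    table = sum(cohort_tables)
    row = table.sum(axis=1, keepdims=True)
    col = table.sum(axis=0, keepdims=True)
    expected = row * col / table.sum()
    stat = ((table - expected) ** 2 / expected).sum()
    dof = (table.shape[0] - 1) * (table.shape[1] - 1)
    return stat, chi2.sf(stat, dof)  # test statistic and p-value
```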

    Exploring the SARS-CoV-2 virus-host-drug interactome for drug repurposing

    The information generated to understand the molecular mechanisms of SARS-CoV-2 infection and to predict drug repurposing candidates is time-consuming to integrate and explore. Here, the authors develop an interactive online platform for virus-host interactome exploration and drug (target) identification.