
    Kernel Normalized Convolutional Networks

    Existing deep convolutional neural network (CNN) architectures frequently rely on batch normalization (BatchNorm) to train effectively. BatchNorm significantly improves model performance in centralized training, but it is unsuitable for federated learning and differential privacy settings. Even in centralized learning, BatchNorm performs poorly with smaller batch sizes. To address these limitations, we propose kernel normalization and kernel normalized convolutional layers, and incorporate them into kernel normalized convolutional networks (KNConvNets) as the main building blocks. We implement KNConvNets corresponding to state-of-the-art CNNs such as VGGNets and ResNets while forgoing BatchNorm layers. Through extensive experiments, we show that KNConvNets consistently outperform their batch-, group-, and layer-normalized counterparts in both accuracy and convergence rate across centralized, federated, and differentially private learning settings.
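    The abstract does not spell out the normalization mechanics, so the following is a minimal sketch of one plausible reading: kernel normalization standardizes each kernel-sized input patch (zero mean, unit variance over its elements) before the convolution weights are applied, which makes the statistics independent of the batch, unlike BatchNorm. The class name KernelNormConv2d and all hyperparameters below are illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class KernelNormConv2d(nn.Module):
    """Sketch of a kernel normalized convolution: every kernel-sized
    patch of the input is normalized over its own elements before the
    convolution weights are applied. Details are assumptions, not the
    paper's exact formulation."""

    def __init__(self, in_ch, out_ch, kernel_size, eps=1e-5):
        super().__init__()
        self.kernel_size = kernel_size
        self.eps = eps
        self.weight = nn.Parameter(
            torch.randn(out_ch, in_ch, kernel_size, kernel_size) * 0.05)
        self.bias = nn.Parameter(torch.zeros(out_ch))

    def forward(self, x):
        n, c, h, w = x.shape
        k = self.kernel_size
        # Extract all kernel-sized patches: (N, C*k*k, L) with L = oh*ow.
        patches = F.unfold(x, k)
        # Normalize each patch over its C*k*k elements; no batch
        # statistics are involved, so batch size does not matter.
        mean = patches.mean(dim=1, keepdim=True)
        var = patches.var(dim=1, unbiased=False, keepdim=True)
        patches = (patches - mean) / torch.sqrt(var + self.eps)
        # Apply the convolution weights to the normalized patches.
        wmat = self.weight.view(self.weight.size(0), -1)   # (out_ch, C*k*k)
        out = wmat @ patches + self.bias.view(1, -1, 1)    # (N, out_ch, L)
        oh, ow = h - k + 1, w - k + 1
        return out.reshape(n, -1, oh, ow)

# Hypothetical usage:
x = torch.randn(2, 3, 32, 32)
layer = KernelNormConv2d(3, 16, kernel_size=3)
y = layer(x)  # shape (2, 16, 30, 30)
```

    Because each patch carries its own statistics, such a layer needs no running averages and no cross-example information, which is consistent with the abstract's claim of suitability for federated and differentially private training.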

    sPLINK : a hybrid federated tool as a robust alternative to meta-analysis in genome-wide association studies

    Meta-analysis has been established as an effective approach to combining summary statistics of several genome-wide association studies (GWAS). However, the accuracy of meta-analysis can be attenuated in the presence of cross-study heterogeneity. We present sPLINK, a hybrid federated and user-friendly tool, which performs privacy-aware GWAS on distributed datasets while preserving the accuracy of the results. sPLINK is robust against heterogeneous distributions of data across cohorts, whereas meta-analysis loses considerable accuracy in such scenarios. sPLINK achieves practical runtime and acceptable network usage for chi-square and linear/logistic regression tests.
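    To make the contrast with meta-analysis concrete, here is an illustrative sketch (an assumption for exposition, not sPLINK's actual protocol or API): a federated linear regression can be computed exactly by summing per-cohort sufficient statistics, so only aggregates leave each site, and heterogeneous cohort distributions do not bias the pooled estimate the way they can bias a meta-analysis of per-cohort results.

```python
import numpy as np

def local_stats(X, y):
    """Each cohort computes sufficient statistics on its own data;
    only these aggregates are shared, never raw genotypes/phenotypes."""
    return X.T @ X, X.T @ y

def federated_linear_regression(cohorts):
    """Sum the per-cohort statistics and solve the normal equations.
    The result equals the regression fit on the pooled data."""
    xtx = sum(local_stats(X, y)[0] for X, y in cohorts)
    xty = sum(local_stats(X, y)[1] for X, y in cohorts)
    return np.linalg.solve(xtx, xty)  # global beta coefficients

# Hypothetical usage with two simulated cohorts:
rng = np.random.default_rng(0)
cohorts = [(rng.normal(size=(100, 3)), rng.normal(size=100))
           for _ in range(2)]
beta = federated_linear_regression(cohorts)
```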