2,043 research outputs found

    Association of glutathione-S-transferase polymorphisms with atopic dermatitis risk in preschool age children

    Background: Glutathione S-transferase (GST) enzymes are critical for detoxifying reactive oxygen species (ROS) and their products, which have been implicated in the pathology of inflammatory diseases such as atopic dermatitis (AD). Methods: We investigated the effects of genetic polymorphisms of GST on the risk of AD in preschool age children. Biomarkers of oxidative stress were also evaluated with respect to GST genotype. Results: The GSTP1 Val105 allele was significantly associated with an increased risk of AD [odds ratio (OR)=1.62, p<0.05]. The combination of the GSTP1 Val105 allele and the GSTT1 null genotype increased this risk further, by 2.3-fold (p<0.01). No association was observed for the GSTM1 null or GSTT1 null genotype alone. In children with AD, blood total antioxidant capacity was significantly lower (p<0.001), while malondialdehyde was nonsignificantly higher (p=0.12). Children with the GSTP1 Val105 allele had significantly lower concentrations of erythrocyte glutathione than GSTP1 Ile/Ile homozygotes (p=0.03). Conclusions: Our study suggests that the GSTP1 Val105 allele is an important determinant of susceptibility to AD in preschool age children and that increased oxidative stress may play a role in the pathogenesis of AD. Clin Chem Lab Med 2009;47:1475–81. Peer Reviewed
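    The reported effect sizes are standard 2×2 odds ratios. As a minimal sketch with invented carrier counts (the study's actual genotype tables are not reproduced here), an OR and its 95% confidence interval can be computed as:

    ```python
    import math

    # Hypothetical 2x2 carrier counts (illustrative only; NOT the study's data):
    # rows = AD cases vs. controls, columns = GSTP1 Val105 carriers vs. non-carriers.
    a, b = 130, 170   # cases: carriers, non-carriers
    c, d = 80, 170    # controls: carriers, non-carriers

    # Odds ratio: (a/b) / (c/d) = (a*d) / (b*c)
    odds_ratio = (a * d) / (b * c)

    # 95% CI on the log scale (Woolf's method): log(OR) +/- 1.96 * SE
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    ci_low = math.exp(math.log(odds_ratio) - 1.96 * se)
    ci_high = math.exp(math.log(odds_ratio) + 1.96 * se)

    print(f"OR = {odds_ratio:.2f}, 95% CI ({ci_low:.2f}, {ci_high:.2f})")
    ```

    An OR whose confidence interval excludes 1.0 corresponds to the p<0.05 associations quoted above.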

    FedFN: Feature Normalization for Alleviating Data Heterogeneity Problem in Federated Learning

    Federated Learning (FL) is a collaborative method for training models while preserving data privacy in decentralized settings. However, FL encounters challenges related to data heterogeneity, which can result in performance degradation. In our study, we observe that as data heterogeneity increases, the feature representation in the FedAVG model deteriorates more significantly than the classifier weights. Additionally, we observe that as data heterogeneity increases, the gap between the higher feature norms of observed classes, obtained from local models, and the feature norms of unobserved classes widens, in contrast to the behavior of the classifier weight norms. This widening gap extends to the feature norm disparities between the local and global models. To address these issues, we introduce Federated Averaging with Feature Normalization Update (FedFN), a straightforward learning method. We demonstrate the superior performance of FedFN through extensive experiments, even when applied to a pretrained ResNet18. Subsequently, we confirm the applicability of FedFN to foundation models. Comment: NeurIPS Workshop "Federated Learning in the Age of Foundation Models", 2023
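    The abstract does not spell out the update rule, but the core idea it names, equalizing feature norms so heterogeneous clients contribute comparably under FedAvg aggregation, can be sketched as follows. This is a minimal illustration under assumed semantics, not the paper's exact FedFN procedure; the function names and synthetic data are invented:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def l2_normalize(feats, eps=1e-12):
        # Project each sample's feature vector onto the unit sphere, so
        # observed and unobserved classes (and local vs. global models)
        # all produce features of identical norm.
        norms = np.linalg.norm(feats, axis=1, keepdims=True)
        return feats / np.maximum(norms, eps)

    def fedavg(client_weights, client_sizes):
        # Standard FedAvg: average client parameter vectors weighted by
        # local dataset size.
        sizes = np.asarray(client_sizes, dtype=float)
        coeffs = sizes / sizes.sum()
        return sum(c * w for c, w in zip(coeffs, client_weights))

    # Synthetic features with wildly different norms, standing in for
    # backbone outputs from heterogeneous clients.
    feats = rng.normal(size=(4, 8)) * rng.uniform(0.1, 10.0, size=(4, 1))
    normed = l2_normalize(feats)
    print(np.linalg.norm(normed, axis=1))  # every row now has norm ~1.0
    ```

    In a full FL round, each client would apply the normalization before its classifier head during local training, and the server would aggregate the resulting parameters with `fedavg`.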