    A HAUNTED HOME: GHOSTS IN MAXINE HONG KINGSTON AND WAYSON CHOY

    This thesis comparatively studies the notion of the ghost in Maxine Hong Kingston’s The Woman Warrior and Wayson Choy’s The Jade Peony, Paper Shadows, and All That Matters. Within the theoretical framework of diaspora studies, I argue that the ghosts in these texts serve as metaphors for the ambiguities, complexities, and contradictions of Chinese diasporic experiences in North America. Specifically, they represent the crises of home and identity that Chinese diasporic subjects experience in the process of dislocation and relocation. Taking an anti-essentialist position, I note that the ghosts in question do not originate in particular superstitious national cultures but are fashioned through cultural displacement as cross-cultural creations, constantly articulating the interactions between an intangible homeland and a tangible hostland. By associating the notion of the ghost with the politics of belonging, this thesis aims at a historically contextualized understanding of contemporary Chinese diaspora literature in North America.

    Modulation of Serum Antinuclear Antibody Levels by Levamisole Treatment in Patients With Oral Lichen Planus

    Background/Purpose: Serum autoantibodies, including antinuclear antibodies (ANAs), have been found in patients with oral lichen planus (OLP). This study evaluated whether Taiwanese OLP patients had significantly higher frequencies of serum ANAs than healthy control subjects, and whether levamisole treatment could modulate the antibody levels.
    Methods: This study used an indirect immunofluorescence technique to measure the baseline serum levels of ANA in a group of 583 Taiwanese OLP patients and 53 healthy control subjects. Seventy-nine ANA-positive OLP patients were treated with levamisole under a regular follow-up schedule in our dental clinic, and their serum ANA levels were measured after treatment.
    Results: We found that the frequencies of serum ANA in patients with OLP (23.2%), erosive OLP (EOLP, 23.8%), major EOLP (31.5%), and minor EOLP (18.1%) were all significantly higher than that (5.7%) in healthy control subjects. In addition, major EOLP patients had a significantly higher serum ANA-positive rate than minor EOLP or non-erosive OLP patients. Of 135 ANA-positive OLP patients, 79 were treated with levamisole under a regular follow-up schedule. We found that treatment with levamisole for a period of 2–38 months (mean, 12 ± 9 months) effectively reduced the high mean serum ANA titer (557 ± 98) at baseline to an undetectable level (0) in all ANA-positive OLP patients, regardless of their different high initial serum titers of ANA.
    Conclusion: There was a significantly higher frequency of serum ANA (23.2%) in Taiwanese OLP patients than in healthy control subjects. Treatment with levamisole for 2–38 months reduced the high serum ANA to an undetectable level, and significantly improved the signs and symptoms in all treated OLP patients.

    NeuralFuse: Learning to Improve the Accuracy of Access-Limited Neural Network Inference in Low-Voltage Regimes

    Deep neural networks (DNNs) have become ubiquitous in machine learning, but their energy consumption remains a notable issue. Lowering the supply voltage is an effective strategy for reducing energy consumption. However, aggressively scaling down the supply voltage can lead to accuracy degradation due to random bit flips in the static random access memory (SRAM) where model parameters are stored. To address this challenge, we introduce NeuralFuse, a novel add-on module that addresses the accuracy–energy tradeoff in low-voltage regimes by learning input transformations that generate error-resistant data representations. NeuralFuse protects DNN accuracy in both nominal and low-voltage scenarios. Moreover, NeuralFuse is easy to implement and can be readily applied to DNNs with limited access, such as non-configurable hardware or remote access to cloud-based APIs. Experimental results demonstrate that, at a 1% bit-error rate, NeuralFuse can reduce SRAM memory access energy by up to 24% while improving accuracy by up to 57%. To the best of our knowledge, this is the first model-agnostic approach (i.e., requiring no model retraining) to address low-voltage-induced bit errors. The source code is available at https://github.com/IBM/NeuralFuse.
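    As context for the failure mode this abstract targets, the following sketch simulates random bit flips in SRAM-stored int8 weights at the 1% bit-error rate quoted above. It is an illustrative fault model only, not code from the NeuralFuse repository; the function name and setup are assumptions.

    ```python
    import numpy as np

    def flip_bits(weights_int8, ber, rng):
        """Flip each stored bit independently with probability `ber`,
        emulating SRAM read errors under aggressive voltage scaling."""
        w = weights_int8.copy()
        bits = np.unpackbits(w.view(np.uint8))      # 8 bits per int8 weight
        mask = rng.random(bits.shape) < ber         # which bits flip
        bits ^= mask.astype(np.uint8)               # XOR flips the chosen bits
        return np.packbits(bits).view(np.int8).reshape(w.shape)

    rng = np.random.default_rng(0)
    w = rng.integers(-128, 128, size=(64, 64), dtype=np.int8)
    w_faulty = flip_bits(w, ber=0.01, rng=rng)

    # At a 1% per-bit error rate, roughly 1 - 0.99**8 ~ 7.7% of the
    # 8-bit weights end up corrupted -- enough to degrade accuracy.
    frac_changed = np.mean(w != w_faulty)
    ```

    NeuralFuse does not repair these flips; per the abstract, it learns an input transformation so the (black-box) model's predictions remain accurate despite them.
    
    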

    Rethinking Normalization Methods in Federated Learning

    Federated learning (FL) is a popular distributed learning framework that can reduce privacy risks by not explicitly sharing private data. In this work, we uncover the external covariate shift problem in FL, which is caused by the independent local training processes on different devices. We demonstrate that external covariate shifts lead to the obliteration of some devices' contributions to the global model. Further, we show that normalization layers are indispensable in FL, since their inherent properties can alleviate the problem of obliterating some devices' contributions. However, recent works have shown that batch normalization, a standard component of many deep neural networks, incurs an accuracy drop in the global model in FL. The essential reason for the failure of batch normalization in FL has been poorly studied. We unveil that external covariate shift is the key reason why batch normalization is ineffective in FL. We also show that layer normalization is a better choice in FL: it mitigates the external covariate shift and improves the performance of the global model. We conduct experiments on CIFAR-10 under non-IID settings. The results demonstrate that models with layer normalization converge fastest and achieve the best or comparable accuracy for three different model architectures.
    Comment: Submitted to the DistributedML'22 workshop.
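    The distinction the abstract draws can be illustrated with a minimal NumPy sketch (illustrative code under assumed toy data, not the authors' implementation): batch normalization depends on batch statistics, which diverge across non-IID clients, while layer normalization is computed per sample and is unaffected by which device's data shares the batch.

    ```python
    import numpy as np

    def batch_norm(x, eps=1e-5):
        """Normalize each feature with *batch* statistics; on non-IID
        clients these statistics diverge from device to device."""
        return (x - x.mean(axis=0)) / np.sqrt(x.var(axis=0) + eps)

    def layer_norm(x, eps=1e-5):
        """Normalize each *sample* over its own features, independent
        of which device's data happens to share the batch."""
        mu = x.mean(axis=1, keepdims=True)
        var = x.var(axis=1, keepdims=True)
        return (x - mu) / np.sqrt(var + eps)

    rng = np.random.default_rng(0)
    client_a = rng.normal(0.0, 1.0, size=(32, 8))  # device A's data
    client_b = rng.normal(3.0, 1.0, size=(32, 8))  # shifted non-IID device B

    mixed = np.vstack([client_a[:1], client_b[:1]])  # one sample from each

    # Layer norm: sample A's output is identical alone or in a mixed batch.
    ln_alone = layer_norm(client_a[:1])
    ln_mixed = layer_norm(mixed)[:1]

    # Batch norm: sample A's output changes with who else is in the batch.
    bn_alone = batch_norm(client_a)[:1]
    bn_mixed = batch_norm(mixed)[:1]
    ```

    In an FL aggregation step, this batch-statistic dependence is what lets one client's distribution distort normalization for another's data, consistent with the external covariate shift argument above.
    
    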