
    Kirenol attenuates experimental autoimmune encephalomyelitis by inhibiting differentiation of Th1 and Th17 cells and inducing apoptosis of effector T cells.

    Experimental autoimmune encephalomyelitis (EAE), a model of multiple sclerosis (MS), is characterized by CNS demyelination mediated by autoreactive T cells. Kirenol, a biologically active substance isolated from Herba Siegesbeckiae, has potent anti-inflammatory activities. Here we investigated the effects of kirenol on EAE. Kirenol treatment markedly delayed the onset of disease and reduced clinical scores in EAE mice. Kirenol treatment reduced expression of IFN-γ and IL-17A in the serum and the proportion of Th1 and Th17 cells in draining lymph nodes. Priming of lymphocytes was reduced and apoptosis of MOG-activated CD4+ T cells was increased in kirenol-treated EAE mice. Kirenol treatment of healthy animals did not affect the lymphocytes of these non-immunized mice. Further in vitro studies showed that kirenol inhibited the viability of MOG-specific lymphocytes and induced apoptosis of MOG-specific CD4+ T cells in a dose- and time-dependent manner. Kirenol treatment upregulated Bax, downregulated Bcl-2, and increased activation of caspase-3 and release of cytochrome c, indicating that a mitochondrial pathway is involved in kirenol-induced apoptosis. Moreover, pretreatment of lymphocytes with either the pan-caspase inhibitor z-VAD-fmk or the more specific caspase-3 inhibitor Ac-DEVD-CHO reduced kirenol-induced apoptosis. Our findings implicate kirenol as a useful agent for the treatment of MS.

    Online Deep Metric Learning

    Metric learning learns a metric function from training data to calculate the similarity or distance between samples. From the perspective of feature learning, metric learning essentially learns a new feature space via a feature transformation (e.g., a Mahalanobis distance metric). However, traditional metric learning algorithms are shallow: they learn only one metric space (one feature transformation). Can we learn a better metric space from the learnt metric space? In other words, can we learn metrics progressively and nonlinearly, as in deep learning, using only existing metric learning algorithms? To this end, we present a hierarchical metric learning scheme and implement an online deep metric learning framework, namely ODML. Specifically, we take one online metric learning algorithm as a metric layer, followed by a nonlinear layer (i.e., ReLU), and then stack these layers in the manner of a deep network. The proposed ODML enjoys several nice properties: it can indeed learn metrics progressively, and it performs superiorly on some datasets. Various experiments with different settings have been conducted to verify these properties of the proposed ODML.
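    The stacking idea in the abstract can be illustrated with a minimal sketch: each "metric layer" applies a learned linear transform (as in a Mahalanobis metric d(x, y) = ||Lx - Ly||), a ReLU nonlinearity follows, and the layers are composed like a deep network. Note this is a hypothetical illustration of the layer structure only, not the ODML training algorithm; the class names and the random initialization are assumptions.

    ```python
    import numpy as np

    def relu(x):
        return np.maximum(x, 0.0)

    class MetricLayer:
        """One 'metric layer': a linear transform L, as in a Mahalanobis metric."""
        def __init__(self, dim, rng):
            # Initialize near the identity so the initial metric is near-Euclidean.
            self.L = np.eye(dim) + 0.01 * rng.standard_normal((dim, dim))

        def forward(self, x):
            # Linear transform followed by the nonlinear (ReLU) layer.
            return relu(self.L @ x)

    class StackedMetric:
        """Stack metric layers to embed samples, then compare in the final space."""
        def __init__(self, dim, depth, seed=0):
            rng = np.random.default_rng(seed)
            self.layers = [MetricLayer(dim, rng) for _ in range(depth)]

        def embed(self, x):
            for layer in self.layers:
                x = layer.forward(x)
            return x

        def distance(self, x, y):
            # Distance between samples is Euclidean distance in the learnt space.
            return float(np.linalg.norm(self.embed(x) - self.embed(y)))

    model = StackedMetric(dim=4, depth=3)
    a, b = np.ones(4), np.zeros(4)
    d_ab = model.distance(a, b)
    ```

    In ODML each layer would be trained by an online metric learning algorithm; here the transforms are simply fixed at initialization to show how progressive, nonlinear metric composition is structured.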

    OPML: A One-Pass Closed-Form Solution for Online Metric Learning

    To achieve a low computational cost when performing online metric learning for large-scale data, we present a one-pass closed-form solution, namely OPML, in this paper. OPML first adopts a one-pass triplet construction strategy, which uses only a very small number of triplets to approximate the representation ability of the full set of triplets obtained by batch-manner methods. Then, OPML employs a closed-form solution to update the metric for newly arriving samples, which leads to low space (i.e., O(d)) and time (i.e., O(d^2)) complexity, where d is the feature dimensionality. In addition, an extension of OPML (namely COPML) is proposed to enhance robustness in the practical case where the first several samples come from the same class (i.e., the cold-start problem). In the experiments, we systematically evaluated our methods (OPML and COPML) on three typical tasks, namely UCI data classification, face verification, and abnormal event detection in videos, so as to evaluate the proposed methods across different sample sizes, feature dimensionalities, and feature extraction schemes (i.e., hand-crafted and deeply learned). The results show that OPML and COPML achieve promising performance at a very low computational cost. The effectiveness of COPML under the cold-start setting is also experimentally verified.
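    The flavor of a closed-form online triplet update can be sketched with a simplified diagonal metric, where each update is a single exact minimization step rather than an iterative solve. This is an illustrative stand-in, not the OPML solution itself: the diagonal parameterization, the loss, and the step size lam are all assumptions, and the diagonal form is even cheaper (O(d) time and space) than the O(d^2)-time full metric discussed in the abstract.

    ```python
    import numpy as np

    def triplet_update(w, anchor, pos, neg, lam=0.1):
        """Closed-form step: minimize ||w - w_old||^2 + lam * margin(w), where
        margin(w) = d_w(anchor, pos) - d_w(anchor, neg) for the diagonal metric
        d_w(x, y) = sum_i w_i * (x_i - y_i)^2. The objective is quadratic in w,
        so the minimizer is available in one closed-form expression."""
        g = (anchor - pos) ** 2 - (anchor - neg) ** 2
        w_new = w - 0.5 * lam * g
        return np.maximum(w_new, 0.0)  # project back so the metric stays valid

    def distance(w, x, y):
        return float(np.sum(w * (x - y) ** 2))

    rng = np.random.default_rng(0)
    d = 8
    w = np.ones(d)  # start from the (diagonal) Euclidean metric
    for _ in range(100):  # one pass: each triplet is seen exactly once
        anchor = rng.standard_normal(d)
        pos = anchor + 0.1 * rng.standard_normal(d)  # same class: nearby
        neg = rng.standard_normal(d)                 # different class
        w = triplet_update(w, anchor, pos, neg)
    ```

    Because each update touches only the d diagonal weights and uses no stored history, the sketch keeps the one-pass, low-memory character that motivates OPML.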