
    Chinese herbal Jin-Ying-Tang attenuates the inflammatory response by inhibiting the activation of TLR4/MyD88/TRAF-6/NIK pathway at the mRNA level in LPS-stimulated mouse mammary epithelial cells

    Introduction: The effects of Jin-Ying-Tang (JYT) on Toll-like receptor 4 (TLR4) signal transduction in lipopolysaccharide (LPS)-stimulated mouse mammary epithelial cells (MECs) in vitro were examined. Material and Methods: The cytotoxicity of JYT (0.06-62.50 mg/mL) on mouse MECs was determined by MTT assay. The MECs were co-cultured with LPS in the presence or absence of JYT (39.10 μg/mL, 391 μg/mL, 3,910 μg/mL). The concentrations of interleukin-6 (IL-6) and tumour necrosis factor-α (TNF-α) in the culture supernatants were detected by ELISA. The mRNA expression of TLR4 and downstream TLR4 signalling molecules such as myeloid differentiation factor 88 (MyD88), tumour necrosis factor receptor-associated factor 6 (TRAF-6), inhibitor of κB (IκB), and nuclear factor-κB-inducing kinase (NIK) was determined by quantitative real-time polymerase chain reaction (qRT-PCR). Results: The IC50 of JYT on MECs was 12.25 mg/mL, and JYT significantly decreased the concentrations of IL-6 and TNF-α in LPS-stimulated MECs (P < 0.05). The mRNA expression of TLR4, MyD88, TRAF-6, IκB, and NIK was also significantly decreased when the LPS-stimulated MECs were co-cultured with appropriate concentrations of JYT (P < 0.05, P < 0.01). Conclusion: These observations indicate a potential mechanism through which JYT attenuates the inflammatory response in LPS-stimulated mouse mammary epithelial cells by inhibiting the activation of the TLR4/MyD88/TRAF-6/NIK pathway at the mRNA level.
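    The relative mRNA expression readouts described above are typically computed from qRT-PCR cycle-threshold (Ct) values with the standard 2^-ΔΔCt method; the abstract does not state the paper's exact pipeline, so the function below is a generic, illustrative sketch of that calculation, not the authors' code.

    ```python
    def fold_change(ct_target, ct_ref, ct_target_control, ct_ref_control):
        """Relative expression of a target gene by the 2^-ddCt method.

        ct_target / ct_ref: Ct values of the target and reference (housekeeping)
        gene in the treated sample; the *_control arguments are the same
        measurements in the untreated control sample.
        """
        # Normalise each sample's target Ct against its reference gene
        d_ct_treated = ct_target - ct_ref
        d_ct_control = ct_target_control - ct_ref_control
        # One extra cycle means roughly half as much starting template
        return 2 ** -(d_ct_treated - d_ct_control)
    ```

    For example, a treated ΔCt one cycle higher than the control ΔCt corresponds to a fold change of 0.5, i.e. halved expression, matching the decreased TLR4/MyD88/TRAF-6 expression reported here.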

    Episodic Training for Domain Generalization

    Domain generalization (DG) is the challenging and topical problem of learning models that generalize to novel testing domains with different statistics from a set of known training domains. The simple approach of aggregating data from all source domains and training a single deep neural network end-to-end on all the data provides a surprisingly strong baseline that surpasses many prior published methods. In this paper, we build on this strong baseline by designing an episodic training procedure that trains a single deep network in a way that exposes it to the domain shift that characterises a novel domain at runtime. Specifically, we decompose a deep network into feature extractor and classifier components, and then train each component by simulating it interacting with a partner that is badly tuned for the current domain. This makes both components more robust, ultimately leading to our networks producing state-of-the-art performance on three DG benchmarks. Furthermore, we consider the pervasive workflow of using an ImageNet-trained CNN as a fixed feature extractor for downstream recognition tasks. Using the Visual Decathlon benchmark, we demonstrate that our episodic-DG training improves the performance of such a general-purpose feature extractor by explicitly training a feature for robustness to novel problems. This shows that DG training can benefit standard practice in computer vision. Comment: ICCV'19 camera-ready version with Table 5 fixed. Code is now available at https://github.com/HAHA-DL/Episodic-D
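    The episodic pairing described above can be sketched in miniature: for each source domain, score its batch with the classifier of a *different* domain (a frozen, mismatched partner) and update only the shared feature extractor. The toy below uses scalar weights and a squared-error loss purely for illustration; the paper's actual deep networks and losses differ.

    ```python
    import random

    def episodic_step(theta_f, classifiers, batches, lr=0.1, rng=None):
        """One episodic update of a shared scalar 'feature extractor' theta_f.

        classifiers: per-domain scalar classifier weights (held fixed here).
        batches: per-domain lists of (x, y) pairs.
        Each domain's data is scored by a mismatched partner's classifier,
        exposing theta_f to simulated domain shift; only theta_f is updated.
        """
        rng = rng or random.Random(0)
        domains = list(batches)
        for dom in domains:
            partner = rng.choice([d for d in domains if d != dom])
            w = classifiers[partner]            # frozen mismatched classifier
            grad = 0.0
            for x, y in batches[dom]:
                pred = w * (theta_f * x)        # classifier(feature(x))
                grad += 2 * (pred - y) * w * x  # d(squared error)/d(theta_f)
            theta_f -= lr * grad / len(batches[dom])
        return theta_f
    ```

    Even in this toy, the extractor is pushed to produce features that any partner classifier can use, which is the intuition behind the robustness claim.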

    Panoptic Scene Graph Generation

    Existing research addresses scene graph generation (SGG) -- a critical technology for scene understanding in images -- from a detection perspective, i.e., objects are detected using bounding boxes followed by prediction of their pairwise relationships. We argue that such a paradigm causes several problems that impede the progress of the field. For instance, bounding box-based labels in current datasets usually contain redundant classes like hairs, and leave out background information that is crucial to the understanding of context. In this work, we introduce panoptic scene graph generation (PSG), a new problem task that requires the model to generate a more comprehensive scene graph representation based on panoptic segmentations rather than rigid bounding boxes. A high-quality PSG dataset, which contains 49k well-annotated overlapping images from COCO and Visual Genome, is created for the community to keep track of its progress. For benchmarking, we build four two-stage baselines, which are modified from classic methods in SGG, and two one-stage baselines called PSGTR and PSGFormer, which are based on the efficient Transformer-based detector, i.e., DETR. While PSGTR uses a set of queries to directly learn triplets, PSGFormer separately models the objects and relations in the form of queries from two Transformer decoders, followed by a prompting-like relation-object matching mechanism. In the end, we share insights on open challenges and future directions. Comment: Accepted to ECCV'22 (Paper ID #222, Final Score 2222). Project Page: https://psgdataset.org/. OpenPSG Codebase: https://github.com/Jingkang50/OpenPS
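    The PSG representation described above pairs panoptic segments (both "thing" objects and "stuff" background, so context is preserved) with subject-predicate-object triplets over segment ids. The schema below is a hypothetical sketch of that structure for intuition only; it is not the actual OpenPSG annotation format, and the class and field names are invented.

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class Segment:
        seg_id: int
        category: str    # e.g. "person" or "grass"; stuff classes are kept
        is_thing: bool   # False for amorphous background ("stuff")

    @dataclass
    class PanopticSceneGraph:
        segments: dict = field(default_factory=dict)  # seg_id -> Segment
        triplets: list = field(default_factory=list)  # (subj_id, predicate, obj_id)

        def add_segment(self, seg):
            self.segments[seg.seg_id] = seg

        def add_relation(self, subj_id, predicate, obj_id):
            # Relations may only link segments that exist in the panoptic map
            if subj_id not in self.segments or obj_id not in self.segments:
                raise KeyError("relation references an unknown segment")
            self.triplets.append((subj_id, predicate, obj_id))
    ```

    Grounding relations in segments rather than boxes is what lets a triplet like (person, standing-on, grass) reference background regions that box-based SGG annotations discard.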