3,154 research outputs found

    Noninvasive imaging of radiolabeled exosome-mimetic nanovesicle using Tc-99m-HMPAO

    Exosomes, known as nano-sized extracellular vesicles, have attracted recent interest due to their potential usefulness in drug delivery. Amid remarkable advances in biomedical applications of exosomes, it is crucial to understand their in vivo distribution and behavior. Here, we developed a simple method for radiolabeling macrophage-derived exosome-mimetic nanovesicles (ENVs) with Tc-99m-HMPAO under physiologic conditions and monitored the in vivo distribution of Tc-99m-HMPAO-ENVs using SPECT/CT in living mice. ENVs were produced from the mouse RAW264.7 macrophage cell line and labeled with Tc-99m-HMPAO during a 1 hr incubation, followed by removal of free Tc-99m-HMPAO. SPECT/CT images were serially acquired after intravenous injection into BALB/c mice. When ENVs were labeled with Tc-99m-HMPAO, the radiochemical purity of Tc-99m-HMPAO-ENVs was higher than 90% and the expression of the exosome-specific protein CD63 did not change in Tc-99m-HMPAO-ENVs. Tc-99m-HMPAO-ENVs showed high serum stability (90%), similar to that in phosphate-buffered saline, up to 5 hr. SPECT/CT images of mice injected with Tc-99m-HMPAO-ENVs exhibited high uptake in the liver and no uptake in the brain, whereas mice injected with free Tc-99m-HMPAO showed high brain uptake up to 5 hr. Our noninvasive imaging of radiolabeled ENVs promises a better understanding of the in vivo behavior of exosomes for upcoming biomedical applications.
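
    As an illustrative aside (not taken from the paper), radiochemical purity is conventionally the fraction of total activity associated with the labeled species. A minimal sketch, assuming hypothetical instant thin-layer chromatography (ITLC) counts for the bound and free fractions:

        # Hedged sketch: radiochemical purity from hypothetical ITLC counts.
        def radiochemical_purity(bound_counts: float, free_counts: float) -> float:
            """Percent of total activity in the vesicle-bound fraction."""
            return 100.0 * bound_counts / (bound_counts + free_counts)

        # e.g., 9300 bound vs 700 free counts -> 93.0%, in line with the >90%
        # purity reported for Tc-99m-HMPAO-ENVs above (numbers are invented).
        print(radiochemical_purity(9300, 700))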

    Plant-expressed Fc-fusion protein tetravalent dengue vaccine with inherent adjuvant properties.

    Dengue is a major global disease requiring improved treatment and prevention strategies. The recently licensed Sanofi Pasteur Dengvaxia vaccine does not protect children under the age of nine, and additional vaccine strategies are thus needed to halt this expanding global epidemic. Here, we employed a molecular engineering approach and plant expression to produce a humanised and highly immunogenic Poly-Immunoglobulin G Scaffold (PIGS) fused to the consensus dengue envelope protein III domain (cEDIII). The immunogenicity of this IgG Fc receptor-targeted vaccine candidate was demonstrated in transgenic mice expressing human FcγRI/CD64, by induction of neutralising antibodies and evidence of cell-mediated immunity. Furthermore, these molecules were able to prime immune cells from human adenoid/tonsillar tissue ex vivo, as evidenced by antigen-specific CD4+ and CD8+ T cell proliferation and IFN-γ and antibody production. The purified polymeric fraction of dengue PIGS (D-PIGS) induced stronger immune activation than the monomeric form, suggesting a more efficient interaction with the low-affinity Fcγ receptors on antigen-presenting cells. These results show that plant-expressed D-PIGS have the potential for translation towards a safe and easily scalable single-antigen-based tetravalent dengue vaccine.

    Machine-Part cell formation through visual decipherable clustering of Self Organizing Map

    Machine-part cell formation is used in cellular manufacturing in order to process a large product variety with consistent quality, lower work-in-process levels, and reduced manufacturing lead time and customer response time, while retaining flexibility for new products. This paper presents a novel approach for obtaining machine cells and part families. In cellular manufacturing, the fundamental problem is the formation of part families and machine cells. The present paper applies the Self Organizing Map (SOM), an unsupervised learning algorithm from Artificial Intelligence, as a visually decipherable clustering tool for machine-part cell formation. The objective is to cluster the binary machine-part incidence matrix through visually decipherable SOM clusters, using color coding and labelling of the SOM map nodes, such that each part family is processed within its corresponding machine cell. The U-matrix, component planes, principal component projection, scatter plot and histogram of the SOM are reported for the successful visualization of the machine-part cell formation. Computational results with the proposed algorithm on a set of group technology problems available in the literature are also presented. The proposed SOM approach produced solutions with a grouping efficacy at least as good as any result previously reported in the literature, improved the grouping efficacy for 70% of the problems, and should prove immensely useful to both industry practitioners and researchers. Comment: 18 pages, 3 tables, 4 figures
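
    As a hedged illustration of the general technique (a sketch under assumptions, not the authors' exact pipeline), the code below clusters the rows of a toy binary machine-part incidence matrix with a SOM and scores the resulting cells with the standard grouping efficacy measure. The third-party minisom package, the toy matrix, and the 1x2 map size are assumptions made for this example:

        # Sketch: SOM clustering of a binary machine-part incidence matrix.
        # Requires the third-party 'minisom' package (pip install minisom).
        import numpy as np
        from minisom import MiniSom

        # Rows = machines, columns = parts (1 = the part visits the machine).
        A = np.array([[1, 1, 0, 0, 1],
                      [1, 1, 0, 0, 0],
                      [0, 0, 1, 1, 0],
                      [0, 1, 1, 1, 0]])

        som = MiniSom(1, 2, input_len=A.shape[1], sigma=0.5,
                      learning_rate=0.5, random_seed=0)  # 1x2 map -> 2 cells
        som.train_random(A.astype(float), 500)
        machine_cells = np.array([som.winner(row)[1] for row in A])

        # Assign each part to the machine cell that uses it most often.
        part_cells = np.array([np.argmax([A[machine_cells == c, p].sum()
                                          for c in (0, 1)])
                               for p in range(A.shape[1])])

        def grouping_efficacy(A, machine_cells, part_cells):
            """tau = (e - e_out) / (e + e_voids): penalizes ones outside the
            diagonal blocks (e_out) and zeros inside them (e_voids)."""
            inside = np.equal.outer(machine_cells, part_cells)
            e, e_out = A.sum(), A[~inside].sum()
            e_voids = (1 - A)[inside].sum()
            return (e - e_out) / (e + e_voids)

        print("machine cells:", machine_cells, "part cells:", part_cells)
        print("grouping efficacy:", grouping_efficacy(A, machine_cells, part_cells))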

    The K2K SciBar Detector

    A new near detector, SciBar, for the K2K long-baseline neutrino oscillation experiment was installed to improve the measurement of the neutrino energy spectrum and to study neutrino interactions in the energy region around 1 GeV. SciBar is a 'fully active' tracking detector with fine segmentation, consisting of plastic scintillator bars. The detector was constructed in summer 2003 and has been taking data since October 2003. The basic design and initial performance are presented. Comment: 7 pages, 4 figures. Contributed to Proceedings of the 10th Vienna Conference on Instrumentation, Vienna, February 16-21, 2004

    Topology optimization for human proximal femur considering bi-modulus behavior of cortical bones

    The material in the human proximal femur is considered as a bi-modulus material, and the density distribution is predicted by a topology optimization method. To reduce the computational cost, the bi-modulus material is replaced with two isotropic materials in the simulation. The selection of the local material modulus is determined by the local stress state of the previous iteration. Compared with density predictions using a traditional isotropic material in the proximal femur, the bi-modulus material layouts differ noticeably. The results also demonstrate that the bi-modulus material model is better suited than the isotropic material model for simulating density prediction in the femur.
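
    A hedged sketch of the core update rule described above (an illustrative reconstruction, not the authors' code): each element's modulus is switched between tensile and compressive values according to the sign of the stress it carried in the previous iteration. The moduli and element stresses are hypothetical:

        import numpy as np

        # Hypothetical moduli (MPa) for the two isotropic substitutes:
        # stiffer in tension than in compression; values are illustrative.
        E_TENSION, E_COMPRESSION = 16_000.0, 9_000.0

        def update_moduli(prev_stress: np.ndarray) -> np.ndarray:
            """Pick each element's modulus from the previous iteration's stress
            state: tensile elements get E_TENSION, compressive ones E_COMPRESSION."""
            return np.where(prev_stress >= 0.0, E_TENSION, E_COMPRESSION)

        # One step of the alternating loop: FE solve (stubbed) -> modulus update.
        prev_stress = np.array([12.5, -8.0, 3.1, -0.4])   # per-element stress, MPa
        print(update_moduli(prev_stress))                 # [16000. 9000. 16000. 9000.]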

    Transcriptomics in Toxicogenomics, Part III: Data Modelling for Risk Assessment

    Transcriptomics data are relevant to address a number of challenges in Toxicogenomics (TGx). After careful planning of exposure conditions and data preprocessing, the TGx data can be used in predictive toxicology, where more advanced modelling techniques are applied. The large volume of molecular profiles produced by omics-based technologies allows the development and application of artificial intelligence (AI) methods in TGx. Indeed, the publicly available omics datasets are constantly increasing, together with a plethora of different methods made available to facilitate their analysis, interpretation and the generation of accurate and stable predictive models. In this review, we present the state of the art of data modelling applied to transcriptomics data in TGx. We show how benchmark dose (BMD) analysis can be applied to TGx data. We review read-across and adverse outcome pathway (AOP) modelling methodologies. We discuss how network-based approaches can be successfully employed to clarify the mechanism of action (MOA) or to identify specific biomarkers of exposure. We also describe the main AI methodologies applied to TGx data to create predictive classification and regression models, and we address current challenges. Finally, we present a short description of deep learning (DL) and data integration methodologies applied in these contexts. Modelling of TGx data represents a valuable tool for more accurate chemical safety assessment. This review is the third part of a three-article series on Transcriptomics in Toxicogenomics.
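
    As a hedged illustration of benchmark dose (BMD) analysis on dose-response data (a generic sketch, not the specific methodology of any tool the review surveys): fit a Hill curve to per-dose expression responses, then invert it at a chosen benchmark response. The data, model choice, and 10% benchmark level are assumptions:

        import numpy as np
        from scipy.optimize import curve_fit

        def hill(dose, base, top, ec50, n):
            """Four-parameter Hill dose-response curve."""
            return base + (top - base) * dose**n / (ec50**n + dose**n)

        # Hypothetical expression response of one gene across doses.
        doses = np.array([0.0, 0.1, 0.3, 1.0, 3.0, 10.0])
        resp  = np.array([1.00, 1.02, 1.10, 1.45, 1.85, 2.00])

        popt, _ = curve_fit(hill, doses, resp, p0=[1.0, 2.0, 1.0, 1.0],
                            bounds=([0, 0, 1e-6, 0.1], [10, 10, 100, 10]))
        base, top, ec50, n = popt

        # BMD: the dose where the response reaches 10% of the fitted dynamic
        # range above baseline (the chosen benchmark response).
        f = 0.10
        bmd = ec50 * (f / (1 - f)) ** (1 / n)
        print(f"BMD(10%) ~= {bmd:.3g}")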

    Transcriptomics in Toxicogenomics, Part I: Experimental Design, Technologies, Publicly Available Data, and Regulatory Aspects

    The starting point of successful hazard assessment is the generation of unbiased and trustworthy data. Conventional toxicity testing deals with extensive observations of phenotypic endpoints in vivo and complementing in vitro models. The increasing development of novel materials and chemical compounds dictates the need for a better understanding of the molecular changes occurring in exposed biological systems. Transcriptomics enables the exploration of organisms’ responses to environmental, chemical, and physical agents by observing the molecular alterations in more detail. Toxicogenomics (TGx) integrates classical toxicology with omics assays, thus allowing the characterization of the mechanism of action (MOA) of chemical compounds, novel small molecules, and engineered nanomaterials (ENMs). Lack of standardization in data generation and analysis currently hampers the full exploitation of toxicogenomics-based evidence in risk assessment. To fill this gap, TGx methods need to take into account appropriate experimental design and possible pitfalls in the transcriptomic analyses, as well as data generation and sharing that adhere to the FAIR (Findable, Accessible, Interoperable, and Reusable) principles. In this review, we summarize the recent advancements in the design and analysis of DNA microarray, RNA sequencing (RNA-Seq), and single-cell RNA-Seq (scRNA-Seq) data. We provide guidelines on exposure time, dose and complex endpoint selection, sample quality considerations, and sample randomization. Furthermore, we summarize publicly available data resources and highlight applications of TGx data to understand and predict chemical toxicity potential. Additionally, we discuss the efforts to implement TGx into regulatory decision making to promote alternative methods for risk assessment and to support the 3R (reduction, refinement, and replacement) concept. This review is the first part of a three-article series on Transcriptomics in Toxicogenomics. These initial considerations on Experimental Design, Technologies, Publicly Available Data, and Regulatory Aspects are the starting point for further rigorous and reliable data preprocessing and modelling, described in the second and third parts of the review series.
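
    As a hedged illustration of one design guideline named above, sample randomization, the sketch below distributes exposure groups evenly across processing batches so that batch effects are not confounded with treatment. The group sizes, batch count, and sample names are hypothetical:

        import random

        def randomize_to_batches(samples, n_batches, seed=42):
            """Shuffle samples within each treatment group, then deal them
            round-robin across batches so every batch holds a balanced mix."""
            rng = random.Random(seed)
            batches = [[] for _ in range(n_batches)]
            for group, members in samples.items():
                members = members[:]
                rng.shuffle(members)
                for i, s in enumerate(members):
                    batches[i % n_batches].append((group, s))
            for b in batches:
                rng.shuffle(b)   # also randomize run order within each batch
            return batches

        # Hypothetical design: 3 exposure groups x 6 replicates, 3 batches.
        design = {g: [f"{g}_{i}" for i in range(6)] for g in ("ctrl", "low", "high")}
        for i, b in enumerate(randomize_to_batches(design, 3)):
            print(f"batch {i}: {b}")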

    Transcriptomics in Toxicogenomics, Part II: Preprocessing and Differential Expression Analysis for High Quality Data

    Preprocessing of transcriptomics data plays a pivotal role in the development of toxicogenomics-driven tools for chemical toxicity assessment. The generation and exploitation of large volumes of molecular profiles, following an appropriate experimental design, allows the employment of toxicogenomics (TGx) approaches for a thorough characterisation of the mechanism of action (MOA) of different compounds. To date, a plethora of data preprocessing methodologies have been suggested. However, in most cases, building the optimal analytical workflow is not straightforward. A careful selection of the right tools must be carried out, since it will affect the downstream analyses and modelling approaches. Transcriptomics data preprocessing spans multiple steps, such as quality check, filtering, normalization, and batch effect detection and correction. Currently, there is a lack of standard guidelines for data preprocessing in the TGx field. Defining the optimal tools and procedures to be employed in transcriptomics data preprocessing will lead to the generation of homogeneous and unbiased data, allowing the development of more reliable, robust and accurate predictive models. In this review, we outline methods for the preprocessing of three main transcriptomic technologies: microarray, bulk RNA-Sequencing (RNA-Seq), and single-cell RNA-Sequencing (scRNA-Seq). Moreover, we discuss the most common methods for the identification of differentially expressed genes and for functional enrichment analysis. This review is the second part of a three-article series on Transcriptomics in Toxicogenomics.
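
    A hedged, minimal sketch of the preprocessing and differential expression steps named above (filtering, normalization, per-gene testing, multiple-testing correction), using a generic count matrix and simple CPM/t-test choices in place of the dedicated tools the review surveys; all data and thresholds are assumptions:

        import numpy as np
        from scipy import stats
        from statsmodels.stats.multitest import multipletests

        rng = np.random.default_rng(0)
        # Hypothetical counts: 200 genes x 6 samples (3 control, 3 exposed).
        counts = rng.poisson(50, size=(200, 6))
        counts[:10, 3:] += 40                   # spike in 10 "responsive" genes
        group = np.array([0, 0, 0, 1, 1, 1])

        # 1) Filter genes with consistently low counts.
        keep = (counts >= 10).sum(axis=1) >= 3
        counts = counts[keep]

        # 2) Library-size normalization to log2 counts-per-million.
        cpm = counts / counts.sum(axis=0) * 1e6
        logcpm = np.log2(cpm + 1.0)

        # 3) Per-gene two-sample test (Welch's t), then 4) BH FDR correction.
        t, p = stats.ttest_ind(logcpm[:, group == 1], logcpm[:, group == 0],
                               axis=1, equal_var=False)
        rejected, q, _, _ = multipletests(p, alpha=0.05, method="fdr_bh")
        print(f"{rejected.sum()} genes called differentially expressed (FDR < 0.05)")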

    Jejunal Variceal Bleeding Successfully Treated with Percutaneous Coil Embolization

    A 52-yr-old male with alcoholic liver cirrhosis was hospitalized for hematochezia. He had undergone small-bowel resection due to trauma 15 yr previously. Esophagogastroduodenoscopy showed grade 1 esophageal varices without bleeding. No bleeding lesion was seen on colonoscopy, but capsule endoscopy showed suspicious bleeding from angiodysplasia in the small bowel. After 2 weeks of conservative treatment, the hematochezia stopped. However, 1 week later, the patient was re-admitted with hematochezia and a hemoglobin level of 5.5 g/dL. Capsule endoscopy was repeated and showed active bleeding in the mid-jejunum. Abdominal computed tomography revealed a varix in the jejunal branch of the superior mesenteric vein. A direct portogram performed via the transhepatic route showed portosystemic collaterals at the distal jejunum. The patient underwent coil embolization of the superior mesenteric vein just above the portosystemic collaterals and was subsequently discharged without re-bleeding. Eight months after discharge, his condition has remained stable, without further bleeding episodes.