246 research outputs found

    Online optical monitoring of polymer melting in a twin-screw extruder

    An experimental setup containing a sliding online optical device is used to monitor in real time the melting process of a commercial polypropylene in a corotating intermeshing twin-screw extruder. Turbidity and birefringence are measured at several axial locations upstream of and along the first restrictive zone of the screw, where melting develops. The experiments are performed using different set barrel temperatures, extruder feed rates, and screw speeds to generate distinct flow histories and, accordingly, changes in the onset and rate of melting of the polymer. The local flow conditions are characterized in terms of residence time distribution and data equivalent to axial pressure profiles. Turbidity and birefringence are sensitive to changes in the operating conditions, providing a coherent description of melting. The onset of melting seems to take place in partially filled conveying elements; melting then develops quickly as these become fully filled, and is completed well before flow through the kneading block.
    Funding: Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - Brasil (CAPES), Finance Code 001, PDSE 88881.132167/2016-01 scholarship to L.A. Bicalho; grant PVE 30484/2013-01 to J.A. Covas; Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq), PQ scholarship 311790/2013-5 to S.V. Canevarolo; and the Programa de Pós-graduação em Ciência e Engenharia de Materiais (PPG-CEM) of UFSCar.

    Integrative analysis to select cancer candidate biomarkers to targeted validation

    Targeted proteomics has flourished as the method of choice for prospecting for and validating candidate biomarkers in many diseases. However, challenges remain due to the lack of standardized routines that can prioritize a limited number of proteins for further validation in human samples. To help researchers identify the candidate biomarkers that best characterize their samples under study, a well-designed integrative analysis pipeline, comprising MS-based discovery, feature selection methods, clustering techniques, bioinformatic analyses, and targeted approaches, was applied to discovery-based proteomic data from the secretomes of three classes of human cell lines (carcinoma, melanoma, and non-cancerous). Three feature selection algorithms, namely Beta-binomial, Nearest Shrunken Centroids (NSC), and Support Vector Machine-Recursive Feature Elimination (SVM-RFE), indicated a panel of 137 candidate biomarkers for carcinoma and 271 for melanoma that were differentially abundant between the tumor classes. We further tested the strength of the pipeline in selecting candidate biomarkers by immunoblotting, human tissue microarrays, label-free targeted MS, and functional experiments. In conclusion, the proposed integrative analysis was able to pre-qualify and prioritize candidate biomarkers from discovery-based proteomics to targeted MS.
    Funding: FAPESP - Fundação de Amparo à Pesquisa do Estado de São Paulo (grants 2009/54067-3; 2010/19278-0; 2011/22421-2; 2009/53839-2) and CNPq - Conselho Nacional de Desenvolvimento Científico e Tecnológico (grants 470567/2009-0; 470549/2011-4; 301702/2011-0; 470268/2013-).
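    One of the feature selection algorithms named above, SVM-RFE, can be sketched with scikit-learn's `RFE` wrapper around a linear SVM. This is a minimal illustration only: the synthetic abundance matrix, sample counts, and panel size below are placeholders, not the study's secretome data or parameters.

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.feature_selection import RFE

    rng = np.random.default_rng(0)
    # Synthetic stand-in for a protein-abundance matrix: 60 samples x 40 "proteins"
    X = rng.normal(size=(60, 40))
    y = np.repeat([0, 1], 30)   # two hypothetical tumor classes
    X[y == 1, :5] += 2.0        # make the first 5 features differentially abundant

    # SVM-RFE: recursively drop the feature with the smallest linear-SVM weight
    selector = RFE(SVC(kernel="linear"), n_features_to_select=10, step=1)
    selector.fit(X, y)
    panel = np.flatnonzero(selector.support_)  # indices of the retained candidates
    print(panel)
    ```

    On real data the retained panel would then be carried forward to orthogonal validation (immunoblotting, tissue microarrays, targeted MS), as the abstract describes.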

    Virgo Detector Characterization and Data Quality during the O3 run

    The Advanced Virgo detector has contributed with its data to the rapid growth of the number of detected gravitational-wave signals in the past few years, alongside the two LIGO instruments: first during the last month of Observation Run 2 (O2) in August 2017 (with, most notably, the compact binary mergers GW170814 and GW170817), and then during the full Observation Run 3 (O3), an 11-month data-taking period between April 2019 and March 2020 that led to the addition of about 80 events to the catalog of transient gravitational-wave sources maintained by LIGO, Virgo and KAGRA. These discoveries and the manifold exploitation of the detected waveforms require an accurate characterization of the quality of the data, including continuous study and monitoring of the detector noise. These activities, collectively named {\em detector characterization} or {\em DetChar}, span the whole workflow of the Virgo data, from the instrument front-end to the final analysis. They are described in detail in the following article, with a focus on the associated tools, the results achieved by the Virgo DetChar group during the O3 run, and the main prospects for future data-taking periods with an improved detector.
    Comment: 86 pages, 33 figures. This paper has been divided into two articles which supersede it and were posted to arXiv in October 2022. Please use these new preprints as references: arXiv:2210.15634 (tools and methods) and arXiv:2210.15633 (results from the O3 run).

    Virgo Detector Characterization and Data Quality: results from the O3 run

    The Advanced Virgo detector has contributed with its data to the rapid growth of the number of detected gravitational-wave (GW) signals in the past few years, alongside the two Advanced LIGO instruments: first during the last month of Observation Run 2 (O2) in August 2017 (with, most notably, the compact binary mergers GW170814 and GW170817), and then during the full Observation Run 3 (O3), an 11-month data-taking period between April 2019 and March 2020 that led to the addition of about 80 events to the catalog of transient GW sources maintained by LIGO, Virgo and now KAGRA. These discoveries and the manifold exploitation of the detected waveforms require an accurate characterization of the quality of the data, including continuous study and monitoring of the detector noise sources. These activities, collectively named {\em detector characterization and data quality} or {\em DetChar}, span the whole workflow of the Virgo data, from the instrument front-end hardware to the final analyses. They are described in detail in the following article, with a focus on the results achieved by the Virgo DetChar group during the O3 run. Concurrently, a companion article describes the tools that have been used by the Virgo DetChar group to perform this work.
    Comment: 57 pages, 18 figures. To be submitted to Class. and Quantum Grav. This is the "Results" part of preprint arXiv:2205.01555 [gr-qc], which has been split into two companion articles: one about the tools and methods, the other about the analyses of the O3 Virgo data.

    Virgo Detector Characterization and Data Quality: tools

    Detector characterization and data quality studies, collectively referred to as {\em DetChar} activities in this article, are paramount to the scientific exploitation of the joint dataset collected by the LIGO-Virgo-KAGRA global network of ground-based gravitational-wave (GW) detectors. They take place during each phase of the operation of the instruments (upgrade, tuning and optimization, data taking), are required at all steps of the dataflow (from data acquisition to the final list of GW events), and operate at various latencies (from near real-time vetting of the public alerts to offline analyses). This work requires a wide set of tools, developed over the years to fulfill the requirements of the various DetChar studies: data access and bookkeeping; global monitoring of the instruments and of the different steps of the data processing; studies of the global properties of the noise at the detector outputs; identification and follow-up of peculiar noise features (whether transient or continuously present in the data); and quick processing of the public alerts. The present article reviews all the tools used by the Virgo DetChar group during the third LIGO-Virgo Observation Run (O3, from April 2019 to March 2020), mainly to analyse the Virgo data acquired at EGO. Concurrently, a companion article focuses on the results achieved by the DetChar group during the O3 run using these tools.
    Comment: 44 pages, 16 figures. To be submitted to Class. and Quantum Grav. This is the "Tools" part of preprint arXiv:2205.01555 [gr-qc], which has been split into two companion articles: one about the tools and methods, the other about the analyses of the O3 Virgo data.