
    Experimental study of solubility of elemental sulphur in methane

    The chemical engineering department of LaTEP has been working for many years on the problem of sulphur deposition, especially in natural gas networks [1, 2]. Solid sulphur appears immediately downstream of a pressure reduction facility. One of the hypotheses proposed to explain the solid formation, based on a thermodynamic approach, is the desublimation of sulphur. During gas expansion, both pressure and temperature decrease; consequently, the gas may become oversaturated in sulphur. Because the conditions are below the temperature of the sulphur triple point, part of the gaseous sulphur can be transformed into solid particles. It is therefore important to obtain solubility data for sulphur in natural gases. Since methane is the major component of natural gas, it is important to measure the solubility of elemental sulphur in CH4. In this paper, experimental measurements up to a pressure of 30 MPa and a temperature of 363.15 K are presented. The principle of the experimental pilot can be summarized in three steps: saturation of the gas with sulphur, trapping of all the dissolved gaseous sulphur, and finally quantification. Although the principle is simple, experimental difficulties arise at each of the three steps. A variable-volume equilibrium cell is used to saturate the gas with sulphur. Since the sulphur solubility is low at gas transport conditions, the volume of the cell is necessarily large (0.5 litre). The pressure of the equilibrium cell is held constant by a piston during the trapping step. An original gaseous sulphur trapping method was developed, based on reactive absorption of the gaseous sulphur in a solvent: the gas bubbles through a liquid solution which traps the gaseous sulphur. Finally, the solution, which contains a standard, is analysed by gas chromatography and the sulphur is quantified. The total volume of gas withdrawn is determined by a position transducer placed on the autoclave, and the sulphur solubility value is then calculated.
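    As a rough illustration of that final calculation step, the sketch below converts a trapped sulphur mass and a withdrawn gas volume into a gas-phase mole fraction. The function, the compressibility factor and the numerical values are illustrative assumptions and are not taken from the paper.

```python
# Illustrative sketch (not from the paper): convert the trapped sulphur mass and the
# total gas volume withdrawn into a solubility expressed as a mole fraction.
R = 8.314  # J/(mol*K)

def sulphur_solubility(m_sulphur_mg, v_gas_m3, pressure_pa, temp_k, z_factor=0.9):
    """Estimate sulphur solubility from one trapping experiment.

    m_sulphur_mg : sulphur mass quantified by GC (mg), via the standard
    v_gas_m3     : total gas volume withdrawn at cell conditions (m3), from the piston travel
    pressure_pa  : equilibrium-cell pressure (Pa), held constant by the piston
    temp_k       : cell temperature (K)
    z_factor     : assumed compressibility factor of methane at these conditions
    """
    M_S = 32.065e-3                      # molar mass of atomic sulphur, kg/mol
    n_s = (m_sulphur_mg * 1e-6) / M_S    # moles of sulphur (as S) trapped
    n_gas = pressure_pa * v_gas_m3 / (z_factor * R * temp_k)  # moles of methane withdrawn
    y_s = n_s / (n_gas + n_s)            # sulphur mole fraction in the gas phase
    return y_s

# Example with made-up numbers: 0.05 mg S trapped from 2 L of gas at 30 MPa and 363.15 K
print(sulphur_solubility(0.05, 2e-3, 30e6, 363.15))
```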

    Development and Application of Ultra-Performance Liquid Chromatography-TOF MS for Precision Large Scale Urinary Metabolic Phenotyping

    To better understand the molecular mechanisms underpinning physiological variation in human populations, metabolic phenotyping approaches are increasingly being applied to studies involving hundreds and thousands of biofluid samples. Hyphenated ultra-performance liquid chromatography-mass spectrometry (UPLC-MS) has become a fundamental tool for this purpose. However, the seemingly inevitable need to analyze large studies in multiple analytical batches for UPLC-MS analysis poses a challenge to data quality which has been recognized in the field. Herein, we describe in detail a fit-for-purpose UPLC-MS platform, method set, and sample analysis workflow, capable of sustained analysis on an industrial scale and allowing batch-free operation for large studies. Using complementary reversed-phase chromatography (RPC) and hydrophilic interaction liquid chromatography (HILIC) together with high resolution orthogonal acceleration time-of-flight mass spectrometry (oaTOF-MS), exceptional measurement precision is exemplified with independent epidemiological sample sets of approximately 650 and 1000 participant samples. Evaluation of molecular reference targets in repeated injections of pooled quality control (QC) samples distributed throughout each experiment demonstrates a mean retention time relative standard deviation (RSD) of <0.3% across all assays in both studies and a mean peak area RSD of <15% in the raw data. To more globally assess the quality of the profiling data, untargeted feature extraction was performed, followed by data filtration according to feature intensity response to QC sample dilution. Analysis of the remaining features within the repeated QC sample measurements demonstrated median peak area RSD values of <20% for the RPC assays and <25% for the HILIC assays. These values represent the quality of the raw data, as no normalization or feature-specific intensity correction was applied. While the data in each experiment were acquired in a single continuous batch, instances of minor time-dependent intensity drift were observed, highlighting that data correction techniques remain useful even when the dependency on them for generating high quality data is reduced. These results demonstrate that the platform and methodology presented herein are fit for use in large scale metabolic phenotyping studies, challenging the assertion that such screening is inherently limited by batch effects. Details of the pipeline used to generate high quality raw data and mitigate the need for batch correction are provided.
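    As a rough sketch of the two quality checks quoted above, the snippet below computes per-feature RSD values across repeated pooled-QC injections and filters features by how well their intensity tracks a QC dilution series. The data layout, file name and the 0.7 correlation cut-off are assumptions for illustration, not the authors' pipeline.

```python
# Illustrative sketch (assumed data layout): rows = features, columns = repeated pooled-QC
# injections, plus a separate dilution-series table. RSD thresholds mirror those quoted above.
import numpy as np
import pandas as pd

def rsd(values):
    """Relative standard deviation in percent."""
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

def filter_by_dilution(dilution_table, dilution_factors, min_corr=0.7):
    """Keep features whose intensity tracks the QC dilution series.

    dilution_table   : DataFrame, features x dilution-series injections
    dilution_factors : expected relative concentration of each injection
    """
    corr = dilution_table.apply(
        lambda row: np.corrcoef(row.values, dilution_factors)[0, 1], axis=1)
    return dilution_table.index[corr >= min_corr]

# qc = pd.read_csv("qc_peak_areas.csv", index_col=0)   # hypothetical features x QC table
# area_rsd = qc.apply(rsd, axis=1)
# print((area_rsd < 20).mean())  # fraction of features with peak-area RSD < 20%
```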

    MetaFIND: A feature analysis tool for metabolomics data

    Background: Metabolomics, or metabonomics, refers to the quantitative analysis of all metabolites present within a biological sample and is generally carried out using NMR spectroscopy or mass spectrometry. Such analysis produces a set of peaks, or features, indicative of the metabolic composition of the sample and may be used as a basis for sample classification. Feature selection may be employed to improve classification accuracy or aid model explanation by establishing a subset of class discriminating features. Factors such as experimental noise, choice of technique and threshold selection may adversely affect the set of selected features retrieved. Furthermore, the high dimensionality and multi-collinearity inherent within metabolomics data may exacerbate discrepancies between the set of features retrieved and those required to provide a complete explanation of metabolite signatures. Given these issues, the latter in particular, we present the MetaFIND application for 'post-feature selection' correlation analysis of metabolomics data. Results: In our evaluation we show how MetaFIND may be used to elucidate metabolite signatures from the set of features selected by diverse techniques over two metabolomics datasets. Importantly, we also show how MetaFIND may augment standard feature selection and aid the discovery of additional significant features, including those which represent novel class discriminating metabolites. MetaFIND also supports the discovery of higher level metabolite correlations. Conclusion: Standard feature selection techniques may fail to capture the full set of relevant features in the case of high dimensional, multi-collinear metabolomics data. We show that the MetaFIND 'post-feature selection' analysis tool may aid metabolite signature elucidation, feature discovery and inference of metabolic correlations.
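    The general idea of 'post-feature selection' correlation analysis can be sketched as follows: for each feature already chosen by a selection technique, list the other spectral features that correlate strongly with it. The snippet below is a minimal illustration of that concept, not MetaFIND's actual implementation; the data layout and the 0.8 threshold are assumptions.

```python
# Minimal sketch of post-feature-selection correlation analysis: for each selected feature,
# find additional features that correlate strongly with it (and may belong to the same
# metabolite signature). Data layout and threshold are illustrative assumptions.
import pandas as pd

def correlated_features(data, selected, threshold=0.8):
    """data: samples x features DataFrame; selected: iterable of selected feature names."""
    corr = data.corr()  # pairwise Pearson correlation between features
    extra = {}
    for feat in selected:
        partners = corr[feat].drop(feat)  # drop the self-correlation
        extra[feat] = partners[partners.abs() >= threshold].index.tolist()
    return extra

# data = pd.read_csv("nmr_peaks.csv", index_col=0)          # hypothetical peak table
# print(correlated_features(data, ["peak_3.21ppm", "peak_7.45ppm"]))
```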

    Effect of pre- and postnatal growth and post-weaning activity on glucose metabolism in the offspring

    Maternal caloric restriction during late gestation reduces birth weight, but whether long-term adverse metabolic outcomes of intra-uterine growth retardation (IUGR) are dependent on either accelerated postnatal growth or exposure to an obesogenic environment after weaning is not established. We induced IUGR in twin-pregnant sheep using a 40% maternal caloric restriction commencing from 110 days of gestation until term (∼147 days), compared with mothers fed to 100% of requirements. Offspring were reared either as singletons to accelerate postnatal growth or as twins to achieve standard growth. To promote an adverse phenotype in young adulthood, after weaning, offspring were reared under a low-activity obesogenic environment, with the exception of a subgroup of IUGR offspring, reared as twins, that was maintained in a standard activity environment. We assessed glucose tolerance together with leptin and cortisol responses to feeding in young adulthood, when the hypothalamus was sampled for assessment of genes regulating appetite control, energy and endocrine sensitivity. Caloric restriction reduced maternal plasma glucose, raised non-esterified fatty acids, and changed the metabolomic profile, but had no effect on insulin, leptin, or cortisol. IUGR offspring whose postnatal growth was enhanced and who were obese showed insulin and leptin resistance plus raised cortisol. This was accompanied by increased hypothalamic gene expression for energy and glucocorticoid sensitivity. These long-term adaptations were reduced but not normalized in IUGR offspring whose postnatal growth was not accelerated and who remained lean in a standard post-weaning environment. IUGR results in an adverse metabolic phenotype, especially when postnatal growth is enhanced and offspring progress to juvenile-onset obesity.

    Statistical HOmogeneous Cluster SpectroscopY (SHOCSY): an optimized statistical approach for clustering of ¹H NMR spectral data to reduce interference and enhance robust biomarkers selection.

    We propose a novel statistical approach to improve the reliability of ¹H NMR spectral analysis in complex metabolic studies. The Statistical HOmogeneous Cluster SpectroscopY (SHOCSY) algorithm aims to reduce the variation within biological classes by selecting subsets of homogeneous ¹H NMR spectra that contain specific spectroscopic metabolic signatures related to each biological class in a study. In SHOCSY, we used a clustering method to categorize the whole data set into a number of clusters of samples, each cluster showing a similar spectral feature and hence biochemical composition, and we then used an enrichment test to identify the associations between the clusters and the biological classes in the data set. We evaluated the performance of the SHOCSY algorithm using a simulated ¹H NMR data set to emulate renal tubule toxicity and further exemplified this method with a ¹H NMR spectroscopic study of hydrazine-induced liver toxicity in rats. The SHOCSY algorithm improved the predictive ability of the orthogonal partial least-squares discriminant analysis (OPLS-DA) model through the use of "truly" representative samples in each biological class (i.e., homogeneous subsets). This method ensures that the analyses are no longer confounded by idiosyncratic responders and thus improves the reliability of biomarker extraction. SHOCSY is a useful tool for removing irrelevant variation that interferes with the interpretation and predictive ability of models, and it has widespread applicability to other spectroscopic data as well as other "omics" types of data.
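    A minimal sketch of the two stages described above, clustering the spectra and then testing each cluster for enrichment in a biological class, is given below. K-means and Fisher's exact test are stand-ins chosen for illustration; the published algorithm's clustering method, cluster-number selection and enrichment test may differ.

```python
# Sketch of the two SHOCSY stages described above: cluster the spectra, then test each
# cluster for enrichment in one biological class. K-means and Fisher's exact test are
# illustrative stand-ins, not necessarily the published algorithm's choices.
import numpy as np
from sklearn.cluster import KMeans
from scipy.stats import fisher_exact

def cluster_enrichment(spectra, class_labels, n_clusters=3):
    """spectra: samples x chemical-shift-bins array; class_labels: boolean array (in class)."""
    clusters = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(spectra)
    enrichment_p = {}
    for c in range(n_clusters):
        in_cluster = clusters == c
        table = [[np.sum(in_cluster & class_labels), np.sum(in_cluster & ~class_labels)],
                 [np.sum(~in_cluster & class_labels), np.sum(~in_cluster & ~class_labels)]]
        enrichment_p[c] = fisher_exact(table, alternative="greater")[1]  # p-value
    return clusters, enrichment_p
```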

    State-of-the art data normalization methods improve NMR-based metabolomic analysis

    Extracting biomedical information from large metabolomic datasets by multivariate data analysis is of considerable complexity. Common challenges include, among others, screening for differentially produced metabolites, estimation of fold changes, and sample classification. Prior to these analysis steps, it is important to minimize contributions from unwanted biases and experimental variance. This is the goal of data preprocessing. In this work, different data normalization methods were compared systematically employing two different datasets generated by means of nuclear magnetic resonance (NMR) spectroscopy. To this end, two different types of normalization methods were used: one aims to remove unwanted sample-to-sample variation, while the other adjusts the variance of the different metabolites by variable scaling and variance stabilization methods. The impact of all methods tested on sample classification was evaluated on urinary NMR fingerprints obtained from healthy volunteers and patients suffering from autosomal dominant polycystic kidney disease (ADPKD). Performance in terms of screening for differentially produced metabolites was investigated on a dataset following a Latin-square design, where varied amounts of 8 different metabolites were spiked into a human urine matrix while keeping the total spike-in amount constant. In addition, specific tests were conducted to systematically investigate the influence of the different preprocessing methods on the structure of the analyzed data. In conclusion, preprocessing methods originally developed for DNA microarray analysis, in particular Quantile and Cubic-Spline Normalization, performed best in reducing bias, accurately detecting fold changes, and classifying samples.
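    Quantile normalization, one of the microarray-derived methods found to perform best here, can be sketched in a few lines: each sample's values are replaced by the mean value observed at the same rank across all samples. The implementation below is a minimal illustration with a simplistic treatment of ties; production implementations handle ties more carefully.

```python
# Minimal quantile normalization (samples as columns): replace every value by the mean of
# the values holding the same rank across all samples. Ties are resolved by rank order only.
import numpy as np

def quantile_normalize(x):
    """x: 2-D array, rows = spectral variables (bins), columns = samples."""
    order = np.argsort(x, axis=0)                 # rank of each value within its sample
    mean_dist = np.sort(x, axis=0).mean(axis=1)   # mean value at each rank across samples
    normalized = np.empty_like(x, dtype=float)
    for j in range(x.shape[1]):
        normalized[order[:, j], j] = mean_dist    # replace each value by its rank mean
    return normalized

# Example: four spectral bins measured in three urine profiles with different dilutions
x = np.array([[5.0, 4.0, 3.0],
              [2.0, 1.0, 4.0],
              [3.0, 4.0, 6.0],
              [4.0, 2.0, 8.0]])
print(quantile_normalize(x))
```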

    A graphical method for performance mapping of machines and milling tools

    Optimal design of the machining setup in terms of installed machines, cutting tools and process parameters is of paramount importance for every manufacturing company. In most metal cutting companies, choices about machine eligibility and cutting parameter selection typically come from heuristic approaches, follow supplier indications, or are based on the skill of experienced machine operators. More advanced solutions, such as model-based and virtual approaches, are adopted less frequently, mainly because these techniques struggle to capture the underlying knowledge successfully. The aim of this work is to introduce a synthetic graphical representation of the capabilities of machining centers and cutting tools, providing an accessible way to evaluate the feasibility and close-to-limit conditions of the cutting process. Taking inspiration from previous scientific work in the measurement engineering field, a set of 2D and 3D graphs is presented to map machine, tool and process capabilities, as well as the obtainable manufacturing performance and expected tool life. This approach synthesizes the nominal data coming from different sources (catalogues, databases, tool model geometries, etc.) and the real cutting tool parameters used during the production phase. Some examples are provided to show the potential of this graphical evaluation in supporting process planning and decision-making and in formalizing machining setup knowledge. Further developments are devoted to extending the method to other manufacturing processes, including hybrid processes. At the same time, an in-process data gathering software tool will be integrated to build a solid database that can be used by an autonomous multi-technological process selector, as well as by a pre-process condition advisor in an Industry 4.0 oriented way.
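    A toy example of the kind of 2D capability map proposed here is sketched below: the feasible region of the cutting-speed / feed-per-tooth plane is drawn against a spindle-power limit and a minimum tool life. The cutting-power and Taylor-type tool-life relations and all numerical limits are generic textbook assumptions, not the models or data of this work.

```python
# Toy 2D capability map in the spirit of the paper: spindle-power and tool-life limits drawn
# over the cutting-speed / feed-per-tooth plane. All relations and numbers are generic
# textbook assumptions used only for illustration.
import numpy as np
import matplotlib.pyplot as plt

vc = np.linspace(50, 400, 200)          # cutting speed, m/min
fz = np.linspace(0.02, 0.3, 200)        # feed per tooth, mm
VC, FZ = np.meshgrid(vc, fz)

d, z, ap, ae, kc1 = 12.0, 4, 5.0, 10.0, 1800.0  # tool diameter, teeth, depths, spec. force
n = VC * 1000 / (np.pi * d)                     # spindle speed, rev/min
vf = FZ * z * n                                 # table feed, mm/min
power = ap * ae * vf * kc1 / 60e6               # rough cutting power, kW
tool_life = 1e7 * VC**-2.5 * FZ**-0.8           # assumed Taylor-type tool-life surface, min

fig, ax = plt.subplots()
ax.contourf(VC, FZ, (power < 10) & (tool_life > 30), levels=[0.5, 1], colors=["#9ecae1"])
ax.contour(VC, FZ, power, levels=[10], colors="k")        # 10 kW spindle-power limit
ax.contour(VC, FZ, tool_life, levels=[30], colors="r")    # 30 min minimum tool life
ax.set_xlabel("cutting speed vc [m/min]")
ax.set_ylabel("feed per tooth fz [mm]")
ax.set_title("Feasible cutting window (illustrative limits)")
plt.show()
```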

    Combined systems approaches reveal highly plastic responses to antimicrobial peptide challenge in Escherichia coli

    Obtaining an in-depth understanding of the arms race between the peptides comprising the innate immune response and bacterial pathogens is of fundamental interest and will inform the development of new antibacterial therapeutics. We investigated whether a whole organism view of antimicrobial peptide (AMP) challenge on Escherichia coli would provide a suitably sophisticated bacterial perspective on the AMP mechanism of action. Selecting AMPs that are structurally and physically related but expected to differ in bactericidal strategy, we monitored changes in bacterial metabolomes, morphological features and gene expression following AMP challenge at sub-lethal concentrations. For each technique, the vast majority of changes were specific to each AMP; such a plastic response indicates that E. coli is highly capable of discriminating between specific antibiotic challenges. Analysis of the ontological profiles generated from the transcriptomic analyses suggests this approach can accurately predict the antibacterial mode of action, providing a fresh perspective on previous functional and biophysical studies.