
    High Abundance Proteins Depletion vs Low Abundance Proteins Enrichment: Comparison of Methods to Reduce the Plasma Proteome Complexity

    BACKGROUND: To date, the complexity of the plasma proteome exceeds the analytical capacity of conventional approaches to isolate lower abundance proteins that may prove to be informative biomarkers. Only complex multistep separation strategies have been able to detect a substantial number of low abundance proteins (<100 ng/ml). The first step of these protocols is generally the depletion of high abundance proteins by the use of immunoaffinity columns or, alternatively, the enrichment of low abundance proteins by the use of solid-phase hexapeptide ligand libraries. METHODOLOGY/PRINCIPAL FINDINGS: Here we present a direct comparison of these two approaches. Following either approach, the plasma sample was further fractionated by SCX chromatography and analyzed by RP-LC-MS/MS with a Q-TOF mass spectrometer. The depletion of the 20 most abundant plasma proteins allowed the identification of about 25% more proteins than were detectable following low abundance protein enrichment. The two datasets are partially overlapping, and the identified proteins fall within the same order of magnitude of plasma concentration. CONCLUSIONS/SIGNIFICANCE: Our results show that the two approaches give complementary results. However, the enrichment of low abundance proteins has the great advantage of yielding a much larger amount of material that can be used for further fractionation and analysis, and it also emerges as a cheaper and technically simpler approach. Collectively, these data indicate that the enrichment approach is more suitable as the first stage of a complex multi-step fractionation protocol.
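    The dataset comparison described above (partially overlapping identification lists, with roughly 25% more proteins after depletion) reduces to simple set operations. The sketch below is illustrative only: the protein accessions, counts, and function name are hypothetical placeholders, not the study's actual search results.

```python
# Sketch: comparing protein identification lists from the depletion and
# enrichment workflows. The accession sets below are hypothetical placeholders.

def compare_identifications(depletion_ids, enrichment_ids):
    """Return shared and approach-specific protein accessions."""
    depletion_ids, enrichment_ids = set(depletion_ids), set(enrichment_ids)
    shared = depletion_ids & enrichment_ids
    only_depletion = depletion_ids - enrichment_ids
    only_enrichment = enrichment_ids - depletion_ids
    return shared, only_depletion, only_enrichment

# Hypothetical accession lists standing in for the Q-TOF search results
depletion = ["P02768", "P01857", "P00738", "P02787"]
enrichment = ["P02768", "P00738", "P01009"]

shared, dep_only, enr_only = compare_identifications(depletion, enrichment)
print(f"shared: {len(shared)}, depletion-only: {len(dep_only)}, "
      f"enrichment-only: {len(enr_only)}")

# Relative gain of the depletion approach in this toy example
gain = (len(depletion) - len(enrichment)) / len(enrichment) * 100
print(f"depletion identified {gain:.0f}% more proteins here")
```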

    The use of 210Pb and 137Cs in the study of sediment pollution in the Lagoon of Venice

    The activity of 137Cs per unit area in sediments of the central part of the Lagoon of Venice was determined with the aim of identifying boundaries of homogeneous depositional zones. The 210Pb dating technique was used to date vertical profiles of cores from the same area. A comparison of the total amounts of 137Cs and unsupported 210Pb present in each core with the atmospheric input allows us to identify different depositional areas inside the basin and to outline the possible drainage effect of industrial solid wastes used in past reclamation operations. © 1988
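    One common form of 210Pb dating, the constant-initial-concentration (CIC) model, reduces to t = ln(A0/Az)/λ, where λ is the 210Pb decay constant (half-life about 22.3 years). The abstract does not state which dating model the authors applied, and the activity values in the sketch below are assumed for illustration only.

```python
# Sketch: CIC-model 210Pb dating (one common approach; the abstract does not
# specify the model actually used). All activity values are illustrative.
import math

PB210_HALF_LIFE_Y = 22.3                  # 210Pb half-life, years
LAMBDA = math.log(2) / PB210_HALF_LIFE_Y  # decay constant, 1/yr

def cic_age(unsupported_surface, unsupported_at_depth):
    """Age of a sediment layer from unsupported (excess) 210Pb activity."""
    return math.log(unsupported_surface / unsupported_at_depth) / LAMBDA

# Illustrative unsupported 210Pb activities (Bq/kg) down a hypothetical core
surface_activity = 120.0
profile = {0.05: 95.0, 0.15: 60.0, 0.30: 25.0}  # depth (m) -> activity

for depth, activity in profile.items():
    print(f"depth {depth:.2f} m: ~{cic_age(surface_activity, activity):.0f} yr")
```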

    Fallout distribution in Padua and Northeast Italy after the Chernobyl nuclear reactor accident

    The radioactive cloud from the Chernobyl nuclear reactor accident arrived in northeast Italy on 30 April 1986. Ground-level air activities detected in Padua reached maximum values of 28.6, 19.2, 3.3, 1.7 and 7.5 Bq m-3 for 131I, 132Te(132I), 137Cs, 134Cs and 103Ru, respectively, on 1 May; about 10 days later, the activities had fallen to less than 1% of peak values. Considerations of cloud homogeneity are reported. The distribution of fallout radionuclides in Padua was evaluated on the basis of radioactivity detected on natural surfaces. The average committed dose equivalent to the thyroid for adults in Padua via inhalation of 131I was estimated at 0.37 mSv. Soil activity was monitored daily in samples collected in Padua during the first weeks of May 1986. Fallout deposition over northeast Italy was measured on 75 surface soil samples collected during June 1986, and long-lived radionuclide distribution maps were derived. © 1988
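    An inhalation dose estimate of this kind has a standard structure: a time-integrated air concentration multiplied by a breathing rate and by a thyroid dose coefficient. The sketch below follows that structure only; the daily concentrations (apart from the reported 28.6 Bq m-3 peak), the breathing rate, and the dose coefficient are assumed illustrative values, not the parameters used by the authors.

```python
# Sketch of the generic structure of an inhalation thyroid-dose estimate:
# committed dose = time-integrated air concentration x breathing rate x dose
# coefficient. The numbers below are ASSUMED illustrative values, not the
# authors' data or parameters.

BREATHING_RATE_M3_PER_DAY = 22.0        # assumed adult reference breathing rate
THYROID_DOSE_COEFF_SV_PER_BQ = 2.9e-7   # assumed illustrative 131I coefficient

# Illustrative daily-mean 131I air concentrations (Bq/m3) during plume passage;
# only the first value corresponds to the peak reported in the abstract.
daily_air_conc = [28.6, 15.0, 8.0, 4.0, 2.0, 1.0, 0.5, 0.3, 0.2, 0.1]

integrated_conc = sum(daily_air_conc)                     # Bq*day/m3
intake_bq = integrated_conc * BREATHING_RATE_M3_PER_DAY   # inhaled activity, Bq
dose_sv = intake_bq * THYROID_DOSE_COEFF_SV_PER_BQ
print(f"committed thyroid dose ~ {dose_sv * 1e3:.2f} mSv")
```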

    Operator- and software-related post-experimental variability and source of error in 2-DE analysis.

    In the field of proteomics, several approaches have been developed for separating proteins and analyzing their differential relative abundance. One of the oldest, yet still widely used, is 2-DE. Despite the continuous advance of newer, technically less demanding methods, 2-DE remains compelling and retains considerable potential for improvement. The overall variability affecting 2-DE includes biological, experimental, and post-experimental (software-related) variance. It is important to highlight how much of the total variability of this technique is due to post-experimental variability, which has so far been largely neglected. In this short review, we focus on this topic and explain that post-experimental variability and sources of error can be further divided into those which are software-dependent and those which are operator-dependent. We discuss these issues in detail, offering suggestions for reducing errors that may affect the quality of results and summarizing the advantages and drawbacks of each approach.
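    One simple way to put a number on post-experimental variability is the coefficient of variation of spot volumes obtained when the same gel images are re-analysed, for example by different operators or with different software settings. The sketch below is a hypothetical illustration of that idea; the review does not prescribe this exact procedure, and the spot-volume table is invented.

```python
# Sketch: quantifying post-experimental variability as the coefficient of
# variation (CV) of spot volumes from repeated analyses of the SAME gel
# images. The spot-volume table is hypothetical.
import statistics

def cv_percent(values):
    """Coefficient of variation (%) of repeated spot-volume measurements."""
    return statistics.stdev(values) / statistics.mean(values) * 100

# spot_id -> volumes from three re-analyses of identical images (hypothetical)
spot_volumes = {
    "spot_01": [1520.0, 1490.0, 1605.0],
    "spot_02": [310.0, 295.0, 330.0],
    "spot_03": [870.0, 1010.0, 905.0],
}

per_spot_cv = {spot: cv_percent(v) for spot, v in spot_volumes.items()}
for spot, cv in per_spot_cv.items():
    print(f"{spot}: CV = {cv:.1f}%")
print(f"median post-experimental CV = "
      f"{statistics.median(per_spot_cv.values()):.1f}%")
```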