40 research outputs found

    How chip size impacts steam pretreatment effectiveness for biological conversion of poplar wood into fermentable sugars

    Background: Woody biomass is highly recalcitrant to enzymatic sugar release and often requires significant size reduction and severe pretreatments to achieve economically viable sugar yields in the biological production of sustainable fuels and chemicals. However, because mechanical size reduction of woody biomass can consume significant amounts of energy, it is desirable to minimize size reduction and instead pretreat larger wood chips prior to biological conversion. To date, however, most laboratory research has been performed on materials that are significantly smaller than those applicable in a commercial setting. As a result, there is a limited understanding of the effects that larger biomass particle size has on the effectiveness of steam explosion pretreatment and subsequent enzymatic hydrolysis of wood chips. Results: To address these concerns, novel downscaled analysis and high-throughput pretreatment and hydrolysis (HTPH) were applied to examine whether differences exist in the composition and digestibility within a single pretreated wood chip due to heterogeneous pretreatment across its thickness. Heat transfer modeling, Simons’ stain testing, magnetic resonance imaging (MRI), and scanning electron microscopy (SEM) were applied to probe the effects of pretreatment within and between pretreated wood samples and to shed light on potential causes of variation, pointing to enzyme accessibility (i.e., pore size) distribution as a key factor dictating enzyme digestibility in these samples. Application of these techniques demonstrated that the effectiveness of pretreatment of Populus tremuloides can vary substantially over the chip thickness at short pretreatment times, resulting in spatial digestibility effects and overall lower sugar yields in subsequent enzymatic hydrolysis. Conclusions: These results indicate that rapid decompression pretreatments (e.g., steam explosion) that specifically alter accessibility at lower temperature conditions are well suited for larger wood chips, given the non-uniform temperature and digestibility profiles that can result from high temperatures and short pretreatment times. Furthermore, this study also demonstrated that wood chips were hydrated primarily through the natural pore structure during pretreatment, suggesting that preserving the natural grain and transport systems in wood during storage and chipping processes would likely promote pretreatment efficacy and uniformity.
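    As an aside for readers unfamiliar with the heat-transfer argument above, the following is a minimal, illustrative sketch (not the authors' model) of one-dimensional transient conduction across a wood chip's thickness using an explicit finite-difference scheme. All parameter values are assumed, order-of-magnitude placeholders rather than measured properties of Populus tremuloides; the sketch only shows how a short steam exposure can leave the chip centre far cooler than its surface.

    import numpy as np

    # Assumed, order-of-magnitude parameters (not taken from the study).
    alpha = 1.5e-7       # thermal diffusivity of wet wood, m^2/s (assumed)
    thickness = 0.010    # chip thickness, m (assumed 10 mm)
    t_chip = 25.0        # initial chip temperature, deg C (assumed)
    t_steam = 190.0      # steam temperature at both faces, deg C (assumed)
    t_end = 30.0         # simulated exposure time, s (assumed "short" pretreatment)

    n = 81                                  # grid points across the thickness
    dx = thickness / (n - 1)
    dt = 0.4 * dx ** 2 / alpha              # time step within the FTCS stability limit
    temp = np.full(n, t_chip)
    temp[0] = temp[-1] = t_steam            # faces jump instantly to steam temperature

    elapsed = 0.0
    while elapsed < t_end:
        # explicit finite-difference update of the interior nodes
        temp[1:-1] += alpha * dt / dx ** 2 * (temp[2:] - 2.0 * temp[1:-1] + temp[:-2])
        temp[0] = temp[-1] = t_steam
        elapsed += dt

    print(f"after {t_end:.0f} s: surface {temp[0]:.0f} degC, centre {temp[n // 2]:.0f} degC")

    Under these placeholder values the chip centre remains far below the surface temperature after 30 s, which is the kind of through-thickness non-uniformity the abstract attributes to high-temperature, short-time pretreatment of larger chips.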

    Autophagy suppresses the formation of hepatocyte-derived cancer-initiating ductular progenitor cells in the liver

    Hepatocellular carcinoma (HCC) is driven by repeated rounds of inflammation, leading to fibrosis, cirrhosis, and, ultimately, cancer. A critical step in HCC formation is the transition from fibrosis to cirrhosis, which is associated with a change in the liver parenchyma called ductular reaction. Here, we report a genetically engineered mouse model of HCC driven by loss of macroautophagy and hemizygosity of phosphatase and tensin homolog, which develops HCC involving ductular reaction. We show through lineage tracing that, following loss of autophagy, mature hepatocytes dedifferentiate into biliary-like liver progenitor cells (ductular reaction), giving rise to HCC. Furthermore, this change is associated with deregulation of the yes-associated protein (YAP) and transcriptional coactivator with PDZ-binding motif (TAZ) transcription factors, and the combined, but not individual, deletion of these factors completely reverses the dedifferentiation capacity and tumorigenesis. These findings therefore increase our understanding of the cell of origin of HCC development and highlight new potential points for therapeutic intervention.

    The FLUXNET2015 dataset and the ONEFlux processing pipeline for eddy covariance data

    The FLUXNET2015 dataset provides ecosystem-scale data on CO2, water, and energy exchange between the biosphere and the atmosphere, as well as other meteorological and biological measurements, from 212 sites around the globe (over 1500 site-years, up to and including year 2014). These sites, independently managed and operated, voluntarily contributed their data to create global datasets. Data were quality controlled and processed using uniform methods to improve consistency and intercomparability across sites. The dataset is already being used in a number of applications, including ecophysiology studies, remote sensing studies, and the development of ecosystem and Earth system models. FLUXNET2015 includes derived data products, such as gap-filled time series, ecosystem respiration and photosynthetic uptake estimates, estimates of uncertainty, and metadata about the measurements, presented for the first time in this paper. In addition, data from 206 of these sites are distributed for the first time under a Creative Commons (CC-BY 4.0) license. This paper details this enhanced dataset and the processing methods, now made available as open-source code, making the dataset more accessible, transparent, and reproducible.
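    For readers who want to explore the dataset, here is a minimal sketch of loading one FLUXNET2015-style half-hourly file with pandas and separating measured from gap-filled net ecosystem exchange (NEE) records. The file name is hypothetical, and the column names and QC-flag convention shown (TIMESTAMP_START, NEE_VUT_REF, NEE_VUT_REF_QC, with 0 = measured and 1-3 = gap-filled) are assumptions based on the FLUXNET2015 naming scheme; check them against the variable documentation shipped with the actual data.

    import pandas as pd

    # Hypothetical site file name; -9999 is assumed to be the missing-value sentinel.
    df = pd.read_csv(
        "FLX_XX-Xxx_FLUXNET2015_FULLSET_HH_2000-2014.csv",
        dtype={"TIMESTAMP_START": str},
        na_values=[-9999],
    )

    # Timestamps are assumed to be encoded as YYYYMMDDHHMM strings.
    df["TIMESTAMP_START"] = pd.to_datetime(df["TIMESTAMP_START"], format="%Y%m%d%H%M")

    # Assumed QC convention: 0 = measured, 1-3 = increasingly heavily gap-filled.
    measured = df[df["NEE_VUT_REF_QC"] == 0]
    gap_filled = df[df["NEE_VUT_REF_QC"] > 0]

    print(f"{len(measured)} measured vs {len(gap_filled)} gap-filled half-hours")
    print("mean measured NEE:", measured["NEE_VUT_REF"].mean())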

    Finishing the euchromatic sequence of the human genome

    The sequence of the human genome encodes the genetic instructions for human physiology, as well as rich information about human evolution. In 2001, the International Human Genome Sequencing Consortium reported a draft sequence of the euchromatic portion of the human genome. Since then, the international collaboration has worked to convert this draft into a genome sequence with high accuracy and nearly complete coverage. Here, we report the result of this finishing process. The current genome sequence (Build 35) contains 2.85 billion nucleotides interrupted by only 341 gaps. It covers ∼99% of the euchromatic genome and is accurate to an error rate of ∼1 event per 100,000 bases. Many of the remaining euchromatic gaps are associated with segmental duplications and will require focused work with new methods. The near-complete sequence, the first for a vertebrate, greatly improves the precision of biological analyses of the human genome, including studies of gene number, birth, and death. Notably, the human genome seems to encode only 20,000-25,000 protein-coding genes. The genome sequence reported here should serve as a firm foundation for biomedical research in the decades ahead.
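    As a quick, illustrative sanity check on the figures quoted above (not part of the paper), the stated accuracy can be turned into an expected residual-error count:

    # ~1 error event per 100,000 bases over the 2.85 billion nucleotides of Build 35
    bases = 2.85e9
    error_rate = 1 / 100_000
    print(f"expected residual errors: ~{bases * error_rate:,.0f}")  # on the order of 28,500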

    Retrospective evaluation of whole exome and genome mutation calls in 746 cancer samples