109 research outputs found

    Monitoring results after 36 ktonnes of deep CO2 injection at the Aquistore CO2 storage site, Saskatchewan, Canada

    The Aquistore CO2 Storage Site is located in southeastern Saskatchewan, Canada. CO2 is injected into a brine-filled sandstone formation at ~3200 m depth immediately above the Precambrian basement. Sustained injection rates of 400-600 tonnes/day were achieved at the site starting in the fall of 2015 with

    Human AlkB Homolog ABH8 Is a tRNA Methyltransferase Required for Wobble Uridine Modification and DNA Damage Survival

    tRNA nucleosides are extensively modified to ensure their proper function in translation. However, many of the enzymes responsible for tRNA modifications in mammals await identification. Here, we show that human AlkB homolog 8 (ABH8) catalyzes tRNA methylation to generate 5-methylcarboxymethyl uridine (mcm⁵U) at the wobble position of certain tRNAs, a critical anticodon loop modification linked to DNA damage survival. We find that ABH8 interacts specifically with tRNAs containing mcm⁵U and that purified ABH8 complexes methylate RNA in vitro. Significantly, ABH8 depletion in human cells reduces endogenous levels of mcm⁵U in RNA and increases cellular sensitivity to DNA-damaging agents. Moreover, DNA-damaging agents induce ABH8 expression in an ATM-dependent manner. These results expand the role of mammalian AlkB proteins beyond that of direct DNA repair and support a regulatory mechanism in the DNA damage response pathway involving modulation of tRNA modification.
    Funding: United States. National Institutes of Health (grants CA055042, ES002109, ES01701); National Institutes of Health (U.S.). Intramural Research Program; Westaway Research Fund; National Center for Research Resources (U.S.) (grant S10-RR023783)

    BLOOM: A 176B-Parameter Open-Access Multilingual Language Model

    Large language models (LLMs) have been shown to be able to perform new tasks based on a few demonstrations or natural language instructions. While these capabilities have led to widespread adoption, most LLMs are developed by resource-rich organizations and are frequently kept from the public. As a step towards democratizing this powerful technology, we present BLOOM, a 176B-parameter open-access language model designed and built thanks to a collaboration of hundreds of researchers. BLOOM is a decoder-only Transformer language model that was trained on the ROOTS corpus, a dataset comprising hundreds of sources in 46 natural and 13 programming languages (59 in total). We find that BLOOM achieves competitive performance on a wide variety of benchmarks, with stronger results after undergoing multitask prompted finetuning. To facilitate future research and applications using LLMs, we publicly release our models and code under the Responsible AI License