6 research outputs found

    BLOOM: A 176B-Parameter Open-Access Multilingual Language Model

    Full text link
    Large language models (LLMs) have been shown to be able to perform new tasks based on a few demonstrations or natural language instructions. While these capabilities have led to widespread adoption, most LLMs are developed by resource-rich organizations and are frequently kept from the public. As a step towards democratizing this powerful technology, we present BLOOM, a 176B-parameter open-access language model designed and built thanks to a collaboration of hundreds of researchers. BLOOM is a decoder-only Transformer language model that was trained on the ROOTS corpus, a dataset comprising hundreds of sources in 46 natural and 13 programming languages (59 in total). We find that BLOOM achieves competitive performance on a wide variety of benchmarks, with stronger results after undergoing multitask prompted finetuning. To facilitate future research and applications using LLMs, we publicly release our models and code under the Responsible AI License.
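    The abstract describes BLOOM as a decoder-only Transformer, i.e. a model whose attention is causally masked so each token attends only to earlier positions. A minimal NumPy sketch of that masking step (an illustration of the general technique, not BLOOM's actual implementation; function names are hypothetical):

```python
import numpy as np

def causal_mask(seq_len):
    # Lower-triangular boolean mask: position i may attend to positions <= i.
    return np.tril(np.ones((seq_len, seq_len), dtype=bool))

def masked_attention_weights(scores, mask):
    # Disallowed (future) positions get -inf before the softmax,
    # so they receive exactly zero attention weight.
    scores = np.where(mask, scores, -np.inf)
    exp = np.exp(scores - scores.max(axis=-1, keepdims=True))
    return exp / exp.sum(axis=-1, keepdims=True)
```

    Each row of the resulting weight matrix sums to 1, with zeros above the diagonal; this is what makes autoregressive, left-to-right generation possible.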

    ChroMo, an application for unsupervised analysis of chromosome movements in meiosis

    No full text
    Nuclear movements during meiotic prophase, driven by cytoskeleton forces, are a broadly conserved mechanism in opisthokonts and plants to promote pairing between homologous chromosomes. These forces are transmitted to the chromosomes by specific associations between telomeres and the nuclear envelope during meiotic prophase. Defective chromosome movements (CMs) harm pairing and recombination dynamics between homologues, thereby affecting faithful gametogenesis. For this reason, modelling the behaviour of CMs and their possible microvariations as a result of mutations or physico-chemical stress is important to understand this crucial stage of meiosis. Current developments in high-throughput imaging and image processing are yielding large CM datasets that are suitable for data mining approaches. To facilitate adoption of data mining pipelines, we present ChroMo, an interactive, unsupervised cloud application specifically designed for exploring CM datasets from live imaging. ChroMo contains a wide selection of algorithms and visualizations for time-series segmentation, motif discovery, and assessment of causality networks. Using ChroMo to analyse meiotic CMs in fission yeast, we found previously undiscovered features of CMs and causality relationships between chromosome morphology and trajectory. ChroMo will be a useful tool for understanding the behaviour of meiotic CMs in yeast and other model organisms. This research was funded by the Spanish Government, grant number PGC2018-098118-A-I00.
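    The abstract mentions motif discovery in chromosome-movement time series, i.e. finding recurring subsequences in a trajectory. A brute-force sketch of that general idea (not ChroMo's actual algorithm; all names here are illustrative) finds the closest pair of non-overlapping, z-normalized subsequences:

```python
import numpy as np

def znorm(x):
    # Z-normalize a subsequence so motifs match regardless of offset/scale.
    s = x.std()
    return (x - x.mean()) / s if s > 0 else x - x.mean()

def find_motif_pair(series, m):
    """Return (distance, i, j) for the pair of non-overlapping length-m
    subsequences with the smallest z-normalized Euclidean distance."""
    n = len(series) - m + 1
    best = (np.inf, None, None)
    for i in range(n):
        a = znorm(series[i:i + m])
        for j in range(i + m, n):  # enforce non-overlap
            d = np.linalg.norm(a - znorm(series[j:j + m]))
            if d < best[0]:
                best = (d, i, j)
    return best
```

    This brute-force search is O(n²) in the number of subsequences; practical tools typically use faster indexed methods (e.g. matrix-profile-style algorithms) for large trajectory datasets.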

    The Rabl chromosome configuration masks a kinetochore reassembly mechanism in yeast mitosis

    No full text
    During cell cycle progression in metazoans, the kinetochore is assembled at mitotic onset and disassembled during mitotic exit. Once assembled, the kinetochore complex attached to centromeres interacts directly with the spindle microtubules, the vehicle of chromosome segregation. This reassembly program is assumed to be absent in budding and fission yeast, because most kinetochore proteins are stably maintained at the centromeres throughout the entire cell cycle. Here, we show that the reassembly program of the outer kinetochore at mitotic onset is unexpectedly conserved in the fission yeast Schizosaccharomyces pombe. We identified this behavior by removing the Rabl chromosome configuration, in which centromeres are permanently associated with the nuclear envelope beneath the spindle pole body during interphase. In addition to having evolutionary implications for kinetochore reassembly, our results aid the understanding of the molecular processes responsible for kinetochore disassembly and assembly during mitotic entry. This work was supported by the Spanish Government, Plan Nacional project PGC2018-098118-A-I00, Ramon y Cajal program, RyC-2016-19659 to A.F.-A. and the program "Escalera de Excelencia" of the Junta de Castilla y León Ref. CLU-2017-03, co-funded by the P.O. FEDER of Castilla y León 14-20, by the Pablo de Olavide University "Ayuda Puente Predoctoral" fellowship (PPI1803) to A.P.-S., and by the Spanish Education and Professional Formation Ministry, Research Collaboration Grant to D.L.-P.
