8 research outputs found

    BLOOM: A 176B-Parameter Open-Access Multilingual Language Model

    Full text link
    Large language models (LLMs) have been shown to be able to perform new tasks based on a few demonstrations or natural language instructions. While these capabilities have led to widespread adoption, most LLMs are developed by resource-rich organizations and are frequently kept from the public. As a step towards democratizing this powerful technology, we present BLOOM, a 176B-parameter open-access language model designed and built thanks to a collaboration of hundreds of researchers. BLOOM is a decoder-only Transformer language model that was trained on the ROOTS corpus, a dataset comprising hundreds of sources in 46 natural and 13 programming languages (59 in total). We find that BLOOM achieves competitive performance on a wide variety of benchmarks, with stronger results after undergoing multitask prompted finetuning. To facilitate future research and applications using LLMs, we publicly release our models and code under the Responsible AI License.
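
    As a rough illustration of what using the publicly released checkpoints can look like, the sketch below loads a BLOOM checkpoint through the Hugging Face transformers library and generates a short continuation from an instruction-style prompt. The abstract does not prescribe this workflow; the model id "bigscience/bloom-560m" is an assumption chosen here as a small variant to keep memory requirements modest, rather than the full 176B model.

    # Minimal sketch (assumption: the released BLOOM weights are consumed via the
    # Hugging Face transformers API; "bigscience/bloom-560m" is a small variant
    # used for illustration, not the 176B-parameter model described in the paper).
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "bigscience/bloom-560m"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    # Prompting with a natural language instruction, in the spirit of the
    # few-shot / instruction-following behavior the abstract describes.
    prompt = "Translate English to French: cheese =>"
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=10)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))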

    Tumor-Induced Osteomalacia Caused by Primary Fibroblast Growth Factor 23 Secreting Neoplasm in Axial Skeleton: A Case Report

    Get PDF
    We report the case of a 66-year-old woman with tumor-induced osteomalacia (TIO) caused by a fibroblast growth factor 23 (FGF-23) secreting mesenchymal tumor localized in a lumbar vertebra, and review other cases localized to the axial skeleton. She presented with nontraumatic low back pain and spontaneous bilateral femur fractures. Laboratory testing was remarkable for low serum phosphorus, phosphaturia, and a significantly elevated serum FGF-23 level. Magnetic resonance imaging (MRI) of the lumbar spine showed a focal lesion in the L-4 vertebra that was hypermetabolic on positron emission tomography (PET) scan. A computed tomography (CT) guided needle biopsy showed a low-grade spindle cell neoplasm with positive FGF-23 mRNA expression by reverse transcriptase polymerase chain reaction (RT-PCR), confirming the diagnosis of a phosphaturic mesenchymal tumor, mixed connective tissue variant (PMTMCT). The patient elected to have surgery involving anterior resection of the L-4 vertebra, with subsequent normalization of serum phosphorus. Including the present case, we identified 12 cases of neoplasms localized to the spine causing TIO. To our knowledge, this paper represents the first documented case of a lumbar vertebra PMT causing TIO. TIO is a rare metabolic bone disorder that carries a favorable prognosis. When a lesion is identifiable, surgical intervention is typically curative.

    Novel Treatments in Neuroprotection for Aneurysmal Subarachnoid Hemorrhage

    No full text
