7 research outputs found
BLOOM: A 176B-Parameter Open-Access Multilingual Language Model
Large language models (LLMs) have been shown to be able to perform new tasks based on a few demonstrations or natural language instructions. While these capabilities have led to widespread adoption, most LLMs are developed by resource-rich organizations and are frequently kept from the public. As a step towards democratizing this powerful technology, we present BLOOM, a 176B-parameter open-access language model designed and built thanks to a collaboration of hundreds of researchers. BLOOM is a decoder-only Transformer language model that was trained on the ROOTS corpus, a dataset comprising hundreds of sources in 46 natural and 13 programming languages (59 in total). We find that BLOOM achieves competitive performance on a wide variety of benchmarks, with stronger results after undergoing multitask prompted finetuning. To facilitate future research and applications using LLMs, we publicly release our models and code under the Responsible AI License.
Reinforcing one-carbon metabolism via folic acid/Folr1 promotes β-cell differentiation
Regeneration of insulin-producing beta-cells may become a future alternative treatment for diabetes. Here the authors report a genetic screen in a zebrafish model that mimics the loss of beta-cells in diabetes, and identify that the folate receptor Folr1, or treatment with folinic acid, can stimulate beta-cell regeneration.
Heterogeneity of SOX9 and HNF1β in Pancreatic Ducts Is Dynamic
Summary: Pancreatic duct epithelial cells have been suggested as a source of progenitors for pancreatic growth and regeneration. However, genetic lineage-tracing experiments with pancreatic duct-specific Cre expression have given conflicting results. Using immunofluorescence and flow cytometry, we show heterogeneous expression of both HNF1β and SOX9 in adult human and murine ductal epithelium. Their expression was dynamic and diminished significantly after induced replication. Purified pancreatic duct cells formed organoid structures in 3D culture, and heterogeneity of expression of Hnf1β and Sox9 was maintained even after passaging. Using antibodies against a second cell surface molecule, CD51 (human) or CD24 (mouse), we could isolate living subpopulations of duct cells enriched for high or low expression of HNF1β and SOX9. Only the CD24^high (Hnf1β^high/Sox9^high) subpopulation was able to form organoids.