6 research outputs found

    BLOOM: A 176B-Parameter Open-Access Multilingual Language Model

    Full text link
    Large language models (LLMs) have been shown to be able to perform new tasks based on a few demonstrations or natural language instructions. While these capabilities have led to widespread adoption, most LLMs are developed by resource-rich organizations and are frequently kept from the public. As a step towards democratizing this powerful technology, we present BLOOM, a 176B-parameter open-access language model designed and built thanks to a collaboration of hundreds of researchers. BLOOM is a decoder-only Transformer language model that was trained on the ROOTS corpus, a dataset comprising hundreds of sources in 46 natural and 13 programming languages (59 in total). We find that BLOOM achieves competitive performance on a wide variety of benchmarks, with stronger results after undergoing multitask prompted finetuning. To facilitate future research and applications using LLMs, we publicly release our models and code under the Responsible AI License.

    Institutionalisation and deinstitutionalisation of children 2: policy and practice recommendations for global, national, and local actors

    No full text
    Worldwide, millions of children live in institutions, which runs counter to both the UN-recognised right of children to be raised in a family environment, and the findings of our accompanying systematic review of the physical, neurobiological, psychological, and mental health costs of institutionalisation and the benefits of deinstitutionalisation of child welfare systems. In this part of the Commission, international experts in reforming care for children identified evidence-based policy recommendations to promote family-based alternatives to institutionalisation. Family-based care refers to caregiving by extended family or foster, kafalah (the practice of guardianship of orphaned children in Islam), or adoptive family, preferably in close physical proximity to the biological family to facilitate the continued contact of children with important individuals in their life when this is in their best interest. Fourteen key recommendations are addressed to multinational agencies, national governments, local authorities, and institutions. These recommendations prioritise the role of families in the lives of children to prevent child separation and to strengthen families, to protect children without parental care by providing high-quality family-based alternatives, and to strengthen systems for the protection and care of separated children. Momentum for a shift from institutional to family-based care is growing internationally; our recommendations provide a template for further action and criteria against which progress can be assessed.
