14 research outputs found

    BLOOM: A 176B-Parameter Open-Access Multilingual Language Model

    Full text link
    Large language models (LLMs) have been shown to perform new tasks based on a few demonstrations or natural language instructions. While these capabilities have led to widespread adoption, most LLMs are developed by resource-rich organizations and are frequently kept from the public. As a step towards democratizing this powerful technology, we present BLOOM, a 176B-parameter open-access language model designed and built through a collaboration of hundreds of researchers. BLOOM is a decoder-only Transformer language model that was trained on the ROOTS corpus, a dataset comprising hundreds of sources in 46 natural and 13 programming languages (59 in total). We find that BLOOM achieves competitive performance on a wide variety of benchmarks, with stronger results after undergoing multitask prompted finetuning. To facilitate future research and applications using LLMs, we publicly release our models and code under the Responsible AI License.
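
    The abstract states that the trained model and code are publicly released. As an illustrative sketch (not taken from the paper), the checkpoint could be loaded with the Hugging Face transformers library; the model identifiers below are assumptions based on the public BLOOM release, not values given in the abstract.

        # Minimal sketch: loading a released BLOOM checkpoint with Hugging Face transformers.
        from transformers import AutoModelForCausalLM, AutoTokenizer

        # "bigscience/bloom" is the full 176B checkpoint; a small variant such as
        # "bigscience/bloom-560m" (assumed identifier) is more practical for a local test.
        model_name = "bigscience/bloom-560m"

        tokenizer = AutoTokenizer.from_pretrained(model_name)
        model = AutoModelForCausalLM.from_pretrained(model_name)

        # Decoder-only generation: the model continues the prompt, which is how
        # few-shot demonstrations or natural language instructions are supplied.
        inputs = tokenizer("Translate English to French: cheese ->", return_tensors="pt")
        outputs = model.generate(**inputs, max_new_tokens=20)
        print(tokenizer.decode(outputs[0], skip_special_tokens=True))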

    Engineering application of ecological lake purification tailwater from sewage plant in vein industrial park

    No full text
    As environmental protection requirements rise, the outlet water (tailwater) quality standards of many sewage treatment plants need to be further improved. This paper mainly introduces the advanced treatment of tailwater from the sewage treatment plant of a vein industrial park by an artificial ecological lake. The designed treatment capacity of this project was 1,300 m³/d, the inlet water COD was 30 mg/L, and the ammonia nitrogen content was 1.5 mg/L; the main outlet water indices reached the Class III water quality standard of the surface water environmental quality standard (GB3838-2002), which improved the outlet water (tailwater) quality of the sewage treatment plant and the regional water environment quality. By constructing an artificial landscape lake, the resource utilization of tailwater can be realized.
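
    As a quick back-of-the-envelope illustration (not part of the paper), the quoted design figures can be converted into daily inlet pollutant loads; the calculation uses only the flow and concentration values stated above.

        # Daily inlet loads implied by the stated design figures (illustrative only).
        flow_m3_per_day = 1300          # designed treatment capacity, m³/d
        cod_mg_per_l = 30               # inlet water COD, mg/L
        ammonia_mg_per_l = 1.5          # inlet ammonia nitrogen, mg/L

        litres_per_day = flow_m3_per_day * 1000
        cod_kg_per_day = cod_mg_per_l * litres_per_day / 1e6          # ≈ 39 kg/d
        ammonia_kg_per_day = ammonia_mg_per_l * litres_per_day / 1e6  # ≈ 1.95 kg/d

        print(f"Inlet COD load: {cod_kg_per_day:.1f} kg/d")
        print(f"Inlet ammonia-N load: {ammonia_kg_per_day:.2f} kg/d")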

    Thermostable Chicken Feather Degrading Enzymes from L-23 Isolate from Indonesia

    No full text
    The thermostable chicken feather degrading protease enzymes used here were extracted and partially purified from the thermophilic bacterial isolate L-23, obtained from a coastal hot spring in North Sulawesi, Indonesia. The L-23 isolate was grown in a selective medium containing 1% chicken feather powder at 70 °C and pH 7. The cell-free culture was precipitated with ammonium sulphate at 80% saturation, followed by heating at 65 °C for 1 h before being applied onto a Sephadex G-100 column. The molecular weights of the two identified enzymes were estimated as 47 and 64 kDa. The optimum pH of the mixed enzyme preparation was 7, while the optimum temperature was 65 °C. Zymogram analysis showed that one of the enzymes was still active after being heated at 100 °C for 20 min and was also resistant to organic solvents and SDS. The activity was enhanced by the addition of 1 mM FeCl3.
