8 research outputs found

    BLOOM: A 176B-Parameter Open-Access Multilingual Language Model

    Full text link
    Large language models (LLMs) have been shown to be able to perform new tasks based on a few demonstrations or natural language instructions. While these capabilities have led to widespread adoption, most LLMs are developed by resource-rich organizations and are frequently kept from the public. As a step towards democratizing this powerful technology, we present BLOOM, a 176B-parameter open-access language model designed and built thanks to a collaboration of hundreds of researchers. BLOOM is a decoder-only Transformer language model that was trained on the ROOTS corpus, a dataset comprising hundreds of sources in 46 natural and 13 programming languages (59 in total). We find that BLOOM achieves competitive performance on a wide variety of benchmarks, with stronger results after undergoing multitask prompted finetuning. To facilitate future research and applications using LLMs, we publicly release our models and code under the Responsible AI License.

    The tumour microenvironment and immune milieu of cholangiocarcinoma

    Full text link
    The tumour microenvironment is a complex, multicellular functional compartment that, particularly when assembled as an abundant desmoplastic reaction, may profoundly affect the proliferative and invasive abilities of epithelial cancer cells. The tumour microenvironment comprises not only stromal cells, mainly cancer-associated fibroblasts, but also immune cells of both the innate and adaptive system (tumour-associated macrophages, neutrophils, natural killer cells, and T and B lymphocytes), and endothelial cells. This results in an intricate web of mutual communications regulated by an extensively remodelled extracellular matrix, in which the tumour cells are centrally engaged. In this regard, cholangiocarcinoma, in particular the intrahepatic variant, has become the focus of mounting interest in recent years, largely because of the lack of effective therapies despite its rising incidence and high mortality rates worldwide. On the other hand, recent studies in pancreatic cancer, which, similarly to cholangiocarcinoma, is highly desmoplastic, have argued against a tumour-promoting function of the tumour microenvironment. In this review, we will discuss recent developments concerning the role of each cellular population and their multifaceted interplay with the malignant biliary epithelial counterpart. We ultimately hope to provide the working knowledge on how their manipulation may lead to a therapeutic gain in cholangiocarcinoma.

    Evaluating the Impact of a Wall-Type Green Infrastructure on PM10 and NOx Concentrations in an Urban Street Environment

    No full text
    Nature-based solutions can represent beneficial tools in the field of urban transformation for their contribution to important environmental services such as air quality improvement. To evaluate the impact on urban air pollution of a CityTree (CT), an innovative wall-type green infrastructure in passive (deposition) and active (filtration) modes of operation, a study was conducted in a real urban setting in Modena (Italy) during 2017 and 2018, combining experimental measurements with modelling system evaluations. In this work, relying on the computational resources of the CRESCO (Computational Centre for Research on Complex Systems)/ENEAGRID High Performance Computing infrastructure, we used the air pollution microscale model PMSS (Parallel Micro-SWIFT-Micro SPRAY) to simulate air quality during the experimental campaigns. The spatial characteristics of the impact of the CT on local air pollutant concentrations, specifically nitrogen oxides (NOx) and particulate matter (PM10), were assessed. In particular, we used prescribed bulk deposition velocities provided by the experimental campaigns, which tested the CT in both passive (deposition) and active (filtration) modes of operation. Our results showed that, when the green infrastructure operates in passive mode, the PM10 and NOx concentration reductions range from more than 0.1% up to about 0.8% within an area of 10 × 20 m² around the infrastructure. In filtration mode, the CT exhibited higher performance in the abatement of PM10 concentrations (between 1.5% and 15%) within approximately the same area. We conclude that CTs may find an application in air quality hotspots within specific urban settings (i.e., urban street canyons) where a very localized reduction of pollutant concentrations during rush hours might be of interest to limit population exposure. The optimization of the spatial arrangement of CT modules to enlarge the “clean air zone” is a factor to be investigated in the ongoing development of the CT technology.

    Lebergewächse (German: liver growths)

    No full text