6 research outputs found

    Hypogean geology of the Camerano site (Marches, Italy)

    Get PDF
    The historical town of Camerano (Ancona, Central Italy), built on a hilltop at the rear of the Conero Mt. promontory, hosts a broad underground system of remarkable man-made caves. In contrast to the near-total lack of accessible subaerial outcrops, the caves make it possible to reconstruct the geological evolution of the area and to describe a composite sedimentological and stratigraphic section through Early Pleistocene (Calabrian) marine deposits. The present study aims at a better definition of the sedimentological and palaeoenvironmental context of the Camerano area, and also improves knowledge of the Camerano caves. The sediments mainly consist of couplets, variable in thickness, of massive to laminated yellow-brown bioclastic sand and massive grey-green clay, each couplet showing an erosive basal surface and normal grading from sand to clay. Plane-parallel lamination, marked by recurring variations in grain size, is attributed to “traction carpets”, and the sand horizons are interpreted as carbonate turbidites supplied from the east (Conero Mt.). Conversely, the clay reflects both distal supply from western river deltas and a local contribution from marine productivity. Along the section, matrix-supported gravel beds also occur, made of heterometric clay fragments dispersed in a bioclastic sandy matrix. The described facies only partially fit the previous geological schemes and offer new insights for the palaeoenvironmental reconstruction of the Camerano area, which involves a tectonically active Early Pleistocene basin, mainly dominated by clay sedimentation and periodically reached by storm- to seismically induced carbonate turbidites. The matrix-supported gravels with large clay fragments probably derive from remobilization of partially lithified deposits along the basin’s flank and represent the distal evolution of west-derived slumps.

    BLOOM: A 176B-Parameter Open-Access Multilingual Language Model

    Full text link
    Large language models (LLMs) have been shown to be able to perform new tasks based on a few demonstrations or natural language instructions. While these capabilities have led to widespread adoption, most LLMs are developed by resource-rich organizations and are frequently kept from the public. As a step towards democratizing this powerful technology, we present BLOOM, a 176B-parameter open-access language model designed and built through a collaboration of hundreds of researchers. BLOOM is a decoder-only Transformer language model trained on the ROOTS corpus, a dataset comprising hundreds of sources in 46 natural and 13 programming languages (59 in total). We find that BLOOM achieves competitive performance on a wide variety of benchmarks, with stronger results after undergoing multitask prompted fine-tuning. To facilitate future research and applications using LLMs, we publicly release our models and code under the Responsible AI License.
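    The abstract's opening claim, that LLMs can perform new tasks from a few demonstrations, is usually exercised by embedding input/output examples directly in the prompt. The sketch below shows the prompt-construction step only; the helper `build_few_shot_prompt` and its `Input:`/`Output:` template are illustrative assumptions, not part of the BLOOM paper. The resulting string would be handed to the model for text generation (e.g. via the Hugging Face `transformers` library, under which BLOOM is published).

    ```python
    # Minimal few-shot prompt construction sketch (hypothetical template):
    # a handful of demonstrations followed by the new query, so the model
    # completes the pattern for the final "Output:".

    def build_few_shot_prompt(examples, query):
        """Concatenate input/output demonstrations, then the unanswered query."""
        blocks = [f"Input: {x}\nOutput: {y}" for x, y in examples]
        blocks.append(f"Input: {query}\nOutput:")  # model continues from here
        return "\n\n".join(blocks)

    # Toy French-to-English task given as two demonstrations.
    examples = [
        ("bonjour", "hello"),
        ("merci", "thank you"),
    ]
    prompt = build_few_shot_prompt(examples, "au revoir")
    print(prompt)
    ```

    The prompt string is then passed unchanged to the model's generation call; no task-specific fine-tuning is involved, which is what "performing new tasks based on a few demonstrations" refers to.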

    CONSTANS delays Arabidopsis flowering under short days

    No full text
    Long days (LD) promote flowering of Arabidopsis thaliana compared with short days (SD) by activating the photoperiodic pathway. Here we show that growth under very short days (3 h) or in darkness (on sucrose) also accelerates flowering, indicating that SD actively repress flowering compared with very short days. CONSTANS (CO) repressed flowering under SD, and the early flowering of co under SD required FLOWERING LOCUS T (FT). FT was expressed at a basal level in the leaves under SD, but these levels were not enhanced in co. This indicates that the action of CO in A. thaliana is not the mirror image of the action of its homologue in rice. In the apex, CO enhanced the expression of TERMINAL FLOWER 1 (TFL1) around the time when FT expression is important to promote flowering. Under SD, the tfl1 mutation was epistatic to co and, in turn, ft was epistatic to tfl1. These observations are consistent with the long-standing but hitherto undemonstrated model in which CO can inhibit FT induction of flowering by affecting TFL1 expression.
