
    The Rise and Fall of the King: The Correlation between FO Aquarii's Low States and the White Dwarf's Spindown

    The intermediate polar FO Aquarii experienced its first-reported low-accretion states in 2016, 2017, and 2018. We establish that these low states occurred shortly after the system's white dwarf (WD) began spinning down, after having spent a quarter-century spinning up. FO Aquarii is the only intermediate polar whose period derivative has undergone a sign change, and it has now done so twice. By combining our spin-pulse timings with previous data, we determine that the WD's spin period has varied quasi-sinusoidally since the system's discovery, and an extrapolation predicts that the white dwarf was spinning down during newly discovered low states in photographic plates from 1964, 1965, and 1974. Thus, FO Aquarii's low states appear to occur exclusively during epochs of spindown. Additionally, our time-series photometry of the 2016-18 low states reveals that the mode of accretion is extremely sensitive to the accretion rate; when the system is fainter than V~14.0, the accretion onto the WD is largely stream-fed, but when it is brighter, it is almost exclusively disk-fed. The system's grazing eclipse remained detectable throughout all observations, confirming the uninterrupted presence of a disk-like structure, regardless of the accretion state. Our observations are consistent with theoretical predictions that during the low states, the accretion disk dissipates into a ring of diamagnetic blobs. Finally, a new XMM-Newton observation from 2017 indicates that the system's anomalously soft X-ray spectrum and diminished X-ray luminosity in the wake of the 2016 low state appear to be long-lasting changes compared to pre-2016 observations.
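    The quasi-sinusoidal spin-period variation and its extrapolation to earlier epochs can be illustrated with a brief sketch. The code below is not the authors' analysis: the epochs, period values, and initial guesses are hypothetical, and it simply fits a constant-plus-sinusoid model with scipy.optimize.curve_fit and reads off the sign of dP/dt (dP/dt > 0 corresponds to spindown) at a few epochs.

```python
# Illustrative sketch only: fit a sinusoid to hypothetical spin-period
# measurements, mimicking the kind of quasi-sinusoidal model that could be
# extrapolated back to the 1964-74 photographic-plate epochs.
import numpy as np
from scipy.optimize import curve_fit

def spin_period_model(t, p0, amp, cycle_len, phase):
    """Constant spin period (s) plus a sinusoidal modulation; t in years."""
    return p0 + amp * np.sin(2 * np.pi * (t - phase) / cycle_len)

# Hypothetical data: observation year and measured WD spin period in seconds.
years = np.array([1981, 1985, 1990, 1995, 2000, 2005, 2010, 2015, 2018])
periods = np.array([1254.45, 1254.44, 1254.42, 1254.41, 1254.40,
                    1254.41, 1254.43, 1254.45, 1254.46])

# Initial guesses are hypothetical as well (period ~1254.43 s, ~35 yr cycle).
popt, _ = curve_fit(spin_period_model, years, periods,
                    p0=[1254.43, 0.03, 35.0, 1973.0])
p0, amp, cycle_len, phase = popt

def period_derivative(t):
    """dP/dt of the fitted sinusoid; a positive value means spindown."""
    return amp * (2 * np.pi / cycle_len) * np.cos(2 * np.pi * (t - phase) / cycle_len)

for epoch in (1964, 1974, 2017):
    trend = "spinning down" if period_derivative(epoch) > 0 else "spinning up"
    print(f"{epoch}: {trend} (dP/dt = {period_derivative(epoch):+.2e} s/yr)")
```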

    BLOOM: A 176B-Parameter Open-Access Multilingual Language Model

    Large language models (LLMs) have been shown to be able to perform new tasks based on a few demonstrations or natural language instructions. While these capabilities have led to widespread adoption, most LLMs are developed by resource-rich organizations and are frequently kept from the public. As a step towards democratizing this powerful technology, we present BLOOM, a 176B-parameter open-access language model designed and built thanks to a collaboration of hundreds of researchers. BLOOM is a decoder-only Transformer language model that was trained on the ROOTS corpus, a dataset comprising hundreds of sources in 46 natural and 13 programming languages (59 in total). We find that BLOOM achieves competitive performance on a wide variety of benchmarks, with stronger results after undergoing multitask prompted finetuning. To facilitate future research and applications using LLMs, we publicly release our models and code under the Responsible AI License.
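    Since the abstract notes that the models and code are publicly released, a minimal usage sketch follows. It assumes the Hugging Face transformers library and the released BLOOM checkpoints on the Hugging Face Hub; the full 176B model requires multi-GPU or offloaded inference, so a small variant (bigscience/bloom-560m) is loaded here purely for illustration, and the prompt is an arbitrary example.

```python
# Minimal sketch, assuming the Hugging Face `transformers` library and the
# publicly released BLOOM checkpoints. The full 176B model needs large-scale
# hardware, so a small BLOOM variant is used here for illustration.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "bigscience/bloom-560m"  # swap for "bigscience/bloom" given sufficient hardware
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Instruction-style prompting, in the spirit of the few-shot / natural-language
# instruction use described in the abstract.
prompt = "Translate to French: The model was trained on many languages.\nFrench:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```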
