
    SEASTAR: a mission to study ocean submesoscale dynamics and small-scale atmosphere-ocean processes in coastal, shelf and polar seas

    High-resolution satellite images of ocean color and sea surface temperature reveal an abundance of ocean fronts, vortices and filaments at scales below 10 km, but measurements of ocean surface dynamics at these scales are rare. There is increasing recognition of the role played by small-scale ocean processes in ocean-atmosphere coupling, upper-ocean mixing and ocean vertical transports, with advanced numerical models and in situ observations highlighting fundamental changes in dynamics when scales reach 1 km. Numerous scientific publications highlight the global impact of small oceanic scales on marine ecosystems, operational forecasts and long-term climate projections through strong ageostrophic circulations, large vertical ocean velocities and mixed-layer re-stratification. Small-scale processes particularly dominate in coastal, shelf and polar seas, where they mediate important exchanges between land, ocean, atmosphere and the cryosphere, e.g., of freshwater and pollutants. As numerical models continue to evolve toward finer spatial resolution and increasingly complex coupled atmosphere-wave-ice-ocean systems, modern observing capability lags behind, unable to deliver the high-resolution synoptic measurements of total currents, wind vectors and waves needed to advance understanding, develop better parameterizations and improve model validations, forecasts and projections. SEASTAR is a satellite mission concept that proposes to directly address this critical observational gap with synoptic two-dimensional imaging of total ocean surface current vectors and wind vectors at 1 km resolution, together with coincident directional wave spectra. Based on major recent advances in squinted along-track Synthetic Aperture Radar interferometry, SEASTAR is an innovative, mature concept with unique demonstrated capabilities, seeking to proceed toward spaceborne implementation within Europe and beyond.
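    As a rough illustration of the measurement principle behind squinted along-track SAR interferometry, the sketch below converts an along-track interferometric (ATI) phase image into a line-of-sight surface velocity using the standard relation v = λφ / (4πτ). The wavelength, time lag and phase values are placeholder assumptions, not SEASTAR instrument parameters; retrieving total current vectors additionally requires combining several squint directions and removing the wave-induced Doppler contribution, which the coincident wave spectra support.

```python
import numpy as np

# Illustrative along-track InSAR (ATI) relation underlying surface-current
# retrieval: line-of-sight velocity is proportional to the interferometric
# phase divided by the along-track time lag between the antenna phase centres.
# All numeric values below are placeholders, not SEASTAR specifications.

RADAR_WAVELENGTH_M = 0.031    # assumed wavelength (placeholder)
ALONG_TRACK_LAG_S = 3.5e-3    # assumed time lag between phase centres (placeholder)

def los_velocity_from_phase(phase_rad: np.ndarray) -> np.ndarray:
    """Convert ATI interferometric phase (radians) to line-of-sight velocity (m/s)."""
    return (RADAR_WAVELENGTH_M * phase_rad) / (4.0 * np.pi * ALONG_TRACK_LAG_S)

# Example: a synthetic 1 km-resolution phase image (illustration only)
phase = np.deg2rad(np.random.uniform(-20.0, 20.0, size=(100, 100)))
v_los = los_velocity_from_phase(phase)
print(f"line-of-sight velocity range: {v_los.min():.2f} to {v_los.max():.2f} m/s")
```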

    The Zero Emissions Commitment and climate stabilization

    How do we halt global warming? Reaching net zero carbon dioxide (CO2) emissions is understood to be a key milestone on the path to a safer planet. But how confident are we that when we stop carbon emissions, we also stop global warming? The Zero Emissions Commitment (ZEC) quantifies how much warming or cooling we can expect following a complete cessation of anthropogenic CO2 emissions. To date, the best estimate by the Intergovernmental Panel on Climate Change (IPCC) Sixth Assessment Report is zero change, though with substantial uncertainty. In this article, we present an overview of the changes expected in major Earth system processes after net zero and their potential impact on global surface temperature, providing an outlook toward building a more confident assessment of ZEC in the decades to come. We propose a structure to guide research into ZEC and associated changes in the climate, separating the impacts expected over decades, centuries, and millennia. As we look ahead at the century billed to mark the end of net anthropogenic CO2 emissions, we ask: what is the prospect of a stable climate in a post-net zero world?
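    For readers who prefer a formula, one common way to write the quantity discussed above (not quoted from this article, but standard in the ZEC literature) defines the ZEC at a horizon of n years as the change in global mean surface temperature relative to the year when net CO2 emissions reach zero:

```latex
% ZEC n years after net CO2 emissions cease, with T the global mean surface
% temperature and t_z the year of cessation (e.g. ZEC_{50} for a 50-year
% horizon); a standard definition, not quoted from this article.
\mathrm{ZEC}_{n} = T\!\left(t_{z} + n\right) - T\!\left(t_{z}\right)
```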

    BLOOM: A 176B-Parameter Open-Access Multilingual Language Model

    Large language models (LLMs) have been shown to perform new tasks from a few demonstrations or natural language instructions. While these capabilities have led to widespread adoption, most LLMs are developed by resource-rich organizations and are frequently kept from the public. As a step towards democratizing this powerful technology, we present BLOOM, a 176B-parameter open-access language model designed and built through a collaboration of hundreds of researchers. BLOOM is a decoder-only Transformer language model trained on the ROOTS corpus, a dataset comprising hundreds of sources in 46 natural and 13 programming languages (59 in total). We find that BLOOM achieves competitive performance on a wide variety of benchmarks, with stronger results after multitask prompted finetuning. To facilitate future research and applications using LLMs, we publicly release our models and code under the Responsible AI License.
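    As a minimal usage sketch (assuming the publicly released checkpoints are obtained through the Hugging Face transformers library, which the abstract does not specify), the example below loads a small BLOOM variant and generates a continuation; the full 176B-parameter model requires multi-GPU inference.

```python
# Minimal sketch of loading a released BLOOM checkpoint for text generation.
# Assumes the Hugging Face `transformers` distribution of the checkpoints;
# "bigscience/bloom-560m" is a small variant chosen so the example runs on
# modest hardware (the full 176B model needs multi-GPU inference).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "bigscience/bloom-560m"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

inputs = tokenizer("Ocean submesoscale dynamics are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```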

    Cloud-based framework for inter-comparing submesoscale-permitting realistic ocean models

    With the increase in computational power, ocean models with kilometer-scale resolution have emerged over the last decade. These models have been used for quantifying the energetic exchanges between spatial scales, informing the design of eddy parametrizations, and preparing observing networks. The increase in resolution, however, has drastically increased the size of model outputs, making it difficult to transfer and analyze the data. It nonetheless remains of primary importance to assess the realism of these models more systematically. Here, we showcase a cloud-based analysis framework proposed by the Pangeo project that aims to tackle such distribution and analysis challenges. We analyze the output of eight submesoscale-permitting simulations, all on the cloud, for a crossover region of the upcoming Surface Water and Ocean Topography (SWOT) altimeter mission near the Gulf Stream separation. The cloud-based analysis framework (i) minimizes the cost of duplicating and storing ghost copies of data and (ii) allows for seamless sharing of analysis results amongst collaborators. We describe the framework and provide example analyses (e.g., sea-surface height variability, submesoscale vertical buoyancy fluxes, and comparison to predictions from the mixed-layer instability parametrization). Basin- to global-scale, submesoscale-permitting models are still at an early stage of development, and their computational cost and carbon footprint are also rather large. It would, therefore, benefit the community to document the different model configurations for future best practices. We also argue that an emphasis on data analysis strategies would be crucial for improving the models themselves.
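    A minimal sketch of the cloud-based workflow described above, assuming the Pangeo stack of xarray, Dask and Zarr: the model output stays in object storage, is opened lazily, and only the reduced result (here a sea-surface height variance map) is brought back. The bucket path and variable name are hypothetical placeholders, not the stores used in the paper.

```python
# Minimal Pangeo-style sketch: open a cloud-hosted Zarr store lazily with
# xarray + Dask and reduce it in place, so no ghost copy of the model output
# is downloaded. The bucket path and variable name below are placeholders.
import xarray as xr

store = "gs://example-bucket/submesoscale-model/ssh.zarr"  # hypothetical path
ds = xr.open_zarr(store, consolidated=True)                # lazy, chunked open

# Temporal variance of sea-surface height at each grid point; the computation
# is deferred until .compute() and then runs chunk-by-chunk via Dask.
ssh_variance = ds["ssh"].var(dim="time").compute()
print(ssh_variance)
```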