
    Anelastic sensitivity kernels with parsimonious storage for adjoint tomography and full waveform inversion

    We introduce a technique to compute exact anelastic sensitivity kernels in the time domain using parsimonious disk storage. The method is based on a reordering of the time loop of time-domain forward/adjoint wave propagation solvers combined with the use of a memory buffer. It avoids instabilities that occur when time-reversing dissipative wave propagation simulations. The total number of required time steps is unchanged compared to usual acoustic or elastic approaches. The cost is reduced by a factor of 4/3 compared to the case in which anelasticity is partially accounted for by accommodating the effects of physical dispersion. We validate our technique by performing a test in which we compare the K_α sensitivity kernel to the exact kernel obtained by saving the entire forward calculation. This benchmark confirms that our approach is also exact. We illustrate the importance of including full attenuation in the calculation of sensitivity kernels by showing significant differences with physical-dispersion-only kernels.
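    For orientation, the "save everything" reference against which the parsimonious approach is benchmarked amounts to a zero-lag correlation of forward and time-reversed adjoint wavefields. The sketch below is a generic NumPy illustration, not the authors' solver: the simple product interaction, the array names, and the time-reversed indexing convention are assumptions.

```python
import numpy as np

def kernel_full_storage(forward_fields, adjoint_fields, dt):
    """Reference ("save everything") sensitivity-kernel accumulation.

    forward_fields: array of shape (n_steps, *grid) from the forward run
    adjoint_fields: array of shape (n_steps, *grid) from the adjoint run,
                    stored in adjoint (reverse) time order
    Returns the zero-lag correlation of forward and time-reversed adjoint
    fields, a simplified stand-in for a sensitivity kernel on the grid.
    """
    n_steps = forward_fields.shape[0]
    kernel = np.zeros(forward_fields.shape[1:])
    for i in range(n_steps):
        # pair forward step i with the corresponding time-reversed adjoint step
        kernel += forward_fields[i] * adjoint_fields[n_steps - 1 - i] * dt
    return kernel

# toy usage with random fields standing in for solver output
rng = np.random.default_rng(0)
fwd = rng.standard_normal((200, 32, 32))
adj = rng.standard_normal((200, 32, 32))
K = kernel_full_storage(fwd, adj, dt=1e-3)
```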

    Some General and Fundamental Requirements for Designing Observing System Simulation Experiments (OSSEs)

    The intent of this white paper is to inform WMO projects and working groups, together with the broader weather research and general meteorology and oceanography communities, regarding the use of Observing System Simulation Experiments (OSSEs). This paper is not intended to be either a critical or cursory review of past OSSE efforts. Instead, it describes some fundamental, but often neglected, aspects of OSSEs and prescribes important caveats regarding their design, validation, and application. Well-designed, properly validated, and carefully conducted OSSEs can be invaluable for examining, understanding, and estimating impacts of proposed observing systems and new data assimilation techniques. Although significant imperfections and limitations should be expected, OSSEs either profoundly complement or uniquely provide both qualitative and quantitative characterizations of potential analyses of components of the Earth system.

    Remote Sensing of River Discharge: A Review and a Framing for the Discipline

    Remote sensing of river discharge (RSQ) is a burgeoning field rife with innovation. This innovation has resulted in a highly non-cohesive subfield of hydrology advancing at a rapid pace, and as a result misconceptions, mis-citations, and confusion are apparent among authors, readers, editors, and reviewers. While the intellectually diverse subfield of RSQ practitioners can parse this confusion, the broader hydrology community views RSQ as a monolith and such confusion can be damaging. RSQ has not been comprehensively summarized over the past decade, and we believe that a summary of the recent literature has the potential to provide clarity to practitioners and general hydrologists alike. Therefore, we here summarize a broad swath of the literature, and find after our reading that the most appropriate way to summarize this literature is first by application area (into methods appropriate for gauged, semi-gauged, regionally gauged, politically ungauged, and totally ungauged basins) and next by methodology. We do not find categorizing by sensor useful, and everything from un-crewed aerial vehicles (UAVs) to satellites is considered here. Perhaps the most cogent theme to emerge from our reading is the need for context. All RSQ is employed in the service of furthering hydrologic understanding, and we argue that nearly all RSQ is useful in this pursuit provided it is properly contextualized. We argue that if authors place each new work into the correct application context, much confusion can be avoided, and we suggest a framework for such context here. Specifically, we define which RSQ techniques are and are not appropriate for ungauged basins, and further define what it means to be ‘ungauged’ in the context of RSQ. We also include political and economic realities of RSQ, as the objective of the field is sometimes to provide data purposefully cloistered by specific political decisions. This framing can enable RSQ to respond to hydrology at large with confidence and cohesion even in the face of methodological and application diversity evident within the literature. Finally, we embrace the intellectual diversity of RSQ and suggest the field is best served by a continuation of methodological proliferation rather than by a move toward orthodoxy and standardization.

    Advancing measurements and representations of subsurface heterogeneity and dynamic processes: towards 4D hydrogeology

    Essentially all hydrogeological processes are strongly influenced by the subsurface spatial heterogeneity and the temporal variation of environmental conditions, hydraulic properties, and solute concentrations. This spatial and temporal variability generally leads to effective behaviors and emerging phenomena that cannot be predicted from conventional approaches based on homogeneous assumptions and models. However, it is not always clear when, why, how, and at what scale the 4D (3D + time) nature of the subsurface needs to be considered in hydrogeological monitoring, modeling, and applications. In this paper, we discuss the interest in and potential for monitoring and characterization of spatial and temporal variability, including 4D imaging, in a series of hydrogeological processes: (1) groundwater fluxes, (2) solute transport and reaction, (3) vadose zone dynamics, and (4) surface–subsurface water interactions. We first identify the main challenges related to the coupling of spatial and temporal fluctuations for these processes. We then highlight recent innovations that have led to significant breakthroughs in high-resolution space–time imaging and in the characterization, monitoring, and modeling of these spatial and temporal fluctuations. We finally propose a classification of processes and applications at different scales according to their need and potential for high-resolution space–time imaging. We thus advocate a more systematic characterization of the dynamic and 3D nature of the subsurface for a series of critical processes and emerging applications. This calls for the validation of 4D imaging techniques at highly instrumented observatories and the harmonization of open databases to share hydrogeological data sets in their 4D components.

    Doctor of Philosophy

    According to a UN report, more than 50% of the world's population resides in urban areas, and this fraction is increasing. Urbanization has a wide range of potential environmental impacts, including those related to the dispersion of potentially dangerous substances emitted from activities such as combustion and industrial processing, or from deliberate harmful releases. This research is primarily focused on investigating the various factors that contribute to the dispersion of certain classes of materials in a complex urban environment, and on improving both fundamental components of a fast-response dispersion modeling system: wind modeling and dispersion modeling. Specifically, new empirical parameterizations have been suggested for an existing fast-response wind model for street canyon flow fields. These new parameterizations are shown to produce more favorable results when compared with the experimental data. It is also demonstrated that the use of Graphics Processing Unit (GPU) technology can enhance the efficiency of an urban Lagrangian dispersion model and can achieve near real-time particle advection. The GPU also enables real-time visualizations which can be used for creating virtual urban environments to aid emergency responders. The dispersion model based on the GPU architecture relies on the so-called "simplified Langevin equations (SLEs)" for particle advection. The full or generalized form of the Langevin equations (GLEs) is known for its stiffness, which tends to generate unstable modes in particle trajectories, where a particle may travel significant distances in a small time step.
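    As a rough illustration of the simplified-Langevin particle advection described here, the NumPy sketch below performs one Ornstein–Uhlenbeck update of the turbulent velocity fluctuation followed by a position update. The homogeneous-turbulence form, the parameter names (sigma_u, tau_l), and the scalar Lagrangian time scale are illustrative assumptions, not the dissertation's exact SLE formulation or its GPU implementation.

```python
import numpy as np

def advect_particles(x, u, mean_wind, sigma_u, tau_l, dt, rng):
    """One step of a (simplified) Langevin dispersion model.

    x         : particle positions, shape (n, 3)
    u         : turbulent velocity fluctuations, shape (n, 3)
    mean_wind : mean wind interpolated to the particle positions, shape (n, 3)
    sigma_u   : turbulent velocity standard deviation (scalar or broadcastable)
    tau_l     : Lagrangian velocity time scale (scalar)
    """
    # Ornstein-Uhlenbeck update of the velocity fluctuation (homogeneous form)
    dW = rng.standard_normal(u.shape) * np.sqrt(dt)
    u = u - (u / tau_l) * dt + np.sqrt(2.0 * sigma_u**2 / tau_l) * dW
    # advance positions with mean wind plus fluctuation
    x = x + (mean_wind + u) * dt
    return x, u

# toy usage: 10,000 particles released at the origin in a uniform 5 m/s wind
rng = np.random.default_rng(0)
n = 10_000
x = np.zeros((n, 3))
u = np.zeros((n, 3))
wind = np.tile([5.0, 0.0, 0.0], (n, 1))
for _ in range(600):  # 600 steps of 0.1 s
    x, u = advect_particles(x, u, wind, sigma_u=0.5, tau_l=20.0, dt=0.1, rng=rng)
```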

    Bio-Optical Sensors on Argo Floats

    The general objective of the IOCCG BIO-Argo working group is to elaborate recommendations for establishing a framework for the future development of a cost-effective, bio-optical float network corresponding to the needs and expectations of the scientific community. In this context, our recommendations will necessarily be broad; they range from the identification of key bio-optical measurements to be implemented on floats, to the real-time management of the data flux resulting from the deployment of a "fleet of floats". Each chapter of this report is dedicated to an essential building block leading towards the goal of implementing a bio-optical profiling float network. The following topics are discussed in the chapters listed below:
    - Chapter 2 reviews the scientific objectives that could be tackled through the development of such networks, by allowing some of the gaps in the present spatio-temporal resolution of bio-optical variables to be progressively filled.
    - Chapter 3 identifies the optical and bio-optical properties that are now amenable to remote and autonomous measurement through the use of optical sensors mounted on floats.
    - Chapter 4 addresses the question of sensor requirements, in particular with respect to measurements performed from floats.
    - Chapter 5 proposes and argues for the development of dedicated float missions corresponding to specific scientific objectives and relying on specific optical sensor suites, as well as on specific modes of float operation.
    - Chapter 6 identifies technological issues that need to be addressed for the various bio-optical float missions to become even more cost-effective.
    - Chapter 7 covers all aspects of data treatment, ranging from the development of various quality control procedures (from real-time to delayed mode) to the architecture required for favoring easy access to data.
    - Chapter 8 reviews the necessary steps and experience required before the operational implementation of different types of float networks can become a reality.

    Bayesian methods in glaciology

    Thesis (Ph.D.), University of Alaska Fairbanks, 2017. The problem of inferring the value of unobservable model parameters given a set of observations is ubiquitous in glaciology, as are large measurement errors. Bayes' theorem provides a unified framework for addressing such problems in a rigorous and robust way through Monte Carlo sampling of posterior distributions, which provides not only the optimal solution for a given inverse problem, but also the uncertainty. We apply these methods to three glaciological problems. First, we use Markov Chain Monte Carlo sampling to infer the importance of different glacier hydrological processes from observations of terminus water flux and surface speed. We find that the opening of subglacial cavities due to sliding over asperities at the glacier bed is of a similar magnitude to the opening of channels due to turbulent melt during periods of large input flux, but also that the process of turbulent melting is the greatest source of uncertainty in hydrological modelling. Both storage of water in englacial void spaces and exchange of water between the englacial and subglacial systems are necessary to explain observations. We next use Markov Chain Monte Carlo sampling to determine distributed glacier thickness from dense observations of surface velocity and mass balance coupled with sparse direct observations of thickness. These three variables are related through the principle of mass conservation. We develop a new framework for modelling observational uncertainty, then apply the method to three test cases. We find a strong relationship between measurement uncertainty, measurement spacing, and the resulting uncertainty in thickness estimates. We also find that in order to minimize uncertainty, measurement spacing should be 1-2 times the characteristic length scale of variations in subglacial topography. Finally, we apply the method of particle filtering to compute robust estimates of ice surface velocity and uncertainty from oblique time-lapse photos for the rapidly retreating Columbia Glacier. The resulting velocity fields, when averaged over suitable time scales, agree well with velocity measurements derived from satellites. At higher temporal resolution, our results suggest that seasonal evolution of the subglacial drainage system is responsible for observed changes in ice velocity at seasonal scales, and that this changing configuration produces varying degrees of glacier flow sensitivity to changes in external water input.
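    As a minimal sketch of the kind of Monte Carlo posterior sampling described here, the following random-walk Metropolis sampler infers a scalar parameter from noisy observations. The toy linear forward model g(h) = 2h, the Gaussian prior, the step size, and all variable names are illustrative assumptions, not the glaciological models used in the thesis.

```python
import numpy as np

def metropolis(log_posterior, theta0, n_samples, step=0.1, seed=0):
    """Minimal random-walk Metropolis sampler over model parameters."""
    rng = np.random.default_rng(seed)
    theta = np.array(theta0, dtype=float)
    logp = log_posterior(theta)
    samples = np.empty((n_samples, theta.size))
    for i in range(n_samples):
        proposal = theta + step * rng.standard_normal(theta.size)
        logp_prop = log_posterior(proposal)
        # accept with probability min(1, posterior ratio)
        if np.log(rng.uniform()) < logp_prop - logp:
            theta, logp = proposal, logp_prop
        samples[i] = theta
    return samples

# toy inverse problem: infer a scalar "thickness" h from noisy observations
# d = g(h) + noise, with a stand-in linear forward model g(h) = 2 h
obs, sigma = np.array([10.2, 9.7, 10.5]), 0.5

def log_post(theta):
    h = theta[0]
    log_like = -0.5 * np.sum((obs - 2.0 * h) ** 2) / sigma**2
    log_prior = -0.5 * (h / 100.0) ** 2          # weak Gaussian prior
    return log_like + log_prior

chain = metropolis(log_post, theta0=[1.0], n_samples=5000)
h_mean, h_std = chain.mean(), chain.std()        # posterior mean and uncertainty
```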

    Development of high-resolution L4 ocean wind products

    [eng] Heat, moisture, gas, and momentum exchanges at the ocean–atmosphere interface modulate, inter alia, the Earth's heat and carbon budgets, global circulation, and dynamical modes. Sea surface winds are fundamental to these exchanges and, as such, play a major role in the evolution and dynamics of the Earth's climate. For ocean and atmospheric modeling purposes, and for their coupling, accurate sea surface winds are therefore crucial to properly estimate these turbulent fluxes. Over the last decades, as numerical models became more sophisticated, the requirements for higher temporal and spatial resolution ocean forcing products grew. Sea surface winds from numerical weather prediction (NWP) models provide convenient temporal and spatial coverage to force ocean models and, as such, are extensively used, e.g., the latest reanalysis of the European Centre for Medium-range Weather Forecasts (ECMWF), ERA5, with hourly estimates of sea surface wind available globally on a 30-km spatial grid. However, local systematic errors have been reported in global NWP fields using collocated scatterometer observations as reference. These rather persistent errors are associated with physical processes that are absent or misrepresented by the NWP models, e.g., strong current effects such as the Western Boundary Current Systems (highly stationary), wind effects associated with the oceanic mesoscale (sea surface temperature gradients), coastal effects (land–sea breezes, katabatic winds), planetary boundary layer parameterization errors, and large-scale circulation effects, such as those associated with moist convection areas. In contrast, the ocean surface vector wind or wind stress derived from scatterometers, although intrinsically limited by temporal and spatial sampling, exhibits considerable spatial detail and accuracy: scatterometer winds have an effective resolution of about 25 km, whereas that of NWP models is around 150 km. Consequently, the biases between the two mostly represent the physical processes unresolved by NWP models.

    In this thesis, a high-resolution ocean surface wind forcing product, the so-called ERA*, which combines the strengths of both the scatterometer observations and the atmospheric model wind fields, is created using a scatterometer-based correction of local NWP wind vector biases. ERA* stress-equivalent wind (U10S) is generated by means of a geolocated scatterometer-based correction applied separately to two different ECMWF reanalyses, the now-obsolete ERA-interim (ERAi) and the most recent ERA5. Several ERA* configurations using complementary scatterometer data accumulated over different temporal windows (TW) are generated and verified against independent wind sources (scatterometer and moored buoys) through statistical and spectral analysis of spatial structures. The newly developed method successfully corrects for local wind vector biases in the reanalysis output, particularly in open-ocean regions, by introducing the oceanic mesoscales captured by the scatterometers into the ERAi/ERA5 NWP reanalyses. However, the effectiveness of the method is intrinsically dependent on regional scatterometer sampling, wind variability, and local bias persistence. The optimal ERA* configuration uses multiple complementary scatterometers and a 3-day TW. Bias patterns are similar for the scatterometer corrections applied to the two reanalyses, though the ERA5-based correction shows smaller bias amplitudes and hence smaller error variance reduction in verification (differences of up to 8% globally). Because ERA5 is more accurate than ERAi, ERA* derived from ERA5 turns out to be the highest-quality product. ERA* ocean forcing does not enhance the sensitivity of global circulation models to highly localized transient events; however, it improves large-scale ocean simulations, where large-scale corrections are relevant. Besides ocean forcing studies, the developed methodology can be further applied to improve scatterometer wind data assimilation by accounting for the persistent model biases. In addition, since the biases can be associated with misrepresented processes and parameterizations, empirical predictors of these biases can be developed for use in forecasting and to improve the dynamical closure and parameterizations in coupled ocean-atmosphere models.

    [spa] Sea surface winds are fundamental for estimating heat and momentum fluxes at the ocean–atmosphere interface and thus play an important role in the evolution and dynamics of the planet's climate. In ocean and atmospheric modeling, high-quality winds are therefore crucial to properly estimate these turbulent fluxes. Sea surface winds from numerical weather prediction (NWP) model output provide convenient temporal and spatial coverage for forcing ocean models and are still widely used. However, local systematic errors have been documented in global NWP fields using collocated scatterometer observations as reference (errors associated with physical processes that are absent or misrepresented by the models). In contrast, sea surface winds derived from scatterometers, although intrinsically limited by temporal and spatial sampling, exhibit considerable accuracy and spatial detail. Consequently, the biases between the two mainly represent the physical processes unresolved by the NWP models. In this thesis, a high-resolution ocean surface wind forcing product, ERA*, is created. ERA* is generated with a mean correction based on geolocated differences between scatterometer and model, applied separately to two different reanalyses, ERA-interim (ERAi) and ERA5. Several ERA* configurations, using complementary scatterometer data accumulated over different temporal windows (TW), are generated and validated against independent wind data through statistical and spectral analyses of spatial structures. The method successfully corrects the local wind vector biases of the reanalyses. However, its effectiveness depends on regional scatterometer sampling, wind variability, and local bias persistence. The optimal ERA* uses multiple complementary scatterometers and a 3-day TW. The two reanalyses show the same bias patterns in the scatterometer correction; because ERA5 is more accurate than ERAi, ERA* derived from ERA5 is the highest-quality product. ERA* ocean forcing improves large-scale ocean simulations, where large-scale corrections are relevant.
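    In very schematic form, a geolocated, temporally windowed scatterometer bias correction could look like the NumPy sketch below: per grid cell, the mean scatterometer-minus-model difference over a trailing temporal window is added to the model wind. The gridded collocation, the time stepping, and the function and variable names are assumptions made for illustration; the actual ERA* processing (U10S handling, multi-scatterometer combination, TW choices) is more involved.

```python
import numpy as np

def local_bias_correction(nwp_u10s, scat_u10s, scat_mask, window=3):
    """Schematic geolocated, temporally windowed scatterometer bias correction.

    nwp_u10s  : model stress-equivalent wind on a (time, lat, lon) grid
    scat_u10s : collocated scatterometer wind on the same grid (NaN where unsampled)
    scat_mask : boolean array, True where a scatterometer observation exists
    window    : number of trailing time steps over which local differences are averaged
    Returns the corrected wind field.
    """
    diff = np.where(scat_mask, scat_u10s - nwp_u10s, np.nan)
    corrected = nwp_u10s.copy()
    for t in range(nwp_u10s.shape[0]):
        lo = max(0, t - window + 1)
        count = scat_mask[lo:t + 1].sum(axis=0)
        total = np.nansum(diff[lo:t + 1], axis=0)
        # mean scatterometer-minus-model difference per grid cell over the window;
        # cells with no scatterometer samples in the window keep the model value
        local_bias = np.where(count > 0, total / np.maximum(count, 1), 0.0)
        corrected[t] += local_bias
    return corrected

# toy usage: model field with a constant local bias, sparsely observed
rng = np.random.default_rng(0)
model = 8.0 + rng.standard_normal((10, 90, 180))
mask = rng.uniform(size=model.shape) < 0.3
scat = np.where(mask, model + 1.5, np.nan)   # scatterometer "sees" a +1.5 m/s offset
corrected = local_bias_correction(model, scat, mask, window=3)
```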