Changes in extremely hot days under stabilized 1.5 °C and 2.0 °C global warming scenarios as simulated by the HAPPI multi-model ensemble
The Half a degree Additional warming, Prognosis and Projected Impacts
(HAPPI) experimental protocol provides a multi-model database to compare the
effects of stabilizing anthropogenic global warming of 1.5 °C over
preindustrial levels to 2.0 °C over these levels. The HAPPI experiment
is based upon large ensembles of global atmospheric models forced by sea
surface temperature and sea ice concentrations plausible for these
stabilization levels. This paper examines changes in extremes of high
temperatures averaged over three consecutive days. Changes in this measure
of extreme temperature are also compared to changes in hot season
temperatures. We find that over land this measure of extreme high
temperature increases from about 0.5 to 1.5 °C over present-day values
in the 1.5 °C stabilization scenario, depending on location and model. We
further find an additional 0.25 to 1.0 °C increase in extreme high
temperatures over land in the 2.0 °C stabilization scenario. Results
from the HAPPI models are consistent with similar results from the one
available fully coupled climate model. However, a complicating factor in
interpreting extreme temperature changes across the HAPPI models is their
diversity of aerosol forcing changes.
Three-dimensional localization of nanoscale battery reactions using soft X-ray tomography.
Battery function is determined by the efficiency and reversibility of the electrochemical phase transformations at solid electrodes. The microscopic tools available to study the chemical states of matter with the required spatial resolution and chemical specificity are intrinsically limited when studying complex architectures by their reliance on two-dimensional projections of thick material. Here, we report the development of soft X-ray ptychographic tomography, which resolves chemical states in three dimensions at 11 nm spatial resolution. We study an ensemble of nano-plates of lithium iron phosphate extracted from a battery electrode at 50% state of charge. Using a set of nanoscale tomograms, we quantify the electrochemical state and resolve phase boundaries throughout the volume of individual nanoparticles. These observations reveal multiple reaction points, intra-particle heterogeneity, and size effects that highlight the importance of multi-dimensional analytical tools in providing novel insight into the design of the next generation of high-performance devices.
Vitamin D Levels in Asymptomatic Adults: A Population Survey in Karachi, Pakistan
Background: It is well established that low levels of 25(OH) vitamin D are a common finding the world over, affecting over a billion of the global population. Our primary objective was to determine the prevalence of vitamin D deficiency and insufficiency in the asymptomatic adult population of Karachi, Pakistan, and the demographic, nutritional, and co-morbidity characteristics associated with serum vitamin D levels. Methods: A cross-sectional population survey was conducted at two spaced-out, densely populated areas of the city. Serum levels of 25(OH) vitamin D were measured, and renal function was assessed as GFR estimated with the 4-variable MDRD formula. Results: Our sample of 300 had a median age of 48 (interquartile range 38-55) years. The median level of serum vitamin D was 18.8 (IQ range 12.65-24.62) ng/dL. A total of 253 (84.3%) respondents had low levels of 25(OH) vitamin D. Serum PTH and vitamin D were negatively correlated (r = -0.176, p = 0.001). The median PTH in the vitamin D sufficiency group was 38.4 (IQ range 28.0-48.8) pg/mL compared with 44.4 (IQ range 34.3-56.8) pg/mL in the deficiency group (p = 0.011). The median serum calcium level in the sample was 9.46 (IQ range 9.18-9.68) ng/dL. Low serum levels of vitamin D were not associated with hypertension (p = 0.771) or with an elevated spot blood pressure (p = 0.164). In our sample, 75 (26%) respondents had an eGFR corresponding to stage 2 and stage 3 CKD. There was no significant correlation between levels of vitamin D and eGFR (r = -0.127, p-value = 0.277). Respondents using daily vitamin D supplements had higher 25(OH) vitamin D levels (p-value = 0.021). Conclusion: We observed a high proportion of the asymptomatic adult population having low levels of vitamin D and subclinical deterioration of eGFR. The specific cause(s) for this observed high prevalence of low 25(OH) vitamin D levels are not clear and need to be investigated further.
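The renal-function assessment above cites the 4-variable MDRD equation; a minimal sketch of that formula follows (the IDMS-traceable 175 coefficient and the standard published adjustment factors; the function name is illustrative, and this is not clinical code):

```python
def mdrd_egfr(serum_creatinine_mg_dl, age_years, female=False, black=False):
    """4-variable MDRD estimate of GFR in mL/min/1.73 m^2 (IDMS-traceable form)."""
    egfr = 175.0 * serum_creatinine_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742  # standard MDRD sex adjustment factor
    if black:
        egfr *= 1.212  # standard MDRD race adjustment factor
    return egfr

# Example: 50-year-old male with serum creatinine 1.0 mg/dL -> ~79 mL/min/1.73 m^2
print(round(mdrd_egfr(1.0, 50), 1))
```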
Near-edge X-ray Refraction Fine Structure Microscopy
We demonstrate a method for obtaining increased spatial resolution and specificity in nanoscale chemical composition maps through the use of full refractive reference spectra in soft x-ray spectro-microscopy. Using soft x-ray ptychography, we measure both the absorption and refraction of x-rays through pristine reference materials as a function of photon energy and use these reference spectra as the basis for decomposing spatially resolved spectra from a heterogeneous sample, thereby quantifying the composition at high resolution. While conventional instruments are limited to absorption contrast, our novel refraction-based method takes advantage of the strongly energy-dependent scattering cross-section and achieves nearly five-fold improved spatial resolution on resonance.
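Decomposing a pixel's spectrum onto reference spectra, as described above, is at heart a constrained linear unmixing. A toy sketch with two hypothetical Gaussian reference resonances (all energies and mixture weights are invented for illustration, not the paper's data):

```python
import numpy as np
from scipy.optimize import nnls

energies = np.linspace(700.0, 730.0, 60)  # photon energies (eV), illustrative
ref_a = np.exp(-0.5 * ((energies - 710.0) / 2.0) ** 2)  # reference spectrum, phase A
ref_b = np.exp(-0.5 * ((energies - 722.0) / 2.0) ** 2)  # reference spectrum, phase B
basis = np.column_stack([ref_a, ref_b])

# A measured pixel spectrum: an unknown non-negative mixture of the references
pixel = 0.7 * ref_a + 0.3 * ref_b

# Non-negative least squares recovers the per-pixel composition weights
weights, residual = nnls(basis, pixel)
print(weights)  # ~[0.7, 0.3]
```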
The Atmospheric River Tracking Method Intercomparison Project (ARTMIP): Quantifying Uncertainties in Atmospheric River Climatology
Atmospheric rivers (ARs) are now widely known for their association with high-impact weather events and long-term water supply in many regions. Researchers within the scientific community have developed numerous methods to identify and track ARs, a necessary step for analyses on gridded data sets and for objective attribution of impacts to ARs. These different methods have been developed to answer specific research questions and hence use different criteria (e.g., geometry, threshold values of key variables, and time dependence). Furthermore, these methods are often employed using different reanalysis data sets, time periods, and regions of interest. The goal of the Atmospheric River Tracking Method Intercomparison Project (ARTMIP) is to understand and quantify uncertainties in AR science that arise due to differences in these methods. This paper presents results for key AR-related metrics based on 20+ different AR identification and tracking methods applied to Modern-Era Retrospective Analysis for Research and Applications Version 2 reanalysis data from January 1980 through June 2017. We show that AR frequency, duration, and seasonality exhibit a wide range of results, while the meridional distribution of these metrics along selected coastal (but not interior) transects is quite similar across methods. Furthermore, methods are grouped into criteria-based clusters, within which the range of results is reduced. AR case studies and an evaluation of individual method deviation from an all-method mean highlight advantages and disadvantages of certain approaches. For example, methods with less (more) restrictive criteria identify more (fewer) ARs and AR-related impacts. Finally, this paper concludes with a discussion and recommendations for those conducting AR-related research to consider.
Authors: Rutz, J. J.; Shields, C. A.; Lora, J. M.; Payne, A. E.; Guan, B.; Ullrich, P.; O'Brien, T.; Leung, R.; Ralph, F. M.; Wehner, M.; Brands, S.; Collow, A.; Goldenson, N.; Gorodetskaya, I.; Griffith, H.; Kashinath, K.; Kawzenuk, B.; Krishnan, H.; Kurlin, V.; Lavers, D.; Magnusdottir, G.; Mahoney, K.; McClenny, E.; Muszynski, G.; Nguyen, P. D.; Prabhat; Qian, Y.; Ramos, A. M.; Sarangi, C.; Viale, M.
Exact Gaussian processes for massive datasets via non-stationary sparsity-discovering kernels
A Gaussian Process (GP) is a prominent mathematical framework for stochastic function approximation in science and engineering applications. Its success is largely attributed to the GP's analytical tractability, robustness, and natural inclusion of uncertainty quantification. Unfortunately, the use of exact GPs is prohibitively expensive for large datasets due to their unfavorable numerical complexity of O(N³) in computation and O(N²) in storage. All existing methods addressing this issue utilize some form of approximation, usually considering subsets of the full dataset or finding representative pseudo-points that render the covariance matrix well-structured and sparse. These approximate methods can lead to inaccuracies in function approximations and often limit the user's flexibility in designing expressive kernels. Instead of inducing sparsity via data-point geometry and structure, we propose to take advantage of naturally occurring sparsity by allowing the kernel to discover, rather than induce, sparse structure. The premise of this paper is that the data sets and physical processes modeled by GPs often exhibit natural or implicit sparsities, but commonly used kernels do not allow us to exploit such sparsity. The core concept of exact, and at the same time sparse, GPs relies on kernel definitions that provide enough flexibility to learn and encode not only non-zero but also zero covariances. This principle of ultra-flexible, compactly supported, and non-stationary kernels, combined with HPC and constrained optimization, lets us scale exact GPs well beyond 5 million data points.
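The central idea, kernels that can encode exact zero covariances, can be illustrated with a generic compactly supported Wendland kernel. This is a standard textbook construction used here as an assumption; the authors' actual kernels are non-stationary and learned from data:

```python
import numpy as np

def wendland_c2(r, radius=0.2):
    # Wendland C2 kernel: positive definite (in up to 3 dimensions) and
    # exactly zero beyond `radius`, so the covariance matrix it generates
    # is naturally sparse without any approximation.
    s = np.clip(r / radius, 0.0, 1.0)
    return (1.0 - s) ** 4 * (4.0 * s + 1.0)

x = np.linspace(0.0, 1.0, 500)
K = wendland_c2(np.abs(x[:, None] - x[None, :]))

# Most entries are exactly zero, yet K remains a valid covariance matrix
density = np.count_nonzero(K) / K.size
np.linalg.cholesky(K + 1e-8 * np.eye(len(x)))  # succeeds: K is positive definite
print(f"nonzero fraction: {density:.2f}")
```

Because the zeros are exact rather than approximated, sparse linear algebra can be applied to K without giving up GP exactness, which is the scaling lever the abstract describes.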
Changes in tropical cyclones under stabilized 1.5 °C and 2.0 °C global warming scenarios as simulated by the Community Atmospheric Model under the HAPPI protocols
Abstract. The United Nations Framework Convention on Climate Change (UNFCCC) invited the scientific community to explore the impacts of a world where anthropogenic global warming is stabilized at only 1.5 °C above preindustrial average temperatures. We present a projection of future tropical cyclone statistics for both 1.5 °C and 2.0 °C stabilized warming scenarios by direct numerical simulation using a high-resolution global climate model. As in similar projections at higher warming levels, we find that even at these low warming levels the most intense tropical cyclones become more frequent and more intense, while the frequency of weaker tropical storms decreases. We also conclude that in the 1.5 °C stabilization, the effect of aerosol forcing changes complicates the interpretation of greenhouse gas forcing changes.
Early 21st century anthropogenic changes in extremely hot days as simulated by the C20C+ detection and attribution multi-model ensemble
We examine the effect of 20th and early 21st century anthropogenic climate change on high temperature extremes as simulated by four global atmospheric general circulation models submitted to the Climate of the 20th Century Plus Detection and Attribution project. This coordinated experiment is based upon two large ensembles of simulations for each participating model. The "world that was" simulations are externally forced as realistically as possible. The "world that might have been" simulations are identical except that the influence of human forcing is removed, while natural forcing agents and variations in ocean and sea ice are retained. We apply a stationary generalized extreme value analysis to the annual maxima of the three-day average of the daily maximum surface air temperature, finding that long-period return values have been increased by human activities by between 1 and 3 °C over most land areas. Corresponding changes in the probability of exceeding long-period non-industrial return values in the industrialized world are also presented. We find that most regions experience increases in the frequency and intensity of extremely hot three-day periods, but anthropogenic sulfate aerosol forcing changes can locally decrease these measures of heat waves in some models.
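The stationary GEV fit to annual maxima described above can be sketched with scipy on synthetic data standing in for the model output (the distribution parameters are invented; note that scipy's shape parameter c is the negative of the conventional GEV shape ξ):

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic stand-in for 100 years of annual maxima of the 3-day average
# of daily maximum surface air temperature (deg C); parameters illustrative
annual_max = genextreme.rvs(c=-0.1, loc=35.0, scale=1.5, size=100,
                            random_state=np.random.default_rng(0))

# Fit a stationary GEV and read off the 20-year return value:
# the level exceeded with probability 1/20 in any given year
c_hat, loc_hat, scale_hat = genextreme.fit(annual_max)
rv20 = genextreme.ppf(1.0 - 1.0 / 20.0, c_hat, loc=loc_hat, scale=scale_hat)
print(f"20-year return value: {rv20:.1f} deg C")
```

Comparing return values fitted separately to the forced and counterfactual ensembles is what yields the attributable change the abstract reports.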
Structure recognition from high resolution images of ceramic composites
Fibers provide exceptional strength-to-weight ratio capabilities when woven into ceramic composites, transforming them into materials with exceptional resistance to high temperature and high strength combined with improved fracture toughness. Microcracks are inevitable when the material is under strain, and they can be imaged using synchrotron X-ray computed micro-tomography (mu-CT) to assess variation in the material's mechanical toughness. An important part of this analysis is to recognize fibrillar features. This paper presents algorithms for detecting and quantifying composite cracks and fiber breaks from high-resolution image stacks. First, we propose recognition algorithms to identify the different structures of the composite, including matrix cracks and fiber breaks. Second, we introduce our package F3D for fast filtering of large 3D imagery, implemented in OpenCL to take advantage of graphics cards. Results show that our algorithms automatically identify micro-damage and that the GPU-based implementation introduced here takes minutes, being 17x faster than similar tools on a typical image file.
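The filter-then-segment step can be mimicked with standard CPU tools; here scipy's ndimage stands in for the GPU F3D filters, and the synthetic volume and threshold are invented for illustration:

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)
# Tiny synthetic mu-CT volume: bright composite matrix plus noise,
# with a dark two-voxel-thick plane mimicking a matrix crack
stack = 0.8 + 0.05 * rng.standard_normal((32, 32, 32))
stack[15:17, :, :] = 0.1

# 3D median filter suppresses noise while preserving the crack plane
smoothed = ndimage.median_filter(stack, size=3)

# Simple threshold segmentation isolates the dark crack voxels
crack_mask = smoothed < 0.4
print(crack_mask[15].mean(), crack_mask[5].mean())
```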
Toward implementing autonomous adaptive data acquisition for scanning hyperspectral imaging of biological systems
Autonomous experimentation is an emerging area of research, primarily related to autonomous vehicles, scientific combinatorial discovery approaches in materials science and drug discovery, and iterative research loops of planning, experimentation, and analysis. However, autonomous approaches developed in these contexts are difficult to apply to high-dimensional mapping technologies, such as scanning hyperspectral imaging of biological systems, due to sample complexity and heterogeneity. We briefly cover the history of adaptive sampling algorithms and surrogate modeling in order to define autonomous adaptive data acquisition as an objective-based, flexible building block for future biological imaging experimentation driven by intelligent infrastructure. We subsequently summarize the recent implementations of autonomous adaptive data acquisition (AADA) for scanning hyperspectral imaging, assess how these address the difficulties of autonomous approaches in hyperspectral imaging, and highlight the AADA design variation from a goal-oriented perspective. Finally, we present a modular AADA architecture that embeds AADA-driven flexible building blocks to address the challenge of time resolution for high-dimensional scanning hyperspectral imaging of nonequilibrium dynamical systems. In our example research-driven experimental design case, we propose an AADA infrastructure for time-resolved, noninvasive, and label-free scanning hyperspectral imaging of living biological systems. This AADA infrastructure can accurately target the correct state of the system for experimental workflows that utilize subsequent expensive, high-information-content analytical techniques.
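The adaptive acquisition loop that AADA generalizes can be reduced to a toy sketch: a GP surrogate's predictive variance picks the next measurement location. This is pure uncertainty-driven exploration; for a GP the predictive variance depends only on where you have sampled, so the measurement model is omitted, and all names and numbers here are illustrative rather than any published implementation:

```python
import numpy as np

def rbf(a, b, length_scale=0.15):
    # Squared-exponential covariance between two 1-D point sets
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length_scale) ** 2)

def posterior_variance(x_sampled, x_candidates, jitter=1e-6):
    # Predictive variance of a unit-variance GP surrogate at candidate points
    K = rbf(x_sampled, x_sampled) + jitter * np.eye(len(x_sampled))
    k = rbf(x_candidates, x_sampled)
    return 1.0 - np.sum(k * np.linalg.solve(K, k.T).T, axis=1)

x_sampled = np.array([0.1, 0.9])        # initial measurement locations
candidates = np.linspace(0.0, 1.0, 200)

for _ in range(8):                      # acquire 8 more points autonomously
    var = posterior_variance(x_sampled, candidates)
    x_sampled = np.append(x_sampled, candidates[np.argmax(var)])  # "measure" there

print(posterior_variance(x_sampled, candidates).max())
```

In a real AADA system the acquisition function would also encode experimental objectives and costs, but the plan-measure-update loop has this same shape.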