Estimated solar contribution to the global surface warming using the ACRIM TSI satellite composite
We study, by using a wavelet decomposition methodology, the solar signature
on global surface temperature data using the ACRIM total solar irradiance
satellite composite by Willson and Mordvinov. These data present a
+0.047%/decade trend between minima during solar cycles 21-23 (1980-2002). We
estimate that the ACRIM upward trend may have contributed at least 10-30% of
the global surface temperature warming over the period 1980-2002.
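The multiscale idea behind the wavelet analysis can be illustrated with a minimal sketch. This is not the authors' method or data: it uses a simple Haar-style pyramid on a synthetic monthly series (a linear trend plus an 11-year cycle proxy), and every name and parameter below is an assumption.

```python
import numpy as np

def haar_decompose(x, levels):
    """Split a signal into Haar approximation/detail bands; coarsest last."""
    approx, details = np.asarray(x, dtype=float), []
    for _ in range(levels):
        even, odd = approx[0::2], approx[1::2]
        n = min(len(even), len(odd))
        details.append((even[:n] - odd[:n]) / np.sqrt(2))
        approx = (even[:n] + odd[:n]) / np.sqrt(2)
    return approx, details

# Synthetic 22-year monthly series: slow warming trend plus an ~11-year cycle
t = np.arange(264)
signal = 0.002 * t + 0.1 * np.sin(2 * np.pi * t / 132)
approx, details = haar_decompose(signal, levels=4)
# The coarse approximation isolates the trend, while the detail bands hold
# the faster oscillations that a wavelet analysis would attribute to
# cyclic variability.
```

A real analysis would use the ACRIM composite and temperature records and a smoother wavelet basis, but the band separation logic is the same.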
Diagnostics of accelerating plasma Semiannual progress report, 1 Sep. 1968 - 28 Feb. 1969
Accelerating plasma diagnostics - validity of local thermal equilibrium assumption in electromagnetic shock tubes, and current-sheet velocity in coaxial plasma accelerators
Diagnostics of accelerating plasma Semiannual progress report, 1 Mar. - 31 Aug. 1968
Plasma diagnostics in electromagnetically driven shock tubes using laser scattering methods as compared to spectroscopic techniques
Monte Carlo simulation of the transmission of measles: Beyond the mass action principle
We present a Monte Carlo simulation of the transmission of measles within a
population sample during its growing and equilibrium states by introducing two
different vaccination schedules of one and two doses. We study the effects of
the contact rate per unit time as well as the initial conditions on the
persistence of the disease. We found a weak effect of the initial conditions,
while the disease persists when the contact rate lies in the range 1/L-10/L (L being
the latent period). Further comparison with existing data, prediction of future
epidemics and other estimations of the vaccination efficiency are provided.
Finally, we compare our approach to models based on the mass action
principle in both the growing and post-peak epidemic regions, and found the
incidence independent of the number of susceptibles after the epidemic peak,
while it fluctuates strongly in the growing region. This method can easily be
applied to other human, animal and plant diseases and can include more
complicated parameters.
Comment: 15 pages, 4 figures, 1 table. Submitted to Phys. Rev.
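A stochastic transmission model of the kind described can be sketched as a toy agent-based SEIR-style simulation. This is not the authors' code: the population size, latent and infectious periods, contact rate and vaccination coverage below are illustrative assumptions, not parameters fitted to measles data.

```python
import random

def simulate_measles(pop=500, latent=8, infectious=5, contact_rate=0.3,
                     coverage=0.9, days=365, seed=1):
    """Toy stochastic SEIR-style run; returns the daily count of infectives."""
    rng = random.Random(seed)
    # States: 'S' susceptible, 'E' exposed (latent), 'I' infectious, 'R' removed
    state = ['R' if rng.random() < coverage else 'S' for _ in range(pop)]
    timer = [0] * pop
    state[0], timer[0] = 'I', infectious      # seed a single infective
    history = []
    for _ in range(days):
        n_inf = state.count('I')
        for _ in range(n_inf):
            # Each infective makes, on average, contact_rate contacts per day
            if rng.random() < contact_rate:
                j = rng.randrange(pop)
                if state[j] == 'S':
                    state[j], timer[j] = 'E', latent
        for i in range(pop):
            if state[i] in ('E', 'I'):
                timer[i] -= 1
                if timer[i] == 0:
                    state[i], timer[i] = ('I', infectious) if state[i] == 'E' else ('R', 0)
        history.append(state.count('I'))
    return history

daily_infectives = simulate_measles()
```

Varying `contact_rate` and the initial conditions over repeated seeds is how a Monte Carlo study of persistence like the one described would proceed.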
Filling the intervention gap: service evaluation of an intensive nonsurgical weight management programme for severe and complex obesity
Background:
Weight management including formula total diet replacement (TDR) is emerging as an effective intervention for severe and complex obesity, particularly with respect to type 2 diabetes (T2DM). However, no prospective audit and service evaluation of such programmes has been reported.
Methods:
Following initial feasibility piloting, the Counterweight-Plus programme was commissioned across a variety of healthcare providers. The programme includes screening, TDR (formula low-energy diet), food reintroduction and weight loss maintenance, all delivered by staff with 8 h of training, in-service mentoring, ongoing specialist support and access to medical consultant expertise. Anonymised data are returned centrally for clinical evaluation.
Results:
Up to December 2016, 288 patients commenced the programme. Mean (SD) baseline characteristics were: age 47.5 (12.7) years, weight 128.0 (32.0) kg, body mass index 45.7 (10.1) kg m-2; n = 76 (26.5%) were male and n = 99 (34.5%) had T2DM. On an intention-to-treat (ITT) basis, a loss of >=15 kg at 12 months was achieved by 48 patients, representing 22.1% of all who started and 40% of those who maintained engagement. For complete cases, mean (95% confidence interval) weight loss was 13.3 (12.1-14.4) kg at 3 months, 16.0 (14.4-17.6) kg at 6 months and 14.2 (12.1-16.3) kg at 12 months (all P < 0.001), with losses to follow-up of 10.8%, 29.3% and 44.2%, respectively. Mean loss at 12 months by ITT analyses was: single imputation -10.5 (9.5) kg, last observation carried forward -10.9 (11.6) kg and baseline observation carried forward -7.9 (11.1) kg. The presence of diabetes had no significant impact on weight change outcomes.
Conclusions:
This nonsurgical approach is effective for many individuals with severe and complex obesity, representing an option to consider before surgery. Weight loss outcomes were equally effective for people with T2DM.
Continuous data assimilation for global numerical weather prediction
A new configuration of the European Centre for Medium-Range Weather Forecasts (ECMWF) incremental 4D-Var data assimilation (DA) system is introduced which builds upon the quasi-continuous DA concept proposed in the mid-1990s. Rather than working with a fixed set of observations, the new 4D-Var configuration exploits the near-continuous stream of incoming observations by introducing recently arrived observations at each outer-loop iteration of the assimilation. This allows the analysis to benefit from more recent observations. Additionally, by decoupling the start time of the DA calculations from the observational data cut-off time, real-time forecasting applications can benefit from more expensive analysis configurations that previously could not have been considered. In this work we present results of a systematic comparison of the performance of a Continuous DA system against that of two more traditional baseline 4D-Var configurations. We show that the quality of the analysis produced by the new, more continuous configuration is comparable to that of a conventional baseline that has access to all of the observations in each of the outer loops, a configuration not feasible in real-time operational numerical weather prediction. For real-time forecasting applications, the Continuous DA framework allows configurations that clearly outperform the best available affordable non-continuous configuration. Continuous DA became operational at ECMWF in June 2019 and led to significant 2-3% reductions in medium-range forecast root mean square errors, roughly equivalent to 2-3 hr of additional predictive skill.
Peer reviewed
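The outer-loop idea can be illustrated with a deliberately simplified scalar analogue: each outer loop re-solves the analysis using all observations that have arrived so far, so later loops benefit from newly arrived data. This toy omits the forecast model, covariance matrices and iterative minimisation of a real incremental 4D-Var; every name and number below is an assumption.

```python
def continuous_da_analyses(background, obs_batches, obs_err=1.0, bg_err=2.0):
    """Toy scalar analogue of continuous DA: each outer loop re-solves the
    analysis using every observation that has arrived so far."""
    received, analyses = [], []
    w_bg, w_ob = 1.0 / bg_err**2, 1.0 / obs_err**2
    for batch in obs_batches:          # one outer loop per arriving batch
        received.extend(batch)
        # Optimal scalar least-squares blend of background and observations
        analysis = ((w_bg * background + w_ob * sum(received))
                    / (w_bg + w_ob * len(received)))
        analyses.append(analysis)
    return analyses

# Hypothetical observations of a true value near 10, arriving in three batches
batches = [[9.6, 10.4, 10.1], [9.9, 10.2], [10.0, 9.8, 10.3]]
loops = continuous_da_analyses(background=8.0, obs_batches=batches)
# Each successive outer loop sees more observations, so the analysis moves
# further from the background toward the observed value.
```

The decoupling of the calculation start time from the data cut-off falls out naturally in this picture: early loops can begin before the last batch arrives.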
Ensemble prediction for nowcasting with a convection-permitting model - II: forecast error statistics
A 24-member ensemble of 1-h high-resolution forecasts over the southern United Kingdom is used to study short-range forecast error statistics. The initial conditions are derived from perturbations generated by an ensemble transform Kalman filter. Forecasts from this system are assumed to lie within the bounds of forecast error of an operational forecast system. Although noisy, this system is capable of producing physically reasonable statistics, which are analysed and compared to statistics implied by a variational assimilation system. The variances of temperature errors, for instance, show structures that reflect convective activity. Some variables, notably potential temperature and specific humidity perturbations, have autocorrelation functions that deviate from 3-D isotropy at the convective scale (horizontal scales less than 10 km). Other variables, notably the velocity potential for horizontal divergence perturbations, maintain 3-D isotropy at all scales. Geostrophic and hydrostatic balances are studied by examining correlations between terms in the divergence and vertical momentum equations, respectively. Both balances are found to decay as the horizontal scale decreases. It is estimated that geostrophic balance becomes less important at scales smaller than 75 km, and hydrostatic balance at scales smaller than 35 km, although more work is required to validate these findings. The implications of these results for high-resolution data assimilation are discussed.
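Ensemble-derived error statistics of the kind analysed here, variances and spatial autocorrelations of perturbations about the ensemble mean, can be sketched on synthetic data. The field, grid spacing and noise model below are hypothetical stand-ins for the 24 forecast members; a real application would use the model fields themselves.

```python
import numpy as np

rng = np.random.default_rng(42)
n_members, n_points, dx = 24, 128, 1.5   # members, grid points, km spacing (all hypothetical)

# Synthetic ensemble along one horizontal row: a shared smooth field plus
# spatially correlated member noise, standing in for the 1-h forecasts.
base = np.sin(np.linspace(0, 4 * np.pi, n_points))
raw = rng.standard_normal((n_members, n_points + 8))
kernel = np.ones(9) / 9.0                 # smoothing gives short-range noise correlation
noise = np.array([np.convolve(r, kernel, mode="valid") for r in raw])
ensemble = base + noise

# Perturbations about the ensemble mean yield sample forecast error statistics
perts = ensemble - ensemble.mean(axis=0)
variance = perts.var(axis=0, ddof=1)      # error variance at each grid point

def autocorrelation(perts, lag):
    """Sample correlation between perturbations separated by `lag` grid points."""
    a, b = perts[:, :-lag].ravel(), perts[:, lag:].ravel()
    return float(np.corrcoef(a, b)[0, 1])

corr_near = autocorrelation(perts, 1)     # separation dx = 1.5 km
corr_far = autocorrelation(perts, 20)     # separation 20 * dx = 30 km
# Correlations decay with separation, the qualitative behaviour the paper
# quantifies scale by scale for each model variable.
```

Repeating the correlation calculation per variable and per scale band is how departures from isotropy, like those reported for potential temperature and specific humidity, would show up.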
A "superstorm": When moral panic and new risk discourses converge in the media
This is an Author's Accepted Manuscript of an article published in Health, Risk and Society, 15(6), 681-698, 2013, copyright Taylor & Francis, available online at: http://www.tandfonline.com/10.1080/13698575.2013.851180.
There has been a proliferation of risk discourses in recent decades, but studies of these have been polarised, drawing either on moral panic or new risk frameworks to analyse journalistic discourses. This article opens the theoretical possibility that the two may co-exist and converge in the same scare. I do this by bringing together more recent developments in moral panic thesis with new risk theory and the concept of media logic. I then apply this theoretical approach to an empirical analysis of how, and with what consequences, moral panic and new risk type discourses converged in the editorials of four newspaper campaigns against GM food policy in Britain in the late 1990s. The article analyses 112 editorials published between January 1998 and December 2000, supplemented with news stories where these were needed for contextual clarity. This analysis shows that not only did this novel food generate intense media and public reactions; these developed in the absence of the type of concrete details journalists usually look for in risk stories. Media logic is important in understanding how journalists were able to engage, and hence how a major scare could be constructed around convergent moral panic and new risk type discourses. The result was a media 'superstorm' of sustained coverage in which both types of discourse converged in highly emotive, mutually reinforcing ways that resonated in a highly sensitised context. The consequence was acute anxiety, social volatility and the potential for the disruption of policy and social change.
Quantum statistical effects in nano-oscillator arrays
We have theoretically predicted the density of states (DOS), the low-temperature
specific heat, and the Brillouin scattering spectra of a large, free-standing
array of coupled nano-oscillators. We have found significant gaps in the DOS of
2D elastic systems, and predict the average DOS to be nearly independent of
frequency over a broad band f < 50 GHz. At low temperatures, the measurements
probe the quantum statistics obeyed by rigid-body modes of the array and thus
could be used to verify the quantization of the associated energy levels. These
states, in turn, involve center-of-mass motion of large numbers of atoms,
N > 10^14, and therefore such observations would extend the domain in which
quantum mechanics has been experimentally tested. We have found the required
measurement capability for this investigation to be within reach of current
technology.
Comment: 1 tex file, 3 figures, 1 bbl file
Radiative forcing of climate: the historical evolution of the radiative forcing concept, the forcing agents and their quantification, and applications
We describe the historical evolution of the conceptualization, formulation, quantification, application and utilization of 'radiative forcing' (RF; see, e.g., IPCC, 1990) of Earth's climate.
Basic theories of shortwave and longwave radiation were developed through the 19th and 20th centuries, and established the analytical framework for defining and quantifying the perturbations to the Earth's radiative energy balance by natural and anthropogenic influences. The insight that the Earth's climate could be radiatively forced by changes in carbon dioxide, first introduced in the 19th century, gained empirical support with sustained observations of the atmospheric concentrations of the gas beginning in 1957. Advances in laboratory and field measurements, theory, instrumentation, computational technology, data and analysis of well-mixed greenhouse gases and the global climate system through the 20th century enabled the development and formalism of RF; this allowed RF to be related to changes in global-mean surface temperature with the aid of increasingly sophisticated models. This in turn led to RF becoming firmly established as a principal concept in climate science by 1990.
The linkage with surface temperature has proven to be the most important application of the RF concept, enabling a simple metric to evaluate the relative climate impacts of different agents. The late 1970s and 1980s saw accelerated developments in quantification, including the first assessment of the effect of the forcing due to a doubling of carbon dioxide on climate (the 'Charney' report; National Research Council, 1979). The concept was subsequently extended to a wide variety of agents beyond well-mixed greenhouse gases (WMGHGs: carbon dioxide, methane, nitrous oxide, and halocarbons) to short-lived species such as ozone. The WMO (1986) and IPCC (1990) international assessments began the important sequence of periodic evaluations and quantifications of the forcings by natural agents (solar irradiance changes and stratospheric aerosols resulting from volcanic eruptions) and a growing set of anthropogenic agents (WMGHGs, ozone, aerosols, land surface changes, contrails). From the 1990s to the present, knowledge of and scientific confidence in the radiative agents acting on the climate system have proliferated. The conceptual basis of RF has also evolved as both our understanding of the way radiative forcing drives climate change, and the diversity of the forcing mechanisms, have grown. This has led to the current situation where 'Effective Radiative Forcing' (ERF; e.g., IPCC, 2013) is regarded as the preferred practical definition of radiative forcing, in order to better capture the link between forcing and global-mean surface temperature change. The use of ERF, however, comes with its own attendant issues, including challenges in its diagnosis from climate models, its application to small forcings, and blurring of the distinction between rapid climate adjustments (fast responses) and climate feedbacks; this will necessitate further elaboration of its utility in the future.
Global climate model simulations of radiative perturbations by various agents have established how the forcings affect other climate variables besides temperature, e.g. precipitation. The forcing-response linkage as simulated by models, including the diversity in the spatial distribution of forcings by the different agents, has provided a practical demonstration of the effectiveness of agents in perturbing the radiative energy balance and causing climate changes.
The significant advances over the past half-century have established, with very high confidence, that the global-mean ERF due to human activity since preindustrial times is positive (the 2013 IPCC assessment gives a best estimate of 2.3 W m-2, with a range from 1.1 to 3.3 W m-2; 90% confidence interval). Further, except in the immediate aftermath of climatically significant volcanic eruptions, the net anthropogenic forcing dominates over natural radiative forcing mechanisms. Nevertheless, the substantial remaining uncertainty in the net anthropogenic ERF leads to large uncertainties in estimates of climate sensitivity from observations and in predicting future climate impacts. The uncertainty in the ERF arises principally from the incorporation of the rapid climate adjustments in the formulation, the well-recognized difficulties in characterizing the preindustrial state of the atmosphere, and the incomplete knowledge of the interactions of aerosols with clouds. This uncertainty impairs the quantitative evaluation of climate adaptation and mitigation pathways in the future. A grand challenge in Earth System science lies in continuing to sustain the relatively simple essence of the radiative forcing concept in a form similar to that originally devised, while at the same time improving the quantification of the forcing. This, in turn, demands an accurate, yet increasingly complex and comprehensive, accounting of the relevant processes in the climate system.
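The forcing values discussed above can be related to concentration changes through a widely used simplified expression for CO2 forcing (the Myhre et al., 1998 fit quoted in IPCC assessments). This is a textbook approximation offered for illustration, not a formula taken from this article; the preindustrial concentration used below is a common reference value.

```python
import math

def co2_radiative_forcing(c_ppm, c0_ppm=278.0):
    """Simplified CO2 radiative forcing in W m^-2 relative to concentration
    c0_ppm (Myhre et al. 1998 fit, widely quoted in IPCC assessments)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Forcing for a doubling of CO2 relative to the preindustrial level:
delta_f_2x = co2_radiative_forcing(2 * 278.0)
# 5.35 * ln(2) is approximately 3.7 W m^-2, the canonical 2xCO2 forcing
```

Comparing this single-agent value with the assessed total anthropogenic ERF of about 2.3 W m-2 shows why aerosol and other offsetting terms matter to the net budget.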