Locating and Quantifying Broadband Fan Sources Using In-Duct Microphones
In-duct beamforming techniques have been developed for locating broadband noise sources on a low-speed fan and quantifying the acoustic power in the inlet and aft fan ducts. The NASA Glenn Research Center's Advanced Noise Control Fan was used as a test bed. Several of the blades were modified to provide a broadband source to evaluate the efficacy of the in-duct beamforming technique. Phased arrays consisting of rings and line arrays of microphones were employed. For the imaging, the data were mathematically resampled in the frame of reference of the rotating fan. For both the imaging and power measurement steps, array steering vectors were computed using annular duct modal expansions, selected subsets of the cross spectral matrix elements were used, and the DAMAS and CLEAN-SC deconvolution algorithms were applied.
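The core beamforming step described above can be sketched compactly. The snippet below is an illustrative conventional frequency-domain beamformer over a cross-spectral matrix (CSM); free-field monopole steering vectors stand in for the annular-duct modal expansions used in the paper, and the diagonal-removal option is one common way of using "selected subsets" of CSM elements. All names and choices here are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def csm_beamform(csm, steering, remove_diagonal=True):
    """Beamform a cross-spectral matrix over a grid of steering vectors.

    csm      : (M, M) complex cross-spectral matrix from M microphones
    steering : (G, M) complex steering vectors, one row per grid point
    Returns a length-G map of estimated source autopowers.
    """
    if remove_diagonal:
        # drop microphone autospectra to suppress uncorrelated self-noise
        csm = csm - np.diag(np.diag(csm))
    # quadratic form b_g = w_g^H C w_g, normalized by ||w_g||^4
    norm = np.sum(np.abs(steering) ** 2, axis=1) ** 2
    return np.real(np.einsum('gm,mn,gn->g', steering.conj(), csm, steering)) / norm
```

With synthetic data from a single point source, the map peaks at the grid point whose steering vector matches the source, which is the starting point that deconvolution algorithms such as DAMAS or CLEAN-SC then refine.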
Exclusive Queueing Process with Discrete Time
In a recent study [C. Arita, Phys. Rev. E 80, 051119 (2009)], an extension of
the M/M/1 queueing process with the excluded-volume effect as in the totally
asymmetric simple exclusion process (TASEP) was introduced. In this paper, we
consider its discrete-time version. The update scheme we take is the parallel
one. A stationary-state solution is obtained in a slightly arranged matrix
product form of the discrete-time open TASEP with the parallel update. We find
the phase diagram for the existence of the stationary state. The critical line
which separates the parameter space into the regions with and without the
stationary state can be written in terms of the stationary current of the open
TASEP. We calculate the average length of the system and the average number of
particles.
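The parallel-update dynamics underlying the discrete-time model can be illustrated with a short Monte Carlo of the open TASEP: in each step, every bulk particle hops right with probability p if the target site was empty, a particle exits at the right boundary with probability beta, and one is injected at the left with probability alpha if the first site was empty, all decisions based on the old configuration. This is only a simulation sketch of the parallel update rule, not the matrix-product stationary-state solution derived in the paper.

```python
import numpy as np

def tasep_parallel_step(tau, alpha, beta, p, rng):
    """One fully parallel update of an open TASEP configuration tau (0/1 array)."""
    L = len(tau)
    new = tau.copy()
    hop = rng.random(L) < p
    # bulk hops: particle at i moves to i+1 if i+1 was empty in the old config
    move = tau[:-1].astype(bool) & (tau[1:] == 0) & hop[:-1]
    new[:-1][move] = 0
    new[1:][move] = 1
    # right boundary: particle on the last site exits with probability beta
    if tau[-1] == 1 and rng.random() < beta:
        new[-1] = 0
    # left boundary: inject with probability alpha if site 0 was empty
    if tau[0] == 0 and rng.random() < alpha:
        new[0] = 1
    return new
```

Because every decision references the pre-update configuration, no two particles ever contend for the same site, which is the defining feature of the parallel scheme as opposed to random-sequential updating.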
Fourier Analysis of Gapped Time Series: Improved Estimates of Solar and Stellar Oscillation Parameters
Quantitative helio- and asteroseismology require very precise measurements of
the frequencies, amplitudes, and lifetimes of the global modes of stellar
oscillation. It is common knowledge that the precision of these measurements
depends on the total length (T), quality, and completeness of the observations.
Except in a few simple cases, the effect of gaps in the data on measurement
precision is poorly understood, in particular in Fourier space where the
convolution of the observable with the observation window introduces
correlations between different frequencies. Here we describe and implement a
rather general method to retrieve maximum likelihood estimates of the
oscillation parameters, taking into account the proper statistics of the
observations. Our fitting method applies in complex Fourier space and exploits
the phase information. We consider both solar-like stochastic oscillations and
long-lived harmonic oscillations, plus random noise. Using numerical
simulations, we demonstrate the existence of cases for which our improved
fitting method is less biased and has a greater precision than when the
frequency correlations are ignored. This is especially true of low
signal-to-noise solar-like oscillations. For example, we discuss a case where
the precision on the mode frequency estimate is increased by a factor of five,
for a duty cycle of 15%. In the case of long-lived sinusoidal oscillations, a
proper treatment of the frequency correlations does not provide any significant
improvement; nevertheless we confirm that the mode frequency can be measured
from gapped data at a much better precision than the 1/T Rayleigh resolution.
Comment: Accepted for publication in Solar Physics Topical Issue "Helioseismology, Asteroseismology, and MHD Connections".
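The frequency correlations that the abstract attributes to the observation window can be made concrete: for unit-variance white noise observed through a 0/1 window, the covariance of two discrete Fourier coefficients equals the DFT of the window evaluated at their frequency difference. The sketch below (an illustration of that statement, not the paper's fitting code) builds the resulting correlation matrix.

```python
import numpy as np

def window_correlation(window):
    """Correlation matrix of Fourier coefficients induced by a 0/1 window.

    For y_t = w_t * x_t with x_t unit-variance white noise,
    cov(F_j, conj(F_k)) = W[(j - k) mod n], where W is the DFT of w.
    """
    n = len(window)
    W = np.fft.fft(window)
    j, k = np.meshgrid(np.arange(n), np.arange(n), indexing='ij')
    cov = W[(j - k) % n]
    d = np.sqrt(np.real(np.diag(cov)))  # diagonal is W[0] = number of observed points
    return cov / np.outer(d, d)
```

With a complete window the matrix is the identity (independent frequencies); introducing a gap produces sizeable off-diagonal correlations, which is exactly what a maximum likelihood fit in complex Fourier space must account for.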
Correlation effects in ionic crystals: I. The cohesive energy of MgO
High-level quantum-chemical calculations, using the coupled-cluster approach
and extended one-particle basis sets, have been performed for (Mg2+)n (O2-)m
clusters embedded in a Madelung potential. The results of these calculations
are used for setting up an incremental expansion for the correlation energy of
bulk MgO. This way, 96% of the experimental cohesive energy of the MgO crystal
is recovered. It is shown that only 60% of the correlation contribution to the
cohesive energy is of intra-ionic origin, the remaining part being caused by
van der Waals-like inter-ionic excitations.
Comment: LaTeX, 20 pages, no figures.
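The incremental expansion mentioned above can be shown schematically: the bulk correlation energy is approximated by one-body increments for each ion plus genuine two-body corrections for ion pairs (and, in practice, higher orders). The numbers in the sketch are invented for illustration, not the actual MgO increments.

```python
def incremental_energy(one_body, pair_energies):
    """Correlation energy from one- and two-body increments.

    one_body      : {i: eps_i}, correlation energy of embedded ion i alone
    pair_energies : {(i, j): eps_ij}, correlation energy of the embedded pair i, j
    The genuine two-body increment is delta_ij = eps_ij - eps_i - eps_j.
    """
    total = sum(one_body.values())
    for (i, j), e_ij in pair_energies.items():
        total += e_ij - one_body[i] - one_body[j]  # add only the pair's genuine part
    return total
```

The decomposition makes the abstract's 60%/40% split directly readable: the one-body sum is the intra-ionic part, while the pair increments carry the van der Waals-like inter-ionic contribution.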
Preliminary assessment of the environmental baseline in the Fylde, Lancashire
This report presents the collated preliminary results from the British Geological Survey (BGS) led project Science-based environmental baseline monitoring associated with shale gas development in the Fylde, Lancashire. The project has been funded by a combination of BGS National Capability funding, in-kind contributions from project partners and a grant awarded by the Department for Business, Energy and Industrial Strategy (BEIS). It complements an ongoing project, in which similar activities are being carried out, in the Vale of Pickering, North Yorkshire. Further information on the projects can be found on the BGS website: www.bgs.ac.uk.
The project has initiated a wide-ranging environmental baseline monitoring programme that includes water quality (groundwater and surface water), seismicity, ground motion, atmospheric composition (greenhouse gases and air quality), soil gas and radon in air (indoors and outdoors). The motivation behind the project(s) was to establish independent monitoring in the area around the proposed shale gas hydraulic fracturing sites in the Fylde, Lancashire (Cuadrilla Resources Ltd) before any shale gas operations take place.
As part of the project, instrumentation has been deployed to measure, in real-time or near real-time, a range of environmental variables (water quality, seismicity, atmospheric composition). These data are being displayed on the project’s web site (www.bgs.ac.uk/lancashire). Additional survey, sampling and monitoring has also been carried out through a co-ordinated programme of fieldwork and laboratory analysis, which has included installation of new monitoring infrastructure, to allow compilation of one of the most comprehensive environmental datasets in the UK.
The monitoring programme is continuing. However, some very important findings are already emerging from the limited datasets and should be taken into account when developing future monitoring strategy, policy and regulation. The information is not only relevant to Lancashire but will be applicable more widely in the UK and internationally. Although shale gas operations in other parts of the world are well established, there is a paucity of good baseline data and effective guidance on monitoring. The project will also allow the experience gained and the scientifically robust findings to be used to develop and establish effective environmental monitoring strategies for shale gas and similar industrial activities.
Intensification of coffee systems can increase the effectiveness of REDD mechanisms
In agricultural production systems with shade trees, such as coffee, the increase in greenhouse gas (GHG) emissions from production intensification can be compensated for, or even outweighed, by the increase in carbon sequestration into above-ground and below-ground tree biomass. We use data from a long-term coffee agroforestry experiment in Costa Rica to evaluate the trade-offs between intensification, profitability and net greenhouse gas emissions through two scenarios. First, by assessing the GHG emissions associated with conversion from shaded to more profitable full-sun (un-shaded) systems, we calculate the break-even carbon price which would need to be paid to offset the opportunity cost of not converting. The price per tCO2e of emissions reduction required to compensate for the coffee production revenue foregone varies widely from 9.3 to 196.3 US$ amongst different shaded systems. Second, as an alternative to intensification, production area can be extended onto currently forested land. We estimate the land-use change required to compensate for the shortfall in profitability from retaining lower intensity coffee production systems. For four of the five shade types tested, this land-use change causes additional GHG emissions >5 tCO2e ha−1 yr−1, resulting in net emissions >8 tCO2e ha−1 yr−1 for the whole system. We conclude that, by intensifying production instead, mechanisms similar to REDD that are based on reducing emissions through avoided land-use change (REAL) could play a major role in increasing the climate change mitigation success of agroforestry systems, while also aiding REDD by reducing pressure for further forest conversion to agriculture.
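The break-even calculation in the first scenario reduces to simple arithmetic: the carbon payment per tonne must cover the profit foregone by not converting, divided by the emissions thereby avoided. The figures below are invented for illustration; the paper reports resulting prices from 9.3 to 196.3 US$ per tCO2e across shade types.

```python
def break_even_price(profit_full_sun, profit_shaded, emissions_avoided):
    """US$ per tCO2e needed to offset the opportunity cost of not converting.

    profit_full_sun, profit_shaded : expected profit in US$/ha/yr
    emissions_avoided              : tCO2e/ha/yr saved by staying shaded
    """
    return (profit_full_sun - profit_shaded) / emissions_avoided

# hypothetical example: full sun earns 3000 US$/ha/yr, shaded earns 2500,
# and retaining shade avoids 5 tCO2e/ha/yr
price = break_even_price(3000.0, 2500.0, 5.0)  # 100 US$/tCO2e
```

A large profitability gap with a small emissions saving pushes the price toward the top of the reported range, which is why the break-even price varies so widely amongst shaded systems.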
Towards Machine Wald
The past century has seen a steady increase in the need to estimate and
predict complex systems and to make (possibly critical) decisions with
limited information. Although computers have made possible the numerical
evaluation of sophisticated statistical models, these models are still designed
\emph{by humans} because there is currently no known recipe or algorithm for
dividing the design of a statistical model into a sequence of arithmetic
operations. Indeed, enabling computers to \emph{think} as \emph{humans} do
when faced with uncertainty is challenging in several major ways:
(1) finding optimal statistical models has yet to be formulated as a well-posed
problem when information on the system of interest is incomplete and comes in
the form of a complex combination of sample data, partial knowledge of
constitutive relations and a limited description of the distribution of input
random variables. (2) The space of admissible scenarios along with the space of
relevant information, assumptions, and/or beliefs, tend to be infinite
dimensional, whereas calculus on a computer is necessarily discrete and finite.
To this end, this paper explores the foundations of a rigorous framework
for the scientific computation of optimal statistical estimators/models and
reviews their connections with Decision Theory, Machine Learning, Bayesian
Inference, Stochastic Optimization, Robust Optimization, Optimal Uncertainty
Quantification, and Information-Based Complexity.
Comment: 37 pages.
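The Wald-style decision theory invoked here can be illustrated with a toy worst-case comparison: score each estimator by its maximum risk over the parameter space, then prefer the one with the smaller maximum. The sketch below compares the maximum likelihood estimator of a Bernoulli mean with the classical minimax shrinkage rule under squared loss; it illustrates only the decision-theoretic framing, not the paper's general framework for computing optimal estimators.

```python
import numpy as np
from math import comb

def worst_case_risk(estimator, n, thetas):
    """max over theta of E[(estimator(X, n) - theta)^2], X ~ Binomial(n, theta)."""
    xs = np.arange(n + 1)
    worst = 0.0
    for th in thetas:
        # exact binomial pmf at this theta
        pmf = np.array([comb(n, x) * th**x * (1.0 - th)**(n - x) for x in xs])
        worst = max(worst, float(np.sum(pmf * (estimator(xs, n) - th) ** 2)))
    return worst

def mle(x, n):
    return x / n

def minimax_rule(x, n):
    # classical minimax estimator of a Bernoulli mean under squared loss
    return (x + np.sqrt(n) / 2.0) / (n + np.sqrt(n))
```

The shrinkage rule has constant risk in theta, so its worst case beats the MLE's peak risk at theta = 1/2; selecting estimators by such worst-case criteria is the sense in which a "Machine Wald" would compute optimal statistical models.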
Measurement of the Charged Multiplicities in b, c and Light Quark Events from Z0 Decays
Average charged multiplicities have been measured separately in b, c, and
light (u, d, s) quark events from Z0 decays measured in the SLD experiment.
Impact parameters of charged tracks were used to select enriched samples of b
and light quark events, and reconstructed charmed mesons were used to select c
quark events. From the measured charged multiplicities we derived the
differences between the total average charged multiplicities of b or c quark
events and light quark events. We compared these measurements with those at
lower center-of-mass energies and with perturbative QCD predictions. The
combined results are in agreement with the QCD expectations and disfavor the
hypothesis of flavor-independent fragmentation.
Comment: 19 pages, LaTeX, 4 EPS figures, to appear in Physics Letters
Genuine Correlations of Like-Sign Particles in Hadronic Z0 Decays
Correlations among hadrons with the same electric charge produced in Z0
decays are studied using the high statistics data collected from 1991 through
1995 with the OPAL detector at LEP. Normalized factorial cumulants up to fourth
order are used to measure genuine particle correlations as a function of the
size of phase space domains in rapidity, azimuthal angle and transverse
momentum. Both all-charge and like-sign particle combinations show strong
positive genuine correlations. One-dimensional cumulants initially increase
rapidly with decreasing size of the phase space cells but saturate quickly. In
contrast, cumulants in two- and three-dimensional domains continue to increase.
The strong rise of the cumulants for all-charge multiplets is increasingly
driven by that of like-sign multiplets. This points to the likely influence of
Bose-Einstein correlations. Some of the recently proposed algorithms to
simulate Bose-Einstein effects, implemented in the Monte Carlo model PYTHIA,
are found to reproduce reasonably well the measured second- and higher-order
correlations between particles with the same charge as well as those in
all-charge particle multiplets.
Comment: 26 pages, 6 figures, submitted to Phys. Lett.
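The normalized factorial cumulants used in this analysis have a simple second-order form: K2 = F2 - 1 with F2 = <n(n-1)> / <n>^2, which vanishes for a Poisson multiplicity distribution, so any nonzero value signals genuine correlations in the phase-space cell. The sketch below (second order only, for brevity; the paper goes up to fourth order) computes it from a list of per-event multiplicities.

```python
import numpy as np

def normalized_k2(counts):
    """Second-order normalized factorial cumulant of event multiplicities.

    counts : per-event particle counts in one phase-space cell.
    Returns K2 = <n(n-1)>/<n>^2 - 1; zero for Poisson, positive for bunching.
    """
    counts = np.asarray(counts, dtype=float)
    mean = counts.mean()
    f2 = np.mean(counts * (counts - 1.0)) / mean**2  # normalized factorial moment
    return f2 - 1.0
```

Tracking such cumulants as the cell size in rapidity, azimuthal angle and transverse momentum shrinks is exactly the procedure the abstract describes for separating genuine correlations, such as Bose-Einstein effects among like-sign particles, from trivial combinatorics.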
Enhanced glycemic control with combination therapy for type 2 diabetes in primary care
Type 2 diabetes mellitus is an increasingly common medical problem for primary care clinicians to address. Treatment of diabetes has evolved from simple replacement of insulin (directly or through insulin secretagogues) to agents that exploit mechanisms such as insulin sensitizers, alpha-glucosidase inhibitors, and incretins. Only very recently has the critical role of the gastrointestinal system as a major culprit in glucose dysregulation been recognized. Since glycated hemoglobin A1c reductions provide meaningful risk reduction as well as improved quality of life, it is worthwhile to explore evolving paths for more efficient use of the currently available pharmacotherapies. Because diabetes is a progressive disease, even transiently successful treatment will likely require augmentation as the disorder progresses. Pharmacotherapies with complementary mechanisms of action will be necessary to achieve glycemic goals. Hence, clinicians need to be well informed about the various noninsulin alternatives that have been shown to be successful in glycemic goal attainment. This article reviews the benefits of glucose control, the current status of diabetes control, pertinent pathophysiology, available pharmacological classes for combination, limitations of current therapies, and suggestions for appropriate combination therapies, including specific suggestions for thresholds at which different strategies might be most effectively utilized by primary care clinicians.