Extracting Reynolds stresses from acoustic Doppler current profiler measurements in wave-dominated environments
Surface waves introduce velocity correlations that bias and often dominate Reynolds stress estimates made using the traditional variance method for acoustic Doppler current profilers (ADCPs). This analysis shows that the wave bias is the sum of a real wave stress and an error due to instrument tilt, both of which have a large uncertainty. Three alternative extensions to the variance method for calculating Reynolds stress profiles from ADCP measurements in wavy conditions are analyzed. The previously proposed variance fitting method (Variance Fit) is evaluated, and two more general methods that use along- and between-beam velocity differencing with adaptive filtering (Vertical AF and Horizontal AF) are derived. The three methods are tested on datasets containing long-period monochromatic swell (Moorea, French Polynesia) and shorter-period mixed swell (Santa Barbara, California). The Variance Fit method leaves a residual wave bias in beam velocity variances, especially for intermediate waves, but gives physically reasonable Reynolds stress estimates because most of the residual wave bias cancels when the variance method is applied. The new Vertical AF method does not produce inherent wave bias in beam velocity variances, but yields Reynolds stress estimates comparable to those of the Variance Fit method. The Horizontal AF method performs poorly for all but monochromatic waves. Error remaining after one of the above methods is applied can be attributed to residual wave error, correlation of turbulence between points chosen for differencing, or correlation between waves and turbulence. A simple procedure is provided for determining the minimum bin separation that can be used.
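For context, the traditional variance method referenced in this abstract infers a Reynolds stress component from the difference in along-beam velocity variances of an opposing beam pair. A minimal sketch of that baseline is given below; the 20 degree beam angle, the variable names, and the sign convention (which depends on beam geometry) are assumptions for illustration only.

```python
import numpy as np

def variance_method_stress(b1, b2, beam_angle_deg=20.0):
    """Classical variance-method estimate of -<u'w'> at one depth bin.

    b1, b2 : along-beam velocity time series (m/s) of an opposing beam pair
    beam_angle_deg : beam inclination from the vertical (instrument-specific;
                     20 degrees is assumed here for illustration)

    Surface-wave orbital velocities inflate both variances, which is the
    wave bias the Variance Fit and adaptive-filtering methods try to remove.
    """
    theta = np.deg2rad(beam_angle_deg)
    # Sign convention depends on beam numbering and orientation.
    return (np.var(b2) - np.var(b1)) / (4.0 * np.sin(theta) * np.cos(theta))
```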
Stratification Trees for Adaptive Randomization in Randomized Controlled Trials
This paper proposes an adaptive randomization procedure for two-stage
randomized controlled trials. The method uses data from a first-wave experiment
in order to determine how to stratify in a second wave of the experiment, where
the objective is to minimize the variance of an estimator for the average
treatment effect (ATE). We consider selection from a class of stratified
randomization procedures which we call stratification trees: these are
procedures whose strata can be represented as decision trees, with differing
treatment assignment probabilities across strata. By using the first wave to
estimate a stratification tree, we simultaneously select which covariates to
use for stratification, how to stratify over these covariates, as well as the
assignment probabilities within these strata. Our main result shows that using
this randomization procedure with an appropriate estimator results in an
asymptotic variance which is minimal in the class of stratification trees.
Moreover, the results we present are able to accommodate a large class of
assignment mechanisms within strata, including stratified block randomization.
In a simulation study, we find that our method, paired with an appropriate
cross-validation procedure, can improve on ad-hoc choices of stratification. We
conclude by applying our method to the study in Karlan and Wood (2017), where
we estimate stratification trees using the first wave of their experiment.
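As a rough illustration of the two-stage idea (not the paper's actual estimator or its asymptotic machinery), a first-wave sample could be partitioned with a shallow regression tree and within-stratum assignment probabilities set by the usual Neyman allocation, with the tree and probabilities then reused to randomize the second wave. All function and variable names below are made up for the sketch.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def fit_stratification_tree(X1, D1, Y1, max_depth=2):
    """First wave (illustrative only): partition the covariate space with a
    shallow regression tree, then set within-stratum treatment probabilities
    by Neyman allocation pi = s1 / (s1 + s0).  X1, D1, Y1 are numpy arrays
    of covariates, treatment indicators, and outcomes."""
    tree = DecisionTreeRegressor(max_depth=max_depth).fit(X1, Y1)
    leaves = tree.apply(X1)                       # stratum label per unit
    probs = {}
    for leaf in np.unique(leaves):
        mask = leaves == leaf
        treated, control = Y1[mask & (D1 == 1)], Y1[mask & (D1 == 0)]
        if len(treated) < 2 or len(control) < 2:
            probs[leaf] = 0.5                     # fall back for tiny cells
        else:
            s1, s0 = treated.std(), control.std()
            probs[leaf] = s1 / (s1 + s0) if s1 + s0 > 0 else 0.5
    return tree, probs

def assign_second_wave(tree, probs, X2, seed=0):
    """Second wave: stratify new units with the estimated tree and draw
    treatment with the stratum-specific probabilities."""
    rng = np.random.default_rng(seed)
    p = np.array([probs[leaf] for leaf in tree.apply(X2)])
    return (rng.random(len(p)) < p).astype(int)
```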
Ab initio Bogoliubov coupled cluster theory for open-shell nuclei
Ab initio many-body methods address closed-shell nuclei up to mass A ~ 130 on
the basis of realistic two- and three-nucleon interactions. Several routes to
address open-shell nuclei are currently under investigation, including ideas
which exploit spontaneous symmetry breaking. Singly open-shell nuclei can be
efficiently described via the sole breaking of the U(1) gauge symmetry associated
with particle-number conservation, to account for their superfluid character.
The present work formulates and applies Bogoliubov coupled cluster (BCC)
theory, which consists of representing the exact ground-state wavefunction of
the system as the exponential of a quasiparticle excitation cluster operator
acting on a Bogoliubov reference state. Equations for the ground-state energy
and cluster amplitudes are derived at the singles and doubles level (BCCSD)
both algebraically and diagrammatically. The formalism includes three-nucleon
forces at the normal-ordered two-body level. The first BCC code is implemented
in m-scheme, which will eventually permit the treatment of doubly open-shell
nuclei. Proof-of-principle calculations in a spherical harmonic oscillator basis
are performed for oxygen, neon, and magnesium isotopes in the BCCD approximation
with a chiral two-nucleon interaction,
comparing to results obtained in standard coupled cluster theory when
applicable. The breaking of U(1) symmetry is monitored by computing the
variance associated with the particle-number operator. The newly developed
many-body formalism increases the potential span of ab initio calculations
based on single-reference coupled cluster techniques tremendously, potentially
extending their reach to several hundred additional mid-mass nuclei. The new
formalism offers a wealth of potential applications and further extensions
dedicated to the description of ground and excited states of open-shell nuclei.
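As a concrete, heavily simplified illustration of the symmetry-breaking diagnostic mentioned at the end of the abstract: for a BCS-type product state written in its canonical basis, the particle-number mean and variance have closed forms, so the monitor reduces to a few lines. This is only a sketch under that simplifying assumption, not the paper's full Bogoliubov machinery.

```python
import numpy as np

def particle_number_statistics(v2):
    """Particle-number mean and variance for a BCS-type product state
    |BCS> = prod_{k>0} (u_k + v_k a+_k a+_{-k}) |0> in its canonical basis.

    v2 : array of pair occupation probabilities v_k^2 (one entry per
         canonical pair k > 0); u_k^2 = 1 - v_k^2 by normalization.

    A nonzero variance signals the breaking of U(1) particle-number
    symmetry that the abstract proposes to monitor.
    """
    v2 = np.asarray(v2, dtype=float)
    u2 = 1.0 - v2
    mean_N = 2.0 * v2.sum()           # <N>           = 2 * sum_k v_k^2
    var_N = 4.0 * (u2 * v2).sum()     # <N^2> - <N>^2 = 4 * sum_k u_k^2 v_k^2
    return mean_N, var_N
```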
Asymptotic Properties for Methods Combining Minimum Hellinger Distance Estimates and Bayesian Nonparametric Density Estimates
In frequentist inference, minimizing the Hellinger distance between a kernel
density estimate and a parametric family produces estimators that are both
robust to outliers and statistically efficient when the parametric model is
correct. This paper seeks to extend these results to the use of nonparametric
Bayesian density estimators within disparity methods. We propose two
estimators: one replaces the kernel density estimator with the expected
posterior density from a random histogram prior; the other induces a posterior
over parameters through the posterior for the random histogram. We show that it
is possible to adapt the mathematical machinery of efficient influence
functions from semiparametric models to demonstrate that both our estimators
are efficient in the sense of achieving the Cramer-Rao lower bound. We further
demonstrate a Bernstein-von Mises result for our second estimator, indicating
that its posterior is asymptotically Gaussian. In addition, the robustness
properties of classical minimum Hellinger distance estimators continue to hold.
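For orientation, the frequentist construction that this paper generalizes pairs a nonparametric density estimate with a parametric family and minimizes the Hellinger distance between them, which is equivalent to maximizing the affinity, i.e. the integral of sqrt(f_theta * g_hat). The sketch below uses a Gaussian KDE as a stand-in for the Bayesian density estimate and a normal location-scale family; all names, the grid, and the optimizer choice are illustrative assumptions.

```python
import numpy as np
from scipy import optimize, stats

def minimum_hellinger_normal(data, grid_size=512):
    """Minimum Hellinger distance fit of a normal family to `data`, using a
    Gaussian KDE in place of the expected posterior density of a random
    histogram (the Bayesian ingredient proposed in the paper)."""
    data = np.asarray(data, dtype=float)
    g_hat = stats.gaussian_kde(data)
    x = np.linspace(data.min() - 3 * data.std(),
                    data.max() + 3 * data.std(), grid_size)
    g = g_hat(x)
    dx = x[1] - x[0]

    def neg_affinity(theta):
        mu, log_sigma = theta
        f = stats.norm.pdf(x, loc=mu, scale=np.exp(log_sigma))
        # Minimizing Hellinger distance == maximizing int sqrt(f * g) dx.
        return -np.sum(np.sqrt(f * g)) * dx

    res = optimize.minimize(neg_affinity,
                            x0=[np.median(data), np.log(data.std())])
    return res.x[0], np.exp(res.x[1])   # (mu_hat, sigma_hat)
```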
Effects of meteorological factors on epidemic malaria in Ethiopia: a statistical modelling approach based on theoretical reasoning.
This study was conducted to quantify the association between meteorological variables and incidence of Plasmodium falciparum in areas with unstable malaria transmission in Ethiopia. We used morbidity data pertaining to microscopically confirmed cases reported from 35 sites throughout Ethiopia over a period of approximately 6-7 years. A model was developed reflecting biological relationships between meteorological and morbidity variables. A model that included rainfall 2 and 3 months earlier, mean minimum temperature of the previous month, and P. falciparum case incidence during the previous month was fitted to morbidity data from the various areas. The model produced similar percentages of over-estimation (19.7% of predictions exceeded twice the observed values) and under-estimation (18.6% were less than half the observed values). Inclusion of maximum temperature did not improve the model. The model performed better in areas with relatively high or low incidence (>85% of the total variance explained) than in those with moderate incidence (55-85% of the total variance explained). The study indicated that a dynamic immunity mechanism is needed in a prediction model. The potential usefulness and drawbacks of the modelling approach in studying the weather-malaria relationship are discussed, including a need for mechanisms that can adequately handle temporal variations in immunity to malaria.
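A generic way to set up the lag structure described above (rainfall lagged 2 and 3 months, minimum temperature and case incidence lagged 1 month) is sketched below; the column names and the Poisson-regression choice are assumptions for illustration, not the paper's biologically derived model.

```python
import pandas as pd
import statsmodels.api as sm

def fit_lagged_malaria_model(df):
    """Illustrative lagged-regressor setup in the spirit of the abstract.

    df : monthly DataFrame with assumed columns 'cases', 'rainfall', 'tmin'.
    Returns a fitted Poisson GLM of current cases on lagged predictors.
    """
    X = pd.DataFrame({
        "rain_lag2": df["rainfall"].shift(2),   # rainfall 2 months earlier
        "rain_lag3": df["rainfall"].shift(3),   # rainfall 3 months earlier
        "tmin_lag1": df["tmin"].shift(1),       # min temperature, previous month
        "cases_lag1": df["cases"].shift(1),     # incidence, previous month
    })
    X = sm.add_constant(X).dropna()
    y = df.loc[X.index, "cases"]
    return sm.GLM(y, X, family=sm.families.Poisson()).fit()
```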
A Game-Theoretic Study on Non-Monetary Incentives in Data Analytics Projects with Privacy Implications
The amount of personal information contributed by individuals to digital
repositories such as social network sites has grown substantially. The
existence of this data offers unprecedented opportunities for data analytics
research in various domains of societal importance including medicine and
public policy. The results of these analyses can be considered a public good
which benefits data contributors as well as individuals who are not making
their data available. At the same time, the release of personal information
carries perceived and actual privacy risks to the contributors. Our research
addresses this problem area. In our work, we study a game-theoretic model in
which individuals take control over participation in data analytics projects in
two ways: 1) individuals can contribute data at a self-chosen level of
precision, and 2) individuals can decide whether they want to contribute at
all. From the analyst's perspective, we investigate to what degree the
research analyst has flexibility to set requirements for data precision, so
that individuals are still willing to contribute to the project, and the
quality of the estimation improves. We study this tradeoff scenario for
populations of homogeneous and heterogeneous individuals, and determine Nash
equilibria that reflect the optimal level of participation and precision of
contributions. We further prove that the analyst can substantially increase the
accuracy of the analysis by imposing a lower bound on the precision of the data
that users can reveal.
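To make the estimation side of this setting concrete: if each contribution is the true attribute plus self-chosen noise, a precision-weighted average has a closed-form variance, and a floor on precision directly caps that variance. A minimal sketch under those assumptions follows; it is not the paper's equilibrium analysis.

```python
import numpy as np

def pooled_mean_estimate(reports, noise_vars):
    """Inverse-variance-weighted estimate of a population quantity theta from
    perturbed contributions x_i = theta + eps_i, Var(eps_i) = noise_vars[i],
    where each contributor chooses its own perturbation level.
    Sampling variability of the underlying attribute is ignored here.
    """
    x = np.asarray(reports, dtype=float)
    w = 1.0 / np.asarray(noise_vars, dtype=float)
    estimate = np.sum(w * x) / np.sum(w)
    estimator_var = 1.0 / np.sum(w)     # = (sum_i 1/sigma_i^2)^(-1)
    return estimate, estimator_var

# If the analyst imposes a precision floor sigma_i^2 <= s2_max on all n
# contributions, then estimator_var <= s2_max / n, which is one simple way
# to read the abstract's claim that bounding precision from below improves
# the accuracy of the analysis.
```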