Densities, spectral densities and modality
This paper considers the problem of specifying a simple approximating density
function for a given data set (x_1,...,x_n). Simplicity is measured by the
number of modes but several different definitions of approximation are
introduced. The taut string method is used to control the numbers of modes and
to produce candidate approximating densities. Refinements are introduced that
improve the local adaptivity of the procedures and the method is extended to
spectral densities.
Comment: Published by the Institute of Mathematical Statistics (http://www.imstat.org) in the Annals of Statistics (http://www.imstat.org/aos/) at http://dx.doi.org/10.1214/00905360400000036
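A rough, hypothetical illustration of measuring simplicity by the number of modes (this is not the taut string method of the paper): count the modes of a Gaussian kernel density estimate of a toy data set as the smoothing bandwidth varies; heavier smoothing yields candidate densities with fewer modes. The data, bandwidth grid, and mode counter below are all illustrative assumptions.

# Hypothetical illustration: count modes of a kernel density estimate as the
# bandwidth varies. This is NOT the taut string method of the paper; it only
# illustrates indexing candidate approximating densities by their number of modes.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 0.5, 200)])  # toy data

grid = np.linspace(x.min() - 1, x.max() + 1, 1000)

def count_modes(density_values):
    """Number of strict local maxima of a sampled density."""
    d = np.diff(density_values)
    return int(np.sum((d[:-1] > 0) & (d[1:] < 0)))

for bw in [0.05, 0.1, 0.3, 0.6, 1.0]:        # candidate smoothing levels
    kde = gaussian_kde(x, bw_method=bw)
    print(f"bandwidth {bw:4.2f}: {count_modes(kde(grid))} mode(s)")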
Maximizing information on the environment by dynamically controlled qubit probes
We explore the ability of a qubit probe to characterize unknown parameters of
its environment. By resorting to quantum estimation theory, we analytically
find the ultimate bound on the precision of estimating key parameters of a
broad class of ubiquitous environmental noises ("baths") which the qubit may
probe. These include the probe-bath coupling strength, the correlation time of
generic bath spectra, the power laws governing these spectra, as well as their
dephasing times T2. Our central result is that by optimizing the dynamical
control on the probe under realistic constraints one may attain the maximal
accuracy bound on the estimation of these parameters by the least number of
measurements possible. Applications of this protocol that combines dynamical
control and estimation theory tools to quantum sensing are illustrated for a
nitrogen-vacancy center in diamond used as a probe.
Comment: 8 pages + 6 pages (appendix), 3 figures
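For reference, the ultimate precision bound invoked here is, in standard quantum estimation theory, the quantum Cramér-Rao bound; the schematic form below is the textbook statement, not a formula quoted from the paper.

% Quantum Cramer-Rao bound (standard form): the error in estimating a bath
% parameter x_B from N independent measurements is limited by the quantum
% Fisher information F_Q evaluated on the probe state rho(x_B).
\[
  \Delta x_B \;\ge\; \frac{1}{\sqrt{N\, F_Q(x_B)}},
  \qquad
  F_Q(x_B) \;=\; \operatorname{Tr}\!\left[\rho(x_B)\, L_{x_B}^2\right],
\]
where $L_{x_B}$ is the symmetric logarithmic derivative defined by
$\partial_{x_B}\rho = \tfrac{1}{2}\left(L_{x_B}\rho + \rho L_{x_B}\right)$.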
Efficient adaptive integration of functions with sharp gradients and cusps in n-dimensional parallelepipeds
In this paper, we study the efficient numerical integration of functions with
sharp gradients and cusps. An adaptive integration algorithm is presented that
systematically improves the accuracy of the integration of a set of functions.
The algorithm is based on a divide and conquer strategy and is independent of
the location of the sharp gradient or cusp. The error analysis establishes the
rate of convergence that is obtained for a function with a derivative
discontinuity at a point. Two applications of the adaptive integration
scheme are studied. First, we use the adaptive quadratures for the integration
of the regularized Heaviside function---a strongly localized function that is
used for modeling sharp gradients. Then, the adaptive quadratures are employed
in the enriched finite element solution of the all-electron Coulomb problem in
crystalline diamond. The source term and enrichment functions of this problem
have sharp gradients and cusps at the nuclei. We show that the optimal rate of
convergence is obtained with only a marginal increase in the number of
integration points with respect to the pure finite element solution with the
same number of elements. The adaptive integration scheme is simple, robust, and
directly applicable to any generalized finite element method employing
enrichments with sharp local variations or cusps in n-dimensional parallelepiped elements.
Comment: 22 pages
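A one-dimensional sketch of the divide-and-conquer strategy (the paper's algorithm operates on n-dimensional parallelepipeds and is not reproduced here): split an interval whenever a coarse and a refined quadrature estimate disagree by more than a tolerance, so that refinement concentrates automatically around a sharp gradient or cusp without knowing its location. The integrand and tolerance below are illustrative.

# Minimal 1D adaptive quadrature sketch (recursive interval bisection).
# Not the algorithm of the paper; it only illustrates the divide-and-conquer
# strategy of refining wherever a coarse and a fine estimate disagree.
def simpson(f, a, b):
    c = 0.5 * (a + b)
    return (b - a) / 6.0 * (f(a) + 4.0 * f(c) + f(b))

def adaptive_quad(f, a, b, tol=1e-8, depth=0, max_depth=40):
    c = 0.5 * (a + b)
    coarse = simpson(f, a, b)
    fine = simpson(f, a, c) + simpson(f, c, b)
    if abs(fine - coarse) < 15.0 * tol or depth >= max_depth:
        return fine
    # Estimates disagree: split the interval and integrate each half recursively.
    return (adaptive_quad(f, a, c, tol / 2.0, depth + 1, max_depth) +
            adaptive_quad(f, c, b, tol / 2.0, depth + 1, max_depth))

# Example with a cusp (derivative discontinuity) at x = 0.3.
cusp = lambda x: abs(x - 0.3) ** 0.5
print(adaptive_quad(cusp, 0.0, 1.0))   # close to the exact value, about 0.49999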
Nonparametric Beta Kernel Estimator for Long Memory Time Series
The paper introduces a new nonparametric estimator of the spectral density, obtained by smoothing the periodogram with the probability density of a Beta random variable (the Beta kernel). The estimator is proved to be bounded for short-memory data and to diverge at the origin for long-memory data. The convergence in probability of the relative error and Monte Carlo simulations suggest that the estimator automatically adapts to the long- or short-range dependence of the process. A cross-validation procedure is also studied in order to select the nuisance parameter of the estimator. Illustrations on historical as well as recent returns and absolute returns of the S&P 500 index show the reasonable performance of the estimator, and show that the data-driven estimator is a valuable tool for the detection of long memory as well as hidden periodicities in stock returns.
Keywords: spectral density, long range dependence, nonparametric estimation
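A hedged sketch of the construction (the exact Beta-kernel parametrization and normalization used in the paper are not reproduced): compute the periodogram and smooth it at each frequency with weights given by a Beta density whose mode sits at that frequency; the bandwidth b plays the role of the nuisance parameter that cross-validation would select.

# Sketch of a Beta-kernel smoothed periodogram. The precise Beta-kernel form of
# the paper is not reproduced; this only shows the idea of smoothing the
# periodogram with Beta-density weights on rescaled frequencies.
import numpy as np
from scipy.stats import beta

def periodogram(x):
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    freqs = np.arange(1, n // 2 + 1) / n                # Fourier frequencies in (0, 1/2]
    dft = np.fft.rfft(x)[1:n // 2 + 1]
    return freqs, (np.abs(dft) ** 2) / (2.0 * np.pi * n)

def beta_kernel_estimate(x, b=0.05):
    freqs, I = periodogram(x)
    u = freqs / freqs[-1]                               # rescale frequencies to (0, 1]
    est = np.empty_like(I)
    for j, uj in enumerate(u):
        w = beta.pdf(u, uj / b + 1.0, (1.0 - uj) / b + 1.0)  # Beta weights peaked at uj
        est[j] = np.sum(w * I) / np.sum(w)
    return freqs, est

rng = np.random.default_rng(1)
freqs, f_hat = beta_kernel_estimate(rng.standard_normal(512), b=0.05)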
Optimized auxiliary oscillators for the simulation of general open quantum systems
A method for the systematic construction of few-body damped harmonic
oscillator networks accurately reproducing the effect of general bosonic
environments in open quantum systems is presented. Under the sole assumptions
of a Gaussian environment and regardless of the system coupled to it, an
algorithm to determine the parameters of an equivalent set of interacting
damped oscillators obeying a Markovian quantum master equation is introduced.
By choosing a suitable coupling to the system and minimizing an appropriate
distance between the two-time correlation function of this effective bath and
that of the target environment, the error induced in the reduced dynamics of
the system is brought under rigorous control. The interactions among the
effective modes provide remarkable flexibility in replicating non-Markovian
effects on the system even with a small number of oscillators, and the
resulting Lindblad equation may therefore be integrated at a very reasonable
computational cost using standard methods for Markovian problems, even in
strongly non-perturbative coupling regimes and at arbitrary temperatures
including zero. We apply the method to an exactly solvable problem in order to
demonstrate its accuracy, and present a study based on current research in the
context of coherent transport in biological aggregates as a more realistic
example of its use; performance and versatility are highlighted, and
theoretical and numerical advantages over existing methods, as well as possible
future improvements, are discussed.
Comment: 23 + 9 pages, 11 + 2 figures. No changes from previous version except publication info and updated author affiliation.
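A heavily simplified picture of the fitting step described above (hypothetical target and ansatz, not the algorithm or error control of the paper): match a target two-time correlation function by a small sum of exponentially damped cosines, minimizing a least-squares distance on a time grid.

# Toy version of the fitting idea: approximate a target bath correlation
# function C(t) by a small sum of damped oscillator correlations
#   C_fit(t) = sum_k a_k * exp(-g_k * t) * cos(w_k * t),
# minimizing the squared distance on a time grid. The target below is
# hypothetical; the paper's construction and error bounds are not reproduced.
import numpy as np
from scipy.optimize import least_squares

t = np.linspace(0.0, 10.0, 400)
target = np.exp(-0.3 * t) * np.cos(2.0 * t) + 0.4 * np.exp(-0.8 * t)  # made-up C(t)

def model(params, t):
    a, g, w = params.reshape(-1, 3).T            # amplitudes, dampings, frequencies
    return np.sum(a[:, None] * np.exp(-np.abs(g)[:, None] * t) *
                  np.cos(w[:, None] * t), axis=0)

n_modes = 2
x0 = np.tile([0.5, 0.5, 1.0], n_modes)           # crude initial guess
fit = least_squares(lambda p: model(p, t) - target, x0)
print("residual norm:", np.linalg.norm(fit.fun))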
Locally Stationary Functional Time Series
The literature on time series of functional data has focused on processes of
which the probabilistic law is either constant over time or constant up to its
second-order structure. Especially for long stretches of data it is desirable
to be able to weaken this assumption. This paper introduces a framework that
will enable meaningful statistical inference of functional data of which the
dynamics change over time. We put forward the concept of local stationarity in
the functional setting and establish a class of processes that have a
functional time-varying spectral representation. Subsequently, we derive
conditions that allow for fundamental results from nonstationary multivariate
time series to carry over to the function space. In particular, time-varying
functional ARMA processes are investigated and shown to be functional locally
stationary according to the proposed definition. As a side-result, we establish
a Cram\'er representation for an important class of weakly stationary
functional processes. Important in our context is the notion of a time-varying
spectral density operator of which the properties are studied and uniqueness is
derived. Finally, we provide a consistent nonparametric estimator of this
operator and show it is asymptotically Gaussian using a weaker tightness
criterion than is usually deemed necessary.
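The time-varying spectral representation underlying this notion of local stationarity can be written, in the scalar Dahlhaus-type form that the functional framework generalizes (schematic, not quoted from the paper), as:

% Schematic time-varying spectral (Cramer-type) representation in the scalar
% locally stationary setting; the paper lifts A(u, omega) to an operator on a
% function space, yielding a time-varying spectral density operator.
\[
  X_{t,T} \;=\; \int_{-\pi}^{\pi} e^{\,i\omega t}\, A\!\left(\tfrac{t}{T}, \omega\right)\, dZ(\omega),
  \qquad t = 1, \dots, T,
\]
where $Z$ is an orthogonal-increment process and the transfer function
$A(u, \omega)$ varies smoothly in rescaled time $u = t/T$.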
ABC-CDE: Towards Approximate Bayesian Computation with Complex High-Dimensional Data and Limited Simulations
Approximate Bayesian Computation (ABC) is typically used when the likelihood
is either unavailable or intractable but where data can be simulated under
different parameter settings using a forward model. Despite the recent interest
in ABC, high-dimensional data and costly simulations still remain a bottleneck
in some applications. There is also no consensus as to how to best assess the
performance of such methods without knowing the true posterior. We show how a
nonparametric conditional density estimation (CDE) framework, which we refer to
as ABC-CDE, helps address three nontrivial challenges in ABC: (i) how to
efficiently estimate the posterior distribution with limited simulations and
different types of data, (ii) how to tune and compare the performance of ABC
and related methods in estimating the posterior itself, rather than just
certain properties of the density, and (iii) how to efficiently choose among a
large set of summary statistics based on a CDE surrogate loss. We provide
theoretical and empirical evidence that justify ABC-CDE procedures that {\em
directly} estimate and assess the posterior based on an initial ABC sample, and
we describe settings where standard ABC and regression-based approaches are
inadequate.
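For orientation, a minimal rejection-ABC loop, the baseline that ABC-CDE post-processes with conditional density estimation, is sketched below; the forward model, prior, summary statistic, and tolerance are placeholders, not the paper's choices.

# Minimal rejection-ABC sketch: the baseline that ABC-CDE builds on.
# Forward model, prior, summary statistic and tolerance are placeholders.
import numpy as np

rng = np.random.default_rng(42)
x_obs = rng.normal(loc=1.5, scale=1.0, size=100)       # "observed" data (toy)

def prior():                                           # placeholder prior on theta
    return rng.uniform(-5.0, 5.0)

def forward_model(theta, n=100):                       # placeholder simulator
    return rng.normal(loc=theta, scale=1.0, size=n)

def summary(x):                                        # placeholder summary statistic
    return np.array([np.mean(x), np.std(x)])

s_obs, accepted, eps = summary(x_obs), [], 0.2
for _ in range(20000):
    theta = prior()
    s_sim = summary(forward_model(theta))
    if np.linalg.norm(s_sim - s_obs) < eps:            # keep draws close to the data
        accepted.append(theta)

print(len(accepted), "accepted draws; posterior mean:", np.mean(accepted))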
