Analysis of the 3DVAR Filter for the Partially Observed Lorenz '63 Model
The problem of effectively combining data with a mathematical model
constitutes a major challenge in applied mathematics. It is particularly
challenging for high-dimensional dynamical systems where data is received
sequentially in time and the objective is to estimate the system state in an
on-line fashion; this situation arises, for example, in weather forecasting.
The sequential particle filter is then impractical, and ad hoc filters, which
employ some form of Gaussian approximation, are widely used. Prototypical of
these ad hoc filters is the 3DVAR method. The goal of this paper is to analyze
the 3DVAR method, using the Lorenz '63 model to exemplify the key ideas. The
situation where the data is partial and noisy is studied, and both discrete
time and continuous time data streams are considered. The theory demonstrates
how the widely used technique of variance inflation acts to stabilize the
filter, and hence leads to asymptotic accuracy.
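The constant-gain structure of 3DVAR, and the role of the inflated background covariance, can be illustrated with a minimal sketch for a partially observed Lorenz '63 model; the integrator, observation interval, noise level, inflation and initial conditions below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def lorenz63(u, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = u
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def step(u, dt=0.01, n=10):
    # Advance the model over one observation interval with classical RK4.
    for _ in range(n):
        k1 = lorenz63(u)
        k2 = lorenz63(u + 0.5 * dt * k1)
        k3 = lorenz63(u + 0.5 * dt * k2)
        k4 = lorenz63(u + dt * k3)
        u = u + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
    return u

rng = np.random.default_rng(0)
H = np.array([[1.0, 0.0, 0.0]])   # partial observations: x-coordinate only
gamma = 1.0                       # observational noise variance (assumed)
C = 50.0 * np.eye(3)              # fixed, inflated background covariance (assumed)
K = C @ H.T @ np.linalg.inv(H @ C @ H.T + gamma * np.eye(1))  # constant 3DVAR gain

truth = np.array([1.0, 1.0, 1.0])
m = np.array([5.0, -5.0, 20.0])   # deliberately poor initial state estimate
for _ in range(2000):
    truth = step(truth)
    y = H @ truth + np.sqrt(gamma) * rng.standard_normal(1)
    m = step(m)                                # forecast with the model
    m = m + (K @ (y - H @ m)).ravel()          # analysis: pull the estimate towards the data
print("final estimation error:", np.linalg.norm(m - truth))
```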
Data Assimilation: A Mathematical Introduction
These notes provide a systematic mathematical treatment of the subject of
data assimilation.
Analysis and interpretation of high transverse entanglement in optical parametric down conversion
Quantum entanglement associated with transverse wave vectors of down
conversion photons is investigated based on the Schmidt decomposition method.
We show that transverse entanglement involves two variables: orbital angular
momentum and transverse frequency. We show that in the monochromatic limit high
values of entanglement are closely controlled by a single parameter resulting
from the competition between (transverse) momentum conservation and
longitudinal phase matching. We examine the features of the Schmidt eigenmodes,
and indicate how entanglement can be enhanced by suitable mode selection
methods.
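As a rough numerical companion to the Schmidt-decomposition approach described above, the sketch below computes Schmidt coefficients and the Schmidt number for a one-dimensional double-Gaussian toy amplitude; the Gaussian form and the widths are illustrative assumptions and do not reproduce the paper's two-variable (angular plus radial) analysis.

```python
import numpy as np

# Discretize a toy two-photon transverse amplitude psi(q1, q2).
q = np.linspace(-10.0, 10.0, 400)
dq = q[1] - q[0]
q1, q2 = np.meshgrid(q, q, indexing="ij")

sig_plus, sig_minus = 0.5, 3.0   # pump and phase-matching widths (assumed values)
psi = np.exp(-(q1 + q2) ** 2 / (2 * sig_plus ** 2)) * \
      np.exp(-(q1 - q2) ** 2 / (2 * sig_minus ** 2))

# Singular values of the discretized kernel approximate the Schmidt coefficients.
s = np.linalg.svd(psi * dq, compute_uv=False)
lam = s ** 2
lam /= lam.sum()                              # Schmidt probabilities
schmidt_number = 1.0 / np.sum(lam ** 2)       # effective number of entangled modes
print("leading Schmidt probabilities:", np.round(lam[:5], 3))
print("Schmidt number K ~", round(schmidt_number, 2))
```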
Well-Posedness And Accuracy Of The Ensemble Kalman Filter In Discrete And Continuous Time
The ensemble Kalman filter (EnKF) is a method for combining a dynamical model
with data in a sequential fashion. Despite its widespread use, there has been
little analysis of its theoretical properties. Many of the algorithmic
innovations associated with the filter, which are required to make a usable
algorithm in practice, are derived in an ad hoc fashion. The aim of this paper
is to initiate the development of a systematic analysis of the EnKF, in
particular to do so in the small ensemble size limit. The perspective is to
view the method as a state estimator, and not as an algorithm which
approximates the true filtering distribution. The perturbed observation version
of the algorithm is studied, without and with variance inflation. Without
variance inflation well-posedness of the filter is established; with variance
inflation accuracy of the filter, with respect to the true signal underlying
the data, is established. The algorithm is considered in discrete time, and
also for a continuous time limit arising when observations are frequent and
subject to large noise. The underlying dynamical model, and assumptions about
it, are sufficiently general to include the Lorenz '63 and '96 models, together
with the incompressible Navier-Stokes equation on a two-dimensional torus. The
analysis is limited to the case of complete observation of the signal with
additive white noise. Numerical results are presented for the Navier-Stokes
equation on a two-dimensional torus for both complete and partial observations
of the signal with additive white noise.
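The perturbed-observation update analysed above can be written as a single forecast/analysis cycle. The sketch below is a generic implementation for an arbitrary model map and linear observation operator, with optional multiplicative variance inflation; it is an illustration under these assumptions, not the paper's exact algorithm.

```python
import numpy as np

def enkf_step(ensemble, y, step, H, Gamma, rng, inflation=1.0):
    """One forecast/analysis cycle of the perturbed-observation EnKF."""
    # Forecast: push every ensemble member through the model map.
    X = np.array([step(x) for x in ensemble])
    m = X.mean(axis=0)
    X = m + inflation * (X - m)                        # multiplicative variance inflation
    C = (X - m).T @ (X - m) / (len(X) - 1)             # empirical forecast covariance
    K = C @ H.T @ np.linalg.inv(H @ C @ H.T + Gamma)   # ensemble Kalman gain
    # Analysis: each member assimilates an independently perturbed observation.
    noise = rng.multivariate_normal(np.zeros(len(Gamma)), Gamma, size=len(X))
    return X + (y + noise - X @ H.T) @ K.T
```

Here `ensemble` is an array of state vectors, `H` maps states to observations and `Gamma` is the observational noise covariance; with `inflation=1.0` this reduces to the plain perturbed-observation filter.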
Do early-life exposures explain why more advantaged children get eczema? Findings from the U.K. Millennium Cohort Study
Background:
Atopic dermatitis (eczema) in childhood is socially patterned, with higher incidence in more advantaged populations. However, it is unclear what factors explain the social differences.
Objectives:
To identify early-life risk factors for eczema, and to explore how these factors explain any socioeconomic differences in eczema.
Methods:
We estimated odds ratios (ORs) for ever having had eczema by age 5 years in 14 499 children from the U.K. Millennium Cohort Study (MCS), with a focus on maternal, antenatal and early-life risk factors and socioeconomic circumstances (SECs). Risk factors were explored to assess whether they attenuated associations between SECs and eczema.
Results:
Overall, 35·1% of children had ever had eczema by age 5 years. Children of mothers with degree-level qualifications vs. no educational qualifications were more likely to have eczema (OR 1·52, 95% confidence interval 1·31–1·76), and there was a gradient across the socioeconomic spectrum. Maternal atopy, breastfeeding (1–6 weeks and ≥ 6 months), introduction of solids under 4 months or cow's milk under 9 months, antibiotic exposure in the first year of life and grime exposure were associated with increased odds of having eczema. Female sex, Pakistani and Bangladeshi ethnicity, smoking during pregnancy, exposure to environmental tobacco smoke and having more siblings were associated with reduced odds of eczema. Controlling for maternal, antenatal and early-life characteristics (particularly maternal smoking during pregnancy, breastfeeding and number of siblings) reduced the OR for eczema to 1·26 (95% confidence interval 1·03–1·50) in the group with the highest educational qualifications compared with the least qualified.
Conclusions:
In a representative U.K. child cohort, eczema was more common in more advantaged children. This was explained partially by early-life factors including not smoking during pregnancy, breastfeeding and having fewer siblings.
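For readers unfamiliar with the odds-ratio scale used above, the sketch below shows the crude calculation from a 2×2 table using entirely hypothetical counts; the ORs reported in the abstract come from adjusted regression models on the MCS data, not from this arithmetic.

```python
import math

# Hypothetical counts: exposure = degree-level maternal education,
# outcome = ever had eczema by age 5 (placeholders, not MCS data).
a, b = 700, 1300    # exposed:   with eczema, without eczema
c, d = 900, 2100    # unexposed: with eczema, without eczema

or_hat = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # standard error of log(OR)
lo = math.exp(math.log(or_hat) - 1.96 * se_log_or)
hi = math.exp(math.log(or_hat) + 1.96 * se_log_or)
print(f"OR = {or_hat:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```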
Not all the bots are created equal: the Ordering Turing Test for the labelling of bots in MMORPGs
This article contributes to the research on bots in Social Media. It takes as its starting point an emerging perspective which proposes that we should abandon the investigation of the Turing Test and the functional aspects of bots in favor of studying the authentic and cooperative relationship between humans and bots. Contrary to this view, this article argues that Turing Tests are one of the ways in which authentic relationships between humans and bots take place. To understand this, the article introduces the concept of Ordering Turing Tests: these are a sort of Turing Test proposed by social actors for the purpose of achieving social order when bots produce deviant behavior. An Ordering Turing Test is a method for labelling deviance, whereby social actors can use this test to tell apart rule-abiding humans and rule-breaking bots. Using examples from Massively Multiplayer Online Role-Playing Games, this article illustrates how Ordering Turing Tests are proposed and justified by players and service providers. Data for the research comes from scientific literature on Machine Learning proposed for the identification of bots and from game forums and other player-produced paratexts from the case study of the game Runescape.
Localized structures in Kagome lattices
We investigate the existence and stability of gap vortices and multi-pole gap
solitons in a Kagome lattice with a defocusing nonlinearity both in a discrete
case and in a continuum one with periodic external modulation. In particular,
predictions are made based on expansion around a simple and analytically
tractable anti-continuum (zero coupling) limit. These predictions are then
confirmed for a continuum model of an optically-induced Kagome lattice in a
photorefractive crystal obtained by a continuous transformation of a honeycomb
lattice.
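The anti-continuum expansion mentioned above can be illustrated in a much simpler setting: Newton continuation of a stationary solution of a defocusing discrete nonlinear Schrödinger equation on a 1D chain, seeded by a zero-coupling pattern. The chain geometry, lattice size, coupling and seed below are illustrative assumptions and are not the Kagome configuration studied in the paper.

```python
import numpy as np

N, Lam, eps = 21, 1.0, 0.05      # lattice sites, frequency, coupling (assumed)
lap = -2.0 * np.eye(N) + np.eye(N, k=1) + np.eye(N, k=-1)   # 1D discrete Laplacian

# Anti-continuum (eps = 0) seed: two adjacent excited sites, out of phase.
v = np.zeros(N)
v[N // 2], v[N // 2 + 1] = np.sqrt(Lam), -np.sqrt(Lam)

# Newton iteration on F(v) = -eps*Lap*v + v^3 - Lam*v = 0 (stationary defocusing DNLS).
for _ in range(50):
    F = -eps * lap @ v + v ** 3 - Lam * v
    J = -eps * lap + np.diag(3.0 * v ** 2 - Lam)
    dv = np.linalg.solve(J, F)
    v -= dv
    if np.linalg.norm(dv) < 1e-12:
        break
print("residual:", np.linalg.norm(-eps * lap @ v + v ** 3 - Lam * v))
```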
Stability of Filters for the Navier-Stokes Equation
Data assimilation methodologies are designed to incorporate noisy
observations of a physical system into an underlying model in order to infer
the properties of the state of the system. Filters refer to a class of data
assimilation algorithms designed to update the estimate of the state in an
on-line fashion, as data is acquired sequentially. For linear problems subject
to Gaussian noise, filtering can be performed exactly using the Kalman filter.
For nonlinear systems it can be approximated in a systematic way by particle
filters. However, in high dimensions these particle filtering methods can break
down. Hence, for the large nonlinear systems arising in applications such as
weather forecasting, various ad hoc filters are used, mostly based on making
Gaussian approximations. The purpose of this work is to study the properties of
these ad hoc filters, working in the context of the 2D incompressible
Navier-Stokes equation. By working in this infinite dimensional setting we
provide an analysis which is useful for understanding high dimensional
filtering, and is robust to mesh-refinement. We describe theoretical results
showing that, in the small observational noise limit, the filters can be tuned
to accurately track the signal itself (filter stability), provided the system
is observed in a sufficiently large low dimensional space; roughly speaking
this space should be large enough to contain the unstable modes of the
linearized dynamics. Numerical results are given which illustrate the theory.
In a simplified scenario we also derive, and study numerically, a stochastic
PDE which determines filter stability in the limit of frequent observations,
subject to large observational noise. The positive results herein concerning
filter stability complement recent numerical studies which demonstrate that the
ad hoc filters perform poorly in reproducing statistical variation about the
true signal.
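The rough criterion described above (observe enough of the state to control the unstable directions of the linearised dynamics) can be checked numerically once a linearisation is available. In the sketch below a shifted random matrix stands in for the linearised solution map over one observation interval, and the state and observation dimensions are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
d, p = 50, 8                                   # state and observed dimensions (assumed)
A = rng.standard_normal((d, d)) / np.sqrt(d)   # stand-in for a linearised solver map
A += 0.6 * np.eye(d)                           # shift so that a few modes are unstable

eigvals, eigvecs = np.linalg.eig(A)
unstable = eigvecs[:, np.abs(eigvals) > 1.0]   # directions with |lambda| > 1
P = np.eye(p, d)                               # observe the first p coordinates

# Fraction of each unstable direction lying in the observed subspace;
# values near 1 mean the observations "see" that instability.
captured = np.linalg.norm(P @ unstable, axis=0) / np.linalg.norm(unstable, axis=0)
print("unstable modes:", unstable.shape[1])
print("captured fractions:", np.round(captured, 2))
```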
Multiphoton entanglement through a Bell multiport beam splitter
Multiphoton entanglement is an important resource for linear optics quantum
computing. Here we show that a wide range of highly entangled multiphoton
states, including W-states, can be prepared by interfering single photons
inside a Bell multiport beam splitter and using postselection. A successful
state preparation is indicated by the collection of one photon per output port.
An advantage of the Bell multiport beam splitter is that it redirects the
photons without changing their inner degrees of freedom. The described setup
can therefore be used to generate polarisation, time-bin and frequency
multiphoton entanglement, even when using only a single photon source.
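As a small computational companion, the sketch below evaluates the amplitude for the postselected event of one photon per output port when N identical photons (one per input) pass through an N-port symmetric multiport, taken here to be the discrete-Fourier-transform matrix; for indistinguishable photons this amplitude is the matrix permanent of the multiport unitary. The polarisation and time-bin structure emphasised in the abstract is not modelled, and the choice N = 4 is arbitrary.

```python
import numpy as np
from itertools import permutations

def permanent(M):
    # Naive permanent via permutations; adequate for the small N used here.
    n = M.shape[0]
    return sum(np.prod([M[i, p[i]] for i in range(n)]) for p in permutations(range(n)))

N = 4
rows, cols = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
U = np.exp(2j * np.pi * rows * cols / N) / np.sqrt(N)   # symmetric (DFT) multiport

amp = permanent(U)            # amplitude for one photon in every output port
print("coincidence probability:", abs(amp) ** 2)
```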