Multisite Weather Generators Using Bayesian Networks: An Illustrative Case Study for Precipitation Occurrence
ABSTRACT: Many existing approaches for multisite weather generation try to capture several statistics of the observed data (e.g. pairwise correlations) in order to generate spatially and temporally consistent series. In this work we analyse the application of Bayesian networks to this problem, focusing on precipitation occurrence and considering a simple case study to illustrate the potential of this new approach. We use Bayesian networks to approximate the multi-variate (-site) probability distribution of observed gauge data, which is factorized according to the relevant (marginal and conditional) dependencies. This factorization allows the simulation of synthetic samples from the multivariate distribution, thus providing a sound and promising methodology for multisite precipitation series generation. We acknowledge funding provided by the project MULTI-SDM (CGL2015-66583-R, MINECO/FEDER).
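The factorize-then-sample idea the abstract describes can be made concrete with a two-site toy model. This is a minimal sketch, not the paper's model: the probabilities below are invented for illustration, and the joint distribution P(X1, X2) is factorized as P(X1) * P(X2 | X1).

```python
import random

# Hypothetical parameters (not from the paper): marginal wet-day
# probability at site 1, and conditional wet-day probability at
# site 2 given the state of site 1.
P_WET_1 = 0.3
P_WET_2_GIVEN_1 = {0: 0.1, 1: 0.7}  # P(X2 = 1 | X1)

def sample_day(rng):
    """Draw one day's occurrence pair from P(X1) * P(X2 | X1)."""
    x1 = int(rng.random() < P_WET_1)
    x2 = int(rng.random() < P_WET_2_GIVEN_1[x1])
    return x1, x2

rng = random.Random(42)
series = [sample_day(rng) for _ in range(10000)]

# The factorization induces spatial correlation: site 2 is wet far
# more often on days when site 1 is wet.
wet2_given_wet1 = (sum(x2 for x1, x2 in series if x1 == 1)
                   / sum(x1 for x1, _ in series))
```

A real Bayesian network would learn both the graph structure and the conditional tables from gauge data; the sampling step, however, works exactly as above, one conditional at a time along the factorization.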
On the Josephson Coupling between a disk of one superconductor and a surrounding superconducting film of a different symmetry
A cylindrical Josephson junction with a spatially dependent Josephson
coupling which averages to zero is studied in order to model the physics of a
disk of d-wave superconductor embedded in a superconducting film of a different
symmetry. It is found that the system always introduces Josephson vortices in
order to gain energy at the junction. The critical current is calculated. It is
argued that a recent experiment claimed to provide evidence for s-wave
superconductivity in may also be consistent with d-wave
superconductivity. Figures available from the author on request. Comment: 10 pages, revtex3.0, TM-11111-940321-1
Large Area Crop Inventory Experiment (LACIE). Intensive test site assessment report
There are no author-identified significant results in this report.
RankPL: A Qualitative Probabilistic Programming Language
In this paper we introduce RankPL, a modeling language that can be thought of
as a qualitative variant of a probabilistic programming language with a
semantics based on Spohn's ranking theory. Broadly speaking, RankPL can be used
to represent and reason about processes that exhibit uncertainty expressible by
distinguishing "normal" from "surprising" events. RankPL allows (iterated)
revision of rankings over alternative program states and supports various types
of reasoning, including abduction and causal inference. We present the
language, its denotational semantics, and a number of practical examples. We
also discuss an implementation of RankPL that is available for download.
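The core of Spohn's ranking theory that RankPL builds on can be sketched in a few lines. This is an assumed toy example, not RankPL syntax: a ranking function assigns each world a degree of surprise (0 = normal), the rank of an event is the minimum over its worlds, and revision/conditioning shifts ranks inside the event so its minimum becomes 0.

```python
INF = float("inf")

# Toy ranking function over worlds (hypothetical example):
# rank 0 = normal, higher = more surprising.
kappa = {"sunny": 0, "cloudy": 1, "storm": 3}

def rank(event):
    """Rank of an event = minimum rank of its worlds (Spohn)."""
    return min((kappa[w] for w in event), default=INF)

def condition(event):
    """Conditioning on an event shifts its ranks so the minimum is 0."""
    base = rank(event)
    return {w: kappa[w] - base for w in event}

normal_world = min(kappa, key=kappa.get)   # the least surprising world
revised = condition({"cloudy", "storm"})   # revise on "not sunny"
```

Note how the min-based calculus replaces the sum/product calculus of probability: this is what makes the semantics qualitative.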
Causality re-established
Causality never gained the status of a "law" or "principle" in physics. Some
recent literature even popularized the false idea that causality is a notion
that should be banned from theory. Such a misconception relies on an alleged
universality of reversibility of laws of physics, based either on determinism
of classical theory, or on the multiverse interpretation of quantum theory, in
both cases motivated by mere interpretational requirements for realism of the
theory. Here, I will show that a properly defined unambiguous notion of
causality is a theorem of quantum theory, which is also a falsifiable
proposition of the theory. Such causality notion appeared in the literature
within the framework of operational probabilistic theories. It is a genuinely
theoretical notion, corresponding to establishing a definite partial order among
events, in the same way as we do by using the future causal cone in Minkowski
space. The causality notion is logically completely independent of the
misidentified concept of "determinism", and, being a consequence of quantum
theory, is ubiquitous in physics. In addition, as classical theory can be
regarded as a restriction of quantum theory, causality holds also in the
classical case, although the determinism of the theory trivializes it. I then
conclude by arguing that causality naturally establishes an arrow of time. This
implies that the scenario of the "Block Universe" and the connected "Past
Hypothesis" are incompatible with causality, and thus with quantum theory: they
both are doomed to remain mere interpretations and, as such, not falsifiable,
similar to the hypothesis of "super-determinism". This article is part of a
discussion meeting issue "Foundations of quantum mechanics and their impact on
contemporary society". Comment: Presented at the Royal Society of London, on 11/12/2017, at the
conference "Foundations of quantum mechanics and their impact on contemporary
society". To appear in Philosophical Transactions of the Royal Society.
Gaussian Belief Propagation with dynamic data and in dynamic networks
In this paper we analyse Belief Propagation over a Gaussian model in a
dynamic environment. Recently, this has been proposed as a method to average
local measurement values by a distributed protocol ("Consensus Propagation",
Moallemi & Van Roy, 2006), where the average is available for read-out at every
single node. In the case that the underlying network is constant but the values
to be averaged fluctuate ("dynamic data"), convergence and accuracy are
determined by the spectral properties of an associated Ruelle-Perron-Frobenius
operator. For Gaussian models on Erdos-Renyi graphs, numerical computation
points to a spectral gap remaining in the large-size limit, implying
exceptionally good scalability. In a model where the underlying network also
fluctuates ("dynamic network"), averaging is more effective than in the dynamic
data case. Altogether, this implies very good performance of these methods in
very large systems, and opens a new field of statistical physics of large (and
dynamic) information systems. Comment: 5 pages, 7 figures
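The averaging behaviour described above can be illustrated with the linear consensus iteration that Consensus Propagation generalizes. This is a minimal sketch, not the Gaussian message-passing protocol of Moallemi & Van Roy: each node repeatedly replaces its value with a weighted average over its neighbourhood, and with doubly stochastic (Metropolis) weights every node converges to the global average.

```python
def metropolis_weights(edges, n):
    """Doubly stochastic weights from node degrees (Metropolis rule)."""
    deg = [0] * n
    for i, j in edges:
        deg[i] += 1
        deg[j] += 1
    W = [[0.0] * n for _ in range(n)]
    for i, j in edges:
        w = 1.0 / (1 + max(deg[i], deg[j]))
        W[i][j] = W[j][i] = w
    for i in range(n):
        W[i][i] = 1.0 - sum(W[i])   # put leftover mass on the diagonal
    return W

def consensus(values, edges, iters=200):
    """Repeated local averaging; every entry approaches mean(values)."""
    n = len(values)
    W = metropolis_weights(edges, n)
    x = list(values)
    for _ in range(iters):
        x = [sum(W[i][j] * x[j] for j in range(n)) for i in range(n)]
    return x

# Path graph 0 - 1 - 2 with local measurements 1, 2, 6 (average 3).
estimates = consensus([1.0, 2.0, 6.0], edges=[(0, 1), (1, 2)])
```

The convergence rate is set by the spectral gap of W, which is exactly the kind of spectral quantity the abstract's Ruelle-Perron-Frobenius analysis generalizes to the dynamic-data case.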
Hierarchical Models for Independence Structures of Networks
We introduce a new family of network models, called hierarchical network
models, that allow us to represent in an explicit manner the stochastic
dependence among the dyads (random ties) of the network. In particular, each
member of this family can be associated with a graphical model defining
conditional independence clauses among the dyads of the network, called the
dependency graph. Every network model with dyadic independence assumption can
be generalized to construct members of this new family. Using this new
framework, we generalize the Erd\"os-R\'enyi and beta-models to create
hierarchical Erd\"os-R\'enyi and beta-models. We describe various methods for
parameter estimation as well as simulation studies for models with sparse
dependency graphs. Comment: 19 pages, 7 figures
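The dyadic-independence starting point that the hierarchical construction generalizes is easy to state in code. This sketch shows the plain beta-model (every dyad independent, with edge probability determined by node parameters), of which Erdős-Rényi is the homogeneous special case; the paper's hierarchical models then couple the dyads via a dependency graph, which this sketch does not do.

```python
import math
import random

def sample_beta_model(beta, rng):
    """Dyadic-independence beta-model: dyad (i, j) is an edge
    independently with probability sigmoid(beta_i + beta_j)."""
    n = len(beta)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            p = 1.0 / (1.0 + math.exp(-(beta[i] + beta[j])))
            if rng.random() < p:
                edges.add((i, j))
    return edges

rng = random.Random(0)
# Equal parameters reduce the beta-model to Erdos-Renyi G(n, p) with
# p = sigmoid(2 * b); here b = 0 gives p = 0.5 on 6 nodes.
g = sample_beta_model([0.0] * 6, rng)
```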
Statistical physics-based reconstruction in compressed sensing
Compressed sensing is triggering a major evolution in signal acquisition. It
consists in sampling a sparse signal at low rate and later using computational
power for its exact reconstruction, so that only the necessary information is
measured. Currently used reconstruction techniques are, however, limited to
acquisition rates larger than the true density of the signal. We design a new
procedure which is able to reconstruct exactly the signal with a number of
measurements that approaches the theoretical limit in the limit of large
systems. It is based on the joint use of three essential ingredients: a
probabilistic approach to signal reconstruction, a message-passing algorithm
adapted from belief propagation, and a careful design of the measurement matrix
inspired from the theory of crystal nucleation. The performance of this new
algorithm is analyzed by statistical physics methods. The obtained improvement
is confirmed by numerical studies of several cases. Comment: 20 pages, 8 figures, 3 tables. Related codes and data are available
at http://aspics.krzakala.or
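To make the sparse-reconstruction setting concrete, here is a standard greedy baseline, orthogonal matching pursuit. This is emphatically not the paper's seeded belief-propagation procedure; it is a simple reference method for the same problem (recover a k-sparse signal from m < n linear measurements), with all dimensions and amplitudes chosen for illustration.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily select the column most
    correlated with the residual, then least-squares refit on the
    selected support."""
    residual = y.copy()
    support = []
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(0)
m, n, k = 50, 100, 3                       # measurements, dimension, sparsity
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[[10, 50, 90]] = [3.0, -2.5, 2.0]    # hypothetical sparse signal
y = A @ x_true
x_hat = omp(A, y, k)   # typically recovers the support of x_true
```

Greedy methods like this need acquisition rates well above the information-theoretic limit; the point of the paper's message-passing approach with nucleation-inspired measurement matrices is precisely to close that gap.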
A very fast inference algorithm for finite-dimensional spin glasses: Belief Propagation on the dual lattice
Starting from a Cluster Variational Method, and inspired by the correctness
of the paramagnetic Ansatz (at high temperatures in general, and at any
temperature in the 2D Edwards-Anderson model) we propose a novel message
passing algorithm --- the Dual algorithm --- to estimate the marginal
probabilities of spin glasses on finite dimensional lattices. We show that in a
wide range of temperatures our algorithm compares very well with Monte Carlo
simulations, with the Double Loop algorithm and with exact calculation of the
ground state of 2D systems with bimodal and Gaussian interactions. Moreover it
is usually 100 times faster than other provably convergent methods, such as the
Double Loop algorithm. Comment: 23 pages, 12 figures. v2: improved introduction
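For readers unfamiliar with message passing on spin systems, here is the baseline the Dual algorithm improves on. This toy sketch (not the paper's dual-lattice construction) runs belief propagation on a small Ising chain, where BP is exact because the chain is a tree, and checks the marginals against brute-force enumeration; the couplings and fields are invented.

```python
import itertools
import math

J = [0.5, -0.3, 0.8]          # couplings J[i] between spins i and i+1
h = [0.2, 0.0, -0.1, 0.4]     # local fields
N = len(h)
S = (+1, -1)

def bp_marginals(J, h):
    """Forward/backward BP messages on a chain; exact on trees."""
    fwd = [{s: 1.0 for s in S} for _ in range(N)]
    bwd = [{s: 1.0 for s in S} for _ in range(N)]
    for i in range(1, N):       # message into spin i from the left
        for s in S:
            fwd[i][s] = sum(math.exp(J[i-1]*t*s + h[i-1]*t) * fwd[i-1][t]
                            for t in S)
    for i in range(N - 2, -1, -1):   # message into spin i from the right
        for s in S:
            bwd[i][s] = sum(math.exp(J[i]*s*t + h[i+1]*t) * bwd[i+1][t]
                            for t in S)
    marg = []
    for i in range(N):
        b = {s: math.exp(h[i]*s) * fwd[i][s] * bwd[i][s] for s in S}
        marg.append(b[+1] / (b[+1] + b[-1]))   # P(spin i = +1)
    return marg

def exact_marginals(J, h):
    """Brute-force enumeration of all 2^N configurations."""
    Z, p_up = 0.0, [0.0] * N
    for spins in itertools.product(S, repeat=N):
        w = math.exp(sum(J[i]*spins[i]*spins[i+1] for i in range(N-1))
                     + sum(h[i]*spins[i] for i in range(N)))
        Z += w
        for i in range(N):
            if spins[i] == +1:
                p_up[i] += w
    return [p / Z for p in p_up]

bp = bp_marginals(J, h)
exact = exact_marginals(J, h)
```

On loopy finite-dimensional lattices plain BP of this kind is no longer exact, which is the regime the paper's Dual algorithm targets.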
Common Causes and The Direction of Causation
Is the common cause principle merely one of a set of useful heuristics for discovering causal relations, or is it rather a piece of heavy-duty metaphysics, capable of grounding the direction of causation itself? Since the principle was introduced in Reichenbach's groundbreaking work The Direction of Time (1956), there have been a series of attempts to pursue the latter program: to take the probabilistic relationships constitutive of the principle of the common cause and use them to ground the direction of causation. These attempts have not all explicitly appealed to the principle as originally formulated; it has also appeared in the guise of independence conditions, counterfactual overdetermination, and, in the causal modelling literature, as the Causal Markov Condition. In this paper, I identify a set of difficulties for grounding the asymmetry of causation on the principle and its descendants. The first difficulty, concerning what I call the vertical placement of causation, consists of a tension between considerations that drive towards the macroscopic scale and considerations that drive towards the microscopic scale: the worry is that these considerations cannot both be comfortably accommodated. The second difficulty consists of a novel potential counterexample to the principle based on the familiar Einstein-Podolsky-Rosen (EPR) cases in quantum mechanics.
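Reichenbach's screening-off condition, which the abstract's independence conditions generalize, admits a short numeric illustration. The probabilities below are an invented example: a common cause C makes its effects A and B correlated unconditionally, yet independent conditional on C.

```python
# Hypothetical common-cause structure: C -> A, C -> B.
P_C = 0.4
P_A_GIVEN = {True: 0.9, False: 0.2}   # P(A | C)
P_B_GIVEN = {True: 0.8, False: 0.1}   # P(B | C)

def p(a, b, c):
    """Joint probability under the common-cause factorization."""
    pc = P_C if c else 1 - P_C
    pa = P_A_GIVEN[c] if a else 1 - P_A_GIVEN[c]
    pb = P_B_GIVEN[c] if b else 1 - P_B_GIVEN[c]
    return pc * pa * pb

# Unconditional correlation between the two effects:
p_ab = sum(p(True, True, c) for c in (True, False))
p_a = sum(p(True, b, c) for b in (True, False) for c in (True, False))
p_b = sum(p(a, True, c) for a in (True, False) for c in (True, False))

# Screening off: conditional on C, the joint factorizes.
p_ab_given_c = p(True, True, True) / P_C
factorized = P_A_GIVEN[True] * P_B_GIVEN[True]
```

The EPR counterexample sketched in the abstract turns on the fact that quantum correlations can violate exactly this factorization condition for any candidate common cause.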