Process, System, Causality, and Quantum Mechanics, A Psychoanalysis of Animal Faith
We shall argue in this paper that a central piece of modern physics does not
really belong to physics at all but to elementary probability theory. Given a
joint probability distribution J on a set of random variables containing x and
y, define a link between x and y to be the condition x=y on J. Define the {\it
state} D of a link x=y as the joint probability distribution matrix on x and y
without the link. The two core laws of quantum mechanics are the Born
probability rule and the unitary dynamical law, whose best-known form is the Schrödinger equation. Von Neumann formulated these two laws in the language
of Hilbert space as prob(P) = trace(PD) and D'T = TD respectively, where P is a
projection, D and D' are (von Neumann) density matrices, and T is a unitary
transformation. We'll see that if we regard link states as density matrices,
the algebraic forms of these two core laws occur as completely general theorems
about links. When we extend probability theory by allowing cases to count
negatively, we find that the Hilbert space framework of quantum mechanics
proper emerges from the assumption that all D's are symmetrical in rows and
columns. On the other hand, Markovian systems emerge when we assume that one of
every linked variable pair has a uniform probability distribution. By
representing quantum and Markovian structure in this way, we see clearly both
how they differ, and also how they can coexist in natural harmony with each
other, as they must in quantum measurement, which we'll examine in some detail.
Looking beyond quantum mechanics, we see how both structures have their special
places in a much larger continuum of formal systems that we have yet to look
for in nature.

Comment: LaTeX, 86 pages
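The algebraic form of the Born rule, prob(P) = trace(PD), is easy to check numerically for a link state. A minimal sketch, assuming a hypothetical 2x2 joint distribution matrix D on two binary variables (the numbers are illustrative, not from the paper):

```python
import numpy as np

# Hypothetical joint distribution matrix D on binary variables x and y:
# the "state" of a link x=y before the link condition is imposed.
# Rows index x, columns index y; entries sum to 1.
D = np.array([[0.4, 0.1],
              [0.2, 0.3]])

# Projection onto the outcome x = y = 0.
P = np.array([[1.0, 0.0],
              [0.0, 0.0]])

# Born-rule form: prob(P) = trace(P D).
prob = np.trace(P @ D)
print(prob)  # → 0.4, i.e. D[0, 0], the joint weight on x=0, y=0
```

For a diagonal projection the trace simply picks out the corresponding diagonal entry of D, which is the abstract's point that the rule's algebraic form is a general fact about links, not something specific to Hilbert space.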
A survey of statistical network models
Networks are ubiquitous in science and have become a focal point for
discussion in everyday life. Formal statistical models for the analysis of
network data have emerged as a major topic of interest in diverse areas of
study, and most of these involve a form of graphical representation.
Probability models on graphs date back to 1959. Along with empirical studies in
social psychology and sociology from the 1960s, these early works generated an
active network community and a substantial literature in the 1970s. This effort
moved into the statistical literature in the late 1970s and 1980s, and the past
decade has seen a burgeoning network literature in statistical physics and
computer science. The growth of the World Wide Web and the emergence of online
networking communities such as Facebook, MySpace, and LinkedIn, and a host of
more specialized professional network communities have intensified interest in
the study of networks and network data. Our goal in this review is to provide
the reader with an entry point to this burgeoning literature. We begin with an
overview of the historical development of statistical network modeling and then
we introduce a number of examples that have been studied in the network
literature. Our subsequent discussion focuses on a number of prominent static
and dynamic network models and their interconnections. We emphasize formal
model descriptions, and pay special attention to the interpretation of
parameters and their estimation. We end with a description of some open
problems and challenges for machine learning and statistics.

Comment: 96 pages, 14 figures, 333 references
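The 1959 probability models on graphs mentioned above are the Erdős–Rényi-style random graphs, in which each possible edge appears independently with a fixed probability. A minimal sampling sketch (function name and parameters are illustrative):

```python
import random

def erdos_renyi(n, p, seed=None):
    """Sample an undirected G(n, p) random graph: each of the
    n*(n-1)/2 possible edges is included independently with
    probability p. Returns the edge list as (i, j) pairs, i < j."""
    rng = random.Random(seed)
    return [(i, j)
            for i in range(n)
            for j in range(i + 1, n)
            if rng.random() < p]

g = erdos_renyi(6, 0.5, seed=1)
```

Despite its simplicity, this model is the baseline against which the static and dynamic models surveyed here are usually compared, since it fixes the edge count in expectation while imposing no further structure.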
An Overview of LISA Data Analysis Algorithms
The development of search algorithms for gravitational wave sources in the
LISA data stream is currently a very active area of research. It has become
clear that the difficulty lies not only in searching for the individual sources but also, in the case of galactic binaries, in evaluating the fidelity of resolved sources, which turns out to be a major challenge in itself. In this
article we review the current status of developed algorithms for galactic
binary, non-spinning supermassive black hole binary and extreme mass ratio
inspiral sources. While covering the vast majority of algorithms, we will
highlight those that represent the state of the art in terms of speed and
accuracy.

Comment: 21 pages. Invited highlight article appearing in issue 01 of Gravitational Waves Notes, "GW Notes", edited by Pau Amaro-Seoane and Bernard F. Schutz at: http://brownbag.lisascience.org/lisa-gw-notes
Activity Analysis: Finding Explanations for Sets of Events
Automatic activity recognition is the computational process of analysing visual input and reasoning about detections to understand the performed events. In all but the simplest scenarios, an activity involves multiple interleaved events, some related and others independent. The activity in a car park or at a playground would typically include many events. This research assumes the possible events and any constraints between the events can be defined for the given scene. Analysing the activity should thus recognise a complete and consistent set of events; this is referred to as a global explanation of the activity. By seeking a global explanation that satisfies the activity’s constraints, infeasible interpretations can be avoided, and ambiguous observations may be resolved.
An activity’s events and any natural constraints are defined using a grammar formalism. Attribute Multiset Grammars (AMGs) are chosen because they allow hierarchies, attribute rules, and constraints to be defined. When used for recognition, detectors are employed to gather a set of detections. Parsing this set of detections with the AMG provides a global explanation. To find the best parse tree given a set of detections, a Bayesian network models the probability distribution over the space of possible parse trees. Heuristic and exhaustive search techniques are proposed to find the maximum a posteriori global explanation.
The framework is tested for two activities: the activity in a bicycle rack, and around a building entrance. The first case study involves people locking bicycles onto a bicycle rack and picking them up later. The best global explanation for all detections gathered during the day resolves local ambiguities arising from occlusion or clutter. Intensive testing on 5 full days showed that global analysis achieves higher recognition rates. The second case study tracks people and any objects they carry as they enter and exit a building entrance. A complete sequence of a person entering and exiting multiple times is recovered by the global explanation.
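The idea of a maximum a posteriori global explanation can be sketched as an exhaustive search over consistent subsets of detections. This is a simplified stand-in for the AMG parsing and Bayesian-network scoring described above: the detection labels, confidences, the constraint, and the confidence-product score are all hypothetical illustrations, not the thesis's actual model.

```python
from itertools import combinations

# Hypothetical detections (label, confidence) from the bicycle-rack scene.
detections = [("drop_bike", 0.9), ("pick_bike", 0.8),
              ("drop_bike", 0.4), ("pass_by", 0.7)]

def consistent(indices):
    """Toy constraint: at every point in time, the number of accepted
    pick-ups must not exceed the number of accepted drops."""
    drops = picks = 0
    for i in indices:
        label = detections[i][0]
        if label == "drop_bike":
            drops += 1
        elif label == "pick_bike":
            picks += 1
            if picks > drops:
                return False
    return True

def score(indices):
    """Posterior proxy: an accepted detection contributes its confidence,
    a rejected one contributes (1 - confidence)."""
    chosen = set(indices)
    s = 1.0
    for i, (_, conf) in enumerate(detections):
        s *= conf if i in chosen else 1.0 - conf
    return s

# Exhaustive search for the maximum a posteriori explanation.
candidates = [sub for r in range(len(detections) + 1)
              for sub in combinations(range(len(detections)), r)
              if consistent(sub)]
best = max(candidates, key=score)
# best == (0, 1, 3): the strong drop, the pick-up, and the pass-by are
# accepted; the weak second drop is rejected as clutter.
```

Note how the constraint rules out the pick-up without a preceding drop, which is exactly how a global explanation avoids infeasible interpretations; the heuristic search in the thesis replaces this exhaustive enumeration when the detection set is large.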