Explicit tracking of uncertainty increases the power of quantitative rule-of-thumb reasoning in cell biology
"Back-of-the-envelope" or "rule-of-thumb" calculations involving rough
estimates of quantities play a central scientific role in developing intuition
about the structure and behaviour of physical systems, for example in so-called
'Fermi problems' in the physical sciences. Such calculations can be used to
powerfully and quantitatively reason about biological systems, particularly at
the interface between physics and biology. However, substantial uncertainties
are often associated with values in cell biology, and performing calculations
without taking this uncertainty into account may limit the extent to which
results can be interpreted for a given problem. We present a means to
facilitate such calculations where uncertainties are explicitly tracked through
the line of reasoning, and introduce a 'probabilistic calculator' called
Caladis, a web tool freely available at www.caladis.org, designed to perform
this tracking. This approach allows users to perform more statistically robust
calculations in cell biology despite having uncertain values, and to identify
which quantities need to be measured more precisely in order to make confident
statements, facilitating efficient experimental design. We illustrate the use
of our tool for tracking uncertainty in several example biological
calculations, showing that the results yield powerful and interpretable
statistics on the quantities of interest. We also demonstrate that the outcomes
of calculations may differ from point estimates when uncertainty is accurately
tracked. An integral link between Caladis and the Bionumbers repository of
biological quantities further facilitates the straightforward location,
selection, and use of a wealth of experimental data in cell biological
calculations.

Comment: 8 pages, 3 figures
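The kind of uncertainty tracking described above can be sketched with a generic Monte Carlo approach (this is not Caladis's actual implementation, and the quantities, medians, and spreads below are invented purely for illustration):

```python
import random
import statistics

random.seed(0)  # reproducible illustration
N = 100_000

# Hypothetical back-of-the-envelope estimate: molecules of a protein per cell
# = concentration * cell volume * Avogadro's number. Each uncertain input is
# represented by a cloud of samples rather than a single point value.
conc_uM = [random.gauss(1.0, 0.2) for _ in range(N)]     # concentration, micromolar
vol_fL  = [random.gauss(100.0, 20.0) for _ in range(N)]  # cell volume, femtolitres

# Unit check: 1 uM * 1 fL = 1e-6 mol/L * 1e-15 L = 1e-21 mol ~ 602.2 molecules.
molecules = sorted(c * v * 602.2 for c, v in zip(conc_uM, vol_fL))

med = statistics.median(molecules)
lo, hi = molecules[int(0.025 * N)], molecules[int(0.975 * N)]
print(f"median ~ {med:.0f} molecules; 95% interval ~ [{lo:.0f}, {hi:.0f}]")
```

The resulting interval makes explicit how combined uncertainty can span a several-fold range even when each input seems reasonably well constrained, which is the sense in which a distributional answer is more interpretable than a single point estimate.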
Sealed containers in Z
Physical means of securing information, such as sealed envelopes and scratch cards, can be used to achieve cryptographic objectives. Reasoning about this has so far been informal.
We give a model of distinguishable sealed envelopes in Z, exploring design decisions, and discuss the further analysis and development of such models.
Enhanced tracking and recognition of moving objects by reasoning about spatio-temporal continuity.
A framework for the logical and statistical analysis and annotation of dynamic scenes containing occlusion and other uncertainties is presented. This framework consists of three elements: an object tracker module, an object recognition/classification module, and a reasoning engine for logical consistency, ambiguity, and error. The principle behind the object tracker and object recognition modules is to reduce error by increasing ambiguity (by merging objects in close proximity and presenting multiple
hypotheses). The reasoning engine deals with error, ambiguity and occlusion in a unified framework to produce a hypothesis that satisfies fundamental constraints
on the spatio-temporal continuity of objects. Our algorithm finds a globally consistent model of an extended video sequence that is maximally supported by a voting function based on the output of a statistical classifier. The system results
in an annotation that is significantly more accurate than what would be obtained
by frame-by-frame evaluation of the classifier output. The framework has been implemented
and applied successfully to the analysis of team sports with a single
camera.
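The contrast between frame-by-frame annotation and a globally consistent one chosen by a voting function can be sketched in miniature (the labels and scores below are invented, and the real system's voting function and spatio-temporal constraint reasoning are far richer than this single-object toy):

```python
# Per-frame classifier scores for one tracked object. A single noisy frame
# would flip a frame-by-frame argmax annotation.
frame_scores = [
    {"player": 0.6, "referee": 0.4},
    {"player": 0.3, "referee": 0.7},   # noisy frame: local argmax flips
    {"player": 0.8, "referee": 0.2},
    {"player": 0.7, "referee": 0.3},
]

# Frame-by-frame annotation: each frame's best label, taken independently.
per_frame = [max(s, key=s.get) for s in frame_scores]

# Globally consistent annotation: the continuity constraint says one object
# keeps one identity across its track, so pick the single label with the
# largest total support (summed classifier votes) over the whole sequence.
labels = frame_scores[0].keys()
consistent = max(labels, key=lambda lab: sum(s[lab] for s in frame_scores))

print(per_frame)    # ['player', 'referee', 'player', 'player']
print(consistent)   # player
```

The point of the toy is that the summed vote absorbs the noisy frame, so the track-level label is stable where the per-frame labels are not.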
Quantum Locality
It is argued that while quantum mechanics contains nonlocal or entangled
states, the instantaneous or nonlocal influences sometimes thought to be
present due to violations of Bell inequalities in fact arise from mistaken
attempts to apply classical concepts and introduce probabilities in a manner
inconsistent with the Hilbert space structure of standard quantum mechanics.
Instead, Einstein locality is a valid quantum principle: objective properties
of individual quantum systems do not change when something is done to another
noninteracting system. There is no reason to suspect any conflict between
quantum theory and special relativity.

Comment: Introduction has been revised, references added, minor corrections elsewhere. To appear in Foundations of Physics.
The New Quantum Logic
It is shown how all the major conceptual difficulties of standard (textbook)
quantum mechanics, including the two measurement problems and the (supposed)
nonlocality that conflicts with special relativity, are resolved in the
consistent or decoherent histories interpretation of quantum mechanics by using
a modified form of quantum logic to discuss quantum properties (subspaces of
the quantum Hilbert space), and treating quantum time development as a
stochastic process. The histories approach in turn gives rise to some
conceptual difficulties, in particular the correct choice of a framework
(probabilistic sample space) or family of histories, and these are discussed.
The central issue is that the principle of unicity, the idea that there is a
unique single true description of the world, is incompatible with our current
understanding of quantum mechanics.

Comment: Minor changes and corrections to bring into conformity with published version.
Big data and the SP theory of intelligence
This article is about how the "SP theory of intelligence" and its realisation
in the "SP machine" may, with advantage, be applied to the management and
analysis of big data. The SP system -- introduced in the article and fully
described elsewhere -- may help to overcome the problem of variety in big data:
it has potential as "a universal framework for the representation and
processing of diverse kinds of knowledge" (UFK), helping to reduce the
diversity of formalisms and formats for knowledge and the different ways in
which they are processed. It has strengths in the unsupervised learning or
discovery of structure in data, in pattern recognition, in the parsing and
production of natural language, in several kinds of reasoning, and more. It
lends itself to the analysis of streaming data, helping to overcome the problem
of velocity in big data. Central in the workings of the system is lossless
compression of information: making big data smaller and reducing problems of
storage and management. There is potential for substantial economies in the
transmission of data, for big cuts in the use of energy in computing, for
faster processing, and for smaller and lighter computers. The system provides a
handle on the problem of veracity in big data, with potential to assist in the
management of errors and uncertainties in data. It lends itself to the
visualisation of knowledge structures and inferential processes. A
high-parallel, open-source version of the SP machine would provide a means for
researchers everywhere to explore what can be done with the system and to
create new versions of it.

Comment: Accepted for publication in IEEE Access.
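The claim that lossless compression makes big data smaller while preserving it exactly can be illustrated generically with zlib (this is standard dictionary-based compression, not the SP system's own matching-and-unification mechanism; the record format is invented for the example):

```python
import zlib

# A highly redundant "big data" stream: the same record repeated many times.
record = b'{"sensor": "A7", "status": "OK", "reading": 21.5}\n'
stream = record * 1000

packed = zlib.compress(stream, level=9)

# Lossless: the original stream is recovered byte-for-byte.
assert zlib.decompress(packed) == stream

print(len(stream), len(packed))  # compressed size is a small fraction
```

Because the redundancy is removed rather than the information, storage and transmission costs shrink while every query against the decompressed data gives exactly the original answer.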
Using spatio-temporal continuity constraints to enhance visual tracking of moving objects
We present a framework for annotating dynamic scenes involving occlusion and other uncertainties. Our system comprises an object tracker, an object classifier and an algorithm for reasoning about spatio-temporal continuity. The principle behind the object tracking and classifier modules is to reduce error by increasing ambiguity (by merging objects in close proximity and presenting multiple hypotheses). The reasoning engine resolves error, ambiguity and occlusion to produce a most likely hypothesis, which is consistent with global spatio-temporal continuity constraints. The system results in improved annotation over frame-by-frame methods. It has been implemented and applied to the analysis of a team sports video.