Logical Entropy: Introduction to Classical and Quantum Logical Information Theory
Logical information theory is the quantitative version of the logic of partitions, just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences, and distinguishability, and is formalized using the distinctions of a partition. All the definitions of simple, joint, conditional, and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose of this paper is to give the direct generalization to quantum logical information theory that similarly focuses on the pairs of eigenstates distinguished by an observable, i.e., qudits of an observable. The fundamental theorem for quantum logical entropy and measurement establishes a direct quantitative connection between the increase in quantum logical entropy due to a projective measurement and the eigenstates that are distinguished by the measurement. Both the classical and quantum versions of logical entropy have simple interpretations as “two-draw” probabilities for distinctions. The conclusion is that quantum logical entropy is the simple and natural notion of information for quantum information theory, focusing on the distinguishing of quantum states.
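The classical "two-draw" interpretation mentioned in the abstract can be made concrete in a few lines: the logical entropy of a probability distribution p is h(p) = 1 − Σᵢ pᵢ², the probability that two independent draws from p yield distinct outcomes. The sketch below is illustrative only (the function name and interface are not from the paper):

```python
def logical_entropy(p):
    """Logical entropy h(p) = 1 - sum(p_i^2): the probability that two
    independent draws from the distribution p land on distinct outcomes,
    i.e., that the pair of draws is a 'distinction'."""
    return 1.0 - sum(pi * pi for pi in p)

# A certain outcome makes no distinctions:
print(logical_entropy([1.0]))        # 0.0
# A fair coin: two draws differ half the time.
print(logical_entropy([0.5, 0.5]))   # 0.5
```

For a uniform distribution over n outcomes this gives 1 − 1/n, which approaches 1 as n grows, in contrast to Shannon entropy, which grows without bound.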
Set-related restrictions for semantic groupings
Semantic database models utilize several fundamental forms of groupings to increase their expressive power. In this paper we consider four of the most common of these constructs: basic set groupings, is-a related groupings, power set groupings, and Cartesian aggregation groupings. For each, we define a number of useful restrictions that control its structure and composition. This permits each grouping to capture more subtle distinctions among the concepts or situations in the application environment. The resulting set of restrictions forms a framework which increases the expressive power of semantic models and specifies various set-related integrity constraints.
From receptive profiles to a metric model of V1
In this work we show how to construct connectivity kernels induced by the receptive profiles of simple cells of the primary visual cortex (V1). These kernels are directly defined by the shape of such profiles: this provides a metric model for the functional architecture of V1, whose global geometry is determined by the reciprocal interactions between local elements. Our construction adapts to any bank of filters chosen to represent a set of receptive profiles, since it does not require any structure on the parameterization of the family. The connectivity kernel that we define carries a geometrical structure consistent with the well-known properties of long-range horizontal connections in V1, and it is compatible with the perceptual rules synthesized by the concept of association field. These characteristics are still present when the kernel is constructed from a bank of filters arising from an unsupervised learning algorithm.
Comment: 25 pages, 18 figures.
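The idea of a kernel "directly defined by the shape of the profiles" can be illustrated with a minimal sketch. The filter bank (Gabor-like profiles) and the choice of cosine similarity as the kernel are assumptions for the example; the paper's actual construction may differ:

```python
import numpy as np

def gabor_profile(theta, size=16, freq=0.25, sigma=3.0):
    """A Gabor-like receptive profile at orientation theta
    (a hypothetical stand-in for one simple-cell profile)."""
    half = size // 2
    y, x = np.mgrid[-half:half, -half:half]
    xr = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * freq * xr)

def connectivity_kernel(profiles):
    """K[i, j] = cosine similarity between flattened profiles i and j:
    a symmetric kernel induced purely by the shapes of the profiles,
    with no assumption on how the family is parameterized."""
    flat = np.stack([p.ravel() for p in profiles]).astype(float)
    flat /= np.linalg.norm(flat, axis=1, keepdims=True)
    return flat @ flat.T

# Kernel over a small bank of oriented filters.
bank = [gabor_profile(t) for t in np.linspace(0.0, np.pi, 8, endpoint=False)]
K = connectivity_kernel(bank)
```

Because the kernel depends only on the filter responses, the same construction applies unchanged to a bank of filters learned without supervision, as the abstract notes.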