MemCA: all-memristor design for deterministic and probabilistic cellular automata hardware realization
© 2023 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.

Inspired by the behavior of natural systems, Cellular Automata (CA) tackle the demanding long-distance information transfer of conventional computers through massively parallel computation performed by a set of locally coupled dynamical nodes. Although CA are envisioned as powerful deterministic computers, their intrinsic capabilities expand once the memristor's probabilistic switching is introduced into CA cells, resulting in a new hybrid deterministic and probabilistic memristor-based CA (MemCA). In the proposed MemCA hardware realization, memristor devices are incorporated in both the cell and rule modules, composing the first all-memristor CA hardware, designed with mixed CMOS/memristor circuits. The proposed implementation achieves high operating speed and reduced area requirements, also exploiting the memristor as an entropy source in every CA cell. MemCA's operation is showcased in both deterministic and probabilistic modes, which can be selected externally via the programming voltage amplitude, without changing the design. The proposed MemCA system also includes a reconfigurable rule module implementation that allows for spatial and temporal rule inhomogeneity.
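The hybrid deterministic/probabilistic behavior described above can be illustrated in software. Below is a minimal sketch of a one-dimensional binary CA in which a tunable flip probability stands in for the memristor's stochastic switching; the hardware mechanism itself is not modeled, and the function names are illustrative, not from the paper.

```python
import random

def step(cells, rule, flip_prob=0.0):
    """One synchronous update of a 1-D binary CA with periodic boundaries.

    rule: Wolfram rule number (0-255).
    flip_prob: probability that a cell's updated state is inverted,
    a software stand-in for probabilistic memristor switching.
    """
    n = len(cells)
    out = []
    for i in range(n):
        # Encode the 3-cell neighbourhood as a number 0-7.
        neigh = (cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]
        bit = (rule >> neigh) & 1          # look up the rule's output bit
        if flip_prob and random.random() < flip_prob:
            bit ^= 1                       # stochastic inversion
        out.append(bit)
    return out

# Deterministic rule 90 (neighbour XOR) from a single seed cell.
cells = [0] * 11
cells[5] = 1
print(step(cells, 90))   # -> [0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 0]
```

Setting `flip_prob > 0` turns the same design probabilistic without changing the update circuit, mirroring how the paper switches modes via the programming voltage amplitude.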
Combinatorial Information Theory: I. Philosophical Basis of Cross-Entropy and Entropy
This study critically analyses the information-theoretic, axiomatic and
combinatorial philosophical bases of the entropy and cross-entropy concepts.
The combinatorial basis is shown to be the most fundamental (most primitive) of
these three bases, since it gives (i) a derivation for the Kullback-Leibler
cross-entropy and Shannon entropy functions, as simplified forms of the
multinomial distribution subject to the Stirling approximation; (ii) an
explanation for the need to maximize entropy (or minimize cross-entropy) to
find the most probable realization; and (iii) new, generalized definitions of
entropy and cross-entropy (supersets of the Boltzmann principle) applicable
to non-multinomial systems. The combinatorial basis is therefore of much
broader scope, with far greater power of application, than the
information-theoretic and axiomatic bases. The generalized definitions underpin
a new discipline of "combinatorial information theory", for the analysis of
probabilistic systems of any type.
Jaynes' generic formulation of statistical mechanics for multinomial systems
is re-examined in light of the combinatorial approach. (abbreviated abstract)

Comment: 45 pp; 1 figure; REVTeX; updated version 5 (incremental changes)
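The derivation in point (i) can be written compactly. A standard form of the argument (notation here is not taken from the paper): for a multinomial system of $N$ trials with outcome counts $n_i$ drawn from source probabilities $q_i$, applying Stirling's approximation $\ln n! \approx n\ln n - n$ to the log-probability of a realization gives

$$
\frac{1}{N}\ln \mathbb{P}
= \frac{1}{N}\ln\!\left(\frac{N!}{\prod_i n_i!}\,\prod_i q_i^{n_i}\right)
\;\approx\; -\sum_i p_i \ln\frac{p_i}{q_i}
= -D(p \,\|\, q), \qquad p_i = \frac{n_i}{N},
$$

so maximizing the probability of a realization is, to leading order, minimizing the Kullback-Leibler cross-entropy; taking $q_i$ uniform recovers the Shannon entropy up to an additive constant.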
The identification of cellular automata
Although cellular automata (CA) have been widely studied as a class of spatio-temporal systems, very few investigators have studied how to identify the CA rules given observations of the patterns. A solution using a polynomial realization to describe the CA rule is reviewed in the present study, based on the application of an orthogonal least squares algorithm. Three new neighbourhood detection methods are then reviewed as important preliminary analysis procedures to reduce the complexity of the estimation. The identification of excitable media is discussed using simulation examples and real data sets, and a new method for the identification of hybrid CA is introduced.
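To make the identification problem concrete, here is a deliberately simple sketch: a truth-table tally that recovers a binary 1-D rule from observed (configuration, next configuration) pairs. This is a toy stand-in for the polynomial / orthogonal least squares identification discussed in the abstract, not the paper's method.

```python
def apply_rule(cells, rule):
    """Evolve a 1-D binary CA one step under a Wolfram rule number (0-255)."""
    n = len(cells)
    return [(rule >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1)
                      | cells[(i + 1) % n])) & 1 for i in range(n)]

def identify_rule(pairs):
    """Estimate the rule from (config, next_config) observations by tallying,
    for each of the 8 neighbourhoods, which output bit it produced."""
    votes = {k: [0, 0] for k in range(8)}
    for before, after in pairs:
        n = len(before)
        for i in range(n):
            key = (before[(i - 1) % n] << 2) | (before[i] << 1) | before[(i + 1) % n]
            votes[key][after[i]] += 1
    rule = 0
    for k in range(8):
        zeros, ones = votes[k]
        if ones > zeros:          # majority vote per neighbourhood
            rule |= 1 << k
    return rule

before = [0, 0, 0, 1, 0, 1, 1, 1]   # de Bruijn sequence: every 3-bit
                                     # neighbourhood occurs exactly once
after = apply_rule(before, 110)
print(identify_rule([(before, after)]))   # -> 110
```

Because the de Bruijn configuration exercises all eight neighbourhoods, one observation pair suffices here; with noisy or partial data the regression machinery reviewed in the paper becomes necessary.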
Direct Data-Driven Portfolio Optimization with Guaranteed Shortfall Probability
This paper proposes a novel methodology for optimal allocation of a portfolio of risky financial assets. Most existing methods that aim at compromising between portfolio performance (e.g., expected return) and its risk (e.g., volatility or shortfall probability) need some statistical model of the asset returns. This means that: (i) one needs to make rather strong assumptions on the market to elicit a return distribution, and (ii) the parameters of this distribution need to be somehow estimated, which is quite a critical aspect, since optimal portfolios will then depend on the way the parameters are estimated. Here we propose instead a direct, data-driven route to portfolio optimization that avoids both of these issues: the optimal portfolios are computed directly from historical data, by solving a sequence of convex optimization problems (typically, linear programs). More importantly, the resulting portfolios are theoretically backed by a guarantee that their expected shortfall is no larger than an a-priori assigned level. This result is obtained assuming efficiency of the market, under no hypotheses on the shape of the joint distribution of the asset returns, which can remain unknown and need not be estimated.
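The data-driven idea can be sketched in a few lines: score candidate portfolios directly on historical return scenarios and keep the best one whose empirical shortfall frequency stays under a cap. This toy grid search over two assets (with made-up return figures) stands in for the paper's convex programs and its probabilistic guarantee, which are not reproduced here.

```python
def empirical_shortfall(weights, scenarios, level):
    """Fraction of historical scenarios in which the portfolio return
    falls below `level` (a plug-in estimate of shortfall probability)."""
    returns = [sum(w * r for w, r in zip(weights, s)) for s in scenarios]
    return sum(r < level for r in returns) / len(returns)

# Toy data: two assets, returns over 8 historical periods (assumed figures).
scenarios = [(0.02, 0.05), (0.01, -0.03), (0.03, 0.08), (-0.01, -0.06),
             (0.02, 0.07), (0.00, -0.05), (0.01, 0.09), (0.02, -0.03)]

# Among mixes whose empirical shortfall probability (return below 0) is at
# most 25%, keep the one with the best average historical return.
best = None
for k in range(11):
    w = (k / 10, 1 - k / 10)
    if empirical_shortfall(w, scenarios, 0.0) <= 0.25:
        mean = sum(sum(wi * ri for wi, ri in zip(w, s))
                   for s in scenarios) / len(scenarios)
        if best is None or mean > best[1]:
            best = (w, mean)
print(best)   # the risk cap rules out the high-return but volatile asset 2
```

In the paper the search over weights is a linear program rather than a grid, and the shortfall bound holds a priori rather than only in-sample.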
Theoretical framework for quantum networks
We present a framework to treat quantum networks and all possible
transformations thereof, including as special cases all possible manipulations
of quantum states, measurements, and channels, such as, e.g., cloning,
discrimination, estimation, and tomography. Our framework is based on the
concepts of the quantum comb, which describes all transformations achievable by
a given quantum network, and the link product, the operation of connecting two
quantum networks. Quantum networks are treated both from a constructive point
of view, based on connections of elementary circuits, and from an axiomatic
one, based on a hierarchy of admissible quantum maps. In the axiomatic context
a fundamental property is shown, which we call universality of quantum memory
channels: any admissible transformation of quantum networks can be realized by
a suitable sequence of memory channels. The open problem of whether this
property fails for some nonquantum theory, e.g., for no-signaling boxes, is
posed.

Comment: 23 pages, REVTeX
A Minimum Relative Entropy Principle for Learning and Acting
This paper proposes a method to construct an adaptive agent that is universal
with respect to a given class of experts, where each expert is an agent that
has been designed specifically for a particular environment. This adaptive
control problem is formalized as the problem of minimizing the relative entropy
of the adaptive agent from the expert that is most suitable for the unknown
environment. If the agent is a passive observer, then the optimal solution is
the well-known Bayesian predictor. However, if the agent is active, then its
past actions need to be treated as causal interventions on the I/O stream
rather than normal probability conditions. Here it is shown that the solution
to this new variational problem is given by a stochastic controller called the
Bayesian control rule, which implements adaptive behavior as a mixture of
experts. Furthermore, it is shown that under mild assumptions, the Bayesian
control rule converges to the control law of the most suitable expert.

Comment: 36 pages, 11 figures
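The passive-observer case described above (the Bayesian predictor) is easy to sketch: maintain a posterior over a finite class of experts and predict with the posterior mixture. The sketch below uses Bernoulli experts with illustrative parameters; the active case, where past actions must be treated as interventions, is not modeled here.

```python
def update(posterior, thetas, obs):
    """One Bayes update of the posterior over experts after observing a bit.
    Expert k predicts the next bit is 1 with probability thetas[k]."""
    lik = [t if obs == 1 else 1 - t for t in thetas]
    unnorm = [p * l for p, l in zip(posterior, lik)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

def predict(posterior, thetas):
    """Mixture-of-experts predictive probability that the next bit is 1."""
    return sum(p * t for p, t in zip(posterior, thetas))

thetas = [0.1, 0.5, 0.9]          # three experts, each tuned to one regime
posterior = [1 / 3, 1 / 3, 1 / 3]  # uniform prior over experts
for obs in [1, 1, 1, 0, 1, 1]:     # data generated mostly of ones
    posterior = update(posterior, thetas, obs)
print(posterior)   # posterior mass concentrates on the 0.9 expert
```

As the abstract states for the full Bayesian control rule, under mild assumptions the mixture's weight concentrates on the expert most suitable for the environment, so the mixture behavior converges to that expert's.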
Combining Clustering techniques and Formal Concept Analysis to characterize Interestingness Measures
Formal Concept Analysis (FCA) is a data analysis method which enables the
discovery of hidden knowledge in data. One kind of hidden knowledge extracted
from data is association rules. Different quality measures have been reported
in the literature to extract only relevant association rules. Given a dataset,
choosing a good quality measure remains a challenging task for a user. Given a
matrix evaluating quality measures against semantic properties, this paper
describes how FCA can highlight quality measures with similar behavior in
order to help the user with this choice. The aim of this article is the
discovery of clusters of Interestingness Measures (IM), and the validation of
those found by the hierarchical (AHC) and partitioning (k-means) clustering
methods. Then, based on the theoretical study of sixty-one interestingness
measures according to nineteen properties, proposed in a recent study, FCA
reveals several groups of measures.

Comment: 13 pages, 2 figures
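The FCA machinery behind this kind of grouping can be shown on a toy context. The sketch below enumerates all formal concepts (extent, intent) of a small binary table of measures versus properties by brute-force closure; the measure names and incidences are illustrative, not taken from the paper's sixty-one-measure study.

```python
from itertools import combinations

def concepts(objects, attributes, incidence):
    """Enumerate all formal concepts (extent, intent) of a binary context
    by closing every attribute subset; fine for toy-sized contexts."""
    found = set()
    attrs = sorted(attributes)
    for r in range(len(attrs) + 1):
        for subset in combinations(attrs, r):
            # extent: objects having every attribute in the subset
            extent = frozenset(o for o in objects
                               if all((o, a) in incidence for a in subset))
            # intent: attributes shared by every object in the extent
            intent = frozenset(a for a in attributes
                               if all((o, a) in incidence for o in extent))
            found.add((extent, intent))
    return found

# Toy context: interestingness measures (objects) vs. semantic properties
# (attributes); incidences are illustrative only.
objects = {"lift", "confidence", "conviction"}
attributes = {"symmetric", "null-invariant"}
incidence = {("lift", "symmetric"), ("lift", "null-invariant"),
             ("confidence", "null-invariant")}
for extent, intent in sorted(concepts(objects, attributes, incidence),
                             key=lambda c: len(c[0])):
    print(sorted(extent), sorted(intent))
```

Each concept's extent is a candidate cluster of measures sharing the same properties, which is what the article compares against the AHC and k-means groupings.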