A Generalized Notion of Time for Modeling Temporal Networks
Most approaches for modeling and analyzing temporal networks do not explicitly discuss the underlying notion
of time. In this paper, we therefore introduce a generalized notion of time for temporal networks. Our
approach also allows for considering non-deterministic time and incomplete data, two issues that are often
found when analyzing data-sets extracted from online social networks, for example. In order to demonstrate
the consequences of our generalized notion of time, we also discuss the implications for the computation of
(shortest) temporal paths in temporal networks.
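As a point of reference for the (shortest) temporal-path computation discussed above, the following Python sketch computes earliest-arrival times under the usual deterministic, fully observed notion of time that the paper generalizes; the function name, the unit traversal delay, and the toy contact list are illustrative assumptions, not the authors' algorithm.

# Minimal sketch: earliest-arrival temporal paths from time-stamped contacts.
# Each contact is a triple (u, v, t) that can only be traversed at time t.
def earliest_arrival(contacts, source, t_start=0, delay=1):
    """Scan contacts in time order and relax earliest-arrival times."""
    arrival = {source: t_start}
    for u, v, t in sorted(contacts, key=lambda c: c[2]):
        if u in arrival and arrival[u] <= t:              # contact is reachable in time
            if t + delay < arrival.get(v, float("inf")):
                arrival[v] = t + delay                    # earlier arrival found
    return arrival

# Example: b is reached at time 3, c at time 6 (the (b, c, 1) contact comes too early to use).
contacts = [("a", "b", 2), ("b", "c", 1), ("b", "c", 5)]
print(earliest_arrival(contacts, "a"))                    # {'a': 0, 'b': 3, 'c': 6}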
Neural Networks retrieving Boolean patterns in a sea of Gaussian ones
Restricted Boltzmann Machines are key tools in Machine Learning and are
described by the energy function of bipartite spin-glasses. From a statistical
mechanical perspective, they share the same Gibbs measure as Hopfield networks
for associative memory. In this equivalence, the weights of the former play the
role of the patterns of the latter. Since Boltzmann machines usually require
real-valued weights in order to be trained with gradient-descent-like methods,
while Hopfield networks typically store binary patterns in order to retrieve
them, the investigation of a mixed Hebbian network, equipped with both real
(e.g., Gaussian) and discrete (e.g., Boolean) patterns, naturally arises. We
prove that, in the challenging regime of a high storage of real patterns, where
retrieval is forbidden, an extra load of Boolean patterns can still be
retrieved, as long as the ratio between the overall load and the network size
does not exceed a critical threshold, which turns out to be the same as in the
standard Amit-Gutfreund-Sompolinsky theory. Assuming replica symmetry, we study
the case of a low load of Boolean patterns by combining the stochastic-stability
and Hamilton-Jacobi interpolation techniques. The result can be extended to the
high-load case by a non-rigorous but standard replica computation argument.
Comment: 16 pages, 1 figure
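To make the retrieval setting concrete, here is a small numerical Python sketch (an assumption-laden illustration, not the paper's analytical argument): Hebbian couplings are built from both Gaussian and Boolean patterns, and zero-temperature dynamics checks that a stored Boolean pattern is recovered when the total load alpha = (K_real + K_bool)/N is small; sizes and the 5% noise level are arbitrary choices.

import numpy as np

rng = np.random.default_rng(0)
N, K_real, K_bool = 500, 10, 5                       # network size and pattern loads

xi_real = rng.standard_normal((K_real, N))           # real (Gaussian) patterns
xi_bool = rng.choice([-1.0, 1.0], size=(K_bool, N))  # discrete (Boolean) patterns

# Hebbian couplings pooling both families of patterns, with no self-couplings.
J = (xi_real.T @ xi_real + xi_bool.T @ xi_bool) / N
np.fill_diagonal(J, 0.0)

# Zero-temperature dynamics started from a noisy copy of one Boolean pattern.
target = xi_bool[0]
s = np.where(rng.random(N) < 0.05, -target, target)  # flip about 5% of the spins
for _ in range(20):
    s = np.sign(J @ s)                               # synchronous update
    s[s == 0] = 1.0

overlap = np.mean(s * target)                        # Mattis overlap with the target
print(f"overlap with stored Boolean pattern: {overlap:.3f}")  # close to 1 at low load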
Lurching Toward Chernobyl: Dysfunctions of Real-Time Computation
Cognitive biological structures, social organizations, and computing machines operating in real time are subject to Rate Distortion Theorem constraints driven by the homology between information source uncertainty and free energy density. This exposes the unitary structure/environment system to a relentless entropic torrent compounded by sudden large deviations causing increased distortion between intent and impact, particularly as demands escalate. The phase transitions characteristic of information phenomena suggest that, rather than graceful decay under increasing load, these structures will undergo punctuated degradation akin to spontaneous symmetry breaking in physical systems. Rate distortion problems, which also affect internal structural dynamics, can become synergistic with limitations equivalent to the inattentional blindness of natural cognitive processes. These mechanisms, and their interactions, are unlikely to scale well, so that, depending on architecture, enlarging the structure or its duties may lead to a crossover point at which added resources must be almost entirely devoted to ensuring system stability -- a form of allometric scaling familiar from biological examples. This suggests a critical need to tune architecture to problem type and system demand. A real-time computational structure and its environment are a unitary phenomenon, and environments are usually idiosyncratic. Thus the resulting path dependence in the development of pathology could often require an individualized approach to remediation more akin to an arduous psychiatric intervention than to the traditional engineering or medical quick fix. Failure to recognize the depth of these problems seems likely to produce a relentless chain of the Chernobyl-like failures that are necessary, but often insufficient, for remediation under our system.
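For reference, the rate-distortion constraint invoked here has a standard textbook form (a worked example, not taken from the abstract): the rate-distortion function is a constrained minimization of mutual information, which for a Gaussian source of variance $\sigma^2$ under squared-error distortion evaluates in closed form,
\[
R(D) \;=\; \min_{p(\hat{x}\mid x)\,:\,\mathbb{E}[d(X,\hat{X})]\le D} I(X;\hat{X}),
\qquad
R_{\mathrm{Gauss}}(D) \;=\; \tfrac{1}{2}\log_2\!\frac{\sigma^2}{D}, \quad 0 \le D \le \sigma^2 .
\]
It is this variational, free-energy-like form of $R(D)$ that the homology with free energy density alluded to above rests on.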
Gene Expression and its Discontents: Developmental disorders as dysfunctions of epigenetic cognition
Systems biology presently suffers the same mereological and sufficiency fallacies that haunt neural network models of high order cognition. Shifting perspective from the massively parallel space of gene matrix interactions to the grammar/syntax of the time series of expressed phenotypes using a cognitive paradigm permits import of techniques from statistical physics via the homology between information source uncertainty and free energy density. This produces a broad spectrum of possible statistical models of development and its pathologies in which epigenetic regulation and the effects of embedding environment are analogous to a tunable enzyme catalyst. A cognitive paradigm naturally incorporates memory, leading directly to models of epigenetic inheritance, as affected by environmental exposures, in the largest sense. Understanding gene expression, development, and their dysfunctions will require data analysis tools considerably more sophisticated than the present crop of simplistic models abducted from neural network studies or stochastic chemical reaction theory.