Immunotronics - novel finite-state-machine architectures with built-in self-test using self-nonself differentiation
A novel approach to hardware fault tolerance is demonstrated that takes inspiration from the human immune system as a method of fault detection. The human immune system is a remarkable system of interacting cells and organs that protects the body from invasion and maintains reliable operation even in the presence of invading bacteria or viruses. This paper seeks to address the field of electronic hardware fault tolerance from an immunological perspective with the aim of showing how novel methods based upon the operation of the immune system can both complement and create new approaches to the development of fault detection mechanisms for reliable hardware systems. In particular, it is shown that by use of partial matching, as prevalent in biological systems, high fault coverage can be achieved with the added advantage of reducing memory requirements. The development of a generic finite-state-machine immunization procedure is discussed that allows any system that can be represented in such a manner to be "immunized" against the occurrence of faulty operation. This is demonstrated by the creation of an immunized decade counter that can detect the presence of faults in real time.
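The self-nonself idea above can be sketched in a few lines: treat the set of valid state transitions as "self" and flag any observed transition that fails to match it. The 4-bit state encoding, the r-contiguous-bits matching rule, and all names below are illustrative assumptions, not the paper's implementation.

```python
# Hypothetical sketch: immune-inspired fault detection for a decade counter.
# "Self" is the set of correct state transitions; an observed transition
# that matches no self string is "nonself" and flags a fault. Partial
# (r-contiguous-bits) matching trades exactness for smaller stored sets.

VALID = {(s, (s + 1) % 10) for s in range(10)}  # decade counter: 0..9, wrap

def encode(cur, nxt, bits=4):
    """Concatenate current and next state as one bit string."""
    return f"{cur:0{bits}b}{nxt:0{bits}b}"

def r_contiguous_match(a, b, r):
    """Partial match: true if a and b agree on r contiguous positions."""
    run = 0
    for x, y in zip(a, b):
        run = run + 1 if x == y else 0
        if run >= r:
            return True
    return False

SELF = [encode(s, t) for s, t in VALID]

def is_faulty(cur, nxt, r=7):
    """A transition is faulty if it partially matches no self string."""
    probe = encode(cur, nxt)
    return not any(r_contiguous_match(probe, s, r) for s in SELF)

assert not is_faulty(3, 4)   # valid increment: matches self
assert is_faulty(3, 7)       # invalid jump: detected as nonself
```

Lowering `r` makes detectors more tolerant (each self string covers more probes, so less memory is needed) at the cost of possibly missing faults near valid transitions.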
Clique of functional hubs orchestrates population bursts in developmentally regulated neural networks
It has recently been discovered that single neuron stimulation can impact
network dynamics in immature and adult neuronal circuits. Here we report a
novel mechanism which can explain in neuronal circuits, at an early stage of
development, the peculiar role played by a few specific neurons in
promoting/arresting the population activity. For this purpose, we consider a
standard neuronal network model, with short-term synaptic plasticity, whose
population activity is characterized by bursting behavior. The addition of
developmentally inspired constraints and correlations in the distribution of
the neuronal connectivities and excitabilities leads to the emergence of
functional hub neurons, whose stimulation/deletion is critical for the network
activity. Functional hubs form a clique, where a precise sequential activation
of the neurons is essential to ignite collective events without any need for a
specific topological architecture. Unsupervised time-lagged firings of
supra-threshold cells, in connection with coordinated entrainments of
near-threshold neurons, are the key ingredients to orchestrate the population bursts.
Comment: 39 pages, 15 figures, to appear in PLOS Computational Biology
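The "short-term synaptic plasticity" the model relies on can be illustrated with a standard Tsodyks-Markram-style depression rule for a single synapse; this toy sketch (parameter values and function names are illustrative, not the authors' model) shows how sustained firing depletes synaptic resources while pauses let them recover, the basic ingredient behind burst termination and re-ignition.

```python
import math

# Toy sketch of short-term synaptic depression (Tsodyks-Markram style).
# Each presynaptic spike releases a fraction U of the available resources x;
# x recovers toward 1 with time constant tau_rec (ms). The efficacy seen at
# a spike is U * x, so rapid firing depresses transmission.

def stp_efficacies(spike_times, U=0.5, tau_rec=800.0):
    """Return the synaptic efficacy U*x at each spike time (ms)."""
    x, last_t, out = 1.0, None, []
    for t in spike_times:
        if last_t is not None:
            dt = t - last_t
            x = 1.0 - (1.0 - x) * math.exp(-dt / tau_rec)  # exponential recovery
        out.append(U * x)
        x -= U * x          # release depletes the resource pool
        last_t = t
    return out

eff = stp_efficacies([0, 50, 100, 150, 1000])
assert eff[0] > eff[1] > eff[2] > eff[3]   # depression under rapid firing
assert eff[4] > eff[3]                     # recovery after a long pause
```

In the network setting, the recovered resources of the clique neurons gate whether a sequential activation can ignite a population burst.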
Limits on Fundamental Limits to Computation
An indispensable part of our lives, computing has also become essential to
industries and governments. Steady improvements in computer hardware have been
supported by periodic doubling of transistor densities in integrated circuits
over the last fifty years. Such Moore scaling now requires increasingly heroic
efforts, stimulating research in alternative hardware and stirring controversy.
To help evaluate emerging technologies and enrich our understanding of
integrated-circuit scaling, we review fundamental limits to computation: in
manufacturing, energy, physical space, design and verification effort, and
algorithms. To outline what is achievable in principle and in practice, we
recall how some limits were circumvented and compare loose limits with tight ones. We
also point out that engineering difficulties encountered by emerging
technologies may indicate yet-unknown limits.
Comment: 15 pages, 4 figures, 1 table
Conditional Sum-Product Networks: Imposing Structure on Deep Probabilistic Architectures
Probabilistic graphical models are a central tool in AI; however, they are
generally not as expressive as deep neural models, and inference is notoriously
hard and slow. In contrast, deep probabilistic models such as sum-product
networks (SPNs) capture joint distributions in a tractable fashion, but still
lack the expressive power of intractable models based on deep neural networks.
Therefore, we introduce conditional SPNs (CSPNs), conditional density
estimators for multivariate and potentially hybrid domains which allow
harnessing the expressive power of neural networks while still maintaining
tractability guarantees. One way to implement CSPNs is to use an existing SPN
structure and condition its parameters on the input, e.g., via a deep neural
network. This approach, however, might misrepresent the conditional
independence structure present in data. Consequently, we also develop a
structure-learning approach that derives both the structure and parameters of
CSPNs from data. Our experimental evidence demonstrates that CSPNs are
competitive with other probabilistic models and yield superior performance on
multilabel image classification compared to mean field and mixture density
networks. Furthermore, they can successfully be employed as building blocks for
structured probabilistic models, such as autoregressive image models.
Comment: 13 pages, 6 figures
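The first CSPN variant mentioned above, keeping a fixed SPN structure and letting a neural network emit its parameters, can be sketched on a toy scale. Everything below (network sizes, the two-component structure, the function names) is an illustrative assumption, not the paper's architecture: a sum node mixes two product nodes, each factorizing a 2-D target y over Gaussian leaves, with mixture weights and leaf parameters produced by a small MLP on the conditioning input x.

```python
import numpy as np

# Toy conditional SPN: fixed structure, input-dependent parameters.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 3)), rng.normal(size=8)
W2, b2 = rng.normal(size=(10, 8)), rng.normal(size=10)  # 2 weights + 2*2 means + 2*2 stds

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def log_gauss(y, mu, sig):
    return -0.5 * np.log(2 * np.pi * sig**2) - (y - mu) ** 2 / (2 * sig**2)

def cspn_logpdf(y, x):
    """log p(y | x): an MLP on x produces all SPN parameters."""
    h = np.tanh(W1 @ x + b1)
    theta = W2 @ h + b2
    w = softmax(theta[:2])                    # sum-node mixture weights
    mu = theta[2:6].reshape(2, 2)             # Gaussian leaf means per component
    sig = np.exp(theta[6:10]).reshape(2, 2)   # leaf std-devs (kept positive)
    # each product node factorises y over its Gaussian leaves
    comp = [log_gauss(y, mu[k], sig[k]).sum() for k in range(2)]
    return np.logaddexp(np.log(w[0]) + comp[0], np.log(w[1]) + comp[1])

lp = cspn_logpdf(np.zeros(2), np.ones(3))
assert np.isfinite(lp)
```

Because the structure is a valid SPN for every x, exact conditional likelihoods remain tractable; the structure-learning variant in the paper instead derives the structure itself from data.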
Neural development features: Spatio-temporal development of the Caenorhabditis elegans neuronal network
The nematode Caenorhabditis elegans, with information on neural connectivity,
three-dimensional position and cell lineage, provides a unique system for
understanding the development of neural networks. Although C. elegans has been
widely studied in the past, we present the first statistical study from a
developmental perspective, with findings that raise interesting suggestions on
the establishment of long-distance connections and network hubs. Here, we
analyze the neuro-development for temporal and spatial features, using birth
times of neurons and their three-dimensional positions. Comparisons of growth
in C. elegans with random spatial network growth highlight two findings
relevant to neural network development. First, most neurons which are linked by
long-distance connections are born around the same time and early on,
suggesting the possibility of early contact or interaction between connected
neurons during development. Second, early-born neurons are more highly
connected (tendency to form hubs) than later born neurons. This indicates that
the longer time frame available to them might underlie high connectivity. Both
outcomes are not observed for random connection formation. The study finds that
around one-third of electrically coupled long-range connections are late
forming, raising the question of what mechanisms are involved in ensuring their
accuracy, particularly in light of the extremely invariant connectivity
observed in C. elegans. In conclusion, the sequence of neural network
development highlights the possibility of early contact or interaction in
securing long-distance and high-degree connectivity.
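The second finding, that early-born neurons tend to become hubs, is the kind of claim one checks with a rank correlation between birth time and degree. The sketch below runs that test on made-up data whose generative rule mirrors the finding; the real analysis uses the published C. elegans birth times and connectome, and none of these numbers come from the study.

```python
import numpy as np

# Synthetic stand-in data: earlier birth -> higher expected degree.
rng = np.random.default_rng(1)
n = 100
birth = rng.uniform(0, 800, size=n)                  # fake birth times (min)
degree = rng.poisson(2 + 20 * (800 - birth) / 800)   # toy degree distribution

def spearman(a, b):
    """Spearman rank correlation via Pearson correlation of ranks."""
    ra, rb = a.argsort().argsort(), b.argsort().argsort()
    return np.corrcoef(ra, rb)[0, 1]

rho = spearman(birth, degree)
assert rho < 0   # early-born neurons have higher degree in this toy data
```

A negative rank correlation alone cannot separate the "longer time frame" explanation from other growth rules, which is why the study compares against random spatial network growth.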
Algorithmic Aspects of Cyclic Combinational Circuit Synthesis
Digital circuits are called combinational if they are memoryless: their outputs depend only on the current values of the inputs. Combinational circuits are generally thought of as acyclic (i.e., feed-forward) structures. And yet, cyclic circuits can be combinational. Cycles sometimes occur in designs synthesized from high-level descriptions, as well as in bus-based designs [16]. Feedback in such cases is carefully contrived, typically occurring when functional units are connected in a cyclic topology. Although the premise of cycles in combinational circuits has been accepted, and analysis techniques have been proposed [7], no one has attempted the synthesis of circuits with feedback at the logic level.
We have argued the case for a paradigm shift in combinational circuit design [10]. We should no longer think of combinational logic as acyclic in theory or in practice, since most combinational circuits are best designed with cycles. We have proposed a general methodology for the synthesis of multilevel networks with cyclic topologies and incorporated it in a general logic synthesis environment. In trials, benchmark circuits were optimized significantly, with improvements of up to 30% in area. In this paper, we discuss algorithmic aspects of cyclic circuit design. We formulate a symbolic framework for analysis based on a divide-and-conquer strategy. Unlike previous approaches, our method does not require ternary-valued simulation. Our analysis for combinationality is tightly coupled with the synthesis phase, in which we assemble a combinational network from smaller combinational components. We discuss the underpinnings of the heuristic search methods and present examples as well as synthesis results for benchmark circuits.
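A small example makes "cyclic yet combinational" concrete, using the classic ternary fixed-point check that the paper's symbolic method improves upon (the paper itself avoids ternary simulation; this sketch, with illustrative unit choices, shows only the phenomenon). Two functional units F and G share hardware in a cycle; a control bit c routes data so the circuit computes G(F(x)) or F(G(x)), and every total input assignment settles to a definite value.

```python
# Ternary (three-valued) fixed-point evaluation of a cyclic circuit.
X = None  # third value: "unknown"

def mux(s, a, b):              # three-valued 2:1 mux: s ? a : b
    if s == 1: return a
    if s == 0: return b
    return a if a == b else X  # unknown select: defined only if branches agree

def t_not(a):
    return X if a is X else 1 - a

F = t_not  # functional unit F (an inverter, purely for illustration)
G = t_not  # functional unit G (an inverter, purely for illustration)

def evaluate(c, x):
    """Iterate the cyclic netlist to a fixed point from all-unknown wires."""
    inF = outF = inG = outG = X
    for _ in range(10):        # more than enough iterations to converge
        inF  = mux(c, x, outG)
        outF = F(inF)
        inG  = mux(c, outF, x)
        outG = G(inG)
    return mux(c, outG, outF)  # y = c ? G(F(x)) : F(G(x))

# Every total input assignment yields a definite output -> combinational,
# despite the structural cycle outG -> inF -> outF -> inG -> outG.
for c in (0, 1):
    for x in (0, 1):
        assert evaluate(c, x) is not X
```

For either value of c, one mux breaks the cycle, so the feedback never creates memory; a naive acyclic implementation would instead need two copies each of F and G, which is the area saving cyclic synthesis exploits.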