LOT: Logic Optimization with Testability - new transformations for logic synthesis
A new approach to optimizing multilevel logic circuits is introduced. Given a multilevel circuit, the synthesis method optimizes its area while simultaneously enhancing its random-pattern testability. The method is based on structural transformations at the gate level. New transformations involving EX-OR gates as well as Reed-Muller expansions have been introduced into the synthesis of multilevel circuits. This method is augmented with transformations that specifically enhance random-pattern testability while reducing area. Testability enhancement is an integral part of our synthesis methodology. Experimental results show that the proposed methodology not only achieves lower area than other similar tools, but also better testability than available testability enhancement tools such as tstfx. Specifically, for the ISCAS-85 benchmark circuits, EX-OR gate-based transformations were observed to contribute successfully toward generating smaller circuits compared to other state-of-the-art logic optimization tools.
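The positive-polarity Reed-Muller (EX-OR based) form the abstract refers to can be computed from a function's truth table with the standard GF(2) butterfly transform. The paper's actual transformation engine is not described here; this is only a minimal illustrative sketch of the expansion itself:

```python
def anf_coefficients(truth_table):
    """Positive-polarity Reed-Muller (algebraic normal form) coefficients
    of a Boolean function given as a truth table of length 2^n, computed
    with the in-place GF(2) butterfly (Moebius) transform.
    Coefficient index bits name the monomial: index 0b10 -> variable a, etc."""
    coeffs = list(truth_table)
    n = len(coeffs).bit_length() - 1
    step = 1
    for _ in range(n):
        for i in range(0, len(coeffs), 2 * step):
            for j in range(i, i + step):
                coeffs[j + step] ^= coeffs[j]  # XOR cofactors over GF(2)
        step *= 2
    return coeffs

# f(a, b) = NAND(a, b), truth table indexed by 2*a + b:
print(anf_coefficients([1, 1, 1, 0]))  # [1, 0, 0, 1], i.e. f = 1 XOR a*b
```

The transform is an involution, so applying it twice recovers the truth table, which makes it a convenient starting point for EX-OR oriented restructuring.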
Boolean Satisfiability in Electronic Design Automation
Boolean Satisfiability (SAT) is often used as the underlying model for a significant and increasing number of applications in Electronic Design Automation (EDA), as well as in many other fields of Computer Science and Engineering. In recent years, new and efficient SAT algorithms have been developed, allowing much larger problem instances to be solved. SAT "packages" are currently expected to have an impact on EDA applications similar to that of BDD packages since their introduction more than a decade ago. This tutorial paper introduces the EDA professional to the Boolean satisfiability problem. Specifically, we highlight the use of SAT models to formulate a number of EDA problems in such diverse areas as test pattern generation, circuit delay computation, logic optimization, combinational equivalence checking, bounded model checking, and functional test vector generation, among others. In addition, we provide an overview of the algorithmic techniques commonly used for solving SAT, including those that have seen widespread use in specific EDA applications. We categorize these techniques, indicating which have been shown to be best suited for which tasks.
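One of the listed applications, combinational equivalence checking, is a compact example of a SAT formulation: build a "miter" in which both circuits share inputs, their outputs feed an XOR, and the XOR output is asserted true; the circuits are equivalent exactly when the resulting CNF is unsatisfiable. A toy sketch with a minimal DPLL solver (real EDA flows use industrial conflict-driven solvers, not this):

```python
def dpll(clauses, assignment=frozenset()):
    """Tiny recursive DPLL SAT solver. Clauses are lists of nonzero ints
    (DIMACS convention: v means variable v is true, -v means false).
    Returns a satisfying set of literals, or None if unsatisfiable."""
    simplified = []
    for clause in clauses:
        if any(lit in assignment for lit in clause):
            continue                      # clause already satisfied
        reduced = [lit for lit in clause if -lit not in assignment]
        if not reduced:
            return None                   # empty clause: conflict
        simplified.append(reduced)
    if not simplified:
        return assignment                 # every clause satisfied
    for clause in simplified:             # unit propagation
        if len(clause) == 1:
            return dpll(simplified, assignment | {clause[0]})
    lit = simplified[0][0]                # branch on a free literal
    return (dpll(simplified, assignment | {lit})
            or dpll(simplified, assignment | {-lit}))

# Miter for two implementations of AND(a, b): vars a=1, b=2, f=3, g=4, m=5.
miter = [[-3, 1], [-3, 2], [3, -1, -2],   # f <-> a AND b (Tseitin clauses)
         [-4, 1], [-4, 2], [4, -1, -2],   # g <-> a AND b
         [-5, 3, 4], [-5, -3, -4],        # m <-> f XOR g
         [5, 3, -4], [5, -3, 4],
         [5]]                             # assert the outputs differ
print(dpll(miter))  # None: no distinguishing input exists, so f and g are equivalent
```

Any satisfying assignment of a miter CNF would be a counterexample input on which the two circuits disagree; unsatisfiability is the proof of equivalence.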
Approach to the Organisational Complexity in Terms of Network and Intellectual Capital Concepts
The viability of systems depends on how their internal complexity adapts to the complexity of the environment. In structural terms, any complex system is a network, and its complexity may be estimated from the density and the non-redundant character of that network. The capacity of networks to create and diffuse knowledge is essential. By comparing the speed of change in the environment with the knowledge-processing speed of the system, we can determine the maximum complexity that can be absorbed. A close proxy for internal complexity is the level of human and structural capital, while external complexity may be expressed by means of relational capital. Keywords: requisite variety, endogenous complexity, exogenous complexity, network, intellectual capital
Selection, tinkering and emergence in complex networks: crossing the land of tinkering
Complex biological networks have very different origins than technological ones. The latter involve extensive design and, as engineered structures, a high level of optimization. The former involve (in principle) contingency and structural constraints, with new structures being incorporated through tinkering with previously evolved modules or units. However, the topological features observed in different biological nets suggest that nature has a limited repertoire of "attractors" that essentially optimize communication under basic constraints of cost and architecture, or that allow the biological nets to reach a high degree of homeostasis. Conversely, the topological features exhibited by some technology graphs indicate that tinkering and internal constraints play a key role, in spite of the "designed" nature of these structures. Previously suggested scenarios for the overall trends of evolution are re-analyzed in light of these topological patterns.
VSS: a VHDL synthesis system
This report describes a register-transfer synthesis system that allows a designer to interact with the design process. The designer can modify the compiled design by changing the input description, selecting optimization and mapping strategies, or graphically editing the generated design schematic. The VHDL language is used for both input and output descriptions. An intermediate representation that incorporates signal typing and component attributes simplifies compilation and facilitates design optimization. The compilation process consists of two phases: first, a design composed of generic components is synthesized from the input description; second, this design is translated into components from a particular library by a mapper and optimized by a logic optimizer. Retargeting to new technologies can be accomplished by changing only the component library.
Software Corrections of Vocal Disorders
We discuss how vocal disorders can be post-corrected via a simple nonlinear noise reduction scheme. This work is motivated by the need for a better understanding of voice dysfunctions, which would entail a twofold advantage for affected patients: physicians can perform better surgical interventions, and researchers can try to build devices that improve voice quality, e.g. in a phone conversation, avoiding any surgical treatment. As a first step, a proper signal classification is performed, based on the idea of geometric signal separation in a feature space. Then, by analysing the different regions populated by samples from healthy people and from patients affected by T1A glottis cancer, one can determine which kinds of intervention are needed to correct the disorder, i.e. to move the corresponding feature vector from the sick region to the healthy one. We discuss such a filter and show its performance. Comment: accepted for publication in Computer Methods and Programs in Biomedicine
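The idea of geometric signal separation, assigning a sample to the "healthy" or "sick" region of a feature space and correcting by moving its feature vector, can be sketched with a nearest-centroid classifier. The feature values below are invented for illustration (the paper's actual features and regions are not reproduced here):

```python
def centroid(points):
    """Component-wise mean of a list of equal-length feature vectors."""
    n = len(points)
    return [sum(p[i] for p in points) / n for i in range(len(points[0]))]

def classify(sample, centroids):
    """Label a feature vector with the class of the nearest centroid
    (squared Euclidean distance)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(sample, centroids[label]))

# Invented 2-D acoustic features (jitter- and shimmer-like measures);
# the real feature space would be higher-dimensional.
healthy = [[0.10, 0.20], [0.20, 0.10], [0.15, 0.15]]
pathological = [[0.80, 0.90], [0.90, 0.80], [0.85, 0.85]]
regions = {"healthy": centroid(healthy), "pathological": centroid(pathological)}

print(classify([0.18, 0.12], regions))  # healthy
# "Correction" = pulling a sick sample's feature vector toward the healthy region:
corrected = [0.25 * x + 0.75 * h
             for x, h in zip([0.85, 0.85], regions["healthy"])]
print(classify(corrected, regions))     # healthy
```

The correction step above is only the geometric intuition; the paper performs it with a nonlinear noise reduction filter rather than a linear pull toward a centroid.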
Neuronal assembly dynamics in supervised and unsupervised learning scenarios
The dynamic formation of groups of neurons (neuronal assemblies) is believed to mediate cognitive phenomena at many levels, but their detailed operation and mechanisms of interaction are still to be uncovered. One hypothesis suggests that synchronized oscillations underpin their formation and functioning, with a focus on the temporal structure of neuronal signals. In this context, we investigate neuronal assembly dynamics in two complementary scenarios: the first, a supervised spike-pattern classification task, in which noisy variations of a collection of spikes have to be correctly labeled; the second, an unsupervised, minimally cognitive evolutionary robotics task, in which an evolved agent has to cope with multiple, possibly conflicting, objectives. In both cases, the more traditional dynamical analysis of the system's variables is paired with information-theoretic techniques in order to get a broader picture of the ongoing interactions with and within the network. The neural network model is inspired by the Kuramoto model of coupled phase oscillators and allows one to fine-tune the network synchronization dynamics and assembly configuration. The experiments explore the computational power, redundancy, and generalization capability of neuronal circuits, demonstrating that performance depends nonlinearly on the number of assemblies and neurons in the network and showing that the framework can be exploited to generate minimally cognitive behaviors, with dynamic assembly formation accounting for varying degrees of stimulus modulation of the sensorimotor interactions.
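The Kuramoto model that inspires the network can be simulated in a few lines: each oscillator's phase drifts at its natural frequency plus a mean-field sine coupling, and the order parameter r measures synchronization (r near 1 means the population has locked, the regime associated with assembly formation). A minimal sketch, not the paper's actual network model, with illustrative parameter values:

```python
import cmath
import math
import random

def kuramoto_step(phases, omegas, coupling, dt):
    """One Euler step of the Kuramoto model:
    dtheta_i/dt = omega_i + (K/N) * sum_j sin(theta_j - theta_i)."""
    n = len(phases)
    return [theta + dt * (omega + coupling / n *
                          sum(math.sin(other - theta) for other in phases))
            for theta, omega in zip(phases, omegas)]

def order_parameter(phases):
    """Magnitude r of (1/N) * sum_j exp(i * theta_j); r = 1 is full sync."""
    return abs(sum(cmath.exp(1j * t) for t in phases) / len(phases))

random.seed(0)
n = 20
phases = [random.uniform(0, 2 * math.pi) for _ in range(n)]  # random start
omegas = [random.gauss(0, 0.1) for _ in range(n)]            # narrow frequency spread
for _ in range(2000):
    phases = kuramoto_step(phases, omegas, coupling=2.0, dt=0.05)
print(order_parameter(phases) > 0.9)  # strong coupling drives near-full sync
```

Lowering the coupling below the critical value (or widening the frequency spread) leaves the population incoherent, which is the kind of synchronization knob the abstract says the model exposes for tuning assembly configuration.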
Fault-tolerant computer study
A set of building-block circuits is described which can be used with commercially available microprocessors and memories to implement fault-tolerant distributed computer systems. Each building-block circuit is intended for VLSI implementation as a single chip. Several building blocks and associated processor and memory chips form a self-checking computer module with self-contained input/output and interfaces to redundant communication buses. Fault tolerance is achieved by connecting self-checking computer modules into a redundant network in which backup buses and computer modules are provided to circumvent failures. The requirements and design methodology that led to the definition of the building-block circuits are discussed.
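The masking side of such redundant networks reduces, at its simplest, to majority voting over the outputs of replicated modules. A sketch of that voting idea only (the report's actual building blocks are hardware self-checking circuits, not software voters):

```python
from collections import Counter

def majority_vote(outputs):
    """Vote over the outputs of redundant modules: return the value produced
    by a strict majority, or None if faults left no majority (the error is
    detected but cannot be masked)."""
    value, count = Counter(outputs).most_common(1)[0]
    return value if count * 2 > len(outputs) else None

print(majority_vote([42, 42, 7]))  # 42: one faulty module is masked
print(majority_vote([1, 2, 3]))    # None: disagreement detected, not masked
```

With triple modular redundancy any single faulty module is outvoted; two simultaneous faults producing different wrong values are still detected, because no strict majority remains.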
Theory of reliable systems
An attempt was made to refine the current notion of system reliability by identifying and investigating attributes of a system that are important to reliability considerations. Techniques that facilitate the analysis of system reliability are included. Special attention was given to the fault-tolerance, diagnosability, and reconfigurability characteristics of systems.