
    Encoding problems in logic synthesis


    SAT-based Analysis, (Re-)Configuration & Optimization in the Context of Automotive Product Documentation

    There is a growing trend toward mass customization, particularly in the area of automotive configuration. Mass customization leads to an enormous increase in complexity: there are hundreds of equipment options from which a customer can choose to assemble a personal car. For one German premium manufacturer, the number of distinct configurable cars for a single vehicle model can reach 10^80. SAT-based methods have become established for verifying the bill of materials of automotive configurations. Carsten Sinz pioneered SAT-based verification methods for Daimler AG in the mid-1990s. Building on this work, a productive software system was installed at Daimler AG after 2005. Other German car manufacturers later followed and likewise installed SAT-based systems to verify their bills of materials. This thesis consists of two main parts. The first part develops further SAT-based methods for automotive configuration. We show that SAT-based methods are well suited to interactive automotive configuration and address several of its aspects, including consistency checking, the generation of examples and explanations, and the prevention of misconfigurations. We also develop SAT-based methods for the verification of dynamic assemblies, where a dynamic assembly represents the chronological assembly order of complex parts. The second part deals with the optimization of automotive configurations. We explain and compare different propositional optimization problems as well as their algorithmic solution approaches. We describe use cases from automotive configuration and show how they can be formalized as propositional optimization problems. For example, given a set of desired equipment options, one wants to compute a test vehicle that adds a minimal set of further options in order to save costs. Furthermore, we address the problem of computing a smallest set of vehicles that covers a given test set. As part of this work, we developed a prototype (re-)configurator called AutoConfig. At its core, our (re-)configurator uses SAT-based methods, and it provides a graphical user interface that allows interactive configuration. AutoConfig can handle instances from three large German car manufacturers, but it is not restricted to them. With this prototype we demonstrate the applicability of our methods.
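
    The consistency checks and explanations described above map naturally onto incremental SAT solving with assumptions. Below is a minimal sketch of the idea, assuming the PySAT library and an invented toy rule set; it is not any manufacturer's actual rules, and not the AutoConfig implementation.

```python
# Toy consistency check for interactive configuration, assuming PySAT
# (https://pysathq.github.io). Variables model equipment options; the
# clauses are hypothetical product rules for illustration only.
from pysat.solvers import Glucose3

SUNROOF, TOW_HITCH, ROOF_RACK, SPORT_PKG = 1, 2, 3, 4

with Glucose3() as solver:
    # Rule: the sport package excludes the tow hitch.
    solver.add_clause([-SPORT_PKG, -TOW_HITCH])
    # Rule: a roof rack requires the tow hitch (say, for its wiring).
    solver.add_clause([-ROOF_RACK, TOW_HITCH])

    # Interactive step: the customer has picked SPORT_PKG and ROOF_RACK.
    picks = [SPORT_PKG, ROOF_RACK]
    if solver.solve(assumptions=picks):
        # A satisfying assignment doubles as an example vehicle.
        print("consistent, example car:", solver.get_model())
    else:
        # The unsatisfiable core explains which picks conflict.
        print("inconsistent, conflicting picks:", solver.get_core())
```

    Solving under assumptions rather than adding the picks as clauses keeps the solver incremental, so each interactive click re-uses the learned clauses from earlier checks.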

    Designing Accurate and Low-Cost Stochastic Circuits

    Stochastic computing (SC) is an unconventional computing approach that processes data represented by pseudo-random bit-streams called stochastic numbers (SNs). It enables arithmetic functions to be implemented by tiny, low-power logic circuits, and it is highly error-tolerant. These properties make SC practical for applications that need massive parallelism or operate in noisy environments where conventional binary designs are too costly or too unreliable. SC has recently come to be seen as an attractive choice for tasks such as biomedical image processing and decoding complex error-correcting codes. Despite its desirable properties, SC has features that limit its usefulness, including insufficient accuracy and an inadequate design theory. Accuracy is especially vulnerable to correlation among interacting SNs and to the random fluctuations inherent in SC’s data representation. This dissertation examines the major factors affecting accuracy using analytical and experimental approaches based on probability theory and circuit simulation, respectively. We devise methods to quantify the error effects in stochastic circuits by means of probabilistic transfer matrices and Bernoulli processes. These methods make it possible to compare the impact of errors on conventional and stochastic circuits under various conditions. We then analyze correlation in detail and show that correlation-induced errors can be reduced by the careful insertion of delay elements, a de-correlation technique called isolation. Noting that different logic functions can have the same stochastic behavior when constant SNs are applied to their inputs, we show how to partition logic functions into stochastic equivalence classes (SECs). We derive a procedure for identifying SECs and apply SEC concepts to the synthesis and optimization of stochastic circuits. While addition, subtraction, and multiplication have well-known and simple SC implementations, this is not true for division. We study stochastic division methods and propose a new type of stochastic divider that combines low cost with high accuracy. Finally, we turn to the design of general stochastic circuits and investigate a desirable property of SNs called monotonic progressive precision (MPP), whereby accuracy increases steadily with bit-stream length. We develop an SC design technique that produces results that are accurate and have good MPP. The dissertation concludes with some ideas for future research.

    PhD, Computer Science and Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies
    http://deepblue.lib.umich.edu/bitstream/2027.42/133255/1/tehsuan_1.pd
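
    The core arithmetic idea is easy to reproduce in simulation: a single AND gate multiplies two independent unipolar SNs, and correlation visibly breaks that behavior until a delay element is inserted. The following is a minimal sketch with arbitrary illustrative values, not code from the dissertation.

```python
# Demonstration of stochastic multiplication under the standard unipolar
# encoding, where a bit-stream's value is its fraction of 1s.
import random

def sn(value, length, rng):
    """Generate a stochastic number: each bit is 1 with probability `value`."""
    return [1 if rng.random() < value else 0 for _ in range(length)]

def estimate(bits):
    return sum(bits) / len(bits)

rng = random.Random(42)
N = 10_000
a, b = sn(0.5, N, rng), sn(0.6, N, rng)

# An AND gate multiplies independent unipolar SNs: P(a AND b) = 0.5 * 0.6.
prod = [x & y for x, y in zip(a, b)]
print("independent:", estimate(prod))      # close to 0.30

# Correlation breaks this: ANDing a stream with itself returns the stream
# (0.5, not 0.25). A one-bit delay de-correlates the inputs again, which
# is the intuition behind the isolation technique described above.
self_and = [x & y for x, y in zip(a, a)]
isolated = [x & y for x, y in zip(a, [0] + a[:-1])]
print("correlated:", estimate(self_and))   # about 0.50
print("isolated:",  estimate(isolated))    # close to 0.25
```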

    Introduction to Logic Circuits & Logic Design with VHDL

    The overall goal of this book is to fill a void that has appeared in the instruction of digital circuits over the past decade due to the rapid abstraction of system design. Up until the mid-1980s, digital circuits were designed using classical techniques, which relied heavily on manual design practices for the synthesis, minimization, and interfacing of digital systems. Corresponding to this design style, academic textbooks were developed that taught classical digital design techniques. Around 1990, large-scale digital systems began being designed using hardware description languages (HDLs) and automated synthesis tools, and broad-scale adoption of this modern design approach spread through the industry during that decade. Around 2000, hardware description languages and the modern digital design approach began to be taught in universities, mainly at the senior and graduate level. There were several reasons that the modern digital design approach did not penetrate the lower levels of academia during this time. First, the design and simulation tools were difficult to use and overwhelmed freshman and sophomore students. Second, implementing the designs in a laboratory setting was infeasible: the modern design tools at the time were targeted at custom integrated circuits, which are cost- and time-prohibitive to implement in a university setting. Between 2000 and 2005, rapid advances in programmable logic and design tools allowed the modern digital design approach to be implemented in a university setting, even in lower-level courses. This allowed students to learn the modern design approach based on HDLs and to prototype their designs in real hardware, mainly field-programmable gate arrays (FPGAs), and it spurred an abundance of textbooks teaching hardware description languages and higher levels of design abstraction. This trend has continued to today. While abstraction is a critical tool for engineering design, the rapid movement toward teaching only modern digital design techniques has left a void for freshman- and sophomore-level courses in digital circuitry. Legacy textbooks that teach the classical design approach are outdated and do not contain sufficient coverage of HDLs to prepare students for follow-on classes. Newer textbooks that teach the modern digital design approach move immediately into high-level behavioral modeling with minimal or no coverage of the underlying hardware used to implement the systems. As a result, students are not given the resources to understand the fundamental hardware theory that lies beneath the modern abstraction, such as interfacing, gate-level implementation, and technology optimization. Students who move too rapidly into high levels of abstraction have little understanding of what is going on when they click the “compile and synthesize” button of their design tool. This leads to graduates who can model a breadth of different systems in an HDL but lack depth in how the systems are implemented in hardware, which becomes problematic when an issue arises in a real design and there is no foundational knowledge to fall back on in order to debug the problem.

    Introduction to Logic Circuits & Logic Design with Verilog

    The overall goal of this book is to fill a void that has appeared in the instruction of digital circuits over the past decade due to the rapid abstraction of system design. Up until the mid-1980s, digital circuits were designed using classical techniques, which relied heavily on manual design practices for the synthesis, minimization, and interfacing of digital systems. Corresponding to this design style, academic textbooks were developed that taught classical digital design techniques. Around 1990, large-scale digital systems began being designed using hardware description languages (HDLs) and automated synthesis tools, and broad-scale adoption of this modern design approach spread through the industry during that decade. Around 2000, hardware description languages and the modern digital design approach began to be taught in universities, mainly at the senior and graduate level. There were several reasons that the modern digital design approach did not penetrate the lower levels of academia during this time. First, the design and simulation tools were difficult to use and overwhelmed freshman and sophomore students. Second, implementing the designs in a laboratory setting was infeasible: the modern design tools at the time were targeted at custom integrated circuits, which are cost- and time-prohibitive to implement in a university setting. Between 2000 and 2005, rapid advances in programmable logic and design tools allowed the modern digital design approach to be implemented in a university setting, even in lower-level courses. This allowed students to learn the modern design approach based on HDLs and to prototype their designs in real hardware, mainly field-programmable gate arrays (FPGAs), and it spurred an abundance of textbooks teaching hardware description languages and higher levels of design abstraction. This trend has continued to today. While abstraction is a critical tool for engineering design, the rapid movement toward teaching only modern digital design techniques has left a void for freshman- and sophomore-level courses in digital circuitry. Legacy textbooks that teach the classical design approach are outdated and do not contain sufficient coverage of HDLs to prepare students for follow-on classes. Newer textbooks that teach the modern digital design approach move immediately into high-level behavioral modeling with minimal or no coverage of the underlying hardware used to implement the systems. As a result, students are not given the resources to understand the fundamental hardware theory that lies beneath the modern abstraction, such as interfacing, gate-level implementation, and technology optimization. Students who move too rapidly into high levels of abstraction have little understanding of what is going on when they click the “compile and synthesize” button of their design tool. This leads to graduates who can model a breadth of different systems in an HDL but lack depth in how the systems are implemented in hardware, which becomes problematic when an issue arises in a real design and there is no foundational knowledge to fall back on in order to debug the problem.

    NASA SERC 1990 Symposium on VLSI Design

    This document contains papers presented at the first annual NASA Symposium on VLSI Design. NASA's involvement in this event demonstrates a need for research and development in high-performance computing, which addresses problems faced by the scientific and industrial communities. High-performance computing is needed for: (1) real-time manipulation of large data sets; (2) advanced systems control of spacecraft; (3) digital data transmission, error correction, and image compression; and (4) expert-system control of spacecraft. Clearly, a valuable technology for meeting these needs is Very Large Scale Integration (VLSI). This conference addresses the following issues in VLSI design: (1) system architectures; (2) electronics; (3) algorithms; and (4) CAD tools.

    Early Identification of Youth at Risk of Long-Term Emergency Homeless Shelter Use: An Evaluation of Interpretable Machine Learning Models

    Homelessness is a serious violation of one’s dignity, and youth who are long-term shelter users are particularly vulnerable members of an already vulnerable demographic. The commitment to prevent and eliminate homelessness, particularly among youth, is a shared responsibility. Programmes that aim to provide homeless people with permanent housing mostly select for support people who have already lived with the condition for an extended period of time. Allowing young people to remain homeless for an extended period before intervening exposes them to many kinds of hardship on the streets; early identification of youth at risk of becoming long-term shelter users is a proactive and more humane way of addressing the problem. Machine learning is brought forth as a tool to augment the expertise of shelter staff in identifying youth at risk of long-term shelter use. Machine learning algorithms are used to predict youth at risk of long-term shelter use from a client’s first 30, 60, 90, 120, or 180 days of shelter-access records. A real-time program-delivery approach was incorporated into the experiments as a supplement to other existing methods of fighting homelessness. Interpretable machine learning models capable of producing classification rules in disjunctive normal form (DNF) are evaluated, using both the level of control over the complexity of the generated rules and statistical evaluation metrics.
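
    As an illustration of how such DNF rules arise, the positive-class paths of a depth-limited decision tree read off directly as a DNF formula. The following is a hedged sketch using scikit-learn on synthetic placeholder data; the feature names are invented for illustration and are not the thesis’s actual features or models.

```python
# A depth-limited decision tree as an interpretable classifier whose
# positive paths form a DNF rule. Data and features are synthetic.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
# Hypothetical 30-day features: [stays, distinct_shelters, avg_gap_days]
X = rng.integers(0, 15, size=(500, 3)).astype(float)
# Synthetic label: "at risk" when early usage is frequent and clustered.
y = ((X[:, 0] > 8) & (X[:, 2] < 4)).astype(int)

# max_depth caps rule complexity, mirroring the control over generated
# rule complexity that the evaluation relies on.
clf = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(clf, feature_names=["stays", "shelters", "avg_gap"]))
# Each root-to-leaf path predicting class 1 is a conjunction of feature
# tests; the disjunction of those paths is the DNF classification rule.
```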

    36th International Symposium on Theoretical Aspects of Computer Science: STACS 2019, March 13-16, 2019, Berlin, Germany


    Logic Synthesis for Established and Emerging Computing

    Logic synthesis is an enabling technology for realizing integrated computing systems, and it entails solving computationally intractable problems through a plurality of heuristic techniques. A recent push toward further formalization of synthesis problems has proven very useful both for attempting to solve some logic problems exactly, which is computationally feasible today for instances of limited size, and for creating new and more powerful heuristics based on problem decomposition. Moreover, technological advances including nanodevices, optical computing, and quantum and quantum cellular computing require new and specific synthesis flows to assess feasibility and scalability. This review highlights recent progress in logic synthesis and optimization, describing models, data structures, and algorithms, with specific emphasis on both design quality and emerging technologies. Example applications and results of novel techniques applied to established and emerging technologies are reported.
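
    To make the point about exact synthesis concrete, a toy exhaustive search over 8-bit truth tables can find the minimum two-input gate count for a 3-input function almost instantly, while the same approach blows up rapidly with more inputs. This is a didactic sketch of the size limitation, not an algorithm from the review.

```python
# Exact synthesis by exhaustive search: minimum number of 2-input
# AND/OR/XOR gates (inverters free) for a 3-input function, with
# functions encoded as 8-bit truth tables.
MASK = 0xFF  # 2^3 input patterns
# Truth tables of inputs a, b, c over patterns 0..7 (bit i = value at i).
INPUTS = [0b10101010, 0b11001100, 0b11110000]

def closure(funcs):
    """All given functions plus their complements (inverters are free)."""
    return set(funcs) | {f ^ MASK for f in funcs}

def min_gates(target):
    # levels[n] holds functions computable with at most n gates.
    levels = [closure(INPUTS + [0x00])]  # 0 gates: inputs and constants
    if target in levels[0]:
        return 0
    for n in range(1, 8):
        new = set()
        # A new gate combines two functions whose gate counts sum to n-1.
        for i in range(n):
            for f in levels[i]:
                for g in levels[n - 1 - i]:
                    new |= closure([f & g, f | g, f ^ g])
        if target in new:
            return n
        levels.append(new)
    return None

MAJ = (INPUTS[0] & INPUTS[1]) | (INPUTS[0] & INPUTS[2]) | (INPUTS[1] & INPUTS[2])
print("minimum gates for 3-input majority:", min_gates(MAJ))
```

    The search terminates quickly here only because a mere 2^(2^3) = 256 truth tables exist on three inputs; at five or six inputs the same enumeration is already hopeless, which is why heuristics dominate synthesis in practice.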