    DeSyRe: on-Demand System Reliability

    The DeSyRe project builds on-demand adaptive and reliable Systems-on-Chips (SoCs). As fabrication technology scales down, chips are becoming less reliable, thereby incurring increased power and performance costs for fault tolerance. To make matters worse, power density is becoming a significant limiting factor in SoC design in general. In the face of such changes in the technological landscape, current solutions for fault tolerance are expected to introduce excessive overheads in future systems. Moreover, attempting to design and manufacture a totally defect- and fault-free system would heavily, even prohibitively, impact the design, manufacturing, and testing costs, as well as the system performance and power consumption. In this context, DeSyRe delivers a new generation of systems that are reliable by design at well-balanced power, performance, and design costs. To reduce the overheads of fault tolerance, only a small fraction of the chip is built to be fault-free. This fault-free part is then employed to manage the remaining fault-prone resources of the SoC. The DeSyRe framework is applied to two medical systems with high safety requirements (measured against the IEC 61508 functional safety standard) and tight power and performance constraints.
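
    The division of labour described above, between a small fault-free part and the fault-prone rest of the chip, can be pictured with a toy scheduler. The Haskell sketch below is purely illustrative and assumes nothing about DeSyRe's actual framework: a hypothetical manager keeps a health map of the fault-prone resources and assigns work only to those currently diagnosed as healthy.

```haskell
-- Toy sketch (not DeSyRe's framework): a trusted, fault-free manager
-- remaps tasks away from fault-prone resources diagnosed as faulty.
import qualified Data.Map as Map

data Health = Healthy | Faulty deriving (Eq, Show)

type ResourceId = Int
type Task       = String

-- The manager's view of the fault-prone fabric.
type FabricState = Map.Map ResourceId Health

-- Assign each task to a healthy resource, round-robin; tasks are left
-- unassigned (Nothing) if no healthy resource remains.
schedule :: FabricState -> [Task] -> [(Task, Maybe ResourceId)]
schedule fabric tasks
  | null healthy = [(t, Nothing) | t <- tasks]
  | otherwise    = zip tasks (map Just (cycle healthy))
  where
    healthy = [r | (r, Healthy) <- Map.toList fabric]

main :: IO ()
main = do
  let fabric = Map.fromList [(0, Healthy), (1, Faulty), (2, Healthy)]
  mapM_ print (schedule fabric ["fft", "filter", "logger"])
```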

    Testing mixed-signal cores: a practical oscillation-based test in an analog macrocell

    A formal set of design decisions can aid in using oscillation-based test (OBT) for analog subsystems in SoCs. The goal is to offer designers testing options that do not incur significant area overhead, performance degradation, or test time. This work shows that OBT is a potential candidate for IP providers to use in combination with functional test techniques. We show how to modify the basic concept of OBT into a practical method. Our approach lets designers apply OBT today, paves the way for future developments in SoC testing, and extends readily to BIST.

    Finding The Lazy Programmer's Bugs

    Traditionally, developers and testers created huge numbers of explicit tests, enumerating interesting cases, perhaps biased by what they believed to be the current boundary conditions of the function being tested. Or at least, they were supposed to. A major step forward was the development of property testing. Property testing requires the user to write a few functional properties that are used to generate tests, and requires an external library or tool to create test data for the tests. As such, many thousands of tests can be created for a single property. For the purely functional programming language Haskell there are several such libraries; for example QuickCheck [CH00], SmallCheck and Lazy SmallCheck [RNL08]. Unfortunately, property testing still requires the user to write explicit properties. Fortunately, we note there are already many implicit tests present in programs. Developers may throw assertion errors, or the compiler may silently insert runtime exceptions for incomplete pattern matches. We attempt to automate the testing process using these implicit tests. Our contributions are in four main areas: (1) We have developed algorithms to automatically infer appropriate constructors and functions needed to generate test data, without requiring additional programmer work or annotations. (2) To combine the constructors and functions into test expressions, we take advantage of Haskell's lazy evaluation semantics by applying the techniques of needed narrowing and lazy instantiation to guide generation. (3) We keep the type of test data at its most general, in order to prevent committing too early to monomorphic types that cause needless wasted tests. (4) We have developed novel ways of creating Haskell case expressions to inspect elements inside returned data structures, in order to discover exceptions that may be hidden by laziness, and to make our test data generation algorithm more expressive. To validate our claims, we have implemented these techniques in Irulan, a fully automatic tool for generating systematic black-box unit tests for Haskell library code. We have designed Irulan to generate high-coverage test suites and detect common programming errors in the process.
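
    To make the contrast between explicit and implicit tests concrete, here is a small illustrative Haskell sketch (assumed example code, not taken from Irulan; it only uses QuickCheck, which the abstract already cites). An explicit property still has to be written by hand, whereas the partial function next to it carries an implicit test: the compiler silently inserts a runtime pattern-match failure that automatic argument generation can expose.

```haskell
import Control.Exception (SomeException, evaluate, try)
import Test.QuickCheck (quickCheck)

-- An explicit property: the user must think of it and write it down.
prop_reverseTwice :: [Int] -> Bool
prop_reverseTwice xs = reverse (reverse xs) == xs

-- An implicit test: the incomplete pattern match means GHC inserts a
-- runtime exception for the empty list, no annotation required.
firstSecond :: [(Int, Int)] -> Int
firstSecond ((_, y) : _) = y

main :: IO ()
main = do
  quickCheck prop_reverseTwice
  -- Simulate what systematic generation would discover by trying the
  -- "interesting" argument []: the hidden exception surfaces.
  r <- try (evaluate (firstSecond [])) :: IO (Either SomeException Int)
  print r
```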

    The Detection of Defects in a Niobium Tri-layer Process

    Niobium (Nb) LTS processes are emerging as the technology for future ultra-high-speed systems, especially in the digital domain. As the number of Josephson junctions (JJs) per chip has recently increased to around 90,000, the quality of the process has to be assured so as to realize these complex circuits. Until now, very little or no information has been available in the literature on how to achieve this. In this paper we present an approach and the results of a study conducted on an RSFQ process. Measurements and SEM inspection were carried out on sample chips, and a list of possible defects has been identified and described in detail. We have also developed test structures for detecting the top-ranking defects, which will be used for yield analysis and for determining the probability distribution of faults in the process. A test chip has been designed based on the results of this study, and certain types of defects were introduced in the design to study the behavior of faulty junctions and interconnections.

    Towards a C-function in 4D quantum gravity

    We develop a generally applicable method for constructing functions, C, which have properties similar to Zamolodchikov's C-function, and are geometrically natural objects related to the theory space explored by non-perturbative functional renormalization group (RG) equations. Employing the Euclidean framework of the Effective Average Action (EAA), we propose a C-function which can be defined for arbitrary systems of gravitational, Yang-Mills, ghost, and bosonic matter fields, and in any number of spacetime dimensions. It becomes stationary both at critical points and in classical regimes, and decreases monotonically along RG trajectories provided the breaking of the split-symmetry which relates background and quantum fields is sufficiently weak. Within the Asymptotic Safety approach we test the proposal for Quantum Einstein Gravity in d > 2 dimensions, performing detailed numerical investigations in d = 4. We find that the bi-metric Einstein-Hilbert truncation of theory space introduced recently is general enough to yield perfect monotonicity along the RG trajectories, while its more familiar single-metric analog fails to achieve this behavior which we expect on general grounds. Investigating generalized crossover trajectories connecting a fixed point in the ultraviolet to a classical regime with positive cosmological constant in the infrared, the C-function is shown to depend on the choice of the gravitational instanton which constitutes the background spacetime. For de Sitter space in 4 dimensions, the Bekenstein-Hawking entropy is found to play a role analogous to the central charge in conformal field theory. We also comment on the idea of a `Λ-N connection' and the `N-bound' discussed earlier.
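
    As a reading aid, the properties that the abstract ascribes to the proposed C-function can be written schematically as follows. This is generic notation only (coarse-graining scale k, flow running from the ultraviolet towards the infrared), not the paper's actual EAA-based construction, and the monotonicity statement is subject to the split-symmetry proviso mentioned above.

```latex
% Schematic paraphrase of the abstract, not the paper's construction of C_k.
\begin{align}
  C_{k'} \;\le\; C_{k}
    &\qquad \text{for } k' < k \text{ along the flow toward the infrared,} \\
  k\,\partial_k C_k \;=\; 0
    &\qquad \text{at fixed points and in classical regimes,} \\
  S_{\mathrm{BH}} \;=\; \frac{A_{\mathrm{hor}}}{4 G_N}
    &\qquad \text{the Bekenstein--Hawking entropy that plays the role of the central charge for de Sitter in } d = 4.
\end{align}
```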

    Nesting Quadratic Logarithmic Demand Systems

    We propose a new generalised rank-3 demand system which nests all known (and new) rank-3 and rank-2 demand systems derived from the Quadratic Logarithmic (QL) cost function. We investigate its statistical adequacy against commonly encountered alternatives using U.K. household data. Keywords: quadratic logarithmic demand systems; rank-3 demand systems; individual household data.
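
    For readers unfamiliar with the QL class, one hedged illustration may help: the best-known rank-3 system derived from a Quadratic Logarithmic cost function is the quadratic almost ideal demand system (QUAIDS) of Banks, Blundell and Lewbel, whose budget-share equations are sketched below. This is only an example of the kind of system being nested, not the paper's generalised specification.

```latex
% Illustration only: a familiar rank-3 member of the QL class (QUAIDS),
% not the paper's generalised nesting system. Budget share of good i at
% prices p and total outlay m:
\begin{equation}
  w_i = \alpha_i + \sum_j \gamma_{ij} \ln p_j
        + \beta_i \ln\frac{m}{a(p)}
        + \frac{\lambda_i}{b(p)} \left[ \ln\frac{m}{a(p)} \right]^2 ,
\end{equation}
% with the price indices
\begin{equation}
  \ln a(p) = \alpha_0 + \sum_i \alpha_i \ln p_i
             + \tfrac{1}{2} \sum_i \sum_j \gamma_{ij} \ln p_i \ln p_j ,
  \qquad
  b(p) = \prod_i p_i^{\beta_i} .
\end{equation}
% Setting every \lambda_i = 0 removes the quadratic term and collapses the
% system to a rank-2 (AIDS-type) special case, which is the sort of
% nesting relationship the abstract refers to.
```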