226 research outputs found

    A Golden Age of Hardware Description Languages: Applying Programming Language Techniques to Improve Design Productivity

    Leading experts have declared that there is an impending golden age of computer architecture. During this age, the rate at which architects will be able to innovate will be directly tied to the design and implementation of the hardware description languages they use. Thus, the programming languages community stands on the critical path to this new golden age. This implies that we are also on the cusp of a golden age of hardware description languages. In this paper, we discuss the intellectual challenges facing researchers interested in hardware description language design, compilers, and formal methods. The major theme is identifying opportunities to apply programming language techniques to address issues in hardware design productivity. We then present a vision for a multi-language system that provides a framework for developing solutions to these intellectual problems. This vision is based on a meta-programmed host language combined with a core embedded hardware description language that serves as the basis for the research and development of a sea of domain-specific languages. Central to the design of this system is the core language, which is based on an abstraction that provides a general mechanism for the composition of hardware components described in any language.
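
    Although the paper itself is language-agnostic, a tiny sketch can make the "core embedded HDL plus meta-programmed host" idea concrete. The following is a minimal illustration in Python, assuming hypothetical names (Wire, Module, instantiate) that are not from the paper: components expose ports, and a single composition mechanism wires components together regardless of which front-end language produced them.

```python
# Illustrative toy only: a core embedded HDL hosted in Python. The names
# (Wire, Module, instantiate) are hypothetical and not from the paper.

class Wire:
    """A named signal connecting component ports."""
    def __init__(self, name):
        self.name = name

class Module:
    """A hardware component: named ports plus a netlist of sub-instances."""
    def __init__(self, name, inputs, outputs):
        self.name = name
        self.inputs = {p: Wire(p) for p in inputs}
        self.outputs = {p: Wire(p) for p in outputs}
        self.instances = []  # pairs of (submodule, port-to-wire binding)

    def instantiate(self, sub, bindings):
        """The single composition mechanism: bind a component's ports to
        this module's wires, regardless of which language produced `sub`."""
        self.instances.append((sub, bindings))

# Any domain-specific front end only has to produce Module objects; the
# host language then composes them uniformly.
and_gate = Module("AND2", ["a", "b"], ["y"])
top = Module("Top", ["x0", "x1"], ["out"])
top.instantiate(and_gate, {"a": top.inputs["x0"],
                           "b": top.inputs["x1"],
                           "y": top.outputs["out"]})
```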

    Freestanding Cellulose Acetate/ZnO Flowers Composites for Solar Photocatalysis and Controlled Zinc Ions Release

    The versatile properties of ZnO micro- and nanostructures have resulted in many applications in piezotronics, biosensors and photocatalysis. However, ZnO can easily dissolve in aqueous fluids, potentially resulting in the release of reactive oxygen species and zinc ions at toxic concentrations. This issue can be addressed by dispersing ZnO within biocompatible polymeric matrices to reduce direct exposure to the aqueous fluid and control the release of zinc ions. Herein, this work explores tailored ZnO flowers/cellulose acetate photocatalytic composites at different ZnO weight percentages (1-15 wt%). The photocatalytic degradation of methylene blue dye under simulated solar light is studied, finding an optimal value of ZnO filler loading in the polymer (10 wt%) that results from a compromise between the photodegradation efficiency and the hydrophobicity induced by the ZnO flowers. The reusability of the composites is investigated, revealing a surprising improvement in photodegradation efficiency after the first cycle. Simulated solar light stimulation induces the controllable release of zinc ions in aqueous solution at ppm levels from the composites at the optimal ZnO filler loading. Finally, the release of ionic species in the absence of light stimulation is found to be directly proportional to the ZnO loading in the composite, as a result of its degradation in aqueous environments.
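
    Photocatalytic dye degradation of this kind is commonly summarized by an apparent pseudo-first-order rate constant, obtained from ln(C0/C) = k_app t. The short sketch below shows that standard analysis on invented data; the numbers are purely illustrative and are not taken from the paper.

```python
# Invented numbers for illustration only; not data from the paper.
import numpy as np

t = np.array([0.0, 30.0, 60.0, 90.0, 120.0])   # irradiation time (min)
c = np.array([1.00, 0.72, 0.53, 0.38, 0.27])   # C/C0 of methylene blue

# Pseudo-first-order model: ln(C0/C) = k_app * t; the slope of a linear
# least-squares fit gives the apparent rate constant used to compare
# composites with different ZnO loadings.
k_app = np.polyfit(t, np.log(1.0 / c), 1)[0]
print(f"apparent rate constant: {k_app:.4f} 1/min")
```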

    Standard Model Physics and the Digital Quantum Revolution: Thoughts about the Interface

    Advances in isolating, controlling and entangling quantum systems are transforming what was once a curious feature of quantum mechanics into a vehicle for disruptive scientific and technological progress. Pursuing the vision articulated by Feynman, a concerted effort across many areas of research and development is introducing prototypical digital quantum devices into the computing ecosystem available to domain scientists. Through interactions with these early quantum devices, the abstract vision of exploring classically intractable quantum systems is evolving toward a tangible reality. Beyond catalyzing these technological advances, entanglement is enabling parallel progress as a diagnostic for quantum correlations and as an organizational tool, both guiding improved understanding of quantum many-body systems and of the quantum field theories defining and emerging from the Standard Model. From the perspective of three domain science theorists, this article compiles thoughts about this interface, covering entanglement, complexity, and quantum simulation, in an effort to contextualize recent NISQ-era progress with the scientific objectives of nuclear and high-energy physics.
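
    As a minimal, textbook-level illustration of entanglement as a diagnostic for quantum correlations (not code from the article), the sketch below computes the von Neumann entanglement entropy of one qubit of a Bell state from its reduced density matrix.

```python
# Textbook illustration (not from the article): entanglement entropy of one
# qubit of the Bell state |phi+> = (|00> + |11>) / sqrt(2).
import numpy as np

# Amplitudes indexed as psi[a, b] for basis states |a>|b>.
psi = np.zeros((2, 2))
psi[0, 0] = psi[1, 1] = 1 / np.sqrt(2)

# Reduced density matrix of subsystem A: rho_A = psi psi^dagger (B traced out).
rho_a = psi @ psi.conj().T

# Von Neumann entropy S = -Tr(rho_A log2 rho_A) via the eigenvalues of rho_A.
evals = np.linalg.eigvalsh(rho_a)
entropy = -sum(p * np.log2(p) for p in evals if p > 1e-12)
print(f"entanglement entropy: {entropy:.3f} bits")  # 1.000: maximal for a qubit
```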

    Towards Dynamic Dependable Systems through Evidence-Based Continuous Certification

    Future cyber-physical systems are expected to be dynamic, evolving while already being deployed. Frequent updates of software components are likely to become the norm even for safety-critical systems. In this setting, a full re-certification before each software update might delay important updates that fix previous bugs or address security or safety issues. Here we propose a vision addressing this challenge, namely the evidence-based continuous supervision and certification of software variants in the field. The idea is to run both old and new variants of component software inside the same system, together with a supervising instance that monitors their behavior. Updated variants are phased into operation only after sufficient evidence for correct behavior has been collected. The variants are required to explicate their decisions in a logical language, enabling the supervisor to reason about these decisions and to identify inconsistencies. To resolve contradictory information, the supervisor can run a component analysis to identify potentially faulty components on the basis of previously observed behavior, and can trigger micro-experiments which plan and execute system behavior specifically aimed at reducing uncertainty. We spell out our overall vision and provide a first formalization of the different components and their interplay. To enable efficient supervisor reasoning as well as automatic verification of supervisor properties, we introduce SupERLog, a logic specifically designed for this purpose.
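
    A minimal sketch of the proposed supervision loop, with all names invented for illustration: the old and new variants run side by side on the same inputs, agreement accumulates as evidence, and the updated variant is phased into operation once enough evidence has been collected. The paper's logical explication of decisions and SupERLog reasoning are not reproduced here.

```python
# Hedged sketch of evidence-based continuous supervision; all names are
# illustrative and the paper's SupERLog-based reasoning is not modeled.

AGREEMENT_THRESHOLD = 1000  # consecutive agreements required for switchover

class Supervisor:
    def __init__(self, old_variant, new_variant):
        self.old, self.new = old_variant, new_variant
        self.active = old_variant  # the old variant stays authoritative
        self.evidence = 0

    def step(self, sensor_input):
        out_old = self.old(sensor_input)
        out_new = self.new(sensor_input)
        # Agreement counts as evidence; any inconsistency resets it. A real
        # supervisor would instead reason over the variants' explicated
        # decisions to localize faults, and could trigger micro-experiments.
        self.evidence = self.evidence + 1 if out_old == out_new else 0
        if self.evidence >= AGREEMENT_THRESHOLD:
            self.active = self.new  # phase the updated variant into operation
        return out_new if self.active is self.new else out_old

# Usage with two trivial variants that happen to agree everywhere:
supervisor = Supervisor(lambda x: x % 2, lambda x: int(x % 2 == 1))
outputs = [supervisor.step(i) for i in range(2000)]
```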

    Logic Programs as Declarative and Procedural Bias in Inductive Logic Programming

    Machine Learning is necessary for the development of Artificial Intelligence, as pointed out by Turing in his 1950 article "Computing Machinery and Intelligence". It is in the same article that Turing suggested the use of computational logic and background knowledge for learning. This thesis follows a logic-based machine learning approach called Inductive Logic Programming (ILP), which is advantageous over other machine learning approaches in terms of relational learning and utilising background knowledge. ILP uses logic programs as a uniform representation for hypotheses, background knowledge and examples, but its declarative bias is usually encoded using metalogical statements. This thesis advocates the use of logic programs to represent declarative and procedural bias, which results in a framework of single-language representation. We show in this thesis that using a logic program called the top theory as declarative bias leads to a sound and complete multi-clause learning system, MC-TopLog. It overcomes the entailment-incompleteness of Progol and thus outperforms Progol in terms of predictive accuracy when learning grammars and strategies for playing the game of Nim. MC-TopLog has been applied to two real-world applications funded by Syngenta, an agricultural company. A higher-order extension of top theories results in meta-interpreters, which allow the introduction of new predicate symbols. The resulting ILP system, Metagol, can therefore perform predicate invention, which is an intrinsically higher-order logic operation. Metagol also leverages the procedural semantics of Prolog to encode procedural bias, so that it outperforms both its ASP version and ILP systems without an equivalent procedural bias in terms of efficiency and accuracy. This is demonstrated by experiments on learning regular, context-free and natural grammars. Metagol is also applied to non-grammar learning tasks involving recursion and predicate invention, such as learning a definition of staircases and learning robot strategies. Both MC-TopLog and Metagol are based on a ⊤-directed framework, which differs from other multi-clause learning systems based on Inverse Entailment, such as CF-Induction, XHAIL and IMPARO. Compared to TAL, another ⊤-directed multi-clause learning system, Metagol allows higher-order assumptions to be encoded explicitly in the form of meta-rules.
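
    Metagol itself is written in Prolog; the following Python rendering is only an illustration of the chain metarule P(A,B) :- Q(A,C), R(C,B) that meta-interpretive learners instantiate during search, with example predicates that do not come from the thesis.

```python
# Python rendering of the chain metarule P(A,B) :- Q(A,C), R(C,B); Metagol
# itself is written in Prolog, and these predicates are examples only.

parent = {("ann", "bob"), ("bob", "cal"), ("cal", "dee")}

def apply_chain(q_facts, r_facts):
    """Derive the P-facts produced by one chain-metarule instantiation."""
    return {(a, b)
            for (a, c) in q_facts
            for (c2, b) in r_facts
            if c == c2}

# Instantiating P=grandparent, Q=R=parent yields the learned clause
#   grandparent(A,B) :- parent(A,C), parent(C,B).
grandparent = apply_chain(parent, parent)
print(sorted(grandparent))  # [('ann', 'cal'), ('bob', 'dee')]

# Predicate invention corresponds to instantiating Q or R with a fresh
# symbol (say s1) whose own definition is in turn learned from metarules.
```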

    Process mining meets model learning: Discovering deterministic finite state automata from event logs for business process analysis

    Within the process mining field, Deterministic Finite State Automata (DFAs) are widely employed as foundational mechanisms for performing formal reasoning tasks over the information contained in event logs, such as conformance checking, compliance monitoring and cross-organization process analysis, to name a few. To support the above use cases, in this paper we investigate how to leverage Model Learning (ML) algorithms for the automated discovery of DFAs from event logs. DFAs can be used as a fundamental building block to support not only the development of process analysis techniques, but also the implementation of instruments that support other phases of the Business Process Management (BPM) lifecycle, such as business process design and enactment. The quality of the discovered DFAs is assessed with respect to customized definitions of fitness, precision and generalization, and a standard notion of DFA simplicity. Finally, we use these metrics to benchmark ML algorithms against real-life and synthetically generated datasets, with the aim of studying their performance and investigating their suitability for the development of BPM tools.
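
    As a minimal stand-in for the mechanics involved (the paper's customized fitness, precision and generalization definitions are richer), the sketch below models a DFA over activity labels and computes a simple trace-level fitness: the fraction of log traces the DFA accepts. All names and the toy process are illustrative.

```python
# Illustrative sketch: a DFA over activity labels plus a simple trace-level
# fitness; not the paper's exact metric definitions.

class DFA:
    def __init__(self, start, accepting, transitions):
        self.start = start
        self.accepting = accepting      # set of accepting states
        self.transitions = transitions  # (state, activity) -> state

    def accepts(self, trace):
        state = self.start
        for activity in trace:
            key = (state, activity)
            if key not in self.transitions:
                return False            # undefined move: reject the trace
            state = self.transitions[key]
        return state in self.accepting

def fitness(dfa, log):
    """Fraction of traces in the event log accepted by the DFA."""
    return sum(dfa.accepts(t) for t in log) / len(log)

# Toy model of a claim-handling process: register -> (check)* -> decide.
dfa = DFA(start=0, accepting={2},
          transitions={(0, "register"): 1, (1, "check"): 1, (1, "decide"): 2})
log = [("register", "decide"), ("register", "check", "decide"), ("decide",)]
print(fitness(dfa, log))  # 2/3 of the traces fit the model
```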

    Inference and Regeneration of Programs that Store and Retrieve Data

    As modern computation platforms become increasingly complex, their programming interfaces are increasingly difficult to use. This complexity is especially inappropriate given the relatively simple core functionality that many of the computations implement. We present a new approach for obtaining software that executes on modern computing platforms with complex programming interfaces. Our approach starts with a simple seed program, written in the language of the developer's choice, that implements the desired core functionality. It then systematically generates inputs and observes the resulting outputs to learn the core functionality. Finally, it automatically regenerates new code that implements the learned core functionality on the target computing platform. This regenerated code contains both (a) boilerplate code for the complex programming interfaces that the target computing platform presents and (b) systematic error and vulnerability checking code that makes the new implementations robust and secure. By providing a productive new mechanism for capturing and encapsulating knowledge about how to use modern complex interfaces, this new approach promises to greatly reduce the developer effort required to obtain secure, robust software that executes on modern computing platforms.
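
    A heavily simplified sketch of the infer-and-regenerate loop described above, with an invented seed program: generate inputs, observe outputs, learn the behavior (here, an affine function), and emit new source code wrapped in systematic input checking. None of this is the paper's actual implementation.

```python
# Heavily simplified sketch of the approach; the seed program and all names
# are invented for illustration and do not come from the paper.

def seed(x):
    """The developer's simple seed program (the desired core functionality)."""
    return 3 * x + 7

# 1. Systematically generate inputs and observe the resulting outputs.
observations = {x: seed(x) for x in range(-50, 51)}

# 2. Learn the core functionality; here we assume an affine model a*x + b.
a = observations[1] - observations[0]   # slope from two probes
b = observations[0]                     # value at x = 0
assert all(a * x + b == y for x, y in observations.items())

# 3. Regenerate code for the target platform, wrapped in error checking
#    (standing in for the boilerplate and vulnerability checks the paper adds).
regenerated_source = f'''
def regenerated(x):
    if not isinstance(x, int):
        raise TypeError("expected an integer input")
    return {a} * x + {b}
'''
print(regenerated_source)
```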

    Model-based quality assurance of instrumented context-free systems

    The ever-growing complexity of today’s software and hardware systems makes quality assurance (QA) a challenging task. Abstraction is a key technique for dealing with this complexity because it allows one to set aside non-essential properties of a system and focus on the important ones. Crucial for the success of this approach is the availability of adequate abstraction models that strike a fine balance between simplicity and expressiveness. This thesis presents the formalisms of systems of procedural automata (SPAs), systems of behavioral automata (SBAs), and systems of procedural Mealy machines (SPMMs). The three model types describe systems that consist of multiple procedures which can mutually call each other, including recursion. While the individual procedures are described by regular automata and are therefore easy to understand, the aggregation of procedures into systems captures the semantics of context-free systems, offering the expressiveness necessary for representing procedural systems. A central concept of the proposed model types is an instrumentation that exposes the internal structure of systems by making calls to and returns from procedures observable. This instrumentation allows for a notion of rigorous (de)composition which enables a translation between local (procedural) views and global (holistic) views on a system. On the basis of this translation, this thesis presents algorithms for the verification, testing, and learning of (instrumented) context-free systems, covering a broad spectrum of practical QA tasks. Starting with SPAs as a “base” formalism for context-free systems, the flexibility of this concept is shown by including features such as prefix-closure (SBAs) and dialog-based transductions (SPMMs). In a comparison with related formalisms, this thesis shows that the simplicity of the proposed model types not only increases the understandability of models but can also improve the performance of QA tasks. This makes SPAs, SBAs, and SPMMs a powerful tool for tackling the practical challenges of assuring the quality of today’s software and hardware systems.
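
    The instrumentation idea lends itself to a compact sketch: if calls and returns are observable, a global trace can be checked by switching between the local procedural views with a stack. The toy system below, written in Python with invented names, has a single procedure S accepting (roughly) the language a^n b^n; it is not taken from the thesis.

```python
# Toy system of procedural automata with observable calls and returns; the
# procedure names, alphabet, and language are invented for illustration.

procedures = {
    # name: (start state, accepting states, (state, symbol) -> state)
    "S": (0, {3}, {(0, "a"): 1, (1, "S"): 2, (1, "b"): 3, (2, "b"): 3}),
}
CALLS, RETURN = set(procedures), "ret"

def accepts(trace, main="S"):
    """Check a global trace by switching between local procedural views."""
    stack = []
    proc, state = main, procedures[main][0]
    for sym in trace:
        if sym in CALLS:                             # observable call
            resume = procedures[proc][2].get((state, sym))
            if resume is None:
                return False
            stack.append((proc, resume))             # continue here on return
            proc, state = sym, procedures[sym][0]
        elif sym == RETURN:                          # observable return
            if not stack or state not in procedures[proc][1]:
                return False
            proc, state = stack.pop()
        else:                                        # internal action
            state = procedures[proc][2].get((state, sym))
            if state is None:
                return False
    return state in procedures[proc][1] and not stack

print(accepts(["a", "S", "a", "b", "ret", "b"]))  # True: well-nested a a b b
print(accepts(["a", "b", "ret"]))                 # False: return with no call
```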