79 research outputs found

    Algebraic approach to hardware description and verification


    New Foundation in the Sciences: Physics without sweeping infinities under the rug

    It is widely known at the frontiers of physics that the "sweeping under the rug" practice has been the norm rather than the exception. In other words, the leading paradigms have a strong tendency to be hailed as the only game in town. For example, renormalization group theory was hailed as a cure for the infinity problem in QED. One account puts it as follows: "What the three Nobel Prize winners did, in the words of Feynman, was to get rid of the infinities in the calculations. The infinities are still there, but now they can be skirted around . . . We have designed a method for sweeping them under the rug." [1] And Paul Dirac wrote in a similar vein: "Hence most physicists are very satisfied with the situation. They say: Quantum electrodynamics is a good theory, and we do not have to worry about it any more. I must say that I am very dissatisfied with the situation, because this so-called good theory does involve neglecting infinities which appear in its equations, neglecting them in an arbitrary way. This is just not sensible mathematics. Sensible mathematics involves neglecting a quantity when it turns out to be small—not neglecting it just because it is infinitely great and you do not want it!" [2] Similarly, dark matter and dark energy were elevated as plausible ways to solve the crisis in prevalent Big Bang cosmology. That is why we chose the theme New Foundations in the Sciences: to emphasize the necessity of introducing a new set of approaches in the sciences, be it physics, cosmology, consciousness, etc.

    Fourth NASA Langley Formal Methods Workshop

    This publication consists of papers presented at NASA Langley Research Center's fourth workshop on the application of formal methods to the design and verification of life-critical systems. Topics considered include: proving properties of accidents; modeling and validating SAFER in VDM-SL; requirements analysis of real-time control systems using PVS; a tabular language for system design; and automated deductive verification of parallel systems. Also included is a fundamental hardware design in PVS.

    Pseudo-contractions as Gentle Repairs

    Updating a knowledge base to remove an unwanted consequence is a challenging task. Some of the original sentences must be either deleted or weakened in such a way that the sentence to be removed is no longer entailed by the resulting set. On the other hand, it is desirable that the existing knowledge be preserved as much as possible, minimising the loss of information. Several approaches to this problem can be found in the literature. In particular, when the knowledge is represented by an ontology, two different families of frameworks have been developed in the past decades with numerous ideas in common but with little interaction between the communities: applications of AGM-like Belief Change and justification-based Ontology Repair. In this paper, we investigate the relationship between pseudo-contraction operations and gentle repairs. Both aim to avoid the complete deletion of sentences when replacing them with weaker versions is enough to prevent the entailment of the unwanted formula. We show the correspondence between concepts on both sides and investigate under which conditions they are equivalent. Furthermore, we propose a unified notation for the two approaches, which might contribute to the integration of the two areas.
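    The "weaken instead of delete" idea can be made concrete with a toy propositional example (a hypothetical sketch under naive truth-table entailment; the paper works with ontologies and general entailment relations, and none of the names below come from it):

```python
# Toy illustration of "delete vs. weaken". A knowledge base is a list of
# formulas, each modeled as a function from a valuation (dict) to a bool.
from itertools import product

ATOMS = ("p", "q")

def entails(kb, phi):
    """kb |= phi iff every valuation satisfying all of kb also satisfies phi."""
    for bits in product([False, True], repeat=len(ATOMS)):
        v = dict(zip(ATOMS, bits))
        if all(s(v) for s in kb) and not phi(v):
            return False
    return True

p = lambda v: v["p"]
q = lambda v: v["q"]
p_implies_q = lambda v: (not v["p"]) or v["q"]
q_implies_p = lambda v: (not v["q"]) or v["p"]

kb = [p, p_implies_q]
assert entails(kb, q)                 # q is the unwanted consequence

# Plain contraction: delete p outright.
deleted = [p_implies_q]
assert not entails(deleted, q)

# Pseudo-contraction / gentle repair: replace p with the weaker p or not-q.
weakened = [lambda v: v["p"] or not v["q"], p_implies_q]
assert not entails(weakened, q)       # unwanted consequence still gone,
assert entails(weakened, q_implies_p) # but q -> p survives the weakening
assert not entails(deleted, q_implies_p)  # ...and is lost by deletion
```

    The last two assertions show the point of the weakening: both repairs remove the unwanted consequence, but the weakened base retains strictly more of the original information.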

    Fabrication, Mechanical Characterization, and Modeling of 3D Architected Materials upon Static and Dynamic Loading

    Architected materials are ubiquitous in nature, enabling unique properties that are unachievable by monolithic, homogeneous materials. Inspired by natural processes, man-made three-dimensional (3D) architected materials have been reported to enable novel mechanical properties such as high stiffness- and strength-to-density ratios, extreme resilience, or high energy absorption. Furthermore, advanced fabrication techniques have enabled architected materials with feature sizes at the nanometer scale, which exploit material size effects to approach theoretical bounds. However, most architected materials have relied on symmetry, periodicity, and lack of defects to achieve the desired mechanical response, resulting in sub-optimal performance in the presence of inevitable defects. Additionally, most of these nano- and micro-architected materials have only been studied in the static regime, leaving the dynamic parameter space unexplored. In this work, we address these issues by: (i) proposing numerical and theoretical tools that predict the behavior of architected materials with non-ideal geometries, (ii) presenting a pathway for scalable fabrication of tunable nano-architected materials, and (iii) exploring the response of nano- and micro-architected materials under three types of dynamic loading. We first explore lattice architectures with features at the micro- and millimeter scales and provide an extension to the classical stiffness scaling laws, enabled by reduced-order numerical models and experiments at both scales. After discussing the effect of nodes (i.e., junctions) on the mechanical response of lattice architectures, we propose alternative node-less geometries that eliminate the stress concentrations associated with nodes to provide extreme resilience.
Using natural processes such as spinodal decomposition, we present pathways to fabricate a version of these materials with sample sizes on the order of cubic centimeters while achieving feature sizes on the order of tens of nanometers. In the dynamic regime, we design, fabricate, and test micro-architected materials with tunable vibrational band gaps through the use of architectural reconfiguration and local resonance. Lastly, we present methods to fabricate carbon-based materials at the nano- and centimeter scales and test them under supersonic impact and blast conditions, respectively. Our work provides explorations into pathways that could enable the use of nano- and micro-architected materials for applications that go beyond small-volume, quasi-static mechanical regimes.

    Computer Science Logic 2018: CSL 2018, September 4-8, 2018, Birmingham, United Kingdom


    What is an internet? Norbert Wiener and the society of control

    By means of a philosophical reading of Norbert Wiener, founder of cybernetics, this thesis attempts to derive anew the concepts of internet and control. It develops upon Wiener's position that every age is reflected by a certain machine, arguing that the internet is that which does so today. Grounded by a critical historiography of the relation between the Cold War and the internet's invention in 1969 by the 'network' of J. C. R. Licklider, it argues for an agonistic concept of internet derived from Wiener's disjunctive reading of figures including Claude Bernard, Walter Cannon, Benoît Mandelbrot, John von Neumann and, above all, his Neo-Kantian inflected reading of Leibniz. It offers a counter-theory of the society of control to those grounded by Spinoza's ethology, notably that of Michael Hardt and Toni Negri, and attempts to establish a single conceptual vocabulary for depicting the possible modes of conflict through which an internet is determined.

    Mechanically proving determinacy of hierarchical block diagram translations

    openaire: EC/H2020/730080/EU//ESROCOS
    Hierarchical block diagrams (HBDs) are at the heart of embedded system design tools, including Simulink. Numerous translations exist from HBDs into languages with formal semantics, amenable to formal verification. However, none of these translations has been proven correct, to our knowledge. We present in this paper the first mechanically proven HBD translation algorithm. The algorithm translates HBDs into an algebra of terms with three basic composition operations (serial, parallel, and feedback). In order to capture various translation strategies resulting in different terms achieving different tradeoffs, the algorithm is nondeterministic. Despite this, we prove its semantic determinacy: for every input HBD, all possible terms that can be generated by the algorithm are semantically equivalent. We apply this result to show how three Simulink translation strategies introduced previously can be formalized as determinizations of the algorithm, and derive that these strategies yield semantically equivalent results (a question left open in previous work). All results are formalized and proved in the Isabelle theorem prover, and the code is publicly available. Peer reviewed.
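    The term algebra described above (serial, parallel, and feedback composition) can be sketched with a toy denotational model. This is a hypothetical illustration, not the paper's Isabelle development: blocks are modeled as pure functions, and "semantically equivalent" means the terms denote the same function.

```python
# Toy denotational model of the three composition operations on
# block-diagram terms (hypothetical sketch only).

def serial(f, g):
    """Serial composition: the output of f feeds the input of g."""
    return lambda x: g(f(x))

def parallel(f, g):
    """Parallel composition: f and g act on independent inputs."""
    return lambda xy: (f(xy[0]), g(xy[1]))

def feedback(f, init=0, steps=100):
    """Feedback: f maps (state, input) to (state, output); the state
    output is looped back to the state input, approximated by iteration."""
    def fb(x):
        z, y = init, None
        for _ in range(steps):
            z, y = f((z, x))
        return y
    return fb

# Semantic determinacy, in miniature: two different term decompositions
# of the same diagram denote the same function.
inc = lambda x: x + 1
dbl = lambda x: 2 * x
identity = lambda x: x
t1 = serial(inc, dbl)
t2 = serial(serial(inc, identity), dbl)
assert all(t1(x) == t2(x) for x in range(10))
```

    The final assertion mirrors the paper's result in spirit: a nondeterministic translator may emit either `t1` or `t2` for the same diagram, and determinacy means the choice cannot be observed semantically.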