
    Towards A Practical High-Assurance Systems Programming Language

    Writing correct and performant low-level systems code is a notoriously demanding job, even for experienced developers. To make matters worse, formally reasoning about its correctness properties introduces yet another level of complexity to the task, requiring considerable expertise in both systems programming and formal verification. Without appropriate tools that provide abstraction and automation, development can be extremely costly due to the sheer complexity of the systems and the nuances within them. Cogent is designed to alleviate the burden on developers when writing and verifying systems code. It is a high-level functional language with a certifying compiler, which automatically proves the correctness of the compiled code and also provides a purely functional abstraction of the low-level program to the developer. Equational reasoning techniques can then be used to prove functional correctness properties of the program on top of this abstract semantics, which is notably less laborious than directly verifying the C code. To make Cogent a more approachable and effective tool for developing real-world systems, we further strengthen the framework by extending the core language and its ecosystem. Specifically, we enrich the language to allow users to control the memory representation of algebraic data types, while retaining the automatic proof with a data layout refinement calculus. We repurpose existing tools in a novel way and develop an intuitive foreign function interface, which provides users with a seamless experience when using Cogent in conjunction with native C. We augment the Cogent ecosystem with a property-based testing framework, which helps developers better understand the impact formal verification has on their programs and enables a progressive approach to producing high-assurance systems. Finally, we explore refinement type systems, which we plan to incorporate into Cogent for more expressiveness and better integration of systems programmers into the verification process.
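    The property-based testing workflow mentioned above can be illustrated with a minimal sketch. The example below uses Python's hypothesis library rather than Cogent's actual framework, and word_count is a hypothetical stand-in for a system under test; the point is only to show how randomly generated inputs are checked against a simple reference model.
    # Illustrative sketch only: property-based testing with Python's hypothesis
    # library, not Cogent's framework. word_count is a hypothetical example.
    from hypothesis import given, strategies as st

    def word_count(text: str) -> int:
        # System under test: count whitespace-separated words.
        return len(text.split())

    @given(st.lists(st.text(alphabet="abc", min_size=1), max_size=20))
    def test_word_count_matches_model(words):
        # Reference model: joining n non-empty, whitespace-free words with
        # single spaces must give a count of exactly n.
        assert word_count(" ".join(words)) == len(words)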

    Beam scanning by liquid-crystal biasing in a modified SIW structure

    A fixed-frequency beam-scanning 1D antenna based on Liquid Crystals (LCs) is designed for application in 2D scanning with lateral alignment. The 2D array environment imposes full decoupling of adjacent 1D antennas, which often conflicts with the LC requirement of DC biasing: the proposed design accommodates both. The LC medium is placed inside a Substrate Integrated Waveguide (SIW) modified to work as a Groove Gap Waveguide, with radiating slots etched on the upper broad wall so that it radiates as a Leaky-Wave Antenna (LWA). This allows effective application of the DC bias voltage needed for tuning the LCs. At the same time, the RF field remains laterally confined, making it possible to lay several antennas in parallel and achieve 2D beam scanning. The design is validated by simulation employing the actual properties of a commercial LC medium.

    Unstable Periodic Orbits: a language to interpret the complexity of chaotic systems

    Unstable periodic orbits (UPOs), exact periodic solutions of the evolution equation, offer a very powerful framework for studying chaotic dynamical systems, as they allow one to dissect their dynamical structure. UPOs can be considered the skeleton of chaotic dynamics, its essential building blocks. In fact, it is possible to prove that in a chaotic system UPOs are dense in the attractor, meaning that it is always possible to find a UPO arbitrarily near any chaotic trajectory. We can thus think of the chaotic trajectory as being approximated by different UPOs as it evolves in time, jumping from one UPO to another as a result of their instability. In this thesis we provide a contribution towards the use of UPOs as a tool to understand and distill the dynamical structure of chaotic dynamical systems. We will focus on two models characterised by different properties: the Lorenz-63 and the Lorenz-96 model. The process of approximating a chaotic trajectory in terms of UPOs will play a central role in our investigation. In fact, we will use this tool to explore the properties of the attractor of the system through the lens of its UPOs. In the first part of the thesis we consider the Lorenz-63 model with the classical parameter values. We investigate how a chaotic trajectory can be approximated using a complete set of UPOs up to symbolic-dynamics period 14. At each instant in time, we rank the UPOs according to their proximity to the position of the orbit in phase space. We study this process from two different perspectives. First, we find that longer-period UPOs overwhelmingly provide the best local approximation to the trajectory. Second, we construct a finite-state Markov chain by studying the scattering of the trajectory between the neighbourhoods of the various UPOs. Each UPO and its neighbourhood are taken as a possible state of the system. Through the analysis of the subdominant eigenvectors of the corresponding stochastic matrix, we provide a different interpretation of the mixing processes occurring in the system by taking advantage of the concept of quasi-invariant sets. In the second part of the thesis we provide an extensive numerical investigation of the variability of the dynamical properties across the attractor of the much-studied Lorenz-96 dynamical system. By combining the Lyapunov analysis of the tangent space with the study of the shadowing of the chaotic trajectory performed by a very large set of unstable periodic orbits, we show that the observed variability in the number of unstable dimensions, which shows a serious breakdown of hyperbolicity, is associated with the presence of a substantial number of finite-time Lyapunov exponents that fluctuate about zero even when very long averaging times are considered.
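    As an illustration of the shadowing procedure described above (a minimal sketch, not the thesis code), the following Python fragment ranks a library of UPOs by proximity to each trajectory point and assembles the row-stochastic transition matrix whose subdominant eigenvectors point to quasi-invariant sets; the array shapes and function names are assumptions of this sketch.
    # Minimal sketch: nearest-UPO labelling of a chaotic trajectory and the
    # resulting Markov chain between UPO neighbourhoods (illustrative only).
    import numpy as np

    def nearest_upo_labels(trajectory, upo_points):
        # trajectory: (T, d) array; upo_points: list of (n_i, d) arrays, one per UPO.
        labels = np.empty(len(trajectory), dtype=int)
        for t, x in enumerate(trajectory):
            dists = [np.min(np.linalg.norm(p - x, axis=1)) for p in upo_points]
            labels[t] = int(np.argmin(dists))   # index of the closest UPO
        return labels

    def transition_matrix(labels, n_states):
        # Row-stochastic matrix of jumps between UPO neighbourhoods.
        P = np.zeros((n_states, n_states))
        for a, b in zip(labels[:-1], labels[1:]):
            P[a, b] += 1.0
        rows = P.sum(axis=1, keepdims=True)
        return np.divide(P, rows, out=np.zeros_like(P), where=rows > 0)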

    Writing Facts

    »Fact« is one of the most crucial inventions of modern times. Susanne Knaller discusses the functions of this powerful notion in the arts and the sciences, and its impact on aesthetic models and systems of knowledge. The practice of writing provides an effective procedure to realize and to understand facts. This concerns preparatory procedures, formal choices, models of argumentation, and narrative patterns. By considering »writing facts«, the volume shows why and how »facts« are a result of knowledge, rules, and norms as well as of description, argumentation, and narration. This approach allows new perspectives on »fact« and its impact on modernity.

    Grasping nothing: a study of minimal ontologies and the sense of music

    If music were to have a proper sense – one in which it is truly given – one might reasonably place this in sound and aurality. I contend, however, that no such sense exists; rather, the sense of music takes place, and it does so with the impossible. To this end, this thesis – which is a work of philosophy and music – advances an ontology of the impossible (i.e., it thinks the being of what, properly speaking, can have no being) and considers its implications for music, articulating how ontological aporias – of the event, of thinking the absolute, and of sovereignty’s dismemberment – imply senses of music that are anterior to sound. John Cage’s Silent Prayer, a nonwork he never composed, compels a rethinking of silence on the basis of its contradictory status of existence; Florian Hecker et al.’s Speculative Solution offers a basis for thinking absolute music anew to the precise extent that it is a discourse of meaninglessness; and Manfred Werder’s [yearn] pieces exhibit exemplarily that music’s sense depends on the possibility of its counterfeiting. Insomuch as these accounts produce musical senses that take the place of sound, they are also understood to be performances of these pieces. Here, then, thought is music’s organon and its instrument.

    Corporate Social Responsibility: the institutionalization of ESG

    Understanding the impact of Corporate Social Responsibility (CSR) on firm performance as it relates to industries reliant on technological innovation is a complex and perpetually evolving challenge. To thoroughly investigate this topic, this dissertation adopts an economics-based structure to address three primary hypotheses. This structure allows each hypothesis to stand essentially as a standalone empirical paper, unified by an overall analysis of the nature of the impact that ESG has on firm performance. The first hypothesis explores how the evolution of CSR into the modern quantified iteration of ESG has led to the institutionalization and standardization of the CSR concept. The second hypothesis fills gaps in the existing literature testing the relationship between firm performance and ESG by finding that the relationship is significantly positive for long-term, strategic metrics (ROA and ROIC) and that there is no correlation for short-term metrics (ROE and ROS). Finally, the third hypothesis states that if a firm has a long-term strategic ESG plan, as proxied by the publication of CSR reports, then it is more resilient to damage from controversies. This is supported by the finding that pro-ESG firms consistently fared better than their counterparts in both financial and ESG performance, even in the event of a controversy. However, firms with consistent reporting are also held to a higher standard than their nonreporting peers, suggesting a higher-risk, higher-reward dynamic. These findings support the theory of good management, in that long-term strategic planning is both immediately economically beneficial and serves as a means of risk management and social impact mitigation. Overall, this work contributes to the literature by filling gaps in understanding the nature of the impact that ESG has on firm performance, particularly from a management perspective.

    Statistical Equilibrium of Circulating Fluids

    We are investigating the inviscid limit of the Navier-Stokes equation, and we find previously unknown anomalous terms in the Hamiltonian, dissipation, and helicity, which survive this limit and define the turbulent statistics. We find various topologically nontrivial configurations of the confined Clebsch field responsible for vortex sheets and lines. In particular, a stable vortex sheet family is discovered, but its anomalous dissipation vanishes as $\sqrt{\nu}$. Topologically stable stationary singular flows, which we call Kelvinons, are introduced. They have a conserved velocity circulation $\Gamma_\alpha$ around the loop $C$ and another one $\Gamma_\beta$ for an infinitesimal closed loop $\tilde C$ encircling $C$, leading to a finite helicity. The anomalous dissipation has a finite limit, which we computed analytically. The Kelvinon is responsible for asymptotic PDF tails of velocity circulation, perfectly matching numerical simulations. The loop equation for the circulation PDF as a functional of the loop shape is derived and studied. This equation is exactly equivalent to the Schrödinger equation in loop space, with viscosity $\nu$ playing the role of Planck's constant. Kelvinons are fixed points of the loop equation in the WKB limit $\nu \rightarrow 0$. The anomalous Hamiltonian for the Kelvinons contains a large parameter $\log \frac{|\Gamma_\beta|}{\nu}$. The leading powers of this parameter can be summed up, leading to familiar asymptotic freedom, like in QCD. In particular, the so-called multifractal scaling laws are, as in QCD, modified by powers of the logarithm.
    Comment: 246 pages, 96 figures, and six appendixes. Submitted to Physics Reports. Revised the energy balance analysis and discovered asymptotic freedom leading to powers of the logarithm of scale modifying the K41 scaling law.
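    For concreteness, the two conserved circulations and the helicity referred to above can be written as standard line and volume integrals; the notation below is an assumption of this summary, not necessarily the paper's.
    % Illustrative notation (assumed, not taken from the paper):
    \Gamma_\alpha = \oint_{C} \mathbf{v}\cdot d\boldsymbol{\ell}, \qquad
    \Gamma_\beta  = \oint_{\tilde{C}} \mathbf{v}\cdot d\boldsymbol{\ell}, \qquad
    H = \int \mathbf{v}\cdot\left(\nabla\times\mathbf{v}\right)\, d^{3}x .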

    Statistical-dynamical analyses and modelling of multi-scale ocean variability

    This thesis aims to provide a comprehensive analysis of multi-scale oceanic variabilities using various statistical and dynamical tools and to explore data-driven methods for the correct statistical emulation of the oceans. We considered the classical, wind-driven, double-gyre ocean circulation model in the quasi-geostrophic approximation and obtained its eddy-resolving solutions in terms of potential vorticity anomaly (PVA) and geostrophic streamfunctions. The reference solutions possess two asymmetric gyres of opposite circulation and a strong meandering eastward jet separating them, with rich eddy activity around it, such as the Gulf Stream in the North Atlantic and the Kuroshio in the North Pacific. This thesis is divided into two parts. The first part discusses a novel scale-separation method based on local spatial correlations, called correlation-based decomposition (CBD), and provides a comprehensive analysis of mesoscale eddy forcing. In particular, we analyse the instantaneous and time-lagged interactions between the diagnosed eddy forcing and the evolving large-scale PVA using novel `product integral' characteristics. The product-integral time series uncover robust causality between two drastically different yet interacting flow quantities, termed `eddy backscatter'. We also show data-driven augmentation of non-eddy-resolving ocean models by feeding them the eddy fields to restore the missing eddy-driven features, such as the merging western boundary currents, their eastward extension and the low-frequency variability of the gyres. In the second part, we present a systematic inter-comparison of Linear Regression (LR), stochastic and deep-learning methods to build low-cost reduced-order statistical emulators of the oceans. We obtain forecasts on seasonal and centennial timescales and assess them for their skill, cost and complexity. We found that the multi-level linear stochastic model performs best, followed by the hybrid stochastically-augmented deep-learning models. The superiority of these methods underscores the importance of incorporating core dynamics, memory effects and model errors for robust emulation of multi-scale dynamical systems, such as the oceans.
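    The reduced-order emulation idea compared in the second part can be sketched in a few lines of Python (an illustrative linear baseline under assumed array shapes, not the thesis code): fields are compressed onto leading EOFs with PCA, and a one-step linear propagator is fitted and rolled forward in the reduced space.
    # Illustrative linear baseline emulator (not the thesis code): PCA/EOF
    # compression plus a fitted one-step linear propagator in reduced space.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression

    def fit_linear_emulator(snapshots, n_modes=20):
        # snapshots: (T, n_grid) array of flattened model fields, with n_modes <= min(T, n_grid).
        pca = PCA(n_components=n_modes)
        z = pca.fit_transform(snapshots)               # reduced-order trajectory
        model = LinearRegression().fit(z[:-1], z[1:])  # one-step map z_t -> z_{t+1}
        return pca, model

    def emulate(pca, model, z0, n_steps):
        # Roll the propagator forward from reduced state z0 (shape (n_modes,)),
        # then map the emulated trajectory back to physical space.
        states = [z0]
        for _ in range(n_steps):
            states.append(model.predict(states[-1][None, :])[0])
        return pca.inverse_transform(np.array(states))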

    CITIES: Energetic Efficiency, Sustainability; Infrastructures, Energy and the Environment; Mobility and IoT; Governance and Citizenship

    This book collects important contributions on smart cities. It was created in collaboration with ICSC-CITIES2020, held in San José (Costa Rica) in 2020, and gathers articles on: energetic efficiency and sustainability; infrastructures, energy and the environment; mobility and IoT; and governance and citizenship.

    Command and Persuade

    Levels of violent crime have been in a steady decline for centuries—for millennia, even. Over the past five hundred years, homicide rates have decreased a hundred-fold. We live in a time that is more orderly and peaceful than ever before in human history. Why, then, does fear of crime dominate modern politics? Why, when we have been largely socialized into good behavior, are there more laws that govern our behavior than ever before? In Command and Persuade, Peter Baldwin examines the evolution of the state's role in crime and punishment over three thousand years. Baldwin explains that the involvement of the state in law enforcement and crime prevention is relatively recent. In ancient Greece, those struck by lightning were assumed to have been punished by Zeus. In the Hebrew Bible, God was judge, jury, and prosecutor when Cain killed Abel. As the state's power as lawgiver grew, more laws governed behavior than ever before; the sum total of prohibited behavior has grown continuously. At the same time, as family, community, and church exerted their influences, we have become better behaved and more law-abiding. Even as the state stands as the socializer of last resort, it also defines through law the terrain on which we are schooled into acceptable behavior. This title is also available in an Open Access edition.