10,090 research outputs found

    Towards A Practical High-Assurance Systems Programming Language

    Writing correct and performant low-level systems code is a notoriously demanding job, even for experienced developers. To make matters worse, formally reasoning about its correctness properties introduces yet another level of complexity, requiring considerable expertise in both systems programming and formal verification. Without appropriate tools that provide abstraction and automation, development can be extremely costly due to the sheer complexity of the systems and their nuances. Cogent is designed to alleviate the burden on developers when writing and verifying systems code. It is a high-level functional language with a certifying compiler, which automatically proves the correctness of the compiled code and also provides a purely functional abstraction of the low-level program to the developer. Equational reasoning techniques can then be used to prove functional correctness properties of the program on top of this abstract semantics, which is notably less laborious than directly verifying the C code. To make Cogent a more approachable and effective tool for developing real-world systems, we further strengthen the framework by extending the core language and its ecosystem. Specifically, we enrich the language to allow users to control the memory representation of algebraic data types, while retaining the automatic proof via a data layout refinement calculus. We repurpose existing tools in a novel way and develop an intuitive foreign function interface, which provides users with a seamless experience when using Cogent in conjunction with native C. We augment the Cogent ecosystem with a property-based testing framework, which helps developers better understand the impact formal verification has on their programs and enables a progressive approach to producing high-assurance systems. Finally, we explore refinement type systems, which we plan to incorporate into Cogent for more expressiveness and better integration of systems programmers with the verification process.
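    As a concrete illustration of the property-based testing idea, below is a minimal sketch in Python using the Hypothesis library; it is not Cogent's actual framework, and all names are illustrative. A hand-written implementation is checked against a purely functional specification, mirroring the refinement relationship the certifying compiler establishes formally.

        # Minimal property-based testing sketch (illustrative, not Cogent).
        from hypothesis import given, strategies as st

        def spec_checksum(data: bytes) -> int:
            """Purely functional specification: byte sum modulo 2**16."""
            return sum(data) % (1 << 16)

        def impl_checksum(data: bytes) -> int:
            """'Low-level' implementation with manual wrap-around, standing
            in for the C code one would verify against the specification."""
            acc = 0
            for b in data:
                acc = (acc + b) & 0xFFFF
            return acc

        @given(st.binary())
        def test_impl_refines_spec(data):
            # The refinement property: the implementation agrees with the spec.
            assert impl_checksum(data) == spec_checksum(data)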

    Beam scanning by liquid-crystal biasing in a modified SIW structure

    A fixed-frequency beam-scanning 1D antenna based on Liquid Crystals (LCs) is designed for application in 2D scanning with lateral alignment. The 2D array environment imposes full decoupling of adjacent 1D antennas, which often conflicts with the LC requirement of DC biasing: the proposed design accommodates both. The LC medium is placed inside a Substrate Integrated Waveguide (SIW) modified to work as a Groove Gap Waveguide, with radiating slots etched in the upper broad wall so that the structure radiates as a Leaky-Wave Antenna (LWA). This allows effective application of the DC bias voltage needed for tuning the LCs. At the same time, the RF field remains laterally confined, enabling several antennas to be laid in parallel to achieve 2D beam scanning. The design is validated by simulation employing the actual properties of a commercial LC medium.
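    As background on the scanning mechanism (a standard leaky-wave relation, not a result from this paper): the main beam of an LWA sits near sin(theta) = beta/k0, and in an SIW-like guide the phase constant beta depends on the LC permittivity, which the DC bias tunes. A rough Python sketch, with the frequency, guide width and permittivity range all assumed for illustration:

        # Fixed-frequency beam scanning via permittivity tuning (illustrative).
        import numpy as np

        c = 3e8      # speed of light, m/s
        f = 28e9     # operating frequency (assumed), Hz
        a = 3.4e-3   # effective guide width, m (assumed so the mode stays fast)
        k0 = 2 * np.pi * f / c

        for eps_r in (2.5, 2.8, 3.1, 3.4):  # assumed LC tuning range
            beta = np.sqrt(eps_r * k0**2 - (np.pi / a)**2)  # TE10-like phase constant
            theta = np.degrees(np.arcsin(beta / k0))        # beam angle from broadside
            print(f"eps_r = {eps_r:.1f} -> beam at {theta:4.1f} deg")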

    An empirical investigation of the relationship between integration, dynamic capabilities and performance in supply chains

    This research aimed to develop an empirical understanding of the relationships between integration, dynamic capabilities and performance in the supply chain domain, on the basis of which two conceptual frameworks were constructed to advance the field. The core motivation for the research was that, at the time of writing, the combined relationship between the three concepts had not yet been examined, although their pairwise relationships had been studied individually. To achieve this aim, deductive and inductive reasoning logics were utilised to guide the qualitative study, which was undertaken via multiple case studies to investigate lines of enquiry that would address the research questions formulated. This is consistent with the author’s philosophical adoption of the ontology of relativism and the epistemology of constructionism, which was considered appropriate for the research questions. Empirical data and evidence were collected, and various triangulation techniques were employed to ensure their credibility. Key features of grounded theory coding techniques were drawn upon for data coding and analysis, generating two levels of findings. These revealed that whilst integration and dynamic capabilities were crucial in improving performance, performance in turn informed them both, reflecting a cyclical and iterative relationship rather than a purely linear one. Adopting a holistic approach towards the relationship was key to producing complementary strategies that can deliver sustainable supply chain performance. The research makes theoretical, methodological and practical contributions to the field of supply chain management. The theoretical contribution includes the development of two emerging conceptual frameworks at the micro and macro levels. The former provides greater specificity, as it allows meta-analytic evaluation of the three concepts and their dimensions, providing detailed insight into their correlations. The latter gives a holistic view of their relationships and how they are connected, reflecting a middle-range theory that bridges theory and practice. The methodological contribution lies in presenting models that address gaps associated with the inconsistent use of terminology in philosophical assumptions and the lack of rigour in deploying case study research methods. In terms of its practical contribution, this research offers insights that practitioners could adopt to enhance their performance. They can do so without necessarily having to forgo certain desired outcomes, by using targeted integrative strategies and drawing on their dynamic capabilities.

    ABC: Adaptive, Biomimetic, Configurable Robots for Smart Farms - From Cereal Phenotyping to Soft Fruit Harvesting

    Numerous factors, such as demographics, migration patterns, and economics, are currently leading to critical labour shortages in low-skilled and physically demanding parts of agriculture, and robotics can be developed for the agricultural sector to address them. This study aims to develop an adaptive, biomimetic, and configurable modular robotics architecture that can be applied to multiple tasks (e.g., phenotyping, cutting, and picking), various crop varieties (e.g., wheat, strawberry, and tomato) and growing conditions. These robotic solutions cover the entire perception–action–decision-making loop, targeting the phenotyping of cereals and the harvesting of fruit in a natural environment. The primary contributions of this thesis are as follows. a) A high-throughput method for imaging field-grown wheat in three dimensions is presented, along with an accompanying unsupervised measuring method for obtaining individual wheat spike data. The unsupervised method analyses the 3D point cloud of each trial plot, containing hundreds of wheat spikes, and calculates the average size of the wheat spikes and the total spike volume per plot. Experimental results reveal that the proposed algorithm can effectively identify individual spikes in wheat crops. b) Unlike cereal, soft fruit is typically harvested by manual selection and picking. To enable robotic harvesting, the initial perception system uses conditional generative adversarial networks to identify ripe fruits using synthetic data. To determine whether a strawberry is surrounded by obstacles, a cluster complexity-based perception system is further developed to classify the harvesting complexity of ripe strawberries. c) Once the harvest-ready fruit is localised using point cloud data generated by a stereo camera, the platform’s action system coordinates the arm to reach and cut the stem using the passive motion paradigm framework, inspired by studies on the neural control of movement in the brain. Results from field trials for strawberry detection, reaching/cutting the stem of the fruit with a mean error of less than 3 mm, and extension to analysing complex canopy structures and bimanual coordination (searching/picking) are presented. Although this thesis focuses on strawberry harvesting, ongoing research is heading toward adapting the architecture to other crops. The agricultural food industry remains a labour-intensive sector with low margins and a business model driven by cost- and time-efficiency. The concepts presented herein can serve as a reference for future agricultural robots that are adaptive, biomimetic, and configurable.
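    To make the unsupervised spike-measurement idea in (a) concrete, here is a simplified sketch on synthetic data rather than real field scans: density-based clustering segments a plot-level 3D point cloud into individual spikes, and a convex hull gives a per-spike volume estimate. The thesis pipeline is more involved; this only illustrates the principle.

        # Segment spikes in a synthetic plot point cloud and estimate volumes.
        import numpy as np
        from scipy.spatial import ConvexHull
        from sklearn.cluster import DBSCAN

        rng = np.random.default_rng(0)
        centres = [(0.2, 0.2), (0.2, 0.8), (0.5, 0.5), (0.8, 0.2), (0.8, 0.8)]
        cloud = np.vstack([
            np.column_stack([
                rng.normal(cx, 0.01, 200),     # x spread ~1 cm
                rng.normal(cy, 0.01, 200),     # y spread ~1 cm
                rng.uniform(0.70, 0.80, 200),  # spike height band, m
            ])
            for cx, cy in centres
        ])

        labels = DBSCAN(eps=0.03, min_samples=10).fit_predict(cloud)
        volumes = [ConvexHull(cloud[labels == k]).volume
                   for k in set(labels) if k != -1]  # label -1 marks noise points
        print(f"spikes found: {len(volumes)}, "
              f"mean volume: {np.mean(volumes) * 1e6:.1f} cm^3, "
              f"total volume: {np.sum(volumes) * 1e6:.1f} cm^3")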

    Standardized Exclusion: A Theory of Barrier Lock-In

    The United States has relaxed antitrust scrutiny of private standard-setting organizations in recognition of their potential procompetitive benefits. In the meantime, however, the growing importance of network industries, and the coinciding move toward vendor-led standards consortia, has invited new, insidious anticompetitive risks. This Note proffers one such risk: barrier lock-in. A theory of barrier lock-in recognizes that dominant vendors can capture and control standards consortia to keep standardized equipment complex and costly. These practices are exclusionary. This Note situates barrier lock-in within the existing antitrust literature and jurisprudence, provides a potential example of barrier lock-in in the 5G network equipment standardization process, and proposes two solutions for future legislative, executive, and judicial action against misbehaving standard-setters.

    Endogenous measures for contextualising large-scale social phenomena: a corpus-based method for mediated public discourse

    This work presents an interdisciplinary methodology for developing endogenous measures of group membership through analysis of pervasive linguistic patterns in public discourse. Focusing on political discourse, this work critiques the conventional approach to the study of political participation, which is premised on decontextualised, exogenous measures to characterise groups. Considering the theoretical and empirical weaknesses of decontextualised approaches to large-scale social phenomena, this work suggests that contextualisation using endogenous measures might provide a complementary perspective to mitigate such weaknesses. This work develops a sociomaterial perspective on political participation in mediated discourse as affiliatory action performed through language. While the affiliatory function of language is often performed consciously (such as statements of identity), this work is concerned with unconscious features (such as patterns in lexis and grammar). This work argues that pervasive patterns in such features that emerge through socialisation are resistant to change and manipulation, and thus might serve as endogenous measures of sociopolitical contexts, and thus of groups. In terms of method, the work takes a corpus-based approach to the analysis of data from the Twitter messaging service, whereby patterns in users’ speech are examined statistically in order to trace potential community membership. The method is applied in the US state of Michigan during the second half of 2018, around the midterm (i.e. non-Presidential) elections held in the United States on 6 November. The corpus is assembled from the original posts of 5,889 users, who are nominally geolocated to 417 municipalities. These users are clustered according to pervasive language features. Comparing the linguistic clusters according to the municipalities they represent finds regular sociodemographic differentials across clusters. This is understood as an indication of social structure, suggesting that endogenous measures derived from pervasive patterns in language may indeed offer a complementary, contextualised perspective on large-scale social phenomena.
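    As a simplified illustration of the clustering step (an assumed toy pipeline, not the study's own): each user is represented by the relative frequencies of a fixed set of function words, pervasive features of the kind the work treats as resistant to manipulation, and users are then clustered on those rates.

        # Cluster users by function-word rates (toy data, illustrative only).
        from sklearn.cluster import KMeans
        from sklearn.feature_extraction.text import CountVectorizer

        FUNCTION_WORDS = ["the", "a", "of", "to", "in", "and", "that", "it", "for", "on"]

        users = {  # one concatenated document per user
            "user1": "the vote is on tuesday and it matters to all of us",
            "user2": "a win for the team in the final and that was it",
            "user3": "to be fair it was a mess of a plan for the city",
            "user4": "on and on about the budget and the roads in town",
        }

        vec = CountVectorizer(vocabulary=FUNCTION_WORDS)
        counts = vec.fit_transform(users.values()).toarray().astype(float)
        rates = counts / counts.sum(axis=1, keepdims=True)  # normalise per user

        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(rates)
        for user, label in zip(users, labels):
            print(user, "-> cluster", label)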

    Comparing Recent Advances in Estimating and Measuring Oil Slick Thickness: An MPRI Technical Report

    Characterization of the degree and extent of surface oil during and after an oil spill is a critical part of emergency response and Natural Resource Damage Assessment (NRDA) activities. More specifically, understanding floating oil thickness in real time can guide response efforts by directing limited assets to priority cleanup areas; aid in ‘volume released’ estimates; enhance fate, transport and effects modeling capabilities; and support natural resource injury determinations. An international workshop brought researchers from agencies, academia and industry who were advancing in situ and remote oil characterization tools and methods together with stakeholders and end users who rely on information about floating oil thickness for mission-critical assignments (e.g., regulatory, assessment, cleanup, research). In total, over a dozen researchers presented and discussed their findings from tests using various sensors and sensor platforms. The workshop resulted in discussions and recommendations on better ways to leverage limited resources and opportunities for advancing research and developing tools and methods for oil slick thickness measurements and estimates that could be applied during spill responses. One of the primary research gaps identified by the workshop participants was the need for side-by-side testing and validation of these different methods, to better understand their respective strengths, weaknesses and technical readiness levels, so that responders would be better able to decide which methods are appropriate under which conditions, and to answer the various questions associated with response actions. Approach: 1) Convene a more in-depth, multi-day researcher workshop to discuss and develop a specific work plan for side-by-side validation and verification experiments testing oil thickness measurements. 2) Conduct the validation and verification experiments in controlled environments: the Coastal Response Research Center (CRRC) highbay at the University of New Hampshire (UNH), and the Ohmsett National Oil Spill Response Research & Renewable Energy Test Facility.

    Modelling, Monitoring, Control and Optimization for Complex Industrial Processes

    This reprint includes 22 research papers and an editorial, collected from the Special Issue "Modelling, Monitoring, Control and Optimization for Complex Industrial Processes", highlighting recent research advances and emerging research directions in complex industrial processes. The reprint aims to promote the research field and to benefit readers from both academic communities and industrial sectors.

    Numerical Simulations of Dusty Colliding Wind Binaries

    Colliding Wind Binary (CWB) systems are relatively rare phenomena, but have a significant influence on galactic evolution in terms of dust production, especially in the early universe. The mechanisms behind this dust production, however, are poorly understood. The strong winds from both partners in the binary system drive shocks that heat the dust-forming region to temperatures in excess of 100 million kelvin; whilst this region does cool rapidly, the initial shock temperatures would destroy any dust grains that formed outside the collision region. Furthermore, this collision region is difficult to observe and simulate, limiting our understanding of how grains form and evolve within it. This thesis attempts to improve our understanding of the evolution of dust grains within these systems, particularly the growth of grains from small cores to micron-scale sizes. A co-moving dust grain model was implemented that simulates growth through accretion of gas onto the dust grains, as well as destruction through gas-grain sputtering. The model also simulates cooling through collisional excitation and subsequent emission for both dust grains and gas. Overall, the goal of this model was to determine how dust growth is influenced by the wind and orbital characteristics of the system, and which of these characteristics are most important for dust growth. First, a parameter space exploration of dust-producing CWB systems (WCd systems) was conducted, varying the orbital separation, the wind terminal velocity and the mass loss rate of each star. It was found that dust production is strongly influenced by the ratio of wind terminal velocities between the stars, as well as by the orbital separation. Following up on this, a limited simulation of the episodic dust-forming system WR140 was conducted, in order to understand how variation in orbital separation due to eccentricity changes dust production rates over the course of a periastron passage. It was determined that dust production occurs over a very short period immediately prior to periastron passage and a small period after, with an "active" phase of approximately 1 year, or an eighth of the system's orbital period. Whilst there is much to be done in the future, and many more systems to be simulated (in particular the recently discovered WR+WR CWB systems WR48a and WR70-16), this model is a good first step towards shedding light on these elusive and dust-shrouded systems.
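    To illustrate the competition the grain model captures (a toy sketch with assumed parameter values, not the thesis code): the grain radius grows by accretion at a rate proportional to the local gas density and mean thermal speed, while sputtering erodes it in hot post-shock gas.

        # Toy co-moving grain: accretion growth versus sputtering (cgs units).
        import numpy as np

        def grain_radius(a0, rho_gas, T_gas, t_end, dt=1e4):
            """Integrate grain radius a(t) [cm] with a forward-Euler step."""
            k_B, m_H = 1.38e-16, 1.67e-24  # Boltzmann constant, hydrogen mass
            rho_grain, eps = 2.2, 0.1      # grain density [g/cm^3], sticking efficiency
            a, t = a0, 0.0
            while t < t_end and a > 0.0:
                v_th = np.sqrt(8 * k_B * T_gas / (np.pi * m_H))  # mean thermal speed
                growth = eps * rho_gas * v_th / (4 * rho_grain)  # accretion, cm/s
                sputter = 1e-7 / 3.15e7 if T_gas > 1e6 else 0.0  # ~1 nm/yr above 10^6 K (assumed)
                a = max(a + (growth - sputter) * dt, 0.0)
                t += dt
            return a

        # Cool, dense post-shock gas: growth wins; a 5 nm core grows over one year.
        print(grain_radius(a0=5e-7, rho_gas=1e-17, T_gas=1e4, t_end=3.15e7))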