
    Contract-Based Design of Dataflow Programs

    Quality and correctness are becoming increasingly important aspects of software development, as our reliance on software systems in everyday life continues to grow. Highly complex software systems are today found in critical appliances such as medical equipment, cars, and telecommunication infrastructure. Failures in these kinds of systems may have disastrous consequences. At the same time, modern computer platforms are increasingly concurrent, as the computational capacity of modern CPUs is improved mainly by increasing the number of processor cores. Computer platforms are also becoming increasingly parallel, distributed, and heterogeneous, often involving special processing units, such as graphics processing units (GPUs) or digital signal processors (DSPs), for performing specific tasks more efficiently than is possible on general-purpose CPUs. These modern platforms allow increasingly complex functionality to be implemented in software. Cost-efficient development of software that efficiently exploits the power of such platforms while ensuring correctness is, however, a challenging task. Dataflow programming has become popular in the development of safety-critical software in many domains in the embedded community. For instance, in the automotive domain, the dataflow language Simulink has become widely used in model-based design of control software. For more complex functionality, however, this model of computation may not be expressive enough. In the signal processing domain, more expressive, dynamic models of computation have attracted much attention. These models of computation have, however, not gained as significant uptake in safety-critical domains, largely because it is challenging to provide guarantees regarding, e.g., timing or determinism under them. Contract-based design has become a widespread way to specify and verify correctness properties of software components.
    A contract consists of assumptions (preconditions) regarding the input data and guarantees (postconditions) regarding the output data. By verifying a component with respect to its contract, it is ensured that the output fulfils the guarantees, assuming that the input fulfils the assumptions. While contract-based verification of traditional object-oriented programs has been researched extensively, verification of asynchronous dataflow programs has not received the same attention. In this thesis, a contract-based design framework tailored specifically to dataflow programs is proposed. The proposed framework supports both an extensive subset of the discrete-time Simulink synchronous language and a more general, asynchronous and dynamic, dataflow language. The proposed contract-based verification techniques are automatic, guided only by user-provided invariants, and based on encoding dataflow programs in existing, mature verification tools for sequential programs, such as the Boogie guarded command language and its associated verifier. It is shown how dataflow programs, with components implemented in an expressive programming language with support for matrix computations, can be efficiently encoded in such a verifier. Furthermore, it is also shown that contract-based design can be used to improve the runtime performance of dataflow programs by allowing more scheduling decisions to be made at compile time. All the proposed techniques have been implemented in prototype tools and evaluated on a large number of different programs. The evaluation shows that the methods work in practice and scale to real-world programs.
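The assume/guarantee idea can be illustrated with a small runtime-checking sketch in Python (illustrative only: the thesis performs static verification via Boogie rather than runtime assertions, and the decorator and actor names here are invented):

```python
# Minimal sketch of a contract-checked dataflow actor (hypothetical names;
# not the framework proposed in the thesis).

def contract(requires, ensures):
    """Wrap an actor firing function with precondition/postcondition checks."""
    def decorate(fire):
        def checked(*inputs):
            assert requires(*inputs), "precondition violated"
            outputs = fire(*inputs)
            assert ensures(inputs, outputs), "postcondition violated"
            return outputs
        return checked
    return decorate

@contract(requires=lambda xs: all(x >= 0 for x in xs),
          ensures=lambda ins, out: out >= max(ins[0], default=0))
def running_max(xs):
    """Actor firing: consume a list of tokens, produce their maximum."""
    m = 0
    for x in xs:
        m = max(m, x)
    return m
```

A static verifier discharges the same obligations once and for all, instead of checking them on every firing.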

    Lattice Boltzmann Modelling of Droplet Dynamics on Fibres and Meshed Surfaces

    Fibres and fibrous materials are ubiquitous in nature and industry, and their interactions with liquid droplets are often key to their use and function. These structures can be employed as-is or combined to construct more complex mesh structures. Therefore, to optimise the effectiveness of these structures, the study of the wetting interactions between droplets and solids is essential. In this work, I use the lattice Boltzmann method (LBM) to systematically study three cases: droplets wetting, spreading, and moving across fibres, and droplets impacting mesh structures. First, I focus on partially wetting droplets moving along a fibre. For the so-called clamshell morphology, I find three possible dynamic regimes upon varying the droplet Bond number and the fibre radius: compact, breakup, and oscillation. For small Bond numbers, in the compact regime, the droplet reaches a steady state, and its velocity scales linearly with the driving body force. For higher Bond numbers, in the breakup regime, satellite droplets form trailing the initial moving droplet, which occurs more readily for smaller fibre radii. Finally, in the oscillation regime (favoured in the midrange of fibre radii), the droplet shape periodically extends and contracts along the fibre. Beyond the commonly known fully wetting and partial wetting states, there exists the pseudo-partial wetting state (in which a spherical cap and a thin film coexist), which few numerical methods are able to simulate. I implement long-range interactions between the fluid and the solid in LBM to realise this wetting state. The robustness of this approach is shown by simulating a number of scenarios. I start by simulating droplets in the fully, partial, and pseudo-partial wetting states on flat surfaces, followed by pseudo-partially wetting droplets spreading on grooved surfaces and fibre structures. I also explore the effects of key parameters in the long-range interactions.
    For the dynamics demonstration, I simulate droplets in the pseudo-partial wetting state moving along a fibre in both the barrel and clamshell morphologies at different droplet volumes and fibre radii. Finally, I focus on the dynamics of droplets impacting square mesh structures. I systematically vary the impact point, trajectory, and velocity. To rationalise the results, I find it useful to consider whether the droplet trajectory is dominated by orthogonal or diagonal movement. The former leads to a lower incident rate and a more uniform interaction time distribution, while the latter is typically characterised by more complex, less predictable droplet trajectories. Then, focussing on one impact point, I compare the dynamics of droplets impacting a single-layer structure and equivalent double-layer structures. From a water-capturing capability perspective (given the same effective pore size), a double-layer structure performs slightly worse. A double-layer structure also generally leads to a shorter interaction time than a single-layer structure.
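LBM's collision-and-streaming cycle can be sketched with a minimal single-phase D2Q9 BGK step (a sketch only: the multiphase, wetting-capable solver used in this work adds forcing and fluid-solid interaction terms not shown here; the grid size and relaxation time below are arbitrary):

```python
import numpy as np

# Minimal single-phase D2Q9 BGK lattice Boltzmann step (illustrative sketch).
W = np.array([4/9] + [1/9]*4 + [1/36]*4)              # lattice weights
C = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])    # lattice velocities
TAU = 0.8                                             # relaxation time

def equilibrium(rho, u):
    """Second-order equilibrium distribution for density rho, velocity u."""
    cu = np.einsum('qd,xyd->xyq', C, u)
    usq = np.einsum('xyd,xyd->xy', u, u)[..., None]
    return rho[..., None] * W * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

def step(f):
    """One BGK collision followed by periodic streaming."""
    rho = f.sum(-1)
    u = np.einsum('xyq,qd->xyd', f, C) / rho[..., None]
    f = f + (equilibrium(rho, u) - f) / TAU           # collide
    for q, (cx, cy) in enumerate(C):                  # stream along c_q
        f[..., q] = np.roll(np.roll(f[..., q], cx, 0), cy, 1)
    return f
```

Mass and momentum are conserved by construction, which makes the method attractive for interface-rich droplet problems.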

    Authentication enhancement in command and control networks: (a study in Vehicular Ad-Hoc Networks)

    Intelligent transportation systems contribute to improved traffic safety by facilitating real-time communication between vehicles. Because they use wireless channels for communication, vehicular networks are susceptible to a wide range of attacks, such as impersonation, modification, and replay. In this context, securing data exchange between intercommunicating terminals, e.g., vehicle-to-everything (V2X) communication, constitutes a technological challenge that needs to be addressed. Hence, message authentication is crucial to safeguard vehicular ad-hoc networks (VANETs) from malicious attacks. The current state of the art for authentication in VANETs relies on conventional cryptographic primitives, introducing significant computation and communication overheads. In this challenging scenario, physical (PHY)-layer authentication has gained popularity; it leverages the inherent characteristics of wireless channels and hardware imperfections to discriminate between wireless devices. However, PHY-layer-based authentication cannot replace crypto-based methods, as the initial legitimacy detection must be conducted using cryptographic methods to extract the communicating terminal's secret features. Nevertheless, it can be a promising complementary solution to the re-authentication problem in VANETs, introducing what is known as “cross-layer authentication.” This thesis focuses on designing efficient cross-layer authentication schemes for VANETs, reducing the communication and computation overheads associated with transmitting and verifying a crypto-based signature for each transmission. The following provides an overview of the methodologies employed in the contributions presented in this thesis.
    1. The first cross-layer authentication scheme: a four-step process comprising initial crypto-based authentication, shared key extraction, re-authentication via a PHY challenge-response algorithm, and adaptive adjustments based on channel conditions. Simulation results validate its efficacy, especially in low signal-to-noise ratio (SNR) scenarios, while proving its resilience against active and passive attacks.
    2. The second cross-layer authentication scheme: leveraging the spatially and temporally correlated wireless channel features, this scheme extracts high-entropy shared keys that can be used to create dynamic PHY-layer signatures for authentication. A 3-dimensional (3D) scattering Doppler emulator is designed to investigate the scheme's performance at different vehicle speeds and SNRs. Theoretical and hardware implementation analyses prove the scheme's capability to support a high detection probability for an acceptable false alarm value ≤ 0.1 at SNR ≥ 0 dB and speed ≤ 45 m/s.
    3. The third proposal, reconfigurable intelligent surface (RIS) integration for improved authentication: focusing on enhancing PHY-layer re-authentication, this proposal explores integrating RIS technology to improve the SNR directed at designated vehicles. Theoretical analysis and practical implementation of the proposed scheme are conducted using a 1-bit RIS consisting of 64 × 64 reflective units. Experimental results show a significant improvement in the detection probability (Pd), increasing from 0.82 to 0.96 at SNR = −6 dB for multicarrier communications.
    4. The fourth proposal, RIS-enhanced vehicular communication security: tailored for challenging SNR in non-line-of-sight (NLoS) scenarios, this proposal optimises key extraction and defends against denial-of-service (DoS) attacks through selective signal strengthening. Hardware implementation studies prove its effectiveness, showcasing improved key extraction performance and resilience against potential threats.
    5. The fifth cross-layer authentication scheme: integrating PKI-based initial legitimacy detection and blockchain-based reconciliation techniques, this scheme ensures secure data exchange. Rigorous security analyses and performance evaluations using network simulators and computation metrics showcase its effectiveness, its resistance against common attacks, and its time efficiency in message verification.
    6. The final proposal, group key distribution: employing smart contract-based blockchain technology alongside PKI-based authentication, this proposal distributes group session keys securely. Its lightweight symmetric-key cryptography-based method maintains privacy in VANETs, validated via Ethereum's main network (MainNet) and comprehensive computation and communication evaluations.
    The analysis shows that the proposed methods yield a noteworthy reduction, approximately ranging from 70% to 99%, in both computation and communication overheads compared to the conventional approaches. This reduction pertains to the verification and transmission of 1000 messages in total.
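The common thread of the PHY-layer schemes above, re-authenticating a device by comparing fresh channel features against a reference established during the initial crypto-based authentication, can be sketched as a simple threshold test (a generic illustration with invented names and an arbitrary threshold, not any of the six proposals):

```python
import numpy as np

# Generic PHY-layer re-authentication as a binary hypothesis test:
# accept a frame iff its channel estimate stays close to the reference
# captured during crypto-based authentication. Threshold is illustrative.

def reauthenticate(h_ref, h_now, threshold):
    """Accept iff the normalised distance between channel estimates is small."""
    d = np.linalg.norm(h_now - h_ref) / np.linalg.norm(h_ref)
    return d <= threshold

rng = np.random.default_rng(0)
h_ref = rng.standard_normal(8) + 1j * rng.standard_normal(8)       # reference
h_legit = h_ref + 0.05 * (rng.standard_normal(8)
                          + 1j * rng.standard_normal(8))           # same channel
h_attacker = rng.standard_normal(8) + 1j * rng.standard_normal(8)  # independent
```

The threshold trades detection probability against false alarms, which is why the schemes above adapt it to channel conditions and SNR.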

    Discrete modelling of continuous dynamic recrystallisation by modified Metropolis algorithm

    Continuous dynamic recrystallisation (CDRX) is often the primary mechanism for microstructure evolution during severe plastic deformation (SPD) of polycrystalline metals. Physically realistic simulation of CDRX remains challenging for existing modelling approaches based on continuum mathematics because they do not capture important local interactions between microstructure elements or spatial inhomogeneities in plastic strain. An effective discrete method for simulating CDRX is developed in this work. It employs tools from algebraic topology, graph theory, and statistical physics to represent the evolution of grain boundary networks as a sequence of conversions between low-angle grain boundaries (LAGBs) and high-angle grain boundaries (HAGBs), governed by the principle of minimal energy increase, similar to the well-known Ising model. The energy is minimised by a modified Metropolis algorithm. The model is used to predict the equilibrium fractions of HAGBs in several SPD-processed copper alloys. The analysis captures non-equilibrium features of the transitions from sub-grain structures to new HAGB-dominated grain structures and provides estimates of the critical HAGB fractions and accumulated strain at these transitions.
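The Metropolis acceptance rule at the heart of such energy-minimising models can be sketched in its generic Ising-like form (illustrative only: the actual model operates on grain boundary networks via algebraic topology; the toy energy and flip move below are invented):

```python
import math
import random

# Generic Metropolis step over boundary states (0 = LAGB, 1 = HAGB).
# Toy setup, not the modified algorithm or energy function of the paper.

def metropolis_step(state, energy, propose, beta, rng=random):
    """Propose a boundary flip and accept it with the Metropolis rule."""
    candidate = propose(state)
    dE = energy(candidate) - energy(state)
    if dE <= 0 or rng.random() < math.exp(-beta * dE):
        return candidate          # accept: energy decreased, or lucky draw
    return state                  # reject: keep current configuration

# Toy energy that favours LAGB -> HAGB conversion when minimised.
energy = lambda s: -sum(s)

def propose(s):
    """Flip one randomly chosen boundary between LAGB and HAGB."""
    i = random.randrange(len(s))
    t = list(s)
    t[i] = 1 - t[i]
    return t
```

Iterating such steps drives the boundary population towards an equilibrium HAGB fraction, which is the quantity the model predicts for SPD-processed alloys.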

    Search and Explore: Symbiotic Policy Synthesis in POMDPs

    This paper marries two state-of-the-art controller synthesis methods for partially observable Markov decision processes (POMDPs), a prominent model in sequential decision making under uncertainty. A central issue is to find a POMDP controller - one that decides based solely on the observations seen so far - that achieves a total expected reward objective. As finding optimal controllers is undecidable, we concentrate on synthesising good finite-state controllers (FSCs). We do so by tightly integrating two modern, orthogonal methods for POMDP controller synthesis: a belief-based and an inductive approach. The former obtains an FSC from a finite fragment of the so-called belief MDP, an MDP that keeps track of the probabilities of equally observable POMDP states. The latter is an inductive search technique over a set of FSCs, e.g., controllers with a fixed memory size. The key result of this paper is a symbiotic anytime algorithm that tightly integrates both approaches such that each profits from the controllers constructed by the other. Experimental results indicate a substantial improvement in the value of the controllers while significantly reducing the synthesis time and memory footprint.
    Comment: Accepted to CAV 202
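The belief MDP mentioned above rests on the standard Bayesian belief update; a minimal sketch with assumed array layouts (T[a, s, s'] for transitions, O[a, s', o] for observations), not the paper's implementation:

```python
import numpy as np

# Standard POMDP belief update: b'(s') ∝ O(o | s', a) * Σ_s T(s' | s, a) b(s).
# Array layouts are assumptions for this sketch.

def belief_update(b, a, o, T, O):
    """Return the next belief after taking action a and observing o."""
    pred = b @ T[a]                  # predicted state distribution
    post = pred * O[a][:, o]         # weight by observation likelihood
    return post / post.sum()         # normalise (assumes P(o) > 0)
```

The belief MDP's states are exactly these belief vectors, which is why its finite fragments yield controllers that only depend on the observation history.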

    Current and Future Challenges in Knowledge Representation and Reasoning

    Knowledge Representation and Reasoning is a central, longstanding, and active area of Artificial Intelligence. Over the years it has evolved significantly; more recently it has been challenged and complemented by research in areas such as machine learning and reasoning under uncertainty. In July 2022 a Dagstuhl Perspectives workshop was held on Knowledge Representation and Reasoning. The goal of the workshop was to describe the state of the art in the field, including its relation with other areas, its shortcomings and strengths, together with recommendations for future progress. We developed this manifesto based on the presentations, panels, working groups, and discussions that took place at the Dagstuhl Workshop. It is a declaration of our views on Knowledge Representation: its origins, goals, milestones, and current foci; its relation to other disciplines, especially to Artificial Intelligence; and on its challenges, along with key priorities for the next decade.

    Counterfactual Causality for Reachability and Safety based on Distance Functions

    Investigations of causality in operational systems aim to provide human-understandable explanations of why a system behaves as it does. There is, in particular, a demand to explain what went wrong on a given counterexample execution showing that a system does not satisfy a given specification. To this end, this paper investigates a notion of counterfactual causality in transition systems based on Stalnaker's and Lewis' semantics of counterfactuals in terms of most similar possible worlds, and introduces a novel corresponding notion of counterfactual causality in two-player games. Using distance functions between paths in transition systems, this notion defines whether reaching a certain set of states is a cause for the violation of a reachability or safety property. Similarly, using distance functions between memoryless strategies in reachability and safety games, it defines whether reaching a set of states is a cause for the fact that a given strategy for the player under investigation is losing. The contribution of the paper is two-fold: In transition systems, it is shown that counterfactual causality can be checked in polynomial time for three prominent distance functions between paths. In two-player games, the introduced notion of counterfactual causality is shown to be checkable in polynomial time for two natural distance functions between memoryless strategies. Further, a notion of explanation is defined that can be extracted from a counterfactual cause and that pinpoints changes to be made to the given strategy in order to transform it into a winning strategy.
    For the two distance functions under consideration, the problem of deciding whether such an explanation imposes only minimal necessary changes to the given strategy with respect to the used distance function turns out to be coNP-complete for one and not solvable in polynomial time (unless P = NP) for the other.
    Comment: This is the extended version of a paper accepted for publication at GandALF 202
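On finite path sets, the closest-worlds reading of counterfactual causality can be sketched as follows (a toy illustration with a plain positionwise distance and paths written as strings of state names; the paper's definitions over transition systems and games are considerably more involved):

```python
# Toy closest-worlds counterfactual check: reaching `cause` counts as a
# counterfactual cause of the violation if the most similar alternative
# paths that avoid `cause` all satisfy the property.

def hamming(p, q):
    """Positionwise mismatch count between equal-length paths."""
    return sum(a != b for a, b in zip(p, q))

def is_counterfactual_cause(cex, alt_paths, cause, satisfies):
    """cex violates the property; test the Lewis-style counterfactual."""
    avoiding = [p for p in alt_paths if not any(s in cause for s in p)]
    if not avoiding:
        return False
    dmin = min(hamming(cex, p) for p in avoiding)
    closest = [p for p in avoiding if hamming(cex, p) == dmin]
    return all(satisfies(p) for p in closest)
```

The polynomial-time results in the paper hinge on never enumerating alternative paths explicitly, which this naive sketch does.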

    An Investigation into Radiative Property Variations across Pre-Annealed Advanced High Strength Steel Coils

    In recent years, increasingly stringent crashworthiness and emissions regulations have driven automakers to consider novel materials for automotive lightweighting. The use of advanced high-strength steels (AHSS) in automotive chassis construction has increased considerably. The most widely used grades of AHSS are dual-phase (DP) ferrite-martensite (α + α') grades. AHSS coils must be annealed with a precise heating schedule to achieve the required mechanical properties. However, temperature excursions during intercritical annealing cause erratic changes in the steel's microstructure, resulting in variations in post-annealed mechanical properties across coils. These variations lead to high scrap rates and cost manufacturers millions of dollars annually. Past research has attributed these temperature excursions to non-uniform thermal irradiation. The present work shows that variations in radiative properties across a single AHSS coil may cause temperature excursions through pyrometer errors and non-uniform heating. Radiative property variations across a coil may also arise before annealing due to non-homogeneities in surface topography, influencing how the radiative properties subsequently evolve during annealing. This thesis documents experimental and theoretical work characterising radiative property variations across a single AHSS coil processed on an industrial cold-rolling line. The ex-situ radiative properties of samples extracted from various coil locations are analysed using a Fourier transform infrared (FTIR) spectrometer equipped with an integrating sphere, revealing large swings in radiative properties along the coil's length and width. The effect of these variations on pyrometric temperature measurements, strip temperature evolution, and, in turn, the as-formed mechanical properties is discussed.
    Using optical profilometry, optical microscopy, and scanning electron microscopy (SEM), radiative property variations are shown to correlate strongly with differences in surface topography, particularly surface cavities. The work uses 3D depth mapping of optical imagery to generate surface height maps and theoretically models the radiative properties using a geometric optics approximation (GOA) ray-tracing algorithm. The GOA approach provides accurate spectral emissivity predictions within its validity regime. The study then explores reasons for surface cavity formation, hypothesising that cavities form due to the dissolution of selective grain boundary oxides (formed during hot rolling) during acid pickling, which leads to micro-topographical changes to the strip surface; non-homogeneous cold-rolling parameters subsequently lead to non-uniform cavity flattening. The thesis then explores the combined effect of acid pickling time and cold-rolling reduction percentage by studying different AHSS alloys, cold-rolled and acid-pickled to different extents, through a factorial design-of-experiments procedure. An artificial neural network (ANN) regression model for near-instantaneous spectral emissivity predictions of AHSS was developed using surface roughness parameters and optical imagery as inputs. Manufacturers can implement this model with emerging in-situ strip imaging technologies to provide real-time spectral emissivity predictions before a coil section enters an annealing furnace. Galvanisers can also use these on-line spectral emissivity predictions to update pyrometry and furnace temperature control algorithms in real time. This thesis expands our knowledge of the possible causes of temperature excursions across an AHSS coil during annealing. The findings of this research will help steel manufacturers identify and reduce non-homogeneities in mechanical properties across AHSS coils, reducing high scrap rates in the industry.
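The link between an emissivity variation and a pyrometer temperature error can be made concrete with the standard Wien-approximation relation for a single-wavelength pyrometer (textbook radiometry, not a result from this thesis; the wavelength and emissivity values below are illustrative):

```python
import math

# Single-wavelength pyrometry under the Wien approximation:
# 1/T_ind = 1/T_true - (lambda / C2) * ln(eps_true / eps_assumed).
# Illustrative numbers only.

C2 = 1.4388e-2  # second radiation constant, m*K

def indicated_temperature(t_true, wavelength, eps_true, eps_assumed):
    """Temperature a pyrometer reports when its emissivity setting is wrong."""
    inv_t = 1.0 / t_true - (wavelength / C2) * math.log(eps_true / eps_assumed)
    return 1.0 / inv_t
```

A strip whose true emissivity is lower than the pyrometer's setting reads low, so the furnace controller overheats it; this is the mechanism by which emissivity swings along a coil translate into temperature excursions.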

    Evaluating Architectural Safeguards for Uncertain AI Black-Box Components

    Although tremendous progress has been made in Artificial Intelligence (AI), this progress entails new challenges. The growing complexity of learning tasks requires more complex AI components, which increasingly exhibit unreliable behaviour. In this book, we present a model-driven approach to modelling architectural safeguards for AI components and analysing their effect on the overall system reliability.
