
    A Mechanized Semantic Framework for Real-Time Systems

    Concurrent systems consist of many components that may execute in parallel and are complex to design, analyze, verify, and implement. The complexity increases when the systems have real-time constraints, which are essential in avionics, space, and other kinds of embedded applications. In this paper we present a logical framework for defining and validating real-time formalisms as well as reasoning methods over them. For this purpose, we have implemented in the Coq proof assistant well-known semantic domains for real-time systems based on labelled transition systems and timed runs. We exercise our framework on the real-time CSP-based language Fiacre, which has been defined as a pivot formalism for the modeling languages (AADL, SDL, ...) used in the TOPCASED project. We then define an extension of the formal semantic models mentioned above that facilitates the modeling of the fine-grained time constraints of Fiacre. Finally, we implement this extension in our framework and provide a proof-method environment for dealing with real-time systems, with the aim of achieving their formal certification.
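
    To make the semantic domain concrete, here is a minimal Python sketch, not the paper's Coq development, of a timed labelled transition system whose runs interleave discrete actions with time-elapse steps; all names and types are illustrative assumptions.

        # Minimal sketch (not the paper's Coq formalization): a timed labelled
        # transition system where each step is either a discrete action
        # (duration 0) or a time-elapse step with a non-negative duration.
        from dataclasses import dataclass
        from typing import Hashable

        @dataclass(frozen=True)
        class Step:
            source: Hashable
            label: str          # action label, or "delay" for time elapse
            duration: float     # time consumed by the step (0 for actions)
            target: Hashable

        def is_timed_run(steps: list[Step], initial) -> bool:
            """A timed run chains steps from the initial state, never
            consuming negative time."""
            state = initial
            for s in steps:
                if s.source != state or s.duration < 0:
                    return False
                state = s.target
            return True

        run = [Step("idle", "start", 0.0, "busy"),
               Step("busy", "delay", 1.5, "busy"),
               Step("busy", "stop", 0.0, "idle")]
        print(is_timed_run(run, "idle"))  # True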

    MacFormer: Map-Agent Coupled Transformer for Real-time and Robust Trajectory Prediction

    Predicting the future behavior of agents is a fundamental task in the autonomous vehicle domain. Accurate prediction relies on comprehending the surrounding map, which significantly regularizes agent behavior. However, existing methods exploit the map only in limited ways and depend strongly on historical trajectories, which yields unsatisfactory prediction performance and robustness. Additionally, their heavy network architectures impede real-time applications. To tackle these problems, we propose the Map-Agent Coupled Transformer (MacFormer) for real-time and robust trajectory prediction. Our framework explicitly incorporates map constraints into the network via two carefully designed modules, named the coupled map and the reference extractor. A novel multi-task optimization strategy (MTOS) is presented to enhance the learning of topology and rule constraints. We also devise a bilateral query scheme in context fusion for a more efficient and lightweight network. We evaluated our approach on the Argoverse 1, Argoverse 2, and nuScenes real-world benchmarks, where it achieved state-of-the-art performance on all three with the lowest inference latency and smallest model size. Experiments also demonstrate that our framework is resilient to imperfect tracklet inputs. Furthermore, we show that classical models combined with our proposed strategies outperform their baselines, further validating the versatility of our framework.
    Comment: Accepted by IEEE Robotics and Automation Letters. 8 pages, 9 figures, 9 tables. Video: https://www.youtube.com/watch?v=XY388iI6sP
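
    The coupled-map idea, conditioning agent features on encoded map elements, can be illustrated by a single cross-attention step. The sketch below is our own rough approximation in Python/NumPy; the shapes, dimensions, and function names are assumptions, not MacFormer's actual architecture.

        # Illustrative only: fusing agent queries with map features via one
        # cross-attention step, the rough idea behind a map-agent coupled design.
        import numpy as np

        def softmax(x, axis=-1):
            e = np.exp(x - x.max(axis=axis, keepdims=True))
            return e / e.sum(axis=axis, keepdims=True)

        def cross_attend(agent_q, map_kv, d=64):
            """agent_q: (A, d) agent queries; map_kv: (M, d) encoded map tokens."""
            scores = agent_q @ map_kv.T / np.sqrt(d)  # (A, M) agent-to-map affinity
            return softmax(scores) @ map_kv           # map-conditioned agent features

        agents = np.random.randn(4, 64)   # 4 agents, hypothetical 64-dim embeddings
        lanes = np.random.randn(20, 64)   # 20 map (lane) tokens
        fused = cross_attend(agents, lanes)
        print(fused.shape)                # (4, 64)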

    Incremental Latency Analysis of Heterogeneous Cyber-Physical Systems

    REACTION 2014. 3rd International Workshop on Real-time and Distributed Computing in Emerging Applications. Rome, Italy. December 2nd, 2014.
    Cyber-physical systems, as used in the automotive, avionics, or aerospace domains, have critical real-time requirements. Time-related issues can have important impacts and, as these systems become extremely software-reliant, validating and enforcing timing constraints is becoming difficult. Current techniques mainly validate these constraints late, using integration tests and tracing of the system execution. Such methods are time-consuming and labor-intensive, and discovering timing issues late in the development process can incur significant rework effort. In this paper, we propose an incremental model-based approach to analyze and validate the timing requirements of cyber-physical systems. We first capture the system functions and their latency requirements and validate the end-to-end latency at a high level. This functional architecture is then refined into an implementation deployed on an execution platform. As the system description evolves, the latency analysis is refined with more precise values. Such an approach provides latency analysis from a high-level specification without having to implement the system, saving potential re-engineering effort. It also helps engineers select appropriate execution-platform components or change the deployment strategy of system functions to ensure that latency requirements will be met when the system is implemented.
    This material is based upon work funded and supported by the Department of Defense under Contract No. FA8721-05-C-0003 with Carnegie Mellon University for the operation of the Software Engineering Institute, a federally funded research and development center.
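
    A toy rendering of the incremental analysis in Python: start from coarse per-function latency budgets, validate the end-to-end requirement, then refine entries as deployment decisions supply more precise values and re-check. The function names and numbers are illustrative, not taken from the paper's tooling.

        # Hypothetical sketch: end-to-end latency over a function chain,
        # re-validated as estimates are refined during deployment.
        def end_to_end_latency(chain, estimates):
            """Sum the current best-known latency (ms) along a function chain."""
            return sum(estimates[f] for f in chain)

        chain = ["sense", "fuse", "plan", "actuate"]
        estimates = {"sense": 5.0, "fuse": 10.0, "plan": 20.0, "actuate": 5.0}
        requirement_ms = 45.0

        # High-level validation against coarse budgets: 40 ms <= 45 ms, OK.
        assert end_to_end_latency(chain, estimates) <= requirement_ms

        # Refinement: deploying "plan" on a slower board updates its latency,
        # and the same check now flags a violation early, before integration.
        estimates["plan"] = 28.0
        print(end_to_end_latency(chain, estimates) <= requirement_ms)  # False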

    Towards Certain Fixes with Editing Rules and Master Data

    A variety of integrity constraints have been studied for data cleaning. While these constraints can detect the presence of errors, they fall short of guiding us to correct the errors. Indeed, data repairing based on these constraints may not find certain fixes that are absolutely correct and, worse, may introduce new errors when repairing the data. We propose a method for finding certain fixes, based on master data, a notion of certain regions, and a class of editing rules. A certain region is a set of attributes that are assured correct by the users. Given a certain region and master data, editing rules tell us which attributes to fix and how to update them. We show how the method can be used in data monitoring and enrichment. We develop techniques for reasoning about editing rules, to decide whether they lead to a unique fix and whether they are able to fix all the attributes in a tuple, relative to master data and a certain region. We also provide an algorithm to identify minimal certain regions, such that a certain fix is warranted by editing rules and master data as long as one of the regions is correct. We experimentally verify the effectiveness and scalability of the algorithm.
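
    The following Python sketch, simplified relative to the paper's formal definitions, shows the gist of an editing rule: when a tuple matches a master record on attributes inside the user-certified certain region, the rule copies the master value into a target attribute. The attribute names and data are hypothetical.

        # Simplified editing rule: a match on certified attributes warrants
        # copying the corresponding master value into the attribute to fix.
        def apply_editing_rule(tuple_, master, certain_region, match_attrs, fix_attr):
            """Return a fixed copy of tuple_ when the rule's precondition holds."""
            if not set(match_attrs) <= set(certain_region):
                return tuple_  # only attributes assured correct may drive a fix
            for m in master:
                if all(tuple_[a] == m[a] for a in match_attrs):
                    fixed = dict(tuple_)
                    fixed[fix_attr] = m[fix_attr]
                    return fixed
            return tuple_

        master = [{"zip": "EH8 9AB", "city": "Edinburgh"}]
        dirty = {"zip": "EH8 9AB", "city": "Edniburgh"}  # typo in city
        print(apply_editing_rule(dirty, master, {"zip"}, ["zip"], "city"))
        # {'zip': 'EH8 9AB', 'city': 'Edinburgh'}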

    A new model for solution of complex distributed constrained problems

    In this paper we describe an original computational model for solving different types of Distributed Constraint Satisfaction Problems (DCSP). The proposed model is called Controller-Agents for Constraints Solving (CACS). It is intended for use in DCSP, a field that emerged from the integration of two paradigms of different nature: Multi-Agent Systems (MAS) and the Constraint Satisfaction Problem (CSP) paradigm. In CACS, constraints are treated centrally as a black box: the model allows grouping constraints into a subset that is treated together as a local problem inside the controller. The model also handles non-binary constraints easily and directly, so no translation of constraints into binary ones is needed. This paper presents the implementation outline of a prototype DCSP solver, its usage methodology, and an overview of the application of CACS to timetabling problems.
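
    A hedged sketch of the controller idea in Python: a controller owns a grouped subset of constraints, including non-binary ones, and checks agents' proposed assignments against them as one local problem. The class and variable names are ours, not the authors' implementation.

        # Illustrative controller holding a grouped subset of constraints.
        class Controller:
            def __init__(self, constraints):
                self.constraints = constraints  # each: (variables, predicate)

            def consistent(self, assignment):
                """Check only those constraints whose variables are all assigned."""
                for variables, predicate in self.constraints:
                    if all(v in assignment for v in variables):
                        if not predicate(*(assignment[v] for v in variables)):
                            return False
                return True

        # A ternary (non-binary) constraint handled directly, no binarization.
        ctrl = Controller([(("x", "y", "z"), lambda x, y, z: x + y + z == 10)])
        print(ctrl.consistent({"x": 2, "y": 3, "z": 5}))  # True
        print(ctrl.consistent({"x": 2, "y": 3, "z": 6}))  # False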

    Toward a Unified Performance and Power Consumption NAND Flash Memory Model of Embedded and Solid State Secondary Storage Systems

    This paper presents a set of models dedicated to describing the structure, functions, performance, and power-consumption behavior of a flash storage subsystem. These models cover a large range of today's NAND flash memory applications. They are designed to be implemented in simulation tools for estimating and comparing the performance and power consumption of I/O requests on flash-based storage systems. Such tools can also help in designing and validating new flash storage systems and management mechanisms. This work is part of a global project aiming to build a framework for simulating complex flash storage hierarchies for performance and power-consumption analysis. The tool will be highly configurable and modular, with various levels of usage complexity according to the intended aim: from a software user's point of view, for simulating storage systems, to a developer's point of view, for designing, testing, and validating new flash storage management systems.
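
    The modeling idea can be illustrated with a small Python estimator that derives the service time and energy of an I/O trace from per-operation NAND costs. The cost figures below are made-up placeholders, not measurements from the paper.

        # Toy cost model: placeholder (latency in microseconds, power in mW)
        # per NAND operation; real values depend on the chip and the paper's
        # calibrated models.
        COSTS = {
            "read":  (25.0, 30.0),
            "write": (200.0, 35.0),
            "erase": (1500.0, 40.0),
        }

        def estimate(trace):
            """trace: list of operation names; returns (total_us, total_uJ)."""
            total_us = total_uj = 0.0
            for op in trace:
                latency_us, power_mw = COSTS[op]
                total_us += latency_us
                total_uj += power_mw * latency_us / 1000.0  # mW * us = nJ -> uJ
            return total_us, total_uj

        print(estimate(["read"] * 4 + ["write", "erase"]))  # (1800.0, 70.0)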

    Moving from Data-Constrained to Data-Enabled Research: Experiences and Challenges in Collecting, Validating and Analyzing Large-Scale e-Commerce Data

    Widespread e-commerce activity on the Internet has led to new opportunities to collect vast amounts of micro-level market and nonmarket data. In this paper we share our experiences in collecting, validating, storing, and analyzing large Internet-based data sets in the areas of online auctions, music file sharing, and online retailer pricing. We demonstrate how such data can advance knowledge by facilitating sharper and more extensive tests of existing theories and by offering observational underpinnings for the development of new theories. Just as experimental economics pushed the frontiers of economic thought by enabling the testing of numerous theories of economic behavior in the environment of a controlled laboratory, we believe that observing real-world agents participating in market and nonmarket activity on the Internet, often over extended periods of time, can lead us to develop and test a variety of new theories. Internet data gathering is not controlled experimentation: we cannot randomly assign participants to treatments or determine event orderings. It does, however, offer potentially large data sets with repeated observation of individual choices and actions, and automated data collection holds promise for greatly reduced cost per observation. Our methods rely on technological advances in automated data-collection agents. Significant challenges remain in developing appropriate sampling techniques, integrating data from heterogeneous sources in a variety of formats, constructing generalizable processes, and understanding legal constraints. Despite these challenges, the early evidence from those who have harvested and analyzed large amounts of e-commerce data points toward a significant leap in our ability to understand the functioning of electronic commerce.
    Comment: Published at http://dx.doi.org/10.1214/088342306000000231 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org)

    Proof Generation from Delta-Decisions

    We show how to generate and validate logical proofs of unsatisfiability from delta-complete decision procedures that rely on error-prone numerical algorithms. Solving this problem is important for ensuring the correctness of the decision procedures. At the same time, it is a new approach to automated theorem proving over the real numbers. We design a first-order calculus and transform the computational steps of constraint solving into logical proofs, which are then validated using proof-checking algorithms. As an application, we demonstrate how proofs generated from our solver can establish many nonlinear lemmas in the formal proof of the Kepler Conjecture.
    Comment: Appeared in SYNASC'1
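
    To convey only the numeric side of proof validation (the paper's calculus is first-order and far richer), here is a toy Python checker: a "proof" covers a domain with boxes, and interval arithmetic confirms that the constraint x^2 - 2 = 0 is infeasible on every box. The example constraint, enclosure, and bounds are our own assumptions.

        # Toy unsat-certificate check: the enclosure below is only valid for
        # boxes with 0 <= lo, which our hypothetical certificate satisfies.
        def interval_square(lo, hi):
            candidates = [lo * lo, lo * hi, hi * hi]
            return min(candidates), max(candidates)

        def box_infeasible(lo, hi):
            """0 is outside the interval enclosure of x^2 - 2 over [lo, hi]."""
            slo, shi = interval_square(lo, hi)
            return slo - 2 > 0 or shi - 2 < 0

        # Certificate: boxes covering [0, 1.4]; on each, x^2 - 2 = 0 cannot hold.
        boxes = [(0.0, 0.7), (0.7, 1.4)]
        print(all(box_infeasible(lo, hi) for lo, hi in boxes))  # True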