272 research outputs found

    Impact of the assimilation of conventional data on the quantitative precipitation forecasts in the Eastern Mediterranean

    This study evaluates the role of the assimilation of conventional data in quantitative precipitation forecasts at the regional scale. The conventional data included surface station reports as well as upper-air observations. The analysis was based on the simulation of 15 cases of heavy precipitation that occurred in the Eastern Mediterranean. The verification procedure revealed that the ingestion of conventional data by objective analysis into the initial conditions of the BOLAM limited-area model does not result in a statistically significant improvement of the quantitative precipitation forecasts.

    From MTL to deterministic timed automata

    In this paper we propose a novel technique for constructing timed automata from properties expressed in the logic MTL, under bounded-variability assumptions. We handle full MTL and, in particular, do not impose bounds on the future temporal connectives. Our construction is based on separating the continuous-time monitoring of the input sequence from discrete predictions regarding the future. This separation of the continuous from the discrete allows us to further determinize our automata, leading, for the first time, to a construction from full MTL to deterministic timed automata.

    Temporal Logic Robustness for General Signal Classes

    In multi-agent systems, robots transmit their planned trajectories to each other or to a central controller, and each receiver plans its own actions by maximizing a measure of mission satisfaction. For missions expressed in temporal logic, the robustness function plays the role of satisfaction measure. Currently, a piecewise-linear (PWL) or piecewise-constant reconstruction is used at the receiver. This allows an efficient robustness computation algorithm (a.k.a. monitoring), but it is not adaptive to the signal class of interest and does not leverage the compression properties of more general representations. When communication capacity is at a premium, this is a serious bottleneck. In this paper we first show that the robustness computation is significantly affected by how the continuous-time signal is reconstructed from the received samples, which can mean the difference between successful control and a crash. We show that monitoring general spline-based reconstructions yields a smaller robustness error, and that it can be done with the same time complexity as monitoring the simpler PWL reconstructions. Thus robustness computation can now be adapted to the signal class of interest. We further show that the monitoring error is tightly upper-bounded by the L∞ signal reconstruction error. We present a (non-linear) L∞-based scheme which yields even lower monitoring error than the spline-based schemes (which have the advantage of being faster to compute), and illustrate all results on two case studies. As an application of these results, we show how time-frequency specifications can be efficiently monitored online.
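
    The sensitivity of robustness to the chosen reconstruction can be sketched with a toy example (the signal, threshold, and property below are invented for illustration and are not the paper's data or algorithm). For the STL property "always (x < 1.1)", robustness is min over time of 1.1 − x(t): under PWL reconstruction the minimum is attained at a sample, while a Catmull-Rom cubic spline can overshoot between samples and flip the verdict.

```python
import numpy as np

# Illustrative samples: PWL interpolation stays <= 1, but a cubic spline
# through the same points overshoots between t=1 and t=2.
t = np.array([0.0, 1.0, 2.0, 3.0])
x = np.array([0.0, 1.0, 1.0, 0.0])
thresh = 1.1

# PWL reconstruction: extrema occur at samples, so robustness of
# "always (x < thresh)" is computed from the sampled values alone.
rob_pwl = np.min(thresh - x)  # 0.1 > 0: satisfied

# Catmull-Rom cubic Hermite segment between samples 1 and 2
# (tangents are central differences of the neighbouring samples).
s = np.linspace(0.0, 1.0, 1001)
p0, p1 = x[1], x[2]
m0, m1 = (x[2] - x[0]) / 2.0, (x[3] - x[1]) / 2.0
h = ((2*s**3 - 3*s**2 + 1) * p0 + (s**3 - 2*s**2 + s) * m0
     + (-2*s**3 + 3*s**2) * p1 + (s**3 - s**2) * m1)

# The spline peaks at 1.125 > thresh, so the robustness goes negative.
rob_spline = min(np.min(thresh - x), np.min(thresh - h))
```

Here `rob_pwl` is 0.1 (property satisfied) while `rob_spline` is −0.025 (property violated), i.e., the same sample stream yields opposite verdicts depending on the reconstruction, which is the bottleneck the paper's error bounds address.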

    LNCS

    In this paper we propose a novel technique for constructing timed automata from properties expressed in the logic MTL, under bounded-variability assumptions. We handle full MTL and include all future operators. Our construction is based on separating the continuous-time monitoring of the input sequence from discrete predictions regarding the future. This separation of the continuous from the discrete allows us to determinize our automata in an exponential construction that does not increase the number of clocks. This leads to a doubly exponential construction from MTL to deterministic timed automata, compared with the triply exponential one obtained using existing approaches. We offer an alternative to the existing approach to linear real-time model checking, which has never been implemented. Our approach further offers a unified framework for model checking, runtime monitoring, and synthesis that can reuse tools, implementations, and insights from the discrete setting.

    Second International Competition on Runtime Verification: CRV 2015

    We report on the Second International Competition on Runtime Verification (CRV-2015). The competition was held as a satellite event of the 15th International Conference on Runtime Verification (RV'15). The competition consisted of three tracks: offline monitoring, online monitoring of C programs, and online monitoring of Java programs. This report describes the format of the competition, the participating teams, and the submitted benchmarks. We give an example illustrating the two main inputs expected from the participating teams, namely a benchmark (i.e., a program and a property on this program) and a monitor for this benchmark. We also offer some reflections based on the lessons learned.

    IST Austria Technical Report

    Model-based testing is a promising technology for black-box software and hardware testing, in which test cases are generated automatically from high-level specifications. Nowadays, systems typically consist of multiple interacting components and, due to their complexity, testing represents a considerable portion of the effort and cost in the design process. Exploiting the compositional structure of system specifications can considerably reduce the effort in model-based testing. Moreover, inferring properties about the system from testing its individual components allows the designer to reduce the amount of integration testing. In this paper, we study compositional properties of the IOCO-testing theory. We propose a new approach to the composition and hiding operations, inspired by contract-based design and interface theories. These operations preserve behaviors that are compatible under composition and hiding, and prune away incompatible ones. The resulting specification characterizes the input sequences for which the unit testing of components is sufficient to infer the correctness of component integration without the need for further tests. We provide a methodology that uses these results to minimize integration-testing effort, and also to detect potential weaknesses in specifications. While we focus on asynchronous models and the IOCO conformance relation, the resulting methodology can be applied to a broader class of systems.

    Technical Note: High-resolution mineralogical database of dust-productive soils for atmospheric dust modeling

    Dust storms and associated mineral aerosol transport are driven primarily by meso- and synoptic-scale atmospheric processes. It is therefore essential that the dust aerosol process and the background atmospheric conditions that drive dust emission and atmospheric transport are represented with sufficiently well-resolved spatial and temporal features. The effects of airborne dust interactions with the environment depend on the mineral composition of the dust particles. The fractions of various minerals in the aerosol are determined by the mineral composition of arid soils; therefore, a high-resolution specification of the mineral and physical properties of dust sources is needed.

    Several current atmospheric dust models simulate and predict the evolution of dust concentrations; however, in most cases, these models do not consider the fractions of minerals in the dust. The accumulated knowledge about the impacts of the mineral composition of dust on weather and climate processes emphasizes the importance of including minerals in modeling systems. Accordingly, in this study we developed a global dataset consisting of the mineral composition of potentially dust-productive soils. In our study, we (a) mapped mineral data to a high-resolution 30 s grid, (b) included several mineral-carrying soil types in dust-productive regions that were not considered in previous studies, and (c) included phosphorus.

    Robust Online Monitoring of Signal Temporal Logic

    Signal Temporal Logic (STL) is a formalism used to rigorously specify requirements of cyber-physical systems (CPS), i.e., systems mixing digital or discrete components in interaction with a continuous environment or analog components. STL is naturally equipped with a quantitative semantics which can be used for various purposes: from assessing the robustness of a specification to guiding searches over the input and parameter space with the goal of falsifying the given property over system behaviors. Algorithms have been proposed and implemented for offline computation of such quantitative semantics, but only few methods exist for an online setting, where one would want to monitor the satisfaction of a formula during simulation. In this paper, we formalize a semantics for robust online monitoring of partial traces, i.e., traces for which there might not be enough data to decide Boolean satisfaction (and to compute its quantitative counterpart). We propose an efficient algorithm to compute it and demonstrate its usage on two large-scale real-world case studies, one from the automotive domain and one from CPS education in a Massive Open Online Course (MOOC) setting. We show that the savings in computationally expensive simulations far outweigh any overhead incurred by the online approach.
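
    The idea of interval-valued robustness on partial traces can be sketched as follows (a minimal toy monitor, not the paper's algorithm; the property, data, and function name are invented for illustration). For "x > 0 always over the first H samples", the upper bound after each new sample is the running minimum (future samples can only decrease it), and the lower bound stays −∞ until the whole horizon has been observed; if the upper bound ever goes negative, the property is falsified early.

```python
import math

def online_always_positive(samples, horizon):
    """Report [low, up] bounds on the robustness of
    'always over the first `horizon` samples, x > 0'
    after each sample of a (possibly partial) trace."""
    bounds = []
    run_min = math.inf
    for i, x in enumerate(samples):
        if i < horizon:
            run_min = min(run_min, x)      # later samples can only lower the min
        up = run_min                       # best case: no future sample is smaller
        low = run_min if i >= horizon - 1 else -math.inf  # exact once horizon seen
        bounds.append((low, up))
    return bounds

print(online_always_positive([2.0, 1.0, 3.0], horizon=3))
# -> [(-inf, 2.0), (-inf, 1.0), (1.0, 1.0)]
```

The interval tightens as data arrives and collapses to the exact robustness once the horizon is covered; an upper bound below zero before that point lets the monitor stop simulation early, which is the source of the savings the paper reports.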