137 research outputs found

    Performance Evaluation of Complex Systems Using the SBIP Framework

    In this paper we survey the main experiments performed using the SBIP framework, which consists of a stochastic component-based modeling formalism and a probabilistic model-checking engine for verification. The modeling formalism is built as an extension of BIP and enables building complex systems in a compositional way, while the verification engine implements a set of statistical algorithms for verifying qualitative and quantitative properties. The SBIP framework has been used to model and verify a large set of real-life systems, including various network protocols and multimedia applications.
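    The statistical verification algorithms referred to above rest on Monte Carlo simulation: the model is sampled repeatedly and a property's probability is estimated from the fraction of satisfying runs. A minimal sketch of the idea, using a toy failure model and the standard Chernoff-Hoeffding sample bound (not SBIP's actual implementation):

    ```python
    import math
    import random

    def required_runs(epsilon, delta):
        # Chernoff-Hoeffding bound: this many independent runs guarantee
        # |estimate - true probability| <= epsilon with confidence >= 1 - delta.
        return math.ceil(math.log(2.0 / delta) / (2.0 * epsilon ** 2))

    def run_satisfies(p_fail, horizon, rng):
        # Toy stochastic model: at each step the system fails with
        # probability p_fail; the monitored property is "a failure
        # occurs within `horizon` steps".
        return any(rng.random() < p_fail for _ in range(horizon))

    def estimate(p_fail, horizon, epsilon=0.01, delta=0.05, seed=0):
        rng = random.Random(seed)
        n = required_runs(epsilon, delta)
        return sum(run_satisfies(p_fail, horizon, rng) for _ in range(n)) / n

    print(required_runs(0.01, 0.05))  # 18445 runs for a 1% error at 95% confidence
    print(estimate(0.02, 10))
    ```

    Qualitative checks ("is the probability above a threshold?") typically replace the estimator with a sequential hypothesis test (SPRT), which usually needs far fewer samples.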

    Using BIP to reinforce correctness of resource-constrained IoT applications

    IoT applications have either a sense-only or a sense-compute-actuate goal, and they must process and respond to multiple (external) events while performing computations. Existing IoT operating systems provide a versatile execution environment that adheres to the limitations of the interconnected resource-constrained devices. To reduce development effort, applications are often built on top of RESTful web services, which can be shared and reused. However, the asynchronous communication between remote nodes is prone to event scheduling delays, which cannot be predicted and taken into account while programming the application. Moreover, to avoid long delays in message processing and communication due to packet collisions, the data transmission frequencies between the system's nodes have to be carefully chosen. In general, even when appropriate debugging tools and simulators are available, it remains hard to guarantee the required functional and non-functional properties at the application and system levels. To this end, we focus on IoT applications for the Contiki OS and introduce a model-based rigorous analysis approach using the BIP component framework. At the application level, we verify qualitative properties regarding service responsiveness, whereas at the system level we validate qualitative and quantitative properties using statistical model checking. We present results for an application scenario running on a distributed system infrastructure with nodes executing the Contiki OS.

    Weighted Combination of Sample Based and Block Based Intra Prediction in Video Coding

    The latest video compression standard, HEVC/H.265, was released in 2013 and provides a significant improvement over its predecessor, AVC/H.264. However, with a constantly increasing demand for high definition video and streaming of large video files, there is still room for improvement. Difficult content in video sequences, for example smoke, leaves, and irregularly moving water, is hard to predict and can be troublesome at the prediction stage of video compression. In this thesis, carried out at Ericsson in Stockholm, the combination of sample based intra prediction (SBIP) and block based intra prediction (BBIP) is tested to see if it could improve the prediction of video sequences containing difficult content, here focusing on water. The combined methods are compared to HEVC intra prediction. All implementations were done in Matlab. The results show that a combination reduces the Mean Squared Error (MSE) and can improve the Visual Information Fidelity (VIF) and the mean Structural Similarity (MSSIM). Moreover, the visual quality was improved, with more detail and fewer blocking artefacts.
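    The core idea of weighting two intra predictions against each other can be illustrated in a few lines of NumPy. The stand-in predictors and the weight grid below are illustrative only, not the thesis's actual SBIP/BBIP predictors or HEVC's mode machinery:

    ```python
    import numpy as np

    def combine(sbip_pred, bbip_pred, w):
        # Pixel-wise weighted blend: w = 1 keeps only the sample-based
        # prediction, w = 0 only the block-based one.
        return w * sbip_pred + (1.0 - w) * bbip_pred

    def mse(block, pred):
        # Mean Squared Error between the original block and a prediction.
        return float(np.mean((block - pred) ** 2))

    rng = np.random.default_rng(0)
    block = rng.uniform(0.0, 255.0, (4, 4))           # "difficult", noisy content
    sbip_pred = block + rng.normal(0.0, 8.0, (4, 4))  # stand-in sample-based prediction
    bbip_pred = np.full((4, 4), block.mean())         # stand-in DC-like block prediction

    # Pick the blend weight that minimizes the MSE against the original block.
    best_err, best_w = min(
        (mse(block, combine(sbip_pred, bbip_pred, w)), w)
        for w in (0.0, 0.25, 0.5, 0.75, 1.0)
    )
    print(best_w, best_err)
    ```

    Because the weight grid includes 0 and 1, the blended error can never exceed that of either prediction alone, which is why a combination can only help the MSE in this setup.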

    Building Faithful High-level Models and Performance Evaluation of Manycore Embedded Systems

    Performance and functional correctness are key to the successful design of modern embedded systems. Both aspects must be considered early in the design process to enable well-founded decisions towards the final implementation. Nonetheless, building abstract system-level models that faithfully capture performance information alongside functional behavior is a challenging task. In contrast to functional aspects, performance details are rarely available during early design phases, and no clear method is known to characterize them. Moreover, once such system-level models are built, they are inherently complex, as they usually mix software models, hardware architecture constraints, and environment abstractions. Their analysis using traditional performance evaluation methods is reaching its limits, and the need for more scalable and accurate techniques is becoming urgent. In this paper, we introduce a systematic method for building stochastic abstract performance models using statistical inference and model calibration, and we propose statistical model checking as the performance evaluation technique for the obtained models. We applied our method to a real-life case study and were able to verify different timing properties.
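    Calibration of the kind described above can be as simple as fitting a delay distribution to profiled execution times and feeding it back into the abstract model. A sketch under an illustrative exponential-delay assumption (the measurements below are made up, and the paper's actual inference procedure may differ):

    ```python
    import random
    import statistics

    def fit_exponential_rate(samples):
        # Maximum-likelihood estimate for an exponential delay model:
        # rate = 1 / sample mean.
        return 1.0 / statistics.mean(samples)

    # Illustrative "profiled" execution times (seconds) of one component.
    measured = [0.9, 1.1, 1.3, 0.8, 1.0, 1.2, 0.7, 1.0]
    rate = fit_exponential_rate(measured)

    # The calibrated distribution then drives simulation of the
    # abstract performance model for statistical model checking.
    rng = random.Random(1)
    simulated = [rng.expovariate(rate) for _ in range(10000)]
    print(rate, sum(simulated) / len(simulated))
    ```

    In practice one would validate the distributional assumption (e.g. with a goodness-of-fit test) before trusting timing properties verified on the calibrated model.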

    SBIP 2.0: Statistical Model Checking Stochastic Real-time Systems

    This paper presents a major new release of SBIP, an extensible statistical model checker for Metric Temporal Logic (MTL) and Linear-time Temporal Logic (LTL) properties on, respectively, Generalized Semi-Markov Process (GSMP), Continuous-Time Markov Chain (CTMC), and Discrete-Time Markov Chain (DTMC) models. The newly added support for MTL, GSMPs, CTMCs, and rare events makes it possible to capture both real-time and stochastic aspects, enabling faithful specification, modeling, and analysis of real-life systems. SBIP is redesigned as an IDE providing project management, model editing, compilation, simulation, and statistical analysis.
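    MTL properties like the ones above are checked on finite timed simulation traces by a monitor. A toy monitor for the bounded-response pattern G(req -> F[0,b] ack), written from scratch here as an illustration rather than taken from SBIP:

    ```python
    def holds_bounded_response(trace, bound, end_time):
        # Monitor for the MTL bounded-response pattern
        # G (req -> F[0,bound] ack) on a finite timed trace,
        # given as (timestamp, event) pairs sorted by timestamp.
        pending = []  # timestamps of requests not yet acknowledged
        for t, event in trace:
            # the oldest pending request fails if its deadline
            # expired before this event occurred
            if pending and t - pending[0] > bound:
                return False
            if event == "req":
                pending.append(t)
            elif event == "ack":
                pending.clear()  # one timely ack answers all pending requests
        # requests still pending when observation ends may have expired too
        return not (pending and end_time - pending[0] > bound)

    print(holds_bounded_response([(0.0, "req"), (1.5, "ack")], 2.0, 10.0))  # prints True
    ```

    A statistical model checker runs such a monitor on many sampled traces and aggregates the Boolean verdicts into a probability estimate or hypothesis test.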

    Model-based validation of CANopen systems

    CANopen is an increasingly popular protocol for the design of networked embedded systems. Nonetheless, the large variety of communication and network management functionalities supported in CANopen can significantly increase system complexity and, in turn, the need for system validation at design time. We present a rigorous method based on formal modeling and verification techniques that provides a comprehensive analysis of CANopen systems. Our method uses BIP, a formal framework for the modeling, analysis, and implementation of real-time, heterogeneous, component-based systems, together with the associated BIP tools for simulation, performance evaluation, and statistical model checking.

    Rigorous Design of FDIR Systems with BIP

    The correct design of autonomous systems is a challenge, due to the uncertainties arising at execution time. A special case of uncertainty is faults and failures that break the system's requirements. Dealing with such situations requires designing fault detection, isolation and recovery (FDIR) components. The aim of FDIR components is to detect when a fault has occurred and to apply a recovery strategy that brings the system into a mode where the requirements are satisfied. In this paper we describe an approach based on the Behavior, Interaction, Priority (BIP) tools for the rigorous design of FDIR components. This approach leverages the scalability of the statistical model-checking tool BIP-SMC to check for requirement satisfaction, and the code generation feature of the BIP compiler. Moreover, the generated code is executable with the BIP engine(s) and easily integrated with the original system. The approach has been used in the H2020 ESROCOS and ERGO projects for the development of (autonomous) robotics control systems, which have been validated through field trials.

    SAM-SoS: A stochastic software architecture modeling and verification approach for complex System-of-Systems

    A System-of-Systems (SoS) is a complex, dynamic system whose Constituent Systems (CSs) are not known precisely at design time and whose operating environment is uncertain. SoS behavior is unpredictable due to underlying architectural characteristics such as autonomy and independence. Although the stochastic composition of CSs is vital to achieving SoS missions, their unknown behaviors and their impact on system properties are unavoidable. Moreover, unknown conditions and volatility have significant effects on crucial Quality Attributes (QAs) such as performance, reliability, and security. Hence, the structure and behavior of a SoS must be modeled and validated quantitatively to foresee any potential impact on the properties critical for achieving the missions. Current modeling approaches lack the syntax and semantics required to model and verify SoS behaviors at design time and cannot offer alternative design choices for better design decisions. Consequently, the majority of existing techniques fail to provide qualitative and quantitative verification of SoS architecture models. We therefore propose an approach to model and verify Non-Deterministic (ND) SoS in advance, extending current algebraic notations into a hybrid stochastic formalism to specify and reason about architectural elements with the required semantics. A formal stochastic model is developed using a hybrid approach for architectural descriptions of SoS with behavioral constraints. Through a model-driven approach, the stochastic models are then translated into PRISM using formal verification rules. The effectiveness of the approach has been tested on an end-to-end case study of an emergency response SoS for dealing with a fire situation. Architectural analysis is conducted on the stochastic model using various qualitative and quantitative measures for SoS missions. Experimental results reveal critical aspects of the SoS architecture model that, with the proposed approach, facilitate better achievement of missions and QAs through improved design.