
    Verifying service continuity in a satellite reconfiguration procedure: application to a satellite

    The paper discusses the use of the TURTLE UML profile to model and verify service continuity during dynamic reconfiguration of embedded software, and space-based telecommunication software in particular. TURTLE extends UML class diagrams with composition operators, and activity diagrams with temporal operators. Translating TURTLE to the formal description technique RT-LOTOS gives the profile a formal semantics and makes it possible to reuse verification techniques implemented by RTL, the RT-LOTOS toolkit developed at LAAS-CNRS. The paper proposes a modeling and formal validation methodology based on TURTLE and RTL, and discusses its application to a payload software application in charge of an embedded packet switch. The paper demonstrates the benefits of using TURTLE to prove service continuity for dynamic reconfiguration of embedded software.
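    The property being proved here, that the service stays available in every state reached during reconfiguration, can be illustrated independently of TURTLE and RT-LOTOS. The Python sketch below runs a reachability check over a toy transition system; the states and service flags are invented for the example and are not the paper's payload-software model.

        from collections import deque

        # Toy reconfiguration model (hypothetical states, not the paper's):
        # each state lists its successors, and a flag records whether the
        # packet-switching service is available in that state.
        TRANSITIONS = {
            "old_config":  ["loading_new"],
            "loading_new": ["switching"],
            "switching":   ["new_config"],
            "new_config":  [],
        }
        SERVICE_UP = {
            "old_config": True,
            "loading_new": True,
            "switching": True,   # set to False to watch the check fail
            "new_config": True,
        }

        def service_continuous(start):
            """Breadth-first search: service must hold in every reachable state."""
            seen, queue = {start}, deque([start])
            while queue:
                state = queue.popleft()
                if not SERVICE_UP[state]:
                    print(f"service interrupted in state {state!r}")
                    return False
                for nxt in TRANSITIONS[state]:
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append(nxt)
            return True

        print(service_continuous("old_config"))  # True for this toy model

    A verification toolkit such as RTL performs essentially this kind of exploration, but over the much richer timed state space derived from the RT-LOTOS semantics.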

    Hardware Design and Implementation of Role-Based Cryptography

    Traditional public key cryptographic methods provide access control to sensitive data by allowing the message sender to grant a single recipient permission to read the encrypted message. The Need2Know® system (N2K) improves upon these methods by providing role-based access control. N2K defines data access permissions similar to those of a multi-user file system, but N2K strictly enforces access through cryptographic standards. Since custom hardware can efficiently implement many cryptographic algorithms and can provide additional security, N2K stands to benefit greatly from a hardware implementation. To this end, the main N2K algorithm, the Key Protection Module (KPM), is being specified in VHDL. The design is being built and tested incrementally: this first phase implements the core control logic of the KPM without integrating its cryptographic sub-modules. Both RTL simulation and formal verification are used to test the design. This is the first N2K implementation in hardware, and it promises to provide an accelerated and secured alternative to the software-based system. A hardware implementation is a necessary step toward highly secure and flexible deployments of the N2K system.
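    Although the KPM itself is a hardware design, the role-based idea can be sketched in software. The following Python fragment uses the third-party cryptography package and is a minimal illustration of the general scheme, not N2K's actual algorithm: the payload is encrypted once under a message key, and that key is wrapped under each authorized role's key, so read access follows from possession of a role key. All names are invented for the example.

        from cryptography.fernet import Fernet

        # Hypothetical role keys; in a deployed system these would be
        # provisioned to users according to their roles.
        role_keys = {role: Fernet.generate_key() for role in ("analyst", "admin")}

        def encrypt_for_roles(plaintext, roles):
            """Encrypt once, then wrap the message key for each granted role."""
            msg_key = Fernet.generate_key()
            ciphertext = Fernet(msg_key).encrypt(plaintext)
            wrapped = {r: Fernet(role_keys[r]).encrypt(msg_key) for r in roles}
            return ciphertext, wrapped

        def decrypt_as_role(role, ciphertext, wrapped):
            """A role can read only if its wrapped copy of the key exists."""
            msg_key = Fernet(role_keys[role]).decrypt(wrapped[role])
            return Fernet(msg_key).decrypt(ciphertext)

        ct, wk = encrypt_for_roles(b"sensitive report", ["analyst"])
        print(decrypt_as_role("analyst", ct, wk))   # b'sensitive report'
        # decrypt_as_role("admin", ct, wk) raises KeyError: no grant for admin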

    Formal Verification and Validation of AADL Models

    Safety-critical systems are increasingly difficult to comprehend due to their rising complexity. Methodologies, tools, and modeling formalisms have been developed to overcome this. Component-based design is an important paradigm that is shared by many of them.

    Investigation, Development, and Evaluation of Performance Proving for Fault-tolerant Computers

    A number of methodologies for verifying systems, along with computer-based tools that assist users in verifying their systems, were developed. These tools were applied to verify, in part, the SIFT ultrareliable aircraft computer. Topics covered included: the STP theorem prover; design verification of SIFT; high-level language code verification; assembly language level verification; numerical algorithm verification; verification of flight control programs; and verification of hardware logic.

    Modeling and Analyzing Adaptive User-Centric Systems in Real-Time Maude

    Pervasive user-centric applications are systems which are meant to sense the presence, mood, and intentions of users in order to optimize user comfort and performance. Building such applications requires not only state-of-the-art techniques from artificial intelligence but also sound software engineering methods for facilitating modular design, runtime adaptation, and verification of critical system requirements. In this paper we focus on high-level design and analysis, and use the algebraic rewriting language Real-Time Maude for specifying applications in a real-time setting. We propose a generic component-based approach for modeling pervasive user-centric systems and we show how to analyze and prove crucial properties of the system architecture through model checking and simulation. For proving time-dependent properties we use Metric Temporal Logic (MTL) and present analysis algorithms for model checking two subclasses of MTL formulas: time-bounded response and time-bounded safety MTL formulas. The underlying idea is to extend the Real-Time Maude model with suitable clocks, to transform the MTL formulas into LTL formulas over the extended specification, and then to use the LTL model checker of Maude. It is shown that these analyses are sound and complete for maximal time sampling. The approach is illustrated by a simple adaptive advertising scenario in which an adaptive advertisement display can react to actions of the users in front of the display.
    Comment: In Proceedings RTRTS 2010, arXiv:1009.398
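    The time-bounded response pattern the paper model-checks can be made concrete with a small sketch. The Python function below checks "globally, p implies q within bound" by brute force on a finite timed trace; this stands in for, and is far weaker than, the clock-extension and MTL-to-LTL translation performed in Real-Time Maude. The trace format and event names are invented for the example.

        def bounded_response(trace, p, q, bound):
            """Check 'every p is followed by a q within `bound` time units'
            on a finite trace of (timestamp, event) pairs in ascending order."""
            for i, (t_p, ev) in enumerate(trace):
                if ev == p:
                    if not any(e == q and t_p <= t <= t_p + bound
                               for t, e in trace[i:]):
                        return False
            return True

        # Toy trace for the advertising-display scenario (names invented):
        # every user action must produce a display update within 2 time units.
        trace = [(0, "user_action"), (1, "display_update"),
                 (5, "user_action"), (6, "display_update")]
        print(bounded_response(trace, "user_action", "display_update", 2))  # True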

    Perpetual requirements engineering

    Get PDF
    This dissertation attempts to make a contribution within the fields of distributed systems, security, and formal verification. We provide a way to formally assess the impact of a given change in three different contexts. We have developed a logic based on Lewis's Counterfactual Logic. First we show how our approach is applied to a standard sequential programming setting. Then, we show how a modified version of the logic can be used in the context of reactive systems and sensor networks. Last but not least, we show how this logic can be used in the context of security systems. Traditionally, change impact analysis has been viewed as an area of traditional software engineering. Software artifacts (source code, usually) are modified in response to a change in user requirements. Aside from making sure that the changes are inherently correct (testing and verification), programmers (software engineers) need to make sure that the introduced changes are coherent with those parts of the system that were not affected by the artifact modification. The latter is generally achieved by establishing a dependency relation between software artifacts. In broad outline, the process of change management consists of projecting the transitive closure of this dependency relation from the set of artifacts that have actually changed and assessing how the related artifacts changed. This traditional change management process generally takes place after the affected artifacts have been changed. Undesired secondary effects are usually found during the testing phase, after the changes have been incorporated. In cases where there is a certain level of criticality, there is always a division between production and development environments. Change management (whether automatic, tool-driven, or completely manual) can introduce extraneous defects into any of the changed software life-cycle artifacts. The testing phase tries to eradicate a relatively large portion of the undesired defects introduced by change. However, traditional testing techniques are limited by their coverage strength. Therefore, even when maximum coverage is guaranteed, there is always a non-zero probability that secondary effects of a change go undetected.
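    The dependency-projection step described above is easy to make concrete. The Python sketch below computes the set of potentially impacted artifacts by closing an (invented) dependency relation transitively from the changed set; the artifact names and graph are hypothetical.

        def impacted(changed, depends_on):
            """Everything that directly or transitively depends on a changed
            artifact is potentially impacted. `depends_on` maps each artifact
            to the artifacts it depends on."""
            # Invert the relation: artifact -> artifacts that depend on it.
            dependents = {}
            for art, deps in depends_on.items():
                for d in deps:
                    dependents.setdefault(d, set()).add(art)
            result, stack = set(changed), list(changed)
            while stack:
                for dep in dependents.get(stack.pop(), ()):
                    if dep not in result:
                        result.add(dep)
                        stack.append(dep)
            return result - set(changed)

        # Hypothetical artifact graph: the parser depends on the lexer, the
        # type checker on the parser; a lexer change therefore touches both.
        deps = {"lexer": [], "parser": ["lexer"], "typechecker": ["parser"]}
        print(impacted({"lexer"}, deps))  # {'parser', 'typechecker'}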