    Assessing the Risk of an Adaptation using Prior Compliance Verification

    Autonomous systems must respond to large amounts of streaming information. They must also comply with critical properties to maintain behavior guarantees. Compliance is especially important when a system self-adapts to perform a repair, improve performance, or modify decisions. There remain significant challenges in assessing the risk, with respect to critical-property compliance, of adaptations that are dynamically configured at runtime. Assuming compliance verification was performed for the originally deployed system, the proof process yields valuable meta-data about the variables and conditions that affect reusing the proof on the adapted system. We express this meta-data as a verification workflow using Colored Petri Nets. As dynamic adaptations are configured, the Petri Nets produce alert tokens indicating the potential impact of an adaptation on proof reuse. Alert tokens hold risk values for use in a utility function that determines the least risky adaptations. We illustrate the modeling and risk assessment using a case study.
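    To make the selection step concrete, here is a minimal Python sketch of the risk-based ranking the abstract describes: alert tokens carrying risk values are accumulated per candidate adaptation, and a utility function picks the least risky one. This is an illustration under stated assumptions, not the paper's implementation; all names (AlertToken, Adaptation, select_least_risky) and the additive utility are hypothetical.

    ```python
    # Hedged sketch of the risk-assessment step described in the abstract.
    # The actual paper derives tokens from a Colored Petri Net workflow and
    # may combine risk values differently; names here are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class AlertToken:
        """An alert for one proof obligation affected by an adaptation."""
        condition: str   # variable/condition from the original proof
        risk: float      # estimated impact on reusing that proof (0..1)

    @dataclass
    class Adaptation:
        name: str
        tokens: list[AlertToken] = field(default_factory=list)

        def total_risk(self) -> float:
            # Simple additive utility; a weighted combination is equally
            # plausible under the abstract's description.
            return sum(t.risk for t in self.tokens)

    def select_least_risky(candidates: list[Adaptation]) -> Adaptation:
        """Return the dynamically configured adaptation whose alert
        tokens imply the smallest total proof-reuse risk."""
        return min(candidates, key=Adaptation.total_risk)

    if __name__ == "__main__":
        a = Adaptation("repair-route", [AlertToken("speed_limit", 0.2)])
        b = Adaptation("replan-mission", [AlertToken("geofence", 0.5),
                                          AlertToken("battery_min", 0.3)])
        print(select_least_risky([a, b]).name)  # -> repair-route
    ```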

    Towards Run-Time Verification of Adaptive Security for IoT in eHealth
