
    Toward Contention Analysis for Parallel Executing Real-Time Tasks

    In measurement-based probabilistic timing analysis, the execution conditions imposed on tasks as measurement scenarios have a strong impact on the worst-case execution time estimates. The scenarios and their effects on task execution behavior have to be investigated in depth. The aim is to identify and guarantee the scenarios that lead to the maximum measurements, i.e. the worst-case scenarios, and to use them to assure the worst-case execution time estimates. We propose a contention analysis to identify the worst contentions that a task can suffer from concurrent executions. The work focuses on interference on shared resources (cache memories and memory buses) from parallel executions in multi-core real-time systems. Our approach consists of searching for possible task contenders for parallel execution, modeling their contentiousness, and classifying the measurement scenarios accordingly. We identify the most contentious scenarios and their worst-case effects on task execution times. Measurement-based probabilistic timing analysis is then used to verify the proposed analysis, qualify the scenarios by contentiousness, and compare them. A parallel execution simulator for multi-core real-time systems is developed and used to validate our framework. The framework applies heuristics and assumptions that simplify the system behavior. It represents a first step toward a complete approach able to guarantee the worst-case behavior.
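
    A minimal sketch of the scenario-classification idea described above, under assumed inputs: co-runner profiles, resource weights, and the scoring function are illustrative, not the paper's actual contentiousness model.

```python
# Hypothetical sketch: rank candidate measurement scenarios by a simple
# contentiousness score. Co-runner profiles and weights are assumptions.
from dataclasses import dataclass

@dataclass
class CoRunner:
    name: str
    cache_misses_per_ms: float   # pressure on the shared cache
    bus_accesses_per_ms: float   # pressure on the memory bus

def contentiousness(corunners, w_cache=1.0, w_bus=2.0):
    """Score a scenario (a set of parallel co-runners) by the shared-resource
    pressure its members generate; higher means more contentious."""
    return sum(w_cache * c.cache_misses_per_ms + w_bus * c.bus_accesses_per_ms
               for c in corunners)

scenarios = {
    "isolation": [],
    "one_streaming_corunner": [CoRunner("stream", 900.0, 1200.0)],
    "full_load": [CoRunner("stream", 900.0, 1200.0),
                  CoRunner("pointer_chase", 1500.0, 400.0),
                  CoRunner("copy", 700.0, 1000.0)],
}

# The most contentious scenarios are the candidates for worst-case measurements.
for name, score in sorted(((n, contentiousness(c)) for n, c in scenarios.items()),
                          key=lambda kv: kv[1], reverse=True):
    print(f"{name:24s} score={score:8.1f}")
```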

    On the Representativity of Execution Time Measurements: Studying Dependence and Multi-Mode Tasks

    Measurement-Based Probabilistic Timing Analysis (MBPTA) infers probabilistic Worst-Case Execution Time (pWCET) estimates from measurements of task execution times; Extreme Value Theory (EVT) is the statistical tool that MBPTA applies to infer worst cases from observations/measurements of the actual task behavior. The ability of MBPTA and EVT to estimate safe/pessimistic pWCETs relies on the quality of the measurements; in particular, execution time measurements have to be representative of the actual system execution conditions and have to cover multiple possible execution conditions. In this work, we investigate statistical dependences between execution time measurements and tasks with multiple runtime operational modes. In the first case, we outline the effects of dependences on the applicability of EVT as well as on the quality of the pWCET estimates. In the second case, we propose approaches to account for the different task execution modes and to guarantee safe pWCET estimates that cover them all. The proposed solutions are validated with test cases.
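
    An illustrative sketch of the two diagnostics the abstract alludes to, under assumed synthetic data and thresholds (not the paper's protocol): a lag-1 autocorrelation check for serial dependence before applying EVT, and a two-sample comparison across operational modes.

```python
# Illustrative diagnostics sketch; the Gumbel samples stand in for measured
# execution times of two operational modes, and the 0.05 threshold is assumed.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
mode_a = rng.gumbel(1000, 15, 2000)   # stand-in measurements, mode A
mode_b = rng.gumbel(1150, 25, 2000)   # stand-in measurements, mode B

# 1) Serial dependence: a large lag-1 autocorrelation undermines the
#    independence-style assumptions behind EVT fitting.
rho1 = np.corrcoef(mode_a[:-1], mode_a[1:])[0, 1]
print(f"lag-1 autocorrelation (mode A): {rho1:.3f}")

# 2) Multi-mode behavior: if the per-mode samples differ significantly,
#    estimate a pWCET per mode (or take the envelope) rather than pooling.
stat, p = ks_2samp(mode_a, mode_b)
print(f"KS test across modes: p = {p:.3e}")
if p < 0.05:
    print("Modes differ: fit EVT per mode and keep the most pessimistic pWCET.")
```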

    On the Sustainability of the Extreme Value Theory for WCET Estimation

    Measurement-based approaches with extreme value worst-case estimation are beginning to be proficiently considered for timing analyses. In this paper, we aim to put the applicability of extreme value theory to safe worst-case execution time estimation on a more formal footing. We outline the complexities and challenges behind extreme value theory assumptions and parameter tuning. Together with the knowledge requirements, this allows us to draw conclusions about the safety of probabilistic worst-case execution time estimates obtained from extreme value theory and execution time measurements.
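
    A minimal EVT sketch of the kind of estimation discussed here, assuming i.i.d. measurements: fit a GEV distribution to block maxima of execution-time observations and read off a pWCET value at a target exceedance probability. The synthetic data, block size, and target probability are assumptions for illustration only.

```python
# Minimal block-maxima EVT sketch, not a safe analysis on its own: the quality
# of the result depends on the representativeness and independence of the data.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(1)
times = rng.gumbel(loc=1000, scale=20, size=50_000)   # stand-in measurements (cycles)

block = 100
maxima = times[: len(times) // block * block].reshape(-1, block).max(axis=1)

# Fit the Generalized Extreme Value distribution to the block maxima.
c, loc, scale = genextreme.fit(maxima)                # c < 0 is a heavy right tail in scipy's convention
pwcet = genextreme.isf(1e-9, c, loc, scale)           # value exceeded by a block with probability 1e-9
print(f"GEV shape={c:.3f}  pWCET(1e-9 per block) ~= {pwcet:.0f} cycles")
```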

    Non-Preemptive Scheduling of Periodic Mixed-Criticality Real-Time Systems

    In this work we develop an offline analysis of periodic mixed-criticality real-time systems. We develop a graph-based exploratory method to non-preemptively schedule multiple-criticality tasks. The exploration process obtains a schedule for each periodic instance of the tasks. The schedule adjusts for criticality mode changes to maximize resource usage by allowing lower-criticality executions. At the same time, it ensures that the schedulability of higher-criticality jobs is never compromised. We also quantify the probabilities associated with a criticality mode change by using task probabilistic Worst Case Execution Times. A method to reduce the offline complexity is also proposed.
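
    A worked sketch of quantifying a mode-change probability from task pWCETs, under assumptions not taken from the paper: the per-job exceedance probabilities are read off each task's pWCET curve at its low-criticality budget, and jobs are treated as independent.

```python
# Hedged sketch: probability that at least one HI-criticality job overruns its
# LO budget within one hyperperiod. All numeric values are assumed inputs.
exceedance = {"tau1": 1e-6, "tau2": 5e-7, "tau3": 1e-7}   # P(C_i > LO budget) per job
jobs_per_hyperperiod = {"tau1": 4, "tau2": 2, "tau3": 1}

p_no_change = 1.0
for task, p in exceedance.items():
    p_no_change *= (1.0 - p) ** jobs_per_hyperperiod[task]

print(f"P(criticality mode change in a hyperperiod) ~= {1.0 - p_no_change:.3e}")
```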

    Applying real-time interface and calculus for dynamic power management in hard real-time systems

    Power dissipation has been an important design issue for a wide range of computer systems in the past decades. Dynamic power consumption due to signal switching activities and static power consumption due to leakage current are the two major sources of power consumption in a CMOS circuit. As CMOS technology advances into the deep sub-micron domain, static power dissipation is comparable to or even greater than dynamic power dissipation. This article explores how to apply dynamic power management to reduce static power in hard real-time systems. We propose online algorithms that adaptively control the power mode of a system, procrastinating the processing of arrived events as long as possible. To cope with multiple event streams with different characteristics, we provide solutions for preemptive earliest-deadline-first and fixed-priority scheduling policies. By adopting a worst-case interval-based abstraction, our approach can not only handle arbitrary event arrivals, e.g., with burstiness, but also guarantee hard real-time requirements with respect to both timing and backlog constraints. We also present extensive simulation results to demonstrate the effectiveness of our approach.
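
    An illustrative break-even sketch of the sleep decision that underlies this kind of dynamic power management, not the paper's online algorithm: all power and timing constants are assumed, and the guaranteed idle interval would in practice come from the interval-based arrival/service bounds.

```python
# Sleep only when the guaranteed idle interval exceeds the break-even time and
# the wake-up latency still fits before the next deadline. Values are assumed.
P_IDLE_W = 0.40        # static + idle power while staying active (assumed)
P_SLEEP_W = 0.05       # power in the sleep mode (assumed)
E_TRANSITION_J = 0.90  # energy to enter and leave sleep (assumed)
T_TRANSITION_S = 0.002 # wake-up latency (assumed)

def break_even_time() -> float:
    """Shortest idle interval for which sleeping saves energy."""
    return E_TRANSITION_J / (P_IDLE_W - P_SLEEP_W)

def should_sleep(guaranteed_idle_s: float, slack_to_next_deadline_s: float) -> bool:
    # Procrastinate only if (a) sleeping pays off and (b) we can still wake up in time.
    return (guaranteed_idle_s >= break_even_time()
            and slack_to_next_deadline_s >= T_TRANSITION_S)

print("break-even interval:", break_even_time(), "s")
print("sleep during a 3 s guaranteed idle gap:", should_sleep(3.0, 0.01))
```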

    Architectural performance analysis of FPGA synthesized LEON processors

    Current processors have gone through multiple internal optimizations to speed up the average execution time, e.g., pipelines and branch prediction. Besides, internal communication mechanisms and shared resources like caches or buses have a significant impact on Worst-Case Execution Times (WCETs). Obtaining an accurate estimate of a WCET is now a challenge. Probabilistic approaches provide a viable alternative to single WCET estimation. They consider the WCET as a probabilistic distribution associated with uncertainty or risk. In this paper, we present synthetic benchmarks and associated analyses for several LEON3 configurations on FPGA targets. Benchmarking exposes the key parameters behind execution time variability, allowing for accurate probabilistic modeling of system dynamics. We analyze the impact of architecture-level configurations on average and worst-case behaviors.

    Open Challenges for Probabilistic Measurement-Based Worst-Case Execution Time

    The worst-case execution time (WCET) is a critical parameter describing the largest possible execution time of a program. Even though such a parameter is very hard to obtain, it is essential for guaranteeing that a real-time system meets its timing requirements. The complexity of modern hardware has increased the challenges of statically analyzing the WCET and reduced the reliability of purely measuring it. This has led to the emergence of probabilistic WCET (pWCET) analysis as a viable technique. The low probability of observing the largest execution times of a program has motivated the use of rare-event statistics such as extreme value theory (EVT). As pWCET estimation based on EVT has matured as a discipline, a number of open challenges have become apparent when applying the existing approaches. This letter enumerates key challenges while establishing the state of the art of EVT-based pWCET estimation methods.

    Static Probabilistic Timing Analysis in Presence of Faults

    Accurate timing prediction for software execution is becoming a problem due to the increasing complexity of computer architectures and the presence of mixed-criticality workloads. Probabilistic caches were proposed to set bounds on Worst Case Execution Time (WCET) estimates and help designers improve system resource usage. However, as technology scales down, system fault rates increase and timing behavior is affected. In this paper, we propose a Static Probabilistic Timing Analysis (SPTA) approach for caches with an evict-on-miss random replacement policy using a state-space modeling technique, taking into account fault impacts on both the timing analysis and task WCETs. Different scenarios of transient and permanent faults are investigated. Results show that our proposed approach provides tight probabilistic WCET (pWCET) estimates and that, as the fault rate increases, the timing behavior of the system can be affected significantly.
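
    A minimal fault-free SPTA-style sketch for context, not the paper's state-space model: it uses the common per-access bound for an N-way evict-on-miss random-replacement cache, where an access hits with probability at least ((N-1)/N)^k for k potentially evicting accesses since the block's last use, and convolves the per-access latency distributions into an execution-time distribution. The trace, latencies, and associativity are assumed.

```python
# Hedged SPTA sketch: per-access hit-probability bound plus convolution of
# independent per-access latency distributions (no fault modeling).
import numpy as np

N_WAYS = 4
HIT_CYCLES, MISS_CYCLES = 1, 100
reuse_distances = [0, 3, 1, 8, 2, None, 4]   # None = first use (cold miss); assumed trace

def hit_probability(k):
    return 0.0 if k is None else ((N_WAYS - 1) / N_WAYS) ** k

def access_pmf(k):
    """Latency distribution of one access: hit with probability p, miss otherwise."""
    p = hit_probability(k)
    pmf = np.zeros(MISS_CYCLES + 1)
    pmf[HIT_CYCLES], pmf[MISS_CYCLES] = p, 1.0 - p
    return pmf

total = np.array([1.0])
for k in reuse_distances:
    total = np.convolve(total, access_pmf(k))   # execution-time distribution so far

# Tail of the resulting distribution: the raw material of a pWCET curve.
exceed = 1.0 - np.cumsum(total)
for c in (150, 250, 450):
    print(f"P(execution time > {c} cycles) = {exceed[c]:.3e}")
```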

    CleanET: enabling timing validation for complex automotive systems

    Timing validation for automotive systems occurs in late integration stages, when it is hard to control how the instances of software tasks overlap in time. To make things worse, in complex software systems, like those for autonomous driving, task schedules have a strong event-driven nature, which further complicates relating the task-overlapping scenarios (TOS) captured during software timing budgeting to those observed during validation phases. This paper proposes CleanET, an approach to derive the dilation factor r caused by the simultaneous execution of multiple tasks. To that end, CleanET builds on the TOS captured during testing and predicts how task execution times react under untested TOS (e.g. full overlap), hence acting as a means of robust testing. CleanET also provides additional evidence for certification about the derived timing budgets for every task. We apply CleanET to a commercial autonomous driving framework, Apollo, where task measurements can only be reasonably collected under 'arbitrary' TOS. Our results show that CleanET successfully derives the dilation factor and allows assessing whether execution times for the different tasks adhere to their respective deadlines for unobserved scenarios. This work has been partially supported by the Spanish Ministry of Economy and Competitiveness (MINECO) under grant TIN2015-65316-P, the SuPerCom European Research Council (ERC) project under the European Union's Horizon 2020 research and innovation programme (grant agreement No. 772773), and the HiPEAC Network of Excellence. MINECO partially supported Jaume Abella under a Ramon y Cajal postdoctoral fellowship (RYC-2013-14717).
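
    A hypothetical sketch of the dilation-factor idea only, not CleanET's actual method: estimate how much co-running inflates a task's execution time from measurements under a tested (partial) overlap, extrapolate to the untested full-overlap TOS, and check the result against the task's deadline. All measurements, the overlap fraction, and the linear extrapolation are assumptions.

```python
# Hedged sketch: extrapolate a dilation factor r from tested TOS to full overlap.
import numpy as np

isolated_ms = np.array([10.1, 10.3, 10.2, 10.4])     # assumed isolation measurements
overlapped_ms = np.array([11.8, 12.1, 12.0, 12.3])   # assumed partial-overlap measurements
overlap_fraction = 0.6                               # fraction of the run that overlapped (assumed)

# Observed inflation under partial overlap, extrapolated linearly to full overlap.
observed_r = overlapped_ms.mean() / isolated_ms.mean()
full_overlap_r = 1.0 + (observed_r - 1.0) / overlap_fraction

deadline_ms = 14.0
predicted_ms = isolated_ms.max() * full_overlap_r
verdict = "meets" if predicted_ms <= deadline_ms else "misses"
print(f"dilation r (full overlap) ~= {full_overlap_r:.2f}, "
      f"predicted execution time ~= {predicted_ms:.1f} ms, "
      f"{verdict} the {deadline_ms} ms deadline")
```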

    Exploiting bacteria for improving hypoxemia of COVID-19 patients

    Background: Although useful in the time race against COVID-19, CPAP cannot provide oxygen beyond the physiological limits imposed by severe pulmonary impairments. In previous studies, we reported that the administration of the SLAB51 probiotics reduced the risk of developing respiratory failure in severe COVID-19 patients through the activation of oxygen-sparing mechanisms providing additional oxygen to organs critical for survival. Methods: This "real life" study is a retrospective analysis of SARS-CoV-2 infected patients with hypoxaemic acute respiratory failure secondary to COVID-19 pneumonia undergoing CPAP treatment. A group of patients managed with the ad interim routinely used therapy (RUT) was compared to a second group treated with RUT associated with SLAB51 oral bacteriotherapy (OB). Results: At baseline, patients receiving SLAB51 showed significantly lower blood oxygenation than controls. The opposite condition was observed after 3 days of treatment, despite the significantly reduced amount of oxygen received by patients taking SLAB51. At 7 days, a lower prevalence of COVID-19 patients needing CPAP was observed in the group taking probiotics. The administration of SLAB51 is a complementary approach for ameliorating oxygenation conditions at the systemic level. Conclusion: This study shows that probiotic administration results in an additional boost in alleviating hypoxic conditions, making it possible to limit the use of CPAP and its contraindications.