
    PROARTIS: Probabilistically analyzable real-time systems

    Static timing analysis is the state-of-the-art practice for ascertaining the timing behavior of current-generation real-time embedded systems. The adoption of more complex hardware, in response to the increasing demand for computing power in next-generation systems, exacerbates some of the limitations of static timing analysis. In particular, it becomes increasingly hard to acquire (1) detailed information on the hardware, needed to develop an accurate model of its execution latency, as well as (2) knowledge of the timing behavior of the program in the presence of varying hardware conditions, such as those dependent on the history of previously executed instructions. We call these problems the timing analysis walls. In this vision-statement article, we present probabilistic timing analysis, a novel approach to the analysis of the timing behavior of next-generation real-time embedded systems. We show how probabilistic timing analysis attacks the timing analysis walls; we then illustrate the mathematical foundations on which this method is based and the challenges we face in implementing it efficiently. We also present experimental evidence that shows how probabilistic timing analysis reduces the extent of knowledge about the execution platform required to produce probabilistically accurate WCET estimations. © 2013 ACM. Peer reviewed.

    Improving Measurement-Based Timing Analysis through Randomisation and Probabilistic Analysis

    The use of increasingly complex hardware and software platforms, in response to the ever-rising performance demands of modern real-time systems, complicates the verification and validation of their timing behaviour, which form a time- and effort-intensive step of system qualification or certification. In this paper we relate the current state of practice in measurement-based timing analysis, the predominant choice for industrial developers, to the progress of the PROXIMA project in that very field. We recall the difficulties that the shift towards more complex computing platforms causes in that regard. We then discuss the probabilistic approach proposed by PROXIMA to overcome some of those limitations. We present the main principles behind the PROXIMA approach as well as the changes it requires at the hardware or software level underneath the application. We also present the current status of the project against its overall goals, and highlight some of the principal confidence-building results achieved so far.

    PROARTIS: Probabilistically Analysable Real-Time Systems

    Static Timing Analysis is the state-of-the-art practice for ascertaining the timing behaviour of current-generation real-time embedded systems. The adoption of more complex hardware to respond to the increasing demand for computing power in next-generation systems exacerbates some of the limitations of Static Timing Analysis. In particular, it becomes increasingly hard to acquire (1) detailed information on the hardware, needed to develop an accurate model of its execution latency, as well as (2) knowledge of the timing behaviour of the program in the presence of varying hardware conditions, such as those dependent on the history of previously executed instructions. We call these problems the Timing Analysis Walls. In this vision-statement paper we present Probabilistic Timing Analysis, a novel approach to the analysis of the timing behaviour of next-generation real-time embedded systems. We show how Probabilistic Timing Analysis attacks the Timing Analysis Walls; we then illustrate the mathematical foundations on which this method is based and the challenges we face in implementing it efficiently. We also present experimental evidence that shows how Probabilistic Timing Analysis reduces the extent of knowledge about the execution platform required to produce probabilistically safe and tight WCET estimations.

    Communication and behaviour change: an analysis of 40 years of ADEME communication campaigns for reducing energy consumption in housing

    Communication campaigns aimed at reducing the energy consumption of the general public have not borne all the expected fruit. The characteristics of these campaigns, in terms of their relationship to audiences and to knowledge, seem to have contributed to this failure. To explore this hypothesis through discourse analysis, our research covers 40 years of communication campaigns by ADEME (Agence de l'Environnement et de la Maitrise de l'Energie), aimed at changing household behaviour with respect to energy consumption. Analysis of the leaflet medium reveals a discourse rooted in the deficit model, which tends to sidestep controversy. We propose some avenues by which communication could induce real changes in household energy-consumption behaviour.

    Rare events and worst-case execution times

    In recent years, the arrival of multi-core and many-core processors, together with the increased complexity of programs, has made the estimation of the worst-case execution times (WCETs) of programs more difficult. Existing methods may produce estimates that are too pessimistic for some systems. As a result, new analyses based on probability and statistics have appeared to cope with this complexity, by taking into account the fact that large WCET values may have a low probability of occurrence. The first paper to describe the execution times of tasks with probabilistic distributions associated low probabilities with large execution-time values [7], as illustrated in Figure 1. Several papers have since proposed methods to obtain such distributions. In [3] the authors provide a framework for obtaining the probabilistic execution times (pETs) of a program. Another method, for estimating a pWCET bound in the presence of permanent faults in instruction caches, was introduced in [6]. Papers such as [4, 9] propose estimating the pWCET using extreme value theory. That theory is applied in [2] to platforms with randomised timing behaviour, and an associated avionics case study is presented in [8]. Only for this type of architecture, to the best of our knowledge, has a proof been provided that a large execution-time value of a program is a rare event [1]. (Figure 1: distribution of execution times.)
    Open problem. In practice, it is noticeable that the higher a measured execution time is, the smaller its probability of occurrence. In reality, the WCET is not easy to measure, and analysis tools can either overestimate the WCET (static analysis), underestimate it (taking into consideration only measurements), or predict it with a certain probability of occurrence (measurement-based probabilistic timing analyses). Figure 2 shows a description of the currently accepted relation between observed execution times, the WCET, and related bounds [5]. As stated in the introduction, associating a low probability of occurrence with large pET values was proved valid in the context of cache-randomised architectures. One would expect higher probabilities of occurrence for large pET values on existing real-world deterministic architectures (from which the
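    The extreme-value-theory route to a pWCET mentioned in the abstract above ([4, 9, 2]) can be sketched roughly as follows. This is a simplified illustration on synthetic data, not the method of any of the cited papers: it uses a block-maxima scheme with a method-of-moments Gumbel fit, and all timing numbers, block sizes, and exceedance probabilities are made up for the example. Real measurement-based probabilistic timing analysis also requires checking that the measurements are compatible with the i.i.d. hypothesis (e.g. on a time-randomised platform).

    ```python
    import math
    import random
    import statistics

    EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

    def gumbel_fit(maxima):
        """Method-of-moments Gumbel fit: scale from the standard deviation,
        location from the mean of the block maxima."""
        scale = statistics.stdev(maxima) * math.sqrt(6) / math.pi
        loc = statistics.mean(maxima) - EULER_GAMMA * scale
        return loc, scale

    def pwcet(loc, scale, p):
        """Gumbel quantile: execution-time bound exceeded with probability p."""
        return loc - scale * math.log(-math.log(1.0 - p))

    random.seed(42)
    # Synthetic per-run execution times (cycles): a base cost plus
    # exponentially distributed noise standing in for cache effects.
    times = [10_000 + random.expovariate(1 / 150) for _ in range(50_000)]

    block = 500  # hypothetical block size for the block-maxima method
    maxima = [max(times[i:i + block]) for i in range(0, len(times), block)]

    loc, scale = gumbel_fit(maxima)
    bound = pwcet(loc, scale, 1e-9)  # bound exceeded with probability 1e-9
    print(f"pWCET estimate at exceedance probability 1e-9: {bound:.0f} cycles")
    ```

    The resulting bound sits above every observed execution time: the tail fit extrapolates beyond the measurements, which is exactly what plain measurement-based analysis (taking the maximum observed time) cannot do.
    
    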