
    New Approaches to the Use of Classical Scenario Planning Tools

    The future is yet to be built; it is multiple and uncertain. Within the social sciences, scenarios can be defined as a description of a future situation, together with a course of events that allows moving from an initial position toward that future situation. A multiplicity of methods and tools is currently available for building scenarios, including methods with an essentially rationalist approach, such as Michel Godet’s scenario method. The purpose of this work is to use the hypothetical-deductive method to reduce, starting from Michel Godet’s Scenario Method and its tools, the complexity of the scenario-building process while maintaining the robustness of the findings. To this end, two different approaches are proposed: (1) integrating the structural analysis and the cross-impact matrix into a single step, so that the former is derived automatically while the latter is filled in; (2) using the concept of Bayesian networks to integrate the cross-impact matrix and the morphological analysis. Both approaches aim to reduce the amount of information needed to feed the tools and to improve the feedback criteria, resulting in greater flexibility during the process and a better holistic view of the system. Scientifically, these approaches open a new field of study in scenario planning, as they appropriate the concept of Bayesian networks, widely used in other areas of knowledge (artificial intelligence, geological studies, medical diagnostics, pattern classification, etc.), and bring it to the field of social sciences.
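    The second approach, coupling the cross-impact matrix with morphological analysis through a Bayesian network, can be illustrated with a toy sketch. Everything below is invented for illustration (the variables "Economy" and "Regulation", their states, and all probabilities are hypothetical, not taken from the work): cross-impact judgments become conditional probability tables, and each cell combination of the morphological space gets a joint plausibility.

    ```python
    from itertools import product

    # Hypothetical scenario variables (morphological dimensions) with
    # discrete states; cross-impact information is encoded as
    # conditional probabilities, the core idea of using a Bayesian
    # network to couple both tools.

    # Prior for the driving variable "Economy".
    p_economy = {"growth": 0.6, "recession": 0.4}

    # Cross-impact of "Economy" on "Regulation": P(Regulation | Economy).
    p_regulation = {
        "growth":    {"lax": 0.7, "strict": 0.3},
        "recession": {"lax": 0.2, "strict": 0.8},
    }

    def scenario_probability(economy, regulation):
        """Joint probability of one morphological configuration."""
        return p_economy[economy] * p_regulation[economy][regulation]

    # Rank every configuration of the morphological space by plausibility.
    scenarios = sorted(
        ((e, r, scenario_probability(e, r))
         for e, r in product(p_economy, p_regulation["growth"])),
        key=lambda s: s[2], reverse=True,
    )
    for economy, regulation, p in scenarios:
        print(f"Economy={economy:9s} Regulation={regulation:6s} P={p:.2f}")
    ```

    Dedicated libraries (e.g. pgmpy) would handle larger networks; the point here is only that one set of conditional tables feeds both the cross-impact reasoning and the ranking of morphological configurations.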

    Scheduling real-time systems with cyclic dependence using data criticality

    The increase of interdependent components in avionic and automotive software raises new challenges for real-time system integration. For instance, most scheduling and mapping techniques proposed in the literature rely on the availability of the system’s DAG representation. However, at the initial stage of system design, a dataflow graph (DFG) is generally used to represent the dependencies between software components. Due to limited software knowledge, legacy components might not have fully specified dependencies, leading to cycles in the DFG, which makes it difficult to determine the overall schedule of the system and restricts access to DAG-based techniques. In this paper, we propose an approach that breaks cycles based on the assignment of a degree of importance, with no inherent knowledge of the functional or temporal behaviour of the components. We define a “criticality” metric that quantifies the effect of removing edges on the system by tracking the propagation of error through the graph. The approach was reported to produce systems (56±14)% less critical than other methods. It was also validated on two case studies, a data modem and an industrial full-mission simulator, while ensuring that the correctness of the system is maintained.
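    The cycle-breaking idea can be sketched as follows. This is a minimal illustration, not the paper’s algorithm: the criticality proxy used here (how many nodes an edge feeds, directly or transitively) is our own simplification of the error-propagation metric described in the abstract.

    ```python
    def reachable(graph, start):
        """All nodes reachable from `start` (iterative DFS)."""
        seen, stack = set(), [start]
        while stack:
            for m in graph.get(stack.pop(), ()):
                if m not in seen:
                    seen.add(m)
                    stack.append(m)
        return seen

    def break_cycles(dfg):
        """Cut the least 'critical' edges until the graph is acyclic.

        Criticality proxy (our simplification): the number of nodes an
        edge propagates data to, directly or transitively.
        """
        graph = {u: set(vs) for u, vs in dfg.items()}
        removed = []
        while True:
            # An edge (u, v) lies on a cycle iff u is reachable from v.
            cyclic = [(u, v) for u in graph for v in graph[u]
                      if u in reachable(graph, v)]
            if not cyclic:
                return graph, removed
            # Cut the edge whose downstream influence is smallest.
            u, v = min(cyclic, key=lambda e: len(reachable(graph, e[1])))
            graph[u].discard(v)
            removed.append((u, v))

    # Toy DFG with one cycle: A -> B -> C -> A, plus a sink D.
    dag, cut = break_cycles({"A": ["B"], "B": ["C"], "C": ["A", "D"], "D": []})
    print("removed edges:", cut)
    ```

    The resulting graph is a DAG, so standard DAG-based scheduling and mapping techniques become applicable again.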

    Handling Information and its Propagation to Engineer Complex Embedded Systems

    In today’s data-driven technology, it is easy to assume that information is at the tip of our fingers, ready to be exploited. Research methodologies and tools are often built on top of this assumption. However, this illusion of abundance often breaks when attempting to transfer existing techniques to industrial applications. For instance, research has produced various methodologies to optimize the resource usage of large complex systems, such as the avionics of the Airbus A380. These approaches require the knowledge of certain metrics such as execution time, memory consumption, communication delays, etc. The design of these complex systems, however, employs a mix of expertise from different fields (likely with limited knowledge in software engineering), which might lead to incomplete or missing specifications. Moreover, the unavailability of relevant information makes it difficult to properly describe the system, predict its behavior, and improve its performance. We fall back on probabilistic models and machine learning techniques to address this lack of relevant information. Probability theory, especially, has great potential to describe partially observable systems. Our objective is to provide approaches and solutions to produce relevant information. This enables a proper description of complex systems to ease integration, and allows the use of existing optimization techniques. Our first step is to tackle one of the difficulties encountered during system integration: ensuring the proper timing behavior of critical systems. Due to technology scaling, and with the growing reliance on multi-core architectures, the overhead of software running on different cores and sharing memory space is no longer negligible. To this end, we extend the real-time system toolkit with a static probabilistic timing analysis technique that accurately estimates the execution of software with an awareness of shared-memory contention. The model is then incorporated into a simulator for scheduling multi-processor real-time systems.
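    The flavor of a static probabilistic timing analysis can be conveyed with a small sketch. All numbers below are invented, and the independence assumption between blocks is a simplification: execution times are modeled as discrete distributions, a path’s distribution is the convolution of its blocks plus a contention term, and the probabilistic WCET is read off the survival function at a target exceedance probability.

    ```python
    from collections import defaultdict

    def convolve(d1, d2):
        """Distribution of the sum of two independent execution times."""
        out = defaultdict(float)
        for t1, p1 in d1.items():
            for t2, p2 in d2.items():
                out[t1 + t2] += p1 * p2
        return dict(out)

    def pwcet(dist, exceedance):
        """Smallest bound t whose exceedance probability P(X > t) <= target."""
        survival = 1.0
        for t in sorted(dist):
            survival -= dist[t]      # survival = P(X > t)
            if survival <= exceedance:
                return t

    # Invented per-block timing distribution (e.g. cache hit vs. miss).
    block = {2: 0.9, 3: 0.1}
    # Invented shared-memory contention delay on this path.
    contention = {0: 0.8, 5: 0.2}

    # Two blocks on the path, plus contention, assumed independent.
    path = convolve(convolve(block, block), contention)
    print("pWCET at 1e-3 exceedance:", pwcet(path, exceedance=1e-3))
    ```

    A real analysis would derive the distributions statically from the hardware model rather than assume them, but the convolution-and-tail-bound structure is the same.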