
    Enhancing the EAST-ADL error model with HiP-HOPS semantics

    EAST-ADL is a domain-specific modelling language for the engineering of automotive embedded systems. The language provides abstractions that enable engineers to capture a variety of design information across the lifecycle, from requirements to the detailed design of hardware and software architectures. The specification of the EAST-ADL language includes an error model extension, which documents language structures that allow potential failures of design elements to be specified locally. The effects of these failures are then assessed in the context of the architecture design. To provide this type of useful assessment, a language and a specification are not enough; a compiler-like tool is needed that can read and operate on a system specification together with its error model. In this paper we integrate the error model of EAST-ADL with the precise semantics of HiP-HOPS, a state-of-the-art tool that enables dependability analysis and optimization of design models. We present the integration concept between the EAST-ADL structure and the HiP-HOPS error propagation logic, and its transformation into the HiP-HOPS model. Source and destination models are represented using the corresponding XML formats. Connecting these two models at tool level enables practical EAST-ADL designs of embedded automotive systems to be analysed in terms of dependability, i.e. safety, reliability and availability. In addition, the information encoded in the error model can be re-used across different application contexts, with the associated benefits of cost reduction, simplification, and rationalisation of dependability assessments in complex engineering designs.
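    As a rough illustration of the tool-level connection described above, the sketch below reads a heavily simplified EAST-ADL-style error-model XML and emits a HiP-HOPS-style XML fragment. All element and attribute names here are invented for illustration and do not reflect the actual schemas of either tool.

# Minimal sketch of an EAST-ADL -> HiP-HOPS model transformation.
# All element/attribute names are hypothetical; the real schemas differ.
import xml.etree.ElementTree as ET

EAST_ADL_SRC = """
<ErrorModel>
  <ErrorBehavior target="BrakeController"
                 failureLogic="SensorFault OR ActuatorFault"
                 output="NoBraking"/>
</ErrorModel>
"""

def transform(east_adl_xml: str) -> str:
    """Map each source error behaviour onto a HiP-HOPS-like component."""
    src = ET.fromstring(east_adl_xml)
    dst = ET.Element("Model")  # destination (HiP-HOPS-like) root
    for behavior in src.iter("ErrorBehavior"):
        comp = ET.SubElement(dst, "Component", name=behavior.get("target"))
        failure = ET.SubElement(comp, "OutputDeviation",
                                name=behavior.get("output"))
        # Carry the failure expression over as propagation logic.
        ET.SubElement(failure, "Logic").text = behavior.get("failureLogic")
    return ET.tostring(dst, encoding="unicode")

print(transform(EAST_ADL_SRC))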

    Aspect-Oriented Programming

    Aspect-oriented programming is a promising idea that can improve the quality of software by reducing the problem of code tangling and improving the separation of concerns. At ECOOP'97, the first AOP workshop brought together a number of researchers interested in aspect-orientation. At ECOOP'98, during the second AOP workshop, the participants reported on progress in some research topics and raised further issues for discussion. This year, the ideas and concepts of AOP have spread and been adopted more widely, and, accordingly, the workshop received many submissions, covering areas from the design and application of aspects to the design and implementation of aspect languages.
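    To make the notion of code tangling concrete, here is a minimal sketch in Python (not one of the workshop's aspect languages): a cross-cutting logging concern is factored out of the business logic into a reusable decorator instead of being scattered through every function.

# Minimal sketch: a cross-cutting logging "aspect" as a Python decorator.
# Without it, every function would repeat (tangle) the same logging code.
import functools
import logging

logging.basicConfig(level=logging.INFO)

def logged(func):
    """Weave a logging concern around any function."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        logging.info("entering %s args=%r kwargs=%r", func.__name__, args, kwargs)
        result = func(*args, **kwargs)
        logging.info("leaving %s result=%r", func.__name__, result)
        return result
    return wrapper

@logged
def transfer(src: str, dst: str, amount: float) -> float:
    # Pure business logic; the logging concern lives entirely in @logged.
    return amount

transfer("alice", "bob", 42.0)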

    A Simulation Model Articulation of the REA Ontology

    This paper demonstrates how the REA enterprise ontology can be used to construct simulation models for business processes, value chains and collaboration spaces in supply chains. These models support various high-level and operational management simulation applications, e.g. the analysis of enterprise sustainability and day-to-day planning. First, the basic constructs of the REA ontology and the ExSpect modelling language for simulation are introduced. Second, collaboration space, value chain and business process models and their conceptual dependencies are shown, using the ExSpect language. Third, an exhibit demonstrates the use of value chain models in predicting the financial performance of an enterprise.
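    For readers unfamiliar with REA, the sketch below encodes its three basic constructs (economic Resources, Events, and Agents) and the duality that pairs give-and-take events, in plain Python. The names are illustrative only; the paper's actual ExSpect models are far richer.

# Minimal sketch of REA's core constructs: economic Resources, Events,
# and Agents, plus the duality that pairs "give" and "take" events.
from dataclasses import dataclass

@dataclass(frozen=True)
class Agent:
    name: str

@dataclass(frozen=True)
class Resource:
    name: str

@dataclass(frozen=True)
class EconomicEvent:
    resource: Resource
    quantity: float
    provider: Agent   # agent giving up the resource
    receiver: Agent   # agent receiving the resource

@dataclass(frozen=True)
class Exchange:
    """REA duality: every give event is paired with a take event."""
    give: EconomicEvent
    take: EconomicEvent

enterprise, customer = Agent("Enterprise"), Agent("Customer")
sale = Exchange(
    give=EconomicEvent(Resource("Goods"), 10, enterprise, customer),
    take=EconomicEvent(Resource("Cash"), 500, customer, enterprise),
)
print(sale.take.quantity)  # cash inflow from this exchange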

    Visual product architecture modelling for structuring data in a PLM system

    Part 8: Formalization for PLM. The goal of this paper is to determine the role of a product architecture model in supporting communication and in forming the basis for developing and maintaining product structure information in a PLM system. The paper describes a modelling tool for representing a product architecture in a company to support the development of a family of products, as well as the reasons leading to the use of the specific model and its terminology. The fundamental idea behind using the architecture model is that an improved understanding of the whole product system will lead to better decision making. Moreover, it is discussed how the sometimes intangible elements and phenomena within an architecture model can be visually modelled in order to form the basis for a data model in a PLM system.
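    As a rough illustration of how an architecture model can seed a PLM data model, the sketch below represents a product family as a tree of modules and variants in Python and flattens it into rows. The structure and field names are invented for illustration, not taken from the paper.

# Minimal sketch: a product architecture as a module tree that could be
# flattened into BOM-like rows for a PLM system's product structure.
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class Module:
    name: str
    variants: list[str] = field(default_factory=list)    # interchangeable options
    children: list[Module] = field(default_factory=list)

def flatten(module: Module, parent: str | None = None):
    """Yield (parent, module, variants) rows, like a bill of materials."""
    yield (parent, module.name, module.variants)
    for child in module.children:
        yield from flatten(child, module.name)

bicycle = Module("Bicycle", children=[
    Module("Frame", variants=["steel", "carbon"]),
    Module("Drivetrain", children=[
        Module("Gears", variants=["3-speed", "8-speed"]),
    ]),
])

for row in flatten(bicycle):
    print(row)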

    Verifying Policy Enforcers

    Policy enforcers are sophisticated runtime components that can prevent failures by enforcing the correct behavior of the software. While a single enforcer can be designed easily by focusing only on the behavior of the application that must be monitored, the effect of multiple enforcers that enforce different policies can be hard to predict. So far, mechanisms to resolve interferences between enforcers have been based on priority mechanisms and heuristics. Although these methods provide a way to take decisions when multiple enforcers try to affect the execution at the same time, they do not guarantee the absence of interference in the global behavior of the system. In this paper we present a verification strategy that can be exploited to discover interferences between sets of enforcers and thus safely identify, a priori, the enforcers that can co-exist at run-time. In our evaluation, we applied our verification method to several policy enforcers for Android and discovered some incompatibilities. Comment: Oliviero Riganelli, Daniela Micucci, Leonardo Mariani, and Yliès Falcone. Verifying Policy Enforcers. Proceedings of the 17th International Conference on Runtime Verification (RV), 2017 (to appear).
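    To give a flavour of what interference between enforcers means, the toy sketch below models two enforcers as functions that rewrite a proposed action, and checks by brute force over a small action set whether their composition is order-sensitive. This is an illustrative simplification, not the automata-based verification strategy of the paper; the action names are invented.

# Toy sketch: detect interference between two enforcers by checking
# whether applying them in different orders yields different results.
from itertools import permutations

def suppress_network(action: str) -> str:
    """Enforcer A: block any network access."""
    return "skip" if action == "net_send" else action

def force_encryption(action: str) -> str:
    """Enforcer B: rewrite plain sends into encrypted sends."""
    return "net_send_encrypted" if action == "net_send" else action

ACTIONS = ["net_send", "read_file", "net_send_encrypted"]
ENFORCERS = [suppress_network, force_encryption]

def interferes(enforcers, actions) -> bool:
    for action in actions:
        results = set()
        for order in permutations(enforcers):
            out = action
            for enforce in order:
                out = enforce(out)
            results.add(out)
        if len(results) > 1:  # the order of enforcement changes the outcome
            print(f"interference on {action!r}: {sorted(results)}")
            return True
    return False

interferes(ENFORCERS, ACTIONS)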

    Advanced Knowledge Technologies at the Midterm: Tools and Methods for the Semantic Web

    In a celebrated essay on the new electronic media, Marshall McLuhan wrote in 1962: "Our private senses are not closed systems but are endlessly translated into each other in that experience which we call consciousness. Our extended senses, tools, technologies, through the ages, have been closed systems incapable of interplay or collective awareness. Now, in the electric age, the very instantaneous nature of co-existence among our technological instruments has created a crisis quite new in human history. Our extended faculties and senses now constitute a single field of experience which demands that they become collectively conscious. Our technologies, like our private senses, now demand an interplay and ratio that makes rational co-existence possible. As long as our technologies were as slow as the wheel or the alphabet or money, the fact that they were separate, closed systems was socially and psychically supportable. This is not true now when sight and sound and movement are simultaneous and global in extent." (McLuhan 1962, p.5, emphasis in original)

    Over forty years later, the seamless interplay that McLuhan demanded between our technologies is still barely visible. McLuhan's predictions of the spread, and increased importance, of electronic media have of course been borne out, and the worlds of business, science and knowledge storage and transfer have been revolutionised. Yet the integration of electronic systems as open systems remains in its infancy.

    Advanced Knowledge Technologies (AKT) aims to address this problem: to create a view of knowledge and its management across its lifecycle, and to research and create the services and technologies that such unification will require. Half way through its six-year span, the results are beginning to come through, and this paper explores some of the services, technologies and methodologies that have been developed. We hope to give a sense of the potential for the next three years, to discuss the insights and lessons learnt in the first phase of the project, and to articulate the challenges and issues that remain.

    The WWW provided the original context that made the AKT approach to knowledge management (KM) possible. When AKT was initially proposed in 1999, it brought together an interdisciplinary consortium with the technological breadth and complementarity to create the conditions for a unified approach to knowledge across its lifecycle. The combination of this expertise, and the time and space afforded the consortium by the IRC structure, suggested the opportunity for a concerted effort to develop an approach to advanced knowledge technologies, based on the WWW as a basic infrastructure. The technological context of AKT altered for the better in the short period between the development of the proposal and the beginning of the project itself, with the emergence of the semantic web (SW), which foresaw much more intelligent manipulation and querying of knowledge. The opportunities that the SW provided for, e.g., more intelligent retrieval put AKT at the centre of information technology innovation and knowledge management services; the AKT skill set would clearly be central to the exploitation of those opportunities.

    The SW, as an extension of the WWW, provides an interesting set of constraints on the knowledge management services AKT tries to provide. As a medium for the semantically-informed coordination of information, it has suggested a number of ways in which the objectives of AKT can be achieved, most obviously through the provision of knowledge management services delivered over the web, as opposed to the creation and provision of technologies to manage knowledge. AKT is working on the assumption that many web services will be developed and provided for users. The KM problem in the near future will be one of deciding which services are needed and of coordinating them. Many of these services will be largely or entirely legacies of the WWW, and so the capabilities of the services will vary. As well as providing useful KM services in their own right, AKT will be aiming to exploit this opportunity by reasoning over services, brokering between them, and providing essential meta-services for SW knowledge service management.

    Ontologies will be a crucial tool for the SW. The AKT consortium brings together a great deal of expertise on ontologies, and ontologies were always going to be a key part of the strategy. All kinds of knowledge sharing and transfer activities will be mediated by ontologies, and ontology management will be an important enabling task. Different applications will need to cope with inconsistent ontologies, or with the problems that will follow the automatic creation of ontologies (e.g. the merging of pre-existing ontologies to create a third). Ontology mapping, and the elimination of conflicts of reference, will be important tasks. All of these issues are discussed along with our proposed technologies.

    Similarly, specifications of tasks will be used for the deployment of knowledge services over the SW, but in general it cannot be expected that in the medium term there will be standards for task (or service) specifications. The brokering meta-services that are envisaged will have to deal with this heterogeneity.

    The emerging picture of the SW is one of great opportunity, but it will not be a well-ordered, certain or consistent environment. It will comprise many repositories of legacy data, outdated and inconsistent stores, and requirements for common understandings across divergent formalisms. There is clearly a role for standards to play in bringing much of this context together, and AKT is playing a significant role in these efforts. But standards take time to emerge, they take political power to enforce, and they have been known to stifle innovation (in the short term). AKT is keen to understand the balance between principled inference and statistical processing of web content. Logical inference on the Web is tough: complex queries using traditional AI inference methods bring most distributed computer systems to their knees. Do we set up semantically well-behaved areas of the Web? Is any part of the Web in which semantic hygiene prevails interesting enough to reason in? These and many other questions need to be addressed if we are to provide effective knowledge technologies for our content on the web.
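    Since the abstract flags ontology mapping and conflicts of reference as key tasks, the sketch below illustrates the idea at its simplest in Python: two toy ontologies label the same concept differently, and a hand-written mapping table is used to merge them and surface conflicting assertions. This is an invented illustration, not an AKT tool.

# Toy sketch: merge two ontologies via a concept-mapping table and
# report conflicts of reference (same mapped concept, clashing values).
ONTOLOGY_A = {"Person/Tim_Berners-Lee": {"employer": "W3C"}}
ONTOLOGY_B = {"Agent/TimBL": {"employer": "MIT"}}

# Hand-written mapping: B's identifiers -> A's identifiers.
MAPPING = {"Agent/TimBL": "Person/Tim_Berners-Lee"}

def merge(a, b, mapping):
    merged = {entity: dict(props) for entity, props in a.items()}
    conflicts = []
    for entity, props in b.items():
        target = mapping.get(entity, entity)
        slot = merged.setdefault(target, {})
        for prop, value in props.items():
            if prop in slot and slot[prop] != value:
                conflicts.append((target, prop, slot[prop], value))
            else:
                slot[prop] = value
    return merged, conflicts

merged, conflicts = merge(ONTOLOGY_A, ONTOLOGY_B, MAPPING)
print(conflicts)  # [('Person/Tim_Berners-Lee', 'employer', 'W3C', 'MIT')]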

    Hybridization of Bayesian networks and belief functions to assess risk. Application to aircraft deconstruction

    This paper presents a study on knowledge management for the disassembly of end-of-life aircraft. We propose a model that uses Bayesian networks to assess risk, and we present three approaches to integrating belief functions, which represent fuzzy and uncertain knowledge.
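    As background for the hybridisation the abstract mentions, the sketch below implements Dempster's rule of combination, the standard way to fuse two belief-function mass assignments, over a toy two-level risk frame. The mass values and source names are invented for illustration; the paper's three integration approaches are more involved.

# Minimal sketch: Dempster's rule of combination for two mass functions
# over a small frame of discernment {low, high} (toy risk levels).
from itertools import product

def combine(m1, m2):
    """Fuse two basic belief assignments; focal sets are frozensets."""
    fused, conflict = {}, 0.0
    for (b, mb), (c, mc) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:
            fused[inter] = fused.get(inter, 0.0) + mb * mc
        else:
            conflict += mb * mc  # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    # Normalise the surviving mass by the non-conflicting proportion.
    return {a: m / (1.0 - conflict) for a, m in fused.items()}

LOW, HIGH = frozenset({"low"}), frozenset({"high"})
EITHER = LOW | HIGH  # ignorance: mass on the whole frame {low, high}

expert = {LOW: 0.6, EITHER: 0.4}   # hypothetical expert opinion
sensor = {HIGH: 0.3, EITHER: 0.7}  # hypothetical sensor evidence

print(combine(expert, sensor))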