
    Telemaco: A Language Oriented Tool for Graph-based Models Layout Optimization, Journal of Telecommunications and Information Technology, 2013, no. 4

    Progress in ICT is shifting the paradigm of systems organization towards a distributed approach, in which the physical deployment of components influences the evaluation of system properties. This can be framed as a graph layout optimization problem, well known in the literature, where several approaches have been exploited in different application fields with different solving techniques. At the same time, complex systems can only be studied by means of different formalisms, whose codification is the aim of language engineering. Telemaco is a tool that supports a novel approach for applying graph layout optimizations to heterogeneous models, based on the OsMoSys framework and on language engineering principles. It can cope with different graph-based formalisms by exploiting either their core graph nature or their specialized features by means of language hierarchies. In this paper Telemaco is introduced together with its foundations and an example of application to Wireless Sensor Network (WSN) deployment.
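    As a rough illustration of the kind of graph layout optimization mentioned in this abstract (not Telemaco's actual algorithm), the sketch below implements a basic force-directed placement in the Fruchterman-Reingold style; all node names and parameters are illustrative assumptions.

```python
import math
import random

# Minimal force-directed layout sketch (Fruchterman-Reingold style).
# Generic illustration of graph layout optimization; not Telemaco's method.

def layout(nodes, edges, width=100.0, height=100.0, iterations=200):
    k = math.sqrt(width * height / len(nodes))   # ideal edge length
    pos = {n: [random.uniform(0, width), random.uniform(0, height)] for n in nodes}
    temp = width / 10.0                           # cooling temperature

    for _ in range(iterations):
        disp = {n: [0.0, 0.0] for n in nodes}
        # Repulsive forces between every pair of nodes
        for v in nodes:
            for u in nodes:
                if u == v:
                    continue
                dx, dy = pos[v][0] - pos[u][0], pos[v][1] - pos[u][1]
                dist = math.hypot(dx, dy) or 0.01
                f = k * k / dist
                disp[v][0] += dx / dist * f
                disp[v][1] += dy / dist * f
        # Attractive forces along edges
        for v, u in edges:
            dx, dy = pos[v][0] - pos[u][0], pos[v][1] - pos[u][1]
            dist = math.hypot(dx, dy) or 0.01
            f = dist * dist / k
            disp[v][0] -= dx / dist * f
            disp[v][1] -= dy / dist * f
            disp[u][0] += dx / dist * f
            disp[u][1] += dy / dist * f
        # Move nodes, limited by the current temperature, then cool down
        for v in nodes:
            d = math.hypot(*disp[v]) or 0.01
            pos[v][0] += disp[v][0] / d * min(d, temp)
            pos[v][1] += disp[v][1] / d * min(d, temp)
        temp *= 0.95
    return pos

# Example: a small sensor-network-like graph
print(layout(["a", "b", "c", "d"], [("a", "b"), ("b", "c"), ("c", "d"), ("d", "a")]))
```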

    Flexible Views for View-based Model-driven Development

    Modern software development faces the problem of fragmentation of information across heterogeneous artefacts in different modelling and programming languages. In this dissertation, the Vitruvius approach for view-based engineering is presented. Flexible views offer a compact definition of user-specific views on software systems and can be defined with the novel ModelJoin language. The process is supported by a change metamodel for metamodel evolution and change impact analysis.

    Methodologies synthesis

    This deliverable deals with the modelling and analysis of interdependencies between critical infrastructures, focussing attention on two interdependent infrastructures studied in the context of CRUTIAL: the electric power infrastructure and the information infrastructures supporting management, control and maintenance functionality. The main objectives are: 1) to investigate the main challenges to be addressed for the analysis and modelling of interdependencies, 2) to review the modelling methodologies and tools that can be used to address these challenges and support the evaluation of the impact of interdependencies on the dependability and resilience of the service delivered to the users, and 3) to present the preliminary directions investigated so far by the CRUTIAL consortium for describing and modelling interdependencies.

    High-Performance Modelling and Simulation for Big Data Applications

    This open access book was prepared as a Final Publication of the COST Action IC1406 “High-Performance Modelling and Simulation for Big Data Applications (cHiPSet)” project. Long considered important pillars of the scientific method, Modelling and Simulation have evolved from traditional discrete numerical methods to complex data-intensive continuous analytical optimisations. Resolution, scale, and accuracy have become essential to predict and analyse natural and complex systems in science and engineering. As their level of abstraction rises to give a better discernment of the domain at hand, their representations become increasingly demanding in computational and data resources. On the other hand, High Performance Computing typically entails the effective use of parallel and distributed processing units coupled with efficient storage, communication and visualisation systems to underpin complex data-intensive applications in distinct scientific and technical domains. A seamless interaction of High Performance Computing with Modelling and Simulation is therefore required in order to store, compute, analyse, and visualise large data sets in science and engineering. Funded by the European Commission, cHiPSet has provided a dynamic trans-European forum for its members and distinguished guests to openly discuss novel perspectives and topics of interest for these two communities. This cHiPSet compendium presents a set of selected case studies related to healthcare, biological data, computational advertising, multimedia, finance, bioinformatics, and telecommunications.

    Extended Fault Trees Analysis supported by Stochastic Petri Nets

    This work presents several extensions to the Fault Tree [90] formalism used to build models oriented to the Dependability [103] analysis of systems. In this way, we increase the modelling capacity of Fault Trees, which turn from simple combinatorial models into a high-level language for representing more complex aspects of the behaviour and failure modes of systems. Together with the extensions to the Fault Tree formalism, this work proposes solution methods for extended Fault Trees in order to cope with the new modelling facilities. These methods are mainly based on the use of Stochastic Petri Nets. Some of the formalisms described in this work are already present in the literature; for these we propose alternative solution methods with respect to the existing ones. Other formalisms are instead part of the original contribution of this work.
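    For readers unfamiliar with the combinatorial baseline that these extensions build on, the sketch below computes the top-event probability of a plain Fault Tree bottom-up from independent basic events. It does not cover the extensions or the Stochastic Petri Net solution methods of this work; the gate encoding and example tree are illustrative assumptions.

```python
# Minimal sketch of classical (combinatorial) Fault Tree evaluation:
# the top-event probability is computed bottom-up from basic-event
# probabilities, assuming statistically independent basic events.

def evaluate(node, probs):
    """Return the failure probability of a Fault Tree node.

    node  -- a basic-event name (str) or a tuple (gate, children)
    probs -- mapping from basic-event name to failure probability
    """
    if isinstance(node, str):
        return probs[node]
    gate, children = node
    p = [evaluate(c, probs) for c in children]
    if gate == "AND":                 # all children must fail
        out = 1.0
        for x in p:
            out *= x
        return out
    if gate == "OR":                  # at least one child fails
        out = 1.0
        for x in p:
            out *= (1.0 - x)
        return 1.0 - out
    raise ValueError(f"unknown gate {gate}")

# Example: TOP = OR(AND(e1, e2), e3)
tree = ("OR", [("AND", ["e1", "e2"]), "e3"])
print(evaluate(tree, {"e1": 0.01, "e2": 0.02, "e3": 0.005}))
```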
