
    Efficient computer-aided verification of parallel and distributed software systems

    Society is becoming increasingly dependent on applications of distributed software systems, such as controller systems and wireless telecommunications. It is very difficult to guarantee the correct operation of such systems with traditional software quality assurance methods, such as code reviews and testing. Formal methods, which are based on mathematical theories, have been suggested as a solution. Unfortunately, the vast complexity of these systems and the lack of competent personnel have prevented the adoption of sophisticated methods such as theorem proving. Computerised tools for verifying finite-state asynchronous systems exist, and they have been successful in locating errors in relatively small software systems. However, a direct translation of software to low-level formal models may lead to unmanageably large models or complex behaviour. Abstract models and algorithms that operate on compact high-level designs are needed to analyse larger systems. This work introduces modelling formalisms and verification methods for distributed systems, presents efficient algorithms for verifying high-level models of large software systems, including an automated method for abstracting unneeded details from systems consisting of loosely connected components, and shows how the methods can be applied in the software development industry.
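
    The abstract above refers to computerised verification of finite-state asynchronous systems. As a purely illustrative sketch of that general idea (not the thesis's own algorithms), the following Python snippet performs breadth-first explicit-state reachability over a toy two-process mutual-exclusion model and checks a safety property on every reachable state; the model and all names are assumptions made up for this example.

        # Illustrative explicit-state reachability sketch (not the thesis's method).
        from collections import deque

        def successors(state):
            """Interleaved moves of two processes cycling idle -> trying -> critical.
            A process may enter 'critical' only if the other process is not there."""
            moves = []
            for i in (0, 1):
                local, other = state[i], state[1 - i]
                if local == "idle":
                    nxt = "trying"
                elif local == "trying" and other != "critical":
                    nxt = "critical"
                elif local == "critical":
                    nxt = "idle"
                else:
                    continue
                new = list(state)
                new[i] = nxt
                moves.append(tuple(new))
            return moves

        def check_mutex(initial=("idle", "idle")):
            seen, queue = {initial}, deque([initial])
            while queue:
                state = queue.popleft()
                if state == ("critical", "critical"):
                    return False, len(seen)      # safety property violated
                for nxt in successors(state):
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append(nxt)
            return True, len(seen)               # all reachable states are safe

        ok, explored = check_mutex()
        print(f"mutual exclusion holds: {ok}, states explored: {explored}")

    On realistic software models the reachable state space explodes, which is exactly why the abstraction techniques described in the thesis are needed.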

    A SOA-based architecture framework

    We present a Service-Oriented Architecture (SOA)-based architecture framework. The architecture framework is designed to be close to industry standards, especially to the Service Component Architecture (SCA). The framework is language independent, and the building blocks of each system, activities and data, are first-class citizens. We present a meta model of the architecture framework and discuss its concepts in detail. Through the framework, concepts of an SOA such as wiring, correlation and instantiation can be clarified.
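
    Purely as a reading aid (not the framework's actual meta model and not SCA), here is a hypothetical Python sketch of the kind of concepts the abstract names: activities and data as first-class building blocks, grouped into components and connected by wires. All class and instance names are invented for illustration.

        # Hypothetical sketch of meta-model-style concepts; names are illustrative only.
        from dataclasses import dataclass, field

        @dataclass
        class DataItem:                 # data as a first-class building block
            name: str
            schema: str

        @dataclass
        class Activity:                 # activities as first-class building blocks
            name: str
            consumes: list = field(default_factory=list)
            produces: list = field(default_factory=list)

        @dataclass
        class Wire:                     # wiring: connects two activities
            source: Activity
            target: Activity

        @dataclass
        class Component:
            name: str
            activities: list = field(default_factory=list)
            wires: list = field(default_factory=list)

        order = DataItem("Order", "order.xsd")
        receive = Activity("ReceiveOrder", produces=[order])
        bill = Activity("BillCustomer", consumes=[order])
        billing = Component("Billing", activities=[receive, bill],
                            wires=[Wire(receive, bill)])
        print(billing.name, [a.name for a in billing.activities])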

    A System for Deduction-based Formal Verification of Workflow-oriented Software Models

    The work concerns formal verification of workflow-oriented software models using a deductive approach. The formal correctness of a model's behaviour is considered. Manually building logical specifications, which are considered as a set of temporal logic formulas, seems to be a significant obstacle for an inexperienced user applying the deductive approach. A system, and its architecture, for the deduction-based verification of workflow-oriented models is proposed. The process of inference is based on the semantic tableaux method, which has some advantages when compared to traditional deduction strategies. An algorithm for the automatic generation of logical specifications is proposed. The generation procedure is based on predefined workflow patterns for BPMN, which is a standard and dominant notation for the modeling of business processes. The main idea of the approach is to consider patterns, defined in terms of temporal logic, as a kind of (logical) primitives which enable the transformation of models into temporal logic formulas constituting a logical specification. Automation of the generation process is crucial for bridging the gap between the intuitiveness of deductive reasoning and the difficulty of its practical application when logical specifications are built manually. This approach goes some way towards supporting, and hopefully enhancing our understanding of, the deduction-based formal verification of workflow-oriented models. Comment: International Journal of Applied Mathematics and Computer Science.
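
    To make the pattern-to-formula idea concrete, here is a small, hypothetical Python sketch that turns two workflow-style patterns into temporal-logic formula strings. The pattern set, the formula shapes, and the task names are assumptions for illustration; they are not the article's actual pattern definitions or its generation algorithm.

        # Hypothetical pattern-to-formula sketch; formula shapes are illustrative, not the paper's.
        def seq(a, b):
            """Sequence pattern: whenever task a completes, task b eventually follows."""
            return f"G({a} -> F({b}))"

        def xor_split(a, b, c):
            """Exclusive choice: after a, eventually exactly one of b or c occurs."""
            return f"G({a} -> F(({b} & !{c}) | ({c} & !{b})))"

        # Compose a logical specification for a toy fragment:
        # ReceiveOrder ; CheckOrder ; (Approve XOR Reject)
        specification = [
            seq("ReceiveOrder", "CheckOrder"),
            xor_split("CheckOrder", "Approve", "Reject"),
        ]
        for formula in specification:
            print(formula)

    A specification assembled along these lines could then be handed to a semantic-tableaux prover, the inference method the paper builds on.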

    Sixth Workshop and Tutorial on Practical Use of Coloured Petri Nets and the CPN Tools Aarhus, Denmark, October 24-26, 2005

    This booklet contains the proceedings of the Sixth Workshop on Practical Use of Coloured Petri Nets and the CPN Tools, October 24-26, 2005. The workshop is organised by the CPN group at the Department of Computer Science, University of Aarhus, Denmark. The papers are also available in electronic form via the web pages: http://www.daimi.au.dk/CPnets/workshop0

    Methodologies synthesis

    This deliverable deals with the modelling and analysis of interdependencies between critical infrastructures, focussing attention on two interdependent infrastructures studied in the context of CRUTIAL: the electric power infrastructure and the information infrastructures supporting management, control and maintenance functionality. The main objectives are: 1) to investigate the main challenges to be addressed for the analysis and modelling of interdependencies, 2) to review the modelling methodologies and tools that can be used to address these challenges and support the evaluation of the impact of interdependencies on the dependability and resilience of the service delivered to the users, and 3) to present the preliminary directions investigated so far by the CRUTIAL consortium for describing and modelling interdependencies.

    Second Workshop on Modelling of Objects, Components and Agents

    This report contains the proceedings of the workshop Modelling of Objects, Components, and Agents (MOCA'02), August 26-27, 2002. The workshop is organized by the 'Coloured Petri Net' Group at the University of Aarhus, Denmark and the 'Theoretical Foundations of Computer Science' Group at the University of Hamburg, Germany. The homepage of the workshop is: http://www.daimi.au.dk/CPnets/workshop02

    Fifth Workshop and Tutorial on Practical Use of Coloured Petri Nets and the CPN Tools Aarhus, Denmark, October 8-11, 2004

    This booklet contains the proceedings of the Fifth Workshop on Practical Use of Coloured Petri Nets and the CPN Tools, October 8-11, 2004. The workshop is organised by the CPN group at the Department of Computer Science, University of Aarhus, Denmark. The papers are also available in electronic form via the web pages: http://www.daimi.au.dk/CPnets/workshop0

    Land-Cover and Land-Use Study Using Genetic Algorithms, Petri Nets, and Cellular Automata

    Recent research techniques, such as the genetic algorithm (GA), Petri net (PN), and cellular automata (CA), have been applied in a number of studies. However, their capability and performance in land-cover land-use (LCLU) classification, change detection, and predictive modeling have not been well understood. This study seeks to address the following questions: 1) How do genetic parameters impact the accuracy of GA-based LCLU classification; 2) How do image parameters impact the accuracy of GA-based LCLU classification; 3) Is GA-based LCLU classification more accurate than the maximum likelihood classifier (MLC), the iterative self-organizing data analysis technique (ISODATA), and the hybrid approach; 4) How do genetic parameters impact Petri net-based LCLU change detection; and 5) How do cellular automata components impact the accuracy of LCLU predictive modeling. The study area, namely the Tickfaw River watershed (711 mi²), is located in southeast Louisiana and southwest Mississippi. The major datasets include time-series Landsat TM / ETM images and Digital Orthophoto Quarter Quadrangles (DOQQs). LCLU classification was conducted using the GA, MLC, ISODATA, and hybrid approaches. LCLU change was modeled using a genetic PN-based process-mining technique. The process models were interpreted and used as input to a CA for predicting future LCLU. The major findings include: 1) GA-based LCLU classification is more accurate than the traditional approaches; 2) When genetic parameters, image parameters, or CA components are configured improperly, the accuracy of LCLU classification, the coverage of the LCLU change process model, and/or the accuracy of LCLU predictive modeling will be low; 3) For GA-based LCLU classification, the recommended configuration of genetic / image parameters is generations 2000-5000, population 1000, crossover rate 69%-99%, mutation rate 0.1%-0.5%, generation gap 25%-50%, data layers 16-20, training / testing data size 10000-20000 / 5000-10000, and spatial resolution 30 m-60 m; 4) For genetic Petri net-based LCLU change detection, the recommended configuration of genetic parameters is generation 500, population 300, crossover rate 59%, mutation rate 5%, and elitism rate 4%; and 5) For CA-based LCLU predictive modeling, the recommended configuration of CA components is space 6025 * 12993, state 2, von Neumann neighborhood 3 * 3, time step 2-3 years, and optimized transition rules.
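
    The abstract's recommendations are phrased in terms of standard GA parameters (population size, crossover rate, mutation rate, number of generations). The following toy Python sketch shows how those parameters enter a generic GA loop; the fitness function is a stand-in, not a land-cover classifier, and none of this reproduces the study's actual implementation.

        # Toy genetic-algorithm loop; parameter names mirror the abstract, values are arbitrary.
        import random

        def fitness(bits):                       # stand-in objective: count of 1s
            return sum(bits)

        def evolve(pop_size=100, length=20, generations=50,
                   crossover_rate=0.8, mutation_rate=0.005, seed=42):
            rng = random.Random(seed)
            pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
            for _ in range(generations):
                def pick():                      # binary tournament selection
                    a, b = rng.sample(pop, 2)
                    return a if fitness(a) >= fitness(b) else b
                nxt = []
                while len(nxt) < pop_size:
                    p1, p2 = pick(), pick()
                    if rng.random() < crossover_rate:        # single-point crossover
                        cut = rng.randrange(1, length)
                        child = p1[:cut] + p2[cut:]
                    else:
                        child = p1[:]
                    child = [1 - g if rng.random() < mutation_rate else g
                             for g in child]                  # bit-flip mutation
                    nxt.append(child)
                pop = nxt
            return max(pop, key=fitness)

        best = evolve()
        print("best fitness:", fitness(best))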

    Data-driven conceptual modeling: how some knowledge drivers for the enterprise might be mined from enterprise data

    As organizations perform their business, they analyze, design and manage a variety of processes represented in models with different scopes and scales of complexity. Specifying these processes requires a certain level of modeling competence. However, this requirement is not always matched by the capability of the person(s) responsible for defining and modeling an organization's or enterprise's operations. On the other hand, an enterprise typically collects records of all events that occur during the operation of its processes. Records such as the start and end of the tasks in a process instance, the state transitions of objects impacted by the process execution, and the messages exchanged during the process execution are maintained in enterprise repositories as various logs, such as event logs, process logs, effect logs, and message logs. Furthermore, the volume of data generated by enterprise process execution has grown manyfold in just a few years. On top of this, models are often considered the dashboard view of an enterprise. Models represent an abstraction of the underlying reality of an enterprise. Models also serve as knowledge drivers through which an enterprise can be managed. Data-driven extraction offers the capability to mine these knowledge drivers from enterprise data and to leverage the mined models to establish the set of enterprise data that conforms with the desired behaviour. This thesis aims to generate models, or knowledge drivers, from enterprise data to enable a dashboard view of the enterprise and to provide support for analysts. The rationale for this is stated as the requirement to improve an existing process or to create a new one; models can also serve as a collection of effectors through which an organization or an enterprise can be managed. The enterprise data referred to above have been identified as process logs, effect logs, message logs, and invocation logs. The approach in this thesis is to mine these logs to generate process, requirements, and enterprise architecture models, and to discover how goals get fulfilled based on collected operational data. The research question has been formulated as: is it possible to derive the knowledge drivers from the enterprise data that represent the running operation of the enterprise, or, in other words, is it possible to use the data available in the enterprise repository to generate the knowledge drivers? Chapter 2 reviews the literature that provides the background needed to explore this research question. Chapter 3 presents how process semantics can be mined. Chapter 4 suggests a way to extract a requirements model. Chapter 5 presents a way to discover the underlying enterprise architecture, and Chapter 6 presents a way to mine how goals get orchestrated. The overall findings are discussed in Chapter 7 to derive some conclusions.
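
    As a purely illustrative sketch of mining a model from an event log (not the thesis's techniques), the following Python snippet derives a directly-follows relation, one of the simplest process abstractions a log supports; the log contents and names are invented for the example.

        # Illustrative directly-follows miner over a tiny in-memory event log.
        from collections import Counter, defaultdict

        # Each trace is the ordered list of task names recorded for one process instance.
        event_log = [
            ["register", "check", "approve", "archive"],
            ["register", "check", "reject", "archive"],
            ["register", "check", "approve", "archive"],
        ]

        def directly_follows(log):
            """Count how often task b directly follows task a across all traces."""
            counts = Counter()
            for trace in log:
                for a, b in zip(trace, trace[1:]):
                    counts[(a, b)] += 1
            return counts

        def successor_map(counts):
            """Group the relation into a successor map, a crude process 'model'."""
            model = defaultdict(set)
            for (a, b) in counts:
                model[a].add(b)
            return dict(model)

        print(successor_map(directly_follows(event_log)))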

    Acta Cybernetica: Volume 17, Number 4.
