3 research outputs found

    Discovery and validation for composite services on the semantic web

    Current technologies for locating and validating composite services are not sufficient, for the following reasons.

    • Current frameworks cannot create complete service descriptions, since they do not model all the functional aspects together (i.e. the purpose of a service, state transitions, data transformations). Those that deal with behavioural descriptions are unable to model the ordering constraints between concurrent interactions completely, since they do not consider the time taken by interactions. Furthermore, there is no mechanism to assess the correctness of a functional description.

    • Existing semantic-based matching techniques cannot locate services that conform to global constraints. Semantic-based techniques use ontological relationships to map the terms in service descriptions to those in user requests. Therefore, unlike techniques that perform either direct string matching or schema matching, semantic-based approaches can match descriptions created with different terminologies and achieve a higher recall. Global constraints are restrictions on the values of two or more attributes of multiple constituent services.

    • Current techniques that generate and validate global communication models of composite services yield inaccurate results (i.e. detect phantom deadlocks or ignore actual deadlocks), since they either (i) do not support all types of interactions (i.e. only send and receive, not service and invoke) or (ii) do not consider the time taken by interactions.

    This thesis presents novel ideas to address these limitations. First, we propose two formalisms, WS-ALUE and WS-π-calculus, for creating functional and behavioural descriptions respectively. WS-ALUE extends the Description Logic language ALUE with new predicates and models all the functional aspects together. WS-π-calculus extends π-calculus with Interval Time Logic (ITL) axioms, which accurately model the temporal relationships between concurrent interactions. A technique that compares the WS-π-calculus description of a service against its WS-ALUE description is introduced to detect any errors that are not equally reflected in both descriptions.

    Second, we propose novel semantic-based matching techniques to locate composite services that conform to global constraints. These constraints are of two types: strictly dependent and independent. A constraint is strictly dependent if, once a value is assigned to one restricted attribute, the values of all the remaining restricted attributes are uniquely determined; any global constraint that is not strictly dependent is independent. A complete and correct technique that locates services conforming to strictly dependent constraints in polynomial time is defined using a three-dimensional data cube (see the sketch after this abstract). The proposed approach for independent constraints is a heuristic: it is correct but not complete, and it combines user-defined objective functions, greedy algorithms, and domain rules to locate conforming services.

    Third, we propose a new approach to generate global communication models (of composite services) that are free of deadlocks and synchronisation conflicts. This approach is an extension of a transitive temporal reasoning mechanism
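    To make the strictly dependent case above concrete, here is a minimal sketch of the lookup idea: fixing one restricted attribute uniquely determines the values the other restricted attributes must take, so conforming constituent services can be found through precomputed, polynomial-time lookups. The attribute names, service names, and the attribute-value-service cube layout below are illustrative assumptions, not the thesis's actual WS-ALUE machinery.

```python
# Illustrative sketch only: a 3-way index (attribute -> value -> services)
# standing in for the thesis's three-dimensional data cube.

# cube[attribute][value] -> set of candidate services offering that value
cube = {
    "currency": {"EUR": {"PayCo", "EuroPay"}, "USD": {"PayCo"}},
    "region":   {"EU": {"ShipEU", "PayCo", "EuroPay"}, "US": {"ShipUS", "PayCo"}},
    "tax_rule": {"VAT": {"EuroPay", "ShipEU"}, "SalesTax": {"ShipUS", "PayCo"}},
}

# Strictly dependent constraint (hypothetical): choosing the currency
# uniquely determines the values of the other restricted attributes.
dependency = {
    "EUR": {"region": "EU", "tax_rule": "VAT"},
    "USD": {"region": "US", "tax_rule": "SalesTax"},
}

def conforming_services(chosen_currency):
    """Candidate constituent services compatible with the global
    constraint implied by the chosen currency (constant-time lookups
    per attribute, hence polynomial overall)."""
    required = {"currency": chosen_currency, **dependency[chosen_currency]}
    candidate_sets = [cube[attr][val] for attr, val in required.items()]
    # Any service registered for one of the required attribute values is a
    # candidate constituent of the composition.
    return set().union(*candidate_sets)

print(conforming_services("EUR"))   # {'PayCo', 'EuroPay', 'ShipEU'} (order may vary)
```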

    Land-Cover and Land-Use Study Using Genetic Algorithms, Petri Nets, and Cellular Automata

    Recent research techniques such as genetic algorithms (GA), Petri nets (PN), and cellular automata (CA) have been applied in a number of studies. However, their capability and performance in land-cover and land-use (LCLU) classification, change detection, and predictive modeling have not been well understood. This study seeks to address the following questions:

    1) How do genetic parameters impact the accuracy of GA-based LCLU classification?
    2) How do image parameters impact the accuracy of GA-based LCLU classification?
    3) Is GA-based LCLU classification more accurate than the maximum likelihood classifier (MLC), the iterative self-organizing data analysis technique (ISODATA), and the hybrid approach?
    4) How do genetic parameters impact Petri-net-based LCLU change detection?
    5) How do cellular automata components impact the accuracy of LCLU predictive modeling?

    The study area, the Tickfaw River watershed (711 mi²), is located in southeast Louisiana and southwest Mississippi. The major datasets include time-series Landsat TM/ETM images and Digital Orthophoto Quarter Quadrangles (DOQQs). LCLU classification was conducted using the GA, MLC, ISODATA, and hybrid approaches. LCLU change was modeled using a genetic PN-based process-mining technique, and the resulting process models were interpreted and used as input to a CA for predicting future LCLU. The major findings are:

    1) GA-based LCLU classification is more accurate than the traditional approaches.
    2) When genetic parameters, image parameters, or CA components are configured improperly, the accuracy of LCLU classification, the coverage of the LCLU change process model, and/or the accuracy of LCLU predictive modeling will be low.
    3) For GA-based LCLU classification, the recommended configuration of genetic/image parameters is: generation 2000-5000, population 1000, crossover rate 69%-99%, mutation rate 0.1%-0.5%, generation gap 25%-50%, data layers 16-20, training/testing data size 10000-20000 / 5000-10000, and spatial resolution 30 m-60 m.
    4) For genetic-Petri-net-based LCLU change detection, the recommended configuration of genetic parameters is: generation 500, population 300, crossover rate 59%, mutation rate 5%, and elitism rate 4%.
    5) For CA-based LCLU predictive modeling, the recommended configuration of CA components is: space 6025 * 12993, state 2, von Neumann neighborhood 3 * 3, time step 2-3 years, and optimized transition rules (see the sketch below).
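    As a rough illustration of the CA setup in finding 5, the sketch below runs one step of a two-state CA over a toy grid using a 4-connected von Neumann neighborhood (within a 3 * 3 window). The majority-style transition rule, grid size, and state labels are placeholder assumptions; the study's optimized transition rules are not reproduced here.

```python
import numpy as np

def von_neumann_neighbors(grid, r, c):
    """Values of the 4-connected (von Neumann) neighbours of cell (r, c)."""
    rows, cols = grid.shape
    offsets = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    return [grid[r + dr, c + dc]
            for dr, dc in offsets
            if 0 <= r + dr < rows and 0 <= c + dc < cols]

def step(grid):
    """One time step (e.g. 2-3 years): a cell switches to state 1
    if most of its neighbours are already in state 1 (placeholder rule)."""
    new = grid.copy()
    rows, cols = grid.shape
    for r in range(rows):
        for c in range(cols):
            nbrs = von_neumann_neighbors(grid, r, c)
            if sum(nbrs) > len(nbrs) / 2:
                new[r, c] = 1
    return new

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    lclu = rng.integers(0, 2, size=(20, 20))   # toy 2-state land-cover grid
    lclu_next = step(lclu)
    print(int(lclu_next.sum()), "cells in state 1 after one step")
```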

    JCPNet Tool and Automated Analysis of Distributed Systems

    A model-based approach is crucial to the analysis of system designs. Colored Petri Nets (CPNet) have been used as a software model with good success: they can complement UML diagrams by providing formal modeling and analysis of the dynamic behaviors of distributed systems. To enable the CPNet model to be used more flexibly and widely, a new software tool, JCPNet, has been developed. This paper discusses the background, motivation, and development of JCPNet, and its applications to the specification and analysis of distributed systems.
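    For readers unfamiliar with colored Petri nets, the sketch below shows the core idea the paper builds on: places hold typed ("colored") tokens, and a transition fires only when its guard accepts a binding of tokens from every input place. The class, method, and place names are hypothetical and are not the JCPNet API.

```python
from collections import defaultdict

class ColouredPetriNet:
    """Toy coloured Petri net: a marking maps places to lists of tokens."""

    def __init__(self):
        self.marking = defaultdict(list)          # place -> tokens currently held

    def add_tokens(self, place, tokens):
        self.marking[place].extend(tokens)

    def enabled(self, inputs, guard):
        """Return a binding (one token per input place) if the transition
        is enabled under the guard, otherwise None."""
        if not all(self.marking[p] for p in inputs):
            return None
        binding = {p: self.marking[p][0] for p in inputs}
        return binding if guard(binding) else None

    def fire(self, inputs, outputs, guard, produce):
        """Consume the bound token from each input place and add the tokens
        computed by `produce` to the output places (one token per place)."""
        binding = self.enabled(inputs, guard)
        if binding is None:
            return False
        for p in inputs:
            self.marking[p].remove(binding[p])
        for p, tok in zip(outputs, produce(binding)):
            self.marking[p].append(tok)
        return True

if __name__ == "__main__":
    net = ColouredPetriNet()
    net.add_tokens("request", [("job-1", "high")])   # coloured token: (id, priority)
    net.add_tokens("worker", ["w1"])
    fired = net.fire(
        inputs=["request", "worker"],
        outputs=["busy"],
        guard=lambda b: b["request"][1] == "high",   # only high-priority jobs proceed
        produce=lambda b: [(b["worker"], b["request"][0])],
    )
    print("transition fired:", fired, "| marking:", dict(net.marking))
```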