    Applying Association Rules and Co-location Techniques on Geospatial Web Services

    Most contemporary GIS offer only basic spatial analysis and data mining functionality, and many are confined to analyses that involve comparing maps and descriptive statistical displays such as histograms or pie charts. Emerging Web standards promise a network of heterogeneous yet interoperable Web Services, which would greatly simplify the development of many kinds of data integration and knowledge management applications. Geospatial data mining describes the combination of two key market-intelligence software tools: Geographical Information Systems and Data Mining Systems. This research aims to develop a Spatial Data Mining web service that uses association-rule techniques and correlation methods to explore the huge amounts of data generated by integrated crisis-management applications. It integrates traffic systems, medical services systems, civil defense, and state-of-the-art Geographic Information Systems and Data Mining Systems functionality in an open, highly extensible, internet-enabled plug-in architecture. Interoperability of geospatial data previously focused only on data formats and standards; the recent popularity and adoption of the Internet and Web Services has provided a new means of interoperability for geospatial information, not just for exchanging data but for analyzing the data during exchange. An integrated, user-friendly Spatial Data Mining System available on the internet via a web service offers exciting new possibilities for spatial decision making and geographical research to a wide range of potential users. Keywords: Spatial Data Mining, Association Rules, Co-location, Web Services, Geospatial Data
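    As a minimal sketch of the association-rule step such a service could expose, the Python fragment below mines simple co-location rules from market-basket-style "transactions" built from spatial neighbourhoods. The feature names, transactions, and thresholds are invented for illustration and are not the paper's actual data or API.

```python
from itertools import combinations

# Hypothetical transactions: each set lists feature types observed in one
# spatial neighbourhood (e.g. around a reported traffic incident).
transactions = [
    {"accident", "hospital", "congestion"},
    {"accident", "congestion"},
    {"hospital", "fire_station"},
    {"accident", "hospital", "congestion", "fire_station"},
]

def support(itemset, transactions):
    """Fraction of neighbourhoods containing every item in the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

min_support, min_confidence = 0.5, 0.7
items = set().union(*transactions)

# Generate candidate rules A -> B from pairs and keep the confident ones.
for a, b in combinations(sorted(items), 2):
    pair = {a, b}
    if support(pair, transactions) >= min_support:
        conf = support(pair, transactions) / support({a}, transactions)
        if conf >= min_confidence:
            print(f"{a} -> {b}  support={support(pair, transactions):.2f}  "
                  f"confidence={conf:.2f}")
```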

    METAPLEX: AN INTEGRATED ENVIRONMENT FOR ORGANIZATION AND INFORMATION SYSTEMS DEVELOPMENT

    This paper presents an integrated environment, called MetaPlex, for organization and information systems development. The kernel of MetaPlex is a knowledge base management system which captures the semantic primitives of a domain at the meta level and uses these primitives to describe target systems. Three levels of abstraction are used in MetaPlex for representing knowledge: the axiomatic, median, and instance levels. The MetaPlex Language Definition System is used to name the object types in the domain of interest and to define the attributes, relations, and descriptions which can be used by these object types; the structural knowledge of the domain in general is thus captured at the median level. Knowledge of the domain captured at the median level is then used by the MetaPlex Specification System to define a target system at the instance level. A rule-based inference engine is embedded in the MetaPlex environment as an intelligent assistant to help end users: the expertise of a designer can be codified into a rule set which assists users in classifying an object, in decomposing a high-level system component, or in clustering the detailed components at the lower level. Both top-down and bottom-up approaches to systems development are thus supported. A layered approach has been proposed to manage the dynamics of such a metasystem environment. An enterprise model has been developed to demonstrate the usage of MetaPlex and the integration of organization and information systems modeling. Directions for future research are also discussed
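    The median/instance split can be pictured with a small sketch: median-level type definitions constrain the instance-level objects that describe a target system. The type names, fields, and validation logic below are illustrative stand-ins, not MetaPlex's actual language or syntax.

```python
# Median level: domain object types with their permitted attributes/relations.
median_level = {
    "Department": {"attributes": {"name", "budget"}, "relations": {"reports_to"}},
    "InformationSystem": {"attributes": {"name", "platform"}, "relations": {"used_by"}},
}

def define_instance(object_type, **fields):
    """Create an instance-level object, checking it against its median-level type."""
    schema = median_level[object_type]
    allowed = schema["attributes"] | schema["relations"]
    unknown = set(fields) - allowed
    if unknown:
        raise ValueError(f"{object_type} does not define: {unknown}")
    return {"type": object_type, **fields}

# Instance level: a concrete target-system component validated by the schema.
payroll = define_instance("InformationSystem", name="Payroll",
                          platform="mainframe", used_by="Department:Finance")
print(payroll)
```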

    Hybrid GIS and Remote Sensing in Environmental Applications

    Geographical information systems (GIS) are a relatively new and rapidly developing class of computer applications. They show considerable potential in a growing number of application domains, regional and environmental planning and management being one of them. The integration of GIS methods adds a key technology for spatial analysis to the set of tools of applied systems analysis, and environmental systems analysis in particular. The two papers presented in this volume address the use of GIS and satellite imagery in environmental applications. Hybrid GIS, which combine both vector and raster information, are the topic of the first report, presented at the "EARSeL Workshop on Relationship of Remote Sensing and Geographical Information Systems", which took place in Hannover, Germany, in September 1991. The integration of satellite imagery into a global change information system that also uses rule-based expert systems methods, together with GIS data structures and tools, is the theme of the second part. This paper was presented at the "25th International Symposium - Remote Sensing and Global Environmental Change, Tools for Sustainable Development," held in Graz, Austria, in September 1993
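    The hybrid vector-raster idea can be illustrated with a toy example: sampling a raster land-cover grid at the locations of vector point features. The grid values, georeferencing, and class codes below are invented for demonstration.

```python
# A 3x3 raster land-cover grid; rows run from north to south.
land_cover = [
    [1, 1, 2],
    [1, 3, 2],
    [3, 3, 2],
]
x_min, y_max, cell_size = 0.0, 30.0, 10.0  # assumed georeferencing

def sample(x, y):
    """Return the raster class under a vector point (x, y)."""
    col = int((x - x_min) // cell_size)
    row = int((y_max - y) // cell_size)
    return land_cover[row][col]

monitoring_sites = [(5.0, 25.0), (25.0, 5.0)]  # vector point features
for x, y in monitoring_sites:
    print(f"site ({x}, {y}) -> land-cover class {sample(x, y)}")
```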

    Acoustic Filters for Sensors and Transducers: Energy Policy Instrument for Monitoring and Evaluating Holy Places and Their Inhabitants

    The aim of this study is to present a brief overview of an energy policy instrument for monitoring and evaluating holy places and their inhabitants with the aid of acoustic filters for sensors and transducers. A monitoring protocol for the policy instrument is presented for noise protection and security from power systems. Methods of information and data collection are briefly elaborated. The power systems are classified according to the source signals of solar power, electric power, light power, sound power, heat power, fluid power and fire power. The acoustic filters are defined according to the source of noise signals from these power systems, and are differentiated by the source signal of unwanted frequencies from each of them. Some examples of acoustic filters are mentioned for each source of noise signal. A slide rule for noise measurement is illustrated along with its noise grades and flag colors under limiting conditions. Some noise-filtering results from various power systems of an outdoor duct are also tabulated. An overview of noise systems integration with a command and control center is described. A brief discussion on the management of holy places and their inhabitants through monitoring and evaluation is also included
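    The slide-rule idea of noise grades and flag colours can be sketched as a simple lookup table. The decibel thresholds, grade labels, and colours below are invented placeholders for the paper's actual limiting conditions.

```python
# Invented grade bands: (upper limit in dB, grade, flag colour).
NOISE_GRADES = [
    (55.0, "A", "green"),        # acceptable
    (70.0, "B", "yellow"),       # caution
    (85.0, "C", "orange"),       # protection advised
    (float("inf"), "D", "red"),  # limiting condition exceeded
]

def grade_noise(level_db):
    """Return (grade, flag colour) for a measured sound level in dB."""
    for limit, grade, flag in NOISE_GRADES:
        if level_db <= limit:
            return grade, flag

for level in (48.0, 62.0, 90.0):
    grade, flag = grade_noise(level)
    print(f"{level} dB -> grade {grade}, {flag} flag")
```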

    Understanding Social–Ecological Systems using Loop Analysis

    The sustainable management of social–ecological systems (SESs) requires that we understand the complex structure of relationships and feedbacks among ecosystem components and socioeconomic entities. Therefore, the construction and analysis of models integrating ecological and human actors is crucial for describing the functioning of SESs, and qualitative modeling represents an ideal tool since it allows studying dependencies among variables of diverse types. In particular, the qualitative technique of loop analysis yields predictions about how a system’s variables respond to stress factors. Different interaction types, scarce information about functional relationships among variables, and uncertainties in the values of the parameters are the rule rather than the exception when studying SESs. Accordingly, loop analysis seems perfectly suited to investigating them. Here, we introduce the key aspects of loop analysis, discuss its applications to SESs, and suggest that it enables taking the first steps toward the integration of the three dimensions of sustainability
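    As a compact illustration of how loop analysis yields such predictions, the sketch below computes the qualitative response table from the adjoint of the negative community matrix for an invented three-variable SES; the sign structure is an assumption chosen for demonstration only.

```python
import numpy as np

# Signed community matrix A for a toy SES:
# resource, harvesters, regulator (signs invented for illustration).
A = np.array([
    [-1, -1,  0],   # resource: self-limited, harmed by harvesters
    [ 1, -1, -1],   # harvesters: benefit from resource, limited by regulator
    [ 0,  1, -1],   # regulator: responds to harvester activity
], dtype=float)

# adj(-A) = det(-A) * inv(-A); its (i, j) entry predicts the direction of
# change in variable i under a sustained positive input to variable j.
adjoint = np.linalg.det(-A) * np.linalg.inv(-A)
predictions = np.sign(np.round(adjoint, 10)).astype(int)
print(predictions)  # +1 increase, -1 decrease, 0 no predicted change
```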

    Management systems integration: should quality be redefined?

    Concepts like interfaces, synergies, fuzzy logic, organisational synapses and networking rule our days. These concepts orbit around the term “interactions”. Literally, interactions are everything: graphite and diamond, for instance, are just carbon atoms “interacting” differently. Customers currently demand a broader vision from organizations. To fulfil this demand, organizations have implemented and certified management systems addressing different stakeholders’ requirements according to several organisational standards, the most frequently reported being ISO 9001, ISO 14001 and OHSAS 18001. Despite the difficulty of achieving a consensual definition, if we consider concepts like quality tools, “Quality” is mainly an “action” concept. Thus, the main challenge faced by quality management systems (QMS) in an integrated environment is philosophical: to leave the spotlight of an “action”-based approach and embrace the subtlety of an “interaction”-based approach. It stands to reason that the system defining, promoting and stimulating interactions should not be involved in the action itself. The implications of this new role for the quality management system are huge. Traditional organisational structures place quality transversally to production processes; it is expected that the quality management system will adopt a more vertical organisational orientation in order to accomplish the new objectives posed by management systems integration. Human behaviour towards item production or service performance should change according to the new organisational placement of the QMS. Each worker, regardless of organisational function, should have a priori and a posteriori knowledge of quality requirements, making a precise definition of boundaries among the elements of the responsibility chain critical. Auditing procedures should be adjusted too: potential synergies between processes, internal and external communication flows, redefinition of objectives, adjustment of policies and a new vision horizon are among the checkpoints to be assessed by the audit team. It should be assured that top management's commitment is not to a system but to an organisational philosophy. Process indicators should be available at all times and to several persons, a requirement that is not far-fetched nowadays given currently available networks and information systems. Under an integrated approach, nonconformity detection, treatment and correction should not be a middle-management meeting of the affected process alone; quality procedures should ensure that Environmental and Occupational Health and Safety representatives are involved and empowered to discuss the decisions to be made. In this article we analyse the repositioning of the quality management system after an integration process, pointing out the practical implications of this new perspective. It is intended as an initial contribution to a task already undertaken by other systems and sciences: the assessment of “interactions” in management systems

    FP6 CEDER Project Deliverable 3.2 "Benefits of a new reporting system"

    Addressing the uncertainties in fishing activities, the CEDER project examines the use of observer reports, landings, e-logbooks, VMS and GPS tracks, and fishery-specific information. Such information was assessed in order to provide more accurate and timelier data on effort, catches, discards, and/or landings. This document contains CEDER's Project Implementation Plan for policy makers, as well as expected benefits for government, industry, and science. The CEDER consortium advocates the use of GPS data at 15-minute intervals for scientific purposes. Among the benefits are improved spatial planning and a new fishing effort measure, the actual effort while fishing, which can be inferred from vessel behaviour. The correlation between catch and effort can be used as an indicator for inspectors, but one cannot reliably guess catches from effort. VMS and logbook data can be matched using rule-based systems, leading to higher data quality and better use of quota. Furthermore, if fishing mortality were known in near real time, then the integration of current-year fishing mortality into management plans would yield benefits for stock recovery. The full realisation of such benefits requires a re-appraisal of the 15% TAC revision rule. The CEDER consortium insists that any roll-out of the ERS e-logbook must be properly enforced, and that the e-logbook cannot by itself replace observer reports. Finally, estimating discards may be feasible in selected fisheries, but additional means such as gear sensors may be required in order to get more reliable data in the general case. JRC.G.4-Maritime affairs
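    The "actual effort while fishing" idea can be sketched as a speed-band filter over 15-minute GPS fixes: trawling typically happens in a low-speed band, so positions in that band are counted as fishing time. The speed thresholds and track below are invented, not CEDER's calibrated values.

```python
FISHING_SPEED_KN = (2.0, 5.0)   # assumed trawl-speed band, in knots
INTERVAL_HOURS = 0.25           # 15-minute reporting interval

# Invented speeds between consecutive GPS fixes on one vessel track.
speeds_kn = [9.8, 4.1, 3.5, 3.2, 4.4, 10.2, 9.5, 2.8, 3.0]

# Count each fix whose speed falls in the fishing band as fishing time.
fishing_hours = sum(
    INTERVAL_HOURS
    for s in speeds_kn
    if FISHING_SPEED_KN[0] <= s <= FISHING_SPEED_KN[1]
)
print(f"inferred fishing effort: {fishing_hours:.2f} h "
      f"of {len(speeds_kn) * INTERVAL_HOURS:.2f} h track")
```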

    A service oriented architecture to implement clinical guidelines for evidence-based medical practice

    Health information technology (HIT) has been identified as the fundamental driver to streamline healthcare delivery processes, improving care quality and reducing operational costs. One of the many facets of HIT is Clinical Decision Support (CDS), which provides the physician with patient-specific inferences, intelligently filtered and organized, at appropriate times. This research has been conducted to develop an agile solution for Clinical Decision Support at the point of care in a healthcare setting, as a potential answer to the challenges of interoperability and the complexity of possible solutions. The capabilities of Business Process Management (BPM) and Workflow Management systems are leveraged to support a Service Oriented Architecture development approach for ensuring evidence-based medical practice. The aim of this study is to present an architecture solution that is based on SOA principles and embeds clinical guidelines within a healthcare setting. Since the solution is designed to implement real-life healthcare scenarios, it essentially supports evidence-based clinical guidelines that are liable to change over a period of time. The thesis is divided into four parts. The first part consists of an introduction to the study and a background to existing approaches for the development and integration of Clinical Decision Support Systems. The second part focuses on the development of a Clinical Decision Support Framework based on Service Oriented Architecture. The CDS Framework is composed of standards-based open source technologies, including JBoss SwitchYard (an enterprise service bus), rule-based CDS enabled by JBoss Drools, and process modelling using Business Process Model and Notation (BPMN). To ensure interoperability among the various components, healthcare standards by HL7 and OMG are implemented. The third part provides implementations of this CDS Framework in healthcare scenarios: two scenarios concern medical practice for diagnosis and early intervention (Chronic Obstructive Pulmonary Disease and Lung Cancer), one case study covers genetic-data enablement of CDS systems (newborn screening for Cystic Fibrosis), and the last case study is about using BPM techniques for managing healthcare organizational perspectives, including human interaction with automated clinical workflows. The last part concludes the research with contributions to the design and architecture of CDS systems. This thesis has primarily adopted the Design Science Research Methodology for Information Systems; additionally, the Business Process Management Life Cycle, Agile Business Rules Development methodology, and Pattern-Based Cycle for E-Workflow Design are used for individual case studies. Using evidence-based clinical guidelines published by the UK's National Institute for Health and Care Excellence, the integration of the latest research into clinical practice has been employed in the automated workflows. The case studies implemented using the CDS Framework are evaluated against implementation requirements, conformance to SOA principles, and response time using a load-testing strategy. For a healthcare organization to achieve its strategic goals in administrative and clinical practice, this research provides a standards-based integration solution in the field of clinical decision support. A SOA-based CDS can serve as a potential solution to the complexities of IT interventions, as the core data and business logic functions are loosely coupled from the presentation. Additionally, the results of this research can serve as an exemplar for other industrial domains requiring rapid response to evolving business processes
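    The rule-based CDS pattern the thesis implements with JBoss Drools can be pictured, in outline, with a plain-Python sketch: patient facts in, guideline-derived alerts out. The rule conditions and patient fields below are illustrative assumptions, not clinical guidance or the thesis's actual rule set. In the SOA framework described, such logic would sit behind a loosely coupled service endpoint rather than in the presentation layer.

```python
def copd_rules(patient):
    """Evaluate toy guideline-style rules against a patient-fact dictionary."""
    alerts = []
    # Rule 1 (illustrative): risk profile suggests assessment is warranted.
    if patient["age"] > 35 and patient["smoker"] and patient["chronic_cough"]:
        alerts.append("Consider spirometry to assess for COPD")
    # Rule 2 (illustrative): spirometry result indicates airflow obstruction.
    ratio = patient.get("fev1_fvc_ratio")
    if ratio is not None and ratio < 0.7:
        alerts.append("Airflow obstruction: review against COPD diagnostic pathway")
    return alerts

patient = {"age": 58, "smoker": True, "chronic_cough": True, "fev1_fvc_ratio": 0.62}
for alert in copd_rules(patient):
    print(alert)
```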

    Data integration through service-based mediation for web-enabled information systems

    The Web and its underlying platform technologies have often been used to integrate existing software and information systems. Traditional techniques for data representation and transformation between documents are not sufficient to support a flexible and maintainable data integration solution that meets the requirements of modern, complex Web-enabled software and information systems. The difficulty arises from the high degree of complexity of data structures, for example in business and technology applications, and from the constant change of data and its representation. In the Web context, where the Web platform is used to integrate different organisations or software systems, the problem of heterogeneity additionally arises. We introduce a specific data integration solution for Web applications such as Web-enabled information systems. Our contribution is an integration technology framework for Web-enabled information systems comprising, firstly, a data integration technique based on the declarative specification of transformation rules and the construction of connectors that handle the integration and, secondly, a mediator architecture based on information services and the constructed connectors to handle the integration process
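    The declarative flavour of such transformation rules can be sketched as rules-as-data applied by a generic connector that maps a source document onto a target schema. The rule format and field names below are invented for illustration, not the framework's actual specification language.

```python
# Transformation rules as data: each rule maps a source field to a target
# field, optionally applying a value transformation.
rules = [
    {"source": "customerName", "target": "name"},
    {"source": "addr",         "target": "address", "transform": str.title},
]

def connect(document, rules):
    """Apply declarative mapping rules to integrate one source document."""
    result = {}
    for rule in rules:
        value = document[rule["source"]]
        if "transform" in rule:
            value = rule["transform"](value)
        result[rule["target"]] = value
    return result

source = {"customerName": "Ada Lovelace", "addr": "12 gower street, london"}
print(connect(source, rules))
```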