    Code Generation and Global Optimization Techniques for a Reconfigurable PRAM-NUMA Multicore Architecture

    A Merging System for Integrated Person-Centric Information Systems

    Large-scale integrated information systems correlate equivalent or related information from multiple data sources to provide a unified view of data. This report describes the design and implementation of a tool called xMerger that provides a unified view from multiple matching records, which may be multi-source duplicates or overlapping records. To achieve this, xMerger provides a merging process that generates a complete and accurate merged record from conflicting and incomplete records. This report also discusses the challenges that arise during merging and xMerger's solutions to them. xMerger's design and implementation were validated by adapting it to CHARM, a real-world integrated system currently in use at the Utah Department of Health.
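
    As a rough sketch of the survivorship-style merging described above (a minimal illustration in Python; the field names, source-priority rule, and merge_records helper are assumptions for this example, not xMerger's actual design):

    def merge_records(records, source_priority):
        """Merge matching records into one: keep non-missing values,
        breaking conflicts by trusting the higher-priority source."""
        # Consider higher-priority sources first.
        ordered = sorted(records, key=lambda r: source_priority.get(r["source"], 99))
        merged = {}
        for record in ordered:
            for attr, value in record.items():
                if attr == "source":
                    continue
                # First non-empty value wins (highest-priority source).
                if value not in (None, "") and attr not in merged:
                    merged[attr] = value
        return merged

    records = [
        {"source": "clinic", "name": "J. Doe", "dob": None, "phone": "555-0100"},
        {"source": "registry", "name": "Jane Doe", "dob": "1990-04-02", "phone": ""},
    ]
    print(merge_records(records, {"registry": 0, "clinic": 1}))
    # {'name': 'Jane Doe', 'dob': '1990-04-02', 'phone': '555-0100'}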

    Management of Multiply Represented Geographic Entities

    Multiple representation of geographic information occurs when a real-world entity is represented more than once in the same or different databases. In this paper, we propose a new approach to the modeling of multiply represented entities and the relationships among the entities and their representations. A Multiple Representation Management System is outlined that can manage multiple representations consistently over a number of autonomous databases. Central to our approach is the Multiple Representation Schema Language, which is used to configure the system. It provides an intuitive and declarative means of modeling multiple representations and specifying rules that are used to maintain consistency, match objects representing the same entity, and restore consistency if necessary.
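
    As a minimal sketch of the kind of matching rule such a system might apply to decide that two representations denote the same real-world entity (the attributes, weights, and threshold here are hypothetical, not taken from the paper's schema language):

    def same_entity(road_a, road_b, name_weight=0.6, dist_weight=0.4):
        """Score whether two road records represent the same entity."""
        name_score = 1.0 if road_a["name"].lower() == road_b["name"].lower() else 0.0
        # Representations at different scales may disagree on length;
        # treat lengths within 10% of each other as consistent.
        close = abs(road_a["length_m"] - road_b["length_m"]) <= 0.1 * road_a["length_m"]
        dist_score = 1.0 if close else 0.0
        return name_weight * name_score + dist_weight * dist_score

    a = {"name": "Main St", "length_m": 1200.0}
    b = {"name": "main st", "length_m": 1150.0}
    print(same_entity(a, b) >= 0.8)  # True: treat as the same entity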

    Applications integration for manufacturing control systems with particular reference to software interoperability issues

    The introduction and adoption of contemporary computer aided manufacturing control systems (MCS) can help rationalise and improve the productivity of manufacturing related activities. Such activities include product design, process planning and production management with CAD, CAPP and CAPM. However, they tend to be domain specific and have generally been designed as stand-alone systems, with little consideration for integration with other manufacturing activities outside the area of immediate concern. As a result, "islands of computerisation" exist which exhibit deficiencies and constraints that inhibit or complicate subsequent interoperation among typical MCS components. Because of these interoperability constraints, contemporary forms of MCS typically yield sub-optimal benefits and do not promote synergy on an enterprise-wide basis. The move towards more integrated manufacturing systems, which requires advances in software interoperability, is becoming a strategic issue. Here the primary aim is to realise greater functional synergy between software components which span engineering, production and management activities and systems. Hence information of global interest needs to be shared across conventional functional boundaries between enterprise functions. The main thrust of this research study is to derive a new generation of MCS in which software components can "functionally interact" and share common information by accessing distributed data repositories in an efficient, highly flexible and standardised manner. It addresses problems of information fragmentation and the lack of formalism, as well as issues relating to flexibly structuring interactions between threads of functionality embedded within the various components. The emphasis is on the:
    • definition of generic information models which underpin the sharing of common data among production planning, product design, finite capacity scheduling and cell control systems;
    • development of an effective framework to manage functional interaction between MCS components, thereby coordinating their combined activities;
    • "soft" or flexible integration of the MCS activities over an integrating infrastructure in order to (i) help simplify typical integration problems found when using contemporary interconnection methods for applications integration, and (ii) enable their reconfiguration and incremental development.
    In order to facilitate adaptability in response to changing needs, these systems must also be engineered to enable reconfigurability over their life cycle. Thus within the scope of this research study a new methodology and software toolset have been developed to formally structure and support implementation, run-time and change processes. The toolset combines the use of IDEF0 (for activity based or functional modelling), IDEF1X (for entity-attribute relationship modelling), and EXPRESS (for information modelling). This research includes a pragmatic but effective means of dealing with legacy software, which may often be a vital source of readily available information supporting the operation of the manufacturing enterprise. The pragmatism and medium-term relevance of the research study have attracted particular interest and collaboration from software manufacturers and industrial practitioners. Proof-of-concept studies have been carried out to implement and evaluate the developed mechanisms and software toolset.
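
    As a rough illustration of a shared generic information model of the kind listed above (a minimal Python sketch; the entities and attributes are hypothetical, and the study itself expresses such models in EXPRESS, IDEF0 and IDEF1X rather than code):

    from dataclasses import dataclass, field

    @dataclass
    class Operation:
        """One processing step of a part, visible to both the process
        planner (which creates it) and the scheduler (which sequences it)."""
        op_id: str
        workcell: str
        duration_min: float

    @dataclass
    class PartRoute:
        """Routing information shared among planning, finite capacity
        scheduling and cell control, instead of being duplicated in each."""
        part_number: str
        operations: list[Operation] = field(default_factory=list)

        def total_time(self) -> float:
            # The scheduler derives the aggregate load a part places on
            # the shop directly from the shared model.
            return sum(op.duration_min for op in self.operations)

    route = PartRoute("P-1001", [Operation("OP10", "lathe-1", 12.5),
                                 Operation("OP20", "mill-2", 8.0)])
    print(route.total_time())  # 20.5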

    Extending an open source enterprise service bus for cloud data access support

    In recent years Cloud computing has become popular among IT organizations aiming to reduce their operational costs. Applications can be designed to run on the Cloud and utilize its technologies, or can be partially or totally migrated to the Cloud. An application's architecture contains three layers: presentation, business logic, and data. The presentation layer provides a user-friendly interface and acts as an intermediary between the user and the application logic. The business logic layer encapsulates the application logic and separates it from the underlying layers of the application. The Data Layer (DL) abstracts the underlying database storage system from the business layer and is responsible for storing the application's data. The DL is divided into two sublayers: the Data Access Layer (DAL) and the Database Layer (DBL). The former provides the abstraction of the database operations to the business layer, while the latter is responsible for data persistence and manipulation. When an application is migrated to the Cloud, each of its layers can be hosted using a different Cloud deployment model. Possible Cloud deployment models are: Private Cloud, Public Cloud, Community Cloud, and Hybrid Cloud. In this diploma thesis we focus on the database layer, which is one of the most expensive layers to build and maintain in an IT infrastructure. Application data is typically moved to the Cloud because of, e.g., Cloud bursting, data analysis, or backup and archiving. Currently, there is little support and guidance on how to enable appropriate data access to the Cloud. In this diploma thesis we extend an Open Source Enterprise Service Bus to provide support for enabling transparent data access in the Cloud. After researching the different protocols used by Cloud providers to manage and store data, we design and implement the components needed in the Enterprise Service Bus to give users transparent access to their data previously migrated to the Cloud.
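
    As a minimal sketch of what transparent data access can mean for the business layer (the classes and routing rule below are illustrative assumptions, not the actual ESB components built in the thesis):

    class LocalStore:
        def __init__(self):
            self._rows = {}
        def get(self, key):
            return self._rows.get(key)
        def put(self, key, value):
            self._rows[key] = value

    class CloudStore(LocalStore):
        """Stand-in for a Cloud storage service accessed over its API."""

    class DataAccessLayer:
        """Routes reads and writes so callers never know where data lives."""
        def __init__(self, local, cloud, migrated_keys):
            self.local, self.cloud = local, cloud
            self.migrated = migrated_keys  # keys already moved to the Cloud
        def _backend(self, key):
            return self.cloud if key in self.migrated else self.local
        def get(self, key):
            return self._backend(key).get(key)
        def put(self, key, value):
            self._backend(key).put(key, value)

    dal = DataAccessLayer(LocalStore(), CloudStore(), migrated_keys={"order:42"})
    dal.put("order:42", {"total": 99.5})   # transparently goes to the Cloud
    dal.put("order:7", {"total": 10.0})    # stays in the local database
    print(dal.get("order:42"))             # {'total': 99.5}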

    Schema matching in a peer-to-peer database system

    Peer-to-peer (P2P) systems are applications that allow a network of peers to share resources in a scalable and efficient manner. My research is concerned with the use of P2P systems for sharing databases. To allow data mediation between peers' databases, schema mappings need to exist: mappings between semantically equivalent attributes in different peers' schemas. Mappings can either be defined manually or found semi-automatically using a technique called schema matching. However, schema matching has not been used much in dynamic environments such as P2P networks. This thesis therefore investigates how to enable effective semi-automated schema matching within a P2P network.
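
    As a minimal sketch of semi-automatic schema matching (the name-similarity measure and threshold are hypothetical choices for illustration, not the matcher developed in the thesis):

    from difflib import SequenceMatcher

    def match_schemas(attrs_a, attrs_b, threshold=0.7):
        """Return proposed (a, b, score) mappings for a human to confirm."""
        def similarity(x, y):
            return SequenceMatcher(None, x.lower(), y.lower()).ratio()
        proposals = []
        for a in attrs_a:
            best = max(attrs_b, key=lambda b: similarity(a, b))
            score = similarity(a, best)
            if score >= threshold:
                proposals.append((a, best, round(score, 2)))
        return proposals

    peer1 = ["customer_name", "birth_date", "phone_no"]
    peer2 = ["CustomerName", "DateOfBirth", "Telephone"]
    print(match_schemas(peer1, peer2))
    # [('customer_name', 'CustomerName', 0.96)] -- the weaker pairs fall
    # below the threshold and are left for manual mapping.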