112 research outputs found
Composite information systems : resolving semantic heterogeneities
Michael Siegel, Stuart Madnick, Amar Gupta. March 1991. Includes bibliographical references (p. 14-16).
Towards interoperability in heterogeneous database systems
Distributed heterogeneous databases consist of systems which differ physically and logically, employing different data models and data manipulation languages. Although these databases are independently created and administered, they must cooperate and interoperate. Users need to access and manipulate data from several databases, and applications may require data from a wide variety of independent databases. Therefore, a new system architecture is required to manipulate and manage distinct and multiple databases, in a transparent way, while preserving their autonomy. This report contains an extensive survey on heterogeneous databases, analysing and comparing the different aspects, concepts and approaches related to the topic. It introduces an architecture to support interoperability among heterogeneous database systems. The architecture avoids the use of a centralised structure to assist in the different phases of the interoperability process. It aims to support scalability, and to assure privacy and confidentiality of the data. The proposed architecture allows the databases to decide when to participate in the system, what type of data to share and with which other databases, thereby preserving their autonomy. The report also describes an approach to information discovery in the proposed architecture, without using any centralised structure such as repositories and dictionaries, and without broadcasting to all databases. It attempts to reduce the number of databases searched and to preserve the privacy of the shared data. The main idea is to visit a database that either contains the requested data or knows about another database that possibly contains this data.
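The hint-following discovery idea described above can be sketched in a few lines. This is a minimal illustration, not the report's design: the class and function names (`DatabaseNode`, `discover`), the hop limit, and the hint structure are all assumptions made for the example.

```python
# Illustrative sketch: a query visits one database at a time, guided by that
# database's local hints about what its neighbours hold, instead of
# broadcasting to all databases or consulting a central dictionary.

class DatabaseNode:
    def __init__(self, name, topics, hints=None):
        self.name = name
        self.topics = set(topics)   # data this database chooses to share
        self.hints = hints or {}    # topic -> name of a likely holder

    def knows_holder_of(self, topic):
        return self.hints.get(topic)

def discover(start, topic, nodes, max_hops=5):
    """Follow hints from node to node; return the holder's name or None."""
    current, visited = start, set()
    for _ in range(max_hops):
        if current.name in visited:
            break                    # avoid cycling between the same nodes
        visited.add(current.name)
        if topic in current.topics:
            return current.name      # this database contains the data
        hint = current.knows_holder_of(topic)
        if hint is None or hint not in nodes:
            break                    # no further lead; give up
        current = nodes[hint]        # visit the suggested database
    return None

nodes = {
    "A": DatabaseNode("A", ["orders"], hints={"parts": "B"}),
    "B": DatabaseNode("B", ["parts"]),
}
print(discover(nodes["A"], "parts", nodes))   # found via A's hint about B
```

Because each node exposes only its own topics and hints, no central repository exists and only the visited databases learn anything about the query, which matches the privacy goal stated in the abstract.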
Maintaining temporal consistency of discrete objects in soft real-time database systems
A real-time database system contains base data items which record and model a physical, real-world environment. For better decision support, base data items are summarized and correlated to derive views. These base data and views are accessed by application transactions to generate the ultimate actions taken by the system. As the environment changes, updates are applied to base data, which subsequently trigger view recomputations. There are thus three types of activities: base data update, view recomputation, and transaction execution. In a real-time database system, two timing constraints need to be enforced. We require that transactions meet their deadlines (transaction timeliness) and read fresh data (data timeliness). In this paper, we define the concept of absolute and relative temporal consistency from the perspective of transactions for discrete data objects. We address the important issue of transaction scheduling among the three types of activities such that the two timing requirements can be met. We also discuss how a real-time database system should be designed to enforce different levels of temporal consistency.
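The two notions of data timeliness mentioned above can be illustrated with a small sketch. This is not the paper's formal definition: the function names, the tuple layout of the read set, and the interval parameters (`avi`, `rvi`) are assumptions chosen for the example.

```python
# Illustrative checks for two kinds of temporal consistency of a
# transaction's read set, where each entry is (item_name, sample_timestamp).

def absolutely_consistent(read_set, now, avi):
    """Absolute consistency: every item's age is within the
    absolute validity interval (it is still fresh on its own)."""
    return all(now - ts <= avi for _, ts in read_set)

def relatively_consistent(read_set, rvi):
    """Relative consistency: all items were sampled close together
    in time, so they describe roughly the same environment state."""
    stamps = [ts for _, ts in read_set]
    return max(stamps) - min(stamps) <= rvi

read_set = [("temperature", 100.0), ("pressure", 102.0)]
print(absolutely_consistent(read_set, now=104.0, avi=5.0))  # each item fresh
print(relatively_consistent(read_set, rvi=1.0))             # spread of 2.0 too wide
```

The example shows why the two constraints are independent: every item in the read set can be individually fresh while the set as a whole mixes samples taken too far apart.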
Multi-dimensional criteria for testing web services transactions
Web services (WS) transactions are important in order to reliably compose distributed and autonomous services into composite web services and to ensure that their execution is consistent and correct. But such transactions are generally complex: they require long processing times and manipulate critical data. Thus various techniques have been developed to assess the quality of WS transactions in terms of response time efficiency, failure recovery and throughput. This paper focuses on the testing aspect of WS transactions - another key quality issue that has not been examined in the literature. Accordingly it proposes multi-dimensional criteria for testing WS transactions. The proposed criteria have the potential to capture the behaviour of WS transactions and to analyse and classify the possible (failure) situations that affect the execution of such transactions. These criteria are used to generate various test cases and to provide the WS-transactions tester with the flexibility of adjusting the method in terms of test effort and effectiveness. The proposed criteria have been designed, implemented and evaluated through a case study, and a number of experiments have been performed. The evaluation shows that these criteria can effectively generate test cases for testing WS transactions, and enable the tester to decide on the trade-off between test effort and quality.
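Generating test cases from multi-dimensional criteria can be sketched as taking combinations across the dimensions. The dimensions and values below are illustrative assumptions, not the paper's actual criteria, which would classify WS-transaction behaviours and failure situations in more detail.

```python
# Illustrative sketch: combine several test dimensions (participants,
# injected failure, transaction phase) into concrete test cases.
from itertools import product

dimensions = {
    "participants": [2, 5],
    "failure": ["none", "service_crash", "timeout"],
    "phase": ["prepare", "commit"],
}

def generate_test_cases(dims):
    """Full cross-product of all dimension values, one dict per test case."""
    keys = list(dims)
    return [dict(zip(keys, combo)) for combo in product(*dims.values())]

cases = generate_test_cases(dimensions)
print(len(cases))   # 2 participants x 3 failures x 2 phases = 12 cases
print(cases[0])
```

The effort/effectiveness trade-off the abstract mentions shows up directly here: the full cross-product grows multiplicatively with each added dimension, so a tester wanting less effort would select a subset (for example, pairwise combinations) at the cost of coverage.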
IDEAS-1997-2021-Final-Programs
This document records the final program for each of the 26 meetings of the International Database and Engineering Application Symposium from 1997 through 2021. These meetings were organized in various locations on three continents. Most of the papers published during these years are in the digital libraries of IEEE (1997-2007) or ACM (2008-2021).
Applications integration for manufacturing control systems with particular reference to software interoperability issues
The introduction and adoption of contemporary computer aided manufacturing control
systems (MCS) can help rationalise and improve the productivity of manufacturing related
activities. Such activities include product design, process planning and production
management with CAD, CAPP and CAPM. However, they tend to be domain specific and
would generally have been designed as stand-alone systems where there is a serious lack of
consideration for integration requirements with other manufacturing activities outside the area
of immediate concern. As a result, "islands of computerisation" exist which exhibit
deficiencies and constraints that inhibit or complicate subsequent interoperation among typical
MCS components. Because of these interoperability constraints, contemporary forms of
MCS typically yield sub-optimal benefits and do not promote synergy on an enterprise-wide
basis.
The move towards more integrated manufacturing systems, which requires advances in
software interoperability, is becoming a strategic issue. Here the primary aim is to realise
greater functional synergy between software components which span engineering, production
and management activities and systems. Hence information of global interest needs to be
shared across conventional functional boundaries between enterprise functions.
The main thrust of this research study is to derive a new generation of MCS in which
software components can "functionally interact" and share common information through
accessing distributed data repositories in an efficient, highly flexible and standardised
manner. It addresses problems of information fragmentation and the lack of formalism, as
well as issues relating to flexibly structuring interactions between threads of functionality
embedded within the various components. The emphasis is on the:
• definition of generic information models which underpin the sharing of common
data among production planning, product design, finite capacity scheduling and cell
control systems.
• development of an effective framework to manage functional interaction between
MCS components, thereby coordinating their combined activities.
• "soft" or flexible integration of the MCS activities over an integrating infrastructure
in order to (i) help simplify typical integration problems found when using
contemporary interconnection methods for applications integration; and (ii) enable
their reconfiguration and incremental development.
In order to facilitate adaptability in response to changing needs, these systems must also be
engineered to enable reconfigurability over their life cycle. Thus within the scope of this
research study a new methodology and software toolset have been developed to formally
structure and support implementation, run-time and change processes. The tool set combines
the use of IDEF0 (for activity based or functional modelling), IDEF1X (for entity-attribute
relationship modelling), and EXPRESS (for information modelling).
This research includes a pragmatic but effective means of dealing with legacy software,
which often may be a vital source of readily available information which supports the
operation of the manufacturing enterprise. The pragmatism and medium term relevance of the
research study has promoted particular interest and collaboration from software manufacturers
and industrial practitioners. Proof of concept studies have been carried out to implement and
evaluate the developed mechanisms and software toolset.
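The "generic information models which underpin the sharing of common data" can be illustrated with a small sketch of a shared schema and repository. This is a hypothetical example, not the thesis's actual models: the record types, field names, and repository interface are all assumptions (a real implementation in this work would be expressed in EXPRESS and backed by distributed data repositories).

```python
# Illustrative sketch: a shared product record that both process planning
# (which writes the routing) and cell control / scheduling (which reads it)
# access through one repository, instead of holding private copies.
from dataclasses import dataclass, field

@dataclass
class Operation:
    name: str
    machine: str
    minutes: int

@dataclass
class PartRecord:
    part_id: str
    description: str
    routing: list = field(default_factory=list)  # written by CAPP, read by scheduling

repository = {}

def register_part(part):
    """Single shared point of access for information of global interest."""
    repository[part.part_id] = part

register_part(PartRecord("P-100", "Gear housing",
                         routing=[Operation("mill face", "MC-1", 12)]))
print(repository["P-100"].routing[0].machine)   # scheduling reads CAPP's output
```

The point of the sketch is the integration argument in the abstract: once the routing lives in one agreed model, a scheduling component need not re-enter or translate data produced by the planning component.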
An Integrated Engineering-Computation Framework for Collaborative Engineering: An Application in Project Management
Today's engineering applications suffer from a severe integration problem. Engineering, the entire process, consists of a myriad of individual, often complex, tasks. Most computer tools support particular tasks in engineering, but the output of one tool differs from the others', so users must re-enter the relevant information in the format required by each subsequent tool. Moreover, the development of a new product or process usually involves several teams of engineers with different backgrounds and responsibilities, for example mechanical engineers, cost estimators, manufacturing engineers, quality engineers, and project managers. Engineers need tools to share technical and managerial information and to instantly access the latest changes made by any team member, so that the impacts of those changes can be determined right away across all disciplines (cost, time, resources, etc.). In other words, engineers need to participate in a truly collaborative environment for the achievement of a common objective, which is the completion of the product/process design project in a timely, cost effective, and optimal manner.
In this thesis, a new framework that integrates the capabilities of four commercial software packages, Microsoft Excel™ (spreadsheet), Microsoft Project™ (project management), What's Best! (an optimization add-in), and Visual Basic™ (programming language), with a state-of-the-art object-oriented database (knowledge medium), InnerCircle2000™, is presented and applied to handle the Cost-Time Trade-Off problem in project networks. The result was a vastly superior solution over the conventional solution from the viewpoint of data handling, completeness of solution space, and in the context of a collaborative engineering-computation environment.
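The Cost-Time Trade-Off problem mentioned above is classically solved by "crashing": paying extra to shorten activities until a deadline is met. The sketch below shows the economic principle on a serial chain of activities only; it is not the thesis's method (which uses an optimization add-in over a full project network, where crashing must target the critical path), and all activity names and figures are invented for illustration.

```python
# Illustrative time-cost trade-off on a serial chain of activities:
# repeatedly shorten the activity with the lowest cost per day saved
# until the project meets its deadline.

activities = {                 # name: (normal_days, crash_days, cost_per_day_saved)
    "design": (10, 7, 200.0),
    "build":  (15, 10, 150.0),
    "test":   (5, 4, 400.0),
}

def crash_to_deadline(acts, deadline):
    """Greedy crashing for a serial chain; returns (durations, extra cost)."""
    durations = {a: normal for a, (normal, _, _) in acts.items()}
    extra_cost = 0.0
    while sum(durations.values()) > deadline:
        # candidates: activities not yet at their fully-crashed duration
        options = [(slope, a) for a, (_, crash, slope) in acts.items()
                   if durations[a] > crash]
        if not options:
            raise ValueError("deadline infeasible even with full crashing")
        slope, a = min(options)        # cheapest day to buy back
        durations[a] -= 1
        extra_cost += slope
    return durations, extra_cost

durations, cost = crash_to_deadline(activities, deadline=27)
print(durations, cost)   # shortens "build" by 3 days at 150.0/day
```

For a serial chain the greedy choice is optimal because every activity lies on the single path; in a general project network, where the abstract's framework operates, shortening an activity helps only while it stays on the critical path, which is why an optimization formulation is used there instead.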