68 research outputs found
A Logical Design Methodology for Relational Databases Using the Extended ER Model
https://deepblue.lib.umich.edu/bitstream/2027.42/154152/1/39015099114723.pd
NESTMOD: The NetMod-NEST Interface
NESTMOD is a combined analytical modeling and simulation tool based on the existing tools NetMod and NEST. It provides both transient and steady-state response statistics from models of interconnected local area networks that can execute at any desired level of detail. This gives users the ability both to model networks of extremely large scope (hundreds of thousands of nodes) and to examine any combination of the ISO layers in great detail. This paper describes the interface implementation and presents an example to illustrate the potential power of the tool.
http://deepblue.lib.umich.edu/bitstream/2027.42/107970/1/citi-tr-91-7.pd
A knowledge-based approach to multiple query processing
The collective processing of multiple queries in a database system has recently received renewed attention due to its capability of improving the overall performance of a database system and its applicability to the design of knowledge-based expert systems and extensible database systems. A new multiple query processing strategy is presented which utilizes semantic knowledge on data integrity and information on the predicate conditions of the access paths (plans) of queries. The processing of multiple queries is accomplished by exploiting subset relationships between intermediate results of query executions, which are inferred using both semantic and logical information. Given a set of fixed-order access plans, the A* algorithm is used to find the set of reformulated access plans that is optimal for a given collection of semantic knowledge.
Peer Reviewed
http://deepblue.lib.umich.edu/bitstream/2027.42/28071/1/0000514.pd
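The core mechanism is recognizing when one query's result is contained in another's, so the narrower query can be answered from the other's intermediate result instead of re-scanning the base relations. The following is a minimal sketch of that subsumption test, assuming simple single-attribute range predicates; the Range type, implies helper, and example queries are illustrative assumptions, not the paper's formulation (which also exploits integrity constraints and an A* search over access plans).

```python
# Illustrative sketch (not the paper's algorithm): inferring subset
# relationships between query results from simple range predicates, so that
# one query can be answered from another query's intermediate result.
from dataclasses import dataclass

@dataclass(frozen=True)
class Range:
    """Closed interval constraint on a single attribute."""
    attr: str
    lo: float
    hi: float

def implies(p: dict, q: dict) -> bool:
    """True if every tuple satisfying predicate p also satisfies q,
    i.e. the result of p is a subset of the result of q."""
    for attr, rq in q.items():
        rp = p.get(attr)
        if rp is None:                      # p does not constrain attr: no subset guarantee
            return False
        if rp.lo < rq.lo or rp.hi > rq.hi:  # p's interval must lie inside q's interval
            return False
    return True

# Query predicates: q1 selects salaries in [50k, 60k], q2 in [40k, 80k].
q1 = {"salary": Range("salary", 50_000, 60_000)}
q2 = {"salary": Range("salary", 40_000, 80_000)}

if implies(q1, q2):
    # q1's plan can be reformulated to filter q2's intermediate result
    # instead of re-scanning the base table.
    print("q1 is a subset of q2: reuse q2's intermediate result for q1")
```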
From Cooperative Scans to Predictive Buffer Management
In analytical applications, database systems often need to sustain workloads with multiple concurrent scans hitting the same table. The Cooperative Scans (CScans) framework, which introduces an Active Buffer Manager (ABM) component into the database architecture, has been the most effective and elaborate response to this problem, and was initially developed in the X100 research prototype. We now report on the experiences of integrating Cooperative Scans into its industrial-strength successor, the Vectorwise database product. During this implementation we invented a simpler optimization of concurrent scan buffer management, called Predictive Buffer Management (PBM). PBM is based on the observation that in a workload with long-running scans, the buffer manager has quite a bit of information on the workload in the immediate future, such that an approximation of the ideal OPT algorithm becomes feasible. In an evaluation on both synthetic benchmarks and a TPC-H throughput run we compare the benefits of naive buffer management (LRU) versus CScans, PBM and OPT, showing that PBM achieves benefits close to Cooperative Scans while incurring much lower architectural impact.
Comment: VLDB201
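The approximation of OPT is feasible because the pages that long-running scans will request, and roughly in what order, are known ahead of time. Below is a minimal sketch of Belady-style (OPT) eviction under that assumption: evict the buffered page whose next use lies farthest in the future. The data structures and the interleaved-scan example are illustrative, not the Vectorwise implementation.

```python
# Illustrative sketch (not the Vectorwise implementation): OPT-style eviction
# when the future page requests of concurrent scans are known in advance.

def next_use(page, future_requests, now):
    """Index of the next request for `page` at or after `now`;
    pages never requested again sort last."""
    for t in range(now, len(future_requests)):
        if future_requests[t] == page:
            return t
    return float("inf")

def simulate(future_requests, buffer_size):
    """Replay a known request sequence, counting page faults under
    Belady's OPT policy (evict the page reused farthest in the future)."""
    buffer, faults = set(), 0
    for now, page in enumerate(future_requests):
        if page in buffer:
            continue
        faults += 1
        if len(buffer) >= buffer_size:
            # Evict the resident page whose next use is farthest away.
            victim = max(buffer, key=lambda p: next_use(p, future_requests, now + 1))
            buffer.discard(victim)
        buffer.add(page)
    return faults

# Two scans over the same 6-page table, interleaved; buffer holds 3 pages.
scan_a = [0, 1, 2, 3, 4, 5]
scan_b = [0, 1, 2, 3, 4, 5]
interleaved = [p for pair in zip(scan_a, scan_b) for p in pair]
print("page faults:", simulate(interleaved, buffer_size=3))
```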
Linking data and BPMN processes to achieve executable models
We describe a formally well-founded approach to link data and processes conceptually, based on adopting UML class diagrams to represent data and BPMN to represent the process. The UML class diagram, together with a set of additional process variables called the Artifact, forms the information model of the process. All activities of the BPMN process refer to this information model by means of OCL operation contracts. We show that the resulting semantics, while abstract, is fully executable. We also provide an implementation of the executor.
Peer Reviewed
Postprint (author's final draft)
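The central idea is that each process activity is specified by an operation contract (a precondition and a postcondition) over the information model, which is what makes the model executable. The sketch below imitates that pattern in Python; the Order class, the approve activity, and the contract decorator are illustrative assumptions, not the paper's OCL notation or executor.

```python
# Illustrative sketch (not the paper's OCL machinery): a process activity
# specified by a precondition/postcondition contract over an information model.
from dataclasses import dataclass

@dataclass
class Order:
    """A fragment of the information model (UML class plus process variables)."""
    amount: float
    status: str = "created"

def contract(pre, post):
    """Wrap an activity so its pre- and postconditions are checked at runtime."""
    def wrap(activity):
        def run(model, *args, **kwargs):
            assert pre(model), "precondition violated"
            result = activity(model, *args, **kwargs)
            assert post(model), "postcondition violated"
            return result
        return run
    return wrap

@contract(pre=lambda o: o.status == "created" and o.amount > 0,
          post=lambda o: o.status == "approved")
def approve(order: Order):
    """BPMN activity: moves the order to the 'approved' state."""
    order.status = "approved"

order = Order(amount=120.0)
approve(order)          # contract holds, activity executes
print(order.status)     # -> approved
```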
MaDe4IC: an abstract method for managing model dependencies in inter-organizational cooperations
Inter-organizational cooperations are complex in terms of coordination, agreements, and value creation for the partners involved. When managing complex cooperations, it is vital to maintain the models describing them. Changing one model to regain consistency with the running system might introduce new inconsistencies, so this maintenance phase grows in complexity as the number of models increases. In this context, the challenges are to ensure consistency at design time and to monitor the system at runtime: at design time, consistency between the different models describing the cooperation needs to be ensured, while at runtime the behavior of the software system needs to be compared with its underlying models. In this paper, we propose a structured and model-independent method that supports ensuring and maintaining consistency between the running system and its underlying models for inter-organizational cooperations.
An efficient record linkage scheme using graphical analysis for identifier error detection
Integration of information on individuals (record linkage) is a key problem in healthcare delivery, epidemiology, and "business intelligence" applications. It is now common to need to link very large numbers of records, often containing various combinations of theoretically unique identifiers, such as NHS numbers, which are both incomplete and error-prone.
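The graphical analysis the title refers to can be illustrated as follows: treat records as nodes, link records that share an identifier value, and check whether each linked group agrees on the other identifiers. This is a minimal sketch under that assumption; the record layout, the identifier fields (nhs_number, dob), and the conflict rule are illustrative, not the paper's scheme, and the data is synthetic.

```python
# Illustrative sketch (not the paper's scheme): link records that share an
# identifier, then flag linked groups whose other identifiers disagree.
from collections import defaultdict

records = [
    {"id": "r1", "nhs_number": "111 222 3333", "dob": "1970-03-02"},
    {"id": "r2", "nhs_number": "111 222 3333", "dob": "1970-03-02"},
    {"id": "r3", "nhs_number": "111 222 3333", "dob": "1971-11-15"},  # conflicting DOB
    {"id": "r4", "nhs_number": None,           "dob": "1980-06-21"},  # incomplete identifier
]

# Each shared NHS number induces a group of linked records
# (a connected component in the record-linkage graph).
groups = defaultdict(list)
for rec in records:
    if rec["nhs_number"]:
        groups[rec["nhs_number"]].append(rec)

# A group whose members disagree on another identifier (here, date of birth)
# is a candidate identifier error to be reviewed or resolved.
for nhs, members in groups.items():
    dobs = {m["dob"] for m in members}
    if len(dobs) > 1:
        ids = [m["id"] for m in members]
        print(f"possible identifier error for NHS number {nhs}: records {ids}")
```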
…