The covering property of the object-oriented data model: design and implementation issues
Inheritance is a necessary condition for constructing an object-oriented data model (OODM), but it is not sufficient, because inheritance applies within only one hierarchy. The covering construct addresses this deficiency: covering maps an object in one hierarchy to a class of objects in another hierarchy. To date, covering has not been implemented in an existing OODM application. This thesis implements the covering construct in a functioning object-oriented database environment. Implementation was achieved by modifying data constructs and creating a user-defined relation linking two or more hierarchies. Using the Multi-model Multi-lingual Database Supercomputer (MDBS), a sample working application is described, illustrating real-world applications. The results of this thesis show that the covering property can be implemented in an existing OODM without sacrificing the integrity of the data model. The cross-hierarchical mapping afforded by covering is a powerful construct that extends the capabilities of the model beyond pure inheritance. This makes the OODM suitable for a far wider range of applications. Together, inheritance and covering meet the necessary and sufficient conditions of the OODM.
http://archive.org/details/thecoveringprope1094539938
Lieutenant, United States Navy; Lieutenant, United States Navy Reserve
Approved for public release; distribution is unlimited
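The cross-hierarchy mapping described above can be sketched in a few lines. This is an illustrative model only: the class names, attributes, and the `Covering` relation are invented for the sketch and do not reflect the actual MDBS data constructs.

```python
class Node:
    """A class in one inheritance hierarchy (hypothetical names)."""
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent          # inheritance applies within one hierarchy
        self.instances = []

class Covering:
    """Cross-hierarchy mapping: one object -> a class in another hierarchy.

    Modeled here as a user-defined relation held outside both hierarchies,
    mirroring the idea of a relation linking two or more hierarchies.
    """
    def __init__(self):
        self._map = {}

    def cover(self, obj, target_class):
        self._map[id(obj)] = target_class

    def covered_class(self, obj):
        return self._map.get(id(obj))

# Two separate inheritance hierarchies.
vehicle = Node("Vehicle")
car = Node("Car", parent=vehicle)

part = Node("Part")
engine = Node("Engine", parent=part)

# A covering relation links an object from the first hierarchy
# to a class of objects in the second.
my_car = object()
car.instances.append(my_car)

covering = Covering()
covering.cover(my_car, engine)
```

The point of the sketch is that inheritance (`parent` links) never crosses from the vehicle hierarchy into the parts hierarchy; only the covering relation does.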
Design, Implementation, and Evaluation of Network Monitoring Tasks with the TelegraphCQ Data Stream Management System: Master's Thesis
Data stream management systems (DSMSs) provide a new and alternative way to perceive and analyze data streams. Like database management systems (DBMSs), DSMSs use a declarative query language to handle data. One of the main differences is that DSMSs obtain their data from streaming sources, for example a local area network (LAN), instead of from a database. Such an approach opens up a set of tasks that can be described using, for example, SQL-like queries.
To begin with, we discuss the networking application and introduce a set of requirements that might be useful for DSMSs in general. Some of these requirements are further discussed as we describe the issues in DSMSs. This thesis focuses on one particular DSMS, TelegraphCQ, and we give a thorough description and discussion of its features.
We have designed and implemented a set of tasks that may be of value for the network monitoring application described in this thesis. We discuss these tasks, investigate their qualities, and propose ways to implement them in the declarative language provided by TelegraphCQ.
Finally, we run a performance analysis of some of the tasks to see how TelegraphCQ manages to handle data streams at varying loads. We focus on two metrics: throughput relative to the number of packets received, and accuracy of the results. These metrics are very important with respect to the reliability and applicability of TelegraphCQ. In this context, we implement an experimental setup for network monitoring with DSMSs, such that the results can easily be re-tested and verified. We show that TelegraphCQ only manages a network load of approximately 2.5 Mbit/s before it starts dropping packets.
We end the discussion by evaluating TelegraphCQ's support for the requirements described at the beginning of the thesis, and point out some of the requirements TelegraphCQ does not support. We discuss the results from the performance evaluation and conclude that the accuracy is satisfactory. The conclusion is that, due to the low relative throughput, TelegraphCQ is not suited for network traffic monitoring at higher network loads.
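As an illustration of the kind of monitoring task such SQL-like streaming queries express, the following sketch computes the bytes observed in a sliding time window over a packet stream. It is a plain Python approximation with synthetic input, not TelegraphCQ's actual query language or interface; the function and field names are invented.

```python
from collections import deque

def sliding_window_throughput(packets, window=1.0):
    """Yield (timestamp, bytes_in_window) for a sliding time window.

    `packets` is an iterable of (timestamp_seconds, size_bytes) tuples,
    assumed to arrive in timestamp order -- the streaming analogue of a
    windowed SUM(size) GROUP BY query over a packet source.
    """
    window_pkts = deque()
    total = 0
    for ts, size in packets:
        window_pkts.append((ts, size))
        total += size
        # Evict packets that have fallen out of the time window.
        while window_pkts and window_pkts[0][0] <= ts - window:
            _, old_size = window_pkts.popleft()
            total -= old_size
        yield ts, total

# Synthetic packet trace: (arrival time in seconds, packet size in bytes).
stream = [(0.0, 100), (0.4, 200), (0.9, 300), (1.3, 400)]
rates = list(sliding_window_throughput(stream))
```

At the last packet (t=1.3 s), the first packet (t=0.0 s) has left the 1-second window, so its 100 bytes are no longer counted.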
Modeling and Simulating a Software Architecture Design Space
Frequently, a similar type of software system is used in the implementation of many different software applications; databases are an example. Two software development approaches are common for filling the need for instances from a class of similar systems: (1) repeated custom development of similar instances, one for each different application, or (2) development of one or more general-purpose off-the-shelf systems that are used many times in the different applications. Each approach has advantages and disadvantages. Custom development can closely match the requirements of an application, but has an associated high development cost. General-purpose systems may have a lower cost when amortized across multiple applications, but may not closely match the requirements of all the different applications. It can be difficult for application developers to determine which approach is best for their application. Do any of the existing off-the-shelf systems sufficiently satisfy the application requirements? If so, which ones provide the best match? Would a custom implementation be sufficiently better to justify the cost difference over an off-the-shelf solution? These difficult buy-versus-build decisions are extremely important in today's fast-paced, competitive, unforgiving software application market. In this thesis we propose and study a software engineering approach for evaluating how well off-the-shelf and custom software architectures within the design space of a class of OODB systems satisfy the requirements of different applications. The approach is based on the ability to explicitly enumerate and represent the key dimensions of commonality and variability in the space of OODB designs. We demonstrate that modeling and simulation of OODB software architectures can be used to help software developers rapidly converge on OODB requirements for an application and identify OODB software architectures that satisfy those requirements.
The technical focus of this work is on the circular relationships between requirements, software architectures, and system properties such as OODB functionality, size, and performance. We capture these relationships in a parametrized OODB architectural model, together with an OODB simulation and modeling tool that allows software developers to refine application requirements on an OODB, identify corresponding custom and off-the-shelf OODB software architectures, evaluate how well the software architecture properties satisfy the application requirements, and identify potential refinements to requirements.
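The enumerate-and-match idea behind the approach can be sketched as follows. The design-space dimensions and the requirement format here are invented for illustration; the actual dimensions and evaluation in the thesis are richer and include simulation of properties such as size and performance.

```python
from itertools import product

# Hypothetical dimensions of commonality and variability in an OODB
# design space; the real dimensions in the thesis differ.
DIMENSIONS = {
    "concurrency": ["none", "locking", "mvcc"],
    "persistence": ["file", "page-server", "object-server"],
    "indexing":    ["none", "btree"],
}

def enumerate_design_space(dimensions):
    """Yield every point in the design space as a dict of dimension -> value."""
    names = list(dimensions)
    for combo in product(*(dimensions[n] for n in names)):
        yield dict(zip(names, combo))

def satisfies(candidate, requirements):
    """A candidate architecture satisfies the requirements if every
    constrained dimension takes one of the acceptable values."""
    return all(candidate[d] in allowed for d, allowed in requirements.items())

# A hypothetical application's requirements on the OODB.
requirements = {"concurrency": ["locking", "mvcc"], "indexing": ["btree"]}
matches = [c for c in enumerate_design_space(DIMENSIONS)
           if satisfies(c, requirements)]
```

Candidates that survive this filter would then be compared on simulated size and performance to drive the buy-versus-build decision.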
Anales del XIII Congreso Argentino de Ciencias de la Computación (CACIC)
Contents:
Computer architectures
Embedded systems
Service-oriented architectures (SOA)
Communication networks
Heterogeneous networks
Advanced networks
Wireless networks
Mobile networks
Active networks
Administration and monitoring of networks and services
Quality of Service (QoS, SLAs)
Computer security, authentication, and privacy
Infrastructure for digital signatures and digital certificates
Vulnerability analysis and detection
Operating systems
P2P systems
Middleware
Grid infrastructure
Integration services (Web Services or .NET)
Red de Universidades con Carreras en Informática (RedUNCI)
A quantitative study investigating the effects of computerised clinical decision support in the emergency department
Introduction: Over the last decade there has been a significant increase in the use of computerised clinical decision support systems (CCDSSs) in health care. While significant research has been carried out to demonstrate the impact of CCDSSs, their role in Emergency Departments (EDs) remains under-investigated. The aim of this study was to investigate whether the introduction of a CCDSS at ED triage improved the quality and safety of decisions at triage and improved overall departmental safety.
Methods: This study adopted an interrupted time series design with 8 time points. A random sample of triage records (n=400) from the year before the introduction of eTriage (four time points) was compared to the same number of records from the year after its introduction. Data were extracted from ED clinical records to establish the accuracy of triage prioritisation as an indicator of safety and the management of pain as an indicator of quality. A smaller subset of cases (n=44) over the same time period was analysed to assess any differences in the clinical management of patients presenting with neutropenic sepsis, a further indicator of safety. Logistic regression analysis was undertaken to expose the underlying decision-making trend over the whole study period.
Results: This study demonstrates a statistically significant improvement in triage prioritisation (p<0.001), pain scoring (p<0.001) and pain management (p<0.001). Logistic regression demonstrated improvements in decision-making above what would have been expected had eTriage not been introduced. For patients presenting with neutropenic sepsis there was no statistically significant difference in their clinical management.
Conclusion: This study clearly demonstrated the positive impact that a CCDSS can have on the quality and safety of care for ED patients and provides a unique contribution to the current ED CCDSS knowledge base.
The ever-increasing demand for emergency care and the difficulties in recruiting an experienced workforce create a fertile environment for clinicians to harness the potential that technological solutions can offer.
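As an illustration of the analysis named in the abstract, the following sketch fits a logistic regression with a post-intervention indicator to synthetic triage outcomes, using plain gradient descent. The data, variable names, and effect sizes are all invented for the sketch; the actual study used real clinical records and standard statistical software, not this code.

```python
import math
import random

def fit_logistic(xs, ys, lr=0.1, steps=2000):
    """Fit y ~ sigmoid(b0 + b1*time + b2*post) by batch gradient descent.

    xs is a list of (time, post_intervention) pairs; ys are 0/1 outcomes
    (e.g. whether a triage decision was correct).
    """
    b = [0.0, 0.0, 0.0]
    n = len(xs)
    for _ in range(steps):
        g = [0.0, 0.0, 0.0]
        for (t, post), y in zip(xs, ys):
            z = b[0] + b[1] * t + b[2] * post
            p = 1.0 / (1.0 + math.exp(-z))
            err = p - y
            g[0] += err
            g[1] += err * t
            g[2] += err * post
        for j in range(3):
            b[j] -= lr * g[j] / n
    return b

# Synthetic interrupted time series: 8 time points, intervention after
# time point 3, with the correct-triage rate rising post-intervention.
random.seed(0)
data = []
for t in range(8):
    post = 1 if t >= 4 else 0
    p_correct = 0.60 + 0.25 * post      # invented effect size
    for _ in range(50):
        data.append(((t, post), 1 if random.random() < p_correct else 0))

xs, ys = zip(*data)
b0, b_time, b_post = fit_logistic(list(xs), list(ys))
```

The fitted model's predicted probability of a correct decision should be higher at a post-intervention time point than at a pre-intervention one, which is the pattern the study's regression exposed.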