
    Advances in Real-Time Database Systems Research: Special Section on RTDBS of ACM SIGMOD Record 25(1), March 1996.

    A Real-Time DataBase System (RTDBS) can be viewed as an amalgamation of a conventional DataBase Management System (DBMS) and a real-time system. Like a DBMS, it has to process transactions and guarantee ACID database properties. Furthermore, it has to operate in real-time, satisfying time constraints imposed on transaction commitments. An RTDBS may exist as a stand-alone system or as an embedded component in a larger multidatabase system. The publication in 1988 of a special issue of ACM SIGMOD Record on Real-Time DataBases signaled the birth of the RTDBS research area -- an area that brings together researchers from both the database and real-time systems communities. Today, almost eight years later, I am pleased to present in this special section of ACM SIGMOD Record a review of recent advances in RTDBS research. There were 18 submissions to this special section, of which eight papers were selected for inclusion to provide the readers of ACM SIGMOD Record with an overview of current and future research directions within the RTDBS community. In this paper [below], I summarize these directions and provide the reader with pointers to other publications for further information. -Azer Bestavros, Guest Editor
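    The core requirement above -- committing transactions within their time constraints -- can be made concrete with a minimal sketch. The following is not from any paper in the special section; it assumes each transaction carries an estimated execution time and an absolute commit deadline, and tests whether an earliest-deadline-first schedule meets them all on a single processor.

```python
# Minimal sketch (not from the special section): earliest-deadline-first
# admission test for real-time transactions. The fields and names are
# illustrative assumptions, not an API from any RTDBS described here.
from dataclasses import dataclass

@dataclass
class Transaction:
    tid: str
    exec_time: float   # estimated execution time (seconds)
    deadline: float    # absolute commit deadline (seconds from now)

def edf_feasible(transactions):
    """Return True if every transaction can commit by its deadline
    when executed in earliest-deadline-first order on one CPU."""
    clock = 0.0
    for t in sorted(transactions, key=lambda t: t.deadline):
        clock += t.exec_time
        if clock > t.deadline:
            return False   # t would miss its commit deadline
    return True

if __name__ == "__main__":
    workload = [
        Transaction("T1", exec_time=0.02, deadline=0.05),
        Transaction("T2", exec_time=0.03, deadline=0.10),
        Transaction("T3", exec_time=0.06, deadline=0.09),
    ]
    print("schedulable:", edf_feasible(workload))
```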

    Transparent Forecasting Strategies in Database Management Systems

    Whereas traditional data warehouse systems assume that data is complete or has been carefully preprocessed, increasingly more data is imprecise, incomplete, and inconsistent. This is especially true in the context of big data, where massive amounts of data arrive continuously in real-time from a vast number of data sources. Nevertheless, modern data analysis involves sophisticated statistical algorithms that go well beyond traditional BI and, additionally, is increasingly performed by non-expert users. Both trends require transparent data mining techniques that efficiently handle missing data and present a complete view of the database to the user. Time series forecasting estimates future, not yet available, data of a time series and represents one way of dealing with missing data. Moreover, it enables queries that retrieve a view of the database at any point in time - past, present, and future. This article presents an overview of forecasting techniques in database management systems. After discussing possible application areas for time series forecasting, we give a short mathematical background of the main forecasting concepts. We then outline various general strategies for integrating time series forecasting inside a database and discuss some individual techniques from the database community. We conclude this article by introducing a novel forecasting-enabled database management architecture that natively and transparently integrates forecast models.
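    As a rough illustration of transparently answering queries over not-yet-available data (not the architecture proposed in the article), the sketch below fills the future part of a range query with a simple exponential smoothing forecast; the function names and the flat-forecast model are assumptions made here for brevity.

```python
# Minimal sketch (not the article's proposed architecture): answer a range
# query over a stored time series by forecasting the points that are not
# yet available, using simple exponential smoothing. Names are illustrative.

def ses_forecast(history, horizon, alpha=0.5):
    """Simple exponential smoothing: flat forecast of the final level."""
    level = history[0]
    for y in history[1:]:
        level = alpha * y + (1 - alpha) * level
    return [level] * horizon

def query_with_forecast(series, start, end):
    """Return series[start:end], forecasting any indices past the data."""
    stored = series[start:min(end, len(series))]
    missing = max(0, end - len(series))
    if missing:
        stored = stored + ses_forecast(series, missing)
    return stored

if __name__ == "__main__":
    sales = [10.0, 12.0, 11.0, 13.0, 14.0]     # observed history
    # The query reaches two steps into the future; those values are
    # filled transparently by the forecast model.
    print(query_with_forecast(sales, 3, 7))    # [13.0, 14.0, 13.0, 13.0]
```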

    ControlShell: A real-time software framework

    The ControlShell system is a programming environment that enables the development and implementation of complex real-time software. It includes many tools for building complex systems, such as a graphical finite state machine (FSM) tool that provides strategic control. ControlShell has a component-based design, providing interface definitions and mechanisms for building real-time code modules, along with basic data management. Some of the system-building tools incorporated in ControlShell are a graphical data flow editor, a component data requirement editor, and a state-machine editor. It also includes a distributed data flow package, an execution configuration manager, a matrix package, and an object database and dynamic binding facility. This paper presents an overview of ControlShell's architecture and examines the functions of several of its tools.
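    The strategic-control role of an FSM tool like ControlShell's can be illustrated with a minimal table-driven state machine. This sketch does not use ControlShell's actual API; the states, events, and actions are invented for illustration.

```python
# Minimal sketch of a table-driven finite state machine for strategic
# control, in the spirit of ControlShell's FSM tool. Not ControlShell's
# API; states, events, and actions are invented.

class StateMachine:
    def __init__(self, initial, transitions):
        # transitions: {(state, event): (next_state, action)}
        self.state = initial
        self.transitions = transitions

    def dispatch(self, event):
        key = (self.state, event)
        if key not in self.transitions:
            return self.state            # ignore events with no transition
        next_state, action = self.transitions[key]
        if action:
            action()
        self.state = next_state
        return self.state

if __name__ == "__main__":
    table = {
        ("IDLE", "start"):    ("RUNNING", lambda: print("spin up controllers")),
        ("RUNNING", "fault"): ("SAFE",    lambda: print("enter safe mode")),
        ("SAFE", "reset"):    ("IDLE",    lambda: print("reset complete")),
    }
    fsm = StateMachine("IDLE", table)
    for ev in ("start", "fault", "reset"):
        print(ev, "->", fsm.dispatch(ev))
```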

    Conclusions from the European Roadmap on Control of Computing Systems

    The use of control-based methods for resource management in real-time computing and communication systems has gained substantial interest recently. Application areas include performance control of web servers, dynamic resource management in embedded systems, traffic control in communication networks, transaction management in database servers, error control in software systems, and autonomic computing. Within the European EU/IST FP6 Network of Excellence ARTIST2 on Embedded System Design, a roadmap on Control of Real-Time Computing Systems has recently been completed. The focus of the roadmap is how flexibility, adaptivity, performance and robustness can be achieved in a real-time computing or communication system through the use of control theory. The item that is controlled is in most cases the allocation of computing and communication resources, e.g., the distribution or scheduling of CPU time among different competing tasks, jobs, requests, or transactions, or the communication resources in a network. For this reason, control of computing systems also goes under the name of feedback scheduling. The roadmap is divided into six research areas: control of server systems, control of CPU resources, control of communication networks, error control of software systems, feedback scheduling of control systems, and control middleware. For each area, an overview is given and challenges for future research are stated. The aim of this position paper is to summarize the conclusions concerning these research challenges; we cover only the first four of the areas above. A preliminary version of the roadmap can be found at http://www.control.lth.se/user/karlerik/roadmap1.pd
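    To give a flavor of feedback scheduling, the sketch below uses a discrete PI controller to keep a server's CPU utilization at a setpoint by adjusting the admitted request rate. The first-order plant model and the controller gains are assumptions chosen for illustration, not results from the roadmap.

```python
# Minimal sketch of feedback scheduling: a discrete PI controller keeps a
# server's CPU utilization near a setpoint by adjusting the admitted
# request rate. The plant model and gains are illustrative assumptions,
# not results from the ARTIST2 roadmap.

def simulate(steps=30, setpoint=0.7, kp=2.0, ki=1.0):
    utilization, integral = 0.0, 0.0
    for k in range(steps):
        # PI control law: admitted request rate from the tracking error.
        error = setpoint - utilization
        integral += error
        rate = max(0.0, kp * error + ki * integral)
        # Assumed plant: utilization is a lagged, linear response to the
        # admitted rate (0.1 utilization per unit of rate, pole at 0.5).
        utilization = 0.5 * utilization + 0.05 * rate
        if k % 5 == 0 or k == steps - 1:
            print(f"step {k:2d}  rate {rate:6.2f}  utilization {utilization:.3f}")

if __name__ == "__main__":
    simulate()   # utilization converges toward the 0.7 setpoint
```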

    Visualization Component of Vehicle Health Decision Support System

    The visualization front-end described here is part of a Decision Support System (DSS) that also includes an analysis engine linked to vehicle telemetry and a database of learned models for known behaviors. Because the display is graphical rather than text-based, the summarization it provides has a greater information density on one screen for evaluation by a flight controller. This tool provides a system-level visualization of the state of a vehicle, with drill-down capability for more details and interfaces to separate analysis algorithms and sensor data streams. The system-level view is a 3D rendering of the vehicle, with sensors represented as icons tied to appropriate positions within the vehicle body and colored to indicate sensor state (e.g., normal, warning, anomalous). The sensor data is received via an Information Sharing Protocol (ISP) client that connects to an external server for real-time telemetry. Users can interactively pan, zoom, and rotate this 3D view, as well as select sensors for a detail plot of the associated time series data. Subsets of the plotted data can be selected and sent to an external analysis engine either to search for a similar time series in a historical database or to detect anomalous events. The system overview and plotting capabilities are completely general in that they can be applied to any vehicle instrumented with a collection of sensors. This visualization component can interface with the ISP for data streams used by NASA's Mission Control Center at Johnson Space Center. In addition, it can connect to, and display results from, separate analysis engine components that identify anomalies or that search for past instances of similar behavior. This software supports NASA's Software, Intelligent Systems, and Modeling element in the Exploration Systems Research and Technology Program by augmenting the capability of human flight controllers to make correct decisions, thus increasing safety and reliability. It was designed specifically as a tool for NASA's flight controllers to monitor the International Space Station and a future Crew Exploration Vehicle.
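    A minimal sketch of the color-coded, system-level sensor view described above is given below. The sensor names, limits, margins, and colors are invented for illustration and are not taken from the NASA tool or the ISP telemetry format.

```python
# Minimal sketch of a color-coded system-level view: map incoming sensor
# readings to a display state and color. Sensor names, limits, margins,
# and colors are invented; they do not come from the NASA tool or ISP.

STATE_COLORS = {"normal": "green", "warning": "yellow", "anomalous": "red"}

def classify(value, low, high, margin=0.1):
    """Map a reading and its nominal range to a display state."""
    if low <= value <= high:
        return "normal"
    span = high - low
    if (low - margin * span) <= value <= (high + margin * span):
        return "warning"                 # just outside the nominal range
    return "anomalous"

def update_overview(readings, limits):
    """readings: {sensor: value}; limits: {sensor: (low, high)}."""
    overview = {}
    for sensor, value in readings.items():
        state = classify(value, *limits[sensor])
        overview[sensor] = (state, STATE_COLORS[state])
    return overview

if __name__ == "__main__":
    limits = {"cabin_temp_C": (18.0, 27.0), "o2_partial_kPa": (19.5, 23.5)}
    readings = {"cabin_temp_C": 27.5, "o2_partial_kPa": 21.0}
    for sensor, (state, color) in update_overview(readings, limits).items():
        print(f"{sensor}: {state} ({color})")
```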

    Fast computation of the performance evaluation of biometric systems: application to multibiometrics

    The performance evaluation of biometric systems is a crucial step when designing and evaluating such systems. The evaluation process uses the Equal Error Rate (EER) metric proposed by the International Organization for Standardization (ISO/IEC). The EER is a powerful metric that allows biometric systems to be compared and evaluated easily. However, computing the EER is usually computationally intensive. In this paper, we propose a fast method that computes an approximated value of the EER. We illustrate the benefit of the proposed method on two applications: the computation of non-parametric confidence intervals and the use of genetic algorithms to compute the parameters of fusion functions. Experimental results show the superiority of the proposed EER approximation method in terms of computing time, and its usefulness in reducing the time needed to learn parameters with genetic algorithms. The proposed method opens new perspectives for the development of secure multibiometric systems by speeding up their computation time. Comment: Future Generation Computer Systems (2012).
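    For context, the sketch below shows the straightforward (exact) EER computation that such work aims to speed up; it is not the approximation method proposed in the paper. Higher scores are assumed to indicate a better match.

```python
# Minimal sketch of the exact (brute-force) EER computation that the paper
# above sets out to accelerate; this is NOT the authors' approximation
# method. Higher scores are assumed to mean a better match.

def eer(genuine, impostor):
    """Return the Equal Error Rate from genuine and impostor score lists."""
    thresholds = sorted(set(genuine) | set(impostor))
    best = None
    for t in thresholds:
        far = sum(s >= t for s in impostor) / len(impostor)  # false accepts
        frr = sum(s < t for s in genuine) / len(genuine)     # false rejects
        gap = abs(far - frr)
        if best is None or gap < best[0]:
            best = (gap, (far + frr) / 2)
    return best[1]   # rate where FAR and FRR are (nearly) equal

if __name__ == "__main__":
    genuine  = [0.9, 0.8, 0.85, 0.7, 0.95, 0.6]
    impostor = [0.2, 0.4, 0.3, 0.65, 0.5, 0.1]
    print(f"EER = {eer(genuine, impostor):.3f}")
```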

    A Survey on IT-Techniques for a Dynamic Emergency Management in Large Infrastructures

    This deliverable is a survey of the IT techniques that are relevant to the three use cases of the EMILI project. It describes the state of the art in four complementary IT areas: data cleansing, supervisory control and data acquisition, wireless sensor networks, and complex event processing. Even though the deliverable’s authors have tried to avoid overly technical language and to explain every concept referred to, the deliverable might still seem rather technical to readers who are not yet familiar with the techniques it describes.