
    Lazy Approaches for Interval Timing Correlation of Sensor Data Streams

    We propose novel algorithms for the timing correlation of streaming sensor data. The sensor data are assumed to carry interval timestamps so that they can represent temporal uncertainty. The proposed algorithms support efficient timing correlation for various timing predicates such as deadline, delay, and within. In addition to classical techniques, lazy evaluation and a result cache are used to improve performance. The proposed algorithms are implemented and compared under various workloads.
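    The following is a minimal sketch of how a "within" predicate can be evaluated over interval timestamps; the Event type and both predicate variants are illustrative assumptions, not taken from the paper. The pessimistic check only fires when the bound holds for every possible pair of true timestamps, while the optimistic check is the kind of test a lazy evaluator could use to keep a candidate pair alive.

```python
from dataclasses import dataclass

@dataclass
class Event:
    """A sensor reading whose true occurrence time is only known to lie in [lo, hi]."""
    value: float
    lo: float  # earliest possible timestamp
    hi: float  # latest possible timestamp

def definitely_within(a: Event, b: Event, bound: float) -> bool:
    """True only if b is guaranteed to occur between 0 and `bound` time
    units after a, for every choice of true timestamps in the intervals."""
    return b.lo - a.hi >= 0.0 and b.hi - a.lo <= bound

def possibly_within(a: Event, b: Event, bound: float) -> bool:
    """True if some choice of timestamps satisfies the bound, so a lazy
    evaluator cannot yet discard the pair (a, b)."""
    return b.hi - a.lo >= 0.0 and b.lo - a.hi <= bound
```

    A lazy correlator could cache pairs for which possibly_within holds but definitely_within does not, deferring the final decision until the intervals are refined by later readings.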

    Extending Complex Event Processing for Advanced Applications

    Recently, numerous emerging applications, ranging from online financial transactions, RFID-based supply chain management, and traffic monitoring to real-time object monitoring, generate high-volume event streams. To meet the need to process event data streams in real time, Complex Event Processing (CEP) technology has been developed, with a focus on detecting occurrences of particular composite patterns of events. By analyzing and constructing several real-world CEP applications, we found that CEP needs to be extended with advanced services beyond detecting pattern queries. We summarize these emerging needs in three orthogonal directions. First, for applications which require access to both streaming and stored data, we need to provide clear semantics and efficient schedulers in the face of concurrent access and failures. Second, when a CEP system is deployed in a sensitive environment such as health care, we wish to mitigate possible privacy leaks. Third, when input events do not carry the identification of the object being monitored, we need to infer the probabilistic identification of events before feeding them to a CEP engine. This dissertation therefore discusses the construction of a framework for extending CEP to support these critical services.

    First, existing CEP technology is limited in its capability to react to opportunities and risks detected by pattern queries. We propose to tackle this unsolved problem by embedding active rule support within the CEP engine. The main challenge is to handle interactions between queries and reactions to queries in high-volume stream execution. We hence introduce a novel stream-oriented transactional model along with a family of stream transaction scheduling algorithms that ensure the correctness of concurrent stream execution. We then demonstrate the proposed technology by applying it to a real-world healthcare system and evaluate the stream transaction scheduling algorithms extensively using a real-world workload.

    Second, we are the first to study the privacy implications of CEP systems. Specifically, we consider how to suppress events on a stream to reduce the disclosure of sensitive patterns, while ensuring that non-sensitive patterns continue to be reported by the CEP engine. We formally define the problem of utility-maximizing event suppression for privacy preservation, and then design a suite of real-time solutions that eliminate private pattern matches while maximizing overall utility. Our first solution optimally solves the problem at the event-type level. The second solution, at the event-instance level, further optimizes the event-type-level solution by exploiting runtime event distributions using advanced pattern-match cardinality estimation techniques. Our experimental evaluation over both real-world and synthetic event streams shows that our algorithms are effective in maximizing utility yet efficient enough to offer near-real-time system responsiveness.

    Third, we observe that in many real-world object monitoring applications where CEP technology is adopted, not all sensed events carry the identification of the object whose action they report on; we call these "non-ID-ed" events. Such non-ID-ed events prevent us from performing object-based analytics such as tracking, alerting, and pattern matching. We propose a probabilistic inference framework to tackle this problem by inferring the missing object identification associated with an event. Specifically, as a foundation we design a time-varying graphical model to capture correspondences between sensed events and objects. Upon this model, we elaborate how to adapt the state-of-the-art forward-backward inference algorithm to continuously infer probabilistic identifications for non-ID-ed events. More importantly, we propose a suite of strategies for optimizing the performance of inference. Our experimental results, using large-volume streams from a real-world health care application, demonstrate the accuracy, efficiency, and scalability of the proposed technology.
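    As an illustration of the inference step described above, the sketch below is a textbook forward-backward smoother over a discrete set of candidate objects; it is a generic HMM formulation, not the dissertation's time-varying model, and all names and matrix shapes are assumptions made for the example.

```python
import numpy as np

def forward_backward(init, trans, emis):
    """Forward-backward smoothing over a discrete state space.

    init:  (S,)   prior over which object produced the first event
    trans: (S, S) P(object at step t+1 | object at step t)
    emis:  (T, S) likelihood of each observed event under each object
    Returns a (T, S) matrix of posteriors P(object | all events).
    """
    T, S = emis.shape
    fwd = np.zeros((T, S))
    bwd = np.zeros((T, S))

    # Forward pass: filtering distribution over object identities.
    fwd[0] = init * emis[0]
    fwd[0] /= fwd[0].sum()
    for t in range(1, T):
        fwd[t] = (fwd[t - 1] @ trans) * emis[t]
        fwd[t] /= fwd[t].sum()

    # Backward pass: evidence contributed by future events.
    bwd[-1] = 1.0
    for t in range(T - 2, -1, -1):
        bwd[t] = trans @ (emis[t + 1] * bwd[t + 1])
        bwd[t] /= bwd[t].sum()

    post = fwd * bwd
    return post / post.sum(axis=1, keepdims=True)
```

    Each row of the returned matrix would be the probabilistic identification attached to one non-ID-ed event; the dissertation's contribution lies in adapting and optimizing this kind of batch computation to run continuously over streams.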

    Income incentives to labour participation and home production; the contribution of the tax credits in the Netherlands

    We set up a reduced-form model of labour-market participation for young women who have to balance their careers with motherhood. The model accounts for the occurrence of uncertain future events, such as childbirth and early retirement, and includes time spent in home production; however, it does not require the estimation of a dynamic programming model. We claim that the careful implementation of institutions can yield optimal life patterns of participation without the need for a structural approach. The weaker theoretical framework is more than compensated for by the rich spectrum of policy simulations that can be performed. As illustrations, we simulate the effect of two policy options regarding tax credits on the hazard rate out of work.
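    Purely as an illustration of the kind of policy simulation the abstract mentions, the toy sketch below scales a constant monthly hazard of leaving non-employment by an assumed tax-credit effect; the functional form, baseline hazard, and effect sizes are all invented for the example and bear no relation to the paper's estimates.

```python
import math

def exit_hazard(base_hazard, credit_effect):
    """Monthly hazard of moving from non-employment into work, scaled up
    by a tax credit that raises the net gain from working (toy form)."""
    return base_hazard * math.exp(credit_effect)

for label, effect in [("no credit", 0.0), ("in-work tax credit", 0.25)]:
    h = exit_hazard(0.05, effect)  # assumed baseline: 5% per month
    print(f"{label}: hazard {h:.3f}/month, "
          f"expected spell {1.0 / h:.1f} months")  # geometric mean duration
```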

    10451 Abstracts Collection -- Runtime Verification, Diagnosis, Planning and Control for Autonomous Systems

    From November 7 to 12, 2010, the Dagstuhl Seminar 10451 "Runtime Verification, Diagnosis, Planning and Control for Autonomous Systems" was held at Schloss Dagstuhl – Leibniz Center for Informatics. During the seminar, 35 participants presented their current research and discussed ongoing work and open problems. This document puts together abstracts of the presentations given during the seminar and provides links to extended abstracts or full papers, where available.

    Streaming the Web: Reasoning over dynamic data.

    In the last few years, a new research area called stream reasoning has emerged to bridge the gap between reasoning and stream processing. While current reasoning approaches are designed to work on mainly static data, the Web is extremely dynamic: information is frequently changed and updated, and new data is continuously generated from a huge number of sources, often at a high rate. In other words, fresh information is constantly made available in the form of streams of new data and updates. Despite some promising investigations in the area, stream reasoning is still in its infancy, both from the perspective of developing models and theories and from the perspective of designing and implementing systems and tools. The aim of this paper is threefold: (i) we identify the requirements coming from different application scenarios and isolate the problems they pose; (ii) we survey existing approaches and proposals in the area of stream reasoning, highlighting their strengths and limitations; (iii) we draw a research agenda to guide future research and development of stream reasoning. In doing so, we also analyze related research fields to extract algorithms, models, techniques, and solutions that could be useful in the area of stream reasoning.
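    To make the windowing-plus-inference idea concrete, here is a minimal sketch (not any specific system from the survey) that keeps a sliding window of timestamped triples and naively re-derives the transitive closure of one predicate over the window; real stream reasoners perform this kind of derivation incrementally.

```python
from collections import deque

class WindowedReasoner:
    def __init__(self, window_seconds):
        self.window = window_seconds
        self.triples = deque()  # (timestamp, subject, predicate, object)

    def push(self, now, s, p, o):
        """Add a fact and evict facts that have fallen out of the window."""
        self.triples.append((now, s, p, o))
        while self.triples and self.triples[0][0] <= now - self.window:
            self.triples.popleft()

    def derive(self, predicate):
        """Infer x-p-z whenever x-p-y and y-p-z both hold in the window."""
        facts = {(s, o) for _, s, p, o in self.triples if p == predicate}
        derived = set(facts)
        changed = True
        while changed:  # naive fixpoint over the windowed facts
            new = {(x, z) for x, y in derived for y2, z in derived if y == y2}
            changed = not new <= derived
            derived |= new
        return derived - facts

r = WindowedReasoner(window_seconds=60)
r.push(0, "ward3", "locatedIn", "floor2")
r.push(5, "floor2", "locatedIn", "hospitalA")
print(r.derive("locatedIn"))  # {('ward3', 'hospitalA')}
```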

    Non-cryptographic methods for improving real time transmission security and integrity

    In this paper we present a few non-cryptographic methods for improving the security, integrity, and reliability of real-time services. The methods presented apply to real-time transmission systems based on the Peer-to-Peer (P2P) model. The basic idea of the first technique is to use agents to detect steganographic content in packet headers: packets with suspicious entries in the IP header fields are blocked, or the fields are erased. The other two techniques are based on reputation and trust systems, so basic definitions, types, and modelling methods for trust and reputation are presented, along with a simple design scheme for using these mechanisms in a P2P real-time data transmission infrastructure. Additionally, we describe a path-selection technique that can be used to avoid paths susceptible to eavesdropping.
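    The sketch below illustrates the reputation-based path-selection idea in miniature; the update rule, the trust floor, and the scoring of paths by their weakest hop are assumptions chosen for the example rather than the paper's actual mechanisms.

```python
ALPHA = 0.2        # weight given to the newest observation
TRUST_FLOOR = 0.5  # peers scoring below this are treated as susceptible

trust = {}  # peer id -> trust score in [0, 1]

def observe(peer, behaved_well):
    """Exponentially weighted trust update after one interaction with `peer`."""
    old = trust.get(peer, 0.5)  # neutral prior for unknown peers
    trust[peer] = (1 - ALPHA) * old + ALPHA * (1.0 if behaved_well else 0.0)

def select_path(paths):
    """Pick the candidate path whose weakest hop is most trusted, skipping
    any path that contains a peer below the trust floor."""
    safe = [p for p in paths
            if all(trust.get(n, 0.5) >= TRUST_FLOOR for n in p)]
    return max(safe, key=lambda p: min(trust.get(n, 0.5) for n in p),
               default=None)
```

    Scoring a path by its least-trusted hop reflects the intuition that a single susceptible relay is enough to compromise the whole route.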

    Adaptive Resonance Theory

    SyNAPSE program of the Defense Advanced Research Projects Agency (Hewlett-Packard Company, subcontract under DARPA prime contract HR0011-09-3-0001, and HRL Laboratories LLC, subcontract #801881-BS under DARPA prime contract HR0011-09-C-0001); CELEST, an NSF Science of Learning Center (SBE-0354378).

    Flood Prediction and Mitigation in Data-Sparse Environments

    In the last three decades, many sophisticated tools have been developed that can accurately predict the dynamics of flooding. However, owing to the paucity of adequate infrastructure, this technological advancement has not substantially benefited ungauged flood-prone regions in developing countries. The overall research theme of this dissertation is to explore the methodological improvements essential for utilising recently developed flood prediction and management tools in the developing world, where ideal model inputs and validation datasets do not exist. This research addresses important issues related to undertaking inundation modelling at different scales, particularly in data-sparse environments. The results indicate that, in order to predict the dynamics of high-magnitude stream flow in data-sparse regions, special attention must be paid to the choice of model in relation to the available data and the hydraulic characteristics of the event. Adaptations are necessary to create inputs for models that were primarily designed for areas with better data availability. Freely available geospatial information of moderate resolution can often meet the minimum data requirements of hydrological and hydrodynamic models if supplemented carefully with limited surveyed/measured information. This thesis also explores flood mitigation through rainfall-runoff modelling, assessing the impact of land-use changes at the sub-catchment scale on overall downstream flood risk. A key component of this study is quantifying predictive uncertainty in hydrodynamic models based on the Generalised Likelihood Uncertainty Estimation (GLUE) framework. Detailed uncertainty assessment of the model outputs indicates that, in spite of using sparse inputs, the models perform with reasonably low levels of uncertainty both spatially and temporally. These findings should encourage flood managers and hydrologists in the developing world to use similar data sets for flood management.
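    A minimal sketch of the GLUE procedure referenced above, assuming a hypothetical simulate(params) function that returns a predicted hydrograph and invented prior parameter ranges; it is meant only to show the shape of the method, not the dissertation's implementation.

```python
import numpy as np

rng = np.random.default_rng(42)

def glue_bounds(simulate, observed, n_samples=1000, nse_threshold=0.3):
    """Sample parameter sets, keep the 'behavioural' ones whose
    Nash-Sutcliffe efficiency clears the threshold, and return 5th/95th
    percentile prediction bounds across the behavioural simulations."""
    behavioural = []
    denom = np.sum((observed - observed.mean()) ** 2)
    for _ in range(n_samples):
        params = rng.uniform([0.1, 10.0], [0.9, 100.0])  # assumed prior ranges
        pred = simulate(params)
        nse = 1.0 - np.sum((observed - pred) ** 2) / denom
        if nse > nse_threshold:
            behavioural.append(pred)
    if not behavioural:
        raise ValueError("no behavioural parameter sets; widen the priors "
                         "or lower the threshold")
    preds = np.stack(behavioural)
    # Unweighted percentiles for brevity; full GLUE weights each run's
    # contribution to the quantiles by its likelihood measure.
    lower, upper = np.percentile(preds, [5, 95], axis=0)
    return lower, upper
```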