
    Managing Uncertain Complex Events in Web of Things Applications

    A critical issue in the Web of Things (WoT) is the need to process and analyze the interactions of Web-interconnected real-world objects. Complex Event Processing (CEP) is a powerful technology for analyzing streams of information about real-time distributed events coming from different sources and for extracting conclusions from them. However, in many situations these events are not free from uncertainty, due to unreliable data sources and networks, measurement uncertainty, or the inability to determine whether an event has actually happened or not. This short research paper discusses how CEP systems can incorporate different kinds of uncertainty, both in the events and in the rules. A case study is used to validate the proposal, and we discuss the benefits and limitations of this CEP extension.
    Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech
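
    To make the idea of uncertainty-aware event processing concrete, the following minimal sketch attaches an occurrence probability to each simple event and fires a complex event only when the joint probability of the matched pattern exceeds a confidence threshold. The event types, the independence assumption, and the threshold are illustrative assumptions, not the model proposed in the paper.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class UncertainEvent:
    """A simple event annotated with the probability that it actually occurred."""
    type: str
    payload: dict
    probability: float  # occurrence probability in [0, 1]

def match_rule(window: List[UncertainEvent],
               required_types: List[str],
               confidence_threshold: float = 0.8) -> bool:
    """Fire the complex event only if every required simple event is present
    and their joint occurrence probability (assuming independence between
    event types) exceeds the rule's confidence threshold."""
    joint = 1.0
    for etype in required_types:
        candidates = [e for e in window if e.type == etype]
        if not candidates:
            return False
        joint *= max(e.probability for e in candidates)
    return joint >= confidence_threshold

# Hypothetical example: a "fire alarm" complex event from two uncertain readings.
window = [
    UncertainEvent("SmokeDetected", {"room": "lab"}, 0.9),
    UncertainEvent("HighTemperature", {"room": "lab"}, 0.95),
]
print(match_rule(window, ["SmokeDetected", "HighTemperature"]))  # True (0.855 >= 0.8)
```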

    spChains: A Declarative Framework for Data Stream Processing in Pervasive Applications

    Pervasive applications rely on increasingly complex streams of sensor data continuously captured from the physical world. Such data is crucial to enable applications to "understand" the current context and to infer the right actions to perform, be they fully automatic or involving some user decisions. However, the continuous nature of such streams, the relatively high throughput at which data is generated, and the number of sensors usually deployed in the environment make direct data handling practically infeasible. Data not only needs to be cleaned, but it must also be filtered and aggregated to relieve higher-level algorithms from near real-time handling of such massive data flows. We propose here a stream-processing framework (spChains), based upon state-of-the-art stream processing engines, which enables declarative and modular composition of stream processing chains built atop a set of extensible stream processing blocks. While stream processing blocks are delivered as a standard, yet extensible, library of application-independent processing elements, chains can be defined by the pervasive application engineering team. We demonstrate the flexibility and effectiveness of the spChains framework on two real-world applications in the energy management and industrial plant management domains, by evaluating them on a prototype implementation based on the Esper stream processor.
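
    The sketch below illustrates the general idea of composing reusable, application-independent processing blocks into a chain from a declarative description. The block names, the registry, and the chain specification format are hypothetical stand-ins; the actual spChains framework is built on the Esper engine and its own block library.

```python
from statistics import mean
from typing import Iterable, List

class Block:
    """A reusable, application-independent stream processing element."""
    def process(self, values: Iterable[float]) -> List[float]:
        raise NotImplementedError

class ThresholdFilter(Block):
    """Drop readings below a minimum value (a simple cleaning/filtering block)."""
    def __init__(self, minimum: float):
        self.minimum = minimum
    def process(self, values):
        return [v for v in values if v >= self.minimum]

class WindowAverage(Block):
    """Aggregate readings by averaging fixed-size windows."""
    def __init__(self, size: int):
        self.size = size
    def process(self, values):
        values = list(values)
        return [mean(values[i:i + self.size])
                for i in range(0, len(values), self.size)]

BLOCK_REGISTRY = {"filter": ThresholdFilter, "avg": WindowAverage}

def build_chain(spec):
    """Instantiate a processing chain from a declarative description."""
    return [BLOCK_REGISTRY[name](**params) for name, params in spec]

# Declarative chain definition, as an application engineer might write it.
chain = build_chain([("filter", {"minimum": 10.0}), ("avg", {"size": 3})])

readings = [4.0, 12.0, 15.0, 18.0, 9.0, 21.0, 24.0, 27.0]
for block in chain:
    readings = block.process(readings)
print(readings)  # [15.0, 24.0]
```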

    Event-Cloud Platform to Support Decision-Making in Emergency Management

    The aim of this paper is to show how an Event-Cloud Platform can efficiently support an emergency situation, focusing on a nuclear crisis use case. The proposed approach consists in modeling the business processes of crisis response on the one hand, and in supporting the orchestration and execution of these processes with an Event-Cloud Platform on the other hand. The paper shows how Event-Cloud techniques can support crisis management stakeholders by automating non-value-added tasks and by directing decision-makers to the choices that really require their judgment. Although Event-Cloud technology is an interesting and topical subject, very few research works have considered it for improving emergency management. This paper tries to fill that gap by applying these technologies to a nuclear crisis use case.
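
    As a rough illustration of the routing idea, the sketch below automates routine reactions to incoming crisis events and escalates only high-severity events to human decision-makers. The event kinds, severity scale, and handlers are invented for illustration and are not part of the platform described in the paper.

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class CrisisEvent:
    kind: str
    severity: int  # e.g. 1 (routine) to 5 (critical)
    details: str

def log_event(event: CrisisEvent) -> str:
    return f"logged: {event.details}"

def notify_field_team(event: CrisisEvent) -> str:
    return f"field team notified: {event.details}"

# Routine, non-value-added reactions handled automatically per event kind.
AUTOMATED_HANDLERS: Dict[str, Callable[[CrisisEvent], str]] = {
    "radiation_reading": log_event,
    "perimeter_breach": notify_field_team,
}

def route(event: CrisisEvent) -> str:
    """Automate routine reactions; escalate only what needs human judgment."""
    if event.severity >= 4:
        return f"ESCALATED to decision-makers: {event.kind} ({event.details})"
    handler = AUTOMATED_HANDLERS.get(event.kind, log_event)
    return handler(event)

print(route(CrisisEvent("radiation_reading", 2, "0.3 µSv/h at gate A")))
print(route(CrisisEvent("radiation_reading", 5, "reactor pressure anomaly")))
```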

    Clustering-Based Predictive Process Monitoring

    Business process enactment is generally supported by information systems that record data about process executions, which can be extracted as event logs. Predictive process monitoring is concerned with exploiting such event logs to predict how running (uncompleted) cases will unfold up to their completion. In this paper, we propose a predictive process monitoring framework for estimating the probability that a given predicate will be fulfilled upon completion of a running case. The predicate can be, for example, a temporal logic constraint, a time constraint, or any predicate that can be evaluated over a completed trace. The framework takes into account both the sequence of events observed in the current trace and the data attributes associated with these events. The prediction problem is approached in two phases. First, prefixes of previous traces are clustered according to control-flow information. Second, a classifier is built for each cluster, using event data to discriminate between fulfillments and violations. At runtime, a prediction is made for a running case by mapping it to a cluster and applying the corresponding classifier. The framework has been implemented in the ProM toolset and validated on a log pertaining to the treatment of cancer patients in a large hospital.
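
    The two-phase scheme can be sketched as follows: cluster historical trace prefixes on a control-flow encoding, train one classifier per cluster on the data attributes of the events, and at runtime map the running case to a cluster and query its classifier. The activity names, attribute encodings, and model choices (k-means, random forest) below are illustrative assumptions and do not reproduce the actual ProM implementation.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier

ACTIVITIES = ["register", "triage", "lab_test", "surgery", "discharge"]

def control_flow_vector(prefix):
    """Encode a trace prefix by activity frequencies (control-flow view only)."""
    return np.array([sum(1 for activity, _ in prefix if activity == name)
                     for name in ACTIVITIES])

def data_vector(prefix):
    """Encode the data attributes carried by the last event of the prefix."""
    _, attrs = prefix[-1]
    return np.array([attrs["age"], attrs["urgency"]])

# Historical prefixes with known outcomes (1 = predicate fulfilled at completion).
prefixes = [
    [("register", {"age": 60, "urgency": 3}), ("triage", {"age": 60, "urgency": 4})],
    [("register", {"age": 35, "urgency": 1}), ("lab_test", {"age": 35, "urgency": 1})],
    [("register", {"age": 72, "urgency": 5}), ("triage", {"age": 72, "urgency": 5})],
    [("register", {"age": 28, "urgency": 2}), ("lab_test", {"age": 28, "urgency": 2})],
]
labels = np.array([1, 1, 0, 0])

# Phase 1: cluster prefixes according to control-flow information.
control_flow = np.array([control_flow_vector(p) for p in prefixes])
clustering = KMeans(n_clusters=2, n_init=10, random_state=0).fit(control_flow)

# Phase 2: train one classifier per cluster on the event data attributes.
classifiers = {}
for cluster_id in set(clustering.labels_):
    in_cluster = clustering.labels_ == cluster_id
    features = np.array([data_vector(p)
                         for p, keep in zip(prefixes, in_cluster) if keep])
    classifiers[cluster_id] = RandomForestClassifier(random_state=0).fit(
        features, labels[in_cluster])

# Runtime: map the running case to a cluster, then estimate fulfilment probability.
running = [("register", {"age": 55, "urgency": 4}), ("triage", {"age": 55, "urgency": 4})]
cluster_id = clustering.predict(control_flow_vector(running).reshape(1, -1))[0]
print(classifiers[cluster_id].predict_proba(data_vector(running).reshape(1, -1)))
```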

    A model-driven approach for facilitating user-friendly design of complex event patterns

    Complex Event Processing (CEP) is an emerging technology that allows us to efficiently process and correlate huge amounts of data in order to discover relevant or critical situations of interest (complex events) for a specific domain. This technology requires domain experts to define complex event patterns, where the conditions to be detected are specified by means of event processing languages. However, these experts face the handicap of defining such patterns with editors that are not user-friendly enough. To solve this problem, a model-driven approach for facilitating user-friendly design of complex event patterns is proposed and developed in this paper. In addition, the proposal has been applied to different domains, and several event processing languages have been compared. As a result, we can affirm that the presented approach is independent both of the domain where CEP technology is applied and of the concrete event processing language required for defining event patterns.
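
    A model-driven pipeline of this kind typically separates a platform-independent pattern model from the code generated for a concrete event processing language. The sketch below shows that separation with a simple model-to-text transformation; the classes and the generated Esper-like EPL syntax are illustrative assumptions, not the metamodel or editor presented in the paper.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Condition:
    attribute: str
    operator: str
    value: str

@dataclass
class EventPattern:
    """Platform-independent model of a complex event pattern, as a domain
    expert might specify it in a graphical editor."""
    name: str
    event_type: str
    window_seconds: int
    conditions: List[Condition]

def to_epl(pattern: EventPattern) -> str:
    """Model-to-text transformation targeting an Esper-like EPL dialect
    (the concrete syntax here is illustrative, not tied to one engine)."""
    where = " and ".join(f"{c.attribute} {c.operator} {c.value}"
                         for c in pattern.conditions)
    return (f"@Name('{pattern.name}') "
            f"select * from {pattern.event_type}#time({pattern.window_seconds} sec) "
            f"where {where}")

overheating = EventPattern(
    name="Overheating",
    event_type="TemperatureReading",
    window_seconds=60,
    conditions=[Condition("value", ">", "40"), Condition("sensorOk", "=", "true")],
)
print(to_epl(overheating))
```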