20 research outputs found

    spChains: A Declarative Framework for Data Stream Processing in Pervasive Applications

    Pervasive applications rely on increasingly complex streams of sensor data continuously captured from the physical world. Such data is crucial to enable applications to "understand" the current context and to infer the right actions to perform, be they fully automatic or involving some user decision. However, the continuous nature of such streams, the relatively high throughput at which data is generated, and the number of sensors usually deployed in the environment make direct data handling practically infeasible. Data not only needs to be cleaned, but must also be filtered and aggregated to relieve higher-level algorithms from near real-time handling of such massive data flows. We propose here a stream-processing framework (spChains), based upon state-of-the-art stream processing engines, which enables declarative and modular composition of stream processing chains built atop a set of extensible stream processing blocks. While stream processing blocks are delivered as a standard, yet extensible, library of application-independent processing elements, chains can be defined by the pervasive application engineering team. We demonstrate the flexibility and effectiveness of the spChains framework on two real-world applications in the energy management and industrial plant management domains, by evaluating them on a prototype implementation based on the Esper stream processor.
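    To make the chain idea concrete, the following is a minimal Python sketch of declarative block composition in the spirit of spChains; the block names (Threshold, MovingAverage) and their API are hypothetical illustrations, and the real framework delegates evaluation to an underlying engine such as Esper.

```python
# A minimal sketch of declarative chain composition, assuming hypothetical
# block names; this is not the spChains API itself.

class Block:
    """A reusable, application-independent stream processing element."""
    def process(self, value):
        raise NotImplementedError

class Threshold(Block):
    """Drop samples below a minimum (a simple cleaning/filtering block)."""
    def __init__(self, minimum):
        self.minimum = minimum
    def process(self, value):
        return value if value >= self.minimum else None

class MovingAverage(Block):
    """Aggregate the last n samples, relieving downstream consumers."""
    def __init__(self, n):
        self.n, self.window = n, []
    def process(self, value):
        self.window = (self.window + [value])[-self.n:]
        return sum(self.window) / len(self.window)

class Chain:
    """A chain is a declared sequence of blocks applied to each sample."""
    def __init__(self, *blocks):
        self.blocks = blocks
    def push(self, value):
        for block in self.blocks:
            if value is None:      # an upstream block suppressed the sample
                return None
            value = block.process(value)
        return value

# A chain declared by the application engineering team, not coded ad hoc.
power_chain = Chain(Threshold(minimum=0.0), MovingAverage(n=4))
for sample in [3.0, -1.0, 5.0, 4.0]:
    print(power_chain.push(sample))   # 3.0, None, 4.0, 4.0
```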

    RFID-enabled complex event processing application framework for manufacturing

    To face up to classic manufacturing challenges such as high work-in-progress (WIP) inventories, complexity in production planning and scheduling, and low labour and machine utilisation, many manufacturing companies have made efforts to implement RFID (Radio Frequency Identification) devices throughout their manufacturing workshops. In this way, all production data in the manufacturing field can be obtained in real time, improving the companies' flexibility and responsiveness to the changing market. However, RFID deployment also introduces a new challenge: it requires an effective and efficient method to handle the large volume of events. This paper proposes an application framework for a real-time Complex Event Management System (CEMS) based on RFID equipment deployment. Using Complex Event Processing (CEP) technologies, this system allows users to obtain relevant and meaningful information, in real time, from the large numbers of primitive events captured by the RFID devices deployed on the manufacturing shop floor. This paper first presents the RFID deployment infrastructure, and then proposes the system design of the CEMS. © 2011 Inderscience Enterprises Ltd.
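    As an illustration of the primitive-to-complex event step, the Python sketch below correlates reads from two hypothetical readers into a derived "moved" event within a time window; a production CEMS would express this as a CEP engine pattern rather than hand-written code.

```python
# A minimal sketch of deriving a complex event from primitive RFID reads,
# assuming hypothetical reader names and a 60-second correlation window.

from collections import namedtuple

Read = namedtuple("Read", "tag reader ts")   # a primitive RFID event

def detect_moves(reads, src, dst, window=60):
    """Yield (tag, ts) when a tag seen at `src` reaches `dst` within `window` s."""
    last_seen = {}
    for r in reads:
        if r.reader == src:
            last_seen[r.tag] = r.ts
        elif r.reader == dst and r.tag in last_seen:
            if r.ts - last_seen.pop(r.tag) <= window:
                yield (r.tag, r.ts)          # the derived complex event

reads = [Read("P42", "dock", 0), Read("P42", "shelf", 45), Read("P7", "dock", 10)]
print(list(detect_moves(reads, "dock", "shelf")))   # [('P42', 45)]
```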

    Determination of Rule Patterns in Complex Event Processing Using Machine Learning Techniques

    Complex Event Processing (CEP) is a novel and promising methodology that enables the real-time analysis of streaming event data. The main purpose of CEP is the detection of complex event patterns from atomic, semantically low-level events such as sensor, log, or RFID data. Determining the rule patterns that match these simple events based on temporal, semantic, or spatial correlations is the central task of CEP systems. In the current design of CEP systems, experts provide the event rule patterns. Having reached maturity, Big Data systems and Internet of Things (IoT) technology now call for advanced machine learning approaches to automate this step in the CEP domain. The goal of this research is to propose a machine learning model that replaces the manual identification of rule patterns. After a pre-processing stage (dealing with missing values, data outliers, etc.), various rule-based machine learning approaches were applied to detect complex events. Promising results with high precision were obtained, and a comparative analysis of the classifiers' performance is discussed.
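    As a rough illustration of replacing hand-written rules with learned ones, the sketch below trains a decision tree, one representative rule-based learner (the paper's exact classifiers and features are not reproduced here), on labelled event features and prints the recovered rules, assuming scikit-learn is available.

```python
# A minimal sketch of learning rule patterns from labelled event features;
# the feature names and data are purely illustrative.

from sklearn.tree import DecisionTreeClassifier, export_text

# Each row: [temperature, vibration]; label 1 marks a known complex event.
X = [[20, 0.1], [22, 0.2], [80, 0.9], [85, 0.8], [21, 0.15], [90, 0.95]]
y = [0, 0, 1, 1, 0, 1]

tree = DecisionTreeClassifier(max_depth=2).fit(X, y)

# The learned tree reads back as human-auditable rule patterns, replacing
# manual rule authoring by domain experts.
print(export_text(tree, feature_names=["temperature", "vibration"]))
print(tree.predict([[83, 0.85]]))   # classify a new primitive-event window
```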

    Understanding the Organizational Impact of Radio Frequency Identification Technology: A Holistic View

    The adoption and deployment of radio frequency identification (RFID) technology in retail supply chains results in an influx of data, supporting the development of better information and increased knowledge. This impacts not only an organization's information technology infrastructure, but also the quality and timeliness of its business intelligence and decision-making. This paper provides an introduction to RFID technology and surveys a variety of its applications, then examines and discusses the impact of RFID technology on organizational IT infrastructure, business intelligence, and decision-making. Propositions are advanced to provide the basis for the development of specific hypotheses to be empirically tested in future studies, and a conceptual research framework for understanding the organizational impact of RFID technology is proposed. Available at: https://aisel.aisnet.org/pajais/vol2/iss2/3

    ModeL4CEP: Graphical domain-specific modeling languages for CEP domains and event patterns

    Complex event processing (CEP) is a cutting-edge technology that allows the analysis and correlation of large volumes of data with the aim of detecting complex and meaningful events through the use of event patterns, as well as permitting the inference of valuable knowledge for end users. Despite the great advantages that CEP can bring to expert or intelligent business systems, it poses a substantial challenge to its users, who are business experts but lack the knowledge and experience needed to use this technology. The main problem these users face is precisely hand-writing the code for event pattern definitions, which requires them to implement the conditions to be met to detect situations relevant to the domain in question using a particular event processing language (EPL). To respond to this need, in this paper we propose both a graphical domain-specific modeling language (DSML) that facilitates CEP domain definition by domain experts, and a graphical DSML for event pattern definition by non-technical users. The proposed languages provide high expressiveness and flexibility and are independent of the implementation code for event patterns and actions. This way, domain experts can define the relevant event types and patterns within their business domain without having to be experts in EPL programming, or in other complicated computer science issues, beyond producing an understandable and intuitive graphical definition. Furthermore, with these DSMLs, users can also define the actions to be taken automatically once a pattern is detected in the system. Further benefits of these DSMLs are evaluated and discussed in depth in this paper.
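    For context, the snippet below shows roughly what a hand-written pattern looks like in Esper-style EPL, the kind of code the proposed DSMLs spare users from writing; the event type, fields, and thresholds are hypothetical.

```python
# A minimal illustration of hand-written EPL (held here as a Python string);
# the statement follows classic Esper EPL syntax but is not from the paper.

epl_pattern = """
select a.sensorId, a.value
from pattern [ every a=TemperatureEvent(value > 40)
               -> b=TemperatureEvent(sensorId = a.sensorId, value > 40)
               where timer:within(10 sec) ]
"""
# A domain expert must get event types, correlations, and timing operators
# exactly right; a graphical DSML lets the same pattern be drawn instead.
print(epl_pattern)
```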

    Analysis of Traceability Optimization and Shareholder’s Profit for Efficient Supply Chain Operation under Product Recall Crisis

    Product recalls have gained considerable importance in recent times, probably owing to the huge losses manufacturers face over product recall issues. A recall also severely affects the firm's revenue, which may lead to serious consequences. Large recalls incur huge costs, such as repairing or destroying the recalled products and notifying customers. Therefore, to minimize the size and probability of recalls, traceability systems are widely used and considered a necessary part of product safety strategies. However, the literature makes clear that manufacturers are still struggling to obtain significant results. This study helps managers understand the importance of recall cost by analysing its impact on shareholders' profit. Given the importance of the problem, the paper proposes an integrated optimization model that minimizes the expected loss to shareholders in a recall crisis using the batch dispersion methodology. The results show that reducing the traceability level increases the expected shareholder losses while decreasing operational costs. This helps managers set production batch sizes optimally in order to reduce the impact of a product recall.
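    A toy model of the underlying trade-off may help: finer traceability (smaller batches) shrinks each recall's reach but raises operational cost. The sketch below uses a hypothetical cost model and numbers, not the paper's batch dispersion formulation.

```python
# A minimal sketch of the batch-size trade-off: each quality incident recalls
# a whole batch, while every batch carries a fixed setup/tracking cost.
# All parameters are hypothetical illustrations.

def expected_loss(batch_size, demand=10_000, defect_prob=0.01,
                  recall_cost_per_unit=50.0, setup_cost=200.0):
    incidents = demand * defect_prob                  # expected defect events
    recall_loss = incidents * batch_size * recall_cost_per_unit
    operational = (demand / batch_size) * setup_cost
    return recall_loss + operational

# Scan candidate batch sizes to expose the convex trade-off (optimum near 20).
best = min(range(5, 101, 5), key=expected_loss)
print(best, round(expected_loss(best), 2))
```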

    MEdit4CEP: A model-driven solution for real-time decision making in SOA 2.0

    Organizations all around the world need to manage huge amounts of data from heterogeneous sources every day in order to conduct decision-making processes. This requires them to infer what the value of such data is for the business in question through data analysis, as well as to act promptly in critical or relevant situations. Complex Event Processing (CEP) is a technology that helps tackle this issue by detecting event patterns in real time. However, this technology forces domain experts to define the patterns indicating such situations, and the appropriate actions to be executed in their information systems, which are generally based on Service-Oriented Architectures (SOAs). In particular, these users face the inconvenience of implementing these patterns manually or by using editors that are not sufficiently user-friendly. To deal with this problem, a model-driven solution for real-time decision making in event-driven SOAs is proposed and evaluated in this paper. This approach allows CEP to be integrated with this architecture type, and CEP domains and event patterns to be defined through a graphical and intuitive editor that also permits automatic code generation. Moreover, the solution is evaluated and its benefits are discussed. As a result, we can assert that this is a novel solution for bringing CEP technology closer to any user, positively impacting business decision-making processes.
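    The code-generation step can be pictured with a small Python sketch: a pattern captured in a graphical editor is held as a model, and EPL is generated from it. The model schema and generator below are hypothetical illustrations, not MEdit4CEP's actual metamodel.

```python
# A minimal sketch of model-to-code generation for an event pattern; the
# generated statement follows classic Esper EPL syntax.

pattern_model = {
    "name": "HighTemperature",
    "event_type": "TemperatureEvent",
    "condition": "value > 40",
    "window_sec": 10,
}

def to_epl(model):
    """Render a pattern model as an EPL statement (illustrative template)."""
    return ("@Name('{name}') select * from {event_type}({condition})"
            ".win:time({window_sec} sec)").format(**model)

print(to_epl(pattern_model))
```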

    Formal Modelling of Complex Event Processing and its Application to a Manufacturing Line

    Identifying the most significant and needed information in large enterprises at the right time not only helps in decision making, but also plays an important role in the enterprise's overall performance and profitability. Complex Event Processing (CEP) is a developing method for processing different events from multiple sources and filtering them to produce complex events. This thesis provides a methodology for modelling CEP using Timed Net Condition Event Systems (TNCES), a formalism derived from Petri nets. Petri nets are a graphical, mathematical modelling language used to analyse and describe discrete-event dynamic systems. The biggest advantage of representing CEP in TNCES is that it opens a path to validating the event filtering and decision making at different levels of the enterprise.
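    The firing semantics that make such validation possible can be sketched in a few lines of Python: places hold tokens, and a transition fires only when all of its input places are sufficiently marked. The toy net below (a two-stage event filter) is illustrative, not the thesis's TNCES manufacturing-line model.

```python
# A minimal Petri net firing sketch: markings, pre/post sets, and an
# enabledness check. Place and transition names are hypothetical.

marking = {"raw_event": 1, "filter_ready": 1, "filtered": 0, "reported": 0}

transitions = {
    "filter": ({"raw_event": 1, "filter_ready": 1}, {"filtered": 1}),
    "report": ({"filtered": 1}, {"reported": 1}),
}

def fire(name):
    """Fire a transition if enabled; return whether it fired."""
    pre, post = transitions[name]
    if all(marking[p] >= k for p, k in pre.items()):   # enabled?
        for p, k in pre.items():
            marking[p] -= k                            # consume input tokens
        for p, k in post.items():
            marking[p] += k                            # produce output tokens
        return True
    return False

fire("filter")
fire("report")
print(marking)   # {'raw_event': 0, 'filter_ready': 0, 'filtered': 0, 'reported': 1}
```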

    Temporal and Contextual Dependencies in Relational Data Modeling

    Although a solid theoretical foundation of relational data modeling has existed for decades, critical reassessment from the perspective of temporal requirements reveals shortcomings in its integrity constraints. We identify the need for this work by discussing how existing relational databases fail to ensure correctness of data when the data to be stored is time sensitive. The analysis presented in this work becomes particularly important in present times where, because of relational databases' inadequacy to cater to all requirements, new forms of database systems such as temporal databases, active databases, real-time databases, and NoSQL (non-relational) databases have been introduced. In relational databases, temporal requirements have been dealt with either at the application level using scripts or through manual assistance, but no attempts have been made to address them at the design level. These requirements are the ones whose metadata must change as time progresses, which remains unsupported by Relational Database Management Systems (RDBMS) to date. Starting with the shortcomings of data, entity, and referential integrity in relational data modeling, we propose a new form of integrity that works at a more detailed level of granularity. We also present several important concepts, including temporal dependency, contextual dependency, and cell-level integrity. We then introduce cellular constraints to implement the proposed integrity and dependencies, and show how they can be incorporated into the relational data model to enable RDBMS to handle temporal requirements in the future. Overall, we provide a formal description addressing the temporal requirements problem in the relational data model, and design a framework for solving it. We supplement our proposition with examples, experiments, and results.
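    The kind of gap the paper targets can be illustrated with a short sqlite3 sketch: standard row-level constraints cannot declare that validity periods for the same fact must not overlap, so today the temporal check must live in application code. The schema below is an illustrative example, not the paper's cellular-constraint mechanism.

```python
# A minimal sketch of a temporal integrity check enforced in application
# code because the RDBMS cannot declare it; table and columns are hypothetical.

import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE tax_rate (
                 region TEXT, rate REAL,
                 valid_from INTEGER, valid_to INTEGER)""")

def insert_rate(region, rate, valid_from, valid_to):
    # Reject intervals overlapping an existing row for the same region,
    # a cross-row condition plain CHECK constraints cannot express.
    overlap = db.execute(
        "SELECT 1 FROM tax_rate WHERE region=? AND valid_from<? AND valid_to>?",
        (region, valid_to, valid_from)).fetchone()
    if overlap:
        raise ValueError("overlapping validity period")
    db.execute("INSERT INTO tax_rate VALUES (?,?,?,?)",
               (region, rate, valid_from, valid_to))

insert_rate("EU", 0.20, 2015, 2020)
insert_rate("EU", 0.21, 2020, 2025)     # adjacent, accepted
# insert_rate("EU", 0.19, 2018, 2022)   # would raise: overlaps 2015-2020
```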