
    Integration of Event Processing with Service-oriented Architectures and Business Processes

    Data sources like the Internet of Things or Cyber-physical Systems provide enormous amounts of real-time information in the form of streams of events. The use of such event streams enables reactive software components as building blocks in a new generation of systems. Businesses, for example, can benefit from the integration of event streams; new services can be provided to customers, or existing business processes can be improved. The development of reactive systems and their integration with existing application landscapes, however, is challenging. While traditional system components follow a pull-based request/reply interaction style, event-based systems follow a push-based interaction scheme; events arrive continuously and application logic is triggered implicitly. To benefit from push-based and pull-based interactions together, an intuitive software abstraction is necessary to integrate push-based application logic with existing systems. In this work we introduce such an abstraction: we present Event Stream Processing Units (SPUs) - a container model for the encapsulation of event-processing application logic at the technical layer as well as at the business process layer. At the technical layer SPUs provide a service-like abstraction and simplify the development of scalable reactive applications. At the business process layer SPUs make event processing explicitly representable. SPUs have a managed lifecycle and are instantiated implicitly - upon arrival of appropriate events - or explicitly upon request. At the business process layer SPUs encapsulate application logic for event stream processing and enable a seamless transition between process models, executable process representations, and components at the IT layer. Throughout this work, we focus on different aspects of the SPU container model: we first introduce the SPU container model and its execution semantics. Since SPUs rely on a publish/subscribe system for event dissemination, we discuss quality of service requirements in the context of event processing. SPUs rely on input in the form of events; in event-based systems, however, event production is logically decoupled, i.e., event producers are not aware of the event consumers. This influences the system development process and requires an appropriate methodology. For this purpose we present a requirements engineering approach that takes the specifics of event-based applications into account. The integration of events with business processes leads to new business opportunities. SPUs can encapsulate event processing at the abstraction level of business functions and enable a seamless integration with business processes. For this integration, we introduce extensions to the business process modeling notations BPMN and EPCs to model SPUs. We also present a model-to-execute workflow for SPU-containing process models and its implementation with business process modeling software. The SPU container model itself is language-agnostic; thus, we present Eventlets as an SPU implementation based on Java Enterprise technology. Eventlets are executed inside a distributed middleware and follow a managed lifecycle. They reduce the development effort of scalable event processing applications, as we show in our evaluation. Since the SPU container model introduces an additional layer of abstraction, we analyze its performance overhead and show that Eventlets can compete with traditional event processing approaches.
    SPUs can be used to process sensitive data, e.g., in health care environments. Thus, privacy protection is an important requirement for certain use cases, and we sketch the application of a privacy-preserving event dissemination scheme to protect event consumers and producers from curious brokers. We also quantify the overhead introduced by such a privacy-preserving brokering scheme in an evaluation.
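
    To make the container abstraction more concrete, the following sketch shows what an SPU-style component could look like in plain Java. It is an illustrative approximation, not the Eventlet API from the work: the interface name, lifecycle callbacks, event type and threshold are all hypothetical, and a real deployment would run inside the distributed middleware rather than standalone.

        import java.util.ArrayList;
        import java.util.List;

        // Hypothetical event carried on a stream (not the work's event model).
        record SensorEvent(String patientId, double heartRate) {}

        // Illustrative SPU-style contract: a managed lifecycle plus an implicit
        // trigger: the runtime instantiates the unit when matching events
        // arrive and calls onEvent() for each of them.
        interface StreamProcessingUnit {
            void onActivate();               // called when the instance is created
            void onEvent(SensorEvent event); // called for every matching event
            void onPassivate();              // called when the instance is retired
        }

        // Example unit: raises an alert when the average heart rate of the last
        // five readings for one patient exceeds a made-up threshold.
        class HeartRateMonitorUnit implements StreamProcessingUnit {
            private final List<Double> window = new ArrayList<>();

            @Override public void onActivate() {
                System.out.println("SPU instance activated");
            }

            @Override public void onEvent(SensorEvent event) {
                window.add(event.heartRate());
                if (window.size() > 5) window.remove(0);
                double avg = window.stream().mapToDouble(Double::doubleValue).average().orElse(0);
                if (window.size() == 5 && avg > 120) {
                    System.out.println("Alert for patient " + event.patientId() + ", avg = " + avg);
                }
            }

            @Override public void onPassivate() {
                System.out.println("SPU instance passivated");
            }
        }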

    MEdit4CEP: A model-driven solution for real-time decision making in SOA 2.0

    Organizations all around the world need to manage huge amounts of data from heterogeneous sources every day in order to conduct decision making processes. This requires them to infer what the value of such data is for the business in question through data analysis, as well as to act promptly in critical or relevant situations. Complex Event Processing (CEP) is a technology that helps tackle this issue by detecting event patterns in real time. However, this technology forces domain experts to define the patterns indicating such situations and the appropriate actions to be executed in their information systems, which are generally based on Service-Oriented Architectures (SOAs). In particular, these users face the inconvenience of implementing these patterns manually or by using editors which are not user-friendly enough. To deal with this problem, a model-driven solution for real-time decision making in event-driven SOAs is proposed in this paper. This approach allows the integration of CEP with this architecture type, as well as the definition of CEP domains and event patterns through a graphical and intuitive editor, which also permits automatic code generation. Moreover, the solution is evaluated and its benefits are discussed. As a result, we can assert that this is a novel solution for bringing CEP technology closer to any user, positively impacting business decision-making processes.
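
    As a rough illustration of the kind of situation-detection logic such an editor targets, the sketch below hand-codes a simple windowed threshold pattern in plain Java. The event type, sensor domain and threshold are assumptions made for the example; a real MEdit4CEP deployment would generate the pattern for a CEP engine from the graphical model rather than rely on hand-written code.

        import java.util.ArrayDeque;
        import java.util.Deque;

        // Hypothetical domain event used only for this example.
        record TemperatureReading(String sensorId, double celsius, long timestampMillis) {}

        // Detects the pattern "at least three readings above 40 degrees Celsius
        // within a sliding 60-second window" and reacts by invoking an action,
        // mirroring the detect-and-act cycle CEP brings to event-driven SOAs.
        class OverheatPatternDetector {
            private final Deque<TemperatureReading> recent = new ArrayDeque<>();

            void onReading(TemperatureReading r) {
                recent.addLast(r);
                // keep only the readings from the last 60 seconds
                while (!recent.isEmpty()
                        && r.timestampMillis() - recent.peekFirst().timestampMillis() > 60_000) {
                    recent.removeFirst();
                }
                long hot = recent.stream().filter(x -> x.celsius() > 40.0).count();
                if (hot >= 3) {
                    triggerAction(r.sensorId());
                    recent.clear(); // avoid repeated alerts for the same episode
                }
            }

            private void triggerAction(String sensorId) {
                // in an event-driven SOA this would invoke a service or publish an event
                System.out.println("Overheat pattern detected at sensor " + sensorId);
            }
        }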

    SOA, SoBI & EDA - Paradigms for Integration Capabilities of BI Platform

    A Business Intelligence (BI) provider may offer a basic solution, a packaged application or a comprehensive BI platform which integrates components from individual technologies into a synergistic system. The providers’ tendency is to standardize the instruments offered on a single server platform. The article analyzes the integration capabilities and problems of BI platforms, emphasizes the differences between emergent technologies and suggests integration solutions. The analysis is useful both to providers of BI solutions, who need to develop agile platforms, and to their users, for whom it is an important factor in selecting a solution. In addition, the conclusions drawn emphasize trends in the BI market and support the creation of agile platforms.
    Keywords: Business Intelligence, agile platform, integration, metadata management, Service Oriented Architecture (SOA), Event Driven Architecture (EDA)

    Quality-aware model-driven service engineering

    Service engineering and service-oriented architecture, as integration and platform technologies, are a recent approach to software systems integration. Quality aspects ranging from interoperability to maintainability to performance are of central importance for the integration of heterogeneous, distributed service-based systems. Architecture models can substantially influence the quality attributes of the implemented software systems. Besides the benefits of explicit architectures for maintainability and reuse, architectural constraints such as styles, reference architectures and architectural patterns can influence observable software properties such as performance. Empirical performance evaluation is the process of measuring and evaluating the performance of implemented software. We present an approach for addressing the quality of services and service-based systems at the model level in the context of model-driven service engineering. The focus on architecture-level models is a consequence of the black-box character of services.

    Integration of BPM systems

    New technologies have emerged to support the global economy, where, for instance, suppliers, manufacturers and retailers work together in order to minimise cost and maximise efficiency. One of the technologies that has become a buzzword for many businesses is business process management, or BPM. A business process comprises activities and tasks, the resources required to perform each task, and the business rules linking these activities and tasks. The tasks may be performed by human and/or machine actors. Workflow provides a way of describing the order of execution and the dependent relationships between the constituting activities of short or long running processes. Workflow allows businesses to capture not only the information but also the processes that transform the information - the process asset (Koulopoulos, T. M., 1995). Applications which involve automated, human-centric and collaborative processes across organisations are inherently different from one organisation to another. Even within the same organisation, applications are adapted over time, as ongoing change to business processes is the norm in today’s dynamic business environment. The major difference lies in the specifics of business processes, which change rapidly in order to match the way in which businesses operate. In this chapter we introduce and discuss Business Process Management (BPM) with a focus on the integration of heterogeneous BPM systems across multiple organisations. We identify the problems and the main challenges, not only with regard to technologies but also in the social and cultural context. We also discuss the issues that have arisen in our bid to find solutions.
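
    As a minimal illustration of how a workflow captures ordering and dependencies between activities, the sketch below models a process as tasks with explicit predecessor links and executes them in dependency order. All task names are invented for the example, and it deliberately abstracts away human actors, long-running state and cross-organisational concerns.

        import java.util.LinkedHashMap;
        import java.util.LinkedHashSet;
        import java.util.List;
        import java.util.Map;
        import java.util.Set;

        // Minimal workflow model: each task lists the tasks it depends on.
        class SimpleWorkflow {
            private final Map<String, List<String>> dependencies = new LinkedHashMap<>();

            void addTask(String task, String... dependsOn) {
                dependencies.put(task, List.of(dependsOn));
            }

            // Executes tasks in an order that respects the dependencies (topological order).
            void run() {
                Set<String> done = new LinkedHashSet<>();
                while (done.size() < dependencies.size()) {
                    boolean progressed = false;
                    for (var entry : dependencies.entrySet()) {
                        if (!done.contains(entry.getKey()) && done.containsAll(entry.getValue())) {
                            System.out.println("Executing task: " + entry.getKey());
                            done.add(entry.getKey());
                            progressed = true;
                        }
                    }
                    if (!progressed) throw new IllegalStateException("Cyclic or missing dependency");
                }
            }

            public static void main(String[] args) {
                SimpleWorkflow order = new SimpleWorkflow();
                order.addTask("ReceiveOrder");
                order.addTask("CheckStock", "ReceiveOrder");
                order.addTask("ApproveCredit", "ReceiveOrder");
                order.addTask("ShipGoods", "CheckStock", "ApproveCredit");
                order.run();
            }
        }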

    Real-Time Context-Aware Microservice Architecture for Predictive Analytics and Smart Decision-Making

    The impressive evolution of the Internet of Things and the great amount of data flowing through the systems provide us with an inspiring scenario for Big Data analytics and advantageous real-time context-aware predictions and smart decision-making. However, this requires a scalable system for constant streaming processing, also provided with the ability to make decisions and take actions based on the performed predictions. This paper proposes a scalable architecture to provide real-time context-aware actions based on predictive streaming processing of data, as an evolution of a previously proposed event-driven service-oriented architecture which already permitted the context-aware detection and notification of relevant data. For this purpose, we have defined and implemented a microservice-based architecture which provides real-time context-aware actions based on predictive streaming processing of data. As a result, our architecture has been enhanced twofold: on the one hand, the architecture has been supplied with reliable predictions through the use of predictive analytics and complex event processing techniques, which permit the notification of relevant context-aware information ahead of time. On the other hand, it has been refactored towards a microservice architecture pattern, greatly improving its maintainability and evolution. The architecture's performance has been evaluated with an air quality case study.
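
    To give a flavour of the predict-then-act loop described above, the sketch below shows a self-contained Java service that keeps a short window of incoming measurements, forecasts the next value with a simple moving average, and triggers an action when the forecast crosses a threshold. The class name, the PM10 threshold and the prediction method are illustrative assumptions, not components of the architecture itself, which relies on full predictive analytics and CEP techniques.

        import java.util.ArrayDeque;
        import java.util.Deque;

        // Illustrative streaming microservice: consumes air-quality measurements,
        // predicts the next value with a moving average, and acts ahead of time
        // when the prediction exceeds an assumed threshold.
        class AirQualityPredictorService {
            private static final double PM10_ALERT_THRESHOLD = 50.0; // assumed limit in micrograms per cubic metre
            private final Deque<Double> window = new ArrayDeque<>();
            private final int windowSize;

            AirQualityPredictorService(int windowSize) {
                this.windowSize = windowSize;
            }

            void onMeasurement(double pm10) {
                window.addLast(pm10);
                if (window.size() > windowSize) window.removeFirst();
                if (window.size() < windowSize) return; // not enough history yet

                // naive one-step-ahead prediction: the mean of the recent window
                double predicted = window.stream().mapToDouble(Double::doubleValue).average().orElse(pm10);
                if (predicted > PM10_ALERT_THRESHOLD) {
                    notifyDownstream(predicted);
                }
            }

            private void notifyDownstream(double predicted) {
                // in the real architecture this step would publish an event or call another microservice
                System.out.println("Predicted PM10 " + predicted + " exceeds threshold; issuing early warning");
            }

            public static void main(String[] args) {
                AirQualityPredictorService service = new AirQualityPredictorService(4);
                for (double v : new double[]{35, 42, 48, 55, 61}) service.onMeasurement(v);
            }
        }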

    The Management of Manufacturing-Oriented Informatics Systems Using Efficient and Flexible Architectures

    Industry, and in particular the manufacturing-oriented sector, has always been a subject of research and innovation as a result of technological progress, diversification and differentiation among consumers' demands. A company that provides its customers with products that perfectly match their demands at competitive prices has a great advantage over its competitors. Manufacturing-oriented information systems are becoming more flexible and configurable, and they require integration with the entire organization. This can be achieved using efficient software architectures that allow the coexistence of commercial solutions and open source components while sharing computing resources organized in grid infrastructures and under the governance of powerful management tools.
    Keywords: Manufacturing-Oriented Informatics Systems, Open Source, Software Architectures, Grid Computing, Web-Based Management Systems