35 research outputs found

    Application of Hybrid Agents to Smart Energy Management of a Prosumer Node

    Get PDF
    We outline a solution to the problem of intelligent control of the energy consumption of a smart building by a prosumer planning agent that acts on the basis of knowledge of the system state and of predictions of future states. Predictions are obtained from a synthetic model of the system learned with a machine learning approach. We present case-study simulations implementing different instantiations of agents that control an air conditioner according to temperature set points dynamically chosen by the user. The agents are able to save energy while keeping the indoor temperature within a given comfort interval.
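    A minimal sketch of the planning loop described above, in Python. The learned model is replaced here by a toy surrogate, and all names, coefficients, and thresholds are illustrative assumptions rather than the authors' implementation.

```python
# Sketch of a prosumer planning agent: each step predicts the next indoor
# temperature for every candidate AC power level using a (hypothetical)
# learned model, then picks the cheapest level whose prediction stays
# inside the user-chosen comfort interval.

def predict_temp(indoor: float, outdoor: float, ac_power: float) -> float:
    """Stand-in for a machine-learned model of the building's thermal response."""
    # Toy surrogate: drift toward the outdoor temperature, cooled by the AC.
    return indoor + 0.1 * (outdoor - indoor) - 2.0 * ac_power

def plan_step(indoor: float, outdoor: float,
              comfort=(20.0, 24.0), ac_levels=(0.0, 0.5, 1.0)) -> float:
    """Return the lowest AC power whose predicted temperature stays in comfort."""
    low, high = comfort
    for power in sorted(ac_levels):              # cheapest actions first
        if low <= predict_temp(indoor, outdoor, power) <= high:
            return power
    return max(ac_levels)                        # fall back to full power

if __name__ == "__main__":
    print(plan_step(indoor=25.0, outdoor=32.0))  # hot day: full power (1.0)
```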

    Efficient and Effective Event Pattern Management

    Get PDF
    The goal of this thesis is to reduce the barriers that keep enterprises from adopting complex event processing (CEP) technology by providing additional support for managing relevant business situations. We therefore outline the role of event pattern management and present a methodology, methods, and tools for efficient and effective event pattern management. We provide a meta model for event patterns, an event pattern life cycle methodology, and methods for guidance, refinement, and evolution.
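    As a rough illustration of what a meta model with an explicit life cycle might look like, the following Python sketch models an event pattern with a state and a transition history; the field names and states are assumptions for illustration, not the thesis's actual meta model.

```python
# Minimal sketch of an event-pattern meta model with an explicit life cycle.
from dataclasses import dataclass, field
from enum import Enum, auto

class PatternState(Enum):
    DRAFT = auto()
    DEPLOYED = auto()
    EVOLVED = auto()
    RETIRED = auto()

@dataclass
class EventPattern:
    name: str
    condition: str                 # e.g. a CEP rule expressed as text
    business_situation: str        # the business situation the pattern detects
    state: PatternState = PatternState.DRAFT
    history: list = field(default_factory=list)

    def transition(self, new_state: PatternState) -> None:
        """Record a life-cycle transition (a guidance/refinement/evolution step)."""
        self.history.append((self.state, new_state))
        self.state = new_state

pattern = EventPattern("late-shipment", "Order -> no Shipment within 48h",
                       "delayed order fulfilment")
pattern.transition(PatternState.DEPLOYED)
```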

    Improving data preparation for the application of process mining

    Get PDF
    Immersed in what is already known as the fourth industrial revolution, automation and data exchange are taking on a particularly relevant role in complex environments such as industrial manufacturing and logistics. This digitisation and transition to the Industry 4.0 paradigm is leading experts to analyse business processes from other perspectives. Consequently, where management and business intelligence used to dominate, process mining appears as a link, building a bridge between both disciplines to unite and improve them. This new perspective on process analysis helps to improve strategic decision making and competitive capabilities. Process mining brings together the data and process perspectives in a single discipline that covers the entire spectrum of process management. Through process mining, and based on observations of their actual operations, organisations can understand the state of their operations, detect deviations, and improve their performance based on what they observe. In this way, process mining is an ally, occupying a large part of current academic and industrial research. However, although this discipline is receiving more and more attention, it presents severe application problems when implemented in real environments. The variety of input data in terms of form, content, semantics, and levels of abstraction makes the execution of process mining tasks in industry an iterative, tedious, and manual process, requiring multidisciplinary experts with extensive knowledge of the domain, process management, and data processing. Currently, although there are numerous academic proposals, there are no industrial solutions capable of automating these tasks. For this reason, in this thesis by compendium we address the problem of improving business processes in complex environments through a study of the state of the art and a set of proposals that improve relevant aspects of the process life cycle, from log creation and log preparation to process quality assessment and business process improvement. Firstly, a systematic study of the literature was carried out to gain in-depth knowledge of the state of the art in this field, as well as of the different challenges faced by the discipline. This analysis allowed us to detect a number of challenges that have not been addressed or have received insufficient attention, of which three have been selected as the objectives of this thesis. The first challenge concerns the assessment of the quality of the input data, known as event logs: deciding which log-improvement techniques need to be applied must be based on the quality level of the initial data. This thesis therefore presents a methodology and a set of metrics that support the expert in selecting which technique to apply according to the quality estimated at each moment, another challenge identified in our analysis of the literature. Likewise, a set of metrics to evaluate the quality of the resulting process models is also proposed, with the aim of assessing whether improving the quality of the input data has a direct impact on the final results. The second challenge identified is the need to improve the input data used in the analysis of business processes.
    As in any data-driven discipline, the quality of the results strongly depends on the quality of the input data, so the second challenge to be addressed is improving the preparation of event logs. The contribution in this area is the application of natural language processing techniques to relabel activities from textual descriptions of process activities, as well as the application of clustering techniques to help simplify the results, generating models that are more understandable from a human point of view. Finally, the third challenge concerns process optimisation; we contribute an approach for optimising the resources associated with business processes which, by including decision-making in the creation of flexible processes, enables significant cost reductions. Furthermore, all the proposals in this thesis were designed and validated in collaboration with experts from different fields of industry and have been evaluated through real case studies in public and private projects with the aeronautical industry and the logistics sector.
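    To make the idea of event-log quality metrics concrete, here is an illustrative Python sketch (the thesis's actual metrics are not reproduced): an event log is a list of events with a case id, an activity label, and a timestamp, and each metric reports the fraction of events or cases that satisfy a basic quality requirement.

```python
# Two toy event-log quality metrics: attribute completeness per event and
# chronological orderliness per case. Attribute names are illustrative.
from datetime import datetime

REQUIRED = ("case_id", "activity", "timestamp")

def completeness(log):
    """Share of events that carry all required attributes."""
    ok = sum(all(e.get(k) not in (None, "") for k in REQUIRED) for e in log)
    return ok / len(log) if log else 0.0

def timestamp_orderliness(log):
    """Share of cases whose events are already in chronological order
    (cases with missing timestamps count as unordered)."""
    cases = {}
    for e in log:
        cases.setdefault(e.get("case_id"), []).append(e.get("timestamp"))
    ordered = sum(ts == sorted(ts) for ts in cases.values() if all(ts))
    return ordered / len(cases) if cases else 0.0

log = [
    {"case_id": "1", "activity": "create order", "timestamp": datetime(2023, 1, 1)},
    {"case_id": "1", "activity": "",             "timestamp": datetime(2023, 1, 2)},
]
print(completeness(log), timestamp_orderliness(log))  # 0.5 1.0
```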

    Medical devices with embedded electronics: design and development methodology for start-ups

    Get PDF
    The biotechnology sector demands constant innovation to meet the challenges of the healthcare sector. Developments such as the recent COVID-19 pandemic, the ageing of the population, rising dependency rates, and the need to promote personalised healthcare in both hospital and home settings highlight the need to develop monitoring and diagnostic medical devices that are increasingly sophisticated, reliable, and connected, and to do so quickly and effectively. In this scenario, embedded systems have become a key technology for rapidly designing innovative, low-cost solutions. Aware of the opportunity in the sector, a growing number of so-called "biotech start-ups" are entering the medical device business. Despite having great ideas and technical solutions, many end up failing through lack of knowledge of the healthcare sector and of the regulatory requirements that must be met. The large number of technical and regulatory requirements makes a procedural methodology necessary to carry out such developments. This thesis therefore defines and validates a methodology for the design and development of embedded medical devices.

    Efficient Context-aware Real-time Processing of Personal Data Streams

    Get PDF
    In this dissertation we propose a framework for the development of innovative mobile applications that process real-time personal data streams in a context-aware manner while taking into account the resource limitations of mobile devices, in order to achieve efficient processing of real-time sensor data for various use cases.
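    One way to picture resource-aware stream processing is the following Python sketch (not the dissertation's framework): a hypothetical battery level decides how aggressively sensor readings are downsampled before an expensive analysis step runs; all names and thresholds are assumptions.

```python
# Illustrative resource-aware downsampling of a personal sensor stream.

def adaptive_downsample(readings, battery_level):
    """Keep every reading when battery is high, every 2nd or 5th otherwise."""
    step = 1 if battery_level > 0.5 else 2 if battery_level > 0.2 else 5
    return readings[::step]

def process_stream(readings, battery_level):
    sample = adaptive_downsample(readings, battery_level)
    # Placeholder for the costly context-aware analysis on the reduced sample.
    return sum(sample) / len(sample) if sample else None

print(process_stream(list(range(100)), battery_level=0.15))  # analyses 1 in 5
```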

    Emotional theory of rationality

    Full text link

    Integration of Event Processing with Service-oriented Architectures and Business Processes

    Get PDF
    Data sources like the Internet of Things or Cyber-physical Systems provide enormous amounts of real-time information in the form of streams of events. The use of such event streams enables reactive software components as building blocks in a new generation of systems. Businesses, for example, can benefit from the integration of event streams; new services can be provided to customers, or existing business processes can be improved. The development of reactive systems and their integration with existing application landscapes, however, are challenging. While traditional system components follow a pull-based request/reply interaction style, event-based systems follow a push-based interaction scheme; events arrive continuously and application logic is triggered implicitly. To benefit from push-based and pull-based interactions together, an intuitive software abstraction is necessary to integrate push-based application logic with existing systems. In this work we introduce such an abstraction: we present Event Stream Processing Units (SPUs), a container model for the encapsulation of event-processing application logic at the technical layer as well as at the business process layer. At the technical layer SPUs provide a service-like abstraction and simplify the development of scalable reactive applications. At the business process layer SPUs make event processing explicitly representable. SPUs have a managed lifecycle and are instantiated implicitly, upon arrival of appropriate events, or explicitly upon request. At the business process layer SPUs encapsulate application logic for event stream processing and enable a seamless transition between process models, executable process representations, and components at the IT layer. Throughout this work, we focus on different aspects of the SPU container model: we first introduce the SPU container model and its execution semantics. Since SPUs rely on a publish/subscribe system for event dissemination, we discuss quality of service requirements in the context of event processing. SPUs rely on input in the form of events; in event-based systems, however, event production is logically decoupled, i.e., event producers are not aware of event consumers. This influences the system development process and requires an appropriate methodology. For this purpose we present a requirements engineering approach that takes the specifics of event-based applications into account. The integration of events with business processes leads to new business opportunities. SPUs can encapsulate event processing at the abstraction level of business functions and enable a seamless integration with business processes. For this integration, we introduce extensions to the business process modeling notations BPMN and EPCs to model SPUs. We also present a model-to-execute workflow for SPU-containing process models and its implementation with business process modeling software. The SPU container model itself is language-agnostic; thus, we present Eventlets as an SPU implementation based on Java Enterprise technology. Eventlets are executed inside a distributed middleware and follow a lifecycle. They reduce the development effort of scalable event processing applications, as we show in our evaluation. Since the SPU container model introduces an additional layer of abstraction, we analyze its performance overhead and show that Eventlets can compete with traditional event processing approaches.
    SPUs can be used to process sensitive data, e.g., in health care environments. Privacy protection is therefore an important requirement for certain use cases, and we sketch the application of a privacy-preserving event dissemination scheme that protects event consumers and producers from curious brokers. We also quantify the overhead introduced by such a privacy-preserving brokering scheme in an evaluation.
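    The container idea can be illustrated with a small Python analogue (the thesis's Eventlets are Java Enterprise components; this sketch only mimics the lifecycle described above, and all names are illustrative): an SPU instance is created implicitly when an event with a new correlation key arrives, receives all further events for that key, and is destroyed on a closing event.

```python
# Python analogue of the SPU/Eventlet container lifecycle.
class StreamProcessingUnit:
    def on_start(self, key): ...
    def on_event(self, event): ...
    def on_stop(self): ...

class OrderMonitor(StreamProcessingUnit):          # example SPU implementation
    def on_start(self, key):
        self.key, self.count = key, 0
    def on_event(self, event):
        self.count += 1
    def on_stop(self):
        print(f"order {self.key}: {self.count} events")

class SPUContainer:
    """Manages SPU instances: one per correlation key, created on demand."""
    def __init__(self, spu_class, key_field, close_type):
        self.spu_class, self.key_field, self.close_type = spu_class, key_field, close_type
        self.instances = {}

    def dispatch(self, event):
        key = event[self.key_field]
        spu = self.instances.get(key)
        if spu is None:                             # implicit instantiation
            spu = self.instances[key] = self.spu_class()
            spu.on_start(key)
        spu.on_event(event)
        if event["type"] == self.close_type:        # managed destruction
            spu.on_stop()
            del self.instances[key]

container = SPUContainer(OrderMonitor, key_field="order_id", close_type="shipped")
for ev in [{"order_id": 7, "type": "created"}, {"order_id": 7, "type": "shipped"}]:
    container.dispatch(ev)                          # prints "order 7: 2 events"
```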

    Developing methods of resilience for design practice

    Get PDF
    The researcher noted that, living and working in Puerto Rico under politically and socio-economically difficult and sometimes threatening conditions at the time of this programme of research, there was something to be learnt from those designers who exhibited resilience to stressful events. Therefore, the specific purpose of this practice-led programme of research was to understand designers’ decision-making processes when under political and socio-economic stressors and to question how they can make strategically successful decisions that enable them to thrive. The first objective was to identify and define resilient strategic thinking. To do this, the researcher reflected upon her own thinking and practices as an art director and design educator suffering the adversities of political and socio-economic disintegration in her own context. This self-reflective process revealed her use of a number of coping tools, which became the set of Real-Time Response Planning (RTRP) tools for managing adversity. The second objective was to explore the possibility of teaching the strategic application of the RTRP tools to other designers who were also experiencing their own stressors. In reviewing designers’ engagement with these tools, the third objective was to develop an effective graphic articulation of the RTRP toolbox. This enabled the fourth objective, which was to measure the effectiveness of the RTRP toolbox in guiding designers towards radical resilience, towards bouncing forward as a more adaptive response to adverse conditions. The research began with a Reflective Practice and Action Research approach; however, a critical review of its appropriateness within this socio-political context of design practice moved the researcher to apply the Systematization of Experience method. A Systematization workshop was conducted, applying Participatory Action Research and Participatory Design to the creation of the RTRP toolbox paper prototype as a vehicle for observing the application of the RTRP tools during design practice. This programme of research found that the RTRP tools were able to positively support thriving and resilience as defined by Resilience Theory. The toolbox successfully supported the teaching of resilience behaviours at a personal and local level, enabling the development of positive coping strategies in real time, and informed the planning of longer-term strategies for similar adversities in the future. The current global economic crisis has left many designers with insecure futures, yet there is an expectation that they will carry on efficiently to maintain their livelihoods and lifestyles in the face of daily adversity. The RTRP tools offer designers a means of managing these experiences and help them see opportunities.

    Ontology-based context-aware model for event processing in an IoT environment

    Get PDF
    The Internet of Things (IoT) is increasingly becoming one of the fundamental sources of data. The observations produced by these sources are made accessible with heterogeneous vocabularies, models, and data formats. This heterogeneity in such an enormous environment complicates the task of sharing and reusing this data in more intelligent ways (beyond the purposes it was initially set up for). In this research, we investigate these challenges, considering how we can transform raw sensor data into more meaningful information. This raw data is modelled using ontology-based information that is accessible through continuous queries over streaming sensor data. Interoperability among heterogeneous entities is an important issue in an IoT environment. Semantic modelling is a key element to support interoperability. Most current IoT ontologies focus mainly on resource and service information. This research builds upon the current state-of-the-art ontologies to provide contextual information and facilitate sensor data querying. We present an Ontology to represent an IoT environment, with emphasis on temporal and geospatial context enrichment. Furthermore, the Ontology is used alongside a proposed syntax based on Description Logic to build an Event Processing Model. The aim of this model is to interconnect ontology-based reasoning with event processing; it enables event processing to be performed over high-level ontological concepts. The Ontology was developed using the NeOn methodology, which emphasises reuse and modularisation. The Competency Questions technique was used to develop the requirements of this Ontology, which was later evaluated by domain experts in software engineering and cloud computing. The Ontology was evaluated on its completeness, conciseness, consistency, and expandability; over 70% of the domain experts agreed on the core modules, concepts, and relationships within it. The resulting Ontology provides a core IoT ontology that could be used for further development within a specific IoT domain. The proposed Ontology-Based Context-Aware model for Event Processing in an IoT environment (OCEM-IoT) implements all the temporal operators used in complex event processing engines. Throughput and latency were used as performance comparison metrics for the syntax evaluation; the results obtained show improved performance over existing event processing languages.
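    As a simplified illustration of event processing over ontological concepts (not the thesis's OCEM-IoT syntax), the Python sketch below types events with concepts from a tiny class hierarchy and applies a SEQ temporal operator within a time window, so that subclasses of a concept also satisfy the pattern; the hierarchy, concept names, and window are illustrative assumptions.

```python
# SEQ over concept-typed events with toy subclass reasoning.
SUBCLASS_OF = {                       # toy ontology: child -> parent
    "TemperatureObservation": "Observation",
    "SmokeObservation": "Observation",
    "SmokeObservation_High": "SmokeObservation",
}

def is_a(concept, ancestor):
    """True if `concept` equals `ancestor` or is one of its descendants."""
    while concept is not None:
        if concept == ancestor:
            return True
        concept = SUBCLASS_OF.get(concept)
    return False

def seq(events, concepts, window):
    """Match `concepts` in order within `window` seconds (events sorted by time)."""
    idx, start = 0, None
    for time, concept in events:
        if is_a(concept, concepts[idx]):
            start = time if idx == 0 else start
            idx += 1
            if idx == len(concepts):
                return time - start <= window
    return False

events = [(0, "TemperatureObservation"), (30, "SmokeObservation_High")]
print(seq(events, ["TemperatureObservation", "SmokeObservation"], window=60))  # True
```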