2,929 research outputs found

    Elastic Business Process Management: State of the Art and Open Challenges for BPM in the Cloud

    Full text link
    With the advent of cloud computing, organizations are nowadays able to react rapidly to changing demands for computational resources. Not only can individual applications be hosted on virtual cloud infrastructures, but so can complete business processes. This allows the realization of so-called elastic processes, i.e., processes which are carried out using elastic cloud resources. Despite the manifold benefits of elastic processes, there is still a lack of solutions supporting them. In this paper, we identify the state of the art of elastic Business Process Management with a focus on infrastructural challenges. We conceptualize an architecture for an elastic Business Process Management System and discuss existing work on scheduling, resource allocation, monitoring, decentralized coordination, and state management for elastic processes. Furthermore, we present two representative elastic Business Process Management Systems which are intended to counter these challenges. Based on our findings, we identify open issues and outline possible research directions for the realization of elastic processes and elastic Business Process Management. Comment: Please cite as: S. Schulte, C. Janiesch, S. Venugopal, I. Weber, and P. Hoenisch (2015). Elastic Business Process Management: State of the Art and Open Challenges for BPM in the Cloud. Future Generation Computer Systems, Volume NN, Number N, NN-NN., http://dx.doi.org/10.1016/j.future.2014.09.00
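
    The scheduling and resource-allocation challenge described above can be pictured with a minimal sketch (illustrative only, not taken from the paper): a naive threshold-based scaler that acquires or releases cloud VMs based on the backlog of pending process steps. The class name, the steps-per-VM ratio, and the thresholds are assumptions for demonstration.

        # Hypothetical elastic-process scaler; all names and numbers are illustrative.
        class ElasticScheduler:
            def __init__(self, steps_per_vm=20, min_vms=1):
                self.steps_per_vm = steps_per_vm  # process steps one VM can absorb
                self.min_vms = min_vms            # never scale below this floor
                self.vms = min_vms

            def rebalance(self, pending_steps):
                """Return how many VMs to acquire (positive) or release (negative)."""
                target = max(self.min_vms, -(-pending_steps // self.steps_per_vm))  # ceiling division
                delta = target - self.vms
                self.vms = target
                return delta

        scheduler = ElasticScheduler()
        print(scheduler.rebalance(pending_steps=95))  # +4: scale out to 5 VMs
        print(scheduler.rebalance(pending_steps=10))  # -4: scale back in to 1 VM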

    Cloud Computing cost and energy optimization through Federated Cloud SoS

    Get PDF
    2017 Fall. Includes bibliographical references. The two most significant differentiators amongst contemporary Cloud Computing service providers are increased green energy use and datacenter resource utilization. This work addresses these two issues from a systems-architecture optimization viewpoint. The approach proposed herein allows multiple cloud providers to utilize their individual computing resources in three ways: (1) cutting the number of datacenters needed, (2) scheduling available datacenter grid energy via aggregators to reduce costs and power outages, and (3) utilizing, where appropriate, more renewable and carbon-free energy sources. Altogether, our proposed approach creates an alternative paradigm: a Federated Cloud SoS. The proposed paradigm employs a novel control methodology that is tuned to obtain both financial and environmental advantages. It also supports dynamic expansion and contraction of computing capabilities for handling sudden variations in service demand as well as for maximizing usage of time-varying green energy supplies. Herein we analyze the core SoS requirements, concept synthesis, and functional architecture with an eye on avoiding inadvertent cascading conditions, and we suggest a physical architecture that simulates the primary SoS emergent behavior so as to diminish unwanted outcomes while encouraging desirable results. In our approach, the constituent cloud services retain their independent ownership, objectives, funding, and sustainability means. The report also analyzes optimal computing generation methods and optimal energy utilization for computing generation, as well as a procedure for building optimal datacenters using a unique hardware computing system design based on the openCompute community as an illustrative collaboration platform. Finally, the research concludes with the security features a cloud federation must support to protect its constituents, its constituents' tenants, and itself from security risks.
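
    The cost-and-green-energy trade-off sketched in this abstract can be illustrated (not taken from the thesis) by a federated scheduler that places a workload on the member datacenter with the best mix of grid price and currently available renewable energy. All field names and weights below are assumptions.

        from dataclasses import dataclass

        @dataclass
        class Datacenter:
            name: str
            price_per_kwh: float   # current grid energy price
            green_fraction: float  # share of renewable/carbon-free energy, 0.0..1.0
            free_capacity: int     # schedulable compute units

        def pick_datacenter(fleet, demand, price_weight=0.5, green_weight=0.5):
            """Pick the cheapest/greenest member datacenter that can absorb the demand."""
            candidates = [dc for dc in fleet if dc.free_capacity >= demand]
            if not candidates:
                return None  # a real SoS would expand capacity or defer the workload
            # Lower score is better: pay for grid price, earn credit for green energy.
            return min(candidates,
                       key=lambda dc: price_weight * dc.price_per_kwh
                                      - green_weight * dc.green_fraction)

        fleet = [Datacenter("dc-east", 0.12, 0.30, 80),
                 Datacenter("dc-west", 0.15, 0.85, 50)]
        print(pick_datacenter(fleet, demand=40).name)  # dc-west: the greener mix wins here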

    Blockchain for IoT Access Control: Recent Trends and Future Research Directions

    Full text link
    With the rapid development of wireless sensor networks, smart devices, and traditional information and communication technologies, there is tremendous growth in the use of Internet of Things (IoT) applications and services in our everyday life. IoT systems deal with high volumes of data. This data can be particularly sensitive, as it may include health, financial, location, and other highly personal information. Fine-grained security management in IoT demands effective access control. Several proposals discuss access control for the IoT; however, limited attention has been given to the emerging blockchain-based solutions for IoT access control. In this paper, we review the recent trends and critical needs for blockchain-based solutions for IoT access control. We identify several important aspects of blockchain for IoT access control, including decentralised control and secure storage and sharing of information in a trustless manner, along with their benefits and limitations. Finally, we note some future research directions on how to integrate blockchain into IoT access control efficiently and effectively.
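
    One way to picture the blockchain-based access control surveyed here (an illustrative sketch only, not a scheme from the paper) is an append-only, hash-chained log of grant and revoke events that is replayed to answer access queries; tampering with any past event breaks the chain of hashes. The record fields are assumptions.

        import hashlib, json, time

        class AccessPolicyChain:
            """Tamper-evident log of IoT access-control events (illustrative only)."""

            def __init__(self):
                self.blocks = []

            def _append(self, record):
                prev_hash = self.blocks[-1]["hash"] if self.blocks else "0" * 64
                payload = json.dumps({"record": record, "prev": prev_hash}, sort_keys=True)
                self.blocks.append({"record": record, "prev": prev_hash,
                                    "hash": hashlib.sha256(payload.encode()).hexdigest()})

            def grant(self, subject, device, action):
                self._append({"op": "grant", "subject": subject, "device": device,
                              "action": action, "ts": time.time()})

            def revoke(self, subject, device, action):
                self._append({"op": "revoke", "subject": subject, "device": device,
                              "action": action, "ts": time.time()})

            def is_allowed(self, subject, device, action):
                allowed = False
                for block in self.blocks:  # replay the chain; the last matching event wins
                    r = block["record"]
                    if (r["subject"], r["device"], r["action"]) == (subject, device, action):
                        allowed = r["op"] == "grant"
                return allowed

        chain = AccessPolicyChain()
        chain.grant("alice", "thermostat-1", "read")
        print(chain.is_allowed("alice", "thermostat-1", "read"))   # True
        chain.revoke("alice", "thermostat-1", "read")
        print(chain.is_allowed("alice", "thermostat-1", "read"))   # False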

    The Smart Mobile Application Framework (SMAF) - Exploratory Evaluation in the Smart City Context

    Get PDF
    What makes mobile apps "smart"? This paper addresses this question by seeking to identify the inherent characteristics of smartness. Starting with the etymological foundations of the term, elements of smart behavior in software applications are extracted from the literature, elaborated and contrasted. Based on these findings, we propose a Smart Mobile Application Framework incorporating a set of activities and qualities associated with smart mobile software. The framework is applied to analyze a specific mobile application in the context of Smart Cities and proves its applicability for uncovering the implementation of smart concepts in real-world settings. Hence, this work contributes to research by conceptualizing a new type of application and provides useful insights to practitioners who want to design, implement or evaluate smart mobile applications.

    Training of Crisis Mappers and Map Production from Multi-sensor Data: Vernazza Case Study (Cinque Terre National Park, Italy)

    Get PDF
    The aim of this paper is to present the development of a multidisciplinary project carried out in cooperation between Politecnico di Torino and ITHACA (Information Technology for Humanitarian Assistance, Cooperation and Action). The goal of the project was training in geospatial data acquisition and processing for students attending Architecture and Engineering courses, in order to start up a team of "volunteer mappers". Indeed, the project aims to document the environmental and built heritage subject to disaster; the purpose is to improve the capabilities of the actors involved in the activities connected with geospatial data collection, integration and sharing. The proposed area for testing the training activities is the Cinque Terre National Park, registered in the World Heritage List since 1997. The area was affected by a flood on the 25th of October 2011. In line with other international experiences, the group is expected to be active after emergencies in order to upgrade maps, using data acquired by typical geomatic methods and techniques such as terrestrial and aerial Lidar, close-range and aerial photogrammetry, topographic and GNSS instruments, etc., or by non-conventional systems and instruments such as UAVs, mobile mapping, etc. The ultimate goal is to implement a WebGIS platform to share all the data collected with local authorities and the Civil Protection.

    Development of a real-time business intelligence (BI) framework based on hex-elementization of data points for accurate business decision-making

    Get PDF
    The desire to use business intelligence (BI) to enhance the efficiency and effectiveness of business decisions is neither new nor revolutionary. The promise of BI is to provide the ability to capture interrelationships from data and information to guide action towards a business goal. Although BI has been around since the 1960s, businesses still cannot get competitive information in the form they want, when they want it and how they want it. Business decisions are already full of challenges. The challenges in business decision-making include the use of a vast amount of data, adopting new technologies, and making decisions on a real-time basis. To address these challenges, businesses spend valuable time and resources on data, technologies and business processes. Integration of data in decision-making is crucial for modern businesses. This research aims to propose and validate a framework for the organic integration of data into business decision-making. The proposed framework enables efficient business decisions in real time. The core of this research is to understand and modularise the pre-established set of data points into intelligent and granular "hex-elements" (stated simply, a hex-element is a data point with six properties). These intelligent hex-elements build semi-automatic relationships, using their six properties, between large-volume and high-velocity data points in a dynamic, automated and integrated manner. The proposed business intelligence framework is called "Hex-Elementization" (or "Hex-E" for short). The evolution of technology presents ongoing challenges to BI. These challenges stem from the nature of the underlying new-age data, characterised by large volume, high velocity and wide variety. Efficient and effective analysis of such data depends on the business context and the corresponding technical capabilities of the organisation. Technologies like Big Data, the Internet of Things (IoT), Artificial Intelligence (AI) and Machine Learning (ML) play a key role in capitalising on the variety, volume and veracity of data. Extricating the "value" from data in its various forms, depths and scales requires synchronizing technologies with analytics and business processes. Transforming data into useful and actionable intelligence is the discipline of data scientists. Data scientists and data analysts use sophisticated tools to crunch data into information which, in turn, is converted into intelligence. The transformation of data into information and its final consumption as actionable business intelligence is an end-to-end journey. This end-to-end transformation of data to intelligence is complex, time-consuming and resource-intensive. This research explores approaches to ease the challenges of the end-to-end transformation of data into intelligence. It presents Hex-E as a simplified and semi-automated framework to integrate, unify, correlate and coalesce data (from diverse sources and disparate formats) into intelligence. Furthermore, this framework aims to unify data from diverse sources and disparate formats to help businesses make accurate and timely decisions.
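
    Since the abstract defines a hex-element only as a data point with six properties, the following sketch is purely illustrative and not taken from the thesis: the six property names and the matching rule are hypothetical stand-ins for how such elements might link themselves to related data points.

        from dataclasses import dataclass

        @dataclass
        class HexElement:
            value: object
            # six illustrative properties; the actual six properties are defined in the thesis
            source: str = ""
            timestamp: str = ""
            category: str = ""
            location: str = ""
            owner: str = ""
            quality: str = ""

            def relates_to(self, other, min_shared=3):
                """Link two elements that share at least `min_shared` non-empty properties."""
                props = ("source", "timestamp", "category", "location", "owner", "quality")
                shared = sum(1 for p in props
                             if getattr(self, p) and getattr(self, p) == getattr(other, p))
                return shared >= min_shared

        a = HexElement(42, source="crm", category="sale", location="NY", owner="retail", quality="high")
        b = HexElement(17, source="crm", category="sale", location="NY", owner="web", quality="low")
        print(a.relates_to(b))  # True: source, category and location coincide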

    Taking Computation to Data: Integrating Privacy-preserving AI techniques and Blockchain Allowing Secure Analysis of Sensitive Data on Premise

    Get PDF
    PhD thesis in Information Technology. With the advancement of artificial intelligence (AI), digital pathology has seen significant progress in recent years. However, the use of medical AI raises concerns about patient data privacy. The CLARIFY project is a research project funded under the European Union's Marie Sklodowska-Curie Actions (MSCA) program. The primary objective of CLARIFY is to create a reliable, automated digital diagnostic platform that utilizes cloud-based data algorithms and artificial intelligence to enable interpretation and diagnosis of whole-slide images (WSI) from any location, maximizing the advantages of AI-based digital pathology. My research as an early-stage researcher in the CLARIFY project centers on securing information systems using machine learning and access control techniques. To achieve this goal, I extensively researched privacy-protection technologies such as federated learning, differential privacy, dataset distillation, and blockchain. These technologies have different priorities in terms of privacy, computational efficiency, and usability. Therefore, we designed a computing system that supports different levels of privacy protection, based on the concept of taking computation to data. Our approach rests on two design principles. First, when external users need to access internal data, a robust access control mechanism must be established to limit unauthorized access. Second, raw data should be processed in place so as to ensure privacy and security. Specifically, we use smart contract-based access control and decentralized identity technology at the system security boundary to ensure the flexibility and immutability of verification. Where the raw data still cannot be accessed directly, we propose to use dataset distillation to filter out private information, or to use a locally trained model as a data agent. Our research focuses on improving the usability of these methods, and this thesis serves as a demonstration of current privacy-preserving and secure computing technologies.
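
    The "taking computation to data" principle described above can be pictured with a small sketch (illustrative only, not the CLARIFY platform): the data holder checks the caller against its access policy, runs a vetted analysis on premise, and returns only the aggregate result, never the raw records. The access check and the analysis function are assumptions standing in for the smart-contract and decentralized-identity machinery.

        class DataHolder:
            def __init__(self, records, authorized_subjects):
                self._records = records                   # raw data never leaves this object
                self._authorized = set(authorized_subjects)

            def run(self, subject, computation):
                """Execute an approved computation on premise; return only its output."""
                if subject not in self._authorized:       # stand-in for smart-contract / DID check
                    raise PermissionError(f"{subject} is not authorized")
                return computation(self._records)

        def mean_tumor_size(records):
            sizes = [r["tumor_size_mm"] for r in records]
            return sum(sizes) / len(sizes)

        holder = DataHolder(records=[{"tumor_size_mm": 12.0}, {"tumor_size_mm": 18.0}],
                            authorized_subjects={"clarify-researcher-01"})
        print(holder.run("clarify-researcher-01", mean_tumor_size))  # 15.0, aggregate only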