
    The Cloud-to-Thing Continuum

    The Internet of Things offers massive societal and economic opportunities while at the same time posing significant challenges, not least the delivery and management of the technical infrastructure underpinning it, the deluge of data generated from it, ensuring privacy and security, and capturing value from it. This Open Access Pivot explores these challenges, presenting the state of the art and future directions for research, as well as frameworks for making sense of this complex area. The book provides a variety of perspectives on how technology innovations such as fog, edge, and dew computing, 5G networks, and distributed intelligence are making us rethink conventional cloud computing to support the Internet of Things. Much of the book focuses on technical aspects of the Internet of Things; however, clear methodologies for mapping the business value of the Internet of Things are still missing, and we provide a value mapping framework to address this gap. While there is much hype about the Internet of Things, we have yet to reach the tipping point. As such, this book provides a timely entrée for higher education educators, researchers and students, industry, and policy makers on the technologies that promise to reshape how society interacts and operates.

    Automated Cloud-to-Cloud Migration of Distributed Software Systems for Privacy Compliance

    With the ever-growing number of distributed cloud applications and an increasing number of data protection regulations, interest in legally compliant cloud applications is growing. However, many operators do not know the legal status of their applications. In 2018 the new EU data protection regulation will come into force, and it includes severe penalties for privacy violations. One of the most important factors for compliance with this regulation is that master data of EU citizens be processed within the EU. For this rule we have developed, formalized, implemented, and evaluated a privacy analysis. In addition, with iObserve Privacy we have built a system following the MAPE principle that automatically detects privacy violations and computes an alternative, privacy-compliant system hosting. iObserve Privacy then automatically migrates the cloud application to this alternative hosting. In this way we can guarantee a legally compliant distribution of the cloud application without analysing or understanding the system in depth; however, we require the closed-world assumption. We use PerOpteryx to generate legally compliant alternative hostings. Based on such a hosting, we compute a sequence of adaptation steps to restore legal compliance. When errors occur, we fall back on the operator-in-the-loop principle of iObserve. As the underlying data model we use the Palladio Component Model. In this thesis we describe the concepts in detail, point out implementation details, and evaluate iObserve with respect to precision and scalability.
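    The MAPE loop (monitor, analyze, plan, execute) behind such a system can be sketched roughly as follows; the geo-compliance rule is the one described above, but all function, class, and region names are illustrative assumptions, not iObserve Privacy's actual API:

```python
# Minimal MAPE-style loop: detect components holding EU master data outside
# EU regions and plan a compliant re-deployment. Names are illustrative.

EU_REGIONS = {"eu-west", "eu-central"}

def monitor(deployment):
    """Collect the current hosting region of every component."""
    return list(deployment.items())

def analyze(observations):
    """Flag components hosted outside EU regions (the privacy violation)."""
    return [name for name, region in observations if region not in EU_REGIONS]

def plan(violations):
    """Compute adaptation steps: migrate each offending component."""
    return [("migrate", name, "eu-central") for name in violations]

def execute(deployment, steps):
    """Apply the adaptation steps to the deployment model."""
    for _, name, target in steps:
        deployment[name] = target
    return deployment

deployment = {"frontend": "eu-west", "user-db": "us-east"}
steps = plan(analyze(monitor(deployment)))
deployment = execute(deployment, steps)
print(deployment)  # {'frontend': 'eu-west', 'user-db': 'eu-central'}
```

    In the real system, the analyze and plan phases operate on a Palladio Component Model rather than a flat dictionary, and PerOpteryx searches for the alternative hosting.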

    Towards Data Sharing across Decentralized and Federated IoT Data Analytics Platforms

    In the past decade the Internet-of-Things concept has overwhelmingly entered all of the fields where data are produced and processed, resulting in a plethora of IoT platforms, typically cloud-based, that centralize data and services management. In this scenario, the development of IoT services in domains such as smart cities, smart industry, e-health, and automotive is possible only for the owner of the IoT deployments or through ad-hoc one-to-one business collaboration agreements. The realization of "smarter" IoT services, or even services that are not viable today, requires complete data sharing, the use of multiple data sources from multiple parties, and interconnection with other IoT services. In this context, this work studies several aspects of data sharing focusing on the Internet of Things. We work towards the hyperconnection of IoT services to analyze data beyond the boundaries of a single IoT system. This thesis presents a data analytics platform that: i) treats data analytics processes as services and decouples their management from data analytics development; ii) decentralizes data management and the execution of data analytics services between fog, edge, and cloud; iii) federates peers of data analytics platforms managed by multiple parties, allowing the design to scale into federations of federations; iv) encompasses intelligent handling of security and data usage control across the federation of decentralized platform instances to reduce data and service management complexity. The proposed solution is experimentally evaluated in terms of performance and validated against use cases. Further, this work adopts and extends available standards and open-source software, after an analysis of their capabilities, fostering easier acceptance of the proposed framework. We also report efforts to initiate an IoT services ecosystem among 27 cities in Europe and Korea based on a novel methodology.
    We believe that this thesis opens a viable path towards the hyperconnection of IoT data and services, minimizing the human effort needed to manage it while leaving full control of data and service management to the users.
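    Point iii), federating platform peers owned by multiple parties, can be illustrated with a minimal sketch; the `Peer` class, its usage-control flag, and the query interface are hypothetical simplifications, not the thesis's actual design:

```python
# Sketch of federated analytics: a local platform forwards a query to peer
# platforms and merges results, subject to each peer's usage-control policy.
# All names are illustrative; real platforms would use authenticated APIs.

class Peer:
    def __init__(self, name, datasets, share_allowed=True):
        self.name = name
        self.datasets = datasets            # dataset name -> list of records
        self.share_allowed = share_allowed  # simplistic usage-control flag

    def query(self, dataset):
        if not self.share_allowed:
            return []                       # policy denies sharing
        return self.datasets.get(dataset, [])

def federated_query(peers, dataset):
    """Collect a dataset from every peer that permits sharing."""
    results = []
    for peer in peers:
        results.extend(peer.query(dataset))
    return results

peers = [
    Peer("city-a", {"traffic": [12, 18]}),
    Peer("city-b", {"traffic": [7]}),
    Peer("city-c", {"traffic": [99]}, share_allowed=False),
]
print(federated_query(peers, "traffic"))  # [12, 18, 7]
```

    Because each peer enforces its own policy locally, the federation can scale into federations of federations without a central data owner.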

    Security implications of digitalization: The dangers of data colonialism and the way towards sustainable and sovereign management of environmental data

    Digitalization opens up new opportunities in the collection, analysis, and presentation of data, which can contribute to the achievement of the 2030 Agenda and its Sustainable Development Goals (SDGs). In particular, access to and control of environmental and geospatial data is fundamental to identifying and understanding global issues and trends. Immediate crises such as the COVID-19 pandemic also demonstrate the importance of accurate health data such as infection statistics, and the relevance of digital tools like video-conferencing platforms. However, today much of the data is collected and processed by private actors. Thus, governments and researchers depend on data platforms and proprietary systems of big tech companies such as Google or Microsoft. The market capitalization of the seven largest US and Chinese big tech companies has grown to 8.7tn USD in recent years, about twice the size of Germany's gross domestic product (GDP). Their market power is therefore enormous, allowing them to dictate many rules of the digital space and even interfere with legislation. Based on a literature review and nine expert interviews, this study presents a framework that identifies the risks and consequences along the workflow of collecting, processing, storing, and using data. It also includes solutions that governmental and multilateral actors can strive for to alleviate the risks. Fundamental to this framework is the novel concept of "data colonialism", which describes today's trend of private companies appropriating the digital sphere. Historically, colonial nations used to grab indigenous land and exploit the cheap labor of slave workers. In a similar way, today's big tech corporations use the cheap data of their users to produce valuable services and thus create enormous market power. This study was prepared under contract to the Federal Department of Foreign Affairs (FDFA). The authors bear responsibility for the content.

    Training of Crisis Mappers and Map Production from Multi-sensor Data: Vernazza Case Study (Cinque Terre National Park, Italy)

    The aim of this paper is to present the development of a multidisciplinary project carried out in cooperation between Politecnico di Torino and ITHACA (Information Technology for Humanitarian Assistance, Cooperation and Action). The goal of the project was training in geospatial data acquisition and processing for students attending Architecture and Engineering courses, in order to start up a team of "volunteer mappers". The project aims to document environmental and built heritage subject to disaster; the purpose is to improve the capabilities of the actors involved in geospatial data collection, integration, and sharing. The proposed area for testing the training activities is the Cinque Terre National Park, registered on the World Heritage List since 1997, which was affected by a flood on the 25th of October 2011. In line with other international experiences, the group is expected to be active after emergencies in order to update maps, using data acquired by typical geomatic methods and techniques such as terrestrial and aerial Lidar, close-range and aerial photogrammetry, and topographic and GNSS instruments, or by non-conventional systems and instruments such as UAVs and mobile mapping. The ultimate goal is to implement a WebGIS platform to share all the data collected with local authorities and the Civil Protection.

    Advancing security information and event management frameworks in managed enterprises using geolocation

    Includes bibliographical references.
    Security Information and Event Management (SIEM) technology supports security threat detection and response through real-time and historical analysis of security events from a range of data sources. Through the retrieval of mass feedback from many components and security systems within a computing environment, SIEMs are able to correlate and analyse events with a view to incident detection. The hypothesis of this study is that existing Security Information and Event Management techniques and solutions can be complemented by location-based information provided by feeder systems. In addition, and associated with the introduction of location information, it is hypothesised that privacy-enforcing procedures on geolocation data in SIEMs and meta-systems alike are necessary and enforceable. The method of the study was to augment a SIEM, established for the collection of events in an enterprise service management environment, with geolocation data. By introducing the location dimension, it was possible to expand the correlation rules of the SIEM with location attributes and to see how this improved security confidence. An important co-consideration is the effect on privacy, where location information of an individual or system is propagated to a SIEM. With a theoretical consideration of the current privacy directives and regulations (specifically as promulgated in the European Union), privacy-supporting techniques are introduced to diminish the accuracy of the location information while still enabling enhanced security analysis. In the context of a European Union FP7 project relating to next-generation SIEMs, the results of this work have been implemented based on systems, data, techniques, and resilient features of the MASSIF project.
    In particular, AlienVault has been used as the platform for augmentation, and an event set of several million events, collected over a three-month period, has formed the basis for the implementation and experimentation. A "brute-force attack" misuse case scenario was selected to highlight the benefits of geolocation information as an enhancement to SIEM detection (and false-positive prevention). With respect to privacy, a privacy model is introduced for SIEM frameworks. This model uses existing privacy legislation, the most stringent in terms of privacy, as a basis. An analysis of the implementation and testing is conducted, focusing equally on data security and privacy: assessing location-based information in enhancing SIEM capability in advanced security detection, and determining whether privacy-enforcing procedures on geolocation in SIEMs and other meta-systems are achievable and enforceable. Opportunities for geolocation enhancing various security techniques are considered, specifically for solving misuse cases identified as existing problems in enterprise environments. In summary, the research shows that additional security confidence and insight can be achieved through the augmentation of SIEM event information with geolocation information. Through the use of spatial cloaking it is also possible to incorporate location information without compromising individual privacy. Overall, the research reveals that there are significant benefits for SIEMs in using geolocation in their analysis calculations, and that this can be done in ways acceptable to privacy considerations when judged against prevailing privacy legislation and guidelines.
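    The spatial cloaking mentioned above, i.e. deliberately coarsening a location before it reaches the SIEM so that events remain correlatable by area without pinpointing an individual, can be sketched as follows; the grid-cell approach and the `cell_deg` parameter are illustrative assumptions, not the MASSIF implementation:

```python
# Simple spatial cloaking: snap a coordinate to the centre of a coarse grid
# cell. The SIEM can still correlate events by region, but the exact position
# of a user or device is never disclosed. Cell size is an assumed tunable.

def cloak(lat, lon, cell_deg=0.1):
    """Snap (lat, lon) to the centre of its cell_deg x cell_deg grid cell."""
    def snap(v):
        return round((v // cell_deg) * cell_deg + cell_deg / 2, 6)
    return snap(lat), snap(lon)

print(cloak(52.520008, 13.404954))  # (52.55, 13.45)
```

    Larger cells give stronger privacy at the cost of coarser correlation rules, which mirrors the accuracy-versus-privacy trade-off discussed in the work.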

    High-Performance Modelling and Simulation for Big Data Applications

    This open access book was prepared as the final publication of the COST Action IC1406 “High-Performance Modelling and Simulation for Big Data Applications (cHiPSet)“ project. Long considered important pillars of the scientific method, modelling and simulation have evolved from traditional discrete numerical methods to complex data-intensive continuous analytical optimisations. Resolution, scale, and accuracy have become essential to predicting and analysing natural and complex systems in science and engineering. As their level of abstraction rises to allow a better discernment of the domain at hand, their representation becomes increasingly demanding in computational and data resources. High Performance Computing, on the other hand, typically entails the effective use of parallel and distributed processing units coupled with efficient storage, communication, and visualisation systems to underpin complex data-intensive applications in distinct scientific and technical domains. A seamless interaction of High Performance Computing with Modelling and Simulation is therefore arguably required in order to store, compute, analyse, and visualise large data sets in science and engineering. Funded by the European Commission, cHiPSet has provided a dynamic trans-European forum for its members and distinguished guests to openly discuss novel perspectives and topics of interest for these two communities. This cHiPSet compendium presents a set of selected case studies related to healthcare, biological data, computational advertising, multimedia, finance, bioinformatics, and telecommunications.

    SciTech News Volume 71, No. 1 (2017)

    Columns and Reports
        From the Editor 3
    Division News
        Science-Technology Division 5
        Chemistry Division 8
        Engineering Division
            Aerospace Section of the Engineering Division 9
            Architecture, Building Engineering, Construction and Design Section of the Engineering Division 11
    Reviews
        Sci-Tech Book News Reviews 12
    Advertisements
        IEEE

    Privacy Protection and Mobility Enhancement in Internet

    Indiana University-Purdue University Indianapolis (IUPUI)
    The Internet has substantially embraced mobility over the last decade. Cellular data networks carry the majority of mobile Internet access traffic and have become the de facto way of accessing the Internet in a mobile fashion, while many clean-slate Internet mobility solutions have been proposed but none has been widely deployed. Internet mobile users are increasingly concerned about their privacy, as both research and real-world incidents show that leaks of communication and location privacy can lead to serious consequences. The communication itself between mobile users and their peers or websites can leak considerable private information, such as location history, to other parties. Additionally, compared to ordinary Internet access, connecting through a cellular network does not yet provide equivalent connection stability or longevity. In this research we propose a novel paradigm that leverages concurrent far-side proxies to maximize network location privacy protection and minimize the interruption and performance penalty brought by mobility. To avoid the deployment-feasibility hurdle, we also investigated the root causes impeding the popularity of existing Internet mobility proposals and propose guidelines on how to create an economically feasible solution to this goal. Based on these findings we designed a mobility support system, offered as a value-added service by mobility service providers and built on an elastic infrastructure that leverages various cloud-aided designs, to satisfy economic feasibility and explore the architectural trade-offs among service QoS, economic viability, security, and privacy.

    Urban Informatics

    This open access book is the first to systematically introduce the principles of urban informatics and their application to every aspect of the city that involves its functioning, control, management, and future planning. It introduces new models and tools being developed to understand and implement these technologies, enabling cities to function more efficiently – to become ‘smart’ and ‘sustainable’. The smart city has quickly emerged as computers have become ever smaller, to the point where they can be embedded into the very fabric of the city, as well as being central to new ways in which the population can communicate and act. When cities are wired in this way, they have the potential to become sentient and responsive, generating massive streams of ‘big’ data in real time as well as providing immense opportunities for extracting new forms of urban data through crowdsourcing. This book offers a comprehensive review of the methods that form the core of urban informatics, from various kinds of urban remote sensing to new approaches to machine learning and statistical modelling. It provides a detailed technical introduction to the wide array of tools information scientists need to develop the key urban analytics that are fundamental to learning about the smart city, and it outlines ways in which these tools can be used to inform design and policy so that cities can become more efficient, with a greater concern for the environment and equity.