
    Ambient intelligence: applications and privacy policies

    Proceedings of: 12th International Conference on Practical Applications of Agents and Multi-Agent Systems, University of Salamanca (Spain), 4th-6th June, 2014. Workshop on Intelligent Systems for Context-based Information Fusion (ISCIF 2014)

    In this paper, we present a complete overview of Ambient Intelligence (AmI) focused on its applications, considering the domains and technologies involved. The applications include AmI at home, care of the elderly and people with disabilities, healthcare, education, business, public services, and leisure and entertainment. The aim of this survey of AmI applications is to show their social and ethical implications, especially privacy issues. Intelligent Environments (IE) collect and process a massive amount of person-related and sensitive data, and the handling of these data must preserve user privacy; privacy is therefore a central concern in AmI applications. Addressing 'design by privacy', an important challenge is the development of an architecture that includes the different privacy policies and specifies how they can be fused in a specific application domain. Ensuring privacy in Intelligent Environments is a difficult problem to solve, as each user has a different perception of privacy and its role in computing. In so-called 'design by privacy', we have to identify the relevant design issues that should be addressed during development. Here we present an approach to the dimensions to consider in order to provide privacy in the design of Ambient Intelligence applications.

    This work has been supported in part by the projects CAM "Contexts" (S2009/TIC-1485), MINECO "Falcon" (TEC2011-28626-C02-02), and MINECO "Tease" (TEC2012-37832-C02-01).

    A System Perspective to Privacy, Security and Resilience in Mobile Applications

    Mobile applications have changed our lives profoundly, but they also create problems for privacy, which is a basic human right. Protection (security) of privacy is an important issue in mobile applications owing to the high likelihood of privacy violations today. This thesis is devoted to a fundamental study of the privacy issue in mobile applications. The overall objective is to advance our understanding of privacy and its related concepts in the context of mobile applications. The thesis has three specific objectives. Objective 1 is to reach a more comprehensive understanding of the concepts of privacy, security and resilience (PSR for short), along with their relationships, in the context of mobile applications. Objective 2 is to develop principles for the design of a mobile application system with satisfactory PSR. Objective 3 is to develop a demonstration system (PSR demo for short) to illustrate how the design principles can be applied. The approach taken in this thesis is based on a general knowledge architecture called FCBPSS (F: function, C: context, B: behavior, P: principle, SS: state and structure). An analysis of the literature was conducted first, resulting in a classification of various privacies against the FCBPSS architecture, followed by the development of a theory of privacy, protection of privacy (security), and resilience of the system that performs the protection of privacy (the PSR theory for short). Principles for the design of a mobile application system based on the PSR theory were then developed, which are expected to guide the practice of developing mobile applications with satisfactory privacy protection. Finally, a demonstration system, a doctor-booking application that minimizes waiting time and energy consumption, was developed to show how the PSR theory and design principles work.
    The main contributions of this thesis are the development of the PSR concept, especially the relationships among privacy (P), security (S), and resilience (R), and a set of design rules for developing a mobile application based on the PSR theory.

    Protecting Visual Information in Augmented Reality from Malicious Application Developers

    Visual applications, those that use camera frames as part of the application, provide a rich, context-aware experience. The continued development of mixed and augmented reality (MR/AR) computing environments furthers the richness of this experience by providing applications with a continuous vision experience, where visual information continuously provides context for applications and the real world is augmented by the virtual. To understand user privacy concerns in continuous vision computing environments, this work studies three MR/AR applications (augmented markers, augmented faces, and text capture) and shows that in a modern mobile system the typical user is exposed to potential mass collection of sensitive information, posing privacy and security deficiencies to be addressed in future systems. To address these deficiencies, a development framework is proposed that provides resource isolation between the user information contained in camera frames and an application's access to the network. The design is implemented using existing system utilities as a proof of concept on the Android operating system, and its viability is demonstrated with a modern state-of-the-art augmented reality library and several augmented reality applications. The design is evaluated on a Samsung Galaxy S8 phone by comparing the applications from the case study with modified versions that better protect user privacy. Early results show that the new design efficiently protects users against data collection in MR/AR applications with less than 0.7% performance overhead.
    Masters Thesis, Computer Engineering, 201
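    The isolation idea in the abstract above can be illustrated with a minimal sketch. This is a hypothetical toy, not the thesis's actual framework or API: a trusted mediator detects sensitive regions in a raw camera frame and hands applications only a redacted copy, so raw pixel data never reaches code that also holds network access. The `TrustedFrameGate` class and its detector callback are invented names for illustration.

    ```python
    # Hypothetical sketch (not the thesis's actual framework): a trusted
    # mediator hands applications only redacted camera frames, so raw
    # pixel data never reaches code that also has network access.

    def redact(frame, sensitive_regions, fill=0):
        """Return a copy of `frame` (a 2D list of pixel values) with each
        (row0, row1, col0, col1) region overwritten by `fill`."""
        out = [row[:] for row in frame]
        for r0, r1, c0, c1 in sensitive_regions:
            for r in range(r0, r1):
                for c in range(c0, c1):
                    out[r][c] = fill
        return out

    class TrustedFrameGate:
        """Only this gate sees raw frames; apps receive redacted copies."""
        def __init__(self, detector):
            # `detector` stands in for e.g. a face or text detector.
            self._detector = detector

        def frame_for_app(self, raw_frame):
            regions = self._detector(raw_frame)
            return redact(raw_frame, regions)

    # Toy usage: a 4x4 frame whose top-left 2x2 block is "sensitive".
    frame = [[1, 2, 3, 4] for _ in range(4)]
    gate = TrustedFrameGate(lambda f: [(0, 2, 0, 2)])
    safe = gate.frame_for_app(frame)
    print(safe[0])  # -> [0, 0, 3, 4]
    ```

    A real implementation would enforce the boundary with OS-level process isolation (as the thesis does on Android) rather than a Python class, but the data-flow constraint is the same: the application only ever sees the sanitized frame.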

    Towards Privacy Preservation of Federated Learning in Artificial Intelligence of Things

    Given the need to process huge amounts of data, provide high-quality services, and protect user privacy in the Artificial Intelligence of Things (AIoT), Federated Learning (FL) has been adopted as a promising technique to facilitate its broad application. Although the importance of developing privacy-preserving FL has attracted attention from many directions, existing research is still far from adequate for real applications. In this dissertation, we propose three lines of privacy-related research addressing three realistic weaknesses of federated learning in AIoT scenarios, solving the problems of private data inference, private data generation, and private data deletion at different stages of the data lifecycle. First, to solve the privacy-inference problem of traditional FL, we design a dual differentially private FL mechanism that achieves privacy preservation efficiently for both the server side and local clients. In particular, our method focuses on FL with non-independent, identically distributed (non-i.i.d.) data and gives a theoretical analysis of both privacy leakage and algorithm convergence. The second problem is to generate heterogeneous data privately in FL. To handle this challenging problem, we design a distributed generative-model framework that can learn a powerful generator in hierarchical AIoT systems. Third, we investigate the newly emerged machine unlearning problem: removing a data point and its influence from a trained machine learning model efficiently and effectively. As the first work on exact federated machine unlearning in the literature, we design a quantization-based method that can remove unlearned data from multiple clients with a significantly higher speed-up. All of the proposed methods are evaluated on different datasets, and the results demonstrate superiority over existing baselines.
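    The differentially private FL mechanism mentioned above builds on a standard primitive that can be sketched briefly. This is an illustrative example of the generic clip-and-noise building block, not the dissertation's specific dual mechanism: each client clips its model update to a bounded L2 norm and adds Gaussian noise before sending it to the server, which then averages the noisy updates. Function names and parameter values are assumptions for the sketch.

    ```python
    # Illustrative sketch of the generic differentially private FL
    # building block (clip-and-noise), not the dissertation's exact
    # dual mechanism. Values and names are chosen for the example.
    import math
    import random

    def clip(update, clip_norm):
        """Scale `update` down so its L2 norm is at most `clip_norm`."""
        norm = math.sqrt(sum(u * u for u in update))
        scale = min(1.0, clip_norm / norm) if norm > 0 else 1.0
        return [u * scale for u in update]

    def privatize(update, clip_norm, noise_std, rng):
        """Clip, then add N(0, noise_std^2) noise to each coordinate."""
        clipped = clip(update, clip_norm)
        return [u + rng.gauss(0.0, noise_std) for u in clipped]

    def aggregate(updates):
        """Server-side averaging of the noisy client updates."""
        n = len(updates)
        return [sum(col) / n for col in zip(*updates)]

    rng = random.Random(0)
    clients = [[3.0, 4.0], [0.3, -0.4], [1.0, 0.0]]
    noisy = [privatize(u, clip_norm=1.0, noise_std=0.1, rng=rng)
             for u in clients]
    avg = aggregate(noisy)
    print(avg)  # each coordinate lies near the mean of the clipped updates
    ```

    Clipping bounds each client's influence on the aggregate, which is what lets the added noise translate into a formal differential-privacy guarantee; the noise scale is then calibrated to the clipping norm and the desired privacy budget.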