
    Privacy Leakages in Approximate Adders

    Approximate computing has recently emerged as a promising method to meet the low-power requirements of digital designs. The erroneous outputs produced in approximate computing can be partially a function of each chip's process variation. We show that, in such schemes, the erroneous outputs produced on each chip instance can reveal the identity of the chip that performed the computation, possibly jeopardizing user privacy. In this work, we perform simulation experiments on 32-bit Ripple Carry Adders, Carry Lookahead Adders, and Han-Carlson Adders running at over-scaled operating points. Our results show that identification is possible; we contrast the identifiability of each type of adder and quantify how identification success varies with the extent of over-scaling and noise. Our results are the first to show that approximate digital computations may compromise privacy. Designers of future approximate computing systems should be aware of the possible privacy leakages and decide whether mitigation is warranted in their application. Comment: 2017 IEEE International Symposium on Circuits and Systems (ISCAS)
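    To make the identification idea concrete, here is a minimal Python sketch: each simulated chip gets its own carry-failure probabilities (a stand-in for process variation under voltage over-scaling), its per-bit error rates over a fixed input set serve as a fingerprint, and an unknown chip is matched to the nearest enrolled fingerprint. The fault model, parameters, and distance metric are illustrative assumptions, not the paper's methodology.

```python
import random

# Hedged sketch: chip identification from approximate-adder error fingerprints.
# The per-chip fault model (random per-bit carry-failure probabilities) is an
# illustrative assumption, not the fault model used in the paper.
WIDTH = 32
N_CHIPS = 20
N_INPUTS = 200

def make_chip(seed):
    """Each simulated 'chip' gets its own per-bit carry-failure probabilities."""
    rng = random.Random(seed)
    return [rng.uniform(0.0, 0.05) for _ in range(WIDTH)]

def approx_add(a, b, chip, rng):
    """Ripple-carry addition in which a carry may be dropped, per the chip's profile."""
    carry, result = 0, 0
    for i in range(WIDTH):
        abit, bbit = (a >> i) & 1, (b >> i) & 1
        result |= (abit ^ bbit ^ carry) << i
        carry = (abit & bbit) | (carry & (abit ^ bbit))
        if rng.random() < chip[i]:      # over-scaling: this carry fails to propagate
            carry = 0
    return result

def signature(chip, inputs, seed):
    """Per-output-bit error rate over the test inputs: a statistical fingerprint."""
    rng = random.Random(seed)
    counts = [0] * WIDTH
    for a, b in inputs:
        diff = ((a + b) % (1 << WIDTH)) ^ approx_add(a, b, chip, rng)
        for i in range(WIDTH):
            counts[i] += (diff >> i) & 1
    return [c / len(inputs) for c in counts]

def identify(observed, enrolled):
    """Return the enrolled chip id whose fingerprint is closest in L2 distance."""
    dist = lambda p, q: sum((x - y) ** 2 for x, y in zip(p, q))
    return min(enrolled, key=lambda cid: dist(observed, enrolled[cid]))

rng = random.Random(42)
inputs = [(rng.getrandbits(WIDTH), rng.getrandbits(WIDTH)) for _ in range(N_INPUTS)]
chips = {cid: make_chip(cid) for cid in range(N_CHIPS)}
enrolled = {cid: signature(chips[cid], inputs, seed=1) for cid in chips}

# Re-observe chip 7 with fresh noise and see whether its errors give it away.
print("identified as chip", identify(signature(chips[7], inputs, seed=2), enrolled))
```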

    A Framework for Designing Fair Ubiquitous Computing Systems

    Over the past few decades, ubiquitous sensors and systems have become an integral part of humans' everyday life. They augment human capabilities and provide personalized experiences across diverse contexts such as healthcare, education, and transportation. However, the widespread adoption of ubiquitous computing has also brought forth concerns regarding fairness and equitable treatment. As these systems can make automated decisions that impact individuals, it is essential to ensure that they do not perpetuate biases or discriminate against specific groups. While fairness in ubiquitous computing has been an acknowledged concern since the 1990s, it remains understudied within the field. To bridge this gap, we propose a framework that incorporates fairness considerations into system design, including prioritizing stakeholder perspectives, inclusive data collection, fairness-aware algorithms, appropriate evaluation criteria, enhancing human engagement while addressing privacy concerns, and interactive improvement with regular monitoring. Our framework aims to guide the development of fair and unbiased ubiquitous computing systems, ensuring equal treatment and positive societal impact. Comment: 8 pages, 1 figure, published in 2023 ACM International Joint Conference on Pervasive and Ubiquitous Computing & the 2023 ACM International Symposium on Wearable Computing
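    As one small, concrete example of the "appropriate evaluation criteria" ingredient, the Python sketch below computes a demographic-parity gap over a log of automated decisions. The metric choice, the toy data, and the tolerance are illustrative assumptions, not part of the proposed framework.

```python
from collections import defaultdict

def positive_rates(decisions, groups):
    """Fraction of positive outcomes (e.g. 'service granted') per group."""
    totals, positives = defaultdict(int), defaultdict(int)
    for decision, group in zip(decisions, groups):
        totals[group] += 1
        positives[group] += int(decision)
    return {g: positives[g] / totals[g] for g in totals}

def demographic_parity_gap(decisions, groups):
    """Largest difference in positive-outcome rates across groups."""
    rates = positive_rates(decisions, groups)
    return max(rates.values()) - min(rates.values())

# Toy audit of an automated decision log; groups "a"/"b" are placeholders
# for whatever attribute the deployment actually monitors.
decisions = [1, 1, 0, 1, 0, 0, 1, 0, 0, 1]
groups    = ["a", "a", "a", "b", "b", "b", "a", "b", "a", "b"]
gap = demographic_parity_gap(decisions, groups)
print(f"demographic parity gap: {gap:.2f}")
if gap > 0.1:       # illustrative tolerance, not a recommended threshold
    print("positive-outcome rates differ across groups; review the pipeline")
```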

    Expressing Privacy Preferences in terms of Invasiveness

    Dynamic context-aware systems need highly flexible privacy-protection mechanisms. We describe an extension to an existing RBAC-based mechanism that utilises a dynamic measure of invasiveness to determine whether contextual information should be released.
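    As a rough illustration of how such a check might look, the Python sketch below layers an invasiveness threshold on top of an ordinary role check. The context types, base scores, precision scaling, and user threshold are invented for illustration and are not taken from the described mechanism.

```python
from dataclasses import dataclass

# Static base invasiveness per context type (0 = harmless, 1 = highly invasive);
# both the types and the scores are invented for this sketch.
BASE_INVASIVENESS = {"room_presence": 0.2, "precise_location": 0.7, "health_status": 0.9}

@dataclass
class Request:
    requester_role: str
    context_type: str
    precision: float            # 0..1; finer-grained requests count as more invasive

@dataclass
class Preference:
    allowed_roles: set
    max_invasiveness: float     # the user's tolerance for released context

def invasiveness(req: Request) -> float:
    """Dynamic measure: base score scaled by the precision actually requested."""
    return BASE_INVASIVENESS.get(req.context_type, 1.0) * req.precision

def may_release(req: Request, pref: Preference) -> bool:
    """Release only if the role is permitted AND the request is tolerably invasive."""
    return (req.requester_role in pref.allowed_roles
            and invasiveness(req) <= pref.max_invasiveness)

pref = Preference(allowed_roles={"colleague", "family"}, max_invasiveness=0.5)
print(may_release(Request("colleague", "room_presence", 1.0), pref))     # True
print(may_release(Request("colleague", "precise_location", 1.0), pref))  # False
print(may_release(Request("colleague", "precise_location", 0.5), pref))  # True: coarser data
```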

    Mobile recommender apps with privacy management for accessible and usable technologies

    The paper presents the preliminary results of an ongoing survey of the use of computers and mobile devices, interest in recommender apps, and knowledge and concerns about privacy issues amongst English- and Italian-speaking disabled people. Participants were found to be regular users of computers and mobile devices for a range of applications. They were interested in recommender apps for household items, computer software, and apps that met their accessibility and other requirements. They showed greater concern about controlling access to personal data of different types than about this data being retained by the computer or mobile device. They were also willing to make tradeoffs to improve device performance.