122,022 research outputs found

    Towards Data Protection Compliance

    Privacy and data protection are fundamental issues today for every organization. This paper calls for the development of methods, techniques and infrastructure that allow the deployment of privacy-aware IT systems, in which humans are an integral part of the organizational processes and accountable for their possible misconduct. In particular, we discuss the challenges that must be addressed to improve organizations' privacy practices, as well as approaches for ensuring compliance with legal requirements while increasing efficiency.

    Privacy-Preserving Accountable Cloud Storage

    In cloud storage services, a wide range of sensitive information may be leaked to the host server via the exposure of access patterns, even though the data itself is encrypted. Many provably secure schemes have been proposed to preserve access-pattern privacy; however, they may be vulnerable to attacks on data integrity or availability from malicious users. This is because preserving access-pattern privacy requires data to be frequently re-encrypted and re-positioned at the storage server, which easily conceals the traces needed for accountability support to detect misbehavior and identify attackers. To address this issue, this paper proposes a scheme that integrates accountability support into hash-based ORAMs. Security analysis shows that the proposed scheme can detect misconduct committed by malicious users and identify the attackers, while preserving access-pattern privacy. Overhead analysis shows that the proposed accountability support incurs only slightly increased storage, communication, and computational overheads.
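    The tension this abstract describes — re-positioning data hides access patterns but also erases the traces accountability needs — can be sketched with a toy store that permutes its physical layout after every access while keeping a hash-chained audit log. This is a minimal illustration only, not the paper's actual hash-based ORAM construction; all class and method names here are hypothetical.

    ```python
    import hashlib
    import random

    class ToyObliviousStore:
        """Illustrative sketch (not a real ORAM): hides which logical block
        a user touches by re-positioning every block after each access,
        while a hash-chained log preserves traces of who did what."""

        def __init__(self, blocks):
            self.store = list(blocks)
            self.pos = list(range(len(blocks)))   # logical index -> physical slot
            self.log = []                         # entries: (chain_hash, user, op)

        def _append_log(self, user, op):
            # Each entry hashes the previous entry's digest, so rewriting
            # history breaks the chain and is detectable.
            prev = self.log[-1][0] if self.log else b""
            digest = hashlib.sha256(prev + f"{user}:{op}".encode()).digest()
            self.log.append((digest, user, op))

        def access(self, user, logical, write=None):
            phys = self.pos[logical]
            if write is not None:
                self.store[phys] = write
            value = self.store[phys]
            self._append_log(user, f"access:{logical}")
            # Re-position: freshly permute the physical layout, so the server
            # cannot link two accesses to the same logical block.
            perm = random.sample(range(len(self.store)), len(self.store))
            shuffled = [None] * len(self.store)
            for idx, old_phys in enumerate(self.pos):
                shuffled[perm[idx]] = self.store[old_phys]
            self.store, self.pos = shuffled, perm
            return value

        def log_intact(self):
            prev = b""
            for digest, user, op in self.log:
                if digest != hashlib.sha256(prev + f"{user}:{op}".encode()).digest():
                    return False
                prev = digest
            return True
    ```

    The point of the sketch is that the two goals need not conflict: the shuffle hides *where* data lives, while the chained log still records *who* accessed *which logical block*, so a tampering user can be identified after the fact.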

    The Surveillance State: Do License Plate Readers Impinge Upon Americans' Civil Liberties?

    The boundaries that delineate the public from the private sphere have challenged our political system's foundations since its origination. License plate readers (LPRs), a tool used by law enforcement and private businesses, cause citizens and their government to question the criteria separating public and private information. While police and repossession agencies contend that license plate readers aid their work, the American Civil Liberties Union (ACLU) argues that surveillance equipment interferes with an individual's right to privacy. Addressing such privacy concerns requires the public to hold its government accountable by petitioning for limits on LPR use and data retention. LPRs also pose unique threats to public administration. Placing this technology into the hands of public and private interests without informing constituents hinders government accountability. Even though LPRs give police a cost-effective way to handle crime, the United States' federalist structure prevents uniform regulations at the local, state, and federal levels. Politics pit those favoring big government against supporters of limited government, thus creating deadlocks on the issue of LPRs violating an individual's privacy. LPRs ultimately provide a new opportunity to reopen age-old debates within the fields of political science and public administration.

    ORide: A Privacy-Preserving yet Accountable Ride-Hailing Service

    In recent years, ride-hailing services (RHSs) have become increasingly popular, serving millions of users per day. Such systems, however, raise significant privacy concerns, because service providers are able to track the precise mobility patterns of all riders and drivers. In this paper, we propose ORide (Oblivious Ride), a privacy-preserving RHS based on somewhat-homomorphic encryption with optimizations such as ciphertext packing and transformed processing. With ORide, a service provider can match riders and drivers without learning their identities or location information. ORide offers riders fairly large anonymity sets (e.g., several thousands), even in sparsely populated areas. In addition, ORide supports key RHS features such as easy payment, reputation scores, accountability, and retrieval of lost items. Using real data sets that consist of millions of rides, we show that the computational and network overhead introduced by ORide is acceptable. For example, ORide adds only several milliseconds to ride-hailing operations, and the extra driving distance for a driver is less than 0.5 km in more than 75% of the cases evaluated. In short, we show that an RHS can offer strong privacy guarantees to both riders and drivers while maintaining the convenience of its services.
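    The core idea — a server computing on rider and driver data it cannot read — can be illustrated with a toy additively homomorphic cryptosystem. Note the hedge: ORide actually uses a lattice-based somewhat-homomorphic scheme with ciphertext packing; the Paillier-style stand-in below (with deliberately tiny parameters) only demonstrates the principle of homomorphic computation, not ORide's construction.

    ```python
    import math
    import random

    # Toy Paillier cryptosystem (additively homomorphic). This is a stand-in
    # to illustrate server-side computation on encrypted data; it is NOT
    # ORide's lattice-based scheme, and the parameters are insecure.
    p, q = 1789, 2003                 # toy primes -- far too small for real use
    n, n2 = p * q, (p * q) ** 2
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)              # valid because the generator is n + 1

    def enc(m):
        r = random.randrange(2, n)
        while math.gcd(r, n) != 1:
            r = random.randrange(2, n)
        return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

    def dec(c):
        return (pow(c, lam, n2) - 1) // n * mu % n

    # Homomorphic properties a matching server can exploit without decrypting:
    #   Enc(a) * Enc(b) mod n^2  decrypts to a + b
    #   Enc(a) ** k     mod n^2  decrypts to k * a
    rider_x, driver_x = 5, 9
    # Server computes driver_x - rider_x on ciphertexts (Enc(rider_x)^(n-1)
    # is the additive inverse), learning neither coordinate itself.
    diff = (enc(driver_x) * pow(enc(rider_x), n - 1, n2)) % n2
    print(dec(diff))  # 4
    ```

    In ORide, analogous homomorphic operations let the provider evaluate encrypted rider-driver distances and return the match, which is why it can pair users without ever seeing their locations.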

    Accountable-eHealth Systems: the Next Step Forward for Privacy

    EHealth systems promise enviable benefits and capabilities for healthcare, yet the technologies that make these capabilities possible bring with them undesirable drawbacks, such as information-security threats, which need to be appropriately addressed. Lurking in these threats are patient privacy concerns. Resolving these privacy concerns has proven difficult, since they often conflict with the information requirements of healthcare providers, and it is important to achieve a proper balance between these requirements. We believe that information accountability can achieve this balance. In this paper we introduce accountable-eHealth (AeH) systems. We discuss how our designed protocols can successfully address the aforementioned requirements. We also compare the characteristics of AeH systems with Australia's PCEHR system, identify similarities, and highlight the differences and the impact those differences would have on the eHealth domain.

    Peeling Back the Student Privacy Pledge

    Education software is a multi-billion dollar industry that is rapidly growing. The federal government has encouraged this growth through a series of initiatives that reward schools for tracking and aggregating student data. Amid this increasingly digitized education landscape, parents and educators have begun to raise concerns about the scope and security of student data collection. Industry players, rather than policymakers, have so far led efforts to protect student data. Central to these efforts is the Student Privacy Pledge, a set of standards that providers of digital education services have voluntarily adopted. By many accounts, the Pledge has been a success. Since its introduction in 2014, over 300 companies have signed on, indicating widespread commitment to the Pledge's seemingly broad protections for student privacy. This industry participation is encouraging, but the Pledge does not contain any meaningful oversight or enforcement provisions. This Article analyzes whether signatory companies are actually complying with the Pledge rather than just paying lip service to its goals. By looking to the privacy policies and terms of service of a sample of the Pledge's signatories, I conclude that noncompliance may be a significant and prevalent issue. Consumers of education software have some power to hold signatories accountable, but their oversight abilities are limited. This Article argues that the federal government, specifically the Federal Trade Commission, is best positioned to enforce compliance with the Pledge and should hold Pledge signatories to their promises.

    [How] Can Pluralist Approaches to Computational Cognitive Modeling of Human Needs and Values Save our Democracies?

    In our increasingly digital societies, many companies have business models that treat users' (or customers') personal data as a siloed resource, owned and controlled by the data controller rather than the data subjects. Collecting and processing such massive amounts of personal data can have many negative technical, social and economic consequences, including invading people's privacy and autonomy. As a result, regulations such as the European General Data Protection Regulation (GDPR) have taken steps towards a better implementation of the right to digital privacy. This paper proposes that such legal acts should be accompanied by complementary technical solutions, such as Cognitive Personal Assistant Systems, that help people effectively manage the processing of their personal data on the Internet. Considering the importance and sensitivity of personal data processing, such assistant systems should not only consider their owner's needs and values, but also be transparent, accountable and controllable. Pluralist approaches to computational cognitive modelling of human needs and values that are not bound to traditional paradigmatic borders such as cognitivism, connectionism, or enactivism can, we argue, strike a balance between practicality and usefulness, on the one hand, and transparency, accountability, and controllability, on the other, while supporting and empowering humans in the digital world. Considering the threat to digital privacy as significant for contemporary democracies, the future implementation of such pluralist models could contribute to power balance, fairness and inclusion in our societies.