Digital Dollar: Privacy and Transparency Dilemma
Many have voiced concerns that a digital dollar, a digital form of central bank money, will facilitate government surveillance, thus depriving users of privacy. Contrary to popular belief, this Article's investigation of critical technical designs proposed by leading think tanks, central banks, and scholars from interdisciplinary fields reaches a surprising conclusion: a digital dollar can offer better privacy protection than existing digital payment systems. The Article argues that those expressing concerns have made two flawed assumptions: (1) that digital dollar data is fully transparent regarding personal information and transaction details, and (2) that the government or the Federal Reserve has unlimited access to this fully transparent data, with the attendant potential for misuse. In reality, the designs directly contradict these assumptions by allowing for a certain degree of anonymity, whether payer anonymity, transaction anonymity, or a combination of both, and by preventing government access to identity data and transaction details. The real issue is that if the digital dollar adopts these privacy-preserving designs, it will directly conflict with existing anti-money laundering and countering the financing of terrorism (AML/CFT) regulations, which require transparent data to combat financial crimes. Accordingly, this Article proposes changes to financial institutions' record-keeping and reporting practices. It also suggests modernizing AML/CFT requirements to allow a certain degree of anonymity to protect privacy while still fulfilling public interest objectives such as combating money laundering and terrorist financing.
Transparent Government, Not Transparent Citizens: Executive Summary and Recommendations
Executive summary and recommendations from Transparent Government, Not Transparent Citizens: A Report on Privacy and Transparency for the Cabinet Office
Transparent government, not transparent citizens: a report on privacy and transparency for the Cabinet Office
1. Privacy is extremely important to transparency. The political legitimacy of a transparency programme will depend crucially on its ability to retain public confidence. Privacy protection should therefore be embedded in any transparency programme, rather than bolted on as an afterthought. 2. Privacy and transparency are compatible, as long as the former is carefully protected and considered at every stage. 3. Under the current transparency regime, in which public data is specifically understood not to include personal data, most data releases will not raise privacy concerns. However, some will, especially as we move toward a more demand-driven scheme. 4. Discussion about deanonymisation has been driven largely by legal considerations, with a consequent neglect of the input of the technical community. 5. There are no complete legal or technical fixes to the deanonymisation problem. We should continue to anonymise sensitive data, being initially cautious about releasing such data under the Open Government Licence while we continue to take steps to manage and research the risks of deanonymisation. Further investigation to determine the level of risk would be very welcome. 6. There should be a focus on procedures to output an auditable debate trail. Transparency about transparency (metatransparency) is essential for preserving trust and confidence. Fourteen recommendations are made to address these conclusions.
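The deanonymisation risk the report highlights can be made concrete with k-anonymity, a standard (and knowingly incomplete) measure of re-identification exposure: a release is k-anonymous if every combination of quasi-identifier values is shared by at least k records. The sketch below is purely illustrative; the field names and dataset are invented, not drawn from the report.

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Smallest equivalence-class size over the quasi-identifier columns.

    A release is k-anonymous if every quasi-identifier combination is
    shared by at least k records; small classes invite re-identification.
    """
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())

# Hypothetical release: diagnosis is sensitive, age band and postcode
# district are quasi-identifiers an attacker might link to other data.
people = [
    {"age_band": "30-39", "postcode": "CB2", "diagnosis": "flu"},
    {"age_band": "30-39", "postcode": "CB2", "diagnosis": "asthma"},
    {"age_band": "40-49", "postcode": "OX1", "diagnosis": "flu"},
]
k = k_anonymity(people, ["age_band", "postcode"])
# The lone OX1 record forms a class of size 1, so k == 1: that
# individual is re-identifiable despite the absence of direct names.
```

Even a high k is no complete fix (auxiliary data and homogeneous sensitive values still leak information), which is the report's point that anonymisation manages rather than eliminates risk.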
Recommended from our members
Security, Privacy, and Transparency Guarantees for Machine Learning Systems
Machine learning (ML) is transforming a wide range of applications, promising to bring immense economic and social benefits. However, it also raises substantial security, privacy, and transparency challenges. ML workloads indeed push companies toward aggressive data collection and loose data access policies, placing troves of sensitive user information at risk if the company is hacked. ML also introduces new attack vectors, such as adversarial example attacks, which can completely nullify models' accuracy under attack. Finally, ML models make complex data-driven decisions, which are opaque to end-users and difficult for programmers to inspect. In this dissertation we describe three systems we developed. Each system addresses a dimension of the previous challenges by combining new practical systems techniques with rigorous theory to achieve a guaranteed level of protection and make systems easier to understand. First we present Sage, a differentially private ML platform that enforces a meaningful protection semantic for the troves of personal information amassed by today's companies. Second we describe PixelDP, a defense against adversarial examples that leverages differential privacy theory to provide a guaranteed level of accuracy under attack. Third we introduce Sunlight, a tool to enhance the transparency of opaque targeting services, using rigorous causal inference theory to explain targeting decisions to end-users.
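The differential-privacy guarantee that systems like Sage and PixelDP build on can be illustrated with the classic Laplace mechanism: noise drawn from a Laplace distribution, with scale set to a query's sensitivity divided by the privacy budget epsilon, is added to each released statistic. This is a generic textbook sketch, not Sage's or PixelDP's actual implementation; the dataset and parameter values are invented.

```python
import math
import random

def laplace_mechanism(true_value, sensitivity, epsilon, rng=random.Random(0)):
    """Release true_value perturbed with Laplace(0, sensitivity/epsilon) noise.

    Satisfies epsilon-differential privacy for a query whose output can
    change by at most `sensitivity` when one individual's record changes.
    """
    scale = sensitivity / epsilon
    # Inverse-CDF sampling of the Laplace distribution from a uniform draw.
    u = rng.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_value + noise

# Hypothetical counting query: "how many users are over 35?"
# A count changes by at most 1 per individual, so sensitivity is 1.
ages = [34, 29, 41, 52, 38]
exact = sum(1 for a in ages if a > 35)
private = laplace_mechanism(exact, sensitivity=1, epsilon=0.5)
```

Smaller epsilon means stronger privacy but noisier answers; a platform like Sage additionally has to track how repeated queries consume the overall privacy budget, which this sketch omits.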
Privacy and Transparency in Blockchain-based Smart Grid Operations
In the past few years, blockchain technology has emerged in numerous smart grid applications, enabling the construction of systems without the need for a trusted third party. Blockchain offers transparency, traceability, and accountability, which lets various energy management system functionalities be executed through smart contracts, such as monitoring, consumption analysis, and intelligent energy adaptation. Nevertheless, revealing sensitive energy consumption information could render users vulnerable to digital and physical assaults. This paper presents a novel method for achieving a dual balance between privacy and transparency, as well as accountability and verifiability. This equilibrium requires the incorporation of cryptographic tools like Secure Multiparty Computation and Verifiable Secret Sharing within the distributed components of a multi-channel blockchain and its associated smart contracts. We corroborate the suggested architecture throughout the entire process of a Demand Response scenario, from the collection of energy data to the ultimate reward. To address our proposal's constraints, we present countermeasures against accidental crashes and Byzantine behavior while ensuring that the solution remains appropriate for low-performance IoT devices.
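Verifiable Secret Sharing of the kind the paper employs builds on Shamir's threshold scheme. The sketch below shows the core idea over a prime field: a private energy reading is split into n shares such that any t of them reconstruct it, while fewer reveal nothing. This is a minimal illustration only; it omits the verification commitments, the multi-channel blockchain, and the smart contracts, and all names and values are invented.

```python
import random

PRIME = 2**61 - 1  # field modulus; secrets must be smaller than this

def share(secret, n, t, rng=random.Random(1)):
    """Split `secret` into n Shamir shares; any t of them reconstruct it.

    Builds a random polynomial of degree t-1 with the secret as its
    constant term and evaluates it at x = 1..n.
    """
    coeffs = [secret] + [rng.randrange(PRIME) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over the prime field."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        # pow(den, PRIME - 2, PRIME) is the modular inverse (Fermat).
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

reading = 4217  # hypothetical watt-hour meter reading to keep private
shares = share(reading, n=5, t=3)
assert reconstruct(shares[:3]) == reading   # any 3 shares suffice
assert reconstruct(shares[2:]) == reading
```

Because the scheme is linear, adding shares pointwise yields shares of the summed readings; that homomorphic property is what lets a Demand Response aggregate be computed across participants without any single party seeing an individual household's consumption.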
Trust, Privacy and Transparency with Blockchain Technology in Logistics
Since the introduction of blockchain over a decade ago, many industries and industrial sectors have been exploring the technology's potential. In line with this trend, the logistics sector is no exception and is investigating various dynamics associated with implementing the technology. This study focuses on the link between the capabilities of blockchain technology and trust, privacy, and transparency. To explore the dynamics of this linkage, the study used a case study as its method of inquiry. Trust, privacy, and transparency have been common issues in logistics that existing information solutions are largely unable to resolve. The results show that blockchain technology has the capability to build trust among unknown industry players while maintaining a sufficient level of privacy and transparency at the same time. Overall, the study presents useful insights by contributing to major issues in logistics and supply chains when an innovative digital technology is put into action.
Privacy and Transparency in Graph Machine Learning: A Unified Perspective
Graph Machine Learning (GraphML), whereby classical machine learning is generalized to irregular graph domains, has enjoyed a recent renaissance, leading to a dizzying array of models and their applications in several domains. With its growing applicability to sensitive domains and regulations by government agencies for trustworthy AI systems, researchers have started looking into the issues of transparency and privacy of graph learning. However, these topics have been mainly investigated independently. In this position paper, we provide a unified perspective on the interplay of privacy and transparency in GraphML.
A Synthesized Perspective on Privacy and Transparency in the Digital Workplace
The pandemic crisis has made the digitalization of workplaces imperative for many organizations. Besides reorganizing work, rapid advances in technologies also enhance organizational efficiency and enable remote work. Having to work completely digitally imposes unprecedented transparency on employees. A major consequence of the transparent workplace is the emergence of employees' privacy concerns. Even though the concepts of transparency and privacy are closely related, there is a research gap regarding the relationship between the two. Based on a conceptual approach and a systematic literature review, we postulate a synthesis of transparency and privacy in the digital workplace, and outline directions for future research. We discuss what makes the relationship between the two constructs double-edged by introducing the privacy-transparency paradox. This study therefore adds to the literature on privacy and transparency in the digital workplace and forms the basis for further studies.