
    SoK: Privacy-Enhancing Technologies in Finance

    Recent years have seen the emergence of practical advanced cryptographic tools that not only protect data privacy and authenticity, but also allow for jointly processing data from different institutions without sacrificing privacy. The ability to do so has enabled the implementation of a number of traditional and decentralized financial applications that would otherwise have required sacrificing privacy or trusting a third party. The main catalyst of this revolution was the advent of decentralized cryptocurrencies that use public ledgers to register financial transactions, which must be verifiable by any third party, while keeping sensitive data private. Zero Knowledge (ZK) proofs rose to prominence as a solution to this challenge, allowing the owner of sensitive data (e.g. the identities of users involved in an operation) to convince a third-party verifier that a certain operation has been correctly executed without revealing said data. It quickly became clear that performing arbitrary computation on private data from multiple sources by means of secure Multiparty Computation (MPC) and related techniques allows for more powerful financial applications, also in traditional finance. In this SoK, we categorize the main traditional and decentralized financial applications that can benefit from state-of-the-art Privacy-Enhancing Technologies (PETs) and identify design patterns commonly used when applying PETs in the context of these applications. In particular, we consider the following classes of applications: 1. Identity Management, KYC & AML; 2. Markets & Settlement; 3. Legal; and 4. Digital Asset Custody. We examine how ZK proofs, MPC and related PETs have been used to tackle the main security challenges in each of these applications.
    Moreover, we provide an assessment of the technological readiness of each PET in the context of different financial applications according to the availability of: theoretical feasibility results, preliminary benchmarks (in scientific papers), or benchmarks achieving real-world performance (in commercially deployed solutions). Finally, we propose future applications of PETs as Fintech solutions to currently unsolved issues. While we systematize financial applications of PETs at large, we focus mainly on those applications that require privacy-preserving computation on data from multiple parties.
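The ZK pattern described in this abstract — convincing a verifier that an operation was performed correctly without revealing the underlying secret — can be illustrated with a toy Schnorr-style proof of knowledge of a discrete log, made non-interactive via Fiat–Shamir. This is a sketch of the general technique, not a construction from the paper, and the group parameters are far too small to be secure.

```python
import hashlib
import secrets

# Tiny schoolbook group: P = 2Q + 1, G generates the order-Q subgroup.
# Illustrative parameters only -- completely insecure at this size.
P, Q, G = 467, 233, 4

def prove(x):
    """Prove knowledge of x with y = G^x mod P, without revealing x."""
    y = pow(G, x, P)
    k = secrets.randbelow(Q - 1) + 1                # ephemeral nonce
    r = pow(G, k, P)                                # commitment
    c = int(hashlib.sha256(f"{G}{y}{r}".encode()).hexdigest(), 16) % Q  # Fiat-Shamir challenge
    s = (k + c * x) % Q                             # response
    return y, (r, s)

def verify(y, proof):
    """Check G^s == r * y^c (mod P), recomputing the challenge."""
    r, s = proof
    c = int(hashlib.sha256(f"{G}{y}{r}".encode()).hexdigest(), 16) % Q
    return pow(G, s, P) == (r * pow(y, c, P)) % P

y, proof = prove(42)
print(verify(y, proof))   # True: verifier accepts without learning the secret
```

The verification equation holds because G^s = G^(k + c·x) = G^k · (G^x)^c = r · y^c; the transcript (r, s) reveals nothing useful about x because k masks it.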

    General-Purpose Secure Conflict-free Replicated Data Types

    Conflict-free Replicated Data Types (CRDTs) are a very popular class of distributed data structures that strike a compromise between strong and eventual consistency. Ensuring the protection of data stored within a CRDT, however, cannot be done trivially using standard encryption techniques, as secure CRDT protocols would require replica-side computation. This paper proposes an approach to lift general-purpose implementations of CRDTs to secure variants using secure multiparty computation (MPC). Each replica within the system is realized by a group of MPC parties that compute its functionality. Our results include: i) an extension of current formal models used for reasoning over the security of CRDT solutions to the MPC setting; ii) an MPC language and type system to enable the construction of secure versions of CRDTs; and iii) a proof of security that relates the security of CRDT constructions designed under said semantics to the underlying MPC library. We provide an open-source system implementation with an extensive evaluation, which compares different designs with their baseline throughput and latency.
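For readers unfamiliar with CRDTs, a minimal plaintext example is the grow-only counter (G-Counter): each replica increments only its own slot, and merging takes the per-replica maximum, so replicas converge regardless of message order. This is the kind of general-purpose baseline the paper lifts to MPC; the class and method names below are illustrative, not the paper's API.

```python
class GCounter:
    """Grow-only counter CRDT: one slot per replica, merge = pointwise max."""

    def __init__(self, replica_id, n_replicas):
        self.id = replica_id
        self.counts = [0] * n_replicas

    def increment(self):
        # A replica may only bump its own slot.
        self.counts[self.id] += 1

    def value(self):
        return sum(self.counts)

    def merge(self, other):
        # Pointwise max is commutative, associative, and idempotent,
        # which guarantees convergence under eventual consistency.
        self.counts = [max(a, b) for a, b in zip(self.counts, other.counts)]

a, b = GCounter(0, 2), GCounter(1, 2)
a.increment(); a.increment()
b.increment()
a.merge(b); b.merge(a)
print(a.value(), b.value())   # 3 3 -- both replicas converge
```

In the paper's setting, each such replica would instead be emulated by a group of MPC parties, so no single machine ever sees the counter state in the clear.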

    Pseudorandomness with Proof of Destruction and Applications

    Two fundamental properties of quantum states that quantum information theory explores are pseudorandomness and provability of destruction. We introduce the notion of quantum pseudorandom states with proofs of destruction (PRSPD) that combines both of these properties. Like standard pseudorandom states (PRS), these are efficiently generated quantum states that are indistinguishable from random, but they can also be measured to create a classical string. This string is verifiable (given the secret key) and certifies that the state has been destroyed. We show that, similarly to PRS, PRSPD can be constructed from any post-quantum one-way function. As far as the authors are aware, this is the first construction of a family of states that satisfies both pseudorandomness and provability of destruction. We show that many cryptographic applications that were shown based on PRS variants using quantum communication can be based on (variants of) PRSPD using only classical communication. This includes symmetric encryption, message authentication, one-time signatures, commitments, and classically verifiable private quantum coins.

    Short-term forecast techniques for energy management systems in microgrid applications

    A Dissertation Submitted in Partial Fulfilment of the Requirements for the Degree of Doctor of Philosophy in Sustainable Energy Science and Engineering of the Nelson Mandela African Institution of Science and Technology. In the 2015 Paris Agreement, 195 countries adopted a global climate agreement to limit the global average temperature rise to less than 2°C. Achieving the set targets involves increasing energy efficiency and embracing cleaner energy solutions. Although advances in computing and Internet of Things (IoT) technologies have been made, there is limited scientific research in this arena that tackles the challenges of implementing a low-cost IoT-based Energy Management System (EMS) with energy forecasting and user engagement suitable for adoption by laypeople, whether off-grid or in a microgrid tied to a weak grid. This study proposes an EMS approach for short-term forecasting and monitoring for hybrid microgrids in emerging countries. This is done by addressing the typical submodules of an EMS, namely: the load forecast, blackout forecast, and energy monitoring modules. A short-term load forecast model framework consisting of a hybrid feature selection and prediction model was developed. Prediction error performance of the developed model was evaluated by varying input predictors and using the principal subset features to perform supervised training of 20 different conventional prediction models and their hybrid variants. The proposed principal k-features subset union approach registered lower error values than standard feature selection methods when used with the linear Support Vector Machine (SVM) prediction model for load forecasting. The hybrid regression model formed from a fusion of the best 2 models (‘linearSVM’ and ‘cubicSVM’) showed better prediction performance than the individual regression models, reducing Mean Absolute Error (MAE) by 5.4%.
    In the case of the EMS blackout prediction aspect, a hybrid Adaptive Similar Day (ASD) and Random Forest (RF) model for short-term power outage prediction was proposed that accurately predicted almost half of the blackouts (49.16%), thereby performing slightly better than the stand-alone RF (32.23%) and ASD (46.57%) models. Additionally, a low-cost EMS smart meter was developed to realize the implemented energy forecast and offer user engagement through monitoring and control of the microgrid, towards the goal of increasing energy efficiency.
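The model-fusion idea above — combining two base regressors so their errors partially cancel, then comparing MAE — can be sketched with one simple fusion rule (a plain average). The data and base predictions below are made up for illustration, and the thesis's actual fusion of its SVM variants may differ.

```python
def mae(y_true, y_pred):
    """Mean Absolute Error between observed and predicted values."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

y_true   = [10.0, 12.0, 11.0, 13.0]   # hypothetical observed loads
linear_p = [ 9.0, 13.0, 10.5, 14.0]   # hypothetical 'linearSVM' predictions
cubic_p  = [11.0, 11.0, 11.5, 12.0]   # hypothetical 'cubicSVM' predictions

# Fuse by averaging: where the two models err in opposite directions,
# the fused prediction lands closer to the truth.
fused_p = [(a + b) / 2 for a, b in zip(linear_p, cubic_p)]

print(mae(y_true, linear_p), mae(y_true, cubic_p), mae(y_true, fused_p))
# 0.875 0.875 0.0 -- error cancellation at its (contrived) best
```

Real gains are of course smaller (the thesis reports a 5.4% MAE reduction); this toy data is chosen so the cancellation effect is visible at a glance.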

    Contested environmental futures: rankings, forecasts and indicators as sociotechnical endeavours

    In a world where numbers and science are often taken as the voice of truth and reason, Quantitative Devices (QDs) represent the epitome of policy driven by facts rather than hunches. Despite the scholarly interest in understanding the role of quantification in policy, the actual production of rankings, forecasts, indexes and other QDs has, to a great extent, been left unattended. While appendixes and technical notebooks offer an explanation of how these devices are produced, they exclude aspects of their making that are arbitrarily considered "mundane." It is in the everyday performances at research centres that the micropolitics of knowledge production, imaginaries, and frustrations merge. These are vital dimensions to understand the potential, limitations and ethical consequences of QDs. Using two participant observations as the starting point, this thesis offers a comprehensive critical analysis of the processes through which university-based research centres create QDs that represent the world. It addresses how researchers conceive quantitative data. It pays attention to the discourses of hope and expectation embedded in the devices. Finally, it considers the ethics of creating devices that cannot be replicated independently of their place of production. Two QDs were analysed: the Violence Early Warning System (ViEWS) and the Environmental Performance Index (EPI). At Uppsala University, researchers created ViEWS to forecast the probability of drought-driven conflicts within the next 100 years. The EPI, produced at the Yale Centre for Environmental Law and Policy, ranks the performance of countries' environmental policies. This thesis challenges existing claims within Science and Technology Studies and the Sociology of Quantification that QDs co-produce knowledge within their realms. 
    I argue that these devices act as vehicles for sociotechnical infrastructures to be consolidated with little debate among policymakers, given their understanding as scientific and objective tools. Moreover, for an indicator to be incorporated within a QD, it needs to be deemed relevant by those making the devices, but also valuable enough to have been previously quantified by data providers. Even more, existing sociotechnical inequalities, power relations and epistemic injustices could impede disadvantaged communities' (e.g., in the Global South) ability to challenge metrics originated in centres in the Global North. This thesis, therefore, demonstrates how the future QDs propose is unilateral and does not acknowledge the myriad possibilities that might arise from a diversity of worldviews. In other words, they cast a future designed to fit under the current status quo. In sum, through two QDs focused on environment-related issues, this thesis launches an inquiry into the elements that make up the imaginaries they propose by following the everyday life of their producers. To achieve this, I discuss two core elements. First, the role of tacit knowledge and sociotechnical inequalities in reinforcing power relations between those with the means to quantify and those who might only accommodate proposed futures. Second, the dynamics between research centres and data providers in relation to what is quantified. By scrutinising mundanity, this work is a step forward in understanding the construction of sociotechnical imaginaries and infrastructures.

    Communication Lower Bounds for Cryptographic Broadcast Protocols

    Broadcast protocols enable a set of n parties to agree on the input of a designated sender, even facing attacks by malicious parties. In the honest-majority setting, randomization and cryptography were harnessed to achieve low-communication broadcast with sub-quadratic total communication and balanced sub-linear cost per party. However, comparatively little is known in the dishonest-majority setting. Here, the most communication-efficient constructions are based on Dolev and Strong (SICOMP '83), and sub-quadratic broadcast has not been achieved. On the other hand, the only nontrivial ω(n) communication lower bounds are restricted to deterministic protocols, or against strong adaptive adversaries that can perform "after the fact" removal of messages. We provide new communication lower bounds in this space, which hold against arbitrary cryptography and setup assumptions, as well as a simple protocol showing near tightness of our first bound. 1) We demonstrate a tradeoff between resiliency and communication for protocols secure against n − o(n) static corruptions. For example, Ω(n·polylog(n)) messages are needed when the number of honest parties is n/polylog(n); Ω(n√n) messages are needed for O(√n) honest parties; and Ω(n²) messages are needed for O(1) honest parties. Complementarily, we demonstrate broadcast with O(n·polylog(n)) total communication facing any constant fraction of static corruptions. 2) Our second bound considers n/2 + k corruptions and a weakly adaptive adversary that cannot remove messages "after the fact." We show that any broadcast protocol within this setting can be attacked to force an arbitrary party to send messages to k other parties.
    This rules out, for example, broadcast facing 51% corruptions in which all non-sender parties have sublinear communication locality.

    Comment: A preliminary version of this work appeared in DISC 202
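The three regimes quoted for bound (1) are all consistent with a single tradeoff of roughly n²/h messages when h parties are honest; this closed form is our reading of the examples, not a formula stated in the abstract. A quick numeric check:

```python
import math

def msg_lower_bound(n, h):
    """Conjectured reading of the tradeoff: ~ n^2 / h messages for h honest parties."""
    return n * n / h

n = 1 << 20
polylog = math.log2(n) ** 2

print(msg_lower_bound(n, n / polylog))   # ~ n * polylog(n)
print(msg_lower_bound(n, math.sqrt(n)))  # ~ n * sqrt(n)
print(msg_lower_bound(n, 1))             # ~ n^2
```

Each line reproduces one of the quoted regimes: h = n/polylog(n) gives n·polylog(n), h = √n gives n√n, and constant h gives n².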

    Enduring Displacement, Enduring Violence: Camps, Closure, and Exile In/After Return (Experiences of Burundian Refugees in Tanzania)

    “Return home” was the joint message by the Burundian and Tanzanian presidents in 2017, just two years after hundreds of thousands of Burundians were recognized as refugees in neighbouring countries, and as more continued to seek refuge or asylum each month. In Tanzania, where refugees are subject to strict encampment, the vast majority of Burundian refugees had previously been refugees at least once before. Many returned to Tanzania less than three years after their prior return to Burundi, which, as camps were closed, had been framed as a “durable solution” to their displacement. This thesis explores the interrelated dynamics of enduring displacement, encampment, and closure, by drawing on life history research with Burundian refugees in two camps in Tanzania (2017-8), as well as semi-structured interviews with government and humanitarian staff, and ethnographic methods. Empirically, this dissertation contributes to knowledge by tracing the diverse prior trajectories of current Burundian refugees, both within and beyond camp boundaries, challenging the there-and-back-again geographical imaginary of refugee management. It highlights an understudied but constitutive aspect of camps—their ultimate closures—by recounting refugees’ memories of the violent closure of Mtabila camp, as well as its fearful afterlives and present-presence. The violence of past camp closure is part of the violence of current encampment due to its evocation as a disciplinary dispositif to “encourage” return, threatening and anticipating future violence. State and humanitarian practices “close” and harden space for those deemed “undesirable,” through forced encampment, camp closures, and coerced or forced return. In so doing, they produce and prolong displacement, in which varied spatio-temporalities of violence endure. Burundian refugees’ life histories thus trace the ways displacement endures, and is endured.

    Data Rescue : defining a comprehensive workflow that includes the roles and responsibilities of the research library.

    Thesis (PhD (Research))--University of Pretoria, 2023. This study, comprising a case study at a selected South African research institute, focused on the creation of a workflow model for data rescue indicating the roles and responsibilities of the research library. Additional outcomes of the study include a series of recommendations addressing the troublesome findings that revealed data at risk to be a prevalent reality at the selected institute, showing the presence of a multitude of factors putting data at risk, disclosing the profusion of data rescue obstacles faced by researchers, and uncovering that data rescue at the institute is rarely implemented. The study consists of four main parts: (i) a literature review, (ii) content analysis of literature resulting in the creation of a data rescue workflow model, (iii) empirical data collection methods, and (iv) the adaptation and revision of the initial data rescue model to present a recommended version of the model. A literature review was conducted and addressed data at risk and data rescue terminology, factors putting data at risk, the nature, diversity and prevalence of data rescue projects, and the rationale for data rescue. The second part of the study entailed the application of content analysis to selected documented data rescue workflows, guidelines and models. Findings of the analysis led to the identification of crucial components of data rescue and brought about the creation of an initial Data Rescue Workflow Model. As a first draft of the model, it was crucial that the model be reviewed by institutional research experts during the next main stage of the study. The section containing the study methodology culminates in the implementation of four different empirical data collection methods.
Data collected via a web-based questionnaire distributed to a sample of research group leaders (RGLs), one-on-one virtual interviews with a sample of the aforementioned RGLs, feedback supplied by RGLs after reviewing the initial Data Rescue Workflow Model, and a focus group session held with institutional research library experts resulted in findings producing insight into the institute’s data at risk and the state of data rescue. Feedback supplied by RGLs after examining the initial Data Rescue Workflow Model produced a list of concerns linked to the model and contained suggestions for changes to the model. RGL feedback was at times unrelated to the model or to data and necessitated the implementation of a mini focus group session involving institutional research library experts. The mini focus group session comprised discussions around requirements for a data rescue workflow model. The consolidation of RGL feedback and feedback supplied by research library experts enabled the creation of a recommended Data Rescue Workflow Model, with the model also indicating the various roles and responsibilities of the research library. The contribution of this research lies primarily in the increase in theoretical knowledge regarding data at risk and data rescue, and culminates in the presentation of a recommended Data Rescue Workflow Model. The model not only portrays crucial data rescue activities and outputs, but also indicates the roles and responsibilities of a sector that can enhance and influence the prevalence and execution of data rescue projects. In addition, participation in data rescue and an understanding of the activities and steps portrayed via the model can contribute towards an increase in the skills base of the library and information services sector and enhance collaboration projects with relevant research sectors. 
    It is also anticipated that the study recommendations and exposure to the model may influence the viewing and handling of data by researchers and accompanying research procedures.

    Trade Union Effectiveness in the UK Hospitality Sector: A Case Study Approach

    This study investigates how union representation, management attitude, union membership, collective bargaining, and technological revolution affect trade union effectiveness in the UK hospitality industry. It specifically explores how internal organisational factors and trade union factors impact the effectiveness of trade unions in the UK hospitality sector, as well as how improvement in such effectiveness can benefit employers and employees in this sector. An analysis of seven case studies of hotels in the UK was conducted, which included a total of 71 interviews with employees and hotel managers, seven meeting observations, and an analysis of documents from each hotel. All these data were analysed thematically using NVivo 12. Key findings revealed that two hotels were strongly unionised while the rest were weakly unionised. The unionised hotels had strong union representation, bargaining power and an adequate and growing union membership. The weakly unionised hotels had weak bargaining power and declining membership. Workers from weakly unionised hotels began to turn to management efficiency to seek resolutions for their concerns. Conversely, workers from strongly unionised hotels sought union representation to resolve their issues with the management. The study contributes a proposed conceptual framework of trade union effectiveness applied to the seven case organisations.
