
    The Interplay of Authorial Control and Readerly Judgments in Ian McEwan's Atonement

    Get PDF
    Mainly focusing on postmodern literary theory, I will analyze Ian McEwan’s Atonement and suggest how it becomes a simulacrum when the protagonist, Briony Tallis, takes control of authorship from McEwan and reveals herself as the author of the text. Because Briony negates an important aspect of the novel, hyperreality occurs. This thesis examines the role McEwan plays as author of Atonement; how the main characters Robbie and Cecilia take part in this fictional world and become aware of an authorial presence within their lives; how Briony takes ultimate control of the pen and appoints herself to the authorial role; and finally how her text is a simulacrum due to her acts as author.

    Breaking the Trilemma of Privacy, Utility, Efficiency via Controllable Machine Unlearning

    Full text link
    Machine Unlearning (MU) algorithms have become increasingly critical given the imperative of adhering to data privacy regulations. The primary objective of MU is to erase the influence of specific data samples on a given model without the need to retrain it from scratch. Accordingly, existing methods focus on maximizing user privacy protection. However, each real-world web-based application faces a different degree of privacy regulation. Exploring the full spectrum of trade-offs between privacy, model utility, and runtime efficiency is critical for practical unlearning scenarios. Furthermore, designing an MU algorithm with simple control over these trade-offs is desirable but challenging due to their inherently complex interaction. To address these challenges, we present Controllable Machine Unlearning (ConMU), a novel framework designed to facilitate the calibration of MU. The ConMU framework contains three integral modules: an important data selection module that reconciles runtime efficiency and model generalization, a progressive Gaussian mechanism module that balances privacy and model generalization, and an unlearning proxy that controls the trade-off between privacy and runtime efficiency. Comprehensive experiments on various benchmark datasets have demonstrated the robust adaptability of our control mechanism and its superiority over established unlearning methods. ConMU explores the full spectrum of the Privacy-Utility-Efficiency trade-off and allows practitioners to account for different real-world regulations. Source code available at: https://github.com/guangyaodou/ConMU
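The privacy-utility dial that the abstract describes can be illustrated with a toy noise-injection routine: more cumulative Gaussian noise added to model weights means stronger erasure of any one sample's influence, at the cost of model utility. The function name, the linear noise schedule, and the flat weight vector below are all illustrative assumptions, not the ConMU implementation.

```python
import random

def progressive_gaussian_unlearn(weights, steps, sigma_max, seed=0):
    """Toy progressive Gaussian mechanism (illustrative sketch only).

    Adds zero-mean Gaussian noise to a flat list of model weights over
    `steps` rounds, ramping the noise scale linearly up to `sigma_max`.
    Larger sigma_max ~ more privacy (influence erased), less utility.
    """
    rng = random.Random(seed)  # seeded for reproducibility
    noised = [float(w) for w in weights]
    for k in range(1, steps + 1):
        sigma = sigma_max * k / steps  # noise schedule: ramp up
        noised = [w + rng.gauss(0.0, sigma) for w in noised]
    return noised
```

With `sigma_max = 0` the weights pass through unchanged (the pure-utility end of the dial); increasing `sigma_max` moves toward the pure-privacy end.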

    Formalizing Data Deletion in the Context of the Right to be Forgotten

    Get PDF
    The right of an individual to request the deletion of their personal data by an entity that might be storing it -- referred to as the right to be forgotten -- has been explicitly recognized, legislated, and exercised in several jurisdictions across the world, including the European Union, Argentina, and California. However, much of the discussion surrounding this right offers only an intuitive notion of what it means for it to be fulfilled -- of what it means for such personal data to be deleted. In this work, we provide a formal definitional framework for the right to be forgotten using tools and paradigms from cryptography. In particular, we provide a precise definition of what could be (or should be) expected from an entity that collects individuals' data when a request is made of it to delete some of this data. Our framework captures several, though not all, relevant aspects of typical systems involved in data processing. While it cannot be viewed as expressing the statements of current laws (especially since these are rather vague in this respect), our work offers technically precise definitions that represent possibilities for what the law could reasonably expect, and alternatives for what future versions of the law could explicitly require. Finally, with the goal of demonstrating the applicability of our framework and definitions, we consider various natural and simple scenarios where the right to be forgotten comes up. For each of these scenarios, we highlight the pitfalls that arise even in genuine attempts at implementing systems offering deletion guarantees, and also describe technological solutions that provably satisfy our definitions. These solutions bring together techniques built by various communities.
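One intuition behind such a definition can be sketched in a few lines: deletion is meaningful if, after `delete(uid)`, the collector's observable state is indistinguishable from that of a collector that never stored the data at all. The class below is a toy illustration of that intuition under strong simplifying assumptions (no logs, backups, or derived data); it is not the paper's cryptographic framework.

```python
class ToyCollector:
    """Toy data collector illustrating one candidate meaning of deletion:
    after delete(uid), the observable state equals that of a collector
    that never stored uid's data. (Illustrative sketch only; the paper's
    formal definitions are cryptographic and cover far more aspects.)"""

    def __init__(self):
        self._records = {}

    def store(self, user_id, data):
        self._records[user_id] = data

    def delete(self, user_id):
        # Remove every trace of user_id from this toy state.
        self._records.pop(user_id, None)

    def state(self):
        # Canonical view of the state, used for indistinguishability checks.
        return dict(sorted(self._records.items()))
```

In this toy model the guarantee is checkable by comparing the state of a collector that stored-then-deleted against one that never collected; the paper's contribution is making this comparison precise for realistic systems.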

    The Cryptographic Imagination

    Get PDF
    Originally published in 1996. In The Cryptographic Imagination, Shawn Rosenheim uses the writings of Edgar Allan Poe to pose a set of questions pertaining to literary genre, cultural modernity, and technology. Rosenheim argues that Poe's cryptographic writing—his essays on cryptography and the short stories that grew out of them—requires that we rethink the relation of poststructural criticism to Poe's texts and, more generally, reconsider the relation of literature to communication. Cryptography serves not only as a template for the language, character, and themes of much of Poe's late fiction (including his creation, the detective story) but also as a "secret history" of literary modernity itself. "Both postwar fiction and literary criticism," the author writes, "are deeply indebted to the rise of cryptography in World War II." Still more surprising, in Rosenheim's view, Poe is not merely a source for such literary instances of cryptography as the codes in Conan Doyle's "The Dancing-Men" or in Jules Verne, but, through his effect on real cryptographers, Poe's writing influenced the outcome of World War II and the development of the Cold War. However unlikely such ideas sound, The Cryptographic Imagination offers compelling evidence that Poe's cryptographic writing clarifies one important avenue by which the twentieth century called itself into being. "The strength of Rosenheim's work extends to a revisionistic understanding of the entirety of literary history (as a repression of cryptography) and then, in a breathtaking shift of register, interlinks Poe's exercises in cryptography with the hyperreality of the CIA, the Cold War, and the Internet. What enables this extensive range of applications is the stipulated tension Rosenheim discerns in the relationship between the forms of the literary imagination and the condition of its mode of production. 
Cryptography, in this account, names the technology of literary production—the diacritical relationship between decoding and encoding—that the literary imagination dissimulates as hieroglyphics—the hermeneutic relationship between a sign and its content."—Donald E. Pease, Dartmouth College

    Communicative action in information security systems: an application of social theory in a technical domain

    Get PDF
    This thesis is about grounding an increasingly common practice in an established theory where no explicit theory currently exists. The common practice that is the subject of this study is information security. It is commonly held that information security means maintaining the confidentiality, integrity (accuracy) and availability of information. It seems that a whole industry has built up with tools, techniques and consultants to help organisations achieve a successful information security practice. There is even a British Standard containing around 130 controls, together with a management system to guide organisations and practitioners. In the absence of many alternatives, this British Standard has grown into something of a requirement for organisations concerned about the security of their information. The British Standard was developed almost entirely through the collaboration of some powerful blue-chip organisations. These organisations compared their practices and found some key areas of commonality. These common areas became the foundation of many information security practices today. Although there has been considerable evolutionary change, the fundamentals, not least the principles of confidentiality, integrity and availability, remain largely the same. It is argued in this thesis that the absence of a theoretical grounding has left the domain weak and unable to cope with rapid developments in information security. It is also argued that there was far too little consideration of human issues when the standard was devised, and that the situation has worsened recently with greater reliance on information security, driven by more threats of increasing complexity and by more restrictive controls implemented to counteract those threats. This thesis aims to pull human issues into the domain of information security: a domain currently dominated by non-social and practical paradigms. The key contribution of this thesis is therefore to provide a new model around which information security practices can be evaluated. This new model has a strong and established theoretical basis. The theory selected to underpin the new model is in the broad domain of critical social theory.

    The Information Theoretic Interpretation of the Length of a Curve

    Full text link
    In the context of holographic duality with AdS3 asymptotics, the Ryu-Takayanagi formula states that the entanglement entropy of a subregion is given by the length of a certain bulk geodesic. The entanglement entropy can be operationalized as the entanglement cost necessary to transmit the state of the subregion from one party to another while preserving all correlations with a reference party. The question then arises as to whether the lengths of other bulk curves can be interpreted as entanglement costs for some other information theoretic tasks. Building on recent results showing that the length of more general bulk curves is computed by the differential entropy, we introduce a new task called constrained state merging, whereby the state of the boundary subregion must be transmitted using operations restricted in location and scale in a way determined by the geometry of the bulk curve. Our main result is that the cost to transmit the state of a subregion under the conditions of constrained state merging is given by the differential entropy and hence the signed length of the corresponding bulk curve. When the cost is negative, constrained state merging distills entanglement rather than consuming it. This demonstration has two parts: first, we exhibit a protocol whose cost is the length of the curve and second, we prove that this protocol is optimal in that it uses the minimum amount of entanglement. In order to complete the proof, we additionally demonstrate that single-shot smooth conditional entropies for intervals in 1+1-dimensional conformal field theories with large central charge are well approximated by their von Neumann counterparts. We also revisit the relationship between the differential entropy and the maximum entropy among locally consistent density operators, demonstrating large quantitative discrepancy between the two quantities in conformal field theories. Comment: 40 pages, 7 figures
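For readers unfamiliar with the quantity the abstract builds on: in its standard discrete form from the bulk-curve literature, the differential entropy of a family of overlapping boundary intervals \(I_k\) with entanglement entropies \(S(I_k)\) is (notation is a common convention, not fixed by the abstract itself):

```latex
% Discrete differential entropy for overlapping intervals I_k;
% its continuum limit computes the signed length of the bulk curve.
E = \sum_{k} \Big[ S(I_k) - S(I_k \cap I_{k+1}) \Big]
```

The subtraction of the overlap entropies is what allows \(E\), and hence the curve length, to be negative, matching the abstract's remark that constrained state merging can distill rather than consume entanglement.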

    Principles of Security and Trust

    Get PDF
    This open access book constitutes the proceedings of the 8th International Conference on Principles of Security and Trust, POST 2019, which took place in Prague, Czech Republic, in April 2019, held as part of the European Joint Conferences on Theory and Practice of Software, ETAPS 2019. The 10 papers presented in this volume were carefully reviewed and selected from 27 submissions. They deal with theoretical and foundational aspects of security and trust, including new theoretical results, practical applications of existing foundational ideas, and innovative approaches stimulated by pressing practical problems.