243 research outputs found

    Sounding the dead in Cambodia: cultivating ethics, generating wellbeing, and living with history through music and sound

    Full text link
    This dissertation rethinks the ethics of history and trauma in post-genocide Cambodia by examining how Cambodians use a broad repertoire of sounded practices to form relations of mutual care with ancestors, dead teachers, deities, and other predecessors. At its root, the dissertation is the study of an ethical-religious-aesthetic system by which Cambodians recall predecessors’ legacies, care for the dead, and engage ancestors and deities as supportive co-presences. Traditional and popular musics, Buddhist chants and incantations, whispers, and the non-acoustic practice of “speaking in the heart” (niyāy knung citt) are among the primary sounded practices that Cambodians use to engage the dead. Parts One and Two detail those sounded practices and their social implications. I discuss how previous approaches have misinterpreted the nature and capacities of Cambodian music and other ritualized sounds through historicist, colonialist, and secular epistemologies, which cast those sounds as “culture” or “performance” and ignore their capacities as modes of ethics and exchange with the dead. Instead, by rethinking those sounded practices as Cambodian-Buddhist ethics and exchange, I examine how Cambodians fulfill an obligation to care for the ancestors who have supported them. I suggest that fulfilling that obligation generates personal wellbeing and provides a new model for what living with history can sound like and feel like. Taken together, in Parts One and Two, I detail the non-linear temporalities, types of personhood, ethics, exchange with the dead, and the intergenerational mode of living with history that Cambodians bring into being through music and sound. Part Three zooms further out to discuss how sounded relations with the dead have consequences for national and international politics, which leads to larger critiques of the Cambodian government’s politicization of Khmer Rouge remembrance and of international humanitarian efforts that attempt to help Cambodians heal from trauma. Since at least the mid-1990s, a plurality of international activists, scholars, volunteers, and development workers have concluded that Cambodians perpetuate a silence about the Khmer Rouge era that furthers their traumatization. Most observers suggest that Cambodians need to provide public testimony about that violent past in order to heal. This dissertation contests those conclusions, following work in anthropology and trauma studies that problematizes the universalization of the Western psychotherapeutic notion of biomedical trauma and its treatments. I suggest that those calls for a testimonial voice presuppose historicist modes of remembrance and knowledge production that naturalize liberal Western models of personhood, citizenship, justice, wellness, and political agency. To move away from those models, I argue that Cambodian sounded and ritual practices generate what I term “modes of being historical” and “ways of living with history” that are intimate, familial, and intergenerational, that engage national pasts, and that can be a mode of political action. Those “modes of being historical” include but are not limited to telling stories of others’ struggles and deaths. I illustrate how Cambodians have long used a multitude of sounded practices to engage the past, grapple with life’s difficulties, and care for themselves and their ancestors. This dissertation posits that sound studies and ethnomusicology can further the emerging scholarly shifts toward studying the culturally specific ways people cope with difficult pasts. I propose a new approach to post-violence ethics and history by arguing for the decolonizing possibilities of emphasizing the modes of being historical, ethical relations of mutual care, and ontological entanglements with the dead that Cambodians generate through music and sound.

    Attribute Repair for Threat Prevention

    Get PDF
    We propose a model-based procedure for preventing security threats using formal models. We encode system models and threats as satisfiability modulo theories (SMT) formulas. This encoding allows us to pose security questions as satisfiability queries. We formulate threat prevention as an optimization problem over the same formulas. The outcome of our threat prevention procedure is a suggested repair of model attributes that eliminates the threats. We implement our approach using the state-of-the-art Z3 SMT solver and interface it with the threat analysis tool THREATGET. We demonstrate the value of our procedure in two case studies from the automotive and smart home domains.
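
    As a rough illustration of this kind of encoding (a sketch only, not THREATGET's or the paper's actual model), the snippet below uses the z3 Python bindings: hypothetical system attributes become Boolean variables, a threat rule becomes a formula over them, and an Optimize instance looks for the smallest change of attribute values that makes the threat infeasible. The attribute names and the threat rule are invented for illustration.

```python
# Minimal sketch (not THREATGET's actual encoding): attribute repair with the
# z3-solver Python bindings. Attribute names and the threat rule are invented.
from z3 import Bool, Not, And, Or, Optimize, sat, is_true

# Hypothetical system-model attributes.
encrypted = Bool("channel_encrypted")
authenticated = Bool("ecu_authenticated")
debug_open = Bool("debug_port_open")

# A threat applies if the channel is unencrypted, or an open debug port
# sits on an unauthenticated ECU.
threat = Or(Not(encrypted), And(debug_open, Not(authenticated)))

# Current ("as-is") attribute values of the model.
current = {encrypted: False, authenticated: False, debug_open: True}

opt = Optimize()
opt.add(Not(threat))  # the repaired model must make the threat infeasible
for attr, value in current.items():
    # Soft constraint: keep each attribute at its current value if possible,
    # so the solver changes as few attributes as it can.
    opt.add_soft(attr if value else Not(attr), weight=1)

if opt.check() == sat:
    model = opt.model()
    repairs = [a for a, v in current.items() if is_true(model.eval(a)) != v]
    print("Suggested attribute repairs:", repairs)
```

    A plain Solver over the same formulas would answer the satisfiability-style security question (is the threat feasible in the current model?); the Optimize step turns that check into a repair suggestion.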

    2023-2024 Undergraduate Academic Catalog

    Get PDF

    Differentially Private Data Generation with Missing Data

    Full text link
    Although several existing methods succeed in generating synthetic data with differential privacy (DP) guarantees, they are inadequate for generating high-quality synthetic data when the input data has missing values. In this work, we formalize the problem of generating DP synthetic data in the presence of missing values and propose three effective adaptive strategies that significantly improve the utility of the synthetic data on four real-world datasets with different types and levels of missing data and privacy requirements. We also identify the relationship between the privacy guarantees with respect to the complete ground-truth data and those with respect to the incomplete data for these DP synthetic data generation algorithms. We model the missingness mechanisms as a sampling process to obtain tighter upper bounds on the privacy guarantees with respect to the ground-truth data. Overall, this study contributes to a better understanding of the challenges and opportunities for using private synthetic data generation algorithms in the presence of missing data.
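
    The idea of modeling missingness as a sampling process is reminiscent of the standard privacy-amplification-by-subsampling argument. As a hedged sketch (the paper's actual bounds may be derived differently), if a mechanism is ε-DP on the observed records and each ground-truth record is retained independently with probability q, the guarantee with respect to the complete data tightens roughly as follows.

```python
import math

def amplified_epsilon(eps: float, q: float) -> float:
    """Standard amplification-by-subsampling bound (illustrative only, not
    the paper's exact analysis): an eps-DP mechanism run on records that are
    each retained independently with probability q is
    ln(1 + q*(e^eps - 1))-DP with respect to the full data."""
    return math.log(1.0 + q * (math.exp(eps) - 1.0))

# Example: 30% of records are missing (retention rate q = 0.7), eps = 1.0.
print(amplified_epsilon(1.0, 0.7))  # ~0.79, tighter than the nominal 1.0
```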

    Auf dem Weg zur Cyberpolis

    Get PDF
    The social, cultural, and political process of digitalization has made new forms of community and education conceivable, shaped decisively by, among other things, three scenes: the cybernetic-artistic background of the PC culture that formed the basis of Silicon Valley, the popularization of the internet in the 1990s, and current developments grouped under the term digital nomadism. Against the backdrop of the associated shifts in how community is understood, Martin Donner and Heidrun Allert ask what practicable possibilities exist for shaping digital society.

    Private Data Exploring, Sampling, and Profiling

    Get PDF
    Data analytics is widely used not only as a business tool, which empowers organizations to drive efficiencies, glean deeper operational insights, and identify new opportunities, but also for the greater good of society, as it helps address some of the world's most pressing issues, such as developing COVID-19 vaccines and fighting poverty and climate change. Data analytics is a process involving a pipeline of tasks over the underlying datasets, such as data acquisition and cleaning, data exploration and profiling, building statistics, and training machine learning models. In many cases, conducting data analytics faces two practical challenges. First, many sensitive datasets are restricted and do not allow unfettered access. Second, data assets are often owned and stored in silos by multiple business units within an organization, each with different access controls. Therefore, data scientists have to do analytics on private and siloed data. There is a fundamental trade-off between data privacy and data analytics tasks. On the one hand, achieving good-quality data analytics requires understanding the whole picture of the data; on the other hand, despite recent advances in privacy and security primitives such as differential privacy and secure computation, these primitives, when naively applied, often significantly degrade tasks' efficiency and accuracy, due to expensive computations and injected noise, respectively. Moreover, those techniques are often piecemeal and fall short of integrating holistically into end-to-end data analytics tasks. In this thesis, we approach this problem by treating privacy and utility as constraints on data analytics. First, we study each task and express its utility as data constraints; then, we select a principled data privacy and security model for each task; and finally, we develop mechanisms to combine them into end-to-end analytics tasks. This dissertation addresses the specific technical challenges of trading off privacy and utility in three popular analytics tasks. The first challenge is to ensure query accuracy in private data exploration. Current systems for answering queries with differential privacy place an inordinate burden on the data scientist to understand differential privacy, manage their privacy budget, and even implement new algorithms for noisy query answering. Moreover, current systems do not provide any guarantees to the data analyst on the quality they care about, namely the accuracy of query answers. We propose APEx, a generic accuracy-aware privacy query engine for private data exploration. The key distinction of APEx is that it allows the data scientist to explicitly specify the desired accuracy bounds for a SQL query. Using experiments with query benchmarks and a case study, we show that APEx allows high exploration quality with a reasonable privacy loss. The second challenge is to preserve the structure of the data in private data synthesis. Existing differentially private data synthesis methods aim to generate data that is useful for target applications, but they fail to preserve one of the most fundamental properties of structured data: the underlying correlations and dependencies among tuples and attributes. As a result, the synthesized data is not useful for any downstream tasks that require this structure to be preserved. We propose Kamino, a data synthesis system that ensures differential privacy and preserves the structure and correlations present in the original dataset. We empirically show that, while preserving the structure of the data, Kamino achieves utility comparable to, and sometimes better than, state-of-the-art differentially private data synthesis methods in applications such as training classification models and answering marginal queries. The third challenge is efficient and secure private data profiling. Discovering functional dependencies (FDs) usually requires access to all data partitions to find constraints that hold on the whole dataset. Simply applying general secure multi-party computation protocols incurs high computation and communication cost. We propose SMFD, which formulates the FD discovery problem in the secure multi-party setting, and we design secure and efficient cryptographic protocols to discover FDs over distributed partitions. Experimental results show that SMFD is practically efficient compared with non-secure distributed FD discovery and can significantly outperform a general-purpose multi-party computation framework.
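
    To make the APEx-style accuracy-to-privacy translation concrete, the sketch below (an illustrative simplification, not the APEx engine) handles a single counting query answered with the Laplace mechanism: for a user-specified error bound alpha that must hold with probability 1 - beta, a sufficient privacy budget is epsilon = ln(1/beta)/alpha, because Laplace(1/epsilon) noise exceeds alpha in magnitude with probability exp(-epsilon*alpha).

```python
import math
import random

def budget_for_accuracy(alpha: float, beta: float) -> float:
    """Smallest epsilon such that a sensitivity-1 counting query answered
    with Laplace(1/epsilon) noise errs by more than alpha with probability
    at most beta. Illustrative of accuracy-aware budgeting, not APEx itself."""
    return math.log(1.0 / beta) / alpha

def noisy_count(true_count: int, epsilon: float) -> float:
    """Laplace mechanism for a count query (sensitivity 1)."""
    b = 1.0 / epsilon
    u = random.random() - 0.5
    return true_count - b * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

# The analyst asks for error <= 10 with 95% confidence; the engine charges
# only the budget needed to meet that bound.
eps = budget_for_accuracy(alpha=10.0, beta=0.05)  # ~0.30
print(eps, noisy_count(1000, eps))
```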

    Automated Approaches for Program Verification and Repair

    Get PDF
    Formal methods techniques, such as verification, analysis, and synthesis, allow programmers to prove properties of their programs or automatically derive programs from specifications. Making such techniques usable requires care: they must provide useful debugging information, be scalable, and enable automation. This dissertation presents automated analysis and synthesis techniques to ease the debugging of modular verification systems and allow easy access to constraint solvers from functional code. Further, it introduces machine-learning-based techniques to improve the scalability of off-the-shelf syntax-guided synthesis solvers, and techniques to reduce the burden of network administrators writing and analyzing firewalls. We describe the design and implementation of a symbolic execution engine, G2, for non-strict functional languages such as Haskell. We extend G2 to both debug and automate the process of modular verification, and give Haskell programmers easy access to constraint solvers via a library named G2Q. Modular verifiers, such as LiquidHaskell, Dafny, and ESC/Java, allow programmers to write and prove specifications of their code. When a modular verifier fails to verify a program, it is not necessarily because of an actual bug in the program. This is because when verifying a function f, modular verifiers consider only the specification of a called function g, not the actual definition of g. Thus, a modular verifier may fail to prove a true specification of f if the specification of g is too weak. We present a technique, counterfactual symbolic execution, to aid in the debugging of modular verification failures. The approach uses symbolic execution to find concrete counterexamples, in the case of an actual inconsistency between a program and a specification, and abstract counterexamples, in the case that a function specification is too weak. Further, a technique based on a counterexample-guided inductive synthesis (CEGIS) loop is introduced to fully automate the process of modular verification, by using the found counterexamples to automatically infer needed function specifications. The counterfactual symbolic execution and automated specification inference techniques are implemented in G2, and evaluated on existing LiquidHaskell errors and programs. We also leveraged G2 to build a library, G2Q, which allows writing constraint-solving problems directly as Haskell code. Users of G2Q can embed specially marked Haskell constraints (Boolean expressions) into their normal Haskell code, while marking some of the variables in the constraint as symbolic. Then, at runtime, G2Q automatically derives values for the symbolic variables that satisfy the constraint, and returns those values to the outside code. Unlike other constraint-solving solutions, such as directly calling an SMT solver, G2Q uses symbolic execution to unroll recursive function definitions, and guarantees that the use of G2Q constraints will preserve type correctness. We further consider the problem of synthesizing functions via a class of tools known as syntax-guided synthesis (SyGuS) solvers. We introduce a machine-learning-based technique to preprocess SyGuS problems and reduce the space that the solver must search for a solution. We demonstrate that the technique speeds up an existing SyGuS solver, CVC4, on a set of SyGuS benchmarks. Finally, we describe techniques to ease the analysis and repair of firewalls. Firewalls are widely deployed to manage network security. However, firewall systems provide only a primitive interface, in which the specification is given as an ordered list of rules. This makes it hard to manually track and maintain the behavior of a firewall. We introduce a formal semantics for iptables firewall rules via a translation to first-order logic with uninterpreted functions and linear integer arithmetic, which allows encoding of firewalls into a decidable logic. We then describe techniques to automate the analysis and repair of firewalls using SMT solvers, based on user-provided specifications of the desired behavior. We evaluate this approach with real-world case studies collected from StackOverflow users.
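
    As a rough sketch of what such an encoding can look like (the dissertation's iptables semantics is far richer, covering chains, connection state, and more), the snippet below models packet fields as integers, folds an ordered rule list into nested if-then-else terms with first-match semantics, and checks a user-provided specification with the z3 SMT solver. The rules and the specification are invented for illustration.

```python
# Hedged sketch of encoding an ordered firewall rule list into SMT with z3;
# the rules and the specification are hypothetical, and real iptables
# semantics (chains, connection tracking, NAT) is much richer.
from z3 import Int, And, Not, If, Implies, Solver, sat, BoolVal

src_ip = Int("src_ip")      # packet fields as integers (IPv4 as a 32-bit value)
dst_port = Int("dst_port")

ACCEPT, DROP = BoolVal(True), BoolVal(False)

# Ordered rules: (guard, action); the first matching rule decides.
rules = [
    (And(src_ip >= 0xC0A80100, src_ip <= 0xC0A801FF, dst_port == 22), ACCEPT),
    (dst_port == 22, DROP),   # drop SSH from everywhere else
    (dst_port == 80, ACCEPT),
]
default_action = DROP

# Fold the ordered list into a single first-match decision term.
decision = default_action
for guard, action in reversed(rules):
    decision = If(guard, action, decision)

# Specification: SSH from the 192.168.1.0/24 management subnet is accepted.
spec = Implies(And(src_ip >= 0xC0A80100, src_ip <= 0xC0A801FF, dst_port == 22),
               decision)

s = Solver()
s.add(Not(spec))  # search for a counterexample packet
if s.check() == sat:
    print("specification violated by packet:", s.model())
else:
    print("specification holds for all packets")
```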