
    LIPIcs, Volume 251, ITCS 2023, Complete Volume

    Exploiting Process Algebras and BPM Techniques for Guaranteeing Success of Distributed Activities

    The communications and collaborations among activities, processes, or systems in general are the basis of complex systems defined as distributed systems. Given the increasing complexity of their structure, interactions, and functionalities, many research areas are interested in providing modelling techniques and verification capabilities to guarantee their correctness and satisfaction of properties. In particular, the formal methods community provides robust verification techniques to prove system properties. However, most approaches rely on manually designed formal models, making the analysis process challenging because it requires an expert in the field. On the other hand, the BPM community provides a widely used graphical notation (i.e., BPMN) to design the internal behaviour and interactions of complex distributed systems, which can be enhanced with additional features (e.g., privacy technologies). Furthermore, BPM uses process mining techniques to automatically discover these models from event observations. However, verifying properties and expected behaviour, especially in collaborations, still needs a solid methodology. This thesis aims at exploiting the features of the formal methods and BPM communities to provide approaches that enable formal verification over distributed systems. In this context, we propose two approaches. The modelling-based approach starts from BPMN models and produces process algebra specifications to enable formal verification of system properties, including privacy-related ones. The process mining-based approach starts from log observations to automatically generate process algebra specifications to enable verification capabilities.
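
    As a toy illustration of the modelling-based direction, the sketch below renders a sequential BPMN flow as a CCS-style prefix chain. All names here (Task, to_ccs, the task labels) are hypothetical stand-ins; the thesis’s actual BPMN-to-process-algebra translation is not reproduced.

```python
# Hypothetical sketch: a sequential BPMN flow as a CCS-style prefix chain.
from dataclasses import dataclass

@dataclass
class Task:
    name: str  # the BPMN task label

def to_ccs(tasks: list[Task]) -> str:
    """Render tasks as the prefix chain a.b.c.0 (0 = the terminated process)."""
    return ".".join(t.name for t in tasks) + ".0"

flow = [Task("ReceiveOrder"), Task("AnonymiseData"), Task("Ship")]
print(to_ccs(flow))  # -> ReceiveOrder.AnonymiseData.Ship.0
```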

    LIPIcs, Volume 261, ICALP 2023, Complete Volume

    Individual Verifiability for E-Voting, From Formal Verification To Machine Learning

    The cornerstone of secure electronic voting protocols lies in the principle of individual verifiability. This thesis delves into the intricate task of harmonizing this principle with two other crucial aspects: ballot privacy and coercion-resistance. In the realm of electronic voting, individual verifiability serves as a critical safeguard: it empowers each voter to confirm that their vote has been accurately recorded and counted in the final tally. Ballot privacy, the assurance that a voter’s choice remains confidential, is a fundamental right in democratic processes; it ensures that voters can express their political preferences without fear of retribution or discrimination. Coercion-resistance, on the other hand, refers to the system’s resilience against attempts to influence or manipulate a voter’s choice. Furthermore, this thesis ventures into an empirical analysis of the effectiveness of individual voter checks in ensuring a correct election outcome. It considers a scenario where an adversary possesses additional knowledge about the individual voters and can strategically decide which voters to target, and it aims to estimate the degree to which these checks can still guarantee the accuracy of the election results under such circumstances. The first contribution of this thesis revisits the seminal coercion-resistant e-voting protocol by Juels, Catalano, and Jakobsson (JCJ), examining its usability and practicality. It discusses the credential-handling system proposed by Neumann et al., which uses a smart card to unlock or fake credentials via a PIN code. The thesis identifies several security concerns with the JCJ protocol, including an attack on coercion-resistance due to information leakage from the removal of duplicate ballots, as well as the issues of PIN errors and the single point of failure associated with the smart card. To mitigate these vulnerabilities, we propose hardware-flexible protocols that allow credentials to be stored by ordinary means while still being PIN-based and providing PIN-error resilience. One of these protocols features linear tally complexity, ensuring efficiency and scalability for large-scale electronic voting systems. The second contribution explores and validates the ballot privacy definition proposed by Cortier et al., particularly in the context of an adversarial presence. Our exploration involves both Selene and the MiniVoting abstract scheme: we apply Cortier’s definition of ballot privacy to these schemes and investigate how they hold up under this framework. To ensure the validity of our findings, we employ tools for machine-checked proof, a rigorous and reliable means of verifying our results. The final contribution of this thesis is a detailed examination and analysis of the Estonian election results.
This analysis is conducted in several phases, each contributing to a comprehensive understanding of the election process. The first phase is a comprehensive marginal analysis of the Estonian election results: we compute upper bounds for several margins, providing a detailed statistical overview of the election outcome. This analysis allows us to identify key trends and patterns in the voting data, laying the groundwork for the subsequent phase of our research. We then train multiple binary classifiers to predict whether a voter is likely to verify their vote; such predictive modeling gives an adversary insight into voter behavior and the factors that may influence the decision to verify. With the insights gained from the previous phases, an adversarial classification algorithm that identifies verifying voters is trained. The success likelihood of such an adversary is estimated using various machine learning models, providing a more robust assessment of potential threats to the election process.
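
    The “predict which voters verify” step admits a short sketch. The following is a minimal, hypothetical illustration: the features, the synthetic data, and the choice of logistic regression are assumptions made here and do not reproduce the thesis’s actual feature set or models.

```python
# Hypothetical sketch: train a binary classifier on synthetic voter features
# to predict who will verify their vote.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.integers(18, 90, n),  # age (invented feature)
    rng.integers(0, 2, n),    # has voted online before (invented feature)
])
# Synthetic label: older, experienced online voters verify more often.
p = 1 / (1 + np.exp(-(0.03 * (X[:, 0] - 50) + 1.2 * X[:, 1] - 0.5)))
y = rng.random(n) < p

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = LogisticRegression().fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```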

    Fast batched asynchronous distributed key generation

    We present new protocols for threshold Schnorr signatures that work in an asynchronous communication setting, providing robustness and optimal resilience. These protocols provide unprecedented performance in terms of communication and computational complexity. In terms of communication complexity, for each signature, a single party must transmit a few dozen group elements and scalars across the network (independent of the size of the signing committee). In terms of computational complexity, the amortized cost for one party to generate a signature is actually less than that of just running the standard Schnorr signing or verification algorithm (at least for moderately sized signing committees, say, up to 100). For example, we estimate that with a signing committee of 49 parties, at most 16 of which are corrupt, we can generate 50,000 Schnorr signatures per second (assuming each party can dedicate one standard CPU core and 500 Mb/s of network bandwidth to signing). Importantly, this estimate includes both the cost of an offline precomputation phase (which just churns out message-independent presignatures) and an online signature generation phase. Also, the online signing phase can generate a signature with very little network latency (just one to three rounds, depending on how throughput and latency are balanced). To achieve this result, we provide two new innovations. One is a new secret sharing protocol (again, asynchronous, robust, and optimally resilient) that allows the dealer to securely distribute shares of a large batch of ephemeral secret keys and to publish the corresponding ephemeral public keys. To achieve better performance, our protocol minimizes public-key operations and, in particular, is based on a novel technique that avoids the traditional approach based on polynomial commitments. The second innovation is a new algorithm to efficiently combine ephemeral public keys contributed by different parties (some possibly corrupt) into a smaller number of secure ephemeral public keys. This new algorithm is based on a novel construction of a so-called super-invertible matrix, along with a corresponding highly efficient algorithm for multiplying this matrix by a vector of group elements. As protocols for verifiably sharing a secret key with an associated public key and the technology of super-invertible matrices both play a major role in threshold cryptography and multi-party computation, our two new innovations should have applicability well beyond threshold Schnorr signatures.
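
    The second innovation can be illustrated compactly. The sketch below shows the idea of applying a super-invertible matrix (here a Vandermonde matrix, every square column-submatrix of which is invertible over a field) to a vector of group elements “in the exponent”. The group, dimensions, and naive double loop are assumptions for demonstration only; the paper’s highly efficient multiplication algorithm is not reproduced.

```python
# Toy sketch: combine ephemeral public keys with a Vandermonde matrix.
# Illustrative only: real protocols use prime-order (elliptic-curve)
# groups, and exponents live in the group's prime-order scalar field.
p = 2**127 - 1  # a Mersenne prime, used here as a stand-in modulus
g = 3           # base element for the demonstration

def vandermonde(rows, cols):
    # M[i][j] = (j+1)^i, with distinct evaluation points 1..cols
    return [[pow(j + 1, i, p) for j in range(cols)] for i in range(rows)]

def apply_in_exponent(M, elems):
    # out[i] = prod_j elems[j]^M[i][j], the group written multiplicatively
    out = []
    for row in M:
        acc = 1
        for m_ij, e_j in zip(row, elems):
            acc = acc * pow(e_j, m_ij, p) % p
        out.append(acc)
    return out

# Four parties each contribute an ephemeral public key g^s_j; the 2x4
# matrix compresses them into two combined keys (in this toy setting,
# two outputs remain secure if at most two contributions are corrupt).
pks = [pow(g, s, p) for s in (5, 7, 11, 13)]
print(apply_in_exponent(vandermonde(2, 4), pks))
```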

    Formal Methods for Trustworthy Voting Systems : From Trusted Components to Reliable Software

    Voting is an important pillar of democratic societies, and its outcome may have a dramatic and broad impact on societal progress. Therefore, it is paramount that such a society has extensive trust in the electoral process, such that the system’s functioning is reliable and stable with respect to the expectations within society. Yet, with or without the use of modern technology, voting is full of algorithmic and security challenges, and the failure to address these challenges in a controlled manner may produce fundamental flaws in the voting system and potentially undermine critical societal aspects. In this thesis, we argue for a development process of voting systems that is rooted in and assisted by formal methods that produce transparently checkable evidence for the guarantees that the final system should provide, so that it can be deemed trustworthy. The goal of this thesis is to advance the state of the art in formal methods that allow the systematic development of trustworthy voting systems whose guarantees can be provably verified. In the literature, voting systems are modeled in the following four comparatively separable and distinguishable layers: (1) the physical layer, (2) the computational layer, (3) the election layer, and (4) the human layer. Current research usually either stays mostly within one of those layers or lacks machine-checkable evidence; consequently, trusted and understandable criteria often lack formally proven and checkable guarantees at the software level, and vice versa. The contributions in this work are formal methods that fill the trust gap between the principal election layer and the computational layer through a reliable translation of trusted and understandable criteria into trustworthy software. Thereby, executable procedures can be formally traced back to the election-level criteria and understood by election experts without the need for inspection at the code level, and trust can be preserved in the resulting system. The works in this thesis all contribute to this end and consist of five distinct contributions: (I) a method for the generation of secure card-based communication schemes, (II) a method for the synthesis of reliable tallying procedures, (III) a method for the efficient verification of reliable tallying procedures, (IV) a method for the computation of dependable election margins for reliable audits, and (V) a case study on the security verification of the GI voter-anonymization software. These contributions span formal methods, on illustrative examples, for each of the three principal components between the election layer and the computational layer: (1) voter-ballot box communication, (2) election method, and (3) election management. Within the first component, the voter-ballot box communication channel, we build a bridge from the communication channel to the cryptography scheme by automatically generating secure card-based schemes from a small formal model with a parameterization of the desired security requirements.
For the second component, the election method, we build a bridge from the election method to the tallying procedure by (1) automatically synthesizing a runnable tallying procedure from the desired requirements, given as properties that capture the desired intuitions or regulations of fairness considerations, (2) automatically generating either comprehensible arguments or bounded proofs to compare tallying procedures based on user-definable fairness properties, and (3) automatically computing concrete election margins for a given tallying procedure, the collected ballots, and the computed election result, which enable efficient election audits. Finally, for the third component, the election management system, we perform a case study and apply state-of-the-art verification technology to a real-world e-voting system that was used for the annual elections of the German Informatics Society (GI – “Gesellschaft für Informatik”) in 2019. The case study consists of the formal implementation-level security verification that the voter identities are securely anonymized and that the voters’ passwords cannot be leaked. The presented methods assist the systematic development and verification of provably trustworthy voting systems across traditional layers, i.e., from the election layer to the computational layer. They all pursue the goal of making voting systems trustworthy through reliable and explainable formal requirements. We evaluate the devised methods on minimal card-based protocols that compute a secure AND function for two different decks of cards, on a classical knock-out tournament and several Condorcet rules, on various plurality, scoring, and Condorcet rules from the literature, on the Danish national parliamentary elections in 2015, and on a state-of-the-art electronic voting system that has been used for the German Informatics Society’s annual elections since 2019.
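
    Contribution (IV), the computation of election margins, can be made concrete for the simplest tallying rule. The sketch below is a toy for plain plurality only (the function name and ballot format are hypothetical); the method in the thesis handles arbitrary user-defined tallying procedures.

```python
# Toy margin computation for plain plurality: the minimum number of
# ballots that would have to change before the announced winner differs.
from collections import Counter

def plurality_margin(ballots):
    (w_name, w), (r_name, r) = Counter(ballots).most_common(2)
    # Moving one ballot from the winner to the runner-up closes the gap
    # by 2, so the outcome flips after floor((w - r) / 2) + 1 moves.
    return (w - r) // 2 + 1

print(plurality_margin(["A", "A", "A", "B", "B", "C"]))  # -> 1
```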

    CLARIN

    The book provides a comprehensive overview of the Common Language Resources and Technology Infrastructure – CLARIN – for the humanities. It covers a broad range of CLARIN language resources and services, its underlying technological infrastructure, the achievements of national consortia, and challenges that CLARIN will tackle in the future. The book is published ten years after the establishment of CLARIN as a European Research Infrastructure Consortium.

    Cryptalphabet Soup: DPFs meet MPC and ZKPs

    Secure multiparty computation (MPC) protocols enable multiple parties to collaborate on a computation using private inputs possessed by the different parties in the computation. At the same time, MPC protocols ensure that no participating party learns anything about the other parties’ private inputs beyond what it can infer from the computation’s output and its own inputs. MPC has wide-ranging applications for privacy-protecting systems. However, these systems have been plagued by limited performance, lack of scalability, and poor accuracy. In this thesis, we demonstrate several novel techniques for using distributed point functions (DPFs) in combination with MPC to obtain significant performance improvements in several different applications. Namely, using novel observations about the structure of the most efficient available DPF construction in the literature, we show that DPF keys from untrusted sources can be checked for correctness using an MPC protocol between the two key holders, with direct applications in sender-anonymous messaging. We expand these observations to produce the most efficient available method to evaluate piecewise-polynomial functions, also known as splines. The scalability and efficiency of this method allow splines to be used for extremely high-accuracy approximation of non-linear functions in MPC. Furthermore, the protocols proposed in this thesis far outperform prior solutions, both in asymptotic measurements and in concrete benchmarks using high-performance software implementations at both small and large scale.
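
    To make the central primitive concrete, the following deliberately naive sketch shows only the functionality a DPF provides, not an efficient construction: two keys that additively share the truth table of a point function f(x) = beta if x = alpha, else 0. Real DPFs, including the construction this thesis builds on, derive keys of size logarithmic in the domain from a PRG tree; the modulus and names below are illustrative assumptions.

```python
# Naive "DPF" with O(N) keys: additive shares of a point function's
# truth table. Shown for the interface only; real DPFs are O(log N).
import secrets

P = 2**61 - 1  # stand-in modulus for the additive secret sharing

def gen(alpha, beta, n):
    k0 = [secrets.randbelow(P) for _ in range(n)]
    k1 = [((beta if x == alpha else 0) - k0[x]) % P for x in range(n)]
    return k0, k1

def eval_share(key, x):
    return key[x]  # each key alone reveals nothing about alpha or beta

k0, k1 = gen(alpha=5, beta=1, n=8)
recovered = [(eval_share(k0, x) + eval_share(k1, x)) % P for x in range(8)]
print(recovered)  # -> [0, 0, 0, 0, 0, 1, 0, 0]
```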

    Migration Research in a Digitized World: Using Innovative Technology to Tackle Methodological Challenges

    This open access book explores the implications of the digital revolution for migration scholars’ methodological toolkit. New information and communication technologies hold considerable potential to improve the quality of migration research by originating previously non-viable solutions to a myriad of methodological challenges in this field of study. Combining cutting-edge migration scholarship and methodological expertise, the book addresses a range of crucial issues related to both researcher-designed data collections and the secondary use of “big data”, highlighting opportunities as well as challenges and limitations. A valuable source for students and scholars engaged in migration research, the book will also be of keen interest to policymakers.

    Private Data Exploring, Sampling, and Profiling

    Data analytics is being widely used not only as a business tool, which empowers organizations to drive efficiencies, glean deeper operational insights, and identify new opportunities, but also for the greater good of society, as it is helping solve some of the world’s most pressing issues, such as developing COVID-19 vaccines and fighting poverty and climate change. Data analytics is a process involving a pipeline of tasks over the underlying datasets, such as data acquisition and cleaning, data exploration and profiling, building statistics, and training machine learning models. In many cases, conducting data analytics faces two practical challenges. First, many sensitive datasets are restricted and do not allow unfettered access; second, data assets are often owned and stored in silos by multiple business units within an organization, each with different access controls. Therefore, data scientists have to do analytics on private and siloed data. There is a fundamental trade-off between data privacy and data analytics tasks. On the one hand, achieving good-quality data analytics requires understanding the whole picture of the data; on the other hand, despite recent advances in designing privacy and security primitives such as differential privacy and secure computation, these primitives, when naively applied, often significantly degrade tasks’ efficiency and accuracy, due to expensive computations and injected noise, respectively. Moreover, those techniques are often piecemeal, and they fall short of integrating holistically into end-to-end data analytics tasks. In this thesis, we approach this problem by treating privacy and utility as constraints on data analytics. First, we study each task and express its utility as data constraints; then, we select a principled data privacy and security model for each task; and finally, we develop mechanisms to combine them into end-to-end analytics tasks. This dissertation addresses the specific technical challenges of trading off privacy and utility in three popular analytics tasks. The first challenge is to ensure query accuracy in private data exploration. Current systems for answering queries with differential privacy place an inordinate burden on the data scientist to understand differential privacy, manage their privacy budget, and even implement new algorithms for noisy query answering. Moreover, current systems do not provide any guarantees to the data analyst on the quality they care about, namely the accuracy of query answers. We propose APEx, a generic accuracy-aware privacy query engine for private data exploration. The key distinction of APEx is that it allows the data scientist to explicitly specify the desired accuracy bounds for a SQL query. Using experiments with query benchmarks and a case study, we show that APEx allows high exploration quality with a reasonable privacy loss. The second challenge is to preserve the structure of the data in private data synthesis. Existing differentially private data synthesis methods aim to generate useful data for given applications, but they fail to preserve one of the most fundamental properties of structured data: the underlying correlations and dependencies among tuples and attributes. As a result, the synthesized data is not useful for any downstream tasks that require this structure to be preserved. We propose Kamino, a data synthesis system that ensures differential privacy and preserves the structure and correlations present in the original dataset.
We empirically show that, while preserving the structure of the data, Kamino achieves comparable and even better usefulness in applications of training classification models and answering marginal queries than state-of-the-art methods for differentially private data synthesis. The third challenge is efficient and secure private data profiling. Discovering functional dependencies (FDs) usually requires access to all data partitions in order to find constraints that hold on the whole dataset, and simply applying general secure multi-party computation protocols incurs high computation and communication costs. We propose SMFD, which formulates the FD discovery problem in the secure multi-party scenario, and we design secure and efficient cryptographic protocols to discover FDs over distributed partitions. Experimental results show that SMFD is practically efficient compared with non-secure distributed FD discovery and can significantly outperform a general-purpose multi-party computation framework.
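
    APEx’s accuracy-first contract can be sketched for the simplest case. In the toy below, the analyst specifies an accuracy bound (error at most alpha with probability at least 1 - beta) for a single sensitivity-1 counting query, and the engine derives the privacy cost epsilon to charge for a Laplace-mechanism answer. The function names and the restriction to this one mechanism are assumptions of the sketch, not APEx’s actual interface.

```python
# Toy accuracy-to-privacy translation for one Laplace counting query.
import math, random

def epsilon_for_accuracy(alpha, beta, sensitivity=1.0):
    # Laplace noise with scale b satisfies P(|noise| > alpha) = exp(-alpha/b),
    # so b = alpha / ln(1/beta) meets the target, and epsilon = sensitivity/b.
    return sensitivity * math.log(1 / beta) / alpha

def noisy_count(true_count, alpha, beta):
    eps = epsilon_for_accuracy(alpha, beta)
    b = 1.0 / eps
    # Laplace(0, b) sampled as the difference of two exponentials with mean b.
    noise = random.expovariate(1 / b) - random.expovariate(1 / b)
    return true_count + noise, eps

answer, eps = noisy_count(true_count=1234, alpha=10, beta=0.05)
print(f"answer ~ {answer:.1f}, charged epsilon = {eps:.3f}")
```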