Cloud-Based Quadratic Optimization with Partially Homomorphic Encryption
This article develops a cloud-based protocol for a constrained quadratic optimization problem involving multiple parties, each holding private data. The protocol is based on projected gradient ascent on the Lagrange dual problem and exploits partially homomorphic encryption and secure communication techniques. Using formal cryptographic definitions of indistinguishability, the protocol is shown to achieve computational privacy. We present implementation results for the protocol and discuss its computational and communication complexity. We conclude with a discussion of privacy notions.
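As a rough sketch of the computation pattern this protocol builds on, the snippet below runs projected gradient ascent on the Lagrange dual of a strongly convex QP in plain numpy. All names are ours; in the actual protocol the aggregation steps would operate on partially homomorphically encrypted values (e.g., Paillier ciphertexts), which this unencrypted sketch deliberately omits.

```python
import numpy as np

def projected_dual_gradient_ascent(Q, c, A, b, alpha=0.05, iters=2000):
    """Solve  min 1/2 x^T Q x + c^T x  s.t.  A x <= b  (Q positive definite)
    by projected gradient ascent on the Lagrange dual."""
    mu = np.zeros(A.shape[0])                 # one dual variable per constraint
    Q_inv = np.linalg.inv(Q)
    for _ in range(iters):
        x = -Q_inv @ (c + A.T @ mu)           # closed-form primal minimizer of the Lagrangian
        mu = np.maximum(0.0, mu + alpha * (A @ x - b))  # ascend along A x - b, project onto mu >= 0
    return x, mu

# Example: min (x1 - 1)^2 + (x2 - 2)^2  s.t.  x1 + x2 <= 2   ->   x* = (0.5, 1.5)
Q = 2 * np.eye(2); c = np.array([-2.0, -4.0])
A = np.array([[1.0, 1.0]]); b = np.array([2.0])
print(projected_dual_gradient_ascent(Q, c, A, b))
```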
Towards Logical Specification of Statistical Machine Learning
We introduce a logical approach to formalizing statistical properties of machine learning. Specifically, we propose a formal model for statistical classification based on a Kripke model, and formalize various notions of classification performance, robustness, and fairness of classifiers using epistemic logic. We then show relationships among properties of classifiers, and between classification performance and robustness, which suggests robustness-related properties that, to our knowledge, have not been formalized in the literature. To formalize fairness properties, we define a notion of counterfactual knowledge and show techniques for formalizing conditional indistinguishability using counterfactual epistemic operators. To our knowledge, this is the first work that uses logical formulas to express statistical properties of machine learning, and that provides an epistemic (resp. counterfactually epistemic) view of robustness (resp. fairness) of classifiers. Comment: SEFM'19 conference paper (full version, with errors corrected).
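To make the epistemic reading concrete, here is a minimal toy sketch (our construction, not the paper's formal model): worlds are inputs, the accessibility relation links inputs an observer cannot distinguish, and a classifier's robustness at an input is read as the observer "knowing" the predicted label there.

```python
class KripkeModel:
    """Toy Kripke model: a set of worlds plus an accessibility relation."""

    def __init__(self, worlds, access):
        self.worlds = worlds      # iterable of world identifiers
        self.access = access      # dict: world -> set of accessible worlds

    def knows(self, phi, w):
        """Epistemic K: phi is known at w iff phi holds at every world
        accessible from w (i.e., at every indistinguishable input)."""
        return all(phi(v) for v in self.access[w])


# Worlds are inputs 0..4; inputs within distance 1 are indistinguishable.
worlds = range(5)
access = {w: {v for v in worlds if abs(v - w) <= 1} for w in worlds}
model = KripkeModel(worlds, access)

label = lambda x: x >= 2              # a stand-in "classifier"
# Robustness at w, read epistemically: the label is known (stable) at w.
robust_at = [w for w in worlds if model.knows(lambda v: label(v) == label(w), w)]
print(robust_at)                      # [0, 3, 4]
```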
A weakness measure for GR(1) formulae
In spite of the theoretical and algorithmic developments for system synthesis in recent years, little effort has been dedicated to quantifying the quality of the specifications used for synthesis. When dealing with unrealizable specifications, finding the weakest environment assumptions that would ensure realizability is typically desirable; in such a context, the weakness of the assumptions is a major quality parameter. The question of whether one assumption is weaker than another is commonly interpreted using implication or, equivalently, language inclusion. However, this interpretation provides no further insight into the weakness of assumptions when implication does not hold. To our knowledge, the only measure capable of comparing two formulae in this case is entropy, but even it fails to provide a sufficiently refined notion of weakness for GR(1) formulae, a subset of linear temporal logic formulae of particular interest in controller synthesis. In this paper we propose a more refined measure of weakness based on the Hausdorff dimension, a concept that captures the notion of size of the omega-language satisfying a linear temporal logic formula. We identify the conditions under which this measure is guaranteed to distinguish between weaker and stronger GR(1) formulae. We evaluate our proposed weakness measure in the context of computing GR(1) assumption refinements.
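As a hedged illustration of the underlying quantity (our sketch, not the paper's algorithm): for a closed omega-language generated by a deterministic transition structure over an alphabet Σ, a standard result gives its Hausdorff dimension as log ρ(M) / log |Σ|, where ρ(M) is the spectral radius of the adjacency matrix M. The paper's weakness measure refines this idea for GR(1) formulae.

```python
import numpy as np

def hausdorff_dimension(adjacency, alphabet_size):
    """Hausdorff dimension of the closed omega-language induced by a
    deterministic transition structure, via dim = log rho(M) / log |Sigma|."""
    rho = max(abs(np.linalg.eigvals(np.asarray(adjacency, dtype=float))))
    return float(np.log(rho) / np.log(alphabet_size))

# Golden-mean shift over {0, 1}: no two consecutive 1s. Two states
# (last symbol was 0 / was 1); the spectral radius is the golden ratio.
M = [[1, 1],
     [1, 0]]
print(hausdorff_dimension(M, 2))      # ~0.694
```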
Recommended from our members
Combining Induction, Deduction, and Structure for Verification and Synthesis
Even with impressive advances in formal methods, certain major challenges remain. Chief among these are environment modeling, incompleteness in specifications, and the hardness of the underlying decision problems. In this paper, we characterize two trends that show great promise in meeting these challenges. The first trend is to perform verification by reduction to synthesis. The second is to solve the resulting synthesis problem by integrating traditional, deductive methods with inductive inference (learning from examples) using hypotheses about system structure. We present a formalization of such an integration, show how it can tackle hard problems in verification and synthesis, and outline directions for future work.
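One widely used instance of such an integration is the counterexample-guided inductive synthesis (CEGIS) loop. The skeleton below is our generic sketch, with `synthesize` and `verify` as placeholder callbacks: a learner over a structured hypothesis space and a deductive checker (in practice often an SMT call; here an exhaustive check).

```python
def cegis(synthesize, verify, max_rounds=100):
    """Counterexample-guided inductive synthesis: an inductive learner
    proposes candidates consistent with the examples seen so far; a
    deductive verifier either accepts a candidate or returns a
    counterexample that refines the next round."""
    examples = []
    for _ in range(max_rounds):
        candidate = synthesize(examples)     # induction: fit the examples
        if candidate is None:
            return None                      # hypothesis space exhausted
        cex = verify(candidate)              # deduction: prove or refute
        if cex is None:
            return candidate                 # verified against the full spec
        examples.append(cex)                 # learn from the counterexample
    return None

# Toy use: learn a threshold t with (x >= t) == (x >= 7) for x in 0..9.
def synthesize(examples):
    for t in range(11):                      # structured hypothesis space
        if all((x >= t) == expected for x, expected in examples):
            return t
    return None

def verify(t):
    for x in range(10):                      # exhaustive "deductive" check
        if (x >= t) != (x >= 7):
            return (x, x >= 7)               # counterexample with expected value
    return None

print(cegis(synthesize, verify))             # 7
```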
Recommended from our members
Verifying high-confidence interactive systems: Electronic voting and beyond
Human interaction is central to many computing systems that require a high level of assurance; we term such systems high-confidence interactive systems. Examples include aircraft control systems (interacting with a pilot), automobiles with self-driving features (interacting with a driver), medical devices (interacting with a doctor), and electronic voting machines (interacting with a voter). A major challenge in verifying the correct operation of such systems is that it is difficult to formally specify the human user's view of correct operation and perception of the input/output interface. In this paper, we describe a promising approach to addressing this challenge that combines formal verification with systematic testing by humans. We describe an illustrative application of this approach to electronic voting in the U.S. and outline directions for future work.
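One concrete property in this space (our illustrative sketch, not the paper's specification): the machine's visible output should be a deterministic function of the user-visible interaction history, which can be bounded-checked by comparing an implementation state machine against a reference specification on all short input traces.

```python
from itertools import product

def traces_agree(impl_step, spec_step, impl_init, spec_init, inputs, depth):
    """Bounded check that two Mealy-style machines (state, input) -> (state, output)
    produce identical outputs on every input trace up to the given depth."""
    for trace in product(inputs, repeat=depth):
        si, ss = impl_init, spec_init
        for inp in trace:
            si, out_i = impl_step(si, inp)
            ss, out_s = spec_step(ss, inp)
            if out_i != out_s:
                return trace              # diverging trace = counterexample
    return None                           # machines agree up to this depth
```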
Incremental determinization
We present a novel approach to solving quantified Boolean formulas with one quantifier alternation (2QBF). The algorithm incrementally adds new constraints to the formula until the constraints describe a unique Skolem function, or until the absence of a Skolem function is detected. Backtracking is required if the absence of Skolem functions depends on the newly introduced constraints. We present the algorithm in analogy to search algorithms for SAT and explain how propagation, decisions, and conflicts are lifted from values to Skolem functions. The algorithm improves over the state of the art in terms of the number of solved instances, solving time, and the size of the certificates.
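For contrast with the incremental approach, here is a minimal baseline of our own (names and structure are assumptions, not the paper's algorithm): deciding forall X exists Y. phi by explicit enumeration, tabulating a Skolem function as the certificate when the formula is true. Incremental determinization avoids exactly this exponential enumeration by constraining Skolem functions directly.

```python
from itertools import product

def solve_2qbf(forall_vars, exist_vars, phi):
    """Naive check of forall X exists Y. phi -- i.e., whether a Skolem
    function Y = f(X) exists -- by enumerating every X and searching
    for some satisfying Y. Returns the tabulated Skolem function, or
    None if the formula is false."""
    skolem = {}
    for xs in product([False, True], repeat=len(forall_vars)):
        x = dict(zip(forall_vars, xs))
        for ys in product([False, True], repeat=len(exist_vars)):
            y = dict(zip(exist_vars, ys))
            if phi({**x, **y}):
                skolem[xs] = ys           # record one Skolem-function entry
                break
        else:
            return None                   # some X has no response: formula false
    return skolem                         # the certificate

# forall x exists y. (x != y): true, with Skolem function y = not x.
print(solve_2qbf(['x'], ['y'], lambda a: a['x'] != a['y']))
# {(False,): (True,), (True,): (False,)}
```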