
    A New Upperbound for the Oblivious Transfer Capacity of Discrete Memoryless Channels

    We derive a new upper bound on the string oblivious transfer capacity of discrete memoryless channels. The main tool we use is the tension region of a pair of random variables introduced in Prabhakaran and Prabhakaran (2014), where it was used to derive upper bounds on rates of secure sampling in the source model. In this paper, we consider secure computation of string oblivious transfer in the channel model. Our bound is based on a monotonicity property of the tension region in the channel model. We show that our bound strictly improves upon the upper bound of Ahlswede and Csiszár (2013).
    Comment: 7 pages, 3 figures, extended version of submission to IEEE Information Theory Workshop, 201
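    The abstract does not spell out the rate definition it refers to; as a point of reference, the following is a minimal sketch of the standard string oblivious transfer capacity setup (notation assumed here, not taken from the paper). A sender holds independent uniform strings $K_0, K_1 \in \{0,1\}^k$, a receiver holds a choice bit $B$, and they interact over $n$ uses of the discrete memoryless channel together with noiseless public discussion. A rate $R$ is achievable if there are protocols with $k/n \to R$ whose error and leakage vanish:
    \[
    \Pr[\hat{K} \neq K_B] \to 0, \qquad I(K_{\bar{B}}; V_{\mathrm{R}}) \to 0, \qquad I(B; V_{\mathrm{S}}) \to 0,
    \]
    where $V_{\mathrm{R}}$ and $V_{\mathrm{S}}$ denote the receiver's and sender's views and $\hat{K}$ is the receiver's estimate of $K_B$. The oblivious transfer capacity is the supremum of achievable rates, and the paper's result is a new upper bound on this supremum.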

    An Elementary Completeness Proof for Secure Two-Party Computation Primitives

    In the secure two-party computation problem, two parties wish to compute a (possibly randomized) function of their inputs via an interactive protocol, while ensuring that neither party learns more than what can be inferred from its own input and output alone. For semi-honest parties and information-theoretic security guarantees, it is well known that if only noiseless communication is available, only a limited set of functions can be securely computed; however, if interaction is also allowed over general communication primitives (multi-input/output channels), there are "complete" primitives that enable any function to be securely computed. The general set of complete primitives was characterized recently by Maji, Prabhakaran, and Rosulek, leveraging an earlier specialized characterization by Kilian. Our contribution in this paper is a simple, self-contained, alternative derivation using elementary information-theoretic tools.
    Comment: 6 pages, extended version of ITW 2014 paper
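    For reference, one standard way to phrase the semi-honest, information-theoretic security requirement mentioned above (stated here for a deterministic function $f$, with notation assumed rather than taken from the paper) is that the protocol transcript $M$ carries no information beyond a party's own input and output:
    \[
    I(M; Y \mid X, f(X,Y)) = 0 \qquad \text{and} \qquad I(M; X \mid Y, f(X,Y)) = 0,
    \]
    where $X$ and $Y$ are the two parties' inputs. A communication primitive is then "complete" if, given access to it, every function admits a protocol satisfying such conditions.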

    On the Communication Complexity of Secure Computation

    Information-theoretically secure multi-party computation (MPC) is a central primitive of modern cryptography. However, relatively little is known about the communication complexity of this primitive. In this work, we develop powerful information-theoretic tools to prove lower bounds on the communication complexity of MPC. We restrict ourselves to a 3-party setting in order to bring out the power of these tools without introducing too many complications. Our techniques include the use of a data processing inequality for residual information (i.e., the gap between mutual information and Gács-Körner common information), a new information inequality for 3-party protocols, and the idea of distribution switching, by which lower bounds computed under certain worst-case scenarios can be shown to apply to the general case. Using these techniques, we obtain tight bounds on the communication complexity of MPC protocols for various interesting functions. In particular, we show concrete functions that have "communication-ideal" protocols, which achieve the minimum communication simultaneously on all links in the network. We also obtain the first explicit example of a function that incurs a higher communication cost than the input length in the secure computation model of Feige, Kilian and Naor (1994), who had shown that such functions exist. Finally, we show that our communication bounds imply tight lower bounds on the amount of randomness required by MPC protocols for many interesting functions.
    Comment: 37 pages
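    To make the main quantity concrete: residual information, as described in the abstract, is the gap between mutual information and Gács-Körner common information, which (in notation assumed here) can be written as
    \[
    RI(X;Y) = I(X;Y) - C_{\mathrm{GK}}(X;Y), \qquad
    C_{\mathrm{GK}}(X;Y) = \max_{Q = g_1(X) = g_2(Y)\ \mathrm{a.s.}} H(Q),
    \]
    where the maximum is over random variables $Q$ that are simultaneously a deterministic function of $X$ and of $Y$ (the Gács-Körner common part). Residual information thus measures the correlation that remains after extracting the common part, and the data processing inequality referred to above is for this quantity.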