3,386 research outputs found

    Short Attribute-Based Signatures for Arbitrary Turing Machines from Standard Assumptions

    This paper presents the first attribute-based signature (ABS) scheme supporting signing policies representable by Turing machines (TMs), based on well-studied computational assumptions. Our work supports arbitrary TMs as signing policies in the sense that the TMs can accept signing attribute strings of unbounded polynomial length, with no limit on their running time, description size, or space complexity. Moreover, we achieve input-specific running time for the signing algorithm. All other known expressive ABS schemes could at most support signing policies realizable by either arbitrary polynomial-size circuits or TMs with a pre-determined upper bound on the running time. Consequently, those schemes can only deal with signing attribute strings whose lengths are a priori bounded, and they suffer from the worst-case running time problem. On a more positive note, for the first time in the literature, the signature size of our ABS scheme depends only on the size of the signed message and is completely independent of the size of the signing policy under which the signature is generated. This is a significant achievement from the point of view of communication efficiency. Our ABS construction makes use of indistinguishability obfuscation (IO) for polynomial-size circuits and certain IO-compatible cryptographic tools. Note that all of these building blocks, including IO for polynomial-size circuits, are currently known to be realizable under well-studied computational assumptions.
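    As a rough illustration of the ABS workflow only (not the paper's iO-based construction), the following Python mock shows the intended interface: an authority issues keys for attribute strings of arbitrary length, a signer whose attributes satisfy a policy (modeled here as an arbitrary Python predicate standing in for a Turing machine) produces a signature, and verification checks it against the message and policy. All names and the HMAC-based toy scheme are illustrative assumptions; a real ABS signature would also hide the attributes.

        # Toy mock of the attribute-based signature (ABS) workflow.
        # NOT the paper's construction: a real ABS scheme hides the signer's
        # attributes and rests on iO; this sketch only shows the interface.
        import hmac, hashlib, os

        class Authority:
            def __init__(self):
                self.msk = os.urandom(32)                  # master secret key
            def keygen(self, attrs: str) -> bytes:
                # Signing key bound to an attribute string of arbitrary length.
                return hmac.new(self.msk, attrs.encode(), hashlib.sha256).digest()

        def sign(sk: bytes, attrs: str, policy, message: str):
            # The policy predicate (standing in for a Turing machine) may run
            # for input-specific time on the attribute string.
            if not policy(attrs):
                raise ValueError("attributes do not satisfy the signing policy")
            tag = hmac.new(sk, message.encode(), hashlib.sha256).digest()
            return attrs, tag      # a real ABS signature would NOT reveal attrs

        def verify(authority: Authority, policy, message: str, sig) -> bool:
            attrs, tag = sig
            sk = authority.keygen(attrs)   # mock only; real ABS verifies publicly
            expected = hmac.new(sk, message.encode(), hashlib.sha256).digest()
            return policy(attrs) and hmac.compare_digest(tag, expected)

        auth = Authority()
        policy = lambda a: a.count("1") % 2 == 0   # toy policy on unbounded strings
        sk = auth.keygen("1100")
        sig = sign(sk, "1100", policy, "release build v3")
        print(verify(auth, policy, "release build v3", sig))   # True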

    Machines, Logic and Quantum Physics

    Though the truths of logic and pure mathematics are objective and independent of any contingent facts or laws of nature, our knowledge of these truths depends entirely on our knowledge of the laws of physics. Recent progress in the quantum theory of computation has provided practical instances of this, and forces us to abandon the classical view that computation, and hence mathematical proof, are purely logical notions independent of that of computation as a physical process. Henceforward, a proof must be regarded not as an abstract object or process but as a physical process, a species of computation, whose scope and reliability depend on our knowledge of the physics of the computer concerned. (Comment: 19 pages, 8 figures)

    A Survey of Languages for Specifying Dynamics: A Knowledge Engineering Perspective

    A number of formal specification languages for knowledge-based systems have been developed. Knowledge-based systems are characterized by a complex knowledge base and an inference engine that uses this knowledge to solve a given problem. Specification languages for knowledge-based systems have to cover both aspects: they have to provide the means to specify a complex and large amount of knowledge, and they have to provide the means to specify the dynamic reasoning behavior of a knowledge-based system. We focus on the second aspect. For this purpose, we survey existing approaches for specifying dynamic behavior in related areas of research. Specifically, we examine approaches for the specification of information systems (Language for Conceptual Modeling and TROLL), approaches for the specification of database updates and logic programming (Transaction Logic and Dynamic Database Logic), and the generic specification framework of abstract state machines.

    Attribute-Based Signatures for Unbounded Languages from Standard Assumptions

    Attribute-based signature (ABS) schemes are advanced signature schemes that simultaneously provide fine-grained authentication and protect the privacy of the signer. Previously known expressive ABS schemes support either the class of deterministic finite automata and circuits from standard assumptions, or Turing machines from the existence of indistinguishability obfuscation. In this paper, we propose the first ABS scheme for a very general policy class, all deterministic Turing machines, from a standard assumption, namely, the Symmetric External Diffie-Hellman (SXDH) assumption. We also propose the first ABS scheme that allows nondeterministic finite automata (NFAs) to be used as policies. Although NFAs are less expressive than Turing machines, this is the first scheme that supports nondeterministic computations as policies. Our main idea lies in abstracting ABS constructions and introducing the concept of a history of computations; this allows a signer to prove, in zero knowledge, possession of a policy that accepts the string associated with a message while also hiding the policy, regardless of the computational model being used. With this abstraction in hand, we are able to construct ABS for Turing machines and NFAs using a surprisingly weak NIZK proof system: essentially, we only require a NIZK proof system for proving that a (normal) signature is valid. Such a NIZK proof system, together with a base signature scheme, is in turn possible from bilinear groups under the SXDH assumption, and hence so are our ABS schemes.
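    A minimal sketch of the history-of-computations idea, under the assumption that a run of an NFA is recorded as the sequence of states it visits: the signer's witness is such a history, and the verifier (in the real scheme, the relation proved by the NIZK) only needs to check each step locally against the transition table. The toy NFA and helper names are illustrative.

        # Checking an NFA computation history step by step. In the ABS scheme,
        # this local check is what the NIZK proof establishes in zero knowledge.
        NFA = {
            "states": {"q0", "q1"},
            "start": "q0",
            "accept": {"q1"},
            # transitions: (state, symbol) -> set of possible next states
            "delta": {("q0", "a"): {"q0", "q1"}, ("q1", "b"): {"q1"}},
        }

        def valid_history(nfa, word: str, history: list) -> bool:
            """True iff `history` is an accepting run of `nfa` on `word`."""
            if len(history) != len(word) + 1 or history[0] != nfa["start"]:
                return False
            for i, sym in enumerate(word):
                if history[i + 1] not in nfa["delta"].get((history[i], sym), set()):
                    return False                        # illegal transition
            return history[-1] in nfa["accept"]

        # The nondeterministic choice at the first 'a' is resolved by the witness.
        print(valid_history(NFA, "ab", ["q0", "q1", "q1"]))   # True
        print(valid_history(NFA, "ab", ["q0", "q0", "q1"]))   # False: no (q0,'b') move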

    A comprehensive meta-analysis of cryptographic security mechanisms for cloud computing

    The concept of cloud computing offers measurable computational or information resources as a service over the Internet. The major motivation behind the cloud setup is economic benefit, because it promises reduced operational and infrastructural expenditure. To turn this into reality, several impediments and hurdles must be tackled, the most profound of which are security, privacy, and reliability issues. As user data is revealed to the cloud, it leaves the protection sphere of the data owner, which brings partly new security and privacy concerns. This work focuses on these issues across various cloud services and deployment models by spotlighting their major challenges. While classical cryptography is an ancient discipline, modern cryptography, mostly developed in the last few decades, is the body of techniques that must be applied to ensure strong security and privacy mechanisms in today's real-world scenarios. The technological solutions and the short- and long-term research goals of cloud security are described and addressed using classical as well as modern cryptographic mechanisms. This work explores new directions in cloud computing security, while highlighting the correct selection of these fundamental technologies from a cryptographic point of view.

    Indistinguishability Obfuscation for Turing Machines: Constant Overhead and Amortization

    We study the asymptotic efficiency of indistinguishability obfuscation (iO) on two fronts:
    - Obfuscation size: Present constructions of iO create obfuscated programs whose size is at least a multiplicative factor of the security parameter larger than the size of the original program. In this work, we construct the first iO scheme for (bounded-input) Turing machines that achieves only a constant multiplicative overhead in size. The constant in our scheme is, in fact, 2.
    - Amortization: Suppose we want to obfuscate an arbitrary polynomial number of (bounded-input) Turing machines M_1, ..., M_n. We ask whether it is possible to obfuscate M_1, ..., M_n using a single application of an iO scheme for a circuit family where the size of any circuit is independent of n as well as the size of any Turing machine M_i. In this work, we resolve this question in the affirmative, obtaining a new bootstrapping theorem for obfuscating arbitrarily many Turing machines.
    Our results rely on the existence of sub-exponentially secure iO for circuits and re-randomizable encryption schemes. In order to obtain these results, we develop a new template for obfuscating Turing machines that is of independent interest and has recently found application in subsequent work on patchable obfuscation [Ananth et al., EUROCRYPT '17].
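    To make the overhead claim concrete, here is a back-of-the-envelope comparison. The additive term below is a placeholder, since the exact polynomial is construction-specific; only the multiplicative factors (the security parameter versus the constant 2) come from the abstract.

        # Illustrative size accounting; poly_term is a hypothetical additive
        # overhead standing in for the scheme's poly(lambda, ...) term.
        def prior_size(m_size: int, lam: int) -> int:
            return lam * m_size            # at least a factor of lambda larger

        def this_work_size(m_size: int, poly_term: int = 10_000) -> int:
            return 2 * m_size + poly_term  # constant multiplicative overhead of 2

        for m in (10_000, 1_000_000):
            print(m, prior_size(m, lam=128), this_work_size(m))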

    Functional Encryption as Mediated Obfuscation

    We introduce a new model for program obfuscation, called mediated obfuscation. A mediated obfuscation is a 3-party protocol for evaluating an obfuscated program that requires minimal interaction and limited trust. The party who originally supplies the obfuscated program need not be online when the client wants to evaluate the program. A semi-trusted third-party mediator allows the client to evaluate the program, while learning nothing about the obfuscated program or the client’s inputs and outputs. Mediated obfuscation would allow a software vendor to safely outsource the less savory aspects (like accounting of usage statistics, and remaining online to facilitate access) of “renting out” access to proprietary software. We give security definitions for this new obfuscation paradigm, and then present a simple and generic construction based on functional encryption. If a functional encryption scheme supports decryption functionality F(m, k), then our construction yields a mediated obfuscation of the class of functions {F(m, ·) | m}. In our construction, the interaction between the client and the mediator is minimal (much more efficient than a general-purpose multi-party computation protocol). Instantiating with existing FE constructions, we achieve obfuscation for point functions with output (under a strong “virtual black-box” notion of security), and a general feasibility result for obfuscating conjunctive normal form and disjunctive normal form formulae (under a weaker “semantic” notion of security). Finally, we use mediated obfuscation to illustrate a connection between worst-case and average-case static obfuscation. In short, an average-case (static) obfuscation of some component of a suitable functional encryption scheme yields a worst-case (static) obfuscation for a related class of functions. We use this connection to demonstrate new impossibility results for average-case (static) obfuscation.
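    A minimal sketch of the three-party message flow, with the functional encryption scheme replaced by an insecure in-memory stand-in for a point function: the vendor publishes a ciphertext once and goes offline, the client asks the mediator (who holds the FE master secret key) for a function key for its input, and decryption reveals F(m, k) and nothing else. Every class and method name here is an assumption for illustration, and unlike the actual protocol this mock leaks the client's input to the mediator; it captures only the data flow.

        # Toy simulation of mediated obfuscation from functional encryption.
        # The FE is an INSECURE stand-in that only captures the interface:
        # Enc(msk, m) -> ct, KeyGen(msk, k) -> sk_k, Dec(sk_k, ct) -> F(m, k).
        import os, hmac, hashlib

        def F(m: str, k: str) -> bool:
            return m == k                    # point function, as in the paper

        class MockFE:
            def __init__(self):
                self._msk = os.urandom(32)   # held by the mediator only
            def enc(self, m: str) -> bytes:
                # Stand-in "ciphertext": a keyed tag of m under the msk.
                return hmac.new(self._msk, m.encode(), hashlib.sha256).digest()
            def keygen(self, k: str) -> bytes:
                # Function key for input k (the mediator never sees m).
                return hmac.new(self._msk, k.encode(), hashlib.sha256).digest()

        def dec(sk_k: bytes, ct: bytes) -> bool:
            # Reveals only F(m, k): for the point function, tag equality.
            return hmac.compare_digest(sk_k, ct)

        fe = MockFE()                 # mediator holds the FE master secret key
        ct = fe.enc("secret-m")       # vendor obtains ct once, then goes offline
        sk = fe.keygen("secret-m")    # client requests a key for its input k
        print(dec(sk, ct))                    # True  -> F(m, k) = 1
        print(dec(fe.keygen("other"), ct))    # False -> F(m, k) = 0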

    Short interval control for the cost estimate baseline of novel high value manufacturing products – a complexity based approach

    Novel high value manufacturing products by default lack the minimum a priori data needed for forecasting cost variance over time using regression-based techniques. Forecasts which attempt this therefore suffer from significant variance, which in turn places significant strain on budgetary assumptions and financial planning. The authors argue that for novel high value manufacturing products, short interval control through continuous revision is necessary until the context of the baseline estimate stabilises sufficiently to allow longer revision intervals. Case study data from the United States Department of Defense Scheduled Annual Summary Reports (1986-2013) is used to exemplify the approach. In this respect it must be remembered that the context of a baseline cost estimate is subject to a large number of assumptions regarding plausible future scenarios, the probability of such scenarios, and various related requirements. These assumptions change over time, and the degree of their change is indicated by the extent to which cost variance follows a forecast propagation curve that has been defined in advance. The presented approach determines the stability of this context by calculating the effort required to identify a propagation pattern for cost variance using the principles of Kolmogorov complexity. Only when that effort remains stable over a sufficient period of time can the revision periods for the cost estimate baseline be changed from continuous to discrete time intervals. The practical implication of the presented approach for novel high value manufacturing products is that attention shifts from the bottom-up or parametric estimation activity to the continuous management of the context for the cost estimate itself. This in turn enables a faster and more sustainable stabilisation of the estimating context, which then creates the conditions for reducing cost estimate uncertainty in an actionable and timely manner.
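    One way to operationalize the effort measure, assuming the usual compression-based approximation of Kolmogorov complexity (all thresholds and window sizes below are illustrative assumptions, not the paper's calibration): track the compressed size of the cost-variance series over sliding windows, and treat the estimating context as stable, permitting longer revision intervals, once that size stops changing.

        # Sketch: compression as a proxy for the Kolmogorov complexity of a
        # cost-variance series, used to decide when the estimating context
        # has stabilised enough to lengthen the revision interval.
        import zlib

        def complexity(series) -> int:
            """Approximate Kolmogorov complexity by compressed size in bytes."""
            data = ",".join(f"{x:.2f}" for x in series).encode()
            return len(zlib.compress(data, 9))

        def context_is_stable(series, window=12, tol=2, runs=3) -> bool:
            """Stable if compressed size varies by <= tol bytes over `runs`
            consecutive sliding windows (all parameters illustrative)."""
            if len(series) < window + runs:
                return False
            start = len(series) - window - runs + 1
            sizes = [complexity(series[i:i + window])
                     for i in range(start, len(series) - window + 1)]
            return max(sizes) - min(sizes) <= tol

        # A series whose tail settles into a repeating pattern compresses to
        # a stable size, so the check is expected to report stability.
        variance = [0.9, 1.4, 0.7, 2.2, 1.1] + [1.0, 1.1] * 10
        print(context_is_stable(variance))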