72,115 research outputs found

    Cryptology in the Crowd

    Accidents happen: you may misplace the key to your home, or the PIN to your home security system may have been written on an ill-placed post-it note. And so they end up in the hands of a bad actor, who is then granted the power to wreak all kinds of havoc in your life: the security of your home offers no guarantees when keys are stolen and PINs are leaked. Nonetheless, your neighbour, whose key-and-PIN routines leave comparatively little to be desired, should feel safe in the knowledge that even though you cannot keep your house safe from intruders, their home remains secure. It is likewise with cryptography, whose security also relies on the secrecy of key material: intuitively, the ability to recover the secret keys of other users should not help an adversary break into an uncompromised system. Yet formalizing this intuition has turned out to be tricky, and several competing notions of security of varying strength have emerged. This raises the question: when modelling a real-world scenario with many users, some of whom may be compromised, which formalization is the right one? Or: how do we build cryptology in a crowd? Paper I embarks on the quest to answer these questions by studying how various notions of multi-user IND-CCA compare to each other, with and without the ability to adaptively compromise users. We give a partial answer by showing that, without compromise, some notions of security really are preferable to others; with compromise, however, the situation remains largely open. Paper II takes a detour to a related set of security notions in which, rather than attacking a single user, an adversary seeks to break the security of as many users as possible at once. One imagines an unusually powerful adversary, for example a state-sponsored actor, for whom brute-forcing a single system is not a problem. The goal then shifts from securing every user to making mass surveillance as difficult as possible, so that the vast majority of uncompromised users can remain secure. Paper III picks up where Paper I left off by comparing and systematizing the same IND-CCA notions together with a wider array of security notions that aim to capture the same (or similar) scenarios. These notions appear under the names Selective Opening Attacks (SOA) and Non-Committing Encryption (NCE), and are typically significantly stronger than the notions of IND-CCA studied in Paper I. With this systematization in place, we identify a number of gaps in the literature; some of these we close, while many are left as open problems.
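    As a hedged illustration of the kind of notion Paper I compares, the sketch below spells out one common flavour of a multi-user IND-CCA game with adaptive corruptions: a single challenge bit shared across users, a challenge-encryption oracle, a decryption oracle, and a corruption oracle that reveals secret keys. The class name, the single-bit formulation, and the restriction that challenged users cannot be corrupted are illustrative choices, not definitions taken from the thesis.

```python
# Illustrative sketch (not the thesis's exact definitions): a multi-user
# IND-CCA game with adaptive corruptions and one challenge bit shared by
# all users. `scheme` is any object providing keygen/encrypt/decrypt.
import secrets

class MultiUserINDCCAGame:
    def __init__(self, scheme, n_users):
        self.scheme = scheme
        self.b = secrets.randbits(1)                            # hidden challenge bit
        self.keys = [scheme.keygen() for _ in range(n_users)]   # (pk, sk) per user
        self.challenged = set()                                 # users with a live challenge
        self.challenge_cts = set()                              # (user, ciphertext) pairs
        self.corrupted = set()                                  # users whose sk leaked

    def challenge(self, i, m0, m1):
        """Encrypt m_b under user i's public key; disallowed for corrupted users."""
        assert len(m0) == len(m1) and i not in self.corrupted
        pk, _ = self.keys[i]
        ct = self.scheme.encrypt(pk, m1 if self.b else m0)
        self.challenged.add(i)
        self.challenge_cts.add((i, ct))
        return ct

    def decrypt(self, i, ct):
        """Decryption oracle; refuses to open challenge ciphertexts."""
        if (i, ct) in self.challenge_cts:
            return None
        _, sk = self.keys[i]
        return self.scheme.decrypt(sk, ct)

    def corrupt(self, i):
        """Adaptive compromise: hand out user i's secret key (never for a challenged user)."""
        assert i not in self.challenged
        self.corrupted.add(i)
        return self.keys[i][1]

    def finalize(self, guess):
        """The adversary wins by guessing the challenge bit."""
        return guess == self.b
```

    Other flavours differ, for example, in whether each user gets an independent challenge bit or whether only a single challenge query is allowed; these are exactly the kinds of variations whose relative strength the papers compare.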

    Fiat-Shamir for highly sound protocols is instantiable

    The Fiat–Shamir (FS) transformation (Fiat and Shamir, Crypto '86) is a popular paradigm for constructing very efficient non-interactive zero-knowledge (NIZK) arguments and signature schemes from a hash function and any three-move interactive protocol satisfying certain properties. Despite its widespread applicability both in theory and in practice, the known positive results for proving security of the FS paradigm are in the random oracle model only, i.e., they assume that the hash function is modeled as an external random function accessible to all parties. On the other hand, a sequence of negative results shows that for certain classes of interactive protocols, the FS transform cannot be instantiated in the standard model. We initiate the study of complementary positive results, namely, studying classes of interactive protocols where the FS transform does have standard-model instantiations. In particular, we show that for a class of “highly sound” protocols that we define, instantiating the FS transform via a q-wise independent hash function yields NIZK arguments and secure signature schemes. In the case of NIZK, we obtain a weaker “q-bounded” zero-knowledge flavor where the simulator works for all adversaries asking an a priori bounded number of queries q; in the case of signatures, we obtain the weaker notion of random-message unforgeability against q-bounded random-message attacks. Our main idea is that when the protocol is highly sound, one can use complexity leveraging instead of random-oracle programming. The question is whether such highly sound protocols exist and, if so, which protocols lie in this class. We answer this question in the affirmative in the common reference string (CRS) model and under strong assumptions. Namely, assuming indistinguishability obfuscation and puncturable pseudorandom functions, we construct a compiler that transforms any 3-move interactive protocol with instance-independent commitments and simulators (a property satisfied by the Lapidot–Shamir protocol, Crypto '90) into a compiled protocol in the CRS model that is highly sound. We also present a second compiler, which lets us start from a larger class of protocols, namely those that only require instance-independent commitments (a property satisfied, for example, by the classical protocol for quadratic residuosity due to Blum, Crypto '81); for this second compiler we require dual-mode commitments. We hope that our work inspires more research on classes of (efficient) 3-move protocols where Fiat–Shamir is (efficiently) instantiable.
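    To make the transform concrete, here is a hedged toy sketch of the plain Fiat–Shamir paradigm applied to Schnorr's three-move identification protocol, with SHA-256 standing in for the hash function. It illustrates only the generic transform, not the paper's construction (which uses q-wise independent hashing and the compiled "highly sound" protocols described above); the tiny group parameters are for illustration, not security.

```python
# Toy Fiat-Shamir: turn Schnorr's three-move protocol into a signature by
# replacing the verifier's challenge with a hash of the transcript prefix.
import hashlib
import secrets

# Toy Schnorr group: g generates a subgroup of prime order q in Z_p^*.
# Far too small for real use; illustration only.
p, q, g = 607, 101, 64

def H(*parts):
    """Fiat-Shamir challenge: hash the transcript prefix into Z_q."""
    data = b"|".join(str(x).encode() for x in parts)
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

def keygen():
    x = secrets.randbelow(q)           # secret key
    return x, pow(g, x, p)             # (sk, pk = g^x)

def sign(x, msg):
    r = secrets.randbelow(q)           # prover's first move (commitment)
    a = pow(g, r, p)
    c = H(a, pow(g, x, p), msg)        # hash replaces the verifier's challenge
    z = (r + c * x) % q                # prover's response
    return a, z

def verify(y, msg, sig):
    a, z = sig
    c = H(a, y, msg)
    return pow(g, z, p) == (a * pow(y, c, p)) % p

sk, pk = keygen()
assert verify(pk, "hello", sign(sk, "hello"))
```

    The negative results mentioned in the abstract concern exactly this step of replacing the interactive challenge by a concrete hash function; the paper identifies protocol classes for which that replacement can be justified in the standard model.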

    Political strategies of external support for democratization

    Political strategies of external support for democratization are contrasted and critically examined in respect of the United States and the European Union. The analysis begins by defining its terms of reference and addresses the question of what it means to have a strategy. The account briefly notes the goals lying behind democratization support and their relationship with the wider foreign policy process, before considering what a successful strategy would look like and how that relates to the selection of candidates. The literature's attempts to identify strategy and its recommendations for better strategies are compared and assessed. Overall, the article argues that the question of political strategies of external support for democratization raises several distinct but related issues, including the who, what, why, and how. On one level, strategic choices can be expected to echo the comparative advantage of the "supporter." On a different level, the strategies cannot be divorced from the larger foreign policy framework. While it is correct to say that any sound strategy for support should be grounded in a theoretical understanding of democratization, the literature on strategies reveals something even more fundamental: divergent views about the nature of politics itself. The recommendations there certainly pinpoint weaknesses in the actual strategies of the United States and Europe, but they have their own limitations too. In particular, in a world of increasing multi-level governance, strategies for supporting democratization should go beyond a preoccupation with just an "outside-in" approach.

    Greening the WTO's Disputes Settlement Understanding: Opportunities and Risks

    It is reasonable to ask whether the WTO’s rules may hamper the ability of national and sub-national governments to be genuine pacesetters in environmental law making. Environmentalists consider that the WTO’s dispute panels may encourage governments to converge to the relevant international standard for a particular risk regulation because such uniformity is likely to reduce the incidence of trade disputes. Proposals that go beyond environmental advocacy and greater transparency in the WTO’s dispute settlement process, such as weakening the sound-science requirement and incorporating stronger forms of the precautionary principle into WTO agreements on biosecurity laws, reduce due-process safeguards against disguised regulatory protectionism in New Zealand’s agricultural export markets. Keywords: World Trade Organization, trade disputes, environment, conservation, New Zealand

    Short Group Signatures via Structure-Preserving Signatures: Standard Model Security from Simple Assumptions

    Group signatures are a central cryptographic primitive which allows users to sign messages while hiding their identity within a crowd of group members. In the standard model (without the random oracle idealization), the most efficient constructions rely on the Groth-Sahai proof systems (Eurocrypt'08). The structure-preserving signatures of Abe et al. (Asiacrypt'12) make it possible to design group signatures based on well-established, constant-size number-theoretic assumptions (a.k.a. “simple assumptions”) like the Symmetric eXternal Diffie-Hellman or Decision Linear assumptions. While much more efficient than group signatures built on general assumptions, these constructions incur a significant overhead w.r.t. constructions secure in the idealized random oracle model. Indeed, the best known solution based on simple assumptions requires 2.8 kB per signature for currently recommended parameters. Reducing this size and presenting techniques for shorter signatures are thus natural questions. In this paper, our first contribution is to significantly reduce this overhead. Namely, we obtain the first fully anonymous group signatures based on simple assumptions with signatures shorter than 2 kB at the 128-bit security level. In dynamic (resp. static) groups, our signature length drops to 1.8 kB (resp. 1 kB). This improvement is enabled by two technical tools. As a result of independent interest, we first construct a new structure-preserving signature based on simple assumptions which shortens the best previous scheme by 25%. Our second tool is a new method for attaining anonymity in the strongest sense using a new CCA2-secure encryption scheme which is simultaneously a Groth-Sahai commitment.
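    As a hedged sketch of the object being built, the interface below spells out the usual syntax of a dynamic group signature scheme: signatures verify against a single group public key and hide which member signed (anonymity), while a separate opening key lets a designated authority trace a signature back to its signer (traceability). The method names follow the standard textbook interface and are not the paper's notation; no cryptography is implemented here.

```python
# Interface sketch only: the standard syntax of a dynamic group signature
# scheme. Names are generic textbook choices, not taken from the paper.
from typing import Any, Protocol, Tuple

class GroupSignatureScheme(Protocol):
    def gkeygen(self) -> Tuple[Any, Any, Any]:
        """Set up the group: return (group public key, issuer key, opening key)."""
        ...

    def join(self, gpk: Any, issuer_key: Any, user_id: int) -> Any:
        """Enrol a new member and hand them their personal signing key."""
        ...

    def sign(self, gpk: Any, member_key: Any, message: bytes) -> Any:
        """Sign on behalf of the group without revealing which member signed."""
        ...

    def verify(self, gpk: Any, message: bytes, signature: Any) -> bool:
        """Anyone can verify using only the group public key."""
        ...

    def open(self, gpk: Any, opening_key: Any, message: bytes, signature: Any) -> int:
        """The opening authority traces a valid signature back to its signer."""
        ...
```

    The 1.8 kB and 1 kB figures in the abstract refer to the size of the object returned by sign, in the dynamic and static settings respectively.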

    Aid and Governance in Vulnerable States: Bangladesh and Pakistan since 1971

    Bangladesh and Pakistan had very different experiences with aid after 1971. Politics in Pakistan was less inclusive in terms of opportunities for intermediate (middle- and lower-middle-) class political entrepreneurs, and the dominance of military aid to Pakistan exacerbated the problem by allowing the top leadership to continue to rule without sharing much power with these classes. This not only had negative effects on the evolution of Pakistan’s politics but also slowed down the growth of a broad-based manufacturing sector. In contrast, in Bangladesh the less centralized organization of political power and less concentrated forms of aid allowed intermediate-class political entrepreneurs to improve their access to resources and created opportunities for many of them to enter productive manufacturing activities such as the garments industry. Differences in patterns of aid can help to explain significant differences in economic and political outcomes in the two countries. These experiences challenge conventional ideas about the relationship among aid, good governance, and security. Designing aid policies so that aid can assist developing countries in improving their economic and political viability requires a better understanding of the complex relationships between aid and the political economies of recipient countries.

    Constructions of Cryptographic Primitives with High Security and Advanced Functionality via Generic Constructions of Cryptographic Primitives

    Recent years have witnessed active research on cryptographic primitives with complex functionality beyond simple encryption or authentication. A cryptographic primitive is required to be proposed together with a formal model of its usage and a rigorous proof of security under that model. This approach has suffered from two drawbacks: (1) security models are defined in a very specific manner for each primitive, which leaves the relationships between these security models unclear, and (2) there is no comprehensive way to confirm that a formal security model really captures every scenario that can arise in practice. This research relaxes these two drawbacks with the following approach: (1) Observing that a cryptographic primitive A is often crucial for constructing another primitive B, we identify an easy-to-understand approach for constructing various cryptographic primitives. (2) Consider a situation in which there are closely related cryptographic primitives A and B, and the primitive A has no known security requirement that corresponds to some well-known security requirement (b) for the latter primitive B. We argue that this situation suggests that the missing security requirement for A can capture some practical attack. This enables us to detect unknown threats, missed by the current security models, for various cryptographic primitives. Following this approach, we identify an overlooked security threat for the cryptographic primitive called group signatures. Furthermore, we apply methodology (2) to “revocable” group signatures and obtain a new extension of public-key encryption that allows restricting which plaintexts can be securely encrypted.

    Selective Reductions in Labour Taxation: Labour Market Adjustments and Macroeconomic Performance

    Significant differences in unemployment incidence in Europe have been observed across skill groups, with the least skilled suffering the highest and most persistent unemployment rates. To identify policies alleviating this problem, we study the impact of reductions in employer social security contributions. We construct a general equilibrium model with three types of heterogeneous workers and firms, matching frictions, wage bargaining and a rigid minimum wage. We find evidence in favour of narrow tax cuts targeted at the minimum wage, but we argue that it is most important to account for the effects of such reductions on both job creation and job destruction. The failure to do so may explain the gap between macro- and microeconometric evaluations of such policies in France and Belgium. Policy impacts on welfare and on the inefficiencies induced by job competition, ladder effects and on-the-job search are discussed. Keywords: Skill Bias, Minimum Wage, Job Creation, Job Destruction, Job Competition, Search Unemployment, Taxation
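    The abstract does not reproduce the model's equations, but the mechanism behind a targeted cut can be illustrated with a generic, textbook-style formulation (not the paper's exact model) of the employer's labour cost for a worker of skill type s:

```latex
% Generic illustration, not the paper's exact model: labour cost equals the
% gross wage plus employer social security contributions at rate \tau_s,
% subject to a binding wage floor for low-skill jobs.
c_s = (1 + \tau_s)\, w_s, \qquad w_s \ge w_{\min}.
```

    A reduction in the contribution rate targeted at jobs paid at the minimum wage lowers the labour cost of those jobs without changing the worker's wage; the abstract's point is that the same change also alters which existing matches remain profitable, so job destruction has to be modelled alongside job creation.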