
    Making Existential-Unforgeable Signatures Strongly Unforgeable in the Quantum Random-Oracle Model

    Strongly unforgeable signature schemes provide a more stringent security guarantee than standard existential unforgeability: not only must it be hard to forge a signature on a new message, it must also be infeasible to produce a new signature on a message for which the adversary has already seen valid signatures. Strongly unforgeable signatures are useful both in practice and as a building block in many cryptographic constructions. This work investigates a generic transformation, proposed by Teranishi et al. and proven secure in the classical random-oracle model, that compiles any existential-unforgeable scheme into a strongly unforgeable one. Our main contribution is showing that the transformation also works against quantum adversaries in the quantum random-oracle model. We develop proof techniques, such as adaptively programming a quantum random oracle in a new setting, which could be of independent interest. Applying the transformation to an existential-unforgeable signature scheme due to Cash et al., which can be shown to be quantum-secure assuming certain lattice problems are hard for quantum computers, we obtain an efficient quantum-secure strongly unforgeable signature scheme in the quantum random-oracle model. Comment: 15 pages, to appear in Proceedings TQC 201
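    To make the distinction concrete, the sketch below contrasts the win conditions of the two security games; it is a minimal illustration, not code from the paper, and the verify function and the signing-oracle transcript are hypothetical placeholders. Existential unforgeability only requires the forged message to be fresh, while strong unforgeability requires the whole message-signature pair to be fresh.

    # Minimal sketch of the two unforgeability win conditions; `verify`,
    # `pk`, and `queries` (the signing-oracle transcript) are hypothetical.
    def euf_cma_win(verify, pk, queries, m_star, sig_star):
        # Existential unforgeability: the *message* itself must be new.
        queried_messages = {m for (m, _) in queries}
        return verify(pk, m_star, sig_star) and m_star not in queried_messages

    def suf_cma_win(verify, pk, queries, m_star, sig_star):
        # Strong unforgeability: the (message, signature) *pair* must be new,
        # even if the message itself was signed before.
        return verify(pk, m_star, sig_star) and (m_star, sig_star) not in queries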

    Quantum-based security in optical fibre networks

    Electronic communication is used every day for a number of different applications. Some of the information transferred during these communications can be private, requiring encryption and authentication protocols to keep it secure. Although today's protocols provide some security, they are not necessarily unconditionally secure. Quantum-based protocols, on the other hand, can provide unconditionally secure encryption and authentication. Prior to this Thesis, only one experimental realisation of quantum digital signatures had been demonstrated. It used a lossy photonic device along with a quantum memory, allowing two parties to test whether they were sent the same signature by a single sender and to store the quantum states for later measurement. This restricted the demonstration to distances of only a few metres, and it was tested with a primitive approximation of a quantum memory rather than an actual one. This Thesis presents an experimental realisation of a quantum digital signature protocol which removes the reliance on quantum memory at the receivers, a major step towards practicality. Removing the quantum memory also made it possible to perform the swap and comparison mechanism more efficiently, resulting in an experimental realisation of quantum digital signatures over 2 kilometres of optical fibre. Quantum communication protocols can be unconditionally secure; however, the transmission distance is limited by loss in quantum channels. In conventional channels this loss is overcome with an optical amplifier, but the added noise from such amplifiers would swamp the quantum signal if they were used directly in quantum communications. This Thesis therefore investigated probabilistic quantum amplification, with an experimental realisation of the state comparison amplifier, based on linear optical components and single-photon detectors. The state comparison amplifier used the well-established techniques of optical coherent state comparison and weak subtraction to post-select the output, providing non-deterministic amplification with increased fidelity at a high repetition rate. Its success rates were found to be orders of magnitude greater than those of other state-of-the-art quantum amplifiers, owing to its lack of requirement for complex quantum resources such as single or entangled photon sources and photon-number-resolving detectors.

    Proving knowledge of isogenies – A survey

    Isogeny-based cryptography is an active area of research in post-quantum public-key cryptography. Proving knowledge of an isogeny is a natural problem with several applications in isogeny-based cryptography, such as allowing users to demonstrate that they are behaving honestly in a protocol; it is also closely related to isogeny-based digital signatures. Over the last few years there have been a number of advances in this area, but many problems remain open. This paper gives an overview of the topic and highlights open problems and directions for future research.
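    The proofs surveyed here largely follow the classic sigma-protocol shape of commit, challenge, and respond. As a runnable stand-in that deliberately avoids isogeny machinery, the sketch below shows the same three-move shape for Schnorr's proof of knowledge of a discrete logarithm; the parameters are toy-sized and purely illustrative, not an isogeny-based construction.

    # Sigma-protocol shape (commit, challenge, respond), illustrated with
    # Schnorr's discrete-log proof of knowledge; toy parameters, not secure.
    import secrets

    p = 2**127 - 1                 # toy Mersenne prime modulus
    g = 3                          # assumed group element
    x = secrets.randbelow(p - 1)   # prover's secret (the "isogeny" analogue)
    X = pow(g, x, p)               # public statement

    r = secrets.randbelow(p - 1)   # Commit: prover picks random r...
    R = pow(g, r, p)               # ...and sends the commitment R.
    c = secrets.randbelow(p - 1)   # Challenge: verifier sends random c.
    s = (r + c * x) % (p - 1)      # Respond: s reveals nothing about x alone.
    assert pow(g, s, p) == (R * pow(X, c, p)) % p   # Verify: g^s == R * X^c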

    Subwavelength Engineering of Silicon Photonic Waveguides

    The dissertation demonstrates subwavelength engineering of silicon photonic waveguides through two different structures or avenues: (i) a novel ultra-low mode area v-groove waveguide to enhance light-matter interaction; and (ii) a nanoscale sidewall crystalline grating that serves as a physical unclonable function to achieve hardware and information security. With the advancement of modern technology and the modern global supply chain, silicon photonics is set to lead the global semiconductor foundries, thanks to silicon's abundance in nature and a mature, well-established industry. Since the silicon waveguide is the heart of silicon photonics, it can be considered the core building block of modern integrated photonic systems. Subwavelength structuring of silicon waveguides shows immense promise in a variety of fields of study, such as tailoring electromagnetic near fields, enhancing light-matter interactions, engineering anisotropy and effective-medium effects, modal and dispersion engineering, and nanoscale sensitivity. In this work, we push the boundaries of modern silicon photonics through subwavelength engineering, using a novel ultra-low mode area v-groove waveguide to address long-standing challenges such as fabricating so sophisticated a structure while ensuring efficient coupling of light between dissimilar modes. Moreover, the physical unclonable function derived from our nanoscale sidewall crystalline gratings should provide a fast and reliable optical security solution with improved information density. This research should enable new avenues for subwavelength-engineered silicon photonic waveguides and answer many unsolved questions of silicon photonics foundries.
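    As a hedged illustration of how such a physical unclonable function might be used (the names and the noise threshold below are assumptions, not the dissertation's protocol), authentication typically compares a freshly measured optical response against an enrolled one, tolerating a small amount of measurement noise:

    # Toy PUF authentication check: accept iff a fresh response is within a
    # Hamming-distance threshold of the enrolled one (names are illustrative).
    def hamming(a: bytes, b: bytes) -> int:
        return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

    def authenticate(enrolled: bytes, response: bytes, threshold: int = 8) -> bool:
        # An unclonable device reproduces its response only approximately,
        # so exact equality is replaced by a closeness test.
        return len(enrolled) == len(response) and hamming(enrolled, response) <= threshold

    Real deployments usually replace the raw threshold with a fuzzy extractor or error-correcting code, but the accept/reject logic has this general shape.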

    Snapshot hyperspectral imaging: near-infrared image replicating imaging spectrometer and achromatisation of Wollaston prisms

    Conventional hyperspectral imaging (HSI) techniques are time-sequential and rely on temporal scanning to capture hyperspectral images. This temporal constraint can limit the application of HSI to static scenes and platforms, where transient and dynamic events are not expected during data capture. The Near-Infrared Image Replicating Imaging Spectrometer (N-IRIS) sensor described in this thesis enables snapshot HSI in the short-wave infrared (SWIR) without the requirement for scanning, and operates without rejection of polarised light. It operates in eight wavebands from 1.1μm to 1.7μm with a 2.0° diagonal field-of-view. N-IRIS produces spectral images directly, without the need for prior tomographic or image reconstruction. Additional benefits include compactness, robustness, static operation, lower processing overheads, and higher signal-to-noise ratio and optical throughput than other snapshot HSI sensors generally. This thesis covers the IRIS design process from theoretical concepts to quantitative modelling, culminating in the N-IRIS prototype designed for SWIR imaging. This effort formed the logical next step in advancing beyond peer efforts, which focussed upon visible wavelengths. After acceptance testing to verify optical parameters, empirical laboratory trials were carried out. This testing focussed on discriminating between common materials within a controlled environment as a proof of concept. Significance tests provided an initial assessment of N-IRIS's capability to distinguish materials relative to a conventional SWIR broadband sensor. Motivated by the design and assembly of a cost-effective visible IRIS, an innovative solution was developed for the problem of chromatic variation in the splitting angle (CVSA) of Wollaston prisms, which introduces spectral blurring of images. Analytical theory is presented and illustrated with an example N-IRIS application in which a sixfold reduction in dispersion is achieved for wavelengths between 400nm and 1.7μm, although the principle is applicable from ultraviolet to thermal-IR wavelengths. Experimental proof of concept is demonstrated, and the spectral smearing of an achromatised N-IRIS is shown to be reduced by an order of magnitude. These achromatised prisms can benefit areas beyond hyperspectral imaging, such as microscopy, laser pulse control and spectrometry.
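    The origin of CVSA can be seen from the standard small-angle approximation for a Wollaston prism, in which the full splitting angle is roughly proportional to the birefringence; since birefringence varies with wavelength, so does the split. The sketch below is an illustrative calculation under that textbook approximation, with made-up quartz-like values, not the thesis's model or data:

    import math

    def splitting_angle_deg(birefringence: float, wedge_angle_deg: float) -> float:
        # Textbook small-angle approximation for a Wollaston prism:
        #   alpha ~= 2 * (n_e - n_o) * tan(theta)
        # A wavelength-dependent birefringence therefore smears alpha (CVSA).
        return math.degrees(2 * birefringence * math.tan(math.radians(wedge_angle_deg)))

    # Illustrative birefringence values at two wavelengths yield slightly
    # different splitting angles, i.e. spectral blurring of the split images.
    print(splitting_angle_deg(0.0091, 15.0))  # shorter wavelength
    print(splitting_angle_deg(0.0084, 15.0))  # longer wavelength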

    Personality Disruption as Mental Torture: The CIA, Interrogational Abuse, and the U.S. Torture Act

    This Article is a contribution to the torture debate. It argues that the abusive interrogation tactics used by the United States in what was then called the “global war on terrorism” are, unequivocally, torture under U.S. law. To some readers, this might sound like déjà vu all over again. Hasn’t this issue been picked over for nearly fifteen years? It has, but we think the legal analysis we offer has been mostly overlooked. We argue that the basic character of the CIA’s interrogation of so-called “high-value detainees” has been misunderstood: both lawyers and commentators have placed far too much emphasis on the dozen or so “enhanced interrogation techniques” (EITs) short-listed in government “torture memos,” and far too little emphasis on other forms of physical violence, psychological stressors, environmental manipulations, and abusive conditions of confinement that are crucial to the question of whether the detainees were tortured. Furthermore, we dispute one of the standard narratives about the origins of the program: that it was the brainchild of civilian contractor psychologists because—in the CIA’s words—“[n]on-standard interrogation methodologies were not an area of expertise of CIA officers or of the US Government generally.” This narrative ignores the CIA’s role in devising these methods, in spite of the decades of prior CIA research and doctrine about forcing interrogation subjects into a state of extreme psychological debilitation, and about how to do so—by making them physically weak, intensely fearful and anxious, and helplessly dependent. By neglecting this history and focusing on the contractors and the EITs they devised, this narrative contributes to the misunderstanding that the torture debate is about EITs and nothing else. In effect, a “torture debate” about EITs and the torture memos neglects the purloined letter in front of our eyes: the abusive conditions the CIA inflicted on prisoners even when they were not subject to EITs, including abuses that the torture memos never bothered to discuss. Unpacking what this debate is really about turns out to be crucial to understanding that such interrogation methods are torture under existing U.S. law. The U.S. Torture Act includes a clause in its definition of mental torture that was intended to ban exactly the kind of interrogation methods the CIA had researched, out of concern that our Cold War adversaries were using them: mind-altering procedures “calculated to disrupt profoundly the senses or the personality.” That is precisely the “non-standard interrogation methodology” the CIA employed after 9/11.

    Incremental Program Obfuscation

    Recent advances in program obfuscation suggest that it is possible to create software that can provably safeguard secret information. However, software systems usually contain large executable code that is updated multiple times and sometimes very frequently. Freshly obfuscating the program for every small update would lead to a considerable efficiency loss. Thus, an extremely desirable property for obfuscation algorithms is incrementality: small changes to the underlying program translate into small changes to the corresponding obfuscated program. We initiate a thorough investigation of incremental program obfuscation. We show that the strong simulation-based notions of program obfuscation, such as "virtual black-box" and "virtual grey-box" obfuscation, cannot be incremental (according to our efficiency requirements) even for very simple functions such as point functions. We then turn to the indistinguishability-based notions, and present two security definitions of varying strength: a weak one and a strong one. To understand the overall strength of our definitions, we formulate the notion of incremental best-possible obfuscation and show that it is equivalent to our strong indistinguishability-based notion. Finally, we present constructions for incremental program obfuscation satisfying both our security notions. We first give a construction achieving the weaker security notion based on the existence of general-purpose indistinguishability obfuscation. Next, we present a generic transformation using oblivious RAM to amplify security from weaker to stronger, while maintaining the incrementality property.
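    To see what incrementality buys, the sketch below splits a program into fixed-size blocks and re-processes only the block that changed, so an update costs O(1) blocks rather than O(|program|). It is an assumption-laden toy, not the paper's construction; the per-block hash merely stands in for per-block obfuscation, which in the paper builds on indistinguishability obfuscation.

    # Toy illustration of incrementality only; all names are illustrative.
    import hashlib

    BLOCK = 32  # bytes per block

    def process_block(block: bytes, key: bytes, index: int) -> bytes:
        # Stand-in for per-block obfuscation (a real scheme would use iO).
        return hashlib.sha256(key + index.to_bytes(4, "big") + block).digest()

    def obfuscate(program: bytes, key: bytes) -> list[bytes]:
        blocks = [program[i:i + BLOCK] for i in range(0, len(program), BLOCK)]
        return [process_block(b, key, i) for i, b in enumerate(blocks)]

    def update(obf: list[bytes], key: bytes, index: int, new_block: bytes) -> None:
        # Incremental update: only the edited block is re-processed.
        obf[index] = process_block(new_block, key, index)

    Note that updating a single block in place reveals which part of the program changed; hiding that access pattern is what an oblivious-RAM layer provides when amplifying the weaker notion to the stronger one.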

    Mitigating Insider Sabotage and Espionage: A Review of the United States Air Force's Current Posture

    The security threat from malicious insiders affects all organizations. Mitigating this problem is difficult because (1) there is no definitive profile for malicious insiders, (2) organizations have placed trust in these individuals, and (3) insiders have a vast knowledge of their organization’s personnel, security policies, and information systems. The purpose of this research is to analyze the extent to which United States Air Force (USAF) security policies address the insider threat problem. The policies are reviewed in terms of how well they align with best practices published by the Carnegie Mellon University Computer Emergency Readiness Team, as well as with additional factors this research deems important, including motivations, organizational priorities, and social networks. Based on the findings of the policy review, this research offers actionable recommendations that the USAF could implement to better prevent, detect, and respond to malicious insider attacks. The most important course of action is to better utilize the existing workforce: all personnel should be trained on observable behaviors that can be precursors to malicious activity, and supervisors need to be empowered as the first line of defense, monitoring for stress, unmet expectations, and disgruntlement. This research also proposes three new best practices regarding (1) screening for prior concerning behaviors, predispositions, and technical incidents, (2) issuing sanctions for inappropriate technical acts, and (3) requiring supervisors to take a proactive role.

    Binary pattern tile set synthesis is NP-hard

    In the field of algorithmic self-assembly, a long-standing unproven conjecture has been the NP-hardness of binary pattern tile set synthesis (2-PATS). The k-PATS problem is that of designing a tile assembly system with the smallest number of tile types which will self-assemble an input pattern of k colors. Of both theoretical and practical significance, k-PATS has been studied in a series of papers which have shown k-PATS to be NP-hard for k = 60, k = 29, and then k = 11. In this paper, we settle the fundamental conjecture that 2-PATS is NP-hard, concluding this line of study. While most of our proof relies on standard mathematical proof techniques, one crucial lemma makes use of a computer-assisted proof, a relatively novel but increasingly utilized paradigm for deriving proofs of complex mathematical problems. This tool is especially powerful for attacking combinatorial problems, as exemplified by the proof of the four color theorem by Appel and Haken (simplified later by Robertson, Sanders, Seymour, and Thomas) or the recent important advance on the Erdős discrepancy problem by Konev and Lisitsa using computer programs. We utilize a massively parallel algorithm, turning an otherwise intractable portion of our proof into a program that requires approximately a year of computation time and bringing the use of computer-assisted proofs to a new scale. We fully detail the algorithm employed by our code, and make the code freely available online.
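    For readers unfamiliar with the model, the sketch below illustrates the verification side of PATS under common assumptions (a rectilinear model growing from an L-shaped seed; the data layout is ours, not the paper's): a candidate tile set solves an instance only if assembly is deterministic and reproduces the target colors, and minimizing the number of tile types subject to this check is what the paper proves NP-hard for two colors.

    # Hedged sketch: check whether a tile set deterministically assembles a
    # target color pattern in a rectilinear model with an L-shaped seed.
    def assembles(pattern, tiles, seed_west, seed_south):
        # pattern: 2-D list of colors (pattern[y][x]); tiles: list of
        # (west, south, east, north, color) tuples; seed_west[y] and
        # seed_south[x] are the glues presented by the L-shaped seed.
        h, w = len(pattern), len(pattern[0])
        east = [[None] * w for _ in range(h)]    # east glue of each placed tile
        north = [[None] * w for _ in range(h)]   # north glue of each placed tile
        for y in range(h):
            for x in range(w):
                wg = seed_west[y] if x == 0 else east[y][x - 1]
                sg = seed_south[x] if y == 0 else north[y - 1][x]
                candidates = [t for t in tiles if t[0] == wg and t[1] == sg]
                if len(candidates) != 1:          # assembly must be deterministic
                    return False
                _, _, eg, ng, color = candidates[0]
                if color != pattern[y][x]:        # placed tile must match target
                    return False
                east[y][x], north[y][x] = eg, ng
        return True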