
    Towards Electrical, Integrated Implementations of SIMPL Systems

    This paper discusses strategies for the electrical, integrated implementation of a novel security tool termed a SIMPL system, which was introduced in [1]. SIMPL systems are a public-key version of Physical Unclonable Functions (PUFs). Like a PUF, each SIMPL system S is physically unique and non-reproducible, and implements an individual function F_S. In contrast to a PUF, every SIMPL system S possesses a publicly known numerical description D(S), which allows its digital simulation and prediction. However, any such simulation must work at a detectably lower speed than the real-time behavior of S. As argued in [1], SIMPL systems have practicality and security advantages over PUFs, Certificates of Authenticity (COAs), Physically Obfuscated Keys (POKs), and also over standard mathematical cryptotechniques. This manuscript focuses on electrical, integrated realizations of SIMPL systems and proposes two potential candidates: SIMPL systems derived from special SRAM architectures (so-called "skew designs" of SRAM cells), and implementations based on analog computing arrays called Cellular Non-Linear Networks (CNNs).
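The speed-gap idea behind SIMPL systems can be sketched as a verifier-side check: a response is accepted only if it is both correct (checkable via the public description D(S)) and fast. The class, method names, timing bound, and the stand-in function below are illustrative assumptions, not the paper's construction:

```java
public class SimplAuth {
    // Hypothetical timing bound: any pure simulation of F_S is assumed to be
    // detectably slower than the genuine SIMPL hardware (assumption).
    static final long TIME_BOUND_NS = 1_000_000;

    // The verifier knows D(S), so it can compute F_S itself -- just slowly.
    // A trivial stand-in for the individual function F_S:
    static int simulate(int challenge) { return challenge * 31 + 7; }

    /** Verifier-side check: the response must be correct AND arrive in time. */
    static boolean verify(int challenge, int response, long elapsedNs) {
        return response == simulate(challenge) && elapsedNs < TIME_BOUND_NS;
    }

    public static void main(String[] args) {
        // Genuine SIMPL hardware: correct and fast.
        System.out.println(verify(42, simulate(42), 500_000));
        // Adversarial simulation: correct but too slow, so it is rejected.
        System.out.println(verify(42, simulate(42), 5_000_000));
    }
}
```

The security argument rests entirely on the timing gap: since D(S) is public, correctness alone proves nothing, and only possession of the physical system S lets a prover answer within the bound.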

    Virtual Proofs of Reality

    In this paper, we discuss the question of how physical statements can be proven remotely over digital communication channels, without using classical secret keys and without assuming tamper-resistant, trusted measurement hardware in the location of the prover. Examples of the considered physical statements are: (i) “the temperature of a certain object is X °C”, (ii) “two certain objects are positioned at distance X”, or (iii) “a certain object has been irreversibly altered or destroyed”. Lacking an established name, we call the corresponding security protocols “virtual proofs of reality” (VPs). While a host of variants seems conceivable, this paper focuses on VPs in which the verifier has handed over one or more specific physical objects O_i to the prover at some point prior to the VP. These “witness objects” assist the prover during the proof, but shall neither contain classical digital keys nor be assumed tamper-resistant in the classical sense. In our adversarial model, the prover is allowed to open, inspect, and alter these objects, limited only by current technology, while still being unable to prove false claims to the verifier. To illustrate our concept, we give example protocols built on temperature-sensitive integrated circuits, disordered optical scattering media, and quantum systems. These protocols prove the temperature, destruction/modification, or relative position of witness objects in the prover’s location. Full experimental realizations of these schemes are beyond the scope of this paper, but the protocols utilize established technologies from the areas of physical unclonable functions and quantum cryptography, and hence appear plausible even without such proof. Finally, we discuss potential advancements of our method in theory, for example “public virtual proofs” that function without exchanging witness objects O_i between the verifier and the prover. Our work touches upon and partly extends several established cryptographic and security concepts, including physical unclonable functions, quantum cryptography, and interactive proof systems.
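A temperature VP with a witness object can be sketched in a few lines: before handover, the verifier characterizes the witness's challenge-responses at several temperatures; later, the prover's fresh response is accepted only if it matches the table entry for the claimed temperature. The response model, names, and numbers below are illustrative assumptions, not the paper's protocol:

```java
import java.util.HashMap;
import java.util.Map;

public class TemperatureVP {
    // Stand-in for a temperature-sensitive PUF-like witness object:
    // its response to a challenge drifts with temperature (assumption).
    static int witnessResponse(int challenge, int tempC) {
        return challenge * 17 + tempC * 3;
    }

    /** Verifier: pre-characterizes the witness at several temperatures before handover. */
    static Map<Integer, Integer> characterize(int challenge, int[] temps) {
        Map<Integer, Integer> table = new HashMap<>();
        for (int t : temps) table.put(t, witnessResponse(challenge, t));
        return table;
    }

    /** Verifier accepts the claim "temperature is t" iff the fresh response matches. */
    static boolean verifyClaim(Map<Integer, Integer> table, int claimedTemp, int response) {
        Integer expected = table.get(claimedTemp);
        return expected != null && expected == response;
    }

    public static void main(String[] args) {
        Map<Integer, Integer> table = characterize(7, new int[]{20, 40, 60});
        int fresh = witnessResponse(7, 40);                // prover measures at 40 °C
        System.out.println(verifyClaim(table, 40, fresh)); // honest claim accepted
        System.out.println(verifyClaim(table, 60, fresh)); // false claim rejected
    }
}
```

Note that no digital key is stored in the witness: the table on the verifier's side plays the role that a pre-recorded challenge-response database plays in PUF-based identification.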

    Secret-free security: a survey and tutorial

    Classical keys, i.e., secret keys stored permanently in digital form in non-volatile memory, appear indispensable in modern computer security, but also constitute an obvious attack target in any hardware containing them. This contradiction has led to a perpetual battle between key extractors and key protectors over the decades. It has long been known that physical unclonable functions (PUFs) can at least partially overcome this issue, since they enable secure hardware without the above classical keys. Unfortunately, recent research revealed that many standard PUFs still contain other types of secrets deeper in their physical structure, whose disclosure to adversaries breaks security as well: examples include the manufacturing variations in SRAM PUFs, the power-up states of SRAM PUFs, or the signal delays in Arbiter PUFs. Most of these secrets have already been extracted in viable attacks in the past, breaking PUF security in practice. A second generation of physical security primitives now shows potential to resolve this remaining problem, however. In certain applications, so-called Complex PUFs, SIMPLs/PPUFs, and UNOs are able to realize hardware that is not just free of classical keys in the above sense, but completely secret-free. In the resulting hardware systems, adversaries could hypothetically be allowed to inspect every bit and every atom, and learn any information present in any form in the system, without being able to break security. Secret-free hardware would hence promise to be innately and permanently immune against any physical or malware-based key extraction: there simply is no security-critical information to extract anymore. Our survey and tutorial paper takes the described situation as its starting point, and categorizes, formalizes, and overviews the recently evolving area of secret-free security. We propose the attempt of making hardware completely secret-free as a promising endeavor in future hardware designs, at least in those application scenarios where this is logically possible. In others, we suggest that secret-free techniques could be combined with standard PUFs and classical methods to construct hybrid systems with notably reduced attack surfaces.

    Abstract-Syntax-Driven Development of Oberon-0 Using YAJCo

    YAJCo is a tool for the development of software languages based on an annotated language model. The model is represented by Java classes with annotations defining their mapping to concrete syntax. This approach to language definition enables the abstract syntax, instead of the concrete syntax, to be the central point of the development process. In this paper, a case study of the development of the Oberon-0 programming language is presented. The study is based on the LDTA Tool Challenge and showcases details of abstract and concrete syntax definition using YAJCo, as well as the implementation of name resolution, type checking, model transformation, and code generation. The language was implemented in a modular fashion to demonstrate the language extension mechanisms supported by YAJCo.
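The abstract-syntax-driven idea can be illustrated with plain Java classes acting as the language model; YAJCo's actual annotations, which map constructors to concrete syntax, are omitted here, and the class names and the tiny Oberon-0-flavored fragment are assumptions for illustration only:

```java
public class OberonModel {
    // Abstract syntax as Java classes: each class is one language concept.
    // Code generation is then a method over the model, independent of parsing.
    interface Statement { String generate(); }

    static final class Assignment implements Statement {
        final String variable; final int value;
        Assignment(String variable, int value) { this.variable = variable; this.value = value; }
        public String generate() { return variable + " := " + value; }
    }

    static final class While implements Statement {
        final String condition; final Statement body;
        While(String condition, Statement body) { this.condition = condition; this.body = body; }
        public String generate() {
            return "WHILE " + condition + " DO " + body.generate() + " END";
        }
    }

    /** Builds a tiny program directly from the abstract syntax and prints it. */
    static String demo() {
        Statement program = new While("i > 0", new Assignment("i", 0));
        return program.generate();
    }

    public static void main(String[] args) {
        System.out.println(demo()); // WHILE i > 0 DO i := 0 END
    }
}
```

The point of the approach is visible even in this sketch: name resolution, type checking, and code generation all operate on the model classes, while the concrete syntax (keywords, separators) is attached declaratively and can change without touching those phases.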

    Synchronized exchange of material and information

    Thesis (M.Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, June 2003. Includes bibliographical references (leaves 39-41). Commerce is all about the carefully managed exchange of material, money, and information. Traditionally, the connection between material and information has been tenuous, with humans acting as the intermediaries. This has made the supply chain inefficient and expensive. The Auto-ID Center has created a stronger, automatic link between inanimate objects and computers. This thesis completes the information exchange, or feedback loop, which makes commerce possible. Specifically, it identifies a framework for information exchange alongside material exchange using Savant-to-Savant communication. Messaging standards will need to support the Auto-ID Center's technology, and this thesis suggests how to augment existing and emerging communication standards to accomplish this feat. Finally, to address the issue of increasing information management, this thesis analyzes the aggregation database, an IT infrastructure component that might be of value to organizations. The outcome of this thesis is an understanding of the various issues necessary to develop a secure, efficient, and robust system for tracking and automatically confirming material exchange. By Amit Goyal. M.Eng.

    Next Generation Cloud Computing: New Trends and Research Directions

    The landscape of cloud computing has significantly changed over the last decade. Not only have more providers and service offerings crowded the space, but cloud infrastructure that was traditionally limited to single-provider data centers is now evolving. In this paper, we first discuss the changing cloud infrastructure and consider the use of infrastructure from multiple providers and the benefit of decentralising computing away from data centers. These trends have resulted in the need for a variety of new computing architectures that will be offered by future cloud infrastructure. These architectures are anticipated to impact areas such as connecting people and devices, data-intensive computing, the service space, and self-learning systems. Finally, we lay out a roadmap of challenges that will need to be addressed to realise the potential of next-generation cloud systems. Comment: Accepted to Future Generation Computer Systems, 07 September 201

    Superresolution Enhancement with Active Convolved Illumination

    The first two decades of the 21st century witnessed the emergence of “metamaterials”. The prospect of unrestricted control over light-matter interactions was a major contributing factor leading to the realization of new technologies and the advancement of existing ones. While the field certainly does not lack innovative applications, widespread commercial deployment may still be several decades away. Fabrication of sophisticated 3D micro- and nanostructures, especially for telecommunications and optical frequencies, will require a significant advancement of current technologies. More importantly, the effects of absorption and scattering losses will require a robust solution, since these render any conceivable application of metamaterials impracticable. In this dissertation, a new approach, called Active Convolved Illumination (ACI), is formulated to address the problem of optical losses in metamaterials and plasmonics. An active implementation of ACI’s predecessor, the Π scheme, is formulated to provide compensation for arbitrary spatial frequencies. The concept of “selective amplification” of spatial frequencies is introduced as a method of providing signal amplification with suppressed noise amplification. Pendry’s non-ideal negative-index flat lens is intentionally chosen as an example of a stringent and conservative test candidate. A physical implementation of ACI is presented with a plasmonic imaging system: the superlens is integrated with a tunable near-field spatial filter designed with a layered metal-dielectric system exhibiting hyperbolic dispersion. A study of the physical generation of the auxiliary source shows how selective amplification via convolution is implemented by a lossy metamaterial functioning as a near-field spatial filter. Additionally, the preservation of the mathematical formalism of ACI is demonstrated by integrating the hyperbolic metamaterial with the previously used plasmonic imaging system. A comprehensive mathematical exposition of ACI is developed for coherent light. This provides a rigorous understanding of the role of selective spectral amplification and correlations during the loss-compensation process. The spectral variance of noise is derived to prove how an auxiliary source that is, firstly, correlated with the object field, secondly, defined over a finite spectral bandwidth, and thirdly, provides amplification over the selected bandwidth can significantly improve the spectral signal-to-noise ratio and consequently the resolution limit of a generic lossy plasmonic superlens.
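The selective-amplification idea can be sketched in one schematic relation; the symbols below (object spectrum, band window, gain) are illustrative assumptions and not the dissertation's exact notation:

```latex
% Auxiliary source correlated with the object field, restricted to a
% finite spatial-frequency band K and amplified by gain g only there:
E_{\mathrm{aux}}(k_x) \;=\; g\, W_K(k_x)\, E_{\mathrm{obj}}(k_x),
\qquad
W_K(k_x) \;=\;
\begin{cases}
1, & k_x \in K,\\
0, & \text{otherwise.}
\end{cases}
% Because noise outside the band K is not amplified, the in-band spectral
% signal-to-noise ratio grows with g while out-of-band noise stays at its
% passive level -- the "selective amplification" that suppresses noise growth.
```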

    Semiconductor process design : representations, tools, and methodologies

    Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1991. Vita. Includes bibliographical references (p. 267-278). By Duane S. Boning. Ph.D.