
    Security and privacy services based on biosignals for implantable and wearable device

    Mención Internacional en el título de doctor.

    The proliferation of wearable and implantable medical devices has given rise to an interest in developing security schemes suitable for these devices and the environment in which they operate. One area that has received much attention lately is the use of (human) biological signals as the basis for biometric authentication, identification and the generation of cryptographic keys. More concretely, in this dissertation we use the Electrocardiogram (ECG) to extract fiducial points which are later used in cryptographic protocols. Fiducial points are the points of interest that can be extracted from biological signals; examples of ECG fiducial points are the P-wave, the QRS complex, the T-wave, the R-peaks and the RR time interval. In particular, we focus on the time difference between two consecutive heartbeats (R-peaks). These time intervals are referred to as Inter-Pulse Intervals (IPIs) and have been shown to contain entropy after applying certain signal processing algorithms, a process known as quantization. The entropy of the heart signal makes ECG values a candidate for generating tokens to be used in security protocols. However, most of the solutions proposed in the literature rely on questionable assumptions.
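The IPI-and-LSB token construction just described can be sketched as follows. This is an illustrative example, not the thesis code, and the R-peak timestamps are invented:

```python
# Illustrative sketch (not the thesis code): deriving token bits from
# Inter-Pulse Intervals (IPIs). The R-peak timestamps are hypothetical.
r_peaks_ms = [812, 1624, 2451, 3260, 4085, 4901, 5722, 6538]  # R-peak times (ms)

# An IPI is the time between two consecutive R-peaks.
ipis = [b - a for a, b in zip(r_peaks_ms, r_peaks_ms[1:])]

# The quantization step commonly assumed in the literature: keep the
# four least significant bits (LSBs) of each IPI and concatenate them.
token_bits = "".join(format(ipi & 0xF, "04b") for ipi in ipis)
print(ipis)        # [812, 827, 809, 825, 816, 821, 816]
print(token_bits)  # 28 bits, 4 per IPI
```

Whether bits produced this way are actually suitable for cryptography is precisely what the dissertation questions.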
For instance, it is commonly assumed that it is possible to generate the same cryptographic token on at least two different devices sensing the same signal, using the IPIs of each cardiac signal without applying any synchronization algorithm; authors typically measure only the entropy of the LSBs to determine whether the generated cryptographic values are random; authors usually pick the four LSBs, assuming they are the best ones for creating cryptographic tokens; the datasets used in these works are rather small and therefore possibly not significant enough; and, in general, it is impossible to reproduce the experiments carried out by other researchers because the source code of such experiments is usually unavailable. In this Thesis, we overcome these weaknesses by systematically addressing most of the open research questions. For this reason, in all the experiments carried out during this research we used PhysioNet, a public repository available on the Internet that hosts a large collection of heart-signal databases named PhysioBank. This repository is constantly updated by medical researchers who share sensitive patient information, and it also offers an open-source software package named PhysioToolkit which can be used to read and display these signals. All the datasets we used contain ECG records obtained from a variety of real subjects, with different heart-related pathologies as well as healthy people. The first chapter of this dissertation (Chapter 1) is dedicated to presenting the research questions, introducing the main concepts used throughout this document and establishing some medical and cryptographic definitions. Finally, the objectives that this dissertation tackles are described together with the main motivations for this Thesis. In Chapter 2 we report the results of a large-scale statistical study to determine whether the heart signal is a good source of entropy.
For this, we analyze 19 public datasets of heart signals from the PhysioNet repository, spanning electrocardiograms from multiple subjects sampled at different frequencies and lengths. We then apply both the ENT and the NIST STS standard batteries of randomness tests to the extracted IPIs. The results of this analysis clearly show that a short burst of bits derived from an ECG record may seem random, but large files derived from long ECG records should not be used for security purposes. In Chapter 3, we carry out an analysis to check whether the assumption that two different sensors can generate the same cryptographic token is reasonable. We systematically check whether two sensors can agree on the same token without sharing any type of information. Similarly to other proposals, we incorporate ECC algorithms such as BCH codes into the token generation. We conclude that a fuzzy extractor (or any other error-correction technique) is not enough to correct the synchronization errors between the IPI values derived from two ECG signals captured by two sensors placed in different positions, and we demonstrate that a pre-processing of the heart signal must be performed before the fuzzy extractor is applied. Going one step further, and in order to generate the same token on different sensors, we propose a synchronization algorithm based on a runtime monitor algorithm. After applying our proposed solution, we ran the experiments again with 19 public databases from the PhysioNet repository; the only constraint when picking those databases was that they needed at least two measurements of heart signals (ECG1 and ECG2). We conclude from these experiments that the same token can be derived on different sensors in most of the tested databases if and only if a pre-processing of the heart signal is performed before extracting the tokens.
In Chapter 4, we analyze the entropy of the tokens extracted from a heart signal according to the NIST SP 800-90B recommendation (Recommendation for the Entropy Sources Used for Random Bit Generation). We downloaded 19 databases from the PhysioNet public repository and analyzed, in terms of min-entropy, more than 160,000 files. Finally, we propose other combinations for extracting tokens, taking 2, 3, 4 and 5 bits other than the usual four LSBs. We demonstrate that the four LSBs are not the best bits to use in cryptographic applications, and we offer alternative combinations for two (e.g., 87), three (e.g., 638), four (e.g., 2638) and five (e.g., 23758) bits which are, in general, much better than the four LSBs from the entropy point of view. Finally, the last chapter of this dissertation (Chapter 5) summarizes the main conclusions arising from this PhD Thesis and introduces some open questions.
Programa de Doctorado en Ciencia y Tecnología Informática por la Universidad Carlos III de Madrid. Presidente: Arturo Ribagorda Garnacho. Secretario: Jorge Blasco Alis. Vocal: Jesús García López de la Call
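The min-entropy analysis of Chapter 4 follows NIST SP 800-90B. Its simplest estimator, the most-common-value (MCV) estimate, can be sketched as below; the data here is synthetic, and the confidence-interval correction of the full standard is omitted:

```python
# Sketch of the most-common-value (MCV) min-entropy estimate from
# NIST SP 800-90B, applied to 4-bit token symbols. Data is synthetic.
import math
from collections import Counter

def mcv_min_entropy(symbols):
    """Min-entropy per symbol: -log2(p_max), where p_max is the relative
    frequency of the most common symbol. This omits the upper-confidence
    bound the full SP 800-90B estimator applies to p_max."""
    counts = Counter(symbols)
    p_max = max(counts.values()) / len(symbols)
    return -math.log2(p_max)

uniform = [i % 16 for i in range(1600)]             # every 4-bit value equally often
biased = [0] * 1200 + [i % 16 for i in range(400)]  # value 0 dominates
print(mcv_min_entropy(uniform))  # 4.0 bits: ideal for 4-bit symbols
print(mcv_min_entropy(biased))   # far below 4 bits
```

A token extractor whose symbols score well below the ideal (here, 4 bits per 4-bit symbol) is a poor entropy source in exactly the sense the chapter measures.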

    Quantum random number generators for industrial applications

    Premi extraordinari doctorat UPC curs 2017-2018. Àmbit de Ciències.

    Randomness is one of the most intriguing, inspiring and debated topics in the history of the world. It appears every time we wonder about our existence and the way we are, e.g., do we have free will? Is evolution a result of chance? It is also present in any attempt to understand our anchoring to the universe, and the rules behind the universe itself, e.g., why are we here, and when and why did all this start? Is the universe deterministic or does unpredictability exist? Remarkably, randomness also plays a central role in the information era and technology. Random digits are used in communication protocols like Ethernet, in search engines and in processing algorithms such as PageRank. Randomness is also widely used in so-called Monte Carlo methods in physics, biology, chemistry, finance and mathematics, as well as in many other disciplines. However, the most iconic use of random digits is found in cryptography. Random numbers are used to generate cryptographic keys, which are the most basic element providing security and privacy to any form of secure communication. This thesis has been carried out with the following questions in mind: does randomness exist in photonics? If so, how do we mine it, and how do we mine it in a massively scalable manner so that everyone can easily use it? Addressing these two questions led us to combine tools from fundamental physics and engineering. The thesis starts with an in-depth study of the phase diffusion process in semiconductor lasers and its application to random number generation. In contrast to other physical processes based on deterministic laws of nature, the phase diffusion process has a purely quantum mechanical origin and, as such, is an ideal source for generating truly unpredictable digits.
First, we experimentally demonstrated the fastest quantum random number generation scheme ever reported (at the time), using only components from the telecommunications industry. Up to 40 Gb/s were demonstrated to be possible using a pulsed scheme. We then moved towards building prototypes and testing them with partners in supercomputing and fundamental research. In particular, the devices developed during this thesis were used in the landmark loophole-free Bell test experiments of 2015. In the process of building the technology, we started a new research focus as an attempt to answer the following question: how do we know that the digits that we generate are really coming from the phase diffusion process that we trust? As a result, we introduced the randomness metrology methodology, which can be used to derive quantitative bounds on the quality of any physical random number generation device. Finally, we moved towards miniaturisation of the technology by leveraging techniques from the photonic integrated circuit industry. The first fully integrated quantum random number generator was demonstrated using a novel two-laser scheme on an Indium Phosphide platform. In addition, we also demonstrated the integration of part of the technology on a Silicon Photonics platform, opening the door towards manufacturing in the most advanced semiconductor industry.
Award-winning. Postprint (published version).

    Design and Analysis of a True Random Number Generator Based on GSR Signals for Body Sensor Networks

    This article belongs to the Section Internet of Things.

    Today, medical equipment and general-purpose devices such as smart-watches or smart-textiles can acquire a person's vital signs. Regardless of the type of device and its purpose, they are all equipped with one or more sensors and often have wireless connectivity. Due to the transmission of sensitive data over the insecure radio channel and the need to ensure exclusive access for authorised entities, security mechanisms and cryptographic primitives must be incorporated onboard these devices. Random number generators are one such necessary cryptographic primitive. Motivated by this, we propose a True Random Number Generator (TRNG) that makes use of the Galvanic Skin Response (GSR) signal measured by a sensor on the body. After an exhaustive analysis of both the entropy source and the randomness of the output, we conclude that the output generated by the proposed TRNG behaves as if produced by a random variable. Moreover, in comparison with previous proposals, the performance offered is much higher than that of earlier works. This work was supported by the Spanish Ministry of Economy and Competitiveness under the contract ESP-2015-68245-C4-1-P, by the MINECO grant TIN2016-79095-C2-2-R (SMOG-DEV), and by the Comunidad de Madrid (Spain) under the project CYNAMON (P2018/TCS-4566), co-financed by European Structural Funds (ESF and FEDER). This research was also supported by the Interdisciplinary Research Funds (HTC, United Arab Emirates) under the grant No. 103104.

    Heartbeats Do Not Make Good Pseudo-Random Number Generators: An Analysis of the Randomness of Inter-Pulse Intervals

    The proliferation of wearable and implantable medical devices has given rise to an interest in developing security schemes suitable for these systems and the environment in which they operate. One area that has received much attention lately is the use of (human) biological signals as the basis for biometric authentication, identification and the generation of cryptographic keys. The heart signal (e.g., as recorded in an electrocardiogram) has been used by several researchers in the last few years. Specifically, the so-called Inter-Pulse Intervals (IPIs), the times between consecutive heartbeats, have been repeatedly pointed out as a potentially good source of entropy and are at the core of various recent authentication protocols. In this work, we report the results of a large-scale statistical study to determine whether such an assumption holds. For this, we have analyzed 19 public datasets of heart signals from the Physionet repository, spanning electrocardiograms from 1353 subjects sampled at different frequencies and with lengths that vary between a few minutes and several hours. We believe this is the largest dataset on this topic analyzed in the literature. We have then applied a standard battery of randomness tests to the extracted IPIs. Under the algorithms described in this paper and after analyzing these 19 public ECG datasets, our results raise doubts about the use of IPI values as a good source of randomness for cryptographic purposes. This has repercussions both for the security of some of the protocols proposed up to now and for the design of future IPI-based schemes. This work was supported by the MINECO Grant TIN2013-46469-R (SPINY: Security and Privacy in the Internet of You); by the CAM Grant S2013/ICE-3095 (CIBERDINE: Cybersecurity, Data and Risks); and by the MINECO Grant TIN2016-79095-C2-2-R (SMOG-DEV: Security Mechanisms for fog computing: advanced security for Devices). This research has also been supported by the Swedish Research Council (Vetenskapsrådet) under Grant No. 2015-04154 (PolUser: Rich User-Controlled Privacy Policies).
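As a concrete example of the kind of test found in the standard randomness battery applied above, a minimal monobit (frequency) test can be written as follows; the bit strings here are synthetic, not IPI-derived:

```python
# Minimal sketch of the NIST STS monobit (frequency) test, the first and
# simplest test of the battery. Inputs below are synthetic examples.
import math

def monobit_p_value(bits):
    """Frequency test: map bits to +/-1 and sum; under the null hypothesis
    of random bits, p = erfc(|S| / sqrt(2n)). Small p (< 0.01 in NIST's
    convention) means the string fails the test."""
    n = len(bits)
    s = sum(1 if b == "1" else -1 for b in bits)
    return math.erfc(abs(s) / math.sqrt(2 * n))

balanced = "10" * 500             # perfectly balanced 1000-bit string
skewed = "1" * 700 + "0" * 300    # heavily biased 1000-bit string
print(monobit_p_value(balanced))  # 1.0 -> passes
print(monobit_p_value(skewed))    # ~0  -> fails
```

Passing this one test is necessary but far from sufficient; the full STS battery applies many such statistics, which is why short IPI bursts can pass while long records fail.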

    On the establishment of PSEUDO random keys for body area network security using physiological signals

    With the help of recent technological advancements, especially in the last decade, it has become much easier to extensively and remotely observe the medical conditions of patients. This observation is done through wearable devices called biosensors that act as connected nodes on a Body Area Network (BAN). The main goal of these biosensors is to collect and provide critical and sensitive health data concerning the host individual, communicate with each other in order to make decisions based on what has been captured, and relay the collected data to remote healthcare professionals. The sensitive nature of this critical data makes it extremely important to process it as securely as possible. Biosensors communicate with each other through a wireless medium that is vulnerable to potential security attacks. Therefore, secure mechanisms for both data protection and intra-BAN communication are needed. Moreover, these mechanisms should be lightweight in order to overcome the hardware resource restrictions of biosensors. Random and secure cryptographic key generation and agreement among the biosensors lie at the core of these security mechanisms. In this thesis, we propose the SKA-PSAR (Secure Key Agreement Using Physiological Signals with Augmented Randomness) system. The main goal of this system is to produce highly random cryptographic keys for the biosensors for secure communication in a BAN. Like its predecessor, the SKA-PS protocol by Karaoğlan Altop et al., SKA-PSAR employs physiological signals, such as heart rate and blood pressure, as inputs for the keys, and utilizes the set reconciliation mechanism as its basic building block. The novel quantization and binarization methods of the Secure Key Agreement protocol of the proposed SKA-PSAR system distinguish it from SKA-PS in that the former increases the randomness of the generated keys.
In addition, the generated cryptographic keys in our proposed SKA-PSAR system have distinctive and time-variant characteristics, as well as bit sizes long enough to be considered resistant against cryptographic attacks. Moreover, a correct key generation rate of 100% and a false key generation rate of 0% have been obtained. Last but not least, the computational complexity, communication complexity and memory requirements of our proposed system are considerably higher than those of SKA-PS, but this is the cost that needs to be paid for achieving a high randomness level.
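The quantization/binarization step that turns a physiological signal into key bits can be illustrated with a minimal sketch. The median-threshold rule below is a generic illustration under our own assumptions, not the actual SKA-PSAR method, and the heart-rate readings are invented:

```python
# Generic illustration of binarizing a physiological signal into bits
# (NOT the SKA-PSAR quantizer): emit 1 when a sample exceeds the median
# of the window, else 0.
import statistics

def binarize(samples):
    """Threshold each sample against the window median and return a bit string."""
    med = statistics.median(samples)
    return "".join("1" if s > med else "0" for s in samples)

heart_rates = [72, 75, 71, 78, 74, 69, 77, 73]  # hypothetical readings (bpm)
print(binarize(heart_rates))
```

Two sensors observing the same signal would apply the same rule to their own readings and then reconcile the (mostly matching) bit strings, which is where a set reconciliation mechanism like SKA-PS's comes in.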

    Energy Efficient Computing with Time-Based Digital Circuits

    University of Minnesota Ph.D. dissertation. May 2019. Major: Electrical Engineering. Advisor: Chris Kim. 1 computer file (PDF); xv, 150 pages.

    Advancements in semiconductor technology have given the world economical, abundant, and reliable computing resources, enabling countless breakthroughs in science, medicine, and agriculture that have improved the lives of many. Due to physics, the rate of these advancements is slowing, while the demand for computing horsepower ever grows. Novel computer architectures that leverage the foundation of conventional systems must become mainstream to continue providing the improved hardware required by engineers, scientists, and governments to innovate. This thesis provides a path forward by introducing multiple time-based computing architectures for a diverse range of applications. Simply put, time-based computing encodes the output of a computation in the time it takes to generate the result. Conventional systems encode this information in voltages across multiple signals; the performance of these systems is tightly coupled to improvements in semiconductor technology. Time-based computing elegantly uses the simplest components of conventional systems to efficiently compute complex results. Two time-based neuromorphic computing platforms, based on a ring oscillator and a digital delay line, are described. An analog-to-digital converter operating in the time domain is designed using a beat-frequency circuit and used to record brain activity. A novel path planning architecture, with designs for 2D and 3D routes, is implemented in the time domain. Finally, a machine learning application using time-domain inputs enables improved performance in heart rate prediction and biometric identification, and introduces a new method for using machine learning to predict temporal signal sequences.
As these innovative architectures are presented, it will become clear that the way forward will be increasingly enabled by time-based designs.

    Dew: Transparent Constant-sized zkSNARKs

    We construct polynomial commitment schemes with constant-sized evaluation proofs and logarithmic verification time in the transparent setting. To the best of our knowledge, this is the first result achieving this combination of properties. Our starting point is a transparent inner product commitment scheme with constant-sized proofs and linear verification. We build on this to construct a polynomial commitment scheme with constant-sized evaluation proofs and logarithmic (in the degree of the polynomial) verification time. Our constructions make use of groups of unknown order, instantiated by class groups. We prove security of our construction in the Generic Group Model (GGM). Using our polynomial commitment scheme to compile an information-theoretic proof system yields Dew -- a transparent and constant-sized zkSNARK (Zero-knowledge Succinct Non-interactive ARgument of Knowledge) with logarithmic verification. Finally, we show how to recover the result of DARK (Bünz et al., Eurocrypt 2020). DARK presented a succinct transparent polynomial commitment scheme with logarithmic proof size and verification. However, it was recently discovered to have a gap in its security proof (Block et al., CRYPTO 2021). We recover its extractability based on our polynomial commitment construction, thus obtaining a transparent polynomial commitment scheme with logarithmic proof size and verification under the same assumptions as DARK, but with a prover time that is quadratic.
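In standard notation (ours, not taken from the paper itself), a polynomial commitment scheme exposes four algorithms; the abstract's contribution is achieving constant proof size and logarithmic verification with a transparent setup:

```latex
\begin{align*}
\mathsf{Setup}(1^\lambda, d) &\to \mathsf{pp}
  && \text{transparent: public coins, no trusted setup}\\
\mathsf{Commit}(\mathsf{pp}, f) &\to C
  && f \in \mathbb{Z}_p[X],\ \deg f \le d\\
\mathsf{Open}(\mathsf{pp}, f, z) &\to (y, \pi)
  && y = f(z),\ |\pi| = O(1)\\
\mathsf{Verify}(\mathsf{pp}, C, z, y, \pi) &\to \{0,1\}
  && \text{verification time } O(\log d)
\end{align*}
```

Soundness requires that no efficient adversary can open the same commitment $C$ at a point $z$ to two different values; in Dew this is proven in the Generic Group Model over groups of unknown order.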

    Biometrics

    Biometrics uses methods for the unique recognition of humans based upon one or more intrinsic physical or behavioral traits. In computer science, in particular, biometrics is used as a form of identity access management and access control. It is also used to identify individuals in groups that are under surveillance. The book consists of 13 chapters, each focusing on a certain aspect of the problem. The chapters are divided into three sections: physical biometrics, behavioral biometrics and medical biometrics. The key objective of the book is to provide a comprehensive reference and text on human authentication and people identity verification from physiological, behavioural and other points of view. It aims to publish new insights into current innovations in computer systems and technology for biometrics development and its applications. The book was reviewed by the editor, Dr. Jucheng Yang, and by many of the guest editors, such as Dr. Girija Chetty, Dr. Norman Poh, Dr. Loris Nanni, Dr. Jianjiang Feng, Dr. Dongsun Park, Dr. Sook Yoon and so on, who also made a significant contribution to the book.