740 research outputs found

    J3Gen : a PRNG for Low-Cost Passive RFID

    Pseudorandom number generation (PRNG) is the main security tool in low-cost passive radio-frequency identification (RFID) technologies, such as EPC Gen2. We present a lightweight PRNG design for low-cost passive RFID tags, named J3Gen. J3Gen is based on a linear feedback shift register (LFSR) configured with multiple feedback polynomials. The polynomials are alternated during the generation of sequences via a physical source of randomness. J3Gen successfully handles the inherent linearity of LFSR-based PRNGs and satisfies the statistical requirements imposed by the EPC Gen2 standard. A hardware implementation of J3Gen is presented and evaluated with regard to different design parameters, defining the key-equivalence security and nonlinearity of the design. The results of a SPICE simulation confirm the power-consumption suitability of the proposal.
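
    The multi-polynomial idea can be sketched in software. In the toy below, the register width, tap masks, and switching rule are illustrative stand-ins (not the published J3Gen parameters), and a seeded software RNG merely emulates the physical randomness source that selects the active polynomial:

```python
import random

class MultiPolyLFSR:
    """Toy LFSR that alternates among several feedback polynomials,
    in the spirit of J3Gen. Tap masks and the switching probability
    are illustrative only."""

    def __init__(self, state, tap_masks, rng=None):
        self.state = state & 0xFFFF      # 16-bit register, must be nonzero
        self.tap_masks = tap_masks       # one mask per feedback polynomial
        self.active = 0                  # index of the active polynomial
        # stand-in for the physical true-randomness source
        self.rng = rng or random.Random(0)

    def step(self):
        taps = self.tap_masks[self.active]
        # feedback bit = parity (XOR) of the tapped state bits
        fb = bin(self.state & taps).count("1") & 1
        out = self.state & 1             # output the low bit
        self.state = (self.state >> 1) | (fb << 15)
        # occasionally hop to another polynomial, which is what breaks
        # the linearity of a single-polynomial LFSR
        if self.rng.random() < 0.1:
            self.active = self.rng.randrange(len(self.tap_masks))
        return out

    def bits(self, n):
        return [self.step() for _ in range(n)]
```

    In hardware, the polynomial hop would be driven by the tag's physical noise source rather than a seeded PRNG; the masks here (e.g. `0x002D`) are not claimed to be primitive polynomials.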

    A Group-Based Ring Oscillator Physical Unclonable Function

    A silicon Physical Unclonable Function (PUF) is a physical structure of the chip with functional characteristics that are hard to predict before fabrication but are expected to be unique after fabrication. This is caused by random fabrication variations. The secret characteristics can only be extracted through physical measurement and vanish immediately when the chip is powered down. PUFs promise a more secure means of cryptographic key generation and storage, among many other security applications. However, there are still many practical challenges in cost-effectively building secure and reliable PUF secrecy. This dissertation proposes new architectures for ring oscillator (RO) PUFs to answer these challenges. First, our temperature-aware cooperative (TAC) RO PUF can utilize certain ROs that would otherwise be discarded due to their instability. Second, our novel group-based algorithm can generate secrecy higher than the theoretical upper bound of the conventional pairwise-comparison approach. Third, we build the first regression-based entropy distiller that can make the PUF secrecy statistically random and robust, meeting the NIST standards. Fourth, we develop a unique Kendall syndrome coding (KSC) that makes the PUF secrecy error-resilient against potential environmental fluctuations. Each of these methods can improve the hardware efficiency of the RO PUF implementation by 1.5X to 8X while improving the security and reliability of the PUF secrecy.
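
    A minimal sketch of why a group-based scheme can beat pairwise comparison: with g oscillators, comparing disjoint pairs yields g/2 bits, while encoding the full frequency ranking of the group yields up to floor(log2(g!)) bits. The two functions below are a toy illustration of that counting argument, not the dissertation's actual algorithm:

```python
import math
from itertools import permutations

def pairwise_bits(freqs):
    """Conventional scheme: one bit per disjoint RO pair, derived by
    comparing the two measured frequencies; n ROs -> n//2 bits."""
    return [int(freqs[i] > freqs[i + 1]) for i in range(0, len(freqs) - 1, 2)]

def group_rank_bits(freqs):
    """Group-based idea (simplified): encode the index of the group's
    full frequency ordering among all g! orderings, giving
    floor(log2(g!)) bits. Only practical for small groups, since the
    permutation list grows factorially."""
    g = len(freqs)
    order = tuple(sorted(range(g), key=lambda i: freqs[i]))
    perms = sorted(permutations(range(g)))
    index = perms.index(order)                    # rank of this ordering
    nbits = int(math.log2(math.factorial(g)))
    return [(index >> k) & 1 for k in reversed(range(nbits))]
```

    For a group of 4 oscillators, pairwise comparison yields 2 bits, while the ranking encodes floor(log2(24)) = 4 bits from the same measurements.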

    Characterization of cyber attacks through variable length Markov models

    The increase in bandwidth, the emergence of wireless technologies, and the spread of the Internet throughout the world have created new forms of communication with effects on areas such as business, entertainment, and education. This pervasion of computer networks into human activity has amplified the importance of cyber security. Network security relies heavily on Intrusion Detection Systems (IDS), whose objective is to detect malicious network traffic and computer usage. IDS data can be correlated into cyber attack tracks, which consist of ordered collections of alerts triggered during a single multi-stage attack. The objective of this research is to enhance the current knowledge of attack behavior by developing a model that captures the sequential properties of attack tracks. Two sequence characterization models are discussed: Variable Length Markov Models (VLMMs), which are a type of finite-context model, and Hidden Markov Models (HMMs), which are also known as finite-state models. A VLMM is implemented based on attack sequences s = {x1, x2, ..., xn}, where xi is drawn from a set of possible values of one or more fields in an alert message. This work shows how the proposed model can be used to predict future attack actions (xj+1) belonging to a newly observed and unfolding attack sequence s = {x1, x2, ..., xj}. It also presents a metric that measures the variability in attack actions based on information entropy, and a method for classifying attack tracks as sophisticated or simple based on average log-loss. In addition, insights into the analysis of attack target machines are discussed.
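
    The prediction step can be sketched as a finite-context model with back-off: count next-symbol frequencies for every context up to a maximum order, then predict from the longest context that matches the unfolding track. The action names and back-off rule below are illustrative, not the exact VLMM of the thesis:

```python
from collections import defaultdict, Counter

def train_vlmm(sequences, max_order=3):
    """Count next-symbol frequencies for every context of length
    0..max_order (a finite-context, VLMM-style model)."""
    model = defaultdict(Counter)
    for seq in sequences:
        for i in range(len(seq)):
            for k in range(max_order + 1):
                if i - k >= 0:
                    ctx = tuple(seq[i - k:i])
                    model[ctx][seq[i]] += 1
    return model

def predict(model, prefix, max_order=3):
    """Predict the next attack action: back off from the longest
    matching context toward the empty (order-0) context."""
    for k in range(min(max_order, len(prefix)), -1, -1):
        ctx = tuple(prefix[len(prefix) - k:])
        if ctx in model and model[ctx]:
            return model[ctx].most_common(1)[0][0]
    return None
```

    Trained on a set of attack tracks, the model answers "given the actions seen so far, what comes next?" by falling back to shorter contexts when the long one was never observed.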

    The rhythm of therapy: psychophysiological synchronization in clinical dyads

    Rhythmicity and synchronization are fundamental mechanisms employed by countless natural phenomena to communicate. Previous research has found evidence of synchronization between patients and therapists during clinical activity, for instance in their body movements (Ramseyer & Tschacher, 2011) and physiological activations (e.g., Marci et al., 2007; Kleinbub et al., 2012; Messina et al., 2013). While this phenomenon has been found to be associated with several important aspects of the clinical relationship, such as empathy, rapport, and outcome, and many authors have suggested that it may describe crucial dimensions of therapeutic dyad interaction and change, a clear explanation of its meaning is still lacking. The goals of the present work were to: 1) Provide a solid theoretical and epistemological background in which to situate the phenomenon. This was pursued by crossing neurophenomenology’s sophisticated ideas on mind-body integration (Varela, 1996) with Infant Research’s detailed observations on the development of the infant’s Self through relationships. The common ground for this connection was complex systems theory (von Bertalanffy, 1968; Haken, 2006). 2) Contribute to the literature through two replications of existing studies (Kleinbub et al., 2012; Messina et al., 2013) on skin conductance (SC) synchronization. In addition to the original designs, secure attachment priming (Mikulincer & Shaver, 2007) was introduced to explore whether the observed SC linkage was susceptible to manipulation, in accordance with the developmental premises defined in the theoretical chapters. Study 1 focused on synchrony between students and psychotherapists in simulated clinical sessions; Study 2 reprised the same methodology with two principal changes: first, the clinician’s role was played by psychologists without further clinical training, and second, each psychologist was involved in two distinct interviews in order to assess the impact of individual characteristics on SC synchrony. 3) Provide an idiographic exploration of the psychotherapy processes linked to matched SC activity. In Study 3, the highest- and lowest-synchrony sequences of 6 sessions of psychodynamic psychotherapy were the subject of a detailed phenomenological content analysis. The resulting micro-categories were synthesized into more abstract ones in an attempt to recognize regularities that could shed light on the phenomenon. 4) Explore the pertinence of applying mathematical properties derived from systems theory in psychological contexts. In Study 4, Shannon’s entropy and order equations (1948) were applied to the transcribed verbal content of 12 depression psychotherapies, to assess both intra-personal and inter-personal (dyadic) order in verbal categories. Results from these studies provided further evidence for the existence of a synchronization mechanism in clinical dyads. Furthermore, the various findings generally supported the dyadic-system theoretical model and its description of regulatory dynamics as a good explanation of the synchronization phenomena. Discrepancies with previous literature highlighted the need for further studies to embrace more methodological sophistication (such as employing lag analysis) and for caution in the interpretation of results.
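
    The Study 4 measures can be sketched directly from Shannon's definitions. The coding scheme for verbal categories is not specified here, so the category labels below are invented for illustration; only the entropy and order formulas come from Shannon (1948):

```python
import math
from collections import Counter

def shannon_entropy(categories):
    """Shannon entropy H = -sum(p_i * log2(p_i)) over the coded
    verbal categories of a transcript; higher H = less order."""
    counts = Counter(categories)
    n = len(categories)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def relative_order(categories):
    """Order as 1 - H/H_max, with H_max = log2(#distinct categories):
    1.0 = fully ordered (one category), 0.0 = maximally disordered."""
    k = len(set(categories))
    if k < 2:
        return 1.0
    return 1.0 - shannon_entropy(categories) / math.log2(k)
```

    Intra-personal order would apply this to one speaker's category sequence; inter-personal (dyadic) order to the interleaved sequence of both members of the dyad.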

    Process Flow Features as a Host-based Event Knowledge Representation

    The detection of malware is of great importance, but even non-malicious software can be used for malicious purposes. Monitoring processes and their associated information can characterize normal behavior and help identify malicious processes, or malicious use of normal processes, by measuring deviations from the learned baseline. This exploratory research describes a novel host feature generation process that calculates statistics of an executing process during a window of time called a process flow. Process flows are calculated from key process data structures extracted from computer memory using virtual machine introspection. Each flow cluster generated using k-means clustering of the flow features represents a behavior, where the members of the cluster all exhibit similar behavior. Testing explores associations between behavior and process flows that in the future may be useful for detecting unauthorized behavior or behavioral trends on a host. Analysis of two data collections demonstrates that this novel way of thinking of process behavior as process flows can produce baseline models, in the form of clusters, that do represent specific behaviors.
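
    The clustering step can be sketched with a minimal k-means over hypothetical flow-feature vectors (the two features and their values are invented for illustration; the actual flow statistics come from memory introspection in the original work):

```python
def dist2(p, q):
    """Squared Euclidean distance between two feature vectors."""
    return sum((a - b) ** 2 for a, b in zip(p, q))

def kmeans(points, k, iters=10):
    """Minimal k-means: deterministic farthest-first initialization,
    then Lloyd iterations. Each resulting cluster is a candidate
    'behavior' grouping similar process flows."""
    centers = [points[0]]
    while len(centers) < k:
        centers.append(max(points, key=lambda p: min(dist2(p, c) for c in centers)))
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda j: dist2(p, centers[j]))].append(p)
        centers = [tuple(sum(col) / len(cl) for col in zip(*cl)) if cl else centers[j]
                   for j, cl in enumerate(clusters)]
    return centers, clusters
```

    Given flow vectors such as (syscall rate, memory pages touched) per window, well-separated behaviors fall into distinct clusters that can serve as a baseline; new flows far from all centers would be candidates for investigation.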

    LCPR: High Performance Compression Algorithm for Lattice-Based Signatures

    Many lattice-based signature schemes have been proposed in recent years. However, all of them suffer from huge signature sizes as compared to their classical counterparts. We present a novel and generic construction of a lossless compression algorithm for Schnorr-like signatures utilizing publicly accessible randomness. Conceptually, exploiting public randomness in order to reduce the signature size has never been considered in cryptographic applications. We illustrate the applicability of our compression algorithm using the example of a current state-of-the-art signature scheme due to Gentry et al. (GPV scheme) instantiated with the efficient trapdoor construction from Micciancio and Peikert. This scheme benefits from increasing the main security parameter n, which is positively correlated with the compression rate measuring the amount of storage savings. For instance, GPV signatures admit improvement factors of approximately lg n, implying compression rates of about 65% at a security level of about 100 bits, without suffering loss of information or decrease in security, meaning that the original signature can always be recovered from its compressed state. As a further result, we propose a multi-signer compression strategy for the case where more than one signer agrees to share the same source of public randomness. Such a strategy of bundling compressed signatures together into an aggregate has many advantages over the single-signer approach.

    Computer Science & Technology Series : XXI Argentine Congress of Computer Science. Selected papers

    CACIC’15 was the 21st Congress in the CACIC series. It was organized by the School of Technology at the UNNOBA (North-West of Buenos Aires National University) in Junín, Buenos Aires. The Congress included 13 Workshops with 131 accepted papers, 4 Conferences, 2 invited tutorials, different meetings related to Computer Science Education (Professors, PhD students, Curricula), and an International School with 6 courses. CACIC 2015 was organized following the traditional Congress format, with 13 Workshops covering a diversity of dimensions of Computer Science Research. Each topic was supervised by a committee of 3-5 chairs from different Universities. The call for papers attracted a total of 202 submissions. An average of 2.5 review reports were collected for each paper, for a grand total of 495 review reports involving about 191 different reviewers. A total of 131 full papers, involving 404 authors and 75 Universities, were accepted, and 24 of them were selected for this book. Red de Universidades con Carreras en Informática (RedUNCI).

    Large-scale Wireless Local-area Network Measurement and Privacy Analysis

    The edge of the Internet is increasingly becoming wireless. Understanding the wireless edge is therefore important for understanding the performance and security aspects of the Internet experience. This need is especially pressing for enterprise-wide wireless local-area networks (WLANs), as organizations increasingly depend on WLANs for mission-critical tasks. To study a live production WLAN, especially a large-scale network, is a difficult undertaking. Two fundamental difficulties involved are (1) building a scalable network measurement infrastructure to collect traces from a large-scale production WLAN, and (2) preserving user privacy while sharing these collected traces with the network research community. In this dissertation, we present our experience in designing and implementing one of the largest distributed WLAN measurement systems in the United States, the Dartmouth Internet Security Testbed (DIST), with a particular focus on our solutions to the challenges of efficiency, scalability, and security. We also present an extensive evaluation of the DIST system. To understand the severity of some potential trace-sharing risks for an enterprise-wide large-scale wireless network, we conduct a privacy analysis on one kind of wireless network trace, a user-association log, collected from a large-scale WLAN. We introduce a machine-learning based approach that can extract and quantify sensitive information from a user-association log, even though it is sanitized. Finally, we present a case study that evaluates the tradeoff between utility and privacy in WLAN trace sanitization.
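
    The kind of risk at stake can be illustrated with a toy re-identification attack: even when user identifiers in an association log are replaced by pseudonyms, the set of access points a user associates with acts as a behavioral fingerprint. The sketch below matches fingerprints by Jaccard similarity; it is far simpler than the dissertation's machine-learning approach, and all names are invented:

```python
def fingerprint(log):
    """Map each user ID in an association log (a list of
    (user, access_point) records) to the set of APs it visited."""
    fp = {}
    for user, ap in log:
        fp.setdefault(user, set()).add(ap)
    return fp

def jaccard(a, b):
    """Jaccard similarity of two AP sets."""
    return len(a & b) / len(a | b)

def reidentify(known_fp, sanitized_fp):
    """Match each sanitized pseudonym to the known user whose
    AP-visit fingerprint is most similar."""
    return {pseud: max(known_fp, key=lambda u: jaccard(aps, known_fp[u]))
            for pseud, aps in sanitized_fp.items()}
```

    Even coarse sanitization that preserves per-user association patterns can leave this fingerprint intact, which is why the utility-privacy tradeoff of trace sanitization needs quantitative evaluation.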