
    Emulation of TPM on Raspberry Pi

    The Trusted Platform Module (TPM) is a dedicated microprocessor designed to secure hardware by integrating cryptographic keys into the non-volatile memory of the module. The TPM is specified by the Trusted Computing Group (TCG), an initiative started in 2003 by several multinational semiconductor and IT companies to develop standards for Trusted Computing, in which hardware is used to provide security support to software. The TPM is typically connected to the LPC bus on the motherboard of a PC and can be used to create and store cryptographic keys, generate random numbers, compute hash values, and encrypt data. The purpose of this thesis is to develop a TPM learning environment and a laboratory manual for introductory courses in computer security, in which students learn about the functionalities of the TPM as a means to secure hardware. The functions of the TPM are emulated on the ARM-based single-board computer Raspberry Pi, developed by the Raspberry Pi Foundation. The TPM commands are executed from a PC that connects to the Raspberry Pi remotely over TCP. Several exercises related to the TPM and its functionalities are provided as an appendix to this report; they are intended for students or others interested in Trusted Computing. The report also provides exercises on creating TPM applications using TSS (Trusted Computing Software Stack).
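    As a hedged illustration of the remote-execution setup described above, the sketch below sends a single raw TPM 2.0 GetRandom command to a software TPM over TCP. The hostname, the port, and the assumption that the emulator accepts plain command/response buffers on its data socket are placeholders, not details taken from the thesis (which does not state a TPM version or wire framing).

```python
# Minimal sketch: send a raw TPM2_GetRandom command to a TPM emulator over TCP.
# Assumes the emulator (e.g. one running on the Raspberry Pi) accepts plain
# TPM 2.0 command/response buffers on its data port; host and port are placeholders.
import socket
import struct

TPM_HOST = "raspberrypi.local"   # hypothetical hostname of the Pi
TPM_PORT = 2321                  # hypothetical data port of the emulator

# TPM2_GetRandom: tag=TPM_ST_NO_SESSIONS (0x8001), commandSize=12,
# commandCode=TPM_CC_GetRandom (0x0000017B), bytesRequested=16
command = struct.pack(">HIIH", 0x8001, 12, 0x0000017B, 16)

with socket.create_connection((TPM_HOST, TPM_PORT)) as sock:
    sock.sendall(command)
    response = sock.recv(4096)

# Response header: tag (2 bytes), responseSize (4 bytes), responseCode (4 bytes)
tag, size, rc = struct.unpack(">HII", response[:10])
print(f"responseCode=0x{rc:08X}")
if rc == 0:  # TPM_RC_SUCCESS: a TPM2B buffer with the random bytes follows
    rand_len = struct.unpack(">H", response[10:12])[0]
    print("random bytes:", response[12:12 + rand_len].hex())
```

    Many simulators wrap commands in an extra simulator-specific envelope, and a TPM 1.2 emulator would use a different command encoding entirely, so treat this only as a sketch of the "command from PC, response from Pi over TCP" idea.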

    The effects of transport infrastructure on regional economic development: A simulated spatial overlapping generations model with heterogeneous skill

    JTLU vol 5, no 2, pp 77-101 (2012). As a result of public investment, lower freight transport costs tend to translate into lower local price indices and are associated with equilibria characterized by higher output and consumption. In this paper we investigate an additional effect beyond these trade gains, namely the gains from better spatial matching in the labor market. We simulate a two-region spatial OLG model in which agents are heterogeneous in terms of skill. Through repeated simulation experiments, we show that, when household relocation frictions are high, the possibility of interregional commuting can be seen as an alternative way to realize the potential matching effects. For high levels of skill heterogeneity and a plausible parametric input, a steady state in which labor matching is realized through commuting can be associated with up to 10% higher per capita output than a steady state with homogeneous labor, in which only gains from trade are feasible.

    An empirical study on aggregation of alternatives and its influence on prediction in car type choice models

    Assessing and predicting car type choices are important for policy analysis. Car type choice models are often based on aggregate alternatives, because analysts typically do not observe choices at the detailed level at which they are made. In this paper, we use registry data on all new car purchases in Sweden over two years, in which cars are observed by brand, model, and fuel type. The choices, however, are made at a more detailed level, so an aggregate (observed) alternative can correspond to several disaggregate (detailed) alternatives. We present an extensive empirical study analyzing estimation results, in-sample and out-of-sample fit, and prediction performance for five model specifications that use different aggregation methods from the literature. We propose a specification of a two-level nested logit model that captures correlation between aggregate and disaggregate alternatives; the nest-specific scale parameters are defined as parameterized exponential functions to keep the number of parameters reasonable. The results show that in-sample and out-of-sample fit as well as prediction performance differ across the specifications. The best model accounts for the heterogeneity over disaggregate alternatives as well as the correlation between disaggregate and aggregate alternatives, and it outperforms the commonly used aggregation method of simply including a size measure.
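    To make the two-level nested logit machinery concrete, the toy sketch below computes choice probabilities for a handful of hypothetical disaggregate car variants grouped into nests, with each nest scale produced by an exponential function of a nest attribute. The alternatives, utilities, attributes, and functional form are invented for illustration and are not the specification estimated in the paper.

```python
# Toy two-level nested logit: P(j) = P(nest m) * P(j | m).
# All numbers, and the lambda(z) = exp(theta * z) scale function, are illustrative only.
import math

# Hypothetical disaggregate alternatives grouped into aggregate nests,
# each with a deterministic utility V_j.
nests = {
    "diesel_variants": {"D4": 1.2, "D3": 0.9},
    "petrol_variants": {"T4": 1.0, "T5": 1.1, "T6": 0.7},
}
nest_attribute = {"diesel_variants": 2.0, "petrol_variants": 3.0}  # e.g. log nest size
theta = -0.2  # made-up parameter of the scale function

def nest_scale(z):
    # Parameterized exponential keeps the nest scale strictly positive.
    return math.exp(theta * z)

# Inclusive values (logsums) per nest, scaled by the nest parameter.
logsums, scales = {}, {}
for m, alts in nests.items():
    lam = nest_scale(nest_attribute[m])
    scales[m] = lam
    logsums[m] = lam * math.log(sum(math.exp(v / lam) for v in alts.values()))

# Upper-level nest probabilities, then within-nest probabilities.
denom = sum(math.exp(iv) for iv in logsums.values())
for m, alts in nests.items():
    p_nest = math.exp(logsums[m]) / denom
    within_denom = sum(math.exp(v / scales[m]) for v in alts.values())
    for j, v in alts.items():
        p_j = p_nest * math.exp(v / scales[m]) / within_denom
        print(f"{j}: {p_j:.3f}")
```

    In an estimated model the scale parameters would be constrained and estimated jointly with the utility coefficients; here they are fixed simply to show the two-level probability structure.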

    Including time in a travel demand model using dynamic discrete choice

    Activity-based travel demand models are based on the idea that travel is derived from the demand to participate in different activities. Predicting travel demand should therefore include predicting the demand for activity participation. Time-space constraints, such as working hours, restrict when and where different activities can be conducted and play an important role in determining how people choose to travel. Travelling is seen as a possibly costly link between different activities that also implicitly leads to missed opportunities for activity participation. With a microeconomic foundation, activity-based models can further be used for appraisal and for accessibility measures. However, most models to date lack dynamic consistency, which, e.g., makes it hard to capture the trade-off between activity decisions at different times of the day. In this paper, we show how dynamic discrete choice theory can be used to formulate a travel demand model that includes the choice of departure time for all trips, as well as the number of trips, location, purpose, and mode of transport. We estimate the model on travel diaries and show that it is able to reproduce the distributions of, e.g., the number of trips per day, departure times, and travel times.
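    The core computational idea behind such a model, a value function computed by backward induction with logsum expectations over i.i.d. Gumbel shocks, can be illustrated with a deliberately tiny example. Everything below (the stages, activities, utilities, and the absence of activity-dependent state transitions) is invented for illustration and is far simpler than the model estimated in the paper.

```python
# Toy sketch of the logsum (expected-maximum-utility) recursion behind a
# dynamic discrete choice model of daily scheduling.
import math

T = 4                      # decision stages during the day
actions = ["stay", "work", "shop"]
beta = 1.0                 # no discounting within the day

def utility(t, action):
    # Made-up instantaneous utilities; a real model would use estimated parameters
    # and would depend on location, mode, and past choices as well as time.
    return {"stay": 0.0, "work": 1.0 if t < 2 else 0.2, "shop": 0.5}[action]

# Backward induction: with i.i.d. Gumbel shocks, the expected value of the
# optimal continuation at each stage is the logsum over actions.
EV = [0.0] * (T + 1)       # EV[T] = 0: end of day
for t in reversed(range(T)):
    EV[t] = math.log(sum(math.exp(utility(t, a) + beta * EV[t + 1]) for a in actions))

def choice_probs(t):
    # Dynamic logit choice probabilities at stage t.
    scores = {a: math.exp(utility(t, a) + beta * EV[t + 1]) for a in actions}
    total = sum(scores.values())
    return {a: s / total for a, s in scores.items()}

print("expected day value:", round(EV[0], 3))
print("stage-0 choice probabilities:",
      {a: round(p, 3) for a, p in choice_probs(0).items()})
```

    In the full model the state would also track location and time of day and evolve with the chosen activity and mode, which is what lets the recursion capture trade-offs between activity decisions at different times of the day.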

    Random Matrix Theories in Quantum Physics: Common Concepts

    We review the development of random-matrix theory (RMT) during the last decade. We emphasize both the theoretical aspects and the application of the theory to a number of fields. These comprise chaotic and disordered systems, the localization problem, many-body quantum systems, the Calogero-Sutherland model, chiral symmetry breaking in QCD, and quantum gravity in two dimensions. The review is preceded by a brief historical survey of the developments of RMT and of localization theory since their inception. We emphasize the concepts common to the above-mentioned fields as well as the great diversity of RMT. In view of the universality of RMT, we suggest that the current development signals the emergence of a new "statistical mechanics": stochasticity and general symmetry requirements lead to universal laws not based on dynamical principles.
    Comment: 178 pages, RevTeX, 45 figures, submitted to Physics Reports

    Scalable and accurate deep learning for electronic health records

    Predictive modeling with electronic health record (EHR) data is anticipated to drive personalized medicine and improve healthcare quality. Constructing predictive statistical models typically requires extraction of curated predictor variables from normalized EHR data, a labor-intensive process that discards the vast majority of information in each patient's record. We propose a representation of patients' entire, raw EHR records based on the Fast Healthcare Interoperability Resources (FHIR) format. We demonstrate that deep learning methods using this representation are capable of accurately predicting multiple medical events from multiple centers without site-specific data harmonization. We validated our approach using de-identified EHR data from two U.S. academic medical centers with 216,221 adult patients hospitalized for at least 24 hours. In the sequential format we propose, this volume of EHR data unrolled into a total of 46,864,534,945 data points, including clinical notes. Deep learning models achieved high accuracy for tasks such as predicting in-hospital mortality (AUROC across sites 0.93-0.94), 30-day unplanned readmission (AUROC 0.75-0.76), prolonged length of stay (AUROC 0.85-0.86), and all of a patient's final discharge diagnoses (frequency-weighted AUROC 0.90). These models outperformed state-of-the-art traditional predictive models in all cases. We also present a case study of a neural-network attribution system, which illustrates how clinicians can gain some transparency into the predictions. We believe that this approach can be used to create accurate and scalable predictions for a variety of clinical scenarios, complete with explanations that directly highlight evidence in the patient's chart.
    Comment: Published version from https://www.nature.com/articles/s41746-018-0029-
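    As a rough, hedged sketch of the kind of sequential representation the abstract describes, the snippet below unrolls a few invented FHIR-style resources into one time-ordered token sequence per patient, the sort of input a sequence model could embed. The resource fields, codes, and token scheme are placeholders and not the paper's actual schema or pipeline.

```python
# Illustrative only: unroll FHIR-like resources into a time-ordered event sequence.
from datetime import datetime

fhir_resources = [
    {"resourceType": "Observation", "code": "heart-rate", "value": 92,
     "effectiveDateTime": "2018-03-01T08:15:00"},
    {"resourceType": "MedicationAdministration", "code": "vancomycin",
     "effectiveDateTime": "2018-03-01T09:00:00"},
    {"resourceType": "Condition", "code": "sepsis",
     "recordedDate": "2018-03-01T07:50:00"},
]

def event_time(res):
    # Different resource types carry their timestamp under different fields.
    ts = res.get("effectiveDateTime") or res.get("recordedDate")
    return datetime.fromisoformat(ts)

# Sort every resource by time and emit "resourceType:code" tokens, the kind of
# sequence a recurrent or attention-based model could consume after embedding.
tokens = [f'{r["resourceType"]}:{r["code"]}'
          for r in sorted(fhir_resources, key=event_time)]
print(tokens)
# ['Condition:sepsis', 'Observation:heart-rate', 'MedicationAdministration:vancomycin']
```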

    A Robust Parser-Interpreter for Jazz Chord Sequences

    Hierarchical structure similar to that associated with prosody and syntax in language can be identified in the rhythmic and harmonic progressions that underlie Western tonal music. Analysing such musical structure resembles natural language parsing: it requires the derivation of an underlying interpretation from an unstructured sequence of highly ambiguous elements, in the case of music, the notes. The task here is not merely to decide whether the sequence is grammatical, but rather to decide which among a large number of analyses it has. An analysis of this sort is a part of the cognitive processing performed by listeners familiar with a musical idiom, whether musically trained or not. Our focus is on the analysis of the structure of expectations and resolutions created by harmonic progressions. Building on previous work, we define a theory of tonal harmonic progression, which plays a role analogous to semantics in language. Our parser uses a formal grammar of jazz chord sequences, of a kind widely used for natural language processing (NLP), to map music, in the form of chord sequences used by performers, onto a representation of the structured relationships between chords. It uses statistical modelling techniques used for wide-coverage parsing in NLP to make practical parsing feasible in the face of considerable ambiguity in the grammar. Using machine learning over a small corpus of jazz chord sequences annotated with harmonic analyses, we show that grammar-based musical interpretation using simple statistical parsing models is more accurate than a baseline HMM. The experiment demonstrates that statistical techniques adapted from NLP can be profitably applied to the analysis of harmonic structure.
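    To give a flavour of grammar-based harmonic analysis, here is a minimal CKY-style chart parser over a hand-made probabilistic grammar applied to a single ii-V-I chord sequence. The nonterminal names, rules, and probabilities are invented for illustration and are not the grammar, corpus, or parser used in the paper.

```python
# Toy CKY chart parser over chord symbols with a tiny hand-made probabilistic grammar.
import math
from collections import defaultdict

# Lexical rules: chord symbol -> list of (nonterminal, log-probability).
lexical = {
    "Dm7":   [("II", math.log(1.0))],
    "G7":    [("V",  math.log(1.0))],
    "Cmaj7": [("I",  math.log(1.0))],
}
# Binary rules: (left child, right child) -> list of (parent, log-probability).
binary = {
    ("V", "I"):    [("CAD",    math.log(0.9))],  # dominant resolving to tonic
    ("II", "CAD"): [("PHRASE", math.log(0.8))],  # ii-V-I phrase
}

def cky(chords):
    n = len(chords)
    chart = defaultdict(dict)  # chart[(i, j)][label] = best log-probability for that span
    for i, chord in enumerate(chords):
        for label, lp in lexical.get(chord, []):
            chart[(i, i + 1)][label] = lp
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):           # split point
                for l_lab, l_lp in chart[(i, k)].items():
                    for r_lab, r_lp in chart[(k, j)].items():
                        for parent, rule_lp in binary.get((l_lab, r_lab), []):
                            score = l_lp + r_lp + rule_lp
                            if score > chart[(i, j)].get(parent, -math.inf):
                                chart[(i, j)][parent] = score
    return chart[(0, n)]

print(cky(["Dm7", "G7", "Cmaj7"]))  # -> {'PHRASE': log(0.72)}
```

    A wide-coverage musical grammar has far more categories and massive ambiguity per chord, which is why the paper turns to statistical parsing models to rank the competing analyses.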