
    Electronic Interface Reconstruction at Polar-Nonpolar Mott Insulator Heterojunctions

    We report on a theoretical study of the electronic interface reconstruction (EIR) induced by the polarity discontinuity at a heterojunction between a polar and a nonpolar Mott insulator, and of the two-dimensional strongly correlated electron systems (2DSCESs) which accompany the reconstruction. We derive an expression for the minimum number of polar layers required to drive the EIR, and discuss key parameters of the heterojunction system which control 2DSCES properties. The role of strong correlations in enhancing confinement at the interface is emphasized.
    Comment: 7 pages, 6 figures, some typos corrected
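    As an illustration of the kind of criterion involved, the generic polar-catastrophe estimate below balances the electrostatic potential built up across N polar unit cells against the relevant gap; it is not the expression derived in the paper, which accounts for strong correlations, and the symbols (areal charge density σ, layer spacing d, gap Δ) are our own.

```latex
% Illustration only: textbook polar-catastrophe estimate, not the paper's result.
% Alternating charged planes of areal density +/- \sigma, spaced by d, build up
% a potential V(N) = N \sigma d / (\epsilon_0 \epsilon_r) after N polar unit cells.
% Electronic reconstruction becomes favorable once e V(N) exceeds the gap \Delta:
\begin{equation}
  N_{\min} \;\approx\; \frac{\Delta \, \epsilon_0 \epsilon_r}{e \, \sigma \, d}
\end{equation}
```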

    Direction-Projection-Permutation for High Dimensional Hypothesis Tests

    Motivated by the prevalence of high dimensional low sample size datasets in modern statistical applications, we propose a general nonparametric framework, Direction-Projection-Permutation (DiProPerm), for testing high dimensional hypotheses. The method is aimed at rigorous testing of whether lower dimensional visual differences are statistically significant. Theoretical analysis under the non-classical asymptotic regime of dimension going to infinity for fixed sample size reveals that certain natural variations of DiProPerm can have very different behaviors. An empirical power study both confirms the theoretical results and suggests that DiProPerm is a powerful test in many settings. Finally, DiProPerm is applied to a high dimensional gene expression dataset.
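    As a rough illustration of the direction-projection-permutation recipe, the sketch below uses the mean-difference direction and the absolute difference of projected means as the univariate statistic; the paper studies several such choices (including classifier directions), so this particular combination is only one variant, and all names here are our own.

```python
import numpy as np

def diproperm(X1, X2, n_perm=1000, seed=0):
    """Two-sample DiProPerm-style test (sketch): mean-difference direction,
    projection onto it, and a label-permutation null for the projected statistic."""
    rng = np.random.default_rng(seed)
    X = np.vstack([X1, X2])
    labels = np.array([0] * len(X1) + [1] * len(X2))

    def stat(labels):
        a, b = X[labels == 0], X[labels == 1]
        w = b.mean(axis=0) - a.mean(axis=0)          # direction
        w = w / np.linalg.norm(w)
        return abs(b @ w - (a @ w).mean() * 0 + (b @ w).mean() - (a @ w).mean()) \
            if False else abs((b @ w).mean() - (a @ w).mean())  # projected mean gap

    t_obs = stat(labels)
    # Recompute the direction and statistic under each label permutation
    t_perm = np.array([stat(rng.permutation(labels)) for _ in range(n_perm)])
    p_value = (1 + np.sum(t_perm >= t_obs)) / (1 + n_perm)
    return t_obs, p_value

# Example on synthetic HDLSS data: d = 1000, n = 20 per group
rng = np.random.default_rng(1)
X1 = rng.normal(0.0, 1.0, size=(20, 1000))
X2 = rng.normal(0.3, 1.0, size=(20, 1000))
print(diproperm(X1, X2))
```

    The key design point is that the direction is recomputed for every permuted labelling, so the null distribution reflects the full direction-projection pipeline rather than a fixed projection.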

    Scaling Behavior of the Activated Conductivity in a Quantum Hall Liquid

    We propose a scaling model for the universal longitudinal conductivity near the mobility edge for the integer quantum Hall liquid. We fit our model to available experimental data on exponentially activated conductance near the Landau level tails in the integer quantum Hall regime. We obtain quantitative agreement between our scaling model and the experimental data over a wide temperature and magnetic field range.
    Comment: 9 pages, LaTeX, 2 figures (available upon request), #phd0
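    The abstract does not give the explicit scaling function, so the sketch below simply fits a generic Arrhenius-activated form sigma_xx(T) = sigma0 * exp(-Delta/T) (Delta in kelvin) with scipy.optimize.curve_fit, as an illustration of the kind of activated-conductance fit described; the data and parameter values are synthetic and not from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def activated(T, sigma0, delta):
    """Arrhenius form for the activated longitudinal conductance:
    sigma_xx(T) = sigma0 * exp(-delta / T), with delta in kelvin."""
    return sigma0 * np.exp(-delta / T)

# Synthetic "experimental" data at one magnetic field (illustration only)
rng = np.random.default_rng(0)
T = np.linspace(0.5, 4.0, 12)                          # temperature, K
sigma_true = activated(T, sigma0=1.0, delta=5.0)       # in units of e^2/h
sigma_obs = sigma_true * rng.normal(1.0, 0.05, T.size) # add 5% noise

popt, _ = curve_fit(activated, T, sigma_obs, p0=(0.5, 3.0))
print(f"fitted prefactor = {popt[0]:.2f} e^2/h, activation gap = {popt[1]:.2f} K")
```

    Repeating such a fit at each magnetic field yields the prefactor and activation gap as functions of field, which is the kind of dataset a scaling model for the conductivity near the mobility edge would be compared against.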

    Data augmentation and semi-supervised learning for deep neural networks-based text classifier

    User feedback is essential for understanding user needs. In this paper, we use free text obtained from a survey on sleep-related issues to build a deep neural network-based text classifier. However, training the deep neural network model requires a large amount of labelled data. To reduce manual data labelling, we propose a method that combines data augmentation and pseudo-labelling: data augmentation is applied to the labelled data to increase the size of the initial training set, and the trained model is then used to annotate unlabelled data with pseudo-labels. The results show that the model with data augmentation achieves a macro-averaged F1 score of 65.2% using 4,300 training examples, whereas the model without data augmentation achieves a macro-averaged F1 score of 68.2% with around 14,000 training examples. Furthermore, with the addition of pseudo-labelling, the model achieves a macro-averaged F1 score of 62.7% using only 1,400 labelled training examples. In other words, the proposed method reduces the amount of labelled data needed for training while achieving relatively good performance.
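    As an illustration of the augment-then-pseudo-label procedure described above, the sketch below substitutes a TF-IDF plus logistic-regression classifier and a toy word-dropout augmentation for the paper's deep neural network and its augmentation technique (which is not specified in the abstract); the confidence threshold and all names are assumptions.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def augment(texts, labels, rng):
    """Toy augmentation: drop one random word per text, keeping the label.
    Stand-in for whatever augmentation the paper actually uses."""
    aug_texts, aug_labels = [], []
    for t, y in zip(texts, labels):
        words = t.split()
        if len(words) > 1:
            drop = rng.integers(len(words))
            aug_texts.append(" ".join(w for i, w in enumerate(words) if i != drop))
            aug_labels.append(y)
    return texts + aug_texts, labels + aug_labels

def train_with_pseudo_labels(labelled, unlabelled, threshold=0.9, seed=0):
    """labelled = (list of texts, list of labels); unlabelled = list of texts."""
    rng = np.random.default_rng(seed)
    texts, labels = augment(*labelled, rng)                  # 1. augment labelled data
    model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
    model.fit(texts, labels)                                 # 2. train initial model

    proba = model.predict_proba(unlabelled)                  # 3. pseudo-label the rest
    confident = proba.max(axis=1) >= threshold
    pseudo_texts = [t for t, keep in zip(unlabelled, confident) if keep]
    pseudo_labels = list(model.classes_[proba.argmax(axis=1)[confident]])

    model.fit(texts + pseudo_texts, labels + pseudo_labels)  # 4. retrain on the union
    return model
```

    Only unlabelled texts whose predicted class probability clears the (assumed) confidence threshold receive pseudo-labels, which is a common safeguard against reinforcing the initial model's mistakes.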