
    Electron Monte Carlo Simulations of Nanoporous Si Thin Films -- The Influence of Pore-Edge Charges

    Electron transport within nanostructures is important to a variety of engineering applications, such as thermoelectrics and nanoelectronics. In theoretical studies, electron Monte Carlo simulations are widely used as an alternative approach to solving the electron Boltzmann transport equation, since the energy-dependent electron scattering, exact structure shape, and detailed electric field distribution can be fully incorporated. In this work, such electron Monte Carlo simulations are employed to predict the electrical conductivity of periodic nanoporous Si films that have been widely studied for thermoelectric applications. The focus is on the influence of pore-edge charges on the electron transport. The results are further compared to our previous modeling [Hao et al., J. Appl. Phys. 121, 094308 (2017)], where the pore-edge electric field is assigned its own scattering rate, added to the scattering rates of the other mechanisms.
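
    To make the simulation approach above concrete, here is a minimal single-particle sketch of the self-scattering (rejection) variant of electron Monte Carlo: free flights in an applied field interrupted by randomly selected, energy-dependent scattering events. The effective mass, field strength, and scattering-rate forms are placeholder assumptions for illustration, not parameters or mechanisms from the paper.

```python
import numpy as np

# Minimal single-particle electron Monte Carlo sketch (illustrative only).
# The band model, field, and scattering rates below are placeholder
# assumptions, not parameters from the paper.

q = 1.602e-19               # electron charge magnitude (C)
m_eff = 0.26 * 9.109e-31    # assumed Si-like effective mass (kg)
E_field = 1.0e5             # assumed uniform electric field along x (V/m)
gamma0 = 3.0e13             # self-scattering upper bound on the total rate (1/s)

def scattering_rates(energy_J):
    """Toy energy-dependent rates (1/s) for two assumed mechanisms."""
    e_eV = max(energy_J / q, 1e-9)
    acoustic = 1.0e13 * np.sqrt(e_eV)      # grows ~ sqrt(E)
    impurity = 5.0e12 / (1.0 + e_eV)       # decays with energy
    return np.array([acoustic, impurity])

rng = np.random.default_rng(0)
v = np.zeros(3)                            # electron velocity (m/s)
t_total = drift_integral = 0.0

for _ in range(100_000):
    dt = -np.log(rng.random()) / gamma0    # free-flight duration
    v[0] += (-q) * E_field / m_eff * dt    # accelerate in the applied field
    t_total += dt
    drift_integral += v[0] * dt

    energy = 0.5 * m_eff * v.dot(v)
    rates = scattering_rates(energy)
    if rng.random() * gamma0 < rates.sum():    # real (not self-) scattering:
        speed = np.linalg.norm(v)              # randomize direction isotropically,
        direction = rng.normal(size=3)         # keeping the energy (elastic toy model)
        v = speed * direction / np.linalg.norm(direction)

mobility = abs(drift_integral / t_total) / E_field
print(f"toy drift mobility ~ {mobility * 1e4:.1f} cm^2/(V*s)")
```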

    Precise photoproduction of the charged top-pions at the LHC with forward detector acceptances

    We study the photoproduction of the charged top-pion predicted by the top triangle moose (TTM) model (a deconstructed version of the topcolor-assisted technicolor TC2 model) via the process $pp \rightarrow p\gamma p \rightarrow \pi^\pm_t t + X$ at the 14 TeV Large Hadron Collider (LHC), including next-to-leading order (NLO) QCD corrections. Our results show that the production cross sections and distributions are sensitive to the free parameters $\sin\omega$ and $M_{\pi_t}$. The typical QCD correction is $7\% \sim 11\%$ and does not depend much on $\sin\omega$ or on the forward detector acceptances. Comment: 21 pages, 7 figures. arXiv admin note: text overlap with arXiv:1201.4364 by another author
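
    For orientation only, the quoted NLO QCD correction can be expressed as a K-factor relating the NLO and LO cross sections; the relation below is simple bookkeeping around the 7%-11% range stated in the abstract, not an independent calculation.

```latex
% K-factor bookkeeping for the quoted NLO QCD correction (illustrative only):
% sigma_NLO = K * sigma_LO, with relative correction delta = K - 1.
\begin{align}
  \sigma_{\mathrm{NLO}} &= K\,\sigma_{\mathrm{LO}} = (1+\delta)\,\sigma_{\mathrm{LO}}, \\
  \delta &= \frac{\sigma_{\mathrm{NLO}} - \sigma_{\mathrm{LO}}}{\sigma_{\mathrm{LO}}}
    \;\simeq\; 7\%\text{--}11\%
    \quad\text{for } pp \rightarrow p\gamma p \rightarrow \pi^{\pm}_t\, t + X .
\end{align}
```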

    Multiple Change-point Detection: a Selective Overview

    Very long and noisy sequence data arise in fields ranging from the biological sciences to the social sciences, including high-throughput data in genomics and stock prices in econometrics. Often such data are collected in order to identify and understand shifts in trend, e.g., from a bull market to a bear market in finance, or from a normal number of chromosome copies to an excessive number in genetics. Thus, identifying multiple change points in a long, possibly very long, sequence is an important problem. In this article, we review both classical and new multiple change-point detection strategies. Given the long history and extensive literature on change-point detection, we provide an in-depth discussion of the normal mean change-point model from the perspectives of regression analysis, hypothesis testing, consistency, and inference. In particular, we present a strategy for gathering and aggregating local information for change-point detection that has become the cornerstone of several emerging methods because of its attractive computational and theoretical properties. Comment: 26 pages, 2 figures
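
    As one concrete example of a local-information strategy of the kind mentioned above (a plain sliding-window scan for the normal mean model, not necessarily the specific method developed in the review), the sketch below scores each position by a standardized difference of local means and keeps well-separated peaks above a threshold. The window half-width and threshold are assumed illustrative values.

```python
import numpy as np

def local_scan_changepoints(x, h=25, threshold=3.0):
    """Score each index by a two-sample difference of local means over a
    window of half-width h, then keep isolated peaks above `threshold`.
    A simple local-scan sketch for the normal mean change-point model."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    scores = np.zeros(n)
    for t in range(h, n - h):
        left, right = x[t - h:t], x[t:t + h]
        # standardized difference of local means (pooled local std)
        pooled = np.sqrt((left.var(ddof=1) + right.var(ddof=1)) / 2) + 1e-12
        scores[t] = abs(right.mean() - left.mean()) / (pooled * np.sqrt(2.0 / h))
    # keep local maxima above the threshold, separated by at least h points
    candidates = [t for t in range(h, n - h)
                  if scores[t] > threshold
                  and scores[t] == scores[t - h:t + h + 1].max()]
    return candidates, scores

# Toy usage: three segments with mean shifts at indices 200 and 350.
rng = np.random.default_rng(1)
y = np.concatenate([rng.normal(0, 1, 200),
                    rng.normal(2, 1, 150),
                    rng.normal(-1, 1, 150)])
cps, _ = local_scan_changepoints(y)
print("estimated change points:", cps)
```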

    Fairness of machine learning applications in criminal justice: Insights from evaluation of COMPAS

    Machine learning has been widely applied to facilitate high-stakes decision making; however, there is increasing concern about hidden biases behind these methodologies. In the criminal justice context, there is a lasting debate on the fairness of Correctional Offender Management Profiling for Alternative Sanctions (COMPAS), which uses a Random Forest as the foundation for recidivism risk predictions. The fairness of the algorithm is generally measured by two different criteria: calibration and equalized odds. In this research, we trained a Random Forest classifier and examined why its application falls short of achieving fairness under these different measures. Results show that neither criterion was satisfied across all six racial groups in the COMPAS data set, which calls for further evaluation of the algorithm design and more effort toward a universal definition and measurement standard for algorithmic fairness.
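
    To make the two fairness criteria concrete, here is a hedged sketch that trains a Random Forest and reports, per group, true/false positive rates (for equalized odds) and positive predictive value (a simple calibration proxy). The column names and the synthetic demo data are hypothetical placeholders, not the actual COMPAS schema or the authors' pipeline.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def group_fairness_report(df, feature_cols, label_col="reoffended",
                          group_col="race", threshold=0.5):
    """Per-group TPR/FPR (equalized odds) and PPV (calibration proxy)
    for a Random Forest risk score thresholded at `threshold`."""
    X_tr, X_te, y_tr, y_te, g_tr, g_te = train_test_split(
        df[feature_cols], df[label_col], df[group_col],
        test_size=0.3, random_state=0, stratify=df[label_col])
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_tr, y_tr)
    pred = (clf.predict_proba(X_te)[:, 1] >= threshold).astype(int)

    rows = []
    for group in np.unique(g_te):
        m = (g_te == group).to_numpy()
        y, p = y_te.to_numpy()[m], pred[m]
        tpr = ((p == 1) & (y == 1)).sum() / max((y == 1).sum(), 1)  # equalized odds
        fpr = ((p == 1) & (y == 0)).sum() / max((y == 0).sum(), 1)  # equalized odds
        ppv = ((p == 1) & (y == 1)).sum() / max((p == 1).sum(), 1)  # calibration proxy
        rows.append({"group": group, "TPR": tpr, "FPR": fpr, "PPV": ppv})
    return pd.DataFrame(rows)

# Tiny synthetic demo with placeholder columns (illustration only):
rng = np.random.default_rng(0)
demo = pd.DataFrame({
    "priors_count": rng.integers(0, 10, 1000),
    "age": rng.integers(18, 70, 1000),
    "reoffended": rng.integers(0, 2, 1000),
    "race": rng.choice(["A", "B", "C"], 1000),
})
print(group_fairness_report(demo, ["priors_count", "age"]))
```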

    Scale Invariance vs. Conformal Invariance: Holographic Two-Point Functions in Horndeski Gravity

    We consider Einstein-Horndeski gravity with a negative bare cosmological constant as a holographic model to investigate whether a scale-invariant quantum field theory can exist without the full conformal invariance. Einstein-Horndeski gravity can admit two different AdS vacua. One is conformal, and the holographic two-point functions of the boundary energy-momentum tensor are the same as those obtained in Einstein gravity. The other AdS vacuum, which arises at a critical point of the coupling constants, preserves the scale invariance but not the special conformal invariance, owing to the logarithmic radial dependence of the Horndeski scalar. In addition to the transverse and traceless graviton modes, the theory admits an additional trace/scalar mode in the scale-invariant vacuum. We obtain the two-point functions of the corresponding boundary operators. We find that the trace/scalar mode gives rise to a non-vanishing two-point function, which distinguishes the scale-invariant theory from the conformal one. This two-point function vanishes in $d=2$, where the full conformal symmetry is restored. Our results indicate that a strongly coupled, scale-invariant, unitary quantum field theory may exist in $d \ge 3$ without the full conformal symmetry. The operator dual to the bulk trace/scalar mode, however, violates the dominant energy condition. Comment: LaTeX, 28 pages, comments and references added
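
    As background for why the trace correlator is the distinguishing diagnostic (standard field-theory statements, not results taken from this paper): in a fully conformal theory the stress tensor is traceless and the trace-trace correlator vanishes, while a theory with scale but not special conformal invariance can support a non-zero one whose coordinate dependence is fixed by dimensional analysis.

```latex
% Schematic forms only (general statements, not this paper's correlators):
% conformal case: traceless stress tensor, so the trace correlator vanishes;
% scale-invariant (non-conformal) case: a non-zero trace-trace correlator is
% allowed, with the power of |x| fixed by the dimension d of T.
\begin{align}
  \text{conformal:} &\quad
    \langle T^{\mu}{}_{\mu}(x)\, T^{\nu}{}_{\nu}(0) \rangle = 0, \\
  \text{scale-invariant only:} &\quad
    \langle T^{\mu}{}_{\mu}(x)\, T^{\nu}{}_{\nu}(0) \rangle
    \;\propto\; \frac{1}{|x|^{2d}} \;\neq\; 0 .
\end{align}
```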