
    PerfWeb: How to Violate Web Privacy with Hardware Performance Events

    The browser history reveals highly sensitive information about users, such as financial status, health conditions, or political views. Private browsing modes and anonymity networks are consequently important tools to preserve the privacy not only of regular users but in particular of whistleblowers and dissidents. Yet, in this work we show how a malicious application can infer opened websites from Google Chrome in Incognito mode and from Tor Browser by exploiting hardware performance events (HPEs). In particular, we analyze the browsers' microarchitectural footprint with the help of advanced machine learning techniques: k-Nearest Neighbors, Decision Trees, Support Vector Machines, and, in contrast to previous literature, also Convolutional Neural Networks. We profile 40 different websites, 30 of the top Alexa sites and 10 whistleblowing portals, on two machines featuring an Intel and an ARM processor. By monitoring retired instructions, cache accesses, and bus cycles for at most 5 seconds, we manage to classify the selected websites with a success rate of up to 86.3%. The results show that hardware performance events can clearly undermine the privacy of web users. We therefore propose mitigation strategies that impede our attacks and still allow legitimate use of HPEs.
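    As a rough sketch of the classification step described in this abstract, the code below trains a Support Vector Machine on per-website counter traces using scikit-learn. Everything here is an illustrative assumption: the traces are synthetic stand-ins for real HPE measurements (retired instructions, cache accesses, bus cycles), and the sampling rate and noise model are not taken from the paper.

```python
# Minimal sketch: classify websites from (synthetic) HPE counter traces.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_sites, traces_per_site, trace_len = 40, 50, 500  # 500 samples ~ 5 s at an assumed 100 Hz

# Model each website as a characteristic mean counter trace plus noise;
# a real attack would record these values from hardware performance events.
prototypes = rng.normal(size=(n_sites, trace_len))
X = np.vstack([p + 0.5 * rng.normal(size=(traces_per_site, trace_len))
               for p in prototypes])
y = np.repeat(np.arange(n_sites), traces_per_site)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
clf.fit(X_tr, y_tr)
print(f"held-out accuracy over {n_sites} sites: {clf.score(X_te, y_te):.3f}")
```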

    Error Reduction Program

    The details of a study to select, incorporate and evaluate the best available finite difference scheme to reduce numerical error in combustor performance evaluation codes are described. The combustor performance computer programs chosen were the two dimensional and three dimensional versions of Pratt & Whitney's TEACH code. The criteria used to select schemes required that the difference equations mirror the properties of the governing differential equation, be more accurate than the current hybrid difference scheme, be stable and economical, be compatible with TEACH codes, use only modest amounts of additional storage, and be relatively simple. The methods of assessment used in the selection process consisted of examination of the difference equation, evaluation of the properties of the coefficient matrix, Taylor series analysis, and performance on model problems. Five schemes from the literature and three schemes developed during the course of the study were evaluated. This effort resulted in the incorporation of a scheme in 3D-TEACH which is usually more accurate than the hybrid differencing method and never less accurate.
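    For readers unfamiliar with the baseline being improved upon, the sketch below applies the hybrid differencing scheme (central differencing when the cell Peclet number is below 2, upwind with diffusion dropped otherwise) to a standard one-dimensional steady convection-diffusion model problem and measures the error against the exact solution. The grid sizes, velocity, and diffusivity are illustrative assumptions, not values from the TEACH study.

```python
# Minimal sketch: hybrid differencing on u*dphi/dx = Gamma*d2phi/dx2
# on [0,1] with phi(0)=0, phi(1)=1, compared against the exact solution.
import numpy as np

def solve_hybrid(n, u=1.0, gamma=0.02):
    dx = 1.0 / n
    D, F = gamma / dx, u                    # diffusion and convection strengths
    # Hybrid coefficients: reduce to central differencing for cell
    # Peclet |F/D| < 2, and to upwind (diffusion dropped) otherwise.
    aE = max(-F, D - F / 2.0, 0.0)
    aW = max(F, D + F / 2.0, 0.0)
    aP = aE + aW
    A = np.zeros((n - 1, n - 1))
    b = np.zeros(n - 1)
    for i in range(n - 1):                  # unknowns at x_i = (i+1)*dx
        A[i, i] = aP
        if i > 0:
            A[i, i - 1] = -aW               # west neighbor (phi(0)=0 otherwise)
        if i < n - 2:
            A[i, i + 1] = -aE               # east neighbor
        else:
            b[i] = aE * 1.0                 # boundary value phi(1)=1
    return np.linalg.solve(A, b)

u, gamma = 1.0, 0.02
for n in (10, 20, 40, 80):
    x = np.linspace(0.0, 1.0, n + 1)[1:-1]
    exact = (np.exp(u * x / gamma) - 1.0) / (np.exp(u / gamma) - 1.0)
    err = np.max(np.abs(solve_hybrid(n) - exact))
    print(f"n={n:3d}  cell Peclet={u / (gamma * n):5.2f}  max error={err:.4f}")
```

    On the coarse grids the scheme switches to upwind and the boundary layer is smeared; once the cell Peclet number drops below 2 it switches to the second-order central form, which is the kind of accuracy trade-off the study's replacement schemes aimed to avoid.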

    Top Quark Production in Extended BESS Model

    We study top production at the Tevatron collider in the extended BESS model, which is an effective Lagrangian parametrization of a dynamical symmetry breaking of the electroweak symmetry. The existence of a colored octet of gauge vector bosons can increase top production at a rate still consistent with recent experimental data and lead to distortions in the transverse momentum spectrum of the top. Comment: 13 pages, LaTeX, 4 figures.

    Z' Physics

    The limits on extra neutral gauge bosons that could be reached at LEP2 are reviewed. Exclusion and discovery limits are discussed for $f\bar f$ and $WW$ production. Comment: 20 pages, LaTeX, 7 figures included by epsfig; contribution to the Proceedings of the workshop "Physics at LEP2", Geneva, 199

    Z' effects and anomalous gauge couplings at LC with polarization

    We show that the availability of longitudinally polarized electron beams at a 500 GeV Linear Collider would make it possible, through an analysis of the reaction $e^+e^- \to W^+W^-$, to set stringent bounds on the couplings of a Z' of the most general type. In addition, to some extent, it would be possible to disentangle observable effects of the Z' from analogous ones due to competitor models with anomalous trilinear gauge couplings. Comment: 22 pages, LaTeX, 6 figures available on request; revised version accepted for publication in Int. J. Mod. Phys.

    Combining deep learning and machine learning for the automatic identification of hip prosthesis failure: Development, validation and explainability analysis

    Aim: Revision hip arthroplasty has a less favorable outcome than primary total hip arthroplasty, and an understanding of the timing of total hip arthroplasty failure may be helpful. The aim of this study is to develop a combined deep learning (DL) and machine learning (ML) approach to automatically detect hip prosthetic failure from conventional plain radiographs. Methods: Two cohorts of 280 and 352 patients were included in the study, for model development and validation respectively. The analysis was based on one antero-posterior and one lateral radiographic view obtained from each patient during routine post-surgery follow-up. After pre-processing, three images were obtained: the original image, the acetabulum image and the stem image. These images were analyzed through convolutional neural networks aiming to predict prosthesis failure. Deep features of the three images were extracted for each model and two feature-based pipelines were developed: one utilizing only the features of the original image (original image pipeline) and the other concatenating the features of the three images (3-image pipeline). The obtained features were either used directly or reduced through principal component analysis. Both support vector machine (SVM) and random forest (RF) classifiers were considered for each pipeline. Results: The SVM applied to the 3-image pipeline provided the best performance, with an accuracy of 0.958 ± 0.006 in the internal validation and an F1-score of 0.874 in the external validation set. The explainability analysis, besides identifying the features of the complete original images as the major contributor, highlighted the role of the acetabulum and stem images in the prediction. Conclusions: This study demonstrated the potential of the developed DL-ML procedure based on plain radiographs in the detection of the failure of the hip prosthesis.
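    A minimal sketch of the feature-based "3-image pipeline" described above, using scikit-learn: per-view deep features are concatenated, reduced by principal component analysis, and classified with an SVM. The feature vectors here are synthetic stand-ins for CNN embeddings of the original, acetabulum, and stem images, and the feature width and PCA size are assumptions, not the study's settings.

```python
# Minimal sketch: concatenated per-view deep features -> PCA -> SVM.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_patients, feat_dim = 280, 512            # development cohort size; assumed CNN feature width
y = rng.integers(0, 2, n_patients)         # 1 = prosthesis failure, 0 = intact

def fake_features():
    # One deep-feature vector per view-specific CNN; failure shifts the mean.
    return rng.normal(size=(n_patients, feat_dim)) + 0.15 * y[:, None]

original, acetabulum, stem = fake_features(), fake_features(), fake_features()
X = np.hstack([original, acetabulum, stem])   # the "3-image pipeline" concatenation

clf = make_pipeline(StandardScaler(), PCA(n_components=50), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```

    In practice the three feature blocks would come from forward passes of the trained view-specific networks; swapping SVC for RandomForestClassifier reproduces the study's second classifier family.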

    A Next-to-Leading-Order Study of Dihadron Production

    The production of pairs of hadrons in hadronic collisions is studied using a next-to-leading-order Monte Carlo program based on the phase space slicing technique. Up-to-date fragmentation functions based on fits to LEP data are employed, together with several versions of current parton distribution functions. Good agreement is found with data for the dihadron mass distribution. A comparison is also made with data for the dihadron angular distribution. The scale dependence of the predictions and the dependence on the choices made for the fragmentation and parton distribution functions are also presented. The good agreement between theory and experiment is contrasted with the case of single $\pi^0$ production, where significant deviations between theory and experiment have been observed. Comment: 22 pages, 15 figures; 3 references added, one figure modified for clarity.