
    Quantifying Timing Leaks and Cost Optimisation

    We develop a new notion of security against timing attacks in which the attacker can simultaneously observe the execution time of a program and the probability of the values of low variables. We then show how to measure the security of a program with respect to this notion via a computable estimate of the timing leakage, and use this estimate for cost optimisation.
    Comment: 16 pages, 2 figures, 4 tables. A shorter version is included in the proceedings of ICICS'08, the 10th International Conference on Information and Communications Security, 20-22 October 2008, Birmingham, UK.
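    The paper's leakage estimate is its own construction, but the basic idea of a timing observable that leaks information about a secret can be sketched in a few lines. Everything below (the `check` function, the entropy-based measure) is an illustrative toy, not the paper's method:

```python
from collections import Counter
from math import log2

# Hypothetical password check whose running time depends on the secret:
# it compares characters one by one and stops at the first mismatch.
def check(secret: str, guess: str) -> int:
    steps = 0
    for s, g in zip(secret, guess):
        steps += 1            # each comparison costs one "time" unit
        if s != g:
            break
    return steps              # the attacker-observable execution time

# Crude stand-in for a leakage estimate: the Shannon entropy of the
# timing observable under a uniformly chosen secret. A constant-time
# check would give entropy 0 (no leak); more distinguishable timings
# mean more bits available to a timing attacker.
def timing_entropy(secrets, guess):
    times = Counter(check(s, guess) for s in secrets)
    n = sum(times.values())
    return -sum((c / n) * log2(c / n) for c in times.values())

secrets = ["abc", "abd", "axy", "zzz"]
leak = timing_entropy(secrets, "abz")   # 1.5 bits for this toy set
```

    A cost-optimising transformation in this toy setting would pad `check` to always scan the full string, driving the entropy to zero at the price of a longer worst-case runtime.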

    Probabilistic abstract interpretation: From trace semantics to DTMC’s and linear regression

    In order to perform probabilistic program analysis we need to consider probabilistic languages, or languages with a probabilistic semantics, as well as a corresponding framework for the analysis which is able to accommodate probabilistic properties and properties of probabilistic computations. To this end we investigate the relationship between three different types of probabilistic semantics for a core imperative language, namely Kozen's Fixpoint Semantics, our Linear Operator Semantics, and probabilistic versions of Maximal Trace Semantics. We also discuss the relationship between Probabilistic Abstract Interpretation (PAI) and statistical or linear regression analysis. While classical Abstract Interpretation, based on Galois connections, allows only for worst-case analyses, the use of the Moore-Penrose pseudo-inverse in PAI opens the possibility of exploiting statistical and noisy observations in order to analyse and identify various system properties.
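    The connection between the Moore-Penrose pseudo-inverse and linear regression mentioned at the end is standard: the pseudo-inverse gives the least-squares solution of an overdetermined system. A generic sketch (our toy setup, not the paper's Linear Operator Semantics):

```python
import numpy as np

# Noisy observations y of a linear system X @ w: the pseudo-inverse
# recovers the best linear description in the least-squares sense,
# the same operator PAI uses to build probabilistic abstractions.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))                   # observed inputs
true_w = np.array([2.0, -1.0])                  # ground-truth coefficients
y = X @ true_w + 0.01 * rng.normal(size=100)    # noisy measurements

w_hat = np.linalg.pinv(X) @ y   # solves min_w ||X w - y||_2
```

    Unlike a worst-case (Galois-connection) abstraction, the estimate `w_hat` degrades gracefully with noise instead of collapsing to "anything is possible".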

    The second moment of the pion's distribution amplitude

    We present preliminary results for the second moment of the pion's distribution amplitude. The lattice formulation and the phenomenological implications are briefly reviewed, with special emphasis on some subtleties that arise when the Lorentz group is replaced by the hypercubic group. Having analysed more than half of the available configurations, the result obtained is ξ²_L = 0.06 ± 0.02.
    Comment: Lattice 99 (matrix elements), 3 pages.

    An Algorithmic Approach to Quantum Field Theory

    The lattice formulation provides a way to regularize, define and compute the Path Integral in a Quantum Field Theory. In this paper we review the theoretical foundations and the most basic algorithms required to implement a typical lattice computation, including the Metropolis and Gibbs sampling algorithms and the Minimal Residual and Stabilized Biconjugate Gradient inverters. The main emphasis is on gauge theories with fermions, such as QCD. We also provide examples of typical results from lattice QCD computations for quantities of phenomenological interest.
    Comment: 44 pages, to be published in IJMP.
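    The first algorithm mentioned, Metropolis sampling, can be illustrated on a zero-dimensional "field theory" with Gaussian action S(φ) = φ²/2, far simpler than the lattice gauge computations the paper reviews but showing the same accept/reject step:

```python
import math
import random

# Minimal Metropolis sampler for the action S(phi) = phi^2 / 2,
# i.e. a standard Gaussian. Each step proposes a local update and
# accepts it with probability min(1, exp(-dS)).
def metropolis(n_steps, step=1.0, seed=1):
    rng = random.Random(seed)
    phi, samples = 0.0, []
    for _ in range(n_steps):
        prop = phi + rng.uniform(-step, step)   # propose a local update
        dS = 0.5 * (prop ** 2 - phi ** 2)       # change in the action
        if dS < 0 or rng.random() < math.exp(-dS):
            phi = prop                          # accept
        samples.append(phi)                     # (reject keeps old phi)
    return samples

samples = metropolis(50_000)
var = sum(s * s for s in samples) / len(samples)  # estimates <phi^2> = 1
```

    In a real lattice computation `phi` becomes a whole field configuration and `dS` is evaluated locally, but the accept/reject logic is exactly this.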

    Information Security as Strategic (In)effectivity

    Security of information flow is commonly understood as preventing any information leakage, regardless of how grave or harmless its consequences may be. In this work, we suggest that information security is not a goal in itself, but rather a means of preventing potential attackers from compromising the correct behavior of the system. To formalize this, we first show how two information flows can be compared by looking at the adversary's ability to harm the system. Then, we propose that the information flow in a system is effectively information-secure if it does not allow for more harm than its idealized variant based on the classical notion of noninterference.

    Mode Confinement in Photonic Quasi-Crystal Point-Defect Cavities for Particle Accelerators

    In this Letter, we present a study of the confinement properties of point-defect resonators in finite-size photonic-bandgap structures composed of aperiodic arrangements of dielectric rods, with special emphasis on their use for the design of cavities for particle accelerators. Specifically, for representative geometries, we study the properties of the fundamental mode (as a function of the filling fraction, structure size, and losses) via 2-D and 3-D full-wave numerical simulations, as well as microwave measurements at room temperature. Results indicate that, for reduced-size structures, aperiodic geometries exhibit superior confinement properties by comparison with periodic ones.
    Comment: 4 pages, 4 figures, accepted for publication in Applied Physics Letters.

    Consanguinity and polygenic diseases: a model for antibody deficiencies

    Primary immunodeficiencies represent a heterogeneous group of disorders of the immune system, predisposing to various types of infections. Among them, common variable immunodeficiency is the most common symptomatic antibody deficiency. It includes several different forms characterized by defects in the terminal stage of B lymphocyte differentiation, leading to markedly reduced immunoglobulin serum levels and increased susceptibility to bacterial infections. The clinical phenotype is complex, including autoimmunity, granulomatous inflammation, lymphoproliferative disorders and malignancies. Rare autosomal recessive mutations in a number of single genes have recently been reported. However, the underlying genetic defects remain unknown in the majority of cases. In order to seek new genes responsible for the disease, we studied a consanguineous Italian family through exome sequencing combined with homozygosity mapping. Six missense homozygous variants passed our filtering selection, and at least two of them were associated with some aspects of the pathological phenotype. Our data underscore the complexity of immune system disorders and emphasize the difficulty of interpreting genetic results and correlating them with the disease phenotype.

    Second large-scale Monte Carlo study for the Cherenkov Telescope Array

    The Cherenkov Telescope Array (CTA) represents the next generation of ground-based instruments for Very High Energy gamma-ray astronomy. It is expected to improve on the sensitivity of current instruments by an order of magnitude and provide energy coverage from 20 GeV to more than 200 TeV. In order to achieve these ambitious goals, Monte Carlo (MC) simulations play a crucial role, guiding the design of CTA. Here, results of the second large-scale MC production are reported, providing a realistic estimation of the performance of feasible array candidates for both Northern and Southern Hemisphere sites, and placing CTA capabilities in the context of the current generation of High Energy γ-ray detectors.
    Comment: In Proceedings of the 34th International Cosmic Ray Conference (ICRC2015), The Hague, The Netherlands. All CTA contributions at arXiv:1508.0589