
    Evolutionary Computation in High Energy Physics

    Evolutionary Computation is a branch of computer science with which High Energy Physics has traditionally had few connections. Its methods have nevertheless been investigated in this field, mainly for data analysis tasks. Because these methods and studies remain little known in the high energy physics community, we were motivated to prepare this lecture. The lecture presents a general overview of the main types of algorithms based on Evolutionary Computation, as well as a review of their applications in High Energy Physics.
    Comment: Lecture presented at the 2006 Inverted CERN School of Computing; to be published in the school proceedings (CERN Yellow Report)
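    As a simple illustration of the family of methods the lecture surveys, the Python sketch below runs a bare-bones genetic algorithm (selection, crossover, mutation) on a toy one-dimensional objective; the fitness function, population size and mutation scheme are arbitrary choices for this example, not taken from the lecture.

        import random

        def fitness(x):
            # Toy objective: maximise a smooth 1-D function (a stand-in for,
            # e.g., a cut-selection figure of merit in a data-analysis task).
            return -(x - 3.0) ** 2

        def evolve(pop_size=50, generations=100, mutation_sigma=0.3):
            population = [random.uniform(-10.0, 10.0) for _ in range(pop_size)]
            for _ in range(generations):
                # Selection: keep the fitter half of the population.
                population.sort(key=fitness, reverse=True)
                parents = population[: pop_size // 2]
                # Crossover: average two random parents; mutation: Gaussian noise.
                children = []
                while len(children) < pop_size - len(parents):
                    a, b = random.sample(parents, 2)
                    children.append(0.5 * (a + b) + random.gauss(0.0, mutation_sigma))
                population = parents + children
            return max(population, key=fitness)

        print(evolve())  # converges near the optimum at x = 3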

    A first unbiased global NLO determination of parton distributions and their uncertainties

    We present a determination of the parton distributions of the nucleon from a global set of hard scattering data using the NNPDF methodology: NNPDF2.0. The experimental data include deep-inelastic scattering with the combined HERA-I dataset, fixed-target Drell-Yan production, collider weak boson production, and inclusive jet production. Next-to-leading order QCD is used throughout, without resorting to K-factors. We present and utilize an improved fast algorithm for the solution of evolution equations and the computation of general hadronic processes. We introduce improved techniques for the training of the neural networks which are used as the parton parametrization, and we use a novel approach for the proper treatment of normalization uncertainties. We assess quantitatively the impact of individual datasets on PDFs. We find very good consistency of all datasets with each other and with NLO QCD, with no evidence of tension between datasets. Some PDF combinations relevant for LHC observables turn out to be determined rather more accurately than in any other parton fit.
    Comment: 86 pages, 41 figures. PDF sets available from http://sophia.ecm.ub.es/nnpdf/nnpdf_pdfsets.htm and from LHAPDF. Final version to be published in Nucl. Phys. B. Various typos corrected and small clarifications added, fig. 4 added, extended discussion of data consistency especially in sect 5.1 and 5.
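    To illustrate the idea of a neural-network parton parametrization, the Python sketch below modulates a small feed-forward network by a preprocessing factor x^(-alpha) * (1-x)^beta that enforces the expected small-x and large-x behaviour; the architecture, random weights and exponent values are illustrative assumptions, not the actual NNPDF2.0 setup.

        import numpy as np

        rng = np.random.default_rng(0)

        # Toy one-hidden-layer network f: R -> R, the kind of flexible
        # functional form used to parametrize a PDF at the initial scale.
        W1, b1 = rng.normal(size=(5, 1)), rng.normal(size=(5, 1))
        W2, b2 = rng.normal(size=(1, 5)), rng.normal(size=(1, 1))

        def net(x):
            h = np.tanh(W1 @ x + b1)
            return (W2 @ h + b2).item()

        def pdf(x, alpha=1.2, beta=3.0):
            # The preprocessing factor fixes the endpoint behaviour; the
            # network models the remaining shape. The exponents here are
            # illustrative, not fitted values.
            return x ** (-alpha) * (1.0 - x) ** beta * net(np.array([[x]]))

        print(pdf(0.1))

    In an actual fit, the network weights would be trained against the hard-scattering data rather than drawn at random as here.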

    Gene autoregulation via intronic microRNAs and its functions

    Background: MicroRNAs, post-transcriptional repressors of gene expression, play a pivotal role in gene regulatory networks. They are involved in core cellular processes, and their dysregulation is associated with a broad range of human diseases. This paper focuses on a minimal microRNA-mediated regulatory circuit, in which a protein-coding gene (host gene) is targeted by a microRNA located inside one of its introns. Results: Autoregulation via intronic microRNAs is widespread in the human regulatory network, as confirmed by our bioinformatic analysis, and can perform several regulatory tasks despite its simple topology. Our analysis, based on analytical calculations and simulations, indicates that this circuitry alters the dynamics of host gene expression, can induce complex responses implementing adaptation and Weber's law, and efficiently filters fluctuations propagating from the upstream network to the host gene. A fine-tuning of the circuit parameters can optimize each of these functions. Interestingly, they are all related to gene expression homeostasis, in agreement with the increasing evidence suggesting a role of microRNA regulation in conferring robustness to biological processes. In addition to the model analysis, we present a list of bioinformatically predicted candidate circuits in human for future experimental tests. Conclusions: The results presented here suggest a potentially relevant functional role for negative self-regulation via intronic microRNAs, in particular as a homeostatic control mechanism of gene expression. Moreover, the map of circuit functions in terms of experimentally measurable parameters resulting from our analysis can be a useful guideline for possible applications in synthetic biology.
    Comment: 29 pages and 7 figures in the main text, 18 pages of Supporting Information
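    To make the circuit concrete, here is a minimal rate-equation sketch in Python of one plausible version of the loop, in which the host mRNA and its intronic microRNA are co-transcribed and the microRNA enhances degradation of the host mRNA; the rate constants and the mass-action repression term are illustrative assumptions, not the parameters analysed in the paper.

        import numpy as np
        from scipy.integrate import solve_ivp

        # m: host mRNA, s: intronic microRNA, p: host protein.
        # Both m and s are produced at the common transcription rate k.
        k, dm, ds, dp, kp, g = 1.0, 0.1, 0.2, 0.05, 0.5, 0.8  # illustrative values

        def circuit(t, y):
            m, s, p = y
            return [k - dm * m - g * s * m,  # mRNA: transcribed, decays, repressed by miRNA
                    k - ds * s,              # miRNA: co-transcribed with its host, decays
                    kp * m - dp * p]         # protein: translated from m, decays

        sol = solve_ivp(circuit, (0.0, 200.0), [0.0, 0.0, 0.0])
        print(sol.y[:, -1])  # approximate steady-state levels of m, s, p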

    Robust multi-objective optimization of safety barriers performance parameters for NaTech scenarios risk assessment and management

    Safety barriers should be designed to bring the largest benefit, in terms of mitigation of the consequences of accidental scenarios, at the most reasonable cost. In this paper, we formulate the identification of the optimal performance parameters of the barriers, which should simultaneously allow for the mitigation of the consequences of Natural-Technological (NaTech) accidental scenarios at reasonable cost, as a Multi-Objective Optimization (MOO) problem. The MOO is solved for a literature case study consisting of a chemical facility composed of three tanks filled with flammable substances and equipped with six safety barriers (active, passive, and procedural), exposed to NaTech scenarios triggered by either severe floods or earthquakes. The performance of the barriers is evaluated by a phenomenological dynamic model that mimics the realistic response of the system. The uncertainty in the relevant parameters of the model (i.e., the response times of the active and procedural barriers and the effectiveness of the barriers) is accounted for in the optimization, to provide robust solutions. Results for this case study suggest that the NaTech risk is optimally managed by improving the performance of four of the six barriers (three active and one passive). Practical guidelines are provided for retrofitting the safety barrier design.
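    As a rough sketch of this optimization setting (not the paper's phenomenological model or its solver), the Python code below randomly samples candidate performance parameters for six barriers, scores each candidate on a toy cost and on a Monte Carlo estimate of the expected consequences under parameter uncertainty, and keeps the non-dominated (Pareto-optimal) configurations; every model and number here is a placeholder.

        import numpy as np

        rng = np.random.default_rng(1)

        def cost(params):
            # Toy cost: higher barrier performance is assumed to cost more.
            return float(params.sum())

        def expected_consequence(params, n_mc=200):
            # Toy consequence model: each barrier independently blocks the
            # scenario with its (uncertain) effectiveness; averaging over
            # Monte Carlo perturbations gives a robust, expected-value objective.
            noise = rng.normal(0.0, 0.1, size=(n_mc, params.size))
            effective = np.clip(params + noise, 0.0, 1.0)
            return float(np.mean(np.prod(1.0 - effective, axis=1)))

        def pareto_indices(objs):
            # Keep points not dominated in (cost, expected consequence), both minimised.
            return [i for i, fi in enumerate(objs)
                    if not any(np.all(fj <= fi) and np.any(fj < fi)
                               for j, fj in enumerate(objs) if j != i)]

        candidates = rng.uniform(0.0, 1.0, size=(500, 6))  # six barriers in [0, 1]
        objs = np.array([[cost(p), expected_consequence(p)] for p in candidates])
        front = pareto_indices(objs)
        print(len(front), "non-dominated barrier configurations")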

    Beyond Descartes and Newton: Recovering life and humanity

    Attempts to ‘naturalize’ phenomenology challenge both traditional phenomenology and traditional approaches to cognitive science. They challenge Edmund Husserl’s rejection of naturalism and his attempt to establish phenomenology as a foundational transcendental discipline, and they challenge efforts to explain cognition through mainstream science. While appearing to be a retreat from the bold claims made for phenomenology, it is really its triumph. Naturalized phenomenology is spearheading a successful challenge to the heritage of Cartesian dualism. This converges with the reaction against Cartesian thought within science itself. Descartes divided the universe between res cogitans, thinking substances, and res extensa, the mechanical world. The latter won with Newton, and in most of objective science since we have literally lost our mind, hence our humanity. Despite Darwin, biologists remain children of Newton and dream of a grand theory that is epistemologically complete and would allow lawful entailment of the evolution of the biosphere. This dream is no longer tenable. We now have to recognize that science and scientists are within and part of the world we are striving to comprehend, as proponents of endophysics have argued, and that physics, biology and mathematics have to be reconceived accordingly. Interpreting quantum mechanics from this perspective is shown both to illuminate conscious experience and to reveal new paths for its further development. In biology we must now justify the use of the word “function”. As we shall see, we cannot prestate the ever new biological functions that arise and constitute the very phase space of evolution. Hence, we cannot mathematize the detailed becoming of the biosphere, nor write differential equations for functional variables we do not know ahead of time, nor integrate those equations, so no laws “entail” evolution. The dream of a grand theory fails. In place of entailing laws, a post-entailing-law explanatory framework is proposed in which Actuals arise in evolution that constitute new boundary conditions that are enabling constraints, creating new, typically unprestatable, Adjacent Possible opportunities for further evolution, in which new Actuals arise, in a persistent becoming. Evolution flows into a typically unprestatable succession of Adjacent Possibles. Given the concept of function, the concept of the functional closure of an organism making a living in its world becomes central. Implications for patterns in evolution include historical reconstruction; statistical laws, such as the distribution of extinction events or of species per genus; and the use of formal-cause, rather than efficient-cause, laws.

    A Profile Likelihood Analysis of the Constrained MSSM with Genetic Algorithms

    The Constrained Minimal Supersymmetric Standard Model (CMSSM) is one of the simplest and most widely studied supersymmetric extensions to the standard model of particle physics. Nevertheless, current data do not sufficiently constrain the model parameters in a way completely independent of priors, statistical measures and scanning techniques. We present a new technique for scanning supersymmetric parameter spaces, optimised for frequentist profile likelihood analyses and based on Genetic Algorithms. We apply this technique to the CMSSM, taking into account existing collider and cosmological data in our global fit. We compare our method to the MultiNest algorithm, an efficient Bayesian technique, paying particular attention to the best-fit points and implications for particle masses at the LHC and dark matter searches. Our global best-fit point lies in the focus point region. We find many high-likelihood points in both the stau co-annihilation and focus point regions, including a previously neglected section of the co-annihilation region at large m_0. We show that there are many high-likelihood points in the CMSSM parameter space commonly missed by existing scanning techniques, especially at high masses. This has a significant influence on the derived confidence regions for parameters and observables, and can dramatically change the entire statistical inference of such scans.
    Comment: 47 pages, 8 figures; Fig. 8, Table 7 and more discussions added to Sec. 3.4.2 in response to the referee's comments; accepted for publication in JHEP
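    As a minimal illustration of the profiling step, with a toy two-parameter Gaussian likelihood standing in for the CMSSM fit and uniform random sampling standing in for the genetic-algorithm scan, the Python sketch below extracts a one-dimensional profile likelihood by keeping, in each bin of the parameter of interest, the maximum log-likelihood over all other parameters.

        import numpy as np

        rng = np.random.default_rng(2)

        # Toy scan: samples of two parameters with a Gaussian log-likelihood
        # peaked at (1, -2); a real analysis would use points from the scan.
        theta = rng.uniform(-5.0, 5.0, size=(20_000, 2))
        loglike = -0.5 * ((theta[:, 0] - 1.0) ** 2 + (theta[:, 1] + 2.0) ** 2)

        def profile(param, loglike, bins=50):
            # Bin the parameter of interest; profile out the rest by taking
            # the bin-wise maximum of the log-likelihood.
            edges = np.linspace(param.min(), param.max(), bins + 1)
            idx = np.digitize(param, edges) - 1
            prof = np.full(bins, -np.inf)
            for i, ll in zip(idx, loglike):
                if 0 <= i < bins and ll > prof[i]:
                    prof[i] = ll
            return 0.5 * (edges[:-1] + edges[1:]), prof

        centres, prof = profile(theta[:, 0], loglike)
        print(centres[np.argmax(prof)])  # best-fit value of the first parameter, near 1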