
    The productions of the top-pions and top-Higgs associated with the charm quark at the hadron colliders

    In the topcolor-assisted technicolor (TC2) model, the typical physical particles, top-pions and the top-Higgs, are predicted, and the existence of these particles could be regarded as robust evidence for the model. These particles are accessible at the Tevatron and the LHC, and furthermore the flavor-changing (FC) feature of the TC2 model provides a unique opportunity to probe them. In this paper, we study some interesting FC production processes of the top-pions and top-Higgs at the Tevatron and the LHC, i.e., $c\Pi_t^-$ and $c\Pi_t^0(h_t^0)$ production. We find that light charged top-pions are not favored by the Tevatron experiments and that the Tevatron has little capability to probe the neutral top-pion and top-Higgs via these FC production processes. At the LHC, however, the cross section can reach the level of $10\sim 100$ pb for $c\Pi_t^-$ production and $10\sim 100$ fb for $c\Pi_t^0(h_t^0)$ production, so one can expect that enough signal events could be produced at the LHC experiments. Furthermore, the SM background should be clean due to the FC nature of the processes, and the FC decay modes $\Pi_t^-\to b\bar{c}$ and $\Pi_t^0(h_t^0)\to t\bar{c}$ provide a characteristic signal with which to detect the top-pions and top-Higgs. It is therefore hopeful that the signal of the top-pions and top-Higgs can be found via these FC processes with the running of the LHC. Comment: 12 pages, 6 figures

    Methods for interpreting lists of affected genes obtained in a DNA microarray experiment

    Background - The aim of this paper was to describe and compare the methods used and the results obtained by the participants in a joint EADGENE (European Animal Disease Genomics Network of Excellence) and SABRE (Cutting Edge Genomics for Sustainable Animal Breeding) workshop focusing on post-analysis of microarray data. The participating groups were provided with identical lists of microarray probes, including test statistics for three different contrasts and the normalised log-ratios for each array, to be used as the starting point for interpreting the affected probes. The data originated from a microarray experiment conducted to study the host reactions in broilers occurring shortly after a secondary challenge with either a homologous or heterologous species of Eimeria. Results - Several conceptually different analytical approaches, using both commercial and publicly available software, were applied by the participating groups. The following tools were used: Ingenuity Pathway Analysis, MAPPFinder, LIMMA, GOstats, GOEAST, GOTM, Globaltest, TopGO, ArrayUnlock, Pathway Studio, GIST and AnnotationDbi. The main focus of the approaches was to utilise the relations between probes/genes and their gene ontology annotations and pathways to interpret the affected probes/genes. The lack of a well-annotated chicken genome did, however, limit the possibilities to fully explore the tools. The main results of these analyses showed that the biological interpretation is highly dependent on the statistical method used, but that some common biological conclusions could be reached. Conclusion - It is highly recommended to test different analytical methods on the same data set and to compare the results in order to obtain a reliable biological interpretation of the affected genes in a DNA microarray experiment.
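    Several of the enrichment tools named above (GOstats and TopGO, for example) build on the same core statistic: a one-sided hypergeometric (Fisher) over-representation test asking whether a gene ontology term appears in the affected-gene list more often than expected by chance. The Python sketch below shows that test in isolation; all counts are hypothetical placeholders, not values from the workshop data, and the real tools differ mainly in how they handle the GO graph structure and multiple-testing correction.

    ```python
    # Minimal sketch of a GO-term over-representation test (hypergeometric /
    # one-sided Fisher test), the basic statistic behind tools like GOstats.
    # All counts below are hypothetical placeholders, not workshop results.
    from scipy.stats import hypergeom

    total_genes = 20000        # annotated genes/probes on the array (background)
    term_genes = 300           # genes annotated with the GO term of interest
    affected_genes = 450       # genes declared "affected" in one contrast
    affected_with_term = 18    # overlap between the affected list and the term

    # P(X >= affected_with_term) when drawing affected_genes genes without
    # replacement from a background containing term_genes annotated genes.
    p_value = hypergeom.sf(affected_with_term - 1, total_genes,
                           term_genes, affected_genes)
    print(f"Over-representation p-value for this GO term: {p_value:.2e}")
    ```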

    Astronomical Distance Determination in the Space Age: Secondary Distance Indicators

    The formal division of distance indicators into primary and secondary leads to difficulties in describing methods that can actually be used in two ways: with, and without, the support of other methods for scaling. Thus, instead of concentrating on the scaling requirement, we concentrate on all methods of distance determination to extragalactic sources that are designated, at least formally, for use on individual sources. Among those, Supernovae Ia are clearly the leader due to their enormous success in determining the expansion rate of the Universe. However, new methods are developing rapidly, and there is also progress in the more traditional methods. We give a general overview of the methods, but we mostly concentrate on the most recent developments in each field and on future expectations. © 2018, The Author(s)

    Volume I. Introduction to DUNE

    The preponderance of matter over antimatter in the early universe, the dynamics of the supernovae that produced the heavy elements necessary for life, and whether protons eventually decay—these mysteries at the forefront of particle physics and astrophysics are key to understanding the early evolution of our universe, its current state, and its eventual fate. The Deep Underground Neutrino Experiment (DUNE) is an international world-class experiment dedicated to addressing these questions as it searches for leptonic charge-parity symmetry violation, stands ready to capture supernova neutrino bursts, and seeks to observe nucleon decay as a signature of a grand unified theory underlying the standard model. The DUNE far detector technical design report (TDR) describes the DUNE physics program and the technical designs of the single- and dual-phase DUNE liquid argon TPC far detector modules. This TDR is intended to justify the technical choices for the far detector that flow down from the high-level physics goals through requirements at all levels of the Project. Volume I contains an executive summary that introduces the DUNE science program, the far detector and the strategy for its modular designs, and the organization and management of the Project. The remainder of Volume I provides more detail on the science program that drives the choice of detector technologies and on the technologies themselves. It also introduces the designs for the DUNE near detector and the DUNE computing model, for which DUNE is planning design reports. Volume II of this TDR describes DUNE's physics program in detail. Volume III describes the technical coordination required for the far detector design, construction, installation, and integration, and its organizational structure. Volume IV describes the single-phase far detector technology. A planned Volume V will describe the dual-phase technology.

    Deep Underground Neutrino Experiment (DUNE), far detector technical design report, volume III: DUNE far detector technical coordination

    The preponderance of matter over antimatter in the early universe, the dynamics of the supernovae that produced the heavy elements necessary for life, and whether protons eventually decay—these mysteries at the forefront of particle physics and astrophysics are key to understanding the early evolution of our universe, its current state, and its eventual fate. The Deep Underground Neutrino Experiment (DUNE) is an international world-class experiment dedicated to addressing these questions as it searches for leptonic charge-parity symmetry violation, stands ready to capture supernova neutrino bursts, and seeks to observe nucleon decay as a signature of a grand unified theory underlying the standard model. The DUNE far detector technical design report (TDR) describes the DUNE physics program and the technical designs of the single- and dual-phase DUNE liquid argon TPC far detector modules. Volume III of this TDR describes how the activities required to design, construct, fabricate, install, and commission the DUNE far detector modules are organized and managed. This volume details the organizational structures that will carry out and/or oversee the planned far detector activities safely, successfully, on time, and on budget. It presents overviews of the facilities, supporting infrastructure, and detectors for context, and it outlines the project-related functions and methodologies used by the DUNE technical coordination organization, focusing on the areas of integration engineering, technical reviews, quality assurance and control, and safety oversight. Because of its more advanced stage of development, functional examples presented in this volume focus primarily on the single-phase (SP) detector module.

    Highly-parallelized simulation of a pixelated LArTPC on a GPU

    The rapid development of general-purpose computing on graphics processing units (GPGPU) is allowing the implementation of highly-parallelized Monte Carlo simulation chains for particle physics experiments. This technique is particularly suitable for the simulation of a pixelated charge readout for time projection chambers, given the large number of channels that this technology employs. Here we present the first implementation of a full microphysical simulator of a liquid argon time projection chamber (LArTPC) equipped with light readout and pixelated charge readout, developed for the DUNE Near Detector. The software is implemented with an end-to-end set of GPU-optimized algorithms. The algorithms have been written in Python and translated into CUDA kernels using Numba, a just-in-time compiler for a subset of Python and NumPy instructions. The GPU implementation achieves a speed-up of four orders of magnitude compared with the equivalent CPU version: the simulation of the current induced on 10^3 pixels takes around 1 ms on the GPU, compared with approximately 10 s on the CPU. The results of the simulation are compared against data from a pixel-readout LArTPC prototype.
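    For readers unfamiliar with the pattern described above, the hedged sketch below shows the same idea on a toy problem: plain Python/NumPy-style code compiled into a CUDA kernel with Numba, with one GPU thread per pixel. The geometry, the 1/r^2 weighting, and the array sizes are invented for illustration; this is not the DUNE simulator itself.

    ```python
    # Toy illustration of the Numba-CUDA pattern from the abstract above:
    # one GPU thread per pixel, each summing a hypothetical 1/r^2-weighted
    # contribution from every ionization cluster. Not the actual DUNE code.
    import numpy as np
    from numba import cuda

    @cuda.jit
    def induced_current_kernel(charge, cx, cy, px, py, current):
        ipix = cuda.grid(1)                      # one thread per pixel
        if ipix < current.shape[0]:
            total = 0.0
            for j in range(charge.shape[0]):     # loop over clusters
                dx = cx[j] - px[ipix]
                dy = cy[j] - py[ipix]
                total += charge[j] / (dx * dx + dy * dy + 1e-6)
            current[ipix] = total

    n_pixels, n_clusters = 1000, 10000           # illustrative sizes
    rng = np.random.default_rng(42)
    charge = rng.random(n_clusters).astype(np.float32)
    cx = rng.uniform(0.0, 30.0, n_clusters).astype(np.float32)
    cy = rng.uniform(0.0, 30.0, n_clusters).astype(np.float32)
    px = rng.uniform(0.0, 30.0, n_pixels).astype(np.float32)
    py = rng.uniform(0.0, 30.0, n_pixels).astype(np.float32)

    d_current = cuda.device_array(n_pixels, dtype=np.float32)
    threads = 128
    blocks = (n_pixels + threads - 1) // threads
    induced_current_kernel[blocks, threads](
        cuda.to_device(charge), cuda.to_device(cx), cuda.to_device(cy),
        cuda.to_device(px), cuda.to_device(py), d_current)
    print(d_current.copy_to_host()[:5])
    ```

    The equivalent CPU version is the same double loop in plain Python or NumPy; the per-pixel independence is what makes the mapping onto GPU threads so effective.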

    ISOPLETH-AREA TABLES.


    A macro-tidal freshwater ecosystem recovering from hypereutrophication: the Schelde case study

    We report a 40-year record of eutrophication and hypoxia in an estuarine ecosystem and its recovery from hypereutrophication. After decades of high inorganic nutrient concentrations and recurring anoxia and hypoxia, we observe a paradoxical increase in chlorophyll-a concentrations with decreasing nutrient inputs. We hypothesise that algal growth was inhibited under hypereutrophication, either by elevated ammonium concentrations, by severe hypoxia, or by the production of harmful substances in such a reduced environment. We study the dynamics of a simple but realistic mathematical model incorporating the assumption of algal growth inhibition. It shows a high-algal-biomass, net-oxygen-production equilibrium at low ammonia inputs, and a low-algal-biomass, net-oxygen-consumption equilibrium at high ammonia inputs. At intermediate ammonia inputs it displays two alternative stable states. Although this correspondence was not built in by design, the numerical output of the model matches the observations, giving extra support to the assumption of algal growth inhibition. Because of this potential algal growth inhibition, the recovery of hypereutrophied systems towards a classically eutrophied state will require the reduction of waste loads below certain thresholds and will be accompanied by large fluctuations in oxygen concentrations. We conclude that flow-through systems too, although heavily influenced by external forcings that partly mask internal system dynamics, can display multiple stable states.
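    To make the multiple-stable-states argument concrete, the sketch below integrates a deliberately minimal flow-through model in which algal growth is inhibited at high ammonia via Haldane-type (substrate-inhibited) kinetics. The equations and all parameter values are illustrative assumptions, not the authors' model; they only reproduce the qualitative behaviour described above: under the same intermediate ammonia load, two different initial states settle into either a high-algae/low-ammonia or a low-algae/high-ammonia equilibrium.

    ```python
    # Illustrative two-variable flow-through model (not the authors' model):
    # A = algal biomass, N = ammonia. Growth follows Haldane kinetics, so it is
    # inhibited at high N; this alone yields two alternative stable states at
    # an intermediate ammonia load. All parameter values are hypothetical.
    from scipy.integrate import solve_ivp

    mu_max, K_s, K_i = 1.0, 1.0, 4.0   # max growth rate, half-saturation and
                                       # inhibition constants (illustrative)
    D, N_in = 0.3, 12.0                # flushing rate and ammonia load

    def growth(N):
        return mu_max * N / (K_s + N + N**2 / K_i)   # inhibited at high N

    def rhs(t, y):
        A, N = y
        dA = (growth(N) - D) * A              # algal growth minus flushing
        dN = D * (N_in - N) - growth(N) * A   # loading/flushing minus uptake
        return [dA, dN]

    # Same forcing, two different starting points -> two different end states.
    for A0, N0 in [(0.01, 12.0), (8.0, 1.0)]:
        sol = solve_ivp(rhs, (0.0, 500.0), [A0, N0], rtol=1e-8)
        A_end, N_end = sol.y[:, -1]
        print(f"start A={A0:5.2f}, N={N0:5.2f}  ->  "
              f"end A={A_end:6.2f}, N={N_end:6.2f}")
    ```

    In this toy setting, lowering N_in far enough removes the low-algae state altogether, which mirrors the conclusion that recovery requires pushing waste loads below certain thresholds.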
