
    The first applications of novel gaseous detectors for UV visualization

    We have demonstrated experimentally that recently developed gaseous detectors combined with solid or gaseous photocathodes have exceptionally low noise and high quantum efficiency for UV photons while remaining solar-blind. They can therefore be used to detect weak UV sources in daylight conditions. These detectors are extremely robust, can operate in poor gas conditions, and are inexpensive. We present the first results of their application to hyperspectroscopy and flame detection in daylight conditions.

    The Error and Repair Catastrophes: A Two-Dimensional Phase Diagram in the Quasispecies Model

    This paper develops a two-gene, single-fitness-peak model for determining the equilibrium distribution of genotypes in a unicellular population capable of genetic damage repair. The first gene, denoted by $\sigma_{via}$, yields a viable organism with first-order growth rate constant $k > 1$ if it is equal to some target "master" sequence $\sigma_{via,0}$. The second gene, denoted by $\sigma_{rep}$, yields an organism capable of genetic repair if it is equal to some target "master" sequence $\sigma_{rep,0}$. The model is analytically solvable in the limit of infinite sequence length, and gives an equilibrium distribution that depends on $\mu \equiv L\epsilon$, the product of sequence length and per-base-pair replication error probability, and on $\epsilon_r$, the probability of repair failure per base pair. The equilibrium distribution is shown to exist in one of three possible "phases." In the first phase, the population is localized about both the viability and repair master sequences. As $\epsilon_r$ exceeds the fraction of deleterious mutations, the population undergoes a "repair" catastrophe, in which the equilibrium distribution is still localized about the viability master sequence but is spread ergodically over the sequence subspace defined by the repair gene. Below the repair catastrophe, the distribution undergoes the error catastrophe when $\mu$ exceeds $\ln k/\epsilon_r$; above the repair catastrophe, it undergoes the error catastrophe when $\mu$ exceeds $\ln k/f_{del}$, where $f_{del}$ denotes the fraction of deleterious mutations. Comment: 14 pages, 3 figures. Submitted to Physical Review
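
    The two phase boundaries stated in the abstract (repair catastrophe at $\epsilon_r = f_{del}$, error catastrophe at $\mu = \ln k/\epsilon_r$ below it and $\mu = \ln k/f_{del}$ above it) can be turned directly into a classifier. A minimal sketch; the function name and the example parameter values are illustrative, not taken from the paper:

```python
from math import log

def quasispecies_phase(mu: float, eps_r: float, f_del: float, k: float) -> str:
    """Classify the equilibrium phase of the two-gene model.

    mu    : L * epsilon, sequence length times per-base-pair error probability
    eps_r : per-base-pair probability of repair failure
    f_del : fraction of deleterious mutations
    k     : first-order growth rate constant of the viable master (k > 1)
    """
    repairing = eps_r < f_del                       # below the repair catastrophe?
    error_threshold = log(k) / (eps_r if repairing else f_del)
    if mu > error_threshold:
        return "error catastrophe (delocalized over the viability gene)"
    if repairing:
        return "localized about both master sequences"
    return "repair catastrophe (delocalized over the repair gene only)"

# Illustrative values: well below both thresholds for k = 4.
print(quasispecies_phase(mu=2.0, eps_r=0.05, f_del=0.1, k=4.0))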

    Cosmological mass limits on neutrinos, axions, and other light particles

    The small-scale power spectrum of the cosmological matter distribution together with other cosmological data provides a sensitive measure of the hot dark matter fraction, leading to restrictive neutrino mass limits. We extend this argument to generic cases of low-mass thermal relics. We vary the cosmic epoch of thermal decoupling, the radiation content of the universe, and the new particle's spin degrees of freedom. Our treatment covers various scenarios of active plus sterile neutrinos or axion-like particles. For three degenerate massive neutrinos, we reproduce the well-known limit of $m_\nu < 0.34$ eV. In a 3+1 scenario of three massless and one fully thermalized sterile neutrino we find $m_\nu < 1.0$ eV. Thermally produced QCD axions must obey $m_a < 3.0$ eV, superseding limits from a direct telescope search, but leaving room for solar eV-mass axions to be discovered by the CAST experiment. Comment: 15 pages, 6 figures, matches version in JCA
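
    For orientation, a neutrino mass limit translates into a hot dark matter density through the textbook relation $\Omega_\nu h^2 = \sum m_\nu / 93.14\ \mathrm{eV}$ for standard thermal relic neutrinos. A quick sketch using that standard relation (the coefficient is textbook cosmology, not a number from this paper):

```python
# Standard thermal-relic relation: Omega_nu h^2 = sum(m_nu) / 93.14 eV.
# The coefficient is the usual textbook value, not taken from this paper.
SUM_MASS_OVER_OMEGA_EV = 93.14

def omega_nu_h2(masses_eV):
    """Hot dark matter density parameter for the given neutrino masses (eV)."""
    return sum(masses_eV) / SUM_MASS_OVER_OMEGA_EV

# Three degenerate neutrinos at the quoted limit of 0.34 eV each:
print(f"Omega_nu h^2 = {omega_nu_h2([0.34] * 3):.4f}")  # ~0.011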

    Constraints on neutrino masses from a Galactic supernova neutrino signal at present and future detectors

    We study the constraints on neutrino masses that could be derived from the observation of a Galactic supernova neutrino signal with present and future neutrino detectors. Our analysis is based on a recently proposed method that uses the full statistics of neutrino events and does not depend on particular astrophysical assumptions. The statistical approach, originally justified mainly in terms of intuitive reasoning, is put on a more solid basis by means of Bayesian inference. Theoretical uncertainties in the neutrino signal time profiles are estimated by applying the method to two widely different supernova models. Present detectors can reach a sensitivity down to 1 eV. This is better than limits from tritium $\beta$-decay experiments, competitive with the most conservative results from neutrinoless double $\beta$-decay, and less precise but less dependent on prior assumptions than cosmological bounds. Future megaton water Cherenkov detectors will allow for about a factor of two improvement. However, they will not be competitive with the next generation of laboratory experiments. Comment: 28 pages, 5 figures, added discussion on systematic errors and some clarifications. Results unchanged. Published version
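
    The kinematics underlying such limits is the energy-dependent time-of-flight delay of a massive neutrino relative to a massless particle, $\Delta t = (D/2c)\,(m c^2/E)^2$. This formula is standard kinematics rather than the paper's full statistical method; a sketch with illustrative numbers:

```python
# Time-of-flight delay of a massive neutrino relative to light:
# dt = (D / 2c) * (m c^2 / E)^2 -- standard kinematics, not this paper's method.
KPC_M = 3.086e19   # metres per kiloparsec
C = 2.998e8        # speed of light, m/s

def tof_delay_ms(m_eV: float, E_MeV: float, D_kpc: float = 10.0) -> float:
    """Arrival delay in ms for a neutrino of mass m_eV and energy E_MeV."""
    ratio = m_eV / (E_MeV * 1e6)                  # m/E, both in eV
    return (D_kpc * KPC_M / (2 * C)) * ratio**2 * 1e3

# A 1 eV neutrino at 10 MeV from a supernova at 10 kpc:
print(f"{tof_delay_ms(1.0, 10.0):.2f} ms")        # ~5.1 ms
```

    Delays of a few milliseconds must be disentangled from the intrinsic time structure of the burst, which is why the method relies on the full event statistics and on testing widely different supernova models.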

    High rate, fast timing Glass RPC for the high $\eta$ CMS muon detectors

    The HL-LHC phase is designed to increase the amount of data collected by the LHC experiments by an order of magnitude. To achieve this goal on a reasonable time scale, the instantaneous luminosity will also increase by an order of magnitude, up to $6 \times 10^{34}\,\mathrm{cm}^{-2}\,\mathrm{s}^{-1}$. The forward region of the muon spectrometer ($|\eta| > 1.6$) is not equipped with RPC stations. The increase of the expected particle rate, up to $2\,\mathrm{kHz/cm^2}$ (including a safety factor of 3), motivates the installation of RPC chambers to guarantee redundancy with the CSC chambers already present. The current RPC technology of CMS cannot sustain the expected background level. The new technology to be chosen should have high rate capability and provide good spatial and timing resolution. A new generation of glass RPC (GRPC) using low-resistivity (LR) glass is proposed to equip at least the two farthest of the four high-$\eta$ muon stations of CMS. First, the design of small-size prototypes and studies of their performance under high-rate particle flux are presented. Then the proposed designs for large-size chambers and their fast-timing electronic readout are examined, and preliminary results are provided. Comment: 14 pages, 11 figures, conference proceeding for the 2016 Resistive Plate Chambers and Related Detector
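
    The case for low-resistivity glass can be illustrated with the standard DC model of RPC rate capability: the avalanche current through a resistive electrode drops a voltage $\Delta V \approx \rho\, d\, q\, \Phi$ across it, reducing the field in the gas gap. A rough sketch, with parameter values that are illustrative rather than taken from the paper:

```python
# DC model of RPC rate capability: voltage drop across a resistive electrode
# of resistivity rho [Ohm*cm] and thickness d [cm], for avalanche charge
# q [C] at particle flux phi [Hz/cm^2]:  dV = rho * d * q * phi.
# All parameter values below are illustrative, not from the paper.

def electrode_voltage_drop(rho, d, q, phi):
    return rho * d * q * phi

q, d, phi = 5e-12, 0.1, 2e3            # 5 pC avalanches, 1 mm glass, 2 kHz/cm^2
for rho in (1e10, 1e12):               # low-resistivity vs ordinary float glass
    dv = electrode_voltage_drop(rho, d, q, phi)
    print(f"rho = {rho:.0e} Ohm*cm -> dV ~ {dv:.0f} V")
```

    At ordinary float-glass resistivities the drop would eat a substantial part of the applied high voltage at the quoted flux, while LR glass keeps it at the few-volt level, which is the motivation for the GRPC choice.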

    Field theory for a reaction-diffusion model of quasispecies dynamics

    RNA viruses are known to replicate with extremely high mutation rates. These rates are actually close to the so-called error threshold. This threshold is in fact a critical point beyond which genetic information is lost through a second-order phase transition, which has been dubbed the "error catastrophe." Here we explore this phenomenon using a field theory approximation to the spatially extended Swetina-Schuster quasispecies model [J. Swetina and P. Schuster, Biophys. Chem. 16, 329 (1982)], a single-sharp-peak landscape. In analogy with standard absorbing-state phase transitions, we develop a reaction-diffusion model whose discrete rules mimic the Swetina-Schuster model. The field theory representation of the reaction-diffusion system is constructed. The proposed field theory belongs to the same universality class as a conserved reaction-diffusion model previously proposed [F. van Wijland et al., Physica A 251, 179 (1998)]. From the field theory, we obtain the full set of exponents that characterize the critical behavior at the error threshold. Our results present the error catastrophe from a new point of view and suggest that spatial degrees of freedom can modify several mean-field predictions previously considered, leading to the definition of characteristic exponents that could be experimentally measurable. Comment: 13 pages
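
    A toy flavour of such spatial discrete rules can be put on a one-dimensional lattice: occupied sites die at a uniform rate and replicate into empty neighbours, master sequences replicate faster by a factor $k$, and each copy of a master is faithful with probability $Q = e^{-\mu}$. This is a heavily simplified illustration under those stated assumptions, not the paper's field-theoretic model:

```python
import math
import random

# Toy 1-D lattice quasispecies on a single-sharp-peak landscape:
# sites are empty (0), master (1), or mutant (2). Masters replicate k times
# faster; a master copy is faithful with probability Q = exp(-mu).
# Illustrative rules and rates only, not the paper's model.

def master_fraction(mu, k=4.0, death=0.2, L=200, sweeps=2000, seed=1):
    rng = random.Random(seed)
    Q = math.exp(-mu)                          # faithful-copy probability
    lattice = [1] * L                          # start from an all-master state
    for _ in range(sweeps * L):
        i = rng.randrange(L)
        if lattice[i] == 0:
            continue
        if rng.random() < death:               # uniform death
            lattice[i] = 0
            continue
        if rng.random() < (1.0 if lattice[i] == 1 else 1.0 / k):
            j = (i + rng.choice((-1, 1))) % L  # replicate into a neighbour
            if lattice[j] == 0:
                faithful = lattice[i] == 1 and rng.random() < Q
                lattice[j] = 1 if faithful else 2
    return lattice.count(1) / L

for mu in (0.5, 1.0, 1.5, 2.0):
    print(f"mu = {mu:.1f}  master fraction ~ {master_fraction(mu):.2f}")
```

    In this toy the master class should thin out as $\mu$ approaches the mean-field threshold $kQ \approx 1$, i.e. $\mu \approx \ln k \approx 1.4$ for $k = 4$; the paper's point is precisely that spatial fluctuations modify such mean-field expectations near the transition.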

    Virus Replication as a Phenotypic Version of Polynucleotide Evolution

    In this paper we revisit and adapt to viral evolution an approach based on the theory of branching processes advanced by Demetrius, Schuster and Sigmund ("Polynucleotide evolution and branching processes", Bull. Math. Biol. 46 (1985) 239-262) in their study of polynucleotide evolution. By taking beneficial effects into account we obtain a non-trivial multivariate generalization of their single-type branching process model. Perturbative techniques allow us to obtain analytical asymptotic expressions for the main global parameters of the model, which lead to the following rigorous results: (i) a new criterion for "no sure extinction"; (ii) a generalization and proof, for this particular class of models, of the lethal mutagenesis criterion proposed by Bull, Sanjuán and Wilke ("Theory of lethal mutagenesis for viruses", J. Virology 18 (2007) 2930-2939); (iii) a new proposal for the notion of relaxation time with a quantitative prescription for its evaluation; (iv) the quantitative description of the evolution of the expected values in four distinct "stages": extinction threshold, lethal mutagenesis, stationary "equilibrium", and transient. Finally, based on these quantitative results we are able to draw some qualitative conclusions. Comment: 23 pages, 1 figure, 2 tables. arXiv admin note: substantial text overlap with arXiv:1110.336
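
    One common statement of the Bull-Sanjuán-Wilke lethal mutagenesis criterion referenced here is that extinction is guaranteed when the mutation-free reproductive number falls below one, $R_{max}\, e^{-U_d} < 1$, with $R_{max}$ the maximum per-generation fecundity and $U_d$ the deleterious mutation rate per genome per replication. A minimal sketch of that criterion (values illustrative, and this is the original criterion, not the paper's multivariate generalization):

```python
import math

# Bull-Sanjuan-Wilke lethal mutagenesis criterion (one common form):
# sure extinction when R_max * exp(-U_d) < 1. Illustrative values only.

def sure_extinction(R_max: float, U_d: float) -> bool:
    """True if the mutation-free reproductive number drops below 1."""
    return R_max * math.exp(-U_d) < 1.0

for U_d in (1.0, 3.0, 5.0):
    print(f"U_d = {U_d}: sure extinction = {sure_extinction(R_max=10.0, U_d=U_d)}")
```

    This matches the classical branching-process dichotomy: a lineage whose mean offspring number is at most one dies out with certainty, which is the baseline the paper's "no sure extinction" criterion generalizes.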

    Probing Kaluza-Klein Dark Matter with Neutrino Telescopes

    In models in which all of the Standard Model fields live in extra universal dimensions, the lightest Kaluza-Klein (KK) particle can be stable. Calculations of the one-loop radiative corrections to the masses of the KK modes suggest that the lightest KK particle (LKP) is typically the first KK excitation of the hypercharge gauge boson. This LKP is a viable dark matter candidate with an ideal present-day relic abundance if its mass is moderately large, between 600 and 1200 GeV. Such weakly interacting dark matter particles are expected to become gravitationally trapped in large bodies, such as the Sun, and to annihilate into neutrinos or other particles that decay into neutrinos. We calculate the annihilation rate, the neutrino flux, and the resulting event rate in present and future neutrino telescopes. The relatively large mass implies that the neutrino energy spectrum is expected to be well above the energy threshold of AMANDA and IceCube. We find that the event rate in IceCube is between a few and tens of events per year. Comment: 13 pages, 3 figures, LaTeX; typos fixed, version to appear in PR
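
    The annihilation rate in such calculations is usually obtained from the standard capture-annihilation balance in the Sun, $\dot N = C - A N^2$, whose solution gives $\Gamma_{ann} = (C/2)\tanh^2(t/\tau)$ with $\tau = 1/\sqrt{CA}$. A sketch of that standard relation; the numerical inputs below are placeholders, not the paper's values:

```python
import math

# Standard capture-annihilation equilibrium for WIMPs in the Sun:
# Gamma_ann = (C/2) * tanh^2(t/tau), tau = 1/sqrt(C*A).
# C and A values below are placeholders, not taken from the paper.

def annihilation_rate(C: float, A: float, t: float) -> float:
    """Annihilation rate [1/s] for capture rate C [1/s], annihilation
    strength A [1/s], evaluated at time t [s]."""
    tau = 1.0 / math.sqrt(C * A)
    return 0.5 * C * math.tanh(t / tau) ** 2

T_SUN = 4.5e9 * 3.15e7               # age of the Sun in seconds
print(annihilation_rate(C=1e20, A=1e-55, t=T_SUN))
```

    When $t \gg \tau$ the rate saturates at $C/2$ and the neutrino signal is set entirely by the capture rate, which is why the telescope event rate tracks the LKP's scattering cross section on solar nuclei.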

    Trade unions and the challenge of fostering solidarities in an era of financialisation

    This article re-examines evidence that trade unions in the UK have struggled to renew themselves despite considerable investment of time and effort. It argues that financialisation in the realms of capital accumulation, organisational decision-making and everyday life has introduced new barriers to building the solidarities within and between groups of workers that would be necessary to develop a stronger response to the catastrophic effects on labour of financialisation in general, and of the financial crisis specifically. The crisis highlighted the weaknesses of trade unions as institutions of economic and industrial democracy, but it has also created some opportunities to establish narratives of solidarity in spaces and platforms created within a financialised context.