    Modeling the quantum evolution of the universe through classical matter

    It is well known that the canonical quantization of the Friedmann-Lemaître-Robertson-Walker (FLRW) universe filled with a perfect fluid leads to nonsingular universes which, at later times, behave as their classical counterparts. This means that the expectation value of the scale factor a(t) never vanishes and that, as t → ∞, we recover the classical expression for the scale factor. In this paper, we show that such universes can be reproduced by classical cosmology provided the universe is filled with an exotic form of matter. In the case of a perfect fluid, we find an implicit equation of state (EoS). We then show that this single fluid with an implicit EoS is equivalent to two non-interacting fluids, one of them representing stiff matter with negative energy density. In the case of two non-interacting scalar fields, one of them of the phantom type, we find their potential energy. In both cases we find that quantum mechanics completely changes the configuration of matter at small values of time, by adding a fluid or a scalar field with negative energy density. As time passes, the negative energy density decreases and we recover the ordinary matter content of the classical universe. The more the initial wave function of the universe is concentrated around the classical big-bang singularity, the more negative energy must be added, since this type of energy is responsible for the removal of the classical singularity. Comment: updated version as accepted by Gen. Relativ. Gravit.
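
The two-fluid picture described in the abstract can be made concrete with a short illustrative calculation (a sketch consistent with the abstract's description, not taken from the paper): an ordinary fluid with EoS p = wρ plus stiff matter (p = ρ, hence ρ ∝ a⁻⁶) carrying negative energy density. In units with 8πG/3 = 1:

```latex
% Friedmann equation for the two non-interacting fluids (illustrative):
\left(\frac{\dot a}{a}\right)^2
  = \frac{\rho_w}{a^{3(1+w)}} - \frac{\rho_s}{a^{6}},
  \qquad \rho_w,\ \rho_s > 0.
% The negative stiff-matter term dominates at small a, so the right-hand
% side vanishes at a finite minimum scale factor,
a_{\min} = \left(\frac{\rho_s}{\rho_w}\right)^{\frac{1}{3(1-w)}}
  \qquad (w < 1),
% where the universe bounces instead of reaching a = 0; at large a the
% stiff component dilutes fastest and classical behaviour is recovered.
```

This shows why the negative-energy component removes the singularity yet becomes negligible at late times: it redshifts away faster than any fluid with w < 1.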

    Calibration of photomultiplier arrays

    A method is described that allows calibration and assessment of the linearity of response of an array of photomultiplier tubes. The method does not require knowledge of the photomultiplier single photoelectron response model and uses science data directly, thus eliminating the need for dedicated data sets. In this manner, all photomultiplier working conditions (e.g. temperature, external fields, etc.) are exactly matched between calibration and science acquisitions. This is of particular importance in low background experiments such as ZEPLIN-III, where methods involving the use of external light sources for calibration are severely constrained.
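
As a rough illustration of calibrating a PMT without a single-photoelectron response model, one can exploit photoelectron counting statistics: if the number of photoelectrons per pulse is Poisson-distributed, the ratio of the variance to the mean of the pulse-area distribution estimates the gain. This is a minimal sketch on simulated data (all numbers are invented, and the estimator is deliberately simplified relative to the paper's method):

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated "science data": pulse areas from a PMT responding to a
# Poisson-distributed number of photoelectrons (hypothetical values).
g_true = 75.0          # single-photoelectron gain (arbitrary units)
lam = 12.0             # mean photoelectrons per pulse
npe = rng.poisson(lam, size=100_000)
areas = g_true * npe   # idealised: no SER width, no baseline noise

# Mean/variance estimator: for Poisson statistics,
#   mean = g * lambda,  variance = g^2 * lambda  =>  g ~ var / mean
g_est = areas.var() / areas.mean()
```

In real data the single-electron response width and electronic noise add to the variance, so the estimator needs corresponding correction terms.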

    Measurement and simulation of the muon-induced neutron yield in lead

    A measurement is presented of the neutron production rate in lead by high-energy cosmic-ray muons at a depth of 2850 m water equivalent (w.e.) and a mean muon energy of 260 GeV. The measurement exploits the delayed coincidences between muons and the radiative capture of induced neutrons in a highly segmented tonne-scale plastic scintillator detector. Detailed Monte Carlo simulations reproduce well the measured capture times and multiplicities and, within the dynamic range of the instrumentation, the spectrum of energy deposits. By comparing measurements with simulations of neutron capture rates, a neutron yield in lead of (…) × 10⁻³ neutrons/muon/(g/cm²) has been obtained. Absolute agreement between simulation and data is of order 25%. Consequences for deep underground rare event searches are discussed.
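
A yield in neutrons/muon/(g/cm²) is obtained by dividing the efficiency-corrected number of detected neutrons by the number of muons and their mean path length through the target. A minimal sketch with placeholder numbers (none of these are the paper's values):

```python
# Illustrative yield calculation; every input below is hypothetical.
n_detected = 1200      # neutron captures tagged in delayed coincidence
efficiency = 0.04      # overall neutron detection efficiency
n_muons = 5.0e5        # muons crossing the lead target
path_gcm2 = 150.0      # mean muon path length in lead, in g/cm^2

yield_per_muon = n_detected / (efficiency * n_muons * path_gcm2)
print(yield_per_muon)  # 4e-4 neutrons/muon/(g/cm^2) for these inputs
```

In practice the efficiency and path length are themselves taken from the Monte Carlo simulation, which is why the comparison is quoted together with the simulation/data agreement.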

    ZE3RA: The ZEPLIN-III Reduction and Analysis package

    ZE3RA is the software package responsible for processing the raw data from the ZEPLIN-III dark matter experiment and its reduction into a set of parameters used in all subsequent analyses. The detector is a liquid xenon time projection chamber with scintillation and electroluminescence signals read out by an array of 31 photomultipliers. The dual range 62-channel data stream is optimised for the detection of scintillation pulses down to a single photoelectron and of ionisation signals as small as those produced by single electrons. We discuss in particular several strategies related to data filtering, pulse finding and pulse clustering which are tuned using calibration data to recover the best electron/nuclear recoil discrimination near the detection threshold, where most dark matter elastic scattering signatures are expected. The software was designed assuming only minimal knowledge of the physics underlying the detection principle, allowing an unbiased analysis of the experimental results and easy extension to other detectors with similar requirements. ©2011 IOP Publishing Ltd and SISSA
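
The pulse-finding and clustering strategies mentioned above can be illustrated with a toy threshold-crossing finder that merges nearby excursions into one cluster (a hypothetical simplification; ZE3RA's actual algorithms are tuned on calibration data):

```python
import numpy as np

def find_pulses(waveform, threshold, gap):
    """Return (start, end) sample indices of pulses: contiguous
    above-threshold regions, merging excursions separated by fewer
    than `gap` samples into a single cluster (toy scheme)."""
    idx = np.flatnonzero(waveform > threshold)
    if idx.size == 0:
        return []
    pulses = []
    start = prev = idx[0]
    for i in idx[1:]:
        if i - prev >= gap:          # too far from previous sample:
            pulses.append((int(start), int(prev)))  # close the cluster
            start = i
        prev = i
    pulses.append((int(start), int(prev)))
    return pulses

wf = np.zeros(100)
wf[10:13] = 5.0   # first pulse
wf[14:16] = 4.0   # nearby excursion: clustered with the first
wf[60:62] = 3.0   # well-separated second pulse
print(find_pulses(wf, threshold=1.0, gap=10))  # [(10, 15), (60, 61)]
```

The `gap` parameter plays the role of a clustering tuning constant; near threshold, its choice directly affects whether single-photoelectron tails are split off or absorbed, which is why such parameters are tuned on calibration data.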

    Jet energy measurement with the ATLAS detector in proton-proton collisions at root s=7 TeV

    The jet energy scale and its systematic uncertainty are determined for jets measured with the ATLAS detector at the LHC in proton-proton collision data at a centre-of-mass energy of √s = 7 TeV, corresponding to an integrated luminosity of 38 pb⁻¹. Jets are reconstructed with the anti-kt algorithm with distance parameters R = 0.4 or R = 0.6. Jet energy and angle corrections are determined from Monte Carlo simulations to calibrate jets with transverse momenta pT ≥ 20 GeV and pseudorapidities |η| < 4.5. The jet energy systematic uncertainty is estimated using the single isolated hadron response measured in situ and in test-beams, exploiting the transverse momentum balance between central and forward jets in events with dijet topologies, and studying systematic variations in Monte Carlo simulations. The jet energy uncertainty is less than 2.5% in the central calorimeter region (|η| < 0.8) for jets with 60 ≤ pT < 800 GeV, and is maximally 14% for pT < 30 GeV in the most forward region 3.2 ≤ |η| < 4.5. The jet energy is validated for jet transverse momenta up to 1 TeV to the level of a few percent using several in situ techniques, by comparing with a well-known reference such as the recoiling photon pT, the sum of the transverse momenta of tracks associated to the jet, or a system of low-pT jets recoiling against a high-pT jet. More sophisticated jet calibration schemes are presented based on calorimeter cell energy density weighting or hadronic properties of jets, aiming for an improved jet energy resolution and a reduced flavour dependence of the jet response. The systematic uncertainty of the jet energy determined from a combination of in situ techniques is consistent with the one derived from single hadron response measurements over a wide kinematic range. The nominal corrections and uncertainties are derived for isolated jets in an inclusive sample of high-pT jets. Special cases such as event topologies with close-by jets, or selections of samples with an enhanced content of jets originating from light quarks, heavy quarks or gluons, are also discussed and the corresponding uncertainties are determined. © 2013 CERN for the benefit of the ATLAS collaboration
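
Schematically, a calibration of this kind divides the reconstructed transverse momentum by an average jet response derived from simulation, binned in pT and η. A toy sketch with invented response values (these are not ATLAS numbers, and the real calibration uses finely binned, smoothly interpolated corrections):

```python
# Hypothetical average jet response R(pT, eta) from simulation; dividing
# reconstructed pT by R restores, on average, the truth-level jet pT.
response = {
    ("central", "low_pt"): 0.80,   # |eta| < 0.8, pT < 60 GeV (made up)
    ("central", "high_pt"): 0.92,
    ("forward", "low_pt"): 0.70,
    ("forward", "high_pt"): 0.85,
}

def calibrate(pt_reco, eta):
    """Apply the toy jet energy scale correction pT -> pT / R."""
    region = "central" if abs(eta) < 0.8 else "forward"
    band = "low_pt" if pt_reco < 60.0 else "high_pt"
    return pt_reco / response[(region, band)]

print(calibrate(40.0, 0.3))   # a 40 GeV central jet corrected up to ~50 GeV
```

The lower response (and larger correction) at low pT and high |η| mirrors why the quoted uncertainty grows from 2.5% centrally to 14% in the most forward, lowest-pT region.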

    Measurement of the inclusive and dijet cross-sections of b-jets in pp collisions at sqrt(s) = 7 TeV with the ATLAS detector

    The inclusive and dijet production cross-sections have been measured for jets containing b-hadrons (b-jets) in proton-proton collisions at a centre-of-mass energy of √s = 7 TeV, using the ATLAS detector at the LHC. The measurements use data corresponding to an integrated luminosity of 34 pb⁻¹. The b-jets are identified using either a lifetime-based method, where secondary decay vertices of b-hadrons in jets are reconstructed using information from the tracking detectors, or a muon-based method, where the presence of a muon is used to identify semileptonic decays of b-hadrons inside jets. The inclusive b-jet cross-section is measured as a function of transverse momentum in the range 20 < pT < 400 GeV and rapidity in the range |y| < 2.1. The bb̄-dijet cross-section is measured as a function of the dijet invariant mass in the range 110 < m_jj < 760 GeV, the azimuthal angle difference between the two jets, and the angular variable χ in two dijet mass regions. The results are compared with next-to-leading-order QCD predictions. Good agreement is observed between the measured cross-sections and the predictions obtained using POWHEG + Pythia. MC@NLO + Herwig shows good agreement with the measured bb̄-dijet cross-section. However, it does not reproduce the measured inclusive cross-section well, particularly for central b-jets with large transverse momenta. Comment: 10 pages plus author list (21 pages total), 8 figures, 1 table; final version published in the European Physical Journal.
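
The dijet invariant mass m_jj used in the bb̄ measurement is the standard invariant mass of the summed jet four-momenta. A minimal sketch:

```python
import math

def dijet_mass(jet1, jet2):
    """Invariant mass of two jets given (E, px, py, pz) four-vectors:
    m_jj = sqrt((E1+E2)^2 - |p1+p2|^2), in the same units as E."""
    e = jet1[0] + jet2[0]
    px = jet1[1] + jet2[1]
    py = jet1[2] + jet2[2]
    pz = jet1[3] + jet2[3]
    return math.sqrt(e * e - px * px - py * py - pz * pz)

# Two back-to-back (approximately massless) 100 GeV jets: m_jj = 200 GeV
print(dijet_mass((100.0, 100.0, 0.0, 0.0), (100.0, -100.0, 0.0, 0.0)))
```

The angular variable χ quoted alongside m_jj is built from the two jet rapidities (χ = exp|y1 − y2|), which is why both are binned in dijet mass regions.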

    Phylogeographic Analysis of HIV-1 Subtype C Dissemination in Southern Brazil

    The HIV-1 subtype C has spread efficiently in the southern states of Brazil (Rio Grande do Sul, Santa Catarina and Paraná). Phylogeographic studies indicate that the subtype C epidemic in southern Brazil was initiated by the introduction of a single founder virus population at some time point between 1960 and 1980, but little is known about the spatial dynamics of viral spread. A total of 135 Brazilian HIV-1 subtype C pol sequences collected from 1992 to 2009 at the three southern state capitals (Porto Alegre, Florianópolis and Curitiba) were analyzed. Maximum-likelihood and Bayesian methods were used to explore the degree of phylogenetic mixing of subtype C sequences from different cities and to reconstruct the geographical pattern of viral spread in this region of the country. Phylogeographic analyses supported the monophyletic origin of the HIV-1 subtype C clade circulating in southern Brazil and placed the root of that clade in Curitiba (Paraná state). This analysis further suggested that Florianópolis (Santa Catarina state) is an important staging post in subtype C dissemination, displaying high viral migration rates from and to the other cities, while viral flux between Curitiba and Porto Alegre (Rio Grande do Sul state) is very low. We found a positive correlation (r² = 0.64) between routine travel and viral migration rates among localities. Despite the intense viral movement, phylogenetic intermixing of subtype C sequences from different Brazilian cities is lower than expected by chance. Notably, a high proportion (67%) of subtype C sequences from Porto Alegre branched within a single local monophyletic sub-cluster. These results suggest that the HIV-1 subtype C epidemic in southern Brazil has been shaped by both frequent viral migration among states and in situ dissemination of local clades.
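
The reported r² = 0.64 is a standard coefficient of determination between two rate variables. A toy computation with invented city-pair values (not the study's data) showing how such a figure is obtained:

```python
import numpy as np

# Hypothetical routine-travel volumes and inferred viral migration rates
# for a handful of city pairs (made-up numbers, illustration only).
travel = np.array([1.0, 2.5, 4.0, 6.0, 9.0])
migration = np.array([0.8, 2.0, 4.5, 5.5, 9.2])

r = np.corrcoef(travel, migration)[0, 1]  # Pearson correlation
r_squared = r ** 2
print(r_squared)  # close to 1 for this near-linear toy data
```

An r² of 0.64 on the real data means routine travel explains roughly two-thirds of the variance in estimated viral migration rates, consistent with the paper's conclusion that human mobility shapes, but does not fully determine, viral flow.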

    Computational approaches for modeling human intestinal absorption and permeability

    Human intestinal absorption (HIA) is an important roadblock in the formulation of new drug substances. Experimentally, it is determined via in vivo studies or in vitro permeability assays, so computational models are needed for rapid estimation of this property. We present several computational models that are able to predict the absorption of drugs by the human intestine and their permeability through human Caco-2 cells. The training and prediction sets were derived from literature sources and carefully examined to eliminate compounds that are actively transported. We compare our results to models derived by other methods and find that the statistical quality is similar. We believe that models derived from both sources of experimental data would provide greater consistency in predictions. The performance of the several QSPR models that we investigated in predicting outside the training set, for either experimental property, clearly indicates that caution should be exercised when applying any of the models for quantitative predictions. However, we show that qualitative predictions can be obtained with close to a 70% success rate.
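
The ~70% qualitative success rate corresponds to simple classification accuracy over categorical (e.g. high/low absorption) calls. A toy scoring sketch with hypothetical labels, just to pin down the metric:

```python
# Hypothetical qualitative predictions vs. experimental classes for ten
# compounds (invented labels; the metric, not the data, is the point).
predicted = ["high", "high", "low", "high", "low",
             "low", "high", "low", "high", "low"]
observed  = ["high", "low",  "low", "high", "low",
             "high", "high", "low", "high", "low"]

# Success rate = fraction of compounds whose class is predicted correctly
success_rate = sum(p == o for p, o in zip(predicted, observed)) / len(observed)
print(success_rate)  # 0.8 for this toy set (8 of 10 correct)
```

Note that accuracy alone can be misleading when the classes are imbalanced, which is one reason the authors caution against quantitative use of the models.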

    From DPSIR the DAPSI(W)R(M) Emerges… a Butterfly – ‘protecting the natural stuff and delivering the human stuff’

    The complexity of interactions and feedbacks between human activities and ecosystems can make the analysis of such social-ecological systems intractable. In order to provide a common means of understanding and analysing the links between social and ecological processes within these systems, a range of analytical frameworks has been developed and adopted. Following decades of practical experience in implementation, the Driver-Pressure-State-Impact-Response (DPSIR) conceptual framework has been adapted and re-developed to become the DAPSI(W)R(M). This paper describes in detail the DAPSI(W)R(M) and its development from the original DPSIR conceptual frame. Despite its diverse application and demonstrated utility, a number of inherent shortcomings are identified. In particular, the DPSIR model family tends to be best suited to individual environmental pressures and human activities and their resulting environmental problems, having a limited focus on the supply and demand of benefits from nature. We present a derived framework, the “Butterfly”, a more holistic approach designed to expand the concept. The “Butterfly” model moves away from the centralised accounting-framework approach while more fully incorporating the complexity of social and ecological systems, and the supply and demand of ecosystem services, which are central to human-environment interactions.
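
One DAPSI(W)R(M) causal chain can be written down as plain structured data, which is how such frameworks are often operationalised in assessments. A minimal sketch (the element names follow the framework; the example entries are invented):

```python
# A single hypothetical DAPSI(W)R(M) chain, from societal driver through
# to a management measure; real assessments hold many such linked chains.
chain = {
    "Driver": "food security",
    "Activity": "commercial fishing",
    "Pressure": "selective extraction of species",
    "State_change": "reduced fish stock biomass",
    "Impact_on_Welfare": "loss of fishery income",
    "Response_as_Measures": "quota management",
}

# The framework's causal ordering, driver-to-measure
order = ["Driver", "Activity", "Pressure", "State_change",
         "Impact_on_Welfare", "Response_as_Measures"]
print(list(chain) == order)  # the chain follows the framework's ordering
```

Separating Activities from Drivers, and welfare Impacts from ecological State changes, is precisely the refinement that distinguishes DAPSI(W)R(M) from the original DPSIR.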