29 research outputs found

    Comparison of Temperature-Dependent Hadronic Current Correlation Functions Calculated in Lattice Simulations of QCD and with a Chiral Lagrangian Model

    The Euclidean-time hadronic current correlation functions of pseudoscalar and vector currents, $G_P(\tau, T)$ and $G_V(\tau, T)$, have recently been calculated in lattice simulations of QCD and have been used to obtain the corresponding spectral functions. We have used the Nambu-Jona-Lasinio (NJL) model to calculate such spectral functions, as well as the Euclidean-time correlators, and have compared them to the lattice results for the correlators. We find evidence for the type of temperature dependence of the NJL coupling parameters that we have used in previous studies of the mesonic confinement-deconfinement transition. We also see that the spectral functions obtained from the lattice data with the maximum-entropy method (MEM) differ from the spectral functions that we calculate in our chiral model. However, our results for the Euclidean-time correlators are in general agreement with the lattice results, with better agreement when our temperature-dependent coupling parameters are used than when temperature-independent parameters are used for the NJL model. We also discuss some additional evidence for the utility of temperature-dependent coupling parameters for the NJL model. For example, if the constituent quark mass at $T=0$ is 352 MeV in the chiral limit, the transition temperature is $T_c = 208$ MeV for the NJL model with a standard momentum cutoff parameter. (If a Gaussian momentum cutoff is used, we find $T_c = 225$ MeV in the chiral limit, with $m = 368$ MeV at $T=0$.) The introduction of a weak temperature dependence for the coupling constant will move the value of $T_c$ into the range 150-170 MeV, which is more in accord with what is found in lattice simulations of QCD with dynamical quarks.
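
    The Euclidean-time correlator discussed in this abstract follows from a spectral function through the standard finite-temperature relation $G(\tau, T) = \int_0^\infty d\omega\, \sigma(\omega, T)\, \cosh[\omega(\tau - 1/2T)]/\sinh[\omega/2T]$. The sketch below evaluates that integral numerically for a toy Breit-Wigner-like spectral shape; the spectral function and all parameter values are illustrative assumptions, not the NJL-model or lattice results of the paper.

```python
import numpy as np

def euclidean_correlator(tau, T, spectral_fn, omega_max=10.0, n=4000):
    """G(tau, T) = int_0^inf dw sigma(w, T) * cosh[w(tau - 1/(2T))] / sinh[w/(2T)].

    Standard finite-temperature kernel relating a spectral function to the
    Euclidean-time correlator (omega and T in GeV, tau in GeV^-1).
    """
    omega = np.linspace(1e-4, omega_max, n)   # start slightly above w = 0 to avoid the singular point
    kernel = np.cosh(omega * (tau - 1.0 / (2.0 * T))) / np.sinh(omega / (2.0 * T))
    return np.trapz(spectral_fn(omega, T) * kernel, omega)

def toy_spectral_fn(omega, T, m=0.77, width=0.15):
    """Illustrative Breit-Wigner-like spectral shape (NOT the NJL or MEM result)."""
    return (omega**2 / np.pi) * width / ((omega**2 - m**2)**2 + (omega * width)**2)

T = 0.17                                      # temperature in GeV, roughly in the T_c range
taus = np.linspace(0.05, 1.0 / T - 0.05, 20)  # Euclidean times inside (0, 1/T)
for t in taus:
    g = euclidean_correlator(t, T, toy_spectral_fn)
    print(f"tau = {t:5.3f} GeV^-1   G = {g:.4e}")
```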

    Sagnac Interferometer as a Speed-Meter-Type, Quantum-Nondemolition Gravitational-Wave Detector

    According to quantum measurement theory, "speed meters" -- devices that measure the momentum, or speed, of free test masses -- are immune to the standard quantum limit (SQL). It is shown that a Sagnac-interferometer gravitational-wave detector is a speed meter and therefore in principle it can beat the SQL by large amounts over a wide band of frequencies. It is shown, further, that, when one ignores optical losses, a signal-recycled Sagnac interferometer with Fabry-Perot arm cavities has precisely the same performance, for the same circulating light power, as the Michelson speed-meter interferometer recently invented and studied by P. Purdue and the author. The influence of optical losses is not studied, but it is plausible that they will be fairly unimportant for the Sagnac, as for other speed meters. With squeezed vacuum (squeeze factor $e^{-2R} = 0.1$) injected into its dark port, the recycled Sagnac can beat the SQL by a factor $\sqrt{10} \simeq 3$ over the frequency band $10~\mathrm{Hz} \lesssim f \lesssim 150~\mathrm{Hz}$, using the same circulating power $I_c \sim 820$ kW as is used by the (quantum-limited) second-generation Advanced LIGO interferometers -- if other noise sources are made sufficiently small. It is concluded that the Sagnac optical configuration, with signal recycling and squeezed-vacuum injection, is an attractive candidate for third-generation interferometric gravitational-wave detectors (LIGO-III and EURO).
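
    As a back-of-envelope reading of the quoted numbers (not the paper's full quantum-noise budget): a power squeeze factor $e^{-2R} = 0.1$ reduces the noise amplitude in the squeezed quadrature by $e^{-R} = \sqrt{0.1} \approx 0.32$, which is consistent with the quoted SQL-beating factor of roughly $\sqrt{10} \simeq 3$.

```python
import math

# Back-of-envelope check of the quoted numbers (an assumption-laden reading,
# not the paper's detailed noise analysis): squeezed vacuum reduces the noise
# POWER in the relevant quadrature by e^{-2R}, hence its AMPLITUDE by e^{-R}.
noise_power_factor = 0.1                      # quoted squeeze factor e^{-2R}
R = -0.5 * math.log(noise_power_factor)       # squeeze parameter R ~ 1.15
amplitude_reduction = math.exp(-R)            # e^{-R} = sqrt(0.1) ~ 0.316

print(f"squeeze parameter R              = {R:.3f}")
print(f"amplitude noise reduction e^-R   = {amplitude_reduction:.3f}")
print(f"corresponding SQL-beating factor ~ {1 / amplitude_reduction:.2f}"
      f"  (compare the quoted sqrt(10) ~ {math.sqrt(10):.2f})")
```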

    Phases of QCD, Thermal Quasiparticles and Dilepton Radiation from a Fireball

    We calculate dilepton production rates from a fireball adapted to the kinematical conditions realized in ultrarelativistic heavy ion collisions over a broad range of beam energies. The freeze-out state of the fireball is fixed by hadronic observables. We use this information combined with the initial geometry of the collision region to follow the space-time evolution of the fireball. Assuming entropy conservation, its bulk thermodynamic properties can then be uniquely obtained once the equation of state (EoS) is specified. The high-temperature (QGP) phase is modelled by a non-perturbative quasiparticle model that incorporates a phenomenological confinement description, adapted to lattice QCD results. For the hadronic phase, we interpolate the EoS into the region where a resonance gas approach seems applicable, keeping track of a possible overpopulation of the pion phase space. In this way, the fireball evolution is specified without reference to dilepton data, thus eliminating it as an adjustable parameter in the rate calculations. Dilepton emission in the QGP phase is then calculated within the quasiparticle model. In the hadronic phase, both temperature and finite baryon density effects on the photon spectral function are incorporated. Existing dilepton data from CERES at 158 and 40 AGeV Pb-Au collisions are well described, and a prediction for the PHENIX setup at RHIC for $\sqrt{s} = 200$ AGeV is given.
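
    A minimal sketch of the step "assuming entropy conservation, bulk thermodynamic properties follow once the EoS is specified": with a conserved total entropy $S = s(T)\,V(t)$, a prescribed volume evolution $V(t)$ and an entropy density $s(T)$ from the EoS determine the temperature at each time. The ideal-gas EoS, volume parametrisation and initial conditions below are toy assumptions, not the quasiparticle model or fireball evolution of the paper.

```python
import numpy as np
from scipy.optimize import brentq

def entropy_density(T, g_eff=40.0):
    """Toy ideal-gas EoS: s(T) = (2*pi^2/45) * g_eff * T^3  (T in GeV, s in GeV^3)."""
    return (2.0 * np.pi**2 / 45.0) * g_eff * T**3

def volume(t, V0=100.0, tau0=1.0, vz=1.0):
    """Toy expanding fireball volume in fm^3 (t and tau0 in fm/c)."""
    return V0 * (1.0 + vz * (t - tau0) / tau0)

T0 = 0.25                                  # assumed initial temperature in GeV
hbarc3 = 0.1973**3                         # (GeV fm)^3, converts GeV^3 * fm^3 to a pure number
S_total = entropy_density(T0) * volume(1.0) / hbarc3   # conserved total entropy

for t in np.linspace(1.0, 10.0, 10):
    # invert s(T) * V(t) = S_total for T at each time step
    T = brentq(lambda T: entropy_density(T) * volume(t) / hbarc3 - S_total,
               1e-3, 1.0)
    print(f"t = {t:5.2f} fm/c   T = {1000 * T:6.1f} MeV")
```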

    A standardisation framework for bio‐logging data to advance ecological research and conservation

    Bio‐logging data obtained by tagging animals are key to addressing global conservation challenges. However, the many thousands of existing bio‐logging datasets are not easily discoverable, universally comparable, nor readily accessible through existing repositories and across platforms, slowing down ecological research and effective management. A set of universal standards is needed to ensure discoverability, interoperability and effective translation of bio‐logging data into research and management recommendations. We propose a standardisation framework adhering to existing data principles (FAIR: Findable, Accessible, Interoperable and Reusable; and TRUST: Transparency, Responsibility, User focus, Sustainability and Technology) and involving the use of simple templates to create a data flow from manufacturers and researchers to compliant repositories, where automated procedures should be in place to make data available at four standardised levels: (a) decoded raw data, (b) curated data, (c) interpolated data and (d) gridded data. Our framework allows for integration of simple tabular arrays (e.g. csv files) and creation of sharable and interoperable network Common Data Form (netCDF) files containing all the needed information for accuracy‐of‐use, rightful attribution (ensuring data providers keep ownership through the entire process) and data preservation security. We show the standardisation benefits for all stakeholders involved, and illustrate the application of our framework by focusing on marine animals and by providing examples of the workflow across all data levels, including filled templates and code to process data between levels, as well as templates to prepare netCDF files ready for sharing. Adoption of our framework will facilitate collection of Essential Ocean Variables (EOVs) in support of the Global Ocean Observing System (GOOS) and inter‐governmental assessments (e.g. the World Ocean Assessment), and will provide a starting point for broader efforts to establish interoperable bio‐logging data formats across all fields in animal ecology.
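
    A minimal sketch of the kind of data flow the framework describes: a simple tabular array (csv-like) is converted into a self-describing netCDF file carrying units and attribution metadata. The variable names, attributes and file name below are illustrative placeholders, not the framework's official templates.

```python
import pandas as pd
import xarray as xr

# Hedged illustration only: move a simple tabular bio-logging record into a
# self-describing netCDF file with units and attribution metadata. All names
# and attributes are placeholders, not the framework's published templates.
df = pd.DataFrame({
    "time":      pd.to_datetime(["2021-06-01T00:00", "2021-06-01T01:00"]),
    "latitude":  [-43.10, -43.05],
    "longitude": [147.30, 147.35],
    "depth_m":   [12.4, 48.9],
})

ds = xr.Dataset(
    data_vars={
        "latitude":  ("time", df["latitude"].values,  {"units": "degrees_north"}),
        "longitude": ("time", df["longitude"].values, {"units": "degrees_east"}),
        "depth":     ("time", df["depth_m"].values,   {"units": "m", "positive": "down"}),
    },
    coords={"time": df["time"].values},
    attrs={
        "title":       "Example curated bio-logging track (illustrative 'curated data' level)",
        "creator":     "Data provider name (retains ownership through the data flow)",
        "instrument":  "Example tag model",
        "Conventions": "CF-1.8",
    },
)

ds.to_netcdf("example_biologging_track.nc")   # sharable, interoperable netCDF output
print(ds)
```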

    Theoretical foundations of human decision-making in agent-based land use models – A review

    Recent reviews have stated that the complex and context-dependent nature of human decision-making has resulted in ad-hoc representations of human decisions in agent-based land use change models (LUCC ABMs), and that these representations are often not explicitly grounded in theory. However, a systematic survey of the characteristics (e.g. uncertainty, adaptation, learning, interactions and heterogeneities of agents) of representations of human decision-making in LUCC ABMs is missing. The aim of this study is therefore to inform this debate by reviewing 134 LUCC ABM papers. We show that most human decision sub-models are not explicitly based on a specific theory, and that those that are rely mostly on economic theories, such as the rational actor, while largely ignoring other relevant disciplines. Consolidating and enlarging the theoretical basis for modelling human decision-making may be achieved by using a structural framework for modellers, re-using published decision models, learning from other disciplines and fostering collaboration with social scientists.