
    Improved WKB approximation for quantum tunneling: Application to heavy ion fusion

    Full text link
    In this paper we revisit the one-dimensional tunneling problem. We consider Kemble's approximation for the transmission coefficient. We show how this approximation can be extended to above-barrier energies by performing the analytical continuation of the radial coordinate to the complex plane. We investigate the validity of this approximation by comparing its predictions for the cross section and for the barrier distribution with the corresponding quantum mechanical results. We find that the extended Kemble approximation reproduces the results of quantum mechanics with great accuracy. Comment: 8 pages, 6 figures, in press, European Physical Journal A (2017)
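
    Kemble's formula is the standard WKB transmission coefficient, $T(E) = [1 + \exp(2S(E))]^{-1}$, with $S(E)$ the action integral between the classical turning points. The minimal numerical sketch below illustrates it for a purely illustrative inverted-parabola barrier rather than the Coulomb-plus-nuclear potential of the paper; all parameter values and function names are placeholders.

# Minimal sketch of Kemble's WKB transmission coefficient,
#   T(E) = 1 / (1 + exp(2 S(E))),   S(E) = (1/hbar) * integral of sqrt(2 mu (V(r) - E)) dr
# between the classical turning points.  The barrier below is an illustrative
# inverted parabola in arbitrary units (hbar = mu = 1), not the paper's potential.
import numpy as np
from scipy.integrate import quad

HBAR, MU = 1.0, 1.0
VB, RB, OMEGA = 10.0, 10.0, 1.0          # barrier height, position and curvature

def V(r):
    """Illustrative inverted-parabola barrier."""
    return VB - 0.5 * MU * OMEGA**2 * (r - RB)**2

def kemble_T(E):
    """Kemble transmission coefficient for the toy barrier above."""
    if E >= VB:
        # Above the barrier the turning points move into the complex r-plane; for
        # this parabolic toy barrier the analytic continuation gives the same
        # closed form at all energies (Hill-Wheeler).
        return 1.0 / (1.0 + np.exp(2.0 * np.pi * (VB - E) / (HBAR * OMEGA)))
    half_width = np.sqrt(2.0 * (VB - E) / MU) / OMEGA
    r1, r2 = RB - half_width, RB + half_width          # classical turning points, V(r1,2) = E
    integrand = lambda r: np.sqrt(np.maximum(2.0 * MU * (V(r) - E), 0.0)) / HBAR
    action = quad(integrand, r1, r2)[0]
    return 1.0 / (1.0 + np.exp(2.0 * action))

for E in (6.0, 8.0, 9.5, 10.0, 10.5):
    print(f"E = {E:5.2f}   T = {kemble_T(E):.3e}")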

    Approximate transmission coefficients in heavy ion fusion

    Full text link
    In this paper we revisit the one-dimensional tunnelling problem. We consider different approximations for the transmission through the Coulomb barrier in heavy ion collisions at near-barrier energies. First, we discuss approximations of the barrier shape by functional forms where the transmission coefficient is known analytically. Then, we consider Kemble's approximation for the transmission coefficient. We show how this approximation can be extended to above-barrier energies by performing the analytical continuation of the radial coordinate to the complex plane. We investigate the validity of the different approximations considered in this paper by comparing their predictions for transmission coefficients and cross sections of three heavy ion systems with the corresponding quantum mechanical results. Comment: 12 pages, 6 figures
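
    One widely used functional form of this kind is the parabolic barrier, for which the transmission coefficient and the resulting fusion cross section have the standard closed forms below (Hill-Wheeler and Wong); these are quoted as the textbook expressions and may differ in detail from the parametrisations adopted in the paper.

\[
T_\ell(E) = \left[ 1 + \exp\!\left( \frac{2\pi \left( V_B(\ell) - E \right)}{\hbar\omega} \right) \right]^{-1},
\qquad
\sigma_{\rm fus}(E) = \frac{\pi}{k^2} \sum_\ell (2\ell + 1)\, T_\ell(E)
\simeq \frac{\hbar\omega\, R_B^2}{2E}\, \ln\!\left[ 1 + \exp\!\left( \frac{2\pi (E - V_B)}{\hbar\omega} \right) \right],
\]

    where $V_B$, $R_B$ and $\hbar\omega$ are the height, radius and curvature of the $s$-wave barrier, and the Wong form follows from approximating $V_B(\ell) \simeq V_B + \hbar^2 \ell(\ell+1)/(2\mu R_B^2)$ and replacing the sum over $\ell$ by an integral.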

    Is there an excess of black holes around $20 M_{\odot}$? Optimising the complexity of population models with the use of reversible jump MCMC

    Full text link
    Some analyses of the third gravitational wave catalogue released by the LIGO-Virgo-KAGRA collaboration (LVK) suggest an excess of black holes around $15-20 M_{\odot}$. In order to investigate this feature, we introduce two flexible population models, a semi-parametric one and a non-parametric one. Both make use of reversible jump Markov chain Monte Carlo to optimise their complexity. We also illustrate how the latter can be used to efficiently perform model selection. Our parametric model broadly agrees with the fiducial analysis of the LVK, but finds a peak of events at slightly larger masses. Our non-parametric model shows this same displacement. Moreover, it also suggests the existence of an excess of black holes around $20 M_{\odot}$. We assess the robustness of this prediction by performing mock injections and running hierarchical analyses on those. We find that such a feature might be due to statistical fluctuations, given the small number of events observed so far, with a probability of $5\%$. We estimate that with a few hundred observations, as expected for O4, our non-parametric model will be able to robustly determine the presence of this excess. It will then allow for an efficient agnostic inference of the properties of black holes. Comment: correct typo in equation
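
    As a purely illustrative reconstruction of the reversible jump idea (not the paper's semi- or non-parametric models), the sketch below compares a plain power-law mass model against a power-law-plus-Gaussian-peak model on synthetic masses; birth moves draw the extra peak parameters from their priors, so the trans-dimensional acceptance ratio reduces to a likelihood ratio. Selection effects and measurement uncertainties are ignored, and all numbers are arbitrary.

# Toy reversible-jump MCMC: plain power law vs power law + Gaussian peak,
# fitted to synthetic "black-hole masses".  Equal model priors, symmetric jump
# probabilities and birth-from-prior proposals make the trans-dimensional
# acceptance ratio a pure likelihood ratio.
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(1)
M_MIN, M_MAX = 5.0, 80.0

# Synthetic catalogue: power law plus a bump near 20 Msun (arbitrary numbers).
alpha_true, n_pl, n_peak = 2.5, 150, 40
u = rng.random(n_pl)
masses_pl = (M_MIN**(1 - alpha_true) + u * (M_MAX**(1 - alpha_true) - M_MIN**(1 - alpha_true)))**(1 / (1 - alpha_true))
masses_peak = truncnorm.rvs((M_MIN - 20) / 3, (M_MAX - 20) / 3, loc=20, scale=3, size=n_peak, random_state=rng)
data = np.concatenate([masses_pl, masses_peak])

def powerlaw_pdf(m, alpha):
    norm = (1 - alpha) / (M_MAX**(1 - alpha) - M_MIN**(1 - alpha))
    return norm * m**(-alpha)

def peak_pdf(m, mu, sig):
    return truncnorm.pdf(m, (M_MIN - mu) / sig, (M_MAX - mu) / sig, loc=mu, scale=sig)

def log_like(model, th):
    if model == 0:                       # pure power law: th = [alpha]
        p = powerlaw_pdf(data, th[0])
    else:                                # power law + peak: th = [alpha, frac, mu, sig]
        alpha, frac, mu, sig = th
        p = (1 - frac) * powerlaw_pdf(data, alpha) + frac * peak_pdf(data, mu, sig)
    return np.sum(np.log(p))

# Uniform priors; prior draws double as the birth proposal.
PRIOR = {0: [(1.1, 6.0)],
         1: [(1.1, 6.0), (0.0, 1.0), (M_MIN, M_MAX), (0.5, 10.0)]}
def draw_prior(model):
    return np.array([rng.uniform(lo, hi) for lo, hi in PRIOR[model]])
def in_prior(model, th):
    return all(lo < x < hi for x, (lo, hi) in zip(th, PRIOR[model]))

model, th = 0, draw_prior(0)
logL = log_like(model, th)
counts = [0, 0]
for it in range(20000):
    if rng.random() < 0.5:                       # trans-dimensional jump
        new_model = 1 - model
        if new_model == 1:                       # "birth": extra params drawn from prior
            new_th = np.concatenate([th[:1], draw_prior(1)[1:]])
        else:                                    # "death": drop the peak params
            new_th = th[:1]
        new_logL = log_like(new_model, new_th)
        if np.log(rng.random()) < new_logL - logL:   # acceptance = likelihood ratio
            model, th, logL = new_model, new_th, new_logL
    else:                                        # within-model random walk
        new_th = th + rng.normal(0, 0.1, size=th.size) * [hi - lo for lo, hi in PRIOR[model]]
        if in_prior(model, new_th):
            new_logL = log_like(model, new_th)
            if np.log(rng.random()) < new_logL - logL:
                th, logL = new_th, new_logL
    counts[model] += 1

print("posterior odds (peak vs no peak) ~", counts[1] / max(counts[0], 1))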

    The lure of sirens: joint distance and velocity measurements with third generation detectors

    Get PDF
    The next generation of detectors will detect gravitational waves from binary neutron stars at cosmological distances, for which around a thousand electromagnetic follow-ups may be observed per year. So far, most work devoted to the expected cosmological impact of these standard sirens has employed them only as distance indicators. Only recently has their use as tracers of clustering, similar to what has already been proposed for supernovae, been studied. Focusing on the expected specifications of the Einstein Telescope (ET), we forecast here the performance of future standard sirens as both distance and density indicators in constraining cosmological parameters, with emphasis on the linear perturbation growth index and on spatial curvature. We improve upon previous studies in a number of ways: a more detailed analysis of available telescope time, the inclusion of more cosmological and nuisance parameters, the Alcock-Paczynski correction, the use of sirens as both velocity and density tracers, and a more accurate estimation of the distance posterior. We find that the analysis of the clustering of sirens improves the constraints on $H_0$ by 30% and on $\Omega_{k0}$ by over an order of magnitude, with respect to their use merely as distance indicators. With 5 years of joint ET and Rubin Observatory follow-ups we could reach a precision of 0.1 km/s/Mpc in $H_0$ and 0.02 in $\Omega_{k0}$ using only data in the range $0<z<0.5$. We also find that the use of sirens as tracers of density, and not only velocity, yields good improvements on the growth-of-structure constraints.
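
    For the distance-indicator part alone, a Fisher-matrix forecast of $H_0$ and $\Omega_{k0}$ from siren luminosity distances can be sketched as below; the event count, redshift range and the 1% distance error are placeholders rather than the ET/Rubin specifications used in the paper, and the clustering and velocity information the paper adds is not included.

# Minimal Fisher forecast for sirens used purely as distance indicators:
# constrain H0 and Omega_k from luminosity distances d_L(z) of events with
# known electromagnetic redshifts.  Survey specs are illustrative placeholders.
import numpy as np
from scipy.integrate import quad

C = 299792.458                                # speed of light [km/s]

def d_L(z, H0, Ok, Om=0.31):
    """Luminosity distance [Mpc] in a (possibly curved) LCDM background."""
    Ol = 1.0 - Om - Ok
    E = lambda zp: np.sqrt(Om * (1 + zp)**3 + Ok * (1 + zp)**2 + Ol)
    Dc = (C / H0) * quad(lambda zp: 1.0 / E(zp), 0.0, z)[0]     # comoving distance
    if abs(Ok) < 1e-8:
        Dm = Dc
    elif Ok > 0:
        Dm = (C / H0) / np.sqrt(Ok) * np.sinh(np.sqrt(Ok) * Dc * H0 / C)
    else:
        Dm = (C / H0) / np.sqrt(-Ok) * np.sin(np.sqrt(-Ok) * Dc * H0 / C)
    return (1.0 + z) * Dm

# Toy survey: 1000 sirens uniform in 0 < z < 0.5 with 1% distance errors.
rng = np.random.default_rng(0)
zs = rng.uniform(0.01, 0.5, size=1000)
fid = dict(H0=70.0, Ok=0.0)
sigma = 0.01 * np.array([d_L(z, **fid) for z in zs])

def deriv(z, name, eps):
    """Central finite-difference derivative of d_L with respect to one parameter."""
    up, dn = dict(fid), dict(fid)
    up[name] += eps
    dn[name] -= eps
    return (d_L(z, **up) - d_L(z, **dn)) / (2 * eps)

params, steps = ["H0", "Ok"], [0.1, 0.01]
J = np.array([[deriv(z, p, e) for p, e in zip(params, steps)] for z in zs])
F = (J / sigma[:, None]).T @ (J / sigma[:, None])             # Fisher matrix
cov = np.linalg.inv(F)
for p, s in zip(params, np.sqrt(np.diag(cov))):
    print(f"marginalised sigma({p}) = {s:.3f}")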

    Measuring source properties and quasi-normal-mode frequencies of heavy massive black-hole binaries with LISA

    Full text link
    The Laser Interferometer Space Antenna (LISA) will be launched in the mid 2030s. It promises to observe the coalescence of massive black-hole (BH) binaries with signal-to-noise ratios (SNRs) reaching thousands. Crucially, it will detect some of these binaries with high SNR both in the inspiral and the merger-ringdown stages. Such signals are ideal for tests of General Relativity (GR) using information from the whole waveform. Here, we consider astrophysically motivated binary systems at the high-mass end of the population observable by LISA, and simulate their LISA signals using the newly developed parametrised, multipolar, aligned-spin effective-one-body model pSEOBNRv5HM. The merger-ringdown signal in this model depends on the binary properties (masses and spins), and also on parameters that describe fractional deviations from the GR quasi-normal-mode frequencies of the remnant BH. Performing full Bayesian analyses, we assess with what accuracy LISA will be able to constrain deviations from GR in the ringdown signal when using information from the whole signal. We find that these deviations can typically be constrained to within $10\%$ and in the best cases to within $1\%$. We also show that we can measure the binary masses and spins with great accuracy even for very massive BH systems with low SNR in the inspiral: individual source-frame masses can typically be constrained to within $10\%$ and as precisely as $1\%$, and individual spins can typically be constrained to within $0.1$ and as precisely as $0.001$. Finally, we probe the accuracy of the SEOBNRv5HM waveform family by performing synthetic injections of GR numerical-relativity waveforms. For the source parameters considered, we measure erroneous deviations from GR due to systematics in the waveform model. These results confirm the need for improving waveform models to perform tests of GR with binary BHs at high SNR with LISA. Comment: 15 pages, 18 with appendices, 19 figures
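
    The structure of such a parametrised ringdown test can be illustrated with a generic damped sinusoid for the dominant (l = m = 2, n = 0) quasi-normal mode, with fractional deviations applied to its GR frequency and damping time. The GR values below use the standard Berti-Cardoso-Will fits; this sketch is not the pSEOBNRv5HM model itself, and the source values are arbitrary.

# Generic parametrised ringdown: a damped sinusoid whose frequency and damping
# time are the GR 220-mode values times (1 + dfreq) and (1 + dtau).
import numpy as np

MSUN_S = 4.925490947e-6        # G*M_sun / c^3 in seconds

def qnm_220(mass_msun, spin):
    """GR frequency [Hz] and damping time [s] of the 220 mode (fits accurate to a few %)."""
    m_omega = 1.5251 - 1.1568 * (1.0 - spin)**0.1292      # dimensionless M*omega_R
    quality = 0.7000 + 1.4187 * (1.0 - spin)**-0.4990     # quality factor Q
    m_sec = mass_msun * MSUN_S
    f_gr = m_omega / (2.0 * np.pi * m_sec)
    tau_gr = 2.0 * quality / (m_omega / m_sec)
    return f_gr, tau_gr

def ringdown(t, mass_msun, spin, amp, phi0, dfreq=0.0, dtau=0.0):
    """Damped sinusoid with fractional deviations from the GR QNM values."""
    f_gr, tau_gr = qnm_220(mass_msun, spin)
    f, tau = f_gr * (1.0 + dfreq), tau_gr * (1.0 + dtau)
    return amp * np.exp(-t / tau) * np.cos(2.0 * np.pi * f * t + phi0)

# Example: a 3e6 Msun remnant with spin 0.7 (redshifted masses would be used in practice).
t = np.linspace(0.0, 2000.0, 4000)                             # seconds after merger
h_gr = ringdown(t, 3e6, 0.7, amp=1.0, phi0=0.0)
h_dev = ringdown(t, 3e6, 0.7, amp=1.0, phi0=0.0, dfreq=0.05)   # 5% shift in f_220
f_gr, tau_gr = qnm_220(3e6, 0.7)
print(f"GR 220 mode: f = {f_gr*1e3:.2f} mHz, tau = {tau_gr:.0f} s")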

    Observing GW190521-like binary black holes and their environment with LISA

    Get PDF
    Binaries of relatively massive black holes like GW190521 have been proposed to form in dense gas environments, such as the disks of Active Galactic Nuclei (AGNs), and they might be associated with transient electromagnetic counterparts. The interactions of this putative environment with the binary could leave a significant imprint at the low gravitational wave frequencies observable with the Laser Interferometer Space Antenna (LISA). We show that LISA will be able to detect up to ten GW190521-like black hole binaries, with sky position errors $\lesssim 1$ deg$^2$. Moreover, it will measure directly various effects due to the orbital motion around the supermassive black hole at the center of the AGN, especially the Doppler modulation and the Shapiro time delay. Thanks to a careful treatment of their frequency-domain signal, we were able to perform the full parameter estimation of Doppler- and Shapiro-modulated binaries as seen by LISA. We find that the Doppler and Shapiro effects will allow for measuring the AGN parameters (radius and inclination of the orbit around the AGN, central black hole mass) with up to percent-level precision. Properly modeling these low-frequency environmental effects is crucial to determine the binary formation history, as well as to avoid biases in the reconstruction of the source parameters and in tests of general relativity with gravitational waves.
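
    The two environmental imprints mentioned above can be sketched with the standard timing-delay expressions for a circular orbit around the central black hole: a Roemer delay, whose time derivative gives the Doppler modulation, and a Shapiro delay. The AGN parameters below are arbitrary examples, and this is not the paper's full frequency-domain treatment.

# Roemer (light-travel) and Shapiro delays for a GW source on a circular orbit
# around a supermassive black hole, applied to a quasi-monochromatic signal.
import numpy as np

G, C, MSUN = 6.674e-11, 2.998e8, 1.989e30                  # SI units

def timing_delays(t, m_smbh_msun, radius_m, incl, phase0=0.0):
    """Roemer and Shapiro delays [s] for a circular orbit of given radius."""
    gm = G * m_smbh_msun * MSUN
    period = 2.0 * np.pi * np.sqrt(radius_m**3 / gm)       # Kepler period of the outer orbit
    phi = 2.0 * np.pi * t / period + phase0                # orbital phase around the SMBH
    roemer = (radius_m / C) * np.sin(incl) * np.sin(phi)
    shapiro = -2.0 * (gm / C**3) * np.log(1.0 - np.sin(incl) * np.sin(phi))
    return roemer, shapiro

# Quasi-monochromatic GW190521-like binary at f0; the time-dependent delays
# Doppler-shift and distort its observed phase.
f0 = 0.03                                                  # GW frequency in the LISA band [Hz]
t = np.linspace(0.0, 2.0 * 3.15e7, 200_000)                # two years of observation
roemer, shapiro = timing_delays(t, m_smbh_msun=1e8, radius_m=3e13, incl=np.radians(60.0))
h_obs = np.cos(2.0 * np.pi * f0 * (t - roemer - shapiro))  # modulated strain (unit amplitude)
print(f"peak Roemer delay: {roemer.max():.0f} s, peak Shapiro delay: {shapiro.max():.0f} s")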

    Detecting the start of an influenza outbreak using exponentially weighted moving average charts

    Get PDF
    Background. Influenza viruses cause seasonal outbreaks in temperate climates, usually during winter and early spring, and are endemic in tropical climates. The severity and length of influenza outbreaks vary from year to year. Quick and reliable detection of the start of an outbreak is needed to promote public health measures. Methods. We propose the use of an exponentially weighted moving average (EWMA) control chart of laboratory confirmed influenza counts to detect the start and end of influenza outbreaks. Results. The chart is shown to provide timely signals in an example application with seven years of data from Victoria, Australia. Conclusions. The EWMA control chart could be applied in other applications to quickly detect influenza outbreaks.
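
    A minimal sketch of such an EWMA chart on weekly laboratory-confirmed counts is given below; the smoothing constant, control-limit width and simulated counts are illustrative placeholders, not the values calibrated on the Victorian data in the paper.

# EWMA control chart for weekly influenza counts: z_t = lam*x_t + (1-lam)*z_{t-1},
# with an upper control limit UCL_t = mu0 + L*sigma0*sqrt(lam/(2-lam)*(1-(1-lam)^(2t))).
import numpy as np

def ewma_chart(counts, lam=0.2, L=3.0, baseline_weeks=20):
    """Return the EWMA statistic, its upper control limit, and alarm flags."""
    counts = np.asarray(counts, dtype=float)
    mu0 = counts[:baseline_weeks].mean()           # in-control (non-outbreak) mean
    sigma0 = counts[:baseline_weeks].std(ddof=1)   # in-control standard deviation
    z = np.empty_like(counts)
    z[0] = mu0
    for t in range(1, len(counts)):
        z[t] = lam * counts[t] + (1.0 - lam) * z[t - 1]
    # Time-varying EWMA standard deviation (approaches its asymptotic value).
    var_factor = (lam / (2.0 - lam)) * (1.0 - (1.0 - lam) ** (2.0 * np.arange(1, len(counts) + 1)))
    ucl = mu0 + L * sigma0 * np.sqrt(var_factor)
    return z, ucl, z > ucl

# Example: low baseline counts followed by the start of an outbreak.
rng = np.random.default_rng(3)
weekly = np.concatenate([rng.poisson(5, 30), rng.poisson(40, 10)])
z, ucl, alarm = ewma_chart(weekly)
print("first signal at week", int(np.argmax(alarm)) if alarm.any() else None)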

    Challenges of keyword-based location disclosure

    Full text link
    A practical solution to location privacy should be incrementally deployable. We claim it should hence reconcile the economic value of location to aggregators, usually ignored by prior works, with a user's control over her information. Location information is indeed being collected and used by many mobile services to improve revenues, and this gives rise to a heated debate: privacy advocates ask for stricter regulation on information collection, while companies argue that it would jeopardize the thriving economy of the mobile web. We describe a system that gives users control over their information and does not degrade the data given to aggregators. Recognizing that the first challenge is to express locations in a way that is meaningful for advertisers and users, we propose a keyword-based design. Keywords characterize locations, let users inform the system about their sensitivity to disclosure, and build information directly usable by an advertiser's targeting campaign. Our work makes two main contributions: we design a market of location information based on keywords, and we analyze its robustness to attacks using data from ad networks, geo-located services, and cell networks. Categories and Subject Descriptors: Security and Privacy [Human and societal aspects of security and privacy]: Usability in security and privacy
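
    The keyword-based disclosure idea can be illustrated with a toy data structure in which each visited place carries user-weighted keywords and only those below the user's sensitivity threshold are released for targeting; all names and fields below are hypothetical, and the paper's actual market design is more involved.

# Purely illustrative sketch: a place is characterised by keywords, the user
# attaches a sensitivity level to each keyword, and only keywords at or below
# the chosen disclosure threshold are released to the ad network.
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class KeywordTag:
    keyword: str          # e.g. "coffee", "fitness", "oncology clinic"
    sensitivity: int      # user-assigned, 0 (public) .. 3 (never disclose)

@dataclass
class Visit:
    place_id: str
    tags: list[KeywordTag] = field(default_factory=list)

    def disclose(self, max_sensitivity: int) -> list[str]:
        """Keywords the user agrees to release for ad targeting."""
        return [t.keyword for t in self.tags if t.sensitivity <= max_sensitivity]

visit = Visit("place-42", [KeywordTag("coffee", 0),
                           KeywordTag("fitness", 1),
                           KeywordTag("oncology clinic", 3)])
print(visit.disclose(max_sensitivity=1))   # -> ['coffee', 'fitness']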