
    An Evaluation of Journaling File Systems

    Many statisticians would agree that, had it not been for systems, the synthesis of virtual machines might never have occurred. In fact, few systems engineers would disagree with the improvement of the location-identity split. We motivate an algorithm for the synthesis of compilers, which we call Nap

    A chemical survey of exoplanets with ARIEL

    Thousands of exoplanets have now been discovered with a huge range of masses, sizes and orbits: from rocky Earth-like planets to large gas giants grazing the surface of their host star. However, the essential nature of these exoplanets remains largely mysterious: there is no known, discernible pattern linking the presence, size, or orbital parameters of a planet to the nature of its parent star. We have little idea whether the chemistry of a planet is linked to its formation environment, or whether the type of host star drives the physics and chemistry of the planet’s birth and evolution. ARIEL was conceived to observe a large number (~1000) of transiting planets for statistical understanding, including gas giants, Neptunes, super-Earths and Earth-size planets around a range of host star types using transit spectroscopy in the 1.25–7.8 μm spectral range and multiple narrow-band photometry in the optical. ARIEL will focus on warm and hot planets to take advantage of their well-mixed atmospheres, which should show minimal condensation and sequestration of high-Z materials compared to their colder Solar System siblings. These warm and hot atmospheres are expected to be more representative of the planetary bulk composition. Observations of these warm/hot exoplanets, and in particular of their elemental composition (especially C, O, N, S, Si), will allow us to understand the early stages of planetary and atmospheric formation during the nebular phase and the following few million years. ARIEL will thus provide a representative picture of the chemical nature of the exoplanets and relate this directly to the type and chemical environment of the host star. ARIEL is designed as a dedicated survey mission for combined-light spectroscopy, capable of observing a large and well-defined planet sample within its 4-year mission lifetime. Transit, eclipse and phase-curve spectroscopy methods, whereby the signals from the star and the planet are differentiated using knowledge of the planetary ephemerides, allow us to measure atmospheric signals from the planet at levels of 10–100 parts per million (ppm) relative to the star and, given the bright nature of the targets, also allow more sophisticated techniques, such as eclipse mapping, to give deeper insight into the nature of the atmosphere. These types of observations require a stable payload and satellite platform with broad, instantaneous wavelength coverage to detect many molecular species, probe the thermal structure, identify clouds and monitor the stellar activity. The wavelength range proposed covers all the expected major atmospheric gases, e.g. H2O, CO2, CH4, NH3, HCN and H2S, through to the more exotic metallic compounds, such as TiO and VO, and condensed species. Simulations of ARIEL performance in conducting exoplanet surveys have been performed – using conservative estimates of mission performance and a full model of all significant noise sources in the measurement – on a list of potential ARIEL targets that incorporates the latest available exoplanet statistics. The conclusion at the end of the Phase A study is that ARIEL – in line with the stated mission objectives – will be able to observe about 1000 exoplanets, depending on the details of the adopted survey strategy, thus confirming the feasibility of the main science objectives.
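    As a rough illustration of the ppm-level signals quoted above, the strength of an atmospheric feature in transmission spectroscopy can be estimated from the planet's scale height. The sketch below is a back-of-the-envelope estimate under assumed hot-Jupiter parameters, not the ARIEL performance model.

```python
# Back-of-the-envelope estimate of a transmission-spectroscopy signal.
# Illustrative hot-Jupiter values assumed; not the ARIEL simulation pipeline.
import math

k_B = 1.380649e-23      # Boltzmann constant [J/K]
G = 6.674e-11           # gravitational constant [m^3 kg^-1 s^-2]
m_H = 1.6726e-27        # proton mass [kg]
R_jup, M_jup = 6.9911e7, 1.898e27   # Jupiter radius [m], mass [kg]
R_sun = 6.957e8                     # solar radius [m]

# Assumed planet/star parameters (typical hot Jupiter around a Sun-like star)
R_p, M_p, R_s = 1.2 * R_jup, 0.8 * M_jup, 1.0 * R_sun
T_eq = 1500.0           # equilibrium temperature [K]
mu = 2.3 * m_H          # mean molecular weight of an H2/He atmosphere

g = G * M_p / R_p**2                 # surface gravity
H = k_B * T_eq / (mu * g)            # atmospheric scale height
depth = (R_p / R_s)**2               # geometric transit depth
n = 5                                # opaque annulus of ~n scale heights
signal = 2 * n * R_p * H / R_s**2    # extra depth from the absorbing species

print(f"transit depth  : {depth*1e6:.0f} ppm")
print(f"scale height   : {H/1e3:.0f} km")
# Hundreds of ppm for this inflated hot Jupiter; smaller or cooler planets
# fall toward the 10-100 ppm level quoted in the abstract.
print(f"feature signal : {signal*1e6:.0f} ppm")
```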

    On understanding multi-instrument Rosetta data of the innermost dust and gas coma of comet 67P/Churyumov-Gerasimenko - results, strengths, and limitations of models

    Numerical models are powerful tools for understanding the connection between the gas and dust emitted from the surface of comets and their subsequent expansion into space, where remote sensing instruments can perform measurements. We will present such a predictive model, which can provide synthetic measurements for multiple instruments on board ESA's Rosetta mission to comet 67P/Churyumov-Gerasimenko (hereafter 67P). We will demonstrate why a multi-instrument approach is essential and how models can be used to constrain the gas and dust source distribution on the surface.

    Search for gravitational-lensing signatures in the full third observing run of the LIGO-Virgo network

    Gravitational lensing by massive objects along the line of sight to the source causes distortions of gravitational-wave signals; such distortions may reveal information about fundamental physics, cosmology and astrophysics. In this work, we have extended the search for lensing signatures to all binary black hole events from the third observing run of the LIGO-Virgo network. We search for repeated signals from strong lensing by 1) performing targeted searches for subthreshold signals, 2) calculating the degree of overlap amongst the intrinsic parameters and sky locations of pairs of signals, 3) comparing the similarities of the spectrograms amongst pairs of signals, and 4) performing dual-signal Bayesian analysis that takes into account selection effects and astrophysical knowledge. We also search for distortions to the gravitational waveform caused by 1) frequency-independent phase shifts in strongly lensed images, and 2) frequency-dependent modulation of the amplitude and phase due to point masses. None of these searches yields significant evidence for lensing. Finally, we use the non-detection of gravitational-wave lensing to constrain the lensing rate based on the latest merger-rate estimates and the fraction of dark matter composed of compact objects.
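    One of the waveform distortions mentioned above, the frequency-independent phase shift acquired by a strongly lensed image, has a compact form: an image of Morse index n is the original frequency-domain waveform rescaled by a magnification and rotated by a constant phase of n·π/2. The snippet below is a minimal sketch of that relation acting on a toy waveform; the function name and toy chirp are illustrative, not the analysis code used in the search.

```python
# Minimal sketch: apply magnification and the frequency-independent Morse
# phase shift (n = 0, 1, 2 for Type I/II/III images) to a toy frequency-domain
# waveform. Not the LIGO-Virgo analysis code.
import numpy as np

def lensed_image(h_f, magnification=2.0, morse_index=1):
    """Magnify and phase-rotate a frequency-domain waveform (positive frequencies)."""
    return np.sqrt(magnification) * h_f * np.exp(-1j * morse_index * np.pi / 2)

# Toy frequency-domain "chirp" standing in for a binary black hole signal
f = np.linspace(20.0, 512.0, 2048)                     # Hz
h_plus = f**(-7.0 / 6.0) * np.exp(1j * 2 * np.pi * f * 0.01)

h_type2 = lensed_image(h_plus, magnification=2.0, morse_index=1)

# Amplitude ratio is constant and the phase offset is pi/2 at every frequency
print(np.allclose(np.abs(h_type2) / np.abs(h_plus), np.sqrt(2.0)))
print(np.allclose(np.angle(h_type2 / h_plus), -np.pi / 2))
```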

    Search for eccentric black hole coalescences during the third observing run of LIGO and Virgo

    Despite the growing number of confident binary black hole coalescences observed through gravitational waves so far, the astrophysical origin of these binaries remains uncertain. Orbital eccentricity is one of the clearest tracers of binary formation channels. Identifying binary eccentricity, however, remains challenging due to the limited availability of gravitational waveforms that include effects of eccentricity. Here, we present observational results for a waveform-independent search sensitive to eccentric black hole coalescences, covering the third observing run (O3) of the LIGO and Virgo detectors. We identified no new high-significance candidates beyond those that were already identified with searches focusing on quasi-circular binaries. We determine the sensitivity of our search to high-mass (total mass M > 70 M⊙) binaries covering eccentricities up to 0.3 at 15 Hz orbital frequency, and use this to compare model predictions to search results. Assuming all detections are indeed quasi-circular, for our fiducial population model we place an upper limit on the merger rate density of high-mass binaries with eccentricities 0 < e ≤ 0.3 of 0.33 Gpc⁻³ yr⁻¹ at the 90% confidence level.
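    A rate upper limit of this kind is related, under standard assumptions, to the surveyed sensitive volume–time ⟨VT⟩ through Poisson counting statistics for zero detections. The sketch below shows that relation; the ⟨VT⟩ value is purely illustrative and not a figure from the paper.

```python
# Poisson zero-detection upper limit on a merger rate density: with zero events
# in a sensitive volume-time <VT>, solving P(0 events | R) = 1 - CL gives
# R_CL = -ln(1 - CL) / <VT>.  The <VT> below is hypothetical, not from the search.
import math

def rate_upper_limit(vt_gpc3_yr, confidence=0.90):
    """Upper limit on the rate density [Gpc^-3 yr^-1] for zero detections."""
    return -math.log(1.0 - confidence) / vt_gpc3_yr

vt = 7.0  # assumed sensitive volume-time in Gpc^3 yr (illustrative)
print(f"R_90 = {rate_upper_limit(vt):.2f} Gpc^-3 yr^-1")  # ~0.33 for <VT> ~ 7
```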

    Ultralight vector dark matter search using data from the KAGRA O3GK run

    Among the various candidates for dark matter (DM), ultralight vector DM can be probed by laser-interferometric gravitational-wave detectors through the measurement of oscillating length changes in the arm cavities. In this context, KAGRA has a unique feature: the differing compositions of its mirrors enhance the vector-DM signal in the length changes of the auxiliary channels. Here we present the result of a search for U(1)B−L gauge boson DM using KAGRA data from the auxiliary length channels during its first joint observation run with GEO600. By applying our search pipeline, which takes into account the stochastic nature of ultralight DM, we obtain upper bounds on the coupling strength between the U(1)B−L gauge boson and ordinary matter for a range of DM masses. While our constraints are less stringent than those derived from previous experiments, this study demonstrates that our method is applicable to searches for lower-mass vector DM, which are made difficult in this measurement by the short observation time compared to the auto-correlation time scale of the DM field.
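    The DM mass range accessible to such a search maps onto the detector's frequency band through the Compton relation, and the "stochastic nature" mentioned above stems from the finite coherence time of the DM field. The sketch below is a textbook estimate of those two quantities, not the KAGRA pipeline.

```python
# An ultralight DM field oscillates at its Compton frequency f = m c^2 / h and
# stays coherent only for roughly 1 / (f * (v/c)^2), with v ~ 1e-3 c the typical
# galactic velocity dispersion (~1e6 oscillation periods). Textbook estimate.
h_planck = 6.62607015e-34   # Planck constant [J s]
eV = 1.602176634e-19        # 1 eV in joules
v_over_c = 1e-3             # assumed DM velocity dispersion relative to c

def compton_frequency(mass_eV):
    """Oscillation frequency [Hz] of a DM field with mass given in eV/c^2."""
    return mass_eV * eV / h_planck

def coherence_time(mass_eV):
    """Approximate coherence (auto-correlation) time [s] of the DM field."""
    return 1.0 / (compton_frequency(mass_eV) * v_over_c**2)

for m in (1e-15, 1e-14, 1e-13):     # lower-mass end of the interferometer band
    f = compton_frequency(m)
    tau = coherence_time(m)
    print(f"m = {m:.0e} eV  ->  f = {f:7.3f} Hz,  coherence time ~ {tau/86400:5.1f} days")
# For the lowest masses the coherence time exceeds a short observing run,
# which is why short runs limit the lower-mass search described above.
```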

    Data from "Surveys that prioritize site number over time per site will result in better gastropod status assessments: a case study on the rediscovery of Big Black Rocksnail"

    Freshwater gastropods are among the most imperiled organisms on earth. Yet, they are among the most understudied freshwater taxa. Many freshwater gastropod species have gone extinct in the last 100 years, but recent rediscoveries indicate that inadequate survey work has resulted in some species being prematurely declared extinct. Such premature declarations of extinction can remove legal protections for species, which could then actually cause extinction. Thus, research and policy recommendations are needed so that surveys provide the best information possible for conservation. Here, we examined the case of Lithasia hubrichti, a freshwater gastropod endemic to the Big Black River in Mississippi that was last seen in 1965. In 2022, a freshwater mollusk survey resulted in finding L. hubrichti alive. An additional survey effort in 2023 that prioritized sampling as many sites as possible in a single day clarified the current range of L. hubrichti. Genomic analyses indicated that the species has persisted with a large population size for thousands of years, rather than ever falling below a survey detection limit. When considering the case of L. hubrichti and other recent freshwater gastropod rediscoveries, we conclude that freshwater gastropod surveys should emphasize sampling as many sites as possible when targeting rare species, rather than expending high sampling effort at a small number of sites. We also advocate for policies that encourage partnerships with landowners, which were required to rediscover L. hubrichti.
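    The survey-design recommendation above has a simple probabilistic rationale: for a patchily distributed species, spreading a fixed amount of effort across many sites raises the chance of at least one detection faster than concentrating effort at a few sites. The sketch below illustrates this with an assumed occupancy/detection model; the parameter values are hypothetical and not estimates from the study.

```python
# Toy comparison of survey designs with the same total effort:
# many sites with brief visits vs. few sites with long visits.
# psi = probability a site is occupied (assumed), p1 = per-unit-effort detection
# probability at an occupied site (assumed). Not values from the L. hubrichti study.
def prob_detect(n_sites, effort_per_site, psi=0.2, p1=0.3):
    """Probability of detecting the species at least once across the survey."""
    p_site = psi * (1 - (1 - p1) ** effort_per_site)   # detection at one site
    return 1 - (1 - p_site) ** n_sites                 # at least one site overall

total_effort = 24  # e.g. person-hours available in one field day
for n_sites in (2, 6, 12, 24):
    effort = total_effort // n_sites
    print(f"{n_sites:2d} sites x {effort:2d} effort units: "
          f"P(detection) = {prob_detect(n_sites, effort):.2f}")
# Detection probability rises as the same effort is spread over more sites.
```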

    Multi-instrument Rosetta data and model comparison for the innermost coma of 67P for the period around equinox (May 2015)

    From August 2014 to September 2016 ESA's Rosetta spacecraft escorted comet 67P/Churyumov-Gerasimenko (hereafter 67P) on its journey into the inner solar system and out again. The mission provides, via various dust and gas instruments, unprecedented data on the nature of cometary activity. The determination of the activity distribution on the surface of a comet is a key goal of any mission to investigate the interaction of the comet with the Sun. As the ice sublimates, the gas expands into space and fills the near-nucleus environment. Individual sources of activity have been observed on the surface, but it remains uncertain where the bulk of the mass is lost and how the processes involved work in detail. There are several reasons for this. First, imaging experiments use the dust as a proxy for the gas activity. Because the optical depth of the dust is orders of magnitude below 1 in all but a few cases, it is not possible to trace dust filaments back to the source against the backdrop of the illuminated surface. Second, remote sensing instruments detecting gas emission (i.e. infrared and sub-mm spectrometers) may suffer from limited spatial and temporal resolution. In addition, the spectral lines may be optically thick, and the line-of-sight direction usually cuts through an inhomogeneous coma (in density or temperature), which further complicates their interpretation considerably. However, as we will show, with good a priori estimates of coma structures, spectral lines can be accurately inverted to provide constraints on the gas coma down to a few hundred meters above the surface (e.g. MIRO). The in-situ instruments (e.g. ROSINA or GIADA) must consider possible biases due to the spacecraft position relative to the nucleus and the respective illumination conditions on the surface. For instance, the frequent use of terminator orbits by Rosetta introduced a significant problem because the measured local densities are at points remote from what we assume to be the main direction of outflow, namely near the sunward direction. In addition, possible inhomogeneities of the outgassing at the surface cannot be detected because the rapid gas expansion smooths the coma. Therefore, measurements taken tens of kilometers above the nucleus surface are rather insensitive and provide only ambiguous results. The difficulties described above show the need for predictive models that can reproduce multiple measurements in one self-consistent framework. We will present results from our study of diverse Rosetta data sets (including OSIRIS, VIRTIS, MIRO, and ROSINA), constraining the gas emission into the coma and establishing whether the data enable us to reach appropriate conclusions on the activity distribution on the nucleus surface. The models can be used on the one hand to constrain certain properties of the activity, and on the other hand they provide clues on the limits of the interpretation of some of the available datasets. We focus here on the time around May 2015 (equinox). While this period is a few months prior to perihelion, the spacecraft was close to the comet, providing a relatively high spatial resolution for the remote sensing observations such that, in principle, they can be more easily linked with the in-situ measurements.
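    The point that in-situ densities measured tens of kilometres from the nucleus are only weakly sensitive to surface inhomogeneity can be illustrated with the simplest free-expansion picture: for a steady, radial outflow the local number density falls as Q/(4πr²v), so contrasts imprinted at the surface are strongly diluted by the spacecraft distance. The sketch below is a spherically symmetric toy estimate with assumed 67P-like numbers, not the multi-instrument coma model discussed above.

```python
# Spherically symmetric free-expansion estimate of the local gas number density
# n(r) = Q / (4 pi r^2 v) for a steady production rate Q and outflow speed v.
# Toy numbers for 67P-like activity; not the numerical coma model of the study.
import math

Q = 1e27        # assumed global water production rate [molecules/s]
v = 700.0       # assumed gas outflow speed [m/s]

def number_density(r_km):
    """Gas number density [m^-3] at cometocentric distance r [km]."""
    r = r_km * 1e3
    return Q / (4.0 * math.pi * r**2 * v)

for r_km in (2, 10, 30, 100):   # nucleus radius ~2 km; Rosetta often at 10-100 km
    print(f"r = {r_km:3d} km  ->  n ~ {number_density(r_km):.2e} m^-3")
# The steep 1/r^2 fall-off (and lateral expansion, not modelled here) is why
# surface outgassing patterns are hard to recover from distant in-situ data.
```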