
    Entanglement between a telecom photon and an on-demand multimode solid-state quantum memory

    Entanglement between photons at telecommunication wavelengths and long-lived quantum memories is one of the fundamental requirements of long-distance quantum communication. Quantum memories featuring on-demand read-out and multimode operation are additional precious assets that will benefit the communication rate. In this work we report the first demonstration of entanglement between a telecom photon and a collective spin excitation in a multimode solid-state quantum memory. Photon pairs are generated through widely non-degenerate parametric down-conversion, featuring energy-time entanglement between the telecom-wavelength idler and a visible signal photon. The latter is stored in a Pr³⁺:Y₂SiO₅ crystal as a spin wave using the full Atomic Frequency Comb scheme. We then recall the stored signal photon and analyze the entanglement using the Franson scheme. We measure conditional fidelities of 92(2)% for excited-state storage, enough to violate a CHSH inequality, and 77(2)% for spin-wave storage. Taking advantage of the on-demand read-out from the spin state, we extend the entanglement storage in the quantum memory for up to 47.7 μs, which could allow for the distribution of entanglement between quantum nodes separated by distances of up to 10 km.
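The quoted fidelities can be related to the CHSH bound under a simple, commonly used Werner-state model (an assumption on our part, not stated in the abstract): the state violates CHSH only if its Bell-state fidelity exceeds roughly 0.78, which is why 92(2)% suffices while 77(2)% does not. A minimal sketch:

```python
# Illustrative check under a Werner-state model (not from the paper):
# rho = V * |Phi+><Phi+| + (1 - V) * I/4 gives CHSH parameter S = 2*sqrt(2)*V,
# so violation (S > 2) requires visibility V > 1/sqrt(2), i.e. a Bell-state
# fidelity F = (3*V + 1)/4 above roughly 0.78.
import math

def chsh_parameter(fidelity):
    """CHSH S for a Werner state with the given Bell-state fidelity."""
    visibility = (4 * fidelity - 1) / 3
    return 2 * math.sqrt(2) * visibility

f_threshold = (3 / math.sqrt(2) + 1) / 4   # ~0.78

print(round(f_threshold, 3))      # 0.78
print(chsh_parameter(0.92) > 2)   # excited-state storage: True
print(chsh_parameter(0.77) > 2)   # spin-wave storage: False
```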

    Simulations of Hot Bubbles in the ICM

    We review the general properties of the intracluster medium (ICM) in clusters that host a cooling flow, and in particular the effects on the ICM of the injection of hot plasma by a powerful active galactic nucleus (AGN). It is observed that, in some cases, the hot plasma produces cavities in the ICM that finally detach and rise, perhaps buoyantly. The gas dynamics induced by the rising bubbles can help explain the absence of a cooled gas component in clusters with a cooling flow. This scenario is explored using numerical simulations.
    Comment: 13 pages, no figures. Accepted for publication in Modern Physics Letters

    On the origin of radio-loudness in AGNs and its relationship with the properties of the central supermassive black hole

    We investigate the relationship between the mass of central supermassive black holes and the radio loudness of active galactic nuclei. We use the most recent calibrations to derive virial black hole masses for samples of radio-loud QSOs for which relatively small masses (M_BH<10^8 M_sun) have been estimated in the literature. We take into account the effect of radiation pressure on the BLR, which reduces the effective gravitational potential experienced by the broad-line clouds and affects the mass estimates of bright quasars. We show that in well-defined samples of nearby low-luminosity AGNs, QSOs and AGNs from the SDSS, radio-loud (RL) AGNs invariably host SMBHs exceeding ~10^8 M_sun. On the other hand, radio-quiet (RQ) AGNs are associated with a much larger range of black hole masses. The overall result still holds even without correcting the BH mass estimates for the effects of radiation pressure. We present a conjecture based on these results, which aims at explaining the origin of radio-loudness in terms of two fundamental parameters: the spin of the black hole and the black hole mass. We speculate that in order to produce a RL AGN both of the following requirements must be satisfied: 1) the black hole mass M_BH has to be larger than ~10^8 M_sun; 2) the spin of the BH must be significant, in order to satisfy theoretical requirements. Taking into account the most recent observations, we envisage a scenario in which the merger history of the host galaxy plays a fundamental role in accounting for both the properties of the AGN and the galaxy morphology, which in our picture are strictly linked. RL sources might be obtained only through major dry mergers involving BHs of large mass, which would give rise to both the core morphology and the significant black hole spin needed.
    Comment: 11 pages, 4 figures, accepted for publication in MNRAS

    The Mediterranean Ocean Forecasting System

    The Mediterranean Forecasting System (MFS) has been operational since 2000 and is continuously improved within international projects. The system is part of the Mediterranean Operational Oceanography Network (MOON), and MFS is coordinated and operated by the Italian Group of Operational Oceanography (GNOO). The latest upgrades and integrations to MFS have been undertaken in the EU MERSEA and BOSS4GMES projects. Since October 2005, ten-day forecasts have been produced daily, as well as 15 days of analyses once a week. The daily forecast and weekly analysis data are available in real time to users through a dedicated ftp service, and every day a bulletin is published on the web site (http://gnoo.bo.ingv.it/mfs). A continuous near-real-time evaluation of the forecasts and analyses produced by MFS has been developed in order to continuously verify the system and to provide useful information to users. The R&D is focused on different aspects of the system. A new basin-scale ocean model nested in the operational MERCATOR global model has been developed and run operationally in real time for a test period, together with a new assimilation scheme based on 3DVAR; this system is now under evaluation. Important activities have been carried out to: implement and test Bayesian ensemble and super-ensemble methodologies for the Mediterranean Sea; produce 20 years of re-analysis; re-formulate the air-sea flux bulk formulae; and develop dedicated products to support particular requests of end users, such as indicators, real-time oil spill forecasting, and search & rescue.
    EUROGOOS and European Commission. Published: Exeter, UK. Operational oceanography for risk assessment in marine areas.
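The super-ensemble methodology mentioned above combines several models' forecasts by regressing them against past analyses and then applying the fitted weights to new forecasts. The sketch below is a minimal two-model illustration of that idea with made-up numbers, not the operational GNOO/MFS code; the toy "truth" is constructed as the equal-weight blend, so the fit recovers weights of 0.5 each.

```python
# Minimal super-ensemble sketch: fit least-squares weights for two model
# forecasts against analyses from a training period, then combine a new
# pair of forecasts. All data below are hypothetical.

def superensemble_weights(forecasts, truth):
    """Ordinary least-squares weights for two models (closed-form 2x2 solve).

    forecasts: list of (f1, f2) pairs from the training period
    truth:     matching analysis values
    """
    a11 = sum(f1 * f1 for f1, _ in forecasts)
    a12 = sum(f1 * f2 for f1, f2 in forecasts)
    a22 = sum(f2 * f2 for _, f2 in forecasts)
    b1 = sum(f1 * t for (f1, _), t in zip(forecasts, truth))
    b2 = sum(f2 * t for (_, f2), t in zip(forecasts, truth))
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

# Toy training period: truth is exactly the equal blend of the two models.
train = [(20.0, 18.0), (22.0, 21.0), (19.0, 18.5), (24.0, 20.0)]
analyses = [19.0, 21.5, 18.75, 22.0]

w1, w2 = superensemble_weights(train, analyses)
combined = w1 * 21.0 + w2 * 19.7   # weighted forecast for a new day
```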

    WARP liquid argon detector for dark matter survey

    The WARP programme is a graded programme intended to search for cold dark matter in the form of WIMPs. These particles may produce, via weak interactions, nuclear recoils in the energy range 10-100 keV. A cryogenic noble liquid like argon, already used in the realization of very large detectors, permits the simultaneous detection of both the ionisation and the scintillation induced by an interaction, suggesting the possibility of discriminating between nuclear recoils and electron-mediated events. A 2.3 litre two-phase argon detector prototype has been used to perform several tests of the proposed technique. The next step is the construction of a device with a 100 litre sensitive volume, with a potential sensitivity a factor of 100 better than presently existing experiments.
    Comment: Proceedings of the 6th UCLA Symposium on Sources and Detection of Dark Matter and Dark Energy in the Universe

    Genetic determinants of complement activation in the general population

    Complement is a fundamental component of the innate immune response, and its alterations are associated with severe systemic diseases. To illuminate complement's genetic underpinnings, we conduct genome-wide association studies of the functional activity of the classical (CP), lectin (LP), and alternative (AP) complement pathways in the Cooperative Health Research in South Tyrol study (n = 4,990). We identify seven loci, encompassing 13 independent, pathway-specific variants located in or near complement genes (CFHR4, C7, C2, MBL2) and non-complement genes (PDE3A, TNXB, ABO), explaining up to 74% of the complement pathways' genetic heritability and implicating long-range haplotypes associated with LP at MBL2. Two-sample Mendelian randomization analyses, supported by transcriptome- and proteome-wide colocalization, confirm known causal pathways, establish within-complement feedback loops, and implicate causality of ABO on LP and of CFHR2 and C7 on AP. LP causally influences collectin-11 and KAAG1 levels and the risk of mouth ulcers. These results build a comprehensive resource to investigate the role of complement in human health.
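The two-sample Mendelian randomization step typically combines per-variant Wald ratios (outcome effect divided by exposure effect) with inverse-variance weights. The sketch below shows that standard IVW calculation on toy summary statistics; it is not the study's pipeline, and all numbers are hypothetical.

```python
# Inverse-variance-weighted (IVW) two-sample MR estimate from summary
# statistics. Toy numbers only; not data from the study.
import math

def ivw_mr(beta_exp, beta_out, se_out):
    """IVW causal-effect estimate and its standard error.

    beta_exp : per-variant effects on the exposure
    beta_out : per-variant effects on the outcome
    se_out   : standard errors of the outcome effects
    """
    weights = [bx * bx / (s * s) for bx, s in zip(beta_exp, se_out)]
    ratios = [by / bx for bx, by in zip(beta_exp, beta_out)]
    effect = sum(w * r for w, r in zip(weights, ratios)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return effect, se

# Three hypothetical variants instrumenting a complement pathway's activity.
beta_exposure = [0.30, 0.25, 0.40]
beta_outcome = [0.060, 0.055, 0.078]
se_outcome = [0.010, 0.012, 0.009]

effect, se = ivw_mr(beta_exposure, beta_outcome, se_outcome)
```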

    SNP Prioritization Using a Bayesian Probability of Association

    Prioritization is the process whereby a set of possible candidate genes or SNPs is ranked so that the most promising can be taken forward into further studies. In a genome-wide association study, prioritization is usually based on the P-values alone, but researchers sometimes take account of external annotation information about the SNPs, such as whether a SNP lies close to a good candidate gene. Using external information in this way is inherently subjective and is often not formalized, making the analysis difficult to reproduce. Building on previous work that has identified 14 important types of external information, we present an approximate Bayesian analysis that produces an estimate of the probability of association. The calculation combines four sources of information: the genome-wide data, SNP information derived from bioinformatics databases, empirical SNP weights, and the researchers' subjective prior opinions. The calculation is fast enough that it can be applied to millions of SNPs and, although it does rely on subjective judgments, those judgments are made explicit so that the final SNP selection can be reproduced. We show that the resulting probability of association is intuitively more appealing than the P-value because it is easier to interpret and it makes allowance for the power of the study. We illustrate the use of the probability of association for SNP prioritization by applying it to a meta-analysis of kidney function genome-wide association studies and demonstrate that SNP selection performs better using the probability of association than using P-values alone.
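One standard route from a GWAS summary statistic plus an annotation-informed prior to a probability of association is a Wakefield-style approximate Bayes factor. The sketch below shows that calculation; the paper's method additionally folds in the empirical SNP weights and prior elicitation, which are reduced here to a single prior-odds number, and all inputs are illustrative.

```python
# Posterior probability of association for one SNP via an approximate
# Bayes factor: beta_hat ~ N(beta, V) with prior beta ~ N(0, W) under H1.
import math

def prob_association(z, var_beta, prior_var, prior_odds):
    """z: association z-score; var_beta: sampling variance V of the effect
    estimate; prior_var: prior variance W under H1; prior_odds: prior odds
    of association (e.g. annotation-informed)."""
    r = prior_var / (prior_var + var_beta)
    bf_10 = math.sqrt(1.0 - r) * math.exp(0.5 * z * z * r)  # BF for H1 vs H0
    post_odds = prior_odds * bf_10
    return post_odds / (1.0 + post_odds)

# A strong signal (z = 5) with a 1-in-10,000 prior is probably real ...
p_hit = prob_association(z=5.0, var_beta=0.01, prior_var=0.1, prior_odds=1e-4)
# ... while a modest one (z = 3) with the same prior is probably not.
p_weak = prob_association(z=3.0, var_beta=0.01, prior_var=0.1, prior_odds=1e-4)
```

Unlike a P-value, the output changes with the prior and with the study's power (through V), which is exactly the behaviour the abstract argues for.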

    XIPE: the X-ray Imaging Polarimetry Explorer

    X-ray polarimetry, sometimes alone, and sometimes coupled to spectral and temporal variability measurements and to imaging, allows a wealth of physical phenomena in astrophysics to be studied. X-ray polarimetry probes acceleration processes, for example those typical of magnetic reconnection in solar flares, but also emission in the strong magnetic fields of neutron stars and white dwarfs. It detects scattering in asymmetric structures such as accretion disks and columns, and in the so-called molecular torus and ionization cones. In addition, it allows fundamental physics to be probed in regimes of gravity and of magnetic field intensity not accessible to experiments on Earth. Finally, models that describe fundamental interactions (e.g. quantum gravity and extensions of the Standard Model) can be tested. We describe in this paper the X-ray Imaging Polarimetry Explorer (XIPE), proposed in June 2012 to the first ESA call for a small mission with a launch in 2017, but not selected. XIPE is composed of two of the three existing JET-X telescopes with two Gas Pixel Detectors (GPD) filled with a He-DME mixture at their focus, and two additional GPDs filled with pressurized Ar-DME facing the Sun. The Minimum Detectable Polarization is 14% at 1 mCrab in 10^5 s (2-10 keV) and 0.6% for an X10-class flare. The Half Energy Width, measured at the PANTER X-ray test facility (MPE, Germany) with the JET-X optics, is 24 arcsec. XIPE takes advantage of a low-Earth equatorial orbit with Malindi as down-link station and a Mission Operation Center (MOC) at INPE (Brazil).
    Comment: 49 pages, 14 figures, 6 tables. Paper published in Experimental Astronomy http://link.springer.com/journal/1068
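Sensitivity figures like the quoted 14% follow from the standard counting-statistics formula for the 99%-confidence Minimum Detectable Polarization, MDP99 = 4.29 / (μ R_s) · sqrt((R_s + R_b) / T), where μ is the modulation factor, R_s and R_b the source and background rates, and T the exposure. The concrete values below (μ = 0.3, 0.1 counts/s for 1 mCrab, negligible background) are our own illustrative assumptions, not XIPE's published response; they happen to reproduce the quoted number.

```python
# Minimum Detectable Polarization at 99% confidence from counting statistics.
import math

def mdp99(mu, rate_src, rate_bkg, exposure):
    """mu: modulation factor; rate_src/rate_bkg: source/background count
    rates (counts/s); exposure: observation time (s)."""
    return 4.29 / (mu * rate_src) * math.sqrt((rate_src + rate_bkg) / exposure)

# Hypothetical GPD-like values: mu = 0.3, 1 mCrab -> 0.1 counts/s (2-10 keV),
# negligible background, 1e5 s exposure.
mdp = mdp99(mu=0.3, rate_src=0.1, rate_bkg=0.0, exposure=1e5)
print(f"MDP99 ≈ {100 * mdp:.0f}%")   # prints "MDP99 ≈ 14%"
```

Note the 1/sqrt(T) scaling: quadrupling the exposure halves the MDP.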

    A prototype for the real-time analysis of the Cherenkov Telescope Array

    The Cherenkov Telescope Array (CTA) observatory will be one of the biggest ground-based very-high-energy (VHE) γ-ray observatories. CTA will achieve a factor of 10 improvement in sensitivity, from some tens of GeV to beyond 100 TeV, with respect to existing telescopes. The CTA observatory will be capable of issuing alerts on variable and transient sources to maximize the scientific return. To capture these phenomena during their evolution and for effective communication to the astrophysical community, speed is crucial. This requires a system with a reliable automated trigger that can issue alerts immediately upon detection of γ-ray flares. This will be accomplished by means of a Real-Time Analysis (RTA) pipeline, a key system of the CTA observatory. The latency and sensitivity requirements of the alarm system pose a challenge because of the anticipated large data rate, between 0.5 and 8 GB/s. As a consequence, substantial efforts toward the optimization of a high-throughput computing service are envisioned. For these reasons our working group has started the development of a prototype of the Real-Time Analysis pipeline. The main goals of this prototype are to test: (i) a set of frameworks and design patterns useful for the inter-process communication between software processes running in memory; (ii) the sustainability of the foreseen CTA data rate in terms of data throughput with different hardware (e.g. accelerators) and software configurations; (iii) the reuse of non-real-time algorithms, or how much we need to simplify algorithms to be compliant with CTA requirements; (iv) interface issues between the different CTA systems. In this work we focus on goals (i) and (ii).
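Goals (i) and (ii) can be illustrated with a toy producer/consumer benchmark: one side streams fixed-size "camera event" buffers through a bounded queue while the other measures the sustained throughput against the 0.5-8 GB/s target. This is a hypothetical sketch, not the CTA prototype code; Python threads and queue.Queue stand in for the actual inter-process transport, and the event size is an assumption.

```python
# Toy throughput benchmark for a producer/consumer event stream.
import queue
import threading
import time

EVENT_SIZE = 64 * 1024   # hypothetical 64 KiB event payload
N_EVENTS = 2000

def producer(q):
    payload = bytes(EVENT_SIZE)
    for _ in range(N_EVENTS):
        q.put(payload)
    q.put(None)           # sentinel: end of stream

def consumer(q, result):
    received = 0
    start = time.perf_counter()
    while (item := q.get()) is not None:
        received += len(item)
    result["seconds"] = time.perf_counter() - start
    result["bytes"] = received

q = queue.Queue(maxsize=256)   # bounded: applies back-pressure to the producer
result = {}
t1 = threading.Thread(target=producer, args=(q,))
t2 = threading.Thread(target=consumer, args=(q, result))
t1.start(); t2.start(); t1.join(); t2.join()

throughput_gbs = result["bytes"] / result["seconds"] / 1e9
```

The bounded queue is the key design choice: it exercises back-pressure, which is what makes sustained-rate measurements meaningful for a pipeline that must not drop data.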