
    Identification and Removal of Noise Modes in Kepler Photometry

    We present the Transiting Exoearth Robust Reduction Algorithm (TERRA) --- a novel framework for identifying and removing instrumental noise in Kepler photometry. We identify instrumental noise modes by finding common trends in a large ensemble of light curves drawn from the entire Kepler field of view. Strategically, these noise modes can be optimized to reveal transits having a specified range of timescales. For Kepler target stars of low photometric noise, TERRA produces ensemble-calibrated photometry having 33 ppm RMS scatter in 12-hour bins, rendering individual transits of Earth-size planets around Sun-like stars detectable as ~3 sigma signals. Comment: 18 pages, 7 figures, submitted to PASP
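
    The ensemble step described here amounts, in spirit, to finding the dominant common trends across many light curves and projecting them out of each target. A minimal sketch of that idea (not the actual TERRA code; the array layout, normalization, and number of retained modes are assumptions for illustration) could look like:

```python
import numpy as np

def remove_common_modes(flux, n_modes=4):
    """Subtract the strongest ensemble trends from a set of light curves.

    flux    : (n_stars, n_cadences) array of normalized, zero-mean flux
    n_modes : number of common modes to remove (illustrative choice)
    """
    # The leading right singular vectors of the ensemble are the trends
    # shared by many stars, i.e. candidate instrumental noise modes.
    _, _, vt = np.linalg.svd(flux, full_matrices=False)
    modes = vt[:n_modes]                           # (n_modes, n_cadences)

    # Least-squares fit of each light curve against those modes, then
    # subtract the fitted systematic component.
    coeffs, *_ = np.linalg.lstsq(modes.T, flux.T, rcond=None)
    return flux - (modes.T @ coeffs).T

# Synthetic check: 500 light curves sharing one injected trend.
rng = np.random.default_rng(0)
trend = 5e-4 * np.sin(np.linspace(0, 20, 1000))
flux = 1e-4 * rng.standard_normal((500, 1000)) + trend
print(flux.std(), remove_common_modes(flux).std())   # scatter drops after cotrending
```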

    Quantum rejection sampling

    Rejection sampling is a well-known method to sample from a target distribution, given the ability to sample from a given distribution. The method was first formalized by von Neumann (1951) and has many applications in classical computing. We define a quantum analogue of rejection sampling: given a black box producing a coherent superposition of (possibly unknown) quantum states with some amplitudes, the problem is to prepare a coherent superposition of the same states, albeit with different target amplitudes. The main result of this paper is a tight characterization of the query complexity of this quantum state generation problem. We exhibit an algorithm, which we call quantum rejection sampling, and analyze its cost using semidefinite programming. Our proof of a matching lower bound is based on the automorphism principle, which allows one to symmetrize any algorithm over the automorphism group of the problem. Our main technical innovation is an extension of the automorphism principle to continuous groups that arise for quantum state generation problems where the oracle encodes unknown quantum states, instead of just classical data. Furthermore, we illustrate how quantum rejection sampling may be used as a primitive in designing quantum algorithms, by providing three different applications. We first show that it was implicitly used in the quantum algorithm for linear systems of equations by Harrow, Hassidim and Lloyd. Secondly, we show that it can be used to speed up the main step in the quantum Metropolis sampling algorithm by Temme et al. Finally, we derive a new quantum algorithm for the hidden shift problem of an arbitrary Boolean function and relate its query complexity to "water-filling" of the Fourier spectrum. Comment: 19 pages, 5 figures, minor changes and a more compact style (to appear in the proceedings of ITCS 2012)
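
    For readers who want the classical primitive this work generalizes, a minimal sketch of von Neumann-style rejection sampling (purely classical and illustrative only, not the quantum algorithm defined in the paper) is:

```python
import math
import random

def rejection_sample(target_pdf, proposal_pdf, proposal_draw, m):
    """Draw one sample from target_pdf using draws from a proposal distribution.

    Requires target_pdf(x) <= m * proposal_pdf(x) for all x; the expected
    number of proposal draws per accepted sample is m.
    """
    while True:
        x = proposal_draw()
        # Accept x with probability target(x) / (m * proposal(x)).
        if random.random() < target_pdf(x) / (m * proposal_pdf(x)):
            return x

# Example: sample a standard normal by rejection from a standard Cauchy
# (the envelope constant 1.6 exceeds the true maximum ratio of about 1.52).
normal = lambda x: math.exp(-x * x / 2) / math.sqrt(2 * math.pi)
cauchy = lambda x: 1.0 / (math.pi * (1.0 + x * x))
cauchy_draw = lambda: math.tan(math.pi * (random.random() - 0.5))

samples = [rejection_sample(normal, cauchy, cauchy_draw, m=1.6) for _ in range(1000)]
```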

    Information-theoretic interpretation of quantum error-correcting codes

    Quantum error-correcting codes are analyzed from an information-theoretic perspective centered on quantum conditional and mutual entropies. This approach parallels the description of classical error correction in Shannon theory, while clarifying the differences between classical and quantum codes. More specifically, it is shown how quantum information theory accounts for the fact that "redundant" information can be distributed over quantum bits even though this does not violate the quantum "no-cloning" theorem. Such a remarkable feature, which has no counterpart for classical codes, is related to the property that the ternary mutual entropy vanishes for a tripartite system in a pure state. This information-theoretic description of quantum coding is used to derive the quantum analogue of the Singleton bound on the number of logical bits that can be preserved by a code of fixed length that can recover a given number of errors. Comment: 14 pages, RevTeX, 8 Postscript figures. Added appendix. To appear in Phys. Rev. A
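
    For context, the bound referred to in the last sentence is the quantum Singleton bound; in the now-standard notation for an [[n, k, d]] code (stated here from general knowledge rather than quoted from the abstract), it reads:

```latex
% Quantum Singleton bound for an [[n, k, d]] code:
k \le n - 2(d - 1)
% Since correcting t arbitrary errors requires distance d \ge 2t + 1,
% a t-error-correcting quantum code must satisfy
n - k \ge 4t
% compared with n - k \ge 2t for a classical t-error-correcting code.
```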

    Initial Characteristics of Kepler Short Cadence Data

    The Kepler Mission offers two options for observations -- either Long Cadence (LC), used for the bulk of core mission science, or Short Cadence (SC), which is used for applications such as asteroseismology of solar-like stars and transit timing measurements of exoplanets where the 1-minute sampling is critical. We discuss the characteristics of SC data obtained in the 33.5-day-long Quarter 1 (Q1) observations with Kepler, which concluded on 15 June 2009. The time series precisions are excellent and nearly Poisson limited at 11th magnitude, providing per-point measurement errors of 200 parts-per-million per minute. For extremely saturated stars near 7th magnitude, precisions of 40 ppm are reached, while for background-limited measurements at 17th magnitude, precisions of 7 mmag are maintained. We note the presence of two additive artifacts: one that generates regularly spaced peaks in frequency, and one that involves additive offsets in the time domain inversely proportional to stellar brightness. The difference between LC and SC sampling is illustrated for transit observations of TrES-2. Comment: 5 pages, 4 figures, ApJ Letters, in press
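
    The "nearly Poisson limited" statement is simple shot-noise arithmetic: the fractional precision floor is 1/sqrt(N) for N detected photons. A small sketch of that scaling (the photon count below is inferred from the quoted 200 ppm figure, not taken from the paper):

```python
import math

def poisson_limited_ppm(photons_per_cadence):
    """Shot-noise precision floor, in parts per million, for one cadence."""
    return 1e6 / math.sqrt(photons_per_cadence)

# Roughly 2.5e7 detected photons per 1-minute cadence reproduces the
# quoted ~200 ppm per-point scatter at 11th magnitude.
print(poisson_limited_ppm(2.5e7))                     # ~200 ppm

# Binning M independent cadences tightens the precision as 1/sqrt(M).
print(poisson_limited_ppm(2.5e7) / math.sqrt(30))     # ~37 ppm in 30-minute bins
```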

    Non-adaptive Measurement-based Quantum Computation and Multi-party Bell Inequalities

    Quantum correlations exhibit behaviour that cannot be resolved with a local hidden variable picture of the world. In quantum information, they are also used as resources for information processing tasks, such as Measurement-based Quantum Computation (MQC). In MQC, universal quantum computation can be achieved via adaptive measurements on a suitable entangled resource state. In this paper, we look at a version of MQC in which we remove the adaptivity of measurements and aim to understand what computational abilities still remain in the resource. We show that there are explicit connections between this model of computation and the question of non-classicality in quantum correlations. We demonstrate this by focussing on deterministic computation of Boolean functions, in which natural generalisations of the Greenberger-Horne-Zeilinger (GHZ) paradox emerge; we then explore probabilistic computation, via which multipartite Bell inequalities can be defined. We use this correspondence to define families of multi-party Bell inequalities, which we show to have a number of interesting contrasting properties. Comment: 13 pages, 4 figures, final version accepted for publication
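
    The GHZ paradox that these deterministic computations generalise can be verified directly with a few Kronecker products. A short numerical sketch (the textbook three-qubit GHZ correlations, not the paper's specific construction):

```python
import numpy as np

# Pauli operators and the three-qubit GHZ state (|000> + |111>)/sqrt(2).
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
ghz = np.zeros(8, dtype=complex)
ghz[0] = ghz[7] = 1 / np.sqrt(2)

def kron3(a, b, c):
    return np.kron(np.kron(a, b), c)

for ops, label in [((X, X, X), "XXX"), ((X, Y, Y), "XYY"),
                   ((Y, X, Y), "YXY"), ((Y, Y, X), "YYX")]:
    value = np.real(ghz.conj() @ kron3(*ops) @ ghz)
    print(label, round(value))

# Output: XXX -> +1 while XYY, YXY, YYX -> -1. No assignment of fixed +/-1
# values to each local X and Y measurement can reproduce all four signs
# (their product would have to be both +1 and -1), which is the GHZ paradox.
```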

    Observations of Ultraluminous Infrared Galaxies with the Infrared Spectrograph on the Spitzer Space Telescope: Early Results on Mrk 1014, Mrk 463, and UGC 5101

    We present spectra taken with the Infrared Spectrograph on Spitzer covering the 5-38 micron region of three Ultraluminous Infrared Galaxies (ULIRGs): Mrk 1014 (z=0.163), Mrk 463 (z=0.051), and UGC 5101 (z=0.039). The continua of UGC 5101 and Mrk 463 show strong silicate absorption, suggesting significant optical depths to the nuclei at 10 microns. UGC 5101 also shows the clear presence of water ice in absorption. PAH emission features are seen in both Mrk 1014 and UGC 5101, including the 16.4 micron line in UGC 5101. The fine structure lines are consistent with dominant AGN power sources in both Mrk 1014 and Mrk 463. In UGC 5101 we detect the [NeV] 14.3 micron emission line, providing the first direct evidence for a buried AGN in the mid-infrared. The detection of the 9.66 micron and 17.03 micron H_2 emission lines in both UGC 5101 and Mrk 463 suggests that warm molecular gas accounts for 22% and 48%, respectively, of the total molecular gas masses in these galaxies. Comment: Accepted in ApJ Sup. Spitzer Special Issue, 4 pages, 3 figures

    The Spitzer Space Telescope Mission

    The Spitzer Space Telescope, NASA's Great Observatory for infrared astronomy, was launched on 2003 August 25 and is returning excellent scientific data from its Earth-trailing solar orbit. Spitzer combines the intrinsic sensitivity achievable with a cryogenic telescope in space with the great imaging and spectroscopic power of modern detector arrays to provide the user community with huge gains in capability for exploration of the cosmos in the infrared. The observatory systems are largely performing as expected, and the projected cryogenic lifetime is in excess of 5 years. This paper summarizes the on-orbit scientific, technical, and operational performance of Spitzer. Subsequent papers in this special issue describe the Spitzer instruments in detail and highlight many of the exciting scientific results obtained during the first six months of the Spitzer mission. Comment: Accepted for publication in the Astrophysical Journal Supplement Spitzer Special Issue, 22 pages, 3 figures. Higher resolution versions of the figures are available at http://ssc.spitzer.caltech.edu/pubs/journal2004.htm

    Palomar observations of the R impact of comet Shoemaker-Levy 9: II. Spectra

    We present mid-infrared spectroscopic observations from Palomar Observatory of the impact of fragment R of comet P/Shoemaker-Levy 9 with Jupiter on 21 July 1994. Low-resolution 8–13 μm spectra taken near the peak of the lightcurve show a broad emission feature that resembles the silicate feature commonly seen in comets and the interstellar medium. We use this feature to estimate the dust content of the impact plume. The overall infrared spectral energy distribution at the time of peak brightness is consistent with emission from an optically thin layer of small particles at ~600 K. Integrating over the spectrum and the lightcurve, we obtain a total radiated energy from the R impact of ≥ 2 × 10^(25) ergs and a plume mass of ≥ 3 × 10^(13) g.
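
    A hedged sketch of the bookkeeping behind the last sentence, integrating a peak spectrum over wavelength and the lightcurve over time to convert observed flux into a total radiated energy (the function, units, and any numbers plugged into it are illustrative placeholders, not the Palomar measurements):

```python
import numpy as np

def total_radiated_energy(wavelengths_um, flux_density, times_s, lightcurve, distance_cm):
    """Estimate the total radiated energy, in erg, of a transient source.

    wavelengths_um : wavelength grid of the spectrum at peak (micron)
    flux_density   : F_lambda at peak (erg s^-1 cm^-2 micron^-1)
    times_s        : time grid of the lightcurve (s)
    lightcurve     : lightcurve normalized to 1 at peak
    distance_cm    : distance to the source (cm)
    """
    # Band-integrated flux at peak (erg s^-1 cm^-2) ...
    peak_flux = np.trapz(flux_density, wavelengths_um)
    # ... scaled by the lightcurve's effective duration (s) ...
    effective_duration = np.trapz(lightcurve, times_s)
    # ... and by 4*pi*d^2, assuming isotropic emission.
    return 4.0 * np.pi * distance_cm**2 * peak_flux * effective_duration
```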

    Kepler Presearch Data Conditioning II - A Bayesian Approach to Systematic Error Correction

    With the unprecedented photometric precision of the Kepler spacecraft, significant systematic and stochastic errors on transit signal levels are observable in the Kepler photometric data. These errors, which include discontinuities, outliers, systematic trends, and other instrumental signatures, obscure astrophysical signals. The Presearch Data Conditioning (PDC) module of the Kepler data analysis pipeline tries to remove these errors while preserving planet transits and other astrophysically interesting signals. The completely new noise and stellar variability regime observed in Kepler data poses a significant problem to standard cotrending methods such as SYSREM and TFA. Variable stars are often of particular astrophysical interest, so the preservation of their signals is of significant importance to the astrophysical community. We present a Bayesian Maximum A Posteriori (MAP) approach in which a subset of highly correlated and quiet stars is used to generate a cotrending basis vector set, which is in turn used to establish a range of "reasonable" robust fit parameters. These robust fit parameters are then used to generate a Bayesian prior and a Bayesian posterior probability distribution function (PDF) which, when maximized, finds the best fit that simultaneously removes systematic effects while reducing the signal distortion and noise injection that commonly afflict simple least-squares (LS) fitting. A numerical and empirical approach is taken in which the Bayesian prior PDFs are generated from fits to the light curve distributions themselves. Comment: 43 pages, 21 figures. Submitted for publication in PASP. Also see the companion paper "Kepler Presearch Data Conditioning I - Architecture and Algorithms for Error Correction in Kepler Light Curves" by Martin C. Stumpe et al.
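
    A minimal sketch of the MAP step described here, with a Gaussian prior on the cotrending coefficients pulling a least-squares fit toward ensemble-typical values (the actual pipeline builds its priors empirically from the robust-fit distributions; the Gaussian form, basis vectors, and weights below are simplifying assumptions):

```python
import numpy as np

def map_cotrend(flux, basis, prior_mean, prior_sigma, noise_sigma):
    """MAP fit of cotrending basis vectors to one light curve.

    flux        : (n_cadences,) light curve to correct
    basis       : (n_vectors, n_cadences) cotrending basis vectors
    prior_mean  : (n_vectors,) typical coefficients, e.g. from robust fits
                  to quiet, highly correlated neighbouring stars
    prior_sigma : (n_vectors,) prior widths on those coefficients
    noise_sigma : scalar per-cadence noise estimate
    Returns the light curve with the fitted systematics removed.
    """
    # With a Gaussian likelihood and Gaussian prior, maximizing the posterior
    # reduces to a ridge-regression-like linear solve: the prior keeps the
    # coefficients near ensemble-typical values, which suppresses the signal
    # distortion and noise injection of an unconstrained LS fit.
    A = basis @ basis.T / noise_sigma**2 + np.diag(1.0 / prior_sigma**2)
    b = basis @ flux / noise_sigma**2 + prior_mean / prior_sigma**2
    coeffs = np.linalg.solve(A, b)
    return flux - coeffs @ basis
```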