
    Identification and Removal of Noise Modes in Kepler Photometry

    We present the Transiting Exoearth Robust Reduction Algorithm (TERRA) --- a novel framework for identifying and removing instrumental noise in Kepler photometry. We identify instrumental noise modes by finding common trends in a large ensemble of light curves drawn from the entire Kepler field of view. Strategically, these noise modes can be optimized to reveal transits having a specified range of timescales. For Kepler target stars of low photometric noise, TERRA produces ensemble-calibrated photometry having 33 ppm RMS scatter in 12-hour bins, rendering individual transits of earth-size planets around sun-like stars detectable as ~3 sigma signals. Comment: 18 pages, 7 figures, submitted to PASP
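    The ensemble-calibration idea can be sketched in a few lines of Python. This is a simplified PCA-style stand-in, not TERRA itself: the array shapes, the SVD-based mode extraction, and the choice of four modes are all illustrative assumptions.

        # Sketch: remove trends common to an ensemble of light curves via SVD.
        # 'fluxes' is a hypothetical (n_stars, n_cadences) array of normalized,
        # median-subtracted light curves drawn from across the field of view.
        import numpy as np

        def remove_noise_modes(fluxes, n_modes=4):
            # The leading right singular vectors capture the strongest trends
            # shared by many stars, i.e. candidate instrumental noise modes.
            _, _, Vt = np.linalg.svd(fluxes, full_matrices=False)
            modes = Vt[:n_modes]                      # (n_modes, n_cadences)
            # Fit each light curve against the modes, subtract the fit.
            coeffs, *_ = np.linalg.lstsq(modes.T, fluxes.T, rcond=None)
            return fluxes - (modes.T @ coeffs).T

    Fitting and subtracting only a handful of ensemble-wide modes is what lets a scheme of this kind suppress field-wide systematics without erasing the transit signal of any single star.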

    Not Just a Theory—The Utility of Mathematical Models in Evolutionary Biology

    Models have made numerous contributions to evolutionary biology, but misunderstandings persist regarding their purpose. By formally testing the logic of verbal hypotheses, proof-of-concept models clarify thinking, uncover hidden assumptions, and spur new directions of study.

    Initial Characteristics of Kepler Short Cadence Data

    The Kepler Mission offers two options for observations -- either Long Cadence (LC), used for the bulk of core mission science, or Short Cadence (SC), used for applications such as asteroseismology of solar-like stars and transit timing measurements of exoplanets, where the 1-minute sampling is critical. We discuss the characteristics of SC data obtained in the 33.5-day-long Quarter 1 (Q1) observations with Kepler, which completed on 15 June 2009. The time-series precision is excellent and nearly Poisson-limited at 11th magnitude, providing per-point measurement errors of 200 parts-per-million per minute. For extremely saturated stars near 7th magnitude, precisions of 40 ppm are reached, while for background-limited measurements at 17th magnitude, precisions of 7 mmag are maintained. We note the presence of two additive artifacts: one that generates regularly spaced peaks in frequency, and one that involves additive offsets in the time domain inversely proportional to stellar brightness. The difference between LC and SC sampling is illustrated for transit observations of TrES-2. Comment: 5 pages, 4 figures, ApJ Letters in press
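    As a quick sanity check on the quoted Poisson limit (illustrative arithmetic, not from the paper): a Poisson-limited fractional precision sigma = 1/sqrt(N) of 200 ppm per 1-minute sample implies N = (1/200e-6)^2, about 2.5e7 detected photoelectrons per cadence.

        # Photon count implied by a 200 ppm Poisson-limited 1-minute sample.
        n_photons = (1 / 200e-6) ** 2
        print(f"{n_photons:.2e}")  # ~2.50e+07 photoelectrons per cadence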

    Non-adaptive Measurement-based Quantum Computation and Multi-party Bell Inequalities

    Quantum correlations exhibit behaviour that cannot be resolved with a local hidden variable picture of the world. In quantum information, they are also used as resources for information processing tasks, such as Measurement-based Quantum Computation (MQC). In MQC, universal quantum computation can be achieved via adaptive measurements on a suitable entangled resource state. In this paper, we look at a version of MQC in which we remove the adaptivity of measurements and aim to understand what computational abilities remain in the resource. We show that there are explicit connections between this model of computation and the question of non-classicality in quantum correlations. We demonstrate this by focusing on deterministic computation of Boolean functions, in which natural generalisations of the Greenberger-Horne-Zeilinger (GHZ) paradox emerge; we then explore probabilistic computation, via which multipartite Bell inequalities can be defined. We use this correspondence to define families of multi-party Bell inequalities, which we show to have a number of interesting contrasting properties. Comment: 13 pages, 4 figures, final version accepted for publication
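    The deterministic side of this correspondence can be illustrated with the textbook GHZ example (a special case; the paper's construction is more general): with non-adaptive X/Y measurements on a GHZ state and only XOR side-processing, the outcome parity deterministically computes the non-linear function OR. The simulation below is a sketch; the input encoding and qubit ordering are illustrative choices.

        import numpy as np

        X = np.array([[0, 1], [1, 0]], dtype=complex)
        Y = np.array([[0, -1j], [1j, 0]])
        ghz = np.zeros(8, dtype=complex)
        ghz[0] = ghz[7] = 1 / np.sqrt(2)           # (|000> + |111>)/sqrt(2)

        def setting(bit):                          # measure X for 0, Y for 1
            return X if bit == 0 else Y

        for a in (0, 1):
            for b in (0, 1):
                c = a ^ b                          # third party's input bit
                O = np.kron(setting(a), np.kron(setting(b), setting(c)))
                ev = np.real(ghz.conj() @ O @ ghz) # GHZ is a +/-1 eigenstate
                parity = round((1 - ev) / 2)       # product +1 -> 0, -1 -> 1
                assert parity == (a | b)           # XOR of outcomes = OR(a, b)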

    Information-theoretic interpretation of quantum error-correcting codes

    Quantum error-correcting codes are analyzed from an information-theoretic perspective centered on quantum conditional and mutual entropies. This approach parallels the description of classical error correction in Shannon theory, while clarifying the differences between classical and quantum codes. More specifically, it is shown how quantum information theory accounts for the fact that "redundant" information can be distributed over quantum bits even though this does not violate the quantum "no-cloning" theorem. This remarkable feature, which has no counterpart for classical codes, is related to the property that the ternary mutual entropy vanishes for a tripartite system in a pure state. This information-theoretic description of quantum coding is used to derive the quantum analogue of the Singleton bound on the number of logical bits that can be preserved by a code of fixed length that can recover a given number of errors. Comment: 14 pages RevTeX, 8 Postscript figures. Added appendix. To appear in Phys. Rev. A
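    For reference, the bound in question is the quantum analogue of the classical Singleton bound; in standard stabilizer-code notation (a paraphrase in modern notation, not the paper's own statement) it reads:

        % An [[n, k, d]] quantum code obeys the quantum Singleton bound
        \[ n - k \;\ge\; 2(d - 1), \]
        % and since correcting e arbitrary errors requires distance d >= 2e + 1,
        % the number of protected logical qubits satisfies
        \[ k \;\le\; n - 4e. \]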

    Kepler Presearch Data Conditioning I - Architecture and Algorithms for Error Correction in Kepler Light Curves

    Kepler provides light curves of 156,000 stars with unprecedented precision. However, the raw data as they come from the spacecraft contain significant systematic and stochastic errors. These errors, which include discontinuities, systematic trends, and outliers, obscure the astrophysical signals in the light curves. Correcting these errors is the task of the Presearch Data Conditioning (PDC) module of the Kepler data analysis pipeline. The original version of PDC did not meet the extremely high performance requirements for the detection of minuscule planet transits or highly accurate analysis of stellar activity and rotation. One particular deficiency was that astrophysical features were often removed as a side effect of error removal. In this paper we introduce the completely new and significantly improved version of PDC implemented in Kepler SOC 8.0. This new PDC version, which utilizes a Bayesian approach to the removal of systematics, reliably corrects errors in the light curves while preserving planet transits and other astrophysically interesting signals. We describe the architecture and the algorithms of this new PDC module, show typical errors encountered in Kepler data, and illustrate the corrections using real light curve examples. Comment: Submitted to PASP. Also see companion paper "Kepler Presearch Data Conditioning II - A Bayesian Approach to Systematic Error Correction" by Jeff C. Smith et al.
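    Two of the error types named here, outliers and discontinuities, can be sketched with simple stand-ins (illustrative only; the pipeline's actual algorithms are more sophisticated, and the window size, threshold, and break index below are hypothetical):

        import numpy as np
        from scipy.signal import medfilt

        def clip_outliers(flux, window=49, n_sigma=5.0):
            # Flag points deviating strongly from a running-median trend
            # and replace them with the local trend value.
            trend = medfilt(flux, kernel_size=window)
            resid = flux - trend
            bad = np.abs(resid) > n_sigma * np.std(resid)
            return np.where(bad, trend, flux)

        def level_step(flux, i_break):
            # Remove a discontinuity by matching the median flux level on
            # either side of a known break cadence 'i_break'.
            offset = np.median(flux[i_break:]) - np.median(flux[:i_break])
            out = flux.copy()
            out[i_break:] -= offset
            return out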

    Spitzer Infrared Spectrograph Observations of M, L, and T Dwarfs

    We present the first mid-infrared spectra of brown dwarfs, together with observations of a low-mass star. Our targets are the M3.5 dwarf GJ 1001A, the L8 dwarf DENIS-P J0255-4700, and the T1/T6 binary system epsilon Indi Ba/Bb. As expected, the mid-infrared spectral morphology of these objects changes rapidly with spectral class due to the changes in atmospheric chemistry resulting from their differing effective temperatures and atmospheric structures. By taking advantage of the unprecedented sensitivity of the Infrared Spectrograph on the Spitzer Space Telescope, we have detected the 7.8 micron methane and 10 micron ammonia bands for the first time in brown dwarf spectra. Comment: 4 pages, 2 figures

    Kepler Presearch Data Conditioning II - A Bayesian Approach to Systematic Error Correction

    With the unprecedented photometric precision of the Kepler spacecraft, significant systematic and stochastic errors on transit signal levels are observable in the Kepler photometric data. These errors, which include discontinuities, outliers, systematic trends, and other instrumental signatures, obscure astrophysical signals. The Presearch Data Conditioning (PDC) module of the Kepler data analysis pipeline tries to remove these errors while preserving planet transits and other astrophysically interesting signals. The completely new noise and stellar variability regime observed in Kepler data poses a significant problem to standard cotrending methods such as SYSREM and TFA. Variable stars are often of particular astrophysical interest, so the preservation of their signals is of significant importance to the astrophysical community. We present a Bayesian Maximum A Posteriori (MAP) approach in which a subset of highly correlated and quiet stars is used to generate a cotrending basis vector set, which is in turn used to establish a range of "reasonable" robust-fit parameters. These robust-fit parameters are then used to generate a Bayesian prior and a Bayesian posterior probability distribution function (PDF) which, when maximized, finds the best fit that simultaneously removes systematic effects while reducing the signal distortion and noise injection that commonly afflict simple least-squares (LS) fitting. A numerical and empirical approach is taken in which the Bayesian prior PDFs are generated from fits to the light curve distributions themselves. Comment: 43 pages, 21 figures. Submitted for publication in PASP. Also see companion paper "Kepler Presearch Data Conditioning I - Architecture and Algorithms for Error Correction in Kepler Light Curves" by Martin C. Stumpe, et al.
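    Under fully Gaussian assumptions, MAP cotrending of this kind reduces to a ridge-style regularized fit with a closed-form maximizer. The sketch below is that simplification, not the pipeline's empirical-prior implementation; 'B' is a cotrending basis from the quiet-star ensemble, and 'mu' and 'prior_var' stand in for the ensemble-derived prior.

        import numpy as np

        def map_cotrend(y, B, mu, prior_var, noise_var):
            # Maximize the Gaussian posterior over basis coefficients c:
            #   c_map = argmin ||y - B c||^2 / noise_var + (c-mu)' Lam (c-mu)
            Lam = np.diag(1.0 / prior_var)        # prior precision
            A = B.T @ B / noise_var + Lam         # posterior precision
            b = B.T @ y / noise_var + Lam @ mu
            c_map = np.linalg.solve(A, b)         # maximizes the posterior PDF
            return y - B @ c_map                  # systematics-removed flux

    A tight prior (small prior_var) pins the fit near the ensemble-derived coefficients, which is how regularization of this form limits the signal distortion and noise injection that an unconstrained least-squares fit would introduce.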