
    The Kepler DB, a Database Management System for Arrays, Sparse Arrays and Binary Data

    The Kepler Science Operations Center stores values for approximately six million pixels, collected every 30 minutes, as well as data products that are generated as a result of running the Kepler science processing pipeline. The Kepler Database (Kepler DB) management system was created to act as the repository of this information. After one year of flight usage, Kepler DB is managing 3 TiB of data and is expected to grow to over 10 TiB over the course of the mission. Kepler DB is a non-relational, transactional database where data are represented as one-dimensional arrays, sparse arrays, or binary large objects. We discuss Kepler DB's APIs, implementation, usage, and deployment at the Kepler Science Operations Center.
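The sparse-array data type described above can be pictured as a minimal sketch. This is purely illustrative, assuming a typical (index, value)-pair representation with a fill value for gaps; the class and method names are hypothetical, not the actual Kepler DB API.

```python
# Illustrative sketch of a sparse 1-D array value type (hypothetical names,
# not the real Kepler DB API): non-fill entries are stored as index->value
# pairs, and a dense view is reconstructed on demand.
from dataclasses import dataclass, field

@dataclass
class SparseArray:
    """A sparse one-dimensional array: explicit entries plus a fill value."""
    length: int
    fill: float = 0.0
    entries: dict = field(default_factory=dict)

    def set(self, i: int, v: float) -> None:
        if not 0 <= i < self.length:
            raise IndexError(i)
        self.entries[i] = v

    def to_dense(self) -> list:
        # Gaps (e.g. cadences with no valid sample) take the fill value.
        return [self.entries.get(i, self.fill) for i in range(self.length)]

# Two samples stored in a five-cadence series; the rest are gaps.
ts = SparseArray(length=5, fill=0.0)
ts.set(1, 42.0)
ts.set(3, 7.5)
dense = ts.to_dense()  # [0.0, 42.0, 0.0, 7.5, 0.0]
```

Only the populated entries consume storage, which matters when most cadences of a long time series are gapped.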

    The Kepler Science Operations Center Pipeline Framework Extensions

    The Kepler Science Operations Center (SOC) is responsible for several aspects of the Kepler Mission, including managing targets, generating on-board data compression tables, monitoring photometer health and status, processing the science data, and exporting the pipeline products to the mission archive. We describe how the generic pipeline framework software developed for Kepler is extended to achieve these goals, including pipeline configurations for processing science data and other support roles, and custom unit-of-work generators that control how the Kepler data are partitioned and distributed across the computing cluster. We describe the interface between the Java software that manages the retrieval and storage of the data for a given unit of work and the MATLAB algorithms that process those data. The data for each unit of work are packaged into a single file that contains everything needed by the science algorithms, allowing these files to be used to debug and evolve the algorithms offline.
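The single-file packaging of a unit of work can be sketched as follows. This is a toy illustration of the idea only, assuming a JSON container and made-up field names; the real SOC interface between Java and MATLAB is not shown here.

```python
# Toy sketch of the unit-of-work packaging idea (hypothetical format and
# field names): everything an algorithm needs is written to one file, so the
# same file can be re-run offline to debug or evolve the algorithm.
import json
import os
import tempfile

def write_unit_of_work(path, target_ids, cadence_range, pixels):
    # One self-contained input file per unit of work.
    blob = {"targets": target_ids, "cadences": cadence_range, "pixels": pixels}
    with open(path, "w") as f:
        json.dump(blob, f)

def process_unit_of_work(path):
    # The algorithm reads only this file -- no live database connection --
    # which is what makes offline replay possible.
    with open(path) as f:
        blob = json.load(f)
    # Stand-in "science algorithm": mean flux per target.
    return {t: sum(v) / len(v) for t, v in blob["pixels"].items()}

path = os.path.join(tempfile.mkdtemp(), "uow-0001.json")
write_unit_of_work(path, [101, 102], [0, 1439],
                   {"101": [1.0, 2.0], "102": [3.0, 5.0]})
result = process_unit_of_work(path)  # {"101": 1.5, "102": 4.0}
```

Because the file is the complete contract between the framework and the algorithm, a failing unit of work can be copied off the cluster and replayed on a workstation.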

    Photometer Performance Assessment in Kepler Science Data Processing

    This paper describes the algorithms of the Photometer Performance Assessment (PPA) software component in the science data processing pipeline of the Kepler mission. The PPA performs two tasks: the first is to analyze the health and performance of the Kepler photometer based on the long-cadence science data downlinked via Ka band approximately every 30 days; the second is to determine the attitude of the Kepler spacecraft with high precision at each long cadence. The PPA component is demonstrated to work effectively with the Kepler flight data.

    Data Validation in the Kepler Science Operations Center Pipeline

    We present an overview of the Data Validation (DV) software component and its context within the Kepler Science Operations Center (SOC) pipeline and the overall Kepler Science mission. The SOC pipeline performs a transiting planet search on the corrected light curves for over 150,000 targets across the focal plane array. We discuss the DV strategy for automated validation of Threshold Crossing Events (TCEs) generated in the transiting planet search. For each TCE, a transiting planet model is fitted to the target light curve. A multiple planet search is conducted by repeating the transiting planet search on the residual light curve after the model flux has been removed; if an additional detection occurs, a planet model is fitted to the new TCE. A suite of automated tests is performed after all planet candidates have been identified. We describe a centroid motion test to determine the significance of the motion of the target photocenter during transit and to estimate the coordinates of the transit source within the photometric aperture; a series of eclipsing binary discrimination tests on the parameters of the planet model fits to all transits and to the sequences of odd and even transits; and a statistical bootstrap to assess the likelihood that the TCE would have been generated purely by chance given the target light curve with all transits removed.
    Keywords: photometry, data validation, Kepler, Earth-size planet
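The odd/even discrimination idea mentioned above can be sketched with a simplified statistic. This is not the DV implementation, just an illustration of the principle: an eclipsing binary detected at half its true period shows alternating primary and secondary eclipse depths, so a significant odd-versus-even depth difference flags a likely false positive.

```python
# Simplified sketch (not the actual DV test): compare the weighted-mean depth
# of odd-numbered transits against even-numbered ones; a large z-like
# statistic suggests an eclipsing binary rather than a planet.
import math

def odd_even_depth_test(depths, errs):
    """Return |mean_odd - mean_even| in units of its combined uncertainty."""
    pairs = list(zip(depths, errs))
    odd = pairs[0::2]
    even = pairs[1::2]

    def weighted_mean(p):
        w = [1.0 / e**2 for _, e in p]
        m = sum(d * wi for (d, _), wi in zip(p, w)) / sum(w)
        return m, 1.0 / math.sqrt(sum(w))

    m_odd, s_odd = weighted_mean(odd)
    m_even, s_even = weighted_mean(even)
    return abs(m_odd - m_even) / math.hypot(s_odd, s_even)

# Consistent depths (planet-like) vs. alternating depths (binary-like),
# depths and errors in parts per million:
z_planet = odd_even_depth_test([100.0, 101.0, 99.0, 100.0], [5.0] * 4)
z_binary = odd_even_depth_test([100.0, 300.0, 101.0, 299.0], [5.0] * 4)
```

The planet-like sequence yields a statistic well under one sigma, while the alternating sequence is discrepant at tens of sigma.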

    Detection of Potential Transit Signals in the First Three Quarters of Kepler Mission Data

    We present the results of a search for potential transit signals in the first three quarters of photometry data acquired by the Kepler Mission. The targets of the search include 151,722 stars which were observed over the full interval and an additional 19,132 stars which were observed for only 1 or 2 quarters. From this set of targets we find a total of 5,392 detections which meet the Kepler detection criteria: periodicity of the signal, an acceptable signal-to-noise ratio, and a composition test which rejects spurious detections containing non-physical combinations of events. The detected signals are dominated by events with relatively low signal-to-noise ratio and by events with relatively short periods. The distribution of estimated transit depths appears to peak in the range between 40 and 100 parts per million, with a few detections down to fewer than 10 parts per million. The detected signals are compared to a set of known transit events in the Kepler field of view which were derived by a different method using a longer data interval; the comparison shows that the current search correctly identified 88.1% of the known events. A tabulation of the detected transit signals, examples which illustrate the analysis and detection process, a discussion of future plans, and open, potentially fruitful areas of further research are included.
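The signal-to-noise criterion above rests on a simple scaling: co-adding the in-transit points of N transits grows the detection statistic roughly as the square root of the number of points folded. A toy sketch, with made-up numbers chosen only to show the scaling (not values from the search described above):

```python
# Toy illustration of the multiple-event signal-to-noise scaling
# (illustrative only, not the Kepler detection statistic): folding
# n_transits transits, each with p in-transit points of depth d in
# per-point noise sigma, grows SNR as sqrt(p * n_transits).
import math

def multiple_event_snr(depth_ppm, sigma_ppm, n_transits, points_per_transit):
    single_transit = (depth_ppm / sigma_ppm) * math.sqrt(points_per_transit)
    return single_transit * math.sqrt(n_transits)

# A 100 ppm transit in 300 ppm per-point noise, 10 points per transit:
snr_3 = multiple_event_snr(100.0, 300.0, 3, 10)    # ~1.8 after 3 transits
snr_12 = multiple_event_snr(100.0, 300.0, 12, 10)  # ~3.7 after 12 transits
```

This is why short-period signals dominate the detections: more transits fit into the fixed three-quarter baseline, so weaker depths clear the threshold.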

    Detection of Potential Transit Signals in Sixteen Quarters of Kepler Mission Data

    We present the results of a search for potential transit signals in four years of photometry data acquired by the Kepler Mission. The targets of the search include 111,800 stars which were observed for the entire interval and 85,522 stars which were observed for a subset of the interval. We found that 9,743 targets contained at least one signal consistent with the signature of a transiting or eclipsing object, where the criteria for detection are periodicity of the detected transits, adequate signal-to-noise ratio, and acceptance by a number of tests which reject false positive detections. When targets that had produced a signal were searched repeatedly, an additional 6,542 signals were detected on 3,223 target stars, for a total of 16,285 potential detections. Comparison of the set of detected signals with a set of known and vetted transit events in the Kepler field of view shows that the recovery rate for these signals is 96.9%. The ensemble properties of the detected signals are reviewed. (Accepted by ApJ Supplement.)

    A Framework for Propagation of Uncertainties in the Kepler Data Analysis Pipeline

    The Kepler space telescope is designed to detect Earth-like planets around Sun-like stars using transit photometry by simultaneously observing 100,000 stellar targets nearly continuously over a three-and-a-half-year period. The 96-megapixel focal plane consists of 42 charge-coupled devices (CCDs), each containing two 1024 x 1100 pixel arrays. Cross-correlations between calibrated pixels are introduced by common calibrations performed on each CCD, requiring downstream data products to access the calibrated pixel covariance matrix in order to properly estimate uncertainties. The prohibitively large covariance matrices corresponding to the ~75,000 calibrated pixels per CCD preclude calculating and storing the covariance in standard lock-step fashion. We present a novel framework used to implement standard propagation of uncertainties (POU) in the Kepler Science Operations Center (SOC) data processing pipeline. The POU framework captures the variance of the raw pixel data and the kernel of each subsequent calibration transformation, allowing the full covariance matrix of any subset of calibrated pixels to be recalled on the fly at any step in the calibration process. Singular value decomposition (SVD) is used to compress and low-pass filter the raw uncertainty data as well as any data-dependent kernels. The combination of the POU framework and SVD compression provides downstream consumers of the calibrated pixel data with access to the full covariance matrix of any subset of the calibrated pixels, traceable to pixel-level measurement uncertainties, without having to store, retrieve, and operate on prohibitively large covariance matrices. We describe the POU framework and SVD compression scheme and their implementation in the Kepler SOC pipeline.
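The kernel-based propagation described above can be sketched in miniature. This is an illustrative two-pixel example only (SVD compression omitted): for a linear calibration step y = Kx, the covariance transforms as C_y = K C_x K^T, so storing the raw variances plus each step's kernel K is enough to rebuild the covariance of any pixel subset on demand.

```python
# Minimal sketch of the propagation-of-uncertainties idea (illustrative
# 2-pixel example, not the SOC code): apply each stored calibration kernel
# K in order, carrying covariance forward as C -> K C K^T.

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def propagate(cov, kernels):
    """Rebuild calibrated covariance from raw covariance plus kernels."""
    for K in kernels:
        cov = matmul(matmul(K, cov), transpose(K))
    return cov

# Raw pixels start uncorrelated: a diagonal covariance matrix.
raw_cov = [[4.0, 0.0], [0.0, 9.0]]
# A common calibration step that mixes pixels (values are made up)
# introduces exactly the cross-correlations the abstract describes.
K = [[1.0, -0.5], [-0.5, 1.0]]
cal_cov = propagate(raw_cov, [K])
# cal_cov now has nonzero off-diagonal terms: the pixels are correlated.
```

Nothing larger than the kernels and the raw variances is ever stored; the full matrix for a requested subset is materialized only when asked for, which is what makes the ~75,000-pixel case tractable.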