
    Kepler Mission's Focal Plane Characterization Models Implementation

    The Kepler Mission photometer is an unusually complex array of CCDs. A large number of time-varying instrumental and systematic effects must be modeled and removed from the Kepler pixel data to produce light curves of sufficiently high quality for the mission to be successful in its planet-finding objective. After the launch of the spacecraft, many of these effects are difficult to remeasure frequently, and various interpolations over a small number of sample measurements must be used to determine the correct value of a given effect at different points in time. A library of software modules, called Focal Plane Characterization (FC) Models, is the element of the Kepler Science Data Pipeline (hereafter "pipeline") that handles this. FC, or products generated by FC, are used by nearly every element of the SOC processing chain. FC includes Java components: database persistence classes, operations classes, model classes, and data importers; and MATLAB code: model classes, interpolation methods, and wrapper functions. These classes, their interactions, and the database tables they represent are discussed. This paper describes how these data and the FC software work together to provide the pipeline with the correct values to remove non-photometric effects caused by the photometer and its electronics from the Kepler light curves. The interpolation mathematics is reviewed, as well as the special case of the sky-to-pixel and pixel-to-sky coordinate transformation code, which incorporates a compound model that is unique in the SOC software.
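
    A minimal sketch of the kind of interpolation the abstract describes: a time-varying focal-plane quantity measured at a few epochs is interpolated to the time of interest. This is not SOC code; the function name, the linear interpolation scheme, and the gain values are assumptions for illustration only.

```python
# Illustrative sketch (not SOC code): interpolating a time-varying focal-plane
# model value between a small number of sample measurements.
import numpy as np

def interpolate_model(sample_mjds, sample_values, query_mjd):
    """Return the model value at query_mjd by linear interpolation between
    the bracketing sample measurements (clamped at the end points)."""
    sample_mjds = np.asarray(sample_mjds, dtype=float)
    sample_values = np.asarray(sample_values, dtype=float)
    order = np.argsort(sample_mjds)  # np.interp requires increasing abscissae
    return np.interp(query_mjd, sample_mjds[order], sample_values[order])

# Example: made-up gain measurements (e-/DN) at three epochs, queried mid-mission.
mjds = [54950.0, 55100.0, 55250.0]
gains = [110.2, 110.5, 110.9]
print(interpolate_model(mjds, gains, 55175.0))
```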

    Development and initial validation of the Falls Efficacy Scale-International (FES-I)

    Background: there is a need for a measure of fear of falling that assesses both easy and difficult physical activities and social activities and is suitable for use in a range of languages and cultural contexts, permitting direct comparison between studies and populations in different countries and settings. Objective: to develop a modified version of the Falls Efficacy Scale to satisfy this need, and to establish its psychometric properties, reliability, and concurrent validity (i.e. that it demonstrates the expected relationship with age, falls history and falls risk factors). Design: cross-sectional survey. Setting: community sample. Method: 704 people aged between 60 and 95 years completed the Falls Efficacy Scale-International (FES-I) either in postal self-completion format or by structured interview. Results: the FES-I had excellent internal and test-retest reliability (Cronbach's α=0.96, ICC=0.96). Factor analysis suggested a unitary underlying factor, with two dimensions assessing concern about less demanding physical activities mainly in the home, and concern about more demanding physical activities mainly outside the home. The FES-I had slightly better power than the original FES items to discriminate differences in concern about falling between groups differentiated by sex, age, occupation, falls in the past year, and falls risk factors (chronic illness, taking multiple or psychoactive medications, dizziness). Conclusions: the FES-I has close continuity with the best existing measure of fear of falling, excellent psychometric properties, and assesses concerns relating to basic and more demanding activities, both physical and social. Further research is required to confirm cross-cultural and predictive validity.
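
    For readers unfamiliar with the internal-consistency statistic reported above (Cronbach's α = 0.96), the sketch below shows how it is computed from item-level responses. The responses are made-up toy data, not FES-I study data.

```python
# Illustrative sketch: Cronbach's alpha for a multi-item scale.
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = scale items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the total score
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Toy responses from four respondents to four items (higher = more concern).
toy = np.array([[1, 2, 2, 1],
                [3, 3, 4, 3],
                [2, 2, 3, 2],
                [4, 4, 4, 4]])
print(round(cronbach_alpha(toy), 2))
```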

    Combinatorial Characterizations of K-matrices

    We present a number of combinatorial characterizations of K-matrices. This extends a theorem of Fiedler and Pták on linear-algebraic characterizations of K-matrices to the setting of oriented matroids. Our proof is elementary and simplifies the original proof substantially by exploiting the duality of oriented matroids. As an application, we show that a simple principal pivot method applied to linear complementarity problems with K-matrices converges very quickly, by a purely combinatorial argument.
    Comment: 17 pages; v2, v3: clarified proof of Thm 5.5, minor correction
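
    A small sketch of the classical linear-algebraic definition referred to above: a K-matrix is a Z-matrix (non-positive off-diagonal entries) all of whose principal minors are positive. This code is my own illustration of that definition, not material from the paper, and the brute-force minor check is only practical for small matrices.

```python
# Illustrative sketch: checking the K-matrix property (Z-matrix + P-matrix).
import itertools
import numpy as np

def is_k_matrix(M, tol=1e-12):
    M = np.asarray(M, dtype=float)
    n = M.shape[0]
    off_diag = M - np.diag(np.diag(M))
    if (off_diag > tol).any():                 # Z-property: off-diagonals <= 0
        return False
    for r in range(1, n + 1):                  # P-property: principal minors > 0
        for idx in itertools.combinations(range(n), r):
            if np.linalg.det(M[np.ix_(idx, idx)]) <= tol:
                return False
    return True

print(is_k_matrix([[2, -1], [-1, 2]]))   # True
print(is_k_matrix([[1, -3], [-3, 1]]))   # False: the 2x2 principal minor is negative
```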

    The Kepler Pixel Response Function

    Kepler seeks to detect sequences of transits of Earth-size exoplanets orbiting Solar-like stars. Such transit signals are on the order of 100 ppm. The high photometric precision demanded by Kepler requires detailed knowledge of how the Kepler pixels respond to starlight during a nominal observation. This information is provided by the Kepler pixel response function (PRF), defined as the composite of Kepler's optical point spread function, integrated spacecraft pointing jitter during a nominal cadence, and other systematic effects. To provide sub-pixel resolution, the PRF is represented as a piecewise-continuous polynomial on a sub-pixel mesh. This continuous representation allows the prediction of a star's flux value on any pixel given the star's pixel position. The advantages and difficulties of this polynomial representation are discussed, including characterization of spatial variation in the PRF and the smoothing of discontinuities between sub-pixel polynomial patches. On-orbit super-resolution measurements of the PRF across the Kepler field of view are described. Two uses of the PRF are presented: the selection of pixels for each star that maximizes the photometric signal-to-noise ratio for that star, and PRF-fitted centroids, which provide robust and accurate stellar positions on the CCD, primarily used for attitude and plate scale tracking. Good knowledge of the PRF has been a critical component for the successful collection of high-precision photometry by Kepler.
    Comment: 10 pages, 5 figures, accepted by ApJ Letters. Version accepted for publication
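
    The sketch below illustrates, under stated assumptions, the general idea of a PRF stored as low-order 2-D polynomials on a sub-pixel mesh and evaluated at a star's sub-pixel offset. The class name, mesh resolution, polynomial order, and coefficients are all invented for illustration and are not the flight PRF model.

```python
# Illustrative sketch: evaluating a piecewise-polynomial pixel response
# function on a sub-pixel mesh (toy coefficients, not flight data).
import numpy as np

class PiecewisePolyPRF:
    def __init__(self, coeffs, mesh=5):
        # coeffs[i][j]: 2-D polynomial coefficient grid for sub-pixel cell (i, j),
        # in the layout expected by numpy.polynomial.polynomial.polyval2d.
        self.coeffs = coeffs
        self.mesh = mesh

    def evaluate(self, dx, dy):
        """Pixel response for a star offset (dx, dy) in [0, 1) within a pixel."""
        i = min(int(dx * self.mesh), self.mesh - 1)
        j = min(int(dy * self.mesh), self.mesh - 1)
        return np.polynomial.polynomial.polyval2d(dx, dy, self.coeffs[i][j])

# Toy 5x5 mesh of random low-order polynomial patches.
rng = np.random.default_rng(0)
coeffs = [[rng.uniform(0.9, 1.1, size=(2, 2)) for _ in range(5)] for _ in range(5)]
prf = PiecewisePolyPRF(coeffs, mesh=5)
print(prf.evaluate(0.37, 0.81))
```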

    The Kepler DB, a Database Management System for Arrays, Sparse Arrays and Binary Data

    The Kepler Science Operations Center stores pixel values for approximately six million pixels collected every 30 minutes, as well as data products that are generated as a result of running the Kepler science processing pipeline. The Kepler Database (Kepler DB) management system was created to act as the repository of this information. After one year of flight usage, Kepler DB is managing 3 TiB of data and is expected to grow to over 10 TiB over the course of the mission. Kepler DB is a non-relational, transactional database in which data are represented as one-dimensional arrays, sparse arrays, or binary large objects. We discuss Kepler DB's APIs, implementation, usage, and deployment at the Kepler Science Operations Center.
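
    A toy illustration of the data model described above, in which values are one-dimensional arrays or opaque binary blobs rather than relational rows. This is not the Kepler DB API; the class, method names, and keys are assumptions for illustration.

```python
# Toy illustration: a key-value store for 1-D arrays and binary blobs,
# keyed by a (target_id, data_type) pair.
import numpy as np

class ArrayStore:
    def __init__(self):
        self._data = {}

    def put_array(self, target_id, data_type, values):
        self._data[(target_id, data_type)] = np.asarray(values)

    def put_blob(self, target_id, data_type, blob: bytes):
        self._data[(target_id, data_type)] = blob

    def get(self, target_id, data_type):
        return self._data[(target_id, data_type)]

store = ArrayStore()
store.put_array(1234567, "raw_flux", [1.0002, 0.9998, 1.0001])   # made-up target id
store.put_blob(1234567, "pixel_mask", b"\x01\x00\x01")
print(store.get(1234567, "raw_flux"))
```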

    Presearch Data Conditioning in the Kepler Science Operations Center Pipeline

    We describe the Presearch Data Conditioning (PDC) software component and its context in the Kepler Science Operations Center (SOC) pipeline. The primary tasks of this component are to correct systematic and other errors, remove excess flux due to aperture crowding, and condition the raw flux light curves for over 160,000 long cadence (~30-minute) and 512 short cadence (~1-minute) targets across the focal plane array. Long cadence corrected flux light curves are subjected to a transiting planet search in a subsequent pipeline module. We discuss the science algorithms for long and short cadence PDC: identification and correction of unexplained (i.e., unrelated to known anomalies) discontinuities; systematic error correction; and excess flux removal. We discuss the propagation of uncertainties from raw to corrected flux. Finally, we present examples of raw and corrected flux time series from flight data to illustrate PDC performance. Corrected flux light curves produced by PDC are exported to the Multi-mission Archive at the Space Telescope Science Institute (MAST) and will be made available to the general public in accordance with the NASA/Kepler data release policy.
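
    A minimal sketch of one generic approach to systematic-error removal, least-squares cotrending of a raw light curve against a set of systematic basis vectors. This is an illustration of the idea, not the PDC algorithm itself; the function name, basis vector, and toy light curve are assumptions.

```python
# Illustrative sketch: least-squares cotrending against systematic basis vectors.
import numpy as np

def cotrend(raw_flux, basis_vectors):
    """Fit a constant plus the basis vectors to raw_flux and return the
    residual, re-centered on the median of the raw light curve."""
    design = np.column_stack([np.ones_like(raw_flux)] + list(basis_vectors))
    coeffs, *_ = np.linalg.lstsq(design, raw_flux, rcond=None)
    return raw_flux - design @ coeffs + np.median(raw_flux)

# Toy example: a flat star with an injected linear drift plus white noise.
cadences = np.arange(200)
drift = 1e-3 * cadences / cadences.max()
raw = 1.0 + drift + 1e-4 * np.random.default_rng(1).standard_normal(200)
print(cotrend(raw, [drift]).std())   # scatter drops back to the noise level
```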