
    The Future of Public Service and Strategy Management-at-Scale

    Increasingly, government agencies and non-profit organisations are called on to address challenges that go well beyond any individual organisation’s boundaries and direct control. Strategic management for single organisations cannot respond effectively to these cross-boundary, cross-level, and often cross-sector challenges. Instead, a new approach called strategy management-at-scale is required. This article compares strategic management with strategy management-at-scale. It responds to the question: what does strategy management-at-scale look like, and what seems to contribute to its success? The new approach helps foster – but hardly guarantees – direction, alignment and commitment among the multiple organisations and groups needed to make headway against the challenge.

    BP Rail Logistics Project environmental impact assessment, Bellingham, WA

    The purpose of this environmental impact assessment (EIA) is to identify any environmental elements potentially impacted by the BP Cherry Point Refinery Rail Logistics Project, both on the project site and on the land (developed and undeveloped) and water bodies adjacent to the Burlington Northern Santa Fe (BNSF) railway within the western portion of Whatcom County, extending from Larrabee State Park to the BP refinery at Cherry Point. The elements of the environment to be examined are divided into two categories: environmental and built. The environmental elements include earth, water, air, plants, animals, and energy and natural resources. The built environment includes utilities, transportation, land and shoreline use, public and environmental health, public services, light and glare, and noise. The refinery brings in approximately 225,000 barrels per day (bpd) of crude oil, and the proposed 10,200-foot rail loop project is expected to take in one oil train (consisting of 100 cars) per day transporting roughly 20,000 barrels per day, or two trains and 40,000 barrels of crude oil every other day. The crude oil shipped by rail is expected to reduce shipments by oil tankers by a similar amount. This document addresses the impacts and benefits of the proposed action, as well as the impacts of an alternative action and a no-action plan. The alternative action is to build the proposed Rail Logistics Project with additional mitigation to further reduce the impacts of the proposed project. The no-action plan is to not build the Rail Logistics Project, resulting in no impacts upon the refinery and project site. The primary environmental issues of the proposed action include a reduction in air and water quality, soil erosion, removal of vegetation and wetlands, removal of wildlife habitat, and impacts associated with train derailments and oil spills.
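
    For scale, a back-of-the-envelope check using only the figures quoted in the abstract (a sketch, not part of the assessment itself) relates the rail deliveries to the refinery's total crude intake:

        # Back-of-the-envelope check of the throughput figures quoted in the abstract.
        refinery_intake_bpd = 225_000      # approximate crude intake at the refinery (bpd)
        rail_delivery_bpd = 20_000         # one 100-car unit train per day (bpd)

        rail_share = rail_delivery_bpd / refinery_intake_bpd
        print(f"Rail deliveries supply roughly {rail_share:.1%} of the refinery's crude intake")
        # Two trains carrying 40,000 barrels every other day average out to the same rate:
        print(40_000 / 2, "bpd averaged over an every-other-day, two-train schedule")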

    The Kepler Science Operations Center Pipeline Framework Extensions

    The Kepler Science Operations Center (SOC) is responsible for several aspects of the Kepler Mission, including managing targets, generating on-board data compression tables, monitoring photometer health and status, processing the science data, and exporting the pipeline products to the mission archive. We describe how the generic pipeline framework software developed for Kepler is extended to achieve these goals, including pipeline configurations for processing science data and other support roles, and custom unit of work generators that control how the Kepler data are partitioned and distributed across the computing cluster. We describe the interface between the Java software that manages the retrieval and storage of the data for a given unit of work and the MATLAB algorithms that process these data. The data for each unit of work are packaged into a single file that contains everything needed by the science algorithms, allowing these files to be used to debug and evolve the algorithms offline.
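
    As a purely illustrative sketch of the unit-of-work idea described above (Python for brevity; the actual framework is Java, and all names below are hypothetical, not the SOC API), a generator might partition targets by CCD channel and cadence range so that each unit can be packaged into one self-contained file and processed independently:

        # Hypothetical sketch of a "unit of work" generator in the spirit described above.
        # Names (UnitOfWork, generate_units) are illustrative, not the Kepler SOC Java API.
        from dataclasses import dataclass
        from typing import List

        @dataclass
        class UnitOfWork:
            ccd_module: int          # CCD module on the focal plane
            ccd_output: int          # readout channel on that module
            start_cadence: int       # first long cadence in this chunk
            end_cadence: int         # last long cadence in this chunk

        def generate_units(modules: List[int], outputs: List[int],
                           n_cadences: int, chunk: int) -> List[UnitOfWork]:
            """Partition the data by channel and cadence range so each unit can be
            packaged into one self-contained file and processed independently."""
            units = []
            for mod in modules:
                for out in outputs:
                    for start in range(0, n_cadences, chunk):
                        end = min(start + chunk, n_cadences) - 1
                        units.append(UnitOfWork(mod, out, start, end))
            return units

        # Toy example: 4 channels, 4320 long cadences, chunks of 1440 cadences.
        units = generate_units(modules=[2, 3], outputs=[1, 2], n_cadences=4320, chunk=1440)
        print(len(units), "units, e.g.", units[0])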

    Photometer Performance Assessment in Kepler Science Data Processing

    This paper describes the algorithms of the Photometer Performance Assessment (PPA) software component in the science data processing pipeline of the Kepler mission. The PPA performs two tasks: the first is to analyze the health and performance of the Kepler photometer based on the long-cadence science data downlinked via Ka band approximately every 30 days; the second is to determine the attitude of the Kepler spacecraft with high precision at each long cadence. The PPA component is demonstrated to work effectively with the Kepler flight data.
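
    A minimal sketch of the health-monitoring side of this task, assuming (hypothetically) that per-cadence metrics are simply compared against fixed bounds; the real PPA metrics, bounds, and alerting logic are those defined in the paper:

        import numpy as np

        # Hypothetical per-cadence photometer metrics and fixed alert bounds.
        # The actual PPA metrics and bounds differ; this only illustrates the idea
        # of flagging long cadences where a tracked quantity leaves its allowed range.
        rng = np.random.default_rng(0)
        metrics = {
            "background_level_e_per_cadence": rng.normal(1.0e5, 2.0e3, size=48),
            "encircled_energy_fraction":      rng.normal(0.95, 0.005, size=48),
        }
        bounds = {
            "background_level_e_per_cadence": (0.9e5, 1.1e5),
            "encircled_energy_fraction":      (0.93, 1.00),
        }

        for name, series in metrics.items():
            lo, hi = bounds[name]
            out_of_bounds = np.flatnonzero((series < lo) | (series > hi))
            if out_of_bounds.size:
                print(f"ALERT: {name} out of bounds at cadences {out_of_bounds.tolist()}")
            else:
                print(f"{name}: OK over {series.size} long cadences")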

    Overview of the Kepler Science Processing Pipeline

    The Kepler Mission Science Operations Center (SOC) performs several critical functions including managing the ~156,000 target stars, associated target tables, science data compression tables and parameters, as well as processing the raw photometric data downlinked from the spacecraft each month. The raw data are first calibrated at the pixel level to correct for bias, smear induced by a shutterless readout, and other detector and electronic effects. A background sky flux is estimated from ~4500 pixels on each of the 84 CCD readout channels, and simple aperture photometry is performed on an optimal aperture for each star. Ancillary engineering data and diagnostic information extracted from the science data are used to remove systematic errors in the flux time series that are correlated with these data prior to searching for signatures of transiting planets with a wavelet-based, adaptive matched filter. Stars with signatures exceeding 7.1 sigma are subjected to a suite of statistical tests including an examination of each star's centroid motion to reject false positives caused by background eclipsing binaries. Physical parameters for each planetary candidate are fitted to the transit signature, and signatures of additional transiting planets are sought in the residual light curve. The pipeline is operational, finding planetary signatures and providing robust eliminations of false positives.
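
    To make the background-estimation and simple-aperture-photometry steps concrete, here is a toy sketch on synthetic data (the real pipeline estimates the background from ~4500 dedicated pixels per readout channel and uses per-target optimal apertures rather than the fixed box and border pixels assumed here):

        import numpy as np

        rng = np.random.default_rng(1)

        # Synthetic 11x11 calibrated postage stamp: flat sky plus a star in the centre.
        sky_level = 100.0                                   # e-/cadence per pixel
        stamp = rng.poisson(sky_level, size=(11, 11)).astype(float)
        yy, xx = np.mgrid[:11, :11]
        star = 5_000.0 * np.exp(-(((yy - 5) ** 2 + (xx - 5) ** 2) / (2 * 1.5 ** 2)))
        stamp += rng.poisson(star)

        # "Background pixels": here, the stamp border; the pipeline instead uses
        # ~4500 dedicated background pixels per CCD readout channel.
        border = np.ones_like(stamp, dtype=bool)
        border[1:-1, 1:-1] = False
        background = np.median(stamp[border])

        # Simple aperture photometry: sum the background-subtracted flux in a small
        # aperture around the target (here a fixed 5x5 box standing in for the
        # per-star optimal aperture).
        aperture = np.zeros_like(stamp, dtype=bool)
        aperture[3:8, 3:8] = True
        flux = np.sum(stamp[aperture] - background)
        print(f"background ~ {background:.1f} e-/pix, aperture flux ~ {flux:.0f} e-")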

    KOI-54: The Kepler Discovery of Tidally Excited Pulsations and Brightenings in a Highly Eccentric Binary

    Kepler observations of the star HD 187091 (KIC 8112039, hereafter KOI-54) revealed a remarkable light curve exhibiting sharp periodic brightening events every 41.8 days with a superimposed set of oscillations forming a beating pattern in phase with the brightenings. Spectroscopic observations revealed that this is a binary star with a highly eccentric orbit, e = 0.83. We are able to match the Kepler light curve and radial velocities with a nearly face-on (i = 5.5 degrees) binary star model in which the brightening events are caused by tidal distortion and irradiation of nearly identical A stars during their close periastron passage. The two dominant oscillations in the light curve, responsible for the beating pattern, have frequencies that are the 91st and 90th harmonic of the orbital frequency. The power spectrum of the light curve, after removing the binary star brightening component, reveals a large number of pulsations, 30 of which have a signal-to-noise ratio greater than or similar to 7. Nearly all of these pulsations have frequencies that are either integer multiples of the orbital frequency or are tidally split multiples of the orbital frequency. This pattern of frequencies unambiguously establishes the pulsations as resonances between the dynamic tides at periastron and the free oscillation modes of one or both of the stars. KOI-54 is only the fourth star to show such a phenomenon and is by far the richest in terms of excited modes. (Funding: NASA Science Mission Directorate; NASA grant NNX08AR14G; European Research Council under the European Community, grant 227224; W. M. Keck Foundation; McDonald Observatory.)
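
    To illustrate what it means for pulsation frequencies to be integer multiples of the orbital frequency, here is a small sketch that assigns candidate frequencies to their nearest orbital harmonic; the 41.8-day period and the 90th/91st harmonics come from the abstract, while the remaining frequencies are made up for illustration and are not the measured KOI-54 values:

        import numpy as np

        P_orb = 41.8                      # orbital period in days (from the abstract)
        f_orb = 1.0 / P_orb               # orbital frequency in cycles/day

        # Candidate pulsation frequencies (cycles/day): the two dominant KOI-54 modes
        # at the 90th and 91st orbital harmonics, plus two made-up non-harmonic values.
        f_puls = np.array([90 * f_orb, 91 * f_orb, 1.05, 2.20])

        harmonic = np.rint(f_puls / f_orb)          # nearest integer harmonic number
        offset = f_puls / f_orb - harmonic          # fractional distance from that harmonic

        for f, n, d in zip(f_puls, harmonic, offset):
            tag = "orbital harmonic" if abs(d) < 0.01 else "not a clean harmonic"
            print(f"f = {f:.4f} c/d  ->  n = {int(n):3d}, offset = {d:+.3f}  ({tag})")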

    Planetary Candidates Observed by Kepler. VIII. A Fully Automated Catalog with Measured Completeness and Reliability Based on Data Release 25

    We present the Kepler Object of Interest (KOI) catalog of transiting exoplanets based on searching 4 yr of Kepler time series photometry (Data Release 25, Q1–Q17). The catalog contains 8054 KOIs, of which 4034 are planet candidates with periods between 0.25 and 632 days. Of these candidates, 219 are new, including two in multiplanet systems (KOI-82.06 and KOI-2926.05) and 10 high-reliability, terrestrial-size, habitable zone candidates. This catalog was created using a tool called the Robovetter, which automatically vets the DR25 threshold crossing events (TCEs). The Robovetter also vetted simulated data sets and measured how well it was able to separate TCEs caused by noise from those caused by low signal-to-noise transits. We discuss the Robovetter and the metrics it uses to sort TCEs. For orbital periods less than 100 days the Robovetter completeness (the fraction of simulated transits that are determined to be planet candidates) across all observed stars is greater than 85%. For the same period range, the catalog reliability (the fraction of candidates that are not due to instrumental or stellar noise) is greater than 98%. However, for low signal-to-noise candidates between 200 and 500 days around FGK-dwarf stars, the Robovetter is 76.7% complete and the catalog is 50.5% reliable. The KOI catalog, the transit fits, and all of the simulated data used to characterize this catalog are available at the NASA Exoplanet Archive.
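
    Schematically, the completeness and reliability quoted above are simple fractions over labeled simulated TCEs; the sketch below uses made-up counts, and only the definitions are meant to mirror the text, not the actual DR25 period/SNR binning or reliability estimator:

        # Schematic illustration of the two fractions discussed above, with made-up counts.
        injected_transits = 1000       # simulated transits injected into the light curves
        injected_recovered = 870       # how many the Robovetter labelled planet candidates
        completeness = injected_recovered / injected_transits   # fraction recovered

        candidates = 400               # candidates found in some period/SNR bin
        est_false_alarms = 6           # candidates estimated (from noise-only simulated
                                       # data sets) to be instrumental or stellar noise
        reliability = 1 - est_false_alarms / candidates          # fraction that are real

        print(f"completeness ~ {completeness:.1%}")   # cf. >85% for P < 100 d
        print(f"reliability  ~ {reliability:.1%}")    # cf. >98% for P < 100 d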

    A First Comparison of Kepler Planet Candidates in Single and Multiple Systems

    In this letter we present an overview of the rich population of systems with multiple candidate transiting planets found in the first four months of Kepler data. The census of multiples includes 115 targets that show 2 candidate planets, 45 with 3, 8 with 4, and 1 each with 5 and 6, for a total of 170 systems with 408 candidates. When compared to the 827 systems with only one candidate, the multiples account for 17 percent of the total number of systems, and a third of all the planet candidates. We compare the characteristics of candidates found in multiples with those found in singles. False positives due to eclipsing binaries are much less common for the multiples, as expected. Singles and multiples are both dominated by planets smaller than Neptune; 69 +2/-3 percent for singles and 86 +2/-5 percent for multiples. This result, that systems with multiple transiting planets are less likely to include a transiting giant planet, suggests that close-in giant planets tend to disrupt the orbital inclinations of small planets in flat systems, or maybe even to prevent the formation of such systems in the first place.
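
    The census figures above can be cross-checked with a quick tally (all counts taken directly from the abstract):

        # Multiplicity census from the abstract: {number of planets: number of systems}.
        census = {2: 115, 3: 45, 4: 8, 5: 1, 6: 1}
        singles = 827

        multi_systems = sum(census.values())                       # 170 systems
        multi_candidates = sum(n * k for n, k in census.items())   # 408 candidates
        total_systems = singles + multi_systems
        total_candidates = singles + multi_candidates

        print(f"{multi_systems} multiple systems with {multi_candidates} candidates")
        print(f"multiples are {multi_systems / total_systems:.0%} of systems and their "
              f"planets are {multi_candidates / total_candidates:.0%} of all candidates")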

    A Framework for Propagation of Uncertainties in the Kepler Data Analysis Pipeline

    The Kepler space telescope is designed to detect Earth-like planets around Sun-like stars using transit photometry by simultaneously observing 100,000 stellar targets nearly continuously over a three-and-a-half-year period. The 96-megapixel focal plane consists of 42 charge-coupled devices (CCDs), each containing two 1024 x 1100 pixel arrays. Cross-correlations between calibrated pixels are introduced by common calibrations performed on each CCD, requiring that downstream data products have access to the calibrated-pixel covariance matrix in order to properly estimate uncertainties. The prohibitively large covariance matrices corresponding to the ~75,000 calibrated pixels per CCD preclude calculating and storing the covariance in standard lock-step fashion. We present a novel framework used to implement standard propagation of uncertainties (POU) in the Kepler Science Operations Center (SOC) data processing pipeline. The POU framework captures the variance of the raw pixel data and the kernel of each subsequent calibration transformation, allowing the full covariance matrix of any subset of calibrated pixels to be recalled on the fly at any step in the calibration process. Singular value decomposition (SVD) is used to compress and low-pass filter the raw uncertainty data as well as any data-dependent kernels. The combination of the POU framework and SVD compression provides downstream consumers of the calibrated pixel data with access to the full covariance matrix of any subset of the calibrated pixels, traceable to pixel-level measurement uncertainties, without having to store, retrieve, and operate on prohibitively large covariance matrices. We describe the POU framework and SVD compression scheme and their implementation in the Kepler SOC pipeline.
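
    A compact sketch of the two ideas combined here, written in Python rather than the pipeline's Java/MATLAB and using an illustrative decomposition (exact diagonal raw variances plus a low-rank correlated correction) rather than the SOC's actual storage scheme: propagate the covariance through a linear calibration kernel as C_cal = K C_raw K^T, then keep only a truncated SVD of the correlated part so the covariance of any pixel subset can be rebuilt on demand.

        import numpy as np

        rng = np.random.default_rng(2)
        n = 200                                     # toy pixel count (a real CCD has ~75,000)

        # Raw pixels are uncorrelated, so the raw covariance is a diagonal of variances.
        raw_var = rng.uniform(50.0, 150.0, size=n)
        C_raw = np.diag(raw_var)

        # A common calibration step acting on all pixels is a linear kernel K; standard
        # propagation of uncertainties gives C_cal = K C_raw K^T, and the shared kernel
        # is what correlates the calibrated pixels. Toy kernel: remove a smooth common
        # trend (mean level plus a linear gradient across the channel).
        x = np.linspace(-1.0, 1.0, n)
        B = np.stack([np.ones(n), x], axis=1)       # smooth common modes
        K = np.eye(n) - B @ np.linalg.pinv(B)       # kernel that removes those modes
        C_cal = K @ C_raw @ K.T

        # Keep the diagonal raw variances exactly and compress only the correlated
        # correction with a truncated SVD, so the covariance of any pixel subset can
        # be rebuilt later without ever holding the full n x n matrix in memory.
        L_corr = C_cal - C_raw                      # correlated part added by the kernel
        U, s, Vt = np.linalg.svd(L_corr)
        r = 8                                       # retained rank
        L_compressed = (U[:, :r] * s[:r]) @ Vt[:r, :]
        C_rebuilt = C_raw + L_compressed

        subset = np.array([3, 17, 42])              # any pixel subset, reconstructed on demand
        err = np.abs(C_rebuilt - C_cal)[np.ix_(subset, subset)].max()
        print(f"max abs error on the 3x3 subset covariance at rank {r}: {err:.2e}")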