
    Effect of Feeding Ethanol By-Products on Performance and Marbling Deposition in Steers Fed High-Concentrate or High-Forage Diets

    Research on the effect of dietary ethanol by-products on beef quality has been limited. Some universities have reported a decrease in marbling due to distillers grains inclusion, while others have not. It is unclear why marbling deposition may be decreased when increasing amounts of distillers grains are fed; however, decreased starch availability, increased vitamin A and D, and the high oil content in ethanol by-products may contribute. In contrast, distillers grains can increase the unsaturated fatty acid content of beef, thus increasing healthfulness. Our objective was to measure the effect of wet distillers grains (WDG; 0, 20, or 40% of the diet) on growth, feed intake, and marbling deposition and to determine what may be responsible for decreased marbling. Average daily gain and feed intake did not differ between WDG treatments, but cattle fed distillers grains were more efficient. Marbling score decreased in high-concentrate-fed steers as WDG concentration was increased, but increased in high-forage-fed steers from the 0 to 20% WDG inclusion rate and then decreased from the 20 to 40% WDG inclusion rate. Backfat thickness decreased in high-concentrate-fed steers as WDG concentration increased but increased in high-forage-fed steers from the 0 to 20% WDG inclusion rate and then decreased from the 20 to 40% WDG inclusion rate. Cattle fed distillers grains had lower plasma total vitamin A and plasma vitamin D. Retinol, however, was positively related to marbling, and vitamin D was negatively related to marbling. Polyunsaturated fatty acids, which can enhance the healthfulness of beef, were increased by feeding wet distillers grains but were related to decreased marbling.

    Earthquake Mechanism and Displacement Fields Close to Fault Zones

    The Sixth Geodesy/Solid Earth and Ocean Physics (GEOP) Research Conference was held on February 4–5, 1974, at the Institute of Geophysics and Planetary Physics, University of California, San Diego, in La Jolla, California. It was attended by about 100 persons. James N. Brune, program chairman, opened the conference and delivered the introductory address, a somewhat extended version of which is printed elsewhere in this issue. Brune's paper and the following summaries of the sessions constitute a report of the conference.

    Health care system collaboration to address chronic diseases: A nationwide snapshot from state public health practitioners

    INTRODUCTION: Until recently, health care systems in the United States often lacked a unified approach to prevent and manage chronic disease. Recent efforts have been made to close this gap through various calls for increased collaboration between public health and health care systems to better coordinate provision of services and programs. Currently, the extent to which the public health workforce has responded is relatively unknown. The objective of this study is to explore health care system collaboration efforts and activities among a population-based sample of state public health practitioners. METHODS: During spring 2013, a national survey was administered to state-level chronic disease public health practitioners. Respondents were asked to indicate whether or not they collaborate with health care systems. Those who reported “yes” were asked to indicate all topic areas in which they collaborate and provide qualitative examples of their collaborative work. RESULTS: A total of 759 respondents (84%) reported collaboration. Common topics of collaboration activities were tobacco, cardiovascular health, and cancer screening. More client-oriented interventions than system-wide interventions were found in the qualitative examples provided. Respondents who collaborated were also more likely to use the Community Guide, use evidence-based decision making, and work in program areas that involved secondary, rather than primary, prevention. CONCLUSION: The study findings indicate a need for greater guidance on collaboration efforts that involve system-wide and cross-system interventions. Tools such as the Community Guide and evidence-based training courses may be useful in providing such guidance.

    The extended tails of Palomar 5: A ten degree arc of globular cluster tidal debris

    Using wide-field photometric data from the Sloan Digital Sky Survey (SDSS) we recently showed that the Galactic globular cluster Palomar 5 is in the process of being tidally disrupted. Its tidal tails were initially detected in a 2.5 degree wide band along the celestial equator. A new analysis of SDSS data for a larger field now reveals that the tails of Pal 5 have a much larger spatial extent and can be traced over an arc of 10 deg across the sky, corresponding to a projected length of 4 kpc at the distance of the cluster. The number of former cluster stars found in the tails adds up to about 1.2 times the number of stars in the cluster. The radial profile of stellar surface density in the tails follows approximately a power law r^gamma with -1.5 < gamma < -1.2. The stream of debris from Pal 5 is significantly curved, which demonstrates its acceleration by the Galactic potential. The cluster is presently near the apocenter but has repeatedly undergone disk crossings in the inner part of the Galaxy leading to strong tidal shocks. Our results suggest that the observed debris originates mostly from mass loss within the last 2 Gyrs. The cluster is likely to be destroyed after the next disk crossing, which will happen in about 100 Myr. (abridged)
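    A power-law index like the one quoted for the tail surface density can be estimated with a simple log-log line fit. The sketch below is a minimal illustration of that technique, not the authors' analysis; the radii and densities are made-up numbers chosen only to give a slope in the reported range.

```python
import numpy as np

# Hypothetical example: estimate the power-law index gamma of a
# surface-density profile Sigma(r) ~ r^gamma from binned star counts.
# The radii and densities below are invented for illustration only.
r_kpc = np.array([0.5, 1.0, 2.0, 3.0, 4.0])       # projected radius along the tail
sigma = np.array([40.0, 16.0, 6.5, 3.6, 2.4])     # stars per unit area (arbitrary units)

# A power law is linear in log-log space: log(sigma) = gamma * log(r) + const,
# so a least-squares line fit to the logarithms gives gamma directly.
gamma, log_norm = np.polyfit(np.log10(r_kpc), np.log10(sigma), deg=1)
print(f"estimated gamma = {gamma:.2f}")           # falls between -1.5 and -1.2 here
```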

    The Kepler Science Operations Center Pipeline Framework Extensions

    The Kepler Science Operations Center (SOC) is responsible for several aspects of the Kepler Mission, including managing targets, generating on-board data compression tables, monitoring photometer health and status, processing the science data, and exporting the pipeline products to the mission archive. We describe how the generic pipeline framework software developed for Kepler is extended to achieve these goals, including pipeline configurations for processing science data and other support roles, and custom unit of work generators that control how the Kepler data are partitioned and distributed across the computing cluster. We describe the interface between the Java software that manages the retrieval and storage of the data for a given unit of work and the MATLAB algorithms that process these data. The data for each unit of work are packaged into a single file that contains everything needed by the science algorithms, allowing these files to be used to debug and evolve the algorithms offline.
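    As a rough illustration of the unit-of-work concept described above, the following sketch partitions a target list by CCD channel into self-contained tasks that could each run independently on a cluster node. It is a hypothetical toy in Python, not the Kepler framework's Java API; all class and field names are invented.

```python
from dataclasses import dataclass
from typing import Iterable, List

@dataclass
class TargetData:
    target_id: int
    ccd_channel: int
    pixel_time_series: list      # placeholder for the packaged pixel data

@dataclass
class UnitOfWork:
    ccd_channel: int
    targets: List[TargetData]    # self-contained input for one pipeline task

def generate_units_of_work(targets: Iterable[TargetData]) -> List[UnitOfWork]:
    """Group targets by CCD channel so each unit can be processed independently."""
    by_channel = {}
    for t in targets:
        by_channel.setdefault(t.ccd_channel, []).append(t)
    return [UnitOfWork(ch, ts) for ch, ts in sorted(by_channel.items())]

# Example: three fake targets on two channels yield two units of work.
units = generate_units_of_work([
    TargetData(1, ccd_channel=13, pixel_time_series=[]),
    TargetData(2, ccd_channel=13, pixel_time_series=[]),
    TargetData(3, ccd_channel=42, pixel_time_series=[]),
])
print([(u.ccd_channel, len(u.targets)) for u in units])
```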

    Promoting state health department evidence-based cancer and chronic disease prevention: A multi-phase dissemination study with a cluster randomized trial component

    BACKGROUND: Cancer and other chronic diseases reduce quality and length of life and productivity, and represent a significant financial burden to society. Evidence-based public health approaches to prevent cancer and other chronic diseases have been identified in recent decades and have the potential for high impact. Yet, barriers to implementing prevention approaches persist as a result of multiple factors, including lack of organizational support, limited resources, competing emerging priorities and crises, and limited skill among the public health workforce. The purpose of this study is to learn how best to promote the adoption of evidence-based public health practice related to chronic disease prevention. METHODS/DESIGN: This paper describes the methods for a multi-phase dissemination study with a cluster randomized trial component that will evaluate the dissemination of public health knowledge about evidence-based prevention of cancer and other chronic diseases. Phase one involves development of measures of practitioner views on and organizational supports for evidence-based public health, and data collection using a national online survey involving state health department chronic disease practitioners. In phase two, a cluster randomized trial design will be conducted to test the receptivity and usefulness of dissemination strategies directed toward state health department chronic disease practitioners to enhance capacity and organizational support for evidence-based chronic disease prevention. Twelve state health department chronic disease units will be randomly selected and assigned to intervention or control. State health department staff and the university-based study team will jointly identify, refine, and select dissemination strategies within intervention units. Intervention (dissemination) strategies may include multi-day in-person training workshops, electronic information exchange modalities, and remote technical assistance. Evaluation methods include pre-post surveys, structured qualitative phone interviews, and abstraction of state-level chronic disease prevention program plans and progress reports. TRIAL REGISTRATION: clinicaltrials.gov: NCT01978054
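    A minimal sketch of the cluster-level randomization step follows, assuming twelve hypothetical unit identifiers and an arbitrary seed; it is illustrative only and does not reflect the trial's actual allocation procedure.

```python
import random

# Twelve hypothetical state chronic disease units are shuffled and split
# evenly between the two arms. Names and seed are invented, not from the trial.
units = [f"state_unit_{i:02d}" for i in range(1, 13)]

rng = random.Random(2013)        # fixed seed so the example allocation is reproducible
shuffled = units[:]
rng.shuffle(shuffled)

allocation = {
    "intervention": sorted(shuffled[:6]),
    "control": sorted(shuffled[6:]),
}
for arm, members in allocation.items():
    print(arm, members)
```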

    Photometer Performance Assessment in Kepler Science Data Processing

    This paper describes the algorithms of the Photometer Performance Assessment (PPA) software component in the science data processing pipeline of the Kepler mission. The PPA performs two tasks: the first is to analyze the health and performance of the Kepler photometer based on the long cadence science data downlinked via Ka band approximately every 30 days; the second is to determine the attitude of the Kepler spacecraft with high precision at each long cadence. The PPA component is demonstrated to work effectively with the Kepler flight data.
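    As a highly simplified illustration of attitude monitoring, the sketch below summarizes the shift between measured star centroids and their catalog-predicted detector positions with a median offset. The actual PPA attitude solution is considerably more sophisticated; all numbers here are invented.

```python
import numpy as np

# Compare measured centroids against predicted positions and estimate a common
# shift. Arrays are (column, row) pixel coordinates for three hypothetical stars.
predicted = np.array([[512.10, 300.40], [ 80.25, 720.90], [950.60, 415.35]])
measured  = np.array([[512.23, 300.31], [ 80.37, 720.82], [950.74, 415.25]])

residuals = measured - predicted                 # per-star (column, row) offsets
offset = np.median(residuals, axis=0)            # robust estimate of the common shift
scatter = np.std(residuals - offset, axis=0)     # how well one shift explains all stars
print(f"median offset (col, row) = {offset}, residual scatter = {scatter}")
```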

    Overview of the Kepler Science Processing Pipeline

    The Kepler Mission Science Operations Center (SOC) performs several critical functions including managing the ~156,000 target stars, associated target tables, science data compression tables and parameters, as well as processing the raw photometric data downlinked from the spacecraft each month. The raw data are first calibrated at the pixel level to correct for bias, smear induced by a shutterless readout, and other detector and electronic effects. A background sky flux is estimated from ~4500 pixels on each of the 84 CCD readout channels, and simple aperture photometry is performed on an optimal aperture for each star. Ancillary engineering data and diagnostic information extracted from the science data are used to remove systematic errors in the flux time series that are correlated with these data prior to searching for signatures of transiting planets with a wavelet-based, adaptive matched filter. Stars with signatures exceeding 7.1 sigma are subjected to a suite of statistical tests including an examination of each star's centroid motion to reject false positives caused by background eclipsing binaries. Physical parameters for each planetary candidate are fitted to the transit signature, and signatures of additional transiting planets are sought in the residual light curve. The pipeline is operational, finding planetary signatures and providing robust elimination of false positives.
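    The sketch below illustrates two of the steps mentioned above with toy data: summing background-subtracted flux over an optimal aperture, and applying a 7.1-sigma detection cut. It is a schematic stand-in rather than the pipeline's wavelet-based adaptive matched filter; the function names and numbers are hypothetical.

```python
import numpy as np

def aperture_flux(pixels: np.ndarray, aperture_mask: np.ndarray, background: float) -> float:
    """Sum background-subtracted flux over the pixels in the optimal aperture."""
    return float(np.sum(pixels[aperture_mask] - background))

def exceeds_threshold(detection_statistic: float, threshold: float = 7.1) -> bool:
    """Flag a candidate when its detection statistic exceeds the 7.1-sigma cut."""
    return detection_statistic > threshold

# Illustrative use with made-up numbers for a single cadence:
cadence_pixels = np.full((5, 5), 110.0)
cadence_pixels[2, 2] = 900.0                       # bright central pixel of the star
mask = np.zeros((5, 5), dtype=bool)
mask[1:4, 1:4] = True                              # 3x3 "optimal" aperture
flux = aperture_flux(cadence_pixels, mask, background=100.0)
print(flux, exceeds_threshold(8.3), exceeds_threshold(5.0))
```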

    Pixel-Level Calibration in the Kepler Science Operations Center Pipeline

    We present an overview of the pixel-level calibration of flight data from the Kepler Mission performed within the Kepler Science Operations Center Science Processing Pipeline. This article describes the calibration (CAL) module, which operates on original spacecraft data to remove instrument effects and other artifacts that pollute the data. Traditional CCD data reduction is performed (removal of instrument/detector effects such as bias and dark current), in addition to pixel-level calibration (correcting for cosmic rays and variations in pixel sensitivity), Kepler-specific corrections (removing smear signals which result from the lack of a shutter on the photometer and correcting for distortions induced by the readout electronics), and additional operations that are needed due to the complexity and large volume of flight data. CAL operates on long (~30 min) and short (~1 min) sampled data, as well as full-frame images, and produces calibrated pixel flux time series, uncertainties, and other metrics that are used in subsequent pipeline modules. The raw and calibrated data are also archived in the Multi-mission Archive at Space Telescope (MAST) at the Space Telescope Science Institute for use by the astronomical community.
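    The following toy sketch conveys the flavor of the corrections listed above (bias removal, per-column smear subtraction for the shutterless readout, and a flat-field correction for pixel sensitivity) applied to a small synthetic frame. It is an assumption-laden simplification, not the CAL module's actual models or interfaces.

```python
import numpy as np

def calibrate_frame(raw: np.ndarray, bias: float, smear_per_column: np.ndarray,
                    flat_field: np.ndarray) -> np.ndarray:
    """Return a calibrated frame: (raw - bias - smear) / flat."""
    debiased = raw - bias
    desmeared = debiased - smear_per_column[np.newaxis, :]   # same smear in every row
    return desmeared / flat_field

# Synthetic 4x6 frame: constant bias, noise, and a column-dependent smear signal.
rng = np.random.default_rng(0)
raw = 100.0 + rng.normal(0.0, 2.0, size=(4, 6)) + np.linspace(0.0, 5.0, 6)
calibrated = calibrate_frame(raw,
                             bias=100.0,
                             smear_per_column=np.linspace(0.0, 5.0, 6),
                             flat_field=np.ones((4, 6)))
print(calibrated.round(2))
```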