
    An absolute calibration system for millimeter-accuracy APOLLO measurements

    Lunar laser ranging provides a number of leading experimental tests of gravitation -- important in our quest to unify General Relativity and the Standard Model of physics. The Apache Point Observatory Lunar Laser-ranging Operation (APOLLO) has for years achieved median range precision at the ~2 mm level. Yet residuals in model-measurement comparisons are an order of magnitude larger, raising the question of whether the ranging data are less accurate than they are precise, or whether the models are incomplete or ill-conditioned. This paper describes a new absolute calibration system (ACS) intended both as a tool for exposing and eliminating sources of systematic error, and as a means to directly calibrate ranging data in situ. The system consists of a high-repetition-rate (80 MHz) laser emitting short (< 10 ps) pulses that are locked to a cesium clock. In essence, the ACS delivers photons to the APOLLO detector at exquisitely well-defined time intervals as a "truth" input against which APOLLO's timing performance may be judged and corrected. Preliminary analysis indicates no inaccuracies in APOLLO data beyond the ~3 mm level, suggesting that historical APOLLO data are of high quality and motivating continued work on model capabilities. The ACS provides the means to deliver APOLLO data both accurate and precise below the 2 mm level.
    Comment: 21 pages, 10 figures, submitted to Classical and Quantum Gravity
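    The calibration idea above is simple to sketch: with pulses locked to an 80 MHz clock, photons should arrive on a 12.5 ns comb, and any offset of a measured arrival time from the nearest comb tooth is a timing error that maps directly to a range error. A minimal illustration (the function and variable names are hypothetical, not from the paper):

    ```python
    import numpy as np

    C = 299_792_458.0          # speed of light, m/s
    PULSE_PERIOD = 1.0 / 80e6  # 12.5 ns between ACS pulses

    def comb_residuals(photon_times_s):
        """Offset of each photon arrival time from the nearest comb pulse (s)."""
        t = np.asarray(photon_times_s, dtype=float)
        return (t + PULSE_PERIOD / 2) % PULSE_PERIOD - PULSE_PERIOD / 2

    def range_error_mm(photon_times_s):
        """One-way range error implied by a timing residual.

        Range uses two-way light travel time, so dt maps to dt * c / 2.
        """
        dt = comb_residuals(photon_times_s)
        return dt * C / 2 * 1e3  # millimeters
    ```

    At this scale a 13 ps timing error corresponds to roughly 2 mm of one-way range, which is why picosecond-level pulses against a cesium-referenced comb can expose millimeter-level systematics.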

    A Causal, Data-Driven Approach to Modeling the Kepler Data

    Astronomical observations are affected by several kinds of noise, each with its own causal source: photon noise, stochastic source variability, and residuals from imperfect calibration of the detector or telescope. The precision of NASA Kepler photometry for exoplanet science (the most precise photometric measurements of stars ever made) appears to be limited by unknown or untracked variations in spacecraft pointing and temperature, and by unmodeled stellar variability. Here we present the Causal Pixel Model (CPM) for Kepler data, a data-driven model intended to capture variability but preserve transit signals. The CPM works at the pixel level so that it can capture very fine-grained information about the variation of the spacecraft. The CPM predicts each target pixel value from a large number of pixels of other stars that share the instrument's variability while containing no information on possible transits in the target star. In addition, we use the target star's future and past (auto-regression). By appropriately separating, for each data point, the data into training and test sets, we ensure that information about any transit is perfectly isolated from the model. The method has four hyper-parameters (the number of predictor stars, the auto-regressive window size, and two L2-regularization amplitudes for model components), which we set by cross-validation. We determine a generic set of hyper-parameters that works well for most of the stars and apply the method to a corresponding set of target stars. We find that we can consistently outperform (for the purposes of exoplanet detection) the Kepler Pre-search Data Conditioning (PDC) method for exoplanet discovery.
    Comment: Accepted for publication in the PASP
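    The core of the approach described above is L2-regularized (ridge) prediction of each target pixel from other stars' pixels, with a held-out window around each time point so that a transit in the target cannot leak into its own model. A minimal sketch under those assumptions (illustrative names, not the paper's code; auto-regressive terms omitted for brevity):

    ```python
    import numpy as np

    def cpm_predict(target, predictors, lam=1e3, exclude=5):
        """Predict a target pixel's light curve from other stars' pixels.

        target:     (T,) flux time series of the target pixel
        predictors: (T, P) contemporaneous pixels of other stars
        lam:        L2-regularization amplitude
        exclude:    half-width of the held-out window around each time point
        """
        T = len(target)
        pred = np.empty(T)
        for t in range(T):
            # Train on all points except a window around t, so any transit
            # signal at t is isolated from its own model.
            train = np.ones(T, dtype=bool)
            train[max(0, t - exclude):t + exclude + 1] = False
            X, y = predictors[train], target[train]
            w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
            pred[t] = predictors[t] @ w
        return pred
    ```

    Subtracting `pred` from `target` then leaves residuals in which shared instrumental trends are suppressed while a transit, absent from the predictor pixels, survives.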

    Heart failure with preserved ejection fraction.

    Heart failure with preserved ejection fraction (HFpEF) has recently emerged as a major cause of cardiovascular morbidity and mortality. Contrary to initial beliefs, HFpEF is now known to be as common as heart failure with reduced ejection fraction (HFrEF) and carries an unacceptably high mortality rate. With a prevalence that has been steadily rising over the past two decades, it is very likely that HFpEF will represent the dominant heart failure phenotype over the coming few years. The scarcity of trials in this semi-discrete form of heart failure and the lack of unified enrolment criteria in the studies conducted to date might have contributed to the current absence of specific therapies. Understanding the epidemiological, pathophysiological and molecular differences (and similarities) between these two forms of heart failure is the cornerstone of developing targeted therapies. Carefully designed studies that adhere to unified diagnostic criteria, recruit appropriate controls and adopt practical end-points are urgently needed to help identify effective treatment strategies.

    Surface radiation budget for climate applications

    The Surface Radiation Budget (SRB) consists of the upwelling and downwelling radiation fluxes at the surface, separately determined for the broadband shortwave (SW) (0 to 5 microns) and longwave (LW) (greater than 5 microns) spectral regions, plus certain key parameters that control these fluxes, specifically SW albedo, LW emissivity, and surface temperature. The uses and requirements for SRB data, a critical assessment of current capabilities for producing these data, and directions for future research are presented.
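    The bookkeeping implied by these terms can be sketched directly: the key parameters listed (SW albedo, LW emissivity, surface temperature) close the budget once the downwelling fluxes are known. A minimal illustration assuming Stefan-Boltzmann upwelling LW and absorptivity equal to emissivity (the function itself is illustrative, not from the report):

    ```python
    SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

    def net_surface_radiation(sw_down, lw_down, albedo, emissivity, t_surface_k):
        """Net radiative flux absorbed by the surface (W m^-2, positive downward)."""
        sw_net = sw_down * (1.0 - albedo)                 # SW absorbed
        lw_up = emissivity * SIGMA * t_surface_k ** 4     # thermal emission
        lw_net = emissivity * lw_down - lw_up             # LW absorbed minus emitted
        return sw_net + lw_net
    ```

    For typical midday values (e.g. 800 W m^-2 SW down, 350 W m^-2 LW down, albedo 0.3, emissivity 0.98, 300 K surface) this yields a net absorption of roughly 450 W m^-2, which shows why errors in albedo or surface temperature propagate strongly into the budget.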

    Julian Ernst Besag, 26 March 1945 -- 6 August 2010, a biographical memoir

    Julian Besag was an outstanding statistical scientist, distinguished for his pioneering work on the statistical theory and analysis of spatial processes, especially conditional lattice systems. His work has been seminal in statistical developments over the last several decades, ranging from image analysis to Markov chain Monte Carlo methods. He clarified the role of auto-logistic and auto-normal models as instances of Markov random fields and paved the way for their use in diverse applications. Later work included investigations into the efficacy of nearest neighbour models to accommodate spatial dependence in the analysis of data from agricultural field trials, image restoration from noisy data, and texture generation using lattice models.
    Comment: 26 pages, 14 figures; minor revisions, omission of full bibliography

    SeaWiFS technical report series. Volume 5: Ocean optics protocols for SeaWiFS validation

    Protocols are presented for measuring optical properties, and other environmental variables, to validate the radiometric performance of the Sea-viewing Wide Field-of-view Sensor (SeaWiFS), and to develop and validate bio-optical algorithms for use with SeaWiFS data. The protocols are intended to establish foundations for a measurement strategy to verify the challenging SeaWiFS accuracy goals of 5 percent in water-leaving radiances and 35 percent in chlorophyll a concentration. The protocols first specify the variables which must be measured, and briefly review the rationale. Subsequent chapters cover detailed protocols for instrument performance specifications, characterizing and calibrating instruments, methods of making measurements in the field, and methods of data analysis. These protocols were developed at a workshop sponsored by the SeaWiFS Project Office (SPO) and held at the Naval Postgraduate School in Monterey, California (9-12 April, 1991). This report is the proceedings of that workshop, as interpreted and expanded by the authors and reviewed by workshop participants and other members of the bio-optical research community. The protocols are a first prescription to approach the unprecedented measurement accuracies implied by the SeaWiFS goals, and research and development are needed to improve the state of the art in specific areas. The protocols should be periodically revised to reflect technical advances during the SeaWiFS Project cycle.

    Dark energy survey year 1 results: the photometric data set for cosmology

    We describe the creation, content, and validation of the Dark Energy Survey (DES) internal year-one cosmology data set, Y1A1 GOLD, in support of upcoming cosmological analyses. The Y1A1 GOLD data set is assembled from multiple epochs of DES imaging and consists of calibrated photometric zero-points, object catalogs, and ancillary data products (e.g., maps of survey depth and observing conditions, star-galaxy classification, and photometric redshift estimates) that are necessary for accurate cosmological analyses. The Y1A1 GOLD wide-area object catalog consists of ~137 million objects detected in co-added images covering ~1800 deg(2) in the DES grizY filters. The 10 sigma limiting magnitude for galaxies is g = 23.4, r = 23.2, i = 22.5, z = 21.8, and Y = 20.1. Photometric calibration of Y1A1 GOLD was performed by combining nightly zero-point solutions with stellar locus regression, and the absolute calibration accuracy is better than 2% over the survey area. DES Y1A1 GOLD is the largest photometric data set at the achieved depth to date, enabling precise measurements of cosmic acceleration at z less than or similar to 1.

    New Approaches to Mapping Forest Conditions and Landscape Change from Moderate Resolution Remote Sensing Data across the Species-Rich and Structurally Diverse Atlantic Northern Forest of Northeastern North America

    The sustainable management of forest landscapes requires an understanding of the functional relationships between management practices, changes in landscape conditions, and ecological response. This presents a substantial need for spatial information in support of both applied research and adaptive management. Satellite remote sensing has the potential to address much of this need, but forest conditions and patterns of change remain difficult to synthesize over large areas and long time periods. Compounding this problem is error in forest attribute maps and consequent uncertainty in subsequent analyses. The research described in this document is directed at these long-standing problems. Chapter 1 demonstrates a generalizable approach to the characterization of predominant patterns of forest landscape change. Within a ~1.5 Mha northwest Maine study area, a time series of satellite-derived forest harvest maps (1973-2010) served as the basis for grouping landscape units according to time series of cumulative harvest area. Different groups reflected different harvest histories, which were linked to changes in landscape composition and configuration through time series of selected landscape metrics. Time series data resolved differences in landscape change attributable to the passage of the Maine Forest Practices Act, a major change in forest policy. Our approach should be of value in supporting empirical landscape research. Perhaps the single most important source of uncertainty in the characterization of landscape conditions is over- or under-representation of class prevalence caused by prediction bias. Systematic error is similarly impactful in maps of continuous forest attributes, where regression dilution or attenuation bias causes the overestimation of low values and underestimation of high values. In both cases, patterns of error tend to produce more homogeneous characterizations of landscape conditions. 
Chapters 2 and 3 present a machine learning method designed to simultaneously reduce systematic and total error in continuous and categorical maps, respectively. By training support vector machines with a multi-objective genetic algorithm, attenuation bias was substantially reduced in regression models of tree species relative abundance (chapter 2), and prediction bias was effectively removed from classification models predicting tree species occurrence and forest disturbance (chapter 3). This approach is generalizable to other prediction problems, other regions, or other geospatial disciplines.
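    The two systematic errors targeted above can be quantified with simple diagnostics: the difference between predicted and true class prevalence for categorical maps, and the slope of predictions regressed on observations for continuous maps (a slope below 1 indicates attenuation). A minimal sketch with illustrative names, not the thesis code:

    ```python
    import numpy as np

    def prevalence_bias(y_true, y_pred):
        """Predicted minus true positive-class prevalence (0 = unbiased)."""
        return np.mean(y_pred) - np.mean(y_true)

    def attenuation_slope(y_true, y_pred):
        """OLS slope of predictions on observations; < 1 indicates attenuation."""
        y_true = np.asarray(y_true, dtype=float)
        y_pred = np.asarray(y_pred, dtype=float)
        yc = y_true - y_true.mean()
        return np.dot(yc, y_pred - y_pred.mean()) / np.dot(yc, yc)
    ```

    Optimizing a model against such a bias measure alongside total error, as the multi-objective approach described here does, trades a small amount of overall accuracy for maps whose class prevalences and value ranges better match the landscape.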