
    Light attenuation characteristics of glacially-fed lakes

    Transparency is a fundamental characteristic of aquatic ecosystems and is highly responsive to changes in climate and land use. The transparency of glacially-fed lakes may be a particularly sensitive sentinel characteristic of these changes. However, little is known about the relative contributions of glacial flour versus other factors affecting light attenuation in these lakes. We sampled 18 glacially-fed lakes in Chile, New Zealand, and the U.S. and Canadian Rocky Mountains to characterize how dissolved absorption, algal biomass (approximated by chlorophyll a), water, and glacial flour contributed to attenuation of ultraviolet radiation (UVR) and photosynthetically active radiation (PAR, 400–700 nm). Variation in attenuation across lakes was related to turbidity, which we used as a proxy for the concentration of glacial flour. Turbidity-specific diffuse attenuation coefficients increased with decreasing wavelength and distance from glaciers. Regional differences in turbidity-specific diffuse attenuation coefficients were observed at short UVR wavelengths (305 and 320 nm) but not at longer UVR wavelengths (380 nm) or PAR. Dissolved absorption coefficients, which are closely correlated with diffuse attenuation coefficients in most non-glacially-fed lakes, represented only about one quarter of diffuse attenuation coefficients in the study lakes here, whereas glacial flour contributed about two thirds across UVR and PAR. Understanding the optical characteristics of substances that regulate light attenuation in glacially-fed lakes will help elucidate the signals that these systems provide of broader environmental changes and forecast the effects of climate change on these aquatic ecosystems.
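As background to the attenuation coefficients discussed in this abstract, a diffuse attenuation coefficient Kd is conventionally estimated as the negative slope of log-irradiance against depth (Beer–Lambert attenuation). The sketch below uses entirely synthetic data, not values from the study:

```python
import numpy as np

# Hypothetical irradiance depth profile: depths (m) and downwelling
# irradiance E(z); the synthetic profile is built with Kd = 1.2 m^-1
depths = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
irradiance = 100.0 * np.exp(-1.2 * depths)

# Kd is the negative slope of ln(E) versus depth
slope, intercept = np.polyfit(depths, np.log(irradiance), 1)
kd = -slope
print(round(kd, 2))  # → 1.2
```

In practice the regression would be run per wavelength band (e.g. 305, 320, 380 nm, PAR), yielding the wavelength-dependent coefficients compared across lakes above.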

    Contending with animal bones (Editorial)

    This issue has been assembled in order to focus on some of the current directions in animal remains research. Since serious study of ancient animal remains began in the nineteenth century, this field and its specific areas of inquiry have evolved and diversified, and this collection of papers highlights that diversity by including contributions that address issues from excavation and field recording methods and preservational conditions, to the use of bone for understanding past animal populations, as well as bones as proxy indicators for human activities. This volume is not meant only for the attention of the faunal remains specialist, and only a couple of these papers have actually been contributed by "archaeozoologists". Rather, we hope to demonstrate the importance of faunal remains studies, on a par with lithic or pottery research. It should be acknowledged that animal bones do not relate simply to the "economic" aspects of a culture but to all areas of the life world.

    Experimenting on Human Subjects

    The problems surrounding free and informed consent are discussed in this article by Dr. May, who is Professor of Religion in the Department of Religion and Religious Education at the Catholic University of America.

    Surveying Persons with Disabilities: A Source Guide (Version 1)

    As a collaborator with the Cornell Rehabilitation Research and Training Center on Disability Demographics and Statistics, Mathematica Policy Research, Inc. (MPR) has been working on a project that identifies the strengths and limitations of existing disability data collection, in both content and data collection methodology. The intended outcomes of this project include expanding and synthesizing knowledge of best practices and the extent to which existing data use those practices, informing the development of data enhancement options, and contributing to a more informed use of existing data. In an effort to provide the public with an up-to-date and easily accessible source of research on the methodological issues associated with surveying persons with disabilities, MPR has prepared a Source Guide of material related to this topic. The Source Guide contains 150 abstracts, summaries, and references, followed by a Subject Index, which cross-references the sources from the Reference List under various subjects. The Source Guide is viewed as a "living document" and will be periodically updated.

    On Loss Functions and Ranking Forecasting Performances of Multivariate Volatility Models

    A large number of parameterizations have been proposed to model conditional variance dynamics in a multivariate framework. However, little is known about the ranking of multivariate volatility models in terms of their forecasting ability. The ranking of multivariate volatility models is inherently problematic because it requires the use of a proxy for the unobservable volatility matrix, and this substitution may severely affect the ranking. We address this issue by investigating the properties of the ranking with respect to alternative statistical loss functions used to evaluate model performances. We provide conditions on the functional form of the loss function that ensure the proxy-based ranking to be consistent for the true one - i.e., the ranking that would be obtained if the true variance matrix were observable. We identify a large set of loss functions that yield a consistent ranking. In a simulation study, we sample data from a continuous-time multivariate diffusion process and compare the ordering delivered by both consistent and inconsistent loss functions. We further discuss the sensitivity of the ranking to the quality of the proxy and the degree of similarity between models. An application to three foreign exchange rates, where we compare the forecasting performance of 16 multivariate GARCH specifications, is provided. Keywords: volatility, multivariate GARCH, matrix norm, loss function, model confidence set.
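The consistency issue described in this abstract can be illustrated with a small sketch: a squared Frobenius-norm loss (a quadratic loss of the kind known to be robust to unbiased proxy noise) evaluated against a realized-covariance-style proxy built from simulated returns. All matrices, sample sizes, and forecasts below are hypothetical, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# True conditional covariance and two competing model forecasts (hypothetical)
sigma_true = np.array([[1.0, 0.3], [0.3, 1.0]])
forecast_a = np.array([[1.05, 0.28], [0.28, 1.02]])  # close to the truth
forecast_b = np.array([[1.40, 0.10], [0.10, 0.70]])  # far from the truth

def frobenius_loss(proxy, forecast):
    """Squared Frobenius-norm loss between a volatility proxy and a
    forecast; quadratic losses of this form preserve the true ranking
    in expectation when the proxy is unbiased for the true matrix."""
    return np.sum((proxy - forecast) ** 2)

# Proxy: outer-product estimate from returns drawn under the true covariance
returns = rng.multivariate_normal([0.0, 0.0], sigma_true, size=5000)
proxy = returns.T @ returns / len(returns)

loss_a = frobenius_loss(proxy, forecast_a)
loss_b = frobenius_loss(proxy, forecast_b)
print(loss_a < loss_b)  # the proxy-based ranking matches the true one
```

With an inconsistent loss function, or a very noisy proxy, the proxy-based ordering of the two forecasts can diverge from the ordering under the true matrix, which is precisely the ranking distortion the paper studies.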

    The Greenhouse Gas Climate Change Initiative (GHG-CCI): comparative validation of GHG-CCI SCIAMACHY/ENVISAT and TANSO-FTS/GOSAT CO₂ and CH₄ retrieval algorithm products with measurements from the TCCON

    Column-averaged dry-air mole fractions of carbon dioxide and methane have been retrieved from spectra acquired by the TANSO-FTS (Thermal And Near-infrared Sensor for carbon Observations-Fourier Transform Spectrometer) and SCIAMACHY (Scanning Imaging Absorption Spectrometer for Atmospheric Cartography) instruments on board GOSAT (Greenhouse gases Observing SATellite) and ENVISAT (ENVIronmental SATellite), respectively, using a range of European retrieval algorithms. These retrievals have been compared with data from ground-based high-resolution Fourier transform spectrometers (FTSs) of the Total Carbon Column Observing Network (TCCON). The participating algorithms are the weighting function modified differential optical absorption spectroscopy (DOAS) algorithm (WFMD, University of Bremen), the Bremen optimal estimation DOAS algorithm (BESD, University of Bremen), the iterative maximum a posteriori DOAS algorithm (IMAP, Jet Propulsion Laboratory (JPL) and Netherlands Institute for Space Research (SRON)), the proxy and full-physics versions of SRON's RemoTeC algorithm (SRPR and SRFP, respectively), and the proxy and full-physics versions of the University of Leicester's adaptation of the OCO (Orbiting Carbon Observatory) algorithm (OCPR and OCFP, respectively). The goal of this algorithm inter-comparison was to identify strengths and weaknesses of the so-called round-robin data sets generated with the various algorithms, so as to determine which of the competing algorithms would proceed to the next round of the European Space Agency's (ESA) Greenhouse Gas Climate Change Initiative (GHG-CCI) project: the generation of the so-called Climate Research Data Package (CRDP), the first version of the Essential Climate Variable (ECV) "greenhouse gases" (GHGs). For XCO₂, all algorithms meet the precision requirement for inverse modelling (< 8 ppm), although WFMD has a lower precision (4.7 ppm) than the other algorithm products (2.4–2.5 ppm). When looking at the seasonal relative accuracy (SRA, the variability of the bias in space and time), none of the algorithms reaches the demanding < 0.5 ppm threshold. For XCH₄, the precision of both SCIAMACHY products (50.2 ppb for IMAP and 76.4 ppb for WFMD) fails to meet the < 34 ppb threshold for inverse modelling, but note that this work focuses on the period after the 2005 SCIAMACHY detector degradation. The GOSAT XCH₄ precision ranges between 14.0 and 18.1 ppb. Looking at the SRA, all GOSAT algorithm products reach the < 10 ppb threshold (values ranging between 5.4 and 6.2 ppb). For SCIAMACHY, IMAP and WFMD have SRAs of 17.2 and 10.5 ppb, respectively.
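The two validation statistics quoted in this abstract, single-sounding precision and seasonal relative accuracy (bias variability in space and time), can be sketched as follows, assuming (hypothetically) that colocated satellite-minus-TCCON differences are grouped into station-season bins. All numbers here are synthetic, not the GHG-CCI results:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical colocated differences (satellite XCO2 minus TCCON XCO2, ppm),
# grouped into station-season bins as in a round-robin validation
n_bins, n_per_bin = 12, 40
bin_biases = rng.normal(0.0, 0.4, size=n_bins)  # bias varying in space/time
diffs = bin_biases[:, None] + rng.normal(0.0, 2.0, size=(n_bins, n_per_bin))

# Single-sounding precision: mean within-bin scatter of the differences
precision = diffs.std(axis=1).mean()

# Seasonal relative accuracy: variability of the mean bias across bins
sra = diffs.mean(axis=1).std()

print(f"precision ~ {precision:.1f} ppm, SRA ~ {sra:.2f} ppm")
```

Thresholds such as < 8 ppm (XCO₂ precision) or < 0.5 ppm (XCO₂ SRA) would then be checked against statistics of this kind for each algorithm product.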

    Discussion of: A statistical analysis of multiple temperature proxies: Are reconstructions of surface temperatures over the last 1000 years reliable?

    Discussion of "A statistical analysis of multiple temperature proxies: Are reconstructions of surface temperatures over the last 1000 years reliable?" by B.B. McShane and A.J. Wyner [arXiv:1104.4002]. Published in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org) at http://dx.doi.org/10.1214/10-AOAS409.