
    Sea turtle nesting in the Ten Thousand Islands of Florida

    Loggerhead sea turtles (Caretta caretta) nest in numerous substrate and beach types within the Ten Thousand Islands (TTI) of southwest Florida. Nesting beach selection was analyzed on 12 islands within this archipelago. Numerous physical characteristics were recorded to identify the relatedness of these variables and to determine their importance for nesting beach selection in C. caretta. These variables were chosen after evaluating the islands, conducting literature searches and soliciting personal communications. Along transects, data were collected on the following: height of canopy, beach width, overall slope (beach slope and slope of offshore approach), and sand samples analyzed for pH, percentage of water, percentage of organic content, percentage of carbonate and particle size (8 size classes). Data on ordinal aspect of beaches and beach length were also recorded and included in the analysis. All of the variables were analyzed by tree regression, incorporating the nesting data into the analysis. In the TTI, loggerheads appear to prefer wider beaches (p < 0.001; R² = 0.56) that inherently have less slope, and secondarily, wider beaches that have low amounts of carbonate (p < 0.001). In addition, C. caretta favors nest sites within or in close proximity to the supra-littoral vegetation zone of beaches in the TTI (p < 0.001). (86-page document)
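
    As a rough illustration of the tree-regression step described above, the sketch below fits a regression tree to simulated stand-in transect data; the variable names, simulated values, and the use of scikit-learn are assumptions for illustration and do not reproduce the study's actual data or software.

```python
# Hedged sketch of a tree regression on beach characteristics (simulated stand-in data).
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(0)
n = 120
beaches = pd.DataFrame({
    "beach_width_m": rng.uniform(2, 40, n),
    "beach_slope_deg": rng.uniform(1, 12, n),
    "pct_carbonate": rng.uniform(0, 90, n),
    "canopy_height_m": rng.uniform(0, 10, n),
    "median_grain_mm": rng.uniform(0.1, 1.0, n),
})
# Hypothetical response: more nests on wide, low-carbonate beaches, plus noise.
beaches["nests"] = (0.4 * beaches["beach_width_m"]
                    - 0.05 * beaches["pct_carbonate"]
                    + rng.normal(0, 2, n)).clip(lower=0).round()

predictors = ["beach_width_m", "beach_slope_deg", "pct_carbonate",
              "canopy_height_m", "median_grain_mm"]
tree = DecisionTreeRegressor(max_depth=3, min_samples_leaf=10, random_state=0)
tree.fit(beaches[predictors], beaches["nests"])
print(export_text(tree, feature_names=predictors))  # the top splits should mostly involve width, then carbonate
```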

    Model-robust regression and a Bayesian ``sandwich'' estimator

    We present a new Bayesian approach to model-robust linear regression that leads to uncertainty estimates with the same robustness properties as the Huber–White sandwich estimator. The sandwich estimator is known to provide asymptotically correct frequentist inference, even when standard modeling assumptions such as linearity and homoscedasticity in the data-generating mechanism are violated. Our derivation provides a compelling Bayesian justification for using this simple and popular tool, and it also clarifies what is being estimated when the data-generating mechanism is not linear. We demonstrate the applicability of our approach using a simulation study and health care cost data from an evaluation of the Washington State Basic Health Plan. Comment: Published at http://dx.doi.org/10.1214/10-AOAS362 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org).
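
    For readers unfamiliar with the robustness target mentioned above, the sketch below computes the Huber–White (HC0) sandwich covariance for ordinary least squares directly from its definition and compares it with the classical model-based covariance; the simulated heteroscedastic data and constants are invented purely for illustration.

```python
# Huber-White (HC0) sandwich covariance for OLS, computed by hand on simulated data.
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = rng.uniform(0, 2, n)
X = np.column_stack([np.ones(n), x])
y = 1.0 + 0.5 * x + rng.normal(scale=0.3 + 0.8 * x)   # error variance grows with x

beta = np.linalg.solve(X.T @ X, X.T @ y)              # least-squares point estimate
resid = y - X @ beta

bread = np.linalg.inv(X.T @ X)
meat = X.T @ (resid[:, None] ** 2 * X)                # sum_i e_i^2 x_i x_i'
sandwich = bread @ meat @ bread                       # robust (sandwich) covariance estimate

classical = bread * resid.var(ddof=X.shape[1])        # model-based covariance (assumes constant variance)
print("robust SEs:   ", np.sqrt(np.diag(sandwich)))
print("classical SEs:", np.sqrt(np.diag(classical)))
```

    Under the heteroscedasticity built into this toy simulation, the robust and classical standard errors diverge, which is exactly the gap the sandwich estimator is designed to close.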

    Herpetofaunal Inventories of the National Parks of South Florida and the Caribbean: Volume III. Big Cypress National Preserve

    Amphibian declines and extinctions have been documented around the world, often in protected natural areas. Concern for this trend has prompted the U.S. Geological Survey and the National Park Service to document all species of amphibians that occur within U.S. National Parks and to search for any signs that amphibians may be declining. This study, an inventory of amphibian species in Big Cypress National Preserve, was conducted from 2002 to 2003. The goals of the project were to create a georeferenced inventory of amphibian species, to use new analytical techniques to estimate the proportion of sites occupied by each species, to look for any signs of amphibian decline (missing species, disease, die-offs, and so forth), and to establish a protocol that could be used for future monitoring efforts. Several sampling methods were used to accomplish these goals. Visual encounter surveys and anuran vocalization surveys were conducted in all habitats throughout the park to estimate the proportion of sites or proportion of area occupied (PAO) by each amphibian species in each habitat. Opportunistic collections, as well as limited drift fence data, were used to augment the visual encounter methods for highly aquatic or cryptic species. A total of 545 visits to 104 sites were conducted for standard sampling alone, and 2,358 individual amphibians and 374 reptiles were encountered. Data analysis was conducted in program PRESENCE to provide PAO estimates for each of the anuran species. All of the amphibian species historically found in Big Cypress National Preserve were detected during this project. At least one individual of each of the four salamander species was captured during sampling. Each of the anuran species in the preserve was adequately sampled using standard herpetological sampling methods, and PAO estimates were produced for each species of anuran by habitat. This information serves as an indicator of habitat associations of the species and the relative abundance of sites occupied, and it will also be useful as a comparative baseline for future monitoring efforts. In addition to sampling for amphibians, all encounters with reptiles were documented. The sampling methods used for detecting amphibians are also appropriate for many reptile species. These reptile locations are included in this report, but the number of reptile observations was not sufficient to estimate PAO for reptile species. We encountered 35 of the 46 species of reptiles believed to be present in Big Cypress National Preserve during this study, and evidence exists of the presence of four other reptile species in the Preserve. This study found no evidence of amphibian decline in Big Cypress National Preserve. Although no evidence of decline was observed, several threats to amphibians were identified. Introduced species, especially the Cuban treefrog (Osteopilus septentrionalis), are predators of and competitors with several native frog species. The recreational use of off-road vehicles has the potential to affect some amphibian populations, and a study on those potential impacts is currently underway. Also, interference by humans with the natural hydrologic cycle of south Florida has the potential to alter the amphibian community. Continued monitoring of the amphibian species in Big Cypress National Preserve is recommended. The methods used in this study were adequate to produce reliable estimates of the proportion of sites occupied by most anuran species, and they are a cost-effective means of determining the status of their populations.
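
    The PAO estimates above were produced with program PRESENCE; as a simplified illustration of the underlying single-season occupancy model, the sketch below maximizes a stripped-down likelihood with a constant occupancy probability (psi) and detection probability (p). The detection histories are hypothetical, and the constant-parameter assumption is a simplification of the habitat-specific analysis described above.

```python
# Stripped-down single-season occupancy likelihood (constant psi and p), maximized numerically.
import numpy as np
from scipy.optimize import minimize

# Hypothetical detection histories: rows = sites, columns = repeat visits (1 = detected).
Y = np.array([[0, 1, 0], [0, 0, 0], [1, 1, 0], [0, 0, 0], [1, 0, 1]])

def neg_log_lik(params):
    psi, p = 1 / (1 + np.exp(-params))     # logit scale keeps both parameters in (0, 1)
    d = Y.sum(axis=1)                      # detections per site
    J = Y.shape[1]                         # visits per site
    # Sites with detections must be occupied; undetected sites are either
    # occupied-but-missed or truly unoccupied.
    lik_detected = psi * p**d * (1 - p)**(J - d)
    lik_never = psi * (1 - p)**J + (1 - psi)
    return -np.log(np.where(d > 0, lik_detected, lik_never)).sum()

fit = minimize(neg_log_lik, x0=[0.0, 0.0])
psi_hat, p_hat = 1 / (1 + np.exp(-fit.x))
print(f"estimated PAO (psi) = {psi_hat:.2f}, detection probability (p) = {p_hat:.2f}")
```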

    Trading Bias for Precision: Decision Theory for Intervals and Sets

    Interval- and set-valued decisions are an essential part of statistical inference. Despite this, the justification behind them is often unclear, leading in practice to a great deal of confusion about exactly what is being presented. In this paper we review and attempt to unify several competing methods of interval construction within a formal decision-theoretic framework. The result is a new emphasis on interval estimation as a distinct goal, and not as an afterthought to point estimation. We also see that representing intervals as trade-offs between measures of precision and bias unifies many existing approaches, as well as suggesting interpretable criteria to calibrate this trade-off. The novel statistical arguments produced allow many extensions, and we apply these to resolve several outstanding areas of disagreement between Bayesians and frequentists.
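
    To make the precision-bias trade-off concrete, the sketch below chooses a symmetric interval by minimizing a generic loss of the form width plus a penalty times miss probability under a normal approximation; the loss, penalty constant, and numbers are assumptions made only for illustration and are not the calibration criteria developed in the paper.

```python
# Choosing an interval by trading width (precision) against miss probability (bias-type error).
import numpy as np
from scipy.stats import norm

theta_hat, se, k = 1.3, 0.4, 10.0       # hypothetical point estimate, standard error, miss penalty

half_widths = np.linspace(0.01, 3 * se, 500)
expected_loss = 2 * half_widths + k * 2 * norm.sf(half_widths / se)  # width + k * P(theta outside interval)

best = half_widths[np.argmin(expected_loss)]
print(f"chosen interval: ({theta_hat - best:.2f}, {theta_hat + best:.2f})")
print(f"implied coverage: {1 - 2 * norm.sf(best / se):.1%}")
```

    Raising the penalty k pushes the chosen interval toward higher coverage at the cost of width, which is the kind of trade-off the decision-theoretic framing makes explicit.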

    Model-Robust Bayesian Regression and the Sandwich Estimator

    PLEASE NOTE THAT AN UPDATED VERSION OF THIS RESEARCH IS AVAILABLE AS WORKING PAPER 338 IN THE UNIVERSITY OF WASHINGTON BIOSTATISTICS WORKING PAPER SERIES (http://www.bepress.com/uwbiostat/paper338). In applied regression problems there is often sufficient data for accurate estimation, but standard parametric models do not accurately describe the source of the data, so associated uncertainty estimates are not reliable. We describe a simple Bayesian approach to inference in linear regression that recovers least-squares point estimates while providing correct uncertainty bounds by explicitly recognizing that standard modeling assumptions need not be valid. Our model-robust development parallels frequentist estimating equations and leads to intervals with the same robustness properties as the 'sandwich' estimator.
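
    One well-known construction with a similar flavour is the Bayesian bootstrap, sketched below: posterior draws of Dirichlet-weighted least-squares coefficients concentrate on the ordinary least-squares fit with spread comparable to sandwich standard errors. This is an illustrative analogue on simulated data, not necessarily the authors' exact model.

```python
# Bayesian bootstrap for linear regression: Dirichlet-weighted least-squares draws.
import numpy as np

rng = np.random.default_rng(1)
n = 300
x = rng.uniform(0, 2, n)
X = np.column_stack([np.ones(n), x])
y = 1.0 + 0.5 * x + rng.normal(scale=0.3 + 0.8 * x)   # heteroscedastic errors, for illustration

draws = []
for _ in range(2000):
    w = rng.dirichlet(np.ones(n))                     # one draw of observation weights
    WX = X * w[:, None]
    draws.append(np.linalg.solve(WX.T @ X, WX.T @ y)) # weighted least-squares coefficients
draws = np.array(draws)

print("posterior means:", draws.mean(axis=0))         # close to the least-squares point estimates
print("posterior SDs:  ", draws.std(axis=0))          # comparable to robust (sandwich) standard errors
```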

    The Role of Environmental Heterogeneity in Meta‐Analysis of Gene–Environment Interactions With Quantitative Traits

    With challenges in data harmonization and environmental heterogeneity across various data sources, meta-analysis of gene–environment interaction studies can often involve subtle statistical issues. In this paper, we study the effect of environmental covariate heterogeneity (within and between cohorts) on two approaches for fixed-effect meta-analysis: the standard inverse-variance weighted meta-analysis and a meta-regression approach. Akin to the results in Simmonds and Higgins ( ), we obtain analytic efficiency results for both methods under certain assumptions. The relative efficiency of the two methods depends on the ratio of within- versus between-cohort variability of the environmental covariate. We propose to use an adaptively weighted estimator (AWE), between meta-analysis and meta-regression, for the interaction parameter. The AWE retains the full efficiency of the joint analysis using individual-level data under certain natural assumptions. Lin and Zeng (2010a, b) showed that a multivariate inverse-variance weighted estimator retains the full efficiency of joint analysis using individual-level data if the estimates with full covariance matrices for all the common parameters are pooled across all studies. We show consistency of our work with Lin and Zeng (2010a, b). Without sacrificing much efficiency, the AWE uses only univariate summary statistics from each study and bypasses issues with sharing individual-level data or full covariance matrices across studies. We compare the performance of the methods both analytically and numerically. The methods are illustrated through meta-analysis of the interaction between single nucleotide polymorphisms in the FTO gene and body mass index on high-density lipoprotein cholesterol, using data from a set of eight studies of type 2 diabetes.
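
    The fixed-effect baseline referred to above is the standard inverse-variance weighted (IVW) pooling of study-specific interaction estimates; a minimal sketch with made-up estimates from eight studies follows. The adaptively weighted estimator itself, and the meta-regression comparator, are not reproduced here.

```python
# Inverse-variance weighted fixed-effect pooling of per-study interaction estimates (made-up numbers).
import numpy as np

beta_gxe = np.array([0.021, 0.035, -0.004, 0.018, 0.027, 0.010, 0.040, 0.015])  # study-specific GxE estimates
se_gxe = np.array([0.012, 0.015, 0.020, 0.011, 0.018, 0.016, 0.022, 0.013])     # their standard errors

w = 1 / se_gxe**2
beta_pooled = np.sum(w * beta_gxe) / np.sum(w)
se_pooled = np.sqrt(1 / np.sum(w))

print(f"IVW interaction estimate = {beta_pooled:.3f} (SE {se_pooled:.3f})")
```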

    Addressing the estimation of standard errors in fixed effects meta-analysis.

    Standard methods for fixed effects meta-analysis assume that standard errors for study-specific estimates are known, not estimated. While the impact of this simplifying assumption has been shown in a few special cases, its general impact is not well understood, nor are general-purpose tools available for inference under more realistic assumptions. In this paper, we aim to elucidate the impact of using estimated standard errors in fixed effects meta-analysis, showing why it does not go away in large samples and quantifying how badly miscalibrated standard inference will be if it is ignored. We also show the important role of a particular measure of heterogeneity in this miscalibration. These developments lead to confidence intervals for fixed effects meta-analysis with improved performance for both location and scale parameters.
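
    A toy simulation of the issue is sketched below: study-specific standard errors are estimated from small samples, plugged into the usual fixed-effect formula, and the empirical coverage of the nominal 95% interval is checked. The simulation design (study sizes, true effect, error scale) is invented for illustration and is not taken from the paper.

```python
# Coverage of the naive fixed-effect interval when study standard errors are estimated, not known.
import numpy as np

rng = np.random.default_rng(2)
true_mean, n_studies, n_per_study, reps = 0.5, 10, 8, 5000
covered = 0
for _ in range(reps):
    means, ses = [], []
    for _ in range(n_studies):
        sample = rng.normal(true_mean, 1.0, n_per_study)
        means.append(sample.mean())
        ses.append(sample.std(ddof=1) / np.sqrt(n_per_study))  # estimated standard error
    means, ses = np.array(means), np.array(ses)
    w = 1 / ses**2
    pooled = np.sum(w * means) / np.sum(w)
    pooled_se = np.sqrt(1 / np.sum(w))
    covered += (pooled - 1.96 * pooled_se <= true_mean <= pooled + 1.96 * pooled_se)

print(f"empirical coverage of the nominal 95% interval: {covered / reps:.1%}")
```

    In this toy setup, with only eight observations per study, the empirical coverage falls noticeably below 95% even though the number of studies could be made arbitrarily large, which is the sense in which the miscalibration does not go away in large samples.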

    Dust filtration at gap edges: Implications for the spectral energy distributions of discs with embedded planets

    The spectral energy distributions (SEDs) of some T Tauri stars display a deficit of near-IR flux that could be a consequence of an embedded Jupiter-mass planet partially clearing an inner hole in the circumstellar disc. Here, we use two-dimensional numerical simulations of the planet-disc interaction, in concert with simple models for the dust dynamics, to quantify how a planet influences the dust at different radii within the disc. We show that pressure gradients at the outer edge of the gap cleared by the planet act as a filter, letting particles smaller than a critical size through to the inner disc while holding back larger particles in the outer disc. The critical particle size depends upon the disc properties, but is typically of the order of 10 microns. This filtration process will lead to discontinuous grain populations across the planet's orbital radius, with small grains in the inner disc and an outer population of larger grains. We show that this type of dust population is qualitatively consistent with SED modelling of systems that have optically thin inner holes in their circumstellar discs. This process can also produce a very large gas-to-dust ratio in the inner disc, potentially explaining those systems with optically thin inner cavities that still have relatively high accretion rates. Comment: 9 pages, 7 figures. Accepted for publication in MNRAS.

    Combined deletion of Xrcc4 and Trp53 in mouse germinal center B cells leads to novel B cell lymphomas with clonal heterogeneity

    Background: Activated B lymphocytes harbor programmed DNA double-strand breaks (DSBs) initiated by activation-induced deaminase (AID) and repaired by non-homologous end-joining (NHEJ). While it has been proposed that these DSBs during secondary antibody gene diversification are the primary source of chromosomal translocations in germinal center (GC)-derived B cell lymphomas, this point has not been directly addressed due to the lack of proper mouse models. Methods: In the current study, we establish a unique mouse model by specifically deleting an NHEJ gene, Xrcc4, and a cell cycle checkpoint gene, Trp53, in GC B cells, which results in the spontaneous development of B cell lymphomas that possess features of GC B cells. Results: We show that these NHEJ-deficient lymphomas harbor translocations frequently targeting immunoglobulin (Ig) loci. Furthermore, we found that Ig translocations were associated with distinct mechanisms, probably caused by AID- or RAG-induced DSBs. Intriguingly, the AID-associated Ig loci translocations target either the c-myc or the Pvt-1 locus, whereas the partners of RAG-associated Ig translocations are scattered randomly across the genome. Lastly, these NHEJ-deficient lymphomas harbor complicated genomes, including segmental translocations, and exhibit a high level of ongoing DNA damage and clonal heterogeneity. Conclusions: We propose that combined NHEJ and p53 defects may serve as an underlying mechanism for a high level of genomic complexity and clonal heterogeneity in cancers.

    Coagulation factor VIII, white matter hyperintensities and cognitive function: Results from the Cardiovascular Health Study

    Objective: To investigate the relationship between high FVIII clotting activity (FVIII:C), MRI-defined white matter hyperintensities (WMH) and cognitive function over time. Methods: Data from the population-based Cardiovascular Health Study (n = 5,888, aged ≥ 65) were used. FVIII:C was measured in blood samples taken at baseline. WMH burden was assessed on two cranial MRI scans taken roughly 5 years apart. Cognitive function was assessed annually using the Modified Mini-Mental State Examination (3MSE) and Digit Symbol Substitution Test (DSST). We used ordinal logistic regression models adjusted for demographic and cardiovascular factors in cross-sectional and longitudinal WMH analyses, and adjusted linear regression and linear mixed models in the analyses of cognitive function. Results: After adjustment for confounding, higher levels of FVIII:C were not strongly associated with the burden of WMH on the initial MRI scan (OR>p75 = 1.20, 95% CI 0.99-1.45; N = 2,735) nor with WMH burden worsening over time (OR>p75 = 1.18, 95% CI 0.87-1.59; N = 1,527). High FVIII:C showed no strong association with cognitive scores cross-sectionally (3MSE>p75 β = -0.06, 95% CI -0.45 to 0.32, N = 4,005; DSST>p75 β = -0.69, 95% CI -1.52 to 0.13, N = 3,954) or over time (3MSE>p75 β = -0.07, 95% CI -0.58 to 0.44, N = 2,764; DSST>p75 β = -0.22, 95% CI -0.97 to 0.53, N = 2,306) after confounding adjustment. Interpretation: The results from this cohort study of older adult participants indicate no strong relationships between higher FVIII:C levels and WMH burden or cognitive function in cross-sectional and longitudinal analyses.
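
    As a rough sketch of the kind of ordinal logistic regression used for the WMH-grade analyses, the code below fits a proportional-odds model with statsmodels on simulated stand-in data; the variable names, simulated values, and short covariate list are hypothetical, and the study's actual adjustment set is more extensive.

```python
# Ordinal (proportional-odds) logistic regression of a WMH-style grade on an FVIII indicator (simulated data).
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(0)
n = 1000
chs = pd.DataFrame({
    "fviii_above_p75": rng.integers(0, 2, n),
    "age": rng.normal(75, 5, n),
    "male": rng.integers(0, 2, n),
    "hypertension": rng.integers(0, 2, n),
})
# Hypothetical ordinal outcome (grades 0-4), loosely related to age and the FVIII group.
latent = 0.05 * (chs["age"] - 75) + 0.2 * chs["fviii_above_p75"] + rng.logistic(size=n)
chs["wmh_grade"] = pd.cut(latent, bins=[-np.inf, -1, 0, 1, 2, np.inf], labels=False)

model = OrderedModel(chs["wmh_grade"],
                     chs[["fviii_above_p75", "age", "male", "hypertension"]],
                     distr="logit")
result = model.fit(method="bfgs", disp=False)
print(result.summary())  # the fviii_above_p75 coefficient is a log odds ratio for a higher WMH grade
```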