
    Assessing, valuing and protecting our environment - is there a statistical challenge to be answered?

    This short article describes some of the evolution in environmental regulation, management and monitoring, and the associated information needs, which are closely aligned with the statistical challenges of delivering the evidence base for change and effect

    An integrated Bayesian model for estimating the long-term health effects of air pollution by fusing modelled and measured pollution data: a case study of nitrogen dioxide concentrations in Scotland

    The long-term health effects of air pollution can be estimated using a spatio-temporal ecological study, where the disease data are counts of hospital admissions from populations in small areal units at yearly intervals. Spatially representative pollution concentrations for each areal unit are typically estimated by applying Kriging to data from a sparse monitoring network, or by computing averages over grid level concentrations from an atmospheric dispersion model. We propose a novel fusion model for estimating spatially aggregated pollution concentrations using both the modelled and monitored data, and relate these concentrations to respiratory disease in a new study in Scotland between 2007 and 2011
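    The fusion step described above can be caricatured in a few lines. This is a hypothetical sketch with invented numbers, not the paper's Bayesian spatio-temporal model: the dispersion-model output is treated as a biased proxy for the true concentration, calibrated against the sparse monitor network, and the bias-corrected model output is then used for every areal unit.

```python
import numpy as np

# Toy fusion sketch (all numbers invented): calibrate biased dispersion-model
# output against a handful of unbiased monitors, then bias-correct everywhere.

rng = np.random.default_rng(1)

true_no2 = rng.uniform(10, 40, size=50)                # latent NO2, ug/m3, per areal unit
modelled = 0.6 * true_no2 + 12 + rng.normal(0, 2, 50)  # biased atmospheric-model output
monitored = true_no2[:10] + rng.normal(0, 1, 10)       # sparse, unbiased monitor network

# Calibrate: regress monitor values on co-located model output.
X = np.column_stack([np.ones(10), modelled[:10]])
alpha, beta = np.linalg.lstsq(X, monitored, rcond=None)[0]

# Fused estimate for every areal unit = bias-corrected model output,
# available even where no monitor exists.
fused = alpha + beta * modelled
```

The paper's model goes further by propagating the calibration uncertainty into the health-effect regression rather than plugging in point estimates.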

    Spatio-temporal Modelling of Remote-sensing Lake Surface Water Temperature Data

    Remote-sensing technology is widely used in environmental monitoring. The coverage and resolution of satellite-based data provide scientists with great opportunities to study and understand environmental change. However, the large volume of data and the missing observations in remote-sensing records present challenges to statistical analysis. This paper investigates two approaches to the spatio-temporal modelling of remote-sensing lake surface water temperature data. Both methods use the state space framework, but with different parameterizations to reflect different aspects of the problem. The appropriateness of the methods for identifying spatial/temporal patterns in the data is discussed
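    The appeal of the state space framework for data with gaps can be seen in a minimal local-level model (hypothetical parameters, not those fitted in the paper): on a day with no usable satellite retrieval, the Kalman filter simply skips the update step and lets the predictive variance grow.

```python
import numpy as np

# Minimal local-level Kalman filter handling missing observations (NaNs),
# as arise in remote-sensing series under cloud cover. Parameters are made up.

def kalman_local_level(y, q=0.1, r=1.0, m0=0.0, p0=10.0):
    """Filtered mean of the latent level; y may contain np.nan for missing."""
    m, p = m0, p0
    out = np.empty(len(y))
    for t, obs in enumerate(y):
        p = p + q                      # predict: level follows a random walk
        if not np.isnan(obs):          # update only when a measurement exists
            k = p / (p + r)            # Kalman gain
            m = m + k * (obs - m)
            p = (1 - k) * p
        out[t] = m                     # on missing days, carry the prediction
    return out

rng = np.random.default_rng(0)
level = np.cumsum(rng.normal(0, 0.3, 100)) + 15.0   # synthetic lake temperature
y = level + rng.normal(0, 1.0, 100)                 # noisy satellite retrievals
y[20:35] = np.nan                                   # a gap, e.g. cloud cover
filtered = kalman_local_level(y)
```

The two parameterizations compared in the paper extend this univariate idea to spatially indexed states.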

    Why do we need 14C inter-comparisons?: The Glasgow 14C inter-comparison series, a reflection over 30 years

    Radiocarbon measurement is a well-established, routinely used, yet complex series of inter-linked procedures. The degree of sample pre-treatment varies considerably depending on the material, the methods of processing pre-treated material vary across laboratories, and the detection of 14C at low levels remains challenging. As in any complex measurement process, the questions of quality assurance and quality control become paramount, both internally (i.e. within a laboratory) and externally (across laboratories). The issue of comparability of measurements (and thus bias, accuracy and precision of measurement) from the diverse laboratories is one that has been the focus of considerable attention for some time, both within the 14C community and the wider user communities. In the early years of the technique, when there were only a small number of laboratories in existence, inter-comparisons functioned on an ad hoc basis, usually involving small numbers of laboratories (e.g. Otlet et al., 1980). However, as more laboratories were set up and the detection methods were further developed (e.g. new AMS facilities), the need for more systematic work was recognised. The international efforts to create a global calibration curve also require the use of data generated by different laboratories at different times, so that evidence of laboratory offsets is needed to inform curve formation. As a result of these factors, but also as part of general good laboratory practice, including laboratory benchmarking and quality assurance, the 14C community has undertaken a wide-scale, far-reaching and evolving programme of global inter-comparisons, to the benefit of laboratories and users alike. This paper looks at some of that history and considers what has been achieved in the past 30 years

    Functional Principal Component Analysis for Non-stationary Dynamic Time Series

    Motivated by a highly dynamic hydrological high-frequency time series, we propose time-varying Functional Principal Component Analysis (FPCA) as a novel approach for the analysis of non-stationary Functional Time Series (FTS) in the frequency domain. Traditional FPCA does not take into account (i) the temporal dependence between the functional observations and (ii) the changes in the covariance/variability structure over time, which could result in inadequate dimension reduction. The novel time-varying FPCA proposed adapts to the changes in the auto-covariance structure and varies smoothly over frequency and time to allow investigation of whether and how the variability structure in an FTS changes over time. Based on the (smooth) time-varying dynamic FPCs, a bootstrap inference procedure is proposed to detect significant changes in the covariance structure over time. Although this time-varying dynamic FPCA can be applied to any dynamic FTS, it has been applied here to study the daily processes of partial pressure of CO2 in a small river catchment in Scotland
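    As a baseline for the time-varying dynamic version proposed in the paper, ordinary (static) FPCA on discretized curves amounts to an eigendecomposition of the sample covariance of the daily curves. The sketch below uses invented data on a made-up half-hourly grid, not the Scottish CO2 series.

```python
import numpy as np

# Static FPCA sketch: rows are daily curves on a common grid; the FPCs are
# eigenvectors of the sample covariance matrix. Data are simulated from two
# known modes of variation plus noise (all parameters invented).

rng = np.random.default_rng(2)
grid = np.linspace(0, 1, 48)                 # 48 half-hourly points in a day
n_days = 200
scores1 = rng.normal(0, 2.0, n_days)         # dominant mode of variation
scores2 = rng.normal(0, 0.5, n_days)         # secondary mode
curves = (np.outer(scores1, np.sin(2 * np.pi * grid))
          + np.outer(scores2, np.cos(2 * np.pi * grid))
          + rng.normal(0, 0.1, (n_days, len(grid))))

centred = curves - curves.mean(axis=0)
cov = centred.T @ centred / (n_days - 1)
eigvals, eigvecs = np.linalg.eigh(cov)       # eigh returns ascending order
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]

explained = eigvals / eigvals.sum()          # proportion of variance per FPC
```

The limitations listed in the abstract are visible here: this decomposition ignores both serial dependence between days and any drift in the covariance itself, which is what the time-varying dynamic FPCs address.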

    The role of Statistics in the era of big data: crucial, critical and under-valued

    What is the role of Statistics in the era of big data, or is Statistics still relevant? I will start this rather personal view with my answer: Statistics remains highly relevant irrespective of the ‘bigness’ of data; its role remains what it has always been, but is even more important now. As a community, we need to improve our explanations and presentations to make our relevance more visible

    Appraisal of the rehabilitation program in a Massachusetts county sanatorium

    This item was digitized by the Internet Archive. Thesis (Ed.M.)--Boston University

    Do agonistic behaviours bias baited remote underwater video surveys of fish?

    Marine environments require monitoring to determine the effects of impacts such as climate change, coastal development and pollution, and also to assess the effectiveness of conservation measures. Marine protected areas (MPAs) are being established globally and require periodic monitoring to determine whether their objectives are being met. Baited underwater video systems are becoming a popular method for monitoring change within protected fish populations, because they are less damaging to habitats than bottom trawling and allow for more statistically powerful comparisons to determine spatial and temporal patterns in the relative abundances, lengths and biomass of demersal and pelagic fishes. However, much remains uncertain about how interactions between the fish and the bait, and between the fish themselves, affect the results obtained. Agonistic behaviours are frequently observed around the bait at the camera and potentially bias fish density estimates by altering the number and size classes seen at cameras. Here we counted the number of agonistic behaviours between pink snappers (Pagrus auratus), the size of fish involved and whether the fish left the field of view following such behaviours. The study consisted of 20 baited underwater video deployments inside a New Zealand marine reserve and 20 in adjacent open areas. We observed a significant relationship between the peak number of fish observed at the camera and the total number of agonistic behaviours, as well as the number of both aggressor and subordinate fish leaving the camera field of view following interactions. The slope of the latter relationship, and thus the absolute number of fish leaving, was higher for subordinate fish. As subordinates were significantly smaller than aggressors, the apparent size frequency distribution is likely skewed away from smaller size classes. The staying time of the fish, and thus the maximum number of fish present at the camera, will be reduced by agonistic behaviours, and the absolute magnitude of this effect appears to be greater at high fish densities. Our results suggest that the overall effect of these phenomena is to underestimate differences in abundance between MPAs and open areas, but to overestimate differences in average size

    Interactive Teaching Tools for Spatial Sampling

    The statistical analysis of data which is measured over a spatial region is well established as a scientific tool which makes considerable contributions to a wide variety of application areas. Further development of these tools also remains a central part of the research scene in statistics. However, understanding of the concepts involved often benefits from an intuitive and experimental approach, as well as a formal description of models and methods. This paper describes software which is intended to assist in this understanding. The role of simulation is advocated, in order to explain the meaning of spatial correlation and to interpret the parameters involved in standard models. Realistic scenarios where decisions on the locations of sampling points in a spatial setting are required are also described. Students are provided with a variety of sampling strategies and invited to select the most appropriate one in two different settings. One involves water sampling in the lagoon of the Mururoa Atoll while the other involves sea bed sampling in a Scottish firth. Once a student has decided on a sampling strategy, simulated data are provided for further analysis. This extends the range of teaching activity from the analysis of data collected by others to involvement in data collection and the need to grapple with issues of design. It is argued that this approach has significant benefits in learning.
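    The simulation idea advocated above can be illustrated in miniature (all parameter values invented, and not drawn from the software described): a Gaussian random field on a small grid with an exponential covariance function, where changing the range parameter visibly changes the spatial smoothness students see.

```python
import numpy as np

# Simulate a Gaussian random field on an n x n grid over the unit square
# with an exponential covariogram, via a Cholesky factor of the covariance.
# Larger phi gives longer-range correlation and hence a smoother field.

def simulate_field(n=10, phi=0.3, sigma2=1.0, seed=0):
    xs, ys = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n))
    pts = np.column_stack([xs.ravel(), ys.ravel()])
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
    cov = sigma2 * np.exp(-d / phi)                      # exponential covariance
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(n * n))  # jitter for stability
    rng = np.random.default_rng(seed)
    return (L @ rng.normal(size=n * n)).reshape(n, n)

field = simulate_field()
```

Regenerating the field under different sampling designs, as the teaching tools do, then lets students compare how well each design recovers the underlying surface.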

    A period of calm in Scottish seas: a comprehensive study of ΔR values for the northern British Isles coast and the consequent implications for archaeology and oceanography

    The Marine Radiocarbon Reservoir Effect (MRE) is a 14C age offset between contemporaneous marine- and terrestrially-derived carbon. In Northern Hemisphere surface waters it is of the order of 400 years but temporal and spatial deviations, known as ΔR, occur. This study provides a comprehensive dataset of 21 ΔR and MRE values for the east coast of Scotland and 21 recalculated values for the west coast of Scotland and Ireland, for the period c. 3500 BC to 1450 AD. They are presented as mean, site-specific ΔR and MRE values, together with their associated uncertainties, calculated as standard errors for predicted values. The ΔR values range from -320 ± 35 to +150 ± 28 14C years and show no spatial or temporal trends. The MRE values range from 59 ± 40 to 531 ± 26, show an almost identical distribution pattern to the ΔR values and again show no spatial or temporal trends. Results show that ΔR values calculated for a single site using statistically indistinguishable groups of terrestrial and marine radiocarbon age measurements can produce variability of up to 225 14C years. ΔR is an important factor in the accurate calibration of samples containing marine-derived carbon for archaeological interpretation but is often also used as an indicator of changes in 14C specific activity of the oceans, and therefore a proxy for changes in ocean circulation and/or climate. Using the methods outlined in this paper, it is apparent that ΔR values for the northern part of the British Isles have been relatively stable, within our ability to quantify non-random variation in the data. The fact that significant climatic shifts have been recorded during this time, yet these are not visible in the ΔR data, presents a cautionary tale regarding the use of ΔR to infer large-scale oceanographic or climatic changes. Upon the exclusion of 5 outliers from the 42 values, the remaining ΔR values are statistically indistinguishable from one another and range from -142 ± 61 to +40 ± 47 14C years. 
Thirty-four of these values are from Scottish archaeological sites and can be combined to produce a mean value for Scotland of -47 ± 52 14C years for the period 3500 BC to 1450 AD, to be used only in the absence of site- and period-specific data
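    One plausible way such site-specific ΔR values might be pooled is an inverse-variance weighted mean with its standard error. This is a hypothetical sketch with invented numbers, not the 34 Scottish values or necessarily the paper's own pooling calculation.

```python
import numpy as np

# Inverse-variance weighted mean of DR values with its standard error.
# Values and 1-sigma uncertainties below are invented for illustration.

def weighted_mean_dr(values, errors):
    w = 1.0 / np.asarray(errors, dtype=float) ** 2   # weights = 1 / sigma^2
    values = np.asarray(values, dtype=float)
    mean = np.sum(w * values) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return mean, se

dr = [-120, -60, -30, 10, 40]        # invented DR values (14C yr)
err = [61, 45, 50, 40, 47]           # their quoted 1-sigma uncertainties
mean, se = weighted_mean_dr(dr, err)
```

On a trivial check, two values of 0 and 100 with equal errors of 10 pool to 50 ± √50, as expected for equal weights.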