94 research outputs found

    A simulation study comparing aberration detection algorithms for syndromic surveillance

    BACKGROUND: The usefulness of syndromic surveillance for early outbreak detection depends in part on effective statistical aberration detection. However, few published studies have compared different detection algorithms on identical data. In the largest simulation study conducted to date, we compared the performance of six aberration detection algorithms on simulated outbreaks superimposed on authentic syndromic surveillance data. METHODS: We compared three control-chart-based statistics, two exponentially weighted moving averages, and a generalized linear model. We simulated 310 unique outbreak signals, and added these to actual daily counts of four syndromes monitored by Public Health – Seattle and King County's syndromic surveillance system. We compared the sensitivity of the six algorithms at detecting these simulated outbreaks at a fixed alert rate of 0.01. RESULTS: Stratified by baseline or by outbreak distribution, duration, or size, the generalized linear model was more sensitive than the other algorithms and detected 54% (95% CI = 52%–56%) of the simulated epidemics when run at an alert rate of 0.01. However, all of the algorithms had poor sensitivity, particularly for outbreaks that did not begin with a surge of cases. CONCLUSION: When tested on county-level data aggregated across age groups, these algorithms often did not perform well in detecting signals other than large, rapid increases in case counts relative to baseline levels.
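    As a rough illustration of the exponentially weighted moving average (EWMA) class of detectors compared in this study, the following is a minimal Python sketch of a generic EWMA control chart run against a daily count series. The smoothing weight, threshold, and 28-day baseline are assumed values, not the authors' settings.

```python
import numpy as np

def ewma_alerts(counts, lam=0.4, threshold=3.0, baseline=28):
    """Flag days whose EWMA of standardized residuals exceeds a threshold.

    counts    : 1-D array of daily syndrome counts
    lam       : EWMA smoothing weight (assumed value)
    threshold : alert threshold in EWMA standard-deviation units
    baseline  : number of preceding days used to estimate mean and sd
    """
    counts = np.asarray(counts, dtype=float)
    alerts = []
    ewma = 0.0
    for t in range(baseline, len(counts)):
        hist = counts[t - baseline:t]
        mu, sd = hist.mean(), hist.std(ddof=1)
        z = (counts[t] - mu) / sd if sd > 0 else 0.0
        ewma = lam * z + (1 - lam) * ewma
        # steady-state sd of an EWMA of unit-variance inputs
        sigma_ewma = np.sqrt(lam / (2 - lam))
        if ewma > threshold * sigma_ewma:
            alerts.append(t)
    return alerts
```

    Tightening the threshold (or the smoothing weight) is how a detector like this would be tuned to a fixed alert rate such as the 0.01 used in the study.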

    Disease surveillance using a hidden Markov model

    Background: Routine surveillance of disease notification data can enable the early detection of localised disease outbreaks. Although hidden Markov models (HMMs) have been recognised as an appropriate method to model disease surveillance data, they have rarely been applied in public health practice. We aimed to develop and evaluate a simple, flexible HMM for disease surveillance which is suitable for use with sparse small area count data and requires little baseline data. Methods: A Bayesian HMM was designed to monitor routinely collected notifiable disease data that are aggregated by residential postcode. Semi-synthetic data were used to evaluate the algorithm and compare outbreak detection performance with the established Early Aberration Reporting System (EARS) algorithms and a negative binomial cusum. Results: Algorithm performance varied according to the desired false alarm rate for surveillance. At false alarm rates around 0.05, the cusum-based algorithms provided the best overall outbreak detection performance, having similar sensitivity to the HMMs and a shorter average time to detection. At false alarm rates around 0.01, the HMM algorithms provided the best overall outbreak detection performance, having higher sensitivity than the cusum-based methods and a generally shorter time to detection for larger outbreaks. Overall, the 14-day HMM had a significantly greater area under the receiver operator characteristic curve than the EARS C3 and 7-day negative binomial cusum algorithms. Conclusion: Our findings suggest that the HMM provides an effective method for the surveillance of sparse small area notifiable disease data at low false alarm rates. Further investigations are required to evaluate algorithm performance across other diseases and surveillance contexts.
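    The EARS comparison algorithms referred to here follow a simple moving-baseline scheme; below is a minimal Python sketch of a C2-style statistic with a 7-day baseline, 2-day guard band, and a threshold of 3 standard deviations. The standard-deviation floor is an assumption, and this is not the authors' exact implementation.

```python
import numpy as np

def ears_c2(counts, window=7, lag=2, threshold=3.0):
    """C2-style EARS statistic: standardize today's count against a 7-day
    baseline separated from today by a 2-day guard band, and alert when
    the standardized value exceeds the threshold."""
    counts = np.asarray(counts, dtype=float)
    alerts = []
    for t in range(window + lag, len(counts)):
        baseline = counts[t - lag - window:t - lag]
        mu, sd = baseline.mean(), baseline.std(ddof=1)
        c2 = (counts[t] - mu) / max(sd, 1.0)   # sd floor is an assumed choice
        if c2 > threshold:
            alerts.append(t)
    return alerts
```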

    Slower Visuomotor Corrections with Unchanged Latency are Consistent with Optimal Adaptation to Increased Endogenous Noise in the Elderly

    We analyzed age-related changes in motor response in a visuomotor compensatory tracking task. Subjects used a manipulandum to attempt to keep a displayed cursor at the center of a screen despite random perturbations to its location. Cross-correlation analysis of the perturbation and the subject response showed no age-related increase in latency until the onset of response to the perturbation, but substantial slowing of the response itself. Results are consistent with age-related deterioration in the ratio of signal to noise in visuomotor response. The task is such that it is tractable to use Bayesian and quadratic optimality assumptions to construct a model for behavior. This model assumes that behavior resembles an optimal controller subject to noise, and parametrizes response in terms of latency, willingness to expend effort, noise intensity, and noise bandwidth. The model is consistent with the data for all young (n = 12, age 20–30) and most elderly (n = 12, age 65–92) subjects. The model reproduces the latency result from the cross-correlation method. When presented with increased noise, the computational model reproduces the experimentally observed age-related slowing and the observed lack of increased latency. The model provides a precise way to quantitatively formulate the long-standing hypothesis that age-related slowing is an adaptation to increased noise.
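    The latency estimate described here comes from cross-correlating the perturbation with the subject's response; the sketch below shows a generic peak-lag estimator in Python, not the authors' exact analysis pipeline.

```python
import numpy as np

def response_latency(perturbation, response, dt):
    """Estimate visuomotor latency as the lag (in seconds) at which the
    cross-correlation between perturbation and response is strongest.
    Signals are mean-removed and only positive lags (response after
    perturbation) are considered."""
    p = np.asarray(perturbation, dtype=float) - np.mean(perturbation)
    r = np.asarray(response, dtype=float) - np.mean(response)
    xcorr = np.correlate(r, p, mode="full")       # index len(p)-1 is zero lag
    lags = np.arange(-len(p) + 1, len(p))
    positive = lags > 0
    best_lag = lags[positive][np.argmax(np.abs(xcorr[positive]))]
    return best_lag * dt
```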

    Early Detection of Tuberculosis Outbreaks among the San Francisco Homeless: Trade-Offs Between Spatial Resolution and Temporal Scale

    BACKGROUND: San Francisco has the highest rate of tuberculosis (TB) in the U.S., with recurrent outbreaks among the homeless and marginally housed. It has been shown for syndromic data that when exact geographic coordinates of individual patients are used as the spatial base for outbreak detection, higher detection rates and accuracy are achieved compared to when data are aggregated into administrative regions such as zip codes and census tracts. We examine the effect of varying the spatial resolution in the TB data within the San Francisco homeless population on detection sensitivity, timeliness, and the amount of historical data needed to achieve better performance measures. METHODS AND FINDINGS: We apply a variation of the space-time permutation scan statistic to the TB data in which a patient's location is represented either by its exact coordinates or by the centroid of its census tract. We show that the detection sensitivity and timeliness of the method generally improve when exact locations are used to identify real TB outbreaks. When outbreaks are simulated, the detection timeliness is consistently improved when exact coordinates are used, while the detection sensitivity varies depending on the size of the spatial scanning window and the number of tracts in which cases are simulated. Finally, we show that when exact locations are used, a smaller amount of historical data is required for training the model. CONCLUSION: Systematic characterization of the spatio-temporal distribution of TB cases can widely benefit real-time surveillance and guide public health investigations of TB outbreaks as to what level of spatial resolution results in improved detection sensitivity and timeliness. Trading higher spatial resolution for better performance is ultimately a trade-off between maintaining patient confidentiality and improving public health when sharing data. Understanding such trade-offs is critical to managing the complex interplay between public policy and public health. This study is a step forward in this direction.
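    The space-time permutation scan statistic used here scores cylinders of locations and recent days against expected counts derived from the data's marginals. Below is a simplified single-location sketch in Python; real scans also use circular windows spanning multiple locations and Monte Carlo significance testing, and all parameters here are assumptions.

```python
import numpy as np

def stp_llr(observed, expected, total):
    """Log generalized likelihood ratio for one candidate cluster
    (high-count clusters only), as used by permutation scan statistics."""
    if observed <= expected:
        return 0.0
    if expected <= 0:
        return float("inf")
    llr = observed * np.log(observed / expected)
    if total > observed:
        llr += (total - observed) * np.log((total - observed) / (total - expected))
    return llr

def scan_recent_cylinders(counts, max_days=7):
    """Scan single-location cylinders ending on the most recent day of a
    locations x days count matrix and return the highest-scoring cluster.

    Rows can index exact patient coordinates (one row per distinct point)
    or census-tract centroids; expected counts condition on the row and
    column marginals, as in the permutation null.
    """
    counts = np.asarray(counts, dtype=float)
    total = counts.sum()
    col_tot = counts.sum(axis=0)   # cases per day
    row_tot = counts.sum(axis=1)   # cases per location
    n_days = counts.shape[1]
    best_llr, best_cluster = 0.0, None
    for loc in range(counts.shape[0]):
        for w in range(1, min(max_days, n_days) + 1):
            window = slice(n_days - w, n_days)          # most recent w days
            obs = counts[loc, window].sum()
            exp = row_tot[loc] * col_tot[window].sum() / total
            llr = stp_llr(obs, exp, total)
            if llr > best_llr:
                best_llr, best_cluster = llr, (loc, w)
    return best_llr, best_cluster
```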

    Proposal of a framework for evaluating military surveillance systems for early detection of outbreaks on duty areas

    Background: In recent years a wide variety of epidemiological surveillance systems have been developed to provide early identification of outbreaks of infectious disease. Each system has had its own strengths and weaknesses. In 2002 a Working Group of the Centers for Disease Control and Prevention (CDC) produced a framework for evaluation, which proved suitable for many public health surveillance systems. However, this did not easily adapt to the military setting, where by necessity a variety of different parameters are assessed, different constraints are placed on the systems, and different objectives are required. This paper describes a proposed framework for evaluation of military syndromic surveillance systems designed to detect outbreaks of disease on operational deployments. Methods: The new framework described in this paper was developed from the cumulative experience of British and French military syndromic surveillance systems. The methods included a general assessment framework (CDC), followed by more specific methods of conducting evaluation. These included Knowledge/Attitude/Practice (KAP) surveys, technical audits, ergonomic studies, simulations and multi-national exercises. A variety of military constraints required integration into the evaluation. Examples of these include the variability of geographical conditions in the field, deployment to areas without prior knowledge of naturally-occurring disease patterns, the differences in field sanitation between locations and over the length of deployment, the mobility of military forces, turnover of personnel, continuity of surveillance across different locations, integration with surveillance systems from other nations working alongside each other, compatibility with non-medical information systems, and security. Results: A framework for evaluation has been developed that can be used for military surveillance systems in a staged manner consisting of initial, intermediate and final evaluations. For each stage of the process, parameters for assessment have been defined and methods identified. Conclusion: The combined experiences of French and British syndromic surveillance systems developed for use in deployed military forces have allowed the development of a specific evaluation framework. The tool is suitable for use by all nations who wish to evaluate syndromic surveillance in their own military forces. It could also be useful for civilian mobile systems or for national security surveillance systems.

    Resource profile and user guide of the Polygenic Index Repository

    Polygenic indexes (PGIs) are DNA-based predictors. Their value for research in many scientific disciplines is growing rapidly. As a resource for researchers, we used a consistent methodology to construct PGIs for 47 phenotypes in 11 datasets. To maximize the PGIs’ prediction accuracies, we constructed them using genome-wide association studies — some not previously published — from multiple data sources, including 23andMe and UK Biobank. We present a theoretical framework to help interpret analyses involving PGIs. A key insight is that a PGI can be understood as an unbiased but noisy measure of a latent variable we call the ‘additive SNP factor’. Regressions in which the true regressor is this factor but the PGI is used as its proxy therefore suffer from errors-in-variables bias. We derive an estimator that corrects for the bias, illustrate the correction, and make a Python tool for implementing it publicly available.
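    The errors-in-variables point can be illustrated with the textbook univariate correction. The sketch below is only the classical disattenuation formula under standard measurement-error assumptions, with the reliability supplied externally; it is not the repository's actual estimator, which is implemented in the authors' Python tool and handles covariates.

```python
import numpy as np

def disattenuated_slope(y, pgi, reliability):
    """Classical errors-in-variables correction for a univariate regression.

    Under classical measurement error, the OLS slope of y on a noisy proxy
    is attenuated by the proxy's reliability (the share of its variance due
    to the latent factor), so dividing by that reliability recovers the
    slope on the latent factor. `reliability` is an external estimate
    (assumed input), e.g. a ratio of the PGI's predictive R^2 to the
    SNP heritability.
    """
    y = np.asarray(y, dtype=float)
    x = np.asarray(pgi, dtype=float)
    beta_ols = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
    return beta_ols / reliability
```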

    Tracking the spatial diffusion of influenza and norovirus using telehealth data: A spatiotemporal analysis of syndromic data

    Background: Telehealth systems have a large potential for informing public health authorities in an early stage of outbreaks of communicable disease. Influenza and norovirus are common viruses that cause significant respiratory and gastrointestinal disease worldwide. Data about these viruses are not routinely mapped for surveillance purposes in the UK, so the spatial diffusion of national outbreaks and epidemics is not known as such incidents occur. We aim to describe the geographical origin and diffusion of rises in fever and vomiting calls to a national telehealth system, and consider the usefulness of these findings for influenza and norovirus surveillance. Methods: Data about fever calls (5- to 14-year-old age group) and vomiting calls (≥ 5-year-old age group) in school-age children, proxies for influenza and norovirus, respectively, were extracted from the NHS Direct national telehealth database for the period June 2005 to May 2006. The SaTScan space-time permutation model was used to retrospectively detect statistically significant clusters of calls on a week-by-week basis. These syndromic results were validated against existing laboratory and clinical surveillance data. Results: We identified two distinct periods of elevated fever calls. The first originated in the North-West of England during November 2005 and spread in a south-east direction, the second began in Central England during January 2006 and moved southwards. The timing, geographical location, and age structure of these rises in fever calls were similar to a national influenza B outbreak that occurred during winter 2005–2006. We also identified significantly elevated levels of vomiting calls in South-East England during winter 2005–2006. Conclusion: Spatiotemporal analyses of telehealth data, specifically fever calls, provided a timely and unique description of the evolution of a national influenza outbreak. In a similar way the tool may be useful for tracking norovirus, although the lack of consistent comparison data makes this more difficult to assess. In interpreting these results, care must be taken to consider other infectious and non-infectious causes of fever and vomiting. The scan statistic should be considered for spatial analyses of telehealth data elsewhere and will be used to initiate prospective geographical surveillance of influenza in England.
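    The week-by-week SaTScan analysis described here rests on a permutation null in which case dates are shuffled while locations are kept fixed. Below is a minimal Python sketch of that Monte Carlo significance step, with a user-supplied `scan_stat` function standing in for the scan itself (a hypothetical helper, not part of SaTScan).

```python
import numpy as np

def monte_carlo_pvalue(cases, scan_stat, n_locations, n_days,
                       n_reps=999, rng=None):
    """Monte Carlo significance test for a space-time scan statistic:
    the observed maximum statistic is ranked against replicates in which
    each case keeps its location but is assigned a randomly permuted day,
    preserving both the daily and per-location totals.

    cases     : list of (location_index, day_index) tuples
    scan_stat : callable taking a locations x days count matrix and
                returning the maximum cluster statistic (assumed helper)
    """
    rng = np.random.default_rng(rng)
    locs = np.array([c[0] for c in cases])
    days = np.array([c[1] for c in cases])

    def to_matrix(loc_idx, day_idx):
        m = np.zeros((n_locations, n_days))
        np.add.at(m, (loc_idx, day_idx), 1)
        return m

    observed = scan_stat(to_matrix(locs, days))
    exceed = sum(
        scan_stat(to_matrix(locs, rng.permutation(days))) >= observed
        for _ in range(n_reps)
    )
    return (exceed + 1) / (n_reps + 1)
```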

    A Methodological Framework for the Evaluation of Syndromic Surveillance Systems: A Case Study of England

    Background: Syndromic surveillance complements traditional public health surveillance by collecting and analysing health indicators in near real time. The rationale of syndromic surveillance is that it may detect health threats faster than traditional surveillance systems, permitting more timely, and hence potentially more effective, public health action. The effectiveness of syndromic surveillance largely relies on the methods used to detect aberrations. Very few studies have evaluated the performance of syndromic surveillance systems, and consequently little is known about the types of events that such systems can and cannot detect. Methods: We introduce a framework for the evaluation of syndromic surveillance systems, based upon the use of simulated scenarios, that can be used in any setting. For a range of scenarios, this allows the time to detection and the probability of detection to be determined, with uncertainty fully incorporated. In addition, we demonstrate how such a framework can model the benefits of increases in the number of centres reporting syndromic data and also determine the minimum size of outbreaks that can or cannot be detected. Here, we demonstrate its utility using simulations of national influenza outbreaks and localised outbreaks of cryptosporidiosis. Results: Influenza outbreaks are consistently detected, with larger outbreaks being detected in a more timely manner. Small cryptosporidiosis outbreaks (<1000 symptomatic individuals) are unlikely to be detected. We also demonstrate the advantages of having multiple syndromic data streams (e.g. emergency attendance data, telephone helpline data, general practice consultation data), as different streams are able to detect different types of outbreaks with different efficacy (e.g. emergency attendance data are useful for the detection of pandemic influenza but not for outbreaks of cryptosporidiosis). We also highlight that for any one disease the utility of data streams may vary geographically, and that the detection ability of syndromic surveillance varies seasonally (e.g. an influenza outbreak starting in July is detected sooner than one starting later in the year). We argue that our framework constitutes a useful tool for public health emergency preparedness in multiple settings. Conclusions: The proposed framework allows the exhaustive evaluation of any syndromic surveillance system and constitutes a useful tool for emergency preparedness and response.
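    The simulated-scenario evaluation described here amounts to injecting synthetic outbreaks into baseline data, running a detector, and recording whether and when it alerts. The Python sketch below shows that loop; the Poisson noise on the injected signal and the summary measures are assumptions, not the authors' exact framework.

```python
import numpy as np

def evaluate_detector(baseline, outbreak_signal, detector, start_day,
                      n_runs=100, rng=None):
    """Inject a simulated outbreak into baseline syndromic counts, run an
    aberration detector, and summarise detection probability and delay.

    detector : callable returning a list of alert-day indices
               (e.g. an EWMA or EARS-style detector)
    """
    rng = np.random.default_rng(rng)
    detected, delays = 0, []
    for _ in range(n_runs):
        series = np.array(baseline, dtype=float)
        injected = rng.poisson(outbreak_signal)        # noisy outbreak cases
        series[start_day:start_day + len(injected)] += injected
        alerts = [a for a in detector(series) if a >= start_day]
        if alerts:
            detected += 1
            delays.append(min(alerts) - start_day)
    prob = detected / n_runs
    mean_delay = float(np.mean(delays)) if delays else float("nan")
    return prob, mean_delay
```

    Running this over many scenarios (outbreak sizes, seasons, numbers of reporting centres) yields the probability-of-detection and timeliness surfaces the framework is built around.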

    Cyr61/CCN1 Displays High-Affinity Binding to the Somatomedin B 1–44 Domain of Vitronectin

    Cyr61 (CCN1) belongs to the CCN (CYR61/CTGF/NOV) family of extracellular-associated (matricellular) proteins, which present four distinct functional modules, namely an insulin-like growth factor binding protein (IGFBP), a von Willebrand factor type C (vWF), a thrombospondin type 1 (TSP), and a C-terminal growth factor cysteine knot (CT) domain. While heparin sulphate proteoglycans reportedly mediate the interaction of Cyr61 with the matrix and cell surface, the role of other extracellular-associated proteins has not been revealed. […] at high concentrations attenuate Cyr61 binding to immobilized VTNC, while monomeric VTNC was ineffective. Therefore, immobilization of VTNC exposes cryptic epitopes that recognize Cyr61 with high affinity, as reported for a number of antibodies, β-endorphin, and other molecules. […] domain suggests that VTNC represents a point of anchorage for CCN family members to the matrix. Results are discussed in the context of the role of CCN and VTNC in matrix biology and angiogenesis.