12 research outputs found

    A space-time conditional intensity model for infectious disease occurrence

    A novel point process model, continuous in space-time, is proposed for infectious disease data. Modelling is based on the conditional intensity function (CIF) and extends an additive-multiplicative CIF model previously proposed for discrete-space epidemic modelling. Estimation is performed by full maximum likelihood, and a simulation algorithm is presented. The particular application of interest is the stochastic modelling of the transmission dynamics of the two most common meningococcal antigenic sequence types observed in Germany, 2002–2008. Altogether, the proposed methodology represents a comprehensive and universal regression framework for the modelling, simulation and inference of self-exciting spatio-temporal point processes based on the CIF. Application is promoted by an implementation in the R package RLadyBug.
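
    The core object here is the conditional intensity function, which adds an endemic background rate to a self-exciting epidemic sum over previously observed cases. As a rough, purely temporal illustration of that self-exciting mechanism (not the paper's full space-time model or its RLadyBug implementation), the Python sketch below simulates such a process with Ogata's thinning algorithm; the constant endemic rate, the exponential infectivity kernel and all parameter values are placeholder assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Placeholder parameters: endemic rate, infectivity weight, kernel decay.
NU, ALPHA, BETA = 0.5, 0.6, 1.5

def cif(t, events):
    """Additive conditional intensity: endemic baseline plus exponentially
    decaying contributions from all events observed up to time t."""
    dt = t - events[events <= t]
    return NU + (ALPHA * BETA * np.exp(-BETA * dt)).sum()

def simulate(t_max=50.0):
    """Ogata thinning: with a constant endemic rate and a decaying kernel the
    CIF only decreases between events, so its current value is an upper bound
    that can be used to propose and then accept/reject candidate event times."""
    events, t = np.array([]), 0.0
    while True:
        bound = cif(t, events)
        t += rng.exponential(1.0 / bound)
        if t >= t_max:
            return events
        if rng.uniform() < cif(t, events) / bound:
            events = np.append(events, t)

print(f"simulated {simulate().size} events on [0, 50]")
```

    Because the exponential kernel integrates to ALPHA < 1, each case generates fewer than one secondary case on average, so the simulated process remains stable over the observation window.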

    Detecting early signals of COVID-19 outbreaks in 2020 in small areas by monitoring healthcare utilisation databases: first lessons learned from the Italian Alert_CoV project

    During the COVID-19 pandemic, large-scale diagnostic testing and contact tracing have proven insufficient to promptly monitor the spread of infections. Aim: To develop and retrospectively evaluate a system that identifies aberrations in the use of selected healthcare services in order to detect COVID-19 outbreaks in small areas in a timely manner. Methods: Data were retrieved from the healthcare utilisation (HCU) databases of the Lombardy Region, Italy. We identified eight services suggesting a respiratory infection (syndromic proxies). Count time series reporting the weekly occurrence of each proxy from 2015 to 2020 were generated for small administrative areas (i.e. census units of the Cremona and Mantua provinces). The ability to uncover aberrations during 2020 was tested for two algorithms: the improved Farrington algorithm and the generalised likelihood ratio-based procedure for negative binomial counts. To evaluate these algorithms' performance in detecting outbreaks earlier than standard surveillance, confirmed outbreaks, defined according to the weekly number of confirmed COVID-19 cases, were used as the reference. Performance was assessed separately for the first and second semester of the year, and proxies that positively affected performance were identified. Results: We estimated that 70% of outbreaks could be detected early using the proposed approach, with a corresponding false positive rate of about 20%. Performance did not differ substantially between algorithms or semesters. The best proxies included emergency calls for respiratory or infectious-disease causes and emergency room visits. Conclusion: Implementing HCU-based monitoring systems in small areas deserves further investigation, as it could facilitate the containment of COVID-19 and other as yet unknown infectious diseases in the future.
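
    Both detectors compare each new weekly proxy count with an upper bound built from baseline weeks of earlier years and raise an alarm when the bound is exceeded. The Python sketch below illustrates that logic with a deliberately simplified threshold rule (historical mean plus a multiple of the standard deviation); it is a stand-in for, not an implementation of, the improved Farrington algorithm or the negative binomial GLR procedure, and the window sizes and z-value are arbitrary assumptions.

```python
import numpy as np

def simple_threshold(history, z=2.58):
    """Very simplified aberration rule: upper bound from baseline weeks.
    (The improved Farrington algorithm instead fits an overdispersed GLM
    with trend and down-weights past outbreak weeks.)"""
    mu, sd = history.mean(), history.std(ddof=1)
    return mu + z * max(sd, np.sqrt(mu))  # guard against near-zero variance

def detect(series, week, window=1, years=5, weeks_per_year=52):
    """Collect counts from the same calendar week +/- `window` in the previous
    `years` years and flag the current week if it exceeds the threshold."""
    baseline = []
    for y in range(1, years + 1):
        for w in range(-window, window + 1):
            idx = week - y * weeks_per_year + w
            if 0 <= idx < week:
                baseline.append(series[idx])
    thr = simple_threshold(np.array(baseline, dtype=float))
    return series[week] > thr, thr

# Toy example: six years of weekly counts with an artificial spike in the last week.
rng = np.random.default_rng(0)
counts = rng.poisson(5, size=6 * 52)
counts[-1] = 20
alarm, thr = detect(counts, week=len(counts) - 1)
print(f"alarm={alarm}, threshold={thr:.1f}, observed={counts[-1]}")
```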

    Predicting seasonal influenza transmission using functional regression models with temporal dependence

    This paper proposes a novel approach that uses meteorological information to predict the incidence of influenza in Galicia (Spain). It extends Generalized Least Squares (GLS) methods in the multivariate framework to functional regression models with dependent errors. These kinds of models are useful when the recent history of influenza incidence is not readily available (for instance, because of delays in communication with health informants) and the prediction must be constructed by correcting for the temporal dependence of the residuals and using more accessible variables. A simulation study shows that the GLS estimators yield better estimates of the regression model parameters than the classical estimators do. They obtain very good results from the predictive point of view and are competitive with the classical time series approach for influenza incidence. An iterative version of the GLS estimator (called iGLS) is also proposed, which can help to model complicated dependence structures. For constructing the model, the distance correlation measure is employed to select relevant information for predicting the influenza rate, mixing multivariate and functional variables. These kinds of models are extremely useful to health managers in allocating resources in advance to manage influenza epidemics.
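
    The iterative GLS idea alternates between estimating the regression coefficients and re-estimating the error dependence used to whiten the data. The Python sketch below shows that loop in a much simpler setting than the paper's functional regression: a single scalar covariate with AR(1) errors, handled in Cochrane-Orcutt style; the AR(1) structure, the covariate and all parameter values are illustrative assumptions, not the authors' specification.

```python
import numpy as np

def igls_ar1(X, y, n_iter=5):
    """Iterative GLS sketch with AR(1) errors: (1) OLS fit, (2) estimate the
    residual autocorrelation rho, (3) quasi-difference (whiten) X and y with
    rho and re-fit, repeating so the estimates stabilise."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]                      # step 1: OLS
    for _ in range(n_iter):
        resid = y - X @ beta
        rho = (resid[1:] @ resid[:-1]) / (resid[:-1] @ resid[:-1])   # step 2: AR(1) coefficient
        Xs = X[1:] - rho * X[:-1]                                    # step 3: quasi-differencing
        ys = y[1:] - rho * y[:-1]
        beta = np.linalg.lstsq(Xs, ys, rcond=None)[0]
    return beta, rho

# Toy example: intercept plus one covariate, with AR(1) noise.
rng = np.random.default_rng(1)
n = 300
x = rng.normal(size=n)
eps = np.zeros(n)
for t in range(1, n):
    eps[t] = 0.7 * eps[t - 1] + rng.normal(scale=0.5)
y = 2.0 + 1.5 * x + eps
X = np.column_stack([np.ones(n), x])
beta, rho = igls_ar1(X, y)
print(f"beta = {beta.round(2)}, rho = {rho:.2f}")
```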

    Doctor of Philosophy

    Public health surveillance systems are crucial for the timely detection of and response to public health threats. Since the terrorist attacks of September 11, 2001, and the release of anthrax in the following month, there has been heightened interest in public health surveillance. The years immediately following these attacks were met with increased awareness and funding from the federal government, which has significantly strengthened the United States' surveillance capabilities; despite these improvements, however, today's public health surveillance systems face substantial challenges. Problems with the current surveillance systems include: a) failure to leverage unstructured public health data for surveillance purposes; and b) lack of information integration and of the ability to leverage resources, applications or other surveillance efforts, because systems are built on a centralized model. This research addresses these problems by focusing on the development and evaluation of new informatics methods to improve public health surveillance. To address the problems above, we first identified a current public health surveillance workflow that is affected by the problems described and offers an opportunity for enhancement through current informatics techniques. The 122 Mortality Surveillance for Pneumonia and Influenza was chosen as the primary use case for this dissertation work. The second step involved demonstrating the feasibility of using unstructured public health data, in this case death certificates. For this we created and evaluated a pipeline, composed of a detection rule and a natural language processor, for the coding of death certificates and the identification of pneumonia and influenza cases. The second problem was addressed by presenting the rationale for creating a federated model by leveraging grid technology concepts and tools for the sharing and epidemiological analysis of public health data. As a case study of this approach, a secured virtual organization was created in which users are able to access two grid data services, using death certificates from the Utah Department of Health, and two analytical grid services, MetaMap and R. A scientific workflow was created using the published services to replicate the mortality surveillance workflow. To validate these approaches and provide proofs of concept, a series of real-world scenarios was conducted.
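
    The death certificate pipeline couples a natural language processor (MetaMap in the case study) with a detection rule that decides whether a certificate counts as a pneumonia and influenza (P&I) case. As a toy stand-in for such a rule, assuming plain free-text cause-of-death fields rather than MetaMap concept output, the Python sketch below flags certificates whose text mentions a P&I-related term; the keyword list is purely illustrative.

```python
import re

# Illustrative keyword rule over free-text cause-of-death fields. The
# dissertation's pipeline instead codes the text with MetaMap first and
# applies its detection rule to the resulting concepts.
P_AND_I_PATTERN = re.compile(r"\b(pneumonia|influenza|flu)\b", re.IGNORECASE)

def is_pneumonia_influenza(cause_of_death_text: str) -> bool:
    """Return True if the certificate text mentions a P&I-related term."""
    return bool(P_AND_I_PATTERN.search(cause_of_death_text))

certificates = [
    "Acute respiratory failure due to influenza A",
    "Cardiac arrest secondary to myocardial infarction",
    "Aspiration pneumonia; advanced dementia",
]
print([is_pneumonia_influenza(c) for c in certificates])  # [True, False, True]
```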

    A local maximum likelihood (LML) smoother with the Poisson regression model, for regression analysis of count data

    Dissertation presented as a partial requirement for the degree of Master in Statistics and Information Management. The Poisson regression model is the basis of parametric regression analysis for count data, but the restrictions imposed by this model are strong and frequently not satisfied in practice, notably the equidispersion assumption, i.e. that the conditional mean and variance are equal. The negative binomial model allows for overdispersion, but when overdispersion is high the negative binomial model does not fit the data. A recurrent situation in practice is the presence of excess zeros, for which the most suitable models are the zero-inflated Poisson (ZIP) and zero-inflated negative binomial (ZINB) models. Several nonparametric and semiparametric models have emerged that neither impose the restrictions of the parametric models nor depend on correct model specification. In Santos (2005), a local maximum likelihood Poisson smoother is developed and its bias, variance and asymptotic distribution are derived, showing good fit to real and simulated data. Despite performing well, this model proves computationally heavy owing to its exponential specification. This work presents a local maximum likelihood Poisson model, based on the kernel smoother and on local polynomial regression, which drops the exponential specification used in Santos (2005), so that the model is specified locally by a first-degree polynomial.
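
    The estimator maximises, at each evaluation point, a kernel-weighted Poisson log-likelihood in which the local conditional mean is a first-degree polynomial rather than the exponential of one. The Python sketch below illustrates that idea on simulated counts; the Gaussian kernel, the bandwidth, the Nelder-Mead optimiser and the positivity clipping of the local mean are convenience assumptions, not the dissertation's actual estimator.

```python
import numpy as np
from scipy.optimize import minimize

def local_poisson_fit(x0, x, y, h=0.5):
    """Local maximum-likelihood Poisson smoother with a first-degree local
    polynomial and identity link: at x0, maximise the kernel-weighted Poisson
    log-likelihood of m_i = b0 + b1*(x_i - x0) and return b0 as the fit."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)        # Gaussian kernel weights
    d = x - x0

    def neg_loglik(b):
        m = np.clip(b[0] + b[1] * d, 1e-8, None)  # keep the local mean positive
        return -np.sum(w * (y * np.log(m) - m))

    start = np.array([max(np.average(y, weights=w), 1e-3), 0.0])
    return minimize(neg_loglik, start, method="Nelder-Mead").x[0]

# Toy count data with a smooth underlying rate.
rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0, 10, 200))
y = rng.poisson(3 + 2 * np.sin(x))
grid = np.linspace(0.5, 9.5, 10)
print(np.round([local_poisson_fit(g, x, y) for g in grid], 2))
```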

    Count data regression charts for the monitoring of surveillance time series

    Control charts based on the Poisson and negative binomial distributions for monitoring time series of counts, such as those typically arising in the surveillance of infectious diseases, are presented. The in-control mean is assumed to be time-varying and linear on the log scale, with intercept and seasonal components. If a shift in the intercept occurs, the system goes out of control. Using the generalized likelihood ratio (GLR) statistic, a monitoring scheme is formulated to detect online whether a shift in the intercept has occurred. In the Poisson case the necessary quantities of the GLR detector can be computed efficiently by recursive formulas. Extensions to more general alternatives, e.g. those containing an autoregressive epidemic component, are discussed. Using Monte Carlo simulations, run-length properties of the proposed schemes are investigated and the Poisson scheme is compared with existing methods. The practicability of the charts is demonstrated by applying them to the observed number of Salmonella Hadar cases in Germany, 2001–2006.
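
    For the Poisson chart, the GLR statistic maximises, over candidate change points, the log-likelihood ratio of a multiplicative shift in the in-control mean, and an alarm is raised once the statistic crosses a threshold. The Python sketch below computes that statistic by brute force rather than with the paper's recursive formulas; the seasonal in-control mean, the shift size and the alarm threshold are illustrative assumptions, and in practice the threshold would be calibrated to a target in-control run length.

```python
import numpy as np

def glr_poisson(y, mu0, threshold=5.0):
    """GLR detector for a multiplicative shift exp(kappa) in the in-control
    Poisson mean mu0[t]. For each candidate change point k, kappa is profiled
    out in closed form; the log-likelihood ratio is then maximised over k.
    Returns the first time an alarm is raised (or None)."""
    n = len(y)
    for t in range(n):
        best = 0.0
        for k in range(t + 1):
            ysum = y[k:t + 1].sum()
            msum = mu0[k:t + 1].sum()
            kappa = np.log(max(ysum, 1e-12) / msum)   # profile MLE of the shift
            if kappa <= 0:                            # only upward shifts are of interest
                continue
            llr = kappa * ysum - (np.exp(kappa) - 1.0) * msum
            best = max(best, llr)
        if best >= threshold:
            return t
    return None

# Toy example: seasonal log-linear in-control mean with an intercept shift after week 80.
rng = np.random.default_rng(3)
weeks = np.arange(120)
mu0 = np.exp(1.0 + 0.4 * np.sin(2 * np.pi * weeks / 52))
y = rng.poisson(mu0 * np.where(weeks >= 80, np.exp(0.6), 1.0))
print("alarm at week:", glr_poisson(y, mu0))
```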