
    Estimation of the basic reproductive number and mean serial interval of a novel pathogen in a small, well-observed discrete population

    BACKGROUND: Accurately assessing the transmissibility and serial interval of a novel human pathogen is a public health priority, so that the timing and required strength of interventions may be determined. Recent theoretical work has focused on making the best use of data from the initial exponential phase of growth of incidence in large populations. METHODS: We measured generational transmissibility by the basic reproductive number R0 and the serial interval by its mean Tg. First, we constructed a simulation algorithm for case data arising from a small population of known size with R0 and Tg also known. We then developed an inferential model for the likelihood of these case data as a function of R0 and Tg. The model was designed to capture (a) any signal of the serial interval distribution in the initial stochastic phase, (b) the growth rate of the exponential phase, and (c) the unique combination of R0 and Tg that generates a specific shape of peak incidence when the susceptible portion of a small population is depleted. FINDINGS: Extensive repeat simulation and parameter estimation revealed no bias in univariate estimates of either R0 or Tg. We were also able to estimate both R0 and Tg simultaneously; however, accurate final estimates could be obtained only much later in the outbreak. In particular, estimates of Tg were considerably less accurate in the bivariate case until the peak of incidence had passed. CONCLUSIONS: The basic reproductive number and mean serial interval can be estimated simultaneously in real time during an outbreak of an emerging pathogen. Repeated application of these methods to small-scale outbreaks at the start of an epidemic would permit accurate estimates of key parameters.
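    The simulation step described in the Methods can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions (a Poisson offspring distribution and a gamma-distributed serial interval with mean Tg), not the authors' algorithm; `simulate_outbreak` and its parameters are hypothetical names.

```python
import numpy as np

def simulate_outbreak(n=1000, r0=2.0, tg=3.0, n_seed=1, rng_seed=0):
    """Simulate daily case counts for an outbreak in a closed population.

    Minimal sketch, not the paper's algorithm: each case infects
    Poisson(R0 * S/N) secondary cases, whose onsets are offset by a
    gamma serial interval with mean `tg` (shape 2 assumed).
    """
    rng = np.random.default_rng(rng_seed)
    susceptible = n - n_seed
    pending = [0.0] * n_seed            # onset times awaiting processing
    onsets = []
    while pending:
        pending.sort()
        t = pending.pop(0)              # earliest unprocessed case
        onsets.append(t)
        # offspring count shrinks as susceptibles are depleted
        k = min(rng.poisson(r0 * susceptible / n), susceptible)
        susceptible -= k
        pending.extend(t + rng.gamma(2.0, tg / 2.0) for _ in range(k))
    edges = np.arange(0, int(max(onsets)) + 2)
    incidence, _ = np.histogram(onsets, bins=edges)
    return incidence                    # cases per day

print(simulate_outbreak()[:20])         # early daily incidence
```

    Repeating such simulations with R0 and Tg known yields case series against which a likelihood-based estimator can be checked for bias, as the authors do.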

    Simulation-guided design of serological surveys of the cumulative incidence of influenza infection


    Association between Intrapartum Interventions and Breastfeeding Duration


    Factors contributing to early breast-feeding cessation among Chinese mothers: An exploratory study

    Background: Although more than 85% of all new mothers in Hong Kong now initiate breast feeding, few exclusively breast feed and the overall duration is short. More than one-third stop breast feeding within the first month post partum. Objective: To explore the breast-feeding experiences of Hong Kong Chinese mothers who prematurely discontinue breast feeding, and to identify contributing factors that might be remediated to help women breast feed longer. Design: Qualitative exploratory study. Methods: In-depth, exploratory interviews were carried out with 24 new mothers who stopped breast feeding within one month after birth, and content analysis was used to analyse the data. Findings: Five core themes emerged from the data: unnatural expectations, left to figure it out, uncertainty, unfulfilling experiences, and guilt versus relief. Because breast feeding is ‘natural’, participants expected that it would come naturally and thus be easy. When breast feeding did not happen naturally, however, midwives were too busy to provide breast-feeding support and mothers were left to figure it out on their own. Participants also reported difficulty in gauging whether the infant was getting adequate nutrition from their breastmilk. Few participants had positive breast-feeding experiences; while the decision to stop breast feeding caused guilt for most participants, others expressed relief at stopping. Key conclusions and implications for practice: Greater postnatal breast-feeding support, both in the hospital and after the mother returns home, would likely increase the mother's confidence and enhance her mothering experience. Further antenatal and postnatal education on realistic breast-feeding expectations and the amount of breastmilk required by babies is also important. More research is needed to test professional and peer-support breast-feeding interventions, to provide guidance to policy makers on the most effective breast-feeding support strategies.

    Robust Logistic Principal Component Regression for classification of data in presence of outliers

    The Logistic Principal Component Regression (LPCR) has found many applications in the classification of high-dimensional data, such as tumor classification using microarray data. However, when the measurements are contaminated and/or the observations are mislabeled, the performance of LPCR is significantly degraded. In this paper, we propose a new robust LPCR based on M-estimation, which constitutes a versatile framework for reducing the sensitivity of the estimators to outliers. In particular, robust detection rules are used first to remove the contaminated measurements, and a modified Huber function is then used to further remove the contributions of the mislabeled observations. Experimental results show that the proposed method generally outperforms the conventional LPCR in the presence of outliers, while maintaining performance comparable to that obtained under normal conditions. © 2012 IEEE.
    The 2012 IEEE International Symposium on Circuits and Systems (ISCAS), Seoul, Korea, 20-23 May 2012. In IEEE International Symposium on Circuits and Systems Proceedings, 2012, p. 2809-281
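    As an illustration of the M-estimation idea in this abstract, the sketch below fits a logistic regression on principal component scores while damping each sample's gradient contribution with a Huber-type weight. This is an assumed simplification, not the paper's modified Huber function or its detection rules; all names are illustrative.

```python
import numpy as np

def huber_weight(r, c=1.345):
    """Huber weight: 1 inside the threshold c, c/|r| outside."""
    a = np.abs(r)
    return np.where(a <= c, 1.0, c / np.maximum(a, 1e-12))

def robust_lpcr(X, y, n_components=5, c=1.345, n_iter=500, lr=0.1):
    """Logistic regression on PCA scores with Huber-damped gradients.

    Illustrative stand-in for the paper's method: samples whose working
    residual is large (likely mislabeled) contribute less to each step.
    """
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    Z = Xc @ Vt[:n_components].T                 # principal component scores
    Z = np.hstack([np.ones((len(Z), 1)), Z])     # add intercept column
    w = np.zeros(Z.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Z @ w))         # predicted probabilities
        resid = y - p                            # working residuals
        scale = np.std(resid) + 1e-12
        wts = huber_weight(resid / scale, c)     # downweight outliers
        w += lr * Z.T @ (wts * resid) / len(y)   # weighted gradient ascent
    return w, Vt[:n_components]
```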

    Robust recursive eigendecomposition and subspace-based algorithms with application to fault detection in wireless sensor networks

    The principal component analysis (PCA) is a valuable tool in multivariate statistics, and it is an effective method for fault detection in wireless sensor networks (WSNs) and other related applications. However, its online implementation requires the computation of the eigendecomposition (ED) or singular value decomposition. To reduce the arithmetic complexity, we propose an efficient fault detection approach using the subspace tracking concept. In particular, two new robust subspace tracking algorithms are developed, namely, the robust orthonormal projection approximation subspace tracking (OPAST) with rank-1 modification and the robust OPAST with deflation. Both methods rely on a robust M-estimate-based recursive covariance estimate to improve robustness against the effect of faulty samples, and they offer different tradeoffs between fault detection accuracy and arithmetic complexity. Since only the ED of the major subspace is computed, their arithmetic complexities are much lower than those of other conventional PCA-based algorithms. Furthermore, we propose new robust T² score and SPE detection criteria with recursive update formulas to improve the robustness over their conventional counterparts and to facilitate online implementation of the proposed robust subspace ED and tracking algorithms. Computer simulation and experimental results on WSN data show that the proposed fault detection approach, which combines the aforementioned robust subspace tracking algorithms with the robust detection criteria, achieves better performance than other conventional approaches. Hence, it serves as an attractive alternative for fault detection in WSNs and other related applications because of its low complexity, efficient recursive implementation, and good performance. © 2012 IEEE.
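    The following sketch conveys the flavour of the approach: a recursively updated covariance estimate whose major-subspace ED yields T² and SPE fault scores, with an M-estimate weight shrinking the influence of suspect samples. It is much simpler than, and only stands in for, the paper's OPAST-based trackers; the function and its parameters are assumptions for illustration.

```python
import numpy as np

def robust_fault_scores(X, n_pc=2, beta=0.99, c=2.5):
    """Return (T2, SPE) fault scores for each (zero-mean) sample row.

    Illustrative stand-in for the paper's OPAST-based trackers: the
    covariance is tracked recursively, each sample weighted by an
    M-estimate factor so that suspect (high-T2) samples barely move it.
    """
    X = np.asarray(X, dtype=float)
    C = np.eye(X.shape[1])                    # running covariance estimate
    scores = []
    for x in X:
        evals, evecs = np.linalg.eigh(C)      # ED of current estimate
        idx = np.argsort(evals)[::-1][:n_pc]
        P, lam = evecs[:, idx], np.maximum(evals[idx], 1e-12)
        t = P.T @ x                           # scores in the major subspace
        t2 = float(np.sum(t**2 / lam))        # Hotelling T^2 score
        spe = float(np.sum((x - P @ t)**2))   # squared prediction error
        wt = 1.0 if t2 <= c**2 else c**2 / t2 # M-estimate downweighting
        C = beta * C + (1.0 - beta) * wt * np.outer(x, x)
        scores.append((t2, spe))
    return scores

demo = robust_fault_scores(np.random.randn(200, 4))  # synthetic sensor data
```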

    A New Method for Preliminary Identification of Gene Regulatory Networks from Gene Microarray Cancer Data Using Ridge Partial Least Squares with Recursive Feature Elimination and Novel Brier and Occurrence Probability Measures


    A new recursive dynamic factor analysis for point and interval forecast of electricity price


    Ensemble approaches for uncertainty in spoken language assessment

    Deep learning has dramatically improved the performance of automated systems on a range of tasks, including spoken language assessment. One issue with these deep learning approaches is that they tend to be overconfident in the decisions they make, with potentially serious implications for the deployment of systems for high-stakes examinations. This paper examines the use of ensemble approaches to improve both the reliability of the scores that are generated and the ability to detect where the system has made predictions beyond acceptable errors. In this work, assessment is treated as a regression problem, with deep density networks, and ensembles of these models, used as the predictive models. Given an ensemble of models, measures of uncertainty, for example the variance of the predicted distributions, can be obtained and used for detecting outlier predictions. However, these ensemble approaches increase the computational and memory requirements of the system. To address this problem, the ensemble is distilled into a single mixture density network. The performance of the systems is evaluated on a free-speaking, prompt-response style spoken language assessment test. Experiments show that the ensembles and the distilled model yield performance gains over a single model, and have the ability to detect outliers.
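    The ensemble uncertainty measures mentioned above are commonly obtained from a variance decomposition over the members' predicted Gaussians. The sketch below shows that standard decomposition (ensemble mean, data uncertainty, knowledge uncertainty), assuming each member outputs a mean and a variance; it is illustrative and not the paper's exact criterion, and the example data are random.

```python
import numpy as np

def ensemble_uncertainty(means, variances):
    """Combine per-member Gaussian predictions (rows = ensemble members).

    Returns the ensemble score, the total predictive variance, and the
    knowledge (disagreement) component used to flag outlier inputs.
    """
    means, variances = np.asarray(means), np.asarray(variances)
    mu = means.mean(axis=0)              # ensemble score per input
    data_unc = variances.mean(axis=0)    # average predicted (data) variance
    knowledge_unc = means.var(axis=0)    # spread of member means
    return mu, data_unc + knowledge_unc, knowledge_unc

# illustrative only: 5 members scoring 3 responses with random numbers
mu, total_var, k_unc = ensemble_uncertainty(
    means=np.random.randn(5, 3), variances=0.2 * np.ones((5, 3)))
outliers = k_unc > np.quantile(k_unc, 0.9)  # flag high-disagreement inputs
```

    Distilling the ensemble into a single mixture density network, as the paper does, aims to preserve this decomposition while removing the cost of running every member at test time.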