31 research outputs found
Risk averse reproduction numbers improve resurgence detection
The effective reproduction number R is a prominent statistic for inferring the transmissibility of infectious diseases and the effectiveness of interventions. R purportedly provides an easy-to-interpret threshold for deducing whether an epidemic will grow (R>1) or decline (R<1). We develop a risk averse reproduction number, E, for settings where infections are aggregated over groups with heterogeneous transmission dynamics. An E>1 generates timely resurgence signals (upweighting risky groups), while an E<1 ensures local outbreaks are under control. We propose E as an alternative to R for informing policy and assessing transmissibility at large scales (e.g., state-wide or nationally), where R is commonly computed but well-mixed or homogeneity assumptions break down
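The averaging problem described above can be illustrated with a toy calculation. The weighting below is a hypothetical exponential tilt toward higher-R groups, a stand-in for the paper's E-optimal-design weights, which are not given here: a small resurgent group is averaged away by the standard infection-weighted R but flagged by the risk-tilted aggregate.

```python
import numpy as np

def standard_R(group_R, group_infections):
    """Standard aggregate R: groups implicitly weighted by their
    share of circulating infections."""
    w = group_infections / group_infections.sum()
    return float(w @ group_R)

def risk_averse_E(group_R, group_infections, risk_bias=2.0):
    """Hypothetical risk-averse aggregate: infection weights tilted
    exponentially toward higher-R groups. Illustrative only; the paper
    derives its weights from E-optimal experimental design theory."""
    w = group_infections / group_infections.sum()
    w_tilt = w * np.exp(risk_bias * group_R)
    w_tilt /= w_tilt.sum()
    return float(w_tilt @ group_R)

# A small resurgent group (R = 2) hidden inside a large controlled one (R = 0.6)
group_R = np.array([0.6, 2.0])
group_infections = np.array([95.0, 5.0])
R_agg = standard_R(group_R, group_infections)    # below 1: looks under control
E_agg = risk_averse_E(group_R, group_infections) # above 1: resurgence flagged
```

With these numbers the standard aggregate is 0.67, signalling control, while the tilted aggregate exceeds 1 because the risky group is upweighted.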
Are skyline plot-based demographic estimates overly dependent on smoothing prior assumptions?
In Bayesian phylogenetics, the coalescent process provides an informative framework for inferring changes in the effective size of a population from a phylogeny (or tree) of sequences sampled from that population. Popular coalescent inference approaches such as the Bayesian Skyline Plot, Skyride and Skygrid all model these population size changes with a discontinuous, piecewise-constant function but then apply a smoothing prior to ensure that their posterior population size estimates transition gradually with time. These prior distributions implicitly encode extra population size information that is not available from the observed coalescent data, i.e., the tree. Here we present a novel statistic, Ω, to quantify and disaggregate the relative contributions of the coalescent data and prior assumptions to the resulting posterior estimate precision. Our statistic also measures the additional mutual information introduced by such priors. Using Ω we show that, because it is surprisingly easy to over-parametrise piecewise-constant population models, common smoothing priors can lead to overconfident and potentially misleading inference, even under robust experimental designs. We propose Ω as a useful tool for detecting when effective population size estimates are overly reliant on prior assumptions and for improving quantification of the uncertainty in those estimates
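The idea of disaggregating data and prior contributions to posterior precision has a simple conjugate-Gaussian analogue, sketched below. This is not the paper's Ω (which is defined for coalescent models); it only illustrates the decomposition: when few events inform a parameter, the data precision is weak and the posterior leans heavily on the smoothing prior.

```python
def prior_reliance(prior_precision, data_precision):
    """Fraction of posterior precision contributed by the prior rather
    than the data, in a conjugate-Gaussian model where precisions add:
    posterior precision = prior precision + data precision."""
    return prior_precision / (prior_precision + data_precision)

# Sparse design: few events per parameter -> posterior mostly prior-driven
sparse_design = prior_reliance(prior_precision=4.0, data_precision=1.0)
# Rich design: many events -> posterior dominated by the data
rich_design = prior_reliance(prior_precision=4.0, data_precision=36.0)
```

A reliance near 1 in the sparse design is the Gaussian analogue of the over-parametrisation warning the abstract describes: the apparent estimate precision is mostly an artefact of the prior.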
Point Process Analysis of Noise in Early Invertebrate Vision
Noise is a prevalent and sometimes even dominant aspect of many biological processes. While many natural systems have adapted to attenuate or even usefully integrate noise, the variability it introduces often still delimits the achievable precision across biological functions. This is particularly so for visual phototransduction, the process responsible for converting photons of light into usable electrical signals (quantum bumps). Here, randomness of both the photon inputs (regarded as extrinsic noise) and the conversion process (intrinsic noise) are seen as two distinct, independent and significant limitations on visual reliability. Past research has attempted to quantify the relative effects of these noise sources by using approximate methods that do not fully account for the discrete, point process and time ordered nature of the problem. As a result the conclusions drawn from these different approaches have led to inconsistent expositions of phototransduction noise performance. This paper provides a fresh and complete analysis of the relative impact of intrinsic and extrinsic noise in invertebrate phototransduction using minimum mean squared error reconstruction techniques based on Bayesian point process (Snyder) filters. An integrate-fire based algorithm is developed to reliably estimate photon times from quantum bumps and Snyder filters are then used to causally estimate random light intensities both at the front and back end of the phototransduction cascade. Comparison of these estimates reveals that the dominant noise source transitions from extrinsic to intrinsic as light intensity increases. By extending the filtering techniques to account for delays, it is further found that among the intrinsic noise components, which include bump latency (mean delay and jitter) and shape (amplitude and width) variance, it is the mean delay that is critical to noise performance. 
Consequently, if one wants to increase visual fidelity, reducing the photoconversion lag is much more important than improving the regularity of the electrical signal. This work was supported by the Gates Cambridge Trust (PhD studentship for research) https://www.gatescambridge.org/. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript
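The photon-time estimation step described above can be caricatured with a minimal integrate-and-fire detector. This is only a sketch of the general idea, not the paper's algorithm: the bump signal is accumulated over time and each threshold crossing is read out as one photon event.

```python
import numpy as np

def integrate_fire_events(signal, dt, threshold):
    """Toy integrate-and-fire detector: accumulate the non-negative
    signal; each threshold crossing is recorded as one event time and
    the integrator resets."""
    acc, events = 0.0, []
    for i, s in enumerate(signal):
        acc += max(float(s), 0.0) * dt
        if acc >= threshold:
            events.append(i * dt)
            acc = 0.0
    return events

# Two synthetic unit-area quantum bumps near t = 0.1 s and t = 1.0 s
dt = 0.001
signal = np.zeros(2000)
signal[100:110] = 100.0   # bump area = 1.0
signal[1000:1010] = 100.0
events = integrate_fire_events(signal, dt, threshold=0.9)
```

Setting the threshold just below the unit bump area makes the detector robust to small amplitude variance, one of the intrinsic noise components the abstract lists.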
Implementation of genomic surveillance of SARS-CoV-2 in the Caribbean: Lessons learned for sustainability in resource-limited settings
The COVID-19 pandemic highlighted the importance of global genomic surveillance to monitor the emergence and spread of SARS-CoV-2 variants and inform public health decision-making. Until December 2020 there was minimal capacity for viral genomic surveillance in most Caribbean countries. To overcome this constraint, the COVID-19: Infectious disease Molecular epidemiology for PAthogen Control & Tracking (COVID-19 IMPACT) project was implemented to establish rapid SARS-CoV-2 whole genome nanopore sequencing at The University of the West Indies (UWI) in Trinidad and Tobago (T&T) and provide needed SARS-CoV-2 sequencing services for T&T and other Caribbean Public Health Agency Member States (CMS). Using the Oxford Nanopore Technologies MinION sequencing platform and ARTIC network sequencing protocols and bioinformatics pipeline, a total of 3610 SARS-CoV-2 positive RNA samples, received from 17 CMS, were sequenced in-situ during the period December 5th 2020 to December 31st 2021. Ninety-one Pango lineages, including those of five variants of concern (VOC), were identified. Genetic analysis revealed at least 260 introductions to the CMS from other global regions. For each of the 17 CMS, the percentage of reported COVID-19 cases sequenced by the COVID-19 IMPACT laboratory ranged from 0·02% to 3·80% (median = 1·12%). Sequences submitted to GISAID by our study represented 73·3% of all SARS-CoV-2 sequences from the 17 CMS available on the database up to December 31st 2021. Increased staffing, process and infrastructural improvement over the course of the project helped reduce turnaround times for reporting to originating institutions and sequence uploads to GISAID. Insights from our genomic surveillance network in the Caribbean region directly influenced non-pharmaceutical countermeasures in the CMS countries. 
However, limited availability of associated surveillance and clinical data made it challenging to contextualise the observed SARS-CoV-2 diversity and evolution, highlighting the need for development of infrastructure for collecting and integrating genomic sequencing data and sample-associated metadata
Key questions for modelling COVID-19 exit strategies
This is the final version, available on open access from the Royal Society via the DOI in this record. Combinations of intense non-pharmaceutical interventions ('lockdowns') were introduced in countries worldwide to reduce SARS-CoV-2 transmission. Many governments have begun to implement lockdown exit strategies that allow restrictions to be relaxed while attempting to control the risk of a surge in cases. Mathematical modelling has played a central role in guiding interventions, but the challenge of designing optimal exit strategies in the face of ongoing transmission is unprecedented. Here, we report discussions from the Isaac Newton Institute 'Models for an exit strategy' workshop (11-15 May 2020). A diverse community of modellers who are providing evidence to governments worldwide were asked to identify the main questions that, if answered, will allow for more accurate predictions of the effects of different exit strategies. Based on these questions, we propose a roadmap to facilitate the development of reliable models to guide exit strategies. The roadmap requires a global collaborative effort from the scientific community and policy-makers, and is made up of three parts: i) improve estimation of key epidemiological parameters; ii) understand sources of heterogeneity in populations; iii) focus on requirements for data collection, particularly in Low-to-Middle-Income countries. This will provide important information for planning exit strategies that balance socio-economic benefits with public health. Alan Turing Institute; EPSR
Improved estimation of time-varying reproduction numbers at low case incidence and between epidemic waves
We construct a recursive Bayesian smoother, termed EpiFilter, for estimating the effective reproduction number, R, from the incidence of an infectious disease in real time and retrospectively. Our approach borrows from Kalman filtering theory, is quick and easy to compute, generalisable and deterministic, and, unlike many current methods, requires no change-point or window size assumptions. We model R as a flexible, hidden Markov state process and exactly solve forward-backward algorithms to derive R estimates that incorporate all available incidence information. This unifies and extends two popular methods: EpiEstim, which considers past incidence, and the Wallinga-Teunis method, which looks forward in time. We find that this combination of maximising information and minimising assumptions significantly reduces the bias and variance of R estimates. Moreover, these properties make EpiFilter more statistically robust in periods of low incidence, where several existing methods can become destabilised. As a result, EpiFilter offers improved inference of time-varying transmission patterns that are advantageous for assessing the risk of upcoming waves of infection or the influence of interventions, in real time and at various spatial scales
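The hidden-Markov treatment of R described above can be sketched as a grid-based forward-backward recursion. The grid, the random-walk step size eta and the Poisson renewal likelihood below are illustrative choices, not the published EpiFilter settings.

```python
import numpy as np

def smooth_R(incidence, lam, R_grid, eta=0.1):
    """Forward-backward smoother for R on a grid, in the spirit of
    EpiFilter (a sketch, not the published implementation). R follows a
    Gaussian random walk over R_grid; incidence I_t is Poisson with
    mean R_t * lam_t under the renewal model."""
    I = np.asarray(incidence, dtype=float)
    T, m = len(I), len(R_grid)
    # Random-walk transition matrix over the R grid
    step = R_grid[None, :] - R_grid[:, None]
    P = np.exp(-0.5 * (step / eta) ** 2)
    P /= P.sum(axis=1, keepdims=True)
    # Poisson log-likelihoods (constants cancel after grid normalisation)
    mean = np.outer(lam, R_grid)
    logL = I[:, None] * np.log(mean + 1e-300) - mean
    L = np.exp(logL - logL.max(axis=1, keepdims=True))
    # Forward (filtering) pass: uses only past and present incidence
    alpha = np.full(m, 1.0 / m)
    alphas = np.empty((T, m))
    for t in range(T):
        alpha = (alpha @ P) * L[t]
        alpha /= alpha.sum()
        alphas[t] = alpha
    # Backward pass: smoothed posterior incorporates future incidence too
    beta = np.ones(m)
    smooth = alphas.copy()
    for t in range(T - 2, -1, -1):
        beta = P @ (L[t + 1] * beta)
        beta /= beta.sum()
        smooth[t] = alphas[t] * beta
        smooth[t] /= smooth[t].sum()
    return smooth

# Synthetic example: constant R = 2 with total infectiousness lam_t = 10
R_grid = np.linspace(0.01, 5.0, 500)
lam = np.full(20, 10.0)
incidence = np.full(20, 20)  # incidence at its mean under R = 2
post = smooth_R(incidence, lam, R_grid)
R_mean = float(post[-1] @ R_grid)
```

The forward pass alone corresponds to filtering (past incidence only); combining it with the backward pass yields the smoothed estimates that use all available incidence, which is the unification of the EpiEstim and Wallinga-Teunis perspectives the abstract describes.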
On signalling and estimation limits for molecular birth-processes
Understanding and uncovering the mechanisms or motifs that molecular networks employ to regulate noise is a key problem in cell biology. As it is often difficult to obtain direct and detailed insight into these mechanisms, many studies instead focus on assessing the best precision attainable on the signalling pathways that compose these networks. Molecules signal one another over such pathways to solve noise regulating estimation and control problems. Quantifying the maximum precision of these solutions delimits what is achievable and allows hypotheses about underlying motifs to be tested without requiring detailed biological knowledge. The pathway capacity, which defines the maximum rate of transmitting information along it, is a widely used proxy for precision. Here it is shown, for estimation problems involving elementary yet biologically relevant birth-process networks, that capacity can be surprisingly misleading. A time-optimal signalling motif, called birth-following, is derived and proven to better the precision expected from the capacity, provided the maximum signalling rate constraint is large and the mean one is above a certain threshold. When the maximum constraint is relaxed, perfect estimation is predicted by the capacity. However, the true achievable precision is found to be highly variable and sensitive to the mean constraint. Since the same capacity can map to different combinations of rate constraints, it can only equivocally measure precision. Deciphering the rate constraints on a signalling pathway may therefore be more important than computing its capacity
Exact Bayesian inference for phylogenetic birth-death models
Motivation: Inferring the rates of change of a population from a reconstructed phylogeny of genetic sequences is a central problem in macro-evolutionary biology, epidemiology and many other disciplines. A popular solution involves estimating the parameters of a birth-death process (BDP), which links the shape of the phylogeny to its birth and death rates. Modern BDP estimators rely on random Markov chain Monte Carlo (MCMC) sampling to infer these rates. Such methods, while powerful and scalable, cannot be guaranteed to converge, leading to results that may be hard to replicate or difficult to validate. Results: We present a conceptually and computationally different parametric BDP inference approach using flexible and easy to implement Snyder filter (SF) algorithms. This method is deterministic so its results are provable, guaranteed and reproducible. We validate the SF on constant rate BDPs and find that it solves BDP likelihoods known to produce robust estimates. We then examine more complex BDPs with time-varying rates. Our estimates compare well with a recently developed parametric MCMC inference method. Lastly, we perform model selection on an empirical Agamid species phylogeny, obtaining results consistent with the literature. The SF makes no approximations beyond those required for parameter quantisation and numerical integration, and directly computes the posterior distribution of model parameters. It is a promising alternative inference algorithm that may serve either as a standalone Bayesian estimator or as a useful diagnostic reference for validating more involved MCMC strategies. Availability and implementation: The Snyder filter is implemented in Matlab and the time-varying BDP models are simulated in R. The source code and data are freely available at https://github.com/kpzoo/snyder-birth-death-code
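The quantised-posterior idea behind the Snyder filter can be sketched for the simplest case, a Yule (pure birth) process with constant rate, where with k lineages the wait until the next branching is Exponential(k*lam). The grid update below is a deterministic illustration of that idea, not the paper's full implementation.

```python
import numpy as np

def snyder_yule_posterior(wait_times, lineage_counts, lam_grid):
    """Deterministic grid posterior for the birth rate of a Yule (pure
    birth) process, in the spirit of the Snyder filter's quantised
    parameter posterior (a sketch, not the paper's implementation)."""
    post = np.full(len(lam_grid), 1.0 / len(lam_grid))  # uniform quantised prior
    for t, k in zip(wait_times, lineage_counts):
        # Exponential(k * lam) likelihood of the observed branching wait
        post *= k * lam_grid * np.exp(-k * lam_grid * t)
        post /= post.sum()  # renormalise after each event
    return post

# Noise-free check: each wait set to its mean 1/(k * lam) with lam = 1
lineage_counts = list(range(1, 31))
wait_times = [1.0 / k for k in lineage_counts]
lam_grid = np.linspace(0.05, 5.0, 400)
post = snyder_yule_posterior(wait_times, lineage_counts, lam_grid)
lam_mean = float(post @ lam_grid)
lam_mode = float(lam_grid[int(np.argmax(post))])
```

Because every step is a deterministic multiply-and-normalise over the quantised grid, rerunning the computation always reproduces the same posterior, which is the reproducibility contrast with MCMC drawn in the abstract.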