Transmission parameters of the 2001 foot and mouth epidemic in Great Britain.
Despite intensive ongoing research, key aspects of the spatial-temporal evolution of the 2001 foot and mouth disease (FMD) epidemic in Great Britain (GB) remain unexplained. Here we develop a Markov Chain Monte Carlo (MCMC) method for estimating epidemiological parameters of the 2001 outbreak for a range of simple transmission models. We make the simplifying assumption that infectious farms were completely observed in 2001, equivalent to assuming that farms that were proactively culled but not diagnosed with FMD were not infectious, even if some were infected. We estimate how transmission parameters varied through time, highlighting the impact of the control measures on the progression of the epidemic. We demonstrate statistically significant evidence for assortative contact patterns between animals of the same species. Predictive risk maps of the transmission potential in different geographic areas of GB are presented for the fitted models.
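The MCMC machinery can be illustrated with a deliberately simplified, hypothetical example: a random-walk Metropolis-Hastings sampler for a single transmission-rate parameter, assuming exponentially distributed inter-event times rather than the spatial farm-to-farm models fitted in the paper.

```python
import math
import random

random.seed(42)

# Hypothetical toy model: inter-infection times are exponential with
# rate beta (a stand-in for the far richer transmission models in the
# paper). We sample the posterior of beta with random-walk Metropolis.
TRUE_BETA = 0.5
data = [random.expovariate(TRUE_BETA) for _ in range(200)]

def log_post(beta):
    """Log-posterior: exponential likelihood plus a flat prior on beta > 0."""
    if beta <= 0:
        return -math.inf
    return len(data) * math.log(beta) - beta * sum(data)

samples = []
beta = 1.0                                   # arbitrary starting value
lp = log_post(beta)
for step in range(5000):
    prop = beta + random.gauss(0.0, 0.1)     # random-walk proposal
    lp_prop = log_post(prop)
    if math.log(random.random()) < lp_prop - lp:   # Metropolis accept/reject
        beta, lp = prop, lp_prop
    if step >= 1000:                         # discard burn-in
        samples.append(beta)

post_mean = sum(samples) / len(samples)
```

The posterior mean concentrates near the rate that generated the data; time-varying parameters, as in the paper, would add one such parameter per period.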
Effect of admission fascia iliaca compartment blocks on post-operative abbreviated mental test scores in elderly fractured neck of femur patients: a retrospective cohort study
Background:
Post-operative cognitive impairment is common in elderly patients following surgery for hip fracture, with undertreated pain being an important etiological factor. Non-opioid based analgesic techniques, such as nerve blocks, may help reduce the risk of cognitive complications. The aim of this study was to investigate whether receiving a fascia iliaca compartment block (FICB) as part of a pre-operative analgesic regime increased the odds of high post-operative abbreviated mental test scores (AMTS) when compared with conventional analgesia without a nerve block.
Methods:
A retrospective data analysis of a cohort of 959 patients, aged ≥65 years with a diagnosis of hip fracture and admitted to a single hospital over a two-year period, was performed. A standardized analgesic regime was used on all patients, and 541/959 (56.4%) of included patients received a FICB. Provision of the FICB was primarily determined by availability of an anesthetist, rather than by patient status and condition. Post-operative cognitive ordinal outcomes were defined by AMTS severity as high (score of ≥9/10), moderate (score of 7–8) and low (score of ≤6). A multivariable ordinal logistic regression analysis was performed on patient status and clinical care factors, including admission AMTS, age, gender, source of admission, time to surgery, type of anesthesia and ASA score.
Results:
Admission FICB was associated with higher adjusted odds for a high AMTS (score of ≥9) relative to lower AMTS (score of ≤8) than conventional analgesia only (OR = 1.80, 95% CI 1.27–2.54; p = 0.001). Increasing age, lower AMTS on admission to hospital, and being admitted from a residential or nursing home were associated with worse cognitive outcomes. Mode of anesthesia or surgery did not significantly influence post-operative AMTS.
Conclusion:
Post-operative AMTS is influenced by pre-operative analgesic regimes in elderly patients with hip fracture. Provision of a FICB to patients on arrival to hospital may improve early post-operative cognitive performance in this population.
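As a hedged illustration of how a proportional-odds (cumulative-logit) model like the one reported works, the sketch below converts the adjusted OR of 1.80 into category probabilities; the cut-point intercepts are invented, since the abstract does not report them.

```python
import math

def inv_logit(x):
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical cut-point intercepts for the three ordered AMTS categories
# (low <=6, moderate 7-8, high >=9). The fitted values are not reported
# in the abstract, so these numbers are made up for illustration only.
alpha_low, alpha_mod = -2.0, -0.5
log_or_ficb = math.log(1.80)  # reported adjusted OR for FICB vs conventional analgesia

def category_probs(ficb):
    """P(low), P(moderate), P(high) under a proportional-odds model.

    Cumulative logit: P(Y <= k) = inv_logit(alpha_k - x * beta), so a
    positive beta shifts probability mass toward the high-AMTS category."""
    x = 1.0 if ficb else 0.0
    p_le_low = inv_logit(alpha_low - x * log_or_ficb)
    p_le_mod = inv_logit(alpha_mod - x * log_or_ficb)
    return (p_le_low, p_le_mod - p_le_low, 1.0 - p_le_mod)

p_none = category_probs(False)
p_ficb = category_probs(True)
```

By construction the odds of a high AMTS under FICB divided by the odds without it recover exactly the OR of 1.80, whatever the (hypothetical) intercepts.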
Improving analysis practice of continuous adverse event outcomes in randomised controlled trials – a distributional approach
Background Randomised controlled trials (RCTs) provide valuable information for developing harm profiles but current analysis practices to detect between-group differences are suboptimal. Drug trials routinely screen continuous clinical and biological data to monitor participant harm. These outcomes are regularly dichotomised into abnormal/normal values for analysis. Despite the simplicity gained for clinical interpretation, it is well established that dichotomising outcomes results in a considerable reduction in information and thus statistical power. We propose an automated procedure for the routine implementation of the distributional method for the dichotomisation of continuous outcomes proposed by Peacock and Sauzet, which retains the precision of the comparison of means. Methods We explored the use of a distributional approach to compare differences in proportions based on the comparison of means, which retains the power of the latter. We applied this approach to the screening of clinical and biological data as a means to detect 'signals' for potential adverse drug reactions (ADRs). Signals can then be followed up in further confirmatory studies. Three distributional methods suitable for different types of distributions are described. We propose the use of an automated approach using the observed data to select the most appropriate distribution as an analysis strategy in an RCT setting for multiple continuous outcomes. We illustrate this approach using data from three RCTs assessing the efficacy of mepolizumab in asthma or COPD. Published reference ranges were used to define the proportions of participants with abnormal values for a subset of 10 blood tests. The between-group distributional and empirical differences in proportions were estimated for each blood test and compared.
Results Within trials, the distributions varied across the 10 outcomes, demonstrating value in a practical approach to selecting the distributional method in the context of multiple adverse event outcomes. Across trials, there were three outcomes where the method chosen by the automated procedure varied for the same outcome. The distributional approach identified three signals (eosinophils, haematocrit, and haemoglobin) compared to only one when using Fisher's exact test (eosinophils) and two identified by use of the 95% confidence interval for the difference in proportions (eosinophils and potassium). Conclusion When dichotomisation of continuous adverse event outcomes aids clinical interpretation, we advocate use of a distributional approach to retain statistical power. Methods are now easy to implement. Retaining information is especially valuable in the context of the analysis of adverse events in RCTs. The routine implementation of this automated approach requires further evaluation.
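A minimal sketch of the normal-distribution variant of the distributional approach: the proportion beyond a reference-range cutoff is estimated from each arm's mean and SD rather than by counting abnormal values. The summary statistics and cutoff below are hypothetical, not taken from the mepolizumab trials.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def distributional_prop_above(mean, sd, cutoff):
    """Distributional estimate of the proportion of values above a
    reference-range cutoff, assuming the outcome is normally distributed
    (the simplest of the three distributional methods described)."""
    return 1.0 - norm_cdf((cutoff - mean) / sd)

# Hypothetical blood-test summary statistics (invented, not trial data):
# the treated arm has a lower mean for some eosinophil-like marker.
cutoff = 0.5                      # upper limit of a published reference range
p_control = distributional_prop_above(0.45, 0.15, cutoff)
p_treated = distributional_prop_above(0.30, 0.15, cutoff)
diff = p_control - p_treated      # distributional difference in proportions
```

Because the estimate is driven by the means and SDs, it inherits the precision of the comparison of means instead of the power loss of a raw abnormal/normal count.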
HIV with contact-tracing: a case study in Approximate Bayesian Computation
Missing data is a recurrent issue in epidemiology, where the infection process may be only partially observed. Approximate Bayesian Computation (ABC), an alternative to data imputation methods such as Markov Chain Monte Carlo integration, is proposed for making inference in epidemiological models. It is a likelihood-free method that relies exclusively on numerical simulations. ABC consists of computing a distance between simulated and observed summary statistics and weighting the simulations according to this distance. We propose an original extension of ABC to path-valued summary statistics, corresponding to the cumulative number of detections as a function of time. For a standard compartmental model with Susceptible, Infectious and Recovered individuals (SIR), we show that the posterior distributions obtained with ABC and MCMC are similar. In a refined SIR model well-suited to the HIV contact-tracing data in Cuba, we perform a comparison between ABC with full and binned detection times. For the Cuban data, we evaluate the efficiency of the detection system and predict the evolution of the HIV/AIDS epidemic. In particular, the percentage of undetected infectious individuals is found to be of the order of 40%.
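The accept/reject core of ABC can be sketched as follows, using a scalar summary statistic (a total detection count) rather than the path-valued summaries developed in the paper; the rates, prior and tolerance are all illustrative.

```python
import random

random.seed(1)

# Toy ABC rejection sampler: infer a detection rate lam from an observed
# cumulative count, using |simulated - observed| as the distance.
N_DAYS = 30
TRUE_LAM = 2.0

def simulate_total(lam):
    """Total detections over N_DAYS: one Poisson(lam) count per day,
    drawn by accumulating an exponential clock (stdlib-only)."""
    total = 0
    for _ in range(N_DAYS):
        t = 0.0
        while True:
            t += random.expovariate(lam)
            if t > 1.0:
                break
            total += 1
    return total

observed = simulate_total(TRUE_LAM)   # pseudo-observed data

accepted = []
for _ in range(10000):
    lam = random.uniform(0.0, 5.0)                 # draw from the prior
    if abs(simulate_total(lam) - observed) <= 3:   # tolerance epsilon = 3
        accepted.append(lam)

post_mean = sum(accepted) / len(accepted)
```

Accepted draws approximate the posterior without ever evaluating a likelihood; the paper's extension replaces the scalar distance with a distance between whole detection trajectories.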
Clinical implications of Plasmodium resistance to atovaquone/proguanil: a systematic review and meta-analysis.
Background: Atovaquone/proguanil, registered as Malarone®, is a fixed-dose combination recommended for first-line treatment of uncomplicated Plasmodium falciparum malaria in non-endemic countries and its prevention in travellers. Mutations in the cytochrome bc1 complex are causally associated with atovaquone resistance. Methods: This systematic review assesses the clinical efficacy of atovaquone/proguanil treatment of uncomplicated malaria and examines the extent to which the codon 268 mutation in cytochrome b influences treatment failure and recrudescence, based on published information. Results: Data suggest that atovaquone/proguanil treatment efficacy is 89%-98% for P. falciparum malaria (from 27 studies including between 18 and 253 patients in each case) and 20%-26% for Plasmodium vivax malaria (from 1 study including 25 patients). The in vitro P. falciparum phenotype of atovaquone resistance is an IC50 value >28 nM. Case report analyses predict that recrudescence in a patient presenting with parasites carrying the cytochrome b codon 268 mutation will occur on average at day 29 (95% CI: 22, 35), 19 (95% CI: 7, 30) days longer than if the mutation is absent. Conclusions: Evidence suggests atovaquone/proguanil treatment for P. falciparum malaria is effective. Late treatment failure is likely to be associated with a codon 268 mutation in cytochrome b, though recent evidence from animal models suggests these mutations may not spread within the population. However, early treatment failure is likely to arise through alternative mechanisms, requiring further investigation.
Predictors of mortality in primary antiphospholipid syndrome. A single-centre cohort study.
The vascular mortality of antiphospholipid syndrome (APS) ranges from 1.4% to 5.5%, but its predictors are poorly known. The study objective was to evaluate the impact on mortality in primary APS (PAPS) of baseline lupus anticoagulant assays, IgG anticardiolipin (aCL), plasma fibrinogen (FNG) and von Willebrand factor (VWF), platelets (PLT), and of genetic polymorphisms of methylenetetrahydrofolate reductase C677T, of prothrombin G20210A and of paraoxonase-1 Q192R. This was a cohort study of 77 thrombotic PAPS patients and 33 asymptomatic carriers of aPL (PCaPL) seen from 1989 to 2015 and persistently positive for aPL as per annual review. At baseline all participants were tested twice for the ratios of kaolin clotting time (KCTr), activated partial thromboplastin time (aPTTr), dilute Russell viper venom time (DRVVTr), IgG aCL, FNG and VWF, and once for PLT. All thrombotic PAPS patients were on warfarin with regular INR monitoring. During follow-up, 11 PAPS patients died (D-PAPS) of recurrent thrombosis despite adequate anticoagulation, yielding an overall vascular mortality of 10%. D-PAPS had the strongest baseline aPTTr and DRVVTr and the highest mean baseline IgG aCL, FNG, VWF and PLT. A Cox proportional hazards model identified baseline DRVVTr and FNG as the main predictors of mortality, with adjusted hazard ratios of 5.75 (95% confidence interval [CI]: 1.5, 22.4) and 1.03 (95% CI: 1.01, 1.04), respectively. In conclusion, plasma DRVVTr and FNG are strong predictors of vascular mortality in PAPS; while FNG-lowering agents exist, further research should be directed at therapeutic strategies able to dampen aPL production.
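For a single covariate, a Cox proportional hazards fit reduces to maximising the partial likelihood; the sketch below does this by grid search on invented survival data (no censoring or ties), not the study's dataset.

```python
import math

# Invented data: event times, and a binary covariate x (e.g. a high vs
# low baseline DRVVTr-like marker). All subjects experience the event.
times = [2, 5, 7, 9, 12, 15, 20, 25]
x =     [1, 1, 0, 1, 0, 1, 0, 0]

def partial_loglik(beta):
    """Cox partial log-likelihood: for each event i, add
    x_i * beta - log( sum over the risk set of exp(x_j * beta) ),
    where the risk set is everyone with event time >= t_i."""
    ll = 0.0
    for i, t in enumerate(times):
        risk = [math.exp(x[j] * beta) for j in range(len(times)) if times[j] >= t]
        ll += x[i] * beta - math.log(sum(risk))
    return ll

# Grid search for the maximum partial-likelihood estimate of beta.
betas = [b / 100.0 for b in range(-300, 301)]
beta_hat = max(betas, key=partial_loglik)
hazard_ratio = math.exp(beta_hat)
```

Here the exposed subjects fail earlier, so the estimated hazard ratio exceeds 1; a real analysis would use a dedicated solver, handle ties and censoring, and adjust for covariates as in the study.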
High on-clopidogrel platelet reactivity in ischaemic stroke or transient ischaemic attack: Systematic review and meta-analysis
Objectives
To assess the prevalence of high on-clopidogrel platelet reactivity (HCPR) in patients with ischaemic stroke or transient ischaemic attack (IS/TIA), their outcomes, and the genetic basis of on-treatment response variability in IS/TIA patients.
Methods
We conducted a comprehensive search of PubMed and EMBASE from their inceptions to March 9, 2019. Studies that reported absolute numbers/percentages of HCPR at any time point after IS/TIA onset, evaluated with any type of platelet function test, clinical outcomes and genotyping data were included.
Results
Among 21 studies of 4312 IS/TIA patients treated with clopidogrel, the pooled prevalence of HCPR was 28% (95% CI: 24–32%; high heterogeneity: I2 = 88.2%, p < 0.001). The degree of heterogeneity diminished across groups defined by the HCPR testing method. Clopidogrel non-responder IS/TIA patients had poorer outcomes compared to responders (RR = 2.09, 95% CI: 1.61–2.70; p = 0.036; low heterogeneity across studies: I2 = 27.4%, p = 0.210). IS/TIA patients carrying CYP2C19*2 or CYP2C19*3 loss-of-function alleles had a higher risk of HCPR compared to wild type (RR = 1.69, 95% CI: 1.47–1.95; p < 0.001; I2 = 0.01%, p = 0.475).
Conclusions
This systematic review shows a high prevalence of clopidogrel resistance in IS/TIA and poor outcomes in these patients. CYP2C19 polymorphisms may influence clopidogrel resistance.
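A rough sketch of the pooling step behind prevalence estimates like the 28% figure: fixed-effect inverse-variance pooling on the logit scale with the Q-based I² heterogeneity statistic. The three studies below are invented, not the 21 real ones.

```python
import math

def pooled_logit_prevalence(events, totals):
    """Fixed-effect inverse-variance pooling of proportions on the logit
    scale, returning the pooled proportion and I-squared heterogeneity."""
    logits, weights = [], []
    for e, n in zip(events, totals):
        p = e / n
        logits.append(math.log(p / (1 - p)))
        # Variance of logit(p) is approx 1/e + 1/(n - e); weight = 1/variance.
        weights.append(1.0 / (1.0 / e + 1.0 / (n - e)))
    w_sum = sum(weights)
    pooled_logit = sum(w * l for w, l in zip(weights, logits)) / w_sum
    # Cochran's Q and I-squared measure between-study heterogeneity.
    q = sum(w * (l - pooled_logit) ** 2 for w, l in zip(weights, logits))
    df = len(events) - 1
    i2 = max(0.0, (q - df) / q) if q > 0 else 0.0
    pooled_p = 1.0 / (1.0 + math.exp(-pooled_logit))
    return pooled_p, i2

# Hypothetical study data: events = HCPR cases, totals = patients tested.
p, i2 = pooled_logit_prevalence([30, 50, 12], [100, 200, 60])
```

With the high I² reported in the review, a random-effects model (e.g. DerSimonian-Laird) would typically be preferred over this fixed-effect sketch.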
The dynamics and outcomes of AKI progression during the COVID-19 pandemic.
PURPOSE: Acute kidney injury (AKI) associated with COVID-19 carries a poor prognosis. This study assessed the hitherto uninvestigated impact of COVID-19 on the progression and clinical outcomes of patients with AKI. METHODS: Data from 576 patients with AKI admitted between 13/3/20 and 13/5/20 were studied. Increasingly complex analyses, from logistic regressions to competing-risk and multi-state models, revealed insights into the dynamics of AKI progression associated with PCR-confirmed COVID-19 acquisition and death. Meta-analyses of case-fatality ratios among patients with AKI were also conducted. RESULTS: The overall case-fatality ratio was 0.33 [95% CI (0.20-0.36)]; it was higher in COVID-19 positive (COVID+) patients, 0.52 [95% CI (0.46-0.58)], than in their negative (COVID-) counterparts, 0.16 [95% CI (0.12-0.20)]. In AKI Stage-3 patients, the case-fatality ratio was 0.71 [95% CI (0.64-0.79)] among COVID+ patients, with 45% dead within 14 days, and 0.35 [95% CI (0.25-0.44)] in the COVID- group, with 28% dead within 14 days. Among patients diagnosed with AKI Stage-1 within 24 h, the probability of progression to AKI Stage-3 by day 7 post admission was 0.22 [95% CI (0.17-0.27)] among COVID+ patients, and 0.06 [95% CI (0.03, 0.09)] among those who tested negative. The probability of discharge by day 7 was 0.71 [95% CI (0.66, 0.75)] in COVID- patients, and 0.27 [95% CI (0.21, 0.32)] in COVID+ patients. Among AKI Stage-3 COVID+ patients, that probability was 0.38 [95% CI (0.29, 0.47)] by day 10, with little change by day 14 (0.35 [95% CI (0.25, 0.44)]). CONCLUSION: These results are consistent with either rapid progression in severity, prolonged hospital care, or a high case-fatality ratio among AKI Stage-3 patients, significantly exacerbated by COVID-19 infection.
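The multi-state idea can be sketched as a discrete-time Markov chain over AKI states; the daily transition probabilities below are invented for illustration, whereas the paper's competing-risk and multi-state models are fitted to real data.

```python
# Illustrative discrete-time multi-state model for AKI progression.
# States: 0 = AKI Stage-1, 1 = AKI Stage-3, 2 = discharged, 3 = dead.
# Daily transition probabilities (each row sums to 1) are made up.
P = [
    [0.80, 0.05, 0.13, 0.02],  # from Stage-1
    [0.05, 0.75, 0.10, 0.10],  # from Stage-3
    [0.00, 0.00, 1.00, 0.00],  # discharged (absorbing)
    [0.00, 0.00, 0.00, 1.00],  # dead (absorbing)
]

def step(dist, P):
    """One day of the chain: new_j = sum over i of dist_i * P[i][j]."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

dist = [1.0, 0.0, 0.0, 0.0]   # everyone starts in Stage-1 on admission
for _ in range(7):            # state-occupation probabilities at day 7
    dist = step(dist, P)

p_stage3_day7, p_discharged_day7 = dist[1], dist[2]
```

Quantities like "probability of progression to Stage-3 by day 7" or "probability of discharge by day 7" fall out of such state-occupation vectors, with separate transition matrices for COVID+ and COVID- patients.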
The epidemiology of soil-transmitted helminth infections in children up to 8 years of age: Findings from an Ecuadorian birth cohort.
BACKGROUND: There are few prospective longitudinal studies of soil-transmitted helminth (STH) infections during early childhood. We studied the epidemiology of and risk factors for soil-transmitted helminth infections from birth to 8 years of age in tropical Ecuador. METHODS: 2,404 newborns were followed to 8 years of age with periodic stool sample collections. Stool samples were also collected from household members at the time of the child's birth and examined by microscopy. Data on social, environmental, and demographic characteristics were collected by maternal questionnaire. Associations between potential risk factors and STH infections were estimated using generalized estimating equations applied to longitudinal binary outcomes for presence or absence of infections at collection times. RESULTS: Of 2,404 children, 1,120 (46.6%) were infected with at least one STH during the first 8 years of life. The risk of A. lumbricoides infection (16.2%) was greatest at 3 years, while risks of any STH (25.1%) and T. trichiura (16.5%) peaked at 5 years. Factors significantly associated with any STH infection in multivariable analyses included age, day-care attendance (OR 1.34, 95% CI 1.03-1.73), maternal Afro-Ecuadorian ethnicity (non-Afro vs. Afro, OR 0.55, 95% CI 0.43-0.70), lower educational level (secondary vs. illiterate, OR 0.31, 95% CI 0.22-0.45), household overcrowding (OR 1.53, 95% CI 1.21-1.94), having a latrine rather than a water closet (WC vs. latrine, OR 0.77, 95% CI 0.62-0.95), and STH infections among household members (OR 2.03, 95% CI 1.59-2.58). T. trichiura infection was more associated with poverty (high vs. low socioeconomic status, OR 0.63, 95% CI 0.40-0.99) and the presence of infected siblings in the household (OR 3.42, 95% CI 2.24-5.22). CONCLUSION: STH infections, principally with A. lumbricoides and T. trichiura, peaked between 3 and 5 years of age in this cohort of children in tropical Ecuador. STH infections among household members were an important determinant of infection risk and could be targeted for control and elimination strategies.
Age-dependent seroprevalence of dengue and chikungunya: inference from a cross-sectional analysis in Esmeraldas Province in coastal Ecuador
Objectives There are few population-based estimates for prevalence of past exposure to dengue and chikungunya viruses despite common epidemiological features. Here, we have developed a novel statistical method to study patterns of age-dependent prevalence of immunity in a population following exposures to two viruses which share similar epidemiological features including mode of transmission and induction of long-lasting immunity. This statistical technique accounted for sociodemographic characteristics associated with individuals and households.
Settings The data consist of a representative sample from an ongoing longitudinal birth cohort set up in a tropical district in coastal Ecuador (Esmeraldas).
Participants We collected data and blood samples from 319 individuals belonging to 152 households following epidemics of the infections in 2015 in Latin America.
Primary outcome Plasma was tested for the presence of specific IgG antibodies to dengue and chikungunya viruses by commercial ELISA, and a bivariate binary outcome was defined indicating individuals' past exposure status to dengue and chikungunya (ie, presence/absence of IgG antibodies to dengue or chikungunya or both).
Results Dengue seroprevalence increased rapidly with age reaching 97% (95% credible interval (CrI): 93%–99%) by 60 years. Chikungunya seroprevalence peaked at 42% (95% CrI: 18%–66%) around 9 years of age and averaged 27% (95% CrI: 8.7%–51.6%) for all ages. Rural areas were more likely to be associated with dengue-only exposure while urban areas and shorter distance to the nearest household were associated with exposures to both. Women living in urban settings were more likely to be chikungunya seropositive while rural men were more likely to be dengue seropositive.
Conclusion Dengue seroprevalence was strongly age dependent, consistent with endemic exposure, while that of chikungunya peaked in childhood, consistent with the recent emergence of the virus in the study area. Our findings will inform control strategies for the two arboviruses in Ecuador, including recommendations by the WHO on dengue vaccination.
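Age-dependent seroprevalence under endemic, constant exposure is often summarised with a simple catalytic model, p(a) = 1 - exp(-λa); the sketch below fits a constant force of infection λ by grid-search maximum likelihood to invented age-grouped counts, not the cohort's data.

```python
import math

# Hypothetical serosurvey: positives / tested at age-group midpoints.
ages     = [5, 15, 30, 50]
tested   = [40, 40, 40, 40]
positive = [10, 22, 33, 38]

def loglik(lam):
    """Binomial log-likelihood of the counts under the catalytic model
    p(a) = 1 - exp(-lam * a) with a constant force of infection lam."""
    ll = 0.0
    for a, n, k in zip(ages, tested, positive):
        p = 1.0 - math.exp(-lam * a)
        ll += k * math.log(p) + (n - k) * math.log(1.0 - p)
    return ll

# Grid-search maximum likelihood over lam in (0.001, 0.300).
lams = [l / 1000.0 for l in range(1, 301)]
lam_hat = max(lams, key=loglik)
p60 = 1.0 - math.exp(-lam_hat * 60)   # predicted seroprevalence at age 60
```

A near-saturating p60, as with dengue's 97% by age 60, is the signature of long-standing endemic transmission; a recently emerged virus like chikungunya would instead show a seroprevalence plateau across ages rather than this monotone age accumulation.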