
    Pan-European Chikungunya surveillance: Designing risk stratified surveillance zones

    The first documented transmission of Chikungunya within Europe took place in Italy during the summer of 2007. Chikungunya, a viral infection affecting millions of people across Africa and Asia, can be debilitating and no prophylactic treatment exists. Although imported cases are reported frequently across Europe, 2007 was the first confirmed European outbreak; available evidence suggests that Aedes albopictus was the vector responsible and that the index case was a visitor from India. This paper proposes pan-European surveillance zones for Chikungunya, based on the climatic conditions necessary for vector activity and viral transmission. Pan-European surveillance offers the best hope for early warning of outbreaks, because national boundaries play no role in defining the risk of this new vector-borne disease threat. A review of climates where Chikungunya has been active was used to inform the delineation of three pan-European surveillance zones. These vary in size each month across the June-September period of greatest risk and stretch across southern Europe from Portugal to Turkey. Although the focus of this study was to define the geography of potential surveillance zones based on the climatic limits on the vector and virus, a preliminary examination of inbound airline passengers was also undertaken. This indicated that France and Italy are likely to be at greater risk because of the number of visitors they receive from Chikungunya-active regions, principally viraemic visitors from India. This study therefore represents a first attempt at creating risk-stratified surveillance zones, which we believe could be usefully refined with higher-resolution climate data and more complete air travel data.
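
    A minimal sketch of the zone-assignment idea described above, assuming purely illustrative temperature thresholds; the cutoffs, zone labels and function below are hypothetical and are not the criteria used by Tilston et al.:

    ```python
    # Hypothetical sketch: assign a location to a monthly surveillance zone from
    # its mean temperature. Thresholds are placeholders for illustration only.

    def surveillance_zone(mean_temp_c: float) -> int:
        """Return 3 if conditions suit vector activity and viral transmission,
        2 if they suit vector activity only, 1 for marginal conditions, else 0."""
        if mean_temp_c >= 25:    # hypothetical transmission threshold
            return 3
        if mean_temp_c >= 20:    # hypothetical vector-activity threshold
            return 2
        if mean_temp_c >= 15:    # hypothetical marginal threshold
            return 1
        return 0

    # Example: monthly mean temperatures for one location, June-September
    monthly_means = {"Jun": 19.5, "Jul": 24.0, "Aug": 26.1, "Sep": 21.3}
    print({month: surveillance_zone(t) for month, t in monthly_means.items()})
    # {'Jun': 1, 'Jul': 2, 'Aug': 3, 'Sep': 2}
    ```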

    Risk factors for chest infection in acute stroke: a prospective cohort study

    Background and Purpose: Pneumonia is a major cause of morbidity and mortality after stroke. We aimed to determine key characteristics that would allow prediction of those patients who are at highest risk for poststroke pneumonia. Methods: We studied a series of consecutive patients with acute stroke who were admitted to hospital. Detailed evaluation included the modified National Institutes of Health Stroke Scale; the Abbreviated Mental Test; and measures of swallow, respiratory, and oral health status. Pneumonia was diagnosed by set criteria. Patients were followed up at 3 months after stroke. Results: We studied 412 patients, 391 (94.9%) with ischemic stroke and 21 (5.1%) with hemorrhagic stroke; 78 (18.9%) met the study criteria for pneumonia. Subjects who developed pneumonia were older (mean±SD age, 75.9±11.4 vs 64.9±13.9 years), had higher modified National Institutes of Health Stroke Scale scores, a history of chronic obstructive pulmonary disease, lower Abbreviated Mental Test scores, and a higher oral cavity score, and a greater proportion tested positive for bacterial cultures from oral swabs. In binary logistic-regression analysis, independent predictors (P<0.05) of pneumonia were age >65 years, dysarthria or no speech due to aphasia, a modified Rankin Scale score ≥4, an Abbreviated Mental Test score <8, and failure on the water swallow test. The presence of 2 or more of these risk factors carried 90.9% sensitivity and 75.6% specificity for the development of pneumonia. Conclusions: Pneumonia after stroke is associated with older age, dysarthria/no speech due to aphasia, severity of poststroke disability, cognitive impairment, and an abnormal water swallow test result. Simple assessment of these variables could be used to identify patients at high risk of developing pneumonia after stroke.
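
    A minimal sketch, with hypothetical argument names, of the "2 or more risk factors" rule reported in the Results (two or more of the five independent predictors carried 90.9% sensitivity and 75.6% specificity for pneumonia):

    ```python
    # Sketch of the bedside rule described in the abstract; argument names are
    # illustrative and not taken from the study's data dictionary.

    def count_risk_factors(age, dysarthria_or_no_speech, mrs_score, amt_score,
                           failed_water_swallow):
        """Count the five independent predictors of poststroke pneumonia."""
        factors = [
            age > 65,                   # age > 65 years
            dysarthria_or_no_speech,    # dysarthria or no speech due to aphasia
            mrs_score >= 4,             # modified Rankin Scale score >= 4
            amt_score < 8,              # Abbreviated Mental Test score < 8
            failed_water_swallow,       # failure on the water swallow test
        ]
        return sum(factors)

    n = count_risk_factors(age=78, dysarthria_or_no_speech=True, mrs_score=4,
                           amt_score=9, failed_water_swallow=False)
    # Per the abstract, >= 2 factors flags a patient as high risk
    print(n, n >= 2)  # 3 True
    ```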

    Using legume-based mixtures to enhance the nitrogen use efficiency and economic viability of cropping systems - Final report (LK09106/HGCA3447)

    As costs for mineral fertilisers rise, legume-based leys are recognised as a potential alternative nitrogen source for crops. Here we demonstrate that including species-rich legume-based leys in rotations helps to maximise synergies between agricultural productivity and other ecosystem services. By using functionally diverse plant species mixtures, these services can be optimised and fine-tuned to regional and farm-specific needs. Replicated field experiments were conducted over three years at multiple locations, testing the performance of 12 legume species and 4 grass species sown in monocultures, as well as in a mixture of 10 of the legumes and all 4 grasses (called the All Species Mix, ASM). In addition, we compared this complex mixture to farmer-chosen ley mixtures on 34 sites across the UK. The trials showed that there is a large degree of functional complementarity among the legume species. No single species scored high on all evaluation criteria. In particular, the currently most frequently used species, white clover, is outscored by other legume species on a number of parameters such as early development and resistance to decomposition. Further complementarity emerged from the different responses of legume species to environmental variables, with soil pH and grazing or cutting regime being among the more important factors. For example, while large birdsfoot trefoil showed better performance on more acidic soils, the opposite was true for sainfoin, lucerne and black medic. In comparison with the monocultures, the ASM showed increased ground cover, increased above-ground biomass and reduced weed biomass. Benefits of mixing species with regard to productivity increased over time. In addition, the stability of biomass production across sites was greater in the ASM than in the legume monocultures. Within the on-farm trials, we further found that on soils low in organic matter the biomass advantage of the ASM over the Control ley was more marked than on soils with higher organic matter content. Ecological modelling revealed that the three best multifunctional mixtures all contained black medic, lucerne and red clover. Within the long-term New Farming Systems (NFS) rotational study, the use of a clover bi-crop improved soil characteristics (e.g. bulk density and water infiltration rate) compared with current practice. Improvements in wheat yield were also noted with the inclusion of a clover bi-crop in 2010, but there was evidence of a decline in response as the N dose was increased. Cumulatively, over both the wheat crop and the spring oilseed rape crop, the clover bi-crop improved margin over N. The highest average yield response (~9%) was associated with the ASM legume species mix cover-cropping approach.

    Measured Dynamic Social Contact Patterns Explain the Spread of H1N1v Influenza

    Patterns of social mixing are key determinants of epidemic spread. Here we present the results of an internet-based social contact survey completed by a cohort of participants over 9,000 times between July 2009 and March 2010, during the 2009 H1N1v influenza epidemic. We quantify the changes in social contact patterns over time, finding that school children make 40% fewer contacts during holiday periods than during term time. We use these dynamically varying contact patterns to parameterise an age-structured model of influenza spread, which captures the observed patterns of incidence well; the changing contact patterns resulted in a fall of approximately 35% in the reproduction number of influenza during the holidays. This work illustrates the importance of including changing mixing patterns in epidemic models. We conclude that changes in contact patterns explain changes in disease incidence, and that the timing of school terms drove the 2009 H1N1v epidemic in the UK. Changes in social mixing patterns can be usefully measured through simple internet-based surveys.
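
    To illustrate how dynamically varying contact matrices feed into the reproduction number of an age-structured model, the sketch below compares the dominant eigenvalue of a term-time and a holiday contact matrix (R is proportional to this eigenvalue for a fixed transmission probability); the two-group matrix values are made up for demonstration and are not the survey estimates:

    ```python
    # Illustrative only: a toy two-group (children, adults) contact matrix, not
    # the measured matrices from the survey.

    import numpy as np

    term_time = np.array([[12.0, 4.0],   # contacts made by children
                          [ 4.0, 8.0]])  # contacts made by adults

    holiday = term_time.copy()
    holiday[0, :] *= 0.6                 # children make ~40% fewer contacts (per abstract)

    def dominant_eigenvalue(contacts: np.ndarray) -> float:
        """Proportional to R in a simple age-structured transmission model."""
        return float(np.max(np.linalg.eigvals(contacts).real))

    drop = 1 - dominant_eigenvalue(holiday) / dominant_eigenvalue(term_time)
    print(f"Relative fall in R during holidays: {drop:.0%}")
    ```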

    Chronic pain associated with the Chikungunya Fever: long lasting burden of an acute illness

    Background: Chikungunya virus (CHIKV) is responsible for major epidemics worldwide. Autochthonous cases were recently reported in several European countries. Acute infection is thought to be monophasic; however, reports of chronic pain related to CHIKV infection have been made. In particular, the fact that many of these patients do not respond well to usual analgesics suggests that the nature of chronic pain may be not only nociceptive but also neuropathic. Neuropathic pain syndromes require specific treatment, and the identification of neuropathic characteristics (NC) in a pain syndrome is a major step towards pain control. Methods: We carried out a cross-sectional study at the end of the major two-wave outbreak lasting 17 months on Réunion Island. We assessed pain in 106 patients presenting to general practitioners with confirmed CHIKV infection, and evaluated its impact on quality of life (QoL). Results: The mean intensity of pain on the visual analogue scale (VAS) was 5.8 ± 2.1, and its mean duration was 89 ± 2 days. Fifty-six patients fulfilled the definition of chronic pain. Pain had NC in 18.9% according to the DN4 questionnaire. Conversely, about two thirds (65%) of patients with NC had chronic pain. The average pain intensity was similar between patients with or without NC (6.0 ± 1.7 vs 6.1 ± 2.0). However, the total score of the Short Form-McGill Pain Questionnaire (SF-MPQ) (15.5 ± 5.2 vs 11.6 ± 5.2; p < 0.01) and both the affective (18.8 ± 6.2 vs 13.4 ± 6.7; p < 0.01) and sensory subscores (34.3 ± 10.7 vs 25.0 ± 9.9; p < 0.01) were significantly higher in patients with NC. The mean pain interference in life activities calculated from the Brief Pain Inventory (BPI) was significantly higher in patients with chronic pain than in patients without it (6.8 ± 1.9 vs 5.9 ± 1.9, p < 0.05). This score was also significantly higher in patients with NC than in those without such a feature (7.2 ± 1.5 vs 6.1 ± 1.9, p < 0.05). Conclusions: There exists a specific chronic pain condition associated with CHIKV. Pain with NC appears to be associated with a more aggressive clinical picture, a greater impact on QoL and more challenging pharmacological treatment.
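
    A minimal sketch of the kind of group comparison reported above, assuming the conventional DN4 cutoff (a score of 4 or more suggests neuropathic characteristics) and Welch's t-test for comparing SF-MPQ totals; the scores below are synthetic placeholders rather than study data:

    ```python
    # Illustrative only: synthetic scores, not data from the Réunion Island study.
    from scipy.stats import ttest_ind

    def has_neuropathic_characteristics(dn4_score: int) -> bool:
        # Conventional DN4 cutoff: a score of 4 or more suggests neuropathic pain
        return dn4_score >= 4

    # Synthetic SF-MPQ total scores for patients with and without NC
    sf_mpq_nc    = [18, 14, 21, 16, 12, 19]
    sf_mpq_no_nc = [11,  9, 13, 12, 10, 14]

    # Welch's t-test (unequal variances) for the between-group comparison
    t_stat, p_value = ttest_ind(sf_mpq_nc, sf_mpq_no_nc, equal_var=False)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
    ```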

    The drivers of tropical speciation

    Since the recognition that allopatric speciation can be induced by large-scale reconfigurations of the landscape that isolate formerly continuous populations, such as the separation of continents by plate tectonics, the uplift of mountains or the formation of large rivers, landscape change has been viewed as a primary driver of biological diversification. This process is referred to in biogeography as vicariance. In the most species-rich region of the world, the Neotropics, the sundering of populations associated with the Andean uplift is ascribed this principal role in speciation. An alternative model posits that, rather than being directly linked to landscape change, allopatric speciation is initiated to a greater extent by dispersal events, with the principal drivers of speciation being organism-specific abilities to persist and disperse in the landscape. Landscape change is not a necessity for speciation in this model. Here we show that spatial and temporal patterns of genetic differentiation in Neotropical birds are highly discordant across lineages and are not reconcilable with a model linking speciation solely to landscape change. Instead, the strongest predictors of speciation are the amount of time a lineage has persisted in the landscape and the ability of birds to move through the landscape matrix. These results, augmented by the observation that most species-level diversity originated after episodes of major Andean uplift in the Neogene period, suggest that dispersal and differentiation on a matrix previously shaped by large-scale landscape events was a major driver of avian speciation in lowland Neotropical rainforests.

    Human papillomavirus infection as a risk factor for anal and perianal skin cancer in a prospective study

    Human papillomavirus has emerged as the leading infectious cause of cervical and other anogenital cancers. We have studied the relation between human papillomavirus infection and the subsequent risk of anal and perianal skin cancer. A case-cohort study was performed within two large Nordic serum banks to which about 760 000 individuals had donated serum samples. Subjects who developed anal and perianal skin cancer during follow-up (median time of 10 years) were identified by registry linkage with the nationwide cancer registries in Finland and Norway. Twenty-eight cases and 1500 controls were analysed for the presence of IgG antibodies to HPV 16, 18, 33 or 73, and odds ratios of developing anal and perianal skin cancer were calculated. There was an increased risk of developing anal and perianal skin cancer among subjects seropositive for HPV 16 (OR=3.0; 95%CI=1.1–8.2) and HPV 18 (OR=4.4; 95%CI=1.1–17). The highest risks were seen for HPV 16 seropositive patients above the age of 45 years at serum sampling and for patients with a lag time of less than 10 years. This study provides prospective epidemiological evidence of an association between infection with HPV 16 and 18 and anal and perianal skin cancer.
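
    For illustration, the sketch below computes an unadjusted odds ratio with a Woolf (log-scale) 95% confidence interval from a 2x2 table; the counts are placeholders chosen only so the point estimate lands near the reported OR of 3.0 for HPV 16, and the study's actual case-cohort estimation will have differed:

    ```python
    # Illustrative odds ratio and Woolf confidence interval from placeholder counts.

    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """a/b: exposed/unexposed cases; c/d: exposed/unexposed controls."""
        or_ = (a * d) / (b * c)
        se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        lower = math.exp(math.log(or_) - z * se_log_or)
        upper = math.exp(math.log(or_) + z * se_log_or)
        return or_, lower, upper

    or_, lower, upper = odds_ratio_ci(a=12, b=16, c=300, d=1200)
    print(f"OR = {or_:.1f} (95% CI {lower:.1f}-{upper:.1f})")
    ```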

    Virological outcomes of boosted protease inhibitor-based first-line ART in subjects harbouring thymidine analogue-associated mutations as the sole form of transmitted drug resistance

    OBJECTIVES: In subjects with transmitted thymidine analogue mutations (TAMs), boosted PIs (PI/b) are often chosen to overcome possible resistance to the NRTI backbone. However, data to guide treatment selection are limited. Our aim was to obtain firmer guidance for clinical practice using real-world cohort data. METHODS: We analysed 1710 subjects who started a PI/b in combination with tenofovir or abacavir plus emtricitabine or lamivudine, and compared their virological outcomes with those of 4889 patients who started an NNRTI (predominantly efavirenz), according to the presence of ≥1 TAM as the sole form of transmitted drug resistance. RESULTS: Participants with ≥1 TAM comprised predominantly MSM (213 of 269, 79.2%), subjects of white ethnicity (206 of 269, 76.6%) and HIV-1 subtype B infections (234 of 269, 87.0%). Most (203 of 269, 75.5%) had singleton TAMs, commonly a revertant of T215Y or T215F (112 of 269, 41.6%). Over a median of 2.5 years of follow-up, 834 of 6599 (12.6%) subjects experienced viraemia (HIV-1 RNA >50 copies/mL). The adjusted HR for viraemia was 2.17 with PI/b versus NNRTI-based therapy (95% CI 1.88–2.51; P < 0.001). Other independent predictors of viraemia included injecting drug use, black ethnicity, higher viral load and lower CD4 cell count at baseline, and receiving abacavir instead of tenofovir. Resistance showed no overall impact (adjusted HR 0.77 with ≥1 TAM versus no resistance; 95% CI 0.54–1.10; P = 0.15). CONCLUSIONS: In this cohort, patients harbouring ≥1 TAM as the sole form of transmitted drug resistance gained no apparent virological advantage from starting first-line ART with a PI/b.
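
    A hedged sketch of the type of proportional-hazards analysis behind the adjusted hazard ratios reported above, assuming the lifelines package is available; the data frame, its variable names, and all values are tiny synthetic placeholders, not the cohort data:

    ```python
    # Toy Cox proportional-hazards example; synthetic data, illustrative covariates.

    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.DataFrame({
        "years_followup":  [1.2, 2.5, 0.8, 3.1, 2.0, 1.7, 2.9, 0.5, 2.2, 1.0],
        "viraemia":        [1,   0,   1,   0,   0,   1,   0,   1,   0,   1],    # event indicator
        "pi_boosted":      [1,   0,   1,   0,   1,   0,   1,   1,   0,   0],    # 1 = PI/b, 0 = NNRTI
        "baseline_log_vl": [5.1, 4.3, 5.6, 4.0, 4.8, 4.5, 5.0, 5.4, 4.2, 4.9],
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="years_followup", event_col="viraemia")
    print(cph.hazard_ratios_)   # exponentiated coefficients, i.e. adjusted HRs
    ```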

    Utilisation of an operative difficulty grading scale for laparoscopic cholecystectomy

    Background: A reliable system for grading the operative difficulty of laparoscopic cholecystectomy would standardise the description of findings and the reporting of outcomes. The aim of this study was to validate a difficulty grading system (Nassar scale), testing its applicability and consistency in two large prospective datasets. Methods: Patient and disease-related variables and 30-day outcomes were identified in two prospective cholecystectomy databases: the multi-centre prospective cohort of 8820 patients from the recent CholeS Study and the single-surgeon series containing 4089 patients. Operative data and patient outcomes were correlated with the Nassar operative difficulty scale, using Kendall's tau for dichotomous variables or Jonckheere–Terpstra tests for continuous variables. A ROC curve analysis was performed to quantify the predictive accuracy of the scale for each outcome, with continuous outcomes dichotomised prior to analysis. Results: A higher operative difficulty grade was consistently associated with worse outcomes for the patients in both the reference and CholeS cohorts. The median length of stay increased from 0 to 4 days, and the 30-day complication rate from 7.6% to 24.4%, as the difficulty grade increased from 1 to 4/5 (both p < 0.001). In the CholeS cohort, a higher difficulty grade was most strongly associated with conversion to open surgery and 30-day mortality (AUROC = 0.903 and 0.822, respectively). On multivariable analysis, the Nassar operative difficulty scale was a significant independent predictor of operative duration, conversion to open surgery, 30-day complications and 30-day reintervention (all p < 0.001). Conclusion: We have shown that an operative difficulty scale can standardise the description of operative findings by multiple grades of surgeons to facilitate audit, training assessment and research. It provides a tool for reporting operative findings, disease severity and technical difficulty, and can be used in future research to reliably compare outcomes according to case mix and intra-operative difficulty.
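
    As an illustration of the two association measures named in the Methods, the sketch below computes Kendall's tau between difficulty grade and a dichotomous 30-day outcome, and the AUROC for how well the grade discriminates that outcome; the grades and outcomes are placeholder values, not CholeS data:

    ```python
    # Placeholder data illustrating Kendall's tau and AUROC for an ordinal grade
    # against a dichotomous outcome.

    from scipy.stats import kendalltau
    from sklearn.metrics import roc_auc_score

    difficulty_grade  = [1, 2, 2, 3, 3, 4, 4, 5, 1, 2]   # Nassar grade per operation
    converted_to_open = [0, 0, 0, 0, 1, 1, 0, 1, 0, 0]   # 30-day outcome (1 = yes)

    tau, p_value = kendalltau(difficulty_grade, converted_to_open)
    auc = roc_auc_score(converted_to_open, difficulty_grade)
    print(f"Kendall's tau = {tau:.2f} (p = {p_value:.3f}); AUROC = {auc:.2f}")
    ```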

    Transmission Potential of Chikungunya Virus and Control Measures: The Case of Italy

    During the summer of 2007, Italy experienced an epidemic caused by Chikungunya virus, the first large outbreak documented in a temperate-climate country, with approximately 161 laboratory-confirmed cases concentrated in two bordering villages in north-eastern Italy comprising 3,968 inhabitants. The seroprevalence was recently estimated to be 10.2%. In this work we provide estimates of the transmission potential of the virus and we assess the efficacy of the measures undertaken by public health authorities to control the epidemic spread. To this end, we developed a model describing the temporal dynamics of the competent vector, Aedes albopictus, explicitly depending on climatic factors, coupled to an epidemic transmission model describing the spread of the epidemic in both humans and mosquitoes. The cumulative number of notified cases predicted by the model was 185 on average (95% CI 117–278), in good agreement with the observed data. The probability of observing a major outbreak after the introduction of an infective human case was estimated to be in the range of 32%–76%. We found that the basic reproduction number was in the range of 1.8–6, but it could have been even larger, depending on the density of mosquitoes, which in turn depends on seasonal meteorological effects, besides other local abiotic factors. These results confirm the increasing risk of tropical vector-borne diseases in temperate-climate countries as a consequence of globalization. However, our results show that an epidemic can be controlled by a timely intervention, even when the transmission potential of Chikungunya virus is considerably high.
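
    As a rough guide to how a basic reproduction number for a mosquito-borne infection depends on vector density and climate-sensitive parameters, here is a classical Ross-Macdonald-style expression; the parameter values are placeholders, and the study itself used a far more detailed, climate-driven vector dynamics model coupled to a host-vector transmission model:

    ```python
    # Ross-Macdonald-style R0 sketch with illustrative placeholder parameters.

    import math

    def basic_reproduction_number(m, a, b, c, mu, n, r):
        """m: mosquitoes per human; a: bites per mosquito per day;
        b, c: transmission probabilities per bite (vector-to-human, human-to-vector);
        mu: mosquito mortality rate (1/day); n: extrinsic incubation period (days);
        r: human recovery rate (1/day)."""
        return (m * a**2 * b * c * math.exp(-mu * n)) / (mu * r)

    # Illustrative values giving an R0 above 1; raising m raises R0 further
    print(basic_reproduction_number(m=2.0, a=0.25, b=0.5, c=0.5, mu=0.1, n=3, r=1/7))
    ```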