
    Mycobacterium bovis shedding patterns from experimentally infected calves and the effect of concurrent infection with bovine viral diarrhoea virus

    Concurrent infection of cattle with bovine viral diarrhoea virus (BVDV) and Mycobacterium bovis is considered a possible risk factor for onward transmission of bovine tuberculosis (BTB) in infected cattle and is known to compromise diagnostic tests. A comparison is made here of M. bovis shedding (i.e. release) characteristics from 12 calves, six experimentally co-infected with BVDV and six infected with M. bovis alone, using simple models of bacterial replication. These statistical and mathematical models account for the intermittent or episodic nature of shedding, the dynamics of within-host bacterial proliferation and the sampling distribution from a given shedding episode. We show that while there are distinct differences among the shedding patterns of calves given the same infecting dose, there is no statistically significant difference between the two groups of calves. Such differences as there are can be explained solely in terms of the shedding frequency, with all calves potentially excreting the same amount of bacteria in a given shedding episode post-infection. The model can be thought of as describing bacteria becoming established in a number of discrete foci of colonization, rather than as a more generalized infection of the respiratory tract. In this case, the variability in the shedding patterns of the infected calves can be explained solely by differences in the number of foci established, with shedding occurring from individual foci over time. Should maximum exposure on a particular occasion be a critical consideration for cattle-to-cattle transmission of BTB, cattle that shed only intermittently may still make an important contribution to the spread and persistence of the disease.
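    The foci-of-colonization idea can be illustrated with a toy simulation. Everything below (the function, parameter values, and the two hypothetical calves) is an illustrative assumption, not the paper's fitted model: each focus sheds independently and episodically, so calves differ in how often they shed via the number of foci, while the amount shed per episode stays constant.

```python
import random

def simulate_shedding(n_foci, p_episode, episode_size, n_days, seed=0):
    """Toy model: each colonization focus independently produces a
    shedding episode of fixed size on any given day with probability
    p_episode. More foci -> more frequent shedding, same episode size."""
    rng = random.Random(seed)
    return [sum(episode_size for _ in range(n_foci) if rng.random() < p_episode)
            for _ in range(n_days)]

# Two hypothetical calves given the same dose but establishing
# different numbers of foci (all values are made up for illustration).
few_foci = simulate_shedding(n_foci=2, p_episode=0.1, episode_size=100, n_days=60)
many_foci = simulate_shedding(n_foci=8, p_episode=0.1, episode_size=100, n_days=60)
```

    Under this sketch, daily output is always a multiple of the fixed episode size; only its frequency differs between the two calves.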

    Frequency and characteristics of disease flares in ankylosing spondylitis

    Objective: To examine the characteristics and frequency of disease flares in a cohort of people with ankylosing spondylitis (AS).

    Classification of accelerometer wear and non-wear events in seconds for monitoring free-living physical activity

    Design: A bi-moving-window-based approach was used to combine acceleration and skin temperature data to identify wear and non-wear events in triaxial accelerometer data collected to monitor physical activity. Setting: Local residents in Swansea, Wales, UK. Participants: 50 participants aged under 16 years (n=23) and over 17 years (n=27) were recruited in two phases: phase 1, design of the wear/non-wear algorithm (n=20), and phase 2, validation of the algorithm (n=30). Methods: Participants wore a triaxial accelerometer (GeneActiv) against the skin surface on the wrist (adults) or ankle (children). Participants kept a diary to record the timings of wear and non-wear and were asked to ensure that events of wear/non-wear lasted a minimum of 15 min. Results: The overall sensitivity of the proposed method was 0.94 (95% CI 0.90 to 0.98) and specificity 0.91 (95% CI 0.88 to 0.94). It performed equally well for children compared with adults, and for females compared with males. Using surface skin temperature data in combination with acceleration data significantly improved the classification of wear/non-wear time compared with methods that used acceleration data only (p<0.01). Conclusions: Using either accelerometer seismic information or temperature information alone is prone to considerable error. Combining both sources of data gives accurate estimates of non-wear periods and thus better classification of sedentary behaviour. This method can be used in population studies of physical activity in free-living environments.
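    The core intuition, that neither signal alone is reliable, can be sketched as a per-window decision rule. The function and both thresholds below are illustrative assumptions, not the published bi-moving-window algorithm: a window counts as non-wear only when movement is negligible AND skin temperature is low.

```python
def classify_nonwear(acc_std, skin_temp, acc_thresh=0.013, temp_thresh=26.0):
    """Toy per-window rule combining two cues: low acceleration variance
    alone could be sedentary wear, and low temperature alone could be a
    cold environment; only both together suggest non-wear.
    Thresholds are illustrative, not the paper's values."""
    return acc_std < acc_thresh and skin_temp < temp_thresh

still_cool = classify_nonwear(0.001, 22.0)   # still and cool: likely off the body
still_warm = classify_nonwear(0.001, 33.0)   # still but warm: sedentary wear
moving = classify_nonwear(0.200, 22.0)       # moving: worn regardless of temperature
```

    A sedentary-but-worn window (still, warm) is the case that acceleration-only methods misclassify, which is why the combined rule improves specificity.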

    Quantitative Analysis of Immune Response and Erythropoiesis during Rodent Malarial Infection

    Malarial infection is associated with complex immune and erythropoietic responses in the host. A quantitative understanding of these processes is essential to help inform malaria therapy and the design of effective vaccines. In this study, we use a statistical model-fitting approach to investigate the immune and erythropoietic responses in Plasmodium chabaudi infections of mice. Three mouse phenotypes (wildtype, T-cell-deficient nude mice, and nude mice reconstituted with T-cells taken from wildtype mice) were infected with one of two parasite clones (AS or AJ). Under a Bayesian framework, we use an adaptive population-based Markov chain Monte Carlo method to fit a set of dynamical models to observed data on parasite and red blood cell (RBC) densities. Model fits are compared using Bayes factors, and parameter estimates are obtained. We consider three independent immune mechanisms: clearance of parasitised RBCs (pRBC), clearance of unparasitised RBCs (uRBC), and clearance of parasites that burst from RBCs (merozoites). Our results suggest that the immune response of wildtype mice is associated with less destruction of uRBCs, compared to the immune response of nude mice. There is a greater degree of synchronisation between pRBC and uRBC clearance than between either mechanism and merozoite clearance. In all three mouse phenotypes, control of the peak of parasite density is associated with pRBC clearance. In wildtype mice and AS-infected nude mice, control of the peak is also associated with uRBC clearance. Our results suggest that uRBC clearance, rather than RBC infection, is the major determinant of RBC dynamics from approximately day 12 post-inoculation. During the first 2–3 weeks of blood-stage infection, immune-mediated clearance of pRBCs and uRBCs appears to have a much stronger effect than immune-mediated merozoite clearance. Upregulation of erythropoiesis is dependent on mouse phenotype and is greater in wildtype and reconstituted mice. Our study highlights the informative power of statistically rigorous model-fitting techniques in elucidating biological systems.
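    The fitting machinery can be illustrated in miniature. The sketch below is an assumption-laden toy, not the paper's model: a single-parameter exponential-decline model stands in for the within-host dynamics, and a plain random-walk Metropolis sampler stands in for the adaptive population-based MCMC the study actually uses.

```python
import math
import random

def log_likelihood(rate, data):
    """Gaussian log-likelihood (unit variance) of observed densities
    under a toy model y(t) = 100 * exp(-rate * t)."""
    return -0.5 * sum((y - 100 * math.exp(-rate * t)) ** 2 for t, y in data)

def metropolis(data, n_iter=2000, step=0.05, seed=1):
    """Minimal random-walk Metropolis sampler over a single positive
    rate parameter (flat prior on rate > 0)."""
    rng = random.Random(seed)
    rate = 0.5                      # arbitrary starting value
    ll = log_likelihood(rate, data)
    samples = []
    for _ in range(n_iter):
        prop = rate + rng.gauss(0, step)
        if prop > 0:
            ll_prop = log_likelihood(prop, data)
            # accept with probability min(1, exp(ll_prop - ll))
            if math.log(rng.random()) < ll_prop - ll:
                rate, ll = prop, ll_prop
        samples.append(rate)
    return samples

# Synthetic noiseless data generated with rate 0.3; after burn-in the
# chain should concentrate near that value.
data = [(t, 100 * math.exp(-0.3 * t)) for t in range(10)]
chain = metropolis(data)
```

    The real analysis extends this idea with multiple interacting chains (the "population"), adaptive proposals, and mechanistic models of pRBC, uRBC and merozoite clearance; model comparison via Bayes factors then requires estimating marginal likelihoods rather than just sampling the posterior.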

    Quantifying the Risk of Localised Animal Movement Bans for Foot-and-Mouth Disease

    The maintenance of disease-free status from Foot-and-Mouth Disease is of significant socio-economic importance to countries such as the UK. The imposition of bans on the movement of susceptible livestock following the discovery of an outbreak is deemed necessary to prevent the spread of what is a highly contagious disease, but itself has a significant economic impact on the agricultural community. Here we consider the risk of applying movement restrictions only in localised zones around outbreaks, in order to help evaluate how quickly nation-wide restrictions could be lifted after notification. We show, with reference to the 2001 and 2007 UK outbreaks, that it would be practical to implement such a policy provided the basic reproduction ratio of known infected premises can be estimated. It is ultimately up to policy makers and stakeholders to determine the acceptable level of risk, involving a cost-benefit analysis of the potential outcomes, but quantifying the risk of spread from different sized zones is a prerequisite for this. The approach outlined is relevant to the determination of control zones and vaccination policies and has the potential to be applied to future outbreaks of other diseases.
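    An estimate of the basic reproduction ratio feeds naturally into risk calculations of this kind. As an illustrative proxy only (this is a textbook branching-process result, not the paper's actual risk quantification), the sketch below computes the probability that a transmission chain from one infected premises fails to die out, assuming a Poisson offspring distribution with mean R0.

```python
import math

def escape_probability(r0, tol=1e-9, max_iter=10000):
    """Probability that a transmission chain started by one infected
    premises does NOT go extinct, for Poisson offspring with mean r0.
    The extinction probability q is the smallest root of the fixed-point
    equation q = exp(r0 * (q - 1)), found here by iteration."""
    q = 0.0
    q_new = math.exp(-r0)
    for _ in range(max_iter):
        q_new = math.exp(r0 * (q - 1.0))
        if abs(q_new - q) < tol:
            break
        q = q_new
    return 1.0 - q_new

low = escape_probability(0.5)   # below threshold: chains die out almost surely
high = escape_probability(2.0)  # above threshold: substantial escape risk
```

    The qualitative point matches the abstract: whether localised restriction zones suffice hinges on driving the effective reproduction ratio of known infected premises below one.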

    Dynamic Health Policies for Controlling the Spread of Emerging Infections: Influenza as an Example

    The recent appearance and spread of novel infectious pathogens provide motivation for using models as tools to guide public health decision-making. Here we describe a modeling approach for developing dynamic health policies that allow for adaptive decision-making as new data become available during an epidemic. In contrast to static health policies, which have generally been selected by comparing the performance of a limited number of pre-determined sequences of interventions within simulation or mathematical models, dynamic health policies produce "real-time" recommendations for the choice of the best current intervention based on the observable state of the epidemic. Using cumulative real-time data for disease spread coupled with current information about resource availability, these policies provide recommendations for interventions that optimally utilize available resources to preserve the overall health of the population. We illustrate the design and implementation of a dynamic health policy for the control of a novel strain of influenza, where we assume that two types of intervention may be available during the epidemic: (1) vaccines and antiviral drugs, and (2) transmission-reducing measures, such as social distancing or mask use, that may be turned "on" or "off" repeatedly during the course of the epidemic. In this example, the optimal dynamic health policy maximizes the overall population's health during the epidemic by specifying at any point in time, based on observable conditions, (1) the number of individuals to vaccinate if vaccines are available, and (2) whether the transmission-reducing intervention should be employed or removed.
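    The contrast between a static policy and a state-dependent one can be shown with a toy epidemic. All parameter values, the SIR discretisation, and the simple 2%-prevalence trigger below are illustrative assumptions; the paper derives its optimal policy rather than using a fixed threshold.

```python
def simulate_sir(policy, beta=0.3, gamma=0.1, reduction=0.5, days=200):
    """Discrete-time SIR epidemic in which a transmission-reducing
    measure is switched on or off each day by policy(i), where i is the
    currently observed prevalence. Returns the peak prevalence."""
    s, i, r = 0.999, 0.001, 0.0
    peak = 0.0
    for _ in range(days):
        b = beta * reduction if policy(i) else beta  # halve transmission when on
        new_inf = b * s * i
        new_rec = gamma * i
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        peak = max(peak, i)
    return peak

# Static policy (never intervene) versus a simple dynamic rule that
# turns distancing on whenever prevalence exceeds 2%.
static_peak = simulate_sir(lambda i: False)
dynamic_peak = simulate_sir(lambda i: i > 0.02)
```

    Even this crude state-dependent rule flattens the peak relative to doing nothing, which is the mechanism a properly optimised dynamic policy exploits while also accounting for resource constraints.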

    Social Class Differences in Secular Trends in Established Coronary Risk Factors over 20 Years: A Cohort Study of British Men from 1978–80 to 1998–2000

    Background: Coronary heart disease (CHD) mortality in the UK since the late 1970s has declined more markedly among higher socioeconomic groups. However, little is known about changes in coronary risk factors in different socioeconomic groups. This study examined whether changes in established coronary risk factors in Britain over the 20 years between 1978-80 and 1998-2000 differed between socioeconomic groups. Methods and Findings: A socioeconomically representative cohort of 7735 British men aged 40-59 years was followed up from 1978-80 to 1998-2000; data on blood pressure (BP), cholesterol, body mass index (BMI) and cigarette smoking were collected at both points in 4252 survivors. Social class was based on longest-held occupation in middle age. Compared with men in non-manual occupations, men in manual occupations experienced a greater increase in BMI (mean difference = 0.33 kg/m²; 95% CI 0.14-0.53; p for interaction = 0.001), a smaller decline in non-HDL cholesterol (difference in mean change = 0.18 mmol/l; 95% CI 0.11-0.25; p for interaction ≤ 0.0001) and a smaller increase in HDL cholesterol (difference in mean change = 0.04 mmol/l; 95% CI 0.02-0.06; p for interaction ≤ 0.0001). However, mean systolic BP declined more in manual than non-manual groups (difference in mean change = 3.6; 95% CI 2.1-5.1; p for interaction ≤ 0.0001). The odds of being a current smoker in 1978-80 and 1998-2000 did not differ between non-manual and manual social classes (p for interaction = 0.51). Conclusion: Several key risk factors for CHD and type 2 diabetes showed less favourable changes in men in manual occupations. Continuing priority is needed to improve adverse cardiovascular risk profiles in socially disadvantaged groups in the UK.

    A Statistically Rigorous Method for Determining Antigenic Switching Networks

    Many vector-borne pathogens rely on antigenic variation to prolong infections and increase their likelihood of onward transmission. This immune evasion strategy often involves mutually exclusive switching between members of gene families that encode functionally similar but antigenically different variants during the course of a single infection. Studies of different pathogens have suggested that switching between variant genes is non-random and that genes have intrinsic probabilities of being activated or silenced. These factors could create a hierarchy of gene expression with important implications for both infection dynamics and the acquisition of protective immunity. Inferring complete switching networks from gene transcription data is problematic, however, because of the high dimensionality of the system and uncertainty in the data. Here we present a statistically rigorous method for analysing temporal gene transcription data to reconstruct an underlying switching network. Using artificially generated transcription profiles together with in vitro var gene transcript data from two Plasmodium falciparum laboratory strains, we show that instead of relying on data from long-term parasite cultures, accuracy can be greatly improved by using transcription time courses of several parasite populations from the same isolate, each starting with different variant distributions. The method further provides explicit indications of the reliability of the resulting networks and can thus be used to test competing hypotheses with regard to the underlying switching pathways. Our results demonstrate that antigenic switch pathways can be determined reliably from short gene transcription profiles assessing multiple time points, even when subject to moderate levels of experimental error. This should yield important new information about switching patterns in antigenically variable organisms and might help to shed light on the molecular basis of antigenic variation.

    Application of Frequent Itemsets Mining to Analyze Patterns of One-Stop Visits in Taiwan

    BACKGROUND: The free choice of health care facilities without limitations on frequency of visits within the National Health Insurance in Taiwan gives rise not only to a high number of annual ambulatory visits per capita but also to a unique "one-stop shopping" phenomenon, which refers to a patient's visits to several specialties of the same healthcare facility in one day. The visits to multiple physicians increase the potential risk of polypharmacy. The aim of this study was to analyze the frequency and patterns of one-stop visits in Taiwan. METHODOLOGY/PRINCIPAL FINDINGS: The claims datasets of 1 million nationally representative people within Taiwan's National Health Insurance in 2005 were used to calculate the number of patients with one-stop visits. Frequent itemsets mining was applied to compute the combination patterns of specialties in the one-stop visits. Among the total 13,682,469 ambulatory care visits in 2005, one-stop visits occurred 144,132 times and involved 296,822 visits (2.2% of all visits) by 66,294 (6.6%) persons. The likelihood of this behavior increased with age, and the percentage reached 27.5% (5,662 in 20,579) in the age group ≥80 years. In general, women were more likely to have one-stop visits than men (7.2% vs. 6.0%). Internal medicine plus ophthalmology was the most frequent combination, occurring 3,552 times (2.5%), followed by cardiology plus neurology with 3,183 times (2.2%). The most frequent three-specialty combination, cardiology plus neurology plus gastroenterology, occurred only 111 times. CONCLUSIONS/SIGNIFICANCE: Without this computational technique, it would hardly be possible to analyze the extremely diverse combination patterns of specialties in one-stop visits. The results of the study could provide useful information either for hospital managers to set up integrated services or for policymakers to rebuild the health care system.
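    The mining step can be sketched with a minimal Apriori-style enumeration. The function and the toy visit data below are illustrative assumptions, not the study's software or claims data: each transaction is the set of specialties one person visited in one day, and itemsets kept at each level seed the candidates for the next.

```python
def frequent_itemsets(transactions, min_support):
    """Minimal Apriori-style search: count every combination of items
    appearing together in a transaction and keep those occurring at
    least min_support times. Candidate k-itemsets are built only from
    frequent (k-1)-itemsets, pruning the search space."""
    items = {i for t in transactions for i in t}
    frequent = {}
    current = [frozenset([i]) for i in items]
    k = 1
    while current:
        counts = {c: sum(1 for t in transactions if c <= t) for c in current}
        kept = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(kept)
        k += 1
        current = list({a | b for a in kept for b in kept if len(a | b) == k})
    return frequent

# Hypothetical one-stop visits (each set = one person's specialties that day).
visits = [{"internal medicine", "ophthalmology"},
          {"cardiology", "neurology"},
          {"internal medicine", "ophthalmology", "cardiology"},
          {"cardiology", "neurology", "gastroenterology"}]
patterns = frequent_itemsets(visits, min_support=2)
```

    The level-wise pruning is what makes the approach feasible at the scale of millions of claims: a specialty combination can only be frequent if all of its sub-combinations are.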

    Population based absolute and relative survival to 1 year of people with diabetes following a myocardial infarction: A cohort study using hospital admissions data

    Background: People with diabetes who experience an acute myocardial infarction (AMI) have a higher risk of death and recurrence of AMI. This study was commissioned by the Department for Transport to develop survival tables for people with diabetes following an AMI in order to inform vehicle licensing. Methods: A cohort study using data obtained from national hospital admission datasets for England and Wales was carried out, selecting all patients attending hospital with an AMI during 2003-2006 (inclusion criteria: aged 30+ years and hospital admission for MI, defined using ICD-10 codes I21-I22). STATA was used to create survival tables, and factors associated with survival were examined using Cox regression. Results: Of 157,142 people with an MI in England and Wales between 2003-2006, the relative risk of death or recurrence of MI for those with diabetes (n = 30,407) in the first 90 days was 1.3 (95% CI: 1.26-1.33) crude and 1.16 (95% CI: 1.1-1.2) when controlling for age, gender, heart failure and surgery for MI, compared with those without diabetes (n = 129,960). At 91-365 days post AMI the risk was 1.7 (95% CI: 1.6-1.8) crude and 1.50 (95% CI: 1.4-1.6) adjusted. The relative risk of death or re-infarction was higher at younger ages for those with diabetes and directly after the AMI (relative risk 62.1 for those with diabetes and 28.2 for those without diabetes aged 40-49, compared with population risk). Conclusions: This is the first study to provide population-based tables of age-stratified risk of re-infarction or death for people with diabetes compared with those without diabetes. These tables can be used for giving advice to patients, developing a baseline against which to compare intervention studies, or developing licensing or health insurance guidelines.
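    The crude relative risk reported above is a simple ratio of event proportions. The counts in the example below are hypothetical, chosen only to reproduce a crude RR of 1.3; they are not the study's actual event counts.

```python
def relative_risk(events_exposed, n_exposed, events_unexposed, n_unexposed):
    """Crude relative risk: ratio of the event proportion in the exposed
    group (here, people with diabetes) to that in the unexposed group
    (people without diabetes)."""
    return (events_exposed / n_exposed) / (events_unexposed / n_unexposed)

# Hypothetical 90-day counts: 13% of the diabetes group and 10% of the
# non-diabetes group experience death or re-infarction.
rr = relative_risk(3900, 30000, 13000, 130000)
```

    Adjusting for age, gender, heart failure and surgery, as the study does, requires a regression model (here, Cox regression) rather than this single ratio, which is why the adjusted estimate (1.16) is lower than the crude one (1.3).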