42 research outputs found

    Using classification and regression tree modelling to investigate response shift patterns in dentine hypersensitivity

    BACKGROUND: Dentine hypersensitivity (DH) affects people's quality of life (QoL). However, changes in the internal meaning of QoL, known as response shift (RS), may undermine longitudinal assessment of QoL. This study aimed to describe patterns of RS in people with DH using Classification and Regression Trees (CRT) and to explore the convergent validity of CRT with the then-test and ideals approaches. METHODS: Data from an 8-week clinical trial of mouthwashes for dentine hypersensitivity (n = 75), using the Dentine Hypersensitivity Experience Questionnaire (DHEQ) as the outcome measure, were analysed. CRT was used to examine 8-week change in DHEQ total score as the dependent variable, with clinical status for DH and each DHEQ subscale score (restrictions, coping, social, emotional and identity) as independent variables. Recalibration was inferred when clinical change was not consistent with the DHEQ change score, using a minimally important difference for the DHEQ of 22 points. Reprioritization was inferred from changes in the relative importance of each subscale to the model over time. RESULTS: Overall, 50.7% of participants experienced a clinical improvement in their DH after treatment and 22.7% experienced an important improvement in their quality of life. Thirty-six per cent shifted their internal standards downward and 14.7% upward, suggesting recalibration. Reprioritization occurred over time among the social and emotional impacts of DH. CONCLUSIONS: CRT was a useful method to reveal both the types and the nature of RS in people with a mild health condition, and demonstrated convergent validity with design-based approaches to detecting RS.
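
    The CRT method named in this abstract lends itself to a brief illustration. Below is a minimal sketch, assuming scikit-learn and entirely synthetic data (the variable names, distributions and the direction of DHEQ scoring are invented here, not taken from the trial), of how a regression tree can surface the two response-shift signals described: subscale feature importances for reprioritization, and disagreement between clinical change and a DHEQ change beyond the 22-point minimally important difference for recalibration.

    ```python
    # Hedged sketch of the CRT idea with synthetic data (not the trial's data).
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(0)
    n = 75  # trial size reported in the abstract

    # Hypothetical baseline DHEQ subscale scores (restrictions, coping,
    # social, emotional, identity) plus a binary clinical-status flag.
    X = np.column_stack([rng.normal(50, 10, n) for _ in range(5)] +
                        [rng.integers(0, 2, n)])
    y = rng.normal(-15, 20, n)  # synthetic 8-week change in DHEQ total score

    tree = DecisionTreeRegressor(max_depth=3, min_samples_leaf=10,
                                 random_state=0).fit(X, y)

    # Reprioritization signal: relative importance of each predictor.
    names = ["restrictions", "coping", "social", "emotional", "identity",
             "clinical"]
    for name, imp in zip(names, tree.feature_importances_):
        print(f"{name:12s} importance = {imp:.2f}")

    # Recalibration signal: participants whose QoL change disagrees with
    # clinical change, using the DHEQ minimally important difference of 22.
    MID = 22
    clinically_improved = X[:, 5] == 1
    qol_improved = y <= -MID  # assumes lower DHEQ = fewer impacts (assumption)
    recalibrated = clinically_improved != qol_improved
    print(f"clinical/QoL disagreement (possible recalibration): "
          f"{recalibrated.mean():.0%}")
    ```

    Fitting such a tree at baseline and again at week 8, then comparing the importance rankings, is one way to operationalize "changes in the relative importance of each subscale over time".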

    Quantitative PCR reveals strong spatial and temporal variation of the wasting disease pathogen, Labyrinthula zosterae in northern European eelgrass (Zostera marina) beds

    Seagrasses are foundation species of functionally important coastal ecosystems worldwide. The world's largest losses of the widespread seagrass Zostera marina (eelgrass) have been reported as a consequence of wasting disease, an infection with the endophytic protist Labyrinthula zosterae. During one of the most extended epidemics in the marine realm, ~90% of eastern and western Atlantic eelgrass beds died off between 1932 and 1934. Today, small outbreaks continue to be reported, but the current extent of L. zosterae in European meadows is completely unknown. In this study, we quantify the abundance and prevalence of the wasting disease pathogen among 19 Z. marina populations in northern European coastal waters, using quantitative PCR (QPCR) with primers targeting a species-specific portion of the internal transcribed spacer (ITS1) of L. zosterae. Spatially, we found marked variation among sites, with abundances varying between 0 and 126 cells mg⁻¹ Z. marina dry weight (mean 5.7 ± 1.9 SE) and prevalences ranging from 0 to 88.9%. Temporally, abundances varied between 0 and 271 cells mg⁻¹ Z. marina dry weight (mean 8.5 ± 2.6 SE), while prevalences ranged from zero in winter and early spring to 96% in summer. Field concentrations assessed via bulk DNA extraction and subsequent QPCR correlated well with prevalence data estimated via isolation and cultivation from live plant tissue. L. zosterae was detectable not only in black lesions, a sign of Labyrinthula-induced necrosis, but also in green, apparently healthy tissue. We conclude that L. zosterae infection is common in (northern) European eelgrass populations (84% of populations infected), with the highest abundances during the summer months. In the light of global climate change and an increasing rate of marine diseases, our data provide a baseline for further studies on the causes of pathogenic outbreaks of L. zosterae.
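
    The "cells mg⁻¹ dry weight" figures above rest on absolute quantification against a standard curve, the generic QPCR calculation sketched below. This is an illustration only, assuming numpy; the Cq values, standard cell counts and sample weights are invented, and the paper's actual primers, standards and normalization are not reproduced.

    ```python
    # Minimal sketch of absolute quantification by qPCR standard curve.
    # All Cq values, cell counts and tissue weights are invented.
    import numpy as np

    # Standards: known L. zosterae cell numbers and their measured Cq values.
    std_cells = np.array([1e1, 1e2, 1e3, 1e4, 1e5])
    std_cq = np.array([33.1, 29.8, 26.4, 23.0, 19.7])

    # Fit Cq = slope * log10(cells) + intercept.
    slope, intercept = np.polyfit(np.log10(std_cells), std_cq, 1)
    efficiency = 10 ** (-1 / slope) - 1
    print(f"slope = {slope:.2f}, amplification efficiency = {efficiency:.1%}")

    # Unknown field samples: measured Cq and tissue dry weight (mg).
    sample_cq = np.array([27.5, 31.2, 24.9])
    dry_weight_mg = np.array([12.0, 9.5, 15.3])

    # Invert the curve, then normalize to cells per mg dry weight.
    cells = 10 ** ((sample_cq - intercept) / slope)
    print("cells per mg dry weight:", np.round(cells / dry_weight_mg, 1))
    ```

    The slope of the standard curve also yields the amplification efficiency; a slope near -3.32 (efficiency near 100%) indicates clean doubling of template per cycle.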

    Dynamics and Control of Diseases in Networks with Community Structure

    The dynamics of infectious diseases spread via direct person-to-person transmission (such as influenza, smallpox, HIV/AIDS, etc.) depends on the underlying host contact network. Human contact networks exhibit strong community structure. Understanding how such community structure affects epidemics may provide insights for preventing the spread of disease between communities by changing the structure of the contact network through pharmaceutical or non-pharmaceutical interventions. We use empirical and simulated networks to investigate the spread of disease in networks with community structure. We find that community structure has a major impact on disease dynamics, and we show that in networks with strong community structure, immunization interventions targeted at individuals bridging communities are more effective than those simply targeting highly connected individuals. Because the structure of relevant contact networks is generally not known, and vaccine supply is often limited, there is great need for efficient vaccination algorithms that do not require full knowledge of the network. We developed an algorithm that acts only on locally available network information and is able to quickly identify targets for successful immunization intervention. The algorithm generally outperforms existing algorithms when vaccine supply is limited, particularly in networks with strong community structure. Understanding the spread of infectious diseases and designing optimal control strategies is a major goal of public health. Social networks show marked patterns of community structure, and our results, based on empirical and simulated data, demonstrate that community structure strongly affects disease dynamics. These results have implications for the design of control strategies.
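
    A toy experiment can make the central claim concrete: with strong community structure, a few vaccine doses spent on bridging nodes contain outbreaks better than the same doses spent on hubs. The sketch below, assuming networkx and invented parameters (the planted-partition graph, transmission probability, dose count and the use of betweenness as a bridge proxy are all choices made here, not the paper's local algorithm), compares the two targeting rules with a crude discrete-time SIR simulation.

    ```python
    # Hedged sketch: degree-targeted vs bridge-targeted immunization on a
    # toy two-community network. Illustration only; parameters invented.
    import random
    import networkx as nx

    random.seed(1)
    # Two communities of 50 nodes, dense inside, a handful of cross edges.
    g = nx.planted_partition_graph(2, 50, p_in=0.2, p_out=0.004, seed=1)

    def sir_final_size(graph, vaccinated, beta=0.1, trials=200):
        """Mean final outbreak size of a discrete-time SIR epidemic."""
        unvaccinated = list(set(graph) - vaccinated)
        sizes = []
        for _ in range(trials):
            seed_node = random.choice(unvaccinated)
            susceptible = set(unvaccinated) - {seed_node}
            infected, recovered = {seed_node}, set()
            while infected:
                new = set()
                for node in infected:
                    for nb in graph.neighbors(node):
                        if nb in susceptible and random.random() < beta:
                            new.add(nb)
                susceptible -= new
                recovered |= infected  # infectious for one step, then recover
                infected = new
            sizes.append(len(recovered))
        return sum(sizes) / trials

    doses = 5  # limited vaccine supply
    by_degree = set(sorted(g, key=g.degree, reverse=True)[:doses])
    btw = nx.betweenness_centrality(g)
    by_bridge = set(sorted(g, key=btw.get, reverse=True)[:doses])

    print("mean outbreak size, degree-targeted:", sir_final_size(g, by_degree))
    print("mean outbreak size, bridge-targeted:", sir_final_size(g, by_bridge))
    ```

    On graphs like this one, the few cross-community edges funnel through the high-betweenness nodes, so vaccinating them tends to trap the epidemic inside the seeded community; a purely local algorithm, as the abstract notes, must approximate this effect without computing global betweenness.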

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) compared with that in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
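
    The analysis pattern named in the Methods, multivariable logistic regression with a bootstrap, is standard and easy to sketch. The following is a minimal illustration assuming statsmodels and pandas; the dataset, covariates (age and an ASA-grade flag) and effect sizes are all invented here, not the study's.

    ```python
    # Hedged sketch: adjusted odds ratio for checklist use with a bootstrap
    # confidence interval. Synthetic data; not the study's model or results.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(42)
    n = 2000
    df = pd.DataFrame({
        "checklist": rng.integers(0, 2, n),
        "age": rng.normal(60, 15, n).round(),
        "asa_high": rng.integers(0, 2, n),
    })
    # Synthetic outcome: checklist use lowers the odds of death (assumption).
    logit = -3 + 0.03 * (df.age - 60) + 0.8 * df.asa_high - 0.5 * df.checklist
    df["died_30d"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

    fit = smf.logit("died_30d ~ checklist + age + asa_high", data=df).fit(disp=0)
    print("adjusted OR, checklist:", np.exp(fit.params["checklist"]).round(2))

    # Bootstrap the OR by resampling whole patients with replacement.
    ors = []
    for _ in range(200):
        boot = df.sample(n, replace=True)
        f = smf.logit("died_30d ~ checklist + age + asa_high",
                      data=boot).fit(disp=0)
        ors.append(np.exp(f.params["checklist"]))
    low, high = np.percentile(ors, [2.5, 97.5])
    print(f"bootstrap 95% CI: {low:.2f} to {high:.2f}")
    ```

    Refitting on each resample yields a distribution for the checklist odds ratio, from which percentile confidence limits are read off directly.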

    Global variation in anastomosis and end colostomy formation following left-sided colorectal resection

    Background End colostomy rates following colorectal resection vary across institutions in high-income settings, being influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection. Methods This study comprised an analysis of the GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model. Results In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6.9 per cent) from low-HDI, 254 (15.5 per cent) from middle-HDI and 1268 (77.6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57.5, 40.9 and 35.4 per cent; P < 0.001) and subsequent use of end colostomy (52.2, 24.8 and 18.9 per cent; P < 0.001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3.20, 95 per cent c.i. 1.35 to 7.57; P = 0.008) after risk adjustment for malignant disease (OR 2.34, 1.65 to 3.32; P < 0.001), emergency surgery (OR 4.08, 2.73 to 6.10; P < 0.001), time to operation at least 48 h (OR 1.99, 1.28 to 3.09; P = 0.002) and disease perforation (OR 4.00, 2.81 to 5.69; P < 0.001). Conclusion Global differences existed in the proportion of patients receiving end stomas after left-sided colorectal resection based on income, which went beyond case mix alone.
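
    The multilevel aspect here, patients nested within hospitals and countries, is what distinguishes this model from the ordinary logistic regression sketched earlier. As one possible rendering, the sketch below fits a random-intercept-per-hospital logistic model using statsmodels' BinomialBayesMixedGLM on invented data; the GlobalSurg covariates, groupings and estimates are not reproduced.

    ```python
    # Hedged sketch: random-intercept (per-hospital) logistic regression for
    # end colostomy vs primary anastomosis. Synthetic data and effect sizes.
    import numpy as np
    import pandas as pd
    from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

    rng = np.random.default_rng(7)
    n_hosp, per_hosp = 40, 25
    hosp = np.repeat(np.arange(n_hosp), per_hosp)
    hosp_effect = rng.normal(0, 0.7, n_hosp)[hosp]  # between-hospital variation

    df = pd.DataFrame({
        "hospital": hosp,
        "emergency": rng.integers(0, 2, n_hosp * per_hosp),
        "perforation": rng.integers(0, 2, n_hosp * per_hosp),
    })
    logit = -1.5 + 1.4 * df.emergency + 1.2 * df.perforation + hosp_effect
    df["end_colostomy"] = (rng.random(len(df))
                           < 1 / (1 + np.exp(-logit))).astype(int)

    model = BinomialBayesMixedGLM.from_formula(
        "end_colostomy ~ emergency + perforation",
        {"hospital": "0 + C(hospital)"},  # random intercept per hospital
        df,
    )
    result = model.fit_vb()  # variational Bayes fit
    print(result.summary())
    print("OR for emergency surgery ~", np.exp(result.fe_mean[1]).round(2))
    ```

    A fuller analogue of the study's model would add a second variance component for country and the remaining adjustment covariates (malignancy, time to operation, perforation).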