Evaluation of the impact of 2 years of a dosing intervention on canine echinococcosis in the Alay Valley, Kyrgyzstan
Echinococcosis is a re-emerging zoonotic disease in Kyrgyzstan. In 2012, an echinococcosis control scheme was started that included dosing owned dogs in the Alay Valley, Kyrgyzstan, with praziquantel. Control programmes require large investments of money and resources, so it is important to evaluate how well they are meeting their targets. However, echinococcosis control schemes face problems including the remoteness and semi-nomadic customs of affected communities and a lack of resources. The same problems apply to control scheme evaluations, so quick and easy assessment tools are highly desirable. Lot quality assurance sampling was used to assess the impact of approximately 2 years of echinococcosis control in the Alay Valley. A pre-intervention coproELISA prevalence was established, and a 75% threshold for dosing compliance was set based on previous studies. Ten communities were visited in 2013 and 2014, with 18-21 dogs sampled per community, and questionnaires were administered to dog owners. After 21 months of control efforts, 8/10 communities showed evidence of reaching the 75% praziquantel dosing target, although only 3/10 showed evidence of a reduction in coproELISA prevalence. This is understandable, since years of sustained control are required to control echinococcosis effectively, and efforts in the Alay Valley should be, and are being, continued.
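The decision rule behind lot quality assurance sampling can be derived from the binomial distribution. The Python sketch below shows one way to compute a per-community acceptance threshold: the lot sizes (18-21 dogs) and the 75% coverage target come from the abstract, while the low-coverage benchmark (40%) and the error tolerances are illustrative assumptions rather than values used in the study.

```python
# A minimal LQAS sketch: find the smallest decision threshold d such that a
# lot of n sampled dogs is "accepted" (>= d reported dosed) with acceptable
# error rates. p_low, alpha, and beta are illustrative assumptions.
from scipy.stats import binom

def lqas_decision_threshold(n, p_target=0.75, p_low=0.40,
                            alpha=0.10, beta=0.10):
    """Return the smallest d with
       P(accept | true coverage = p_low)   <= alpha  (few bad lots pass)
       P(reject | true coverage = p_target) <= beta  (few good lots fail),
    or None if no d satisfies both."""
    for d in range(n + 1):
        p_accept_bad = 1 - binom.cdf(d - 1, n, p_low)   # bad lot passes
        p_reject_good = binom.cdf(d - 1, n, p_target)   # good lot fails
        if p_accept_bad <= alpha and p_reject_good <= beta:
            return d
    return None

for n in (18, 21):  # per-community sample sizes reported in the abstract
    print(n, lqas_decision_threshold(n))
```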
Risk factors for high anti-HHV-8 antibody titers (≥1:51,200) in black, HIV-1 negative South African cancer patients: a case-control study
Background: Infection with human herpesvirus 8 (HHV-8) is the necessary causal agent in the development of Kaposi's sarcoma (KS). Infection with HIV-1, male gender, and older age all increase risk for KS. However, the geographic distribution of HHV-8 and KS, both prior to the HIV/AIDS epidemic and with HIV/AIDS, suggests the presence of an additional co-factor in the development of KS.
Methods: Between January 1994 and October 1997, we interviewed 2576 black in-patients with cancer in Johannesburg and Soweto, South Africa. Blood was tested for antibodies against HIV-1 and HHV-8, and the study was restricted to 2191 HIV-1 negative patients. Antibodies against the latent nuclear antigen of HHV-8 encoded by orf73 were detected with an indirect immunofluorescence assay. We examined the relationship between high anti-HHV-8 antibody titers (≥1:51,200) and sociodemographic and behavioral factors using unconditional logistic regression models. Variables significant at the p = 0.10 level were included in the multivariate analysis.
Results: Of the 2191 HIV-1 negative patients who did not have Kaposi's sarcoma, 854 (39.0%) were positive for antibodies against HHV-8 according to the immunofluorescent assay. Among those seropositive for HHV-8, 530 (62.1%) had low titers (1:200), 227 (26.6%) had medium titers (1:51,200), and 97 (11.4%) had the highest titers (1:204,800). Among the 2191 HIV-1 negative patients, the prevalence of high anti-HHV-8 antibody titers (≥1:51,200) was independently associated with increasing age (p-trend = 0.04), being separated or divorced (p = 0.003), using wood, coal, or charcoal rather than electricity as cooking fuel 20 years ago (p = 0.02), and consuming traditional maize beer more than once a week (p = 0.02; p-trend for increasing consumption = 0.05), although this may be due to chance given the large number of predictors considered in this analysis.
Conclusions: Among HIV-negative subjects, patients with high anti-HHV-8 antibody titers are characterized by older age. Other associations that may be factors in the development of high anti-HHV-8 titers include exposure to poverty or a low socioeconomic status environment and consumption of traditional maize beer. The relationship between these variables and high anti-HHV-8 titers requires further prospective study.
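The two-stage modelling strategy described in the Methods (univariable screening at the p = 0.10 level, followed by a multivariable unconditional logistic regression) can be sketched as follows. The data frame and column names are hypothetical, and categorical predictors would need dummy coding first; this illustrates the strategy, not the authors' code.

```python
# A minimal sketch of univariable screening at p <= 0.10 followed by a
# joint multivariable logistic model. DataFrame and column names are
# hypothetical; df[outcome] is assumed to be coded 0/1.
import statsmodels.api as sm

def screen_then_fit(df, outcome, candidates, p_enter=0.10):
    """Fit a multivariable logit on candidates whose univariable
    Wald p-value is <= p_enter. Returns (fitted model, kept variables)."""
    kept = []
    y = df[outcome]
    for var in candidates:
        X = sm.add_constant(df[[var]])
        fit = sm.Logit(y, X).fit(disp=0)
        if fit.pvalues[var] <= p_enter:
            kept.append(var)
    X = sm.add_constant(df[kept])
    return sm.Logit(y, X).fit(disp=0), kept

# Hypothetical usage on a case-control dataset:
# model, selected = screen_then_fit(df, "high_titer",
#                                   ["age", "marital_status",
#                                    "cooking_fuel", "maize_beer"])
# print(model.summary())
```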
Cost of care associated with early sepsis (first 24 hours of ICU admission) in a United States medical center
Predicting overweight and obesity in young adulthood from childhood body-mass index: comparison of cutoffs derived from longitudinal and cross-sectional data
Background: Historically, cutoff points for childhood and adolescent overweight and obesity have been based on population-specific percentiles derived from cross-sectional data. To obtain cutoff points that might better predict overweight and obesity in young adulthood, we examined the association between childhood body-mass index (BMI) and young adulthood BMI status in a longitudinal cohort. Methods: In this study, we used data from the International Childhood Cardiovascular Cohort (i3C) Consortium (which included seven childhood cohorts from the USA, Australia, and Finland) to establish childhood overweight and obesity cutoff points that best predict BMI status at the age of 18 years. We included 3779 children who were followed up from 1970 onwards and had at least one childhood BMI measurement between ages 6 years and 17 years and a BMI measurement specifically at age 18 years. We used logistic regression to assess the association between BMI in childhood and young adulthood obesity. We used the area under the receiver operating characteristic curve (AUROC) to assess the ability of fitted models to discriminate between different BMI status groups in young adulthood. The cutoff points were then compared with those defined by the International Obesity Task Force (IOTF), which used cross-sectional data, and tested for sensitivity and specificity in a separate, independent, longitudinal sample (from the Special Turku Coronary Risk Factor Intervention Project [STRIP] study) with BMI measurements available from both childhood and adulthood. Findings: The cutoff points derived from the longitudinal i3C Consortium data were lower than the IOTF cutoff points. Consequently, a larger proportion of participants in the STRIP study was classified as overweight or obese when using the i3C cutoff points than when using the IOTF cutoff points. Especially for obesity, the i3C cutoff points were significantly better at identifying those who would become obese later in life. In the independent sample, the AUROC values for overweight ranged from 0·75 (95% CI 0·70–0·80) to 0·88 (0·84–0·93) for the i3C cutoff points, and the corresponding values for the IOTF cutoff points ranged from 0·69 (0·62–0·75) to 0·87 (0·82–0·92). For obesity, the AUROC values ranged from 0·84 (0·75–0·93) to 0·90 (0·82–0·98) for the i3C cutoff points and from 0·57 (0·49–0·66) to 0·76 (0·65–0·88) for the IOTF cutoff points. Interpretation: The childhood BMI cutoff points obtained from the i3C Consortium longitudinal data can better predict the risk of overweight and obesity in young adulthood than the currently used standards based on cross-sectional data. Such cutoff points should help to more accurately identify children at risk of adult overweight or obesity.
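One common way to derive such cutoff points from longitudinal data is to scan the ROC curve of childhood BMI against adult BMI status and pick the threshold that maximizes Youden's J. The abstract does not state the exact criterion the i3C analysis used, so the sketch below is illustrative only; the variable names are hypothetical.

```python
# A minimal sketch of ROC-based cutoff derivation, in the spirit of the i3C
# analysis. Youden's J (tpr - fpr) is one common selection criterion; it is
# an assumption here, not the study's stated rule.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

def best_cutoff(childhood_bmi, adult_obese):
    """childhood_bmi: BMI values at a given childhood age.
    adult_obese: 0/1 indicator of obesity at age 18.
    Returns (cutoff, AUROC)."""
    fpr, tpr, thresholds = roc_curve(adult_obese, childhood_bmi)
    j = tpr - fpr                      # Youden's J at each candidate cutoff
    return thresholds[np.argmax(j)], roc_auc_score(adult_obese, childhood_bmi)

# Hypothetical usage:
# cutoff, auc = best_cutoff(bmi_age10, obese_at_18)
```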
Effects on Smoking Cessation: Naltrexone Combined with a Cognitive Behavioral Treatment
A promising option in substance abuse treatment is the Community Reinforcement Approach (CRA). The opioid antagonist naltrexone (NTX) may work in combination with nicotine replacement therapy (NRT) to block the effects of smoking stimuli in abstinent smokers. Effects of doses lower than 50 mg/day have not been reported. A study was conducted in Amsterdam in 2000/2001 to explore the effects of the combination of NTX (25 or 50 mg/day), NRT, and CRA on craving and abstinence. In a randomized, open-label, 2 × 2 between-subjects design, 25 recovered spontaneous pneumothorax (SP) participants received 8 weeks of treatment. Due to side effects, only 3 participants were compliant in the 50-mg NTX condition. Craving declined significantly between each measurement, and there was a significant interaction between the decline in craving and craving measured at baseline. The abstinence rate in the CRA group was nearly double that in the non-psychosocial-therapy group (46% vs. 25%; NS) at 3 months follow-up after treatment.
Using data-driven rules to predict mortality in severe community acquired pneumonia
Prediction of patient-centered outcomes in hospitals is useful for performance benchmarking, resource allocation, and guidance regarding active treatment and withdrawal of care. Yet their use by clinicians is limited by the complexity of available tools and the amount of data required. We propose Disjunctive Normal Forms as a novel approach to predicting hospital and 90-day mortality from instance-based patient data, comprising demographic, genetic, and physiologic information, in a large cohort of patients admitted with severe community acquired pneumonia. We develop two algorithms to efficiently learn Disjunctive Normal Forms, which yield easy-to-interpret rules that explicitly map data to the outcome of interest. Disjunctive Normal Forms achieve higher prediction performance than a set of state-of-the-art machine learning models and unveil insights unavailable with standard methods. They constitute an intuitive set of prediction rules that could easily be implemented to predict outcomes and guide criteria-based clinical decision making and clinical trial execution, and are thus of greater practical usefulness than currently available prediction tools. The Java implementation of the tool, JavaDNF, will be publicly available. © 2014 Wu et al.
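To make the idea concrete, the sketch below shows how a learned Disjunctive Normal Form rule set can be represented and applied: the event is predicted if any AND-clause of threshold predicates holds. The clauses are invented placeholders, not rules from the paper, and the paper's own implementation (JavaDNF) is in Java rather than Python.

```python
# A minimal sketch of evaluating a DNF rule set: a patient is flagged if ANY
# clause (a conjunction of predicates) is fully satisfied. The rules below
# are hypothetical placeholders, not learned rules from the paper.
Patient = dict  # e.g. {"age": 71, "creatinine": 2.3, "sbp": 110}

def predict_dnf(dnf: list, patient: Patient) -> bool:
    """True (event predicted) if any clause has all predicates satisfied."""
    return any(all(pred(patient) for pred in clause) for clause in dnf)

# Hypothetical rule set: (age > 70 AND creatinine > 2.0) OR (sbp < 90)
rules = [
    [lambda p: p["age"] > 70, lambda p: p["creatinine"] > 2.0],
    [lambda p: p["sbp"] < 90],
]
print(predict_dnf(rules, {"age": 71, "creatinine": 2.3, "sbp": 110}))  # True
```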
Recalibration of the delirium prediction model for ICU patients (PRE-DELIRIC): a multinational observational study
Purpose
To recalibrate the existing delirium prediction model (PRE-DELIRIC) for intensive care patients and to determine its discriminative power internationally.
Methods
A prospective multicenter cohort study was performed in eight intensive care units (ICUs) in six countries. The ten predictors (age, APACHE-II score, urgent admission, admission category, infection, coma, sedation, morphine use, urea level, and metabolic acidosis) were collected within 24 h after ICU admission. The Confusion Assessment Method for the Intensive Care Unit (CAM-ICU) was used to identify ICU delirium. CAM-ICU screening compliance and inter-rater reliability measurements were used to secure the quality of the data.
Results
A total of 2,852 adult ICU patients were screened, of whom 1,824 (64%) were eligible for the study. The main reasons for exclusion were length of stay <1 day (19.1%) and sustained coma (4.1%). Mean (SD) CAM-ICU compliance was 82 ± 16% and inter-rater reliability was 0.87 ± 0.17. The median delirium incidence was 22.5% (IQR 12.8–36.6%). Although the incidence of all ten predictors differed significantly between centers, the area under the receiver operating characteristic curve (AUROC) across the eight participating centers remained good: 0.77 (95% CI 0.74–0.79). The linear predictor and intercept of the prediction rule were adjusted, which resulted in improved recalibration of the PRE-DELIRIC model.
Conclusions
In this multinational study, we recalibrated the PRE-DELIRIC model. Despite differences in the incidence of predictors between the centers in the different countries, the performance of the PRE-DELIRIC model remained good. Following validation, the PRE-DELIRIC model may facilitate implementation of strategies to prevent delirium and aid improvements in the delirium management of ICU patients.
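The adjustment described in the Results corresponds to standard logistic recalibration: keep the original model's linear predictor and refit only an intercept and slope in the new cohort. A minimal sketch follows, with hypothetical variable names and no claim to reproduce the published PRE-DELIRIC code.

```python
# A minimal sketch of logistic recalibration: the original model supplies a
# linear predictor (LP) per patient; only an intercept and slope are refit
# on the new cohort. Inputs are hypothetical arrays.
import numpy as np
import statsmodels.api as sm

def recalibrate(lp_new_cohort, delirium_observed):
    """lp_new_cohort: original-model linear predictor for each new patient.
    delirium_observed: 0/1 outcomes. Returns (intercept, slope)."""
    X = sm.add_constant(np.asarray(lp_new_cohort))
    fit = sm.Logit(np.asarray(delirium_observed), X).fit(disp=0)
    return fit.params[0], fit.params[1]

def predict_recalibrated(lp, intercept, slope):
    # Recalibrated delirium probability via the logistic function.
    return 1.0 / (1.0 + np.exp(-(intercept + slope * lp)))
```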
A process for developing a sustainable and scalable approach to community engagement: community dialogue approach for addressing the drivers of antibiotic resistance in Bangladesh
BACKGROUND: Community engagement approaches that have impacted on health outcomes are often time-intensive and small-scale and require high levels of financial and human resources. They can be difficult to sustain and scale up in low-resource settings. Given the reach of health services into communities in low-income countries, the health system provides a valuable and potentially sustainable entry point that would allow for scale-up of community engagement interventions. This study explores the process of developing an embedded approach to community engagement, taking the global challenge of antibiotic resistance as an example. METHODS: The intervention was developed using a sequential mixed-methods study design. This consisted of: a) exploring the evidence base through an umbrella review and identifying key international standards on the appropriate use of antibiotics; b) undertaking detailed formative research, comprising a qualitative study to explore the most appropriate mechanisms through which to embed the intervention within the existing health system and community infrastructure and to understand patterns of knowledge, attitudes, and practice regarding antibiotics and antibiotic resistance, and a household survey - which drew on the qualitative findings - to quantify knowledge and reported attitudes and practice regarding antibiotics and antibiotic resistance within the target population; and c) drawing on appropriate theories of change mechanisms and experience of implementing community engagement interventions to co-produce the intervention processes and materials with key stakeholders at policy, health system, and community levels. RESULTS: A community engagement intervention was co-produced and was explicitly designed to link into existing health system and community structures and to be appropriate for the cultural context, and therefore to have the potential to be implemented at scale. We anticipate that taking this approach increases local ownership, as well as the likelihood that the intervention will be sustainable and scalable. CONCLUSIONS: This study demonstrates the value of ensuring that a range of stakeholders co-produce the intervention and that the intervention is designed to be appropriate for the health system, community, and cultural context.
Stratification of the severity of critically ill patients with classification trees
Background: We developed three classification trees (CTs) based on the CART (Classification and Regression Trees), CHAID (Chi-Square Automatic Interaction Detection), and C4.5 methodologies for calculating the probability of hospital mortality, and compared the results with the APACHE II, SAPS II, and MPM II-24 scores and with a model based on multiple logistic regression (LR).
Methods: Retrospective study of 2864 patients, randomly partitioned (70:30) into a Development Set (DS, n = 1808) and a Validation Set (VS, n = 808). Discrimination was compared using the area under the ROC curve (AUC, 95% CI) and the percentage of correct classification (PCC, 95% CI); calibration was compared using the calibration curve and the Standardized Mortality Ratio (SMR, 95% CI).
Results: The CTs were produced with different selections of variables and decision rules: CART (5 variables and 8 decision rules), CHAID (7 variables and 15 rules), and C4.5 (6 variables and 10 rules). The common variables were inotropic therapy, Glasgow score, age, (A-a)O2 gradient, and antecedent of chronic illness. In the VS, all the models achieved acceptable discrimination, with AUC above 0.7. CTs: CART 0.75 (0.71-0.81), CHAID 0.76 (0.72-0.79), and C4.5 0.76 (0.73-0.80). PCC: CART 72 (69-75), CHAID 72 (69-75), and C4.5 76 (73-79). Calibration (SMR) was better in the CTs: CART 1.04 (0.95-1.31), CHAID 1.06 (0.97-1.15), and C4.5 1.08 (0.98-1.16).
Conclusion: Different CT methodologies generate trees with different selections of variables and decision rules. The CTs are easy to interpret and stratify the risk of hospital mortality. CTs should be taken into account for classifying the prognosis of critically ill patients.
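For the CART arm of such a comparison, a minimal sketch using scikit-learn (whose DecisionTreeClassifier implements a CART variant; CHAID and C4.5 would require other libraries) might look as follows. The 70:30 split mirrors the abstract; the data, feature set, and depth limit are hypothetical.

```python
# A minimal sketch: fit a depth-limited CART-style tree on a 70:30 split and
# report the validation AUC and standardized mortality ratio (observed /
# expected deaths). Data and the depth limit are hypothetical assumptions.
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import roc_auc_score

def cart_mortality(X, y, max_depth=4, seed=0):
    X_dev, X_val, y_dev, y_val = train_test_split(
        X, y, test_size=0.30, random_state=seed)       # 70:30 partition
    tree = DecisionTreeClassifier(max_depth=max_depth,
                                  random_state=seed).fit(X_dev, y_dev)
    p_val = tree.predict_proba(X_val)[:, 1]            # predicted mortality
    auc = roc_auc_score(y_val, p_val)
    smr = y_val.sum() / p_val.sum()                    # observed / expected
    return tree, auc, smr
```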
