
    Establishment of Valid Laboratory Case Definition for Human Leptospirosis

    Laboratory case definitions of leptospirosis are rarely underpinned by a solid evaluation that determines the cut-off values of the tests applied. This study describes the process of determining optimal cut-off titers of laboratory tests for a valid case definition of leptospirosis. The tests are the microscopic agglutination test (MAT) and an in-house IgM enzyme-linked immunosorbent assay (ELISA), both on single and paired serum samples, using a positive culture as the reference test in the Dutch population. Specificity was assessed using panels of sera from healthy donors, cases with known other diseases, and non-leptospirosis cases with symptoms compatible with leptospirosis. Cases were divided into three periods corresponding to the acute phase (1-10 days post onset of illness (DPO)), the early convalescent phase (11-20 DPO) and the late convalescent phase (>20 DPO). Cut-off titers for MAT and IgM ELISA were determined as 1:160 and 1:80, respectively, for all three periods. These cut-off titers combined 100% specificity with a sensitivity that varied with the stage of disease for both tests. The low sensitivities in the early acute phase are consistent with the dynamics of the humoral immune response. IgM ELISA yielded higher sensitivities than MAT in the acute and early convalescent stages. Moreover, the optimal sensitivity of MAT, the gold standard, was <82%, implying that a substantial proportion of cases worldwide is missed by this recommended test. MAT and IgM ELISA proved partly complementary, resulting in a higher sensitivity when the results of the two tests were combined. The availability of paired samples and of adequate clinical and epidemiological data are other parameters that
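The cut-off selection described in this abstract (the lowest titer yielding 100% specificity on the control panels, then reading off the resulting sensitivity among culture-confirmed cases) can be sketched as follows. The titer panels and the helper `pick_cutoff` are illustrative assumptions, not the study's data or code:

```python
def pick_cutoff(case_titers, control_titers, candidate_cutoffs):
    """Return the lowest candidate cut-off at which no control is positive
    (100% specificity), with the sensitivity achieved among cases at that cut-off."""
    for c in sorted(candidate_cutoffs):
        specificity = sum(t < c for t in control_titers) / len(control_titers)
        if specificity == 1.0:
            sensitivity = sum(t >= c for t in case_titers) / len(case_titers)
            return c, sensitivity
    return None, 0.0

# Illustrative reciprocal titers (a 1:160 titer is stored as 160, etc.)
cases = [40, 80, 160, 320, 640, 1280, 160, 320]
controls = [20, 20, 40, 40, 80, 80, 40, 20]
cutoff, sens = pick_cutoff(cases, controls, [20, 40, 80, 160, 320])
print(cutoff, sens)  # prints: 160 0.75
```

With these made-up panels, 1:160 is the first cut-off clearing all controls, echoing the MAT cut-off reported above; sensitivity is whatever fraction of cases remains above it.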

    Systematic review and meta-analysis: rapid diagnostic tests versus placental histology, microscopy and PCR for malaria in pregnant women

    Background: During pregnancy, malaria infection with Plasmodium falciparum or Plasmodium vivax is related to adverse maternal health and poor birth outcomes. Diagnosis of malaria during pregnancy is complicated by the absence of parasites, or low parasite densities, in peripheral blood. Diagnostic methods other than microscopy are needed for the detection of placental malaria. Therefore, the diagnostic accuracy of rapid diagnostic tests (RDTs), detecting antigen, and molecular techniques (PCR), detecting DNA, for the diagnosis of Plasmodium infections in pregnancy was systematically reviewed. Methods: MEDLINE, EMBASE and Web of Science were searched for studies assessing the diagnostic accuracy of RDTs, PCR, microscopy of peripheral and placental blood, and placental histology for the detection of malaria infection (all species) in pregnant women. Results: The results of 49 studies were analysed in metandi (Stata), the majority of which described P. falciparum infections. Although neither placental nor peripheral blood microscopy can reliably replace histology as a reference standard for placental P. falciparum infection, many studies compared RDTs and PCR to these tests. The proportion of microscopy positives in placental blood detected by peripheral blood microscopy, RDTs and PCR (sensitivity) was 72% [95% CI 62-80], 81% [95% CI 55-93] and 94% [95% CI 86-98], respectively. The proportion of placental blood microscopy negative women who were also negative by peripheral blood microscopy, RDTs and PCR (specificity) was 98% [95% CI 95-99], 94% [95% CI 76-99] and 77% [95% CI 71-82], respectively. Based on the current data, it was not possible to determine whether the false positives in RDTs and PCR are caused by sequestered parasites in the placenta that are not detected by placental microscopy. Conclusion: The findings suggest that RDTs and PCR may have good enough performance characteristics to serve as alternatives for the diagnosis of malaria in pregnancy, notwithstanding other limitations and practical considerations concerning the use of these tests. Nevertheless, more studies with placental histology as the reference test are urgently required to reliably determine the accuracy of RDTs and PCR for the diagnosis of placental malaria. P. vivax infections have been neglected in diagnostic test accuracy studies of malaria in pregnancy.
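The sensitivity and specificity figures above are proportions with 95% confidence intervals. A minimal sketch of how such estimates come out of a 2x2 comparison against the reference standard, using a Wilson score interval; the counts are hypothetical (the study itself fitted a bivariate model with metandi in Stata):

```python
import math

def wilson_ci(k, n, z=1.96):
    """95% Wilson score interval for the proportion k/n."""
    p = k / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return center - half, center + half

def accuracy(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN) and specificity = TN/(TN+FP), each with a Wilson 95% CI."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return sens, wilson_ci(tp, tp + fn), spec, wilson_ci(tn, tn + fp)

# Hypothetical counts loosely mirroring the peripheral-microscopy comparison above.
sens, sens_ci, spec, spec_ci = accuracy(tp=72, fn=28, tn=98, fp=2)
```

The Wilson interval is preferred over the simple Wald interval for proportions near 0 or 1, which is exactly the regime of high-specificity diagnostic tests.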

    Long-term prevalence of post-traumatic stress disorder symptoms in patients after secondary peritonitis

    INTRODUCTION: The aim of this study was to determine the long-term prevalence of post-traumatic stress disorder (PTSD) symptomology in patients following secondary peritonitis and to determine whether the prevalence of PTSD-related symptoms differed between patients admitted to the intensive care unit (ICU) and patients admitted only to the surgical ward. METHOD: A retrospective cohort of consecutive patients treated for secondary peritonitis was sent a postal survey containing a self-report questionnaire, the Post-traumatic Stress Syndrome 10-question inventory (PTSS-10). From a database of 278 patients undergoing surgery for secondary peritonitis between 1994 and 2000, 131 patients were long-term survivors (follow-up of at least four years) and eligible for inclusion in our study, conducted at a tertiary referral hospital in Amsterdam, The Netherlands. RESULTS: The response rate was 86%, yielding a cohort of 100 patients; 61% of these patients had been admitted to the ICU. PTSD-related symptoms were found in 24% (95% confidence interval 17% to 33%) of patients when a PTSS-10 score of 35 was chosen as the cutoff, whereas the prevalence of PTSD symptomology when borderline patients scoring 27 points or more were included was 38% (95% confidence interval 29% to 48%). In a multivariate analysis controlling for age, sex, Acute Physiology and Chronic Health Evaluation II (APACHE II) score, number of relaparotomies and length of hospital stay, the likelihood of ICU-admitted patients having PTSD symptomology was 4.3 times higher (95% confidence interval 1.11 to 16.5) than for patients not admitted to the ICU, using a PTSS-10 cutoff of 35 or greater. Older patients and males were less likely to report PTSD symptoms. CONCLUSION: Nearly a quarter of patients receiving surgical treatment for secondary peritonitis developed PTSD symptoms. Patients admitted to the ICU were at significantly greater risk of PTSD symptoms after adjusting for baseline differences, in particular age.

    Prospective evaluation of three rapid diagnostic tests for diagnosis of human leptospirosis.

    Diagnosis of leptospirosis by the microscopic agglutination test (MAT) or by culture is confined to specialized laboratories. Although ELISA techniques are more common, they still require laboratory facilities. Rapid diagnostic tests (RDTs) can be used for easy point-of-care diagnosis. This study prospectively evaluated the diagnostic performance of the RDTs LeptoTek Dri Dot, LeptoTek Lateral Flow, and Leptocheck-WB. From 2001 to 2012, one or two of the RDTs were applied simultaneously, prior to routine diagnostics (MAT, ELISA and culture), to serum specimens submitted for leptospirosis diagnosis. The case definition was based on MAT, ELISA and culture results. Participants not fulfilling the case definition were considered not to have leptospirosis. Diagnostic accuracy was determined based on the first submitted sample and on paired samples, both overall and stratified by days post onset of illness. The overall sensitivity and specificity were 75% and 96% for the LeptoTek Dri Dot, 78% and 95% for the LeptoTek Lateral Flow, and 78% and 98% for the Leptocheck-WB. Based on the first submitted sample the sensitivity was low (51% for LeptoTek Dri Dot, 69% for LeptoTek Lateral Flow, and 55% for Leptocheck-WB), but it increased substantially when the results of paired samples were combined, although at the cost of a lower specificity (sensitivity and specificity 82% and 91% for LeptoTek Dri Dot, 86% and 84% for LeptoTek Lateral Flow, and 80% and 93% for Leptocheck-WB). All three are antibody tests that contribute to the diagnosis of leptospirosis, supporting clinical suspicion and raising awareness. Since the overall sensitivity of the tested RDTs did not exceed 80%, one should be cautious about relying on an RDT result alone, and confirmation by reference tests is strongly recommended.
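The gain in sensitivity (and loss of specificity) from combining paired samples follows directly from an "either sample positive" decision rule. A toy sketch with made-up RDT results, not the study's data:

```python
def sens_spec(preds, truths):
    """Sensitivity and specificity of binary predictions against binary truth (1/0)."""
    tp = sum(p and t for p, t in zip(preds, truths))
    tn = sum((not p) and (not t) for p, t in zip(preds, truths))
    pos = sum(truths)
    neg = len(truths) - pos
    return tp / pos, tn / neg

# Made-up results: 1 = positive. The second (convalescent) sample catches
# seroconversions the first missed, but also adds one false positive.
truth  = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]
first  = [1, 1, 0, 0, 0, 0, 0, 0, 0, 0]
second = [0, 1, 1, 1, 0, 1, 0, 0, 0, 0]
paired = [a or b for a, b in zip(first, second)]

# Sensitivity rises from 0.4 (first) and 0.6 (second) to 0.8 (paired),
# while specificity drops from 1.0 (first) to 0.8 (paired).
```

This mirrors the pattern in the abstract: a positive on either of two serial samples catches late seroconverters, at the price of accumulating the false positives of both tests.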

    Health related quality of life six months following surgical treatment for secondary peritonitis – using the EQ-5D questionnaire

    Background: To compare health-related quality of life (HR-QoL) in patients surgically treated for secondary peritonitis with that of a healthy population, and to prospectively identify factors associated with poorer (lower) HR-QoL. Design: A prospective cohort of secondary peritonitis patients was mailed the EQ-5D and EQ-VAS six months after the initial laparotomy. Setting: Multicenter study in two academic and seven regional teaching hospitals. Patients: 130 of the 155 eligible patients (84%) responded to the HR-QoL questionnaires. Results: HR-QoL was significantly worse on all dimensions in peritonitis patients than in a healthy reference population. Peritonitis characteristics at initial presentation were not associated with HR-QoL at six months. A more complicated course of disease leading to longer hospitalization, and the presence of an enterostomy, negatively affected mobility (p = 0.02), self-care (p < 0.001) and daily activities (p = 0.01). In a multivariate analysis of the EQ-VAS, every doubling of hospital stay decreased the EQ-VAS by 3.8 points (p = 0.015). Morbidity during the six-month follow-up was not found to be predictive of the EQ-5D or EQ-VAS. Conclusion: Six months after initial surgery, patients with secondary peritonitis report more problems in HR-QoL than a healthy reference population. Unfavorable disease characteristics at initial presentation were not predictive of poorer HR-QoL; a more complicated course of disease was most predictive of HR-QoL at six months.
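The reported EQ-VAS effect, 3.8 points per doubling of hospital stay, implies a linear model in log2 of the length of stay. A small illustrative reconstruction; the coefficient is taken from the abstract, but the function and example stays are assumptions:

```python
import math

def eqvas_difference(los_a_days, los_b_days, per_doubling=-3.8):
    """Predicted EQ-VAS change going from a stay of los_a_days to one of
    los_b_days, assuming a linear effect of log2(length of stay)."""
    return per_doubling * math.log2(los_b_days / los_a_days)

# Two doublings (7 -> 28 days) predict an EQ-VAS 7.6 points lower.
```

The log2 parameterization is what makes "per doubling" a constant effect regardless of the baseline length of stay.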

    Reliability, Validity and Responsiveness of the Syncope Functional Status Questionnaire

    BACKGROUND: Patients with transient loss of consciousness (TLOC) have poor health-related quality of life (HR-QoL). OBJECTIVE: To test the reliability, validity, and responsiveness of the disease-specific Syncope Functional Status HR-QoL Questionnaire (SFSQ), which yields two summary scales: an impairment score (IS) and a fear-worry score (FWS). DESIGN: Cohort study. PARTICIPANTS: 503 adult patients presenting with TLOC. MEASUREMENTS: HR-QoL was assessed using the SFSQ and the Short Form-36 (SF-36) after presentation and 1 year later. To test reliability, score distributions, internal consistency, and test-retest reliability were assessed. To assess validity, scores on the SFSQ and the SF-36 were compared. Clinical validity was tested using known-group comparison. Responsiveness was assessed by comparing changes in SFSQ scores with changes in health status and clinical condition. RESULTS: The response rate was 82% at baseline and 72% at 1-year follow-up. For all scales the full range of scores was seen. Score distributions were asymmetrical. Internal consistency was high (alpha = 0.88 for IS, 0.92 for FWS). Test-retest reliability was moderate to good for individual items and high for the summary scales (intraclass correlation = 0.78 for both IS and FWS). Correlations between SFSQ scores and the SF-36 were modest. The SFSQ did not discriminate between patients differing in age and gender but did discriminate between patients differing in number of episodes and comorbid conditions. Changes in SFSQ scores were related to changes in health status and the presence of recurrences but did not vary by TLOC diagnosis. CONCLUSION: The SFSQ is an adequately reliable, valid, and responsive measure to assess HR-QoL in patients with TLOC.
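The internal consistency reported above is Cronbach's alpha. A minimal sketch of the standard formula; the item data are invented, not SFSQ responses:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of per-item score vectors (same respondent order):
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = len(items)
    n = len(items[0])

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(variance(it) for it in items) / variance(totals))

# Perfectly parallel items give alpha = 1.0; weakly related items give lower values.
```

Values around 0.9, as for the FWS, indicate that the items of a scale move together closely enough to be summed into one score.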

    Explaining the willingness of public professionals to implement new policies: A policy alienation framework

    Nowadays, many public policies focus on economic values, such as efficiency and client choice. Public professionals often show resistance to implementing such policies. We analyse this problem using an interdisciplinary approach. From public administration, we draw on the policy alienation concept, which consists of five dimensions: strategic powerlessness, tactical powerlessness, operational powerlessness, societal meaninglessness and client meaninglessness. These are considered as factors that influence the willingness of professionals to implement policies (change willingness, a concept drawn from the change management literature). We tested this model in a survey of 478 Dutch healthcare professionals implementing a new reimbursement policy. The first finding was that perceived autonomy (operational powerlessness) significantly influenced change willingness, whereas strategic and tactical powerlessness were not found to be significant. Second, both meaninglessness dimensions proved highly significant. We conclude that clarifying the value of a policy is important in getting professionals to willingly implement it, whereas their participation at the strategic or tactical level seems less of a motivational factor. These insights help in understanding why public professionals embrace or resist the implementation of particular policies.
    Points for practitioners: Policymakers develop public policies which, nowadays, tend to focus strongly on economic values, such as increasing efficiency or offering citizens the opportunity to choose among suppliers of public services. Public professionals, who have to implement these policies, are often reluctant to do so. This study shows that the causes of this resistance are unlikely to be found in the lack of influence these professionals have in shaping the policy at the national or organizational level. Rather, professionals may resist implementing policies because they do not see them as meaningful for society or for their own clients. Therefore, policymakers should focus on this perceived meaninglessness and adopt ways to counter it, for example by intensively communicating the value associated with a policy.

    Clinical Study Combination Antiretroviral Therapy for HIV in Rwandan Adults: Clinical Outcomes and Impact on Reproductive Health up to 24 Months

    Adult women (n = 113) and men (n = 100) initiating combination antiretroviral therapy (cART) and women not yet eligible for cART (n = 199) in Kigali, Rwanda, were followed for 6-24 months between 2007 and 2010. In the cART groups, 21% of patients required a drug change due to side effects and 11% of patients had virological failure (defined as >1,000 HIV RNA copies/mL) after 12 months of cART. About a third of the pregnancies since HIV diagnosis were unintended. The proportion of women in the pre-cART group using modern contraception other than condoms (50%) was similar to that of women in the general population, but this proportion was only 25% among women initiating cART. Of the women who had carried at least one pregnancy to term since being diagnosed HIV-positive, a third reported having participated in a prevention-of-mother-to-child-transmission (PMTCT, option A) intervention. Many patients were coinfected with herpes simplex virus type 2 (79-92%), human papillomavirus (38-53%), and bacterial sexually transmitted infections (STIs), with no differences between groups. We applaud the Rwandan government for having strengthened family planning and PMTCT services and for having introduced HPV vaccination in recent years, but additional work is needed to strengthen STI and HPV-related cancer screening and management in the HIV-positive population.

    Ventilation and thermal conditions in secondary schools in the Netherlands: Effects of COVID-19 pandemic control and prevention measures

    During the COVID-19 pandemic, the importance of ventilation was widely stressed and new ventilation protocols were implemented in school buildings worldwide. In the Netherlands, schools were recommended to keep windows and doors open, and after a national lockdown more stringent measures such as reduced occupancy were introduced. In this study, the actual effects of such measures on ventilation and thermal conditions were investigated in 31 classrooms of 11 Dutch secondary schools, by monitoring the indoor and outdoor CO2 concentration and air temperature, both before and after the lockdown. Ventilation rates were calculated using the steady-state method. Pre-lockdown, with an average occupancy of 17 students, the CO2 concentration in 42% of the classrooms exceeded the upper limit of the Dutch national guidelines (800 ppm above outdoors), while 13% had a ventilation rate per person (VRp) lower than the minimum requirement (6 l/s per person). Post-lockdown, the indoor CO2 concentration decreased significantly, while a significant increase in ventilation rate was found only for VRp, mainly caused by the decrease in occupancy (average 10 students). The total ventilation rate per classroom, mainly induced by opening windows and doors, did not change significantly. Meanwhile, according to the Dutch national guidelines, thermal conditions in the classrooms were not satisfactory, either pre- or post-lockdown. Since opening windows and doors cannot achieve the required indoor environmental quality at all times, and reducing occupancy may not be feasible for immediate implementation, more controllable and flexible ways of improving indoor air quality and thermal comfort in classrooms are needed.
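The steady-state method mentioned above estimates ventilation from the CO2 mass balance at equilibrium: VRp = G / (Cin - Cout), where G is the per-person CO2 generation rate. A sketch under an assumed generation rate of 0.004 l/s per person; that value, and the example concentrations, are illustrative, not the study's:

```python
def vr_per_person(co2_in_ppm, co2_out_ppm, gen_lps=0.004):
    """Steady-state ventilation rate per person in l/s:
    VRp = G / (C_in - C_out), with concentrations in ppm and G in l/s."""
    return gen_lps * 1e6 / (co2_in_ppm - co2_out_ppm)

def within_dutch_limit(co2_in_ppm, co2_out_ppm):
    """Guideline cited in the study: indoor CO2 at most 800 ppm above outdoors."""
    return co2_in_ppm - co2_out_ppm <= 800

# A classroom steady at 1200 ppm with 400 ppm outdoors gets 5 l/s per person:
# just meeting the 800 ppm excess limit, yet below the 6 l/s/p minimum.
```

The example shows why the two criteria in the abstract can disagree: the CO2 excess limit and the per-person ventilation minimum are not equivalent unless the assumed generation rate matches the occupants.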

    CRP polymorphisms and chronic kidney disease in the third national health and nutrition examination survey

    Background: CRP gene polymorphisms are associated with serum C-reactive protein concentrations and may play a role in chronic kidney disease (CKD) progression. We recently reported an association between the gene variant rs2808630 and CKD progression in African Americans with hypertensive kidney disease. This association has not been studied in other ethnic groups. Methods: We used data from 5955 participants in Phase 2 of the Third National Health and Nutrition Examination Survey (1991-1994) to study the association between CRP polymorphisms and CKD prevalence in a population-based sample. The primary outcome was CKD, defined as an estimated glomerular filtration rate (eGFR) <60 ml/min or the presence of albuminuria. Secondary outcomes were the presence of albuminuria (any degree) and continuous eGFR. Six single nucleotide polymorphisms (SNPs) from the CRP gene, rs2808630, rs1205, rs3093066, rs1417938, rs3093058, and rs1800947, were evaluated. Results: The CRP rs2808630 AG genotype, compared to the referent AA genotype, was associated with CKD in non-Hispanic blacks (n = 1649, 293 of whom had CKD), with an adjusted odds ratio (OR) of 3.09 (95% CI 1.65-5.8; p = 0.001). For the secondary outcomes, rs2808630 AG compared to the referent AA genotype was associated with albuminuria, with an adjusted OR of 3.07 (95% CI 1.59-5.94; p = 0.002), but not with eGFR. There was no association between the SNPs and CKD, albuminuria or eGFR in non-Hispanic whites or Mexican Americans. Conclusions: In this cross-sectional study, the 3' flanking CRP gene variant rs2808630 was associated with CKD, mainly through its association with albuminuria, in non-Hispanic blacks. Despite not finding an association with eGFR, our results support our previous study demonstrating an association between the CRP gene variant rs2808630 and CKD progression in a longitudinal cohort of African Americans with hypertensive kidney disease.
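The adjusted odds ratios above come from multivariable regression; for intuition, a crude (unadjusted) OR with a Wald 95% CI can be read off a 2x2 genotype-by-outcome table. A sketch with hypothetical counts, not the NHANES data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio for a 2x2 table
    [a = exposed cases, b = exposed non-cases, c = unexposed cases, d = unexposed non-cases]
    with a Wald 95% CI computed on the log-odds scale."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return or_, math.exp(math.log(or_) - z * se), math.exp(math.log(or_) + z * se)
```

A confidence interval excluding 1 (as in the rs2808630 results above) is what corresponds to a statistically significant association at the 5% level.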