
    A salting out and resin procedure for extracting Schistosoma mansoni DNA from human urine samples

    Fundação Oswaldo Cruz, Laboratório de Esquistossomose, Belo Horizonte, MG, Brasil. Fundação Oswaldo Cruz, Laboratório de Imunologia Celular e Molecular, Belo Horizonte, MG, Brasil. Fundação Oswaldo Cruz, Laboratório de Imunologia Celular e Molecular, Belo Horizonte, MG, Brasil / Universidade Federal de Ouro Preto, Escola de Farmåcia, Laboratório de Pesquisas Clínicas, Ouro Preto, MG, Brazil.
    Background: This paper introduces a simple and inexpensive salting-out and resin (InstaGene matrix® resin, Bio-Rad) method for extracting DNA from urine for PCR assays. The DNA of the fluke Schistosoma mansoni was chosen as the target because schistosomiasis lacks a diagnostic tool sensitive enough to detect low worm burdens. PCR is known to provide high sensitivity and specificity in detecting parasite DNA; it is therefore of paramount importance to take advantage of this performance by providing a simple-to-handle and reliable DNA extraction procedure that permits diagnosis of the disease in easily obtainable urine samples.
    Findings: The extraction procedure is described and was tested for reproducibility and efficiency in artificially contaminated human urine samples. Reproducibility reached 100%, with positive results in 5 assay repetitions of 5 tested samples, each containing 20 ng DNA/5 ml. The efficiency of the extraction procedure was also evaluated in a serial dilution of the original 20 ng DNA/5 ml sample: detectable DNA was still extracted at a concentration of 1.28 pg DNA/ml, revealing the high efficiency of this procedure.
    Conclusions: This methodology represents a promising tool for schistosomiasis diagnosis utilizing a bio-molecular technique in urine samples. It is now ready to be tested under field conditions and may be applicable to the diagnosis of other parasitic diseases.
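    The reported detection limit is consistent with simple dilution arithmetic: 20 ng of DNA in 5 ml is 4 ng/ml, or 4000 pg/ml, and 1.28 pg/ml corresponds to a 3125-fold (5^5) dilution. A minimal sketch; the 1:5 dilution series is an assumption, since the abstract states only the starting amount and the detection limit:

```python
# Dilution arithmetic behind the reported detection limit.
# Assumption: a 1:5 serial dilution series; the abstract gives only the
# starting amount (20 ng DNA / 5 ml) and the detection limit (1.28 pg DNA/ml).

start_pg_per_ml = 20_000.0 / 5.0       # 20 ng in 5 ml = 4000 pg/ml

series = [start_pg_per_ml]
for _ in range(5):                      # five successive 1:5 dilutions
    series.append(series[-1] / 5.0)

print(series[-1])                       # ~1.28 pg/ml, the reported limit
```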

    Evaluation of Urine CCA Assays for Detection of Schistosoma mansoni Infection in Western Kenya

    Although accurate assessment of the prevalence of Schistosoma mansoni is important for the design and evaluation of control programs, the most widely used diagnostic tools are limited by suboptimal sensitivity, slow turnaround time, or inability to distinguish current from former infections. Recently, two tests that detect circulating cathodic antigen (CCA) in the urine of patients with schistosomiasis became commercially available. As part of a larger study on schistosomiasis prevalence in young children, we evaluated the performance and diagnostic accuracy of these tests: the carbon test strip designed for use in the laboratory and the cassette-format test intended for field use. In comparison to 6 Kato-Katz exams, the carbon and cassette CCA tests had sensitivities of 88.4% and 94.2% and specificities of 70.9% and 59.4%, respectively. However, because of the known limitations of the Kato-Katz assay, we also utilized latent class analysis (LCA), incorporating the CCA, Kato-Katz, and schistosome-specific antibody results, to determine their sensitivities and specificities. The laboratory-based CCA test had a sensitivity of 91.7% and a specificity of 89.4% by LCA, while the cassette test had a sensitivity of 96.3% and a specificity of 74.7%. The intensity of the reaction in both urine CCA tests reflected stool egg burden, and their performance was not affected by the presence of soil-transmitted helminth infections. Our results suggest that urine-based assays for CCA may be valuable in screening for S. mansoni infections.
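    The sensitivity and specificity figures quoted above follow the standard 2×2 formulas against a reference standard. A minimal sketch; the counts in the usage example are hypothetical, not the study's data:

```python
def sens_spec(tp: int, fp: int, fn: int, tn: int) -> tuple[float, float]:
    """Sensitivity and specificity of an index test against a reference standard.

    tp/fp/fn/tn are the cells of the 2x2 table: index-test result
    cross-tabulated with the reference-standard result.
    """
    sensitivity = tp / (tp + fn)   # fraction of reference-positives detected
    specificity = tn / (tn + fp)   # fraction of reference-negatives cleared
    return sensitivity, specificity

# Hypothetical counts, for illustration only:
sens, spec = sens_spec(tp=90, fp=30, fn=10, tn=70)
print(f"sensitivity={sens:.1%}, specificity={spec:.1%}")  # 90.0%, 70.0%
```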

    Apathy, but Not Depression, Reflects Inefficient Cognitive Strategies in Parkinson's Disease

    The relationship between apathy, depression and cognitive impairment in Parkinson's disease (PD) is still controversial. The objective of this study is to investigate whether apathy and depression are associated with inefficient cognitive strategies in PD. In this prospective clinical cohort study, conducted in a university-based clinical and research movement disorders center, we studied 48 PD patients. Based on clinical evaluation, they were classified into two groups: PD with apathy (PD-A group, n = 23) and PD without apathy (PD-NA group, n = 25). Patients received clinical and neuropsychological evaluations. The clinical evaluation included the Apathy Evaluation Scale (patient version), the Hamilton Depression Rating Scale (17 items), the Unified Parkinson's Disease Rating Scale and the Hoehn and Yahr staging system; the neuropsychological evaluation explored speed of information processing, attention, working memory, executive function, learning abilities and memory, including several measures of recall (immediate free, short-delay free, long-delay free and cued, and total recall). The PD-A and PD-NA groups did not differ in age, disease duration, treatment, or motor condition, but differed in recall (p<0.001) and executive tasks (p<0.001). Immediate free recall had the highest predictive value for apathy (F = 10.94; p = 0.002). Depression and apathy had a weak correlation (Pearson index = 0.3; p<0.07), with three items of the depression scale correlating with apathy (Pearson index between 0.3 and 0.4; p<0.04). The depressed and non-depressed PD patients within the non-apathetic group did not differ. Apathy, but not depression, is associated with a deficit in implementing efficient cognitive strategies. As the implementation of efficient strategies relies on the fronto-striatal circuit, we conclude that apathy, unlike depression, is an early expression of executive impairment in PD.

    Adult cognitive outcomes in phenylketonuria: explaining causes of variability beyond average Phe levels

    OBJECTIVE: The objective was to deepen the understanding of the causes of individual variability in phenylketonuria (PKU) by investigating which metabolic variables are most important for predicting cognitive outcomes (Phe average vs Phe variation) and by assessing the risk of cognitive impairment associated with adopting a more relaxed approach to the diet than is currently recommended.
    METHOD: We analysed associations between metabolic and cognitive measures in a mixed sample of English and Italian early-treated adults with PKU (N = 56). Metabolic measures were collected through childhood, adolescence and adulthood; cognitive measures were collected in adulthood. Metabolic measures included average Phe levels (average of median values for each year in a given period) and average Phe variations (average yearly standard deviations). Cognition was measured with IQ and a battery of cognitive tasks.
    RESULTS: Phe variation was as important, if not more important, than Phe average in predicting adult outcomes, and contributed independently. Phe variation was particularly detrimental in childhood. Together, childhood Phe variation and adult Phe average predicted around 40% of the variation in cognitive scores. Poor cognitive scores (>1 SD from controls) occurred almost exclusively in individuals with poor metabolic control, and the risk of poor scores was about 30% higher in individuals with Phe values exceeding recommended thresholds.
    CONCLUSIONS: Our results provide support for current European guidelines (average Phe ≀ 360 ”mol/l in childhood; ≀ 600 ”mol/l from 12 years onwards), but they suggest an additional recommendation to maintain stable levels (possibly Phe SD ≀ 180 ”mol/l throughout life).
    PUBLIC SIGNIFICANCE STATEMENTS: We investigated the relationship between how well people with phenylketonuria control blood Phe throughout their life and their ability to carry out cognitive tasks in adulthood. We found that avoiding blood Phe peaks was as important, if not more important, than maintaining low average Phe levels. This was particularly essential in childhood. We also found that blood Phe levels above recommended European guidelines were associated with an approximately 30% increase in the risk of poor cognitive outcomes.
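    The two metabolic predictors described above (the average of yearly medians, and the average of yearly standard deviations) can be sketched as follows; the function name and the sample values are hypothetical:

```python
from statistics import median, stdev

def phe_metrics(yearly_levels):
    """Compute the two metabolic predictors from blood Phe records.

    yearly_levels: dict mapping year -> list of blood Phe values (umol/l)
    for that year. Returns (Phe average, Phe variation): the average of
    yearly medians and the average of yearly standard deviations.
    """
    medians = [median(values) for values in yearly_levels.values()]
    sds = [stdev(values) for values in yearly_levels.values()]
    return sum(medians) / len(medians), sum(sds) / len(sds)

# Hypothetical records for two years of monitoring:
avg, variation = phe_metrics({
    2001: [300, 400, 500],
    2002: [200, 300, 400],
})
print(avg, variation)  # 350.0 100.0
```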

    HIV-associated neurocognitive disorders in sub-Saharan Africa: a pilot study in Cameroon

    Background: The disease burden of human immunodeficiency virus (HIV)/acquired immunodeficiency syndrome (AIDS) is highest in sub-Saharan Africa, but there are few studies on the associated neurocognitive disorders in this region. The objectives of this study were to determine whether Western neuropsychological (NP) methods are appropriate for use in Cameroon, and to evaluate cognitive function in a sample of HIV-infected adults.
    Methods: We used a battery of 19 NP measures in a cross-sectional study with 44 HIV+ adults and 44 demographically matched HIV− controls, to explore the validity of these NP measures in Cameroon and evaluate the effect of viral infection on seven cognitive ability domains.
    Results: In this pilot study, the global mean z-score on the NP battery showed worse overall cognition in the HIV+ individuals. Significantly lower performance was seen in the HIV+ sample on tests of executive function, speed of information processing, working memory, and psychomotor speed. HIV+ participants with AIDS performed worse than those with less advanced HIV disease.
    Conclusions: Similar to findings in Western cohorts, our results in Cameroon suggest that HIV infection, particularly in advanced stages, is associated with worse performance on standardized, Western neurocognitive tests. The tests used here appear to be promising for studying NeuroAIDS in sub-Saharan Africa.
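    The global mean z-score used to summarize performance on the battery can be sketched as below; the test names, the data, and the scoring direction (higher raw score = better performance on every test) are assumptions for illustration:

```python
from statistics import mean, stdev

def global_mean_z(participant_scores, control_scores):
    """Average z-score across an NP battery, normed on the control group.

    participant_scores: dict test_name -> raw score.
    control_scores: dict test_name -> list of HIV- control raw scores.
    Assumes higher raw scores indicate better performance on every test.
    """
    zs = [
        (score - mean(control_scores[test])) / stdev(control_scores[test])
        for test, score in participant_scores.items()
    ]
    return mean(zs)  # lower global mean z = worse overall cognition

# Hypothetical two-test battery:
controls = {"fluency": [10, 20, 30], "digit_span": [4, 6, 8]}
z = global_mean_z({"fluency": 10, "digit_span": 4}, controls)
print(z)  # -1.0: one SD below the control mean on both tests
```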

    Diagnostic Accuracy and Applicability of a PCR System for the Detection of Schistosoma mansoni DNA in Human Urine Samples from an Endemic Area

    Schistosomiasis caused by Schistosoma mansoni, one of the most neglected human parasitoses in Latin America and Africa, is routinely confirmed by microscopic visualization of eggs in stool. The main limitation of this diagnostic approach is its lack of sensitivity in detecting individual low worm burdens; consequently, data on infection rates in low-transmission settings are of limited reliability. According to the scientific literature, PCR assays are characterized by high sensitivity and specificity in detecting parasite DNA in biological samples. A simple and cost-effective method for extracting Schistosoma mansoni DNA from urine samples, in combination with a conventional PCR assay, was developed and applied in an endemic area. This urine-based PCR system was tested for diagnostic accuracy among the population of a small village in an endemic area, comparing it to a reference test composed of three different parasitological techniques. The diagnostic parameters revealed a sensitivity of 100%, a specificity of 91.20%, positive and negative predictive values of 86.25% and 100%, respectively, and a test accuracy of 94.33%. Further statistical analysis showed a kappa index of 0.8806, indicating excellent agreement between the reference test and the PCR system. Data obtained from the mouse model indicate that the infection can be detected one week after cercarial penetration, opening a new perspective for early detection and patient management during this stage of the disease. The data indicate that this innovative PCR system provides a simple-to-handle and robust diagnostic tool for the detection of S. mansoni DNA in urine samples and a promising approach to overcoming the diagnostic obstacles in low-transmission settings. Furthermore, the principles of this molecular technique, based on the examination of human urine samples, may be useful for the diagnosis of other neglected tropical diseases that can be detected by trans-renal DNA.
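    The diagnostic parameters and the kappa index quoted above follow from the standard 2×2 formulas. The sketch below uses hypothetical counts (tp=69, fp=11, fn=0, tn=114), chosen for illustration because they reproduce the reported figures; they are not taken from the paper:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard accuracy measures for an index test vs. a reference test."""
    n = tp + fp + fn + tn
    sens = tp / (tp + fn)                 # sensitivity
    spec = tn / (tn + fp)                 # specificity
    ppv = tp / (tp + fp)                  # positive predictive value
    npv = tn / (tn + fn)                  # negative predictive value
    acc = (tp + tn) / n                   # test accuracy
    # Cohen's kappa: observed agreement corrected for chance agreement
    p_exp = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
    kappa = (acc - p_exp) / (1 - p_exp)
    return sens, spec, ppv, npv, acc, kappa

# Hypothetical 2x2 counts consistent with the figures reported above:
sens, spec, ppv, npv, acc, kappa = diagnostic_metrics(tp=69, fp=11, fn=0, tn=114)
print(f"{sens:.0%} {spec:.2%} {ppv:.2%} {npv:.0%} {acc:.2%} {kappa:.4f}")
# 100% 91.20% 86.25% 100% 94.33% 0.8806
```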

    A Research Agenda for Helminth Diseases of Humans: Diagnostics for Control and Elimination Programmes

    Diagnostic tools appropriate for undertaking interventions to control helminth infections are key to their success. Many diagnostic tests for helminth infection have unsatisfactory performance characteristics and are not well suited for use in the parasite control programmes that are being increasingly implemented. Although the application of modern laboratory research techniques to improve diagnostics for helminth infection has resulted in some technical advances, uptake has not been uniform. Frequently, pilot or proof-of-concept studies of promising diagnostic technologies have not been followed by much-needed product development, and in many settings diagnosis continues to rely on insensitive and unsatisfactory parasitological or serodiagnostic techniques. In contrast, PCR-based xenomonitoring of arthropod vectors, and the use of parasite recombinant proteins as reagents for serodiagnostic tests, have resulted in critical advances in the control of specific helminth parasites. The Disease Reference Group on Helminth Infections (DRG4), established in 2009 by the Special Programme for Research and Training in Tropical Diseases (TDR), was given the mandate to review helminthiases research and identify research priorities and gaps. In this review, the diagnostic technologies relevant to the control of helminth infections, either available or in development, are reviewed. Critical gaps are identified and opportunities to improve needed technologies are discussed.

    HIV-associated neurocognitive disorders before and during the era of combination antiretroviral therapy: differences in rates, nature, and predictors

    Combination antiretroviral therapy (CART) has greatly reduced medical morbidity and mortality with HIV infection, but high rates of HIV-associated neurocognitive disorders (HAND) continue to be reported. Because large HIV-infected (HIV+) and uninfected (HIV−) groups have not been studied with similar methods in the pre-CART and CART eras, it is unclear whether CART has changed the prevalence, nature, and clinical correlates of HAND. We used comparable methods of subject screening and assessment to classify neurocognitive impairment (NCI) in large groups of HIV+ and HIV− participants from the pre-CART era (1988–1995; N = 857) and the CART era (2000–2007; N = 937). Impairment rates increased with successive disease stages (CDC stages A, B, and C) in both eras: 25%, 42%, and 52% in the pre-CART era and 36%, 40%, and 45% in the CART era. In the medically asymptomatic stage (CDC-A), NCI was significantly more common in the CART era. Low nadir CD4 predicted NCI in both eras, whereas degree of current immunosuppression, estimated duration of infection, and viral suppression in CSF (on treatment) were related to impairment only pre-CART. The pattern of NCI also differed: the pre-CART era had more impairment in motor skills, cognitive speed, and verbal fluency, whereas the CART era involved more memory (learning) and executive function impairment. High rates of mild NCI persist at all stages of HIV infection, despite improved viral suppression and immune reconstitution with CART. The consistent association of NCI with nadir CD4 across eras suggests that earlier treatment to prevent severe immunosuppression may also help prevent HAND. Clinical trials targeting HAND prevention should specifically examine the timing of ART initiation.
    • 
