48 research outputs found

    Predictors of the development of myocarditis or acute renal failure in patients with leptospirosis: An observational study

    Background: Leptospirosis has a varied clinical presentation, with complications such as myocarditis and acute renal failure. Many clinical and laboratory parameters have been proposed as predictors of severity and mortality. Early detection and treatment can reduce complications, so recognizing early predictors of the complications of leptospirosis is important in patient management. This study aimed to determine the clinical and laboratory predictors of myocarditis or acute renal failure. Methods: This was a prospective descriptive study carried out at the Teaching Hospital, Kandy, from 1st July 2007 to 31st July 2008. Patients with clinical features compatible with the leptospirosis case definition were confirmed using the microscopic agglutination test (MAT). Clinical features and laboratory measurements taken on admission were recorded, and patients were observed for the development of acute renal failure or myocarditis. Chi-square statistics, Fisher's exact test and the Mann-Whitney U test were used to compare patients with and without complications, and a logistic regression model was used to select the final predictor variables. Results: Sixty-two confirmed leptospirosis patients were included in the study. Seven patients (11.3%) developed acute renal failure and five (8.1%) developed myocarditis, while three (4.8%) developed both. The common clinical features were conjunctival suffusion, 40 (64.5%); muscle tenderness, 28 (45.1%); oliguria, 20 (32.2%); jaundice, 12 (19.3%); hepatomegaly, 10 (16.1%); arrhythmias (irregular radial pulse), 8 (12.9%); chest pain, 6 (9.7%); bleeding, 5 (8.1%); and shortness of breath (SOB), 4 (6.4%). 
    Of these, only oliguria (odds ratio [OR] = 4.14; 95% confidence interval [CI] 1.003-17.261), jaundice (OR = 5.13; 95% CI 1.149-28.003) and arrhythmias (OR = 5.774; 95% CI 1.001-34.692) were predictors of myocarditis or acute renal failure; none of the laboratory measures predicted the two complications. Conclusions: This study shows that, among the clinical and laboratory variables examined, only oliguria, jaundice and arrhythmia are strong predictors of the development of acute renal failure or myocarditis in patients with leptospirosis presenting to the Teaching Hospital, Kandy, Sri Lanka
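
    The odds ratios above come from standard 2x2-table analysis; as a minimal sketch, the Woolf (log) method computes OR = ad/bc with a confidence interval of exp(ln OR +/- z*sqrt(1/a + 1/b + 1/c + 1/d)). The counts below are hypothetical, since the abstract does not report the underlying tables:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Woolf (log-method) 95% confidence interval
    from a 2x2 table: a/b = with/without outcome among exposed,
    c/d = with/without outcome among unexposed."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration only; the abstract does not
# report the underlying 2x2 tables.
or_, lo, hi = odds_ratio_ci(6, 14, 6, 36)
print(f"OR = {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```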

    A quantitative PCR (TaqMan) assay for pathogenic Leptospira spp

    BACKGROUND: Leptospirosis is an emerging infectious disease. The differential diagnosis of leptospirosis is difficult due to the varied and often "flu-like" symptoms, which may result in a missed or delayed diagnosis. There are over 230 known serovars in the genus Leptospira. Confirmatory serological diagnosis of leptospirosis is usually made using the microscopic agglutination test (MAT), which relies on live cultures as the source of antigen and is often performed using a panel of antigens representative of local serovars. Other techniques, such as the enzyme-linked immunosorbent assay (ELISA) and slide agglutination test (SAT), can detect different classes of antibody but may be subject to false positive reactions and require confirmation of their results by the MAT. METHODS: The polymerase chain reaction (PCR) has been used to detect a large number of microorganisms, including those of clinical significance. The sensitivity of PCR often precludes the need for isolation and culture, making it ideal for the rapid detection of organisms involved in acute infections. We employed real-time (quantitative) PCR using TaqMan chemistry to detect leptospires in clinical and environmental samples. RESULTS AND CONCLUSIONS: The PCR assay can be applied to either blood or urine samples and does not rely on the isolation and culture of the organism. Capability exists for automation and high-throughput testing in a clinical laboratory. The assay is specific for Leptospira and may discriminate between pathogenic and non-pathogenic species. The limit of detection is as low as two cells
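
    In a quantitative assay of this kind, copy numbers are typically read off a standard curve relating the threshold cycle (Ct) to log10 input copies. A sketch under assumed curve parameters (the SLOPE and INTERCEPT values are illustrative, not parameters reported for this assay):

```python
# Sketch of real-time PCR quantification from a standard curve:
# the threshold cycle (Ct) is linear in log10(starting copies).
# SLOPE and INTERCEPT are illustrative assumptions (a slope of -3.32
# corresponds to 100% amplification efficiency), not values reported
# for this assay.
SLOPE = -3.32      # Ct change per 10-fold change in template
INTERCEPT = 38.0   # assumed Ct for a single starting copy

def copies_from_ct(ct):
    """Invert the standard curve Ct = SLOPE * log10(copies) + INTERCEPT."""
    return 10 ** ((ct - INTERCEPT) / SLOPE)

print(round(copies_from_ct(38.0)))         # 1 copy at the assumed intercept
print(round(copies_from_ct(38.0 - 3.32)))  # 10 copies, one efficiency step lower
```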

    Development and Validation of a Real-Time PCR for Detection of Pathogenic Leptospira Species in Clinical Materials

    Available serological diagnostics do not allow confirmation of clinically suspected leptospirosis in the early acute phase of illness. Several conventional and real-time PCRs for the early diagnosis of leptospirosis have been described, but these have been incompletely evaluated. We developed a SYBR Green-based real-time PCR targeting secY and validated it according to international guidelines. To determine the analytical specificity, DNA from 56 Leptospira strains belonging to pathogenic, non-pathogenic and intermediate Leptospira spp., as well as from 46 other micro-organisms, was included in this study. All the pathogenic Leptospira gave a positive reaction. We found no cross-reaction with saprophytic Leptospira or other micro-organisms, implying a high analytical specificity. The analytical sensitivity of the PCR was one copy per reaction for the cultured homologous strain M 20, and 1.2 and 1.5 copies for the heterologous strains 1342 K and Sarmin, respectively. In spiked serum and blood versus kidney tissue, the sensitivity was 10 and 20 copies for M 20, 15 and 30 copies for 1342 K, and 30 and 50 copies for Sarmin. To determine the diagnostic sensitivity (DSe) and specificity (DSp), clinical blood samples from 26 laboratory-confirmed and 107 negative patients suspected of leptospirosis were enrolled as a prospective consecutive cohort. With culture as the gold standard, we found a DSe and DSp of 100% and 93%, respectively. All eight PCR-positive samples with a negative culture seroconverted later on, implying that the actual DSp is higher. When culture and serology together were used as the gold standard, the DSe was lower (89%) while the DSp was higher (100%). DSe was 100% in samples collected within the first 4 days after onset of illness, the window most important for treatment decisions. Reproducibility and repeatability of the assay, determined by blind testing of kidney samples from 20 confirmed positive and 20 negative rodents, were both 100%. 
    In conclusion, we have described for the first time the development of a robust SYBR Green real-time PCR for the detection of pathogenic Leptospira combined with a detailed assessment of its clinical accuracy, thus providing a method for the early diagnosis of leptospirosis with a well-defined, satisfactory performance
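
    The reported diagnostic accuracy follows directly from the counts given in the abstract (26 culture-confirmed cases, all PCR positive; 8 PCR-positive results among the 107 culture-negative patients). A minimal check:

```python
# Diagnostic accuracy from the abstract's counts, with culture as the
# gold standard: 26 culture-confirmed cases, all PCR positive
# (TP=26, FN=0); of 107 culture-negative patients, 8 were PCR
# positive (FP=8, TN=99).
def diagnostic_accuracy(tp, fn, tn, fp):
    """Diagnostic sensitivity and specificity from a 2x2 confusion matrix."""
    dse = tp / (tp + fn)
    dsp = tn / (tn + fp)
    return dse, dsp

dse, dsp = diagnostic_accuracy(26, 0, 99, 8)
print(f"DSe = {dse:.0%}, DSp = {dsp:.0%}")  # DSe = 100%, DSp = 93%
```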

    Small Cationic DDA:TDB Liposomes as Protein Vaccine Adjuvants Obviate the Need for TLR Agonists in Inducing Cellular and Humoral Responses

    Most subunit vaccines require adjuvants in order to induce protective immune responses to the targeted pathogen. However, many of the potent immunogenic adjuvants display unacceptable local or systemic reactogenicity. Liposomes are spherical vesicles consisting of single (unilamellar) or multiple (multilamellar) phospholipid bilayers. The lipid membranes are interleaved with an aqueous buffer, which can be utilised to deliver hydrophilic vaccine components, such as protein antigens or ligands for immune receptors. Liposomes, in particular cationic DDA:TDB vesicles, have been shown in animal models to induce strong humoral responses to the associated antigen without increased reactogenicity, and are currently being tested in Phase I human clinical trials. We explored several modifications of DDA:TDB liposomes (including size, antigen association and addition of TLR agonists) to assess their immunogenic capacity as vaccine adjuvants, using ovalbumin (OVA) as a model protein vaccine. Following triple homologous immunisation, small unilamellar vesicles (SUVs) with no TLR agonists showed a significantly higher capacity for inducing spleen CD8 IFNγ responses against OVA than the larger multilamellar vesicles (MLVs). Antigen-specific antibody responses were also higher with SUVs. Addition of the TLR3 and TLR9 agonists significantly increased the adjuvanting capacity of MLVs and of OVA-encapsulating dehydration-rehydration vesicles (DRVs), but not of SUVs. Our findings lend further support to the use of liposomes as protein vaccine adjuvants. Importantly, the ability of DDA:TDB SUVs to induce potent CD8 T cell responses without added immunostimulators would avoid the potential safety risks associated with the clinical use of TLR agonists in liposome-adjuvanted vaccines

    Network-State Modulation of Power-Law Frequency-Scaling in Visual Cortical Neurons

    Various types of neural-based signals, such as EEG, local field potentials and intracellular synaptic potentials, integrate multiple sources of activity distributed across large assemblies. They have in common a power-law frequency-scaling structure at high frequencies, but it is still unclear whether this scaling property is dominated by intrinsic neuronal properties or by network activity. The latter case is particularly interesting because if frequency-scaling reflects the network state it could be used to characterize the functional impact of the connectivity. In intracellularly recorded neurons of cat primary visual cortex in vivo, the power spectral density of Vm activity displays a power-law structure at high frequencies with a fractional scaling exponent. We show that this exponent is not constant, but depends on the visual statistics used to drive the network. To investigate the determinants of this frequency-scaling, we considered a generic recurrent model of cortex receiving a retinotopically organized external input. Similarly to the in vivo case, our in computo simulations show that the scaling exponent reflects the correlation level imposed in the input. This systematic dependence was also replicated at the single cell level, by controlling independently, in a parametric way, the strength and the temporal decay of the pairwise correlation between presynaptic inputs. This last model was implemented in vitro by imposing the correlation control in artificial presynaptic spike trains through dynamic-clamp techniques. These in vitro manipulations induced a modulation of the scaling exponent, similar to that observed in vivo and predicted in computo. We conclude that the frequency-scaling exponent of the Vm reflects stimulus-driven correlations in the cortical network activity. 
    Therefore, we propose that the scaling exponent could be used to read out the “effective” connectivity responsible for the dynamical signature of the population signals measured at different integration levels, from Vm to LFP, EEG and fMRI
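
    The scaling exponent discussed here is conventionally estimated by linear regression on the log-log power spectrum. A minimal sketch on a synthetic PSD with a known exponent (illustrative values standing in for recorded Vm data):

```python
import numpy as np

# Estimate the frequency-scaling exponent alpha of a power spectral
# density S(f) ~ f**(-alpha) by linear regression in log-log
# coordinates, the standard way such exponents are read off a PSD.
# A synthetic PSD with a known exponent stands in for recorded Vm data.
f = np.logspace(1, 3, 200)       # high-frequency band, 10-1000 Hz
alpha_true = 2.4                 # assumed fractional scaling exponent
psd = 1e3 * f ** (-alpha_true)

slope, intercept = np.polyfit(np.log10(f), np.log10(psd), 1)
alpha_hat = -slope               # the scaling exponent is minus the slope
print(f"estimated exponent: {alpha_hat:.2f}")
```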

    The use of mesenchymal stem cells for cartilage repair and regeneration: a systematic review.

    BACKGROUND: The management of articular cartilage defects presents many clinical challenges due to the tissue's avascular, aneural and alymphatic nature. Bone marrow stimulation techniques, such as microfracture, are the most frequently used methods in clinical practice; however, the resulting mixed fibrocartilage tissue is inferior to native hyaline cartilage. Other methods have shown promise but are far from perfect. There is an unmet need and growing interest in regenerative medicine and tissue engineering to improve the outcome for patients requiring cartilage repair. Many published reviews on cartilage repair list only human clinical trials, underestimating the wealth of basic science and animal studies that are precursors to future research. We therefore set out to perform a systematic review of the literature on the translation of stem cell therapy, to explore what research has been carried out at each stage of translation from bench-top (in vitro) through animal (pre-clinical) to human (clinical) studies, and to assemble an evidence-based cascade for the responsible introduction of stem cell therapy for cartilage defects. This review was conducted in accordance with PRISMA guidelines using the CINAHL, MEDLINE, EMBASE, Scopus and Web of Knowledge databases from 1st January 1900 to 30th June 2015. In total, 2880 studies were identified, of which 252 were included for analysis (100 in vitro studies, 111 animal studies and 31 human studies). There was a huge variance in cell source in pre-clinical studies, in terms of the animal used, the location of harvest (fat, marrow, blood or synovium) and allogeneicity. The use of scaffolds, growth factors, number of cell passages and number of cells used was hugely heterogeneous. SHORT CONCLUSIONS: This review offers a comprehensive assessment of the evidence behind the translation of basic science to the clinical practice of cartilage repair. 
    It reveals a lack of connectivity between the in vitro, pre-clinical and human data, and a patchwork quilt of synergistic evidence. Progress in this space is largely driven by patient demand, surgeon inquisition and a regulatory framework that is learning at the same pace as new developments take place

    Global Impact of the COVID-19 Pandemic on Cerebral Venous Thrombosis and Mortality

    Background and purpose: Recent studies suggested an increased incidence of cerebral venous thrombosis (CVT) during the coronavirus disease 2019 (COVID-19) pandemic. We evaluated the volume of CVT hospitalization and in-hospital mortality during the 1st year of the COVID-19 pandemic compared to the preceding year. Methods: We conducted a cross-sectional retrospective study of 171 stroke centers from 49 countries. We recorded COVID-19 admission volumes, CVT hospitalization, and CVT in-hospital mortality from January 1, 2019, to May 31, 2021. CVT diagnoses were identified by International Classification of Disease-10 (ICD-10) codes or stroke databases. We additionally sought to compare the same metrics in the first 5 months of 2021 compared to the corresponding months in 2019 and 2020 (ClinicalTrials.gov Identifier: NCT04934020). Results: There were 2,313 CVT admissions across the 1-year pre-pandemic (2019) and pandemic year (2020); no differences in CVT volume or CVT mortality were observed. During the first 5 months of 2021, there was an increase in CVT volumes compared to 2019 (27.5%; 95% confidence interval [CI], 24.2 to 32.0; P<0.0001) and 2020 (41.4%; 95% CI, 37.0 to 46.0; P<0.0001). A COVID-19 diagnosis was present in 7.6% (132/1,738) of CVT hospitalizations. CVT was present in 0.04% (103/292,080) of COVID-19 hospitalizations. During the first pandemic year, CVT mortality was higher in patients who were COVID positive compared to COVID negative patients (8/53 [15.0%] vs. 41/910 [4.5%], P=0.004). There was an increase in CVT mortality during the first 5 months of pandemic years 2020 and 2021 compared to the first 5 months of the pre-pandemic year 2019 (2019 vs. 2020: 2.26% vs. 4.74%, P=0.05; 2019 vs. 2021: 2.26% vs. 4.99%, P=0.03). In the first 5 months of 2021, there were 26 cases of vaccine-induced immune thrombotic thrombocytopenia (VITT), resulting in six deaths. 
Conclusions: During the 1st year of the COVID-19 pandemic, CVT hospitalization volume and CVT in-hospital mortality did not change compared to the prior year. COVID-19 diagnosis was associated with higher CVT in-hospital mortality. During the first 5 months of 2021, there was an increase in CVT hospitalization volume and increase in CVT-related mortality, partially attributable to VITT
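
    The headline proportions follow directly from the counts reported in the abstract; a quick arithmetic check:

```python
# Quick check of the reported proportions against the raw counts
# given in the abstract (numerator, denominator).
rates = {
    "COVID-19 diagnosis among CVT hospitalizations": (132, 1_738),    # reported 7.6%
    "CVT among COVID-19 hospitalizations":           (103, 292_080),  # reported 0.04%
    "CVT mortality, COVID positive":                 (8, 53),         # reported 15.0%
    "CVT mortality, COVID negative":                 (41, 910),       # reported 4.5%
}
for label, (numerator, denominator) in rates.items():
    print(f"{label}: {100 * numerator / denominator:.2f}%")
```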
