
    Early versus Delayed Decompression for Traumatic Cervical Spinal Cord Injury: Results of the Surgical Timing in Acute Spinal Cord Injury Study (STASCIS)

    BACKGROUND: There is convincing preclinical evidence that early decompression in the setting of spinal cord injury (SCI) improves neurologic outcomes. However, the effect of early surgical decompression in patients with acute SCI remains uncertain. Our objective was to evaluate the relative effectiveness of early (<24 hours after injury) versus late (≥24 hours after injury) decompressive surgery after traumatic cervical SCI. METHODS: We performed a multicenter, international, prospective cohort study (Surgical Timing In Acute Spinal Cord Injury Study: STASCIS) in adults aged 16-80 years with cervical SCI. Enrolment occurred between 2002 and 2009 at 6 North American centers. The primary outcome was ordinal change in ASIA Impairment Scale (AIS) grade at 6 months' follow-up. Secondary outcomes included complication rates and mortality. FINDINGS: A total of 313 patients with acute cervical SCI were enrolled. Of these, 182 underwent early surgery, at a mean of 14.2 (±5.4) hours after injury, and the remaining 131 underwent late surgery, at a mean of 48.3 (±29.3) hours. Of the 222 patients with follow-up available at 6 months post injury, 19.8% of those undergoing early surgery showed an improvement of ≥2 AIS grades, compared with 8.8% in the late decompression group (OR = 2.57; 95% CI, 1.11 to 5.97). In the multivariate analysis, adjusted for preoperative neurological status and steroid administration, the odds of an improvement of at least 2 AIS grades were 2.8 times higher among those who underwent early surgery (OR = 2.83; 95% CI, 1.10 to 7.28). During the 30-day post-injury period, there was 1 death in each surgical group. Complications occurred in 24.2% of early surgery patients and 30.5% of late surgery patients (p = 0.21). CONCLUSION: Decompression within 24 hours of SCI can be performed safely and is associated with improved neurologic outcome, defined as an improvement of at least 2 AIS grades at 6 months' follow-up.
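
The crude effect size above can be recovered from the reported proportions alone. A minimal sketch (the per-group patient counts at follow-up are not given in the abstract, so this works from the percentages and differs slightly from the exact OR of 2.57 computed from raw counts):

```python
# Reproduce the crude odds ratio from the reported proportions of patients
# with a >= 2-grade AIS improvement: 19.8% (early surgery) vs. 8.8% (late).

def odds(p):
    """Convert a proportion to odds."""
    return p / (1.0 - p)

def odds_ratio(p_treated, p_control):
    """Crude (unadjusted) odds ratio, treated vs. control."""
    return odds(p_treated) / odds(p_control)

or_early = odds_ratio(0.198, 0.088)
print(f"crude OR for early vs. late surgery: {or_early:.2f}")
```

Note this is the unadjusted estimate; the abstract's adjusted OR of 2.83 comes from a multivariate model that the abstract does not fully specify.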

    Comparison of three methods for detection of gametocytes in Melanesian children treated for uncomplicated malaria

    Background: Gametocytes are the transmission stages of Plasmodium parasites, the causative agents of malaria. Because their density in the human host is typically low, they often go undetected by conventional light microscopy, and RNA-based molecular detection of gametocytes remains challenging in remote field settings. In the present study, three methods for detecting Plasmodium falciparum and Plasmodium vivax gametocytes were compared in detail: light microscopy, magnetic fractionation, and reverse transcriptase polymerase chain reaction. Methods: Peripheral blood samples from 70 children aged 0.5 to 5 years with uncomplicated malaria, treated with either artemether-lumefantrine or artemisinin-naphthoquine, were collected at two health facilities on the north coast of Papua New Guinea. Samples were taken prior to treatment (day 0) and at pre-specified intervals during follow-up. Gametocytes were measured in each sample by three methods: i) light microscopy (LM), ii) quantitative magnetic fractionation (MF), and iii) reverse transcriptase PCR (RTPCR). Data were analysed using censored linear regression and Bland-Altman techniques. Results: MF and RTPCR were similarly sensitive and specific, and both were superior to LM. Overall, approximately 20% of samples were gametocyte positive by LM, whereas gametocyte positivity by MF and by RTPCR was more than twofold this level. In the subset of samples collected prior to treatment, 29% of children were gametocyte positive by LM and 85% by MF and RTPCR. Conclusions: The present study represents the first direct comparison of standard LM, MF and RTPCR for gametocyte detection in field isolates. It provides strong evidence that MF is superior to LM and can detect gametocytaemic patients under field conditions with sensitivity and specificity similar to RTPCR.
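
The Bland-Altman analysis mentioned in the Methods reduces to a mean difference (bias) between paired measurements and its 95% limits of agreement. A minimal sketch with made-up paired values (the log10-density scale and the numbers are illustrative assumptions, not the study's data):

```python
# Bland-Altman agreement between two measurement methods: compute the mean
# of the pairwise differences (bias) and bias +/- 1.96 * SD of differences
# (the 95% limits of agreement).
import math

def bland_altman(a, b):
    """Return (bias, lower limit, upper limit) for paired measurements."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical log10 gametocyte densities from two methods (e.g. MF vs. RTPCR)
mf =    [1.2, 2.0, 0.8, 1.5, 2.4, 1.1]
rtpcr = [1.0, 2.1, 0.9, 1.4, 2.6, 1.0]
bias, lo, hi = bland_altman(mf, rtpcr)
print(f"bias={bias:.3f}, limits of agreement=({lo:.3f}, {hi:.3f})")
```

In the study itself this would be applied to log-transformed densities from each method pair, with censoring handled separately by the regression analysis.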

    Field and chirality effects on electrochemical charge transfer rates: Spin dependent electrochemistry

    This work examines whether electrochemical redox reactions are sensitive to electron spin orientation by examining the effects of magnetic field and molecular chirality on the charge transfer process. The working electrode is either a ferromagnetic nickel film or a nickel film coated with an ultrathin (5–30 nm) gold overlayer. The electrode is coated with a self-assembled monolayer that immobilizes a redox couple containing chiral molecular units, either the redox-active dye toluidine blue O with a chiral cysteine linking unit or cytochrome c. By varying the direction of magnetization of the nickel, toward or away from the adsorbed layer, we demonstrate that the electrochemical current depends on the orientation of the electrons' spin. In the case of cytochrome c, the spin selectivity of the reduction is extremely high: the reduction occurs mainly with electrons whose spin is aligned antiparallel to their velocity.

    A Sub-Microscopic Gametocyte Reservoir Can Sustain Malaria Transmission

    Novel diagnostic tools, including PCR and high field gradient magnetic fractionation (HFGMF), have improved detection of asexual Plasmodium falciparum parasites and, especially, infectious gametocytes in human blood. These techniques indicate that a significant number of people carry gametocyte densities below the conventional threshold of detection of standard light microscopy (LM). To determine how low-level gametocytemia may affect transmission under present large-scale efforts for P. falciparum control in endemic areas, we developed a refinement of the classical Ross-Macdonald model of malaria transmission, introducing multiple infective compartments to model the potential impact of highly prevalent, low-gametocytaemic reservoirs in the population. Models were calibrated using field-based data, and several numerical experiments were conducted to assess the effect of high and low gametocytemia on P. falciparum transmission and control. Special consideration was given to the impact of long-lasting insecticide-treated bed nets (LLIN), presently considered the most efficient way to prevent transmission, and particularly to LLIN coverage similar to the goals targeted by the Roll Back Malaria and Global Fund malaria control campaigns. Our analyses indicate that models which include only moderate-to-high gametocytemia (detectable by LM) predict finite eradication times after LLIN introduction, whereas models that include a low-gametocytemia reservoir (requiring PCR or HFGMF detection) predict much more stable, persistent transmission. Including submicroscopic gametocytes thus yields significantly different estimates of the level and duration of control needed to achieve malaria elimination. It will be very important to complement current methods of surveillance with enhanced diagnostic techniques for detecting asexual parasites and gametocytes, in order to more accurately plan, monitor and guide malaria control programs aimed at eliminating malaria.
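
The modelling idea, a Ross-Macdonald-style system extended with a second, low-gametocytemia infective compartment, can be sketched in a few lines. Everything below is an illustrative toy, not the authors' calibrated model: the compartment structure is simplified and every parameter value is an assumption.

```python
# Toy two-compartment extension of the Ross-Macdonald model: infections pass
# from a high-density stage (LM-detectable) to a long-lasting low-density
# stage (PCR/HFGMF-detectable) that still infects mosquitoes, albeit less.
# Forward-Euler integration; all rates are per day and purely illustrative.

def simulate(days=2000, dt=0.1, llin_effect=0.5,
             infect_low=0.3, recover_low=1/120):
    # Human infectious fractions (i_high, i_low); mosquito infectious fraction m.
    i_high, i_low, m = 0.05, 0.10, 0.02
    bite = 0.3 * (1 - llin_effect)   # biting rate, reduced by LLIN coverage
    b_hm = 0.5                        # mosquito -> human transmission prob.
    c_high = 0.5                      # human (high stage) -> mosquito infectivity
    recover_high = 1 / 15             # high-density stage lasts ~15 days
    mosq_death = 1 / 10               # mosquito lifespan ~10 days
    for _ in range(int(days / dt)):
        s = max(0.0, 1 - i_high - i_low)          # susceptible humans
        new_inf = bite * b_hm * m * s
        di_high = new_inf - recover_high * i_high
        di_low = recover_high * i_high - recover_low * i_low
        dm = bite * (c_high * i_high + infect_low * i_low) * (1 - m) \
             - mosq_death * m
        i_high += di_high * dt
        i_low += di_low * dt
        m += dm * dt
    return i_high, i_low, m

# With an infectious low-density reservoir, transmission persists under LLINs;
# with that infectivity switched off, prevalence collapses toward elimination.
with_reservoir = simulate(infect_low=0.3)
no_reservoir = simulate(infect_low=0.0)
print("with reservoir:", with_reservoir)
print("no reservoir:  ", no_reservoir)
```

The qualitative contrast mirrors the paper's conclusion: the same LLIN coverage that drives the LM-detectable-only model to elimination leaves stable transmission when the submicroscopic reservoir also infects mosquitoes.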

    Quality of life in patients treated with first-line antiretroviral therapy containing nevirapine or efavirenz in Uganda: A prospective non-randomized study

    © 2015 Mwesigire et al. Background: The goal of antiretroviral therapy (ART) is to suppress viral replication, reduce morbidity and mortality, and improve quality of life (QoL). For resource-limited settings, the World Health Organization recommends a first-line regimen of two nucleoside reverse-transcriptase inhibitors and one non-nucleoside reverse-transcriptase inhibitor (nevirapine (NVP) or efavirenz (EFV)). There are few data comparing the QoL impact of NVP versus EFV. This study assessed the change in QoL and factors associated with QoL among HIV patients receiving ART regimens based on EFV or NVP. Methods: We enrolled 640 people with HIV eligible for ART who received regimens including either NVP or EFV. QoL was assessed at baseline, three months and six months using Physical Health Summary (PHS) and Mental Health Summary (MHS) scores and the Global Person Generated Index (GPGI). Data were analyzed using generalized estimating equations, with ART regimen as the primary exposure, to identify associations between patient and disease factors and QoL. Results: QoL increased on ART. Mean QoL scores did not differ significantly between regimens based on NVP and EFV during follow-up for MHS and GPGI, regardless of CD4 stratum, and for PHS among patients with a CD4 count >250 cells/μL. The PHS-adjusted β coefficients for ART regimens based on EFV versus NVP by CD4 count strata were as follows: -1.61 (95% CI -2.74, -0.49) for CD4 count ≤ 250 cells/μL. The corresponding MHS-adjusted β coefficients were as follows: -0.39 (-1.40, 0.62) for CD4 ≤ 250 cells/μL. The GPGI-adjusted odds ratios for EFV versus NVP were 0.51 (0.25, 1.04) for CD4 count ≤ 250 cells/μL. QoL improved among patients on EFV over the 6-month follow-up period (MHS p

    Factors that affect quality of life among people living with HIV attending an urban clinic in Uganda: A cohort study

    © 2015 Mutabazi-Mwesigire et al. Introduction: With the availability of antiretroviral therapy (ART) and primary general care for people living with HIV (PLHIV) in resource-limited settings, PLHIV are living longer, and HIV has been transformed into a chronic illness. People are diagnosed and started on treatment when they are relatively well. Although ART results in clinical improvement, the ultimate goal of treatment is full physical functioning and general well-being, with a focus on quality of life rather than clinical outcomes. However, there has been little research on the relationship of specific factors to quality of life in PLHIV. The objective of this study was to investigate factors associated with quality of life among PLHIV in Uganda receiving basic care and those on ART. Methods: We enrolled 1274 patients attending an HIV outpatient clinic into a prospective cohort study. Of these, 640 received ART. All were followed up at 3 and 6 months. Health-related quality of life was assessed with the MOS-HIV Health Survey and the Global Person Generated Index (GPGI). Multivariate linear regression and logistic regression with generalized estimating equations were used to examine the relationship of social, behavioral, and disease factors with Physical Health Summary (PHS) score, Mental Health Summary (MHS) score, and GPGI. Results: Among PLHIV receiving basic care, PHS was associated with: sex (p=0.045) - females had lower PHS; age in years at enrollment (p=0.0001) - older patients had lower PHS; and depression (

    The ACTIVE cognitive training trial and predicted medical expenditures

    Background: Health care expenditures for older adults are disproportionately high and increasing at both the individual and population levels. We evaluated the effects of the three cognitive training interventions (memory, reasoning, or speed of processing) in the ACTIVE study on changes in predicted medical care expenditures. Methods: ACTIVE was a multisite randomized controlled trial of older adults (≥65). Five-year follow-up data were available for 1,804 of the 2,802 participants. Propensity score weighting was used to adjust for potential attrition bias. Changes in predicted annual medical expenditures were calculated at the first and fifth annual follow-up assessments using a new method for translating functional status scores. Multiple linear regression methods were used in this cost-offset analysis. Results: At one and five years post-training, annual predicted expenditures declined by $223 (p = .024) and $128 (p = .309), respectively, in the speed of processing treatment group, but there were no statistically significant changes in the memory or reasoning treatment groups compared to the no-contact control group at either period. Statistical adjustment for age, race, education, MMSE scores, ADL and IADL performance scores, EPT scores, chronic condition counts, and the SF-36 PCS and MCS scores at baseline did not alter the one-year ($244; p = .012) or five-year ($143; p = .250) expenditure declines in the speed of processing treatment group. Conclusion: The speed of processing intervention significantly reduced subsequent annual predicted medical care expenditures at the one-year post-baseline comparison, but annual savings were no longer statistically significant at the five-year post-baseline comparison.
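
The attrition adjustment described in the Methods rests on inverse-probability weighting: study completers are up-weighted by the inverse of their estimated probability of being retained, so completers who resemble dropouts count for more. ACTIVE estimated those probabilities with propensity scores from a richer covariate model; the stratum-based version below is a simplified stand-in with hypothetical data.

```python
# Inverse-probability-of-attrition weights, estimated within strata.
from collections import defaultdict

def attrition_weights(records):
    """records: list of (stratum, completed) pairs, completed in {0, 1}.
    Returns one weight per completer: 1 / P(completed | stratum)."""
    totals, completed = defaultdict(int), defaultdict(int)
    for stratum, done in records:
        totals[stratum] += 1
        completed[stratum] += done
    p_complete = {s: completed[s] / totals[s] for s in totals}
    return [1.0 / p_complete[s] for s, done in records if done]

# Hypothetical strata (e.g. age bands) with differential dropout:
# 8/10 retained in the younger band, 5/10 in the older band.
records = [("65-74", 1)] * 8 + [("65-74", 0)] * 2 \
        + [("75+", 1)] * 5 + [("75+", 0)] * 5
weights = attrition_weights(records)
print(sum(weights))  # weighted completers re-create the full sample size
```

The check that the weights sum to the original enrollment is the usual sanity test for this kind of adjustment; the weighted regression then proceeds on completers only.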

    Neurobiology of rodent self-grooming and its value for translational neuroscience

    Self-grooming is a complex innate behaviour with an evolutionarily conserved sequencing pattern and is one of the most frequently performed behavioural activities in rodents. In this Review, we discuss the neurobiology of rodent self-grooming, and we highlight studies of rodent models of neuropsychiatric disorders, including models of autism spectrum disorder and obsessive-compulsive disorder, that have assessed self-grooming phenotypes. We suggest that rodent self-grooming may be a useful measure of repetitive behaviour in such models, and therefore of value to translational psychiatry. Assessment of rodent self-grooming may also be useful for understanding the neural circuits involved in complex sequential patterns of action. National Institutes of Health (U.S.) grants NS025529, HD028341 and MH060379.

    Global Impact of the COVID-19 Pandemic on Cerebral Venous Thrombosis and Mortality

    Background and purpose: Recent studies suggested an increased incidence of cerebral venous thrombosis (CVT) during the coronavirus disease 2019 (COVID-19) pandemic. We evaluated the volume of CVT hospitalizations and in-hospital mortality during the first year of the COVID-19 pandemic compared to the preceding year. Methods: We conducted a cross-sectional retrospective study of 171 stroke centers in 49 countries. We recorded COVID-19 admission volumes, CVT hospitalizations, and CVT in-hospital mortality from January 1, 2019, to May 31, 2021. CVT diagnoses were identified by International Classification of Disease-10 (ICD-10) codes or stroke databases. We additionally compared the same metrics in the first 5 months of 2021 with the corresponding months of 2019 and 2020 (ClinicalTrials.gov Identifier: NCT04934020). Results: There were 2,313 CVT admissions across the pre-pandemic (2019) and pandemic (2020) years; no differences in CVT volume or CVT mortality were observed. During the first 5 months of 2021, CVT volumes increased compared to 2019 (27.5%; 95% confidence interval [CI], 24.2 to 32.0; P<0.0001) and 2020 (41.4%; 95% CI, 37.0 to 46.0; P<0.0001). A COVID-19 diagnosis was present in 7.6% (132/1,738) of CVT hospitalizations, and CVT was present in 0.04% (103/292,080) of COVID-19 hospitalizations. During the first pandemic year, CVT mortality was higher in COVID-19-positive than in COVID-19-negative patients (8/53 [15.0%] vs. 41/910 [4.5%], P=0.004). CVT mortality also increased in the first 5 months of pandemic years 2020 and 2021 compared to the first 5 months of pre-pandemic 2019 (2019 vs. 2020: 2.26% vs. 4.74%, P=0.05; 2019 vs. 2021: 2.26% vs. 4.99%, P=0.03). In the first 5 months of 2021, there were 26 cases of vaccine-induced immune thrombotic thrombocytopenia (VITT), resulting in six deaths.
    Conclusions: During the first year of the COVID-19 pandemic, CVT hospitalization volume and CVT in-hospital mortality did not change compared to the prior year. A COVID-19 diagnosis was associated with higher CVT in-hospital mortality. During the first 5 months of 2021, there was an increase in CVT hospitalization volume and in CVT-related mortality, partially attributable to VITT.
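
The headline proportions in the Results are simple ratios of the counts given in the abstract and can be checked directly:

```python
# Verify the reported percentages from the raw counts in the abstract:
# 132 of 1,738 CVT hospitalizations carried a COVID-19 diagnosis, and
# 103 of 292,080 COVID-19 hospitalizations included a CVT.

def pct(numerator, denominator, digits=2):
    """Percentage of numerator over denominator, rounded for reporting."""
    return round(100.0 * numerator / denominator, digits)

print(pct(132, 1738, 1))   # fraction of CVT admissions with COVID-19, in %
print(pct(103, 292080))    # fraction of COVID-19 admissions with CVT, in %
```

Both values round to the figures the abstract reports (7.6% and 0.04%); the confidence intervals and P values require the underlying center-level data and are not reproducible from the abstract alone.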