    The effect of post-discharge educational intervention on patients in achieving objectives in modifiable risk factors six months after discharge following an episode of acute coronary syndrome, (CAM-2 Project): a randomized controlled trial

    Objectives: We investigated whether an intervention consisting mainly of a signed agreement between patient and physician on the objectives to be reached improves attainment of these secondary prevention objectives for modifiable cardiovascular risk factors six months after discharge following an acute coronary syndrome. Background: There is room to improve mid-term adherence to clinical guideline recommendations in coronary heart disease secondary prevention, especially the often neglected non-pharmacological ones. Methods: In CAM-2, patients discharged after an acute coronary syndrome were randomly assigned to the intervention or the usual care group. The primary outcome was reaching therapeutic objectives in various secondary prevention variables: smoking, obesity, blood lipids, blood pressure control, exercise and medication adherence. Results: 1757 patients were recruited in 64 hospitals and 1510 (762 in the intervention and 748 in the control group) attended the six-month follow-up visit. After adjustment for potentially important variables, there were differences between the intervention and control groups in the mean reduction of body mass index (0.5 vs. 0.2; p < 0.001) and waist circumference (1.6 cm vs. 0.6 cm; p = 0.05), in the proportion of patients who exercised regularly, and in the proportion with total cholesterol below 175 mg/dl (64.7% vs. 56.5%; p = 0.001). Reported intake of medication was high in both groups for all the drugs considered, with no differences except for statins (98.1% vs. 95.9%; p = 0.029). Conclusions: At least in the short term, lifestyle changes among coronary heart disease patients are achievable by intensifying the patients' own responsibility through a simple and feasible intervention
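
    A minimal sketch of the kind of covariate-adjusted between-group comparison reported above (for example, the reduction in body mass index) is shown below; the file name, column names and covariates are hypothetical, and the published CAM-2 analysis may have used a different model.

        # Hypothetical sketch: adjusted comparison of six-month BMI reduction
        # between the intervention and usual care groups (column names are assumed).
        import pandas as pd
        import statsmodels.formula.api as smf

        df = pd.read_csv("cam2_followup.csv")  # hypothetical data file

        # Linear model of BMI reduction adjusted for a few baseline covariates.
        model = smf.ols("bmi_reduction ~ group + age + sex + baseline_bmi", data=df).fit()
        print(model.summary())

        # Adjusted between-group difference (terms involving the group variable).
        print(model.params.filter(like="group"))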

    Comparison of treatment outcomes of new smear-positive pulmonary tuberculosis patients by HIV and antiretroviral status in a TB/HIV clinic, Malawi

    Background: Smear-positive pulmonary TB is the most infectious form of TB. Previous studies on the effect of HIV and antiretroviral therapy on TB treatment outcomes among these highly infectious patients have yielded conflicting results. Methods: All adult smear-positive pulmonary TB patients diagnosed between 2008 and 2010 in Malawi's largest public, integrated TB/HIV clinic were included in the study to assess treatment outcomes by HIV and antiretroviral therapy status using logistic regression. Results: Of 2,361 new smear-positive pulmonary TB patients, 86% had a successful treatment outcome (were cured or completed treatment), 5% died, 6% were lost to follow-up, 1% failed treatment, and 2% transferred out. Overall HIV prevalence was 56%. After adjusting for gender, age and TB registration year, treatment success was higher among HIV-negative than HIV-positive patients (adjusted odds ratio 1.49; 95% CI: 1.14-1.94). Of 1,275 HIV-infected pulmonary TB patients, 492 (38%) received antiretroviral therapy during the study. Pulmonary TB patients on antiretroviral therapy were more likely to have successful treatment outcomes than those not on ART (adjusted odds ratio: 1.83; 95% CI: 1.29-2.60). Conclusion: HIV co-infection was associated with poorer TB treatment outcomes. Despite high HIV prevalence and the integrated TB/HIV setting, only a minority of patients started antiretroviral therapy. Intensified patient education and provider training on the benefits of antiretroviral therapy could increase antiretroviral therapy uptake and improve TB treatment success among these most infectious patients. © 2013 Tweya et al
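
    The abstract states that treatment outcomes were assessed by HIV and antiretroviral therapy status using logistic regression adjusted for gender, age and TB registration year. A minimal sketch of such a model is given below; the file and column names are assumptions, not the authors' actual code.

        # Hypothetical sketch: adjusted odds ratio of treatment success by HIV status.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        tb = pd.read_csv("tb_outcomes.csv")  # hypothetical data file

        # success: 1 = cured or completed treatment, 0 = otherwise.
        fit = smf.logit("success ~ hiv_negative + gender + age + C(reg_year)", data=tb).fit()

        # Adjusted odds ratios with 95% confidence intervals.
        or_table = np.exp(pd.concat([fit.params, fit.conf_int()], axis=1))
        or_table.columns = ["OR", "2.5%", "97.5%"]
        print(or_table)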

    Delays in starting antiretroviral therapy in patients with HIV-associated tuberculosis accessing non-integrated clinical services in a South African township

    BACKGROUND: Delays in the initiation of antiretroviral therapy (ART) in patients with HIV-associated tuberculosis (TB) are associated with increased mortality risk. We examined the timing of ART among patients receiving care provided by non-integrated TB and ART services in Cape Town, South Africa. METHODS: In an observational cohort study, we determined the overall time delay between starting treatment for TB and starting ART in patients treated in Gugulethu township between 2002 and 2008. For patients referred from TB clinics to the separate ART clinic, we quantified and identified risk factors associated with the two component delays between starting TB treatment, enrolment in the ART clinic and subsequent initiation of ART. RESULTS: Among 893 TB patients studied (median CD4 count, 81 cells/μL), the delay between starting TB treatment and starting ART was prolonged (median, 95 days; IQR = 49-155). Delays were shorter in more recent calendar periods and among those with lower CD4 cell counts. However, the median delay was almost three-fold longer for patients referred from separate TB clinics compared to patients whose TB was diagnosed in the ART clinic (116 days versus 41 days, respectively; P < 0.001). In the most recent calendar period, the proportions of patients with CD4 cell counts < 50 cells/μL who started ART within 4 weeks of TB diagnosis were 11.1% for patients referred from TB clinics compared to 54.6% of patients with TB diagnosed in the ART service (P < 0.001). CONCLUSIONS: Delays in starting ART were prolonged, especially for patients referred from separate TB clinics. Non-integration of TB and ART services is likely to be a substantial obstacle to timely initiation of ART
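
    The key quantity in this study is the delay between starting TB treatment and starting ART, summarised as medians and interquartile ranges by diagnosis pathway. A small sketch of that computation is shown below; the file, date columns and pathway labels are hypothetical.

        # Hypothetical sketch: delay from TB treatment start to ART start, by referral pathway.
        import pandas as pd

        pts = pd.read_csv("art_tb_cohort.csv", parse_dates=["tb_start", "art_start"])

        pts["delay_days"] = (pts["art_start"] - pts["tb_start"]).dt.days

        def summarise(delays):
            # Median with interquartile range, as reported in the abstract.
            return pd.Series({"median": delays.median(),
                              "q1": delays.quantile(0.25),
                              "q3": delays.quantile(0.75)})

        # diagnosis_site: e.g. "TB clinic referral" vs. "diagnosed in ART clinic" (assumed labels).
        print(pts.groupby("diagnosis_site")["delay_days"].apply(summarise))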

    NfL and pNfH are increased in Friedreich's ataxia

    Objective: To assess neurofilaments as neurodegenerative biomarkers in serum of patients with Friedreich’s ataxia. Methods: Single molecule array measurements of neurofilament light chain (NfL) and phosphorylated neurofilament heavy chain (pNfH) in 99 patients with genetically confirmed Friedreich’s ataxia, and correlation of NfL/pNfH serum levels with disease severity, disease duration, age, age at onset, and GAA repeat length. Results: Median serum levels of NfL were 21.2 pg/ml (range 3.6–49.3) in controls and 26.1 pg/ml (0–78.1) in Friedreich’s ataxia (p = 0.002). pNfH levels were 23.5 pg/ml (13.3–43.3) in controls and 92 pg/ml (3.1–303) in Friedreich’s ataxia (p = 0.0004). NfL levels were significantly increased in younger patients (age 16–31 years, p < 0.001) and in patients aged 32–47 years (p = 0.008), but not in patients aged 48 years and older (p = 0.41). In a longitudinal assessment, there was no difference in NfL levels in 14 patients with repeated sampling 2 years after baseline measurement. Levels of NfL correlated inversely with GAA1 repeat length (r = −0.24, p = 0.02) but not with disease severity (r = −0.13, p = 0.22), disease duration (r = −0.06, p = 0.53), or age at onset (r = 0.05, p = 0.62). Conclusion: Serum levels of NfL and pNfH are elevated in Friedreich’s ataxia, but the differences from healthy controls decrease with increasing age. Long-term longitudinal data are required to explore whether this reflects a selection bias from early death of more severely affected individuals or a slowing of the neurodegenerative process with age. In a pilot study over 2 years of follow-up, a period relevant for biomarkers indicating treatment effects, we found NfL levels to be stable
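
    The analyses reported above are group comparisons of serum levels and correlations with clinical variables; a minimal illustrative sketch is below. The data file, column names and choice of non-parametric tests are assumptions, since the paper does not specify exactly these functions.

        # Hypothetical sketch: patient-control comparison and correlation with repeat length.
        import pandas as pd
        from scipy import stats

        d = pd.read_csv("nfl_frda.csv")  # assumed columns: group, nfl, gaa1

        patients = d.loc[d["group"] == "FRDA", "nfl"]
        controls = d.loc[d["group"] == "control", "nfl"]

        # Non-parametric comparison of serum NfL between patients and controls.
        u, p = stats.mannwhitneyu(patients, controls, alternative="two-sided")
        print(f"Mann-Whitney U = {u:.1f}, p = {p:.4f}")

        # Correlation of NfL with GAA1 repeat length within the patient group.
        rho, p_rho = stats.spearmanr(d.loc[d["group"] == "FRDA", "gaa1"], patients)
        print(f"rho = {rho:.2f}, p = {p_rho:.3f}")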

    Identification and validation of a QTL influencing bitter pit symptoms in apple (Malus x domestica)

    Bitter pit is one of the most economically important physiological disorders affecting apple fruit production, causing soft, discrete pitting of the cortical flesh of the apple fruits, which renders them unmarketable. The disorder is heritable; however, the environment and cultural practices play a major role in expression of symptoms. Bitter pit has been shown to be controllable to a certain extent using calcium sprays and dips; however, their use does not entirely prevent the incidence of the disorder. Previously, bitter pit has been shown to be controlled by two dominant genes, and markers on linkage group 16 of the apple genome were identified that were significantly associated with the expression of bitter pit symptoms in a genome-wide association study. In this investigation, we identified a major QTL for bitter pit defined by two microsatellite (SSR) markers. The association of the SSRs with the bitter pit locus, and their ability to predict severe symptom expression, was confirmed through screening of individuals with stable phenotypic expression from an additional mapping progeny. The data generated in the current study suggest a two-gene model could account for the control of bitter pit symptom expression; however, only one of the loci was detectable, most likely due to dominance of alleles carried by both parents of the mapping progeny used. The SSR markers identified are cost-effective, robust and multi-allelic and thus should prove useful for the identification of seedlings with resistance to bitter pit using marker-assisted selection in apple breeding programs
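
    As a simple illustration of the kind of marker-trait association underlying marker-assisted selection with these SSRs, the sketch below tests whether bitter pit severity differs across SSR genotype classes in a progeny. The file, column names and use of a Kruskal-Wallis test are illustrative assumptions; the study's actual QTL analysis is not reproduced here.

        # Hypothetical sketch: do SSR genotype classes differ in bitter pit severity?
        import pandas as pd
        from scipy import stats

        seedlings = pd.read_csv("mapping_progeny.csv")  # assumed columns: ssr_genotype, bitter_pit_score

        # Kruskal-Wallis test across SSR allele classes (severity scores treated as ordinal).
        groups = [g["bitter_pit_score"].values
                  for _, g in seedlings.groupby("ssr_genotype")]
        h, p = stats.kruskal(*groups)
        print(f"Kruskal-Wallis H = {h:.2f}, p = {p:.4g}")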

    Blood cultures in ambulatory outpatients

    BACKGROUND: Blood cultures are a gold standard, specific test for diagnosing many infections. However, their low yield may limit their usefulness, particularly in low-risk populations. This study was conducted to assess the utility of blood cultures drawn from ambulatory outpatients. METHODS: Blood cultures drawn at community-based collection sites in the Calgary Health Region (population 1 million) in 2001 and 2002 were included in this study. Patients were linked to acute care databases to assess their utilization of acute care facilities within 2 weeks of the blood culture draw. RESULTS: 3102 sets of cultures were drawn from 1732 ambulatory outpatients (annual rate = 89.4 per 100,000 population). Significant isolates were identified from 73 (2.4%) sets of cultures from 51 patients, including Escherichia coli in 18 (35%) and Staphylococcus aureus and Streptococcus pneumoniae in seven (14%) each. Compared to patients with negative cultures, those with positive cultures were older (mean 49.6 vs. 40.1 years, p < 0.01) and more likely to subsequently receive care at a regional emergency department, outpatient antibiotic clinic, or hospital (35/51 vs. 296/1681, p < 0.0001). Of the 331 (19%) patients who received acute care treatment, those with positive cultures presented sooner after the community culture draw (median 2 vs. 3 days, p < 0.01) and had a longer median treatment duration (6 vs. 2 days, p < 0.01). CONCLUSION: Blood cultures drawn in outpatient settings are uncommonly positive but may identify patients who warrant increased intensity of therapy. Strategies to reduce utilization without excluding patients with positive cultures need to be developed for this patient population
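
    The association between a positive culture and subsequent acute care contact can be re-examined directly from the counts quoted in the abstract (35/51 culture-positive vs. 296/1681 culture-negative patients). The sketch below is only a re-computation of that 2 x 2 comparison; the choice of Fisher's exact test is an assumption, not necessarily the authors' method.

        # Sketch: 2x2 comparison of acute care use by blood culture result,
        # using the counts quoted in the abstract.
        from scipy import stats

        table = [[35, 51 - 35],      # positive cultures: acute care yes / no
                 [296, 1681 - 296]]  # negative cultures: acute care yes / no

        odds_ratio, p = stats.fisher_exact(table)
        print(f"odds ratio = {odds_ratio:.1f}, p = {p:.2g}")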

    Defending the genome from the enemy within: mechanisms of retrotransposon suppression in the mouse germline

    The viability of any species requires that the genome is kept stable as it is transmitted from generation to generation by the germ cells. One of the challenges to transgenerational genome stability is the potential mutagenic activity of transposable genetic elements, particularly retrotransposons. There are many different types of retrotransposon in mammalian genomes, and these target different points in germline development to amplify and integrate into new genomic locations. Germ cells, and their pluripotent developmental precursors, have evolved a variety of genome defence mechanisms that suppress retrotransposon activity and maintain genome stability across the generations. Here, we review recent advances in understanding how retrotransposon activity is suppressed in the mammalian germline, how genes involved in germline genome defence mechanisms are regulated, and the consequences of mutating these genome defence genes for the developing germline

    Progenitor-Derivative Relationships of Hordeum Polyploids (Poaceae, Triticeae) Inferred from Sequences of TOPO6, a Nuclear Low-Copy Gene Region

    Polyploidization is a major mechanism of speciation in plants. Within the barley genus Hordeum, approximately half of the taxa are polyploids. While for diploid species a good hypothesis of phylogenetic relationships exists, there is little information available for the polyploids (4×, 6×) of Hordeum. Relationships among all 33 diploid and polyploid Hordeum species were analyzed with the low-copy nuclear marker region TOPO6 for 341 Hordeum individuals and eight outgroup species. PCR products were either directly sequenced or cloned, and on average 12 clones per individual were included in the phylogenetic analyses. In most diploid Hordeum species TOPO6 is probably a single-copy locus. Most sequences found in polyploid individuals phylogenetically cluster together with sequences derived from diploid species and thus allow the identification of parental taxa of polyploids. Four groups of sequences occurring only in polyploid taxa are interpreted as footprints of extinct diploid taxa, which contributed to allopolyploid evolution. Our analysis identifies three key species involved in the evolution of the American polyploids of the genus. (i) All but one of the American tetraploids have a TOPO6 copy originating from the Central Asian diploid H. roshevitzii, with the second copy clustering with different American diploid species. (ii) All hexaploid species from the New World have a copy derived from an extinct close relative of H. californicum and (iii) possess the TOPO6 sequence pattern of tetraploid H. jubatum, each with an additional copy derived from different American diploids. Tetraploid H. bulbosum is an autopolyploid, while the assumed autopolyploid H. brevisubulatum (4×, 6×) was identified as an allopolyploid throughout most of its distribution area. The use of a proof-reading DNA polymerase in PCR reduced the proportion of chimeric sequences in polyploids in comparison to Taq polymerase
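
    The core inference step described above assigns each cloned TOPO6 sequence from a polyploid to the diploid lineage it clusters with. As a toy illustration of that idea only, the sketch below picks the closest diploid reference by simple pairwise identity over a pre-aligned fragment; the actual study used phylogenetic analyses of full TOPO6 sequences, and the sequences and species labels here are invented.

        # Toy sketch: assign a cloned TOPO6 sequence from a polyploid to the closest
        # diploid reference by pairwise identity over a pre-aligned fragment.
        def identity(a: str, b: str) -> float:
            """Fraction of positions that match between two equal-length aligned sequences."""
            matches = sum(1 for x, y in zip(a, b) if x == y and x != "-")
            return matches / len(a)

        # Invented aligned reference fragments standing in for diploid species.
        references = {
            "H. roshevitzii": "ATGGCTACGT",
            "H. californicum": "ATGACTACTT",
        }

        clone = "ATGGCTACTT"  # invented clone from a tetraploid individual

        best = max(references, key=lambda sp: identity(references[sp], clone))
        print(best, identity(references[best], clone))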