
    Differential transcriptomic and metabolic profiles of M. africanum- and M. tuberculosis-infected patients after, but not before, drug treatment.

    The epidemiology of Mycobacterium tuberculosis (Mtb) and M. africanum (Maf) suggests differences in their virulence, but the host immune profiles that could clarify the pathogenesis of tuberculosis (TB) have not been studied. We compared the transcriptomic and metabolic profiles of Mtb- and Maf-infected TB cases to identify host biomarkers associated with lineage-specific pathogenesis and response to anti-TB chemotherapy. Venous blood samples from Mtb- and Maf-infected patients obtained before and after anti-TB treatment were analyzed for cell composition, gene expression and metabolic profiles. Prior to treatment, transcriptomic profiles were similar in Maf- and Mtb-infected patients. In contrast, post treatment, over 1600 genes related to immune responses and metabolic diseases were differentially expressed between the groups. Notably, the upstream regulator hepatocyte nuclear factor 4-alpha (HNF4α), which regulated 15% of these genes, was markedly enriched. Serum metabolic profiles were similar in both groups pre-treatment, but the decline in pro-inflammatory metabolites post treatment was most pronounced in Mtb-infected patients. Together, the differences in peripheral blood transcriptomic and serum metabolic profiles between Maf- and Mtb-infected patients observed over the treatment period might be indicative of intrinsic host factors related to susceptibility to TB and/or differential efficacy of the standard anti-TB treatment against the two lineages.

    Rituximab in B-Cell Hematologic Malignancies: A Review of 20 Years of Clinical Experience

    Rituximab is a human/murine chimeric anti-CD20 monoclonal antibody with established efficacy and a favorable, well-defined safety profile in patients with various CD20-expressing lymphoid malignancies, including indolent and aggressive forms of B-cell non-Hodgkin lymphoma. Since its first approval 20 years ago, intravenously administered rituximab has revolutionized the treatment of B-cell malignancies and has become a standard component of care for follicular lymphoma, diffuse large B-cell lymphoma, chronic lymphocytic leukemia, and mantle cell lymphoma. For all of these diseases, clinical trials have demonstrated that rituximab not only prolongs the time to disease progression but also extends overall survival. Efficacy benefits have also been shown in patients with marginal zone lymphoma and in more aggressive diseases such as Burkitt lymphoma. Although the proven clinical efficacy and success of rituximab have led to the development of other anti-CD20 monoclonal antibodies in recent years (e.g., obinutuzumab, ofatumumab, veltuzumab, and ocrelizumab), rituximab is likely to maintain a position within the therapeutic armamentarium because it is well established with a long history of successful clinical use. Furthermore, a subcutaneous formulation of the drug has been approved in both the EU and the USA for the treatment of B-cell malignancies. Using the wealth of data published on rituximab during the last two decades, we review the preclinical development of rituximab and the clinical experience gained in the treatment of hematologic B-cell malignancies, with a focus on the well-established intravenous route of administration. This article is a companion paper to A. Davies et al., which is also published in this issue.

    Reducing Alaska Native paediatric oral health disparities: a systematic review of oral health interventions and a case study on multilevel strategies to reduce sugar-sweetened beverage intake

    Background. Tooth decay is the most common paediatric disease, and there is a serious paediatric tooth decay epidemic in Alaska Native communities. When untreated, tooth decay can lead to pain, infection, systemic health problems, hospitalisations and, in rare cases, death, as well as school absenteeism, poor grades and low quality of life. The extent to which population-based oral health interventions have been conducted in Alaska Native paediatric populations is unknown. Objective. To conduct a systematic review of oral health interventions aimed at Alaska Native children below age 18 and to present a case study and conceptual model on multilevel intervention strategies aimed at reducing sugar-sweetened beverage (SSB) intake among Alaska Native children. Design. Based on the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) Statement, the terms “Alaska Native”, “children” and “oral health” were used to search Medline, Embase, Web of Science, Google Scholar and health foundation websites (1970–2012) for relevant clinical trials and evaluation studies. Results. Eighty-five studies were found in the Medline, Embase and Web of Science databases, and there were 663 hits in Google Scholar. A total of nine publications were included in the qualitative review. These publications describe three interventions that focused on: reducing paediatric tooth decay by educating families and communities; providing dental chemotherapeutics to pregnant women; and training mid-level dental care providers. While these approaches have the potential to improve the oral health of Alaska Native children, there are unique challenges regarding intervention acceptability, reach and sustainability. A case study and conceptual model are presented on multilevel strategies to reduce SSB intake among Alaska Native children. Conclusions. Few oral health interventions have been tested within Alaska Native communities. Community-centred multilevel interventions are promising approaches to improve the oral and systemic health of Alaska Native children. Future investigators should evaluate the feasibility of implementing multilevel interventions and policies within Alaska Native communities as a way to reduce children's health disparities.

    Mild Joint Symptoms Are Associated with Lower Risk of Falls than Asymptomatic Individuals with Radiological Evidence of Osteoarthritis

    Osteoarthritis (OA) impairs skeletal muscle function, leading to postural instability and increased falls risk. However, the link between impaired physical function, OA and falls has not been elucidated. We investigated the role of impaired physical function as a potential mediator in the association between OA and falls. This study included 389 participants aged ≥65 years [229 fallers (≥2 falls or one injurious fall in the past 12 months) and 160 non-fallers (no history of falls)] from a randomized controlled trial, the Malaysian Falls Assessment and Intervention Trial (MyFAIT). Physical function was assessed using the Timed Up and Go (TUG) and Functional Reach (FR) tests. Knee and hip OA were diagnosed using three methods: clinical, radiological and self-report. OA symptom severity was assessed using the Western Ontario and McMaster Universities Arthritis Index (WOMAC), and the total WOMAC score was categorized as asymptomatic, mild, moderate or severe. Individuals with radiological OA and ‘mild’ overall symptoms on the WOMAC had a reduced risk of falls compared with those with asymptomatic OA [OR: 0.402 (0.172–0.940), p = 0.042]. Individuals with clinical OA and ‘severe’ overall symptoms had an increased risk of falls compared with those with ‘mild’ symptoms [OR: 4.487 (1.883–10.693), p = 0.005]. In individuals with radiological OA, mild symptoms appear protective against falls, while those with clinical OA and severe symptoms have an increased falls risk compared with those with mild symptoms. Neither relationship between OA and falls was mediated by physical limitations. Larger prospective studies are needed for further evaluation.

    The role of coronary artery calcification score in clinical practice

    Background. Coronary artery calcification (CAC) measured by electron-beam computed tomography (EBCT) has been well studied in the prediction of coronary artery disease (CAD). We sought to evaluate the impact of the CAC score on the diagnostic process immediately after its introduction in a large tertiary referral centre. Methods. 598 patients with no history of CAD who underwent EBCT for evaluation of CAD were retrospectively included in the study. Ischemia detection test results (exercise stress test, single photon emission computed tomography or ST segment analysis on 24-hour ECG), as well as the results of coronary angiography (CAG), were collected. Results. The mean age of the patients was 55 ± 11 years (57% male). Patients were divided according to CAC score: group A <10, group B 10–99, group C 100–399 and group D ≥400 (304, 135, 89 and 70 patients, respectively). Ischemia detection tests were performed in 531 (89%) patients; negative ischemia results were found in 362 patients (183 in group A, 87 in B, 58 in C, 34 in D). Eighty-eight percent of the patients in group D underwent CAG despite negative ischemia test results, against 6% in group A, 16% in group B and 29% in group C. A positive ischemia test was found in 74 patients (25 in group A, 17 in B, 16 in C, 16 in D). In group D, 88% (N = 14) of the patients with a positive ischemia test were referred for CAG, compared with 38–47% in groups A–C. Conclusion. Our study showed that patients with a high CAC score are more often referred for CAG. The CAC score can be used as an aid to further decision making in daily cardiology practice.
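    As a concrete illustration of the score grouping above, the sketch below maps a CAC score onto the study's four groups (A <10, B 10–99, C 100–399, D ≥400). This is a minimal Python illustration, not code from the study; the function name is ours.

    ```python
    # Illustrative sketch of the CAC score grouping described in the abstract
    # (group A <10, B 10-99, C 100-399, D >=400); the function name is ours.
    def cac_group(score: float) -> str:
        """Map a coronary artery calcification (CAC) score to the study's groups."""
        if score < 10:
            return "A"    # CAC <10
        elif score < 100:
            return "B"    # CAC 10-99
        elif score < 400:
            return "C"    # CAC 100-399
        return "D"        # CAC >=400

    print([cac_group(s) for s in (0, 57, 250, 812)])  # ['A', 'B', 'C', 'D']
    ```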

    Predicting implementation from organizational readiness for change: a study protocol

    Background. There is widespread interest in measuring organizational readiness to implement evidence-based practices in clinical care. However, there are a number of challenges to validating organizational measures, including inferential bias arising from the halo effect and method bias, two threats to validity that, while well documented by organizational scholars, are often ignored in health services research. We describe a protocol to comprehensively assess the psychometric properties of a previously developed survey, the Organizational Readiness to Change Assessment. Objectives. Our objective is to conduct a comprehensive assessment of the psychometric properties of the Organizational Readiness to Change Assessment, incorporating methods specifically designed to address threats from the halo effect and method bias. Methods and Design. We will conduct three sets of analyses using longitudinal, secondary data from four partner projects, each testing interventions to improve the implementation of an evidence-based clinical practice. Partner projects field the Organizational Readiness to Change Assessment at baseline (n = 208 respondents; 53 facilities) and prospectively assess the degree to which the evidence-based practice is implemented. We will conduct predictive and concurrent validity analyses using hierarchical linear modeling and multivariate regression, respectively. For predictive validity, the outcome is the change from baseline to follow-up in the use of the evidence-based practice. We will use intra-class correlations derived from hierarchical linear models to assess inter-rater reliability. Two partner projects will also field measures of job satisfaction for convergent and discriminant validity analyses, and will field Organizational Readiness to Change Assessment measures at follow-up for concurrent validity (n = 158 respondents; 33 facilities). Convergent and discriminant validity analyses will test associations between organizational readiness and different aspects of job satisfaction: satisfaction with leadership, which should be highly correlated with readiness, versus satisfaction with salary, which should be less correlated with readiness. Content validity will be assessed using an expert panel and a modified Delphi technique. Discussion. We propose a comprehensive protocol for validating a survey instrument for assessing organizational readiness to change that specifically addresses key threats of bias related to the halo effect, method bias and questions of construct validity that often go unexplored in research using measures of organizational constructs.
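    The inter-rater reliability step described above (intra-class correlations derived from hierarchical linear models) can be sketched briefly in code. The example below is a minimal illustration, not the study's analysis: it fits a random-intercept model and reports ICC(1), the share of total score variance that lies between facilities. The column names `readiness` and `facility` and the toy data are assumptions.

    ```python
    # Hypothetical sketch: ICC(1) from a random-intercept hierarchical linear model.
    import pandas as pd
    import statsmodels.formula.api as smf

    def facility_icc(df: pd.DataFrame) -> float:
        """Share of total variance in readiness scores attributable to facilities."""
        model = smf.mixedlm("readiness ~ 1", df, groups=df["facility"])
        result = model.fit(reml=True)
        between_var = float(result.cov_re.iloc[0, 0])  # facility-level variance
        within_var = float(result.scale)               # residual (respondent) variance
        return between_var / (between_var + within_var)

    # Toy data shaped like the survey: one readiness score per respondent.
    df = pd.DataFrame({
        "facility":  ["A", "A", "A", "B", "B", "B", "C", "C", "C"],
        "readiness": [3.2, 3.5, 3.1, 4.0, 4.2, 3.9, 2.5, 2.8, 2.6],
    })
    print(f"ICC(1) = {facility_icc(df):.2f}")
    ```
    A high ICC supports aggregating respondent scores to the facility level, which is the usual rationale for reporting inter-rater reliability in organizational surveys.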

    Male Germ Cell Apoptosis and Epigenetic Histone Modification Induced by Tripterygium wilfordii Hook F

    Multiglycosides of Tripterygium wilfordii Hook f (GTW), a Chinese herb-derived medicine used as a remedy for rheumatoid arthritis, are considered a reversible anti-fertility drug affecting mammalian spermatids. However, the mechanism behind this effect is still unknown. To study the possible mechanism behind the impact of GTW on spermatogenesis, we administered different doses of GTW to four groups of 4-week-old male mice. We found a dose-dependent decrease in the number of germ cells after 40 days of GTW treatment, and an increase in apoptotic cells from the low-dose to the high-dose group. Over the same period, the level of dimethylated histone H3 lysine 9 (H3K9me2) declined in germ cells from GTW-treated testes. Additionally, spermatogonial stem cells (SSCs) from 6-day-old mice were isolated to evaluate the possible effect of GTW or triptolide on SSC development. We found a significantly higher incidence of apoptosis and lower H3K9me2 levels in SSCs treated with GTW or triptolide than in controls. Thus, these data suggest that GTW-induced apoptosis might be responsible for the fertility impairment in mice, and that this damage can be traced back to the early stages of spermatogenesis. GTW also affected the epigenetic modification of H3K9 during spermatogenesis. Molecular dynamics simulation suggested that triptolide and dimethylated or trimethylated H3K9 might have similar mechanisms of interaction with EED (embryonic ectoderm development). These candidate activation mechanisms provide a first glimpse into the pathway of GTW-induced gonad toxicity, which is crucial for further research and clinical application.

    Variation in diabetes care by age: opportunities for customization of care

    BACKGROUND: The quality of diabetes care provided to older adults has usually been judged to be poor, but few data provide a direct comparison with other age groups. In this study, we hypothesized that adults aged 65 and over receive lower quality diabetes care than adults aged 45–64 years. METHODS: We conducted a cohort study of members of a health plan cared for by multiple medical groups in Minnesota. Study subjects were a random sample of 1109 adults aged 45 and over with an established diagnosis of diabetes, identified using a method with an estimated sensitivity of 0.91 and a positive predictive value of 0.94. Survey data (response rate 86.2%) and administrative databases were used to assess diabetes severity, glycemic control, quality of life, microvascular and macrovascular risks and complications, preventive care, utilization, and perceptions of diabetes. RESULTS: Compared with those aged 45–64 years (N = 627), those aged 65 and older (N = 482) had better glycemic control, better health-related behaviors, and perceived less adverse impact of diabetes on their quality of life, despite a longer duration of diabetes and a prevalence of cardiovascular disease twice that of the younger patients. Older patients did not ascribe heart disease to their diabetes. Younger adults often had explanatory models of diabetes that interfered with effective and aggressive care, and they accessed care less frequently. Overall, only 37% of patients were simultaneously up to date on eye exams, foot exams, and glycated hemoglobin (A1c) tests within one year. CONCLUSION: These data demonstrate the need for further improvement in diabetes care for all patients, and suggest that customization of care based on age and explanatory models of diabetes may be an improvement strategy that merits further evaluation.
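    The case-identification figures quoted above (sensitivity 0.91, positive predictive value 0.94) follow from a standard 2x2 confusion matrix. The sketch below is illustrative only; the counts are toy values chosen to roughly reproduce the reported figures, not data from the study.

    ```python
    # Illustrative only: sensitivity and positive predictive value (PPV)
    # of a case-identification method, computed from 2x2 confusion-matrix counts.
    def sensitivity(tp: int, fn: int) -> float:
        """Proportion of true diabetes cases that the method flags."""
        return tp / (tp + fn)

    def ppv(tp: int, fp: int) -> float:
        """Proportion of flagged patients who truly have diabetes."""
        return tp / (tp + fp)

    # Toy counts (not study data): 91 true positives, 9 false negatives, 6 false positives.
    tp, fn, fp = 91, 9, 6
    print(f"sensitivity = {sensitivity(tp, fn):.2f}, PPV = {ppv(tp, fp):.2f}")  # 0.91, 0.94
    ```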

    Acute Sleep Deprivation and Circadian Misalignment Associated with Transition onto the First Night of Work Impairs Visual Selective Attention

    Background: Overnight operations pose a challenge because our circadian biology promotes sleepiness and dissipates wakefulness at night. Since the circadian effect on cognitive functions is magnified by increasing sleep pressure, cognitive deficits associated with night work are likely to be most acute during extended wakefulness, such as during the transition from a day shift to a night shift. Methodology/Principal Findings: To test this hypothesis we measured selective attention (with visual search), vigilance (with the Psychomotor Vigilance Task [PVT]) and alertness (with a visual analog scale) in a shift work simulation protocol, which included four day shifts followed by three night shifts. There were nocturnal declines in cognitive processes, some of which were most pronounced on the first night shift. The nighttime decrease in visual search sensitivity was most pronounced on the first night compared with subsequent nights (p = .04), and it was accompanied by a trend towards selective attention becoming ‘fast and sloppy’. The nighttime increase in attentional lapses on the PVT was significantly greater on the first night than on subsequent nights (p < .05), indicating an impaired ability to sustain focus. The nighttime decrease in subjective alertness was also greatest on the first night compared with subsequent nights (p < .05). Conclusions/Significance: These nocturnal deficits in attention and alertness offer some insight into why occupational errors, accidents, and injuries are more pronounced during night work than day work. Examination of the nighttime vulnerabilities underlying the deployment of attention can inform the design of optimal work schedules and the implementation of effective countermeasures for performance deficits during night work.