
    CKD classification based on estimated GFR over three years and subsequent cardiac and mortality outcomes: a cohort study

    Background: It is unknown whether defining chronic kidney disease (CKD) based on one versus two estimated glomerular filtration rate (eGFR) assessments changes the prognostic importance of reduced eGFR in a community-based population. Methods: Participants in the Atherosclerosis Risk in Communities Study and the Cardiovascular Health Study were classified into four groups based on two eGFR assessments separated by 35.3 ± 2.5 months: sustained eGFR < 60 mL/min per 1.73 m² (1 mL/sec per 1.73 m²); eGFR increase (change from below to above 60); eGFR decline (change from above to below 60); and eGFR persistently ≥ 60. Outcomes assessed in stratified multivariable Cox models included cardiac events and a composite of cardiac events, stroke, and mortality. Results: There were 891 (4.9%) participants with sustained eGFR < 60, 278 (1.5%) with eGFR increase, 972 (5.4%) with eGFR decline, and 15,925 (88.2%) with eGFR persistently ≥ 60. Participants with eGFR sustained < 60 were at highest risk of cardiac and composite events [HR = 1.38 (1.15, 1.65) and 1.58 (1.41, 1.77), respectively], followed by eGFR decline [HR = 1.20 (1.00, 1.45) and 1.32 (1.17, 1.49)]. Individuals with eGFR increase trended toward increased cardiac risk [HR = 1.25 (0.88, 1.77)] and did not significantly differ from eGFR decline for any outcome. Results were similar when estimating GFR with the CKD-EPI equation. Conclusion: Individuals with persistently reduced eGFR are at highest risk of cardiovascular outcomes and mortality, while individuals with an eGFR < 60 mL/min per 1.73 m² at any time are at intermediate risk. Even a single measurement of eGFR used to classify CKD in a community population appears to have prognostic value.
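    The four-group classification described above amounts to a simple decision rule over the two eGFR assessments. The sketch below is illustrative only: the function name, group labels, and the handling of values exactly at 60 (counted as ≥ 60) are assumptions, not taken from the study.

    ```python
    # eGFR cutoff defining reduced kidney function, in mL/min per 1.73 m^2.
    THRESHOLD = 60.0

    def classify_egfr(first: float, second: float) -> str:
        """Classify a participant from two eGFR assessments ~3 years apart.

        Illustrative only: labels and boundary handling (exactly 60 counted
        as >= 60) are assumptions, not specified by the abstract.
        """
        if first < THRESHOLD and second < THRESHOLD:
            return "sustained < 60"
        if first < THRESHOLD <= second:
            return "eGFR increase"
        if second < THRESHOLD <= first:
            return "eGFR decline"
        return "persistently >= 60"

    # A participant whose eGFR fell from 72 to 55 lands in the decline group.
    print(classify_egfr(72, 55))  # eGFR decline
    ```

    Under this rule, each participant falls into exactly one of the four groups, matching the counts reported in the Results.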

    Optical Trapping of an Ion

    For several decades, ions have been trapped by radio-frequency (RF) fields and neutral particles by optical fields. We present an experimental proof of principle for trapping an ion in an optical dipole trap. Loading, initialization, and final detection are performed in an RF trap; in between, the RF trap is completely disabled and substituted by the optical trap. The measured lifetime of milliseconds allows for hundreds of oscillations within the optical potential; it is mainly limited by heating due to photon scattering. In future experiments the lifetime may be increased by further detuning the laser and by additional cooling of the ion. We thus demonstrate the prerequisite for merging both trapping techniques in hybrid setups, up to the point of trapping ions and atoms in the same optical potential.

    Expression of Foxp3 in colorectal cancer but not in Treg cells correlates with disease progression in patients with colorectal cancer

    Background: Regulatory T cells (Treg) expressing the transcription factor forkhead-box protein P3 (Foxp3) have been identified as counteracting anti-tumor immune responses during tumor progression. In addition, Foxp3 expression by cancer cells themselves may allow them to evade effector T-cell responses, resulting in a survival benefit for the tumor. For colorectal cancer (CRC), the clinical relevance of Foxp3 has not been evaluated in detail. The aim of this study was therefore to assess its impact in CRC. Methods and Findings: Gene and protein analyses of tumor tissues from patients with CRC were performed to quantify the expression of Foxp3 in tumor-infiltrating Treg cells and colon cancer cells. The results were correlated with clinicopathological parameters and patients' overall survival. Serial morphological analysis demonstrated Foxp3 to be expressed in cancer cells. High Foxp3 expression in cancer cells was associated with poor prognosis compared to low Foxp3 expression. In contrast, low and high Foxp3 levels in tumor-infiltrating Treg cells showed no significant difference in overall patient survival. Conclusions: Our findings strongly suggest that Foxp3 expression by cancer cells, rather than by Treg cells, contributes to disease progression.

    Rituximab in B-Cell Hematologic Malignancies: A Review of 20 Years of Clinical Experience

    Rituximab is a human/murine chimeric anti-CD20 monoclonal antibody with established efficacy and a favorable, well-defined safety profile in patients with various CD20-expressing lymphoid malignancies, including indolent and aggressive forms of B-cell non-Hodgkin lymphoma. Since its first approval 20 years ago, intravenously administered rituximab has revolutionized the treatment of B-cell malignancies and has become a standard component of care for follicular lymphoma, diffuse large B-cell lymphoma, chronic lymphocytic leukemia, and mantle cell lymphoma. For all of these diseases, clinical trials have demonstrated that rituximab not only prolongs the time to disease progression but also extends overall survival. Efficacy benefits have also been shown in patients with marginal zone lymphoma and in more aggressive diseases such as Burkitt lymphoma. Although the proven clinical efficacy and success of rituximab have led to the development of other anti-CD20 monoclonal antibodies in recent years (e.g., obinutuzumab, ofatumumab, veltuzumab, and ocrelizumab), rituximab is likely to maintain its position within the therapeutic armamentarium because it is well established and has a long history of successful clinical use. Furthermore, a subcutaneous formulation of the drug has been approved in both the EU and the USA for the treatment of B-cell malignancies. Drawing on the wealth of data published on rituximab during the last two decades, we review the preclinical development of rituximab and the clinical experience gained in the treatment of hematologic B-cell malignancies, with a focus on the well-established intravenous route of administration. This article is a companion paper to A. Davies et al., also published in this issue.

    2019 international consensus on cardiopulmonary resuscitation and emergency cardiovascular care science with treatment recommendations : summary from the basic life support; advanced life support; pediatric life support; neonatal life support; education, implementation, and teams; and first aid task forces

    The International Liaison Committee on Resuscitation (ILCOR) has initiated a continuous review of new, peer-reviewed, published cardiopulmonary resuscitation science. This is the third annual summary of the ILCOR International Consensus on Cardiopulmonary Resuscitation and Emergency Cardiovascular Care Science With Treatment Recommendations. It addresses the most recent published resuscitation evidence reviewed by ILCOR Task Force science experts. This summary addresses the role of cardiac arrest centers and dispatcher-assisted cardiopulmonary resuscitation, the role of extracorporeal cardiopulmonary resuscitation in adults and children, vasopressors in adults, advanced airway interventions in adults and children, targeted temperature management in children after cardiac arrest, initial oxygen concentration during resuscitation of newborns, and interventions for presyncope by first aid providers. Members from six ILCOR task forces have assessed, discussed, and debated the certainty of the evidence on the basis of the Grading of Recommendations, Assessment, Development, and Evaluation (GRADE) criteria, and their statements include consensus treatment recommendations. Insights into the deliberations of the task forces are provided in the Justification and Evidence to Decision Framework Highlights sections. The task forces also listed priority knowledge gaps for further research.

    Fishing the Molecular Bases of Treacher Collins Syndrome

    Treacher Collins syndrome (TCS) is an autosomal dominant disorder of craniofacial development, and mutations in the TCOF1 gene are responsible for over 90% of TCS cases. Knowledge of the molecular mechanisms underlying this syndrome is relatively scant, probably due to the difficulty of reproducing the pathology in experimental animals. Zebrafish is an emerging model for human disease studies, and we therefore assessed it as a model for studying TCS. We identified in silico the putative zebrafish TCOF1 ortholog and cloned the corresponding cDNA. The derived polypeptide shares the main structural domains found in mammals and amphibians. Tcof1 expression is restricted to the anterior-most regions of developing zebrafish embryos, similar to what occurs in mouse embryos. Tcof1 loss-of-function resulted in fish showing phenotypes similar to those observed in TCS patients and enabled a further characterization of the mechanisms underlying craniofacial malformation. In addition, we initiated the identification of potential molecular targets of treacle in zebrafish, and found that Tcof1 loss-of-function led to a decrease in the expression of genes associated with cellular proliferation and craniofacial development. Together, the results presented here strongly suggest that fish with a TCS-like phenotype can be obtained by knocking down the expression of the TCOF1 ortholog in zebrafish. This experimental model may facilitate the study of the disease etiology during embryonic development.

    Mild Joint Symptoms Are Associated with Lower Risk of Falls than Asymptomatic Individuals with Radiological Evidence of Osteoarthritis

    Osteoarthritis (OA) impairs skeletal muscle functioning, leading to postural instability and increased falls risk. However, the link between impaired physical function, OA, and falls has not been elucidated. We investigated the role of impaired physical function as a potential mediator in the association between OA and falls. This study included 389 participants aged ≥65 years [229 fallers (≥2 falls or one injurious fall in the past 12 months) and 160 non-fallers (no history of falls)] from a randomized controlled trial, the Malaysian Falls Assessment and Intervention Trial (MyFAIT). Physical function was assessed using the Timed Up and Go (TUG) and Functional Reach (FR) tests. Knee and hip OA were diagnosed using three methods: clinical, radiological, and self-report. OA symptom severity was assessed using the Western Ontario and McMaster Universities Arthritis Index (WOMAC), with the total WOMAC score categorized as asymptomatic, mild, moderate, or severe. Individuals with radiological OA and 'mild' overall symptoms on the WOMAC had a reduced risk of falls compared to those with asymptomatic OA [OR: 0.402 (0.172–0.940), p = 0.042]. Individuals with clinical OA and 'severe' overall symptoms had an increased risk of falls compared to those with 'mild' symptoms [OR: 4.487 (1.883–10.693), p = 0.005]. In individuals with radiological OA, mild symptoms appear protective against falls, while those with clinical OA and severe symptoms have increased falls risk compared to those with mild symptoms. Neither relationship between OA and falls was mediated by physical limitations. Larger prospective studies are needed for further evaluation.

    Health care costs, utilization and patterns of care following Lyme disease

    BACKGROUND: Lyme disease is the most frequently reported vector-borne infection in the United States. The Centers for Disease Control have estimated that approximately 10% to 20% of individuals may experience Post-Treatment Lyme Disease Syndrome (PTLDS), a set of symptoms including fatigue, musculoskeletal pain, and neurocognitive complaints that persist after initial antibiotic treatment of Lyme disease. Little is known about the impact of Lyme disease or PTLDS on health care costs and utilization in the United States. OBJECTIVES: 1) to examine the impact of Lyme disease on health care costs and utilization; 2) to understand the relationship between Lyme disease and the probability of developing PTLDS; 3) to understand how PTLDS may impact health care costs and utilization. METHODS: This study utilizes retrospective data on medical claims and member enrollment for persons aged 0-64 years who were enrolled in commercial health insurance plans in the United States between 2006 and 2010. 52,795 individuals treated for Lyme disease were compared to 263,975 matched controls with no evidence of Lyme disease exposure. RESULTS: Lyme disease is associated with $2,968 higher total health care costs (95% CI: $2,807-$3,128, p<.001) and 87% more outpatient visits (95% CI: 86%-89%, p<.001) over a 12-month period, and with 4.77 times greater odds of having any PTLDS-related diagnosis, as compared to controls (95% CI: 4.67-4.87, p<.001). Among those with Lyme disease, having one or more PTLDS-related diagnoses is associated with $3,798 higher total health care costs (95% CI: $3,542-$4,055, p<.001) and 66% more outpatient visits (95% CI: 64%-69%, p<.001) over a 12-month period, relative to those with no PTLDS-related diagnoses. CONCLUSIONS: Lyme disease is associated with costs above what would be expected for an easy-to-treat infection, and the presence of PTLDS-related diagnoses after treatment is associated with significant additional health care costs and utilization.

    A theory of organizational readiness for change

    Background: Change management experts have emphasized the importance of establishing organizational readiness for change and recommended various strategies for creating it. Although the advice seems reasonable, the scientific basis for it is limited. Unlike individual readiness for change, organizational readiness for change has not been subject to extensive theoretical development or empirical study. In this article, I conceptually define organizational readiness for change and develop a theory of its determinants and outcomes. I focus on the organizational level of analysis because many promising approaches to improving healthcare delivery entail collective behavior change in the form of systems redesign, that is, multiple, simultaneous changes in staffing, work flow, decision making, communication, and reward systems. Discussion: Organizational readiness for change is a multi-level, multi-faceted construct. As an organization-level construct, readiness for change refers to organizational members' shared resolve to implement a change (change commitment) and shared belief in their collective capability to do so (change efficacy). Organizational readiness for change varies as a function of how much organizational members value the change and how favorably they appraise three key determinants of implementation capability: task demands, resource availability, and situational factors. When organizational readiness for change is high, organizational members are more likely to initiate change, exert greater effort, exhibit greater persistence, and display more cooperative behavior. The result is more effective implementation. Summary: The theory described in this article treats organizational readiness as a shared psychological state in which organizational members feel committed to implementing an organizational change and confident in their collective abilities to do so. This way of thinking about organizational readiness is best suited for examining organizational changes where collective behavior change is necessary in order to effectively implement the change and, in some instances, for the change to produce anticipated benefits. Testing the theory would require further measurement development and careful sampling decisions. The theory offers a means of reconciling the structural and psychological views of organizational readiness found in the literature. Further, the theory suggests the possibility that the strategies that change management experts recommend are equifinal; that is, there is no 'one best way' to increase organizational readiness for change.