
    Stress myocardial perfusion cardiac magnetic resonance imaging vs. coronary CT angiography in the diagnostic work-up of patients with stable chest pain: comparative effectiveness and costs

    Background: To determine the comparative effectiveness and costs of coronary CT angiography (CCTA) and stress cardiac magnetic resonance imaging (CMR) for diagnosing coronary artery disease (CAD). Methods: A Markov micro-simulation model for 60-year-old patients with stable chest pain was developed, analyzing the perspectives of the United States (US), the United Kingdom (UK), and the Netherlands (NL). CCTA, CMR, and CCTA+CMR (CCTA, if positive followed by CMR) were considered and compared to direct catheter-based angiography (CAG) and no testing. Each strategy was considered both as a conservative strategy (patients with mildly positive test results are not referred for CAG) and as an invasive strategy (all patients with positive test results are referred for CAG). Outcome measures included lifetime costs, quality-adjusted life years (QALYs), and radiation exposure. Results: Differences in effectiveness (QALYs) across diagnostic strategies were very small (range 0.001-0.016). For 60-year-old men and women with a pre-test probability of 30% (and up to 70-90%, depending on the country considered), the CCTA, CMR, and CAG strategies were dominated, because the CCTA+CMR-conservative strategy was slightly more effective and less expensive. Compared to the CCTA+CMR-conservative strategy, the CCTA+CMR-invasive strategy was slightly more costly and slightly more effective. The CCTA+CMR-invasive strategy was cost-effective for the US and NL, but not for the UK. When patients with false-negative test results were assumed to remain false-negative for 3 years, differences between strategies increased, and the CCTA-invasive strategy became cost-effective for the UK and NL. Conclusions: Quality-adjusted life expectancy was similar across strategies. The CCTA+CMR strategy was cost-effective up to a pre-test probability of 70-90%, depending on the country. Above these thresholds, the CMR strategy was cost-effective.
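    The abstract's comparisons rest on two standard health-economic concepts: a strategy is "dominated" when another strategy is both cheaper and more effective, and otherwise strategies are compared by their incremental cost-effectiveness ratio (ICER) against a willingness-to-pay threshold. A minimal sketch of that logic, with invented placeholder costs and QALYs (not the study's results):

```python
# Illustrative dominance/ICER logic; all numbers below are hypothetical.

def icer(costlier, cheaper):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY."""
    d_cost = costlier["cost"] - cheaper["cost"]
    d_qaly = costlier["qaly"] - cheaper["qaly"]
    return d_cost / d_qaly

strategies = {
    "CCTA":                  {"cost": 9200, "qaly": 14.580},
    "CCTA+CMR-conservative": {"cost": 8900, "qaly": 14.585},  # cheaper AND more effective
    "CCTA+CMR-invasive":     {"cost": 9400, "qaly": 14.595},
}

base = strategies["CCTA+CMR-conservative"]
# CCTA is "dominated": it costs more and yields fewer QALYs than the base strategy.
dominated = (strategies["CCTA"]["cost"] > base["cost"]
             and strategies["CCTA"]["qaly"] < base["qaly"])

# The invasive add-on is judged against a willingness-to-pay threshold per QALY.
ratio = icer(strategies["CCTA+CMR-invasive"], base)
cost_effective = ratio <= 100000  # e.g. a US-style $100,000/QALY threshold
```

    With these placeholder numbers the invasive strategy costs an extra 500 for an extra 0.010 QALYs, i.e. an ICER of 50,000 per QALY, which whether a country deems "cost-effective" depends on its national threshold; this is why the same strategy can be cost-effective in the US and NL but not the UK.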

    <i>TP53</i> hotspot mutations are predictive of survival in primary central nervous system lymphoma patients treated with combination chemotherapy

    Primary central nervous system lymphoma (PCNSL) is an aggressive variant of diffuse large B-cell lymphoma (DLBCL) confined to the CNS. TP53 mutations (MUT-TP53) were investigated in the context of MIR34A/B/C and DAPK promoter methylation status, and associated with clinical outcomes in PCNSL patients. In a total of 107 PCNSL patients, clinical data were recorded, histopathology was reassessed, and genetic and epigenetic aberrations of the p53-miR34-DAPK network were studied. TP53 mutational status (exons 5–8), with structural classification of single nucleotide variations according to the IARC-TP53-Database, methylation status of MIR34A/B/C and DAPK, and p53-protein expression were assessed. The 57/107 (53.2 %) patients who were treated with combination chemotherapy +/− rituximab (CCT-treated) had a significantly better median overall survival (OS) (31.3 months) than patients treated with other regimens (high-dose methotrexate/whole brain radiation therapy, 6.0 months, or no therapy, 0.83 months), P < 0.0001. TP53 mutations were identified in 32/86 (37.2 %), among which 12 patients had hotspot/direct DNA contact mutations. CCT-treated patients with PCNSL harboring a hotspot/direct DNA contact MUT-TP53 (n = 9) had a significantly worse OS and progression free survival (PFS) compared to patients with non-hotspot/non-direct DNA contact MUT-TP53 or wild-type TP53 (median PFS 4.6 versus 18.2 or 45.7 months), P = 0.041 and P = 0.00076, respectively. Multivariate Cox regression analysis confirmed that hotspot/direct DNA contact MUT-TP53 was predictive of poor outcome in CCT-treated PCNSL patients, P = 0.012 and P = 0.008; HR: 1.86 and 1.95, for OS and PFS, respectively. MIR34A, MIR34B/C, and DAPK promoter methylation were detected in 53/93 (57.0 %), 80/84 (95.2 %), and 70/75 (93.3 %) of the PCNSL patients, with no influence on survival. Combined MUT-TP53 and MIR34A methylation was associated with poor PFS (median 6.4 versus 38.0 months), P = 0.0070.
This study suggests that disruption of the p53 pathway by MUT-TP53 in hotspot/direct DNA contact codons is predictive of outcome in CCT-treated PCNSL patients, and that concomitant MUT-TP53 and MIR34A methylation is associated with poor PFS. ELECTRONIC SUPPLEMENTARY MATERIAL: The online version of this article (doi:10.1186/s40478-016-0307-6) contains supplementary material, which is available to authorized users.

    Mobile encounters: bus 5A as a cross-cultural meeting place

    The paper explores modes of encounter in the everyday practice of bus travel. In particular, it addresses cross-cultural encounters located in the tension between familiarity and difference, between inclusion and exclusion. The paper is situated in contemporary thought that approaches public transport not only as a moving device but also as a social arena. Furthermore, the bus is simultaneously perceived as a public space, at once composite, contradictory and heterogeneous, and as a meeting place involving ‘Throwntogetherness’. The encounters analysed are bodily, emotionally charged and outspoken meetings between passengers, with the socio-materiality of the bus, and with drivers as co-riders and gatekeepers.

    Telomeres and the natural lifespan limit in humans

    An ongoing debate in demography has focused on whether the human lifespan has a maximal natural limit. Taking a mechanistic perspective, and knowing that short telomeres are associated with diminished longevity, we examined whether telomere length dynamics during adult life could set a maximal natural lifespan limit. We define a leukocyte telomere length of 5 kb as the 'telomeric brink', which denotes a high risk of imminent death. We show that a subset of adults may reach the telomeric brink within the current life expectancy, and more so for a 100-year life expectancy. Thus, secular trends in life expectancy should confront a biological limit due to crossing the telomeric brink.
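    The "telomeric brink" argument amounts to a simple projection: given a baseline leukocyte telomere length (LTL) and an attrition rate, when does LTL cross 5 kb? A minimal sketch of that arithmetic, where the baseline length and the ~25 bp/year attrition rate are illustrative assumptions, not values from the paper:

```python
# Sketch of the "telomeric brink" projection under a constant attrition rate.
# Baseline LTL and the attrition rate are hypothetical, for illustration only.

BRINK_KB = 5.0  # leukocyte telomere length defined as the 'telomeric brink'

def years_to_brink(baseline_kb, attrition_bp_per_year):
    """Years until LTL falls to the 5 kb brink at a fixed attrition rate."""
    if baseline_kb <= BRINK_KB:
        return 0.0
    return (baseline_kb - BRINK_KB) * 1000 / attrition_bp_per_year

# e.g. an adult with 7 kb LTL at age 20, losing an assumed 25 bp/year:
years = years_to_brink(7.0, 25.0)   # (7 - 5) * 1000 / 25 = 80 years
age_at_brink = 20 + years           # would cross the brink around age 100
```

    Under these assumed numbers the brink is reached around age 100, which is the intuition behind the claim that a 100-year life expectancy would push more adults across it.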

    Design choices made by target users for a pay-for-performance program in primary care: an action research approach

    BACKGROUND: International interest in pay-for-performance (P4P) initiatives to improve the quality of health care is growing. Current programs vary in their methods of performance measurement, appraisal and reimbursement. One may assume that involving health care professionals in the goal setting and methods of quality measurement and subsequent payment schemes may enhance their commitment to and motivation for P4P programs, and therefore the impact of these programs. We developed a P4P program in which the target users were involved in decisions about the P4P methods. METHODS: For the development of the P4P program a framework was used which distinguished three main components: performance measurement, appraisal and reimbursement. Based on this framework, design choices were discussed in two panels of target users using an adapted Delphi procedure. The target users were 65 general practices and two health insurance companies in the south of the Netherlands. RESULTS: Performance measurement was linked to the Dutch accreditation program based on three domains (clinical care, practice management and patient experience). The general practice was chosen as the unit of assessment. Relative standards were set at the 25th percentile of group performance. The incentive for clinical care was set twice as high as the one for practice management and patient experience. Quality scores were to be calculated separately for all three domains, and for both the quality level and the improvement of performance. The incentive for quality level was set thrice as high as the one for the improvement of performance. For reimbursement, quality scores were divided into seven levels. A practice with a quality score in the lowest group was not supposed to receive a bonus. The additional payment grew proportionally for each extra group. The bonus aimed at was on average 5% to 10% of the practice income.
CONCLUSIONS: Designing a P4P program for primary care with involvement of the target users gave us insight into their motives, which can help others who need to discuss similar programs. The resulting program is in line with the target users' views and their assessments of relevance and applicability. This may enhance their commitment to the program, as indicated by the growing number of voluntary participants after a successfully performed field test during the procedure. The elements of our framework can be very helpful for others who are developing or evaluating a P4P program.
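    The scoring and reimbursement rules above can be sketched in a few lines. The weightings follow the abstract (clinical care counted twice as heavily as each other domain; quality level counted three times as heavily as improvement; seven bonus groups with no bonus for the lowest), but the 0-1 score scale, the example scores and the bonus amount are invented for illustration:

```python
# Illustrative sketch of the P4P scoring described in the abstract.
# Weight ratios follow the text; the scales and amounts are hypothetical.

def composite_score(level, improvement):
    """Combine per-domain scores (each 0-1) using the abstract's weightings."""
    def weigh(domains):
        # clinical care weighted twice as high as practice management / patient experience
        return (2 * domains["clinical"] + domains["practice"] + domains["patient"]) / 4
    # quality level weighted thrice as high as improvement of performance
    return (3 * weigh(level) + weigh(improvement)) / 4

def bonus(score, max_bonus=10000.0, levels=7):
    """Map a 0-1 composite score onto seven groups; the lowest group pays nothing,
    and the payment grows proportionally for each extra group."""
    group = min(int(score * levels), levels - 1)   # group 0..6
    return max_bonus * group / (levels - 1)

level = {"clinical": 0.8, "practice": 0.6, "patient": 0.6}
improvement = {"clinical": 0.4, "practice": 0.4, "patient": 0.4}
payment = bonus(composite_score(level, improvement))
```

    With these example scores the composite works out to 0.625, landing in group 4 of 6 and so earning four-sixths of the maximum bonus; a practice in group 0 would receive nothing, matching the design choice above.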

    Test performance of faecal occult blood testing for the detection of bowel cancer in people with chronic kidney disease (DETECT) protocol

    Background: Cancer is a major cause of mortality and morbidity in patients with chronic kidney disease (CKD). In patients without kidney disease, screening is a major strategy for reducing the risk of cancer and improving health outcomes for those who develop cancer, by detecting treatable cancers at an early stage. Among those with CKD, the effectiveness, the efficacy and patients' preferences for cancer screening are unknown. Methods/Design: This work describes the protocol for the DETECT study, which examines the effectiveness, efficiency and patients' perspectives on colorectal cancer screening using immunochemical faecal occult blood testing (iFOBT) for people with CKD. The aims of the DETECT study are 1) to determine the test performance characteristics of iFOBT screening in individuals with CKD, 2) to estimate the incremental costs and health benefits of iFOBT screening in CKD compared to no screening, and 3) to elicit patients' perspectives on colorectal cancer screening in the CKD population. Three different study designs will be used to explore the uncertainties surrounding colorectal cancer screening in CKD. A diagnostic test accuracy study of iFOBT screening will be conducted across all stages of CKD in patients aged 35-70. Using individually collected direct healthcare costs and outcomes from the diagnostic test accuracy study, cost-utility and cost-effectiveness analyses will be performed to estimate the costs and health benefits of iFOBT screening in CKD. Qualitative in-depth interviews will be undertaken in a subset of participants from the diagnostic test accuracy study to investigate the perspectives, experiences, attitudes and beliefs about colorectal cancer screening among individuals with CKD. Discussion: The DETECT study will target the three major unknowns about early cancer detection in CKD.
Findings from our study will provide accurate and definitive estimates of screening efficacy and efficiency for colorectal cancer, and will allow better service planning and budgeting for early cancer detection in this at-risk population. The DETECT study is registered with the Australia New Zealand Clinical Trials Registry: ACTRN12611000538943 (http://www.anzctr.org.au/ACTRN12611000538943.aspx).

    A systematic review of rodent pest research in Afro-Malagasy small-holder farming systems: Are we asking the right questions?

    Rodent pests are especially problematic for agriculture and public health, since they can inflict considerable economic damage associated with their abundance, diversity, generalist feeding habits and high reproductive rates. To quantify rodent pest impacts and identify trends in rodent pest research affecting small-holder agriculture in the Afro-Malagasy region, we conducted a systematic review of research outputs from 1910 to 2015, developing an a priori defined set of criteria to allow replication of the review process. We followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. We reviewed 162 publications, and while rodent pest research was spatially distributed across Africa (32 countries, including Madagascar), there was a disparity in the number of studies per country, with research biased towards four countries (Tanzania [25%], Nigeria [9%], Ethiopia [9%], Kenya [8%]) accounting for 51% of all rodent pest research in the Afro-Malagasy region. There was also a disparity in the research themes addressed: publications from Tanzania had a much more applied focus (50%) compared to the predominantly basic research approach (92%) in the rest of the Afro-Malagasy region. We found that pest rodents have a significant negative effect on Afro-Malagasy small-holder farming communities. Crop losses varied across cropping stages, storage and crop types, with the highest losses occurring during early cropping stages (46% median loss during the seedling stage) and the mature stage (15% median loss). There was a scarcity of studies investigating the effectiveness of management actions on rodent pest damage and population abundance. Our analysis highlights that there are inadequate empirical studies focused on developing sustainable control methods for rodent pests, and that rodent pests in the Afro-Malagasy context are generally ignored as a research topic.