
    Cost-effectiveness of HIV screening of blood donations in Accra (Ghana)

    Objectives: Areas with high HIV-incidence rates compared to the developed world may benefit from additional testing in blood banks and may show more favorable cost-effectiveness ratios. We evaluated the cost-effectiveness of adding p24 antigen, mini-pool nucleic acid amplification testing (MP-NAT), or individual-donation NAT (ID-NAT) to the HIV-antibody screening at the Korle Bu Teaching Hospital (Accra, Ghana), where currently only HIV-antibody screening is undertaken. Methods: The residual risk of HIV transmission was derived from blood donations to the blood bank of the Korle Bu Teaching Hospital in 2004. Remaining life expectancies of patients receiving blood transfusion were estimated using the World Health Organization life expectancies. Cost-effectiveness ratios for adding the tests to HIV-antibody screening only were determined using a decision tree model and a Markov model for HIV. Results: The prevalence of HIV was estimated at 1.51% in 18,714 donations during 2004. The incremental cost per disability-adjusted life-year (DALY) averted was US$1237 for p24 antigen, US$3142 for MP-NAT, and US$7695 for ID-NAT, each compared to the next least expensive strategy. HIV-antibody screening itself was cost-saving compared to no screening at all, gaining US$73.85 and averting 0.86 DALY per transfused patient. Up to a willingness-to-pay of US$2736 per DALY averted, HIV-antibody screening without additional testing was the most cost-effective strategy. Over a willingness-to-pay of US$11,828 per DALY averted, ID-NAT was significantly more cost-effective than the other strategies. Conclusions: Adding p24 antigen, MP-NAT, or ID-NAT to the current antibody screening cannot be regarded as a cost-effective health-care intervention for Ghana.
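    The strategy ranking above rests on the incremental cost-effectiveness ratio (ICER): the extra cost of a strategy divided by the extra DALYs it averts, relative to the next least expensive strategy. A minimal sketch of that arithmetic, using purely hypothetical per-patient figures (not taken from the study):

```python
def icer(cost_new, cost_old, dalys_averted_new, dalys_averted_old):
    """Incremental cost per DALY averted of a new strategy versus a comparator."""
    return (cost_new - cost_old) / (dalys_averted_new - dalys_averted_old)

# Hypothetical figures for illustration only (not from the study):
# the new test costs US$200 vs US$100 per patient and averts 0.90 vs 0.82 DALY.
print(icer(200.0, 100.0, 0.90, 0.82))  # ~1250 US$ per DALY averted
```

    A strategy is then judged cost-effective when its ICER falls below the willingness-to-pay threshold, which is how the US$2736 and US$11,828 cut-offs in the abstract partition the strategies.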

    Key challenges in normal tissue complication probability model development and validation: towards a comprehensive strategy

    Normal Tissue Complication Probability (NTCP) models can be used for treatment plan optimisation and patient selection for emerging treatment techniques. We discuss and suggest methodological approaches to address key challenges in NTCP model development and validation, including missing data, non-linear response relationships, multicollinearity between predictors, overfitting, generalisability, and the prediction of multiple complication grades at multiple time points. The chosen approaches aim to improve the accuracy, transparency and robustness of future NTCP models. We demonstrate these approaches using clinical data.
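    A common functional form for NTCP models is a logistic dose-response curve. A minimal sketch with illustrative placeholder coefficients (the parameter names and values are assumptions, not fitted values from the paper):

```python
import math

def ntcp_logistic(dose_gy, b0=-6.0, b1=0.1):
    """Logistic NTCP curve: complication probability as a function of dose.
    b0 and b1 are illustrative placeholders, not fitted model coefficients."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * dose_gy)))
```

    With these placeholder values the curve crosses 50% at 60 Gy; in practice the coefficients are fitted to outcome data, and the challenges the abstract lists (missing data, multicollinearity, overfitting) all arise in that fitting step.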

    PEN: a low energy test of lepton universality

    Allowed charged π meson decays are characterized by simple dynamics, few available decay channels, mainly into leptons, and extremely well controlled radiative and loop corrections. In that sense, pion decays represent a veritable triumph of the standard model (SM) of elementary particles and interactions. This relative theoretical simplicity makes charged pion decays a sensitive means for testing the underlying symmetries and the universality of weak fermion couplings, as well as for studying pion structure and chiral dynamics. Even after considerable recent improvements, experimental precision is lagging far behind that of the theoretical description for pion decays. We review the current state of experimental study of the pion electronic decay π⁺ → e⁺ν_e(γ), or π_{e2(γ)}, where the (γ) indicates inclusion and explicit treatment of radiative decay events. We briefly review the limits on non-SM processes arising from the present level of experimental precision in π_{e2(γ)} decays. Focusing on the PEN experiment at the Paul Scherrer Institute (PSI), Switzerland, we examine the prospects for further improvement in the near term.
    Comment: 11 pages, 5 figures; paper presented at the XIII International Conference on Heavy Quarks and Leptons, 22-27 May 2016, Blacksburg, Virginia, US
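    The lepton-universality test hinges on the helicity-suppressed branching ratio R = Γ(π⁺ → e⁺ν)/Γ(π⁺ → μ⁺ν). At tree level, before the radiative corrections that bring the SM prediction to ≈ 1.2352 × 10⁻⁴, R follows from the particle masses alone; a quick numerical check of that textbook formula (not code from the PEN experiment):

```python
# Tree-level SM ratio Gamma(pi -> e nu) / Gamma(pi -> mu nu):
# helicity suppression (m_e/m_mu)^2 times a phase-space factor.
M_E, M_MU, M_PI = 0.000511, 0.105658, 0.139570  # masses in GeV (PDG values)

def r_e_mu_tree(m_e=M_E, m_mu=M_MU, m_pi=M_PI):
    return (m_e / m_mu) ** 2 * ((m_pi**2 - m_e**2) / (m_pi**2 - m_mu**2)) ** 2

print(r_e_mu_tree())  # ~1.28e-4 before radiative corrections
```

    The tiny value (~10⁻⁴) is what makes the electronic channel so sensitive: any non-universal coupling would show up as a deviation from this sharply predicted ratio.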

    Fatigue during acute systemic inflammation is associated with reduced mental effort expenditure while task accuracy is preserved

    BACKGROUND: Earlier work within the physical domain showed that acute inflammation changes motivational prioritization and effort allocation rather than physical abilities. It is currently unclear whether a similar motivational framework accounts for the mental fatigue and cognitive symptoms of acute sickness. Accordingly, this study aimed to assess the relationship between fatigue, cytokines and mental effort-based decision making during acute systemic inflammation. METHODS: Eighty-five participants (41 males; 18-30 years (M = 23.0, SD = 2.4)) performed a mental effort-based decision-making task before, 2 h after, and 5 h after intravenous administration of 1 ng/kg bacterial lipopolysaccharide (LPS) to induce systemic inflammation. Plasma concentrations of cytokines (interleukin (IL)-6, IL-8 and tumor necrosis factor (TNF)) and fatigue levels were assessed at the same timepoints. In the task, participants decided whether they wanted to perform (i.e., 'accepted') arithmetic calculations of varying difficulty (3 levels: easy, medium, hard) in order to obtain rewards (3 levels: 5, 6 or 7 points). Acceptance rates were analyzed using a binomial generalized estimating equation (GEE) approach with effort, reward and time as independent variables. Arithmetic performance was measured per effort level prior to the decisions and included as a covariate. Associations between acceptance rates, fatigue (self-reported) and cytokine concentrations were analyzed using partial correlation analyses. RESULTS: Plasma cytokine concentrations and fatigue were increased at 2 h post-LPS compared to baseline and 5 h post-LPS administration. Acceptance rates decreased for medium, but not for easy or hard effort levels at 2 h post-LPS versus baseline and 5 h post-LPS administration, irrespective of reward level. These reductions in acceptance rates occurred despite improved accuracy on the arithmetic calculations themselves. Reduced acceptance rates for medium effort were associated with increased fatigue, but not with increased cytokine concentrations. CONCLUSION: Fatigue during acute systemic inflammation is associated with alterations in mental effort allocation, similar to what has previously been observed for physical effort-based choice. Specifically, willingness to exert mental effort depended on effort and not reward information, while task accuracy was preserved. These results extend the motivational account of inflammation to the mental domain and suggest that inflammation may not necessarily affect domain-specific mental abilities, but rather domain-general effort-allocation processes.
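    The task's central measure, the acceptance rate per effort × reward × time cell, is a per-cell proportion over binary accept/reject decisions; the GEE then models these proportions while accounting for within-subject correlation. A minimal sketch of the aggregation step only (the trial data format is an assumption, not taken from the study):

```python
from collections import defaultdict

def acceptance_rates(trials):
    """Mean acceptance per (effort, reward, timepoint) cell.

    trials: iterable of (effort, reward, timepoint, accepted) tuples,
    where accepted is 0 or 1. Returns {cell: proportion accepted}."""
    counts = defaultdict(lambda: [0, 0])  # cell -> [n_accepted, n_total]
    for effort, reward, timepoint, accepted in trials:
        cell = counts[(effort, reward, timepoint)]
        cell[0] += accepted
        cell[1] += 1
    return {cell: acc / total for cell, (acc, total) in counts.items()}
```

    The study's key contrast corresponds to comparing, say, the ("medium", reward, "2h") cells against their baseline counterparts.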

    The Importance of Radiation Dose to the Atherosclerotic Plaque in the Left Anterior Descending Coronary Artery for Radiation-Induced Cardiac Toxicity of Breast Cancer Patients?

    IMPORTANCE: Radiation-induced acute coronary events (ACEs) may occur as a treatment-related late side effect of breast cancer (BC) radiation. However, the underlying mechanisms behind this radiation-induced cardiac disease remain to be determined. OBJECTIVE: The objective of this study was to test the hypothesis that radiation dose to calcified atherosclerotic plaques in the left anterior descending coronary artery (LAD) is a better predictor for ACEs than radiation dose to the whole heart or left ventricle in BC patients treated with radiotherapy (RT). DESIGN, SETTING, PARTICIPANTS, AND MAIN OUTCOMES AND MEASURES: The study cohort consisted of 910 BC patients treated with postoperative RT after breast-conserving surgery. In total, 163 patients had an atherosclerotic plaque in the LAD. The endpoint was the occurrence of an ACE after treatment. For each individual patient, the mean heart dose (MHD), the volume of the left ventricle receiving ≥5 Gy (LV-V5), the mean LAD dose and the mean dose to calcified atherosclerotic plaques in the LAD, if present, were acquired based on planning CT scans. Cox regression analysis was used to analyse the effects on the cumulative incidence of ACEs. RESULTS: The median follow-up time was 9.2 years (range: 0.1-14.3 years). In total, 38 patients (4.2%) developed an ACE during follow-up. For patients with an atherosclerotic plaque (n=163), the mean dose to the atherosclerotic plaque was the strongest predictor for ACE, even after correction for cardiovascular risk factors (HR: 1.269; 95% CI: 1.090-1.477; P=0.002). The LV-V5 was associated with ACEs in patients without atherosclerotic plaques in the LAD (n=680) (hazard ratio (HR): 1.021; 95% CI: 1.003-1.039; P=0.023). CONCLUSION AND RELEVANCE: The results of this study suggest that radiation dose to pre-existing calcified atherosclerotic plaques in the LAD is strongly associated with the development of ACEs in BC patients.
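    Under the proportional-hazards assumption of the Cox model, a hazard ratio per unit of a continuous dose variable compounds multiplicatively. A small sketch of that interpretation, assuming the reported HR of 1.269 is per Gy of mean plaque dose (the abstract does not state the unit explicitly):

```python
def relative_hazard(dose_gy, hr_per_gy=1.269):
    """Hazard relative to zero dose under proportional hazards.
    Assumes (not stated in the abstract) that HR 1.269 is per Gy of plaque dose."""
    return hr_per_gy ** dose_gy
```

    Under this reading, a 4 Gy mean plaque dose would imply a hazard roughly 1.269⁴ ≈ 2.6 times that at zero dose, which illustrates why even modest plaque doses could matter.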

    Acute symptoms during the course of head and neck radiotherapy or chemoradiation are strong predictors of late dysphagia

    Purpose: To determine whether acute symptoms during definitive radiotherapy (RT) or chemoradiation (CHRT) are prognostic factors for late dysphagia in head and neck cancer (HNC). Material and methods: This prospective cohort study consisted of 260 HNC patients who received definitive RT or CHRT. The primary endpoint was grade 2–4 swallowing dysfunction at 6 months after completing RT (SWALM6). During treatment, acute symptoms, including oral mucositis, xerostomia and dysphagia, were scored; the scores were accumulated weekly and entered into an existing reference model for SWALM6 that consisted of dose–volume variables only. Results: Both acute xerostomia and dysphagia were strong prognostic factors for SWALM6. When acute scores were added as variables to the reference model, model performance increased as the course of treatment progressed: the AUC rose from 0.78 at baseline to 0.85 in week 6. New models built for weeks 3–6 were significantly better able to identify patients with and without late dysphagia. Conclusion: Acute xerostomia and dysphagia during the course of RT are strong prognostic factors for late dysphagia. Including accumulated acute symptom scores on a weekly basis in prediction models for late dysphagia significantly improves the identification of high-risk and low-risk patients at an early stage during treatment and might facilitate individualized treatment adaptation.
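    The AUC values reported (0.78 rising to 0.85) are rank statistics: the probability that a randomly chosen patient who develops late dysphagia receives a higher predicted risk than a randomly chosen patient who does not. A minimal pairwise sketch of that standard definition (not the authors' code):

```python
def auc(scores_pos, scores_neg):
    """Pairwise AUC: probability that a positive case outranks a negative one,
    counting ties as half a win. scores_* are predicted risks per patient."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))
```

    An AUC of 0.5 means the model ranks no better than chance, so the reported rise from 0.78 to 0.85 reflects a genuine gain in discriminating high-risk from low-risk patients as acute-symptom scores accumulate.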