
    Self-report Versus Measured Physical Activity Levels During Outpatient Cardiac Rehabilitation

    Purpose: Many patients with coronary artery disease (CAD) do not achieve the recommended physical activity (PA) levels during and after cardiac rehabilitation (CR). The aim of this study was to analyze moderate to vigorous physical activity (MVPA) levels and the differences between perceived (self-reported) and measured (activity monitor) MVPA in CAD patients during CR. The second aim was to analyze which patient characteristics were associated with this difference. Methods: A two-center observational cross-sectional study was conducted within the Department of Rehabilitation Medicine of the University Medical Center Groningen between January and April 2018. Adults with CAD, following an outpatient CR program, were included. Perceived MVPA was assessed with the Short Questionnaire to Assess Health-enhancing Physical Activity and compared with ActivPAL3 activity monitor outcomes over a period of 7 d. Results: Fifty-one patients with CAD (age 59.4 ± 7.1 yr, eight females) were recruited. Four patients (8%) did not achieve the recommended guideline level of ≥150 min/wk of MVPA. Patients spent ≥80% of the week in sedentary activities. Patients overestimated MVPA by a median of 805 (218, 1363) min/wk (P < .001). The selected patient characteristics (age, body mass index, type of CAD, type of CR, social support, and self-efficacy) were not associated with this overestimation. Conclusions: Most patients with CAD participating in an outpatient CR program do achieve the MVPA exercise recommendations but simultaneously spend too much time in sedentary activities.
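
    The abstract reports a paired difference between self-reported and monitor-measured MVPA as a median with a P value. A minimal sketch of how such a paired comparison could be computed, assuming a Wilcoxon signed-rank test (the test is not named in the abstract) and hypothetical file and column names:

        # Sketch: paired comparison of self-reported vs monitor-measured MVPA (min/wk).
        # Assumes a Wilcoxon signed-rank test; "cr_mvpa.csv" and the column names are hypothetical.
        import numpy as np
        import pandas as pd
        from scipy.stats import wilcoxon

        df = pd.read_csv("cr_mvpa.csv")  # one row per patient
        diff = df["mvpa_selfreport_min_wk"] - df["mvpa_monitor_min_wk"]

        median_diff = np.median(diff)
        q25, q75 = np.percentile(diff, [25, 75])
        stat, p = wilcoxon(df["mvpa_selfreport_min_wk"], df["mvpa_monitor_min_wk"])

        print(f"Median overestimation: {median_diff:.0f} ({q25:.0f}, {q75:.0f}) min/wk, p = {p:.4f}")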

    Short-Term Clinical Outcomes of Single Versus Dual Antiplatelet Therapy after Infrainguinal Endovascular Treatment for Peripheral Arterial Disease

    After infrainguinal endovascular treatment for peripheral arterial disease (PAD), it is uncertain whether single antiplatelet therapy (SAPT) or dual antiplatelet therapy (DAPT) should be preferred. This study investigated major adverse limb events (MALE) and major adverse cardiovascular events (MACE) between patients receiving SAPT and DAPT. Patient data from three centers in the Netherlands were retrospectively collected and analyzed. All patients treated for PAD by endovascular revascularization of the superficial femoral, popliteal, or below-the-knee (BTK) arteries who were prescribed acetylsalicylic acid or clopidogrel were included. End points were 1-, 3-, and 12-month MALE and MACE, and bleeding complications. In total, 237 patients (258 limbs treated) were included, with 149 patients (63%) receiving SAPT and 88 (37%) receiving DAPT. Univariate and multivariate analyses showed no significant differences between SAPT and DAPT in 1-, 3-, and 12-month MALE, MACE, or bleeding outcomes. Subgroup analyses of patients with BTK treatment showed a significantly lower 12-month MALE rate when treated with DAPT (hazard ratio 0.33; 95% confidence interval 0.12-0.95; p = 0.04). In conclusion, although patient numbers were small, no differences were found between SAPT and DAPT regarding MALE, MACE, or bleeding complications. DAPT should, however, be considered over SAPT for the subgroup of patients with below-the-knee endovascular treatment.
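
    The subgroup hazard ratio above comes from a time-to-event comparison of DAPT versus SAPT. A minimal sketch of that kind of analysis with the lifelines package, using hypothetical data and column names rather than the authors' actual dataset or model:

        # Sketch: Cox model for 12-month MALE, DAPT vs SAPT, below-the-knee subgroup.
        # "pad_cohort.csv" and all column names are hypothetical.
        import pandas as pd
        from lifelines import CoxPHFitter

        df = pd.read_csv("pad_cohort.csv")
        btk = df[df["btk_treatment"] == 1]  # below-the-knee subgroup

        cph = CoxPHFitter()
        cph.fit(
            btk[["time_to_male_days", "male_event", "dapt"]],  # dapt = 1 for DAPT, 0 for SAPT
            duration_col="time_to_male_days",
            event_col="male_event",
        )
        cph.print_summary()  # exp(coef) is the hazard ratio for DAPT vs SAPT with its 95% CI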

    Synthesis and Evaluation of F-18-Enzalutamide, a New Radioligand for PET Imaging of Androgen Receptors: A Comparison with 16β-F-18-Fluoro-5α-Dihydrotestosterone

    16β-F-18-fluoro-5α-dihydrotestosterone (F-18-FDHT) is a radiopharmaceutical that has been investigated as a diagnostic agent for the assessment of androgen receptor (AR) density in prostate cancer using PET. However, F-18-FDHT is rapidly metabolized in humans and excreted via the kidneys into the urine, potentially compromising the detection of tumor lesions close to the prostate. Enzalutamide is an AR signaling inhibitor currently used in different stages of prostate cancer. Enzalutamide and its primary metabolite N-desmethylenzalutamide have an AR affinity comparable to that of FDHT but are excreted mainly via the hepatic route. Radiolabeled enzalutamide could thus be a suitable candidate PET tracer for AR imaging. Here, we describe the radiolabeling of enzalutamide with F-18. Moreover, the in vitro and in vivo behavior of F-18-enzalutamide was evaluated and compared with the current standard, F-18-FDHT. Methods: F-18-enzalutamide was obtained by fluorination of the nitro precursor. In vitro cellular uptake studies with F-18-enzalutamide and F-18-FDHT were performed in LNCaP (AR-positive) and HEK293 (AR-negative) cells. Competition assays with both tracers were conducted on the LNCaP (AR-positive) cell line. In vivo PET imaging, ex vivo biodistribution, and metabolite studies with F-18-enzalutamide and F-18-FDHT were conducted on athymic nude male mice bearing an LNCaP xenograft in the shoulder. Results: F-18-enzalutamide was obtained in 1.4% ± 0.9% radiochemical yield with an apparent molar activity of 6.2 ± 10.3 GBq/μmol. F-18-FDHT was obtained in 1.5% ± 0.8% yield with a molar activity of more than 25 GBq/μmol. Coincubation with an excess of 5α-dihydrotestosterone or enzalutamide significantly reduced the cellular uptake of F-18-enzalutamide and F-18-FDHT to about 50% in AR-positive LNCaP cells but not in AR-negative HEK293 cells. PET and biodistribution studies on male mice bearing an LNCaP xenograft showed about 3 times higher tumor uptake for F-18-enzalutamide than for F-18-FDHT. Sixty minutes after tracer injection, 93% of F-18-enzalutamide in plasma was still intact, compared with only 3% of F-18-FDHT. Conclusion: Despite its lower apparent molar activity, F-18-enzalutamide shows higher tumor uptake and better metabolic stability than F-18-FDHT and thus seems to have more favorable properties for imaging of AR with PET. However, further evaluation in other oncologic animal models and patients is warranted to confirm these results.

    Preclinical exploration of combining plasmacytoid and myeloid dendritic cell vaccination with BRAF inhibition

    Background: Melanoma is the most lethal type of skin cancer and its incidence is progressively increasing. The introduction of immunotherapy and targeted therapies has tremendously improved the treatment of melanoma. Selective inhibition of BRAF by vemurafenib results in objective clinical responses in around 50% of patients suffering from BRAFV600-mutated melanoma. However, drug resistance often hampers long-term tumor control. Alternatively, immunotherapy by vaccination with natural dendritic cells (nDCs) demonstrated long-term tumor control in a proportion of patients. We postulate that rapid tumor debulking by vemurafenib can synergize with the long-term tumor control of nDC vaccination, resulting in an effective treatment modality for a large proportion of patients. Here, we investigated the feasibility of this combination by analyzing the effect of vemurafenib on the functionality of nDCs. Methods: Plasmacytoid DCs (pDCs) and myeloid DCs (mDCs) were isolated from PBMCs obtained from buffy coats from healthy volunteers or vemurafenib-treated melanoma patients. Maturation of pDCs, mDCs, and immature monocyte-derived DCs was induced by R848 in the presence or absence of vemurafenib and analyzed by FACS. Cytokine production and T cell proliferation induced by mature DCs were analyzed. Results: Vemurafenib inhibited maturation and cytokine production of highly purified nDCs of healthy volunteers, resulting in diminished allogeneic T cell proliferation. This deleterious effect of vemurafenib on nDC functionality was absent when total PBMCs were exposed to vemurafenib. In patients receiving vemurafenib, nDC functionality and T cell allostimulatory capacity were unaffected. Conclusion: Although vemurafenib inhibited the functionality of purified nDCs of healthy volunteers, this effect was not observed when nDCs were matured in the complete PBMC fraction. This might have been caused by increased vemurafenib uptake in the absence of other cell types. In accordance, nDCs isolated from patients on active vemurafenib treatment showed no negative effects. In conclusion, our results pave the way for a combinatorial treatment strategy, and we propose that combining vemurafenib with nDC vaccination represents a powerful opportunity that deserves further investigation in the clinic.

    Arterio-ureteral fistula: a nationwide cross-sectional questionnaire analysis

    PURPOSE: Arterio-ureteral fistula (AUF) is an uncommon but potentially lethal diagnosis. Although the number of reports has increased over the past two decades, the true incidence and contemporary urologists’ experience and approach in clinical practice remain unknown. This study was conducted to provide insight into the incidence of AUF in The Netherlands and into the diagnostic tests and therapeutic approaches applied in modern practice. METHODS: A nationwide cross-sectional questionnaire analysis was performed by sending a survey to all registered Dutch urologists. Data collection included information on experience with patients with AUF, as well as their medical history, diagnostics, treatment, and follow-up; data were captured in a standardized template by two independent reviewers. Descriptive statistics were used. RESULTS: The response rate was 62%, and 56 AUFs in 53 patients were reported between 2003 and 2018. The estimated incidence of AUF in The Netherlands in this time period is 3.5 AUFs per year. Hematuria was observed in all patients: 9% presented with intermittent microhematuria, and 91% presented with, or built up to, massive hematuria. For the final diagnosis, angiography was the most efficient modality, confirming the diagnosis in 58%. Treatment comprised predominantly endovascular intervention. CONCLUSION: The diagnosis of AUF should be considered in patients with persistent intermittent or massive hematuria. SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1007/s00345-021-03910-3.
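
    The reported incidence estimate follows directly from the survey counts; a small worked check, assuming the 2003-2018 window is counted as 16 years and no correction for the 62% response rate (the abstract states neither explicitly):

        # Worked arithmetic behind the reported incidence estimate.
        auf_reported = 56   # AUFs reported in the survey
        years = 16          # 2003-2018, counted inclusively (assumption)
        print(f"{auf_reported / years:.1f} AUFs per year")  # 3.5 per year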

    A nationwide evaluation of deceased donor kidney transplantation indicates detrimental consequences of early graft loss

    Early graft loss (EGL) is a feared outcome of kidney transplantation. Consequently, kidneys with an anticipated risk of EGL are declined for transplantation. In the most favorable scenario, with optimal use of available donor kidneys, the donor pool size is balanced by the risk of EGL, with a tradeoff dictated by the consequences of EGL. To gauge the consequences of EGL, we systematically evaluated its impact in an observational study that included all 10,307 deceased-donor kidney transplantations performed in The Netherlands between 1990 and 2018. The incidence of EGL, defined as graft loss within 90 days, in primary transplantation was 8.2% (699/8,511). The main causes were graft rejection (30%), primary nonfunction (25%), and thrombosis or infarction (20%). EGL profoundly impacted short- and long-term patient survival (adjusted hazard ratios [95% confidence intervals]: 8.2 [5.1-13.2] and 1.7 [1.3-2.1], respectively). Of the 617 of 699 EGL recipients who survived 90 days after transplantation, only 440 were relisted for re-transplantation. Of those relisted, only 298 were ultimately re-transplanted, leading to an actual re-transplantation rate of 43%. Notably, re-transplantation was associated with a doubled incidence of EGL but similar long-term graft survival (adjusted hazard ratio 1.1; 95% confidence interval 0.6-1.8). Thus, EGL after kidney transplantation is a medical catastrophe with high mortality rates, low relisting rates, and an increased risk of recurrent EGL following re-transplantation. This implies that the detrimental outcomes also involve a convergence of risk factors in recipients with EGL. The 8.2% incidence of EGL minimally impacted population mortality, indicating that this incidence is acceptable.
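
    The re-transplantation rate quoted above can be reproduced from the cascade of counts in the abstract; the 43% appears to be taken relative to all 699 EGL recipients rather than to those relisted (a reading of the numbers, not a statement from the paper):

        # Re-transplantation cascade, numbers as reported in the abstract.
        egl_recipients   = 699  # early graft loss after primary transplantation
        survived_90_days = 617
        relisted         = 440
        retransplanted   = 298

        print(f"Re-transplantation rate: {retransplanted / egl_recipients:.0%}")   # ~43%
        print(f"Re-transplanted among relisted: {retransplanted / relisted:.0%}")  # ~68%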

    Improving outcomes for donation after circulatory death kidney transplantation: Science of the times

    The use of kidneys donated after circulatory death (DCD) remains controversial due to concerns with regard to high incidences of early graft loss, delayed graft function (DGF), and impaired graft survival. As these concerns are mainly based on data from historical cohorts, they are prone to time-related effects and may therefore not apply to the current timeframe. To assess the impact of time on outcomes, we performed a time-dependent comparative analysis of outcomes of DCD and donation after brain death (DBD) kidney transplantations. Data on all 11,415 deceased-donor kidney transplantations performed in The Netherlands between 1990 and 2018 were collected. Based on the incidences of early graft loss, two eras were defined (1998-2008 [n = 3,499] and 2008-2018 [n = 3,781]), and potential time-related effects on outcomes were evaluated. Multivariate analyses were applied to examine associations between donor type and outcomes. Interaction tests were used to explore the presence of effect modification. Results show clear time-related effects on posttransplant outcomes. The 1998-2008 interval showed compromised outcomes for DCD procedures (higher incidences of DGF and early graft loss, impaired 1-year renal function, and inferior graft survival), whereas outcome equivalence between DBD and DCD was observed for the 2008-2018 interval. This occurred despite persistently high incidences of DGF in DCD grafts and more adverse recipient and donor risk profiles (recipients were 6 years older, and the kidney donor risk index (KDRI) increased from 1.23 to 1.39 for DBD donors and from 1.35 to 1.49 for DCD donors). In contrast, the median cold ischaemic period decreased from 20 to 15 hours. This national study shows major improvements in outcomes of transplanted DCD kidneys over time. The time-dependent shift underlines that kidney transplantation has come of age and that DCD results are nowadays comparable to those of DBD transplants. It also calls for careful interpretation of conclusions based on historical cohorts, and emphasises that retrospective studies should correct for time-related effects.
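
    The interaction tests mentioned above ask whether the association between donor type and an outcome differs between the two eras (effect modification). A minimal sketch of one such test for DGF, using a logistic model with a donor type by era interaction term; the dataset, variable names, and covariates are assumptions, not the authors' exact specification:

        # Sketch: interaction (effect-modification) test of donor type x era for DGF.
        # "nl_kidney_tx.csv" and all variable names are hypothetical.
        import pandas as pd
        import statsmodels.formula.api as smf

        df = pd.read_csv("nl_kidney_tx.csv")
        df["era_late"] = (df["tx_year"] >= 2008).astype(int)  # 2008-2018 era indicator

        model = smf.logit("dgf ~ dcd * era_late + recipient_age + kdri", data=df).fit()
        print(model.summary())  # the dcd:era_late term tests whether the DCD effect differs by era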

    Mimics of Autoimmune Encephalitis: Validation of the 2016 Clinical Autoimmune Encephalitis Criteria

    BACKGROUND AND OBJECTIVES: The clinical criteria for autoimmune encephalitis (AE) were proposed by Graus et al. in 2016. In this study, the AE criteria were validated in the real world, and common AE mimics were described. In addition, criteria for probable anti-LGI1 encephalitis were proposed and validated. METHODS: In this retrospective cohort study, patients referred to our national referral center with suspicion of AE, and patients with specific neuroinflammatory disorders with similar clinical presentations, were included from July 2016 to December 2019. Exclusion criteria were pure cerebellar or peripheral nervous system disorders. All patients were evaluated according to the AE criteria. RESULTS: In total, 239 patients were included (56% female; median age 42 years, range 1-85). AE was diagnosed in 104 patients (44%) and AE mimics in 109 patients (46%). The most common AE mimics and misdiagnoses were neuroinflammatory CNS disorders (26%), psychiatric disorders (19%), epilepsy with a noninflammatory cause (13%), CNS infections (7%), neurodegenerative diseases (7%), and CNS neoplasms (6%). Common confounding factors were mesiotemporal lesions on brain MRI (17%) and false-positive antibodies in serum (12%). Additional mesiotemporal features (involvement of extralimbic structures, enhancement, diffusion restriction) were observed more frequently in AE mimics than in AE (61% vs 24%; p = 0.005). The AE criteria showed the following sensitivity and specificity: possible AE, 83% (95% CI 74-89) and 27% (95% CI 20-36); definite autoimmune limbic encephalitis (LE), 10% (95% CI 5-17) and 98% (95% CI 94-100); and probable anti-NMDAR encephalitis, 50% (95% CI 26-74) and 96% (95% CI 92-98), respectively. Specificity of the criteria for probable seronegative AE was 99% (95% CI 96-100). The newly proposed criteria for probable anti-LGI1 encephalitis showed a sensitivity of 66% (95% CI 47-81) and a specificity of 96% (95% CI 93-98). DISCUSSION: AE mimics occur frequently. Common pitfalls in AE misdiagnosis are mesiotemporal lesions (predominantly with atypical features) and false-positive serum antibodies. As expected, the specificity of the criteria for possible AE is low because these criteria represent the minimal requirements for entry into the diagnostic algorithm for AE. Criteria for probable AE (-LGI1, -NMDAR, seronegative) and definite autoimmune LE are applicable for decisions on immunotherapy in the early disease stage, as specificity is high.
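
    The sensitivity and specificity figures with 95% CIs above are proportions estimated from the validation cohort. A minimal sketch of how such estimates can be computed from counts, assuming Wilson intervals (the interval method is not stated in the abstract) and using placeholder counts rather than the study's data:

        # Sketch: sensitivity/specificity with 95% CIs from validation counts.
        # Placeholder counts; the study's actual numbers are not reproduced here.
        from statsmodels.stats.proportion import proportion_confint

        def sens_spec(tp, fn, tn, fp, method="wilson"):
            """Sensitivity and specificity, each with a 95% confidence interval."""
            sens, spec = tp / (tp + fn), tn / (tn + fp)
            sens_ci = proportion_confint(tp, tp + fn, alpha=0.05, method=method)
            spec_ci = proportion_confint(tn, tn + fp, alpha=0.05, method=method)
            return (sens, sens_ci), (spec, spec_ci)

        (sens, sens_ci), (spec, spec_ci) = sens_spec(tp=20, fn=10, tn=120, fp=5)
        print(f"sensitivity {sens:.0%} (95% CI {sens_ci[0]:.0%}-{sens_ci[1]:.0%}), "
              f"specificity {spec:.0%} (95% CI {spec_ci[0]:.0%}-{spec_ci[1]:.0%})")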

    Smoking onset and the time-varying effects of self-efficacy, environmental smoking, and smoking-specific parenting by using discrete-time survival analysis

    This study examined the timing of smoking onset during mid- or late adolescence and the time-varying effects of refusal self-efficacy, parental and sibling smoking behavior, smoking behavior of friends and best friend, and parental smoking-specific communication. We used data from five annual waves of the ‘Family and Health’ project. In total, 428 adolescents and their parents participated at baseline. Only never smokers were included at baseline (n = 272). A life table and Kaplan–Meier survival curve showed that 51% of all adolescents who did not smoke at baseline did not start smoking within 4 years. The risk for smoking onset during mid- or late adolescence was rather stable (hazard rates between .16 and .19). Discrete-time survival analyses revealed that low refusal self-efficacy, a high frequency of smoking-specific communication, and sibling smoking were associated with smoking onset one year later. No interaction effects were found. In conclusion, the findings revealed that refusal self-efficacy is an important predictor of smoking onset during mid- or late adolescence, independent of smoking-specific communication and the smoking behavior of parents, siblings, and (best) friend(s). The findings emphasize the importance of family prevention programs focusing on self-efficacy skills.
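
    Discrete-time survival analysis, as used above, is typically fitted as a logistic regression on a person-period dataset (one row per adolescent per wave, up to onset or censoring), with time-varying predictors entered per wave. A minimal sketch under those assumptions; the file, variable names, and exact predictor set are hypothetical, not the authors' specification:

        # Sketch: discrete-time survival (event = smoking onset) via logistic regression
        # on person-period data. "person_period.csv" and variable names are hypothetical.
        import pandas as pd
        import statsmodels.formula.api as smf

        pp = pd.read_csv("person_period.csv")  # one row per adolescent per wave, until onset or censoring

        model = smf.logit(
            "onset ~ C(wave) + refusal_self_efficacy + parental_smoking"
            " + sibling_smoking + friends_smoking + smoking_communication",
            data=pp,
        ).fit()
        print(model.summary())  # C(wave) terms give the per-wave baseline hazard; other terms are predictor effects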