38 research outputs found

    Blood Pressure Response and Pulse Arrival Time During Exercise Testing in Well-Trained Individuals

    Introduction: There is a lack of data describing the blood pressure response (BPR) in well-trained individuals. In addition, continuous bio-signal measurements are increasingly investigated to overcome the limitations of intermittent cuff-based BP measurements during exercise testing. Thus, the present study aimed to assess the BPR in well-trained individuals during a cycle ergometer test, with a particular focus on systolic BP (SBP), and to investigate pulse arrival time (PAT) as a continuous surrogate for SBP during exercise testing. Materials and Methods: Eighteen well-trained male cyclists were included (32.4 ± 9.4 years; maximal oxygen uptake 63 ± 10 ml/min/kg) and performed a stepwise lactate threshold test with 5-min stages, followed by a continuous test to voluntary exhaustion with 1-min increments, while cycling on an ergometer. BP was measured with a standard automated exercise BP cuff. PAT was measured continuously with a non-invasive physiological measurement device (IsenseU), and metabolic consumption was measured continuously during both tests. Results: At the lactate threshold (281 ± 56 W) and at maximal intensity (403 ± 61 W), SBP increased from a resting value of 136 ± 9 mmHg to maximal values of 219 ± 21 mmHg and 231 ± 18 mmHg, respectively. Linear within-participant regression lines between PAT and SBP showed a mean r² of 0.81 ± 0.17. Conclusion: In the present study focusing on the BPR in well-trained individuals, we observed a more exaggerated systolic BPR than reported in comparable recent studies. Future research should follow up on these findings to clarify the clinical implications of the high BPR in well-trained individuals. In addition, PAT showed strong intra-individual associations with SBP, indicating potential use as a surrogate SBP measurement during exercise testing.
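The within-participant PAT-SBP regression described above can be sketched numerically: an ordinary least-squares line is fitted per participant, and the coefficient of determination (r²) is computed from the residuals. A minimal sketch, assuming hypothetical PAT and SBP values (not data from the study):

```python
import numpy as np

def fit_pat_sbp(pat, sbp):
    """Fit a within-participant linear regression SBP = slope*PAT + intercept
    and return slope, intercept and the coefficient of determination (r^2)."""
    pat = np.asarray(pat, dtype=float)
    sbp = np.asarray(sbp, dtype=float)
    slope, intercept = np.polyfit(pat, sbp, 1)   # least-squares line
    predicted = slope * pat + intercept
    ss_res = np.sum((sbp - predicted) ** 2)      # residual sum of squares
    ss_tot = np.sum((sbp - sbp.mean()) ** 2)     # total sum of squares
    r2 = 1.0 - ss_res / ss_tot
    return slope, intercept, r2

# Hypothetical single participant: PAT shortens as SBP rises during exercise.
pat_ms = [260, 240, 225, 210, 195, 185]
sbp_mmhg = [136, 152, 168, 185, 202, 219]
slope, intercept, r2 = fit_pat_sbp(pat_ms, sbp_mmhg)
```

A negative slope is expected here, since PAT typically shortens as SBP increases.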

    Survival of patients treated with extended-hours haemodialysis in Europe : an analysis of the ERA-EDTA Registry

    Background. Previous US studies have indicated that haemodialysis with ≥6-h sessions [extended-hours haemodialysis (EHD)] may improve patient survival. However, patient characteristics and treatment practices vary between the USA and Europe. We therefore investigated the effect of EHD three times weekly on survival compared with conventional haemodialysis (CHD) among European patients. Methods. We included patients who were treated with haemodialysis between 2010 and 2017 from eight countries providing data to the European Renal Association-European Dialysis and Transplant Association (ERA-EDTA) Registry. Haemodialysis session duration and frequency were recorded once every year or at every change of haemodialysis prescription and were categorized into three groups: CHD (three times weekly, 3.5–4 h/treatment), EHD (three times weekly, ≥6 h/treatment) or other. In the primary analyses we attributed death to the treatment at the time of death, and in secondary analyses to EHD if ever initiated. We compared the mortality risk for EHD with CHD using causal inference from marginal structural models, with Cox proportional hazards models weighted for the inverse probability of treatment and censoring and adjusted for potential confounders. Results. From a total of 142 460 patients, 1338 patients were ever treated with EHD (three times weekly, 7.1 ± 0.8 h/treatment) and 89 819 patients were treated exclusively with CHD (three times weekly, 3.9 ± 0.2 h/treatment). Crude mortality rates were 6.0 and 13.5 per 100 person-years, respectively. In the primary analyses, patients treated with EHD had an adjusted hazard ratio (HR) of 0.73 [95% confidence interval (CI) 0.62–0.85] compared with patients treated with CHD. When we attributed all deaths to EHD after its initiation, the HR for EHD was comparable to that of the primary analyses [HR 0.80 (95% CI 0.71–0.90)]. Conclusions. EHD is associated with better survival in European patients treated with haemodialysis three times weekly.
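The crude mortality rates above are expressed per 100 person-years, i.e. deaths divided by total follow-up time. A minimal sketch of this arithmetic, using hypothetical counts rather than registry data:

```python
def crude_mortality_rate(deaths, person_years, per=100.0):
    """Crude mortality rate expressed per `per` person-years of follow-up."""
    return deaths / person_years * per

# Hypothetical cohort: 270 deaths observed over 2000 person-years of follow-up.
rate = crude_mortality_rate(270, 2000)  # 13.5 deaths per 100 person-years
```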

    Accuracy of non-invasive cuffless blood pressure in the intensive care unit: Promises and challenges

    Objective: Continuous non-invasive cuffless blood pressure (BP) monitoring may reduce adverse outcomes in hospitalized patients if sufficient accuracy can be demonstrated. We aimed to investigate the accuracy of two different BP prediction models in critically ill intensive care unit (ICU) patients, using a prototype cuffless BP device based on electrocardiogram and photoplethysmography signals. We compared a pulse arrival time (PAT)-based BP model derived from a general population cohort (generalized PAT-based model) to more complex and individualized models utilizing other features of the BP sensor signals (complex individualized models). Methods: Patients admitted to an ICU with an indication for invasive BP monitoring were included. The first half of each patient's data was used to train a subject-specific machine learning model (complex individualized model). The second half was used to estimate BP and test the accuracy of both the generalized PAT-based model and the complex individualized models. A total of 7,327 measurements of 15-s epochs were included in pairwise comparisons across 25 patients. Results: The generalized PAT-based model achieved a mean absolute error (SD of errors) of 7.6 (7.2) mmHg, 3.3 (3.1) mmHg and 4.6 (4.4) mmHg for systolic BP, diastolic BP and mean arterial pressure (MAP), respectively. Corresponding results for the complex individualized models were 6.5 (6.7) mmHg, 3.1 (3.0) mmHg and 4.0 (4.0) mmHg. The percentages of absolute errors within 10 mmHg for the generalized model were 77.6, 96.2 and 89.6% for systolic BP, diastolic BP and MAP, respectively. Corresponding results for the individualized models were 83.8, 96.2 and 94.2%. Accuracy was significantly improved when comparing the complex individualized models to the generalized PAT-based model for systolic BP and MAP, but not for diastolic BP. Conclusion: A generalized PAT-based model developed from a different population was not able to accurately track BP changes in critically ill ICU patients. Individually fitted models utilizing other cuffless BP sensor signals significantly improved accuracy, indicating that cuffless BP can be measured non-invasively, but the challenge of generalizable models remains for future research to resolve.
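The accuracy metrics reported above (mean absolute error, SD of errors, and percentage of absolute errors within 10 mmHg) can be computed from paired estimated and reference measurements. A minimal sketch with hypothetical BP values; the SD is taken over the signed errors here, which is one common convention and an assumption about the abstract's notation:

```python
import numpy as np

def bp_accuracy(estimated, reference, limit=10.0):
    """Mean absolute error, SD of the signed errors, and percentage of
    absolute errors within `limit` mmHg for paired BP measurements."""
    est = np.asarray(estimated, dtype=float)
    ref = np.asarray(reference, dtype=float)
    errors = est - ref
    mae = np.mean(np.abs(errors))                    # mean absolute error
    sd = np.std(errors, ddof=1)                      # sample SD of signed errors
    pct_within = 100.0 * np.mean(np.abs(errors) <= limit)
    return mae, sd, pct_within

# Hypothetical cuffless estimates vs. invasive reference values (mmHg).
est = [118, 125, 131, 140, 152]
ref = [120, 122, 135, 138, 160]
mae, sd, pct = bp_accuracy(est, ref)
```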

    Data from the ERA-EDTA Registry were examined for trends in excess mortality in European adults on kidney replacement therapy

    The objective of this study was to investigate whether the improvement in survival seen in patients on kidney replacement therapy reflects the enhanced survival of the general population. Patient and general population statistics were obtained from the European Renal Association-European Dialysis and Transplant Association (ERA-EDTA) Registry and the World Health Organization databases, respectively. Relative survival models were composed to examine trends over time in all-cause and cause-specific excess mortality, stratified by age and modality of kidney replacement therapy, and adjusted for sex, primary kidney disease and country. In total, 280,075 adult patients started kidney replacement therapy between 2002 and 2015. The excess mortality risk in these patients decreased by 16% per five years (relative excess mortality risk (RER) 0.84; 95% confidence interval 0.83-0.84). This reflected a 14% risk reduction in dialysis patients (RER 0.86; 0.85-0.86), and a 16% increase in kidney transplant recipients (RER 1.16; 1.07-1.26). Patients on dialysis showed a decrease in excess mortality risk of 28% per five years for atheromatous cardiovascular disease as the cause of death (RER 0.72; 0.70-0.74), 10% for non-atheromatous cardiovascular disease (RER 0.90; 0.88-0.92) and 10% for infections (RER 0.90; 0.87-0.92). Kidney transplant recipients showed stable excess mortality risks for most causes of death, although it did worsen in some subgroups. Thus, the increase in survival in patients on kidney replacement therapy is not only due to enhanced survival in the general population, but also due to improved survival in the patient population, primarily in dialysis patients.
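A relative excess mortality risk (RER) per 5-year period translates directly into a percent change in excess risk, and compounds over consecutive periods. A small illustrative calculation (not part of the registry analysis itself):

```python
def excess_risk_change(rer, periods=1):
    """Percent change in excess mortality risk implied by a relative excess
    mortality risk (RER) per 5-year period, compounded over `periods`
    consecutive 5-year periods."""
    return (rer ** periods - 1.0) * 100.0

change = excess_risk_change(0.84)     # RER 0.84 means a 16% decrease per 5 years
decade = excess_risk_change(0.84, 2)  # compounded over two 5-year periods
```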

    The epidemiology of renal replacement therapy in two different parts of the world: the Latin American Dialysis and Transplant Registry versus the European Renal Association-European Dialysis and Transplant Association Registry

    Objective: To compare the epidemiology of renal replacement therapy (RRT) for end-stage renal disease (ESRD) in Latin America and Europe, and to study differences in macro-economic indicators, demographic and clinical patient characteristics, mortality rates, and causes of death between these two populations. Methods: We used data from 20 Latin American and 49 European national and subnational renal registries that had provided data to the Latin American Dialysis and Renal Transplant Registry (RLADTR) and the European Renal Association-European Dialysis and Transplant Association (ERA-EDTA) Registry, respectively. The incidence and prevalence of RRT in 2013 were calculated per million population (pmp), overall and by subcategories of age, sex, primary renal disease, and treatment modality. The correlation between gross domestic product and the prevalence of RRT was analyzed using linear regression. Trends in the prevalence of RRT between 2004 and 2013 were assessed using Joinpoint regression analysis. Results: In 2013, the overall incidence at day 91 after the onset of RRT was 181 pmp for Latin American countries and 130 pmp for European countries. The overall prevalence was 660 pmp for Latin America and 782 pmp for Europe. In the Latin American countries, the annual increase in prevalence averaged 4.0% (95% confidence interval (CI): 2.5%-5.6%) from 2004 to 2013, while the European countries showed an average annual increase of 2.2% (95% CI: 2.0%-2.4%) over the same period. The crude mortality rate was higher in Latin America than in Europe (112 versus 100 deaths per 1 000 patient-years), and cardiovascular disease was the main cause of death in both regions. Conclusions: There are considerable differences between Latin America and Europe in the epidemiology of RRT for ESRD. Further research is needed to explore the reasons for these differences.
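Rates per million population (pmp), the unit used throughout the comparison above, are simply event counts scaled to the population size. A minimal sketch with hypothetical counts:

```python
def rate_pmp(events, population):
    """Events per million population (pmp), as used for RRT incidence
    and prevalence figures."""
    return events / population * 1_000_000

# Hypothetical country: 6600 prevalent RRT patients in a population of 10 million.
prevalence = rate_pmp(6600, 10_000_000)  # 660 pmp
```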

    Erratum to: 36th International Symposium on Intensive Care and Emergency Medicine

    [This corrects the article DOI: 10.1186/s13054-016-1208-6.]

    A survey of school nurse staffing in the school health services

    No full text

    Impact of initial dialysis modality on mortality: a propensity-matched study

    Background: Whether the choice of dialysis modality in patients with end-stage renal disease (ESRD) may impact mortality is undecided. No randomized controlled trial has properly addressed this issue. Propensity-matched observational studies could give important insight into the independent effect of peritoneal dialysis (PD) as opposed to haemodialysis (HD) on all-cause and cardiovascular mortality. Methods: To correct for case-mix differences between patients treated with PD and HD, propensity-matched analyses were utilized in all patients who initiated dialysis as their first renal replacement therapy in Norway in the period 2005–2012. PD patients were matched in a 1:1 fashion with HD patients, creating 692 pairs of patients with comparable baseline variables. As-treated and intention-to-treat analyses were undertaken to assess cardiovascular and all-cause mortality. Interaction analyses were used to assess differences in the relationship between initial dialysis modality and mortality between strata of age, gender and prevalent diabetes mellitus. Results: In the as-treated analyses, initial dialysis modality did not impact 2-year (PD vs. HD: HR 0.87, 95 % CI 0.67–1.12) or 5-year all-cause mortality (HR 0.95, 95 % CI 0.77–1.17). In patients younger than 65 years, PD was superior to HD with regard to both 2-year (HR 0.39, 95 % CI 0.19–0.81) and 5-year all-cause mortality (HR 0.49, 95 % CI 0.27–0.89). Cardiovascular mortality was also lower in the younger patients treated with PD (5-year HR 0.38, 95 % CI 0.15–0.96). PD was not associated with impaired prognosis in any of the prespecified subgroups compared with HD. The results were similar in the as-treated and intention-to-treat analyses. Conclusion: Survival with PD was not inferior to HD in any subgroup of patients, even after five years of follow-up. In patients below 65 years, PD yielded superior survival compared with HD. Increased use of PD as the initial dialysis modality in ESRD patients could be encouraged.
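The 1:1 propensity matching described above pairs each PD patient with the HD patient whose propensity score is closest, typically within a caliper and without replacement. A minimal greedy sketch of the idea; the scores, caliper value and greedy strategy are illustrative assumptions, not the study's actual matching procedure:

```python
import numpy as np

def greedy_match(ps_treated, ps_control, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on the propensity score,
    without replacement, accepting only matches within `caliper`.
    Returns a list of (treated_index, control_index) pairs."""
    ps_treated = np.asarray(ps_treated, dtype=float)
    ps_control = np.asarray(ps_control, dtype=float)
    available = np.ones(len(ps_control), dtype=bool)
    pairs = []
    for i, ps in enumerate(ps_treated):
        dist = np.abs(ps_control - ps)
        dist[~available] = np.inf          # controls already matched are excluded
        j = int(np.argmin(dist))
        if dist[j] <= caliper:             # accept only sufficiently close matches
            pairs.append((i, j))
            available[j] = False
    return pairs

# Hypothetical propensity scores for PD (treated) and HD (control) patients.
pairs = greedy_match([0.30, 0.55, 0.80], [0.28, 0.52, 0.90, 0.33])
```

In this toy example the third PD patient finds no control within the caliper and is left unmatched, which is the usual behaviour of caliper matching.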

    Is HRQOL in dialysis associated with patient survival or graft function after kidney transplantation?

    No full text
    Background: Health-related quality of life (HRQOL) is patient-reported, and an important treatment outcome for patients undergoing renal replacement therapy. Whether HRQOL in dialysis can affect mortality or graft survival after renal transplantation (RTX) has not been determined. The aims of the present study were to investigate whether pretransplant HRQOL is associated with post-RTX patient survival or graft function, and to assess whether improvement in HRQOL from dialysis to RTX is associated with patient survival. Methods: In a longitudinal prospective study, HRQOL was measured in 142 prevalent dialysis patients (67% males, mean age 51 ± 15.5 years) who subsequently underwent renal transplantation. HRQOL measurement could be repeated in 110 transplanted patients 41 (IQR 34–51) months after RTX using the self-administered Kidney Disease and Quality of Life Short Form (KDQOL-SF) instrument. Kaplan-Meier plots were utilized for survival analyses, and linear regression models were used to address the effect of HRQOL on graft function. Results: Follow-up time was 102 (IQR 97–108) months after RTX. Survival after RTX was higher in patients who perceived good physical function (PF) in dialysis compared with patients with poorer PF (p = 0.019). Low scores in the mental health domain measured in dialysis were associated with an accelerated decline in graft function (p = 0.048). Improvements in the kidney-specific domains "symptoms" and "effect of kidney disease" in the trajectory from dialysis to RTX were associated with a survival benefit (p = 0.007 and p = 0.02, respectively). Conclusion: HRQOL measured in dialysis patients was associated with survival and graft function after RTX. These findings may be useful in clinical pretransplant evaluations. Improvements in some of the kidney-specific HRQOL domains from dialysis to RTX were associated with lower mortality. Prospective and interventional studies are warranted.