Netherlands Commercial Court as a possible WHOA court in international restructurings
Coherent privaatrecht
Incidence and determinants of hypophosphatemia in diabetic ketoacidosis: an observational study
Introduction Diabetic ketoacidosis (DKA) is a life-threatening complication of type 1 diabetes mellitus (T1DM) characterized by hyperglycemia and metabolic acidosis. Hypophosphatemia often develops during hospital admission for DKA. Literature on the magnitude, determinants and consequences of hypophosphatemia in DKA is scarce. The primary aim of this study was to investigate the incidence and consequences of hypophosphatemia during hospitalisation for DKA. Research design and methods Cohort study among individuals with T1DM who were admitted for DKA between 2005 and 2020 in an academic and a non-academic hospital. Multivariate regression models were fitted to investigate determinants of the lowest phosphate level during the treatment of DKA. Results A total of 127 episodes of DKA among 80 individuals were identified. Age at DKA presentation was 28 (22-46) years, 45% of the cases were female, diabetes duration was 13.2 (8.9-25.5) years with glycosylated hemoglobin levels of 91.9 +/- 26.2 mmol/mol. In 9% of all cases, DKA was the first presentation of T1DM. Lowest phosphate levels reported during the treatment phase were 0.54 (0.32-0.83) mmol/L and hypophosphatemia was present in 74% (62/84). The time to lowest phosphate was 16 (8-23) hours. In multivariate analysis, baseline bicarbonate and hemoglobin at admission were significantly associated with the lowest phosphate level reported. No adverse effects of hypophosphatemia on hospital stay duration, morbidity or mortality were found, even if left untreated. Conclusions Hypophosphatemia during DKA is common and increases with the severity of acidosis. However, in this study it was not related to adverse outcomes. Although the limitations of this retrospective study should be taken into account, the routine and repeated measurement of phosphate levels in DKA could be reconsidered, provided that possible symptoms related to hypophosphatemia are monitored.
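The determinants analysis described above is a multivariable linear regression of lowest phosphate on admission characteristics. A minimal sketch of that kind of model follows; all data and effect sizes are simulated for illustration and do not come from the study.

```python
# Sketch of a multivariable linear model of the form used in the abstract:
# lowest phosphate ~ baseline bicarbonate + hemoglobin (simulated data only).
import numpy as np

rng = np.random.default_rng(42)
n = 127  # number of DKA episodes in the study

# Hypothetical predictors at admission
bicarbonate = rng.uniform(5, 20, n)   # mmol/L
hemoglobin = rng.uniform(7, 11, n)    # mmol/L

# Simulate an outcome in which both predictors are positively associated
# with the lowest phosphate level, plus measurement noise.
lowest_phosphate = 0.1 + 0.02 * bicarbonate + 0.03 * hemoglobin \
    + rng.normal(0, 0.05, n)

# Ordinary least squares fit (design matrix with an intercept column)
X = np.column_stack([np.ones(n), bicarbonate, hemoglobin])
coef, *_ = np.linalg.lstsq(X, lowest_phosphate, rcond=None)
intercept, b_bicarb, b_hb = coef
```

With enough episodes, the fitted coefficients recover the simulated associations; in the study itself the analogous coefficients quantify how bicarbonate and hemoglobin at admission relate to the phosphate nadir.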
Immediate versus staged revascularisation of non-culprit arteries in patients with acute coronary syndrome: a systematic review and meta-analysis
Although there is robust evidence that revascularisation of non-culprit vessels should be pursued in patients presenting with an acute coronary syndrome (ACS) and multivessel coronary artery disease (MVD), the optimal timing of complete revascularisation remains disputed. In this systematic review and meta-analysis our results suggest that outcomes are comparable for immediate and staged complete revascularisation in patients with ACS and MVD. However, evidence from randomised controlled trials remains scarce and cautious interpretation of these results is recommended. More non-biased evidence is necessary to aid future decision-making on the optimal timing of complete revascularisation.
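Pooled estimates in a meta-analysis like this one are commonly inverse-variance-weighted averages of per-trial effects. Below is a minimal fixed-effect pooling of log risk ratios; the trial values are invented and are not taken from the review.

```python
# Fixed-effect inverse-variance pooling of log risk ratios (illustrative).
import math

# (log risk ratio, standard error) for each hypothetical trial
trials = [(0.05, 0.20), (-0.10, 0.15), (0.02, 0.25)]

# Each trial is weighted by the inverse of its variance
weights = [1 / se**2 for _, se in trials]
pooled_log_rr = sum(w * lrr for (lrr, _), w in zip(trials, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

# Back-transform to the risk-ratio scale with a 95% confidence interval
pooled_rr = math.exp(pooled_log_rr)
ci = (math.exp(pooled_log_rr - 1.96 * pooled_se),
      math.exp(pooled_log_rr + 1.96 * pooled_se))
```

A pooled risk ratio whose confidence interval straddles 1 is consistent with the review's conclusion that the two strategies give comparable outcomes; random-effects models add a between-trial variance term to the weights.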
Different routes of insulin administration do not influence serum free thiols in type 1 diabetes mellitus
Aims: Intraperitoneal (IP) insulin administration is a last-resort treatment option for selected patients with type 1 diabetes mellitus (T1DM). As the IP route of insulin administration mimics the physiology more closely than the subcutaneous (SC) route, we hypothesized that IP insulin would result in less oxidative stress (expressed as the systemic level of free sulphydryl (R-SH) content) compared to SC insulin in subjects with T1DM. Materials and methods: Prospective, observational case-control study. Serum thiol measurements were performed at baseline and at 26 weeks in age- and gender-matched patients with T1DM. Serum-free thiols, compounds with an R-SH group that are readily oxidized by reactive oxygen species, are considered to be a marker of systemic redox status. Results: A total of 176 patients, 39 of whom used IP and 141 SC insulin therapy, were analysed. Mean baseline R-SH concentration was 248 (31) μmol/L. In multivariable analysis, the route of insulin therapy had no impact on baseline R-SH levels. The estimated geometric mean concentrations of R-SH did not differ significantly between both groups: 264 (95% CI 257, 270) for the IP group and 258 (95% CI 254, 261) for the SC group, with a difference of 6 (95% CI -2, 14) μmol/L. Conclusions: Based on R-SH as a marker of systemic oxidative stress, these findings demonstrate that the route of insulin administration, IP or SC, does not influence systemic redox status in patients with T1DM.
Favourable serum calcification propensity with intraperitoneal as compared with subcutaneous insulin administration in type 1 diabetes
Background: Serum calcification propensity can be monitored using the maturation time of calciprotein particles in serum (T-50 test). A shorter T-50 indicates a greater propensity to calcify; this is an independent determinant of cardiovascular disease. As the intraperitoneal (IP) route of insulin administration mimics the physiology more than the subcutaneous (SC) route in persons with type 1 diabetes (T1DM), we hypothesized that IP insulin influences determinants of calcification propensity and therefore results in a longer T-50 than SC insulin administration. Methods: Prospective, observational case-control study. Measurements were performed at baseline and at 26 weeks in age- and gender-matched persons with T1DM. Results: A total of 181 persons, 39 (21.5%) of whom used IP and 142 (78.5%) SC insulin, were analysed. Baseline T-50 was 356 (45) minutes. The geometric mean T-50 significantly differed between both treatment groups: 367 [95% confidence interval (CI) 357, 376] for the IP group and 352 (95% CI 347, 357) for the SC group, with a difference of -15 (95% CI -25, -4) minutes, in favour of IP treatment. In multivariable analyses, the IP route of insulin administration was positively associated with T-50, while higher age, triglycerides and phosphate concentrations had an inverse relation. Conclusion: Among persons with T1DM, IP insulin administration results in a more favourable calcification propensity time than SC insulin. It has yet to be shown whether this observation translates into improved cardiovascular outcomes.
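The geometric means and confidence intervals reported here (and in the R-SH study above) are typically computed on the log scale and then back-transformed. A sketch with simulated T-50 values, not the study's data:

```python
# Geometric mean with a 95% CI via log transformation (simulated T-50 data).
import math
import random

random.seed(1)
t50 = [random.gauss(360, 45) for _ in range(100)]  # simulated T-50, minutes

# Work on the log scale: mean, sample SD, standard error
logs = [math.log(x) for x in t50]
n = len(logs)
mean_log = sum(logs) / n
sd_log = math.sqrt(sum((v - mean_log) ** 2 for v in logs) / (n - 1))
se_log = sd_log / math.sqrt(n)

# Back-transform the point estimate and interval limits
geo_mean = math.exp(mean_log)
ci_low = math.exp(mean_log - 1.96 * se_log)
ci_high = math.exp(mean_log + 1.96 * se_log)
```

Comparing two groups works the same way: the difference of log means back-transforms to a ratio of geometric means, which is how an interval such as -15 (95% CI -25, -4) minutes can exclude zero.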
Computer-assisted glucose control in critically ill patients
Objective: Intensive insulin therapy is associated with the risk of hypoglycemia and increased costs of material and personnel. We therefore evaluated the safety and efficiency of a computer-assisted glucose control protocol in a large population of critically ill patients. Design and setting: Observational cohort study in three intensive care units (32 beds) in a 1,300-bed university teaching hospital. Patients: All 2,800 patients admitted to the surgical, neurosurgical, and cardiothoracic units; the study period started at each ICU after implementation of Glucose Regulation for Intensive Care Patients (GRIP), a freely available computer-assisted glucose control protocol. Measurements and results: We analysed compliance in relation to recommended insulin pump rates and glucose measurement frequency. Patients were on GRIP-ordered pump rates 97% of the time. Median measurement time was 5 min late (IQR 20 min early to 34 min late). Hypoglycemia was uncommon (7% of patients for mild hypoglycemia, <3.5 mmol/l; 0.86% for severe hypoglycemia, <2.2 mmol/l). Our predefined target range (4.0-7.5 mmol/l) was reached after a median of 5.6 h (IQR 0.2-11.8) and maintained for 89% (70-100%) of the remaining stay at the ICU. The number of measurements needed was 5.9 (4.8-7.3) per patient per day. In-hospital mortality was 10.1%. Conclusions: Our computer-assisted glucose control protocol provides safe and efficient glucose regulation in routine intensive care practice. A low rate of hypoglycemic episodes was achieved with a considerably lower number of glucose measurements than used in most other schemes.
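The "maintained for 89% of the remaining stay" metric is a time-in-target-range calculation over timestamped glucose readings. A minimal sketch of that bookkeeping, with made-up measurements (this is not GRIP's actual implementation):

```python
# Approximate time in the 4.0-7.5 mmol/l target band from timestamped
# glucose readings; each interval between consecutive measurements is
# attributed to the earlier reading's value. Data are invented.

TARGET_LOW, TARGET_HIGH = 4.0, 7.5

# (hours since admission, glucose in mmol/l) - hypothetical measurements
readings = [(0, 11.2), (2, 9.0), (4, 7.1), (6, 6.0),
            (8, 5.2), (10, 6.8), (12, 8.1), (14, 7.0)]

def fraction_in_range(readings, low=TARGET_LOW, high=TARGET_HIGH):
    """Fraction of monitored time spent inside the target band."""
    total = in_range = 0.0
    for (t0, g0), (t1, _) in zip(readings, readings[1:]):
        dt = t1 - t0
        total += dt
        if low <= g0 <= high:
            in_range += dt
    return in_range / total if total else 0.0
```

For the sample data above, 8 of the 14 monitored hours fall in range. A production protocol would also interpolate between readings and handle gaps, but the principle is the same.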
Measuring adherence to antiretroviral treatment in resource-poor settings: The feasibility of collecting routine data for key indicators
Background: An East African survey showed that among the few health facilities that measured adherence to antiretroviral therapy, practices and definitions varied widely. We evaluated the feasibility of collecting routine data to standardize adherence measurement using a draft set of indicators. Methods: Targeting 20 facilities each in Ethiopia, Kenya, Rwanda, and Uganda, in each facility we interviewed up to 30 patients, examined 100 patient records, and interviewed staff. Results: In 78 facilities, we interviewed a total of 1,631 patients and reviewed 8,282 records. Difficulties in retrieving records prevented data collection in two facilities. Overall, 94.2% of patients reported perfect adherence; dispensed medicine covered 91.1% of days in a six-month retrospective period; 13.7% of patients had a gap of more than 30 days in their dispensed medication; 75.8% of patients attended clinic on or before the date of their next appointment; and 87.1% of patients attended within 3 days. In each of the four countries, the facility-specific median indicators ranged from: 97%-100% for perfect self-reported adherence, 90%-95% of days covered by dispensed medicines, 2%-19% of patients with treatment gaps of 30 days or more, and 72%-91% of appointments attended on time. Individual facilities varied considerably. The percentages of days covered by dispensed medicine, patients with more than 95% of days covered, and patients with a gap of 30 days or more were all significantly correlated with the percentages of patients who attended their appointments on time, within 3 days, or within 30 days of their appointment. Self-reported recent adherence in exit interviews was significantly correlated only with the percentage of patients who attended within 3 days of their appointment. Conclusions: Field tests showed that data to measure adherence can be collected systematically from health facilities in resource-poor settings. The clinical validity of these indicators is assessed in a companion article. Most patients and facilities showed high levels of adherence; however, poor levels of performance in some facilities provide a target for quality improvement efforts.
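One of the draft indicators, the percentage of days covered by dispensed medicine over a retrospective window, can be computed directly from pharmacy dispensing records. A sketch with invented dates and supplies (the indicator definition here is an assumption based on the abstract, not the study's exact algorithm):

```python
# Percentage of days covered by dispensed medicine in a six-month window:
# mark each supplied day inside the window as covered, counting overlaps once.
from datetime import date, timedelta

WINDOW_DAYS = 183  # six-month retrospective window
window_end = date(2010, 6, 30)
window_start = window_end - timedelta(days=WINDOW_DAYS - 1)

# (dispense date, days of medicine supplied) - hypothetical records
dispensings = [
    (date(2010, 1, 4), 30),
    (date(2010, 2, 3), 30),
    (date(2010, 3, 10), 30),  # refilled late: a short treatment gap
    (date(2010, 4, 8), 30),
    (date(2010, 5, 7), 30),
    (date(2010, 6, 5), 30),
]

covered = set()
for dispensed, supply in dispensings:
    for offset in range(supply):
        day = dispensed + timedelta(days=offset)
        if window_start <= day <= window_end:
            covered.add(day)

pct_days_covered = 100 * len(covered) / WINDOW_DAYS
```

The same covered-day set supports the companion indicator, flagging patients whose longest uncovered stretch exceeds 30 days.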
Applied science facilitates the large-scale expansion of protected areas in an Amazonian hot spot
Meeting international commitments to protect 17% of terrestrial ecosystems worldwide will require >3 million square kilometers of new protected areas and strategies to create those areas in a way that respects local communities and land use. In 2000–2016, biological and social scientists worked to increase the protected proportion of Peru’s largest department via 14 interdisciplinary inventories covering >9 million hectares of this megadiverse corner of the Amazon basin. In each landscape, the strategy was the same: convene diverse partners, identify biological and sociocultural assets, document residents’ use of natural resources, and tailor the findings to the needs of decision-makers. Nine of the 14 landscapes have since been protected (5.7 million hectares of new protected areas), contributing to a quadrupling of conservation coverage in Loreto (from 6 to 23%). We outline the methods and enabling conditions most crucial for successfully applying similar campaigns elsewhere on Earth.