The dermomyotome ventrolateral lip is essential for the hypaxial myotome formation
Background
The myotome is the primitive skeletal muscle that forms within the embryonic metameric body wall. It can be subdivided into an epaxial and a hypaxial domain. It has been shown that formation of the epaxial myotome requires the dorsomedial lip of the dermomyotome (DML). Although the ventrolateral lip (VLL) of the dermomyotome is believed to be required for the formation of the hypaxial myotome, experimental evidence for this statement has yet to be provided. Such data would also resolve a debate regarding the formation of the hypaxial dermomyotome, for which two mechanisms have been proposed. The first proposes that the intermediate dermomyotome undergoes cellular expansion, thereby pushing the ventrolateral lip in a lateral direction (translocation). The alternative view holds that the ventrolateral lip itself grows laterally.
Results
Using time-lapse confocal microscopy, we observed that the GFP-labelled ventrolateral lip (VLL) of the dermomyotome grows, rather than translocates, in a lateral direction. The necessity of the VLL for lateral extension of the myotome was addressed by ablation studies. We found that the hypaxial myotome did not form after VLL ablation. In contrast, removal of an intermediate portion of the dermomyotome had very little effect on the hypaxial myotome. These results demonstrate that the VLL is required for the formation of the hypaxial myotome.
Conclusion
Our study demonstrates that the dermomyotome ventrolateral lip is essential for hypaxial myotome formation and supports the lip extension model. Therefore, despite being under independent signalling controls, both the dorsomedial and ventrolateral lips fulfil the same function, i.e. they extend into adjacent regions, permitting the growth of the myotome.
A Valuable Tool for Risk Stratification in Septic Patients Admitted to ICU
The lactate/albumin ratio has been reported to be associated with mortality in pediatric patients with sepsis. We aimed to evaluate the lactate/albumin ratio for its prognostic relevance in a larger collective of critically ill (adult) patients admitted to an intensive care unit (ICU). A total of 348 medical patients admitted to a German ICU for sepsis between 2004 and 2009 were included. Follow-up of patients was performed retrospectively between May 2013 and November 2013. The association of the lactate/albumin ratio (cut-off 0.15) with both in-hospital and post-discharge mortality was investigated. An optimal cut-off was calculated by means of Youden's index. The lactate/albumin ratio was elevated in non-survivors (p < 0.001). Patients with an increased lactate/albumin ratio were of similar age but were clinically in a poorer condition and had more pronounced laboratory signs of multi-organ failure. An increased lactate/albumin ratio was associated with adverse in-hospital mortality. An optimal cut-off of 0.15 was calculated and was associated with adverse long-term outcome even after correction for APACHE II and SAPS II scores. We matched 99 patients with a lactate/albumin ratio >0.15 to case-controls with a lactate/albumin ratio <0.15, corrected for APACHE II scores: the group with a lactate/albumin ratio >0.15 evidenced adverse in-hospital outcome in a paired analysis, with a difference of 27% (95% CI 10–43%; p < 0.01). Regarding long-term mortality, again, patients in the group with a lactate/albumin ratio >0.15 showed adverse outcomes (p < 0.001). An increased lactate/albumin ratio was significantly associated with an adverse outcome in critically ill patients admitted to an ICU, even after correction for confounders. The lactate/albumin ratio might constitute an independent, readily available, and important parameter for risk stratification in the critically ill.
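The cut-off selection described above can be sketched as follows. This is a minimal illustration of choosing a biomarker threshold via Youden's index (J = sensitivity + specificity − 1); the data, cohort, and resulting cut-off below are hypothetical and do not reproduce the study's analysis.

```python
# Illustrative sketch: optimal cut-off for a biomarker (e.g. the
# lactate/albumin ratio) via Youden's index, J = sensitivity + specificity - 1.
# All values below are hypothetical placeholder data.

def youden_optimal_cutoff(values, outcomes):
    """Return the cut-off maximizing Youden's J for a binary outcome (1 = event)."""
    best_cutoff, best_j = None, -1.0
    for c in sorted(set(values)):
        tp = sum(1 for v, o in zip(values, outcomes) if v > c and o == 1)
        fn = sum(1 for v, o in zip(values, outcomes) if v <= c and o == 1)
        tn = sum(1 for v, o in zip(values, outcomes) if v <= c and o == 0)
        fp = sum(1 for v, o in zip(values, outcomes) if v > c and o == 0)
        sens = tp / (tp + fn) if (tp + fn) else 0.0
        spec = tn / (tn + fp) if (tn + fp) else 0.0
        j = sens + spec - 1.0
        if j > best_j:
            best_j, best_cutoff = j, c
    return best_cutoff, best_j

# Hypothetical lactate/albumin ratios with in-hospital death (1) / survival (0):
ratios = [0.05, 0.08, 0.12, 0.14, 0.16, 0.20, 0.25, 0.30]
deaths = [0, 0, 0, 1, 0, 1, 1, 1]
cutoff, j = youden_optimal_cutoff(ratios, deaths)  # classify as "high" when value > cutoff
```

In practice such cut-offs are read off an ROC curve; the exhaustive scan above is equivalent for small datasets.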
Pharmacosimulation of delays and interruptions during administration of tirofiban: a systematic comparison between EU and US dosage regimens.
Tirofiban is a glycoprotein (GP) IIb/IIIa receptor antagonist, which inhibits platelet aggregation and is a potential adjunctive antithrombotic treatment in patients with acute coronary syndromes (ACS) or high-risk percutaneous coronary interventions (PCI). It is administered intravenously as a bolus followed by continuous infusion. However, the dosage recommendations in the United States (US) and European Union (EU) differ considerably. Furthermore, in routine clinical practice, deviations from the recommendations may occur. The objective of the present study was to investigate the impact of different alterations on tirofiban plasma concentrations in US and EU administration regimens and to give suggestions for delay management in clinical practice. We therefore mathematically simulated the effects of different bolus-infusion delays and infusion interruptions in different scenarios according to renal function. Here, we provide a systematic assessment of concentration patterns of tirofiban in the US versus EU dosage regimens. We show that differences between the two regimens have important effects on plasma drug levels. Furthermore, we demonstrate that deviations from the proper administration mode affect the concentration of tirofiban. Additionally, we calculated the optimal dosage of a second bolus to rapidly restore the initial concentration without causing overdosage. In conclusion, differences in tirofiban dosing regimens between the US and EU and potential infusion interruptions have important effects on drug levels that may affect the degree of platelet inhibition and thus antithrombotic effects. Thus, the findings of our modelling studies may help to explain differences in clinical outcomes observed in previous clinical trials on tirofiban.
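The kind of pharmacosimulation described above can be illustrated with a deliberately simplified model. The sketch below assumes a one-compartment model with first-order elimination (the study's actual model and tirofiban's real pharmacokinetic parameters are not reproduced here; V, k, dose, and infusion rate are placeholder values) and shows how a bolus-infusion delay lowers the plasma concentration.

```python
import math

# Simplified one-compartment sketch (NOT the study's model): bolus at t = 0
# plus a continuous infusion that may start late, with first-order elimination.
# V (volume of distribution), k (elimination rate), dose, and infusion rate
# are hypothetical placeholder values, not tirofiban parameters.

V = 30.0   # volume of distribution (L) -- hypothetical
k = 0.35   # elimination rate constant (1/h) -- hypothetical

def concentration(t, bolus_mg, infusion_mg_per_h, infusion_start_h=0.0):
    """Plasma concentration (mg/L) at time t (h): decaying bolus contribution
    plus the build-up of an infusion starting at infusion_start_h (a delay)."""
    c = (bolus_mg / V) * math.exp(-k * t)  # bolus term: (Dose/V) * e^(-kt)
    if t > infusion_start_h:
        te = t - infusion_start_h
        # infusion term: (R0 / (k*V)) * (1 - e^(-k*te))
        c += (infusion_mg_per_h / (k * V)) * (1 - math.exp(-k * te))
    return c

# Compare an on-time infusion with a 1-hour bolus-infusion delay at t = 2 h:
on_time = concentration(2.0, bolus_mg=10.0, infusion_mg_per_h=4.0)
delayed = concentration(2.0, bolus_mg=10.0, infusion_mg_per_h=4.0, infusion_start_h=1.0)
```

Even in this toy model, the delayed regimen sits below the on-time one at any fixed time point, which is the qualitative effect the simulations quantify for the US and EU regimens.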
Arginase Inhibition Reverses Monocrotaline-Induced Pulmonary Hypertension
Pulmonary hypertension (PH) is a heterogeneous disorder associated with a poor prognosis. Thus, the development of novel treatment strategies is of great interest. The enzyme arginase (Arg) is emerging as an important player in PH development. The aim of the current study was to determine the expression of ArgI and ArgII as well as the effects of Arg inhibition in a rat model of PH. PH was induced in 35 Sprague–Dawley rats by monocrotaline (MCT, 60 mg/kg as a single dose). There were three experimental groups: sham-treated controls (control group, n = 11), MCT-induced PH (MCT group, n = 11) and MCT-induced PH treated with the Arg inhibitor Nω-hydroxy-nor-l-arginine (nor-NOHA; MCT/NorNoha group, n = 13). ArgI and ArgII expression was determined by immunohistochemistry and Western blot. Right ventricular systolic pressure (RVPsys) was measured and lung tissue remodeling was determined. Induction of PH resulted in an increase in RVPsys (81 ± 16 mmHg) compared to the control group (41 ± 15 mmHg, p = 0.002), accompanied by a significant elevation of the histological sum-score (8.2 ± 2.4 in the MCT compared to 1.6 ± 1.6 in the control group, p < 0.001). Both ArgI and ArgII were substantially expressed in lung tissue, and expression was significantly increased in the MCT compared to the control group (p < 0.01). Arg inhibition resulted in a significant reduction of RVPsys to 52 ± 19 mmHg (p = 0.006) and of the histological sum-score to 5.8 ± 1.4 compared to the MCT group (p = 0.022). PH leads to increased expression of Arg. Arg inhibition leads to a reduction of RVPsys and diminished lung tissue remodeling and therefore represents a potential treatment strategy in PH.
Clinical Frailty Scale (CFS) reliably stratifies octogenarians in German ICUs: a multicentre prospective cohort study
Background: In intensive care units (ICUs), octogenarians have become a routine patient group, with aggravated therapeutic and diagnostic decision-making. Due to increased mortality and reduced quality of life in this high-risk population, medical decision-making requires, a fortiori, optimal risk stratification. Recently, the VIP-1 trial prospectively observed that the Clinical Frailty Scale (CFS) performed well in ICU patients for overall survival and short-term outcome prediction. However, it is known that healthcare systems differ among the 21 countries contributing to the VIP-1 trial. Hence, our main focus was to investigate whether the CFS is usable for risk stratification in octogenarians admitted to diversified and high-tech German ICUs.
Methods: This multicentre prospective cohort study analyses very old patients admitted to 20 German ICUs as a sub-analysis of the VIP-1 trial. Three hundred and eight patients aged 80 years or older were admitted consecutively to participating ICUs. CFS, cause of admission, APACHE II, SAPS II and SOFA scores, use of ICU resources and ICU and 30-day mortality were recorded. Multivariate logistic regression analysis was used to identify factors associated with 30-day mortality.
Results: Patients had a median age of 84 [IQR 82–87] years and a mean CFS of 4.75 (± 1.6 standard deviation) points. More than half of the patients (53.6%) were classified as frail (CFS ≥ 5). ICU mortality was 17.3% and 30-day mortality was 31.2%. The cause of admission (planned vs. unplanned; OR 5.74) and the CFS (OR 1.44 per point increase) were independent predictors of 30-day survival.
Conclusions: The CFS is an easily determinable and valuable tool for the prediction of 30-day survival in octogenarians and may thus facilitate decision-making for intensive care providers in Germany.
Trial registration: The VIP-1 study was retrospectively registered on ClinicalTrials.gov (ID: NCT03134807) on May 1, 2017.
Impella versus extracorporeal life support in cardiogenic shock: a propensity score adjusted analysis
Aims: The mortality in cardiogenic shock (CS) is high. The role of specific mechanical circulatory support (MCS) systems is unclear. We aimed to compare patients receiving Impella versus ECLS (extracorporeal life support) with regard to baseline characteristics, feasibility, and outcomes in CS. Methods and results: This is a retrospective cohort study including CS patients over 18 years with a complete follow-up of the primary endpoint and available baseline lactate level, receiving haemodynamic support either by Impella 2.5 or ECLS from two European registries. The decision for device implantation was made at the discretion of the treating physician. The primary endpoint of this study was all-cause mortality at 30 days. A propensity score for the use of Impella was calculated, and multivariable logistic regression was used to obtain adjusted odds ratios (aOR). In total, 149 patients were included, receiving either Impella (n = 73) or ECLS (n = 76) for CS. The feasibility of device implantation was high (87%) and similar (aOR: 3.14; 95% CI: 0.18–56.50; P = 0.41) with both systems. The rates of vascular injuries (aOR: 0.95; 95% CI: 0.10–3.50; P = 0.56) and bleedings requiring transfusions (aOR: 0.44; 95% CI: 0.09–2.10; P = 0.29) were similar in ECLS patients and Impella patients. The use of Impella or ECLS was not associated with increased odds of mortality (aOR: 4.19; 95% CI: 0.53–33.25; P = 0.17) after correction for propensity score and baseline lactate level. Baseline lactate level was independently associated with increased odds of 30-day mortality (per mmol/L increase; OR: 1.29; 95% CI: 1.14–1.45; P < 0.001). Conclusions: In CS patients, the adjusted mortality rates of both ECLS and Impella were high and similar. The baseline lactate level was a potent predictor of mortality and could play a role in patient selection for therapy in future studies.
In patients with profound CS, the type of device is likely to be less important compared with other parameters, including non-cardiac and neurological factors.
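The propensity score used above is, in essence, the modelled probability of receiving one device rather than the other given baseline covariates. The sketch below illustrates the idea with a hand-rolled logistic regression on hypothetical data; the study's covariates, model, and software are not reproduced here.

```python
import math

# Illustrative sketch: a propensity score is the fitted probability of
# treatment assignment (here: 1 = Impella, 0 = ECLS) given baseline covariates.
# Covariates and data below are hypothetical; real analyses use established
# statistical packages rather than this minimal gradient-descent fit.

def fit_logistic(X, y, lr=0.1, steps=5000):
    """Fit logistic regression (intercept + one weight per covariate)."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(steps):
        grad = [0.0] * len(w)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))
            err = p - yi
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / len(y) for wj, g in zip(w, grad)]
    return w

def propensity(w, xi):
    z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical covariates [baseline lactate (mmol/L), age / 10]; y = 1 if Impella:
X = [[2.0, 6.5], [8.0, 7.0], [3.0, 5.5], [9.5, 6.8], [4.0, 6.0], [7.0, 7.2]]
y = [1, 0, 1, 0, 1, 0]
w = fit_logistic(X, y)
scores = [propensity(w, xi) for xi in X]
```

The fitted scores can then be used for matching or as a covariate in the outcome regression, which is how adjusted odds ratios such as those above are obtained.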
The role of CXCR4 and SDF-1 in the embryonic development of the shoulder girdle musculature
In this dissertation, a novel method was developed that made it possible, for the first time, to visualize the "in-out" mechanism live during the embryonic development of the shoulder girdle musculature. Furthermore, a receptor-ligand pair was identified that is essentially involved in the formation of the shoulder girdle musculature in chicken and mouse embryos. The data collected in the course of this dissertation demonstrate that the CXCR4/SDF-1 signalling system is indispensable for the formation of the shoulder girdle musculature in the chicken and mouse models. In addition, using the time-lapse imaging technique developed here, it was shown that the CXCR4 inhibitor impairs the retrograde migration of muscle progenitor cells during the "in-out" process.
Diagnosing sarcomatoid carcinoma and carcinosarcoma of the large bowel: heads or tails?
Sarcomatoid carcinoma and carcinosarcoma of the large bowel represent two faces of a rare tumor with an ominous prognosis. When grappling with this disease, the choice of the appropriate terminology is far from an easy task, as discussed below.
Lactate Clearance Predicts Good Neurological Outcomes in Cardiac Arrest Patients Treated with Extracorporeal Cardiopulmonary Resuscitation
Background: We evaluated critically ill patients undergoing extracorporeal cardiopulmonary resuscitation (ECPR) due to cardiac arrest (CA), with respect to baseline characteristics and laboratory assessments, including lactate and lactate clearance, for prognostic relevance. Methods: The primary endpoint was 30-day mortality. The impact on 30-day mortality was assessed by uni- and multivariable Cox regression analyses. Neurological outcome assessed by the Glasgow Outcome Scale (GOS) was pooled into two groups: scores of 1–3 (bad GOS score) and scores of 4–5 (good GOS score). Results: A total of 93 patients were included in the study. Serum lactate concentration (hazard ratio (HR) 1.09; 95% confidence interval (CI) 1.04–1.13; p < 0.001), hemoglobin (Hb; HR 0.87; 95% CI 0.79–0.96; p = 0.004), and catecholamine use were associated with 30-day mortality. In a multivariable model, only lactate clearance (after 6 h; OR 0.97; 95% CI 0.94–0.997; p = 0.03) was associated with a good GOS score. The optimal cut-off of lactate clearance at 6 h for the prediction of a bad GOS score was ≤13%. Patients with a lactate clearance at 6 h ≤13% evidenced higher rates of bad GOS scores (97% vs. 73%; p = 0.01). Conclusions: Whereas lactate clearance does not predict mortality, it was the sole predictor of good neurological outcomes and might therefore guide clinicians on when to stop ECPR.
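Lactate clearance, as used above, is a simple percentage change between two measurements. The abstract does not state its exact formula, so the sketch below uses the widely cited definition, with hypothetical values:

```python
# Widely used definition of lactate clearance (the abstract's exact formula
# is not given; this is the common form):
#   clearance(%) = (lactate_initial - lactate_t) / lactate_initial * 100

def lactate_clearance(initial_mmol_l, later_mmol_l):
    """Percentage fall in serum lactate between two time points (negative if rising)."""
    return (initial_mmol_l - later_mmol_l) / initial_mmol_l * 100.0

# Hypothetical example: lactate falling from 8.0 to 6.0 mmol/L over 6 h
# gives 25% clearance, above the abstract's <=13% cut-off associated with
# a bad neurological outcome.
six_hour_clearance = lactate_clearance(8.0, 6.0)
```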
Differential Impact of Hyperglycemia in Critically Ill Patients: Significance in Acute Myocardial Infarction but Not in Sepsis?
Hyperglycemia is a common condition in critically ill patients admitted to an intensive care unit (ICU). These patients represent an inhomogeneous collective, and hyperglycemia might need different evaluation depending on the underlying disorder. To elucidate this, we investigated and compared associations of severe hyperglycemia (>200 mg/dL) and mortality in patients admitted to an ICU for acute myocardial infarction (AMI) or sepsis as the two most frequent admission diagnoses. From 2006 to 2009, 2551 patients (aged 69 (58–77) years; 1544 male; 337 suffering from type 2 diabetes (T2DM)) who were admitted because of either AMI or sepsis to an ICU in a tertiary care hospital were investigated retrospectively. Follow-up of patients was performed between May 2013 and November 2013. In a Cox regression analysis, maximum glucose concentration on the day of admission was associated with mortality in the overall cohort (HR = 1.006, 95% CI: 1.004–1.009; p < 0.001) and in patients suffering from myocardial infarction (HR = 1.101, 95% CI: 1.075–1.127; p < 0.001), but only in trend in patients admitted to an ICU for sepsis (HR = 1.030, 95% CI: 0.998–1.062; p = 0.07). Severe hyperglycemia was associated with adverse intra-ICU mortality in the overall cohort (23% vs. 13%; p < 0.001) and in patients admitted for AMI (15% vs. 5%; p < 0.001), but not in septic patients (39% vs. 40%; p = 0.48). A medical history of type 2 diabetes (n = 337; 13%) was not associated with increased intra-ICU mortality (15% vs. 15%; p = 0.93), but when severe hyperglycemia and/or a known medical history of type 2 diabetes were considered in combination, increased mortality was evident in AMI patients (intra-ICU 5% vs. 13%; p < 0.001) but not in septic patients (intra-ICU 38% vs. 41%; p = 0.53). The presence of hyperglycemia in critically ill patients has a differential impact within the different etiological groups.
Hyperglycemia in AMI patients might identify a sicker patient collective suffering from pre-diabetes or undiagnosed diabetes with its known adverse consequences, especially in the long term. Hyperglycemia in sepsis might be considered an adaptive survival mechanism in response to hypoperfusion and the consequent lack of glucose in peripheral cells. AMI patients with hyperglycemic derailment during an ICU stay should be closely followed up and extensively screened for diabetes to improve patients' outcomes.