
    Consumption of nitric oxide by endothelial cells: Evidence for the involvement of a NAD(P)H-, flavin- and heme-dependent dioxygenase reaction

    In the present study, we investigated the mechanism of nitric oxide (NO) inactivation by endothelial cells. All experiments were performed in the presence of superoxide dismutase to minimize the peroxynitrite reaction. Incubation of the NO donor diethylamine/NO adduct with increasing amounts of intact cells led to a progressive decrease of the NO concentration, demonstrating a cell-dependent consumption of NO. In cell homogenates, consumption of NO critically depended on the presence of NADPH or NADH and resulted in the formation of nitrate. Both NO consumption and nitrate formation were largely inhibited by the heme poisons NaCN and phenylhydrazine as well as by the flavoenzyme inhibitor diphenylene iodonium. Further characterization of this NO consumption pathway suggests that endothelial cells express a unique membrane-associated enzyme or enzyme system, analogous to the bacterial NO dioxygenase, that converts NO to nitrate in a NAD(P)H-, flavin- and heme-dependent manner.

    Immunological mechanisms in specific immunotherapy

    Specific immunotherapy (SIT) represents the only curative treatment of allergy and is, therefore, of particular interest for immunological and pharmacological research. The current understanding of the immunological mechanisms underlying SIT focuses on regulatory T cells (T regs), which balance Th1 and Th2 effector functions. This ensures that allergens are recognized, but tolerated by the immune system. There is clear evidence that SIT restores the disturbed balance of T regs and effector cells in allergic patients. Current efforts focus on improving SIT regimens to make them more applicable in atopy and asthma. This review provides an overview of the mechanisms of SIT and of possible adjuvant treatment strategies against the background of the T reg concept.

    Statistical Analysis of the Influence of Cardiac Arrhythmias on Mortality Risk

    Cardiac arrhythmias are an extremely threatening condition and can lead to sudden cardiac death. In the Federal Republic of Germany, about 100,000 patients die each year of cardiac arrest, which in 65 - 80 % of cases is caused by an arrhythmia (Trappe et al., 1996). It has been known for more than 20 years that the extent of the arrhythmias substantially influences the risk of sudden cardiac death (Moss et al., 1979). Identifying patients with an elevated mortality risk is therefore of considerable interest and has still not been solved satisfactorily; the choice of the appropriate therapy depends on this question. For patients with an elevated mortality risk, the implantation of a defibrillator is currently the only effective therapy. Drug treatment with so-called antiarrhythmics was long the therapy of choice, until at the end of the 1980s a study from the USA demonstrated an increased mortality risk for some of these drugs (CAST study, 1989). Since then, research has concentrated on two points: the development of new drugs and the identification of particularly endangered patients. The only non-invasive method currently available for recording the frequency of arrhythmias is the 24-hour Holter ECG. At present, only the extent of the arrhythmias, i.e. the frequency of so-called ventricular extrasystoles (VES), is used to divide patients into risk groups; this factor alone, however, is not informative enough. It is therefore natural to exploit the information about the arrhythmias more fully and, above all, to describe the complexity of the arrhythmias better. To this end, all intervals between two consecutive heartbeats, the so-called RR intervals, are extracted from the 24-hour Holter ECG. At an average of one heartbeat per second, about 90,000 such intervals accumulate over 24 hours; this amount of data poses a particular challenge for the analysis methods. In a first approach, methods from the field of nonlinear dynamics were applied (Schmidt et al., 1996). It is known that, in addition to the arrhythmias, the variability of the RR intervals also influences the risk. With the approaches based on nonlinear dynamics, two parameters were derived from the data of a 24-hour Holter ECG (alpha_VES and alpha_sin); the first describes the complexity, the second the variability. The present work applies statistical methods from curve estimation, logistic regression, and Cox regression to identify particularly endangered patients; data from 60 patients were available for this analysis. The aim of this investigation is, in particular, to replace the elaborate method of determining alpha_sin and alpha_VES by a new one that is conceptually and numerically simpler, that, unlike the established method, can be carried out fully algorithmically, and that, with appropriate further development, could in part even be run online.
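
    The thesis derives its parameters alpha_VES and alpha_sin with methods from nonlinear dynamics that are not spelled out in this abstract. Purely as an illustration of the preprocessing step described above, the following Python sketch extracts RR intervals from R-peak timestamps and computes SDNN, a standard (and much simpler) variability measure; the function names and the simulated data are assumptions, not part of the original work.

    ```python
    import numpy as np

    def rr_intervals_ms(beat_times_s):
        """RR intervals in ms from consecutive R-peak timestamps in seconds,
        e.g. the beat series of a 24-hour Holter ECG."""
        return np.diff(np.asarray(beat_times_s, dtype=float)) * 1000.0

    def sdnn(rr_ms):
        """SDNN: sample standard deviation of the RR intervals (ms),
        a basic measure of heart-rate variability."""
        return float(np.std(rr_ms, ddof=1))

    # About one beat per second over 24 hours yields roughly 90,000 intervals.
    rng = np.random.default_rng(42)
    beats = np.cumsum(rng.normal(1.0, 0.05, 90_000))  # simulated beat times (s)
    rr = rr_intervals_ms(beats)
    print(f"{rr.size} intervals, SDNN = {sdnn(rr):.1f} ms")
    ```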

    Enterprise Composition Architecture for Micro-Granular Digital Services and Products

    The digitization of our society changes the way we live, work, learn, communicate, and collaborate. This defines the strategic context for composing resilient enterprise architectures for micro-granular digital services and products. The change from a closed-world modeling perspective to a more flexible open-world composition and evolution of system architectures defines the moving context for adaptable systems, which are essential to enable the digital transformation. Enterprises are presently transforming their strategy and culture, together with their processes and information systems, to become more digital. The digital transformation deeply disrupts existing enterprises and economies. For years, new business opportunities have been emerging that exploit the potential of the Internet and related digital technologies, such as the Internet of Things, services computing, cloud computing, big data with analytics, mobile systems, collaboration networks, and cyber-physical systems. Digitization fosters the development of IT systems composed of many rather small and distributed structures, such as the Internet of Things or mobile systems. In this paper, we focus on the continuous bottom-up integration of micro-granular architectures for a huge number of dynamically growing systems and services, such as the Internet of Things and microservices, as part of a new digital enterprise architecture. To integrate micro-granular architecture models into living architectural model versions, we extend traditional enterprise architecture reference models with state-of-the-art elements for agile architectural engineering to support the digitalization of services, their related products, and their processes.
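
    The paper argues at the level of architecture reference models rather than code. Purely as a hedged illustration of the bottom-up integration idea, the sketch below registers micro-granular service descriptors into a continuously versioned architecture model; all class, field, and service names are hypothetical.

    ```python
    from dataclasses import dataclass, field

    @dataclass(frozen=True)
    class ServiceDescriptor:
        """A micro-granular element, e.g. a microservice or IoT endpoint."""
        name: str
        version: str
        provides: tuple  # capabilities offered
        requires: tuple  # capabilities consumed

    @dataclass
    class ArchitectureModel:
        """A 'living' model version assembled bottom-up from descriptors."""
        revision: int = 0
        services: dict = field(default_factory=dict)

        def integrate(self, svc: ServiceDescriptor) -> None:
            # Each registration yields a new model revision, so the
            # architecture evolves continuously instead of being fixed up front.
            self.services[svc.name] = svc
            self.revision += 1

        def unresolved(self) -> set:
            """Capabilities required by some service but provided by none."""
            provided = {c for s in self.services.values() for c in s.provides}
            required = {c for s in self.services.values() for c in s.requires}
            return required - provided

    model = ArchitectureModel()
    model.integrate(ServiceDescriptor("sensor-gw", "1.2", ("telemetry",), ()))
    model.integrate(ServiceDescriptor("analytics", "0.9", ("reports",), ("telemetry",)))
    print(model.revision, model.unresolved())  # -> 2 set()
    ```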

    Risk Stratification in Post-MI Patients Based on Left Ventricular Ejection Fraction and Heart-Rate Turbulence

    Objectives: Development of risk stratification criteria for predicting mortality in post-infarction patients, taking into account LVEF and heart-rate turbulence (HRT). Methods: Based on previous results, the two parameters LVEF (continuous) and turbulence slope (TS), as an indicator of HRT, were combined for risk stratification. The method was applied to two independent data sets (the MPIP trial and the EMIAT study). Results: The criteria were defined to match the sensitivity obtained by applying LVEF ≤ 30 %. In the MPIP trial the optimal criteria selected are TS normal and LVEF ≤ 21 %, or TS abnormal and LVEF ≤ 40 %. Within the placebo group of the EMIAT study the corresponding criteria are: TS normal and LVEF ≤ 23 %, or TS abnormal and LVEF ≤ 40 %. Combining both studies, the following criteria were obtained: TS normal and LVEF ≤ 20 %, or TS abnormal and LVEF ≤ 40 %. In the MPIP study, 83 of the 581 patients (14.3 %) fulfil these criteria; within this group, 30 patients died during follow-up. In the EMIAT trial, 218 of the 591 patients (37.9 %) are classified as high-risk patients, with 53 deaths. Combining both studies, the high-risk group contains 301 patients with 83 deaths (ppv = 27.7 %). Using the MADIT criterion as classification rule (LVEF ≤ 30 %), a sample of 375 patients with 85 deaths (ppv = 24 %) is selected. Conclusions: The stratification rule based on LVEF and TS is able to select high-risk patients suitable for implantation of an ICD. The rule performs better than the classical one based on LVEF alone: the high-risk group under the new criteria is smaller, with about the same number of deaths, and therefore has a higher positive predictive value. The classification criteria were validated in a bootstrap study with 100 replications. In all samples the rule based on TS and LVEF (NEW) was superior to LVEF alone; the high-risk group was smaller (mean ± s: 301 ± 14.5 (NEW) vs. 375 ± 14.5 (LVEF)) and the positive predictive value was larger (mean ± s: 27.2 ± 2.6 % (NEW) vs. 23.3 ± 2.2 % (LVEF)). The new criteria are less expensive due to the smaller number of high-risk patients selected.
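
    The pooled decision rule and its positive predictive value can be written down directly. The Python sketch below is illustrative only: the thresholds are taken from the abstract, while the function names and the arithmetic check are assumptions.

    ```python
    def high_risk(ts_abnormal: bool, lvef: float) -> bool:
        """Pooled rule from both studies: TS normal and LVEF <= 20 %,
        or TS abnormal and LVEF <= 40 %."""
        return lvef <= (40.0 if ts_abnormal else 20.0)

    def ppv(flags, died):
        """Positive predictive value: fraction of deaths among flagged patients."""
        selected = [d for f, d in zip(flags, died) if f]
        return sum(selected) / len(selected) if selected else float("nan")

    # Check against the pooled figures above: 83 deaths among 301 selected
    # patients give a ppv of about 27.6 %, matching the ~27.7 % reported.
    print(f"{83 / 301:.1%}")
    ```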

    Gender-Specific Problems in the Promotion of Gifted Students


    A Statistical Model for Risk Stratification on the Basis of Left Ventricular Ejection Fraction and Heart-Rate Turbulence

    The MPIP data set was used to obtain a model for mortality risk stratification of acute myocardial infarction patients. The predictors heart-rate turbulence (HRT) and left-ventricular ejection fraction (LVEF) were employed. HRT was a categorical variable with three levels; LVEF was continuous, and its influence on the relative risk was described by the natural logarithm function (found using fractional polynomials). A Cox proportional hazards (PH) model with HRT and ln(LVEF) was constructed and used for risk stratification. The model can be used to divide the patients into two or more groups according to mortality risk. It also describes the relationship between risk and the predictors by a (continuous) function, which allows the calculation of individual mortality risk.
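
    The abstract does not state which software was used. As a hedged sketch of the model structure (categorical HRT plus ln(LVEF) in a Cox proportional hazards model), the following Python fragment uses the lifelines package on simulated stand-in data; the column names, distributions, and package choice are all assumptions.

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    # Simulated stand-in for MPIP-style data.
    rng = np.random.default_rng(0)
    n = 500
    df = pd.DataFrame({
        "hrt": rng.integers(0, 3, n),        # HRT category: three levels
        "lvef": rng.uniform(10.0, 65.0, n),  # ejection fraction in %
        "time": rng.exponential(1000.0, n),  # follow-up time
        "event": rng.integers(0, 2, n),      # 1 = death observed, 0 = censored
    })
    df["ln_lvef"] = np.log(df["lvef"])       # natural-log transform of LVEF

    # HRT enters the model as a categorical predictor (two dummy variables).
    df = pd.get_dummies(df, columns=["hrt"], drop_first=True, dtype=float)

    cph = CoxPHFitter()
    cph.fit(df.drop(columns="lvef"), duration_col="time", event_col="event")
    cph.print_summary()

    # The fitted (continuous) risk function allows individual risk calculation;
    # thresholding the partial hazard splits patients into risk groups.
    df["risk"] = cph.predict_partial_hazard(df)
    ```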

    Reversible inactivation of endothelial nitric oxide synthase by NG-nitro-l-arginine

    NG-Methyl-l-arginine (L-NMA) and NG-nitro-l-arginine (L-NNA) inhibited NO-induced cGMP accumulation in porcine aortic endothelial cells with half-maximally effective concentrations of 15 and 3.4 μM, respectively. The effects of both compounds were reversible, but the L-NNA-induced inhibition was only reversed by wash-out in the presence of 1 mM l-arginine. In short-term incubations (45 s) of membrane fractions, L-NMA and L-NNA exhibited similar potencies to inhibit endothelial NO synthase, but L-NNA was markedly more potent than L-NMA after prolonged incubation periods (≥ 3 min) due to induction of a pronounced, reversible enzyme inactivation.

    Challenging the Need for Transparency, Controllability, and Consistency in Usable Adaptation Design

    Adaptive applications constitute the basis for many ubiquitous computing scenarios, as they can dynamically adapt to changing contexts. The usability design principles of transparency, controllability, and consistency have been recommended for the design of adaptive interfaces. However, designing self-adaptive applications that may act completely autonomously is still a challenging task, because there is no established set of usability design guidelines. Applying the three principles in the design of the five different adaptations of the mobile adaptive application Meet-U proved difficult. Based on an analysis of the design problem space, we elaborate an approach for the design of usable adaptations. Our approach is based on a notification design concept which calculates the attention costs and utility benefits of notified adaptations by varying the design aspects of transparency and controllability. We present several designs for the adaptations of Meet-U. The results of a user study show that the notification design approach is beneficial for the design of adaptations. Varying transparency and controllability is necessary to adjust an adaptation's design to the particular context of use. This leads to a partially inconsistent design for adaptations within an application.
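
    The cost/benefit calculation is described in this abstract only at the concept level. The minimal Python sketch below is purely hypothetical: it scores notification design variants that differ in transparency and controllability by utility minus attention cost; every name and number is invented for illustration.

    ```python
    from dataclasses import dataclass

    @dataclass
    class NotificationDesign:
        """One design variant for announcing an adaptation (hypothetical)."""
        name: str
        transparency: float     # 0..1: how much the adaptation is explained
        controllability: float  # 0..1: how much the user can intervene
        attention_cost: float   # estimated cost of interrupting the user
        utility: float          # estimated benefit of informing the user

    def best_design(variants):
        """Pick the variant with the highest utility-minus-cost score."""
        return max(variants, key=lambda v: v.utility - v.attention_cost)

    variants = [
        NotificationDesign("silent", 0.0, 0.0, attention_cost=0.0, utility=0.1),
        NotificationDesign("notify-only", 0.7, 0.0, attention_cost=0.3, utility=0.6),
        NotificationDesign("ask-user", 1.0, 1.0, attention_cost=0.8, utility=0.9),
    ]
    print(best_design(variants).name)  # -> notify-only, for these toy numbers
    ```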