34 research outputs found

    Auf dem Weg zur individualisierten Medizin - Grid-basierte Services für die EPA der Zukunft.

    Personalized medicine is of paramount interest for many areas in medical informatics. It requires that genotype as well as phenotype data about patients be available. These data will be stored in Electronic Health Records or, patient-controlled, in Personal Health Records. As the amount of (raw) data rises continuously, methods for secure data administration have to be found. Grid services offer data storage and can support data retrieval and the presentation of the data. The basic security services could be provided by the German health professional infrastructure, but there are many security challenges to be faced.

    Inhibition of DHCR24 activates LXRα to ameliorate hepatic steatosis and inflammation

    Liver X receptor (LXR) agonism has theoretical potential for treating NAFLD/NASH, but synthetic agonists induce hyperlipidemia in preclinical models. Desmosterol, which is converted by Δ24-dehydrocholesterol reductase (DHCR24) into cholesterol, is a potent endogenous LXR agonist with anti-inflammatory properties. We aimed to investigate the effects of DHCR24 inhibition on NAFLD/NASH development. Here, using APOE*3-Leiden.CETP mice, a well-established translational model that develops diet-induced, human-like NAFLD/NASH characteristics, we report that SH42, a published DHCR24 inhibitor, markedly increases desmosterol levels in liver and plasma, reduces hepatic lipid content and the steatosis score, and decreases plasma fatty acid and cholesteryl ester concentrations. Flow cytometry showed that SH42 decreases liver inflammation by preventing Kupffer cell activation and monocyte infiltration. LXRα deficiency completely abolishes these beneficial effects of SH42. Together, these findings show that inhibition of DHCR24 by SH42 prevents diet-induced hepatic steatosis and inflammation in a strictly LXRα-dependent manner without causing hyperlipidemia. Finally, we also showed that SH42 treatment decreased liver collagen content and plasma alanine transaminase levels in an established NAFLD model. In conclusion, we anticipate that pharmacological DHCR24 inhibition may represent a novel therapeutic strategy for the treatment of NAFLD/NASH.

    A finite element method model to simulate laser interstitial thermo therapy in anatomical inhomogeneous regions

    BACKGROUND: Laser Interstitial ThermoTherapy (LITT) is a well-established surgical method. Its use is so far limited to homogeneous tissues, e.g. the liver, one reason being the limited capability of existing treatment-planning models to calculate the damage zone accurately. Treatment planning in inhomogeneous tissues, especially in regions near major vessels, still poses a challenge. In order to extend the application of LITT to a wider range of anatomical regions, new simulation methods are needed. The model described in this article enables efficient simulation for predicting damaged tissue as a basis for a future laser-surgical planning system. Previously we described the dependency of the model on geometry. In the present paper, which includes two video files, we focus on the methodological, physical, and mathematical background of the model.
    METHODS: In contrast to previous simulation attempts, our model is based on the finite element method (FEM). We propose the use of LITT in sensitive areas, such as the neck region, to treat tumours in lymph nodes of 0.5–2 cm in diameter near the carotid artery. Our model calculates the light distribution using the diffusion approximation of transport theory; the temperature rise using the bioheat equation, including the effect of microperfusion in tissue, to determine the extent of thermal damage; and the dependency of the thermal and optical properties on temperature and injury. Injury is estimated using a damage integral. To check our model we performed a first in vitro experiment on porcine muscle tissue.
    RESULTS: We derived the geometry from 3D ultrasound data and show, for this geometry, the energy distribution, the temperature elevation, and the damage zone. Furthermore, we compare the simulation with the in vitro experiment. The calculation shows an error of 5% along the x-axis parallel to the blood vessel.
    CONCLUSIONS: The proposed FEM technique can overcome the limitations of other methods and enables efficient simulation for predicting the damage zone induced by LITT. Our calculations show clearly that major vessels would not be damaged. The area/volume of the damaged zone calculated from the simulation agrees well with the in vitro experiment, and the deviation is small. One of the main reasons for the deviation is the lack of accurate values for the tissue optical properties; this needs to be validated in further experiments.
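    The damage integral mentioned in the METHODS section is commonly the Arrhenius formulation, Ω(t) = ∫ A·exp(−Ea/(R·T(τ))) dτ, with Ω ≥ 1 taken as the threshold for irreversible thermal damage. A minimal sketch, using standard literature rate parameters for soft tissue (these values are an assumption for illustration, not necessarily those used in this work):

```python
import math

# Illustrative Arrhenius parameters for soft tissue (assumed values
# for demonstration, not taken from the paper).
A = 3.1e98    # frequency factor [1/s]
E_A = 6.28e5  # activation energy [J/mol]
R = 8.314     # universal gas constant [J/(mol*K)]

def damage_integral(temps_kelvin, dt):
    """Accumulate Omega = sum_t A * exp(-E_A / (R * T_t)) * dt.

    temps_kelvin: tissue temperature at each time step [K]
    dt: time step [s]
    Omega >= 1 is commonly interpreted as irreversible damage.
    """
    omega = 0.0
    for T in temps_kelvin:
        omega += A * math.exp(-E_A / (R * T)) * dt
    return omega

# Example: holding tissue at 60 degC (333.15 K) for 60 s
omega = damage_integral([333.15] * 60, dt=1.0)
print(omega > 1.0)  # True: irreversible damage predicted at this dose
```

    In a full treatment-planning model the temperature trace would come from solving the bioheat equation on the FEM mesh; the accumulated Ω per element then delineates the predicted damage zone.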

    Ein Datenschutz- und Datensicherheitskonzept für medizinische Anwendungen in einer Grid-Computing-Umgebung

    This thesis addresses the authorization problem in HealthGrid environments, which is still an obstacle to the use of grid computing for medical applications. Authorization systems for grid environments use traditional access control approaches, in which each user is granted static rights. These approaches were originally developed for closed systems, i.e. systems in which the computing system enforces the security policies itself. Grid computing is an open system, i.e. several domains and multiple agents take part in the authorization process, so the use of classical authorization models in a grid computing environment is not appropriate. In this thesis, access control in grid computing is treated from a new perspective: the authorization process is framed in terms of game theory, and a new access control model is developed. This model, the Grid Usage Control Model (G-UCON), is a solution concept for an adequate access control model for grid computing environments. G-UCON applies the principles of game theory, multi-agent systems, and open systems to construct the new access control model; these are the minimal requirements for an access and usage control model for grid computing. Alternating-time temporal logic was used to develop the G-UCON specifications, because it is well suited to capturing game-theoretic specifications of open systems. Several examples at the end of the thesis show how G-UCON can capture complex states that may arise in HealthGrid environments. The special usage control requirements of medical applications, such as the risk of re-identification and the guarantee of k-anonymity, can be modeled with G-UCON.
At present no validation tool is available that can model the specific grid delegation mechanism, so a formal validation of the G-UCON specifications is left as future work. Nevertheless, it is common for new authorization models to be supported by several examples until a formal validation or an implementation appears, which usually happens some years later.
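    The k-anonymity property mentioned in the abstract has a compact operational definition: a table is k-anonymous with respect to a set of quasi-identifiers if every combination of quasi-identifier values occurs in at least k records. A minimal check, with a hypothetical patient table (field names and values are illustrative only):

```python
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """Return True iff every combination of quasi-identifier values
    appears in at least k records (the standard k-anonymity property)."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(count >= k for count in groups.values())

# Hypothetical patient table; zip code and birth year act as quasi-identifiers.
records = [
    {"zip": "10115", "birth_year": 1980, "diagnosis": "A"},
    {"zip": "10115", "birth_year": 1980, "diagnosis": "B"},
    {"zip": "10117", "birth_year": 1975, "diagnosis": "C"},
]
print(is_k_anonymous(records, ["zip", "birth_year"], k=2))
# False: the ("10117", 1975) group contains only one record,
# so that patient is at risk of re-identification.
```

    A usage control model can then condition access decisions on such a check, e.g. releasing a query result only when the projected table satisfies k-anonymity for an agreed k.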

    Scientific Workflow Optimization for Improved Peptide and Protein Identification

    Background: Peptide-spectrum matching is a common step in most data-processing workflows for mass-spectrometry-based proteomics. Many algorithms and software packages, both free and commercial, have been developed to address this task. However, these algorithms typically require the user to select instrument- and sample-dependent parameters, such as mass measurement error tolerances and the number of missed enzymatic cleavages. In order to select the best algorithm and parameter set for a particular dataset, in-depth knowledge about the data as well as the algorithms themselves is needed. Most researchers therefore tend to use default parameters, which are not necessarily optimal.
    Results: We have applied a new optimization framework for the Taverna scientific workflow management system (http://ms-utils.org/Taverna_Optimization.pdf) to find the best combination of parameters for a given scientific workflow to perform peptide-spectrum matching. The optimizations themselves are non-trivial, as demonstrated by several phenomena that can be observed when allowing for larger mass measurement errors in sequence database searches. On-the-fly parameter optimization embedded in scientific workflow management systems enables experts and non-experts alike to extract the maximum amount of information from the data. The same workflows could be used for exploring the parameter space and comparing algorithms, not only for peptide-spectrum matching but also for other tasks, such as retention time prediction.
    Conclusion: Using the optimization framework, we were able to learn about how the data was acquired as well as about the explored algorithms. We observed a phenomenon in which many ammonia-loss b-ion spectra were identified as peptides with N-terminal pyroglutamate and a large precursor mass measurement error. These insights could only be gained by extending the common range of the mass measurement error tolerance parameters explored by the optimization framework.
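    The kind of parameter-space exploration described above can be sketched as an exhaustive sweep. The scoring function below is a stub standing in for an actual peptide-spectrum matching run, and the function and parameter names are hypothetical illustrations, not part of the published framework or the Taverna API:

```python
from itertools import product

def run_search(precursor_tol_ppm, missed_cleavages):
    """Stub standing in for a real peptide-spectrum matching run.

    Returns a mock count of confidently identified peptides: wider
    tolerances admit more candidate matches up to a point, after which
    false positives erode the confident identifications.
    """
    return 1000 - abs(precursor_tol_ppm - 10) * 20 - missed_cleavages * 5

# Exhaustive sweep over a small, illustrative parameter grid.
grid = {
    "precursor_tol_ppm": [5, 10, 20, 50],
    "missed_cleavages": [0, 1, 2],
}
best = max(
    product(grid["precursor_tol_ppm"], grid["missed_cleavages"]),
    key=lambda p: run_search(*p),
)
print(best)  # parameter pair maximizing the (mock) identification count
```

    A real workflow-embedded optimizer would replace the stub with the actual search execution and typically use a smarter strategy than exhaustive enumeration once the parameter space grows.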

    A New Optimization Phase for Scientific Workflow Management Systems

    Scientific workflows have emerged as an important tool for combining computational power with data analysis in all scientific domains of e-science, especially in the life sciences. They help scientists to design and execute complex in silico experiments. However, with rising complexity it becomes increasingly impractical to optimize scientific workflows by trial and error. To address this issue, we propose to insert a new optimization phase into the common scientific workflow life cycle. This paper describes the design and implementation of an automated optimization framework for scientific workflows to implement this phase. Our framework was integrated into Taverna, a life-science-oriented workflow management system, and offers a versatile programming interface (API), which enables easy integration of arbitrary optimization methods. We have used this API to develop an example plugin for parameter optimization that is based on a genetic algorithm. Two use cases taken from the areas of structural bioinformatics and proteomics demonstrate how our framework facilitates the setup, execution, and monitoring of workflow parameter optimization in high-performance-computing e-science environments.
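    As a rough illustration of the genetic-algorithm approach to parameter optimization, the following generic sketch evolves a population of parameter pairs against a stub objective. It is not the actual Taverna plugin code; all names, operators, and settings are illustrative assumptions:

```python
import random

random.seed(42)  # deterministic run for the example

def fitness(params):
    """Stub objective: a real framework would execute the workflow with
    `params` and score the result (e.g. number of identifications).
    This surrogate has its maximum at (3, -1)."""
    x, y = params
    return -((x - 3.0) ** 2 + (y + 1.0) ** 2)

def mutate(params, sigma=0.5):
    # Add Gaussian noise to each parameter.
    return tuple(p + random.gauss(0, sigma) for p in params)

def crossover(a, b):
    # Uniform crossover: pick each coordinate from either parent.
    return tuple(random.choice(pair) for pair in zip(a, b))

def genetic_algorithm(pop_size=20, generations=50):
    population = [(random.uniform(-10, 10), random.uniform(-10, 10))
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]   # truncation selection (elitist)
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=fitness)

best = genetic_algorithm()
print(best)  # converges near the optimum (3, -1)
```

    Plugging such an optimizer into a workflow system mainly means swapping the stub `fitness` for a workflow execution and letting the scheduler evaluate individuals in parallel.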