
    Current and Future Challenges in Knowledge Representation and Reasoning

    Knowledge Representation and Reasoning is a central, longstanding, and active area of Artificial Intelligence. Over the years it has evolved significantly; more recently, it has been challenged and complemented by research in areas such as machine learning and reasoning under uncertainty. In July 2022, a Dagstuhl Perspectives workshop was held on Knowledge Representation and Reasoning. The goal of the workshop was to describe the state of the art in the field, including its relation with other areas, its shortcomings and strengths, together with recommendations for future progress. We developed this manifesto based on the presentations, panels, working groups, and discussions that took place at the Dagstuhl workshop. It is a declaration of our views on Knowledge Representation: its origins, goals, milestones, and current foci; its relation to other disciplines, especially to Artificial Intelligence; and its challenges, along with key priorities for the next decade.

    Notation3 as an Existential Rule Language

    Notation3 Logic (N3) is an extension of RDF that allows the user to write rules introducing new blank nodes to RDF graphs. Many applications (e.g., ontology mapping) rely on this feature, as blank nodes, used directly or in auxiliary constructs, are omnipresent on the Web. However, the number of fast N3 reasoners covering this very important feature of the logic is rather limited. On the other hand, there are engines like VLog or Nemo which do not directly support Semantic Web rule formats but are developed and optimized for very similar constructs: existential rules. In this paper, we investigate the relation between N3 rules with blank nodes in their heads and existential rules. We identify a subset of N3 which can be mapped directly to existential rules and define such a mapping that preserves the equivalence of N3 formulae. To illustrate that N3 reasoning can in some cases benefit from our translation, we employ this mapping in an implementation to compare the performance of the N3 reasoners EYE and cwm with that of VLog and Nemo on N3 rules and their mapped counterparts. Our tests show that the existential rule reasoners perform particularly well for use cases containing many facts, while the EYE reasoner is especially fast when dealing with a high number of dependent rules. We thus provide a tool enabling the Semantic Web community to directly use existing and future existential rule reasoners and benefit from the findings of this active community.
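    To make the correspondence concrete, consider the following hedged illustration with a hypothetical vocabulary (not an example taken from the paper). The N3 rule

        { ?x a :Person. } => { ?x :hasParent _:p. }

    uses the blank node _:p in its head to state that every person has some, possibly unnamed, parent. Read as an existential rule in first-order notation, this becomes

        \forall x\,\big(\mathrm{Person}(x) \rightarrow \exists p\;\mathrm{hasParent}(x, p)\big)

    which is the rule format engines such as VLog and Nemo are optimized for; the paper's mapping makes this translation systematic for a suitable subset of N3.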

    Asymptotic initial value representation of the solutions of semi-classical systems presenting smooth codimension one crossings

    This paper is devoted to the construction of approximations of the propagator associated with a semi-classical matrix-valued Schrödinger operator whose symbol presents smooth eigenvalue crossings. Inspired by the approach of the theoretical chemists Herman and Kluk, who propagated continuous superpositions of Gaussian wave-packets for scalar equations, we consider frozen and thawed Gaussian initial value representations that incorporate classical transport and branching processes along a hopping hypersurface. Based on the Gaussian wave-packet framework, our result relies on an accurate analysis of the solutions of the associated Schrödinger equation for data that are vector-valued wave-packets. We prove that these solutions are asymptotic to wave-packets at any order in terms of the semi-classical parameter.
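    For context, the frozen Gaussian (Herman-Kluk) initial value representation in the scalar case has the schematic form below; this is the standard expression from the literature, given as background rather than the matrix-valued construction of the paper. In LaTeX notation:

        e^{-\frac{i}{\hbar} t \hat{H}}\,\varphi \;\approx\; \frac{1}{(2\pi\hbar)^{d}} \int_{\mathbb{R}^{2d}} R(t,q,p)\, e^{\frac{i}{\hbar} S(t,q,p)}\, g_{\Phi^{t}(q,p)}\, \langle g_{(q,p)}, \varphi \rangle \, dq \, dp

    where g_{(q,p)} is a Gaussian wave-packet centred at the phase-space point (q,p), \Phi^{t} is the classical flow, S(t,q,p) is the classical action along the trajectory, and R(t,q,p) is the Herman-Kluk prefactor. The paper extends representations of this kind to systems with matrix-valued symbols, adding a branching (hopping) process at the crossing hypersurface.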

    Method versatility in analysing human attitudes towards technology

    Various research domains are facing new challenges brought about by growing volumes of data. To make optimal use of them, and to increase the reproducibility of research findings, method versatility is required. Method versatility is the ability to flexibly apply widely varying data analytic methods depending on the study goal and the dataset characteristics. Method versatility is an essential characteristic of data science, but in other areas of research, such as educational science or psychology, its importance is yet to be fully accepted. Versatile methods can enrich the repertoire of specialists who validate psychometric instruments, conduct data analysis of large-scale educational surveys, and communicate their findings to the academic community, which corresponds to three stages of the research cycle: measurement, research per se, and communication. In this thesis, the studies related to these stages share the common theme of human attitudes towards technology, a topic of vital importance in our age of ever-increasing digitization. The thesis is based on four studies, in which method versatility is introduced in four different ways: the consecutive use of methods, the toolbox choice, the simultaneous use, and the range extension. In the first study, different methods of psychometric analysis are used consecutively to reassess psychometric properties of a recently developed scale measuring affinity for technology interaction. In the second, the random forest algorithm and hierarchical linear modeling, as tools from the machine learning and statistical toolboxes, are applied to data analysis of a large-scale educational survey related to students' attitudes to information and communication technology. In the third, the challenge of selecting the number of clusters in model-based clustering is addressed by the simultaneous use of model fit, cluster separation, and partition stability criteria, so that generalizable, separable clusters can be selected in the data related to teachers' attitudes towards technology. The fourth reports the development and evaluation of a scholarly knowledge graph-powered dashboard aimed at extending the range of scholarly communication means. The findings of the thesis can be helpful for increasing method versatility in various research areas. They can also facilitate methodological advancement of academic training in data analysis and aid further development of scholarly communication in accordance with open science principles.
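    The combination of criteria used in the third study can be sketched generically; the following is a minimal illustration using scikit-learn Gaussian mixtures on hypothetical data, not the thesis's actual pipeline or criteria.

        import numpy as np
        from sklearn.mixture import GaussianMixture
        from sklearn.metrics import silhouette_score, adjusted_rand_score

        def score_cluster_numbers(X, k_range=range(2, 9), n_boot=20, seed=0):
            """Score each candidate k by model fit (BIC), cluster separation
            (silhouette), and partition stability (mean ARI over bootstrap re-fits)."""
            rng = np.random.default_rng(seed)
            results = {}
            for k in k_range:
                gmm = GaussianMixture(n_components=k, random_state=seed).fit(X)
                labels = gmm.predict(X)
                bic = gmm.bic(X)                   # model fit: lower is better
                sep = silhouette_score(X, labels)  # separation: higher is better
                aris = []
                for _ in range(n_boot):
                    idx = rng.integers(0, len(X), len(X))   # bootstrap resample
                    refit = GaussianMixture(n_components=k, random_state=seed).fit(X[idx])
                    aris.append(adjusted_rand_score(labels, refit.predict(X)))
                results[k] = {"bic": bic, "silhouette": sep, "stability": float(np.mean(aris))}
            return results

    A number of clusters that does well on all three criteria simultaneously is preferred over one that optimizes any single criterion, which is what makes the selected clusters more likely to generalize.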

    A feasibility study on integrating electric buses with waste gasification for a green public transport system and solid waste management

    Waste management and public transport are two major issues requiring decarbonisation in the face of climate change and environmental concerns related to global warming. Green transport systems are classified as zero- or low-carbon alternatives to fossil fuel-based approaches and vehicles, and rely on zero-emission fuels such as hydrogen. Thermochemical processes (e.g., gasification) and biochemical technologies (e.g., fermentation) can convert carbon-based feedstock such as waste into desirable products like hydrogen. Waste-to-Hydrogen (WtH) is therefore proposed as a hybrid solution that simultaneously provides sustainable waste management and non-fossil-fuel-based hydrogen production. In this study, the concept of distributed WtH systems based on gasification and fermentation, supporting hydrogen fuel cell buses in Glasgow, is considered as a potential route to zero-emission transport development. Hydrogen has the potential to replace petrol and diesel and consequently to become part of the zero-carbon measures that aid the transition to cleaner energy sources. When hydrogen is produced from renewable or sustainable energy sources, it can help decarbonise the energy and transport sectors. To be attractive to policymakers and investors, hydrogen from a WtH system must demonstrate a carbon footprint lower than that of conventional production methods. By supporting the effort to reach carbon emission reduction targets, hydrogen is part of the solution to limit climate change, a global emergency. This research supports the roadmap for hydrogen-powered public transport and helps shape the direction of future technological improvement and policy formulation. As well as the potential to provide a clean, versatile fuel, WtH can offer an alternative waste management practice that diverts waste away from landfill and incineration. By transforming waste into a useful energy resource, a value is attached to it, which can encourage the development of sustainable disposal methods such as WtH conversion processes. Glasgow was chosen as the location for the study due to its large population, which would supply regular amounts of waste to be used as feedstock. The city council is also actively trying to decarbonise local industries, including transport, as seen in the strategies and targets in place such as Net Zero by 2045. An aim of this study is to demonstrate how low-carbon hydrogen production technologies could fit into the city's transport and energy plan and support the hydrogen strategy, thereby benefitting the people of Glasgow. Whilst Glasgow does not currently use fuel cell electric buses (FCEBs) for public transport, an intention to run a fleet has been signalled through the publication of the Scottish Government's Hydrogen Policy Statement (2020) and Hydrogen Action Plan (2022). FCEB fleets in other parts of the UK, notably London and Birmingham, have shown the environmental benefit through their annual carbon savings. FCEBs are classified as zero-emission buses (ZEBs), which the UK Department for Transport has stated can reduce carbon emissions by 46 tonnes per year and nitrogen oxide (NOx) emissions by 23 kg when compared to a diesel bus (UK Government Department for Transport, 2021). This study contributes to the growing evidence of the benefits of using hydrogen as a transport fuel, in terms of the carbon savings, as an alternative to conventional fossil fuels.
The main concerns, namely the underdeveloped industrial status of WtH, its relatively immature technology, and its high costs, are also explored. In practice, WtH is currently limited to laboratory- and pilot-scale systems and requires further investment and policy support for advancements to be made. These bottlenecks and limitations are considered in the discussion section of this study. The research question centres on the economic and environmental feasibility of WtH within Glasgow. A feasible project would show carbon savings compared to conventional methods in both waste management and hydrogen production. Feasibility is also a measure of positive returns on economic investment, where total project costs do not outweigh the environmental benefits associated with low-carbon technologies. This study critically assesses the current situation for WtH development in terms of environmental impact and potential carbon savings, economic implications and cost benefits, plus transport and climate policy. The novelty of the study lies in establishing a procedure for defining how WtH could support the growing hydrogen industry as a low-carbon hydrogen production technique. The results from the environmental impact analysis and economic assessment add data sets to existing research in academia and the energy industry. Life cycle assessment (LCA), cost-benefit analysis (CBA), and multi-objective optimisation (MOO) were conducted to determine the feasibility of WtH projects to support green transport systems and sustainable waste management schemes. A variety of WtH scenarios were designed based on biomass waste feedstock, hydrogen production reactors, and upstream and downstream system components. The WtH systems selected use thermochemical and biochemical technologies to convert the different waste feedstocks available in Glasgow under operational conditions suited to the waste characteristics. The waste considered in this study is biodegradable, carbon-based, and organic, including household waste, plastics, and waste wood products, as well as the wet fraction of waste such as food and sewage sludge. Five scenarios, comprising four WtH technologies and one conventional hydrogen production technology, steam methane reforming (SMR), were designed to allow comparison of environmental and economic results. The scenarios differ in waste feedstock type and technology, leading to differences in hydrogen production rates, hydrogen yields, and process carbon emissions. Waste that is less suitable for thermochemical conversion can be utilised by biochemical technology to ensure the most efficient and least energy-intensive method is applied. The environmental approach for this work uses the LCA method to evaluate environmental performance through carbon saving potential, with global warming potential (GWP) as the impact indicator for the WtH technologies. It was shown that WtH technologies could reduce CO2-eq emissions per kg H2 by up to 55% compared to SMR. Gasification treating municipal solid waste and waste wood had global warming potentials of 4.99 and 4.11 kg CO2-eq/kg H2 respectively, which were lower than dark fermentation treating wet waste at 6.6 kg CO2-eq/kg H2 and combined dark and photo fermentation at 6.4 kg CO2-eq/kg H2. The per-distance emissions of WtH-based fuel cell electric bus scenarios were 0.33-0.44 kg CO2-eq/km, compared to 0.89 kg CO2-eq/km for the SMR-based scenario.
The economic assessment in this study uses cost-benefit analysis to determine whether the carbon savings outweigh the expected cost of a WtH system. The CBA was conducted to compare the economic feasibility of the different WtH systems with conventional SMR. A database was collated that includes direct cost data on construction, maintenance, operations, infrastructure, and storage, along with indirect cost data comprising environmental impacts and externalities, the cost of pollution, carbon taxes, and subsidies. The results are expressed through the economic indicators net present value (NPV), internal rate of return (IRR), benefit-cost ratio (BCR), and levelised cost of hydrogen (LCoH). The LCoH was calculated as 0.49 GBP/kg for the gasification systems using MSW feedstock and 0.52 GBP/kg for waste wood gasification. The LCoH for dark fermentation was calculated to be 0.52 GBP/kg, and 0.59 GBP/kg for combined dark and photo fermentation systems. Sensitivity analysis was conducted to identify the most influential factors in distributed WtH systems. The results indicate that conversion efficiency and the energy density of the waste had the largest impact for biochemical and thermochemical technologies, respectively. It is concluded that WtH could be economically feasible for hydrogen production in Glasgow. However, limitations including high capital expenditure will require cost reduction through technical advancements, and a carbon tax on conventional hydrogen production methods, to improve the outlook for WtH. The multi-objective optimisation results suggest that optimisation is possible, with a best solution calculated to minimise both total cost and GWP for the four scenarios assessed in this work. The results from the three analysis types in this work indicate the feasibility of WtH in Glasgow. They suggest there is potential to utilise the waste generated within Glasgow to produce hydrogen, reduce the environmental impact of waste management practices, and provide economic benefit to both the energy and transport industries.
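    For reference, the standard textbook definitions behind two of these indicators are given below; the thesis may use variants (e.g., with residual value or taxation terms), so this is a hedged reminder rather than the study's exact formulation. In LaTeX notation:

        \mathrm{NPV} = \sum_{t=0}^{T} \frac{B_t - C_t}{(1+r)^{t}}, \qquad \mathrm{LCoH} = \frac{\sum_{t=0}^{T} C_t\,(1+r)^{-t}}{\sum_{t=0}^{T} H_t\,(1+r)^{-t}}

    where B_t and C_t are the benefits and costs incurred in year t, H_t is the hydrogen output in year t (kg), r is the discount rate, and T is the project lifetime. A project passes the NPV criterion when NPV > 0; BCR is the analogous ratio of discounted benefits to discounted costs, and IRR is the value of r at which NPV = 0.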

    Automatic Generation of Personalized Recommendations in eCoaching

    This thesis addresses eCoaching for personalised, real-time lifestyle support using information and communication technology. The challenge is to design, develop, and technically evaluate a prototype of an intelligent eCoach that automatically generates personalised, evidence-based recommendations for a better lifestyle. The developed solution focuses on the improvement of physical activity. The prototype uses wearable medical activity sensors. The collected data are represented semantically, and artificial-intelligence algorithms automatically generate meaningful, personalised, and context-based recommendations for reducing sedentary time. The thesis uses the well-established design science research methodology to develop theoretical foundations and practical implementations. Overall, this research focuses on technological verification rather than clinical evaluation.

    Workshop Proceedings of the 12th edition of the KONVENS conference

    The 2014 issue of KONVENS is even more a forum for exchange: its main topic is the interaction between Computational Linguistics and Information Science, and the synergies that such interaction, cooperation, and integrated views can produce. This topic, at the crossroads of different research traditions which deal with natural language as a container of knowledge and with methods to extract and manage knowledge that is linguistically represented, is close to the heart of many researchers at the Institut für Informationswissenschaft und Sprachtechnologie of Universität Hildesheim: it has long been one of the institute's research topics, and it has received even more attention over the last few years.

    Measuring the impact of COVID-19 on hospital care pathways

    Care pathways in hospitals around the world reported significant disruption during the recent COVID-19 pandemic, but measuring the actual impact is more problematic. Process mining can be useful for hospital management to measure the conformance of real-life care to what might be considered normal operations. In this study, we aim to demonstrate that process mining can be used to investigate process changes associated with complex disruptive events. We studied perturbations to accident and emergency (A&E) and maternity pathways in a UK public hospital during the COVID-19 pandemic. Coincidentally, the hospital had implemented a Command Centre approach for patient-flow management, affording an opportunity to study both the planned improvement and the disruption due to the pandemic. Our study proposes and demonstrates a method for measuring and investigating the impact of such planned and unplanned disruptions on hospital care pathways. We found that during the pandemic, both A&E and maternity pathways had measurable reductions in the mean length of stay and a measurable drop in the percentage of pathways conforming to normative models. There were no distinctive patterns in the monthly mean values of length of stay or conformance throughout the phases of the installation of the hospital's new Command Centre approach. Due to a deficit in the available A&E data, the findings for A&E pathways could not be interpreted.
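    As an illustration of the kind of conformance measurement involved, the sketch below uses the simplified interface of the open-source pm4py library; the file names and the choice of discovery and replay algorithms are assumptions for illustration, not the study's actual models or data.

        import pm4py

        # Hypothetical event log of baseline (pre-pandemic) pathway executions.
        baseline_log = pm4py.read_xes("maternity_2019.xes")

        # Discover a normative Petri net model from the baseline period.
        net, im, fm = pm4py.discover_petri_net_inductive(baseline_log)

        # Replay the pandemic-period log against the normative model and
        # report overall token-replay fitness (1.0 = full conformance).
        covid_log = pm4py.read_xes("maternity_2020.xes")
        fitness = pm4py.fitness_token_based_replay(covid_log, net, im, fm)
        print(fitness["log_fitness"])

    Tracking such a fitness value month by month is one way to expose a measurable drop in conformance during a disruption.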

    Biophysical Insights into Peptide and Alcohol Perturbations on Biomimetic Membranes

    Biological membranes exist in every domain of life. Life exists due to the presence of these special structures, which we take for granted. They are composed of fatty lipids and workhorse proteins and act as the premier interface of biological processes. Due to the sheer quantity and complexity within their thin boundary, studying their actions and properties poses challenges to researchers. As a result, simplified biomembrane mimics are employed regularly. We use several types of biomembrane mimics to understand fundamental properties of membranes. In the present thesis, we also attempt to move beyond the canonical structure-based theories upon which a majority of biophysical studies are predicated. Structural quantities still greatly inform on the state of the bilayer system, yet the exact distribution and movement of lipids are less studied. We focus on the movement and organization of phospholipids using a range of biophysical techniques, such as small-angle neutron scattering, molecular dynamics simulations, and more. The results are interpreted to show how phospholipid mobility fits into the greater membrane framework.

    Metal Cations in Protein Force Fields: From Data Set Creation and Benchmarks to Polarizable Force Field Implementation and Adjustment

    Metal cations are essential to life. About one-third of all proteins require metal cofactors to fold accurately or to function. Computer simulations using empirical parameters and classical molecular mechanics models (force fields) are the standard tool for investigating proteins' structural dynamics and functions in silico. Despite many successes, the accuracy of force fields is limited when cations are involved. The focus of this thesis is the development of tools and strategies to create system-specific force field parameters that accurately describe cation-protein interactions. The accuracy of a force field mainly relies on (i) the parameters, derived from increasingly large quantum chemistry or experimental data sets, and (ii) the physics behind the energy formula. The first part of this thesis presents a large and comprehensive quantum chemistry data set, computed on a consistent footing, that can be used for force field parameterization and benchmarking. The data set covers dipeptides of the 20 proteinogenic amino acids with different possible side chain protonation states, 3 divalent cations (Ca2+, Mg2+, and Ba2+), and a wide relative energy range. Crucial properties related to force field development, such as partial charges and interaction energies, are also provided. To make the data available, the data set was uploaded to the NOMAD repository and its data structure was formalized in an ontology. Besides a proper data basis for parameterization, the physics covered by the terms of the additive force field formulation impacts its applicability. The second part of this thesis benchmarks three popular non-polarizable force fields and the polarizable Drude model against a quantum chemistry data set. After some adjustments, the Drude model was found to reproduce the reference interaction energies substantially better than the non-polarizable force fields, which shows the importance of explicitly addressing polarization effects. Tweaking of the Drude model involved Boltzmann-weighted fitting to optimize Thole factors and Lennard-Jones parameters. The obtained parameters were validated by (i) their ability to reproduce reference interaction energies and (ii) molecular dynamics simulations of the N-lobe of calmodulin. This work facilitates the improvement of polarizable force fields for cation-protein interactions through quantum chemistry-driven parameterization combined with molecular dynamics simulations in the condensed phase. While the Drude model shows its potential for simulating cation-protein interactions, it lacks a description of charge transfer effects, which are significant between cation and protein. The CTPOL model extends the classical force field formulation by charge transfer (CT) and polarization (POL). Since the CTPOL model is not readily available in any of the popular molecular dynamics packages, it was implemented in OpenMM. Furthermore, an open-source parameterization tool, called FFAFFURR, was implemented that enables the (system-specific) parameterization of OPLS-AA and CTPOL models. Following the method established in the previous part, the performance of FFAFFURR was evaluated by its ability to reproduce quantum chemistry energies and in molecular dynamics simulations of the zinc finger protein.

    In conclusion, this thesis takes steps towards the development of next-generation force fields that accurately describe cation-protein interactions by providing (i) reference data, (ii) a force field model that includes charge transfer and polarization, and (iii) a freely available parameterization tool.
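    The Boltzmann-weighted fitting mentioned above can be sketched generically; the following is a minimal illustration with synthetic reference data and a single Lennard-Jones site, not the FFAFFURR implementation, and the constants and array names are assumptions.

        import numpy as np
        from scipy.optimize import least_squares

        KT = 0.593  # kT in kcal/mol at roughly 298 K

        def lj_energy(params, r):
            """12-6 Lennard-Jones interaction energy at distances r (Angstrom)."""
            epsilon, sigma = params
            sr6 = (sigma / r) ** 6
            return 4.0 * epsilon * (sr6 ** 2 - sr6)

        def weighted_residuals(params, r, e_qm):
            """Residuals with Boltzmann weights so that low-energy (more
            thermally populated) reference points dominate the fit."""
            w = np.exp(-(e_qm - e_qm.min()) / KT)
            return np.sqrt(w) * (lj_energy(params, r) - e_qm)

        # Synthetic stand-in for quantum chemistry reference energies.
        r = np.linspace(2.5, 6.0, 30)
        e_qm = lj_energy((0.25, 3.0), r) + 0.02 * np.random.default_rng(0).normal(size=r.size)

        fit = least_squares(weighted_residuals, x0=(0.1, 3.2), args=(r, e_qm))
        print(fit.x)  # fitted (epsilon, sigma)

    The same weighting idea carries over to richer functional forms, for example when Thole screening factors of a Drude model are optimized alongside the Lennard-Jones parameters; only the parameter vector grows.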