
    Exact Hybrid Particle/Population Simulation of Rule-Based Models of Biochemical Systems

    Detailed modeling and simulation of biochemical systems is complicated by the problem of combinatorial complexity, an explosion in the number of species and reactions due to myriad protein-protein interactions and post-translational modifications. Rule-based modeling overcomes this problem by representing molecules as structured objects and encoding their interactions as pattern-based rules. This greatly simplifies the process of model specification, avoiding the tedious and error-prone task of manually enumerating all species and reactions that can potentially exist in a system. From a simulation perspective, rule-based models can be expanded algorithmically into fully enumerated reaction networks and simulated using a variety of network-based simulation methods, such as ordinary differential equations or Gillespie's algorithm, provided that the network is not exceedingly large. Alternatively, rule-based models can be simulated directly using particle-based kinetic Monte Carlo methods. This "network-free" approach produces exact stochastic trajectories with a computational cost that is independent of network size. However, memory and run-time costs increase with the number of particles, limiting the size of system that can be feasibly simulated. Here, we present a hybrid particle/population simulation method that combines the best attributes of both the network-based and network-free approaches. The method takes as input a rule-based model and a user-specified subset of species to treat as population variables rather than as particles. The model is then transformed by a process of "partial network expansion" into a dynamically equivalent form that can be simulated using a population-adapted network-free simulator. The transformation method has been implemented within the open-source rule-based modeling platform BioNetGen, and the resulting hybrid models can be simulated using the particle-based simulator NFsim. Performance tests show that significant memory savings can be achieved using the new approach, and a monetary cost analysis provides a practical measure of its utility. © 2014 Hogg et al.
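
    As a concrete illustration of the network-based simulation route that the abstract contrasts with the network-free approach, the sketch below implements Gillespie's direct-method stochastic simulation algorithm for a tiny, hand-enumerated reaction network (a reversible binding reaction A + B <-> AB). The species names, rate constants, and initial copy numbers are hypothetical placeholders, not taken from the paper; a rule-based model expanded by BioNetGen would typically yield far more species and reactions.

```python
import numpy as np

# Hypothetical two-reaction network: A + B -> AB (k1), AB -> A + B (k2).
# Species order in the state vector: [A, B, AB].
stoich = np.array([[-1, -1, +1],   # binding
                   [+1, +1, -1]])  # unbinding
k = np.array([0.001, 0.1])         # rate constants (arbitrary units)

def propensities(x):
    """Reaction propensities for the current copy-number state x."""
    return np.array([k[0] * x[0] * x[1],  # A + B -> AB
                     k[1] * x[2]])        # AB -> A + B

def gillespie(x0, t_end, rng=np.random.default_rng(0)):
    """Direct-method SSA: returns sampled times and states."""
    t, x = 0.0, np.array(x0, dtype=float)
    times, states = [t], [x.copy()]
    while t < t_end:
        a = propensities(x)
        a0 = a.sum()
        if a0 == 0:                         # no reaction can fire
            break
        t += rng.exponential(1.0 / a0)      # waiting time to next event
        r = rng.choice(len(a), p=a / a0)    # which reaction fires
        x += stoich[r]
        times.append(t)
        states.append(x.copy())
    return np.array(times), np.array(states)

times, states = gillespie([1000, 800, 0], t_end=50.0)
print(states[-1])  # final copy numbers of A, B, AB
```

    The cost of this network-based SSA grows with the size of the enumerated network, which is the limitation the paper's network-free and hybrid particle/population approaches are designed to avoid.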

    Efficiency of a mathematical model in generating CAD/CAM-partial crowns with natural tooth morphology

    The "biogeneric tooth model" can be used for computer-aided design (CAD) of the occlusal surface of dental restorations. From digital 3D-data, it automatically retrieves a morphology matching the natural surface left after preparation. This study evaluates the potential of this method for generating well-matched and well-adjusted CAD/computer-aided manufacturing (CAM) fabricated partial crowns. Twelve models with partial crown preparations were mounted into an articulator. Partial crowns were designed with the Cerec 3D CAD software based on the biogeneric tooth model (Biog.CAD) and, for control, with a conventional data-based Cerec 3D CAD software (Conv.CAD). The design time was measured, and the naturalness of the morphology was visually assessed. The restorations were milled, cemented on the models, and the vertical discrepancy and the time for final occlusal adjustment were measured. The Biog.CAD software offered a significantly higher naturalness (up to 225 to 11 scores) and was significantly faster by 251 (+/-78) s in designing partial crowns (p < 0.01) compared to Conv.CAD software. Vertical discrepancy, 0.52 (+/-0.28) mm for Conv.CAD and 0.46 (+/-0.19) mm for Biog.CAD, and occlusal adjustment time, 118 (+/-132) s for Conv.CAD and 102 (+/-77) s for Biog.CAD, did not differ significantly. In conclusion, the biogeneric tooth model is able to generate occlusal morphology of partial crowns in a fully automated process with higher naturalness compared to conventional interactive CAD software

    Exploratory Factor Analysis (EFA)

    This chapter covers the most important aspects of conducting an EFA. It begins with the general model underlying factor analysis (the fundamental theorem) and then presents the resulting decomposition of variance into portions explained and unexplained by common factors. The central concepts of EFA are then introduced, i.e., the eigenvalues of the factors as well as the communality and specificity of the items. The most important extraction methods, i.e., principal axis factoring (Principal Factor Analysis, PFA) and maximum-likelihood EFA (ML-EFA), as well as rotation criteria (orthogonal vs. oblique), are discussed before further aspects such as assessing model fit, alternative estimation procedures, and the computation of factor scores are addressed. The chapter provides an overview of exploratory factor analysis (EFA) with an emphasis on its use in test construction. In test construction, EFA can serve, for example, to assess the dimensionality of the items contained in a test, or it can relate the dimensionality of several subtests to one another. EFA can be used to answer the question of whether the individual items belonging to one facet of a test actually measure the same thing, i.e., whether aggregating the individual item scores into a single test score is justified. In EFA, one distinguishes between extraction methods and criteria on the one hand and rotation methods on the other. Extraction methods and criteria are used to decide how many dimensions (factors) are needed to represent the multivariate information contained in the items economically. The various rotation methods additionally allow a better assignment of items to factors. The chapter begins by introducing the foundations of EFA for determining the number of relevant factors, followed by the concept of rotation. Aspects of model evaluation and interpretation are then presented. Finally, current developments in EFA are discussed.
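
    A minimal Python sketch of the two steps the chapter distinguishes: an extraction criterion for choosing the number of factors and a rotation for interpreting them. The simulated item responses, factor structure, and parameter choices are illustrative assumptions, not part of the chapter; scikit-learn's FactorAnalysis (maximum-likelihood estimation, with a varimax rotation available from version 0.24) stands in here for the ML-EFA and orthogonal rotation discussed in the text.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Hypothetical item responses: 300 respondents x 6 items, constructed so
# that items 1-3 load on one latent factor and items 4-6 on another.
rng = np.random.default_rng(0)
f = rng.normal(size=(300, 2))                    # two latent factors
loadings_true = np.array([[0.8, 0.0], [0.7, 0.1], [0.9, 0.0],
                          [0.0, 0.8], [0.1, 0.7], [0.0, 0.9]])
X = f @ loadings_true.T + 0.5 * rng.normal(size=(300, 6))

# Extraction criterion (Kaiser): retain as many factors as there are
# eigenvalues of the item correlation matrix greater than 1.
eigvals = np.linalg.eigvalsh(np.corrcoef(X, rowvar=False))[::-1]
n_factors = int((eigvals > 1).sum())
print("eigenvalues:", np.round(eigvals, 2), "-> factors retained:", n_factors)

# Maximum-likelihood-style EFA with an orthogonal (varimax) rotation;
# requires scikit-learn >= 0.24 for the `rotation` argument.
efa = FactorAnalysis(n_components=n_factors, rotation="varimax").fit(X)
loadings = efa.components_.T                     # items x factors
communalities = (loadings ** 2).sum(axis=1)      # variance explained per item
print("rotated loadings:\n", np.round(loadings, 2))
print("communalities:", np.round(communalities, 2))
```

    An oblique rotation or principal axis factoring, as also discussed in the chapter, would require a dedicated package rather than scikit-learn; the eigenvalue-based extraction step above stays the same.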