
    Composite Finite Elements for Trabecular Bone Microstructures

    In many medical and technical applications, numerical simulations need to be performed for objects with interfaces of geometrically complex shape. We focus on the biomechanical problem of elasticity simulations for trabecular bone microstructures. The goal of this dissertation is to develop and implement an efficient simulation tool for finite element simulations on such structures, so-called composite finite elements. We deal with both the case of material/void interfaces (complicated domains) and the case of interfaces between different materials (discontinuous coefficients).

    In classical finite element simulations, geometric complexity is encoded in tetrahedral and typically unstructured meshes. Composite finite elements, in contrast, encode geometric complexity in specialized basis functions on a uniform mesh of hexahedral structure. Unlike alternative approaches (such as fictitious domain methods, generalized finite element methods, immersed interface methods, partition of unity methods, unfitted meshes, and extended finite element methods), composite finite elements are tailored to geometry descriptions given by 3D voxel image data and use the corresponding voxel grid as the computational mesh without introducing additional degrees of freedom, thus making use of efficient data structures for uniformly structured meshes. The composite finite element method for complicated domains goes back to Wolfgang Hackbusch and Stefan Sauter; it restricts standard affine finite element basis functions on the uniformly structured tetrahedral grid (obtained by subdividing each cube into six tetrahedra) to an approximation of the interior. This can be implemented as a composition of standard finite element basis functions on a local, purely virtual auxiliary grid by which we approximate the interface. In the case of discontinuous coefficients, the same local auxiliary composition approach is used. Composition weights are obtained by solving local interpolation problems, for which coupling conditions across the interface need to be determined. These depend both on the local interface geometry and on the (scalar or tensor-valued) material coefficients on both sides of the interface. We consider heat diffusion as a scalar model problem and linear elasticity as a vector-valued model problem to develop and implement the composite finite elements. Uniform cubic meshes contain a natural hierarchy of coarsened grids, which allows us to implement a multigrid solver for the case of complicated domains.

    Besides simulations of single loading cases, we also apply the composite finite element method to the problem of determining effective material properties, e.g. for multiscale simulations. For periodic microstructures, this is achieved by solving corrector problems on the fundamental cells using affine-periodic boundary conditions corresponding to uniaxial compression and shearing. For statistically periodic trabecular structures, representative fundamental cells can be identified but do not permit the periodic approach. Instead, macroscopic displacements are imposed using the same set of affine-periodic boundary conditions as before, now prescribed as Dirichlet conditions on all faces. The stress response of the material is subsequently computed only on an interior subdomain to prevent artificial stiffening near the boundary. We finally check the macroscopic elasticity tensor for orthotropy and identify its axes.
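
    To make the homogenization step concrete, the following Python sketch generates the six unit macroscopic strains (three uniaxial, three shear), evaluates the affine Dirichlet boundary displacements u(x) = E·x on the boundary nodes of a cubic voxel cell, and assembles a 6×6 effective elasticity tensor in Voigt notation column by column. This is an illustration of the general procedure described above, not the dissertation's code: the function names are invented, and the composite finite element solve with interior averaging is replaced by a stand-in isotropic response so that the script runs.

```python
"""Sketch: affine Dirichlet boundary data for the six macroscopic load cases
and column-wise assembly of an effective elasticity tensor (Voigt notation).
Names and the stand-in material response are illustrative, not from the thesis."""
import numpy as np


def unit_macroscopic_strains():
    """Six unit strains in Voigt order (xx, yy, zz, yz, xz, xy):
    three uniaxial cases followed by three shear cases."""
    strains = []
    for i in range(3):                       # uniaxial compression/extension
        E = np.zeros((3, 3))
        E[i, i] = 1.0
        strains.append(E)
    for a, b in [(1, 2), (0, 2), (0, 1)]:    # shear
        E = np.zeros((3, 3))
        E[a, b] = E[b, a] = 0.5
        strains.append(E)
    return strains


def affine_boundary_displacements(E, n):
    """u(x) = E @ x at the boundary nodes of an n x n x n node cube
    with unit spacing (dict: node index -> prescribed displacement)."""
    bc = {}
    for k in range(n):
        for j in range(n):
            for i in range(n):
                if i in (0, n - 1) or j in (0, n - 1) or k in (0, n - 1):
                    bc[(i, j, k)] = E @ np.array([i, j, k], dtype=float)
    return bc


def voigt(sigma):
    """Symmetric 3x3 stress -> Voigt vector (xx, yy, zz, yz, xz, xy)."""
    return np.array([sigma[0, 0], sigma[1, 1], sigma[2, 2],
                     sigma[1, 2], sigma[0, 2], sigma[0, 1]])


def effective_tensor(average_stress_for, n=8):
    """Column c of the effective 6x6 matrix is the averaged stress response
    to the c-th unit macroscopic strain. `average_stress_for` stands in for
    the microstructure solve plus averaging over an interior subdomain."""
    C_eff = np.zeros((6, 6))
    for c, E in enumerate(unit_macroscopic_strains()):
        bc = affine_boundary_displacements(E, n)
        C_eff[:, c] = voigt(average_stress_for(E, bc))
    return C_eff


def isotropic_stand_in(E, bc, lam=1.0, mu=1.0):
    """Homogeneous isotropic Hooke response; only here so the sketch runs."""
    return lam * np.trace(E) * np.eye(3) + 2.0 * mu * E


if __name__ == "__main__":
    print(np.round(effective_tensor(isotropic_stand_in), 3))
```

    For the orthotropy check mentioned above, one would rotate the resulting tensor into a candidate material frame and verify that the entries coupling normal and shear components vanish up to a tolerance.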

    Editorial: Multiscale modeling for the liver


    Digitization of Pathology Labs: A Review of Lessons Learned

    Pathology laboratories are increasingly using digital workflows. This has the potential to increase lab efficiency, but the digitization process also involves major challenges. Several reports have been published describing the experiences of individual laboratories with the digitization process; however, a comprehensive overview of the lessons learned is still lacking. We provide an overview of the lessons learned for different aspects of the digitization process, including digital case management, digital slide reading, and computer-aided slide reading. We also cover metrics for monitoring performance and pitfalls, together with the corresponding values observed in practice. The overview is intended to help pathologists, IT decision-makers, and administrators benefit from the experiences of others and to implement the digitization process in an optimal way, making their own laboratory future-proof.

    Focused scores enable reliable discrimination of small differences in steatosis

    Background: Automated image analysis enables quantitative measurement of steatosis in histological images. However, the spatial heterogeneity of steatosis can make quantitative steatosis scores unreliable. To improve reliability, we have developed novel scores that are “focused” on steatotic tissue areas. Methods: Focused scores use concepts of tile-based hotspot analysis to compute statistics about steatotic tissue areas in an objective way. We evaluated focused scores on three data sets of images of rodent liver sections exhibiting different amounts of dietary-induced steatosis. The same evaluation was conducted with the standard steatosis score computed by most image analysis methods. Results: The standard score reliably discriminated large differences in steatosis (intraclass correlation coefficient, ICC = 0.86), but failed to discriminate small (ICC = 0.54) and very small (ICC = 0.14) differences. With an appropriate tile size, mean-based focused scores reliably discriminated large (ICC = 0.92), small (ICC = 0.86), and very small (ICC = 0.83) differences. Focused scores based on high percentiles showed promise in further improving the discrimination of very small differences (ICC = 0.93). Conclusions: Focused scores enable reliable discrimination of small differences in steatosis in histological images. They are conceptually simple and straightforward to use in research studies.
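
    As an illustration of the tile-based idea, the sketch below computes per-tile steatosis fractions from a binary steatosis mask and derives a "focused" score from the most steatotic tiles. The hotspot-selection rule (top fraction of tiles), the tile-size and tissue-coverage thresholds, and the percentile variant are assumptions made for this sketch; the paper's exact definitions may differ.

```python
"""Sketch of a tile-based "focused" steatosis score on a binary mask.
The hotspot rule (top fraction of tiles) and all parameters are assumptions."""
import numpy as np


def tile_fractions(steatosis_mask, tissue_mask, tile_size):
    """Per-tile steatosis fraction = steatotic pixels / tissue pixels,
    skipping tiles that contain too little tissue."""
    h, w = steatosis_mask.shape
    fractions = []
    for y in range(0, h - tile_size + 1, tile_size):
        for x in range(0, w - tile_size + 1, tile_size):
            tissue = tissue_mask[y:y + tile_size, x:x + tile_size]
            fat = steatosis_mask[y:y + tile_size, x:x + tile_size]
            if tissue.sum() < 0.5 * tile_size * tile_size:
                continue  # ignore mostly-background tiles
            fractions.append(fat[tissue].mean())
    return np.array(fractions)


def focused_score(fractions, hotspot_fraction=0.1, statistic="mean"):
    """Score computed only on the most steatotic tiles ("hotspots"),
    or as a high percentile of the per-tile distribution."""
    if fractions.size == 0:
        return float("nan")
    if statistic == "percentile":
        return float(np.percentile(fractions, 95))
    n_hot = max(1, int(round(hotspot_fraction * fractions.size)))
    return float(np.sort(fractions)[-n_hot:].mean())


if __name__ == "__main__":
    # Tiny synthetic example: sparse background fat plus one steatotic region.
    rng = np.random.default_rng(0)
    tissue = np.ones((256, 256), dtype=bool)
    steatosis = rng.random((256, 256)) < 0.02
    steatosis[32:96, 32:96] |= rng.random((64, 64)) < 0.4
    fractions = tile_fractions(steatosis, tissue, tile_size=32)
    print("standard score :", round(float(steatosis[tissue].mean()), 3))
    print("focused (mean) :", round(focused_score(fractions), 3))
    print("focused (p95)  :", round(focused_score(fractions, statistic="percentile"), 3))
```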

    Computational Modeling in Liver Surgery

    The need for extended liver resection is increasing due to the growing incidence of liver tumors in aging societies. Individualized surgical planning is key to identifying the optimal resection strategy and minimizing the risk of postoperative liver failure and tumor recurrence. Current computational tools provide virtual planning of liver resection by taking into account the spatial relationship between the tumor and the hepatic vascular trees, as well as the size of the future liver remnant. However, size and function of the liver are not necessarily equivalent. Hence, the future liver volume may be a poor estimate of the future liver function, especially in the presence of hepatic comorbidities such as hepatic steatosis. A systems medicine approach that integrates all available anatomical and functional information of the individual patient, spanning biological, medical, and surgical aspects, holds promise for better prediction of postoperative liver function and hence improved risk assessment. This review provides an overview of mathematical models related to the liver and its function and explores their potential relevance for computational liver surgery. We first summarize key facts of hepatic anatomy, physiology, and pathology relevant for hepatic surgery, followed by a description of the computational tools currently used in liver surgical planning. We then present selected state-of-the-art computational liver models potentially useful to support liver surgery. Finally, we discuss the main challenges that will need to be addressed when developing advanced computational planning tools in the context of liver surgery.

    Data-Driven Discovery of Immune Contexture Biomarkers

    Background: Features characterizing the immune contexture (IC) in the tumor microenvironment can be prognostic and predictive biomarkers. Identifying novel biomarkers can be challenging due to complex interactions between immune and tumor cells and the abundance of possible features. Methods: We describe an approach for the data-driven identification of IC biomarkers. For this purpose, we provide mathematical definitions of different feature classes, based on cell densities, cell-to-cell distances, and the spatial heterogeneity thereof. Candidate biomarkers are ranked according to their potential for the predictive stratification of patients. Results: We evaluated the approach on a dataset of colorectal cancer patients with variable amounts of microsatellite instability. The most promising features that can be explored as biomarkers were based on cell-to-cell distances and spatial heterogeneity. Both the tumor and non-tumor compartments yielded features that were potentially predictive of therapy response and point toward directions for further exploration. Conclusion: The data-driven approach simplifies the identification of promising IC biomarker candidates. Researchers can take guidance from the described approach to accelerate their biomarker research.
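
    The sketch below illustrates the three feature classes mentioned above (cell densities, cell-to-cell distances, and spatial heterogeneity) and a simple predictive ranking on synthetic point patterns. The concrete feature definitions, the coefficient-of-variation heterogeneity measure, and the AUC-based ranking are illustrative assumptions, not the authors' implementation.

```python
"""Sketch of immune-contexture feature classes (densities, cell-to-cell
distances, spatial heterogeneity) and a simple predictive ranking.
Feature definitions and the AUC-based ranking are illustrative assumptions."""
import numpy as np


def density(points, area):
    """Cells per unit area."""
    return len(points) / area


def mean_nearest_distance(sources, targets):
    """Mean distance from each source cell to its nearest target cell."""
    if len(sources) == 0 or len(targets) == 0:
        return float("nan")
    d = np.linalg.norm(sources[:, None, :] - targets[None, :, :], axis=-1)
    return float(d.min(axis=1).mean())


def heterogeneity(points, extent, n_tiles=4):
    """Coefficient of variation of per-tile cell counts (spatial heterogeneity)."""
    counts, _, _ = np.histogram2d(points[:, 0], points[:, 1],
                                  bins=n_tiles, range=[[0, extent], [0, extent]])
    return float(counts.std() / (counts.mean() + 1e-9))


def auc(values, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) statistic."""
    pos, neg = values[labels == 1], values[labels == 0]
    greater = (pos[:, None] > neg[None, :]).mean()
    ties = (pos[:, None] == neg[None, :]).mean()
    return float(greater + 0.5 * ties)


if __name__ == "__main__":
    # Synthetic cohort: per-patient immune/tumor coordinates and a response label.
    rng = np.random.default_rng(1)
    extent = 100.0
    patients = []
    for label in [0] * 10 + [1] * 10:
        n_immune = rng.poisson(60 + 40 * label)   # responders: more immune cells
        immune = rng.uniform(0, extent, size=(n_immune, 2))
        tumor = rng.uniform(0, extent, size=(80, 2))
        patients.append((immune, tumor, label))

    features = {
        "immune_density": lambda im, tu: density(im, extent ** 2),
        "immune_to_tumor_distance": lambda im, tu: mean_nearest_distance(im, tu),
        "immune_heterogeneity": lambda im, tu: heterogeneity(im, extent),
    }
    labels = np.array([p[2] for p in patients])
    ranking = []
    for name, f in features.items():
        values = np.array([f(im, tu) for im, tu, _ in patients])
        score = auc(values, labels)
        ranking.append((max(score, 1.0 - score), name))  # direction-free ranking
    for score, name in sorted(ranking, reverse=True):
        print(f"{name}: AUC = {score:.2f}")
```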

    Ten simple rules for typographically appealing scientific texts.
