
    An Efficient Algorithm for Automatic Structure Optimization in X-ray Standing-Wave Experiments

    X-ray standing-wave photoemission experiments involving multilayered samples are emerging as unique probes of the buried interfaces that are ubiquitous in current device and materials research. Analyzing such data requires a structure optimization process, comparing experiment to theory, that is not straightforward. In this work, we present a new computer program for optimizing the analysis of standing-wave data, called SWOPT, that automates this trial-and-error optimization process. The program includes an algorithm that has been developed for computationally expensive problems: so-called black-box simulation optimizations. It also includes a more efficient version of the Yang X-ray Optics Program (YXRO) [Yang, S.-H., Gray, A.X., Kaiser, A.M., Mun, B.S., Sell, B.C., Kortright, J.B., Fadley, C.S., J. Appl. Phys. 113, 1 (2013)], which is about an order of magnitude faster than the original version. Human interaction is not required during optimization. We tested our optimization algorithm on real and hypothetical problems and show that it finds better solutions significantly faster than a random search approach. The total optimization time ranges, depending on the sample structure, from minutes to a few hours on a modern laptop computer, and can be up to 100x faster than a corresponding manual optimization. These speeds make the SWOPT program a valuable tool for real-time analyses of data during synchrotron experiments.
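
    To make the black-box setup concrete, here is a minimal sketch of the kind of automated loop such a tool replaces: a coarse random screen of a layer-parameter box followed by a derivative-free local refinement against a stand-in misfit function. The objective, bounds, and parameter names below are hypothetical placeholders for an expensive standing-wave simulation; this is not the actual SWOPT algorithm.

```python
# Minimal sketch of a black-box structure-optimization loop (illustrative only).
# The misfit below is a stand-in for an expensive standing-wave simulation; a real
# cost function would compare simulated and measured photoemission rocking curves.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def misfit(layer_params):
    """Stand-in chi-squared as a function of layer parameters (hypothetical)."""
    target = np.array([2.0, 5.0, 1.2])   # "true" structure, unknown in practice
    return float(np.sum((np.asarray(layer_params) - target) ** 2))

bounds = [(0.5, 10.0)] * 3               # allowed range for each layer parameter

# Random-search baseline: sample the parameter box uniformly.
samples = rng.uniform([b[0] for b in bounds], [b[1] for b in bounds], size=(200, 3))
best_random = min(misfit(s) for s in samples)

# Derivative-free local refinement started from the best of a small random screen.
x0 = samples[np.argmin([misfit(s) for s in samples[:20]])]
result = minimize(misfit, x0, method="Nelder-Mead")

print(f"random search best misfit : {best_random:.4f}")
print(f"local refinement misfit   : {result.fun:.4f} at {result.x.round(3)}")
```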

    Exact subgrid interface correction schemes for elliptic interface problems

    We introduce a non-conforming finite element method for second-order elliptic interface problems. Our approach applies to problems in which discontinuous coefficients and singular sources on the interface may give rise to jump discontinuities in either the solution or its normal derivative. Given a standard background mesh and an interface that passes between elements, the key idea is to construct a singular correction function which satisfies the prescribed jump conditions, providing accurate sub-grid resolution of the discontinuities. Utilizing the closest point extension and an implicit interface representation by the signed distance function, an algorithm is established to construct the correction function. The result is a function which is supported only on the interface elements, represented by the regular basis functions, and bounded independently of the interface location with respect to the background mesh. In the particular case of a constant second-order coefficient, our regularization by a singular function is straightforward, and the resulting left-hand side is identical to that of a regular problem without introducing any instability. The influence of the regularization appears solely on the right-hand side, which simplifies the implementation. In the more general case of discontinuous second-order coefficients, a normalization is invoked which introduces a constraint equation on the interface. This results in a problem statement similar to that of a saddle-point problem. We employ a two-level iteration as the solution strategy, which exhibits aspects similar to those of iterative preconditioning strategies. Elliptic interface problems appear in many physical applications, including Stefan problems, fluid problems, materials problems, free boundary problems, and shape optimization. As a model problem, we can take a Poisson equation with a piecewise constant coefficient a_i > 0 on each Ω_i: −∇·(a_i ∇u) = f on Ω_i for i = 1, 2, with boundary data given on ∂Ω. For well-posedness, we need additional information on the behavior of the solution on the interface. Let g and h be …
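
    To make the jump conditions concrete, the short sketch below solves the 1D analogue of this model problem analytically (f = 0, prescribed jumps [u] = g and [a u'] = h at a single interface point). It is illustrative only: it is not the paper's non-conforming correction scheme, and the coefficient and jump values are made up.

```python
# 1D model interface problem: -(a_i u')' = 0 on (0, alpha) and (alpha, 1),
# u(0) = u(1) = 0, with prescribed jumps [u] = g and [a u'] = h at x = alpha.
# With f = 0 the solution is linear on each side: u1 = c1*x, u2 = c2*(x - 1),
# and the jump conditions give a 2x2 linear system for (c1, c2).
import numpy as np

a1, a2 = 1.0, 10.0        # piecewise constant coefficients a_i > 0 (illustrative)
alpha = 0.4               # interface location
g, h = 0.5, -1.0          # prescribed jumps in u and in the flux a*u'

#   u2(alpha) - u1(alpha) = g   ->  -alpha*c1 + (alpha - 1)*c2 = g
#   a2*u2'    - a1*u1'    = h   ->  -a1*c1    + a2*c2          = h
A = np.array([[-alpha, alpha - 1.0],
              [-a1,    a2]])
c1, c2 = np.linalg.solve(A, np.array([g, h]))

u_left  = lambda x: c1 * x
u_right = lambda x: c2 * (x - 1.0)

print("jump in u    :", u_right(alpha) - u_left(alpha))   # should equal g
print("jump in a*u' :", a2 * c2 - a1 * c1)                 # should equal h
```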

    Multicellular Architecture of Malignant Breast Epithelia Influences Mechanics

    Cell–matrix and cell–cell mechanosensing are important in many cellular processes, particularly for epithelial cells. A crucial question, which remains unexplored, is how the mechanical microenvironment is altered as a result of changes to multicellular tissue structure during cancer progression. In this study, we investigated the influence of the multicellular tissue architecture on mechanical properties of the epithelial component of the mammary acinus. Using creep compression tests on multicellular breast epithelial structures, we found that pre-malignant acini with no lumen (MCF10AT) were significantly stiffer than normal hollow acini (MCF10A), by 60%. This difference depended on structural changes in the pre-malignant acini, as neither single cells nor normal multicellular acini tested before lumen formation exhibited these differences. To understand these differences, we simulated the deformation of the acini with different multicellular architectures and calculated their mechanical properties; our results suggest that lumen filling alone can explain the experimentally observed stiffness increase. We also simulated a single contracting cell in different multicellular architectures and found that lumen filling led to a 20% increase in the "perceived stiffness" of a single contracting cell, independent of any changes to matrix mechanics. Our results suggest that lumen filling in carcinogenesis alters the mechanical microenvironment in multicellular epithelial structures, a phenotype that may cause downstream disruptions to mechanosensing.
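
    For readers unfamiliar with creep compression analysis, the sketch below shows one common way an effective stiffness can be extracted from such a test: fitting a Kelvin-Voigt creep response to strain-versus-time data under constant applied stress. The data are synthetic and this is not the authors' analysis pipeline; parameter values are placeholders.

```python
# Illustrative creep-compression fit (synthetic data, hypothetical parameters).
# Kelvin-Voigt creep under constant stress: eps(t) = (sigma0/E)*(1 - exp(-E*t/eta)).
import numpy as np
from scipy.optimize import curve_fit

sigma0 = 100.0                     # applied compressive stress (Pa), assumed constant

def kelvin_voigt_creep(t, E, eta):
    """Creep strain of a Kelvin-Voigt solid under constant stress sigma0."""
    return (sigma0 / E) * (1.0 - np.exp(-E * t / eta))

# Synthetic "measurement": true E = 500 Pa, eta = 2000 Pa*s, plus noise.
t = np.linspace(0.0, 60.0, 200)
strain = kelvin_voigt_creep(t, 500.0, 2000.0) \
         + np.random.default_rng(1).normal(0.0, 2e-3, t.size)

(E_fit, eta_fit), _ = curve_fit(kelvin_voigt_creep, t, strain, p0=(300.0, 1000.0))
print(f"fitted effective stiffness E ~ {E_fit:.0f} Pa, viscosity eta ~ {eta_fit:.0f} Pa*s")
```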

    Evaluation of state-of-the-art segmentation algorithms for left ventricle infarct from late Gadolinium enhancement MR images

    Studies have demonstrated the feasibility of late Gadolinium enhancement (LGE) cardiovascular magnetic resonance (CMR) imaging for guiding the management of patients with sequelae to myocardial infarction, such as ventricular tachycardia and heart failure. Clinical implementation of these developments necessitates a reproducible and reliable segmentation of the infarcted regions. It is challenging to compare new algorithms for infarct segmentation in the left ventricle (LV) with existing algorithms. Benchmarking datasets with evaluation strategies are much needed to facilitate comparison. This manuscript presents a benchmarking evaluation framework for future algorithms that segment infarct from LGE CMR of the LV. The image database consists of 30 LGE CMR images of both humans and pigs that were acquired from two separate imaging centres. A consensus ground truth was obtained for all data using maximum likelihood estimation. Six widely-used fixed-thresholding methods and five recently developed algorithms are tested on the benchmarking framework. Results demonstrate that the algorithms have better overlap with the consensus ground truth than most of the n-SD fixed-thresholding methods, with the exception of the Full-Width-at-Half-Maximum (FWHM) fixed-thresholding method. Some of the pitfalls of fixed-thresholding methods are demonstrated in this work. The benchmarking evaluation framework, which is a contribution of this work, can be used to test and benchmark future algorithms that detect and quantify infarct in LGE CMR images of the LV. The datasets, ground truth, and evaluation code have been made publicly available through the website: https://www.cardiacatlas.org/web/guest/challenges
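
    To make the fixed-thresholding baselines concrete, the sketch below applies simplified n-SD and FWHM thresholds to synthetic intensities and scores them with a Dice overlap against a reference mask. The arrays, masks, and values are stand-ins for LGE CMR data; this is not the benchmark's evaluation code.

```python
# Simplified n-SD and FWHM fixed-thresholding on synthetic "LGE" intensities,
# scored with Dice overlap against a synthetic reference infarct mask.
import numpy as np

rng = np.random.default_rng(0)
img = rng.normal(100.0, 10.0, size=(64, 64))      # fake myocardial intensities
infarct_truth = np.zeros(img.shape, dtype=bool)
infarct_truth[20:35, 20:35] = True
img[infarct_truth] += 150.0                        # hyperenhanced (bright) region

def n_sd_threshold(image, remote_mask, n):
    """Voxels brighter than mean + n*SD of a 'remote' (healthy) region."""
    mu, sd = image[remote_mask].mean(), image[remote_mask].std()
    return image > mu + n * sd

def fwhm_threshold(image, myo_mask):
    """Simplified FWHM: voxels above half the maximum intensity in the mask."""
    return image > 0.5 * image[myo_mask].max()

def dice(a, b):
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

remote = ~infarct_truth                            # treat non-infarct as remote here
myo = np.ones(img.shape, dtype=bool)               # whole image stands in for myocardium
for name, seg in [("2-SD", n_sd_threshold(img, remote, 2)),
                  ("5-SD", n_sd_threshold(img, remote, 5)),
                  ("FWHM", fwhm_threshold(img, myo))]:
    print(f"{name}: Dice = {dice(seg, infarct_truth):.3f}")
```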