2,970 research outputs found

    An improved parameter estimation and comparison for soft tissue constitutive models containing an exponential function

    Motivated by the well-known result that the stiffness of soft tissue is proportional to the stress, many of the constitutive laws for soft tissues contain an exponential function. In this work, we analyze properties of the exponential function and how it affects the estimation and comparison of elastic parameters for soft tissues. In particular, we find that, as a consequence of the exponential function, there are lines of high covariance in the elastic parameter space. As a result, one can have widely varying mechanical parameters defining the tissue stiffness but similar effective stress–strain responses. Drawing on elementary algebra, we propose simple changes in the norm and the parameter space which significantly improve the convergence of parameter estimation and its robustness in the presence of noise. More importantly, we demonstrate that these changes improve the conditioning of the problem and provide a more robust solution in the case of heterogeneous material by reducing the chances of getting trapped in a local minimum. Based upon this new insight, we also propose a transformed parameter space that allows for rational parameter comparison and avoids misleading conclusions regarding soft tissue mechanics.
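
    A minimal numerical sketch of the covariance issue described above (a generic Fung-type law with made-up data, not the paper's formulation): fitting sigma(eps) = a(exp(b·eps) − 1) by least squares shows that the estimates of a and b are strongly correlated, so very different parameter pairs reproduce essentially the same stress–strain response.

```python
# Minimal sketch (not the paper's formulation or data): a one-dimensional
# Fung-type law sigma(eps) = a * (exp(b * eps) - 1). Different (a, b) pairs
# give nearly identical stress-strain curves, so the least-squares problem
# is ill-conditioned along a "valley" in parameter space.
import numpy as np
from scipy.optimize import curve_fit

def fung(eps, a, b):
    return a * np.expm1(b * eps)

rng = np.random.default_rng(0)
eps = np.linspace(0.0, 0.3, 50)
sigma = fung(eps, 2.0, 8.0) + rng.normal(0.0, 0.05, eps.size)   # synthetic data

# Fit from two different starting points and inspect parameter correlation.
for p0 in [(1.0, 10.0), (5.0, 3.0)]:
    popt, pcov = curve_fit(fung, eps, sigma, p0=p0, maxfev=20000)
    corr = pcov[0, 1] / np.sqrt(pcov[0, 0] * pcov[1, 1])
    resid = np.linalg.norm(fung(eps, *popt) - sigma)
    print(f"start={p0}  a={popt[0]:.2f}  b={popt[1]:.2f}  "
          f"corr(a,b)={corr:+.3f}  residual={resid:.3f}")
# The strong correlation between a and b is the "line of high covariance"
# discussed in the abstract; a change of norm or parameter space of the kind
# proposed there is aimed at removing exactly this ill-conditioning.
```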

    Estimating the intrinsic dimension of datasets by a minimal neighborhood information

    Analyzing large volumes of high-dimensional data is an issue of fundamental importance in data science, molecular simulations and beyond. Several approaches work on the assumption that the important content of a dataset belongs to a manifold whose Intrinsic Dimension (ID) is much lower than the large number of raw coordinates. Such a manifold is generally twisted and curved; in addition, points on it are typically non-uniformly distributed: two factors that make identifying the ID and exploiting it genuinely hard. Here we propose a new ID estimator that uses only the distances to the first and second nearest neighbors of each point in the sample. This extreme minimality enables us to reduce the effects of curvature and density variation, as well as the computational cost. The ID estimator is theoretically exact for uniformly distributed datasets and provides consistent measures in general. When used in combination with block analysis, it allows the relevant dimensions to be discriminated as a function of the block size, so that the ID can be estimated even when the data lie on a manifold perturbed by high-dimensional noise, a situation often encountered in real-world datasets. We demonstrate the usefulness of the approach on molecular simulations and image analysis.
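
    A minimal sketch of a two-nearest-neighbour ID estimator in the spirit of the abstract (assuming Euclidean distances and the maximum-likelihood form of the estimator; the authors' exact fitting procedure and the block analysis are not reproduced):

```python
# Two-nearest-neighbour intrinsic dimension estimate: under local uniformity
# the ratio mu = r2 / r1 of the distances to the second and first nearest
# neighbours follows a Pareto law with exponent d, giving the maximum-
# likelihood estimate d = N / sum(log mu).
import numpy as np
from scipy.spatial import cKDTree

def two_nn_id(X):
    tree = cKDTree(X)
    # k=3 returns each point itself plus its two nearest neighbours.
    dists, _ = tree.query(X, k=3)
    r1, r2 = dists[:, 1], dists[:, 2]
    mu = r2 / r1
    mu = mu[np.isfinite(mu) & (mu > 1.0)]     # guard against duplicate points
    return mu.size / np.sum(np.log(mu))

# Sanity check: a 2-D plane embedded in 10 ambient dimensions.
rng = np.random.default_rng(1)
X = np.zeros((5000, 10))
X[:, :2] = rng.uniform(size=(5000, 2))
print(two_nn_id(X))   # expected to be close to 2
```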

    Learning image quality assessment by reinforcing task amenable data selection

    In this paper, we consider a type of image quality assessment as a task-specific measurement, which can be used to select images that are more amenable to a given target task, such as image classification or segmentation. We propose to train simultaneously two neural networks for image selection and a target task using reinforcement learning. A controller network learns an image selection policy by maximising an accumulated reward based on the target task performance on the controller-selected validation set, whilst the target task predictor is optimised using the training set. The trained controller is therefore able to reject those images that lead to poor accuracy in the target task. In this work, we show that the controller-predicted image quality can be significantly different from the task-specific image quality labels that are manually defined by humans. Furthermore, we demonstrate that it is possible to learn effective image quality assessment without using a "clean" validation set, thereby avoiding the requirement for human labelling of images with respect to their amenability for the task. Using 6712 labelled and segmented clinical ultrasound images from 259 patients, experimental results on holdout data show that the proposed image quality assessment achieved a mean classification accuracy of 0.94±0.01 and a mean segmentation Dice of 0.89±0.02, by discarding 5% and 15% of the acquired images, respectively. The significantly improved performance was observed for both tested tasks, compared with the respective 0.90±0.01 and 0.82±0.02 from networks without considering task amenability. This enables image quality feedback during real-time ultrasound acquisition, among many other medical imaging applications.
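
    The following toy sketch illustrates only the overall training loop; the synthetic data, the Bernoulli controller, the linear stand-in predictor and the REINFORCE update are all illustrative assumptions, not the authors' networks or algorithm.

```python
# Toy sketch of controller + task-predictor co-training: a Bernoulli policy
# scores each sample from a "quality" feature, a simple predictor is trained
# on the selected training samples, and the controller is updated with
# REINFORCE using accuracy on the controller-selected validation samples.
import numpy as np

rng = np.random.default_rng(0)

def make_data(n):
    x = rng.normal(size=(n, 5))
    y = (x[:, 0] > 0).astype(int)
    quality = rng.uniform(size=n)                # low quality -> label noise
    y_noisy = np.where(rng.uniform(size=n) > quality, 1 - y, y)
    return x, y_noisy, quality

def train_predictor(x, y):
    # least-squares linear "classifier" as a stand-in for a neural network
    w, *_ = np.linalg.lstsq(np.c_[x, np.ones(len(x))], 2.0 * y - 1.0, rcond=None)
    return w

def accuracy(w, x, y):
    return np.mean(((np.c_[x, np.ones(len(x))] @ w) > 0).astype(int) == y)

x_tr, y_tr, q_tr = make_data(2000)
x_va, y_va, q_va = make_data(500)

theta = np.zeros(2)                              # controller: sigmoid(a*q + b)
baseline, lr = 0.0, 2.0
for step in range(200):
    p_tr = 1.0 / (1.0 + np.exp(-(theta[0] * q_tr + theta[1])))
    p_va = 1.0 / (1.0 + np.exp(-(theta[0] * q_va + theta[1])))
    keep_tr = rng.uniform(size=len(p_tr)) < p_tr
    keep_va = rng.uniform(size=len(p_va)) < p_va
    if keep_tr.sum() < 10 or keep_va.sum() < 10:
        continue
    w = train_predictor(x_tr[keep_tr], y_tr[keep_tr])
    reward = accuracy(w, x_va[keep_va], y_va[keep_va])
    baseline = 0.9 * baseline + 0.1 * reward
    for q, keep, p in [(q_tr, keep_tr, p_tr), (q_va, keep_va, p_va)]:
        g = keep - p          # d(log pi)/d(logit) for sampled Bernoulli actions
        theta += lr * (reward - baseline) * np.array([np.mean(g * q), np.mean(g)])
print("controller weights:", theta, "last reward:", reward)
```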

    Adaptable image quality assessment using meta-reinforcement learning of task amenability

    The performance of many medical image analysis tasks is strongly associated with image data quality. When developing modern deep learning algorithms, rather than relying on subjective (human-based) image quality assessment (IQA), task amenability potentially provides an objective measure of task-specific image quality. To predict task amenability, an IQA agent is trained using reinforcement learning (RL) with a simultaneously optimised task predictor, such as a classification or segmentation neural network. In this work, we develop transfer learning or adaptation strategies to increase the adaptability of both the IQA agent and the task predictor so that they are less dependent on high-quality, expert-labelled training data. The proposed transfer learning strategy re-formulates the original RL problem for task amenability in a meta-reinforcement learning (meta-RL) framework. The resulting algorithm facilitates efficient adaptation of the agent to different definitions of image quality, each with its own Markov decision process environment including different images, labels and an adaptable task predictor. Our work demonstrates that IQA agents pre-trained on non-expert task labels can be adapted to predict task amenability as defined by expert task labels, using only a small set of expert labels. Using 6644 clinical ultrasound images from 249 prostate cancer patients, our results for image classification and segmentation tasks show that the proposed IQA method can be adapted using as few as 19.7% and 29.6% expert-reviewed consensus labels for the respective tasks and still achieve comparable IQA and task performance, which would otherwise require a training dataset with 100% expert labels.

    Cosmic Hydrogen Was Significantly Neutral a Billion Years After the Big Bang

    The ionization fraction of cosmic hydrogen, left over from the big bang, provides crucial fossil evidence for when the first stars and quasar black holes formed in the infant universe. Spectra of the two most distant quasars known show nearly complete absorption of photons with wavelengths shorter than the Ly-alpha transition of neutral hydrogen, indicating that hydrogen in the intergalactic medium (IGM) had not been completely ionized at a redshift z~6.3, about a billion years after the big bang. Here we show that the radii of influence of ionizing radiation from these quasars imply that the surrounding IGM had a neutral hydrogen fraction of tens of percent prior to the quasar activity, much higher than previous lower limits of ~0.1%. When combined with the recent inference of a large cumulative optical depth to electron scattering after cosmological recombination from the WMAP data, our result suggests the existence of a second peak in the mean ionization history, potentially due to an early formation episode of the first stars. Comment: 14 pages, 2 figures. Accepted for publication in Nature. Press embargo until published.
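
    The scaling behind the argument can be sketched with a textbook proximity-zone estimate (not the paper's detailed radiative-transfer calculation): ignoring recombinations, a quasar emitting ionizing photons for a time t_Q carves out a region whose radius depends on the neutral fraction, so a measured radius of influence constrains x_HI.

```latex
% Textbook proximity-zone scaling (not the paper's calculation): a quasar
% emitting \dot N_\gamma ionizing photons per second for a time t_Q into an
% IGM of mean hydrogen density \bar n_{\rm H} and neutral fraction x_{\rm HI}
% ionizes a region of radius
\begin{equation}
  R_{\rm ion} \simeq
  \left( \frac{3\, \dot N_\gamma \, t_Q}
              {4\pi \, x_{\rm HI} \, \bar n_{\rm H}} \right)^{1/3} .
\end{equation}
% At fixed luminosity and quasar age, R_ion scales as x_HI^{-1/3}: an
% observed region of influence several times smaller than expected for
% x_HI ~ 0.1% points to a neutral fraction of order tens of percent.
```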

    A review of artificial intelligence in prostate cancer detection on imaging

    A multitude of studies have explored the role of artificial intelligence (AI) in providing diagnostic support to radiologists, pathologists, and urologists in prostate cancer detection, risk stratification, and management. This review provides a comprehensive overview of relevant literature regarding the use of AI models in (1) detecting prostate cancer on radiology images (magnetic resonance and ultrasound imaging), (2) detecting prostate cancer on histopathology images of prostate biopsy tissue, and (3) assisting in supporting tasks for prostate cancer detection (prostate gland segmentation, MRI-histopathology registration, MRI-ultrasound registration). We discuss the potential of these AI models to assist in the clinical workflow of prostate cancer diagnosis, as well as their current limitations, including variability in training datasets, algorithms, and evaluation criteria. We also discuss ongoing challenges and what is needed to bridge the gap between academic research on AI for prostate cancer and commercial solutions that improve routine clinical care.

    A Revised Design for Microarray Experiments to Account for Experimental Noise and Uncertainty of Probe Response

    Background: Although microarrays are widely used analysis tools in biomedical research, they are known to yield noisy output that usually requires experimental confirmation. To tackle this problem, many studies have developed rules for optimizing probe design and devised complex statistical tools to analyze the output. However, less emphasis has been placed on systematically identifying the noise component as part of the experimental procedure. One source of noise is the variance in probe binding, which can be assessed by replicating array probes. The second source is poor probe performance, which can be assessed by calibrating the array based on a dilution series of target molecules. Using model experiments for copy number variation and gene expression measurements, we investigate here a revised design for microarray experiments that addresses both of these sources of variance. Results: Two custom arrays were used to evaluate the revised design: one based on 25-mer probes from an Affymetrix design and the other based on 60-mer probes from an Agilent design. To assess experimental variance in probe binding, all probes were replicated ten times. To assess probe performance, the probes were calibrated using a dilution series of target molecules and the signal response was fitted to an adsorption model. We found that significant variance of the signal could be controlled by averaging across probes and removing probes that are nonresponsive or poorly responsive in the calibration experiment. Taking this into account, one can obtain a more reliable signal with the added option of obtaining absolute rather than relative measurements. Conclusion: The assessment of technical variance within the experiments, combined with the calibration of probes, allows poorly responding probes to be removed and yields more reliable signals for the remaining ones. Once an array is properly calibrated, absolute quantification of signals becomes straightforward, alleviating the need for normalization and reference hybridizations.
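
    A hedged sketch of the calibration step (the Langmuir-type adsorption model and the cut-offs below are illustrative assumptions, not the paper's exact model or thresholds): each probe's dilution-series response is fitted to a saturating isotherm, and probes that fit poorly or barely respond are flagged for removal.

```python
# Illustrative probe calibration: fit a dilution series to a simple
# Langmuir-type adsorption model and flag nonresponsive probes.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c, s_max, k, bg):
    # saturating response plus constant background signal
    return s_max * c / (c + k) + bg

conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])   # dilution series (a.u.)

def calibrate_probe(signal):
    """Return fitted parameters and a keep/discard flag for one probe."""
    try:
        popt, _ = curve_fit(langmuir, conc, signal,
                            p0=(signal.max(), 1.0, signal.min()),
                            bounds=(0.0, np.inf), maxfev=10000)
    except RuntimeError:
        return None, False
    s_max, k, bg = popt
    fitted = langmuir(conc, *popt)
    r2 = 1.0 - np.sum((signal - fitted) ** 2) / np.sum((signal - signal.mean()) ** 2)
    responsive = (s_max > 3.0 * bg) and (r2 > 0.9)   # illustrative cut-offs
    return popt, responsive

# Example: a responsive probe and a flat (nonresponsive) one.
rng = np.random.default_rng(0)
good = langmuir(conc, 1000.0, 2.0, 50.0) + rng.normal(0, 20, conc.size)
flat = np.full(conc.size, 60.0) + rng.normal(0, 20, conc.size)
for name, sig in [("good", good), ("flat", flat)]:
    _, keep = calibrate_probe(sig)
    print(name, "keep" if keep else "discard")
```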

    Transforming growth factor-beta and mutant p53 conspire to induce metastasis by antagonizing p63: a (ternary) complex affair

    How and when a tumor acquires metastatic properties remain largely unknown. Recent work has uncovered an intricate new mechanism through which transforming growth factor-beta (TGFβ) acts in concert with oncogenic Ras to antagonize the metastasis-protective function of p63. p63 inhibition requires the combined action of Ras-activated mutant p53 and TGFβ-induced Smads. Mechanistically, it involves the formation of a p63-Smads-mutant p53 ternary complex. Remarkably, just two of the key downstream targets of p63 turn out to be sufficient as a prognostic tool for breast cancer metastasis. Moreover, the molecular mechanism of this inhibition points to novel therapeutic possibilities.

    Plasmonically Enhanced Reflectance of Heat Radiation from Low-Bandgap Semiconductor Microinclusions

    Increased reflectance from the inclusion of highly scattering particles at low volume fractions in an insulating dielectric offers a promising way to reduce radiative thermal losses at high temperatures. Here, we investigate plasmonic-resonance-driven enhanced scattering from microinclusions of low-bandgap semiconductors (InP, Si, Ge, PbS, InAs and Te) in an insulating composite to tailor its infrared reflectance for minimizing thermal losses from radiative transfer. To this end, we compute the spectral properties of the microcomposites using Monte Carlo modeling and compare them with results from the Fresnel equations. The roles of particle-size-dependent Mie scattering and absorption efficiencies and of scattering anisotropy are studied to identify the optimal microinclusion size and material parameters for maximizing the reflectance of the thermal radiation. For composites with Si and Ge microinclusions we obtain reflectance efficiencies of 57–65% for the incident blackbody radiation from sources at temperatures in the range 400–1600 °C. Furthermore, we observe a broadening of the reflectance spectra from the plasmonic resonances due to charge carriers generated from defect states within the semiconductor bandgap. Our results thus open up the possibility of developing efficient high-temperature thermal insulators through the use of low-bandgap semiconductor microinclusions in insulating dielectrics. Comment: Main article (8 Figures and 2 Tables) + Supporting Information (8 Figures).
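
    For orientation, the single-particle quantities that enter such a calculation, and a rough two-flux estimate of how they set the diffuse reflectance, are standard (textbook definitions; the paper itself uses Monte Carlo radiative transfer and the Fresnel equations rather than the Kubelka–Munk relation below):

```latex
% Standard single-particle and two-flux quantities (textbook definitions,
% not the paper's transport model): for spheres of radius r at volume
% fraction f_v illuminated at wavelength \lambda,
\begin{equation}
  x = \frac{2\pi r}{\lambda}, \qquad
  \mu_s' = \frac{3 f_v}{4 r}\, Q_{\rm sca}\,(1-g), \qquad
  \mu_a  = \frac{3 f_v}{4 r}\, Q_{\rm abs},
\end{equation}
% and a rough Kubelka--Munk estimate of the diffuse reflectance of an
% optically thick composite,
\begin{equation}
  R_\infty = 1 + \frac{K}{S}
           - \sqrt{\Bigl(\frac{K}{S}\Bigr)^{2} + 2\,\frac{K}{S}},
  \qquad
  \frac{K}{S} \;\propto\; \frac{\mu_a}{\mu_s'}
            = \frac{Q_{\rm abs}}{Q_{\rm sca}\,(1-g)},
\end{equation}
% which makes explicit why maximizing Q_sca(1-g) relative to Q_abs of the
% microinclusions maximizes the infrared reflectance.
```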

    Holographic Wilsonian flows and emergent fermions in extremal charged black holes

    We study holographic Wilsonian RG in a general class of asymptotically AdS backgrounds with a U(1) gauge field. We consider free charged Dirac fermions in such a background, and integrate them out up to an intermediate radial distance, yielding an equivalent low energy dual field theory. The new ingredient, compared to scalars, involves a 'generalized' basis of coherent states which labels a particular half of the fermion components as coordinates or momenta, depending on the choice of quantization (standard or alternative). We apply this technology to explicitly compute RG flows of charged fermionic operators and their composites (double trace operators) in field theories dual to (a) pure AdS and (b) extremal charged black hole geometries. The flow diagrams and fixed points are determined explicitly. In the case of the extremal black hole, the RG flows connect two fixed points at the UV AdS boundary to two fixed points in the IR AdS_2 region. The double trace flow is shown, both numerically and analytically, to develop a pole singularity in the AdS_2 region at low frequency and near the Fermi momentum, which can be traced to the appearance of massless fermion modes on the low energy cut-off surface. The low energy field theory action we derive exactly agrees with the semi-holographic action proposed by Faulkner and Polchinski in arXiv:1001.5049 [hep-th]. In terms of field theory, the holographic version of Wilsonian RG leads to a quantum theory with random sources. In the extremal black hole background the random sources become 'light' in the AdS_2 region near the Fermi surface and emerge as new dynamical degrees of freedom. Comment: 37 pages (including 8 pages of appendix), 10 figures and 2 tables.
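
    Schematically (following the general holographic Wilsonian RG literature rather than the paper's explicit fermionic expressions), the double-trace coupling at the cut-off surface obeys a first-order Riccati-type flow, and the pole mentioned above is a finite-radius singularity of its solution:

```latex
% Generic structure of a holographic Wilsonian flow for a double-trace
% coupling f at radial cut-off r (the coefficients depend on the background,
% frequency and momentum; this is a schematic form, not the paper's result):
\begin{equation}
  \partial_r f(r;\omega,k) \;=\; \alpha(r)\, f^2 + \beta(r)\, f + \gamma(r),
\end{equation}
% a Riccati equation whose solution can blow up at finite r.  Such a pole
% signals a mode becoming normalizable at the cut-off surface, i.e. the
% massless fermionic degree of freedom that emerges near the Fermi momentum
% in the AdS_2 region.
```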