
    Assessment of algorithms for mitosis detection in breast cancer histopathology images

    The proliferative activity of breast tumors, which is routinely estimated by counting mitotic figures in hematoxylin and eosin stained histology sections, is considered one of the most important prognostic markers. However, mitosis counting is laborious, subjective and may suffer from low inter-observer agreement. With the wider acceptance of whole slide images in pathology labs, automatic image analysis has been proposed as a potential solution to these issues. In this paper, the results from the Assessment of Mitosis Detection Algorithms 2013 (AMIDA13) challenge are described. The challenge was based on a data set consisting of 12 training and 11 testing subjects, with more than one thousand mitotic figures annotated by multiple observers. Short descriptions and results from the evaluation of eleven methods are presented. The top performing method has an error rate that is comparable to the inter-observer agreement among pathologists.
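    To make the kind of evaluation described above concrete, the sketch below scores a set of detections in a generic way: each predicted mitosis centroid is greedily matched to an unmatched ground-truth annotation within a distance threshold, and precision, recall and F1 are computed from the matches. The threshold value and the greedy matching rule are illustrative assumptions, not the exact AMIDA13 protocol.

import numpy as np

def evaluate_detections(pred, gt, max_dist=30.0):
    """pred, gt: (N, 2) arrays of (x, y) centroids in pixels."""
    gt_used = np.zeros(len(gt), dtype=bool)
    tp = 0
    for p in pred:
        if len(gt) == 0:
            break
        d = np.linalg.norm(gt - p, axis=1)   # distance to every ground-truth centroid
        d[gt_used] = np.inf                  # each annotation can be matched only once
        j = np.argmin(d)
        if d[j] <= max_dist:
            gt_used[j] = True
            tp += 1
    fp = len(pred) - tp
    fn = len(gt) - tp
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Example with a handful of synthetic centroids (coordinates in pixels).
pred = np.array([[10.0, 12.0], [200.0, 40.0], [55.0, 300.0]])
gt = np.array([[12.0, 14.0], [205.0, 38.0]])
precision, recall, f1 = evaluate_detections(pred, gt)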

    The kidney and the elderly: assessment of renal function; prognosis following renal failure


    Probabilistic Models of Motor Production

    N. Bernstein defined the ability of the central nervous system (CNS) to control the many degrees of freedom of a physical body, with all its redundancy and flexibility, as the main problem in motor control. He pointed out that man-made mechanisms usually have one, sometimes two degrees of freedom (DOF); when the number of DOF increases further, it becomes prohibitively hard to control them. The brain, however, seems to perform such control effortlessly. He suggested how the brain might deal with this: when a motor skill is being acquired, the brain artificially limits the degrees of freedom, leaving only one or two. As the skill level increases, the brain gradually "frees" the previously fixed DOF, applying control when needed and in the directions that have to be corrected, eventually arriving at a control scheme where all the DOF are "free". This approach of reducing the dimensionality of motor control remains relevant today. One of the possible solutions to Bernstein's problem is the hypothesis of motor primitives (MPs) - small building blocks that constitute complex movements and facilitate motor learning and task completion. Just like in the visual system, having a homogeneous hierarchical architecture built of similar computational elements may be beneficial. When studying such a complicated object as the brain, it is important to define at which level of detail one works and which questions one aims to answer. David Marr suggested three levels of analysis: 1. computational, analysing which problem the system solves; 2. algorithmic, questioning which representation the system uses and which computations it performs; 3. implementational, finding how such computations are performed by neurons in the brain. In this thesis we stay at the first two levels, seeking the basic representation of motor output. In this work we present a new model of motor primitives that comprises multiple interacting latent dynamical systems, and give it a full Bayesian treatment. Modelling within the Bayesian framework, in my opinion, must become the new standard in hypothesis testing in neuroscience. Only the Bayesian framework gives us guarantees when dealing with the inevitable plethora of hidden variables and uncertainty. The special type of coupling of dynamical systems we propose, based on the Product of Experts, has many natural interpretations in the Bayesian framework. If the dynamical systems run in parallel, it yields Bayesian cue integration. If they are organized hierarchically due to serial coupling, we get hierarchical priors over the dynamics. If one of the dynamical systems represents a sensory state, we arrive at sensory-motor primitives. The compact representation that follows from the variational treatment allows learning a library of motor primitives. Once primitives are learned separately, a combined motion can be represented as a matrix of coupling values. We performed a set of experiments to compare different models of motor primitives. In a series of two-alternative forced choice (2AFC) experiments, participants discriminated natural and synthesised movements, thus running a graphics Turing test. When available, the Bayesian model score predicted the naturalness of the perceived movements. For simple movements, like walking, Bayesian model comparison and psychophysics tests indicate that one dynamical system is sufficient to describe the data. For more complex movements, like walking and waving, motion can be better represented as a set of coupled dynamical systems.
We also experimentally confirmed that a Bayesian treatment of model learning on motion data is superior to a simple point estimate of the latent parameters. Experiments with non-periodic movements show that they do not benefit from more complex latent dynamics, despite having high kinematic complexity. By having fully Bayesian models, we could quantitatively disentangle the influence of motion dynamics and pose on the perception of naturalness. We confirmed that rich and correct dynamics are more important than the kinematic representation. There are numerous further directions of research. In the models we devised for multiple parts, even though the latent dynamics was factorized over a set of interacting systems, the kinematic parts were completely independent. Thus, interaction between the kinematic parts could be mediated only by the latent dynamics interactions. A more flexible model would allow dense interaction on the kinematic level too. Another important problem relates to the representation of time in Markov chains. Discrete-time Markov chains form an approximation to continuous dynamics. As the time step is assumed to be fixed, we face the problem of time step selection. Time is also not an explicit parameter in Markov chains, which prohibits explicit optimization of time as a parameter and reasoning (inference) about it. For example, in optimal control, boundary conditions are usually set at exact time points, which is not an ecological scenario, where time is usually itself a parameter of optimization. Making time an explicit parameter of the dynamics may alleviate this.
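    As a toy illustration of the Product of Experts coupling mentioned above (not the thesis's actual model), the sketch below combines Gaussian next-state predictions from two latent dynamical systems. Multiplying Gaussian densities yields a precision-weighted fusion of the experts' means, which is also the form Bayesian cue integration takes; all numbers are invented for illustration.

import numpy as np

def product_of_gaussian_experts(means, variances):
    """Combine per-expert Gaussian predictions of the next latent state.
    The product of Gaussian densities is again Gaussian, with precision
    equal to the sum of the expert precisions (precision-weighted fusion)."""
    means = np.asarray(means, dtype=float)
    precisions = 1.0 / np.asarray(variances, dtype=float)
    combined_precision = precisions.sum()
    combined_mean = (precisions * means).sum() / combined_precision
    return combined_mean, 1.0 / combined_precision

# Two coupled linear dynamical systems predicting the same 1-D latent state.
x = 0.5
mean1, var1 = 0.9 * x, 0.04   # expert 1: next-state mean and variance
mean2, var2 = 0.8 * x, 0.10   # expert 2: next-state mean and variance
mean, var = product_of_gaussian_experts([mean1, mean2], [var1, var2])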

    Numerical Modeling of Flexible Structures in Open Ocean Environment

    The dissertation presents advancements in the numerical modeling of offshore aquaculture and harbor protection structures in the open ocean environment. The advancements were implemented in the finite element software Hydro-FE, which expands the Morison equation approach previously incorporated in the Aqua-FE software developed at the University of New Hampshire. The concept of an equivalent dropper was introduced and validated on the example of a typical mussel longline design. Parametric studies of mussel dropper drag coefficients and bending stiffness contributions were performed for different environmental conditions. To model kelp aggregates in macroalgae aquaculture, a corresponding numerical technique was developed. The technique proposes a modified Morison-type approach calibrated in full-scale physical tow tank experiments conducted at the Hydromechanics Laboratory of the United States Naval Academy. In addition to the numerical modeling techniques, an advanced methodology for multidimensional approximation of the current velocity fields around offshore installations was proposed. The methodology was applied to model the response of a kelp farm using tidally driven acoustic Doppler current profiler measurements. Finally, a numerical model of a floating protective barrier was built in the Hydro-FE software to evaluate its seaworthiness. The model was validated by comparison to measurements obtained in scaled physical wave tank tests and field deployments.
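    The Morison-type approach referred to above can be illustrated with the standard in-line force formula for a slender circular element: a quadratic drag term plus an inertia term proportional to the fluid acceleration. The sketch below is the generic textbook form with illustrative coefficient values; it is not the Hydro-FE implementation or its equivalent-dropper calibration.

import numpy as np

def morison_force_per_length(u, du_dt, D, rho=1025.0, Cd=1.2, Cm=2.0):
    """Morison-type in-line force per unit length on a circular element:
    drag proportional to |u|*u plus inertia proportional to the fluid
    acceleration. Density and coefficient values here are illustrative."""
    drag = 0.5 * rho * Cd * D * np.abs(u) * u
    inertia = rho * Cm * (np.pi * D**2 / 4.0) * du_dt
    return drag + inertia

# Example: force per unit length on a 10 mm dropper element in a steady 0.5 m/s current.
f = morison_force_per_length(u=0.5, du_dt=0.0, D=0.01)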

    Medical Image Registration Using Deep Neural Networks

    Registration is a fundamental problem in medical image analysis wherein images are transformed spatially to align corresponding anatomical structures in each image. Recently, the development of learning-based methods, which exploit deep neural networks and can outperform classical iterative methods, has received considerable interest from the research community. This interest is due in part to the substantially reduced computational requirements that learning-based methods have during inference, which makes them particularly well-suited to real-time registration applications. Despite these successes, learning-based methods can perform poorly when applied to images from different modalities, where intensity characteristics can vary greatly, such as in magnetic resonance and ultrasound imaging. Moreover, registration performance is often demonstrated on well-curated datasets that closely match the distribution of the training data. This makes it difficult to determine whether the demonstrated performance accurately represents the generalization and robustness required for clinical use. This thesis presents learning-based methods which address the aforementioned difficulties by utilizing intuitive point-set-based representations, user interaction and meta-learning-based training strategies. Primarily, this is demonstrated with a focus on the non-rigid registration of 3D magnetic resonance imaging to sparse 2D transrectal ultrasound images to assist in the delivery of targeted prostate biopsies. While conventional systematic prostate biopsy methods can require many samples to be taken to confidently produce a diagnosis, tumor-targeted approaches have shown improved patient, diagnostic, and disease management outcomes with fewer samples. However, the available intraoperative transrectal ultrasound imaging alone is insufficient for accurate targeted guidance. As such, this exemplar application is used to illustrate the effectiveness of sparse, interactively-acquired ultrasound imaging for real-time, interventional registration. The presented methods are found to improve registration accuracy relative to the state of the art, with substantially lower computation time, while requiring a fraction of the data at inference. As a result, these methods are particularly attractive given their potential for real-time registration in interventional applications.
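    As a small, self-contained illustration of the point-set-based representations mentioned above, the sketch below computes a classical least-squares rigid alignment (Kabsch/Procrustes) between two corresponded point sets. The thesis's methods are learning-based and non-rigid, so this is only the kind of geometric building block such representations make convenient, not the proposed method.

import numpy as np

def rigid_align(source, target):
    """Least-squares rigid alignment of two corresponded (N, 3) point sets;
    returns rotation R and translation t so that target ~ source @ R.T + t."""
    mu_s, mu_t = source.mean(0), target.mean(0)
    S, T = source - mu_s, target - mu_t
    U, _, Vt = np.linalg.svd(S.T @ T)              # cross-covariance SVD
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_t - R @ mu_s
    return R, t

# Example: recover a known rotation and translation from corresponded points.
rng = np.random.default_rng(0)
src = rng.normal(size=(100, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
tgt = src @ R_true.T + np.array([1.0, -2.0, 0.5])
R, t = rigid_align(src, tgt)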

    Advances in Image Processing, Analysis and Recognition Technology

    For many decades, researchers have been trying to make computer analysis of images as effective as human vision. For this purpose, many algorithms and systems have been created. The whole process covers various stages, including image processing, representation and recognition. The results of this work can be applied to many computer-assisted areas of everyday life. They improve particular activities and provide handy tools, which are sometimes only for entertainment but quite often significantly increase our safety. Indeed, the range of practical applications of image processing algorithms is particularly wide. Moreover, the rapid growth in computational power and efficiency has allowed for the development of more sophisticated and effective algorithms and tools. Although significant progress has been made so far, many issues remain, resulting in the need for the development of novel approaches.

    A systematic review of application of multi-criteria decision analysis for aging-dam management

    Decision making for aging-dam management requires a transparent process to prevent dam failure and thus avoid severe socio-economic and environmental consequences. Multiple criteria analysis arose to model complex problems like this. This paper reviews specific problems, applications and Multi-Criteria Decision Making techniques for dam management. Multi-Attribute Decision Making techniques had a major presence under the single approach, especially the Analytic Hierarchy Process, and its combination with the Technique for Order of Preference by Similarity to Ideal Solution was prominent under the hybrid approach, while a wide variety of complementary techniques was identified. Growing hybridization and fuzzification are the two most relevant trends observed. The integration of stakeholders within the decision making process and the inclusion of trade-offs and interactions between components within the evaluation model must receive deeper exploration. Despite the progressive consolidation of Multi-Criteria Decision Making in dam management, further research is required to differentiate between rational and intuitive decision processes. Additionally, the need to address benefits, opportunities, costs and risks related to repair, upgrading or removal measures in aging dams suggests the Analytic Network Process, not yet explored under this approach, as an interesting path worth investigating. This research was funded by the Spanish Ministry of Economy and Competitiveness along with FEDER funding (Projects BIA201456574-R and ECO2015-66673-R). Zamarrón-Mieza, I.; Yepes, V.; Moreno-Jiménez, J.M. (2017). A systematic review of application of multi-criteria decision analysis for aging-dam management. Journal of Cleaner Production, 147:217-230. https://doi.org/10.1016/j.jclepro.2017.01.092
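    For readers unfamiliar with the techniques surveyed, the sketch below shows a minimal TOPSIS ranking of hypothetical repair alternatives for an aging dam. The criteria, weights and scores are invented for illustration and are not taken from the reviewed studies.

import numpy as np

def topsis(decision_matrix, weights, benefit):
    """Minimal TOPSIS: alternatives (rows) scored against criteria (columns).
    'benefit' marks criteria where larger values are better."""
    X = np.asarray(decision_matrix, dtype=float)
    V = (X / np.linalg.norm(X, axis=0)) * weights      # normalize and weight
    ideal = np.where(benefit, V.max(0), V.min(0))      # best value per criterion
    anti = np.where(benefit, V.min(0), V.max(0))       # worst value per criterion
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)                     # closeness to ideal, higher is better

# Three hypothetical alternatives (repair, upgrade, removal) scored on
# cost (lower is better), safety gain and environmental benefit (higher is better).
scores = topsis([[4.0, 7.0, 5.0],
                 [6.0, 9.0, 6.0],
                 [3.0, 5.0, 8.0]],
                weights=[0.5, 0.3, 0.2],
                benefit=[False, True, True])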

    Applied probabilistic forecasting

    In any actual forecast, the future evolution of the system is uncertain and the forecasting model is mathematically imperfect. Both ontic uncertainty about the future (due to true stochasticity) and epistemic uncertainty of the model (reflecting structural imperfections) complicate the construction and evaluation of probabilistic forecasts. In almost all nonlinear forecast models, the evolution of uncertainty in time is not tractable analytically, and Monte Carlo approaches ("ensemble forecasting") are widely used. This thesis advances our understanding of the construction of forecast densities from ensembles, the evolution of the resulting probability forecasts and methods of establishing skill (benchmarks). A novel method of partially correcting the model error is introduced and shown to outperform a competitive approach. The properties of kernel dressing, a method of transforming ensembles into probability density functions, are investigated and the convergence of the approach is illustrated. A connection between forecasting and information theory is examined by demonstrating that kernel dressing via minimization of Ignorance implicitly leads to minimization of the Kullback-Leibler divergence. The Ignorance score is critically examined in the context of other information-theoretic measures. The method of Dynamic Climatology is introduced as a new approach to establishing skill (benchmarking). Dynamic Climatology is a new, relatively simple, nearest-neighbor-based model shown to be of value in benchmarking the global circulation models of the ENSEMBLES project. ENSEMBLES is a project funded by the European Union bringing together all major European weather forecasting institutions in order to develop and test state-of-the-art seasonal weather forecasting models. By benchmarking the seasonal forecasts of the ENSEMBLES models, we demonstrate that Dynamic Climatology can help us better understand the value and forecasting performance of large-scale circulation models. Lastly, a new approach to correcting (improving) an imperfect model is presented, an idea inspired by [63]. The main idea is based on a two-stage procedure in which a second-stage ‘corrective’ model iteratively corrects systematic parts of the forecasting errors produced by a first-stage ‘core’ model. The corrector is of an iterative nature, so that at a given time t the core model forecast is corrected and then used as an input into the next iteration of the core model to generate a time t + 1 forecast. Using two nonlinear systems, we demonstrate that the iterative corrector is superior to alternative approaches based on direct (non-iterative) forecasts. While the choice of the corrector model class is flexible, we use radial basis functions. Radial basis functions are frequently used in statistical learning and surface approximation and involve a number of computational aspects which we discuss in some detail.
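    The sketch below illustrates kernel dressing and the Ignorance score in their simplest form: each ensemble member is dressed with a Gaussian kernel, the resulting density is evaluated at the verifying outcome, and Ignorance is the negative base-2 log of that density. The affine offset and scaling of the ensemble and the archive-based fitting of the kernel width are omitted, so this is a simplified reading of the method, not the thesis's full procedure.

import numpy as np

def kernel_dressed_density(ensemble, x, sigma):
    """Kernel dressing: place a Gaussian kernel of width sigma on each
    ensemble member and average, turning a finite ensemble into a
    forecast density evaluated at x."""
    ensemble = np.asarray(ensemble, dtype=float)
    z = (x - ensemble) / sigma
    return np.mean(np.exp(-0.5 * z**2) / (sigma * np.sqrt(2 * np.pi)))

def ignorance(ensemble, outcome, sigma):
    """Ignorance score: -log2 of the forecast density at the outcome
    (lower is better). In practice sigma would be chosen by minimising
    mean Ignorance over a training archive."""
    return -np.log2(kernel_dressed_density(ensemble, outcome, sigma))

# Example: a 24-member ensemble dressed with kernel width 0.5, scored
# against an observed outcome of 1.2 (all numbers are illustrative).
ens = np.random.default_rng(1).normal(loc=1.0, scale=0.6, size=24)
score = ignorance(ens, outcome=1.2, sigma=0.5)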

    Machine Learning for Diagnosis of AD and Prediction of MCI Progression From Brain MRI Using Brain Anatomical Analysis Using Diffeomorphic Deformation.

    Background: With the growing momentum for the adoption of machine learning (ML) in the medical field, it is likely that reliance on ML for imaging will become routine over the next few years. We have developed a software package named BAAD, which uses ML algorithms for the diagnosis of Alzheimer's disease (AD) and prediction of mild cognitive impairment (MCI) progression. Methods: We constructed an algorithm by combining a support vector machine (SVM) for classification with voxel-based morphometry (VBM) to reduce the number of variables. We grouped progressive MCI and AD as an AD spectrum and trained the SVM according to this classification. We randomly selected half of the total 1,314 subjects of the Alzheimer's Disease Neuroimaging Initiative (ADNI) from North America for SVM training, and the remaining half were used for validation to fine-tune the model hyperparameters. We created two types of SVMs, one based solely on brain structure (SVMst), and the other based on both brain structure and the Mini-Mental State Examination score (SVMcog). We compared the model performance with two expert neuroradiologists, and further evaluated it in test datasets involving 519, 592, 69, and 128 subjects from the Australian Imaging, Biomarker & Lifestyle Flagship Study of Aging (AIBL), Japanese ADNI, the Minimal Interval Resonance Imaging in AD (MIRIAD) and the Open Access Series of Imaging Studies (OASIS), respectively. Results: BAAD's SVMs outperformed radiologists for AD diagnosis in a structural magnetic resonance imaging review. The accuracy of the two radiologists was 57.5 and 70.0%, respectively, whereas that of the SVMst was 90.5%. The diagnostic accuracy of the SVMst and SVMcog in the test datasets ranged from 88.0 to 97.1% and 92.5 to 100%, respectively. The prediction accuracy for MCI progression was 83.0% for SVMst and 85.0% for SVMcog. In the AD spectrum classified by SVMst, 87.1% of the subjects were Aβ positive according to AV-45 positron emission tomography. Similarly, among MCI patients classified into the AD spectrum, 89.5% of the subjects progressed to AD. Conclusion: Our ML software has shown high performance in AD diagnosis and prediction of MCI progression. It outperformed expert radiologists, and is expected to provide support in clinical practice.
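    The generic shape of the pipeline described above, dimensionality reduction of voxel-wise features followed by an SVM classifier, can be sketched as below. PCA merely stands in for the VBM-based variable reduction, and the random data, split and hyperparameters are placeholders rather than the actual BAAD implementation.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5000))    # stand-in for voxel-wise gray-matter features
y = rng.integers(0, 2, size=200)    # 1 = AD spectrum, 0 = control (synthetic labels)

# Reduce the feature space, then classify with an RBF-kernel SVM.
model = make_pipeline(StandardScaler(), PCA(n_components=50), SVC(kernel="rbf", C=1.0))
model.fit(X[:100], y[:100])         # half for training, half held out
accuracy = model.score(X[100:], y[100:])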