Medical image registration using Edgeworth-based approximation of Mutual Information
We propose a new similarity measure for iconic medical image registration: an Edgeworth-based third-order approximation of Mutual Information (MI), termed 3-EMI. Contrary to classical Edgeworth-based MI approximations, such as those proposed for independent component analysis, the 3-EMI measure is able to deal with potentially correlated variables. The performance of 3-EMI is evaluated and compared with the Gaussian and B-spline kernel-based estimates of MI, and the validation proceeds in three steps. First, we compare the intrinsic behavior of the measures as a function of the number of samples and of the variance of an additive Gaussian noise. Then, they are evaluated in the context of multimodal rigid registration, using the RIRE data. We finally validate the use of our measure in the context of thoracic monomodal non-rigid registration, using the database proposed during the MICCAI EMPIRE10 challenge. The results show the wide range of clinical applications for which our measure can perform, including non-rigid registration, which remains a challenging problem. They also demonstrate that 3-EMI outperforms classical estimates of MI for a low number of samples or a strong additive Gaussian noise. More generally, our measure gives competitive registration results with a much lower numerical complexity than classical estimators such as the reference B-spline kernel estimator, which makes 3-EMI a good candidate for fast and accurate registration tasks.
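The exact weighting of the cumulant terms in 3-EMI is defined in the paper and is not reproduced here; as a rough, hypothetical sketch of the idea, an Edgeworth-style expansion of MI starts from the Gaussian MI implied by the sample correlation and adds corrections built from third-order cumulants (function names and the specific correction term below are illustrative only):

```python
import numpy as np

def gaussian_mi(x, y):
    """Leading (second-order) term of an Edgeworth-style MI expansion:
    the mutual information of a bivariate Gaussian with the same
    correlation coefficient as the data, in nats."""
    rho = np.corrcoef(x, y)[0, 1]
    return -0.5 * np.log(1.0 - rho ** 2)

def third_order_correction(x, y):
    """Illustrative non-negative correction built from standardised
    third-order joint cumulants; the actual weighting used by 3-EMI
    differs and is given in the paper."""
    xs = (x - x.mean()) / x.std()
    ys = (y - y.mean()) / y.std()
    # standardised third-order joint moments k_30, k_21, k_12, k_03
    k30, k21 = np.mean(xs ** 3), np.mean(xs ** 2 * ys)
    k12, k03 = np.mean(xs * ys ** 2), np.mean(ys ** 3)
    return (k30**2 + 3*k21**2 + 3*k12**2 + k03**2) / 12.0
```

For exactly Gaussian data the correction terms vanish in expectation and the leading term alone recovers the true MI.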
BMICA-independent component analysis based on B-spline mutual information estimator
The information-theoretic concept of mutual information provides a general framework for evaluating dependencies between variables. Its B-spline estimation, however, has not previously been used to build an approach to Independent Component Analysis. In this paper we present a B-spline estimator of mutual information to find the independent components in mixed signals. Tested on electroencephalography (EEG) signals, the resulting BMICA (B-Spline Mutual Information Independent Component Analysis) exhibits better performance than the standard Independent Component Analysis algorithms FastICA, JADE, SOBI and EFICA in similar simulations. BMICA was also found to be more reliable than the renowned FastICA.
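As a hedged sketch of the underlying estimator family (here with linear, order-2 B-splines and generic parameter choices, not necessarily the configuration used by BMICA), mutual information can be estimated from B-spline-smoothed histograms, where each sample contributes fractionally to neighbouring bins instead of falling into exactly one:

```python
import numpy as np

def bspline_weights(x, n_bins):
    """Soft bin memberships using linear (order-2) B-splines: each
    sample is split fractionally between its two nearest bins.
    Higher-order splines would spread mass over more bins."""
    t = (x - x.min()) / (x.max() - x.min() + 1e-12) * (n_bins - 1)
    lo = np.clip(np.floor(t).astype(int), 0, n_bins - 2)
    frac = t - lo
    w = np.zeros((len(x), n_bins))
    w[np.arange(len(x)), lo] = 1.0 - frac
    w[np.arange(len(x)), lo + 1] = frac
    return w

def bspline_mi(x, y, n_bins=10):
    """MI in nats from B-spline-smoothed marginal and joint histograms."""
    wx, wy = bspline_weights(x, n_bins), bspline_weights(y, n_bins)
    px, py = wx.mean(0), wy.mean(0)
    pxy = wx.T @ wy / len(x)              # soft joint histogram
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])))
```

The soft binning reduces the quantisation artifacts of hard histograms, which is the motivation for using a B-spline estimator inside an ICA contrast function.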
Ordering structures in vector optimization and applications in medical engineering
This manuscript is on the theory and numerical procedures of vector optimization w.r.t. various ordering structures, on recent developments in this area and, most important, on their application to medical engineering.
In vector optimization one considers optimization problems with a vector-valued objective map and thus one has to compare elements in a linear space. If the linear space is the finite dimensional space R^m this can be done componentwise. That corresponds to the notion of an Edgeworth-Pareto-optimal solution of a multiobjective optimization problem. Among the multitude of applications which can be modeled by such a multiobjective optimization problem, we present an application in intensity modulated radiation therapy and its solution by a numerical procedure.
In case the linear space is arbitrary, maybe infinite dimensional, one may introduce a partial ordering which defines how elements are compared. Such problems arise for instance in magnetic resonance tomography where the number
of Hermitian matrices which have to be considered for a control of the maximum local specific absorption rate can be reduced by applying procedures from vector optimization. In addition to a short introduction and the application problem, we present a numerical solution method for solving such vector optimization problems. A partial ordering can be represented by a convex cone which describes the set of directions in which one assumes that the current values are deteriorated.
If one assumes that this set may vary dependently on the actually considered element in the linear space, one may replace the partial ordering by a variable ordering structure. This was for instance done in an application in medical
image registration. We present a possibility of how to model such variable ordering structures mathematically and how optimality can be defined in such a case. We also give a numerical solution method for the case of a finite set of alternatives
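For the finite-alternatives case under the usual componentwise ordering on R^m, Edgeworth-Pareto-optimal points can be filtered by pairwise dominance checks; a minimal sketch (minimisation with the fixed ordering cone R^m_+, not the variable-ordering method developed in the manuscript):

```python
import numpy as np

def pareto_optimal(F):
    """Return indices of Edgeworth-Pareto-optimal rows of F (minimisation):
    a row is kept unless some other row is <= componentwise and
    strictly < in at least one component."""
    F = np.asarray(F, dtype=float)
    keep = []
    for i, fi in enumerate(F):
        dominated = any(
            np.all(fj <= fi) and np.any(fj < fi)
            for j, fj in enumerate(F) if j != i
        )
        if not dominated:
            keep.append(i)
    return keep
```

For a variable ordering structure, the cone R^m_+ in the dominance test would be replaced by an element-dependent cone, which is the generalisation the manuscript's numerical method for a finite set of alternatives addresses.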
Demystifying Fixed k-Nearest Neighbor Information Estimators
Estimating mutual information from i.i.d. samples drawn from an unknown joint density function is a basic statistical problem of broad interest with multitudinous applications. The most popular estimator is the one proposed by Kraskov, Stögbauer and Grassberger (KSG) in 2004; it is nonparametric and based on the distances of each sample to its k-th nearest neighboring sample, where k is a fixed small integer. Despite its widespread use (it is part of scientific software packages), the theoretical properties of this estimator have been largely unexplored. In this paper we demonstrate that the estimator is consistent and also identify an upper bound on the rate of convergence of the bias as a function of the number of samples. We argue that the superior performance of the KSG estimator stems from a curious "correlation boosting" effect, and build on this intuition to modify the KSG estimator in novel ways to construct a superior estimator. As a byproduct of our investigations, we obtain nearly tight rates of convergence of the error of the well-known fixed k-nearest neighbor estimator of differential entropy by Kozachenko and Leonenko.
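A minimal, brute-force sketch of the KSG estimator for scalar variables (algorithm #1 from the 2004 paper, max-norm in the joint space, O(N^2) distance computation rather than the tree-based neighbour search used in practical implementations):

```python
import numpy as np

def digamma(x):
    """Digamma psi(x) for scalar x > 0, via the recurrence
    psi(x) = psi(x+1) - 1/x and an asymptotic series."""
    r = 0.0
    while x < 6:
        r -= 1.0 / x
        x += 1.0
    inv2 = 1.0 / (x * x)
    return r + np.log(x) - 0.5 / x - inv2 * (1/12 - inv2 * (1/120 - inv2 / 252))

def ksg_mi(x, y, k=5):
    """KSG estimator #1 of I(X;Y) in nats for 1-D samples x, y."""
    n = len(x)
    dx = np.abs(x[:, None] - x[None, :])
    dy = np.abs(y[:, None] - y[None, :])
    d = np.maximum(dx, dy)                       # max-norm joint distance
    np.fill_diagonal(d, np.inf)
    eps = np.partition(d, k - 1, axis=1)[:, k - 1]  # distance to k-th neighbour
    np.fill_diagonal(dx, np.inf)
    np.fill_diagonal(dy, np.inf)
    nx = (dx < eps[:, None]).sum(1)              # marginal neighbour counts
    ny = (dy < eps[:, None]).sum(1)
    psi = np.vectorize(digamma)
    return digamma(k) + digamma(n) - np.mean(psi(nx + 1) + psi(ny + 1))
```

Note how the same radius eps, fixed in the joint space, is reused for the marginal counts; the "correlation boosting" discussed in the paper concerns the resulting cancellation of errors between the joint and marginal terms.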
A Multivariate Approach to Functional Neuro Modeling
This Ph.D. thesis, A Multivariate Approach to Functional Neuro Modeling, deals with the analysis and modeling of data from functional neuro imaging experiments. A multivariate dataset description is provided which facilitates efficient representation of typical datasets and, more importantly, provides the basis for a generalization theoretical framework relating model performance to model complexity and dataset size. Briefly summarized, the major topics discussed in the thesis include: • An introduction of the representation of functional datasets by pairs of neuronal activity patterns and overall conditions governing the functional experiment, via associated micro- and macroscopic variables. The description facilitates an efficient microscopic re-representation, as well as a handle on the link between brain and behavior; the latter is obtained by hypothesizing variations in the micro- and macroscopic variables to be manifestations of an underlying system. • A review of two micros..
Exploring variability in medical imaging
Although recent successes of deep learning and novel machine learning techniques have improved the performance of classification and (anomaly) detection in computer vision problems, the application of these methods in medical imaging pipelines remains a very challenging task. One of the main reasons for this is the amount of variability that is encountered and encapsulated in human anatomy and subsequently reflected in medical images. This fundamental factor impacts most stages in modern medical image processing pipelines.
Variability of human anatomy makes it virtually impossible to build large datasets for each disease
with labels and annotation for fully supervised machine learning. An efficient way to cope with this is
to try and learn only from normal samples. Such data is much easier to collect. A case study of such
an automatic anomaly detection system based on normative learning is presented in this work. We
present a framework for detecting fetal cardiac anomalies during ultrasound screening using generative models, trained on data from normal/healthy subjects only.
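As a deliberately simplified illustration of the normative-learning principle (a Gaussian density stand-in for the generative models used in the thesis; the class name and threshold choice below are hypothetical):

```python
import numpy as np

class NormativeScorer:
    """Minimal stand-in for normative anomaly detection: fit a Gaussian
    to features of normal/healthy samples only, then flag test samples
    whose squared Mahalanobis distance exceeds a high percentile of the
    normal scores. Only the train-on-normals-only principle is shared
    with the thesis's generative-model approach."""

    def fit(self, X_normal, pct=99.0):
        self.mu = X_normal.mean(0)
        cov = np.cov(X_normal.T) + 1e-6 * np.eye(X_normal.shape[1])
        self.prec = np.linalg.inv(cov)
        self.thresh = np.percentile(self.score(X_normal), pct)
        return self

    def score(self, X):
        d = X - self.mu
        return np.einsum('ij,jk,ik->i', d, self.prec, d)  # squared Mahalanobis

    def is_anomaly(self, X):
        return self.score(X) > self.thresh
```

The appeal of this family of methods is exactly the one stated above: only easily collected normal data is needed at training time, and anomalies are defined as deviations from the learned normative model.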
However, despite the significant improvement in automatic abnormality detection systems, clinical routine continues to rely exclusively on the contribution of overburdened medical experts to diagnose and localise abnormalities. Integrating human expert knowledge into the medical imaging processing pipeline entails uncertainty which is mainly correlated with inter-observer variability. From the perspective of building an automated medical imaging system, it remains an open issue to what extent this kind of variability and the resulting uncertainty are introduced during the training of a model, and how they affect the final performance of the task. Consequently, it is very important to explore the effect of inter-observer variability both on the reliable estimation of a model's uncertainty and on the model's performance in a specific machine learning task. A thorough investigation of this issue is presented in this work by leveraging automated estimates of machine learning model uncertainty, inter-observer variability and segmentation task performance in lung CT scan images.
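Two of the quantities in this analysis can be sketched in a few lines: inter-observer agreement as the Dice overlap between annotators' segmentation masks, and model uncertainty as the voxelwise entropy of an ensemble's mean foreground probability (a common proxy; the thesis's exact uncertainty estimators may differ):

```python
import numpy as np

def dice(a, b):
    """Dice overlap between two binary masks, e.g. from two annotators;
    1.0 means perfect agreement."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

def predictive_entropy(probs):
    """Voxelwise binary entropy (nats) of the mean foreground probability
    over a stack of ensemble / MC-dropout predictions of shape
    (n_models, ...); higher entropy = higher predictive uncertainty."""
    p = np.clip(np.mean(probs, axis=0), 1e-7, 1 - 1e-7)
    return -(p * np.log(p) + (1 - p) * np.log(1 - p))
```

Correlating maps like these against each other and against task performance is one concrete way to quantify how annotation disagreement propagates into model uncertainty.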
Finally, an overview of the existing anomaly detection methods in medical imaging is presented. This state-of-the-art survey includes both conventional pattern recognition methods and deep learning based methods, and is one of the first literature surveys attempted in this specific research area.
The development of statistical theory in Britain, 1865-1925: a historical and sociological perspective
This thesis discusses the development of statistical theory
in Britain in the period 1865 to 1925, and attempts to
account for this development as an institutional and an
intellectual phenomenon. Close connections are shown to
have existed between statistical theory as a scientific
specialty and eugenics and social Darwinism, in particular
in the work of Francis Galton (1822 -1911) and Karl Pearson
(1857- 1936). An analysis of eugenics as a social and
political movement is presented, and it is argued that
eugenics played a major role in facilitating the institutional
growth of statistical theory as a field of study. Two
scientific controversies involving Karl Pearson and his
followers (with William Bateson and the early Mendelians,
and with George Udny Yule) are examined, and it is suggested
that these controversies might usefully be seen as generated
and sustained by divergent social interests. The development
of the theory of statistical inference in this period is discussed
briefly, and the early pioneering work of W.S. Gosset
('Student') and R.A. Fisher is surveyed. It is concluded that the generation and assessment of scientific
innovations by statisticians in this period must be seen as
fundamentally affected by social factors having their origins
both within science and in the wider society.