Geodesic Information Flows: Spatially-Variant Graphs and Their Application to Segmentation and Fusion
Clinical annotations, such as voxel-wise binary or probabilistic tissue segmentations, structural parcellations, pathological regions-of-interest and anatomical landmarks, are key to many clinical studies. However, due to the time-consuming nature of manually generating these annotations, they tend to be scarce and limited to small subsets of data. This work explores a novel framework to propagate voxel-wise annotations between morphologically dissimilar images by diffusing and mapping the available examples through intermediate steps. A spatially-variant graph structure connecting morphologically similar subjects is introduced over a database of images, enabling the gradual diffusion of information to all the subjects, even in the presence of large-scale morphological variability. We illustrate the utility of the proposed framework on two example applications: brain parcellation using categorical labels and tissue segmentation using probabilistic features. The application of the proposed method to categorical label fusion showed highly statistically significant improvements when compared to state-of-the-art methodologies. Significant improvements were also observed when applying the proposed framework to probabilistic tissue segmentation of both synthetic and real data, mainly in the presence of large morphological variability.
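A hedged sketch of the diffusion idea described above (not the authors' implementation; the toy feature vectors, neighbourhood size `k`, and Gaussian-style weighting are illustrative assumptions): labels spread stepwise over a graph connecting morphologically similar subjects, so they reach dissimilar subjects through intermediates.

```python
import numpy as np

# Illustrative sketch only: labels diffuse over a k-nearest-neighbour
# graph built from toy "morphology" feature vectors. Every unlabelled
# subject takes the weighted mean of its already-labelled neighbours,
# so annotations propagate through intermediate subjects.

rng = np.random.default_rng(1)
features = rng.normal(size=(6, 3))   # 6 subjects, made-up morphology vectors
labels = np.full(6, np.nan)
labels[0] = 1.0                      # only subject 0 starts annotated

# Symmetric k-NN graph on morphological distance (k = 2 here)
d = np.linalg.norm(features[:, None] - features[None, :], axis=2)
np.fill_diagonal(d, np.inf)
k = 2
W = np.zeros((6, 6))
for i in range(6):
    for j in np.argsort(d[i])[:k]:
        W[i, j] = W[j, i] = np.exp(-d[i, j])

# Iterative diffusion: information spreads one graph hop per sweep.
for _ in range(6):
    for i in range(6):
        if np.isnan(labels[i]):
            nb = ~np.isnan(labels) & (W[i] > 0)
            if nb.any():
                labels[i] = np.average(labels[nb], weights=W[i, nb])
```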
DeepSphere: Efficient spherical Convolutional Neural Network with HEALPix sampling for cosmological applications
Convolutional Neural Networks (CNNs) are a cornerstone of the Deep Learning
toolbox and have led to many breakthroughs in Artificial Intelligence. These
networks have mostly been developed for regular Euclidean domains such as those
supporting images, audio, or video. Because of their success, CNN-based methods
are becoming increasingly popular in Cosmology. Cosmological data often comes
as spherical maps, which make the use of traditional CNNs more complicated.
The commonly used pixelization scheme for spherical maps is the Hierarchical
Equal Area isoLatitude Pixelisation (HEALPix). We present a spherical CNN for
analysis of full and partial HEALPix maps, which we call DeepSphere. The
spherical CNN is constructed by representing the sphere as a graph. Graphs are
versatile data structures that can act as a discrete representation of a
continuous manifold. Using the graph-based representation, we define many of
the standard CNN operations, such as convolution and pooling. With filters
restricted to being radial, our convolutions are equivariant to rotation on the
sphere, and DeepSphere can be made invariant or equivariant to rotation. This
way, DeepSphere is a special case of a graph CNN, tailored to the HEALPix
sampling of the sphere. This approach is computationally more efficient than
using spherical harmonics to perform convolutions. We demonstrate the method on
a classification problem of weak lensing mass maps from two cosmological models
and compare the performance of the CNN with that of two baseline classifiers.
The results show that the performance of DeepSphere is always superior or equal
to both of these baselines. For high noise levels and for data covering only a
smaller fraction of the sphere, DeepSphere achieves typically 10% better
classification accuracy than those baselines. Finally, we show how learned
filters can be visualized to introspect the neural network.
Comment: arXiv admin note: text overlap with arXiv:astro-ph/0409513 by other authors.
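The graph convolution underlying DeepSphere can be sketched generically: a radial filter is a low-order polynomial in the rescaled graph Laplacian, evaluated with the Chebyshev recursion so that only sparse matrix-vector products are needed. In the sketch below, a tiny ring graph stands in for the actual HEALPix pixel graph; the graph, filter order, and coefficients are purely illustrative.

```python
import numpy as np

# Sketch of polynomial (Chebyshev) graph filtering:
#   y = sum_k theta_k T_k(L~) x
# where L~ is the Laplacian rescaled to spectrum [-1, 1].
# The ring graph below is a stand-in for the HEALPix pixel graph.

n = 8
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0   # ring adjacency

deg = A.sum(axis=1)
L = np.diag(deg) - A                    # combinatorial Laplacian
lmax = np.linalg.eigvalsh(L).max()
L_hat = 2.0 * L / lmax - np.eye(n)      # rescale spectrum to [-1, 1]

def cheb_conv(x, theta):
    """Filter signal x with Chebyshev coefficients theta (len >= 2)."""
    t_prev, t_curr = x, L_hat @ x       # T_0(L~) x, T_1(L~) x
    out = theta[0] * t_prev + theta[1] * t_curr
    for k in range(2, len(theta)):
        t_next = 2.0 * L_hat @ t_curr - t_prev   # Chebyshev recursion
        out = out + theta[k] * t_next
        t_prev, t_curr = t_curr, t_next
    return out

x = np.random.default_rng(0).normal(size=n)   # signal on the pixels
y = cheb_conv(x, theta=[0.5, 0.3, 0.2])       # order-2 radial filter
```

Because only products with the (sparse, local) Laplacian appear, the cost scales with the number of edges, which is the efficiency advantage over spherical-harmonic convolutions mentioned in the abstract.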
Maximal entropy random walk in community finding
The aim of this paper is to check the feasibility of using the maximal-entropy
random walk in algorithms finding communities in complex networks. A number of
such algorithms exploit an ordinary or a biased random walk for this purpose.
Their key part is a (dis)similarity matrix, according to which nodes are
grouped. This study encompasses the use of the stochastic matrix of a random
walk, its mean first-passage time matrix, and a matrix of weighted path counts.
We briefly indicate the connection between those quantities and propose
substituting the maximal-entropy random walk for the previously chosen models.
This unique random walk maximises the entropy of ensembles of paths of given
length and endpoints, which results in equiprobability of those paths. We
compare the performance of the selected algorithms on LFR benchmark graphs. The
results show that the change in performance depends very strongly on the
particular algorithm, and can lead to slight improvements as well as
significant deterioration.
Comment: 7 pages, 4 figures, submitted to European Physical Journal Special Topics following the 4th Conference on Statistical Physics: Modern Trends and Applications, July 3-6, 2012, Lviv, Ukraine.
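The maximal-entropy random walk has a well-known closed form in terms of the leading eigenpair (λ, ψ) of the adjacency matrix: P_ij = A_ij ψ_j / (λ ψ_i), with stationary distribution π_i ∝ ψ_i². A minimal sketch (the 4-node path graph is an arbitrary example, not from the paper):

```python
import numpy as np

# Maximal-entropy random walk (MERW) on a small undirected graph.
# P[i, j] = A[i, j] * psi[j] / (lam * psi[i]), where (lam, psi) is
# the leading eigenpair of the adjacency matrix A.

A = np.array([          # adjacency matrix of a 4-node path graph
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

eigvals, eigvecs = np.linalg.eigh(A)   # eigenvalues in ascending order
lam = eigvals[-1]                      # largest (Perron) eigenvalue
psi = np.abs(eigvecs[:, -1])           # Perron eigenvector, made positive

P = A * psi[None, :] / (lam * psi[:, None])

# Rows sum to 1 because (A @ psi)_i = lam * psi_i.
row_sums = P.sum(axis=1)

# Stationary distribution of MERW: pi_i = psi_i^2 (normalised).
pi = psi**2 / np.sum(psi**2)
```

All paths of a given length between fixed endpoints are equiprobable under this walk, which is the maximal-entropy property the abstract refers to.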
Statistical equilibrium of tetrahedra from maximum entropy principle
Discrete formulations of (quantum) gravity in four spacetime dimensions build
space out of tetrahedra. We investigate a statistical mechanical system of
tetrahedra from a many-body point of view based on non-local, combinatorial
gluing constraints that are modelled as multi-particle interactions. We focus
on Gibbs equilibrium states, constructed using Jaynes' principle of constrained
maximisation of entropy, which has been shown recently to play an important
role in characterising equilibrium in background independent systems. We apply
this principle first to classical systems of many tetrahedra using different
examples of geometrically motivated constraints. Then for a system of quantum
tetrahedra, we show that the quantum statistical partition function of a Gibbs
state with respect to some constraint operator can be reinterpreted as a
partition function for a quantum field theory of tetrahedra, taking the form of
a group field theory.
Comment: v3 published version; v2 18 pages, 4 figures, improved text in sections IIIC & IVB, minor changes elsewhere.
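Jaynes' construction mentioned above has a simple generic form: maximising Shannon entropy subject to a fixed mean of an observable E yields the Gibbs distribution p_i ∝ exp(−β E_i), with β the Lagrange multiplier enforcing the constraint. The sketch below (toy observable values and target mean, not the paper's tetrahedron model) solves for β numerically:

```python
import numpy as np

# Generic illustration of Jaynes' maximum-entropy principle:
# max entropy subject to <E> = target  =>  p_i ∝ exp(-beta * E_i).
# The observable values and target below are made up.

E = np.array([0.0, 1.0, 2.0, 3.0])   # observable per microstate
target = 1.2                          # constrained mean <E>

def mean_E(beta):
    w = np.exp(-beta * E)
    p = w / w.sum()
    return p @ E

# <E>(beta) decreases monotonically in beta, so bisection finds the
# Lagrange multiplier matching the constraint.
lo, hi = -50.0, 50.0                  # bracket: <E>(lo) > target > <E>(hi)
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if mean_E(mid) > target:
        lo = mid
    else:
        hi = mid
beta = 0.5 * (lo + hi)

p = np.exp(-beta * E)
p /= p.sum()                          # the Gibbs state
```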
Depth-based Hypergraph Complexity Traces from Directed Line Graphs
In this paper, we aim to characterize the structure of hypergraphs in terms of a structural complexity measure. Measuring the complexity of a hypergraph in a straightforward way tends to be elusive, since the hyperedges of a hypergraph may exhibit varying relational orders. We thus transform a hypergraph into a line graph, which not only accurately reflects the multiple relationships exhibited by the hyperedges but is also easier to manipulate for complexity analysis. To locate the dominant substructure within a line graph, we identify a centroid vertex by computing the minimum variance of its shortest path lengths. A family of centroid expansion subgraphs of the line graph is then derived from the centroid vertex. We compute the depth-based complexity traces for the hypergraph by measuring either the directed or undirected entropies of its centroid expansion subgraphs. The resulting complexity traces provide a flexible framework that can be applied to both hypergraphs and graphs. We perform (hyper)graph classification in the principal component space of the complexity trace vectors. Experiments on (hyper)graph datasets abstracted from bioinformatics and computer vision data demonstrate the effectiveness and efficiency of the complexity traces.
This work is supported by the National Natural Science Foundation of China (Grant no. 61503422) and by the Open Projects Program of the National Laboratory of Pattern Recognition. Francisco Escolano is supported by project TIN2012-32839 of the Spanish Government. Edwin R. Hancock is supported by a Royal Society Wolfson Research Merit Award.
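The centroid-vertex step described in the abstract can be sketched as follows (the small undirected graph is a made-up example, not a directed line graph derived from a hypergraph): the centroid is the vertex whose shortest-path lengths to all other vertices have minimum variance.

```python
from collections import deque

# Sketch of centroid-vertex selection: for each vertex, compute
# shortest-path lengths to all others by BFS, then pick the vertex
# whose length distribution has minimum variance.

graph = {  # adjacency lists of an illustrative 6-vertex graph
    0: [1, 2], 1: [0, 3], 2: [0, 3],
    3: [1, 2, 4], 4: [3, 5], 5: [4],
}

def bfs_dists(src):
    """Shortest-path lengths from src to every vertex (unweighted BFS)."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in graph[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return [dist[v] for v in graph]

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

centroid = min(graph, key=lambda v: variance(bfs_dists(v)))
```

For this example the central degree-3 vertex (vertex 3) wins; the expansion subgraphs of the paper would then be grown outward from it layer by layer.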
A new approach to pointwise heat kernel upper bounds on doubling metric measure spaces
On doubling metric measure spaces endowed with a strongly local regular
Dirichlet form, we show some characterisations of pointwise upper bounds of the
heat kernel in terms of global scale-invariant inequalities that correspond
respectively to the Nash inequality and to a Gagliardo-Nirenberg type
inequality when the volume growth is polynomial. This yields a new proof and a
generalisation of the well-known equivalence between classical heat kernel
upper bounds and relative Faber-Krahn inequalities or localized Sobolev or Nash
inequalities. We are able to treat more general pointwise estimates, where the
heat kernel rate of decay is not necessarily governed by the volume growth. A
crucial role is played by the finite propagation speed property for the
associated wave equation, and our main result holds for an abstract semigroup
of operators satisfying the Davies-Gaffney estimates.
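For readers unfamiliar with the classical case this abstract generalises: when the volume growth is polynomial of exponent ν, the equivalence takes the following familiar form (constants C, C′ and the exponent ν are generic, and 𝓔 denotes the Dirichlet form):

```latex
% Nash inequality for f in the Dirichlet domain:
\|f\|_2^{\,2+4/\nu} \;\le\; C\,\mathcal{E}(f,f)\,\|f\|_1^{4/\nu},
% which is equivalent to the on-diagonal heat kernel upper bound
p_t(x,y) \;\le\; C'\, t^{-\nu/2}, \qquad t > 0 .
```

The paper's point is that analogous scale-invariant functional inequalities characterise heat kernel upper bounds even when the decay rate is not governed by the volume growth.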
3D geological models and their hydrogeological applications: supporting urban development: a case study in Glasgow-Clyde, UK
Urban planners and developers in some parts of the United Kingdom can now access geodata in an easy-to-retrieve and understandable format. 3D attributed geological framework models and associated GIS outputs, developed by the British Geological Survey (BGS), provide a predictive tool for planning site investigations for some of the UK's largest regeneration projects in the Thames and Clyde River catchments.
Using the 3D models, planners can get a 3D preview of properties of the subsurface using virtual cross-section and borehole tools in visualisation software, allowing critical decisions to be made before any expensive site investigation takes place, and potentially saving time and money. 3D models can integrate artificial and superficial deposits and bedrock geology, and can be used for recognition of major resources (such as water, thermal and sand and gravel), for example in buried valleys, groundwater modelling and assessing impacts of underground mining. A preliminary groundwater recharge and flow model for a pilot area in Glasgow has been developed using the 3D geological models as a framework.
This paper focuses on the River Clyde and the Glasgow conurbation, and in particular on the BGS's Clyde Urban Super-Project (CUSP), which supports major regeneration projects in and around the City of Glasgow in the West of Scotland.