Biological control networks suggest the use of biomimetic sets for combinatorial therapies
Cells are regulated by networks of controllers having many targets, and
targets affected by many controllers, but these "many-to-many" combinatorial
control systems are poorly understood. Here we analyze distinct cellular
networks (transcription factors, microRNAs, and protein kinases) and a
drug-target network. Certain network properties seem universal across systems
and species, suggesting the existence of common control strategies in biology.
The number of controllers is ~8% of the number of targets, and the density of
links is 2.5% ± 1.2%. Links per node are predominantly exponentially distributed, implying
conservation of the average, which we explain using a mathematical model of
robustness in control networks. These findings suggest that optimal
pharmacological strategies may benefit from a similar, many-to-many
combinatorial structure, and molecular tools are available to test this
approach.

Comment: 33 pages
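The reported statistics can be made concrete with a small sketch. The network below is a random toy graph built to match the abstract's numbers (controllers ~8% of targets, link density ~2.5%), not the authors' data; note that a Bernoulli random graph gives Poisson-like rather than exponential degree distributions, so this only illustrates the density and degree bookkeeping:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical bipartite control network with the reported statistics:
# the number of controllers is ~8% of the number of targets, and the
# link density is ~2.5%.
n_targets = 1000
n_controllers = int(0.08 * n_targets)   # 80 controllers
density = 0.025

# Random controller-to-target adjacency matrix at that density.
A = rng.random((n_controllers, n_targets)) < density

# Links per controller (targets regulated) and per target (regulators).
links_per_controller = A.sum(axis=1)
links_per_target = A.sum(axis=0)

print(f"empirical density: {A.mean():.3f}")
print(f"mean links per controller: {links_per_controller.mean():.1f}")
print(f"mean links per target: {links_per_target.mean():.2f}")
```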
Fast Optimal Transport Averaging of Neuroimaging Data
Knowing how the human brain is anatomically and functionally organized at the
level of a group of healthy individuals or patients is the primary goal of
neuroimaging research. Yet computing an average of brain imaging data defined
over a voxel grid or a triangulation remains a challenge. Data are large, the
geometry of the brain is complex, and between-subject variability leads to
spatially or temporally non-overlapping effects of interest. To address the
problem of variability, data are commonly smoothed before group linear
averaging. In this work we build on ideas originally introduced by Kantorovich
to propose a new algorithm that can efficiently average non-normalized data
defined over arbitrary discrete domains using transportation metrics. We show
how Kantorovich means can be linked to Wasserstein barycenters in order to take
advantage of an entropic smoothing approach. This leads to a smooth convex
optimization problem and an algorithm with strong convergence guarantees. We
illustrate the versatility of this tool and its empirical behavior on
functional neuroimaging data, functional MRI and magnetoencephalography (MEG)
source estimates, defined on voxel grids and triangulations of the folded
cortical surface.

Comment: Information Processing in Medical Imaging (IPMI), Jun 2015, Isle of
Skye, United Kingdom. Springer, 201
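The entropic-smoothing idea behind Wasserstein barycenters can be sketched with the standard Sinkhorn-style iterative Bregman projections (Benamou et al.). This is a generic illustration on a 1D toy grid with normalized histograms, not the paper's algorithm for non-normalized neuroimaging data; the grid, bump shapes, and regularization value are all illustrative choices:

```python
import numpy as np

def entropic_barycenter(hists, cost, eps=0.01, weights=None, n_iter=500):
    """Wasserstein barycenter of normalized histograms via Sinkhorn-style
    iterative Bregman projections with entropic regularization eps."""
    n, d = hists.shape
    if weights is None:
        weights = np.full(n, 1.0 / n)
    K = np.exp(-cost / eps)                 # Gibbs kernel
    v = np.ones((n, d))
    for _ in range(n_iter):
        u = hists / (K @ v.T).T             # enforce each input's marginal
        Ktu = (K.T @ u.T).T                 # K^T u_s for every input s
        b = np.exp(weights @ np.log(Ktu))   # geometric mean = barycenter
        v = b[None, :] / Ktu
    return b

# 1D toy example: two bumps on a regular grid with squared-distance cost.
x = np.linspace(0.0, 1.0, 50)
cost = (x[:, None] - x[None, :]) ** 2
h1 = np.exp(-((x - 0.3) ** 2) / 0.01); h1 /= h1.sum()
h2 = np.exp(-((x - 0.7) ** 2) / 0.01); h2 /= h2.sum()
bary = entropic_barycenter(np.vstack([h1, h2]), cost)
# With equal weights the barycenter's mass concentrates between the two
# inputs (near x = 0.5), instead of the bimodal shape a plain linear
# average of the two histograms would produce.
```

The entropic term is what makes the problem smooth and convex, at the price of a slightly blurred barycenter controlled by `eps`.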
Partial-volume Bayesian classification of material mixtures in MR volume data using voxel histograms
The authors present a new algorithm for identifying the distribution of different material types in volumetric datasets such as those produced with magnetic resonance imaging (MRI) or computed tomography (CT). Because the authors allow for mixtures of materials and treat voxels as regions, their technique reduces errors that other classification techniques can create along boundaries between materials and is particularly useful for creating accurate geometric models and renderings from volume data. It also has the potential to make volume measurements more accurate, and it classifies noisy, low-resolution data well. There are two unusual aspects to the authors' approach. First, they assume that, due to partial-volume effects, or blurring, voxels can contain more than one material, e.g., both muscle and fat; the authors compute the relative proportion of each material in the voxels. Second, they incorporate information from neighboring voxels into the classification process by reconstructing a continuous function, ρ(x), from the samples and then looking at the distribution of values that ρ(x) takes on within the region of a voxel. This distribution of values is represented by a histogram taken over the region of the voxel; the mixture of materials is then identified using a probabilistic Bayesian approach that finds the mixture within each voxel most likely to have created the histogram. The size of the regions that the authors classify is chosen to match the spacing of the samples, because the spacing is intrinsically related to the minimum feature size that the reconstructed continuous function can represent.
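The histogram-matching idea can be illustrated with a minimal synthetic sketch. Everything below is hypothetical (the material intensity means, the noise level, and a least-squares fit standing in for the paper's Bayesian maximum-likelihood match of the histogram):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical pure-material intensity models (e.g., "muscle" vs. "fat"):
# samples of each material follow a Gaussian; a partial-volume voxel
# draws from a mixture of the two.
means, sigma = np.array([40.0, 120.0]), 10.0
true_mix = np.array([0.3, 0.7])

# Simulate samples of the reconstructed function rho(x) inside one voxel.
labels = rng.choice(2, size=5000, p=true_mix)
samples = rng.normal(means[labels], sigma)

# Histogram of values taken over the region of the voxel.
bins = np.linspace(0.0, 200.0, 81)
hist, _ = np.histogram(samples, bins=bins, density=True)
centers = 0.5 * (bins[:-1] + bins[1:])

# Per-material histogram predictions (Gaussian densities at bin centers).
basis = np.stack([np.exp(-(centers - m) ** 2 / (2 * sigma ** 2))
                  / (sigma * np.sqrt(2 * np.pi)) for m in means])

# Least-squares mixture fit: a simplified stand-in for the probabilistic
# Bayesian approach that finds the most likely material mixture.
coef, *_ = np.linalg.lstsq(basis.T, hist, rcond=None)
mix = coef / coef.sum()
print(mix)  # recovered proportions, close to the simulated mixture
```

The fit recovers the relative proportion of each material within the voxel from the shape of its value histogram, which is the core of the partial-volume classification idea.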