Histogram Tomography
In many tomographic imaging problems the data consist of integrals along
lines or curves. Increasingly we encounter "rich tomography" problems where the
quantity imaged is higher dimensional than a scalar per voxel, including
vectors, tensors, and functions. The data can also be higher dimensional and in
many cases consist of a one- or two-dimensional spectrum for each ray. In many
such cases the data contain not just integrals along rays but the distribution
of values along the ray. If this is discretized into bins we can think of this
as a histogram. In this paper we introduce the concept of "histogram
tomography". For scalar problems with histogram data this holds the possibility
of reconstruction with fewer rays. In vector and tensor problems it holds the
promise of reconstruction of images that are in the null space of related
integral transforms. For scalar histogram tomography problems we show how bins
in the histogram correspond to reconstructing level sets of the function, while
moments of the distribution are the x-ray transform of powers of the unknown
function. In the vector case we give a reconstruction procedure for potential
components of the field. We demonstrate how the histogram longitudinal ray
transform data can be extracted from Bragg edge neutron spectral data and
hence, using moments, a non-linear system of partial differential equations
derived for the strain tensor. In x-ray diffraction tomography of strain the
transverse ray transform can be deduced from the diffraction pattern, but the full
histogram transverse ray transform cannot. We give an explicit example of
distributions of strain along a line that produce the same diffraction pattern,
and characterize the null space of the relevant transform.
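The two scalar statements above can be checked numerically in a small, self-contained sketch (the test function sin²(πt) and all variable names are illustrative choices of ours, not from the paper): along a single discretized ray, the k-th moment of the distribution of function values equals the ray integral of the k-th power of the function, and a histogram bin measures a level set.

```python
import numpy as np

n = 10_000
t = np.linspace(0.0, 1.0, n)       # arc length along the ray (length L = 1)
f = np.sin(np.pi * t) ** 2         # values of the unknown function on the ray
dt = t[1] - t[0]

for k in (1, 2, 3):
    moment_k = np.mean(f ** k)             # k-th moment of the value distribution
    ray_integral = np.sum(f ** k) * dt     # discretized x-ray transform of f^k
    assert abs(moment_k - ray_integral) < 1e-3

# A histogram bin corresponds to a level set: the fraction of the ray where
# f > 0.5 is the length of {t : sin^2(pi*t) > 0.5} = (1/4, 3/4), i.e. 0.5.
assert abs(np.mean(f > 0.5) - 0.5) < 1e-2
```

The same moment identity is what lets the Bragg-edge spectral data above be turned into a system of equations for the strain tensor: each moment of the per-ray histogram supplies one ray-transform equation.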
Sard's approximation processes and oblique projections
Three problems arising in approximation theory are studied. These problems were already studied by Arthur Sard. The main goal of this paper is to use geometrical compatibility theory to extend Sard's results and obtain characterizations of the sets of solutions.
Authors: Gustavo Corach, Juan Ignacio Giribet, Alejandra Laura Maestripieri (Instituto Argentino de Matemática Alberto Calderón, CONICET, Argentina)
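For readers unfamiliar with oblique (non-orthogonal) projections, a minimal numerical sketch of the standard construction P = A(BᵀA)⁻¹Bᵀ, which projects onto range(A) along the orthogonal complement of range(B) (the matrices here are random illustrations, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 2))   # columns span the range of the projection
B = rng.standard_normal((5, 2))   # columns determine the oblique direction

# Oblique projection onto range(A) along range(B)^perp; B.T @ A is
# generically invertible for random A and B.
P = A @ np.linalg.inv(B.T @ A) @ B.T

assert np.allclose(P @ P, P)       # idempotent: P is a projection
assert np.allclose(P @ A, A)       # fixes its range
assert not np.allclose(P, P.T)     # generally not symmetric: oblique, not orthogonal
```

Choosing B = A recovers the familiar orthogonal projection onto range(A); the compatibility questions studied in the paper concern when such oblique projections with prescribed range and kernel exist.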
Sketching for Large-Scale Learning of Mixture Models
Learning parameters from voluminous data can be prohibitive in terms of
memory and computational requirements. We propose a "compressive learning"
framework where we estimate model parameters from a sketch of the training
data. This sketch is a collection of generalized moments of the underlying
probability distribution of the data. It can be computed in a single pass on
the training set, and is easily computable on streams or distributed datasets.
The proposed framework shares similarities with compressive sensing, which aims
at drastically reducing the dimension of high-dimensional signals while
preserving the ability to reconstruct them. To perform the estimation task, we
derive an iterative algorithm analogous to sparse reconstruction algorithms in
the context of linear inverse problems. We exemplify our framework with the
compressive estimation of a Gaussian Mixture Model (GMM), providing heuristics
on the choice of the sketching procedure and theoretical guarantees of
reconstruction. We experimentally show on synthetic data that the proposed
algorithm yields results comparable to the classical Expectation-Maximization
(EM) technique while requiring significantly less memory and fewer computations
when the number of database elements is large. We further demonstrate the
potential of the approach on real large-scale data (over 10^8 training samples)
for the task of model-based speaker verification. Finally, we draw some
connections between the proposed framework and approximate Hilbert space
embedding of probability distributions using random features. We show that the
proposed sketching operator can be seen as an innovative method to design
translation-invariant kernels adapted to the analysis of GMMs. We also use this
theoretical framework to derive information preservation guarantees, in the
spirit of infinite-dimensional compressive sensing.
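The single-pass, mergeable structure of such a sketch can be illustrated with random Fourier features, which is one way to realize the generalized moments described above (the dimensions, frequency distribution, and names below are our illustrative choices, not the paper's tuned sketching procedure):

```python
import numpy as np

rng = np.random.default_rng(1)
d, m, n = 2, 64, 100_000    # data dim, sketch size, number of samples

# Random frequencies defining the generalized moments (random Fourier features).
Omega = rng.standard_normal((m, d))

def sketch(X):
    """Empirical sketch: average of complex exponentials over the samples."""
    return np.exp(1j * X @ Omega.T).mean(axis=0)

# Because the sketch is an average of per-sample features, two equal-sized
# chunks of a stream can be sketched independently and merged afterwards.
X = rng.standard_normal((n, d))
z_full = sketch(X)
z_merged = 0.5 * (sketch(X[: n // 2]) + sketch(X[n // 2 :]))
assert np.allclose(z_full, z_merged)
```

This mergeability is what makes the sketch computable on streams and distributed datasets; recovering GMM parameters from the sketch then requires the iterative, sparse-reconstruction-style algorithm of the paper, which is not shown here.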
Multiple Projection Optical Diffusion Tomography with Plane Wave Illumination
We describe a new data collection scheme for optical diffusion tomography in
which plane wave illumination is combined with multiple projections in the slab
imaging geometry. Multiple projection measurements are performed by rotating
the slab around the sample. The advantage of the proposed method is that the
measured data can be much more easily fitted into the dynamic range of most
commonly used detectors. At the same time, multiple projections improve image
quality by mutually interchanging the depth and transverse directions, and the
scanned (detection) and integrated (illumination) surfaces. Inversion methods
are derived for image reconstructions with extremely large data sets. Numerical
simulations are performed for fixed and rotated slabs.