Tomographic inversion using L1-norm regularization of wavelet coefficients
We propose the use of L1 regularization in a wavelet basis for the
solution of linearized seismic tomography problems, allowing for the
possibility of sharp discontinuities superimposed on a smoothly varying
background. An iterative method is used to find a sparse solution that
contains no more fine-scale structure than is necessary to fit the data to
within its assigned errors. Comment: 19 pages, 14 figures. Submitted to GJI July 2006. This preprint does
not use GJI style files (which gives wrong received/accepted dates).
Corrected typos.
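The sparsity-promoting iteration this abstract describes can be sketched as iterative soft thresholding (ISTA) on an L1-penalized least-squares problem. The matrix `A`, the penalty `lam`, and the toy data below are illustrative assumptions, not the paper's actual tomographic operator or wavelet transform:

```python
import numpy as np

def soft_threshold(w, t):
    """Soft-thresholding operator, the proximal map of the l1 norm."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def ista(A, b, lam, n_iter=1000):
    """ISTA for min_w 0.5*||A w - b||^2 + lam*||w||_1.

    Here A stands in for the linearized forward operator composed with
    the inverse wavelet transform; w are the wavelet coefficients.
    """
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    w = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ w - b)
        w = soft_threshold(w - grad / L, lam / L)
    return w

# Toy problem: recover a sparse coefficient vector from noisy projections.
rng = np.random.default_rng(0)
A = rng.standard_normal((80, 200)) / np.sqrt(80)
w_true = np.zeros(200)
w_true[[5, 50, 120]] = [2.0, -3.0, 1.5]
b = A @ w_true + 0.01 * rng.standard_normal(80)
w_hat = ista(A, b, lam=0.05)           # sparse: only a few coefficients survive
```

The soft-threshold step is what keeps the solution free of fine-scale structure that the data do not require: coefficients whose data misfit gradient is smaller than the penalty are set exactly to zero.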
The Impact of Different Image Thresholding based Mammogram Image Segmentation- A Review
Images are sampled and discretized numerical functions. The goal of digital image processing is to improve the quality of pictorial data and to facilitate automatic machine interpretation. A digital imaging system should have basic components for image acquisition, special hardware to support imaging applications, and a large amount of memory for storage and input/output devices. Image segmentation is a widely researched field, particularly in many medical applications, and still poses various challenges for researchers. Segmentation is a critical task for identifying regions suspicious of tumor in digital mammograms. Each image has different kinds of edges and different levels of boundaries. In image processing, the most commonly used technique for extracting objects from an image is "thresholding". Thresholding is a popular tool for image segmentation because of its simplicity, especially in fields where real-time processing is required.
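As one concrete instance of the thresholding techniques this review surveys, Otsu's method picks the gray level that maximizes between-class variance. The synthetic bimodal image below is an assumption standing in for a mammogram:

```python
import numpy as np

def otsu_threshold(image, n_bins=256):
    """Otsu's method: choose the level maximizing between-class variance."""
    hist, edges = np.histogram(image, bins=n_bins)
    p = hist / hist.sum()                      # gray-level probabilities
    omega = np.cumsum(p)                       # class-0 probability per level
    mu = np.cumsum(p * np.arange(n_bins))      # cumulative class mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu[-1] * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b[~np.isfinite(sigma_b)] = 0.0       # empty classes carry no variance
    k = int(np.argmax(sigma_b))
    return edges[k + 1]                        # threshold in image units

# Toy bimodal "mammogram": dark background plus one brighter region.
rng = np.random.default_rng(1)
img = rng.normal(60, 10, size=(64, 64))
img[20:40, 20:40] = rng.normal(180, 10, size=(20, 20))
t = otsu_threshold(img)
mask = img > t                                 # binary segmentation
```

Because it needs only a histogram, Otsu's method illustrates why thresholding suits the real-time settings the abstract mentions.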
Single-shot compressed ultrafast photography: a review
Compressed ultrafast photography (CUP) is a burgeoning single-shot computational imaging technique that provides an imaging speed as high as 10 trillion frames per second and a sequence depth of up to a few hundred frames. This technique synergizes compressed sensing and the streak camera technique to capture nonrepeatable ultrafast transient events with a single shot. With recent unprecedented technical developments and extensions of this methodology, it has been widely used in ultrafast optical imaging and metrology, ultrafast electron diffraction and microscopy, and information security protection. We review the basic principles of CUP, its recent advances in data acquisition and image reconstruction, its fusions with other modalities, and its unique applications in multiple research fields.
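The compressed-sensing acquisition that CUP builds on can be sketched schematically: the dynamic scene is spatially encoded by a pseudo-random binary mask, temporally sheared by the streak camera, and integrated on a 2-D sensor. All sizes and the one-row-per-frame shear below are illustrative assumptions, not the hardware's actual geometry:

```python
import numpy as np

rng = np.random.default_rng(2)
T, H, W = 8, 16, 16                  # frames, height, width of the toy scene
scene = rng.random((T, H, W))        # dynamic scene I(x, y, t)
mask = rng.integers(0, 2, (H, W))    # static pseudo-random binary encoding mask

# Forward model: encode each frame with the mask, shear by one row per
# frame (the streak), then integrate over time on the sensor.
snapshot = np.zeros((H + T - 1, W))
for t in range(T):
    snapshot[t:t + H, :] += mask * scene[t]
# One 2-D measurement now encodes a 3-D (x, y, t) datacube.
```

The compression is visible in the sizes: `T*H*W` unknowns are folded into `(H+T-1)*W` measurements, and reconstruction must solve a regularized inverse problem exploiting the scene's sparsity.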
Computer vision
The field of computer vision is surveyed and assessed, key research issues are identified, and possibilities for a future vision system are discussed. The problems of descriptions of two and three dimensional worlds are discussed. The representation of such features as texture, edges, curves, and corners are detailed. Recognition methods are described in which cross correlation coefficients are maximized or numerical values for a set of features are measured. Object tracking is discussed in terms of the robust matching algorithms that must be devised. Stereo vision, camera control and calibration, and the hardware and systems architecture are discussed.
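The recognition-by-correlation idea mentioned above (maximizing cross-correlation coefficients) can be illustrated with a brute-force normalized cross-correlation template matcher; the toy image and template are assumptions for illustration:

```python
import numpy as np

def ncc(patch, template):
    """Normalized cross-correlation between two equal-size arrays."""
    a = patch - patch.mean()
    b = template - template.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def match_template(image, template):
    """Slide the template over the image; return the position maximizing NCC."""
    th, tw = template.shape
    H, W = image.shape
    best, best_pos = -2.0, (0, 0)
    for i in range(H - th + 1):
        for j in range(W - tw + 1):
            c = ncc(image[i:i + th, j:j + tw], template)
            if c > best:
                best, best_pos = c, (i, j)
    return best_pos, best

rng = np.random.default_rng(3)
img = rng.random((40, 40))
tmpl = img[12:20, 25:33].copy()      # template cut from a known location
pos, score = match_template(img, tmpl)
```

Mean-subtraction and normalization make the score invariant to local brightness and contrast, which is why the coefficient, rather than raw correlation, is what recognition systems maximize.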
Compressive Wave Computation
This paper considers large-scale simulations of wave propagation phenomena.
We argue that it is possible to accurately compute a wavefield by decomposing
it onto a largely incomplete set of eigenfunctions of the Helmholtz operator,
chosen at random, and that this provides a natural way of parallelizing wave
simulations for memory-intensive applications.
This paper shows that L1-Helmholtz recovery makes sense for wave computation,
and identifies a regime in which it is provably effective: the one-dimensional
wave equation with coefficients of small bounded variation. Under suitable
assumptions we show that the number of eigenfunctions needed to evolve a sparse
wavefield defined on N points, accurately with very high probability, is
bounded by C log(N) log(log(N)), where C is related to the desired accuracy and
can be made to grow at a much slower rate than N when the solution is sparse.
The PDE estimates that underlie this result are new to the authors' knowledge
and may be of independent mathematical interest; they include an L1 estimate
for the wave equation, an estimate of extension of eigenfunctions, and a bound
for eigenvalue gaps in Sturm-Liouville problems.
Numerical examples are presented in one spatial dimension and show that as
few as 10 percent of all eigenfunctions can suffice for accurate results.
Finally, we argue that the compressive viewpoint suggests a competitive
parallel algorithm for an adjoint-state inversion method in reflection
seismology. Comment: 45 pages, 4 figures.
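A toy version of the scheme this abstract describes (evolve each eigenfunction analytically, keep only a random subset, recover the sparse wavefield by L1 minimization) might look like the following for the constant-coefficient 1-D wave equation with Dirichlet boundaries. The grid size, subset size, and plain soft-thresholding solver are illustrative choices, not the paper's algorithm:

```python
import numpy as np

N = 128                                    # interior grid x_j = j/N, j = 1..N-1
j = np.arange(1, N)
k = np.arange(1, N)
# Orthogonal discrete sine basis: eigenfunctions of the 1-D Helmholtz operator.
Phi = np.sqrt(2.0 / N) * np.sin(np.pi * np.outer(k, j) / N)

u0 = np.zeros(N - 1)
u0[63] = 1.0                               # spike at x = 0.5, zero initial velocity
c0 = Phi @ u0                              # full eigen-coefficient vector

# Each mode evolves analytically and independently (hence parallelizable):
t = 0.25
ct = np.cos(np.pi * k * t) * c0

rng = np.random.default_rng(4)
idx = rng.choice(N - 1, size=30, replace=False)   # keep 30 random eigenfunctions
A, b = Phi[idx, :], ct[idx]

# l1 recovery of the spatially sparse wavefield from incomplete coefficients,
# by plain iterative soft thresholding; step size 1 is safe since ||A||_2 <= 1.
u, lam = np.zeros(N - 1), 0.01
for _ in range(2000):
    u = u - A.T @ (A @ u - b)
    u = np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)
# By d'Alembert's formula the spike splits into two half-amplitude spikes,
# at x = 0.25 and x = 0.75, which the incomplete eigen-set still recovers.
```

The point of the sketch is the one in the abstract: time evolution happens mode-by-mode on a small random eigen-set, and sparsity of the wavefield is what makes the incomplete set sufficient.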
Unsupervised empirical Bayesian multiple testing with external covariates
In an empirical Bayesian setting, we provide a new multiple testing method,
useful when an additional covariate is available that influences the
probability of each null hypothesis being true. We measure the posterior
significance of each test conditionally on the covariate and the data, leading
to greater power. Using covariate-based prior information in an unsupervised
fashion, we produce a list of significant hypotheses which differs in length
and order from the list obtained by methods not taking covariate information
into account. Covariate-modulated posterior probabilities of each null
hypothesis are estimated using a fast approximate algorithm. The new method is
applied to expression quantitative trait loci (eQTL) data. Comment: Published at http://dx.doi.org/10.1214/08-AOAS158 in the Annals of
Applied Statistics (http://www.imstat.org/aoas/) by the Institute of
Mathematical Statistics (http://www.imstat.org)
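The covariate-modulated posterior probabilities can be sketched with a two-groups normal mixture in which the null proportion depends on the covariate. The logistic form of pi0(x) and the known alternative mean are assumptions made for illustration; the paper estimates such quantities with a fast approximate algorithm:

```python
import numpy as np

def normal_pdf(z, mu=0.0, sd=1.0):
    return np.exp(-0.5 * ((z - mu) / sd) ** 2) / (sd * np.sqrt(2.0 * np.pi))

rng = np.random.default_rng(5)
n = 5000
x = rng.uniform(0, 1, n)                        # external covariate
pi0 = 1.0 / (1.0 + np.exp(-(3.0 - 4.0 * x)))    # P(null | x): nulls rarer at large x
is_null = rng.random(n) < pi0
z = np.where(is_null, rng.normal(0, 1, n), rng.normal(2.5, 1, n))

# Covariate-modulated posterior null probability (local fdr):
f0, f1 = normal_pdf(z), normal_pdf(z, mu=2.5)
lfdr_x = pi0 * f0 / (pi0 * f0 + (1 - pi0) * f1)

# Covariate-ignoring version, using one overall null proportion:
p0 = pi0.mean()
lfdr_flat = p0 * f0 / (p0 * f0 + (1 - p0) * f1)
```

Ranking hypotheses by `lfdr_x` rather than `lfdr_flat` reorders the significance list, exactly the effect the abstract describes: tests with covariate values that make the null a priori unlikely are promoted even at moderate z-scores.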