170 research outputs found

    Multidimensional Wavelets and Computer Vision

    This report deals with the construction and mathematical analysis of multidimensional nonseparable wavelets and their efficient application in computer vision. The first part presents and extends the fundamental principles of multidimensional wavelet filter design, such as the existence of good scaling matrices and sensible design criteria. The analytical properties of these wavelets are then investigated in some detail. It turns out that they are especially well suited to representing (discretised) data, as well as large classes of operators, in sparse form, a property that directly yields efficient numerical algorithms. The final part of the work applies the developed methods to two typical computer vision problems: nonlinear image regularisation and the computation of optical flow in image sequences. It is demonstrated that the wavelet framework gives stable and reliable results for these generally ill-posed problems. Furthermore, all the algorithms are of order O(n), leading to fast processing.
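    The nonseparable constructions of the report are beyond a short sketch, but the O(n) cost of fast wavelet algorithms can be illustrated with the simplest separable case, the 1-D Haar transform: each level halves the data, so the total work is n + n/2 + n/4 + ... < 2n.

```python
def haar_transform(x):
    """Full 1-D Haar decomposition (length must be a power of two).

    Each pass computes pairwise averages and differences; the total
    work over all levels is n + n/2 + ... = O(n).
    Returns [overall average, coarsest details, ..., finest details].
    """
    x = list(x)
    out = []
    while len(x) > 1:
        averages = [(a + b) / 2 for a, b in zip(x[::2], x[1::2])]
        details = [(a - b) / 2 for a, b in zip(x[::2], x[1::2])]
        out = details + out  # coarser details go in front
        x = averages
    return x + out

# A piecewise-constant signal yields mostly zero detail coefficients,
# illustrating the sparse representation the abstract refers to.
coeffs = haar_transform([7, 7, 7, 7])  # -> [7.0, 0.0, 0.0, 0.0]
```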

    Case study: shipping trend estimation and prediction via multiscale variance stabilisation

    Shipping and shipping services are a key industry of great importance to the economy of Cyprus and the wider European Union. Assessment, management and future steering of the industry, and its associated economy, are carried out by a range of organisations and are of direct interest to a number of stakeholders. This article presents an analysis of shipping credit flow data: an important and archetypal series whose analysis is hampered by rapid changes of variance. Our analysis uses the recently developed data-driven Haar–Fisz transformation, which enables accurate trend estimation and successful prediction in these kinds of situations. Our trend estimation is augmented by bootstrap confidence bands, new in this context. The good performance of the data-driven Haar–Fisz transform contrasts with the poor performance of popular and established variance stabilisation alternatives: the Box–Cox, logarithm and square root transformations.
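    The data-driven Haar–Fisz method itself is beyond a short sketch, but the idea of variance stabilisation can be illustrated with the classical square-root transform that the abstract cites as a comparator: for Poisson-like counts, the raw variance grows with the mean, while the variance of the square-rooted data stays close to 1/4.

```python
import math
import random
import statistics

random.seed(0)

def poisson(mu):
    # Knuth's multiplicative algorithm for a single Poisson draw.
    L, k, p = math.exp(-mu), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

results = {}
for mu in (5, 50):
    xs = [poisson(mu) for _ in range(20000)]
    results[mu] = (
        statistics.variance(xs),                          # grows with mu
        statistics.variance([math.sqrt(x) for x in xs]),  # stays near 1/4
    )
```

This is only the fixed, classical transform; the Haar–Fisz approach of the article adapts the stabilisation to the data within a multiscale decomposition.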

    The locally stationary dual-tree complex wavelet model

    Here we harmonise two significant contributions to the field of wavelet analysis from the past two decades, namely the locally stationary wavelet process and the family of dual-tree complex wavelets. By combining these two components, we furnish a statistical model that simultaneously draws on the benefits of both constructions. On the one hand, our model borrows the debiased spectrum and auto-covariance estimator from the locally stationary wavelet model. On the other hand, the enhanced directional selectivity is obtained from the dual-tree complex wavelets over the regular lattice. The resulting model allows for the description and identification of wavelet fields with significantly more directional fidelity than was previously possible. The corresponding estimation theory is established for the new model, and stationarity detection experiments illustrate its practicality.

    Using the Sharp Operator for edge detection and nonlinear diffusion

    In this paper we investigate the use of the sharp function, known from functional analysis, in image processing. The sharp function measures the local variation of a function and can be used as an edge detector. We extend the classical notion of the sharp function to measure anisotropic behaviour and give a fast anisotropic edge detection variant inspired by it. We show that these edge detection results are useful for steering isotropic and anisotropic nonlinear diffusion filters for image enhancement.
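    A minimal discrete sketch of the underlying idea (not the paper's algorithm): the sharp maximal function averages |f − mean(f)| over windows around each point, so even a fixed-window version of that average responds to edges and vanishes on flat regions.

```python
def sharp_edge(img, r=1):
    """Local mean absolute deviation over a (2r+1) x (2r+1) window.

    A crude, fixed-window analogue of the sharp function: zero on
    constant regions, large where the image jumps.
    """
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            win = [img[ii][jj]
                   for ii in range(max(0, i - r), min(h, i + r + 1))
                   for jj in range(max(0, j - r), min(w, j + r + 1))]
            mean = sum(win) / len(win)
            out[i][j] = sum(abs(v - mean) for v in win) / len(win)
    return out

# A flat patch gives zero response; a step edge gives a large response
# along the edge and zero away from it.
flat = [[5] * 4 for _ in range(4)]
step = [[0, 0, 1, 1] for _ in range(4)]
```

The true sharp function takes a supremum over all window sizes, and the paper's anisotropic variant additionally adapts the window shape; both refinements are omitted here.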

    Wavelet and Multiscale Methods

    Various scientific models demand finer and finer resolution of relevant features. Paradoxically, increasing computational power only heightens this demand, since the wealth of available data itself becomes a major obstruction. Extracting essential information from complex structures, and developing rigorous models to quantify the quality of that information, leads to tasks that are not tractable by standard numerical techniques. The last decade has seen the emergence of several new computational methodologies to address this situation. Their common features are the nonlinearity of the solution methods and the ability to separate solution characteristics living on different length scales. Perhaps the most prominent examples are multigrid methods and adaptive grid solvers for partial differential equations, which have substantially advanced the frontiers of computability for certain problem classes in numerical analysis. Other highly visible examples are: regression techniques in nonparametric statistical estimation; the design of universal estimators in the context of mathematical learning theory and machine learning; the investigation of greedy algorithms in complexity theory; compression techniques and encoding in signal and image processing; the solution of global operator equations through the compression of fully populated matrices arising from boundary integral equations, with the aid of multipole expansions and hierarchical matrices; and attacking problems in high spatial dimensions with sparse grid or hyperbolic wavelet concepts. This workshop aimed to deepen the understanding of the mathematical concepts that drive this new evolution of computation and to promote the exchange of ideas emerging in various disciplines.

    Signal processing for high-definition television

    Thesis (Ph.D.), Massachusetts Institute of Technology, Dept. of Mathematics, 1995. Includes bibliographical references (p. 60-62). By Peter Monta.

    Structures in magnetohydrodynamic turbulence: detection and scaling

    We present a systematic analysis of the statistical properties of turbulent current and vorticity structures at a given time using cluster analysis. The data stem from numerical simulations of decaying three-dimensional (3D) magnetohydrodynamic turbulence in the absence of an imposed uniform magnetic field; the magnetic Prandtl number is taken equal to unity, and we use a periodic box with grids of up to 1536^3 points and Taylor Reynolds numbers up to 1100. The initial conditions are either an X-point configuration embedded in 3D, the so-called Orszag-Tang vortex, or an Arnold-Beltrami-Childress configuration with fully helical velocity and magnetic fields. In each case two snapshots are analyzed, separated by one turnover time, starting just after the peak of dissipation. We show that the algorithm is able to select a large number of structures (in excess of 8,000) for each snapshot, and that the statistical properties of these clusters are remarkably similar across the two snapshots and across the two flows under study in terms of scaling laws for the cluster characteristics, with the structures in the vorticity and in the current behaving in the same way. We also study the effect of Reynolds number on cluster statistics, and finally analyze the properties of these clusters in terms of their velocity-magnetic field correlation. Self-organized criticality features have been identified in the dissipative range of scales. A different scaling arises in the inertial range, which cannot for the moment be identified with a known self-organized criticality class consistent with MHD. We suggest that this range may be governed by turbulence dynamics rather than criticality, and propose an interpretation of intermittency in terms of the propagation of local instabilities.
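    The structure-selection step can be sketched schematically (the paper works with 3-D current and vorticity fields; this hypothetical 2-D version only illustrates the principle): threshold the field, then label connected components, each component being one "structure" whose size and intensity statistics can then be collected.

```python
def label_clusters(field, thresh):
    """Label 4-connected components of {field > thresh} by flood fill.

    Returns (labels, n): a label array (0 = background) and the number
    of distinct structures found.
    """
    h, w = len(field), len(field[0])
    labels = [[0] * w for _ in range(h)]
    n = 0
    for i in range(h):
        for j in range(w):
            if field[i][j] > thresh and labels[i][j] == 0:
                n += 1
                stack = [(i, j)]  # flood-fill one component
                while stack:
                    a, b = stack.pop()
                    if (0 <= a < h and 0 <= b < w
                            and field[a][b] > thresh and labels[a][b] == 0):
                        labels[a][b] = n
                        stack += [(a + 1, b), (a - 1, b),
                                  (a, b + 1), (a, b - 1)]
    return labels, n

field = [[0, 9, 0, 0],
         [0, 9, 0, 8],
         [0, 0, 0, 8]]
labels, count = label_clusters(field, 1)  # two separate structures
```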