
    Data-Adaptive Wavelets and Multi-Scale Singular Spectrum Analysis

    Using multi-scale ideas from wavelet analysis, we extend singular-spectrum analysis (SSA) to the study of nonstationary time series of length N whose intermittency can give rise to the divergence of their variance. SSA relies on the construction of the lag-covariance matrix C from M lagged copies of the time series over a fixed window of width W to detect the regular part of the variability in that window in terms of the minimal number of oscillatory components; here W = M Δt, with Δt the time step. The proposed multi-scale SSA is a local SSA analysis within a moving window of width M ≤ W ≤ N. Multi-scale SSA varies W, while keeping a fixed W/M ratio, and uses the eigenvectors of the corresponding lag-covariance matrix C_M as data-adaptive wavelets; successive eigenvectors of C_M correspond approximately to successive derivatives of the first mother wavelet in standard wavelet analysis. Multi-scale SSA thus solves objectively the delicate problem of optimizing the analyzing wavelet in the time-frequency domain, by a suitable localization of the signal's covariance matrix. We present several examples of application to synthetic signals with fractal or power-law behavior which mimic selected features of certain climatic and geophysical time series. A real application is to the Southern Oscillation Index (SOI), monthly values for 1933-1996. Our methodology highlights an abrupt periodicity shift in the SOI near 1960. This abrupt shift, between 4 and 3 years, supports the Devil's staircase scenario for the El Niño/Southern Oscillation phenomenon. Comment: 24 pages, 19 figures
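
    As a concrete illustration of the fixed-window SSA step on which the method builds, the following is a minimal Python sketch (assuming NumPy): it embeds a series in M lagged copies, forms the M x M lag-covariance matrix, and returns its eigenpairs, the temporal EOFs that multi-scale SSA reuses as data-adaptive wavelets. Function and variable names are illustrative, not the authors' code; the multi-scale step (sliding the window and varying W at a fixed W/M ratio) is omitted.

```python
import numpy as np

def ssa_eofs(x, M):
    """Basic single-channel SSA: build the M x M lag-covariance matrix
    of the series x for window length M and return its eigenvalues and
    eigenvectors (temporal EOFs), sorted by decreasing variance."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    N = len(x)
    K = N - M + 1
    # Trajectory matrix: K rows, each holding M lagged values of x
    X = np.column_stack([x[i:i + K] for i in range(M)])
    C = X.T @ X / K                      # lag-covariance matrix estimate
    eigval, eigvec = np.linalg.eigh(C)   # ascending eigenvalue order
    order = np.argsort(eigval)[::-1]
    return eigval[order], eigvec[:, order]

# Example: a noisy oscillation; the leading EOF pair captures the cycle
t = np.arange(512)
x = np.sin(2 * np.pi * t / 48.0) + 0.3 * np.random.randn(t.size)
lam, E = ssa_eofs(x, M=60)
print(lam[:4] / lam.sum())   # fraction of variance in the leading EOFs
```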

    A Fractal Analysis of the HI Emission from the Large Magellanic Cloud

    A composite map of HI in the LMC, made with the ATCA interferometer and the Parkes multibeam telescope, was analyzed in several ways in an attempt to characterize the structure of the neutral gas and to find an origin for it. Fourier-transform power spectra in 1D, 2D, and in the azimuthal direction were found to be approximate power laws over 2 decades in length scale. Delta-variance methods also showed the same power-law structure. Detailed models of these data were made using line-of-sight integrals over fractals that are analogous to those generated by simulations of turbulence with and without phase transitions. The results suggested a way to measure directly, for the first time, the line-of-sight thickness of the cool component of the HI disk of a nearly face-on galaxy. The signature of this thickness was found to be present in all of the measured power spectra. The character of the HI structure in the LMC was also viewed by comparing positive and negative images of the integrated emission. The geometric structure of the high-emission regions was found to be filamentary, whereas that of the low-emission (intercloud) regions was found to be patchy and round. This result suggests that compressive events formed the high-emission regions, while expansion events, whether from explosions or turbulence, formed the low-emission regions. The character of the structure was also investigated as a function of scale using unsharp masks. All of these results suggest that most of the ISM in the LMC is fractal, presumably the result of pervasive turbulence, self-gravity, and self-similar stirring. Comment: 30 pages, 21 figures, scheduled for ApJ Vol. 548, No. 1, Feb 10, 2001
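
    For readers unfamiliar with the spectral statistic used here, the following is a minimal Python sketch (assuming NumPy) of an azimuthally averaged 2D power spectrum and a log-log slope fit, the kind of measurement reported above. It is not the authors' pipeline, and the white-noise test field is only a placeholder (its slope is near zero rather than a steep power law).

```python
import numpy as np

def radial_power_spectrum(image, nbins=40):
    """Azimuthally averaged 2D Fourier power spectrum of an image,
    used to test whether the emission has power-law structure."""
    img = np.asarray(image, dtype=float)
    img = img - img.mean()
    power = np.abs(np.fft.fftshift(np.fft.fft2(img)))**2
    ny, nx = img.shape
    y, x = np.indices((ny, nx))
    r = np.hypot(x - nx // 2, y - ny // 2)     # radial wavenumber (pixels)
    edges = np.linspace(1, r.max(), nbins + 1)
    k = 0.5 * (edges[:-1] + edges[1:])
    spec = np.array([power[(r >= lo) & (r < hi)].mean()
                     for lo, hi in zip(edges[:-1], edges[1:])])
    return k, spec

# Example on a synthetic noise field: fit the slope of log P vs log k
field = np.random.randn(256, 256)
k, P = radial_power_spectrum(field)
good = np.isfinite(P) & (P > 0)
slope = np.polyfit(np.log(k[good]), np.log(P[good]), 1)[0]
print("spectral slope ~", slope)
```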

    Giving eyes to ICT!, or How does a computer recognize a cow?

    The system developed by Schouten and other researchers at CWI is based on describing images using fractal geometry. Human perception turns out to be so efficient in part because it relies heavily on similarities, so it is natural to look for mathematical methods that do the same. Schouten has therefore investigated image coding using fractals. Fractals are self-similar geometric figures, built up by repeated transformation (iteration) of a simple basic pattern, which thereby branches into ever smaller scales. At every level of detail a fractal resembles itself (the Droste effect). With fractals one can fairly easily produce deceptively realistic images of nature. Fractal image coding assumes that the reverse also holds: an image can be stored efficiently as the basic patterns of a small number of fractals, together with the prescription for reconstructing the original image from them. The system developed at CWI, in collaboration with researchers from Leuven, is based in part on this method. ISBN 906196502
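
    The forward construction described above, iterating a few contractive maps on a simple basic pattern until a self-similar figure emerges, can be sketched in a few lines. The Python example below (assuming NumPy) traces the Sierpinski triangle attractor with the "chaos game"; it illustrates self-similarity only, not the CWI fractal image-coding system, which works in the inverse direction.

```python
import numpy as np

# Sierpinski triangle via the "chaos game": three contractive affine maps,
# each pulling the current point halfway toward one triangle vertex.
vertices = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 0.866]])
rng = np.random.default_rng(0)
point = np.array([0.2, 0.2])
points = []
for _ in range(20000):
    v = vertices[rng.integers(3)]
    point = 0.5 * (point + v)          # apply a randomly chosen contraction
    points.append(point.copy())
points = np.array(points)
# 'points' traces the self-similar attractor; fractal image coding works the
# other way round, searching for maps whose attractor approximates an image.
print(points.shape)
```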

    Mapping the Mechanical Anisotropy of the Lithosphere using a 2D Wavelet Coherence, and its Application to Australia

    We develop a new method for imaging the spatial variations of the anisotropy of the flexural response of the lithosphere, and apply it to recent topographic and gravity data sets over Australia. The method uses two-dimensional Morlet wavelet transforms, superposed in a strictly controlled geometry, to estimate the auto- and cross-spectra of the two data sets in a number of different directions. The resulting wavelet coherence is a function of scale, or wavelength, as well as orientation, and is inverted, at each spatial location, for the three parameters of an anisotropic, thin elastic plate model, i.e., maximum and minimum flexural rigidities and the orientation of the maximum. Extensive tests of the method on synthetic anisotropic, but uniform, data sets show that it retrieves the amplitude and orientation of the anisotropy with useful accuracy. The results for Australia west of 143°E show a strong correlation with the shallower layers (75-175 km) of a recent model of seismic SV-wave azimuthal anisotropy. The 'weak' axes (i.e., of minimum flexural rigidity) in most cases are approximately at right angles to the fast axes of the seismic anisotropy, implying that, for Precambrian Australia, they arise from the same source. This is most likely deformation resulting from the most recent episode of orogeny.
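
    As a rough illustration of the kind of estimate involved, the sketch below (assuming NumPy and SciPy) computes a squared coherence between two grids for a single 2D Morlet scale and orientation, with simple Gaussian smoothing of the auto- and cross-spectra. It is not the authors' superposition-of-wavelets scheme; the central wavenumber k0 = 5.336, the smoothing width, and all names are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def morlet2d_coherence(g, h, wavelength, azimuth, k0=5.336, smooth=8):
    """Squared coherence between two grids g, h (e.g. topography and
    gravity) for one 2D Morlet wavelet scale and orientation. Local
    smoothing is essential: without it the coherence is identically 1."""
    ny, nx = g.shape
    kx = np.fft.fftfreq(nx) * 2 * np.pi
    ky = np.fft.fftfreq(ny) * 2 * np.pi
    KX, KY = np.meshgrid(kx, ky)
    s = k0 * wavelength / (2 * np.pi)          # scale for this wavelength
    k0x, k0y = k0 * np.cos(azimuth), k0 * np.sin(azimuth)
    # 2D Morlet wavelet in the Fourier domain (admissibility term omitted)
    W = np.exp(-0.5 * ((s * KX - k0x)**2 + (s * KY - k0y)**2))
    cg = np.fft.ifft2(np.fft.fft2(g) * W)      # wavelet coefficients of g
    ch = np.fft.ifft2(np.fft.fft2(h) * W)
    cross = cg * np.conj(ch)
    cr = gaussian_filter(cross.real, smooth)
    ci = gaussian_filter(cross.imag, smooth)
    pg = gaussian_filter(np.abs(cg)**2, smooth)
    ph = gaussian_filter(np.abs(ch)**2, smooth)
    return (cr**2 + ci**2) / (pg * ph)

# Example with uncorrelated random fields (coherence should be low)
g = np.random.randn(128, 128)
h = np.random.randn(128, 128)
coh = morlet2d_coherence(g, h, wavelength=20.0, azimuth=0.0)
print(coh.mean())
```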

    Measuring Visual Consistency in 3D Rendering Systems

    One of the major challenges facing a present-day game development company is the removal of bugs from the complex virtual environments it builds. This work presents an approach for measuring the correctness of synthetic scenes generated by the rendering system of a 3D application, such as a computer game. Our approach builds a database of labelled point clouds representing the spatio-temporal colour distribution of the objects present in a sequence of bug-free frames. This is done by converting the positions that the pixels take over time into the equivalent 3D points with associated colours. Once this space of labelled points is built, each new image produced from the same game by any rendering system can be analysed by measuring its visual inconsistency in terms of distance from the database. Objects within the scene can be relocated (manually or by the application engine); yet the algorithm is able to perform the image analysis in terms of the 3D structure and colour distribution of samples on the surface of the object. We applied our framework to the publicly available game RacingGame developed for Microsoft® XNA®. Preliminary results show how this approach can be used to detect a variety of visual artifacts generated by the rendering system in a professional-quality game engine.
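
    The database-lookup idea can be sketched as follows (a minimal Python example assuming NumPy and SciPy, with the back-projection of pixels to 3D world-space points done elsewhere by the engine): reference samples of position plus colour go into a k-d tree, and a new frame is scored by its samples' distances to that database. The names, the colour weighting, and the mean-distance score are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from scipy.spatial import cKDTree

def build_reference(points_xyz, colors_rgb, color_weight=0.5):
    """Reference database of labelled surface samples (3D position plus
    colour) gathered from bug-free frames."""
    feats = np.hstack([points_xyz, color_weight * colors_rgb])
    return cKDTree(feats), color_weight

def visual_inconsistency(tree, color_weight, points_xyz, colors_rgb):
    """Mean nearest-neighbour distance of a new frame's samples from the
    reference database; large values flag potential rendering artifacts."""
    feats = np.hstack([points_xyz, color_weight * colors_rgb])
    dist, _ = tree.query(feats)
    return dist.mean()

# Toy example: a reference set and a slightly perturbed "test frame"
ref_xyz = np.random.rand(5000, 3)
ref_rgb = np.random.rand(5000, 3)
tree, w = build_reference(ref_xyz, ref_rgb)
test_xyz = ref_xyz + 0.01 * np.random.randn(5000, 3)
print(visual_inconsistency(tree, w, test_xyz, ref_rgb))
```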

    Digital Image Processing

    Newspapers and the popular scientific press today publish many examples of highly impressive images. These images range, for example, from those showing regions of star birth in the distant Universe to the extent of the stratospheric ozone depletion over Antarctica in springtime, and to those regions of the human brain affected by Alzheimer's disease. Processed digitally to generate spectacular images, often in false colour, they all make an immediate and deep impact on the viewer's imagination and understanding. Professor Jonathan Blackledge's erudite but very useful new treatise Digital Image Processing: Mathematical and Computational Methods explains both the underlying theory and the techniques used to produce such images in considerable detail. It also provides many valuable example problems - and their solutions - so that the reader can test his/her grasp of the physical, mathematical and numerical aspects of the particular topics and methods discussed. As such, this magnum opus complements the author's earlier work Digital Signal Processing. Both books are a wonderful resource for students who wish to make their careers in this fascinating and rapidly developing field, which has an ever increasing number of areas of application. The strengths of this large book lie in:
    • an excellent explanatory introduction to the subject;
    • thorough treatment of the theoretical foundations, dealing with both electromagnetic and acoustic wave scattering and allied techniques;
    • comprehensive discussion of all the basic principles, the mathematical transforms (e.g. the Fourier and Radon transforms), their interrelationships and, in particular, Born scattering theory and its application to imaging systems modelling;
    • detailed discussion - including the assumptions and limitations - of optical imaging, seismic imaging, medical imaging (using ultrasound), X-ray computer-aided tomography, tomography when the wavelength of the probing radiation is of the same order as the dimensions of the scatterer, Synthetic Aperture Radar (airborne or spaceborne), digital watermarking and holography;
    • detail devoted to the methods of implementation of the analytical schemes in various case studies and also as numerical packages (especially in C/C++);
    • coverage of deconvolution, de-blurring (or sharpening) an image, maximum entropy techniques, Bayesian estimators, techniques for enhancing the dynamic range of an image, methods of filtering images and techniques for noise reduction;
    • discussion of thresholding, techniques for detecting edges in an image and for contrast stretching, stochastic scattering (random walk models) and models for characterizing an image statistically;
    • investigation of fractal images, fractal dimension segmentation, image texture, the coding and storing of large quantities of data, and image compression such as JPEG;
    • a valuable summary of the important results obtained in each chapter, given at its end;
    • suggestions for further reading at the end of each chapter.
    I warmly commend this text to all readers, and trust that they will find it to be invaluable. Professor Michael J Rycroft, Visiting Professor at the International Space University, Strasbourg, France, and at Cranfield University, England.