
    Design & implementation of complex-valued FIR digital filters with application to migration of seismic data

    One-dimensional (1-D) and two-dimensional (2-D) frequency-space seismic migration FIR digital filter coefficients are complex-valued when such filters require special space-domain as well as wavenumber-domain characteristics. In this thesis, such FIR digital filters are designed using Vector Space Projection Methods (VSPMs), which can satisfy the desired predefined filter properties, for 2-D and three-dimensional (3-D) seismic data sets, respectively. More precisely, the pure and the relaxed projection algorithms, which are part of the VSPM theory, are derived. Simulation results show that the relaxed version of the pure algorithm can introduce significant savings in the number of iterations required. Also, due to some undesirable background artifacts on migrated sections, a modified version of the pure algorithm was used to eliminate such effects. This modification also led to a significant reduction in the number of computations compared to both the pure and relaxed algorithms. We further propose a generalization of the 1-D (real/complex-valued) pure algorithm to multi-dimensional (m-D) complex-valued FIR digital filters, where the resulting frequency responses possess an approximately equiripple nature. Superior designs are obtained when compared with other previously reported methods. In addition, we propose a new scheme for implementing the predesigned 2-D migration FIR filters, based on the Singular Value Decomposition (SVD). Unlike the existing realization methods used for this geophysical application, this inexpensive SVD-based realization yields satisfactory wavenumber responses compared with the true 2-D convolution. Finally, an application to seismic migration of 2-D and 3-D synthetic sections is shown to confirm our theoretical conclusions. The resulting migration FIR filters are also applied to the challenging SEG/EAGE Salt model data. The migrated section (image) outperformed images obtained using other FIR filters and other standard migration techniques, with the difficult structures contained in such a challenging model imaged clearly
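    The SVD-based realization this abstract describes can be sketched as follows: the 2-D filter coefficient matrix is decomposed so that the full 2-D convolution is replaced by a short sum of separable row/column 1-D convolutions, one pair per retained singular value. This is an illustrative sketch, not the thesis' implementation; the function names and the rank choice are assumptions.

```python
import numpy as np

def svd_separable_filters(h, rank):
    """Split a 2-D FIR filter h into `rank` separable (column, row) 1-D filter pairs.

    h may be real or complex; the sum of outer products of the returned pairs
    approximates h (exactly, when rank equals the rank of h).
    """
    U, s, Vh = np.linalg.svd(h, full_matrices=False)
    cols = [np.sqrt(s[k]) * U[:, k] for k in range(rank)]   # column (vertical) filters
    rows = [np.sqrt(s[k]) * Vh[k, :] for k in range(rank)]  # row (horizontal) filters
    return cols, rows

def apply_separable(x, cols, rows):
    """Approximate the full 2-D convolution of x with h as a sum of 1-D passes."""
    y = 0
    for c, r in zip(cols, rows):
        tmp = np.array([np.convolve(xr, r) for xr in x])           # filter each row
        y = y + np.array([np.convolve(tc, c) for tc in tmp.T]).T   # then each column
    return y
```

For a filter of size P x Q on an N x N section, each retained term costs O(N^2 (P + Q)) instead of O(N^2 P Q) for the true 2-D convolution, which is the source of the savings the abstract claims.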

    Efficient Seismic Imaging of Hexagonally Sampled Seismic Data


    Digital Image Processing

    Newspapers and the popular scientific press today publish many examples of highly impressive images. These images range, for example, from those showing regions of star birth in the distant Universe to the extent of the stratospheric ozone depletion over Antarctica in springtime, and to those regions of the human brain affected by Alzheimer’s disease. Processed digitally to generate spectacular images, often in false colour, they all make an immediate and deep impact on the viewer’s imagination and understanding. Professor Jonathan Blackledge’s erudite but very useful new treatise Digital Image Processing: Mathematical and Computational Methods explains both the underlying theory and the techniques used to produce such images in considerable detail. It also provides many valuable example problems - and their solutions - so that the reader can test his/her grasp of the physical, mathematical and numerical aspects of the particular topics and methods discussed. As such, this magnum opus complements the author’s earlier work Digital Signal Processing. Both books are a wonderful resource for students who wish to make their careers in this fascinating and rapidly developing field, which has an ever-increasing number of areas of application. The strengths of this large book lie in:
    • an excellent explanatory introduction to the subject;
    • thorough treatment of the theoretical foundations, dealing with both electromagnetic and acoustic wave scattering and allied techniques;
    • comprehensive discussion of all the basic principles and the mathematical transforms (e.g. the Fourier and Radon transforms), their interrelationships and, in particular, Born scattering theory and its application to imaging systems modelling;
    • detailed discussion - including the assumptions and limitations - of optical imaging, seismic imaging, medical imaging (using ultrasound), X-ray computer aided tomography, tomography when the wavelength of the probing radiation is of the same order as the dimensions of the scatterer, Synthetic Aperture Radar (airborne or spaceborne), digital watermarking and holography;
    • detail devoted to the methods of implementation of the analytical schemes in various case studies and also as numerical packages (especially in C/C++);
    • coverage of deconvolution, de-blurring (or sharpening) of an image, maximum entropy techniques, Bayesian estimators, techniques for enhancing the dynamic range of an image, methods of filtering images and techniques for noise reduction;
    • discussion of thresholding, techniques for detecting edges in an image and for contrast stretching, stochastic scattering (random walk models) and models for characterizing an image statistically;
    • investigation of fractal images, fractal dimension segmentation, image texture, the coding and storing of large quantities of data, and image compression such as JPEG;
    • a valuable summary of the important results obtained in each Chapter, given at its end;
    • suggestions for further reading at the end of each Chapter.
    I warmly commend this text to all readers, and trust that they will find it to be invaluable. Professor Michael J Rycroft, Visiting Professor at the International Space University, Strasbourg, France, and at Cranfield University, England

    Signal processing techniques for the enhancement of marine seismic data

    This thesis presents several signal processing techniques applied to the enhancement of marine seismic data. Marine seismic exploration provides an image of the Earth's subsurface from reflected seismic waves. Because the recorded signals are contaminated by various sources of noise, minimizing their effects with new attenuation techniques is necessary. A statistical analysis of background noise is conducted using Thomson’s multitaper spectral estimator and Parzen's amplitude density estimator. The results provide a statistical characterization of the noise, which we use for the derivation of signal enhancement algorithms. Firstly, we focus on single-azimuth stacking methodologies and propose novel stacking schemes using either enhanced weighted sums or a Kalman filter. It is demonstrated that the enhanced methods yield superior results, exhibiting cleaner and better-defined reflected events as well as a larger number of reflections in deep waters. A comparison of the proposed stacking methods with existing ones is also discussed. We then address the problem of random noise attenuation and present an innovative application of sparse code shrinkage and independent component analysis. Sparse code shrinkage is a valuable method when a noise-free realization of the data is generated to provide data-driven shrinkages. Several models of distribution are investigated, but the normal inverse Gaussian density yields the best results. Other acceptable choices of density are discussed as well. Finally, we consider the attenuation of flow-generated nonstationary coherent noise and seismic interference noise. We suggest a multiple-input adaptive noise canceller that utilizes a normalized least mean squares algorithm with a variable normalized step size derived as a function of instantaneous frequency. This filter attenuates the coherent noise successfully when used either by itself or in combination with a time-frequency median filter, depending on the spectrum and distribution of the noise along the data. Its application to seismic interference attenuation is also discussed
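    The adaptive canceller this abstract describes can be sketched in its simplest single-reference form: an FIR filter estimates the coherent noise from a reference channel, subtracts it from the primary channel, and updates its weights by the normalized LMS rule. The sketch below uses a fixed normalized step size rather than the thesis' variable, instantaneous-frequency-dependent step, and all names are illustrative.

```python
import numpy as np

def nlms_canceller(primary, reference, order=8, mu=0.1, eps=1e-8):
    """Single-reference NLMS adaptive noise canceller (simplified sketch).

    primary   : signal plus coherent noise (the recorded trace)
    reference : noise-only channel correlated with the coherent noise
    Returns the error signal, i.e. the primary with the noise estimate removed.
    """
    w = np.zeros(order)
    out = np.zeros(len(primary))
    for n in range(len(primary)):
        # most recent `order` reference samples, newest first (zero-padded at start)
        x = reference[max(0, n - order + 1):n + 1][::-1]
        x = np.pad(x, (0, order - len(x)))
        y = w @ x                          # current estimate of the coherent noise
        e = primary[n] - y                 # error = cleaned output sample
        w += mu * e * x / (x @ x + eps)    # normalized LMS weight update
        out[n] = e
    return out
```

The normalization by the instantaneous reference power makes the convergence rate insensitive to the reference signal level, which is what allows a single step-size setting to work across noise bursts of varying strength.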

    Advanced Techniques for Ground Penetrating Radar Imaging

    Ground penetrating radar (GPR) has become one of the key technologies in subsurface sensing and, in general, in non-destructive testing (NDT), since it is able to detect both metallic and nonmetallic targets. GPR for NDT has been successfully introduced in a wide range of sectors, such as mining and geology, glaciology, civil engineering and civil works, archaeology, and security and defense. In recent decades, improvements in georeferencing and positioning systems have enabled the introduction of synthetic aperture radar (SAR) techniques in GPR systems, yielding GPR–SAR systems capable of providing high-resolution microwave images. In parallel, the radiofrequency front-end of GPR systems has been optimized in terms of compactness (e.g., smaller Tx/Rx antennas) and cost. These advances, combined with improvements in autonomous platforms, such as unmanned terrestrial and aerial vehicles, have fostered new fields of application for GPR, where fast and reliable detection capabilities are demanded. In addition, processing techniques have been improved, taking advantage of the research conducted in related fields like inverse scattering and imaging. As a result, novel and robust algorithms have been developed for clutter reduction, automatic target recognition, and efficient processing of large sets of measurements to enable real-time imaging, among others. This Special Issue provides an overview of the state of the art in GPR imaging, focusing on the latest advances from both hardware and software perspectives
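    As an illustration of the GPR–SAR processing this issue surveys, the sketch below implements plain delay-and-sum backprojection, the core of many high-resolution GPR imaging chains: each image pixel accumulates the recorded samples at the two-way travel time from every antenna position. The names, the constant propagation velocity, and the sampling convention are assumptions for the sketch; a real processor must account for the medium's permittivity and antenna characteristics.

```python
import numpy as np

# Assumed constant propagation velocity, in grid units per time sample.
c = 0.1

def backproject(traces, antenna_x, grid_x, grid_z):
    """Delay-and-sum SAR backprojection (illustrative sketch).

    traces[i, t] is the range profile recorded at antenna position antenna_x[i];
    the image is formed on the (grid_z, grid_x) pixel grid below the survey line.
    """
    image = np.zeros((len(grid_z), len(grid_x)))
    for i, ax in enumerate(antenna_x):
        for gz, z in enumerate(grid_z):
            for gx, x in enumerate(grid_x):
                r = np.hypot(x - ax, z)        # one-way distance antenna -> pixel
                t = int(round(2 * r / c))      # two-way travel time in samples
                if t < traces.shape[1]:
                    image[gz, gx] += traces[i, t]
    return image
```

Echoes from a true scatterer add coherently at its pixel across all aperture positions, while clutter adds incoherently, which is what yields the resolution gain of GPR–SAR over a single static measurement.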

    Least-Squares Wavelet Analysis and Its Applications in Geodesy and Geophysics

    The Least-Squares Spectral Analysis (LSSA) is a robust method of analyzing unequally spaced and non-stationary data/time series. Although this method takes into account the correlation among the sinusoidal basis functions of irregularly spaced series, its spectrum still shows spectral leakage: power/energy leaks from one spectral peak into another. An iterative method called the AntiLeakage Least-Squares Spectral Analysis (ALLSSA) is developed to attenuate the spectral leakage in the spectrum and consequently is used to regularize data series. In this study, the ALLSSA is applied to regularize and attenuate random noise in seismic data down to a certain desired level. The ALLSSA is subsequently extended to multichannel, heterogeneous and coarsely sampled seismic and related gradient measurements intended for geophysical exploration applications that require regularized (equally spaced) data free from aliasing effects. A new and robust method of analyzing unequally spaced and non-stationary time/data series is rigorously developed. This method, namely, the Least-Squares Wavelet Analysis (LSWA), is a natural extension of the LSSA that decomposes a time series into the time-frequency domain and obtains its spectrogram. It is shown through many synthetic and experimental time/data series that the LSWA supersedes all state-of-the-art spectral analysis methods currently available, without making any assumptions about or preprocessing (editing) the time series, or even applying any empirical methods that aim to adapt a time series to the analysis method. The LSWA can analyze any non-stationary and unequally spaced time series with components of low or high amplitude and frequency variability over time, including datum shifts (offsets), trends, and constituents of known forms, and by taking into account the covariance matrix associated with the time series. The stochastic confidence level surface for the spectrogram is rigorously derived, identifying statistically significant peaks in the spectrogram at a certain confidence level; this supersedes the empirical cone of influence used in the most popular continuous wavelet transform. All current state-of-the-art cross-wavelet transform and wavelet coherence analysis methods impose many stringent constraints on the properties of the time series under investigation, requiring, more often than not, preprocessing of the raw measurements that may distort their content. These methods cannot generally be used to analyze unequally spaced and non-stationary time series, or even two equally spaced time series of different sampling rates, with trends and/or datum shifts, and with associated covariance matrices. To overcome the stringent requirements of these methods, a new method is developed, namely, the Least-Squares Cross-Wavelet Analysis (LSCWA), along with its statistical distribution, which requires no assumptions on the series under investigation. Numerous synthetic and geoscience examples establish the LSCWA as the method of choice for rigorous coherence analysis of any experimental series
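    The core idea of the LSSA can be illustrated as follows: at each candidate frequency a cosine/sine pair is fit to the unequally spaced series by least squares, and the spectrum records the fraction of the series' energy that the pair explains. This is a minimal sketch that omits the covariance matrix, trends, datum shifts, and the ALLSSA iteration described above; the function names are assumptions.

```python
import numpy as np

def lssa_spectrum(t, y, freqs):
    """Least-squares spectral analysis of an unequally spaced series (sketch).

    t, y  : sample times (need not be equally spaced) and values
    freqs : candidate cyclic frequencies to test
    Returns, per frequency, the fraction of the series' energy explained by
    a least-squares cosine/sine fit at that frequency.
    """
    y = y - y.mean()
    energy = y @ y
    spectrum = []
    for f in freqs:
        # design matrix for one cosine/sine pair evaluated at the sample times
        A = np.column_stack([np.cos(2 * np.pi * f * t), np.sin(2 * np.pi * f * t)])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        fitted = A @ coef
        spectrum.append((fitted @ fitted) / energy)
    return np.array(spectrum)
```

Because the sinusoids are fit by least squares at the actual sample times, no interpolation onto a regular grid is needed, which is what lets the method handle unequally spaced series directly.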

    Combined Industry, Space and Earth Science Data Compression Workshop

    The sixth annual Space and Earth Science Data Compression Workshop and the third annual Data Compression Industry Workshop were held as a single combined workshop. The workshop was held April 4, 1996 in Snowbird, Utah in conjunction with the 1996 IEEE Data Compression Conference, which was held at the same location March 31 - April 3, 1996. The Space and Earth Science Data Compression sessions sought to explore opportunities for data compression to enhance the collection, analysis, and retrieval of space and earth science data. Of particular interest is data compression research that is integrated into, or has the potential to be integrated into, a particular space or earth science data information system. Preference is given to data compression research that takes into account the scientist's data requirements, and the constraints imposed by the data collection, transmission, distribution and archival systems