252 research outputs found

    Lifting dual tree complex wavelets transform

    We describe the lifting dual tree complex wavelet transform (LDTCWT), a lifting-based wavelet transform that produces complex coefficients by employing a dual tree of lifting wavelet filters to obtain its real and imaginary parts. This structure allows the transform to achieve approximate shift invariance and directionally selective filters while reducing computation time (properties lacking in the classical wavelet transform). We describe a way to estimate the accuracy of this approximation and to design suitable filters to achieve it. These benefits can be exploited in applications such as denoising, segmentation, image fusion, and compression. Results from a shrinkage-denoising example application show objective and subjective improvements over the dual tree complex wavelet transform (DTCWT). Compared with the DTCWT, the new transform offers a trade-off between denoising performance, computational efficiency, and memory requirements. We use the peak signal-to-noise ratio (PSNR) together with the structural similarity index measure (SSIM) and the SSIM map to assess denoised image quality.
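    The following is a minimal sketch of the shrinkage-denoising workflow and PSNR evaluation described above, using the ordinary discrete wavelet transform from PyWavelets as a stand-in for the LDTCWT; the wavelet, threshold, and noise level are illustrative assumptions rather than the paper's settings.

```python
# Illustrative wavelet shrinkage denoising with a plain DWT (stand-in for the LDTCWT).
import numpy as np
import pywt

def shrinkage_denoise(noisy, wavelet="db4", levels=3, threshold=0.1):
    coeffs = pywt.wavedec2(noisy, wavelet, level=levels)
    # Soft-threshold the detail subbands; leave the approximation band untouched.
    shrunk = [coeffs[0]] + [
        tuple(pywt.threshold(c, threshold, mode="soft") for c in band)
        for band in coeffs[1:]
    ]
    return pywt.waverec2(shrunk, wavelet)

def psnr(clean, estimate, peak=1.0):
    mse = np.mean((clean - estimate) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)

if __name__ == "__main__":
    clean = np.tile(np.linspace(0.0, 1.0, 256), (256, 1))      # toy gradient image
    noisy = clean + 0.05 * np.random.randn(*clean.shape)        # assumed noise level
    denoised = shrinkage_denoise(noisy)[:256, :256]
    print(f"PSNR noisy:    {psnr(clean, noisy):.2f} dB")
    print(f"PSNR denoised: {psnr(clean, denoised):.2f} dB")
```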

    A Novel Multimodal Image Fusion Method Using Hybrid Wavelet-based Contourlet Transform

    Various image fusion techniques have been studied to meet the requirements of applications such as concealed weapon detection, remote sensing, urban mapping, surveillance, and medical imaging. Combining two or more images of the same scene or object produces a single image that is more useful for the application at hand. The conventional wavelet transform (WT) has been widely used for image fusion because of its multi-scale framework and its ability to isolate discontinuities at object edges. More recently, the contourlet transform (CT) has been applied to image fusion to overcome the drawbacks of the WT. The experimental studies in this dissertation show that the contourlet transform is better suited than the conventional wavelet transform for image fusion. However, the contourlet transform also has major drawbacks. First, its framework provides neither the shift-invariance nor the structural information of the source images that are needed to enhance fusion performance. Second, unwanted artifacts are produced during contourlet decomposition, caused by setting some transform coefficients to zero for nonlinear approximation. In this dissertation, a novel fusion method using a hybrid wavelet-based contourlet transform (HWCT) is proposed to overcome the drawbacks of both the conventional wavelet and contourlet transforms and to enhance fusion performance. In the proposed method, the Daubechies Complex Wavelet Transform (DCxWT) is employed to provide both shift-invariance and structural information, and a Hybrid Directional Filter Bank (HDFB) is used to obtain fewer artifacts and more directional information. DCxWT provides the shift-invariance needed during fusion to avoid mis-registration: without it, source images become misaligned and the fusion results degrade significantly. DCxWT also provides structural information through the imaginary part of its wavelet coefficients, so more relevant information is preserved during fusion and the fused image is better represented. Moreover, the HDFB is applied to the fusion framework in which the source images are decomposed, providing abundant directional information with lower complexity and reduced artifacts. The proposed method is applied to five categories of multimodal image fusion, and an experimental study evaluates its performance in each category using suitable quality metrics, with various datasets, fusion algorithms, pre-processing techniques, and quality metrics used per category. In every experimental study and analysis, the proposed method produced better fusion results than the conventional wavelet and contourlet transforms; its usefulness as a fusion method has therefore been validated and its high performance verified.
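    Below is a hedged sketch of coefficient-level image fusion using a plain DWT from PyWavelets as a simplified stand-in for the proposed HWCT: approximation bands are averaged and, for each detail band, the larger-magnitude coefficient is kept. The fusion rule, wavelet, and decomposition depth are assumptions for illustration only.

```python
# Simple DWT-domain fusion of two registered source images (stand-in for HWCT).
import numpy as np
import pywt

def fuse_dwt(img_a, img_b, wavelet="db2", levels=3):
    ca = pywt.wavedec2(img_a, wavelet, level=levels)
    cb = pywt.wavedec2(img_b, wavelet, level=levels)

    fused = [(ca[0] + cb[0]) / 2.0]                      # approximation bands: average
    for (ha, va, da), (hb, vb, db) in zip(ca[1:], cb[1:]):
        # Detail bands: keep the coefficient with the larger magnitude.
        fused.append(tuple(np.where(np.abs(a) >= np.abs(b), a, b)
                           for a, b in ((ha, hb), (va, vb), (da, db))))
    return pywt.waverec2(fused, wavelet)

if __name__ == "__main__":
    a = np.random.rand(128, 128)   # stand-ins for two registered source images
    b = np.random.rand(128, 128)
    print(fuse_dwt(a, b).shape)
```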

    Lossless and low-cost integer-based lifting wavelet transform

    The discrete wavelet transform (DWT) is a powerful tool for analyzing real-time signals, including aperiodic, irregular, noisy, and transient data, because of its ability to explore signals in both the frequency and time domains at different resolutions. For this reason, it is used extensively in a wide range of image and signal processing applications. Despite this wide usage, implementations of the wavelet transform are usually lossy or computationally complex and require expensive hardware. However, in many applications, such as medical diagnosis, reversible data hiding, and critical satellite data, a lossless implementation of the wavelet transform is desirable. It is also important to have more hardware-friendly implementations, given the transform's recent inclusion in signal processing modules of systems-on-chip (SoCs). To address this need, this research work provides a generalized implementation of the wavelet transform using an integer-based lifting method to produce a lossless and low-cost architecture while maintaining performance close to that of the original wavelets. In order to achieve a general implementation method for all orthogonal and biorthogonal wavelets, the Daubechies wavelet family has been utilized first, since it is one of the most widely used families and is built on a systematic construction of compactly supported orthogonal wavelets. Although the first two phases of this work target Daubechies wavelets, they can be generalized to other wavelets as well. Subsequently, techniques from these earlier phases have been adopted and the critical issues for achieving a general lossless implementation have been solved, leading to a general lossless method. The research work presented here can be divided into several phases. In the first phase, low-cost architectures of the Daubechies-4 (D4) and Daubechies-6 (D6) wavelets have been derived by applying integer-polynomial mapping (IPM). A lifting architecture has been used, which halves the cost compared with the conventional convolution-based approach. Applying IPM to the floating-point polynomial filter coefficients further decreases complexity and reduces the loss in signal reconstruction, and resource sharing between lifting steps results in a further reduction in implementation cost with near-lossless data reconstruction. In the second phase, a completely lossless, error-free architecture has been proposed for the Daubechies-8 (D8) wavelet. Several lifting variants have been derived for the same wavelet, integer mapping has been applied, and the best variant has been determined in terms of performance using entropy and transform coding gain. A theory has then been derived regarding the impact of scaling steps on the transform coding gain (GT). To the best of our knowledge, this approach results in the lowest-cost lossless architecture for the D8 in the literature, and it may be applied to other orthogonal wavelets, including biorthogonal ones, to achieve higher performance. In the final phase, a general algorithm has been proposed to convert the original filter coefficients, expressed as a polyphase matrix, into a more efficient lifting structure. This is done using a modified factorization, so that the factorized polyphase matrix does not include the lossy scaling step of the conventional lifting method.
This general technique has been applied to several widely used orthogonal and biorthogonal wavelets, and its advantages have been discussed. Since the discrete wavelet transform is used in a vast number of applications, the proposed algorithms can be utilized in those cases to achieve lossless, low-cost, and hardware-friendly architectures.
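    As an illustration of the lifting principle behind lossless, integer-only wavelet transforms, the sketch below implements the standard integer lifting steps of the CDF 5/3 (LeGall) wavelet and verifies exact reconstruction; the thesis targets the Daubechies family, so this is a simplified stand-in rather than the proposed architecture.

```python
# Integer-to-integer lifting for the CDF 5/3 wavelet: predict, then update, all in
# integer arithmetic, so the inverse recovers the input exactly (lossless).
import numpy as np

def lift_53_forward(x):
    """One level of forward integer lifting (even-length input, symmetric extension)."""
    x = np.asarray(x, dtype=np.int64)
    even, odd = x[0::2].copy(), x[1::2].copy()

    # Predict step: detail = odd sample minus floored mean of neighbouring evens.
    even_right = np.append(even[1:], even[-1])
    d = odd - ((even + even_right) >> 1)

    # Update step: approximation = even sample plus rounded quarter of neighbouring details.
    d_left = np.insert(d[:-1], 0, d[0])
    s = even + ((d_left + d + 2) >> 2)
    return s, d

def lift_53_inverse(s, d):
    """Undo the update, then the predict, then interleave even/odd samples."""
    d_left = np.insert(d[:-1], 0, d[0])
    even = s - ((d_left + d + 2) >> 2)
    even_right = np.append(even[1:], even[-1])
    odd = d + ((even + even_right) >> 1)
    x = np.empty(even.size + odd.size, dtype=np.int64)
    x[0::2], x[1::2] = even, odd
    return x

if __name__ == "__main__":
    x = np.random.randint(0, 256, size=64)
    s, d = lift_53_forward(x)
    assert np.array_equal(lift_53_inverse(s, d), x)   # exact, lossless reconstruction
```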

    Image Annotation and Topic Extraction Using Super-Word Latent Dirichlet

    This research presents a multi-domain solution that uses text and images to iteratively improve automated information extraction. Stage I uses local text surrounding an embedded image to provide clues that help rank-order possible image annotations. These annotations are forwarded to Stage II, where they are used as highly relevant super-words to improve topic extraction. The model probabilities from the super-words in Stage II are forwarded to Stage III, where they are used to refine the automated image annotation developed in Stage I. All stages demonstrate improvement over existing equivalent algorithms in the literature.
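    A hypothetical sketch of the Stage II super-word idea: annotation terms from Stage I are up-weighted in the document-term counts before topic extraction with LDA. The boost factor, toy documents, and annotation list are assumptions, not the paper's configuration.

```python
# Up-weighting Stage I annotations as "super-words" before fitting LDA.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = ["the cat sat near the harbour boats", "sailing boats left the harbour at dawn"]
stage1_annotations = ["boats", "harbour"]   # image annotations from Stage I (assumed)
SUPERWORD_BOOST = 3                         # how strongly annotations dominate topics (assumed)

vec = CountVectorizer()
counts = vec.fit_transform(docs).toarray()

# Boost the counts of super-word columns so they behave as highly relevant terms.
vocab = vec.vocabulary_
for w in stage1_annotations:
    if w in vocab:
        counts[:, vocab[w]] *= SUPERWORD_BOOST

lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(counts)      # per-document topic probabilities
print(doc_topics)                           # forwarded to Stage III to refine annotations
```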

    Wavelet Theory: for Economic & Financial Cycles

    Cycles, their nature, their implications for humankind, and the study thereof have sparked important philosophical debates since pre-historic times. Notable contributions by philosophers, mathematicians, historians, and economists such as Pareto, Deulofeu, Danielewski, Kuznets, Kondratiev, Elliot, and many others show how cycles and their study have been deemed important throughout the history of scientific and philosophical inquiry. In particular, the explication of business, economic, and financial cycles has received significant research and policy attention. Nevertheless, most of the methodologies employed in this space are either purely empirical, time-series based, or the Regime-Switching Markov model popularized in economics by James Hamilton. In this work, we develop a statistical, non-linear model based on circle geometry that is applicable to the dating of cycles. This study proposes a scalable, smooth, and differentiable quarter-circular wavelet basis for the smoothing and dating of business, economic, and financial cycles. The dating then enables forecasting of the cyclical patterns in the evolution of business, economic, and financial time series. The practical significance of dating and forecasting business and financial cycles cannot be over-emphasized. The use of wavelet decomposition in explaining cycles can be seen as a critical contribution of spectral methods of statistical modelling to finance and economic policy at large. Being a relatively new method, wavelet analysis has seen great contributions in geophysical modelling, and this study endeavours to widen the use and application of frequency-time decomposition in the economic and financial space. Wavelets are localized in both time and frequency, so there is no loss of time resolution; this preservation of time resolution is a fundamental strength exploited in the dating of cycles. Thesis (DPhil) -- Faculty of Science, Mathematical Statistics, 201
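    For illustration only, the sketch below applies a continuous wavelet transform (PyWavelets, Morlet wavelet) to a simulated cyclical series to show the time-frequency localization the abstract relies on for dating cycles; the quarter-circular wavelet basis proposed in the thesis is not implemented here, and the series and scales are assumptions.

```python
# Time-frequency view of a simulated "business cycle" series via the CWT.
import numpy as np
import pywt

t = np.arange(0, 240)                                  # e.g. 20 years of monthly data (assumed)
series = (np.sin(2 * np.pi * t / 60)                   # long cycle (~5 years)
          + 0.5 * np.sin(2 * np.pi * t / 18)           # short cycle (~1.5 years)
          + 0.2 * np.random.randn(t.size))             # noise

scales = np.arange(1, 64)
coefs, freqs = pywt.cwt(series, scales, "morl")        # scale x time coefficient map
power = np.abs(coefs) ** 2
dominant_scale = scales[power.mean(axis=1).argmax()]   # scale of the strongest cycle
print("dominant scale:", dominant_scale)
```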

    Proceedings of the second "international Traveling Workshop on Interactions between Sparse models and Technology" (iTWIST'14)

    The implicit objective of the biennial "international Traveling Workshop on Interactions between Sparse models and Technology" (iTWIST) is to foster collaboration between international scientific teams by disseminating ideas through both specific oral/poster presentations and free discussions. For its second edition, the iTWIST workshop took place in the medieval and picturesque town of Namur in Belgium, from Wednesday August 27th till Friday August 29th, 2014. The workshop was conveniently located in "The Arsenal" building within walking distance of both hotels and town center. iTWIST'14 gathered about 70 international participants and featured 9 invited talks, 10 oral presentations, and 14 posters on the following themes, all related to the theory, application and generalization of the "sparsity paradigm": Sparsity-driven data sensing and processing; Union of low dimensional subspaces; Beyond linear and convex inverse problem; Matrix/manifold/graph sensing/processing; Blind inverse problems and dictionary learning; Sparsity and computational neuroscience; Information theory, geometry and randomness; Complexity/accuracy tradeoffs in numerical methods; Sparsity? What's next?; Sparse machine learning and inference. Comment: 69 pages, 24 extended abstracts, iTWIST'14 website: http://sites.google.com/site/itwist1

    HEART RHYTHM CLASSIFICATION FROM STATIC AND ECG TIME-SERIES DATA USING HYBRID MULTIMODAL DEEP LEARNING

    Cardiovascular arrhythmias are among the most common causes of death around the world. Abnormal arrhythmias can be identified by analyzing heart rhythm using an electrocardiogram (ECG). However, this analysis is performed manually by cardiologists, so it may be subjective, dependent on each cardiologist's observations and experience, and susceptible to noise and irregularities in the signals, which can lead to misdiagnosis. Motivated by this challenge, an automated heart rhythm diagnosis approach from ECG signals using deep learning has been proposed. In order to achieve this goal, three research problems have been addressed: first, recognizing the role of each single lead of a 12-lead ECG in classifying heart rhythms; second, understanding the importance of static data (e.g., demographics and clinical profile) in classifying heart rhythms; and third, determining whether the static data can be combined with the ECG time-series data for better classification performance. In this thesis, different deep learning models have been proposed to address these problems, and satisfactory results have been achieved. Using this knowledge, an effective hybrid deep learning model to classify heart rhythms has been proposed. To the best of the author's knowledge of the relevant literature, this is the first work to identify the importance of individual and combined leads, as well as the importance of combining static data with ECG time-series data, in classifying heart rhythms. Extensive experiments have been performed to evaluate this algorithm on a 12-lead ECG database containing data from more than 10,000 individual subjects, achieving high average accuracy (up to 98.7%) and F1-measure (up to 98.7%). Moreover, the distribution of heart rhythms in the database by rhythm type, gender, and age group has been analyzed, which will be valuable for further improvement of classification performance. This study provides valuable insights, offers an effective tool for automated heart rhythm classification, and will assist cardiologists in effectively and accurately diagnosing heart disease.
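    A minimal, hypothetical sketch of a hybrid multimodal classifier in the spirit of the thesis: a 1D-CNN branch over 12-lead ECG time series fused with a dense branch over static data (demographics and clinical profile). Input shapes, layer sizes, and the number of rhythm classes are assumptions, not the thesis architecture.

```python
# Two-branch Keras model: ECG time series + static features, fused before classification.
import tensorflow as tf
from tensorflow.keras import layers, Model

ECG_SAMPLES, ECG_LEADS, STATIC_FEATURES, NUM_RHYTHMS = 5000, 12, 8, 9  # assumed

# Time-series branch: stacked 1D convolutions over the raw 12-lead ECG.
ecg_in = layers.Input(shape=(ECG_SAMPLES, ECG_LEADS), name="ecg")
x = layers.Conv1D(32, kernel_size=7, activation="relu")(ecg_in)
x = layers.MaxPooling1D(4)(x)
x = layers.Conv1D(64, kernel_size=5, activation="relu")(x)
x = layers.GlobalAveragePooling1D()(x)

# Static branch: demographics and clinical profile.
static_in = layers.Input(shape=(STATIC_FEATURES,), name="static")
s = layers.Dense(16, activation="relu")(static_in)

# Fusion: concatenate both representations and classify the rhythm.
merged = layers.concatenate([x, s])
h = layers.Dense(64, activation="relu")(merged)
out = layers.Dense(NUM_RHYTHMS, activation="softmax", name="rhythm")(h)

model = Model(inputs=[ecg_in, static_in], outputs=out)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.summary()
```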

    A systematic review of physiological signals based driver drowsiness detection systems.

    Driving a vehicle is a complex, multidimensional, and potentially risky activity demanding full mobilization of physiological and cognitive abilities. Drowsiness, often caused by stress, fatigue, or illness, degrades the cognitive capabilities that driving depends on and causes many accidents. Drowsiness-related road accidents are associated with trauma, physical injuries, and fatalities, and often bring economic loss; drowsiness-related crashes are most common among young people and night-shift workers. Real-time and accurate driver drowsiness detection is necessary to bring down the rate of drowsy-driving accidents. Many researchers have worked on systems that detect drowsiness using features related to the vehicle, the driver's behavior, and physiological measures. In view of the rising use of physiological measures, this study presents a comprehensive and systematic review of recent techniques for detecting driver drowsiness using physiological signals. Different sensors augmented with machine learning are utilized, which subsequently yields better results. The techniques are analyzed with respect to several aspects, such as the data collection sensor, the environment (controlled or dynamic), and the experimental setup (real traffic or driving simulators). Similarly, by investigating the types of sensors involved in the experiments, this study discusses the advantages and disadvantages of existing studies and points out research gaps. Observations are then made to provide future research directions for drowsiness detection techniques based on physiological signals.
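    As a hedged illustration of the kind of pipeline the reviewed studies employ, the sketch below extracts band-power features from a physiological signal and feeds them to a machine-learning classifier; the sampling rate, frequency bands, synthetic data, and classifier choice are all assumptions.

```python
# Band-power features from a physiological signal (e.g. single-channel EEG) + classifier.
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import RandomForestClassifier

FS = 128                                               # assumed sampling rate (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(epoch):
    """Average spectral power in each band for one 10 s epoch."""
    f, pxx = welch(epoch, fs=FS, nperseg=FS * 2)
    return [pxx[(f >= lo) & (f < hi)].mean() for lo, hi in BANDS.values()]

# Toy data: 100 epochs of 10 s each, with synthetic alert/drowsy labels.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((100, FS * 10))
labels = rng.integers(0, 2, size=100)                  # 0 = alert, 1 = drowsy (synthetic)

X = np.array([band_powers(e) for e in epochs])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X[:80], labels[:80])
print("held-out accuracy:", clf.score(X[80:], labels[80:]))
```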