
    Evolutionary Computing and Second Generation Wavelet Transform Optimization: Current State of the Art

    Evolutionary Computation techniques have been applied to a number of domains to achieve optimization. One of those domains is second generation wavelet transforms for image compression. Various types of lifting schemes have been introduced in the recent literature; because the family grows incrementally and new variants appear continually, developing flexible and adaptive optimization approaches is a serious challenge. Evolutionary Computing based lifting scheme optimization techniques are a valuable technology for achieving better results in image compression. However, despite the variety of such methods described in the literature in recent years, several important problems remain to be solved. In this paper, we present a review of the most well-known EC approaches for optimizing second generation wavelet transforms.
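
    To make the connection concrete, the following is a minimal, illustrative sketch (not taken from any of the surveyed papers) of how a single lifting step exposes predict/update coefficients that an evolutionary algorithm could treat as a genome; the function name and the two-tap structure are assumptions.

        import numpy as np

        def lifting_step(signal, predict_coeffs, update_coeffs):
            """One lifting step: split, predict, update (assumes an even-length signal).

            predict_coeffs and update_coeffs are the values an evolutionary
            search would tune to adapt the transform to the input imagery.
            """
            even, odd = signal[0::2], signal[1::2]
            # Predict: estimate each odd sample from its even neighbours
            detail = odd - predict_coeffs[0] * even - predict_coeffs[1] * np.roll(even, -1)
            # Update: smooth the even samples with the prediction residual
            approx = even + update_coeffs[0] * detail + update_coeffs[1] * np.roll(detail, 1)
            return approx, detail

        # Example starting genome: the 5/3 (CDF 2,2) lifting coefficients
        approx, detail = lifting_step(np.arange(8, dtype=float), (0.5, 0.5), (0.25, 0.25))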

    Evolutionary Approach to Improve Wavelet Transforms for Image Compression in Embedded Systems

    A bioinspired, evolutionary algorithm for optimizing wavelet transforms, oriented to improving image compression in embedded systems, is proposed, modelled, and validated here. A simplified version of an Evolution Strategy, using fixed-point arithmetic and a hardware-friendly mutation operator, has been chosen as the search algorithm. Several reductions in computing requirements have been applied to the original algorithm, adapting it for an FPGA implementation. The work presented in this paper describes the algorithm as well as the test strategy developed to validate it, showing several results from the effort to find a suitable set of parameters that assure success in the evolutionary search. The results show how high-quality transforms are evolved from scratch with limited-precision arithmetic and a simplified algorithm. Since the intended deployment platform is an FPGA, HW/SW partitioning issues are also considered, and code profiling has been carried out to validate the proposal, showing some preliminary results of the proposed hardware architecture.
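
    The paper's exact operators and settings are not reproduced here; the sketch below only illustrates, under assumed names and parameters, the general flavour of a simplified (1+lambda) evolution strategy whose genomes are fixed-point integers and whose mutation avoids floating-point noise generation.

        import random

        FRAC_BITS = 8  # assumed fixed-point format: real value = integer / 2**FRAC_BITS

        def mutate(genome, step=1):
            """Hardware-friendly mutation: nudge one fixed-point gene by +/- one step,
            so no floating-point random number generation is required."""
            child = list(genome)
            i = random.randrange(len(child))
            child[i] += random.choice((-step, step))
            return child

        def evolve(parent, evaluate, generations=200, offspring=4):
            """Minimal (1 + lambda) evolution strategy over fixed-point genomes;
            evaluate() is a user-supplied fitness function (higher is better)."""
            best, best_fit = parent, evaluate(parent)
            for _ in range(generations):
                for _ in range(offspring):
                    child = mutate(best)
                    fit = evaluate(child)
                    if fit >= best_fit:
                        best, best_fit = child, fit
            return best, best_fit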

    Accelerating FPGA-based evolution of wavelet transform filters by optimized task scheduling

    Adaptive embedded systems are required in various applications. This work addresses these needs in the area of adaptive image compression in FPGA devices. A simplified version of an evolution strategy is utilized to optimize the wavelet filters of a Discrete Wavelet Transform algorithm. We propose an adaptive image compression system in FPGA where an optimized memory architecture, parallel processing, and optimized task scheduling reduce the time of evolution. The proposed solution has been extensively evaluated in terms of both compression quality and processing time. The proposed architecture reduces the evolution time by 44% compared to our previous reports while keeping the compression quality unchanged with respect to existing implementations. The system is able to find an optimized set of wavelet filters in less than 2 minutes whenever the type of input data changes.
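
    The concrete memory architecture and schedule are not shown here; the sketch below is only a hedged illustration of the scheduling idea, overlapping CPU-side mutation with a placeholder blocking call (evaluate_on_fpga) that would stream one candidate filter set to the accelerator and return its fitness.

        from concurrent.futures import ThreadPoolExecutor

        def evolve_pipelined(parent, mutate, evaluate_on_fpga, generations=100):
            """Overlap CPU-side mutation with FPGA-side fitness evaluation.

            Note: mutation uses the current best, which may lag one generation
            behind the in-flight evaluation - a typical pipelining trade-off.
            """
            best, best_fit = parent, evaluate_on_fpga(parent)
            with ThreadPoolExecutor(max_workers=1) as hw:
                child = mutate(best)
                pending = hw.submit(evaluate_on_fpga, child)   # evaluation starts on HW
                for _ in range(generations):
                    next_child = mutate(best)                  # CPU work overlaps HW work
                    fit, evaluated = pending.result(), child
                    child = next_child
                    pending = hw.submit(evaluate_on_fpga, child)
                    if fit >= best_fit:
                        best, best_fit = evaluated, fit
                fit = pending.result()                         # drain the last evaluation
                if fit >= best_fit:
                    best, best_fit = child, fit
            return best, best_fit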

    On the Equivalence Between a Minimal Codomain Cardinality Riesz Basis Construction, a System of Hadamard–Sylvester Operators, and a Class of Sparse, Binary Optimization Problems

    Piecewise, low-order polynomial Riesz basis families are constructed such that they share the same coefficient functionals as smoother, orthonormal bases on a localized indexing subset. It is shown that a minimal-cardinality basis codomain can be realized by inducing sparsity, via l1 regularization, in the distributional derivatives of the basis functions, and that the optimal construction can be found numerically by constrained binary optimization over a suitably large dictionary. Furthermore, it is shown that a subset of these solutions is equivalent to a specific, constrained analytical solution derived via Sylvester-type Hadamard operators.
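
    The abstract does not spell out the objective, so the following is only a generic, assumed LaTeX sketch of a sparse, binary dictionary-selection problem of the kind described; the symbols (psi_i for dictionary atoms, D for the distributional derivative, Lambda for the localized index set, phi_j for the smoother orthonormal basis) are illustrative, not the paper's.

        \min_{b \in \{0,1\}^{N}} \; \sum_{i=1}^{N} b_i \,\bigl\| D\psi_i \bigr\|_{1}
        \quad \text{subject to} \quad
        \{\psi_i : b_i = 1\} \ \text{is a Riesz basis sharing the coefficient functionals of } \{\varphi_j\}_{j \in \Lambda}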

    A Primal-Dual Proximal Algorithm for Sparse Template-Based Adaptive Filtering: Application to Seismic Multiple Removal

    Unveiling meaningful geophysical information from seismic data requires dealing with both random and structured "noises". As their amplitude may be greater than that of the signals of interest (primaries), additional prior information is especially important for efficient signal separation. We address here the problem of multiple reflections, caused by wave-field bouncing between layers. Since only approximate models of these phenomena are available, we propose a flexible framework for time-varying adaptive filtering of seismic signals, using sparse representations based on inaccurate templates. We recast the joint estimation of the adaptive filters and primaries in a new convex variational formulation. This approach allows us to incorporate plausible knowledge about noise statistics, data sparsity, and slow filter variation in parsimony-promoting wavelet frames. The designed primal-dual algorithm solves a constrained minimization problem that alleviates the usual difficulty of tuning regularization hyperparameters. The approach demonstrates good performance in low signal-to-noise-ratio conditions, both for simulated and real field seismic data.
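
    As a rough, assumed illustration of the kind of constrained formulation described (the notation below is not the paper's): with z the recorded trace, y the primaries, h the slowly varying adaptive filters, T(h) the filtered templates, and W a wavelet-frame analysis operator,

        \min_{y,\; h} \;\; \| W y \|_{1} \;+\; \lambda \, \| \nabla_{t} h \|_{1}
        \quad \text{subject to} \quad \| z - y - T(h) \|_{2} \,\le\, \varepsilon

    where the l1 terms promote sparsity of the primaries in the frame and slow temporal variation of the filters, and the constraint replaces a data-fit regularization weight with a noise-level bound epsilon, which is the sense in which hyperparameter tuning is alleviated; constrained problems of this form are commonly handled with primal-dual proximal splitting iterations.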

    Evolutionary design and optimization of Wavelet Transforms for image compression in embedded systems

    This paper describes initial studies of an Evolution Strategy, aimed at implementation on embedded systems, for the evolution of Wavelet Transforms for image compression. Previous works in the literature have already proved useful for this application, but they are highly computationally intensive. Therefore, the work described here deals with the simplifications made to those algorithms to reduce their computing requirements. Several optimizations have been made in the evaluation phase and in the EA operators. The results presented show how the proposed algorithm simplifications still allow good results to be achieved, while effectively reducing the computing requirements.
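
    As an assumed, minimal illustration of what an evaluation phase for an evolved lifting transform can look like (not the paper's actual simplified evaluation), the sketch below scores a candidate by reconstruction error after discarding the smallest detail coefficients, a crude stand-in for compression quality; function and parameter names are placeholders.

        import numpy as np

        def forward(signal, p, u):
            """Single-coefficient lifting step (assumes an even-length signal)."""
            even, odd = signal[0::2], signal[1::2]
            detail = odd - p * even
            approx = even + u * detail
            return approx, detail

        def inverse(approx, detail, p, u):
            """Exact inverse of forward(): undo update, then undo predict."""
            even = approx - u * detail
            odd = detail + p * even
            out = np.empty(even.size + odd.size)
            out[0::2], out[1::2] = even, odd
            return out

        def fitness(signal, genome, keep=0.25):
            """Negative reconstruction error after keeping only the largest detail
            coefficients: a crude proxy for how well the transform compresses."""
            p, u = genome
            approx, detail = forward(signal, p, u)
            cutoff = np.quantile(np.abs(detail), 1.0 - keep)
            detail = np.where(np.abs(detail) >= cutoff, detail, 0.0)
            return -np.mean((signal - inverse(approx, detail, p, u)) ** 2)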

    Graph Signal Processing: Overview, Challenges and Applications

    Research in Graph Signal Processing (GSP) aims to develop tools for processing data defined on irregular graph domains. In this paper we first provide an overview of core ideas in GSP and their connection to conventional digital signal processing. We then summarize recent progress in developing basic GSP tools, including methods for sampling, filtering, and graph learning. Next, we review progress in several application areas using GSP, including processing and analysis of sensor network data, biological data, and applications to image processing and machine learning. We finish by providing a brief historical perspective to highlight how concepts recently developed in GSP build on top of prior research in other areas. (Comment: to appear in Proceedings of the IEEE.)
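
    As a small illustration of one core idea the overview covers (spectral filtering of a signal defined on a graph via the Laplacian eigendecomposition), here is an assumed numpy sketch; the graph, cutoff, and signal values are arbitrary examples, not taken from the paper.

        import numpy as np

        # Small illustrative graph: adjacency matrix of a 4-node path graph
        A = np.array([[0, 1, 0, 0],
                      [1, 0, 1, 0],
                      [0, 1, 0, 1],
                      [0, 0, 1, 0]], dtype=float)
        L = np.diag(A.sum(axis=1)) - A          # combinatorial graph Laplacian

        # Graph Fourier transform: Laplacian eigenvectors play the role of Fourier modes
        eigvals, eigvecs = np.linalg.eigh(L)

        def graph_lowpass(x, cutoff):
            """Attenuate the high graph-frequency components of a graph signal x."""
            x_hat = eigvecs.T @ x                # analysis (GFT)
            x_hat[eigvals > cutoff] = 0.0        # ideal low-pass in the graph spectral domain
            return eigvecs @ x_hat               # synthesis (inverse GFT)

        x = np.array([1.0, 3.0, 2.0, 4.0])       # one value per graph node
        print(graph_lowpass(x, cutoff=1.0))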

    Automated Accident Detection In Intersections Via Digital Audio Signal Processing

    The aim of this thesis is to design a system for automated accident detection in intersections. The input to the system is a three-second audio signal. The system can be operated in two modes: two-class and multi-class. The output of the two-class system is a label of "crash" or "non-crash". In the multi-class system, the output is the label "crash" or one of several non-crash incidents, including "pile drive", "brake", and "normal-traffic" sounds. The designed system has three main steps in processing the input audio signal: feature extraction, feature optimization, and classification. Five different methods of feature extraction are investigated and compared; they are based on the discrete wavelet transform, fast Fourier transform, discrete cosine transform, real cepstrum transform, and Mel-frequency cepstral transform. Linear discriminant analysis (LDA) is used to optimize the features obtained in the feature extraction stage by linearly combining them with different weights. Three types of statistical classifiers are investigated and compared: the nearest neighbor, nearest mean, and maximum likelihood methods. Data collected from Jackson, MS and Starkville, MS, along with crash signals obtained from the Texas Transportation Institute crash test facility, are used to train and test the designed system. The results showed that the wavelet-based feature extraction method with LDA and a maximum likelihood classifier is the optimal design. This wavelet-based system is computationally inexpensive compared to the other methods. The system produced classification accuracies of 95% to 100% when the input signal has a signal-to-noise ratio of at least 0 decibels. These results show that the system is capable of effectively classifying a given input audio signal as "crash" or "non-crash".
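
    As an assumed sketch of the kind of pipeline the thesis describes, the code below extracts wavelet sub-band energies with PyWavelets, projects them with LDA, and classifies with a nearest-mean classifier (one of the three classifiers investigated, standing in for the maximum likelihood method); the sampling rate, wavelet choice, and random placeholder data are not the thesis settings.

        import numpy as np
        import pywt
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.neighbors import NearestCentroid   # nearest-mean classifier

        def wavelet_features(clip, wavelet="db4", levels=5):
            """Energy of each DWT sub-band of a three-second audio clip."""
            coeffs = pywt.wavedec(clip, wavelet, level=levels)
            return np.array([np.sum(c ** 2) for c in coeffs])

        # Placeholder data: 20 random clips at an assumed 8 kHz, alternating labels
        rng = np.random.default_rng(0)
        clips = rng.standard_normal((20, 3 * 8000))
        labels = np.array(["crash", "non-crash"] * 10)

        X = np.vstack([wavelet_features(c) for c in clips])
        X = LinearDiscriminantAnalysis(n_components=1).fit_transform(X, labels)
        clf = NearestCentroid().fit(X, labels)
        print(clf.predict(X[:3]))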