282 research outputs found

    Scalable video/image transmission using rate compatible PUM turbo codes

    The robust delivery of video over emerging wireless networks poses many challenges due to the heterogeneity of access networks, the variations in streaming devices, and the expected variations in network conditions caused by interference and coexistence. The proposed approach exploits the joint optimization of a wavelet-based scalable video/image coding framework and a forward error correction method based on PUM turbo codes. The scheme minimizes the reconstructed image/video distortion at the decoder subject to a constraint on the overall transmission bitrate budget. The minimization is achieved by exploiting a rate optimization technique and the statistics of the transmission channel.
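The constrained minimization described above can be illustrated with a greedy rate-distortion selection over scalable layers. This is a minimal sketch under assumed inputs: the `allocate_rate` function, its layer rate/distortion numbers, and the greedy slope heuristic are illustrative stand-ins, not the paper's PUM-turbo-specific optimizer.

```python
# Sketch of rate-constrained distortion minimization for a scalable stream.
# Each layer is a hypothetical (rate_bits, distortion_drop) pair; we pick
# layers by distortion reduction per bit until the bitrate budget is spent.

def allocate_rate(layers, budget):
    """Greedily select scalable layers under a total bitrate budget.

    layers: list of (rate_bits, distortion_drop) tuples, one per layer.
    Returns the sorted indices of the selected layers.
    """
    # order layers by rate-distortion slope (distortion reduction per bit)
    order = sorted(range(len(layers)),
                   key=lambda i: layers[i][1] / layers[i][0],
                   reverse=True)
    chosen, spent = [], 0
    for i in order:
        rate, _ = layers[i]
        if spent + rate <= budget:
            chosen.append(i)
            spent += rate
    return sorted(chosen)
```

In a real scheme the "layers" would be joint source/channel configurations whose distortion estimates already fold in the channel statistics; here they are plain numbers for clarity.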

    Hyperspectral image compression: adapting SPIHT and EZW to Anisotropic 3-D Wavelet Coding

    Hyperspectral images have specific characteristics that an efficient compression system should exploit. In compression, wavelets have shown good adaptability to a wide range of data while remaining of reasonable complexity, and wavelet-based compression algorithms have been used successfully on several hyperspectral space missions. This paper focuses on optimizing a full wavelet compression system for hyperspectral images. Each step of the compression algorithm is studied and optimized. First, an algorithm is defined to find the optimal 3-D wavelet decomposition in a rate-distortion sense. It is then shown that a specific fixed decomposition achieves almost the same performance while being preferable in terms of complexity, and that this decomposition significantly improves on the classical isotropic decomposition. One of its most useful properties is that it permits zerotree algorithms. Various tree structures, which create relationships between coefficients, are compared. Two efficient compression methods based on zerotree coding (EZW and SPIHT) are adapted to this near-optimal decomposition with the best tree structure found. Performance is compared with an adaptation of JPEG 2000 for hyperspectral images on six areas with different statistical properties.
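The anisotropic idea — applying a different number of decomposition levels along the spatial and spectral axes — can be sketched with a separable Haar transform. This is an assumption-laden toy: the `haar_step`/`anisotropic_decompose` names and the level counts are illustrative, not the paper's optimized decomposition.

```python
import numpy as np

def haar_step(x, axis):
    """One Haar analysis step along the given axis (axis length must be even)."""
    x = np.moveaxis(x, axis, 0)
    lo = (x[0::2] + x[1::2]) / 2.0   # averages (low-pass)
    hi = (x[0::2] - x[1::2]) / 2.0   # differences (high-pass)
    return np.moveaxis(np.concatenate([lo, hi]), 0, axis)

def anisotropic_decompose(cube, levels=(1, 1, 2)):
    """Apply a different number of Haar levels per axis of a data cube.

    More levels along the last (spectral) axis reflects the strong
    inter-band correlation of hyperspectral data; the level counts here
    are illustrative assumptions.
    """
    out = cube.astype(float)
    for axis, n in enumerate(levels):
        size = out.shape[axis]
        for _ in range(n):
            # recurse on the low-pass half along this axis only
            sl = [slice(None)] * out.ndim
            sl[axis] = slice(0, size)
            out[tuple(sl)] = haar_step(out[tuple(sl)], axis)
            size //= 2
    return out
```

On a constant cube every high-pass coefficient comes out zero, which is the property zerotree coders such as EZW and SPIHT exploit across the resulting subbands.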

    Reduced reference image and video quality assessments: review of methods

    With the growing demand for image- and video-based applications, the need for consistent quality assessment metrics for images and video has increased. Different approaches have been proposed in the literature to estimate the perceptual quality of images and videos. These approaches fall into three main categories: full reference (FR), reduced reference (RR), and no reference (NR). In RR methods, instead of providing the original image or video as a reference, certain features (e.g., texture or edges) of the original are provided for quality assessment. During the last decade, RR-based quality assessment has been a popular research area for applications such as social media, online games, and video streaming. In this paper, we present a review and classification of the latest research on RR-based image and video quality assessment. We also summarize the databases used in 2D and 3D image and video quality assessment. This paper should help specialists and researchers stay informed about recent progress in RR-based image and video quality assessment, provide an understanding of multimedia quality assessment and the state-of-the-art approaches used for analysis, and help the reader select appropriate quality assessment methods and parameters for their applications.
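The core RR idea — transmitting a compact feature of the original instead of the original itself, then comparing features at the receiver — can be sketched as follows. The edge-density feature and its threshold are illustrative choices, not a metric from the surveyed literature.

```python
import numpy as np

def rr_feature(img, thresh=10.0):
    """Reduced-reference feature: fraction of strong horizontal gradients.

    Only this scalar, not the image, would be sent alongside the content.
    The gradient threshold is an illustrative assumption.
    """
    grad = np.abs(np.diff(img.astype(float), axis=1))
    return float((grad > thresh).mean())

def rr_quality_score(ref_feature, distorted):
    """Distance between reference and received features; smaller is better."""
    return abs(ref_feature - rr_feature(distorted))
```

A blurred or flattened copy loses edges, so its feature drops and the score grows, while an identical copy scores exactly zero.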

    The Wavelet Transform for Image Processing Applications


    On the Efficient Broadcasting of Heterogeneous Services over Band-Limited Channels: Unequal Power Allocation for Wavelet Packet Division Multiplexing

    Multiple transmission of heterogeneous services is a central aspect of broadcasting technology, and the design of efficient communication systems in this framework is often complicated by stringent bandwidth constraints. In wavelet packet division multiplexing (WPDM), the message signals are waveform-coded onto wavelet packet basis functions. The overlapping nature of these waveforms in both time and frequency improves performance over the commonly used FDM and TDM schemes, while their orthogonality allows the message signals to be extracted by a simple correlator receiver. Furthermore, the scalable structure of WPDM makes it suitable for broadcasting heterogeneous services. This work investigates unequal error protection (UEP) of data with different sensitivities to channel errors to improve the performance of WPDM over band-limited channels. To cope with the bandwidth constraint, a distribution of power among waveforms is proposed that is driven by the channel error sensitivities of the carried message signals under Gaussian noise. We address this problem with genetic algorithms (GAs), which allow a flexible, suboptimal solution at reduced complexity. The mean square error (MSE) between the original and decoded messages, which correlates strongly with subjective perception, is used as the optimization criterion.
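A GA-driven unequal power allocation can be sketched as below. The fitness function is only an MSE-like proxy (sum of sensitivity over power, so error-sensitive messages attract more power); it, the parameters, and the `ga_power_allocation` name are assumptions, not the paper's formulation.

```python
import random

def ga_power_allocation(sens, total_power=1.0, pop=30, gens=80, seed=0):
    """Toy genetic algorithm for unequal power allocation across waveforms.

    sens: per-message channel error sensitivities (larger = more fragile).
    Returns a power vector summing to total_power.
    """
    rng = random.Random(seed)
    n = len(sens)

    def normalize(p):
        s = sum(p)
        return [total_power * x / s for x in p]

    def fitness(p):  # illustrative MSE proxy; lower is better
        return sum(s / x for s, x in zip(sens, p))

    # seed the population with the uniform split plus random candidates
    population = [normalize([1.0] * n)]
    population += [normalize([rng.uniform(0.1, 1.0) for _ in range(n)])
                   for _ in range(pop - 1)]
    for _ in range(gens):
        population.sort(key=fitness)
        survivors = population[:pop // 2]          # elitist selection
        children = []
        while len(survivors) + len(children) < pop:
            a, b = rng.sample(survivors, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]   # crossover
            i = rng.randrange(n)
            child[i] *= rng.uniform(0.8, 1.25)            # mutation
            children.append(normalize(child))
        population = survivors + children
    return min(population, key=fitness)
```

Because the uniform allocation is in the initial population and elitism preserves the best candidate, the result never does worse than equal power sharing under this proxy.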

    Prioritizing Content of Interest in Multimedia Data Compression

    Image and video compression techniques make data transmission and storage in digital multimedia systems more efficient and feasible given a system's limited storage and bandwidth. Many generic image and video compression techniques, such as JPEG and H.264/AVC, have been standardized and are now widely adopted. Despite their great success, we observe that these standard compression techniques are not the best solution for data compression in special types of multimedia systems such as microscopy videos and low-power wireless broadcast systems. In these application-specific systems, where the content of interest in the multimedia data is known and well defined, we should rethink the design of the data compression pipeline. We hypothesize that by identifying and prioritizing the content of interest in multimedia data, new compression methods can be invented that are far more effective than standard techniques. In this dissertation, a set of new data compression methods based on the idea of prioritizing the content of interest is proposed for three different kinds of multimedia systems. I show that the key to designing efficient compression techniques in these three cases is to prioritize the content of interest in the data, whose definition depends on the application. First, I show that for microscopy videos, the content of interest is the set of spatial regions in the video frame whose pixels contain more than noise; keeping those regions at high quality and discarding other information yields a novel microscopy video compression technique. Second, I show that for a Bluetooth low energy beacon based system, practical multimedia data storage and transmission are possible by prioritizing content of interest; I designed custom image compression techniques that preserve edges in a binary image, or foreground regions of a color image of indoor or outdoor objects. Last, I present a new indoor Bluetooth low energy beacon based augmented reality system that integrates a 3D moving object compression method that prioritizes the content of interest.
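The microscopy case — keep blocks whose pixels carry more than noise, drop the rest — can be sketched with a block-variance detector. The block size, noise-variance threshold, and function names are illustrative assumptions; the dissertation's actual detector is more elaborate.

```python
import numpy as np

def content_mask(frame, block=8, noise_var=4.0):
    """Flag blocks whose variance exceeds a noise floor as content of interest."""
    h, w = frame.shape
    mask = np.zeros((h // block, w // block), dtype=bool)
    for i in range(mask.shape[0]):
        for j in range(mask.shape[1]):
            tile = frame[i*block:(i+1)*block, j*block:(j+1)*block]
            mask[i, j] = tile.var() > noise_var
    return mask

def prioritize(frame, mask, block=8):
    """Keep content blocks untouched; zero out noise-only blocks."""
    out = np.zeros_like(frame)
    for i in range(mask.shape[0]):
        for j in range(mask.shape[1]):
            if mask[i, j]:
                out[i*block:(i+1)*block, j*block:(j+1)*block] = \
                    frame[i*block:(i+1)*block, j*block:(j+1)*block]
    return out
```

A standard coder then spends its bitrate almost entirely on the surviving blocks, since the zeroed background compresses to nearly nothing.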

    Optimized Transmission of JPEG2000 Streams Over Wireless Channels

    The transmission of JPEG2000 images over wireless channels is examined using reorganization of the compressed images into error-resilient, product-coded streams. The product code combines turbo codes and Reed-Solomon codes, which are optimized using an iterative process. The stream to be transmitted is generated directly from compressed JPEG2000 streams. The resulting scheme is tested for the transmission of compressed JPEG2000 images over wireless channels and is shown to outperform other recently proposed algorithms for the wireless transmission of images.
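The product-code layout — data arranged in a grid, rows protected by one code and columns by another — can be shown with single-parity checks standing in for the actual turbo and Reed-Solomon codes, which are far too involved for a sketch. Everything here (functions, grid shape) is an illustrative assumption.

```python
def product_encode(bits, rows=3, cols=4):
    """Toy product code: row parity stands in for the turbo code and
    column parity for the Reed-Solomon code of the scheme above.

    bits: rows*cols data bits. Returns a (rows+1) x (cols+1) bit grid.
    """
    assert len(bits) == rows * cols
    grid = [bits[r*cols:(r+1)*cols] for r in range(rows)]
    for row in grid:
        row.append(sum(row) % 2)                           # row parity
    grid.append([sum(col) % 2 for col in zip(*grid)])      # column parity
    return grid

def correct_single_error(grid):
    """Locate and flip one corrupted bit via the failing row/column parities."""
    bad_r = [r for r, row in enumerate(grid) if sum(row) % 2]
    bad_c = [c for c in range(len(grid[0]))
             if sum(row[c] for row in grid) % 2]
    if bad_r and bad_c:
        grid[bad_r[0]][bad_c[0]] ^= 1
    return grid
```

The intersection of a failing row check and a failing column check pinpoints a single bit flip, mirroring how the real scheme's two component codes cross-check each other.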