2,812 research outputs found

    Fast Search Approaches for Fractal Image Coding: Review of Contemporary Literature

    Fractal Image Compression (FIC) was first conceptualized in 1989, and numerous models have since been developed. Fractals were initially observed and described through Iterated Function Systems (IFS), and IFS solutions were used for encoding images. The IFS representation of an image requires much less storage than the image itself, which motivated representing images in IFS form and shaped the development of fractal image compression systems. Reducing encoding time is essential for achieving practical compression, and the solutions surveyed in this study show that, despite the developments that have taken place, substantial room for improvement remains. From the exhaustive review of models presented here, it is evident that the FIC model has advanced considerably over time and has been adapted to image compression at varied levels. This study focuses on the existing literature on FIC and presents insights into its various models.
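
    The IFS idea sketched above can be illustrated with a minimal example (illustrative only, not from the reviewed literature): three contractive affine maps whose attractor is the Sierpinski triangle, rendered with the chaos game. The "image" is fully specified by six map coefficients, which is the storage saving the abstract alludes to.

```python
import random

# Three contractive affine maps w_i(x, y); their joint attractor is the
# Sierpinski triangle. The whole image is encoded by six coefficients.
MAPS = [
    lambda x, y: (0.5 * x,        0.5 * y),
    lambda x, y: (0.5 * x + 0.5,  0.5 * y),
    lambda x, y: (0.5 * x + 0.25, 0.5 * y + 0.5),
]

def chaos_game(n_points=10000, seed=0):
    """Approximate the IFS attractor by iterating randomly chosen maps,
    discarding a short transient before the orbit settles down."""
    rng = random.Random(seed)
    x, y = 0.0, 0.0
    points = []
    for i in range(n_points):
        x, y = rng.choice(MAPS)(x, y)
        if i > 20:  # skip the transient
            points.append((x, y))
    return points
```

    Decoding an IFS code is exactly this kind of iteration; encoding, i.e. finding maps for an arbitrary image, is the expensive inverse problem that the fast-search methods surveyed here target.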

    Candidate One-Way Functions and One-Way Permutations Based on Quasigroup String Transformations

    In this paper we propose a definition and construction of a new family of candidate one-way functions R_N : Q^N → Q^N, where Q = {0, 1, ..., s-1} is an alphabet with s elements. Special instances of these functions have the additional property of being permutations (i.e. one-way permutations). These one-way functions have the property that achieving a security level of 2^n computations to invert them requires only n bits of input. The construction is based on quasigroup string transformations. Since quasigroups in general do not have algebraic properties such as associativity, commutativity, or neutral elements, inverting these functions seems to require exponentially many readings from the lookup table that defines them (a Latin square) in order to check the satisfiability of the initial conditions, thus making them natural candidates for one-way functions. Comment: Submitted to a conference.
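
    The quasigroup string transformation underlying the construction can be sketched as follows (a toy 4-symbol Latin square of my own choosing; the paper's security argument concerns much larger, composed constructions, and a single e-transformation as shown here is trivially invertible):

```python
# A quasigroup (Q, *) on Q = {0, 1, 2, 3}, given by its Latin square:
# entry [r][c] is r * c; each symbol occurs once per row and per column.
LATIN = [
    [2, 1, 0, 3],
    [1, 2, 3, 0],
    [3, 0, 2, 1],
    [0, 3, 1, 2],
]

def e_transform(leader, string):
    """Quasigroup e-transformation: b_1 = l * a_1, b_i = b_{i-1} * a_i."""
    out, prev = [], leader
    for a in string:
        prev = LATIN[prev][a]
        out.append(prev)
    return out

def e_inverse(leader, transformed):
    """Invert one e-transformation; each row of a Latin square is a
    permutation, so a_i is recovered by a row lookup."""
    out, prev = [], leader
    for b in transformed:
        out.append(LATIN[prev].index(b))
        prev = b
    return out
```

    The candidate one-way functions compose many such transformations with different leaders; it is that composition, not a single pass, whose inversion appears to require exponentially many table lookups.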

    A Review on Block Matching Motion Estimation and Automata Theory based Approaches for Fractal Coding

    Fractal compression is a lossy compression technique in the field of gray/color image and video compression. It gives a high compression ratio and good image quality with fast decoding, but improving encoding time remains a challenge. This review presents an analysis of the most significant existing approaches in the field of fractal-based gray/color image and video compression: block matching motion estimation approaches for finding the motion vectors in a frame, based on inter-frame coding and intra-frame coding (i.e. individual-frame coding), and automata-theory-based coding approaches for representing an image or a sequence of images. Although other review papers on fractal coding exist, this paper differs in several respects. One can develop new shape patterns for motion estimation and combine existing block matching motion estimation with automata coding to explore the fractal compression technique, with a specific focus on reducing encoding time and achieving better image/video reconstruction quality. This paper is useful for beginners in the domain of video compression.
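
    The block matching motion estimation discussed above can be sketched with a minimal exhaustive (full) search using the sum of absolute differences as the matching criterion (an illustrative baseline; the approaches reviewed in the paper refine this search pattern to cut encoding time):

```python
def sad(block_a, block_b):
    """Sum of absolute differences between two equally sized 2-D blocks."""
    return sum(abs(a - b) for ra, rb in zip(block_a, block_b)
                          for a, b in zip(ra, rb))

def full_search(ref, cur_block, top, left, radius):
    """Exhaustively scan a (2*radius+1)^2 window of the reference frame
    around (top, left) for the candidate block best matching cur_block;
    return the motion vector (dy, dx) and its SAD cost."""
    n = len(cur_block)
    best = (None, float("inf"))
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = top + dy, left + dx
            if 0 <= y and y + n <= len(ref) and 0 <= x and x + n <= len(ref[0]):
                cand = [row[x:x + n] for row in ref[y:y + n]]
                cost = sad(cur_block, cand)
                if cost < best[1]:
                    best = ((dy, dx), cost)
    return best
```

    Fast search patterns (three-step, diamond, and similar) visit only a subset of these candidate positions, trading a small risk of a suboptimal vector for a large reduction in comparisons.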

    RESEARCH ON AN IMPROVED ALGORITHM FOR FRACTAL IMAGE COMPRESSION OF SATELLITE IMAGERIES BASED ON RANGE AND DOMAIN BLOCKS

    Fractal coding is a novel method to compress images, proposed by Barnsley and implemented by Jacquin, and it offers many advantages. Fractal image coding achieves a high compression ratio, but it is a lossy compression scheme. The encoding procedure divides the image into range blocks and domain blocks and then matches each range block with a domain block. The image is encoded by partitioning it into blocks and applying affine transformations to achieve fractal compression; it is reconstructed using iterated functions and inverse transforms. However, the encoding time of the traditional fractal compression technique is too long for real-time image compression, which has limited its use. Based on the theory of fractal image compression, this paper proposes an improved algorithm from the aspect of image segmentation. In the present work the fractal coding techniques are applied to the compression of satellite imageries. Peak Signal-to-Noise Ratio (PSNR) values are determined for two images, a Satellite Rural image and a Satellite Urban image. The Matlab simulation results for the reconstructed images show achievable PSNR values of about 33 dB for the Satellite Rural image and about 42 dB for the Satellite Urban image.
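
    The PSNR figures quoted above follow the standard definition, 10 * log10(peak^2 / MSE); a minimal sketch for 8-bit images:

```python
import math

def psnr(original, reconstructed, peak=255.0):
    """Peak Signal-to-Noise Ratio (dB) between two same-sized images,
    given as 2-D lists of pixel intensities."""
    n, se = 0, 0.0
    for row_o, row_r in zip(original, reconstructed):
        for o, r in zip(row_o, row_r):
            se += (o - r) ** 2
            n += 1
    if se == 0:
        return float("inf")  # identical images
    mse = se / n
    return 10.0 * math.log10(peak * peak / mse)
```

    A PSNR near 33 dB, as reported for the rural image, corresponds to a root-mean-square pixel error of roughly 5-6 gray levels out of 255.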

    Single-image super-resolution using sparsity constraints and non-local similarities at multiple resolution scales

    Traditional super-resolution methods produce a clean high-resolution image from several observed degraded low-resolution images following an acquisition or degradation model. Such a model describes how each output pixel is related to one or more input pixels; in the regularization framework this is called the data fidelity term. Additionally, prior knowledge such as piecewise smoothness can be incorporated to improve the image restoration result. The impact of an observed pixel on the restored pixels is thus local, according to the degradation model and the prior knowledge. Traditional methods therefore only exploit the spatial redundancy in a local neighborhood and are referred to as local methods. Recently, non-local methods, which make use of similarities between image patches across the whole image, have gained popularity in image restoration in general; in the super-resolution literature they are often referred to as exemplar-based methods. In this paper, we exploit the similarity of patches within the same scale (related to the class of non-local methods) and across different resolution scales of the same image (related to fractal-based methods). For patch fusion, we employ a kernel regression algorithm, which yields a blurry and noisy version of the desired high-resolution image. For the final reconstruction step, we develop a novel restoration algorithm: a joint deconvolution/denoising algorithm based on split Bregman iterations which, as prior knowledge, exploits the sparsity of the image in the shearlet-transformed domain. Initial results indicate an improvement over both classical local and state-of-the-art non-local super-resolution methods.
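
    The split Bregman iterations mentioned above alternate a quadratic subproblem with an element-wise soft-thresholding (shrinkage) step on the transform coefficients; a minimal sketch of that shrinkage operator (the standard closed form, not the authors' full algorithm):

```python
def soft_threshold(coeffs, lam):
    """Shrinkage operator: the closed-form solution of
    argmin_d  lam * |d| + 0.5 * (d - c)^2, applied element-wise.
    Coefficients with magnitude below lam are zeroed, which is how
    sparsity in the (e.g. shearlet) transform domain is enforced."""
    out = []
    for c in coeffs:
        if c > lam:
            out.append(c - lam)
        elif c < -lam:
            out.append(c + lam)
        else:
            out.append(0.0)
    return out
```

    In the full algorithm this step runs on the shearlet coefficients at every Bregman iteration, alternating with a data-fidelity update in the image domain.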

    Study on high Performance and Effective Watermarking Scheme using Hybrid Transform (DCT-DWT)

    Nowadays healthcare infrastructure depends on Hospital Information Systems (HIS), Radiology Information Systems (RIS), and Picture Archiving and Communication Systems (PACS), as these provide new ways to store, access and distribute medical data. Conversely, these developments have introduced new risks of unsuitable deployment of medical information flowing in open networks, given the ease with which digital content can be manipulated. It is well known that the integrity and confidentiality of medical data are serious topics for ethical and legal reasons. Medical images need to be kept intact in any condition, and prior to any operation their integrity needs to be checked and verified. Watermarking is an emerging technology capable of assisting this aim. In recent times, frequency-domain watermarking algorithms have gained immense importance due to their widespread use. Accordingly, watermark embedding and extraction are performed in the frequency domain using the presented hybrid DCT-DWT scheme. In the proposed watermarking scheme, the extracted watermark is compared with the original for calculating SSIM. The effectiveness of the proposed watermarking scheme is demonstrated with the aid of experimental results.
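
    The frequency-domain embedding principle can be sketched in one dimension (an illustrative non-blind scheme of my own, not the paper's 2-D hybrid DCT-DWT method): watermark bits additively perturb mid-frequency DCT coefficients and are recovered by comparison with the original.

```python
import math

def dct(x):
    """Orthonormal 1-D DCT-II."""
    n = len(x)
    out = []
    for k in range(n):
        s = sum(x[i] * math.cos(math.pi * (i + 0.5) * k / n) for i in range(n))
        scale = math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
        out.append(scale * s)
    return out

def idct(X):
    """Inverse of the orthonormal DCT-II above (a scaled DCT-III)."""
    n = len(X)
    return [X[0] * math.sqrt(1.0 / n) +
            sum(X[k] * math.sqrt(2.0 / n) *
                math.cos(math.pi * (i + 0.5) * k / n) for k in range(1, n))
            for i in range(n)]

def embed(signal, bits, alpha=5.0, start=2):
    """Additively embed bits into mid-frequency DCT coefficients:
    c_k += alpha for a 1-bit, c_k -= alpha for a 0-bit."""
    X = dct(signal)
    for j, b in enumerate(bits):
        X[start + j] += alpha * (1 if b else -1)
    return idct(X)

def extract(watermarked, original, bits_len, start=2):
    """Non-blind extraction: compare coefficients with the original's."""
    Xw, Xo = dct(watermarked), dct(original)
    return [Xw[start + j] > Xo[start + j] for j in range(bits_len)]
```

    A hybrid DCT-DWT scheme applies the same perturb-and-compare idea to coefficients of a wavelet subband that has been further DCT-transformed, trading imperceptibility against robustness.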

    Fractal image compression and the self-affinity assumption : a stochastic signal modelling perspective

    Bibliography: p. 208-225. Fractal image compression is a comparatively new technique which has gained considerable attention in the popular technical press, and more recently in the research literature. The most significant advantages claimed are high reconstruction quality at low coding rates, rapid decoding, and "resolution independence", in the sense that an encoded image may be decoded at a higher resolution than the original. While many of the claims published in the popular technical press are clearly extravagant, it appears from the rapidly growing body of published research that fractal image compression is capable of performance comparable with that of other techniques enjoying the benefit of a considerably more robust theoretical foundation. So called because of the similarities between the form of image representation and a mechanism widely used in generating deterministic fractal images, fractal compression represents an image by the parameters of a set of affine transforms on image blocks under which the image is approximately invariant. Although the conditions imposed on these transforms may be shown to be sufficient to guarantee that an approximation of the original image can be reconstructed, there is no obvious theoretical reason to expect this to represent an efficient representation for image coding purposes. The usual analogy with vector quantisation, in which each image is considered to be represented in terms of code vectors extracted from the image itself, is instructive, but transforms the fundamental problem into one of understanding why this construction results in an efficient codebook. The signal property required for such a codebook to be effective, termed "self-affinity", is poorly understood. A stochastic signal model based examination of this property is the primary contribution of this dissertation.
    The most significant findings (subject to some important restrictions) are that "self-affinity" is not a natural consequence of common statistical assumptions but requires particular conditions which are inadequately characterised by second-order statistics, and that "natural" images are only marginally "self-affine", to the extent that fractal image compression is effective, but not more so than comparable standard vector quantisation techniques.
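
    The per-block affine transforms discussed above are conventionally fit by least squares: for a (decimated) domain block d and a range block r, choose a scale s and offset o minimising the collage error Σ(s·d_i + o − r_i)². A minimal sketch of that standard fit (the textbook formulation, not taken from the dissertation):

```python
def affine_fit(domain_block, range_block):
    """Least-squares scale s and offset o minimising
    sum_i (s * d_i + o - r_i)^2 over flattened, equal-length blocks."""
    n = len(domain_block)
    sd = sum(domain_block)
    sr = sum(range_block)
    sdd = sum(d * d for d in domain_block)
    sdr = sum(d * r for d, r in zip(domain_block, range_block))
    denom = n * sdd - sd * sd
    if denom == 0:                     # flat domain block: offset only
        return 0.0, sr / n
    s = (n * sdr - sd * sr) / denom
    o = (sr - s * sd) / n
    return s, o
```

    "Self-affinity" in the dissertation's sense asks how small this residual error can be made when the codebook of domain blocks is drawn from the image itself rather than from an external training set.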

    Entropy in Image Analysis III

    Image analysis can be applied to rich and assorted scenarios; the aim of this recent research field is therefore not only to mimic the human vision system. Image analysis is among the main methods computers use today, and thanks to artificial intelligence there is a growing body of knowledge they will be able to manage in a totally unsupervised manner in the future. The articles published in this book clearly point toward such a future.