
    A Review on Block Matching Motion Estimation and Automata Theory based Approaches for Fractal Coding

    Fractal compression is a lossy compression technique for gray/color image and video compression. It offers a high compression ratio and good image quality with fast decoding, but reducing the encoding time remains a challenge. This review presents an analysis of the most significant existing approaches in the field of fractal-based gray/color image and video compression: block matching motion estimation approaches for finding the motion vectors in a frame under inter-frame and intra-frame (individual frame) coding, and automata theory based coding approaches for representing an image or a sequence of images. Although other reviews of fractal coding exist, this paper differs in several respects. One can develop new search patterns for motion estimation and combine existing block matching motion estimation with automata coding to explore the fractal compression technique, with a specific focus on reducing encoding time and achieving better image/video reconstruction quality. This paper is intended for beginners in the domain of video compression.
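    The block matching idea discussed above can be made concrete with a generic sketch (not code from any of the papers reviewed): an exhaustive full-search matcher computes, for every block of the current frame, the displacement into the reference frame that minimises the sum of absolute differences (SAD). The fast approaches surveyed here (three-step, diamond search, and similar patterns) prune exactly this search space.

```python
import numpy as np

def full_search_block_matching(ref, cur, block=4, radius=2):
    """Exhaustive-search block matching: for each block of the current
    frame, find the (dy, dx) displacement into the reference frame that
    minimises the sum of absolute differences (SAD)."""
    h, w = cur.shape
    vectors = {}
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            target = cur[by:by + block, bx:bx + block].astype(int)
            best, best_sad = (0, 0), float("inf")
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    y, x = by + dy, bx + dx
                    if y < 0 or x < 0 or y + block > h or x + block > w:
                        continue  # candidate block falls outside the frame
                    cand = ref[y:y + block, x:x + block].astype(int)
                    sad = np.abs(cand - target).sum()
                    if sad < best_sad:
                        best_sad, best = sad, (dy, dx)
            vectors[(by, bx)] = best
    return vectors
```

    Fast search patterns replace the full `(2*radius+1)**2` candidate loop with a handful of probe points per step, which is where the encoding-time savings discussed in the review come from.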

    Herding as a Learning System with Edge-of-Chaos Dynamics

    Herding defines a deterministic dynamical system at the edge of chaos. It generates a sequence of model states and parameters by alternating parameter perturbations with state maximizations, where the sequence of states can be interpreted as "samples" from an associated MRF model. Herding differs from maximum likelihood estimation in that the sequence of parameters does not converge to a fixed point, and differs from an MCMC posterior sampling approach in that the sequence of states is generated deterministically. Herding may be interpreted as a "perturb and map" method where the parameter perturbations are generated using a deterministic nonlinear dynamical system rather than randomly from a Gumbel distribution. This chapter studies the distinct statistical characteristics of the herding algorithm and shows that the fast convergence rate of the controlled moments may be attributed to edge-of-chaos dynamics. The herding algorithm can also be generalized to models with latent variables and to a discriminative learning setting. The perceptron cycling theorem ensures that the fast moment matching property is preserved in the more general framework.
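    The alternation of state maximisation and parameter perturbation can be sketched for the simplest possible case, a single binary variable with target moment p (a generic illustration of the update rule, not the chapter's code): the empirical mean of the deterministic state sequence matches p with O(1/T) error, faster than the O(1/sqrt(T)) of Monte Carlo sampling.

```python
def herd_bernoulli(p, steps):
    """Herding for one binary variable with target moment p.
    State maximisation: s_t = argmax_{s in {0,1}} w_t * s.
    Perturbation:       w_{t+1} = w_t + p - s_t.
    Since sum_t (p - s_t) = w_T - w_0 stays bounded, the empirical
    mean of the states converges to p at rate O(1/T)."""
    w = p  # conventional initialisation w_0 = p
    states = []
    for _ in range(steps):
        s = 1 if w > 0 else 0   # deterministic state maximisation
        states.append(s)
        w += p - s              # deterministic parameter perturbation
    return states
```

    Note that nothing here is random: the weight trajectory is the "edge of chaos" dynamical system the abstract refers to, and the bounded weights are a one-dimensional instance of what the perceptron cycling theorem guarantees in general.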

    Complexity Measures of Music

    We present a technique to search for the presence of crucial events in music, based on the analysis of the music volume. Earlier work on this issue was based on the assumption that crucial events correspond to the change of music notes, with the interesting result that the complexity index of the crucial events is mu ~ 2, which is the same inverse power-law index of the dynamics of the brain. The search technique analyzes music volume and confirms the results of the earlier work, thereby contributing to the explanation as to why the brain is sensitive to music, through the phenomenon of complexity matching. Complexity matching has recently been interpreted as the transfer of multifractality from one complex network to another. For this reason we also examine the multifractality of music, with the observation that the multifractal spectrum of a computer performance is significantly narrower than the multifractal spectrum of a human performance of the same musical score. We conjecture that although crucial events are demonstrably important for information transmission, they alone are not sufficient to define musicality, which is more adequately measured by the multifractal spectrum.
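    The complexity index mu of a sequence of crucial events can be illustrated with a generic sketch (not the authors' volume-based detection method): draw inter-event waiting times from an inverse power law with survival function P(tau > t) = t^-(mu-1), and recover mu from the times with a maximum-likelihood (Hill-type) estimate.

```python
import math
import random

def powerlaw_waiting_times(mu, n, rng):
    """Draw n waiting times with survival law P(tau > t) = t^-(mu-1),
    t >= 1, via inverse-transform sampling."""
    return [(1.0 - rng.random()) ** (-1.0 / (mu - 1.0)) for _ in range(n)]

def estimate_mu(taus):
    """Maximum-likelihood (Hill) estimate of the complexity index mu
    from waiting times with lower cutoff 1."""
    return 1.0 + len(taus) / sum(math.log(t) for t in taus)
```

    For mu = 2, the value the abstract associates with both music notes and brain dynamics, the estimator recovers the index from the raw inter-event times alone.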

    Fractal image compression and the self-affinity assumption : a stochastic signal modelling perspective

    Bibliography: p. 208-225.
    Fractal image compression is a comparatively new technique which has gained considerable attention in the popular technical press, and more recently in the research literature. The most significant advantages claimed are high reconstruction quality at low coding rates, rapid decoding, and "resolution independence" in the sense that an encoded image may be decoded at a higher resolution than the original. While many of the claims published in the popular technical press are clearly extravagant, it appears from the rapidly growing body of published research that fractal image compression is capable of performance comparable with that of other techniques enjoying the benefit of a considerably more robust theoretical foundation. So called because of the similarities between the form of image representation and a mechanism widely used in generating deterministic fractal images, fractal compression represents an image by the parameters of a set of affine transforms on image blocks under which the image is approximately invariant. Although the conditions imposed on these transforms may be shown to be sufficient to guarantee that an approximation of the original image can be reconstructed, there is no obvious theoretical reason to expect this to represent an efficient representation for image coding purposes. The usual analogy with vector quantisation, in which each image is considered to be represented in terms of code vectors extracted from the image itself, is instructive, but transforms the fundamental problem into one of understanding why this construction results in an efficient codebook. The signal property required for such a codebook to be effective, termed "self-affinity", is poorly understood. A stochastic signal model based examination of this property is the primary contribution of this dissertation.
    The most significant findings (subject to some important restrictions) are that "self-affinity" is not a natural consequence of common statistical assumptions but requires particular conditions which are inadequately characterised by second-order statistics, and that "natural" images are only marginally "self-affine", to the extent that fractal image compression is effective, but not more so than comparable standard vector quantisation techniques.
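    The affine-transform representation described above can be made concrete with a minimal collage-style search (a hedged sketch under simplifying assumptions: fixed-size non-overlapping domain blocks, 2x spatial contraction, no isometries): for each range block, find the domain block and the scale/offset pair minimising the least-squares collage error.

```python
import numpy as np

def encode_range_block(range_blk, image, d_size):
    """Find the domain block whose affine map s*D + o best approximates
    the given range block (collage-style least-squares search). Returns
    (error, domain position, scale s, offset o)."""
    r = range_blk.astype(float).ravel()
    best = None
    h, w = image.shape
    for y in range(0, h - d_size + 1, d_size):
        for x in range(0, w - d_size + 1, d_size):
            dom = image[y:y + d_size, x:x + d_size].astype(float)
            # spatially contract the domain block to range-block size (2x2 averaging)
            dom = dom.reshape(d_size // 2, 2, d_size // 2, 2).mean(axis=(1, 3))
            d = dom.ravel()
            # least-squares fit of luminance scale s and offset o for r ~ s*d + o
            s, o = np.polyfit(d, r, 1)
            err = np.sum((s * d + o - r) ** 2)
            if best is None or err < best[0]:
                best = (err, (y, x), s, o)
    return best
```

    The stored code for the image is just the per-range-block tuples (domain position, s, o): this is the "codebook extracted from the image itself" that the vector-quantisation analogy refers to, and its effectiveness is precisely what the dissertation's "self-affinity" analysis examines.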

    Fast Search Approaches for Fractal Image Coding: Review of Contemporary Literature

    Fractal Image Compression (FIC) was conceptualized as a model in 1989, and numerous models have since been developed. Fractals were initially observed and depicted through the Iterated Function System (IFS), and IFS solutions were used for encoding images. The IFS representation of an image requires much less storage than the actual image, which led to systems that compress images by representing them in IFS form. Reducing the time consumed by encoding is essential for achieving optimal compression, and the solutions reviewed in this study indicate that, despite the developments that have taken place, there is still considerable scope for improvement. From the exhaustive range of models reviewed, it is evident that numerous advancements in FIC have occurred over time and that the model has been adapted to image compression at varied levels. This study focuses on the existing literature on FIC and presents insights into the various models.
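    The core IFS observation above, that a handful of affine map parameters can stand in for a whole image, is easy to demonstrate with the classic chaos game (a textbook illustration, not drawn from the reviewed papers): three contractive maps, eighteen numbers in all, encode the Sierpinski triangle.

```python
import random

def chaos_game(ifs, steps, rng):
    """Iterate a randomly chosen affine map (a,b,c,d,e,f) of the IFS:
    (x, y) -> (a*x + b*y + e, c*x + d*y + f). The visited points
    approach the attractor regardless of the starting point."""
    x, y = 0.0, 0.0
    pts = []
    for _ in range(steps):
        a, b, c, d, e, f = rng.choice(ifs)
        x, y = a * x + b * y + e, c * x + d * y + f
        pts.append((x, y))
    return pts

# Three contractions whose attractor is the Sierpinski triangle:
# the entire "image" is encoded by these 18 coefficients.
SIERPINSKI = [
    (0.5, 0.0, 0.0, 0.5, 0.00, 0.0),
    (0.5, 0.0, 0.0, 0.5, 0.50, 0.0),
    (0.5, 0.0, 0.0, 0.5, 0.25, 0.5),
]
```

    FIC inverts this picture: given an image, find contractive maps whose attractor approximates it, and the fast-search literature surveyed here attacks the cost of that inverse problem.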

    Image detection in real time based on fuzzy fractal theory.

    Real-time image detection is still a research challenge. Several methods have been used, but all can be divided into two approaches: the first is based on image field estimation, in which case the quality of the image depends on the estimation method; the second is based on electron collection, whose particularity is that the longer the collection time, the better the image quality. In both approaches, the global image is obtained by assembling a mosaic of local images or the visual indices of the different points of the image. In this paper we introduce a hybrid fractal-fuzzy theory to track images in real time. The error is minimized using the RANSAC (Random Sample Consensus) algorithm, by computing the homography ("pixel" union) of the images. In practice, for a mobile image, a loop can be realized to focus the image in real time, so that we obtain an efficient view of the global image in real time, which gives the proposed approach its flexibility.
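    The RANSAC step mentioned above follows a standard hypothesise-and-verify loop, shown here for the simplest model, a 2-D line (homography estimation uses the same loop with four point correspondences per minimal sample); this is a generic sketch, not the paper's implementation.

```python
import random

def ransac_line(points, iters, thresh, rng):
    """RANSAC: repeatedly fit a line to a minimal 2-point sample and
    keep the model supported by the largest consensus set of inliers."""
    best_model, best_inliers = None, []
    for _ in range(iters):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        if x1 == x2:
            continue  # degenerate sample: vertical line, skip
        m = (y2 - y1) / (x2 - x1)
        b = y1 - m * x1
        inliers = [(x, y) for x, y in points
                   if abs(y - (m * x + b)) < thresh]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (m, b), inliers
    return best_model, best_inliers
```

    Because outliers almost never dominate a minimal sample across many iterations, the consensus model ignores gross mismatches, which is exactly the error-minimisation role RANSAC plays in the mosaic-assembly step above.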

    Joint Inversion of Fracture Model Properties for CO2 Storage Monitoring or Oil Recovery History Matching

    For oil recovery or CO2 storage, "reservoirs" are commonly used to designate geological structures where oil can be found or CO2 can be stored. All reservoirs present a heterogeneity in terms of rock type and properties (such as porosity and permeability). In addition, some of these reservoirs present fractures and faults. Fractured reservoirs are an important part of the oil reserves in the world (Middle East, Gulf of Mexico, etc.) and some of them are important reservoirs in terms of oil volume and productivity in spite of the fractures. In addition, studies of reservoirs for geologic storage of CO2 have shown the existence of diffuse fractures and faults and their strong impacts on flow. A key point in fractured reservoirs is to understand the geometry and hydraulic conductivity of the network formed by the fractures. This requires the construction of a reservoir model that integrates all available conceptual knowledge and quantitative data. The present paper deals with a new methodology able to perform the history matching of a fractured reservoir model by adapting the sub-seismic fault properties and positions. The main difficulty of this work is to generate a sub-seismic fault network whose fault positions can be easily modified while respecting the statistical fault model. The sub-seismic fault model we have chosen allows us to obtain a sub-seismic fault network that is consistent with the seismic fault network and that succeeds in capturing the specific spatial organization of the faults. As a first step, the geometry of the seismic fault network is characterized using fractal methods. Sub-seismic faults are then generated according to a stochastic algorithm. Finally, the geometry of this discrete fracture network is optimized in order to match the hydrodynamic data about the reservoir. The optimization algorithm modifies the sub-seismic fault positions, leading to the history matching of the reservoir model.
    Fractal properties are preserved during the deformation process. These different steps are demonstrated on a realistic synthetic case.
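    The fractal characterisation of the fault network mentioned above can be illustrated with a generic box-counting dimension estimate (a sketch, not the authors' implementation): count the boxes occupied by the fault traces at several scales and fit the slope of log N(eps) against log(1/eps).

```python
import math

def box_counting_dimension(points, sizes):
    """Estimate the box-counting (fractal) dimension of a planar point
    set: count occupied grid boxes N(eps) at each scale eps, then fit
    the least-squares slope of log N versus log(1/eps)."""
    logs = []
    for eps in sizes:
        boxes = {(int(x / eps), int(y / eps)) for x, y in points}
        logs.append((math.log(1.0 / eps), math.log(len(boxes))))
    # least-squares slope of the (log(1/eps), log N) pairs
    n = len(logs)
    sx = sum(u for u, _ in logs)
    sy = sum(v for _, v in logs)
    sxx = sum(u * u for u, _ in logs)
    sxy = sum(u * v for u, v in logs)
    return (n * sxy - sx * sy) / (n * sxx - sx * sx)
```

    Preserving this dimension while the optimizer moves sub-seismic faults is what "fractal properties are preserved during the deformation process" amounts to in practice.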

    Consensus clustering in complex networks

    The community structure of complex networks reveals both their organization and hidden relationships among their constituents. Most community detection methods currently available are not deterministic, and their results typically depend on the specific random seeds, initial conditions and tie-break rules adopted for their execution. Consensus clustering is used in data analysis to generate stable results out of a set of partitions delivered by stochastic methods. Here we show that consensus clustering can be combined with any existing method in a self-consistent way, enhancing considerably both the stability and the accuracy of the resulting partitions. This framework is also particularly suitable to monitor the evolution of community structure in temporal networks. An application of consensus clustering to a large citation network of physics papers demonstrates its capability to keep track of the birth, death and diversification of topics.
    Comment: 11 pages, 12 figures. Published in Scientific Reports.
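    The core of consensus clustering, combining many stochastic partitions into a co-classification (consensus) matrix that can then be thresholded and re-clustered until stable, can be sketched as follows (a generic illustration, not the authors' code):

```python
from itertools import combinations

def consensus_matrix(partitions, n):
    """Build the consensus matrix of a set of partitions of n nodes:
    entry (i, j) is the fraction of input partitions that place nodes
    i and j in the same cluster. Each partition is a list mapping
    node index -> cluster label."""
    mat = [[0.0] * n for _ in range(n)]
    for part in partitions:
        for i, j in combinations(range(n), 2):
            if part[i] == part[j]:
                mat[i][j] += 1
                mat[j][i] += 1
    m = len(partitions)
    return [[v / m for v in row] for row in mat]
```

    In the full procedure, entries below a threshold are zeroed, the chosen community detection method is rerun on the resulting weighted graph, and the loop repeats until all partitions agree, which is what makes the final result deterministic despite the stochastic base method.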