
    Closed Contour Fractal Dimension Estimation by the Fourier Transform

    This work proposes a novel technique for the numerical estimation of the fractal dimension of fractal objects that can be represented as a closed contour. The proposed method maps the fractal contour onto a complex signal and calculates its fractal dimension using the Fourier transform. The Fourier power spectrum is obtained and a power-law relation between power and frequency is verified; the fractal dimension is then derived from the exponent of this relation. The method is compared to classical fractal dimension estimators from the literature, e.g., Bouligand-Minkowski, box-counting and the classical Fourier method, by computing the fractal dimension of fractal contours whose dimensions are known analytically. The results show the high precision and robustness of the proposed technique.
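
    The spectral estimator described in this abstract can be sketched in a few lines of Python: map the contour to a complex signal, take its Fourier power spectrum, fit the log-log power-versus-frequency relation, and convert the fitted exponent into a dimension. The conversion D = (5 - beta)/2 used below is the standard spectral relation for one-dimensional profiles and is an illustrative assumption, not necessarily the exact mapping used in the paper.

        import numpy as np

        def spectral_fractal_dimension(x, y):
            # Map the closed contour (x, y) onto the complex signal z = x + i*y.
            z = np.asarray(x, dtype=float) + 1j * np.asarray(y, dtype=float)
            # Fourier power spectrum of the contour signal.
            power = np.abs(np.fft.fft(z)) ** 2
            freqs = np.fft.fftfreq(len(z))
            # Keep strictly positive frequencies to avoid f = 0 in the log-log fit.
            mask = freqs > 0
            slope, _ = np.polyfit(np.log(freqs[mask]), np.log(power[mask]), 1)
            beta = -slope  # decay exponent of the power law P(f) ~ f**(-beta)
            # D = (5 - beta) / 2 is the standard relation for 1-D profiles;
            # it is an assumption here, not necessarily the paper's exact mapping.
            return (5.0 - beta) / 2.0

        # Usage: a circle perturbed by radial noise, as a stand-in for a fractal contour.
        t = np.linspace(0.0, 2.0 * np.pi, 4096, endpoint=False)
        r = 1.0 + 0.05 * np.random.default_rng(0).standard_normal(t.size)
        print(spectral_fractal_dimension(r * np.cos(t), r * np.sin(t)))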

    Multifractal Analysis of Packed Swiss Cheese Cosmologies

    The multifractal spectra of various three-dimensional representations of Packed Swiss Cheese (PSC) cosmologies in open, closed, and flat spaces are measured, and it is determined that the curvature of the space does not alter the associated fractal structure. These results are compared to observational data and simulated models of large-scale galaxy clustering, to assess the viability of the PSC as a candidate for such structure formation. It is found that the PSC dimension spectra do not match those of observation, and possible solutions to this discrepancy are offered, including accounting for potential luminosity biasing effects. Various random and uniform sets are also analyzed to provide insight into the meaning of the multifractal spectrum as it relates to the observed scaling behaviors. Comment: 3 latex files, 18 ps figures

    Practical implementation of nonlinear time series methods: The TISEAN package

    Nonlinear time series analysis is becoming a more and more reliable tool for the study of complicated dynamics from measurements. The concept of low-dimensional chaos has proven to be fruitful in the understanding of many complex phenomena, despite the fact that very few natural systems have actually been found to be low-dimensional and deterministic in the sense of the theory. In order to evaluate the long-term usefulness of the nonlinear time series approach as inspired by chaos theory, it will be important that the corresponding methods become more widely accessible. This paper, while not a proper review of nonlinear time series analysis, tries to make a contribution to this process by describing the actual implementation of the algorithms and their proper usage. Most of the methods require the choice of certain parameters for each specific time series application, and we try to give guidance in this respect. The scope and selection of topics in this article, as well as the implementational choices that have been made, correspond to the contents of the software package TISEAN, which is publicly available from http://www.mpipks-dresden.mpg.de/~tisean . In fact, this paper can be seen as an extended manual for the TISEAN programs: it fills the gap between the technical documentation and the existing literature, providing the necessary entry points for a more thorough study of the theoretical background. Comment: 27 pages, 21 figures, downloadable software at http://www.mpipks-dresden.mpg.de/~tisean
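
    As a flavor of the kind of routine such a package provides, the delay-coordinate embedding that underlies most nonlinear time series methods can be sketched as follows. This is a plain Python illustration of the generic technique, not TISEAN's own implementation; choosing the delay tau and the embedding dimension m is exactly the sort of parameter selection the paper gives guidance on.

        import numpy as np

        def delay_embed(x, m, tau):
            # Delay-coordinate embedding of a scalar time series: each row is a
            # reconstructed state vector (x[t], x[t + tau], ..., x[t + (m-1)*tau]).
            x = np.asarray(x, dtype=float)
            n = len(x) - (m - 1) * tau
            if n <= 0:
                raise ValueError("time series too short for this m and tau")
            return np.column_stack([x[i * tau : i * tau + n] for i in range(m)])

        # Usage: embed a noisy sine wave in 3 dimensions with delay 16.
        t = np.arange(0, 100, 0.1)
        series = np.sin(t) + 0.05 * np.random.default_rng(0).standard_normal(t.size)
        states = delay_embed(series, m=3, tau=16)
        print(states.shape)  # (number of reconstructed state vectors, 3)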

    Methods for Estimation of Intrinsic Dimensionality

    Dimension reduction is an important tool used to describe the structure of complex data (explicitly or implicitly) through a small but sufficient number of variables, and thereby make data analysis more efficient. It is also useful for visualization purposes, and it helps statisticians to overcome the ‘curse of dimensionality’. However, most dimension reduction techniques require the intrinsic dimension of the low-dimensional subspace to be fixed in advance, so the availability of reliable intrinsic dimension (ID) estimation techniques is of major importance. The main goal of this thesis is to develop algorithms for determining the intrinsic dimension of recorded data sets in a nonlinear context. Whilst this is a well-researched topic for linear subspaces, based mainly on principal component analysis, relatively little attention has been paid to ways of estimating this number for non-linear variable interrelationships. The algorithms proposed here are based on existing concepts that can be categorized into local methods, relying on randomly selected subsets of a recorded variable set, and global methods, utilizing the entire data set. This thesis provides an overview of ID estimation techniques, with special consideration given to recent developments in non-linear techniques, such as manifold charting and fractal-based methods. Although such techniques exist, their practical implementation is far from straightforward. The intrinsic dimension is estimated via Brand’s algorithm by examining the growth point process, which counts the number of points in hyper-spheres; the estimation needs a starting point for each hyper-sphere, and in this thesis we provide settings for selecting starting points that work well for most data sets. Additionally, we propose approaches for estimating dimensionality via Brand’s algorithm, the Dip method and the Regression method. Other approaches are proposed for estimating the intrinsic dimension by fractal dimension estimation methods, which exploit the intrinsic geometry of a data set. The most popular concept from this family of methods is the correlation dimension, which requires the estimation of the correlation integral for a ball of radius tending to zero; in this thesis we propose new approaches to approximating the correlation integral in this limit: the Intercept method, the Slope method and the Polynomial method. In addition, we propose a localized global approach, which can be viewed as a local version of global ID methods; its objective is to improve on algorithms based on local ID methods and thereby significantly reduce their negative bias. Experimental results on real-world and simulated data are used to demonstrate the algorithms and compare them to other methodology, and a simulation study verifying the effectiveness of the proposed methods is also provided. Finally, the algorithms are contrasted using a recorded data set from an industrial melter process.
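
    Since several of the fractal-based estimators discussed above build on the correlation integral, a generic Grassberger-Procaccia-style sketch may help fix ideas: count the fraction of point pairs closer than a radius r and read the correlation dimension off the slope of log C(r) versus log r. This is an illustration of the underlying concept only, not the thesis's Intercept, Slope or Polynomial methods; it assumes SciPy's pdist for the pairwise distances.

        import numpy as np
        from scipy.spatial.distance import pdist

        def correlation_dimension(points, radii):
            # C(r) is the fraction of point pairs with distance < r; the correlation
            # dimension is the slope of log C(r) vs log r in the scaling region.
            d = pdist(points)                                  # all pairwise distances
            c = np.array([np.mean(d < r) for r in radii])      # correlation integral C(r)
            mask = c > 0                                       # avoid log(0)
            slope, _ = np.polyfit(np.log(radii[mask]), np.log(c[mask]), 1)
            return slope

        # Usage: a 2-D plane embedded in 10-D space should give a slope near 2.
        rng = np.random.default_rng(0)
        flat = rng.uniform(size=(2000, 2))
        data = np.hstack([flat, np.zeros((2000, 8))])
        print(correlation_dimension(data, radii=np.geomspace(0.01, 0.3, 10)))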

    A New Estimator of Intrinsic Dimension Based on the Multipoint Morisita Index

    The size of datasets has been increasing rapidly, both in terms of the number of variables and the number of events. As a result, the empty space phenomenon and the curse of dimensionality complicate the extraction of useful information. In general, however, data lie on non-linear manifolds of much lower dimension than the spaces in which they are embedded. In many pattern recognition tasks, learning these manifolds is a key issue, and it requires knowledge of their true intrinsic dimension. This paper introduces a new estimator of intrinsic dimension based on the multipoint Morisita index. It is applied to both synthetic and real datasets of varying complexity, and comparisons with other existing estimators are carried out. The proposed estimator turns out to be fairly robust to sample size and noise, unaffected by edge effects, able to handle large datasets, and computationally efficient.

    Texture analysis using volume-radius fractal dimension

    Texture plays an important role in computer vision. It is one of the most important visual attributes used in image analysis, since it provides information about the organization of pixels in different regions of the image. This paper presents a novel approach for texture characterization based on complexity analysis. The proposed approach extends the idea of the mass-radius fractal dimension, a method originally developed for shape analysis, to a set of coordinates in 3D space that represents the texture under analysis, producing a signature able to efficiently characterize different texture classes in terms of complexity. An experiment using images from the Brodatz album illustrates the method's performance. Comment: 4 pages, 4 figures
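
    The mass-radius idea extended by this paper can be sketched directly: map each pixel to a 3D point (row, column, intensity), count how many points fall inside growing radii around a reference point, and fit the log-log growth. The pixel-to-point mapping, the unscaled intensity axis and the choice of the centroid as reference point below are illustrative assumptions, not the authors' exact signature construction.

        import numpy as np

        def volume_radius_dimension(image, radii):
            # Pixels become 3-D points (row, col, intensity); M(r), the number of
            # points within radius r of the reference point, should scale as r**D.
            rows, cols = np.indices(image.shape)
            points = np.column_stack([rows.ravel(), cols.ravel(),
                                      image.ravel().astype(float)])
            center = points.mean(axis=0)                        # reference point (centroid)
            dist = np.linalg.norm(points - center, axis=1)
            mass = np.array([np.sum(dist < r) for r in radii])  # M(r): points inside radius r
            mask = mass > 0                                     # avoid log(0)
            slope, _ = np.polyfit(np.log(radii[mask]), np.log(mass[mask]), 1)
            return slope

        # Usage: a random-noise patch as a placeholder for a Brodatz texture image.
        rng = np.random.default_rng(1)
        patch = rng.integers(0, 256, size=(64, 64))
        print(volume_radius_dimension(patch, radii=np.geomspace(4, 64, 10)))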

    Estimation of complexity of field contours of layer building with the use of cell method of determining the fractal dimension

    Results of estimating the geometric complexity of the contours arising in the layered building of a product by additive manufacturing are presented. Contour complexity was evaluated through a statistical analysis of the contours' fractal dimension, obtained with the cell (box-counting) method. To determine the fractal dimension of a contour, cell sizes commensurate with the geometric limitations of layered building were used; such limitations stem from the peculiarities of layered building and the technological capabilities of the equipment used. Software has been developed to perform a layer-by-layer analysis of the original triangulated 3D model, and it was tested on models of industrial products. As a result, the possibility of estimating the geometric complexity of the field contours of layered building is confirmed on the basis of a statistical analysis of the distribution characteristics of the contours' fractal dimension.
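
    A minimal Python sketch of the cell (box-counting) estimator referenced above: overlay grids of decreasing cell size on the sampled contour points, count occupied cells, and fit the log-log scaling. This is a generic illustration of the cell method, not the authors' layer-by-layer software; the contour is assumed to be given as a set of planar points sampled from one build layer.

        import numpy as np

        def box_counting_dimension(points, cell_sizes):
            # For each cell size s, count the number N(s) of grid cells containing at
            # least one point; the dimension is the slope of log N(s) vs log(1/s).
            points = np.asarray(points, dtype=float)
            counts = []
            for s in cell_sizes:
                cells = np.floor(points / s)                   # grid-cell index of each point
                counts.append(len(np.unique(cells, axis=0)))   # occupied cells N(s)
            slope, _ = np.polyfit(np.log(1.0 / np.asarray(cell_sizes)),
                                  np.log(counts), 1)
            return slope

        # Usage: points sampled along a straight slice contour should give a value near 1.
        t = np.linspace(0.0, 1.0, 5000)
        contour = np.column_stack([t, 0.5 * t])
        print(box_counting_dimension(contour, cell_sizes=np.geomspace(0.002, 0.1, 8)))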

    Estimating the fractal dimension: a comparative review and open source implementations

    The fractal dimension of state space sets is typically estimated via the scaling of either the generalized (Rényi) entropy or the correlation sum versus a size parameter, and, more recently, via a new method based on extreme value theory. Motivated by the lack of quantitative and systematic comparisons of fractal dimension estimators in the literature, and also by new methods both for estimating fractal dimensions and for performing delay embeddings, in this paper we provide a detailed and quantitative comparison of fractal dimension estimation. We start by summarizing existing estimators and then evaluate them, comparing their performance and precision on different data sets and taking into account the impact of features such as data length, noise, embedding dimension, non-stationarity and falsifiability, among others. Our analysis shows that for synthetic data the correlation sum and extreme value theory methods perform equally well and are the best estimators. Our results also challenge the widely established notion that the correlation sum method cannot be applied to high-dimensional data. For real experimental data the entropy is better than the correlation sum, while the extreme value theory method is as good as the entropy. Overall, the more recent extreme value theory estimator appears very powerful, but it has two downsides when it comes to real data: first, it overestimates the dimension for short data lengths, and second, it yields low-dimensional deterministic results even for inappropriate data such as stock market time series, requiring caution when it is applied to data of unknown underlying dynamics. All quantities discussed are implemented as performant and easy-to-use open source code in the DynamicalSystems.jl library.
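
    A minimal sketch of the entropy-based scaling mentioned in this abstract: bin the state space set with boxes of size eps, form the order-q Rényi entropy H_q(eps) from the box probabilities, and fit its slope against log(1/eps) to estimate the generalized dimension D_q. This is a plain Python illustration of the scaling law, not the DynamicalSystems.jl implementation.

        import numpy as np

        def generalized_dimension(points, eps_values, q=2.0):
            # Order-q generalized (Renyi) dimension: for each box size eps, bin the
            # points, form box probabilities p_i, compute H_q, and fit H_q vs log(1/eps).
            points = np.asarray(points, dtype=float)
            entropies = []
            for eps in eps_values:
                cells = np.floor(points / eps)
                _, counts = np.unique(cells, axis=0, return_counts=True)
                p = counts / counts.sum()                       # box probabilities p_i
                if q == 1.0:
                    h = -np.sum(p * np.log(p))                  # Shannon entropy (q -> 1 limit)
                else:
                    h = np.log(np.sum(p ** q)) / (1.0 - q)      # Renyi entropy of order q
                entropies.append(h)
            slope, _ = np.polyfit(np.log(1.0 / np.asarray(eps_values)), entropies, 1)
            return slope

        # Usage: uniform points on the unit square should give D_2 close to 2.
        rng = np.random.default_rng(2)
        square = rng.uniform(size=(20000, 2))
        print(generalized_dimension(square, eps_values=np.geomspace(0.02, 0.2, 8), q=2.0))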