
    Image Characterization and Classification by Physical Complexity

    We present a method for estimating the complexity of an image based on Bennett's concept of logical depth. Bennett identified logical depth as the appropriate measure of organized complexity, and hence as being better suited to the evaluation of the complexity of objects in the physical world. Its use results in a different, and in some sense finer, characterization than is obtained through the application of the concept of Kolmogorov complexity alone. We use this measure to classify images by their information content. The method provides a means for classifying and evaluating the complexity of objects by way of their visual representations. To the authors' knowledge, the method and application inspired by the concept of logical depth presented herein are being proposed and implemented for the first time. Comment: 30 pages, 21 figures.
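
    As an illustration of how such a measure might be approximated in practice, the sketch below (a minimal approximation, not the authors' exact method) uses the compressed size of an image's raw bytes as a proxy for its Kolmogorov complexity and the decompression time as a proxy for its logical depth; the compressor choice and the repeat count used for timing are assumptions.

        # Minimal sketch, assuming compressed size ~ Kolmogorov complexity and
        # decompression time ~ logical depth; zlib and the repeat count are
        # illustrative choices, not the authors' pipeline.
        import time
        import zlib

        def complexity_proxies(image_bytes: bytes, repeats: int = 50):
            """Return (compressed_size_bytes, mean_decompression_seconds)."""
            compressed = zlib.compress(image_bytes, level=9)
            start = time.perf_counter()
            for _ in range(repeats):              # repeat to average out timer noise
                zlib.decompress(compressed)
            elapsed = (time.perf_counter() - start) / repeats
            return len(compressed), elapsed

        # Usage: pass the raw (uncompressed) pixel bytes of an image, e.g.
        # size, depth = complexity_proxies(gray_image_array.tobytes())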

    History of art paintings through the lens of entropy and complexity

    Art is the ultimate expression of human creativity that is deeply influenced by the philosophy and culture of the corresponding historical epoch. The quantitative analysis of art is therefore essential for better understanding human cultural evolution. Here we present a large-scale quantitative analysis of almost 140 thousand paintings, spanning nearly a millennium of art history. Based on the local spatial patterns in the images of these paintings, we estimate the permutation entropy and the statistical complexity of each painting. These measures map the degree of visual order of artworks into a scale of order-disorder and simplicity-complexity that locally reflects qualitative categories proposed by art historians. The dynamical behavior of these measures reveals a clear temporal evolution of art, marked by transitions that agree with the main historical periods of art. Our research shows that different artistic styles have a distinct average degree of entropy and complexity, thus allowing a hierarchical organization and clustering of styles according to these metrics. We have further verified that the identified groups correspond well with the textual content used to qualitatively describe the styles, and that the employed complexity-entropy measures can be used for an effective classification of artworks. Comment: 10 two-column pages, 5 figures; accepted for publication in PNAS [supplementary information available at http://www.pnas.org/highwire/filestream/824089/field_highwire_adjunct_files/0/pnas.1800083115.sapp.pdf].
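
    A minimal sketch of how these two quantities can be computed for a grayscale image, assuming 2x2 ordinal patterns and the Jensen-Shannon form of the statistical complexity; the patch size and normalization are common choices for the complexity-entropy plane and are not necessarily the paper's exact settings.

        # Hedged sketch: permutation entropy H and statistical complexity C of a
        # grayscale image from 2x2 ordinal (Bandt-Pompe) patterns; dx, dy and the
        # normalization constants are illustrative assumptions.
        from itertools import permutations
        import numpy as np

        def ordinal_distribution(img, dx=2, dy=2):
            """Relative frequency of each ordinal pattern over all dx-by-dy patches."""
            counts = {p: 0 for p in permutations(range(dx * dy))}
            rows, cols = img.shape
            for i in range(rows - dy + 1):
                for j in range(cols - dx + 1):
                    patch = img[i:i + dy, j:j + dx].ravel()
                    counts[tuple(np.argsort(patch, kind="stable"))] += 1
            p = np.array(list(counts.values()), dtype=float)
            return p / p.sum()

        def complexity_entropy(p):
            """Normalized Shannon entropy H and statistical complexity C = Q_JS * H."""
            n = len(p)
            nz = p[p > 0]
            H = -np.sum(nz * np.log(nz)) / np.log(n)
            u = np.full(n, 1.0 / n)                      # uniform reference distribution
            def shannon(q):
                q = q[q > 0]
                return -np.sum(q * np.log(q))
            js = shannon(0.5 * (p + u)) - 0.5 * shannon(p) - 0.5 * shannon(u)
            js_max = -0.5 * ((n + 1) / n * np.log(n + 1) + np.log(n) - 2 * np.log(2 * n))
            return H, (js / js_max) * H

        # Usage: H, C = complexity_entropy(ordinal_distribution(gray_image))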

    DNA sequences classification and computation scheme based on the symmetry principle

    DNA sequences containing multifarious novel symmetrical structures frequently play a crucial role in how genomes work. Here we present a new scheme for understanding the structural features and potential mathematical rules of symmetrical DNA sequences, using a method that combines stepwise classification and recursive computation. By defining the symmetry of DNA sequences, we classify all sequences and derive a series of recursive equations for counting all theoretically possible sequences in each class; moreover, the symmetries of typical sequences at different levels are analyzed. The classification and quantitative relations demonstrate that DNA sequences have recursive and nested properties. The scheme may help us better understand the formation and growth mechanisms of DNA sequences, because it can deduce structural and quantitative information about longer sequences from that of shorter sequences through recursive rules. Our scheme may provide a new stepping stone to the theoretical characterization, as well as structural analysis, of DNA sequences.
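
    The abstract does not reproduce the paper's symmetry definitions or recursive equations, so the sketch below only illustrates the general idea with one familiar symmetry class, reverse-complement palindromes, whose count obeys a simple recursion; it is not the paper's classification scheme.

        # Illustrative sketch only: one simple symmetry class (reverse-complement
        # palindromes) and the recursion that counts its members.
        COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

        def is_rc_palindrome(seq: str) -> bool:
            """True if a sequence equals its own reverse complement, e.g. 'GAATTC'."""
            return seq == "".join(COMPLEMENT[b] for b in reversed(seq))

        def count_rc_palindromes(length: int) -> int:
            """Count reverse-complement palindromes of a given length recursively:
            choosing the first base (4 ways) fixes the last, leaving a shorter
            instance; odd lengths admit none since no base complements itself."""
            if length == 0:
                return 1
            if length % 2 == 1:
                return 0
            return 4 * count_rc_palindromes(length - 2)

        # count_rc_palindromes(6) == 64; is_rc_palindrome("GAATTC") is True.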

    Classification and Verification of Online Handwritten Signatures with Time Causal Information Theory Quantifiers

    We present a new approach to online handwritten signature classification and verification based on descriptors stemming from Information Theory. The proposal uses the Shannon Entropy, the Statistical Complexity, and the Fisher Information evaluated over the Bandt and Pompe symbolization of the horizontal and vertical coordinates of signatures. These six features are easy and fast to compute, and they serve as the input to a One-Class Support Vector Machine classifier. The results surpass state-of-the-art techniques that employ higher-dimensional feature spaces, which often require specialized software and hardware. We assess the consistency of our proposal with respect to the size of the training sample, and we also use it to classify the signatures into meaningful groups. Comment: Submitted to PLOS ONE.
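
    A hedged sketch of the pipeline: Bandt and Pompe (ordinal) symbolization of a pen-coordinate series, a reduced feature vector (the entropy of each coordinate only, rather than the paper's six descriptors), and a One-Class SVM trained on genuine signatures; the embedding dimension, delay, and SVM parameters are illustrative assumptions.

        # Hedged sketch: ordinal-pattern entropy of the x and y coordinate series
        # fed to a One-Class SVM; D, tau, nu and the reduced feature set are
        # assumptions, not the paper's exact configuration.
        from itertools import permutations
        import numpy as np
        from sklearn.svm import OneClassSVM

        def ordinal_probs(series, D=3, tau=1):
            """Probability of each of the D! ordinal patterns in a 1-D series."""
            counts = {p: 0 for p in permutations(range(D))}
            series = np.asarray(series, dtype=float)
            for i in range(len(series) - (D - 1) * tau):
                window = series[i:i + D * tau:tau]
                counts[tuple(np.argsort(window, kind="stable"))] += 1
            p = np.array(list(counts.values()), dtype=float)
            return p / p.sum()

        def features(x, y):
            """Normalized permutation entropy of the horizontal and vertical coordinates."""
            feats = []
            for coord in (x, y):
                p = ordinal_probs(coord)
                nz = p[p > 0]
                feats.append(-np.sum(nz * np.log(nz)) / np.log(len(p)))
            return feats

        # genuine: list of (x, y) coordinate arrays for one writer (assumed data)
        # model = OneClassSVM(kernel="rbf", gamma="scale", nu=0.1)
        # model.fit([features(x, y) for x, y in genuine])
        # model.predict([features(x_query, y_query)])   # +1 genuine, -1 outlier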

    Seafloor Segmentation Based on Bathymetric Measurements from Multibeam Echosounders Data

    Bathymetric data depict the geomorphology of the sea bottom and allow characterization of the spatial distributions of apparent benthic habitats. The variability of seafloor topography can be defined as a texture. This prompts the application of well-developed image processing techniques for automatic delineation of regions with crucially different physiographic characteristics. In the present paper, histograms of biologically motivated invariant image attributes are used for the characterization of local geomorphological features. This technique can be naturally applied over a range of spatial scales. Local feature vectors are then submitted to a procedure which divides the set into a number of clusters, each representing a distinct type of seafloor. Prior knowledge about benthic habitat locations allows the use of supervised classification, by training a Support Vector Machine on a chosen data set and then applying the developed model to the full set. The classification method is shown to perform well on multibeam echosounder (MBES) data from the Piscataqua River, New Hampshire, USA.
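
    A hedged sketch of the supervised step only: per-tile histograms of a simple texture attribute (gradient magnitude, standing in for the biologically motivated invariant attributes used in the paper) train a Support Vector Machine; the tile size, bin count, and kernel are illustrative assumptions.

        # Hedged sketch: histogram features per bathymetry tile plus an SVM
        # classifier; the attribute, tile size and bin count are assumptions.
        import numpy as np
        from sklearn.svm import SVC

        def tile_features(bathymetry, tile=32, bins=16):
            """One normalized gradient-magnitude histogram per non-overlapping tile."""
            gy, gx = np.gradient(bathymetry.astype(float))
            magnitude = np.hypot(gx, gy)
            feats = []
            for i in range(0, magnitude.shape[0] - tile + 1, tile):
                for j in range(0, magnitude.shape[1] - tile + 1, tile):
                    h, _ = np.histogram(magnitude[i:i + tile, j:j + tile], bins=bins)
                    feats.append(h / h.sum())
            return np.array(feats)

        # With tiles of known seafloor type (assumed labels, one per training tile):
        # clf = SVC(kernel="rbf", gamma="scale").fit(tile_features(train_grid), labels)
        # predicted = clf.predict(tile_features(survey_grid))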

    On the non-local geometry of turbulence

    A multi-scale methodology for the study of the non-local geometry of eddy structures in turbulence is developed. Starting from a given three-dimensional field, this consists of three main steps: extraction, characterization and classification of structures. The extraction step is done in two stages. First, a multi-scale decomposition based on the curvelet transform is applied to the full three-dimensional field, resulting in a finite set of component three-dimensional fields, one per scale. Second, by iso-contouring each component field at one or more iso-contour levels, a set of closed iso-surfaces is obtained that represents the structures at that scale. The characterization stage is based on the joint probability density function (p.d.f.), in terms of area coverage on each individual iso-surface, of two differential-geometry properties, the shape index and curvedness, plus the stretching parameter, a dimensionless global invariant of the surface. Taken together, this defines the geometrical signature of the iso-surface. The classification step is based on the construction of a finite set of parameters, obtained from algebraic functions of moments of the joint p.d.f. of each structure, that specify its location as a point in a multi-dimensional ‘feature space’. At each scale the set of points in feature space represents all structures at that scale, for the specified iso-contour value. This then allows the application, to the set, of clustering techniques that search for groups of structures with a common geometry. Results are presented of a first application of this technique to a passive scalar field obtained from a 512³ direct numerical simulation of scalar mixing by forced, isotropic turbulence (Reλ = 265). These show a transition, with decreasing scale, from blob-like structures in the larger scales to blob- and tube-like structures with small or moderate stretching in the inertial range of scales, and then toward tube- and, predominantly, sheet-like structures with a high level of stretching in the dissipation range of scales. Implications of these results for the dynamical behaviour of passive scalar stirring and mixing by turbulence are discussed.
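
    A hedged sketch of the two differential-geometry quantities used in the characterization step, computed from the principal curvatures of an iso-surface; extracting the curvatures from the triangulated surface is omitted, and the sign convention for the shape index follows one common form that may differ from the paper's.

        # Hedged sketch: Koenderink-style shape index S and curvedness C from the
        # principal curvatures kappa1 >= kappa2 of an iso-surface; the sign
        # convention is an assumption.
        import numpy as np

        def shape_index_curvedness(kappa1, kappa2):
            """S in [-1, 1] separates cup/trough/saddle/ridge/cap shapes;
            C >= 0 measures the overall magnitude of curvature."""
            kappa1 = np.asarray(kappa1, dtype=float)
            kappa2 = np.asarray(kappa2, dtype=float)
            S = (2.0 / np.pi) * np.arctan2(kappa1 + kappa2, kappa1 - kappa2)
            C = np.sqrt(0.5 * (kappa1 ** 2 + kappa2 ** 2))
            return S, C

        # Area-weighted joint histograms of (S, C) over each iso-surface, together
        # with the stretching parameter, would then form the geometrical signature.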

    Effect of operating temperature on direct recycling aluminium chips (AA6061) in hot press forging process

    A method for the solid-state recycling of aluminium alloy chips using a hot press forging process was studied, as well as the possibility of using the recycled chips as a secondary resource. This paper presents the results for recycled AA6061 aluminium alloy chips processed at different operating temperatures in the hot press forging process. The mechanical properties and microstructure of the recycled specimens and of an as-received (reference) specimen were investigated. The recycled specimens exhibit good potential in terms of strength. At the minimum temperature of 430 °C, the yield strength (YS) and ultimate tensile strength (UTS) are 25.8 MPa and 27.13 MPa, respectively; at the maximum operating temperature of 520 °C, the YS and UTS are 107.0 MPa and 117.53 MPa. Analysis across the different operating temperatures shows that higher temperatures give better mechanical properties and a finer microstructure. The strength of the recycled specimens increases mainly due to grain refinement strengthening, whereas particle dispersion strengthening has only a minor effect. In this study, the recycled AA6061 chips show good strengthening potential: using only 17.5% of the suggested pressure (70.0 of 400.0 MPa), the UTS reaches 35.8% of the reference value (117.58 of 327.69 MPa). This demonstrates the remarkable potential of direct recycling by hot press forging.