
    Designing labeled graph classifiers by exploiting the Rényi entropy of the dissimilarity representation

    Representing patterns as labeled graphs is becoming increasingly common in the broad field of computational intelligence. Accordingly, a wide repertoire of pattern recognition tools, such as classifiers and knowledge discovery procedures, is nowadays available and tested on various datasets of labeled graphs. However, the design of effective learning procedures operating in the space of labeled graphs is still a challenging problem, especially from the computational complexity viewpoint. In this paper, we present a major improvement of a general-purpose classifier for graphs, which is built on an interplay among dissimilarity representation, clustering, information-theoretic techniques, and evolutionary optimization algorithms. The improvement focuses on a key subroutine devised to compress the input data. We prove several theorems that are fundamental to setting the parameters controlling this compression operation. We demonstrate the effectiveness of the resulting classifier by benchmarking the developed variants on well-known datasets of labeled graphs, considering classification accuracy, computing time, and parsimony (the structural complexity of the synthesized classification models) as distinct performance indicators. The results show state-of-the-art test set accuracy and a considerable speed-up in computing time.
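
    The compression subroutine itself is not reproduced in this abstract. As a rough, hypothetical illustration of what a Rényi-entropy computation on a dissimilarity representation can look like, the Python sketch below estimates the order-alpha Rényi entropy of a set of patterns from their pairwise dissimilarity matrix via a Gaussian kernel; the function name, the kernel width sigma, and the eigenvalue-based estimator for general alpha are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def renyi_entropy_dissimilarity(D, alpha=2.0, sigma=1.0):
    """Estimate the Renyi entropy of a pattern set from its pairwise
    dissimilarity matrix D (N x N), using a Gaussian kernel of width
    sigma as a Parzen-style density surrogate (illustrative sketch).
    """
    N = D.shape[0]
    K = np.exp(-(D ** 2) / (2.0 * sigma ** 2))   # kernel on dissimilarities
    if np.isclose(alpha, 2.0):
        # classical quadratic-entropy estimator:
        # H_2 = -log( (1/N^2) * sum_ij exp(-d_ij^2 / (2 sigma^2)) )
        return -np.log(K.sum() / N ** 2)
    # generic alpha: entropy of the normalized kernel spectrum (one common
    # matrix-based estimator; assumed here, not taken from the paper)
    eigvals = np.linalg.eigvalsh(K / np.trace(K))
    eigvals = eigvals[eigvals > 1e-12]
    return np.log(np.sum(eigvals ** alpha)) / (1.0 - alpha)

# Toy usage: 5 patterns with Euclidean dissimilarities
rng = np.random.default_rng(0)
X = rng.random((5, 3))
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
print(renyi_entropy_dissimilarity(D, alpha=2.0, sigma=0.5))
```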

    Uncertainty shocks of Trump election in an interval model of stock market

    This paper proposes a new class of nonlinear interval models for interval-valued time series. By matching the interval model with interval observations, we develop a nonlinear minimum-distance estimation method for the proposed models and establish the asymptotic theory for the proposed estimators. Unlike traditional point-based methods, the proposed interval modelling approach can assess changes in the trend and the volatility simultaneously. Within the proposed interval framework, this paper examines the impact of the 2016 US presidential election (henceforth the Trump election) on the US stock market as a case study. Treating the daily high-low range as a valid proxy for market efficiency, we employ an interval-valued return to jointly measure fundamental value movements and market efficiency. Empirical results provide strong evidence that the Trump election increased the level/trend and lowered the volatility of the S&P 500 index in both ex ante and ex post analyses. Furthermore, the longer half-life of the impact on fundamental value (62.4 days) than on the high-low range (15.9 days) shows that the impact of Trump's victory on fundamental value is more persistent than its impact on market efficiency.
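
    The estimation procedure is not spelled out in this abstract. As a purely illustrative sketch of minimum-distance estimation on interval-valued observations, the Python snippet below fits a made-up interval-AR(1)-style specification to simulated low/high data by minimizing the squared distance between fitted and observed interval endpoints; the model form, the parameter names (a, b, c), and the simulated data are assumptions for illustration, not the authors' model or estimator.

```python
import numpy as np
from scipy.optimize import minimize

def simulate_intervals(T=250, seed=0):
    """Simulate T observations of a [low, high] interval series."""
    rng = np.random.default_rng(seed)
    mid = np.cumsum(rng.normal(0.0, 0.01, T))      # latent trend
    width = np.abs(rng.normal(0.02, 0.005, T))     # daily high-low range
    return np.column_stack([mid - width / 2, mid + width / 2])

def interval_distance(params, Y):
    """Mean squared distance between observed intervals and one-step-ahead
    fitted intervals implied by a toy interval-AR(1) specification."""
    a, b, c = params
    mid = (Y[:, 0] + Y[:, 1]) / 2
    rge = Y[:, 1] - Y[:, 0]
    mid_hat = a + b * mid[:-1]                     # fitted midpoint
    rge_hat = np.abs(c) * rge[:-1]                 # fitted range
    low_hat, high_hat = mid_hat - rge_hat / 2, mid_hat + rge_hat / 2
    return np.mean((low_hat - Y[1:, 0]) ** 2 + (high_hat - Y[1:, 1]) ** 2)

Y = simulate_intervals()
fit = minimize(interval_distance, x0=[0.0, 0.9, 0.9], args=(Y,), method="Nelder-Mead")
print(fit.x)   # estimated (a, b, c) of the toy interval model
```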

    A precise bare simulation approach to the minimization of some distances. Foundations

    In information theory -- as well as in the adjacent fields of statistics, machine learning, artificial intelligence, signal processing, and pattern recognition -- many flexibilizations of the omnipresent Kullback-Leibler information distance (relative entropy) and of the closely related Shannon entropy have become frequently used tools. The main goal of this paper is to tackle the corresponding constrained minimization (respectively, maximization) problems with a newly developed dimension-free bare (pure) simulation method. Within our discrete setup of arbitrary dimension, almost no assumptions (such as convexity) on the set of constraints are needed, and our method is precise (i.e., it converges in the limit). As a side effect, we also derive an innovative way of constructing new useful distances/divergences. To illustrate the core of our approach, we present numerous examples. The potential for widespread applicability is indicated as well; in particular, we provide many recent references for uses of the involved distances/divergences and entropies in various research fields (which may also serve as an interdisciplinary interface).
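
    To make the optimization problem concrete, the Python sketch below computes the Kullback-Leibler divergence between discrete distributions and searches for its minimizer over a constraint set by naive rejection-style Monte Carlo sampling; the constraint (distributions on {0, 1, 2} with mean at least 1.5) and the Dirichlet proposal are illustrative assumptions, and the search is not the bare-simulation method developed in the paper.

```python
import numpy as np

def kl_divergence(q, p, eps=1e-12):
    """Kullback-Leibler divergence D(q || p) for discrete distributions."""
    q = np.asarray(q, dtype=float)
    p = np.asarray(p, dtype=float)
    return float(np.sum(q * np.log((q + eps) / (p + eps))))

def min_kl_by_sampling(p, constraint, n_samples=20_000, seed=0):
    """Naive Monte Carlo search for min_{q in Omega} D(q || p): draw
    candidate distributions from a Dirichlet and keep the feasible one
    with the smallest divergence (illustrative only)."""
    rng = np.random.default_rng(seed)
    best_q, best_val = None, np.inf
    for q in rng.dirichlet(np.ones(len(p)), size=n_samples):
        if constraint(q):
            val = kl_divergence(q, p)
            if val < best_val:
                best_q, best_val = q, val
    return best_q, best_val

# Example: reference distribution on {0, 1, 2}, constraint set Omega =
# {q : mean of q is at least 1.5}
p = np.array([0.5, 0.3, 0.2])
constraint = lambda q: q @ np.array([0.0, 1.0, 2.0]) >= 1.5
q_star, d_star = min_kl_by_sampling(p, constraint)
print(q_star, d_star)
```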