
    Discriminative Segmental Cascades for Feature-Rich Phone Recognition

    Discriminative segmental models, such as segmental conditional random fields (SCRFs) and segmental structured support vector machines (SSVMs), have had success in speech recognition via both lattice rescoring and first-pass decoding. However, such models suffer from slow decoding, hampering the use of computationally expensive features, such as segment neural networks or other high-order features. A typical solution is to use approximate decoding, either by beam pruning in a single pass or by beam pruning to generate a lattice followed by a second pass. In this work, we study discriminative segmental models trained with a hinge loss (i.e., segmental structured SVMs). We show that beam search is not suitable for learning rescoring models in this approach, though it gives good approximate decoding performance when the model is already well-trained. Instead, we consider an approach inspired by structured prediction cascades, which use max-marginal pruning to generate lattices. We obtain a high-accuracy phonetic recognition system with several expensive feature types: a segment neural network, a second-order language model, and second-order phone boundary features.
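
    As a rough, self-contained illustration of the max-marginal pruning step mentioned in this abstract (not the paper's implementation; the tiny lattice, scores, and pruning margin below are invented for the example), one can compute, for every arc in a segment lattice, the score of the best complete path passing through that arc and drop arcs that fall too far below the globally best path:

        # Toy max-marginal pruning on a small segment lattice (a DAG).
        # Each arc is (start_node, end_node, label, score); nodes are time boundaries.
        # The arcs, scores, and margin are made up purely for illustration.
        arcs = [
            (0, 1, "a", 2.0), (0, 2, "b", 1.5),
            (1, 2, "c", 0.5), (1, 3, "d", 3.0),
            (2, 3, "e", 2.5),
        ]
        N = 3  # final node

        def best_path_scores(arcs, N):
            """forward[v]: best score from node 0 to v; backward[v]: best score from v to N."""
            forward = {v: float("-inf") for v in range(N + 1)}
            backward = {v: float("-inf") for v in range(N + 1)}
            forward[0], backward[N] = 0.0, 0.0
            for u, v, _, s in sorted(arcs):                 # arcs only go forward, so this is topological
                forward[v] = max(forward[v], forward[u] + s)
            for u, v, _, s in sorted(arcs, reverse=True):   # reverse topological order
                backward[u] = max(backward[u], s + backward[v])
            return forward, backward

        forward, backward = best_path_scores(arcs, N)
        best = forward[N]                                   # score of the best full path

        # The max-marginal of an arc is the best full path constrained to use it.
        # Keep an arc only if its max-marginal is within `margin` of the global best.
        margin = 0.5
        pruned = [(u, v, lab, s) for (u, v, lab, s) in arcs
                  if forward[u] + s + backward[v] >= best - margin]
        print(pruned)   # the low-scoring arc "b" is dropped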

    Can FCA-based Recommender System Suggest a Proper Classifier?

    The paper briefly introduces multiple classifier systems and describes a new algorithm that improves classification accuracy by recommending a proper classifier for each object. The recommendation rests on the assumption that a classifier is likely to predict the label of an object correctly if it has correctly classified the object's neighbors. The process of assigning a classifier to each object is based on Formal Concept Analysis. We explain the idea of the algorithm with a toy example and describe our first experiments with real-world datasets. Comment: 10 pages, 1 figure, 4 tables, ECAI 2014, workshop "What FCA can do for Artificial Intelligence".
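
    A stripped-down sketch of the recommendation idea described above, under loose assumptions: the paper builds the neighbourhoods with Formal Concept Analysis, whereas the example below substitutes a plain nearest-neighbour search, and it assumes a list of already fitted scikit-learn-style classifiers with a predict method; the names and parameters are hypothetical.

        import numpy as np

        def recommend_and_predict(classifiers, X_val, y_val, X_test, k=5):
            """For each test object, use the classifier that was most often
            correct on that object's k nearest validation neighbours."""
            # val_correct[i, j] = 1.0 if classifier j labelled validation object i correctly
            val_correct = np.stack(
                [clf.predict(X_val) == y_val for clf in classifiers], axis=1
            ).astype(float)
            preds = []
            for x in X_test:
                dists = np.linalg.norm(X_val - x, axis=1)
                nbrs = np.argsort(dists)[:k]                      # the object's neighbourhood
                best = int(np.argmax(val_correct[nbrs].sum(axis=0)))
                preds.append(classifiers[best].predict(x.reshape(1, -1))[0])
            return np.array(preds)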

    Signatures of Infinity: Nonergodicity and Resource Scaling in Prediction, Complexity, and Learning

    We introduce a simple analysis of the structural complexity of infinite-memory processes built from random samples of stationary, ergodic finite-memory component processes. Such processes are familiar from the well-known multi-armed bandit problem. We contrast our analysis with computation-theoretic and statistical inference approaches to understanding their complexity. The result is an alternative view of the relationship between predictability, complexity, and learning that highlights the distinct ways in which informational and correlational divergences arise in complex ergodic and nonergodic processes. We draw out consequences for the resource divergences that delineate the structural hierarchy of ergodic processes and for processes that are themselves hierarchical. Comment: 8 pages, 1 figure; http://csc.ucdavis.edu/~cmg/compmech/pubs/soi.pd
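
    The contrast between ergodic components and the nonergodic process built from them can be made concrete with a small simulation (a toy construction with invented parameters, not the analysis in the paper): each realization first draws one finite-memory component and then emits from it, so time averages depend on the realization while the ensemble average does not.

        import random

        def realization(length, biases=(0.1, 0.9)):
            """Draw one ergodic component (a coin bias), then emit an i.i.d. binary sequence from it."""
            p = random.choice(biases)
            return p, [1 if random.random() < p else 0 for _ in range(length)]

        random.seed(0)
        for _ in range(4):
            p, xs = realization(10_000)
            print(f"component bias {p:.1f}   time average {sum(xs) / len(xs):.3f}")
        # Each time average sits near 0.1 or 0.9, never near the ensemble mean of 0.5:
        # the mixture process is nonergodic even though every component is ergodic.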

    Evolutionary Microeconomics and the Theory of Expectations

    This paper sketches a framework for the analysis of expectations in an evolutionary microeconomics. The core proposition is that expectations form a network structure, and that the geometry of that network provides a suitable guide to its dynamical behaviour. This is a development towards a theory of the computational processes that construct the data set of expectations. The role of probability theory is examined in this context. Two key issues are explored: (1) the nature and stability of expectations when they form a complex network; and (2) the way in which this may be modelled within a multi-agent simulation platform. It is argued that multi-agent simulation (a-life) techniques provide an expedient analytical environment for studying the dynamic nature of mass expectations, as generated or produced objects, in a way that bridges micro- and macroeconomics.
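
    A minimal agent-based sketch in the spirit of the modelling approach described above (the ring network, update rule, and parameters are invented for illustration and are not taken from the paper): agents hold a scalar expectation and repeatedly revise it toward the expectations of their network neighbours.

        import random

        N, STEPS, WEIGHT, NOISE = 50, 200, 0.3, 0.02
        random.seed(1)
        expectations = [random.random() for _ in range(N)]
        neighbours = {i: [(i - 1) % N, (i + 1) % N] for i in range(N)}   # a simple ring network

        for _ in range(STEPS):
            i = random.randrange(N)                          # one agent revises per step
            local = sum(expectations[j] for j in neighbours[i]) / len(neighbours[i])
            expectations[i] += WEIGHT * (local - expectations[i]) + random.gauss(0, NOISE)

        print(f"spread of expectations after {STEPS} steps: {max(expectations) - min(expectations):.3f}")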

    What Is a Macrostate? Subjective Observations and Objective Dynamics

    We consider the question of whether thermodynamic macrostates are objective consequences of dynamics, or subjective reflections of our ignorance of a physical system. We argue that they are both; more specifically, that the set of macrostates forms the unique maximal partition of phase space which 1) is consistent with our observations (a subjective fact about our ability to observe the system) and 2) obeys a Markov process (an objective fact about the system's dynamics). We review the ideas of computational mechanics, an information-theoretic method for finding optimal causal models of stochastic processes, and argue that macrostates coincide with the "causal states" of computational mechanics. Defining a set of macrostates thus consists of an inductive process where we start with a given set of observables, and then refine our partition of phase space until we reach a set of states which predict their own future, i.e. which are Markovian. Macrostates arrived at in this way are provably optimal statistical predictors of the future values of our observables. Comment: 15 pages, no figure
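
    The refinement procedure sketched in this abstract can be illustrated with a brute-force toy (the fixed history length, the binary alphabet, and the rounding used to compare distributions are arbitrary choices for the example, not the paper's construction): histories are grouped together exactly when they predict the same distribution over the next symbol.

        import random
        from collections import defaultdict

        def causal_state_partition(sequence, history_len=2, ndigits=1):
            """Group length-`history_len` histories whose empirical next-symbol
            distributions agree after rounding to `ndigits` decimals."""
            counts = defaultdict(lambda: defaultdict(int))
            for t in range(history_len, len(sequence)):
                history = tuple(sequence[t - history_len:t])
                counts[history][sequence[t]] += 1
            states = defaultdict(list)
            for history, nxt in counts.items():
                total = sum(nxt.values())
                dist = tuple(round(nxt.get(s, 0) / total, ndigits) for s in (0, 1))
                states[dist].append(history)     # same predicted future => same state
            return dict(states)

        # Toy data in which a 1 is never followed by another 1.
        random.seed(2)
        seq, prev = [], 0
        for _ in range(5000):
            x = 0 if prev == 1 else random.randint(0, 1)
            seq.append(x)
            prev = x
        print(causal_state_partition(seq))       # two groups of histories: {00, 10} and {01}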