
    On the structure of finite level and \omega-decomposable Borel functions

    We give a full description of the structure under inclusion of all finite level Borel classes of functions, and provide an elementary proof of the well-known fact that not every Borel function can be written as a countable union of \Sigma^0_\alpha-measurable functions (for every fixed 1 \leq \alpha < \omega_1). Moreover, we present some results concerning those Borel functions which are \omega-decomposable into continuous functions (also called countably continuous functions in the literature): such results should be viewed as a contribution towards the goal of generalizing a remarkable theorem of Jayne and Rogers to all finite levels, and in fact they allow us to prove some restricted forms of such generalizations. We also analyze finite level Borel functions in terms of composition of simpler functions, and we finally present an application to Banach space theory.

    Comment: 31 pages, 2 figures, revised version, accepted for publication in the Journal of Symbolic Logic
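    For orientation, the Jayne–Rogers theorem referenced above is, in its standard formulation, a decomposability result at the second level of the Borel hierarchy (for X an analytic subset of a Polish space and Y separable metrizable); the generalizations the abstract alludes to concern the analogous statement with higher finite levels in place of \Sigma^0_2:

    ```latex
    % Standard formulation of the Jayne--Rogers theorem
    % (X an analytic subset of a Polish space, Y separable metrizable).
    \begin{theorem}[Jayne--Rogers]
    Let $f \colon X \to Y$. Then $f^{-1}(A) \in \Sigma^0_2(X)$ for every
    $A \in \Sigma^0_2(Y)$ if and only if $X$ can be written as a countable
    union $X = \bigcup_{n \in \omega} X_n$ of closed sets such that
    $f \restriction X_n$ is continuous for every $n \in \omega$.
    \end{theorem}
    ```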

    Learning morphology with Morfette

    Morfette is a modular, data-driven, probabilistic system which learns to perform joint morphological tagging and lemmatization from morphologically annotated corpora. The system is composed of two learning modules which are trained to predict morphological tags and lemmas using the Maximum Entropy classifier. A third module dynamically combines the predictions of the Maximum Entropy models and outputs a probability distribution over tag-lemma pair sequences. The lemmatization module exploits the idea of recasting lemmatization as a classification task by using class labels which encode mappings from wordforms to lemmas. Experimental evaluation results and error analysis on three morphologically rich languages show that the system achieves high accuracy with no language-specific feature engineering or additional resources.
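    One common way to encode wordform-to-lemma mappings as class labels (the abstract does not spell out Morfette's exact scheme, so the encoding below is an illustrative assumption) is a suffix-rewrite rule: strip the longest common prefix of wordform and lemma, and use the pair (suffix removed, suffix added) as the label. The same rule then generalizes to unseen wordforms:

    ```python
    def suffix_rule(wordform: str, lemma: str) -> str:
        """Encode the wordform-to-lemma mapping as a class label:
        drop the longest common prefix and record the suffix rewrite."""
        i = 0
        while i < min(len(wordform), len(lemma)) and wordform[i] == lemma[i]:
            i += 1
        return f"-{wordform[i:]}+{lemma[i:]}"

    def apply_rule(wordform: str, rule: str) -> str:
        """Apply a suffix-rewrite class label to a (possibly unseen) wordform."""
        removed, added = rule[1:].split("+", 1)
        base = wordform[: len(wordform) - len(removed)] if removed else wordform
        return base + added

    rule = suffix_rule("walking", "walk")
    print(rule)                        # -ing+
    print(apply_rule("talking", rule))  # talk
    ```

    Because the label set is finite and shared across the vocabulary, a standard classifier over such labels covers both regular inflection and, via fully spelled-out rules, irregular forms.
    
    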

    A Meta-Logic of Inference Rules: Syntax

    This work introduces a meta-language for working with multiple-conclusion inference rules that admit asserted propositions alongside rejected propositions. The presence of rejected propositions, and especially the presence of the rule of reverse substitution, requires a certain change in the definition of structurality.

    AI Feynman: a Physics-Inspired Method for Symbolic Regression

    A core challenge for both physics and artificial intelligence (AI) is symbolic regression: finding a symbolic expression that matches data from an unknown function. Although this problem is likely to be NP-hard in principle, functions of practical interest often exhibit symmetries, separability, compositionality and other simplifying properties. In this spirit, we develop a recursive multidimensional symbolic regression algorithm that combines neural network fitting with a suite of physics-inspired techniques. We apply it to 100 equations from the Feynman Lectures on Physics, and it discovers all of them, while previous publicly available software cracks only 71; for a more difficult test set, we improve the state of the art success rate from 15% to 90%.

    Comment: 15 pages, 2 figs. Our code is available at https://github.com/SJ001/AI-Feynman and our Feynman Symbolic Regression Database for benchmarking can be downloaded at https://space.mit.edu/home/tegmark/aifeynman.htm
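    One of the simplifying properties the abstract mentions, separability, admits a very compact numerical test. As a hedged sketch (not the paper's actual implementation): a function is additively separable, f(x, y) = g(x) + h(y), exactly when its mixed second difference vanishes, i.e. f(x2, y2) - f(x2, y1) - f(x1, y2) + f(x1, y1) = 0 for all sample points. Detecting this lets a recursive algorithm split one 2-D fit into two 1-D fits:

    ```python
    import math

    def is_additively_separable(f, points, tol=1e-8):
        """Numerically test f(x, y) == g(x) + h(y): for every pair of
        sample points, the mixed difference
        f(x2, y2) - f(x2, y1) - f(x1, y2) + f(x1, y1) must vanish."""
        for (x1, y1) in points:
            for (x2, y2) in points:
                d = f(x2, y2) - f(x2, y1) - f(x1, y2) + f(x1, y1)
                if abs(d) > tol:
                    return False
        return True

    pts = [(0.1, 0.2), (1.3, -0.7), (2.0, 0.5)]
    print(is_additively_separable(lambda x, y: x**2 + math.sin(y), pts))  # True
    print(is_additively_separable(lambda x, y: x * y, pts))               # False
    ```

    In practice the test would be run on the neural network fitted to the data rather than on the data directly, so that the probe points (x2, y1) and (x1, y2) need not appear in the training set.
    
    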