340 research outputs found

    A Potpourri of Reason Maintenance Methods

    We present novel methods to compute changes to materialized views in logic databases such as those used by rule-based reasoners. Such reasoners have to address the problem of changing axioms in the presence of materializations of derived atoms. Existing approaches have drawbacks: some require generating and evaluating large transformed programs in Datalog with negation, even though the source program is in plain Datalog and significantly smaller; others recompute the whole extension of a predicate even if only a small part of that extension is affected by the change. The methods presented in this article overcome these drawbacks and derive additional information that is also useful for explanation, at the price of an adaptation of semi-naive forward chaining.
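
    As context for the semi-naive evaluation mentioned above, the following minimal Python sketch shows delta-driven forward chaining for a toy transitive-closure program; the rule encoding and helper names are illustrative assumptions, not the authors' actual method or formalism.

        # Rules (toy example):  path(X, Y) :- edge(X, Y).
        #                       path(X, Z) :- edge(X, Y), path(Y, Z).
        def seminaive_paths(edges):
            total = set(edges)   # materialized view: all derived path atoms so far
            delta = set(edges)   # atoms newly derived in the previous round
            while delta:
                new = set()
                # semi-naive step: join the base edge relation only against the
                # newly derived path atoms, instead of against the whole view
                for (x, y) in edges:
                    for (y2, z) in delta:
                        if y == y2 and (x, z) not in total:
                            new.add((x, z))
                total |= new
                delta = new
            return total

        print(sorted(seminaive_paths({("a", "b"), ("b", "c"), ("c", "d")})))
        # [('a', 'b'), ('a', 'c'), ('a', 'd'), ('b', 'c'), ('b', 'd'), ('c', 'd')]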

    Algebraic aspects of increasing subsequences

    We present a number of results relating partial Cauchy-Littlewood sums, integrals over the compact classical groups, and increasing subsequences of permutations. These include: integral formulae for the distribution of the longest increasing subsequence of a random involution with a constrained number of fixed points; new formulae for partial Cauchy-Littlewood sums, as well as new proofs of old formulae; relations of these expressions to orthogonal polynomials on the unit circle; and explicit bases for invariant spaces of the classical groups, together with appropriate generalizations of the straightening algorithm.
    Comment: LaTeX+amsmath+eepic; 52 pages. Expanded introduction, new references, other minor changes.
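
    For orientation, two classical identities in this circle of ideas (stated here as standard background, not as the article's new results) are the Cauchy identity for Schur functions and Rains' unitary-group integral for constrained increasing subsequences, in LaTeX:

        \prod_{i,j} \frac{1}{1 - x_i y_j} = \sum_{\lambda} s_\lambda(x)\, s_\lambda(y),
        \qquad
        \#\{\sigma \in S_N : \operatorname{lis}(\sigma) \le k\} = \int_{U(k)} \lvert \operatorname{tr} g \rvert^{2N} \, dg,

    where the s_\lambda are Schur functions, lis(\sigma) is the length of the longest increasing subsequence of \sigma, and the integral is taken with respect to Haar measure on the unitary group U(k).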

    Efficient Classification for Metric Data

    Recent advances in large-margin classification of data residing in general metric spaces (rather than Hilbert spaces) enable classification under various natural metrics, such as string edit and earthmover distance. A general framework developed for this purpose by von Luxburg and Bousquet [JMLR, 2004] left open the questions of computational efficiency and of providing direct bounds on generalization error. We design a new algorithm for classification in general metric spaces whose runtime and accuracy depend on the doubling dimension of the data points, and which can thus achieve superior classification performance in many common scenarios. The algorithmic core of our approach is an approximate (rather than exact) solution to the classical problems of Lipschitz extension and of Nearest Neighbor Search. The algorithm's generalization performance is guaranteed via the fat-shattering dimension of Lipschitz classifiers, and we present experimental evidence of its superiority to some common kernel methods. As a by-product, we offer a new perspective on the nearest neighbor classifier, which yields significantly sharper risk asymptotics than the classic analysis of Cover and Hart [IEEE Trans. Info. Theory, 1967].
    Comment: This is the full version of an extended abstract that appeared in Proceedings of the 23rd COLT, 2010.
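
    As a toy illustration of classifying directly in a metric space, here is a minimal Python sketch of the plain 1-nearest-neighbor rule under string edit distance; it uses exhaustive search and none of the doubling-dimension machinery or Lipschitz-extension approximation the paper is actually about, and all names are hypothetical.

        def edit_distance(a, b):
            # classic dynamic-programming Levenshtein distance
            prev = list(range(len(b) + 1))
            for i, ca in enumerate(a, 1):
                cur = [i]
                for j, cb in enumerate(b, 1):
                    cur.append(min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + (ca != cb)))
                prev = cur
            return prev[-1]

        def nn_classify(train, metric, query):
            # 1-NN rule: return the label of the closest labeled point
            return min(train, key=lambda point: metric(point[0], query))[1]

        train = [("kitten", 0), ("mitten", 0), ("sitting", 1), ("setting", 1)]
        print(nn_classify(train, edit_distance, "fitting"))   # -> 1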

    Comparison of random variables from a game-theoretic perspective

    This work consists of four related parts, divided into eight chapters. A first part introduces the framework of cycle-transitivity, developed by De Baets et al. It is shown that this framework is ideally suited for describing and comparing forms of transitivity of probabilistic relations. Not only does it encompass most known concepts of transitivity, it also describes new types of transitivity that are encountered in this work (such as isostochastic transitivity and dice-transitivity). The author made many non-trivial and sometimes vital contributions to the development of this framework.
    A second part consists of the development and study of a new method to compare random variables. This method, which bears the name generalized dice model, was developed by De Meyer et al. and De Schuymer et al., and can be seen as a graded alternative to the well-known concept of first degree stochastic dominance.
    A third part involves the determination of the optimal strategies of three game variants that are closely related to the developed comparison scheme. The definitions of these variants differ from each other solely by the copula that is used to define the payoff matrix. It turns out, however, that the characterization of the optimal strategies, done by De Schuymer et al., is completely different for each variant.
    A last part includes the study of some combinatorial problems that originated from the investigation of the transitivity of probabilistic relations obtained by utilizing the developed method to compare random variables. The study, done by De Schuymer et al., includes the introduction of some new and interesting concepts in partition theory and combinatorics. A more thorough discussion, in which each section of this work is taken into account, can be found in the overview at the beginning of this manuscript.
    Although this work is oriented towards a mathematical audience, the introduced concepts are immediately applicable in practical situations. Firstly, the framework of cycle-transitivity provides an easy means to represent and compare obtained probabilistic relations. Secondly, the generalized dice model delivers a useful alternative to the concept of stochastic dominance for comparing random variables. Thirdly, the considered dice games can be viewed in an economic context in which competitors have the same resources and alternatives, and must choose how to distribute these resources over their alternatives.
    Finally, it must be noted that this work still leaves opportunities for future research. As immediate candidates we see, firstly, the investigation of the transitivity of generalized dice models in which the random variables are pairwise coupled by a different copula; secondly, the characterization of the transitivity of higher-dimensional dice models, starting with dimension 4; thirdly, the study of the applicability of the introduced comparison schemes in areas such as market efficiency, portfolio selection, risk estimation, capital budgeting, discounted cash flow analysis, etc.
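
    To make the dice-model style of comparison concrete, the following minimal Python sketch computes the probabilistic relation Q(X, Y) = P(X > Y) + 1/2 P(X = Y) for independent uniformly rolled dice; the three dice below are a standard textbook example of an intransitive cycle and are not taken from this work.

        from itertools import product

        def win_prob(x, y):
            # Q(X, Y) = P(X > Y) + 0.5 * P(X = Y) for independent uniform dice
            score = sum((a > b) + 0.5 * (a == b) for a, b in product(x, y))
            return score / (len(x) * len(y))

        A = (2, 2, 4, 4, 9, 9)
        B = (1, 1, 6, 6, 8, 8)
        C = (3, 3, 5, 5, 7, 7)

        # each pairwise value is 5/9: A beats B, B beats C, and C beats A,
        # so the relation Q is not transitive in the ordinary sense
        print(win_prob(A, B), win_prob(B, C), win_prob(C, A))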