    Analyzing sparse dictionaries for online learning with kernels

    Many signal processing and machine learning methods share essentially the same linear-in-the-parameters model, with as many parameters as available samples, as in kernel-based machines. Sparse approximation is essential in many disciplines, and new challenges are emerging in online learning with kernels. To this end, several sparsity measures have been proposed in the literature to quantify sparse dictionaries and to construct relevant ones, the most prominent being the distance, approximation, coherence, and Babel measures. In this paper, we analyze sparse dictionaries based on these measures. By conducting an eigenvalue analysis, we show that these sparsity measures share many properties, including the linear independence condition and the well-posedness of the induced optimization problem. Furthermore, we prove that there exists a quasi-isometry between the parameter (i.e., dual) space and the dictionary's induced feature space.
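
    A minimal numerical sketch of two of the dictionary measures named above, the coherence and the Babel measure, together with the eigenvalue view of well-posedness. The Gaussian kernel, bandwidth, and random dictionary below are illustrative assumptions, not choices from the paper.

```python
import numpy as np

def gaussian_kernel(X, Y, bandwidth=1.0):
    """Gaussian kernel matrix between the rows of X and the rows of Y."""
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-sq / (2 * bandwidth**2))

def coherence(K):
    """Largest off-diagonal entry of the Gram matrix (atoms are unit-norm
    in feature space, since k(x, x) = 1 for the Gaussian kernel)."""
    return np.abs(K - np.diag(np.diag(K))).max()

def babel(K, m):
    """Babel measure mu1(m): worst case, over atoms, of the sum of the m
    largest correlations of that atom with the other atoms."""
    off = np.abs(K - np.diag(np.diag(K)))
    return np.sort(off, axis=1)[:, ::-1][:, :m].sum(axis=1).max()

rng = np.random.default_rng(0)
D = rng.standard_normal((20, 3))   # hypothetical dictionary: 20 atoms in R^3
K = gaussian_kernel(D, D)
# Gershgorin: all eigenvalues of K lie in [1 - mu1(n-1), 1 + mu1(n-1)], so a
# small Babel measure keeps K invertible (linear independence, well-posedness).
print(f"coherence     = {coherence(K):.3f}")
print(f"Babel mu1(2)  = {babel(K, 2):.3f}")
print(f"lambda_min(K) = {np.linalg.eigvalsh(K).min():.3f}")
```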

    Causal inference using the algorithmic Markov condition

    Inferring the causal structure that links n observables is usually based upon detecting statistical dependences and choosing simple graphs that make the joint measure Markovian. Here we argue why causal inference is also possible when only single observations are present. We develop a theory of how to generate causal graphs that explain similarities between single objects. To this end, we replace the notion of conditional stochastic independence in the causal Markov condition with the vanishing of conditional algorithmic mutual information and describe the corresponding causal inference rules. We explain why a consistent reformulation of causal inference in terms of algorithmic complexity implies a new inference principle that also takes into account the complexity of conditional probability densities, making it possible to select among Markov-equivalent causal graphs. This insight provides a theoretical foundation for a heuristic principle proposed in earlier work. We also discuss how to replace Kolmogorov complexity with decidable complexity criteria. This can be seen as an algorithmic analog of replacing the empirically undecidable question of statistical independence with practical independence tests based on implicit or explicit assumptions about the underlying distribution.
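
    The closing point, trading uncomputable Kolmogorov complexity for decidable criteria, has a familiar practical analog in compression-based approximations. The toy sketch below (an illustration, not the authors' construction) uses zlib's compressed length as a crude upper bound on complexity and plugs it into the standard identity I(x : y) ≈ C(x) + C(y) − C(xy).

```python
import random
import zlib

def C(s: bytes) -> int:
    """Crude upper bound on Kolmogorov complexity: zlib-compressed length."""
    return len(zlib.compress(s, 9))

def algorithmic_mi(x: bytes, y: bytes) -> int:
    """Compression-based stand-in for algorithmic mutual information."""
    return C(x) + C(y) - C(x + y)

random.seed(0)
x = bytes(random.randrange(256) for _ in range(2000))  # incompressible string
y = x                                                  # y is a copy of x
z = bytes(random.randrange(256) for _ in range(2000))  # independent randomness

print(algorithmic_mi(x, y))  # close to C(x): y is fully described by x
print(algorithmic_mi(x, z))  # close to zero: nothing shared beyond overhead
```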

    A White Paper on the Relevance of Social Capital for the College of Agriculture and Natural Resources (CANR)

    Social capital is about relationships that are often based on earned or inherited kernels of commonality. Social capital raises the ethical question of when relationships should be allowed to influence outcomes. The essential theory underlying the social capital paradigm is that relationships of sympathy or social capital influence almost every interpersonal transaction. Since interpersonal transactions occur in many settings, the study of social capital is multi-disciplinary and interested in such diverse topics as charitable giving, leadership development, educational achievements, migration patterns, formation of cooperatives, how people care for the environment, diffusion of technology, advertising, economic development, family integrity, flow of legal, recreational, and health services, management of organizations, community development, animal health, passage of legislation, and the creation of civil society. Social capital is relevant to the College of Agriculture and Natural Resources (CANR) because it represents an important resource that must be studied and managed to achieve CANR's mission.

    Parameterized Algorithmics for Computational Social Choice: Nine Research Challenges

    Computational Social Choice is an interdisciplinary research area involving Economics, Political Science, and Social Science on the one side, and Mathematics and Computer Science (including Artificial Intelligence and Multiagent Systems) on the other. Typical computational problems studied in this field include the vulnerability of voting procedures to attacks and preference aggregation in multi-agent systems. Parameterized Algorithmics is a subfield of Theoretical Computer Science that seeks to exploit meaningful problem-specific parameters in order to identify tractable special cases of problems that are computationally hard in general. In this paper, we propose nine of our favorite research challenges concerning the parameterized complexity of problems appearing in this context.
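
    As a concrete, textbook illustration of the parameterized mindset (not an example taken from the paper): the bounded search tree for Vertex Cover runs in roughly O(2^k · m) time, exponential only in the solution-size parameter k and polynomial in the input, so small k yields a tractable special case of an NP-hard problem.

```python
def vertex_cover_at_most_k(edges, k):
    """Bounded search tree for Vertex Cover, FPT in the cover size k:
    any uncovered edge (u, v) must have an endpoint in the cover, so
    branch on the two choices; the recursion depth is at most k."""
    uncovered = list(edges)
    if not uncovered:
        return set()              # every edge is covered
    if k == 0:
        return None               # edges remain but the budget is spent
    u, v = uncovered[0]
    for pick in (u, v):
        rest = [e for e in uncovered if pick not in e]
        sub = vertex_cover_at_most_k(rest, k - 1)
        if sub is not None:
            return sub | {pick}
    return None

# Triangle plus a pendant edge: no single vertex covers it, but two do.
E = [(1, 2), (2, 3), (1, 3), (3, 4)]
print(vertex_cover_at_most_k(E, 1))  # None
print(vertex_cover_at_most_k(E, 2))  # e.g. {1, 3}
```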

    Quadratic distances on probabilities: A unified foundation

    This work builds a unified framework for the study of quadratic form distance measures as they are used in assessing the goodness of fit of models. Many important procedures have this structure, but the theory for these methods is dispersed and incomplete. Central to the statistical analysis of these distances is the spectral decomposition of the kernel that generates the distance. We show how this decomposition determines the limiting distribution of natural goodness-of-fit tests. Additionally, we develop a new notion based on this decomposition, the spectral degrees of freedom of the test. The degrees of freedom are easy to compute and estimate, and can be used as a guide in the construction of useful procedures in this class. Published in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org): http://dx.doi.org/10.1214/009053607000000956
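
    A minimal numerical sketch of the spectral quantities involved: eigenvalues of an empirically centered kernel matrix, collapsed into a single effective-degrees-of-freedom number. The Gaussian kernel, the centering, and the Satterthwaite-style summary (Σλ)²/Σλ² are assumptions standing in for the paper's exact definitions.

```python
import numpy as np

def gaussian_kernel(x, y, h=0.5):
    return np.exp(-0.5 * ((x[:, None] - y[None, :]) / h) ** 2)

rng = np.random.default_rng(1)
x = rng.standard_normal(300)   # sample playing the role of the fitted model

# Empirically centered kernel matrix: the sample analog of the kernel whose
# spectral decomposition drives the limiting law of the distance statistic.
n = len(x)
H = np.eye(n) - np.ones((n, n)) / n
Kc = H @ gaussian_kernel(x, x) @ H

lam = np.linalg.eigvalsh(Kc / n)
lam = lam[lam > 1e-12]

# Moment-match the weighted chi-square limit sum(lam_j * chi2_1) to a single
# scaled chi-square: an effective-degrees-of-freedom summary of the spectrum.
dof = lam.sum() ** 2 / (lam ** 2).sum()
print(f"spectral degrees of freedom ~ {dof:.1f}")
```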

    Bayesian analysis of ranking data with the constrained Extended Plackett-Luce model

    Multistage ranking models, including the popular Plackett-Luce distribution (PL), rely on the assumption that the ranking process is performed sequentially, by assigning the positions from the top to the bottom (forward order). A recent contribution to the ranking literature relaxed this assumption by adding a discrete-valued reference order parameter, yielding the novel Extended Plackett-Luce model (EPL). Inference on the EPL and its generalization into a finite mixture framework was originally addressed from the frequentist perspective. In this work, we propose Bayesian estimation of the EPL with order constraints on the reference order parameter. The proposed restrictions reflect a meaningful rank assignment process. By combining the restrictions with a data augmentation strategy and the conjugacy of the Gamma prior distribution with the EPL, we facilitate the construction of a tuned joint Metropolis-Hastings algorithm within Gibbs sampling to simulate from the posterior distribution. The Bayesian approach allows the inference on the additional discrete-valued parameter, and the assessment of its estimation uncertainty, to be addressed more efficiently. The usefulness of the proposal is illustrated with applications to simulated and real datasets.
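
    For readers unfamiliar with the baseline model, here is a minimal sketch of the forward-order Plackett-Luce distribution that the EPL extends: positions are filled top-down, each stage choosing among the remaining items with probability proportional to its support weight. The weights below are illustrative, and the EPL's reference-order machinery is not reproduced here.

```python
import numpy as np

def sample_pl(w, rng):
    """Draw one ranking from a forward-order Plackett-Luce model."""
    w = np.asarray(w, dtype=float)
    items, ranking = list(range(len(w))), []
    while items:
        p = w[items] / w[items].sum()
        ranking.append(items.pop(rng.choice(len(items), p=p)))
    return ranking

def pl_loglik(ranking, w):
    """Log-likelihood of a complete ranking under the same model."""
    w = np.asarray(w, dtype=float)
    ll, remaining = 0.0, list(ranking)
    for item in ranking:
        ll += np.log(w[item]) - np.log(w[remaining].sum())
        remaining.remove(item)
    return ll

rng = np.random.default_rng(2)
w = [5.0, 3.0, 1.0, 1.0]   # illustrative support weights
r = sample_pl(w, rng)
print(r, pl_loglik(r, w))
```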