1,084 research outputs found

    Reduction of Markov Chains using a Value-of-Information-Based Approach

    Full text link
    In this paper, we propose an approach to obtain reduced-order models of Markov chains. Our approach is composed of two information-theoretic processes. The first is a means of comparing pairs of stationary chains on different state spaces, which is done via the negative Kullback-Leibler divergence defined on a model joint space. Model reduction is achieved by solving a value-of-information criterion with respect to this divergence. Optimizing the criterion leads to a probabilistic partitioning of the states in the high-order Markov chain. A single free parameter that emerges through the optimization process dictates both the partition uncertainty and the number of state groups. We provide a data-driven means of choosing the 'optimal' value of this free parameter, which sidesteps the need to know a priori the number of state groups in an arbitrary chain. Comment: Submitted to Entropy
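
    As a rough illustration of the kind of procedure this abstract describes, the sketch below softly assigns the states of a chain to groups by trading off the KL divergence between each state's transition row and its group's aggregate row against partition uncertainty; the inverse temperature `beta` stands in for the single free parameter mentioned above. This is a minimal reconstruction under our own assumptions, not the authors' algorithm, and all names and values are illustrative.

```python
# Minimal sketch: value-of-information-style soft aggregation of a
# Markov chain's states. Not the paper's code; `beta` (the single free
# parameter) controls how sharp the probabilistic partition becomes.
import numpy as np

def kl(p, q, eps=1e-12):
    p, q = p + eps, q + eps
    return np.sum(p * np.log(p / q))

def reduce_chain(P, n_groups, beta, n_iter=200, seed=0):
    """Softly partition the states of transition matrix P (n x n)."""
    rng = np.random.default_rng(seed)
    n = P.shape[0]
    # q[i, k] = probability that state i belongs to group k
    q = rng.dirichlet(np.ones(n_groups), size=n)
    for _ in range(n_iter):
        # group prototype rows: weighted averages of P's rows
        w = q / q.sum(axis=0, keepdims=True)
        prototypes = w.T @ P                       # (n_groups x n)
        # Gibbs-style soft reassignment, sharper as beta grows
        d = np.array([[kl(P[i], prototypes[k]) for k in range(n_groups)]
                      for i in range(n)])
        q = np.exp(-beta * d)
        q /= q.sum(axis=1, keepdims=True)
    return q, prototypes

# toy 4-state chain with two nearly lumpable blocks
P = np.array([[0.80, 0.15, 0.03, 0.02],
              [0.20, 0.75, 0.03, 0.02],
              [0.02, 0.03, 0.70, 0.25],
              [0.02, 0.03, 0.30, 0.65]])
q, _ = reduce_chain(P, n_groups=2, beta=5.0)
print(np.round(q, 2))   # soft membership of each state in each group
```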

    Laplacian Mixture Modeling for Network Analysis and Unsupervised Learning on Graphs

    Full text link
    Laplacian mixture models identify overlapping regions of influence in unlabeled graph and network data in a scalable and computationally efficient way, yielding useful low-dimensional representations. By combining Laplacian eigenspace and finite mixture modeling methods, they provide probabilistic or fuzzy dimensionality reductions or domain decompositions for a variety of input data types, including mixture distributions, feature vectors, and graphs or networks. Provable optimal recovery using the algorithm is analytically shown for a nontrivial class of cluster graphs. Heuristic approximations for scalable high-performance implementations are described and empirically tested. Connections to PageRank and community detection in network analysis demonstrate the wide applicability of this approach. The origins of fuzzy spectral methods, beginning with generalized heat or diffusion equations in physics, are reviewed and summarized. Comparisons to other dimensionality reduction and clustering methods for challenging unsupervised machine learning problems are also discussed. Comment: 13 figures, 35 references
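
    A minimal sketch of the pipeline this abstract outlines, under our own assumptions: embed the graph with the smoothest eigenvectors of its normalized Laplacian, then fit a finite Gaussian mixture in that eigenspace so the posterior responsibilities act as fuzzy, possibly overlapping memberships. The library choices (scipy, scikit-learn) and parameters are ours, not the paper's.

```python
# Sketch of Laplacian eigenspace + finite mixture modeling: fuzzy node
# memberships from a GMM fit in the Laplacian embedding. Illustrative
# only; the paper's algorithm and guarantees are more specific.
import numpy as np
from scipy.linalg import eigh
from sklearn.mixture import GaussianMixture

def laplacian_mixture(A, n_components, n_eigvecs=None):
    """A: symmetric adjacency matrix. Returns (n_nodes, k) memberships."""
    k = n_eigvecs or n_components
    d = A.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    L = np.eye(A.shape[0]) - D_inv_sqrt @ A @ D_inv_sqrt  # normalized Laplacian
    _, vecs = eigh(L)                     # eigenvalues ascending
    X = vecs[:, :k]                       # smoothest k eigenvectors
    gmm = GaussianMixture(n_components=n_components, random_state=0).fit(X)
    return gmm.predict_proba(X)           # fuzzy memberships

# two triangles joined by a single bridge edge
A = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1
print(np.round(laplacian_mixture(A, 2), 2))
```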

    Atrio – attribution model orchestrator

    Get PDF
    Project Work presented as the partial requirement for obtaining a Master's degree in Information Management, specialization in Information Systems and Technologies Management. In Digital Advertising, Attribution Modelling is used to assess the contribution of media touchpoints to the campaign outcome by analyzing each person's sequence of contacts and interactions with these touchpoints, designated as the Consumer Journey. The ability to acquire, model and analyze campaign data to derive meaningful insights usually involves proprietary tools provided by campaign delivery platforms. ATRIO is proposed as an open-source framework for Attribution Modelling, orchestrating the data pipeline through transformation, integration, and delivery, to provide Attribution Modelling capabilities for digital media agencies with proprietary data who need control over the Attribution Modelling process. From a tabular dataset, ATRIO can produce simple heuristics such as last-click analysis, as well as data-driven attribution models based on Shapley values from cooperative game theory and on Markov chains. As opposed to the black-box tools offered by campaign delivery platforms, which focus on the performance of their own media channels, ATRIO empowers digital media agencies to customize and apply different Attribution Models for each campaign, providing an agnostic, open-source, holistic and multi-channel analysis.
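
    To make the two model families concrete, here is a hedged sketch (our own code, not ATRIO's actual API) of a last-click heuristic and a first-order Markov-chain model scored by the standard removal effect of each channel; the journeys and channel names are made up for illustration.

```python
# Sketch of two attribution models over consumer journeys: last-click
# and Markov-chain removal effects. Hypothetical data and names; not
# ATRIO's interface.
from collections import Counter, defaultdict

journeys = [                        # (touchpoint path, converted?)
    (["search", "display", "email"], True),
    (["display", "email"], True),
    (["search"], False),
    (["email", "search"], True),
]

def last_click(journeys):
    """Heuristic: all credit to the final touchpoint of converting paths."""
    credit = Counter()
    for path, converted in journeys:
        if converted:
            credit[path[-1]] += 1
    return credit

def transitions(journeys):
    """First-order Markov chain with absorbing 'conv' and 'null' states."""
    counts = defaultdict(Counter)
    for path, converted in journeys:
        seq = ["start"] + path + ["conv" if converted else "null"]
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    return {s: {t: c / sum(nxt.values()) for t, c in nxt.items()}
            for s, nxt in counts.items()}

def p_conv(T, removed=None, n_iter=500):
    """Absorption probability in 'conv' from 'start' (iterative sweeps);
    removing a channel reroutes any visit to it into 'null'."""
    states = set(T) | {t for nxt in T.values() for t in nxt}
    p = {s: 1.0 if s == "conv" else 0.0 for s in states}
    for _ in range(n_iter):
        for s, nxt in T.items():
            p[s] = 0.0 if s == removed else sum(
                prob * p[t] for t, prob in nxt.items())
    return p["start"]

T = transitions(journeys)
base = p_conv(T)
effects = {c: 1 - p_conv(T, removed=c) / base
           for c in {t for path, _ in journeys for t in path}}
print(last_click(journeys), effects)
```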

    A game-based approach towards human augmented image annotation.

    Get PDF
    PhD thesis. Image annotation is a difficult task to achieve in an automated way. In this thesis, a human-augmented approach to tackle this problem is discussed and suitable strategies are derived to solve it. The proposed technique is inspired by human-based computation, in what is called "human-augmented" processing, to overcome the limitations of fully automated technology for closing the semantic gap. The approach aims to exploit what millions of individual gamers are keen to do, i.e. enjoy computer games, while annotating media. In this thesis, the image annotation problem is tackled by a game-based framework. This approach combines image processing and a game-theoretic model to gather media annotations. Although the proposed model behaves similarly to a single-player game, the underlying approach is designed around a two-player model that exploits the current player's contributions together with those of previously recorded players to improve annotation accuracy. In addition, the proposed framework is designed to predict the player's intention through Markovian and sequential-sampling inference in order to detect cheating and improve annotation performance. Finally, the proposed techniques are comprehensively evaluated on three different image datasets and selected representative results are reported.
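
    The sequential-sampling cheat detection could, for instance, resemble Wald's sequential probability ratio test. The sketch below, our own illustration rather than the thesis's actual method, accumulates evidence from whether each submitted tag agrees with the consensus annotation and stops once "honest" or "cheating" is sufficiently supported; the agreement rates and error bounds are assumed values.

```python
# Sketch of SPRT-style cheat detection from tag/consensus agreements.
# Hypothetical parameters: honest players agree with consensus at rate
# p1, cheaters at rate p0; alpha/beta bound the two error probabilities.
import math

def sprt(agreements, p0=0.25, p1=0.75, alpha=0.05, beta=0.05):
    """agreements: iterable of 0/1 (did the tag match consensus?).
    Returns 'honest', 'cheating', or 'undecided'."""
    upper = math.log((1 - beta) / alpha)    # accept H1 (honest)
    lower = math.log(beta / (1 - alpha))    # accept H0 (cheating)
    llr = 0.0
    for x in agreements:
        llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
        if llr >= upper:
            return "honest"
        if llr <= lower:
            return "cheating"
    return "undecided"

print(sprt([1, 1, 0, 1, 1, 1]))   # -> 'honest' after a few observations
print(sprt([0, 0, 1, 0, 0, 0]))   # -> 'cheating'
```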

    Applying Formal Methods to Networking: Theory, Techniques and Applications

    Full text link
    Despite its great importance, modern network infrastructure is remarkable for the lack of rigor in its engineering. The Internet, which began as a research experiment, was never designed to handle the users and applications it hosts today. The lack of formalization of the Internet architecture meant limited abstractions and modularity, especially for the control and management planes, thus requiring a new protocol to be built from scratch for every new need. This led to an unwieldy, ossified Internet architecture resistant to any attempts at formal verification, and an Internet culture where expediency and pragmatism are favored over formal correctness. Fortunately, recent work in the space of clean-slate Internet design---especially the software-defined networking (SDN) paradigm---offers the Internet community another chance to develop the right kind of architecture and abstractions. This has also led to a great resurgence of interest in applying formal methods to the specification, verification, and synthesis of networking protocols and applications. In this paper, we present a self-contained tutorial of the formidable amount of work that has been done in formal methods, and present a survey of its applications to networking. Comment: 30 pages, submitted to IEEE Communications Surveys and Tutorials
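
    As a toy example of the kind of property such formal methods verify (our own illustration, not a tool from the survey), the following explicit-state check walks static forwarding tables to confirm that a packet class is delivered without forwarding loops or black holes.

```python
# Miniature explicit-state check over static forwarding tables:
# verify delivery, and flag forwarding loops or black holes.
def check_delivery(tables, src, dst):
    """tables: {switch: next_hop}; follow hops, detecting loops."""
    seen, node = set(), src
    while node != dst:
        if node in seen:
            return f"loop at {node}"
        if node not in tables:
            return f"black hole at {node}"
        seen.add(node)
        node = tables[node]
    return "delivered"

tables = {"s1": "s2", "s2": "s3", "s3": "s2"}   # s2 <-> s3 loop
print(check_delivery(tables, "s1", "s4"))        # -> 'loop at s2'

tables["s3"] = "s4"                              # repair the route
print(check_delivery(tables, "s1", "s4"))        # -> 'delivered'
```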

    Learning And Decision Making In Groups

    Get PDF
    Many important real-world decision-making problems involve purely informational group interactions among individuals. Such situations arise, for example, in jury deliberations, expert committees, and medical diagnoses. We model the purely informational interactions of group members, where they receive private information and act based on that information while also observing other people's beliefs or actions. In the first part of the thesis, we address the computations that a rational (Bayesian) decision-maker should undertake to realize her optimal actions, maximizing her expected utility given all available information at every decision epoch. We use an approach called iterated elimination of infeasible signals (IEIS) to model the thinking process as well as the calculations of a Bayesian agent in a group decision scenario. Accordingly, as the Bayesian agent attempts to infer the true state of the world from her sequence of observations, she recursively refines her belief about the signals that other players could have observed and the beliefs that they would have held, given the assumption that the other players are also rational. We show that the IEIS algorithm runs in exponential time; however, when the group structure is a partially ordered set, the Bayesian calculations simplify and polynomial-time computation of the Bayesian recommendations is possible. We also analyze the computational complexity of Bayesian belief formation in groups and show that it is NP-hard. We investigate the factors underlying this computational complexity and show how belief calculations simplify in special network structures or in cases with strong inherent symmetries. We finally give insights about the statistical efficiency (optimality) of the beliefs and its relation to computational efficiency. In the second part, we propose the no-recall model of inference for heuristic decision-making that is rooted in the Bayes rule but avoids the complexities of rational inference in group interactions. According to this model, the group members behave rationally at the initiation of their interactions with each other; however, in the ensuing decision epochs, they rely on heuristics that replicate their experiences from the first stage and can be justified as optimal responses to simplified versions of their complex environments. We study the implications of the information structure, together with the properties of the probability distributions, which determine the structure of the so-called "Bayesian heuristics" that the agents follow in this model. We also analyze group decision outcomes in two classes of updates, linear action updates and log-linear belief updates, and show that many inefficiencies arise in group decisions as a result of repeated interactions between individuals, leading to overconfident beliefs as well as choice shifts toward extreme actions. Nevertheless, balanced regular structures demonstrate a measure of efficiency in terms of aggregating the initial information of individuals. Finally, we extend this model to a case where agents are exposed to a stream of private data in addition to observing each other's actions, and we analyze properties of learning and convergence under the no-recall framework.
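
    As a small illustration of the log-linear belief updates mentioned above (a minimal sketch under simplifying assumptions, not the thesis's exact model), each agent below geometrically averages its neighbors' beliefs over a binary state, i.e. applies a DeGroot-style rule in log space, after incorporating one private signal; the network weights and signal likelihoods are made up.

```python
# Sketch of log-linear belief updates on a network: geometric averaging
# of neighbors' beliefs, implemented as linear updates in log space.
# Illustrative network and signals; not the thesis's specific setup.
import numpy as np

def log_linear_updates(W, log_lik, n_steps=50):
    """W: row-stochastic weight matrix (n x n); log_lik: (n x 2) private
    log-likelihoods over two states. Returns final beliefs (n x 2)."""
    log_b = log_lik.copy()                       # initial Bayesian step
    for _ in range(n_steps):
        log_b = W @ log_b                        # geometric averaging
        log_b -= log_b.max(axis=1, keepdims=True)  # numerical stability
    b = np.exp(log_b)
    return b / b.sum(axis=1, keepdims=True)

# 3 agents on a line graph; only agent 0 gets an informative signal
W = np.array([[0.5, 0.5, 0.0],
              [0.3, 0.4, 0.3],
              [0.0, 0.5, 0.5]])
log_lik = np.log(np.array([[0.9, 0.1],          # strong evidence for state 0
                           [0.5, 0.5],
                           [0.5, 0.5]]))
print(np.round(log_linear_updates(W, log_lik), 3))
```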