
    Performing Relevance/ Relevant Performances: Shakespeare, Jonson, Hitchcock

    Engages with questions of historicism and presentism in the modern performance of early modern drama, and compares Ben Jonson with Alfred Hitchcock.

    Assessing relevance

    This paper advances an approach to relevance grounded in patterns of material inference, called argumentation schemes, which can account for both the reconstruction and the evaluation of relevance relations. To account for relevance in different types of dialogical contexts, including ones that pursue non-cognitive goals, and to measure the scalar strength of relevance, communicative acts are conceived of as dialogue moves, whose coherence with the preceding moves or with the context is represented as the conclusion of steps of material inference. Such inferences are described using argumentation schemes and are evaluated by considering 1) their defeasibility and 2) the acceptability of the implicit premises on which they are based. The assessment of both the relevance of an utterance and its strength depends on three interrelated factors: 1) the number of inferential steps required; 2) the types of argumentation schemes involved; and 3) the implicit premises required.
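
    The paper's account is dialectical rather than numeric, but the three factors suggest how a scalar strength could be combined in code. The sketch below is purely illustrative: the scheme names, defeasibility penalties, and combination rule are hypothetical assumptions, not taken from the paper.

```python
from dataclasses import dataclass

# Hypothetical defeasibility penalties per argumentation-scheme type (illustration only).
SCHEME_DEFEASIBILITY = {
    "classification": 0.1,
    "cause_to_effect": 0.3,
    "practical_reasoning": 0.4,
}

@dataclass
class InferenceStep:
    scheme: str                   # argumentation scheme linking the move to the context
    implicit_premises: int        # number of unstated premises the reconstruction needs
    premise_acceptability: float  # 0..1, how acceptable those implicit premises are

def relevance_strength(steps: list[InferenceStep]) -> float:
    """Toy scalar strength: longer chains, more defeasible schemes,
    and weaker implicit premises all reduce relevance."""
    if not steps:
        return 0.0
    strength = 1.0
    for step in steps:
        strength *= 1.0 - SCHEME_DEFEASIBILITY.get(step.scheme, 0.5)      # factor 2: scheme type
        strength *= step.premise_acceptability ** step.implicit_premises  # factor 3: implicit premises
    return strength / (1 + 0.2 * (len(steps) - 1))                        # factor 1: chain length

moves = [InferenceStep("classification", 1, 0.9),
         InferenceStep("practical_reasoning", 2, 0.7)]
print(round(relevance_strength(moves), 3))
```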

    Manifold Relevance Determination

    In this paper we present a fully Bayesian latent variable model which exploits conditional nonlinear (in)dependence structures to learn an efficient latent representation. The latent space is factorized to represent shared and private information from multiple views of the data. In contrast to previous approaches, we introduce a relaxation of the discrete segmentation and allow for a "softly" shared latent space. Further, Bayesian techniques allow us to automatically estimate the dimensionality of the latent spaces. The model is capable of capturing structure underlying extremely high-dimensional spaces. This is illustrated by modelling unprocessed images with tens of thousands of pixels. This also allows us to directly generate novel images from the trained model by sampling from the discovered latent spaces. We also demonstrate the model by predicting human pose in an ambiguous setting. Our Bayesian framework allows us to perform disambiguation in a principled manner by including latent space priors which incorporate the dynamic nature of the data. Comment: ICML201
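
    To give a rough feel for the factorized, "softly" shared latent space described above, here is a toy numpy sketch. The per-view relevance weights, the linear mapping, and every dimension in it are hypothetical stand-ins for what the Bayesian model would actually learn; it is not an implementation of the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)
N, Q = 200, 6                          # samples, latent dimensionality
X = rng.normal(size=(N, Q))            # latent coordinates

# Hypothetical per-view relevance weights over the latent dimensions:
# dims 0-1 shared, dim 2 private to view A, dim 3 private to view B, dims 4-5 switched off.
alpha_A = np.array([1.0, 0.9, 0.8, 0.0, 0.0, 0.0])
alpha_B = np.array([1.0, 0.8, 0.0, 0.7, 0.0, 0.0])

def generate_view(X, alpha, D, noise=0.05):
    """Linear toy stand-in for the nonlinear mapping: weight latent dims by relevance."""
    W = rng.normal(size=(X.shape[1], D)) * alpha[:, None]
    return X @ W + noise * rng.normal(size=(X.shape[0], D))

Y_A = generate_view(X, alpha_A, D=20)   # view A, 20 observed features
Y_B = generate_view(X, alpha_B, D=30)   # view B, 30 observed features

# "Soft" segmentation: a latent dimension is shared to the degree both views rely on it.
shared_score = np.minimum(alpha_A, alpha_B)
print("shared relevance per latent dimension:", shared_score)
```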

    China's unstoppable relevance

    Published version: https://www.researchgate.net/publication/339210813_China's_unstoppable_Relevance

    Relevance-based Word Embedding

    Learning a high-dimensional dense representation for vocabulary terms, also known as a word embedding, has recently attracted much attention in natural language processing and information retrieval tasks. The embedding vectors are typically learned based on term proximity in a large corpus. This means that the objective in well-known word embedding algorithms, e.g., word2vec, is to accurately predict adjacent word(s) for a given word or context. However, this objective is not necessarily equivalent to the goal of many information retrieval (IR) tasks. The primary objective in various IR tasks is to capture relevance instead of term proximity, syntactic, or even semantic similarity. This is the motivation for developing unsupervised relevance-based word embedding models that learn word representations based on query-document relevance information. In this paper, we propose two learning models with different objective functions: one learns a relevance distribution over the vocabulary set for each query, and the other classifies each term as belonging to the relevant or the non-relevant class for each query. To train our models, we used over six million unique queries and the top-ranked documents retrieved in response to each query, which are assumed to be relevant to the query. We extrinsically evaluate our learned word representation models on two IR tasks: query expansion and query classification. Both query expansion experiments on four TREC collections and query classification experiments on the KDD Cup 2005 dataset suggest that the relevance-based word embedding models significantly outperform state-of-the-art proximity-based embedding models such as word2vec and GloVe. Comment: to appear in the proceedings of the 40th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR '17)
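
    As a rough illustration of the second objective (classifying terms as relevant or non-relevant to a query), here is a toy numpy sketch. The supervision, dimensions, and update rule are hypothetical and much simplified; the paper trains on millions of real queries and their top-ranked documents.

```python
import numpy as np

rng = np.random.default_rng(1)
V, Q, d = 1000, 50, 32                 # vocabulary size, number of queries, embedding dim

term_emb = 0.01 * rng.normal(size=(V, d))
query_emb = 0.01 * rng.normal(size=(Q, d))

# Hypothetical supervision: for each query, a set of "relevant" terms that would, in
# practice, come from its top-ranked documents (sampled at random here).
relevant = {q: rng.choice(V, size=20, replace=False) for q in range(Q)}

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

lr = 0.1
for _ in range(5):                      # a few epochs of logistic-loss SGD
    for q in range(Q):
        pos = relevant[q]
        neg = rng.choice(V, size=len(pos), replace=False)   # random negative terms
        for terms, label in ((pos, 1.0), (neg, 0.0)):
            scores = term_emb[terms] @ query_emb[q]          # relevance logits
            grad = sigmoid(scores) - label                   # dLoss/dscore
            g_q = grad @ term_emb[terms]
            g_t = grad[:, None] * query_emb[q]
            query_emb[q] -= lr * g_q
            term_emb[terms] -= lr * g_t

# Terms scoring highest against a query embedding act as expansion candidates.
print("expansion candidates for query 0:", np.argsort(-(term_emb @ query_emb[0]))[:5])
```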

    Dilation and Asymmetric Relevance

    A characterization result of dilation in terms of positive and negative association admits an extremal counterexample, which we present together with a minor repair of the result. Dilation may be asymmetric, whereas covariation itself is symmetric. Dilation is nevertheless still characterized in terms of positive and negative covariation once the event to be dilated has been specified.
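
    For readers unfamiliar with dilation, the following numeric sketch shows the phenomenon the abstract is about: conditioning can strictly widen a probability interval. The credal set below is a standard textbook-style construction, not the paper's extremal counterexample.

```python
import numpy as np

# Credal set over two events H and E: every member has P(H) = P(E) = 1/2, but the
# dependence parameter t = P(H and E) is free in [0, 1/2].
ts = np.linspace(0.0, 0.5, 101)

p_E = np.full_like(ts, 0.5)            # unconditional: precisely 1/2 for every member
p_E_given_H = ts / 0.5                 # P(E | H) = t / P(H), ranges over [0, 1]
p_E_given_notH = (0.5 - ts) / 0.5      # P(E | not H), also ranges over [0, 1]

print("P(E)      in [%.2f, %.2f]" % (p_E.min(), p_E.max()))
print("P(E | H)  in [%.2f, %.2f]" % (p_E_given_H.min(), p_E_given_H.max()))
print("P(E | ~H) in [%.2f, %.2f]" % (p_E_given_notH.min(), p_E_given_notH.max()))
# Conditioning on H (or on ~H) widens the interval for E from a point to [0, 1]: dilation.
```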