
    Comparing Probabilistic Models for Melodic Sequences

    Modelling the real-world complexity of music is a challenge for machine learning. We address the task of modelling melodic sequences from the same music genre. We perform a comparative analysis of two probabilistic models: a Dirichlet Variable Length Markov Model (Dirichlet-VMM) and a Time Convolutional Restricted Boltzmann Machine (TC-RBM). We show that the TC-RBM learns descriptive music features, such as underlying chords and typical melody transitions and dynamics. We assess the models on future prediction and compare their performance to a VMM, the current state of the art in melody generation. Both models perform significantly better than the VMM, with the Dirichlet-VMM marginally outperforming the TC-RBM. Finally, we evaluate the short-order statistics of the models using the Kullback-Leibler divergence between test sequences and model samples, and show that our proposed methods match the statistics of the music genre significantly better than the VMM.
    Comment: in Proceedings of ECML-PKDD 2011. Lecture Notes in Computer Science, vol. 6913, pp. 289-304. Springer (2011)
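
    The Kullback-Leibler evaluation described above lends itself to a short sketch. The snippet below is an illustrative reconstruction, not the authors' code: ngram_dist, test_notes, and sampled_notes are assumed names, and the smoothing floor eps is an assumption to keep n-grams unseen in the model samples from producing an infinite divergence.

        import math
        from collections import Counter

        def ngram_dist(tokens, n):
            """Empirical distribution over the n-grams of a token sequence."""
            grams = Counter(tuple(tokens[i:i + n])
                            for i in range(len(tokens) - n + 1))
            total = sum(grams.values())
            return {g: c / total for g, c in grams.items()}

        def kl_divergence(p, q, eps=1e-8):
            """KL(p || q), flooring q at eps for n-grams it never produced."""
            return sum(pg * math.log(pg / q.get(g, eps))
                       for g, pg in p.items())

        # e.g. compare bigram statistics of held-out melodies and model samples:
        # kl = kl_divergence(ngram_dist(test_notes, 2), ngram_dist(sampled_notes, 2))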

    Representation Learning: A Review and New Perspectives

    The success of machine learning algorithms generally depends on data representation, and we hypothesize that this is because different representations can entangle and hide more or less the different explanatory factors of variation behind the data. Although specific domain knowledge can be used to help design representations, learning with generic priors can also be used, and the quest for AI is motivating the design of more powerful representation-learning algorithms implementing such priors. This paper reviews recent work in the area of unsupervised feature learning and deep learning, covering advances in probabilistic models, auto-encoders, manifold learning, and deep networks. This motivates longer-term unanswered questions about the appropriate objectives for learning good representations, for computing representations (i.e., inference), and the geometrical connections between representation learning, density estimation, and manifold learning.
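
    As one concrete instance of the feature-learning methods the review surveys, here is a minimal auto-encoder sketch. It is illustrative only: the tied-weight architecture, layer sizes, and plain-NumPy gradient step are assumptions made for the example, not anything specified by the paper.

        import numpy as np

        def sigmoid(x):
            return 1.0 / (1.0 + np.exp(-x))

        class TiedAutoencoder:
            """One-hidden-layer auto-encoder with tied encoder/decoder weights."""

            def __init__(self, n_visible, n_hidden, seed=0):
                rng = np.random.default_rng(seed)
                self.W = rng.normal(0.0, 0.01, size=(n_visible, n_hidden))
                self.b_h = np.zeros(n_hidden)
                self.b_v = np.zeros(n_visible)

            def step(self, x, lr=0.1):
                """One gradient step on the squared reconstruction error of x."""
                h = sigmoid(x @ self.W + self.b_h)         # encode
                x_hat = sigmoid(h @ self.W.T + self.b_v)   # decode
                d_v = (x_hat - x) * x_hat * (1.0 - x_hat)  # decoder pre-activation grad
                d_h = (d_v @ self.W) * h * (1.0 - h)       # backprop to encoder
                self.W -= lr * (np.outer(x, d_h) + np.outer(d_v, h))
                self.b_h -= lr * d_h
                self.b_v -= lr * d_v
                return float(np.sum((x_hat - x) ** 2))

    The hidden activations h are the learned representation; the hope articulated in the review is that a good training objective makes them disentangle the explanatory factors behind the data.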

    Gamma rays and positrons from a decaying hidden gauge boson

    We study a scenario in which a hidden gauge boson constitutes the dominant component of dark matter and decays into standard model particles through a gauge kinetic mixing. Interestingly, the gamma rays and positrons produced in the decay of the hidden gauge boson can explain both the EGRET excess of diffuse gamma rays and the HEAT anomaly in the positron fraction. The spectra of the gamma rays and positrons have distinctive features: the absence of a gamma-ray line emission and a sharp peak in the positron fraction. Such features may be observed by the GLAST and PAMELA satellites.
    Comment: 16 pages, 4 figures, adding PAMELA data, the version accepted by PL
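
    The decay channel invoked here runs through kinetic mixing between the hidden gauge boson and hypercharge. For reference, the standard form of such a mixing term (a textbook expression, not one quoted from this paper) is

        \mathcal{L}_{\text{mix}} \;=\; -\frac{\epsilon}{2}\, B_{\mu\nu} X^{\mu\nu},

    where B_{\mu\nu} and X_{\mu\nu} are the hypercharge and hidden-sector field strengths. A small mixing parameter \epsilon suppresses the coupling of the hidden gauge boson to standard model particles, giving it a lifetime long enough (scaling as 1/\epsilon^2, since the decay rate scales as \epsilon^2) to act as decaying dark matter.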

    Training Restricted Boltzmann Machines on Word Observations

    The restricted Boltzmann machine (RBM) is a flexible tool for modeling complex data; however, there have been significant computational difficulties in using RBMs to model high-dimensional multinomial observations. In natural language processing applications, words are naturally modeled by K-ary discrete distributions, where K is determined by the vocabulary size and can easily be in the hundreds of thousands. The conventional approach to training RBMs on word observations is limited because it requires sampling the states of K-way softmax visible units during block Gibbs updates, an operation that takes time linear in K. In this work, we address this issue by employing a more general class of Markov chain Monte Carlo operators on the visible units, yielding updates with computational complexity independent of K. We demonstrate the success of our approach by training RBMs on hundreds of millions of word n-grams using larger vocabularies than previously feasible, and by using the learned features to improve performance on chunking and sentiment classification tasks, achieving state-of-the-art results on the latter.
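
    The key move, replacing an exact K-way softmax sample with an MCMC update whose cost does not grow with K, can be sketched as a Metropolis-Hastings step. The code below is an illustrative reconstruction under assumptions, not the authors' implementation: energy, propose, and proposal_prob are assumed callables, and the fixed proposal is imagined to be something like a unigram distribution with full support over the vocabulary, sampleable in O(1) via the alias method.

        import math
        import random

        def mh_word_update(current, energy, propose, proposal_prob):
            """One Metropolis-Hastings step for a K-ary softmax visible unit.

            energy(w) is the negative unnormalized log-probability of word w
            given the hidden units; it is evaluated only twice per step, so
            the update cost is independent of the vocabulary size K.
            proposal_prob(w) must be nonzero for every word.
            """
            cand = propose()  # draw from the fixed proposal, e.g. unigrams
            # target(cand)/target(current) * q(current)/q(cand), in logs
            log_accept = (energy(current) - energy(cand)
                          + math.log(proposal_prob(current))
                          - math.log(proposal_prob(cand)))
            if random.random() < math.exp(min(0.0, log_accept)):
                return cand
            return current

    Because the proposal is independent of the current state, the usual Hastings correction reduces to a ratio of proposal probabilities, and only two energy evaluations are needed per update.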

    Anomaly-Mediation and Sequestering from a Higher-Dimensional viewpoint

    We study a five-dimensional supergravity model with a boundary-localized visible sector exhibiting anomaly-mediated supersymmetry breaking, in which the central requirements of sequestering and radius stabilization are achieved perturbatively. This makes it possible to understand these various mechanisms in a more integrated and transparent fashion, mostly from the higher-dimensional viewpoint. Local supersymmetry, in the presence of visible sector quantum effects, is enforced by the formalism of the five-dimensional superconformal tensor calculus. The construction results in only mild warping, which allows a natural supersymmetry-breaking mediation mechanism of (finite) boundary-to-boundary gravity loops to co-dominate with anomaly-mediation, thereby solving the latter's tachyonic slepton problem. We make the non-trivial check that this can occur while dangerous loops of stabilizing fields remain highly suppressed. Our discussion is a well-controlled starting point for considering other generalizations of anomaly-mediation, or for string theory realizations.
    Comment: 33 pages, typos corrected, added references, version appearing in JHE
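
    For context on the tachyonic slepton problem mentioned above, the standard pure anomaly-mediation formulas (a textbook result in one common sign convention, not expressions taken from this paper) tie the soft masses to renormalization-group functions and the gravitino mass:

        M_a \;=\; \frac{\beta_a(g_a)}{g_a}\, m_{3/2},
        \qquad
        m^2_{\tilde{f}} \;=\; -\frac{1}{4}\, \frac{d\gamma_{\tilde{f}}}{d\ln\mu}\, m_{3/2}^2 .

    Slepton running is dominated by the electroweak gauge couplings, which are not asymptotically free, and this drives m^2_{\tilde{f}} negative; the boundary-to-boundary gravity-loop contribution described in the abstract is invoked precisely to lift these masses back above zero.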