
    Composite Correlation Quantization for Efficient Multimodal Retrieval

    Efficient similarity retrieval from large-scale multimodal databases is pervasive in modern search engines and social networks. To support queries across content modalities, the system should enable cross-modal correlation and computation-efficient indexing. While hashing methods have shown great potential in achieving this goal, current attempts generally fail to learn isomorphic hash codes in a seamless scheme; that is, they embed multiple modalities into a continuous isomorphic space and separately threshold the embeddings into binary codes, which incurs a substantial loss of retrieval accuracy. In this paper, we approach seamless multimodal hashing by proposing a novel Composite Correlation Quantization (CCQ) model. Specifically, CCQ jointly finds correlation-maximal mappings that transform different modalities into an isomorphic latent space, and learns composite quantizers that convert the isomorphic latent features into compact binary codes. An optimization framework is devised to preserve both intra-modal similarity and inter-modal correlation by minimizing both reconstruction and quantization errors, and it can be trained from both paired and partially paired data in linear time. A comprehensive set of experiments clearly shows the superior effectiveness and efficiency of CCQ against state-of-the-art hashing methods for both unimodal and cross-modal retrieval.
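
    The quantization half of this pipeline can be sketched in a few lines. Below is a toy illustration of composite quantization — approximating each latent vector by a sum of codewords, one per codebook, with assignments and codebooks refit alternately — applied to random features already assumed to live in the shared latent space. The sizes, the greedy assignment and the refit rule are our own simplifications, not the CCQ optimization itself.

```python
import numpy as np

# Toy composite quantization: each latent vector is approximated by the sum of
# M codewords, one per codebook. All sizes and the update scheme are assumptions.
rng = np.random.default_rng(0)
N, D, M, K = 200, 16, 4, 8           # points, latent dim, codebooks, codewords each
Z = rng.normal(size=(N, D))          # latent features (already in the shared space)
C = rng.normal(size=(M, K, D))       # M codebooks with K codewords of dimension D

codes = np.zeros((N, M), dtype=int)
for _ in range(10):                  # alternate assignment and codebook update
    for m in range(M):
        # assignment: pick, per codebook, the codeword that best fits the residual
        partial = sum(C[j][codes[:, j]] for j in range(M) if j != m)
        resid = Z - partial
        dists = ((resid[:, None, :] - C[m][None, :, :]) ** 2).sum(-1)
        codes[:, m] = dists.argmin(1)
    for m in range(M):
        # codebook update: refit each codeword to the mean of its residuals
        partial = sum(C[j][codes[:, j]] for j in range(M) if j != m)
        resid = Z - partial
        for k in range(K):
            mask = codes[:, m] == k
            if mask.any():
                C[m, k] = resid[mask].mean(0)

recon = sum(C[m][codes[:, m]] for m in range(M))
print("mean quantization error:", ((Z - recon) ** 2).sum(1).mean())
```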

    A Bayesian reassessment of nearest-neighbour classification

    The k-nearest-neighbour procedure is a well-known deterministic method used in supervised classification. This paper proposes a reassessment of this approach as a statistical technique derived from a proper probabilistic model; in particular, we modify the assessment made in a previous analysis of this method undertaken by Holmes and Adams (2002, 2003), and evaluated by Manocha and Girolami (2007), where the underlying probabilistic model is not completely well defined. Once a clear probabilistic basis for the k-nearest-neighbour procedure is established, we derive computational tools for conducting Bayesian inference on the parameters of the corresponding model. In particular, we assess the difficulties inherent in pseudo-likelihood and path-sampling approximations of an intractable normalising constant, and propose a perfect sampling strategy to implement a correct MCMC sampler associated with our model. If perfect sampling is not available, we suggest using a Gibbs sampling approximation. Illustrations of the performance of the corresponding Bayesian classifier are provided for several benchmark datasets, demonstrating in particular the limitations of the pseudo-likelihood approximation in this set-up.
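
    As a concrete anchor for the pseudo-likelihood issue discussed here, the sketch below evaluates a pseudo-log-likelihood for a simple probabilistic k-nearest-neighbour model in which each label, conditional on the rest, favours the classes of its k nearest neighbours. It is only a toy in the spirit of the Holmes-Adams formulation; the exact model, symmetrisation and normalisation in the paper differ, and all names here are ours.

```python
import numpy as np

# Pseudo-log-likelihood for a toy probabilistic k-NN model: conditionally on the
# other labels, class c has probability proportional to exp(beta * n_c / k),
# where n_c is the number of the k nearest neighbours carrying label c.
def knn_pseudo_log_likelihood(X, y, beta, k, n_classes):
    X, y = np.asarray(X, float), np.asarray(y, int)
    total = 0.0
    for i in range(len(y)):
        d = ((X - X[i]) ** 2).sum(1)
        d[i] = np.inf                              # exclude the point itself
        nbrs = np.argsort(d)[:k]                   # indices of the k nearest points
        counts = np.bincount(y[nbrs], minlength=n_classes)
        logits = beta * counts / k                 # conditional score per class
        total += logits[y[i]] - np.log(np.exp(logits).sum())
    return total

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 2))
y = (X[:, 0] > 0).astype(int)
print(knn_pseudo_log_likelihood(X, y, beta=2.0, k=5, n_classes=2))
```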

    Creation and characterization of vortex clusters in atomic Bose-Einstein condensates

    We show that a moving obstacle, in the form of an elongated paddle, can create vortices that are dispersed, or induce clusters of like-signed vortices, in 2D Bose-Einstein condensates. We propose new statistical measures of clustering based on Ripley's K-function, which are suited to the small size and small number of vortices in atomic condensates, which lack the huge number of length scales excited in larger classical and quantum turbulent fluid systems. The evolution and decay of clustering is analyzed using these measures. Experimentally, it should prove possible to create such an obstacle with a laser beam and a moving optical mask. The theoretical techniques we present are accessible to experimentalists and extend the methods currently available to induce 2D quantum turbulence in Bose-Einstein condensates. Comment: 9 pages, 9 figures
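
    For readers unfamiliar with the underlying statistic, the sketch below computes a plain (uncorrected) estimate of Ripley's K-function for a set of 2D points; the vortex-specific refinements and edge corrections proposed in the paper are omitted, and the box size and point pattern are arbitrary toy choices.

```python
import numpy as np

# Ripley's K-function, uncorrected estimate: K(r) is the average number of other
# points within distance r of a point, divided by the overall point density.
def ripley_k(points, radii, area):
    pts = np.asarray(points, float)
    n = len(pts)
    d = np.sqrt(((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1))
    np.fill_diagonal(d, np.inf)                    # ignore self-pairs
    density = n / area
    return np.array([(d < r).sum() / n / density for r in radii])

rng = np.random.default_rng(2)
pts = rng.uniform(0, 10, size=(50, 2))             # 50 points in a 10 x 10 box
radii = np.linspace(0.5, 3.0, 6)
print(ripley_k(pts, radii, area=100.0))
# Under complete spatial randomness K(r) is close to pi*r^2; larger values
# indicate clustering, smaller values indicate repulsion.
```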

    Bayesian Parameter Estimation for Latent Markov Random Fields and Social Networks

    Undirected graphical models are widely used in statistics, physics and machine vision. However, Bayesian parameter estimation for undirected models is extremely challenging, since evaluation of the posterior typically involves the calculation of an intractable normalising constant. This problem has received much attention, but very little of it has focussed on the important practical case where the data consist of noisy or incomplete observations of the underlying hidden structure. This paper specifically addresses this problem, comparing two alternative methodologies. In the first of these approaches, particle Markov chain Monte Carlo (Andrieu et al., 2010) is used to efficiently explore the parameter space, combined with the exchange algorithm (Murray et al., 2006) for avoiding the calculation of the intractable normalising constant (a proof showing that this combination targets the correct distribution is found in a supplementary appendix online). This approach is compared with approximate Bayesian computation (Pritchard et al., 1999). Applications to estimating the parameters of Ising models and exponential random graphs from noisy data are presented. Each algorithm used in the paper targets an approximation to the true posterior, owing to the use of MCMC to simulate from the latent graphical model in lieu of being able to do this exactly in general. The supplementary appendix also describes the nature of the resulting approximation. Comment: 26 pages, 2 figures, accepted in Journal of Computational and Graphical Statistics (http://www.amstat.org/publications/jcgs.cfm)
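
    To make the exchange-algorithm idea concrete, here is a minimal sketch for a tiny Ising model with a single coupling parameter, where the auxiliary draw is produced by a Gibbs run rather than a perfect sampler — exactly the kind of approximation the abstract refers to. The grid size, flat prior, proposal scale and run lengths are toy assumptions, not the paper's settings.

```python
import numpy as np

# Exchange algorithm for a small Ising model with coupling theta: the auxiliary
# graph w is simulated at the proposed theta, so the intractable normalising
# constants cancel in the acceptance ratio. Flat prior, symmetric proposal.
rng = np.random.default_rng(3)
L = 8                                   # grid side length

def suff_stat(x):
    # sum of spin products over horizontal and vertical neighbour pairs
    return (x[:, :-1] * x[:, 1:]).sum() + (x[:-1, :] * x[1:, :]).sum()

def gibbs_ising(theta, sweeps=60):
    x = rng.choice([-1, 1], size=(L, L))
    for _ in range(sweeps):
        for i in range(L):
            for j in range(L):
                s = 0
                if i > 0: s += x[i - 1, j]
                if i < L - 1: s += x[i + 1, j]
                if j > 0: s += x[i, j - 1]
                if j < L - 1: s += x[i, j + 1]
                p = 1.0 / (1.0 + np.exp(-2.0 * theta * s))
                x[i, j] = 1 if rng.random() < p else -1
    return x

y = gibbs_ising(0.3)                    # "observed" data generated at theta = 0.3
theta, s_y = 0.1, suff_stat(y)
samples = []
for _ in range(200):
    prop = theta + 0.1 * rng.normal()
    w = gibbs_ising(prop)               # auxiliary draw at the proposed theta
    log_alpha = (prop - theta) * s_y + (theta - prop) * suff_stat(w)
    if np.log(rng.random()) < log_alpha:
        theta = prop
    samples.append(theta)
print("posterior mean (flat prior):", np.mean(samples[100:]))
```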

    The statistical mechanics of networks

    We study the family of network models derived by requiring the expected properties of a graph ensemble to match a given set of measurements of a real-world network, while maximizing the entropy of the ensemble. Models of this type play the same role in the study of networks as is played by the Boltzmann distribution in classical statistical mechanics; they offer the best prediction of network properties subject to the constraints imposed by a given set of observations. We give exact solutions of models within this class that incorporate arbitrary degree distributions and arbitrary but independent edge probabilities. We also discuss some more complex examples with correlated edges that can be solved approximately or exactly by adapting various familiar methods, including mean-field theory, perturbation theory, and saddle-point expansions. Comment: 15 pages, 4 figures
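
    The "arbitrary degree distributions with independent edges" case admits a very compact illustration: fit the maximum-entropy ensemble whose expected degrees match a target sequence, then sample from it. The sketch below does this for a tiny toy sequence; the fitting loop is a naive damped iteration of our own, not the closed-form machinery of the paper.

```python
import numpy as np

# Maximum-entropy ensemble with fixed expected degrees: edges are independent
# with p_ij = 1 / (1 + exp(-(theta_i + theta_j))); the multipliers theta are
# fitted here by a crude damped fixed-point / gradient iteration (toy choice).
rng = np.random.default_rng(4)
target_deg = np.array([1.0, 2.0, 2.0, 3.0, 4.0, 4.0])   # desired expected degrees
n = len(target_deg)
theta = np.zeros(n)

def edge_probs(theta):
    s = theta[:, None] + theta[None, :]
    p = 1.0 / (1.0 + np.exp(-s))
    np.fill_diagonal(p, 0.0)                             # no self-loops
    return p

for _ in range(5000):                                    # damped updates toward the target
    theta += 0.05 * (target_deg - edge_probs(theta).sum(1))

p = edge_probs(theta)
print("fitted expected degrees:", p.sum(1).round(2))
upper = np.triu(rng.random((n, n)) < p, 1)               # sample one graph
A = (upper | upper.T).astype(int)
print("degrees of one sampled graph:", A.sum(1))
```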

    Solution of the 2-star model of a network

    The p-star model, or exponential random graph, is among the oldest and best-known of network models. Here we give an analytic solution for the particular case of the 2-star model, which is one of the most fundamental exponential random graphs. We derive expressions for a number of quantities of interest in the model and show that the degenerate region of the parameter space observed in computer simulations is a spontaneously symmetry-broken phase, separated from the normal phase of the model by a conventional continuous phase transition. Comment: 5 pages, 3 figures
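
    The mean-field picture behind the symmetry-broken (degenerate) region can be illustrated with a one-line self-consistency equation for the mean connectance, of the schematic form p = (1 + tanh(a + b p)) / 2; the precise relation between (a, b) and the model's two parameters depends on how the Hamiltonian is scaled, so the coefficients below are generic stand-ins rather than the paper's. Iterating the map from several starting points exposes the single-solution and two-solution regimes.

```python
import numpy as np

# Fixed points of a schematic mean-field self-consistency equation for the mean
# connectance p. Coefficients a, b are illustrative, not the paper's parameters.
def fixed_points(a, b, starts=(0.01, 0.5, 0.99), iters=5000):
    sols = set()
    for p in starts:
        for _ in range(iters):
            p = 0.5 * (1.0 + np.tanh(a + b * p))
        sols.add(round(p, 4))
    return sorted(sols)

# Weak 2-star coupling: a single solution (the "normal" phase).
print(fixed_points(a=-1.0, b=1.0))
# Strong coupling: two stable solutions, the sparse and dense branches of the
# symmetry-broken (degenerate) region seen in simulations.
print(fixed_points(a=-2.1, b=4.0))
```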

    Continuous inference for aggregated point process data

    The paper introduces new methods for inference with count data registered on a set of aggregation units. Such data are omnipresent in epidemiology because of confidentiality issues: it is much more common to know the county in which an individual resides, say, than to know their exact location in space. Inference for aggregated data has traditionally made use of models for discrete spatial variation, e.g. conditional auto-regressive models. We argue that such discrete models can be improved from both a scientific and an inferential perspective by using spatiotemporally continuous models to model the aggregated counts directly. We introduce methods for delivering (limiting) continuous inference with spatiotemporally aggregated count data in which the aggregation units might change over time and are subject to uncertainty. We illustrate our methods using two examples: from epidemiology, spatial prediction of malaria incidence in Namibia, and, from politics, forecasting voting under the proposed changes to parliamentary boundaries in the UK. © 2018 Royal Statistical Society
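
    A minimal generative sketch of this setup makes the modelling point concrete: a continuous intensity surface, counts observed only as Poisson totals over aggregation units, and a likelihood that links the two. Everything below — the surface, the 4x4 districts, the grid resolution — is an illustrative assumption, not the paper's malaria or boundary-change application.

```python
import numpy as np

# A continuous intensity surface observed only through counts aggregated over
# administrative units: the surface lives on a fine grid, each unit is a block
# of cells, and counts are Poisson with mean = intensity integrated over the unit.
rng = np.random.default_rng(5)
G = 40                                              # fine grid of G x G cells
xs, ys = np.meshgrid(np.linspace(0, 1, G), np.linspace(0, 1, G))
log_lam = 4.0 + 1.5 * np.exp(-((xs - 0.3) ** 2 + (ys - 0.6) ** 2) / 0.05)
cell = np.exp(log_lam) / (G * G)                    # expected count per grid cell

ux = np.minimum((4 * xs).astype(int), 3)            # 4 x 4 square "districts"
uy = np.minimum((4 * ys).astype(int), 3)
units = ux + 4 * uy
unit_means = np.array([cell[units == u].sum() for u in range(16)])
counts = rng.poisson(unit_means)                    # the data we would actually see

# Poisson log-likelihood (up to a constant) of the aggregated counts given the
# continuous surface; continuous inference fits the surface itself to this.
loglik = (counts * np.log(unit_means) - unit_means).sum()
print(counts.reshape(4, 4))
print("aggregated log-likelihood:", round(loglik, 2))
```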

    Careful prior specification avoids incautious inference for log-Gaussian Cox point processes

    The BCI forest dynamics research project was founded by S.P. Hubbell and R.B. Foster and is now managed by R. Condit, S. Lao, and R. Perez under the Center for Tropical Forest Science and the Smithsonian Tropical Research Institute in Panama. Numerous organizations have provided funding, principally the U.S. National Science Foundation, and hundreds of field workers have contributed. The data used can be requested, and requests are generally granted, at http://ctfs.si.edu/datarequest. Kriged estimates of the concentrations of soil nutrients were downloaded from http://ctfs.si.edu/webatlas/datasets/bci/soilmaps/BCIsoil.html. We acknowledge the principal investigators responsible for collecting and analysing the soil maps (Jim Dalling, Robert John, Kyle Harms, Robert Stallard and Joe Yavitt), the funding sources (NSF DEB 021104, 021115, 0212284, 0212818 and OISE 0314581, the STRI Soils Initiative and CTFS) and the field assistants (Paolo Segre and Juan Di Trani).

    Bayes-optimal inverse halftoning and statistical mechanics of the Q-Ising model

    On the basis of the statistical mechanics of the Q-Ising model, we formulate Bayesian inference for the problem of inverse halftoning, the inverse process of representing gray scales in images by means of black and white dots. Using Monte Carlo simulations, we investigate statistical properties of the inverse process; in particular, we reveal the condition for the Bayes-optimal solution, at which the mean-square error takes its minimum. The numerical result is qualitatively confirmed by analysis of the infinite-range model. As a demonstration of our approach, we apply the method to retrieve a grayscale image, the standard image `Lenna', from its halftoned version. We find that the Bayes-optimal solution gives a finely restored grayscale image which is very close to the original. Comment: 13 pages, 12 figures, using elsart.cls
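
    To fix ideas about the setup, the sketch below runs a small Metropolis sampler for a toy version of the problem: gray levels take Q discrete values (a Q-Ising configuration), the observed halftone is a dithered binary image, and the energy combines a nearest-neighbour smoothness term with a crude fidelity term pulling each pixel toward its observed dot. The weights, dithering rule and test image are our own simplifications, not the paper's Bayes-optimal formulation.

```python
import numpy as np

# Toy inverse halftoning with a Q-Ising smoothness prior and a crude data term;
# restoration is a single-site Metropolis sampler on the gray levels.
rng = np.random.default_rng(6)
Q, L = 8, 32
gx, gy = np.meshgrid(np.arange(L), np.arange(L))
orig = ((gx + gy) * (Q - 1)) // (2 * (L - 1))             # smooth grayscale ramp
halftone = (orig > rng.uniform(0, Q - 1, orig.shape)).astype(int)   # dithered dots

def local_dE(x, i, j, new, J=1.0, w=2.0):
    # energy change from setting pixel (i, j) to the proposed gray level "new"
    old = x[i, j]
    nbrs = []
    if i > 0: nbrs.append(x[i - 1, j])
    if i < L - 1: nbrs.append(x[i + 1, j])
    if j > 0: nbrs.append(x[i, j - 1])
    if j < L - 1: nbrs.append(x[i, j + 1])
    smooth = sum((new - v) ** 2 - (old - v) ** 2 for v in nbrs)
    target = (Q - 1) * halftone[i, j]                      # pull toward the observed dot
    fid = (new - target) ** 2 - (old - target) ** 2
    return J * smooth + w * fid / (Q - 1)

x = rng.integers(0, Q, size=(L, L))                        # random initial gray levels
beta = 1.0
for sweep in range(30):                                    # Metropolis sweeps
    for i in range(L):
        for j in range(L):
            new = rng.integers(0, Q)
            dE = local_dE(x, i, j, new)
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                x[i, j] = new
print("mean |restored - original|:", np.abs(x - orig).mean())
```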

    Common pitfalls, and how to avoid them, in child and adolescent psychopharmacology: Part I

    © The Author(s) 2024. As Faculty of the British Association for Psychopharmacology course on child and adolescent psychopharmacology, we present here what we deem to be the most common pitfalls in child and adolescent psychopharmacology, and how to avoid them. In this paper, we specifically address common pitfalls in the pharmacological treatment of attention-deficit/hyperactivity disorder, anxiety, bipolar disorder, depression, obsessive-compulsive disorder and related disorders, and tic disorder. Pitfalls in the treatment of other disorders are addressed in a separate paper (part II).