
    Generalized residual vector quantization for large scale data

    Vector quantization is an essential tool for tasks involving large scale data, for example, large scale similarity search, which is crucial for content-based information retrieval and analysis. In this paper, we propose a novel vector quantization framework that iteratively minimizes quantization error. First, we provide a detailed review of a relevant vector quantization method named \textit{residual vector quantization} (RVQ). Next, we propose \textit{generalized residual vector quantization} (GRVQ) to further improve over RVQ. Many vector quantization methods can be viewed as special cases of our proposed framework. We evaluate GRVQ on several large scale benchmark datasets for large scale search, classification and object retrieval, and compare GRVQ with existing methods in detail. Extensive experiments demonstrate that our GRVQ framework substantially outperforms existing methods in terms of quantization accuracy and computational efficiency.
    Comment: Published at the International Conference on Multimedia and Expo 201
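    The abstract does not give implementation details, but the baseline it builds on, residual vector quantization, can be sketched briefly: each stage quantizes the residual left by the previous stages, and a vector is reconstructed by summing one codeword per stage. The sketch below is illustrative only; the use of k-means for codebook training, the stage and codebook sizes, and all function names are assumptions, not the authors' GRVQ implementation.

    ```python
    # A minimal sketch of plain residual vector quantization (RVQ), the baseline
    # that GRVQ generalizes. Codebook training via k-means and the parameters
    # below are illustrative assumptions, not the paper's implementation.
    import numpy as np
    from sklearn.cluster import KMeans

    def train_rvq(X, n_stages=4, n_codewords=256, seed=0):
        """Train one codebook per stage on the residual left by earlier stages."""
        residual = X.copy()
        codebooks = []
        for _ in range(n_stages):
            km = KMeans(n_clusters=n_codewords, n_init=4, random_state=seed).fit(residual)
            codebooks.append(km.cluster_centers_)
            # Subtract the assigned codeword so the next stage models what is left.
            residual = residual - km.cluster_centers_[km.labels_]
        return codebooks

    def encode_rvq(X, codebooks):
        """Greedy stage-by-stage encoding: each stage picks its nearest codeword."""
        residual = X.copy()
        codes = []
        for C in codebooks:
            dists = ((residual[:, None, :] - C[None, :, :]) ** 2).sum(-1)
            idx = dists.argmin(1)
            codes.append(idx)
            residual = residual - C[idx]
        return np.stack(codes, axis=1)  # shape: (n_points, n_stages)

    def decode_rvq(codes, codebooks):
        """Reconstruct a vector by summing the selected codeword from every stage."""
        return sum(C[codes[:, m]] for m, C in enumerate(codebooks))
    ```

    GRVQ's improvement over this greedy baseline (how the stages are iteratively revisited to further reduce quantization error) is described in the paper itself and is not reproduced here.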

    Chinese Literature's Route to World Literature

    In his article "Chinese Literature's Route to World Literature" Hongtao Liu argues that Goethe's theory of world literature, based on the conflicting and unifying values of cosmopolitanism and localism, has fueled Chinese literature's desire to join world literatures. Proposed by Zhenduo Zheng with the notion of the unification of literature at the beginning of the twentieth century and developed through the global elements of twentieth-century Chinese literature in the 1980s, this notion remains a feature of Chinese literature's global trajectory in the twenty-first century. Liu argues that, despite a number of transitions, China's pursuit remains relevant and translation remains a significant route for Chinese literature to join the spaces of world literatures. He also posits that other routes, such as regional world literature and world literature in Chinese, are gaining in importance.

    Graph Regularized Tensor Sparse Coding for Image Representation

    Sparse coding (SC) is an unsupervised learning scheme that has received increasing interest in recent years. However, conventional SC vectorizes the input images, which destroys their intrinsic spatial structure. In this paper, we propose a novel graph regularized tensor sparse coding (GTSC) for image representation. GTSC preserves the local proximity of elementary structures in the image by adopting the newly proposed tubal-tensor representation. Simultaneously, it accounts for intrinsic geometric properties by imposing graph regularization, which has been successfully applied to uncover the geometric distribution of image data. Moreover, the sparse representations returned by GTSC have a clearer physical interpretation, as the key operation (i.e., circular convolution) in the tubal-tensor model preserves the shifting-invariance property. Experimental results on image clustering demonstrate the effectiveness of the proposed scheme.
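    As an illustration of the circular convolution the abstract names as the key operation of the tubal-tensor model, the sketch below shows how a dictionary tube and a coefficient tube combine, and why circularly shifting the coefficients only shifts the reconstruction by the same amount. The tube length, variable names, and random data are assumptions for exposition, not the GTSC implementation.

    ```python
    # A minimal sketch of the circular convolution ("tubal" product) that the
    # abstract identifies as the key operation of the tubal-tensor model.
    # Shapes and names are illustrative assumptions, not GTSC itself.
    import numpy as np

    def circ_conv(a, b):
        """Circular convolution of two length-n tubes, computed via the FFT."""
        return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

    rng = np.random.default_rng(0)
    n = 8
    dictionary_tube = rng.standard_normal(n)   # one tube fiber of the dictionary
    coeff_tube = rng.standard_normal(n)        # one tube fiber of the sparse code

    recon = circ_conv(dictionary_tube, coeff_tube)

    # Shifting-invariance property: circularly shifting the coefficients by k
    # circularly shifts the reconstruction by the same k.
    k = 3
    shifted_recon = circ_conv(dictionary_tube, np.roll(coeff_tube, k))
    assert np.allclose(shifted_recon, np.roll(recon, k))
    ```

    This shift property is what lets a single tubal-dictionary atom explain the same elementary structure at different positions, which is the intuition behind the "better physical interpretation" claimed for the returned sparse representations.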