
    Content-based product image retrieval using squared-hinge loss trained convolutional neural networks

    Convolutional neural networks (CNN) have proven highly effective in large-scale object detection and image classification, as well as in serving as feature extractors for content-based image retrieval. While CNN models for product image retrieval are typically trained with category label supervision and softmax loss, we propose extracting features from models trained with the squared-hinge loss, an alternative multiclass classification loss function. First, transfer learning is performed on a pre-trained model, followed by fine-tuning. Then, image features are extracted from the fine-tuned model and indexed using a nearest-neighbor indexing technique. Experiments are conducted on the VGG19, InceptionV3, MobileNetV2, and ResNet18 CNN models. The training results indicate that squared-hinge loss reduces the loss value in each epoch and stabilizes in fewer epochs than softmax loss. Retrieval results show that features from squared-hinge-trained models improve retrieval accuracy by up to 3.7% compared to features from softmax-trained models. Moreover, the squared-hinge-trained MobileNetV2 features perform best overall, while the ResNet18 features offer the lowest dimensionality with competitive accuracy.
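
    As an illustration of the pipeline this abstract describes (fine-tune a pre-trained CNN under squared-hinge loss, then use the penultimate layer as a feature extractor feeding a nearest-neighbor index), here is a minimal sketch using Keras and scikit-learn. The backbone choice, layer wiring, class count, and data shapes are illustrative assumptions, not the authors' exact configuration.

    ```python
    # Sketch only: backbone, class count, and shapes are assumptions.
    import numpy as np
    import tensorflow as tf
    from sklearn.neighbors import NearestNeighbors

    NUM_CLASSES = 10  # hypothetical number of product categories

    # Transfer learning: start from an ImageNet-pretrained backbone.
    base = tf.keras.applications.MobileNetV2(
        include_top=False, pooling="avg", input_shape=(224, 224, 3))
    logits = tf.keras.layers.Dense(NUM_CLASSES, activation="linear")(base.output)
    model = tf.keras.Model(base.input, logits)

    # Squared-hinge loss expects targets in {-1, +1}; Keras' built-in
    # SquaredHinge converts one-hot {0, 1} labels internally.
    model.compile(optimizer="adam", loss=tf.keras.losses.SquaredHinge())
    # model.fit(train_images, one_hot_labels, epochs=...)  # fine-tuning step

    # After fine-tuning, the pooled penultimate layer serves as the extractor.
    extractor = tf.keras.Model(base.input, base.output)

    # Index the gallery features with a nearest-neighbor structure.
    gallery = np.random.rand(1000, 224, 224, 3).astype("float32")  # placeholder
    features = extractor.predict(gallery, batch_size=32)
    index = NearestNeighbors(n_neighbors=10).fit(features)

    # Retrieval: embed the query image and look up its nearest gallery images.
    query_feat = extractor.predict(gallery[:1])
    distances, neighbor_ids = index.kneighbors(query_feat)
    ```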

    DEANN: Speeding up Kernel-Density Estimation using Approximate Nearest Neighbor Search

    Kernel Density Estimation (KDE) is a nonparametric method for estimating the shape of a density function, given a set of samples from the distribution. Recently, locality-sensitive hashing, originally proposed as a tool for nearest neighbor search, has been shown to enable fast KDE data structures. However, these approaches do not take advantage of the many other advances in algorithms for nearest neighbor search. We present an algorithm called Density Estimation from Approximate Nearest Neighbors (DEANN), which applies Approximate Nearest Neighbor (ANN) algorithms as a black-box subroutine to compute an unbiased estimate of the KDE. The idea is to use ANN to find the points that contribute most to the KDE, compute their contribution exactly, and approximate the remainder with Random Sampling (RS). We present a theoretical argument that an ANN subroutine can speed up the evaluation. Furthermore, we provide a C++ implementation with a Python interface that can use an arbitrary ANN implementation as a subroutine for KDE evaluation. We show empirically that our implementation outperforms state-of-the-art implementations on all high-dimensional datasets we considered, and matches the performance of RS in cases where the ANN yields no performance gains.
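
    The core DEANN recipe (exact kernel contributions from the near neighbors reported by the ANN, plus a random-sampling estimate of the tail) can be sketched as follows. This simplified sketch assumes a Gaussian kernel and uses scikit-learn's NearestNeighbors as a stand-in for the black-box ANN; the parameters k, m, and the bandwidth are illustrative, and the authors' actual implementation is their C++/Python package.

    ```python
    # Simplified DEANN-style KDE sketch; kernel and parameters are assumptions.
    import numpy as np
    from sklearn.neighbors import NearestNeighbors

    def gaussian_kernel(dists, bandwidth):
        return np.exp(-0.5 * (dists / bandwidth) ** 2)

    def deann_kde(query, data, ann, k=50, m=100, bandwidth=1.0, seed=None):
        """Estimate (1/n) * sum_i K(||query - x_i||) via near + sampled tail."""
        rng = np.random.default_rng(seed)
        n = len(data)

        # 1. Exact contribution of the k (approximate) nearest neighbors,
        #    which dominate the kernel sum.
        dists, idx = ann.kneighbors(query[None, :], n_neighbors=k)
        near = gaussian_kernel(dists[0], bandwidth).sum()

        # 2. Estimate the tail with m uniform random samples drawn from
        #    the remaining n - k points (unbiased for that partial sum).
        rest = np.setdiff1d(np.arange(n), idx[0])
        sample = rng.choice(rest, size=min(m, len(rest)), replace=False)
        tail_dists = np.linalg.norm(data[sample] - query, axis=1)
        tail = gaussian_kernel(tail_dists, bandwidth).mean() * len(rest)

        return (near + tail) / n

    # Usage: build the "ANN" index once, then evaluate the KDE per query.
    data = np.random.randn(10_000, 32)
    ann = NearestNeighbors().fit(data)
    print(deann_kde(np.zeros(32), data, ann, k=50, m=200, bandwidth=2.0))
    ```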

    27th Annual European Symposium on Algorithms: ESA 2019, September 9-11, 2019, Munich/Garching, Germany
