Unsupervised Learning using Pretrained CNN and Associative Memory Bank
Deep convolutional features extracted from a comprehensive labeled dataset
contain substantial representations that can be used effectively in a new
domain. Although generic features achieve good results on many visual tasks,
pretrained deep CNN models require fine-tuning to be more effective and to
provide state-of-the-art performance. Fine-tuning with the backpropagation
algorithm in a supervised setting is a time- and resource-consuming process.
In this paper, we present a new architecture and approach for unsupervised
object recognition that addresses the above-mentioned fine-tuning problem of
pretrained CNN-based supervised deep learning approaches while allowing
automated feature extraction. Unlike existing works,
our approach is applicable to general object recognition tasks. It uses a
pretrained (on a related domain) CNN model for automated feature extraction
pipelined with a Hopfield network based associative memory bank for storing
patterns for classification purposes. The use of an associative memory bank in
our framework eliminates backpropagation while providing competitive
performance on an unseen dataset.

Comment: Paper was accepted at the 2018 International Joint Conference on
Neural Networks (IJCNN 2018).
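The pipeline the abstract describes, pretrained-CNN features stored in a Hopfield associative memory and recalled for classification, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the feature extractor is replaced by random stand-in prototypes, patterns are binarized to bipolar vectors, storage uses the classical Hebbian outer-product rule, and classification picks the stored prototype with the largest overlap with the recalled state.

```python
import numpy as np

# Hypothetical stand-in for pretrained-CNN features: one bipolar (+1/-1)
# prototype per class. In the paper's setting these would be binarized
# activations from a CNN pretrained on a related domain.
rng = np.random.default_rng(0)
dim, n_classes = 64, 3
prototypes = np.sign(rng.standard_normal((n_classes, dim)))

# Hebbian storage (no backpropagation): W is the sum of outer products
# of the stored patterns, with the diagonal zeroed.
W = sum(np.outer(p, p) for p in prototypes).astype(float)
np.fill_diagonal(W, 0.0)

def recall(x, steps=10):
    """Synchronous Hopfield update until the state stops changing."""
    s = x.copy()
    for _ in range(steps):
        nxt = np.sign(W @ s)
        nxt[nxt == 0] = 1  # break zero ties deterministically
        if np.array_equal(nxt, s):
            break
        s = nxt
    return s

def classify(x):
    """Label = stored prototype with the largest overlap with the recalled state."""
    return int(np.argmax(prototypes @ recall(x)))

# Corrupt 10% of the bits of class 1's pattern, then recover its label.
probe = prototypes[1].copy()
flip = rng.choice(dim, size=dim // 10, replace=False)
probe[flip] *= -1
print(classify(probe))
```

With only a few stored patterns relative to the 64-dimensional state, the noisy probe settles back to its stored prototype, so the corrupted pattern is still assigned to class 1; this energy-descent recall is what lets the memory bank replace backpropagation-based fine-tuning for classification.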