Learning and Transferring Deep ConvNet Representations with Group-Sparse Factorization
Author
Abstract
Deep convolutional neural networks (Deep ConvNets or CNNs) have exhibited their promise as a universal image representation for recognition. In this work we explore how the transferability of such deep ConvNet representations, trained on large-scale annotated object-centric datasets (ImageNet), can be further enhanced for other visual recognition tasks with a limited amount of unlabeled training data. We use group-sparse non-negative matrix factorization (GSNMF), a variant of NMF, to identify a rich set of high-level latent variables, built from the pre-trained ImageNet deep ConvNets, that are informative across scene and fine-grained recognition tasks. The resulting architecture can itself be seen as a feed-forward model that combines deep ConvNets with a two-layer structured NMF. We demonstrate state-of-the-art image clustering performance on challenging scene (MIT-67) and fine-grained (Birds-200, Flowers-102) benchmarks. The consistently superior performance of our GSNMF-CNN shows that it is more generic for novel tasks/categories than the deep ConvNet activations alone.
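To make the factorization step concrete, below is a minimal NumPy sketch of a group-sparse NMF applied to non-negative ConvNet activations. It is not the paper's GSNMF implementation: the choice of an l2,1 penalty over latent components, the weight lam, the multiplicative update rule, and the placeholder feature matrix are all illustrative assumptions.

import numpy as np

def group_sparse_nmf(V, n_components=64, lam=0.1, n_iter=200, eps=1e-9, seed=0):
    # Factorize V (features x samples, non-negative) as W @ H, with an
    # l2,1 penalty on the rows of H so that whole latent components
    # (groups) can switch off for a given target task.
    rng = np.random.default_rng(seed)
    n_features, n_samples = V.shape
    W = rng.random((n_features, n_components)) + eps
    H = rng.random((n_components, n_samples)) + eps
    for _ in range(n_iter):
        # Standard multiplicative update for the basis W (Frobenius loss).
        W *= (V @ H.T) / (W @ (H @ H.T) + eps)
        # Iteratively re-weighted multiplicative update for H; the extra
        # lam * H / row_norms term comes from the gradient of the l2,1 penalty.
        row_norms = np.linalg.norm(H, axis=1, keepdims=True) + eps
        H *= (W.T @ V) / (W.T @ W @ H + lam * H / row_norms + eps)
    return W, H

# Usage sketch: V would hold non-negative ConvNet activations (e.g. ReLU
# outputs of a pre-trained ImageNet model), one column per image. The
# random matrix below is only a stand-in for real features.
V = np.random.default_rng(1).random((4096, 500))
W, H = group_sparse_nmf(V, n_components=64, lam=0.1)
print(W.shape, H.shape)   # (4096, 64) (64, 500)

The l2,1 term encourages rows of H (one per latent component) to shrink to zero as a group, which is one common way to obtain the kind of task-selective latent factors the abstract describes.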
Similar references
Image Classification via Sparse Representation and Subspace Alignment
Image representation is a crucial problem in image processing, where there exist many low-level image representations, e.g., SIFT, HOG, and so on. But there is a missing link between low-level and high-level semantic representations. In fact, traditional machine learning approaches, e.g., non-negative matrix factorization, sparse representation, and principal component analysis, are employed to d...
Analyzing Learned Convnet Features with Dirichlet Process Gaussian Mixture Models
Convolutional Neural Networks (Convnets) have achieved good results in a range of computer vision tasks in recent years. Though given a lot of attention, visualizing the learned representations to interpret Convnets still remains a challenging task. The high dimensionality of internal representations and the high abstraction of deep layers are the main challenges when visualizing Convnet fun...
A Deep Non-Negative Matrix Factorization Neural Network
Recently, deep neural network algorithms have emerged as one of the most successful machine learning strategies, obtaining state-of-the-art results for speech recognition, computer vision, and classification of large data sets. Their success is due to advances in computing power, the availability of massive amounts of data, and the development of new computational techniques. Some of the drawback...
On Deep Representation Learning from Noisy Web Images
The ever-growing body of Web images may be the next important data source for scaling up deep neural networks, which recently achieved great success in the ImageNet classification challenge and related tasks. This prospect, however, has not been validated on convolutional networks (convnets), one of the best-performing deep models, because of their supervised regime. While unsupervised alternati...
Sparse Matrix Factorization
We investigate the problem of factoring a matrix into several sparse matrices and propose an algorithm for this under randomness and sparsity assumptions. This problem can be viewed as a simplification of the deep learning problem, where finding a factorization corresponds to finding edges in different layers as well as the values of hidden units. We prove that under certain assumptions on a sparse li...
Journal:
Volume / Issue:
Pages:
Publication year: 2015