Efficient Algorithms for Convolutional Sparse Representations
Authors
Abstract
Similar references
Dense Mapping for Range Sensors: Efficient Algorithms and Sparse Representations
This paper focuses on efficient occupancy grid building based on wavelet occupancy grids, a new sparse grid representation, and on a new update algorithm for range sensors. The update algorithm takes advantage of the natural multiscale properties of the wavelet expansion to update only the parts of the environment that are modified by the sensor measurements, and at the proper scale. The sparse wave...
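For orientation, a minimal sketch of the plain log-odds occupancy grid update that such methods build on; this is a generic illustration, not the paper's wavelet representation, and the grid size and inverse sensor model values are arbitrary assumptions.

import numpy as np

GRID = np.zeros((100, 100))          # log-odds map, 0 = unknown
L_OCC, L_FREE = 0.85, -0.4           # assumed inverse sensor model increments

def update_ray(grid, cells_on_ray, hit):
    """Update every cell traversed by one range measurement."""
    for (i, j) in cells_on_ray[:-1]:
        grid[i, j] += L_FREE         # cells before the endpoint are likely free
    if hit:
        i, j = cells_on_ray[-1]
        grid[i, j] += L_OCC          # endpoint cell is likely occupied

# Example: a horizontal ray from (50, 10) that hits an obstacle at (50, 30).
ray = [(50, c) for c in range(10, 31)]
update_ray(GRID, ray, hit=True)
occupancy_prob = 1.0 - 1.0 / (1.0 + np.exp(GRID))   # convert log-odds back to probability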
Convolutional Sparse Representations with Gradient Penalties
While convolutional sparse representations enjoy a number of useful properties, they have received limited attention for image reconstruction problems. The present paper compares the performance of block-based and convolutional sparse representations in the removal of Gaussian white noise. While the usual formulation of the convolutional sparse coding problem is slightly inferior to the block-b...
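As a reference point for what is being compared, a small sketch of the convolutional sparse coding (CBPDN) objective, (1/2)||Σ_m d_m * x_m − s||² + λ Σ_m ||x_m||₁, evaluated with NumPy/SciPy; the filter count, sizes, and data below are placeholder choices, not values from the paper.

import numpy as np
from scipy.signal import fftconvolve

def csc_objective(D, X, s, lmbda):
    """D: (M, h, w) filters, X: (M, H, W) coefficient maps, s: (H, W) image."""
    recon = sum(fftconvolve(X[m], D[m], mode='same') for m in range(D.shape[0]))
    data_fidelity = 0.5 * np.sum((recon - s) ** 2)
    sparsity = lmbda * np.sum(np.abs(X))
    return data_fidelity + sparsity

rng = np.random.default_rng(0)
D = rng.standard_normal((8, 7, 7))        # 8 random 7x7 filters (placeholder dictionary)
s = rng.standard_normal((64, 64))         # placeholder image
X = np.zeros((8, 64, 64))                 # all-zero coefficient maps
print(csc_objective(D, X, s, lmbda=0.1))  # equals 0.5 * ||s||^2 when X = 0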
Optimization Algorithms for Sparse Representations and Applications
We consider the following sparse representation problem, which is called Sparse Component Analysis: identify the matrices S ∈ ℝ^(n×N) and A ∈ ℝ^(m×n) (m ≤ n < N) uniquely (up to permutation and scaling), knowing only their product X = AS, under some conditions expressed either in terms of A and the sparsity of S (identifiability conditions), or in terms of X (Sparse Component Analysis conditions...
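A toy instance of the model X = AS described above: a mixing matrix A applied to a sparse source matrix S, with only X observed; the dimensions and sparsity level here are arbitrary illustrative choices.

import numpy as np

rng = np.random.default_rng(0)
m, n, N = 4, 6, 200                      # m <= n < N, as in the identifiability setting
A = rng.standard_normal((m, n))          # unknown mixing matrix
S = rng.standard_normal((n, N))
S[rng.random((n, N)) > 0.2] = 0.0        # keep roughly 20% nonzeros: sparse sources
X = A @ S                                # only X is observed by the algorithm

print(f"observed X: {X.shape}, nonzeros in S: {np.count_nonzero(S)}")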
An Efficient Algorithm for Sparse Representations
Basis Pursuit (BP) and Basis Pursuit Denoising (BPDN), well established techniques for computing sparse representations, minimize an ℓ2 data fidelity term, subject to an ℓ1 sparsity constraint or regularization term, by mapping the problem to a linear or quadratic program. BPDN with an ℓ1 data fidelity term has recently been proposed, also implemented via a mapping to a linear program. We introduc...
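The abstract refers to mapping these problems to linear or quadratic programs. A sketch of the classical mapping for Basis Pursuit, min ||x||₁ subject to Ax = b, using the standard split x = u − v with u, v ≥ 0; the problem sizes and data below are arbitrary.

import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, n = 10, 30
A = rng.standard_normal((m, n))
x_true = np.zeros(n); x_true[[2, 11, 25]] = [1.5, -2.0, 0.7]   # sparse ground truth
b = A @ x_true

c = np.ones(2 * n)                        # objective: sum(u) + sum(v) = ||x||_1
A_eq = np.hstack([A, -A])                 # equality constraint: A u - A v = b
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None), method="highs")
x_hat = res.x[:n] - res.x[n:]             # recombine the split variables

print("constraint residual:", np.linalg.norm(A @ x_hat - b))
print("recovered support:", np.flatnonzero(np.abs(x_hat) > 1e-8))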
Efficient Sparse-Winograd Convolutional Neural Networks
Convolutional Neural Networks (CNNs) are compute intensive, which limits their application on mobile devices. Their energy is dominated by the number of multiplies needed to perform the convolutions. Winograd's minimal filtering algorithm (Lavin (2015)) and network pruning (Han et al. (2015)) reduce the operation count. Unfortunately, these two methods cannot be combined, because applying the W...
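A minimal NumPy check of Winograd's F(2,3) minimal filtering algorithm (Lavin 2015), which this line of work builds on: two outputs of a 3-tap filter are computed from a 4-sample input tile with 4 multiplies instead of 6. The transform matrices below are the standard F(2,3) ones; the data values are arbitrary.

import numpy as np

# Standard F(2,3) transform matrices.
BT = np.array([[1, 0, -1, 0],
               [0, 1,  1, 0],
               [0, -1, 1, 0],
               [0, 1,  0, -1]], dtype=float)
G = np.array([[1.0,  0.0, 0.0],
              [0.5,  0.5, 0.5],
              [0.5, -0.5, 0.5],
              [0.0,  0.0, 1.0]])
AT = np.array([[1, 1,  1,  0],
               [0, 1, -1, -1]], dtype=float)

d = np.array([1.0, 2.0, 3.0, 4.0])       # input tile
g = np.array([0.5, -1.0, 2.0])           # 3-tap filter

y_winograd = AT @ ((G @ g) * (BT @ d))   # elementwise product: the 4 multiplies
y_direct = np.array([d[0:3] @ g, d[1:4] @ g])   # direct sliding-window result
print(np.allclose(y_winograd, y_direct))        # True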
Journal
Journal title: IEEE Transactions on Image Processing
Year: 2016
ISSN: 1057-7149, 1941-0042
DOI: 10.1109/tip.2015.2495260