Search results for: sparse code shrinkage enhancement method

Number of results: 1,924,896

2016
Guojun Qin, Jingfang Wang

Compressed sensing (CS) is a sampling method based on the sparsity of a signal, and it can effectively extract the signal contained in the message. In this study, a new noisy-speech enhancement method is proposed based on the CS process. The algorithm exploits the sparsity of speech in the discrete fast Fourier transform (FFT) domain, and the observation matrix is designed in c...
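
The abstract is cut off, but the core idea of exploiting spectral sparsity for denoising can be illustrated with a minimal sketch (not the authors' CS algorithm): take a noisy frame into the FFT domain, shrink small-magnitude coefficients, and transform back. The frame length and the threshold lam below are illustrative assumptions.

import numpy as np

def fft_shrink_denoise(frame, lam):
    # Minimal illustration of sparsity-based enhancement (not the paper's method):
    # soft-shrink the FFT magnitudes of one speech frame, keep the phase.
    spec = np.fft.rfft(frame)
    mag, phase = np.abs(spec), np.angle(spec)
    mag = np.maximum(mag - lam, 0.0)              # small (noise) magnitudes -> 0
    return np.fft.irfft(mag * np.exp(1j * phase), n=len(frame))

# toy usage: a noisy sinusoid standing in for a voiced frame
t = np.linspace(0.0, 1.0, 512, endpoint=False)
noisy = np.sin(2 * np.pi * 50 * t) + 0.3 * np.random.randn(512)
clean_est = fft_shrink_denoise(noisy, lam=10.0)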

2006
Luca Bergamaschi, Marco Caliari, Angeles Martinez, Marco Vianello

We have implemented a numerical code (ReLPM, Real Leja Points Method) for polynomial interpolation of the matrix exponential propagators exp(∆tA)v and φ(∆tA)v, where φ(z) = (exp(z) − 1)/z. The ReLPM code is tested and compared with Krylov-based routines, on large-scale sparse matrices arising from the spatial discretization of 2D and 3D advection-diffusion equations.
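
For readers without a dedicated propagator code, φ(∆tA)v can be evaluated via the standard augmented-matrix identity exp([[M, v], [0, 0]]) = [[exp(M), φ(M)v], [0, 1]]; the sketch below applies it to a small dense matrix purely for illustration and is not the ReLPM implementation.

import numpy as np
from scipy.linalg import expm

def phi_times_v(M, v):
    # phi(M) v with phi(z) = (exp(z) - 1)/z, via the augmented-matrix identity
    # (illustrative only; ReLPM uses Leja-point polynomial interpolation instead).
    n = M.shape[0]
    B = np.zeros((n + 1, n + 1))
    B[:n, :n] = M
    B[:n, n] = v
    return expm(B)[:n, n]

# toy 1D diffusion operator standing in for the discretized PDE
A = np.diag(-2.0 * np.ones(5)) + np.diag(np.ones(4), 1) + np.diag(np.ones(4), -1)
dt, v = 0.1, np.ones(5)
w_phi = phi_times_v(dt * A, v)   # ~ phi(dt*A) v
w_exp = expm(dt * A) @ v         # ~ exp(dt*A) v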

Journal: CoRR 2014
Rodrigo C. de Lamare, Raimundo Sampaio Neto

This letter proposes a novel sparsity-aware adaptive filtering scheme and algorithms based on an alternating optimization strategy with shrinkage. The proposed scheme employs a two-stage structure that consists of an alternating optimization of a diagonally-structured matrix that speeds up the convergence and an adaptive filter with a shrinkage function that forces the coefficients with small m...
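
The shrinkage function mentioned here is typically a soft-threshold applied to the filter weights; the sketch below inserts such a shrinkage step into a plain NLMS update as a generic illustration (it is not the two-stage alternating-optimization scheme of the letter), with mu, lam and eps as assumed parameters.

import numpy as np

def soft_shrink(w, lam):
    # pull coefficients with small magnitude to exactly zero
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

def sparse_nlms_step(w, x, d, mu=0.5, lam=1e-3, eps=1e-8):
    # one normalized-LMS update followed by shrinkage (generic illustration only)
    e = d - w @ x                          # a-priori estimation error
    w = w + mu * e * x / (x @ x + eps)     # NLMS coefficient update
    return soft_shrink(w, lam)             # force small coefficients to zero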

Journal: Journal of Machine Learning Research 2017
Yuting Ma, Tian Zheng

Stochastic gradient descent (SGD) is commonly used for optimization in large-scale machine learning problems. Langford et al. (2009) introduce a sparse online learning method to induce sparsity via truncated gradient. With high-dimensional sparse data, however, this method suffers from slow convergence and high variance due to heterogeneity in feature sparsity. To mitigate this issue, we introd...
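
For reference, the truncated-gradient operator of Langford et al. (2009) only shrinks coefficients whose magnitude is below a threshold theta, by an amount alpha, after a gradient step; a minimal sketch (parameter names are illustrative):

import numpy as np

def truncate(w, alpha, theta):
    # Truncated-gradient operator of Langford et al. (2009): coefficients with
    # |w_j| <= theta are shrunk toward zero by alpha, larger ones are untouched.
    shrunk = np.sign(w) * np.maximum(np.abs(w) - alpha, 0.0)
    return np.where(np.abs(w) <= theta, shrunk, w)

# typically applied every K-th SGD step, e.g.
# w = truncate(w - eta * grad, alpha=K * eta * g, theta=theta)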

Journal: CoRR 2018
Benjamin Cowen, Apoorva Nandini Saridena, Anna Choromanska

We propose an efficient sparse coding (SC) framework for obtaining sparse representation of data. The proposed framework is very general and applies to both the single dictionary setting, where each data point is represented as a sparse combination of the columns of one dictionary matrix, as well as the multiple dictionary setting as given in morphological component analysis (MCA), where the go...
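
In the single-dictionary setting described here, the sparse code z of a data point x over a dictionary D is commonly obtained by solving min_z 0.5*||x − Dz||^2 + lam*||z||_1; the ISTA sketch below is a generic solver for that problem, not the framework proposed in the paper.

import numpy as np

def ista_sparse_code(D, x, lam=0.1, n_iter=200):
    # generic ISTA for min_z 0.5*||x - D z||^2 + lam*||z||_1
    L = np.linalg.norm(D, 2) ** 2              # Lipschitz constant of the gradient
    z = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ z - x)               # gradient of the smooth term
        z = z - grad / L                       # gradient step
        z = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft-threshold
    return z

# toy usage: code a random point against a random 20x50 dictionary
D = np.random.randn(20, 50)
x = np.random.randn(20)
z = ista_sparse_code(D, x)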

2012
Z. C. Yang, Z. Liu, X. Li, L. Nie

In the field of space-time adaptive processing (STAP), sparse recovery type STAP (SR-STAP) algorithms formulate the clutter estimation problem in terms of a sparse representation of a small number of clutter positions among a much larger number of potential positions in the angle-Doppler plane, and provide an effective approach to suppress the clutter, especially in very short snapshot...

2016
Jyotishka Datta, David B. Dunson

There is growing interest in analysing high-dimensional count data, which often exhibit quasi-sparsity corresponding to an overabundance of zeros and small nonzero counts. Existing methods for analysing multivariate count data via Poisson or negative binomial log-linear hierarchical models with zero-inflation cannot flexibly adapt to quasi-sparse settings. We develop a new class of continuous l...

2017
Olivier Ledoit, Michael Wolf

This paper introduces a nonlinear shrinkage estimator of the covariance matrix that does not require recovering the population eigenvalues first. We estimate the sample spectral density and its Hilbert transform directly by smoothing the sample eigenvalues with a variable-bandwidth kernel. Relative to numerically inverting the so-called QuEST function, the main advantages of direct kernel estim...
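
The key ingredient described here is a variable-bandwidth kernel estimate of the sample spectral density built from the sample eigenvalues; the sketch below shows such an estimate with an Epanechnikov kernel and bandwidths proportional to each eigenvalue, as a rough illustration only (the constants are assumptions, and the full nonlinear-shrinkage estimator also requires the Hilbert transform).

import numpy as np

def spectral_density_estimate(eigvals, grid, h=0.5):
    # Variable-bandwidth Epanechnikov kernel estimate of the sample spectral
    # density (rough sketch only; not the complete Ledoit-Wolf estimator).
    lam = np.asarray(eigvals, dtype=float)[:, None]   # eigenvalues as a column
    bw = h * lam                                      # bandwidth ~ eigenvalue
    u = (grid[None, :] - lam) / bw                    # standardized distances
    k = 0.75 * np.maximum(1.0 - u ** 2, 0.0)          # Epanechnikov kernel
    return np.mean(k / bw, axis=0)

# usage on the eigenvalues of a sample covariance matrix (n=200, p=50)
X = np.random.randn(200, 50)
S = X.T @ X / X.shape[0]
ev = np.linalg.eigvalsh(S)
grid = np.linspace(0.5 * ev.min(), 1.5 * ev.max(), 400)
density = spectral_density_estimate(ev, grid)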

Journal: IEICE Transactions 2017
Katsuyuki Hagiwara

Soft-thresholding is a sparse modeling method typically applied to wavelet denoising in statistical signal processing. It is also important in machine learning since it is an essential component of the well-known LASSO (Least Absolute Shrinkage and Selection Operator). It is known that soft-thresholding, and thus LASSO, suffers from a dilemma between sparsity and generalization. This is cau...
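
For reference, the sparsity/generalization dilemma comes from the fact that soft-thresholding zeroes out small coefficients but also biases every surviving coefficient toward zero by the threshold amount; the sketch below contrasts it with hard thresholding to make that bias visible (a generic illustration, not the paper's analysis).

import numpy as np

def soft_threshold(w, lam):
    # small coefficients become exactly zero (sparsity), but every surviving
    # coefficient is shrunk by lam (the bias that can hurt generalization)
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

def hard_threshold(w, lam):
    # survivors are left unbiased, at the cost of a discontinuous selection rule
    return np.where(np.abs(w) > lam, w, 0.0)

w = np.array([0.05, -0.2, 3.0])
print(soft_threshold(w, 0.5))   # [ 0. -0.  2.5]  (3.0 shrunk to 2.5)
print(hard_threshold(w, 0.5))   # [0. 0. 3.]      (3.0 kept as is)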

Journal: Current Biology 2014
Peter Kloppenburg, Martin Paul Nawrot

To code information efficiently, sensory systems use sparse representations. In a sparse code, a specific stimulus activates only a few spikes in a small number of neurons. A new study shows that the temporal pattern across sparsely activated neurons encodes information, suggesting that the sparse code extends into the time domain.

[Chart: number of search results per year]
