Search results for: norm l0
Number of results: 46034
We consider the problem of learning the underlying graph of a sparse Ising model with p nodes from n i.i.d. samples. The most recent and best-performing approaches combine an empirical loss (the logistic regression or interaction screening loss) with a regularizer (an L1 penalty or constraint). This results in a convex problem that can be solved separately for each node of the graph. In this work, we leverage the cardinality constr...
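The neighborhood-based approach the abstract describes can be sketched in a few lines: regress each node on all the others with an L1-penalized logistic loss and read the graph off the nonzero weights. The following is a minimal illustration of that idea, not the paper's implementation; the function name, step size, and penalty level are my own choices, and the solver is plain proximal gradient descent (ISTA).

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator: the proximal map of the L1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def l1_logistic_neighborhood(X, node, lam=0.05, step=0.1, iters=500):
    """Estimate the neighborhood of one Ising node by L1-penalized
    logistic regression of X[:, node] on the remaining columns.
    X has entries in {-1, +1}; returns a sparse weight vector."""
    n, p = X.shape
    y = X[:, node]
    A = np.delete(X, node, axis=1)  # predictors: all other nodes
    w = np.zeros(p - 1)
    for _ in range(iters):
        margins = y * (A @ w)
        # gradient of the average logistic loss (1/n) sum log(1 + exp(-margin))
        grad = -(A * (y / (1 + np.exp(margins)))[:, None]).mean(axis=0)
        w = soft_threshold(w - step * grad, step * lam)  # ISTA update
    return w
```

Running this for every node and keeping the union (or intersection) of the estimated neighborhoods yields the graph estimate; the L1 threshold is what forces irrelevant edges exactly to zero.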
Reducing the radiation dose in computerized tomography is today a major concern in radiology. Low-dose computerized tomography (LDCT) offers a sound way to deal with this problem. However, more severe noise is observed in the reconstructed CT images under low-dose scan protocols (e.g. lowered tube current or voltage values). In this paper we propose a Gamma-regularization-based algorithm for LDCT im...
This paper provides a sparse learning algorithm for Support Vector Classification (SVC), called Sparse Support Vector Classification (SSVC), which leads to sparse solutions by automatically setting the irrelevant parameters exactly to zero. SSVC adopts the L0-norm regularization term and is trained by an iteratively reweighted learning algorithm. We show that the proposed novel approach contain...
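Iteratively reweighted schemes of the kind the SSVC abstract mentions are typically built on one idea: replace the non-convex L0 penalty with a quadratic one whose weights are recomputed from the current iterate, so that small coefficients are driven to (numerically) exact zero. The sketch below is a generic illustration of that mechanism, not the SSVC training algorithm; `irls_sparse`, `lam`, and `eps` are assumed names and values.

```python
import numpy as np

def irls_sparse(A, b, lam=0.1, eps=1e-3, iters=50):
    """Iteratively reweighted least squares approximating an L0 penalty:
    each pass solves a weighted ridge problem with weights lam/(x_i^2 + eps),
    so small coefficients receive a huge penalty and collapse to zero."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    for _ in range(iters):
        W = np.diag(lam / (x**2 + eps))
        x = np.linalg.solve(A.T @ A + W, A.T @ b)
    return x
```

Note the role of `eps`: it keeps the weights finite, and as it shrinks the surrogate penalty sum x_i^2/(x_i^2 + eps) approaches the L0 count of nonzeros.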
In this paper, novel space-time adaptive processing algorithms based on sparse recovery (SR-STAP) that utilize a weighted l1-norm penalty are proposed to further enforce sparsity and approximate the original l0-norm. Because the amplitudes of the clutter components from different snapshots are random variables, we design the corresponding weights according to two different ways, i.e., the Cap...
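The weighted l1 idea — choosing weights so the penalty behaves more like the l0 norm — can be illustrated with the classic reweighting rule w_i = 1/(|x_i| + ε) (Candès–Wakin style) wrapped around a plain ISTA solver. This is a generic sparse-recovery sketch under my own parameter choices, not the SR-STAP weight designs of the abstract.

```python
import numpy as np

def reweighted_l1(A, b, lam=0.05, eps=1e-2, outer=5, inner=300):
    """Reweighted l1 minimization: repeatedly solve a weighted-l1
    least-squares problem by ISTA, then update the weights
    w_i = 1/(|x_i| + eps) so the penalty better mimics the l0 norm."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L for the quadratic part
    x = np.zeros(A.shape[1])
    w = np.ones(A.shape[1])
    for _ in range(outer):
        for _ in range(inner):
            z = x - step * (A.T @ (A @ x - b))
            x = np.sign(z) * np.maximum(np.abs(z) - step * lam * w, 0.0)
        w = 1.0 / (np.abs(x) + eps)  # large weight on small coefficients
    return x
```

The first outer pass is ordinary l1 minimization; subsequent passes penalize small coefficients much more heavily than large ones, which both cleans up the support and reduces the shrinkage bias on the true nonzeros.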
Our notation is standard ([1], [3], [4], [9]). Throughout this note ∆ will denote the Cantor space {−1, 1}^ℕ, Σ_n the σ-algebra of subsets of ∆ generated by the n-cylinders of ∆ for each n ∈ ℕ, and ν the Borel probability measure ⊗_{i=1}^∞ ν_i on Σ, where ν_i : 2^{−1,1} → [0, 1] is defined by ν_i(∅) = 0, ν_i({−1}) = ν_i({1}) = 1/2 and ν_i({−1, 1}) = 1 for each i ∈ ℕ. In what follows X will be a real Banach space and L...
Dictionaries are collections of vectors used for representations of random vectors in Euclidean spaces. Recent research on optimal dictionaries is focused on constructing dictionaries that offer sparse representations, i.e., l0-optimal representations. Here we consider the problem of finding optimal dictionaries with which representations of samples of a random vector are optimal in an l2-sense...
Underwater acoustic channels are recognized as one of the most difficult propagation media due to considerable impairments such as multipath, ambient noise, and time-frequency selective fading. Exploiting the sparsity of underwater acoustic channels provides a potential way to improve the performance of underwater acoustic channel estimation. Compared with the classic l...
Rank minimization problems, which consist of finding a matrix of minimum rank subject to linear constraints, have been proposed in many areas of engineering and science. A specific problem is the matrix completion problem in which a low rank data matrix is recovered from incomplete samples of its entries by solving a rank penalized least squares problem. The rank penalty is in fact the l0 norm ...
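The standard convex relaxation replaces the rank — the l0 norm of the singular-value vector — with the nuclear norm, its l1 counterpart, whose proximal operator is singular-value soft-thresholding. Below is a minimal sketch of that operator together with a naive impute-and-shrink completion loop; it illustrates the relaxation only and is not the algorithm of the cited abstract (names and parameters are assumptions).

```python
import numpy as np

def svt(M, tau):
    """Singular-value soft-thresholding: the proximal operator of the
    nuclear norm, which shrinks every singular value by tau."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def soft_impute(M_obs, mask, tau=0.1, iters=300):
    """Naive matrix completion: alternate between shrinking singular
    values and re-imposing the observed entries (mask == True)."""
    X = np.where(mask, M_obs, 0.0)
    for _ in range(iters):
        X = svt(X, tau)
        X = np.where(mask, M_obs, X)  # keep observed entries fixed
    return X
```

Just as soft-thresholding a vector zeroes its small entries, `svt` zeroes small singular values, so the iteration is pulled toward low-rank matrices that agree with the observed data.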
The low-rank matrix approximation problem with respect to the component-wise l1-norm (l1LRA), which is closely related to robust principal component analysis (PCA), has become a very popular tool in data mining and machine learning. Robust PCA aims at recovering a low-rank matrix that was perturbed with sparse noise, with applications for example in foreground-background video separation. Altho...
Gravity data have frequently been used in researching the subsurface to map the 3D geometry of density structures, which is considered a basis for further interpretations, such as estimating the exploration potential in mineral exploration. Gravity inversion, as practically employed, can be achieved by different methods; the method based on Tikhonov regularization is the most common among them. Usually, the discretized i...
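Tikhonov regularization stabilizes such ill-posed linear inversions by penalizing the model norm: minimize ||Gm − d||² + α²||m||², which has a closed-form normal-equations solution. The sketch below illustrates the regularization itself, not the gravity-inversion code of the abstract; `tikhonov_invert` and the choice of α are my own.

```python
import numpy as np

def tikhonov_invert(G, d, alpha):
    """Tikhonov-regularized solution of G m = d:
    argmin ||G m - d||^2 + alpha^2 ||m||^2 = (G^T G + alpha^2 I)^{-1} G^T d."""
    p = G.shape[1]
    return np.linalg.solve(G.T @ G + alpha**2 * np.eye(p), G.T @ d)
```

On a severely ill-conditioned operator, even tiny data noise destroys the unregularized least-squares solution, while the Tikhonov solution stays stable; α trades smoothing bias against noise amplification.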