Search results for: reproducing kernel hilbert space method
Number of results: 2079705
The Wiener series is one of the standard methods to systematically characterize the nonlinearity of a neural system. The classical estimation method of the expansion coefficients via cross-correlation suffers from severe problems that prevent its application to high-dimensional and strongly nonlinear systems. We propose a new estimation method based on regression in a reproducing kernel Hilbert...
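A minimal sketch of regression in a reproducing kernel Hilbert space, assuming a Gaussian kernel and a kernel ridge (representer-theorem) estimator rather than the Wiener-series estimator described above; the function names, bandwidth sigma and regularizer lam are illustrative choices, not taken from the paper.

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel between rows of A and B.
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-sq / (2 * sigma**2))

def fit_kernel_ridge(X, y, sigma=1.0, lam=1e-2):
    # Representer theorem: the RKHS solution is f(x) = sum_i alpha_i k(x, x_i).
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict(X_train, alpha, X_new, sigma=1.0):
    return gaussian_kernel(X_new, X_train, sigma) @ alpha

# Toy usage: a strongly nonlinear 1-D system.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (200, 1))
y = np.sin(X[:, 0]) * X[:, 0] + 0.1 * rng.standard_normal(200)
alpha = fit_kernel_ridge(X, y, sigma=0.7, lam=1e-3)
print(predict(X, alpha, np.array([[0.5]]), sigma=0.7))
```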
We focus on covariance criteria for finding a suitable subspace for regression in a reproducing kernel Hilbert space: kernel principal component analysis, kernel partial least squares and kernel canonical correlation analysis, and we demonstrate how this fits within a more general context of subspace regression. For the kernel partial least squares case some variants are considered and the meth...
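As a hedged illustration of one of the subspace criteria listed above, the sketch below implements kernel principal component analysis via the centered Gram matrix; the Gaussian kernel, bandwidth and component count are assumptions for the example, not choices made by the paper.

```python
import numpy as np

def rbf_gram(X, sigma=1.0):
    sq = np.sum(X**2, 1)[:, None] + np.sum(X**2, 1)[None, :] - 2 * X @ X.T
    return np.exp(-sq / (2 * sigma**2))

def kernel_pca(X, n_components=2, sigma=1.0):
    # Center the Gram matrix in feature space, then take the leading eigenvectors.
    n = len(X)
    K = rbf_gram(X, sigma)
    H = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    Kc = H @ K @ H
    eigvals, eigvecs = np.linalg.eigh(Kc)
    idx = np.argsort(eigvals)[::-1][:n_components]
    # Scale eigenvectors so the feature-space components have unit norm.
    alphas = eigvecs[:, idx] / np.sqrt(np.maximum(eigvals[idx], 1e-12))
    return Kc @ alphas                            # projections of the training points

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 5))
Z = kernel_pca(X, n_components=2, sigma=2.0)
print(Z.shape)  # (100, 2)
```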
Let F be a separable Banach space, and let (X, Y ) be a random pair taking values in F×R. Motivated by a broad range of potential applications, we investigate rates of convergence of the k-nearest neighbor estimate rn(x) of the regression function r(x) = E[Y |X = x], based on n independent copies of the pair (X, Y ). Using compact embedding theory, we present explicit and general finite sample ...
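A minimal sketch of the k-nearest neighbor regression estimate rn(x), assuming finite-dimensional inputs with the Euclidean norm standing in for the Banach-space norm of the paper; k and the toy data are illustrative.

```python
import numpy as np

def knn_regress(X, Y, x, k=5):
    # r_n(x): average the responses of the k training points closest to x.
    d = np.linalg.norm(X - x, axis=1)
    nearest = np.argsort(d)[:k]
    return Y[nearest].mean()

rng = np.random.default_rng(2)
X = rng.standard_normal((500, 3))
Y = X[:, 0] ** 2 + 0.1 * rng.standard_normal(500)
print(knn_regress(X, Y, np.zeros(3), k=10))
```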
Early stopping is a form of regularization based on choosing when to stop running an iterative algorithm. Focusing on non-parametric regression in a reproducing kernel Hilbert space, we analyze the early stopping strategy for a form of gradient-descent applied to the least-squares loss function. We propose a data-dependent stopping rule that does not involve hold-out or cross-validation data, a...
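A hedged sketch of gradient descent on the least-squares loss over RKHS coefficients, stopped after a fixed number of iterations; the step size and the fixed stopping index are placeholders, not the data-dependent stopping rule proposed in the paper.

```python
import numpy as np

def rbf_gram(X, sigma=1.0):
    sq = np.sum(X**2, 1)[:, None] + np.sum(X**2, 1)[None, :] - 2 * X @ X.T
    return np.exp(-sq / (2 * sigma**2))

def gd_early_stop(K, y, n_iter=50):
    # Gradient descent on the empirical least-squares loss over RKHS coefficients:
    # f(.) = sum_i alpha_i k(x_i, .), so the functional gradient step becomes
    # alpha <- alpha - (eta / n) * (K @ alpha - y).
    n = len(y)
    eta = n / np.linalg.eigvalsh(K).max()   # step size below the stability limit
    alpha = np.zeros(n)
    for _ in range(n_iter):                 # the stopping index acts as the regularizer
        alpha -= (eta / n) * (K @ alpha - y)
    return alpha

rng = np.random.default_rng(3)
X = rng.uniform(-2, 2, (150, 1))
y = np.cos(2 * X[:, 0]) + 0.1 * rng.standard_normal(150)
K = rbf_gram(X, sigma=0.5)
alpha = gd_early_stop(K, y, n_iter=30)
print(np.mean((K @ alpha - y) ** 2))        # training error after early stopping
```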
The kernel function, which allows the formulation of nonlinear variants of any algorithm that can be cast in terms of dot products, has enabled Support Vector Machines (SVMs) to be applied successfully in many fields, e.g. classification and regression. The importance of the kernel has motivated many studies on its composition. It is well known that a reproducing kernel (R.K.) is a useful kernel function ...
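As a hedged illustration of kernel composition, the snippet below builds a new kernel as the sum of a Gaussian and a polynomial kernel (both standard closure operations for reproducing kernels) and numerically checks that the resulting Gram matrix is positive semi-definite on a random sample; all names and parameters are illustrative.

```python
import numpy as np

def rbf(A, B, sigma=1.0):
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-sq / (2 * sigma**2))

def poly(A, B, degree=2, c=1.0):
    return (A @ B.T + c) ** degree

def composed(A, B):
    # Closure properties: sums (and pointwise products) of kernels are again kernels.
    return rbf(A, B, sigma=1.5) + 0.5 * poly(A, B, degree=2)

rng = np.random.default_rng(4)
X = rng.standard_normal((50, 4))
K = composed(X, X)
eigvals = np.linalg.eigvalsh((K + K.T) / 2)
print(eigvals.min() >= -1e-8)   # composed Gram matrix is positive semi-definite
```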
P(α) = α²F(x, x) + 2αF(x, y) + F(y, y), which is ≥ 0. In the case F(x, x) = 0, the fact that P ≥ 0 implies that F(x, y) = 0. In the case F(x, x) ≠ 0, P(α) is a quadratic polynomial and, because P ≥ 0, its discriminant is ≤ 0: 4F(x, y)² − 4 · F(x, x) · F(y, y) ≤ 0. That is, F(x, y)² ≤ F(x, x)F(y, y), and this implies that F ...
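The argument above is the usual discriminant proof of a Cauchy-Schwarz-type inequality for a positive definite kernel F; as a small sanity check, the sketch below verifies F(x, y)² ≤ F(x, x)F(y, y) numerically, assuming a polynomial kernel as a stand-in for F.

```python
import numpy as np

def F(u, v, c=1.0, degree=3):
    # A positive definite (polynomial) kernel standing in for F in the argument above.
    return (u @ v + c) ** degree

rng = np.random.default_rng(5)
for _ in range(1000):
    x, y = rng.standard_normal(4), rng.standard_normal(4)
    # Discriminant bound: F(x, y)^2 <= F(x, x) * F(y, y).
    assert F(x, y) ** 2 <= F(x, x) * F(y, y) + 1e-9
print("F(x, y)^2 <= F(x, x) F(y, y) held on all sampled pairs")
```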
We study a smoothness regularization method for functional linear regression and provide a unified treatment for both the prediction and estimation problems. By developing a tool on simultaneous diagonalization of two positive definite kernels, we obtain sharper results on the minimax rates of convergence and show that smoothness regularized estimators achieve the optimal rates of convergence fo...
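A hedged sketch of functional linear regression with a smoothness penalty, assuming a discretized slope function and a second-difference roughness penalty rather than the paper's RKHS formulation; the grid size, noise level and regularization weight are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)
m, n = 100, 200
t = np.linspace(0, 1, m)
dt = t[1] - t[0]

# Functional covariates X_i(t): rough random curves observed on the grid.
X = np.cumsum(rng.standard_normal((n, m)), axis=1) * np.sqrt(dt)
beta_true = np.sin(2 * np.pi * t)
y = X @ beta_true * dt + 0.05 * rng.standard_normal(n)

# Second-difference roughness penalty approximating lam * int beta''(t)^2 dt.
D = np.diff(np.eye(m), 2, axis=0) / dt**2
P = D.T @ D * dt

lam = 1e-8
A = X * dt                                    # quadrature-weighted design matrix
beta_hat = np.linalg.solve(A.T @ A + lam * P, A.T @ y)
print(np.mean((beta_hat - beta_true) ** 2))   # estimation error of the slope function
```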
Reproducing kernel Hilbert space (RKHS) methods provide a unified context for solving a wide variety of statistical modelling and function estimation problems. We consider two such problems: We are given a training set {yi, ti, i = 1, …, n}, where yi is the response for the ith subject, and ti is a vector of attributes for this subject. The value of yi is a label that indicates which ...
A kernel method for realizing Bayes’ rule is proposed, based on representations of probabilities in reproducing kernel Hilbert spaces. Probabilities are uniquely characterized by the mean of the canonical map to the RKHS. The prior and conditional probabilities are expressed in terms of RKHS functions of an empirical sample: no explicit parametric model is needed for these quantities. The poste...
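Not the full kernel Bayes' rule of this abstract, but a sketch of the conditional mean embedding that such constructions build on: the weights (K_X + nλI)⁻¹ k_X(x) represent the conditional distribution of Y given X = x in the RKHS, so conditional expectations are approximated by weighted sums over the sample; the kernel, bandwidth and λ are illustrative assumptions.

```python
import numpy as np

def rbf_gram(A, B, sigma=1.0):
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-sq / (2 * sigma**2))

rng = np.random.default_rng(7)
n = 300
X = rng.uniform(-2, 2, (n, 1))
Y = X[:, 0] ** 2 + 0.1 * rng.standard_normal(n)

lam, sigma = 1e-3, 0.5
Kx = rbf_gram(X, X, sigma)
# Conditional mean embedding weights at the query point x = 1.
W = np.linalg.solve(Kx + n * lam * np.eye(n), rbf_gram(X, np.array([[1.0]]), sigma))

# Expectations of functions of Y are approximated by weighted sums over the sample.
print(float(W[:, 0] @ Y))   # approx E[Y | X = 1] = 1
```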
A Hilbert space embedding for probability measures has recently been proposed, with applications including dimensionality reduction, homogeneity testing and independence testing. This embedding represents any probability measure as a mean element in a reproducing kernel Hilbert space (RKHS). The embedding function has been proven to be injective when the reproducing kernel is universal. In this...
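A minimal sketch of the mean-element embedding used for homogeneity testing: the (biased) maximum mean discrepancy between two samples is the RKHS distance between their empirical mean embeddings; the Gaussian kernel and bandwidth are assumptions for the example.

```python
import numpy as np

def rbf_gram(A, B, sigma=1.0):
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-sq / (2 * sigma**2))

def mmd2(X, Y, sigma=1.0):
    # Squared distance between the RKHS mean embeddings of the two samples:
    # ||mu_P - mu_Q||^2 = E k(x,x') - 2 E k(x,y) + E k(y,y')  (biased estimate).
    return (rbf_gram(X, X, sigma).mean()
            - 2 * rbf_gram(X, Y, sigma).mean()
            + rbf_gram(Y, Y, sigma).mean())

rng = np.random.default_rng(8)
P1 = rng.standard_normal((200, 2))
P2 = rng.standard_normal((200, 2))
Q = rng.standard_normal((200, 2)) + 0.5          # shifted distribution
print(mmd2(P1, P2, sigma=1.0))                   # near zero: same distribution
print(mmd2(P1, Q, sigma=1.0))                    # clearly positive: different
```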