Search results for: kernel trick

Number of results: 52,726

Journal: Journal of Machine Learning Research, 2010
Sayed Kamaledin Ghiasi Shirazi, Reza Safabakhsh, Mostafa Shamsi

Appropriate selection of the kernel function, which implicitly defines the feature space of an algorithm, has a crucial role in the success of kernel methods. In this paper, we consider the problem of optimizing a kernel function over the class of translation invariant kernels for the task of binary classification. The learning capacity of this class is invariant with respect to rotation and sc...
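
The radial basis function (RBF) kernel k(x, y) = exp(-gamma·||x − y||²) depends only on x − y, so it belongs to this translation-invariant class. As a minimal, hedged sketch (not the authors' optimization method), a crude way to pick a kernel from this family is a grid search over the RBF bandwidth with an off-the-shelf SVM on toy data:

```python
# A minimal sketch (not the authors' optimization method): grid-search the
# bandwidth gamma of the translation-invariant RBF kernel on toy data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=10, random_state=0)

search = GridSearchCV(
    SVC(kernel="rbf"),                                   # translation-invariant kernel
    param_grid={"gamma": np.logspace(-3, 1, 9), "C": [0.1, 1.0, 10.0]},
    cv=5,
)
search.fit(X, y)
print("selected gamma:", search.best_params_["gamma"])
```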

2013
Jianwei Zheng, Hong Qiu, Xinli Xu, Wanliang Wang, Qiongfang Huang

Feature extraction is important for many applications in biomedical signal analysis and living-system analysis. A fast discriminative stochastic neighbor embedding analysis (FDSNE) method for feature extraction is proposed in this paper by improving the existing DSNE method. The proposed algorithm adopts an alternative probability distribution model constructed based on its K-nearest neighbors from the in...
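
As a rough, hedged sketch of the KNN-restricted neighbor probabilities used by SNE-style methods (an assumed simplification, not FDSNE itself), the toy code below assigns nonzero probability p(j|i) only to the K nearest neighbors of each point:

```python
# A hedged toy sketch of KNN-restricted neighbor probabilities (an assumed
# simplification of SNE-style models, not FDSNE itself).
import numpy as np
from scipy.spatial.distance import cdist

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))                 # toy data
K = 7                                          # neighborhood size (assumed)

D = cdist(X, X, "sqeuclidean")
np.fill_diagonal(D, np.inf)                    # exclude self-pairs

P = np.zeros_like(D)
for i in range(len(X)):
    nn = np.argsort(D[i])[:K]                  # indices of the K nearest neighbors
    w = np.exp(-D[i, nn] / D[i, nn].mean())    # crude per-point bandwidth
    P[i, nn] = w / w.sum()                     # neighbor probabilities p(j | i)
```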

2016
Dominik Lang, Daniel Kottke, Georg Krempl, Myra Spiliopoulou

Active learning provides a way to annotate huge pools of data efficiently for mining and business analytics: it reduces the instances that an expert has to annotate to the most informative ones. A common approach is to use uncertainty sampling in combination with a support vector machine (SVM). Some papers argue that uncertainty sampling performs b...
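
A minimal sketch of uncertainty sampling with an SVM (toy data and pool sizes are assumptions, not taken from this paper): in each round, query the unlabeled instance closest to the decision boundary, i.e. the one with the smallest absolute decision-function value:

```python
# A minimal uncertainty-sampling loop with an SVM on toy data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, random_state=0)
labeled = np.arange(20)                        # small initially labeled set
unlabeled = np.arange(20, len(X))              # pool awaiting annotation

for _ in range(10):                            # ten query rounds
    clf = SVC(kernel="rbf", gamma="scale").fit(X[labeled], y[labeled])
    margins = np.abs(clf.decision_function(X[unlabeled]))
    query = unlabeled[np.argmin(margins)]      # most uncertain instance
    labeled = np.append(labeled, query)        # simulate the expert's label
    unlabeled = unlabeled[unlabeled != query]
```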

2001
Thomas G. Dietterich, Xin Wang

We present three ways of combining linear programming with the kernel trick to find value function approximations for reinforcement learning. One formulation is based on SVM regression; the second is based on the Bellman equation; and the third seeks only to ensure that good moves have an advantage over bad moves. All formulations attempt to minimize the number of support vectors while fitting ...
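
As a hedged illustration of the SVM-regression flavor only (the sketch below uses a standard epsilon-SVR rather than the paper's linear-programming formulations, and the states and returns are synthetic placeholders), a kernelized regressor can be fit to sampled returns as a value-function approximation:

```python
# A hedged sketch: epsilon-SVR fit to synthetic returns as a value-function
# approximation; not the paper's linear-programming formulations.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
states = rng.uniform(-1, 1, size=(300, 2))                        # sampled states
returns = np.sin(3 * states[:, 0]) + 0.1 * rng.normal(size=300)  # toy returns

value_fn = SVR(kernel="rbf", C=10.0, epsilon=0.05).fit(states, returns)
print("approximate V at the origin:", value_fn.predict([[0.0, 0.0]]))
```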

2013
Xiaopeng Hong, Guoying Zhao, Haoyu Ren, Xilin Chen

This paper accelerates nonlinear weak classifiers in the boosting framework for object detection. Although conventional nonlinear classifiers are usually more powerful than linear ones, few existing methods integrate them into the boosting framework as weak classifiers owing to their high computational cost. To address this problem, this paper proposes a novel nonlinear weak classifier named Pa...
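
For context, a generic, hedged example of boosting nonlinear weak classifiers is AdaBoost over shallow decision trees; this is not the paper's proposed weak classifier, only an illustration of the setting it targets:

```python
# A hedged, generic illustration of boosting nonlinear weak classifiers
# (depth-2 trees here, not the paper's proposed weak classifier).
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
boosted = AdaBoostClassifier(
    DecisionTreeClassifier(max_depth=2),       # cheap nonlinear weak classifier
    n_estimators=200,
).fit(X, y)
```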

2009
Stephen Sullivan

The paper evaluates two kernel-based methods on the problem of predicting precipitation based on observable variables. The support vector machine (SVM) method finds the two parallel hyperplanes that provide maximal separation of two subsets, excepting outliers. The minimax probability machine (MPM) method finds an optimal separating hyperplane that minimizes the probability of misclassification...
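
A minimal sketch of the SVM picture described here, on toy blobs rather than the precipitation task: fit a linear SVM and report the width 2/||w|| of the band between the two parallel hyperplanes w·x + b = +1 and w·x + b = −1:

```python
# A minimal sketch of the maximal-margin hyperplanes on toy data.
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

X, y = make_blobs(n_samples=100, centers=2, random_state=0)
clf = SVC(kernel="linear", C=1.0).fit(X, y)
w = clf.coef_[0]
print("margin width:", 2.0 / np.linalg.norm(w))
```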

2014
Tapio Pahikkala

Supervised learning with pair-input data has recently become one of the most intensively studied topics in pattern recognition literature, and its applications are numerous, including, for example, collaborative filtering, information retrieval, and drug-target interaction prediction. Regularized least-squares (RLS) is a kernel-based learning algorithm that, together with tensor product kernels...
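
A hedged sketch of pair-input RLS with a tensor (Kronecker) product kernel, using assumed toy drug and target descriptors: the pairwise kernel between (d, t) and (d′, t′) is K_drug(d, d′)·K_target(t, t′), and RLS solves (K + λI)a = y for the dual coefficients:

```python
# A hedged sketch of pair-input RLS with a Kronecker product kernel on toy data.
import numpy as np
from scipy.spatial.distance import cdist

rng = np.random.default_rng(0)
drugs = rng.normal(size=(15, 5))               # toy drug descriptors
targets = rng.normal(size=(10, 8))             # toy target descriptors
Y = rng.normal(size=(15, 10))                  # toy interaction scores

rbf = lambda A, B: np.exp(-cdist(A, B, "sqeuclidean"))
K = np.kron(rbf(drugs, drugs), rbf(targets, targets))   # pairwise kernel matrix
lam = 1.0                                                # regularization weight
alpha = np.linalg.solve(K + lam * np.eye(K.shape[0]), Y.ravel())
```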

2014
V. Badrinath

Conditional volatility of stock market returns is one of the major problems in time series analysis. The support vector machine (SVM) has been applied to volatility estimation of stock market data with limited success, the limitation being inaccurate volatility feature prediction due to generic kernel functions. However, since the Principal Component Analysis (PCA) technique yields good characteristi...
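
A hedged sketch of the PCA-plus-SVM idea on synthetic features (an assumed pipeline, not the paper's exact model): compress lagged-return features with PCA, then regress a realized-volatility proxy with a kernelized SVR:

```python
# A hedged sketch of a PCA + kernel SVR pipeline on synthetic features.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
returns = rng.normal(scale=0.01, size=(500, 20))   # toy lagged-return features
realized_vol = returns.std(axis=1)                 # toy volatility target

model = make_pipeline(StandardScaler(), PCA(n_components=5), SVR(kernel="rbf"))
model.fit(returns, realized_vol)
```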

Journal: IEEE Transactions on Neural Networks, 2006
Qingshan Liu, Xiaoou Tang, Hanqing Lu, Songde Ma

There are two fundamental problems with the Fisher linear discriminant analysis for face recognition. One is the singularity problem of the within-class scatter matrix due to small training sample size. The other is that it cannot efficiently describe complex nonlinear variations of face images because of its linear property. In this letter, a kernel scatter-difference-based discriminant analys...
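
A hedged sketch of a scatter-difference criterion in a kernel feature space, on assumed two-class toy data rather than face images (not the authors' exact derivation): maximize aᵀ(B − μW)a over kernel expansion coefficients a, where B and W are between- and within-class scatter matrices built from kernel rows, so no matrix inversion (and hence no singularity problem) is needed:

```python
# A hedged sketch of a kernel scatter-difference criterion on toy data.
import numpy as np
from scipy.spatial.distance import cdist

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (30, 4)), rng.normal(2, 1, (30, 4))])
y = np.array([0] * 30 + [1] * 30)

K = np.exp(-0.5 * cdist(X, X, "sqeuclidean"))      # kernel matrix
m = K.mean(axis=1)                                 # overall kernel mean
B = np.zeros((len(X), len(X)))                     # between-class scatter
W = np.zeros((len(X), len(X)))                     # within-class scatter
for c in (0, 1):
    Kc = K[:, y == c]
    mc = Kc.mean(axis=1)
    B += Kc.shape[1] * np.outer(mc - m, mc - m)
    W += (Kc - mc[:, None]) @ (Kc - mc[:, None]).T
mu = 1.0                                           # scatter-difference weight (assumed)
vals, vecs = np.linalg.eigh(B - mu * W)
alpha = vecs[:, -1]                                # leading discriminant direction
```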

Journal: Neurocomputing, 2005
Songcan Chen, Lei Chen, Zhi-Hua Zhou

The kernel method is an effective and popular trick in machine learning. In this paper, by introducing it into conventional auto-associative memory models (AMs), we construct a unified framework of kernel auto-associative memory models (KAMs), under which the existing exponential and polynomial AMs become special cases. Further, in order to reduce the KAMs' connection complexity, inspired by “small-wor...
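
A hedged sketch of one kernelized auto-associative recall rule (an assumed simplification, not the paper's unified framework): stored bipolar patterns are weighted by an exponential kernel similarity to the probe and summed, then thresholded; a linear kernel in the same place recovers a classical correlation-type AM:

```python
# A hedged, simplified kernel auto-associative recall rule on toy patterns.
import numpy as np

rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(5, 64))       # stored bipolar patterns

def recall(probe, beta=5.0):
    sims = np.exp(beta * patterns @ probe / patterns.shape[1])  # kernel weights
    return np.sign(sims @ patterns)                             # thresholded sum

noisy = patterns[0].copy()
noisy[:8] *= -1                                    # corrupt a few entries
print("pattern recovered:", np.array_equal(recall(noisy), patterns[0]))
```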
