Search results for: kernel trick
Number of results: 52,726
Support Vector Machines (SVMs) represent a powerful learning paradigm able to provide accurate and reliable decision functions in several application fields. In particular, they are very attractive for applications in the medical domain, where a lack of knowledge often exists. The kernel trick, on which SVMs are based, allows non-linearly separable data to be mapped into potentially linearly separable data, ...
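The kernel trick mentioned in this abstract can be illustrated with a minimal, self-contained sketch: for the degree-2 polynomial kernel, evaluating k(x, y) = (x·y)² gives exactly the inner product of an explicit quadratic feature map, without ever computing that map. The feature map `phi` below is the standard one for this kernel; the function names are illustrative only.

```python
import math

def phi(x):
    # Explicit quadratic feature map for 2-D input x = (x1, x2):
    # phi(x) = (x1^2, x2^2, sqrt(2) * x1 * x2)
    return (x[0] ** 2, x[1] ** 2, math.sqrt(2) * x[0] * x[1])

def poly_kernel(x, y):
    # Homogeneous degree-2 polynomial kernel: k(x, y) = (x . y)^2
    return (x[0] * y[0] + x[1] * y[1]) ** 2

x, y = (1.0, 2.0), (3.0, 4.0)
explicit = sum(a * b for a, b in zip(phi(x), phi(y)))  # inner product in feature space
implicit = poly_kernel(x, y)                           # kernel evaluation in input space
assert abs(explicit - implicit) < 1e-9  # both equal (1*3 + 2*4)^2 = 121
```

An SVM exploits this identity to learn a linear separator in the (possibly very high-dimensional) feature space while only ever evaluating the kernel on pairs of input points.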
Hyperspectral Imagery Sensing (HIS) has gained tremendous popularity in many research areas, such as remotely sensed satellite imaging and aerial reconnaissance. HIS is an important technique concerned with the measurement, analysis, and interpretation of spectra acquired from a sensed scene by an airborne or satellite sensor. The development of sensor technology brought the developing of collecting image da...
Unsupervised clustering of image sets of 3D objects has been an active research field within the vision community. It is a challenging task, since the appearance variation of the same object under different illumination conditions is often larger than the appearance variation of different objects under the same illumination conditions. Some previous methods perform the appearance clustering using k-sub...
Locality preserving projection (LPP) aims at finding an embedded subspace that preserves the local structure of data. Though LPP can provide an intrinsically compact representation for image data, it has limitations in image recognition. In this paper, an improved algorithm called kernel scatter-difference based discriminant locality preserving projection (KSDLPP) is proposed. KSDLPP uses the kernel trick...
In recent years there has been growing interest in designing dictionaries for image classification. These methods, however, neglect the fact that data of interest often has non-linear structure. Motivated by the fact that this non-linearity can be handled by the kernel trick, we propose learning of dictionaries in the high-dimensional feature space which are simultaneously reconstructive and di...
In this paper we present an active learning procedure for the two-class supervised classification problem. The utilized methodology exploits the Bayesian modeling and inference paradigm to tackle the problem of kernel-based data classification. This Bayesian methodology is appropriate for both finite and infinite dimensional feature spaces. Parameters are estimated, using the kernel trick, foll...
This paper presents a memory-efficient implementation of the kernel matrix-vector product (sparse convolution) and the way to link it with automatic differentiation libraries such as PyTorch. This piece of software alleviates the major bottleneck of autodiff libraries as far as diffeomorphic shape registration is concerned: memory consumption. As a result, symbolic python code can now scale up ...
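The memory saving this abstract describes comes from never materializing the full N×M kernel matrix: the matrix-vector product Kv is accumulated block by block, so only a small slice of kernel values exists at any time. A minimal pure-Python sketch of that idea follows; the actual library is GPU-based and hooks into PyTorch autodiff, and the function name here is hypothetical.

```python
import math

def kernel_matvec(xs, ys, v, k, chunk=2):
    # Computes (K v)_i = sum_j k(xs[i], ys[j]) * v[j] without ever
    # storing the full len(xs) x len(ys) kernel matrix K in memory.
    out = []
    for xi in xs:
        acc = 0.0
        for start in range(0, len(ys), chunk):
            # Only `chunk` kernel values live in memory at once.
            block = [k(xi, yj) for yj in ys[start:start + chunk]]
            acc += sum(kij * vj for kij, vj in zip(block, v[start:start + chunk]))
        out.append(acc)
    return out

# Example: Gaussian (RBF) kernel on scalar points.
rbf = lambda x, y: math.exp(-(x - y) ** 2)
result = kernel_matvec([0.0, 1.0], [0.0, 1.0, 2.0], [1.0, 1.0, 1.0], rbf)
```

In a real implementation the inner block loop runs on the GPU and the chunking is what lets symbolic code scale to point clouds far larger than what a dense kernel matrix would allow.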
The abilities to learn and to categorize are fundamental for cognitive systems, be it animals or machines, and therefore have attracted attention from engineers and psychologists alike. Modern machine learning methods and psychological models of categorization are remarkably similar, partly because these two fields share a common history in artificial neural networks and reinforcement learning....
We propose a framework for dealing with binary hard-margin classification in Banach spaces, centering on the use of a supporting semi-inner-product (s.i.p.) taking the place of an inner-product in Hilbert spaces. The theory of semi-inner-product spaces allows for a geometric, Hilbert-like formulation of the problems, and we show that a surprising number of results from the Euclidean case can be...
Multilayer Perceptrons (MLP) are formulated within Support Vector Machine (SVM) framework by constructing multilayer networks of SVMs. The coupled approximation scheme can take advantage of generalization capabilities of the SVM and the combinatory feature of the hidden layer of MLP. The network, the Multilayer Kerceptron (MLK) assumes its own backpropagation procedure that we shall derive here...