Search results for: machine selection

Number of results: 565809

2005
Yi-Wei Chen, Chih-Jen Lin

This article investigates the performance of combining support vector machines (SVM) and various feature selection strategies. Some of them are filter-type approaches: general feature selection methods independent of SVM; and some are wrapper-type methods: modifications of SVM which can be used to select features. We apply these strategies while participating in the NIPS 2003 Feature Selection Chall...
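As a rough sketch of the filter-type strategy described in this abstract, the example below ranks features with a univariate score computed independently of the SVM and then trains the SVM on the retained subset; the scoring function, the number of kept features, and the synthetic data are illustrative choices, not the paper's setup.

```python
# Filter-type feature selection followed by an SVM: the filter step scores
# each feature on its own (independently of the classifier), keeps the top k,
# and the SVM is then trained only on those features.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=500,
                           n_informative=10, random_state=0)

pipe = make_pipeline(
    SelectKBest(score_func=f_classif, k=20),  # filter step, SVM-independent
    StandardScaler(),
    SVC(kernel="rbf", C=1.0),                 # classifier on the kept features
)
print(cross_val_score(pipe, X, y, cv=5).mean())
```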

Journal: Journal of Machine Learning Research (JMLR), 2013
Alexander R. Statnikov, Jan Lemeire, Constantin F. Aliferis

Algorithms for Markov boundary discovery from data constitute an important recent development in machine learning, primarily because they offer a principled solution to the variable/feature selection problem and give insight into local causal structure. Over the last decade many sound algorithms have been proposed to identify a single Markov boundary of the response variable. Even though faithful...
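A minimal grow/shrink sketch in the spirit of such single-Markov-boundary algorithms is shown below; it uses a plug-in conditional mutual information estimate with a fixed threshold in place of a proper statistical independence test, and it is not one of the algorithms evaluated in the paper.

```python
import numpy as np

def cond_mutual_info(x, y, z_cols):
    """Plug-in estimate of I(X; Y | Z) for discrete-valued samples."""
    n = len(x)
    if z_cols.shape[1] == 0:
        z_keys = np.zeros(n, dtype=int)
    else:
        # encode each distinct row of Z as a single integer key
        _, z_keys = np.unique(z_cols, axis=0, return_inverse=True)
        z_keys = z_keys.ravel()
    mi = 0.0
    for zk in np.unique(z_keys):
        mask = z_keys == zk
        pz = mask.mean()
        xs, ys = x[mask], y[mask]
        for xv in np.unique(xs):
            for yv in np.unique(ys):
                pxy = np.mean((xs == xv) & (ys == yv))
                if pxy > 0:
                    px, py = np.mean(xs == xv), np.mean(ys == yv)
                    mi += pz * pxy * np.log(pxy / (px * py))
    return mi

def markov_boundary(data, target_idx, threshold=0.01):
    """Grow/shrink search for a Markov boundary of the target column."""
    candidates = [i for i in range(data.shape[1]) if i != target_idx]
    mb = []
    changed = True
    while changed:                      # growing phase
        changed = False
        scores = [(cond_mutual_info(data[:, i], data[:, target_idx],
                                    data[:, mb]), i)
                  for i in candidates if i not in mb]
        if scores:
            best_score, best_i = max(scores)
            if best_score > threshold:
                mb.append(best_i)
                changed = True
    for i in list(mb):                  # shrinking phase
        rest = [j for j in mb if j != i]
        if cond_mutual_info(data[:, i], data[:, target_idx],
                            data[:, rest]) <= threshold:
            mb.remove(i)
    return sorted(mb)

# toy check: the target depends on columns 0 and 1 only
rng = np.random.default_rng(0)
a, b, noise = (rng.integers(0, 2, 2000) for _ in range(3))
t = a & b
data = np.column_stack([a, b, noise, t])
print(markov_boundary(data, target_idx=3))   # expected: [0, 1]
```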

2012
Danai Georgara, Katia Kermanidis, Ioannis Mariolis

In this study protein sequences are assigned to functional families using machine learning techniques. The assignment is based on support vector machine classification of binary feature vectors denoting the presence or absence in the protein of highly conserved sequences of amino-acids called motifs. Since the input vectors of the classifier consist of a great number of motifs, feature selectio...
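A minimal sketch of the encoding described here: each protein becomes a binary vector marking the presence or absence of each motif, and an SVM is trained on those vectors. The motif strings, sequences, and labels below are made-up placeholders, and exact substring matching stands in for real motif/pattern searches.

```python
# Encode each protein as a binary presence/absence vector over a motif list
# and classify with a linear SVM.
import numpy as np
from sklearn.svm import LinearSVC

motifs = ["GKS", "HIS", "WDR", "PLK"]                  # hypothetical motifs
sequences = ["MAGKSLLHIS", "MWDRPLKAAA", "MGKSWDRLLL", "MHISPLKGGG"]
labels = np.array([0, 1, 0, 1])                        # hypothetical families

# 1 if the motif occurs anywhere in the sequence, else 0
X = np.array([[1 if m in seq else 0 for m in motifs] for seq in sequences])

clf = LinearSVC().fit(X, labels)
print(clf.predict(X))
```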

2009
Tetsu Matsukawa, Koji Suzuki, Takio Kurita

This paper proposes a selection method of foreground local features for generic object recognition in “bag of features”. In the conventional bag-of-features method, all local features detected from a given image are usually voted into a histogram of visual words. However, this may not be a good choice because, in the standard object recognition task, an image includes both target regions and background regions. To ...
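For reference, a bare-bones bag-of-features pipeline looks roughly like the sketch below: local descriptors are clustered into a visual vocabulary and each image is represented by a histogram of visual-word votes. The random descriptors and vocabulary size are stand-ins for real local features, and the paper's foreground-selection step is not shown.

```python
# Bag-of-features: build a visual vocabulary by clustering descriptors,
# then vote each image's descriptors into a histogram over that vocabulary.
import numpy as np
from sklearn.cluster import MiniBatchKMeans

rng = np.random.default_rng(0)
# pretend we extracted 128-D local descriptors (e.g. SIFT) from 10 images
descriptors_per_image = [rng.normal(size=(rng.integers(50, 120), 128))
                         for _ in range(10)]

vocab_size = 32
codebook = MiniBatchKMeans(n_clusters=vocab_size, random_state=0)
codebook.fit(np.vstack(descriptors_per_image))

def bof_histogram(descriptors):
    """Vote each local descriptor into its nearest visual word."""
    words = codebook.predict(descriptors)
    hist = np.bincount(words, minlength=vocab_size).astype(float)
    return hist / hist.sum()          # normalise away the image's feature count

hists = np.array([bof_histogram(d) for d in descriptors_per_image])
print(hists.shape)                    # (10, 32): one histogram per image
```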

2011
Maria Muntean, Honoriu Vălean, Remus Joldeş, Emilian Ceuca

Most of the time, more data means better results. This is not always the case, however, because sometimes we have a lot of redundant data and many attributes that are only weakly related to what we are trying to find out by evaluating the data. The main idea behind feature selection is to keep the data that bring the most information for learning how to evaluate future data that are g...
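A toy illustration of that idea, assuming a simple correlation-based filter: attributes only weakly correlated with the target are dropped, as are attributes that are nearly redundant with ones already kept. The thresholds and data are illustrative, not the method of the paper.

```python
# Drop weakly related attributes and attributes redundant with kept ones.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 500
informative = rng.normal(size=n)
df = pd.DataFrame({
    "a": informative,
    "a_copy": informative + 0.01 * rng.normal(size=n),  # redundant with "a"
    "noise": rng.normal(size=n),                         # weakly related
})
target = informative + 0.1 * rng.normal(size=n)

def select_features(df, target, min_relevance=0.2, max_redundancy=0.9):
    kept = []
    # rank attributes by absolute correlation with the target
    relevance = df.corrwith(pd.Series(target, index=df.index)).abs()
    for col in relevance.sort_values(ascending=False).index:
        if relevance[col] < min_relevance:
            continue                                     # weakly related: drop
        if any(abs(df[col].corr(df[k])) > max_redundancy for k in kept):
            continue                                     # redundant: drop
        kept.append(col)
    return kept

print(select_features(df, target))    # expected: ['a'] (or ['a_copy'])
```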

Journal: Soft Computing, 2006
Z. Ying, K. C. Keong

The gene selection procedure is a necessary step to increase the accuracy of machine learning algorithms that help in disease diagnosis based on gene expression data. This is commonly known as a feature subset selection problem in the machine learning domain. A fast leave-one-out (LOO) evaluation formula for least-squares support vector machines (LSSVMs) is introduced here that can guide our backward f...
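In the same spirit, a closed-form LOO evaluation for a kernel ridge / regularized least-squares model is sketched below: the LOO residuals are r_i / (1 - H_ii), so the model never has to be refit n times. This is the standard kernel-ridge identity, not the paper's LSSVM-specific formula (which additionally handles the bias term); the regularization parameter and toy data are illustrative.

```python
# Fast leave-one-out residuals for a regularized least-squares model via
# the hat matrix H (y_hat = H y): LOO residual_i = residual_i / (1 - H_ii).
import numpy as np

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 5))
y = X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.normal(size=n)

gamma = 10.0                                     # regularisation parameter
K = X @ X.T                                      # linear kernel for simplicity
H = K @ np.linalg.inv(K + np.eye(n) / gamma)     # hat matrix
residuals = y - H @ y
loo_residuals = residuals / (1.0 - np.diag(H))
print("fast LOO MSE:", np.mean(loo_residuals ** 2))

# brute-force check on the first point: refit without it and compare
idx = np.arange(1, n)
K_sub = X[idx] @ X[idx].T
alpha = np.linalg.solve(K_sub + np.eye(n - 1) / gamma, y[idx])
pred0 = (X[0] @ X[idx].T) @ alpha
print("direct LOO residual:", y[0] - pred0, "closed form:", loo_residuals[0])
```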

2017
Khanh Nguyen

Max-margin and kernel methods are dominant approaches to solving many tasks in machine learning. However, the paramount question is how to solve the model selection problem in these methods. This becomes especially urgent in the online learning context. Grid search is a common approach, but it turns out to be highly problematic in real-world applications. Our approach is to view max-margin and kernel methods under a ...
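The grid-search baseline mentioned here is, in its plainest form, an exhaustive cross-validation over a grid of hyperparameters, as in the sketch below; the grid, data, and classifier settings are illustrative, and the cost grows with the grid size, which is what makes it awkward in online settings.

```python
# Grid-search model selection for an RBF-kernel SVM over C and gamma.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=20, random_state=0)

param_grid = {"C": [0.1, 1, 10, 100], "gamma": [1e-3, 1e-2, 1e-1, 1]}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```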

2011
Andreas Argyriou

We study the problem of recovering a sparse vector from a set of linear measurements. This problem also relates to feature or variable selection in statistics and machine learning. A widely used method for such problems has been regularization with the L1 norm. We extend this methodology to allow for a broader class of regularizers which includes the L1 norm. This class is characterized by a co...
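As a reference point for the L1 baseline this abstract starts from, the sketch below recovers a sparse vector from noisy linear measurements with iterative soft-thresholding (ISTA); it does not implement the broader regularizer class the paper develops, and the problem sizes, step size, and regularization weight are illustrative.

```python
# L1-regularised sparse recovery min_x 0.5*||Ax - b||^2 + lam*||x||_1
# solved with ISTA: gradient step on the smooth term, then soft-thresholding.
import numpy as np

rng = np.random.default_rng(0)
n_measurements, n_features, sparsity = 80, 200, 5

A = rng.normal(size=(n_measurements, n_features)) / np.sqrt(n_measurements)
x_true = np.zeros(n_features)
x_true[rng.choice(n_features, sparsity, replace=False)] = rng.normal(size=sparsity)
b = A @ x_true + 0.01 * rng.normal(size=n_measurements)

lam = 0.05
step = 1.0 / np.linalg.norm(A, 2) ** 2        # 1 / Lipschitz constant of the gradient

x = np.zeros(n_features)
for _ in range(500):
    grad = A.T @ (A @ x - b)                  # gradient of the least-squares term
    z = x - step * grad
    x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)   # soft-thresholding
print("nonzeros recovered:", np.count_nonzero(np.abs(x) > 1e-3),
      "error:", np.linalg.norm(x - x_true))
```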

2007
Mark Palatucci

Recent work in neuroimaging has shown that it is possible to classify cognitive states from functional magnetic resonance images (fMRI). Machine learning classifiers such as Gaussian Naive Bayes, Support Vector Machines, and Nearest Neighbors have all been applied successfully to this domain. Although it is a natural question to ask which classifiers work best, research has shown that the accur...
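The kind of comparison described here can be sketched as cross-validating the three named classifiers on the same feature matrix, as below; the synthetic data merely stands in for voxel features extracted from fMRI and is not an fMRI dataset.

```python
# Compare Gaussian Naive Bayes, a linear SVM and k-nearest neighbours
# with the same cross-validation splits on one feature matrix.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=100,
                           n_informative=10, random_state=0)

classifiers = {
    "Gaussian Naive Bayes": GaussianNB(),
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="linear")),
    "Nearest Neighbors": KNeighborsClassifier(n_neighbors=5),
}
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: {scores.mean():.3f}")
```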

2004
Zhili Wu, Chunhung Li

Given unlabeled data in advance, transductive feature selection (TFS) aims to maximize the classification accuracy on these particular unlabeled data by selecting a small set of relevant and less redundant features. Specifically, this paper introduces the use of Transductive Support Vector Machines (TSVMs) for feature selection. We study three inductive SVM-related feature selection methods: corre...
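One common inductive SVM-related feature selection method is recursive feature elimination driven by the weights of a linear SVM, sketched below on toy data; the transductive (TSVM-based) variant discussed in the abstract is not reproduced here.

```python
# SVM-RFE: repeatedly fit a linear SVM and drop the lowest-weight features.
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=300, n_features=50,
                           n_informative=8, random_state=0)

selector = RFE(LinearSVC(C=1.0, max_iter=5000),
               n_features_to_select=8, step=5)
selector.fit(X, y)
print("selected feature indices:",
      [i for i, keep in enumerate(selector.support_) if keep])
```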

[Chart: number of search results per year]