Search results for: weak learner

Number of results: 155822

2008
Philip M. Long Rocco A. Servedio

In recent work Long and Servedio [LS05] presented a “martingale boosting” algorithm that works by constructing a branching program over weak classifiers and has a simple analysis based on elementary properties of random walks. [LS05] showed that this martingale booster can tolerate random classification noise when it is run with a noise-tolerant weak learner; however, a drawback of the algorith...
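To make the branching-program structure concrete, here is a hedged sketch (not the exact [LS05] construction) of how such a booster classifies: an example takes a random-walk-like pass through levels of weak classifiers, and the final label depends on how many "move right" steps it accumulated. The `weak_classifiers` grid layout and the majority-style final threshold are assumptions for illustration.

```python
def branching_program_predict(x, weak_classifiers):
    """Illustrative sketch of prediction by a branching program over weak
    classifiers (not the exact [LS05] algorithm).

    weak_classifiers[t][k] is the weak classifier sitting at level t, node k,
    where k counts the +1 predictions seen so far; each returns +1 or -1.
    """
    k = 0                                  # start at the single node of level 0
    T = len(weak_classifiers)
    for t in range(T):
        if weak_classifiers[t][k](x) == 1:
            k += 1                         # step "right" on a +1 prediction
    # assumed final rule: positive if the walk ended in the upper half of the levels
    return 1 if k > T / 2 else -1
```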

2011
Pannagadatta K. Shivaswamy Tony Jebara

This paper proposes a novel boosting algorithm called VadaBoost which is motivated by recent empirical Bernstein bounds. VadaBoost iteratively minimizes a cost function that balances the sample mean and the sample variance of the exponential loss. Each step of the proposed algorithm minimizes the cost efficiently by providing weighted data to a weak learner rather than requiring a brute force e...
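A minimal sketch of the kind of objective the abstract describes, namely a trade-off between the sample mean and the sample variance of the exponential loss; the balance parameter `lam` is a hypothetical knob for illustration, and the exact cost and weighting scheme used by VadaBoost differ in details not shown in this excerpt.

```python
import numpy as np

def mean_variance_exp_cost(margins, lam=0.5):
    """Cost balancing the sample mean and sample variance of exp(-margin).

    margins: array of y_i * F(x_i) under the current ensemble F.
    lam:     illustrative trade-off between mean (lam) and variance (1 - lam);
             not the parameterization from the VadaBoost paper itself.
    """
    losses = np.exp(-margins)
    return lam * losses.mean() + (1.0 - lam) * losses.var()
```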

2009
Zhi-Hua Zhou

Boosting is a family of ensemble methods that produce a strong learner capable of making very accurate predictions by combining rough and moderately inaccurate learners (called base learners or weak learners). In particular, Boosting sequentially trains a series of base learners using a base learning algorithm, where the training examples wrongly predicted by a base learn...
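Since this excerpt describes how boosting reweights the examples that the current base learner gets wrong, a textbook AdaBoost-style sketch may make the loop concrete; it uses decision stumps as the weak learner, and the helper names (`fit_stump`, `adaboost`, `predict`) are illustrative, not taken from the cited work.

```python
import numpy as np

def fit_stump(X, y, w):
    """Pick the axis-aligned threshold stump with the lowest weighted error."""
    best = (0, 0.0, 1, np.inf)               # (feature, threshold, polarity, error)
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(X[:, j] >= thr, pol, -pol)
                err = w[pred != y].sum()
                if err < best[3]:
                    best = (j, thr, pol, err)
    return best

def adaboost(X, y, T=20):
    """Textbook AdaBoost with decision stumps; y must take values in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)                  # uniform initial example weights
    ensemble = []
    for _ in range(T):
        j, thr, pol, err = fit_stump(X, y, w)
        err = min(max(err, 1e-12), 1 - 1e-12)
        alpha = 0.5 * np.log((1 - err) / err)      # weight of this weak learner's vote
        pred = np.where(X[:, j] >= thr, pol, -pol)
        w *= np.exp(-alpha * y * pred)             # upweight the examples it got wrong
        w /= w.sum()
        ensemble.append(((j, thr, pol), alpha))
    return ensemble

def predict(ensemble, X):
    votes = sum(alpha * np.where(X[:, j] >= thr, pol, -pol)
                for (j, thr, pol), alpha in ensemble)
    return np.where(votes >= 0, 1, -1)
```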

2011
Shizuka Nakamura Yoshinori Sagisaka Michiko Nakano

To more effectively improve learners' proficiency in controlling the contrast between the stressed and the unstressed in English teaching, it is necessary to analyze how the acoustic characteristics of learners' speech relate to the perceptual evaluation by teachers. This paper analyzes A) learner characteristics of durations measured in speech units related to stress, which are stressed and unstr...

2001
Shie Mannor Ron Meir

We consider geometric conditions on a labeled data set which guarantee that boosting algorithms work well when linear classifiers are used as weak learners. We start by providing conditions on the error of the weak learner which guarantee that the empirical error of the composite classifier is small. We then focus on conditions required in order to ensure that the linear weak learner itself ach...
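For context on the first kind of condition mentioned (a bound on the weak learner's error controlling the composite classifier's empirical error), the classical AdaBoost-style guarantee is stated below for reference; it is the standard textbook bound, not a result quoted from the Mannor-Meir paper itself.

```latex
% If every weak learner h_t has weighted error at most 1/2 - \gamma, then the
% training error of the boosted classifier after T rounds satisfies
\widehat{\mathrm{err}}\!\left(\operatorname{sign}\textstyle\sum_{t=1}^{T}\alpha_t h_t\right)
\;\le\; \prod_{t=1}^{T}\sqrt{1-4\gamma^{2}}
\;\le\; \exp\!\left(-2\gamma^{2}T\right).
```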

2000
Shie Mannor Ron Meir

The problem of constructing weak classifiers for boosting algorithms is studied. We present an algorithm that produces a linear classifier that is guaranteed to achieve an error better than random guessing for any distribution on the data. While this weak learner is not useful for learning in general, we show that under reasonable conditions on the distribution it yields an effective weak learn...
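As a point of comparison only (this is not the construction from the paper), here is a hedged toy "linear weak learner": it splits along the weighted class-mean difference and flips its sign if that lowers the weighted error, which only guarantees weighted error at most 1/2; the paper's algorithm gives the stronger strictly-better-than-random guarantee. The function name and interface are illustrative.

```python
import numpy as np

def linear_weak_learner(X, y, w):
    """Toy weighted linear classifier: hyperplane along the difference of the
    weighted class means, sign-corrected on the training distribution w.
    y takes values in {-1, +1}; w is a probability vector over the samples.
    Illustrative only; not the construction analyzed in the paper.
    """
    mu_pos = np.average(X[y == 1], axis=0, weights=w[y == 1])
    mu_neg = np.average(X[y == -1], axis=0, weights=w[y == -1])
    direction = mu_pos - mu_neg
    bias = -direction @ (mu_pos + mu_neg) / 2.0
    pred = np.sign(X @ direction + bias)
    pred[pred == 0] = 1
    if w[pred != y].sum() > 0.5:           # flipping the classifier can only help
        direction, bias = -direction, -bias
    return direction, bias
```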

2015
Artem Sokolov Stefan Riezler Shay B. Cohen

Coactive learning describes the interaction between an online structured learner and a human user who corrects the learner by responding with weak feedback, that is, with an improved, but not necessarily optimal, structure. We apply this framework to discriminative learning in interactive machine translation. We present a generalization to latent variable models and give regret and generalizati...
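The interaction described above is usually driven by a preference-perceptron style update that moves the weight vector toward the user's improved structure; the sketch below shows that generic update, not necessarily the exact latent-variable variant of this paper, and the helpers `decode`, `user_improves`, and `phi` in the commented loop are assumed.

```python
import numpy as np

def coactive_update(w, feats_predicted, feats_feedback, lr=1.0):
    """Preference-perceptron style update used in coactive learning:
    move the weights toward the user's (weakly) improved structure.

    feats_predicted: joint feature vector of the structure the learner output.
    feats_feedback:  joint feature vector of the user's improved structure.
    """
    return w + lr * (feats_feedback - feats_predicted)

# Sketch of the interaction loop (decode, user_improves, and phi are assumed):
# for x in stream:
#     y_hat = decode(w, x)                 # learner's best structure
#     y_bar = user_improves(x, y_hat)      # weak feedback: better, not optimal
#     w = coactive_update(w, phi(x, y_hat), phi(x, y_bar))
```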

Journal: Transactions of the Institute of Systems, Control and Information Engineers, 2004

2012
Ramesh Babu

Machine learning [1] is concerned with the design and development of algorithms that allow computers to evolve intelligent behaviors based on empirical data. A weak learner is a learning algorithm whose accuracy is only slightly better than random guessing (just above 50% for binary classification). Adaptive Boosting (AdaBoost) is a machine learning algorithm that may be used to increase the accuracy of any weak learning algorithm. This can be achieved by running it on a given...
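As a quick, hedged illustration of the point above, boosting a depth-one decision stump (a classic weak learner) typically yields a much more accurate ensemble; the scikit-learn snippet below compares the two on a synthetic dataset whose parameters are arbitrary.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

stump = DecisionTreeClassifier(max_depth=1)              # a single weak learner
print("stump alone:", stump.fit(X_tr, y_tr).score(X_te, y_te))

# AdaBoostClassifier boosts depth-1 decision stumps by default
boosted = AdaBoostClassifier(n_estimators=200, random_state=0)
print("AdaBoost:   ", boosted.fit(X_tr, y_tr).score(X_te, y_te))
```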

2001
Shai Ben-David Philip M. Long Yishay Mansour

We extend the boosting paradigm to the realistic setting of agnostic learning, that is, to a setting where the training sample is generated by an arbitrary (unknown) probability distribution over examples and labels. We define a β-weak agnostic learner with respect to a hypothesis class F as follows: given a distribution P, it outputs some hypothesis h ∈ F whose error is at most er_P(F) + β, where er...
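Restating the definition from this excerpt in display form, where β denotes the weak learner's slack and er_P(F) is read here as the best error achievable within F (the truncated abstract presumably defines it this way):

```latex
% A \beta-weak agnostic learner for a hypothesis class F: given any
% distribution P over examples and labels, it returns h \in F with
\mathrm{er}_P(h) \;\le\; \mathrm{er}_P(F) + \beta,
\qquad \text{where } \mathrm{er}_P(F) = \inf_{f \in F} \mathrm{er}_P(f).
```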

Chart of the number of search results per year
