Search results for: AdaBoost learning

Number of results: 22173

2005
Yanmin Sun Andrew K. C. Wong Yang Wang

Several cost-sensitive boosting algorithms have been reported as effective methods for dealing with the class imbalance problem. Misclassification costs, which reflect the different levels of importance of identifying each class, are integrated into the weight update formula of the AdaBoost algorithm. Yet, it has been shown that the weight update parameter of AdaBoost is induced so that the training error can b...
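As a rough illustration of how a per-example misclassification cost can enter AdaBoost's weight update, here is a minimal sketch in Python; the placement of the cost factor and all parameters are assumptions for illustration, not the exact formulation of this paper.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def cost_sensitive_boost(X, y, costs, n_rounds=10):
    """Toy cost-sensitive boosting: per-example costs scale the weight update.

    y takes values in {-1, +1}; costs[i] > 0 is the misclassification cost of
    example i (e.g. larger for the minority class). Illustrative only.
    """
    n = len(y)
    w = np.full(n, 1.0 / n)
    learners, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.clip(np.sum(w * (pred != y)), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        # The cost factor multiplies the usual exponential update, so costly
        # examples gain weight faster when they are misclassified.
        w = w * costs * np.exp(-alpha * y * pred)
        w /= w.sum()
        learners.append(stump)
        alphas.append(alpha)
    return learners, alphas

def boosted_predict(learners, alphas, X):
    return np.sign(sum(a * h.predict(X) for h, a in zip(learners, alphas)))
```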

Thesis: Ministry of Science, Research and Technology - Shiraz University, 1390

This thesis focuses on automatic tracking of multiple faces, which is currently a challenging problem in applications such as human-computer interaction, surveillance systems, and intelligent security. Since face detection is a preliminary stage of many applications, including face tracking, we first study the various detection methods along with the advantages and disadvantages of each, and then a suitable face detector for use in the tracking syste...
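As a minimal illustration of the detection stage described above, the following sketch runs OpenCV's AdaBoost-trained Haar cascade (Viola-Jones) on a single frame; the file names and parameters are placeholders, not the detector developed in the thesis.

```python
import cv2

# Minimal Viola-Jones face detection (AdaBoost-trained Haar cascade) as the
# preliminary stage of a tracking pipeline. The image path is a placeholder.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

frame = cv2.imread("frame.jpg")                      # placeholder input frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("detections.jpg", frame)
```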

2002
Christopher James Cartmell Amanda Sharkey

Declaration All sentences or passages quoted in this dissertation from other people's work have been specifically acknowledged by clear cross-referencing to author, work and page(s). Any illustrations which are not the work of the author of this dissertation have been used with the explicit permission of the originator and are specifically acknowledged. I understand that failure to do this amou...

Journal: Journal of Machine Learning Research 2017
Abraham J. Wyner Matthew Olson Justin Bleich David Mease

There is a large literature explaining why AdaBoost is a successful classifier. The literature on AdaBoost focuses on classifier margins and boosting's interpretation as the optimization of an exponential likelihood function. These existing explanations, however, have been pointed out to be incomplete. A random forest is another popular ensemble method for which there is substantially less expl...
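To make the two classical explanations concrete, the following sketch fits an AdaBoost model and a random forest on synthetic data and computes the training margins and the exponential surrogate loss; the data and settings are illustrative, not those of the paper.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier

# Synthetic data; everything here is illustrative, not the paper's experiments.
X, y = make_classification(n_samples=500, random_state=0)
y_pm = 2 * y - 1                                   # labels in {-1, +1}

ada = AdaBoostClassifier(n_estimators=200, random_state=0).fit(X, y)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# decision_function returns the normalized weighted vote f(x); the two
# classical explanations look at the margin y*f(x) and the surrogate
# exponential loss exp(-y*f(x)) that AdaBoost greedily minimizes.
f = ada.decision_function(X)
margins = y_pm * f
exp_loss = np.exp(-margins).mean()

print(f"minimum training margin: {margins.min():.3f}")
print(f"mean exponential loss:   {exp_loss:.3f}")
print(f"train accuracy  AdaBoost={ada.score(X, y):.3f}  RF={rf.score(X, y):.3f}")
```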

2016
Jianfang Cao Lichao Chen Min Wang Hao Shi Yun Tian

Image classification uses computers to simulate human understanding and cognition of images by automatically categorizing images. This study proposes a faster image classification approach that parallelizes the traditional Adaboost-Backpropagation (BP) neural network using the MapReduce parallel programming model. First, we construct a strong classifier by assembling the outputs of 15 BP neural...
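A single-machine sketch of the underlying idea, boosting several small backpropagation (MLP) networks by resampling, is given below; the MapReduce parallelization itself is not reproduced, and the dataset, labels, and network sizes are placeholders rather than the paper's setup.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.neural_network import MLPClassifier

# Boosting-by-resampling with small backpropagation (MLP) networks as weak
# learners. Toy binary task with labels in {-1, +1}; illustrative only.
X, y = load_digits(return_X_y=True)
y = (y % 2) * 2 - 1

rng = np.random.default_rng(0)
n = len(y)
w = np.full(n, 1.0 / n)
nets, alphas = [], []

for _ in range(15):                       # 15 BP networks, as in the abstract
    idx = rng.choice(n, size=n, p=w)      # resample according to the weights
    net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=300, random_state=0)
    net.fit(X[idx], y[idx])
    pred = net.predict(X)
    err = np.clip(np.sum(w * (pred != y)), 1e-10, 1 - 1e-10)
    alpha = 0.5 * np.log((1 - err) / err)
    w *= np.exp(-alpha * y * pred)        # standard AdaBoost reweighting
    w /= w.sum()
    nets.append(net)
    alphas.append(alpha)

strong = np.sign(sum(a * net.predict(X) for net, a in zip(nets, alphas)))
print("training accuracy of the boosted ensemble:", (strong == y).mean())
```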

Journal: Informatica 2021

Forecasting stock market behavior has received tremendous attention from investors and researchers for a very long time due to its potential profitability. Stock price prediction is regarded as one of the most challenging applications of time series forecasting. While there is divided opinion on the efficiency of markets, numerous widely accepted empirical studies have shown that markets are predictable to some extent. Sta...

Journal: CoRR 2011
Robert E. Schapire

Boosting is a general method of generating many simple classification rules and combining them into a single, highly accurate rule. This paper reviews the AdaBoost boosting algorithm and some of its underlying theory, and then looks at some of the challenges of applying AdaBoost to bidding in complicated auctions and to human-computer spoken-dialogue systems.
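The "many simple rules combined into one accurate rule" description corresponds to the following minimal discrete AdaBoost over axis-aligned threshold rules; this is a textbook-style sketch on synthetic data, not code from the paper.

```python
import numpy as np

def best_stump(X, y, w):
    """Pick the single-feature threshold rule with the smallest weighted error."""
    best = (0, 0.0, 1, np.inf)                      # (feature, threshold, sign, error)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for s in (1, -1):
                pred = s * np.where(X[:, j] <= t, 1, -1)
                err = np.sum(w * (pred != y))
                if err < best[3]:
                    best = (j, t, s, err)
    return best

def adaboost(X, y, n_rounds=20):
    """Discrete AdaBoost over simple threshold rules, y in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(n_rounds):
        j, t, s, err = best_stump(X, y, w)
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)       # confidence of this weak rule
        pred = s * np.where(X[:, j] <= t, 1, -1)
        w *= np.exp(-alpha * y * pred)              # reweight toward the mistakes
        w /= w.sum()
        ensemble.append((alpha, j, t, s))
    return ensemble

def predict(ensemble, X):
    score = sum(a * s * np.where(X[:, j] <= t, 1, -1) for a, j, t, s in ensemble)
    return np.sign(score)

# Tiny demo on synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
model = adaboost(X, y)
print("training accuracy:", (predict(model, X) == y).mean())
```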

2009
Henry Lin

In this project I use heterogeneous experts, managed by AdaBoost, to find regions containing text in an image. Using both AdaBoost and LPBoost expert frameworks as well as figure-ground segmentation, I show that experts based on the intensity gradient beat Haar basis features in this problem domain.
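A stripped-down sketch of this kind of pipeline, intensity-gradient features on image patches fed to an AdaBoost classifier, is shown below; the patches, labels, and feature choices are placeholders, and the LPBoost and figure-ground components are omitted.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

def gradient_features(patch):
    """Simple intensity-gradient descriptors for one grayscale patch."""
    gy, gx = np.gradient(patch.astype(float))
    mag = np.hypot(gx, gy)
    return np.array([mag.mean(), mag.std(), np.abs(gx).mean(), np.abs(gy).mean()])

# Placeholder data: random patches stand in for labelled text / non-text
# regions; in the project these would come from real images.
rng = np.random.default_rng(0)
patches = rng.random((400, 16, 16))
labels = rng.integers(0, 2, size=400)            # 1 = text, 0 = background

X = np.stack([gradient_features(p) for p in patches])
clf = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X, labels)
print("training accuracy (placeholder data):", clf.score(X, labels))
```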

1999
Robert E. Schapire

Boosting is a general method for improving the accuracy of any given learning algorithm. Focusing primarily on the AdaBoost algorithm, we briefly survey theoretical work on boosting including analyses of AdaBoost's training error and generalization error, connections between boosting and game theory, methods of estimating probabilities using boosting, and extensions of AdaBoost for multiclass c...
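The training-error analysis mentioned here can be summarized by the standard bound of Freund and Schapire, where $\epsilon_t$ is the weighted error of the weak hypothesis in round $t$ and $\gamma_t = 1/2 - \epsilon_t$ is its edge over random guessing:

```latex
% Training error of the combined classifier H after T rounds of AdaBoost:
\[
  \frac{1}{m}\,\bigl|\{\, i : H(x_i) \neq y_i \,\}\bigr|
  \;\le\; \prod_{t=1}^{T} 2\sqrt{\epsilon_t(1-\epsilon_t)}
  \;=\; \prod_{t=1}^{T} \sqrt{1 - 4\gamma_t^{2}}
  \;\le\; \exp\!\Bigl(-2\sum_{t=1}^{T}\gamma_t^{2}\Bigr).
\]
```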

2004
Nikunj C. Oza

AdaBoost [4] is a well-known ensemble learning algorithm that constructs its constituent or base models in sequence. A key step in AdaBoost is constructing a distribution over the training examples to create each base model. This distribution, represented as a vector, is constructed to be orthogonal to the vector of mistakes made by the previous base model in the sequence [6]. The idea is to ma...
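The orthogonality property can be checked numerically: after the weight update, the weak learner just trained has weighted error exactly 1/2 under the new distribution, i.e. the new weights carry no information about its mistakes. A small sketch on synthetic data, illustrative only:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Numerical check of the property described above: after AdaBoost's weight
# update, the previous weak learner has weighted error exactly 1/2 under the
# new distribution, so the new weights are "orthogonal" to its mistake vector.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))
y = np.where(X[:, 0] - X[:, 1] > 0, 1, -1)

n = len(y)
w = np.full(n, 1.0 / n)
stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
pred = stump.predict(X)

err = np.sum(w * (pred != y))
alpha = 0.5 * np.log((1 - err) / err)
w_new = w * np.exp(-alpha * y * pred)
w_new /= w_new.sum()

print("error under old weights:", err)                          # below 1/2
print("error under new weights:", np.sum(w_new * (pred != y)))  # exactly 1/2
```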
