Anytime learning of anycost classifiers
Authors
Abstract
Similar resources
Learning for anytime classification
Many on-line applications of machine learning require that the learned classifiers complete classification within strict real-time constraints. In consequence, efficient classifiers such as naive Bayes (NB) are often employed that can complete the required classification tasks even under peak computational loads. While NB provides acceptable accuracy, more computationally intensive approaches c...
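The "anytime" contract these abstracts refer to is that a classifier must always have a usable answer ready and should keep improving it while computation time remains. Below is a minimal Python sketch of that contract only, not the algorithm from any of the listed papers; the class name AnytimeClassifier and the staged-models design are assumptions made for the example, and each stage can be any fitted estimator with a scikit-learn-style predict method.

```python
# Minimal sketch of the anytime-classification contract (illustrative only):
# always hold a usable prediction, and refine it with progressively costlier
# models while the time budget lasts.
import time

class AnytimeClassifier:
    def __init__(self, stages):
        # `stages` is a list of fitted models ordered from cheapest to most
        # expensive, e.g. [naive_bayes, small_forest, large_forest].
        self.stages = stages

    def predict(self, x, budget_seconds):
        deadline = time.monotonic() + budget_seconds
        best = None
        for model in self.stages:
            best = model.predict([x])[0]      # cheapest answer is ready first
            if time.monotonic() >= deadline:  # interrupted: return what we have
                break
        return best
```

In this reading, the cheapest stage could be a naive Bayes model and later stages progressively larger ensembles, so an interruption under peak load still returns the NB prediction.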
Anytime Representation Learning
Evaluation cost during test-time is becoming increasingly important as many real-world applications need fast evaluation (e.g. web search engines, email spam filtering) or use expensive features (e.g. medical diagnosis). We introduce Anytime Feature Representations (AFR), a novel algorithm that explicitly addresses this trade-off in the data representation rather than in the classifier. This en...
Anytime Learning of Decision Trees
The majority of existing algorithms for learning decision trees are greedy: a tree is induced top-down, making locally optimal decisions at each node. In most cases, however, the constructed tree is not globally optimal. Even the few non-greedy learners cannot learn good trees when the concept is difficult. Furthermore, they require a fixed amount of time and are not able to generate a better tre...
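As a concrete illustration of the greedy, locally optimal induction described above, here is a compact ID3-style sketch for binary features; the function names and the restriction to binary splits are assumptions made for brevity, not the paper's learner. Each node commits to the split with the highest information gain and never revisits it, which is exactly why the resulting tree need not be globally optimal.

```python
# Illustrative greedy (ID3-style) top-down induction on 0/1 features:
# each node picks the locally best split by information gain.
import math
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def build_tree(X, y, features):
    if len(set(y)) == 1 or not features:          # pure node, or nothing left to split on
        return Counter(y).most_common(1)[0][0]    # leaf: majority label

    def gain(f):
        split = {v: [lab for x, lab in zip(X, y) if x[f] == v] for v in (0, 1)}
        return entropy(y) - sum(len(s) / len(y) * entropy(s)
                                for s in split.values() if s)

    best = max(features, key=gain)                # locally optimal choice only
    node = {"feature": best, "children": {}}
    for v in (0, 1):
        subset = [(x, lab) for x, lab in zip(X, y) if x[best] == v]
        if not subset:
            node["children"][v] = Counter(y).most_common(1)[0][0]
        else:
            Xs, ys = zip(*subset)
            node["children"][v] = build_tree(
                list(Xs), list(ys), [f for f in features if f != best])
    return node
```

Calling build_tree(X, y, list(range(num_features))) on 0/1 feature vectors returns a nested dict of splits with majority-class leaves.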
Anytime Active Learning
A common bottleneck in deploying supervised learning systems is collecting human-annotated examples. In many domains, annotators form an opinion about the label of an example incrementally — e.g., each additional word read from a document or each additional minute spent inspecting a video helps inform the annotation. In this paper, we investigate whether we can train learning systems more effic...
Anytime Query-Tuned Kernel Machine Classifiers Via Cholesky Factorization
We recently demonstrated 2 to 64-fold query-time speedups of Support Vector Machine and Kernel Fisher classifiers via a new computational geometry method for anytime output bounds (DeCoste, 2002). This new paper refines our approach in two key ways. First, we introduce a simple linear algebra formulation based on Cholesky factorization, yielding simpler equations and lower computational overhea...
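The general idea of anytime output bounds can be illustrated without the paper's Cholesky formulation: for a kernel classifier whose kernel values are bounded (RBF kernel values lie in (0, 1]), the contribution of the not-yet-evaluated support vectors can be bracketed after each term, and evaluation can stop as soon as the sign of the decision value is certain. The sketch below is only such a generic illustration under that RBF assumption; the function anytime_decision and its arguments are names invented for the example, not DeCoste's method.

```python
# Generic illustration of anytime output bounds for an RBF-kernel classifier
# (not the paper's Cholesky-based formulation). Every RBF kernel value lies
# in (0, 1], so after scoring only some support vectors the unscored terms
# can be bracketed, and evaluation may stop once the sign is certain.
import numpy as np

def anytime_decision(x, support_vectors, coefs, bias, gamma=1.0):
    # coefs[i] = alpha_i * y_i for support vector i (a NumPy array of signed
    # dual weights); support_vectors is a 2-D array, one row per vector.
    pos_left = coefs[coefs > 0].sum()   # most the remaining terms could add
    neg_left = coefs[coefs < 0].sum()   # most they could subtract
    partial = bias
    for sv, c in zip(support_vectors, coefs):
        partial += c * np.exp(-gamma * np.sum((x - sv) ** 2))
        if c > 0:
            pos_left -= c
        else:
            neg_left -= c
        lower, upper = partial + neg_left, partial + pos_left
        if lower > 0:
            return 1.0, True            # positive for certain: stop early
        if upper < 0:
            return -1.0, True           # negative for certain: stop early
    return float(np.sign(partial)), False
```

The second return value simply reports whether the loop terminated early; in the best case only a few support-vector terms are evaluated before the label is fixed.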
Journal
Journal title: Machine Learning
Year: 2010
ISSN: 0885-6125,1573-0565
DOI: 10.1007/s10994-010-5228-1