Simple PAC Learning of Simple Decision Lists

Authors

  • Jorge Castro
  • José L. Balcázar
Abstract

We prove that log n-decision lists (the class of decision lists in which all terms have low Kolmogorov complexity) are learnable in the simple PAC learning model. The proof is based on a transformation from an algorithm based on equivalence queries (found independently by Simon). We then introduce the class of simple decision lists and extend our algorithm to show that simple decision lists are simple-PAC learnable as well. This last result is relevant in that it is, to our knowledge, the first learning algorithm for decision lists in which an exponentially wide set of functions may be used for the terms.
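The object the abstract studies can be illustrated concretely. The sketch below (the representation and all names are our own illustration, not the paper's construction) shows how a decision list classifies boolean examples: it outputs the label attached to the first term the example satisfies, falling back to a default label.

```python
# Hypothetical representation: a term is a conjunction of literals, encoded
# as a list of (variable_index, required_value) pairs; a decision list is an
# ordered sequence of (term, label) pairs plus a default label.

def satisfies(term, x):
    """True if example x (a tuple of 0/1 values) satisfies every literal of the term."""
    return all(x[i] == required for i, required in term)

def evaluate(decision_list, default, x):
    """Return the label of the first term x satisfies, else the default."""
    for term, label in decision_list:
        if satisfies(term, x):
            return label
    return default

# Example: "if x0 and not x2 then 1, elif x1 then 0, else 1"
dl = [([(0, 1), (2, 0)], 1), ([(1, 1)], 0)]
print(evaluate(dl, 1, (1, 0, 0)))  # first term fires -> 1
print(evaluate(dl, 1, (0, 1, 1)))  # second term fires -> 0
print(evaluate(dl, 1, (0, 0, 1)))  # no term fires -> default 1
```

In the paper's setting the terms are restricted (low Kolmogorov complexity, or drawn from an exponentially wide but simple family); the evaluation semantics above is the standard first-match rule either way.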


Related Articles

PAC Learning under Helpful Distributions

A PAC model under helpful distributions is introduced. A teacher associates a teaching set with each target concept and we only consider distributions such that each example in the teaching set has a non-zero weight. The performance of a learning algorithm depends on the probabilities of the examples in this teaching set. In this model, an Occam's razor theorem and its converse are proved. The ...


COMS 6253: Advanced

Previously:

  • Administrative basics, introduction and high-level overview
  • Concept classes and the relationships among them: DNF formulas, decision trees, decision lists, linear and polynomial threshold functions
  • The Probably Approximately Correct (PAC) learning model
  • PAC learning linear threshold functions in poly(n, 1/ε, log 1/δ) time
  • PAC learning polynomial threshold functions. Toda...


Agnostic PAC Learning Decision Lists is Hard

Agnostic PAC Learning. Let X be the set of individuals and H a set of predicates over X. A learning algorithm L is said to be an agnostic PAC learning algorithm for H if it satisfies the following: given any ε, δ ∈ (0, 1), there is an integer m(ε, δ) such that for all m ≥ m(ε, δ), for any t ∈ H and any probability distribution μ on X × {0, 1}, with probability at least 1 − δ, given a sample of si...
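The agnostic setup in the snippet above can be made concrete with a small empirical-risk-minimization sketch (a generic illustration under names we chose, not the hardness construction of that paper): since the distribution on X × {0, 1} is arbitrary, the learner can only compete with the best predicate in H, measured by empirical error on the sample.

```python
# Agnostic learning sketch: pick the hypothesis in a finite class H with
# the smallest empirical error on a labeled sample; labels need not be
# consistent with any member of H.

def empirical_error(h, sample):
    """Fraction of labeled examples (x, y) on which h disagrees with y."""
    return sum(h(x) != y for x, y in sample) / len(sample)

def erm(hypotheses, sample):
    """Empirical risk minimization over a finite hypothesis class."""
    return min(hypotheses, key=lambda h: empirical_error(h, sample))

# Toy class over one boolean input: identity, negation, constant 0, constant 1.
H = [lambda x: x, lambda x: 1 - x, lambda x: 0, lambda x: 1]
sample = [(1, 1), (0, 0), (1, 1), (0, 1)]  # the last label is "noisy"
best = erm(H, sample)
print(empirical_error(best, sample))  # identity errs only on the noisy point -> 0.25
```

The hardness result in the article's title concerns exactly this setting when H is the class of decision lists: finding a near-optimal decision list under an arbitrary distribution is computationally hard, even though the information-theoretic sample bound m(ε, δ) is benign.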


On Learning Simple Deterministic and Probabilistic Neural Concepts

We investigate the learnability, under the uniform distribution, of deterministic and probabilistic neural concepts that can be represented as simple combinations of nonoverlapping perceptrons with binary weights. Two perceptrons are said to be nonoverlapping if they do not share any input variables. In the deterministic case, we investigate, within the distribution-specific PAC model, the lear...


On learning μ-perceptron networks on the uniform distribution

We investigate the learnability, under the uniform distribution, of neural concepts that can be represented as simple combinations of nonoverlapping perceptrons (also called μ perceptrons) with binary weights and arbitrary thresholds. Two perceptrons are said to be nonoverlapping if they do not share any input variables. Specifically, we investigate, within the distribution-specific PAC model, ...



Publication date: 1995