An Efficient Membership-Query Algorithm for Learning DNF with Respect to the Uniform Distribution

Author

  • Jeffrey C. Jackson
Abstract

We present a membership-query algorithm for efficiently learning DNF with respect to the uniform distribution. In fact, the algorithm properly learns with respect to uniform the class TOP of Boolean functions expressed as a majority vote over parity functions. We also describe extensions of this algorithm for learning DNF over certain nonuniform distributions and for learning a class of geometric concepts that generalizes DNF. Furthermore, we show that DNF is weakly learnable with respect to uniform from noisy examples. Our strong learning algorithm utilizes one of Freund's boosting techniques and relies on the fact that boosting does not require a completely distribution-independent weak learner. The boosted weak learner is a nonuniform extension of a parity-finding algorithm discovered by Goldreich and Levin.
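The weak learner at the core of this approach extends the Goldreich-Levin idea of locating parity functions whose Fourier coefficients are large, using membership queries under the uniform distribution. The sketch below illustrates only that underlying idea: a standard Goldreich-Levin / Kushilevitz-Mansour style search for heavy Fourier coefficients, not the paper's nonuniform extension or its boosting stage. The function names (`heavy_parities`, `estimate_prefix_weight`, `chi`), the ±1 input/output encoding, and the sample counts are illustrative assumptions.

```python
import random

def chi(a, x):
    """Parity character chi_a(x) = (-1)^(a . x) for 0/1 vectors a and x."""
    return 1 - 2 * (sum(ai & xi for ai, xi in zip(a, x)) % 2)

def estimate_prefix_weight(f, n, prefix, samples=2000):
    """Estimate the sum over suffixes b of fhat(prefix.b)^2 via membership queries.

    Uses the identity
        sum_b fhat(a.b)^2 = E_{z,x1,x2}[ f(x1 z) f(x2 z) chi_a(x1 xor x2) ],
    where a = prefix ranges over the first k bits and z over the remaining bits.
    """
    k = len(prefix)
    total = 0.0
    for _ in range(samples):
        z = [random.randint(0, 1) for _ in range(n - k)]
        x1 = [random.randint(0, 1) for _ in range(k)]
        x2 = [random.randint(0, 1) for _ in range(k)]
        x_xor = [b1 ^ b2 for b1, b2 in zip(x1, x2)]
        total += f(tuple(x1 + z)) * f(tuple(x2 + z)) * chi(prefix, x_xor)
    return total / samples

def heavy_parities(f, n, theta, samples=2000):
    """Return candidate parity index vectors a with |fhat(a)| roughly >= theta.

    f is a membership oracle mapping an n-bit tuple to +1/-1.  The search grows
    prefixes bit by bit and keeps only those whose estimated Fourier weight
    could still contain a coefficient of magnitude at least theta.
    """
    candidates = [[]]
    for _ in range(n):
        survivors = []
        for a in candidates:
            for bit in (0, 1):
                prefix = a + [bit]
                if estimate_prefix_weight(f, n, prefix, samples) >= theta ** 2 / 2:
                    survivors.append(prefix)
        candidates = survivors
    return [tuple(a) for a in candidates]

# Toy check: plant a single parity on bits 0 and 2 of {0,1}^4 and recover it.
target = (1, 0, 1, 0)
oracle = lambda x: chi(target, x)                           # membership oracle
print(heavy_parities(oracle, 4, theta=0.5, samples=500))    # -> [(1, 0, 1, 0)]
```

Since the Fourier weight of a Boolean function totals 1, at most 2/theta^2 prefixes can pass the threshold at any level (given accurate estimates), which keeps the branching, and hence the query count, polynomial. The paper's weak learner must additionally handle the skewed distributions produced by the booster, which is where the nonuniform extension of the parity search comes in.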


Similar Resources

An Efficient Membership-Query Algorithm for Learning DNF with Respect to the Uniform Distribution

We present a membership-query algorithm for efficiently learning DNF with respect to the uniform distribution. In fact, the algorithm properly learns the more general class of functions that are computable as a majority of polynomially-many parity functions. We also describe extensions of this algorithm for learning DNF over certain nonuniform distributions and from noisy examples as well as f...


A Query Algorithm for Agnostically Learning DNF?

Motivation: One of the most celebrated results in computational learning theory is Jackson's query algorithm for PAC learning DNF formulas with respect to the uniform distribution [3]. A natural question is whether DNF formulas can be learned (even with queries and with respect to the uniform distribution) in a highly noisy setting, i.e., the well-known agnostic framework of learning [5]. Additi...


Efficiency and Computational Limitations of Learning Algorithms

This thesis presents new positive and negative results concerning the learnability of several well-studied function classes in the Probably Approximately Correct (PAC) model of learning. Learning Disjunctive Normal Form (DNF) expressions in the PAC model is widely considered to be the main open problem in Computational Learning Theory. We prove that PAC learning of DNF expressions by an algorit...


Attribute-Efficient and Non-adaptive Learning of Parities and DNF Expressions

We consider the problems of attribute-efficient PAC learning of two well-studied concept classes: parity functions and DNF expressions over {0,1}^n. We show that attribute-efficient learning of parities with respect to the uniform distribution is equivalent to decoding high-rate random linear codes from a low number of errors, a long-standing open problem in coding theory. This is the first eviden...


On Attribute Efficient and Non-adaptive Learning of Parities and DNF Expressions

We consider the problems of attribute-efficient PAC learning of two well-studied concept classes: parity functions and DNF expressions over {0, 1}^n. We show that attribute-efficient learning of parities with respect to the uniform distribution is equivalent to decoding high-rate random linear codes from a low number of errors, a long-standing open problem in coding theory. An algorithm is said to ...




Publication date: 1994