Inapproximability of VC Dimension and Littlestone's Dimension
Authors
Abstract
We study the complexity of computing the VC Dimension and Littlestone's Dimension. Given an explicit description of a finite universe and a concept class (a binary matrix whose (x,C)-th entry is 1 iff element x belongs to concept C), both can be computed exactly in quasi-polynomial time (n^{O(log n)}). Assuming the randomized Exponential Time Hypothesis (ETH), we prove nearly matching lower bounds on the running time that hold even for approximation algorithms.
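For concreteness, here is a minimal brute-force sketch of the explicit setting (illustrative code, not taken from the paper): a shattered set of size d requires 2^d distinct concepts, so the VC Dimension is at most log2 of the number of concepts and only small subsets need to be tested for shattering; on an n x n matrix this exhaustive search is the n^{O(log n)}-time algorithm the abstract refers to.

```python
from itertools import combinations
from math import floor, log2

def vc_dimension(matrix):
    """Brute-force VC Dimension of an explicitly given concept class.

    `matrix` is a list of rows, one per concept; matrix[C][x] is 1 iff
    element x of the universe belongs to concept C.
    """
    if not matrix:
        return 0
    universe = range(len(matrix[0]))
    # A shattered set of size d needs 2^d distinct concepts,
    # so the VC Dimension is at most log2(#concepts).
    max_d = floor(log2(len(matrix)))

    best = 0
    for d in range(1, max_d + 1):
        shattered_found = False
        for subset in combinations(universe, d):
            # Labelings realized by the concepts on this subset.
            patterns = {tuple(row[x] for x in subset) for row in matrix}
            if len(patterns) == 2 ** d:
                shattered_found = True
                break
        if not shattered_found:
            break
        best = d
    return best

# Toy example: concepts {}, {0}, {1}, {0,1} over the universe {0, 1, 2}.
example = [
    [0, 0, 0],
    [1, 0, 0],
    [0, 1, 0],
    [1, 1, 0],
]
print(vc_dimension(example))  # -> 2
```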
Similar resources
VC v. VCG: Inapproximability of Combinatorial Auctions via Generalizations of the VC Dimension
The existence of incentive-compatible computationally-efficient protocols for combinatorial auctions with decent approximation ratios is the paradigmatic problem in computational mechanism design. It is believed that in many cases good approximations for combinatorial auctions may be unattainable due to an inherent clash between truthfulness and computational efficiency. However, to date, resea...
Error Bounds for Real Function Classes Based on Discretized Vapnik-Chervonenkis Dimensions
The Vapnik-Chervonenkis (VC) dimension plays an important role in statistical learning theory. In this paper, we propose the discretized VC dimension obtained by discretizing the range of a real function class. Then, we point out that Sauer's Lemma is valid for the discretized VC dimension. We group the real function classes having infinite VC dimension into four categories by using the dis...
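Since the snippet above invokes Sauer's Lemma, a hedged illustration of the bound itself (a standard fact, independent of the paper's discretization): a class of VC dimension d realizes at most sum_{i=0}^{d} C(m, i) distinct labelings on any m points, which grows polynomially in m rather than like 2^m.

```python
from math import comb

def sauer_bound(m, d):
    """Sauer's Lemma: a class of VC dimension d realizes at most
    sum_{i=0}^{d} C(m, i) distinct labelings on any m points."""
    return sum(comb(m, i) for i in range(d + 1))

# With VC dimension 3 the growth is polynomial in m,
# far below the trivial 2^m bound.
for m in (5, 10, 20):
    print(m, sauer_bound(m, 3), 2 ** m)
# 5  26    32
# 10 176   1024
# 20 1351  1048576
```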
COS 511: Theoretical Machine Learning
The dot sign means inner product. If b is forced to be 0, the VC-dimension reduces to n. It is often the case that the VC-dimension is equal to the number of free parameters of a concept (for example, a rectangle's parameters are its topmost, bottommost, leftmost and rightmost bounds, and its VC-dimension is 4). However, it is not always true; there exist concepts with 1 parameter but an infin...
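The rectangle example above can be checked directly; the sketch below (illustrative, not from these lecture notes) verifies that axis-aligned rectangles shatter four points placed in a diamond, which gives the lower bound of 4 on their VC-dimension. Any rectangle containing a subset of the points must contain that subset's bounding box, so it suffices to test the bounding box of each subset.

```python
from itertools import combinations

def rectangles_shatter(points):
    """Check whether axis-aligned rectangles shatter the given 2D points:
    the bounding box of each non-empty subset must contain no other point."""
    # The empty subset is always realized by a degenerate rectangle,
    # so only non-empty subsets need checking.
    for r in range(1, len(points) + 1):
        for subset in combinations(points, r):
            xs = [p[0] for p in subset]
            ys = [p[1] for p in subset]
            inside = {p for p in points
                      if min(xs) <= p[0] <= max(xs) and min(ys) <= p[1] <= max(ys)}
            if inside != set(subset):
                return False
    return True

# Four points in a "diamond" are shattered, so the VC-dimension is at least 4.
print(rectangles_shatter([(0, 1), (0, -1), (1, 0), (-1, 0)]))  # -> True
```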
RIFIS Technical Report: Complexity of Computing Generalized VC-Dimensions
In the PAC-learning model, the Vapnik-Chervonenkis (VC) dimension plays a key role in estimating the polynomial-sample learnability of a class of binary functions. For a class of multi-valued functions, the notion has been generalized in various ways. This paper investigates the complexity of computing some of these generalized VC-dimensions: the VC*-dimension, the Ψ*-dimension, and the ΨG-dimension. For each dim...
On the VC-Dimension of Univariate Decision Trees
In this paper, we give and prove lower bounds on the VC-dimension of the univariate decision tree hypothesis class. The VC-dimension of a univariate decision tree depends on the VC-dimension values of its subtrees and the number of inputs. In our previous work (Aslan et al., 2009), we proposed a search algorithm that calculates the VC-dimension of univariate decision trees exhaustively. Using...
Journal:
Volume / Issue:
Pages: -
Publication date: 2017