Provable guarantees for decision tree induction: the agnostic setting

(2006.00743)
Published Jun 1, 2020 in cs.DS, cs.CC, and cs.LG

Abstract

We give strengthened provable guarantees on the performance of widely employed and empirically successful {\sl top-down decision tree learning heuristics}. While prior works have focused on the realizable setting, we consider the more realistic and challenging {\sl agnostic} setting. We show that for all monotone functions $f$ and parameters $s \in \mathbb{N}$, these heuristics construct a decision tree of size $s^{\tilde{O}((\log s)/\varepsilon^2)}$ that achieves error $\le \mathsf{opt}_s + \varepsilon$, where $\mathsf{opt}_s$ denotes the error of the optimal size-$s$ decision tree for $f$. Previously, such a guarantee was not known to be achievable by any algorithm, even one that is not based on top-down heuristics. We complement our algorithmic guarantee with a near-matching $s^{\tilde{\Omega}(\log s)}$ lower bound.
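The heuristics referred to here grow a tree greedily from the root, repeatedly splitting on the coordinate that most reduces an impurity measure (the template behind ID3/C4.5-style learners). As a rough illustration of that top-down template, here is a minimal Python sketch; the function names, the use of binary entropy as the impurity measure, and the recursive size-budgeting are illustrative assumptions, not the paper's exact algorithm or analysis.

```python
import numpy as np

def entropy(y):
    """Binary entropy of a 0/1 label vector, used here as the impurity measure."""
    if len(y) == 0:
        return 0.0
    p = float(np.mean(y))
    if p == 0.0 or p == 1.0:
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def best_split(X, y):
    """Greedy criterion: the coordinate whose split most reduces impurity, or None if none helps."""
    base = entropy(y)
    best_i, best_gain = None, 0.0
    for i in range(X.shape[1]):
        left, right = y[X[:, i] == 0], y[X[:, i] == 1]
        w = len(left) / len(y)
        gain = base - (w * entropy(left) + (1 - w) * entropy(right))
        if gain > best_gain:
            best_i, best_gain = i, gain
    return best_i

def build_top_down(X, y, size_budget):
    """Grow a tree top-down over {0,1}^n samples until the size budget is spent.

    Leaves are majority labels; internal nodes are (coordinate, 0-subtree, 1-subtree).
    """
    if size_budget <= 1 or len(np.unique(y)) <= 1:
        return int(round(float(np.mean(y))))  # leaf: majority vote
    i = best_split(X, y)
    if i is None:
        return int(round(float(np.mean(y))))  # no split improves impurity
    mask = X[:, i] == 1
    half = size_budget // 2  # simplistic budget split; see note below
    return (i,
            build_top_down(X[~mask], y[~mask], half),
            build_top_down(X[mask], y[mask], size_budget - half))

# Example: a noisy 5-bit majority (a monotone function) with 10% label corruption,
# loosely mimicking the agnostic setting in which labels need not match any tree exactly.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(2000, 5))
y = (X.sum(axis=1) >= 3).astype(int)
y ^= (rng.random(2000) < 0.1)
tree = build_top_down(X, y, size_budget=16)
```

Practical implementations typically grow the tree best-first, always expanding the leaf offering the largest impurity reduction, a detail this sketch simplifies by halving the size budget at each split; the abstract's guarantee concerns trees produced by heuristics of this greedy top-down form.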
