Universal guarantees for decision tree induction via a higher-order splitting criterion (2010.08633v1)

Published 16 Oct 2020 in cs.LG, cs.DS, and stat.ML

Abstract: We propose a simple extension of top-down decision tree learning heuristics such as ID3, C4.5, and CART. Our algorithm achieves provable guarantees for all target functions $f: \{-1,1\}^n \to \{-1,1\}$ with respect to the uniform distribution, circumventing impossibility results showing that existing heuristics fare poorly even for simple target functions. The crux of our extension is a new splitting criterion that takes into account the correlations between $f$ and small subsets of its attributes. The splitting criteria of existing heuristics (e.g. Gini impurity and information gain), in contrast, are based solely on the correlations between $f$ and its individual attributes. Our algorithm satisfies the following guarantee: for all target functions $f : \{-1,1\}^n \to \{-1,1\}$, sizes $s \in \mathbb{N}$, and error parameters $\epsilon$, it constructs a decision tree of size $s^{\tilde{O}((\log s)^2/\epsilon^2)}$ that achieves error $\le O(\mathsf{opt}_s) + \epsilon$, where $\mathsf{opt}_s$ denotes the error of the optimal size-$s$ decision tree. A key technical notion that drives our analysis is the noise stability of $f$, a well-studied smoothness measure.
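The contrast between first-order and higher-order splitting criteria can be made concrete. Below is a minimal, illustrative Python sketch, not the paper's exact algorithm: `first_order_scores` ranks attributes by their individual correlation with the target, which is the signal that criteria like Gini impurity and information gain rely on over the uniform distribution, while `higher_order_scores` also examines small subsets of attributes. The function names and the particular subset-scoring rule (max correlation over small subsets containing an attribute) are assumptions made for illustration.

```python
import itertools

import numpy as np


def fourier_correlation(X, y, S):
    """Estimate E[f(x) * prod_{i in S} x_i] from samples.

    X: (m, n) array with entries in {-1, +1}; y: (m,) labels in {-1, +1}.
    Under the uniform distribution this is the Fourier coefficient of f at S.
    """
    chi_S = np.prod(X[:, list(S)], axis=1)
    return float(np.mean(y * chi_S))


def first_order_scores(X, y):
    """Score each attribute by its individual correlation with the target,
    mirroring the signal used by criteria such as Gini impurity and
    information gain."""
    n = X.shape[1]
    return {i: abs(fourier_correlation(X, y, (i,))) for i in range(n)}


def higher_order_scores(X, y, d=2):
    """Illustrative higher-order criterion (an assumption, not the paper's
    exact rule): score attribute i by the largest estimated correlation
    over subsets of size <= d that contain i."""
    n = X.shape[1]
    scores = {i: 0.0 for i in range(n)}
    for size in range(1, d + 1):
        for S in itertools.combinations(range(n), size):
            c = abs(fourier_correlation(X, y, S))
            for i in S:
                scores[i] = max(scores[i], c)
    return scores


# Example: two-bit parity fools first-order criteria but not higher-order ones.
rng = np.random.default_rng(0)
X = rng.choice([-1, 1], size=(10_000, 4))
y = X[:, 0] * X[:, 1]
print(first_order_scores(X, y))   # all scores near 0: no attribute to split on
print(higher_order_scores(X, y))  # attributes 0 and 1 score near 1
```

The parity example shows why first-order criteria can fail even on simple targets: for $f(x) = x_1 x_2$, every individual attribute has correlation zero with $f$, so Gini-style criteria see no useful split, whereas the pair $\{x_1, x_2\}$ has correlation one, which a subset-aware criterion detects immediately.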

Citations (9)
