Abstract

Conformal predictors are an important class of algorithms that allow predictions to be made with a user-defined confidence level. They achieve this by outputting prediction sets rather than simple point predictions. The conformal predictor is valid in the sense that the accuracy of its predictions is guaranteed to meet the confidence level, assuming only exchangeability of the data. Since accuracy is guaranteed, the performance of a conformal predictor is instead measured through the efficiency of its prediction sets. Typically, a conformal predictor is built on an underlying machine learning algorithm, from which it inherits its predictive power. However, because the underlying algorithm is not trained with predictive efficiency as its objective, the resulting conformal predictor may be sub-optimal and insufficiently aligned with this objective. In this study we therefore consider an approach that trains the conformal predictor directly, with maximum predictive efficiency as the optimization objective, focusing specifically on the inductive conformal predictor for classification. To do this, the conformal predictor is approximated by a differentiable objective function, which is optimized by gradient descent. The resulting parameter estimates are then passed to a proper inductive conformal predictor to produce valid prediction sets. We test the method on several real-world data sets and find it promising, in most cases giving improved predictive efficiency over a baseline conformal predictor.
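To make the setup concrete, the following is a minimal sketch of a standard inductive conformal classifier of the kind used as a baseline here, not the paper's efficiency-trained variant. The nearest-centroid nonconformity score and the function name `icp_classify` are illustrative assumptions; the paper does not specify an underlying model.

```python
import numpy as np

def icp_classify(x_train, y_train, x_cal, y_cal, x_test, epsilon=0.1):
    """Minimal inductive conformal classifier (illustrative sketch).

    Nonconformity score: distance to the class centroid of a simple
    nearest-centroid model fitted on the proper training set. A label is
    included in the prediction set when its conformal p-value exceeds the
    significance level epsilon (confidence level 1 - epsilon).
    """
    classes = np.unique(y_train)
    centroids = {c: x_train[y_train == c].mean(axis=0) for c in classes}

    def score(x, c):
        return np.linalg.norm(x - centroids[c])

    # One calibration score per calibration example, using its true label.
    cal_scores = np.array([score(x, y) for x, y in zip(x_cal, y_cal)])
    n = len(cal_scores)

    pred_sets = []
    for x in x_test:
        labels = set()
        for c in classes:
            # Smoothed p-value: fraction of calibration scores at least as
            # nonconforming as the candidate label's score.
            p = (np.sum(cal_scores >= score(x, c)) + 1) / (n + 1)
            if p > epsilon:
                labels.add(int(c))
        pred_sets.append(labels)
    return pred_sets
```

The efficiency of such a predictor would then be measured by, for example, the average size of the returned prediction sets; the paper's proposal is to choose the underlying model's parameters by gradient descent on a differentiable surrogate of this quantity.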
