Abstract

The past decade has seen an enormous rise in the popularity of deep learning and neural networks. These algorithms have broken many previous records and achieved remarkable results, and their outstanding performance has significantly accelerated the progress of AI, with various milestones reached earlier than expected. However, on relatively small datasets, Deep Neural Networks (DNNs) may suffer from reduced accuracy compared to other machine learning models. Furthermore, it is difficult to construct prediction intervals or evaluate the uncertainty of predictions in regression tasks. In this paper, we propose an ensemble method that estimates the uncertainty of predictions, increases their accuracy, and provides an interval for the expected variation. Whereas traditional DNNs provide only a point prediction, the proposed method outputs a prediction interval by combining DNNs, extreme gradient boosting (XGBoost), and dissimilarity computation techniques. Despite its simple design, this approach significantly increases accuracy on small datasets without adding much complexity to the architecture of the neural network. We test the proposed method on various datasets and observe a significant improvement in the performance of the neural network model. The model's prediction interval covers the ground-truth value at an average rate of 71% and 78% for training sizes of 90% and 55%, respectively. Finally, we highlight other aspects and applications of the approach, including experimental error estimation and transfer learning.
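The abstract describes combining a DNN, XGBoost, and a dissimilarity measure to produce a prediction interval. Below is a minimal sketch of that idea, assuming a simple averaging ensemble and a nearest-neighbor dissimilarity rule that widens the interval for test points far from the training data; the model choices, the interval formula, and all hyperparameters are illustrative assumptions, not the paper's exact method.

```python
# Illustrative sketch only: average a small neural network and XGBoost, then
# widen the prediction interval for points that are dissimilar (distant, in
# feature space) from the training set. Not the authors' exact algorithm.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor  # stand-in for a deeper DNN
from sklearn.neighbors import NearestNeighbors
from xgboost import XGBRegressor

X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=0.55, random_state=0)

# Base learners: a small neural network and a gradient-boosted tree ensemble.
nn = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000,
                  random_state=0).fit(X_tr, y_tr)
xgb = XGBRegressor(n_estimators=300, max_depth=4, learning_rate=0.05,
                   random_state=0).fit(X_tr, y_tr)

# Ensemble point prediction: simple average of the two models (an assumption).
pred = 0.5 * nn.predict(X_te) + 0.5 * xgb.predict(X_te)

# Dissimilarity term: mean distance to the k nearest training points,
# normalized by the same quantity measured on the training set itself.
k = 5
knn = NearestNeighbors(n_neighbors=k).fit(X_tr)
d_te = knn.kneighbors(X_te)[0].mean(axis=1)
d_tr = knn.kneighbors(X_tr)[0][:, 1:].mean(axis=1)  # drop self-distance (0)

# Base interval half-width from training residuals, scaled up for
# dissimilar test points (an illustrative interval rule).
resid = y_tr - 0.5 * (nn.predict(X_tr) + xgb.predict(X_tr))
width = np.std(resid) * (1.0 + d_te / d_tr.mean())
lower, upper = pred - width, pred + width

coverage = np.mean((y_te >= lower) & (y_te <= upper))
print(f"interval coverage on held-out data: {coverage:.2f}")
```

With this rule, points near the training distribution get intervals close to the residual spread, while out-of-distribution points get proportionally wider ones, which mirrors the intuition behind using a dissimilarity computation to quantify uncertainty.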
