Confidence-Nets: A Step Towards better Prediction Intervals for regression Neural Networks on small datasets (2210.17092v1)

Published 31 Oct 2022 in cs.LG and cs.AI

Abstract: The past decade has seen an enormous rise in the popularity of deep learning and neural networks. These algorithms have broken many previous records and achieved remarkable results, and their outstanding performance has significantly accelerated progress in AI, with various milestones reached earlier than expected. However, on relatively small datasets, Deep Neural Networks (DNNs) may suffer reduced accuracy compared to other machine learning models. Furthermore, it is difficult to construct prediction intervals or evaluate the uncertainty of predictions in regression tasks. In this paper, we propose an ensemble method that estimates the uncertainty of predictions, increases their accuracy, and provides an interval for the expected variation. Whereas traditional DNNs provide only a point prediction, our proposed method outputs a prediction interval by combining DNNs, extreme gradient boosting (XGBoost), and dissimilarity computation techniques. Despite its simple design, this approach significantly increases accuracy on small datasets and adds little complexity to the neural network architecture. The proposed method is tested on various datasets and yields a significant improvement in the performance of the neural network model. The model's prediction interval covers the ground truth value at average rates of 71% and 78% for training sizes of 90% and 55%, respectively. Finally, we highlight further aspects and applications of the approach, including experimental error estimation and transfer learning.
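To make the high-level idea concrete, below is a minimal sketch of one way such an ensemble could be assembled. The abstract does not specify the combination rule or the dissimilarity measure, so the simple prediction averaging, the k-nearest-neighbour residual heuristic, the synthetic data, and all hyperparameters here are illustrative assumptions, not the paper's actual algorithm; the sketch also assumes scikit-learn and xgboost are installed.

```python
# Illustrative sketch only: combine a neural network and XGBoost, then use a
# dissimilarity measure to turn nearby training residuals into a prediction
# interval. The averaging rule, the k-NN heuristic, and all hyperparameters
# are assumptions for illustration, not the authors' method.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from xgboost import XGBRegressor

rng = np.random.default_rng(0)

# Small synthetic regression dataset (the small-data setting the paper targets).
X = rng.uniform(-3, 3, size=(300, 4))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + rng.normal(0, 0.2, size=300)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=0.55, random_state=0)

# Two base learners: a small neural network and an XGBoost regressor.
nn = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000,
                  random_state=0).fit(X_tr, y_tr)
xgb = XGBRegressor(n_estimators=200, max_depth=3,
                   learning_rate=0.1).fit(X_tr, y_tr)

# Ensemble point prediction: a simple average of the two learners (assumed).
pred_tr = 0.5 * (nn.predict(X_tr) + xgb.predict(X_tr))
pred_te = 0.5 * (nn.predict(X_te) + xgb.predict(X_te))
resid_tr = np.abs(y_tr - pred_tr)

def interval_half_width(x, k=15):
    """Half-width from residuals of the k most similar training points,
    using Euclidean distance as a stand-in dissimilarity measure."""
    d = np.linalg.norm(X_tr - x, axis=1)   # dissimilarity to training set
    nearest = np.argsort(d)[:k]
    return resid_tr[nearest].mean() + resid_tr[nearest].std()

half = np.array([interval_half_width(x) for x in X_te])
lo, hi = pred_te - half, pred_te + half

# Empirical coverage: fraction of test targets inside their interval.
coverage = np.mean((y_te >= lo) & (y_te <= hi))
print(f"empirical coverage: {coverage:.2f}")
```

In this sketch the interval half-width grows wherever the residuals of similar training points are large or dispersed, which is one plausible reading of how a dissimilarity computation could gauge local prediction uncertainty.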
