
Scale Calibrated Training: Improving Generalization of Deep Networks via Scale-Specific Normalization (1909.00182v2)

Published 31 Aug 2019 in cs.CV and cs.LG

Abstract: Standard convolutional neural networks (CNNs) require consistent image resolutions in both the training and testing phases. In practice, however, testing with smaller image sizes is necessary for fast inference. We show that naively evaluating low-resolution images on networks trained with high-resolution images results in a catastrophic accuracy drop in standard CNN architectures. We propose a novel training regime called Scale-calibrated Training (SCT), which allows networks to learn from various input scales simultaneously. By taking advantage of SCT, a single network can provide decent accuracy at test time across multiple test scales. In our analysis, we surprisingly find that vanilla batch normalization can lead to sub-optimal performance in SCT. Therefore, a novel normalization scheme called Scale-Specific Batch Normalization is incorporated into SCT in place of standard batch normalization. Experimental results show that SCT improves the accuracy of a single ResNet-50 on ImageNet by 1.7% and 11.5% when testing at image sizes of 224 and 128, respectively.
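The paper itself gives the details, but a minimal PyTorch sketch of the scale-specific normalization idea might look like the following. The scale set, the helper names (`ScaleSpecificBN2d`, `set_scale`, `sct_step`), and the uniform random-scale sampling are illustrative assumptions, not the authors' exact recipe.

```python
import random

import torch.nn as nn
import torch.nn.functional as F


class ScaleSpecificBN2d(nn.Module):
    """One BatchNorm2d branch per training scale, so the running statistics
    gathered at one resolution never contaminate another's."""

    def __init__(self, num_features, scales=(224, 192, 160, 128)):
        super().__init__()
        # Keys must be strings for ModuleDict; the scale set is assumed.
        self.bns = nn.ModuleDict(
            {str(s): nn.BatchNorm2d(num_features) for s in scales}
        )
        self.active_scale = str(scales[0])  # selected externally per batch

    def forward(self, x):
        # Route the batch through the branch matching its input resolution.
        return self.bns[self.active_scale](x)


def set_scale(model, scale):
    # Point every scale-specific BN layer in the model at the given scale.
    for m in model.modules():
        if isinstance(m, ScaleSpecificBN2d):
            m.active_scale = str(scale)


def sct_step(model, images, targets, criterion, optimizer,
             scales=(224, 192, 160, 128)):
    """One scale-calibrated training step: sample a scale, resize the batch
    to it, and normalize with that scale's dedicated BN branch."""
    s = random.choice(scales)
    set_scale(model, s)
    resized = F.interpolate(images, size=(s, s), mode="bilinear",
                            align_corners=False)
    loss = criterion(model(resized), targets)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

At test time one would call `set_scale(model, s)` once for the chosen evaluation resolution, so the same convolutional weights are reused while only the normalization statistics switch, which is what lets a single network serve multiple test scales.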

Citations (3)
