Remote Sensing Object Counting with Online Knowledge Learning (2303.10318v2)

Published 18 Mar 2023 in cs.CV

Abstract: Efficient models for remote sensing object counting are urgently needed in scenarios with limited computing resources, such as drones or embedded systems. A straightforward yet powerful technique for achieving this is knowledge distillation, which steers the learning of a student network by leveraging the experience of an already-trained teacher network. However, it faces two challenges. First, because of its two-stage training nature, it requires a longer training period, especially as the number of training samples increases. Second, although teacher networks are proficient at transmitting their assimilated knowledge, they tend to overlook the latent insights gained during their own learning process. To address these challenges, we introduce an online distillation learning method for remote sensing object counting. It builds an end-to-end training framework that seamlessly integrates two distinct networks into a unified one, comprising a shared shallow module, a teacher branch, and a student branch. The shared module, serving as the foundation for both branches, learns primitive features. The teacher branch exploits prior knowledge to ease learning and guides the student branch online. In parallel, the student branch achieves parameter reduction and rapid inference through channel reduction. This design lets the student branch not only receive the teacher branch's distilled knowledge but also tap into the latent knowledge the teacher branch accumulates during training. Moreover, we propose a relation-in-relation distillation method that allows the student branch to capture how the relationships among intra-layer teacher features evolve across different layers. Extensive experiments demonstrate the effectiveness of our method.
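The abstract describes the framework only at a high level. The following PyTorch-style sketch illustrates the general idea of a shared shallow module feeding a wide teacher branch and a channel-reduced student branch that are trained jointly in a single end-to-end pass. All module sizes, layer counts, loss choices, and weights are illustrative assumptions rather than the paper's configuration, and the proposed relation-in-relation distillation loss is not reproduced here; a simple output-level distillation term stands in for it.

```python
# Minimal sketch of an online-distillation counting network, assuming a
# density-map regression setup. Architecture details are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

def conv_block(in_ch, out_ch):
    return nn.Sequential(nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True))

class OnlineDistillCounter(nn.Module):
    def __init__(self, teacher_ch=64, student_ch=16):
        super().__init__()
        # Shared shallow module: learns primitive features used by both branches.
        self.shared = nn.Sequential(conv_block(3, 32), conv_block(32, 32))
        # Teacher branch: wider channels, guides the student during training.
        self.teacher = nn.Sequential(conv_block(32, teacher_ch),
                                     conv_block(teacher_ch, teacher_ch),
                                     nn.Conv2d(teacher_ch, 1, 1))
        # Student branch: channel-reduced for fast inference on embedded devices.
        self.student = nn.Sequential(conv_block(32, student_ch),
                                     conv_block(student_ch, student_ch),
                                     nn.Conv2d(student_ch, 1, 1))

    def forward(self, x):
        base = self.shared(x)
        return self.teacher(base), self.student(base)

def training_step(model, images, gt_density, distill_weight=0.5):
    """One end-to-end step: both branches learn counting, and the student is
    additionally pulled toward the teacher's current (detached) prediction.
    This output-level term is a generic stand-in, not the paper's
    relation-in-relation distillation loss."""
    t_out, s_out = model(images)
    count_loss = F.mse_loss(t_out, gt_density) + F.mse_loss(s_out, gt_density)
    distill_loss = F.mse_loss(s_out, t_out.detach())
    return count_loss + distill_weight * distill_loss

# Example usage (shapes only; real training would iterate over a counting dataset):
# model = OnlineDistillCounter()
# x = torch.randn(2, 3, 128, 128); gt = torch.rand(2, 1, 128, 128)
# training_step(model, x, gt).backward()
```

At inference time only the shared module and the student branch would be kept, which is where the parameter and latency savings for drones or embedded systems would come from under this sketch.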
