Knowledge distillation with error-correcting transfer learning for wind power prediction (2204.00649v1)

Published 1 Apr 2022 in cs.LG and eess.SP

Abstract: Wind power prediction, especially at the turbine level, is vital for the operation, controllability, and economy of electricity companies. Hybrid methodologies combining advanced data science with weather forecasting have been increasingly applied to such predictions. Nevertheless, modeling massive numbers of turbines individually from scratch and downscaling weather forecasts to turbine scale are neither easy nor economical. To address this, the paper proposes a novel framework with mathematical underpinnings for turbine power prediction. The framework is the first to incorporate knowledge distillation into energy forecasting, enabling accurate and economical construction of turbine models that learn from a well-established park model. In addition, park-scale weather forecasts are implicitly mapped to individual turbines by transfer learning of predicted power errors, correcting the models for better performance. The proposed framework is deployed on five turbines spanning various terrains in an Arctic wind park, and the results are evaluated against competitors in an ablation study. The major findings reveal that the proposed framework, tuned with favorable knowledge distillation and transfer learning parameters, yields performance boosts of 3.3% to 23.9% over its competitors. This advantage also holds in terms of wind energy physics and computing efficiency, as verified by the prediction quality rate and calculation time.
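
To make the two ingredients concrete, below is a minimal, self-contained sketch of what the abstract describes: a park-level "teacher" model distilled into a lighter turbine-level "student", followed by a small correction model fitted to the student's residual errors as a stand-in for the error-correcting transfer-learning step. This is not the paper's implementation; the architectures, the loss weight `alpha`, the `corrector` model, and the synthetic data are all illustrative assumptions.

```python
# Hypothetical sketch: park-level teacher -> turbine-level student (distillation),
# then a residual-error corrector (stand-in for error-correcting transfer learning).
import torch
import torch.nn as nn

torch.manual_seed(0)

# Synthetic stand-ins: park-scale weather forecast features and turbine power targets.
n, d = 512, 8
weather = torch.randn(n, d)
turbine_power = weather @ torch.randn(d, 1) * 0.3 + 0.1 * torch.randn(n, 1)

def mlp(d_in, d_hidden=32):
    return nn.Sequential(nn.Linear(d_in, d_hidden), nn.ReLU(), nn.Linear(d_hidden, 1))

# 1) "Teacher": a well-established park-level model (here simply pre-fitted on the same data).
teacher = mlp(d)
opt_t = torch.optim.Adam(teacher.parameters(), lr=1e-2)
for _ in range(200):
    opt_t.zero_grad()
    nn.functional.mse_loss(teacher(weather), turbine_power).backward()
    opt_t.step()
teacher.eval()

# 2) "Student": a lighter turbine-level model trained with a distillation loss that
#    mixes the ground-truth loss with a teacher-mimicking loss (alpha is illustrative).
student = mlp(d, d_hidden=8)
opt_s = torch.optim.Adam(student.parameters(), lr=1e-2)
alpha = 0.7
with torch.no_grad():
    soft_targets = teacher(weather)
for _ in range(200):
    opt_s.zero_grad()
    pred = student(weather)
    loss = alpha * nn.functional.mse_loss(pred, turbine_power) \
         + (1 - alpha) * nn.functional.mse_loss(pred, soft_targets)
    loss.backward()
    opt_s.step()

# 3) Error correction: a small model maps park-scale features to the student's
#    residual errors; its output is added back to correct the prediction.
with torch.no_grad():
    residual = turbine_power - student(weather)
corrector = mlp(d, d_hidden=8)
opt_c = torch.optim.Adam(corrector.parameters(), lr=1e-2)
for _ in range(200):
    opt_c.zero_grad()
    nn.functional.mse_loss(corrector(weather), residual).backward()
    opt_c.step()

with torch.no_grad():
    corrected = student(weather) + corrector(weather)
    print("student MSE:  ", nn.functional.mse_loss(student(weather), turbine_power).item())
    print("corrected MSE:", nn.functional.mse_loss(corrected, turbine_power).item())
```

In this sketch the correction model never sees turbine-specific weather inputs, only park-scale features, which mirrors the abstract's point that park-scale forecasts are mapped to turbines implicitly through the predicted power errors rather than by explicit downscaling.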

Citations (7)


Authors (1)
