
LoadCNN: A Low Training Cost Deep Learning Model for Day-Ahead Individual Residential Load Forecasting (1908.00298v3)

Published 1 Aug 2019 in eess.SP, cs.LG, and cs.NE

Abstract: Accurate day-ahead forecasting of individual residential loads is important to many day-ahead-market applications in the smart grid. Deep learning has shown strong performance on load forecasting tasks, but it is computationally hungry: training a deep model carries high costs in time, energy, and CO2 emissions, which aggravates the energy crisis, burdens the environment, and makes such methods hard to popularize and deploy in real smart-grid settings. In this paper, we propose LoadCNN, a convolutional-neural-network model for next-day load forecasting of individual residents with low training cost. Experiments show that the training time of LoadCNN is only about 1/54 of that of other state-of-the-art models, and its energy consumption and CO2 emissions are only about 1/45 of theirs under the same indicators, while its prediction accuracy matches that of current state-of-the-art models. LoadCNN is thus the first load forecasting model to achieve both high prediction accuracy and low training cost: an efficient, green model that can be deployed quickly, cost-effectively, and in an environmentally friendly way in a realistic smart grid environment.
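The abstract describes a CNN that maps recent consumption history to a next-day load profile. The paper's exact LoadCNN architecture is not given on this page, so the following is only an illustrative sketch of that kind of model: a single 1-D convolution with ReLU followed by a linear head. All specifics here (half-hourly readings, a one-week history window, kernel width 5, four filters) are assumptions for illustration, not details from the paper.

```python
import random

def conv1d(x, kernels, bias):
    """Valid 1-D convolution: x is a list of T values, kernels is a list of
    K weight windows of width W; returns K feature rows of length T-W+1."""
    W = len(kernels[0])
    out = []
    for k, kern in enumerate(kernels):
        row = []
        for t in range(len(x) - W + 1):
            row.append(sum(x[t + i] * kern[i] for i in range(W)) + bias[k])
        out.append(row)
    return out

def forecast_next_day(history, kernels, bias, head_w, head_b):
    """Map one week of half-hourly load (7*48 values) to a 48-point
    day-ahead forecast (illustrative shapes, not the paper's)."""
    # ReLU over the flattened convolutional feature map
    feat = [max(v, 0.0) for row in conv1d(history, kernels, bias) for v in row]
    # Linear head: one output per half-hour of the next day
    return [sum(f * w for f, w in zip(feat, col)) + b
            for col, b in zip(head_w, head_b)]

rng = random.Random(0)
history = [rng.random() for _ in range(7 * 48)]            # one week of readings
kernels = [[rng.gauss(0, 0.1) for _ in range(5)] for _ in range(4)]
bias = [0.0] * 4
feat_len = 4 * (7 * 48 - 5 + 1)                            # 4 filters, valid conv
head_w = [[rng.gauss(0, 0.01) for _ in range(feat_len)] for _ in range(48)]
head_b = [0.0] * 48

pred = forecast_next_day(history, kernels, bias, head_w, head_b)
print(len(pred))  # 48 half-hourly predictions for the next day
```

In a real implementation the weights would of course be learned (the paper's point being that this training can be made cheap); the sketch only shows the forward pass and the input/output shapes such a model relates.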

Authors (7)
  1. Yunyou Huang (20 papers)
  2. Nana Wang (12 papers)
  3. Wanling Gao (47 papers)
  4. Xiaoxu Guo (2 papers)
  5. Cheng Huang (56 papers)
  6. Tianshu Hao (10 papers)
  7. Jianfeng Zhan (92 papers)
Citations (5)