
Incremental Learning for the Improvement of Ampacity Predictions over Time

Gabriela Molinar, Corinna Übel, Wilhelm Stork
Karlsruhe Institute of Technology, Karlsruhe, Germany
ORCID: 0000-0003-0875-4125, 0000-0002-9018-7416, 0000-0003-0579-4615

Abstract—An incremental learning system has been developed to reduce the time needed to begin operations of a monitoring system. The system consists of weather stations installed at two electrical towers at different topographical locations. Based on weather simulations and numerical weather predictions, a machine learning ampacity forecasting system was pre-trained. Afterwards, monthly weather measurements were added to the training process to analyze the accuracy improvement of the ampacity predictions. In this study, feedforward neural networks and gradient boosting regressors form the base of the machine learning system. The models achieve an improvement of up to 20% after six months of incremental dataset expansion. Machine learning shows accuracy improvements of the ampacity predictions especially when the considered location is surrounded by trees. The implementation of incremental learning reduces the time to operation of a newly installed, weather-based overhead line monitoring system. This fosters its implementation and use as an optimization method for electrical networks.

Index Terms—Ampacity forecast, incremental learning, overhead line monitoring, dynamic line rating, numerical weather predictions.

I. INTRODUCTION

Monitoring systems for overhead lines are gaining importance. In the German energy transition plan, network optimization has the highest priority, ahead of network expansion and construction [1]. The market offers either sensors to be installed directly on the overhead line, or weather stations to calculate the current carrying capacity, also called ampacity, using the heat balance equation. Transmission System Operators (TSOs) require not only the real-time ampacity but also its prediction. Only with this information is it possible for them to increase the transmission capacity of the lines. An ampacity forecasting system is necessary to arrange better dispatching plans and hence to decrease the amount of redispatched electrical power.

This article was developed within the framework of the research project PrognoNetz at the Karlsruhe Institute of Technology. One of the key research points of this project is the development of machine learning models for ampacity forecasting. As demonstrated in [2], [3] and [4], the approach considers the weather conditions in the surroundings of the conductor to improve the accuracy of the ampacity predictions. Until now, the smallest required training dataset was one year of historical weather measurements [3], as well as historical weather predictions [4]. From a practical point of view, a waiting period of one year to get the system running is too long. TSOs require a monitoring system that can begin working as soon as the installation is completed. That requirement raises the need to develop an incremental learning solution for the existing models.

The incremental learning approach described in this paper considers a pre-training phase on weather simulations. Afterwards, a periodic training phase on the collected measurement data begins. Our hypothesis is that the accuracy of the forecasting system will improve with each incremental training step, until it achieves its best performance.

The next chapters are organized according to the machine learning development process QUA3CK [5]: first, we present a description of the case study, followed by the dataset preparation step. Next, the individual models are explained. Finally, we discuss and compare the results and present our conclusions.

This project is financed by the German Federal Ministry for Economic Affairs and Energy (BMWi).

II. CASE STUDY

This study is based on six months of weather measurements from weather stations located at two electrical towers on two different overhead lines in Germany. It also considers four months of weather simulations at the same places, covering the time period before the installation of the weather stations. The whole dataset covers the period from April 2020 until February 2021, as shown in Fig. 1. The idea is to simulate the case of a newly installed monitoring system.

Fig. 1. Time diagram for the weather dataset.

For the scope of this article, two topographically different locations are compared: one weather station located in a forest, surrounded by trees (station A, measurement height 19 m above ground), and another in grassland, without surrounding
trees or buildings (station B, measurement height 23 m above ground). These two places were selected as the most difficult and the easiest locations to model among all studied places (according to an initial study based on ten weather stations spread over two overhead lines in Germany).

The measured weather parameters correspond to those necessary to calculate the ampacity, i.e. ambient temperature, wind direction, wind speed and solar radiation. The ampacity was calculated with the heat balance equation from CIGRÉ Technical Brochure 601.
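In its steady-state form, this balance equates the heating and cooling of the conductor; a simplified sketch in CIGRÉ-style notation (neglecting corona, evaporative and magnetic terms) is:

    % Steady-state heat balance: Joule and solar heating equal convective and radiative cooling
    I^2 \, R_{\mathrm{ac}}(T_c) + P_S = P_c + P_r
    % Ampacity: the current that holds the conductor at its maximum allowed temperature
    I_{\max} = \sqrt{\frac{P_c + P_r - P_S}{R_{\mathrm{ac}}(T_{\max})}}

where P_S, P_c and P_r denote the solar heat gain and the convective and radiative cooling per unit length, and R_ac(T) is the temperature-dependent AC resistance; the full term definitions and correction factors are given in the CIGRÉ brochure.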
The models used for this study were selected based on the knowledge gained over the last years of research. The two best performing approaches so far are compared in this article: Feedforward Neural Networks (FNNs) based on Numerical Weather Predictions (NWPs) [4] and Quantile Regression Forests (QRFs) based on historical weather measurements [3].

A library implementing QRFs is currently available for the programming language R. However, the developments of the PrognoNetz project are based on Python. Therefore, the QRFs were migrated to Gradient Boosting Regressors (GBRs), which are also tree-based regressors, implemented in the CatBoost Python library [6].
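As an illustration, a quantile-flavored GBR of the kind described above can be set up in CatBoost roughly as follows; this is a minimal sketch with synthetic data, and the hyperparameters and feature layout are assumptions for illustration, not the configuration used in PrognoNetz:

    import numpy as np
    from catboost import CatBoostRegressor

    # Synthetic stand-ins for weather features and the ampacity label.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 4))   # e.g. temperature, wind speed, wind direction, radiation
    y = 1500.0 + 300.0 * X[:, 1] + rng.normal(scale=50.0, size=1000)

    # One CatBoost model per quantile mimics the QRF-style quantile output.
    models = {}
    for alpha in (0.25, 0.5, 0.75):
        model = CatBoostRegressor(
            iterations=300,
            depth=6,
            loss_function=f"Quantile:alpha={alpha}",  # pinball loss for the given quantile
            verbose=False,
        )
        model.fit(X, y)
        models[alpha] = model

    median_forecast = models[0.5].predict(X[:5])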
III. DATA PREPARATION

The incremental learning approach was developed as an autonomous system: it is installed once by the TSO and then runs by itself, without human intervention, as also specified in the state of the art [7].

At every pre-defined time interval, e.g. every month, the system downloads the latest weather measurements and predictions. This step increases the size of the dataset and gives the opportunity to continue or to repeat the training process. Once the data has been downloaded, it has to be transformed so that it can be understood by the models, i.e. it has to match their input data format.

Before training, the integrity of the new dataset has to be checked, i.e. whether all weather parameters are present. Polynomial interpolation is used to clean the data, since large ranges of values can be missing from the dataset due to communication errors of the sensors.

After scaling with the normalization method, the dataset is separated into a training, a validation and a test set. With this split, the dataset is ready for training the models.
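A condensed sketch of this preparation pipeline could look as follows; the column names, the interpolation order and the split ratios are assumptions for illustration, not the project's actual values:

    import numpy as np
    import pandas as pd
    from sklearn.preprocessing import MinMaxScaler

    # Hypothetical weather measurements with gaps from communication errors.
    df = pd.DataFrame({
        "temperature": [12.1, np.nan, np.nan, 13.4, 14.0, 13.8],
        "wind_speed": [2.3, 2.1, np.nan, 1.9, 2.2, 2.4],
    })

    # Fill missing ranges with polynomial interpolation (requires SciPy).
    df = df.interpolate(method="polynomial", order=2)

    # Normalize all features to the range [0, 1].
    scaled = MinMaxScaler().fit_transform(df)

    # Chronological split into training, validation and test sets (70/15/15 assumed).
    n = len(scaled)
    train, val, test = np.split(scaled, [int(0.7 * n), int(0.85 * n)])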
IV. MODELS AND THEIR INCREMENTAL LEARNING

A. Reference model

The reference model reflects a 48-hour ampacity prediction system without any machine learning. It serves as the basis for comparison with the incremental learning models.

As explained in Section II, there are two weather datasets available for training: one with weather simulations, for the pre-training of the machine learning models, and another with weather measurements, for the incremental learning.

The reference model generates 48-hour ampacity forecasts based on NWPs, covering the whole time period of the case study. The accuracy of this model is then calculated by comparing it directly to the weather simulations or the measurements, which are considered the true values (or labels) for their corresponding period.

The reference model on station A behaves as shown in the error distribution plots in Fig. 2, with respect to the simulated dataset, and in Fig. 3, with respect to the weather measurements. Both plots show the 48-hour ampacity prediction and its error distribution, considering minimum, maximum, mean, and the 0.25- and 0.75-quantiles. There is a strong difference between the two figures. The NWP-based reference model overestimates on average by around 200 A when compared to the actual measurements (Fig. 3). In contrast, the mean of the difference between the reference model and the simulated weather dataset stays near zero. This can be explained by the fact that the NWPs and the simulated weather data come from the same atmospheric model.

Fig. 2. Reference model on station A for ampacity forecasting compared to the ampacity calculated from simulated weather scenarios.

Fig. 3. Reference model on station A for ampacity forecasting compared to the ampacity calculated from weather measurements.

On the other hand, comparing the reference model again to the simulated data (Fig. 4) and the weather measurements (Fig. 5), but this time on station B, the ampacity prediction of the reference model stays in both cases at a mean around zero. It slightly overestimates compared to the simulated weather scenario, and it slightly underestimates compared to the ampacity calculated from the weather measurements.

Fig. 4. Reference model on station B for ampacity forecasting compared to the ampacity calculated from simulated weather scenarios.

Fig. 5. Reference model on station B for ampacity forecasting compared to the ampacity calculated from weather measurements.

The more complicated the topography (station A), the stronger the difference between weather simulation and measurements. In both cases (stations A and B), the spread of the error between forecast and labels stays in a similar range (about 800 A for station A and about 1500 A for station B). In other words, the change of data source does not have a strong influence on the variance.
B. Feedforward Neural Networks

Incremental learning for FNNs can be implemented either by continuing the training process (FNN-continue) or by freezing the model and training only the last layer (FNN-fine-tuning) [8].

In the following figures, the Mean Absolute Percentage Error (MAPE) of both incremental learning methods is compared to the pre-trained model, using the whole six months of weather measurement data. There is no great accuracy improvement over the pre-trained model on station B (easiest topography, see Fig. 7), in contrast to station A (see Fig. 6). In any case, the FNN-continue and FNN-fine-tuning methods perform about equally well. In the following analysis, only the FNN-continue method is considered, because of its stronger learning capability due to the adjustment of the whole model.

Fig. 6. Incremental learning on FNN at station A: pre-trained model (yellow), FNN-continue (blue), FNN-fine-tuning (red).

Fig. 7. Incremental learning on FNN at station B: pre-trained model (yellow), FNN-continue (blue), FNN-fine-tuning (red).
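In Keras terms (the framework referenced in [8]), the two variants can be sketched as follows; the layer sizes, optimizer settings, epoch counts and synthetic data are illustrative assumptions, not the actual PrognoNetz architecture:

    import numpy as np
    import tensorflow as tf

    # A small stand-in FNN; the real model maps NWP features to ampacity forecasts.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation="relu", input_shape=(4,)),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer=tf.keras.optimizers.Adam(1e-3), loss="mae")

    X_new = np.random.rand(256, 4)   # newly downloaded measurement features
    y_new = np.random.rand(256, 1)   # corresponding ampacity labels

    # FNN-continue: all weights remain trainable, training simply resumes.
    model.fit(X_new, y_new, epochs=5, verbose=0)

    # FNN-fine-tuning: freeze everything except the last layer, then retrain [8].
    for layer in model.layers[:-1]:
        layer.trainable = False
    model.compile(optimizer=tf.keras.optimizers.Adam(1e-4), loss="mae")  # recompile after freezing
    model.fit(X_new, y_new, epochs=5, verbose=0)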
C. Gradient Boosting Regressors

Applying incremental learning to GBRs means training the model from scratch on the expanded dataset. Fig. 8 and Fig. 9 compare the MAPE of the pre-trained GBR model to the GBR trained on measurement data only and to the GBR trained on the whole dataset (simulated and measured). As with the FNNs, the inclusion of measurement data in the training is beneficial for station A, which is not the case for station B.

Fig. 8. Incremental learning on GBR at station A: pre-trained model (yellow), measurements model (blue), whole dataset model (red).

Fig. 9. Incremental learning on GBR at station B: pre-trained model (yellow), measurements model (blue), whole dataset model (red).
V. COMPARISON AND ANALYSIS

After testing different approaches for incremental learning of the FNN and comparing the GBR performance depending on the training datasets, it is time to simulate a practical scenario. Let us consider that two weather stations (A and B) were installed on August 1st, 2020 for overhead line monitoring. The TSO requires ampacity forecasts from the first day of operations.

Historical weather simulations for both places are available (weather analyses in meteorological terminology). Based on this dataset, pre-trained FNN and GBR models are generated to adjust the NWPs to the local weather conditions and thus to improve the ampacity forecast accuracy. The resulting MAPE is shown by the yellow curves in Fig. 6 to Fig. 9.

After one month, the simulated dataset is expanded with actual weather measurements. This allows us to continue training the models. The process is repeated monthly.
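The overall monthly cycle can be sketched as follows; all function names here are hypothetical placeholders for the project's actual training routines, shown only to make the loop structure concrete:

    import pandas as pd

    def pretrain(dataset):
        # Placeholder: pre-train the FNN and GBR models on weather analyses (simulations).
        return {"fnn": None, "gbr": None}

    def update_models(models, dataset):
        # Placeholder: the FNN continues training (Section IV-B),
        # while the GBR is retrained from scratch (Section IV-C).
        return models

    def fetch_measurements(month):
        # Placeholder: download the latest station measurements for the given month.
        return pd.DataFrame()

    dataset = pd.DataFrame()     # starts as the simulated (pre-training) dataset
    models = pretrain(dataset)

    # Stations installed on August 1st, 2020; one incremental step per month.
    for month in pd.period_range("2020-08", "2021-01", freq="M"):
        dataset = pd.concat([dataset, fetch_measurements(month)])
        models = update_models(models, dataset)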
Fig. 10 and Fig. 11 show the evolution of the MAPE for both stations over a six-month period. From top left to bottom right, one month is successively accumulated. The performance of the FNN model is shown in blue and that of the GBR model in red. The gray dashed lines are for orientation, so that the changes between the individual plots are more visible. An improvement in the ampacity forecast accuracy is observed over the six months at both stations.

Fig. 10. Incremental learning on a monthly basis for station A. Comparison of the FNN (blue) and GBR (red) approaches. The dashed line helps to show the level of improvement per month.

Fig. 11. Incremental learning on a monthly basis for station B. Comparison of the FNN (blue) and GBR (red) approaches. The dashed line helps to show the level of improvement per month.

The comparison of these results to the reference model (prediction based on NWPs, no machine learning) strongly suggests that the location of the stations affects the ampacity prediction. If the stations are located in an unfavorable topography, e.g. near mountains or forests, machine learning can achieve an accuracy improvement of about 20%.

When evaluating the results, it is found that both machine learning approaches have similar potential. In the case of the FNN model, the deviation of the results can be reduced by up to 10% through targeted further training of a pre-trained model. The GBR model provides a comparable improvement through re-training, with around 1% better performance.
VI. SUMMARY AND CONCLUSION

An incremental learning system has been developed to reduce the time needed to begin operations of a monitoring system. In this article, the system consists of weather stations (A and B) installed at two electrical towers at different topographical locations. Based on weather simulations and numerical weather predictions, a machine learning ampacity forecasting system was pre-trained. Afterwards, monthly weather measurements were added to the training process to analyze the accuracy improvement of the ampacity predictions.

Station B is located in a flat field without disturbing influences, such as mountains or forests. It measures values similar to those of the atmospheric simulation models. In contrast, station A is located in a forest, which results in a large difference between the local weather conditions and both the simulated data and the weather forecasts. The machine learning models achieve better results at this station, i.e. at the less favorable topography along the line, since the model learns to generate an ampacity forecast by mapping the NWPs to the local behavior of the weather.

The models based on the measurements dataset achieve an improvement of up to 20% after six months of incremental dataset expansion. The GBR model delivers the best results, although the difference in MAPE is only about 1% compared to the FNNs.

Overall, this article has shown that the implementation of incremental learning for ampacity predictions reduces the time to operation of a newly installed, weather-based overhead line monitoring system. It also highlights the importance of having local measurements along the lines and of including machine learning approaches. These findings are intended to motivate the further implementation and use of overhead line monitoring systems as an optimization method for electrical networks.

REFERENCES

[1] "Netzentwicklungsplan Strom 2035, Version 2021 - Zweiter Entwurf der Übertragungsnetzbetreiber," [Online]. Available: www.netzentwicklungsplan.de/de [Accessed: May 14, 2021].
[2] G. Molinar, N. Popovic, and W. Stork, "From Data Points to Ampacity Forecasting: Gated Recurrent Unit Networks," in 2018 IEEE Fourth International Conference on Big Data Computing Service and Applications (BigDataService), 2018, pp. 200-207.
[3] G. Molinar, L. T. Fan, and W. Stork, "Ampacity forecasting: an approach using Quantile Regression Forests," in 2019 IEEE Power & Energy Society Innovative Smart Grid Technologies Conference (ISGT), 2019, pp. 1-5.
[4] G. Molinar, J. Bassler, N. Popovic, and W. Stork, "Ampacity forecasting from numerical weather predictions: a fusion of the traditional and machine learning methods," in IEEE International Conference for Smart Grid Technologies - Europe, virtual event, October 2020.
[5] J. Becker, D. Grimm, T. Hotfilter, C. Meier, G. Molinar, M. Stang, S. Stock, and W. Stork, "The QUA3CK Machine Learning Development Process and the Laboratory for Applied Machine Learning Approaches (LAMA)," presentation at the Symposium Artificial Intelligence for Science, Industry and Society (AISIS 2019), Mexico City, Mexico, October 20-25, 2019.
[6] "CatBoost: Overview of CatBoost," [Online]. Available: catboost.ai [Accessed: Sep. 19, 2020].
[7] V. Lomonaco, "Why Continual Learning is the key towards Machine Intelligence," [Online]. Available: medium.com/continual-ai/why-continuous-learning-is-the-key-towards-machine-intelligence-1851cb57c308 [Accessed: Feb. 22, 2021].
[8] Keras, "Transfer learning & fine-tuning," [Online]. Available: keras.io/guides/transfer_learning/ [Accessed: Feb. 23, 2021].
