
2018 15th International Conference on Electrical Engineering, Computing Science and Automatic Control (CCE)

Mexico City, Mexico. September 5-7, 2018

A New Vision-Based Method Using Deep Learning for Damage Inspection in Wind Turbine Blades
Sahir Moreno, Miguel Peña, Alexia Toledo, Ricardo Treviño, Hiram Ponce
Universidad Panamericana, Facultad de Ingeniería,
Augusto Rodin 498, México, Ciudad de México, 03920, México
{0207660, 0160601, 0161578, 0131056, hponce}@up.edu.mx

Abstract—Wind turbines have a great impact in the field of clean energies. However, these technologies still need to improve in various aspects, such as maintenance, energy storage, and cases of overload or mechanical failure. Regarding maintenance, wind turbines constantly suffer blade damage, typically because the blades are in the open air and in constant operation. The most well-known blade damages are lightning strikes, wear, fractures caused by cutting forces and freezing, among others. Because of all these factors, it is necessary to develop a predictive technique that allows inspecting the blades in a safer and more effective way than manual inspection. In that sense, this paper introduces a deep learning vision-based approach that automatically analyzes each part of the face of the blade and is capable of detecting certain faults (lightning strikes, wear and fractures). In addition, we present a proof-of-concept using a robot to automatically detect failures in wind turbine blades. Experimental results validate our vision system.
Index Terms—blades, wind turbine, inspection, deep learning, machine learning, vision, robot

Fig. 1. Parts of a wind turbine.

I. INTRODUCTION

Currently, clean energies are increasingly used and considered worldwide, not only in the industrial sector but also in housing [1]. Different clean energies have been proposed and implemented, such as solar, water, thermal and wind energy. In particular, the latter uses the kinetic energy of moving air to produce electricity from large wind turbines on land, on the sea or on freshwater.

A wind turbine is a device that converts the kinetic energy of the wind into electrical energy. The blades of a wind turbine rotate between 13 and 20 revolutions per minute, depending on the technology [2]. They can rotate at a constant or variable speed, where the speed of the rotor varies according to the speed of the wind in order to achieve greater efficiency. The elements that make up a wind machine are the following (see Fig. 1) [2]: supports (towers or straps), collection system (rotor), guidance system, rotational speed control system, transmission system (axles and multiplier), and generation system (generator).

Among the key components, those of the drive train are subject to dynamic loads due to the interaction of inertial, aerodynamic, elastic and structural forces, control forces and mechanical vibrations. The fatigue experienced by these components can be of greater magnitude than in other systems or machines. Moreover, wind turbines are growing in size and power, which means that a single drive failure can cause a great loss of energy. These machines operate under complex and dynamic load variations in environmental conditions such as turbulence or gusts, so they can be easily damaged [3].

The presence of damage in wind turbine blades is of particular importance in this work. Damage can be identified as lightning strikes, wear, fractures caused by cutting forces and freezing, among others [2], [3]. For instance, rotor blades are constantly subjected to cyclic load forces (e.g. axial tension, compression and bending) that can result in aging of the components and damage to specific parts (i.e. the lamination within the mixed-media structure). Other damage to blades is caused by lightning, which can set the equipment on fire. Visual inspection is then conducted to detect such damage, but some types are harder to detect than others.

Proper preventive maintenance should be done in a wind farm to reduce such damage.



Because multiple failures can occur in blades, there are several possible consequences: a decrease in energy generation, economic losses, damage to the equipment, human accidents, and so on. Currently, maintenance work involves manually inspecting the surface of the blades, making this a tedious and very time-consuming procedure that must be done periodically. In addition, the wind turbine must be stopped for inspection, a period that represents wasted time and energy [4].

In that sense, this paper aims to introduce a deep learning vision-based approach for detecting certain damages in the face of a wind turbine blade, i.e. lightning strikes, wear and fractures. In addition, we present a proof-of-concept using a robotic system to grab images from the surface of blades and automatically detect damages. In order to do so, we curate a dataset of images containing the types of damages considered in this work. Then, we train a convolutional neural network with these images for detecting damages. After that, a prototype was made: a camera was mounted on a robot, and a simple path planning strategy was programmed for scanning the whole surface of the wind turbine blades. To this end, a mock-up of a wind turbine was built to test the whole mechatronic system.

The rest of the paper is organized as follows. Section II presents the related work, and Section III describes the proposed vision-based approach using convolutional neural networks. In Section IV, the experimentation and results are described and discussed. Lastly, Section V concludes the work.

II. RELATED WORK

In [5], the authors proposed a technique based on a thermographic laser capable of detecting damage in the blades of wind turbines. Thermal waves are generated by a continuous laser beam directed at the rotating blades of the turbines, and the thermal responses are measured simultaneously by an infrared camera. After that, statistics and pattern recognition algorithms were developed and applied to the measured thermal images in the time domain. This method allows an inspection of the blades without any direct contact with them, obtaining information that can be interpreted intuitively and in situ. Unfortunately, its implementation has not yet been applied to real systems, so it might not be useful in the field, and its analysis algorithm may still require too much human interpretation of the results.

In [6], the authors implemented a new methodology based on the analysis of supervisory control and data acquisition (SCADA) data for condition monitoring and diagnosis of wind turbines. This methodology takes as input the signals obtained by the programmable logic controllers (PLC) or programmable automation controllers (PAC) of the overall wind turbine system. In addition, the proposal is not used for detecting faults exclusively in the blades; it is also used for detecting failures of the entire system and monitoring its operation to assess and provide the necessary maintenance. However, the data obtained through this method depends largely on the environmental conditions.

The authors in [7] proposed a computationally light and highly scalable, collision-free control scheme for multiple unmanned aerial vehicles (UAV) that could be implemented in a distributed manner. The system defines trajectories for UAVs to visually inspect structural assets such as tanks, flare stacks, chimneys, and wind turbines. The scheme is applied to automated structural inspection, where coverage is achieved by minimizing the un-scanned areas. They perform coverage path planning (CPP) for total coverage of the target object. However, visual inspection is done by humans using the video streaming provided by the UAVs.

Another work based on visual inspection of wind turbines was proposed by Sarrafi and Mao [8]. They used phase-based motion estimation (PME) for detecting damage in blades subjected to vibrations. They extended two-dimensional PME to propose a three-dimensional PME that improves the detection algorithm. However, this approach only considers one type of damage.

In that sense, we propose to develop an automatic robotic system, e.g. a drone, that can cover the surface of the blades and use deep learning to predict different damages on them, making the maintenance of wind turbine blades more efficient than manual inspection.

III. DESCRIPTION OF THE PROPOSAL

In order to improve the maintenance process of wind turbine blades, we propose to perform the inspection using an autonomous robotic system, i.e. a drone, that flies all around the blades to scan their surface while the wind turbine is not operating. Using a camera installed on the drone, a convolutional neural network detects and classifies the type of damage in the images. After the surface of a blade is totally covered, the autonomous system is able to determine the damaged areas of the blade by type. Fig. 2 shows the entire process of the proposed vision-based system.

Fig. 2. Block diagram of the proposed system.

Since standard maintenance protocols for wind turbines state that the turbine must be turned off and the blade under inspection must be positioned along the vertical axis, in this work we propose to use a simple coverage method by setting the start position of the drone on the surface of the blade and manually turning on the device.
The coverage method should be calibrated to assign vertical movements to the drone. For simplicity, the path of the drone is programmed beforehand. In addition, the drone is manually located at the start position, and the user should select one side or the other of the blade, e.g. front (A side) or back (B side).

During the execution of the trajectory, the drone takes photographs of the surface. Each photograph works as an input to a previously trained convolutional neural network that detects any damage in the image and its type. If there is any damage in the image, the latter is stored; otherwise, it is deleted. After the whole area is covered, the drone stops and the stored images can be inspected by expert users in the field.
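For illustration, the inspection loop just described can be sketched in MATLAB as follows. This is only a sketch, not the exact implementation used in this work: the camera object cam, the trained network net, the number of stops and the class label 'no damage' are assumptions, and the motion command of the vehicle is left as a placeholder.

```matlab
% Illustrative sketch of the inspection loop (assumed names: cam, net).
cam = webcam(1);                        % camera mounted on the drone/robot
numStops = 8;                           % e.g. 4 positions going up and 4 going down
for k = 1:numStops
    % ... move the vehicle to the k-th position of the programmed path ...
    frame = snapshot(cam);              % photograph of the blade surface
    label = classify(net, imresize(frame, [28 28]));
    if label ~= "no damage"             % keep only images in which damage was detected
        imwrite(frame, sprintf('damage_%02d_%s.png', k, char(label)));
    end
end
clear cam
```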
In the following, a detailed description of the convolutional neural network is presented.

A. Convolutional Neural Networks

A convolutional neural network (CNN) is a well-known deep learning architecture inspired by the nature of visual perception in living creatures [9], typically applied to classification and regression tasks in image processing [10]. There exist different CNN architectures, but a CNN is mainly constituted by three types of layers, namely convolutional, pooling and fully-connected. The first aims to compute feature representations of the input, a pooling layer aims to reduce the resolution of the feature maps, and a fully-connected layer aims to perform high-level reasoning [9]. Lastly, every CNN requires an output layer to compute the classification or regression task. Image and video applications in particular have been widely explored with CNNs.

For implementation purposes, the size of our input images is 28 × 28. First, a convolutional layer with 16 filters of size 3 × 3 is placed to compute feature representations of the input image, followed by a sequence of a batch normalization layer, a rectified linear unit (ReLU) and a 2-size max pooling layer without overlapping. Next, the same sequence of layers is connected with a convolutional layer with 32 filters. Then, another sequence of layers is connected with a convolutional layer of 64 filters, this time without the max pooling layer. Finally, a fully-connected layer with output size 4 and a classification output layer, with a softmax layer, are located at the end of the CNN to perform the high-level classification of damages. In this work, we consider three possible damages: lightning impact, wear or fracture. In addition, we also consider a no-damage class. Fig. 3 shows the CNN implemented.

Fig. 3. Architecture of the CNN proposed for the visual inspection system of blades.
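As a reference, the architecture just described can be expressed with the layer API of the Neural Network (Deep Learning) Toolbox for MATLAB roughly as follows. Details not stated in the paper, such as the padding mode and the number of input channels, are assumptions.

```matlab
% Sketch of the CNN of Fig. 3 (assumptions: RGB input, 'same' padding).
layers = [
    imageInputLayer([28 28 3])                     % 28 x 28 input images

    convolution2dLayer(3, 16, 'Padding', 'same')   % 16 filters of size 3 x 3
    batchNormalizationLayer
    reluLayer
    maxPooling2dLayer(2, 'Stride', 2)              % 2-size max pooling, no overlap

    convolution2dLayer(3, 32, 'Padding', 'same')   % 32 filters
    batchNormalizationLayer
    reluLayer
    maxPooling2dLayer(2, 'Stride', 2)

    convolution2dLayer(3, 64, 'Padding', 'same')   % 64 filters, no pooling afterwards
    batchNormalizationLayer
    reluLayer

    fullyConnectedLayer(4)                         % lightning impact, wear, fracture, no damage
    softmaxLayer
    classificationLayer];
```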
B. Real-Time Implementation of CNN

To detect damages in real time, the CNN should be trained offline. In that sense, a dataset of damages is required for training. Then, the CNN should be deployed in the system that is going to serve as the automatic inspector, e.g. the drone. It is remarkable that the trained CNN estimates the class of damage seen in an image in less than 100 ms, depending on the resources of the processing unit. Below, we describe how the CNN was trained and deployed in our prototype.
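A simple offline check of the per-image inference time mentioned above could look like the following sketch; the file name and the trained network net are placeholders.

```matlab
% Hedged sketch: timing a single-image prediction with the trained network.
img = imresize(imread('blade_sample.png'), [28 28]);   % placeholder file name
tic;
label = classify(net, img);
fprintf('Estimated class: %s (%.0f ms)\n', char(label), 1000 * toc);
```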
IV. EXPERIMENTAL RESULTS AND DISCUSSION

In this section, we describe the prototype implemented for this work, the experimentation done and the preliminary results obtained so far. Some advantages and weaknesses are also discussed.

A. Prototype Implementation

First, we made a mock-up in order to determine the viability of the proposal. Therefore, we built a scale model of a wind turbine that consists of a wooden base, a tower of 32 cm in height and 3D-printed blades made of ABS plastic. Then, a camera was placed approximately 3.8 cm from the blades. Damages were simulated on the blades.

Instead of using a drone, we developed an arm robot with the camera placed at the end effector. This decision was made mainly for simplicity of the coverage strategy. The robot was built using the LEGO Mindstorms EV3, and a Genius Eye110 webcam was employed as the camera. Fig. 4 shows the implementation of the prototype using the arm robot and the mock-up of the wind turbine. Particularly, the robot consisted of three motors, two at the support of the robot (i.e. two degrees of freedom) and one at the second linkage. In addition, two sensors were employed for determining the maximum amplitude of the linkages.

Fig. 4. Prototype of the wind turbine and the arm robot with the vision-based damage inspection system attached.

To this end, the coverage method and the CNN were programmed in MATLAB using a computer connected through USB cables to both the camera and the robot. Fig. 5 shows the schematic diagram of the robot-camera-computer communication. We used the LEGO Mindstorms EV3 Support from MATLAB for the coverage method, and the CNN functions from the Neural Network Toolbox for MATLAB.

Fig. 5. Schematic diagram of the robot-camera-computer communication.
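A minimal sketch of this connection, using the LEGO MINDSTORMS EV3 and USB webcam support packages for MATLAB, is shown below. The port letters, motor speed and test movement are assumptions and do not necessarily correspond to the exact configuration of the prototype.

```matlab
% Hedged sketch of the robot-camera-computer communication (assumed ports).
ev3  = legoev3('usb');          % EV3 brick connected through a USB cable
base = motor(ev3, 'A');         % one of the two motors at the support
tilt = motor(ev3, 'B');         % second motor at the support
arm  = motor(ev3, 'C');         % motor at the second linkage
cam  = webcam(1);               % Genius Eye110 webcam, also over USB

base.Speed = 10;                % small vertical step of the coverage method
start(base); pause(0.5); stop(base);
imshow(snapshot(cam));          % quick check that images are being received
```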
B. Training of CNN

The CNN was trained with a curated dataset that consists of a folder containing 78 images of real wind turbine blades found on the public Internet. The dataset was divided into: blades without damage (21 samples), blades with wear (20 samples), lightning impact (25 samples) and fractures (12 samples). The resolution of the images was 150 × 150 pixels.

The training procedure was the stochastic gradient descent method, minimizing the error between the target output classes and the classes estimated by the CNN. Mini-batches of size 4 were used, with a maximum of 50 epochs.

Once the CNN was trained, we used it for estimating the type of damage. For implementability, we employed the Neural Network Toolbox for MATLAB.
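Under the settings reported above, the training step can be sketched in MATLAB as follows. The folder layout (one subfolder per class) and the resizing of the 150 × 150 images to the 28 × 28 network input are assumptions.

```matlab
% Hedged training sketch: stochastic gradient descent, mini-batch size 4, 50 epochs.
imds = imageDatastore('blade_dataset', ...
    'IncludeSubfolders', true, 'LabelSource', 'foldernames');
imds.ReadFcn = @(f) imresize(imread(f), [28 28]);   % 150 x 150 -> 28 x 28

options = trainingOptions('sgdm', ...
    'MiniBatchSize', 4, ...
    'MaxEpochs', 50, ...
    'Shuffle', 'every-epoch', ...
    'Verbose', false);

net = trainNetwork(imds, layers, options);          % 'layers' as in Section III-A
```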
C. Metrics for Evaluation

In this work, we use the accuracy metric (1) to validate the performance of the vision-based damage inspection system for blades.

accuracy = (number of images correctly classified / total number of images) × 100%    (1)
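Equation (1) translates directly into a few lines of MATLAB, assuming a labeled set of held-out images imdsTest (read with the same resizing as in the training sketch) and the trained network net.

```matlab
% Hedged sketch of the accuracy metric (1).
predicted = classify(net, imdsTest);        % classes estimated by the CNN
accuracy  = 100 * sum(predicted == imdsTest.Labels) / numel(imdsTest.Labels);
```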

D. Preliminary Results

At the beginning of the experiment, the camera performs a vertical sweep upwards, capturing 4 photographs; after a pause, it descends and takes another 4 pictures. Fig. 6 shows the images taken corresponding to damage by lightning impact, and Table I summarizes the results of this experiment.

Fig. 6. Images taken for damages by lightning impact. The coverage method runs in ascending direction (top) and in descending direction (bottom).

TABLE I
ACCURACY IN EXPERIMENTS

Damage             Accuracy [%]
lightning impact   100.0
wear                50.0
fracture            87.5
no damage           87.5

The same experiment was done using a blade corresponding to wear damage. Fig. 7 reports the images taken by the vision-based inspection system over the mock-up, and Table I shows the results of the experiment.

Fig. 7. Images taken for damages by wear. The coverage method runs in ascending direction (top) and in descending direction (bottom).

The third experiment consisted of the same methodology using a blade with fractures. Fig. 8 shows the images taken by the system and Table I summarizes the results.

Fig. 8. Images taken for damages by fracture. The coverage method runs in ascending direction (top) and in descending direction (bottom).

To this end, the last experiment considers a normal blade without damages. Fig. 9 depicts the images of this experiment and Table I summarizes the results.

Fig. 9. Images taken for blades without damages. The coverage method runs in ascending direction (top) and in descending direction (bottom).
E. Discussion

After the experiments, the overall accuracy of the system is 81.25%, i.e. the average of the four per-class accuracies in Table I. This value can be explained by the following reasons:
• The CNN was trained with a small number of images.
• The sample images are real photographs of blades, while the experiments were done over a mock-up.
• As noticed in Fig. 6, damage by lightning impact is the easiest one to detect, even for human inspection, while wear is very similar to blades without damage (see Figs. 7 and 9).

In that sense, the vision-based damage inspection system can be considered validated by these experiments, and the accuracy is quite high even with the issues present in the CNN. These preliminary results confirm that this approach can be practical, but a larger number of samples should be considered, a refinement of the training of the CNN is also required, and transfer learning is suggested for better classification. In addition, this experiment does not use drones, and experimentation with them must be done before implementing the system in real conditions.

F. Capabilities of the Proposal

From the above, the proposed automatic inspection system is able to determine the type of damage found in blades. This is an important function since it helps to reduce the inspection time. Also, the system is able to store a record of images from the blades that can be used in future (historical) analysis. Furthermore, the proposed system uses a convolutional neural network that is able to learn from new images taken from the blades, making the system flexible enough to adapt to any conditions of the wind turbines. In addition, this CNN can be used in real time for inspections, responding in less than 100 ms. To this end, if the system is used for inspection, it will reduce risks for workers.

V. CONCLUSION

This paper aimed to introduce a deep learning vision-based approach for detecting certain damages in the face of a wind turbine blade. In addition, we presented a proof-of-concept using a robotic system to grab images from the surface of mock-up blades and automatically detect damages.

Preliminary results validate that this proposed system can be explored in depth for future vision-based damage inspection, classifying three different damages in blades. Several insights have been shown in this work, such as that a CNN can be considered for this task, although some issues should be solved first.

We are interested in curating a larger dataset for training the CNN, as well as using transfer learning to improve the results of the trained network. In addition, we are investigating the usage of the drone for real wind turbines.

REFERENCES

[1] Cash, D. (2018), Choices on the road to the clean energy future, Energy Research & Social Science, vol. 35(1), pp. 224-226.
[2] Tong, W. (2010), Wind Power Generation and Wind Turbine Design, WIT Press, pp. 725.
[3] Hur, S. (2018), Modelling and control of a wind turbine and farm, Energy, vol. 156(8), pp. 360-370.
[4] González-González, A., Jimenez, A., Galar, D., Ciani, L. (2018), Condition monitoring of wind turbine pitch controller: A maintenance approach, Measurement, vol. 123(7), pp. 80-93.
[5] Soonkyu, Y., Sohn, H. (2017), Continuous line laser thermography for damage imaging of rotating wind turbine blades, Artificial Intelligence, vol. 188, pp. 225-232.
[6] Dao, P., Staszewski, W., Barszcz, T., Uhl, T. (2018), Condition monitoring and fault detection in wind turbines based on cointegration analysis of SCADA data, Renewable Energy, vol. 116, pp. 107-122.
[7] Clark, R., Punzo, G., MacLeod, C., Dobie, G., Summan, R., Bolton, G., Pierce, S., Macdonald, M. (2017), Autonomous and scalable control for remote inspection with multiple aerial vehicles, Robotics and Autonomous Systems, vol. 87(1), pp. 258-268.
[8] Sarrafi, A., Mao, Z. (2017), Wind turbine blade damage detection via 3-dimensional phase-based motion estimation, 11th International Workshop on Structural Health Monitoring, Stanford University, United States, pp. 2545-2552.
[9] Gu, J., Wang, Z., Kuen, J., Ma, L., Shahroudy, A., Shuai, B., Liu, T., Wang, X., Wang, G., Cai, J., Chen, T. (2017), Recent advances in convolutional neural networks, Pattern Recognition, pp. 1-24.
[10] Nogueira, K., Penatti, O.A., dos Santos, J.A. (2017), Towards better exploiting convolutional neural networks for remote sensing scene classification, Pattern Recognition, vol. 61, pp. 539-556.
