Exhaust Gas Temperature (EGT) denotes the temperature of the exhaust gas when it leaves the turbine. EGT is an
important parameter for measuring the energy efficiency of a turbofan engine. Indeed, the heat energy produced by
an aircraft engine corresponds to a loss of power. Therefore, forecasting the exhaust gas temperature is a key task to
monitor the engine performance and schedule maintenance operations. In this paper, we propose a new method for
forecasting EGT throughout the engine's life. The EGT is regarded as a time series; for each flight, we aim to predict the EGT during the cruise phase. Our model is a neural network that relies on recurrent networks and attention mechanisms to compute a state vector representing the wear of the engine. Moreover, we show that this state vector can be used to monitor the engine's energy efficiency over time.
Keywords: Artificial Neural Network (ANN), Attention Mechanisms, Prognostic Health Management, Exhaust Gas
Temperature, Turbofan Engine.
Proceedings of the 32nd European Safety and Reliability Conference (ESREL 2022)
related to the weather (temperature, pressure, etc.) and data describing the air pollution (dust, salt concentration, etc.).

2.3. Notations and main goal
For each engine and for each flight we retrieve data from the takeoff, climb and cruise snapshots, and environmental data (weather and air pollution at the departure and arrival airports). For engine i and flight t, we denote this vector by Z_t^i. The EGT recorded during the cruise phase for engine i and flight t is denoted E_t^i.
We emphasize that the EGT E_t^i is included in the vector Z_t^i. Moreover, other features strongly correlated with the EGT are also included in Z_t^i (for example, internal temperatures in the engine). Therefore, we also define the vector X_t^i, equal to Z_t^i from which we remove the EGT E_t^i and any features directly related to the engine operation, except the rotational speed of the turbine engine (denoted N1, which represents the thrust delivered by the engine).
Indeed, our problem is to predict the EGT using past data (Z_s^i)_{s<t} and data describing the flight number t (the vector X_t^i).

3. Model
3.1. Towards a probabilistic model
The cruise EGT during flight t depends on conditions at flight t (thrust delivered by the engine, altitude, external temperature, etc.) and the state of the engine at time t (which depends on past data). However, other events can impact the performance of the aircraft. In particular, water-washes are common maintenance operations that consist in washing the engine in order to remove airborne particles that can build up in the engine. Such maintenance operations are regularly carried out by airline companies and aim to restore engine performance (Chen and Sun (2018)).
Besides water-washes, many other events can impact aircraft performance: these include all maintenance operations and component replacements. Such events mainly depend on the maintenance policy of airline companies and are not always possible to predict with available data (i.e. (Z_s^i)_{s<t} and X_t^i). Consequently, abrupt changes can sometimes occur in the EGT.
We provide below an example of a cruise EGT. An abrupt dynamic change can be observed near flight 800. This change is caused by a maintenance operation that has decreased the EGT.

Fig. 1. EGT during cruise phase; the abscissa corresponds to the number of operating cycles ("Cycles Since New") and the ordinate to the measured EGT (degrees Celsius). For confidentiality matters, the temperature scale on the y-axis is not provided.

Due to such operations, instead of investigating deterministic models to predict the EGT, it appears more promising to learn a probabilistic model that fits the EGT distribution for each flight. In this way, we aim to take account of the uncertainty linked to the maintenance policy.

3.2. Gaussian Mixture Model (GMM)
A gaussian mixture is a probability distribution consisting of a superposition of m gaussian densities. An introduction to gaussian mixtures can be found in Bishop (2006), Chapter 2.3.9.
The probability density of an m-component gaussian mixture is defined by

p_t^i(x) = \sum_{j=1}^{m} \pi_t^{i,j} \, \mathcal{N}(x \mid \mu_t^{i,j}, \sigma_t^{i,j}).   (1)

To simplify, we will from now on omit the indices i (the engine) and t (the flight cycle) when convenient.
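As a concrete illustration of Eq. (1) — a minimal sketch with our own function names, not the authors' code — an m-component gaussian mixture density can be evaluated as:

```python
import math

def gaussian_pdf(x, mu, sigma):
    # Density of a univariate normal N(mu, sigma^2) evaluated at x.
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def mixture_pdf(x, pi, mu, sigma):
    # Eq. (1): superposition of m gaussian densities weighted by the
    # mixture coefficients pi, which must sum to 1.
    assert abs(sum(pi) - 1.0) < 1e-9, "mixture coefficients must sum to 1"
    return sum(p * gaussian_pdf(x, m, s) for p, m, s in zip(pi, mu, sigma))
```

For instance, `mixture_pdf(x, [0.7, 0.3], [0.0, 5.0], [1.0, 2.0])` mixes a narrow component centered at 0 with a wider one centered at 5, which is how several components can encode several possible maintenance outcomes.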
(π^j)_{j∈[1,m]} are called the mixture coefficients, and their sum must be equal to 1 (i.e. \sum_{j=1}^{m} π^j = 1). The coefficients (μ^j, σ^j)_{j∈[1,m]} are respectively the expectations and standard deviations of the gaussian random variables.
The gaussian mixture model seems pertinent in our context. Indeed, we expect to model the uncertainty due to the several possible maintenance operations using several gaussian components. In this way, our problem is to compute, for each engine i and each flight t, the parameters π_t^{i,j}, μ_t^{i,j}, σ_t^{i,j} with j ∈ [1, m]. The number of gaussian components (m) is a hyper-parameter.
The gaussian mixture model is commonly used in the field of prognostic health management. For example, Wei et al. (2022) use a GMM to predict the RUL (Remaining Useful Life) of batteries. In the aerospace field, gaussian mixture models are also used in the context of unsupervised learning for fault detection (for example Lacaille et al. (2014)).

The use of gated recurrent networks to compute the physical state of a device has become a common idea in the PHM field. It has already been proposed, for example, in Wu et al. (2018) for RUL (Remaining Useful Life) estimation and in Langhendries and Lacaille (2021) for survival analysis in the aerospace field. Moreover, recent works (for example Zhong et al. (2018)) use another type of gated recurrent network (i.e. the GRU) for EGT forecasting and show that such methods achieve state-of-the-art performance. We also emphasize that recurrent networks can be stacked into layers (the hidden states computed by a first recurrent network are used as inputs by a second recurrent network). This allows the capture of more complex information in the final layer of hidden states. In this work, we use two stacked LSTMs, but to simplify notations we will continue to denote and draw the recurrent layer as if there were only one LSTM.
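The constraints above (mixture coefficients summing to 1, strictly positive standard deviations) are typically enforced on the raw outputs of the final dense layer by a softmax and a softplus, respectively. The sketch below is our own illustration of this standard construction, not the paper's exact architecture:

```python
import math

def softmax(logits):
    # Shift by the max before exponentiating for numerical stability.
    shift = max(logits)
    exps = [math.exp(v - shift) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

def softplus(v):
    # ln(1 + e^v): smooth and always strictly positive.
    return math.log1p(math.exp(v))

def mixture_head(raw):
    # Map 3*m unconstrained outputs of a dense network to the mixture
    # parameters (pi, mu, sigma) for m gaussian components.
    m = len(raw) // 3
    pi = softmax(raw[:m])                       # coefficients sum to 1
    mu = raw[m:2 * m]                           # means are unconstrained
    sigma = [softplus(v) for v in raw[2 * m:]]  # standard deviations > 0
    return pi, mu, sigma
```

In a recurrent architecture such as the one described here, the input of this head would be the hidden state produced by the stacked LSTMs for flight t.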
3.5. Attention mechanisms
L(θ | (Z_t^i)_{t∈[1,n_i]}) = f_μ((Z_t)_{t∈[1,n_i]} | θ)   (9)

= \prod_{t=1}^{n_i} f_μ(X_t) \prod_{t=1}^{n_i} f_μ(Z_t | X_t, Z_{t-1}, …, Z_1, θ).   (10)

The density \prod_{t=1}^{n_i} f_μ(X_t) does not depend on the model parameters θ. Thus this term can be omitted in the loss function.
We denote by π_t^j, μ_t^j, σ_t^j the parameters of the gaussian mixture. π_t^j (and likewise μ_t^j and σ_t^j) denotes the mixture coefficient for flight t and gaussian component j (therefore π_t^j depends on X_t, Z_{t-1}, …, Z_1 and θ).

f_μ(Z_t | X_t, Z_{t-1}, …, Z_1, θ) = \sum_{j=1}^{m} \frac{π_t^j}{σ_t^j \sqrt{2π}} \exp\left(−\frac{1}{2}\left(\frac{Z_t − μ_t^j}{σ_t^j}\right)^2\right).   (11)

Maximizing the likelihood is equivalent to minimizing the negative log-likelihood. Therefore, we can choose the following loss function:

L(θ | (Z_t^i)_{t∈[1,n_i]}) = −\sum_{t=1}^{n_i} \ln\left(\sum_{j=1}^{m} \frac{π_t^j}{σ_t^j} \exp\left(−\frac{1}{2}\left(\frac{Z_t − μ_t^j}{σ_t^j}\right)^2\right)\right).   (12)

3.7. Implementation
We implemented the model using PyTorch. The dense network for the short-term attention mechanism is composed of two layers. The dense network that computes the gaussian mixture coefficients is composed of four layers. Three gaussian components are used. The model is trained on a dataset of 800 engines (all Leap1A) and validated on another dataset of 22 engines. Throughout the rest of the article, all given results are obtained on this validation dataset.

4. Results and Applications
4.1. Comparison with a flight by flight (FbF) dense neural network
In this section, we compare results obtained using our model to results obtained using a baseline model. The chosen baseline model is a neural network taking for each engine and each flight
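Coding a loss of the form of Eq. (12) directly can underflow when the exponentials are tiny, so implementations usually rely on the log-sum-exp trick. The following is a hedged plain-Python sketch with our own function names (in PyTorch, torch.logsumexp plays the same role):

```python
import math

def mixture_nll(z, pi, mu, sigma):
    # Negative log-likelihood of Eq. (12) for one engine.
    # z[t]: observed value at flight t; pi/mu/sigma[t][j]: mixture
    # parameters predicted for flight t and component j.
    total = 0.0
    for t in range(len(z)):
        # Log of each weighted component density (up to the constant
        # sqrt(2*pi)), then a stable log-sum-exp over the m components.
        logs = [math.log(pi[t][j]) - math.log(sigma[t][j])
                - 0.5 * ((z[t] - mu[t][j]) / sigma[t][j]) ** 2
                for j in range(len(pi[t]))]
        shift = max(logs)
        total -= shift + math.log(sum(math.exp(v - shift) for v in logs))
    return total
```

The loss decreases when a component with a large coefficient π_t^j places its mean μ_t^j close to the observed value, which is the behaviour the training seeks.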
Acknowledgement
This work has been supported by a CIFRE contract between Safran Aircraft Engines and University Paris 1 Pantheon Sorbonne.

Appendix B. Results: EGT for engine number 4.
[Figure: measured EGT (Temperature, °C) against Cycles Since New]