In our work, in Sec. 2 we describe the deep learning technique, in Sec. 3 we identify the chaos in vibration knocking, and in Sec. 4 we present our first results.

II. DEEP LEARNING

Deep learning can be defined as the part of the neural-network-based identification schemes that uses multiple hidden levels between the input and output data series. In our case the input layer contains working cycles of the Deutz MWM G234V8 large gas engine vibration signals, and the output layer contains the knocking combustion probabilities.

Among the several deep learning techniques, for our research the Deep Belief Networks (DBNs) have become an attractive option for
1. data dimensionality reduction,
2. collaborative filtering,
3. feature learning,
4. modeling, and
5. solving classification problems [9].

Further deep learning constructions, such as Convolutional Neural Networks, Stacked Denoising Autoencoders, and Deep Recurrent Neural Networks, have also gained some traction recently, as they have been shown to outperform some neural network tools at handling large data spaces, learning features in order to perform detection, classification, and prediction [10].

The basic key of the DBN deep learning method is the use of the Restricted Boltzmann Machine (RBM), where multiple RBMs are stacked on top of one another to form a deep network [11].

DBNs can usually be trained in an unsupervised manner: the first RBM layer is trained with the vibration signals (knocking working cycles and non-knocking cycles in the crank angle [0-719°] subset together) as input. Through training, the first layer acquires an image of the vibration signal by updating its weights and biases between the input and hidden layers, which in turn becomes the input of the second layer [12].

The main task of the deep learning of the DBNs is to find the weights that maximize the expected log likelihood log P(v) of the vibration signal data v:

\[ \arg\max_{\mathrm{Weights}} \; \mathbb{E}\left[ \sum_{v \in V} \log P(v) \right] \qquad (1) \]

The optimization is solved by gradient-based schemes. Keeping the weights and biases of the input layer constant after it is trained, the transformed input from the first hidden layer is used to train the next hidden layer. This procedure is continued for the chosen number of hidden layers in the network, with each iteration propagating both the vibration values and the mean activations to higher levels, while the product of probabilities assigned to the input is maximized [13].

III. CHAOS AND KNOCKING

Large gas engine knocking is a nonlinear, complex phenomenon and cannot be characterized by linear statistical parameters. In this study our aim is to define and to learn the nonlinear characteristics via knocking probabilities of the chaotic vibration data series.

In mechanical engineering [1], the knocking probabilities of large gas engines are characterized by the SNR as follows:
1. norm(SNR) < 0.1: very weak knocking
2. 0.1 ≤ norm(SNR) < 0.2: weak knocking
3. 0.2 ≤ norm(SNR) < 0.6: knocking
4. 0.6 ≤ norm(SNR): strong knocking

However, the knocking probabilities can give inaccurate information about knocking resonance transitions and nonlinearities; therefore, we need the further applicable information on the knocking chaoticity that is contained in the vibration signals.

The main idea is to describe the chaoticity of the knocking signals by the K2-entropy as follows:
1. K2 = 0 in a strongly knocking system (more than half of the working cycles contain knocking combustions),
2. K2 → ∞ in a purely random knocking system (a weakly knocking system),
3. 0 < K2 < ∞ in a chaotic knocking system [14].

The informal definition of K2 by means of correlation integrals is:

\[ K_2 = \lim_{m \to \infty} \lim_{r \to 0} K_{2m}(r) \qquad (2) \]

where

\[ K_{2m}(r) = \frac{1}{k\,\Delta t} \ln \frac{P_m(r)}{P_{m+k}(r)} \qquad (3) \]

Here k is an adequately small integer, m is the embedding dimension, r is the discretized symbol sequence of the knocking probabilities, r ∈ {a, b, c, d} (see Figure 2), and P_m(r) are the transition probabilities of the deep learning matrix (see Figure 2).

In order to decrease fluctuations and to improve the statistics, we average Eq. (2) over the 4 different transition values of the deep learning probability matrix (r ∈ {a, b, c, d}) and approximate the dependence of the knocking characteristic K2 on m by means of a least-squares fit with the signal size L:

\[ \langle K_{2m}(r) \rangle = \frac{1}{4L\,\Delta t} \sum_{l=1}^{L} \frac{1}{l} \ln \frac{P_m(r)}{P_{m+2l}(r)} \qquad (4) \]
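As an illustration, the finite-m entropy estimates of Eqs. (2)-(4) can be sketched on a discretized symbol sequence. This is a minimal sketch, not the paper's implementation: it assumes the symbolic r → 0 limit, in which the correlation-integral quantity P_m(r) reduces to the probability that two randomly drawn length-m blocks of the sequence coincide; the function names and the test sequences are our own.

```python
import math
from collections import Counter

def coincidence_probability(symbols, m):
    """Plug-in estimate of P_m in the symbolic (r -> 0) limit: the
    probability that two randomly drawn length-m blocks coincide."""
    blocks = [tuple(symbols[i:i + m]) for i in range(len(symbols) - m + 1)]
    n = len(blocks)
    counts = Counter(blocks)
    return sum((c / n) ** 2 for c in counts.values())

def k2_estimate(symbols, m, k=1, dt=1.0):
    """Finite-m estimate K2_m = (1/(k*dt)) * ln(P_m / P_{m+k}), as in Eq. (3)."""
    p_m = coincidence_probability(symbols, m)
    p_mk = coincidence_probability(symbols, m + k)
    return math.log(p_m / p_mk) / (k * dt)

def k2_averaged(symbols, m, L, dt=1.0):
    """Average the estimate over k = 2l, l = 1..L, in the spirit of Eq. (4).
    (The paper additionally averages over the four transitions a, b, c, d
    of its probability matrix; a single symbol stream is used here.)"""
    total = 0.0
    for l in range(1, L + 1):
        p_m = coincidence_probability(symbols, m)
        p_m2l = coincidence_probability(symbols, m + 2 * l)
        total += math.log(p_m / p_m2l) / l
    return total / (4 * L * dt)
```

A constant sequence yields K2 = 0 (the strongly knocking regime above), while an i.i.d. random four-symbol sequence yields a large positive estimate, consistent with K2 → ∞ for purely random knocking.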
INES 2016 • 20th Jubilee IEEE International Conference on Intelligent Engineering Systems • June 30-July 2, 2016 • Budapest, Hungary
IV. RESULTS

Figure 1 shows the basics of the test gas engine and generator used for the knocking analysis. The working principle of the gas engine is the following: the gas is burnt in the cylinders of the engine, and the released energy turns a crankshaft within the engine. The crankshaft drives a generator, which results in the generation of electricity. Heat from the combustion process is discharged from the cylinders and must be either reclaimed and used in a combined heat and power structure or dissipated via heaters placed near the engine.

The background of the nonlinear knocking analysis by the deep learning scheme can be seen in Figure 2.

A → The vibration working cycles are trained by the DBN network with 31 hidden layers. The training set contains 917,000 training working cycles, covering 461,000 knocking and 456,000 non-knocking working cycles. A learning rate of 0.01 is used for the gradient descent algorithm.

B → During supervised fine-tuning based on the K2-entropy knocking signals (Eq. (4)), classification errors on the validation vibration signals are compared against the errors from the training set as a measure to avoid overtraining the network based on the chaoticity of the working cycles.

C → The symbolic sequence signal is found prior to the point at which the chaoticity of the knocking characteristic reliably exceeds the training error in subsequent training iterations.
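The overtraining guard of the fine-tuning step B can be sketched as a simple stopping rule. This is an illustrative sketch only: `train_step` and `evaluate` are hypothetical stand-ins for the actual DBN update and error measurement, and the patience threshold is our assumption, not a value from the paper.

```python
def train_with_overfit_guard(train_step, evaluate, max_iters=100, patience=3):
    """Run fine-tuning iterations, stopping once the validation error has
    exceeded the training error for `patience` consecutive iterations
    (the overtraining signal of step B). Returns the error history."""
    history = []
    worse_streak = 0
    for _ in range(max_iters):
        train_step()
        train_err, val_err = evaluate()
        history.append((train_err, val_err))
        # Reset the streak whenever validation error drops back below training error.
        worse_streak = worse_streak + 1 if val_err > train_err else 0
        if worse_streak >= patience:
            break
    return history
```

With a stub whose training error keeps falling while the validation error stalls, the loop halts after the validation error has stayed above the training error for `patience` iterations, rather than running to `max_iters`.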
J. Z. Szabó, P. Bakucz • Identification of Nonlinearity in Knocking Vibration Signals of Large Gas Engine by Deep Learning
VI. REFERENCES

… based on flame describing function. Journal of Fluid Mechanics, 615, 139-167.
[11] Palies, P., Schuller, T., Durox, D., & Candel, S. (2011). Modelling of premixed swirling flame transfer functions. Proceedings of the Combustion Institute, 33(2), 2967-2974.
[12] Erhan, D., Bengio, Y., Courville, A., Manzagol, P.-A., Vincent, P., & Bengio, S. (2010). Why does unsupervised pre-training help deep learning? The Journal of Machine Learning Research, 11, 625-660.
[13] Erhan, D., Courville, A., & Bengio, Y. (2010). Understanding representations learned in deep architectures. Département d'Informatique et de Recherche Opérationnelle, University of Montreal, QC, Canada, Tech. Rep. 1355.
[14] Fischer, A., & Igel, C. (2014). Training restricted Boltzmann machines: An introduction. Pattern Recognition, 47(1), 25-39.
[15] Hinton, G., Osindero, S., & Teh, Y.-W. (2006). A fast learning algorithm for deep belief nets. Neural Computation, 18(7), 1527-1554.
[16] Hinton, G. E. (2009). Deep belief networks. Scholarpedia, 4(5), 5947.
[17] Lai, Y.-C., & Tél, T. (2011). Transient Chaos: Complex Dynamics on Finite Time Scales. Applied Mathematical Sciences, Vol. 173. Springer.
[18] Kravaris, C., & Kantor, J. (1990). Geometric methods for nonlinear process control. Ind. Eng. Chem. Res., 29, 2995.
[19] Wu et al. (2015). Control design for residual gas fraction in engine based on stochastic logical dynamics. SICE Annual Conference 2015.