Thanh-Huong Tran et al., International Journal of Advanced Trends in Computer Science and Engineering, 10(3), May - June 2021, 1509 – 1514
2.4 Fast Fourier Transform (FFT)

The Fourier Transform, in particular the Fast Fourier Transform, is one of the most popular spectral analysis tools and is widely used in several induction motor fault detection methods, such as MCSA. Works such as [17][18] have also used this transform to analyze the stator current spectrum in order to diagnose broken rotor bars. From this research, internal faults and broken rotor bars may be detected in the stator current spectrum.

In this paper, we use the Fast Fourier Transform to represent the waveform of the original stator current signal in the frequency domain. The eight frequency components with maximum amplitude are then selected to sample and calculate statistical features. Through FFT-based spectral analysis, the eight features defined in Table 1 are determined.
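The peak-selection step above can be sketched in a few lines of Python, with NumPy standing in for the MATLAB implementation. The synthetic signal, sampling setup, and the particular statistics computed are illustrative assumptions; the actual eight features are those defined in Table 1.

```python
import numpy as np

def fft_peak_features(signal, n_peaks=8):
    """Select the n_peaks frequency components with maximum amplitude
    from the one-sided FFT spectrum and compute simple statistics.
    (The statistics here are illustrative, not the Table 1 features.)"""
    spectrum = np.abs(np.fft.rfft(signal))      # one-sided amplitude spectrum
    peak_idx = np.argsort(spectrum)[-n_peaks:]  # indices of the largest amplitudes
    peaks = spectrum[peak_idx]
    return {
        "mean": peaks.mean(),
        "std": peaks.std(),
        "max": peaks.max(),
        "min": peaks.min(),
    }

# Example: a synthetic "stator current" with two dominant frequency components
t = np.linspace(0.0, 1.0, 1000, endpoint=False)
current = np.sin(2 * np.pi * 50 * t) + 0.3 * np.sin(2 * np.pi * 120 * t)
features = fft_peak_features(current)
```

For the synthetic signal above, the 50 Hz and 120 Hz bins dominate the selected peaks, which is exactly the behaviour exploited when fault-related sidebands appear in the stator current spectrum.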
3. MACHINE LEARNING RESULTS

In this paper, two machine learning classification algorithms, KNN and SVM, are selected, with 12 different classifiers.

3.1 The classification algorithms

SVM is a machine learning method commonly used in data classification and regression, based on statistical analysis and structural risk minimization. The algorithm is able to handle large feature spaces: SVM training is performed in such a way that the dimension of the classified vectors has no significant effect on the performance of the SVM, in contrast to other common classifiers [19]. SVM is also generally suitable for both separable and non-separable data profiles. In such profiles, SVM divides the data into two classes, positive and negative; both classes are then trained to provide information about the classification and to build the hyperplane. Accordingly, the hyperplane maximises the margin of separation between the positive and the negative classes. The soft margin (hyperplane), that is, the smallest distance between the structures of the separable and non-separable data sets, is then used to distinguish the data points.

Kernel functions of SVM are selected for non-linear transformation. For example, a kernel function can convert nonlinearly separable objects into linearly separable ones by mapping them into a higher-dimensional feature space. The selection of an appropriate kernel function is critical in the classification process, as the kernel defines the feature space in which the training set examples are classified. Linear, polynomial, and radial basis kernels are chosen for this task, and their functions are listed in Table 2 below.

Table 2: Common kernel functions

Linear kernel: K(x, y) = x^T y + C
  The linear kernel is the basic kernel function. It is given by the inner product <x, y> plus an optional constant C.

Polynomial kernel: K(x, y) = (alpha x^T y + C)^d
  The polynomial kernel is a non-stationary kernel, suited for problems where all the training data is normalized. The adjustable parameters are the slope alpha, the constant term C, and the polynomial degree d. The most used degrees are d = 2 (quadratic kernel) and d = 3 (cubic kernel), as larger degrees tend to overfit in machine learning problems.

Gaussian (RBF) kernel: K(x, y) = exp(-γ ||x - y||^2)
  In the Gaussian kernel, γ = 1/(2σ^2) is an adjustable parameter that plays a major role in the performance of the kernel, and ||x - y||^2 denotes the squared Euclidean distance between two feature vectors. If γ is over-estimated, the exponential will behave almost linearly, and the higher-dimensional projection will start to lose its non-linear power.

Besides SVM, KNN is one of the simplest machine learning algorithms commonly used to classify data in a learning-based approach. Like SVM, it can handle data with various characteristics, including nonlinear, multimodal, and even non-Gaussian data. In this method, the positions of the training data are kept fixed (K clusters); for each new data sample, the distance between the training data and the query data is measured. The value of K is then adjusted until it becomes stable, and the optimal K value is finally used to classify the input data by transforming an anonymous dataset into a known one.

Ensemble is a superior classifier that combines multiple diverse single classifiers to boost the prediction accuracy. The main idea of this decision fusion method is to construct multiple classifiers with different types of features, then ensemble the classification results obtained by each classifier based on some predefined rules to achieve a final classification result that is better than the result of any single classifier.

3.2 Classification Algorithms

The MATLAB Classification Learner App is a MATLAB application that can train models to classify data using supervised learning. In this paper, two classification algorithms, SVM and KNN, are chosen to perform fault diagnosis:

SVM: linear SVM, quadratic SVM, cubic SVM, fine Gaussian SVM, medium Gaussian SVM, coarse Gaussian SVM.

KNN: fine KNN, medium KNN, coarse KNN, cosine KNN, cubic KNN and weighted KNN.

Table 3 below shows the description of each classifier used in this paper.

Table 3: Classifiers from the MATLAB Apps

Support vector machines (SVM):
  Linear SVM: Makes a simple linear separation between classes, using the linear kernel. The easiest SVM to interpret.
  Quadratic SVM: Uses the quadratic kernel.
  Cubic SVM: Uses the cubic kernel.
  Fine Gaussian SVM: Makes finely detailed distinctions between classes, using the Gaussian kernel with kernel scale set to sqrt(P)/4, where P is the number of predictors.
  Medium Gaussian SVM: Makes fewer distinctions than a fine Gaussian SVM, using the Gaussian kernel with kernel scale set to sqrt(P), where P is the number of predictors.
  Coarse Gaussian SVM: Makes coarse distinctions between the classes, using the Gaussian kernel.
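The three kernel functions listed in Table 2 can be written out directly. A minimal NumPy sketch follows; the parameter values (C, alpha, d, gamma) are illustrative defaults, not values from this paper.

```python
import numpy as np

def linear_kernel(x, y, c=0.0):
    # K(x, y) = x^T y + C  (inner product plus optional constant)
    return np.dot(x, y) + c

def polynomial_kernel(x, y, alpha=1.0, c=1.0, d=2):
    # K(x, y) = (alpha * x^T y + C)^d, with d = 2 (quadratic) or d = 3 (cubic)
    return (alpha * np.dot(x, y) + c) ** d

def gaussian_kernel(x, y, gamma=0.5):
    # K(x, y) = exp(-gamma * ||x - y||^2), where gamma = 1 / (2 * sigma^2)
    return np.exp(-gamma * np.sum((x - y) ** 2))

x = np.array([1.0, 2.0])
y = np.array([2.0, 0.0])
```

Note that the Gaussian kernel always returns 1 for identical inputs and decays toward 0 as the squared Euclidean distance grows, which is the non-linear behaviour the over-estimated γ discussion in Table 2 refers to.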
[Figure: bar chart of the classification accuracy of all algorithms with different loads. Y-axis: accuracy, 0–100%; x-axis: Boosted Trees, Quadratic SVM, Coarse KNN, Cosine KNN, RUSBoosted Trees, Bagged Trees, Cubic SVM, Fine Gaussian SVM, Linear SVM, Weighted KNN, Medium KNN, Subspace KNN, Cubic KNN, Subspace Discriminant.]
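The classifier set compared in these charts can be assembled in a few lines. The sketch below uses scikit-learn as a stand-in for the MATLAB Classification Learner; the kernel degrees, K values, ensemble size, and the synthetic feature matrix are all illustrative assumptions, not the paper's actual data or settings.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.ensemble import BaggingClassifier

# Synthetic stand-in for the 8 FFT-based features and the fault labels
X, y = make_classification(n_samples=200, n_features=8, n_classes=2,
                           random_state=0)

classifiers = {
    "Linear SVM": SVC(kernel="linear"),
    "Quadratic SVM": SVC(kernel="poly", degree=2),
    "Cubic SVM": SVC(kernel="poly", degree=3),
    "Gaussian SVM": SVC(kernel="rbf"),   # fine/medium/coarse differ in kernel scale
    "Fine KNN": KNeighborsClassifier(n_neighbors=1),
    "Weighted KNN": KNeighborsClassifier(n_neighbors=10, weights="distance"),
    "Bagged Trees": BaggingClassifier(n_estimators=30, random_state=0),
}

# 5-fold cross-validated accuracy per classifier, as a rough analogue
# of the per-classifier accuracy bars in the figures
accuracies = {name: cross_val_score(clf, X, y, cv=5).mean()
              for name, clf in classifiers.items()}
```

Comparing the resulting per-classifier accuracies side by side is the same experiment design as the bar charts: one shared feature set, one score per classifier family.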
The results shown in Figures 2 and 3 indicate that there are five best classification functions for both the DWT-method and the FFT-method data: Fine Gaussian SVM, Fine KNN, Weighted KNN, Bagged Trees and Subspace KNN, which give a classification accuracy of almost 100% for all induction motor faults. However, not every algorithm chosen for fault diagnosis is suitable: in the worst case, the classification accuracy of Coarse KNN, Boosted Trees and RUSBoosted Trees is only 12.5%. Further, since we focus on only one type of single fault, the classification accuracy is still not very high; most classifiers reach only about 50–80%.

[Figure: bar chart of the percentages of accuracy (0–120 scale) for Boosted Trees, Bagged Trees, RUSBoosted Trees, Quadratic SVM, Cubic SVM, Fine Gaussian SVM, Medium Gaussian SVM, Fine KNN, Weighted KNN, Medium KNN, Coarse KNN, Cosine KNN, Cubic KNN, Subspace KNN, Subspace Discriminant, Linear SVM.]

ACKNOWLEDGEMENTS
This work was supported in part by the Shanghai Pujiang Program, and in part by the Belt and Road International Cooperation Project.

REFERENCES
[1] K. Kim and A. G. Parlos, "Model-based fault diagnosis of induction motors using non-stationary signal segmentation," Mech. Syst. Signal Process., 2002, doi: 10.1006/mssp.2002.1481.
[2] M. Z. Ali, M. N. S. K. Shabbir, X. Liang, Y. Zhang, and T. Hu, "Machine learning-based fault diagnosis for single- and multi-faults in induction motors using measured stator currents and vibration signals," IEEE Trans. Ind. Appl., vol. 55, no. 3, pp. 2378–2391, 2019, doi: 10.1109/TIA.2019.2895797.
[3] F. Filippetti, G. Franceschini, and C. Tassoni, "Neural networks aided on-line diagnostics of induction motor rotor faults," IEEE Trans. Ind. Appl., 1995, doi: 10.1109/28.395301.
[4] S. Farag, B. K. Lin, T. G. Habetler, and J. H. Schlag, "An unsupervised, on-line system for induction motor fault detection using stator current monitoring," IEEE Trans. Ind. Appl., 1995, doi: 10.1109/28.475698.
[14] W. T. Thomson and M. Fenger, "Current signature analysis to detect induction motor faults," IEEE Ind. Appl. Mag., 2001, doi: 10.1109/2943.930988.
[15] R. G. Bartheld, T. G. Habetler, and F. Kamran, "Motor bearing damage detection using stator current monitoring," IEEE Trans. Ind. Appl., 1995, doi: 10.1109/28.475697.
[16] J. J. Saucedo-Dorantes, M. Delgado-Prieto, R. A. Osornio-Rios, and R. De Jesus Romero-Troncoso, "Multifault diagnosis method applied to an electric machine based on high-dimensional feature reduction," IEEE Trans. Ind. Appl., 2017, doi: 10.1109/TIA.2016.2637307.
[17] P. Shi, Z. Chen, Y. Vagapov, and Z. Zouaoui, "A new diagnosis of broken rotor bar fault extent in three phase squirrel cage induction motor," Mech. Syst. Signal Process., 2014, doi: 10.1016/j.ymssp.2013.09.002.
[18] M. Y. Kaikaa and M. Hadjami, "Effects of the simultaneous presence of static eccentricity and broken rotor bars on the stator current of induction machine," IEEE Trans. Ind. Electron., 2014, doi: 10.1109/TIE.2013.2270216.
[19] S. Pan, T. Han, A. C. C. Tan, and T. R. Lin, "Fault diagnosis system of induction motors based on multiscale entropy and support vector machine with mutual information algorithm," Shock Vib., 2016, doi: 10.1155/2016/5836717.
[20] K. R. Müller, S. Mika, G. Rätsch, K. Tsuda, and B. Schölkopf, "An introduction to kernel-based learning algorithms," IEEE Transactions on Neural Networks, 2001, doi: 10.1109/72.914517.