
On the Selection of Variables for Quantitative Multi-Elemental LIBS Using Artificial Neural Networks


Danny Luarte a, Jorge Yáñez b and Daniel Sbarbaro a
a Department of Electrical Engineering, University of Concepción, Concepción, Chile
b Department of Analytical Chemistry, University of Concepción, Concepción, Chile
dannyluarte@udec.cl, jyanez@udec.cl, dsbarbar@udec.cl

1. Introduction
• The use of LIBS and Artificial Neural Networks (ANNs) for qualitative elemental analysis of mineralized rocks and soils has been demonstrated to be a very
effective tool for dealing with practical issues such as interference and matrix effects.
• A LIBS spectrum is a high-dimensional signal, and each input of the ANN is associated with a given wavelength of the spectrum. Thus, using the raw
spectrum as the input of an ANN can lead to a model with serious overfitting problems.

• This work explores three approaches to reducing the dimension of the input of an ANN designed to quantify seven components from LIBS spectra.

2. Materials and Methods

• A three-layer ANN was implemented. The input layer receives the LIBS intensities of selected spectral lines. The hidden layer is optimized during the training step. The output layer has as many outputs as the number of components to be quantified.

• The calibration and validation data considered LIBS spectra of 6 certified powders and 25 rock samples [1].

• The ANN was trained using Levenberg-Marquardt with Bayesian regularization. The number of hidden neurons was set to 10.

• The KBest algorithm was implemented with Python's scikit-learn library.

3. Variable Selection Algorithms

• Prior knowledge. One reference line per detectable element is selected. The selection criterion was the strongest line with no overlap with other atomic lines and no significant self-absorption [1].

• Principal Component Analysis. A linear dimensionality reduction technique that transforms the input space into a lower-dimensional space [2]. The selection of lines is based on the strongest lines associated with the most important loading vectors.

• Univariate selection. Lines are found by statistical inference methods that select the strongest relationship with the output variable [3]. The test statistics normally used are the χ2 test and the F-test. This is a univariate algorithm and must therefore be applied separately for each element.

4. Results and Discussion

The use of prior knowledge (P.K.), following the criteria described in Section 3, leads to seven wavelengths, i.e. one wavelength per element [1]. Figure 1 (a) shows the validation results.

The loading vectors associated with the PCA of the training dataset were considered. The strongest lines associated with the most important loading vectors were used as inputs to the ANN, for a total of 25 lines. Figure 1 (b) shows the validation results.

The F-test was applied separately for each element: for each element, the line with the highest F-test score was selected, giving a total of 7 wavelengths. Figure 1 (c) shows the validation results.

All the algorithms performed well, with R2 greater than 0.9, as seen in Table 1. The best-performing algorithm was KBest (R2 0.937), followed by prior knowledge (R2 0.933) and the PCA-based approach (R2 0.916).

Table 1: Quantitative Results

Algorithm   R2      No. inputs
P.K.        0.933   7
PCA         0.916   25
KBest       0.937   7

Figure 1: Regression Plots. (a) P.K., (b) PCA, (c) KBest
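As a rough sketch of how the PCA-based and KBest selection steps could be wired to an ANN with scikit-learn: the spectra below are synthetic stand-ins, and the channel count, informative line positions, and MLP settings are illustrative assumptions, not the actual configuration of this work (in particular, scikit-learn's MLPRegressor uses L2 regularization rather than Levenberg-Marquardt with Bayesian regularization).

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, f_regression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for LIBS data: 31 spectra x 2048 wavelength channels,
# with one target concentration per spectrum (shapes are illustrative only).
n_samples, n_channels = 31, 2048
X = rng.random((n_samples, n_channels))
# Hypothetical "emission lines" at channels 100 and 500 drive the target.
y = 2.0 * X[:, 100] + X[:, 500] + 0.05 * rng.standard_normal(n_samples)

# Univariate selection (KBest): score every channel with an F-test against
# the target, then keep the single strongest channel (one line per element).
kbest = SelectKBest(score_func=f_regression, k=1).fit(X, y)
best_channel = int(np.argmax(kbest.scores_))

# PCA-based selection: keep the 25 channels with the largest absolute
# weight in the leading loading vector of the training spectra.
pca = PCA(n_components=3).fit(X)
loading_channels = np.argsort(np.abs(pca.components_[0]))[-25:]

# Small ANN on the selected lines; 10 hidden neurons as in Section 2.
ann = MLPRegressor(hidden_layer_sizes=(10,), alpha=1e-3,
                   max_iter=2000, random_state=0)
ann.fit(X[:, loading_channels], y)
print("strongest F-test channel:", best_channel)
```

In a real pipeline the same `SelectKBest` scoring would be repeated once per element, since the F-test is univariate and each element has its own concentration target.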

5. Conclusions
1. These results show that the concentrations of the main elements in the analyzed data set can be estimated by using prior knowledge to select
just seven wavelengths.
2. The use of a univariate feature selection algorithm, i.e. KBest, can further enhance the performance without increasing the number of inputs
and without prior knowledge of representative wavelengths for each element.

6. References

[1] V. Motto-Ros et al., "Quantitative multi-elemental laser-induced breakdown spectroscopy using artificial neural networks", Journal of the European Optical Society, 08011, 2008.
[2] I.T. Jolliffe, "Principal Component Analysis", Second Edition, Springer, 2002.
[3] scikit-learn, "Feature Selection: SelectKBest", [Online]. Available: https://scikit-learn.org/stable/modules/generated/sklearn.feature_selection.SelectKBest.html

7. Acknowledgments

This work was supported by Conicyt Project ACM170008. Danny Luarte would also like to thank the Conicyt Scholarship 2017-21170161.
