https://doi.org/10.1007/s13042-019-01054-w
ORIGINAL ARTICLE
Abstract
Extreme learning machine (ELM) is a machine learning method that has been successfully applied to many classification problems. However, when ELM is applied to classification tasks on encrypted data in the cloud, the classification performance is extremely low: because of the data encryption, ELM can hardly extract informative features from the encrypted data for correct classification. Moreover, the trained neural network is unprotected in the cloud environment, which makes the cloud service highly vulnerable to attackers. In this paper, we propose a novel fully homomorphic ELM (Homo-ELM), which performs cloud searching tasks in a fully protected environment without compromising the privacy of users. To demonstrate the effectiveness of our approach, we conduct comprehensive experiments in both cloud and local environments. The experimental results show that Homo-ELM achieves higher accuracy than other machine learning methods in both local and cloud environments.

Keywords Encrypted machine learning · Fully homomorphic encryption · Privacy preservation · Extreme learning machine · Encrypted image classification
International Journal of Machine Learning and Cybernetics
In our previous work [9], we proposed an encrypted image classification method based on the multilayer extreme learning machine (ML-ELM) and applied it to the cloud searching task. ML-ELM unites the advantages of both deep learning and ELM: deep learning can automatically extract features from low-level representations, while ELM [10–14] can achieve high accuracy at an extremely fast learning speed. Moreover, ML-ELM can deal with text data as well as media data such as images and videos.

The results of our previous work show that ML-ELM achieves the best classification performance over encrypted data among the compared methods on cloud searching tasks, but it has the following limitations: (1) ML-ELM can only achieve high accuracy on binary classification problems. When applying ML-ELM to the cloud searching problem, users must train a separate ML-ELM for each searching topic, which is not cost-effective in practice. (2) The trained ML-ELM must be uploaded to the cloud before the searching service can be provided to users, which means the trained ML-ELM may be attacked without the users' permission. Many attacks on deep learning have been reported in recent years [15, 16]. For example, attackers retrieved the parameters and structure information from an unprotected neural network and inferred the training data by providing a meticulously designed attacking example. This is a flagrant threat to cloud-based applications.

Many researchers have focused on privacy-preserving extreme learning machine algorithms in recent years, which aim to avoid disclosing confidential information to unauthorized users. The work [17] presents privacy-preserving protocols for the backpropagation and ELM algorithms when data and weight vectors are horizontally and vertically partitioned among several parties. The papers [18, 19] provide ELM-based privacy-preserving protocols for implementing aware agents on portable/wearable computing devices. The work [20] proposes a privacy-preserving learning scheme over arbitrarily vertically partitioned data between two or more parties. The paper [21] proposes an outsourcing learning model over encrypted data. However, the related work has the following deficiencies: (a) it requires extra communication costs between the client and the cloud, (b) it needs a trusted third party to calculate the output weights and decrypt the encrypted data, and (c) the trained models are totally unencrypted on the cloud.

In this paper, we propose a novel fully homomorphic extreme learning machine (Homo-ELM) that can achieve accuracy on cloud environments as high as in on-premise environments without compromising the privacy of users. Homo-ELM eliminates the limitations mentioned above and achieves high classification performance on binary as well as multi-class classification problems without an extra third party, which means a single Homo-ELM network can support a cloud searching task on many topics. Moreover, the trained Homo-ELM is fully encrypted and protected in the cloud, including its network structure and parameters as well as its outputs. This means Homo-ELM can well preserve the user's privacy in cloud environments.

The contributions of this paper are summarized as follows:

• We propose a novel fully homomorphic extreme learning machine with a polynomial activation function that can achieve high classification accuracy.
• We solve a privacy breach problem by avoiding the deployment of an unencrypted neural network to cloud environments.
• We improve the classification accuracy so that Homo-ELM can deal with the multi-class classification searching problem in cloud environments.

The remainder of the paper is organized as follows: Sect. 2 first introduces the extreme learning machine and fully homomorphic encryption, then describes the proposed Homo-ELM algorithm. Section 3 provides the performance evaluation of Homo-ELM and the benchmark against other machine learning methods. Section 4 introduces a case study to illustrate how to apply Homo-ELM to public cloud storage. Section 5 concludes the paper and plans future work.

2 Homo-ELM: homomorphic extreme learning machine

In this section, we first introduce the preliminaries of ELM and fully homomorphic encryption, and then present the details of the construction of Homo-ELM.

2.1 Extreme learning machine

Extreme learning machine (ELM) [10] is a single-hidden-layer neural network with an extremely fast training speed. ELM does not use gradient descent to update the parameters; instead, it randomly generates the parameters between the input and hidden layers [22–24] and computes the optimal parameters between the hidden and output layers with the least-squares algorithm. Many studies [12, 25–27] show that ELM can achieve high classification performance at an extremely faster learning speed than other machine learning methods, especially when the problem only has a small dataset.

Formally, given a set of N arbitrary distinct samples (x_i, t_i), where i = 1, 2, …, N and x_i is a d-dimensional vector, the output function of ELM with L hidden neurons is represented by
f(x_i) = Σ_{j=1}^{L} β_j h_j(x_i) = h(x_i)β    (1)

where β = [β_1, …, β_L]^T is the output weight between the hidden layer and the output layer, h_j(x) = g(w_j · x + b_j) is the output of the j-th hidden neuron with respect to the input x, the input weights w = [w_1, …, w_L] and biases b = [b_1, …, b_L] are both randomly generated and never updated, and g is the activation function of the hidden layer. Briefly, the matrix form of ELM can be written as

Hβ = T    (2)

where T = [t_1, …, t_N]^T is the matrix of target labels and H = [h(x_1), …, h(x_N)]^T is the matrix of hidden-layer outputs. The output weight β can be calculated by

β = H†T    (3)

where H† is the Moore–Penrose generalized inverse of H.

2.2 Fully homomorphic encryption

Fully homomorphic encryption (FHE) supports an arbitrary number of operations on the encrypted data after the reduction of noise. It introduces a mechanism that first evaluates low-degree polynomial homomorphic operations on a ciphertext and then applies a technique called bootstrapping to refresh the ciphertext for noise reduction. Therefore, FHE enables an arbitrary number of operations in polynomial time on the encrypted data. However, this mechanism introduces extra computation costs that make FHE very inefficient. To improve the efficiency, the paper [34] proposed BGV, a leveled FHE scheme without bootstrapping based on the General Learning with Errors (GLWE) problem, which dramatically improves the computational efficiency and provides 2^λ security against lattice attacks, where λ is the security parameter. We apply BGV in our proposed Homo-ELM for privacy preservation. The basic notations of BGV are presented first, then the BGV algorithms are introduced.
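For concreteness, the ELM training and prediction of Eqs. (1)–(3) can be sketched in a few lines of Python with NumPy. This is an illustrative sketch, not the authors' implementation (the paper's experiments use Matlab); the function names and the choice of tanh as activation are ours.

```python
import numpy as np

# Minimal ELM sketch following Eqs. (1)-(3): random input weights and
# biases, hidden-layer output matrix H, and output weights computed
# with the Moore-Penrose pseudoinverse (Eq. 3).

def elm_train(X, T, L, g=np.tanh, seed=0):
    """X: (N, d) inputs, T: (N, c) one-hot targets, L: hidden neurons."""
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1, 1, size=(X.shape[1], L))  # input weights, never updated
    b = rng.uniform(-1, 1, size=L)                # hidden biases, never updated
    H = g(X @ W + b)                              # hidden-layer outputs
    beta = np.linalg.pinv(H) @ T                  # Eq. (3): beta = H^dagger T
    return W, b, beta

def elm_predict(X, W, b, beta, g=np.tanh):
    return g(X @ W + b) @ beta                    # Eq. (1): f(x) = h(x) beta
```

Because the input-side parameters are random and fixed, the only "training" is the single least-squares solve, which is what gives ELM its fast training speed.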
Fig. 1 Homo-ELM overview: the training stage runs locally, and the prediction stage runs in the cloud on new encrypted (testing) data
2.3 Homo-ELM: homomorphic extreme learning machine

In this subsection, we propose a novel fully homomorphic extreme learning machine (Homo-ELM) by integrating ELM and fully homomorphic encryption with a polynomial activation function. The overview of Homo-ELM is shown in Fig. 1. First, an unencrypted Homo-ELM is learned from a dataset; it is then transformed into an encrypted Homo-ELM through the homomorphic encryption algorithm. When new encrypted data are provided, the encrypted Homo-ELM can give a prediction result. In Fig. 1, red and green colors indicate whether Homo-ELM is protected. As we can see, all the new input and output data are encrypted, as is, eventually, the neural network of Homo-ELM itself. When applying Homo-ELM to the cloud searching task, user privacy can therefore be preserved from start to end. The details will be discussed in Sect. 4.

We illustrate the training and prediction procedures of Homo-ELM using the MNIST dataset¹ as follows. The 60,000 handwritten digits are regarded as the training set; the remaining 10,000 handwritten images are encrypted by BGVEnc as the prediction set (New Encrypted Data). We first use the ELM learning algorithm and the training set to train an unencrypted model, and then use the homomorphic encryption algorithm BGVEnc to encrypt the trained model. In the prediction stage, we put the encrypted testing set into the encrypted Homo-ELM model and get an encrypted class label EncY. Finally, we can use the decryption algorithm BGVDec to decrypt the label and check whether the classification result is correct or not.

2.3.1 Activation function for homomorphic encryption

Generally, the activation functions of ELM are the sigmoid function, the sine function, and the Gaussian function:

Sigmoid function: g(x) = 1/(1 + exp(−x))
Sine function: g(x) = sin(x)
Gaussian function: g(x) = exp(−x²)

However, homomorphic encryption does not support enough mathematical operations to construct the above activation functions; it only supports homomorphic addition, subtraction, and multiplication. Therefore, we need a suitable activation function for Homo-ELM. A polynomial activation function is an ideal choice, since it only involves addition, subtraction, and multiplication. Moreover, the papers [35, 36] prove that ELM with a polynomial activation function does not degrade the generalization capabilities. The general polynomial activation function is as follows:

f(x) = Σ_{n=0}^{N} a_n x^n    (5)

where the coefficients a_n can be real or complex. However, this general polynomial activation function cannot be used in homomorphic encryption, because homomorphic encryption only supports calculation on the integers ℤ. Thus, a modified polynomial activation function for HE is as follows:

¹ http://yann.lecun.com/exdb/mnist.
Homofunc(x) = a_n x^n + a_{n−1} x^{n−1} + ⋯ + a_1 x + a_0    (6)

where n is a positive integer and each a_i (0 ≤ i ≤ n) is restricted to be an integer. Note that polynomial approximations of the sigmoid and other functions cannot be used with HE because of their division operations, for instance sigmoid(x) = 1/2 + (1/4)x − (1/48)x³ + (1/480)x⁵ + O(x⁶) [37]. In Eq. 6, if large values are given to a and n, Homo-ELM can tackle more complex problems, but it requires more computing resources. We need to balance them according to the complexity of the target problem and the available resources.

In the prediction stage, the homomorphic operations BGVAdd() and BGVMult() are used to compute the encrypted parameters. The Homo-ELM prediction algorithm is shown in Algorithm 8. The algorithm first computes the encrypted hidden values encH based on the encrypted parameters and the encrypted test images (only the test sets are required to be encrypted by BGVEnc), and then computes the encrypted prediction results EncYPre. Finally, a user who possesses the private key sk can decrypt EncYPre by BGVDec(param, sk, EncYPre) to obtain the unencrypted prediction results. Note that when encrypting an image, BGVEnc must be repeatedly invoked for each pixel of the image.
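To make the dataflow of the encrypted prediction concrete, the following toy sketch mirrors the structure of the prediction algorithm with Homofunc(x) = x². The BGV calls here are hypothetical placeholders operating on plain integers, not HElib's real API; the point is only that the whole prediction decomposes into homomorphic additions and multiplications, which BGV supports.

```python
# Toy sketch of the encrypted-prediction dataflow. The "BGV" functions
# below are PLACEHOLDERS (plain integer arithmetic), not a real
# cryptographic scheme; they illustrate that prediction with the
# integer polynomial activation Homofunc(x) = x^2 needs only add/mult.

def BGVEnc(m):  return m            # placeholder "encryption"
def BGVDec(c):  return c            # placeholder "decryption"
def BGVAdd(c1, c2):  return c1 + c2
def BGVMult(c1, c2): return c1 * c2

def sum_enc(cs):
    s = cs[0]
    for c in cs[1:]:
        s = BGVAdd(s, c)
    return s

def homo_elm_predict(enc_x, enc_W, enc_b, enc_beta):
    """enc_x: encrypted pixels; enc_W, enc_b, enc_beta: encrypted model."""
    enc_H = []
    for j in range(len(enc_b)):                     # each hidden neuron
        s = enc_b[j]
        for i, xi in enumerate(enc_x):              # w_j . x + b_j
            s = BGVAdd(s, BGVMult(enc_W[i][j], xi))
        enc_H.append(BGVMult(s, s))                 # Homofunc(s) = s^2
    # output layer: encH times beta, again only add/mult
    n_out = len(enc_beta[0])
    return [sum_enc([BGVMult(enc_H[j], enc_beta[j][k])
                     for j in range(len(enc_H))]) for k in range(n_out)]
```

In a real deployment each placeholder would be an HElib/BGV ciphertext operation, and the user would apply the decryption only to the final encrypted prediction result.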
2.3.2 Homo‑ELM training
3 Evaluation
Table 1 Description of databases

Database | Total images | Training images | Test images | Image size | Categories
Iris | 150 | 100 | 50 | 4 × 1 | 3
Land Satellite Image | 6435 | 3217 | 3218 | 6 × 6 | 6
MNIST | 70,000 | 60,000 | 10,000 | 28 × 28 | 10
NIST 19 | 150,000 | 140,000 | 10,000 | 32 × 32 | 26
The four databases vary in the input size of the images, the number of categories to predict, and the dataset size.

Note that NIST Special Database 19 has 810,000 images from 3600 writers in four forms: digits, uppercase letters, lowercase letters, and free-form text. Considering the limitation of computing resources, we use 150,000 lowercase characters as the training dataset and only choose 6000 lowercase characters as the test dataset.

3.2 Experimental results and discussion

The experiments were carried out on a Microsoft Surface laptop with a Core i7 5820K 3.30 GHz CPU, 16 GB RAM, and a 512 GB SSD, using Matlab 2017b and the homomorphic encryption framework HElib.⁵ The main hyperparameters of HElib are the modulus p = 957073583844512077, the polynomial-ring parameter d = 4096, the security parameter λ = 128, and the number of levels L = 16. Note that the input weights of Homo-ELM are randomly generated; therefore, all results are averaged over 10 repeated experiments.

⁵ https://github.com/shaih/HElib.

3.2.1 Analysis of the activation function Homofunc

Table 2 shows the comparison between the Homofunc and Sigmoid activation functions of ELM (500 hidden nodes) on un-encrypted data. We perform this experiment because homomorphic encryption supports only a few mathematical operators and cannot implement common activation functions such as the Sigmoid and Gaussian functions; we must therefore make sure that the modified polynomial activation function supports homomorphic encryption without losing too much accuracy. Moreover, the proposed Homofunc function contains the parameters n and a_i. They cannot be set too large, because the large size of the encrypted data would exceed the maximum machine word length. Therefore, we take Homofunc(x) = x² and x³ as our activation functions, where n = 2 or 3, a_n = 1, and a_i = 0 for 0 ≤ i < n. From the results, we can see that the Homofunc activation function has the closest accuracy to the Sigmoid function. Hence, the results prove that the Homofunc activation function is feasible for homomorphic encryption. Moreover, the relative test accuracies of ELM with Homofunc with respect to the Sigmoid function on the four un-encrypted databases are described in the last two rows of Table 2. They show that the relative test accuracies of the Homofunc(x) = x³ activation function (99.59%, 97.43%, 98.62%, and 97.71%) are better than those of Homofunc(x) = x² (99.45%, 96.11%, 97.97%, and 96.91%). With larger values assigned to a and n, ELM can achieve higher accuracy on those classification tasks.

Table 2 ELM accuracy on unnormalized and unencrypted data

Activation function | Iris | Land satellite | MNIST | NIST19
Sigmoid | 94.31 | 88.95 | 91.28 | 89.37
Homofunc(x) = x² | 93.79 | 85.49 | 89.43 | 86.61
Homofunc(x) = x³ | 93.92 | 86.66 | 90.02 | 87.32
Homofunc(x) = x² | 99.45ᵃ | 96.11ᵃ | 97.97ᵃ | 96.91ᵃ
Homofunc(x) = x³ | 99.59ᵃ | 97.43ᵃ | 98.62ᵃ | 97.71ᵃ

ᵃ Denotes the relative accuracy with respect to the Sigmoid function

3.2.2 Normalization of data and Homo-ELM parameters

In general, normalization maps the input weight ω into [−1, 1], the bias b into [0, 1], and the input data X into [−1, 1]. However, homomorphic encryption only supports calculation on the integers ℤ. Therefore, the parameters of the trained model and the data are normalized to non-negative integers before homomorphic encryption. In detail, ω, β, and X are normalized through the following two steps:

• Feature scaling. A given feature x is first normalized into the range [0, 1] by the Min–Max normalization method:

x′ = (x − Min(x)) / (Max(x) − Min(x))    (7)

• Scaling and rounding. Then, we scale x′ and round it to the nearest whole number:

x_norm = round(x′ × α)    (8)

where α is a multiple of 10. Note that the larger the value of α chosen, the more information is preserved. We need to choose this value according to the size of the input dataset and the available computing resources.
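The two normalization steps of Eqs. (7) and (8) can be sketched as follows. This is an illustrative Python sketch (the function name and NumPy usage are ours, not the paper's implementation):

```python
import numpy as np

# Two-step normalization of Sect. 3.2.2: Min-Max scaling into [0, 1]
# (Eq. 7), then scaling by a factor alpha (a multiple of 10) and
# rounding to the nearest integer (Eq. 8), so that the values become
# integers compatible with homomorphic encryption.

def normalize_for_he(x, alpha=100):
    x = np.asarray(x, dtype=float)
    x_scaled = (x - x.min()) / (x.max() - x.min())   # Eq. (7)
    return np.rint(x_scaled * alpha).astype(int)     # Eq. (8)
```

For example, `normalize_for_he([0., 5., 10.], alpha=100)` maps the feature to the integers `[0, 50, 100]`; a larger α preserves more of the original precision at the cost of larger encrypted values.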
We normalize the input weight ω, the bias, and the input data X into the ranges 0 < ω < 100, 0 < bias < 10000, and 0 < X < 100, respectively. The comparison between the Homofunc and Sigmoid activation functions of ELM (500 hidden nodes) on normalized but un-encrypted data is shown in Table 3. It shows that ELM based on the sigmoid function has accuracies too low to be used in practice, while ELM based on the Homofunc(x) function can achieve acceptable accuracy.

Table 3 ELM accuracy on normalized but unencrypted data

Activation function | Iris | Land satellite | MNIST | NIST19
Sigmoid | 34.31 | 22.65 | 21.91 | 25.37
Homofunc(x) = x² | 91.79 | 86.30 | 85.70 | 83.41
Homofunc(x) = x³ | 92.12 | 86.96 | 86.82 | 84.92

In addition, we apply the Homofunc function to encrypted data. The results of Homo-ELM with 100 hidden nodes based on different settings of the Homofunc function are shown in Table 4. They demonstrate that different values of a and n affect the accuracy of Homo-ELM: the more complex the given Homofunc function, the higher the accuracy Homo-ELM can achieve. This conforms to the experiments on ELM. Note that we take Homofunc(x) = x² as the activation function in the following experiments because of the limitation of computing resources.

Table 4 Homo-ELM accuracy on normalized and encrypted data

Homofunc(x) | Iris | Land satellite | MNIST | NIST19
x² | 89.55 | 78.62 | 72.27 | 71.89
x³ | 90.21 | 77.23 | 74.59 | 72.16
x² + x + 1 | 91.37 | 79.19 | 73.24 | 73.02

Homo-ELM achieves better accuracy than ELM (over the data encrypted by AES) on all datasets. In detail, Homo-ELM achieves test accuracies of 89.55%, 78.62%, 72.27%, and 71.89% on encrypted data, while ELM achieves accuracies of 81.52%, 72.62%, 62.23%, and 65.81%. In addition, compared to Homo-ELM with 100 hidden nodes on the encrypted datasets, ELM achieves accuracies of 90.72%, 80.16%, 74.49%, and 73.12% over the un-encrypted datasets. The relative accuracy between Homo-ELM (encrypted) and ELM (un-encrypted) is 98.71%, 98.07%, 97.02%, and 98.32%, respectively; the average of those relative accuracies is 98.03%. In short, Homo-ELM can achieve better classification accuracy with high-level security than ELM on encrypted data, and compared to ELM on un-encrypted data, the proposed Homo-ELM can classify encrypted images without much accuracy loss.

3.2.4 Limitation

Fully homomorphic encryption makes computation on encrypted data possible without compromising the privacy of users. However, some challenges remain unsolved when applying it to complex problems in practice: when we set a large polynomial-ring parameter d and modulus p for the homomorphic encryption, the length of the encryption results grows explosively, and the computation time of homomorphic addition and multiplication becomes too long to be acceptable. Therefore, we need to balance them according to the complexity of the target problem and the available resources. Besides, homomorphic matrix multiplication can be executed in parallel, which would greatly save computing time and alleviate this problem; we will address this in future work.
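The relative accuracy used above is simply the ratio of Homo-ELM's accuracy on encrypted data to ELM's accuracy on the corresponding un-encrypted data; the following quick Python check reproduces the figures quoted in this section:

```python
# Relative accuracy = Homo-ELM accuracy on encrypted data divided by
# ELM accuracy on un-encrypted data (values as reported above).
homo_elm_enc = [89.55, 78.62, 72.27, 71.89]   # Homo-ELM, encrypted datasets
elm_plain    = [90.72, 80.16, 74.49, 73.12]   # ELM, un-encrypted datasets

relative = [100 * e / p for e, p in zip(homo_elm_enc, elm_plain)]
average = sum(relative) / len(relative)
# relative is approximately [98.71, 98.08, 97.02, 98.32]; average ~98.03
```

This confirms the claim that, on average, about 98% of the plaintext accuracy is retained when classifying fully encrypted images.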
The data are encrypted on local user devices such as personal computers or mobile devices first, and then the encrypted data and the encrypted Homo-ELM are uploaded to the cloud. Privacy is preserved because all the data and the Homo-ELM neural network are encrypted before being transferred to the cloud. Homo-ELM is a multi-class classifier that can only deal with classification problems; therefore, we need a cloud service that wraps the Homo-ELM classifier to deal with searching problems. On the right side, a searching service is provided to end users. It takes a user's searching query Q as input (note that Q is encrypted before being sent to the cloud) and returns the searching result idx, an index into the data storage, based on the encrypted data and the encrypted Homo-ELM. The details are as follows: for each encrypted data item d of the user u, the encrypted Homo-ELM predicts an encrypted classification label l in the cloud; if the encrypted label l equals the searching query Q, the service returns the index idx of the encrypted data, and the user can then download the target encrypted data to local devices based on the index idx.

5 Conclusion and future work

In this paper, we proposed a novel fully homomorphic extreme learning machine with a newly designed activation function that enables ELM to support cloud searching tasks without compromising the privacy of users. The prediction process of the neural network is fully protected, which means potential privacy breaches are prevented on the public cloud. Moreover, the encrypted data can be directly and accurately searched without the time-consuming steps of decryption. The experiments demonstrate the effectiveness of the proposed Homo-ELM. This work contributes to the usage of machine learning in cloud computing environments.

In the future, we will apply Homo-ELM to more complex cloud tasks on various kinds of devices such as computers, smartphones, and wearable devices. Moreover, we plan to improve the classification performance by designing new polynomial functions to replace non-polynomial functions and to extend Homo-ELM to support more mathematical operators on encrypted data. Furthermore, other learning methods [38, 39] will be considered for improving the classification performance.

Acknowledgements The work presented in this paper was funded by the Multi-Year Research Grant of the University of Macau (No. 2019-00020-FST, 2018-00138-FST) and the Science and Technology Development Fund of Macau SAR (No. 0021/2019/A, 273/2017/A).

References

1. Chi P, Lei C (2018) Audit-free cloud storage via deniable attribute-based encryption. IEEE Trans Cloud Comput 6(2):414–427
2. Wei J, Liu W, Hu X (2018) Secure data sharing in cloud computing using revocable-storage identity-based encryption. IEEE Trans Cloud Comput 6(4):1136–1148
3. Luna JM, Abdallah CT, Heileman GL (2018) Probabilistic optimization of resource distribution and encryption for data storage in the cloud. IEEE Trans Cloud Comput 6(2):428–439
4. Dawn XS, Wagner D, Perrig A (2000) Practical techniques for searches on encrypted data. In: Proceedings 2000 IEEE symposium on security and privacy (S&P 2000), pp 44–55
5. Bösch C, Hartel P, Jonker W, Peter A (2014) A survey of provably secure searchable encryption. ACM Comput Surv 47(2):18:1–18:51
6. Awad A, Matthews A, Qiao Y, Lee B (2018) Chaotic searchable encryption for mobile cloud storage. IEEE Trans Cloud Comput 6(2):440–452
7. LeCun Y, Bengio Y, Hinton G (2015) Deep learning. Nature 521(7553):436
8. Schmidhuber J (2015) Deep learning in neural networks: an overview. Neural Netw 61:85–117
9. Wang W, Vong CM, Yang Y, Wong PK (2017) Encrypted image classification based on multilayer extreme learning machine. Multidimens Syst Signal Process 28(3):851–865
10. Huang GB, Zhu QY, Siew CK (2006) Extreme learning machine: theory and applications. Neurocomputing 70(1):489–501
11. Lendasse A, Vong CM, Toh KA, Miche Y, Huang GB (2006) Advances in extreme learning machines (ELM2015). Neurocomputing 261:1–3
12. Wang R, Chow C, Lyu Y, Lee VCS, Kwong S, Li Y, Zeng J (2018) Taxirec: recommending road clusters to taxi drivers using ranking-based extreme learning machines. IEEE Trans Knowl Data Eng 30(3):585–598
13. Rong H, Huang G, Sundararajan N, Saratchandran P (2009) Online sequential fuzzy extreme learning machine for function approximation and classification problems. IEEE Trans Syst Man Cybern Part B (Cybern) 39(4):1067–1072
14. Rong HJ, Jia YX, Zhao GS (2014) Aircraft recognition using modular extreme learning machine. Neurocomputing 128:166–174
15. Phong LT, Aono Y, Hayashi T, Wang L, Moriai S (2018) Privacy-preserving deep learning via additively homomorphic encryption. IEEE Trans Inf Forensics Secur 13(5):1333–1345
16. Zhang Q, Wang C, Wu H, Xin C, Phuong TV (2018) Gelu-net: a globally encrypted, locally unencrypted deep neural network for privacy-preserved learning. In: Proceedings of the twenty-seventh international joint conference on artificial intelligence (IJCAI-18), no 7, pp 3933–3939
17. Samet S, Miri A (2012) Privacy-preserving back-propagation and extreme learning machine algorithms. Data Knowl Eng 79–80:40–61
18. Hashimoto M, Kaneda Y, Zhao Q (2016) An ELM-based privacy preserving protocol for cloud systems. In: 2016 IEEE symposium series on computational intelligence (SSCI), pp 1–6
19. Hashimoto M, Zhao Q (2017) An ELM-based privacy preserving protocol for implementing aware agents. In: 2017 3rd IEEE international conference on cybernetics (CYBCONF), pp 1–6
20. Özgür Çatak F, Mustacoglu AF (2018) CPP-ELM: cryptographically privacy-preserving extreme learning machine for cloud systems. Int J Comput Intell Syst 11:33–44
21. Kuri S, Hayashi T, Omori T, Ozawa S, Aono Y, Phong LT, Wang L, Moriai S (2017) Privacy preserving extreme learning machine using additively homomorphic encryption. In: 2017 IEEE symposium series on computational intelligence (SSCI), pp 1–8
22. Cao W, Wang X, Ming Z, Gao J (2018) A review on neural networks with random weights. Neurocomputing 275:278–287
23. Wang X, Zhang T, Wang R (2019) Noniterative deep learning: incorporating restricted Boltzmann machine into multilayer random weight neural networks. IEEE Trans Syst Man Cybern Syst 49(7):1299–1308
24. Wang X, Wang R, Xu C (2018) Discovering the relationship between generalization and uncertainty by incorporating complexity of classification. IEEE Trans Cybern 48(2):703–715
25. Tu E, Zhang G, Rachmawati L, Rajabally E, Huang G (2018) Exploiting AIS data for intelligent maritime navigation: a comprehensive survey from data to methodology. IEEE Trans Intell Transp Syst 19(5):1559–1582
26. Cui D, Huang GB, Liu T (2018) ELM based smile detection using distance vector. Pattern Recognit 79:356–369
27. Sun F, Huang G, Wu QMJ, Song S, Wunsch DC II (2017) Efficient and rapid machine learning algorithms for big data and dynamic varying systems. IEEE Trans Syst Man Cybern Syst 47(10):2625–2626
28. Huang G, Zhou H, Ding X, Zhang R (2012) Extreme learning machine for regression and multiclass classification. IEEE Trans Syst Man Cybern Part B (Cybern) 42(2):513–529
29. Rivest RL, Adleman L, Dertouzos ML (1978) On data banks and privacy homomorphisms. Found Secure Comput 4(11):169–180
30. Naccache D, Stern J (1998) A new public key cryptosystem based on higher residues. In: Proceedings of the 5th ACM conference on computer and communications security (CCS '98). ACM, New York, NY, pp 59–66
31. Boneh D, Goh EJ, Nissim K (2005) Evaluating 2-DNF formulas on ciphertexts. In: Proceedings of the second international conference on theory of cryptography (TCC'05). Springer, Berlin, pp 325–341
32. Gentry C (2009) Fully homomorphic encryption using ideal lattices. In: Proceedings of the forty-first annual ACM symposium on theory of computing (STOC '09). ACM, New York, NY, pp 169–178
33. van Dijk M, Gentry C, Halevi S, Vaikuntanathan V (2010) Fully homomorphic encryption over the integers. In: Gilbert H (ed) Advances in cryptology—EUROCRYPT 2010. Springer, Berlin, pp 24–43
34. Brakerski Z, Gentry C, Vaikuntanathan V (2014) (Leveled) fully homomorphic encryption without bootstrapping. ACM Trans Comput Theory 6(3):13:1–13:36
35. Liu X, Lin S, Fang J, Xu Z (2015) Is extreme learning machine feasible? A theoretical assessment (part I). IEEE Trans Neural Netw Learn Syst 26(1):7–20
36. Lin S, Liu X, Fang J, Xu Z (2015) Is extreme learning machine feasible? A theoretical assessment (part II). IEEE Trans Neural Netw Learn Syst 26(1):21–34
37. Temurtas F, Gulbag A, Yumusak N (2004) A study on neural networks using Taylor series expansion of sigmoid activation function. In: Computational science and its applications—ICCSA 2004. Springer, Berlin, pp 389–397
38. Pan SJ, Yang Q (2010) A survey on transfer learning. IEEE Trans Knowl Data Eng 22(10):1345–1359
39. Wang R, Wang X, Kwong S, Xu C (2017) Incorporating diversity and informativeness in multiple-instance active learning. IEEE Trans Fuzzy Syst 25(6):1460–1475

Publisher's Note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.