
2021 International Conference on Neuromorphic Computing (ICNC), Wuhan, China, October 15-17, 2021

Memristor-based Brain-like Reconfigurable Neuromorphic System
1st Le Yang∗, School of Electrical and Information Engineering, Wuhan Institute of Technology, Wuhan, 430000 China, leyangmail@163.com
2nd Zhixia Ding, School of Electrical and Information Engineering, Wuhan Institute of Technology, Wuhan, 430000 China, zxding89@163.com
3rd Hongfei Liu, Wuhan Donghu University, Wuhan, 430000 China, 760100616@qq.com
4th Yanyang Xu, School of Electrical and Information Engineering, Wuhan Institute of Technology, Wuhan, 430000 China, 15549402042@163.com
5th Ting Su, School of Electrical and Information Engineering, Wuhan Institute of Technology, Wuhan, 430000 China, st2496363237@163.com
Abstract—A memristor-based neuromorphic system constructed by mimicking the features of the brain notably improves data processing performance and efficiency. This paper proposes a memristor-based circuit implementation of a brain-like reconfigurable neuromorphic system. The system contains a memristor-based back-propagation neural network, a memristor-based long short-term memory network, and a memristor-based associative memory network. The reconfigurable circuit components can constitute the circuit hardware of the three memristor-based networks according to the task requirements, or release that hardware based on the biological forgetting mechanism along with time. Hence, this neuromorphic system has a dynamic topology, which is similar to the feature of biological neural networks. A case study of the memristor-based brain-like reconfigurable neuromorphic system is presented in the paper. In the case study, the memristor-based back-propagation neural network and the memristor-based long short-term memory network are applied to image recognition and speech recognition, respectively. The recognition results of the two memristor-based networks are input to the memristor-based associative memory network to recall the correlated information. This neuromorphic system has the potential to be applied to the intelligent robot system.

Index Terms—memristor-based neuromorphic system, memristor-based circuit, associative memory, forgetting process, back-propagation neural network, long short-term memory network

The work was supported by the Natural Science Foundation of China under Grants 62176189 and 62106181, the Hubei Province Key Laboratory of Systems Science in Metallurgical Process (Wuhan University of Science and Technology) under Grant Y202002, and the Science Foundation of Wuhan Institute of Technology under Grant K202017.

I. INTRODUCTION

The human brain has tens of billions of neurons, and each neuron connects with other neurons through thousands of synapses. Depending on learning and forgetting mechanisms, the human brain can change its topology dynamically. Furthermore, the human brain can quickly and efficiently complete various fine and complex tasks, and its overall power consumption is less than 20 watts. Therefore, constructing circuit hardware that mimics the human brain has good development and application prospects [1]–[3]. Associative memory and forgetting along time are important biological mechanisms, which have been verified in biological experiments such as Pavlov's associative memory and the Aplysia gill reflex, as well as in the human brain [4].

The memristor is nonvolatile when there is no voltage across it. On the other hand, the voltage on the memristor can change the memristance. Moreover, the memristor can be manufactured at nanoscale. Hence, the memristor is a proper choice for realizing in-memory computing and large-scale parallel computing [5], [6]. A memristor-based neuromorphic system relying on Ohm's law and Kirchhoff's laws can accomplish the corresponding computations immediately, which improves the computation speed remarkably. By mimicking biological features, the memristor-based brain-like neuromorphic system is a promising candidate for dealing with tasks in the way the brain does [7], [8].

The back-propagation (BP) algorithm is one of the most basic and commonly used algorithms for training multi-layer neural networks. It is the basis of many artificial neural network and deep learning algorithms. The circuit design of the memristor-based BP neural network lays the foundation for the construction of memristor-based deep neural networks. Based on the structure of the BP neural network and the excellent properties of the memristor, several memristor-based BP neural networks have been designed to fulfill iris classification, handwritten character recognition, and similar tasks [9]–[13].

The long short-term memory (LSTM) network is a commonly used recurrent neural network. It is widely used in natural language processing, machine translation, speech recognition, etc. Based on the memristor-based crossbar array, the matrix multiplications in the forward propagation of the LSTM network are accomplished on the memristor-based circuit hardware, which greatly improves the data processing speed [14]–[16].

In [17], a memristor-based associative memory network was designed for the first time to simulate Pavlovian associative memory. In subsequent research, the simulated functions of memristor-based associative memory networks have been further expanded, and physical memristors or SPICE simulations are applied to verify the effectiveness of the corresponding circuit designs. The expanded biological functions include the food and bell forgetting processes, the forgetting process along with time, learning that becomes faster as the number of learning times increases, forgetting that becomes slower after multiple rounds of learning, and so on [18]–[21].

As far as we know, there is no study that presents a memristor-based circuit implementation of a brain-like reconfigurable neuromorphic system. Based on associative memory and the forgetting process, and combining the BP neural network and the LSTM network, the memristor-based brain-like reconfigurable neuromorphic system is proposed in this paper. The biological neural network dynamically changes its topology according to learning and forgetting. Similarly, this neuromorphic system can dynamically reconfigure its topology. Furthermore, due to this dynamic feature, the neuromorphic system can constitute a memristor-based network according to the task requirement or release the corresponding circuit hardware. Hence, this neuromorphic system can handle different tasks by reconfiguring its topology. In a case study of the memristor-based brain-like reconfigurable neuromorphic system, the system is utilized to recall the correlated information corresponding to images and voice. Accordingly, this neuromorphic system can be applied to the intelligent robot system.

The rest of the paper is organized as follows. Section II presents the architecture of the memristor-based brain-like reconfigurable neuromorphic system. Section III introduces a case study of the system. Conclusions are drawn in Section IV.
II. THE ARCHITECTURE OF THE MEMRISTOR-BASED BRAIN-LIKE RECONFIGURABLE NEUROMORPHIC SYSTEM

By mimicking the features of biological neural networks, this paper proposes the memristor-based brain-like reconfigurable neuromorphic system. This neuromorphic system can dynamically reconfigure its topology, which is similar to biological neural networks. Furthermore, the system can reconfigure the corresponding memristor-based network to accomplish different tasks. Hence, this neuromorphic system is not constructed for a single task but can fulfill various tasks. The architecture of the memristor-based brain-like reconfigurable neuromorphic system is shown in Fig. 1.

Fig. 1. The architecture of the memristor-based brain-like reconfigurable neuromorphic system.

As shown in Fig. 1, the memristor-based brain-like reconfigurable neuromorphic system contains the memristor-based BP neural network, the memristor-based LSTM network, and the memristor-based associative memory network. In order to implement the memristor-based BP neural network, we need to analyze the computations in the BP neural network first.
One iteration of the BP neural network contains the forward propagation process and the back propagation process. The forward propagation process can be represented as

Z^{m+1} = f(W^{m+1} Z^{m} + b)    (1)

where Z^{m+1} represents the output vector of the (m+1)-th layer, W^{m+1} denotes the synaptic weight matrix of the (m+1)-th layer, and b represents the bias vector. Z^{m} represents the output vector of the m-th layer, which is also the input vector of the (m+1)-th layer. The computations in the back propagation process are given by

W^{m+1}(k+1) = W^{m+1}(k) − α s^{m+1} (Z^{m})^{T}    (2)

b^{m+1}(k+1) = b^{m+1}(k) − β s^{m+1}    (3)

s^{M} = β (d − Z^{M})    (4)

s^{m} = Ḟ^{m} W^{m+1}(k)^{T} s^{m+1}    (5)

where s^{M} is the sensitivity vector of the last layer, Z^{M} is the output vector of the BP neural network, d is the desired output vector, s^{m+1} is the sensitivity vector of the (m+1)-th layer, and Ḟ^{m} is the derivative of the activation function of the m-th layer. As analyzed above, both the forward propagation process and the back propagation process contain multiplications related to the synaptic weight matrix.
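To make the data flow of (1)–(5) concrete, the following NumPy sketch runs a few iterations of a small two-layer BP network in software. The layer sizes, the sigmoid activation, and the learning rates are illustrative assumptions rather than values from the paper, and the usual sign convention (updating in the error-decreasing direction) is folded into the sensitivities.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
# Assumed sizes for illustration: 4 inputs, 8 hidden units, 3 outputs.
W1, b1 = 0.1 * rng.standard_normal((8, 4)), np.zeros((8, 1))
W2, b2 = 0.1 * rng.standard_normal((3, 8)), np.zeros((3, 1))
alpha, beta = 0.1, 0.1                 # assumed weight / bias learning rates

x = rng.standard_normal((4, 1))        # Z^0, the network input
d = np.array([[1.0], [0.0], [0.0]])    # desired output vector d

for _ in range(200):
    # Forward propagation, Eq. (1): Z^{m+1} = f(W^{m+1} Z^m + b).
    Z1 = sigmoid(W1 @ x + b1)
    Z2 = sigmoid(W2 @ Z1 + b2)

    # Sensitivities, Eqs. (4)-(5): last-layer error term, then propagation
    # through the transposed weight matrix and the activation derivative F'.
    s2 = (d - Z2) * Z2 * (1.0 - Z2)
    s1 = Z1 * (1.0 - Z1) * (W2.T @ s2)

    # Updates, Eqs. (2)-(3): outer products with each layer's input vector.
    W2 += alpha * s2 @ Z1.T
    b2 += beta * s2
    W1 += alpha * s1 @ x.T
    b1 += beta * s1

print(float(np.sum((d - Z2) ** 2)))    # squared output error after training
```

The products such as W1 @ x and W2.T @ s2 in this loop are exactly the weight-matrix and transposed-weight-matrix multiplications that the memristor-based crossbar array described next performs in the analog domain.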

The memristor-based crossbar array is a proper choice to accomplish the multiplications related to the synaptic weight matrix on the memristive circuit hardware. The schematic diagram of the memristor-based crossbar array, in which each cell consists of four MOS transistors and one memristor, is shown in Fig. 2. Relying on Ohm's law and Kirchhoff's laws, the memristor-based crossbar array obtains the corresponding output immediately when the input signals are applied. The memristor-based crossbar array can be utilized to represent the synaptic weight matrix of the neural network. Hence, if the memristor-based crossbar array is applied to accomplish the multiplication of the synaptic weight matrix, the computing speed can be accelerated remarkably.

Fig. 2. The schematic diagram of the memristor-based crossbar array.

As shown in Fig. 2, if the input voltage signals are applied to the row lines, the currents on the column lines represent the product of the multiplication by the synaptic weight matrix. This situation is indicated by the blue arrows in the figure, and the computation involving the synaptic weight matrix is fulfilled immediately. On the other hand, when the input signals are applied to the column lines, the currents on the row lines denote the product of the multiplication by the transposed matrix of the synaptic weights. This process is shown by the red arrows in the figure, and the computation involving the transposed matrix of the synaptic weights is likewise accomplished immediately. Accordingly, similar methods can be used to fulfill the computations of the forward process and the back propagation process on the memristive circuit hardware.
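As a purely software analogue of this dual-direction use of the array, the sketch below treats the crossbar as a matrix holding the synaptic weights and exposes the two read-out directions. Mapping signed weights onto non-negative device conductances (for example with differential device pairs) is abstracted away, so this is a behavioural model only, with names invented for the illustration.

```python
import numpy as np

class Crossbar:
    """Behavioural model of a memristor crossbar storing a weight matrix W."""

    def __init__(self, W):
        # Physically W would be programmed as memristor conductances; here it
        # is a plain array, which is enough to show the two multiply directions.
        self.W = np.asarray(W, dtype=float)

    def drive_rows(self, v):
        # Voltages on the row lines, currents read on the column lines:
        # Ohm's law and Kirchhoff's current law give the product W v in one
        # analog step (the "blue arrows" direction of Fig. 2).
        return self.W @ v

    def drive_columns(self, v):
        # Voltages on the column lines, currents read on the row lines:
        # the same array returns the transposed product W^T v (the "red
        # arrows" direction), which is what Eq. (5) of the BP pass needs.
        return self.W.T @ v

xbar = Crossbar([[0.2, -0.5, 0.1],
                 [0.7,  0.3, -0.4]])
print(xbar.drive_rows(np.array([1.0, 0.5, -1.0])))   # weight-matrix product
print(xbar.drive_columns(np.array([0.5, -0.2])))     # transposed product
```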
Fig. 3. The structure diagram of the LSTM network with time series expansion.

Similarly, in order to construct the memristor-based long short-term memory (LSTM) network, the structure of the LSTM is analyzed first. The structure diagram of the LSTM network with time series expansion is shown in Fig. 3. One iteration of the LSTM network contains the forward propagation process and the back-propagation through time (BPTT) process. According to Fig. 3, the computations in the forward propagation process are presented as

K_t = U(s_{K_t}) = U(W_{hK} h_{t−1} + W_{xK} x_t + b_K),  K = f, i, C̃, o    (6)

C_t = f_t ⊙ C_{t−1} + i_t ⊙ C̃_t    (7)

h_t = o_t ⊙ tanh(C_t)    (8)

where W_{hf}, W_{xf}, W_{hi}, W_{xi}, W_{hC̃}, W_{xC̃}, W_{ho}, W_{xo} and b_f, b_i, b_{C̃}, b_o represent the corresponding synaptic weight matrices and bias vectors, respectively. h_{t−1} denotes the hidden state of the last moment, x_t represents the current data vector, U is the sigmoid function or the tanh function, and ⊙ is the Hadamard product.
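Written in NumPy, one time step of (6)–(8) for a single LSTM cell looks as follows; the input and hidden sizes and the random weight initialization are illustrative assumptions, and the candidate state C̃ is stored under the gate key "c".

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(1)
n_in, n_hid = 6, 4                                    # assumed sizes
W_h = {k: 0.1 * rng.standard_normal((n_hid, n_hid)) for k in "fico"}
W_x = {k: 0.1 * rng.standard_normal((n_hid, n_in)) for k in "fico"}
b   = {k: np.zeros(n_hid) for k in "fico"}

def lstm_step(x_t, h_prev, C_prev):
    """One forward step of Eqs. (6)-(8); key 'c' plays the role of C-tilde."""
    gate = {}
    for k in "fico":
        s_k = W_h[k] @ h_prev + W_x[k] @ x_t + b[k]   # pre-activation of Eq. (6)
        gate[k] = np.tanh(s_k) if k == "c" else sigmoid(s_k)
    C_t = gate["f"] * C_prev + gate["i"] * gate["c"]  # Eq. (7), Hadamard products
    h_t = gate["o"] * np.tanh(C_t)                    # Eq. (8)
    return h_t, C_t, gate

h, C = np.zeros(n_hid), np.zeros(n_hid)
for t in range(5):                                    # an assumed 5-step sequence
    h, C, _ = lstm_step(rng.standard_normal(n_in), h, C)
print(h)
```

Each of the eight W_h / W_x products per step is a candidate for one crossbar multiplication on the memristive hardware.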

In the BPTT process of the LSTM network, the errors back-propagate in two directions. On the one hand, the errors back-propagate along the layers of the LSTM network. On the other hand, the errors back-propagate along the time series. The error back propagations along the time series operate only in the current layer. Accordingly, the input or output values in other layers do not affect the calculations of the error back propagations along the time series in the current layer. The loss function of the LSTM network is defined as L. Then, the error back propagation values along the time series at time t−1 are calculated by

δ_t = ∂L/∂h_t    (9)

∂L/∂s_{o_t} = δ_{o,t}^T = δ_t^T ⊙ tanh(C_t) ⊙ o_t ⊙ (1 − o_t)    (10)

∂L/∂s_{f_t} = δ_{f,t}^T = δ_t^T ⊙ o_t ⊙ (1 − tanh(C_t)^2) ⊙ C_{t−1} ⊙ f_t ⊙ (1 − f_t)    (11)

∂L/∂s_{i_t} = δ_{i,t}^T = δ_t^T ⊙ o_t ⊙ (1 − tanh(C_t)^2) ⊙ C̃_t ⊙ i_t ⊙ (1 − i_t)    (12)

∂L/∂s_{C̃_t} = δ_{C̃,t}^T = δ_t^T ⊙ o_t ⊙ (1 − tanh(C_t)^2) ⊙ i_t ⊙ (1 − C̃_t^2)    (13)

δ_{t−1} = ∂L/∂h_{t−1} = W_{hf}^T δ_{f,t} + W_{hi}^T δ_{i,t} + W_{hC̃}^T δ_{C̃,t} + W_{ho}^T δ_{o,t}    (14)

Furthermore, the error back propagation values along the time series at time r can be obtained as

δ_r = ∏_{j=r}^{t−1} (W_{hf}^T δ_{f,j} + W_{hi}^T δ_{i,j} + W_{hC̃}^T δ_{C̃,j} + W_{ho}^T δ_{o,j})    (15)

Then, the gradient values of W_{hf}, W_{hi}, W_{hC̃}, W_{ho}, W_{hy} and b_f, b_i, b_{C̃}, b_o can be represented as

∂L/∂W_{hK} = Σ_{j=1}^{t} δ_{K,j} h_{j−1}^T,  K = f, i, C̃, o    (16)

∂L/∂b_K = Σ_{j=1}^{t} δ_{K,j},  K = f, i, C̃, o    (17)

On the other hand, the errors back-propagate along the layers of the LSTM network. If the current layer is defined as the p-th layer, the previous layer is the (p−1)-th layer. The error back propagation values of the (p−1)-th layer are expressed as

∂L/∂s_{K_t}^{p−1} = δ_{K,t}^{p−1} = (δ_{K,t}^{p})^T · W_{xK}^{p} ⊙ d′_{s_t}^{p−1}    (18)

where d′_{s_t}^{p−1} is the derivative vector of the activation function of the (p−1)-th layer. Moreover, we can get the gradient values of W_{xf}, W_{xi}, W_{xC̃}, W_{xo}:

∂L/∂W_{xK} = δ_{K,t} X_t^T,  K = f, i, C̃, o    (19)

Then, based on the calculated gradient values, the synaptic weight matrices and bias vectors can be updated in one iteration.

As analyzed above, the forward process and the BPTT process of the LSTM network involve several multiplications by the synaptic weight matrices and by their transposed matrices. Hence, using a method similar to the one shown in Fig. 2, accomplishing these computations on the memristive circuit hardware accelerates the computing speed notably.
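Using the same naming as the forward sketch above, the function below collects, for one time step, the gate deltas of (10)–(13), the error passed to the previous step of (14), and the per-step gradient terms that (16)–(17) sum over time and that (19) uses for the input-side weights. It is a plain NumPy illustration of the algebra, not of the circuit, and the dictionary layout is an assumption.

```python
import numpy as np

def bptt_step(delta_t, gate, C_t, C_prev, h_prev, x_t, W_h):
    """Per-step BPTT quantities for one LSTM cell (Eqs. (10)-(14), (16)-(17), (19))."""
    tanh_C = np.tanh(C_t)
    common = delta_t * gate["o"] * (1.0 - tanh_C ** 2)

    d = {}
    d["o"] = delta_t * tanh_C * gate["o"] * (1.0 - gate["o"])        # Eq. (10)
    d["f"] = common * C_prev * gate["f"] * (1.0 - gate["f"])         # Eq. (11)
    d["i"] = common * gate["c"] * gate["i"] * (1.0 - gate["i"])      # Eq. (12)
    d["c"] = common * gate["i"] * (1.0 - gate["c"] ** 2)             # Eq. (13)

    # Error handed to time step t-1, Eq. (14): transposed-weight products.
    delta_prev = sum(W_h[k].T @ d[k] for k in "fico")

    # Per-step gradient contributions; Eqs. (16)-(17) accumulate these over j,
    # and Eq. (19) uses the input-side outer products directly.
    grads = {"W_h": {k: np.outer(d[k], h_prev) for k in "fico"},
             "W_x": {k: np.outer(d[k], x_t) for k in "fico"},
             "b":   {k: d[k].copy() for k in "fico"}}
    return delta_prev, grads
```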
As shown in Fig. 1, the reconfigurable circuit components are utilized to reconfigure the topology of the memristor-based brain-like reconfigurable neuromorphic system. When a task is suited to the BP neural network, the network topology reconfiguration circuit configures the structure of the memristor-based crossbar array to constitute the memristor-based BP neural network. Then, the task can be accomplished after the corresponding training. At the same time, based on the forgetting feature of biology, the memristor-based BP neural network undergoes a forgetting process after the training. During the forgetting process, the memristor-based BP neural network can still handle the task. Once the forgetting process finishes, the circuit hardware of the memristor-based BP neural network is released and becomes shared reconfigurable circuit elements under the control of the network topology reconfiguration circuit.

Similarly, the memristor-based LSTM network can also realize the reconfigurable function under the control of the reconfigurable circuit components.
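The paper specifies this reconfiguration behaviour only at the block-diagram level. Purely as an illustration of the control policy it describes (configure crossbar arrays on demand, keep serving the task during the forgetting window, and release the shared hardware once forgetting completes), a minimal software model could look like this; the class, its fields, and the time constants are invented for the sketch.

```python
class ReconfigurableFabric:
    """Toy model of the shared crossbar pool and the forgetting-driven release."""

    def __init__(self, total_arrays, forgetting_steps):
        self.free_arrays = total_arrays           # shared reconfigurable components
        self.forgetting_steps = forgetting_steps  # assumed forgetting-window length
        self.active = {}                          # network name -> {"arrays", "timer"}

    def configure(self, name, arrays_needed):
        # Network topology reconfiguration: claim arrays and start the countdown.
        if self.free_arrays < arrays_needed:
            raise RuntimeError("not enough free crossbar arrays")
        self.free_arrays -= arrays_needed
        self.active[name] = {"arrays": arrays_needed, "timer": self.forgetting_steps}

    def use(self, name):
        # Running the task again refreshes the forgetting countdown.
        self.active[name]["timer"] = self.forgetting_steps

    def tick(self):
        # One unit of elapsed time: networks whose forgetting process has
        # finished hand their hardware back to the shared pool.
        for name in list(self.active):
            self.active[name]["timer"] -= 1
            if self.active[name]["timer"] <= 0:
                self.free_arrays += self.active[name]["arrays"]
                del self.active[name]

fabric = ReconfigurableFabric(total_arrays=8, forgetting_steps=3)
fabric.configure("bp_image", 3)
fabric.configure("lstm_speech", 4)
for _ in range(3):
    fabric.tick()
print(fabric.free_arrays)    # 8: both networks forgotten and released
```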
The memristor-based associative memory network depends on the associative memory mechanism to correlate the corresponding stimuli. In the memristor-based associative memory network, there are two stimuli: the unconditional stimulus (US) and the conditional stimulus (CS). When the US and the CS are applied to the memristor-based associative memory network at the same time to accomplish the associative learning, a correlation forms between the US and the CS. Then, after the associative learning, a single presentation of the CS activates the memristor-based associative memory network. Moreover, the memristor-based associative memory network undergoes a forgetting process along with time. Namely, even when there is no input stimulus, the memristor-based associative memory network enters a forgetting process after the associative learning, and the synaptic weight of the CS fades along with time.
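A minimal rate-based sketch of this US/CS behaviour is given below: the CS weight is strengthened only when the CS and the US arrive together, the output fires when the weighted input crosses a threshold, and the CS weight decays when no stimulus is present. The learning rate, decay factor, and threshold are illustrative values, not parameters of the proposed circuit.

```python
class AssociativeNeuron:
    """Minimal Pavlovian unit: US always triggers output, CS only after learning."""

    def __init__(self, lr=0.3, decay=0.95, threshold=0.5):
        self.w_cs = 0.0        # plastic CS synaptic weight (memristance analogue)
        self.w_us = 1.0        # fixed US synaptic weight
        self.lr, self.decay, self.threshold = lr, decay, threshold

    def step(self, cs, us):
        out = 1.0 if (self.w_cs * cs + self.w_us * us) >= self.threshold else 0.0
        if cs and us:                     # paired presentation: associative learning
            self.w_cs = min(1.0, self.w_cs + self.lr)
        elif not cs and not us:           # no stimulus: forgetting along time
            self.w_cs *= self.decay
        return out

n = AssociativeNeuron()
for _ in range(4):                        # associative learning phase (CS + US)
    n.step(cs=1, us=1)
print(n.step(cs=1, us=0))                 # CS alone now activates the output -> 1.0
for _ in range(30):                       # idle time: the CS weight fades
    n.step(cs=0, us=0)
print(n.step(cs=1, us=0))                 # after forgetting, CS alone fails -> 0.0
```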

III. A CASE STUDY OF THE MEMRISTOR-BASED BRAIN-LIKE RECONFIGURABLE NEUROMORPHIC SYSTEM

A case study of the memristor-based brain-like reconfigurable neuromorphic system is shown in Fig. 4. This system can be applied to the intelligent robot system, realizing human-machine interaction.

In this case study, the memristor-based BP neural network is utilized to deal with the image recognition task. The preprocessed images are input to the trained memristor-based BP neural network for recognition. The reconfigurable circuit components and the forgetting process control circuit manage the reconfiguration of the system.

Based on the characteristics of the LSTM network, speech recognition is performed on the memristor-based LSTM network. Firstly, Mel cepstrum coefficients are used to extract the features of the speech signals. Next, the extracted features are input into the memristor-based LSTM network to obtain the corresponding phoneme sequence. Then, based on the phoneme sequence, the final recognition result is obtained by using a language model and decoding technology.
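A software skeleton of this three-stage pipeline is sketched below. The feature extractor, the phoneme inventory, the recurrent state update, and the one-entry lexicon are all stand-ins (the real system would run the trained memristor-based LSTM of Section II and a proper language-model decoder); only the order of the stages follows the text.

```python
import numpy as np

PHONEMES = ["a", "p", "l", "e", "-"]                 # toy inventory, '-' = blank
LEXICON = {"apl": "apple"}                           # stand-in for the language model

def extract_mfcc(signal, n_frames=20, n_coeff=13):
    """Placeholder for Mel-cepstrum feature extraction (one vector per frame)."""
    rng = np.random.default_rng(0)
    return rng.standard_normal((n_frames, n_coeff)) + signal.mean()

def phoneme_logits(frames, n_hid=16):
    """Stand-in for the trained memristor-based LSTM of Eqs. (6)-(8)."""
    rng = np.random.default_rng(2)
    W_out = 0.1 * rng.standard_normal((len(PHONEMES), n_hid))
    h, out = np.zeros(n_hid), []
    for x in frames:
        h = np.tanh(0.5 * h + 0.1 * x.sum())         # crude recurrent state update
        out.append(W_out @ h)
    return np.array(out)

def greedy_decode(logits):
    """Collapse repeated frames and blanks into a phoneme sequence."""
    ids = logits.argmax(axis=1)
    seq, prev = [], None
    for i in ids:
        if i != prev and PHONEMES[i] != "-":
            seq.append(PHONEMES[i])
        prev = i
    return "".join(seq)

audio = np.sin(np.linspace(0, 40, 4000))             # stand-in waveform
phones = greedy_decode(phoneme_logits(extract_mfcc(audio)))
print(LEXICON.get(phones, phones))                   # dictionary / language-model lookup
```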
Fig. 4. A case study of the memristor-based brain-like reconfigurable neuromorphic system.

For the memristor-based recall network, 'recognizing the image of the fruit's name' or 'hearing the voice of the fruit's name' are conditional stimuli, and parameter information such as the fruit's shape, color, and texture constitutes the unconditional stimuli. In the initial stage of the memristor-based recall network, 'recognizing the image of the fruit's name' or 'hearing the voice of the fruit's name' cannot recall any parameter information of the fruit. After the conditioned reflex between the conditional and unconditional stimuli is established through associative learning, the memristor-based recall network can recall the shape, color, texture, and other parameter information when 'recognizing the image of the fruit's name' or 'hearing the voice of the fruit's name'.

At the same time, the established conditioned reflex is gradually forgotten along with time. Accordingly, the synaptic weights of the conditional stimuli decrease along with time. If the image or voice of the fruit's name is recognized, the memristor-based network reinforces the conditioned reflex again after outputting the parameter information of the fruit. If a fruit is not recognized for a long time, the corresponding synaptic weights of the conditional stimuli decrease to the critical value. Then, under the control of the reconfigurable circuit components and the forgetting process control circuit, the circuit hardware of the memristor-based recall network is released. Moreover, when a new recall function is needed, the corresponding memristor-based recall network can be reconfigured.
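As an illustration of this reinforce/forget/release policy for a single fruit, the toy model below decays the CS weight at every time step, resets it whenever the fruit is recognized, and marks the hardware as released once the weight falls to the critical value; all constants are invented for the sketch.

```python
class RecallChannel:
    """Toy model of one fruit's recall pathway in the memristor-based recall network."""

    def __init__(self, fruit, weight=1.0, decay=0.02, critical=0.2):
        self.fruit = fruit
        self.weight = weight          # CS synaptic weight after associative learning
        self.decay = decay            # per-time-step forgetting amount (assumed)
        self.critical = critical      # below this the hardware is released
        self.released = False

    def tick(self, recognized):
        if self.released:
            return None
        if recognized:
            # Recalling the fruit reinforces the conditioned reflex again.
            self.weight = 1.0
            return f"{self.fruit}: shape/color/texture recalled"
        # No recognition: the conditioned reflex is gradually forgotten.
        self.weight -= self.decay
        if self.weight <= self.critical:
            self.released = True      # hardware handed back to the shared pool
        return None

apple = RecallChannel("apple")
for step in range(60):
    apple.tick(recognized=(step == 10))   # recognized once early on, then never again
print(apple.released)                      # True: the recall hardware was released
```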
The system in this case study is expected to be used in the intelligent robot system. When a person writes or speaks the fruit's name, the system outputs the correlated parameter information of the fruit. Hence, the system in this case study can realize human-machine interaction and improve the service experience.
IV. CONCLUSIONS

This paper proposes a memristor-based brain-like reconfigurable neuromorphic system. First, the architecture of the system is presented. There are three memristor-based neural networks in the system: the memristor-based BP neural network, the memristor-based LSTM network, and the memristor-based associative memory network. Then, the computations in the BP neural network and the LSTM network are analyzed for the circuit implementation on the memristor-based crossbar array. Based on the associative memory and forgetting features of biology, the system realizes the reconfigurable function. Finally, a case study of the memristor-based brain-like reconfigurable neuromorphic system is presented. In the case study, the recognition results of the image of the fruit's name or the voice of the fruit's name are input to the memristor-based recall network, which outputs the correlated parameter information. This case study has the potential to be used in the intelligent robot system.
REFERENCES

[1] J. Hawkins and S. Blakeslee, On Intelligence: How a New Understanding of the Brain Will Lead to the Creation of Truly Intelligent Machines. Macmillan, 2007.
[2] H. Tang, R. Yan, and K. Tan, "Cognitive navigation by neuro-inspired localization, mapping, and episodic memory," IEEE Transactions on Cognitive and Developmental Systems, vol. 10, no. 3, pp. 751–761, 2018.
[3] H. Tang, W. Huang, A. Narayanamoorthy, and R. Yan, "Cognitive memory and mapping in a brain-like system for robotic navigation," Neural Networks, vol. 87, pp. 27–37, 2017.
[4] P. I. Pavlov, "Conditioned reflexes: an investigation of the physiological activity of the cerebral cortex," Annals of Neurosciences, vol. 17, no. 3, p. 136, 2010.
[5] Y. Zhang, Z. Wang, J. Zhu, Y. Yang, M. Rao, W. Song, and J. Yang, "Brain-inspired computing with memristors: Challenges in devices, circuits, and systems," Applied Physics Reviews, vol. 7, no. 1, p. 011308, 2020.
[6] L. Yang, Z. Zeng, and Y. Huang, "An associative-memory-based reconfigurable memristive neuromorphic system with synchronous weight training," IEEE Transactions on Cognitive and Developmental Systems, vol. 12, no. 3, pp. 529–540, 2020.
[7] J. Sun, J. Han, P. Liu, and Y. Wang, "Memristor-based neural network circuit of Pavlov associative memory with dual mode switching," AEU-International Journal of Electronics and Communications, vol. 129, p. 153552, 2021.
[8] L. Shang, S. Duan, L. Wang, and T. Huang, "SRMC: A multibit memristor crossbar for self-renewing image mask," IEEE Transactions on Very Large Scale Integration (VLSI) Systems, vol. 26, no. 12, pp. 2830–2841, 2018.
[9] Y. Zhang, X. Wang, and E. G. Friedman, "Memristor-based circuit design for multilayer neural networks," IEEE Transactions on Circuits and Systems I: Regular Papers, vol. 65, no. 2, pp. 677–686, 2017.
[10] S. P. Adhikari, C. Yang, H. Kim, and L. O. Chua, "Memristor bridge synapse-based neural network and its learning," IEEE Transactions on Neural Networks and Learning Systems, vol. 23, no. 9, pp. 1426–1435, 2012.
[11] S. Wen, S. Xiao, Z. Yan, Z. Zeng, and T. Huang, "Adjusting learning rate of memristor-based multilayer neural networks via fuzzy method," IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, vol. 38, no. 6, pp. 1084–1094, 2019.
[12] F. Cai, M. Correll, H. Lee, Y. Lim, V. Bothra, Z. Zhang, et al., "A fully integrated reprogrammable memristor-CMOS system for efficient multiply-accumulate operations," Nature Electronics, vol. 2, no. 7, pp. 290–299, 2019.
[13] P. Yao, H. Wu, B. Gao, J. Tang, Q. Zhang, W. Zhang, et al., "Fully hardware-implemented memristor convolutional neural network," Nature, vol. 577, no. 7792, pp. 641–646, 2020.
[14] C. Li, Z. Wang, M. Rao, D. Belkin, W. Song, H. Jiang, et al., "Long short-term memory networks in memristor crossbar arrays," Nature Machine Intelligence, vol. 1, pp. 49–57, 2019.
[15] K. Smagulova, O. Krestinskaya, and A. James, "A memristor-based long short term memory circuit," Analog Integrated Circuits and Signal Processing, vol. 95, pp. 467–472, 2018.
[16] S. Wen, H. Wei, and Y. Yang, "Memristive LSTM network for sentiment analysis," IEEE Transactions on Systems, Man, and Cybernetics, vol. 51, no. 3, pp. 1794–1804, 2021.
[17] Y. V. Pershin and M. Di Ventra, "Experimental demonstration of associative memory with memristive neural networks," Neural Networks, vol. 23, no. 7, pp. 881–886, 2010.
[18] X. Liu, Z. Zeng, and S. Wen, "Implementation of memristive neural network with full-function Pavlov associative memory," IEEE Transactions on Circuits and Systems I: Regular Papers, vol. 63, no. 9, pp. 1454–1463, 2016.
[19] L. Yang, Z. Zeng, Y. Huang, and S. Wen, "Memristor-based circuit implementations of recognition network and recall network with forgetting stages," IEEE Transactions on Cognitive and Developmental Systems, vol. 10, no. 4, pp. 1133–1142, 2018.
[20] L. Wang, H. Li, S. Duan, T. Huang, and H. Wang, "Pavlov associative memory in a memristive neural network and its circuit implementation," Neurocomputing, vol. 171, pp. 23–29, 2016.
[21] Z. Wang and X. Wang, "A novel memristor-based circuit implementation of full-function Pavlov associative memory accorded with biological feature," IEEE Transactions on Circuits and Systems I: Regular Papers, vol. 65, no. 7, pp. 2210–2220, 2018.
