
Project Title: A Real-Time Air Pollution Monitoring with Prediction and

Safest Path Routing System

Contributions in work:
1. An IoT-envisaged smart city project, beginning with the hardware development of an air
pollution monitoring device using a Raspberry Pi and sensors.
2. A time-series prediction model built with ML/DL techniques on the real-time data
collected from the hardware.
3. A safest-path routing algorithm, analyzed in MATLAB to test and verify the safest-path
results.
4. Development of an Android app as the user interface for the system, built on the Android
Studio platform.
5. Two conference papers accepted (Springer): one on the hardware and another on the
prediction analysis.
6. Journal papers covering the contributions of the M.Tech project are in progress.

Deep Learning Algorithm – Recurrent Neural Networks with Long Short Term
Memory (RNN-LSTM Model):

Recurrent Neural Networks (RNN)


A glaring limitation of Vanilla Neural Networks (and also Convolutional Networks) is that
their API is too constrained: they accept a fixed-sized vector as input (e.g. an image) and
produce a fixed-sized vector as output (e.g. probabilities of different classes). Moreover,
these models perform this mapping using a fixed number of computational steps (e.g. the
number of layers in the model). The core reason recurrent nets are more exciting is that
they allow us to operate over sequences of vectors: sequences in the input, the output, or,
in the most general case, both.
A Recurrent Neural Network (RNN), as shown in the figure, is a type of neural network
in which the output from the previous step is fed as input to the current step. An RNN has a
"memory" that retains information about what has been computed so far. It uses the
same parameters for every input, since it performs the same task on all inputs and hidden
states to produce the output. This keeps the parameter count small, unlike other
neural networks.
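The parameter sharing described above can be illustrated with a minimal sketch (a hypothetical toy example, not the project's actual model): the same weight matrices are applied at every time step, so the parameter count does not grow with sequence length.

```python
import numpy as np

# Minimal RNN recurrence: the same weights (Wxh, Whh, bh) are reused
# at every time step, so parameters stay fixed for any sequence length.
rng = np.random.default_rng(0)
input_size, hidden_size = 3, 4

Wxh = rng.standard_normal((hidden_size, input_size)) * 0.1   # input -> hidden
Whh = rng.standard_normal((hidden_size, hidden_size)) * 0.1  # hidden -> hidden
bh = np.zeros(hidden_size)

def rnn_step(x, h_prev):
    """One recurrence: new hidden state from current input and previous state."""
    return np.tanh(Wxh @ x + Whh @ h_prev + bh)

h = np.zeros(hidden_size)
sequence = rng.standard_normal((5, input_size))  # a 5-step input sequence
for x in sequence:
    h = rnn_step(x, h)  # identical parameters applied at every step
```

The final `h` summarizes the whole sequence, which is the "memory" the text refers to.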

A recurrent neural network (RNN) can be thought of as multiple copies of the same
network, each passing a message to a successor. Because it carries information forward
through time, an RNN is well suited to time-series prediction, where remembering
previous inputs matters. A variant designed to remember over longer spans is the Long
Short Term Memory (LSTM) network, shown in the figure below.

Why LSTM in RNN?

The problem with a plain RNN is its short-term memory: when the data set contains a
long sequence, the network loses the important information from the beginning of the
sequence. It also suffers from the vanishing gradient problem, i.e. the gradient value
becomes too small to contribute to any learning during backpropagation (gradients are the
values that update the weights of the neural network). The LSTM addresses this
short-term-memory problem with internal mechanisms called gates that
regulate the flow of information. Applications where these networks are useful include
speech synthesis, speech recognition, and text generation.
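The vanishing gradient mentioned above can be shown with a toy calculation (an illustrative sketch, not part of the project's code): backpropagating through many tanh recurrences multiplies the gradient by factors smaller than 1, so the influence of early time steps shrinks toward zero.

```python
import numpy as np

# Toy vanishing-gradient illustration: a scalar "RNN" h_t = tanh(w * h_{t-1}).
# By the chain rule, the gradient through each step picks up a factor
# w * (1 - h_t**2), which is below 1 here, so the product decays.
w = 0.5          # recurrent weight with |w| < 1
h, grad = 0.1, 1.0
grads = []
for t in range(50):
    h = np.tanh(w * h)
    grad *= w * (1.0 - h ** 2)  # chain-rule factor for one tanh step
    grads.append(abs(grad))
```

After 50 steps the accumulated gradient is vanishingly small, which is why weight updates stop reflecting the start of a long sequence.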

The process of the RNN-LSTM model, as shown in the figure below, is as follows:


• In the process, the RNN passes the previous hidden state to the next step of the
sequence. The current input and the previous hidden state are then combined into a
single vector.

• This vector passes through a tanh activation, and the output becomes the new memory
of the network. The tanh activation regulates the values flowing through the
network to the range -1 to 1.

• The gates in the LSTM decide which information to keep or forget in the cell
state during training. Gates use a sigmoid activation, which regulates values between
0 and 1; this is how the forget gate updates or discards data.

• The input gate updates the cell state: the tanh output is multiplied by the sigmoid
output to regulate what new information enters the network.

• The cell state is pointwise multiplied by the forget vector, and the output of the
input gate is then pointwise added, updating the cell state to new values relevant
to the network.

• Finally, the output gate decides what the next hidden state should be, which is
used for predictions. The updated cell state is passed through a tanh function,
and the process repeats at the next step of the LSTM model.
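The steps above can be sketched as a single LSTM cell update in plain Python with NumPy (a minimal illustration of the standard LSTM equations, with hypothetical shapes, not the project's trained model):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM step following the gate description above.
    W: (4*hidden, input+hidden) stacked gate weights; b: (4*hidden,)."""
    hidden = h_prev.shape[0]
    # Combine current input and previous hidden state into one vector.
    z = W @ np.concatenate([x, h_prev]) + b
    f = sigmoid(z[0 * hidden:1 * hidden])  # forget gate, values in (0, 1)
    i = sigmoid(z[1 * hidden:2 * hidden])  # input gate
    g = np.tanh(z[2 * hidden:3 * hidden])  # candidate memory, values in (-1, 1)
    o = sigmoid(z[3 * hidden:4 * hidden])  # output gate
    c = f * c_prev + i * g                 # pointwise multiply, then pointwise add
    h = o * np.tanh(c)                     # next hidden state, used for predictions
    return h, c

rng = np.random.default_rng(1)
input_size, hidden_size = 3, 4
W = rng.standard_normal((4 * hidden_size, input_size + hidden_size)) * 0.1
b = np.zeros(4 * hidden_size)
h, c = np.zeros(hidden_size), np.zeros(hidden_size)
for x in rng.standard_normal((5, input_size)):  # a 5-step sequence
    h, c = lstm_step(x, h, c, W, b)
```

The cell state `c` flows through time with only the gated multiply-and-add, which is what lets the LSTM preserve information over long sequences.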

Skills:
IoT (Internet of things).
Machine Learning.

Deep Learning.

Python/Basic C/OOPS.

Wireless Communication.

Computer Networks.
