
Stock Price Prediction using LSTM

Project Guide
Mr. S. Raj Kumar
Assistant Professor I
Department of Information Technology
Velammal Engineering College, Chennai

Team Members
Manu Prathap (113217205033)
Sathish Kumar. S (113217205047)
Shanmuga Ganesh. T (113217205049)
Abstract

● In the financial market, a large number of indicators are used to describe the
change of a stock's price, which provides a good data basis for stock price
forecasting. Different stocks are affected by different factors owing to their
industry types and regions. Therefore, it is very important to find a multi-factor
combination suited to a particular stock in order to predict its price. This
project proposes to use a Genetic Algorithm (GA) for feature selection and to develop
an optimized Long Short-Term Memory (LSTM) neural network stock prediction
model. First, we use the GA to obtain a ranking of factor importance. Then, the
optimal combination of factors is obtained from this ranking by
trial and error. Finally, we use the optimal combination of factors and the LSTM
model for stock prediction.
Introduction
● Machine learning has significant applications in stock price prediction. In this project, we
predict the returns on stocks. This is a very complex task with many uncertainties. We
develop this project in two parts:

1. First, we learn how to predict stock prices using the LSTM neural network.

2. Then, we build a dashboard using Plotly Dash/ipynb for stock analysis.

● We are going to build an AI neural network model to predict stock prices. Specifically, we
work with the FAANG stocks.
Literature Survey
       
Author(s) | Title | Year | Algorithm | Disadvantage
Zerin, Atakan | The analysis and forecasting of stock with deep learning | Dec 2020 | Deep Learning | Expensive to train.
Ramit Sawhney, Shivam Agarwal, Arnav Wadhwa and Rajiv Ratn Shah | Deep Attentive Learning for Stock Movement Prediction from Social Media Text and Company Correlations | Nov 2020 | DNN with MAN-SF and GRU | GRU is a shorter version of LSTM.
Jimmy H. Moedjahedy, Reymon Rotikan, Joe Yuan Mambu | Stock price forecasting on Indonesian stocks | May 2020 | Gaussian Process | Loses efficiency in high-dimensional spaces.
Mehar Vijh, Deeksha Chandola, Vinay Anand | Stock Closing Price Prediction using Machine Learning Techniques | September 2019 | Random Forest | Computation can become far more complex compared to other algorithms.
H. Yu, R. Chen, and G. Zhang | A SVM stock selection model within PCA | May 2019 | SVM within PCA | Not suitable for large data sets.
S. Selvin, R. Vinayakumar, E. A. Gopalakrishnan, V. K. Menon, and K. P. Soman | Stock price prediction using LSTM, RNN and CNN-sliding window model | Sep. 2017 | LSTM-based RNN and CNN | Time complexity is high.
PROPOSED SYSTEM
● Accuracy plays an important role in stock market prediction. Although many algorithms
are available for this purpose, selecting the most accurate one continues to be the
fundamental task in getting the best results. In order to achieve this, we have compared
and analysed the performance of various available algorithms such as Logistic
regression, SVM, Random Forest, etc. This involves training the algorithms, executing
them, getting the results, comparing various performance parameters of these
algorithms and finally obtaining the most accurate one.
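The comparison step described above can be sketched as follows. This is a minimal illustration, not the project's actual pipeline: the data is synthetic, and the model settings are scikit-learn defaults.

```python
# Hypothetical sketch of comparing candidate classifiers on an up/down
# stock-movement task. The features and labels here are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))                    # 8 synthetic indicators
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)    # next-day up/down label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "LogisticRegression": LogisticRegression(),
    "SVM": SVC(),
    "RandomForest": RandomForestClassifier(n_estimators=100, random_state=0),
}
# Train each model, score it on the held-out split, keep the best.
scores = {name: accuracy_score(y_te, m.fit(X_tr, y_tr).predict(X_te))
          for name, m in models.items()}
best = max(scores, key=scores.get)
print(best, scores)
```

The same loop extends to any estimator with a `fit`/`predict` interface, which is what makes the "train, execute, compare" procedure straightforward in scikit-learn.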

ADVANTAGE
• A successful prediction will maximize the benefit to the customer.
• In this project, we used stock data of five companies from the Huge Stock Market dataset,
consisting of data ranging from 2011 to 2021, to train different machine learning
algorithms.
• We then compared the accuracy of the different machine learning algorithms and came up
with the most accurate one.
PROPOSED SYSTEM - ARCHITECTURE
RNN
● Recurrent Neural Networks implement the same concept using machines; they have loops
and allow information to persist where traditional neural networks can't.
● However, RNNs suffer from the vanishing gradient problem and cannot handle long
sequences. Long short-term memory (LSTM) networks are designed to handle long-term
dependencies.

LSTM
● Long short-term memory (LSTM) is an artificial recurrent neural network (RNN) architecture
used in deep learning. An LSTM can process an entire sequence of data, for example for
handwriting generation, question answering, speech recognition, and much more.
● Unlike a traditional feed-forward neural network, which passes values sequentially through
each layer of the network, an LSTM has feedback connections that help it remember preceding
information, making it the perfect model for our time series analysis needs.
LSTM - Model

Forget Gate: This gate decides what information should be thrown away or kept.

Input Gate: To update the cell state, we have the input gate.

Output Gate: The output gate decides what the next hidden state should be.
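In the standard LSTM formulation, the three gates and the cell-state update can be written as (with σ the sigmoid function, ⊙ the element-wise product, x_t the input and h_t the hidden state at step t):

```latex
\begin{aligned}
f_t &= \sigma\!\left(W_f\,[h_{t-1}, x_t] + b_f\right) && \text{(forget gate)}\\
i_t &= \sigma\!\left(W_i\,[h_{t-1}, x_t] + b_i\right) && \text{(input gate)}\\
\tilde{C}_t &= \tanh\!\left(W_C\,[h_{t-1}, x_t] + b_C\right) && \text{(candidate cell state)}\\
C_t &= f_t \odot C_{t-1} + i_t \odot \tilde{C}_t && \text{(cell-state update)}\\
o_t &= \sigma\!\left(W_o\,[h_{t-1}, x_t] + b_o\right) && \text{(output gate)}\\
h_t &= o_t \odot \tanh(C_t) && \text{(next hidden state)}
\end{aligned}
```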
Modules

1. Importing libraries - Seaborn, Keras, Sklearn, Numpy, Pandas, Matplotlib.
2. Loading data - Yahoo Finance (online).
3. Reading data - Pandas.
4. Feature extraction - using GA.
5. Model building - Keras LSTM.
6. Results visualization - Matplotlib.
Importing Libraries
We are going to use numpy for scientific operations, pandas to modify our
dataset, matplotlib to visualize the results, sklearn to scale our data, and keras,
a high-level neural network library that works as a wrapper over low-level
libraries such as TensorFlow or Theano.
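A minimal sketch of these imports, with a one-line scaling example (keras and seaborn are imported later where they are used; the Agg backend is set so the script also runs headless):

```python
import numpy as np                               # scientific operations
import pandas as pd                              # dataset handling
import matplotlib
matplotlib.use("Agg")                            # render plots without a display
import matplotlib.pyplot as plt                  # results visualization
from sklearn.preprocessing import MinMaxScaler   # scale data into [0, 1]

# Quick check that scaling works: values map to 0, 0.5 and 1.
scaler = MinMaxScaler()
scaled = scaler.fit_transform(np.array([[10.0], [20.0], [30.0]]))
print(scaled.ravel())
```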

Loading Data
Reading Data

After uploading our data, we need to make a data frame using the pandas library.
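A small stand-in for this step: the column names below are the usual OHLCV layout of a Yahoo Finance export, but the two rows of data are illustrative, not from the project's dataset.

```python
# Read an OHLCV CSV into a DataFrame indexed by date.
from io import StringIO
import pandas as pd

csv_data = StringIO(
    "Date,Open,High,Low,Close,Volume\n"
    "2021-01-04,133.52,133.61,126.76,129.41,143301900\n"
    "2021-01-05,128.89,131.74,128.43,131.01,97664900\n"
)
df = pd.read_csv(csv_data, parse_dates=["Date"], index_col="Date")
print(df["Close"])   # the closing price is the series we will predict
```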

Feature Extraction
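A toy sketch of GA-based feature ranking, in pure NumPy. The fitness function here (least-squares fit quality minus a small size penalty) is a stand-in for the model score the project uses, and all population/generation settings are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_features = 200, 6
X = rng.normal(size=(n_samples, n_features))
# Only features 0 and 2 actually drive the target in this toy data.
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + 0.1 * rng.normal(size=n_samples)

def fitness(mask):
    """Fit quality of a least-squares model on the masked features,
    with a small penalty per selected feature."""
    if not mask.any():
        return -np.inf
    Xm = X[:, mask.astype(bool)]
    coef, *_ = np.linalg.lstsq(Xm, y, rcond=None)
    resid = y - Xm @ coef
    return 1.0 - resid.var() / y.var() - 0.01 * mask.sum()

pop = rng.integers(0, 2, size=(30, n_features))   # binary feature masks
for _ in range(40):                               # generations
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-15:]]       # truncation selection
    cut = rng.integers(1, n_features, size=15)
    children = np.array([np.concatenate([parents[i][:c],
                                         parents[(i + 1) % 15][c:]])
                         for i, c in enumerate(cut)])  # one-point crossover
    flips = rng.random(children.shape) < 0.05     # 5% mutation rate
    children = np.where(flips, 1 - children, children)
    pop = np.vstack([parents, children])

# Rank features by how often they are selected in the final population.
ranking = np.argsort(pop.mean(axis=0))[::-1]
print(ranking)   # the informative features should rank near the top
```

From a ranking like this, the optimal factor combination is then found by trial and error, as the abstract describes: take the top-k factors for increasing k and keep the k that predicts best.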
Model Building

First, we initialize our model as a sequential one with 96 units in the
output's dimensionality. We use return_sequences=True so the LSTM layer
emits its full three-dimensional sequence for the next layer, and input_shape
to match the shape of our dataset. A dropout fraction of 0.2 randomly drops
20% of the units during training. Finally, we add a dense layer with a value
of 1 because we want to output one value.
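The description above can be sketched in Keras as follows. The 50-step window with a single feature is an assumption for illustration; the deck does not state the window size.

```python
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dropout, Dense

model = Sequential([
    Input(shape=(50, 1)),             # 50 timesteps, 1 feature (assumed)
    LSTM(96, return_sequences=True),  # pass the full sequence onward
    Dropout(0.2),                     # randomly drop 20% of the units
    LSTM(96),                         # final LSTM layer returns a vector
    Dropout(0.2),
    Dense(1),                         # one value: the predicted price
])
model.compile(optimizer="adam", loss="mean_squared_error")
model.summary()
```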
Results Visualization
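The final comparison plot can be sketched as below; the two series here are synthetic stand-ins for the actual and predicted closing prices.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")                 # render without a display
import matplotlib.pyplot as plt

days = np.arange(100)
actual = 100 + np.cumsum(np.random.default_rng(0).normal(size=100))
predicted = actual + np.random.default_rng(1).normal(scale=0.5, size=100)

# Overlay actual and predicted closing prices on one axis.
fig, ax = plt.subplots()
ax.plot(days, actual, label="Actual close")
ax.plot(days, predicted, label="Predicted close")
ax.set_xlabel("Day")
ax.set_ylabel("Price")
ax.legend()
fig.savefig("prediction.png")
```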
Data Set
RESULT
Final Graph
Future Improvements

● We can further increase model size, tweak the hyperparameters, use Bidirectional
LSTM and test if the model performs better.

● We can further optimize it by upgrading simple LSTM to LSTM with peephole


architecture.

● We can combine other machine learning algorithms such as Random Forest,


SVM along with LSTM to improve the efficiency of the model.
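The first improvement is a small change to the model-building step: wrapping each LSTM layer in Bidirectional so the network sees every window in both time directions. A sketch, under the same assumed 50-step window:

```python
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Bidirectional, LSTM, Dropout, Dense

model = Sequential([
    Input(shape=(50, 1)),                           # assumed window shape
    Bidirectional(LSTM(96, return_sequences=True)), # forward + backward pass
    Dropout(0.2),
    Bidirectional(LSTM(96)),
    Dropout(0.2),
    Dense(1),                                       # one predicted price
])
model.compile(optimizer="adam", loss="mean_squared_error")
```

Each Bidirectional wrapper doubles the layer's output width (forward and backward states are concatenated), so the model roughly doubles in parameter count.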
Conclusion

With the help of the Genetic Algorithm and LSTM, we were able to predict the stock price of a company with
an accuracy of over 90%, making this one of the best methods to predict the price of any company's
stock.
REFERENCES
● U. F. Siddiqi, S. M. Sait, and O. Kaynak, ‘‘Genetic algorithm for the mutual information-based
feature selection in univariate time series data,’’ IEEE Access, vol. 8, pp. 9597–9609, 2020.

● J. Hu and W. Zheng, ‘‘Multistage attention network for multivariate time series prediction,’’
Neurocomputing, vol. 383, pp. 122–137, Mar. 2020.

● J. Deng and L. Li, ‘‘Application of parameter optimization stochastic forest in stock prediction,’’
Software, vol. 41, no. 1, pp. 178–182, Jan. 2020.

● D. Lv, D. Wang, M. Li, and Y. Xiang, ‘‘DNN models based on dimensionality reduction for stock
trading,’’ Intell. Data Anal., vol. 24, no. 1, pp. 19–45, Feb. 2020.
● D. Wang and J. Fang, ‘‘Research on optimization of big data construction engineering quality
management based on RNN-LSTM,’’ Complexity, vol. 15, no. 2, pp. 15–20, Jun. 2018.

● S. Selvin, R. Vinayakumar, E. A. Gopalakrishnan, V. K. Menon, and K. P. Soman, ‘‘Stock price
prediction using LSTM, RNN and CNN-sliding window model,’’ in Proc. Int. Conf. Adv.
Comput., Commun. Informat. (ICACCI), Beijing, China, pp. 234–239, Sep. 2017.

● A. Tsantekidis, N. Passalis, A. Tefas, J. Kanniainen, M. Gabbouj, and A. Iosifidis, ‘‘Forecasting
stock prices from the limit order book using convolutional neural networks,’’ in Proc. IEEE
19th Conf. Bus. Informat. (CBI), Beijing, China, pp. 10–15, Jul. 2017.

● Y. Kim, J.-H. Roh, and H. Kim, ‘‘Early forecasting of rice blast disease using long short-term
memory recurrent neural networks,’’ Sustainability, vol. 10, no. 2, p. 34, Dec. 2017.
THANK YOU
