
NEURAL NETWORK TRAINING USING BACK-PROPAGATION ON THE IRIS DATASET

SWE1011 - Soft Computing Project


Review 1

By
Nadhiya.S-17MIS0144
Saranya.P-17MIS0148
Literature Survey:

Authors & Year: Dr. Torsten Alvager, Emeritus Professor of Physics
Methodology or techniques used: SOM technique; the network was trained in 84 seconds.
Advantages: SOM is faster to train than back-propagation.
Issues: SOM has lower accuracy (49.33%) than back-propagation, which reached 94.67%.
Metrics used: Competitive learning, as opposed to error-correction learning (such as back-propagation with gradient descent).

Authors & Year: Seema Singh, Surabi B R, Harini J & Susmitha H, Assistant Professor, 2016
Methodology or techniques used: Statistical and neural-network-based methods: Bayesian classifier, linear classifiers, logistic regression and the back-propagation approach.
Advantages: Obtains 78% accuracy.
Issues: The statistical methods give 22% classification error.
Metrics used: Plants incorrectly identified in the training set and testing set.

Authors & Year: Sudarshan Nandy, DETS, Kalyani University, Kalyani, Nadia, West Bengal
Methodology or techniques used: Gauss-Newton numerical optimization technique.
Advantages: The proposed technique is superior in terms of time taken to converge and memory used.
Issues: The convergence criterion is an important issue for the neural network training algorithm.
Metrics used: Feed-forward neural network.

Authors & Year: Mukul Jain, P. K. Butey and Manu Pratap Singh, 2017
Methodology or techniques used: Softmax probabilistic method; quasi-Newton method.
Advantages: Minimizes the error on the selected instances of the dataset.
Issues: Cross-entropy error.
Metrics used: Sigmoid activation.

Authors & Year: Ms. Prachitore Shekhawat, Prof. Sheetal S. Dhande
Methodology or techniques used: The proposed method is CARS (Class Associative Rules).
Advantages: The class associative rules are used to train the network to perform the classification task, so the system renders an efficient and accurate class based on the prediction attributes.
Issues: Classification datasets often contain many continuous attributes, and mining association rules with continuous attributes is still a research issue.
Metrics used: Data mining, association rule mining.

Authors & Year: Blake and Merz
Methodology or techniques used: Classification of iris plant data with a sensitivity-based RBF network.
Advantages: Output error decreases with respect to the number of vectors.
Issues: Hardware implementation cost increases, which is critical.
Metrics used: Samples from the fixed vector known as the centre for the RBF.

Authors & Year: Cheng-Jian Lin, I-Fang Chung and Cheng-Hung Chen, 2018
Methodology or techniques used: A clustering method called the SCA is proposed.
Advantages: The results show that the average classification accuracy of the EQNFIS model is better than that of other methods; it converges quickly; it uses an online, fast, one-pass self-constructing learning algorithm.
Issues: Black-box format; lacks a systematic way to determine the appropriate model structure.
Metrics used: 25 instances with 4 features from each species were randomly selected.

Authors & Year: R. A. Abdulkadir, Khalipha A. Imam and M. B. Jibril
Methodology or techniques used: Back-propagation neural network training algorithm (multilayer perceptron (MLP) feed-forward network).
Advantages: Provides faster and more efficient training.
Issues: Difficult to model using conventional modelling techniques.
Metrics used: 120 out of the total 150 instances were used.

Authors & Year: Madhusmita Swain, Sanjit Kumar Dash, Sweta Dash and Ayeskanta Mohapatra
Methodology or techniques used: Chebyshev functional link neural network.
Advantages: Gives the best accuracy, ranging from 83.33% to 96.66%; faster in terms of learning speed.
Issues: There is a serious problem where the neural network reduces the error to the extent that it simply memorizes the dataset used in training.
Metrics used: 25 instances each of setosa, versicolor and virginica.

Authors & Year: Ms. Prachite Shekhawat, Prof. Sheetal S. Dhande, 2012
Methodology or techniques used: Neural network associative classification using CBA.
Advantages: Used to classify unseen iris plant data.
Issues: Data mining based on the neural network can only handle numeric data, so character data must be transformed into numeric data.
Metrics used: 3 classes of 50 instances.

Authors & Year: Mr. D. T. Mane, Dr. B.A.M. University, Aurangabad, and Dr. U. V. Kulkarni
Methodology or techniques used: The hybrid ANN-PSO model, in which PSO is used to train the back-propagation neural network.
Advantages: The presented model correctly classifies the IRIS flower dataset patterns and gives 99.3% accuracy; PSO is a powerful stochastic optimization method for non-convex functions.
Issues: It is difficult to design the neural network architecture.
Metrics used: 150 total samples; 120 samples are used for training and testing and 30 samples for validation testing.

Authors & Year: Bhavana Devi, M.Tech scholar, 2015
Methodology or techniques used: Radial basis function network.
Advantages: The radial basis function provides the best results, with 99.225% and 100% testing accuracy.
Issues: Most dominating technique.
Metrics used: 3 classes of 50 instances, 4 attributes.
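Several of the surveyed papers mention the softmax probabilistic output and the cross-entropy error without showing the computation. The short Python sketch below illustrates both for a single flower; the raw score values are made up purely for illustration.

```python
import numpy as np

def softmax(z):
    # Turn raw output scores into class probabilities that sum to 1
    e = np.exp(z - np.max(z))      # subtract the max for numerical stability
    return e / e.sum()

# Hypothetical raw output scores for (setosa, versicolor, virginica)
scores = np.array([2.0, 0.5, -1.0])
probs = softmax(scores)            # roughly [0.79, 0.18, 0.04]

# One-hot target: this flower is actually a setosa
target = np.array([1.0, 0.0, 0.0])

# Cross-entropy error for this single example (about 0.24 here)
cross_entropy = -np.sum(target * np.log(probs))
print(probs, cross_entropy)
```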
Issues in Existing Systems
Classification comes under the supervised learning methods, as the classes are determined before examining the data. All approaches to performing classification assume some knowledge of the data. Usually a training set is used to train the classifier, and testing is then performed to determine the class of new input data. The IRIS data problem is also a classification problem, and the IRIS dataset is one of the best-known databases in neural network applications.
The dataset contains 3 classes of 50 instances each, where each class refers to a type of iris plant. One class is linearly separable from the other two; the latter are not linearly separable from each other. Different techniques are used to solve this classification problem, as surveyed above; a simple train/test split is sketched below.
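A minimal sketch of such a training/testing split, assuming scikit-learn is available, is shown below; the 80/20 ratio and random seed are illustrative choices only.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

# Load the 150 Iris samples (3 classes of 50 instances, 4 attributes each)
X, y = load_iris(return_X_y=True)

# Hold out 20% of the data for testing; stratify so every class is represented
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

print(X_train.shape, X_test.shape)   # (120, 4) (30, 4)
```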
Objectives of the proposed system
• To simplify the classification problem, a neural network is used. Classification of the IRIS dataset involves discovering patterns by examining the petal and sepal sizes of the IRIS plants.
• Using these patterns, the class of unknown data can be predicted more precisely.
• In this work, multilayer feed-forward networks are trained using the back-propagation learning algorithm (a minimal training sketch is given below).
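As a rough illustration of this approach, the sketch below trains a small feed-forward network (sigmoid hidden layer, softmax output, cross-entropy error) on the Iris data with plain batch gradient descent. The hidden-layer size, learning rate and epoch count are arbitrary illustrative choices, not values taken from the reviewed papers.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

X, y = load_iris(return_X_y=True)
X = (X - X.mean(axis=0)) / X.std(axis=0)          # standardise the 4 attributes
Y = np.eye(3)[y]                                  # one-hot encode the 3 classes
X_tr, X_te, Y_tr, Y_te = train_test_split(
    X, Y, test_size=0.2, stratify=y, random_state=0)

rng = np.random.default_rng(0)
n_hidden, lr, epochs = 4, 0.1, 2000               # illustrative hyper-parameters
W1 = rng.normal(0, 0.5, (4, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.5, (n_hidden, 3)); b2 = np.zeros(3)

for _ in range(epochs):
    # Forward pass: sigmoid hidden layer, softmax output layer
    H = sigmoid(X_tr @ W1 + b1)
    P = softmax(H @ W2 + b2)

    # Backward pass: gradients of the mean cross-entropy error
    dZ2 = (P - Y_tr) / len(X_tr)                  # softmax + cross-entropy gradient
    dW2 = H.T @ dZ2; db2 = dZ2.sum(axis=0)
    dZ1 = (dZ2 @ W2.T) * H * (1 - H)              # sigmoid derivative
    dW1 = X_tr.T @ dZ1; db1 = dZ1.sum(axis=0)

    # Gradient-descent weight update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

pred = softmax(sigmoid(X_te @ W1 + b1) @ W2 + b2).argmax(axis=1)
print("test accuracy:", (pred == Y_te.argmax(axis=1)).mean())
```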
Dataset Links
• The dataset contains the following attributes:
1) sepal length in cm
2) sepal width in cm
3) petal length in cm
4) petal width in cm
5) class:
- Iris Setosa
- Iris Versicolour
- Iris Virginica
LINK: http://archive.ics.uci.edu/ml/datasets/Iris.
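The raw CSV file behind the page linked above can be read directly. The sketch below assumes the conventional raw-file path (iris.data) under the UCI repository and that pandas is installed.

```python
import pandas as pd

# Conventional raw-data file under the UCI page linked above (assumed path)
URL = "http://archive.ics.uci.edu/ml/machine-learning-databases/iris/iris.data"
COLUMNS = ["sepal_length_cm", "sepal_width_cm",
           "petal_length_cm", "petal_width_cm", "class"]

iris = pd.read_csv(URL, header=None, names=COLUMNS)
print(iris["class"].value_counts())   # 50 instances of each of the 3 species
```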
References
• https://visualstudiomagazine.com/Articles/2013/09/01/Neural-Network-Training-Using-Back-Propagation.aspx
• http://airccse.org/journal/ijsc/papers/2112ijsc07.pdf
• http://ipasj.org/IIJCS/Volume5Issue8/IIJCS-2017-08-18-18.pdf
• https://mattmazur.com/2015/03/17/a-step-by-step-backpropagation-example/
THANK YOU
