
Neural Network
By Vishnu Sankar
Roll no 51
Agenda
• What is Neural Network
• Why it is important
• History
• How it works
• Applications
• Types of Neural Network
– ANN
– CNN
– RNN
– Difference between CNN & RNN
• Forward Propagation in Neural Network
• Backward propagation in Neural Network
• Limitations
• Key Takeaways
What is Neural Network
• Neural networks are computing systems with interconnected nodes
that work much like neurons in the human brain.
• Using algorithms, they can recognize hidden patterns and
correlations in raw data, cluster and classify it, and continuously
learn and improve over time.
• Neural networks can adapt to changing input, so the network
generates the best possible result without redesigning the output
criteria.
History of neural networks

• The first neural network was conceived of by Warren McCulloch and
Walter Pitts in 1943.
• They wrote a seminal paper on how neurons may work and modeled
their ideas by creating a simple neural network using electrical
circuits.
• This breakthrough model paved the way for neural network research
in two areas:
– Biological processes in the brain.
– The application of neural networks to AI.
• AI research advanced when Kunihiko Fukushima developed the
first true multilayered neural network in 1975.
• The original goal of the neural network approach was to
create a system that could solve problems like a human brain.
• However, over time, researchers shifted their focus to using
neural networks to match specific tasks.
• Since then, neural networks have supported diverse tasks,
including computer vision, speech recognition, machine
translation, social network filtering and medical diagnosis.
How it works
• A neural network has many layers.
• Each layer performs a specific function, and the more complex the
network is, the more layers it has.
• That’s why a neural network is also called a multi-layer
perceptron.
• The purest form of a neural network has three layers:
– The input layer
– The hidden layer
– The output layer
• These layers are made up of nodes. 
• The input layer picks up the input signals and transfers them
to the next layer.
• It gathers the data from the outside world. 
• The hidden layer performs all the back-end tasks of
calculation.
• A neural network has at least one hidden layer. 
• The output layer transmits the final result of the hidden
layer’s calculation. 
• You will have to train a neural network with some training
data before you provide it with a particular problem.
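As a rough sketch of these three layers, the snippet below (plain NumPy; the layer sizes, weights and example input are illustrative assumptions, not taken from the slides) passes one input through a hidden layer to the output layer:

```python
# A minimal sketch of the three layers described above, using NumPy.
# Layer sizes, weights, and the example input are illustrative only.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 4))   # input-to-hidden weights (3 features, 4 hidden nodes)
b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1))   # hidden-to-output weights (4 hidden nodes, 1 output)
b2 = np.zeros(1)

x = np.array([0.5, -1.2, 3.0])        # input layer picks up the input signals
hidden = sigmoid(x @ W1 + b1)         # hidden layer performs the calculation
output = sigmoid(hidden @ W2 + b2)    # output layer transmits the final result
print(output)
```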
Why are neural networks important?

• Neural networks are also ideally suited to help people solve
complex problems in real-life situations.
• They can learn and
– Model the relationships between inputs and outputs that are
nonlinear and complex
– Make generalizations and inferences
– Reveal hidden relationships, patterns and predictions
– Model highly volatile data (such as financial time series data)
and the variance needed to predict rare events (such as fraud
detection).
Applications
• Character and voice recognition, as well as natural
language processing.
• Medical and disease diagnosis.
• Financial predictions for stock prices, currency, options,
futures, bankruptcy and bond ratings.
• Process and quality control.
• Electrical load and energy demand forecasting.
• Credit card and Medicare fraud detection.
Types of Neural Network

• Artificial Neural Networks (ANN)
• Convolutional neural networks (CNNs)
• Recurrent neural networks (RNNs) 
Artificial Neural Networks (ANN)

• An artificial neural network (ANN) is a piece of a computing
system designed to simulate the way the human brain analyzes and
processes information.
• It is the foundation of AI and solves problems that would
prove impossible or difficult for humans.
• ANN is also known as a Feed-Forward Neural
network because inputs are processed only in the forward
direction.
• An ANN consists of 3 layers – Input, Hidden and Output.
• The input layer accepts the inputs, the hidden layer processes
the inputs, and the output layer produces the result.
• Essentially, each layer tries to learn certain weights.
• ANN can be used to solve problems related to:
– Tabular data
– Image data
– Text data
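As a hedged illustration of a feed-forward ANN applied to tabular data, the sketch below uses scikit-learn's MLPClassifier; the synthetic dataset, layer size and iteration count are assumptions made for the example:

```python
# Sketch: a small feed-forward ANN on synthetic tabular data.
from sklearn.neural_network import MLPClassifier
import numpy as np

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 5))                 # 200 rows of tabular data, 5 features
y = (X[:, 0] + X[:, 1] > 0).astype(int)       # a simple synthetic label

# Inputs are processed only in the forward direction; each layer learns weights.
clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=500, random_state=0)
clf.fit(X, y)
print(clf.predict(X[:5]))
```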
Convolutional neural networks (CNNs)
• This network consists of one or multiple convolutional layers.
• CNNs contain five types of layers: input, convolution, pooling,
fully connected and output.
• Each layer has a specific purpose, like summarizing,
connecting or activating.
• The convolutional layer present in this network applies a
convolution function to the input before transferring it to the
next layer.
• Due to this, the network has fewer parameters, but it becomes
deeper.
• Convolutional neural networks have popularized
image classification and object detection.
• They have also been applied to other areas, such as natural
language processing and forecasting.
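A minimal sketch of the convolution and pooling layers described above, written in plain NumPy; the image, kernel and sizes are illustrative assumptions:

```python
# Sketch: one convolution layer followed by 2x2 max pooling.
import numpy as np

def conv2d(image, kernel):
    # Slide the kernel over the image and sum the element-wise products.
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

def max_pool(feature_map, size=2):
    # Summarize each 2x2 block by its maximum value.
    h, w = feature_map.shape
    return feature_map[:h - h % size, :w - w % size] \
        .reshape(h // size, size, w // size, size).max(axis=(1, 3))

image = np.random.default_rng(1).random((6, 6))           # a tiny 6x6 "image"
kernel = np.array([[1, 0, -1], [1, 0, -1], [1, 0, -1]])   # edge-detecting filter
features = conv2d(image, kernel)    # convolution layer
pooled = max_pool(features)         # pooling layer summarizes the features
print(pooled.shape)                 # (2, 2)
```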
Recurrent neural networks (RNNs)

• In this network, the output of a layer is saved and transferred
back to the input.
• This way, the nodes of a particular layer remember some
information about the past steps.
• The input to each layer is formed from the weighted sum of the
features.
• The recurrent neural network process begins in the hidden layers.
• Here, each node remembers some of the information of its
antecedent step.
• The model retains some information from each iteration, which it
can use later.
• The system self-learns when its outcome is wrong, and then uses
that information to increase its accuracy.
• The most popular application of RNN is in text-to-speech
technology. 
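A minimal sketch of the recurrent step described above, assuming a single recurrent layer unrolled over a short sequence; all weights, sizes and inputs are illustrative:

```python
# Sketch: a recurrent layer whose hidden state is fed back at each step.
import numpy as np

rng = np.random.default_rng(0)
W_x = rng.normal(size=(3, 4))   # input-to-hidden weights
W_h = rng.normal(size=(4, 4))   # hidden-to-hidden (recurrent) weights
b = np.zeros(4)

sequence = rng.normal(size=(5, 3))   # 5 time steps, 3 features each
h = np.zeros(4)                      # initial hidden state
for x_t in sequence:
    # The previous output is fed back in, so the node remembers past steps.
    h = np.tanh(x_t @ W_x + h @ W_h + b)
print(h)                             # final state summarizes the whole sequence
```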
Difference between CNN & RNN
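• CNNs are feed-forward networks: data moves only from the input
toward the output.
• RNNs feed a layer's output back to its input, so they retain
memory of earlier steps.
• CNNs suit spatial data such as images; RNNs suit sequential data
such as text and speech.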
Forward Propagation in Neural Networks

• Forward propagation calculates and stores intermediate variables
within the computational graph defined by the neural network.
• It proceeds from the input to the output layer.
• Each hidden layer accepts the input data, processes it and passes
it to the successive layer.
• The advantage is that it can represent more complex
functions very easily. 
• It is used in image recognition tasks and natural language
processing.
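A small sketch of forward propagation that stores each intermediate variable for later use by back-propagation; the helper name `forward` and the layer sizes are assumptions made for illustration:

```python
# Sketch: a forward pass that caches every intermediate activation.
import numpy as np

def forward(x, weights, biases):
    cache = [x]                    # store intermediate variables as we go
    a = x
    for W, b in zip(weights, biases):
        a = np.tanh(a @ W + b)     # each layer processes and passes the data on
        cache.append(a)
    return a, cache

rng = np.random.default_rng(0)
weights = [rng.normal(size=(3, 4)), rng.normal(size=(4, 2))]
biases = [np.zeros(4), np.zeros(2)]
y_hat, cache = forward(np.array([1.0, 0.5, -0.5]), weights, biases)
print(y_hat, len(cache))           # final output plus the stored activations
```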
Backward Propagation in Neural Networks

• Back propagation is a training algorithm consisting of 2 steps:
– Feed forward the values.
– Calculate the error and propagate it back to the earlier layers.
• Forward propagation is part of the back-propagation algorithm and
comes before the backward pass.
• Completing multiple loops of forward and backward propagation
leads the network toward accurate results.
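A hedged sketch of the two steps above on a tiny one-hidden-layer network, repeated over many forward/backward loops; the data, sizes and learning rate are illustrative assumptions:

```python
# Sketch: feed forward, compute the error, propagate it back, repeat.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 2))                  # 8 examples, 2 features
y = (X[:, :1] + X[:, 1:] > 0).astype(float)  # simple synthetic target, shape (8, 1)

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)
lr = 0.1

for _ in range(1000):                        # multiple forward/backward loops
    # Step 1: feed forward the values
    h = np.tanh(X @ W1 + b1)
    y_hat = h @ W2 + b2
    # Step 2: calculate the error and propagate it back to earlier layers
    err = y_hat - y                          # output-layer error
    dW2 = h.T @ err / len(X)
    db2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)         # error pushed back through tanh
    dW1 = X.T @ dh / len(X)
    db1 = dh.mean(axis=0)
    # Gradient-descent update makes the next forward pass more accurate
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(np.mean((y_hat - y) ** 2))             # mean squared error after training
```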
Limitations

• Although a neural network can often work effectively with vague
and incomplete information, the outcome can be incomplete as well.
• Neural networks are black boxes, meaning we cannot know
how much each independent variable is influencing the
dependent variables.
• Neural networks depend a lot on training data.
• This can lead to problems with over-fitting and poor generalization.
• It is computationally very expensive and time consuming
to train with traditional CPUs.
• The user has little influence on the function of the
network.
• The outcome of a neural network contains some
uncertainty that isn’t always desirable.
Key Takeaways
• Neural networks provide the ability to build more human-like AI.
• They take rough approximations and hard-coded reactions out of AI
design.
• They still require a lot of fine-tuning during development.
Thank You..
