
NEURON

Topics

1. Models of a neuron
2. The artificial neuron activation function
3. Decision boundary
4. The artificial neuron learning
5. Network architecture/topology
6. Neural processing

Models of a Neuron

A neuron is an information processing unit consisting of:

- A set of synapses, or connecting links, each characterized by a weight or strength
- An adder that sums the input signals weighted by the synapses (a linear combiner)
- An activation function, also called a squashing function, which squashes (limits) the output to a finite range

Nonlinear model of a neuron (I)

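The adder-plus-activation structure just described can be sketched in a few lines of Python. This is an illustrative sketch; the choice of a logistic sigmoid as the squashing function is an assumption, not taken from the slides:

```python
import math

def neuron(inputs, weights, bias):
    """A single artificial neuron: linear combiner plus squashing function."""
    # Adder: linear combination of inputs weighted by the synapses, plus bias
    v = sum(w * x for w, x in zip(weights, inputs)) + bias
    # Activation (squashing) function: logistic sigmoid limits output to (0, 1)
    return 1.0 / (1.0 + math.exp(-v))

y = neuron([1.0, 0.5], [0.8, -0.4], bias=0.1)
```

With zero weights and bias the sigmoid of 0 gives 0.5, the midpoint of the squashed range.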

Analogy

- Weights represent the strengths of synaptic links
- Wi represents dendrite secretion
- The summation block represents the addition of the secretions
- The output represents the axon voltage

Nonlinear model of a neuron (II)


Types of Activation Function


Activation Functions...

- Threshold: binary threshold units as used in the original neuron model (McCulloch-Pitts model)
- Linear: neurons using a linear activation function are called ADALINEs in the literature (Widrow 1960)
- Sigmoidal functions: functions which more closely describe the non-linear behavior of biological neurons

Types of Activation Function

Identity Function


Types of Activation Function

Binary step function, also known as the Heaviside function

Activation Functions - sigmoid



Activation Function value range

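The activation functions listed above (identity, Heaviside step, logistic sigmoid) and their value ranges can be sketched as follows; the function names and the sigmoid slope parameter a are illustrative choices:

```python
import math

def identity(v):
    # Identity function: output equals the induced local field, range (-inf, inf)
    return v

def heaviside(v):
    # Binary step (Heaviside) function: range {0, 1}
    return 1.0 if v >= 0 else 0.0

def sigmoid(v, a=1.0):
    # Logistic sigmoid with slope parameter a: range (0, 1), value 0.5 at v = 0
    return 1.0 / (1.0 + math.exp(-a * v))
```

Increasing the slope parameter a makes the sigmoid approach the step function.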

Stochastic Model of a Neuron

The models above are deterministic models of ANNs; a stochastic (probabilistic) model can also be defined.

If x denotes the state of a neuron, then P(v) denotes the probability of firing the neuron, where v is the induced activation potential (bias + linear combination of weighted inputs).

Stochastic Model of a Neuron

A pseudo-temperature T is used to control the noise level (and therefore the uncertainty in firing).

When T = 0, the stochastic model reduces to the deterministic model.
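A minimal sketch of the stochastic neuron, assuming the common logistic form P(v) = 1/(1 + exp(-v/T)) for the firing probability (the slides do not spell out the exact form):

```python
import math
import random

def fire_probability(v, T):
    """Probability of firing given activation potential v and pseudo-temperature T."""
    if T == 0:
        # T = 0: the stochastic model reduces to the deterministic threshold unit
        return 1.0 if v > 0 else 0.0
    # Assumed logistic form: higher T flattens the curve (more uncertainty)
    return 1.0 / (1.0 + math.exp(-v / T))

def stochastic_neuron(v, T, rng=random.random):
    # Fire (output 1) with probability P(v); stay quiet (output 0) otherwise
    return 1 if rng() < fire_probability(v, T) else 0
```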

Decision boundaries

- In simple cases the input space can be divided into two regions by drawing a hyperplane across it, known as a decision boundary.
- Discriminant function: a function that returns different values on opposite sides of the boundary (a straight line in two dimensions).
- Problems which can be classified in this way are linearly separable.

E.g. Decision Surface of a Perceptron

For AND(x1,x2), choose weights w0 = -1.5, w1 = 1, w2 = 1.

But functions that are not linearly separable (e.g. XOR) are not representable.
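The quoted AND weights can be checked directly. The helper below is a hypothetical threshold unit in which w0 acts on a constant bias input of 1:

```python
def perceptron(x1, x2, w0=-1.5, w1=1.0, w2=1.0):
    # Decision surface: w0 + w1*x1 + w2*x2 = 0 is a straight line in the input plane
    return 1 if w0 + w1 * x1 + w2 * x2 > 0 else 0

# AND(x1, x2): only the input (1,1) lands on the positive side of the line
and_truth = {(x1, x2): perceptron(x1, x2) for x1 in (0, 1) for x2 in (0, 1)}
# and_truth == {(0,0): 0, (0,1): 0, (1,0): 0, (1,1): 1}

# XOR is not linearly separable: no single line puts (0,1) and (1,0) on one
# side and (0,0) and (1,1) on the other, so no choice of w0, w1, w2 works.
```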

Linear Separability


Rugby players & Ballet dancers


Training the neuron


The artificial neuron learning

Supervised Learning

Unsupervised Learning


Supervised Learning

- The desired response is provided by a teacher, e.g., the distance ρ[d,o] serves as an error measure.
- Estimate the negative error gradient direction and reduce the error accordingly.
- Modify the synaptic weights so as to perform stochastic minimization of the error in multidimensional weight space.

Unsupervised Learning

(Learning without a teacher)

- No teacher provides targets, so no explicit error information can be used to improve network behavior; the network must, e.g., find the cluster boundaries of the input patterns.
- Suitable weight self-adaptation mechanisms have to be embedded in the trained network.

Neuron Learning

(Figure: (b) unsupervised learning)

Training


Simple network


Learning algorithm


Learning algorithm

Epoch: One presentation of the entire training set to the neural network. In the case of the AND function an epoch consists of four sets of inputs being presented to the network (i.e. [0,0], [0,1], [1,0], [1,1]).

Error: The error value is the amount by which the value output by the network differs from the target value. For example, if we required the network to output 0 and it output a 1, then Error = -1.

Learning algorithm

Target value, T: When training a network we not only present it with the input but also with a value that we require the network to produce. For example, if we present the network with [1,1] for the AND function, the target value will be 1.

Output, O: The output value from the neuron.

Ij: Inputs being presented to the neuron.

Wj: Weight from input neuron (Ij) to the output neuron.

LR: The learning rate. This dictates how quickly the network converges. It is set by experimentation, typically 0.1.

Training the neuron



Learning in Neural Networks

- Initialize the network with random weights
- Load a training example's input
- Observe the computed output
- Modify the weights to reduce the difference between computed and target output
- Iterate over all training examples
- Terminate when the weights stop changing OR when the error is very small
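The steps above, together with the Epoch/Error/Target/LR definitions from the learning-algorithm slides, can be sketched as a perceptron trained on the AND function. The update rule Wj ← Wj + LR·Error·Ij is the standard perceptron rule, assumed here since the slides do not spell it out:

```python
import random

def train_and(lr=0.1, max_epochs=100, seed=0):
    rng = random.Random(seed)
    # Initialize the network with random weights (w[0] acts on the bias input 1)
    w = [rng.uniform(-0.5, 0.5) for _ in range(3)]
    # One epoch = all four AND patterns; each sample is ([bias, x1, x2], target)
    data = [([1, 0, 0], 0), ([1, 0, 1], 0), ([1, 1, 0], 0), ([1, 1, 1], 1)]
    for _ in range(max_epochs):
        total_error = 0
        for inputs, target in data:
            o = 1 if sum(wj * ij for wj, ij in zip(w, inputs)) > 0 else 0
            error = target - o                # Error = Target - Output
            total_error += abs(error)
            # Perceptron rule: Wj <- Wj + LR * Error * Ij
            w = [wj + lr * error * ij for wj, ij in zip(w, inputs)]
        if total_error == 0:                  # terminate when the error reaches zero
            break
    return w

w = train_and()
```

Because AND is linearly separable, the loop terminates well within the epoch budget.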

Network Architecture

Single-Layer Feedforward Networks
- an input layer and an output layer
- a single (computation) layer
- feedforward, acyclic

Multilayer Feedforward Networks
- one or more hidden layers, made up of hidden neurons (hidden units)
- enables the network to extract higher-order statistics
- e.g. a 10-4-2 network, a 100-30-10-3 network
- fully connected layered network

Recurrent Networks
- at least one feedback loop
- with or without hidden neurons
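The 10-4-2 notation can be made concrete with a forward pass through a fully connected layered network. This sketch uses random placeholder weights and assumes a logistic activation:

```python
import math
import random

def forward(x, layers):
    """Propagate input x through fully connected layers.
    Each layer is a list of (weights, bias) pairs, one pair per neuron."""
    for layer in layers:
        x = [1.0 / (1.0 + math.exp(-(sum(w * xi for w, xi in zip(ws, x)) + b)))
             for ws, b in layer]
    return x

def make_net(sizes, seed=0):
    # sizes = [10, 4, 2] builds a 10-4-2 network: 10 source (input) nodes,
    # a hidden layer of 4 neurons, and an output layer of 2 neurons
    rng = random.Random(seed)
    return [[([rng.uniform(-1, 1) for _ in range(n_in)], rng.uniform(-1, 1))
             for _ in range(n_out)]
            for n_in, n_out in zip(sizes, sizes[1:])]

net = make_net([10, 4, 2])
y = forward([0.5] * 10, net)   # y has one value per output neuron
```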

Feedforward Networks (static)


Feedforward Networks

- One or more hidden layers.
- Each hidden layer is built from artificial neurons.
- Each element of the preceding layer is connected with each element of the next layer.
- There is no interconnection between artificial neurons of the same layer.
- Finding the weights is a task which has to be done depending on which problem is to be solved by the specific network.

Feedback Networks
(Recurrent or dynamic systems)

Feedback Networks
(Recurrent or dynamic systems)

Such nets are built by introducing feedback connections between ANNs or within them. The Boltzmann machine is an example of a recursive net; it is a generalization of Hopfield nets. Other examples of recursive nets: Adaptive Resonance Theory (ART) nets.

Neural network as directed Graph


Neural network as directed Graph

A neural network can be represented as a signal-flow graph:
- each node is associated with a signal
- each directed link is associated with a transfer function

Two kinds of links:
- Synaptic links: governed by a linear input-output relation; the signal xj is multiplied by the synaptic weight wkj
- Activation links: governed by a nonlinear input-output relation (a nonlinear activation function)

Feedback

- The dynamic behavior of a feedback loop depends on the feedback weight w: the system can be stable, linearly divergent, or exponentially divergent.
- We are interested in the case |w| < 1, which gives infinite memory: the output depends on inputs of the infinite past.
- A NN with a feedback loop is a recurrent network.
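The effect of |w| can be seen by unrolling a single feedback loop y[n] = x[n] + w·y[n-1] (an assumed minimal form): for |w| < 1 the impulse response decays, matching the infinite-memory case, while |w| > 1 diverges exponentially:

```python
def impulse_response(w, steps):
    # Single-loop feedback: y[n] = x[n] + w * y[n-1], driven by a unit impulse at n = 0
    y, out = 0.0, []
    for n in range(steps):
        x = 1.0 if n == 0 else 0.0
        y = x + w * y
        out.append(y)
    return out

stable = impulse_response(0.5, 10)     # |w| < 1: y[n] = 0.5**n decays toward 0
divergent = impulse_response(2.0, 10)  # |w| > 1: y[n] = 2**n grows without bound
```

Every past input keeps contributing a term w**k, which is why |w| < 1 yields a convergent (infinite-memory) response.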

Neural Processing

Recall
- The process of computation of an output o for a given input x performed by the ANN.
- Its objective is to retrieve the information, i.e., to decode the stored content which must have been encoded in the network previously.

Autoassociation
- When a network is presented with a pattern similar to a member of the stored set, autoassociation associates the input pattern with the closest stored pattern.
- Typical use: reconstruction of incomplete or noisy images.
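Autoassociative recall can be illustrated with a nearest-stored-pattern sketch. Matching by Hamming distance is a stand-in for what a trained autoassociative network computes, not the method on the slides:

```python
def recall(probe, stored):
    """Return the stored binary pattern closest to the (possibly noisy) probe."""
    def hamming(p, q):
        # Number of positions where the two patterns disagree
        return sum(a != b for a, b in zip(p, q))
    return min(stored, key=lambda s: hamming(probe, s))

stored = [(1, 1, 1, 0, 0), (0, 0, 1, 1, 1)]
noisy = (1, 0, 1, 0, 0)           # first pattern with one bit flipped
restored = recall(noisy, stored)  # -> (1, 1, 1, 0, 0): the incomplete
                                  #    pattern is mapped to the closest stored one
```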

Neural Processing

Heteroassociation
- The network stores pairs of patterns; an input pattern is associated with the other member of its stored pair.

Neural Processing

Classification
- A set of patterns is already divided into a number of classes, or categories.
- When an input pattern is presented, the classifier recalls the information regarding the class membership of the input pattern.
- The classes are expressed by discrete-valued output vectors, thus the output neurons of the classifier employ binary activation functions.
- Classification is a special case of heteroassociation.

Neural Processing

Recognition
- The case where the desired response is the class number, but the input pattern doesn't exactly correspond to any of the patterns in the stored set.

Neural Processing

Clustering
- Unsupervised classification of patterns/objects, performed without providing information about the actual classes.
- The network must discover for itself any existing patterns, regularities, separating properties, etc.
- While discovering these, the network undergoes changes of its parameters; this is called self-organization.
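Self-organization can be illustrated with a minimal competitive-learning (winner-take-all) sketch: each presentation moves the winning prototype toward the input. The learning rate and the two-cluster data are illustrative assumptions:

```python
def competitive_learning(points, prototypes, lr=0.2, epochs=20):
    """Winner-take-all updates: the nearest prototype moves toward each input."""
    protos = [list(p) for p in prototypes]
    for _ in range(epochs):
        for x in points:
            # Find the winning (closest) prototype by squared distance
            win = min(range(len(protos)),
                      key=lambda k: sum((a - b) ** 2 for a, b in zip(protos[k], x)))
            # Self-organization: shift the winner a fraction lr toward the input
            protos[win] = [p + lr * (a - p) for p, a in zip(protos[win], x)]
    return protos

# Two clusters, near (0, 0) and (1, 1); no class labels are provided
points = [(0.0, 0.1), (0.1, 0.0), (0.9, 1.0), (1.0, 0.9)]
protos = competitive_learning(points, [(0.2, 0.2), (0.8, 0.8)])
```

After training, each prototype has drifted into the middle of one cluster: the network has discovered the cluster boundaries without being told the classes.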


Summary

- ANN (in particular the backpropagation-based neural net) is a good approach for complex pattern recognition (e.g. image recognition, forecasting, text retrieval, optimization).
- Less need to determine relevant factors a priori when building a neural network.
- Lots of training data are needed.
- High tolerance to noisy data; in fact, noisy data can enhance post-training performance.
- Difficult to verify or discern learned relationships, even with special knowledge-extraction utilities developed for neural nets.
