
INTELLIGENT CONTROL SYSTEM

PART 1: Neural Network
Instructor: Dr. Dang Xuan Ba
Email: badx@hcmute.edu.vn

Department of Automatic Control


Content

Chapter 1: Introduction to Neural Networks
Chapter 2: Single-layer Feedforward Neural Networks
Chapter 3: Multi-layer Neural Networks
Chapter 4: RBF Neural Networks
Chapter 5: Several Applications

Part I: Neural Network Presenter: Dr. Dang Xuan Ba 2


CHAPTER 4 – RADIAL-BASIS-FUNCTION
(RBF) NEURAL NETWORK



Chapter 4: RBF Neural Network
1. Motivation

Example 1: Classify two distinguished sets.

Example 2: Design a network to separate the following data into two distinguished classes.

Example 3: Design a network to separate the following data into two distinguished classes:

x1   x2   d = x1 xor x2
0    0    0
0    1    1
1    0    1
1    1    0

Example 4: Build a function representing the following data.

What will we do exactly?


Chapter 4: RBF Neural Network
2. Structure:
[Figure: RBF network structure. Inputs x1, ..., xn feed Gaussian units in the Distribution layer; their outputs pass through the Mixing layer and are combined by linear units into outputs y1, ..., ym.]

Three distinct layers: the Distribution, Mixing, and Output layers.

The distribution layer uses Gaussian functions to encode the input signal into distributed probabilities.
The mixing layer combines the distributed probabilities over the whole input surface.
The output layer uses linear functions for both the integration and activation functions.

Its applications include complex classification, approximation, recognition, and more.
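As a concrete sketch, the three layers above can be written as a single forward pass. This is a minimal illustration in Python/NumPy; the function name, the array shapes, and the use of a bias term in the output layer are assumptions for illustration, not fixed by the slides.

```python
import numpy as np

def rbf_forward(x, centers, sigmas, W):
    """Forward pass of an RBF network.

    x:       input vector, shape (n,)
    centers: Gaussian centers, shape (h, n), one per hidden unit
    sigmas:  Gaussian widths, shape (h,)
    W:       output weights, shape (m, h + 1), +1 for a bias term
    """
    # Distribution layer: Gaussian activation for each hidden unit
    dist2 = np.sum((centers - x) ** 2, axis=1)
    phi = np.exp(-dist2 / (2.0 * sigmas ** 2))
    # Mixing layer output with a constant bias input appended
    phi_b = np.append(phi, 1.0)
    # Output layer: purely linear integration and activation
    return W @ phi_b
```

An input that coincides with a center drives that Gaussian unit to its maximum value of 1, which is how the network localizes its response around the stored centers.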


Chapter 4: RBF Neural Network
3. Learning algorithm:

Supervised learning: applied only to the output layer.

Consider the k-th sample:
Input signal: x = [x1; x2; ...; xn; 1]^T
Labelled desired signal: d
NN output signal: y = f(w, x)

Estimation error: E = sum_{i=1..L} (y_i - d_i)^2

Learning law (back-propagation):
Δw_ok = -η ∇_{w_ok} E = -η (∂E/∂w_ok)^T
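Because the output layer is linear, the gradient in the learning law has a simple closed form: with E = Σ_i (y_i − d_i)² and y = W·φ, we get ∂E/∂W = 2(y − d)φ^T. A minimal sketch of one update step in Python/NumPy follows; the function name and shapes are illustrative assumptions.

```python
import numpy as np

def output_weight_step(W, phi_b, d, eta):
    """One step of the learning law, applied only to the output
    weights: delta_W = -eta * dE/dW with E = sum_i (y_i - d_i)^2.

    W:     output weights, shape (m, h + 1)
    phi_b: mixing-layer output with a bias term appended, shape (h + 1,)
    d:     labelled desired signal, shape (m,)
    eta:   learning rate
    """
    y = W @ phi_b                          # NN output signal (linear layer)
    grad = 2.0 * np.outer(y - d, phi_b)    # dE/dW for the squared error
    return W - eta * grad
```

Each step moves the output weights against the error gradient, so for a small enough η the squared error of the current sample decreases.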



Chapter 4: RBF Neural Network
Implementation (supervised learning):

Initialization: η, E_stop, E = 0, w_0, epoch_max, epoch = 0

For each epoch (epoch++):
  Set E = 0, k = 0
  For each sample k = 1, ..., K (k++):
    Forward computation: net, y
    Error calculation: e_k = d_k - y_k
    E = E + e_k^2
    Updating law: Δw_ok = -η ∇_{w_ok} E = -η (∂E/∂w_ok)^T
  Stop checking (E, epoch): if E < E_stop or epoch ≥ epoch_max, done;
  otherwise continue with the next epoch.
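The flowchart above can be sketched as a plain training loop. This is a hedged illustration in Python/NumPy, assuming the Gaussian centers and widths are fixed in advance by the designer and only the output weights are trained, as the slides state; the function name and default parameter values are assumptions.

```python
import numpy as np

def train_output_layer(X, D, centers, sigmas,
                       eta=0.05, E_stop=1e-4, epoch_max=2000):
    """Follow the flowchart: initialize, then per epoch reset E,
    run forward computation, error calculation, and the updating
    law for every sample; stop when E < E_stop or epoch_max hit.

    X: inputs, shape (K, n); D: desired outputs, shape (K, m)
    """
    m = D.shape[1]
    W = np.zeros((m, len(centers) + 1))        # w_0: initialization
    E = np.inf
    for epoch in range(epoch_max):             # epoch++
        E = 0.0                                # E = 0, k = 0
        for x, d in zip(X, D):                 # k++
            dist2 = np.sum((centers - x) ** 2, axis=1)
            phi_b = np.append(np.exp(-dist2 / (2.0 * sigmas ** 2)), 1.0)
            y = W @ phi_b                      # forward computation
            e = d - y                          # e_k = d_k - y_k
            W += eta * 2.0 * np.outer(e, phi_b)  # updating law
            E += float(e @ e)                  # E = E + e_k^2
        if E < E_stop:                         # stop checking
            break
    return W, E
```

Per-sample (stochastic) updates are used here; accumulating the gradient over the whole epoch before updating would be the batch variant of the same law.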



Chapter 4: RBF Neural Network

Example 1: Design a network to approximate the following data


yd = 4*sin(x)/(x^2 + 2) - 2*cos(sin(0.1*x) + x^2)
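For reference, one way to realize this example: sample the target over a grid, spread Gaussian centers across the input range, and solve the output weights by batch least squares (a closed-form alternative to the iterative update, valid because the output layer is linear). The sampling range, number of centers, and width below are design assumptions, not values from the slides.

```python
import numpy as np

# Target data from the example, sampled on an assumed grid
x = np.linspace(-3, 3, 300)
yd = 4 * np.sin(x) / (x ** 2 + 2) - 2 * np.cos(np.sin(0.1 * x) + x ** 2)

# Distribution layer: Gaussian centers spread over the input range
centers = np.linspace(-3, 3, 60)
sigma = 0.2
Phi = np.exp(-(x[:, None] - centers[None, :]) ** 2 / (2 * sigma ** 2))
Phi = np.hstack([Phi, np.ones((len(x), 1))])   # bias column

# Output-layer weights by linear least squares (batch solution)
w, *_ = np.linalg.lstsq(Phi, yd, rcond=None)
rmse = np.sqrt(np.mean((Phi @ w - yd) ** 2))
```

With enough centers relative to the target's oscillation rate, the residual error becomes small; too few or too wide Gaussians leave the fast cos(x²) component underfitted.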



Chapter 4: RBF Neural Network

Solution: Design an RBF neural network; the learning result is shown in the figure. [Figure: RBF approximation result]



Chapter 4: RBF Neural Network
4. Comment:

The RBF network is well suited to approximation applications (with or without noise).

The effectiveness of the neural network depends mainly on the characteristics of the problem, the amount of data acquired (the larger, the better), and the "intelligence" of the designer. The design could also employ overdesign, i.e., more hidden units than strictly needed.



Chapter 4: RBF Neural Network

END OF CHAPTER 4

