ANFIS: Adaptive Neuro-Fuzzy Inference Systems
Adriano Oliveira Cruz
• Introduction
• ANFIS Architecture
• Hybrid Learning Algorithm
• ANFIS as a Universal Approximator
• Simulation Examples
[Figure: two-rule first-order Takagi–Sugeno model. Input $x$ is matched against fuzzy sets $A_1$, $A_2$ and input $y$ against $B_1$, $B_2$; the rules fire with strengths $w_1$ and $w_2$.]

$$f_1 = p_1 x + q_1 y + r_1 \qquad f_2 = p_2 x + q_2 y + r_2$$

$$f = \frac{w_1 f_1 + w_2 f_2}{w_1 + w_2}$$
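As a concrete illustration, the two-rule inference above can be sketched in Python. The Gaussian membership functions and all parameter values below are assumed purely for the example:

```python
import numpy as np

# Hypothetical consequent parameters for the two first-order Sugeno rules.
p1, q1, r1 = 1.0, 2.0, 0.5   # rule 1: f1 = p1*x + q1*y + r1
p2, q2, r2 = -1.0, 0.5, 1.0  # rule 2: f2 = p2*x + q2*y + r2

def gauss(v, c, sigma):
    """Gaussian membership function with center c and width sigma."""
    return np.exp(-((v - c) / sigma) ** 2)

def sugeno_two_rules(x, y):
    # Firing strengths: product of the premise memberships (A_i for x, B_i for y).
    w1 = gauss(x, 0.0, 1.0) * gauss(y, 0.0, 1.0)   # A1 and B1 centered at 0
    w2 = gauss(x, 2.0, 1.0) * gauss(y, 2.0, 1.0)   # A2 and B2 centered at 2
    f1 = p1 * x + q1 * y + r1
    f2 = p2 * x + q2 * y + r2
    # Overall output: firing-strength-weighted average of the rule consequents.
    return (w1 * f1 + w2 * f2) / (w1 + w2)
```

Near (0, 0) rule 1 dominates, so the output is close to $f_1(0,0) = r_1$.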
[Figure: equivalent ANFIS architecture. Layer 1 computes the membership degrees of $A_1$, $A_2$, $B_1$, $B_2$; layer 2 (Prod) the firing strengths $w_1$, $w_2$; layer 3 (Norm) the normalized strengths $\bar{w}_1$, $\bar{w}_2$; layer 4 the weighted consequents $\bar{w}_1 f_1$, $\bar{w}_2 f_2$; layer 5 (Sum) the overall output $f$.]

[Figure: alternative ANFIS architecture in which normalization is deferred: the sums $w_1 f_1 + w_2 f_2$ and $w_1 + w_2$ are formed separately and divided only at the final node to give $f$.]
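The layer structure of the figures can be written out directly. A minimal sketch of one forward pass, with the membership functions and the function signature chosen for illustration:

```python
def anfis_forward(x, y, mu_A, mu_B, consequents):
    """One forward pass through the five ANFIS layers (sketch).

    mu_A, mu_B: lists of membership functions for inputs x and y.
    consequents: list of (p, q, r) tuples, one per rule.
    """
    # Layer 1: premise membership degrees.
    a = [mu(x) for mu in mu_A]
    b = [mu(y) for mu in mu_B]
    # Layer 2 (Prod): firing strength of each rule.
    w = [ai * bi for ai, bi in zip(a, b)]
    # Layer 3 (Norm): normalized firing strengths.
    s = sum(w)
    wbar = [wi / s for wi in w]
    # Layer 4: weighted consequents wbar_i * f_i.
    contrib = [wb * (p * x + q * y + r)
               for wb, (p, q, r) in zip(wbar, consequents)]
    # Layer 5 (Sum): overall output f.
    return sum(contrib)
```

Because layer 3 makes the weights sum to one, identical consequents in every rule reproduce that value exactly at the output.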
The error measure for the $p$-th training pattern, where $L$ is the output layer and $\#(L)$ its number of nodes, is

$$E_p = \sum_{m=1}^{\#(L)} \left(T_{m,p} - O_{m,p}\right)^2$$

For a node $i$ of the output layer, the error rate is

$$\frac{\partial E_p}{\partial O_{i,p}^{L}} = -2\left(T_{i,p} - O_{i,p}^{L}\right) \qquad (1)$$
• For the internal node at (k, i), the error rate can be derived by the
chain rule:
$$\frac{\partial E_p}{\partial O_{i,p}^{k}} = \sum_{m=1}^{\#(k+1)} \frac{\partial E_p}{\partial O_{m,p}^{k+1}} \, \frac{\partial O_{m,p}^{k+1}}{\partial O_{i,p}^{k}}, \qquad (2)$$

where $1 \le k \le L - 1$.
• The error rate of an internal node is a linear combination of the
error rates of the nodes in the next layer.
Fuzzy Logic-ANFIS – p. 19/53
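The backward pass defined by (1) and (2) can be sketched generically. Here the local derivatives $\partial O_{m}^{k+1} / \partial O_{i}^{k}$ are assumed to be available as per-layer Jacobian matrices `J[k]` — a hypothetical representation chosen for the example:

```python
def error_rates(T, O, J):
    """Compute the error rates dEp/dO for every node, layer by layer.

    T: target vector for the output layer.
    O: list of per-layer output vectors, O[0] .. O[L].
    J: J[k][m][i] = dO_m^{k+1} / dO_i^{k} (local derivatives).
    """
    L = len(O) - 1
    eps = [None] * (L + 1)
    # Output layer, eq. (1): -2 (T_i - O_i^L).
    eps[L] = [-2.0 * (t - o) for t, o in zip(T, O[L])]
    # Internal layers, eq. (2): each error rate is a linear combination
    # of the error rates of the nodes in the next layer.
    for k in range(L - 1, -1, -1):
        eps[k] = [sum(eps[k + 1][m] * J[k][m][i]
                      for m in range(len(O[k + 1])))
                  for i in range(len(O[k]))]
    return eps
```

For a one-node, two-layer chain with $O^1 = 2 O^0$, the output error rate propagates back scaled by that local derivative of 2.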
Error Rate for each parameter
• Let α be one of the parameters of the adaptive network.
• Therefore
$$\frac{\partial E_p}{\partial \alpha} = \sum_{O^* \in S} \frac{\partial E_p}{\partial O^*} \, \frac{\partial O^*}{\partial \alpha}, \qquad (3)$$

where $S$ is the set of nodes whose outputs depend on $\alpha$. The derivative of the overall error measure $E$ is then

$$\frac{\partial E}{\partial \alpha} = \sum_{p=1}^{P} \frac{\partial E_p}{\partial \alpha}, \qquad (4)$$
• The hybrid learning algorithm combines:
• the gradient descent rule;
• the least-squares estimate.
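The least-squares half of the hybrid algorithm exploits the fact that, with the premise parameters fixed, the consequent parameters enter the output linearly: $f = \sum_i \bar{w}_i (p_i x + q_i y + r_i)$. A minimal sketch for two rules, with the function name and data layout assumed for the example:

```python
import numpy as np

def lse_consequents(X, Y, wbar1, wbar2, targets):
    """Estimate (p1, q1, r1, p2, q2, r2) by linear least squares.

    X, Y: input samples; wbar1, wbar2: normalized firing strengths
    per sample (from the forward pass); targets: desired outputs.
    """
    # Design matrix: one row per sample, one column per consequent parameter.
    A = np.column_stack([wbar1 * X, wbar1 * Y, wbar1,
                         wbar2 * X, wbar2 * Y, wbar2])
    theta, *_ = np.linalg.lstsq(A, targets, rcond=None)
    return theta  # p1, q1, r1, p2, q2, r2
```

This is the batch (pseudo-inverse) form; recursive variants update the estimate sample by sample.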
$$\frac{\partial E_p}{\partial O_{i,p}^{k}} = \sum_{m=1}^{\#(k+1)} \frac{\partial E_p}{\partial O_{m,p}^{k+1}} \, \frac{\partial O_{m,p}^{k+1}}{\partial O_{i,p}^{k}}$$
$$a z + b \hat{z} = a\,\frac{w_1 f_1 + w_2 f_2}{w_1 + w_2} + b\,\frac{\hat{w}_1 \hat{f}_1 + \hat{w}_2 \hat{f}_2}{\hat{w}_1 + \hat{w}_2}$$

$$= \frac{w_1\hat{w}_1(a f_1 + b \hat{f}_1) + w_1\hat{w}_2(a f_1 + b \hat{f}_2) + w_2\hat{w}_1(a f_2 + b \hat{f}_1) + w_2\hat{w}_2(a f_2 + b \hat{f}_2)}{w_1\hat{w}_1 + w_1\hat{w}_2 + w_2\hat{w}_1 + w_2\hat{w}_2}$$
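A quick numeric spot-check of this identity, with arbitrary illustrative values for the weights, consequents, and coefficients:

```python
# Arbitrary values: z and zhat are two-rule weighted averages,
# a*z + b*zhat should equal the single four-term weighted average.
a, b = 2.0, -1.5
w1, w2, f1, f2 = 0.3, 0.7, 1.0, 4.0
wh1, wh2, fh1, fh2 = 0.6, 0.4, -2.0, 3.0

z = (w1 * f1 + w2 * f2) / (w1 + w2)
zh = (wh1 * fh1 + wh2 * fh2) / (wh1 + wh2)

lhs = a * z + b * zh
num = (w1 * wh1 * (a * f1 + b * fh1) + w1 * wh2 * (a * f1 + b * fh2)
       + w2 * wh1 * (a * f2 + b * fh1) + w2 * wh2 * (a * f2 + b * fh2))
den = w1 * wh1 + w1 * wh2 + w2 * wh1 + w2 * wh2
rhs = num / den
```

The two sides agree because the denominator factors as $(w_1 + w_2)(\hat{w}_1 + \hat{w}_2)$ and the numerator expands into $a$ and $b$ times those same products.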
• Initializing
• Training
• Testing
• Using
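The four stages can be illustrated end to end on a toy one-input, two-rule model. Everything below — the data, membership shapes, and parameter values — is an assumed example, not the contents of the MATLAB script:

```python
import numpy as np

# -- Initializing: premise (centers, widths) and consequent parameters.
c = np.array([0.0, 1.0])
sig = np.array([0.5, 0.5])
pq = np.zeros((2, 2))          # rows: rules; cols: (p_i, r_i), f_i = p_i*x + r_i

X = np.linspace(0.0, 1.0, 40)
T = 2.0 * X + 1.0              # target: a line both rules can represent exactly

def forward(x):
    w = np.exp(-((x[:, None] - c) / sig) ** 2)   # firing strengths
    wbar = w / w.sum(axis=1, keepdims=True)      # normalized strengths
    F = pq[:, 0] * x[:, None] + pq[:, 1]         # rule outputs f_i
    return wbar, (wbar * F).sum(axis=1)

# -- Training: here only the consequents, by least squares (premises fixed).
wbar, _ = forward(X)
A = np.column_stack([wbar[:, 0] * X, wbar[:, 0], wbar[:, 1] * X, wbar[:, 1]])
theta, *_ = np.linalg.lstsq(A, T, rcond=None)
pq = theta.reshape(2, 2)

# -- Testing: mean squared error on fresh points.
Xt = np.linspace(0.05, 0.95, 20)
_, out = forward(Xt)
mse = np.mean((out - (2.0 * Xt + 1.0)) ** 2)

# -- Using: evaluate the trained model at a new input.
_, y = forward(np.array([0.3]))
```

A full hybrid run would alternate this least-squares step with gradient updates of the premise parameters, as in the algorithm above.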
[Figure: initial membership functions — input 1 (pimf) over [0.2, 1]; input 2 (trimf) over [−2, 4].]
• run exemplo06_03.m