
RAJKIYA ENGINEERING COLLEGE BIJNOR

INTRODUCTION TO SOFT COMPUTING

PRESENTATION SUBMITTED BY :-
Abhishek Nishad (2107350200003)
Dheeraj Yadav (2107350200015)
Electrical Engineering, 2nd Year

SUBMITTED TO :-
Mr. Pushp Maheshwari (Assistant Professor)
ADAPTIVE NETWORK BASED FUZZY INFERENCE SYSTEM
CONTENTS
1. INTRODUCTION TO ANFIS
2. STRENGTHS
3. LIMITATIONS
ADAPTIVE NEURO-FUZZY INFERENCE SYSTEM

One of the prominent neuro-fuzzy systems is the Adaptive Neuro-Fuzzy Inference System (ANFIS), introduced by Jang in 1993.

ANFIS is based on the Sugeno fuzzy model, where a rule Rₖ can be represented as:

Rₖ : IF μAᵢ(x) AND μBᵢ(y) THEN fₖ = pₖx + qₖy + rₖ

where k = 1, 2, ..., n indexes the rules, Aᵢ and Bᵢ are fuzzy sets whose membership functions (denoted by μ) form the antecedent part of the rule Rₖ, and pₖ, qₖ, rₖ are the linear parameters of the consequent part of the kth rule.
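To make the rule form concrete, the minimal sketch below (in Python, not from the slides) evaluates the consequent of one first-order Sugeno rule; the function name sugeno_consequent and all parameter values are purely illustrative.

    def sugeno_consequent(x, y, p_k, q_k, r_k):
        # First-order Sugeno consequent: f_k = p_k*x + q_k*y + r_k
        return p_k * x + q_k * y + r_k

    # Hypothetical parameter values for one rule R_k
    f_k = sugeno_consequent(x=1.5, y=4.0, p_k=0.3, q_k=0.7, r_k=0.1)   # 0.3*1.5 + 0.7*4.0 + 0.1 = 3.35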
ARCHITECTURE OF ANFIS

The ANFIS architecture has five layers and comprises two types of nodes: fixed and adaptive.
ANFIS (LAYER-1)

In Layer 1, every node i is an adaptive node whose output is the membership value of the crisp input:

Oᵢ¹ = μAᵢ(x), i = 1, 2, ...    Oᵢ¹ = μBᵢ(y), i = 1, 2, ...

The fuzzy membership functions can be of any shape, e.g. Gaussian, triangular, trapezoidal, etc.
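A minimal Layer-1 sketch, assuming Gaussian membership functions and two fuzzy sets per input; the centres and widths below are hypothetical premise parameters, not values from the slides.

    import numpy as np

    def gaussian_mf(v, centre, sigma):
        # O_i^1 = mu(v): Gaussian membership degree of a crisp input v
        return np.exp(-((v - centre) ** 2) / (2.0 * sigma ** 2))

    x, y = 1.5, 4.0                                               # crisp inputs
    mu_A = [gaussian_mf(x, 0.0, 1.0), gaussian_mf(x, 3.0, 1.0)]   # mu_A1(x), mu_A2(x)
    mu_B = [gaussian_mf(y, 2.0, 1.5), gaussian_mf(y, 6.0, 1.5)]   # mu_B1(y), mu_B2(y)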
ANFIS (LAYER-2)

Layer 2 calculates the firing strength of a rule via a product (Π) operation:

Oᵢ² = wᵢ = μAᵢ(x) · μBᵢ(y), i = 1, 2, ...
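Continuing the Layer-1 sketch above (two rules, pairing Aᵢ with Bᵢ), Layer 2 multiplies the membership degrees to obtain each rule's firing strength:

    # O_i^2 = w_i = mu_Ai(x) * mu_Bi(y): firing strength of each rule (product T-norm)
    w = [mu_A[0] * mu_B[0], mu_A[1] * mu_B[1]]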


ANFIS (LAYER-3)

Layer 3 calculates the normalized firing strength of each rule from the previous layer:

Oᵢ³ = w̄ᵢ = wᵢ / (w₁ + w₂ + ⋯ + wₙ), i = 1, 2, ...
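Continuing the same sketch, Layer 3 divides each firing strength by the sum of all firing strengths:

    # O_i^3 = w_bar_i = w_i / (w_1 + ... + w_n): normalized firing strengths
    total = sum(w)
    w_bar = [w_i / total for w_i in w]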
ANFIS (LAYER-4)

In Layer 4, each node represents the consequent part of a fuzzy rule. The linear coefficients of the rule consequent are trainable.

Oᵢ⁴ = w̄ᵢ · fᵢ = w̄ᵢ · (pᵢx + qᵢy + rᵢ), i = 1, 2, ...

where pᵢ, qᵢ and rᵢ are the linear parameters of the ith rule.
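Continuing the sketch, Layer 4 weights each rule's linear consequent by its normalized firing strength; the consequent parameter values below are hypothetical.

    # Hypothetical consequent parameters (p_i, q_i, r_i) for the two rules
    params = [(0.3, 0.7, 0.1), (-0.2, 0.5, 1.0)]

    # O_i^4 = w_bar_i * f_i = w_bar_i * (p_i*x + q_i*y + r_i)
    layer4 = [wb * (p * x + q * y + r) for wb, (p, q, r) in zip(w_bar, params)]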


ANFIS (LAYER-5)

The node in Layer 5 performs defuzzification of the consequent part of the rules by summing the outputs of all the rules:

O⁵ = Σⁿᵢ₌₁ w̄ᵢ · fᵢ = Σⁿᵢ₌₁ w̄ᵢ · (pᵢx + qᵢy + rᵢ)
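Finishing the sketch, Layer 5 sums the Layer-4 outputs to produce the single crisp output of the network:

    # O^5 = sum_i w_bar_i * f_i: overall crisp output of the sketched ANFIS
    output = sum(layer4)
    print(output)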


STRENGTHS OF ANFIS

The success of ANFIS can be attributed to the robustness of the results it provides.

ANFIS has as high a generalization capability as neural networks and other machine learning techniques.

ANFIS is able to take crisp inputs, represent them in the form of membership functions and fuzzy rules, and generate crisp outputs from those fuzzy rules for reasoning purposes. This provides room for applications that involve crisp inputs and outputs.
LIMITATIONS OF ANFIS

The computational cost of ANFIS is high due to its complex structure and gradient-based learning.

This is a significant bottleneck for applications with a large number of inputs.

In ANFIS, the tunable parameters consist of membership function parameters and consequent parameters. This demands an efficient training mechanism that can tune the parameters effectively. The parameter complexity is directly related to the computational cost: the more parameters in the ANFIS architecture, the higher the training and computational cost.
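As a rough illustration of how the parameter count grows (assuming grid partitioning, Gaussian membership functions with two parameters each, and first-order Sugeno consequents; this counting example is not from the slides):

    def anfis_parameter_count(n_inputs, mfs_per_input):
        # Premise parameters: each Gaussian membership function has a centre and a width
        premise = n_inputs * mfs_per_input * 2
        # Grid partitioning: one rule per combination of input membership functions
        n_rules = mfs_per_input ** n_inputs
        # Each first-order Sugeno consequent has (n_inputs + 1) linear parameters
        consequent = n_rules * (n_inputs + 1)
        return premise + consequent

    # e.g. 4 inputs with 3 membership functions each -> 3**4 = 81 rules
    # and 4*3*2 + 81*5 = 24 + 405 = 429 tunable parameters
    print(anfis_parameter_count(4, 3))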
THANK YOU! Have a great day ahead.
