

Intelligent Control Systems

Dr. S. ALBERT ALEXANDER, Ph.D., PDF (USA), SMIEEE


UGC - Raman Research Fellow
VICE PRESIDENT -- ENERGY CONSERVATION SOCIETY
MHRD NATIONAL AWARDEE FOR TEACHING INNOVATIONS
DEPARTMENT OF EEE, KONGU ENGINEERING COLLEGE

Email: ootyalex@gmail.com   Website: www.ootyalex.webs.com


Topics to be covered
Choice of controllers
Performance parameters (e.g., overshoot)
Artificial intelligence
Closed-loop control using ANN
Case studies
Research issues
Speed control applications
Expected outcome: P3

Problem example


Chopper-controlled DC motor drive

Analog control method

[Block diagram: the power input Vg feeds a switching converter that supplies the load voltage V. A sensor gain H(s) feeds the output back for comparison with the reference input Vref, producing the error signal Ve. The compensator Gc(s) generates the control voltage Vc, which the pulse-width modulator and transistor gate driver convert into the switching signal δ(t) with duty ratio d over the switching period Ts.]
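As a small numeric illustration of the duty-cycle idea in the diagram (assuming an ideal buck-type chopper, so the average output voltage is roughly d times Vg; the values below are illustrative, not from the slides):

```python
# Average output voltage of an ideal buck-type chopper: V_avg ≈ d * Vg (illustrative values).
Vg = 200.0    # supply voltage (V), hypothetical
d = 0.6       # duty ratio = on-time / switching period (d*Ts / Ts)
V_avg = d * Vg
print(V_avg)  # 120.0 V applied, on average, to the DC motor
```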


Digital control method

PID controllers
The hardest task with PID controllers is tuning.

Controller      Response time   Overshoot   Error
On-off          Smallest        Highest     Large
Proportional    Small           Large       Small
Integral        Decreases       Increases   Zero
Derivative      Increases       Decreases   Small change
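A minimal discrete-time PID loop sketch illustrating how the proportional, integral and derivative terms combine; the gains, time step and first-order plant below are illustrative assumptions, not taken from the slides:

```python
# Minimal discrete-time PID loop sketch; gains, time step and plant are illustrative only.
def pid_step(error, state, kp=2.0, ki=0.5, kd=0.1, dt=0.01):
    """One PID update; 'state' carries the accumulated integral and the previous error."""
    integral, prev_error = state
    integral += error * dt                   # integral term: removes steady-state error
    derivative = (error - prev_error) / dt   # derivative term: reacts to the rate of change
    u = kp * error + ki * integral + kd * derivative
    return u, (integral, error)

# Example: regulate a hypothetical first-order plant toward a setpoint of 1.0.
y, state = 0.0, (0.0, 0.0)
for _ in range(5000):
    u, state = pid_step(1.0 - y, state)
    y += 0.01 * (u - y)                      # hypothetical plant: dy/dt = u - y
print(y)                                     # settles near the setpoint 1.0
```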


Soft Computing
"Soft computing is an emerging approach to computing which parallels the remarkable ability of the human mind to reason and learn in an environment of uncertainty and imprecision." (Lotfi A. Zadeh, 1992)

Intelligent Techniques
Methodology         Strength
Neural network      Learning and adaptation
Fuzzy set theory    Knowledge representation via fuzzy if-then rules
GA and SA           Systematic random search
Conventional AI     Symbolic manipulation

Artificial Neural Networks
A neural network is a system composed of many simple processing elements operating in parallel, whose function is determined by the network structure, the connection strengths, and the processing performed at the computing elements or nodes.
An artificial NN is an information-processing system that has certain performance characteristics in common with biological neural networks.

Biological Inspiration
Inspiration from neurobiology:
• A neuron: many-inputs / one-output unit
• The cell body is 5 - 10 microns in diameter
• There are about 10 billion neurons and 60 trillion synapses (connections) in the human cortex



History of Neural Networks
1943: McCulloch and Pitts - modeling the neuron for parallel distributed processing
1949: Hebb network
1958: Rosenblatt - Perceptron
1960: Adaline
1969: Minsky and Papert publish limits on the ability of a perceptron to generalize
1972: Kohonen SOFM
1986: Rumelhart, Hinton and Williams present BPN
1988: Broomhead & Lowe - RBFN
1989: Tsividis - neural network on a chip

Classification of NN models
(i) Learning methods: Supervised, Unsupervised
(ii) Architecture: Feed-forward, Recurrent
(iii) Output types: Binary, Continuous
(iv) Node types: Uniform, Hybrid
(v) Implementations: Software, Hardware
(vi) Connection weights: Adjustable, Hardwired
(vii) Operations: Biologically motivated, Psychologically motivated


Characteristics of Neural Networks
Pattern of connection between the neurons (Architecture)
Method of determining the weights (Training/Learning algorithm)
Activation/Transfer/Output function

Unsupervised learning methods
Feedback nets:
Additive Grossberg (AG)
Shunting Grossberg (SG)
Binary Adaptive Resonance Theory (ART1)
Analog Adaptive Resonance Theory (ART2, ART2a)
Discrete Hopfield (DH)
Continuous Hopfield (CH)
Discrete Bidirectional Associative Memory (BAM)
Temporal Associative Memory (TAM)
Adaptive Bidirectional Associative Memory (ABAM)
Kohonen Self-Organizing Map (SOM)
Kohonen Topology-Preserving Map (TPM)
Feedforward-only nets:
Learning Matrix (LM)
Driver-Reinforcement Learning (DR)
Linear Associative Memory (LAM)
Optimal Linear Associative Memory (OLAM)
Sparse Distributed Associative Memory (SDM)
Fuzzy Associative Memory (FAM)
Counterpropagation (CPN)


Supervised learning methods
Feedback nets:
Brain-State-in-a-Box (BSB)
Fuzzy Cognitive Map (FCM)
Boltzmann Machine (BM)
Mean Field Annealing (MFT)
Recurrent Cascade Correlation (RCC)
Learning Vector Quantization (LVQ)
Feedforward-only nets:
Perceptron
Adaline, Madaline
Backpropagation (BP)
Cauchy Machine (CM)
Adaptive Heuristic Critic (AHC)
Time Delay Neural Network (TDNN)
Associative Reward Penalty (ARP)
Avalanche Matched Filter (AMF)
Backpercolation (Perc)
Artmap
Adaptive Logic Network (ALN)
Cascade Correlation (CasCor)

Adaline
Adaptive linear element, suggested by Widrow and Hoff in 1960
Adapts itself to achieve a given modeling task
Network output = Σ xi wi + w0, i = 1, ..., n
Delta rule for adjusting the weights (see the sketch after this list)
It has a purely linear output
Madaline is an integration of two or more Adaline components, used to solve nonlinearly separable logic functions
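A minimal sketch of an Adaline trained with the delta (Widrow-Hoff) rule, assuming an illustrative dataset, learning rate and epoch count that are not from the slides:

```python
import numpy as np

# Minimal Adaline with the delta (Widrow-Hoff) rule: output = sum(x_i * w_i) + w0 (purely linear).
def train_adaline(X, d, lr=0.1, epochs=100):
    w = np.zeros(X.shape[1])          # weights w_i
    w0 = 0.0                          # bias weight w0
    for _ in range(epochs):
        for x, target in zip(X, d):
            y = x @ w + w0            # linear output
            err = target - y
            w += lr * err * x         # delta rule: move weights toward the target
            w0 += lr * err
    return w, w0

# Illustrative example: learn the linearly separable AND function with bipolar targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
d = np.array([-1, -1, -1, 1], dtype=float)
w, w0 = train_adaline(X, d)
print(np.sign(X @ w + w0))            # should print [-1. -1. -1.  1.]
```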



Back propagation
Requires a training set (input/output pairs)
Starts with small random weights
The error is used to adjust the weights (supervised learning)
Error = difference between the actual output and the desired output of the training examples
Change each weight relative to the error size
Calculate the output-layer error, then propagate it back to the previous layer
Gradient descent on the error landscape

Radial basis function networks
Resembles the back-propagation network, but the activation function used is the Gaussian function
Inspired by research on regions of the cerebral cortex and the visual cortex
RBFNs were proposed by Moody & Darken in 1988 as supervised learning neural networks
The activation level of the ith receptive field unit is
wi = Ri(x) = Ri(||x - ui|| / σi), i = 1, 2, ..., H
where x is a multidimensional input vector, ui is a vector with the same dimension as x, H is the number of radial basis functions (also called receptive field units), and Ri(.) is the ith radial basis function with a single maximum at the origin (a small sketch follows).
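A small sketch of the receptive-field activation above, assuming Gaussian radial basis functions Ri(r) = exp(-r^2) and illustrative centers ui and widths σi (the values are not from the slides):

```python
import numpy as np

# Activation of H Gaussian receptive field units: w_i = R_i(||x - u_i|| / sigma_i).
def rbf_activations(x, centers, sigmas):
    x = np.asarray(x, dtype=float)
    r = np.linalg.norm(x - centers, axis=1) / sigmas   # scaled distances ||x - u_i|| / sigma_i
    return np.exp(-r**2)                               # Gaussian basis, maximum 1 at r = 0

# Illustrative example: 2-D input, H = 3 receptive field units.
centers = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.0]])   # u_i (hypothetical)
sigmas = np.array([0.5, 0.5, 1.0])                         # sigma_i (hypothetical)
print(rbf_activations([1.0, 0.5], centers, sigmas))
```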


END OF PART 1

Fuzzy Sets
Fuzzy set theory is a simple extension of the basic concept of sets.
When we poll possible set members, we may decide that they are not members of the set at all (0% membership) or that they are entirely members of the set (100% membership).
But if membership can be any number between 0% and 100%, it is a fuzzy set.


Fuzzy sets
Discrete universe (ordered): let X = {0, 1, 2, 3, 4, 5, 6} be the set of numbers of children a family may possibly have. A fuzzy set A = "sensible number of children in a family" may be described by
A = {(0, 0.1), (1, 0.3), (2, 0.7), (3, 1), (4, 0.7), (5, 0.3), (6, 0.1)}

Continuous universe: let X = R+ be the set of possible ages for human beings. The fuzzy set B = "about 50 years old" may be expressed as
B = {(x, µB(x)) | x ∈ X}, where
µB(x) = 1 / (1 + ((x - 50) / 10)^2)

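A quick sketch of both membership representations, using the discrete pairs and the bell-shaped formula above (the variable names are illustrative):

```python
# Discrete fuzzy set A = "sensible number of children in a family": {element: membership}.
A = {0: 0.1, 1: 0.3, 2: 0.7, 3: 1.0, 4: 0.7, 5: 0.3, 6: 0.1}

# Continuous fuzzy set B = "about 50 years old": mu_B(x) = 1 / (1 + ((x - 50)/10)^2).
def mu_B(x):
    return 1.0 / (1.0 + ((x - 50.0) / 10.0) ** 2)

print(A[2])        # membership of "2 children" -> 0.7
print(mu_B(50))    # exactly 50 years old -> 1.0
print(mu_B(70))    # 70 years old -> 0.2
```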


Definitions - Contd.
Linguistic variables and linguistic values:
X = "age" is a linguistic variable
The fuzzy sets "young", "middle aged" and "old" are its linguistic values

Set-theoretic operations:
(i) Union: C = A ∪ B, µC(x) = max(µA(x), µB(x)) = µA(x) ∨ µB(x)
(ii) Intersection: C = A ∩ B, µC(x) = min(µA(x), µB(x)) = µA(x) ∧ µB(x)
(iii) Complement: µĀ(x) = 1 - µA(x)

Membership functions
Triangular: trimf(x; a, b, c) = max(min((x - a) / (b - a), (c - x) / (c - b)), 0)
Trapezoidal: trapmf(x; a, b, c, d) = max(min((x - a) / (b - a), 1, (d - x) / (d - c)), 0)
Gaussian: gaussmf(x; σ, c) = exp(-((x - c) / σ)^2 / 2)
Generalized bell: gbellmf(x; a, b, c) = 1 / (1 + |(x - c) / a|^(2b))
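A minimal sketch of these membership functions and the pointwise max/min set operations, using the standard forms above; the parameter values in the example calls are illustrative only:

```python
import math

# Standard membership functions (scalar versions).
def trimf(x, a, b, c):
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def trapmf(x, a, b, c, d):
    return max(min((x - a) / (b - a), 1.0, (d - x) / (d - c)), 0.0)

def gaussmf(x, sigma, c):
    return math.exp(-0.5 * ((x - c) / sigma) ** 2)

def gbellmf(x, a, b, c):
    return 1.0 / (1.0 + abs((x - c) / a) ** (2 * b))

# Pointwise union, intersection and complement of membership values.
fuzzy_or = max                      # mu_C(x) = max(mu_A(x), mu_B(x))
fuzzy_and = min                     # mu_C(x) = min(mu_A(x), mu_B(x))
complement = lambda m: 1.0 - m      # mu_notA(x) = 1 - mu_A(x)

print(trimf(25, 20, 30, 40))            # 0.5 (illustrative triangular set)
print(gbellmf(50, 10, 2, 50))           # 1.0 at the centre c = 50
print(fuzzy_and(0.7, complement(0.4)))  # min(0.7, 0.6) = 0.6
```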


Fuzzy rules
If pressure is high, then volume is small
If the road is slippery, then driving is dangerous
If the speed is high, then apply the brake a little
Each rule has the form "If <antecedent>, then <consequent>".

Fuzzy Inference Systems
(i) Mamdani
(ii) Sugeno
(iii) Tsukamoto

Mamdani FIS
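A minimal sketch of Mamdani-style max-min inference for a single rule, "if speed is high then brake is large"; the triangular membership functions, universes and input value below are illustrative assumptions, only the rule form comes from the slides:

```python
import numpy as np

def trimf(x, a, b, c):
    # Triangular membership function (works on scalars and arrays).
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Rule: IF speed is high THEN brake is large (single-rule Mamdani inference).
speed = 85.0                               # crisp input (km/h, illustrative)
firing = trimf(speed, 60.0, 100.0, 140.0)  # antecedent: degree to which speed is "high"

brake = np.linspace(0.0, 1.0, 101)         # output universe for "brake"
large = trimf(brake, 0.5, 1.0, 1.5)        # consequent fuzzy set "large"
clipped = np.minimum(firing, large)        # min-implication clips the consequent set

# Defuzzify by the centroid of the clipped output set.
output = (brake * clipped).sum() / clipped.sum()
print(round(float(firing), 3), round(float(output), 3))  # firing strength, crisp brake command
```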


Sugeno & Tsukamoto FIS

END OF PART 2



Any Questions?


Finished!!! You did it!!!

CONTACT NUMBER

9865931597

