
Neural Networks
1-Introduction

Artificial Neural Network (ANN)


Introduction to ANN

Definition: A structure (network) composed of a number of interconnected units (artificial neurons). Each unit has an I/O characteristic and implements a local computation or function. The output of any unit is determined by its I/O characteristic, its interconnections to other units, and (possibly) external inputs. Although hand-crafting of the network is possible, the network usually develops its overall functionality through one or more forms of training.
ANN History:

Background: The field dates to the 1940s, when the first mathematical neuron models were proposed; NN research later developed alongside AI.

Biological neurons: a biological neuron connects to on the order of a thousand other neurons through synapses. Its main parts are the cell body (soma), nucleus, axon, dendrites, and synapses. A synapse can be excitatory or inhibitory.

Artificial Neural Networks:

Creation Age (before 1956):
In 1943, McCulloch & Pitts proposed the first neural mathematical model, even before the first electronic computer was built. Their premises (assumptions) were:
1. A neuron has two states, i.e., all or none (excitatory or inhibitory).
2. A neuron is triggered by a certain amount of accumulated synaptic input, and the trigger action is independent of its previous state.
3. Only synapses can introduce delay.
4. A neuron in the inhibitory state cannot be triggered.
5. The structure of the neural net does not change.

In 1949, Hebb proposed the Hebb learning rule:
1. Information resides in the synapses.
2. Learning rule (weight adjustment).
3. Symmetrical weights.
4. When a group of weakly connected neurons is triggered together, the strength of the connections (weights) between them is increased.

Birth Age (1957-1968):
In 1957, Rosenblatt proposed the first network model, the perceptron. In 1960, Widrow proposed another model, the Adaline, with its own learning rule.

Dark Age (1969-1981):
In 1969, Minsky & Papert proved that the perceptron has limited learning power because the model cannot solve the XOR problem, and NN research declined during this period. Work nevertheless continued: Kohonen developed the Self-Organizing Map (SOM), and Grossberg developed Adaptive Resonance Theory (ART).

Reborn Age (1982-1986):
In 1982, Hopfield proposed the Hopfield Network and the autoassociative memory models. In 1985, Hopfield proposed another network, the Hopfield & Tank Network, to solve the Traveling Salesman Problem. After this research, ANN models were again treated as a promising research area. In 1986, Rumelhart et al. introduced the BPN in their book Parallel Distributed Processing, in which the generalized delta rule is explained, along with how the BPN solves the XOR problem. By 1990, the BPN had become one of the most popular and most widely used ANN models.

Mature Age (1987-present):
Up to now, ANN models have been widely studied and many models have been proposed. Conferences and journals have been created for ANN studies, such as ICNN (International Conference on NN) and IJCNN (International Joint Conference on NN, held by IEEE & INNS). Besides, many tools and software packages have been

developed to make applying NNs easier.

ANN Characteristics:
(1) Input: training sets (training patterns), [X1, X2, ..., Xn].
(2) Output: computed outputs, [Y1, Y2, ..., Yj]; testing sets (targets), [T1, T2, ..., Tj].
(3) Connections: weights, Wij.
(4) Processing Element (PE): summation function, activity function, transfer function.

[Figure: a processing element. Inputs X1, X2, ..., Xn are combined by the summation function into Ij, the activity function produces netj, the transfer function F(netj) is applied, and the result is the output Yj.]
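The processing-element pipeline above can be sketched in code. This is a minimal illustration, not from the notes: a weighted sum stands in for the summation/activity functions, and a sigmoid is assumed as the transfer function F(netj).

```python
import math

def processing_element(inputs, weights, bias=0.0):
    """One artificial neuron: weighted sum followed by a transfer function."""
    # Summation/activity: net_j = sum_i (w_ij * x_i) + bias
    net = sum(w * x for w, x in zip(weights, inputs)) + bias
    # Transfer function F(net_j): a sigmoid is assumed here
    return 1.0 / (1.0 + math.exp(-net))

# Example: a PE with two inputs and hand-picked weights
y = processing_element([1.0, 0.0], [0.5, -0.3], bias=0.1)
print(round(y, 3))
```

Other transfer functions (hard limit, tanh, linear) slot into the same structure by replacing the final line.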


Types of ANN:

By learning strategy:
Supervised learning: trained with input/target pairs. Examples: Perceptron, BPN, PNN, LVQ, CPN.
Unsupervised learning: trained with inputs only. Examples: SOM, ART.
Associative-memory learning: recalls stored patterns from partial or noisy input. Examples: Hopfield, Bidirectional Associative Memory (BAM).
Optimization applications: the network searches for an optimal solution. Examples: Hopfield-Tank Network (HTN).

By architecture:
Feedforward: single layer or multilayer.
Feedback: single layer.

[Figure: three network topologies with inputs X1, X2, ..., Xn: a single-layer feedforward network, a multilayer feedforward network, and a feedback network whose output Y is fed back.]
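A multilayer feedforward network can be sketched with hand-picked (not trained) weights. This hypothetical example also previews why the extra layer matters: a single perceptron cannot compute XOR, but two layers of threshold units can.

```python
def step(net):
    """Hard-limit transfer function."""
    return 1 if net > 0 else 0

def xor_net(x1, x2):
    """Two-layer feedforward net with hand-picked weights (not trained)."""
    h1 = step(x1 + x2 - 0.5)    # hidden unit: fires like OR
    h2 = step(x1 + x2 - 1.5)    # hidden unit: fires like AND
    return step(h1 - h2 - 0.5)  # output: OR and not AND = XOR

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, xor_net(a, b))
```

In a trained network the weights would be found by a learning rule (e.g., back-propagation) rather than set by hand.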


Research Application Areas: classification, clustering, prediction, memorizing, learning, optimization, control, recognition, decision-making.

An example using an ANN model: evaluating a customer who applies for a loan.
Input:
- Customer's salary
- Customer's house debt
- Customer's car debt
- Customer's average living cost
- Customer's credit history

These inputs feed an ANN model, whose output is the loan decision.
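The loan example can be sketched as a single scoring neuron. The feature names and weights below are purely illustrative assumptions, not a real credit model and not from the notes.

```python
import math

# Assumed, illustrative weights: positive features raise the score,
# debts and costs lower it. Inputs are assumed normalized to [0, 1].
WEIGHTS = {"salary": 0.8, "house_debt": -0.5, "car_debt": -0.3,
           "living_cost": -0.2, "credit_history": 0.6}

def loan_score(features):
    """Single-neuron scoring: weighted sum of features through a sigmoid."""
    net = sum(WEIGHTS[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-net))

applicant = {"salary": 0.9, "house_debt": 0.4, "car_debt": 0.1,
             "living_cost": 0.3, "credit_history": 0.8}
score = loan_score(applicant)
print("approve" if score > 0.5 else "deny")
```

A real ANN would learn these weights from a training set of past applicants and their outcomes.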

The comparison of ANN with other methods:

(1) Variable prediction vs. Regression Analysis

Y = a0 + a1*X1 + a2*X2 + a3*X3 + ... + an*Xn

For regression, the main task is to find the parameters a0, a1, a2, ..., an. However, if the problem is not linearly solvable, it becomes difficult to solve. Regression can be used for classification or prediction.

ANN vs. regression, advantages:
- Can solve non-linear problems.
- Parameters can be modified easily.
- The model is easy to construct.
- Accepts any type of input.

ANN vs. regression, disadvantages:
- Takes time to find the global minimum (the best solution).

- May over-learn (overfit).

(2) Time Series


X_t = a0 + a1*X_{t-1} + a2*X_{t-2} + a3*X_{t-3} + ... + ap*X_{t-p}
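Given fitted coefficients, the autoregressive model above produces a one-step prediction from the last p observations. A minimal sketch (the function name and coefficient values are assumptions for illustration):

```python
def ar_predict(history, coeffs, a0=0.0):
    """Predict X_t = a0 + a1*X_{t-1} + ... + ap*X_{t-p} from the last p values."""
    p = len(coeffs)
    recent = history[-p:][::-1]  # X_{t-1}, X_{t-2}, ..., X_{t-p}
    return a0 + sum(a * x for a, x in zip(coeffs, recent))

# Example: AR(2) with assumed coefficients a1=0.6, a2=0.4
series = [1.0, 2.0, 3.0, 4.0]
print(ar_predict(series, [0.6, 0.4]))
```

In practice the coefficients themselves are estimated from historical data, e.g., by least squares.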

The model predicts future results from historical values. EX: prediction of the stock market, prediction of volcano eruptions.

(3) Decision Making
f1 = f1(X1, X2, X3, ..., Xn) = a10 + a11*X1 + a12*X2 + ... + a1n*Xn
f2 = f2(X1, X2, X3, ..., Xn) = a20 + a21*X1 + a22*X2 + ... + a2n*Xn
:
fm = fm(X1, X2, X3, ..., Xn) = am0 + am1*X1 + am2*X2 + ... + amn*Xn

By applying the same inputs to every fi and finding which fi gives the best outcome; the decision is made based on the best outcome. EX: credit evaluation, scheduling, strategic decision-making.
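The selection step can be sketched directly: score each alternative fi on the same inputs and pick the maximum. The coefficient values below are illustrative assumptions.

```python
def best_decision(x, coeff_rows):
    """Score each alternative f_i(x) = a_i0 + sum_j a_ij * x_j; pick the best."""
    scores = []
    for row in coeff_rows:
        a0, weights = row[0], row[1:]
        scores.append(a0 + sum(a * v for a, v in zip(weights, x)))
    # Decision = index of the alternative with the highest score
    return max(range(len(scores)), key=lambda i: scores[i]), scores

# Example: two assumed alternatives scored on two inputs
choice, scores = best_decision([1.0, 2.0], [[0.0, 1.0, 0.5], [0.5, 0.2, 0.9]])
print(choice)
```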

[Figure: scatter plot (x roughly 40-90, y roughly 150-190) of data points separated into two groups by a classification line; the line's parameters define the classifier.]

Remark: Extra material is listed on Lily's web page: http://www.cyut.edu.tw/~lhli
Information about ANN software and programming tools can be found in NN

ANN research resources can be found in NN-, NN-, NN-
