Neural Network Architecture

Attribution Non-Commercial (BY-NC)



Fall 2007 Instructor: Tai-Ye (Jason) Wang Department of Industrial and Information Management Institute of Information Management


Neuron Abstraction

Neurons transduce signals: electrical to chemical, and from chemical back again to electrical. Each synapse is associated with what we call the synaptic efficacy: the efficiency with which a signal is transmitted from the presynaptic to the postsynaptic neuron.



The j-th artificial neuron receives input signals s_i from possibly n different sources, and computes an internal activation x_j, which is a linear weighted aggregation of the impinging signals, modified by an internal threshold θ_j.


In the j-th artificial neuron, the connection weights w_ij model the synaptic efficacies of the various interneuron synapses.


Notation:


The activation of the neuron is subsequently transformed through a signal function S(·), which generates the output signal s_j = S(x_j) of the neuron.


The activation x_j is simply the inner product of the impinging signal vector S = (s_0, . . . , s_n)^T with the neuronal weight vector W_j = (w_0j , . . . , w_nj )^T.
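As a concrete illustration, the weighted aggregation becomes a plain inner product once the threshold θ_j is folded in as a bias weight w_0j paired with a constant input s_0 = 1. A minimal Python sketch (all names and values illustrative, not from the slides):

```python
def activation(weights, signals):
    """Inner product of the weight vector W_j and the signal vector S."""
    return sum(w * s for w, s in zip(weights, signals))

# Two real inputs plus the constant bias input s_0 = 1.
W_j = [-0.5, 1.0, 2.0]    # w_0j (bias/threshold term), w_1j, w_2j
S = [1.0, 0.3, 0.6]       # s_0 = 1, s_1, s_2
x_j = activation(W_j, S)  # -0.5*1 + 1.0*0.3 + 2.0*0.6 = 1.0
```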

Adaptive Filter


Net positive activations translate to a +1 signal value; net negative activations translate to a 0 signal value.


s_j = S(x_j) ∈ {0, 1}


The updated signal value S(x_j^(k+1)) at time instant k + 1 is generated from the neuron activation x_j^(k+1), sampled at time instant k + 1. The response of the threshold logic neuron as a two-state machine can be extended to the bipolar case, where the signals are

s_j ∈ {−1, 1}


The resulting signal function is then none other than the signum function, sign(x), commonly encountered in communication theory.


Interpretation of Threshold


λ_j = 1/x_m is the slope parameter of the function. The figure is plotted for x_m = 2 and λ_j = 0.5.


S_j(x_j) = max(0, min(λ_j x_j, 1)). Note that in this course we assume that neurons within a network are homogeneous.
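This piecewise-linear (ramp) signal function is straightforward to sketch in code; the helper below is illustrative, not from the slides:

```python
def ramp_signal(x, slope):
    """Piecewise-linear signal function: S(x) = max(0, min(slope * x, 1)).

    Output is clamped to 0 for negative activations and saturates at 1
    once the activation exceeds x_m = 1/slope.
    """
    return max(0.0, min(slope * x, 1.0))

# With x_m = 2, i.e. slope = 0.5, as in the figure:
values = [ramp_signal(x, 0.5) for x in (-1.0, 0.0, 1.0, 2.0, 4.0)]
# values == [0.0, 0.0, 0.5, 1.0, 1.0]
```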


λ_j is a gain scale factor. In the limit, as λ_j → ∞, the smooth logistic function approaches the non-smooth binary threshold function.
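A minimal sketch of the logistic signal function, illustrating how raising the gain pushes it toward the binary threshold (the specific gain values are illustrative):

```python
import math

def logistic(x, gain):
    """Logistic (sigmoidal) signal function: S(x) = 1 / (1 + exp(-gain * x))."""
    return 1.0 / (1.0 + math.exp(-gain * x))

# At a fixed positive activation, increasing the gain drives the
# output toward 1, mimicking the binary threshold function.
for lam in (1, 10, 100):
    print(lam, logistic(0.5, lam))
```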


The sigmoidal signal function has some very useful mathematical properties. It is


σ_j is the Gaussian spread factor and c_j is the center. Varying the spread makes the function sharper or more diffuse.


Changing the center shifts the function to the right or left along the activation axis. This function is an example of a nonmonotonic signal function.
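A sketch of such a Gaussian signal function, assuming the standard exponential form with center c_j and spread σ_j:

```python
import math

def gaussian_signal(x, center, spread):
    """Gaussian signal function: peaks at 1 when x == center and
    decays symmetrically with distance; smaller spread -> sharper peak."""
    return math.exp(-((x - center) ** 2) / (2.0 * spread ** 2))
```

Unlike the sigmoid, this rises and then falls as the activation sweeps past the center, which is what makes it nonmonotonic.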


Stochastic Neurons

s_j ∈ {0, 1} or {−1, 1}

The neuron switches into these states depending on a probabilistic function of its activation, P(x_j).
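The slides leave the form of P(x_j) unspecified; the sketch below assumes a logistic curve, a common textbook choice, and samples a bipolar state from it:

```python
import math
import random

def fire_probability(x, gain=1.0):
    """Assumed logistic form for P(x_j); the gain parameter is illustrative."""
    return 1.0 / (1.0 + math.exp(-gain * x))

def stochastic_neuron(x, rng=random):
    """Bipolar stochastic neuron: returns +1 with probability P(x), else -1."""
    return 1 if rng.random() < fire_probability(x) else -1
```

Large positive activations make the +1 state almost certain, large negative activations the −1 state; near zero activation the neuron flips essentially at random.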


Artificial neural networks are massively parallel adaptive networks of simple nonlinear computing elements, called neurons, which are intended to abstract and model some of the functionality of the human nervous system, partially capturing some of its computational strengths.


- Input: receive external stimuli
- Hidden: compute intermediate functions
- Output: generate outputs from the network

Activation state vector. This is a vector of the activation levels x_i of the individual neurons in the neural network.


Signal function. A function that generates the output signal of the neuron based on its activation.

Pattern of connectivity. This essentially determines the inter-neuron connection architecture, or the graph of the network. Connections, which model the inter-neuron synaptic efficacies, can be excitatory (+), inhibitory (−), or absent (0).


Activity aggregation rule. A way of aggregating activity at a neuron, usually computed as the inner product of the input vector and the neuron's fan-in weight vector.


Activation rule. A function that determines the new activation level of a neuron on the basis of its current activation and its external inputs.

Learning rule. Provides a means of modifying connection strengths based both on external stimuli and network performance, with the aim of improving the latter.


Environment. The environments within which neural networks can operate could be


a feedforward architecture, in which the network has no loops, or a feedback (recurrent) architecture, in which loops occur in the network because of feedback connections.


f : R^n → R^p

Multilayered networks that associate vectors from one space to vectors of another space are called heteroassociators.

Map or associate two different patterns with one another: one as input and the other as output. Mathematically we write f : R^n → R^p.


When neurons in a single field connect back onto themselves, the resulting network is called an autoassociator, since it associates a single pattern in R^n with itself.

f : R^n → R^n


For a p-dimensional field of neurons, the activation state space is R^p. The signal state space is the Cartesian cross space

I^p = [0, 1] × · · · × [0, 1] (p times) = [0, 1]^p ⊂ R^p

if the neurons have continuous signal functions in the interval [0, 1], and [−1, 1]^p if the neurons have continuous signal functions in the interval [−1, 1].


For the case when the neuron signal functions are binary threshold, the signal state space is

B^p = {0, 1} × · · · × {0, 1} (p times) = {0, 1}^p ⊂ I^p ⊂ R^p

and {−1, 1}^p when the neuron signal functions are bipolar threshold.


- Organized into different layers
- Unidirectional connections
- Memory-less: output depends only on the present input

X ∈ R^n

S = f (X)


Non-linear dynamical systems: the new state of the network is a function of the current input and the present state of the network.
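A minimal sketch of one synchronous feedback update, assuming bipolar threshold neurons and an illustrative 2-neuron weight matrix; iterating the update until the state stops changing shows the settling behavior of such networks:

```python
def step(W, state, inputs):
    """One synchronous update: new state = sign(W · state + external input),
    with the bipolar convention that a non-negative activation maps to +1."""
    new_state = []
    for row, u in zip(W, inputs):
        x = sum(w * s for w, s in zip(row, state)) + u
        new_state.append(1 if x >= 0 else -1)
    return new_state

W = [[0, 1], [1, 0]]           # symmetric feedback weights (illustrative)
state, inputs = [1, -1], [2, 2]
for _ in range(10):            # iterate until a fixed point is reached
    nxt = step(W, state, inputs)
    if nxt == state:
        break
    state = nxt
# state has settled at the fixed point [1, 1]
```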


Network activations and signals are in a flux of change until they settle down to a steady value.

Issue of Stability: Given a feedback network architecture we must ensure that the network dynamics leads to behavior that can be interpreted in a sensible way.


Fixed-point equilibria, where the system eventually converges to a fixed point; chaotic dynamics, where the system wanders aimlessly in state space.


Robustness. The ability to operate, albeit with some performance loss, in the event of damage to internal structure.

Associative recall. The ability to invoke related memories from a single concept. For example, a friend's name elicits vivid mental pictures and related emotions.


Function approximation and generalization. The ability to approximate functions using learning algorithms, by creating internal representations, and hence without requiring a mathematical model of how outputs depend on inputs. Neural networks are therefore often referred to as adaptive function estimators.


- Fault Tolerance
- Associative Recall
- Function Approximation
- Prediction
- Control

