SHES 2402 ADAPTIVE SYSTEMS
WEEK 8: NEURAL NETWORKS (PART 1)

PRELIMINARIES

• The brain is a product of cumulatively, evolutionarily acquired traits that allow it to function like a computing device. Pattern recognition is one of the most basic tasks that a brain performs. Knowing what can be eaten, and which animals are dangerous, definitely confers a selective advantage for survival.

• Structurally, the brain consists of a network of massively connected neurons. The connectivity of each neuron can be as high as 10,000. It has been estimated that there are approximately 10^14 to 10^15 interconnections in the human brain, resulting in its having the following properties.

Table 1. Properties of the brain, with examples.

1. Massive parallelism : Thoughts occur in fractions of a second because computing tasks are parallelised.
2. Image recognition : A is your friend, B is your enemy.
3. Memory : You remember your teacher's name.
4. Learning ability : Young primates copy tool use from their elders.
5. Computational ability : You can add 13 and 86 inside your head.
6. Generalisation ability : Fire is hot; anything that burns will also be hot, so stay away!
7. Adaptivity : If you keep repeating a task, you become better at it.
8. Contextual information processing : We understand references made to certain people in satirical writings.
9. Fault tolerance : People who have had a stroke can still think, albeit at a slower pace.
10. Energy efficiency : You can still play sports after attending this lecture.

• These properties excite the computer science community. Current computing architecture is very strong at iterative procedures (which humans are poor at), but weak in many of the aspects listed in Table 1. By studying how neurons are wired up, computer scientists may be able to come up with more efficient designs for computing systems. On the other hand, biologists might also benefit from insights from neural network research by moving away from a purely descriptive approach towards model-driven tests.


• We have already seen how evolutionary biology has stimulated the development of genetic algorithms (GA) to solve practical problems, including using GA to do sequence alignment, which is a problem in molecular biology.

• The first, and simplest, model used in neural network research is the McCulloch-Pitts (MP) model. The MP model consists of a simple set of inputs (x_1, x_2, ..., x_n), with corresponding weights (w_1, w_2, ..., w_n), connected to an artificial neuron. Note that the weights can be positive (excitatory) or negative (inhibitory). The inputs are then processed in the neuron according to the following decision rule:

If Σ_{i=1}^{n} w_i x_i ≥ θ, the neuron fires (1); if Σ_{i=1}^{n} w_i x_i < θ, the neuron is silent (0).
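As a minimal sketch of the decision rule above (the inputs, weights, and threshold chosen here are illustrative assumptions, not values from the notes):

```python
def mp_neuron(inputs, weights, theta):
    """McCulloch-Pitts neuron: fires (1) when the weighted sum of the
    inputs reaches the threshold theta, and stays silent (0) otherwise."""
    weighted_sum = sum(w * x for w, x in zip(weights, inputs))
    return 1 if weighted_sum >= theta else 0

# Two excitatory inputs (positive weights) and one inhibitory input
# (negative weight); theta = 1.0 is an arbitrary choice.
print(mp_neuron([1, 1, 0], [0.6, 0.6, -1.0], 1.0))  # fires: 1.2 >= 1.0
print(mp_neuron([1, 1, 1], [0.6, 0.6, -1.0], 1.0))  # silent: 0.2 < 1.0
```

Note how turning on the inhibitory input silences a neuron that would otherwise fire.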

• Mathematically, we can represent the decision rule (activator function) as the indicator function

1( Σ_{i=1}^{n} w_i x_i ≥ θ ),

which takes the value 1 when the condition given in the brackets is satisfied, and 0 otherwise. Other alternatives for the activator function are possible. These include the piecewise linear, Gaussian, and sigmoid functions. Figure 1 shows the graphs of commonly used activator functions.
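These activator functions can be written out in a few lines of Python; this is an illustrative sketch, and the particular slope and width parameters below are arbitrary choices:

```python
import math

def step(t, theta=0.0):
    # Threshold (indicator) activator: 1 when t >= theta, else 0
    return 1.0 if t >= theta else 0.0

def piecewise_linear(t):
    # Ramp through the origin region, clipped to [0, 1]
    return min(1.0, max(0.0, 0.5 + 0.5 * t))

def sigmoid(t):
    # Smooth S-shaped curve taking values in (0, 1)
    return 1.0 / (1.0 + math.exp(-t))

def gaussian(t, width=1.0):
    # Bell-shaped curve peaking at t = 0
    return math.exp(-((t / width) ** 2))

for t in (-2.0, 0.0, 2.0):
    print(t, step(t), round(sigmoid(t), 3), round(gaussian(t), 3))
```

All four map the weighted input sum to an output level; the step function is the MP rule itself, while the sigmoid and Gaussian give graded responses.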

[Figure 1: graphs of commonly used activator functions, including the sigmoid and the Gaussian, plotted as f(t) for t from -2 to 2.]

• An example of the MP model with inputs and weights is given here (sketch in the blank spaces below).

• If the connections between neurons are simple (no loops), then the network is described as "feed-forward"; if loops exist, then it is a "feedback" network.

• Feed-forward networks, such as single-layer perceptrons, are simple but memoryless. Their static nature means that they do not have the ability to learn. On the other hand, feedback networks are dynamic, as they can use output values to adjust the inputs, allowing a certain degree of self-regulation without further interference from the user. (Sketch examples of feed-forward and feedback networks in the blank spaces below.)
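To make the feed-forward idea concrete, here is a minimal sketch of a network of threshold units in which signals flow strictly forward, with no loops. The weights are hand-picked assumptions (not learned, and not from the notes), chosen so that the network computes the XOR of its two inputs:

```python
def neuron(inputs, weights, theta):
    # Threshold unit: fires when the weighted input sum reaches theta
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= theta else 0

def feed_forward(x):
    # Two hidden threshold units feed one output unit; information
    # moves inputs -> hidden layer -> output, never backwards.
    h1 = neuron(x, [1, 1], 0.5)            # fires if x1 OR x2
    h2 = neuron(x, [1, 1], 1.5)            # fires if x1 AND x2
    return neuron([h1, h2], [1, -1], 0.5)  # fires if OR but not AND

for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    print(x, feed_forward(x))
```

A feedback network would differ in that some outputs would be routed back as inputs on the next step, giving the dynamic, self-regulating behaviour described above.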

• If the solution to a classification problem can be represented in the connection weights of a perceptron, then the so-called delta rule (see below) is guaranteed to find such a set of connection weights. However, this result is only true for linearly separable training sets.


LEARNING IN NEURAL NETWORKS

• Learning in a neural network means adjusting the network architecture and connection weights in such a way that a task is performed efficiently.

• Using available training sets, the network learns how to use the appropriate connection weights to achieve a certain desired outcome. This is done by iteratively updating the weights in the network according to some particular learning paradigm. Three main learning paradigms are usually employed: supervised, unsupervised, and hybrid.

• In supervised learning, we already have a set of correct answers from which to teach the network the appropriate way of updating the weights, so that for every input the correct answers are returned.

• In unsupervised learning, no correct answers are supplied; instead, the network explores the similarity and correlation between objects of interest. Accordingly, we classify the objects into categories using such information, typically using clustering algorithms.

• In hybrid learning, some of the weights are updated using supervised learning, and the remainder using unsupervised learning.

• There are three basic issues with learning from sample data: capacity, sample complexity, and computational complexity.

• Capacity determines how many patterns can be stored; sample complexity determines the number of training patterns required for the network to perform a valid generalisation; computational complexity determines the time needed for a learning algorithm to obtain a valid solution from the training patterns.

• Four learning rules are commonly used in neural network models: error-correction (the delta rule), Boltzmann, Hebbian, and competitive learning. We will only cover the first of these in this course.
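As a sketch of error-correction learning on a single perceptron (the learning rate, epoch count, and training data below are illustrative assumptions), each weight is nudged by the learning rate times the error times the corresponding input:

```python
def train_perceptron(samples, epochs=20, eta=0.1):
    """Error-correction (delta) rule for a threshold perceptron:
    after each sample, w_i <- w_i + eta * (target - output) * x_i."""
    w = [0.0, 0.0]   # connection weights
    b = 0.0          # bias (the threshold moved to the left-hand side)
    for _ in range(epochs):
        for x, target in samples:
            out = 1 if w[0] * x[0] + w[1] * x[1] + b >= 0 else 0
            err = target - out   # zero when the answer is already correct
            w = [wi + eta * err * xi for wi, xi in zip(w, x)]
            b += eta * err
    return w, b

# Logical AND is linearly separable, so the rule is guaranteed to
# converge on weights that classify every training sample correctly.
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train_perceptron(data)
for x, t in data:
    print(x, 1 if w[0] * x[0] + w[1] * x[1] + b >= 0 else 0, t)
```

On a non-linearly-separable set such as XOR, the same loop would keep cycling without ever settling, which is exactly the limitation noted for the delta rule above.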
