
Introduction to Deep Learning (20CS3269RA)

McCulloch-Pitts Neuron
Mankind's First Mathematical Model Of A Biological Neuron
❑ It is very well known that the most fundamental unit of deep neural networks is called an artificial neuron/perceptron.
❑ But the very first step towards the perceptron we use today was taken in 1943 by McCulloch and Pitts, by mimicking the functionality of a biological neuron:
❑ Dendrite: Receives signals from other neurons
❑ Soma: Processes the information
❑ Axon: Transmits the output of this neuron
❑ Synapse: Point of connection to other neurons
McCulloch-Pitts Neuron

❑ Let's suppose that I want to predict my own decision, whether to watch a random football game on TV or not.
❑ The inputs are all Boolean, i.e., {0, 1}, and my output variable is also Boolean {1: Will watch it, 0: Won't watch it}.
McCulloch-Pitts Neuron
❑ So,
❑ x_1 could be isPremierLeagueOn (I like Premier League more)
❑ x_2 could be isItAFriendlyGame (I tend to care less about the friendlies)
❑ x_3 could be isNotHome (Can’t watch it when I’m running errands. Can I?)
❑ x_4 could be isManUnitedPlaying (I am a big Man United fan. GGMU!) and so on.

❑ These inputs can either be excitatory or inhibitory.
❑ Inhibitory inputs are those that have the maximum effect on the decision making irrespective of other inputs, i.e., if x_3 is 1 (not home), then my output will always be 0.
McCulloch-Pitts Neuron
❑ Excitatory inputs are NOT the ones that will make the neuron fire on their own, but they might cause it to fire when combined together.
❑ Formally, this is what is going on:
g(x_1, x_2, …, x_n) = x_1 + x_2 + … + x_n
y = f(g(x)) = 1 if g(x) ≥ θ, and 0 otherwise
❑ We can see that g(x) is just doing a sum of the inputs, a simple aggregation.
❑ And θ here is called the thresholding parameter.
❑ For example, if I always watch the game when the sum turns out to be 2 or more, then θ is 2 here (see the sketch below).
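A minimal Python sketch of this aggregate-and-threshold rule, using the football example above (the particular input values and θ = 2 are illustrative assumptions, not fixed by the slides):

```python
def mp_neuron(inputs, theta):
    """McCulloch-Pitts neuron: fire (1) iff the sum of Boolean inputs >= theta."""
    g = sum(inputs)  # aggregation g(x) = x_1 + x_2 + ... + x_n
    return 1 if g >= theta else 0

# Premier League is on and Man United are playing: two active inputs, theta = 2.
x = [1, 0, 0, 1]  # [isPremierLeagueOn, isItAFriendlyGame, isNotHome, isManUnitedPlaying]
print(mp_neuron(x, theta=2))  # -> 1 (will watch)
```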
McCulloch-Pitts Neuron
❑ All excitatory weights have the same positive magnitude w, and all inhibitory weights have the same negative magnitude −p.
❑ The activation function is a binary step function: it is 1 if the net input y_in is greater than or equal to a given threshold value θ, and 0 otherwise.
❑ The inhibition is absolute: a single active inhibitory input prevents the neuron from firing, irrespective of the number of active excitatory inputs (sketched below).
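A sketch of the same unit with absolute inhibition made explicit (the weight magnitude w = 1 and the threshold values are illustrative assumptions):

```python
def mp_neuron_inhibitory(excitatory, inhibitory, theta, w=1):
    """M-P neuron where any active inhibitory input vetoes firing outright."""
    if any(inhibitory):  # absolute inhibition: one inhibitory 1 forces output 0
        return 0
    y_in = w * sum(excitatory)  # all excitatory inputs share the same weight w
    return 1 if y_in >= theta else 0  # binary step activation

print(mp_neuron_inhibitory([1, 1], [0], theta=2))  # -> 1 (fires)
print(mp_neuron_inhibitory([1, 1], [1], theta=2))  # -> 0 (inhibited)
```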
Examples
Q1: For the network shown in the figure above, calculate the net input to the output neuron.
Solution: The given neural net consists of three input neurons and one output neuron. The inputs and weights are:
[x1, x2, x3] = [0.3, 0.5, 0.6]
[w1, w2, w3] = [0.2, 0.1, −0.3]
The net input can be calculated as:
yin = x1w1 + x2w2 + x3w3
= 0.3 × 0.2 + 0.5 × 0.1 + 0.6 × (−0.3)
= 0.06 + 0.05 − 0.18
= −0.07
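A quick numerical check of this net input:

```python
x = [0.3, 0.5, 0.6]
w = [0.2, 0.1, -0.3]
y_in = sum(xi * wi for xi, wi in zip(x, w))  # x1*w1 + x2*w2 + x3*w3
print(round(y_in, 2))  # -> -0.07
```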
Examples
Q2: Calculate the net input for the network shown in the figure above, with a bias included in the network.
Solution: The given net consists of two input neurons, a bias, and an output neuron. The inputs and weights are:
[x1, x2] = [0.2, 0.6]
[w1, w2] = [0.3, 0.7]
Since a bias is included, b = 0.45 and the bias input x0 is equal to 1.
The net input can be calculated as:
yin = b + x1w1 + x2w2
= 0.45 + 0.2 × 0.3 + 0.6 × 0.7
= 0.45 + 0.06 + 0.42
= 0.93
Therefore, yin = 0.93 is the net input.
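The same check with the bias term included:

```python
b = 0.45
x, w = [0.2, 0.6], [0.3, 0.7]
y_in = b + sum(xi * wi for xi, wi in zip(x, w))  # bias plus weighted sum
print(round(y_in, 2))  # -> 0.93
```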
Examples
Q3: Obtain the output of the neuron Y for the network shown in the figure above, using (i) a binary sigmoidal and (ii) a bipolar sigmoidal activation function.
Solution: The given network has three input neurons with bias and one output neuron. These form a single-layer network. The inputs and weights are given as:
[x1, x2, x3] = [0.8, 0.6, 0.4]
[w1, w2, w3] = [0.1, 0.3, −0.2]
with bias b = 0.35 (its input is always 1). The net input to the output neuron is:
yin = b + Σ(i = 1 to n) xi wi   [n = 3, because only 3 input neurons are given]
= b + x1w1 + x2w2 + x3w3
= 0.35 + 0.8 × 0.1 + 0.6 × 0.3 + 0.4 × (−0.2)
= 0.35 + 0.08 + 0.18 − 0.08
= 0.53
Examples
Solution (continued):
(i) For the binary sigmoidal activation function,
y = f(yin) = 1 / (1 + e^(−yin)) = 1 / (1 + e^(−0.53)) = 0.6295
(ii) For the bipolar sigmoidal activation function,
y = f(yin) = 2 / (1 + e^(−yin)) − 1 = 2 / (1 + e^(−0.53)) − 1 = 0.259
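Both values can be checked numerically:

```python
import math

y_in = 0.53
binary = 1 / (1 + math.exp(-y_in))       # binary sigmoid, range (0, 1)
bipolar = 2 / (1 + math.exp(-y_in)) - 1  # bipolar sigmoid, range (-1, 1)
print(round(binary, 4), round(bipolar, 3))  # -> 0.6295 0.259
```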
M-P Neuron to implement the logical AND operation
M-P Neuron to implement the logical OR operation
M-P Neuron to implement the logical XOR operation
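The slides show these implementations as figures. Below is a sketch of one standard parametrization consistent with the thresholding rule above (θ = 2 for AND, θ = 1 for OR are assumptions, not values taken from the figures). XOR cannot be realized by a single M-P unit, which motivates the limitations listed next:

```python
def mp_neuron(inputs, theta):
    return 1 if sum(inputs) >= theta else 0

def AND(x1, x2):
    return mp_neuron([x1, x2], theta=2)  # fires only when both inputs are 1

def OR(x1, x2):
    return mp_neuron([x1, x2], theta=1)  # fires when at least one input is 1

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, AND(x1, x2), OR(x1, x2))
# No single choice of theta reproduces XOR: it is not a linearly separable function.
```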
Limitations of the McCulloch-Pitts Artificial Neuron
❑ What about non-Boolean (say, real) inputs?
❑ Do we always need to hand-code the threshold?
❑ Are all inputs equal? What if we want to assign more importance to some inputs?
❑ What about functions which are not linearly separable, say the XOR function?
❑ It is now clear why we are not using the M-P neuron today.
❑ Overcoming the limitations of the M-P neuron, Frank Rosenblatt, an American psychologist, proposed the classical perceptron model, the mighty artificial neuron, in 1958.
❑ It is a more generalized computational model than the McCulloch-Pitts neuron, where weights and thresholds can be learned over time (see the sketch below).
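A minimal sketch of that generalization, with per-input weights and a learned bias in place of a hand-coded threshold (the learning rate, epoch count, and training data are illustrative assumptions):

```python
def perceptron(x, w, b):
    """Weighted sum plus bias, then a step: inputs need not count equally."""
    return 1 if sum(xi * wi for xi, wi in zip(x, w)) + b >= 0 else 0

def train(data, n_inputs, lr=0.1, epochs=50):
    """Classic perceptron learning rule: nudge weights by the prediction error."""
    w, b = [0.0] * n_inputs, 0.0
    for _ in range(epochs):
        for x, target in data:
            err = target - perceptron(x, w, b)
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Learning AND from examples instead of hand-coding theta = 2:
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train(data, n_inputs=2)
print([perceptron(x, w, b) for x, _ in data])  # -> [0, 0, 0, 1]
```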
