Basic Perceptron
The dendrites branch off from the soma in a tree-like way, getting thinner with
every branch. They receive signals (impulses) from other neurons at synapses. The
axon - there is always only one - also leaves the soma and usually extends for
longer distances than the dendrites. The axon is used for sending the output of the
neuron to other neurons, or more precisely to the synapses of other neurons.
Neural Networks CSE425
Moin Mostakim Assignment 1 January 30, 2020
When a signal comes in, it gets multiplied by a weight value that is assigned to this
particular input. That is, if a neuron has three inputs, then it has three weights that
can be adjusted individually. The weights usually get adjusted during the learning
phase. After this, the modified input signals are summed up. It is also possible to
add a so-called bias b to this sum. The bias is a value which can also be adjusted
during the learning phase.
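The steps above can be sketched in a few lines. The weight, bias, and input values here are illustrative placeholders, not learned parameters:

```python
import numpy as np

# Three inputs, three individually adjustable weights, plus a bias.
# The values are placeholders for illustration, not learned weights.
weights = np.array([0.5, -0.2, 0.8])
bias = 0.1
inputs = np.array([1.0, 2.0, 3.0])

# Each input signal is multiplied by its weight, the products are
# summed up, and the bias is added to that sum.
weighted_sum = np.dot(weights, inputs) + bias
print(weighted_sum)  # 0.5 - 0.4 + 2.4 + 0.1, i.e. about 2.6
```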
Figure 4: Perceptron
Finally, the actual output has to be determined. For this purpose an activation or
step function φ is applied to the weighted sum of the input values.
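For example, φ can be a simple threshold step. The function name and the threshold of 0 below are assumptions for illustration, not part of the assignment:

```python
import numpy as np

def phi(weighted_sum, threshold=0.0):
    # Step activation: output 1 if the weighted sum exceeds the
    # threshold, otherwise 0. The threshold of 0 is illustrative.
    return 1 if weighted_sum > threshold else 0

weights = np.array([0.5, -0.2, 0.8])
bias = 0.1
inputs = np.array([1.0, 2.0, 3.0])
output = phi(np.dot(weights, inputs) + bias)
print(output)  # the weighted sum is about 2.6, above the threshold, so 1
```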
import numpy as np

def create_distance_function(a, b, c):
    """ 0 = ax + by + c """
    def distance(x, y):
        """ returns tuple (d, pos)
        d is the distance
        If pos == -1 point is below the line,
        0 on the line and +1 if above the line
        """
        nom = a * x + b * y + c
        if nom == 0:
            pos = 0
        elif (nom < 0 and b < 0) or (nom > 0 and b > 0):
            # nom and b have the same sign, so y lies above the line
            pos = 1
        else:
            pos = -1
        return (np.absolute(nom) / np.sqrt(a ** 2 + b ** 2), pos)
    return distance
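As a quick sanity check, here is a hypothetical usage, with the function restated compactly so the snippet is self-contained: the point (2, 3) lies above the horizontal line y = 1 (written as 0·x + 1·y - 1 = 0) at distance 2, while (2, 0) lies below it at distance 1.

```python
import numpy as np

def create_distance_function(a, b, c):
    """Distance and side of a point relative to the line 0 = ax + by + c."""
    def distance(x, y):
        nom = a * x + b * y + c
        if nom == 0:
            pos = 0
        elif (nom < 0 and b < 0) or (nom > 0 and b > 0):
            pos = 1   # above the line
        else:
            pos = -1  # below the line
        return (np.absolute(nom) / np.sqrt(a ** 2 + b ** 2), pos)
    return distance

# Horizontal line y = 1, i.e. 0*x + 1*y - 1 = 0
dist = create_distance_function(0, 1, -1)
print(dist(2, 3))  # distance 2, above the line
print(dist(2, 0))  # distance 1, below the line
```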
import numpy as np
import matplotlib.pyplot as plt

# 'points' is assumed to be defined earlier: a list of
# (sweetness, sourness) tuples, one point per class.
fig, ax = plt.subplots()
ax.set_xlabel("sweetness")
ax.set_ylabel("sourness")
ax.set_xlim([-1, 6])
ax.set_ylim([-1, 8])
X = np.arange(-0.5, 5, 0.1)
size = 10
for (index, (x, y)) in enumerate(points):
    if index == 0:
        ax.plot(x, y, "o",
                color="darkorange",
                markersize=size)
    else:
        ax.plot(x, y, "oy",
                markersize=size)
step = 0.05
for x in np.arange(0, 1 + step, step):
    slope = np.tan(np.arccos(x))
    dist4line1 = create_distance_function(slope, -1, 0)
    # print("x: ", x, " slope: ", slope)
    Y = slope * X
    results = []
    for point in points:
        results.append(dist4line1(*point))
    # print(slope, results)
    if results[0][1] != results[1][1]:
        # the two points fall on different sides: this line separates them
        ax.plot(X, Y, "g-")
    else:
        ax.plot(X, Y, "r-")
plt.show()
class Perceptron:

    @staticmethod
    def unit_step_function(x):
        if x > 0.5:
            return 1
        return 0
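The snippet shows only the static activation method. For a call like p = Perceptron(2, np.array([0.5, 0.5])) followed by p(data_in) to work, the class also needs a constructor that stores the weights and a __call__ method that applies the step function to the weighted sum. A minimal completion consistent with that usage (the default of 0.5 per weight is an assumption) might look like this:

```python
import numpy as np

class Perceptron:
    def __init__(self, input_length, weights=None):
        # One weight per input; 0.5 each is an assumed default.
        if weights is None:
            self.weights = np.ones(input_length) * 0.5
        else:
            self.weights = weights

    @staticmethod
    def unit_step_function(x):
        # Fire only if the weighted sum exceeds the 0.5 threshold.
        if x > 0.5:
            return 1
        return 0

    def __call__(self, in_data):
        weighted_sum = (self.weights * np.array(in_data)).sum()
        return Perceptron.unit_step_function(weighted_sum)

p = Perceptron(2, np.array([0.5, 0.5]))
print(p((1, 1)))  # 0.5 + 0.5 = 1.0 > 0.5, so 1
```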
p = Perceptron(2, np.array([0.5, 0.5]))
for in1 in range(2):
    for in2 in range(2):
        data_in = (in1, in2)
        data_out = p(data_in)
        print(data_in, data_out)
We will see that the neural network finds a line that separates the two classes.
This line should not be mistaken for the line that we used to create the points.