
Neural networks

Feedforward neural network - capacity of single neuron

    h(x) = g(a(x)) = g(b + Σ_i w_i x_i)


ARTIFICIAL NEURON

Topics: connection weights, bias, activation function

Neuron pre-activation (input activation):
    a(x) = b + Σ_i w_i x_i = b + w^T x

Neuron output activation:
    h(x) = g(a(x)) = g(b + Σ_i w_i x_i)

where w are the connection weights, b is the bias, and g(·) is the activation function.
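The two equations above can be sketched in a few lines of Python; the weights, bias, and inputs below are illustrative choices, not values from the slides:

```python
import math

def neuron(x, w, b, g):
    """Single artificial neuron: h(x) = g(a(x)), a(x) = b + sum_i w_i x_i."""
    a = b + sum(w_i * x_i for w_i, x_i in zip(w, x))  # pre-activation a(x)
    return g(a)                                       # output activation h(x)

# Example with a sigmoid activation function
sigm = lambda a: 1.0 / (1.0 + math.exp(-a))
h = neuron([1.0, 0.0], w=[2.0, -1.0], b=-1.0, g=sigm)  # a = -1 + 2*1 + (-1)*0 = 1
```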

[Figure: a single artificial neuron computing output y from inputs x1, x2 through weights w and bias b (from Pascal Vincent's slides). The output range is determined by g(·); the bias b only changes the position of the ridge.]
Activation functions:

    sigmoid:            g(a) = sigm(a) = 1 / (1 + exp(-a))
    hyperbolic tangent: g(a) = tanh(a) = (exp(a) - exp(-a)) / (exp(a) + exp(-a)) = (exp(2a) - 1) / (exp(2a) + 1)
    rectified linear:   g(a) = reclin(a) = max(0, a)
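A minimal sketch of the three activation functions, written directly from the formulas above:

```python
import math

def sigm(a):
    """Sigmoid: squashes the pre-activation into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-a))

def tanh(a):
    """Hyperbolic tangent: squashes the pre-activation into (-1, 1)."""
    return (math.exp(2 * a) - 1) / (math.exp(2 * a) + 1)

def reclin(a):
    """Rectified linear: lower-bounded at 0, unbounded above."""
    return max(0.0, a)
```

Note that the two forms of tanh given above are algebraically identical: multiplying numerator and denominator of (exp(a) - exp(-a)) / (exp(a) + exp(-a)) by exp(a) gives (exp(2a) - 1) / (exp(2a) + 1).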
Topics: capacity, decision boundary of neuron

Could do binary classification:
    with a sigmoid, we can interpret the neuron as estimating p(y = 1 | x)
    this is also known as the logistic regression classifier
    if the output is greater than 0.5, predict class 1; otherwise, predict class 0
    (a similar idea can apply with tanh)

The decision boundary of a single neuron is linear.
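The classification rule above can be sketched as follows; since sigm(a) > 0.5 exactly when a > 0, the boundary {x : b + w·x = 0} is linear. The weights and bias here are illustrative (they happen to implement AND):

```python
import math

def predict(x, w, b):
    """Sigmoid neuron as a logistic regression classifier:
    p(y=1|x) = sigm(b + w.x); predict class 1 when p > 0.5."""
    a = b + sum(wi * xi for wi, xi in zip(w, x))
    p = 1.0 / (1.0 + math.exp(-a))       # estimated p(y = 1 | x)
    return 1 if p > 0.5 else 0

# With w = [1, 1], b = -1.5, the linear boundary x1 + x2 = 1.5 implements AND
label = predict([1, 1], w=[1, 1], b=-1.5)
```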

Single hidden layer neural network (two layers of weights):

Hidden layer pre-activation:
    a(x) = b^(1) + W^(1) x        with  a(x)_i = b_i^(1) + Σ_j W_i,j^(1) x_j

Hidden layer activation:
    h(x) = g(a(x))

Output layer activation:
    f(x) = o(b^(2) + w^(2)T h(x))

For multi-class classification, f(x) estimates p(y = c | x) using the softmax activation function at the output:
    o(a) = softmax(a) = [ exp(a_1) / Σ_c exp(a_c)  ...  exp(a_C) / Σ_c exp(a_c) ]^T

(from Pascal Vincent's slides)
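A minimal forward-pass sketch of this network, using tanh hidden units and a softmax output; the sizes (2 inputs, 3 hidden units, 2 classes) and the random weights are illustrative assumptions, not values from the slides:

```python
import numpy as np

def softmax(a):
    e = np.exp(a - a.max())        # subtract max for numerical stability
    return e / e.sum()

def forward(x, W1, b1, W2, b2):
    """Single-hidden-layer network: f(x) = softmax(b2 + W2 g(b1 + W1 x))."""
    a1 = b1 + W1 @ x               # hidden pre-activation a(x)
    h1 = np.tanh(a1)               # hidden activation h(x) = g(a(x))
    a2 = b2 + W2 @ h1              # output pre-activation
    return softmax(a2)             # estimates of p(y = c | x) for each class c

# Illustrative shapes: 2 inputs, 3 hidden units, 2 classes
rng = np.random.default_rng(0)
f = forward(np.array([1.0, -1.0]),
            rng.normal(size=(3, 2)), np.zeros(3),
            rng.normal(size=(2, 3)), np.zeros(2))
```

The softmax output is a valid probability distribution over classes: every entry is positive and the entries sum to 1.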

ARTIFICIAL NEURON
Topics: capacity of single neuron

Can solve linearly separable problems.

[Figure: truth tables and decision boundaries for OR(x1, x2) and AND(x1, x2), which are linearly separable, contrasted with XOR(x1, x2), which is not.]

Topics: capacity of single neuron

Can't solve non-linearly separable problems (e.g. XOR(x1, x2))...

... unless the input is transformed into a better representation.

[Figure: XOR(x1, x2) is not linearly separable in the original input space, but becomes linearly separable after transforming the inputs with AND-based features.]

Figure 1.8: Example of modeling XOR with a one-hidden-layer network.
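The "better representation" idea can be sketched with hand-picked weights: a hidden layer first computes AND-like features of the inputs, after which XOR becomes linearly separable. The threshold unit and the particular weights below are one illustrative choice, not taken from the text:

```python
def step(a):
    """Hard threshold unit: 1 if the pre-activation is positive, else 0."""
    return 1 if a > 0 else 0

def xor_net(x1, x2):
    """XOR via one hidden layer: h1 = AND(x1, not x2), h2 = AND(not x1, x2),
    then the output neuron computes OR(h1, h2)."""
    h1 = step(x1 - x2 - 0.5)       # fires only for (x1, x2) = (1, 0)
    h2 = step(-x1 + x2 - 0.5)      # fires only for (x1, x2) = (0, 1)
    return step(h1 + h2 - 0.5)     # OR of the two hidden features
```

In the (h1, h2) representation the four XOR cases collapse to three linearly separable points, which is exactly why the hidden layer fixes the single-neuron limitation.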
