3 Types of layers
Activation Functions
Network Structure
Learning Paradigms
Supervised Learning:
Given training data consisting of input/output pairs, find a function that correctly maps inputs to outputs.
Unsupervised Learning:
Given an unlabeled data set, the network finds patterns and groups the data into categories.
Reinforcement Learning:
No training data is given. An agent interacts with the environment and learns from the cost (or reward) of its actions.
Simple Perceptron
Linearly Separable
The bias is proportional to the offset of the plane from the origin.
The weights determine the slope of the line.
The weight vector is perpendicular to the plane.
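These geometric claims can be checked numerically. A minimal sketch (the weight values here are illustrative, not from the slides): for the line w0 + w1·x1 + w2·x2 = 0, any vector lying in the line is orthogonal to the weight vector (w1, w2), and the line's distance from the origin is |w0|/‖(w1, w2)‖, i.e. proportional to the bias.

```python
# Numeric check of the geometric claims (illustrative weights, not from the
# slides): for the line w0 + w1*x1 + w2*x2 = 0, any vector lying in the line
# is perpendicular to the weight vector (w1, w2), and the line's distance
# from the origin is |w0| / ||(w1, w2)||, i.e. proportional to the bias.

import math

w0, w1, w2 = 1.0, 3.0, 4.0

# two points on the line, obtained by solving for x2 at x1 = 0 and x1 = 1
p = (0.0, -w0 / w2)
q = (1.0, -(w0 + w1 * 1.0) / w2)

# a direction vector lying in the line
d1, d2 = q[0] - p[0], q[1] - p[1]

perp = w1 * d1 + w2 * d2            # dot product with the weight vector
offset = abs(w0) / math.hypot(w1, w2)

print(abs(perp) < 1e-12)            # -> True: weight vector is perpendicular
print(offset)                       # -> 0.2
```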
The desired output is
d_n = +1 if x_n ∈ set A
d_n = −1 if x_n ∈ set B
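This decision rule can be written as a tiny function. A minimal sketch; the function name and the convention "bias w0 on an augmented input" are my own choices:

```python
# The decision rule d_n = +1 (set A) / -1 (set B) as code. A minimal sketch;
# the function name and the convention "bias w0, weights w1..wm" are mine.

def perceptron_output(w, x):
    """Return +1 if w0 + w1*x1 + ... + wm*xm > 0, else -1."""
    activation = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
    return 1 if activation > 0 else -1

# with the initial weights of the learning example below, w = (0, 1, 0.5):
print(perceptron_output((0.0, 1.0, 0.5), (1, 1)))    # -> 1
print(perceptron_output((0.0, 1.0, 0.5), (-1, -1)))  # -> -1
```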
Learning Example
Initial values:
η = 0.2
w = (w0, w1, w2)ᵀ = (0, 1, 0.5)ᵀ
Decision boundary: 0 = w0 + w1·x1 + w2·x2 = 0 + x1 + 0.5·x2
⟹ x2 = −2·x1
Learning Example
η = 0.2
w = (0, 1, 0.5)ᵀ
Sample: x1 = 1, x2 = 1
wᵀx > 0
Correct classification, no action.
Learning Example
η = 0.2
w = (0, 1, 0.5)ᵀ
Sample: x1 = 2, x2 = −2
wᵀx > 0, but the sample belongs to the negative class: misclassified, so update w ← w − ηx (with augmented x = (1, x1, x2)ᵀ):
w0 ← w0 − 0.2·1
w1 ← w1 − 0.2·2
w2 ← w2 − 0.2·(−2)
Learning Example
η = 0.2
w = (−0.2, 0.6, 0.9)ᵀ
Sample: x1 = 2, x2 = −2
After the update: w0 = 0 − 0.2·1 = −0.2, w1 = 1 − 0.2·2 = 0.6, w2 = 0.5 − 0.2·(−2) = 0.9
Learning Example
η = 0.2
w = (−0.2, 0.6, 0.9)ᵀ
Sample: x1 = −1, x2 = −1.5
wᵀx < 0
Correct classification, no action.
Learning Example
η = 0.2
w = (−0.2, 0.6, 0.9)ᵀ
Sample: x1 = −2, x2 = −1
wᵀx < 0
Correct classification, no action.
Learning Example
η = 0.2
w = (−0.2, 0.6, 0.9)ᵀ
Sample: x1 = −2, x2 = 1
wᵀx < 0, but the sample belongs to the positive class: misclassified, so update w ← w + ηx:
w0 ← w0 + 0.2·1
w1 ← w1 + 0.2·(−2)
w2 ← w2 + 0.2·1
Learning Example
η = 0.2
w = (0, 0.2, 1.1)ᵀ
Sample: x1 = −2, x2 = 1
After the update: w0 = −0.2 + 0.2·1 = 0, w1 = 0.6 + 0.2·(−2) = 0.2, w2 = 0.9 + 0.2·1 = 1.1
Learning Example
η = 0.2
w = (0, 0.2, 1.1)ᵀ
Sample: x1 = 1.5, x2 = −0.5
wᵀx < 0, but the sample belongs to the positive class: misclassified, so update w ← w + ηx:
w0 ← w0 + 0.2·1
w1 ← w1 + 0.2·1.5
w2 ← w2 + 0.2·(−0.5)
Learning Example
η = 0.2
w = (0.2, 0.5, 1)ᵀ
Sample: x1 = 1.5, x2 = −0.5
After the update: w0 = 0 + 0.2·1 = 0.2, w1 = 0.2 + 0.2·1.5 = 0.5, w2 = 1.1 + 0.2·(−0.5) = 1
All six samples are now classified correctly; the algorithm has converged.
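The sequence of slides can be reproduced by a short training loop. A minimal sketch, assuming the usual update w ← w + η·d·x on each misclassified sample; the labels d are inferred from the corrections the slides perform:

```python
# One pass of perceptron learning on the slides' data, starting from
# w = (0, 1, 0.5) with eta = 0.2. Labels are inferred from the updates
# performed on the slides.

def train_pass(points, w, eta):
    """Update w <- w + eta * d * x on every misclassified sample.
    Each point is ((x1, x2), d); x is augmented with a leading 1."""
    for (x1, x2), d in points:
        x = (1.0, x1, x2)
        activation = sum(wi * xi for wi, xi in zip(w, x))
        if d * activation <= 0:                 # misclassified
            w = [wi + eta * d * xi for wi, xi in zip(w, x)]
    return w

points = [((1, 1), +1), ((2, -2), -1), ((-1, -1.5), -1),
          ((-2, -1), -1), ((-2, 1), +1), ((1.5, -0.5), +1)]
w = train_pass(points, [0.0, 1.0, 0.5], eta=0.2)
print([round(wi, 3) for wi in w])               # -> [0.2, 0.5, 1.0]
```

A single pass already yields the final weights (0.2, 0.5, 1); on this data a further pass changes nothing, since every sample is then classified correctly.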
(x_j is the j-th input vector on which the weights were updated, sign-normalized so that the correct output is always +1)

Assume the classes are linearly separable and let w* be a weight vector that separates them, i.e. w*ᵀx_j > 0 for all j. With w_0 = 0 and unit learning rate, after k updates

w_{k+1} = x_k + … + x_1 + w_0

w*ᵀw_{k+1} = w*ᵀx_1 + … + w*ᵀx_k

These are all > 0, hence all ≥ γ, where γ = min_j w*ᵀx_j.

Thus we get

w*ᵀw_{k+1} ≥ kγ

Since ‖w*‖ ‖w_{k+1}‖ ≥ w*ᵀw_{k+1} (Cauchy–Schwarz), this gives

‖w_{k+1}‖² ≥ k²γ² / ‖w*‖²
On the other hand, each update step gives

‖w_{j+1}‖² = ‖w_j + x_j‖² = ‖w_j‖² + ‖x_j‖² + 2 w_jᵀx_j

x_j was incorrectly classified, so w_jᵀx_j < 0.

Thus we get

‖w_{j+1}‖² ≤ ‖w_j‖² + ‖x_j‖²
‖w_j‖² ≤ ‖w_{j−1}‖² + ‖x_{j−1}‖²
…
‖w_1‖² ≤ ‖w_0‖² + ‖x_1‖²

We get (with w_0 = 0)

‖w_{k+1}‖² ≤ Σ_{j=1..k} ‖x_j‖²

With β = max_j ‖x_j‖² this yields

‖w_{k+1}‖² ≤ kβ

Combining both bounds,

k²γ² / ‖w*‖² ≤ ‖w_{k+1}‖² ≤ kβ

so the number of updates cannot exceed

k_max = β ‖w*‖² / γ²

Hence the perceptron converges after at most k_max updates.
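The bound can be sanity-checked on the data of the learning example. A sketch under the proof's assumptions (w_0 = 0, unit learning rate, inputs sign-normalized so the correct output is always +1); as separating vector w* we take the final weights found in the learning example, which do separate the data:

```python
# Sanity check of k_max = beta * ||w*||^2 / gamma^2 on the learning-example
# data, under the proof's assumptions: w_0 = 0, unit learning rate, inputs
# sign-normalized so every correct output is +1. As separating vector w*
# we take the final weights from the learning example.

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

points = [((1, 1), +1), ((2, -2), -1), ((-1, -1.5), -1),
          ((-2, -1), -1), ((-2, 1), +1), ((1.5, -0.5), +1)]
# sign-normalized augmented inputs z = d * (1, x1, x2)
zs = [tuple(d * xi for xi in (1.0, x1, x2)) for (x1, x2), d in points]

w_star = (0.2, 0.5, 1.0)
gamma = min(dot(w_star, z) for z in zs)          # margin, here 0.2
beta = max(dot(z, z) for z in zs)                # largest ||x||^2, here 9
k_max = beta * dot(w_star, w_star) / gamma ** 2  # about 290

# run the perceptron from w = 0 and count the mistakes it actually makes
w, mistakes = [0.0, 0.0, 0.0], 0
while True:
    errors = 0
    for z in zs:
        if dot(w, z) <= 0:                       # mistake: need w^T z > 0
            w = [wi + zi for wi, zi in zip(w, z)]
            errors += 1
    mistakes += errors
    if errors == 0:
        break

print(mistakes <= k_max)                         # -> True
```

The bound is very loose here: training from zero converges after only a few mistakes, far below k_max ≈ 290.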
The End