Neural Networks
Hopfield Nets and Auto Associators Fall 2017
Story so far: we have used neural networks for computation, but all of them were feedforward structures. What about loopy networks, where connections form cycles?
In a loopy network, the output of a neuron affects the input to the neuron:

z_i = Θ(Σ_{j≠i} w_ji z_j + b_i)
A neuron "flips" if the weighted sum of the other neurons' outputs has the opposite sign to its own current output. But this flip changes the input to the other neurons, which may cause some of them, including the first one, to flip, and so on.
Each neuron computes

z_i = Θ(Σ_{j≠i} w_ji z_j + b_i)

where the threshold activation is

Θ(a) = +1 if a > 0, −1 if a ≤ 0
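The update rule above can be written directly in code. A minimal sketch in Python (the function names and the tiny two-neuron network in the usage note are illustrative choices, not from the slides):

```python
import numpy as np

def theta(a):
    """Threshold activation: +1 if a > 0, else -1 (ties map to -1)."""
    return 1 if a > 0 else -1

def update_neuron(z, W, b, i):
    """Recompute neuron i from the field produced by all the other neurons."""
    field = sum(W[i, j] * z[j] for j in range(len(z)) if j != i) + b[i]
    return theta(field)
```

For example, with W = [[0, 1], [1, 0]] and zero bias, neuron 0 simply copies the sign of neuron 1's output.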
Let z_i^− be the output of the i-th neuron just before it responds to the current field, and z_i^+ its output just after.

Case 1: if z_i^− = sign(Σ_{j≠i} w_ji z_j + b_i), then z_i^+ = z_i^−. If the sign of the field matches the neuron's own sign, it does not flip, and

z_i^+ (Σ_{j≠i} w_ji z_j + b_i) − z_i^− (Σ_{j≠i} w_ji z_j + b_i) = 0
Case 2: if z_i^− = −sign(Σ_{j≠i} w_ji z_j + b_i), then z_i^+ = −z_i^−, and

z_i^+ (Σ_{j≠i} w_ji z_j + b_i) − z_i^− (Σ_{j≠i} w_ji z_j + b_i) = 2 z_i^+ (Σ_{j≠i} w_ji z_j + b_i)

This term is always positive! Since z_i^+ has the sign of the field, the product on the right is positive whenever the neuron flips. So every flip increases z_i (Σ_{j≠i} w_ji z_j + b_i).
Summing over all neurons (and assuming symmetric weights, w_ij = w_ji), define the energy of the network:

E = −Σ_{i, j<i} w_ij z_i z_j − Σ_i b_i z_i

A flip of neuron i changes only the terms involving z_i, so

E(z_1, …, z_i^+, …, z_N) − E(z_1, …, z_i^−, …, z_N) = −2 z_i^+ (Σ_{j≠i} w_ji z_j + b_i) ≤ 0

Every flip decreases the energy. The energy is bounded:

E_min = −Σ_{i, j<i} |w_ij| − Σ_i |b_i|  ≤  E  ≤  E_max = Σ_{i, j<i} |w_ij| + Σ_i |b_i|

and every flip decreases E by at least

ΔE_min = min_{i, {z_j, j=1..N}} 2 |Σ_{j≠i} w_ji z_j + b_i|

So the network must converge: after a finite number of flips it arrives at a stable state in which no neuron flips.
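The convergence argument can be checked numerically. A minimal sketch (the random symmetric weights, biases, and network size are toy choices, not from the slides) that runs asynchronous updates and asserts that the energy never increases:

```python
import numpy as np

def energy(z, W, b):
    # E = -1/2 z^T W z - b^T z  (the 1/2 corrects for double-counting pairs)
    return -0.5 * z @ W @ z - b @ z

rng = np.random.default_rng(0)
N = 8
A = rng.standard_normal((N, N))
W = (A + A.T) / 2            # symmetric weights
np.fill_diagonal(W, 0.0)     # no self-connections
b = rng.standard_normal(N)
z = rng.choice([-1.0, 1.0], size=N)
e0 = energy(z, W, b)

for _ in range(100):                         # asynchronous updates
    i = rng.integers(N)
    e_before = energy(z, W, b)
    field = W[i] @ z + b[i]                  # diagonal is zero, so this is the j != i sum
    z[i] = 1.0 if field > 0 else -1.0
    assert energy(z, W, b) <= e_before + 1e-12   # energy never increases
```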
Analogy: magnetic dipoles (a spin glass). Each dipole tries to align with its local field. The total field at dipole i is

f(p_i) = Σ_{j≠i} J y_j / r_ij² + b_i

with an intrinsic term (interaction with the other dipoles, falling off with the square of distance) and an external term b_i. A dipole that is aligned with the field stays put; otherwise it flips:

y_i = y_i     if sign(y_i f(p_i)) = 1
y_i = −y_i    otherwise

In doing so it may change the field at other dipoles, which may flip in turn.
The total potential energy of the system is

PE = C − ½ Σ_i y_i f(p_i) = C − Σ_i Σ_{j>i} J y_i y_j / r_ij² − Σ_i b_i y_i

where C is a constant. Every flip of a dipole into alignment with its field decreases the potential energy; the dipoles stop flipping once any further flip would increase the PE.
The system settles at a configuration where the PE is a local minimum, and if perturbed slightly it returns to that configuration. I.e. the system remembers its stable state and returns to it.
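The dipole dynamics can be simulated in a few lines. A toy sketch (the line of equally spaced dipoles, the coupling constant J, and the small random external field are all my own illustrative assumptions) confirming that flips never increase the potential energy:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 6
pos = np.arange(n, dtype=float)          # dipoles on a line
J = 1.0
b = rng.standard_normal(n) * 0.1         # weak external field
y = rng.choice([-1.0, 1.0], size=n)

def field(y, i):
    # f(p_i) = sum_{j != i} J y_j / r_ij^2 + b_i
    return sum(J * y[j] / (pos[i] - pos[j]) ** 2
               for j in range(n) if j != i) + b[i]

def potential(y):
    # PE = C - sum_{i, j>i} J y_i y_j / r_ij^2 - sum_i b_i y_i   (constant C dropped)
    pair = sum(J * y[i] * y[j] / (pos[i] - pos[j]) ** 2
               for i in range(n) for j in range(i + 1, n))
    return -pair - b @ y

pe0 = potential(y)
for _ in range(200):                     # dipoles flip to align with the local field
    i = rng.integers(n)
    pe = potential(y)
    y[i] = 1.0 if field(y, i) > 0 else -1.0
    assert potential(y) <= pe + 1e-12    # PE never increases
```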
The Hopfield network behaves exactly like this system of dipoles: the energy

E = −Σ_{i, j<i} w_ij z_i z_j − Σ_i b_i z_i

plays the role of the potential energy.

[Figure: energy landscape, plotting PE against the network state; image pilfered from unknown source.]

Every flip of a neuron results in a decrease in energy, so the path to an energy minimum is monotonic: the network moves downhill along the energy contour until it settles at a local minimum.
In matrix form (note the ½, which corrects for each pair being counted twice):

E = −½ zᵀWz − bᵀz

With no bias, the energy is symmetric under negation of the state:

−½ zᵀWz = −½ (−z)ᵀW(−z)

so every state z and its negation −z have the same energy.
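A quick numerical check of this sign symmetry (the random symmetric weight matrix is an illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 6
A = rng.standard_normal((N, N))
W = (A + A.T) / 2            # symmetric weights
np.fill_diagonal(W, 0.0)     # no self-connections

def energy(z, W):
    return -0.5 * z @ W @ z  # bias-free matrix form; note the 1/2

z = rng.choice([-1.0, 1.0], size=N)
# A state and its negation have identical energy:
assert np.isclose(energy(z, W), energy(-z, W))
```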
Running the network to recall a pattern: initialize the state with the input,

z_i(0) = y_i,   0 ≤ i ≤ N−1

then repeatedly update

z_i(t+1) = Θ(Σ_{j≠i} w_ji z_j(t)),   0 ≤ i ≤ N−1

until the state does not change significantly any more. Each update can only decrease the energy

E = −Σ_i Σ_{j>i} w_ji z_i z_j
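The recall procedure above can be sketched as a loop. A minimal sketch (the asynchronous sweep order and the fixed iteration cap are implementation choices, not fixed by the slides; the five-bit pattern in the usage note is hypothetical):

```python
import numpy as np

def recall(W, y, max_iters=100, rng=None):
    """Initialize the network at pattern y and update until the state is stable."""
    if rng is None:
        rng = np.random.default_rng(0)
    z = np.array(y, dtype=float)
    N = len(z)
    for _ in range(max_iters):
        changed = False
        for i in rng.permutation(N):              # asynchronous sweep
            # diagonal of W is zero, so W[i] @ z is the j != i sum
            new = 1.0 if W[i] @ z > 0 else -1.0   # Theta(a)
            if new != z[i]:
                z[i] = new
                changed = True
        if not changed:                           # state no longer changes
            break
    return z
```

Usage: store one pattern with Hebbian weights, corrupt one bit, and recall:

```python
y = np.array([1., -1., 1., 1., -1.])
W = np.outer(y, y)
np.fill_diagonal(W, 0.)
noisy = y.copy()
noisy[0] = -noisy[0]
restored = recall(W, noisy)   # the flipped bit is corrected
```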
With no bias the energy is

E = −Σ_i Σ_{j<i} w_ji z_i z_j

Remember that every stored pattern P is actually two stored patterns, P and −P: both are minima of the energy.
Storing a single pattern: for a pattern y ∈ {−1, +1}^N, choose the Hebbian weights

W = y yᵀ − I

(the identity is subtracted to zero out the self-connections). Then

W y = y (yᵀy) − y = N y − y = (N − 1) y

so the field at every neuron has the same sign as the stored bit: y is a stable state of the network, and so is its ghost −y.
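A quick numerical check of this stability property (the five-bit pattern is an arbitrary illustrative choice):

```python
import numpy as np

y = np.array([1., 1., -1., 1., -1.])      # a stored bipolar pattern
N = len(y)
W = np.outer(y, y) - np.eye(N)            # Hebbian weights, zero diagonal

# The field at each neuron agrees in sign with the stored bit:
field = W @ y
assert np.allclose(field, (N - 1) * y)    # W y = (N - 1) y
# ...and the ghost -y is equally stable:
assert np.allclose(W @ (-y), (N - 1) * (-y))
```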
Number of patterns: how many patterns can we store? Note that subtracting the identity only shifts the energy landscape by an additive constant, since zᵀIz = N for every bipolar state, and y yᵀ is positive semidefinite!
[Figure: the energy landscape has minima at the stored patterns and at their ghosts (negations).]
Designing the network via the eigenvectors of W: let

Y = [y₁ y₂ … y_P e_{P+1} … e_N]

where N is the number of bits, e_{P+1}, …, e_N are "synthetic" non-binary vectors, and y₁, …, y_P, e_{P+1}, …, e_N are all orthogonal to one another. Set

W = Y Λ Yᵀ

To shape the energy function around the stored values: what must the eigenvalues corresponding to the y_p's be? What must the eigenvalues corresponding to the e's be?
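One plausible answer, sketched here as an assumption rather than the slides' own solution (and with continuous orthonormal vectors standing in for binary patterns): give the stored directions positive eigenvalues and the synthetic directions zero, so the bias-free energy E(z) = −½ zᵀWz is lowest along the stored directions.

```python
import numpy as np

rng = np.random.default_rng(2)
N, P = 8, 2
# Orthonormal basis; the first P columns play the role of stored directions
Y, _ = np.linalg.qr(rng.standard_normal((N, N)))
lam = np.zeros(N)
lam[:P] = 1.0                  # positive eigenvalues on stored directions,
                               # zero on the synthetic ones (one possible choice)
W = Y @ np.diag(lam) @ Y.T

def E(z):
    return -0.5 * z @ W @ z    # bias-free energy

stored = Y[:, 0]               # a stored direction
other = Y[:, P]                # a synthetic direction
assert E(stored) < E(other)    # energy is lower along the stored direction
```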