Statistical modeling and analysis of neural data (NEU 560), Spring 2018
Lecture 2: Linear Algebra Review
Jonathan Pillow, Princeton University
Linear algebra
"Linear algebra has become as basic and as applicable as calculus, and …"
Notation: a column vector v with N components v1, v2, …, vN is written

v = [v1, v2, …, vN]^T
import numpy as np

# make a 3-component column vector
v = np.array([[3], [1], [-7]])

# transpose (column vector -> row vector)
v.T

# create a row vector directly
v = np.array([[3, 1, -7]])   # row vector, shape (1, 3)
# or
v = np.array([3, 1, -7])     # 1D vector, shape (3,)
# make a vector
v = np.array([1, 7, 3, 0, 1])

# many equivalent ways to compute its norm
np.linalg.norm(v)      # built-in function
np.sqrt(np.dot(v, v))  # sqrt of dot product
np.sqrt(v.T @ v)       # sqrt of v-transpose times v
np.sqrt(sum(v * v))    # sqrt of sum of elementwise product
np.sqrt(sum(v ** 2))   # sqrt of sum of elementwise squares

# note use of @, *, and **:
# @  - matrix multiply
# *  - elementwise multiply
# ** - exponentiation ("raising to a power")
"the identity" matrix (e.g., for 4 × 4)
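In NumPy, the identity matrix mentioned above can be built with `np.eye`; a minimal sketch (the vector `v` is my own example, not from the slides):

import numpy as np

# 4 x 4 identity matrix
I = np.eye(4)

# multiplying by the identity leaves any vector unchanged
v = np.array([3.0, 1.0, -7.0, 2.0])
I @ v   # equals v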
Solving Ax = b: assume (for now) that A is square and invertible, and left-multiply both sides by the inverse of A to get x = A^(-1) b.
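A quick NumPy sketch of this step, using a small invertible matrix of my own choosing (not from the slides). Left-multiplying by the inverse gives the solution, though `np.linalg.solve` is preferred in practice for numerical stability:

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])   # square and invertible (det = 5)
b = np.array([3.0, 5.0])

# x = A^{-1} b  (left-multiply both sides by the inverse of A)
x_inv = np.linalg.inv(A) @ b

# preferred in practice: solve the system directly
x_solve = np.linalg.solve(A, b)

Both give the same x, and A @ x recovers b.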