Linear classifiers: prediction equations - LINEAR CLASSIFIERS IN PYTHON - PowerPoint PPT Presentation

SLIDE 1

Linear classifiers: prediction equations

LINEAR CLASSIFIERS IN PYTHON

Michael (Mike) Gelbart

Instructor, The University of British Columbia

SLIDE 2

Dot Products

x = np.arange(3)
x
array([0, 1, 2])
y = np.arange(3, 6)
y
array([3, 4, 5])
x*y
array([0, 4, 10])
np.sum(x*y)
14
x@y
14

x@y is called the dot product of x and y, and is written x ⋅ y.

SLIDE 3

Linear classifier prediction

raw model output = coefficients ⋅ features + intercept

Linear classifier prediction: compute the raw model output, then check the sign. If positive, predict one class; if negative, predict the other class. This is the same for logistic regression and linear SVM:

fit is different but predict is the same.
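A minimal sketch of this rule (the coefficient and intercept values below are made up for illustration, not learned from data):

```python
import numpy as np

# Hypothetical learned parameters (illustrative only)
coefficients = np.array([1.0, 2.0])
intercept = -3.0

def predict_one(features):
    # Raw model output = coefficients . features + intercept
    raw = coefficients @ features + intercept
    # Positive raw output -> one class (+1); negative -> the other (-1)
    return 1 if raw > 0 else -1

print(predict_one(np.array([2.0, 2.0])))  # raw = 3.0  -> predicts +1
print(predict_one(np.array([0.0, 1.0])))  # raw = -1.0 -> predicts -1
```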

SLIDE 4

How LogisticRegression makes predictions

raw model output = coefficients ⋅ features + intercept

lr = LogisticRegression()
lr.fit(X, y)
lr.predict(X)[10]
lr.predict(X)[20]
1

SLIDE 5

How LogisticRegression makes predictions (cont.)

lr.coef_ @ X[10] + lr.intercept_  # raw model output
array([-33.78572166])
lr.coef_ @ X[20] + lr.intercept_  # raw model output
array([0.08050621])
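The slide's X and y aren't shown here, but the same check can be sketched on any dataset (the tiny dataset below is made up for illustration, not the course's data): the sign of the raw model output agrees with predict.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical tiny dataset (illustrative only)
X = np.array([[0.0, 1.0], [1.0, 0.0], [2.0, 2.0], [3.0, 1.0]])
y = np.array([0, 0, 1, 1])

lr = LogisticRegression()
lr.fit(X, y)

# Raw model output = coefficients . features + intercept, for every row of X
raw = X @ lr.coef_.ravel() + lr.intercept_

# A positive raw output corresponds to predicting the second class
agrees = (raw > 0) == (lr.predict(X) == lr.classes_[1])
print(agrees.all())  # True: predict() just checks the sign of the raw output
```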

SLIDE 6

The raw model output

SLIDE 7

The raw model output

SLIDE 8

The raw model output

SLIDE 9

Let's practice!

SLIDE 10

What is a loss function?

Michael Gelbart

Instructor, The University of British Columbia

SLIDE 11

Least squares: the squared loss

scikit-learn's LinearRegression minimizes a loss:

∑_{i=1}^{n} (true ith target value − predicted ith target value)²

Minimization is with respect to the coefficients or parameters of the model. Note that in scikit-learn, model.score() isn't necessarily the loss function.
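As a sketch of this loss (the tiny dataset below is made up for illustration), the squared loss can be computed directly from a fitted model's predictions:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical tiny regression dataset (illustrative only)
X_toy = np.array([[1.0], [2.0], [3.0], [4.0]])
y_toy = np.array([1.0, 2.5, 2.9, 4.2])

model = LinearRegression()
model.fit(X_toy, y_toy)

# Squared loss: sum over all examples of (true - predicted)^2
squared_loss = np.sum((y_toy - model.predict(X_toy)) ** 2)
print(squared_loss)
```

Since LinearRegression minimizes this quantity, no other line (for example, a flat line at the mean of y_toy) can achieve a smaller squared loss on the same data.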

SLIDE 12

Classification errors: the 0-1 loss

Squared loss is not appropriate for classification problems (more on this later). A natural loss for a classification problem is the number of errors. This is the 0-1 loss: it's 0 for a correct prediction and 1 for an incorrect prediction. But this loss is hard to minimize!
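The 0-1 loss on a whole dataset is just the error count. A minimal sketch (the labels below are made up for illustration):

```python
import numpy as np

# Hypothetical true and predicted labels (illustrative only)
y_true = np.array([ 1, -1,  1,  1, -1])
y_pred = np.array([ 1,  1,  1, -1, -1])

# 0-1 loss: 1 for each incorrect prediction, 0 for each correct one
zero_one_loss = np.sum(y_pred != y_true)
print(zero_one_loss)  # 2 errors
```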

SLIDE 13

Minimizing a loss

from scipy.optimize import minimize
minimize(np.square, 0).x
array([0.])
minimize(np.square, 2).x
array([-1.88846401e-08])
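In the same spirit, minimize can fit a model by minimizing a loss directly. A sketch with a made-up one-feature dataset (my_loss and the data are illustrative, not from the slides): the coefficient of a no-intercept linear model is found by minimizing the squared loss.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical one-feature data (illustrative only)
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 7.8])

# Squared loss as a function of a single coefficient w (no intercept)
def my_loss(w):
    return np.sum((y - w * x) ** 2)

# Minimize the loss starting from w = 0
w_best = minimize(my_loss, 0).x
print(w_best)  # close to the least-squares slope, about 1.99
```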

SLIDE 14

Let's practice!

SLIDE 15

Loss function diagrams

Michael (Mike) Gelbart

Instructor, The University of British Columbia

SLIDE 16

The raw model output

SLIDE 17

0-1 loss diagram

SLIDE 18

Linear regression loss diagram

SLIDE 19

Logistic loss diagram

SLIDE 20

Hinge loss diagram

SLIDE 21

Hinge loss diagram
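The diagrams above plot each loss against the raw model output times the true label. The slides show only the figures; as a sketch, the standard formulas are logistic loss log(1 + e^(−z)) and hinge loss max(0, 1 − z), where z = raw model output ⋅ true label:

```python
import numpy as np

# z = raw model output times the true label (+1/-1);
# z > 0 means a correct prediction
z = np.linspace(-2, 2, 5)

zero_one = (z <= 0).astype(float)  # 0-1 loss: 1 on the incorrect (z <= 0) side
logistic = np.log(1 + np.exp(-z))  # logistic loss: smooth, always positive
hinge = np.maximum(0, 1 - z)       # hinge loss: zero once z reaches 1

print(zero_one)  # [1. 1. 1. 0. 0.]
print(hinge)     # [3. 2. 1. 0. 0.]
```

Unlike the 0-1 loss, the logistic and hinge losses are easy to minimize, which is why logistic regression and linear SVM use them.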

SLIDE 22

Let's practice!
