Application of PCA to Facial Recognition
Aaron Kosmatin, Clayton Broman (PowerPoint presentation)


SLIDE 1

Application of PCA to Facial Recognition

Aaron Kosmatin, Clayton Broman

Math 45

December 17, 2010

SLIDE 2

1. Outline
2. Introduction
3. Principal Component Analysis
4. Applying PCA to Images

SLIDE 3

Applications of Facial Recognition

Photo Organization

- Google's Picasa
- Facebook's Face Locator
- Apple's iLife
- Sony's Picture Motion Browser

Automation

- Border crossings that check passports
- Replacing passwords on computers

Misc.

- Many digital cameras
- Microsoft Kinect

SLIDE 4

Challenges

- Computation speed
- False positives
- Organizing information

SLIDE 5

Solutions

- Principal Component Analysis (PCA)
- Linear Discriminant Analysis
- Elastic Bunch Graph Matching
- Hidden Markov Models
- Dynamic Link Matching

SLIDE 6

Overview

PCA uses a covariance matrix to analyze how the dimensions of a space vary with respect to one another. Some directions in the data will have low variation while others have high, depending on how consistently the data departs from its average. The eigenvectors of the covariance matrix form a basis for the space containing the data, and when the eigenvalues are sorted from highest to lowest, they rank those basis directions from most to least variation.
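The procedure described here can be sketched in a few lines of NumPy (a toy random dataset stands in for real data; the variable names are illustrative):

```python
import numpy as np

# Toy dataset: 100 observations of 3 dimensions, two of them correlated.
rng = np.random.default_rng(0)
base = rng.normal(size=(100, 1))
data = np.hstack([base,
                  2 * base + 0.1 * rng.normal(size=(100, 1)),
                  rng.normal(size=(100, 1))])

# Covariance matrix of the mean-centered data.
centered = data - data.mean(axis=0)
C = centered.T @ centered / (len(data) - 1)

# Eigenvectors of C form a basis; sort eigenvalues from highest to lowest.
eigenvalues, eigenvectors = np.linalg.eigh(C)   # eigh: C is symmetric
order = np.argsort(eigenvalues)[::-1]
eigenvalues = eigenvalues[order]
eigenvectors = eigenvectors[:, order]
```

The first column of `eigenvectors` is then the direction of highest variation in the data.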

SLIDE 7

Analysis of Datasets

When comparing datasets, a more useful measure than the mean alone is required. While [1 3 5 7 9], [3 4 5 6 7], and [5 5 5 5 5] have the same mean, the range of values is higher in some sets than in others. This spread is measured by the standard deviation:

\[
\mathrm{sdev} = \sqrt{\frac{\sum_{i=1}^{n} (X_i - \bar{X})^2}{n - 1}}
\]
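A quick check of this point with NumPy (the three example sets are taken from the slide; `ddof=1` selects the n − 1 form used above):

```python
import numpy as np

sets = [np.array([1, 3, 5, 7, 9]),
        np.array([3, 4, 5, 6, 7]),
        np.array([5, 5, 5, 5, 5])]

means = [s.mean() for s in sets]
# ddof=1 gives the sample (n - 1) standard deviation used above.
sdevs = [s.std(ddof=1) for s in sets]
```

All three means are 5, but the standard deviations differ, distinguishing the sets.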

SLIDE 8

Variance

The variance is closely related to the standard deviation:

\[
\mathrm{variance} = \frac{\sum_{i=1}^{n} (X_i - \bar{X})^2}{n - 1}
\]

Variance is the square of the standard deviation.

\[
\mathrm{covariance} = \frac{\sum_{i=1}^{n} (X_i - \bar{X})(Y_i - \bar{Y})}{n - 1}
\]

Covariance replaces one factor of the squared deviation with the deviation of a second dimension. The covariance shows the amount that two dimensions vary with respect to one another.
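Both formulas can be checked directly on a small made-up dataset (the x and y values here are illustrative, not from the slides), against NumPy's built-in versions:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.0, 5.0, 4.0, 5.0])
n = len(x)

# Variance: mean squared deviation, divided by n - 1.
variance = np.sum((x - x.mean()) ** 2) / (n - 1)

# Covariance: paired deviations of the two dimensions, divided by n - 1.
covariance = np.sum((x - x.mean()) * (y - y.mean())) / (n - 1)
```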

SLIDE 9

The Covariance Matrix

The covariance matrix groups all of the covariances into a matrix:

\[
C = \begin{pmatrix}
\mathrm{cov}(x, x) & \mathrm{cov}(x, y) & \mathrm{cov}(x, z) \\
\mathrm{cov}(y, x) & \mathrm{cov}(y, y) & \mathrm{cov}(y, z) \\
\mathrm{cov}(z, x) & \mathrm{cov}(z, y) & \mathrm{cov}(z, z)
\end{pmatrix}
\]

The main diagonal of the covariance matrix holds the variance of each dimension. Since cov(a, b) = cov(b, a), the covariance matrix is symmetric. It can be found by subtracting the average from each dimension and then computing \(A^{T}A / (n-1)\), where the rows of A are the mean-centered data points.

SLIDE 10

Original Data

x     y
−0.4  0.4
0.6   0.0
0.7   1.0
1.8   1.8
1.4   2.2
2.7   2.2
3.0   2.3
3.0   3.5
3.8   4.0
4.2   3.5

[Graph of original data]

SLIDE 11

Adjusting Data

x     y     x − x̄   y − ȳ
−0.4  0.4   −2.5    −1.7
0.6   0.0   −1.5    −2.1
0.7   1.0   −1.4    −1.2
1.8   1.8   −0.2    −0.3
1.4   2.2   −0.6    0.1
2.7   2.2   0.6     0.1
3.0   2.3   0.9     0.2
3.0   3.5   1.0     1.4
3.8   4.0   1.7     1.9
4.2   3.5   2.1     1.4

x̄ = 2.1, ȳ = 2.1

[Graph of adjusted data]

SLIDE 12

Creating the Covariance Matrix

\[
\mathrm{cov}(x_{\text{adjusted}}, y_{\text{adjusted}}) =
\begin{pmatrix}
2.2573 & 1.8629 \\
1.8629 & 1.8240
\end{pmatrix}
\]
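This matrix can be reproduced from the original data (to within rounding: the slide's printed entries appear to come from the rounded adjusted values, so exact arithmetic on the raw data differs slightly in some entries):

```python
import numpy as np

x = np.array([-0.4, 0.6, 0.7, 1.8, 1.4, 2.7, 3.0, 3.0, 3.8, 4.2])
y = np.array([ 0.4, 0.0, 1.0, 1.8, 2.2, 2.2, 2.3, 3.5, 4.0, 3.5])

# Subtract the mean of each dimension, then apply A^T A / (n - 1).
adjusted = np.column_stack([x - x.mean(), y - y.mean()])
C = adjusted.T @ adjusted / (len(x) - 1)
```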

SLIDE 13

Eigenvalues of the Covariance Matrix

\[
\text{eigenvalues} = (0.1652,\ 3.9161)
\qquad
\text{eigenvectors} =
\begin{pmatrix}
0.6650 & -0.7468 \\
-0.7468 & -0.6650
\end{pmatrix}
\]

The dominant eigenvector, the one paired with the largest eigenvalue (3.9161), is:

\[
\begin{pmatrix} -0.7468 \\ -0.6650 \end{pmatrix}
\]
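The same decomposition with NumPy (note that `eigh` returns eigenvalues in ascending order, and eigenvector signs are only determined up to ±1, so they may differ from the slide's):

```python
import numpy as np

C = np.array([[2.2573, 1.8629],
              [1.8629, 1.8240]])

# eigh handles symmetric matrices; eigenvalues come back in ascending order.
eigenvalues, eigenvectors = np.linalg.eigh(C)
dominant = eigenvectors[:, np.argmax(eigenvalues)]
```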

SLIDE 14

Using Eigenvectors as Basis for Data

\[
\text{FinalData} = V^{T} \times \text{AdjustedData}^{T}
\]

v1       v2
−0.3691  2.9541
0.5401   2.4921
−0.0670  1.8298
0.0366   0.4173
−0.5360  0.3831
0.3221   −0.5185
0.4381   −0.8003
−0.4276  −1.6601
−0.2500  −2.5932
0.3128   −2.5043

[Graph: eigenvectors as basis]

SLIDE 15

Reducing Dimensions

v2
2.9541
2.4921
1.8298
0.4173
0.3831
−0.5185
−0.8003
−1.6601
−2.5932
−2.5043

Data in terms of the dominant eigenvector.

SLIDE 16

Restoring Data

\[
(V \times v2\text{Data}) + \text{OriginalAverage}
\]

x       y
−0.1274 0.1378
0.2175  0.4452
0.7119  0.8858
1.7665  1.8255
1.7920  1.8482
2.4651  2.4481
2.6755  2.6356
3.3174  3.2076
4.0141  3.8284
3.9476  3.7692

Restored data in terms of the dominant eigenvector.
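The projection, reduction, and restoration steps of the last three slides can be run end to end on the example data (eigenvector signs, and hence the signs of the projected coordinates, may differ from the slide's):

```python
import numpy as np

x = np.array([-0.4, 0.6, 0.7, 1.8, 1.4, 2.7, 3.0, 3.0, 3.8, 4.2])
y = np.array([ 0.4, 0.0, 1.0, 1.8, 2.2, 2.2, 2.3, 3.5, 4.0, 3.5])
data = np.column_stack([x, y])

mean = data.mean(axis=0)
adjusted = data - mean                       # one mean-centered point per row

C = adjusted.T @ adjusted / (len(data) - 1)
eigenvalues, V = np.linalg.eigh(C)

# FinalData = V^T x AdjustedData^T: coordinates in the eigenvector basis.
final = V.T @ adjusted.T

# Keep only the component for the dominant (largest) eigenvalue ...
dominant = np.argmax(eigenvalues)
v2_data = final[dominant]

# ... and restore: (V x v2Data) + OriginalAverage.
restored = np.outer(V[:, dominant], v2_data).T + mean
```

The restored points all lie on the line through the mean in the dominant direction; the discarded minor component is exactly the reconstruction error.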

SLIDE 17

Why PCA

PCA highlights where there is high variance in a face. It reduces the data to the areas of high variance and recognizes faces based on those. The data being compared is the pixels: each image is converted to a column vector, so each row holds the same pixel location across all images, and the data runs across the rows rather than down the columns.

SLIDE 18

Γ

Each image is read into the computer and converted to a column vector Γᵢ. The vectors Γᵢ are collected as the columns of a matrix:

\[
\Gamma = [\Gamma_1\ \Gamma_2\ \ldots\ \Gamma_M]
\]

SLIDE 19

Ψ

Ψ is the average of the image vectors:

\[
\Psi = \frac{1}{M} \sum_{i=1}^{M} \Gamma_i
\]

SLIDE 20

Φ

Φ centers Γ around the origin. Φ is created with:

\[
\Phi_i = \Gamma_i - \Psi
\]
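The Γ, Ψ, and Φ constructions of the last three slides, sketched with small random arrays standing in for face images:

```python
import numpy as np

rng = np.random.default_rng(0)
M, height, width = 5, 4, 4            # 5 toy "images", 4 x 4 pixels each
images = rng.random((M, height, width))

# Gamma: each image flattened to a column vector, columns side by side.
Gamma = np.column_stack([img.ravel() for img in images])   # (pixels, M)

# Psi: the average image.
Psi = Gamma.mean(axis=1, keepdims=True)

# Phi: each image centered around the origin.
Phi = Gamma - Psi
```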


SLIDE 24

A Large Dataset

The data could be analyzed by solving for the covariance matrix directly, but recall that the rows of Φ hold the pixels in the same location in each image. Therefore \(A = \Phi^{T}\) and \(C = A^{T}A\), which means \(C = \Phi\Phi^{T}\): a matrix with one row and one column per pixel, whose diagonalization would require a large amount of computation.
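The size difference is the whole problem: for M images of N pixels each, ΦΦᵀ is N × N while ΦᵀΦ is only M × M. A sketch with illustrative sizes:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 10_000, 20                  # N pixels per image, M images
Phi = rng.normal(size=(N, M))      # stand-in for the centered image columns

small = Phi.T @ Phi                # M x M: cheap to store and diagonalize
# big = Phi @ Phi.T                # N x N: 10,000 x 10,000 entries; avoid forming it
```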


SLIDE 27

Eigenvectors of the Covariance Matrix

If we start with the much smaller eigenvalue problem

\[
\Phi^{T}\Phi\, v_i = \lambda_i v_i
\]

and multiply on the left by Φ:

\[
\Phi\Phi^{T} (\Phi v_i) = \lambda_i (\Phi v_i)
\]

So the eigenvectors of the covariance matrix \(C = \Phi\Phi^{T}\) can be found as the eigenvectors of \(\Phi^{T}\Phi\), each multiplied on the left by Φ.
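This identity can be verified numerically (random data stands in for the centered images, and the N × N matrix is never formed explicitly):

```python
import numpy as np

rng = np.random.default_rng(0)
Phi = rng.normal(size=(200, 6))          # 200 pixels, 6 stand-in images

lams, vs = np.linalg.eigh(Phi.T @ Phi)   # small 6 x 6 eigenvalue problem
v = vs[:, -1]                            # eigenvector of the largest eigenvalue
u = Phi @ v                              # lift into pixel space

# u should satisfy (Phi Phi^T) u = lambda u; compute it without
# ever building the 200 x 200 matrix.
Cu = Phi @ (Phi.T @ u)
```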

SLIDE 28

Projecting The Training Set into Facespace

The variation of the individual faces can be found by projecting them onto the eigenvector basis V:

\[
\alpha = V^{T} \Phi
\]


SLIDE 33

Identifying Faces

The procedure for identifying a face Γ′ not contained in the training set closely follows the procedure for creating α.

First, subtract Ψ:

\[
\Phi' = \Gamma' - \Psi
\]

Second, project the image into facespace:

\[
\alpha' = V^{T} \Phi'
\]

Third, find the closest point in α to α′ using the standard Euclidean distance formula. The training face whose projection is closest to α′ is recognized as the person.
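The whole identification pipeline, sketched on random stand-in "faces" (the probe image is a noisy copy of training face 3, so the nearest point in facespace should belong to face 3):

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 100, 8                                  # N pixels, M training faces
Gamma = rng.random((N, M))                     # toy stand-ins for face images
Psi = Gamma.mean(axis=1, keepdims=True)
Phi = Gamma - Psi

# Eigenvectors of the covariance matrix via the small-matrix trick.
lams, vs = np.linalg.eigh(Phi.T @ Phi)
keep = lams > 1e-10                            # drop the near-zero eigenvalue
V = Phi @ vs[:, keep]                          # lift into pixel space
V /= np.linalg.norm(V, axis=0)                 # unit-length basis vectors

alpha = V.T @ Phi                              # training set in facespace

# Identify a new image: here, a noisy copy of training face 3.
Gamma_new = Gamma[:, [3]] + 0.01 * rng.normal(size=(N, 1))
Phi_new = Gamma_new - Psi                      # first, subtract Psi
alpha_new = V.T @ Phi_new                      # second, project into facespace
dists = np.linalg.norm(alpha - alpha_new, axis=0)  # third, Euclidean distances
match = int(np.argmin(dists))
```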

SLIDE 34

Facial Reconstruction

Faces can be reconstructed with \((V \times \alpha') + \Psi\).
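A sketch of the reconstruction (with every eigenvector kept, the reconstruction is exact; dropping low-eigenvalue eigenvectors instead gives a compressed approximation):

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 100, 8
Gamma = rng.random((N, M))                     # toy stand-ins for face images
Psi = Gamma.mean(axis=1, keepdims=True)
Phi = Gamma - Psi

lams, vs = np.linalg.eigh(Phi.T @ Phi)
keep = lams > 1e-10
V = Phi @ vs[:, keep]
V /= np.linalg.norm(V, axis=0)

alpha = V.T @ Phi                              # facespace coordinates
reconstructed = V @ alpha + Psi                # (V x alpha) + Psi
```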