
SLIDE 1

busso@utdallas.edu

MSP - CRSS

ASSESSMENT OF DRIVER’S DISTRACTION USING PERCEPTUAL EVALUATIONS, SELF ASSESSMENTS, AND MULTIMODAL FEATURE ANALYSIS

JINESH JAIN, CARLOS BUSSO

September 6th, 2011

SLIDE 2

Long-Term Goal: Monitoring Driver Behavior

[Diagram: Monitoring Driver Behavior; sensors: Microphones, Frontal Camera, CAN-Bus, Road Camera; output: Feedback]

The first step is to define metrics to characterize driver distraction.

SLIDE 3

Definitions

  • Types of Distraction
  • Visual, cognitive, auditory, and physical distractions
  • Report by the Australian Road Safety Board
  • Voluntary or involuntary diversion from the primary driving task
  • Not related to impairment due to alcohol, fatigue, or drugs
  • Occurs while performing a secondary task focused on a different object, event, or person
  • Reduces situational awareness and decision-making abilities

SLIDE 4

Metrics for Distraction

  • Secondary task performance
  • Complete artificial detection tasks (e.g., math problem)
  • Effectiveness (accuracy) and efficiency (required time)
  • Surrogate schemes
  • The lane change test (LCT) [Mattes & Hallén, 2008]
  • Visual occlusion approach [Foley, 2008]
  • Primary task performance metrics
  • Lateral control, longitudinal control, brake response


SLIDE 5

Metrics for Distraction

  • Eye glance behavior
  • Detailed eye-control metrics (e.g., within-fixation metrics, eye closure pattern, eye-off-the-road duration)
  • Coarse visual behavior metrics (e.g., head movement)
  • Subjective assessments [Victor, 2008]
  • Subjective mental workload (NASA-TLX)

Not all of these metrics can be directly used to define labels for training machine learning classifiers.

SLIDE 6

Our Goal

  • To define reference labels for distracted drivers
  • Facilitate the training of classifiers
  • Real driving conditions
  • To explore and compare 3 different approaches:
  • Self evaluations (post-driving questionnaires)
  • Perceptual evaluations (external raters)
  • Multimodal feature analysis (deviation from normal behaviors)

SLIDE 7

UTDrive

  • Front-facing camera
  • PBC-700
  • 320 x 240 at 30 fps
  • 4-channel microphone array
  • 25 kHz
  • CAN-Bus for steering wheel, vehicle speed, brake, gas
  • Road-facing camera
  • 320 x 240 at 15 fps

SLIDE 8

Protocol

  • 2 runs of driving per subject
  • First run – with 7 tasks:
  • Operating a radio
  • Navigation system (GPS): operating and following
  • Cell phone: operating and talking
  • Describing pictures
  • Conversation with a passenger
  • Second run – neutral driving (without tasks)

20 drivers; good daylight and dry weather conditions to reduce environmental factors.

SLIDE 9

Self Assessments

  • Assumption: drivers are aware of the distractions induced by common secondary tasks
  • Questionnaires completed by drivers after the recording
  • They rate how distracted they felt while performing each task
  • 1 – less distracted, 5 – more distracted

Secondary tasks

  • Radio
  • GPS - Operating
  • GPS - Following
  • Phone - Operating
  • Phone - Talking
  • Pictures
  • Conversation
SLIDE 10

Self Assessments

  • More Distracting
  • GPS - Operating
  • Phone - Operating
  • Less Distracting
  • GPS - Following
  • Conversation

Visually intensive tasks are perceived as more distracting.

SLIDE 11

Perceptual Evaluations

  • Procedure:
  • Videos segmented into 5-second clips
  • Subset of clips randomly chosen (480 clips)
  • 3 samples x 8 tasks x 20 drivers = 480
  • Twelve evaluators - UTD students (inter-evaluator agreement ρ = 0.63)
  • Three independent evaluations per clip
  • Advantages
  • Labels assigned to localized segments
  • Clips can be assessed by many raters

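The three independent ratings per clip can be collapsed into one reference label per segment, e.g. by averaging across evaluators. A minimal sketch of that idea (the clip names and the mean-based aggregation are illustrative assumptions, not the authors' exact procedure):

```python
from statistics import mean

def reference_labels(ratings):
    """Collapse independent 1-5 distraction ratings per 5-second clip
    into one reference label by averaging across evaluators."""
    return {clip: mean(scores) for clip, scores in ratings.items()}

# Hypothetical ratings: three evaluators per video clip
ratings = {
    "driver03_gps_operating_clip1": [4, 5, 4],
    "driver03_conversation_clip2": [2, 1, 2],
}
labels = reference_labels(ratings)
print(labels["driver03_gps_operating_clip1"])  # mean of [4, 5, 4]
```

Averaging keeps the label continuous, which suits regression-style classifiers; a majority vote would instead yield discrete labels.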
SLIDE 12

Perceptual Evaluations

  • More Distracting
  • Radio
  • GPS - Operating
  • Phone - Operating
  • Pictures
  • Less Distracting
  • GPS - Following
  • Phone - Talking
  • Conversation

Visually intensive tasks are perceived as more distracting.

SLIDE 13

Multimodal Feature Analysis

  • What features can be used to characterize distractions?
  • Approach:
  • Contrasting features from task and normal conditions (for each route segment)
  • Hypothesis testing (matched pairs)

SLIDE 14

Multimodal Feature Analysis

  • CAN-Bus information
  • Steering wheel angle (jitter), vehicle speed, brake value, gas pedal pressure
  • Front-facing video information:
  • Head pose (yaw and pitch), eye closure
  • Extracted with AFECT

Courtesy: Machine Perception Laboratory, University of California, San Diego
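As a rough illustration of one such feature, steering-wheel jitter can be quantified as the spread of frame-to-frame angle changes over a segment. The definition below (sample standard deviation of first differences) is an assumption for illustration, not necessarily the exact feature computed in the study:

```python
def steering_jitter(angles):
    """Jitter proxy (assumed definition): sample standard deviation of
    frame-to-frame steering-wheel angle differences over a segment."""
    diffs = [b - a for a, b in zip(angles, angles[1:])]
    m = sum(diffs) / len(diffs)
    var = sum((d - m) ** 2 for d in diffs) / (len(diffs) - 1)
    return var ** 0.5

# Smooth steering vs. rapid corrections (hypothetical angle traces, degrees)
print(steering_jitter([0.0, 0.5, 1.0, 1.5, 2.0]))   # constant rate: jitter 0.0
print(steering_jitter([0.0, 2.0, -1.0, 3.0, -2.0]))  # rapid corrections: larger
```

Distracted drivers tend to make abrupt corrective motions, so a larger value of this statistic would flag a less smooth steering pattern.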

SLIDE 15

Multimodal Feature Analysis

  • Matched-pairs hypothesis testing (significance level p = 0.05)
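A matched-pairs test compares each driver's feature value under a task with the same driver's value under neutral driving on the same route segment. A minimal sketch of the idea (the data values, the feature choice, and the use of a plain paired t statistic are illustrative assumptions):

```python
import math

def paired_t_statistic(task_vals, neutral_vals):
    """Matched-pairs t statistic computed from per-driver differences
    between the task condition and neutral driving."""
    diffs = [t - n for t, n in zip(task_vals, neutral_vals)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)
    return mean_d / math.sqrt(var_d / n)

# Hypothetical per-driver mean |head yaw| (degrees): task run vs. neutral run
task    = [12.1, 9.8, 14.3, 11.0, 10.5, 13.2, 9.1, 12.8]
neutral = [ 6.0, 5.5,  7.2,  6.8,  5.9,  6.4, 5.1,  6.6]

t_stat = paired_t_statistic(task, neutral)
# Two-tailed critical value for df = 7 at p = 0.05 is about 2.365
print(abs(t_stat) > 2.365)  # significant difference for this toy data
```

Pairing each driver with themselves removes between-driver variability, so the test isolates the effect of the secondary task.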
SLIDE 16

Multimodal Feature Analysis

  • The mean of head yaw is an important feature
SLIDE 17

Multimodal Feature Analysis

  • Error plot for the mean of head yaw

[Figure: error plot of mean head yaw per task (Radio, GPS Operating, GPS Following, Phone Operating, Phone Talking, Pictures, Conversation, Neutral); vertical axis from −30 to 20; annotated "More distracted"]

SLIDE 18

Multimodal Feature Analysis

  • Some tasks produce higher deviation in the features from normal conditions

SLIDE 19

Multimodal Feature Analysis

  • Other tasks produce small or no deviation in the features from normal conditions

SLIDE 20

Conclusions

  • Three methodologies to describe drivers’ distraction
  • Self evaluations
  • Perceptual evaluations
  • Multimodal feature analysis
  • Consistent results are observed across approaches
  • Visual distractions are better described than cognitive distractions (e.g., Phone - Talking [Strayer et al., 2004])
  • Current work: we are conducting subjective evaluations with mental workload scales

SLIDE 21

Discussion & Questions


THANK YOU!