Uncertainty-Centric Safety Assurance of ML-Based Perception for Automated Driving (PowerPoint PPT Presentation)


SLIDE 1

Uncertainty-Centric Safety Assurance of ML-Based Perception for Automated Driving

Krzysztof Czarnecki Waterloo Intelligent Systems Engineering (WISE) Lab University of Waterloo

SLIDE 2

Uncertainty-Centric Assurance of ML-Based Perception

Perceptual Uncertainty
Misclassifications, under-classifications, quantitative errors

Uncertainty influence factors (domain coverage, sensor noise, etc.)

Uncertainty Management: Perceptual Uncertainty-Aware Responsibility-Sensitive Safety (PURSS)

Safety requirements on perception

SLIDE 3

Uncertainty-Centric Assurance of ML-Based Perception

R. Salay, K. Czarnecki, M. Elli, I. Alvarez, S. Sedwards, J. Weast. PURSS: Towards a Perceptual Uncertainty-Aware Responsibility Sensitive Safety. Under submission.

SLIDE 4

Responsibility-Sensitive Safety (RSS)

  • Defines responsible behavior to address behavioral uncertainty
    – Safe actions when safe, and a proper response when not safe
  • Guarantees no collision when everyone follows the rules

SLIDE 5

Responsibility-Sensitive Safety (RSS)

RULE 1. Do not hit the car in front (longitudinal distance)
RULE 2. Do not cut in recklessly (lateral distance)
RULE 3. Right of way is given, not taken
RULE 4. Be cautious in areas with limited visibility
RULE 5. If you can avoid a crash without causing another one, you must

https://arxiv.org/abs/1708.06374

SLIDE 6

RULE 1. Safe Following Distance in RSS


Components of the safe following distance: the distance traveled during the reaction time, the ego braking distance, and the distance traveled by the front vehicle while braking.
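These components combine into the RSS minimum safe longitudinal distance from the cited paper (arXiv:1708.06374). A minimal Python sketch; the parameter names are illustrative, not taken from the deck:

```python
def rss_safe_distance(v_rear, v_front, rho, a_max_accel, a_min_brake, a_max_brake):
    """Minimum safe longitudinal distance per RSS.

    v_rear, v_front:  speeds of rear (ego) and front vehicle [m/s]
    rho:              rear vehicle's reaction time [s]
    a_max_accel:      worst-case acceleration of the rear vehicle during rho [m/s^2]
    a_min_brake:      guaranteed minimum braking of the rear vehicle [m/s^2]
    a_max_brake:      worst-case (hardest) braking of the front vehicle [m/s^2]
    """
    # Distance the rear vehicle covers while reacting (possibly still accelerating)
    reaction = v_rear * rho + 0.5 * a_max_accel * rho ** 2
    # Braking distance of the rear vehicle from its post-reaction speed
    v_after = v_rear + rho * a_max_accel
    brake_rear = v_after ** 2 / (2 * a_min_brake)
    # Distance the front vehicle still travels under maximal braking
    brake_front = v_front ** 2 / (2 * a_max_brake)
    # The safe gap is never negative
    return max(0.0, reaction + brake_rear - brake_front)
```

For example, with both vehicles at 20 m/s, rho = 1 s, a_max_accel = 2, a_min_brake = 4, and a_max_brake = 8 m/s², the required gap is 56.5 m.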

SLIDE 7

RULE 1. Safe Following Distance in RSS


Problem: Assumes perfect perception

SLIDE 8

Perception Triangle

True state (unknowable), real-world situation: pedestrian, speed = 0.1, activity = walking
Perceived state: pedestrian, speed = 0, activity = standing
Perception accuracy: how closely the perceived state matches the true state

SLIDE 9

Safety Argument Decomposition

ADS pipeline: Sensing → Perception → World model → Planning & control → Actuation

SLIDE 10

RSS as a Constraint on ADS

[Diagram: RSS imposed as a constraint on the ADS pipeline (Sensing, Perception, World model, Planning & control, Actuation), restricting the world model content and the actions planning & control may select.]


SLIDE 12

Sample RSS-Compliant World Model Schema


World model elements: safe following distance; safe action set Safe(s)

SLIDE 13

Perception Cases (s → s’)

Misperception: s → s' where s ≠ s'
  True state (unknowable): pedestrian, speed = 0.1, activity = walking
  Perceived state: pedestrian, speed = 0, activity = standing

Correct perception: s → s' where s = s'
  True state (unknowable): pedestrian, speed = 0, activity = standing
  Perceived state: pedestrian, speed = 0, activity = standing

SLIDE 14

Safety of Perception

Misperception s → s' potentially causes safety risk iff Safe(s') ⊄ Safe(s): some action considered safe in the perceived state s' is not safe in the true state s.

SLIDE 15

Safety-Irrelevant Misperceptions


Misperception s → s’ where Safe(s) = Safe(s’)
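The safe-action-set criterion can be made concrete with plain Python sets; the states and action sets below are hypothetical toys, not from the deck:

```python
# Toy safe-action sets Safe(s) for a few hypothetical world-model states.
SAFE = {
    "ped_standing": {"continue", "follow", "stop"},
    "ped_walking":  {"follow", "stop"},
    "road_clear":   {"continue", "follow", "stop"},
}

def safety_irrelevant(s_true, s_perceived):
    """Misperception s -> s' is safety-irrelevant iff Safe(s) = Safe(s')."""
    return SAFE[s_true] == SAFE[s_perceived]

def potentially_risky(s_true, s_perceived):
    """Risk can arise when some action deemed safe under the perceived state
    is not safe under the true state, i.e. Safe(s') is not a subset of Safe(s)."""
    return not SAFE[s_perceived] <= SAFE[s_true]
```

Misperceiving a walking pedestrian as standing is potentially risky here ("continue" looks safe but is not), while misperceiving a standing pedestrian as clear road is safety-irrelevant in this toy model because the safe-action sets coincide.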

SLIDE 16

Precise World Model

True state (unknowable), real-world situation: pedestrian, speed = 0.1, activity = walking
Perceived state (precise world model): pedestrian, speed = 0, activity = standing
Accuracy: how closely the precise perceived state matches the true state

SLIDE 17

Perceptual Uncertainty Handling via Imprecise World Models

True state (unknowable), real-world situation: pedestrian, speed = 0.1, activity = walking
Imprecise world model: a set of credible states at confidence level α, e.g. {pedestrian, speed = 0, standing; pedestrian, speed = 0.1, walking; …}

SLIDE 18

Perceptual Uncertainty Aware RSS (PURSS)

ADS with imperfect perception: imprecise perception produces an imprecise world model; planning & control selects among the resulting safe actions; the RSS rules are lifted to the imprecise world model.

SLIDE 19

Lifting World Model Schema to Imprecise World Model Schema

Elementwise lifting:

  • Class entity to superclass
  • Continuous value to interval
  • Discrete value to enumerated set
  • Derived attributes via set operations and interval arithmetic
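The elementwise lifting can be sketched with Python dataclasses; the schema fields and the derived-attribute helper below are illustrative, not the deck's actual schema:

```python
from dataclasses import dataclass

# Precise world-model schema: exact class and exact values.
@dataclass
class Agent:
    cls: str          # e.g. "pedestrian"
    speed: float      # m/s

# Lifted (imprecise) schema:
#  - class entity lifted to an enumerated set of credible classes
#  - continuous value lifted to an interval
@dataclass
class LiftedAgent:
    cls: frozenset    # enumerated set of credible classes
    speed: tuple      # (low, high) interval

    def covers(self, a) -> bool:
        """A lifted state covers a precise state when every lifted
        element contains the corresponding precise element."""
        lo, hi = self.speed
        return a.cls in self.cls and lo <= a.speed <= hi

def lifted_gap_time(distance_interval, speed_interval):
    """Derived attribute via interval arithmetic: the time to close a gap,
    distance / speed, computed elementwise on the interval bounds."""
    (d_lo, d_hi), (v_lo, v_hi) = distance_interval, speed_interval
    return (d_lo / v_hi, d_hi / v_lo)   # assumes v_lo > 0
```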

SLIDE 20

Using Imprecise World Models to Mitigate Misperception

A safe action in an imprecise model must be safe for every precise model covered by the imprecise model. Given an under-perception case, where S is an imprecise model of confidence α perceived when the correct model is some state covered by S, the actions safe at confidence α are those in Safe(s') for every s' in S.
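Under this rule, the safe-action set of an imprecise model is an intersection over its credible states. A sketch with hypothetical state names and safe-action sets:

```python
from functools import reduce

# Hypothetical safe-action sets Safe(s) per precise state.
SAFE = {
    "fv_close": {"stop", "follow"},
    "fv_far":   {"continue", "stop", "follow"},
    "slof":     {"stop"},
}

def safe_actions(imprecise_model):
    """Safe(S) = intersection of Safe(s') over all credible states s' in S.
    Any action in this set is safe no matter which covered state is the
    true one (up to the confidence level of S)."""
    return reduce(set.intersection, (set(SAFE[s]) for s in imprecise_model))
```

Note the conservatism trade-off this makes explicit: adding more credible states to S can only shrink the safe-action set, e.g. adding a static lane obstruction hypothesis leaves only "stop".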

SLIDE 21

Different Risk Levels

[Diagram: objects annotated with required confidence levels: α = 10⁻⁴, α = 10⁻⁴, α = 10⁻⁹, α = 10⁻⁹.]

SLIDE 22

Imprecise Classification when High Integrity Required

[Diagram: confidence annotations α = 10⁻⁴, α = 10⁻⁴, ?, α = 10⁻⁹; the classification requiring high integrity is uncertain.]

SLIDE 23

Conservative Action for High Integrity

[Diagram: confidence annotations α = 10⁻⁴, α = 10⁻⁴, ?, α = 10⁻⁹; a conservative action is taken for the uncertain high-integrity classification.]

SLIDE 24

Example of Mitigation

Class hierarchy: Any → No Lane Obstruction in Front (NLOF) | Lane Obstruction in Front (LOF); LOF → Static LOF (SLOF) | Front Vehicle (FV)

Actions: continue, stop, or follow

SLIDE 25

Safety Requirements on Perception Performance from PURSS

Class hierarchy: Any → NLOF | LOF; LOF → Static LOF (SLOF) | Front Vehicle (FV)

  • Correct LOF/NLOF classification and distance ±5 cm at α_LOF = 10⁻⁹ for 100% of time duration within ODD conditions
  • Correct FV/SLOF classification and distance ±25 cm and velocity ±0.5 m/s at α_FV = 10⁻⁴ for 90% of time duration within ODD conditions
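A requirement of this "within tolerance for a fraction of the time" shape can be checked offline against logged frames. A sketch with a hypothetical log format (the `cls_ok`/`dist_err` field names are assumptions, not from the deck):

```python
def meets_requirement(frames, dist_tol, min_fraction):
    """Check that classification is correct and the distance error is
    within tolerance for at least the required fraction of frames.

    frames: list of dicts with 'cls_ok' (bool) and 'dist_err'
            (signed distance error in metres); hypothetical log format.
    dist_tol: distance tolerance in metres (e.g. 0.25 for ±25 cm).
    min_fraction: required fraction of compliant frames (e.g. 0.90).
    """
    ok = sum(1 for f in frames
             if f["cls_ok"] and abs(f["dist_err"]) <= dist_tol)
    return ok / len(frames) >= min_fraction

# Note: the confidence level alpha is a property of the imprecise
# perception output itself and is not checked by this frame counter.
```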

SLIDE 26

Uncertainty-Centric Assurance of ML-Based Perception

K. Czarnecki and R. Salay. Towards a Framework to Manage Perceptual Uncertainty for Safe Automated Driving. WAISE'18.

SLIDE 27

Guide to the Expression of Uncertainty in Measurement (GUM)

  • True accuracy is unknowable
    – Accuracy in ML is measured with respect to a test set only
  • Uncertainty must therefore be estimated

SLIDE 28

Perception Triangle (Instance-Level)

[Diagram: instance-level perception triangle. The real-world situation (true state, unknowable: pedestrian, speed = 0.1, activity = walking) is observed through a sensory channel (camera image, radar data); the perception algorithm outputs a set of credible states (uncertain), e.g. pedestrian, speed = 0, activity = standing; accuracy relates this output to the true state.]

SLIDE 29

Perceptual Triangle

[Diagram: the perception triangle at two levels. Domain-level (generic): concept semantics link real-world situations, the sensory channel with its sensory data, and data interpretation (perception). Instance-level: as on the previous slide, from true state (unknowable) through camera image and radar data to a set of credible states (uncertain).]

SLIDE 30

Perceptual Triangle When Using Supervised ML

Development:
  Concept → development situations and scenarios → sensory channel → sensory data
  Data labeling → partial semantics (examples)
  Model class selection, training & testing → trained model

Operation (inference):
  Concept → operational situations and scenarios → sensory channel → sensory data
  Trained model → inferred state (resulting perception)

SLIDE 31

Factors Influencing Uncertainty (F1-7)

[Diagram: the development and operation perception triangles from the previous slide, annotated with influence factors F1-F7 on their elements (F4 on the sensory channels, F5 and F6 on the trained model and on model class selection, training & testing), with F7 marking the domain shift between development and operation.]

  • K. Czarnecki and R. Salay. Towards a Framework to Manage Perceptual Uncertainty for Safe Automated Driving. WAISE'18


SLIDE 33

F3: Scene Uncertainty


SLIDE 34

F3: Scene Uncertainty

  • Surrogate measures: range, scale, occlusion level, atmospheric visibility, illumination, clutter and crowding level
  • These measures are also part of development data set coverage
  • To determine sufficient coverage, compare these measures with:
    1. Test set accuracy
    2. Uncertainty estimated by the network

SLIDE 35

Synthetic Dataset to Study Scene Influence Factors


Samin Khan, Buu Phan, Rick Salay, and Krzysztof Czarnecki. ProcSy: Procedural Synthetic Dataset Generation Towards Influence Factor Studies Of Semantic Segmentation Networks. Workshop on Vision for All Seasons: Bad Weather and Nighttime, associated with CVPR, Long Beach, 2019

SLIDE 36

Scene Influence Factors -> Accuracy


SLIDE 39

Aleatoric and Epistemic Uncertainty

Predictive Entropy (PE) = H(E[p])
Aleatoric Entropy (AE) = E[H(p)]
Mutual Information (MI) = PE - AE (epistemic uncertainty)

Smith L, Gal Y. Understanding measures of uncertainty for adversarial example detection. arXiv preprint arXiv:1803.08533
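These three measures can be computed from Monte Carlo samples of the predictive distribution (e.g. MC-dropout forward passes); a NumPy sketch:

```python
import numpy as np

def uncertainty_measures(probs, eps=1e-12):
    """probs: array of shape (n_samples, n_classes), each row a softmax
    output from one stochastic forward pass (e.g. MC dropout).

    Returns (PE, AE, MI) in nats:
      PE = H(E[p])   predictive entropy (total uncertainty)
      AE = E[H(p)]   aleatoric entropy  (expected data uncertainty)
      MI = PE - AE   mutual information (epistemic uncertainty)
    """
    probs = np.asarray(probs, dtype=float)
    mean_p = probs.mean(axis=0)                              # E[p]
    pe = -np.sum(mean_p * np.log(mean_p + eps))              # H(E[p])
    ae = -np.mean(np.sum(probs * np.log(probs + eps), axis=1))  # E[H(p)]
    return pe, ae, pe - ae
```

When all samples agree on a confident prediction, all three measures are near zero; when the samples disagree sharply (e.g. [1, 0] vs. [0, 1]), PE ≈ ln 2 while AE ≈ 0, so the uncertainty is almost entirely epistemic.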

SLIDE 40

Scene Influence Factors -> Uncertainty Estimates


SLIDE 41

Occlusion and Depth -> Uncertainty Estimates


Buu Phan, Samin Khan, and Rick Salay, and Krzysztof Czarnecki. Bayesian Uncertainty Quantification with Synthetic Data. In Proceedings of International Workshop on Artificial Intelligence Safety Engineering (WAISE), SAFECOMP, Turku, Finland, 2019

SLIDE 42

Occlusion and Depth -> Uncertainty Estimates

Panels by training-set size: |D| = 500, |D| = 3000, |D| = 8000, |D| = 13100

SLIDE 43

Rain, Clouds, Puddles -> Uncertainty Estimates


SLIDE 44

Coming Soon: Canadian Adverse Driving Conditions Dataset


SLIDE 45

Summary: Uncertainty-Centric Assurance of ML-Based Perception

Perceptual Uncertainty
Misclassifications, under-classifications, quantitative errors

Uncertainty influence factors (domain coverage, sensor noise, etc.)

Uncertainty Management: Perceptual Uncertainty-Aware Responsibility-Sensitive Safety (PURSS)

Safety requirements on perception

SLIDE 46

Insights and Challenges

  • ML currently cannot be assured to the certainty levels required for collision avoidance
    – ML is useful for longer-term, anticipatory risk reduction
  • Perceptual uncertainty must be considered for the complete, fused perception and over time
    – E.g., different information becomes certain with different delays
  • Out-of-distribution detection is still far from being useful in practice
  • RSS leads to more conservative automated driving than human driving
    – E.g., negotiation in merging