Nonparametric Filter. Quan Nguyen. November 16, 2015.



SLIDE 1

Nonparametric Filter

Quan Nguyen, November 16, 2015

SLIDE 2

Outline

  • 1. Hidden Markov Model
  • 2. State estimation
  • 3. Bayes filters
  • 4. Histogram filter
  • 5. Binary filter with static state
  • 6. Particle filter
  • 7. Summary
  • 8. References

SLIDE 3

1. Hidden Markov Model

Bayesian Network

  • Graphical model of conditional probabilistic relation
  • Directed acyclic graph (DAG)

G = (V, E)
V: set of random variables
E: set of conditional dependencies

http://www.intechopen.com/books/current-topics-in-public-health/from-creativity-to-artificial-neural- networks-problem-solving-methodologies-in-hospitals

SLIDE 4

1. Hidden Markov Model

Hidden Markov Model

  • Particular kind of Bayesian Network
  • Modelling time series data

http://sites.stat.psu.edu/~jiali/hmm.html

SLIDE 5

1. Hidden Markov Model

Hidden Markov Model


https://en.wikipedia.org/wiki/Viterbi_algorithm#Example

SLIDE 6

1. Hidden Markov Model

Hidden Markov Model

Observing a patient for 3 days:
  • Day 1: Cold
  • Day 2: Normal
  • Day 3: Dizzy

Questions:
1) What is the most likely sequence of health conditions of the patient over the last 3 days?
2) What is the most likely health condition of the patient on the 4th day?

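Question 1 is the classic decoding problem, solved by the Viterbi algorithm (the Wikipedia example linked above). A minimal sketch in Python; the two hidden states and all probability values are illustrative assumptions, not numbers from the slides:

```python
# Viterbi decoding for the patient example above. The hidden states
# (Healthy/Fever) and all probabilities below are assumed for illustration.
states = ["Healthy", "Fever"]
start_p = {"Healthy": 0.6, "Fever": 0.4}
trans_p = {"Healthy": {"Healthy": 0.7, "Fever": 0.3},
           "Fever":   {"Healthy": 0.4, "Fever": 0.6}}
emit_p = {"Healthy": {"Normal": 0.5, "Cold": 0.4, "Dizzy": 0.1},
          "Fever":   {"Normal": 0.1, "Cold": 0.3, "Dizzy": 0.6}}

def viterbi(obs):
    # V[t][s] = (probability of the best path ending in state s at time t,
    #            predecessor state on that path)
    V = [{s: (start_p[s] * emit_p[s][obs[0]], None) for s in states}]
    for z in obs[1:]:
        V.append({s: max(((V[-1][r][0] * trans_p[r][s] * emit_p[s][z], r)
                          for r in states), key=lambda v: v[0])
                  for s in states})
    # backtrack from the most likely final state
    path = [max(V[-1], key=lambda s: V[-1][s][0])]
    for t in range(len(V) - 1, 0, -1):
        path.append(V[t][path[-1]][1])
    return path[::-1]

print(viterbi(["Cold", "Normal", "Dizzy"]))  # ['Healthy', 'Healthy', 'Fever']
```

Question 2 (the 4th day) is a prediction problem: propagate the belief over the final state one step through the transition model.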

SLIDE 7

2. State estimation

State space

  • Quantities that cannot be directly observed but can be inferred from sensor data

  • Examples: position and direction of robot in a room
  • Notation:

X = x_1, x_2, …, x_t
P(X = x_t): probability that the state equals x at time t

SLIDE 8

2. State estimation

Measurement (Observation)

  • Environment data provided by robot sensor
  • Examples: distance to ground, camera images
  • Notation:

Z = z_1, z_2, …, z_t
P(Z = z_t): probability that the measurement equals z at time t

SLIDE 9

2. State estimation

Control data

  • Information about the change of state in the environment
  • Examples: velocity of robot, temperature of a room, an action of robot on environment objects

  • Notation:

U = u_1, u_2, …, u_t
P(U = u_t): probability that the control equals u at time t

SLIDE 10

2. State estimation

Probabilistic Generative Laws

  • State can be constructed from all past states, measurements, and controls:

P(X = x_t) = P(X = x_t | x_{0:t-1}, z_{0:t-1}, u_{0:t-1})

  • Markov assumption:

P(X = x_t) = P(X = x_t | x_{t-1}, u_t)
P(Z = z_t) = P(Z = z_t | x_t)

SLIDE 11

2. State estimation

Belief distribution

  • Belief:
  • Internal knowledge of the robot about the true state
  • Assigns a probability to each possible true state
  • Notation:

bel(x_t) = p(x_t | z_{1:t}, u_{1:t})

  • Prediction:

bel̄(x_t) = p(x_t | z_{1:t-1}, u_{1:t})

  • Correction:

bel(x_t) = normalizer * p(z_t | x_t) * bel̄(x_t)

SLIDE 12

3. Bayes filter

Bayes Filter algorithm (continuous case)

1: Func_continuous_Bayes_filter(bel(x_{t-1}), u_t, z_t)
2: for all x_t do
3:   bel̄(x_t) = ∫ p(x_t | u_t, x_{t-1}) bel(x_{t-1}) dx_{t-1}
4:   bel(x_t) = normalizer * p(z_t | x_t) * bel̄(x_t)
5: end for
6: return bel(x_t)

SLIDE 13

3. Bayes filter

Bayes Filters algorithm (discrete case)

1: Func_discrete_Bayes_filter({p_{i,t-1}}, u_t, z_t)
2: for all k do
3:   p̄_{k,t} = Σ_i p(x_t | u_t, X_{t-1} = x_i) p_{i,t-1}
4:   p_{k,t} = normalizer * p(z_t | x_t) * p̄_{k,t}
5: end for
6: return {p_{k,t}}

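The discrete algorithm above maps directly to code. A minimal sketch; the 3-state transition model and measurement likelihood are assumed toy numbers:

```python
def discrete_bayes_filter(p_prev, trans, likelihood):
    """One step of the discrete Bayes filter above.

    p_prev:     p_{i,t-1}, prior belief over K discrete states
    trans:      trans[k][i] = p(X_t = x_k | u_t, X_{t-1} = x_i)
    likelihood: likelihood[k] = p(z_t | X_t = x_k)
    """
    K = len(p_prev)
    # line 3: prediction, summing over all predecessor states
    p_bar = [sum(trans[k][i] * p_prev[i] for i in range(K)) for k in range(K)]
    # line 4: measurement update, then divide by the normalizer
    p = [likelihood[k] * p_bar[k] for k in range(K)]
    eta = sum(p)
    return [pk / eta for pk in p]

# Toy example (assumed numbers): a robot on 3 cells that tends to move right;
# the sensor reading suggests it is most likely in the middle cell.
p_prev = [1.0, 0.0, 0.0]
trans = [[0.2, 0.0, 0.0],
         [0.8, 0.2, 0.0],
         [0.0, 0.8, 1.0]]
likelihood = [0.1, 0.8, 0.1]
print(discrete_bayes_filter(p_prev, trans, likelihood))
```

The motion model shifts the belief to the right before the likelihood reweights it, which is exactly the prediction/correction split of slide 11.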

SLIDE 14

4. Histogram filter

Histogram Filter

  • Discrete Bayes filter estimation for continuous state spaces
  • State space decomposition:
  • Range(X_t) = x_{1,t} ∪ x_{2,t} ∪ … ∪ x_{K,t}
  • For every i ≠ k: x_{i,t} ∩ x_{k,t} = ∅
  • In each region the posterior is a piecewise constant density:
  • For every state x_t belonging to the k-th region:

p(x_t) = p_{k,t} / |x_{k,t}|

SLIDE 15

4. Histogram filter

Histogram filter

  • Problem: prior information is defined for individual states, not for regions!
  • Refer to lines 3 and 4 of the discrete Bayes filter algorithm
  • Solution: approximate the density of a region by a representative (mean) state of that region:

x̂_{k,t} = (1 / |x_{k,t}|) ∫_{x_{k,t}} x_t dx_t

SLIDE 16

4. Histogram filter

Histogram filter

  • Approximation of density values for regions:

p(z_t | x_{k,t}) ≈ p(z_t | x̂_{k,t})
p(x_{k,t} | u_t, x_{i,t-1}) ≈ normalizer * p(x̂_{k,t} | u_t, x̂_{i,t-1})

  • Precondition: all regions must have the same size.
  • Now the discrete Bayes filter algorithm is applicable!

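To make the region approximation concrete, a small sketch: the interval [0, 10) is split into K equal-width regions (satisfying the equal-size precondition above), the representative state is each region's mean, and the measurement density is evaluated only at those representatives. All numbers here are illustrative assumptions:

```python
import math

# Histogram filter measurement update over the interval [0, 10).
K, width = 10, 1.0
x_hat = [width * (k + 0.5) for k in range(K)]   # representative (mean) states

def gauss(x, mu, sigma):
    # Gaussian density, used as an assumed measurement model p(z | x)
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

z, meas_std = 3.2, 1.0
# p(z | x_{k,t}) is approximated by p(z | x̂_{k,t}), as on the slide above
likelihood = [gauss(x, z, meas_std) for x in x_hat]

p = [1.0 / K] * K                               # uniform prior p_{k,t-1}
p = [l * pk for l, pk in zip(likelihood, p)]    # measurement update
eta = sum(p)                                    # normalizer
p = [pk / eta for pk in p]
print([round(pk, 3) for pk in p])
```

The posterior is piecewise constant: one probability value per region, peaking in the region whose representative state is closest to the measurement.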

SLIDE 17

5. Binary filter with static state

Binary Bayes filter with Static State

  • Belief is a function of the measurements:

bel_t(x) = p(x | z_{1:t}, u_{1:t}) = p(x | z_{1:t})

  • General algorithm:

1: Func_binary_Bayes_filter(l_{t-1}, z_t)
2: l_t = l_{t-1} + log [ p(x | z_t) / (1 - p(x | z_t)) ] - log [ p(x) / (1 - p(x)) ]
3: return l_t

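A minimal sketch of the log-odds update above; the inverse measurement model value p(x | z_t) = 0.7 and the uniform prior p(x) = 0.5 are assumptions chosen for illustration:

```python
import math

def binary_bayes_filter(l_prev, p_x_given_z, p_x_prior=0.5):
    """One log-odds update of the static-state binary Bayes filter above.
    p_x_given_z is the inverse measurement model p(x | z_t)."""
    l = l_prev + math.log(p_x_given_z / (1 - p_x_given_z))
    l -= math.log(p_x_prior / (1 - p_x_prior))
    return l

def to_prob(l):
    # recover p(x | z_{1:t}) from the log odds ratio
    return 1 - 1 / (1 + math.exp(l))

# Toy door example (assumed numbers): three sensor readings, each
# suggesting "door open" with p(x | z) = 0.7.
l = 0.0                       # log odds of the prior p(x) = 0.5
for _ in range(3):
    l = binary_bayes_filter(l, 0.7)
print(to_prob(l))             # belief grows toward 1 with each reading
```

Because the update is a sum of log odds, repeated weak evidence accumulates without the numerical truncation problems that multiplying probabilities near 0 or 1 would cause.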

SLIDE 18

5. Binary filter with static state

  • Log odds ratio:

l(x) = log [ p(x) / (1 - p(x)) ]

  • Avoids truncation problems when probabilities are close to 0 or 1
  • Inverse measurement model:
  • Reduces complexity by using the probability of the state given the measurement data
  • Example: inferring the state of a door from an image is much easier than generating an image given a closed/open door

SLIDE 19

5. Binary filter with static state

Example of Binary filter: Occupancy grid mapping

  • Estimate (generate) a map from (noisy) sensor measurement data and robot positions

  • General algorithm:

p(Map = m | z_{1:t}, x_{1:t}) = Π_c p(Cell c is occupied | z_{1:t}, x_{1:t})

Estimating p(Cell c is occupied | z_{1:t}, x_{1:t}) is a binary estimation problem.

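Treating each grid cell as an independent binary Bayes filter, the mapping update can be sketched as follows; the per-cell inverse sensor model values are assumed numbers, not from the slides:

```python
import math

# Occupancy grid as independent per-cell binary filters in log-odds form.
def logit(p): return math.log(p / (1 - p))
def prob(l):  return 1 - 1 / (1 + math.exp(l))

grid_l = [0.0] * 5                  # log odds per cell, prior p = 0.5
readings = [
    [0.3, 0.3, 0.7, 0.9, 0.5],     # p(cell occupied | z_1), per cell (assumed)
    [0.2, 0.4, 0.6, 0.8, 0.5],     # p(cell occupied | z_2), per cell (assumed)
]
for z in readings:
    # binary Bayes filter update for every cell; the prior term vanishes
    # here only because the prior is uniform (logit(0.5) = 0)
    grid_l = [l + logit(p) - logit(0.5) for l, p in zip(grid_l, z)]

print([round(prob(l), 2) for l in grid_l])  # [0.1, 0.22, 0.78, 0.97, 0.5]
```

Cells repeatedly reported as free converge toward 0, cells reported as occupied toward 1, and a cell the sensor says nothing about (p = 0.5) stays at its prior.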

SLIDE 20

6. Particle filter

Particle filter algorithm

  • Represent the posterior density by a set of weighted random particles
  • General algorithm:

1: Func_Particle_filter(X_{t-1}, u_t, z_t)
2: X̄_t = X_t = ∅
3: for i = 1 to N do
4:   sample x_t^i ~ p(x_t | u_t, x_{t-1}^i)
5:   w_t^i = p(z_t | x_t^i)
6:   X̄_t = X̄_t + ⟨x_t^i, w_t^i⟩
7: end for
8: for i = 1 to N do
9:   draw index i with probability ∝ w_t^i
10:  add x_t^i to X_t
11: end for
12: return X_t

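A minimal 1-D sketch of the algorithm above; the Gaussian motion and measurement models and all numbers are illustrative assumptions:

```python
import random, math

def particle_filter(particles, u, z, motion_std=0.5, meas_std=1.0):
    """One iteration of the particle filter above for a 1-D robot.
    Assumed models: motion x_t = x_{t-1} + u + noise, measurement z = x + noise."""
    # lines 3-4: sample x_t^i ~ p(x_t | u_t, x_{t-1}^i)
    proposed = [x + u + random.gauss(0, motion_std) for x in particles]
    # line 5: w_t^i = p(z_t | x_t^i), unnormalized Gaussian likelihood
    weights = [math.exp(-0.5 * ((z - x) / meas_std) ** 2) for x in proposed]
    # lines 8-11: resample with probability proportional to weight
    return random.choices(proposed, weights=weights, k=len(particles))

random.seed(0)
particles = [random.uniform(0, 10) for _ in range(500)]  # uninformed prior
for step in range(5):
    # robot moves +1 per step; measurements track a trajectory ending near 7
    particles = particle_filter(particles, u=1.0, z=3.0 + step)
print(sum(particles) / len(particles))  # mean estimate, near the true position (about 7)
```

Note that resampling draws with replacement, which is exactly why duplicate particles appear and why deprivation can occur, as the following slides discuss.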

SLIDE 21

6. Particle filter

Particle filter algorithm

http://www.juergenwiki.de/work/wiki/doku.php?id=public:particle_filter

SLIDE 22

6. Particle filter

Properties of Particle filter algorithm

  • Degrees of freedom:
  • Because of normalization, one degree of freedom is lost: deg = N - 1
  • Identical particles after the resampling phase:
  • Resampling with probability proportional to weight draws with replacement, so after every iteration one or more state samples may fail to be drawn and the surviving particles are duplicated

SLIDE 23

6. Particle filter

Properties of Particle filter algorithm

  • Deterministic sensor:
  • With a noise-free sensor, the measurement likelihood is zero for most states ⇒ all weights become zero
  • Particle deprivation problem:
  • Resampling can wipe out all particles near the true state ⇒ incorrect states get larger weight

SLIDE 24

6. Particle filter

Application of Particle filter

  • Tracking the state of a dynamic system modeled by a Bayesian network: robot localization, SLAM, robot fault diagnosis
  • Image segmentation: generate a large number of particles and gradually focus on particles with the desired properties ⇒ image processing, medical image analysis

SLIDE 25

7. Summary

Summary

  • Nonparametric filters represent the posterior state as a function of the previous posterior state
  • Nonparametric filters do not rely on a fixed functional form of the posterior
  • Histogram filters and particle filters represent the state space and posterior as a finite set of data
  • There is usually a trade-off between efficiency and the level of detail of the data

SLIDE 26

8. References

References

  • Sebastian Thrun, Wolfram Burgard, and Dieter Fox. Probabilistic Robotics (Intelligent Robotics and Autonomous Agents). The MIT Press, 2005.
  • Kaijen Hsiao, Henry de Plinval-Salgues, and Jason Miller. Particle Filters and Their Applications. Cognitive Robotics, April 11, 2005.
  • P. S. Dwyer. Some Applications of Matrix Derivatives in Multivariate Analysis. Journal of the American Statistical Association, 62(318):607-625, 1967. http://doi.org/10.2307/2283988
  • Pietro Manzi, Paolo Barini. Current Topics in Public Health. 2013-05-15.