Dense Random Fields
Philipp Krähenbühl, Stanford University
Zoo of computer vision problems

[Figure: example tasks and outputs — object labels (bottle, tiger, kitten, car), materials (wood, skin, paper, pumpkin, cloth), activity recognition ("playing tennis"), and captioning ("Emma in her hat looking super cute")]
Filtering

[Figure: a 4×6 grid of nodes v1 … v24; the filtered value ṽ_i aggregates over a large neighborhood]

Pros:
• Propagates information over large distances
  • up to 1/3 of the image

Cons:
• No probabilistic interpretation
• No joint inference
• No learning
Dense Random Fields

E(X) = \sum_i \psi_i(X_i) + \sum_{i,j \in \mathcal{N}} \psi_{ij}(X_i, X_j)
         (unary term)       (pairwise term)

• Every node is connected to every other node
• Connections weighted differently
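As a toy illustration of this energy (the array names and sizes are mine, not from the talk), the fully connected model can be evaluated directly for a handful of variables:

```python
import numpy as np

# Toy fully connected CRF energy. Every variable i has a unary cost
# unary[i, x_i]; every unordered pair (i, j) has a pairwise cost
# pairwise[i, j, x_i, x_j]. All values here are illustrative.

def energy(labels, unary, pairwise):
    """labels: (n,) ints; unary: (n, L); pairwise: (n, n, L, L)."""
    n = len(labels)
    e = sum(unary[i, labels[i]] for i in range(n))
    for i in range(n):
        for j in range(i + 1, n):  # each unordered pair exactly once
            e += pairwise[i, j, labels[i], labels[j]]
    return e

rng = np.random.default_rng(0)
n, L = 5, 3
unary = rng.random((n, L))
pairwise = rng.random((n, n, L, L))
labels = rng.integers(0, L, size=n)
print(energy(labels, unary, pairwise))
```

Even this tiny sketch makes the cost visible: the double loop over pairs is quadratic in the number of variables.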
Dense Random Fields

Pros:
• Long-range interactions
• No shrinking bias
• Probabilistic interpretation
• Parameter learning
• Combines with other models
Dense Random Fields

Cons:
• Very large model
  • 50,000–100,000 variables
  • billions of pairwise terms
• Traditional inference is very slow
  • MCMC "converges" in 36 h
  • GraphCuts and alpha-expansion: no convergence in 3 days
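The quadratic blow-up behind the "billions of pairwise terms" bullet is easy to check:

```python
# A fully connected model over n variables has n*(n-1)/2 unordered pairs.
for n in (50_000, 100_000):
    pairs = n * (n - 1) // 2
    print(f"{n:>7} variables -> {pairs:,} pairwise terms")
# 100,000 variables already give ~5 billion pairwise terms.
```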
Dense Random Fields

• Efficient inference
  • 0.2 s / image
• Pairwise term
  • linear combination of Gaussians
Dense Random Fields

E(X) = \sum_i \psi_i(X_i) + \sum_{i>j} \psi_{ij}(X_i, X_j)

\psi_{ij}(X_i, X_j) = \sum_m k^{(m)}(f_i, f_j) \, \mu^{(m)}(X_i, X_j)

• Gaussian kernel k^(m)
• Label compatibility μ^(m), e.g.:

  μ      GRASS  SHEEP  WATER  …
  GRASS    0      1      1    …
  SHEEP    1      0     10    …
  WATER    1     10      0    …
  …        …      …      …    0
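A minimal sketch of this decomposition, using the slide's example compatibility values (the feature vectors and σ below are placeholders, and only one kernel is shown for brevity):

```python
import numpy as np

# Pairwise term as a sum of Gaussian kernels, each weighted by its own
# label-compatibility matrix mu^(m).

LABELS = ["GRASS", "SHEEP", "WATER"]
mu = np.array([[0.,  1.,  1.],
               [1.,  0., 10.],
               [1., 10.,  0.]])  # compatibility values from the slide

def gaussian(f_i, f_j, sigma):
    return np.exp(-np.sum((f_i - f_j) ** 2) / (2 * sigma ** 2))

def psi(x_i, x_j, f_i, f_j, kernels):
    """kernels: list of (sigma, mu) pairs, one per kernel m."""
    return sum(gaussian(f_i, f_j, s) * m[x_i, x_j] for s, m in kernels)

f_i, f_j = np.array([0., 0.]), np.array([1., 1.])
print(psi(0, 0, f_i, f_j, [(1.0, mu)]))  # same label: cost 0
print(psi(1, 2, f_i, f_j, [(1.0, mu)]))  # SHEEP/WATER: 10 * exp(-1)
```

Note how the compatibility matrix lets a SHEEP/WATER disagreement cost ten times more than, say, GRASS/SHEEP at the same feature distance.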
Dense Random Fields

\psi_{ij}(X_i, X_j) = \mu_1(X_i, X_j) \exp\left( \frac{-|s_i - s_j|^2}{2\sigma_\alpha^2} + \frac{-|c_i - c_j|^2}{2\sigma_\beta^2} \right) + \mu_2(X_i, X_j) \exp\left( \frac{-|s_i - s_j|^2}{2\sigma_\gamma^2} \right)

• Label compatibility
  • Potts model: μ(X_i, X_j) = [X_i ≠ X_j]

      μ      GRASS  SHEEP  WATER  …
      GRASS    0      1      1    1
      SHEEP    1      0      1    1
      WATER    1      1      0    1
      …        1      1      1    0

  • Learned from data (off-diagonal entries fit rather than fixed to 1)
• Appearance kernel
  • Color-sensitive: (c_i − c_j)² compares the colors at positions s_i, s_j
• Local smoothness
  • Discourages single-pixel noise
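The two kernels can be sketched for a single pixel pair; the σ values below are placeholders rather than the learned ones, and the same Potts compatibility is assumed for both μ₁ and μ₂:

```python
import numpy as np

# Appearance kernel (position s and color c) plus smoothness kernel
# (position only), gated by a Potts compatibility [x_i != x_j].

def pairwise(x_i, x_j, s_i, s_j, c_i, c_j, s_a=60., s_b=10., s_g=3.):
    potts = float(x_i != x_j)
    appearance = np.exp(-np.sum((s_i - s_j) ** 2) / (2 * s_a ** 2)
                        - np.sum((c_i - c_j) ** 2) / (2 * s_b ** 2))
    smoothness = np.exp(-np.sum((s_i - s_j) ** 2) / (2 * s_g ** 2))
    return potts * (appearance + smoothness)

s_i, s_j = np.array([10., 10.]), np.array([12., 10.])
c_i, c_j = np.array([200., 30., 30.]), np.array([205., 28., 35.])
print(pairwise(0, 1, s_i, s_j, c_i, c_j))  # nearby, similar colors: high cost
print(pairwise(0, 0, s_i, s_j, c_i, c_j))  # same label: zero cost
```

The split of roles is visible in the code: the appearance term penalizes label disagreement between similar-looking pixels even far apart, while the smoothness term only acts within a few pixels.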
Efficient inference

E(X) = \sum_i \psi_i(X_i) + \sum_{i>j} \psi_{ij}(X_i, X_j)

Find the most likely assignment (MAP):

\hat{x} = \arg\max_X P(X), \quad \text{where} \quad P(X) = \frac{1}{Z} \exp(-E(X))
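The talk's fast inference is mean field, with the message-passing step carried out by high-dimensional Gaussian filtering. A naive O(n²) version of the same update (my own simplification, with the kernel precomputed as a dense matrix K) looks like:

```python
import numpy as np

# Naive mean-field for a dense CRF. Each iteration updates
#   Q_i(x) ∝ exp(-psi_i(x) - sum_j K[i,j] * sum_x' mu[x, x'] Q_j(x')).
# The O(n^2) product K @ Q is exactly the step that permutohedral-lattice
# filtering replaces to reach ~0.2 s per image.

def mean_field(unary, K, mu, iters=10):
    """unary: (n, L) costs; K: (n, n) kernel, zero diagonal; mu: (L, L)."""
    Q = np.exp(-unary)
    Q /= Q.sum(axis=1, keepdims=True)
    for _ in range(iters):
        msg = K @ Q                      # message passing (the slow part)
        Q = np.exp(-unary - msg @ mu.T)  # compatibility transform + update
        Q /= Q.sum(axis=1, keepdims=True)
    return Q.argmax(axis=1)              # approximate MAP labeling

unary = np.array([[0., 4.], [4., 0.], [1., 0.9]])
K = np.array([[0., .1, .9], [.1, 0., .1], [.9, .1, 0.]])
mu = 1.0 - np.eye(2)                     # Potts compatibility
print(mean_field(unary, K, mu))
```

In this toy example the third variable has nearly flat unaries, and the strong kernel weight to its first neighbor pulls it toward that neighbor's label.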