

SLIDE 1

MMSE Approximation for the Sparse Prior Using Stochastic Resonance

Dror Simon

Computer Science, Technion, Israel

January 15, 2020

Joint work with Jeremias Sulam, Yaniv Romano, Yue M. Lu and Michael Elad

Dror Simon (Technion) MMSE for Sparse Prior January 15, 2020 1 / 46

SLIDES 2-6

Noise Removal

Why denoising?
- A simple testing ground for novel concepts in signal processing.
- Can be generalized to other, more complicated applications.

SLIDES 7-10

Noise Removal

[Figure: a noisy signal enters a signal-restoration block that, aided by prior knowledge, outputs a clean estimate.]

SLIDES 11-17

Noise Removal – Bayesian Standpoint

[Figure: denoising is posed as optimal signal estimation; in practice it is replaced by an approximation, i.e. suboptimal signal estimation.]

SLIDE 18

Outline

1. Bayesian Framework: The Generative Model; Bayesian Estimators
2. MMSE Approximation: Previous Work
3. Stochastic Resonance: Can Noise Help Denoising?
4. Our Proposed Method: The Algorithm; Unitary Case Analysis; Image Denoising
5. Conclusions


SLIDES 20-24

The Generative Model

- $D \in \mathbb{R}^{n \times m}$ is a dictionary with normalized columns.
- Each element $i$ in $\alpha$ is non-zero with probability $p_i \ll 1$.
- The non-zero elements of the sparse representation, denoted $\alpha_s$, are sampled from a Gaussian distribution: $\alpha_s \mid s \sim \mathcal{N}\left(0, \sigma_\alpha^2 I_{|s|}\right)$.
- The product $D\alpha$ leads to a signal $x$.
- We are given noisy measurements $y = D\alpha + \nu$, where $\nu$ is white Gaussian noise, $\nu \sim \mathcal{N}\left(0, \sigma_\nu^2 I_n\right)$.
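The generative model above takes only a few lines to simulate. A minimal sketch, assuming a random Gaussian dictionary (the slides leave $D$ generic) and a shared activation probability $p_i = p$ for all $i$; all constants are arbitrary demo choices:

```python
# Sample one signal from the Bernoulli-Gaussian generative model y = D*alpha + nu.
import numpy as np

rng = np.random.default_rng(0)

n, m = 20, 30          # signal and representation dimensions
p = 0.05               # per-entry activation probability, p_i << 1
sigma_alpha = 1.0      # std of the non-zero coefficients
sigma_nu = 0.2         # measurement-noise std

# Dictionary with normalized columns.
D = rng.standard_normal((n, m))
D /= np.linalg.norm(D, axis=0)

# Bernoulli support s, Gaussian non-zeros on that support.
s = rng.random(m) < p
alpha = np.zeros(m)
alpha[s] = sigma_alpha * rng.standard_normal(s.sum())

x = D @ alpha                                  # clean signal
y = x + sigma_nu * rng.standard_normal(n)      # noisy measurements
```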

SLIDES 25-30

The Generative Model – Results

- The prior probability of a support (Bernoulli): $p(s) = \prod_{i \in s} p_i \prod_{j \notin s} (1 - p_j)$.
- When the support is known, $y$ and $\alpha_s$ are jointly Gaussian, $y = D_s \alpha_s + \nu$, leading to [1]:
  - $y \mid s$ is Gaussian: $y \mid s \sim \mathcal{N}(0, C_s)$.
  - $y \mid \alpha_s, s$ is Gaussian: $y \mid \alpha_s, s \sim \mathcal{N}\left(D_s \alpha_s, \sigma_\nu^2 I_n\right)$.
  - $\alpha_s \mid y, s$ is Gaussian: $\alpha_s \mid y, s \sim \mathcal{N}\left(\frac{1}{\sigma_\nu^2} Q_s^{-1} D_s^T y,\; Q_s^{-1}\right)$,

  where $C_s = \sigma_\alpha^2 D_s D_s^T + \sigma_\nu^2 I_n$ and $Q_s = \frac{1}{\sigma_\alpha^2} I_{|s|} + \frac{1}{\sigma_\nu^2} D_s^T D_s$.

[1] Turek, Javier S., Irad Yavneh, and Michael Elad, 2011. "On MMSE and MAP denoising under sparse representation modeling over a unitary dictionary."
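The closed-form quantities $C_s$ and $Q_s$ translate directly into code. A sketch under the model above; the helper names (`support_covariance`, `oracle_estimate`) are ours, not from the talk:

```python
import numpy as np

def support_covariance(D, s, sigma_alpha, sigma_nu):
    """C_s = sigma_alpha^2 D_s D_s^T + sigma_nu^2 I_n: covariance of y given s."""
    Ds = D[:, s]
    return sigma_alpha**2 * (Ds @ Ds.T) + sigma_nu**2 * np.eye(Ds.shape[0])

def oracle_estimate(y, D, s, sigma_alpha, sigma_nu):
    """Posterior mean E{alpha_s | s, y} = (1/sigma_nu^2) Q_s^{-1} D_s^T y."""
    Ds = D[:, s]
    Qs = np.eye(Ds.shape[1]) / sigma_alpha**2 + (Ds.T @ Ds) / sigma_nu**2
    return np.linalg.solve(Qs, Ds.T @ y) / sigma_nu**2
```

As a sanity check, when the chosen columns are orthonormal this reduces to the familiar Wiener shrinkage $\frac{\sigma_\alpha^2}{\sigma_\alpha^2 + \sigma_\nu^2} D_s^T y$.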


SLIDES 32-36

Bayesian Estimators

The goal: estimate $\alpha$ given the noisy measurements $y$, i.e. denoise the signal. Many estimators can be proposed. We focus our attention on three:

1. The oracle estimator.
2. The Maximum A-posteriori Probability (MAP) estimator.
3. The Minimum Mean Square Error (MMSE) estimator.

SLIDES 37-46

The Bayesian Estimators

Oracle Estimator
$$\hat{\alpha}_s^{\mathrm{Oracle}} = \mathbb{E}\{\alpha_s \mid s, y\} = \frac{1}{\sigma_\nu^2} Q_s^{-1} D_s^T y$$

MAP Support Estimator
$$\hat{s}_{\mathrm{MAP}} = \arg\max_s p(s \mid y) = \arg\max_s p(s)\, p(y \mid s)
= \arg\max_{s \in \{0,1\}^m} \left[ -\tfrac{1}{2} y^T C_s^{-1} y - \tfrac{1}{2} \log\det(C_s) + \sum_{i \in s} \log p_i + \sum_{j \notin s} \log(1 - p_j) \right]$$

MMSE Estimator
$$\hat{\alpha}_{\mathrm{MMSE}} = \arg\min_{\hat{\alpha}(y)} \mathbb{E}\left\{ \left\| \hat{\alpha}(y) - \alpha \right\|_2^2 \,\middle|\, y \right\}
= \mathbb{E}\{\alpha \mid y\}
= \mathbb{E}_{s \mid y}\left[ \mathbb{E}_{\alpha \mid y, s}\{\alpha \mid y, s\} \right]
= \sum_{s \in \{0,1\}^m} p(s \mid y)\, \hat{\alpha}_s^{\mathrm{Oracle}}$$
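For tiny $m$, the MMSE estimator can be computed by brute force, summing over all $2^m$ supports exactly as the MMSE expression above prescribes. A sketch assuming a shared activation probability $p$; this is only feasible for very small $m$, which is precisely the point the next slides make:

```python
# Exact MMSE by enumerating every support s in {0,1}^m.
import itertools
import numpy as np

def exact_mmse(y, D, p, sigma_alpha, sigma_nu):
    """Weighted average of oracle estimates, weights proportional to p(s) p(y|s)."""
    n, m = D.shape
    num, den = np.zeros(m), 0.0
    for bits in itertools.product([False, True], repeat=m):
        s = np.array(bits)
        # Prior: prod_{i in s} p * prod_{j not in s} (1 - p).
        prior = np.prod(np.where(s, p, 1 - p))
        # Likelihood p(y|s), Gaussian with C_s = sigma_alpha^2 D_s D_s^T + sigma_nu^2 I_n.
        Ds = D[:, s]
        Cs = sigma_alpha**2 * (Ds @ Ds.T) + sigma_nu**2 * np.eye(n)
        _, logdet = np.linalg.slogdet(Cs)
        w = prior * np.exp(-0.5 * (y @ np.linalg.solve(Cs, y)) - 0.5 * logdet)
        # Oracle estimate for this support, zero-padded to length m.
        est = np.zeros(m)
        if s.any():
            Qs = np.eye(s.sum()) / sigma_alpha**2 + (Ds.T @ Ds) / sigma_nu**2
            est[s] = np.linalg.solve(Qs, Ds.T @ y) / sigma_nu**2
        num += w * est
        den += w
    return num / den
```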

SLIDES 47-53

Bayesian Estimators – Summary

Both estimators are practically impossible to obtain.
- MAP – compute the posterior probability of each of the $2^m$ supports and pick the most probable one.
- MMSE – compute the posterior probability of each support and use them as weights for all the possible oracle estimators.

How is this issue resolved?
- MAP – use an approximation algorithm (greedy or relaxed) to recover a likely support, and then use the oracle.
- MMSE – usually avoided.

Can we do better?


SLIDES 56-63

Random OMP

A modified version of the OMP algorithm.
- OMP: picks the atom most correlated with the current residual.
- RandOMP: weights the unpicked atoms according to their correlation with the residual, and chooses randomly.
- Repeated many times, leading to a variety of solutions.
- Averages the solutions to retrieve a final estimate.
- Asymptotically converges to the MMSE estimator when:
  - The dictionary is unitary.
  - The cardinality of the sparse representation is 1.
- Empirically achieves better MSE than OMP even when these conditions are not met.

Elad, Michael, and Irad Yavneh, 2009. "A plurality of sparse representations is better than the sparsest one alone."
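A sketch of RandOMP as described above: instead of picking the single most correlated atom, each atom is drawn with probability growing with its correlation to the residual, and many randomized runs are averaged. The exponential weighting constant `c` below is our assumption, loosely patterned on the unitary-case analysis, not a quotation of the exact rule in Elad & Yavneh (2009):

```python
import numpy as np

def rand_omp(y, D, k, sigma_alpha, sigma_nu, rng):
    """One randomized pursuit run, returning a coefficient vector with <= k atoms."""
    n, m = D.shape
    support = []
    residual = y.copy()
    # Assumed weighting constant (hypothetical choice for this sketch).
    c = sigma_alpha**2 / (2 * sigma_nu**2 * (sigma_alpha**2 + sigma_nu**2))
    for _ in range(k):
        score = c * (D.T @ residual) ** 2
        w = np.exp(score - score.max())       # weights grow with |d_j^T r|
        w[support] = 0.0                      # never re-pick a chosen atom
        j = int(rng.choice(m, p=w / w.sum()))
        support.append(j)
        Ds = D[:, support]
        coef, *_ = np.linalg.lstsq(Ds, y, rcond=None)
        residual = y - Ds @ coef
    alpha = np.zeros(m)
    alpha[support] = coef
    return alpha

def rand_omp_average(y, D, k, sigma_alpha, sigma_nu, runs=100, seed=0):
    """Average many RandOMP solutions to approximate the MMSE estimate."""
    rng = np.random.default_rng(seed)
    return np.mean([rand_omp(y, D, k, sigma_alpha, sigma_nu, rng)
                    for _ in range(runs)], axis=0)
```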

SLIDES 64-70

MMSE Approximation Methods

Why do these methods work? $p(s \mid y)$ has an exponential nature, so the estimator is dominated by a small set of solutions:

$$\hat{\alpha}_{\mathrm{MMSE}} = \sum_{s \in \{0,1\}^m} p(s \mid y)\, \hat{\alpha}_s^{\mathrm{Oracle}} \approx \sum_{s \in \omega \subset \{0,1\}^m} \tilde{p}(s \mid y)\, \hat{\alpha}_s^{\mathrm{Oracle}}$$

- These methods find a "dominant" subset $\omega$ of supports and approximate their weights (posterior probabilities).
- Previously suggested algorithms operate in a greedy fashion, making them impractical for high-dimensional signals.


SLIDES 73-74

Stochastic Resonance

Definition
- Originally suggested as an explanation for the periodic recurrence of ice ages.
- Today, broadly applied to describe a more general phenomenon in which the presence of noise in a nonlinear system yields a better response.

Noise improves system performance?

SLIDES 75-79

Dither – Noise Has a Constructive Value

In signal quantization, additive noise is used to make the quantization error stochastic. For example, in images, dither prevents the color banding that produces unpleasant artifacts.
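A toy illustration of the dither idea: a low-amplitude ramp is invisible to a 1-bit quantizer, but averaging many dithered quantizations recovers its trend. All constants here are arbitrary choices for the demo:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
signal = 0.3 * t                      # low-amplitude ramp, always below threshold

def quantize(v):
    """1-bit quantizer with threshold 0.5."""
    return np.where(v >= 0.5, 1.0, 0.0)

hard = quantize(signal)               # no dither: the output is identically zero

# With dither, each quantization flips randomly with probability that grows
# with the signal, so the average over many runs tracks the ramp.
dithered = np.mean(
    [quantize(signal + 0.25 * rng.standard_normal(t.size)) for _ in range(2000)],
    axis=0,
)
```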


SLIDES 82-89

The Proposed Algorithm

[Figure: the noisy signal y is perturbed with additional synthetic noise n; a MAP pursuit is applied to each perturbed copy y + n, producing a support S; the oracle estimator is then applied to each collected support.]

SLIDES 90-98

The Proposed Algorithm

Algorithm 1: Prior-based SR algorithm

    input : y, D, PursuitMethod, σn, K
    output: α̂
    S ← ∅
    for k ∈ 1...K do
        n_k ← SampleNoise(σn)
        α̃_k ← PursuitMethod(y + n_k, D)
        Ŝ_k ← Support(α̃_k)
        S ← S ∪ {Ŝ_k}
    end
    α̂ ← ( Σ_{S∈S} P(y|S) P(S) α̂_S^Oracle(y) ) / ( Σ_{S∈S} P(y|S) P(S) )
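Algorithm 1 can be sketched in a few dozen lines. A minimal version, with a simple thresholding pursuit standing in for PursuitMethod and the support weights computed from the $p(y \mid S)\,p(S)$ expressions of the earlier slides; σn, K and the pursuit cardinality `k` are free parameters, and the helper names are ours:

```python
import numpy as np

def prior_based_sr(y, D, p, sigma_alpha, sigma_nu, sigma_n, K, k=2, seed=0):
    """Perturb y with K noise realizations, collect pursuit supports, fuse them."""
    rng = np.random.default_rng(seed)
    n, m = D.shape
    supports = set()
    for _ in range(K):
        y_k = y + sigma_n * rng.standard_normal(n)           # synthetic SR noise
        corr = np.abs(D.T @ y_k)
        supports.add(tuple(np.sort(np.argsort(corr)[-k:])))  # thresholding pursuit
    logws, ests = [], []
    for s in supports:
        idx = list(s)
        Ds = D[:, idx]
        # Posterior weight: log p(y|S) + log p(S), up to a shared constant.
        Cs = sigma_alpha**2 * (Ds @ Ds.T) + sigma_nu**2 * np.eye(n)
        _, logdet = np.linalg.slogdet(Cs)
        logw = (-0.5 * (y @ np.linalg.solve(Cs, y)) - 0.5 * logdet
                + len(idx) * np.log(p) + (m - len(idx)) * np.log(1 - p))
        # Oracle estimate for this support, zero-padded to length m.
        Qs = np.eye(len(idx)) / sigma_alpha**2 + (Ds.T @ Ds) / sigma_nu**2
        est = np.zeros(m)
        est[idx] = np.linalg.solve(Qs, Ds.T @ y) / sigma_nu**2
        logws.append(logw)
        ests.append(est)
    w = np.exp(np.array(logws) - np.max(logws))              # stabilized weights
    return (w[:, None] * np.array(ests)).sum(axis=0) / w.sum()
```

Note that the weights are normalized in the log domain before exponentiation, since $p(y \mid S)\,p(S)$ can easily underflow for realistic dimensions.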