What Physics has to do with Information Theory? - PowerPoint PPT Presentation



SLIDE 1

What Physics has to do with Information Theory?

  • Boltzmann, Gibbs (19th century): statistical physics, deriving macroscopic behavior from microscopic interactions

  • E.T. Jaynes (1957): the same principles can be derived by maximizing Shannon entropy

  • H. Bethe (1935), R. Gallager (1963), J. Pearl (1982): similar ideas for exploiting sparsity in spin glasses, channel coding, and AI

  • late 90s, a unified perspective: it's all about inference (computing avg. physical properties, decoding ECCs, learning in neural networks, denoising images, reconstructing signals, ...)

(M. Mezard, A. Montanari, F. Krzakala, R. Urbanke, D. Saad, Y. Kabashima, H. Nishimori, M. Opper, M. Jordan, M. Wainwright, ...)

Andre Manoel (IF-USP, Brazil) Physics and Information 1 / 4

SLIDE 2

Example: decoding ECCs

x  −−(noisy channel Q(y|x))−→  y
transmitter                    receiver

P(x|y) = (1/Z(y)) ∏_i Q(y_i|x_i) · ∏_a I(x_{i_1^a} ⊕ · · · ⊕ x_{i_p^a} = 0)

where the first product (over channel uses i) is the likelihood, and the second (over parity checks a) is the prior.

single instance

compute marginals P(x_i|y) using belief propagation; assign the symbol-MAP estimate x̂_i(y) = arg max_{x_i} P(x_i|y)

typical case

compute the free energy E_y log Z(y) using replica/cavity methods; obtain the avg. estimator performance

ε = E_y (1/N) ∑_i I(x̂_i(y) ≠ x_i)
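The posterior above can be made concrete on a toy instance. The sketch below (an illustration, not from the slides; the code, flip probability p, and received word y are all assumed) brute-forces the marginals P(x_i|y) for a 3-bit single-parity-check code over a binary symmetric channel, then takes the symbol-MAP estimate. On this tree-structured factor graph, belief propagation would return the same marginals exactly.

```python
import itertools
import numpy as np

# Toy sketch: a 3-bit code with a single parity check x1 ^ x2 ^ x3 = 0,
# sent over a binary symmetric channel with flip probability p. We
# compute the exact marginals P(x_i | y) by brute-force summation and
# take the symbol-MAP estimate x_hat_i(y) = argmax_{x_i} P(x_i | y).

p = 0.1          # assumed channel flip probability
y = [0, 1, 0]    # assumed received word
n = len(y)

def Q(yi, xi):
    """BSC likelihood Q(y_i | x_i)."""
    return 1 - p if yi == xi else p

# Unnormalized posterior: product of likelihoods times parity indicator
marg = np.zeros((n, 2))
for x in itertools.product([0, 1], repeat=n):
    if sum(x) % 2 != 0:        # parity prior I(x1 ^ x2 ^ x3 = 0)
        continue
    w = np.prod([Q(y[i], x[i]) for i in range(n)])
    for i in range(n):
        marg[i, x[i]] += w

marg /= marg.sum(axis=1, keepdims=True)   # normalize by Z(y)
x_hat = marg.argmax(axis=1)               # symbol-MAP estimate
print(x_hat)                              # → [0 1 0]
```

Note that the symbol-wise MAP estimate need not itself be a codeword: here x̂ = y even though y has odd parity.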


SLIDE 3

What I’ve worked with in the recent past . . .

Compressed sensing: sampling signals at sub-Nyquist rates

y = Fx,  y ∈ R^M,  x ∈ R^N,  F ∈ R^{M×N}

Solve for sparse x when M ≪ N.

Probabilistic approach: P(x|y, F) ∝ P(y|F, x) P_0(x)

Compute marginals using approximate message-passing (AMP)

Been working on ...
  • variational approximations [arXiv:1402.1384]
  • converging AMP algorithms [arXiv:1406.4311]

(joint work w/ F. Krzakala, E. Tramel and L. Zdeborová)
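As a self-contained illustration of the recovery problem above (the slides use AMP; this sketch swaps in plain iterative soft thresholding, a simpler relative, and all sizes, the sensing matrix, and the ℓ1 penalty are assumptions), the code below recovers a sparse x from y = Fx with M ≪ N:

```python
import numpy as np

# Minimal compressed-sensing sketch: recover a k-sparse x from
# y = F x, M << N, via iterative soft thresholding (ISTA).

rng = np.random.default_rng(0)
N, M, k = 200, 80, 10            # assumed signal length, measurements, nonzeros

x_true = np.zeros(N)
support = rng.choice(N, k, replace=False)
x_true[support] = rng.normal(size=k)

F = rng.normal(size=(M, N)) / np.sqrt(M)   # i.i.d. Gaussian sensing matrix
y = F @ x_true

def soft(u, t):
    """Soft-thresholding operator eta(u; t)."""
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

L = np.linalg.norm(F, 2) ** 2    # Lipschitz constant of the gradient step
lam = 0.01                       # assumed l1 penalty
x = np.zeros(N)
for _ in range(500):
    # Gradient step on 0.5*||y - Fx||^2, then shrink toward sparsity
    x = soft(x + F.T @ (y - F @ x) / L, lam / L)

print(np.linalg.norm(x - x_true) / np.linalg.norm(x_true))  # relative error
```

AMP replaces the plain gradient step with a corrected ("Onsager") iteration whose per-iteration cost is the same but whose convergence is much faster in this i.i.d. Gaussian setting.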


SLIDE 4

. . . and what I’m working on right now

From optimal source coding ...

min_x D(x, y)  s.t.  x ∈ C

e.g. using LDPC codes, x ∈ C ⇔ Hx = 0, and BP for optimization

... to information hiding

min_x D(x, y)  s.t.  H(κ)x = m  ⇔  embed message m in image y
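The information-hiding constraint can be demonstrated on a tiny instance. The sketch below (an assumed toy example, with a hypothetical 2×5 parity-check matrix H; real schemes use structured codes and BP rather than brute force) finds the x closest in Hamming distance to a cover block y whose syndrome equals the message m:

```python
import itertools
import numpy as np

# Toy syndrome embedding: find x minimizing the Hamming distortion
# D(x, y) subject to H x = m (mod 2). The receiver extracts the
# message simply by computing H x mod 2.

H = np.array([[1, 0, 1, 1, 0],       # assumed 2x5 parity-check matrix
              [0, 1, 1, 0, 1]])
y = np.array([1, 1, 0, 0, 1])        # assumed cover block
m = np.array([0, 1])                 # assumed message to embed

best, best_d = None, None
for bits in itertools.product([0, 1], repeat=y.size):
    x = np.array(bits)
    if np.array_equal(H @ x % 2, m):           # syndrome constraint H x = m
        d = int(np.sum(x != y))                # distortion D(x, y)
        if best is None or d < best_d:
            best, best_d = x, d

print(best, best_d)      # → [1 1 1 0 1] 1: one flipped pixel embeds m
```

Here a single bit flip suffices because a column of H matches the required syndrome change; BP-based embedders scale this search to realistic block lengths.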

(joint work with Renato Vicente)
