Relational reasoning via probabilistic coupling (PowerPoint presentation transcript)




slide-1
SLIDE 1

Relational reasoning via probabilistic coupling

Gilles Barthe, Thomas Espitau, Benjamin Grégoire, Justin Hsu, Léo Stefanesco, Pierre-Yves Strub

IMDEA Software, ENS Cachan, ENS Lyon, Inria, University of Pennsylvania

November 28, 2015


slide-2
SLIDE 2

Relational properties

Properties about two runs of the same program

◮ Assume inputs are related by Ψ
◮ Want to prove the outputs are related by Φ


slide-4
SLIDE 4

Examples

Monotonicity

◮ Ψ : in1 ≤ in2
◮ Φ : out1 ≤ out2
◮ “Bigger inputs give bigger outputs”

Non-interference

◮ Ψ : low1 = low2
◮ Φ : out1 = out2
◮ “If low-security inputs are the same, then outputs are the same”


slide-6
SLIDE 6

Probabilistic relational properties

Richer properties

◮ Differential privacy
◮ Cryptographic indistinguishability

Verification tool: pRHL [BGZ-B]

◮ Imperative while language + command for random sampling
◮ Deterministic input, randomized output
◮ Hoare-style logic

slide-7
SLIDE 7

Inspiration from probability theory

Probabilistic couplings

◮ Used by mathematicians for proving relational properties
◮ Applications: Markov chains, probabilistic processes

Idea

◮ Place two processes in the same probability space
◮ Coordinate the sampling


slide-9
SLIDE 9

Our results

Main observation

The logic pRHL internalizes coupling

Consequences

◮ Constructing a pRHL proof → constructing a coupling
◮ Can verify classic examples of couplings in mathematics with the proof assistant EasyCrypt (built on pRHL)

slide-10
SLIDE 10

The plan

Today

◮ Introducing probabilistic couplings
◮ Introducing the relational logic pRHL
◮ Example: convergence of random walks

slide-11
SLIDE 11

Probabilistic couplings



slide-13
SLIDE 13

Introduction to probabilistic couplings

Basic ingredients

◮ Given: two distributions X1, X2 over a set A
◮ Produce: a joint distribution Y over A × A
  – Distribution over the first component is X1
  – Distribution over the second component is X2

Definition

Given two distributions X1, X2 over a set A, a coupling Y is a distribution over A × A such that π1(Y ) = X1 and π2(Y ) = X2.

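The definition above is easy to check mechanically. Here is a small Python sketch (our own encoding, not from the slides): distributions are dicts from outcomes to probabilities, and a coupling is a dict over pairs whose two projections recover the given distributions. Note that couplings are far from unique; three different couplings of the same pair of fair coins all satisfy the definition.

```python
from itertools import product

def marginals(joint):
    """Project a joint distribution (a dict over pairs) to its two marginals."""
    m1, m2 = {}, {}
    for (a, b), p in joint.items():
        m1[a] = m1.get(a, 0.0) + p
        m2[b] = m2.get(b, 0.0) + p
    return m1, m2

def is_coupling(joint, x1, x2, tol=1e-9):
    """Check the definition: pi1(Y) = X1 and pi2(Y) = X2."""
    m1, m2 = marginals(joint)
    return (all(abs(m1.get(k, 0.0) - x1.get(k, 0.0)) < tol for k in set(m1) | set(x1))
            and all(abs(m2.get(k, 0.0) - x2.get(k, 0.0)) < tol for k in set(m2) | set(x2)))

fair = {"H": 0.5, "T": 0.5}

# Three different couplings of the same pair of fair coins:
independent = {(a, b): fair[a] * fair[b] for a, b in product(fair, fair)}
identity = {("H", "H"): 0.5, ("T", "T"): 0.5}  # the samples always agree
anti = {("H", "T"): 0.5, ("T", "H"): 0.5}      # the samples always disagree

assert is_coupling(independent, fair, fair)
assert is_coupling(identity, fair, fair)
assert is_coupling(anti, fair, fair)
```

The freedom to pick which coupling to use is exactly what the mirrored-walk example below exploits.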


slide-15
SLIDE 15

Example: mirrored random walks

Simple random walk on integers

◮ Start at position p = 0
◮ Each step, sample a coin x ←$ flip
◮ Heads: p ← p + 1
◮ Tails: p ← p − 1

Figure: Simple random walk (each step is ±1 with probability 1/2)


slide-18
SLIDE 18

Coupling the walks to meet

Case p1 = p2: Walks have met

◮ Arrange samplings x1 = x2 ◮ Continue to have p1 = p2

Case p1 = p2: Walks have not met

◮ Arrange samplings x1 = ¬x2 ◮ Walks make mirror moves

Under coupling, if walks meet, they move together

11
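The case analysis above can be written out as a single coupled step (our own Python sketch): given the two current positions, it returns the joint distribution over the coupled moves. In both cases each marginal is the honest ±1 fair step, so this really is a coupling, and once the walks meet every outcome keeps them equal.

```python
def coupled_step_dist(p1, p2):
    """Joint distribution over the coupled next positions (new_p1, new_p2)."""
    if p1 == p2:                      # met: arrange x1 = x2
        return {(p1 + 1, p2 + 1): 0.5, (p1 - 1, p2 - 1): 0.5}
    else:                             # not met: arrange x1 = not x2 (mirror)
        return {(p1 + 1, p2 - 1): 0.5, (p1 - 1, p2 + 1): 0.5}

def marginal(joint, i):
    out = {}
    for pair, p in joint.items():
        out[pair[i]] = out.get(pair[i], 0.0) + p
    return out

# Both marginals are the fair +/-1 step, whether or not the walks have met:
d = coupled_step_dist(0, 4)
assert marginal(d, 0) == {1: 0.5, -1: 0.5}
assert marginal(d, 1) == {3: 0.5, 5: 0.5}

# Once the walks meet, every outcome of the coupled step keeps them equal:
assert all(a == b for (a, b) in coupled_step_dist(3, 3))
```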


slide-21
SLIDE 21

Why is this interesting?

Goal: memorylessness

◮ Start two random walks at w and w + 2k
◮ To show: position distributions converge as we take more steps

Coupling bounds distance between distributions

◮ Once walks meet, they stay equal
◮ Distance is at most probability walks don’t meet

Theorem

If Y is a coupling of two distributions (X1, X2), then

  ∥X1 − X2∥TV = (1/2) Σa∈A |X1(a) − X2(a)| ≤ Pr(y1,y2)∼Y [y1 ≠ y2].
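The theorem can be checked on a tiny concrete instance (our own illustrative numbers, not from the slides): two coins with total variation distance 0.2, and a coupling that makes the samples agree as often as possible, so that Pr[y1 ≠ y2] matches the bound.

```python
def tv_distance(x1, x2):
    """Total variation distance: (1/2) * sum over a of |X1(a) - X2(a)|."""
    keys = set(x1) | set(x2)
    return 0.5 * sum(abs(x1.get(k, 0.0) - x2.get(k, 0.0)) for k in keys)

x1 = {"H": 0.5, "T": 0.5}   # fair coin
x2 = {"H": 0.7, "T": 0.3}   # biased coin (illustrative numbers)

# A coupling of (x1, x2) that agrees as often as possible:
y = {("H", "H"): 0.5, ("T", "H"): 0.2, ("T", "T"): 0.3}

pr_differ = sum(p for (a, b), p in y.items() if a != b)   # Pr[y1 != y2]

assert abs(tv_distance(x1, x2) - 0.2) < 1e-9
assert tv_distance(x1, x2) <= pr_differ + 1e-9   # the theorem's inequality
```

A sloppier coupling (e.g. the independent one) makes Pr[y1 ≠ y2] larger, so the bound it gives is weaker; the art is in choosing a coupling whose samples agree often.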

slide-22
SLIDE 22

The logic pRHL


slide-23
SLIDE 23

The program logic pRHL

Probabilistic Relational Hoare Logic

◮ Hoare-style logic for probabilistic relational properties ◮ Proposed by Barthe, Grégoire, Zanella-Béguelin ◮ Implemented in the EasyCrypt proof assistant for crypto proofs 14


slide-26
SLIDE 26

Language and judgments

The pWhile imperative language

  c ::= x ← e | x ←$ d | if e then c else c | while e do c | skip | c; c

Basic pRHL judgments

  c1 ∼ c2 : Ψ ⇒ Φ

◮ Ψ and Φ are formulas over labeled program variables x1, x2
◮ Ψ is the precondition, Φ is the postcondition
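The grammar can be transcribed directly as an abstract syntax tree; below is a minimal Python sketch with one constructor per production (the class and field names are ours, and expressions and distributions are left as opaque strings).

```python
from dataclasses import dataclass
from typing import Any

@dataclass
class Assign:            # x <- e        deterministic assignment
    x: str
    e: Any

@dataclass
class Sample:            # x <-$ d       random sampling from distribution d
    x: str
    d: Any

@dataclass
class If:                # if e then c else c
    e: Any
    then_c: Any
    else_c: Any

@dataclass
class While:             # while e do c
    e: Any
    body: Any

@dataclass
class Skip:              # skip
    pass

@dataclass
class Seq:               # c; c
    c1: Any
    c2: Any

# One step of the random walk from the examples, as a pWhile command:
step = Seq(Sample("b", "flip"),
           If("b", Assign("pos", "pos + 1"), Assign("pos", "pos - 1")))
```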


slide-29
SLIDE 29

Interpreting the judgment

c1 ∼ c2 : Ψ ⇒ Φ

Interpreting pre- and post-conditions

◮ Ψ interpreted as a relation on two memories
◮ Φ interpreted as a relation Φ† on distributions over memories

Definition (Couplings in disguise!)

If Φ is a relation on A, the lifted relation Φ† is a relation on Distr(A) where µ1 Φ† µ2 if there exists µ ∈ Distr(A × A) with

◮ supp(µ) ⊆ Φ; and
◮ π1(µ) = µ1 and π2(µ) = µ2.


slide-36
SLIDE 36

Proof rules

The key rule: Sampling

Sample
    f : T → T is 1−1        ∀v ∈ T. d1(v) = d2(f v)
    ------------------------------------------------
    x1 ←$ d1 ∼ x2 ←$ d2 : (∀v. Φ[v/x1, f(v)/x2]) ⇒ Φ

Notes

◮ Bijection f : specifies how to coordinate the samples
◮ Side condition: marginals are preserved under f
◮ Assume: samples are coupled when proving the postcondition Φ
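The side condition d1(v) = d2(f v) is a finite check when the sampled type is finite. A small Python sketch (our own encoding; distributions as dicts, f as a function) shows it holds for the fair coin with both the identity and negation, but fails for a biased coin under negation, so the mirror coupling is only available for fair flips.

```python
# Checking the Sample rule's side condition d1(v) = d2(f v) concretely.
flip = {True: 0.5, False: 0.5}
biased = {True: 0.7, False: 0.3}

def side_condition(d1, d2, f):
    return all(abs(d1[v] - d2[f(v)]) < 1e-9 for v in d1)

neg = lambda b: not b
ident = lambda b: b

assert side_condition(flip, flip, ident)        # identity coupling: x1 = x2
assert side_condition(flip, flip, neg)          # mirror coupling: x1 = not x2
assert not side_condition(biased, biased, neg)  # mirroring a biased coin fails
```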

slide-37
SLIDE 37

Examples



slide-39
SLIDE 39

Example: mirroring random walks in pRHL

The code

  pos ← start;              // Start position
  i ← 0; H ← [];            // Ghost code
  while i < N do
    b ←$ flip;
    H ← b :: H;             // Ghost code
    if b then pos ← pos + 1;
    else pos ← pos - 1;
    fi
    i ← i + 1;
  end
  return pos                // Final position

Goal: couple two walks via mirroring

slide-40
SLIDE 40

Record the history

H stores history of flips

◮ Σ(H) is the net distance that the first process moves to the right
◮ Meet(H) holds if there is a prefix H' of H with Σ(H') = k
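These ghost-state predicates are easy to make concrete; below is our own Python encoding, where H is a list of booleans (True = heads) stored newest-first, as the cons `H ← b :: H` in the code builds it, so prefixes of the run are suffixes of H.

```python
def net_right(H):
    """Sigma(H): net distance moved right = #heads - #tails."""
    return sum(1 if b else -1 for b in H)

def meet(H, k):
    """Meet(H): some prefix H' of the flip history has Sigma(H') = k.
    H stores newest flip first, so run prefixes are suffixes of H."""
    return any(net_right(H[i:]) == k for i in range(len(H) + 1))

H = [True, True, False, True]   # flips, newest first
assert net_right(H) == 2
assert meet(H, 1)               # run prefix T, F, T has net displacement 1
assert not meet(H, 5)
```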


slide-42
SLIDE 42

Specify the coupling

Sampling rule

Sample
    f : T → T is 1−1        ∀v ∈ T. d1(v) = d2(f v)
    ------------------------------------------------
    x1 ←$ d1 ∼ x2 ←$ d2 : (∀v. Φ[v/x1, f(v)/x2]) ⇒ Φ

Case on Meet(H1)

◮ True: take the bijection f to be the identity id
◮ False: take the bijection f to be negation ¬


slide-47
SLIDE 47

Final judgment

c ∼ c : start1 + 2k = start2 ⇒ (Meet(H1) → pos1 = pos2)

How to read

◮ The two walks start 2k apart
◮ If the walks have met, their positions are equal
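The judgment can also be checked empirically. The sketch below (our own code, not a formal proof; for simplicity H1 stores flips oldest-first) simulates coupled runs under the mirror coupling and asserts Meet(H1) → pos1 = pos2 on every sampled execution.

```python
import random

def coupled_walks(start1, k, N, rng):
    """Run two length-N walks from start1 and start1 + 2k under the
    mirror coupling; H1 records the first walk's flips, oldest first."""
    p1, p2, H1 = start1, start1 + 2 * k, []
    for _ in range(N):
        b1 = rng.random() < 0.5
        b2 = b1 if p1 == p2 else (not b1)   # mirror until the walks meet
        H1.append(b1)
        p1 += 1 if b1 else -1
        p2 += 1 if b2 else -1
    # Meet(H1): some prefix of the flip history has net displacement k.
    met = any(sum(1 if b else -1 for b in H1[:i]) == k
              for i in range(len(H1) + 1))
    return p1, p2, met

rng = random.Random(0)
for _ in range(500):
    p1, p2, met = coupled_walks(start1=0, k=2, N=20, rng=rng)
    assert (not met) or p1 == p2   # Meet(H1) -> pos1 = pos2, in every run
```

Under the mirror coupling the first walk sits at start1 + Σ and the second at start2 − Σ until they meet, so they collide exactly when Σ reaches k, matching the Meet(H1) predicate.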


slide-49
SLIDE 49

Further examples

Lazy random walk on torus

Figure: Lazy random walk on a two-dimensional torus (the walk moves to each of the four neighbours with probability 1/8)

Stochastic domination

◮ A notion of ordering for probabilistic processes
◮ Proved via couplings

slide-50
SLIDE 50

Wrapping up


slide-51
SLIDE 51

Open problems

Handling more advanced couplings

◮ Shift couplings, path couplings, etc.
◮ Hard example: the constructive Lovász Local Lemma by Moser

Quantitative bounds

◮ How long does it take for the mirrored walks to meet?
◮ Non-relational reasoning

Borrow more ideas from the coupling literature

◮ Couplings from mathematics may suggest natural rules to add

slide-52
SLIDE 52

Relational reasoning via probabilistic coupling

Gilles Barthe, Thomas Espitau, Benjamin Grégoire, Justin Hsu, Léo Stefanesco, Pierre-Yves Strub

IMDEA Software, ENS Cachan, ENS Lyon, Inria, University of Pennsylvania

November 28, 2015
