Fast Homomorphic Evaluation of Deep Discretized Neural Networks
Florian Bourse, Michele Minelli, Matthias Minihold, Pascal Paillier
ENS, CNRS, PSL Research University, INRIA (Work done while visiting CryptoExperts)
CRYPTO 2018, UCSB, Santa Barbara
Homomorphic evaluation of a neural network:
1. Evaluate the multisum: easy – just need a linearly homomorphic scheme.
2. Apply the activation function: depends on the function.
3. Bootstrap: can be costly.
4. Repeat for all the layers.
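A minimal sketch of this layer-by-layer pipeline, as a plaintext simulation in Python (no actual encryption; `eval_multisum`, `apply_activation`, `bootstrap` and `evaluate_network` are hypothetical names, not an existing API):

```python
import numpy as np

def eval_multisum(W, x):
    # Step 1: the multisum is linear, so a linearly homomorphic
    # scheme can evaluate it directly over ciphertexts.
    return W @ x

def apply_activation(z):
    # Step 2: cost and feasibility depend on the chosen function;
    # here we use sign, the DiNN activation.
    return np.where(z >= 0, 1, -1)

def bootstrap(c):
    # Step 3: refresh the noise. A no-op in the clear; homomorphically,
    # this is the costly operation.
    return c

def evaluate_network(layers, x):
    # Step 4: repeat for all the layers.
    for W in layers:
        x = bootstrap(apply_activation(eval_multisum(W, x)))
    return x
```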
Our approach (FHE-DiNN), for every neuron:
1. Compute the multisum $\sum_i w_i x_i$.
2. Bootstrap to the activated value.
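In FHE-DiNN the two steps collapse into a single bootstrapping that outputs an encryption of the sign of the multisum while refreshing the noise. A toy plaintext version of one neuron (illustrative names, no encryption):

```python
import numpy as np

def dinn_neuron(w, x):
    multisum = int(np.dot(w, x))       # 1. compute sum_i w_i * x_i
    return 1 if multisum >= 0 else -1  # 2. bootstrap to sign(multisum)
```

Because weights and inputs are small integers and outputs live in {-1, +1}, the multisum stays in a bounded range, which is what allows a single bootstrapping to evaluate the activation.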
Bootstrapping in TFHE [CGGI16, CGGI17], on an LWE ciphertext $c = (a, b)$ under secret key $s$:
1. Homomorphically compute $X^{b - \langle s, a \rangle}$: spin the wheel.
2. Pick the ciphertext pointed to by the arrow.
3. Switch back to the original key.
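A plaintext caricature of the wheel (an assumed simplification: the $2N$ positions of the wheel are the exponents of $X$ modulo $X^N + 1$, and the test vector encodes $+1$ on one half-wheel and, by negacyclicity, $-1$ on the other):

```python
N = 512  # the wheel has 2N positions, since X^(2N) = 1 and X^N = -1

def bootstrap_to_sign(a, b, s):
    # 1. "Spin the wheel": the arrow lands at phase b - <s, a> mod 2N.
    phase = (b - sum(si * ai for si, ai in zip(s, a))) % (2 * N)
    # 2. Pick the slot under the arrow: +1 on one half of the wheel,
    #    -1 on the other (the wrap-around X^N = -1 gives the second half).
    return 1 if phase < N else -1
    # 3. Key switching back to the original key is omitted in this toy model.
```

The anti-periodicity $X^N = -1$ is exactly why the sign function is the activation that TFHE bootstraps natively.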
Refinements:
1. Reducing bandwidth usage: pack all the inputs into a single polynomial ciphertext. Multiplying it by the weight polynomial $\sum_i w_i X^{-i}$ puts the multisum $\sum_i w_i x_i$ in the constant coefficient of the result.
2. Dynamically changing the message space.
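A small numeric check of the packing trick in Python (plain integer polynomials, no encryption; `negacyclic_mul` is an illustrative helper): in $\mathbb{Z}[X]/(X^N + 1)$ we have $X^{-i} = -X^{N-i}$, and the constant coefficient of $\big(\sum_i x_i X^i\big) \cdot \big(\sum_i w_i X^{-i}\big)$ is exactly the multisum $\sum_i w_i x_i$:

```python
import numpy as np

def negacyclic_mul(a, b):
    """Product of two polynomials in Z[X]/(X^N + 1) (coefficient vectors)."""
    N = len(a)
    res = np.zeros(N, dtype=np.int64)
    for i in range(N):
        for j in range(N):
            if i + j < N:
                res[i + j] += a[i] * b[j]
            else:
                res[i + j - N] -= a[i] * b[j]  # wrap around with X^N = -1
    return res

N = 8
rng = np.random.default_rng(0)
x = rng.integers(-2, 3, N)   # packed inputs: p(X) = sum_i x_i X^i
w = rng.integers(-2, 3, N)   # neuron weights

# Weight polynomial sum_i w_i X^{-i}, using X^{-i} = -X^{N-i} for i > 0
w_poly = np.zeros(N, dtype=np.int64)
w_poly[0] = w[0]
w_poly[1:] = -w[:0:-1]       # coefficient of X^{N-i} is -w_i

prod = negacyclic_mul(x, w_poly)
assert prod[0] == np.dot(w, x)   # constant coefficient = the multisum
```

In the scheme itself this product is computed on a ring ciphertext, after which the constant coefficient is extracted as an LWE sample and fed to the sign bootstrapping.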
[Figure: the FHE-DiNN evaluation pipeline. The packed input ciphertext $\sum_i p_i X^i$ is multiplied by the weight polynomial $\sum_i w_i X^{-i}$; the multisum is extracted from the constant coefficient, a sign bootstrapping produces the activated value, and the weighted sums feeding the next layer are computed on the refreshed ciphertexts.]
Experimental results (MNIST):

              Original NN (R)   DiNN + hard_sigmoid   DiNN + sign
30 neurons    94.76%            93.76% (-1%)          93.55% (-1.21%)
100 neurons   96.75%            96.62% (-0.13%)       96.43% (-0.32%)

Homomorphic evaluation:

          Accur.    Disag.          Wrong BS        Disag. w/ wrong BS   Time
30 or     93.71%    273 (105–121)   3383/300000     196/273              0.515 s
30 un     93.46%    270 (119–110)   2912/300000     164/270              0.491 s
100 or    96.26%    127 (61–44)     9088/1000000    105/127              1.679 s
100 un    96.35%    150 (66–58)     7452/1000000    99/150               1.64 s

or = original, un = unfolded
ReLU activation: $a_{\mathrm{ReLU}}(x) = \max(0, x)$