SLIDE 1
Making sure crypto stays insecure
Daniel J. Bernstein
University of Illinois at Chicago & Technische Universiteit Eindhoven

Terrorist in Hong Kong prepares to throw deadly weapon at Chinese government workers. Image credit: Reuters.
SLIDE 2
Drug-dealing cartel “Starbucks” invades city in Morocco; begins selling addictive liquid. Image credit: Wikipedia.
SLIDE 6
Pedophile convinces helpless child to remove most of her clothing; sexually abuses child in public. Image credit: Child pornographer.
SLIDE 10
Criminal organization calling itself “The Guardian” sells classified government secrets. Image credit: The Guardian.
SLIDE 14
We have to watch and listen to everything that people are doing so that we can catch terrorists, drug dealers, organized criminals, pedophiles, murderers, etc.
SLIDE 18
We try to systematically monitor and record all Internet traffic. But what if it’s encrypted?
SLIDE 19
This talk gives some examples of how we’ve manipulated the world’s crypto ecosystem so that we can understand almost all of this traffic.
SLIDE 24
Other useful strategies, not covered in this talk:
Manipulate software ecosystem so that software stays insecure.
Break into computers; access hundreds of millions of disks, screens, microphones, cameras.
SLIDE 28
Add back doors to hardware. e.g. 2012 U.S. government report says that Chinese-manufactured routers provide “Chinese intelligence services access to telecommunication networks”.
SLIDE 29
Some important clarifications:
1. “We” doesn’t include me. I want secure crypto.
2. Their actions violate fundamental human rights.
3. I don’t know how much of today’s crypto ecosystem was deliberately manipulated.

This talk is actually a thought experiment: how could an attacker manipulate the ecosystem for insecurity?
SLIDE 36
Timing attacks

2005 Osvik–Shamir–Tromer: 65ms to steal Linux AES key used for hard-disk encryption. Attack process on same CPU but without privileges.
Almost all AES implementations use fast lookup tables. Kernel’s secret AES key influences table-load addresses, influencing CPU cache state, influencing measurable timings of the attack process.
65ms: compute key from timings.
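The table-lookup leak can be sketched in C. This is illustrative only, not code from the talk or the attack; `table` is a stand-in for an AES lookup table. A secret-indexed load touches a secret-dependent cache line; the constant-time variant loads every entry and selects the wanted one with an arithmetic mask.

```c
#include <stdint.h>

static uint8_t table[256];   /* stand-in for an AES lookup table */

/* Variable-time: which cache line is loaded depends on the secret,
   so a co-resident process can recover it from cache timings. */
uint8_t lookup_leaky(uint8_t secret) {
    return table[secret];
}

/* Constant-time: load all 256 entries and keep the wanted one via a
   mask computed by arithmetic, with no secret-dependent index or branch. */
uint8_t lookup_ct(uint8_t secret) {
    uint8_t r = 0;
    for (uint32_t i = 0; i < 256; i++) {
        uint32_t x = i ^ secret;                         /* 0 only when i == secret */
        uint8_t mask = (uint8_t)(0u - ((x - 1u) >> 31)); /* 0xFF when equal, else 0x00 */
        r |= table[i] & mask;
    }
    return r;
}
```

The mask scan pays a factor of 256 in loads for secrecy; production constant-time AES instead uses bitslicing or hardware AES instructions.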
SLIDE 40
2011 Brumley–Tuveri: minutes to steal another machine’s OpenSSL ECDSA key. Secret branch conditions influence timings.
Most cryptographic software has many more small-scale variations in timing: e.g., memcmp for IPsec MACs.
Many more timing attacks: e.g. 2014 van de Pol–Smart–Yarom extracted Bitcoin secret keys from 25 OpenSSL signatures.
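The memcmp problem in one sketch (a common idiom, not code from the talk): a standard memcmp returns at the first mismatching byte, so MAC verification time can reveal how long a prefix of a forged tag is correct. A constant-time comparison accumulates all differences instead.

```c
#include <stdint.h>
#include <stddef.h>

/* No early exit: running time depends only on the length n,
   never on where (or whether) the tags differ. */
int ct_tag_equal(const uint8_t *a, const uint8_t *b, size_t n) {
    uint8_t diff = 0;
    for (size_t i = 0; i < n; i++)
        diff |= a[i] ^ b[i];
    return diff == 0;   /* 1 if tags match, 0 otherwise */
}
```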
SLIDE 44
Manufacture public denials that such attacks exist. Maybe terrorists Alice and Bob won’t try to stop the attacks.

2001 NIST “Report on the development of the Advanced Encryption Standard (AES)”: “A general defense against timing attacks is to ensure that each encryption and decryption operation runs in the same amount of time. : : : Table lookup: not vulnerable to timing attacks.”
SLIDE 48
2008 RFC 5246 “The Transport Layer Security (TLS) Protocol, Version 1.2”: “This leaves a small timing channel, since MAC performance depends to some extent on the size of the data fragment, but it is not believed to be large enough to be exploitable, due to the large block size of existing MACs and the small size of the timing signal.”
SLIDE 52
2013 AlFardan–Paterson “Lucky Thirteen: breaking the TLS and DTLS record protocols”: exploit these timings; steal plaintext.
SLIDE 53
Some instructions have no data flow from their inputs to CPU timings: e.g., logic instructions, constant-distance shifts, multiply (on most CPUs), add, subtract.
What if Alice and Bob use crypto software built solely from these instructions? Yikes: we won’t see anything from timings!
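Built solely from such instructions (xor, and, or, subtract), branch-free primitives look like this minimal sketch (illustrative, not code from the talk):

```c
#include <stdint.h>

/* Return x if flag == 1, y if flag == 0, with no branch on flag,
   so timing and cache behavior are independent of the secret flag. */
uint32_t ct_select(uint32_t flag, uint32_t x, uint32_t y) {
    uint32_t mask = 0u - flag;   /* flag=1 -> 0xFFFFFFFF, flag=0 -> 0 */
    return (x & mask) | (y & ~mask);
}

/* Swap *x and *y iff flag == 1: the core step of constant-time
   scalar-multiplication ladders, replacing a secret branch condition. */
void ct_cswap(uint32_t flag, uint32_t *x, uint32_t *y) {
    uint32_t t = (0u - flag) & (*x ^ *y);
    *x ^= t;
    *y ^= t;
}
```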
SLIDE 57
Try to scare implementors away from constant-time software. e.g. “It will be too slow.” “It’s too hard to write.”
SLIDE 58 RFC 5246 “The Transport Security (TLS) Protocol, ersion 1.2”: “This leaves a timing channel, since MAC rmance depends to some
fragment, but it is not believed to rge enough to be exploitable, the large block size of existing MACs and the small size timing signal.” AlFardan–Paterson “Lucky Thirteen: breaking the TLS and record protocols”: exploit timings; steal plaintext. Some instructions have no data flow from their inputs to CPU timings: e.g., logic instructions, constant-distance shifts, multiply (on most CPUs), add, subtract. What if Alice and Bob use crypto software built solely from these instructions? Yikes: we won’t see anything from timings! Try to scare implementors away from constant-time software. e.g. “It will be too slow.” “It’s too hard to write.” Fund variable-time maybe with that mak for resea but that with our
SLIDE 59
“The Transport (TLS) Protocol, his leaves a channel, since MAC ends to some size of the data is not believed to to be exploitable, block size of and the small size signal.” rdan–Paterson “Lucky reaking the TLS and rotocols”: exploit steal plaintext. Some instructions have no data flow from their inputs to CPU timings: e.g., logic instructions, constant-distance shifts, multiply (on most CPUs), add, subtract. What if Alice and Bob use crypto software built solely from these instructions? Yikes: we won’t see anything from timings! Try to scare implementors away from constant-time software. e.g. “It will be too slow.” “It’s too hard to write.” Fund variable-time maybe with “countermeasures” that make the timings for researchers to analyze but that are still break with our computer
SLIDE 60 ransport Protocol, a since MAC some data elieved to exploitable,
all size “Lucky TLS and exploit plaintext. Some instructions have no data flow from their inputs to CPU timings: e.g., logic instructions, constant-distance shifts, multiply (on most CPUs), add, subtract. What if Alice and Bob use crypto software built solely from these instructions? Yikes: we won’t see anything from timings! Try to scare implementors away from constant-time software. e.g. “It will be too slow.” “It’s too hard to write.” Fund variable-time software, maybe with “countermeasures” that make the timings difficu for researchers to analyze but that are still breakable with our computer resources.
SLIDE 61
Some instructions have no data flow from their inputs to CPU timings: e.g., logic instructions, constant-distance shifts, multiply (on most CPUs), add, subtract. What if Alice and Bob use crypto software built solely from these instructions? Yikes: we won’t see anything from timings! Try to scare implementors away from constant-time software. e.g. “It will be too slow.” “It’s too hard to write.” Fund variable-time software, maybe with “countermeasures” that make the timings difficult for researchers to analyze but that are still breakable with our computer resources.
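The instruction classes listed above can be combined into branch-free code whose timing is independent of secret data. A minimal sketch (illustrative only — real constant-time code is written in C or assembly, since a Python interpreter gives no timing guarantees): a comparison that accumulates differences with XOR and OR instead of returning at the first mismatch.

```python
def ct_equal(a: bytes, b: bytes) -> bool:
    """Compare byte strings without secret-dependent branches.

    A variable-time comparison returns at the first mismatch, leaking
    the position of the difference through timing; here every byte is
    always processed, using only XOR and OR (timing-safe operations).
    """
    if len(a) != len(b):      # lengths are assumed to be public
        return False
    diff = 0
    for x, y in zip(a, b):
        diff |= x ^ y         # any mismatch sets at least one bit
    return diff == 0
```

In practice one would call `hmac.compare_digest` from the standard library, which implements the same idea natively.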
SLIDE 62
Some instructions have no data flow from their inputs to CPU timings: e.g., logic instructions, constant-distance shifts, multiply (on most CPUs), add, subtract. What if Alice and Bob use crypto software built solely from these instructions? Yikes: we won’t see anything from timings! Try to scare implementors away from constant-time software. e.g. “It will be too slow.” “It’s too hard to write.” Fund variable-time software, maybe with “countermeasures” that make the timings difficult for researchers to analyze but that are still breakable with our computer resources. Continue expressing skepticism that constant time is needed. e.g. 2012 Mowery–Keelveedhi–Shacham “Are AES x86 cache timing attacks still feasible?”, unfortunately shredded by 2014 Irazoqui–Inci–Eisenbarth–Sunar “Wait a minute! A fast, cross-VM attack on AES”.
SLIDE 66
Fund variable-time software, maybe with “countermeasures” that make the timings difficult for researchers to analyze but that are still breakable with our computer resources. Continue expressing skepticism that constant time is needed. e.g. 2012 Mowery–Keelveedhi–Shacham “Are AES x86 cache timing attacks still feasible?”, unfortunately shredded by 2014 Irazoqui–Inci–Eisenbarth–Sunar “Wait a minute! A fast, cross-VM attack on AES”. What if terrorists Alice and Bob use a different cipher for which constant-time implementations are simple and fast? Yikes! Don’t standardize that cipher. e.g. choose Rijndael as AES, not higher-security Serpent. Watch out for any subsequent standardization efforts. Discourage use of the cipher. Pretend that standardization is a guarantee of security while anything non-standard has questionable security.
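One cipher family with exactly this property (not named on the slide, but designed by this talk's author) is Salsa20/ChaCha20: the entire cipher is built from 32-bit addition, XOR, and constant-distance rotation — the timing-safe instruction classes listed earlier. A sketch of the ChaCha quarter-round, checked against the RFC 8439 test vector:

```python
MASK32 = 0xFFFFFFFF

def rotl32(x: int, n: int) -> int:
    """Rotate a 32-bit word left by a constant distance."""
    return ((x << n) | (x >> (32 - n))) & MASK32

def quarter_round(a: int, b: int, c: int, d: int):
    """ChaCha quarter-round: nothing but 32-bit add, XOR, and
    constant-distance rotations, so no table lookups or branches
    depend on the data being processed."""
    a = (a + b) & MASK32; d = rotl32(d ^ a, 16)
    c = (c + d) & MASK32; b = rotl32(b ^ c, 12)
    a = (a + b) & MASK32; d = rotl32(d ^ a, 8)
    c = (c + d) & MASK32; b = rotl32(b ^ c, 7)
    return a, b, c, d
```

Contrast with table-based AES implementations, whose cache-dependent lookups are the source of the timing attacks above.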
SLIDE 70
What if terrorists Alice and Bob use a different cipher for which constant-time implementations are simple and fast? Yikes! Don’t standardize that cipher. e.g. choose Rijndael as AES, not higher-security Serpent. Watch out for any subsequent standardization efforts. Discourage use of the cipher. Pretend that standardization is a guarantee of security while anything non-standard has questionable security. Padding oracles 1998 Bleichenbacher: Decrypt SSL RSA ciphertext by observing server responses to ≈10⁶ variants of ciphertext. SSL first inverts RSA, then checks for “PKCS padding” (which many forgeries have). Subsequent processing applies more serious integrity checks. Server responses reveal pattern of PKCS forgeries; pattern reveals plaintext.
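The attack pattern can be shown with a toy oracle (all names and the XOR "cipher" here are illustrative, not the real SSL/PKCS#1 logic): the server's distinguishable error responses let an attacker recover plaintext without ever knowing the key.

```python
# Toy padding oracle in the spirit of the Bleichenbacher attack.

def xor_bytes(data: bytes, key: bytes) -> bytes:
    return bytes(d ^ k for d, k in zip(data, key))

def toy_server(ciphertext: bytes, key: bytes) -> str:
    """Flawed server: reports padding failures separately from the
    (later) integrity failure, so its responses form an oracle."""
    plaintext = xor_bytes(ciphertext, key)  # stand-in for real decryption
    if plaintext[-1] != 0x01:               # toy padding rule: last byte 0x01
        return "padding error"              # distinguishable response
    return "mac error"                      # a later, serious check fails

key = b"\x13\x37\x42\x99"                     # secret, unknown to attacker
target = xor_bytes(b"\xaa\xbb\xcc\xdd", key)  # captured ciphertext

# The attacker flips the last ciphertext byte until padding is accepted;
# WHICH flip succeeds reveals the last plaintext byte -- no key needed.
recovered = None
for guess in range(256):
    forged = target[:-1] + bytes([target[-1] ^ guess])
    if toy_server(forged, key) == "mac error":  # padding accepted
        recovered = 0x01 ^ guess
# recovered now equals the last plaintext byte, 0xdd
```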
SLIDE 74
Padding oracles 1998 Bleichenbacher: Decrypt SSL RSA ciphertext by observing server responses to ≈10⁶ variants of ciphertext. SSL first inverts RSA, then checks for “PKCS padding” (which many forgeries have). Subsequent processing applies more serious integrity checks. Server responses reveal pattern of PKCS forgeries; pattern reveals plaintext. Design cryptographic systems so that forgeries are sent through as much processing as possible. e.g. Design SSL to decrypt and check padding before checking a serious MAC. Broken by padding-oracle attacks such as BEAST and POODLE. e.g. Design “encrypt-only” IPsec options. Broken by 2006 Paterson–Yau for Linux and 2007 Degabriele–Paterson for RFCs.
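The defense is the opposite design: reject forgeries with as little processing as possible — verify a MAC over the ciphertext before touching padding or decryption (encrypt-then-MAC). A sketch using Python's standard library, with key handling simplified:

```python
import hashlib
import hmac

TAG_LEN = 32  # SHA-256 output length

def seal(mac_key: bytes, ciphertext: bytes) -> bytes:
    """Append a MAC computed over the ciphertext itself."""
    tag = hmac.new(mac_key, ciphertext, hashlib.sha256).digest()
    return ciphertext + tag

def open_(mac_key: bytes, packet: bytes):
    """Verify the MAC first: forgeries are rejected before any padding
    check or decryption could run, so there is nothing left to time."""
    if len(packet) < TAG_LEN:
        return None
    ciphertext, tag = packet[:-TAG_LEN], packet[-TAG_LEN:]
    expected = hmac.new(mac_key, ciphertext, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        return None               # single, uniform failure path
    return ciphertext             # decryption/padding removal happen only now
```

The MAC key would be separate from the encryption key; `compare_digest` keeps the tag check itself constant-time.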
SLIDE 78
Design cryptographic systems so that forgeries are sent through as much processing as possible. e.g. Design SSL to decrypt and check padding before checking a serious MAC. Broken by padding-oracle attacks such as BEAST and POODLE. e.g. Design “encrypt-only” IPsec options. Broken by 2006 Paterson–Yau for Linux and 2007 Degabriele–Paterson for RFCs. Randomness 1995 Goldberg–Wagner: Netscape SSL keys had <50 bits of entropy. 2008 Bello: Debian/Ubuntu OpenSSL keys for years had <20 bits of entropy. 2012 Lenstra–Hughes–Augier–Bos–Kleinjung–Wachter and 2012 Heninger–Durumeric–Wustrow–Halderman broke the RSA public keys for 0.5% of all SSL servers. The primes had so little randomness that they collided.
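The 2012 break works because RSA moduli generated with poor randomness sometimes share a prime factor, and a shared factor falls out of a single GCD. The real attacks ran this pairwise over millions of 1024/2048-bit SSL keys; the numbers below are toys:

```python
from math import gcd

# Two "public keys" whose generators, starved of entropy,
# reused the same prime p. Toy-sized numbers for illustration.
p, q1, q2 = 101, 103, 107      # toy primes
n1 = p * q1                    # first victim's public modulus
n2 = p * q2                    # second victim's public modulus

shared = gcd(n1, n2)           # one GCD -- no factoring algorithm needed
factors_n1 = (shared, n1 // shared)   # both keys are now fully broken
factors_n2 = (shared, n2 // shared)
```

With distinct primes the GCD is 1 and nothing leaks; the attack only finds keys whose randomness actually collided.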
SLIDE 82
Randomness 1995 Goldberg–Wagner: Netscape SSL keys had <50 bits of entropy. 2008 Bello: Debian/Ubuntu OpenSSL keys for years had <20 bits of entropy. 2012 Lenstra–Hughes–Augier–Bos–Kleinjung–Wachter and 2012 Heninger–Durumeric–Wustrow–Halderman broke the RSA public keys for 0.5% of all SSL servers. The primes had so little randomness that they collided. Make randomness-generation code extremely difficult to audit. Have each application maintain its own RNG “for speed”. Maintain separate RNG code for each application. “For simplicity” build this RNG in ad-hoc ways from the inputs conveniently available to that application. Pay people to use backdoored RNGs such as Dual EC. Claim “provable security”.
SLIDE 86
Make randomness-generation code extremely difficult to audit. Have each application maintain its own RNG “for speed”. Maintain separate RNG code for each application. “For simplicity” build this RNG in ad-hoc ways from the inputs conveniently available to that application. Pay people to use backdoored RNGs such as Dual EC. Claim “provable security”. What if the terrorists merge all available inputs into a central entropy pool? This pool can survive many bad/failing/malicious inputs if there is one good input. Merging process is auditable. Yikes!
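The merging idea is easy to audit precisely because it can be a few lines of hashing. An illustrative sketch (a hypothetical design, not Linux's actual pool):

```python
import hashlib

class EntropyPool:
    """Toy central entropy pool: every input is hash-merged into one state.

    The output is unpredictable as long as at least ONE merged input was
    unpredictable, so dead, biased, or even malicious sources cannot
    hurt -- and the merge itself is a few auditable lines.
    """

    def __init__(self):
        self.state = b"\x00" * 32

    def add(self, data: bytes) -> None:
        # Mix the new input into the state; order and content both matter.
        self.state = hashlib.sha256(self.state + data).digest()

    def read(self, n: int) -> bytes:
        # Expand the state into output without revealing the state itself.
        out = b""
        counter = 0
        while len(out) < n:
            block = hashlib.sha256(self.state + counter.to_bytes(8, "big"))
            out += block.digest()
            counter += 1
        return out[:n]
```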
SLIDE 87
Make randomness-generation code extremely difficult to audit. Have each application maintain its own RNG “for speed”. Maintain separate RNG code for each application. “For simplicity” build this RNG in ad-hoc ways from the inputs conveniently available to that application. Pay people to use backdoored RNGs such as Dual EC. Claim “provable security”. What if the terrorists merge all available inputs into a central entropy pool? This pool can survive many bad/failing/malicious inputs if there is one good input. Merging process is auditable. Yikes! Claim performance problems in writing to a central pool, reading from a central pool. Modify pool to make it unusable (random) or scary (urandom).
SLIDE 91
What if the terrorists merge all available inputs into a central entropy pool? This pool can survive many bad/failing/malicious inputs if there is one good input. Merging process is auditable. Yikes! Claim performance problems in writing to a central pool, reading from a central pool. Modify pool to make it unusable (random) or scary (urandom). What if the terrorists realize that RNG speed isn’t an issue? Make it an issue! Design crypto to use randomness as often as possible. This also complicates tests, encouraging bugs. e.g. DSA and ECDSA use a new random number k to sign m; could have replaced k with H(s; m). 1992 Rivest: “the poor user is given enough rope with which to hang himself”. 2010 Bushing–Marcan–Segher–Sven “PS3 epic fail”: PS3 forgeries.
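The slide's fix — replace the random k with H(s; m) — is now standard practice (deterministic ECDSA per RFC 6979, and EdDSA). A simplified sketch of the idea, not the full RFC 6979 derivation:

```python
import hashlib
import hmac

def derive_nonce(secret_key: bytes, message: bytes, q: int) -> int:
    """Derive the signature nonce k from the secret key and the message,
    so a broken RNG can never repeat k across two different messages.
    (A repeated k in DSA/ECDSA leaks the secret key outright -- this is
    exactly how the PS3 signing key was extracted.)
    Simplified sketch; RFC 6979 specifies a real derivation."""
    digest = hmac.new(secret_key, message, hashlib.sha512).digest()
    return 1 + int.from_bytes(digest, "big") % (q - 1)  # force 1 <= k < q
```

Same key and message always give the same k, which is harmless: the resulting signature is simply the same signature.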
SLIDE 95
What if the terrorists realize that RNG speed isn’t an issue? Make it an issue! Design crypto to use randomness as often as possible. This also complicates tests, encouraging bugs. e.g. DSA and ECDSA use a new random number k to sign m; could have replaced k with H(s; m). 1992 Rivest: “the poor user is given enough rope with which to hang himself”. 2010 Bushing–Marcan–Segher–Sven “PS3 epic fail”: PS3 forgeries. Pure crypto failures 2008 Stevens–Sotirov–Appelbaum–Lenstra–Molnar–Osvik–de Weger exploited MD5 ⇒ rogue CA for TLS.
SLIDE 96
What if the terrorists realize that RNG speed isn’t an issue? Make it an issue! Design crypto to use randomness as often as possible. This also complicates tests, encouraging bugs. e.g. DSA and ECDSA use a new random number k to sign m; could have replaced k with H(s; m). 1992 Rivest: “the poor user is given enough rope with which to hang himself”. 2010 Bushing–Marcan–Segher–Sven “PS3 epic fail”: PS3 forgeries. Pure crypto failures 2008 Stevens–Sotirov–Appelbaum–Lenstra–Molnar–Osvik–de Weger exploited MD5 ⇒ rogue CA for TLS. 2012 Flame: new MD5 attack.
SLIDE 97
What if the terrorists realize that RNG speed isn’t an issue? Make it an issue! Design crypto to use randomness as often as possible. This also complicates tests, encouraging bugs. e.g. DSA and ECDSA use a new random number k to sign m; could have replaced k with H(s; m). 1992 Rivest: “the poor user is given enough rope with which to hang himself”. 2010 Bushing–Marcan–Segher–Sven “PS3 epic fail”: PS3 forgeries. Pure crypto failures 2008 Stevens–Sotirov–Appelbaum–Lenstra–Molnar–Osvik–de Weger exploited MD5 ⇒ rogue CA for TLS. 2012 Flame: new MD5 attack. Fact: By 1996, a few years after the introduction of MD5, Preneel and Dobbertin were calling for MD5 to be scrapped.
SLIDE 98
What if the terrorists realize that RNG speed isn’t an issue? Make it an issue! Design crypto to use randomness as often as possible. This also complicates tests, encouraging bugs. e.g. DSA and ECDSA use a new random number k to sign m; could have replaced k with H(s; m). 1992 Rivest: “the poor user is given enough rope with which to hang himself”. 2010 Bushing–Marcan–Segher–Sven “PS3 epic fail”: PS3 forgeries. Pure crypto failures 2008 Stevens–Sotirov–Appelbaum–Lenstra–Molnar–Osvik–de Weger exploited MD5 ⇒ rogue CA for TLS. 2012 Flame: new MD5 attack. Fact: By 1996, a few years after the introduction of MD5, Preneel and Dobbertin were calling for MD5 to be scrapped. We managed to keep MD5. How? Speed; standards; compatibility.
SLIDE 102
Pure crypto failures 2008 Stevens–Sotirov–Appelbaum–Lenstra–Molnar–Osvik–de Weger exploited MD5 ⇒ rogue CA for TLS. 2012 Flame: new MD5 attack. Fact: By 1996, a few years after the introduction of MD5, Preneel and Dobbertin were calling for MD5 to be scrapped. We managed to keep MD5. How? Speed; standards; compatibility. 2014: DNSSEC uses RSA-1024 to “secure” IP addresses. e.g. dnssec-deployment.org address is signed by RSA-1024.
SLIDE 103
Pure crypto failures 2008 Stevens–Sotirov–Appelbaum–Lenstra–Molnar–Osvik–de Weger exploited MD5 ⇒ rogue CA for TLS. 2012 Flame: new MD5 attack. Fact: By 1996, a few years after the introduction of MD5, Preneel and Dobbertin were calling for MD5 to be scrapped. We managed to keep MD5. How? Speed; standards; compatibility. 2014: DNSSEC uses RSA-1024 to “secure” IP addresses. e.g. dnssec-deployment.org address is signed by RSA-1024. Fact: Analyses in 2003 concluded that RSA-1024 was breakable; e.g., 2003 Shamir–Tromer estimated 1 year, ≈10⁷ USD.
SLIDE 104
Pure crypto failures

2008 Stevens–Sotirov–Appelbaum–Lenstra–Molnar–Osvik–de Weger exploited MD5 ⇒ rogue CA for TLS. 2012 Flame: new MD5 attack.

Fact: By 1996, a few years after the introduction of MD5, Preneel and Dobbertin were calling for MD5 to be scrapped. We managed to keep MD5. How? Speed; standards; compatibility.

2014: DNSSEC uses RSA-1024 to “secure” IP addresses; e.g., the dnssec-deployment.org address is signed by RSA-1024.

Fact: Analyses in 2003 concluded that RSA-1024 was breakable; e.g., 2003 Shamir–Tromer estimated 1 year, ≈10⁷ USD.

DNSSEC’s main excuse for sticking to RSA-1024: speed. “Tradeoff between the risk of key compromise and performance.”
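The scaling that made “scrap MD5” urgent can be felt with a birthday search on a truncated digest. This is an illustrative experiment, not the Wang or Stevens structural attacks: colliding the first 3 bytes of MD5 takes about 2¹² trials, and every extra byte multiplies that brute-force cost by about 16.

```python
import hashlib

# Birthday-search a collision on the first 3 bytes (24 bits) of MD5.
# Expected work is roughly 2^12 hashes; breaking full MD5 required the
# structural attacks that Wang (2004) and Stevens et al. supplied.
seen = {}
i = 0
while True:
    msg = str(i).encode()
    prefix = hashlib.md5(msg).digest()[:3]
    if prefix in seen:
        m1, m2 = seen[prefix], msg    # two inputs, same 24-bit prefix
        break
    seen[prefix] = msg
    i += 1

assert m1 != m2
assert hashlib.md5(m1).digest()[:3] == hashlib.md5(m2).digest()[:3]
assert hashlib.md5(m1).digest() != hashlib.md5(m2).digest()
```

The generic bound was never the problem; the problem was that MD5's internal structure gave attackers collisions far below it, and the warnings were ignored for a decade.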
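The “RSA-1024 was breakable” claim is consistent with the standard general-number-field-sieve cost heuristic. A rough calculation, ignoring the o(1) term (so the absolute numbers are only indicative), puts RSA-1024 near 87 “bits” of security versus roughly 117 for RSA-2048:

```python
import math

def nfs_bits(modulus_bits):
    """log2 of the GNFS cost heuristic L_N[1/3, (64/9)^(1/3)],
    with the o(1) term ignored -- indicative only, not exact."""
    ln_n = modulus_bits * math.log(2)
    cost = math.exp((64 / 9) ** (1 / 3)
                    * ln_n ** (1 / 3)
                    * math.log(ln_n) ** (2 / 3))
    return math.log2(cost)

b1024 = nfs_bits(1024)     # ~87 "bits": within reach of a funded attacker
b2048 = nfs_bits(2048)     # ~117 "bits"
assert b2048 - b1024 > 25  # doubling the modulus buys ~30 bits here
```

The Shamir–Tromer TWIRL estimate quantified the 1024-bit end of this curve in concrete hardware terms; the heuristic just shows why the gap to 2048 is so large.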
SLIDE 109
How to convince terrorists that secure crypto is too slow? Many techniques: obsolete data, incompetent benchmarks, fraud.

Example: “PRESERVE contributes to the security and privacy of future vehicle-to-vehicle and vehicle-to-infrastructure communication systems by addressing critical issues like performance, scalability, and deployability of V2X security systems.” preserve-project.eu
SLIDE 113
“[In] most driving situations … the packet rates do not exceed 750 packets per second. Only the maximum highway scenario … goes well beyond this value (2,265 packets per second). … Processing 1,000 packets per second and processing each in 1 ms can hardly be met by current hardware. As discussed in [32], a Pentium D 3.4 GHz processor needs about 5 times as long for a verification … a dedicated cryptographic co-processor is likely to be necessary.”
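The quoted requirements reduce to simple division, using only numbers from the quote itself: 2,265 packets per second leaves about 441 µs per packet, and the quote's 1 ms target on its 3.4 GHz Pentium D is a budget of 3.4 million cycles per verification.

```python
# Per-packet time budgets implied by the quoted packet rates.
budget_750_us = 1e6 / 750      # ~1333 us per packet, typical traffic
budget_2265_us = 1e6 / 2265    # ~441 us per packet, worst-case highway

# The quote's "1 ms per packet" target, expressed in cycles on the
# 3.4 GHz Pentium D it discusses: 3.4 million cycles per verification.
cycles_per_ms_at_3_4ghz = 3.4e9 * 1e-3

assert 1300 < budget_750_us < 1350
assert 440 < budget_2265_us < 442
assert cycles_per_ms_at_3_4ghz == 3.4e6
```

Keeping these budgets in view makes it easy to check the “too slow” claim against actual measured cycle counts, which is what the next slides do.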
SLIDE 119
Compare to “NEON crypto” on a 1 GHz Cortex-A8 core: 5.48 cycles/byte (1.4 Gbps) and 2.30 cycles/byte (3.4 Gbps) for Salsa20 and Poly1305; 498349 cycles (2000/second) and 624846 cycles (1600/second) for Curve25519 DH and verify.

A 1 GHz Cortex-A8 was a high-end smartphone core in 2010: e.g., Samsung Exynos 3110 (Galaxy S); TI OMAP3630 (Motorola Droid X); Apple A4 (iPad 1/iPhone 4). 2013: Allwinner A13, $5 in bulk.
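The slide's throughput figures are straight unit conversions from the cycle counts, assuming the stated 1 GHz clock: cycles per byte gives bytes (then bits) per second, and cycles per operation gives operations per second.

```python
CLOCK_HZ = 1e9  # 1 GHz Cortex-A8, as stated on the slide

def cpb_to_gbps(cycles_per_byte, clock_hz=CLOCK_HZ):
    """Cycles/byte -> gigabits/second at the given clock."""
    return clock_hz / cycles_per_byte * 8 / 1e9

def cycles_to_ops(cycles_per_op, clock_hz=CLOCK_HZ):
    """Cycles/operation -> operations/second at the given clock."""
    return clock_hz / cycles_per_op

salsa20_gbps  = cpb_to_gbps(5.48)       # ~1.46 Gbps (slide says 1.4)
poly1305_gbps = cpb_to_gbps(2.30)       # ~3.48 Gbps (slide says 3.4)
dh_per_sec     = cycles_to_ops(498349)  # ~2000 Curve25519 DH/second
verify_per_sec = cycles_to_ops(624846)  # ~1600 verifications/second

# 1600 verifications/second on one cheap core already covers the
# quoted 750 packets/second typical rate, with no co-processor.
assert verify_per_sec > 750
```

So even the numbers the benchmark-citers accept, converted honestly, undercut the “dedicated co-processor is likely to be necessary” conclusion.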
SLIDE 123
What if the terrorists hear about fast secure crypto? Yikes! Similar to constant-time story. Don’t standardize good crypto. Discourage use of good crypto.
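The “constant-time story” refers to comparisons whose running time leaks where the first mismatch occurs. A minimal sketch of the difference (in real Python code, `hmac.compare_digest` already provides the safe version):

```python
import hmac

def leaky_eq(a: bytes, b: bytes) -> bool:
    """Early-exit compare: runtime reveals the first differing index."""
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False   # bails out early -- the timing side channel
    return True

def ct_eq(a: bytes, b: bytes) -> bool:
    """Touches every byte regardless of where mismatches occur."""
    if len(a) != len(b):
        return False
    diff = 0
    for x, y in zip(a, b):
        diff |= x ^ y      # accumulate differences without branching
    return diff == 0

mac = b"expected-mac-value"
assert ct_eq(mac, mac) and not ct_eq(mac, b"expected-mac-valuX")
assert ct_eq(mac, mac) == hmac.compare_digest(mac, mac)
```

The fix costs essentially nothing, which is the point: “secure is too slow” does not survive contact with the actual code.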
SLIDE 124
If the good crypto persists, try to bury it behind a huge menu of bad options. Advertise “cryptographic agility”; actually cryptographic fragility. Pretend that this “agility” justifies using breakable crypto.
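Why a huge menu is fragility rather than agility: a negotiation that accepts any mutually supported option lets an active attacker strip the client's offer down to the weakest entry still on the menu. A schematic sketch; the suite names here are hypothetical labels, not real registry identifiers.

```python
# Hypothetical suite names, strongest first in the server's preference.
SERVER_PREF = ["X25519-ChaCha20-Poly1305", "ECDHE-AES-GCM", "RSA1024-MD5"]

def negotiate(client_offer, server_pref=SERVER_PREF):
    """Pick the server's most-preferred suite the client also offered."""
    for suite in server_pref:
        if suite in client_offer:
            return suite
    raise ValueError("no common suite")

honest_offer = ["X25519-ChaCha20-Poly1305", "RSA1024-MD5"]
assert negotiate(honest_offer) == "X25519-ChaCha20-Poly1305"

# An active attacker rewrites the offer in transit; the handshake still
# "succeeds" -- now on the breakable option kept around for "agility".
stripped_offer = [s for s in honest_offer if s == "RSA1024-MD5"]
assert negotiate(stripped_offer) == "RSA1024-MD5"
```

Protocols can try to authenticate the negotiation afterwards, but that protection is only as strong as the weakest suite still allowed, which is roughly how downgrade attacks such as Logjam worked in practice.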
SLIDE 128
Precomputed signatures

Try to build cryptographic fragility into many layers of the system. e.g. Complicate the protocols.
SLIDE 129
Split cryptographic security into “the easy problem” of protecting integrity and “the hard problem” of protecting confidentiality.
SLIDE 130
e.g. argue against encrypted SNI since DNS is unencrypted, and argue against encrypted DNS since SNI is unencrypted.
SLIDE 134
Solve “the easy problem” by precomputing signatures. Insist that the protocol allow precomputation “for speed”. e.g. DNSSEC.
SLIDE 135
The protocol has trouble handling dynamically generated answers and unpredictable questions; also, trouble guaranteeing freshness. Deployment hits many snags.
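The freshness trouble is visible in a toy offline signer. This sketch uses textbook RSA with tiny primes, insecure and purely illustrative: a signature precomputed over a record stays verifiable after the record should have changed, because nothing in the verification ties the answer to “now”.

```python
import hashlib

# Toy textbook RSA (tiny primes; insecure, illustration only).
p, q = 10007, 10009
n, e = p * q, 65537
d = pow(e, -1, (p - 1) * (q - 1))

def h(msg: bytes) -> int:
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

def sign(msg):        return pow(h(msg), d, n)
def verify(msg, sig): return pow(sig, e, n) == h(msg)

# Zone owner precomputes the signature offline, "for speed".
old_record = b"A 192.0.2.1"      # RFC 5737 documentation address
old_sig = sign(old_record)

# Later the zone changes ...
new_record = b"A 192.0.2.99"

# ... but an attacker replaying the stale precomputed answer still
# passes verification; only a signed expiry window bounds the replay.
assert verify(old_record, old_sig)
assert old_record != new_record
```

An online signer can bind answers to the query and the moment; an offline precomputation model cannot, which is exactly the snag the slide describes.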
SLIDE 136
Argue that it’s too early to look at “the hard problem” when most data is still unsigned.
SLIDE 139
More strategies

Divert “crypto” funding and human resources into activities that don’t threaten mass surveillance. Set up centralized systems encrypting data to companies that collaborate with us. More distraction: build systems breakable by active attacks. Declare crypto success without encrypting the Internet.