Encryption is less secure than we thought

By Larry Hardesty

Published 15 August 2013

For sixty-five years, most information-theoretic analyses of cryptographic systems have made a mathematical assumption that turns out to be wrong.

Information theory — the discipline that gave us digital communication and data compression — also put cryptography on a secure mathematical foundation. Since 1948, when the paper that created information theory first appeared, most information-theoretic analyses of secure schemes have depended on a common assumption.

Unfortunately, as a group of researchers at MIT and the National University of Ireland (NUI) Maynooth demonstrated in a paper presented at the IEEE International Symposium on Information Theory, held in Istanbul, Turkey, on 7-12 July 2013, that assumption is false. In a follow-up paper being presented this fall at the Asilomar Conference on Signals, Systems, and Computers, to be held in Pacific Grove, California, on 3-6 November 2013, the same team shows that, as a consequence, the wireless card readers used in many keyless-entry systems may not be as secure as previously thought.

In information theory, the concept of information is intimately entwined with that of entropy.

Two digital files might contain the same amount of information, but if one is shorter, it packs more entropy into each bit. If a compression algorithm — such as WinZip or gzip — worked perfectly, the compressed file would have the maximum possible entropy: it would contain equal numbers of 0s and 1s, distributed in a completely unpredictable way. In information-theoretic parlance, it would be perfectly uniform.

Traditionally, information-theoretic analyses of secure schemes have assumed that the source files are perfectly uniform. In practice, they rarely are, but they’re close enough that it appeared that the standard mathematical analyses still held.
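
To make the idea concrete, here is a minimal Python sketch, written for illustration rather than drawn from the papers, that compresses a highly redundant byte string with gzip and measures the fraction of 1-bits before and after:

```python
import gzip

def ones_fraction(data: bytes) -> float:
    """Return the fraction of 1-bits in a byte string."""
    ones = sum(bin(b).count("1") for b in data)
    return ones / (8 * len(data))

# A highly redundant source: far from uniform, low entropy per bit.
raw = b"the quick brown fox jumps over the lazy dog " * 2000
compressed = gzip.compress(raw)

print(f"raw:        {ones_fraction(raw):.4f}")   # noticeably below 0.5
print(f"compressed: {ones_fraction(compressed):.4f}")  # close to 0.5
```

The compressed stream lands near, but not exactly at, the 0.5 mark of a perfectly uniform source; it is the consequences of exactly this kind of small deviation from uniformity that the researchers examined.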

“We thought we’d establish that the basic premise that everyone was using was fair and reasonable,” says Ken Duffy, one of the researchers at NUI. “And it turns out that it’s not.”

On both papers, Duffy is joined by his student Mark Christiansen; Muriel Médard, a professor of electrical engineering at MIT; and her student Flávio du Pin Calmon.

The problem, Médard explains, is that information-theoretic analyses of secure systems have generally used the wrong notion of entropy. They relied on so-called Shannon entropy, named after the founder of information theory, Claude Shannon, who taught at MIT from 1956 to 1978.
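
For reference, Shannon entropy measures the average unpredictability of a source: for a source that emits symbol i with probability p_i, it is H = -Σ p_i log2(p_i) bits per symbol. The short Python sketch below (standard textbook examples, not figures from the papers) computes it for a few simple distributions:

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy, in bits per symbol, of a probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # 1.0 bit: a fair coin flip
print(shannon_entropy([0.9, 0.1]))   # ~0.47 bits: a heavily biased coin
print(shannon_entropy([0.25] * 4))   # 2.0 bits: uniform over four symbols
```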