True Names and the Opening of the Cyberspace Frontier

Further, Alice can sign a message to prove its authenticity. After all, Bob might have reason to believe that Mallory has forged a message to him—since Mallory only needs to look up Bob's public key to encrypt a message to him, Mallory could easily fake a message (which nonetheless only Bob can read) that claims to have come from Alice. To prevent this, Alice first encrypts her message with her private key, then encrypts that with Bob's public key. She sends the result to Bob. Bob decrypts with his private key, then with Alice's public key. If he doesn't get garbage, he knows that only someone who knew Alice's private key could have sent the message. [Alice often encrypts just a cryptographic hash of her message—a sort of summary—instead, but this is a detail we can ignore.]
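
To make this concrete, here is a minimal sketch of sign-then-encrypt using the Python “cryptography” package (the library, key sizes, and message are my illustration, not anything this essay specifies). Following the bracketed note, Alice signs a hash of the message rather than encrypting the whole thing with her private key:

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    alice = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    bob = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                      salt_length=padding.PSS.MAX_LENGTH)

    message = b"The password is 'swordfish'."
    signature = alice.sign(message, pss, hashes.SHA256())   # Alice's private key
    ciphertext = bob.public_key().encrypt(message, oaep)    # Bob's public key

    plaintext = bob.decrypt(ciphertext, oaep)               # Bob's private key
    # Raises InvalidSignature unless Alice really signed it:
    alice.public_key().verify(signature, plaintext, pss, hashes.SHA256())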

Public-key systems have lots of great properties, but speed is not one of them. So they are often used in combination with symmetric systems.

Alice generates a string of random bits called a session key, which she'll use as a key for one communication (only!) with Bob. She then encrypts this key (slowly) using a public-key system, and sends it to Bob. Then, she uses this key with something like 3DES or IDEA to encrypt the bulk of her message (quickly), knowing that Bob can use his private key to decrypt the session key, and then use the session key to decrypt the message.
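
A sketch of this hybrid scheme, again using the Python “cryptography” package (AES-based Fernet stands in here for the 3DES or IDEA step; the RSA step wraps the session key exactly as described):

    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    bob = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    # Alice: a fresh random session key, used for this one message only.
    session_key = Fernet.generate_key()
    wrapped_key = bob.public_key().encrypt(session_key, oaep)   # slow, but small
    body = Fernet(session_key).encrypt(b"the bulk of the message" * 1000)  # fast

    # Bob: unwrap the session key with his private key, then decrypt the bulk.
    key = bob.decrypt(wrapped_key, oaep)
    assert Fernet(key).decrypt(body) == b"the bulk of the message" * 1000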

A popular communications package on the Internet, called Pretty Good Privacy (PGP), uses exactly this scheme: RSA to encrypt the session keys, and IDEA to encrypt the messages. (It also has lots of other safeguards to prevent inadvertent disclosure of keys while they're stored, to manage collections of keys, and so forth.)

The algorithms used both in symmetric systems like DES and in public-key systems like RSA have an interesting and critically important property: doubling the length of the key used usually makes encryption or decryption take only a little longer, but makes cracking the key astronomically harder. (Technically, the encryption time is often n², where n is the length of the key, but the time to crack often goes up as 2ⁿ, which goes up much faster—this is what's called polynomial vs. exponential growth.) This means that, when using conventional computers (in other words, barring an algorithmic breakthrough—more on this later), Alice and Bob can easily keep ahead of Eve and Mallory by slowly increasing the length of their keys as computers get faster. By doing so, they keep increasing the work required by Eve and Mallory by huge amounts, without increasing their own workload very much.
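
The asymmetry is easy to check with a few lines of arithmetic (toy cost units of my own choosing; the n² and 2ⁿ growth rates are the ones quoted in the parenthetical above):

    # Encryption work grows polynomially (~n^2) in the key length n;
    # brute-force cracking grows exponentially (~2^n).
    for n in (40, 80, 160):
        print(f"{n:3d}-bit key: encrypt ~ {n**2:6d} units, crack ~ {2**n:.1e} trials")

Doubling n from 40 to 80 merely quadruples the encryption work, but multiplies the cracking work by about a trillion.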

Using these basic techniques of symmetric and public-key cryptographic systems, we can make a huge variety of products. Keeping communications (and stored files) private is an important application, but there are loads of others. We can make electronic, unforgeable cash (by definition, cash is anonymous), which either does or does not require an online connection to a bank when it's handed back and forth, depending on the design. We can ensure anonymity, by encrypting messages and sending them through chains of other machines all around the world which eventually decrypt and disgorge the message far from the sender (the Cypherpunks remailers do this). We can make protocols that allow decoding a message only if seven out of twelve (or any other number) people all agree to decode it. We can blind data so that someone can sign a document without being able to read it (useful in proving when you invented something without giving away the invention, for example). We can prove knowledge of a secret, beyond any reasonable doubt, without having to give away even one bit of that secret. We can flip a (virtual) coin over the telephone, and prove which way it landed, without being able to change our minds later. Two people can simultaneously sign a contract. The list goes on and on.
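
As one small worked example from that list, here is the telephone coin flip done with a hash commitment (a sketch of one standard construction, with helper names of my own invention): Alice commits to her bit before Bob calls the toss, so neither side can change their mind afterwards.

    import hashlib, secrets

    def commit(bit: int) -> tuple[str, bytes]:
        nonce = secrets.token_bytes(16)          # random salt hides the bit
        digest = hashlib.sha256(nonce + bytes([bit])).hexdigest()
        return digest, nonce                     # publish digest, keep nonce

    def check(digest: str, nonce: bytes, bit: int) -> bool:
        return hashlib.sha256(nonce + bytes([bit])).hexdigest() == digest

    alice_bit = secrets.randbelow(2)
    digest, nonce = commit(alice_bit)            # Alice -> Bob: the commitment
    bob_guess = secrets.randbelow(2)             # Bob -> Alice: his call
    assert check(digest, nonce, alice_bit)       # Alice reveals; Bob verifies
    print("Bob wins" if bob_guess == alice_bit else "Alice wins")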

Remember that cryptography is an arms race between users and cryptanalysts. DES, after twenty years or more, has finally fallen to an ambitious but conventional digital computer design. Worse, even public-key systems like RSA aren't totally safe forever—for example, Peter Shor of Bell Labs proved recently that, using a quantum computer (one which does its computations in a totally different way from a digital computer), one can break systems like RSA in only polynomial time—rendering them effectively useless. The only problem so far is that no one knows how to build a quantum computer that can run long enough to solve real problems—but many people are working on the problem, and they're making progress. This has some interesting political ramifications, as we shall see.

The Politics of Strong Cryptography

In this century, good cryptography has often been considered a military weapon; wars have been won and lost, in part, on the basis of who could break whose ciphers. Its use therefore turns on two difficult issues whose answers are sociological and political, not technical:

Who gets to use it?

Who gets to break it?

These two issues are at the heart of the current political imbroglio surrounding cryptography.

As far as who gets to use cryptography, those who would restrict its use have already essentially lost the battle. Basic knowledge about how to build and use strong cryptosystems, whether symmetric or asymmetric, is worldwide. Not only have excellent texts describing dozens of these algorithms been published (and exported) worldwide, but further information (and occasionally the algorithms themselves as source code) is widely available on the global Internet. (The U.S. Department of Commerce once conducted a survey to see how quickly someone overseas could find a copy of DES; it took about thirty seconds, using Archie—a common FTP search engine—to search the most popular FTP servers in the world, and hence their answer was, approximately, “30 seconds for 60+ sites currently offering DES”. These days, with the rise of fast Web search engines, even that figure is obsolete: it's easier still.) And even if one cannot find source code, any sufficiently motivated programmer can implement one of these systems relatively quickly; the algorithms for most of them can be taught to high-school students. MIT routinely teaches the algorithm behind RSA in the third week of an introductory group-theory course taken by all second-year CS students. (As another example, Schneier's Applied Cryptography, which contains descriptions and source code of these algorithms, has sold over 45,000 copies, many overseas.)
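
To see just how little code the core of RSA needs, here is a toy version in Python (the numbers are invented for illustration and are absurdly small; real keys run to hundreds of digits):

    p, q = 61, 53                      # two secret primes
    n, phi = p * q, (p - 1) * (q - 1)  # public modulus and its totient
    e = 17                             # public exponent, coprime to phi
    d = pow(e, -1, phi)                # private exponent: e's inverse mod phi

    m = 42                             # the message, encoded as a number < n
    c = pow(m, e, n)                   # encrypt with the public key (e, n)
    assert pow(c, d, n) == m           # decrypt with the private key (d, n)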

Therefore, a government intent on denying the knowledge of how to do good cryptography would have to be quite repressive. It would have to prohibit the import of books describing the techniques, and isolate itself from the global Internet. Such isolationism is inconsistent with any government that wants to interact in a meaningful way with the rest of the world, and hence is an option open to few.

A government intent on denying prewritten software to its people, however, still has a hard time. There is an underground market in good encryption for those with the need; shutting down such a market would require customs officials to stop anyone with any sort of magnetic media (a floppy or a laptop) from entering the country with it, and verifying compliance is extremely difficult, even given technically competent customs inspectors—for example, what if the program is dithered into the low-order bit of each pixel of an image stored on a floppy? This technique is called steganography, and involves adding tiny bits of imperceptible “noise” to an image (which has a lot of bits in it anyway, most of them redundant). This “noise” is really the signal, of course; the image is a cover. One would basically have to prohibit the import of any un-erased computer or magnetic media.
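
A minimal sketch of that low-order-bit trick, assuming the cover image is available as raw grayscale bytes (file formats and error handling omitted; all names here are my own):

    def hide(pixels: bytes, payload: bytes) -> bytes:
        bits = [(b >> i) & 1 for b in payload for i in range(8)]
        assert len(bits) <= len(pixels), "cover image too small"
        out = bytearray(pixels)
        for i, bit in enumerate(bits):
            out[i] = (out[i] & 0xFE) | bit       # overwrite the low bit only
        return bytes(out)

    def recover(pixels: bytes, n: int) -> bytes:
        bits = [p & 1 for p in pixels[: n * 8]]
        return bytes(sum(bits[j * 8 + i] << i for i in range(8)) for j in range(n))

    cover = bytes(range(256)) * 40               # stand-in for image data
    secret = b"contraband crypto code"
    assert recover(hide(cover, secret), len(secret)) == secret

To the eye, the doctored image is indistinguishable from the original.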

Similarly, Shamir (the S in RSA) recently described a charmingly simple technique involving overhead-projector transparencies and fax machines to implement a one-time pad consisting of the pattern on an overhead. Remember one-time pads and exclusive-or? Given a transparency containing what looks like static (the one-time pad) that is possessed by both the sending and receiving parties, one can send an XOR of the data with the pad (in classic one-time-pad style) via fax. The receiving party can line up the transparency with the fax and recover the data; anyone else gets random noise. The one-time-pad transparency can even be made to look like a (non-noise) image, hence hiding its real use. Any of these techniques would defeat any customs inspector who could see only the picture (in steganography), or only the fax or the transparency (in Shamir's one-time-pad fax system). Defeating this mechanism of transfer requires getting rid of fax machines—and one can always use paper mail to send the encrypted data anyway.
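
The exclusive-or arithmetic underneath Shamir's trick is the same one-time pad described earlier, and takes only a few lines (here the random pad plays the role of the transparency):

    import secrets

    pad = secrets.token_bytes(32)                   # the shared "transparency"
    message = b"meet at the usual place".ljust(32)  # padded to the pad's length

    faxed = bytes(m ^ p for m, p in zip(message, pad))      # looks like static
    recovered = bytes(f ^ p for f, p in zip(faxed, pad))    # overlay the pad
    assert recovered == message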

On the other hand, governments which try to deny commercially available systems for strong cryptography have a much easier time. Governments which are currently trying to do so include the United States federal government, Russia, and most of the European Community, for starters. (France has outlawed cryptography altogether unless the government is given the keys; both France and Germany require key registration, and the French government routinely spies on non-French company subsidiaries based in France for competitiveness reasons, leading companies such as IBM to routinely transmit disinformation on the “encrypted” links to their French offshoots.)

Companies, unlike individuals, are visible through the products they sell. This means that it is much easier to pass laws outlawing particular kinds of products (hence controlling their commercial distribution) than it is to control private use by highly motivated individuals. In the U.S., attempts to pass laws that simply ban all use of advanced cryptography have never really succeeded, and as a result, companies are allowed to make strong crypto products. Whether they can export them, though, is where both the Commerce Department and the National Security Agency have a tremendous amount of leverage, and they use every bit of it.

In short, any cryptographic product that uses keys longer than forty bits (hence making the space of possible keys larger than can easily be searched), or that implements the ability to encrypt a conversation (as opposed to a simple digital signature used only for authentication), is illegal to export outside of the United States or Canada without a special waiver—which is routinely not granted. How can this be? Because cryptographic products in the U.S. are regulated by the International Traffic in Arms Regulations Act (ITAR), which deems them “munitions,” just like explosives.

ITAR can be a powerful weapon. By denying the ability to easily export strong crypto overseas, ITAR means that U.S. manufacturers who put cryptographic technology in their products have two choices: make a strong version for domestic use and a weak version for international shipment, or use the weak version everywhere. Since manufacturing and stocking two different products costs a lot more than one, most U.S. companies opt for the latter solution. This means that, even domestically, consumers get weak protection or no protection at all. For example, Motorola—the world leader in cellular systems—does not offer encrypting cell phones, because it would have to cripple them for export. (And the European Community is following suit; GSM, a recently released trans-European standard for digital cell phones, incorporated the very secure A5 encryption, which at the last minute was required to be changed to the much weaker A5X to ensure government tappability. Without that change, the phones would be nonexportable—but now the “one standard” is shattered into at least two, and the market is in disarray.)

The official rationale for making crypto products subject to ITAR stems from the Cold War idea that restricting the export of crypto gear harms our opponents more than ourselves. However, any government that wants to can easily make its own crypto gear; as detailed above, the technology required is well known. Since the end of the Cold War, other bogeymen have entered public discussion: the so-called Four Horsemen of the National Information Infrastructure—drug dealers, unnamed foreign terrorists, organized crime, and child pornographers.

Yet each one of these “Four Horsemen” has both the motivation and the capability to use strong crypto via noncommercial products. Furthermore, the federal government knows this; the Commerce study above is only one of many obvious cases. What, then, is the point of ITAR restrictions on cryptography?

As Tom Kalil, who oversees infrastructure policy for former Vice President Al Gore, commented to an MIT audience, “We are fighting a delaying action” to keep U.S. citizens from access to strong cryptography, in order to make their communications easier for law enforcement to tap—ostensibly so that the “Four Horsemen” could be caught. Yet Kalil admitted that the “Four Horsemen” could (and do!) use crypto anyway, even though they might not be able to buy it right off the shelf. Hence he was immediately asked, “What's your threat model?”—in other words, “What are you really trying to keep from happening?” To which his answer was, “I can't tell you.” It is unfortunate that the actual threat against which ITAR is aimed is either unarticulated or classified, but that is the reality. (It is quite likely that the “delaying action” is trying to forestall commercial strong crypto until quantum computers—which can break such schemes—are a reality. This is speculation on the author's part; you won't find anyone in official circles willing to say anything of the sort.)

ITAR can indeed be a powerful weapon in the fight to delay citizens' access to strong cryptography. Part of its power stems from the “fear, uncertainty, and doubt” surrounding its use. For example, since the detailed principles under which a cryptographic system may or may not be deemed exportable are not written down, there is no way to know in advance whether any system not covered by the forty-bit rule above (itself not a law but simply a custom) will be exportable. Since getting export approval can take years, which is several times longer than the lifetime of most software (or even hardware) products, companies are encouraged to guess weak. As another example, Phil Zimmermann, the original author of PGP, was charged with violating ITAR because copies of PGP found their way overseas—despite no evidence at all that he exported those copies. (They could have been exported by any user of the Internet—a list of suspects numbering in the millions.) The investigation was finally dropped, with no explanation from the Justice Department, after two years of legal battles—and after two years of Zimmermann being stopped, searched, and interrogated by Customs every time he entered the U.S., for no detectable reason except that the Justice Department had a grudge against him.
