Indeed, the FUD factor surrounding ITAR can have almost laughable consequences—if the potential threats to civil liberties of a government spying on its citizens are ignored, of course. Take the issue of “machine readability.” Bruce Schneier's Applied Cryptography, mentioned above, is a classic in the field. The book has source-code examples of cryptographic systems in it. The U.S. government tried (and failed, in the 1970s) to make any export or public discussion of cryptographic procedures either “born classified” (like nuclear weapons technology) or subject to prior review by the NSA. The failure to pass an executive order or a law mandating this is what allows U.S. cryptographers to go to overseas conferences and to publish papers in international journals. This failure also means that books like Applied Cryptography cannot legally be banned from U.S. export.

However, electronic media are not nearly so lucky as their print cousins. Having never been definitively protected under the First Amendment, unlike the many affirmations for print, they are subject to ITAR. In particular, a floppy containing the very same source code that appears in Applied Cryptography is illegal to export, ostensibly because it is in “machine-readable form.” It makes no difference to argue that one can type the programs back in from the book by hand, or scan them in with OCR; the floppy is still nonexportable. As a result, a clever individual in the U.K. came up with a four-line Perl script (a tiny program) which implements RSA. He exported it to the U.S. by putting it in his signature file on e-mail he sent to a mailing list. The four-line program was picked up in the U.S. and made into a T-shirt, along with the same few hundred characters turned into a machine-readable barcode, also printed on the shirt. The T-shirt, which may be unexportable owing to its barcode, is thus a curious form of political protest.
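
For the curious, the heart of RSA really is tiny. The sketch below is a minimal textbook version in Python with toy parameters (wholly insecure, and not the actual obfuscated Perl script): the entire cipher comes down to two modular exponentiations.

    # Textbook RSA with tiny primes -- illustrative only, wholly insecure.
    p, q = 61, 53
    n = p * q                   # public modulus (3233)
    phi = (p - 1) * (q - 1)
    e = 17                      # public exponent, coprime to phi
    d = pow(e, -1, phi)         # private exponent (2753)

    m = 42                      # a message, encoded as an integer < n
    c = pow(m, e, n)            # encrypt: c = m^e mod n
    assert pow(c, d, n) == m    # decrypt: m = c^d mod n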

The damage done to U.S. industries is tremendous. (The damage done to civil liberties will be discussed below.) Since other countries are free to set their own laws which do not prohibit export of their crypto products to the U.S., consumers in the U.S. often have an interesting choice: Buy American, and accept inferior products, or buy foreign-made products, which protect their privacy. Many U.S. companies have testified before Congress that such policies are crippling their ability to compete, not only at home but in markets abroad, because their technologies are seen as inferior—a particularly ironic result when one considers that most of those technologies were invented in the U.S.

It is also not unknown for prominent American cryptographers who wish to ship products (as opposed to writing academic papers) to emigrate to, e.g., Australia, which does not prohibit cryptographic export. This means, of course, that Australia gets the economic benefits of the U.S. researcher's training and background, and of the rest of the U.S.-developed technologies he or she uses. But such is life with ITAR.

Similar odd stories abound in any product that touches upon cryptography. Take the Global Positioning System, for example. The GPS is a truly amazing technological feat—using a network of orbiting satellites and a handheld receiver that costs under $300, one can locate oneself anywhere on Earth to a theoretical accuracy of less than one meter. Using “differential-mode” GPS (which requires a land-based beacon within a hundred kilometers), one can get accuracies in the millimeters. How many people have died, on land or at sea, in the last few thousand years because they didn't know where they were?

Yet, again because of the Cold War, the developers of GPS decided to employ “selective availability”—they dithered the signal. This means that those bits of the signals from the satellites which carry the highest-precision data are encrypted. Civilian receivers, which cannot decode these bits, consequently have their accuracy degraded and are good only for navigation to between thirty and one hundred meters—good enough for most uses, but not for applications such as landing an ICBM exactly on a missile silo. Military receivers, which get frequent key updates, can receive the signal at full precision. The idea here, of course, was to allow most uses without making a perfect missile-guidance system for our enemies at the time.
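
To make the dithering mechanism concrete, here is a toy model in Python. It is a sketch under invented assumptions only: the function names, the SHA-256-based mask, and all the numbers are made up for illustration, and the real GPS signal structure works nothing like this. The point is just that everyone receives the same broadcast, while only a key holder can subtract out the deliberate error.

    import hashlib

    def dither_mask(key: bytes, epoch: int) -> float:
        # Hypothetical keyed pseudorandom offset in meters; a stand-in for
        # the classified scheme that encrypts the high-precision bits.
        h = hashlib.sha256(key + epoch.to_bytes(8, "big")).digest()
        return (int.from_bytes(h[:4], "big") / 2**32 - 0.5) * 100.0

    def broadcast(true_position: float, key: bytes, epoch: int) -> float:
        # Everyone on Earth hears the same deliberately dithered signal.
        return true_position + dither_mask(key, epoch)

    KEY = b"military key material"       # distributed only to key holders
    signal = broadcast(1234.5, KEY, epoch=42)

    civilian_fix = signal                         # stuck with up to ~50 m of error
    military_fix = signal - dither_mask(KEY, 42)  # dither removed exactly
    assert abs(military_fix - 1234.5) < 1e-6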

But during the Gulf War, everyone's GPS suddenly operated at full precision! Why? Because the U.S. military needed so many GPS units to hand out to its personnel (hundreds of thousands) that military contractors could not keep up with the demand. Instead, the U.S. bought commercial, civilian units off the shelf, and turned off selective availability on the satellites. (The obvious joke now is “If your GPS receiver suddenly gets really accurate, we must be at war.”) Yet with the end of the Cold War, do we really need selective availability? After all, thirty-meter accuracy isn't quite enough to navigate a ship in a crowded, unfamiliar harbor with dangerous shoals, and there are many other applications which could benefit from more-accurate GPS that doesn't require a nearby beacon to enable differential-mode reception. (Quite recently, the U.S. has announced it will drop selective availability within the next four to ten years, in part because workarounds to SA have continued to improve.)

Despite all this discussion of ITAR, who gets to use commercial, off-the-shelf strong cryptography is only half of the picture. Assuming we still restrict our attention to the sorts of cryptography that a nontechnical user might employ (e.g., what you can buy, not what you can program), the other interesting question is who can break it. This is where we fall into the rathole of key escrow, the Clipper chip, Skipjack, and the Digital Telephony Bill.

For many years now, the NSA has had as its major agenda keeping strong cryptography out of the hands of foreign governments, and breaking whatever cryptographic systems they came up with. It is not supposed to employ its signals intelligence against domestic citizens; doing so is against both laws and executive orders. Yet the NSA has been cooperating closely with both the FBI and the National Institute of Standards and Technology (NIST, formerly NBS, the National Bureau of Standards) to dictate domestic encryption policy. The FBI is happy for the help, because the FBI has been trying for years to tap just about everyone. The details are frightening—even if you believe that no one currently at the FBI is nearly as corrupt as Hoover, in hindsight, clearly was.

The most obvious and public recent demonstration of the NSA's involvement in domestic, civilian cryptography came with the announcement, during the early days of the Clinton Administration, of the Clipper chip, which, though not originally announced as such, was based on a design called Skipjack that had been developed in previous years (and previous administrations) by the NSA. The Clipper chip's basic goal was to make encrypted communications tappable using key escrow: each chip has a key that is also possessed by the government. Even in an ideal world in which governments were always trustworthy, this solution would hardly work against the “four bogeymen” always cited, since those players wouldn't be caught dead using an encryption scheme for which someone else held the keys. In the world we live in, the Clipper/Skipjack system had a host of other disadvantages as well.

To wit: to prevent any one agency's compromise (either personal or administrative), the proposal was to divide keys in half, with each half being held by a different escrow agency. Initially, the identity of these agencies (a matter of critical importance) was not revealed; eventually two agencies were picked—both in the executive branch.
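
Splitting a key between two agencies is, by itself, the easy part; a minimal sketch of two-way secret splitting by XOR follows. This is the generic textbook construction, not Clipper's actual (classified) escrow procedure, though the 80-bit key size matches Skipjack's published parameters. Note the property that makes the scheme work on paper: either half alone is statistically indistinguishable from random noise, so neither agency by itself learns anything about the key.

    import secrets

    def split_key(key: bytes) -> tuple[bytes, bytes]:
        # Half A is uniformly random; half B is the key XORed with half A.
        # Each escrow agent alone learns nothing about the key.
        half_a = secrets.token_bytes(len(key))
        half_b = bytes(a ^ k for a, k in zip(half_a, key))
        return half_a, half_b

    def recombine(half_a: bytes, half_b: bytes) -> bytes:
        # XOR the halves back together to recover the original key.
        return bytes(a ^ b for a, b in zip(half_a, half_b))

    unit_key = secrets.token_bytes(10)  # 80 bits, Skipjack's key size
    a, b = split_key(unit_key)
    assert recombine(a, b) == unit_key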

Prior abuses of executive authority are rife and well known, Watergate being the classic example among any number of dirty-tricks campaigns—not to imply, of course, that the other two branches of government have been known to hold to particularly higher standards. And is it just a bad coincidence that the Aldrich Ames affair, in which a high-level CIA operative had been compromising our national security for years without detection despite his blatant flaunting of large amounts of income from unknown sources, happened during discussion of the whole Clipper affair? Cries that “one can trust the government” can be a little hard to take in the light of situations like this.

The proposal had numerous technical faults. It depended on a classified design—the kiss of death for most crypto proposals, which get their strength not from the secrecy of the algorithm but from standing up to open cryptanalysis. A proposal that had been generated in secret, which depended upon the secrecy of its algorithm, and which could not be openly analyzed was derided in the cryptographic community. (The classification of the algorithm was, in part, to prevent knock-off versions of Clipper which used the same encryption algorithm but were manufactured without giving the government the keys.) Furthermore, from the unclassified description of the algorithm, Matt Blaze, a Bell Labs researcher, was able to come up with a scheme that enabled use of Clipper with fake (untappable) keys.
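
The weakness Blaze exploited, as described in his published analysis, was that the field carrying the escrowed key information (the LEAF) was protected by only a sixteen-bit checksum, so a forger can simply try random fields until one happens to validate. The Python sketch below simulates that brute force; the checksum function here is a stand-in, since the real one was classified, but any fixed sixteen-bit check is equally easy to satisfy, after about 2^16 (65,536) tries on average.

    import hashlib
    import secrets

    def chip_checksum(leaf: bytes) -> int:
        # Stand-in for the chip's classified 16-bit LEAF checksum.
        return int.from_bytes(hashlib.sha256(leaf).digest()[:2], "big")

    # The receiving chip accepts any LEAF whose checksum matches the
    # value it expects for the current session.
    expected = chip_checksum(b"the genuine LEAF for this session")

    tries = 0
    while True:
        tries += 1
        bogus = secrets.token_bytes(16)  # random bytes, no real escrowed key
        if chip_checksum(bogus) == expected:
            break
    print(f"bogus LEAF accepted after {tries} tries")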

In addition, it depended on hardware. Many vendors would much rather do encryption in software, which is cheaper and cuts down on chips—and is reconfigurable instantly as the marketplace changes. It also depended on tamper-proof hardware (which doesn't really exist) to protect the algorithm—and on only two escrow agents. These chips, remember, were to be the absolute cornerstone of civilian commerce. The economic pressure to bribe as many people as it took in both escrow agencies would be immense—and giving away the keys to all the chips in the country could be done on a handful of floppies or on a single DAT tape. And how much exactly did it cost to bribe Aldrich Ames to destroy what was left of the CIA's reputation? Less than the salary of a major corporate CEO.

Silvio Micali, an MIT cryptographer, then released details on “fair cryptosystems,” which could be implemented entirely in software, did not depend on secrecy for their algorithms, enabled any k of n escrow holders to reconstruct the key, disabled the ability to use the system without registering the keys, and enabled use of the system in a way that, once one's keys were acquired by the government for a tap, one's privacy was not permanently lost. Unlike Clipper, where anyone authorized to do a tap could then tap forever, or use the tap on data collected in years past that was just waiting for the decoding key, fair cryptosystems have time-bounded keys, and a tap can succeed only within that bound. Micali's proposal, like others, was ignored by those pushing Clipper.
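
The k-of-n reconstruction Micali's systems provide is the classic threshold secret-sharing property. As an illustration, here is a minimal sketch of Shamir's polynomial scheme in Python; to be clear, this is one standard way to get k-of-n sharing, not Micali's actual construction, which adds verifiability and the other properties listed above. The secret is the constant term of a random polynomial of degree k-1 over a prime field; each escrow holder receives one point on the curve, any k points determine the polynomial, and k-1 points reveal nothing.

    import random

    PRIME = 2**127 - 1  # field modulus; a Mersenne prime, fine for a demo

    def split(secret: int, k: int, n: int) -> list[tuple[int, int]]:
        # Random polynomial of degree k-1 whose constant term is the secret.
        coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
        def f(x: int) -> int:
            return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        return [(x, f(x)) for x in range(1, n + 1)]

    def reconstruct(shares: list[tuple[int, int]]) -> int:
        # Lagrange interpolation at x = 0 recovers the constant term.
        secret = 0
        for xi, yi in shares:
            num, den = 1, 1
            for xj, _ in shares:
                if xj != xi:
                    num = num * -xj % PRIME
                    den = den * (xi - xj) % PRIME
            secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
        return secret

    shares = split(secret=123456789, k=3, n=5)
    assert reconstruct(shares[:3]) == 123456789   # any three holders suffice
    assert reconstruct(shares[2:]) == 123456789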

Indeed, Micali argued that, because Clipper didn't address one of the most fundamental aspects of any cryptographic system—how the keys are actually managed and distributed—it was doomed from the start. Public comment about the proposal was difficult in the light of this missing part of the specification, and he argued that the key distribution mechanism, which would have to be national in scope, could easily be subverted to serve as free infrastructure for distributing keys to other, untappable cryptosystems. In short, Micali said, we were concentrating on making trackable cars while building an entire interstate highway system for the bad guys—and the highway is the part that requires massive investment and that is hard for individual bad guys to create.

Further, the proposal was politically naive. In order to function against determined opponents, and not just the most unsophisticated, cryptography of other sorts would have to be outlawed. Lack of trust in the government's assurance that this would not happen led to the rallying cry “If cryptography is outlawed, only outlaws will have cryptography.” (Shades of True Names indeed…) In addition, the entire Internet and academic community landed resoundingly on the proposal. About the only academic cryptographer not laughing hysterically (or moving to Australia) was Dorothy Denning, herself a respected cryptographer but whose major arguments appeared to consist of “trust the government” and “nobody ever does wiretaps that aren't court-authorized.” Such arguments ignore two factors: first, that only evidence to be used in court requires court authorization to be useful, and second, that FBI wiretap requests are disapproved only in the very rarest of circumstances. (According to the 1992 “Report on Applications for Orders Authorizing or Approving the Interception of Wire, Oral, or Electronic Communications [Wiretap Report],” issued by the Administrative Office of the United States Courts, there were 919 wiretaps authorized in 1992, and zero requests were denied. None! The 1994 report also showed absolutely zero requests denied. In the entire period from 1982 to 1992, only seven applications for surveillance were denied—much less than one-tenth of one percent, a virtually complete rubber stamp.)

Consider also the Digital Signature Standard, which was also being promulgated at the time. Charged by Congress with developing a cryptographic standard, NIST wimped out (disregarding the Computer Security Act under which it was legally tasked) and developed only a signature standard instead—in other words, one could verify someone's identity, but not communicate secretly with them. The initial proposal was hooted down—the keys were ludicrously short (obviously breakable), the algorithm was less well-tested than others, and even NIST admitted its shortcomings compared to other well-known methods (in documents obtained via a Freedom of Information Act lawsuit). Once again, the key distribution mechanism was not specified. Further, repeated and hounding requests from the Electronic Frontier Foundation (EFF), the Computer Professionals for Social Responsibility (CPSR), the Electronic Privacy Information Center (EPIC), and others via the Freedom of Information Act (FOIA) finally revealed (after NIST turned down the first FOIA request by illegally failing to reveal responsive documents in its possession from the NSA and had to be sued into complying) that NIST had actually acquired the entire system from the NSA—which was not supposed to be setting civilian policy in encryption! It was clear that, once again, the NSA illegally had its fingers in the pie, this time promoting an inferior, non-privacy-preserving, signature-only standard, apparently to clear the way for the tappable Clipper to be the only viable encryption system in town.
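
The difference between a signature standard and a full encryption standard is easy to see in code. The sketch below reuses the toy RSA numbers from the earlier sketch (again, wholly insecure, and using RSA rather than DSS's actual discrete-log-based DSA, purely for brevity): signing uses the private exponent so that anyone can verify identity with the public one, but nothing here conceals the message itself.

    import hashlib

    # Toy RSA parameters -- far too small to be secure.
    p, q = 61, 53
    n, phi = p * q, (p - 1) * (q - 1)
    e = 17                      # public exponent
    d = pow(e, -1, phi)         # private exponent

    def digest(msg: bytes) -> int:
        return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

    def sign(msg: bytes) -> int:
        # Only the key holder can produce this...
        return pow(digest(msg), d, n)

    def verify(msg: bytes, sig: int) -> bool:
        # ...but anyone can check it. The message still travels in the
        # clear: authenticity, not confidentiality.
        return pow(sig, e, n) == digest(msg)

    sig = sign(b"I am who I say I am")
    assert verify(b"I am who I say I am", sig)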
