Monday, February 27, 2006

What didn't happen . . .

As part of my effort to learn about the history of cybercrime, I just read a book published in 1973: Computer Crime: How A New Breed of Criminals Is Making Off With Millions, by Gerald McKnight.

Like the other early books about computer crime, McKnight's book focuses solely on "mainframe computer crime." Most of the book deals with how crimes can be committed by using mainframe computers, but it also addresses an issue I had never thought about: attacks on computers.

By "attacks on computers" I don't mean the kind of virtual assaults we think of when we hear the phrase "attacks on computers:" hacking into a computer system without authorization; launching a Denial of Service attack to shut down a system; and launching malware to cripple systems, erase data or wreak other kinds of havoc. No, I mean physical attacks on computers.

McKnight describes what he sees as an inevitable trend toward computer sabotage as "the most serious threat facing the electronic society." (Computer Crime, p. 83.) He attributes the trend, in part, to what he says is "our simple fear of being 'taken over.' . . . the worry that the computer may turn into a monster. Get out of control." (Computer Crime, p. 83.) He also attributes it to our "fear of this metal beast which has come to take jobs from men." (Computer Crime, p. 100.) And he at least implicitly suggests it may reflect our "intuitive fear" of having to compete with a new type of life: "'a form of machine life that will be the outgrowth of today's computer science.'" (Computer Crime, p. 98.)

McKnight devoted one chapter to the May 1970 attempt to blow up the New York University computer center, arguing that it was a manifestation of this fear and claiming that the episode "served a useful purpose in this respect: it gave us a warning." (Computer Crime, p. 95.) In a later chapter, McKnight described other attacks on computers and explained that these incidents, in which "a human being expresses in violence deeply repressed feelings of hostility toward the computer," are an indication that some portion of mankind, anyway, "is in subconscious revolt against the machine." (Computer Crime, p. 105.) He cautioned that while computer saboteurs were "not yet organized," they "should be regarded as the outriders of a growing guerrilla force." (Computer Crime, p. 105.)

McKnight noted that in the spring of 1972 a number of computers were bombed in New York, and suggested that these bombings were another empirical indication of the growing hostility against computers. (Computer Crime, pp. 105-106.) He speculated that "Electronic Luddites" may someday "systematically seize and . . . destroy . . . the vital computers controlling national power grids and other services." (Computer Crime, p. 113.)

I find McKnight's speculations in this regard fascinating . . . as something that might, perhaps, have come to pass, but has not, presumably because of the development and proliferation of the personal computer and the Internet.

The personal computer gave everyone access to computing power that far exceeds what was available to businesses and government agencies thirty years ago, when McKnight wrote. This democratization of computer technology effectively eliminated the possibility that humans would perceive computers as a threat and strike back at them. It kept computers from becoming the sole province of an elite technocracy, perceived as instruments of oppression. Instead, personal computers became a tool for the masses, in a fashion analogous to the telephone, radio and television. We are well on our way to becoming addicted to computers even though, as McKnight noted, they do take over tasks that were once performed by people.

The Internet also altered the playing field: When McKnight wrote, computers were stand-alone mainframes. If, as he describes, someone threw a bomb into a mainframe or physically attacked it in some other way, the mainframe could be crippled or even destroyed. That would have the dual effect of destroying the machine -- the object McKnight postulates as the target of human hostility -- and of eradicating the data it held. With the development of the Internet and the proliferation of networked computers, the destruction of a single computer would be a far more futile and consequently far less satisfying act. The destroyed computer would almost certainly not be the only repository of the data it held; the data should be available from other computers with which the victim computer was networked, and should also be archived in some other storage area.

The proliferation of the network also had a psychological effect: If someone were to bomb or otherwise destroy "a" computer today, the act could not deliver the visceral satisfaction the people McKnight describes must have felt when they destroyed a mainframe. They destroyed "the" computer. Now, if someone were to bomb a computer hooked to the Internet, it would be the equivalent of destroying a terminal, or maybe a typewriter; "the" computer is the network, not the appendages connected to it.

Although McKnight's forecasts of man-machine war have not come to pass and seem unlikely to come true in the near future, there may be a time when a rivalry develops between humanity and its creations. The development of the personal computer and the Internet may have rendered much of his analysis irrelevant, but we have not yet had to deal with true "machine intelligence," with computers that can analyze, learn and even reflect.

British Telecom recently released its 2006 technology timeline, which predicts various steps in the evolution of technology from now until 2051. Among other things, the timeline predicts that an Artificial Intelligence entity will be elected to Parliament in 2020. I rather doubt we will greet milestones such as this by becoming Electronic Luddites who are hell-bent on the destruction of intelligent technologies; I certainly hope we do not take this path. But I suspect many people will find it difficult to accept machine life-forms. . . .


