12. BILL GATES

by Steven Shaviro
©1995-1997 Steven Shaviro

It's out there. It's hungry. It grows like a cancer. There's more of it every second. It gobbles up entrepreneurs for breakfast; only the strong survive. Thanks to personal computers and fast modems and the Internet, INFORMATION is circulating and expanding as never before in history. We are finally moving, as Burroughs has long urged us to do, out of time and into space: away from the teleology of what Lyotard calls "metanarratives," and into the simultaneity and fractal self-similarity of the digital matrix. "Everything is information," Rudy Rucker rhapsodizes; "the world can be resolved into digital bits, with each bit made of smaller bits. These bits form a fractal pattern in fact-space." Baudrillard's once radical thesis is now a commonplace: in our postmodern economy, the proliferation of simulacra and empty digital signs replaces the production of tangible commodities, and even the simple presence of bodies and other referential objects. As Timothy Leary says, computers are the new LSD. Why bother going out to a bar, when I can dial up LambdaMOO or FurryMUCK from the ease and comfort of my own terminal at home? There's nothing like a little virtual sex to brighten up a cold winter evening. It's only a few bucks an hour, and anonymity is guaranteed. So don't waste time fretting in front of the mirror: if you're having a bad hair day, all you need to do is delete your old description file, and write yourself a new one. McLuhan, as usual, was right: the hardware depends upon the software, and not the other way around. Microsoft is bigger business than IBM. Which reminds me of a joke I heard recently: How many Microsoft employees does it take to screw in a lightbulb? Answer: none, because Bill Gates has declared darkness the new standard. Don't worry about material conditions, they're easy enough to work around. All you have to do is slap on another layer of code. Changing the lightbulb is a hardware problem, and no concern of ours.

Information is everywhere, and everything can be transformed into information. Digital code has supplanted money as the ultimate medium of exchange: what Marx calls the universal equivalent, or what McLuhan calls the Pentecostal translator. Information is the space we move in, the air we breathe. Its very ubiquity, however, makes it hard to grasp. As McLuhan says, we tend to be unconscious of our immediate technological environment, just as a fish is unaware of being in the water. Immersed as we are in information, dependent as we are upon it, we don't really know what it is. Indeed, information seems rather like pornography: you know it when you see it, but you can't come up with a rigorous definition. Information is addictively fascinating, just like pornography; and both media arouse the same sort of moralistic indignation. When Baudrillard, Neil Postman, Jerry Mander, and Bill McKibben rail against the evils of contemporary electronic media, they sound like nobody so much as Catherine MacKinnon and Andrea Dworkin crusading against porn. All these critics start from a correct McLuhanesque apprehension: they understand, on some dim level, that information and pornography--like all media--are not just passive means of representation, but active forces in their own right, "extensions of man" (sic) that literally remold our bodies. McLuhan notes that "new environments inflict considerable pain on the perceiver" who has not yet adapted to their conditions or learned how to negotiate their demands. It's impossible to take seriously these arguments for the elimination of television and pornography; but I can't help being impressed by the apocalyptic fervor of the anti-porn and anti-TV crusaders, their Manichean sense of the depravity of images. Writers like Postman and Dworkin are symptomatically important. 
They testify unwittingly to what McLuhan calls our ingrained "rear-view-mirrorism," the conservatism and inertia of our genes and memes: "moral vehemence may provide ersatz dignity for our normal moronic behavior," McLuhan says; "the normal human condition, when faced with innovation, is that of the brainwashed idiot who tries to introduce the painfully learned responses from one situation into new situations where they apply not at all."

Such idiocy, you might say, is structurally unavoidable. All media are extensions of ourselves, yet we can never keep up with the breathless speed of their mutations and metamorphoses. There's always lag, as anyone who cruises the Internet knows all too well. This explains why, as McLuhan puts it, "the content of any medium is always the preceding medium." Information is everywhere in our postmodern economy; but the content of that information--what it tells us, or what it's about--is still the old world, the world before it was altered by microtransistors and fiber-optic cables. John Perry Barlow suggests that electronic information "is like farm produce," in that "its quality degrades rapidly both over time and in distance from the source of production." Yesterday's data, like yesterday's papers, are good only for landfill, or for wrapping fish. It's a never-ending struggle, as Bill Gates would surely tell you, to stay on top of your game, to anticipate shifts in the market, and to make sure you have all the latest upgrades. So don't waste your money on CD-ROMs, whose content is obsolete even before it gets engraved in silicon. Go out and get more RAM and a faster modem instead. Artificial Intelligence theorists couldn't be more wrong than when they try to explain machine intelligence on the model of an organism's permanent, long-term memory. As McLuhan says, the mere "storage" of data is a relic of the "old technology"; in the new electronic environment, "the real job of the computer is not retrieval, but discovery." RAM is continually being swapped, and disappears entirely when you turn the computer off; a hard disk indeed preserves data, but its most important feature is that it's indefinitely rewriteable. A better organic analogy for artificial intelligence is therefore short-term memory, which is abrupt, multiple, and discontinuous, and "includes forgetting as an active process" (Deleuze and Guattari). 
Continual revision is the way out of "rear-view-mirrorism," the one way that can lead, in Foucault's words, to "the going-astray of the one who knows." The greatest virtue of the computer is that it allows us more and more to dispense with long-term memory, and to approach something like Andy Warhol's state of eternally renewable short-term bliss: "I have no memory. Every day is a new day because I don't remember the day before. Every minute is like the first minute of my life... My mind is like a tape recorder with one button--Erase."

Maybe that's what they mean when they say that "information wants to be free." Information, like Nietzsche's will to power, is not a static entity, not a resource that can be conserved or capitalized. Use it or lose it. It is a dynamic inner differential, "the last delta-t" (Pynchon), "a difference which makes a difference" (Gregory Bateson). Just as the will to power is "a structure in which differences of potential are distributed, a constitutive dissymmetry, difference, or inequality" (Deleuze), so information is composed of reversible gradients of electronic potential and ever-changing dissymmetries of charge. It is a matter of gates and switches, of pulses and fluxes. Its oscillations may be induced chemically at synaptic thresholds, or they may be triggered by clock signals on silicon chips; in either case, the world is a construct of self-organizing and self-executing binary programs. Rucker defines the information content of any object as "the length of the shortest computer program that would answer any possible question about that object"; on this basis, he proclaims that "reality" is nothing more (or less) than "an incompressible computation by a fractal cellular automaton of inconceivable dimensions." Indeed, an extensive digital software seems at work within the most diverse regimes of matter: we find the same nonlinear equations, fractal patterns, and strange attractors regulating variations in the weather, disturbances of cardiac rhythms, distributions of charge in neural networks, fluctuations in the stock market. But these are not closed, balanced systems; they are rather what Ilya Prigogine and Isabelle Stengers call dissipative structures, operating in "far-from-equilibrium" conditions, forever poised at the edge of chaos. Being is not stable, but paradoxically, precariously metastable.
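Rucker's "shortest program" measure is what computer science calls Kolmogorov complexity, and it is provably uncomputable; no program can calculate it in general. But a general-purpose compressor gives a crude upper bound, which is enough to see the idea at work. A minimal Python sketch (zlib standing in, very roughly, for the ideal shortest program):

```python
import os
import zlib

def approx_info_content(data: bytes) -> int:
    # Kolmogorov complexity -- the length of the shortest program
    # that generates the data -- is uncomputable, so we substitute
    # a general-purpose compressor as a crude upper bound.
    return len(zlib.compress(data, 9))

ordered = b"01" * 500          # highly redundant: a short program generates it
random_ish = os.urandom(1000)  # admits no description shorter than itself

# Redundant data compresses far below its raw length;
# random data barely compresses at all.
assert approx_info_content(ordered) < 100 < approx_info_content(random_ish)
```

The gap between the two measurements is, in Rucker's terms, the difference between a world that folds into a compact fractal rule and one that must be spelled out bit by bit.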

In such conditions, behavior is in a real sense spontaneous or "free": infinitely sensitive to the most minute variations, it cannot be predicted, anticipated, or controlled from the outside. Not even Marxists believe in central planning anymore. But this "free" behavior is still information, and nothing but information: which means that it is ultimately computable, to any desired degree of accuracy. It's simply a matter of running the right simulations: of course, you need good software, and an awful lot of CPU time. Chaos theory thus harmonizes freedom and determinism, or chance and necessity, in much the same way that Leibniz, the first great philosopher of information, reconciled free will with the infallible foreknowledge of God. God knows everything that will happen to me, according to Leibniz, because that information is enveloped in the concept--or as we should say, the program--of what I am. But the running of this program, the calculation of my being, is what mathematicians call an NP-complete problem: one that apparently cannot be solved by an efficient, time-reducing algorithm. The computation is so vast that it can only take place in real time, the very time of my lived experience; and the universe itself, in its entirety, is the only computer big enough to crunch all the numbers.
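This marriage of determinism and unpredictability can be seen in a few lines of code. The logistic map is a standard textbook illustration (my choice here, not anything specific to this essay): every step is computed exactly, yet a perturbation in the ninth decimal place swamps the whole trajectory. A minimal Python sketch:

```python
def logistic(x: float) -> float:
    # One step of the logistic map x -> 4x(1-x),
    # a textbook chaotic system: deterministic, yet unpredictable.
    return 4.0 * x * (1.0 - x)

def trajectory(x0: float, steps: int) -> list[float]:
    xs = [x0]
    for _ in range(steps):
        xs.append(logistic(xs[-1]))
    return xs

a = trajectory(0.400000000, 60)
b = trajectory(0.400000001, 60)  # a ninth-decimal-place perturbation

# The two runs begin indistinguishably close...
assert abs(a[0] - b[0]) < 1e-8
# ...yet after a few dozen exactly-computed steps they have
# drifted far apart within the unit interval.
assert max(abs(x - y) for x, y in zip(a, b)) > 0.5
```

Both trajectories are fully "computable, to any desired degree of accuracy," and neither can be anticipated from outside without running the calculation itself, step by step, in real time.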

You might say, then, that "reality" itself is one enormous simulation, with information continually being computed to infinite decimal places. In the beginning was not the Word, but lines and lines of code. God is neither a stern judge nor a loving father; he is rather, as Leibniz implicitly argues, a master programmer. Such is the theology best suited to our postmodern experience of the hyperreal: a vision that moves beyond the dead end of modernist paranoia. Descartes, the prototypical modernist, worried that the Deity was actually an evil demon, bent on deceiving him. His attempts to persuade himself that God could be trusted after all are never altogether convincing. For once the seeds of paranoid doubt and existential angst have been planted, there's no way of eradicating them. Even Baudrillard is still a Late Cartesian, worried that hyperreal simulation has left us adrift in a vacant universe, "without origin or reality." For us, however--as indeed already for Leibniz--this simply isn't a problem. The discovery that God is a programmer running simulations is precisely what guarantees his veracity. For if God wanted to deceive us, then first and foremost he'd have to deceive himself. But if that were the case, then even his lies would end up being true. As Hans Moravec puts it, "A simulated Descartes correctly deduces his own existence. It makes no difference just who or what is doing the simulation--the simulated world is complete in itself." Our existence is no less real, for being that of a computer simulation, or an idea in the mind of God. Reality-testing involves what Wittgenstein would call a deep tautology: "What is, is. No fantasy. Pain. Just the details" (Kathy Acker). For what other criterion of truth and reality do we have? Philosophers in the Cartesian tradition are always trying to establish foundations and universals. 
But in every case, the philosophical groundings they've come up with are less evident, less solid and secure, than the very phenomena they are supposed to ground. The only convincing 'reality test' is a pragmatic one: "Just try--in a real case--to doubt someone else's fear or pain" (Wittgenstein). When we say that something is "real," we generally mean that it's so vivid, overwhelming, and all-embracing that it would be a frivolous--or willfully cynical--intellectual exercise to entertain Cartesian doubts as to its validity. Something is real because it's intense, and not the reverse.

And so we no longer ask the old Cartesian question: is it real or is it Memorex? We trust and believe that the world is real, precisely because we know it to be a simulation. Thanks to computers, we have rid ourselves of the representationalist prejudice that played so baleful a role in the history of Western thought. For a simulation is not a representation, but something altogether different: "to simulate something you need more than mere mimicry, more than an ability to produce actions that are like the ones you are wanting to simulate. You need a working model" (Benjamin Woolley). A representation comes after the object it imitates or signifies. That's why "the symbol is the murder of the thing," as Lacan put it: every representation implies, to some measure, the "lack"--the replacement, the death or the absence--of the thing it is supposed to represent. A simulation, on the contrary, precedes its object: it doesn't imitate or stand in for a given thing, but provides a program for generating it. The simulacrum is the birth of the thing, rather than its death. As Deleuze and Guattari say, simulation is how the real is effectively produced. No real without its hyperreal: the map becomes the territory. Reality will be virtual, or not at all. We live in an age of information, rather than one of representation and signification; and information is characterized by plenitude and redundancy--not lack. Leibniz argues that, among all possible worlds, God necessarily chose to create the one having "the greatest quantity of reality." In postmodern terms, this amounts to saying that the program simulating our universe is more powerful and detailed, more intense, more packed with information--more real, in short--than anything we could possibly run on our own feeble machines, or imagine inside our heads. The situation is rather like that in quantum mechanics. 
A wave function is inherently indeterminate and probabilistic; but it collapses, or gets determined, once it has been observed and measured. Contrary to popular misconception, however, this measuring intervention need not imply consciousness on the part of the 'observer.' A mechanical device, like a counter, is sufficient to make the wave function collapse. Simulation, likewise, is a relativistic and perspectival process, but not for all that a subjective one. It coerces my participation, but does not require it. As Wallace Stevens writes, "it fills the being before the mind can think." Information overload, you might say, is our proof that an external world really exists. We do not hallucinate an imaginary presence, says Deleuze: "it's rather presence itself that is hallucinatory."

I'm drawn into this delirium every time I turn on my modem. I log in to my account, access the World Wide Web, and head out in search of information. Too much is never enough. It accumulates more and more, in a classic feedback loop. First I download a file, then I need a program to read it. Then I discover a bug in the program, so I have to look around for a patch or an upgrade. And then I'm desperate to find more files, just to make running the program worthwhile. It's like what they say about drugs: watch out for that first hit, because each step leads to the next. Better not even inhale. Before I knew it, I was hooked: now my hard drive's entirely full, and I've got to shell out for a bigger one. It's all so absorbingly self-referential; there's no end to the digital labyrinth. Pynchon sees this process as a virtual reconfiguration of the American landscape, with information space overlaid upon, and literally mapping to, the shopping malls and real estate subdivisions of suburbia: "it was like walking among matrices of a great digital computer, the zeroes and ones twinned above, hanging like balanced mobiles right and left, ahead, thick, maybe endless." On the information superhighway, every house is a byte or a pixel, and there are endless opportunities to shop. Bits, as Rucker says, are always composed of smaller bits. Leibniz similarly insists on the infinite divisibility of matter, with differences animating it on every scale. Information is always embedded, one level lodged inside another, sort of like the Cat in the Hat. At one extreme, digital information encompasses the entire universe. At the other extreme, the smallest possible bit is not an entity, but a difference: an on/off switch, an undecidable 1/0, an uncollapsed, still indeterminate, quantum state. Something like the VOOM that's in the Hat of the tiniest Cat, a force that can accomplish anything, but whose ultimate nature, says Dr. Seuss, "I never will know."

With all those levels in between, surfing the Net can be daunting. So much depends upon finding a clean, manageable interface, and a smoothly running client program. The sheer ugliness of MS-DOS and Windows is one thing for which I never can forgive Bill Gates. There's no need for such opaque protocols and rigid hierarchies, since fractal patterns, like all those Cats in all those Hats, remain self-similar across differences of scale. In cyberspace, linear and hierarchical modes of organization are replaced by what McLuhan calls a "mosaic," or what Leo Steinberg calls a "flatbed": "the flatbed picture plane makes its symbolic allusion to hard surfaces such as tabletops, studio floors, charts, bulletin boards--any receptor surface on which objects are scattered, on which data is entered, on which information may be received, painted, impressed--whether coherently or in confusion... the painted surface is no longer the analogue of a visual experience of nature but of operational processes." With programs like Mosaic and Netscape Navigator, you can jump discontinuously from any point in the World Wide Web to any other point, without having to traverse all those tedious up-and-down intermediate links. Microsoft was way behind in adopting this paradigm. World Wide Web browsers turn the Internet into what Deleuze and Guattari call smooth or rhizomatic space: a space of "acentered systems, finite networks of automata in which communication runs from any neighbor to any other, the stems or channels do not preexist, and all individuals are interchangeable, defined only by their state at a given moment." Selves are no longer constrained by rules of unity and organic form. You can adopt whatever handle or pseudonym you want. We are all the same in cyberspace, and anyone can be replaced by anyone else, just as Andy Warhol dreamed.
But the states that such interchangeable individuals can occupy are multiplied far beyond our preexisting, restrictive norms; on LambdaMOO, for example, ten genders are currently available, instead of merely two.

But let's not get carried away with utopian fantasies. Most straight men are assholes, and the mere opportunity for expanded gender play on the Net doesn't do anything to change that. A successful drag performance is harder to pull off than you might think. Straight guys often pretend to be girls on the Net--I've done it often myself--thinking that the disguise will make it easier to get attention, and especially to score with 'actual' girls. But what goes around, comes around: the girls these guys meet usually turn out to be yet other guys in virtual disguise. Face it, the information of which most straight men are composed is monotonously self-referential: it just turns round and round forever, in the selfsame endless loop. The orgy always ends in disillusionment and boredom. However much you try, you can never be promiscuous enough. Leibniz takes the universal, orgiastic communication of bodies, "the fact that every portion of matter is agitated by the motions of the entire universe, and is acted upon in some way by all other parts of matter, however distant," as evidence for a cosmic "pre-established harmony." He reasons that, since everything is connected, and all these connections constitute information, and all information is computable (at least ideally, in the mind of God), then we must be living in the best of all possible worlds. By a similar argument, physicists long imagined that they were on the brink of discovering a 'final theory of everything,' and that the ultimate laws of nature would turn out to make a structure of great simplicity and beauty. If only Congress hadn't cancelled funding for the Superconducting Super Collider! But let's face it, guys, the universe's actual operating system is an ugly, inelegant hodgepodge--much like MS-DOS and Windows.
I vastly prefer the Macintosh operating system myself; but I know that imperfection is inherent to design, and that even the best program is less a dazzlingly logical and elegant construction than it is a heterogeneous assemblage, buggy and inconsistent, patched with dubious trade-offs and quick and dirty shortcuts. That's the only way natural selection can operate. The postmodern God, it would seem, resembles Bill Gates far more than he does Leibniz's all-wise Designer. God, like Gates, has exactly the aggressiveness, the competitive drive, and the sense of entitlement you'd expect in a talented straight boy from a privileged WASP background. He's an obsessive hacker, a brilliant but clumsy bricoleur, with a brute-force approach to problem-solving. He's a ferocious workaholic, who regularly puts in 80-hour weeks, and expects his employees to do the same. And although he's something of a visionary, he's not a particularly reliable one; he never meets product deadlines, and the goods he so tirelessly promotes are mostly vaporware. God, like Gates, owes his power and success less to the quality of his product than to his ruthless business sense. He's created a near monopoly by outmuscling the competition. You might not like this universe, just as you might not like Microsoft's clunky programs; but pragmatically speaking, where else do you have to go?

It's Bill Gates's world; we just live in it. Even a cursory look at Microsoft programs will disabuse you of the notion that the workings of natural selection, or of the "free market," somehow lead to optimal solutions. Most of the time, they leave us stranded, as it were, in the basins of insufficiently strange attractors: stable but suboptimal norms, programs that work just well enough to avert too frequent crashes, and to foreclose the chance of further innovation. Even God, you might say, can't really make the trains run on time. Information wants to be free; but its liberation won't have the glittering consequences that its proponents sometimes imagine: "though at first thought a leap out of our biological bodies [and into cyberspace] might seem to free us of the diseases of the flesh--alas, it is not so" (Hans Moravec). Today's simple computer viruses are only the beginning, as Moravec almost gleefully explains. Even as I write, new strains are being reported that mutate slightly in every generation, just enough to evade detection by standard anti-virus programs. For information has a body, even if it's not always the carbon-based one we've been accustomed to. Information is never just meaning, it's never a pure signal: there's always some waste in the form of redundancy, and there's always an uneliminable residue of noise. Redundancy and noise are information's body, the excess--the nonproductive expenditure--without which it couldn't function at all. They are inherent features of the medium, which means that they take precedence over any particular message. As Michel Serres points out, in living systems the couple 'message/noise' itself becomes, on a meta-level, a new form of information. And as Prigogine and Stengers observe, for living systems "a random fluctuation in the external flux, often termed 'noise,' far from being a nuisance, produces new types of behavior."
So let's stammer and stutter and repeat ourselves, spam the terminals, jam the channels, and otherwise revel in information overload. The problem with Microsoft software isn't so much that it's heavy and slow, and filled with bugs and redundancies. What's truly obnoxious about it is that in spite of its flaws, it functions all too slickly and too well. It establishes a field of coherence and closure that reins in excess, and shuts out other programs. The problem is not with Bill Gates the hacker, but with Bill Gates the monopolist who unilaterally imposes the latest standard.
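The claim that redundancy is not mere waste but information's working body has a precise engineering counterpart: error-correcting codes, which deliberately spend redundancy so that a message can survive a noisy channel. A minimal Python sketch (a naive repetition code, chosen purely for illustration; real channels use far subtler codes):

```python
import random

def encode(bits, r=9):
    # Repetition code: stammer and stutter, say everything r times over.
    return [b for b in bits for _ in range(r)]

def noisy(channel, flip_p, rng):
    # A binary symmetric channel: each bit flips with probability flip_p.
    return [b ^ 1 if rng.random() < flip_p else b for b in channel]

def decode(received, r=9):
    # Majority vote over each block of r repeats.
    return [1 if sum(received[i:i + r]) > r // 2 else 0
            for i in range(0, len(received), r)]

rng = random.Random(0)
msg = [rng.randint(0, 1) for _ in range(200)]

# The bare message through the noisy channel, versus the redundant one.
raw = noisy(msg, 0.1, random.Random(1))
coded = decode(noisy(encode(msg), 0.1, random.Random(2)))

raw_errors = sum(a != b for a, b in zip(raw, msg))
coded_errors = sum(a != b for a, b in zip(coded, msg))

# The "wasteful" ninefold redundancy recovers the message
# far more faithfully than the lean, unprotected signal.
assert coded_errors < raw_errors
```

The nonproductive expenditure is exactly what does the work: strip the code down to its pure signal and the noise eats it alive.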

So think again about that lightbulb that Gates couldn't be troubled to screw in. "The electric light is pure information," says McLuhan; "it is a medium without a message, as it were, unless it is used to spell out some verbal ad or name." The sheer dissipation of the lightbulb, its prodigal display of luminescence, its irreversible expenditure of itself in heat: this is a delirium of pure information. You could stare at that one bulb for hours, in a psychedelic trance. Of course someone will always try to use the lightbulb, to get some reading done, to sell some product, or to spell out some illustrious name. But the Second Law of Thermodynamics assures us that information cannot ever entirely be put to work. The medium is more than the sum of its messages and uses. Information is ecstatic, before it is communicative. It will be convulsive, or not at all. Remember Pynchon's story of Byron the Bulb, the lightbulb that burns forever? The international electronics cartel, concerned for its bottom line, sends out hit squads to 'retire' him. But Byron is effectively immortal; somehow he escapes each time, and continues to be screwed into socket after socket. Byron the Bulb embodies information in its pure state, sterile and sublime and incommunicable, unable to be accounted for, never to be put to any productive use: "he is condemned to go on forever, knowing the truth and powerless to change anything." And indeed one day "he will find himself, poor perverse bulb, enjoying it..."

So there you have it. Turn on, tune in, drop out. In this new world of electronic information, it's not what you know that is important, but the fact that, whatever it is, and wherever it comes from, it just never stops. That's the real point behind Leary's claim that netsurfing and virtual reality extend the effects of LSD by other means. It's not that you receive any profound new revelations when you're tripping; if you think you do, as often happens, you're only fooling yourself. But the value of psychedelia is that it is a vertigo of redundancy, an ecstasy of sheer quantitative overload. It's a feedback effect, like playing an electric guitar right up against the speakers. The message itself is unimportant. What really changes, thanks to the blast of these interference patterns, is the medium, or the messenger. All media, McLuhan says, are prosthetic implants, extensions of ourselves. But don't imagine that you can extend yourself with impunity, without being deeply changed by the experience. Even if you are just sitting there stooped over your terminal, or off in a corner, in a drug-induced stupor, staring at a lightbulb and muttering to yourself--even then, or rather especially then, you will hear voices that are profoundly alien. Terence McKenna calls them elves and pixies, William Gibson Haitian loas, Burroughs brain parasites from Venus, Jack Spicer little green men from Mars, John Carpenter free marketeers from Andromeda. Whatever. These beings may be malicious, or simply indifferent to us. They probably don't have our best interests at heart. But who cares where they come from, or even what they want? It's not as if we could make them go away; it's far too late for that. We have to learn to live with them; there's no other option. In the words of Sun Ra and his Intergalactic Solar Arkestra: "It's after the end of the world. Don't you know that yet?"

