TIME.com - Visions Of The 21st Century (June 19, 2000)
Will We Plug Chips Into Our Brains?
The writer who coined the word cyberspace contemplates a future stranger
than his science fiction
By William Gibson
Maybe.
But only once or twice, and probably not for very long.
With their sharp black suits and their surgically implanted silicon chips,
the cyberpunk hard guys of '80s science fiction (including the characters
in my early novels and short stories) already have a certain nostalgic
romance about them. These information highwaymen were so heroically attuned
to the new technology that they laid themselves open to its very cutting
edge. They became it; they took it within themselves.
Meanwhile, in case you somehow haven't noticed, we are all becoming it;
we seem to have no choice but to take it within ourselves.
In hindsight, the most memorable images of science fiction often have
more to do with our anxieties in the past (that is to say, the writer's
present) than with those singular and ongoing scenarios that make up our
life as a species — our real future, our ongoing present.
Many of us, even today, or most particularly today, must feel as though
we already have silicon chips embedded in our brains. Some of us, certainly,
are not entirely happy with that feeling. Some of us must wish that ubiquitous
computation would simply go away and leave us alone. But that seems increasingly
unlikely.
That does not, however, mean that we will one day, as a species, submit
to the indignity of the chip — if only because the chip is likely soon
to be as quaint an object as the vacuum tube or the slide rule.
From the viewpoint of bioengineering, a silicon chip is a large and rather
complex shard of glass. Inserting a silicon chip into the human brain
involves a certain irreducible inelegance of scale. It's scarcely more
elegant, relatively, than inserting a steam engine into the same tissue.
It may be technically possible, but why should we even want to attempt
such a thing?
I suspect that mainstream medicine and the military will both find reasons
for attempting such a thing, at least in the short run, and that medicine's
reasons may at least serve to counter some disability, acquired or inherited.
If I were to lose my eyes, I would quite eagerly submit to some sort of
surgery that promised a video link to the optic nerves. (And once there,
why not insist on full-channel cable and a Web browser?) The military's
reasons for chip insertion would probably have something to do with what
I suspect is the increasingly archaic job description of "fighter pilot,"
or with some other aspect of telepresent combat, in which weapons in the
field are remotely controlled by distant operators. At least there's still
a certain macho frisson to be had in the idea of embedding a tactical
shard of glass in your head, and crazier things, really, have been done
in the name of king and country.
But if we do it at all, I doubt we'll be doing it for very long, as various
models of biological and nanomolecular computing are rapidly looming into
view. Rather than plug a piece of hardware into our gray matter, how much
more elegant to extract some brain cells, plop them into a Petri dish
and graft on various sorts of gelatinous computing goo. Slug it all back
into the skull and watch it run on blood sugar, the way a human brain's
supposed to. Get all the functions and features you want, without that
clunky-junky 20th century hardware thing. You really don't need complicated
glass to crunch numbers, and computing goo probably won't be all that
difficult to build. (The trickier aspect here may be turning data into
something brain cells can understand. If you knew how to get brain cells
to manage pull-down menus, you'd probably know everything you needed to
know about brain cells.)
Our hardware, I think, is likely to turn into something like us a lot
faster than we are likely to turn into something like our hardware. Our
hardware is evolving at the speed of light, while we are still the product,
for the most part, of unskilled labor.
But there is another argument against the need to implant computing devices,
be they glass or goo. It's a very simple one, so simple that some have
difficulty grasping it. It has to do with a certain archaic distinction
we still tend to make, a distinction between computing and "the world."
Between, if you like, the virtual and the real.
I very much doubt that our grandchildren will understand the distinction
between that which is a computer and that which isn't.
Or to put it another way, they will not know "computers" as a distinct
category of object or function. This, I think, is the logical outcome
of genuinely ubiquitous computing, of the fully wired world. The wired
world will consist, in effect, of a single unbroken interface. The idea
of a device that "only" computes will perhaps be the ultimate archaism
in a world in which the fridge or the toothbrush is potentially as smart
as any other object, including you, a world in which intelligent objects
communicate, routinely and constantly, with one another and with us.
In this world, there may be no need for the physical augmentation of
the human brain, as the most significant, and quite unthinkably powerful,
augmentation will have taken place beyond geographic boundaries, via distributed
processing. You won't need smart goo in your brain, because your fridge
and your toothbrush will be very smart indeed, enormously smart, and they
will be there for you, constantly and always.
So it won't, I don't think, be a matter of computers crawling buglike
into the most intimate chasms of our being, but of humanity crawling buglike
out into the mingling light and shadow of the presence of that which we
will have created, which we are creating now, and which seems to me to
be in the process of re-creating us.