Technology 101: What Do We Need To Know About The Future We're Creating?

By Howard Rheingold

(last revision: 5/4/98)

1: Where I'm Coming From
2: Growing Up Futurian
3: Seduction By Mind Amplifier
4: Jumping into the Virtual World
5: From Thinking Tools to Thinking About Tools

2: Growing Up Futurian

I've always been future-oriented. In the fourth grade, I gave lectures in class that were called "Howard and the atom." In 1958, in the sixth grade, I made a solar furnace as my science fair project and accompanied it with a drawing of a solar-powered city. In 1968, inspired by the finding that Zen monks had more alpha rhythms than other people, I wrote my Reed College undergraduate thesis about brainwave biofeedback and consciousness technology. I wrote an article about the future of money in 1976. In 1978, I sold my first two big-ticket magazine articles: "Future Highs" for Playboy and "The Future of Pinball" for Penthouse.

When the Altair, the first "computer on a chip" as they were called, came along in the mid 1970s, I was attracted to the idea that hobbyists could now have computers, which had been the size of refrigerators and cost millions of dollars a few years prior, but I wasn't a kit-builder, and didn't really know anything about computers. I got into personal computers in the early 1980s through my interest in the possibility of a science of consciousness.

In 1982, I got a job as a staff writer for the Institute of Noetic Sciences. IONS had been founded by Edgar Mitchell, the Apollo astronaut who had a profound spiritual experience during his extravehicular activity on the way back from the moon.

The main honcho at the Institute of Noetic Sciences was Willis Harman, who was one of the inventors of commercial futurism at Stanford Research Institute in the 1960s. Eventually I learned that he had also been involved with the bizarre, shadowy, subcultural figure Al Hubbard, Willis' homeboy from Washington State, who some claim was the man who turned the CIA onto LSD. I co-wrote a book with Willis (Higher Creativity is still in print) about the technologies and scientific discoveries and works of art that originated in extraordinary states of consciousness, and the possibility that such states could be "cultivated rather than harvested wild." During the writing of that book, ironically, I started my long love affair with computer-augmented thinking.

In the late 1970s, I had heard that some computer enthusiasts -- I don't even think they were called "nerds" yet -- were using computers and display screens as intelligent typewriters. I read a paper that had been distributed at one of the first West Coast Computer Faires -- the ones where organizer Jim Warren still tooled around the show floor on his roller skates, and there wasn't a suit to be seen. The guy who wrote the paper, Jef Raskin, worked for Apple. I visited him in Cupertino. The Apple campus consisted of two not-all-that-big buildings. Raskin had indeed written a text editor for the Apple II. But the state of the art in printers was still primitive dot matrix, and more importantly, there were no lower case letters.

At that time, only upper case screen fonts were burned into the Apple II ROM. Raskin said that the people who ran Apple had decided that personal computers were for people who played games and programmed in BASIC, neither of which required lower case fonts. I never forgot that. Even visionaries become myopic. Technology always seems to move faster than our ability to understand what to do with it. And the pace seems to be accelerating. He didn't talk to me about it when I interviewed him, but Raskin had already initiated a project that was going to go far beyond the Apple II -- the Macintosh project that Steve Jobs later commandeered.

While I was writing Higher Creativity with Willis Harman, one of the board members of the Institute of Noetic Sciences bought the Institute a personal computer. It was a Morrow Designs S-100 bus CPU running CP/M. The printer was a Diablo impact printer that sounded like a machine gun. The word processing software was WordStar 1.0. The documentation was stapled together. The Institute was leasing a big house overlooking San Francisco Bay. In the basement was a built-in redwood hot tub. We never used the tub, so it was dry. It smelled like redwood. That's where we put the new computer. I put on my Walkman earphones, themselves a new technology at that time, climbed down the ladder into the hot tub pit, and sat myself in front of the computer screen, feeling futuristic.

Word processing didn't just save me the effort of retyping my revised drafts over and over again -- retyping drafts is an excellent writing exercise, but by 1982, after ten years, I had learned enough from the typing part of it. WordStar, with its clumsy user interface (mark a block by prefacing it with control-KB and ending it with control-KK, then type control-KV to move the frighteningly invisible block to a new place in the document), was my first experience of the computer as a mind-amplifier. It took me another fifteen years to even notice how quickly I had been sucked into spending most of my working day sitting in front of a computer, increasingly engaged with the things computers were making possible.

My trains of thought were no longer limited to the linear requirements of type on paper. Compositional experiments that were simply too much trouble to attempt in typewriter days became possible: trying out alternative words or sentences or paragraphs, moving blocks of text around, toggling back and forth between versions of a phrase. Word processing doesn't make you any more intelligent or creative than you already are, but it makes it easier to play with words and ideas. If you aren't a decent writer, this tool is only going to augment your typing, and might even muddle your writing by making it easier for you to mess with it without knowing what you're doing. If you do have some grasp of the craft of writing, however, a word processor is like a power saw to a carpenter. You can do things with this new tool you weren't able to do without it.

In 1983, I bought one of the first IBM XTs. 256K RAM. Plenty for the rest of my life! And a ten megabyte hard disk!

Around the time I finished working on Higher Creativity, I read an old article in Scientific American on "Microelectronics and The Personal Computer" by Alan Kay. His vision of a "Dynabook" of the future captured my imagination, and so did the place he worked at the time he wrote that article, the Xerox Palo Alto Research Center, also known as PARC. What could possibly be better than a mind amplifier? Answer: An R&D thinktank dedicated to creating mind amplifiers. If they didn't need writers, I was determined to convince them that they did. I started calling around, asking if anybody knew anybody who worked at PARC. I called their publication office. I devoted a morning each week to making calls regarding employment at PARC.

When I was looking for a job at PARC, a more experienced freelancer told me: "Nobody has a job for you, but everybody has a problem to solve, sooner or later." So I called the woman at the public affairs office once a week, and politely reminded her I was eager to get my toe in the water. One Friday, she called and said that a Xerox executive was supposed to give a speech at a convention on Monday, and somehow it hadn't been written. So, sure, I could start from scratch on Friday afternoon, learn what I needed to learn about impact printers, and script a speech by Saturday night, so slides could be created to go along with it. After I solved that problem, she started calling me to help scientists at PARC write papers for scientific meetings. Bingo. Dream job. These people had been working with personal computers expressly designed for intellectual augmentation for TEN YEARS!

With my interest (growing into an obsession) in the use of computers to amplify thinking, communicating, and creative work in general, I would have paid to be allowed to wander through Xerox PARC when I discovered it, in 1983. Every week or so, I'd get a call to help someone write. These were very smart people who knew their stuff, and either didn't have time or didn't like to write or were simply gifted procrastinators. I came in, asked them to explain what they were doing, and turned the transcript into a draft, with the help of whatever written materials they gave me. Then we'd meet again and he or she would tell me what was wrong with the draft. The first such assignment was an article about "higher level protocols for data networks." I had no idea what even a lower level protocol was. It was like being handed a Greek dictionary and told to come back in a week with an essay in Greek. Not easy, but if you are suitably motivated, entirely possible.

The best part of the PARC gig was the privilege of using an Alto to research and compose my articles. With its custom-built processor, large bit-mapped screen (about six times larger than the first Macintosh screen), two-button mouse, icons, windows, point-and-click interface, the Alto was exactly what I knew a computer could be -- and Xerox researchers had been using them for years! Each Alto was connected to the PARC Ethernet (local-area networks were another PARC invention) and gatewayed to the ARPAnet. At that time, it was also the only place in the world you could print your hardcopy on a laser printer (another PARC invention). I drove forty-five minutes each way from San Francisco, just to be able to work on an Alto.

The Macintosh came out in 1984. It had a fraction of the Alto's processing power, disk storage, and screen size. But by that time, Xerox had already blown its unbelievable ten-year lead in the PC industry. The Alto had been commercialized as the Xerox Star. But you couldn't buy a Star. You had to buy a network with multiple Stars, servers, laser printers. Xerox only wanted customers who wrote at least six zeroes on their checks. Macintosh capabilities couldn't compare to Xerox technology, but you could buy a Mac for a couple thousand dollars. Nobody remembers the Star. The Macintosh, despite the sorry fate of Apple, Inc., was one of the world's greatest brands and creation myths of the digital age. For many, it was a noetic experience.

By the time everybody was making a big deal of the Mac, I had met Bob Taylor, director of the Computer Science Lab at PARC, and had read the bibliographies of enough CSL publications to know that Doug Engelbart and JCR Licklider were responsible for the idea of using computers as mind amplifiers, long before PARC existed. Practically nobody knew about the role PARC had played, so I told PARC's public affairs director I wanted to write stories about the great stuff PARC was inventing.

The tale of teenagers in garages creating an industry was a great story. But there was an equally interesting, and in some ways more profound story of the mavericks who swam against the mainstream of mainframe computer science and created personal computing. These people were on a crusade, and the goal was not to make a fortune, but to change the way the world accomplished intellectual work, starting with themselves.

By 1983, tracing the origins of personal computing was already an exercise in archeology. Bob Taylor still directed the Computer Science Lab at PARC, but the Alto was already ten years old, Alan Kay had departed PARC for Atari Research Labs, and the deepest roots of the PC lay in an institution that no longer existed, Doug Engelbart's Augmentation Research Center at Stanford Research Institute.

Engelbart certainly existed, I learned, and was still pursuing his dream of mind-amplifying media. My curiosity led me to interview him, and the interview turned a key and unlocked something that has taken a long time to develop. I'm still tingling from my encounter with the ideals he inspired the day I met him, fifteen years ago. I've never encountered, and doubt whether I will ever find again, a person in pursuit of such a broad vision of the way the world ought to be, and in possession of such incredible tenacity in that pursuit. He cooked up a brainstorm one day in 1950, driving to work, and it has dominated his life ever since. That brainstorm has come to dominate many other lives, since it is unthinkable that personal computers and networks and multimedia and hypertext and point-and-click interfaces would have developed without Engelbart's tenacious vision, and the pioneering work he accomplished in its pursuit.

In 1950, when there were only a few digital computers in the world, and television was a brand-new medium, Doug Engelbart conceived of a mind-amplifying device that would help the human race navigate the complexities of the future by representing information on TV screens and storing that information in a hypertext network. When I met him in 1983, Engelbart had been pursuing that idea for more than three decades.

I first met him at the office of Tymshare (a company that no longer exists), which had bought his Augment system from SRI. Ironically, that office in Cupertino was surrounded by the then-expanding Apple campus. Like a cross between the Ancient Mariner and an Old Testament prophet, he has been compelled to tell his entire story thousands of times. It took decades before anyone else in the world could perceive the future he had foreseen. His blue eyes still focus on a distant horizon when he explains. Engelbart wanted to go Bacon and Descartes one better -- the philosophers of the first Enlightenment Project didn't have computers, televisions, and networks. It's all about how people and tools can learn to think together in new ways, to address complex problems. Sounds sensible now. In 1950, it was as if he started talking Martian.

Doug Engelbart, a twenty-five-year-old veteran, had been a radar operator in WWII. He realized that the post-war world would be dominated by technologies and by complex global problems. In fact, the very complexity of the problems that new technologies would cause would be the major metaproblem. Why not use technologies to help people solve those complex problems together? While driving through the fruit orchards of the Santa Clara Valley, circa 1950, on his way to work as an electrical engineer at NACA's Ames laboratory (now NASA's Ames Research Center), Engelbart began to think about ways he could use his life to help the human race survive the explosive growth of technology he was helping create.

Immediately after the end of the war, while waiting for his ship home from the Philippines, Engelbart read Vannevar Bush's visionary article in the May 1945 Atlantic, "As We May Think." When he started thinking about how people could solve complex intellectual problems together, Engelbart began envisioning a version of Bush's memex that was more of a communication device than just an information-finding tool.

In 1962, Engelbart published his epochal paper, Augmenting Human Intellect, about a system that would involve more than just hardware and software: new ways of thinking, working, and communicating, and new languages to represent these new mind-tools, would be required, as well as new training methods and organizational systems to manage their use as part of scientific, educational, and industrial enterprises. Like the Enlightenment philosophers, Engelbart was looking at a whole new way of using our minds, our language, our institutions. He saw that new electronic tools with symbol-manipulating capacity furnished great opportunities for intellectual leverage, but above all he understood that these tools would necessarily be part of a profound systemic change. Read Engelbart's conclusion, then remind yourself he wrote this in 1962.

Doug Engelbart's personal charisma is a quiet kind. You have to lean close when he's talking, even when he uses a microphone. His voice is soft, but he's like a tuning fork when he speaks; he seems to be vibrating at the frequency of his vision for the world whenever he begins to talk about it. Being around him affected me. It became clear to me that the world didn't know that personal computers were invented by stubborn visionaries like Engelbart, and not by the computer industry or computer science orthodoxy. After talking to Engelbart, Alan Kay, JCR Licklider, Bob Taylor, and others who had been involved in "interactive computing" since the 1960s, I understood that this tool was the work of people who deliberately sought to extend the powers of intellect and communication. In contrast to the priesthood of the mainframe era, the ARPA programmers were revolutionary. They knew that access to computing resources could empower entire populations to think and communicate in new ways. So I wrote Tools For Thought to tell that story.

3: Seduction by Mind Amplifier

©1998 howard rheingold, all rights reserved worldwide.