Rheingold's Rants

February 1998


26 February

By 1983, tracing the origins of personal computing was already an exercise in archeology. Bob Taylor still directed the Computer Science Lab at PARC, but the Alto was already ten years old, Alan Kay had departed PARC for Atari Research Labs, and the deepest roots of the PC lay in an institution that no longer existed, Doug Engelbart's Augmentation Research Center at Stanford Research Institute.

Engelbart certainly existed, and was still pursuing his dream of mind-amplifying media. I'm still running on the ideals he inspired the day I met him, fifteen years ago. I've never encountered, and doubt whether I will ever find again, a person in pursuit of such a broad vision of the way the world ought to be, and in possession of such incredible tenacity in that pursuit. He cooked up a brainstorm one day in 1950, driving to work, and it has dominated his life ever since.

25 February

By the time everybody was making a big deal of the Mac, I had met Bob Taylor, director of the Computer Science Lab at PARC, and had read the bibliographies of enough CSL publications to know that Doug Engelbart and J.C.R. Licklider were responsible for the idea of using computers as mind amplifiers, long before PARC existed. Practically nobody knew about the role PARC had played, so I told PARC's public affairs director I wanted to write stories about the great stuff PARC was inventing.

The tale of teenagers in garages creating an industry was a great story. But there was an equally interesting, and in some ways more profound, story of the mavericks of ARPA, ARC, and PARC who swam against the mainstream of mainframe computer science and created personal computing. These people were on a crusade, and the goal was not to make a fortune, but to change the way the world accomplished intellectual work, starting with themselves.

24 February

The Macintosh came out in 1984. It had a fraction of the Alto's processing power, disk storage, and screen size. But by that time, Xerox had already blown its unbelievable ten-year lead in the PC industry. The Alto had been commercialized as the Xerox Star. But you couldn't buy a Star. You had to buy a network, multiple Stars, laser printers. Xerox only wanted customers who wrote at least five zeroes on their checks. Macintosh capabilities couldn't compare to Xerox technology, but you could buy a Mac for a couple thousand dollars. Nobody remembers the Star. The Macintosh, despite the sorry fate of Apple, Inc., was one of the world's greatest brands and creation myths of the digital age.

23 February

With my interest (growing into an obsession) in the use of computers to amplify thinking, communicating, and creative work in general, I would have paid to be allowed to wander through Xerox PARC when I discovered it, in 1983. Every week or so, I'd get a call to help someone write. These were very smart people who knew their stuff, and either didn't have time or didn't like to write or were simply gifted procrastinators. I came in, asked them to explain what they were doing, and turned the transcript into a draft, with the help of whatever written materials they gave me. Then we'd meet again and they would tell me what was wrong with the draft. The first such assignment was an article about "higher level protocols for data networks." I didn't even know what a lower level protocol was. It was like being handed a Greek dictionary and told to come back in a week with an essay suitable for publication...in Greek.

21 February


For the past week or more, I've been posting daily installments in an ongoing narrative about what technophiles ought to know. But today I want to interrupt by posting a short list of books that have helped me broaden my own thinking about technology. Ironically, half these books are out of print:


20 February

When I was looking for a job at PARC, a more experienced freelancer told me: "Nobody has a job for you, but everybody has a problem to solve, sooner or later." So I called the woman at the public affairs office once a week, and politely reminded her I was eager to get my toe in the water. One Friday, she called and said that a Xerox executive was supposed to give a speech at a convention on Monday, and somehow it hadn't been written. So, sure, I could start from scratch on Friday afternoon, learn what I needed to learn about impact printers, and script a speech by Saturday night, so slides could be created to go along with it. After I solved that problem, she started calling me to help scientists at PARC write papers for scientific meetings. Bingo. Dream job. These people had been working with personal computers expressly designed for intellectual augmentation for TEN YEARS!

19 February

Around the time I finished working on Higher Creativity, I read an old article in Scientific American on "Microelectronics and the Personal Computer" by Alan Kay. His vision of a "Dynabook" of the future captured my imagination, and so did the place he worked at the time he wrote that article, the Xerox Palo Alto Research Center, also known as PARC. What could possibly be better than a mind amplifier? Answer: An R&D thinktank dedicated to creating mind amplifiers. If they didn't need writers, I was determined to convince them that they did. I started calling around, asking if anybody knew anybody who worked at PARC. I called their publications office. I devoted a morning each week to making calls regarding employment at PARC.


18 February

Word processing didn't just save me the effort of retyping my revised drafts over and over again -- an excellent writing exercise, but by 1982, after ten years, I had learned enough from it. WordStar, with its clumsy user interface (you selected a block of text by marking its beginning with control-KB and its end with control-KK), was my first experience of the computer as a mind-amplifier. I was soon sucked into spending most of my working day sitting in front of a computer, increasingly engaged with the things computers were making possible; it took me another fifteen years to even notice.
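
That control-K business was actually a little modal command language: control-KB marked the beginning of a block, control-KK marked the end, and control-KV moved the marked block to wherever the cursor sat. Here is a toy sketch of those block-move mechanics in Python -- my own reconstruction for illustration, with the command semantics recalled from memory, not WordStar's actual code (the real thing was 8080 assembly under CP/M):

    # Toy model of WordStar-style block commands (^KB, ^KK, ^KV).
    # The Buffer class and its method names are my invention for
    # illustration; they do not come from WordStar itself.
    class Buffer:
        def __init__(self, text):
            self.text = text
            self.begin = None   # block start, set by ^KB
            self.end = None     # block end, set by ^KK

        def mark_begin(self, pos):      # ^KB: mark beginning of block
            self.begin = pos

        def mark_end(self, pos):        # ^KK: mark end of block
            self.end = pos

        def move_block(self, cursor):   # ^KV: move marked block to cursor
            if self.begin is None or self.end is None:
                raise ValueError("mark a block with ^KB and ^KK first")
            block = self.text[self.begin:self.end]
            rest = self.text[:self.begin] + self.text[self.end:]
            if cursor >= self.end:        # cursor sat past the block
                cursor -= len(block)
            elif cursor > self.begin:     # cursor sat inside the block
                cursor = self.begin
            self.text = rest[:cursor] + block + rest[cursor:]

    buf = Buffer("makes revision cheap word processing ")
    buf.mark_begin(21)    # ^KB before "word"
    buf.mark_end(37)      # ^KK after "processing "
    buf.move_block(0)     # ^KV with the cursor at the top of the file
    print(buf.text)       # -> word processing makes revision cheap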


My trains of thought were no longer limited to the linear requirements of type on paper. Compositional experiments that were simply too much trouble in typewriter days became possible: looking at alternatives, moving blocks of text around, toggling back and forth between versions of a phrase. Word processing doesn't make you any more intelligent or creative than you already are, but it makes it easier to play with words and ideas.

17 February


While I was writing a book with Willis Harman, one of the board members of the Institute of Noetic Sciences bought the Institute a personal computer. It was a Morrow Designs S-100 bus CPU running CP/M. The printer was a Diablo impact printer that sounded like a machine gun. The word processing software was WordStar 1.0. The documentation was stapled together. The Institute was leasing a big house overlooking San Francisco Bay. In the basement was a built-in redwood hot tub. We never used it, so it was dry. That's where we put the new computer. I put on my Walkman, which was itself new technology at that time, climbed down the ladder into the hot tub pit, and sat myself in front of the computer screen, feeling damn futuristic.



16 February


In the late 1970s, I had heard that some computer enthusiasts -- I don't even think they were called "nerds" yet -- were using computers and display screens as intelligent typewriters. I read a paper that had been distributed at one of the first West Coast Computer Faires -- the ones where organizer Jim Warren still tooled around the show floor on his roller skates. The guy who wrote the paper, Jef Raskin, worked for Apple. I visited him in Cupertino. The Apple campus consisted of two not-all-that-big buildings. Raskin had indeed written a text editor for the Apple II. But the state of the art in printers was still primitive dot matrix, and more importantly, there was no lower case: at that time, only upper case screen fonts were burned into the Apple II ROM. Raskin said that the people who ran Apple had decided that personal computers were for people who played games and programmed in BASIC, neither of which required lower case fonts.


15 February

The main honcho at the Institute of Noetic Sciences was Willis Harman, who was one of the inventors of commercial futurism at Stanford Research Institute in the 1960s. Eventually I learned that he had also been involved with the bizarre, shadowy subcultural figure Al Hubbard, Willis' homeboy from Washington State and the man who turned the CIA onto LSD. I co-wrote a book with Willis (Higher Creativity is still in print) about the technologies and scientific discoveries and works of art that originated in extraordinary states of consciousness, and the possibility that such states could be "cultivated rather than harvested wild." During the writing of that book, ironically, I started my long love affair with computer-augmented thinking.


12 February


I've always been future-oriented. I wrote an article about the future of money in 1976. In 1978, I sold my first two big ticket magazine articles: "Future Highs" for Playboy and "The Future of Pinball" for Penthouse. When the Altair came along, I thought it was an incredibly neat idea, but I wasn't a kit-builder, and didn't really know anything about computers. I got into personal computers in the early 1980s through my interest in altered states of consciousness.

In 1982 I got a job as staff writer for the Institute of Noetic Sciences. IONS had been founded by Edgar Mitchell, the Apollo astronaut who had a profound spiritual experience during his extravehicular activity on the way back from the moon (actually, Captain Ed told me that EVERYONE who has done an EVA, American and Russian alike, has had a profound experience -- but Ed is the only one who talks about it).


11 February


We have forgotten, and have been encouraged to forget, the origins and provenance of fundamental tools we all benefit from -- rationality, progress, democratic self-governance, universal acceptance of the superiority of the scientific method to other ways of knowing.


Without claiming I have an answer to the problem of technology, I'd like to tell other technology-lovers and technology-designers about a few of the things I've learned -- and sharpen those ideas in online conversation. But I have to start with my own fascination with technologies, especially those that amplify intellectual functions.


6 February


We are all partaking in, and many of us are helping to build, something that none of us understands. For reasons that I will explain, there are taboos against looking too critically at the real politics of technology. Marx was just as deluded as Adam Smith when it came to understanding the real invisible hand that has shaped how humans work, live, and think for the past couple hundred years. The historical trajectory of technology has only recently become visible, and only to a few, largely unread, thinkers. One of the things that makes technology dangerous is the way people forget where tools come from, and what they were designed to do.

4 February


Do we really know where our technologies are leading us? Do we have any idea about where we ought to go with the power over matter, mind, and life itself that next-generation technologies promise? Is there anything we can do about it? And still have any fun?


I am compelled to declare my love for mind-extending technologies before I recount how I started thinking more critically about tools, minds, and civilizations. I don't want you to mistake this for the standard neo-Luddite rant. I lack the certainty of the true believers -- both the orthodox technophiles and the convinced technophobes. I confess up front that I know of no theology or ideology that will answer the questions I can no longer avoid asking.


3 February


I've been reading "The Religion of Technology: The Divinity of Man and the Spirit of Invention," by David F. Noble (Knopf, 1997). The first graf is a grabber:


We in the West confront the close of the second Christian Millennium much as we began it, in devout anticipation of doom and deliverance, only now our medieval expectations assume a more modern, technological expression. It is the aim of this book to demonstrate that the present enchantment with things technological -- the very measure of modern enlightenment -- is rooted in religious myths and ancient imaginings. Although today's technologists, in their sober pursuit of utility, power, and profit, seem to set society's standard for rationality, they are driven also by distant dreams, spiritual yearnings for supernatural redemption. However dazzling and daunting their display of worldly wisdom, their true inspiration lies elsewhere, in an enduring, other-worldly quest for transcendence and salvation.


2 February


"Money is messages," I wrote in an article on "The Future of Money" for the San Francisco Examiner in 1976. I had worked in the swing shift of the wire room at the Bank of America during the time the old-fashioned teletype machines were phased out and computerized funds-transfer was phased in. In the twenty two years since then, the global cybernation of money has set the stage for an entirely new kind of energy exchange. Bernard Lietaer believes the Net is the ideal medium for creating something closer to what capitalism was supposed to be, but never became. I wrote a brief description of Lietaer's ideas about the future of money.

