Computers Are Stuck in 1984

Question: How can we fix ingrained software flaws rather than accepting them?

David Gelernter: Well, there are a bunch of big juicy flaws one could talk about.  I think the most important flaw that haunts us today is the way that software is stuck in 1984.  The hardware world has changed dramatically.  The world of computer users, everything has changed, but when I turn on a computer I see a desktop with windows and icons and a menu, and I have a mouse.  This so-called desktop UI, or GUI (graphical user interface), was a brilliant invention at Xerox Palo Alto Research Center in the 1970s.  A group led by Alan Kay and others developed the idea of the GUI for the first personal computer, the Alto.  So in 1976, '77, '78, this was an extraordinary invention.  Apple saw the Xerox demo in, I think, 1979, and first implemented the idea on the Lisa in 1983 and, in a more commercially successful form, on the Mac in 1984; the brilliant move, the brilliant idea, was the GUI.  However, 1984 was a long time ago.  It's a full generation ago.  And if I sat a freshman in one of my classes today down in front of a 1984 Mac, he wouldn't even know it was a computer.  I mean, it's got this tiny little screen, it's shaped like an upright shoebox, it doesn't have a hard drive in it, and it ordinarily has no internet connection; it's obviously wildly different in terms of its capabilities.  However, if I turn it on for him and he looks at the display, he's going to be right at home, because it's exactly what he sees on his Mac or his PC, or whatever platform he's got today.  He's got windows, and they overlap, and there are menus and icons, and you drag the mouse around.

And so this interface, the windows interface, the desktop interface, was a brilliant innovation, the most important in computing, when Xerox came up with it in the 1970s.  It was a tremendously important step forward when Apple popularized and commercialized it with the Mac in 1984.  It was still reasonably fresh and new when Microsoft took over the world with Windows 3.0 in 1990; I mean, Microsoft was already a company, but that's the product that made Microsoft the dominant power in computing for the following years.  It was still pretty good in 1990.  By 1995, the Web was up and running, the uses of computers were becoming different, and email was changing the way in which people communicated.  By the late '90s, the Web and Web commerce had become essential to the economy in a lot of ways.  Cell phones were becoming ubiquitous, as were computing platforms of other sorts, and there were all sorts of new ways to communicate; in the last 10 years the emergence of social networking sites and things like that has once again changed the picture.

So, users today are a radically different group of people.  They use computers for different purposes than they did in 1984.  I mean, today I use a computer first of all as a communication device, and second of all to manage the information that I have.  It's essential for me in dealing with the world in a lot of different ways, in finding out what's going on, and in keeping my own information up to date.  I certainly don't use a computer for computing.  Back in the 1980s people still did compute with computers.  The major users were people who had reasonably serious computational applications at the time: either they were running spreadsheets or they were running simulations, and video games have always been major consumers of cycles, of course, going back then too.  But the emergence of the network, the radical increase in the power of the hardware, the radical increase in the size of the user base, the radical change in the purposes to which computers are put, and the radical difference in graphical capabilities (a modern high-definition screen as opposed to the small, low-definition screen, not even grayscale but black and white, that the first Mac had and that Xerox was working with): all of this suggests that 1984 software is probably not going to be the right solution for 2010 computing.

However, the industry: what can you say about an industry that makes so much money so fast, has so many stockholders, and is responsible to so many owners and employees?  It's naturally reactionary.  You know, all the companies that depend on Microsoft, all the companies that depend on Apple, and the few other producers at the edges dealing with Linux machines and things like that: they have heavy fiduciary responsibilities, call it what you want, that make them reactionary.  I mean, if you are a very successful company, you're slow to change except at the edges.  You want everybody to think you're on the leading edge, you want to do your best to look as if you're changing, but nonetheless, you've got your windows, and you've got your menus, and you've got your mouse, and you've got a display that treats the screen as an opaque surface, a flat surface, a desktop, with windows tacked to it as if it were a bulletin board.

What I want is not an opaque surface; I want to think of the screen as a viewport that I can look through.  I want to be able to look through it to a multi-dimensional information landscape on the other side.  I don't want a little bulletin board or a little desktop to put stuff on.  That was a brilliant idea in 1984; today it's a constraining view, and it's obsolete.
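One way to read this contrast is as a difference in the underlying data model: the desktop treats information as named objects pinned to a flat surface, while the viewport treats it as an ordered landscape the screen merely looks into. The Python sketch below is purely illustrative and not anything described in the interview; the names DesktopItem, StreamItem, and Viewport are hypothetical, chosen only to show one possible way the two models might be compared in code.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

# Desktop metaphor: information is a named object "tacked" onto a flat surface.
@dataclass
class DesktopItem:
    name: str   # a file name the user has to invent and remember
    x: int      # position on the two-dimensional desktop
    y: int

# Viewport metaphor (illustrative assumption): information is a time-ordered
# landscape, and the screen is just a window onto a slice of it.
@dataclass
class StreamItem:
    created: datetime
    content: str

@dataclass
class Viewport:
    stream: List[StreamItem] = field(default_factory=list)

    def add(self, content: str) -> None:
        """New information simply joins the landscape; no name or position needed."""
        self.stream.append(StreamItem(datetime.now(), content))

    def view(self, start: int = 0, size: int = 5) -> List[StreamItem]:
        """Look 'through' the screen at a slice of the landscape, newest first,
        rather than at icons arranged on an opaque surface."""
        ordered = sorted(self.stream, key=lambda item: item.created, reverse=True)
        return ordered[start:start + size]
```

In this toy model the user never files or positions anything; scrolling the viewport (changing `start`) is the only navigation, which is the sense in which the screen becomes a window rather than a surface.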

Recorded on April 1, 2010.

The basic user interface of our personal computers has stayed the same for a generation. How can we move beyond the desktop?
