Big Think Interview With David Gelernter
David Gelernter is professor of computer science at Yale, chief scientist at Mirror Worlds Technologies, contributing editor at the Weekly Standard, and a member of the National Council on the Arts. He is the author of several books and many technical articles, as well as essays, art criticism, and fiction. The "tuple spaces" introduced in Carriero and Gelernter's Linda system (1983) are the basis of many computer-communication and distributed programming systems worldwide. According to Reuters, his book "Mirror Worlds" (Oxford University Press, 1991) "foresaw" the World Wide Web and was "one of the inspirations for Java"; the "lifestreams" system (first implemented by Eric Freeman at Yale) is the basis for Mirror Worlds Technologies' software. Gelernter is also the author of "The Muse in the Machine" (Free Press, 1994), the novel "1939" (Harper Perennial, 1995), "Machine Beauty" (Basic Books, 1998), and most recently, "Judaism: A Way of Being" (Yale University Press, 2010).
Question: What balance do you strike between your teaching, writing, science, and art?
David Gelernter: Nowadays, I spend my time mainly painting. I have an exhibition coming up. Generally speaking, I spend more time painting than doing anything else, except for writing. I've been writing some pieces connected with DLD, where I got to meet Frank Schirrmacher, who has been associated with the meeting for some time, is an editor and publisher at the Frankfurter Allgemeine Zeitung, and is a remarkable guy. So I agreed to write a series of pieces for them. I think Europe is more interested in the implications, as opposed to the immediate market meaning, of technology. I don't think people there are better educated or more thoughtful or any different; they've just got a somewhat different focus. I think the nature of the European market, and the origin of so much of the technology in the United States, gives them one degree of remove, which I think is useful. There are a lot of thoughtful people over there.
Question: What is the focus of your new art exhibition?
David Gelernter: Well, let's see, this is the latest of a series of exhibits at Yale, which is a good place for me to exhibit. I like to sell paintings not from galleries but in a more informal, one-on-one way, and so a non-commercial gallery space in which to exhibit is, for me, very useful. I've been trying for many years, I should say for many decades at this point, to figure out what Jewish art is, if there is such a thing. It's come to seem to me that Jewish art is paintings of words. Not just paintings in which words appear, or words on a wall, but paintings in which the words themselves have meaning and decorative significance and conceptual weight. It's hard to describe an image, especially one that is somewhat idiosyncratic, but anyway. That's the general idea.
Question: How does Judaism shape your work?
David Gelernter: Genetically, to begin with. When I think up pictures, my own job description is an image thinker, as many people have been, and what I do is a matter of the images that float through my head. Many people think in images; it's hard to say how many. Certainly many people think in images some of the time. Many people think in images virtually all the time. When I'm working in software, I'm thinking of the picture that needs to appear on the screen, or that needs to appear in the user's head, in order to make sense of the software. In the studio, more directly, I try to take, as any painter does, as any artist does, what is in my head and make it concrete, which is a constant struggle, which isn't easy, but is what art has always been about.
When I write, I tend to write vividly, or try to write vividly, and it's also a matter of the images that drift through one's head. So, this is the way I deal with the world, picture-wise.
Question: Why do you believe Judaism is the central intellectual development in Western history?
David Gelernter: It seems to me, beyond doubt, that Judaism is the most important intellectual development in Western history, for two reasons: one having to do with the aesthetic and spiritual, the other with the ethical. If I begin with ethical and moral issues, Judaism invented the idea of mankind as an entity. We see striking differences between ancient Israelite literature and Greek literature, let's say in the first 1,000 years, the first millennium B.C. There is a word in Greek that has no equivalent in Hebrew, namely "barbarian": a Greek word meaning someone who babbles, who doesn't speak Greek, who is foreign, who is culturally inferior by definition and of very little interest. Not only different, but boring. Judaism, meanwhile, developed the idea of a single god, which was a revolutionary and bizarre idea when it first emerged 3,000-some-odd years ago. The Jews figured that if there is really only one god in the world, he has to be everybody's god. Everybody should have the right to say, this is my god. Everybody must have that right. And if you look at who that community, who the faithful are in principle, it's everybody. So, Judaism has the idea that ethical laws, moral rules and strictures, apply to everybody. Not that everybody bears the same liability to carry them out: there were stricter requirements of Jews, or Israelites, than there are of people in general. Judaism has never been a proselytizing religion. It doesn't really care, is in fact indifferent, whether people become Jews or join the Jewish community, but it is very clear on what the basic moral obligations of mankind are: respect for life, respect for justice, kindness to animals, and, what should I say, familial and sexual fidelity and refraining from sexual crimes.
These are the so-called "Seven Laws of the Sons of Noah," meaning that they apply to everybody.
So, without going into a lengthy disquisition, Judaism has the idea that there is a simple moral code which goes not only for the Israelite people, or the Israelite nation, but is applicable to everybody, and it has the revolutionary idea that not only is there one god, but there is essentially one man, one mankind, the whole world. So on festival occasions at the Temple in Jerusalem, 70 sacrifices would be brought at certain points. It was thought that there were 70 nations in the world: one sacrifice for each nation.
Judaism has an aesthetic and spiritual side also, of course. Judaism is obsessed with imagery. One often finds that its stereotypes are either basically right or exactly wrong; they are rarely in between. Judaism is often described as being hostile to imagery. But we know that can't be right, because the Hebrew Bible underlies Western literature. Hebrew poetry, the poetry of the Psalms, the prophets, the Book of Job, is the basis of Western literature. Hebrew prose narrative is the basis of Western narrative. There is no such thing as great poetry without imagery; the idea is absurd. There is no such thing as great writing that isn't vivid and vibrant, and that means based on images. And we find, in fact, that the imagery of the Bible is the imagery that recurs throughout Western literature and Western art, from ... the split-open Red Sea, to the handwriting on the wall, to the chariot of fire. These are images that are not only painted in the developing tradition of medieval art and Western realist painting, but recur in Western literature of all languages down to this afternoon.
So for both of these reasons, Judaism has a commanding role in the creation of the culture and civilization that we've occupied for several thousand years, and especially so with the emergence of the idea of the liberal nation. The liberal modern nation is a sort of joint invention of the United States and of Great Britain in the 17th and 18th centuries. These were Christian nations, but the Christianity of early America and of Britain in the Elizabethan age, and especially the age of the civil wars and Cromwell, is what is often called "Hebraic Christianity," or "Old Testament Christianity." It was a profoundly Hebrew-inspired sort of Christianity. Not that people thought of themselves as Jews, because they did not, but both the early United States and early Britain repeatedly referred to themselves as "The New Israel," and the idea of freedom and liberty emerges in the United States on the basis of the story of the Exodus. The biblical verse "Let my people go," repeated many times by Moses to Pharaoh, becomes fundamental in American history: not only when religious zealots who were persecuted in England immigrate in the 17th century to America, but when the United States declares its own independence and freedom as a nation, during the Civil War when the North becomes gradually resolved under President Lincoln to free the slaves, and then in the Civil Rights Movement of the late '50s and '60s.
So, the notion of freedom and the notion of equality are derived by the founders of English and American liberalism from the opening of the Bible, which says that all men are created in God's image; therefore you're not allowed to make distinctions on the basis of race, color, and creed. All men, being in God's image, are to be treated justly and fairly. Abraham Lincoln put that most concisely. And interestingly, the idea of democracy too: if you read the early literature of the United States developing the idea of modern democracy in the 1600s, especially in New England and, to some extent, in Virginia, biblical verses are quoted constantly. Not only the ones in which Moses sets up what is described as a Jewish commonwealth, where each tribe is told essentially to furnish its own leaders, to tell Moses who its leaders will be. It's also the case that the Hebrew Bible is an aggressively anti-monarchy book. There are vivid denunciations of the idea of a king, the rights of kings, an absolute king. Prophets in the Bible confront kings in the name of God, demanding that they be fair and just and honorable, and in fact, Israel was told that if it had any sense, it wouldn't have a king to begin with.
So in lots of ways, and this is something that used to be well known, the last couple of generations in Western culture, I would say since the Second World War, have been secularizing generations, in which we were more apt to look to ancient Greece than to ancient Israel. But as a matter of historical record, it's easy to trace these ideas, also in the philosophy of the English Enlightenment. It's easy to open a book of Locke and notice that he keeps quoting the Bible, or Hobbes, or Selden, or others of the English philosophers who provided the intellectual counterweight to the active and pragmatic liberalism of the founding fathers.
Question: What is cloud computing, and what do you believe its future will be?
David Gelernter: The idea of the cloud is that I compute on many platforms in many places. I use many different machines: I have a machine at home, a machine at work, a couple of laptops; maybe I have a cell phone, which is itself a computing platform, a pod, a pad, a BlackBerry, or whatever it is. There are a lot of different platforms. I travel; I need to compute in a lot of different places. So, for practical reasons, rather than taking my information and putting it in the file system on my new laptop, or on my machine at home, or on my cell phone, it's much easier for me just to let the information float off somewhere so it's always sort of overhead, in some intangible place around me, and I can tune it in. In the sense that I can tune in C-SPAN from any cable-connected TV, I want to be able to tune in my information and see it from any internet-connected computer. It's important in terms of portability, and it has other major pragmatic advantages, some of which have not yet been fully realized. It is still an enormous nuisance to buy a new computer, which is absurd. When I get a new computer, it sits in my front hall for three weeks while I work up the courage to install it. I usually wait until one of my sons is home so he can do the work for me. It should be trivial: what I want is to get a new computer, take it out of the box, plug it in, take a sledgehammer and smash the old one to bits, and I'm online. But because the cloud doesn't really function the way I want it to yet, one has to copy, painfully, the entire file system from one computer to the other, and even if one rigs up a special connection, that's a nuisance. One always winds up missing things.
So, anyway, you need a cloud because you have a lot of computers. You need a cloud because you often get new computers that are born empty. Maybe most important, you need a cloud for security. More and more of people's lives is going online. For security and privacy, I need my information to get the same sort of serious protection that my money gets in a bank. If I have money, I'm not going to shove it in a drawer under my bed and protect it with a shotgun or something like that. I'm just going to assume that there are institutions that I can trust, reasonably trustworthy, to take care of the money for me. By the same token, I don't want to worry about these issues, particularly with machines that are always on, always connected to the network, easy to break into. I don't want to manage the security on my machine. I don't want to worry about encryption; I don't want to worry about other techniques to frustrate thieves and spies. If my information is out on the cloud, not only can somebody else worry about encrypting and encoding it, not only can somebody else worry about barriers and logon protections, but, going back to Linda and the idea of parallelism, and of a network server existing not on one machine but spread out over many, I'd like each line of text that I have to be spread out over a thousand computers, let's say, or over a million.
So, if I'm a hacker and I break into one computer, I may be able to read a vertical strip of a document or a photograph, which is meaningless in itself, and I have to break into another 999,999 computers to get the other strips. Or it may be more computers than that. The cost of computers is going asymptotically to zero; of course it will always cost money to connect them and keep them running. But not only for matters of convenience, which are very important (I need to be able to get my data anywhere, on any platform), but even more for privacy and security: when people talk about a cloud, they mean information that's available on any platform, managed not by me but by an organization in which I can place as much trust as I place in the institutions of my community or my city that patrol the streets, that bank my money, that generally keep civilization running. They need to do the same thing with respect to the information landscape and privacy and security and so forth.
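[Editor's note: the "vertical strips" idea described above can be sketched in a few lines of Python. This is a hypothetical illustration, not Gelernter's or Mirror Worlds' actual system; the round-robin scheme and the function names `stripe` and `reassemble` are invented for the example. Real systems would also encrypt each strip.]

```python
# A minimal sketch of striping a document across many machines, so that
# no single machine holds a readable piece of the text.

def stripe(text: str, n_shards: int) -> list[str]:
    """Deal characters round-robin across n_shards simulated 'computers'."""
    shards = ["" for _ in range(n_shards)]
    for i, ch in enumerate(text):
        shards[i % n_shards] += ch
    return shards

def reassemble(shards: list[str]) -> str:
    """Interleave the strips back into the original text."""
    n = len(shards)
    total = sum(len(s) for s in shards)
    # Character i of the original lives on shard i % n, at position i // n.
    return "".join(shards[i % n][i // n] for i in range(total))

# A hacker who steals one shard sees only every n-th character,
# which is meaningless without the other n - 1 strips.
```

With, say, `stripe("attack at dawn", 5)`, each of the five strips is a garble of a few characters; only `reassemble` over all five recovers the message.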
Question: How can we fix ingrained software flaws rather than accepting them?
David Gelernter: Well, there are a bunch of big, juicy flaws one could talk about. I think the most important flaw that haunts us today is the way that software is stuck in 1984. The hardware world has changed dramatically. The world of computer users, everything, has changed, but when I turn on a computer I see a desktop with windows and icons and a menu, and I have a mouse. This so-called desktop UI or GUI (graphical user interface) was a brilliant invention at the Xerox Palo Alto Research Center in the 1970s, where a group led by Alan Kay and others developed the idea of the GUI for an early personal computer called the Alto. So, in 1976, '77, '78, this was an extraordinary invention. Apple saw the Xerox demo in, I think it was 1979, and first implemented this thing on the Lisa in 1983, and in a more commercially successful form on the Mac in 1984; the brilliant move, the brilliant idea, was the GUI. However, 1984 was a long time ago. It's a full generation ago. If I sat a freshman in one of my classes today down with a 1984 Mac, he wouldn't even know it was a computer. I mean, it's got this tiny little screen, it was sort of like an upright shoebox, and it doesn't have a hard drive in it. It's obviously wildly different in terms of its capabilities. It has no internet connection, ordinarily. However, if I turn it on for him and he looks at the display, he's going to be right at home, because it's exactly what he sees on his Mac or his PC, or whatever platform he's got today. He's got windows, and they overlap, and there are menus and icons, and you drag the mouse around.
And so this interface, the windows interface, the desktop interface, was a brilliant innovation, the most important in computing, when Xerox came up with it in the 1970s. It was a tremendously important step forward when Apple popularized and commercialized it with the Mac in 1984. It was still reasonably fresh and new when Microsoft took over the world with Windows 3.0 in 1990; Microsoft was already a company, but that's the product that made Microsoft the dominant power in computing for the next period of years. It was still pretty good in 1990. By 1995, the Web was up and running, the uses of computers were becoming different, and email was changing the way in which people communicated. By the late '90s, the Web and Web commerce had become essential to the economy in a lot of ways. Cell phones were becoming ubiquitous as computing platforms of another sort; there were all sorts of new ways to communicate. In the last 10 years, the emergence of social networking sites and the like has once again changed the picture.
So, users today are a radically different group of people. They use computers for different purposes than they did in 1984. Today I use a computer first of all as a communication device, and second of all to manage the information that I have. It's essential for me in dealing with the world in a lot of different ways, in finding out what's going on, in keeping my own information up-to-date. I certainly don't use a computer for computing. Back in the 1980s, people still did compute with computers. The major users were people who had reasonably computationally serious applications at the time: they were running spreadsheets, or they were running simulations; video games have always been major consumers of cycles, going back then too. But the emergence of the network, the radical increase in the power of the hardware, the radical increase in the size of the user base, the radical change in the purposes to which computers are put, the radical difference in graphical capabilities (a modern high-definition screen as opposed to the small, low-definition, not even grey-scale but black-and-white screen that the first Mac had and that Xerox was working with): all this suggests that 1984 software is probably not going to be the right solution for 2010 computing.
However, the industry is naturally reactionary; what can you say about an industry that makes so much money so fast, has so many stockholders, and is responsible to so many owners and employees? All the companies that depend on Microsoft, all the companies that depend on Apple, and the few other producers at the edges dealing with Linux machines and such, have heavy, what do you want to call it, fiduciary responsibilities that make them reactionary. If you are a very successful company, you're slow to change, except at the edges. You want everybody to think you're leading-edge, you want to do your best to look as if you're changing, but nonetheless, you've got your windows, and you've got your menus, and you've got your mouse. You've got a display which treats the screen as an opaque surface, a flat surface, a desktop, with windows sort of tacked to it as if it were a bulletin board.
What I want is not an opaque surface; I want to think of the screen as a viewport that I can look through. I want to be able to look through it to a multi-dimensional information landscape on the other side. I don't want a little bulletin board or a little desktop to put stuff on. That was a brilliant idea in 1984; today it's a constraining view, and it's obsolete.
Question: How can a grasp of emotional subtext be built into artificial intelligence?
David Gelernter: AI used to be a completely separate field, in effect. Artificial Intelligence, which had an obvious interface to the neurosciences, to cognitive science, cognitive psychology, experimental psychology, and philosophy of mind, was a very different field from software, software engineering, software design, software architecture. Today, the fields are increasingly in contact, and AI techniques are ubiquitous in modern software. Generally not very fancy techniques, but systems of rules, what used to be called expert systems; they're fundamental in a lot of ways. If you look at the heart of the problems of AI, going back to the 1950s, AI certainly did want to build useful software, and it still does; that remains an important goal. But there was also the deeper philosophical or scientific question: How does the mind work? What does it mean to think? Finding a definition for thinking, understanding thinking, had proven enormously elusive over the centuries. Since Descartes, whose opening bid essentially created the field of philosophy of mind in the 17th century, it's been very difficult to understand what the mind is, what consciousness is, what understanding is, and what it means to understand and communicate as opposed to merely use language in a robotic or mechanical way.
Now, AI people, in trying to understand thought and mind, generally approach it by attempting to build a working model in software. The computer has seemed mind-like to people since the 1950s. During the 1950s it was widely called an electronic brain, and people don't use that phrase anymore, but they still think of the computer in the same terms. It was an interesting idea, trying to build a working model of human thought out of software. Because if I could get a computer to act as if it were thinking, then I could look at the software and maybe learn something. Not necessarily, because software is very different from brain-ware, but maybe get some idea of what thought really consists of, and maybe what consciousness consists of.
Now, my argument with the direction that AI has taken is this: borrowing from the standard approaches in philosophy of mind, and in most cases, though not every case, from approaches in cognitive psychology and experimental psychology, AI has tended to say, "We want to know about thinking; then we'll move on to emotion, or maybe consciousness." Thinking is what we want to understand. The really important activity the mind does is thinking, solving problems. A lot of people in AI used problem-solving as an equivalent of thinking. They said, "Here's what we're working on. We're working on artificial thought, artificial intelligence, problem-solving software." But it's obvious to anybody that the mind does much more than solve problems. It's very rare for anybody to go about solving a problem formally. Certainly we don't do a lot of mathematical problem-solving, or problem sets in physics, most of the time, and to the extent that we are confronted with problems to solve, we almost always first have recourse to experience and think, "Well, what did I do the last time?" But in a more fundamental way, it is obvious to anybody (maybe to anybody who is not in AI) that if I'm working at my computer and I get tired and lean back and look out the window and just watch the passing scene, I'm still thinking; my mind hasn't shut down. I watch what's happening; I react in more subtle, cognitive ways to what I see. It's obvious that when I get tired, when my mind starts to drift, when I move into the free-associative state that was studied by Freud and that we know precedes falling asleep, free association is a kind of thinking also. My mind doesn't shut off, but I'm certainly not solving problems; I'm wandering around. And we also know that when we sleep, we think. Sleep thought is different from waking thought.
Sleep thought is not solving problems in mathematics, or solving any kind of problems in a methodical way. Sleep thought is image thought, for the most part, and sleep thought is hallucinatory: I see things that aren't there.
So, we need to understand the connection, the spectrum that connects wide-awake, focused, alert, problem-solving thought with what happens to my mind as I get tired, as my focus decreases, as I approach sleep. Actually, the brain goes through several oscillations like this during the day, but there's a continuous spectrum connecting, on the one hand, my most focused, sharpest kind of analytical thought, the sharpest of which I am capable, and on the other, the lowest-focus kind of thought, in which my mind drifts and ultimately I find myself asleep and dreaming.
The field of Artificial Intelligence has studied only the very top end of the spectrum, and still tends to study only the very top end. It tends to say: what is thinking? It's this highly focused, wide-awake, alert, problem-solving state of mind. But not only is that not the whole story; the biggest unsolved problem that has tended to haunt philosophy of mind, cognitive psychology, and AI is creativity. People have always been fascinated: what makes for a creative person? What explains a creative leap, which is a well-defined psychological event? People know when it happens to them. There is general agreement that to be creative is to have the ability to invent new analogies: to connect two things that are not obviously related, but once you have made the connection, you can see, yes, there is a relationship, and other people can see the relationship too, and creativity flows from that.
Now, we know that the invention of analogy does not come from highly focused, analytic, problem-solving thought. Creating analogy means connecting thoughts, letting your mind drift essentially from one thought to another. So, not only do we need to study the entire cognitive spectrum, or cognitive continuum, because that's what human beings are (they are not problem-solving machines; they do many kinds of thought that are not problem-solving), but if we ever want to know what creativity is, if we ever want to know what goes into the invention of a new analogy, we're going to have to study the free-associative states in which the mind drifts from state to state to state.
And finally, to add in the question having to do with emotion: how do we connect thoughts? I've argued (and I won't go through this now) that as your focus declines, as you become less alert, as you look out the window and your mind starts to wander, as you start to get drowsy, emotion plays a more and more important role in your thought. Emotion is what allows us to take two thoughts or ideas that seem very different and connect them, because emotion is a tremendously subtle thing: a subtle kind of code, or tag, that can be attached to a very complicated scene. If I ask, "What is your emotion on the first really warm day in April or March, when you go out and you don't need a coat, and you can smell the flowers blooming, and there may be remnants of snow, but you know it's not going to snow anymore and there's a certain springiness in the air. What do you feel?" It's not that you feel happy, exactly. There are a million kinds of happiness. It's a particular shade of emotion. Or consider what you feel on a million other occasions. You go to the mailbox and you see a letter from a girlfriend from 15 years ago, and you haven't heard from her since. Or you read in the newspaper something about blah, blah, blah, or your computer breaks just when you need to do something. There are many, many circumstances in which I can say I'm happy or I'm sad, but I need to go much further, beyond language. I can't use ordinary language to describe the nuanced emotions that I feel on these occasions. They're much more subtle than elation or depression. But the mind knows what they are, even if they can't be reduced to words, and I can connect two scenes that seem very different, that really are very different.
The actors are different, the scenes are different, the colors are different, the sounds are different, but nonetheless, if they make me feel the same way, I can connect them and create analogies. That's how it ultimately fits together.
Question: How has surviving the Unabomber attack changed your life?
David Gelernter: Zero. It was my responsibility—I think it would be anybody’s who had been attacked in a particularly cowardly and despicable fashion—to go on. If I had said, "This attack has changed me in the following 10 ways"... I’m not interested in being changed by criminals, murderers, and terrorists. I’m interested in being whoever I was destined to be as a member of my family and my community, and that’s what I’ve been doing. It slowed me down and presented physical challenges, but it didn’t change my worldview, or the sort of broader sense... Worldview in the sense of that tremendously useful German word used in philosophy: Weltanschauung. Worldview meaning not just looking around, but how to make sense of things, how I put it together in a coherent way. So, my worldview is the same.
Question: Has being the victim of an attack changed your feelings about terrorism?
David Gelernter: I’m not a victim. I never was, never will be. Victimhood is something you choose, or something you reject. I, like so many others who have gone before me and are living today, reject it. I hate the tendency of society to glorify victimhood and to speak of oppression and victimhood and persecution as some sort of badge of honor, or something of that sort. I’m not a victim.
On terrorism, on the other hand, I guess it’s fair to say that I had a close-up, personal look at terrorism. I don’t think my views have changed any. The fact is, any member of the American Jewish community has relatives who lived through the Holocaust and, more important, has relatives or close friends in Israel who were either attacked themselves or whose families have experienced terrorist attacks, because terrorism goes back many centuries but has always been a weapon of choice of Jew-haters and Israel-haters... So the tragic fact is that the reality of terrorism, its fundamental cowardliness, its fundamentally anti-human character, is familiar to everybody... I should say not just in the American Jewish community; the fact is that America is unique in its sympathy for Israel. Europe certainly doesn’t feel this way; Asia doesn’t feel this way. This is not a feeling only of American Jews. In fact, in many cases, the Christian community has shown itself much more interested in Israel’s fate and well-being than the Jewish community, which has its own political axes to grind. I think America in general has felt close to Israel, in some ways because the states are so similar. There is no nation in the world set up by people with Bibles in their back pockets as a New Israel; no nation has been set up on that basis aside from the United States and Israel. So there has always been that sympathy, and growing up one had the feeling in this country, back in the 1960s and ’70s, that terrorist attacks on Israel were hitting close to home. It was impossible not to be aware of the nature of terrorism, the threat of terrorism. It’s something that I’ve always lived with, tragically, as has everybody who has felt close to Israel.
Question: What makes you optimistic about the century ahead?
David Gelernter: We’re looking at—in one word, maybe I should say, graphics. Not only computer graphics or animation, but the enormously increased scope for pictures; for showing pictures, for seeing pictures, for seeing things. Seeing is a source of wisdom and pleasure in a lot of ways. Mankind really has no vocabulary to discuss color, because if you look at art history, until two generations ago nobody knew what paintings looked like. They could be reproduced in black and white going back to the 19th century; before then they couldn’t be reproduced in any way at all. But until, say, the 1930s, ’40s, ’50s, there was no way to show color. You could say Titian is a great colorist, or Velázquez has extraordinarily subtle browns, or that the reason the 13th-century glass at Chartres is unique is because of the blue, the special blue. But you couldn’t see it. You had to travel to France or to Venice to see it, wherever it was. And not only that, once you were there, unless you stayed—and you’re not going to stay planted in front of a picture in a museum, nor are you going to camp out in a cathedral. But computers have not only made displaying on their screens possible; they have made printing on paper—color printing—vastly better and inexpensive.
The possibility that we have now of seeing what mankind has done, the art that has been made, the cities that have been built, the landscapes that have drawn people, is tremendously exciting. And to see each other, because ultimately what people want to see most of all is other people. That’s exciting. It opens up a new world that mankind has longed for ever since he’s seen: "Colors are good, and I want to make my world colorful, and I want to see my fellow human beings, and I want to build things, and I want the horizons to be further than what I can see from my front door."
Recorded on April 1, 2010.
A conversation with the writer, artist, and Yale computer scientist.