Neil deGrasse Tyson: Atheist or Agnostic?
Neil deGrasse Tyson was born and raised in New York City where he was educated in the public schools clear through his graduation from the Bronx High School of Science. Tyson went on to earn his BA in Physics from Harvard and his PhD in Astrophysics from Columbia. He is the first occupant of the Frederick P. Rose Directorship of the Hayden Planetarium. His professional research interests are broad, but include star formation, exploding stars, dwarf galaxies, and the structure of our Milky Way. Tyson obtains his data from the Hubble Space Telescope, as well as from telescopes in California, New Mexico, Arizona, and in the Andes Mountains of Chile. Tyson is the recipient of nine honorary doctorates and the NASA Distinguished Public Service Medal. His contributions to the public appreciation of the cosmos have been recognized by the International Astronomical Union in their official naming of asteroid "13123 Tyson". Tyson's new book is Letters From an Astrophysicist (2019).
Neil deGrasse Tyson: I'm often asked – and occasionally in an accusatory way – “Are you atheist?” And it’s like, you know, the only “ist” I am is a scientist, all right? I don’t associate with movements. I'm not an “ism.” I just - I think for myself. The moment when someone attaches to a philosophy or a movement, then they assign all the baggage and all the rest of the philosophy that goes with it to you, and when you want to have a conversation they will assert that they already know everything important there is to know about you because of that association. And that’s not the way to have a conversation. I'm sorry. It’s not. I’d rather we explore each other’s ideas in real time rather than assign a label to it and assert, you know, what’s going to happen in advance.
So what people are really after is, what is my stance on religion or spirituality or God? And I would say, if I find a word that came closest it would be agnostic. Agnostic – the word dates from the 19th century – Huxley – to refer to someone who doesn’t know but hasn’t yet really seen evidence for it but is prepared to embrace the evidence if it’s there but if it’s not won’t be forced to have to think something that is not otherwise supported.
There are many atheists who say, “Well, all agnostics are atheists.” Okay. I'm constantly claimed by atheists. I find this intriguing. In fact, on my Wiki page – I didn’t create the Wiki page, others did, and I'm flattered that people cared enough about my life to assemble it – and it said, “Neil deGrasse is an atheist.” I said, “Well that’s not really true.” I said, “Neil deGrasse is an agnostic.” I went back a week later. It said, “Neil deGrasse is an atheist.” – again within a week – and I said, “What’s up with that?” and I said, “I have to word it a little differently.” So I said, okay, “Neil deGrasse, widely claimed by atheists, is actually an agnostic.”
And some will say, well, that’s – "You’re not being fair to the fact that they’re actually the same thing." No, they’re not the same thing, and I'll tell you why. Atheists I know who proudly wear the badge are active atheists. They’re like in your face atheist and they want to change policies and they’re having debates. I don’t have the time, the interest, the energy to do any of that. I'm a scientist. I'm an educator. My goal is to get people thinking straight in the first place, just get you to be curious about the natural world. That’s what I'm about. I'm not about any of the rest of this.
And it’s odd that the word atheist even exists. I don’t play golf. Is there a word for non-golf players? Do non-golf players gather and strategize? Do non-skiers have a word and come together and talk about the fact that they don’t ski? I don’t—I can’t do that. I can’t gather around and talk about how much everybody in the room doesn’t believe in God. I just don’t—I don’t have the energy for that, and so I . . . Agnostic separates me from the conduct of atheists whether or not there is strong overlap between the two categories, and at the end of the day I’d rather not be any category at all.
Directed / Produced by
Jonathan Fowler & Elizabeth Rodd
Astrophysicist Neil deGrasse Tyson claims the title "scientist" above all other "ists." And yet, he says he is "constantly claimed by atheists." So where does he stand? "Neil deGrasse, widely claimed by atheists, is actually an agnostic."
Our love-hate relationship with browser tabs drives all of us crazy. There is a solution.
A lot of ideas that people had about the internet in the 1990s have fallen by the wayside as technology and our usage patterns evolved. Long gone are things like GeoCities, BowieNet, and the belief that letting anybody post whatever they are thinking whenever they want is a fundamentally good idea with no societal repercussions.
While these ideas have been abandoned and the tools that made them possible often replaced by new and improved ones, not every outdated part of our internet experience is gone. A new study by a team at Carnegie Mellon makes the case that the use of tabs in a web browser is one of these outdated concepts that we would do well to get rid of.
How many tabs do you have open right now?
We didn't always have tabs. Introduced in the early 2000s, tabs are now included in all major web browsers, and most users have had access to them for a little over a decade. They have remained pretty much the same since their debut, despite the ever-changing nature of the internet. So, in this new study, researchers interviewed and surveyed 113 people on their use of — and feelings toward — the ubiquitous tabs.
Most people use tabs for the short-term storage of information, particularly information they expect to need again soon. Some keep tabs that they know they'll never get around to reading. Others use them as a sort of external memory bank. One participant described this to the researchers:
"It's like a manifestation of everything that's on my mind right now. Or the things that should be on my mind right now... So right now, in this browser window, I have a web project that I'm working on. I don't have time to work on it right now, but I know I need to work on it. So it's sitting there reminding me that I need to work on it."
You suffer from tab overload
Unfortunately, trying to use tabs this way can cause a number of problems. A quarter of the interview subjects reported having caused a computer or browser to crash because they had too many tabs open. Others reported feeling flustered by having so many tabs open — a situation called "tab overload" — or feeling ashamed that they appeared disorganized by having so many tabs up at once. More than half of participants reported having problems like this at least two or three times a week.
Even so, people can become emotionally invested in their tabs. One participant explained, "[E]ven when I'm not using those tabs, I don't want to close them. Maybe it's because it took efforts [sic] to open those tabs and organize them in that way."
So, we have a tool that inefficiently saves web pages that we might visit again while simultaneously reducing our productivity, increasing our anxiety, and crashing our machines. And yet we feel oddly attached to them.
Either the system is crazy or we are.
Skeema: The anti-tab revolution
The researchers concluded that at least part of the problem is caused by tabs not being an ideal way of organizing the work we now do online. They propose a new model that better compartmentalizes tabs by task and subtask, reflects users' mental models, and helps manage the users' attention on what is important right now rather than what might be important later.
To that end, the team also created Skeema, an extension for Google Chrome that treats tabs as tasks and offers a variety of ways to organize them. Users of an early version reported having fewer tabs and windows open at one time and were better able to manage the information their tabs contained.
Tabs were an improvement over having multiple windows open at the same time, but they may have outlived their usefulness. While it might take a paradigm shift to fully replace the concept, the study suggests that taking a different approach to tabs might be worth trying.
And now, excuse me while I close some of the 87 tabs I currently have open.
Seek pleasure and avoid pain. Why make it more complicated?
- The Epicureans were some of the world's first materialists and argued that there is neither God, nor gods, nor spirits, but only atoms and the physical world.
- They believed that life was about finding pleasure and avoiding pain and that both were achieved by minimizing our desires for things.
- The Epicurean Four Part Remedy is advice on how we can face the world, achieve happiness, and not worry as much as we do.
Self-help books are consistently on the best-seller lists across the world. We can't seem to get enough of happiness advice, wellness gurus, and life coaches. But, as the Book of Ecclesiastes says, there is nothing new under the sun. The Ancient Greeks were into the self-help business millennia before the likes of Dale Carnegie and Mark Manson.
Four schools of ancient Greek philosophy
From the 3rd century BCE until the birth of Jesus, Greek philosophy was locked into an ideological war. Four rival schools emerged, each proclaiming loudly that they — alone — had the secret to a happy and fulfilled life. These schools were: Stoicism, Cynicism, Skepticism, and Epicureanism. Each had its advocates, and they even waged a kind of PR battle to get people to sign up to their side. They were trying to sell happiness.
Many of us are familiar with Stoicism, a topic I covered recently, because it forms the foundation of cognitive behavioral therapy. Skepticism and Cynicism have become watered-down or warped variations of their original forms. (I will cover these in future articles.) Today, we focus on the most underappreciated of these schools, the Epicureans. In their philosophy, we can find a surprisingly modern and easy-to-follow "Four Part Remedy" for life.
Epicureans: The first atheists
The Epicureans were some of history's first materialists. They believed that the world was made up only of atoms (and void), and that everything is simply a particular composition of these atoms. There were no gods, spirits, or souls (or, at most, they're irrelevant to the world as we encounter it). They thought that there was no afterlife or immortality to be had, either. Death is just a relocation of atoms. This atheism and materialism was what the Christian Church would later come to despise, and after centuries of being villainized by priests, popes, and church doctrine, the Epicureans fell out of fashion.
In the atomistic, worldly philosophy of the Epicureans, all there is to life is to get as much pleasure as you can and avoid pain. This doesn't mean becoming some rampant hedonist, staggering from opium den to brothel; it concerns the higher pleasures of the mind.
Epicurus himself defined pleasure as the satisfying of a desire, such as drinking a glass of water when we're really thirsty. But he also argued that desires themselves were painful since they, by definition, meant longing and anguish. Thirst is a desire, and we don't like being thirsty. True contentment, then, could not come from creating and indulging pointless wants but must instead come from minimizing desire altogether. What would be the point of setting ourselves new targets? They are just new desires that we must make efforts to satisfy. Thus, minimizing pain meant minimizing desires, and the bare minimum of desires were those required to live.
The Four Part Remedy
Given that Epicureans were determined to maximize pleasure and minimize pain, they developed a series of rituals and routines designed to help. One of the best known (not least because we've lost so much written by the Epicureans) was the so-called "Four Part Remedy." These were four principles they believed we ought to accept so that we might find solace and be rid of existential and spiritual pain:
1. Don't fear God. Remember, everything is just atoms. You won't go to hell, and you won't go to heaven. The "afterlife" will be nothingness, in just the same way as when you had no awareness whatsoever of the dinosaurs or Cleopatra. There was simply nothing before you existed, and death is a great expanse of the same timeless, painless void.
2. Don't worry about death. This is a natural corollary of Step 1. With no body, there is no pain. In death, we lose all of our desires and, along with them, suffering and discontent. It's striking how similar in tone this sounds to a lot of Eastern, especially Buddhist, philosophy of the time.
3. What is good is easy to get. Pleasure comes in satisfying desires, specifically the basic, biological desires required to keep us alive. Anything more complicated than this, or harder to achieve, just creates pain. There's water to be drunk, food to be eaten, and beds to sleep in. That's all you need.
4. What is terrible is easy to endure. Even if it is difficult to satisfy the basic necessities, remember that pain is short-lived. We're rarely hungry for long, and sicknesses most often will be cured easily enough (and this was written 2300 years before antibiotics). All other pains often can be mitigated by pleasures to be had. If basic biological necessities can't be met, then you die — but we already established there is nothing to fear from death.
Epicurus's guide to living is noticeably different from a lot of modern self-help books in just how little day-to-day advice it gives. It doesn't tell us "the five things you need to do before breakfast" or "visit these ten places, and you'll never be sad again." Just like its rival school of Stoicism, Epicureanism is all about a psychological shift of some kind.
Namely, that psychological shift is about recognizing that life doesn't need to be as complicated as we make it. At the end of the day, we're just animals with basic needs. We have the tools necessary to satisfy our desires, and when we can't satisfy them, we have huge reservoirs of strength and resilience capable of enduring it all. Failing that, we still have nothing to fear because there is nothing to fear about death. When we're alive, death is nowhere near; when we're dead, we won't care.
Practical, modern, and straightforward, Epicurus's guidance offers valuable insight into life. It's existential comfort for materialists and atheists. It's happiness in four lines.
The idea of 'absolute time' is an illusion. Physics and subjective experience reveal why.
- Since Einstein posited his theory of general relativity, we've understood that gravity has the power to warp space and time.
- This "time dilation" effect occurs even at small levels.
- Outside of physics, we experience distortions in how we perceive time — sometimes to a startling extent.
Place one clock at the top of a mountain. Place another on the beach. Eventually, you'll see that each clock tells a different time. Why? Time moves slower as you get closer to Earth, because, as Einstein posited in his theory of general relativity, the gravity of a large mass, like Earth, warps the space and time around it.
Scientists first observed this "time dilation" effect on the cosmic scale, such as when a star passes near a black hole. Then, in 2010, researchers observed the same effect on a much smaller scale, using two extremely precise atomic clocks, one placed 33 centimeters higher than the other. Again, time moved slower for the clock closer to Earth.
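To get a rough sense of just how small this effect is, here is a back-of-the-envelope estimate (my own sketch, not a figure from the study) using the standard weak-field formula for gravitational time dilation near Earth's surface, where g is the gravitational acceleration, h is the height difference between the clocks, and c is the speed of light:

\[
\frac{\Delta f}{f} \approx \frac{g h}{c^{2}} = \frac{(9.8\ \mathrm{m/s^{2}})(0.33\ \mathrm{m})}{(3.0\times 10^{8}\ \mathrm{m/s})^{2}} \approx 4\times 10^{-17}
\]

In other words, the clock raised by 33 centimeters runs fast by only a few parts in 10^17, an amount that would take hundreds of millions of years to add up to a single second of disagreement, which is why detecting it required state-of-the-art atomic clocks.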
The differences were tiny, but the implications were massive: absolute time does not exist. For each clock in the world, and for each of us, time passes slightly differently. But even if time is passing at ever-fluctuating speeds throughout the universe, time is still passing in some kind of objective sense, right? Maybe not.
Physics without time
In his book "The Order of Time," Italian theoretical physicist Carlo Rovelli suggests that our perception of time — our sense that time is forever flowing forward — could be a highly subjective projection. After all, when you look at reality on the smallest scale (using equations of quantum gravity, at least), time vanishes.
"If I observe the microscopic state of things," writes Rovelli, "then the difference between past and future vanishes … in the elementary grammar of things, there is no distinction between 'cause' and 'effect.'"
So, why do we perceive time as flowing forward? Rovelli notes that, although time disappears on extremely small scales, we still obviously perceive events occurring sequentially. In other words, we observe entropy: order changing into disorder, an egg cracking and getting scrambled.
Rovelli says key aspects of time are described by the second law of thermodynamics, which states that heat always passes from hot to cold. This is a one-way street. For example, an ice cube melts into a hot cup of tea, never the reverse. Rovelli suggests a similar phenomenon might explain why we're only able to perceive the past and not the future.
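The textbook way to express that one-way street (a minimal sketch of the standard thermodynamic argument, not a passage from Rovelli's book) is in terms of entropy. When an amount of heat Q leaves a hot body at temperature T_hot and flows into a cold body at T_cold, the total entropy change is

\[
\Delta S = \frac{Q}{T_{\mathrm{cold}}} - \frac{Q}{T_{\mathrm{hot}}} > 0 \quad \text{whenever } T_{\mathrm{hot}} > T_{\mathrm{cold}},
\]

so the familiar direction of the process is simply the direction in which entropy grows; the reverse would require a decrease in total entropy, which the second law forbids for an isolated system.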
"Any time the future is definitely distinguishable from the past, there is something like heat involved," Rovelli wrote for the Financial Times. "Thermodynamics traces the direction of time to something called the 'low entropy of the past', a still mysterious phenomenon on which discussions rage."
"Entropy growth orients time and permits the existence of traces of the past, and these permit the possibility of memories, which hold together our sense of identity. I suspect that what we call the "flowing" of time has to be understood by studying the structure of our brain rather than by studying physics: evolution has shaped our brain into a machine that feeds off memory in order to anticipate the future. This is what we are listening to when we listen to the passing of time. Understanding the "flowing" of time is therefore something that may pertain to neuroscience more than to fundamental physics. Searching for the explanation of the feeling of flow in physics might be a mistake."
Scientists still have much to learn about how we perceive time, and why time operates differently depending on the scale. But what's certain is that, outside of the realm of physics, our individual perception of time is also surprisingly elastic.
The strange subjectivity of time
Time moves differently atop a mountain than it does on a beach. But you don't need to travel any distance at all to experience strange distortions in your perception of time. In moments of life-or-death fear, for example, your brain releases large amounts of adrenaline, which speeds up your internal clock, causing you to perceive the outside world as moving more slowly.
Another common distortion occurs when we focus our attention in particular ways.
"If you're thinking about how time is currently passing by, the biggest factor influencing your time perception is attention," Aaron Sackett, associate professor of marketing at the University of St. Thomas, told Gizmodo. "The more attention you give to the passage of time, the slower it tends to go. As you become distracted from time's passing—perhaps by something interesting happening nearby, or a good daydreaming session—you're more likely to lose track of time, giving you the feeling that it's slipping by more quickly than before. "Time flies when you're having fun," they say, but really, it's more like "time flies when you're thinking about other things." That's why time will also often fly by when you're definitely not having fun—like when you're having a heated argument or are terrified about an upcoming presentation."
One of the most mysterious ways people experience time-perception distortions is through psychedelic drugs. In an interview with The Guardian, Rovelli described a time he experimented with LSD.
"It was an extraordinarily strong experience that touched me also intellectually," he said. "Among the strange phenomena was the sense of time stopping. Things were happening in my mind but the clock was not going ahead; the flow of time was not passing any more. It was a total subversion of the structure of reality."
It seems few scientists or philosophers believe time is completely an illusion.
"What we call time is a rich, stratified concept; it has many layers," Rovelli told Physics Today. "Some of time's layers apply only at limited scales within limited domains. This does not make them illusions."What is an illusion is the idea that time flows at an absolute rate. The river of time might be flowing forever forward, but it moves at different speeds, between people, and even within your own mind.
- The history of AI shows boom periods (AI summers) followed by busts (AI winters).
- The cyclical nature of AI funding is driven by hype and by promises that fail to meet expectations.
- This time, we might enter something resembling an AI autumn rather than an AI winter, but fundamental questions remain about whether true AI is even possible.
The dream of building a machine that can think like a human stretches back to the origins of electronic computers. But ever since research into artificial intelligence (AI) began in earnest after World War II, the field has gone through a series of boom and bust cycles called "AI summers" and "AI winters."
Each cycle begins with optimistic claims that a fully, generally intelligent machine is just a decade or so away. Funding pours in and progress seems swift. Then, a decade or so later, progress stalls and funding dries up. Over the last ten years, we've clearly been in an AI summer as vast improvements in computing power and new techniques like deep learning have led to remarkable advances. But now, as we enter the third decade of the 21st century, some who follow AI feel the cold winds at their backs, leading them to ask, "Is Winter Coming?" If so, what went wrong this time?
A brief history of AI
To see if the winds of winter are really coming for AI, it is useful to look at the field's history. The first real summer can be pegged to 1956 and the famous workshop at Dartmouth College where one of the field's pioneers, John McCarthy, coined the term "artificial intelligence." The conference was attended by scientists like Marvin Minsky and H. A. Simon, whose names would go on to become synonymous with the field. For those researchers, the task ahead was clear: capture the processes of human reasoning through the manipulation of symbolic systems (i.e., computer programs).
Throughout the 1960s, progress seemed to come swiftly as researchers developed computer systems that could play chess, deduce mathematical theorems, and even engage in simple discussions with a person. Government funding flowed generously. Optimism was so high that, in 1970, Minsky famously proclaimed, "In three to eight years we will have a machine with the general intelligence of a human being."
By the mid 1970s, however, it was clear that Minsky's optimism was unwarranted. Progress stalled as many of the innovations of the previous decade proved too narrow in their applicability, seeming more like toys than steps toward a general version of artificial intelligence. Funding dried up so completely that researchers soon took pains not to refer to their work as AI, as the term carried a stink that killed proposals.
The cycle repeated itself in the 1980s with the rise of expert systems and the renewed interest in what we now call neural networks (i.e., programs based on connectivity architectures that mimic neurons in the brain). Once again, there was wild optimism and big increases in funding. What was novel in this cycle was the addition of significant private funding as more companies began to rely on computers as essential components of their business. But, once again, the big promises were never realized, and funding dried up again.
AI: Hype vs. reality
The AI summer we're currently experiencing began sometime in the first decade of the new millennium. Vast increases in both computing speed and storage ushered in the era of deep learning and big data. Deep learning methods use stacked layers of neural networks that pass information to each other to solve complex problems like facial recognition. Big data provides these systems with vast oceans of examples (like images of faces) to train on. The applications of this progress are all around us: Google Maps gives you near-perfect directions; you can talk with Siri anytime you want; IBM's Watson beat Jeopardy!'s greatest human champions.
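For readers curious what "stacked layers that pass information to each other" looks like in practice, here is a toy sketch in Python (a minimal illustration of the general idea, with arbitrary sizes and random, untrained weights, not any real production system):

```python
# Toy sketch: a two-layer neural network forward pass in plain NumPy.
# Each layer takes the previous layer's output and transforms it further.
import numpy as np

rng = np.random.default_rng(0)

def layer(x, weights, bias):
    """One dense layer: a linear transform followed by a ReLU nonlinearity."""
    return np.maximum(0.0, x @ weights + bias)

# A fake "input" (say, a flattened 8x8 grayscale image) and randomly
# initialized weights for two stacked layers. All sizes are arbitrary.
x = rng.normal(size=(1, 64))
w1, b1 = rng.normal(size=(64, 32)), np.zeros(32)
w2, b2 = rng.normal(size=(32, 10)), np.zeros(10)

hidden = layer(x, w1, b1)   # first layer turns the raw input into features
scores = hidden @ w2 + b2   # second layer maps features to 10 class scores
print(scores.shape)         # -> (1, 10)
```

Real systems differ mainly in depth and training: they stack many more layers and tune the weights on enormous datasets rather than leaving them random.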
In response, the hype rose again. True AI, we were told, must be just around the corner. In 2015, for example, The Guardian reported that self-driving cars, the killer app of modern AI, were close at hand. Readers were told, "By 2020 you will become a permanent backseat driver." And just two years ago, Elon Musk claimed that by 2020 "we'd have over a million cars with full self-driving software."
By now, it's obvious that a world of fully self-driving cars is still years away. Likewise, in spite of the remarkable progress we've made in machine learning, we're still far from creating systems that possess general intelligence. The emphasis is on the term general because that's what AI really has been promising all these years: a machine that's flexible in dealing with any situation as it comes up. Instead, what researchers have found is that, despite all their remarkable progress, the systems they've built remain brittle, which is a technical term meaning "they do very wrong things when given unexpected inputs." Try asking Siri to find "restaurants that aren't McDonald's." You won't like the results.
Unless we are talking about very specific tasks, any 6-year-old is infinitely more flexible and general in his or her intelligence than the "smartest" Amazon robot.
Even more important is the sense that, as remarkable as they are, none of the systems we've built understand anything about what they are doing. As philosopher Alva Noë said of Watson's famous Jeopardy! victory, "Watson answered no questions. It participated in no competition. It didn't do anything. All the doing was on our side. We played Jeopardy! with Watson." Considering this fact, some researchers claim that the general intelligence — i.e., the understanding — we humans exhibit may be inseparable from our experiencing. If that's true, then our physical embodiment, enmeshed in a context-rich world, may be difficult if not impossible to capture in symbolic processing systems.
Not the (AI) winter of our discontent
Thus, talk of a new AI winter is popping up again. Given the importance of deep learning and big data in technology, it's hard to imagine funding for these domains drying up any time soon. What we may be seeing, however, is a kind of AI autumn, in which researchers wisely recalibrate their expectations and perhaps rethink their perspectives.