Naomi Wolf on Third Wave Feminism
Naomi Wolf is an author and essayist whose works have appeared in The New Republic, Wall Street Journal, Glamour, Ms., Esquire, The Washington Post, and The New York Times. She also speaks widely to groups across the country.
Her first book, The Beauty Myth, was an international bestseller. She followed it with Fire With Fire: The New Female Power and How It Will Change The 21st Century; Promiscuities: The Secret Struggle for Womanhood; Misconceptions, a critique of pregnancy and birth in America; The End of America: A Letter of Warning to a Young Patriot; and Give Me Liberty: A Handbook for American Revolutionaries.
Wolf is also co-founder of the Board of The Woodhull Institute for Ethical Leadership, an organization devoted to training young women in ethical leadership for the 21st century. She is a graduate of Yale University and completed her graduate work at New College, Oxford University as a Rhodes Scholar.
Wolf: Well, I guess many people were sort of inventing the term at that time, and I remember kind of making it up; for me, I hadn't heard it before. I was finishing my first book, "The Beauty Myth," and it was kind of an early exercise in "if you build it, they will come" thinking, because at that time everyone was saying feminism is dead, no one's interested, it's over, and in fact there was almost no activity, especially on the part of young women. And second wave feminists were, you know, their moms' age. So I said at the end of the book, you know, there's going to be this third wave of feminism, and I was sort of hoping, well, if I say it, maybe someone will join up, you know? And, of course, other people like Rebecca Walker were also using the phrase at the same time. What's the definition? You know, I don't think it really needs to have a rigorous definition. I've never been one who's a fan of labels. To me it just means women who are substantially younger than second wave feminists. Does that generation of third, and probably a fourth, wave feminists have a different style in their feminism? I think there are some differences, and they're good differences. I think third wave feminism tends to be much more pluralistic about sexuality and personal expression and, you know, fashion choices, and much less dogmatic, which I think is great. I think third wave feminism tends to be more alert than some second wave feminists were to issues of class and race. I think third wave feminists tend to be more engaged, more willing to use power, to use the media or the electoral process or consumer practices for a good outcome, which is great, whereas second wave feminism tended to become puritanical: we can only be really good if we don't touch the marketplace, we don't touch the media, we don't touch, you know, politics as usual.
But apart from that, I mean, what I love about being kind of superannuated, you know, I love being kind of the dinosaur and letting the young ones carry the ball forward. What I love about the third wave and the fourth wave is that it's time for them to make their own mistakes, to create their own theories and, you know, throw up their own leaders, and that's as it should be. It means the movement is alive and vigorous.
Question: Does feminism respect the hijab?
Wolf: I did write a piece in which I said that Westerners should beware of being presumptuous in assuming they know that a hijab means oppression to the woman wearing it. And where did I get that from? I got it from feminists in the Muslim world saying again and again things like, you know, we have much worse problems than this; it's much more urgent that brides are being burned, or that we're facing [forced clitoridectomies], and you Westerners are so preoccupied with the wearing of a head scarf. You know, get a grip. Grow up. And many feminists I've heard from, including young women in Western Europe, and I was just reporting what they said, told me something very intriguing, as the author of "The Beauty Myth": that when they wore the head scarf or modest clothing like hijab, they chose to do so. I'm talking about, for instance, the head of the Oxford Union, this beautiful, brilliant young Muslim woman who could have worn anything she wanted, and she chose to wear a head scarf. She said that for her, wearing a head scarf or more modest clothing made her feel freer in the Western context than wearing Western clothing: freer of objectification, freer of sexual harassment, freer of having to worry all the time about how she looked compared to fashion models. And I thought that was intriguing. And I thought it was intriguing to surface these comments because, you know, we in the West are in such deep conflict with the Muslim world right now, and one of the problems is their presumptions about what our culture means and stereotypes about Western corruption, and our presumptions about what their culture means and stereotypes about Islam.
So I think it's healthy for Westerners to turn a lens on Western practices, to ask whether it's necessarily so free to define freedom in the very reductive way we do in the West: any kind of sexual expression, any kind of purchasing power, any kind of secular practice. I mean, there are other ways of looking at how other people see us, and even if we don't agree with those ways, I do think that it's a very important time to be engaged in an open dialogue with the Muslim world and be open to hearing Muslim women's own interpretations of what the hijab means for them. Are there many other Muslim women who think it's very oppressive? Absolutely, and I remember saying in the piece that that is true, of course.
Naomi Wolf says third wave feminism is far more pluralistic about sexuality and personal expression.
Our love-hate relationship with browser tabs drives all of us crazy. There is a solution.
A lot of ideas that people had about the internet in the 1990s have fallen by the wayside as technology and our usage patterns evolved. Long gone are things like GeoCities, BowieNet, and the belief that letting anybody post whatever they are thinking whenever they want is a fundamentally good idea with no societal repercussions.
While these ideas have been abandoned and the tools that made them possible often replaced by new and improved ones, not every outdated part of our internet experience is gone. A new study by a team at Carnegie Mellon makes the case that the use of tabs in a web browser is one of these outdated concepts that we would do well to get rid of.
How many tabs do you have open right now?
We didn't always have tabs. Introduced in the early 2000s, tabs are now included in all major web browsers, and most users have had access to them for a little over a decade. They've remained pretty much the same since they came out, despite the ever-changing nature of the internet. So, in this new study, researchers interviewed and surveyed 113 people on their use of — and feelings toward — the ubiquitous tabs.
Most people use tabs for the short-term storage of information, particularly if it's information that is needed again soon. Some keep tabs that they know they'll never get around to reading. Others use them as a sort of external memory bank. One participant described this practice to the researchers:
"It's like a manifestation of everything that's on my mind right now. Or the things that should be on my mind right now... So right now, in this browser window, I have a web project that I'm working on. I don't have time to work on it right now, but I know I need to work on it. So it's sitting there reminding me that I need to work on it."
You suffer from tab overload
Unfortunately, trying to use tabs this way can cause a number of problems. A quarter of the interview subjects reported having caused a computer or browser to crash because they had too many tabs open. Others reported feeling flustered by having so many tabs open — a situation called "tab overload" — or feeling ashamed that they appeared disorganized by having so many tabs up at once. More than half of participants reported having problems like this at least two or three times a week.
However, people can become emotionally invested in their tabs. One participant explained, "[E]ven when I'm not using those tabs, I don't want to close them. Maybe it's because it took efforts [sic] to open those tabs and organize them in that way."
So, we have a tool that inefficiently saves web pages that we might visit again while simultaneously reducing our productivity, increasing our anxiety, and crashing our machines. And yet we feel oddly attached to them.
Either the system is crazy or we are.
Skeema: The anti-tab revolution
The researchers concluded that at least part of the problem is caused by tabs not being an ideal way of organizing the work we now do online. They propose a new model that better compartmentalizes tabs by task and subtask, reflects users' mental models, and helps manage the users' attention on what is important right now rather than what might be important later.
To that end, the team also created Skeema, an extension for Google Chrome that treats tabs as tasks and offers a variety of ways to organize them. Users of an early version reported having fewer tabs and windows open at one time and were better able to manage the information they contained.
Tabs were an improvement over having multiple windows open at the same time, but they may have outlived their usefulness. While it might take a paradigm shift to fully replace the concept, the study suggests that taking a different approach to tabs might be worth trying.
And now, excuse me, while I close some of the 87 tabs I currently have open.
Seek pleasure and avoid pain. Why make it more complicated?
- The Epicureans were some of the world's first materialists and argued that there is neither God, nor gods, nor spirits, but only atoms and the physical world.
- They believed that life was about finding pleasure and avoiding pain and that both were achieved by minimizing our desires for things.
- The Epicurean Four Step Remedy is advice on how we can face the world, achieve happiness, and not worry as much as we do.
Self-help books are consistently on the best-seller lists across the world. We can't seem to get enough of happiness advice, wellness gurus, and life coaches. But, as the Book of Ecclesiastes says, there is nothing new under the sun. The Ancient Greeks were into the self-help business millennia before the likes of Dale Carnegie and Mark Manson.
Four schools of ancient Greek philosophy
From the 3rd century BCE until the birth of Jesus, Greek philosophy was locked into an ideological war. Four rival schools emerged, each proclaiming loudly that they — alone — had the secret to a happy and fulfilled life. These schools were: Stoicism, Cynicism, Skepticism, and Epicureanism. Each had their advocates and even had a kind of PR battle to get people to sign up to their side. They were trying to sell happiness.
Many of us are familiar with Stoicism, a topic I covered recently, because it forms the foundation of cognitive behavioral therapy. Skepticism and Cynicism have become watered down or warped variations of their original forms. (I will cover these in future articles.) Today, we focus on the most underappreciated of these schools, the Epicureans. In their philosophy, we can find a surprisingly modern and easy-to-follow "Four Part Remedy" to life.
Epicureans: The first atheists
The Epicureans were some of history's first materialists. They believed that the world was made up only of atoms (and void), and that everything is simply a particular composition of these atoms. There were no gods, spirits, or souls (or, at most, they're irrelevant to the world as we encounter it). They thought that there was no afterlife or immortality to be had, either. Death is just a relocation of atoms. This atheism and materialism was what the Christian Church would later come to despise, and after centuries of being villainized by priests, popes, and church doctrine, the Epicureans fell out of fashion.
In the atomistic, worldly philosophy of the Epicureans, all there is to life is to get as much pleasure as you can and avoid pain. This isn't a call to become some rampant hedonist, staggering from opium dens to brothels; it concerns the higher pleasures of the mind.
Epicurus, himself, believed that pleasure was defined as the satisfying of a desire, such as when we drink a glass of water when we're really thirsty. But, he also argued that desires themselves were painful since they, by definition, meant longing and anguish. Thirst is a desire, and we don't like being thirsty. True contentment, then, could not come from creating and indulging pointless wants but must instead come from minimizing desire altogether. What would be the point of setting ourselves new targets? These are just new desires that we must make efforts to satisfy. Thus, minimizing pain meant minimizing desires, and the bare minimum desires were those required to live.
The Four Part Remedy
Given that Epicureans were determined to maximize pleasure and minimize pain, they developed a series of rituals and routines designed to help. One of the best known (not least because we've lost so much written by the Epicureans) was the so-called "Four Part Remedy." These were four principles they believed we ought to accept so that we might find solace and be rid of existential and spiritual pain:
1. Don't fear God. Remember, everything is just atoms. You won't go to hell, and you won't go to heaven. The "afterlife" will be nothingness, in just the same way as when you had no awareness whatsoever of the dinosaurs or Cleopatra. There was simply nothing before you existed, and death is a great expanse of the same timeless, painless void.
2. Don't worry about death. This is a natural corollary of Step 1. With no body, there is no pain. In death, we lose all of our desires and, along with them, suffering and discontent. It's striking how similar in tone this sounds to a lot of Eastern, especially Buddhist, philosophy at the time.
3. What is good is easy to get. Pleasure comes in satisfying desires, specifically the basic, biological desires required to keep us alive. Anything more complicated than this, or harder to achieve, just creates pain. There's water to be drunk, food to be eaten, and beds to sleep in. That's all you need.
4. What is terrible is easy to endure. Even if it is difficult to satisfy the basic necessities, remember that pain is short-lived. We're rarely hungry for long, and sicknesses most often will be cured easily enough (and this was written 2300 years before antibiotics). All other pains often can be mitigated by pleasures to be had. If basic biological necessities can't be met, then you die — but we already established there is nothing to fear from death.
Epicurus's guide to living is noticeably different from a lot of modern self-help books in just how little day-to-day advice it gives. It doesn't tell us "the five things you need to do before breakfast" or "visit these ten places, and you'll never be sad again." Just like its rival school, Stoicism, Epicureanism is all about a psychological shift of some kind.
Namely, that psychological shift is about recognizing that life doesn't need to be as complicated as we make it. At the end of the day, we're just animals with basic needs. We have the tools necessary to satisfy our desires, but when we don't, we have huge reservoirs of strength and resilience capable of enduring it all. Failing that, we still have nothing to fear because there is nothing to fear about death. When we're alive, death is nowhere near; when we're dead, we won't care.
Practical, modern, and straightforward, Epicurus offers a valuable insight to life. It's existential comfort for the materialists and atheists. It's happiness in four lines.
The idea of 'absolute time' is an illusion. Physics and subjective experience reveal why.
- Since Einstein posited his theory of general relativity, we've understood that gravity has the power to warp space and time.
- This "time dilation" effect occurs even at small levels.
- Outside of physics, we experience distortions in how we perceive time — sometimes to a startling extent.
Place one clock at the top of a mountain. Place another on the beach. Eventually, you'll see that each clock tells a different time. Why? Time moves slower as you get closer to Earth, because, as Einstein posited in his theory of general relativity, the gravity of a large mass, like Earth, warps the space and time around it.
Scientists first observed this "time dilation" effect on the cosmic scale, such as when a star passes near a black hole. Then, in 2010, researchers observed the same effect on a much smaller scale, using two extremely precise atomic clocks, one placed 33 centimeters higher than the other. Again, time moved slower for the clock closer to Earth.
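The size of that effect can be estimated with the standard weak-field approximation, in which two clocks separated by a height h run at rates differing by a fraction of roughly gh/c². A minimal back-of-the-envelope check in Python (the 33 cm figure is from the experiment described above; the formula is the textbook approximation, not a detail taken from that study):

```python
# Weak-field gravitational time dilation: fractional rate difference ~ g*h/c^2
g = 9.81          # Earth's surface gravity, m/s^2
c = 299_792_458   # speed of light, m/s
h = 0.33          # height difference between the two atomic clocks, m

fractional_shift = g * h / c**2
print(f"{fractional_shift:.1e}")  # ~3.6e-17
```

A shift of a few parts in 10^17 is why only state-of-the-art atomic clocks could detect it: over an entire human lifetime, the two clocks would disagree by well under a millisecond.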
The differences were tiny, but the implications were massive: absolute time does not exist. For each clock in the world, and for each of us, time passes slightly differently. But even if time is passing at ever-fluctuating speeds throughout the universe, time is still passing in some kind of objective sense, right? Maybe not.
Physics without time
In his book "The Order of Time," Italian theoretical physicist Carlo Rovelli suggests that our perception of time — our sense that time is forever flowing forward — could be a highly subjective projection. After all, when you look at reality on the smallest scale (using equations of quantum gravity, at least), time vanishes.
"If I observe the microscopic state of things," writes Rovelli, "then the difference between past and future vanishes … in the elementary grammar of things, there is no distinction between 'cause' and 'effect.'"
So, why do we perceive time as flowing forward? Rovelli notes that, although time disappears on extremely small scales, we still obviously perceive events occurring sequentially in reality. In other words, we observe entropy: order changing into disorder, as when an egg cracks and gets scrambled.
Rovelli says key aspects of time are described by the second law of thermodynamics, which states that heat always passes from hot to cold. This is a one-way street. For example, an ice cube melts into a hot cup of tea, never the reverse. Rovelli suggests a similar phenomenon might explain why we're only able to perceive the past and not the future.
"Any time the future is definitely distinguishable from the past, there is something like heat involved," Rovelli wrote for the Financial Times. "Thermodynamics traces the direction of time to something called the 'low entropy of the past', a still mysterious phenomenon on which discussions rage."
"Entropy growth orients time and permits the existence of traces of the past, and these permit the possibility of memories, which hold together our sense of identity. I suspect that what we call the "flowing" of time has to be understood by studying the structure of our brain rather than by studying physics: evolution has shaped our brain into a machine that feeds off memory in order to anticipate the future. This is what we are listening to when we listen to the passing of time. Understanding the "flowing" of time is therefore something that may pertain to neuroscience more than to fundamental physics. Searching for the explanation of the feeling of flow in physics might be a mistake."
Scientists still have much to learn about how we perceive time, and why time operates differently depending on the scale. But what's certain is that, outside of the realm of physics, our individual perception of time is also surprisingly elastic.
The strange subjectivity of time
Time moves differently atop a mountain than it does on a beach. But you don't need to travel any distance at all to experience strange distortions in your perception of time. In moments of life-or-death fear, for example, your brain releases large amounts of adrenaline, which speeds up your internal clock, causing you to perceive the outside world as moving slowly.
Another common distortion occurs when we focus our attention in particular ways.
"If you're thinking about how time is currently passing by, the biggest factor influencing your time perception is attention," Aaron Sackett, associate professor of marketing at the University of St. Thomas, told Gizmodo. "The more attention you give to the passage of time, the slower it tends to go. As you become distracted from time's passing—perhaps by something interesting happening nearby, or a good daydreaming session—you're more likely to lose track of time, giving you the feeling that it's slipping by more quickly than before. "Time flies when you're having fun," they say, but really, it's more like "time flies when you're thinking about other things." That's why time will also often fly by when you're definitely not having fun—like when you're having a heated argument or are terrified about an upcoming presentation."
One of the most mysterious ways people experience time-perception distortions is through psychedelic drugs. In an interview with The Guardian, Rovelli described a time he experimented with LSD.
"It was an extraordinarily strong experience that touched me also intellectually," he said. "Among the strange phenomena was the sense of time stopping. Things were happening in my mind but the clock was not going ahead; the flow of time was not passing any more. It was a total subversion of the structure of reality."
It seems few scientists or philosophers believe time is completely an illusion.
"What we call time is a rich, stratified concept; it has many layers," Rovelli told Physics Today. "Some of time's layers apply only at limited scales within limited domains. This does not make them illusions."
What is an illusion is the idea that time flows at an absolute rate. The river of time might be flowing forever forward, but it moves at different speeds between people, and even within your own mind.
- The history of AI shows boom periods (AI summers) followed by busts (AI winters).
- The cyclical nature of AI funding is due to hype and promises not fulfilling expectations.
- This time, we might enter something resembling an AI autumn rather than an AI winter, but fundamental questions remain about whether true AI is even possible.
The dream of building a machine that can think like a human stretches back to the origins of electronic computers. But ever since research into artificial intelligence (AI) began in earnest after World War II, the field has gone through a series of boom and bust cycles called "AI summers" and "AI winters."
Each cycle begins with optimistic claims that a fully, generally intelligent machine is just a decade or so away. Funding pours in and progress seems swift. Then, a decade or so later, progress stalls and funding dries up. Over the last ten years, we've clearly been in an AI summer as vast improvements in computing power and new techniques like deep learning have led to remarkable advances. But now, as we enter the third decade of the 21st century, some who follow AI feel the cold winds at their back leading them to ask, "Is Winter Coming?" If so, what went wrong this time?
A brief history of AI
To see if the winds of winter are really coming for AI, it is useful to look at the field's history. The first real summer can be pegged to 1956 and the famous Dartmouth College workshop where one of the field's pioneers, John McCarthy, coined the term "artificial intelligence." The workshop was attended by scientists like Marvin Minsky and H. A. Simon, whose names would go on to become synonymous with the field. For those researchers, the task ahead was clear: capture the processes of human reasoning through the manipulation of symbolic systems (i.e., computer programs).
Throughout the 1960s, progress seemed to come swiftly as researchers developed computer systems that could play chess, deduce mathematical theorems, and even engage in simple discussions with a person. Government funding flowed generously. Optimism was so high that, in 1970, Minsky famously proclaimed, "In three to eight years we will have a machine with the general intelligence of a human being."
By the mid 1970s, however, it was clear that Minsky's optimism was unwarranted. Progress stalled as many of the innovations of the previous decade proved too narrow in their applicability, seeming more like toys than steps toward a general version of artificial intelligence. Funding dried up so completely that researchers soon took pains not to refer to their work as AI, as the term carried a stink that killed proposals.
The cycle repeated itself in the 1980s with the rise of expert systems and the renewed interest in what we now call neural networks (i.e., programs based on connectivity architectures that mimic neurons in the brain). Once again, there was wild optimism and big increases in funding. What was novel in this cycle was the addition of significant private funding as more companies began to rely on computers as essential components of their business. But, once again, the big promises were never realized, and funding dried up again.
AI: Hype vs. reality
The AI summer we're currently experiencing began sometime in the first decade of the new millennium. Vast increases in both computing speed and storage ushered in the era of deep learning and big data. Deep learning methods use stacked layers of neural networks that pass information to each other to solve complex problems like facial recognition. Big data provides these systems with vast oceans of examples (like images of faces) to train on. The applications of this progress are all around us: Google Maps gives you near-perfect directions; you can talk with Siri anytime you want; IBM's Watson beat Jeopardy!'s greatest human champions.
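To make "stacked layers of neural networks that pass information to each other" concrete, here is a toy forward pass in Python with NumPy. Everything here (the layer sizes, the random weights, the ReLU activation) is an arbitrary illustrative choice, not a description of any real production system:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Nonlinearity applied between layers; without it, stacked layers
    # would collapse into a single linear transformation.
    return np.maximum(0, x)

# Three stacked fully connected layers: input -> hidden -> hidden -> output.
layer_sizes = [8, 16, 16, 4]
weights = [rng.standard_normal((m, n)) * 0.1
           for m, n in zip(layer_sizes, layer_sizes[1:])]

def forward(x):
    # Each layer transforms its input and hands the result to the next,
    # which is the "passing information" described in the text.
    for w in weights[:-1]:
        x = relu(x @ w)
    return x @ weights[-1]   # final linear layer produces raw scores

x = rng.standard_normal(8)   # stand-in for, e.g., extracted image features
scores = forward(x)
print(scores.shape)          # (4,)
```

Real deep learning systems differ mainly in scale (millions of layers' worth of parameters, trained on those "vast oceans of examples" by gradient descent), but the layered structure is the same.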
In response, the hype rose again. True AI, we were told, must be just around the corner. In 2015, for example, The Guardian reported that self-driving cars, the killer app of modern AI, were close at hand. Readers were told, "By 2020 you will become a permanent backseat driver." And just two years ago, Elon Musk claimed that by 2020 "we'd have over a million cars with full self-driving software."
By now, it's obvious that a world of fully self-driving cars is still years away. Likewise, in spite of the remarkable progress we've made in machine learning, we're still far from creating systems that possess general intelligence. The emphasis is on the term general because that's what AI really has been promising all these years: a machine that's flexible in dealing with any situation as it comes up. Instead, what researchers have found is that, despite all their remarkable progress, the systems they've built remain brittle, which is a technical term meaning "they do very wrong things when given unexpected inputs." Try asking Siri to find "restaurants that aren't McDonald's." You won't like the results.
Unless we are talking about very specific tasks, any 6-year-old is infinitely more flexible and general in his or her intelligence than the "smartest" Amazon robot.
Even more important is the sense that, as remarkable as they are, none of the systems we've built understand anything about what they are doing. As philosopher Alva Noë said of Watson's famous Jeopardy! victory, "Watson answered no questions. It participated in no competition. It didn't do anything. All the doing was on our side. We played Jeopardy! with Watson." Considering this fact, some researchers claim that the general intelligence — i.e., the understanding — we humans exhibit may be inseparable from our experiencing. If that's true, then our physical embodiment, enmeshed in a context-rich world, may be difficult if not impossible to capture in symbolic processing systems.
Not the (AI) winter of our discontent
Thus, talk of a new AI winter is popping up again. Given the importance of deep learning and big data in technology, it's hard to imagine funding for these domains drying up any time soon. What we may be seeing, however, is a kind of AI autumn when researchers wisely recalibrate their expectations and perhaps rethink their perspectives.