Your Brain Is like This Mardi Gras Party

I was just thinking, which, it turns out, the brain does not like to do. At least not the purposeful, effortful, stay-focused kind. It’s work to think, to pay attention. You literally do pay, in calories, for concentrating, and the brain is a calorie hog, using far more per pound than any other part of the body. And since we evolved back when there were no 7-Elevens or McDonald’s on every corner and the next load of calories was no sure thing, those three pounds of potential cognitive firepower in our skulls have developed ways to keep things mostly on relatively lazy subconscious autopilot, saving serious, focused attention for when it’s really needed.

I was thinking about this in the context of Mardi Gras, and Lent, and how nearly every faith tradition prescribes some version of the same thing: sacrificing, giving something up, doing the hard work of going without…as a way of demonstrating strength of character. On Fat Tuesday we overindulge, because for the next 40 days we’re not supposed to, as a way of being strong (i.e., good), of demonstrating mastery over our material selves…of forcing ourselves to get closer to the spiritual beings that, at our core, we really are. Or at least that’s what Christians and Buddhists and Muslims and Jews and most other belief traditions prescribe in one way or another. Going without forces us to pay attention to the more meaningful side of our lives.

And it occurred to me that what Mardi Gras is to materialist discipline, multitasking is to paying cognitive attention…a way out of doing the work required. Just as it is easier to overeat and overdrink and overindulge our physical pleasures than to do the work of going without, it is also easier to check your email whenever a message comes in, or your Twitter or Facebook feed every five minutes, or text a friend, or play some online game, than to stay focused on that one task or person you’re paying attention to at the moment.

Is that article you’re reading getting kind of long? Just bookmark it and click on to something else…save the work of paying attention for later. Got to a tough spot in that essay you’re writing or project you’re working on, one that needs a little pondering to figure out where to go next? Just hit save and check out your Tweet stream. Is the conversation among your friends losing your interest? Whip out your smartphone and check…whatever. (How many times have you seen that happen? Or done it!?) Why pay attention, if you have an easier way out?

It may well be that we multitask not because the world has become so busy and demands it of us, but because the technologically advanced world offers us an unprecedented universe of possibilities to get out of having to pay attention to the task at hand. We multitask not because we have to but because we CAN, literally seduced by how the brain intrinsically operates, always looking to avoid the heavy lifting/calorically expensive work of focused, purposeful thinking. We overeat, many believe, in large part because food is so readily available, and we have evolved to want the calories, and that imperative easily overpowers the work it takes to discipline our diets. We may well be over-multitasking for similar reasons. The distractions are so readily available, and the brain has evolved to save the calories and slip into lazy autopilot whenever it can, and that imperative easily overpowers the work it takes to discipline our focus.

This has ominous implications, of course (which are being researched by a wide range of scholars). To the extent this is true, our unprecedented world of easy distractions does not bode well for the things we need to pay attention to if we want to do them well. Like, say, making thoughtful decisions. Or thinking critically for ourselves instead of just going along with the crowd. Or being decent friends and family members by paying attention to other people’s feelings and ideas and lives. Or even just learning. There is already research suggesting that we remember less of what we read online, apparently because the brain knows it can just go back and look it up later, so why spend the calories paying attention to it and remembering it now?

This doesn’t bode well for the human ability to explore ideas in any richness or depth. In fact, it doesn’t bode well for the odds that most of those who started this piece are even still here. Google Analytics says the average duration a reader spends on this blog is 47 seconds. Which means the average reader made it roughly to my mention of Fat Tuesday back in the second paragraph. (Granted, a majority of ‘visitors’ are just bots, and they have REALLY short attention spans, dragging the average down.) Congratulations on your mental focus, and thanks for your interest, if you’re still here!

There is a WONDERFUL piece on short attention spans by Farhad Manjoo in Slate, “You Won’t Finish This Article.” Go read it. (When you have the time.) But this issue is about more than the human attention span, which may be shrinking but has always been short. As I said, the brain is trying to slip into calorie-saving autopilot whenever it safely can. More broadly, this is about how carefully we think, and the broad implications for us as individuals, and for society, when we don’t. The more we’re on autopilot and not paying attention, the less we learn, the less we connect, the less careful and thoughtful our choices and behaviors.

It’s like living in a world of constant Mardi Gras, happily pigging out and partying and saving the work of self-discipline for later. Not great for our physical health. We ought to worry what the constant cognitive Mardi Gras of our über connected/multitasking existence is doing to our social and intellectual health, to our decision making, and even literally to our safety. I’m sure serious researchers are looking into this. So sure, in fact, that…it’s time to check the Tweet stream.
