Understanding Creativity: Why Brain Hacks Don't Help
Everyone thinks they know how to make their brain more creative and come up with better ideas.
David Eagleman is a neuroscientist and a New York Times bestselling author. He directs the Laboratory for Perception and Action at the Baylor College of Medicine, where he also directs the Initiative on Neuroscience and Law. He is best known for his work on time perception, brain plasticity, synesthesia, and neurolaw.
Beyond his 100+ academic publications, he has published many popular books. His bestselling book Incognito: The Secret Lives of the Brain explores the neuroscience "under the hood" of the conscious mind: all the aspects of neural function to which we have no awareness or access. His work of fiction, SUM, is an international bestseller published in 28 languages and turned into two operas. Why the Net Matters examines what the advent of the internet means on the timescale of civilizations. The award-winning Wednesday Is Indigo Blue explores the neurological condition of synesthesia, in which the senses are blended.
Eagleman is a TED speaker, a Guggenheim Fellow, a winner of the McGovern Award for Excellence in Biomedical Communication, a Next Generation Texas Fellow, Vice-Chair on the World Economic Forum's Global Agenda Council on Neuroscience & Behaviour, a research fellow in the Institute for Ethics and Emerging Technologies, Chief Scientific Advisor for the Mind Science Foundation, and a board member of The Long Now Foundation. He has served as an academic editor for several scientific journals. He was named Science Educator of the Year by the Society for Neuroscience, and was featured as one of the Brightest Idea Guys by Italy's Style magazine. He is founder of the company BrainCheck and the cofounder of the company NeoSensory. He was the scientific advisor for the television drama Perception, and has been profiled on the Colbert Report, NOVA Science Now, the New Yorker, CNN's Next List, and many other venues. He appears regularly on radio and television to discuss literature and science.
David Eagleman: There are many books out there on creativity, and they're about, "Hey, do this": "take a hot shower," "take a long walk in nature," "be in a pink room," or something.
What my coauthor Anthony Brandt and I really strove to do here was to figure out what is below all of that.
What is the basic cognitive software running in the human brain that takes ideas in, smooshes them up, and crunches them, like a food processor constantly spitting out new ideas?
The key is that humans are really different from one another: for one person a hot shower might work, and for another a cold shower. One person works well in the morning, another at night. One writer does best in a loud coffee shop, while another writes better sitting alone in a quiet office.
So I suspect there's no single piece of advice that's going to apply to everyone, and that's exactly the sort of thing we wanted to avoid.
Instead, we’re trying to understand what is it that’s special about the human brain that allows creativity to happen?
Because when you look at us compared to all the other species on Earth, we have very similar brains. Obviously we're cousins with our nearest neighbors, and all throughout the animal kingdom it's a continuous family tree.
But there's this one species that does these incredible things. We're in New York City right now, and when I flew in here, it was like looking at a motherboard that has risen from the earth. Yet when you fly over a forest, it looks the same as it did a million years ago.
So we’re running around the planet doing something unbelievable. You don’t have squirrels going to the moon or dogs inventing the internet or cows doing theater plays for one another or any of the gazillion things that we do.
We’re doing something really different, and that’s what Anthony and I have really tried to understand.
So I'm a neuroscientist and Anthony is a music composer. We've been good friends for a long time, and we started talking about creativity many years ago. We began realizing there were all these very interesting ways in which our views came together, and so we ended up writing this book together.
What's special about the human brain is that, during the evolution of the cortex, we got a lot more space in between input and output.
So other animals have these much closer together. So when they get some stimulus they make an essentially reflexive response.
In humans, as the cortex expanded there’s a lot more room there, which means that inputs can come in and sort of percolate around and get stored and get thought about, and then maybe you make an output or maybe you don’t.
And there’s one other thing that happened with the expansion of the cortex, which is that we got a much bigger prefrontal cortex. That’s the part right behind the forehead.
And that is what allows us to simulate what ifs, to separate ourselves from our location in space and time and think about possibilities. “What if I did that? What if I had done that? What if I could do that?” And so we do this all the time.
The amazing part is now there are almost eight billion brains running around the planet and as a result creativity—I mean the creativity of our species—has gone up in this amazing way, because there’s so much raw material to draw on and there are so many of us that are constantly saying “well what if this, what if that?”
Most of the things we generate stink. In fact, only some fraction of them even percolate up to consciousness, and of those, most stink. But every once in a while you have one that works, that actually sticks for your society, for your moment in space and time, and you make the next step.
And so what’s happening now is we’ve got this massively bootstrapping society going on around us.
People think that their brain is like an iPhone — if they can just unlock it and press a few things in a certain order, then something is sure to happen. That's just not the case, as neuroscientist David Eagleman tells us. While some swear a cold shower helps them think better, it's simply a matter of personal preference; what works for one might not work for another. David has a great line: "You don't have squirrels going to the moon or dogs inventing the internet or cows doing theater plays for one another or any of the gazillion things that we do." Quite frankly, what gets creativity going the best is actually the most boring: a good diet and regular exercise... but where's the fun (and clickable headline) in that? David's new book is The Runaway Species: How Human Creativity Remakes the World.
Is working from home the ultimate liberation or the first step toward an even unhappier "new normal"?
- The Great Resignation is an idea proposed by Professor Anthony Klotz that predicts a large number of people leaving their jobs after the COVID pandemic ends and life returns to "normal."
- French philosopher Michel Foucault argued that by establishing what is and is not "normal," we are exerting a kind of power by making people behave a certain way.
- If working from home becomes the new normal, we must be careful that it doesn't give way to a new lifestyle that we hate even more than the office.
You wake up, you put on your work clothes, and you go to the office. You sit behind a desk, or in some designated space, and you work until the clock says it's over. This is what life is like for the vast majority of people. That is, until COVID came along. Then, everything changed.
Recently, an interesting idea has emerged called the "Great Resignation." This is a phenomenon that Professor Anthony Klotz of Texas A&M University has predicted will happen when people are asked, or told, to return to their offices. Klotz argues that, when we're all forced back into the old reality of the commute, a nine-to-five job, and cubicle life, there will be a "Great Resignation" among the workforce.
The argument is that in times of uncertainty and insecurity — like during a global pandemic — people behave conservatively. They'll stay put. But once things "normalize" again, we ought to expect employees to head for the exits.
But why? What has changed? Why has working from home made us so dissatisfied with our previously normal lives? Other than the comfort and convenience of working from home, one explanation might involve the concept of "normalization," a topic that fascinated French philosopher Michel Foucault.
The power of normal people
Foucault argued that we often spend an inordinate amount of time trying to be normal. We must dress the same way as everyone else. We must talk about the same things. We must work just like everyone else works. It's hugely important that things are normal. But behind all of this is a power dynamic that many of us are simply unaware of — and unconsciously unhappy about.
Someone, somewhere, must define what is "normal." It is then for the rest of us to bend over backward to fit into this narrow mold. To be powerful, then, is to say, "Do this, otherwise everyone will call you weird." Power is to hold the hoops everyone else must jump through. It's what Foucault describes as "normalizing power."
COVID was a wake-up call to the abnormality of modern work
Let's apply Foucault's normalization concept to the modern workplace. Accepted wisdom had it that the best — and really, the only — way to work was in an office, usually downtown, far away from where we live. We were told this is where collaboration and creativity occur. Largely unchallenged, this "normal" functioned for decades, and we all obeyed.
We had to wake up at the crack of dawn to get ready for work. We had to travel in clogged and joyless commutes. We had to eat ready-packaged lunches behind our too-small desks. We had to sit through meetings in "good posture" ergonomic chairs that wouldn't be out of place in the Spanish Inquisition. Then we had to travel back home in yet another clogged and joyless commute. And we did this day after day after day.
Then COVID came along and revealed just how artificial, unnecessary, and abnormal it all is. It's as if someone ripped a blindfold off of society. We have laptops, wi-fi, and 5G (at least when people aren't burning the towers down). Many of us were just as productive working from home as we were during the "normal" pre-COVID era, if not more so. We don't need to be in an office. We don't need to waste countless hours of our lives sitting in traffic.
Even better, people got to spend more time with their families, enjoy long and restful breaks, and have space to pursue their hobbies. In short, people like not going to an office. And, as Klotz argues, when companies see this dissatisfaction — this Great Resignation — they're going to ask some revolutionary questions, like, "Do you want to come back full time? Work remotely? In-office three days a week? Four days? One day?"
The silver lining to the COVID pandemic is that it has made us re-examine what "normal" is.
Beware the new normal
Of course, the idea of a nine-to-five office job was not established by some moustache-twirling villain just to satisfy his sadistic whims. It came about because people thought that was the most effective and productive way to operate.
People do need direct human contact, and it's often easier and more productive to speak to a colleague next to you or walk across an office to ask for some help. Remote-working software like Zoom is indeed convenient, but can a company honestly say that it's as efficient as working in an office?
What's more, there's a particularly pernicious sting in what Foucault argued. It's something that ought to slow any would-be Great Resignation. This is the idea that there likely will always be some kind of normal.
While COVID has revealed the office for the normalized power play that it is, who's to say what the next "normal" will be? Let's say that working from home becomes the new normal. Will we be expected to attend Zoom meetings at any hour of the day or answer text messages at midnight? Might cameras be used to monitor our every movement? Might software check that we're working at the right pace and in the right way?
While the idea of a Great Resignation is quite appealing right now, we should be careful that the "new normal" isn't even worse.
Linguists discover 30 sounds that may have allowed communication before words existed.
- What did the first person who wanted to speak say?
- New research suggests that there are lots of sounds that everyone understands.
- These sounds may have allowed the first exchanges that gave birth to language.
As hard as it is sometimes to get a conversation started, imagine how difficult it must have been before words existed. Linguists have long wondered how verbal language began. Some form of communication must have been in place to get the whole thing going. Maybe it was gestures.
Now, a new study published in Scientific Reports by linguists at the University of Birmingham (UBir) in the UK and the Leibniz-Centre General Linguistics in Berlin proposes another idea: Verbal communication may have begun at least partly with "iconic" mouth-produced sounds whose meanings were inherently obvious to anyone who heard them. (The researchers use the word "iconic" to mean that these sounds represent things.)
The importance of these sounds may also extend beyond their role as the ultimate conversation starters, says co-author Marcus Perlman of UBir. "Our study fills in a crucial piece of the puzzle of language evolution, suggesting the possibility that all languages — spoken as well as signed — may have iconic origins."
30 iconic sounds
Credit: Alexander Pokusay
The researchers have posted a few of these iconic sounds: "cut," "tiger," "water," and "good." The study reveals that there are a lot more of these sounds than previously appreciated, and likely enough to form a bridge to language development.
Co-author Bodo Winter of UBir explains:
"Our findings challenge the often-cited idea that vocalizations have limited potential for iconic representation, demonstrating that in the absence of words people can use vocalizations to communicate a variety of meanings — serving effectively for cross-cultural communication when people lack a common language."
The researchers compiled a list of 30 iconic-sound candidates that likely would have been of use to the earliest speakers. These included mouth noises that could represent:
- animate beings — "child," "man," "woman," "snake," "tiger," "deer"
- inanimate objects — "fire," "rock," "meat," "water," "knife," "fruit"
- activities — "eat," "sleep," "cut," "cook," "gather," "hunt," "hide"
- descriptors — "good," "bad," "small," "big," "dull," "sharp"
- quantities — "one," "many"
- demonstrative words — "this," "that"
Was "nom, nom" the sound for eating?
Credit: Aleksandra Ćwiek, et al. / Scientific Reports
Making a list — and making noises — is one thing; finding out if anyone understands them is another. The researchers tested out their iconic sounds in two different experiments.
In an online experiment, speakers of 25 different languages were asked to match each iconic sound to one of six written labels. They listened to three performances for each of the 30 candidates, 90 recordings in all.
Participants correctly identified the sounds' meaning roughly 65 percent of the time.
Some meanings were more readily understood than others. "Sleep" was correctly identified by almost 99 percent, as opposed to "that," understood by only 35 percent. The most often understood sounds were "eat," "child," "sleep," "tiger," and "water." The least? "That," "gather," "sharp," "dull," and "knife."
The researchers next conducted field experiments to gauge how meaningful the sounds were in oral cultures with varying levels of literacy. Here, they played twelve iconic sounds for animals and inanimate objects, and listeners identified each from a grid of pictures. The volunteers correctly identified the sounds' meanings about 56 percent of the time, again well above the level of chance.
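For a sense of scale on "above chance": in the online task, guessing at random among the six labels would succeed about one time in six, or roughly 17 percent, so the observed 65 percent is nearly four times chance. If the field experiment's picture grid showed all twelve options at once (an assumption on our part, since the grid size isn't specified above), chance would be about 8 percent against the observed 56 percent.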
The universal roots of language
Beyond their possible role as the sounds that facilitated the birth of language, the study's authors wonder whether such commonly understood sounds may also be a factor in the similarities that exist between modern languages that don't share a common root language. They cite other research that found "vocalizations for 25 different emotions were identifiable across cultures with above-chance accuracy."
"The ability to use iconicity to create universally understandable vocalizations," says Perlman, "may underpin the vast semantic breadth of spoken languages, playing a role similar to representational gestures in the formation of signed languages."
A 19th-century surveying mistake kept lumberjacks away from what is now Minnesota's largest patch of old-growth trees.
- In 1882, Josias R. King made a mess of mapping Coddington Lake, drawing it larger than it actually is.
- For decades, Minnesota loggers left the local trees alone, believing the area to be underwater.
- Today, the area is one of the last remaining patches of old-growth forest in the state.
Vanishingly rare, but it exists: a patch of Minnesota forest untouched by the logger's axe. Credit: Dan Alosso on Substack, licensed under CC BY-SA
The trees here tower a hundred feet above the forest floor — a ceiling as high as in prehistory and vanishingly rare today. That's because no logger's axe has ever touched these woods.
Pillars of the green cathedral
As you walk among the giant pillars of this green cathedral, you might think you're among the redwood trees of California. But those are 1,500 miles (2,500 km) away. No, these are the red and white pines of the "Lost Forty" in Minnesota. This is the largest single surviving patch of old-growth forest in the state and a fair stretch beyond. And it's all thanks to a surveying error.
Despite its name, the Lost Forty Scientific and Natural Area (SNA) is actually 144 acres (0.58 km²) in total. Still, it's an easily overlooked part of the Chippewa National Forest, which sprawls across 666,000 acres (2,700 km²) of north-central Minnesota. And that – being easily overlooked – is kind of this area's superpower.
In the 1820s, when European-Americans arrived in what is now Minnesota, they found about 20 million acres (80,000 km²) of prairie and 30 million acres (120,000 km²) of forest. Two centuries on, both ecosystems have been largely depleted. Fewer than 100,000 acres (400 km²) of natural prairie remain, and fewer than 18 million acres (73,000 km²) of forest.
And today's woods are different. They're not just younger; the original pine stands have been harvested and largely replaced with aspen and birch.
To the moon and back
White pine especially was in heavy demand during the lumbering boom that had Minnesota in its grip by the 1840s — a boom driven by an insatiable demand for building materials and supercharged by the steam that powered the saws and the rails that transported the goods to market.
The two decades flanking the turn of the 20th century were the golden age of lumbering in Minnesota. At any given time, 20,000 lumberjacks were at work in the woods, a further 20,000 in the sawmills, and another 20,000 in other lumber-related industries.
Production peaked in the year 1900, with over 2.3 billion board-feet (5.4 million m³) of lumber harvested from the state's forests. That was enough to build 600,000 two-story houses or a boardwalk nine feet (2.7 m) wide, circling Earth along the equator. From then on, yields declined, albeit slightly at first. By 1910, however, the first lumber operations started packing up and moving on to the Pacific Northwest and elsewhere.
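As a rough check on that boardwalk figure (assuming two-inch-thick planking, which the comparison leaves unstated): a board-foot is a plank one foot square and one inch thick, so each linear foot of a nine-foot-wide, two-inch-thick walk consumes 18 board-feet. At that rate, 2.3 billion board-feet yields about 128 million linear feet, or roughly 24,200 miles, close to the equator's circumference of about 24,900 miles.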
Minnesota's era of Big Timber symbolically came to an end with the closure of the Virginia and Rainy Lake Lumber Company in 1929. At that time, a century's worth of lumbering in Minnesota had produced 68 billion board-feet (160 million m³) of pine — enough to fill a line of boxcars all the way to the moon and halfway back again.
Now spool back a few decades. It's 1882, and the Public Land Survey is measuring, mapping, and quantifying the wilderness of northern Minnesota — and its as yet unharvested north woods. Setting out from the small settlement of Grand Rapids, Josias Redgate King leads a three-man survey team 40 miles north, into the backwoods.
Mapping error becomes cartographic fact
Their job, specifically, is to chart the area between Moose and Coddington Lakes. And they mess up. Perhaps it's the lousy November weather, the desolate swampy terrain, or both. But they make a serious mistake: their survey stretches Coddington Lake half a mile farther northwest than it actually extends. As happens surprisingly often with mapping mistakes, the error becomes cartographic fact, undisputed for decades.
The area is marked on all maps as being under water and is therefore excluded from the considerations of logging companies. Only in 1960 is the area re-surveyed and the error corrected. But by then, as we have seen, Big Timber has moved on from the Gopher State.
Map of the "Lost Forty" SNA (top right). Bordering it on the south is the Chippewa National Forest Unique Biological Area. Credit: Minnesota Department of Natural Resources
Incidentally, Josias R. King was more than the mismapper of Coddington Lake. He has another, and rather better, claim to fame. When the Civil War broke out, Minnesota was the first state to offer volunteers to fight for the Union. At Fort Snelling, Mr. King rushed to the front of a line of men waiting to sign up.
So it was said, with some justification, that he was the first volunteer for the Union in all of the country. During the war, he attained the rank of lieutenant colonel. Afterward, he returned to his civilian job: surveying. Because of his credentials as the Union's first volunteer, he was asked to pose for the face of the bronze soldier on the Civil War monument that was unveiled at St. Paul's Summit Park in 1903.
The loggers' loss is nature's gain
But back to the Lost Forty. The loggers' loss — hence the name — is actually nature's gain. The SNA's crowning glory, literally, is nearly 32 acres of designated old-growth red pine and white pine forest, in two stands, partially extending into the Chippewa National Forest proper. (In fact, much of the mismapped area seems to fall within the Chippewa National Forest Unique Biological Area adjacent to the Lost Forty.) Old-growth forests represent less than 2 percent — and designated old-growth forests less than 0.25 percent — of all of Minnesota's forests.
The oldest pine trees in the Lost Forty are between 300 and 400 years old, close to their maximum natural life span of up to 500 years. Similar pines in other parts of the National Forest are harvested at between 80 and 150 years for pulp and lumber. As a result, the pines in the Lost Forty are not only taller than most of the surrounding woods but also thicker, with diameters between 22 and 48 inches (55 to 122 cm). One of the biggest has a circumference of 115 inches (2.9 m), which works out to a diameter of about 37 inches (93 cm).
With their craggy bark, massive trunks, and dizzying height, these trees look like the ancient beings they are. And they exist in a cluster the size of which is unique for the Midwest. There's nothing lost about these trees; in fact, it's rather the reverse. Perhaps the area should more precisely be called the "Last Forty."
At 52 feet, only half as high as an old-growth white pine: Josias R. King's likeness atop the Soldier's Monument in Summit Park, St. Paul. Credit: Library of Congress
Get a good look at the Lost Forty in this video of the local hiking trail.
Strange Maps #1084
Got a strange map? Let me know at email@example.com.
These Roman Emperors were infamous for their debauchery and cruelty.
- Roman Emperors were known for their excesses and violent behavior.
- From Caligula to Elagabalus, the emperors exercised total power in the service of their often-strange desires.
- Most of these emperors met violent ends themselves.
We rightfully complain about many of our politicians and leaders today, but historically speaking, humanity has seen much worse. Arguably no set of rulers has been as debauched, ingenious in their cruelty, and prone to excess as the Roman Emperors.
While this list is certainly not exhaustive, here are seven Roman rulers who were perhaps the worst of the worst in what was one of the largest empires that ever existed, lasting for over a thousand years.
Caligula
Officially known as Gaius (Gaius Caesar Augustus Germanicus), Caligula was the third Roman Emperor, ruling from 37 to 41 AD. He acquired the nickname "Caligula" (meaning "little [soldier's] boot") from his father's soldiers during a campaign.
While recognized for some positive measures in the early days of his rule, he became famous throughout the ages as an absolutely insane emperor who killed on a whim, spent exorbitantly, was obsessed with perverse sex, and proclaimed himself to be a living god.
Caligula gives his horse Incitatus a drink during a banquet. Credit: An engraving by Persichini from a drawing by Pinelli, from "The History of the Roman Emperors" from Augustus to Constantine, by Jean Baptiste Louis Crevier. 1836.
Among his litany of misdeeds, according to the accounts of Caligula's contemporaries Philo of Alexandria and Seneca the Younger, he slept with whomever he wanted, brazenly taking other men's wives (even on their wedding nights) and publicly talking about it.
He also had an insatiable thirst for blood, killing for mere amusement. Once, as the historian Suetonius reports, when the bridge across the sea at Puteoli was being blessed, he had a number of spectators who were there to inspect it thrown into the water. When some tried to cling to the ships' rudders, Caligula had them dislodged with hooks and oars so they would drown. On another occasion, he got so bored that he had his guards throw a whole section of the audience into the arena during the intermission so they would be eaten by wild beasts. He also allegedly executed two consuls who forgot his birthday.
Suetonius relayed further atrocities of the mad emperor's character, writing that Caligula "frequently had trials by torture held in his presence while he was eating or otherwise enjoying himself; and kept an expert headsman in readiness to decapitate the prisoners brought in from gaol." One particular form of torture associated with Caligula involved having people sawed in half.
He caused mass starvation and purposefully wasted money and resources, making his troops stage fake battles just for theater. If that wasn't enough, he turned his palace into a brothel and was accused of incest with his sisters, Agrippina the Younger, Drusilla, and Livilla, whom he also prostituted to other men. Perhaps most famously, he planned to appoint his favorite horse Incitatus as consul and went so far as to make the horse a priest.
In early 41 AD, Caligula was assassinated by a conspiracy of Praetorian Guard officers, senators, and other members of the court.
Nero
Fully named Nero Claudius Caesar, Nero ruled from 54 to 68 AD and was arguably an even worse madman than his uncle Caligula. He had his stepbrother Britannicus killed, his wife Octavia executed, and his mother Agrippina stabbed and murdered. He personally kicked his wife Poppaea to death while she was pregnant with his child — a horrific act that the Roman historian Tacitus depicted as "a casual outburst of rage."
He spent exorbitantly and built a 100-foot-tall bronze statue of himself called the Colossus Neronis.
He is also remembered for being strangely obsessed with music. He sang and played the lyre, although it's unlikely he really fiddled as Rome burned; that's a popular myth about this crazed tyrant. As misplaced retribution for the fire that burned down a sizable portion of Rome in the year 64, he executed scores of early Christians, some of them outfitted in animal skins and brutalized by dogs, with others burned at the stake.
He died by suicide.
Roman Emperor Nero in the burning ruins of Rome, July 64 AD. Credit: From an original painting by S.J. Ferris. (Photo by Kean Collection / Getty Images)
Commodus
Like some of his counterparts, Commodus (a.k.a. Lucius Aelius Aurelius Commodus) thought he was a god — in his case, a reincarnation of the Greek demigod Hercules. Ruling from 176 to 192 AD, he was also known for his debauched ways and strange stunts that seemed designed to affirm his divine status. Numerous statues around the empire showed him as Hercules, a warrior who fought both men and beasts. He fought hundreds of exotic animals in an arena like a gladiator, confusing and terrifying his subjects. Once, he killed 100 lions in a single day.
Emperor Commodus (Joaquin Phoenix) questions the loyalty of his sister Lucilla (Connie Nielsen) in Dreamworks Pictures' and Universal Pictures' Oscar-winning drama "Gladiator," directed by Ridley Scott. Credit: Photo by Getty Images
His burning desire to slay living creatures as a gladiator at the New Year's Day celebrations of 193 AD brought about his demise. After Commodus shot hundreds of animals with arrows and javelins every morning as part of the Plebeian Games leading up to New Year's, his fitness coach (aptly named Narcissus) choked the emperor to death in his bath.
Elagabalus
Officially named Marcus Aurelius Antoninus II, Elagabalus took his nickname from his priesthood in the cult of the Syrian god Elagabal. Ruling as emperor from 218 to 222 AD, he was so devoted to the cult, which he tried to spread in Rome, that he had himself circumcised to prove his dedication. He further offended the religious sensitivities of his compatriots by essentially replacing the main Roman god Jupiter with Elagabal as the chief deity. In another nod to his convictions, he installed on Palatine Hill a cone-like fetish made of black stone as a symbol of the Syrian sun god Sol Invictus Elagabalus.
His sexual proclivities were also not well received at the time. He was likely transgender (wearing makeup and wigs), had five marriages, and was quite open about his male lovers. According to the Roman historian (and the emperor's contemporary) Cassius Dio, Elagabalus prostituted himself in brothels and taverns and was one of the first historical figures on record to seek sex reassignment surgery.
He was eventually murdered in 222 in an assassination plot engineered by his own grandmother Julia Maesa.
Vitellius
Emperor for just eight months, from April 19th to December 20th of the year 69 AD, Vitellius made some key administrative contributions to the empire but is ultimately remembered as a cruel glutton. He was described by Suetonius as overly fond of eating and drinking, to the point where he would eat at banquets four times a day while sending out the Roman navy to get him rare foods. He also had little social grace, inviting himself over to the houses of different noblemen to eat at their banquets, too.
Vitellius dragged through the streets of Rome. Credit: Georges Rochegrosse, 1883.
He was also quite vicious and reportedly either had his own mother starved to death or approved a poison with which she committed suicide.
Vitellius was ultimately murdered in brutal fashion by supporters of the rival emperor Vespasian, who dragged him through Rome's streets, then likely beheaded him and threw his body into the Tiber River. "Yet I was once your emperor," were supposedly his last words, wrote the historian Cassius Dio.
Caracalla
Marcus Aurelius Antoninus I ruled Rome on his own from 211 to 217 AD, having previously co-ruled with his father Septimius Severus from 198. "Caracalla" was his nickname, referencing a hooded coat from Gaul that he brought into Roman fashion.
He started off his rise to individual power by murdering his younger brother Geta, who had been named co-heir by their father. Caracalla's bloodthirsty tyranny didn't stop there. He wiped out Geta's supporters and was known to execute opponents of his rule, and of Roman rule generally. For instance, he slaughtered up to 20,000 citizens of Alexandria after a local theatrical satire dared to mock him.
Geta Dying in His Mother's Arms. Credit: Jacques Pajou (1766-1828)
One of the positive outcomes of his rule was the Edict of Caracalla, which gave Roman citizenship to all free men in the empire. He was also known for building gigantic baths.
Like others on this list, Caracalla met a brutal end, being assassinated by army officers, including the Praetorian prefect Opellius Macrinus, who installed himself as the next emperor.
Tiberius
As the second emperor, Tiberius (who ruled from 14 to 37 AD) is known for a number of accomplishments, especially his military exploits. He was one of the Roman Empire's most successful generals, conquering Pannonia, Dalmatia, Raetia, and parts of Germania.
He was also remembered by his contemporaries as a rather sullen, perverse, and angry man. In the chapter on his life in The Lives of the Twelve Caesars, the historian Suetonius writes that Tiberius was disliked from an early age for his personality, even by his own family, and that his rhetoric teacher, Theodorus of Gadara, called him "mud kneaded with blood."
"Orgy of the Times of Tiberius on Capri".Painting by Henryk Siemiradzki. 1881.
Suetonius also paints a damning picture of Tiberius after he retreated from public life to the island of Capri. His years on the island would put Jeffrey Epstein to shame. A horrendous pedophile, Tiberius had a reputation for "depravities that one can hardly bear to tell or be told, let alone believe," Suetonius wrote, describing how "in Capri's woods and groves he arranged a number of nooks of venery where boys and girls got up as Pans and nymphs solicited outside bowers and grottoes: people openly called this 'the old goat's garden,' punning on the island's name."
There's much, much more — far too salacious and, frankly, disgusting to repeat here.
After he died, Tiberius was fittingly succeeded as emperor by his grandnephew and adopted grandson, Caligula.