What hasn’t been said about Louis C.K.? The New York Times called him a “comedic Quentin Tarantino.” Writing for the Los Angeles Review of Books, Adam Wilson said he was “television’s most honest man.” GQ suggested that he is the “funniest comic alive.” Then there’s C.K. himself, who in stand-up routines and on his self-produced and self-directed show, Louie, brazenly identifies as “fat,” “pathetic,” and a “frequent masturbator.”
C.K.’s approach is not new. His humor emerges from close dissections of everyday life. Comics call this “observational comedy.” Dave Allen pioneered it in the 1970s and Jerry Seinfeld made a career out of it on Seinfeld. My favorite C.K. example comes from several years ago when he was on Conan making fun of frustrated flyers. “People think there are delays in flying. Delays?” he asks. “New York to California in five hours. That used to take 30 years and a bunch of you would die on the way there… Now you watch a movie and take a dump and you’re home.” A bit about being broke likewise turns the ordinary into the hilarious. “Do you ever get so broke that your bank starts charging you money for not having enough money?” a depressed-sounding C.K. polls the audience. “Negative $10, that’s how much I have now. That means I don’t even have no money… I have to raise $10 just to be broke.”
C.K. curses frequently and talks openly about, among other sexual impulses, masturbation. But his vulgarity isn’t novel either. George Carlin explored taboo topics. Carlin's bit about the “Seven Words You Can Never Say on Television” is still discussed today.
"Shit, Piss, Fuck, Cunt, CockSucker, MotherFucker, and Tits." Those are the heavy seven. Those are the ones that'll infect your soul, curve your spine, and keep the country from winning the war. "Shit, Piss, Fuck, Cunt, CockSucker, MotherFucker, and Tits."
C.K. might be exceptionally risqué with his words. In the span of a few minutes on his 2008 album, Chewed Up, he tells the audience he misses using the word “faggot,” he argues that “cunt” is a beautiful word (he says “It’s chocolatey and round. I don’t use it as an insult… I just like saying it”), and he vents about white newscasters who say “the ‘n’ word” because it “puts the word nigger in the listener’s head without them actually saying it.”
A study published by Robert Lynch of Rutgers University in 2009 helps explain why vulgarity brings out the biggest laughs. Lynch gathered 60 undergrads from Rutgers and had them watch the comedian Bill Burr do a 30-minute routine. Lynch monitored his subjects’ laughter and facial expressions and found that the biggest laughs came from white students during racial jokes (Burr tells a joke about being afraid of black neighborhoods) and male students during gender jokes (Burr jokes that men should make more money than women). Lynch concludes that what’s faux pas is funny because it brings out unconscious, or at least unspoken, beliefs we all hold. In this light, the role of the comedian is to give us an excuse to laugh about what we’re not supposed to laugh about, and that’s the funny part.
One talent viewers overlook is C.K.’s writing. He is the sole writer of Louie and it’s one of the best shows on TV. (It appears on numerous top ten lists and it received an Emmy for Outstanding Writing for a Comedy Series.) There’s a vignette in the second season of Louie that captures his skills well. A group of writers gather around a table to “inject some funny” back into an over-edited script. C.K. awkwardly stands around the table eating a jelly croissant when the head writer reads the first page. It goes something like this: An alarm clock switches from 6:59am to 7am. A veteran cop in his 30s hits the snooze button. “Oh, not another one of these,” the cop says as he goes back to sleep. His dog licks his face. “Come on boy, gimme a break.”
“That’s page one. Any suggestions?” the head writer asks. A variety of attempts come up short. A snob remarks, “Do we really need another movie with the alarm clock close-up and the dog licking the guy? Come on everybody. This is like every bad cop movie I’ve ever seen.” The writers grumble and C.K. steps in. “What if the dog stops the alarm?”
Louie is scattered with gems like these. It’s the small ingenious surprises and plot twists that make C.K.’s writing great. We’re expecting X but we get Y. Many times these surprises are exceptionally sad or serious. Yet it’s the sentimental moments that leave us with new perspectives on life.
Consider the second episode of season one, “Poker/Divorce.” C.K. and a group of friends are playing poker. One turns to a gay friend and asks him what it feels like to have a “dick in the ass.” The gay friend diverges into a description of “City Jerks,” a club where gay men get together to masturbate each other. Another friend responds, “I have to be honest, what you guys do is sick. Not at a political or Bible level either. Just picturing you touching another guy’s dick is gross.” There’s an awkward tension when C.K. steps in with a question, “Is that how you feel about what we do? Do you think vaginas are gross?”
The conversation diverges once again into a discussion about the word “faggot.” The gay friend explains how bundles of sticks – faggots – were used to burn homosexuals at the stake during the Middle Ages. Then he reminds the group that every gay man in America has been called a faggot while he was being beaten up. The mood gets very serious for a moment. “Ok thanks, faggot. We will keep that in mind.” The friends burst into laughter and somehow you’re left with an honest and new perspective on homosexuality. Within the vulgarity was sincerity.
C.K. is a creative genius because he reverse engineers the everyday to exploit its humor. Sometimes this means a conversation amongst friends about gay sex. Other times it means examining in depth one of Carlin’s seven words. What hasn’t been said about Louis C.K.? His creativity is in his penmanship.
Image via Flickr/David_shankbone
There are 5 eras in the universe's lifecycle. Right now, we're in the second era.
Astronomers find these five chapters to be a handy way of conceiving the universe's incredibly long lifespan.
Image based on logarithmic maps of the Universe put together by Princeton University researchers, and images produced by NASA based on observations made by their telescopes and roving spacecraft
- We're in the middle, or thereabouts, of the universe's Stelliferous era.
- If you think there's a lot going on out there now, the first era's drama makes things these days look pretty calm.
- Scientists attempt to understand the past and present by bringing together the last couple of centuries' major schools of thought.
If you're fortunate enough to get yourself beneath a clear sky in a dark place on a moonless night, a gorgeous space-scape of stars waits. If you have binoculars and point them upward, you're treated to a mind-bogglingly dense backdrop of countless specks of light absolutely everywhere, stacked atop each other, burrowing outward and backward through space and time. Such is the universe of the cosmological era in which we live. It's called the Stelliferous era, and there are four others.
The 5 eras of the universe
There are many ways to consider and discuss the past, present, and future of the universe, but one in particular has caught the fancy of many astronomers. First published in 1999 in their book The Five Ages of the Universe: Inside the Physics of Eternity, Fred Adams and Gregory Laughlin divided the universe's life story into five eras:
- Primordial era
- Stelliferous era
- Degenerate era
- Black Hole era
- Dark era
The book was last updated in 2013 to reflect then-current scientific understanding.
It's worth noting that not everyone subscribes to the book's structure. Popular astrophysics writer Ethan Siegel, for example, published an article on Medium last June called "We Have Already Entered The Sixth And Final Era Of Our Universe." Nonetheless, many astronomers find the quintet a useful way of discussing such an extraordinarily vast amount of time.
The Primordial era

Image source: Sagittarius Production/Shutterstock
This is where the universe begins, though what came before it and where it came from are certainly still up for discussion. It begins at the Big Bang about 13.8 billion years ago.
For the first little, and we mean very little, bit of time, spacetime and the laws of physics are thought not yet to have existed. That weird, unknowable interval is the Planck Epoch, which lasted for 10⁻⁴⁴ seconds, or a hundred-millionth of a trillionth of a trillionth of a trillionth of a second. Much of what we currently believe about the Planck Epoch is theoretical, based largely on a hybrid of general-relativity and quantum theories called quantum gravity. And it's all subject to revision.
That having been said, within a second after the Big Bang finished Big Banging, inflation began, a sudden ballooning of the universe into 100 trillion trillion times its original size.
Within minutes, the plasma began cooling, and subatomic particles began to form and stick together. In the 20 minutes after the Big Bang, atoms started forming in the super-hot, fusion-fired universe. Cooling proceeded apace, leaving us with a universe containing roughly 75% hydrogen and 25% helium, similar to what we see in the Sun today. Photons scattered constantly off the free electrons, leaving the universe opaque.
About 380,000 years after the Big Bang, the universe had cooled enough that the first stable atoms capable of surviving began forming. With electrons thus occupied in atoms, photons were released as the background glow that astronomers detect today as cosmic background radiation.
Inflation is inferred from the remarkable overall consistency astronomers measure in the cosmic background radiation. Astronomer Phil Plait suggests that inflation was like pulling on a bedsheet, suddenly pulling the universe's energy smooth. The smaller irregularities that survived eventually enlarged, pooling into denser areas of energy that served as seeds for star formation—their gravity pulled in dark matter and matter that eventually coalesced into the first stars.
The Stelliferous era

Image source: Casey Horner/unsplash
This is the era we know: the age of stars, the active period in which most of the matter in the universe takes the form of stars and galaxies.
A star is formed when a pocket of gas grows denser and denser until it and nearby matter collapse in on themselves, producing enough heat to trigger nuclear fusion in the core, the source of most of the universe's energy now. The first stars were immense, eventually exploding as supernovas and forming many more, smaller stars. These coalesced, thanks to gravity, into galaxies.
One axiom of the Stelliferous era is that the bigger the star, the more quickly it burns through its energy, and then dies, typically in just a couple of million years. Smaller stars that consume energy more slowly stay active longer. In any event, stars — and galaxies — are coming and going all the time in this era, burning out and colliding.
Scientists predict that our Milky Way galaxy, for example, will crash into and combine with the neighboring Andromeda galaxy in about 4 billion years to form a new one astronomers are calling the Milkomeda galaxy.
Our solar system may actually survive that merger, amazingly, but don't get too complacent. About a billion years later, the Sun will start running out of hydrogen and begin enlarging into its red giant phase, eventually subsuming Earth and its companions, before shrinking down to a white dwarf star.
The Degenerate era

Image source: Diego Barucco/Shutterstock/Big Think
Next up is the Degenerate era, which will begin about 1 quintillion years after the Big Bang and last until 1 duodecillion years after it. This is the period during which the remains of the stars we see today will dominate the universe. Were we to look up — we'll assuredly be outta here long before then — we'd see a much darker sky with just a handful of dim pinpoints of light remaining: white dwarfs, brown dwarfs, and neutron stars. These "degenerate stars" are much cooler and emit far less light than what we see up there now. Occasionally, star corpses will pair off into orbital death spirals that end in a brief flash of energy as they collide, and their combined mass may form low-wattage stars that last a little while in cosmic-timescale terms. But mostly the skies will be bereft of light in the visible spectrum.
During this era, small brown dwarfs will wind up holding most of the available hydrogen, and black holes will grow and grow and grow, fed on stellar remains. With so little hydrogen around for the formation of new stars, the universe will grow duller and duller, colder and colder.
And then the protons, having been around since the beginning of the universe, will start dying off, dissolving matter and leaving behind a universe of subatomic particles, unclaimed radiation… and black holes.
The Black Hole era

Image source: Vadim Sadovski/Shutterstock/Big Think
For a considerable length of time, black holes will dominate the universe, pulling in what mass and energy still remain.
Eventually, though, black holes evaporate, albeit super-slowly, leaking small bits of their contents as they do. Plait estimates that a small black hole 50 times the mass of the sun would take about 10⁶⁸ years to dissipate. A massive one? A 1 followed by 92 zeros.
When a black hole finally drips to its last drop, a small pop of light occurs, letting out some of the only remaining energy in the universe. At that point, some 10⁹² years in, the universe will be pretty much history, containing only low-energy, very weak subatomic particles and photons.
The Dark era

Image source: Big Think
We can sum this up pretty easily. Lights out. Forever.
Tonight, if it's clear, maybe you want to step outside, take a nice deep breath, and look up, grateful that we are where we are, and when we are, in spite of all the day's hardships. We've got a serious amount of temporal elbow room here, far more than we need, so not to worry, and those stars aren't going anywhere for a long, long time.
Greed and the philosophy of wealth
When does a healthy desire for wealth morph into greed? And how can we stop it?
- It's common wisdom that most things in life are best in moderation.
- Most of us agree that owning property is okay but are hard-pressed to say why and when it has gone too far.
- Greed dominates your life if the pursuit of wealth is a higher priority than charity, kindness, and solidarity with others.
The great Greek poet Hesiod wrote, "Observe due measure; moderation is best in all things." It's a wisdom that finds support across all ages, stages, and aspects of life. Drinking water is a good thing, but drinking too much is dangerous. A shot of vodka won't kill you, but a gallon probably will. Working hard is good, but burning yourself out is not. Being nice is great, but a sycophant is creepy. Moderation in all things.
But, it's not always easy to determine where that line falls, and a great example of this concerns property and wealth.
Most of us agree that owning things, or at least having the right to own things, is good. It's okay to buy a phone, to own a car, or to have your own clothes. But equally true is that most people feel uneasy about a world which has both billionaires in vast mansions as well as children dying malnourished. Greed, avarice, envy, and venality are considered vices. To be obsessively driven for material things is still, in the main, considered to be either misguided or, at its worst, utterly immoral. So, when does wealth become greed?
John Locke and the philosophy of property

It's hard to pinpoint exactly when humans first called a thing "mine," but the philosophy and law of property is much easier to track. One of the biggest names to consider the issue was the 17th century English philosopher John Locke.
Locke's political philosophy is famously cited as a major influence on the U.S. Declaration of Independence but also fed heavily into the French Revolution and the Great Reform movements of Britain. His work on property is perhaps one of his most important contributions.
Although subject to a fair bit of debate — what isn't in philosophy? — it's generally accepted that Locke adopted a "fair usage" view of property. He argued that one can hold any property that meets the following criteria:
- It can be used before it spoils (e.g., we don't have huge stores of food that just rots).
- It leaves "good and enough" for everyone else (e.g., one person cannot own all the land in a country).
- The property must come from your own work and effort or what he calls "mixing your labor" with that thing (e.g., if you farm a field, the field and its produce become yours).
If we were to follow these rules, it seems hard to envisage a world of greed and inequality. Everyone can have and get what they want, so long as enough is left for everyone else to get what they want, as well.
But there's a lot of ambiguity in these rules, and money rather changes things. Money, especially modern money in the form of digital numbers on a screen, does not spoil. And, thanks to modern banking, there is no limit to the amount of money there could be — a bank can, and does, literally create money each time it gives you a credit card or a loan (although, in practice, regulators place limits on how much money banks can create). So, no matter how many billions someone accumulates, there will always be "good and enough" money for others, too.
(Of course, in practice, constantly creating huge new pools of money will lead to hyperinflation, devaluing the money for everyone. Yet, even if we were to ban all new money creation today, a Lockean could argue that there's more than enough already for a generous distribution around the world.)
So, money changes things for Locke's account. It won't spoil and there will always be at least some money for everyone else. It's even been argued that Locke, far from advocating an equal and distributive philosophy, can easily support rampant capitalist accumulation of wealth. Locke wrote that, because of money, "Now one man could have… a disproportionate and unequal possession of the earth… and fairly possess more land than he himself can use."
It's the philosophy of greed.
Too much greed
The idea that greed is an essential part of being human (or at least an animal) goes back at least to Plato and has a rich philosophical history from there. Today, it often takes the form of evolutionary psychology or genetics, exemplified by Richard Dawkins' The Selfish Gene.
One thinker who has challenged this is Peter Singer. Singer acknowledges the fact that evolution does work on a certain competitiveness, that is, the fittest will pass on their genes. But he also believes that it's wrong to associate this wholly with greed or selfishness. Cooperation and productive relationships are just as vital to survival.
Singer argues that the desire to do good, to work hard, and to succeed are admirable parts of the human condition, but when they are taken to excess, they turn into greed. That line comes when the want of more — particularly, the desire for material wealth — becomes the sole focus of a life. It's when working late or constantly looking for that promotion is prioritized over family, friends, and common human compassion.
The fact is that, in the West, most people have enough. Even poor people generally have TVs, smartphones, and automobiles. The average person in the West lives far better than royalty did for millennia. Singer asks us to get a sense of perspective. We spend more on bottled water than some families in developing countries live off for a day. We're so fixated on our current day-to-day condition, that we lose sight of how much we really have.
Greed über alles
Singer's argument helps us identify the point at which drive and success insidiously morph into greed: It's when we are loath to spend our money and devote all of our waking lives to determinedly accumulating more and more at the expense of our relationships. It's when we think of little else than increasing our experiences and material possessions. This is the point at which greed has come to dominate your life.
But it's also when greed replaces our common sense of compassion. It's when property and wealth become virtues greater than charity, kindness, and solidarity with others. It's when dollar signs and fast cars matter more than people dying in the street. It's when getting a pay raise matters more than someone else getting fired.
Nobody likes to think of themselves as greedy, but if you examine yourself closely, you will probably find some aspects of your life that are at least tainted by greed. We should all check ourselves from time to time.
This programmable fiber has memories and can sense temperature
Researchers were even able to store and read a 767-kilobit full-color short movie file in the fabric.
MIT researchers have created the first fiber with digital capabilities, able to sense, store, analyze, and infer activity after being sewn into a shirt.
Yoel Fink, who is a professor in the departments of materials science and engineering and electrical engineering and computer science, a Research Laboratory of Electronics principal investigator, and the senior author on the study, says digital fibers expand the possibilities for fabrics to uncover the context of hidden patterns in the human body that could be used for physical performance monitoring, medical inference, and early disease detection.
Or, you might someday store your wedding music in the gown you wore on the big day — more on that later.
Fink and his colleagues describe the features of the digital fiber today in Nature Communications. Until now, electronic fibers have been analog — carrying a continuous electrical signal — rather than digital, where discrete bits of information can be encoded and processed in 0s and 1s.
"This work presents the first realization of a fabric with the ability to store and process data digitally, adding a new information content dimension to textiles and allowing fabrics to be programmed literally," Fink says.
MIT PhD student Gabriel Loke and MIT postdoc Tural Khudiyev are the lead authors on the paper. Other co-authors include MIT postdoc Wei Yan; MIT undergraduates Brian Wang, Stephanie Fu, Ioannis Chatziveroglou, Syamantak Payra, Yorai Shaoul, Johnny Fung, and Itamar Chinn; John Joannopoulos, the Francis Wright Davis Chair Professor of Physics and director of the Institute for Soldier Nanotechnologies at MIT; Harrisburg University of Science and Technology master's student Pin-Wen Chou; and Rhode Island School of Design Associate Professor Anna Gitelson-Kahn. The fabric work was facilitated by Professor Anais Missakian, who holds the Pevaroff-Cohn Family Endowed Chair in Textiles at RISD.
Memory and more
The new fiber was created by placing hundreds of square silicon microscale digital chips into a preform that was then used to create a polymer fiber. By precisely controlling the polymer flow, the researchers were able to create a fiber with continuous electrical connection between the chips over a length of tens of meters.
The fiber itself is thin and flexible and can be passed through a needle, sewn into fabrics, and washed at least 10 times without breaking down. According to Loke, "When you put it into a shirt, you can't feel it at all. You wouldn't know it was there."
Making a digital fiber "opens up different areas of opportunities and actually solves some of the problems of functional fibers," he says.
For instance, it offers a way to control individual elements within a fiber, from one point at the fiber's end. "You can think of our fiber as a corridor, and the elements are like rooms, and they each have their own unique digital room numbers," Loke explains. The research team devised a digital addressing method that allows them to "switch on" the functionality of one element without turning on all the elements.
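The corridor-and-rooms idea can be made concrete with a tiny sketch. This is an illustration only, not the team's published protocol: the class and method names here (FiberBus, wake) are invented for the example, which simply shows how a shared line plus unique per-chip addresses lets one element respond while the rest stay off.

```python
# Illustrative sketch only: the names below are invented for this example,
# not taken from the MIT team's actual addressing protocol.

class FiberBus:
    """Model a fiber as a corridor of addressable chips ("rooms")."""

    def __init__(self, n_chips):
        # Every chip sits on one shared electrical line but has a
        # unique address, like a room number along a corridor.
        self.chips = {addr: {"awake": False} for addr in range(n_chips)}

    def wake(self, addr):
        # Broadcast the address down the shared line; only the chip
        # whose address matches responds, so the others stay dormant
        # and draw no extra power.
        for chip_addr, chip in self.chips.items():
            if chip_addr == addr:
                chip["awake"] = True

bus = FiberBus(n_chips=8)
bus.wake(3)
print([a for a, c in bus.chips.items() if c["awake"]])  # [3]
```

The design point the researchers describe is exactly this selectivity: functionality can be switched on element by element instead of powering the whole fiber at once.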
A digital fiber can also store a lot of information in memory. The researchers were able to write, store, and read information on the fiber, including a 767-kilobit full-color short movie file and a 0.48 megabyte music file. The files can be stored for two months without power.
When they were dreaming up "crazy ideas" for the fiber, Loke says, they thought about applications like a wedding gown that would store digital wedding music within the weave of its fabric, or even writing the story of the fiber's creation into its components.
Fink notes that the research at MIT was in close collaboration with the textile department at RISD led by Missakian. Gitelson-Kahn incorporated the digital fibers into a knitted garment sleeve, thus paving the way to creating the first digital garment.

On-body artificial intelligence
The fiber also takes a few steps forward into artificial intelligence by including, within the fiber memory, a neural network of 1,650 connections. After sewing it around the armpit of a shirt, the researchers used the fiber to collect 270 minutes of surface body temperature data from a person wearing the shirt, and analyze how these data corresponded to different physical activities. Trained on these data, the fiber was able to determine with 96 percent accuracy what activity the person wearing it was engaged in.
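The shape of that task can be sketched in a few lines. The paper's actual 1,650-connection neural network is not reproduced here; as a hedged stand-in, this toy uses a nearest-centroid classifier over windows of temperature readings, and all data values and activity labels are made up for illustration.

```python
# Toy illustration of activity classification from body-temperature
# windows. Not the paper's model: a nearest-centroid stand-in with
# invented data, just to show the input/output shape of the task.

def centroid(windows):
    # Average the readings position-by-position across windows.
    n = len(windows[0])
    return [sum(w[i] for w in windows) / len(windows) for i in range(n)]

def train(labeled_windows):
    # labeled_windows maps an activity name to its training windows.
    return {label: centroid(ws) for label, ws in labeled_windows.items()}

def classify(model, window):
    # Pick the activity whose centroid is closest (squared distance).
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda label: dist(model[label], window))

model = train({
    "sitting": [[33.1, 33.2, 33.1], [33.0, 33.1, 33.2]],
    "running": [[36.0, 36.4, 36.8], [35.9, 36.5, 36.7]],
})
print(classify(model, [36.1, 36.3, 36.6]))  # running
```

The point of the fiber result is that a trained model of this general kind lives in the fabric's own memory, so the garment itself can map raw sensor windows to activity labels.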
Adding an AI component to the fiber further increases its possibilities, the researchers say. Fabrics with digital components can collect a lot of information across the body over time, and these "lush data" are perfect for machine learning algorithms, Loke says.
"This type of fabric could give quantity and quality open-source data for extracting out new body patterns that we did not know about before," he says.
With this analytic power, the fibers someday could sense and alert people in real-time to health changes like a respiratory decline or an irregular heartbeat, or deliver muscle activation or heart rate data to athletes during training.
The fiber is controlled by a small external device, so the next step will be to design a new chip as a microcontroller that can be connected within the fiber itself.
"When we can do that, we can call it a fiber computer," Loke says.
This research was supported by the U.S. Army Institute of Soldier Nanotechnologies, National Science Foundation, the U.S. Army Research Office, the MIT Sea Grant, and the Defense Threat Reduction Agency.
Reprinted with permission of MIT News. Read the original article.
No news is good news? Think again
Information economics suggests that "no news" means somebody is hiding something. But people are bad at noticing that.