Why is democracy so difficult? Could be because it demands that each of us accept, as the anthropologist Clifford Geertz once said to me, "that other people are as real as you are." But other people's opinions are so wrong! Surely they're deluded, deceived, bought off, stupid, neurotic or perhaps merely insane. Their access to the truth must be less than ours. The alternative is admitting that our own certainty may be as overstated as theirs. That just doesn't come naturally, not even to those who imagine themselves enlightened. Case in point: this study, just out in the journal Nature Climate Change.
Dan Kahan and his co-authors compared 1,540 Americans' views on the risks of global warming with their scientific literacy and ability to reason logically and mathematically. Result: Higher scientific literacy and reasoning skill correlated with lower levels of concern about climate change.
The effect wasn't large, but according to the standard catechism of environmentalism, it should not exist at all. That catechism holds that concern about global warming rises as people are exposed to "the facts" (and, therefore, that lack of concern about the climate is a product of enemy propaganda, lack of education, or some sort of neurotic reaction).
In other words, the study reminds those concerned about climate change that other people are as real as they are—that those who deprecate or deny global warming are not necessarily working with an inferior set of mental tools, nor with bad information. Nor are we who disagree with them superior beings who have a greater ability to overcome the mind's built-in biases. Therefore, bombarding the other side with scientific facts will not change their minds.
Along with their climate-related questions, Kahan and his co-authors also assessed how their volunteers came down on the value of equality versus the value of hierarchy in society (by asking how much they agreed or disagreed with statements like "We need to markedly reduce inequalities between the rich and the poor, whites and people of color, and men and women") and how they saw the balance between community and individuals (in their reactions to statements like "Government should put limits on the choices individuals can make so they do not get in the way of what is good for society").
Americans tend to clump into two groups on this measure, one hierarchical-individualist (leave people alone and respect authority) and the other egalitarian-communitarian (reduce inequality and look out for the good of society). And it turned out that this measure of values was a much stronger predictor of concern about global warming than was scientific literacy or reasoning skill. Egalitarian-communitarians were far more worried about global warming, and within their group a better score on the science competence tests correlated with slightly greater concern. But among the hierarchical-individualists, there was a stronger link between scientific literacy and less concern. That was what drove the overall group result. (Hierarchical-individualists were also a great deal less concerned about nuclear power than were egalitarian-communitarians.)
Now, these results are a problem for the Enlightenment-era, rationalist model of politics, in which people weigh arguments according to standards of logic and evidence. In real life, people generally do that only when they have to—when, for example, it's required by their jobs.
For those who have to deal with it professionally, after all, climate change isn't in dispute. Agriculture experts, epidemiologists, disaster preparedness teams, civil engineers, military planners and the like can no more deny the state of the climate than an astronaut could believe in a Flat Earth. It's a part of their jobs, and, as NASA's Gavin Schmidt puts it, "gases don’t care whether you are a Republican or a Democrat – left wing, right wing – libertarian, or conservative." Why aren't the rest of us like the pros?
Here, Kahan et al. propose that the answer stems from the fact that climate change isn't part of our jobs. In fact, for billions of us non-specialists, our understanding of climate change has little immediate, practical impact. If you stop taking airplanes and otherwise reduce your carbon footprint, you will, of course, be helping to reduce the impact of greenhouse gases. But if you really understand the science, you understand that your effect will be absurdly small, until and unless many others join you.
So scientists and their allies proselytize. All well and good, except that people who have banded together to change the world send a social signal. We are the people who believe in global warming, this is what we are like, and how we talk, and how we behave. That signal is much more emotionally compelling, and more consequential in day-to-day life, than imagery of a drowned world sometime in the lifetime of one's grandchildren.
In other words, while gases don't care if you're a Democrat or Republican, people sure as hell do. An opinion about global warming is one of the flags we fly to show that we're down with our fellow Tea Partiers (or fellow members of the NRDC). Unless you're required to face reality (maybe you're planning the system that will deal with massive storm surges in a future New York or London), that flag-flying is much more motivating than geophysical facts. So you engage in what Kahan has called "protective cognition" to prevent science from driving a wedge between you and your peers.
Such, anyway, is the explanation Kahan et al. offer for their data. The new study's findings, its authors write, are evidence of how "remarkably well-equipped ordinary individuals are to discern which stances towards scientific information secure their personal interests."
Now, this could have been presented in the familiar tone of one-sided self-congratulation (here is why they are so stupid). That's an occupational hazard of what I call post-rational research: the tendency to see these sorts of results as an explanation for why other people don't do the right thing. But Kahan has noticed that taking this work seriously means realizing that we are all subject to biases and sometimes flawed rules of thumb. If you take democracy seriously, then you have to recognize that science isn't going to tell you why other people are idiots while you are right. Instead, it is going to tell you why we are all idiots together, and give you the tools to deal with that fact.
We needn't accept every damn fool argument that comes down the road, but we do need to accept that we're all inclined to protect damn fool arguments that are associated with our identities. Environmentalists who spend their time trying to figure out why they are morally, intellectually or scientifically superior to their opponents are, themselves, using climate change as a tribal marker of identity. Such people are likely—just like their opponents—to reject science that doesn't fit their received opinions.
The paper therefore implies a truly post-rational vision of politics—not a battle of ideas and interests in which all players keep an accurate score, but rather a kind of theater in which our emotional selves display solidarity with our chosen teams, and reason supplies the justifications for what we would do anyway.
That sounds like a despairing vision, I guess, if you're committed to the traditional view of politics. But I think this is a hopeful study, because it suggests a way to conduct politics that aligns better with human nature than did the Enlightenment model. Here, for instance, Kahan suggests some practical strategies that, to my eye, amount to filleting the cultural markers out of a scientific argument. If you want to persuade a hierarchical-individualist that climate change must be reckoned with, he suggests, mention that geoengineering and nuclear power could be part of the solution. If you want an egalitarian-communitarian to look kindly on nanotech, mention that it could be used to mitigate environmental damage. The point, I think, is to keep each argument bound to its terms, and avoid letting them become bundled into cultural nets. That requires self-control on all sides, as "protective cognition" is always tempting us.
Kahan, D., Peters, E., Wittlin, M., Slovic, P., Ouellette, L., Braman, D., & Mandel, G. (2012). The polarizing impact of science literacy and numeracy on perceived climate change risks. Nature Climate Change. DOI: 10.1038/NCLIMATE1547
There are 5 eras in the universe's lifecycle. Right now, we're in the second era.
Astronomers find these five chapters to be a handy way of conceiving the universe's incredibly long lifespan.
Image based on logarithmic maps of the Universe put together by Princeton University researchers, and images produced by NASA based on observations made by their telescopes and roving spacecraft
- We're in the middle, or thereabouts, of the universe's Stelliferous era.
- If you think there's a lot going on out there now, the first era's drama makes things these days look pretty calm.
- Scientists attempt to understand the past and present by bringing together the last couple of centuries' major schools of thought.
If you're fortunate enough to get yourself beneath a clear sky in a dark place on a moonless night, a gorgeous space-scape of stars awaits. If you have binoculars and point them upward, you're treated to a mind-bogglingly dense backdrop of countless specks of light absolutely everywhere, stacked atop each other, burrowing outward and backward through space and time. Such is the universe of the cosmological era in which we live. It's called the Stelliferous era, and there are four others.
The 5 eras of the universe
There are many ways to consider and discuss the past, present, and future of the universe, but one in particular has caught the fancy of many astronomers. First published in 1999 in their book The Five Ages of the Universe: Inside the Physics of Eternity, Fred Adams and Gregory Laughlin divided the universe's life story into five eras:
- Primordial era
- Stelliferous era
- Degenerate era
- Black Hole era
- Dark era
The book was last updated according to current scientific understandings in 2013.
It's worth noting that not everyone subscribes to the book's structure. Popular astrophysics writer Ethan C. Siegel, for example, published an article on Medium last June called "We Have Already Entered The Sixth And Final Era Of Our Universe." Nonetheless, many astronomers find the quintet a useful way of discussing such an extraordinarily vast stretch of time.
The Primordial era

Image source: Sagittarius Production/Shutterstock
This is where the universe begins, though what came before it and where it came from are certainly still up for discussion. It begins at the Big Bang about 13.8 billion years ago.
For the first little, and we mean very little, bit of time, spacetime and the laws of physics are thought not yet to have existed. That weird, unknowable interval is the Planck Epoch, which lasted 10⁻⁴⁴ seconds, or a hundred-millionth of a trillionth of a trillionth of a trillionth of a second. Much of what we currently believe about the Planck Epoch is theoretical, based largely on a hybrid of general relativity and quantum theory called quantum gravity. And it's all subject to revision.
That having been said, within a second after the Big Bang finished Big Banging, inflation began: a sudden ballooning of the universe to 100 trillion trillion times its original size.
Within minutes, the plasma began cooling, and subatomic particles began to form and stick together. In the 20 minutes after the Big Bang, atoms started forming in the super-hot, fusion-fired universe. Cooling proceeded apace, leaving a universe of roughly 75% hydrogen and 25% helium, similar to what we see in the Sun today. Photons scattered constantly off the free electrons, leaving the universe opaque.
About 380,000 years after the Big Bang, the universe had cooled enough that the first stable atoms capable of surviving began forming. With electrons thus occupied in atoms, photons were released as the background glow that astronomers detect today as cosmic background radiation.
Evidence for inflation comes from the remarkable overall consistency astronomers measure in the cosmic background radiation. Astronomer Phil Plait suggests that inflation was like pulling on a bedsheet, suddenly smoothing out the universe's energy. The smaller irregularities that survived eventually enlarged, pooling into denser areas of energy that served as seeds for star formation: their gravity pulled in dark matter and ordinary matter that eventually coalesced into the first stars.
The Stelliferous era

Image source: Casey Horner/unsplash
This is the era we know: the age of stars, in which most of the matter in the universe takes the form of stars and galaxies.
A star forms when a pocket of gas grows denser and denser until it, along with nearby matter, collapses in on itself, producing enough heat to trigger nuclear fusion in its core, the source of most of the universe's energy now. The first stars were immense, eventually exploding as supernovas and seeding the formation of many more, smaller stars. These coalesced, thanks to gravity, into galaxies.
One axiom of the Stelliferous era is that the bigger the star, the more quickly it burns through its energy, and then dies, typically in just a couple of million years. Smaller stars that consume energy more slowly stay active longer. In any event, stars — and galaxies — are coming and going all the time in this era, burning out and colliding.
Scientists predict that our Milky Way galaxy, for example, will crash into and combine with the neighboring Andromeda galaxy in about 4 billion years to form a new one astronomers are calling the Milkomeda galaxy.
Our solar system may actually survive that merger, amazingly, but don't get too complacent. About a billion years later, the Sun will start running out of hydrogen and begin enlarging into its red giant phase, eventually subsuming Earth and its companions, before shrinking down to a white dwarf star.
The Degenerate era

Image source: Diego Barucco/Shutterstock/Big Think
Next up is the Degenerate era, which will begin about 1 quintillion (10¹⁸) years after the Big Bang and last until about 1 duodecillion (10³⁹) years after it. This is the period during which the remains of the stars we see today will dominate the universe. Were we to look up — we'll assuredly be outta here long before then — we'd see a much darker sky with just a handful of dim pinpoints of light remaining: white dwarfs, brown dwarfs, and neutron stars. These "degenerate stars" are much cooler and less light-emitting than what we see up there now. Occasionally, star corpses will pair off into orbital death spirals that result in a brief flash of energy as they collide, and their combined mass may become low-wattage stars that last a little while in cosmic-timescale terms. But mostly the skies will be bereft of light in the visible spectrum.
During this era, small brown dwarfs will wind up holding most of the available hydrogen, and black holes will grow and grow and grow, fed on stellar remains. With so little hydrogen around for the formation of new stars, the universe will grow duller and duller, colder and colder.
And then the protons, having been around since the beginning of the universe, will start dying off, dissolving matter and leaving behind a universe of subatomic particles, unclaimed radiation… and black holes.
The Black Hole era

Image source: Vadim Sadovski/Shutterstock/Big Think
For a considerable length of time, black holes will dominate the universe, pulling in what mass and energy still remain.
Eventually, though, black holes evaporate, albeit super-slowly, leaking small bits of their contents as they do. Plait estimates that a small black hole 50 times the mass of the Sun would take about 10⁶⁸ years to dissipate. A massive one? A 1 followed by 92 zeros.
When a black hole finally drips to its last drop, a small pop of light occurs, letting out some of the only remaining energy in the universe. At that point, some 10⁹² years in, the universe will be pretty much history, containing only low-energy, very weak subatomic particles and photons.
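For a sense of where such unimaginable numbers come from, the textbook Hawking evaporation time grows with the cube of a black hole's mass. The sketch below uses the standard formula with physical constants in SI units; this is general physics background, not the article's own calculation, and the exact figures quoted by different writers vary with their assumptions, so treat the outputs as order-of-magnitude illustrations.

```python
import math

# Textbook Hawking evaporation time: t = 5120 * pi * G^2 * M^3 / (hbar * c^4).
# Constants in SI units (standard physics values, not from the article).
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
HBAR = 1.055e-34     # reduced Planck constant, J s
C = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
SECONDS_PER_YEAR = 3.156e7

def evaporation_time_years(mass_kg: float) -> float:
    """Return the Hawking evaporation time, in years, for a black hole."""
    t_seconds = 5120 * math.pi * G**2 * mass_kg**3 / (HBAR * C**4)
    return t_seconds / SECONDS_PER_YEAR

# The cubic dependence on mass is the key point: a black hole 50 times
# heavier takes 50^3 = 125,000 times longer to evaporate.
t_one_sun = evaporation_time_years(M_SUN)
t_fifty_suns = evaporation_time_years(50 * M_SUN)
print(f"1 solar mass:    ~1e{math.floor(math.log10(t_one_sun))} years")
print(f"50 solar masses: ~1e{math.floor(math.log10(t_fifty_suns))} years")
```

Even the "fast" case dwarfs the current age of the universe (about 1.38 × 10¹⁰ years) by dozens of orders of magnitude.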
The Dark Era

Image source: Big Think
We can sum this up pretty easily. Lights out. Forever.
Tonight, if it's clear, maybe you want to step outside, take a nice deep breath, and look up, grateful that we are where we are, and when we are, in spite of all the day's hardships. We've got a serious amount of temporal elbow room here, far more than we need, so not to worry, and those stars aren't going anywhere for a long, long time.
Greed and the philosophy of wealth
When does a healthy desire for wealth morph into greed? And how can we stop it?
- It's common wisdom that most things in life are best in moderation.
- Most of us agree that owning property is okay but are hard-pressed to say why and when it has gone too far.
- Greed dominates your life if the pursuit of wealth is a higher priority than charity, kindness, and solidarity with others.
The great Greek poet Hesiod wrote, "Observe due measure; moderation is best in all things." It's a wisdom that finds support across all ages, stages, and aspects of life. Drinking water is a good thing, but drinking too much is dangerous. A shot of vodka won't kill you, but a gallon probably will. Working hard is good, but burning yourself out is not. Being nice is great, but a sycophant is creepy. Moderation in all things.
But, it's not always easy to determine where that line falls, and a great example of this concerns property and wealth.
Most of us agree that owning things, or at least having the right to own things, is good. It's okay to buy a phone, to own a car, or to have your own clothes. But equally true is that most people feel uneasy about a world that has both billionaires in vast mansions and children dying malnourished. Greed, avarice, envy, and venality are considered vices. To be obsessively driven for material things is still, in the main, considered to be either misguided or, at its worst, utterly immoral. So, when does wealth become greed?
John Locke and the philosophy of property

It's hard to pinpoint exactly when humans first called a thing "mine," but the philosophy and law of property are much easier to track. One of the biggest names to consider the issue was the 17th-century English philosopher John Locke.
Locke's political philosophy is famously cited as a major influence on the U.S. Declaration of Independence but also fed heavily into the French Revolution and the Great Reform movements of Britain. His work on property is perhaps one of his most important contributions.
Although subject to a fair bit of debate — what isn't in philosophy? — it's generally accepted that Locke adopted a "fair usage" view of property. He argued that one can hold any property that meets the following criteria:
- It can be used before it spoils (e.g., we don't have huge stores of food that just rots).
- It leaves "good and enough" for everyone else (e.g., one person cannot own all the land in a country).
- The property must come from your own work and effort or what he calls "mixing your labor" with that thing (e.g., if you farm a field, the field and its produce become yours).
If we were to follow these rules, it seems hard to envisage a world of greed and inequality. Everyone can have and get what they want, so long as enough is left for everyone else to get what they want, as well.
But, there's a lot of ambiguity in these rules, and money rather changes things. Money, especially modern money in the form of digital numbers on a screen, does not spoil. And, thanks to modern banking, there is no hard limit to the amount of money there could be — a bank can, and does, literally create money each time it gives you a credit card or a loan (although, in practice, most countries place limits on how much money banks can create). So, no matter how many billions someone creates, there will always be "good and enough" money for others, too.
(Of course, in practice, constantly creating huge new pools of money will lead to hyperinflation, devaluing the money for everyone. Yet, even if we were to ban all new money creation today, a Lockean could argue that there's more than enough already for a generous distribution around the world.)
So, money changes things for Locke's account. It won't spoil and there will always be at least some money for everyone else. It's even been argued that Locke, far from advocating an equal and distributive philosophy, can easily support rampant capitalist accumulation of wealth. Locke wrote that, because of money, "Now one man could have… a disproportionate and unequal possession of the earth… and fairly possess more land than he himself can use."
It's the philosophy of greed.
Too much greed
The idea that greed is an essential part of being human (or at least an animal) goes back at least to Plato and has a rich philosophical history from there. Today, it often takes the form of evolutionary psychology or genetics, exemplified by Richard Dawkins' The Selfish Gene.
One thinker who has challenged this is Peter Singer. Singer acknowledges the fact that evolution does work on a certain competitiveness, that is, the fittest will pass on their genes. But he also believes that it's wrong to associate this wholly with greed or selfishness. Cooperation and productive relationships are just as vital to survival.
Singer argues that the desire to do good, to work hard, and to succeed are admirable parts of the human condition, but when they are taken to excess, they turn into greed. That line comes when the want of more — particularly, the desire for material wealth — becomes the sole focus of a life. It's when working late or constantly looking for that promotion is prioritized over family, friends, and common human compassion.
The fact is that, in the West, most people have enough. Even poor people generally have TVs, smartphones, and automobiles. The average person in the West lives far better than royalty did for millennia. Singer asks us to get a sense of perspective. We spend more on bottled water than some families in developing countries live on for a day. We're so fixated on our current day-to-day condition that we lose sight of how much we really have.
Greed über alles
Singer's argument helps us identify the point at which drive and success insidiously morph into greed: It's when we are loath to spend our money and devote all of our waking lives to determinedly accumulating more and more at the expense of our relationships. It's when we think of little else than increasing our experiences and material possessions. This is the point at which greed has come to dominate your life.
But it's also when greed replaces our common sense of compassion. It's when property and wealth become virtues greater than charity, kindness, and solidarity with others. It's when dollar signs and fast cars matter more than people dying in the street. It's when getting a pay raise matters more than someone else getting fired.
Nobody likes to think of themselves as greedy, but if you examine yourself closely, you will probably find some aspects of your life that are at least tainted by greed. We should all check ourselves from time to time.
This programmable fiber has memories and can sense temperature
Researchers were even able to store and read a 767-kilobit full-color short movie file in the fabric.
MIT researchers have created the first fiber with digital capabilities, able to sense, store, analyze, and infer activity after being sewn into a shirt.
Yoel Fink, who is a professor in the departments of materials science and engineering and electrical engineering and computer science, a Research Laboratory of Electronics principal investigator, and the senior author on the study, says digital fibers expand the possibilities for fabrics to uncover the context of hidden patterns in the human body that could be used for physical performance monitoring, medical inference, and early disease detection.
Or, you might someday store your wedding music in the gown you wore on the big day — more on that later.
Fink and his colleagues describe the features of the digital fiber today in Nature Communications. Until now, electronic fibers have been analog — carrying a continuous electrical signal — rather than digital, where discrete bits of information can be encoded and processed in 0s and 1s.
"This work presents the first realization of a fabric with the ability to store and process data digitally, adding a new information content dimension to textiles and allowing fabrics to be programmed literally," Fink says.
MIT PhD student Gabriel Loke and MIT postdoc Tural Khudiyev are the lead authors on the paper. Other co-authors include MIT postdoc Wei Yan; MIT undergraduates Brian Wang, Stephanie Fu, Ioannis Chatziveroglou, Syamantak Payra, Yorai Shaoul, Johnny Fung, and Itamar Chinn; John Joannopoulos, the Francis Wright Davis Chair Professor of Physics and director of the Institute for Soldier Nanotechnologies at MIT; Harrisburg University of Science and Technology master's student Pin-Wen Chou; and Rhode Island School of Design Associate Professor Anna Gitelson-Kahn. The fabric work was facilitated by Professor Anais Missakian, who holds the Pevaroff-Cohn Family Endowed Chair in Textiles at RISD.
Memory and more
The new fiber was created by placing hundreds of square silicon microscale digital chips into a preform that was then used to create a polymer fiber. By precisely controlling the polymer flow, the researchers were able to create a fiber with continuous electrical connection between the chips over a length of tens of meters.
The fiber itself is thin and flexible and can be passed through a needle, sewn into fabrics, and washed at least 10 times without breaking down. According to Loke, "When you put it into a shirt, you can't feel it at all. You wouldn't know it was there."
Making a digital fiber "opens up different areas of opportunities and actually solves some of the problems of functional fibers," he says.
For instance, it offers a way to control individual elements within a fiber, from one point at the fiber's end. "You can think of our fiber as a corridor, and the elements are like rooms, and they each have their own unique digital room numbers," Loke explains. The research team devised a digital addressing method that allows them to "switch on" the functionality of one element without turning on all the elements.
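Loke's corridor-and-rooms analogy can be sketched in code. Everything below — the class names, the message format, the idea of matching on a broadcast address — is an illustrative assumption for the sake of the analogy, not the actual addressing protocol from the paper:

```python
# Hypothetical sketch of per-element digital addressing along a fiber.
# Every element sees the same message on the shared line (the "corridor"),
# but only the element whose address (its "room number") matches responds.
from dataclasses import dataclass, field

@dataclass
class FiberElement:
    address: int            # unique "room number" along the fiber
    active: bool = False

@dataclass
class DigitalFiber:
    elements: dict = field(default_factory=dict)

    def add_element(self, address: int) -> None:
        self.elements[address] = FiberElement(address)

    def broadcast(self, target_address: int, turn_on: bool) -> None:
        # Broadcast reaches every element; only the match changes state.
        for element in self.elements.values():
            if element.address == target_address:
                element.active = turn_on

fiber = DigitalFiber()
for addr in range(8):
    fiber.add_element(addr)

fiber.broadcast(target_address=3, turn_on=True)
print([e.active for e in fiber.elements.values()])
# only element 3 is switched on
```

The design point the analogy captures is that one shared communication line can control many elements individually, so the fiber needs only a single electrical pathway rather than one wire per chip.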
A digital fiber can also store a lot of information in memory. The researchers were able to write, store, and read information on the fiber, including a 767-kilobit full-color short movie file and a 0.48 megabyte music file. The files can be stored for two months without power.
When they were dreaming up "crazy ideas" for the fiber, Loke says, they thought about applications like a wedding gown that would store digital wedding music within the weave of its fabric, or even writing the story of the fiber's creation into its components.
Fink notes that the research at MIT was in close collaboration with the textile department at RISD led by Missakian. Gitelson-Kahn incorporated the digital fibers into a knitted garment sleeve, thus paving the way to creating the first digital garment.

On-body artificial intelligence
The fiber also takes a few steps forward into artificial intelligence by including, within the fiber memory, a neural network of 1,650 connections. After sewing it around the armpit of a shirt, the researchers used the fiber to collect 270 minutes of surface body temperature data from a person wearing the shirt, and analyze how these data corresponded to different physical activities. Trained on these data, the fiber was able to determine with 96 percent accuracy what activity the person wearing it was engaged in.
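As a rough illustration of the kind of computation involved — not the paper's actual model — here is a toy neural-network classifier trained on synthetic temperature windows. The activity labels, temperature shifts, and network size are all invented for the example; the real fiber's 1,650-connection network, its data, and its training setup are more sophisticated:

```python
import numpy as np

# Toy sketch: a tiny neural network trained on windows of skin-temperature
# readings to guess the wearer's activity. Data are synthetic.
rng = np.random.default_rng(0)

def make_windows(mean_temp, n, length=10):
    """Synthetic 10-reading temperature windows around an activity mean."""
    return mean_temp + 0.3 * rng.standard_normal((n, length))

# Pretend three activities shift skin temperature by about a degree each.
X = np.vstack([make_windows(33.0, 100),   # sitting
               make_windows(34.0, 100),   # walking
               make_windows(35.0, 100)])  # running
y = np.repeat([0, 1, 2], 100)
X = (X - X.mean()) / X.std()              # standardize the readings

# One hidden layer, trained with plain batch gradient descent.
W1 = 0.1 * rng.standard_normal((10, 16)); b1 = np.zeros(16)
W2 = 0.1 * rng.standard_normal((16, 3));  b2 = np.zeros(3)

def forward(X):
    h = np.tanh(X @ W1 + b1)              # hidden activations
    logits = h @ W2 + b2
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    return h, p / p.sum(axis=1, keepdims=True)   # softmax probabilities

onehot = np.eye(3)[y]
for _ in range(2000):
    h, p = forward(X)
    grad_logits = (p - onehot) / len(X)          # softmax cross-entropy grad
    grad_h = grad_logits @ W2.T * (1 - h**2)     # backprop through tanh
    W2 -= 0.5 * h.T @ grad_logits; b2 -= 0.5 * grad_logits.sum(axis=0)
    W1 -= 0.5 * X.T @ grad_h;      b1 -= 0.5 * grad_h.sum(axis=0)

_, p = forward(X)
accuracy = (p.argmax(axis=1) == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

Because each activity shifts the window's average temperature, even this minimal network separates the classes easily; the hard part in practice is real-world data, where the signal is noisier and the patterns subtler.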
Adding an AI component to the fiber further increases its possibilities, the researchers say. Fabrics with digital components can collect a lot of information across the body over time, and these "lush data" are perfect for machine learning algorithms, Loke says.
"This type of fabric could give quantity and quality open-source data for extracting out new body patterns that we did not know about before," he says.
With this analytic power, the fibers someday could sense and alert people in real-time to health changes like a respiratory decline or an irregular heartbeat, or deliver muscle activation or heart rate data to athletes during training.
The fiber is controlled by a small external device, so the next step will be to design a new chip as a microcontroller that can be connected within the fiber itself.
"When we can do that, we can call it a fiber computer," Loke says.
This research was supported by the U.S. Army Institute of Soldier Nanotechnologies, National Science Foundation, the U.S. Army Research Office, the MIT Sea Grant, and the Defense Threat Reduction Agency.
Reprinted with permission of MIT News. Read the original article.
No news is good news? Think again
Information economics suggests that "no news" means somebody is hiding something. But people are bad at noticing that.