Princeton: Stylish men are perceived as 'significantly more competent'
As much as we say it's not about the clothes, it's still about the clothes.
- A new study from Princeton University shows that perception of competence is linked to dress.
- Researchers found the same result across nine separate studies: men who dress better are viewed as more competent.
- Even when told clothing is not a measure of competence, judges ruled in favor of the better-dressed men.
It would be nice to believe that we judge others not based on how they dress, but on their personal character. While such an idea sounds commendable, it's not true, says a new study out of Princeton University.
Regardless of how much we like to think ourselves above judging based on the clothing others wrap themselves in, such decisions are made much quicker than conscious awareness allows—as little as 130 milliseconds. The paper, which includes research from nine separate studies, confirms the long-held sentiment that "the clothes make the man."
Numerous studies have investigated how we judge faces. Speculation about the role of facial symmetry, large eyes, and prominent cheekbones has been around for decades. For this study — published in Nature Human Behaviour on Dec. 9 — the researchers flashed photos of 50 men so quickly that facial recognition proved nearly impossible. Noticing the torso and, more importantly, what it was garbed in remained possible, however.
The team of DongWon Oh, Eldar Shafir, and Alexander Todorov writes:
"Faces were shown with different upper-body clothing rated by independent judges as looking 'richer' or 'poorer', although not notably perceived as such when explicitly described. The same face when seen with 'richer' clothes was judged significantly more competent than with 'poorer' clothes."
Before beginning, the researchers asked a separate panel of judges to ensure that the clothing they chose was not overtly wealthy or representative of extreme poverty. Those partaking in the study were told to judge the competence of the faces they saw on a scale from one to nine based on "gut feeling."
In a few of the nine studies, clothing was explicitly mentioned, though the judges were told not to consider what the people were wearing. In one, the researchers claimed that there was no relationship between clothing and competence.
It didn't matter. In every single study, those rocking finer threads were deemed more competent, even when the same man was shown in different tops. If he was wearing nice clothes, he was deemed more competent than when wearing a t-shirt.
These perceptions influence who we vote for and where we spend our money. Oh believes this is especially important in the age of widespread income inequality.
"Wealth inequality has worsened since the late 1980s in the United States. Now the gap between the top 1 percent and the middle class is over 1,000,000 percent, a mind-numbing figure. Other labs' work has shown people are sensitive to how rich or poor other individuals appear. Our work found that people are susceptible to these cues when judging others on meaningful traits, like competence, and that these cues are hard, if not impossible, to ignore."
The researchers hope that making people aware of their implicit bias toward clothing will translate into better judgments of an individual's character. As we learn over and over again, fine clothing does not necessarily represent an ethical or honest person. It just means the wearer can afford it.
Yet this lesson will be hard to instill. Clothing has long been a marker of social status. Shoes once separated those who could afford them from those who had to wear sandals or go barefoot. At root, humans are extremely shallow: we'd rather show off what we can afford than help others achieve the same status.
Shafir goes so far as to suggest that uniformity might be the way to go.
"A potential, even if highly insufficient, interim solution may be to avoid exposure whenever possible. Just like teachers sometimes grade blindly so as to avoid favoring some students, interviewers and employers may want to take what measures they can, when they can, to evaluate people, say, on paper so as to circumvent indefensible yet hard to avoid competency judgments. Academic departments, for example, have long known that hiring without interviews can yield better scholars. It's also an excellent argument for school uniforms."
That will be a hard sell in an individualistic culture such as America's, where outward appearance too often trumps inner character. As long as that's the case, we have to recognize ourselves for what we are: judgmental creatures focused on exterior presentation. Inner work takes practice, but the benefits — namely, not being conned or fixated on fleeting materialism — are worth it.
A Harvard professor's study discovers the worst year to be alive.
- Harvard professor Michael McCormick argues the worst year to be alive was 536 AD.
- The year was terrible due to cataclysmic eruptions that blocked out the sun and the spread of the plague.
- 536 ushered in the coldest decade in thousands of years and started a century of economic devastation.
The past year has been among the worst in the lives of many people around the globe, bringing a rampaging pandemic, dangerous political instability, weather catastrophes, and a profound change in lifestyle that most have never experienced or imagined.
But was it the worst year ever?
Nope. Not even close. In the eyes of the historian and archaeologist Michael McCormick, the absolute "worst year to be alive" was 536.
Why was 536 so bad? You could certainly argue that 1918, the last year of World War I when the Spanish Flu killed up to 100 million people around the world, was a terrible year by all accounts. 1349 could also be considered on this morbid list as the year when the Black Death wiped out half of Europe, with up to 20 million dead from the plague. Most of the years of World War II could probably lay claim to the "worst year" title as well. But 536 was in a category of its own, argues the historian.
It all began with an eruption...
According to McCormick, Professor of Medieval History at Harvard University, 536 was the precursor year to one of the worst periods of human history. It featured a volcanic eruption early in the year that took place in Iceland, as established by a study of a Swiss glacier carried out by McCormick and the glaciologist Paul Mayewski from the Climate Change Institute of The University of Maine (UM) in Orono.
The ash spewed out by the volcano likely led to a fog that brought an 18-month stretch of daytime darkness across Europe, the Middle East, and portions of Asia. As the Byzantine historian Procopius wrote, "For the sun gave forth its light without brightness, like the moon, during the whole year." He also recounted that it looked as if the sun were always in eclipse.
Cassiodorus, a Roman politician of that time, wrote that the sun had a "bluish" color, the moon had no luster, and "seasons seem to be all jumbled up together." What's even creepier, he described, "We marvel to see no shadows of our bodies at noon."
...that led to famine...
The dark days also brought a period of cold, with summer temperatures falling by 1.5°C to 2.5°C. This began the coldest decade in the past 2,300 years, reports Science, leading to the devastation of crops and worldwide hunger.
...and the fall of an empire
In 541, the bubonic plague added considerably to the world's misery. Spreading from the Roman port of Pelusium in Egypt, the so-called Plague of Justinian caused the deaths of up to one half of the population of the eastern Roman Empire. This, in turn, sped up its eventual collapse, writes McCormick.
Between the environmental cataclysms, with massive volcanic eruptions also in 540 and 547, and the devastation brought on by the plague, Europe was in for an economic downturn for nearly all of the next century, until 640 when silver mining gave it a boost.
Was that the worst time in history?
Of course, the absolute worst time in history depends on who you were and where you lived.
Native Americans can easily point to 1520, when smallpox, brought over by the Spanish, killed millions of indigenous people. By 1600, up to 90 percent of the population of the Americas (about 55 million people) was wiped out by various European pathogens.
Like all things, the grisly title of "worst year ever" comes down to historical perspective.
A machine learning system lets visitors at a Kandinsky exhibition hear the artwork.
Have you ever heard colors?
As part of a new exhibition, the worlds of culture and technology collide, bringing sound to the colors of abstract art pioneer Wassily Kandinsky.
Kandinsky had synesthesia, where looking at colors and shapes causes some with the condition to hear associated sounds. With the help of machine learning, virtual visitors to the Sounds Like Kandinsky exhibition, a partnership project by Centre Pompidou in Paris and Google Arts & Culture, can have an aural experience of his art.
An eye for music
Kandinsky's synesthesia is thought to have heavily influenced his painting. Seeing yellow summoned up trumpets, evoking emotions like cheekiness; reds produced violins, portraying restlessness; and blues he associated with organs, representing heavenliness, according to the exhibition notes.
Virtual visitors are invited to take part in an experiment called Play a Kandinsky, which allows them to see and hear the world through the artist's eyes.
Kandinsky's synesthesia is thought to have heavily influenced his 1925 painting Yellow, Red, Blue. (Image: Guillaume Piolle/Wikimedia Commons)
In 1925, the artist's masterpiece, "Yellow, Red, Blue", broke new ground in the world of abstract art, guiding the viewer from left to right with shifting shapes and shades. Almost a century after it was painted, Google's interactive tool lets visitors click different parts of the artwork to journey through the artist's description of the colors, associated sounds and moods that inspired the work.
But Google's new toy is not the only tool developed to enhance the artistic experience.
Artist Neil Harbisson has developed an artificial way to emulate Kandinsky by turning colors into sounds. He has a rare form of color blindness and sees the world in greyscale. But a smart antenna attached to his head translates dominant colors into musical notes, creating a real-world soundtrack of what's in front of him. The invention could open up a new world for people who are color blind.
A new study suggests that private prisons hold prisoners longer, eroding the cost savings that private prisons are supposed to provide over public ones.
- Private prisons in Mississippi tend to hold prisoners 90 days longer than public ones.
- The extra days eat up half of the expected cost savings of a private prison.
- The study leaves several open questions, such as what effect these extra days have on recidivism rates.
The United States of America, land of the free, is home to 5 percent of the world's population but 25 percent of its prisoners. The cost of having so many people in the penal system adds up to $80 billion per year, more than three times the budget for NASA. This massive system exploded in size relatively recently, with the prison population increasing six-fold in the last four decades.
Ten percent of these prisoners are kept in private prisons, which are owned and operated for the sake of profit by contractors. In theory, these operations cost less than public prisons and jails, and states can save money by contracting them to incarcerate people. They have a long history in the United States and are used in many other countries as well.
However, despite the pervasiveness of private contractors in the American prison system, there is not much research into how well they live up to their promise to provide similar services at a lower cost to the state. The little research that is available often encounters difficulties in trying to compare the costs and benefits of facilities with vastly different operations and occasionally produces results suggesting there are few benefits to privatization.
A new study by Dr. Anita Mukherjee and published in the American Economic Journal: Economic Policy joins the debate with a robust consideration of the costs and benefits of private prisons. Its findings suggest that some private prisons keep people incarcerated longer and save less money than advertised.
The study focuses on prisons in Mississippi. Despite its comparatively high rate of incarceration, Mississippi's prison system is very similar to that of other states that also use private prisons. Demographically, its system is representative of the rest of the U.S. prison system, and its inmates are sentenced for similar amounts of time.
The state attempts to get the most out of its privatization efforts, as a 1994 law requires all contracts for private prisons in Mississippi to provide at least a 10 percent cost savings over public prisons while providing similar services. As a result, the state seeks to maximize its savings by sending prisoners to private institutions first if space is available.
While public and private prisons in Mississippi are quite similar, there are a few differences that allow for the possibility of cost savings by private operators — not the least of which is that the guards are paid 30 percent less and have fewer benefits than their publicly employed counterparts.
The results of privatization
The graph depicts the likelihood of release for public (dotted line) vs. private (solid line) prison inmates. At every level of time served, public prisoners were more likely to be released than private prisoners. (Graph: Dr. Anita Mukherjee)
The study relied on administrative records of the Mississippi prison system between 1996 and 2013. The data included information on prisoner demographics, the crimes committed, sentence lengths, time served, infractions while incarcerated, and prisoner relocation while in the system, including between public and private jails. For this study, the sample examined was limited to those serving between one and six years and those who served at least a quarter of their sentence. This created a primary sample of 26,563 bookings.
Analysis revealed that prisoners in private prisons were behind bars for four to seven percent longer than those in public prisons, which translates to roughly 85 to 90 extra days per prisoner. This is, in part, because those in private prison serve a greater portion of their sentences (73 percent) than those in public institutions (70 percent).
This in turn might be due to the much higher infraction rate in private prisons compared to public ones. While only 18 percent of prisoners in a public prison commit an infraction, such as disobeying a guard or possessing contraband, the number jumps to 46 percent in a private prison. Infractions can reduce the probability of early release or cause time to be added to a sentence.
It's unclear why there are so many more infractions in private prisons. Dr. Mukherjee suggests it could be the result of "harsher prison conditions in private prisons," better monitoring techniques, incentives to report more of them to the state before contract renewals, or even a lackadaisical attitude on the part of public prison employees.
What does all this cost Mississippi?
The extra time served eats 48 percent of the cost savings of keeping prisoners in a private facility. For example, it costs about $135,000 to house a prisoner in a private prison for three years and $150,000 in the public system. But longer stays in private prisons reduce the savings from $15,000 to only $7,800.
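The arithmetic behind those figures is easy to verify. This minimal sketch uses the approximate three-year costs quoted above to reproduce the $7,800 figure:

```python
# Rough check of the savings arithmetic quoted above (illustrative figures only).
private_cost = 135_000  # approx. cost of housing one prisoner privately for three years
public_cost = 150_000   # approx. cost of the same stay in the public system

nominal_savings = public_cost - private_cost  # $15,000 on paper
fraction_eaten = 0.48                         # extra days served consume ~48% of savings
actual_savings = nominal_savings * (1 - fraction_eaten)

print(nominal_savings)        # 15000
print(round(actual_savings))  # 7800
```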
As Dr. Mukherjee remarks, this cost is purely financial. Some things are harder to measure:
"There are, of course, other costs that are difficult to quantify — e.g., the cost of injustice to society (if private prison inmates systematically serve more time), the inmate's individual value of freedom, and impacts of the additional incarceration on future employment. Abrams and Rohlfs (2011) estimates a prisoner's value of freedom for 90 days at about $1,100 using experimental variation in bail setting. Mueller-Smith (2017) estimates that 90 days of marginal incarceration costs about $15,000 in reduced wages and increased reliance on welfare. If these social costs were to exceed $7,800 in the example stated, private prisons would no longer offer a bargain in terms of welfare-adjusted cost savings."
It is possible that the extra time in jail provides benefits that counter these costs, such as a reduced recidivism rate, but this proved difficult to determine. Though it was not statistically significant, there was some evidence that the added time actually increased the rate of recidivism. If that's true, then private prisons could be counterproductive.