Why American universities are the best in the world
American universities used to be small centers of rote learning, but three big ideas turned them into intellectual powerhouses.
- American universities used to be small denominational schools with little research output.
- Competition among schools in the late 19th century drove many of them to innovate.
- Today, America has many top universities and the lion's share of Nobel Prize winners.
List after list confirms it. The United States, by far, has the best and most prestigious universities in the world.
But it wasn't always this way, and there was no guarantee of this outcome. According to a new essay by W. Bentley MacLeod and Miguel Urquiola, published in the Journal of Economic Perspectives, a series of innovations at American universities, combined with ample funding, accidentally created a system that valued research, promoted talent sorting, and provided the cash to fund bigger and better schools.
The three big ideas: Sorting, performance review, tenure
According to the authors, you would hardly have recognized any of the original colleges in the United States. A school might have a hundred students and perhaps five poorly paid professors, each teaching several disparate classes at once. The curriculum was limited and excluded subjects like business or engineering. Most students, who could be as young as 14, learned by rote. Schools were set up by denomination, with most students choosing somewhere close to home that matched their particular stance on Christianity. Research efforts were minimal.
It wasn't until after the Civil War that things began to change. Professors were hired for their expertise, schools began to specialize, and students started to pay less attention to the denomination of the school they wanted to attend. The number of colleges exploded, and things that worked in one were often taken up elsewhere.
The authors propose that this turnaround was made possible by the accidental convergence of a few things that America enjoyed and Europe lacked. Low entry requirements meant new schools with new ideas popped up all the time, the large number of schools allowed for more experimentation in how schools operated, and the variety of choices available to students and staff led to self-sorting toward institutions that excelled in particular fields.
Some of the more famous cases of experimentation, Johns Hopkins and Cornell, sought to emulate the specialization of European schools, while others, such as the University of Chicago, prioritized hiring the most qualified staff — even when they were already working at other universities.
Over time, schools placed less emphasis on religious affiliation and began to focus on specialization. Admissions standards began to rise at some schools, sorting high-achieving (or high status) students into programs with highly qualified staff.
The effort to find and maintain high-quality staff led to the creation of performance review standards in different fields. These systems, which often had accomplished professors reviewing their peers, encouraged more high-quality research output. Those who performed well often gained secure contracts to teach and conduct research — that is, tenure — which further encouraged high achievement.
All of this was made possible by large amounts of state and private funding, the latter often from proud alumni.
"What became challenging for all these universities once they started emphasizing research is how to incentivize that activity. One thing that agency theory shows is that one way to achieve this is to create somewhat lumpy rewards. That is to say, rewards that don't necessarily give you a little bit more for a little bit more output but rather create a big prize. Tenure has that flavor. It basically says if your research output is high enough you're going to get a lifetime contract at this university. Tenure has a couple of benefits that come out of agency theory. One is that these types of lumpy rewards can be particularly good when you make people compete against each other. The emergence of it in the US, in fact, helped place the US on good footing to compete at research with Europe, which does not have that institution as much."
Taken together, these factors created a virtuous cycle. The authors describe it as producing "resources to invest in research, which they could effectively incentivize; this helped attract strong students and funding, which could go into further reforms and enhancements."
By the 1920s, the US had overtaken Germany — the European country with the strongest universities in the early 20th century — in the share of Nobel Prize winners and never looked back.
The side effect: Inequality
All of this does produce one side effect well known to Americans: inequality. While the greatest American schools do well across the board, many other schools are comparatively middling. The authors point to one ranking that illustrates this. According to the Shanghai Ranking, 41 of the top 100 universities globally are American, but while 83 percent of public universities in Spain make the top 1,000, only about 23 percent of American ones do.
This is partly the result of the American system being as sorted as it is, so the best researchers and students tend to go to the same places. The European model, on the other hand, ensures equality of resources between different schools within a country.
Today, universities on both sides of the Atlantic share ideas, but retain their own character. For instance, tenure, which so benefited American schools, exists in an altered form in Europe.
Why isn't American K-12 education as good?
While this system has produced great universities, it's difficult to apply these tools elsewhere. For example, performance evaluation has been refined for researchers at the university level, but there is still tremendous debate over what counts as high performance at the K-12 level.
Additionally, it's possible for a country to dominate in university rankings with a handful of great schools. At the K-12 level, it would require thousands of schools performing at the peak of their abilities to get a similar result.
Until that happens, Americans can take pride that their universities — through a combination of competition, experimentation, and lots and lots of money — rose from small centers of rote learning to become the greatest research institutions in the history of the world.
So much for rest in peace.
- Australian scientists found that bodies kept moving for 17 months after being pronounced dead.
- Researchers used cameras to capture images at 30-minute intervals during the day to document the movement.
- This study could help better identify time of death.
We're learning new things about death every day. Much has been said and theorized about the great divide between life and the Great Beyond. While every person and every culture has its own philosophies and unique ideas on the subject, we're beginning to learn a lot of new scientific facts about the deceased corporeal form.
An Australian scientist has found that human bodies move for more than a year after being pronounced dead. These findings could have implications for fields ranging from pathology to criminology.
Dead bodies keep moving
Researcher Alyson Wilson studied and photographed the movements of corpses over a 17-month timeframe. She recently told Agence France-Presse about the shocking details of her discovery.
Reportedly, she and her team trained a camera on a corpse for 17 months at the Australian Facility for Taphonomic Experimental Research (AFTER), taking images every 30 minutes during the day. For the entire 17-month duration, the corpse continually moved.
"What we found was that the arms were significantly moving, so that arms that started off down beside the body ended up out to the side of the body," Wilson said.
The researchers mostly expected some kind of movement during the very early stages of decomposition, but Wilson further explained that their continual movement completely surprised the team:
"We think the movements relate to the process of decomposition, as the body mummifies and the ligaments dry out."
In one of the studies, arms that had started next to the body eventually ended up splayed out at its side.
The team's subject was one of the bodies stored at the "body farm," which sits on the outskirts of Sydney. (Wilson took a flight every month to check in on the cadaver.) Her findings were recently published in the journal Forensic Science International: Synergy.
Implications of the study
The researchers believe that understanding these post-mortem movements and the rate of decomposition could help better estimate the time of death. Police, for example, could benefit, as they would be able to attach a timeframe to a missing person case and link it to an unidentified corpse. According to the team:
"Understanding decomposition rates for a human donor in the Australian environment is important for police, forensic anthropologists, and pathologists for the estimation of PMI to assist with the identification of unknown victims, as well as the investigation of criminal activity."
While scientists haven't found any evidence of necromancy, the discovery is a curious new insight into what happens to the body after we die.
The distances between the stars are so vast that they can make your brain melt. Take, for example, the Voyager 1 probe, which has been traveling at 35,000 miles per hour for more than 40 years and was the first human-made object to cross into interstellar space. That sounds impressive, except that, at its current speed, it will still take another 40,000 years to cross the typical distance between stars.
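A quick back-of-the-envelope check shows why that figure is the right order of magnitude. All the numbers below are rough approximations for illustration, not precise mission data:

```python
# Rough sanity check on the Voyager 1 travel-time claim above.
MILES_PER_LIGHT_YEAR = 5.88e12   # one light-year, in miles (approximate)
speed_mph = 35_000               # Voyager 1's approximate speed
hours_per_year = 24 * 365.25

miles_per_year = speed_mph * hours_per_year
years_per_light_year = MILES_PER_LIGHT_YEAR / miles_per_year

print(f"{years_per_light_year:,.0f} years to cover one light-year")
print(f"{40_000 / years_per_light_year:.1f} light-years covered in 40,000 years")
```

At that speed, one light-year takes roughly 19,000 years, so 40,000 years buys you only a couple of light-years, which is indeed the scale of the gaps between neighboring stars.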
Worse still, if you are thinking about interstellar travel, nature provides a hard limit on acceleration and speed. As Einstein showed, it's impossible to accelerate any massive object beyond the speed of light. Since the galaxy is more than 100,000 light-years across, if you are traveling at less than light speed, then most interstellar distances would take more than a human lifetime to cross. If the known laws of physics hold, then it seems a galaxy-spanning human civilization is impossible.
Unless of course you can build a warp drive.
Ah, the warp drive, that darling of science fiction plot devices. So, what about a warp drive? Is that even really a thing?
Let's start with the "warping" part of a warp drive. Albert Einstein's theory of general relativity ("GR") represents space and time as a 4-dimensional "fabric" that can be stretched, bent, and folded. Gravitational waves, ripples in the fabric of spacetime, have now been directly observed. So, yes, spacetime can be warped. The warping part of a warp drive usually means distorting the shape of spacetime so that two distant locations are brought close together, letting you somehow "jump" between them.
This was a basic idea in science fiction long before Star Trek popularized the name "warp drive." But until 1994, it had remained science fiction, meaning there was no science behind it. That year, Miguel Alcubierre wrote down a solution to the basic equations of GR that represented a region that compressed spacetime ahead of it and expanded spacetime behind to create a kind of traveling warp bubble. This was really good news for warp drive fans.
The problems with a warp drive
There were some problems, though. Most important was that this "Alcubierre drive" required lots of "exotic matter" or "negative energy" to work. Unfortunately, there's no such thing. These are substances theorists dreamed up to plug into the GR equations in order to do cool things like hold open a stable wormhole or power a functioning warp drive.
Researchers have also raised other concerns about an Alcubierre drive, such as how it would violate quantum mechanics, or how, upon arrival at your destination, it would destroy everything in front of the ship in an apocalyptic flash of radiation.
Warp drives: A new hope
Recently, however, there seemed to be good news on the warp drive front with the publication this April of a new paper by Alexey Bobrick and Gianni Martire entitled "Introducing Physical Warp Drives." The great thing about the Bobrick and Martire paper was that it was extremely clear about the meaning of a warp drive.
Understanding the equations of GR means understanding what's on either side of the equals sign. On one side, there is the shape of spacetime, and on the other, there is the configuration of matter-energy. The traditional route with these equations is to start with a configuration of matter-energy and see what shape of spacetime it produces. But you can also go the other way around and assume the shape of spacetime you want (like a warp bubble) and determine what kind of configuration of matter-energy you will need (even if that matter-energy is the dream stuff of negative energy).
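For the curious, the equations in question are Einstein's field equations, shown here in their standard textbook form (this is general background, not an equation specific to the warp drive paper):

$$ G_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu} $$

The Einstein tensor \(G_{\mu\nu}\) on the left encodes the curvature, or shape, of spacetime; the stress-energy tensor \(T_{\mu\nu}\) on the right encodes the configuration of matter-energy. Fix either side and you can solve for the other, which is exactly the two-way traffic described above.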
What Bobrick and Martire did was step back and look at the problem more generally. They showed that all warp drives are composed of three regions: an interior spacetime called the passenger space; a shell of material, with either positive or negative energy, called the warping region; and an outside that, far enough away, looks like normal unwarped spacetime. In this way they could see exactly what was and was not possible for any kind of warp drive. (Watch this lovely explainer by Sabine Hossenfelder for more details.) They even showed that you could use good old normal matter to create a warp drive that, while it moved slower than light speed, produced a passenger area where time flowed at a different rate than in the outside spacetime. So even though it was a sub-light-speed device, it was still an actual warp drive, and one that could use normal matter.
That was the good news.
The bad news was that this clear vision also showed them a real problem with the "drive" part of the Alcubierre drive. First of all, it still needed negative energy to work, so that bummer remains. But worse, Bobrick and Martire reaffirmed a basic tenet of relativity: there is no way to accelerate an Alcubierre drive past light speed. Sure, you could simply assume that you started with something already moving faster than light, and the Alcubierre drive with its negative-energy shell would make sense. But crossing the light-speed barrier itself was still prohibited.
So, in the end, the Star Trek version of the warp drive is still not a thing. I know this may bum you out if you were hoping to build that version of the Enterprise sometime soon (as I was). But don't be too despondent. The Bobrick and Martire paper really did make headway. As the authors put it in the end:
"One of the main conclusions of our study is that warp drives are simpler and much less mysterious objects than the broader literature has suggested."
That really is progress.
The Black Death wasn't the only plague in the 1300s.
- In a unique study, researchers have determined how many people in medieval England had bunions.
- A fashion trend towards pointed toe shoes made the affliction common.
- Even monks got in on the trend, much to their discomfort later in life.
Late Medieval England had its share of problems. The Wars of the Roses raged, the Black Death killed off large parts of the population, and passing ruffians could say "Ni" at will to old ladies.
To make matters worse, a first-of-its-kind study published in the International Journal of Paleopathology has demonstrated that much of the population suffered from another plague: a plague of bunions likely caused by a ridiculous medieval fashion trend.
If the shoe fits, it won't cause bunions
The outlines of a leather shoe from the King's Ditch, Cambridge. It is easy to see how these shoes might be constricting. Copyright Cambridge Archaeological Unit.
The bunion, known to medicine as "hallux valgus," is a deformity of the joint connecting the big toe to the rest of the foot. It is painful and can cause other issues, including poor balance. The condition is associated with having worn constrictive shoes for a long period of time, as well as with genetic factors. Today, it is often caused by wearing high-heeled shoes.
The medieval English didn't care for high-heeled shoes as much as modern fashionistas do, but there was a major fashion trend toward shoes with long, pointed toes called "poulaines" or "crakows" after their supposed place of origin, Krakow, Poland.
This trend, already silly-looking to a modern observer, got out of hand in a hurry. According to some records, the points on noblemen's shoes could be so long as to require tying them to the leg with string so the wearer could walk. At one point, King Edward IV had to ban commoners from wearing points longer than two inches. A couple of years later, he saw fit to ban the shoes altogether.
But, just knowing that people back in the day made poor fashion choices doesn't prove they suffered for it. That is where digging up old skeletons to look at their feet comes in.
Beauty is pain: the price of high medieval fashion
To learn how bad the bunion epidemic was, the researchers looked to four burial sites in and around Cambridge. One was a rural cemetery where poor peasants were buried. Another was the All Saints by the Castle parish, which had a mixed collection of people that tended toward poverty. The Hospital of St. John's burial ground contained both the poor charges of a charity hospital and wealthy benefactors. Lastly, they considered the cemetery of a local Augustinian friary, home to monks and well-to-do philanthropists.
The team considered 177 adult skeletons that were at least a quarter complete and still had enough of their feet to make studying them possible. The remains were classified by age and sex by observation and DNA testing. Each was examined for evidence of bunions and signs of complications from the condition, such as falling.
Those buried in the monastery's graveyard were the most affected: nearly half (43 percent) of the remains found there had bunions, including five of the eleven members of the clergy. Twenty-three percent of those laid to rest at the Hospital of St. John had bunions, while only 10 percent of those at the All Saints by the Castle parish graveyard did.
The rural cemetery had a much lower rate, only three percent, suggesting that these peasants were able to avoid at least one plague.
Overall, eighteen percent of the individuals examined had bunions, with men more likely to have them than women. Those at cemeteries known for exclusivity were more likely to have them as well, though it is clear that the condition also affected members of other classes. This makes sense, as it is known that these shoes had mass appeal.
The authors note that the rural cemetery having fewer cases is partly because that cemetery "went out of use prior to the wide adoption of pointed shoes, and it is likely that those residing in the parish predominately wore soft leather shoes, or possibly went barefoot."
Skeletons with evidence of bunions were more likely to have fractures indicative of a fall. This was more common among those estimated or recorded as having lived past age 45.
In our much more enlightened times, 23 percent of the population endures bunions, most of them women, and one of the leading culprits is the high-heeled shoe.
Some things never change.