The Robin Hood of Science: The Missing Chapter
The tale of a young man driven to his death for fighting for what is right, and the young woman picking up where he left off.
Content Warning: Contains references to violence, injustice, suicide, and material you may find upsetting; you might not want to read this on the bus.
Last week I wrote what has quickly become the most widely read piece I have ever written. It told the tale of how one researcher — Alexandra Elbakyan — has made nearly every scientific paper ever published available for free online to anyone, anywhere in the world. This is part two, but it’s OK; this is a story that, just like Star Wars, you can read out of order.
The moment I started working on this story last October, I knew it was huge. I knew it deserved to be read not just by the hundreds of thousands of people who are now reading and sharing it; I knew it deserved to be in newspapers around the world.
So I did what any science journalist would do. I pitched it to The New York Times; I pitched it to The Guardian; I pitched it to all the big guns. The response? Crickets. Bizarrely, until the day that I broke the story last week not one publication had ever covered the story of Sci-Hub. Since last week, dozens have.
The only publication that gave any indication of interest happened to be the largest publisher in tech news. For a time things sounded very promising; but discussions eventually broke down when I made it clear that it was impossible for me to convince Alexandra to travel to Europe for a photoshoot. She feared arrest, quite understandably. Discussions reached a deadlock as I was placed under immense pressure to cut the story down, ultimately to less than 250 words, barely a couple of paragraphs.
Apparently the story was not worth printing on paper without glossy photos of Alexandra herself posing for the camera, as if they were remotely relevant to the story. A story of this complexity simply couldn’t be covered in any meaningful way in 250 words, I argued. That has since proven true: vast numbers of people who read the story assumed that researchers or universities receive a portion of the fees the public pays to read the journals, which contain academic research funded by taxpayers.
This is simply not true. Most of the billions of dollars that are paid every year for access to academic publications are creamed off, directly into the pockets of publishing fat cats and their shareholders. Not a penny of this is paid to a scientist or academic institution. In fact if anything, scientists must pay for their work to be published. Editing, reviewing, production, every stage of the process is carried out by researchers who act as volunteers, independently of journals for the good of science. Every single stage of work paid for by the public purse is farmed out, except the profits, which are sucked up by billion-dollar-per-year corporations.
This is routine; it’s just the way things work today, a sad hangover from a time when print was a finite resource, even though now it is obsolete in the academic world, replaced by digital documents effortlessly transmitted down the telephone line. It is a hangover that benefits and enriches a handful of for-profit corporations that create nothing, at the expense of all of humanity’s access to the wealth of scientific knowledge.
The costs are real. Just yesterday I met a social worker who — now that she’s qualified, now that she’s a “professional” — can no longer access the social work journals she needs to do her job because she is no longer at a university, so now she no longer has an access code. The same is true for doctors, psychologists, neurologists, engineers, botanists, geneticists, chemists, and philosophers around the developing world.
Ultimately, the first publisher I approached with the story, to whom, after a brief discussion, I had sent the complete 2,000-word report, published a cut-down, cobbled-together version of it behind my back. I panicked and decided, as a last resort, to publish my working copy on my blog.
I wanted an accurate and complete version of events not just to be the story of record, but also to be the breaking story, the story that people actually read. Thankfully it was. The story was accurate, but to my shame the story was incomplete. Since last week I have received countless messages asking me why one name was missing from the report. That name was Aaron Swartz. What follows is the missing chapter.
At the very same time that Alexandra was building Sci-Hub, on the other side of the world, a young man named Aaron Swartz was fighting the same fight, in a very different way. Unlike Alexandra, who has explained her reasoning for breaking the law to the judge acting in her case in the frankest possible terms, Aaron always followed the law meticulously.
Aaron was a boy genius; at the age of 12, he built Info Network, an early ancestor of Wikipedia. At 14 he co-authored the specification for Really Simple Syndication (RSS), the technology now near universally used to follow virtually any publisher on the Internet, from a blog such as mine to the New York Times. If Aaron had not helped create RSS, I probably wouldn’t have been able to start out independently, build my own audience, get noticed, and become a writer. Without Aaron you almost certainly would not be reading this now.
Later, he helped build Creative Commons, the framework now used by millions of artists, writers, and publishers to free their work from the shackles of copyright with simple, clear open licences. Billions of works are now shared this way. He also co-founded Reddit, a democratic social network that has become “the front page of the Internet,” directing millions of people at any given moment to the most upvoted information in their chosen communities.
He went on to build DeadDrop, now called SecureDrop, a system now widely used by news organisations to receive material from anonymous sources. He also built Open Library, a website with the goal of having a page for every book ever published. In 26 short years Aaron helped found countless organisations dedicated to freedom of information and democratic social progress. One of those organisations, Demand Progress, has been responsible for some of the largest successful grassroots political campaigns in U.S. history. Despite earning millions at a very young age from his creations, he was a passionate fighter against wealth disparity:
“It seems ridiculous that miners should have to hammer away until their whole bodies are dripping with sweat, faced with the knowledge that if they dare to stop, they won’t be able to put food on the table that night, while I get to make larger and larger amounts of money each day just by sitting watching TV, but apparently the world is ridiculous." — Aaron Swartz
Aaron soon realised that a grand injustice existed in the U.S.: access to vast swathes of the core documents that make up the law is not freely available to the public. To access the law, you had to pay a complex, bureaucratic website 10 cents per page. In fact, you still do, and it’s a $10 billion-per-year business.
Of course, the law itself is not copyrighted. So when in 2008 Aaron wrote a piece of code to download 2.7 million documents from the PACER (Public Access to Court Electronic Records) database through a library terminal, and then made them freely available, Aaron was not technically breaking the law, as the FBI eventually conceded.
Technologist Carl Malamud explains: “PACER is an incredible abomination of government services; 10 cents per page; the most brain-dead code you have ever seen; you can’t search it; you can’t bookmark anything; you’ve got to have a credit card. These are public records; U.S. district courts are very important — it’s where a lot of our seminal litigation starts; civil rights cases, patent cases. Journalists, students, citizens, and lawyers all need access to PACER and it fights them every step of the way. People without means can’t see the law ... it’s a poll tax on access to justice.”
At the time Aaron made the documents available to the public there were only 17 libraries capable of freely accessing the law within the entire United States; that’s one access point for every 221,090 square miles (572,620 square km) of U.S. soil.
In an unconnected hack, while at Stanford University Aaron downloaded the entire contents of the Westlaw legal database, a database he never publicly released, because that would have been illegal.
Analysis of the data Aaron obtained, published in the Stanford Law Review, revealed a pattern of massively corrupt practices: top-level law professors being quietly paid by oil giants and other multi-billion-dollar corporations to publish biased legal opinions — “vanity research” purpose-built to be used in court to argue for the minimisation of punitive damages in existing multi-million-dollar lawsuits.
I could go on about his many and varied achievements. I could go on about the incalculable good Aaron did for society. I could go on about the steps Aaron always took to act within the law, while he worked on its precipice, always for the betterment of others. I could go on about how while being in prime position to make untold millions more out of his creations, he lived modestly and spent night and day donating his time to fighting within the law for what is right, but it’s not what Aaron created that this story is about; it is what was taken from him, and with him, from all of us.
At the end of 2010, Aaron plugged a laptop directly into the network of the Massachusetts Institute of Technology (MIT), via a wiring closet. He’d written a Python script, keepgrabbing.py, to quietly download the entire contents of the JSTOR database of academic research.
Aaron had complete legal access to the research he downloaded, through his university subscription. His crime, had Aaron ever made it to the dock, would essentially have been taking too many books out of the library.
To their credit, when Aaron was caught, JSTOR chose not to press charges; but in a highly unusual legal decision, Aaron was set upon by the United States government with a string of 13 felony charges for wire and computer fraud.
On the 6th of January 2011, Aaron was arrested, allegedly assaulted by the police, and placed in solitary confinement. In a strongly worded statement intended to send a message to hackers, federal prosecutors announced that Aaron was facing felony charges that could result in up to 35 years in jail, restitution, asset forfeiture, and a fine of up to $1 million. He was released on $100,000 bail.
The government gave Aaron a non-negotiable demand that he accept the felony charges. Adamant that he had committed no crime, he refused to plead guilty in return for a reduced sentence and a raft of bans and restrictions on his computer use. This despite the fact that his legal costs had completely exhausted his financial resources and all the money that had been raised to defend him, a sum that ran into the millions of dollars.
On the 11th of January 2013, after two years of bitter legal proceedings, and only two days after the prosecution had declined his counteroffer to their plea deal, he was found hanging in his apartment.
Aaron’s obituary was the first obituary I ever wrote. It doesn’t do justice to one of the greatest minds of our generation. It was written in a haze of shock, anger, and sadness the day of Aaron’s death. I wasn’t alone. An earthquake of grief reverberated around the Internet. His eulogy was read by Tim Berners-Lee, the inventor of the World Wide Web.
I never knew Aaron, but I am acutely aware of exactly how much I have benefited from his work, and how much we all stand to gain from the work he was doing. I studied at a university that couldn’t afford most useful journals, so I depended on the goodwill of others to get me the research I needed to pass. When I started out writing my first blog, Creative Commons gave me vast and easy access to imagery I could legally and freely use to illustrate my work; Reddit helped people find my writing even though they were mostly on the other side of the Atlantic. RSS let people who liked my work follow me without spending a penny, enabling me to build a career. For all of that, I will always be grateful to Aaron.
Aaron died before he could finish his work, but unbeknownst to him, Alexandra had already picked up the baton where he and countless others in online communities dedicated to freedom of information left off.
Alexandra has not only matched the 4 million articles Aaron downloaded from JSTOR before he was caught, but released them freely to the public along with 43 million more, and built Sci-Hub, a one-click, instant paywall workaround that works not just on JSTOR but on Elsevier and a whole host of other paywalled academic publishers.
If I were a religious man, I might say that we can only pray that Alexandra won’t face a fate similar to Aaron’s — that she will stay safe from prison and legal intimidation and those who wish she would disappear, that she can continue doing what she does best, making discoveries and creating things.
But to say this would be a lie. We can’t only pray.
We can do everything in our power to make sure the politicians we elect don’t allow corporations to throw the book at researchers who have no other way to conduct science than to share their work freely. We can’t allow politicians to throw away the keys to the libraries. We must convince academics to stop handing the keys to their work to gang masters who would happily see all of our scientific knowledge remain inaccessible to the vast majority of humanity.
“Information is power. But like all power, there are those who want to keep it for themselves. The world's entire scientific and cultural heritage, published over centuries in books and journals, is increasingly being digitized and locked up by a handful of private corporations. Want to read the papers featuring the most famous results of the sciences? You'll need to send enormous amounts to publishers like Reed Elsevier.
There are those struggling to change this. The Open Access Movement has fought valiantly to ensure that scientists do not sign their copyrights away, but instead ensure their work is published on the Internet, under terms that allow anyone to access it. But even under the best scenarios, their work will only apply to things published in the future. Everything up until now will have been lost.
That is too high a price to pay. Forcing academics to pay money to read the work of their colleagues? Scanning entire libraries, but only allowing the folks at Google to read them? Providing scientific articles to those at elite universities in the First World, but not to children in the global South? It's outrageous and unacceptable.
"I agree," many say, "but what can we do? The companies hold the copyrights; they make enormous amounts of money by charging for access, and it's perfectly legal — there's nothing we can do to stop them." But there is something we can, something that's already being done: We can fight back.
Those with access to these resources — students, librarians, scientists — you have been given a privilege. You get to feed at this banquet of knowledge while the rest of the world is locked out. But you need not — indeed, morally, you cannot — keep this privilege for yourselves. You have a duty to share it with the world. And you have: trading passwords with colleagues, filling download requests for friends.
Meanwhile, those who have been locked out are not standing idly by. You have been sneaking through holes and climbing over fences, liberating the information locked up by the publishers and sharing them with your friends.
But all of this action goes on in the dark, hidden underground. It's called stealing or piracy, as if sharing a wealth of knowledge were the moral equivalent of plundering a ship and murdering its crew. But sharing isn't immoral — it's a moral imperative. Only those blinded by greed would refuse to let a friend make a copy.
Large corporations, of course, are blinded by greed. The laws under which they operate require it — their shareholders would revolt at anything less. And the politicians they have bought off back them, passing laws giving them the exclusive power to decide who can make copies.
There is no justice in following unjust laws. It's time to come into the light and, in the grand tradition of civil disobedience, declare our opposition to this private theft of public culture.
We need to take information, wherever it is stored, make our copies, and share them with the world. We need to take stuff that's out of copyright and add it to the archive. We need to buy secret databases and put them on the Web. We need to download scientific journals and upload them to file-sharing networks. We need to fight for Guerilla Open Access.
With enough of us, around the world, we'll not just send a strong message opposing the privatization of knowledge — we'll make it a thing of the past. Will you join us?" - Aaron Swartz - July 2008, Eremo, Italy
Below is a gripping, thought-provoking and tear-jerking documentary on the events that led to Aaron’s death. It received a string of offers when it was nominated at Sundance, but in the spirit of Aaron’s beliefs the film’s producers have made it publicly available under a Creative Commons licence, so you can watch it in full below. It’s the most moving film I’ve seen in years.
By the time of Aaron’s death, Alexandra’s website was already in operation, but working on opposite sides of the world, the two were unaware of each other. “When I read in the news about Aaron for the first time I thought, that's the guy who could be my best friend and collaborator,” Alexandra told me.
While Alexandra later came to find Aaron’s writings inspiring and is working on translating them into Russian, she maintains that her greatest inspiration was the countless “inspired people” all around the world who share knowledge in online communities, based on their shared belief that knowledge should be free.
Read Part One: How one researcher — Alexandra Elbakyan — has made nearly every scientific paper ever published available for free to anyone, anywhere in the world.
Graphics and video courtesy of The Documentary Network (CC-BY-NC-SA). Creative Commons infographic by Shiamm Donnelly (CC-BY-NC-SA).
A Harvard professor's study identifies the worst year to be alive.
- Harvard professor Michael McCormick argues the worst year to be alive was 536 AD.
- The year was terrible due to cataclysmic eruptions that blocked out the sun and the spread of the plague.
- 536 ushered in the coldest decade in thousands of years and started a century of economic devastation.
The past year has been among the worst in the lives of many people around the globe: a rampaging pandemic, dangerous political instability, weather catastrophes, and a profound change in lifestyle that most had never experienced or imagined.
But was it the worst year ever?
Nope. Not even close. In the eyes of the historian and archaeologist Michael McCormick, the absolute "worst year to be alive" was 536.
Why was 536 so bad? You could certainly argue that 1918, the last year of World War I, when the Spanish Flu killed up to 100 million people around the world, was a terrible year by all accounts. 1349 could also earn a place on this morbid list as the year when the Black Death wiped out half of Europe, with up to 20 million dead from the plague. Most of the years of World War II could probably lay claim to the "worst year" title as well. But 536 was in a category of its own, argues the historian.
It all began with an eruption...
According to McCormick, Professor of Medieval History at Harvard University, 536 was the precursor to one of the worst periods in human history. Early that year, a volcanic eruption took place in Iceland, as established by a study of a Swiss glacier carried out by McCormick and the glaciologist Paul Mayewski of the Climate Change Institute at The University of Maine (UM) in Orono.
The ash spewed out by the volcano likely produced a fog that brought an 18-month stretch of daytime darkness across Europe, the Middle East, and portions of Asia. As the Byzantine historian Procopius wrote, "For the sun gave forth its light without brightness, like the moon, during the whole year." He also recounted that it looked as though the sun were always in eclipse.
Cassiodorus, a Roman politician of that time, wrote that the sun had a "bluish" color, the moon had no luster, and "seasons seem to be all jumbled up together." What's even creepier, he described, "We marvel to see no shadows of our bodies at noon."
...that led to famine...
The dark days also brought a period of cold, with summer temperatures falling by 1.5°C to 2.5°C. This started the coldest decade in the past 2,300 years, reports Science, leading to the devastation of crops and worldwide hunger.
...and the fall of an empire
In 541, the bubonic plague added considerably to the world's misery. Spreading from the Roman port of Pelusium in Egypt, the so-called Plague of Justinian caused the deaths of up to one half of the population of the eastern Roman Empire. This, in turn, sped up its eventual collapse, writes McCormick.
Between the environmental cataclysms, with massive volcanic eruptions also in 540 and 547, and the devastation brought on by the plague, Europe was in for an economic downturn for nearly all of the next century, until 640 when silver mining gave it a boost.
Was that the worst time in history?
Of course, the absolute worst time in history depends on who you were and where you lived.
Native Americans can easily point to 1520, when smallpox, brought over by the Spanish, killed millions of indigenous people. By 1600, up to 90 percent of the population of the Americas (about 55 million people) was wiped out by various European pathogens.
Like all things, the grisly title of "worst year ever" comes down to historical perspective.
Quantum theory has weird implications. Trying to explain them just makes things weirder.
- The weirdness of quantum theory flies in the face of what we experience in our everyday lives.
- Quantum weirdness quickly created a split in the physics community, each side championed by a giant: Albert Einstein and Niels Bohr.
- As two recent books espousing opposing views show, the debate still rages on nearly a century later. Each "resolution" comes with a high price tag.
Albert Einstein and Niels Bohr, two giants of 20th century science, espoused very different worldviews.
To Einstein, the world was ultimately rational. Things had to make sense. They should be quantifiable and expressible through a logical chain of cause-and-effect interactions, from what we experience in our everyday lives all the way to the depths of reality. To Bohr, we had no right to expect any such order or rationality. Nature, at its deepest level, need not follow any of our expectations of well-behaved determinism. Things could be weird and non-deterministic, so long as they become more like what we expect as we travel from the world of atoms to our world of trees, frogs, and cars. Bohr divided the world into two realms, the familiar classical world and the unfamiliar quantum world. They should be complementary to one another but with very different properties.
The two scientists spent decades arguing about the impact of quantum physics on the nature of reality. Each had a group of physicists as followers, all of them giants in their own right. Einstein's group of quantum weirdness deniers included quantum physics pioneers Max Planck, Louis de Broglie, and Erwin Schrödinger, while Bohr's group had Werner Heisenberg (of uncertainty principle fame), Max Born, Wolfgang Pauli, and Paul Dirac.
Almost a century afterward, the debate rages on.
Einstein vs. Bohr, Redux
Two books — one authored by Sean Carroll and published last fall and another published very recently and authored by Carlo Rovelli — perfectly illustrate how current leading physicists still cannot come to terms with the nature of quantum reality. The opposing positions still echo, albeit with many modern twists and experimental updates, the original Einstein-Bohr debate.
I summarized the ongoing dispute in my book The Island of Knowledge: Are the equations of quantum physics a computational tool that we use to make sense of the results of experiments (Bohr), or are they supposed to be a realistic representation of quantum reality (Einstein)? In other words, are the equations of quantum theory the way things really are or just a useful map?
Einstein believed that quantum theory, as it stood in the 1930s and 1940s, was an incomplete description of the world of the very small. There had to be an underlying level of reality, still unknown to us, that made sense of all its weirdness. De Broglie and, later, David Bohm, proposed an extension of the quantum theory known as hidden variable theory that tried to fill in the gap. It was a brilliant attempt to appease the urge Einstein and his followers had for an orderly natural world, predictable and reasonable. The price — and every attempt to deal with the problem of figuring out quantum theory has a price tag — was that the entire universe had to participate in determining the behavior of every single electron and all other quantum particles, implicating the existence of a strange cosmic order.
Later, in the 1960s, physicist John Bell proved a theorem that put such ideas to the test. A series of remarkable experiments starting in the 1970s and still ongoing have essentially disproved the de Broglie-Bohm hypothesis, at least if we restrict their ideas to what one would call "reasonable," that is, theories that have local interactions and causes. Omnipresence — what physicists call nonlocality — is a hard pill to swallow in physics.
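To see what is at stake in Bell's test, here is a minimal sketch, my own illustration rather than anything drawn from either book, of the CHSH version of the argument. It assumes the textbook quantum correlation for a pair of entangled spin-1/2 particles; any theory built from local hidden variables must keep the CHSH quantity at or below 2, while quantum mechanics reaches 2√2.

```python
# Illustrative sketch of the CHSH test, assuming the textbook correlation
# E(a, b) = -cos(a - b) for two spin-1/2 particles in the singlet state.
# Local hidden-variable theories obey |S| <= 2; quantum mechanics reaches 2*sqrt(2).
import numpy as np

def E(a, b):
    """Quantum-mechanical correlation for measurement angles a and b (radians)."""
    return -np.cos(a - b)

# Standard angle choices that maximise the violation.
a, a_prime = 0.0, np.pi / 2
b, b_prime = np.pi / 4, 3 * np.pi / 4

S = E(a, b) - E(a, b_prime) + E(a_prime, b) + E(a_prime, b_prime)

print(f"|S| = {abs(S):.3f}")              # ~2.828, i.e. 2*sqrt(2)
print("Local hidden-variable bound: 2")
```

Real experiments repeatedly measure values above 2, which is why only the nonlocal versions of the de Broglie-Bohm picture survive.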
Yet, the quantum phenomenon of superposition insists on keeping things weird. Here's one way to picture quantum superposition. In a kind of psychedelic dream state, imagine that you had a magical walk-in closet filled with identical shirts, the only difference between them being their color. What's magical about this closet? Well, as you enter this closet, you split into identical copies of yourself, each wearing a shirt of a different color. There is a you wearing a blue shirt, another a red, another a white, etc., all happily coexisting. But as soon as you step out of the closet or someone or something opens the door, only one you emerges, wearing a single shirt. Inside the closet, you are in a superposition state with your other selves. But in the "real" world, the one where others see you, only one copy of you exists, wearing a single shirt. The question is whether the inside superposition of the many yous is as real as the one you that emerges outside.
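For readers who prefer a few lines of code to a magical closet, here is a toy sketch of the same idea; the shirt colors and amplitudes are invented for illustration, and no real quantum dynamics is being simulated. Inside the "closet" the state is a weighted sum of possibilities; opening the door samples one outcome, with probability given by the squared magnitude of each amplitude (the Born rule).

```python
# Toy illustration of the closet analogy: a superposition over shirt colors,
# "measured" by sampling one outcome according to the Born rule.
import numpy as np

rng = np.random.default_rng()

colors = ["blue", "red", "white"]
amplitudes = np.array([0.8, 0.5, 0.3], dtype=complex)  # the superposed "yous"
amplitudes /= np.linalg.norm(amplitudes)               # normalize the state

# Inside the closet: all three coexist, weighted by complex amplitudes.
# Opening the door: one outcome appears, with probability |amplitude|^2.
probabilities = np.abs(amplitudes) ** 2
outcome = rng.choice(colors, p=probabilities)

print({c: round(float(p), 2) for c, p in zip(colors, probabilities)})
print(f"You step out wearing: {outcome}")
```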
The (modern version of the) Einstein team would say yes. The equations of quantum physics must be taken as the real description of what's going on, and if they predict superposition, so be it. The so-called wave function that describes this superposition is an essential part of physical reality. This point is most dramatically exposed by the many-worlds interpretation of quantum physics, espoused in Carroll's book. For this interpretation, reality is even weirder: the closet has many doors, each to a different universe. Once you step out, all of your copies step out together, each into a parallel universe. So, if I happen to see you wearing a blue shirt in this universe, in another, I'll see you wearing a red one. The price tag for the many-worlds interpretation is to accept the existence of an uncountable number of non-communicating parallel universes that enact all possibilities from a superposition state. In a parallel universe, there was no COVID-19 pandemic. Not too comforting.
Bohr's team would say take things as they are. If you stepped out of the closet and someone saw you wearing a shirt of a given color, then this is the one. Period. The weirdness of your many superposing selves remains hidden in the quantum closet. Rovelli defends his version of this worldview, called the relational interpretation, in which events are defined by the interactions between the objects involved, be they observers or not. In this example, the color of your shirt is the property at stake, and when I see it, I am entangled with this specific shirt of yours. It could have been another color, but it wasn't. As Rovelli puts it, "Entanglement… is the manifestation of one object to another, in the course of an interaction, in which the properties of the objects become actual." The price to pay here is to give up the hope of ever truly understanding what goes on in the quantum world. What we measure is what we get and all we can say about it.
What should we believe?
Both Carroll and Rovelli are master expositors of science to the general public, with Rovelli being the more lyrical of the pair.
There is no resolution to be expected, of course. I, for one, am more inclined to Bohr's worldview and thus to Rovelli's, although the interpretation I am most sympathetic to, called QBism, is not properly explained in either book. It is much closer in spirit to Rovelli's, in that relations are essential, but it places the observer on center stage, given that information is what matters in the end. (Although, as Rovelli acknowledges, information is a loaded word.)
We create theories as maps for us human observers to make sense of reality. But in the excitement of research, we tend to forget the simple fact that theories and models are not nature but our representations of nature. Unless we nurture hopes that our theories are really how the world is (the Einstein camp) and not how we humans describe it (the Bohr camp), why should we expect much more than this?
Maybe eyes really are windows into the soul — or at least into the brain, as a new study finds.
- Researchers find a correlation between pupil size and differences in cognitive ability.
- The larger the pupil, the higher the intelligence.
- The explanation for why this happens lies within the brain, but more research is needed.
What can you tell by looking into someone's eyes? You can spot a glint of humor, signs of tiredness, or maybe that they don't like something or someone.
But outside of assessing an emotional state, a person's eyes may also provide clues about their intelligence, suggests new research. A study carried out at the Georgia Institute of Technology shows that pupil size is "closely related" to differences in intelligence between individuals.
The scientists found that larger pupils may be connected to higher intelligence, as demonstrated by tests that gauged reasoning skills, memory, and attention. In fact, the researchers claim that the relationship between intelligence and pupil size is so pronounced that it showed up across their two previous studies as well and can be spotted with the naked eye, without any scientific instruments. You should be able to tell who scored the highest or the lowest on the cognitive tests just by looking at them, say the researchers.
The pupil-IQ link
The connection was first noticed during memory tasks, where pupil dilation was treated as a sign of mental effort. The studies involved more than 500 people aged 18 to 35 from the Atlanta area. The subjects' pupil sizes were measured by eye trackers, which use a camera and a computer to capture light reflecting off the pupil and cornea. As the scientists explained in Scientific American, pupil diameters range from two to eight millimeters. To determine average pupil size, they took measurements of the pupils at rest, while the participants stared at a blank screen for a few minutes.
Another part of the experiment involved having the subjects take a series of cognitive tests that evaluated "fluid intelligence" (the ability to reason when confronted with new problems), "working memory capacity" (how well people can remember information over time), and "attention control" (the ability to keep focusing attention even while being distracted). An example of the latter is a test that tries to divert a person's focus from a disappearing letter by showing a flickering asterisk on another part of the screen. If a person pays too much attention to the asterisk, they might miss the letter.
The conclusion of the research was that a larger baseline pupil size was related to greater fluid intelligence, better attention control, and, to a lesser extent, greater working memory capacity. In an email exchange with Big Think, author Jason Tsukahara pointed out, "It is important to consider that what we find is a correlation — which should not be confused with causation."
The researchers also found that pupil size seemed to decrease with age. Older people had more constricted pupils, but when the scientists standardized for age, the pupil-size-to-intelligence connection still remained.
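To make "standardizing for age" concrete, here is a minimal sketch on synthetic data; the sample size, pupil sizes, and effect strengths below are invented, not the study's. It computes the raw pupil-score correlation and then the same correlation after regressing age out of both variables. If the link survives that adjustment, age differences alone cannot explain it.

```python
# Illustrative sketch only: synthetic data, not the study's actual measurements.
# Shows why "controlling for age" matters when pupil size drifts with age.
import numpy as np

rng = np.random.default_rng(0)
n = 500                                    # roughly the study's sample size
age = rng.uniform(18, 35, n)               # participants aged 18 to 35
# Hypothetical effects: pupils shrink slightly with age; scores track pupil size.
pupil = 6.0 - 0.05 * (age - 18) + rng.normal(0, 0.5, n)        # mm, within 2-8 mm
score = 100 + 8.0 * (pupil - pupil.mean()) + rng.normal(0, 5, n)

def partial_corr(x, y, control):
    """Correlate x and y after regressing the control variable out of both."""
    def residuals(v):
        design = np.column_stack([np.ones_like(control), control])
        coef, *_ = np.linalg.lstsq(design, v, rcond=None)
        return v - design @ coef
    return np.corrcoef(residuals(x), residuals(y))[0, 1]

raw_r = np.corrcoef(pupil, score)[0, 1]
adj_r = partial_corr(pupil, score, age)
print(f"raw correlation: {raw_r:.2f}, age-controlled correlation: {adj_r:.2f}")
```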
Why are pupils linked to intelligence?
The connection between pupil size and IQ likely resides within the brain. Pupil size has been previously connected to the locus coeruleus, a part of the brain that's responsible for synthesizing the hormone and neurotransmitter norepinephrine (noradrenaline), which mobilizes the brain and body for action. Activity in the locus coeruleus affects our perception, attention, memory, and learning processes.
As the authors explain, this region of the brain "also helps maintain a healthy organization of brain activity so that distant brain regions can work together to accomplish challenging tasks and goals." Because it is so important, loss of function in the locus coeruleus has been linked to conditions like Alzheimer's disease, Parkinson's, clinical depression, and attention deficit hyperactivity disorder (ADHD).
The researchers hypothesize that people who have larger pupils while in a restful state, like staring at a blank computer screen, have "greater regulation of activity by the locus coeruleus." This leads to better cognitive performance. More research is necessary, however, to truly understand why having larger pupils is related to higher intelligence.
In an email to Big Think, Tsukahara shared, "If I had to speculate, I would say that it is people with greater fluid intelligence that develop larger pupils, but again at this point we only have correlational data."
Do other scientists believe this?
As the scientists point out in the beginning of their paper, their conclusions are controversial and, so far, other researchers haven't been able to duplicate their results. The research team addresses this criticism by explaining that other studies had methodological issues and examined only memory capacity but not fluid intelligence, which is what they measured.