Top 20 greatest inventions of all time
The most impactful technological inventions in history, presented in chronological order.
Technology is a core component of the human experience. We have been creating tools to help us tame the physical world since the early days of our species.
Any attempt to count down the most important technological inventions is certainly debatable, but here are some major advancements that should probably be on any such list (in chronological order):
1. FIRE - it can be argued that fire was discovered rather than invented. Certainly, early humans observed incidents of fire, but it wasn't until they figured out how to control it and produce it themselves that humans could really make use of everything this new tool had to offer. The earliest use of fire goes back as far as two million years ago, while widespread use of the technology has been dated to about 125,000 years ago. Fire gave us warmth and protection, and it led to a host of other key inventions and skills, like cooking. The ability to cook helped us get the nutrients to support our expanding brains, giving us an indisputable advantage over other primates.
2. WHEEL - the wheel was invented by the Mesopotamians around 3500 B.C. for use in the creation of pottery. About 300 years later, the wheel was put on a chariot, and the rest is history. Wheels are ubiquitous in our everyday life, facilitating our transportation and commerce.
Circa 2000 BC, Oxen drawing an ancient Egyptian two-wheeled chariot. (Photo by Hulton Archive/Getty Images)
3. NAIL - the earliest known use of this very simple but super-useful metal fastener dates back to Ancient Egypt, about 3400 B.C. If you are more partial to screws, they've been around since the Ancient Greeks (1st or 2nd century B.C.).
4. OPTICAL LENSES - from glasses to microscopes and telescopes, optical lenses have greatly expanded the possibilities of our vision. They have a long history, first developed by ancient Egyptians and Mesopotamians, with key theories of light and vision contributed by Ancient Greeks. Optical lenses were also instrumental components in the creation of media technologies involved in photography, film and television.
5. COMPASS - this navigational device has been a major force in human exploration. The earliest compasses were made of lodestone in China between 300 and 200 B.C.
Circa 1121 BC, An ancient Chinese magnetic chariot. The figure, pointing to the south, moves in accordance with the principle of the magnetic compass. (Photo by Hulton Archive/Getty Images)
6. PAPER - invented about 100 B.C. in China, paper has been indispensable in allowing us to write down and share our ideas.
7. GUNPOWDER - this chemical explosive, invented in China in the 9th century, has been a major factor in military technology (and, by extension, in wars that changed the course of human history).
8. PRINTING PRESS - invented in 1439 by the German Johannes Gutenberg, this device in many ways laid the foundation for our modern age. It allowed ink to be transferred from movable type to paper in a mechanized way. This revolutionized the spread of knowledge and religion, as books had previously been hand-written (often by monks).
1511, Printing Press, from the title page of 'Hegesippus' printed by Jodocus Badius Ascensius in Paris. (Photo by Hulton Archive/Getty Images)
9. ELECTRICITY - the utilization of electricity is a process to which a number of bright minds have contributed over thousands of years, going all the way back to Ancient Egypt and Ancient Greece, when Thales of Miletus conducted the earliest research into the phenomenon. The 18th-century American Renaissance man Benjamin Franklin is generally credited with significantly furthering our understanding of electricity, if not with its discovery. It's hard to overestimate how important electricity has become to humanity: it runs the majority of our gadgetry and shapes our way of life. The invention of the light bulb, a separate contribution attributed to Thomas Edison in 1879, is certainly a major extension of our ability to harness electricity. It has profoundly changed the way we live and work, as well as the look and functioning of our cities.
10. STEAM ENGINE - invented between 1763 and 1775 by Scottish inventor James Watt (who built upon the ideas of previous steam engine attempts like the 1712 Newcomen engine), the steam engine powered trains, ships, factories and the Industrial Revolution as a whole.
circa 1830: An early locomotive hauling freight. (Photo by Hulton Archive/Getty Images)
11. INTERNAL COMBUSTION ENGINE - a 19th-century invention (created by the Belgian engineer Etienne Lenoir in 1859 and improved by Germany's Nikolaus Otto in 1876), this engine converts chemical energy into mechanical energy. It overtook the steam engine and is used in modern cars and planes. Elon Musk's electric car company Tesla, among others, is currently trying to revolutionize technology in this arena once again.
12. TELEPHONE - although he was not the only one working on this kind of tech, Scottish-born inventor Alexander Graham Bell got the first patent for an electric telephone in 1876. Certainly, this instrument has revolutionized our ability to communicate.
13. VACCINATION - while sometimes controversial, the practice of vaccination is responsible for eradicating diseases and extending the human lifespan. The first vaccine (for smallpox) was developed by Edward Jenner in 1796. A rabies vaccine was developed in 1885 by the French chemist and biologist Louis Pasteur, who is credited with making vaccination the major part of medicine that it is today. Pasteur is also responsible for inventing the food safety process of pasteurization, which bears his name.
14. CARS - cars completely changed the way we travel, as well as the design of our cities, and thrust the concept of the assembly line into the mainstream. They were invented in their modern form in the late 19th century by a number of individuals, with special credit going to the German Karl Benz for creating what's considered the first practical motorcar in 1885.
Karl Benz (in light suit) on a trip with his family with one of his first cars, which was built in 1893 and powered by a single cylinder, 3 h.p. engine. His friend Theodor von Liebig is in the Viktoria. (Photo by Hulton Archive/Getty Images)
15. AIRPLANE - invented in 1903 by the American Wright brothers, planes brought the world closer together, allowing us to travel quickly over great distances. This technology has broadened minds through enormous cultural exchanges—but it also escalated the reach of the world wars that would soon break out, and the severity of every war thereafter.
16. PENICILLIN - discovered by the Scottish scientist Alexander Fleming in 1928, this drug transformed medicine by its ability to cure infectious bacterial diseases. It began the era of antibiotics.
17. ROCKETS - while the invention of early rockets is credited to the Ancient Chinese, the modern rocket is a 20th century contribution to humanity, responsible for transforming military capabilities and allowing human space exploration.
18. NUCLEAR FISSION - this process of splitting atoms to release a tremendous amount of energy led to the creation of nuclear reactors and atomic bombs. It was the culmination of work by a number of prominent (mostly Nobel Prize-winning) 20th-century scientists, but the specific discovery of nuclear fission is generally credited to the Germans Otto Hahn and Fritz Strassmann, working with the Austrians Lise Meitner and Otto Frisch.
Austrian nuclear physicist Lise Meitner (1878 - 1968) congratulates German chemist Otto Hahn (1879 - 1968) on his 80th birthday, Gottingen, Germany, 8th March 1959. The pair collaborated for 30 years in the study of radioactivity, work which culminated in the discovery of nuclear fission. (Photo by Keystone/Hulton Archive/Getty Images)
19. SEMICONDUCTORS - they are at the foundation of electronic devices and the modern Digital Age. Mostly made of silicon, semiconductor devices are behind the nickname "Silicon Valley," home to today's major U.S. computing companies. The first transistor, the semiconductor device at the heart of modern electronics, was demonstrated in 1947 by the Americans John Bardeen, Walter Brattain, and William Shockley of Bell Labs.
20. PERSONAL COMPUTER - invented in the 1970s, personal computers greatly expanded human capabilities. While your smartphone is more powerful, one of the earliest PCs was introduced in 1974 by Micro Instrumentation and Telemetry Systems (MITS) via a mail-order computer kit called the Altair. From there, companies like Apple, Microsoft, and IBM have redefined personal computing.
(BONUS) 21. THE INTERNET - while the worldwide network of computers (which you used to find this article) has been in development since the 1960s, when it took the shape of the U.S. Defense Department's ARPANET, the Internet as we know it today is an even more modern invention. The creation of the World Wide Web in the early 1990s by England's Tim Berners-Lee transformed our communication, commerce, entertainment, politics, you name it.
Cover photo: a drawing by Leonardo Da Vinci
A Harvard professor's study discovers the worst year to be alive.
- Harvard professor Michael McCormick argues the worst year to be alive was 536 AD.
- The year was terrible due to cataclysmic eruptions that blocked out the sun and the spread of the plague.
- 536 ushered in the coldest decade in thousands of years and started a century of economic devastation.
The past year has been nothing short of the worst in the lives of many people around the globe: a rampaging pandemic, dangerous political instability, weather catastrophes, and a profound change in lifestyle that most had never experienced or imagined.
But was it the worst year ever?
Nope. Not even close. In the eyes of the historian and archaeologist Michael McCormick, the absolute "worst year to be alive" was 536.
Why was 536 so bad? You could certainly argue that 1918, the last year of World War I, when the Spanish Flu killed up to 100 million people around the world, was a terrible year by all accounts. 1349 could also earn a place on this morbid list as the year the Black Death wiped out half of Europe, with up to 20 million dead from the plague. Most of the years of World War II could probably lay claim to the "worst year" title as well. But 536 was in a category of its own, argues the historian.
It all began with an eruption...
According to McCormick, Professor of Medieval History at Harvard University, 536 was the precursor year to one of the worst periods of human history. It featured a volcanic eruption early in the year that took place in Iceland, as established by a study of a Swiss glacier carried out by McCormick and the glaciologist Paul Mayewski from the Climate Change Institute of The University of Maine (UM) in Orono.
The ash spewed out by the volcano likely led to a fog that brought an 18-month-long stretch of daytime darkness across Europe, the Middle East, and portions of Asia. As the Byzantine historian Procopius wrote, "For the sun gave forth its light without brightness, like the moon, during the whole year." He also recounted that it looked like the sun was always in eclipse.
Cassiodorus, a Roman politician of that time, wrote that the sun had a "bluish" color, the moon had no luster, and "seasons seem to be all jumbled up together." Even creepier, he observed: "We marvel to see no shadows of our bodies at noon."
...that led to famine...
The dark days also brought a period of cold, with summer temperatures falling by 1.5 °C to 2.5 °C. This started the coldest decade in the past 2,300 years, reports Science, leading to the devastation of crops and worldwide hunger.
...and the fall of an empire
In 541, the bubonic plague added considerably to the world's misery. Spreading from the Roman port of Pelusium in Egypt, the so-called Plague of Justinian caused the deaths of up to one half of the population of the eastern Roman Empire. This, in turn, sped up its eventual collapse, writes McCormick.
Between the environmental cataclysms, with further massive volcanic eruptions in 540 and 547, and the devastation brought on by the plague, Europe suffered an economic downturn for nearly a century, until 640, when silver mining gave it a boost.
Was that the worst time in history?
Of course, the absolute worst time in history depends on who you were and where you lived.
Native Americans can easily point to 1520, when smallpox, brought over by the Spanish, killed millions of indigenous people. By 1600, up to 90 percent of the population of the Americas (about 55 million people) was wiped out by various European pathogens.
Like all things, the grisly title of "worst year ever" comes down to historical perspective.
Quantum theory has weird implications. Trying to explain them just makes things weirder.
- The weirdness of quantum theory flies in the face of what we experience in our everyday lives.
- Quantum weirdness quickly created a split in the physics community, each side championed by a giant: Albert Einstein and Niels Bohr.
- As two recent books espousing opposing views show, the debate still rages on nearly a century afterward. Each "resolution" comes with a high price tag.
Albert Einstein and Niels Bohr, two giants of 20th century science, espoused very different worldviews.
To Einstein, the world was ultimately rational. Things had to make sense. They should be quantifiable and expressible through a logical chain of cause-and-effect interactions, from what we experience in our everyday lives all the way to the depths of reality. To Bohr, we had no right to expect any such order or rationality. Nature, at its deepest level, need not follow any of our expectations of well-behaved determinism. Things could be weird and non-deterministic, so long as they become more like what we expect as we travel from the world of atoms to our world of trees, frogs, and cars. Bohr divided the world into two realms: the familiar classical world and the unfamiliar quantum world. The two should be complementary to one another but with very different properties.
The two scientists spent decades arguing about the impact of quantum physics on the nature of reality. Each had groups of physicists as followers, all of them giants in their own right. Einstein's group of quantum weirdness deniers included quantum physics pioneers Max Planck, Louis de Broglie, and Erwin Schrödinger, while Bohr's group had Werner Heisenberg (of uncertainty principle fame), Max Born, Wolfgang Pauli, and Paul Dirac.
Almost a century afterward, the debate rages on.
Einstein vs. Bohr, Redux
Two books — one authored by Sean Carroll and published last fall and another published very recently and authored by Carlo Rovelli — perfectly illustrate how current leading physicists still cannot come to terms with the nature of quantum reality. The opposing positions still echo, albeit with many modern twists and experimental updates, the original Einstein-Bohr debate.
I summarized the ongoing dispute in my book The Island of Knowledge: Are the equations of quantum physics a computational tool that we use to make sense of the results of experiments (Bohr), or are they supposed to be a realistic representation of quantum reality (Einstein)? In other words, are the equations of quantum theory the way things really are or just a useful map?
Einstein believed that quantum theory, as it stood in the 1930s and 1940s, was an incomplete description of the world of the very small. There had to be an underlying level of reality, still unknown to us, that made sense of all its weirdness. De Broglie and, later, David Bohm proposed an extension of quantum theory known as hidden variable theory that tried to fill in the gap. It was a brilliant attempt to appease the urge Einstein and his followers had for an orderly natural world, predictable and reasonable. The price — and every attempt to deal with the problem of figuring out quantum theory has a price tag — was that the entire universe had to participate in determining the behavior of every single electron and all other quantum particles, implying the existence of a strange cosmic order.
Later, in the 1960s, physicist John Bell proved a theorem that put such ideas to the test. A series of remarkable experiments starting in the 1970s and still ongoing have essentially disproved the de Broglie-Bohm hypothesis, at least if we restrict their ideas to what one would call "reasonable," that is, theories that have local interactions and causes. Omnipresence — what physicists call nonlocality — is a hard pill to swallow in physics.
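For readers who want the gist of Bell's result in symbols, here is the widely used CHSH formulation (a standard textbook statement, not Bell's original notation). E(a, b) denotes the measured correlation between outcomes on two distant particles for detector settings a and b:

```latex
% CHSH combination of correlations for settings a, a' and b, b'
S = E(a,b) - E(a,b') + E(a',b) + E(a',b')
% Any theory with local interactions and causes must satisfy
\lvert S \rvert \le 2 ,
% whereas quantum mechanics allows, for suitably entangled pairs,
\lvert S \rvert \le 2\sqrt{2} \approx 2.83 .
```

The experiments mentioned above repeatedly find values of S above 2, which is precisely what rules out the "reasonable" local theories.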
Yet, the quantum phenomenon of superposition insists on keeping things weird. Here's one way to picture quantum superposition. In a kind of psychedelic dream state, imagine that you had a magical walk-in closet filled with identical shirts, the only difference between them being their color. What's magical about this closet? Well, as you enter this closet, you split into identical copies of yourself, each wearing a shirt of a different color. There is a you wearing a blue shirt, another a red, another a white, etc., all happily coexisting. But as soon as you step out of the closet or someone or something opens the door, only one you emerges, wearing a single shirt. Inside the closet, you are in a superposition state with your other selves. But in the "real" world, the one where others see you, only one copy of you exists, wearing a single shirt. The question is whether the inside superposition of the many yous is as real as the one you that emerges outside.
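For those who prefer their closets in standard quantum notation, the magical-closet state can be sketched as follows (a generic illustration; the shirt-color labels are just stand-ins for measurable states):

```latex
% The 'inside the closet' state: a weighted sum over shirt colors
\lvert \text{you} \rangle = c_{\text{blue}} \lvert \text{blue} \rangle
  + c_{\text{red}} \lvert \text{red} \rangle
  + c_{\text{white}} \lvert \text{white} \rangle + \cdots
% The weights (amplitudes) are normalized:
\sum_i \lvert c_i \rvert^2 = 1
% Opening the door is a measurement: color i is seen with probability |c_i|^2.
```

What the interpretations discussed below disagree about is the status of the amplitudes c_i before the door opens.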
The (modern version of the) Einstein team would say yes. The equations of quantum physics must be taken as the real description of what's going on, and if they predict superposition, so be it. The so-called wave function that describes this superposition is an essential part of physical reality. This point is most dramatically exposed by the many-worlds interpretation of quantum physics, espoused in Carroll's book. For this interpretation, reality is even weirder: the closet has many doors, each to a different universe. Once you step out, all of your copies step out together, each into a parallel universe. So, if I happen to see you wearing a blue shirt in this universe, in another, I'll see you wearing a red one. The price tag for the many-worlds interpretation is to accept the existence of an uncountable number of non-communicating parallel universes that enact all possibilities from a superposition state. In a parallel universe, there was no COVID-19 pandemic. Not too comforting.
Bohr's team would say: take things as they are. If you stepped out of the closet and someone saw you wearing a shirt of a given color, then that is the one. Period. The weirdness of your many superposing selves remains hidden in the quantum closet. Rovelli defends his version of this worldview, called the relational interpretation, in which events are defined by the interactions between the objects involved, be they observers or not. In this example, the color of your shirt is the property at stake, and when I see it, I am entangled with this specific shirt of yours. It could have been another color, but it wasn't. As Rovelli puts it, "Entanglement… is the manifestation of one object to another, in the course of an interaction, in which the properties of the objects become actual." The price to pay here is to give up the hope of ever truly understanding what goes on in the quantum world. What we measure is what we get, and it is all we can say about it.
What should we believe?
Both Carroll and Rovelli are master expositors of science to the general public, with Rovelli being the more lyrical of the pair.
There is no resolution to be expected, of course. I, for one, am more inclined to Bohr's worldview and thus to Rovelli's, although the interpretation I am most sympathetic to, called QBism, is not properly explained in either book. It is much closer in spirit to Rovelli's, in that relations are essential, but it places the observer on center stage, given that information is what matters in the end. (Although, as Rovelli acknowledges, information is a loaded word.)
We create theories as maps for us human observers to make sense of reality. But in the excitement of research, we tend to forget the simple fact that theories and models are not nature but our representations of nature. Unless we nurture hopes that our theories are really how the world is (the Einstein camp) and not how we humans describe it (the Bohr camp), why should we expect much more than this?
Maybe eyes really are windows into the soul — or at least into the brain, as a new study finds.
- Researchers find a correlation between pupil size and differences in cognitive ability.
- The larger the pupil, the higher the intelligence.
- The explanation for why this happens lies within the brain, but more research is needed.
What can you tell by looking into someone's eyes? You can spot a glint of humor, signs of tiredness, or maybe that they don't like something or someone.
But outside of assessing an emotional state, a person's eyes may also provide clues about their intelligence, suggests new research. A study carried out at the Georgia Institute of Technology shows that pupil size is "closely related" to differences in intelligence between individuals.
The scientists found that larger pupils may be connected to higher intelligence, as demonstrated by tests that gauged reasoning skills, memory, and attention. In fact, the researchers claim that the relationship between intelligence and pupil size is so pronounced that it showed up in their two previous studies as well and can be spotted with the naked eye, without any additional scientific instruments. You should be able to tell who scored the highest or the lowest on the cognitive tests just by looking at them, say the researchers.
The pupil-IQ link
The connection was first noticed across memory tasks, looking at pupil dilation as a sign of mental effort. The studies involved more than 500 people aged 18 to 35 from the Atlanta area. The subjects' pupil sizes were measured by eye trackers, which use a camera and a computer to capture light reflecting off the pupil and cornea. As the scientists explained in Scientific American, pupil diameters range from two to eight millimeters. To determine average pupil size, they took measurements of the pupils at rest, while the participants stared at a blank screen for a few minutes.
Another part of the experiment involved having the subjects take a series of cognitive tests that evaluated "fluid intelligence" (the ability to reason when confronted with new problems), "working memory capacity" (how well people could remember information over time), and "attention control" (the ability to keep focusing attention even while being distracted). An example of the latter is a test that attempts to divert a person's focus from a disappearing letter by showing a flickering asterisk on another part of the screen. If a person pays too much attention to the asterisk, they might miss the letter.
The conclusion of the research was that having a larger baseline pupil size was related to greater fluid intelligence, better attention control, and, to a smaller extent, greater working memory capacity. In an email exchange with Big Think, author Jason Tsukahara pointed out, "It is important to consider that what we find is a correlation — which should not be confused with causation."
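To make that distinction concrete, here is a minimal sketch in Python of the kind of correlational analysis the study describes. The data below are synthetic and purely illustrative; only the two-to-eight-millimeter pupil range and the roughly 500-person sample size come from the article.

```python
# Minimal sketch: correlating baseline pupil size with a cognitive score.
# Synthetic data only -- the effect size here is an illustrative assumption.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(seed=0)

n = 500  # roughly the study's sample size
# Hypothetical baseline pupil diameters, kept in the study's 2-8 mm range.
pupil_mm = rng.normal(loc=4.5, scale=1.0, size=n).clip(2.0, 8.0)
# Synthetic fluid-intelligence scores that co-vary weakly with pupil size.
score = 100 + 3.0 * (pupil_mm - pupil_mm.mean()) + rng.normal(0, 10, size=n)

r, p = pearsonr(pupil_mm, score)
print(f"Pearson r = {r:.2f}, p = {p:.3g}")
# A positive r quantifies the association only; as Tsukahara notes,
# it says nothing about which variable (if either) causes the other.
```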
The researchers also found that pupil size seemed to decrease with age. Older people had more constricted pupils, but when the scientists standardized for age, the pupil-size-to-intelligence connection remained.
Why are pupils linked to intelligence?
The connection between pupil size and IQ likely resides within the brain. Pupil size has been previously connected to the locus coeruleus, a part of the brain that's responsible for synthesizing the hormone and neurotransmitter norepinephrine (noradrenaline), which mobilizes the brain and body for action. Activity in the locus coeruleus affects our perception, attention, memory, and learning processes.
As the authors explain, this region of the brain "also helps maintain a healthy organization of brain activity so that distant brain regions can work together to accomplish challenging tasks and goals." Because it is so important, loss of function in the locus coeruleus has been linked to conditions like Alzheimer's disease, Parkinson's, clinical depression, and attention deficit hyperactivity disorder (ADHD).
The researchers hypothesize that people who have larger pupils while in a restful state, like staring at a blank computer screen, have "greater regulation of activity by the locus coeruleus." This leads to better cognitive performance. More research is necessary, however, to truly understand why having larger pupils is related to higher intelligence.
In an email to Big Think, Tsukahara shared, "If I had to speculate, I would say that it is people with greater fluid intelligence that develop larger pupils, but again at this point we only have correlational data."
Do other scientists believe this?
As the scientists point out at the beginning of their paper, their conclusions are controversial and, so far, other researchers haven't been able to duplicate their results. The research team addresses this criticism by explaining that other studies had methodological issues and examined only memory capacity, not fluid intelligence, which is what they measured.