Few could match the famous physicist in his ability to communicate difficult-to-understand concepts in a simple and warm fashion.
- Richard Feynman was a renowned physicist known for legendary work on quantum physics, the Manhattan Project, and the investigation of the Challenger explosion.
- Later in life, however, he became best known for his education work, gaining the nickname "the Great Explainer."
- His series Fun to Imagine works as an excellent primer on Feynman's unique educational style. Here are 9 science lessons he covers in it.
Theoretical physicist Richard Feynman was unparalleled in his wit, warmth, and insight. A gifted conversationalist with a powerful passion for his subject, Feynman loved to talk about physics and was so good at it that he became known as "the Great Explainer." Few others were able to approach the difficult and nebulous realm of physics and break it down into simple, entertaining, and informative nuggets. In his 1983 series Fun to Imagine, Feynman touches on a variety of topics from a big blue chair in his living room in Altadena, California. Here are 9 brief science lessons from that series.
1. Heat is just jiggling atoms
What we think of as heat is really just motion. Feynman explains that the sensation of heat is the "jiggling" of atoms — the jiggling atoms in hot coffee make it hot, and those atoms bump up against the atoms in the ceramic of your coffee mug, causing them to jiggle as well, making them hotter than they were before.
"It brings up another thing that's kind of curious," says Feynman. "If you're used to balls bouncing, you know they slow up and stop after a while. […] As it bounces, it's passing its extra energy, its extra motions, to little patches on the floor each time it bounces and loses a little each time, until it settles down, we say, as if all the motion has stopped." Instead, the downward motion of all the atoms in the ball has simply been transferred into the floor, whose atoms now jiggle a little bit more and which has commensurately become a little bit warmer.
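Feynman's picture can be made quantitative with a standard result from kinetic theory (my addition, not part of his talk): temperature measures the average kinetic energy of the jiggling, which for a gas molecule of mass m gives a typical "jiggle speed" of sqrt(3kT/m). A minimal sketch in Python:

```python
import math

# Kinetic theory, made quantitative: temperature measures the average
# kinetic energy of the jiggling, (3/2) * k_B * T per molecule for an
# ideal gas, which gives an rms speed of sqrt(3 * k_B * T / m).

K_B = 1.380649e-23  # Boltzmann constant, J/K

def rms_speed(temp_kelvin, molecular_mass_kg):
    """Root-mean-square speed of a molecule at the given temperature."""
    return math.sqrt(3 * K_B * temp_kelvin / molecular_mass_kg)

# A nitrogen molecule (N2) has a mass of about 4.65e-26 kg.
m_n2 = 4.65e-26
print(f"N2 at 300 K: {rms_speed(300.0, m_n2):.0f} m/s")
print(f"N2 at 370 K (hot coffee): {rms_speed(370.0, m_n2):.0f} m/s")
```

Hotter simply means faster jiggling: warming the air from room temperature to coffee temperature raises the typical molecular speed from roughly 520 m/s to about 570 m/s.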
Start the top video at 0:50 to watch this lesson.
2. Fire is stored sunlight
Carbon and oxygen have a somewhat paradoxical relationship; once "close" enough to one another, they form a very strong partnership, snapping together. But if they're too "far away" from one another, they'll repel each other. Feynman likens it to a hill with a deep hole in the top. "[An oxygen atom is] rolling along, it doesn't go down in the deep hole because if it starts to climb the hill, it rolls away again. But if you made it go fast enough, it'll fall into the hole."
As we learned before, when we talk about heat, we're really talking about motion, and vice versa. So, if we heat up an atom of oxygen enough, it can roll up this hypothetical hill and fall into the hole. On its way, it might bump into other oxygen atoms, sending them rolling up their hills and falling into their holes, perhaps bumping still more oxygen atoms along the way. This cascades, over and over again, until you have what we call a fire. Wood, for instance, contains a lot of carbon. If the oxygen around it heats up enough, the oxygen and the carbon can snap together into CO2, releasing a lot of energy along the way.
Where did this stored energy come from? Originally, it came from the sunlight striking a tree, which was then cut down and harvested for its wood. "The light and heat that's coming out," explains Feynman, "that's the light and the heat of the Sun that went in. So, it's sort of stored Sun that's coming out when you burn a log."
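To put a rough number on the "stored Sun" (illustrative figures of mine, not Feynman's): the carbon-oxygen partnership he describes, C + O2 → CO2, releases about 393.5 kJ per mole of carbon, the standard textbook enthalpy of combustion for graphite. A quick sketch:

```python
# Rough bookkeeping for "stored sunlight" (illustrative numbers, not
# from Feynman's talk): burning carbon, C + O2 -> CO2, releases about
# 393.5 kJ per mole, the textbook enthalpy of combustion for graphite.

MOLAR_MASS_C = 12.011   # g/mol
HEAT_PER_MOL = 393.5e3  # J released per mole of CO2 formed

def combustion_energy_joules(carbon_kg):
    """Energy released by burning the given mass of carbon to CO2."""
    moles = carbon_kg * 1000 / MOLAR_MASS_C
    return moles * HEAT_PER_MOL

# A dry log might contain roughly 1 kg of carbon.
energy = combustion_energy_joules(1.0)
print(f"~{energy / 1e6:.0f} MJ per kg of carbon")  # about 33 MJ
```

That is tens of megajoules per kilogram: all sunlight that the tree banked, atom pair by atom pair, while it grew.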
Start the top video at 7:18 to watch this lesson.
3. Rubber bands are jiggling, too
Heat, in addition to driving fire and the motion of atoms, is a big part of why rubber bands are stretchy. Rubber bands are composed of kinked chains of molecules that, when stretched out, are bombarded by atoms from the environment that encourage the chains to kink up together again. Feynman proposes a little experiment: "If you take a fairly wide rubber band and put it between your lips and pull it out, you'll certainly notice it's hotter. And if you then let it in, you'll notice it's cooler."
"I've always found rubber bands fascinating," he adds. "The world is a dynamic mess of jiggling things if you look at it right."
Start the top video at 12:08 to watch this lesson.
4. Magnetic force? That's a challenge to explain!
Why do magnets repel? Feynman starts with a force we take for granted: "You're not at all disturbed by the fact that when you put your hand on the chair, it pushes you back." With magnets, "we found out by looking at it that that's the same force, as a matter of fact […] It's the same electrical repulsions involved in keeping your finger away from the chair." The difference, Feynman notes, and the thing that makes magnets seem so unusual, is that their repulsive force acts over a distance. This is because the atoms in a magnet are all spinning in the same direction, magnifying the force such that you can feel it at a distance.
Start the top video at 14:53 to watch this lesson.
Richard Feynman while teaching.
5. Electricity: The reason you don't sink through the floor
It's pretty incredible that a wheel turning from the force of falling water from a dam can, when connected by copper wires, cause a motor to turn many miles away as well. If the wheel at the dam stops, so too does everything connected to that part of the power grid. "That phenomenon, I like to think about a lot. […] It's just iron and copper. If you took a big long loop of copper and add iron at each end and move the piece of iron, the iron moves at the other [end]."
In fact, electricity is the reason why you can't push your finger through a solid object. The negatively charged electrons in your finger are tightly bound to the positively charged protons in your finger, and the same relationship holds true for any solid object. When you try to push your finger through something, the respective protons and electrons can't tolerate the addition of any more positive or negative charge: the electrical charge in your finger's atoms is neutral, and it wants to stay that way. So, the object and your finger push back very hard on one another.
In a wire conducting electricity, the electrical charge of the atoms is not neutral. The energy derived from, say, a dam, pushes electrons from one atom out, which repels the other electrons along the wire. We can use this energy to move a motor on the far end of the wire or turn on a light.
Start the top video at 22:29 to watch this lesson.
6. The mirror and train puzzle
Feynman described two puzzles he was given by his fraternity brothers at MIT. Why is it that when you look at yourself in the mirror, only the left and right sides are reversed and not the top and bottom of the reflected image? How does the mirror know to flip an image along one axis and not the other? Well, if you were facing a mirror with your nose facing north, the left and right sides aren't actually flipped—your right hand and your reflected image's right hand are both in the east. It's your front and back that have been flipped: Your nose faces north, and your reflected image's nose faces south.
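The mirror argument is easy to verify with coordinates. In the sketch below (my setup, not Feynman's), the axes are east, north, and up, and the mirror is the vertical plane in front of you, perpendicular to north; reflection negates only the component normal to the mirror:

```python
# Coordinate check of the mirror puzzle: axes are (east, north, up),
# and the mirror is the vertical plane in front of you, perpendicular
# to north. A plane mirror negates only the component normal to it --
# north-south -- leaving east-west and up-down untouched.

def reflect(v):
    east, north, up = v
    return (east, -north, up)

nose = (0.0, 1.0, 0.0)        # your nose points north
right_hand = (1.0, 0.0, 0.0)  # your right hand is to the east
head = (0.0, 0.0, 1.0)        # your head points up

print(reflect(nose))        # the image's nose points south
print(reflect(right_hand))  # still to the east: not flipped
print(reflect(head))        # up stays up: not flipped
```

The "left-right flip" we perceive comes from mentally walking around behind the mirror to compare, not from anything the mirror does to the east-west axis.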
Feynman thought this was an easy puzzle. A harder one is to ask what keeps a train on a track. When turning a corner in a car, the outside wheels have to go farther than the inside wheels, but cars deal with this using a differential gear, which lets each wheel turn at a different rate. Trains, though, have a solid steel bar between each pair of wheels. How does the train stay on the track? The answer is that trains have conical wheels. When a train turns a corner, the inside wheels ride on the thinner part of the cone, meaning they can rotate quickly without going too far, while the outside wheels ride on the thicker part, meaning each rotation carries them farther.
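The cone geometry can be sketched with a little algebra (illustrative numbers of mine, not from the talk). For the rigid wheelset to roll around a curve of radius R without slipping, the outer wheel's rolling radius must exceed the inner wheel's in the same ratio as their path lengths, which pins down how far the wheelset must shift sideways:

```python
# Conical-wheel geometry sketch with illustrative, typical-looking
# values (not from Feynman's talk). No-slip rolling on a curve needs
#   (r0 + taper*y) / (r0 - taper*y) = (R + g/2) / (R - g/2),
# where y is the sideways shift of the wheelset, r0 the nominal wheel
# radius, g the track gauge, and R the curve radius. Solving gives
#   y = r0 * g / (2 * taper * R).

def lateral_shift(r0, gauge, taper, curve_radius):
    """Sideways shift (m) that lets a coned wheelset roll a curve slip-free."""
    return r0 * gauge / (2 * taper * curve_radius)

# Assumed values: 0.46 m wheel radius, standard 1.435 m gauge,
# a 1-in-20 cone taper, and a 500 m curve.
y = lateral_shift(r0=0.46, gauge=1.435, taper=1 / 20, curve_radius=500.0)
print(f"shift toward the outside of the curve: {y * 1000:.0f} mm")
```

With these numbers the wheelset only needs to drift about a centimeter toward the outside of the curve, which the flanges comfortably allow; the cones then steer it back automatically, exactly the self-centering behavior Feynman admired.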
Start the top video at 32:05 to watch this lesson.
7. Your eyes are eighth-inch black holes
If a sufficiently intelligent bug were sitting in the corner of a pool, it could, in theory, observe the waves in the pool and determine who had dived in. This is what we do with our eyeballs. Like the bug in a pool, we simply take in this shaking stuff (the electromagnetic field) and can learn which objects have "dived" into our pool.
"There's this tremendous mess of waves all over in space, which is the light bouncing around the room and going from one thing to the other. Of course, most of the room doesn't have eighth-inch black holes [our pupils]. It's not interested in light, but the light's there anyway." We can sort this mess out with the instruments we carry around in our eye sockets. Feynman explains that our eighth-inch black holes are tuned to only a small slice of the waves in this pool. The other waves, bigger or smaller, we experience as heat or as the sound broadcast from radios. The craziest thing about this to Feynman? "It's all really there! That's what gets you!"
Start the top video at 37:46 to watch this lesson.
8. Conceiving of inconceivable things
Scale, whether looking at very small things or very big things, is very difficult to conceptualize. The size of an atom compared to an apple, for instance, is the same as the size of an apple to the size of Earth. Feynman explains how difficult it is to consider very large scales, as well: "There's a very large number of stars in the galaxy. There's so many, that if you tried to name them, one a second, naming all the stars in our galaxy, […] it takes 3,000 years. And yet that's not a very big number. If those stars were to drop a one-dollar bill during a year, […] they might take care of the deficit which is suggested for the budget of the United States. You can see what kind of numbers we're dealing with."
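Feynman's star-naming arithmetic is easy to check: one name per second for 3,000 years comes out to roughly the number of stars in the galaxy, on the order of 100 billion:

```python
# Checking Feynman's star-naming arithmetic: one name per second,
# around the clock, for 3,000 years.

SECONDS_PER_YEAR = 365.25 * 24 * 3600
stars = 3000 * SECONDS_PER_YEAR
print(f"{stars:.1e} stars")  # about 9.5e10, i.e. ~100 billion
```

That figure sits comfortably inside the usual estimates of 100-400 billion stars in the Milky Way, which is exactly Feynman's point: a number too big to name is still not a big number by physics standards.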
Start the top video at 43:43 to watch this lesson.
9. Thinking is kind of nutty
Sometimes, we like to mythologize particularly impressive people, Feynman included. But thinking this way can be limiting. Feynman doesn't believe there are particularly "special" people — just those who work and study hard. That's not to say there's no difference between people, however. "I suspect that what goes on in every man's head might be very, very different. The actual imagery, or semi-imagery which comes when we're talking to each other at these high and complicated levels […] We think we're speaking very well and we're communicating, but what we're doing is having this big translation scheme for translating what this fellow says into our images, which are very different."
Start the top video at 55:01 to watch this lesson.
Famous physicists like Richard Feynman think 137 holds the answers to the Universe.
- The fine structure constant has mystified scientists since the 1800s.
- The number 1/137 might hold the clues to the Grand Unified Theory.
- The number appears at the intersection of relativity, electromagnetism, and quantum mechanics.
Does the Universe around us have a fundamental structure that can be glimpsed through special numbers?
The brilliant physicist Richard Feynman (1918-1988) famously thought so, saying there is a number that all theoretical physicists of worth should "worry about". He called it "one of the greatest damn mysteries of physics: a magic number that comes to us with no understanding by man".
That magic number, called the fine structure constant, is a fundamental constant with a value that nearly equals 1/137 - or 1/137.03599913, to be precise. It is denoted by the Greek letter alpha - α.
What's special about alpha is that it's regarded as the best example of a pure number, one that doesn't need units. It actually combines three of nature's fundamental constants - the speed of light, the electric charge carried by one electron, and Planck's constant - as physicist and astrobiologist Paul Davies explains to Cosmos magazine. Appearing at the intersection of such key areas of physics as relativity, electromagnetism, and quantum mechanics is what gives 1/137 its allure.
Physicist Laurence Eaves, a professor at the University of Nottingham, thinks the number 137 would be the one you'd signal to the aliens to indicate that we have some measure of mastery over our planet and understand quantum mechanics. The aliens would know the number as well, especially if they developed advanced sciences.
The number preoccupied other great physicists as well, including Nobel Prize-winning Wolfgang Pauli (1900-1958), who was obsessed with it his whole life.
"When I die my first question to the Devil will be: What is the meaning of the fine structure constant?" Pauli joked.
Pauli also referred to the fine structure constant during his Nobel lecture on December 13th, 1946 in Stockholm, saying a theory was necessary that would determine the constant's value and "thus explain the atomistic structure of electricity, which is such an essential quality of all atomic sources of electric fields actually occurring in nature."
One use of this curious number is to measure the interaction of charged particles like electrons with electromagnetic fields. Alpha determines how fast an excited atom can emit a photon. It also affects the details of the light emitted by atoms. Scientists have been able to observe a pattern of shifts of light coming from atoms called "fine structure" (giving the constant its name). This "fine structure" has been seen in sunlight and the light coming from other stars.
The constant figures in other situations as well, making physicists wonder why. Why does nature insist on this number? It has appeared in various calculations in physics since the 1880s, spurring numerous attempts to come up with a Grand Unified Theory that would incorporate it. So far, no single explanation has taken hold. Recent research has even raised the possibility that the constant has increased slightly over the last six billion years.
If you'd like to know the math behind the fine structure constant more specifically, you arrive at alpha by putting the three constants h, c, and e together in the equation hc/2πe² = 137.03599913. As the units of c, e, and h cancel each other out, the "pure" number 137.03599913 is left behind. For historical reasons, says Professor Davies, the inverse of the equation is used: 2πe²/hc = 1/137.03599913. If you're wondering about the precise value of that fraction - it's 0.007297351.
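For the curious, the same number falls out of a few lines of Python using CODATA values in SI units, where alpha reads e²/(4πε₀ħc); the 2πe²/hc form quoted above is the same quantity written in Gaussian units:

```python
import math

# Computing alpha from CODATA values in SI units, where it reads
# e^2 / (4 * pi * epsilon_0 * hbar * c). The Gaussian-units form
# 2*pi*e^2/(h*c) is the same dimensionless quantity.

E = 1.602176634e-19      # elementary charge, C (exact by definition)
EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m
HBAR = 1.054571817e-34   # reduced Planck constant, J*s
C = 299792458.0          # speed of light, m/s (exact by definition)

alpha = E**2 / (4 * math.pi * EPS0 * HBAR * C)
print(f"alpha   = {alpha:.9f}")      # ~0.007297353
print(f"1/alpha = {1 / alpha:.6f}")  # ~137.035999
```

The units of charge, energy, time, and length all cancel, which is the whole point: any civilization measuring in any unit system lands on the same 1/137.036.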
Measuring quantum gravity has proven extremely challenging, stymying some of the greatest minds in physics for generations.
For over a century, the two leading theories in physics have had irreconcilable differences, and scientists have scrambled to find ways to square them, to no avail. An experiment proposed in 1957 by American luminary Richard Feynman is now getting a makeover, and the results could be significant.
Scientists at Oxford University and University College London (UCL) are attempting to overhaul one of the late Nobel Laureate's experiments and, in doing so, hope to heal the rift in dramatic fashion. Could a Theory of Everything be near? Such a theory would incorporate all four physical forces - gravity, electromagnetism, and the strong and weak nuclear forces - into one solid working theory.
Thus far, theoretical physicist Steven Weinberg, himself a Nobel Laureate, has only been able to combine electromagnetism and the weak nuclear force. A "final theory," as Weinberg calls it, would mark the end of physics as we know it. Although the laws of general relativity and quantum mechanics work exceptionally well in their own spheres, some of the rules that govern one area don't work in the other, and vice versa. For instance, relativity explains the gravitational force as it relates to bodies on Earth or in space. But it falls apart on the quantum level.
The merging of two neutron stars. Einstein’s general relativity helps us understand the gravitational forces involved well. Where it gets lost is on the quantum level. Credit: European Space Agency (ESA).
The current upgrade to Feynman's proposal focuses on quantum gravity. Two papers on the upcoming experiment were published in the journal Physical Review Letters. In the first, the researchers write, "Understanding gravity in the framework of quantum mechanics is one of the great challenges in modern physics." Many experiments have been proposed, but it has proven extremely difficult to test quantum gravity in the lab.
One of the reasons, the researchers write in the second paper, is that "Quantum effects in the gravitational field are exceptionally small, unlike those in the electromagnetic field." Sougato Bose leads the UCL team. He told Physics World: "For example, even the electrostatic force between two electrons overtakes the gravitational force between two kilogram masses by several orders of magnitude."
These physicists believe that if they can detect gravity on the quantum level, it would help us better understand why it operates so differently there, and perhaps reveal the secret to navigating between our two prevailing theories. Feynman's idea to test for quantum gravity centers on something known as superposition. A particle is thought to exist in all possible states or positions at once, until measured. Only then can you nail down its exact location (or its velocity, but not both).
Feynman speculated that using quantum entanglement, one could take a small mass and place it inside a gravitational field, causing it to become entangled with the field on the quantum level. Then, the physicist would be able to detect the field’s interference, before indicating the mass’s position. The interference itself would cause the mass to take a single, specific location or form, which would occur before the mass separated itself from the field. And so in this way, quantum gravity could be detected.
Illustration of a quantum gravity photon race. The purple or high energy photon carries a million times the energy of the yellow one. Yet, it’s thought to move slower as it interacts more with the frothy material of space. In 2009, satellites measured a gamma ray burst from a neutron star collision. Curiously, after traveling approximately 7 billion light years, a pair of such photons arrived just nine-tenths of a second apart. Credit: NASA Goddard Space Flight Center.
Oxford researchers Chiara Marletto and Vlatko Vedral worry that since entanglement isn’t measured directly in Feynman’s proposed experiment, it wouldn’t provide direct evidence of quantum gravity. By quantizing not one but two masses and entangling them, the Oxford physicists say, quantum gravity can be detected directly. Each mass would be in superposition and entangled in a quantum gravity field. UCL physicists added their own element, a “quantum gravity mediator,” to entangle the masses.
To conduct the experiment, two identical interferometers will be placed adjacent to one another. These devices usually split a beam of light into separate paths and then recombine them. But down on the quantum level, they interfere with a mass's quantum wave function in order to superimpose its quantum state. If gravity is operating on the quantum level, the two masses will become entangled before each leaves its own interferometer.
An artist’s depiction of quantum entanglement. Credit: The National Science Foundation.
Dr. Marletto explained in Physics World,
Our two teams took slightly different approaches to the proposal. Vedral and I provided a general proof of the fact that any system that can mediate entanglement between two quantum systems must itself be quantum. On the other hand, Bose and his team discussed the details of a specific experiment, using two spin states to create the spatial superposition of the masses.
This isn’t a done deal by any means. Electromagnetic forces might interfere with the entanglement before researchers are able to measure the effects of quantum gravity. Even if the gravitational field is quantized, quantum gravity may be harder to detect than scientists predict. But if it does work, it could lead to quite a breakthrough.
There are a lot of theories on how gravity operates on the quantum level. It may originate from particles called gravitons, which would be carriers of gravity much like how photons carry electromagnetism. Quantum gravity and string theory each have their own take. The results of this experiment could help us sort quantum gravity out, and perhaps lead to a final theory - the dawn of a whole new understanding of the universe and how all its forces fit together.
To learn more about where we're at with the Theory of Everything, watch the video below.
Will we ever have a Theory of Everything? Theoretical physicist Lawrence Krauss isn't sure that's the right question to be asking.
It’s no surprise that understanding highly abstract mathematics can be challenging, says theoretical physicist Lawrence Krauss. The organ of your body that does the understanding — the brain — is like the organ that does the waste processing — the kidney. Both are products of millions of years of evolution, and neither will change overnight. The type of thinking that helped us survive on the African savannas doesn’t help us grasp quantum mechanics. We should expect to not understand everything about the universe, and to keep asking questions…
Lawrence Krauss' most recent book is The Greatest Story Ever Told -- So Far: Why Are We Here?.
Don't believe every science study you read, because sometimes not even their authors believe them. Here are the issues corrupting good, honest science – and how to fix them.
It’s a dirty little secret in the science community that most published scientific studies aren’t 100% true. As Nobel Prize-winning biologist Thomas Sudhof told PLOS, there are a host of problems with science journals. He summarizes those five problems as:
1. Hidden conflicts of interest between the journal and its reviewers
2. Trivial accountability measures for journals and reviewers
3. Expensive publishing costs and limited journals for authors to publish in
4. A murky, hodge-podge peer-review process
5. Experiments with unreproducible results
Once these studies are published, they get into the media's unreliable little hands - some of whom are genuinely confused by the science, and others of whom are happily sensationalizing it for publicity gains. Depending on the day and the news outlet, coffee will either kill you or be the secret to eternal life (depending, of course, on which orifice you administer it through). Owning a certain pet can make you infertile. Smelling farts can prevent cancer. Eating chocolate can turn you into a Nobel Prize winner. Watching pornography could make men better weightlifters. The list could, and unfortunately does, go on.
It’s perhaps best said by John Oliver in his excellent report on sham science studies: “In science, you don’t just get to cherry-pick the parts that justify what you were going to do anyway. That’s religion. You’re thinking of religion,” he says.
Much of the information gets dumbed down or selectively sensationalized as it passes from news source to news source, and some of it was dodgy from the start thanks to publicity-hungry scientists - which you can kind of understand (but not entirely forgive), as their continued funding depends on finding things that are spectacular, even if a little fictional. And yet it appears grant money is pissing down over Aston University in England, where a study concluded that toast falling off a table will tend to fall butter-side down. This important information was published in the European Journal of Physics.
The five problems Sudhof describes above are big, and all of them need to be fixed. When they are, papers published in scientific journals will not only be more honest; they'll be more varied. More kinds of research will be published - smaller experiments, overlooked topics, and even experiments with unfavorable or negative results. All of those outcomes would make scientific papers more approachable to the general public, and would cut down on the amount of pseudoscience that attempts to explain the actual science and ends up confusing everyone.
So is there a way to fix those 5 problems? You bet! At least from the scientific end (the media is another kettle of fish). Sudhof offers 6 easy tips scientists can use to fix their publication problems and get the public interested in their work:
Credit: Laurie Vazquez/Big Think
1. Post research to preprint servers before publication, giving researchers time to improve their work
When a scientist runs an experiment and has a significant result to report, their first step is to write it all up. Their second step is to find a journal to publish in. This is an enormous pain for many reasons, but one of the biggest is that every journal uses a different submission format. Journals collect and publish materials in different ways; streamlining the editorial process by putting all the journals on the same publishing system would let researchers focus more on honing their results, instead of futzing with formatting. Cold Spring Harbor Laboratory’s bioRxiv is already doing this. Hopefully more platforms follow.
2. Clarify review forms to give workable feedback to authors
Because each journal has its own submission format, they’ve also got their own publishing process. That means they use different methods to review papers, and those methods are often forms that are “cumbersome or insufficient to provide thoughtful and constructive feedback to authors,” Sudhof explains. Streamlining those forms would cut down on the amount of back and forth between the researcher and the journal, again allowing them to focus more on clarifying their work than formatting it.
3. Train reviewers and editors to put burgeoning and established reviewers on a level playing field
Journals have a variety of people reviewing proposed publications. Some of them were trained decades ago. Some of them are brand-new to reviewing. None of them have a standardized review process that tells them what to look for. Investing in training allows them to assess papers fairly and give constructive feedback to the researcher.
4. Reduce the complexity of experiments to make the results easier to reproduce
“Many experiments are by design impossible to repeat,” Sudhof writes. “Many current experiments are so complex that differences in outcome can always be attributed to differences in experimental conditions (as is the case for many recent neuroscience studies because of the complexity of the nervous system). If an experiment depends on multiple variables that cannot be reliably held constant, the scientific community should not accept the conclusions from such an experiment as true or false.”
5. Validate the methods of the experiment
Sudhof again: “Too often, papers in premier journals are published without sufficient experimental controls—they take up too much space in precious journal real estate!—or with reagents that have not been vetted after they were acquired.”
6. Publish ALL results, not just ones that support the conclusion you want to make
Journals are a business, and as such they tend to publish results that will encourage people to buy them. In this case, that means focusing on experiments with positive results. Sudhof takes particular issue with this, citing the “near impossibility of actually publishing negative results, owing to the reluctance of journals—largely motivated by economic pressures—to devote precious space to such papers, and to the reluctance of authors to acknowledge mistakes.” However, not all journals are like that. PLOS ONE lets scientists publish “negative, null and inconclusive” results, not just ones that support the experiment. That allows for a more comprehensive understanding of the experiment, and can even provide more helpful data than positive results. Hopefully more journals follow suit.
By taking these 6 steps, scientists would make their results clearer to the public. That would make discoveries easier to understand, help increase scientific curiosity, and cut down on misinformation. It would also force scientists to communicate in plain English, which would make a serious dent in the amount of pseudoscience we hear on a daily basis. Physicist and renowned skeptic Richard Feynman explained it to us this way: "Without using the new word which you have just learned, try to rephrase what you have just learned in your own language." Pseudoscience explanations are larded with jargon and often can't be explained in plain English; without the jargon, the explanation falls apart at the seams. Actual science can - and should - do better.
Plus, the sooner pseudoscience goes away, the happier – and smarter – we’ll all be. The ball’s in your court, scientists. Run with it.