Big Think Interview With Krisztina Holly
Krisztina Holly: So there are two main parts to the university. There is the research enterprise and there is the educational enterprise, and I definitely see both parts of it changing a lot in the next decade or so. So, speaking about the educational enterprise, it's not enough for people to learn the skills that they're learning today, as valuable as they may be; it's going to be very important for them to learn innovation skills that enable them to better communicate their ideas and communicate a value proposition, figure out how to make greater impact with their ideas by enrolling other people in their vision, and to understand how to finance their idea and how to turn it into a sustainable business, nonprofit, whatever form it is. So that is going to be really important. We think it is especially important at the PhD level. We think that is something that has been ignored. In fact, Kurt Carlson, who is the CEO of SRI International, a non-profit research lab in Menlo Park, was giving a talk recently and he said that they get the most amazing PhDs from around the world and around the country to come and work for them, and despite that, it takes them seven to ten years to become fully productive members of the team. Why is that? It is because they lack the innovation skills. They lack the skills of understanding how they fit into the innovation ecosystem, how they communicate their ideas, and how they address real-world problems. So that's perfect validation for the fact that at USC we just announced, and we're launching for the fall, an innovation diploma program for PhD students that is free of charge for PhDs. It is a three-course sequence unlike any other program that we're aware of, and we think it is really, really valuable for our students. We're not trying to turn PhDs into business people. We don't think that is appropriate. We don't think that it always works. We want to keep them as researchers at the cutting edge of their field; their whole goal is to become the absolute best in a discipline. There have been some criticisms that academics don't understand the bigger picture or that they're too specialized. The reality is, if you're going to be the absolute best you have to be very specialized, but that doesn't preclude you from understanding how to communicate with others who can take your idea and make it into something really impactful. So it's sort of bridging that gap by making the academics aware, and then of course we'd like to focus on the business community as well, to bring them closer to academia.
Question: How might digital scholarship impact innovation at universities?
Krisztina Holly: There are a lot of changes happening now that are really going to impact the way innovation happens in the university. One of them, for example, is open access to research results. People are publishing increasingly in open access journals, and in fact I think there have been about 5,000 new open access journals that have popped up online in the last couple of years that are circumventing the typical peer-reviewed printed journal publications, and that will have a significant effect in the future. It's not just a matter of open access to the papers; there has also been a greater drive towards open access to the data itself. It is somewhat controversial because there is definitely an interest by faculty, given all the work that they put into collecting that data. This has been a challenge for a while, but it is exacerbated by this new open access. How do you get to benefit from your own data that you've worked so hard to collect and then publish on? How long is it appropriate to hold back that data before you share it with other people? Obviously the sooner you get the data out there the more people will benefit, and at the same time you need to motivate faculty to be collecting that data in the first place, so that will be an interesting thing to see.
Also, digital scholarship is changing the output of research. It used to be that you could do some research, write it up in a thesis or a paper, publish it or put it on a bookshelf, and that was your publication. That is not going to cut it anymore. You have digital multimedia output. How do you archive that? For example, we have this system that was developed at USC in collaboration with some other universities called Hypercities. It was developed by a historian at USC, Phil Ethington, and what it is, is you can put geo-rectified maps and geo-tagged photographs into the system. I can look at my neighborhood and then click on a button and it goes from the view from the sky, the satellite view, down into "Well, let's see what the map looked like from 1986. Now let's see this other map from 1920," and you realize, "My God, there is no marina there," and it's almost like going through time and seeing how things were. You can look at different photographs, and very much in a crowdsourcing fashion it enables other historians now. It's this platform where other people can add to this archive of information, so it brings up some interesting questions. One of the questions is how do you store that kind of output if it is not a piece of paper that you can put in a library or scan in? How do you archive this? How do you enable people to access that information? And if you are allowing people to contribute to it, then how do you give proper credit to those individuals who are contributing to this piece of scholarship if there are now hundreds of people contributing to it? So this is very different. It's a brave new world. It's different from the way it was 20, 30 years ago, and it's going to continue to change.
Krisztina Holly: It is an interesting challenge that in order to motivate people to excel and do things, it's part of human nature that there needs to be some sort of incentive. In the market economy it's very much based on financial rewards. In academia it is very much based on reputation, and so either way there is competition. I do think academia is much more collaborative, so although people can criticize academics at times for holding back certain research results—and it's not ideal, it's not optimal—at the same time I do think that there is a real sense of collaboration and the desire to create great results together. But we do have to be collaborating more, and we are collaborating more. A perfect example is the Human Genome Project. That would not have come together unless you had many universities and researchers working for the greater good on the project, and ultimately it was clear who the big contributors were. Being able to track that is really part of the whole ethic, but there are challenges, because if you're starting to bring together lots of other people, you want to make sure that we maintain that ethic of providing acknowledgment to the people who contribute.
We have lots of big challenges ahead of us, whether it is trying to reduce the cost of solar energy, trying to deliver clean water to the whole world, or renewable energy in general and global warming. All of these things are going to require large collaborations, and I don't know that we've completely figured that out yet. It's just a prediction that it will cause some pressure and some challenges for universities, because right now the larger elite universities in particular have large research enterprises that they can build on, and they can build on their reputation by bringing in more research dollars and doing more exciting research. At the same time, if universities are collaborating more on programs, then they will maybe be asking themselves: "How do we preserve our brand?" Because brand is important in that collaboration. So individual universities need to have a value proposition so that they are not just a place where faculty sit and get a paycheck. Faculty can take their research and move to another place, so it will put more pressure on universities to ensure that they're doing their jobs and creating that innovative environment that enables people to collaborate and work together. That is really one of the huge values of universities, and a place like USC, we've been around for almost 130 years... absolutely integral to the local community, and we've built up this faculty over the years. That enables us to get the absolute best students to come through. So it's based on a real foundation, and we just need to make sure that we maintain that and keep growing and keep increasing that, or else we're not going to be relevant.
A conversation with the vice provost for innovation at the University of Southern California.
Milgram's experiment is rightly famous, but does it show what we think it does?
- In the 1960s, Stanley Milgram was sure that good, law-abiding Americans would never be able to follow orders like the Germans in the Holocaust.
- His experiments proved him spectacularly wrong. They showed just how many of us are willing to do evil if only we're told to by an authority figure.
- Yet, parts of the experiment were set up in such a way that we should perhaps conclude something a bit more nuanced.
Holding a clipboard and wearing a lab coat makes you a very powerful person. Add in a lanyard and a confident voice, and you're pretty much in Ocean's Eleven.
Though we believe ourselves to be contrarians, most of us like to obey authority. We answer questions, help with any number of tasks, and obey commands unthinkingly. The vast majority of the time, this is relatively harmless and even requisite for a functioning society, but it can also lead humanity to very dark places.
It could happen here
As we've seen with Asch's experiments on conformity, the post-World War II community was determined to understand how and why the Holocaust took place. Just after the trial of Adolf Eichmann, the American media and public came to see German society as a special kind of monster, uniquely willing to follow orders unthinkingly, at odds with any sense of duty or morality.
Into this came Stanley Milgram. In 1961, Milgram set up a series of experiments to show what he expected to find: that the German people were more susceptible to authoritarianism than Americans. Milgram believed, as a lot of people did, that the American people would never be capable of such horrendous evil.
The experiment was to be set up in two stages: the first would be on American subjects, to gauge how far they would obey orders; the second would be on Germans, to prove how much they differed. The results stopped Milgram in his tracks.
Shock, shock, horror
Milgram wanted to ensure that his experiment involved as broad and diverse a group of people as possible. In addition to testing the American vs. German mindset, he wanted to see how much age, education, employment, and so on affected a person's willingness to obey orders.
So, the original 40 participants he gathered came from a wide spectrum of society, and each was told that they were to take part in a "memory test." They were to determine the extent to which punishment affects learning and the ability to memorize.
The experiment involved three people. First, there was the "experimenter," dressed in a lab coat, who gave instructions and prompts. Second, there was an actor who was the "learner." Third, there was the participant, who thought that they were acting as the "teacher" in the memory test. The apparent setup was that the learner had to match pairs of words after being taught them, and whenever they got an answer wrong, the teacher had to administer an electric shock. (The teachers, that is, the participants, were given a shock as well so they would know what kind of pain the learner would experience.) At first, the shock was set at 15 volts.
The learner (actor) repeatedly made mistakes in each session, and the teacher was told to increase the voltage each time. A tape recording was played in which the learner (apparently) made sounds as if in pain. As it went on, the learner would plead and beg for the shocks to stop. The teacher was told to keep increasing the voltage as punishment, up to a level that was explicitly described as being fatal, all the more alarming because the learner was desperately saying he had a heart condition.
The question Milgram wanted to answer: how far would his participants go?
Just obeying orders
The results were surprising. Sixty-five percent of the participants were willing to give a 450-volt shock described as lethal, and all administered a 300-volt shock described as traumatically painful. It should be repeated, this occurred despite the learner (actor) begging the teacher (participant) to stop.
In the studies that came after, in a variety of different setups, similar numbers came up again and again. They showed that roughly two out of three people would be willing to kill someone if told to by an authority figure. Milgram proved that all genders, ages, and nationalities were depressingly capable of inflicting incredible pain or worse on innocent people.
Major limitations in Milgram's experiment
Milgram took many steps to make sure that his experiment was rigorous and fair. He used the same tape recording of the "learner" screaming, begging, and pleading for all participants. He made sure the experimenters used only the same four prompts each time when the participants were reluctant or wanted to stop. He even made sure that he himself was not present at the experiment, lest he interfere with the procedure (something Philip Zimbardo did not do).
But, does the Milgram experiment actually prove what we think it does?
First, the experimenters were permitted to remind the participants that they were not responsible for what they did and that the team would take full blame. This, of course, does not make the study any less shocking, but it does perhaps change the scope of the conclusions. Perhaps the experiment reveals more about our ability to surrender responsibility and our willingness simply to become a tool. The conclusion is still pretty depressing, but it shows what we are capable of when offered absolution rather than when simply following orders.
Second, the experiment took place in a single hour, with very little time either to deliberate or talk things over with someone. In most situations, like the Holocaust, the perpetrators had ample time (years) to reflect on their actions, and yet, they still chose to turn up every day. Milgram perhaps highlights only how far we'll go in the heat of the moment.
Finally, the findings do not tell the whole tale. The participants did not shock the learner with sadistic glee. They all showed signs of serious distress and anxiety, such as nervous laughing fits. Some even had seizures. These were not willing accomplices but participants essentially forced to act a certain way. (Since then, many scientists have argued that Milgram's experiment was hugely unethical.)
The power of authority
That all being said, there's a reason why Milgram's experiment stays with us today. Whether it's evolutionarily or socially drilled into us, it seems that humans are capable of doing terrible things, if only we are told to do so by someone in power — or, at the very least, when we don't feel responsible for the consequences.
One silver lining to Milgram's work is that it can inoculate us against such drone-like behavior. It can help us to resist. Simply knowing how far we can be manipulated helps us say, "No."
As the American population grows, fewer people will die of cancer.
- A new study projects that cancer deaths will decrease in relative and absolute terms by 2040.
- The biggest decrease will be among lung cancer deaths, which are predicted to fall by 50 percent.
- Cancer is like terrorism: we cannot eliminate it entirely, but we can minimize its influence.
As the #2 leading cause of death, cancer takes the lives of about 600,000 Americans each year. In comparison, heart disease (#1) claims more than 650,000 lives, while accidents (#3) take about 175,000 lives. (In 2020 and likely 2021, COVID will claim the #3 spot.)
Headlines are usually full of terrible news about cancer. Seemingly, you can't get away from anything that causes it. RealClearScience made a list of all the things blamed for cancer — antiperspirants, salty soup, eggs, corn, Pringles, bras, burnt toast, and even Facebook made the list.
The reality, however, is much more optimistic. We're slowly but surely winning the war on cancer.
Winning the war on cancer
How can we make such a brazen statement? A new paper published in the journal JAMA Network Open tracks trends in cancer incidence and deaths and makes projections to the year 2040. The authors predict that around 568,000 Americans will have died of cancer in 2020, but they project that number to fall to 410,000 by 2040. That's a drop of nearly 28 percent, despite the U.S. population being projected to grow from roughly 333 million today to 374 million in 2040, an increase of 12 percent. That means cancer deaths will decrease in both relative and absolute terms.
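For readers who want to check the arithmetic, here is a minimal sketch in Python, using only the figures quoted above (the variable names and the per-100,000 framing are ours), that works out both the absolute and the per-capita decline:

```python
# Quick sanity check of the projections quoted above (all inputs are the
# article's numbers, not new data; variable names are ours).
deaths_2020 = 568_000        # projected U.S. cancer deaths, 2020
deaths_2040 = 410_000        # projected U.S. cancer deaths, 2040
pop_2020 = 333_000_000       # approximate U.S. population today
pop_2040 = 374_000_000       # projected U.S. population, 2040

death_drop = (deaths_2020 - deaths_2040) / deaths_2020
pop_growth = (pop_2040 - pop_2020) / pop_2020

# Deaths per 100,000 people capture the decline in relative terms.
rate_2020 = deaths_2020 / pop_2020 * 100_000
rate_2040 = deaths_2040 / pop_2040 * 100_000

print(f"Absolute drop in deaths: {death_drop:.1%}")               # ~27.8%
print(f"Population growth:       {pop_growth:.1%}")               # ~12.3%
print(f"Deaths per 100,000: {rate_2020:.0f} -> {rate_2040:.0f}")  # ~171 -> ~110
```

Measured per 100,000 people, the projected death rate falls by roughly a third, which is one way to read "decreasing in relative terms."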
What accounts for this unexpected good news? The lion's share is the number of deaths attributable to lung cancer, which is projected to decrease by more than 50 percent, from 130,000 to 63,000. This drop is largely due to the decreasing use of tobacco products. Other deaths predicted to decline include those from colorectal, breast, prostate, and ovarian cancers, among others, such as leukemia and non-Hodgkin lymphoma (NHL).
The authors credit screening and biomedical advances for saving many of these lives. For instance, lead author Dr. Lola Rahib wrote in an email to Big Think that "colonoscopies remove precancerous polyps." She also noted that targeted therapies and immunotherapies have helped reduce the number of deaths from leukemia and NHL.
We'll never cure cancer
Now the bad news: We'll never cure cancer. There are at least three reasons for this. The first is obvious: We all die. The lifetime prevalence of death is 100 percent. The truth is that we are running out of things to die from. After a long enough period of time, something gives out — often your cardiovascular system or nervous system. Or you develop cancer.
The second reason is that we are multicellular organisms and, hence, we are susceptible to cancer. (Contrary to popular myth, sharks get cancer, too.) The cells of multicellular organisms face an existential dilemma: they can either get old and stop dividing (a process called senescence) or become immortal but cancerous. For this reason, the problem of cancer may not have a solution.
Finally, there isn't really such a thing as a disease called "cancer." What we call cancer is actually a collection of several different diseases, some of which are preventable (like cervical cancer with the HPV vaccine) or curable (like prostate cancer). Unfortunately, some cancers probably never will be curable, not least because cancers can mutate and develop resistance to the drugs we use to treat them.
But the overall optimism still stands: We are slowly and incrementally winning the war on cancer. Like terrorism, it's not a foe that we can completely vanquish, but it is one whose influence we can minimize in our lives.
We explore the history of blood types and how they are classified to find out what makes the Rh-null type important to science and dangerous for those who live with it.
- Fewer than 50 people worldwide have 'golden blood' — or Rh-null.
- Blood is considered Rh-null if it lacks all of the 61 possible antigens in the Rh system.
- It's also very dangerous to live with this blood type, as so few people have it.
Golden blood sounds like the latest in medical quackery. As in, get a golden blood transfusion to balance your tantric midichlorians and receive a free charcoal ice cream cleanse. Don't let the New-Agey moniker throw you. Golden blood is actually the nickname for Rh-null, the world's rarest blood type.
As Mosaic reports, the type is so rare that only about 43 people have been reported to have it worldwide, and until 1961, when it was first identified in an Aboriginal Australian woman, doctors assumed embryos with Rh-null blood would simply die in utero.
But what makes Rh-null so rare, and why is it so dangerous to live with? To answer that, we'll first have to explore why hematologists classify blood types the way they do.
A (brief) bloody history
Our ancestors understood little about blood. Even the most basic of blood knowledge — blood inside the body is good, blood outside is not ideal, too much blood outside is cause for concern — escaped humanity's grasp for an embarrassing number of centuries.
Absent this knowledge, our ancestors devised less-than-scientific theories as to what blood was, theories that varied wildly across time and culture. To pick just one, the physicians of Shakespeare's day believed blood to be one of four bodily fluids or "humors" (the others being black bile, yellow bile, and phlegm).
Handed down from ancient Greek physicians, humorism stated that these bodily fluids determined someone's personality. Blood was considered hot and moist, resulting in a sanguine temperament. The more blood people had in their systems, the more passionate, charismatic, and impulsive they would be. Teenagers were considered to have a natural abundance of blood, and men had more than women.
Humorism led to all sorts of poor medical advice. Most famously, Galen of Pergamum used it as the basis for his prescription of bloodletting. Sporting a "when in doubt, let it out" mentality, Galen declared blood the dominant humor, and bloodletting an excellent way to balance the body. Blood's relation to heat also made it a go-to for fever reduction.
While bloodletting remained common until well into the 19th century, William Harvey's discovery of the circulation of blood in 1628 would put medicine on its path to modern hematology.
Soon after Harvey's discovery, the earliest blood transfusions were attempted, but it wasn't until 1665 that the first successful transfusion was performed by British physician Richard Lower. Lower's operation was between dogs, and his success prompted physicians like Jean-Baptiste Denis to try to transfuse blood from animals to humans, a process called xenotransfusion. The deaths of human patients ultimately led to the practice being outlawed.
The first successful human-to-human transfusion wouldn't be performed until 1818, when British obstetrician James Blundell managed it to treat postpartum hemorrhage. But even with a proven technique in place, in the following decades many blood-transfusion patients continued to die mysteriously.
Enter Austrian physician Karl Landsteiner. In 1901 he began his work to classify blood groups. Building on the work of Leonard Landois — the physiologist who showed that when the red blood cells of one animal are introduced to a different animal's, they clump together — Landsteiner suspected that a similar reaction might occur in human-to-human transfusions, which would explain why transfusion success was so spotty. In 1909, he classified the A, B, AB, and O blood groups, and for his work he received the 1930 Nobel Prize in Physiology or Medicine.
What causes blood types?
It took us a while to grasp the intricacies of blood, but today, we know that this life-sustaining substance consists of:
- Red blood cells — cells that carry oxygen and remove carbon dioxide throughout the body;
- White blood cells — immune cells that protect the body against infection and foreign agents;
- Platelets — cells that help blood clot; and
- Plasma — a liquid that carries salts and enzymes.
Each component has a part to play in blood's function, but the red blood cells are responsible for our differing blood types. These cells have proteins* covering their surface called antigens, and the presence or absence of particular antigens determines blood type — type A blood has only A antigens, type B only B, type AB both, and type O neither. Red blood cells sport another antigen called the RhD protein. When it is present, a blood type is said to be positive; when it is absent, it is said to be negative. The typical combinations of A, B, and RhD antigens give us the eight common blood types (A+, A-, B+, B-, AB+, AB-, O+, and O-).
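That naming rule is mechanical enough to write down. Below is an illustrative sketch (ours, not any blood bank's software; real typing is of course a laboratory procedure) that derives the eight common types from the presence or absence of the A, B, and RhD antigens:

```python
# Illustrative sketch (ours): deriving the eight common blood types from
# the presence or absence of the A, B, and RhD antigens described above.
# This only encodes the rule by which the types are named.

def blood_type(has_a: bool, has_b: bool, has_rhd: bool) -> str:
    if has_a and has_b:
        abo = "AB"
    elif has_a:
        abo = "A"
    elif has_b:
        abo = "B"
    else:
        abo = "O"
    return abo + ("+" if has_rhd else "-")

# Enumerating every combination reproduces the familiar eight types.
types = [blood_type(a, b, rh)
         for a in (True, False)
         for b in (True, False)
         for rh in (True, False)]
print(types)  # ['AB+', 'AB-', 'A+', 'A-', 'B+', 'B-', 'O+', 'O-']
```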
Blood antigen proteins play a variety of cellular roles, but recognizing foreign cells in the blood is the most important for this discussion.
Think of antigens as backstage passes to the bloodstream, while our immune system is the doorman. If the immune system recognizes an antigen, it lets the cell pass. If it does not recognize an antigen, it initiates the body's defense systems and destroys the invader. So, a very aggressive doorman.
While our immune systems are thorough, they are not too bright. If a person with type A blood receives a transfusion of type B blood, the immune system won't recognize the new substance as a life-saving necessity. Instead, it will consider the red blood cells invaders and attack. This is why so many people either grew ill or died during transfusions before Landsteiner's brilliant discovery.
This is also why people with O negative blood are considered "universal donors." Since their red blood cells lack A, B, and RhD antigens, immune systems don't have a way to recognize these cells as foreign and so leave them well enough alone.
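To make the doorman analogy concrete, here is a simplified sketch (ours) of the rule implied above: a red-cell donation passes the doorman only when the donor's cells carry no antigen the recipient's own cells lack. Real cross-matching also involves antibodies and many more antigen systems, so treat this as an illustration, not a clinical rule.

```python
# Simplified sketch (ours) of the "doorman" rule: red cells are accepted
# only if they carry no antigen the recipient's own cells lack.

ANTIGENS = {
    "O-": set(),        "O+": {"RhD"},
    "A-": {"A"},        "A+": {"A", "RhD"},
    "B-": {"B"},        "B+": {"B", "RhD"},
    "AB-": {"A", "B"},  "AB+": {"A", "B", "RhD"},
}

def red_cells_compatible(donor: str, recipient: str) -> bool:
    """Donor antigens must be a subset of the recipient's antigens."""
    return ANTIGENS[donor] <= ANTIGENS[recipient]

print(red_cells_compatible("O-", "AB+"))  # True: O- is the universal donor
print(red_cells_compatible("B+", "A+"))   # False: the B antigen triggers an attack
```

The same subset logic, extended to the dozens of antigens in the Rh system, is what makes Rh-null blood so valuable, as discussed below.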
How is Rh-null the rarest blood type?
Let's return to golden blood. In truth, the eight common blood types are an oversimplification of how blood types actually work. As Smithsonian.com points out, "[e]ach of these eight types can be subdivided into many distinct varieties," resulting in millions of different blood types, each classified by a multitude of antigen combinations.
Here is where things get tricky. The RhD protein mentioned previously is only one of 61 potential proteins in the Rh system. Blood is considered Rh-null if it lacks all of the 61 possible antigens in the Rh system. This not only makes it rare; it also means it can be accepted by anyone with a rare blood type within the Rh system.
This is why it is considered "golden blood." It is worth its weight in gold.
As Mosaic reports, golden blood is incredibly important to medicine, but also very dangerous to live with. If an Rh-null carrier needs a blood transfusion, they can find it difficult to locate a donor, and blood is notoriously difficult to transport internationally. Rh-null carriers are encouraged to donate blood as insurance for themselves, but with so few donors spread out over the world and limits on how often they can donate, this also places an altruistic burden on those select few who agree to donate for others.
Some bloody good questions about blood types
There remain many mysteries regarding blood types. For example, we still don't know why humans evolved the A and B antigens. Some theories point to these antigens as a byproduct of the diseases various populations contracted throughout history. But we can't say for sure.
In the absence of such knowledge, various myths and questions have grown around the concept of blood types in the popular consciousness. Here are some of the most common, along with their answers.
Do blood types affect personality?
Japan's blood type personality theory is a contemporary resurrection of humorism. The idea states that your blood type directly affects your personality, so type A blood carriers are kind and fastidious, while type B carriers are optimistic and do their own thing. However, a 2003 study sampling 180 men and 180 women found no relationship between blood type and personality.
The theory makes for a fun question on a Cosmopolitan quiz, but that's as accurate as it gets.
Should you alter your diet based on your blood type?
Remember Galen of Pergamum? In addition to bloodletting, he also prescribed certain foods for his patients depending on which humors needed to be balanced. Wine, for example, was considered a hot and dry drink, so it would be prescribed to treat a cold. In other words, the belief that your diet should complement your blood type is yet another holdover of humorism.
Created by Peter J. D'Adamo, the Blood Type Diet argues that one's diet should match one's blood type. Type A carriers should eat a meat-free diet of whole grains, legumes, fruits, and vegetables; type B carriers should eat green vegetables, certain meats, and low-fat dairy; and so on.
However, a study from the University of Toronto analyzed the data from 1,455 participants and found no evidence to support the theory. While people can lose weight and become healthier on the diet, it probably has more to do with eating all those leafy greens than blood type.
Are there links between blood types and certain diseases?
There is evidence to suggest that different blood types may increase the risk of certain diseases. One analysis suggested that type O blood decreases the risk of having a stroke or heart attack, while AB blood appears to increase it. On the other hand, type O carriers have a greater chance of developing peptic ulcers and skin cancer.
None of this is to say that your blood type will foredoom your medical future. Many factors, such as diet and exercise, influence your health, and likely to a greater extent than blood type does.
What is the most common blood type?
In the United States, the most common blood type is O+. Roughly one in three people sports this type of blood. Of the eight well-known blood types, the least common is AB-. Only one in 167 people in the U.S. have it.
Do animals have blood types?
They most certainly do, but they are not the same as ours. This difference is why those 17th-century patients who thought, "Animal blood, now that's the ticket!" ultimately had their tickets punched. In fact, blood types are distinct between species. Unhelpfully, scientists sometimes use the same nomenclature to describe these different types. Cats, for example, have A and B antigens, but these are not the same A and B antigens found in humans.
Interestingly, xenotransfusion is making a comeback. Scientists are working to genetically engineer the blood of pigs to potentially produce human-compatible blood.
Scientists are also looking into creating synthetic blood. If they succeed, they may be able to ease the current blood shortage, while also devising a way to create blood for rare blood type carriers. While this may make golden blood less golden, it would certainly make it easier to live with.

*While antigens are typically proteins, they can be other molecules as well, such as polysaccharides.
China has reached a new record for nuclear fusion at 120 million degrees Celsius.
This article was originally published on our sister site, Freethink.
China wants to build a mini-star on Earth and house it in a reactor. Many teams across the globe share this same bold goal: creating unlimited clean energy via nuclear fusion.
But according to Chinese state media, New Atlas reports, the team at the Experimental Advanced Superconducting Tokamak (EAST) has set a new world record: temperatures of 120 million degrees Celsius for 101 seconds.
Yeah, that's hot. So what? Nuclear fusion reactions require an insane amount of heat and pressure: temperatures of roughly 150 million degrees C, about ten times hotter than the core of the sun.
If scientists can essentially build a sun on Earth, they can create endless energy by mimicking how the sun does it. In nuclear fusion, the extreme heat and pressure create a plasma. Then, within that plasma, two or more hydrogen nuclei crash together, merge into a heavier atom, and release a ton of energy in the process.
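As a rough, back-of-the-envelope illustration (our example, not the EAST team's; their experiments heat hydrogen plasma rather than running this exact reaction), here is the textbook deuterium-tritium calculation showing where that energy comes from:

```python
# Back-of-the-envelope sketch (ours) of the textbook deuterium-tritium
# reaction: D + T -> helium-4 + neutron. The tiny amount of mass that
# disappears in the reaction reappears as energy via E = mc^2.

U_TO_MEV = 931.494       # energy equivalent of one atomic mass unit, in MeV

m_deuterium = 2.014102   # atomic mass units (u)
m_tritium   = 3.016049
m_helium4   = 4.002602
m_neutron   = 1.008665

mass_defect = (m_deuterium + m_tritium) - (m_helium4 + m_neutron)
energy_mev = mass_defect * U_TO_MEV

print(f"Mass lost: {mass_defect:.6f} u")                    # ~0.018884 u
print(f"Energy released: {energy_mev:.1f} MeV per fusion")  # ~17.6 MeV
```

Seventeen-plus MeV from a single pair of nuclei is millions of times more energy than a chemical reaction releases per atom, which is why fusion is so attractive as a power source.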
Nuclear fusion milestones: The team at EAST built a giant metal torus (shaped like a donut) wrapped in a series of magnetic coils. The coils confine the hot plasma in which the reactions occur. The team has reached many milestones along the way.
According to New Atlas, in 2016, the scientists at EAST could heat hydrogen plasma to roughly 50 million degrees C for 102 seconds. Two years later, they reached 100 million degrees for 10 seconds.
The temperatures are impressive, but the short reaction times and lack of pressure are another obstacle. Fusion is simple for the sun because stars are massive and gravity provides immense, even pressure throughout the core. That pressure squeezes the hydrogen gas in the sun's core so intensely that nuclei combine into heavier atoms, releasing energy.
But on Earth, we have to supply all of the pressure to keep the reaction going, and it has to be perfectly even. It's hard to do this for any length of time, and it uses a ton of energy. So the reactions usually fizzle out in minutes or seconds.
Still, the latest record of 120 million degrees and 101 seconds is one more step toward sustaining longer and hotter reactions.
Why does this matter? No one denies that humankind needs a clean, unlimited source of energy.
We all recognize that oil and gas are limited resources. But even wind and solar power, both renewable energies, are fundamentally limited. They are dependent upon a breezy day or a cloudless sky, which we can't always count on.
Nuclear fusion is clean, safe, and environmentally sustainable; its fuel is a nearly limitless resource since it is simply hydrogen (which can be easily made from water).
With each new milestone, we are creeping closer and closer to a breakthrough for unlimited, clean energy.