China has reached a new record for nuclear fusion at 120 million degrees Celsius.
This article was originally published on our sister site, Freethink.
China wants to build a mini-star on Earth and house it in a reactor. Many teams across the globe have this same bold goal --- which would create unlimited clean energy via nuclear fusion.
According to Chinese state media (as relayed by New Atlas), the team at the Experimental Advanced Superconducting Tokamak (EAST) has set a new world record: a plasma temperature of 120 million degrees Celsius, sustained for 101 seconds.
Yeah, that's hot. So what? Nuclear fusion reactions require an insane amount of heat and pressure. The sun's core, where fusion happens naturally, burns at roughly 15 million degrees C; because reactors on Earth can't reproduce the sun's crushing pressure, they have to compensate with far higher temperatures --- on the order of 150 million degrees C.
If scientists can essentially build a sun on Earth, they can create endless energy by mimicking how the sun does it. In nuclear fusion, the extreme heat and pressure create a plasma. Then, within that plasma, two or more hydrogen nuclei crash together, merge into a heavier atom, and release a ton of energy in the process.
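For a concrete example, here is the deuterium-tritium reaction that most power-plant designs target (EAST's current experiments run on other hydrogen isotopes, so take this as the textbook case rather than EAST's exact fuel):

$$ {}^{2}_{1}\mathrm{H} \;+\; {}^{3}_{1}\mathrm{H} \;\longrightarrow\; {}^{4}_{2}\mathrm{He} \;+\; {}^{1}_{0}n \;+\; 17.6\ \mathrm{MeV} $$

Those 17.6 MeV per reaction sound tiny, but per kilogram of fuel this is millions of times the energy released by burning fossil fuels.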
Nuclear fusion milestones: The team at EAST built a giant metal torus (shaped like a donut) wrapped in a series of magnetic coils. The coils confine the hot plasma in which the reactions occur. The team has reached many milestones along the way.
According to New Atlas, in 2016, the scientists at EAST could heat hydrogen plasma to roughly 50 million degrees C for 102 seconds. Two years later, they reached 100 million degrees for 10 seconds.
The temperatures are impressive, but the short reaction times and the lack of pressure remain obstacles. Fusion is simple for the sun because stars are massive: gravity compresses the core evenly from every direction. That pressure squeezes hydrogen nuclei in the sun's core so immensely that they fuse into heavier atoms, releasing energy.
But on Earth, we have to supply all of the pressure to keep the reaction going, and it has to be perfectly even. It's hard to do this for any length of time, and it uses a ton of energy. So the reactions usually fizzle out in minutes or seconds.
Still, the latest record of 120 million degrees and 101 seconds is one more step toward sustaining longer and hotter reactions.
Why does this matter? No one denies that humankind needs a clean, unlimited source of energy.
We all recognize that oil and gas are limited resources. But even wind and solar power --- renewable energies --- are fundamentally limited. They are dependent upon a breezy day or a cloudless sky, which we can't always count on.
Nuclear fusion is clean, safe, and environmentally sustainable --- its fuel is nearly limitless, since it is simply hydrogen (whose heavy isotope, deuterium, can be extracted from ordinary water).
With each new milestone, we are creeping closer and closer to a breakthrough for unlimited, clean energy.
A thought experiment from 1867 leads scientists to design a groundbreaking information engine.
- Their engine is the fastest such contraption ever built, using information as "fuel."
- Applications of the technology may lie in nanotechnology and nanobiology.
- Inspired by an 1867 thought experiment, researchers design an information engine.
Can information become a source of energy? Scientists from Simon Fraser University (SFU) in Canada devised an ultrafast engine that, they claim, operates on information, potentially opening up a groundbreaking new frontier in humanity's search for new kinds of fuel. The study, published in Proceedings of the National Academy of Sciences (PNAS), describes how the researchers turned the movements of tiny particles into stored energy.
How would an information engine even work? The idea for such a contraption, which at first sounds like it would break the laws of physics, was first proposed by the Scottish scientist James Clerk Maxwell back in 1867. Colorfully named "Maxwell's demon," such a machine would theoretically achieve something akin to perpetual motion. Maxwell's thought experiment was meant to show that it might be possible to violate the second law of thermodynamics, which states that entropy, or disorder, always increases in a closed system.
Maxwell imagined a hypothetical creature, a demon, who would control the opening and closing of a tiny door between two gas chambers. The demon's goal would be to send fast-moving gas particles into one compartment and the slow ones to another. By doing this, one compartment would be hotter (containing faster molecules) and one cooler. The demon would essentially create a system with greater order and stored energy than what it started with. Without expending any energy, it would seemingly accomplish a decrease in entropy.
A 1929 paper on Maxwell's demon by the Hungarian physicist Leo Szilard showed that the thought experiment would not actually violate the second law of thermodynamics. The demon, Szilard proved, must expend some energy to figure out whether the molecules are hot or cold.
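In modern terms, the demon's measurement carries an unavoidable minimum price, now known as the Szilard-Landauer bound: handling one bit of information costs at least

$$ W \;\ge\; k_B T \ln 2 \;\approx\; (1.38\times10^{-23}\ \mathrm{J/K}) \times (300\ \mathrm{K}) \times 0.693 \;\approx\; 3\times10^{-21}\ \mathrm{J} $$

at room temperature. That energy is tiny, but it is strictly greater than zero, which is what rescues the second law.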
Over 150 years later, researchers built a system that operates according to the ideas in Maxwell's thought experiment, turning information into "work."
SFU physics professor and senior author John Bechhoefer, who was involved in the experiments, explained in a press statement that their group "wanted to find out how fast an information engine can go and how much energy it can extract, so we made one."
SFU physics professor David Sivak, who led the theorists on the project, said their team made a significant advance in the design of the information engine, having "pushed its capabilities over ten times farther than other similar implementations, thus making it the current best-in-class."
Designing an information engine

Their design is akin to a microscopic particle that is submerged in water, while being attached to a spring that is, in turn, connected to a movable stage. The researchers, playing the role of Maxwell's demon, observe the particle going up or down due to thermal motion, and move the stage up whenever the particle randomly bounces upward. If it bounces down, they wait. As PhD student Tushar Saha elaborates, "This ends up lifting the entire system using only information about the particle's position."
Caption: Schematic of the information engine. (A) Ratcheted spring-mass system under gravity. (B) Experimental realization using horizontal optical tweezers in a vertical gravitational field. Feedback operations on the right side in A and B are indicated by the small red "swoosh" arrows. Credit: TK Saha et al., PNAS, 2021.
Of course, a particle is too small to attach to a spring, so the actual set-up utilized an instrument known as an optical trap, which "uses a laser to create a force on the particle that mimics that of the spring and stage." As they repeated the process, without pulling the particle directly, the particle was raised to a "great height," storing up a large amount of gravitational energy, according to the researchers.
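The feedback rule is simple enough to mimic in a few lines. Below is a toy simulation of the ratchet logic (our own illustration with made-up parameters, not the SFU group's code or physical values):

```python
import random

# A particle on a spring jiggles thermally. The "demon" raises the stage only
# when the particle has fluctuated upward, so the system climbs without the
# trap ever pulling the particle directly.
stage = 0.0    # height of the stage (arbitrary units)
offset = 0.0   # particle position relative to the stage

for _ in range(10_000):
    offset += random.gauss(0.0, 0.1)  # thermal kick
    offset -= 0.05 * offset           # spring relaxes particle toward the stage
    if offset > 0:        # measurement: the particle bounced upward
        stage += offset   # feedback: raise the stage to meet it
        offset = 0.0
    # if it bounced down, do nothing and wait for the next fluctuation

print(f"Stage height after 10,000 steps: {stage:.1f} (climbs steadily upward)")
```

Each pass through the loop uses only one piece of information, the particle's position, yet the stage ratchets steadily upward.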
PhD student Tushar Saha working on the information ratchet, an experimental apparatus that lifts a heavy microscopic particle using information. Credit: Simon Fraser University
The amount of power this system generates is "comparable to molecular machinery in living cells," with "speeds comparable to fast-swimming bacteria," said postdoctoral fellow Jannik Ehrich.
While applications of this still-developing technology have yet to be fully explored, the researchers see potential uses in nanotechnology and nanobiology. Improving computing speed is another avenue they suggest pursuing.
In paint form, the world's "whitest white" reflects so much light that surfaces become cooler than the surrounding air.
- Scientists at Purdue University announce the whitest white ever developed. It will be available as paint and a nanofilm.
- The new paint can actually cool surfaces on which it's applied, potentially reducing the need for climate-unfriendly air conditioners.
- This is the second record-setting white to come from these researchers, and they believe this is about as white as any material could ever be.
A few years ago, researchers announced the development of the blackest black ever, a place where colors go to die. It was called Vantablack®, and it was so absorptive of visible light that only the tiniest amount escaped its surface to reflect back to our eyes. (All of that light energy is dissipated into the surrounding substrate, so Vantablack doesn't become hot.)
In a new paper published in the journal ACS Applied Materials & Interfaces, scientists at Purdue University announced the whitest white ever: a paint based on BaSO4 (barium sulfate). The new formulation reflects virtually the entire visible spectrum. Even better, while it's a very cool invention in the colloquial sense, it's also cool in the thermal sense.
The coolest white
The infrared image on the right shows how a square of the super-white paint and the board on which it's painted — shown in a normal image on the left — are cooler than the surrounding materials.Credit: Purdue University/Joseph Peoples
Most exterior paints actually warm the surfaces to which they're applied. While there are already some reflective paints on the market, they reflect only 80 to 90 percent of sunlight, not enough to produce net cooling.
By contrast, BaSO4 results in 98.1 percent of sunlight bouncing off. According to senior investigator Xiulin Ruan, "If you were to use this paint to cover a roof area of about 1,000 square feet, we estimate that you could get a cooling power of 10 kilowatts. That's more powerful than the central air conditioners used by most houses."
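A quick back-of-envelope check of what that quote implies (our arithmetic, not Purdue's published calculation):

```python
# 10 kW of cooling spread over 1,000 sq ft of roof.
roof_area_m2 = 1_000 * 0.0929   # 1 sq ft is about 0.0929 square meters
cooling_power_w = 10_000        # 10 kW, as quoted

print(f"Implied cooling flux: {cooling_power_w / roof_area_m2:.0f} W/m^2")
# Implied cooling flux: 108 W/m^2
```

That flux is plausible for a surface that absorbs only about 2 percent of incoming sunlight while radiating its own heat strongly into the sky.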
Ruan and his colleagues tested BaSO4 using thermocouples, high-accuracy devices that measure voltage to determine temperature. They found that at night, BaSO4 surfaces are 19 °F cooler than the surrounding air. Under strong sunlight the effect is not quite so extreme, but still dramatic: 8 °F of cooling.
The researchers even found the paint works in cold weather. Testing it on a 43 °F day, the surface on which BaSO4 was painted was a brisk 25 °F. Their tests also indicate that BaSO4 is hardy enough for outdoor conditions.
How the new white was developed
Xiulin Ruan and a square of BaSO4. Credit: Purdue University/Jared Pike
Research in the field of radiative paint for cooling goes back to the 1970s, though Ruan's team has been working toward BaSO4 for only six years. Along the way, they analyzed over 100 reflective materials, trying them out in about 50 experimental formulations.
Lead author Xiangyu Li, a postdoctoral researcher, explains: "We looked at various commercial products, basically anything that's white. We found that using barium sulfate, you can theoretically make things really, really reflective, which means that they're really, really white."
The whitest white paint before — developed by the same team just last autumn — depended on calcium carbonate, a compound commonly found in seashells, rocks, and blackboard chalk.
The team crammed as many tiny BaSO4 particles into the paint as possible. Says Li: "Although a higher particle concentration is better for making something white, you can't increase the concentration too much. The higher the concentration, the easier it is for the paint to break or peel off."
Another factor that makes the team's BaSO4 formulation so reflective is that the researchers used barium sulfate particles of many different sizes. When it comes to reflecting light, size matters.
Co-author and PhD student Joseph Peoples said, "A high concentration of particles that are also different sizes gives the paint the broadest spectral scattering, which contributes to the highest reflectance."
The team's formulation method, they report, is compatible with commercial paint production.
Cool support for the planet
Purdue has applied for patents relating to BaSO4, though there are as yet no plans to make it commercially available.
However, the sooner it reaches the market, the better. Air conditioning currently accounts for 12% of U.S. energy consumption. Moreover, many air conditioners use hydrofluorocarbons (HFCs). While HFCs constitute just a small percentage of greenhouse gases, they trap thousands of times as much heat as carbon dioxide.
Therefore, BaSO4 can play a role in combating global warming by reducing energy consumption and the emission of HFCs.
An artificial island in the North Sea is the biggest building project ever in Danish history - and could pave the way for many more.
- In 1991, Denmark constructed the world's first offshore wind farm.
- Now they're building an entire 'Energy Island' in the North Sea.
- As the U.S. catches up, Danish know-how could soon come to America.
Giant wind farms
Wind turbines of the Block Island Wind Farm, so far the only offshore wind project in operation in the U.S.
Credit: Don Emmert/AFP via Getty Images
On Monday, President Biden designated a 'Wind Energy Area' in the waters between Long Island and New Jersey. It's part of an ambitious plan to build giant wind farms along the East Coast. There's currently only one offshore wind farm in the Eastern U.S., off Rhode Island (1).
When those wind farms get built, you can bet there'll be Danish companies involved. In 1991, Denmark built Vindeby, the world's first offshore wind farm. In the years since, Danish companies have maintained their global lead.
In February, the Danish government announced it would build the world's first 'Energy Island'. Everybody else in the world, take note: if the Danes pull this off, similar islands could soon pop up off your shores – perhaps also in the New York Bight.
So, what's an Energy Island, and why does Denmark want one? For the answer, we spool back to June 2020, when a broad coalition of Danish parties, left and right, in government and opposition, concluded a Climate Agreement. This is Denmark's plan not only to make a radical break with fossil fuels but also to show the rest of the world how it's done.
On the rise again
Close-up of Energy Island, with two of the seawalls at the back and the port at the front.
Credit: Danish Energy Agency
Due in large part to its pioneering work with wind energy, Denmark has a green image. But that hasn't always reflected reality. Yes, in 2019 the country generated 30 percent of its energy from renewable sources – earning it 9th place worldwide (2). But in 2018, Denmark also was the EU's leading oil producer (3).
Under the Climate Agreement, that will stop. Denmark will no longer explore and develop new oil and gas fields in its section of the North Sea. Extraction will be gradually reduced to zero. In exchange, Denmark will dramatically scale up the production of sustainable energy via offshore wind farms. The ultimate goal: nationwide carbon neutrality by 2050.
Offshore wind farms produce the bulk of Europe's sustainable energy. And after a dip in the first decade of the century, offshore wind farms are on the rise again (4). One reason for the increased popularity: taller turbines, which means larger blades, which means greater capacity.
- In 2016, the tallest turbines were 540 ft (164 m) and had a capacity of 8 megawatts (MW).
- In 2021, turbines can be up to 720 ft (220 m) tall, generating up to 12 MW.
- Soon, the turbines will reach 820 ft (250 m) – not that much shorter than the Eiffel Tower (1,063 ft or 324 m, street to flagpole). These will have a capacity of up to 20 MW.
Potential position of Energy Island (red) off the western coast of Jutland, surrounded by a wind farm (green) filled with turbines (blue dots).
Credit: Danish Energy Agency
As the shallow parts of the North Sea (<66 ft; <20 m) fill up with wind farms, the issue of managing the energy flow produced by these farms becomes acute. The obvious solution would be to build a central point where the energy is collected, converted from AC to DC and transmitted to one or more points onshore. Centralised management of the wind farms would mitigate the fluctuations in energy production and make it easier for supply to meet demand.
If supply is greater than demand, these collection points can also serve as storage units. Excess energy could be stored in batteries or transformed into hydrogen via electrolysis. If and when necessary, the hydrogen can then be transported onto land and reconverted into electricity.
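In code form, the hub's dispatch logic might look like this (a hypothetical sketch; the function and the efficiency figure are our assumptions, not the actual Danish design):

```python
def dispatch(supply_mw: float, demand_mw: float) -> dict:
    """Meet onshore demand first; turn any surplus into hydrogen."""
    delivered = min(supply_mw, demand_mw)
    surplus = max(supply_mw - demand_mw, 0.0)
    ELECTROLYSIS_EFFICIENCY = 0.7  # assumed round figure for electrolysis
    return {
        "delivered_mw": delivered,
        "stored_as_hydrogen_mw": surplus * ELECTROLYSIS_EFFICIENCY,
    }

# A windy night: the farms produce 3 GW but the grid only needs 2.2 GW.
print(dispatch(supply_mw=3_000, demand_mw=2_200))
# {'delivered_mw': 2200, 'stored_as_hydrogen_mw': 560.0}
```

The point of centralising this decision on one island is that the surplus can be captured rather than curtailed.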
The Dutch are thinking about it, and some have suggested the Dogger Bank as an ideal location: shallow and central within the North Sea, ideally placed to distribute energy to the various countries bordering the sea. But the Danes are doing it. The Climate Agreement envisaged not one, but two energy islands.
One would be Bornholm, Denmark's Baltic island, halfway between Sweden and Poland, which would serve as the hub for local offshore wind farms. But the other would be an entirely new, entirely artificial island in the North Sea, to be built about 50 miles (80 km) off Thorsminde, on the western coast of Jutland.
10 million households
Schematic overview of how an Energy Island could serve as a hub for collecting and redistributing sustainable energy.
Credit: Danish Energy Agency
In February, the Danish government revealed how much this Energi-Ø would cost, how long it would take to build – and what it might look like.
- Energy Island will be built via the caisson method – essentially, sinking a watertight box to the bottom of the sea. The island will be protected from storms by high seawalls on three sides. The fourth side will feature a dock for ships.
- Construction could start in 2026 and is expected to take three years. Building the wind farms and transmission network will take a few years more. By 2033, it could be churning out its sustainable GWs.
- In its initial phase, the island will have an area of about 12 hectares (30 acres, or about 18 soccer fields). It will centralize the production of about 200 offshore wind turbines, with a joint capacity of 3 GW. That's about the equivalent of 3 million households – slightly more than the total for Denmark.
- When fully completed, the island will have an area of around 46 hectares (114 acres, just under 70 soccer fields) and will collect the energy of 600 turbines, for a total capacity of 10 GW (5). That covers 10 million households.
- 10 GW is equivalent to about 150 percent of Denmark's entire electricity needs (households, industry, infrastructure, etc.). That leaves plenty of scope for supplying neighbouring countries. Agreements have already been reached with Germany, the Netherlands and Belgium. (A rough sanity check of the household arithmetic follows below.)
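Do those household numbers hold together? A quick check (the capacity factor is our assumption, not a Danish Energy Agency figure):

```python
# Rough check of "10 GW covers 10 million households" (illustrative only).
CAPACITY_GW = 10
HOUSEHOLDS = 10_000_000
CAPACITY_FACTOR = 0.5  # assumed; offshore wind farms typically reach 40-60%

avg_output_kw = CAPACITY_GW * 1_000_000 * CAPACITY_FACTOR   # GW -> kW, derated
per_household_kwh_year = avg_output_kw / HOUSEHOLDS * 8_760  # hours per year
print(f"{per_household_kwh_year:,.0f} kWh per household per year")  # ~4,380
```

Roughly 4,400 kWh per household per year is in the right ballpark for European household electricity use, so the claim is plausible.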
The plan also foresees a plant for hydrogen production on the island; the hydrogen could be piped onshore, while excess electricity could be stored in large batteries and shipped out.
Yet untested aspects
Location of Energy Island (yellow) in the North Sea, showing potential connections towards neighboring countries.
Credit: Danish Climate Ministry / Vimeo
In all, the island would cost DKK 210 billion (US$33 billion) to build – by far Denmark's largest construction project (6).
The project will be undertaken in a public-private partnership between the Danish state and commercial interests. Because it is 'critical infrastructure', the state will retain a stake of at least 50.1 percent in the project. There are two scenarios for co-ownership:
- The island will be owned in its entirety by a company, in which the Danish state retains at least that smallest of majorities;
- Private companies will be able to own up to 49.9 percent of the island itself.
The Danish government needs private-sector input to overcome unknown and as yet untested aspects of the project, not just in terms of design and building an entire island from scratch, but also on how to operate and maintain it, and even when it comes to financing and risk management.
But where there's risk, there is potential. If the project is successful, it will become the blueprint for similar energy islands the world over – and the companies that helped build the first one will be in high demand to build the others, perhaps soon in Biden's 'Wind Energy Area'.
Green, as the Danes have discovered, is not just the color of nature. It's also the color of money.
Strange Maps #1077
Got a strange map? Let me know at email@example.com.
(1) Coastal Virginia Offshore Wind, a two-turbine pilot project 23 miles (43 km) off Virginia Beach, was completed last year.
(2) The Top 10 (2019) are Iceland (79%), Norway (66%), Brazil (45%), Sweden (42%), Austria (38%), New Zealand (35%), Switzerland (31%), Ecuador (30%), Denmark (30%) and Canada (28%).
(3) With 5.8 megatons of oil equivalent (Mtoe), Denmark beat Italy (4.7 Mtoe) and Romania (3.4 Mtoe). Oil production in the EU is on the way down. It peaked in 2004 (42.5 Mtoe) and has since halved (to 21.4 Mtoe in 2018). A similar trend has occurred in the two key non-EU oil producers in Europe. a. Norway's oil production peaked in 2001 (159.2 Mtoe) and has since more than halved (to 74.5 Mtoe in 2018). b. The UK's oil production peaked in 1999 (133.3 Mtoe) and has since been reduced by almost two thirds (to 49.3 Mtoe in 2018).
(5) The Bornholm energy hub is projected to top out at 2 GW.
(6) Inaugurated in 2000, the famous Øresund Bridge (Øresundsbroen), connecting Sweden to Denmark, cost about DKK 25 billion (US$4 billion) in today's money. When it's finished (by 2029, if work continues apace), the Fehmarn Belt Fixed Link (18 km) between the Danish island of Lolland and the German island of Fehmarn, will be the world's longest road/rail tunnel. It will have cost about DKK 55 billion (US$ 8.7 billion).
Can we ever make energy-efficient AI?
A recent research paper pointed out the risks of language-processing artificial intelligence, the type used in Google Search and other text analysis products.
Among the risks is the large carbon footprint of developing this kind of AI technology. By some estimates, training an AI model generates as much carbon emissions as it takes to build and drive five cars over their lifetimes.
I am a researcher who studies and develops AI models, and I am all too familiar with the skyrocketing energy and financial costs of AI research. Why have AI models become so power hungry, and how are they different from traditional data center computation?
Today's training is inefficient
Traditional data processing jobs done in data centers include video streaming, email and social media. AI is more computationally intensive because it needs to read through lots of data until it learns to understand it; that is, until it is trained.
This training is very inefficient compared to how people learn. Modern AI uses artificial neural networks, which are mathematical computations that mimic neurons in the human brain. The strength of the connection between each neuron and its neighbors is a parameter of the network called a weight. To learn how to understand language, the network starts with random weights and adjusts them until its output agrees with the correct answer.
How artificial neural networks work.
A common way of training a language network is by feeding it lots of text from websites like Wikipedia and news outlets with some of the words masked out, and asking it to guess the masked-out words. An example is "my dog is cute," with the word "cute" masked out. Initially, the model gets them all wrong, but, after many rounds of adjustment, the connection weights start to change and pick up patterns in the data. The network eventually becomes accurate.
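To make that training loop concrete, here is a deliberately tiny toy version of the masked-word task in PyTorch (our own sketch; real models like BERT have millions to billions of weights and train on billions of words):

```python
import torch
import torch.nn as nn

# Toy masked-word trainer: guess the masked word in "my dog is [MASK]".
vocab = ["my", "dog", "is", "cute", "[MASK]"]
word_to_id = {w: i for i, w in enumerate(vocab)}

inputs = torch.tensor([[word_to_id[w] for w in ["my", "dog", "is", "[MASK]"]]])
target = torch.tensor([word_to_id["cute"]])  # the correct answer

# A tiny network: embed the four tokens, then predict the masked word.
model = nn.Sequential(
    nn.Embedding(len(vocab), 16),  # weights start out random
    nn.Flatten(),
    nn.Linear(16 * 4, len(vocab)),
)
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

for _ in range(200):  # "many rounds of adjustment"
    loss = loss_fn(model(inputs), target)  # how wrong is the current guess?
    optimizer.zero_grad()
    loss.backward()   # compute how to nudge each weight
    optimizer.step()  # adjust the weights slightly

print(vocab[model(inputs).argmax().item()])  # eventually prints "cute"
```

Every one of those adjustment rounds costs compute, and real models repeat them billions of times over vastly larger data.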
One recent model, called Bidirectional Encoder Representations from Transformers (BERT), used 3.3 billion words from English books and Wikipedia articles. Moreover, during training BERT read this data set not once, but 40 times. By comparison, an average child learning to talk might hear 45 million words by age five, 3,000 times fewer than BERT.
Looking for the right structure
What makes language models even more costly to build is that this training process happens many times during the course of development. This is because researchers want to find the best structure for the network – how many neurons, how many connections between neurons, how fast the parameters should be changing during learning and so on. The more combinations they try, the better the chance that the network achieves a high accuracy. Human brains, in contrast, do not need to find an optimal structure – they come with a prebuilt structure that has been honed by evolution.
As companies and academics compete in the AI space, the pressure is on to improve on the state of the art. Even achieving a 1% improvement in accuracy on difficult tasks like machine translation is considered significant and leads to good publicity and better products. But to get that 1% improvement, one researcher might train the model thousands of times, each time with a different structure, until the best one is found.
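As a hedged illustration of why that search is so expensive (the grid and the per-run cost below are invented for the example, not measured values):

```python
import itertools

# Every combination of structural choices means one full training run.
hidden_sizes   = [128, 256, 512, 1024]
layer_counts   = [2, 4, 8, 12]
learning_rates = [1e-3, 1e-4, 1e-5]

configs = list(itertools.product(hidden_sizes, layer_counts, learning_rates))
KWH_PER_RUN = 1_500  # hypothetical energy cost of a single training run

total_kwh = len(configs) * KWH_PER_RUN
print(f"{len(configs)} structures to try -> ~{total_kwh:,} kWh of training")
# 48 structures to try -> ~72,000 kWh of training
```

Even this modest grid multiplies the cost of a single run by 48; real searches explore far more combinations.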
Researchers at the University of Massachusetts Amherst estimated the energy cost of developing AI language models by measuring the power consumption of common hardware used during training. They found that training BERT once has the carbon footprint of a passenger flying a round trip between New York and San Francisco. However, by searching using different structures – that is, by training the algorithm multiple times on the data with slightly different numbers of neurons, connections and other parameters – the cost became the equivalent of 315 passengers, or an entire 747 jet.
Bigger and hotter
AI models are also much bigger than they need to be, and growing larger every year. A more recent language model similar to BERT, called GPT-2, has 1.5 billion weights in its network. GPT-3, which created a stir this year because of its high accuracy, has 175 billion weights.
Researchers discovered that having larger networks leads to better accuracy, even if only a tiny fraction of the network ends up being useful. Something similar happens in children's brains when neuronal connections are first added and then reduced, but the biological brain is much more energy efficient than computers.
AI models are trained on specialized hardware like graphics processor units, which draw more power than traditional CPUs. If you own a gaming laptop, it probably has one of these graphics processor units to create advanced graphics for, say, playing Minecraft RTX. You might also notice that they generate a lot more heat than regular laptops.
All of this means that developing advanced AI models is adding up to a large carbon footprint. Unless we switch to 100% renewable energy sources, AI progress may stand at odds with the goals of cutting greenhouse emissions and slowing down climate change. The financial cost of development is also becoming so high that only a few select labs can afford to do it, and they will be the ones to set the agenda for what kinds of AI models get developed.
Doing more with less
What does this mean for the future of AI research? Things may not be as bleak as they look. The cost of training might come down as more efficient training methods are invented. Similarly, while data center energy use was predicted to explode in recent years, this has not happened, thanks to improvements in data center efficiency and more efficient hardware and cooling.
There is also a trade-off between the cost of training a model and the cost of using it, so spending more energy at training time to come up with a smaller model might actually make using it cheaper. Because a model will be used many times over its lifetime, that can add up to large energy savings.
In my lab's research, we have been looking at ways to make AI models smaller by sharing weights, that is, using the same weights in multiple parts of the network. We call these shapeshifter networks because a small set of weights can be reconfigured into a larger network of any shape or structure. Other researchers have shown that weight sharing yields better performance for the same amount of training time.
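Here is a minimal sketch of the weight-sharing idea (our own illustration of the general technique, not the lab's actual shapeshifter code):

```python
import torch
import torch.nn as nn

class SharedNet(nn.Module):
    """A network that reuses one Linear layer's weights at every depth."""
    def __init__(self, width: int = 64, depth: int = 4):
        super().__init__()
        self.depth = depth
        self.shared = nn.Linear(width, width)  # the only weights we store

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for _ in range(self.depth):  # the same weights applied at each layer
            x = torch.relu(self.shared(x))
        return x

net = SharedNet()
n_params = sum(p.numel() for p in net.parameters())
print(f"{net.depth} layers deep, but only {n_params:,} parameters")
# 4 layers deep, but only 4,160 parameters (vs ~16,640 with separate layers)
```

The network computes as if it were four layers deep while storing only a quarter of the weights, which is the kind of saving that shrinks both memory and energy use.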
Looking forward, the AI community should invest more in developing energy-efficient training schemes. Otherwise, it risks having AI become dominated by a select few who can afford to set the agenda, including what kinds of models are developed, what kinds of data are used to train them and what the models are used for.