Electricity and fear: The trouble with nuclear energy
Although everyone knows that coal-based energy is a thing of the past, declarations about building nuclear power plants somehow never materialize.
No other power-generating device raises as much concern as the nuclear reactor. Because of this, until recently, the past of nuclear energy determined the future of the entire sector.
On the eve of the pandemic, the European energy sector found itself at a crossroads, somewhere between Great Britain, Germany and Poland. Five years ago, across the English Channel, the then Prime Minister David Cameron announced an ambitious program to build 12 new nuclear power plants with a total capacity of 16 GW. Alongside the continued development of renewable energy sources, these would allow the United Kingdom to cut carbon dioxide emissions from its energy sector to almost zero. Soon after, Cameron came up with the idea of a referendum on leaving the EU, and Brexit reset all long-term British plans. Even so, the British already produce electricity in a very sustainable way: almost 38% comes from renewable sources and about 20% from nuclear power plants, while the remainder is provided by gas-fired plants, the only ones that emit CO2.
Meanwhile in Germany, the aversion towards nuclear energy had been growing for years. Finally, following the Fukushima disaster in March 2011, Chancellor Angela Merkel announced that all nuclear power plants would be shut down by 2022. For the first few years, the great Energiewende (energy transformation) plan seemed to be going well. Thanks to subsidies and increased electricity prices for individual customers, wind farms and solar power plants developed intensively. However, no technological solution has been found for the main weakness of renewable energy sources: wind and solar plants deliver power for only 20-30% of the time on average, and remain completely dependent on whether the wind blows or the sun shines. Because of this, they cannot handle demand peaks. Conversely, when a gale comes, the excess power suddenly overloads the network. In both of these extremes the entire country is at risk of blackout, and the risk of a supply collapse increases significantly once more than 30% of electricity comes from renewable sources. Safety requires the maintenance of traditional power plants, whose flexibility stabilizes the entire system.
In Germany, as successive nuclear reactors began to shut down, lignite-fired power plants started to play a key role. Unlike nuclear plants, they devastate the natural environment not only through CO2 emissions, but also through the expansion of opencast mines. A huge wave of criticism from environmentalists, and Berlin's ambition to lead by example in the fight against global warming, forced an adjustment in strategy. Today, coal-fired power plants are being replaced by gas-fired ones, which emit one-third less carbon dioxide; Russia is to supply their fuel via the Nord Stream and Nord Stream 2 gas pipelines. Reversing the decommissioning of the nuclear power plants, however, is out of the question.
In Poland, in turn, the development of renewable energy is lagging, even though the construction of one or more nuclear power plants was first announced two decades ago. Before the pandemic, the government envoy for strategic energy infrastructure, Piotr Naimski, claimed that by the end of 2045 as many as six nuclear reactors with a total capacity of 6 GW would be built. Although everyone knows that coal-based energy is a thing of the past, declarations about building nuclear power plants somehow never materialize. And this is a very complicated undertaking, during which any disregard of safety standards can awaken demons from the past.
A pile of trouble
"In fifteen years, nuclear power will provide electricity too cheap to meter," the head of the American Atomic Energy Commission, Lewis Strauss, prophesied in 1954. By the end of that decade, energy corporations had overcome the technological barriers. "Westinghouse perfected the PWR, the pressurized water reactor, and GE [General Electric] the BWR, the boiling water reactor," explains Daniel Yergin in The Quest: Energy, Security, and the Remaking of the Modern World. These two types of first-generation reactors spread throughout the world. By 1970, 62 nuclear power plants had been launched in 15 countries and the construction of a further 89 had begun. Most of them were located in the US, USSR, UK, France, Japan and West Germany. Three years later, the first oil crisis erupted, and it seemed certain that highly developed countries would base their future on nuclear power plants. Then the first problems began to emerge.
A first-generation, 1000 MW pressurized water reactor generated as much as 20 tons of radioactive waste annually. Initially, the Americans sealed it in metal containers and sank them in the ocean; the Soviets did the same. After protests by environmental organizations, containers guaranteed to last a thousand years began to be buried in the Nevada desert instead – ignoring the fact that the half-life of plutonium-239 is about 24,000 years. In other countries, old mines were used as waste dumps. The French handled the problem best, building a plant at La Hague that specializes in recovering uranium and plutonium from radioactive waste; the recovered elements are then enriched and sold to energy companies. During the 1980s, many countries – including Japan, West Germany, Belgium and Switzerland – began to use the services of the French.
Alongside waste, investment costs became an equally large problem. "Emerging ecological movements, especially anti-nuclear ones, forced additional reviews and changes. It was necessary to thicken the concrete walls, and to remove and rework pipeline installations. Power plants had to be redesigned, sometimes several times during construction," emphasizes Yergin. He adds: "Power plants also became more expensive because of inflation and, later, high interest rates on loans. Instead of six years, construction took ten, which also cost money. Power plants that were to cost $200 million ultimately cost $2 billion." Once running, they produced the cheapest electricity on the market, but the gigantic capital outlays had to be factored into its price. While the French model handles waste well, investment costs remain the Achilles' heel of nuclear energy to this day, even if they matter less than media coverage and public fear.
Awaiting the apocalypse
"There is nothing in the laws of nature that stops us from building better nuclear power plants. We are stopped by deep and justified public distrust. The public distrusts the experts because they claimed to be infallible," writes Freeman Dyson, a physicist who participated in the construction of the first reactors, in his book Imagined Worlds. The distrust of nuclear energy emerged gradually. In the 1960s, everyone remembered the fate of Hiroshima and Nagasaki, but the fear of radiation had not yet paralysed ordinary people. Experts managed to convince Western societies that a nuclear power plant hardly differs from a coal-fired one: all it needs is much more coolant for the reactor, preferably a huge water tank.
The sense of security began to fade not because of any failure, but because of the catastrophic scenarios beloved of the press, especially in West Germany. In October 1975, Der Spiegel vividly presented to readers what would happen if the reactor at a power plant built near Ludwigshafen overheated. "The molten reactor core will penetrate the surrounding protective structures. It will sink into the ground at a speed of two to four meters per hour. The amount of radiation emitted would correspond to the radiation of a thousand bombs such as the one dropped on Hiroshima," the newspaper forecast, estimating the number of victims at 100,000 killed immediately and about 1.6 million "dying slowly" of radiation sickness. Such apocalyptic visions interested Hollywood, resulting in the thriller The China Syndrome. In specialist jargon, the term refers to a complete meltdown of the reactor core.
Lo and behold, two weeks after the film's release, on 28th March 1979, a failure occurred at the Three Mile Island nuclear power plant, located on an artificial island. Pipes supplying coolant to the reactor burst while the back-up cooling system was disconnected for inspection. The reactor began to overheat, but the safety measures worked. Every reactor is managed using control rods, made of alloys that absorb neutrons. Sliding the control rods in between the fuel rods slows down the chain reaction; pulling them out has the opposite effect. When the reactor overheats, all the control rods drop into the core, quenching the reaction.
This is what happened at Three Mile Island. However, because of the burst pipes, water poured onto the reactor jacket and immediately evaporated, forming a mixture of oxygen and hydrogen under the dome of the power block. One spark could have blown up the power plant. The following day, technicians vented the hazardous radioactive gases outside. The residents of nearby Harrisburg panicked, and about 80,000 people attempted to flee the city by car. Assurances from the US Secretary of Energy James Schlesinger that radiation had increased by only around 0.03 rem and would not hurt anyone fell on deaf ears – those who had seen The China Syndrome knew better. It wasn't until five days later, when President Jimmy Carter personally visited Three Mile Island and toured the area in front of TV cameras, that the panic subsided. The misfortunes of nuclear power plants, however, were only just beginning.
The weakest link
The Three Mile Island disaster was largely caused by the plant's owners, the Westinghouse group. The power plant was built in a rush so that it would be operational before 30th December 1978, earning the company a $40 million tax break. After the reactor was launched, the coolant supply pipes turned out to be leaking. The management ordered the leaks to be sealed temporarily, after which a test of the emergency cooling system was performed – beginning with its shutdown – on the assumption that the main pipes would last a little longer. "The accident was caused by a series of relatively small equipment failures followed by operator error," wrote the head of the commission investigating the disaster, Admiral Hyman G. Rickover, in his report. Fortunately, none of the Westinghouse executives were so thoughtless as to deactivate the other safeguards. Seven years later, it turned out that even such recklessness is possible.
On the night of 26th April 1986, the management of the Chernobyl power plant began an experiment with manual control of the reactor in block 4. For complete freedom, all automatic safety systems were switched off. During the experiment the core heated up rapidly, and the control rods, blocked by the staff, did not automatically quench the chain reaction. Then the pipes supplying water to the cooling system burst. As at Three Mile Island, water vaporized by the hot reactor dissociated into hydrogen and oxygen. The explosion of this mixture tore off the dome, throwing a 500-ton slab of concrete into the air, which a moment later fell back into the reactor, destroying it completely. Fifty tons of fuel escaped and the core melted. The radioactive cloud contaminated vast areas of northern Ukraine and Belarus, and 50,000 residents of the nearby town of Pripyat and the surrounding villages were evacuated.
Thirty-one people (mainly irradiated firefighters) lost their lives as an immediate result of the disaster. UNSCEAR (the UN Scientific Committee on the Effects of Atomic Radiation) later examined the wider toll: according to its 2000 report, of the roughly 600 power plant employees and firefighters on site, 237 were diagnosed with symptoms of radiation sickness, and of these, 28 died. According to the report, epidemiologists have not observed an increased incidence of cancer in the most contaminated areas, apart from higher-than-average rates of thyroid cancer, and no genetic defects were found in the offspring of irradiated persons.
A quarter of a century later, the 'China syndrome' became Japanese. Two oil crises in the 1970s had encouraged the government of Japan to finance the construction of 50 nuclear reactors, which guaranteed the state's energy security. In the haste, however, the side effects were forgotten in a country where earthquakes happen regularly. The Fukushima plant was built right on the seafront. When massive magnitude-9 shocks came on 11th March 2011, the safety systems functioned properly: the reactors were automatically quenched and the cooling system switched to the emergency power supply. Nothing bad would have happened if it weren't for the sea. The earthquake triggered a 15-metre tsunami, while the breakwater was only six metres high. Huge amounts of water flooded the power plant, the generators went down, and the reactor cores suddenly stopped being cooled. The water then evaporated and the resulting hydrogen-oxygen mixture exploded.
About 10 times less radioactive substance escaped outside than in Chernobyl, and no-one was killed during the event. The first person irradiated as a result of the disaster's aftermath did not die until September 2018. Yet again, however, a wave of fear swept through the entire world.
The sum of fears
The disaster in Fukushima was a strong blow to the nuclear energy sector, which even without it suffered from bad press. By the mid-1980s, the number of reactors operating worldwide had reached 430 and stopped growing. New ones were still being built in France, Japan, the USSR (later Russia), South Korea and China, but elsewhere they were gradually dismantled. The only country to base its entire energy system on nuclear power plants was France, where they produce over 80% of electricity. Finland is also focusing on the development of nuclear energy: two nuclear power plants currently generate around 30% of the country's energy, and once a third is built, this will reach 60% (the rest is to come from renewable sources).
Most countries, however, still regard the nuclear industry as a dead end. The emergence of much better third-generation reactors, which use less uranium and produce less waste, has not changed that. Developed jointly by the French Framatome and the German Siemens, the EPR (European Pressurized Reactor) has a quadruple safety system and a containment that can withstand even the impact of a crashing aircraft. The ESBWR (Economic Simplified Boiling Water Reactor) by GE Hitachi, apart from showing similar resistance, requires minimal amounts of coolant and discharges excess heat directly into the atmosphere.
There are even more innovative designs, but they have begun to attract interest only recently, thanks to the rapid development of Asian countries and the resulting demand for cheap electricity. A nuclear power plant uses roughly 30-50 tons of uranium per year. At a market price of around $55 per kilogram, a fuel bill of around $2.5 million a year is very cheap – about 100 times cheaper than the fuel for a comparable coal-fired power plant. Known uranium deposits are estimated to last for about 300 years, and as with crude oil, this horizon may prove much more distant, since little prospecting for new deposits has been done for years. It should therefore come as no surprise that in April 2019 China presented a plan for a vast expansion of its nuclear energy sector: the total capacity of Chinese nuclear power plants, today about 42 GW, is to exceed 100 GW, at which point the People's Republic of China will overtake the US in this field. South Korea has presented slightly less ambitious goals, announcing an increase in nuclear power by one-third.
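The fuel-cost claim above is easy to sanity-check. A minimal sketch, using only the article's own figures (tonnage, price per kilogram), confirms that the annual uranium bill lands near the quoted $2.5 million:

```python
# Back-of-the-envelope check of the fuel-cost figures quoted above.
# All inputs are the article's own numbers, not independently sourced.
uranium_tons_per_year = 45            # the article cites roughly 30-50 tons
price_per_kg_usd = 55                 # quoted market price of uranium

annual_fuel_cost = uranium_tons_per_year * 1000 * price_per_kg_usd
print(f"Annual uranium bill: ${annual_fuel_cost / 1e6:.2f} million")
```

At the midpoint of the cited range this gives about $2.5 million a year; even the high end (50 tons) stays under $3 million.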
And what path will the European Union take? The fight against CO2 emissions determines the direction of its energy policy, and renewable energy sources are a priority. To base an economy fully on them, however, efficient energy storage is necessary – methods capable of accumulating electricity at times of overproduction and releasing it in the absence of sun and wind. Even lithium-ion cells cannot fully cope with this task. One attempted workaround is to design self-sufficient buildings that draw their energy from solar panels and heat pumps. At the scale of cities and entire countries, though, large power plants cannot be replaced, and the only ones that do not emit carbon dioxide are nuclear. Hence, even in Europe, their slow renaissance continues. For now, countries on the outskirts of the EU (Finland, Hungary, Lithuania, the Czech Republic and Slovakia) are modernizing old plants or building new ones. In just one year, construction began on over 60 new reactors.
Despite public resentment, more investments will begin soon. Right now, fear of the 'China syndrome' is weaker than fear of the effects of global warming and sudden energy shortages and blackouts.
Translated from the Polish by Joanna Figiel. Reprinted with permission of Przekrój.
Inventions with revolutionary potential made by a mysterious aerospace engineer for the U.S. Navy come to light.
- U.S. Navy holds patents for enigmatic inventions by aerospace engineer Dr. Salvatore Pais.
- Pais came up with technology that can "engineer" reality, devising an ultrafast craft, a fusion reactor, and more.
- While mostly theoretical at this point, the inventions could transform energy, space, and military sectors.
The U.S. Navy controls patents for some futuristic and outlandish technologies, some of which, dubbed "the UFO patents," came to light recently. Of particular note are inventions by the somewhat mysterious Dr. Salvatore Cezar Pais, whose tech claims to be able to "engineer reality." His slate of highly ambitious, borderline sci-fi designs meant for use by the U.S. government ranges from gravitational wave generators and compact fusion reactors to next-gen hybrid aerospace-underwater craft with revolutionary propulsion systems, and beyond.
Of course, the existence of patents does not mean these technologies have actually been created, but there is evidence that some demonstrations of operability have been successfully carried out. As investigated and reported by The War Zone, a possible reason why some of the patents may have been taken on by the Navy is that the Chinese military may also be developing similar advanced gadgets.
Among Dr. Pais's patents are designs, approved in 2018, for an aerospace-underwater craft of incredible speed and maneuverability. This cone-shaped vehicle could potentially travel equally well through air, water, or space, without leaving any heat signature. It would achieve this by creating a quantum vacuum around itself with a very dense polarized energy field, allowing it to repel any molecule it comes into contact with, whatever the medium. Manipulating "quantum field fluctuations in the local vacuum energy state" would help reduce the craft's inertia, and the polarized vacuum would dramatically decrease any elemental resistance, leading to "extreme speeds," the patent claims.
Not only that, if the vacuum-creating technology can be engineered, we'd also be able to "engineer the fabric of our reality at the most fundamental level," states the patent. This would lead to major advancements in aerospace propulsion and generating power. Not to mention other reality-changing outcomes that come to mind.
Among Pais's other patents are inventions that stem from similar thinking, outlining pieces of technology necessary to make his creations come to fruition. His paper presented in 2019, titled "Room Temperature Superconducting System for Use on a Hybrid Aerospace Undersea Craft," proposes a system that can achieve superconductivity at room temperatures. This would become "a highly disruptive technology, capable of a total paradigm change in Science and Technology," conveys Pais.
High frequency gravitational wave generator.
Credit: Dr. Salvatore Pais
Another invention devised by Pais is an electromagnetic field generator that could generate "an impenetrable defensive shield to sea and land as well as space-based military and civilian assets." This shield could protect from threats like anti-ship ballistic missiles, cruise missiles that evade radar, coronal mass ejections, military satellites, and even asteroids.
Dr. Pais's ideas center around the phenomenon he dubbed "The Pais Effect". He referred to it in his writings as the "controlled motion of electrically charged matter (from solid to plasma) via accelerated spin and/or accelerated vibration under rapid (yet smooth) acceleration-deceleration-acceleration transients." In less jargon-heavy terms, Pais claims to have figured out how to spin electromagnetic fields in order to contain a fusion reaction – an accomplishment that would lead to a tremendous change in power consumption and an abundance of energy.
According to his bio in a recently published paper on a new Plasma Compression Fusion Device, which could transform energy production, Dr. Pais is a mechanical and aerospace engineer working at the Naval Air Warfare Center Aircraft Division (NAWCAD), which is headquartered in Patuxent River, Maryland. Holding a Ph.D. from Case Western Reserve University in Cleveland, Ohio, Pais was a NASA Research Fellow and worked with Northrop Grumman Aerospace Systems. His current Department of Defense work involves his "advanced knowledge of theory, analysis, and modern experimental and computational methods in aerodynamics, along with an understanding of air-vehicle and missile design, especially in the domain of hypersonic power plant and vehicle design." He also has expert knowledge of electrooptics, emerging quantum technologies (laser power generation in particular), high-energy electromagnetic field generation, and the "breakthrough field of room temperature superconductivity, as related to advanced field propulsion."
Suffice it to say, with such a list of research credentials that would make Nikola Tesla proud, Dr. Pais seems well-positioned to carry out groundbreaking work.
A craft using an inertial mass reduction device.
Credit: Salvatore Pais
The patents won't necessarily lead to these technologies ever seeing the light of day. The research has its share of detractors among other scientists, who argue that the energy required for the fields Pais describes, and his ideas on electromagnetic propulsion, are well beyond the reach of current technology and verge on the impossible. Yet investigators at The War Zone found comments from Navy officials indicating that the inventions are being taken seriously enough for some tests to take place.
If you'd like to read through Pais's patents yourself, check them out here.
Laser Augmented Turbojet Propulsion System
Credit: Dr. Salvatore Pais
As bad as this sounds, a new essay suggests that we live in a surprisingly egalitarian age.
- A new essay depicts 700 years of economic inequality in Europe.
- The only stretch of time more egalitarian than today was the period between roughly 1350 and 1700.
- Data suggest that, without intervention, inequality does not decrease on its own.
Economic inequality is a constant topic. No matter the cycle — boom or bust — somebody is making a lot of money, and the question of fairness is never far behind.
A recently published essay in the Journal of Economic Literature by Professor Guido Alfani adds an intriguing perspective to the discussion by showing the evolution of income inequality in Europe over the last several hundred years. As it turns out, we currently live in a comparatively egalitarian epoch.
Seven centuries of economic history
Figure 8 from Guido Alfani, Journal of Economic Literature, 2021.
This graph shows the amount of wealth controlled by the top ten percent in certain parts of Europe over the last seven hundred years. Archival documentation similar to — and often of similar quality to — modern economic data allows researchers to glimpse what economic conditions were like centuries ago. Sources like property tax records and documents listing the rental value of homes can be used to determine how much a person's estate was worth. (While these methods leave out those without property, the data is not particularly distorted.)
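The headline statistic behind the graph — the share of wealth held by the top ten percent — is straightforward to compute once estate valuations are in hand. A minimal sketch, with invented numbers rather than any real archival data:

```python
# Illustrative only: how a top-decile wealth share can be computed from a
# list of estate valuations (e.g. property tax assessments). The sample
# values below are hypothetical, not from Alfani's dataset.
def top_decile_share(estates):
    """Return the fraction of total wealth held by the richest 10%."""
    values = sorted(estates, reverse=True)
    top_n = max(1, len(values) // 10)   # number of households in the top decile
    return sum(values[:top_n]) / sum(values)

# Hypothetical valuations for 20 households
sample = [500, 300, 90, 80, 70, 60, 50, 40, 35, 30,
          25, 20, 18, 15, 12, 10, 8, 6, 4, 2]
print(f"Top 10% hold {top_decile_share(sample):.0%} of total wealth")
```

In this toy sample the two richest households already hold well over half of all wealth, which is roughly the territory the European series occupies for most of the period shown.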
The first part of the line, shown in black, represents work by Prof. Alfani and gives the average inequality level of the Sabaudian State in northern Italy, the Florentine State, the Kingdom of Naples, and the Republic of Venice. The latter part, in gray, is based on the work of French economist Thomas Piketty and represents an average of inequality in France, the United Kingdom, and Sweden during that time period.
Despite the shift in location, the level of inequality and rate of increase are very similar between the two data sets.
Apocalyptic events cause decreases in inequality
Note that there are two substantial declines in inequality. Both are tied to truly apocalyptic events. The first is the Black Death, the common name for the bubonic plague pandemic of the 14th century, which killed off anywhere between 30 and 50 percent of Europe's population. The second, at the dawn of the 20th century, was the result of World War I and the many major events in its aftermath.
The 20th century as a whole was a time of tremendous economic change, and the periods not featuring major wars are notable for having large experiments in distributive economic policies, particularly in the countries Piketty considers.
The slight stall in the rise of inequality during the 17th century is the result of the Thirty Years' War, a terrible religious conflict that ravaged Europe and left eight million people dead, and of major plagues that affected southern Europe. However, the recurrent outbreaks of plague after the Black Death no longer had much effect on inequality. This was due to a number of factors, not the least of which was the adaptation of European institutions to handle pandemics without causing such a shift in wealth.
In 2010, the last year covered by the essay, inequality levels were similar to those of 1340, with 66 percent of society's wealth held by the top ten percent. Inequality was also still rising, a trend that has not reversed since. As Prof. Alfani explained in an email to Big Think:
"During the decade preceding the Covid pandemic, economic inequality has shown a slow tendency towards further inequality growth. The Great Recession that began in 2008 possibly contributed to slow down inequality growth, especially in Europe, but it did not stop it. However, the expectation is that Covid-19 will tend to increase inequality and poverty. This, because it tends to create a relatively greater economic damage to those having unstable occupations, or who need physical strength to work (think of the effects of the so-called "long-Covid," which can prove physically invalidating for a long time). Additionally, and thankfully, Covid is not lethal enough to force major leveling dynamics upon society."
Can only disasters change inequality?
That is the subject of some debate. While inequality can occur in any economy, even one that doesn't grow all that much, some things appear to make it more likely to rise or fall.
Thomas Piketty suggested that changes in inequality levels are driven by the difference between the rate of return on capital and the overall growth rate of the economy. Since the return on capital is typically higher than the overall growth rate, those who have capital to invest tend to get richer faster than everybody else.
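Piketty's mechanism is just compound interest running at two different rates. A minimal sketch with illustrative numbers (5% return on capital versus 2% economic growth; neither figure is from the essay) shows how quickly the gap compounds:

```python
# Toy illustration of Piketty's r > g mechanism: when the return on capital
# (r) exceeds economic growth (g), capital grows relative to the economy.
# The rates below are illustrative assumptions, not historical estimates.
def capital_vs_economy(r, g, years, capital=100.0, economy=100.0):
    """Return how much capital has grown relative to the whole economy."""
    for _ in range(years):
        capital *= 1 + r      # capital compounds at the rate of return
        economy *= 1 + g      # output compounds at the growth rate
    return capital / economy

ratio = capital_vs_economy(r=0.05, g=0.02, years=50)
print(f"After 50 years, capital has grown {ratio:.1f}x relative to output")
```

With these assumed rates, a capital owner ends up more than four times richer relative to the economy after half a century, without any change in behaviour — which is why the mechanism, left alone, pushes inequality steadily upward.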
While this does explain a great deal of the graph after 1800, his model fails to explain why inequality fell after the Black Death. Indeed, since the plague destroyed human capital and left material goods untouched, we would expect the ratio of wealth to income, and with it inequality, to rise. His model can, however, account for the decline in inequality in the decades after the pandemic: it is possible that the abundance of capital lowered returns over a longer time span.
The catastrophe theory put forth by Walter Scheidel suggests that the only force strong enough to wrest economic power from those who have it is a world-shattering event like the Black Death, the fall of the Roman Empire, or World War I. While each event changed the world in a different way, they all had a tremendous leveling effect on society.
But not even this explains everything in the above graph. Pandemics subsequent to the Black Death had little effect on inequality, and inequality continued to fall for decades after World War II ended. Prof. Alfani suggests that we remember the importance of human agency through institutional change. He attributes much of the post-WWII decline in inequality to "the redistributive policies and the development of the welfare states from the 1950s to the early 1970s."
What does this mean for us now?
As Professor Alfani put it in his email:
"[H]istory does not necessarily teach us whether we should consider the current trend toward growth in economic inequality as an undesirable outcome or a problem per se (although I personally believe that there is some ground to argue for that). Nor does it teach us that high inequality is destiny. What it does teach us, is that if we do not act, we have no reason whatsoever to expect that inequality will, one day, decline on its own. History also offers abundant evidence that past trends in inequality have been deeply influenced by our collective decisions, as they shaped the institutional framework across time. So, it is really up to us to decide whether we want to live in a more, or a less unequal society."
Our love-hate relationship with browser tabs drives all of us crazy. There is a solution.
- A new study suggests that tabs can cause people to be flustered as they try to keep track of every website.
- The reason is that tabs are unable to properly organize information.
- The researchers are plugging a browser extension that aims to fix the problem.
A lot of ideas that people had about the internet in the 1990s have fallen by the wayside as technology and our usage patterns evolved. Long gone are things like GeoCities, BowieNet, and the belief that letting anybody post whatever they are thinking whenever they want is a fundamentally good idea with no societal repercussions.
While these ideas have been abandoned and the tools that made them possible often replaced by new and improved ones, not every outdated part of our internet experience is gone. A new study by a team at Carnegie Mellon makes the case that the use of tabs in a web browser is one of these outdated concepts that we would do well to get rid of.
How many tabs do you have open right now?
We didn't always have tabs. Introduced in the early 2000s, they are now included in all major web browsers, and most users have had access to them for over a decade. They have remained pretty much the same since they came out, despite the ever-changing nature of the internet. So, in this new study, researchers interviewed and surveyed 113 people about their use of — and feelings toward — the ubiquitous tabs.
Most people use tabs for the short-term storage of information, particularly information that will be needed again soon. Some keep tabs that they know they'll never get around to reading. Others use them as a sort of external memory bank. One participant described this to the researchers:
"It's like a manifestation of everything that's on my mind right now. Or the things that should be on my mind right now... So right now, in this browser window, I have a web project that I'm working on. I don't have time to work on it right now, but I know I need to work on it. So it's sitting there reminding me that I need to work on it."
You suffer from tab overload
Unfortunately, trying to use tabs this way can cause a number of problems. A quarter of the interview subjects reported having caused a computer or browser to crash because they had too many tabs open. Others reported feeling flustered by having so many tabs open — a situation called "tab overload" — or feeling ashamed that they appeared disorganized by having so many tabs up at once. More than half of participants reported having problems like this at least two or three times a week.
However, people can become emotionally invested in the tabs. One participant explained, "[E]ven when I'm not using those tabs, I don't want to close them. Maybe it's because it took efforts [sic] to open those tabs and organize them in that way."
So, we have a tool that inefficiently saves web pages that we might visit again while simultaneously reducing our productivity, increasing our anxiety, and crashing our machines. And yet we feel oddly attached to them.
Either the system is crazy or we are.
Skeema: The anti-tab revolution
The researchers concluded that at least part of the problem is caused by tabs not being an ideal way of organizing the work we now do online. They propose a new model that better compartmentalizes tabs by task and subtask, reflects users' mental models, and helps manage the users' attention on what is important right now rather than what might be important later.
To that end, the team also created Skeema, an extension for Google Chrome that treats tabs as tasks and offers a variety of ways to organize them. Users of an early version reported having fewer tabs and windows open at one time and were better able to manage the information they contained.
Tabs were an improvement over having multiple windows open at the same time, but they may have outlived their usefulness. While it might take a paradigm shift to fully replace the concept, the study suggests that taking a different approach to tabs might be worth trying.
And now, excuse me, while I close some of the 87 tabs I currently have open.