Top 6 fears about future technology

Many of our greatest fears stem from uncertainty about the future, and technology has made the future very uncertain indeed.

  • Americans are scared, but hardly alone; people are primed by evolution to worry over their inability to control their future environment.
  • Oxford professor Nick Bostrom has painted a doomsday scenario. Are he and Elon Musk correct?
  • Even if these six fears come to pass—and some of them surely will—they aren't guaranteed to be as catastrophic as we think. Fortunately or unfortunately, we are incredibly bad at predicting the future.

The future is a scary place. According to a 2017 survey, many Americans' greatest fears—economic collapse, another world war, not having enough money for the future, etc.—are concerns over the state of tomorrow. (Although it is worth noting that their number one fear, corrupt government officials, is a clear and ever-present danger.)

Americans are hardly alone. People are primed to worry over their inability to control their future environment. Tomorrow's unpredictability requires that our brains view it with suspicion, as a potential threat to our survival. Unfortunately for our survival-primed brains, technology's influence is making our future ever more protean.

Today's technological advancements arrive at an exponential pace, and the average person will have to adjust to changes that previously would have unfolded over several generations. Many of these advancements will, no doubt, be beneficial. Others, however, could prove less than advantageous.

Autonomous AI

Elon Musk speaks onstage at SXSW 2018 in Austin, Texas. During the conversation, Musk shared his fears over the future of AI.

(Photo by Diego Donamaria/Getty Images for SXSW)

Imagine a paperclip company creates an artificial superintelligence and tasks it with the single goal of making as many paperclips as possible. The company's stock soars, and humanity enters the golden age of the paperclip.

But something unexpected happens. The AI surveys the natural resources we need to survive and decides those could go a long way toward paperclip manufacturing. It consumes those resources in an effort to fulfill its prime directive, "make as many paperclips as possible," and wipes out humanity in the process.

This thought experiment, devised by Oxford professor Nick Bostrom, illustrates just one potential danger of creating an artificial superintelligence: we need to be very careful with our words.

"I'm very close to the cutting-edge of AI, and it scares the hell out of me," Elon Musk, CEO of Tesla and SpaceX, said at SXSW 2018. "It is capable of vastly more than anyone knows, and the rate of improvement is exponential. […] We have to figure out some way to ensure that the advent of digital superintelligence is one which is symbiotic with humanity. I think that's the single biggest exponential crisis that we face."

Bostrom and Musk paint worst-case scenarios, but there are plenty of worries over artificial superintelligence that don't end in human extinction. Experts have postulated that AI could automate terrorism, mass-produce propaganda, and streamline hacking to devastating effect.

Runaway automation

Americans have steadily been losing work to automation for decades, but the trend appears to be picking up speed. Self-driving cars, for example, could soon displace 5 million workers nationwide.

But taxi drivers aren't the only people who should be worried. A McKinsey Global Institute study suggests that nearly 70 million people could lose their jobs to automation by 2030. U.S. workers in retail, agriculture, manufacturing, and food services may find their jobs on the automated chopping block.

No wonder Americans fear the incoming robo-revolution. A Pew Research report found that 72 percent of U.S. adults surveyed expressed worry over automation, compared with 33 percent who were enthusiastic. A majority were also hesitant to consider using automated services such as driverless cars or robotic caregivers.

Killer robots

We create robots to fight our wars for us, but they turn on their masters and bring ruin to our world. It's a classic science fiction conceit, and one we're much closer to than, say, first contact. Autonomous drones are already available, and it is only a matter of time before they make the leap from selfie-machine to combatant.

The Campaign to Stop Killer Robots worries about this future, but not about robotic warriors turning on their masters. Rather, the campaign believes that autonomous weapons will lead to an erosion of accountability in armed conflicts between states.

As stated on the campaign's website:

The use of fully autonomous weapons would create an accountability gap as there is no clarity on who would be legally responsible for a robot's actions: the commander, programmer, manufacturer, or robot itself? Without accountability, these parties would have less incentive to ensure robots did not endanger civilians and victims would be left unsatisfied that someone was punished for the harm they experienced.

Given the difficulties already associated with prosecuting war crimes, the concern deserves serious consideration.

Vicious virtual reality

A group of children wearing virtual reality headsets.

(Photo by Getty Images)

Virtual reality is here, and it looks way better than the '80s led us to believe it would. But as with any new technology, trepidation has welled up over how it will affect people's wellbeing, especially children's.

"The gap between 'things that happen to my character' and 'things that happen to me' is bridged," Scott Stephen, a VR designer, told The New Yorker. "The way I process these scares is not through the eyes of a person using their critical media-viewing faculty but through eyes of I, the self, with all of the very human, systems-level, subconscious voodoo that comes along with that."

Because the technology's availability has been limited until recently, few studies have looked at VR's effects on children, and the ones we have aren't conclusive. One study showed that children were more likely to create a false memory under VR's influence, while another showed it can reduce anxiety in children undergoing medical procedures.

Baleful biomedical technologies

In the coming years, we could cultivate biomaterials in labs to replace failing organs and splice genes in utero so children won't suffer the debilitating inherited diseases of their forebears. Biomedical technologies promise a future where we are all better, stronger, and faster, at a fraction of the cost of one Steve Austin.

But a 2016 Pew Research report suggests that Americans don't see these medical advancements as incoming miracles. Of those surveyed, a majority said they were either somewhat or very worried about brain chips that make us smarter (69 percent), genetic editing to reduce babies' risk of disease (68 percent), and synthetic blood to improve physical abilities (63 percent).

Their reasoning? Such enhancements "could exacerbate the divide between haves and have-nots" and be used as a measure of superiority by their recipients. The more religious the participant, the more likely they were to believe such technologies were "meddling with nature" and crossed "a line we should not cross." Mostly, though, we just loathe the idea of neighbors throwing a get-together to show off their fancy new brain chips.

Wholesale nuclear power

The ghost town of Pripyat, Ukraine, with the Chernobyl nuclear reactor in the background.

(Photo by MediaProduction/Getty Images)

On Aug. 6, 1945, the United States dropped an atomic bomb on Hiroshima, Japan. Since then, nuclear weapons have been an existential threat to our species. In January 2018, the Bulletin of the Atomic Scientists set the Doomsday Clock at a mere two minutes to midnight.

But weapons of mass destruction aren't why nuclear made this list. It's here because people dread nuclear energy.

In a 2016 Gallup poll, a majority of Americans surveyed (54 percent) opposed nuclear energy, the first time a majority had opposed it since Gallup began asking the question in 1994. Of course, it's not hard to see where the fear originates. When nuclear power plants fail, they fail with devastating consequences: Three Mile Island, Chernobyl, Fukushima. The list is longer than we'd like.

But some experts argue that we need nuclear energy to decarbonize quickly enough to avert major climate catastrophe. Not only does nuclear power produce immense amounts of energy, it also has a low carbon footprint (lower even than solar's).

"In most of the world, especially the rich world, they're not talking about building new reactors. We're actually talking about taking reactors down before their lifetimes are over," Michael Shellenberger, president of Environmental Progress, said during his TED talk. "[The United States] could lose half of our reactors over the next 15 years, which would wipe out 40 percent of the emissions reductions we're supposed to get under the Clean Power Plan."

A cloudy crystal ball

So, is the future a technological murder mansion, a place where every dark corner hides a robotic horror waiting to kill all humans or, at the very least, take all our jobs? Maybe, but probably not.

People have a strong desire to predict the course of tomorrow, and whole social movements, from futurists to psychics to horoscopes, have sprung up to meet that demand. Such conjectures give us back a semblance of control over our future environment.

Thing is, we are incredibly bad at predicting the future.

To pick a few well-known examples: In the late 18th century, Thomas Malthus argued that unless family size was regulated, humanity would overpopulate the planet and condemn itself to famine and misery. In 1989, Francis Fukuyama foresaw the end of history. And in 1998, the Y2K bug was predicted to wipe out computer networks across the world.

But Malthus couldn't predict the technological advancements in agriculture that could feed billions more people than existed in his day; Fukuyama could not foresee the political upheaval of events such as 9/11; and Y2K doomsayers, well, they were just wrong.

Even if these six fears come to pass — and some of them surely will — they aren't guaranteed to be as bad as predicted. Automation could wipe out 70 million jobs, but new innovations could generate new jobs that need to be filled. Biomedical technologies could widen the gap between classes, but if we treat them as reconstructive procedures rather than aesthetic ones, then everyone should have a right to benefit.

That makes you feel better about the future… right?
