Will Technology Progress or Bring down Humanity?

The Bulletin of the Atomic Scientists worries technological advancements are going unchecked. The group asks that regulatory bodies be established to help assess and prevent risks.

The Bulletin of the Atomic Scientists decided to keep the hands of the Doomsday Clock at three minutes to midnight. The decision was meant to express disappointment in the world's failure to take dramatic action to curb climate change and the risk of nuclear disaster. Lower down on the list of potential catastrophes is a worry that disruptive technological advancements are going unchecked.

“It is clear that advances in biotechnology; in artificial intelligence, particularly for use in robotic weapons; and in the cyber realm all have the potential to create global-scale risk,” the group wrote.

The Bulletin recognizes that advances in artificial intelligence have the capacity to do great good for humanity, but also great harm. The field has been progressing rapidly, thanks in part to deep learning, and that pace has given the board and some of the world's brightest minds cause for worry.

The fear of a corporate-driven Skynet-like catastrophe has been on the minds of many brilliant thinkers. It has been such a concern for Elon Musk that he helped found OpenAI, a nonprofit artificial intelligence research company. It was established on the belief that it's "important to have a leading research institution which can prioritize a good outcome for all over its own self-interest." The institution aims to provide a place where AI can be developed without the potential for misuse by an unregulated government, scientific, or corporate institution.

"Where the technology is pushing conflict is moving so much faster than our systems' ability to adapt and regulate it that it's going to be a real challenge for us the next 10 to 15 years."

Even research done by well-intentioned people can go awry. They may be pushing the boundaries of science without stopping to think whether it's something they should be doing.

Scientists have pointed out over and over again that progress is a double-edged sword. More technological advancement means better standards of living, but it also means we're creating a number of "new ways things can go wrong," according to Stephen Hawking.

However, Lawrence Krauss isn't convinced there's cause for immediate concern.

“Elon Musk and others who have expressed concern and Stephen Hawking are friends of mine and I understand their potential concerns, but I’m frankly not as concerned about AI in the near term at the very least as many of my friends and colleagues are,” says Krauss, who is the chair of the Bulletin's Board of Sponsors.

"We, of course, have to realize that the rate at which machines are evolving in capability may far exceed the rate at which society is able to deal with them," he said.

For this reason, Hawking has said, “It's important to ensure that these changes are heading in the right directions. In a democratic society, this means that everyone needs to have a basic understanding of science to make informed decisions about the future.”

The Science and Security Board believes the international community should establish an institution to inspect and regulate these emerging technologies and assess their risks. So far, these technologies have developed with little oversight, which is why it falls upon society to call for such regulators.


Photo Credit: Christian Science Monitor / Contributor/ Getty

Natalie has been writing professionally for about 6 years. After graduating from Ithaca College with a degree in Feature Writing, she snagged a job at PCMag.com where she had the opportunity to review all the latest consumer gadgets. Since then she has become a writer for hire, freelancing for various websites. In her spare time, you may find her riding her motorcycle, reading YA novels, hiking, or playing video games. Follow her on Twitter: @nat_schumaker

LinkedIn meets Tinder in this mindful networking app

Swipe right to make the connections that could change your career.

Swipe right. Match. Meet over coffee or set up a call.

No, we aren't talking about Tinder. Introducing Shapr, a free app that helps people with synergistic professional goals and skill sets easily meet and collaborate.


People who engage in fat-shaming tend to score high in this personality trait

A new study explores how certain personality traits affect individuals' attitudes on obesity in others.

Mind & Brain
  • The study compared personality traits and obesity views among more than 3,000 mothers.
  • The results showed that the personality traits neuroticism and extraversion are linked to more negative views and behaviors related to obesity.
  • People who score high in conscientiousness are more likely to experience "fat phobia."

4 anti-scientific beliefs and their damaging consequences

The rise of anti-scientific thinking and conspiracy is a concerning trend.

Moon Landing Apollo
  • Fifty years after one of the greatest achievements of mankind, there is a growing number of moon landing deniers. They are part of a larger trend of anti-scientific thinking.
  • Climate change denial, anti-vaccination sentiment, and other conspiratorial mindsets are a tangible impediment to real progress and societal change.
  • All of these separate anti-scientific beliefs share a troubling root of intellectual dishonesty and ignorance.

Reining in brutality - how one man's outrage led to the Red Cross and the Geneva Conventions

The history of the Geneva Conventions tells us how the international community draws the line on brutality.

Napoleon III at the Battle of Solferino. Painting by Adolphe Yvon. 1861.
Politics & Current Affairs
  • Henry Dunant's work led to the Red Cross and conventions on treating prisoners humanely.
  • Four Geneva Conventions defined the rules for prisoners of war, torture, naval and medical personnel and more.
  • Amendments to the agreements reflect the modern world but have not been ratified by all countries.