Why Scientists are Training AI to Take Standardized Tests

Researchers hope training machines to the test will allow for advances in imbuing software with basic common sense.

Computer software has proven itself to be a lot better than humans at a whole lot of things: search queries, indexing, calculations, etc. But common sense is not currently one of those things. That's why computer scientists are toying with a bunch of neat new strategies for instilling in AI the one cognitive ability we possess that it lacks -- the ability to learn.


For example, a team of researchers at the Allen Institute for Artificial Intelligence in Seattle is training its AI program, named Aristo, to take New York State's standardized fourth-grade science exam. Oren Etzioni, the Allen Institute's CEO, argues that standardized tests offer a strong benchmark for tracking the progress of machine learning.

To understand what he means, let's take a quick detour back to standardized tests. They get a bad rap around here, and deservedly so: they're not a great way to guide schoolchildren toward creative thinking or a lifelong love of learning. Luckily for computer scientists, AI isn't like your typical fourth grader.

Microsoft Director of Search Stefan Weitz explains that the future of machine learning consists of teaching artificial intelligence to identify patterns.

There's a reason some kids are better at taking tests than others -- and it's not all brains. It's a matter of finding the most efficient way to interpret questions and deliver the best possible answers. Take, for example, your garden variety multiple-choice question. If you don't already know the solution, the best strategy is to winnow down the choices until you've found the one that's most likely correct. In many ways it's a matter of common sense, which is why Etzioni is so keen on making sure Aristo passes the exam.
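To make that winnowing strategy concrete, here's a minimal, purely illustrative Python sketch of how a test-taking program might score multiple-choice options and keep the likeliest one. The example question, the keyword-overlap scorer, and the "knowledge" snippet are assumptions for illustration only, not a description of how Aristo actually works.

```python
# Illustrative sketch of multiple-choice elimination: score every answer
# option against a small snippet of retrieved "knowledge" and keep the
# option that scores highest. The keyword-overlap scorer is a hypothetical
# stand-in for the far richer models a system like Aristo would use.

def score_option(option: str, knowledge: str) -> int:
    """Count how many words the option shares with the knowledge snippet."""
    return len(set(option.lower().split()) & set(knowledge.lower().split()))

def pick_answer(options: list[str], knowledge: str) -> str:
    """Winnow the choices: return the highest-scoring option."""
    return max(options, key=lambda opt: score_option(opt, knowledge))

if __name__ == "__main__":
    # Hypothetical fourth-grade-style science question.
    question = "Which form of energy does a plant use to make food?"
    options = ["sound energy", "light energy", "electrical energy", "heat energy"]
    # In a real system this snippet would come from retrieval over a large corpus.
    knowledge = "Plants use light energy from the sun to make their own food."
    print(question)
    print("Best guess:", pick_answer(options, knowledge))  # -> light energy
```

A real system would swap the overlap scorer for models that retrieve and reason over far larger bodies of text, but the basic move is the same: rank the choices and keep the one most likely to be correct.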

Why is common sense the current golden fleece for computer scientists? The Siri program on your phone might be able to interpret your voice and take action, but it isn't applying rational thought to assist you. It's incapable of figuring things out for itself or of interpreting requests in ways it wasn't initially programmed to handle. The same applies to plenty of computer systems more advanced and important than personal assistant software. Imagine how useful effective, replicable machine learning could be if scientists make major progress in teaching AI to teach itself.

The future of AI and machine learning is going to be a lot more impressive than a simple search engine. The only reason we're not there yet is that teaching software how to reason is a lot more difficult than assigning it mindless busywork. If the folks at the Allen Institute and others like them prove successful in their endeavors, the future of AI may not be so far into the future after all.

Read more at MIT Technology Review.

Want to meet Aristo and see it in action? Check it out at the Allen Institute website.

