A.I. is translating messages of long-lost languages
MIT and Google researchers use deep learning to decipher ancient languages.
- Researchers from MIT and Google Brain discover how to use deep learning to decipher ancient languages.
- The technique can be used to read languages that died long ago.
- The method builds on the ability of machines to quickly complete monotonous tasks.
There are about 6,500–7,000 languages currently spoken in the world. But that's less than a quarter of all the languages people have spoken over the course of human history — a total of around 31,000 languages, according to some linguistic estimates. Every time a language is lost, a way of thinking and of relating to the world goes with it. The relationships, the poetry of life uniquely described through that language are lost too. But what if you could figure out how to read the dead languages? Researchers from MIT and Google Brain created an AI-based system that can accomplish just that.
While languages change, many of their symbols, and the ways words and characters are distributed, stay relatively constant over time. Because of that, you could attempt to decode a long-lost language if you understood its relationship to a known progenitor language. This insight is what allowed the team, which included Jiaming Luo and Regina Barzilay from MIT and Yuan Cao from Google's AI lab, to use machine learning to decipher Linear B, an early script used to write Greek around 1400 BC, and Ugaritic, a cuneiform Semitic language related to Hebrew that is also over 3,000 years old.
Linear B had previously been cracked by a human: Michael Ventris deciphered it in 1953. But this was the first time the language was figured out by a machine.
The researchers' approach focused on four key properties related to the context and alignment of the characters to be deciphered: distributional similarity, monotonic character mapping, structural sparsity, and significant cognate overlap.
They trained the AI network to look for these traits, achieving the correct translation of 67.3% of Linear B cognates (words of common origin) into their Greek equivalents.
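To give a flavor of what "distributional similarity" means in practice, here is a heavily simplified sketch, not the paper's actual neural method: it pairs a word from an undeciphered vocabulary with the known-language word whose character distribution is most similar, using plain character-frequency vectors and cosine similarity. The toy words are hypothetical Latin-alphabet stand-ins, not real Linear B.

```python
from collections import Counter
from math import sqrt

def char_vector(word):
    """Character-frequency vector for a word — a crude stand-in for
    the learned character embeddings used in the actual paper."""
    return Counter(word)

def cosine(u, v):
    """Cosine similarity between two sparse frequency vectors."""
    dot = sum(u[c] * v[c] for c in u)
    norm = sqrt(sum(x * x for x in u.values())) * sqrt(sum(x * x for x in v.values()))
    return dot / norm if norm else 0.0

def best_match(unknown_word, known_vocab):
    """Pair an undeciphered word with the most distributionally
    similar word in the known language's vocabulary."""
    return max(known_vocab, key=lambda w: cosine(char_vector(unknown_word), char_vector(w)))

# Hypothetical toy vocabularies (illustration only).
known = ["tripod", "horse", "wheel"]
print(best_match("tripos", known))  # "tripod" shares the most character mass
```

The real system additionally enforces the other constraints named above — monotonic character mapping and sparse, near one-to-one cognate alignments — via a minimum-cost flow formulation, which this sketch omits.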
What AI can potentially do better in such tasks, according to MIT Technology Review, is take a brute-force approach that would be too exhausting for humans: it can attempt to translate symbols of an unknown alphabet by quickly testing them against the symbols of one language after another, running them through everything that is already known.
Next for the scientists? Perhaps the translation of Linear A, the still-undeciphered Minoan script that no one has succeeded in reading so far.
You can check out their paper "Neural Decipherment via Minimum-Cost Flow: from Ugaritic to Linear B" here.