Google’s neural network learns to translate between language pairs it has never seen
The legendary Google Translate will now produce more natural and more accurate translations between languages, narrowing the gap between human and machine translators thanks to the learning capabilities of Google Translate’s neural network.
The company is now using a new technology called Neural Machine Translation (NMT), which aims to bring computer-generated translations closer to those produced by humans, to power its translations in seven new languages. Google says the update should make translations in those languages much more accurate and easier to understand.
“At a high level, the Neural system translates whole sentences at a time, rather than just piece by piece,” said Barak Turovsky, product lead at Google Translate, during a press event at Google’s San Francisco office on Tuesday. “It uses this broader context to help it figure out the most relevant translation, which it then rearranges and adjusts to be more like a human speaking with proper grammar.”
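To see why translating whole sentences beats translating piece by piece, consider a toy sketch (this is an invented illustration, not Google’s actual system — the dictionaries and function names below are made up for the example). A phrase-based approach maps each fragment in isolation, so an ambiguous word like “bank” always gets the same output; a context-aware approach can look at the rest of the sentence first.

```python
# Invented toy dictionary for illustration only.
PHRASE_TABLE = {
    "the": "le", "bank": "banque", "of": "de", "river": "rivière",
}

def phrase_based(sentence):
    """Translate word by word, ignoring surrounding context."""
    return " ".join(PHRASE_TABLE.get(w, w) for w in sentence.split())

def context_aware(sentence):
    """Pick a translation for 'bank' using the whole sentence."""
    words = sentence.split()
    out = []
    for w in words:
        if w == "bank" and "river" in words:
            out.append("rive")  # a riverbank, not a financial bank
        else:
            out.append(PHRASE_TABLE.get(w, w))
    return " ".join(out)
```

Here `phrase_based("the bank of the river")` renders “bank” as “banque” (a financial bank), while `context_aware` sees “river” elsewhere in the sentence and chooses “rive” instead — a crude stand-in for the broader context a neural system uses.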
Back in September, Google announced that it would switch from Phrase-Based Machine Translation (PBMT) to Google Neural Machine Translation (GNMT) for handling translations between Chinese and English, the first language pair to run on the neural network.
Because a neural network can apply what it has learned to new problems, even ones it was never explicitly programmed to solve, the GNMT creators became curious about something: could Google Translate translate between two languages that the system had never paired before? Thanks to machine learning, the answer is yes. It can generate reasonable translations between two languages without resorting to a third language as a bridge between them. The method is called zero-shot translation.
Researchers explained in a blog post how this works:
“Let’s say we train a multilingual system with Japanese to English and Korean to English examples… [GNMT] shares its parameters to translate between these four different language pairs… It can generate reasonable Korean to Japanese translations, even though it has never been taught to do so… To the best of our knowledge, this is the first time this type of transfer learning has worked in Machine Translation.”
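The mechanism behind this, as described in Google’s multilingual NMT research, is surprisingly simple: one shared model handles every pair, and an artificial token prepended to the input tells it which language to produce. The sketch below is a simplified illustration of that token-prefixing idea, not the real implementation; the function name and example sentences are invented for the sketch.

```python
def with_target_token(src_sentence, target_lang):
    """Prepend an artificial token (e.g. '<2ja>') telling the shared
    model which language to produce."""
    return f"<2{target_lang}> {src_sentence}"

# The four directions the shared model is actually trained on:
# Japanese<->English and Korean<->English.
training_pairs = [
    (with_target_token("猫", "en"), "cat"),       # ja -> en
    (with_target_token("cat", "ja"), "猫"),       # en -> ja
    (with_target_token("고양이", "en"), "cat"),    # ko -> en
    (with_target_token("cat", "ko"), "고양이"),    # en -> ko
]

# Zero-shot request: Korean -> Japanese, a direction never seen in
# training. Because all pairs share one set of parameters, the model
# can still attempt it when the '<2ja>' token asks for Japanese.
zero_shot_input = with_target_token("고양이", "ja")
```

The point of the sketch: `zero_shot_input` is a perfectly well-formed input to the shared model even though no Korean-to-Japanese pair appears in `training_pairs` — the target-language token, not a dedicated per-pair model, selects the output language.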
With GNMT rolling out for eight language pairs, Google Translate got its biggest update ever. French, German, Japanese, Korean, Portuguese, Spanish, and Turkish were added to English and Chinese, meaning GNMT now handles over 35 percent of translation requests on Google. The company hopes to extend GNMT to all 103 languages in the system in the next few months.
The company also announced it would be opening up its NMT API to developers and businesses through the Google Cloud.
“Today’s step towards Neural Machine Translation is a significant milestone for Google Translate, but there’s always more work to do and we’ll continue to learn over time,” says Turovsky.
Translate is just one of many Google products that have benefited from the company’s venture into neural networks; other applications include a futuristic image compression system and geolocation detection.
The new, artificially intelligent Google Translate is live now. Earlier this year, Google celebrated ten years of machine-based translation. More than one billion words are translated every day.