Neural is the New Black
Artificial neural networks are here to stay and are already proving effective in data-intensive industries. A blog post about Neural Machine Translation.
This article is the first in a series we'll be posting on the topic of machine translation. By powering the SmartCAT ecosystem with flexible third-party integration capabilities, we’re preparing to enter the Convergence era, in line with the insightful outlook* Jaap van der Meer presented at TAUS in 2013. Our aim is not just to evolve with the industry but to take the lead in that process.

*Read Jaap’s article on TAUS here.

In November last year, Google and Microsoft announced that they had updated their translation engines with Neural Machine Translation (NMT) systems. Both companies celebrated a significant boost in the quality of translation output, with Google going as far as to claim that the improvement outweighs all the efforts made in the last ten years combined. Microsoft, in turn, set up a demo page showing the dramatic progress in accuracy and fluency that NMT offers compared to the previous technology, SMT (Statistical Machine Translation). Though both Google and Microsoft currently support only a dozen of the most popular languages, the companies made it clear that a major milestone has been reached and that they plan to keep developing the technology and adding more languages.

Explaining the Magic

Artificial neural networks are here to stay and are already proving effective in data-intensive industries. It won’t be long before this technology becomes commonplace, and, for the wider public, the announcement of NMT served as an introduction. “Artificial neural networks (NN for short) are practical, elegant, and mathematically fascinating models for machine learning. They are inspired by the central nervous systems of humans and animals — smaller processing units (neurons) are connected together to form a complex network that is capable of learning and adapting,” explains Dr. Marek Rei, a researcher at the University of Cambridge.
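To make that description a little more concrete, here is a minimal sketch of a single artificial “neuron” in plain Python. The inputs and weights below are made up purely for illustration; a real network connects thousands or millions of such units and learns its weights from data.

```python
import math

def neuron(inputs, weights, bias):
    # A neuron computes a weighted sum of its inputs and passes the result
    # through a nonlinearity (here, the logistic sigmoid).
    weighted_sum = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-weighted_sum))

# Toy example: three inputs and made-up weights.
print(neuron([0.5, -1.0, 2.0], [0.8, 0.2, -0.5], bias=0.1))
```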

Models as complex as those Google and Microsoft build require huge data sets, and their performance depends on vast computing resources. NMT learns to translate by processing massive collections of existing translations across different language pairs. Unlike the SMT method, which is based on extracting correspondences from bilingual texts, the new technology doesn’t just match words and phrases but studies the relationships between the two languages more thoroughly. It analyzes each segment in the text and attempts to understand its context, thereby determining the meaning of each word in the segment to be translated. Looking beyond grammatical rules, semantics, and structure, NMT finds insights and linguistic patterns that even a human mind might not reveal. It then deconstructs full sentences in the source language and rebuilds them in the target language. “The key thing about neural network models is that they are able to generalize better from the data,” says Microsoft researcher Arul Menezes. “With the previous model, no matter how much data we threw at them, they failed to make basic generalizations. At some point, more data was just not making them any better.”
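For readers curious what “deconstructing and rebuilding sentences” looks like under the hood, here is a minimal, illustrative encoder-decoder sketch in PyTorch. It is not Google’s or Microsoft’s actual system; the vocabulary sizes, dimensions, and random token IDs below are placeholders standing in for real training data.

```python
import torch
import torch.nn as nn

SRC_VOCAB, TGT_VOCAB, EMB, HID = 1000, 1000, 64, 128  # toy sizes

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(SRC_VOCAB, EMB)
        self.rnn = nn.GRU(EMB, HID, batch_first=True)

    def forward(self, src_ids):
        # Read the whole source sentence and compress it into a hidden state.
        _, hidden = self.rnn(self.embed(src_ids))
        return hidden

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(TGT_VOCAB, EMB)
        self.rnn = nn.GRU(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, TGT_VOCAB)

    def forward(self, tgt_ids, hidden):
        # Generate the target sentence token by token, conditioned on the
        # encoder's representation of the source sentence.
        outputs, _ = self.rnn(self.embed(tgt_ids), hidden)
        return self.out(outputs)

encoder, decoder = Encoder(), Decoder()
src = torch.randint(0, SRC_VOCAB, (1, 7))   # a 7-token "source sentence"
tgt = torch.randint(0, TGT_VOCAB, (1, 9))   # a 9-token "target sentence"
logits = decoder(tgt, encoder(src))         # scores over the target vocabulary
print(logits.shape)                         # torch.Size([1, 9, 1000])
```

During training, a model like this is shown millions of sentence pairs and gradually adjusts its weights so that the words it predicts match the reference translations.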

 

GNMT proves accurate at handling this translation of a Wikipedia definition from English into French.

What Experts Say

Chris Wendt at Microsoft sounds enthusiastic: “Neural networks bring up the quality of languages with largely differing sentence structure, say English⇌Japanese, up to the quality level of languages with similar sentence structure, say English⇌Spanish. I have looked at a lot of Japanese to English output: Finally actually understandable.”

Even though higher-quality, more human-sounding output is apparent for some language pairs, these are the early days of the technology, and human translators are by no means at risk of being replaced by machines. As Google Brain Team engineers Quoc V. Le and Mike Schuster point out, “Machine translation is by no means solved” with Neural MT.

Roland Meertens of Infor spoke at a GALA 2017 webinar about one major drawback of NMT compared to SMT and even rule-based MT: it doesn’t handle rare words very well and, in its current state, can’t make good use of glossaries.

“The first wave of NMT solutions are mostly generic systems, which are clearly improved in most language combinations over existing generic SMT solutions, especially to human evaluators. While we need to be wary of over exuberance about the progress, there is reason for optimism and we can expect further quality improvements as our understanding of the mystery of ‘hidden layers’ of deep learning improves,” notes Kirti Vashee. Commenting on NMT systems in the professional translation context, however, he argues that to be most useful they “must be adaptable/customizable for specific business purposes, i.e. they need to learn specific terminology and specific customer domain. Comprehensive customization will take significantly more computing time and all the requirements for good quality data will only intensify.”

Anyone Else?

Google and Microsoft aren’t the only players in the field. In fact, Systran was the first to launch its Pure Neural Machine Translation engine. Yandex, too, has walked down the “neural” path to make rare languages available in its own translation engine, and it recently announced that it is putting machine translation at the core of its new AI strategy. Baidu and Amazon have also mentioned that their own NMT initiatives are underway. Last summer, Facebook introduced its DeepText AI engine, which uses a deep neural network architecture to improve automatic translation of posts in the news feed. And Abdessamad Echihabi of SDL says, “Much like Google, SDL has been actively researching and investing in Neural MT.”

Asked for a comment for this post, Gábor Bessenyei, CEO of MorphoLogic Localisation, shared that the upcoming 3.0 version of Globalese will be based on NMT. In his opinion, the technology is a breakthrough and much more than just hype. Compared to SMT, it provides noticeably better, albeit not always perfect, output, which makes a difference in the post-editing process, Gábor said. In many cases, the machine translation will be as grammatically correct as the professional human reference translation, with only the wording differing. For example:


Original (in German): Der Rechnungsführer sorgt für die gebotenen technischen Vorkehrungen zur wirksamen Anwendung des FWS und für dessen Überwachung.

Reference human translation: The accounting officer shall ensure appropriate technical arrangements for an effective functioning of the EWS and its monitoring.

Globalese NMT: The accounting officer shall ensure the necessary technical arrangements for the effective use of the EWS and for its monitoring.


While we’re at it, SmartCAT users will be the first to enjoy the new Globalese NMT capabilities, as the two technologies are now integrated.

Be on the Lookout

The false fluency that NMT tends to demonstrate in some cases is an issue that can lead even a seasoned post-editor astray. This is why anyone who begins working with the technology should take time to learn how neural machine translation works and what pitfalls to expect. As Chris Wendt, Principal Group Program Manager for Machine Translation at Microsoft Research, commented on this issue, when neural and statistical machine translation are compared head-to-head, “statistical would win on accuracy and neural wins on fluency.” So keep in mind that results may differ between languages, and sometimes the neural brain can play tricks, especially with names, titles, and places.

 

In this English-to-Russian translation, Microsoft Translator calls GALA members “church members,” fails to figure out what “livestream” is, and phrases the closing call to action really awkwardly.

 

Jump on Board

NMT still has a long way to go, but it’s surely going to be a fun journey. Despite all the talk about technology eventually putting translators out of a job, more insightful observers say that the emergence of a smart AI partner promises higher quality in the face of ever-increasing volumes of content to be translated. A recent post on Slator’s Facebook page showed that the industry recognizes the potential.

This is exactly the idea the SmartCAT team has in mind as we work to make powerful tools available to our users. Both the Google and Microsoft NMT engines can be used right in the editor, alongside the SMT options. Using them in your projects is easy: navigate to the Resources tab of the selected project, select Paid service in the Machine Translation section, click the gear icon, and switch on the engines you want to use for the project. Keep in mind that not all language pairs are currently supported; however, the developers say they’re working hard to add more soon. The Microsoft NMT engine is available for free, as usual, if you are willing to send your translations to Microsoft to help improve their engine. If not, using Google’s and Microsoft’s NMT engines costs the same as their non-NMT counterparts.

 

 

If you’re as excited about this new feature as we are, we’d appreciate it if you let us know about your experience in the comments below. You’re welcome to get in touch with our support wizards at support@smartcat.ai or via the Customer Support menu on the website if you run into any difficulties using these or any other features.


This article originally appeared on the SmartCAT blog.

 

Author
Pavel Doronin, Product Analyst
