Facebook AI's new MMT model translates directly between 100 languages
Facebook AI has developed a new multilingual machine translation (MMT) model that can translate directly between any pair of 100 languages, unlike most other AI-powered translation systems, which typically route translations through English.
When translating Chinese to French, for instance, most English-centric multilingual models train on Chinese-to-English and English-to-French data, because English training data is the most widely available. Facebook AI's new MMT model instead trains directly on Chinese-to-French data, and it outperforms English-centric systems by 10 points on BLEU, the metric most widely used to evaluate machine translations, as sketched below.
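As a rough illustration of what direct translation between two non-English languages looks like in practice, here is a minimal sketch using the Hugging Face transformers port of M2M-100. The article itself does not mention this library or the facebook/m2m100_418M checkpoint; they are assumptions for the example.

```python
# Minimal sketch: direct Chinese -> French translation with an M2M-100 checkpoint,
# assuming the Hugging Face transformers port (not described in the article).
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

model = M2M100ForConditionalGeneration.from_pretrained("facebook/m2m100_418M")
tokenizer = M2M100Tokenizer.from_pretrained("facebook/m2m100_418M")

# Tell the tokenizer the source language, then encode the Chinese sentence.
tokenizer.src_lang = "zh"
chinese_text = "生活就像一盒巧克力。"  # "Life is like a box of chocolates."
encoded = tokenizer(chinese_text, return_tensors="pt")

# Force the decoder to start in French, so no English pivot is involved.
generated_tokens = model.generate(
    **encoded,
    forced_bos_token_id=tokenizer.get_lang_id("fr"),
)
print(tokenizer.batch_decode(generated_tokens, skip_special_tokens=True))
```

The key point is the forced French start token: the same single model handles any of the supported language pairs directly, rather than chaining a Chinese-to-English model with an English-to-French one.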
"Advanced multilingual systems can process multiple languages at once, but compromise on accuracy by relying on English data to bridge the gap between the source and target languages. We need one multilingual machine translation (MMT) model that can translate any language to better serve our community, nearly two-thirds of which use a language other than English," Facebook wrote in a blog post.
The M2M-100 model is trained on a total of 2,200 language directions, 10 times more than the best previous English-centric multilingual models. Facebook says the new model will improve the quality of translations for billions of people, especially those who speak low-resource languages.
Facebook has released the model, along with its training and evaluation setup, to help other researchers reproduce and further advance multilingual models, working toward a single universal model that understands all languages.
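For readers unfamiliar with the BLEU metric cited above, the snippet below shows one common way to compute a corpus-level BLEU score with the sacrebleu package. This is only an illustrative sketch with made-up sentences, not Facebook's released evaluation setup.

```python
# Illustrative BLEU scoring with sacrebleu (an assumption for this example;
# the article does not specify which tooling Facebook's evaluation uses).
import sacrebleu

# System outputs (hypotheses) and one set of reference translations.
hypotheses = ["La vie est comme une boîte de chocolats."]
references = [["La vie est comme une boîte de chocolats."]]

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.1f}")  # 0-100 scale; higher is better
```

A 10-point gain on this scale, as claimed for the direct Chinese-to-French direction, is a substantial improvement in machine translation evaluation.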
"For years, AI researchers have been working toward building a single universal model that can understand all languages across different tasks. A single model that supports all languages, dialects, and modalities will help us better serve more people, keep translations up to date, and create new experiences for billions of people equally. This work brings us closer to this goal," said Facebook.
For more details on the new Facebook AI MMT model, head over to the official blog post.

