New Research Explores How to Simplify Translation Between Related Languages

In a paper published on October 23, 2023, a group of researchers introduced a new approach to machine translation (MT) that simplifies the task of translating between related languages. 

These related languages belong to the same language family and share linguistic characteristics, such as word order and lexical similarity. The proposed approach, known as DecoMT — Decomposed Prompting for Machine Translation — focuses on streamlining the challenging task of translating between such languages, according to the authors.

More specifically, DecoMT divides the “complex” translation process into “simpler, more manageable subtasks,” instead of attempting to translate entire sentences in one go. These subtasks are then addressed through few-shot prompting of large language models (LLMs).

At the core of DecoMT lies the concept of “monotonic alignment,” a characteristic commonly found in related languages, whereby the word order of the source language is largely preserved in the target language. This property enables DecoMT to decompose the translation process into a sequence of word chunk translations. Each word chunk is translated independently, and the translated chunks are then combined to form the final sentence. This approach contrasts with traditional MT methods, which translate the entire sentence as a whole.
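To make the decomposition concrete, here is a minimal Python sketch of the chunk-and-recombine idea. The function names, whitespace-based splitting, and four-word chunk size are illustrative assumptions for exposition, not the authors’ exact implementation.

```python
# Illustrative sketch of DecoMT-style chunk decomposition. translate_chunk()
# is a hypothetical stand-in for a few-shot-prompted LLM call; the chunk
# size and whitespace splitting are expository choices, not the paper's.

def split_into_chunks(sentence: str, chunk_size: int = 4) -> list[str]:
    """Split a sentence into contiguous word chunks."""
    words = sentence.split()
    return [" ".join(words[i:i + chunk_size])
            for i in range(0, len(words), chunk_size)]

def translate_chunk(chunk: str) -> str:
    """Stand-in for a few-shot LLM call that translates one chunk."""
    raise NotImplementedError

def decomposed_translate(sentence: str) -> str:
    # Monotonic alignment between related languages means translating
    # chunks in order and concatenating them largely preserves word order.
    return " ".join(translate_chunk(c) for c in split_into_chunks(sentence))
```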

As the authors explained, DecoMT employs a two-stage translation process for word chunks. The first stage focuses on translating each word chunk independently, while the second stage performs contextual translation, taking into account the surrounding context. This incremental approach ensures that the translation of each chunk considers the previously translated chunks, resulting in more accurate and fluent translations.
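The two-stage loop might be sketched as follows, again under stated assumptions: independent_translate and contextual_translate are hypothetical stand-ins for the paper’s few-shot prompts, and passing context as plain strings is an expository simplification.

```python
# Sketch of the two-stage process: first draft each chunk in isolation,
# then revise each draft left to right using the target-side translation
# produced so far. Both functions are hypothetical placeholders.

def independent_translate(chunk: str) -> str:
    """Stage 1: translate a source chunk in isolation."""
    raise NotImplementedError

def contextual_translate(target_so_far: str, chunk: str, draft: str) -> str:
    """Stage 2: revise the draft translation of `chunk` given the
    translation produced so far."""
    raise NotImplementedError

def two_stage_translate(chunks: list[str]) -> str:
    drafts = [independent_translate(c) for c in chunks]   # stage 1
    target = ""
    for chunk, draft in zip(chunks, drafts):              # stage 2
        revised = contextual_translate(target, chunk, draft)
        target = f"{target} {revised}".strip()            # grows incrementally
    return target
```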

One “key innovation” of DecoMT is its contextual translation, performed incrementally for each word chunk. By taking this step-by-step approach, DecoMT aims to provide more accurate translations while maintaining the correct word order. 

“We posit that by relieving LLMs from implicit re-ordering and focusing on sub-sentence structures, more accurate translations, particularly in longer sentences, can be achieved,” the authors said.

The effectiveness of DecoMT was evaluated across various related language pairs, spanning different language families. The evaluations encompassed both automatic and human assessment methods.

The results of these evaluations showed that DecoMT outperformed existing MT methods across various language pairs, with its advantage most pronounced in scenarios involving low-resource languages.

Often Overlooked

While this study employed Google’s mT5 for both independent and contextual translations, the authors highlighted that this approach can potentially be applied to other autoregressive LLMs. “At present, we utilize mT5 for both independent and contextual translations. However, it’s worth noting that any autoregressive LLM could potentially be used for independent translation,” they said.
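For readers who want to experiment, below is a minimal sketch of prompting mT5 through the Hugging Face Transformers library. The checkpoint, language pair, prompt wording, and decoding settings are illustrative assumptions; the paper’s exact prompt templates are not reproduced here.

```python
# Minimal sketch of calling mT5 via Hugging Face Transformers. Checkpoint,
# prompt format, and decoding settings are assumptions for illustration.

from transformers import AutoTokenizer, MT5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("google/mt5-base")
model = MT5ForConditionalGeneration.from_pretrained("google/mt5-base")

# mT5 is trained for span infilling: the sentinel <extra_id_0> marks the
# span the model should fill, which here stands in for a chunk translation.
source_chunk = "..."  # placeholder for a source-language word chunk
prompt = f"Hindi: {source_chunk} Marathi: <extra_id_0>"

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```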

From both economic and social perspectives, the importance of DecoMT cannot be overstated. According to the authors, this approach fills a critical gap in addressing the translation needs of related languages, which are often overlooked in the world of MT.

They emphasized that substantial commercial activity and social interactions occur between neighboring regions where two related languages are spoken. “In these situations, pivot translation via a third language, such as English, can prove inefficient due to two inference steps which can also cause cascading errors,” they noted.

The authors highlighted that in a world characterized by global commerce and social interactions, especially in regions where related languages are spoken, DecoMT has the potential to “streamline trade and enhance social connections.” 

Authors: Ratish Puduppully, Anoop Kunchukuttan, Raj Dabre, Ai Ti Aw, Nancy F. Chen