The language services and technology industry is crossing into new territory with the emergence of large language models (LLMs), as tried-and-tested translation productivity tools converge with cutting-edge AI technologies.
Traditionally, translation memories (TMs) have helped linguists leverage previously translated content and reduce overall costs for localization buyers. With the emergence of custom neural machine translation (MT) technology, a convergence began in which TMs were used to train and fine-tune MT engines.
As LLMs now dominate both the AI discussion and its practical implementation, translation memories take that convergence one step further. TMs have gained additional significance for linguists and buyers, as they can improve output quality both by optimizing LLMs and by being optimized by them.
On the one hand, translation memories are goldmines of custom data that can be used to optimize LLMs. At this year’s SlatorCon, Roeland Hofkens, Chief Product and Technology Officer at LanguageWire, recommended that companies “use these assets to customize your LLM” to increase the accuracy of content produced by this new technology.
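As a rough illustration of what customizing an LLM with TM assets can involve, the sketch below converts TM segment pairs into prompt/completion records, a common shape for LLM fine-tuning datasets. The entries, language codes, and record format here are illustrative assumptions, not LanguageWire’s actual pipeline; in practice the pairs would typically be exported from a TMX file.

```python
import json

# Hypothetical translation memory entries: source/target segment pairs.
# Real data would come from a TMX export, not a hardcoded list.
tm_entries = [
    {"source": "Click Save to apply your changes.",
     "target": "Klicken Sie auf Speichern, um Ihre Änderungen zu übernehmen."},
    {"source": "The file could not be found.",
     "target": "Die Datei wurde nicht gefunden."},
]

def tm_to_finetune_records(entries, src_lang="en", tgt_lang="de"):
    """Turn TM segment pairs into prompt/completion records for fine-tuning."""
    records = []
    for entry in entries:
        records.append({
            "prompt": f"Translate from {src_lang} to {tgt_lang}: {entry['source']}",
            "completion": entry["target"],
        })
    return records

# One JSON object per line (JSONL), a format many fine-tuning APIs accept.
records = tm_to_finetune_records(tm_entries)
jsonl = "\n".join(json.dumps(r, ensure_ascii=False) for r in records)
print(jsonl)
```

The same loop scales to thousands of segments; the value of the TM lies in supplying domain- and client-specific terminology that a general-purpose model would not otherwise produce consistently.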
On the other hand, LLMs can help optimize translation memories by raising low fuzzy matches to high or even 100% matches, extracting maximum value from previously translated content.
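To make the fuzzy match idea concrete, the sketch below scores a new source segment against a stored TM segment and, when the match is below 100%, builds a prompt asking an LLM to patch only the changed part of the stored translation. The scoring function, segments, and prompt wording are illustrative assumptions; commercial CAT tools use their own (often proprietary) match algorithms.

```python
from difflib import SequenceMatcher

def fuzzy_score(a: str, b: str) -> int:
    """Percent similarity between two source segments, a rough
    stand-in for a CAT tool's fuzzy match score."""
    return round(SequenceMatcher(None, a.split(), b.split()).ratio() * 100)

# Hypothetical TM entry and a new, slightly different source segment.
tm_source = "Press the power button to turn on the device."
tm_target = "Drücken Sie die Ein/Aus-Taste, um das Gerät einzuschalten."
new_source = "Press the power button to turn off the device."

score = fuzzy_score(tm_source, new_source)  # high match, but not 100%

if score < 100:
    # In an AI-enhanced TM workflow, the stored target and the edited
    # source would be handed to an LLM, which patches only the changed
    # wording ("on" -> "off") instead of retranslating from scratch.
    repair_prompt = (
        f"TM source: {tm_source}\n"
        f"TM target: {tm_target}\n"
        f"New source: {new_source}\n"
        "Update the target so it translates the new source, "
        "changing as little as possible."
    )
    print(f"Fuzzy match {score}% - sending repair prompt to LLM")
```

The productivity gain comes from the last step: a linguist reviews a near-complete, LLM-patched segment rather than translating or heavily post-editing a low match by hand.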
This gives linguists an uplift in productivity through automated fuzzy match corrections. Dave Ruane, Partnerships Director at XTM International, stated in a press release that “AI-enhanced TM” brings a “new level of clarity, consistency and cost-effectiveness to global communication.”
With the convergence of translation memory and AI, developers of translation productivity solutions have new ways to bring optimization to the translation process and to increase speed and quality for localization buyers across a range of verticals.
This use case is one of ten one-page examples of LLMs being put to use, drawn from research and interviews with some of the industry’s leading language technology providers.