Google Research Brings ‘Massively Multilingual’ Machine Translation to 200+ Languages
How necessary is parallel data for improving machine translation quality for low-resource languages? Google researchers scale model with ‘pragmatic approach.’
Is machine translation trained mostly on...machine translation? Research sparks discussion among MT experts. Quality issues are likely worse for non-English pairs and low-resource languages.
Developers praise PyTorch, Facebook’s open source framework, for its flexibility, speed, and usability. But the machine translation world is not ready to dethrone TensorFlow just yet.
New research by Google AI explores multilingual NMT on an unprecedented scale. Good for low-resource languages, less so for the rest, results show.
Language industry startups Unbabel and Lilt took to the stage at one of Europe’s largest machine learning conferences and explained how they build atop ever more sophisticated AI tech.