Apple Proposes New Solution for Multilingual Machine Translation
Apple researchers propose a solution they claim can improve multilingual machine translation without pivoting and without increasing inference cost.
Snapshot of the most interesting language technology highlights from the latest edition of the State of AI Report compiled by a group of venture capitalists and researchers.
A 500-million-sentence dataset from Jörg Tiedemann, Professor at the University of Helsinki, can improve back-translation by centralizing monolingual content for 188 languages.
A group of AI researchers takes the perennial human vs. machine debate to the next level by claiming their system outperforms professional translators...in news, on adequacy (read on for more caveats).
Since its 2017 debut, Sockeye has powered Amazon Translate and gathered decent traction elsewhere. The new version adopts Gluon base code and gets a boost from Intel and NVIDIA.
Researchers investigate the unexpected behavior of NMT and find surprising differences in the translations of extremely similar sentences.
Less than one year after Google open-sourced its much-discussed language model BERT, experts weigh in on its potential uses in neural machine translation.
At SlatorCon London 2019, Systran CEO Jean Senellart outlined the latest in NMT developments, called out some myths from vendors, and pointed out what’s missing from the industry.
Over two dozen charts, tables, and high-level visualizations, as well as analysis of GitHub code repositories relevant to the language industry.