Here Are the Language Highlights from the Popular ‘State of AI Report 2022’
Snapshot of the most interesting language technology highlights from the latest edition of the State of AI Report compiled by a group of venture capitalists and researchers.
A 500-million-sentence dataset from Jörg Tiedemann, Professor at the University of Helsinki, can improve back-translation by centralizing monolingual content for 188 languages.
A group of AI researchers takes the perennial human vs. machine debate to the next level, claiming their system outperforms professional translators in news translation, at least on adequacy (read on for more caveats).
Since its 2017 debut, Sockeye has powered Amazon Translate and gathered decent traction elsewhere. The new version adopts the Gluon codebase and gets a performance boost from Intel and NVIDIA.
Researchers investigate the unexpected behavior of NMT and find surprising differences in the translations of extremely similar sentences.
Less than one year after Google open-sourced its much-discussed language model BERT, experts weigh in on its potential uses in neural machine translation.
At SlatorCon London 2019, Systran CEO Jean Senellart outlined the latest in NMT developments, called out some myths from vendors, and pointed out what’s missing from the industry.
Over two dozen charts, tables, and high-level visualizations, as well as an analysis of GitHub code repositories relevant to the language industry.