December 15, 2020
Amazon Productizes NLP via Alexa, Puts ‘Live Translation’ Into Echo
On December 14, 2020, Amazon launched “Live Translation” for Alexa on Echo smart speakers that have their locale set to “English (US).”
The new feature allows Alexa to act as interpreter and translator for both sides of a two-person conversation. It initially supports six language pairs: English paired with Spanish, French, German, Italian, Brazilian Portuguese, or Hindi.
How it works, according to Amazon: the Echo user asks Alexa to start a session (e.g., “Alexa, translate into Italian”). After the beep, either party can begin speaking in English or Italian. Alexa detects which language is being spoken and translates each side of the conversation. Echo Show users can also read the translation on the touchscreen. To end the session, simply say, “Alexa, stop.”
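Amazon has not published Alexa’s internal APIs, but the routing step described above, detecting which side of a configured language pair is speaking and translating into the other, can be sketched as follows. All function and variable names here are illustrative assumptions, not Amazon code.

```python
# Hypothetical sketch of Live Translation's language-routing step:
# a session is configured as an (English, other-language) pair, and each
# detected utterance is translated into the *other* side of the pair.

# The six pairs announced for Live Translation (English with Spanish,
# French, German, Italian, Brazilian Portuguese, or Hindi).
LIVE_TRANSLATION_PAIRS = {
    ("en", "es"), ("en", "fr"), ("en", "de"),
    ("en", "it"), ("en", "pt-BR"), ("en", "hi"),
}

def target_language(session_pair: tuple[str, str], detected: str) -> str:
    """Return the language to translate *into*, given the detected source."""
    a, b = session_pair
    if detected == a:
        return b
    if detected == b:
        return a
    raise ValueError(f"{detected!r} is not part of session pair {session_pair}")

# Example: in an English-Italian session, Italian speech comes out in English.
print(target_language(("en", "it"), "it"))  # -> en
```

The point of the sketch is that the user never has to say which language they are speaking; once the session pair is fixed, language identification alone determines the translation direction.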
The same announcement said the new feature was built on existing Amazon technology, including Amazon Translate and Alexa’s automatic speech recognition (ASR) and text-to-speech systems, with machine learning models “designed and optimized” for translating conversational speech.
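Amazon has not detailed how these components are wired together inside Alexa, but the machine translation hop the announcement attributes to Amazon Translate is available through a public API. A minimal sketch of that step, assuming a boto3 Translate client (the `translate_text` call is the real public API; the surrounding ASR and text-to-speech stages are Alexa-internal and not shown):

```python
# Sketch of the text-translation step via the public Amazon Translate API.
# `client` can be a real boto3 Translate client (boto3.client("translate"))
# or any stand-in object exposing the same translate_text signature.

def translate_utterance(client, text: str, source: str, target: str) -> str:
    """Translate one transcribed utterance from `source` to `target`."""
    response = client.translate_text(
        Text=text,
        SourceLanguageCode=source,
        TargetLanguageCode=target,
    )
    return response["TranslatedText"]

# Usage against the live service (requires AWS credentials):
#   import boto3
#   translate_utterance(boto3.client("translate"), "Buongiorno", "it", "en")
```

Passing the client in as a parameter keeps the function testable without AWS credentials; in a real pipeline this call would sit between ASR output and text-to-speech input.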
Overseeing Alexa Live Translation is Shirin Saleem, who serves as senior manager for translation at Alexa AI. A data scientist with an MS from Carnegie Mellon, she has worked at Amazon Alexa since 2013, serving in various roles mainly focused on machine learning.
Before Amazon, there was Google, which announced it was offering “Interpreter Mode” for Google Nest in December 2019. Interpreter Mode had already been made available for smartphones via Google Assistant back in February 2019.
And, in an adjacent, B2B use case from back in August 2019, Slator covered Baidu’s productization of speech-to-speech translation, Baidu Translate. (The latest news on the Baidu page: “AI simultaneous interpreting beta is open” for Chinese-English.)