PyTorch, the open-source framework used to build machine learning models, including those used for machine translation, released its newest version, 1.8, on March 4, 2021.
According to PyTorch’s official announcement, “highlights include updates for compiler, code optimization, frontend APIs for scientific computing, large scale training for pipeline and model parallelism, and Mobile tutorials.” Prior to this release, PyTorch released version 1.7 in October 2020.
Facebook’s AI Research Lab released PyTorch in September 2016. Since then, PyTorch has attracted attention among developers for its flexibility, speed, and ease of debugging.
In an informal poll on LinkedIn, Tyler Folkman, Head of Artificial Intelligence at Branded Entertainment Network, asked colleagues to name their “go-to deep learning framework.”
“I personally prefer PyTorch because I think it has a good balance of usability and extendability,” Folkman wrote. “By that, I mean that I find the common stuff is pretty easy to do without too much code and the more complex architectures are also relatively straightforward to implement.”
Though respondents variously voiced support for PyTorch and TensorFlow, Dennis Sawyers, a Senior Cloud Solutions Architect at Microsoft, wrote, “TensorFlow is passé at this point.”
On Twitter, Josh Tobin, an instructor at the University of California, Berkeley and former research scientist at OpenAI, suggested a generational divide: “Why do people always ask what ML framework to use? It’s easy: Jax is for researchers; PyTorch is for engineers; TensorFlow is for boomers.”
Microprocessor engineering legend Jim Keller, himself a boomer, told Lex Fridman in a February 2021 podcast that he knows many people who have switched from TensorFlow to PyTorch.
“The native language of people who write AI network programs is PyTorch now,” he said, noting how it is built to scale naturally. “If you write a piece of PyTorch code that looks pretty reasonable, you should be able to compile it and run it on hardware without having to tweak it and do all kinds of crazy things to get performance.”
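Keller’s point about compiling “pretty reasonable” PyTorch code can be illustrated with TorchScript, PyTorch’s built-in compiler, which turns ordinary Python functions into a portable, optimizable graph. A minimal sketch (the function name is illustrative, not from any quoted source):

```python
import torch

# torch.jit.script compiles this ordinary-looking PyTorch function
# into a serializable graph that can run without the Python interpreter,
# e.g. from a C++ runtime or on mobile.
@torch.jit.script
def relu_sum(x: torch.Tensor) -> torch.Tensor:
    return torch.relu(x).sum()

# The compiled function is called exactly like the original Python one.
print(relu_sum(torch.ones(3)))  # tensor(3.)
```

The compiled function can also be saved with `relu_sum.save(...)` and loaded into a non-Python runtime, which is the kind of no-tweaking deployment path Keller describes.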
Very Much an Applied Endeavor
What does this mean for machine translation (MT)? Experts hesitate to call PyTorch, which has yet to be widely deployed in production, the new standard. Other platforms, such as TensorFlow, are more established, with better visualizations, more pretrained models, and more support and tutorials.
Adam Bittlingmayer, CEO and co-founder of MT risk prediction API ModelFront, told Slator that the factors boosting PyTorch’s popularity for other machine learning tasks may not immediately apply to MT.
In the meantime, Bittlingmayer said, “It’s hard to know who is using an open-source library, but it looks like MarianNMT has the most traction right now.” (MarianNMT is notable for being written in pure C++, whereas most others are written in Python.)
“Machine translation is very much an applied endeavor. There’s so much that goes into making a good production system, so the tools that provide the most out of the box will win,” Bittlingmayer added. “The core algorithms themselves are easy to transfer across libraries and frameworks.”