Natural Language Processing’s Crazy Busy Start to 2019

Natural language processing (NLP) and natural language generation (NLG) continue to boom, powered by rapid advances in machine learning. Slator continually monitors NLP and NLG as the umbrella category to which machine translation (MT) belongs, because developments in these areas may eventually impact the language services market.

Machine translation, along with language services and technology more broadly, is also increasingly intertwined with the wider AI and machine learning scene. In January 2019, for instance, the 2019 Applied Machine Learning Days conference featured AI & Language as one of four main tracks.

Before we breathlessly launch into an update on the most recent NLP launches and funding rounds, a recent MIT Technology Review interview with an NLP pioneer provides some much-needed perspective.

Boris Katz, principal research scientist at MIT and one of the earliest researchers to contribute to the ideas that today underpin NLP and NLG, explained: “If you look at machine-learning advances, all the ideas came 20 to 25 years ago.”

So complex is language, according to Katz, that the virtual assistants most people would consider intelligent today are, essentially, "just counting words and numbers."

He further explained that the technology of today has simply caught up to the ideas of the past. Moving forward, however, may require a fundamentally new approach.

Katz put forth two examples. In the sentence “This book would not fit in the red box because it is too small,” he said you would want an intelligent robot to understand the box is too small. In the sentence “This book would not fit in the red box because it is too big,” however, the robot should know the book is too big.

But today's virtual assistants, and even state-of-the-art MT engines, often fail to associate the pronoun with the correct antecedent.
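Katz's pair of sentences is a classic Winograd-schema test of pronoun resolution. A minimal sketch (hypothetical code, not any system mentioned in this article) shows why a shallow, word-counting heuristic fails: a rule that simply picks the most recently mentioned noun as the antecedent gives the same answer for both sentences, even though the correct answers differ.

```python
def nearest_noun_antecedent(tokens, pronoun="it", nouns=("book", "box")):
    """Resolve a pronoun to the most recently mentioned candidate noun."""
    pronoun_index = tokens.index(pronoun)
    last_noun = None
    for token in tokens[:pronoun_index]:
        if token in nouns:
            last_noun = token
    return last_noun

sent_small = "this book would not fit in the red box because it is too small".split()
sent_big = "this book would not fit in the red box because it is too big".split()

# The heuristic answers "box" for both sentences, but only the first is right:
# "it" refers to the box when it is too small, and to the book when it is too big.
print(nearest_noun_antecedent(sent_small))  # box (correct: box)
print(nearest_noun_antecedent(sent_big))    # box (correct: book)
```

A system that resolves both sentences correctly needs to reason about which object's size prevents the fitting, which is precisely Katz's point about surface statistics.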

“One way forward is to gain a greater understanding of human intelligence and then use that understanding in order to create intelligent machines,” said Katz. “AI research needs to build on ideas from developmental psychology, cognitive science, and neuroscience, and AI models ought to reflect what is already known about how humans learn and understand the world.”

Fake news and millions of investment dollars

Of course, just because NLG is devoid of common sense does not mean current technologies lack common (at times downright impressive) applications. Over the last four weeks, Slator has gone through a number of high-profile news stories on NLP and NLG.

Among the more noteworthy stories: OpenAI, the non-profit AI research company co-founded by Elon Musk and Sam Altman, has released a new language model that lets users input a sentence or phrase, after which the model "predicts" what the next words should be. Essentially, it generates a full story from an initial input of a few words, a phrase, or a sentence.

The model, called GPT-2, proved so convincingly fluent after being trained on eight million webpages' worth of content that OpenAI withheld the full model, citing its potential for misuse.
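At its core, a language model like GPT-2 estimates which word is likely to come next given the words so far. GPT-2 does this with a large Transformer trained on millions of webpages, but the underlying idea of next-word prediction can be illustrated with a toy bigram model (a deliberately simplified sketch over a made-up three-sentence corpus):

```python
from collections import Counter, defaultdict

# Tiny corpus invented for illustration; GPT-2 trains on ~8M web pages.
corpus = (
    "the model predicts the next word . "
    "the model generates a full story . "
    "the model predicts the next token ."
).split()

# Count how often each word follows each other word (bigram counts).
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def continue_text(start, length=4):
    """Greedily extend `start` with the most frequent next word."""
    words = start.split()
    for _ in range(length):
        candidates = follows[words[-1]].most_common(1)
        if not candidates:
            break
        words.append(candidates[0][0])
    return " ".join(words)

print(continue_text("the model"))
```

Greedy generation from such shallow counts quickly falls into loops ("the model predicts the model predicts"), which illustrates Katz's "counting words and numbers" remark; GPT-2's fluency comes from conditioning on much longer contexts with vastly more parameters, not from a fundamentally different objective.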

On the more practical side of things, the emergent ecosystem based on NLP and NLG technologies is quite active. For instance, India’s top exporter, Reliance Industries, recently purchased Reverie Language Technologies for a total of USD 37.5m in two tranches. Reverie mainly focuses on localizing human-machine interaction for Indic languages, particularly India’s 22 vernacular languages.

Meanwhile, Google has announced a second wave of startups shortlisted for its Launchpad Accelerator program in India. One of these startups, DheeYantra, develops chatbots and other NLP solutions across eight Indian languages. According to Business Insider, DheeYantra's technology is already being used by Syndicate Bank and the Indian Institute of Management Ahmedabad.

In China, a company called Tigerobo secured USD 33m in Series A funding from Prospect Avenue Capital (PAC), CreditEase Fintech Investment Fund, and Gaorong Capital. Tigerobo is a financial search engine startup that uses NLP for its services. To date, it has raised almost USD 60m in capital.

Voice apps in homes, hotels, airports, contact centers

Following in the footsteps of Amazon Alexa, big tech companies are pushing further into homes, hotels and airports, and even contact centers.

One of the highlights of the 2019 Consumer Electronics Show (CES) in January, Google's Interpreter Mode for its intelligent Google Assistant has now officially rolled out for consumer use in homes. Google recently added Continued Conversation and Interpreter Mode to its Smart Displays.

Coinciding with the CES 2019 unveiling of these functions, Google announced pilot projects with the likes of Hyatt Hotels for multilingual concierge services.

Not to be left behind, Mitsubishi Electric recently announced the development of an NLP system that can facilitate conversations in 10 languages. The Asahi Shimbun reported that the system will be tested extensively to assess how practical it is for wide-scale use "in bustling areas and other noisy environments." A prototype was displayed and demoed on an airport information board, simulating the environments in which the system could operate.

So NLP is bound for, if it is not already in, our homes, hotels, and airports. Soon, NLP will answer our service calls to companies too.

PolyAI, an NLP-powered conversational platform, recently raised USD 12m to deploy conversational chatbots in contact centers. Point72 Ventures led the round with Sands Capital Ventures, Amadeus Capital Partners, Passion Capital, and Entrepreneur First participating. PolyAI has, thus far, raised USD 16.4m in funding, after a prior USD 2.4m seed round.

The company's CTO insists that their technology empowers human agents rather than replacing them, pointing out that it simply automates and streamlines contact center processes.

At AMLD 2019, a startup founder explained during a presentation how they, too, are building on the latest advances in NLP to deploy conversational chatbots.

Finally, among the most talked-about NLP news was Google's open-sourcing of Lingvo, a sequence-to-sequence framework built on TensorFlow and particularly geared toward NLP.

Experts have called it a "welcome tool," but cast some doubt on its likely uptake and on what it offers researchers beyond being packaged specifically to promote openness in research. Lingvo is probably not going to shake up the industry, at least in the foreseeable future. But it is nice to have the option.