April 29, 2020
Google Fixes Gender Bias in Google Translate (Again)
Gender bias in machine translation (MT) is one of the many research topics being eagerly explored by academics and industry stakeholders alike, as teams of researchers are publishing a flurry of MT papers in the run-up to the 2020 Annual Conference of the Association for Computational Linguistics (ACL).
Researchers from Cambridge recently published a paper that approaches gender bias as a domain adaptation problem. And Google Translate now says it has found a new fix for the gender bias issue. On April 22, 2020, a post on the Google AI Blog by Melvin Johnson, a Senior Software Engineer at Google Research, unveiled Google’s “Scalable Approach to Reducing Gender Bias in Google Translate.”
Google had previously claimed in December 2018 that the framework changes it made at the time allowed the system to “reliably produce feminine and masculine translations 99% of the time.” Google later walked back this assertion when, according to the blog, “it became apparent that there were issues in scaling” as the approach was rolled out across more languages.
But Google Translate is now back with a new fix for gender bias, which it said can produce “gender-specific translations with an average precision of 97%.” This means that when the system chooses to display gender-specific translations, it gets them right 97% of the time. As in 2018, Google claims the “model can reliably produce the requested masculine or feminine rewrites 99% of the time.”
So what’s changed? Google Translate’s 2018 iteration used a three-step approach, first to detect gender-neutral queries, then to generate gender-specific translations (e.g., providing masculine and feminine variations), and finally to check for accuracy. However, this now-obsolete approach proved problematic because the system had low recall and failed “to show gender-specific translations for up to 40% of eligible queries,” Google admitted.
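The distinction at play here is precision versus recall: the old system was accurate when it triggered, but it triggered too rarely. A minimal sketch of that arithmetic, using illustrative numbers rather than Google's actual evaluation data (only the 40% coverage-failure figure comes from the article):

```python
# Precision vs. recall for the 2018 pipeline, with made-up counts.
# Only the ~40% miss rate is from the article; the rest is illustrative.

eligible_queries = 1000        # gender-neutral queries that should get both translations
shown = 600                    # feature triggered for 60% (it missed "up to 40%")
correct_when_shown = 582       # hypothetical: 97% of shown results were correct

recall = shown / eligible_queries       # coverage: how often the feature fired at all
precision = correct_when_shown / shown  # accuracy: how often it was right when it fired

print(f"recall = {recall:.0%}, precision = {precision:.0%}")
# prints: recall = 60%, precision = 97%
```

High precision with 60% recall is exactly the failure mode Google describes: users simply never saw the gender-specific translations for a large share of eligible queries.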
Google Translate spent nearly 18 months overhauling its approach to gender bias for the 2020 iteration. Using a fresh three-step approach, the system first generates a default translation, which may or may not be gendered. If it is gendered, the system rewrites it to produce an alternative translation with the other gender, then checks the rewrite for accuracy before showing both.
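The new rewrite-then-verify pipeline can be sketched as follows. This is a toy illustration of the control flow only: every function and the one-entry translation table are hypothetical stand-ins, not Google's models (the Turkish example works because the pronoun "o" is gender-neutral).

```python
# Toy sketch of the 2020 three-step pipeline: translate, rewrite, verify.
# All names and data are hypothetical stand-ins, not Google's code.

GENDER_SWAPS = {"he": "she", "she": "he", "his": "her", "him": "her"}

def translate_default(source: str) -> str:
    """Step 1 stand-in for the NMT model: return a single default translation."""
    toy_mt = {"o bir doktor": "she is a doctor"}  # Turkish 'o' is gender-neutral
    return toy_mt.get(source.lower(), source)

def is_gendered(translation: str) -> bool:
    """Decide whether the default translation contains gendered words."""
    return any(tok in GENDER_SWAPS for tok in translation.lower().split())

def rewrite_gender(translation: str) -> str:
    """Step 2: rewrite the default translation with the opposite gender."""
    return " ".join(GENDER_SWAPS.get(tok, tok) for tok in translation.split())

def verify(original: str, rewrite: str) -> bool:
    """Step 3: accept the rewrite only if it differs solely in gendered words."""
    a, b = original.split(), rewrite.split()
    if len(a) != len(b):
        return False
    return all(x == y or GENDER_SWAPS.get(x) == y for x, y in zip(a, b))

def translate_with_alternatives(source: str) -> list[str]:
    """Full pipeline: return one translation, or two gendered variants."""
    default = translate_default(source)
    if not is_gendered(default):
        return [default]
    alternative = rewrite_gender(default)
    return [default, alternative] if verify(default, alternative) else [default]
```

For example, `translate_with_alternatives("O bir doktor")` yields both `"she is a doctor"` and `"he is a doctor"`, while a translation with no gendered words passes through unchanged. The key design difference from 2018 is that the gender decision happens after translation, on the output text, rather than by classifying the query up front.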
Google said tackling the gender bias problem is an important part of its commitment to upholding Google’s AI Principles, “which emphasizes the importance to avoid creating or reinforcing unfair biases.”
The blog post also claimed that Google has made “significant progress since our initial launch by increasing the quality of gender-specific translations and also expanding it to 4 more language-pairs.” Moreover, the company said, “we are committed to further addressing gender bias in Google Translate and plan to extend this work to document-level translation, as well.”