We have all been there. You are looking for a product or service online. You find something interesting, but then there is only that one, solitary review — and the rating is 4 out of 5 stars.
You continue browsing and find something similar, which has 1,200 reviews and an aggregate rating of 4 out of 5 stars. All else being equal, you will go with that one. The broader point here: the more ratings, the better.
But how helpful are user-generated online reviews when they are in a different language? And how does adding the option to translate that foreign-language review into your preferred language influence your perception of a product or service?
Those are the questions Scott Hale from the Oxford Internet Institute, University of Oxford, and co-author Irene Eleta from the Barcelona Institute for Global Health (ISGlobal) set out to answer.
In May 2016, Slator covered a previous study by Hale, which found that language had a strong influence on the ratings of tourist attractions.
Bicycle Tour Challenge
“There is an interesting backstory to this,” Hale said, speaking to Slator about what triggered the research. Hale had been talking to an engineer at a technology company that did not show reviews in other languages, simply because it lacked a ready machine translation capability.
He recalled thinking, why should you not show reviews in other languages anyway? Is it really necessary to have an MT-powered translation option before displaying foreign-language reviews? Would it not already be beneficial to display the reviews in the original language? So, Hale thought, “Well, we disagree on this point, but it’s something empirical. We can test this.”
The first experiment was carried out in 2015. A replication using a different participant pool was conducted in the summer of 2016. Hale’s hypothesis going into the research was: Yes, fully pre-translating all foreign-language content would be great; but simply having more reviews, whether in the user’s native language or not, would be net-positive. What they found, however, was much more complex.
Only 28% of the subjects who were offered the translate button chose to use it
On February 1, 2017, Hale and Eleta published a paper called “Foreign-language Reviews: Help or Hindrance?” which presented findings from two experiments involving hundreds of native English speakers, who were asked to evaluate three London bicycle tours.
The researchers created a pool of 15 English-language reviews with corresponding Spanish translations, three photos, descriptions, and tour titles.
From the pool, the researchers created three tours: One had three English reviews, one had six English reviews, and one had three English and three Spanish reviews. The researchers saw their expectations confirmed when subjects rated the six-review tour higher than the three-review tour.
But how would the subjects rate the bilingual tour? Would it be enough for people to merely have more reviews (even in a language they do not speak) to rate it higher? It turns out, no. The researchers found no difference in the ratings between the three-review tour and the bilingual tour.
The researchers drilled down further. For the bilingual tour they provided 50% of the subjects with a “translate button.” The result? Not much difference. There was no discernible impact on the overall rating. Then again, only 28% of the subjects who were offered the translate button chose to use it.
Those who used the translate button attributed more importance to reviews in general, as opposed to pictures, etc.
And then things got interesting. Those who did click on the translate button behaved very differently from those who did not, rating the bilingual tour significantly higher than their more passive peers. The second experiment, which collected more granular demographic information from participants, confirmed the findings.
Further analysis showed that those who used the translate button attributed more importance to reviews in general, as opposed to pictures, descriptions, or ratings. In short, people who care about written reviews also care about reviews in languages they do not speak. And they use and appreciate a translate button.
The researchers hope their work will inform how online retailers, travel sites, and other players in e-commerce design and personalize the user experience (for example, show foreign-language reviews to users who click translate buttons, and hide them from those who do not).
According to the researchers, their uptake rate of 28% (subjects who chose to hit translate when offered) “is broadly in line with the figure an employee of one social media platform mentioned to us confidentially.”
The research showed a slight negative effect when displaying foreign-language reviews to those who did not use the translate button
The research showed a slight negative effect when displaying foreign-language reviews to those who did not use the translate button in the first place. To quote one participant: “I get enough Spanish here in the US. I’d rather not be around Spanish-speaking people when I visit London.”
Yet the 28% who did use the translate button rated the bilingual tour higher than the tour that had only three English reviews because, according to one participant, “it indicates the company is flexible and multinational.”
Perhaps the most interesting finding was that almost all participants who hit the translate button did so on all three reviews. They clearly relied on translation to form their opinion. Yet, overall, the results were inconclusive as to whether displaying foreign-language reviews alongside a translate button would be net-positive in e-commerce.
What is clear, however, is that a larger number of reviews in a user’s language is beneficial. (After all, the English-only, six-review tour consistently scored highest.) And one way to increase that number is to translate foreign-language reviews and present them to users in their native language. The challenge, then, for MT vendors and, perhaps, highly competitive crowdsourcing platforms, is to show the ROI.