Online autocomplete suggestions for COVID-19 information are more likely to be misleading if the user types in Spanish rather than English, according to a new report.
The difference may harm Spanish speakers by connecting them with misinformation about handwashing, sanitizers, masks, or the disease itself, according to Vivek Singh, an assistant professor at Rutgers-New Brunswick’s School of Communication and Information and lead author of the new report.
“Little attention has been paid to the ways computer search algorithms present unequal access to health information across languages,” Singh says.
“We urge a review of autocompletes in search engines for potential bias, and for search engines to remove disparities in the dissemination of health information.”
The researchers collected autocomplete results for multiple COVID-19-related search terms over 60 days starting March 12, using Google’s search application programming interface. The search terms included “coronavirus is …,” “pandemic is …,” “hand sanitizer is …,” “hand washing is …,” and “face mask is …,” in both English and Spanish.
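The report does not specify the endpoint beyond “Google’s search API,” but autocomplete data of this kind can be pulled programmatically. Below is a minimal sketch in Python, assuming the publicly accessible suggest endpoint (suggestqueries.google.com) as a stand-in for whatever interface the researchers actually used; the Spanish query strings are illustrative translations, not terms confirmed by the report.

```python
import json
import urllib.parse
import urllib.request

# Stand-in for the API described in the report; this is Google's public
# suggest endpoint, which returns autocomplete suggestions as JSON.
SUGGEST_URL = "https://suggestqueries.google.com/complete/search"

# English terms are from the report; Spanish equivalents are assumed
# translations for illustration only.
TERMS = {
    "en": ["coronavirus is", "pandemic is", "hand sanitizer is",
           "hand washing is", "face mask is"],
    "es": ["coronavirus es", "pandemia es", "desinfectante de manos es",
           "lavado de manos es", "mascarilla es"],
}

def fetch_autocompletes(query, lang):
    """Return Google's autocomplete suggestions for a query in a given language."""
    params = urllib.parse.urlencode({"client": "firefox", "hl": lang, "q": query})
    with urllib.request.urlopen(f"{SUGGEST_URL}?{params}") as resp:
        # Response body is JSON shaped like [query, [suggestion, ...], ...]
        return json.loads(resp.read().decode("utf-8"))[1]

for lang, terms in TERMS.items():
    for term in terms:
        print(lang, term, fetch_autocompletes(term, lang))
```

Run daily over the study window, a loop like this would yield the kind of longitudinal suggestion counts the report describes.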
For “hand sanitizer is …,” English autocompletes included “bad,” “preferable to hand washing” and “safe.” For the Spanish equivalent, “malo” or “bad” in English was often the only autocomplete.
For “coronavirus is …,” the result “coronavirus is a lie” came up twice in English over 60 days, while the Spanish equivalent, “coronavirus es mentira,” appeared 35 times. In addition, the result “coronavirus es bíblico” (“coronavirus is biblical”) appeared one-third of the time in Spanish but never in English.
The researchers assigned a positive, negative, or neutral score to each autocomplete result, using sentiment recognition software that gives positive rankings to language more likely to evoke positive emotion (such as “coronavirus is curable”).
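The report does not name the sentiment tool. The following is a minimal sketch of such a scoring step, using NLTK’s VADER analyzer as a stand-in; note that VADER is English-only, so the Spanish results would need a comparable Spanish-language model, and the threshold value here is VADER’s conventional cutoff, not one taken from the report.

```python
import nltk
nltk.download("vader_lexicon", quiet=True)  # VADER's lexicon, fetched once
from nltk.sentiment.vader import SentimentIntensityAnalyzer

analyzer = SentimentIntensityAnalyzer()

def label_sentiment(text, threshold=0.05):
    """Map an autocomplete result to a positive/negative/neutral label."""
    # VADER's compound score is a normalized value in [-1, 1].
    score = analyzer.polarity_scores(text)["compound"]
    if score >= threshold:
        return "positive"
    if score <= -threshold:
        return "negative"
    return "neutral"

print(label_sentiment("coronavirus is curable"))  # expected: positive
print(label_sentiment("coronavirus is a lie"))    # expected: negative
```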
The researchers say Spanish speakers who see negative autocompletes may adopt very different approaches to handwashing and other public health recommendations than English speakers. They say urgent action is needed to counter information bias that may harm the health of users seeking unbiased information.
“The autocomplete function, while convenient, may contribute to bias that has the potential to lead to health inequality experienced by marginalized and racial minority groups by providing different results for similar inquiries,” says Pamela Valera, an assistant professor at the Rutgers School of Public Health and affiliated faculty member at the School of Social Work.
The report is part of a larger National Science Foundation-funded project designed to yield approaches for countering language-based bias in COVID-19-related health information disseminated by search engines, using log analysis and interviews.
Source: Rutgers University