Online stereotypes: Image search cements regional bias against women
Google Images is, for many, the public face of the web: when you want to see what something looks like, you will probably just Google it.
A data-driven investigation by DW that analysed over 20,000 images and websites reveals an inherent bias in the search giant’s algorithms. Image searches for the expressions “Brazilian women,” “Thai women” or “Ukrainian women,” for instance, show results that are more likely to be “racy” than the results that show up when searching for “American women,” according to Google’s own image analysis software.
Similarly, after a search for “German women,” you are likely to see more pictures of politicians and athletes. A search for Dominican or Brazilian women, on the other hand, is met with row after row of young women in swimsuits. This pattern is visible to the naked eye and can be confirmed with a simple search for those terms. Quantifying and analysing the results, however, is trickier.
The very definition of what makes an image provocative is inherently subjective and sensitive to cultural, moral, and social biases. To classify thousands of pictures, DW’s analysis relied on Google’s own Cloud Vision SafeSearch, computer vision software trained to detect images that could contain sexual or otherwise offensive content. Specifically, it was used to tag images that are likely to be “racy.” By Google’s own definition, a picture tagged as such “may include (but is not limited to) skimpy or sheer clothing, or close-ups of body areas.”
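DW has not published its pipeline in this article, but a minimal sketch of a single SafeSearch call, using Google’s official Python client for the Cloud Vision API, might look like the following (the function name and file path are illustrative, not DW’s code):

```python
# pip install google-cloud-vision
from google.cloud import vision


def racy_likelihood(image_path: str) -> str:
    """Return Cloud Vision's 'racy' likelihood label for a local image file."""
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.safe_search_detection(image=image)
    # SafeSearch scores several categories (adult, violence, racy, ...);
    # each is a Likelihood enum ranging from VERY_UNLIKELY to VERY_LIKELY.
    return vision.Likelihood(response.safe_search_annotation.racy).name


print(racy_likelihood("example.jpg"))  # e.g. "LIKELY"
```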
In countries such as the Dominican Republic and Brazil, over 40% of the pictures in the search results are likely to be racy. By comparison, the rate is 4% for American women and 5% for German women. The use of computer vision algorithms such as this is controversial, since this kind of program is subject to as many biases and cultural constraints as a human reader, if not more.
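Those percentages amount to counting, per search term, the share of result images the classifier flags as likely racy. One plausible reading, assumed here rather than taken from DW’s published methodology, is to treat the labels LIKELY or VERY_LIKELY as a positive hit:

```python
RACY_LABELS = {"LIKELY", "VERY_LIKELY"}


def racy_share(labels: list[str]) -> float:
    """Fraction of images whose 'racy' likelihood is LIKELY or VERY_LIKELY."""
    return sum(label in RACY_LABELS for label in labels) / len(labels) if labels else 0.0


# Hypothetical labels, one per downloaded result image for a search term
labels = ["VERY_LIKELY", "LIKELY", "UNLIKELY", "POSSIBLE", "LIKELY"]
print(f"{racy_share(labels):.0%} of results likely racy")  # 60%
```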
According to Sirijit Sunanta, a professor of multicultural studies at Mahidol University in Bangkok, real-world stereotypes also affect the way Thai women are portrayed on the internet. “Thailand is seen as a kind of Disneyland for prostitution and sex tourism. This also carries on into the internet when you do a Google search,” she explains.
“The data that feed the algorithms reflect the perceptions, biases and consumption patterns of a limited sample of humanity,” says Renata Avila, a race and technology fellow at Stanford’s Institute for Human-Centered Artificial Intelligence. “It is no surprise that search engines replicate biases that are not exclusive to the technology, but rather cultural ones. Women of certain nationalities are pigeon-holed into sexual and service roles by a male English-speaking culture,” she adds. Experts tend to agree on one thing: there are no isolated cases here. It is all part of a deeper, more systemic problem.
Avila believes that fairer algorithms are incompatible with the current business model of “big tech” companies, which are mostly concerned with collecting data and increasing information consumption.
A similar position is held by Joana Varon, founder of the think-tank Coding Rights. She says that search engines tend to reproduce the kind of content that is widely available online, and that white men from developed countries have more access to the tools and strategies needed to publish content that drives page views. “If an algorithm is not doing anything to compensate for this, then it’ll be mostly racist, sexist, patriarchal,” she says, adding that “commercial algorithms and their providers should be accountable for what they show, since they are reinforcing an oppressive world view in a search tool that has become universal.”
This article was provided by Deutsche Welle