A Google spokesman said the differences in results were not caused by censorship and that content about the Tiananmen Square massacre is available via Google Search in any language or location. Tourist images gain prominence in some cases, the spokesman said, when the search engine detects an intent to travel, which is more likely for searchers closer to Beijing or searching in Chinese. Searching for Tiananmen Square from Thailand or the United States with Google's Chinese-language setting also surfaces recent, sanitized images of the historic site.
“We localize results to your preferred region and language so you can quickly access the most reliable information,” the spokesman said. Google users can fine-tune their results by adjusting their location and language settings.
The Search Atlas collaborators have also built maps and visualizations showing how search results can differ around the globe. One shows how searching for images of “God” yields bearded Christian imagery in Europe and the Americas, images of Buddha in some Asian countries, and Arabic script for Allah in the Persian Gulf and Northeast Africa. A Google spokesman said the results reflect how its translation service converts the English term “God” into words with more specific meanings in some languages, such as Allah in Arabic.
Other information borders charted by the researchers do not align with national or linguistic lines. The results for “how to fight climate change” tend to divide island nations from countries on continents. In European countries such as Germany, the most common words in Google's results refer to policy measures such as energy conservation and international agreements; for islands such as Mauritius and the Philippines, the results were more likely to cite the enormity and immediacy of the threat of a changing climate, or harms such as rising sea levels.
Search Atlas was presented last month at the academic conference Designing Interactive Systems; its creators are testing a private beta of the service and considering how to broaden access to it.
Search Atlas cannot reveal why different versions of Google portray the world differently. The company's lucrative ranking systems are closely held, and it says little about how it tunes results based on a person's geography, language, or activity.
Whatever the exact reason Google shows, or doesn't show, particular results, those results carry a power that is too easily overlooked, says Search Atlas co-creator Ye. “People ask search engines things they would never ask a person, and the things they happen to see in Google results can change their lives,” Ye says. “It could be ‘How do I get an abortion?,’ restaurants near you, or how you vote, or get a vaccine.”
WIRED's own tests showed how people in neighboring countries can be steered by Google to very different information on a current topic. When WIRED used Search Atlas to search for the ongoing war in Ethiopia's Tigray region, Google's Ethiopia edition pointed to Facebook pages and blogs criticizing Western diplomatic pressure to de-escalate the conflict, suggesting that the United States and others sought to weaken Ethiopia. The results for neighboring Kenya, and the US version of Google, more prominently featured explanatory news coverage from sources such as the BBC and The New York Times.
Ochigame and Ye are not the first to point out that search engines are not neutral actors. Their project was partly inspired by the work of Safiya Noble, co-founder and co-director of the UCLA Center for Critical Internet Inquiry. Her 2018 book Algorithms of Oppression examined how Google searches for terms such as “Black” or “Hispanic” produce results that reflect and reinforce societal biases against certain marginalized groups.
Noble says the project could offer a way to convey the true nature of search engines to a wider audience. “It's very difficult to make visible the ways in which search engines are not democratic,” she says.