Watching Synthetic Messenger is a somewhat dissociative experience. It runs as a Zoom call with 100 participants, all of them bots. Observers can watch these bots — strangely anthropomorphized with images of disembodied hands and voices that say “scroll” and “click” over and over — methodically scroll through news articles on climate change and click every ad on every page.
The project, created by two New York-based artists-cum-engineers, launched earlier this month. In its first week and a half online, its bots visited 2 million climate articles — you can see them listed here — and clicked on 6 million ads.
If this all sounds like a strange, trippy art project, it definitely is. But it’s also a piece of criticism about how narratives about the climate crisis are shaped by the media.
Most online outlets are funded by advertisers. Stories that collect more ad clicks can also rise in Google’s search rankings, drawing more attention to the page. And when certain stories gather more views and engagement, news organizations are more likely to publish similar articles. Absurdly, this means that advertising mechanisms and algorithms can play an outsized role in determining what news people see, rather than other factors such as, um, how important a story actually is.
“With this project, we want to see how media ecology affects our current ecology, how narrative affects our material realm,” said Sam Lavigne, an artist and assistant professor in the Department of Design at the University of Texas.
Of course, conflicting narratives have always played a role in the climate crisis, as Lavigne was quick to point out. Polluters know that it is important to control how people talk and think about the climate crisis, and they spend heavily on all kinds of disinformation campaigns, including efforts to shape narratives in the media.
“The narrative around climate change has been so controlled by the fossil fuel industry and by lobby groups,” Lavigne said.
Algorithms have also distorted how the news — or, increasingly, misinformation — reaches people. YouTube’s video recommendation algorithm, for example, has pushed viewers toward videos full of climate denial. YouTube has also sold ads against these videos, profiting from the misinformation while encouraging viewers to consume more and more of it.
As historically damaging fires spread across Australia a year and a half ago, a narrative emerged that they had been sparked by arsonists, not fueled by the climate crisis. That misinformation, a group of researchers found, was spread in part by bots and online trolls. Conservative media outlets then amplified those claims, creating a feedback loop where everyone argued over lies instead of talking about how to deal with the climate crisis. (A similar scenario played out in the United States last year.) Yet, as Tega Brain, who co-created the project, said, these are not the only ways that algorithms have colored the media landscape.
“All the news, and therefore all public opinion, is shaped [by] algorithms,” said Brain, an assistant professor of digital media at New York University who has a background in environmental engineering. “And the algorithmic systems that shape the news are these black-box algorithms,” she added, referring to technology companies’ practice of hiding their code and priorities from the public.
Synthetic Messenger, then, games that system by showing bot-fueled interest in climate stories. While it could play a small role in amplifying climate coverage, there are some complications. For one, since its article selection is imprecise and based on climate-related keywords, the project also ends up clicking ads on outlets that deny climate change. Its creators have tried to get around that by blacklisting denialist sites, like those owned by Rupert Murdoch, but it is not a perfect system.
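The selection logic described above — match on climate keywords, then skip blacklisted domains — might be sketched roughly like this. This is a hypothetical illustration, not the artists’ actual code; the keyword list, blocklist entries, and function names are all invented for the example:

```python
# Minimal sketch of keyword-based article selection with a domain blocklist.
# All names and data here are hypothetical, not from the real project.
from urllib.parse import urlparse

CLIMATE_KEYWORDS = {"climate", "emissions", "warming", "carbon"}

# Hypothetical blocklist; the real project reportedly blacklists
# denialist outlets such as those owned by Rupert Murdoch.
BLOCKED_DOMAINS = {"example-denial-news.com"}

def should_visit(url: str, headline: str) -> bool:
    """Decide whether a bot should visit (and click ads on) an article."""
    domain = urlparse(url).netloc.lower()
    if domain in BLOCKED_DOMAINS:
        return False  # skip blacklisted outlets
    words = set(headline.lower().split())
    # Crude keyword match: this is exactly why such a filter can still
    # land on climate-denial stories that use the same vocabulary.
    return bool(CLIMATE_KEYWORDS & words)
```

The crude keyword match illustrates the complication the article describes: denial stories use the same vocabulary as legitimate coverage, so keywords alone cannot tell them apart — hence the need for the (imperfect) blocklist.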
If this project were primarily intended as a tool for political organizing, those could be major sticking points. But Brain and Lavigne are clear that they know their project will not change the media landscape or combat the climate crisis by itself.
“We don’t intend it to be read as ‘Here’s this really effective new activist strategy to tackle climate change,’” Brain said. “Essentially, with this project we’re doing what’s called ‘click fraud,’ and if we kept doing it for quite some time and at a fairly large scale, it wouldn’t work, because obviously ad networks do everything they can to protect against automated behavior. They’re going to stop it.”
Rather, the goal is to draw attention to the incentive structures that determine which climate stories get told and amplified by advertisers and search algorithms.
“It’s not like we’re offering this as a solution to this problem that we have. The solution is meaningful, effective climate policy,” Brain said. “But we’re seeking to open a conversation and highlight the way our media landscape operates today.”