Digital Gerrymandering and the Dangerous Influence of the Internet on Politics

It’s undeniable that the Internet and all its trappings have changed the world around us forever. From the rise of search engines that allow us to immediately access vast amounts of information on almost any subject imaginable, to the way social media has come to permeate nearly every aspect of our lives, it is having a profound impact on day-to-day existence.

There have been numerous studies of the ways the Internet revolution has affected our lives, and many have rightly focused on how these phenomena shape our engagement with the political process. What seems more and more apparent, though, is that the ways we communicate with others through social media and the places we get our information from have a huge capacity to influence if and how we vote in elections, often without our even realizing it, and whether those platforms are intentionally doing so or not.

Jonathan Zittrain, in a recent article in the Harvard Law Review, calls attention to this dynamic, which he terms “digital gerrymandering”:

“…the selective presentation of information by an intermediary to meet its agenda rather than to serve its users. It is possible on any service that personalizes what it presents, particularly when there is an abundance of items to offer up and only a few that can be shown at a time. Whether for search results, top tweets, or Facebook updates, each curating service uses a proprietary recipe to draw from boundless ingredients to prepare our feeds, and each is in a position to gerrymander its users, distinguishing between those it thinks will be supportive of its goals and those who will not. None promises neutrality, whatever that might mean, although some — including Google and Bing — clearly distinguish between results driven by the companies’ own formulas versus those displayed as sponsored advertising.”

Perhaps the most heavily discussed example of this, and indeed one that Zittrain raises, is from the 2010 US midterm elections, during which Facebook conducted an ambitious social experiment to see if it could turn out more voters. On election day, it exposed users who were eligible to vote to one of two messages in their news feeds. The first was a post that included a link showing users where their polling place was and a button to indicate whether they had voted. The second had both of those features and also displayed which of the user’s friends had indicated they were voting. A control group was shown neither.

It’s estimated that around 340,000 extra people turned out to vote as a result of the experiment, which was conducted across a group of 61 million people. The results may seem proportionally insignificant at first, but in the close election seasons the US typically sees, they certainly have the potential to make a large impact. While Facebook did not target users based on political affiliation or encourage them to vote for a particular party or candidate, it’s not far-fetched to think that, should Mark Zuckerberg be so inclined, Facebook has the capacity to push voters to the polls based on their political affiliation. He could, for example, target liberal voters with news feed posts designed to get them out to vote, while neglecting to do the same for more conservative individuals.
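To put those numbers in perspective, here is a minimal back-of-the-envelope sketch in Python. The 340,000 and 61 million figures come from the experiment as described above; the half-point margin used for comparison is a hypothetical stand-in for a close race, not a figure from the study.

```python
# Back-of-the-envelope scale of the 2010 Facebook turnout experiment.
extra_voters = 340_000        # estimated additional voters (figure from the article)
exposed_users = 61_000_000    # people included in the experiment (figure from the article)

share = extra_voters / exposed_users
print(f"Extra turnout as a share of exposed users: {share:.2%}")  # roughly 0.56%

# Hypothetical comparison: a race decided by half a percentage point.
# (This margin is illustrative only; it is not taken from the study.)
hypothetical_margin = 0.005
print("Nudge comparable to that margin?", share >= hypothetical_margin)
```

On its own the effect is a fraction of a percent, but if a nudge of that size were applied to only one side of the electorate in a tight race, it could plausibly be decisive.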

This isn’t the only way we could be influenced without even knowing it, though. A recently published study reveals that the order of Google search results, and the ranking of positive or negative stories on screen, can have a huge influence on the way users vote. As of 2012, Google handled 78% of global searches and was used by 1.17 billion people worldwide, and it’s possible those numbers have only increased since.

According to Adam Rogers of Wired, one group in the study was shown positive articles about one candidate first, while the other group saw positive articles about the other candidate. The result was that people were more likely to vote for the candidate they saw positive results for, by around 48 percent. Perhaps surprisingly, the study found that this effect held, and was even stronger, when researchers placed a negative story in the third or fourth spot of the Google search results, ostensibly because it made the results seem more neutral and trustworthy.

Robert Epstein, a psychologist and one of the study’s authors, noted that “We estimate, based on win margins in national elections around the world that Google could determine the outcome of upwards of 25 percent of all national elections.” This is not to say that Google is currently curating its results to encourage certain behavior among its users. That said, as with Facebook, the infrastructure is in place for it to do so if it wanted, and it could be done without users even being aware of what was happening or its effect on them.

Finally, a study from January 2013 examining cross-ideology exposure on Twitter found that “on Twitter, political talk is highly partisan, where users’ clusters are characterized by homogeneous views and are linked to information sources.” This dynamic largely serves to “reinforce in-group and out-group affiliations, as literally, users form separate political groups on Twitter.” What this means in practice is that, by and large, what we see on Twitter reinforces our own worldview and influences how we think.

While this is not considered digital gerrymandering, it is another way that we self-select and insulate ourselves via a digital platform, one that uses an algorithm to deliver the news we consume and subsequently base our decisions on.

All of this is to say that we should be aware of the ways the platforms that now permeate our lives can push us one way or another when it comes to how we engage with politics. Knowing that it can happen is the first step towards lowering the chances that it will.
