Deception & Misdirection
Beyond Cambridge Analytica: How Tech Giants Can Impact Elections
Capitol Hill was buzzing last week when Mark Zuckerberg testified at Congressional hearings about Facebook’s handling of the Cambridge Analytica scandal and the security of the website’s user data. But a larger issue surrounding the Internet and elections begs to be discussed: after browsing search engines and social media, how do we know the political opinions we form are truly…ours?
Dr. Robert Epstein, former editor in chief of Psychology Today and current researcher at the American Institute for Behavioral Research and Technology, has long been mindful of left-of-center Google’s capability to control the flow of information to users through its ranked list of search results. Epstein notes:
That ordered list is so good, in fact, that about 50 percent of our clicks go to the top two items, and more than 90 percent of our clicks go to the 10 items listed on the first page of results…Google decides which of the billions of web pages it is going to include in our search results, and it also decides how to rank them.
Beginning in 2013, Epstein researched the influence that search engine results have on users during political elections. In an initial experiment in San Diego, 102 people were randomly assigned to one of three groups, each of which viewed real search results about Australian political candidates on a fictional search engine. All groups saw the same search results and web pages; the only difference was the order in which the results appeared. One group viewed results ordered to favor one candidate, the second group viewed results ordered to favor the opposing candidate, and a control group viewed a mix of results favoring neither candidate.
Epstein replicated the findings multiple times, including in a 2,000-person experiment in the United States and a 2,150-person experiment in India during the real 2014 Lok Sabha election that determined the country’s prime minister. The findings were striking (a sketch of the experimental design follows the list below):
- Voting preferences of undecided voters can shift by 20 percent or more after viewing biased search rankings.
- Some demographic groups can shift by even higher percentages.
- Search rankings can be “masked” so that people show no awareness of the manipulation.
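To make that design concrete, here is a minimal, purely illustrative simulation of an SEME-style experiment. Everything in it is a hypothetical stand-in: the group labels, the 50/50 baseline preference, and the 10-point nudge are assumptions chosen for illustration, not Epstein’s actual data, code, or effect sizes.

```python
import random

# Purely illustrative simulation of an SEME-style experiment.
# The group labels, baseline preference, and shift size below are
# hypothetical assumptions, not Epstein's actual data or code.

GROUPS = ["favors_candidate_A", "favors_candidate_B", "control"]

def assign_groups(n_subjects):
    """Randomly assign each subject to one of the three groups."""
    return [random.choice(GROUPS) for _ in range(n_subjects)]

def simulate_vote(group, baseline_pref_a=0.5, shift=0.10):
    """Return True if a subject ends up preferring candidate A.

    `shift` models a hypothetical nudge from biased rankings;
    the control group gets no nudge.
    """
    p = baseline_pref_a
    if group == "favors_candidate_A":
        p += shift
    elif group == "favors_candidate_B":
        p -= shift
    return random.random() < p

def run_experiment(n_subjects=102):
    """Tally post-search candidate preference per group."""
    tallies = {g: {"n": 0, "prefers_a": 0} for g in GROUPS}
    for group in assign_groups(n_subjects):
        tallies[group]["n"] += 1
        tallies[group]["prefers_a"] += simulate_vote(group)
    for group, t in tallies.items():
        share = t["prefers_a"] / t["n"] if t["n"] else 0.0
        print(f"{group:20s}  n={t['n']:3d}  prefer A: {share:.0%}")

run_experiment()
```

As in the real experiments, the only variable distinguishing the groups is the ordering of otherwise identical results; any gap between the biased groups and the control group is the measured preference shift.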
Dr. Epstein calls this seemingly invisible grip on search engine users the Search Engine Manipulation Effect (SEME) and explains the outsized influence it can have on voters:
Google has a near-monopoly on internet searches in the US, with 83 percent of Americans specifying Google as the search engine they use most often, according to the Pew Research Center…if Google favors one candidate in an election, its impact on undecided voters could easily decide the election’s outcome.
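To see how that could happen, consider a hedged back-of-envelope calculation. Every input below is a hypothetical assumption chosen for illustration; only the 20 percent shift figure comes from Epstein’s findings above.

```python
# Hypothetical back-of-envelope: how an SEME-style shift among undecided
# voters could swing a close two-candidate race. Every input is an
# illustrative assumption except the 20 percent shift, which comes from
# Epstein's findings cited above.

electorate = 1_000_000   # total votes cast (assumption)
undecided_share = 0.10   # fraction of voters still undecided (assumption)
seme_shift = 0.20        # share of undecideds moved by biased rankings

undecided = electorate * undecided_share   # 100,000 voters
moved = undecided * seme_shift             # 20,000 voters change sides

# Assuming each moved voter would otherwise have backed the opposing
# candidate, one moved vote changes the margin by two.
margin_swing = 2 * moved / electorate

print(f"Voters moved:  {moved:,.0f}")
print(f"Margin swing:  {margin_swing:.1%} of all votes cast")
# Prints 20,000 voters moved and a 4.0% swing, enough to flip any race
# decided by fewer than four percentage points.
```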
Google and some of its affiliates have already faced criticism for impeding web traffic to right-of-center content creators. In October, Prager University filed a lawsuit against Google claiming that YouTube (owned by Google) was engaging “in illegal censorship by limiting access to some of the organization’s content.” Prager University claimed that Google and YouTube employed search filters to “restrict access to right-leaning videos on topics such as the ‘Arab world,’ gun rights, and abortion.” While Google ultimately won dismissal of the suit, the example calls into question the extent to which these Internet juggernauts are influencing, or perhaps censoring, political ideas and opinions.
What about Facebook? It has its own impact. A study published in Nature in 2012 found that sending a voting reminder to 61 million randomly selected Facebook users during the 2010 midterm elections produced roughly 340,000 votes that otherwise would not have been cast. Civic engagement isn’t necessarily a bad thing, but Facebook is commonly viewed as a left-of-center platform and could easily send strategic voting reminders to key demographics, a practice that “might cross the fuzzy ethical line into becoming potentially intrusive or coercive.”
In fact, Facebook already has a history of “coercive” behavior. In a psychological experiment on over 650,000 users, revealed in 2014, it was determined that “Facebook has the ability to make you feel good or bad, just by tweaking what shows up in your news feed.” Couple this emotional manipulation with political advertisements, and Facebook could sell powerful, “emotionally fueled marketing opportunities” perfectly tailored for use in political campaigns, or push an agenda all its own.
In light of the Cambridge Analytica incident, Facebook is supposedly making efforts to expose the impact of social media on elections by forming an “independent” research group to study the issue. CRC’s Michael Hartmann recently raised several questions about this “initiative”: whether commission members would be able to publish dissenting opinions, whether there would be public proceedings, and whether there would be any competing research.
While Cambridge Analytica came under fire for using Facebook user data to build voter profiles, the real sleeping giants here are the left-of-center platforms themselves, which function as informational gatekeepers and, quite possibly, can invisibly shape political opinions and outcomes.