Can Tech Giants Save Elections?

17 Sep


There’s a unique quality of the technology sector that sets it apart from other industries, a push and pull that’s as distinct as it is complex. The tech industry’s specialists, CEOs, innovation gurus, and everyday workers all play key roles in two opposing forces: the perpetration of, and resistance to, the abuse of technological power. 

It’s widely recognized that for every hero there’s a villain. But corporate tech often appears to be both. Large corporations like Facebook and Google both cause and solve the abuses of their technology, making it difficult to cast their roles as wholly beneficial or wholly harmful. One needn’t be so cynical as to imagine these companies are creating problems that they then solve, but it can often seem that way. 

The grand narrative claims that technology will liberate humanity. But in 1985, before a meaningful internet — and when “big data” was more science fiction than fact — the Association of American Colleges declared: “We have become a people unable to comprehend the technology we invent.” What seems even more true is that technology leaders can functionally comprehend their own technology, but have an interest in keeping the rest of us from comprehending it. 

In 2016 (and frequently since), technology and social media giants Facebook, Google, and Twitter allowed themselves to be heavily used by less-than-scrupulous political advertisers, who deployed techniques like microtargeting and fake political ads with remarkable frequency. The object of these disruptive ads has, in many instances, been simply to convince voters not to vote at all, rather than to win them over to a different candidate. Bots deployed to disrupt and steer social media discussions, and fake “followers” deployed to inflate particular politicians’ apparent support, have influenced elections. Because some of the “meddling” was foreign, it implicated various federal laws; what would otherwise have been a problem of disproportionate influence became, in some instances, a law enforcement problem. 

Needless to say, this left Facebook, Google, and Twitter with a lot of egg on their faces, especially in the context of an extremely divided and divisive election. It is no surprise, therefore, that as the country moved into full 2020 election mode, these companies met with federal officials “to discuss how they’re monitoring their platforms for foreign interference and preparing for the Republican and Democratic national conventions.” According to the New York Times, “The group met on Wednesday with representatives from agencies like the F.B.I., the Office of the Director of National Intelligence and the Department of Homeland Security to share insights about disinformation campaigns and emerging deceptive behavior across their services.”

The first thing the companies did after emerging from these talks was to announce efforts to increase voter registration and facilitate voter participation. This seems easy enough for social media companies to do: run ads and place public service announcements in key cyberspaces while also linking to voter registration and education sites. Or, they could follow the lead of data append companies — like Accurate Append — which have been using their resources to better connect campaigns with voters in order to get out the vote. Of course, these proactive “positive” measures are not going to fully compensate for the disruptive and deceptive effects of unscrupulous advertising and bot-based political messaging. In fact, even without microtargeting, the use of data dumps to drop false information, deliberately confusing messages, and misleading media can still undermine democracy. 

Last year, The Atlantic ran an article that foreshadowed the growth of QAnon we’ve seen in 2020. The article described how QAnon operates: anonymous sources release thousands of images and videos that make all kinds of wacky and unsupported arguments; initially, these arguments and their accompanying graphics don’t tell a coherent story, but “clusters” of them emerge that start to shape particular plot lines: “Donald Trump and some loyal followers in the military and government are engaged in a clandestine, existential struggle against an international cabal . . . prominent Democrats were running an international child-sex-trafficking ring out of a pizza parlor in Washington . . . Robert Mueller is actually working with Trump to expose the Democrats; Angela Merkel is the granddaughter of Adolf Hitler; and the Queen of England is part of the cabal.” And so on. What QAnon ultimately facilitates is a perpetual chaotic theater with the entire world as its stage, enabled by social media platforms too overwhelmed and compromised by political pressure to adequately stop or even regulate the flow of false or misleading information. 

The stories affect people differently. They motivate some on the far right (and sometimes people with no coherent political ideology) to action, ranging from nominal political organizing to concretely irrational acts like threatening pizza parlors or committing child custody violations. These acts not only signal-boost QAnon but also make other political extremists seem entitled to political power, or look reasonable by comparison. For many people who came of age amid decades of corrupt politicians and a society failing to deliver on the promises of its social contract, lingering doubt remains about whether the conspiracies might be true. That doubt, in turn, creates a cognitive dissonance under which authoritarian and other simplistic political forces gain influence.  

The Atlantic article points out that “the flow-oriented structure of social media also fosters conspiracism. You can’t tell a coherent story in a 280-character tweet, but you can provide a tantalizing assertion or allude to shared story fragments, especially if you use code words and acronyms (such as QAnon’s WWG1WGA) or iconic images (such as the alt-right’s Pepe the Frog).” The article points out that Trump is “a consummate flow politician” who can share and make suggestive comments about the conspiranoia. 

Facebook, Twitter, and other platforms can push back against fake stories by labeling them as fake, but the review process takes far longer than the falsehoods take to spread, and lets many stories slip through the cracks. Here we find a theme — the proliferation of misinformation — that cuts across all of the reasons why lies and confusing messages overtake the search for truth. It’s fashionable to attribute Americans’ vulnerability to this messaging to a lack of intelligence, as if people had some kind of essential, embedded deficit of cognitive ability. Our experiences with people suggest otherwise: in everyday life, we are typically competent and have a good command of the everyday things around us. We are capable of problem solving and information management in reasonable contexts. 

But this kind of online manipulation takes advantage of our scarcity of time and our lack of access to the resources needed to sort through mass-generated information. At a minimum, these companies could actually educate their users about the ways that political entities set out to manipulate them. What if it were baked into the user experience — a constant “flow” of reminders that the very content on the site should be met with heightened skepticism, along with clear signs that would discredit particular ads, videos, or other shares? What if this happened in addition to removing and flagging deceptive content, and actually taught and invited users to be part of that process? 

This would cut into the companies’ bottom lines by eliminating some advertisers, but the advertisers that stayed would be the honest ones. And critics would be right to say this would not completely eliminate or police deceptive posts. But it would make all of these tech giants much better stewards of the informational collective than they are now.
