Dressage of the democratic will – When computer algorithms determine the outcome of elections

The mechanisms of popular attention and outrage are not always easy to comprehend. Hardly incomprehensible, however, is that the business practices of the firm Cambridge Analytica have provoked strong public reactions. Commentaries in print and online media are rarely as unanimous as in this case: basic rules of democracy have been violated. Less understandable is the timing of this outcry. The activities of the company and its scary and equally brazen boss Alexander Nix (a company on whose board the ultra-right Nazi sympathizer Steve Bannon long sat, and which was predominantly financed by the hedge fund manager Robert Mercer, Trump’s biggest donor) had long been known. Those who wanted to know could easily find out that Nix and his people had used methods of so-called “micro-targeting” in their contributions to both the Trump and the Brexit campaigns. In my book Superpower Science (in German) from August 2017, for example, Cambridge Analytica and micro-targeting are portrayed as prominent examples of how technological change is altering the rules of our society.

But a broad public discourse about these practices is only now beginning to develop. As surprisingly (and discouragingly) late as the reaction of our media, politicians and business leaders to technological change often is, it is encouraging that the full force of public outrage is now devoted to this issue. It was about time: the way we are being manipulated with the help of modern digital technologies, big data and artificial intelligence (AI) requires a broad political and social reaction, one that goes far beyond a mere shrug at the usual lies of big Internet companies like Facebook or Google, namely that their collection of data about us serves only the common good and that our data is safe with them. Politicians are finally reacting, even though it took video footage of salacious details of Cambridge Analytica’s business practices to get them started. It was not the scandalous manipulation of the public will with the help of AI and big data that forced Alexander Nix to resign from his post, but the centuries-old manipulation techniques of sex traps and slander used by his company.

Already some time ago, Nix and his company boasted of having psychological data on approximately “220 million Americans with four to five thousand data points for each one of them” and of having thus influenced 44 political elections in the US in 2014 (work that included a collaboration with Trump’s new right-wing national security adviser John Bolton). And in 2017, by its own account, the firm was deeply involved in the election victory of Uhuru Kenyatta in Kenya. The campaign that preceded his victory was overshadowed by fake news and misinformation like no election before it, fueled by the extensive use of smartphones and social media.

The method of exploiting data for political purposes is by no means new. For centuries, campaigners have tried to determine the personalities, political views and inclinations of voters and to use this information for the benefit of their candidates. But two things are new and will significantly shape the future architecture of political power in democracies: the sheer volume of data and the increasingly intelligent algorithms used to manipulate voters. The fact that Facebook, with all the data it collects about us, plays a massively important role in this development (next to Google, Microsoft, and Apple) has finally made its way into the public discourse.

Only very few people have grasped how thoroughly we are already being screened today with the help of big data and AI algorithms. A particularly rich data source for AI applications that aim to determine our personality traits are the “Likes” on Facebook. According to studies by the psychometrics expert Michal Kosinski within his project “myPersonality”, a suitable AI algorithm can predict from an average of 68 Likes a person’s skin color with 95% accuracy, their sexual orientation with 88% probability, and their political orientation with 85% accuracy. But also things such as drug abuse, religious affiliation, and even intelligence quotient and family circumstances during childhood can be determined. Kosinski claims that with ten Facebook Likes his algorithm can judge a person’s character and behavior better than an average work colleague, with 70 Likes better than a friend, with 150 better than the parents, and with 300 Likes better than his or her spouse. With even more Likes, the machine can even surpass the person’s own self-assessment. Another example of the power of data: in spring 2016, the AI researcher Eric Horvitz of Microsoft Research described how a computer program based solely on publicly available data can help determine which individuals are likely to suffer from certain diseases in the future. Horvitz showed, for example, how intelligent algorithms can detect the onset of a depression from a user’s Twitter and Facebook messages – even before the person himself knows about it.
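To make the mechanics concrete: the published myPersonality studies worked by compressing a huge, sparse user-Like matrix (users as rows, Facebook pages as columns) into a few hundred latent dimensions and feeding those into a regression model, one per trait. The following Python sketch reproduces that kind of pipeline on purely synthetic data; the matrix sizes, the density and the accuracy it reaches are illustrative assumptions, not Kosinski’s actual figures.

```python
# Minimal sketch of Like-based trait prediction in the spirit of the
# myPersonality studies: compress a sparse user-Like matrix with SVD,
# then fit a logistic regression for one binary trait.
# All data below is synthetic; sizes and density are illustrative.
import numpy as np
from scipy.sparse import random as sparse_random
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_users, n_pages = 5_000, 20_000

# Binary user x page matrix: entry (i, j) = 1 if user i liked page j.
likes = sparse_random(n_users, n_pages, density=0.003, random_state=0).tocsr()
likes.data[:] = 1.0

# Synthetic binary trait (say, political orientation) correlated with
# a hidden subset of pages, so there is real signal to recover.
signal_pages = rng.choice(n_pages, 200, replace=False)
score = likes[:, signal_pages].sum(axis=1).A.ravel()
trait = (score + rng.normal(0, 0.5, n_users) > np.median(score)).astype(int)

# Reduce the Likes to 100 latent dimensions, then classify.
components = TruncatedSVD(n_components=100, random_state=0).fit_transform(likes)
X_tr, X_te, y_tr, y_te = train_test_split(components, trait, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("held-out AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
```

The design point is that nothing in this pipeline is exotic: a commodity laptop, open-source libraries and enough Likes are sufficient to recover traits people never stated explicitly.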

At this point, at the latest, everyone should realize that these capabilities of AI and big data carry a hugely dangerous potential. For from knowing our personality it is only a short step to influencing our behavior – in the political context, how we vote. That we can be influenced by algorithms and big data is demonstrated by Facebook’s huge commercial success: if it were not so, Facebook could not earn tens of billions of dollars through personalized advertising. And at the latest since the Brexit vote and the 2016 US presidential election, it has become clear that AI plays an increasingly important role in political disputes and democratic election campaigns. Algorithms like Michal Kosinski’s can also be used the other way round: to search for specific profiles, for example for anxious, frustrated or angry employees, for NRA supporters and right-wing extremists – or for undecided voters. And from there it is not far to manipulating people and influencing their voting behavior. Profiles created with the help of AI can thus be used to send the respective voters personalized messages tailored to their personalities.

In 2016, potential Clinton voters – Latinos and African Americans, skeptical liberals and leftists, young women – were sent massive amounts of “news”, including shameless lies, about the Clinton Foundation’s conflicts of interest or allegedly illegal activities of the Democratic presidential candidate, with the aim of keeping them from voting for Hillary Clinton. For this purpose, computer-generated automated scripts called “bots” published massive amounts of artificially generated content on social media such as Twitter and Facebook, bombarding susceptible users with propaganda posts and outright lies (for example, that the Pope supported Trump or that Hillary Clinton ran a child-pornography ring). There were even “Hispanic bots” that pretended to speak for a majority of Latinos supporting Trump (when it was well known that Latinos were overwhelmingly against him). All of these measures proved successful: turnout among the rural electorate (traditionally Republican voters) rose significantly, while African-American turnout declined. And, unexpectedly, a third of Latinos chose Trump despite his numerous public verbal attacks against that group.
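To illustrate the targeting step described above in the abstract: once a model has scored each user, selecting recipients and matching message variants to predicted personalities takes almost trivially simple code. The sketch below is deliberately schematic; every name, score and threshold in it is hypothetical, not taken from any real campaign system.

```python
# Hedged sketch of micro-targeting: keep users the model predicts to
# be persuadable, then assign each a message variant matched to their
# predicted personality. All names and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class UserScores:
    user_id: str
    p_undecided: float   # model's probability the user is undecided
    p_anxious: float     # predicted anxiety-related personality score

MESSAGES = {
    "anxious": "fear-framed ad variant",
    "neutral": "policy-framed ad variant",
}

def select_targets(users, threshold=0.7):
    """Return (user_id, message) pairs for likely-undecided users."""
    targets = []
    for u in users:
        if u.p_undecided < threshold:
            continue  # not worth an ad impression
        variant = "anxious" if u.p_anxious > 0.5 else "neutral"
        targets.append((u.user_id, MESSAGES[variant]))
    return targets

users = [
    UserScores("u1", 0.91, 0.82),
    UserScores("u2", 0.35, 0.90),
    UserScores("u3", 0.76, 0.12),
]
print(select_targets(users))
# [('u1', 'fear-framed ad variant'), ('u3', 'policy-framed ad variant')]
```

The hard part, in other words, is the profiling; once the profiles exist, the manipulation itself is a filter and a lookup table.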

But all this is only the beginning, says Horvitz. How quickly and, above all, how easily social media algorithms push us toward extreme positions and promote the radicalization of young people is shown by an experiment performed by the French computer scientist Kave Salamatian. He created numerous dummy profiles on Facebook and had student assistants give Likes to various news items and information on harmless topics. After three days, the staff had finished their part of the experiment. From then on, Salamatian had each account automatically accept all friend requests and “like” every suggestion presented to it. The result was as astonishing as it was scary: after another three days, ten of the accounts had direct contact with the terrorist organization IS, and the first recruiters had contacted the account “owners”.
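The feedback loop behind such an experiment can be caricatured in a few lines of code: an account that likes everything it is shown, and a recommender that favors content close to the account’s history while nudging it toward more “engaging”, i.e. more extreme, material. The following toy model is a pure illustration of that dynamic, with no real platform data or API involved; the step size and noise level are arbitrary assumptions.

```python
# Toy model of an engagement-driven radicalization loop. Content is
# reduced to a single "extremeness" score between 0 (harmless) and 1.
# The dummy account likes everything, so each recommendation becomes
# part of its taste profile. Entirely synthetic.
import random

random.seed(0)

def recommend(history, step=0.05, noise=0.1):
    """Suggest an item near the account's current taste, nudged
    slightly toward the extreme end (such items get more engagement)."""
    center = sum(history) / len(history)
    return min(1.0, max(0.0, center + step + random.uniform(-noise, noise)))

history = [0.1]  # start with harmless content
for day in range(30):
    item = recommend(history)
    history.append(item)  # the dummy account likes everything it sees

print(f"start: {history[0]:.2f}  after 30 rounds: {history[-1]:.2f}")
```

Even with a tiny per-step nudge, liking everything ratchets the profile steadily toward the extreme end – which is exactly the dynamic Salamatian’s dummy accounts fell into.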

When it comes to the processing of our data for commercial or political purposes, the scandal around Cambridge Analytica is just the tip of a fast-growing iceberg. Countless other companies have access to our data via Facebook. And the Internet giant Google probably wields even greater power than Facebook: if, for example, the company wanted to stop a certain presidential candidate in the future, it could simply modify its search algorithms and thus provide its users with correspondingly filtered information. With these means of manipulation that the big American Internet companies possess and that unscrupulous firms like Cambridge Analytica exploit, our very democracy is at stake. The digital revolution has long since attacked its immune system and impaired our ability to separate right from wrong, real news from fake news.

Historically, people have always had a hard time dealing critically with new media. We are amusing ourselves to death, prophesied the media critic Neil Postman in the 1980s (thus describing our relationship to television). In hindsight, Postman’s prophecy proved a brilliant anticipation of what we are observing today, albeit one falling short of recognizing the manipulative power of social media. And one can go even further back: in the late 1920s and 1930s, the newly developed radio became an essential propaganda instrument of the Nazis in Germany.

It is dawning on us that artificial intelligence algorithms are massively influencing our everyday experience, our social life and our political processes. But we have by far not yet grasped the severity of these developments. AI experts such as Horvitz are themselves calling for legislation as “an important part of the legal landscape of the future that will help to preserve freedom, privacy and the greater good.”

 
