Transcribed from the 4 February 2017 episode of This is Hell! Radio (Chicago) and printed with permission. Edited for space and readability. Listen to the whole interview.
The more information you know about people, the more you can manipulate them.
Chuck Mertz: All your likes on Facebook, all your use of your smartphone—that’s all being collected and analyzed to manipulate you politically, and that should scare you a lot more than it already does.
Here to explain psychometrics and its impact on democracy, live from Zurich, Switzerland, is mathematician Paul-Olivier Dehaye. Paul was a researcher on the Das Magazin article, which has been translated and posted at VICE: “The Data that Turned the World Upside Down.” Paul’s research focuses on Big Data, technology, science, ethics, collective intelligence, and cognition.
Welcome to This is Hell!, Paul.
Paul-Olivier Dehaye: Hi, thank you.
CM: The story that you were a researcher on begins: “On November 9th around 8:30 a.m., Michal Kosinski woke up in the hotel Sunnehus in Zurich. The 34-year-old researcher had come to give a lecture at the Swiss Federal Institute of Technology (ETH) about the dangers of Big Data and the digital revolution. Kosinski gives regular lectures on this topic all over the world. He is a leading expert in psychometrics, a data-driven sub-branch of psychology. When he turned on the TV that morning, he saw that the bombshell had exploded: contrary to forecasts by all leading statisticians, Donald J. Trump had been elected president of the United States.”
Do we have any idea to what degree Trump’s presidency can be said to be the dangerous result of Big Data? Is that possible to quantify? Because there’s also an article at VICE that counters the argument, though I think it does so in a simplistic way—it seems to base most of its case on the claim that the original article said there would have been no Brexit and no Donald Trump if it weren’t for psychometrics.
I never saw that in your article whatsoever. So is there any way to quantify how much psychometrics had an impact on either the election of Donald Trump or the support for and eventual passing of Brexit?
PD: There probably is, but it’s beyond any reach of people outside the campaign. We don’t have access to the data, so we don’t know. The only certainty we can have is if individuals start asking those companies what data is held about them. Then we can have certainty about the type of profiling they’ve done and the amount of data that they have on each individual, and we can start trying to infer the effects of those tools. But from the outside it is very hard.
CM: The thing that we were hearing prior to the election wasn’t that Donald Trump was using Big Data. What we kept hearing about was that Hillary Clinton had used Big Data in her campaign, and prior to the vote, Democrats (and her campaign) were boasting that even if they lost the popular vote, through their analysis of Big Data they would still win the electoral college.
Of course that’s not how the election ended up. So is the Republican Party just better at using Big Data than the Democratic Party?
PD: It seems that there was a conscious decision with the primaries—which involved so many candidates on the Republican side—to build infrastructure at the RNC that would just be given to whoever was elected as the candidate. There was apparently a crucial meeting between the Trump team and the RNC team where the RNC explained all the work they had done ahead of time. That infrastructure was made available to Trump, and Trump’s campaign took it, and that’s how they caught up, essentially.
CM: The story also quotes Alexander Nix of Cambridge Analytica saying, “Up to now election campaigns have been organized based on demographic concepts.” Nix calls this a “ridiculous idea—the idea that all women should receive the same message because of their gender or all African-Americans because of their race.” The story adds that “what Nix meant is that while other campaigners so far have relied on demographics, Cambridge Analytica was using psychometrics.”
To you, how much does that reliance on demographics reveal the weaknesses of Hillary Clinton’s presidential campaign, especially the digital aspect of it?
PD: Nix is kind of right. The more information you know about people, the more you can manipulate them. He wouldn’t put it this way, but that’s the way I see it. We have found no evidence that Clinton in any way used psychographics to try to infer the psychological traits of voters. And there is ample evidence that on the Republican side this has been done, through this company Cambridge Analytica.
So indeed the targeting was more precise, and either it was individual targeting based on psychological profiling, or it was based on TV shows, for instance—trying to infer the psychological profiles for viewers of TV shows. There are all kinds of different ways you can leverage this information when you put out a message.
CM: Can you give us a brief description of exactly what psychometrics are?
PD: I’m not a psychologist, I’m a mathematician, but from my understanding, psychometrics tries to measure, quantitatively, psychological traits. The primary model that psychologists have—and it’s not uncontroversial—is called the OCEAN model. There are five different dimensions that they can measure: Openness, Conscientiousness, Extroversion, Agreeableness, and Neuroticism. And that can give some information about how people will react to messages, or rather, to the way you present a message. That’s the extra information that Cambridge Analytica provided to the Republican campaigns, to the Cruz campaign and then the Trump campaign.
The precision of the models that are available through academics’ websites is much, much lower than what a big campaign will have managed to get. Because they have literally a billion dollars to spend on different kinds of tools, so the precision will be much higher and will focus on exactly the states that they care about, etcetera.
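As a concrete illustration of what “measuring psychological traits quantitatively” means, here is a minimal sketch of OCEAN-style scoring in Python. The items, keys, and 1–5 scale are invented for illustration; real instruments (for example, the public IPIP item pools) use validated item sets, and the campaign-grade models Dehaye describes are trained on behavioral data rather than questionnaires alone.

```python
# Minimal sketch of OCEAN questionnaire scoring. Item keys and the
# 1-5 Likert scale are invented placeholders, not a real instrument.

TRAITS = ["openness", "conscientiousness", "extroversion",
          "agreeableness", "neuroticism"]

# (trait, reverse_scored) for each questionnaire item, in order.
ITEM_KEY = [
    ("openness", False), ("openness", True),
    ("conscientiousness", False), ("conscientiousness", True),
    ("extroversion", False), ("extroversion", True),
    ("agreeableness", False), ("agreeableness", True),
    ("neuroticism", False), ("neuroticism", True),
]

def score_ocean(responses):
    """Average 1-5 answers into a 0-1 score per trait."""
    totals = {t: [] for t in TRAITS}
    for (trait, reverse), answer in zip(ITEM_KEY, responses):
        value = 6 - answer if reverse else answer  # flip reverse-keyed items
        totals[trait].append(value)
    # Rescale each trait's 1-5 mean onto 0-1 for easier comparison.
    return {t: (sum(v) / len(v) - 1) / 4 for t, v in totals.items()}

print(score_ocean([4, 2, 5, 1, 3, 3, 4, 2, 2, 4]))
```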
I hope that people realize there is a collective dimension to data protection. It’s kind of like herd immunity for vaccines.
CM: The story states, “On election day, a then-little known British company based in London sent out a press release. The statement quoted Alexander James Ashburner Nix, the CEO of Cambridge Analytica, saying, ‘We are thrilled that our revolutionary approach to data-driven communication has played such an integral part in president-elect Trump’s extraordinary win.’”
The story you worked on goes on to say that Cambridge Analytica wasn’t just integral to Trump’s online campaign, but to the UK’s Brexit campaign as well. So is Cambridge Analytica apolitical? Can anyone with any political agenda get Cambridge Analytica to work for them? Or does Cambridge Analytica seem to only work with those on the right?
PD: Well, they have worked all over the world. Or rather, their parent company SCL has worked all over the world. Definitely the parent company, the people on the board and so on, have political affiliations. Some of them have military backgrounds as well. And there have been statements from Cambridge Analytica about the US campaign that they only affiliate with right-wing candidates. They implied that this was necessary given the way the market structures itself: there are Republican outfits and then there are Democratic ones.
I think right now they are really happy with their success in the US, and even with our story, and so they are trying to get clients all over the world. But I don’t know.
CM: Can we blame not only Facebook as a contributing factor to the rise of Trump and Brexit (to whatever extent that is) but ourselves for participating on Facebook, revealing to marketers our personal tastes and being unprotective of our own privacy? Is this all, in the end, not the fault of some clever organization that is manipulating us, but of our unwillingness to protect our own privacy in an age of social media?
PD: I think you’re putting a lot of blame on the individual there. It’s far from clear to the average person what can be done with their data, and these companies are not forthcoming when you ask them what they do. And even if as an individual you ask, “What are you doing with my data?” they won’t tell you.
Sure, as a society we haven’t so far understood the importance of data protection and privacy, and I hope that, with this article and with similar events, people progressively realize that there is a collective dimension to data protection. It’s kind of like herd immunity for vaccines. You have to think about the collective as well.
CM: The story also states that “Remarkably reliable deductions could be drawn from simple online actions. For example, men who like the cosmetics brand MAC were slightly more likely to be gay. One of the best indicators for heterosexuality was liking Wu-Tang Clan. Followers of Lady Gaga were most probably extroverts, while those who liked philosophy tended to be introverts. While each piece of such information is too weak to produce a reliable prediction, when tens, hundreds, or thousands of individual data points are combined, the resulting predictions become really accurate.”
So how much is this surrendering of personal information driven by the idea that the information that the person is surrendering is inconsequential or innocuous? And is that intentional? Is that part of the way that psychometrics has success, because people don’t think that saying they like Lady Gaga is going to have some impact on how they might be manipulated politically?
PD: There is some of that. For instance, if you were to apply for a job, and were to take a psychological test, you would know that it is a psychological test, and you could game it. You could infer things from the questions, and they would have to ask you hundreds of questions to bury the ones that are most informative, so you wouldn’t know how to react. But once it’s trained on leftover data or data that has been collected elsewhere, it’s a lot more insidious because you don’t really know what inference can be drawn from what. In a way it could be more accurate through this method than just by doing it as a straight psychological test.
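The quoted passage explains why many weak signals add up. A minimal sketch of that aggregation, naive-Bayes style: each “like” is a weak piece of evidence whose log-odds are summed. The likes and conditional probabilities below are invented placeholders that loosely echo the story’s examples, not real model coefficients.

```python
# Why weak signals combine into strong predictions: sum log-odds over
# many binary "likes" (naive Bayes). All numbers here are invented.
import math

# P(like | extrovert) vs. P(like | introvert) -- each is weak alone.
SIGNALS = {
    "likes_lady_gaga":  (0.30, 0.22),
    "likes_philosophy": (0.10, 0.14),
    "likes_festivals":  (0.25, 0.18),
}

def p_extrovert(prior, observed_likes):
    """Accumulate evidence from every signal, present or absent."""
    score = math.log(prior / (1 - prior))
    for like, (p_yes, p_no) in SIGNALS.items():
        if like in observed_likes:
            score += math.log(p_yes / p_no)
        else:
            score += math.log((1 - p_yes) / (1 - p_no))
    return 1 / (1 + math.exp(-score))  # back to a probability

print(p_extrovert(0.5, {"likes_lady_gaga", "likes_festivals"}))
```

With three signals the estimate barely moves off the prior; with hundreds or thousands of them, as the story describes, the same summation can become very confident.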
CM: Have you taken these tests? How accurate do you find the results to be? Did you find out anything about yourself?
PD: In the story we explain how data and models and technology and ideas flowed from Cambridge University to Cambridge Analytica. And Cambridge University was conducting research a few years ago and asking people to connect to their Facebook profile and fill out explicit psychological tests, to train their model. So I did that then, with the idea that I was helping science and I was helping understand humans better, etcetera. I wasn’t expecting at the time that this was going to a commercial outfit—and even a political one.
So I would expect the test to be quite accurate on me, because I have helped train those models. But since then I have deleted my Facebook “likes” and things like that.
CM: The story finds that “a smartphone, Kosinski concluded, is a vast psychological questionnaire that we are constantly filling out, both consciously and unconsciously.” We had Mara Einstein, author of Black Ops Advertising, on our show, and she said a smartphone is essentially a system to push advertising on you whether you know it or not. Is the smartphone the more revealing technology, even more than Facebook algorithms, to who we are?
When you completely customize what every individual sees, and the tools are built to measure what is efficient and to reinforce what is efficient, then in the end everyone sees a different facet of the politician. That turned out to work very, very well.
PD: It’s very revealing. You can infer psychological profiles from the metadata of phone calls alone: the number of calls, the times of the calls, who calls whom and when, and the length of the calls. You can already build psychological profiles from that; you don’t need to be on Facebook for that. There are many different ways. Obviously cell phones are now central to our lives. We carry them all the time, we use them all the time. You can also infer profiles from the apps you have installed and the usage you make of them.
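A minimal sketch of the kind of feature vector that could be built from call metadata alone, with no message content. The record layout and feature names are invented for illustration:

```python
# Toy call-metadata records: (caller, callee, hour_of_day, duration_s).
# Features like these could feed a profiling model; all names invented.
from statistics import mean

CALLS = [
    ("me", "alice", 9, 120), ("me", "bob", 22, 640),
    ("alice", "me", 13, 300), ("me", "alice", 23, 45),
]

def metadata_features(calls, user="me"):
    outgoing = [c for c in calls if c[0] == user]
    contacts = {c[1] for c in outgoing} | {c[0] for c in calls if c[1] == user}
    return {
        "calls_made": len(outgoing),
        "calls_received": len(calls) - len(outgoing),
        "mean_duration_s": mean(d for *_, d in calls),
        "late_night_ratio": sum(1 for _, _, h, _ in calls if h >= 22) / len(calls),
        "distinct_contacts": len(contacts),
    }

print(metadata_features(CALLS))
```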
CM: Do you think that if people were fully aware—as I hope they become from the writing you helped research—that they would take steps to protect their privacy, that they might stop using Facebook, that they might get rid of their smartphone?
PD: No, and I wouldn’t want them to. It is sort of reneging on modern life if you do that. What I would want them to do is to demand better laws to protect them, and demand better tools to protect themselves. It’s completely possible to use exactly the same kind of services but with much more protection baked in. You might have to pay a little bit, because the business model might have to change, but it’s entirely possible to live the same life, just more protected.
CM: The story says, “To Michal Kosinski, the internet had always seemed like a gift from heaven. What he really wanted was to give something back, to share. Data can be copied, so why shouldn’t everyone benefit from it? It was the spirit of a whole generation, the beginning of a new era that transcended the limitations of the physical world. But what would happen, wondered Kosinski, if someone abused his search engine to manipulate people? He began to add warnings to most of his scientific work. His approach, he warned, could ‘pose a threat to an individual’s well-being, freedom, or even life.’”
I know that you can’t get inside of Kosinski’s head, but why did he believe it posed a threat to life?
PD: First about freedom: when that much data is collected about you, you progressively lose your own agency, because you give information to someone else, and then that someone else can build some intelligence based on that data, and progressively you delegate your own intelligence to that someone else. And I think life might be an exaggeration, but the way you live your life—you can certainly lose that. It can be directed by others. If you look at the big platforms, they suddenly change terms of service, and they are able to reorganize the way that people communicate very, very quickly, and very unilaterally.
So I do think that it’s a serious threat to freedom. And maybe his own background, his own origin made Kosinski exaggerate a little bit, I don’t know…but he sees it as a threat to life, and I find it hard to prove the contrary, let’s say.
CM: Do you believe, then, that Trump’s win and Brexit were both far more predictable and far less of a surprise considering their use of psychometrics?
PD: I can’t really say. The result is the result, and they used those tools; that’s the most important thing. With a result as close as the presidential election, any one factor could have clinched it either way. If you subtract one percent from Trump’s tally, he’s not elected president.
CM: The story quotes Cambridge Analytica CEO Alexander Nix saying, “Pretty much every message Trump put out was data-driven.” And the story states that “Trump’s striking inconsistencies, his much-criticized fickleness and the resulting array of contradictory messages suddenly turned out to be his greatest asset: a different message for every voter.”
How much does the use of psychometrics, then, explain Trump’s tweets? Trump’s inconsistency when it comes to his media message? How much does it explain his inconsistent media strategy?
PD: I don’t think it explains it. He is just the way he is. He behaves that way on Twitter because he wants to. But it turns out that it’s a perfect match with those tools. When you completely customize what every individual sees, and the tools are built to measure what is efficient and to reinforce what is efficient, then in the end everyone sees a different facet of the politician. That turned out to work very, very well.
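Mechanically, the “measure what is efficient and reinforce it” loop is a bandit-style optimization. A minimal sketch with hypothetical message variants; this is a generic illustration of the technique, not Cambridge Analytica’s actual system:

```python
# Epsilon-greedy selection among ad-message variants: mostly serve the
# best click-through rate seen so far, occasionally explore. Variant
# names and the click simulation are invented placeholders.
import random

MESSAGES = ["variant_a", "variant_b", "variant_c"]

def make_selector(epsilon=0.1):
    shown = {m: 0 for m in MESSAGES}
    clicked = {m: 0 for m in MESSAGES}

    def choose():
        if random.random() < epsilon:  # explore occasionally
            return random.choice(MESSAGES)
        # Exploit: unshown variants get an optimistic 1.0 so they get tried.
        return max(MESSAGES, key=lambda m: clicked[m] / shown[m] if shown[m] else 1.0)

    def record(message, was_clicked):
        shown[message] += 1
        clicked[message] += int(was_clicked)

    return choose, record

choose, record = make_selector()
for _ in range(1000):
    msg = choose()
    record(msg, random.random() < 0.05)  # simulate a 5% click rate
```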
CM: Is it possible, in your opinion, to close Pandora’s Box? That is, can we put the cat back in the bag? Sorry about mixing all these metaphors. Can we go back to a time when we didn’t share so much information? Is it possible to regain our privacy or do you think it’s too late?
PD: I think it’s possible to regain control over the data we share, yes. There are new regulations coming in, in Europe, which will have extraterritorial effect. They are extremely strong, so it will depend on how willing the enforcement agencies are to apply them. But they will have significant effects within Europe, and eventually outside. So I am quite hopeful, actually.
CM: Should I just quit Facebook? As of right now, is the only way to avoid being vulnerable to those using psychometrics to manipulate me politically to quit Facebook and smash my smartphone?
PD: Awareness of the tools already lowers their efficiency drastically. For every click you make, if you think a tiny millisecond more about the fact that there is an intermediary that stands to profit from your interaction, that already changes your attitude, I am convinced. This is just a conviction.
But that’s the most important thing: that you are aware, that you don’t just take the tools for granted as free and benevolent. They are not. There is an intermediary, and there are other people who are observing your interactions and trying to profit from them.
CM: Let’s put Trump and Brexit aside. Is this story of psychometrics and the potential that we can be manipulated somehow through social media—is there a bigger issue that we should be concerned about more than Trump, more than Brexit? Is psychometrics more of a threat to us in ways beyond politics?
PD: Definitely. It has the potential of pitting us against one another. Trump is a common enemy to a lot of people. But if you can get inside people’s heads, you can foment conflict all over the place, including between wife and husband, between different groups and communities. It can be much more granular. And that’s really dangerous.
CM: Wow. That’s a frightening answer. Thanks Paul, I really appreciate you being on the show, and congratulations on the fantastic work.
PD: Thank you very much.
Featured image: artwork by Tomás Saraceno