Okay, I am now officially worried. Really worried.
A few days ago, The Guardian reported on a recent conference of internet hackers, held in Las Vegas. (Yes, even hackers evidently have conferences….)
Using “psychographic” profiles of individual voters generated from publicly stated interests really does work, according to new research presented at the Def Con hacking conference in Las Vegas, Nevada.
The controversial practice allows groups to hone their messages to match the personality types of their targets during political campaigning, and is being used by firms including Cambridge Analytica and AggregateIQ to better target voters with political advertising, the so-called “dark ads”.
Most of us don’t consider ourselves targets for “dark ads,” a.k.a. propaganda. We like to believe that we are different, that we’re thoughtful consumers of information, people who can “smell a rat” or otherwise detect spin and disinformation. We shake our heads over reports like the one about the gullible 28-year-old who shot up a Washington, D.C. pizza parlor because stories on social media and conservative websites had convinced him that Hillary Clinton was operating a Satanic child sex ring out of its (nonexistent) basement.
News flash: we are all more gullible than we like to believe. Confirmation bias is built into human DNA.
Psychographic profiling classifies people into personality types using data from social networks such as Facebook. The research, presented by Chris Sumner of the Online Privacy Foundation, focused on replicating some of the key findings of psychographic research by crafting adverts specifically targeted at certain personality types. Using publicly available data to ensure that the adverts were seen by the right people at the right time, Sumner tested how effective such targeting can be.
The referenced study used information that Facebook already generates about those who use its platform, and created two groups: one composed of “high-authoritarian” conservatives, the other of “low-authoritarian” liberals.
Knowing the psychographic profiles of the two groups is more useful than simply being able to guess accurately what positions they already hold; it can also be used to craft messages aimed specifically at those groups, in order to shift their opinions more effectively. Sumner created four such adverts, two designed to increase support for internet surveillance and two designed to decrease it, each targeted at either the high- or low-authoritarian group.
For example, the high-authoritarian group’s anti-surveillance advert used the slogan “They fought for your freedom. Don’t give it away!”, over an image of the D-Day landings, while the low-authoritarian group’s pro-surveillance message was “Crime doesn’t stop where the internet starts: say YES to state surveillance”.
Sure enough, the targeted adverts performed markedly better. The high-authoritarian group was significantly more likely to share a promoted post aimed at them than a similar one aimed at their opposites, while the low-authoritarian group rated the advert aimed at them as considerably more persuasive than the one that wasn’t.
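For readers who want to see the mechanics laid bare, here is a minimal, purely illustrative sketch (in Python) of how ad variants might be keyed to psychographic groups and how engagement with matched versus mismatched ads could be compared. The trait scores, the 0.5 cut-off, and the sharing behavior below are all invented for illustration; only the two slogans come from the article, and nothing here reproduces Sumner’s actual methodology or data.

```python
# Illustrative-only sketch of psychographic ad targeting.
# Trait scores, the threshold, and sharing behavior are invented;
# only the two slogans are quoted from the article above.

from dataclasses import dataclass


@dataclass
class Voter:
    uid: str
    authoritarianism: float   # hypothetical 0-1 trait score inferred from profile data
    shared_matched: bool      # shared the ad crafted for their own group?
    shared_mismatched: bool   # shared the ad crafted for the other group?


# Hypothetical panel of profiled users.
panel = [
    Voter("v1", 0.88, True, False),
    Voter("v2", 0.79, True, False),
    Voter("v3", 0.91, False, False),
    Voter("v4", 0.12, True, False),
    Voter("v5", 0.25, True, True),
    Voter("v6", 0.18, False, False),
]

THRESHOLD = 0.5  # arbitrary cut-off between "high" and "low" authoritarian


def group_of(v: Voter) -> str:
    return "high" if v.authoritarianism >= THRESHOLD else "low"


# The two example adverts quoted above, keyed by the group they were aimed at.
ads = {
    "high": "They fought for your freedom. Don't give it away!",   # anti-surveillance
    "low": "Crime doesn't stop where the internet starts: "
           "say YES to state surveillance",                        # pro-surveillance
}

# Compare share rates for the ad aimed at each group vs. the ad aimed at the other group.
for grp in ("high", "low"):
    members = [v for v in panel if group_of(v) == grp]
    matched_rate = sum(v.shared_matched for v in members) / len(members)
    mismatched_rate = sum(v.shared_mismatched for v in members) / len(members)
    print(f"{grp}-authoritarian group (n={len(members)})")
    print(f"  example ad aimed at this group:   {ads[grp]!r}")
    print(f"  shared the ad aimed at them:      {matched_rate:.0%}")
    print(f"  shared the ad aimed at the other: {mismatched_rate:.0%}")
```

The point of the toy comparison is simply that once users have been sorted into trait buckets, serving each bucket its own tailored message, and measuring the difference, requires nothing more exotic than a lookup table and a few counters.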
Think about the implications of this. Political campaigns can now target different messages to different groups far more efficiently and effectively than they could when the only mechanisms available were direct-mail campaigns or the placement of television ads. As the article noted, this technology allows politicians to appeal to the worst side of voters in an almost undiscoverable manner.
The importance of motivating and turning out your base is a “given” in electoral politics, and these new tools are undoubtedly already in use, further eroding the democratic ideal in which votes are cast after citizens weigh information provided through public policy debates conducted by honorable candidates using verifiable facts.
Thanks to gerrymandering, most of us don’t have genuine choices for Congress or our state legislatures on election day. Now, thanks to technology, we won’t be able to tell the difference between facts and “alternative facts.”