Photo by Chris Jackson/Getty Images

If they haven’t signed up to the Facebook transparency tool, don’t vote for them

Some New Zealand parties have signed up. Others are still deciding. But if they don’t, should you trust them at all, asks Cate Owen.

You shouldn’t just care that political parties are buying digital ads, you should treat any party that won’t sign up to Facebook’s political ad transparency measures with suspicion.

Why? Because digital ad segmentation is a powerful way to target and manipulate someone, in a “these specific people fit this psychographic profile, so let’s hit their particular hot button until they cave in” kind of way.

Before you scoff, this has already happened across the world. Fear-based campaigns using hyper-targeted social media ads have been run in Australia, the USA, the UK, the Philippines, India, Brazil, Mexico – and those are just the ones we know about thanks to a whistleblower at Cambridge Analytica, the company that did this in at least 68 countries, most notably with Trump and the Leave.EU campaign.

Let me break it down for you: Facebook almost certainly knows more about you than you give it credit for. Much more. It probably knows your hopes and fears, your daily routine, what your guilty pleasures are, the websites you pore over – and that’s just for starters.

It got this data partly because you handed it over, and partly because you use the internet. You uploaded a profile photo to Instagram, allowed location services, gave it your email address. Facebook sees your private Messenger chats and the metadata around your WhatsApp conversations, most of the social media content you engage with, and thanks to tracking tools such as the Facebook Pixel and the Like button, it follows you as you move around the internet. It tracks the news articles you read (including the ones about politics, crime, immigration, and health!) and adjusts its algorithm based on your interests. It knows some of the most intimate aspects of your life – how you spend your money online, the apps you use most often, even the porn you look at.

An advertiser can target you based on parts of this data. Facebook allows marketers to segment the audience into, for example, new mums who live within 5km of a prison, single men who spend a lot of money on shoes, 60-year-olds who are good friends with someone who is interested in Ramadan. No, really.

There are also many ways you can segment voters that I haven’t mentioned – by phone number, email address, lookalike audiences – but that’s probably enough information for you to get the picture: Facebook puts extremely detailed targeting right at the advertiser’s fingertips, and a clever marketer can fine-tune it to within an inch of its life.

This is really going to matter in our election. It means a political party can pick at your scabs, fostering fear, anger, disgust, or voter apathy; whatever they need you to feel.

It may not be obvious – you may think you’re just looking at a regular post. Frustrated that you can’t buy a house in Wellington? Here’s a news article about skyrocketing prices in your suburb. Scared about getting sick? Here’s something about crappy hospital wait times, featuring someone who looks like you. Think millennials whinge too much about climate change? Oh look, a hilarious meme about how they drive everywhere!

“Oh, that’s not a big deal,” I hear some of you say. You’re wrong. It matters because it’s so tailored, so shadowy – a threat to a fair and open democracy.

It seems Facebook agrees – at least, partially.

Since 2018, it has been working on providing greater transparency around ads. There’s an Ad Library you can see for any Facebook page (here they are for Labour, National, Greens, NZ First, and ACT), and now there’s a specific Ad Library for politics – publishing not just which ads have been placed, but who is targeted and how much is being spent.

In New Zealand it’s voluntary to join, but ACT and the Greens have signed up, and Labour have announced they intend to.

While many welcome the clarity, we must keep in mind that it won’t prevent misinformation. Facebook CEO Mark Zuckerberg has made it very clear that his company won’t be held responsible for fact-checking political ads.

We should care about this. We need to hold political parties, their surrogates and sockpuppets to account for what they publish – both as ads and in the editorial space. We need social networks to work with our standards authorities to ensure misleading information doesn’t get out, and especially not under the cover of darkness, never to be seen except by a carefully selected group.

If our political parties don’t agree to Facebook’s ad transparency measures – the most basic level of accountability in a world full of examples of why it’s so necessary – can we trust them to run a fair campaign at all?



