The worst terror attack in New Zealand history played out live on Facebook. Jen Vermeulen remembers what it was like to monitor the prime minister’s social media pages at the time.
My phone serves me a memory every March.
It’s a photo a friend took of me, sitting at my desk on the eighth floor of the Beehive. The Dominion Post is spread open in front of me, my computer screens are black, and I’m looking off to the left, talking to a colleague. I was doing this because both Facebook and Instagram were inexplicably down and, as the prime minister’s social media manager at the time, this suddenly freed up a lot of my day. Besides, two days earlier we’d delivered a hectic, last-minute livestream event with the leaders of the school strike for climate movement – this brief moment of respite felt earned.
Among the thousands of my photos saved to the cloud, this one wouldn’t have been significant, if not for what happened next. I never need to look up the date it was taken; it’s engraved in me. It was March 14, 2019.
I remember saying at the time that I hoped the outage was permanent, that maybe some disgruntled insider had taken down all their servers. Afterwards, I would wonder if it would have changed anything, had Facebook disappeared overnight and taken its livestreaming function with it.
I joked that I’d be out of a job, but at least the world would be a better place. The toxicity of social media had already poisoned me, the unending stream of abuse, the death threats, the rape threats, the racism… I’d been “doing social” for five years by that point, and I was adept at scrolling past the hate. You had to be, if you wanted to keep doing the job. But numbing yourself costs you something, and that was the bit that haunted me in the aftermath.
Here’s the thing no one tells you about social media management – it’s a thankless, soul-crushing job. You are somehow both the most important and the least important person in any given team, and that can change on a dime.
If you listen to the higher-ups, there isn’t a problem that can’t be fixed by magically “going viral” on Facebook (the strategic equivalent of Step 1: Facebook. Step 2: ?? Step 3: Profit). And the person responsible for that is always one highly stressed, usually underpaid, staffer who works long hours because social media doesn’t adhere to a 9-5 schedule.
That, on its own, would be enough to lead to burnout, but now add in the content you’re dealing with. Anyone who’s spent time online will be familiar with the phrase “don’t read the comments”, but when you’re a social media manager, you don’t have a choice. Even after official content moderation and filters have been applied, you’re still faced with some of the worst things people can say, and the only way you can deal with that is to become numb to it. Another day, another death threat. Ho hum.
Until someone very much like the angry people in the prime minister’s comments livestreams a terrorist attack on Facebook and you have to live with the hollow feeling in your gut that you missed something. You missed something. Because you are one person reading through hundreds of comments, and you’re so very tired, and so desensitised to internet rhetoric.
To me, as someone who spent my day in the weeds of horrible political discourse, who’d waded through hundreds of comments about the UN Migration Compact and government conspiracies even in 2019, social media felt intrinsically responsible for what had happened on March 15. And I, by extension, felt deeply complicit.
The only thing that kept me going was the belief that now, finally, something would have to change. Social media platforms would have to take accountability. In retrospect, it was a naive hope. If that kind of thinking worked, the US wouldn’t still be dealing with mass shootings every other day.
In the immediate aftermath of March 15, 2019, things did feel different. People seemed reluctant to be on Facebook at all, and there were questions about the government’s own use of livestreaming on the platform. That’s the paradox with social media, though: it’s where the people are.
Meanwhile, thousands of comments and messages were pouring into the prime minister’s pages from all over the world. To me, each one felt like a heart attack, a potential threat to be assessed. I couldn’t do it by myself, and I’m forever grateful to the other members of the communications team who waded in to help moderate seven days a week.
It felt like we were on the cusp of real change, but two months later, people were already back to their old social media habits. The conversation moved on. But I couldn’t. I remained obsessed with monitoring all the channels, flicking between all the different accounts, over and over, afraid to switch off. I lost all context. How was I meant to reconcile using social as a communications tool, when it had just been used in this truly heinous way?
In the wake of the March 15 terrorist attacks, the government launched the Christchurch Call in a bid to bring various organisations together to eliminate terrorist and violent extremist content online.
In 2020, Facebook settled a $52m lawsuit with its American content moderators who alleged they had developed mental health issues, like PTSD, while doing their jobs reviewing some of the most horrible content imaginable. Other cases were soon filed by moderators in other countries.
After the US Capitol riots on January 6, 2021, much was made of the role of social media in assembling the mob and fuelling the violence. Yet very little of the evidence and findings collected by the congressional committee investigating the riot made its way into its final report.
Closer to home, social media played a significant role in the 23-day parliament protest in 2022, especially in amplifying disinformation and creating an alternate reality so divorced from what was actually happening, it felt farcical.
There continues to be so much discourse about Meta’s failure to filter divisive content, its destabilisation of elections, its inability to regulate hate speech. Disinformation continues to thrive on X/Twitter thanks to paid-for-Blue-tick users – a 2023 EU report found it to have the highest ratio of disinformation posts of all large social media platforms. Meanwhile, on TikTok, a mainly young audience takes everything they see at face value and, just like on YouTube, a mindless algorithm funnels them further down an unverified rabbit hole they never meant to go down. Add AI into the mix and any sense of truth and reality starts fracturing.
So, I look at that photo taken on March 14, 2019, when I was hoping it was the end of Facebook, and I sit in the ache of knowing that tragedy is right around the corner, and that five years later, somehow, nothing’s changed. If anything, it’s worse. And it will probably keep getting worse. And all the while we lose things. Attention spans. Art. Media organisations. Democracy.
After all, what’s the solution? More algorithms, more AI to decide what is true and what isn’t real? Or is it people, exploited and undervalued, yet still expected to decide what is and isn’t fit for purpose? Maybe it’s doing away with social media platforms altogether, and accepting that our fragile minds were never meant to perceive all the world’s problems through a rectangle in our back pockets.
Politicians and regulators don’t understand social media well enough to regulate it. Those who build it are too wrapped up in their own hubris to take accountability for their creations. As for us social media workers? Well, we can see the room is on fire, but we’re too burnt out and entangled in late stage capitalism to come up with any ideas.
Maybe the poets were right after all – this is the way the world ends, not with a bang, but with a very tired whimper shared as content.