Since 2019, extremist groups have become increasingly sophisticated in exploiting online platforms, and New Zealand lags behind in regulation.
How long is five years? It’s enough time for a newborn to grow into a school-aged child, learning language and movement in those intervening years. It’s enough time for a sapling to grow a good canopy, crown and shade. And yet. Not enough time to recover, to forget, to carry on as if nothing had happened. We cannot get over it.
The fifth anniversary of the Christchurch mosques attack is a time to remember, a time to take stock, a time to keep doing the mahi to protect our future.
That day, a 28-year-old man walked calmly into two mosques and cold-bloodedly murdered worshippers. What would cause a man to hate a group of people so much? It was Nelson Mandela who reminded us that we are not born to hate people because of the colour of their skin, their background or their religion. This is something we learn.
This terrorist learned to be racist from a young age, no doubt influenced by his environment. According to the Royal Commission report, he also became more racist as a result of his extensive travels after the death of his father from cancer.
This attack was the first in which technology played a specific role through the livestreaming of the event and the dissemination of that video. Technology also played another role that was not new: online radicalisation had been growing through the 2010s, and we know the killer was active in a number of online spaces.
He was heavily involved with online gaming from a young age, and we know that “extremists are exploiting online gaming and gaming-adjacent sites to promote hatred and violence”. We know he participated, under the handle Barry-Harry-Tarry, in a right-wing Facebook group that was later banned. He donated to Stefan Molyneux’s YouTube channel, and credited YouTube with influencing his thinking.
There is some missing information. The terms of reference of the Royal Commission of Inquiry into the terrorist attack on Christchurch masjidain precluded investigations into organisations outside the public sector. We don’t know who interacted with him in the online gaming space, nor what YouTube’s recommender algorithms were serving him. There is no inquiry or process that will investigate these matters, yet they require investigation, both in the case of this killer and in those of subsequent mass murderers overseas.
The online environment has changed in many ways over the last five years. We saw the impacts of QAnon and conspiracy theories, and the rise of disinformation online. There is growing state activity focused on disruption, for example Russian disinformation campaigns and the suppression of information. Russia is far from alone in running troll farms: India is particularly active, as are China, Brazil and the United States, among others.
Extremist groups are also increasingly sophisticated in exploiting online platforms, testing the boundaries of moderation policies and using a variety of techniques.
On the other hand, we are seeing over-censorship. Internet shutdowns have become increasingly common, with India having the highest number. States have been geo-blocking websites and platforms for years, and are also blocking particular accounts related to journalists and activists. Added to the mix is generative AI, which blurs even further the line between reality and manufactured content.
Platforms’ own censorship decisions sit on top of this. While platforms often cite free speech as a reason for failing to apply their own moderation policies to white supremacist and transphobic content, they have heavily moderated pro-Palestinian voices. Journalist David Farrier has written about his experience of being shadow-banned since the start of the latest Gaza conflict.
Our government’s response since the attacks has been varied. The Christchurch Call to Action will mark its fifth anniversary in May. As a multi-stakeholder forum, it had a lot of promise. It has had some success in preventing the livestreaming of attacks through the Global Internet Forum to Counter Terrorism, but the reduction in the size of major platforms’ trust and safety teams over the last year has been of significant concern. Support from the current government is not assured.
There have been changes to domestic legislation, such as amendments to the Films, Videos, and Publications Classification Act and to counter-terrorism legislation. These have given the government greater powers, but it is difficult to know whether they have been effective. A takedown notice sent to the far-right social media platform Gab over objectionable material related to the Christchurch attacks resulted in Gab posting the notice publicly, which encouraged that material to be shared widely on mainstream social media.
One of the major pieces of work last year was the Department of Internal Affairs consultation on Safer Online Services and Media Platforms. Our media regulation legislation is out of date, and unlike comparable countries overseas, we have little regulation of online platforms. In the last five years the EU has passed the Digital Services Act, and both the UK and Australia have passed Online Safety Acts.
The report from this consultation should have been released by September 2022, yet we are no closer to ensuring our online spaces work for the benefit of our citizens and protect us from disruption by other states, the negative impacts of platforms, and the exploitation of online spaces by extremist groups.
Today we remember those who were lost and those impacted by the awful events five years ago in Christchurch. Let’s also remember that there is critical work to be done to ensure our online spaces don’t create another terrorist with another trail of grief.