Image: Tina Tiller

Internet | November 25, 2021

How Telegram became the extremists’ platform of choice


Telegram has been identified as a key part of New Zealand’s disinformation ecosystem, but what exactly is it, and why is it so significant? Dylan Reeve explores the messaging app for IRL

If you’ve read much about the current state of Covid conspiracy theories in Aotearoa, you’ve almost certainly seen mention of Telegram. The recent working paper from The Disinformation Project at Te Pūnaha Matatini made specific mention of the rise of Telegram as “the platform of choice for the spread of mis- and disinformation in Aotearoa” and highlighted the lack of oversight and limited content policies as the reason. 

Before we get into just what makes Telegram so popular for this purpose, we should understand what it is. It’s typically described as a messaging application or a communications platform, but what is it that makes Telegram different, and why has it become such a significant part of the conversation around online disinformation?

First released in late 2013, Telegram is a messaging application like WhatsApp or Facebook Messenger. It gained popularity as an alternative to the messaging applications built into smartphones because it offered advanced features, such as group messages and image support, that were often unavailable with default applications when messaging between Apple and Android users. 

Initially, Telegram was not really much different from any other messaging platform. Families looking for a place to house their group chat or flatmates coordinating their chore schedules might choose Telegram. It was like any other messaging service, mostly playing host to the innocuous minutiae of daily lives, and cute cat photos.

And like many personal messaging apps, Telegram took a very hands-off approach to content moderation. In the case of “secret chats”, messages sent on Telegram use end-to-end encryption, meaning it’s not even possible for the company to see (or moderate) the content.

But compared to other messaging platforms, Telegram’s terms of service are astonishingly brief and permissive. The entire thing consists of three bullet points – no spam or scams, no public promotion of violence, and no public sharing of illegal pornography. That’s it.  

Telegram’s entire terms of service. (Image: Dylan Reeve)

A big turning point for Telegram came in late 2015 when they launched a new feature: channels. This development allowed users to broadcast messages to their followers, creating an opportunity for influencers to establish themselves on the platform in a way similar to Facebook pages or an Instagram account. 

Soon after, some alt-right personalities found themselves being banned from social media platforms like Twitter, and sought out a new outlet for their opinions. They found that Telegram's recently added channels feature was ideally suited for their purposes, offering an opportunity to broadcast their message to their fans and, with no oversight to speak of, zero risk of being suspended.

As the Maga movement grew during 2016, the popularity of Telegram within the alt-right meant that it was the obvious choice for users and groups that found their rhetoric too extreme for mainstream social media platforms like Facebook, Twitter and Instagram. And not long after that, the QAnon movement followed a similar trajectory, heading to Telegram as an oasis of unfettered communication after sites like Reddit started to crack down.

Caroline Orr Bueno, a US disinformation researcher, says the adoption of Telegram in New Zealand mirrors the experience overseas. (Photo: supplied)

By the time the Covid-19 pandemic began to sweep around the world, Telegram, which had since added new features like voice and video calls and broadcasts, had become the platform of choice for the US alt-right, Maga, QAnon and general conspiracy communities. They were still widely present on other platforms, but the most accessible and least restricted platform for most was Telegram. 

It was against that background that New Zealanders who were starting to gather on Facebook, Instagram and Twitter to share their doubts about the pandemic and the government’s response also began to establish themselves on Telegram. Gradually, as social media platforms took a harder line on limiting the spread of Covid-related disinformation, the more extreme conversation around these ideas shifted to Telegram.

Sanjana Hattotuwa, a research fellow for The Disinformation Project at Te Pūnaha Matatini, estimates that there are close to 180,000 participants across the local Telegram channels he monitors, although he points out that it’s almost impossible to estimate the real number of individuals, as many people participate in multiple channels.

Regardless of the specific numbers, Hattotuwa is clear about the trend he’s observed, especially in recent months: “It’s gone from around 45,000 in late September to nearly 180,000 today. It’s growing at around 35,000 on average every fortnight. On a graph it’s nearly a 45 degree incline.”

The adoption of Telegram in New Zealand mirrors the experience overseas, according to Caroline Orr Bueno, a postdoctoral disinformation researcher at the University of Maryland in the US. The platform’s growth during the most recent lockdown looks like a smaller scale version of what took place in the US at the beginning of the year.

Telegram had a surge of popularity after the January 6th riot in Washington DC. (Photo: Brent Stirton/Getty Images)

“There was a really big jump after January 6th,” says Orr, referring to Telegram’s surge in popularity after the insurrectionist riot in Washington DC. “On the list of most downloaded apps, [Telegram] went from being like 100 or something, to the first or second most downloaded app. They gained millions of users in a 72-hour period right after January 6.”

It’s not just the number of users that concerns Hattotuwa, but also the nature of the communication on Telegram. “In [the platforms] we study, Telegram is by far exceptional for a number of reasons,” Hattotuwa says. “One is that it features the most violent content today. It targets very high levels of authority and government.”

The platform is also “very gendered,” Hattotuwa continues, and “it signposts a lot of QAnon content — there is a lot of QAnon content coming into Aotearoa’s leading Telegram channels, with every imaginable harm you’d associate with QAnon now pulsating and present.”

While much of the high-profile activity on Telegram plays out in public groups and channels, the platform also facilitates highly secure private communication that is beyond the reach of researchers, law enforcement or even the company itself.  

“[Telegram] is clearly a platform where extremists feel comfortable and generally assume, rightly so, that they’re not going to get kicked off,” says Orr, explaining that public Telegram channels in the US have been shown to work as a recruitment tool for extremists. “It allows for this sort of funnel where you have public channels where anybody can just wander in and see what’s there and engage or not engage. If [extremists] see that certain users are engaging with things, they can open up private ways to talk with [new users], and bring them in.”

A wanted poster featuring Jacinda Ardern circulates on Telegram. (Screenshot: Dylan Reeve)

It’s commonly assumed by the users of New Zealand’s Telegram channels that they are being watched by police and the intelligence services, but there’s so far no clear evidence of this. As yet there don’t appear to have been any criminal cases emerging from official observation of the platform in Aotearoa. To many who watch Telegram, it still feels as though the platform is being widely overlooked while attention remains focused on mainstream social platforms like Facebook, or obscure deep web sites like 8chan. 

Telegram is emblematic of the complicated dual-use nature of internet technology in general, and secure communications specifically. For typical users, it’s an accessible and easy-to-use messaging tool packed with powerful features and supported on all their devices. For conspiracy theorists, extremists and anti-social groups, it’s a perfect combination of secure enclave and public promotional platform allowing them to spread their beliefs, recruit supporters and plan future activities. 

Telegram may not be a household name, but that might change as the platform’s impacts are widely seen in the spread of disinformation and the organisation of disruptive protest actions.

Image: Tina Tiller

Internet | November 24, 2021

I was the ‘abortion girl’ on a viral far-right video

After a far-right YouTube personality aggressively vox-popped her for his video at Women’s March London, a New Zealand woman watched in horror as she went viral. She shares her first-person account for IRL.

As told to Madeleine Holden.

When I heard about Women’s March London in 2017, I knew I needed to be there. I was living in Shepherd’s Bush at the time, and I wanted to be part of the international movement protesting Trump’s election and the fact that he was a misogynist, white supremacist and all-round terrible human being. 

I knew the demonstration was going to be huge, and it was: more than 100,000 people turned out to protest the US president. I went with a friend of mine and we stayed the whole day, spending hours walking from the US Embassy in Grosvenor Square in Mayfair to Trafalgar Square in central London. By the time we arrived, it was late afternoon, and when two guys rocked up on a push bike with a sound system, we joined the impromptu party that formed around them. It was a really friendly atmosphere, and I was dancing and feeling great.

Suddenly, I became aware of a person approaching me. There was light in my eyes and I couldn’t see clearly, so initially I thought I was being approached by a hot, pixie-cut lesbian. By the time I realised the stranger wasn’t a lesbian but just a boyish incel holding a camera, I had a second guy – a very posh-talking, Slender Man figure – thrust a mic in my face and start firing questions at me. I was like, “What the fuck is going on?”

I now know the man was Caolan Robertson, a then-nascent far-right YouTuber, but he didn’t introduce himself or his project at the time. When someone sticks a camera and mic in your face, you freeze. I had no idea what was going on, but I figured if I stayed calm and answered his questions, I would get out of the situation soon enough. 

I first got the sense Robertson was trying to trip me up when he asked me twice what I was doing at the march even though I’d already answered, and after he asked me a few hypothetical questions about abortion, I decided to disengage. The whole exchange lasted about 30 seconds and I assumed the footage would be unusable. Because Robertson was focused on abortion, I figured I was probably talking to an anti-choicer. I didn’t realise how much worse he actually was.

The next morning, my friend sent me a panicked text message. “Oh my God, that guy, he’s called the New Brit, [the video] is on Facebook.” When I clicked the link she sent, the video already had 100,000 views. It was boosted by Breitbart UK, which sent it into the stratosphere: it blew up all over Facebook and on Robertson’s YouTube channel. I couldn’t stop watching and reading the comments. Once it got to a million views, I just went numb.

Because I had engaged with Robertson's questions and he didn't selectively edit my section like he did with other interviewees, a lot of the comments focused on me, calling me the "abortion girl" or the "Aussie girl". The comments were horrific and violent, the gist being that we were all stupid and deserved to be raped. They were threatening to do a whole lot of shit to all of us.

I felt sheer terror at this point. I was in shock and I started panicking that I would be identified by some of these far-right men in the street. All they knew about me is that I was the “abortion girl” who “refused to engage in an argument”, and that made them really, really angry.

I completely locked down my social media accounts and got rid of my profile photos. If it wasn’t for my anonymity in the video, things could have been much worse than they were. Still, the fear of these very vocal men who really hate what I stand for didn’t go away for a really long time.

The experience changed how I viewed protests and demonstrations. I had always felt safe and empowered before then, attending relatively small protests in New Zealand which didn’t tend to be infiltrated by counter-protesters and far-right celebrity hopefuls. They were always welcoming and safe spaces, and I’d never really seen them get invaded like that. I’m a lot more cautious about attending those kinds of events now. 

For a long time afterwards, I felt really embarrassed. I was like, “Oh my God, you could have done better. You should have said this, should have said that.” But there was no winning in that situation. He came in with the explicit purpose of shaming us and using us as props. There was no way out of it. 

This was just shy of five years ago. Since then, Caolan Robertson has renounced white supremacy and now works as a “counter-extremism advisor”. He’s doing his thing and that’s great, but I don’t think anyone who’s been a victim of someone like him can ever fully trust that kind of narrative. He’s left a lot of damage. 

To this day, I believe abortion is a human right, and I’m never going to apologise for that. But thinking back, my experience made me really militantly attached to those views. I was like, “How dare you try and shame me for this. Fuck you, I’m just going to do even more work with this, I’m just going to get even more feminist.” You end up getting really, really defensive. 

I can’t even imagine being vox-popped like that now; we’ve become even more polarised than we were in 2017. It makes me think of all the anti-vaxxers and their protests and videos, and how violent people are getting. It’s so easy to forget that we are all human beings with thoughts and feelings and families. We don’t deserve to be thrown on the internet and used as props for someone else’s narrative. 

If we reduce people to 10-second soundbites we deny their humanity, which is exactly what happened to me. I know that as a white lady who is not often read as queer, it’s easier for me to hold this view (and do this work) than it is for others who have been targeted by the far right, but this experience taught me to talk to people I don’t agree with and to ask them questions without initial judgement. This has opened a lot of dialogue and I’ve built bridges this way.

If we don’t meaningfully engage with people when we have the chance, we’ll never make things better. 

Do you make your living in the gig economy? Tried to delete your internet presence? Met the love of your life in a strange way online? If you’ve got a great yarn about the internet impacting your life, get in touch with us at irl@thespinoff.co.nz.