
Media | September 30, 2021

How one tweet changed the future of Police Ten 7

Auckland councillor Fa’anana Efeso Collins and the show that prompted his tweet, Police Ten 7 (Image: Tina Tiller/supplied)

Following an independent review into Police Ten 7, TVNZ has confirmed the format of the series will change. Justin Latif asked the Auckland councillor whose tweet sparked the report if he’s satisfied with its findings.

On a Sunday morning in March, Auckland councillor Fa’anana Efeso Collins was chatting with his wife about what they had seen on television that week and the topic of racial stereotyping on the reality TV series Police Ten 7 came up. Following that discussion, he tweeted his thoughts, writing that it was “low level chewing gum TV that feeds on racial stereotypes”.

The tweet attracted a flurry of engagement and media interest, a range of both positive and negative reactions, and, horrifyingly, also a bomb threat against him and his family that included a direct reference to his criticism of the show. 

In response to the public furore about the programme, a review was conducted by Karen Bieleski, a media consultant and former TVNZ programmer, and Khylee Quince, dean of law at AUT. The review was intended to establish whether Police Ten 7 depicted Māori and Pacific people fairly and how the show aligned with contemporary societal values more broadly.

That review was made public yesterday. While it determined that Māori and Pacific people who participated in the show were fairly portrayed, it concluded the programme did little to discourage negative stereotypes. Among its recommendations were that filming should be done outside Auckland and the Waikato to ensure a wider range of ethnicities are featured, that TVNZ and Screentime NZ staff undertake training on Te Tiriti o Waitangi and racism, and that all promotional material for the show be signed off by the programme’s commissioner. 

TVNZ and Screentime NZ have said they will take the review’s findings on board and will “reimagine” Police Ten 7’s future format.

A scene from Police Ten 7. Photo: TVNZ

Councillor Collins says he’s pleased the report was released publicly, but feels the authors “could have gone a lot further” in their recommendations. 

“It’s focused on the people involved and their genuine heart to do the best job they can – which is all very nice. But essentially the report vindicates mine and many others’ opposition, in that it says the show is not in line with our societal values, which is to be fair and consistent and cover a wide range of things.”

He applauds one of the report’s key recommendations: that TVNZ and Screentime NZ research the concerns that different groups hold around how their communities are represented in the media. “I think there’s a chance TVNZ is going to get left behind if they don’t grasp the values of contemporary society around being just and fair.

“If you only give people chocolate bars, then they will think that’s the full diet. But the truth is much more varied than that,” he says. “One problem with this show is that it shows only a third of what comprises a police officer’s shift, when police are also supporting people and communities and rebuilding trust.”

Collins says he’s still taken aback that one tweet could provoke both huge support and opposition as violent as a bomb threat. “I was surprised the tweet got the response it got, given it was sent on a Sunday before church, so I suspect it was a slow news day. But I also think there’s a feeling that people have had a gutsful of some police practices and that it’s time to shift.”

In response to Collins’ tweet back in March, Screentime NZ chief executive Philly de Lacey said the show was aiming to provide an “accurate portrayal of what the police are doing out in the streets”. Following the release of yesterday’s review, de Lacey said: “We look forward to partnering with TVNZ and the NZ Police to sensitively evolve the format for the next chapter.” 

In a statement, TVNZ director of content Cate Slater said the review provided “a way forward” and that TVNZ is “committed to reimagining Police Ten 7 so it serves viewers in the years to come”. TVNZ and Screentime NZ will announce details of the new version of the series and the format changes it will undergo later this year.

Collins says he would still like to see Police Ten 7 cancelled, but if that’s not possible he has some tongue-in-cheek suggestions on what a reimagined show should look like. “If I was them, I’d create programming that is reflective of New Zealand’s changing values, like showing the great things our young people are doing, not just focusing on kids who have had too much to drink. 

“How about sitting outside the IRD offices and pursuing people who have been avoiding tax? And imagine if they were in Wānaka recently, we would have had an interesting show about people escaping Auckland breaking lockdown rules,” he says. 

Whatever is next for the show, Collins is glad his tweet made an impact. “We have allowed this programme for years and years to be seen as the unquestioned truth, and the tweet actually challenged that idea, that this might not be the whole truth.”


Media | September 27, 2021

Facebook’s slow-motion reckoning is about much more than one company

An image of the platform this article is about, to keep you on topic (Getty Images)

A major newspaper investigation into misinformation and other harms on its platforms shows how enormous the Facebook problem has become, writes Hal Crawford.

“The era of social media companies presenting themselves as passive conduits for user speech is coming to an end. The question is what will replace it.” – Gilad Edelman, Wired

Last week marked a high point in the slow build-up of societal rage against Facebook. The Wall Street Journal published a five-part series using leaked documents to show that the social media giant knows Instagram is bad for some teenagers, often struggles to implement its own policies, and gives preferential treatment to celebrities. Probably the most damning allegation is that Facebook’s disruptive 2018 algorithm change, which was supposed to bring the focus back to friends and relatives, has led to more divisive and misleading information on the platform.

The Journal, like a lawyer building up a case from many threads, weaves a net that seems to leave room for one conclusion only: this is a fundamentally flawed company doing great damage. The implication is that something – regulation, taxation, perhaps break-up – must be done about it.

While both regulation and taxation are coming – and to some extent expected by Facebook itself – my conclusion is that they won’t help until we have a clearer idea of the problem.

Facebook’s flaws can be traced back to a business model with a particular original sin: the desire for control without responsibility. A separate and more important issue is the ease with which people can be manipulated in digital environments. This “hacking of the psyche” may be happening within Facebook and social media in general, but goes beyond them.

The Wall Street Journal’s five-part investigation into Facebook builds a damning case on the company’s failure to tackle misinformation and other societal harms on its platform

Facebook in context

Facebook is a very big company, but it’s not as big as Google, Amazon, Microsoft or Apple. See the graphic to get an idea of its relative economic power. For its size, it cops a lot of heat, while a company like Microsoft, in comparison, flies beneath the radar.

The 2018 algorithm change, the one that allegedly led to more divisive content, marked a turning point in Facebook’s relationship with traditional media. The new system favoured posts from friends and family, and traffic from Facebook to news media was hit badly. Facebook had encouraged news publishers to set up pages inside its “walled garden” and to use its platform as a distribution channel for their content. From the point of view of the media, the algorithm change was a breach of trust. Animosity, which had been latent, flowered.

What the WSJ leaks show

Most of the criticisms in the Wall Street Journal series arise from research and investigations conducted by Facebook itself. You get the sense that this is an organisation struggling to come to terms with its own massive scope and influence. Facebook VP of global affairs Nick Clegg wrote an angry rebuttal to the WSJ articles, complaining that research and opinions had been cherry-picked to create the false impression there was “only one right answer” and that Facebook “willfully ignores it if the findings are inconvenient for the company”.

Reviewing the WSJ revelations, I tend to agree with him. There is not sufficient balance in the quoted material to reassure me that the reporters came at the leaked material with open minds, rather than a desire to find facts to fit a template. But regardless of the mindset and methods of the WSJ journalists, substantial problems are revealed.

Commentary has tended to focus on the terrible effects of Instagram (owned by Facebook) on teenage girls. The essence of this problem, as your mum might say, is that Instagram is encouraging people to care about all the wrong things: being liked, being pretty, being rich. Appearance over substance. Facebook’s internal research on the matter seems to verify that chasing these chimeras will lead to unhappiness, and possibly harm your mental health.

Those findings are based on focus groups and online surveys: the flakiest science of all. More factually solid is the revelation that Facebook’s 2018 algorithm change to focus on what it calls “meaningful social interactions” (MSI) had the unintended consequence of encouraging outrage, lies, and divisive content. Behind the increase in this form of content, the leaked documents show, was a points system that significantly up-weighted commenting and re-sharing. According to the WSJ, BuzzFeed founder Jonah Peretti wrote to Facebook to privately share his concerns that the new algorithm was encouraging divisive content. At this point, we should remember that Peretti had probably just lost a truckload of traffic and was no doubt mad as hell.
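
To make the mechanism concrete, here is a minimal sketch of how an MSI-style points system can end up favouring divisive posts. The weights, scoring function and example posts below are illustrative assumptions, not the figures from the leaked documents; the only property carried over from the reporting is that comments and re-shares count for far more than passive likes.

```python
# Illustrative sketch only: these weights and posts are invented for
# demonstration and are NOT the values from the leaked WSJ documents.

# Hypothetical MSI-style points per interaction type. Comments and
# re-shares are weighted far more heavily than likes.
MSI_WEIGHTS = {"like": 1, "comment": 15, "reshare": 30}

def msi_score(interactions: dict) -> int:
    """Sum weighted interaction counts into a single ranking score."""
    return sum(MSI_WEIGHTS[kind] * count for kind, count in interactions.items())

# A calm post that many people quietly like...
friendly_post = {"like": 500, "comment": 10, "reshare": 5}
# ...versus an inflammatory post fewer people like but many argue over and re-share.
divisive_post = {"like": 100, "comment": 120, "reshare": 60}

print(msi_score(friendly_post))  # 500 + 150 + 150 = 800
print(msi_score(divisive_post))  # 100 + 1800 + 1800 = 3700
```

Under this kind of scoring, the divisive post ranks more than four times higher despite attracting fewer likes, which is exactly the dynamic the leaked research describes.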

The walled garden

Facebook doesn’t let users hack their own pages, or develop their own payment systems, or give access to the underlying source code of its platform. From the beginning it developed a “walled garden” that rested on top of the open web, then shifted primarily to an app-based interface. In some regions and for some users, Facebook has supplanted the open web as the typical internet experience.

This degree of control allowed it to shape its product for maximum engagement and profitability. Community standards, enforced by AI and human moderators, have been essential to the business model: without this content moderation, mainstream and small advertisers would not touch Facebook, and user experience would be catastrophically degraded. Without the absolute technical control Facebook exercises over the News Feed and other “surfaces” (as sections of the FB app are called), it would not have been able to continually tweak the interface to maximise engagement and revenue.

But Facebook has not accepted the responsibility that necessarily comes with this degree of control. It doesn’t want to be the arbiter of right and wrong. In a classic liberal stance, founder and CEO Mark Zuckerberg tolerated Holocaust denial on “the platform” for many years. Last year he changed his mind. With every such decision, Facebook moves closer to the position of the classic publisher. Publishers are morally and legally responsible for the things they publish.

In New Zealand, Facebook livestreams of the prime minister’s 1pm updates are being flooded with comments promoting vaccine misinformation.

Totally hackable

One of the biggest problems we face – the big we, all of humanity – is that our “operating systems” are incredibly easy to hack. As psychologists like Daniel Kahneman have shown, our conscious minds are storytellers, rationalising, ratifying and explaining things decided elsewhere. Because “reason” and conscious thought often don’t enter into the decision-making process, someone or something that understands the true drivers of belief and behaviour is at a huge advantage.

Back in 2012, I began a research project with some colleagues to investigate what kind of news stories would be shared on Facebook. We discovered that the most powerful driver of content on social media was appeal to group identity, expressed as either approval or disapproval of someone or something else. We also discovered, as many other researchers had before us, that people’s tendency to share was greatly increased when they were emotionally aroused. When you put these two facts together with algorithmically facilitated communication – otherwise known as a digital social network – you get a sentimental, outrage-driven and divisive environment.
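
As a rough illustration of how those two drivers combine with engagement-based ranking, here is a toy simulation. The share_probability function, its coefficients and the example stories are invented for the sketch; they are not the model from that research, only a crude stand-in for its two findings.

```python
import random

# Toy model: assume a user's chance of sharing a story rises with
# emotional arousal and with appeal to group identity (both in [0, 1]).
# The functional form and coefficients are assumptions for illustration.
def share_probability(arousal: float, identity_appeal: float) -> float:
    return min(1.0, 0.05 + 0.4 * arousal + 0.4 * identity_appeal)

stories = [
    {"name": "explainer", "arousal": 0.2, "identity_appeal": 0.1},
    {"name": "outrage piece", "arousal": 0.9, "identity_appeal": 0.8},
]

# Simulate 10,000 users seeing each story. An engagement-ranked feed
# would then promote whichever story accumulates the most shares.
random.seed(1)
for story in stories:
    p = share_probability(story["arousal"], story["identity_appeal"])
    shares = sum(random.random() < p for _ in range(10_000))
    print(story["name"], shares)

# The outrage piece reliably collects several times more shares, so a
# feed that ranks by sharing activity amplifies it in the next round.
```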

On the hook for everyone and everything

Facebook’s walled garden now contains the majority of the world’s internet users (60%). Facebook may argue that it is inconceivable that any single company could be held responsible for the misdeeds and mental health of this multitude. That seems reasonable. But it is also inconceivable that any single company should control and profit from the social interactions of such a multitude.

In 2019, Zuckerberg called for regulation of “harmful content, election integrity, privacy and data portability”. The problem isn’t resistance to regulation, it’s knowing what to do, and Zuckerberg’s list doesn’t seem to go to the heart of the problem. How do we moderate the drive to maximise engagement? How do we prevent malicious actors from exploiting the weaknesses of the human operating system? We are in completely uncharted waters, and the problems are bigger than a single US company.

What is clear is that the public mood has shifted significantly and the “pure platform” days of social media are over. As a first step, we should recognise that social media companies bear a level of legal and ethical responsibility somewhere between a traditional publisher and a telecommunications company. We may also have to accept that the current functionality of social media will significantly change.