
Media | September 27, 2021

Facebook’s slow-motion reckoning is about much more than one company

An image of the platform this article is about, to keep you on topic (Getty Images)

A major newspaper investigation into misinformation and other harms on its platforms shows how enormous the Facebook problem has become, writes Hal Crawford.

“The era of social media companies presenting themselves as passive conduits for user speech is coming to an end. The question is what will replace it.” Gilad Edelman, Wired

Last week marked a high point in the slow build-up of societal rage against Facebook. The Wall Street Journal published a five-part series using leaked documents to show that the social media giant knows Instagram is bad for some teenagers, often struggles to implement its own policies, and gives preferential treatment to celebrities. Probably the most damning allegation is that Facebook’s disruptive 2018 algorithm change, which was supposed to bring the focus back to friends and relatives, has instead led to more divisive and misleading information on the platform.

The Journal, like a lawyer building a case, weaves many threads into a net that seems to leave room for only one conclusion: this is a fundamentally flawed company doing great damage. The implication is that something – regulation, taxation, perhaps break-up – must be done about it.

While both regulation and taxation are coming – and to some extent expected by Facebook itself – my conclusion is that they won’t help until we have a clearer idea of the problem.

Facebook’s flaws can be traced back to a business model with a particular original sin: the desire for control without responsibility. A separate and more important issue is the ease with which people can be manipulated in digital environments. This “hacking of the psyche” may be happening within Facebook and social media in general, but it goes beyond them.

The Wall Street Journal’s five-part investigation into Facebook builds a damning case on the company’s failure to tackle misinformation and other societal harms on its platform

Facebook in context

Facebook is a very big company, but by market value it is not as big as Google, Amazon, Microsoft or Apple. For its size, it cops a lot of heat, while a company like Microsoft, by comparison, flies beneath the radar.

The 2018 algorithm change, the one that allegedly led to more divisive content, marked a turning point in Facebook’s relationship with traditional media. The new system favoured posts from friends and family, and traffic from Facebook to news media was hit badly. Facebook had encouraged news publishers to set up pages inside its “walled garden” and to use its platform as a distribution channel for their content. From the point of view of the media, the algorithm change was a breach of trust. Animosity, which had been latent, flowered.

What the WSJ leaks show

Most of the criticisms in the Wall Street Journal series arise from research and investigations conducted by Facebook itself. You get the sense that this is an organisation struggling to come to terms with its own massive scope and influence. Facebook VP of global affairs Nick Clegg wrote an angry rebuttal to the WSJ articles, complaining that research and opinions had been cherry-picked to create the false impression there was “only one right answer” and that Facebook “willfully ignores it if the findings are inconvenient for the company”.

Reviewing the WSJ revelations, I tend to agree with him. There is not sufficient balance in the quoted material to reassure me that the reporters came at the leaked material with open minds, rather than a desire to find facts to fit a template. But regardless of the mindset and methods of the WSJ journalists, substantial problems are revealed.

Commentary has tended to focus on the terrible effects of Instagram (owned by Facebook) on teenage girls. The essence of this problem, as your mum might say, is that Instagram is encouraging people to care about all the wrong things: being liked, being pretty, being rich. Appearance over substance. Facebook’s internal research on the matter seems to verify that chasing these chimeras will lead to unhappiness, and possibly harm your mental health.

Those findings are based on focus groups and online surveys: the flakiest science of all. On firmer ground is the revelation that Facebook’s 2018 algorithm change to focus on what it calls “meaningful social interactions” (MSI) had the unintended consequence of encouraging outrage, lies and divisive content. Behind the increase in this kind of content, the leaked documents show, was a points system that significantly up-weighted commenting and re-sharing. According to the WSJ, BuzzFeed founder Jonah Peretti wrote privately to Facebook to share his concerns that the new algorithm was encouraging divisive content. At this point, we should remember that Peretti had probably just lost a truckload of traffic and was no doubt mad as hell.
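To make the mechanics concrete, here is a minimal sketch of how a points system like the one described could behave. The weights, interaction types and numbers below are hypothetical illustrations, not Facebook’s actual values; the point is simply that when comments and re-shares are worth many times a like, content that provokes argument outranks content that is merely liked.

```python
# Toy illustration of an MSI-style scoring system. All weights are
# hypothetical, chosen only to show the shape of the problem: a ranker
# that sorts by this score will surface provocative posts more often.

WEIGHTS = {"like": 1, "reaction": 5, "reshare": 30, "comment": 30}

def msi_score(interactions: dict[str, int]) -> int:
    """Sum the weighted interaction counts for one post."""
    return sum(WEIGHTS.get(kind, 0) * count for kind, count in interactions.items())

calm_post = {"like": 500, "comment": 10}                      # widely liked, little discussion
divisive_post = {"like": 50, "comment": 120, "reshare": 40}   # provokes argument

print(msi_score(calm_post))      # 500*1 + 10*30            = 800
print(msi_score(divisive_post))  # 50*1 + 120*30 + 40*30    = 4850
```

Under these invented weights, the post with a tenth of the likes scores six times higher, purely because it generates comments and re-shares.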

The walled garden

Facebook doesn’t let users hack their own pages, or develop their own payment systems, or give access to the underlying source code of its platform. From the beginning it developed a “walled garden” that rested on top of the open web, then shifted primarily to an app-based interface. In some regions and for some users, Facebook has supplanted the open web as the typical internet experience.

This degree of control allowed it to shape its product for maximum engagement and profitability. Community standards, enforced by AI and human moderators, have been essential to the business model: without this content moderation, mainstream and small advertisers would not touch Facebook, and user experience would be catastrophically degraded. Without the absolute technical control Facebook exercises over the News Feed and other “surfaces” (as sections of the FB app are called), it would not have been able to continually tweak the interface to maximise engagement and revenue.

But Facebook has not accepted the responsibility that necessarily comes with this degree of control. It doesn’t want to be the arbiter of right and wrong. In a classic liberal stance, founder and CEO Mark Zuckerberg tolerated Holocaust denial on “the platform” for many years. Last year he changed his mind. With every such decision, Facebook moves closer to the position of the classic publisher. Publishers are morally and legally responsible for the things they publish.

In New Zealand, Facebook livestreams of the prime minister’s 1pm updates are being flooded with comments promoting vaccine misinformation.

Totally hackable

One of the biggest problems we face – the big we, all of humanity – is that our “operating systems” are incredibly easy to hack. As psychologists like Daniel Kahneman have shown, our conscious minds are storytellers, rationalising, ratifying and explaining things decided elsewhere. Because “reason” and conscious thought often don’t enter into the decision-making process, someone or something that understands the true drivers of belief and behaviour is at a huge advantage.

Back in 2012, I began a research project with some colleagues to investigate what kinds of news stories get shared on Facebook. We discovered that the most powerful driver of sharing on social media was appeal to group identity, expressed as approval or disapproval of someone or something else. We also discovered, as many other researchers had before us, that people’s tendency to share was greatly increased when they were emotionally aroused. Put these two facts together with algorithmically facilitated communication – otherwise known as a digital social network – and you get a sentimental, outrage-driven and divisive environment.
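A toy simulation, with entirely invented parameters, shows how these two tendencies compound once a ranking algorithm is added: if the probability of sharing rises with identity appeal and emotional arousal, and the feed promotes whatever has already been shared, the most arousing items come to dominate exposure.

```python
import random

# Toy model (all numbers invented). Each item has an "identity appeal"
# and an "emotional arousal" score in [0, 1]. Share probability rises
# with both, and the ranker re-sorts by accumulated shares each round,
# so early sharing advantages compound.

random.seed(1)
items = [{"appeal": random.random(), "arousal": random.random(), "shares": 0}
         for _ in range(1000)]

for _ in range(5):  # five ranking/exposure rounds
    # Previously shared items rise to the top and get seen again.
    items.sort(key=lambda it: it["shares"], reverse=True)
    for it in items[:200]:  # only the top of the feed is actually seen
        p_share = 0.5 * it["appeal"] + 0.5 * it["arousal"]
        if random.random() < p_share:
            it["shares"] += 1

top = sorted(items, key=lambda it: it["shares"], reverse=True)[:10]
avg_top = sum(it["arousal"] for it in top) / len(top)
avg_all = sum(it["arousal"] for it in items) / len(items)
print(f"average arousal, top 10 items: {avg_top:.2f}; all items: {avg_all:.2f}")
```

The top of the simulated feed ends up markedly more arousing than the pool it was drawn from, without anyone having designed the system to reward outrage as such.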

On the hook for everyone and everything

Facebook’s walled garden now contains the majority of the world’s internet users (60%). Facebook may argue that it is inconceivable that any single company could be held responsible for the misdeeds and mental health of this multitude. That seems reasonable. But it is also inconceivable that any single company should control and profit from the social interactions of such a multitude.

In 2019, Zuckerberg called for regulation of “harmful content, election integrity, privacy and data portability”. The problem isn’t resistance to regulation, it’s knowing what to do, and Zuckerberg’s list doesn’t seem to go to the heart of the problem. How do we moderate the drive to maximise engagement? How do we prevent malicious actors from exploiting the weaknesses of the human operating system? We are in completely uncharted waters, and the problems are bigger than a single US company.

What is clear is that the public mood has shifted significantly and the “pure platform” days of social media are over. As a first step, we should recognise that social media companies bear a level of legal and ethical responsibility somewhere between a traditional publisher and a telecommunications company. We may also have to accept that the current functionality of social media will significantly change.
