The NZ government must take a much bolder stance on the tech giants who dominate our lives online, writes Leroy Beckett from ActionStation, which today releases The People's Report on Online Hate, Harassment and Abuse.
I spend most of my waking hours online, in one form or another.
It’s how I keep up with friends, do my job, learn about the horrors in the day’s news and distract myself from them. I’m not unusual in that regard. New Zealanders, as a whole, are extremely online (though some of us aren’t at all, and that is also a problem).
For the past few months my colleagues and I at ActionStation have been thinking more deeply about what that means for our society, and how it is affecting the state of our democracy.
The report we are launching today is the result of that thinking, lots of reading and research, and listening to others’ experiences.
In it we argue that the tech giants who have come to dominate our lives online have failed to stop their products from damaging individuals and our wider democracy.
That means the government should take a bolder stance in ensuring they do.
In the foreword to our report, The People’s Report on Online Hate, Harassment and Abuse, the good economist Shamubeel Eaqub writes: “Being online is a misnomer. It’s like walking on footpaths and driving on roads – part of everyday life. Yet we seem to treat online as a separate space rather than an extension of everyday life.”
If our government had no role in making sure our roads and footpaths were safe places to be we would call it an abandonment of duty. It’s the same online.
The internet has taken the place of the public square. It’s where we go to get a sense of public opinion. It’s often where the news begins, and always where it is chewed over and digested. It is where we get information and go to have our voices heard.
But for many people, that public square isn’t working.
If every time you go online you are faced with racist, sexist and homophobic abuse, how likely are you to take part in that public square?
How likely are we to get more diverse leaders and politicians if they have to cope with a torrent of hate, abuse and harassment online?
How will we face the complicated challenges of the future if the networks that are meant to connect us are instead dividing us, and if we are being radicalised by algorithms that promote misinformation?
What we found in our research is a far-reaching problem.
One in three Māori (32%), and one in five Asian (22%) and Pacific (21%) people experienced racial abuse and harassment online in 2018 alone.
The comments on Stuff, and on Facebook posts about news stories, are even worse than we assumed.
YouTube channels posting conspiracy theories about Māori history are getting hundreds of thousands of views.
And that’s just the beginning.
We release the report today not because we hate the internet, but because we love it. And we know it can be better. We need it to be.
The report includes a series of recommendations for the government to take action, which can be summarised under four headings:
Remove: Ensure platforms are active in removing harmful content quickly. An investigation into the most effective method to do this would be required, but the responsibility should be placed on the platform, not the users.
Reduce: Limit the reach of harmful content. Neither the platforms nor the users who create hateful and harmful content should benefit from algorithms that promote divisive and polarising messages.
Review: The New Zealand government needs to review our hate speech laws, the Harmful Digital Communications Act, the Domestic Violence Act, the Harassment Act and the Human Rights Act to ensure they are fit for purpose in protecting people online in the 21st century.
Recalibrate: One of the most significant themes to emerge in this research was the need to attend not just to individualised concerns (eg individual rights and privacy) but also to collective dynamics and wellbeing. Any policies that are developed to protect people online need to have indigenous and collectivist thinking at their centre. They should also ensure that all internet safety / hate speech agencies funded by the Crown reflect the increasing diversity of our country.
These recommendations won’t solve all of the problems with the internet, or even all of those described in our report. But they would be a start.
Leroy Beckett is the Open Democracy campaigner at ActionStation