In the sixth episode of Actually Interesting, The Spinoff’s monthly podcast exploring the effects of artificial intelligence on our lives, Russell Brown looks at the draft algorithm charter, the government’s commitment to transparent and accountable use of AI.
In the Star Trek Voyager episode ‘Critical Care’, the Doctor – well, actually, the mobile emitter that produces him as a hologram – is stolen and sold to an alien hospital ship.
There, he discovers that the complex computer algorithm that determines treatment, the Allocator, dishes out lifesaving care according to each patient’s Treatment Coefficient – which measures not need, but an individual’s value to society. To lower-value patients, the computer just says no, without explanation. They die.
Mandy Henk was home sick herself recently when she watched the episode. And she swiftly recognised that this was a dystopian sci-fi story about a real thing used in the government sector here on Earth, in New Zealand: an operational algorithm.
We do not dispense medical treatment on the basis of individuals’ deemed social value. But we do use algorithms to make a bunch of other decisions: provisioning school bus routes, predicting which young school-leavers are in danger of falling through the cracks, triaging visa applications.
Henk, the CEO of Tohatoha, the organisation formerly known as Creative Commons NZ, is one of a number of people looking closely at the draft algorithm charter published recently by Statistics NZ. It’s the government’s most concrete commitment yet to transparent and accountable use of algorithms, AI and other sophisticated data techniques. It’s timely.
“I think it’s probably past time,” says Henk. “Given the number of algorithms currently used throughout government, we’re probably overdue for a commitment on the part of government to use them in ways that ensure equity and fairness.”
“We have passed the point where we need to have this conversation,” agrees data scientist Harkanwal Singh. “It’s urgently needed. We need a robust conversation and real action.”
Both Henk and Singh welcome the draft charter as a useful statement of principles – and both believe it needs to be clarified and strengthened. For instance, it commits public entities to “upon request, offer technical information about algorithms and the data they use” – which implies there needs to be someone doing the requesting. But who, and how?
“That is not clear at the outset,” says Singh. “It would be better if the language made it clear. Also, why ‘upon request’? Being open by default is much better and creates a culture of accountability. We do not want a repeat of the OIA experience.”
“In my first career, I was a librarian and I spent 20 years doing that,” adds Henk. “The ability of people to understand their own information needs is actually fairly limited. People have to know what they don’t know in order to make good requests for information. I would prefer to see government being more proactive about this, and providing that information to the people who are going to be impacted at the front end.”
Part of the issue with transparency and explanation is that there’s only a small subset of society that currently understands how algorithms work.
“I don’t know that I understand how algorithms work,” says Henk. “And I don’t think anyone outside of a fairly narrow group of people has a very good understanding of how algorithms work. Which is one reason why transparency isn’t particularly helpful. If you need to get a degree in computer science in order to understand it, that’s an awful lot to ask of the public.”
“It might be difficult to explain internal workings of a particular algorithm,” says Singh, “but what’s not particularly hard is to show what you are optimising for, why, and at least try to justify how. In government and public sector, we have to take the approach of accountability.”
Internet NZ has been vocal on the issue recently (see the #DataCharter hashtag on Twitter) and Henk agrees there’s a key role for civil society to play.
“Just like any other area of social justice, this really does require strong engagement from civil society. Funded, resourced civil society groups going out into communities, to groups like churches and markets, and explaining to people what these algorithms are, how they work and how people can get involved in making decisions about whether or not they want this to be something that impacts their lives.”
But surely there’s also a particular duty on the subset of people who do understand to engage and be active?
“I think there is,” says Henk. “But their choice to engage and be active probably isn’t enough. We need to look seriously at how deep we want the government to engage with things that don’t actually make sense to most of the public.”
Singh believes that Immigration New Zealand’s experiment in data modelling last year to try to predict which would-be immigrants might overstay or otherwise misbehave was an example of “how not to do it”. It certainly wasn’t in the spirit of the charter – not only did the public not know about it, the minister of immigration didn’t know until journalists called him last year asking if this was racial profiling. (It was a reasonable question, given what ProPublica revealed in 2016 about COMPAS, a criminal risk-assessment tool used widely in the US.)
Singh also has concerns about the charter’s explicit limitation to operational algorithms, “rather than those used in policy-making or research. There is no good reason for that distinction, unless we want policy to be done away from scrutiny.”
Both recommend reading not only the draft charter – which, at 266 words, is essentially only a statement of principles – but Statistics’ somewhat more detailed algorithm assessment report, which helps explain what the principles are about.
While the discussion continues, Singh is clear about what the goal should be: “We need something parallel to the European governance framework for algorithmic accountability and transparency.”
And for the charter itself?
“An actual mechanism for turning intentions into actions. We need a review mechanism which requires openness by default. This means whenever algorithms are used, details should be published up front. As much detail as possible, taking into account privacy concerns.
“Rather than generate scary headlines about algorithms, we’ll all be better off if, as a society, we approach algorithms as useful tools, as long as they’re built and refined in the public eye.”
This content was created in paid partnership with Microsoft. Learn more about our partnerships here.