Earlier this week David Farrier lifted the lid on the very strange case of the Christchurch AI that would supposedly revolutionise global medical practice. What has he discovered since?
Since writing about Zach, the AI that increasingly appears to be neither Artificial nor Intelligent, all the main players have fallen strangely silent.
Associate Professor Pickering isn’t replying to my emails (although he did remove his blog posts about Zach, which can be found archived here), and the best I got out of David Whale was him replying to my message with … absolutely nothing, and CC’ing his son Albi Whale in. I did get some response from Dr Seddon-Smith, of which more soon.
I rang the Terrible Foundation, Whale’s business conglomerate, and after some annoying MIDI hold music, someone picked up.
I told them it was David Farrier, calling from The Spinoff.
“How are you?” I said.
“I’m fine, but I am not talking to you, my friend”.
The voice sounded very much like Albi Whale’s, which I’d heard in the Stuff video.
“Is this Albi?” I said.
They hung up.
But while the Whales remain silent (not even a click or a whistle!), I have heard from lots of other people, from techy types on Twitter offering their considered opinions that Zach looks batshit bonkers, to concerned doctors in Christchurch. “I’m told a number of doctors at the [Christchurch] hospital have been involved in some way” began one email to me.
The Spinoff has learnt that a number of medical professionals went to hear Dr Seddon-Smith (a trustee at Terrible) talk at a Digital Futures event last year, and when they expressed interest in Zach were “promptly invited to invest, and sent a share prospectus for the public launch”.
I have got hold of a copy of said prospectus, and it’s unlike any share prospectus I’ve seen. The share offer in Number Eighteen Limited (listed as an operator of Omega Health) talks in incredibly broad terms without saying very much at all. It also doesn’t list any technical staff, despite Zach being a revolutionary new AI phenomenon, unlike anything the world has ever seen.
The only people named in the prospectus are Dr Robert Seddon-Smith and David Whale, along with “advisors” Dr Martin Than (an Emergency Medicine Specialist with the Canterbury District Health Board) and Dr John Pickering (an Associate Professor in the Department of Medicine at Otago).
Then there’s the share offering:
Their goals are lofty, and terrifying when you consider that there is no verifiable evidence that Zach is real:
“We intend to work through the establishment – PHOs, DHBs and the Ministry of Health as well as Health Research Council, Canterbury Medical Research Foundation and others.”
Then there’s their “valuation”:
And a breakdown of their amazing technology:
The fact that people were approached to invest after Dr Seddon-Smith’s speech is sort of amazing, because you’d think the talk would have raised a number of red flags within the medical community.
It’s a 30-minute presentation, and it’s been pored over on reddit. One user watched the whole thing and noticed some interesting additional points on Dr Seddon-Smith’s slideshow and speech, which I’ll list now:
1. Patient notes for someone with the flu – see the ridiculous misspelling of oedema as ‘adiva’.
2. Patient notes for a woman who was in the middle of a psychiatric episode – “Zach” inserted the number of a psychiatric helpline unprompted.
3. Inconsistent capitalisation of words, even those starting sentences – an apparently obvious human error.
4. The doctor suggests that “Zach” has been given access to all the electronic records of the patients concerned, from lab results all the way through to letters from psychiatrists. Wouldn’t that cross a line?
5. At this point in the talk Dr Seddon-Smith thanks his audience, including medical professionals, Christchurch city councillors, the media and possibly National and Labour MPs.
Point 4 seems of particular concern. Dr Seddon-Smith told me on the phone last week that patients were made aware their records would be shared with Zach – an AI. They gave their consent. But if Zach doesn’t exist, is that consent still valid? If the patient records are being read by very human eyes – what if those eyes belong to Albi Whale, David Whale, or one of Amazon’s Mechanical Turk workers (the service takes its name from a fake chess-playing machine in the 1700s)?
I have queries in with the University of Otago Human Ethics Committee, as well as HDEC (the Health and Disability Ethics Committee) about this.
I went back to Dr Seddon-Smith. “No patient identifiable information has been sent to Zach,” he said.
“All patients agreed to recordings being used both in advance of them being taken and at the conclusion of the recording. Recordings which included identifiable information were never uploaded.”
He added: “In all things, I have acted with the highest professional and ethical standards. You may call me a fool, gullible, or simply misled about Zach, but I will not tolerate any suggestion that I have ever acted in a way that is unbecoming of a medical practitioner.”
All the Charities Commission annual returns for Terrible New Zealand appear utterly wonky. Take their February 2018 annual return as an example: in the last year Terrible has gone from having less than a million dollars of equity to $456 million.
The increase comes from the Trust coming into possession of $442 million worth of ‘equipement’ (sic).
As to Terrible’s charitable work – it’s unclear what it is. In the past Terrible has claimed to donate their business proceeds to charities like Oxfam. I asked Oxfam about their relationship with Terrible, and it turns out there had indeed been one:
“In 2014, Oxfam New Zealand started a partnership with Terrible Talk New Zealand, with the understanding that the new telecommunications company would be making significant contributions to support Oxfam’s work.”
But Oxfam continues: “Unfortunately, their contributions did not meet the level, or the timing promised. As such, Oxfam ended the partnership with Terrible Talk in 2015 and asked that all references to Oxfam be removed from the Terrible Consolidated websites and social media pages – and all external communications.”
It’s just another example of Albi and David Whale’s Terrible being a whole lot of… absolutely nothing.
I can only guess Terrible’s “charity” work and their bonkers returns – along with all the favourable media reports that remain online about The Terrible Foundation – are to make Terrible appear legitimate, which helps when they are approaching the medical community about a fancy new AI they want them to invest in.
Oh, and the Whales are approaching lawyers, too. This is how one lawyer recollects the AI being pitched to him in 2017:
“Over several years, with multiple law practices, different countries and much actual practice we have a virtual solicitor avatar for Zach called Hustle. Hustle not only understands the law, it can construct and defend a legal argument.”
That’s right, in this context Zach is literally called Hustle. With names like Terrible and Hustle stabbing me in the eye, part of me wonders if this is all some art project being run by Shia LaBeouf.
Everything stinks. Even the content from Terrible’s various websites isn’t theirs: It’s yanked from sites like the Ford Foundation:
Online: Terrible; in real life: Terrible. I’ve heard from a variety of people who have had dealings with Albi Whale – inventor of the AI – and his father in the past:
“I met Albi and his father around a year ago when Albi emailed me wanting to lease commercial office space for his charity. I’m a leasing manager in Christchurch. I met them and gave them a tour of our spaces. It wasn’t the greatest experience and Albi was an interesting character. And for all his YES talk he disappeared without trace never to be heard from again.”
The leasing manager, who didn’t want to be named, continued:
“I always research our potential tenants, but never did get to the bottom of what it was that Terrible Foundation did. He was super vague, when I asked, just kept bragging about how rich and successful he was.”
Someone else, who works in IT, contacted me about an interaction they’d had back in 2013:
“I met David and Albi in 2013, when they came to me to talk about developing a cloud based service to support small businesses. David seemed like a credible middle-aged IT consultant type, [but] Albi was astonishingly rude. We did not end up doing business.”
The stories kept rolling in. Some were stranger than others, but they also all felt strangely familiar and vacuous:
“I met up with Albi after work. It’s been a few years now, but there are still a few stand out memories from this conversation. Albi told me about ‘Hannah’, tried to convince me that Hannah was a general artificial intelligence, and that Hannah had entirely replaced 40 people’s jobs in a call centre in Canada. He told me that Hannah could do all sorts of things, including ordering pizza.
“He also told me a story about someone breaking into a data centre, setting off the alarm, and then killing themselves. I distinctly remember him saying that they found the corpse in the morning, which makes no sense considering that if the datacenter was so high security, the alarm would not go uninvestigated.
“I mention this story because I very much did not believe him.”
David and Albi Whale have still not responded to any of my queries on all this.
Terrible’s offices are listed as being in Christchurch, apparently above a bike shop. I called the store and asked if there was anything Terrible in the building.
“Yes, they’re upstairs,” the man at the bike shop said.
“Does Albi work up there?” I asked.
“Yes, he’s here most days. Him and David,” he said.
I sent a friend to take a look at the office. He snapped a photo of various maths equations on their window:
Maybe the answer to all this is up there, in the maths, staring us all in the face.
If you want to contact David Farrier about this story, you can email him on firstname.lastname@example.org