
March 6, 2018

The mystery of Zach, New Zealand’s all-too-miraculous medical AI

Illustration: Toby Morris

An artificial intelligence bot called Zach is creating a stir in the medical community. A doctor in Christchurch is teaching it to write patient notes. An Otago professor has it interpreting ECG results. But AI experts are not convinced. David Farrier goes in search of Zach.

Last week I heard murmurings that a New Zealand healthcare organisation had been approached to trial an artificial intelligence technology called “Zach”.

I love Artificial Intelligence. The highly underrated Bicentennial Man is one of my favourite films! If some good AI was in New Zealand, I sure as hell wanted to know about it. But a quick Google showed me that the AI system, Zach, was already being trialled by a healthcare provider in Christchurch.

This trial had been written about last year by NZ Doctor. This thing was legit.

“Imagine having an assistant who listens to your consultations and immediately summarises them into clinical notes for you and memos for your patients,” stated the piece. An artificial intelligence system that can work hand in hand with medical doctors, listening to and interpreting their notes? It sounded good, very good – almost too good to be true.

Utterly fascinated, I began poking around, starting a journey that would take me around the internet, into the minds of medical professionals, and ultimately into a bizarre exchange about the nature of magic.

It all starts with the man behind the AI technology: Albi Whale.

I first heard about Albi Whale – full name Alberic Whale – back in 2014, when Jackson James Wood tweeted, “Sounds like he’s from Harry Potter”.

The tweet in question was part of a conversation about The Terrible Foundation, a charity founded by Albi Whale. Someone mentioned Alberic Whale didn’t have much of a digital footprint. “There’s a story in that I reckon,” tweeted Vaughn Davis.

That was four years ago. And it appears there was a story in it: “The curious tale of Alberic Whale: Turning over his company to world charity” appeared on Stuff business in January 2017. The feature – all 3000-plus words of it – adopted a somewhat sceptical tone, but John McCrone largely let the boasts made by 25-year-old Whale go unchallenged.

“Maybe he also does actually have a collection of 56 vintage Minis as an investment even though he hasn’t got a driver’s licence,” McCrone wrote, without seeking any proof the Minis actually existed. The best the story offers is a file image of three Minis from The Italian Job.

A core part of Albi’s origin story is that he met semi-retired UK businessman William Kreuk online, and started a business with him called Red Dog, after which “the money started pouring in”. Thing is, there is no record of any current or former shareholder or director in New Zealand or the UK with that name, nor are there any obvious contenders for a business called Red Dog which fits the description. In fact apart from the Stuff article, the name “William Kreuk” returns no meaningful results in a Google search. Nor does Bill Kreuk, nor Billy Kreuk, nor W Kreuk.

It’s interesting to note that around the same time Albi’s UK-based company was apparently earning “hundreds of thousands a year”, his New Zealand business, Luminous Group Limited, was put in liquidation by the High Court. Liquidators found a bank account with a $3 balance, an iPhone, and two old laptops. The Stuff feature goes on to say that Albi went on a “frenzy” of business start-ups when he established Terrible, including Terrible Print, Terrible Energy, Terrible Ideas and Terrible as a Service.

I checked, and Terrible Energy and Terrible Ideas were the only ones ever registered. Terrible Energy was removed from the Companies Register in July 2015. Is Albi Whale, and all this Terrible stuff, for real? At one point in the article Albi says he built his own cellphone tower out of scrap electronics when he was 12: “It was on a broomstick my brother screwed to the side of the house. My mother was infuriated.”

While it may have been possible to intercept the audio of conversations on analogue cellphones, experts I asked said the software and hardware required for pairing and management of phones akin to a “cellphone tower” were simply out of the reach of any hobbyist.

So what the hell is going on?

Over the last four years Albi Whale has moved on from cellphone towers to Artificial Intelligence. In 2014, he established Hannah Group Ltd, in order to explore and exploit an AI system he referred to as “Hannah”. In the Stuff story, he speaks of an AI system called Project Artemis which will soon take over management of Terrible. Albi was going to demonstrate Project Artemis in 2015 at the Epic Innovation Centre in Christchurch; however, organiser Rebecca Tavete says he cancelled the demonstration at the last minute.

Terrible’s 2017 Annual Report, filed with Charities Services, mentions that it is “directly in control of Titan (a.k.a. Zach)”. In a second Stuff feature from August last year, Zach, the AI, is now running Terrible. So: Hannah, Project Artemis, Titan and… Zach.

I wanted to talk to the man currently working with, and training, Zach: Dr Robert Seddon-Smith. According to that NZ Doctor article, “a GP could upload a patient’s records to Zach and ask the system to pick up indications for medications or come up with a summary of care.”

Dr Seddon-Smith is a general practitioner in Christchurch and owns the Hei Hei Health Centre and Sumner Health Centre. As with many GPs, it takes a few days to get him on the phone, but when I do he’s happy to talk about Zach. We chatted for 45 minutes.

“The initial project was to see if Zach could create clinical notes, from just listening to a consultation. Which is pretty amazing when you consider no other technology can do anything like that,” he enthuses. “We start by getting patient consent. We explain this is this really weird thing we’re doing, and do they mind? And as long as at the end of the consult myself and the patient agree that the patient is non identifiable, we upload [the consultation] to the machine… and the machine returns notes.”

Seddon-Smith tells me it’s his job to “train” Zach. So when patient notes come back with errors, he feeds back tips and reference material to Zach, and Zach will learn. “It cannot just do analysis of data, it can apply judgement, according to how it’s trained.”

I ask what the interface looks like. How is he interacting with Zach? “Oh, email.”

“Email? You just email it?” I say.

“Yeah, you email it.”

I’m told that getting a response from Zach is not instantaneous. Zach can only deal with so many queries at once, and it can take 20 minutes or longer to get the information back.

Still, Seddon-Smith says that a fully-trained Zach would mean he gets to spend more time with his patients, without needing to worry about writing up notes. They would just be provided to him later in the day by Zach.

“The whole idea of the machine was it was good at natural language processing. And it uses a completely different approach to anyone else’s. Most people are using neural networks, and that is all very well – and has advantage of being portable, reproducible, and you can run it on a Windows box. But this runs on its own special hardware. It has its own custom made hardware – custom silicon – which is designed for natural language processing.

“The implications are huge aren’t they? It’s a true artificial intelligence.”

I ask how Seddon-Smith came to learn about this technology, which led to him trialling it with real patients. “When Pegasus Health got into developing HealthOne I elbowed my way in, saying they needed people who knew IT,” he tells me. “I have a profound interest in IT. I have ever since I was eight. I have been programming computers on and off since I was 11. Longer than you have been on the planet.”

He is warm, not arrogant – but definitely confident.

“Eventually Pegasus wisely formed a committee of people to oversee that project. So we are sitting there, having a chat about the features we’d like in a practice management system, when a colleague mentioned facetiously he’d like it to write patient notes for him. And this is overheard by one of the other people there – who happened to be involved with the AI. After that project had concluded, he approached me.”

That man was David Whale, a trustee of the Terrible New Zealand Charitable Trust. His son is 25-year-old Albi Whale.

“My understanding is not 100% clear, but Albi basically was the founder of the Terrible Foundation, a young man who has made millions from just being the right place at the right time,” says Seddon-Smith. “To be fair, he works quite hard to get to the right places at the right time. Somehow he invented the early technology that led to the development of the AI. His dad and I got friendly, and so ultimately we have gone into business partnership together, to see what we could do with the computer [AI] in the healthcare space.”

By this point, I had fired off emails to David Whale and Albi Whale, saying I wanted to talk to them about Zach for The Spinoff. I also friended Albi Whale on Facebook and sent him a direct message.

In the meantime, I was curious where else Zach was being pushed, beyond the Hei Hei Health Centre. A source in the healthcare system told me David and Albi Whale had “impressed many senior health professionals in New Zealand”. The Canterbury District Health Board confirmed their interest. They are “yet to meet with them, but are planning to.”

Dr John W Pickering, an associate professor at the University of Otago, is also working with Zach. “Last Friday I sent my first email to an intelligent machine called Zach. Zach is a distributed machine running on custom silicon and software,” Pickering wrote on his blog [Internet Archive link] late last year. “We aim to teach it to perform another common task, namely that of interpreting the electrocardiogram. If Zach can interpret these as well and more consistently than most physicians it could be a useful tool for the busy physician.”

I got Professor Pickering on the phone and confirmed he had a research group working on it. He rejected any scepticism that he was in the process of teaching an intelligent machine how to interpret patients’ ECGs. “I am absolutely using it and thrilled to be using it,” he told me.

With so many people seemingly impressed by this brand new AI, I’d been thinking about the way Zach communicated (over email), the way it learned, and those response times. Also all the technical talk like “custom silicon”.

I’d also heard from people who said that Zach occasionally had bad spelling. Keeping in mind everything I had learnt about Albi and David Whale, I began wondering:

What if there is no AI?

What if – keeping in mind Occam’s razor – everyone was just talking to… a boring old human?

After all, Albi’s Hannah Group co-founder, Chaley Warne, had seemed a little uncertain in the Stuff article: “It could have been a person somewhere. It was hard to tell.”

“This is just Tommy Wiseau-level baffling,” one healthcare insider emailed me, likening the whole thing to the mystery behind the enigmatic creator of The Room.

The more I read about David Whale, Albi Whale and the Terrible Foundation, the more terrible I felt. But maybe my suspicion that Zach was just a person was wrong. Maybe Zach was a complex AI, a world first.

I decided to run a random sample of claims about Zach past some experts in the field of AI. All expressed a combination of intrigue and deep suspicion.

“I had a good chuckle reading through this,” said one. “No idea how you manage to find stuff like this in New Zealand – or maybe because it is such a small place, it allows such things to happen.”

Another expert in AI, the managing director of a Fortune Global 500 company in Australia, agreed to critique some of what I’d learned about Zach.

POINT 1: Response times of 20 minutes or more.

CRITIQUE: AI’s ability to process responses is near instantaneous (at least as far as we are concerned – Watson provides complex analysis in about 30 secs for medical recommendations per query). The notes should be within minutes as soon as the audio clip is provided – and regardless of time of day and number of concurrent requests.

POINT 2: Zach communicates via email.

CRITIQUE: This is just silly. Communicating over emails just doesn’t make sense. Machines have to be taught Natural Language Processing (NLP). By default it will have to take a bunch of parameters via an API (interface – with a defined format – command + <parameters>).

POINT 3: Zach has a limited number of requests it can process at a time.

CRITIQUE: AI would never be constrained to such a small number of concurrent threads.
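
To make the second critique concrete: a production natural language system is normally exposed as a programmatic API – structured parameters in, structured results out – rather than a shared email inbox. The sketch below is purely illustrative; the endpoint, field names and client code are hypothetical inventions for this article, and nothing of the sort has been published for Zach.

    # Purely illustrative sketch: how a clinical note-generation service would
    # typically be called if it were exposed as an API. The endpoint URL and
    # field names are hypothetical - nothing like this is documented for Zach.
    import requests

    API_URL = "https://api.example-nlp.test/v1/clinical-notes"  # hypothetical endpoint

    def request_clinical_notes(transcript: str, api_key: str) -> dict:
        """Send a de-identified consultation transcript, receive structured notes."""
        response = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {api_key}"},
            json={
                "transcript": transcript,      # the consultation text
                "task": "clinical_summary",    # what the caller wants back
            },
            timeout=30,  # a real service answers in seconds, not 20+ minutes
        )
        response.raise_for_status()
        return response.json()  # e.g. {"summary": "...", "medications": [...]}

    # Example call (would need a real service and key):
    # notes = request_clinical_notes("Patient reports three days of chest pain...", api_key="...")

That kind of defined contract – a known address, a known format, a predictable response time – is what the expert found missing in an arrangement built on emailing “Zach” and waiting 20 minutes or more.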

I also talked to Dr Ian Watson, associate professor in computer science at the University of Auckland. His research tends to focus on artificial intelligence, machine learning and knowledge-based systems.

“Well, myself and colleagues who lead Amazon’s Alexa voice recognition programme and IBM’s Watson programme have never ever heard of David Whale, or Zach, or Terrible within the AI community,” he says.

“I’d say with some confidence that this is someone trying to jump on the AI Machine Learning bandwagon. It’s unlikely Whale could have solved a problem that large teams of researchers at IBM, Amazon, Google, Apple, Microsoft, and in the top computer science departments around the world are still working on – and, mostly, sharing their results in the public domain.”

However, the medical professionals I spoke to who have been granted access to Zach – Dr Robert Seddon-Smith and Associate Professor Pickering – appear entirely satisfied that they’re emailing back and forth with an advanced AI, spelling mistakes and all.

When I told Pickering that it had crossed my mind he might be emailing back and forth with a person, he responded by asking if I was a conspiracy theorist. I told him I wasn’t.

And the field of application, after all, isn’t just anything – it’s health. Surely the bar for authentication is higher when health professionals are interacting with a system so mysterious they don’t even know where in the world it is located, and asking that mysterious system not just to write up patient notes, but to interpret those notes.

“The board didn’t originally believe it could be used in the healthcare space. I have had to demonstrate that it can,” says Seddon-Smith, who estimates a quarter of his time is spent dealing with Zach.

“So the board is on board now?” I ask. “Oh yeah. They’re on board now. They’re supporting it. They’ve granted us – I can’t say how much – but millions of dollars worth of access to the system. To develop this. It’s not cheap. This isn’t something you can run on a Raspberry Pi. It’s several hundred tonnes of liquid nitrogen cooled supercomputer.”

Custom Silicon. Several hundred tonnes of liquid nitrogen cooled supercomputer. Zach is a colossus.

As I talked to Seddon-Smith, I wondered what his role was in all this. An enthusiastic helper or a core part of Team Terrible? We do know, at least, that he is a trustee of the Terrible New Zealand Charitable Trust.

He continued: “What I am able to tell you – at the moment we are using the computer that is overseas. There is no instances of it over here. We are importing a set of the hardware to New Zealand to form part of a Discovery Centre. I am allowed to tell you that the firewall that we are installing consists of multiple Cray XC50 supercomputers.”

He pauses.

“Multiple XC50s. Not one. Multiple.”

The XC50 is one of the most powerful computers on the planet. NIWA recently spent $18 million to get two of them. Oh, and according to Seddon-Smith, Zach is being trained as a legal advisor, too.

Rereading that Stuff article, it becomes clear how strikingly unquestioning it is about some extraordinary, even bizarre claims. The same goes for the NZ Doctor piece.

As for the man supposedly behind all this, I was unable to talk to Albi Whale. He never responded to my requests for comment, whether on email or via Facebook.

His father, David, did.

Luckily for me, David Whale agreed to answer some questions over email.

As to who made the AI, he wrote: “[The] team is multinational but beyond that I cannot say – security reasons.”

As to where Zach is being trialled: “I am not at liberty to answer the first part of your question – NDAs are in effect. Rob [Seddon-Smith] has access and is training the system but is not using it in a production sense. The Foundation has decided to launch Zach’s public services from Christchurch, New Zealand. There will be announcements around these in the coming months.”

And as to whether Terrible is seeking any funding or investment, Whale confirmed they will be offering shares:

“Yes. There will be a share offer for Omega Health targeted at small investors. Discussions with the new administration are at an early stage. We will approach the Christchurch City Council with respect to our proposed new Centre in the central city. This centre will be an educational resource, a place to interact directly with Zach, incubator, commercial leases etc. It will be one of the largest computing resources in the world. Available in 2019. It will attract huge interest internationally.”

This whole thing has left my head spinning.

David Whale finished his first email with a request that I remember they are “first and foremost a charity. I would appreciate a chance to comment on anything you write before publication. I have been as free as I am able with information and would not like to see any misrepresentation.”

He signed off with “I presume your research has uncovered the following links”, before listing all the places Zach has appeared in the media. The links are all here (my descriptions; his links).

From what I can tell, those links are literally everything that exists about Terrible, Zach, David and Albi Whale. Read it all, please, and let me know what you learn. I have read it all, I’ve sought to speak to everyone involved, and what of real substance have I found out about this thing?

A massive bunch of nothing.

At one point David Whale tells me, “There is a scientific paper, being reviewed, that describes the natural language processing capabilities in the medical arena.”

But alas, there’s nothing we can look at: “No, there is nothing in the literature about this at the moment.”

I want to be convinced it’s real. I’ve asked repeatedly for something to help me believe. The invitation remains open.

In frustration and desperation, I wrote back seeking some clarity from David Whale. I wanted clarity about something. Anything. I also asked again to speak to his son. That request was ignored. For full transparency, our emails can be read in full here. Towards the end of our email chain, David Whale simply opened with: “As AC Clark said ‘Any sufficiently advanced technology is indistinguishable from magic’.”

In his final email to me, responding to alternative explanations for the whole enterprise, he wrote: “If I were to follow your argument then we have duped many people.”

Whale had had enough of our correspondence. “I see little point in pursuing this conversation,” he said. “You cannot persuade those who choose not to believe.”

For me, this whole story was nicely summed up at one point during my conversation with Seddon-Smith, as he mused on what it’s like to work with a powerful AI system:

“You give it a ‘what’ or a ‘why’ and you will get a really good response. It’s human level interaction.”

I tell Seddon-Smith it’s almost like he’s dealing with a human.

“Sometimes it’s very hard to believe you’re not.”

Part two:

The mystery of Zach the miracle AI, continued: it all just gets Terribler

Additional background research for this piece by Dylan Reeve.

If you want to contact David Farrier about this story, you can email him on david@davidfarrier.com


This story was made possible by reader contributions to The Spinoff Members, which supports our investigative journalism. Read more here – or click here to donate.
