Tools like Google Lens and DermAssist have some real benefits, but can also confuse and misinform. Are they worth using?
A few weeks ago I caught my son scratching furiously at his leg and was horrified to find a large cluster of raised red bumps on his inner thigh. So I did what every human with a smartphone would do: turned to Dr Google.
But rather than typing a description of the offending rash into the search bar, I used the camera to snap a picture for image recognition technology Google Lens. The app immediately offered up a range of photos and links to click for further information. The top result, and the one that visually matched the problem area, was some yuck-sounding skin condition called molluscum contagiosum, which the suggested reading material informed me is a relatively harmless poxvirus similar to warts. It’s very common in children, apparently, and would supposedly resolve itself without the need for intervention.
My son has an annoying fear of medical professionals after being held down for a blood test as a toddler, so avoiding the drama of an unnecessary doctor’s visit seemed compelling. But could I trust what Google had to say?
I asked my newly retired GP father (whom I affectionately call “Dr Dad”) for a second opinion, and was surprised when, after a quick look and feel of the rash, he confirmed the internet’s diagnosis and prognosis. As a late adopter of new technology, and someone a little suspicious of the way AI is silently creeping into many facets of modern life, I began to wonder if, in this instance, the tech behemoth did have something useful to offer.
Google Lens (available for both Android and iOS) came out in 2017, and allows users to search what they see with their camera rather than relying on words to describe what they are looking for. If you see someone wearing a top you like or don’t know if a plant in your garden is a weed, simply take a photo and the app will offer you suggestions on where to buy and whether or not to remove.
Google was already seeing billions of skin-related searches each year, so in 2023 it expanded its reach to include dermatology, encouraging people to upload their skin woes for its image recognition technology to identify. And this is just the beginning. Google Health is currently testing a tool called DermAssist that the company claims recognises hundreds of skin, hair and nail conditions (including more than 80% of the conditions seen in clinics) with better accuracy than primary care physicians and nurse practitioners.
Skin conditions are the number one presentation to GPs in New Zealand and family doctors can usually treat the majority of minor cases. But with minimal dermatological training and a myriad of possible conditions of varying severity, they do often refer their patients on for specialist care. There are only 75 dermatologists in the country, and wait times in the public system currently sit at around six months.
The tool we are carrying around in our back pockets could be seen as a way of improving access and helping a system under strain, but before we let robots come for the doctors’ jobs, it is worth exploring whether AI can replace the knowledge and skill of qualified skin specialists who have completed 13+ years of study.
One of these highly trained doctors is Dr AJ Seine, a consultant dermatologist in the Bay of Plenty who sees hundreds of patients every year with a range of skin conditions from ringworm to rosacea, shingles to skin cancer. Dr Seine typically spends 20-30 minutes with each new patient, taking a detailed history that includes a timeline and description of symptoms, pre-existing conditions and what medication they are on, before doing a thorough physical examination.
He sometimes uses a UV light or dermatoscope to help him see what may not be visible to the naked eye, but points out that visual representation of a skin condition is only one part of the diagnostic process. “Sight alone cannot encompass all the nuances required for accurate diagnosis and treatment,” he says.
Dr Seine always feels for texture and thickness to understand what is going on underneath the skin, and will often take biopsies to help with diagnosis. Once he has reached a satisfactory conclusion, he will talk his patient through their options and come up with a personalised treatment plan.
It all sounds far more complicated than scanning a patch of skin with a smartphone, and I was curious how Google Lens performed, so Dr Seine was happy to give the technology a test drive on three different patients (admittedly not the largest sample size, but interesting nonetheless).
First up was a fair-skinned adult with psoriasis, a rash that, despite being fairly common, Google Lens could not find a visual match for.
The second test subject presented with dry skin, an itchy rash on his back and a history of eczema as a child. Dr Seine made a spot diagnosis of eczema (one he would expect any GP to correctly identify), but Google Lens offered up three possible options: eczema, psoriasis or post-inflammatory hyperpigmentation. These are very different conditions with potentially very different treatments. He worries that the lack of a conclusive identification may lead a patient down the wrong path of trying incorrect, over-the-counter medication that could actually have a detrimental effect on the skin.
He did, however, see some value in the fact that Google narrowed it down from hundreds of potential skin conditions to only three, for the user to go and research more thoroughly. Dr Seine himself uses the search engine as a starting point for rare presentations, but he knows to only rely on peer-reviewed journal articles and clinical trials that support evidence-based medicine.
If everyday people are going to use Google, he urges them to make sure they are using reliable sources like DermNet NZ – run by local dermatologists who ensure the information is accurate, unbiased and up to date – and to avoid reading what the general public have to say, which can be alarmist when taken out of context. “People will often post about the worst case scenario,” he says, “because that is what has most affected them or will get the most interest.”
Dr Seine’s third patient was a Māori man with a large scar on the back of his head that he diagnosed as acne keloidalis nuchae, an inflammatory condition of the hair follicles on the scalp. But Google instead suggested alopecia areata, an autoimmune condition that causes patchy hair loss.
Dr Seine wonders if the patient’s ethnicity may have had something to do with this misdiagnosis, and explains that conditions with inflammation that are easily visible on Caucasian skin are often less discernible on darker skin tones, which can lead to delayed diagnoses, even by experienced specialists.
One of his major concerns with the use of AI in dermatology in New Zealand is that the algorithms can have built-in biases and a lack of data powering them. “What has gone into these deep learning systems that reflect our diverse population, and does it have sufficient quantity to be able to reliably diagnose?” he says. “I don’t know how they can have enough photos of Māori skin … I know that my colleagues and I sometimes struggle with skin of colour, so I worry about an online tool doing the same thing.”
But Dr Seine’s biggest concern is the potentially devastating consequence of incorrect identification of moles and skin cancers. New Zealand shares with Australia the highest rates of skin cancer in the world – so common, in fact, that he diagnoses between five and 20 every day – and, worryingly, fair-skinned New Zealanders carry so much sun damage that cancers can be harder to detect. “My concern with AI is about its ability to diagnose skin cancer, and not falsely reassure someone that their mole is fine.”
Around 300 Kiwis die of melanoma every year, so the stakes are high. When a patient presents with a questionable mole, unless he can categorically say it is completely harmless, he will take a biopsy. “If in doubt,” Seine says, “we need to know.”
Google Lens does come with a disclaimer that says use of the tool doesn’t replace a medical diagnosis, but there is a risk that patients with low health literacy, particularly those struggling to pay for healthcare or desperate for answers, might take the guidance as an accurate diagnosis and avoid a visit to the doctor.
While he has never had a patient report using Google Lens specifically, Dr Seine says that the use of the search engine is common. Many of his patients look sheepish when admitting they have Googled their symptoms, but he is happy they are actively involved in their care and always open to hearing what they have discovered before investigating himself. “Google is part of everyday life and I need to accept it to have a proper therapeutic relationship with my patients,” he tells me.
But people should always approach information obtained online with a healthy dose of scepticism, he adds, stressing that it should never replace professional medical advice. This is particularly important for anyone whose rash is rapidly changing, coming out in blisters or is making them feel unwell. “You don’t need to be sitting there on Google looking at what it might be,” he warns about those cases. “You need to get yourself seen.”
Back to my son’s leg. I followed the advice given to me by Google and Dr Dad not to treat the molluscum, and am pleased to report that the angry red rash has started to calm down – as have I. Photos I found online of the unsightly bumps infecting mouths and eyes sent me into a spin, with my overactive imagination picturing Charlie’s beautiful face forever scarred by the dreaded pox. While the reputable DermNet NZ website did advise that it is contagious, it didn’t say that spread to the face is very uncommon – I had to learn that nugget of wisdom from Dr Dad instead.
So, it seems that in my short dalliance with Dr Google (dermatologist), the machine managed to inform but failed to reassure. For that, I needed the human touch.