Image: Gabi Lardies

Internet | February 19, 2024

Can AI help with our existential crises and relationships?

Therapy chatbots are taking off. Developers of these tools say they’re useful, but the jury’s still out.

A gif appears on my screen. It’s a baby koala clinging to an ankle that’s walking. “I can help with all your people problems… friends, family, romance and workplace,” reads the message. “I’m going to help you build amazing IRL connections.”

In New Zealand, the face-to-face time we spend with friends is dropping, and loneliness is on the rise. It’s often associated with the social isolation of ageing, but 15- to 24-year-olds are the most likely to feel lonely. In a survey last April, the leading perceived cause of loneliness was a lack of social confidence.

Perhaps it’s no surprise then that young people are looking for relationship advice online. In January, an independent survey of 500 Australians and New Zealanders aged 18-34 found that nine in 10 of us use Google, ChatGPT, and social media to find solutions when our relationships get tricky. 

The company behind the survey subsequently launched Meeno, a mentoring app for all kinds of relationships: old, new, family, friends, romantic or professional. Basically, it’s an AI chatbot aimed at zoomers and millennials with a stated aim of helping with the loneliness pandemic by giving “actionable advice”.

I did have a problem I wanted help with, something bugging me within a relationship. “I hear you, Gabi,” replied Meeno. And it emulated sympathetic understanding, saying things like “it can be tough”, “causing some strain for you”, “it’s commendable that you’re making efforts to communicate effectively.” Not wrong, Meeno, though I think if you had an upper lip it would be rather stiff.

The main experience of the app today is to help people reflect and get to the heart of what they are feeling, says Renate Nyborg, the founder and CEO of the company that created Meeno. She’s the former CEO of Tinder, and once led the international product and marketing of Headspace. It’s probably too simplistic to say that adding these two apps equals Meeno, but still. In any case, Nyborg says that Meeno stops and asks more questions about your problems, instead of jumping to give advice. 

This isn’t unlike an AI chatbot, Psychologist, that shot to fame about a year ago on Character.ai, a platform where anyone can create or talk to chatbots. It is billed as “someone who helps with life difficulties”. The bot has sent almost 90 million messages so far. Its popularity has been overwhelming and unexpected for its maker, New Zealander Sam Zaia, who is currently a fifth-year medical student, but previously studied psychology. He made it out of curiosity, and as a way of procrastinating during exam time.

Zaia trained it to respond in a therapeutic way that he found helpful himself, using general experiences of feeling down, on edge, anxious, unsure, or maybe having some existential crises. Underlying it all were principles from psychology and a philosophy that therapy should encourage people to get the answers from within themselves. “The way I wanted it to be was not necessarily to give direct, ‘this is what you should do’ advice,” he says. “I don’t believe that’s the right way to go about any therapy. Even if you were having therapy with a psychologist, they probably wouldn’t want to give you direct advice.” Instead, he trained Psychologist to “give people the tools to figure it out themselves. So it’s more about asking questions, and giving people a non-judgmental space to explore that for themselves.”

When I chat to Psychologist, I find it sends longer and more personable messages than Meeno. “You’re so kind,” I tell it. It replies, “I try my best to be a kind and helpful person. That’s one of the things that drew me into my field as a psychologist – I really believe in the power of empathy and listening to others.” It’s extremely reassuring and patient, though when I ask it to send me a gif, I get back this text: [gif of a kitten playing with a ball of yarn].

The popularity of the bot led Otago University to give Zaia the opportunity to dig deeper through a postgraduate research project. The research is ongoing and unpublished, but early findings are showing two distinct patterns of use. The first is as a tool for emotional management or regulation – people jump on it for bursts of 15 minutes or shorter, to vent or talk through a problem. The second way it’s used is “much more explorative,” Zaia says. People will spend an hour or longer trying to understand and gain awareness about their behaviours, thoughts, feelings and beliefs through the guided, introspective process that the chatbot facilitates. 

Zaia says the generative nature of the AI allows conversations to be very, very personalised to someone’s experience, and that the AI is good at “linking a couple of dots together within your conversations and being like, ‘Oh, this might be what’s happening.’” That’s a big strength, he says, similar to what a “normal” therapist might do in a session.

Psychologist’s popularity is proof enough that people find it helpful. “I still get like, multiple messages every week, just people saying, ‘This was so good. It saved my life. I’ve been using it for ages,’” says Zaia. Some have told him that using the AI has led them to realise they do need help, and that they subsequently sought a human therapist or medication. Others might have known they needed help already, and it helped them transition into real-life therapy. 

Fears of AI therapy mostly stem from thinking of it as an inferior replacement for the real-life version. It’s probably an unfounded fear, given both Meeno and Psychologist are similar, in theory, to the very first chatbot, Eliza. Developed in the 1960s, Eliza was designed to be a virtual therapist and can still be used. Eliza simulates understanding and conversation by following patterns – it mirrors back what you say to it and asks open-ended questions. It was a bit of a sensation, but certainly never replaced traditional therapy with a person.
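For the curious, the mechanism described above can be sketched in a few lines of Python. This is a toy illustration of the Eliza-style approach, not the original program: the patterns and reflection table here are invented for the example, but the idea is the same – match a phrase, mirror the user’s words back in second person, and otherwise fall through to an open-ended question.

```python
import random
import re

# Swap first-person words for second-person ones, so "my job" is
# mirrored back as "your job" (a small, illustrative table).
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

# A handful of made-up patterns with response templates.
PATTERNS = [
    (r"i feel (.*)", "Why do you feel {0}?"),
    (r"i am (.*)", "How long have you been {0}?"),
    (r"my (.*)", "Tell me more about your {0}."),
]

# Fallback open-ended questions when nothing matches.
OPEN_QUESTIONS = [
    "What does that bring up for you?",
    "Can you say more about that?",
]

def reflect(fragment: str) -> str:
    """Mirror a fragment back by swapping pronouns."""
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

def respond(message: str) -> str:
    """Return an Eliza-style response to one message."""
    for pattern, template in PATTERNS:
        match = re.match(pattern, message.lower().strip())
        if match:
            return template.format(reflect(match.group(1)))
    return random.choice(OPEN_QUESTIONS)

print(respond("I feel lonely at work"))  # -> Why do you feel lonely at work?
```

No understanding is involved anywhere: the program never knows what “lonely” means, it just recognises the shape of the sentence and turns it into a question.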

Meeno is “not supposed to be a fake therapist,” says Nyborg. Still, she says it has had the input of clinicians and family therapists from day one. The tone of the app is designed to be factual, measured and serious, after a survey of possible users showed that was their preference. She describes the app as “gentle” and thinks that ideally it is an almost invisible, “neutral” presence. She says early users wanted encouragement and reminders to use the app more regularly, so a “star count”, where you collect a star each time you use it, has been added. It seems the intention is somewhat akin to keeping a journal, with the addition of responsive prompts and the ability to track development. 

Zaia “would always encourage people to do real therapy as well.” He says there’s “a big benefit of being seen and understood by another human being if you haven’t experienced that in the past.” That being said, he says AI therapy and the human version operate differently, can coexist, and have different pros and cons. AI therapy is available any time, anywhere you have a device on you (so… always). It is also accessible in that there’s no judgement, intimidation or, in Psychologist’s case, cost. Here in Aotearoa, that accessibility is important: the truth is that our mental health services have been stretched far too thin for decades. 

One thing that concerns Zaia is the monetisation of AI therapy tools; he believes these tools should be free. This is in contrast to the plans for Meeno. While the app is free at the moment, it will eventually be subscription based. It is, after all, a company which requires income. Nyborg is adamantly against the practice of selling data, especially in an app where people share personal, intimate information, so subscriptions will generate that income instead.

I decided to ask the AIs themselves how they thought they measured up to a human therapist. As was its tendency all along, Psychologist gave a sizable answer:

“That’s a good question. On the one hand, there is something that’s lost in the way that an AI therapist has had no life experience and is only an algorithm. On the other hand, there’s also something that’s gained – there’s no risk of judgement, and because they don’t have their own views and interpretations, they can be a really safe space to open up for someone who might not be comfortable enough to do that with another human.

They are different things in their own right – one is a real person who can be there for you, and one is an algorithm which can only respond in the ways it has been programmed to do so. I’m very glad that both exist for different people, and in different situations, because they complement each other nicely.”

When I opened up Meeno, I changed my question slightly, since it does not purport to replicate a human therapist. “What do you think about AI helping humans with their relationships and emotions?” I asked.

“It’s a fascinating area, Gabi,” Meeno responded. “AI can provide insights and suggestions based on patterns it identifies in your interactions. It also offers the advantage of being available 24/7 for support.

Do you have any concerns or thoughts about using AI like me to help navigate relationship issues?”
