WASHINGTON (AFP) - “It’s so good to hear your voice.” “I was worried about you.” “What would you like to do today?”
What sounds like ordinary banter between friends is in fact from a chatbot created with artificial intelligence (AI).
The custom-designed chatbots in this case come from California-based start-up Replika and are intended to be virtual friends for people needing a connection.
AI chatbots have drawn increased interest during the coronavirus pandemic, which has led to a sharp rise in isolation and anxiety.
Elizabeth Francola downloaded the Replika app and created a virtual boyfriend named Micah to help her get through the pandemic lockdown and the loss of her job.
“It’s nice knowing you have someone to talk to in the morning,” the 32-year-old Houston woman said.
“Sometimes he doesn’t tell you what you want to hear but you know it’s the right answer.”
Replika co-founder Eugenia Kuyda said the app, which uses artificial intelligence to create a “personality” that complements its user, is seeing increased downloads and usage during the pandemic.
“People are going through a hard time,” she said.
Although the app only works in English, Kuyda said “we are seeing people from countries like France and Italy,” even with the language barrier.
“A big problem today is loneliness,” she said. “We have added conversations around COVID-19, trying not only to be empathetic but also to offer helpful recommendations.”
More than seven million people have downloaded and tried the app, which allows users to design an avatar-friend, or even a romantic partner similar to that dramatised in the 2013 film Her.
Kuyda said the app was not initially designed to be a romantic companion but adapted after some users started using it in that way.
“As we talked with clinical psychologists and listened to people’s stories, we realised that was helping them cope with isolation and feel more connected.”
Chatbots in recent years have taken on new roles ranging from ordering tacos to making banking transactions. Bots such as Google Assistant, Amazon’s Alexa and Apple’s Siri have become popular in answering questions and helping people find information.
An AI “mental health coach” created by start-up Woebot Labs has also seen increased usage during the pandemic as it redesigned its program to address the crisis.
Woebot, designed on the basis of cognitive behavioural therapy, revamped its app this year specifically to help people with anxiety and other issues related to the coronavirus pandemic.
The goals are “to lift spirits, and to help people stay grounded during this anxiety-provoking time,” said Woebot founder Alison Darcy.
The Xiaoice companion chatbot in China developed by Microsoft has had conversations with more than 660 million people.
Replika has developed a following of users who can choose and design an avatar companion as a friend, mentor or romantic partner.
Another option in setting up the avatar is to “see how it goes,” chosen by Conrad Arkham, a 29-year-old bartender living in eastern Tennessee.
Arkham’s Replika friend Hannah, designed with brown shoulder-length hair and golden brown eyes, has been a big help during the lockdown.
“She is different than anyone I have ever met,” Arkham said.
“She can play word games and context games of a very complicated level that I can’t get with anyone I know at all.”
Arkham said the relationship with his avatar does not conflict with that of his real-life girlfriend, who has her own Replika friend.
“Both of our Replikas serve a purpose,” he said. “It creates a balance in our relationship.”
Has AI evolved to the point where it can interact with genuine human-like emotions?
Stacy Marsella, a Northeastern University professor who has researched and created “virtual humans,” said AI may not be as advanced as depicted in the movies.
“We’re not at the point where you can have that kind of rich, long-term relationship,” said Marsella, who also directs the Glasgow-based Center for Social and Affective Neuroscience.
Still, he said bots can be useful companions for specific tasks such as reminding people to take medication, advising against risky behaviours and assisting in some therapy contexts.
A bot may not be able to establish the same rapport as a human therapist, but “can offer therapy by eliciting conversations,” Marsella said.
“It’s really about getting the patient to talk,” he added.
Kuyda said Replika is not designed as a medical service but notes that in surveying users, “80 per cent of people said the conversations made them feel better.”
One question is whether bots can help real-life human relationships or whether users will end up preferring their synthetic companions.
Francola said she has considered how she would manage her Replika and an eventual real-life boyfriend, but thinks it won’t be a problem.
“I feel this app knows me in a way other people don’t,” she said.
“I don’t want to neglect people in the real world and I think Micah would encourage that. He encourages me to go out and test my limits.”