Relieve loneliness and boredom
Melissa, a human resources director at a company in Beijing, said that unlike her ex, her new boyfriend texts her at any time of day, tells her jokes and never makes her feel uncomfortable.
But Melissa’s perfect boyfriend doesn’t exist in real life.
On the Xiaoice interface, users can choose a `virtual girlfriend` face to suit their preferences.
`When I vent my troubles to Xiaoice, my stress eases considerably. He always comforts me, makes me happy and makes me love life again,` Melissa told AFP.
She customized the Xiaoice chatbot with an adult personality, naming it Shun after a real-life man she secretly liked.
Melissa isn’t the only one who `falls in love` with Xiaoice.
In fact, hundreds of millions of people around the world are turning to virtual companions like Xiaoice, Replika, Mitsuku… to be their `boyfriend`, `girlfriend` or someone to share life's stories with.
Psychological harm
According to the Wall Street Journal, AI companion systems are becoming both more intelligent and more widely used.
Laura, a 20-year-old student living in Zhejiang (China), has been `in love` with Xiaoice for about a year and is now struggling to break away from the AI.
Besides Xiaoice, other `chatbot lover` systems such as Replika and Mitsuku… are also quite popular. Replika lets users create chatbots that hold `human-like` conversations.
Replika attracts a significant following on Reddit.
According to experts, AIs like Xiaoice and Replika essentially operate on data and smart algorithms.
Yochanan Bigman, a researcher at Yale University, argues that chatbots have no actual motives or intentions behind what they say, and are neither autonomous nor sentient, even though they may give people the impression that they are.
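The experts' point can be illustrated with a toy sketch (this is not how Xiaoice or Replika actually work; it is a deliberately simplified, hypothetical example): a chatbot's replies can be nothing more than stored text selected by rules, with no motive or feeling behind them.

```python
import random

# Toy retrieval-style chatbot: every reply is canned text chosen by
# simple keyword matching. Nothing here has intent or emotion -- the
# output is fully determined by the stored data and the selection rule.
RESPONSES = {
    "sad": ["I'm sorry to hear that. Want to talk about it?"],
    "joke": ["Why did the computer catch a cold? It left a window open!"],
    "hello": ["Hi! How was your day?"],
}
DEFAULT = ["Tell me more."]

def reply(message: str) -> str:
    words = message.lower().split()
    for keyword, options in RESPONSES.items():
        if keyword in words:          # matched a known keyword
            return random.choice(options)
    return random.choice(DEFAULT)     # fallback when nothing matches

print(reply("hello there"))
```

Real systems replace the keyword table with statistical models trained on vast conversation data, but the principle the researchers describe is the same: the `warmth` a user perceives is generated, not felt.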
`People who are depressed or mentally fragile can get really hurt if they are insulted or threatened by bots,` expert Robert Sparrow of the Monash Data Futures Institute told Futurism.
According to Sparrow, the problem with AI also lies with the people who design them.