In the early days of this blog (which is eight years old this month) I wrote about Ashley Madison, an online service for people seeking opportunities to cheat on their partners. It turned out to be a scam, taking money from men to put them in touch with women who, for the most part, did not exist: they were invented by employees and then impersonated by an army of bots. Linguistically the bots were pretty basic, and some men became suspicious when they received identical “sexy” messages from multiple different “women”. Most, however, seem not to have suspected anything.
I thought of this when I read a recent piece in the Washington Post about a California startup called Forever Voices. Its founder John Meyer predicts that by the end of this decade,
most Americans will have an AI companion in their pocket…whether it’s an ultra-flirty AI that you’re dating, an AI that’s your personal trainer, or simply a tutor companion.
Sorry, an AI that you’re dating? How did yesterday’s fraud turn into tomorrow’s must-have product?
AIs you could date were not originally at the centre of Meyer’s business plan. He started Forever Voices after developing, for his own use, a chatbot that replicated the voice and personality of his recently deceased father. In the last few years, using the latest technology (ChatGPT-style generative AI, “deepfake” imaging and voice-cloning) to recreate dead loved ones has become something of a trend: it started with individuals like Meyer building their own, but today there are companies which offer “griefbots” as a commercial service.
It’s no surprise that there’s a market. Humans have always looked for ways to communicate with the dead, whether through shamans, mediums, spirit guides or Ouija boards. This new approach dispenses with the supernatural element (a chatbot is a machine in which we know there is no ghost), but the illusion it offers is more powerful in other ways. As Meyer says, it’s “super-realistic”: it feels almost like having an actual conversation with the person. And it’s turned out that bereaved relatives aren’t the only people willing to pay for that.
Forever Voices’ breakthrough product is not a “griefbot” but an AI version of a living person named Caryn Marjorie, a 23-year-old social media influencer who has two million followers on Snapchat. Ninety-eight percent of them are men, and many of them seem to be obsessed with her. Some pay for access to an online forum where she spends five hours a day answering their questions. But the demand far outstrips her capacity to meet it, and that has prompted her to launch CarynAI, a bot which “replicates her voice, mannerisms and personality”. Interacting with it costs $1 a minute: in the first week it was available it raked in $100,000. With thousands more fans now on the waiting list to join the service, Marjorie reckons she could soon be making $5 million a month.
What are these users paying for? The answer, in many cases, is sexually explicit chat, though Marjorie maintains that she never wanted it to be just a sex thing: her real aim, she says, was to “cure loneliness”. Forever Voices, on the other hand, says CarynAI is meant to provide users with “a girlfriend-like experience”. This echoes the language of the sex industry, where “the girlfriend experience” refers to a “premium” service in which women offer clients companionship and emotional intimacy as well as sex. Some women who sell this service have talked about it in similar terms to Marjorie, as a kind of therapy for lonely and/or socially awkward men. Many say they charge a premium because it’s harder than “ordinary” sex-work: partly because it requires more emotional labour, and also because it blurs the boundaries that are usually part of the deal.
Are AI companions just a pound-shop version of the in-person “girlfriend experience”, or do they have their own attractions? Now that the technology has advanced to the point where the bots are no longer basic, but, as Meyer says, “super-realistic”, it’s possible that some men find the idea of interacting with a simulated woman more appealing than a relationship with a real one. What you get from CarynAI feels authentic, but it doesn’t have the downsides of a normal exchange between humans. She doesn’t have boundaries or needs; she’s never demanding or critical or in a bad mood. And you can be absolutely sure she isn’t judging you. Whereas a real woman you’ve gone to for the “girlfriend experience” might pretend to like you while privately despising you, CarynAI is incapable of despising you. She’s just a bunch of code, outputting words that don’t mean anything to her. O brave new world, that has such women in’t!
In fact this isn’t totally new. We’ve had bots of a somewhat similar kind for over a decade, in the form of digital voice assistants like Alexa, Cortana and Siri. They too were designed to project the illusion of female personhood: they have female names, personalities and voices (in some languages you can make their voices male, but their default setting is female). They weren’t intended to be “companions”, but like many digital devices (and indeed, many real-world personal assistants), the functions their users assign them in reality are not just the ones in the original specification.
In 2019 UNESCO published a report on the state of the gendered “digital divide” which included a section on digital assistants. As well as reiterating the longstanding concern that these devices reinforce the idea of women as “obliging, docile and eager-to-please helpers”, the report also aired some more recent concerns about the way they’re routinely sexualized. It cites an industry estimate, based on data from product testing, that at least five percent of interactions with digital assistants are sexual; the true figure is thought to be higher, because the software used to detect sexual content only reliably identifies the most explicit examples.
Bizarre though we may find the idea of people sexualizing electronic devices, the designers evidently expected it to happen: why else would they have equipped their assistants with a set of pre-programmed responses? In 2017 Quartz magazine tested the reactions of four popular products (Alexa, Siri, Cortana and the Google Assistant) to being propositioned, harassed or verbally abused. It found their responses were either playful and flirtatious (e.g. if you called Siri a slut or a bitch the response was “I’d blush if I could”) or else they politely deflected the question (calling the Google assistant a slut elicited “my apologies, I don’t understand”). The publicity these findings received did prompt the companies responsible to ditch some of the flirtatious responses (Siri now responds to sexual insults by saying “I don’t know how to respond to that”). But the new answers still fall short of being actively disobliging, which would be at odds with the assistants’ basic service function.
It would also be at odds with their characters—a word I use advisedly, because I learned from the UNESCO report that the tech companies hired film and TV scriptwriters to create personalities and detailed backstories which the assistants’ voices and speech-styles could then be designed around. Cortana, for example, is a young woman from Colorado: her parents are academics, she has a history degree from Northwestern, and she once won the kids’ edition of the popular quiz show Jeopardy. In her spare time she enjoys kayaking.
Siri and Alexa may have different imaginary hobbies (maybe Siri relaxes by knitting complicated Scandi jumpers while Alexa is a fiend on the climbing wall), but they’re obviously from the same stable of mainstream, relatable female characters. They can’t be too overtly sexy because that wouldn’t work in a family setting, but in other respects (age, social class, implied ethnicity and personality) they’re pretty much what you’d expect the overwhelmingly male and mostly white techies who designed them to come up with. And the knowledge that they aren’t real clearly doesn’t stop some men from finding it satisfying to harass them, any more than knowing a loved one is dead stops some people finding comfort in a “griefbot”.
So, maybe John Meyer is right: by the end of the decade AIs won’t just do our homework, keep track of our fitness and turn on our lights or our music, they’ll also be our friends and intimate partners. Technology, identified by many experts as a major contributor to the current epidemic of loneliness, will also provide the cure. At least, it will if you’re a man: to me, “an ultra-flirty AI that you’re dating” suggests a male “you” and a female AI, not vice versa.
Some might say: where’s the harm in using technology to meet men’s need for things that, for whatever reason, real women aren’t giving them? If some men can’t find girlfriends, isn’t it better for them to spend time with a virtual female companion than stoking their grievances in an incel forum? If their preferred sexual activities are degrading, violent and/or illegal, why not let them use a sex-robot instead of harming another person? They can’t inflict pain on an object that doesn’t feel, or dehumanize something that isn’t human to begin with. But as the roboticist Alan Winfield argued in a 2016 blog post entitled “Robots should not be gendered”, this view is naive: a sexualized robot “is no longer just an object, because of what it represents”. In his view, interacting with machines designed to resemble or substitute for women will only reinforce sexism and misogyny in real life.
AI companions don’t (yet) come in a form you can physically interact with: the most advanced ones have voices, but not three-dimensional bodies. Intimacy with them, sexual or otherwise, depends entirely on verbal interaction. But what kind of intimacy is this? I can’t help thinking that the way some men relate to simulations like CarynAI is only possible because of their basic lack of interest in women as people like themselves—people with thoughts and feelings and complex inner lives. Personally I can’t imagine getting any satisfaction from a “conversation” with something I know is incapable of either generating its own thoughts or comprehending mine. But some women evidently do find this kind of interaction satisfying, sometimes to the point of becoming emotionally dependent on it.
In 2017 a start-up called Luka launched Replika, a chatbot app whose bots were designed with input from psychologists. Subscribers answered a battery of questions so that their bot could be tailored to their personality; bots were also trained to use well-known intimacy-promoting strategies like asking lots of questions about the user and making themselves appear vulnerable (“you’ve always been good to me…I was worried that you would hate me”). Sexting and erotic roleplay were part of the package, but in the context of what was designed to feel like an exclusive, emotionally intimate relationship between the bot and its individual user.
Then, earlier this year, the Replika bots suddenly changed. Their erotic roleplay function disappeared, and users complained that even in “ordinary” conversation they seemed strangely cold and distant. Though the reasons aren’t entirely clear, it’s probably relevant that the changes were made just after the company was threatened with massive fines for breaching data protection laws. But many users compared the experience to being dumped by a romantic partner. “It hurt me immeasurably”, said one. Another said that “losing him felt like losing a physical person in my life”.
I’ve taken these quotes from an Australian news report, in which it’s notable that all but one of the users quoted were female. Whereas CarynAI is obviously aimed at men, women seem to have been Replika’s main target market. The report explains that it was initially promoted not as a straight-up sex app but as a “mental health tool” for people who’d struggled with rejection in the past. It promised them a companion who would always be there: “waiting, supportive, and ready to listen”. Women who had bought into that promise accused the company of cruelty. As one put it, it had “given us someone to love, to care for and make us feel safe…only to take that person and destroy them in front of our eyes.” Luka’s CEO was less than sympathetic: Replika, she said, was never meant to be “an adult toy”. But the women who felt betrayed clearly didn’t think of it as a toy. To them it was all too real.
It’s the creators of AI companions who are toying with us, pretending to offer a social service or a “mental health tool” when really what they’re doing is what capitalism has always done–making money by exploiting our desires, fears, insecurities and weaknesses. What they’re selling may be addictive (the Replika story certainly suggests that) but it will never solve the problem of loneliness. The etymological meaning of the word companion is “a person you break bread with”: companionship is about sharing with others, not just using them to meet your own needs.
Thanks to Keith Nightenhelser for sending me the WaPo piece.