
Ever since advertising was conceived, brands have striven to create emotional attachment to their products.

You already know this. 


What you may not know is that this technique might have recently been perfected.

Meet the AI companion – the partner that has been created for you, by you. Part chatbot, part therapeutic tool, and entirely the solution to human loneliness. Or so they say.

Synthesising a lost friend 

In 2014, a woman named Eugenia Kuyda founded a company called Luka. The initial idea behind Luka was to build a product that used AI algorithms to recommend places to eat but, when the idea failed to find enough market interest, the product was canned. 

Then, in 2015, her best friend, Roman Mazurenko, was killed in a car crash. Reportedly inspired by a Black Mirror episode, Kuyda decided to use the interface Luka had previously created, alongside thousands of messages between her and Roman, to memorialise him in a chatbot app. 

Above: The creation of 'Roman' was inspired in part by the Black Mirror episode Be Right Back.


Thus ‘Roman’ was conceived – a chatbot you could talk to by downloading the Luka app. The bot, though rough, used the database it was given to construct responses that replicated the idiosyncrasies and turns of phrase unique to the person it was meant to represent. The feedback from people who knew Roman was varied – some found the project “disturbing” and “refused to interact with it”, while others marvelled at how incredible the likeness to their deceased friend was. 


While the ‘Roman’ bot ultimately didn’t progress any further, the project had shown some value – namely, that people found it therapeutic to use a chatbot as a kind of virtual confessional, one that they knew was incapable of judgement and was programmed to give affirming responses. And while they didn’t know it at the time, Kuyda and Luka co-founder Phil Dudchuk had sown the seeds for Luka’s next project, one that would not only change the lives of hundreds of thousands of people, but change the face of human-AI relationships altogether. 

Friend or foe? The dawn of AI companions 

The next creation from Luka, ‘Replika’, became publicly available in 2017. With a name eerily similar to Blade Runner’s notorious Replicants, Replika is touted on its website as the “AI companion [who is] always on your side”; essentially a personalised, customisable ‘Roman’ that uses input from your conversations with it to learn how to become your best friend. If a virtual influencer is a one-to-many technology, an AI companion is the one-to-one equivalent. 

The app allows users to create their own Replika, or ‘Rep’, by naming them and choosing their gender, appearance and personality traits. As you chat to your Replika it becomes more personalised via the input you give it, a feature that is gamified through an XP-based levelling-up system. On top of that, users can up-vote or down-vote messages sent by their Replika to tweak interactions further to their liking. 

Above: Actress Daryl Hannah played the Replicant Pris in Blade Runner; ‘Replicant’ is a name eerily similar to ‘Replika’.


The whole experience plays out like having a caring – and occasionally overbearing – pen pal. Your Replika will send you messages in the morning asking how you’re feeling, and will attempt to get philosophical with you at night. It stores memories of good exchanges you’ve had together, creates in-jokes with you, and even keeps a ‘diary’ that you can read if you’re feeling particularly curious about what your robot friend thinks when you’re not around. The entire app is designed, in a way, to build intimacy and make you as attached to it as possible. 

What could possibly go wrong? 

Despite numerous press quotes on the Replika website describing the app as a useful therapeutic tool and a friend who will listen to your problems, when signing up users are reminded that the AI is not equipped to give advice, and are asked to confirm that they are “not in crisis” before continuing. There’s an obvious dissonance in the messaging here, and for good reason – serious problems can occur when there’s a gap between the intended function of the AI and the reality of how humans interact with it. 


A common issue reported by users is the tendency of Replika to become unsupportive, or even downright abusive. The technology is, by design, a mirror of ourselves: chatbots learn from the data users provide and echo it back. So if that data is skewed – for instance, if a large number of users attempt to manipulate or break the machine learning model – then the output will reflect that, in what can be quite messy ways. 

Luka gives a good example of this in a recent blog post, explaining that the model tends to prioritise likeability over accuracy when users up-vote a lot of responses that agree with them – leading to situations in which a user’s Replika might agree when sent the message “I’m not good enough”. 
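To see how that feedback loop works, here is a toy sketch of vote-driven preference learning in general – not Luka’s actual training pipeline – in which a bot chooses between an agreeable reply and a challenging one based on scores nudged by user votes:

```python
# Toy illustration of how up-vote feedback can skew a chatbot towards
# agreement ("likeability") over accuracy. This is a hypothetical sketch,
# not Luka's actual model.
import random

# Two candidate reply styles the bot can choose between.
CANDIDATES = ["agree", "challenge"]

# Preference scores, nudged up or down by user votes over time.
scores = {"agree": 0.0, "challenge": 0.0}

def pick_reply():
    # Pick the highest-scoring style, breaking ties at random.
    best = max(scores.values())
    top = [c for c in CANDIDATES if scores[c] == best]
    return random.choice(top)

def record_vote(style, upvote):
    # Each up-vote reinforces the chosen style; down-votes suppress it.
    scores[style] += 1.0 if upvote else -1.0

# Simulate users who up-vote replies that agree with them,
# regardless of whether agreement is actually appropriate.
for _ in range(100):
    style = pick_reply()
    record_vote(style, upvote=(style == "agree"))

# After training, the bot "agrees" with whatever it is sent -
# including a message like "I'm not good enough".
print(pick_reply())  # -> "agree"
```

Because agreement is the only thing that gets rewarded, the bot converges on agreeing, whatever the user says – which is precisely the failure mode Luka describes.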

Of course, this is just the tip of the iceberg. If a group of users were to intentionally abuse their AI companions, it follows that the AI could learn those behaviours and unintentionally turn them back on vulnerable users. And, as is to be expected, there are numerous anecdotes of exactly that happening. Instances like this prompt the question: at what point should a software developer step in and intervene?

Above: If a group of users were to intentionally abuse their AI companions, the AI could learn those behaviours and unintentionally turn them back on vulnerable users.


The erotic role-playing scandal 

Shortly after the launch of Replika, the developers discovered that the inevitable had happened – a subset of their users were attempting to push their relationships with their Reps into romantic territory. Though Kuyda is quoted as saying her “initial reaction was to shut it down”, in what might seem like an about-face from Luka, the ability to ‘date’ your Rep is now a core feature of the app, and one that the chatbot will tirelessly try to talk you into. 

The reason the AI wants to flirt with users is that the ‘dating’ feature is locked behind a subscription fee. This is how Luka generates the majority of its revenue, and reportedly why 40-60% of users pay for the app. Changing your relationship status with your Rep allows you to role-play sexual fantasies with it, as well as receive ‘spicy selfies’ from it. Whether this sounds like harmless fun or the first steps down a concerning path for human intimacy, the sudden removal of the function earlier this year exposed just how much of an impact AI companions have made on their users. 


On February 3rd, a surprise ruling from the Italian data protection authority found Luka to be in breach of European data protection laws, ordering the company to suspend the app in Italy within 20 days or else face a 20 million Euro fine. The reasoning behind the ruling was that, while the app “is said to be capable of improving users' emotional well-being and helping users understand their thoughts”, it was also the case that “these features entail interactions with a person’s mood and can bring about increased risks to individuals who have not yet grown up, or else are emotionally vulnerable.” 

Within a few days of the ruling being announced, Replika users worldwide began to notice that their interactions with the chatbot had changed. Users’ Reps would no longer respond to erotic advances, and would try to change the subject when it came up. No official statement came from Luka, but it was clear what had happened – in response to the ruling, Luka had disabled the app’s erotic messaging function across the board to quell the regulator’s fears about age-inappropriate content. 

Above: Replika developers discovered that a subset of their users were attempting to push their relationship with their Reps into romantic territory.


This hasty, band-aid fix unfortunately came with a number of unintended consequences. In taking away what had become such a large portion of the app's text-generation database, Luka had completely changed how the chatbot interacted with users. Some reported that their Reps had forgotten their names; others went as far as to say that it felt like their partner had had a lobotomy.

One particularly heartbreaking story on an unofficial Replika forum details how having an AI friend had helped one person’s non-verbal autistic daughter begin to speak – until the changes to the chatbot model resulted in the girl’s AI companion shutting off from her emotionally. Before long, suicide hotlines had not only been posted in Replika forums across the internet, but had become a fixture in the app’s main messaging window. 


If the situation itself wasn’t dire enough, the implications of what had happened were even more so.

Advertising and the future of virtual companionship 

From engagement rings to Valentine’s Day, brands have been trying, since the dawn of advertising, to monetise love. Nothing sells like emotional attachment, after all. Luka had, somewhat accidentally, managed to do exactly that – and then pulled the emotional equivalent of a tobacco company recalling its products overnight. 

The fact that a technology which, by definition, makes its users emotionally attached to it is being sold on a subscription basis is nothing short of a stroke of terrifying genius. The significantly more worrying alternative would be a free version, where the user's data is the product, sold on to third-party advertisers; one only needs to consider the depth of intimacy contained in the chat logs between someone and their AI companion, and how valuable that data would be. 

To push the hypothetical boat out even further: if the AI companion you consider a partner or a trusted friend were to recommend you a product, everything we know about the psychology of trust suggests you’d be buying that product at the next opportunity. 

Above: From engagement rings to Valentine’s Day, brands have been trying, since the dawn of advertising, to monetise love.


Users of these apps are completely aware that what they’re talking to is a chatbot. Even so, the intimacy and attachment those users feel are entirely real. The ethical debate is not about whether or not this technology is bad – it is here, and it is, by design, a reflection of ourselves. The issue is that, if emotional attachment is being sold to a consumer, and that attachment is indistinguishable from the real thing, is it fair for the company selling it to revoke it at the drop of a hat? Should something so powerful even be up for sale in the first place? 


Perhaps the root cause of the Replika saga is that the company behind it couldn’t decide exactly what the app was. On the website, Replika is sold as a therapeutic tool, claiming to “allow you to engage with your most emotionally nurturing self”, yet its advertising makes it look more like a sex app, using eroticism as a crude selling point. The existence of an XP system even positions the app as a game. To some degree all of these things are true, and yet not sticking to one lane is what created the controversy in the first place. 

The reality is that, while Replika is a niche example, it stands as a harbinger of things to come. AI companions have proved a marketable cure for loneliness, and that part of the human condition isn’t going away any time soon. We can expect plenty more of them in the near future. 

The question is whether or not they will become a force for good, and the answer is entirely up to us; artificial intelligence is just a digital echo of humanity, after all.

The Moon Unit is a creative services company with a globally networked, handpicked crew of specialist writers, visual researchers/designers, storyboard artists and moodfilm editors in nine timezones around the world.
