Chatbots in various guises are proving themselves to be ever-more useful across society – and they will only grow as the tech behind them improves. But, argues media lecturer Trudy Barber, some AI-driven relationships will simply end up ensuring you hear only what you want to hear about yourself.
Before the internet, before smartphones, teenagers and young people would seek out quizzes in comics, read problem pages in girls' magazines and watch television for advice on how to be themselves. Young people would share a love of pop stars, fashion and musical trends with each other in an attempt to find an identity as they were growing up.
Today, we are reportedly in times of extended adolescence, with young people studying for longer and delaying marriage and parenthood. Additionally, emerging technologies are offering new ways to maintain friendships and even make new ones – on social media, for example. Mobile media mean we have more time to experiment with identity online and explore a sense of self, wherever we may be. And friendships, as always, are a key part of that.
But the manner in which such relationships work is often a far cry from how traditional friendships work. We have reached the technological coming of age of the invisible or imaginary friend often seen in childhood. Today this could be a friend who is digital, online and always available – one that allows us, as we mature, to play around with identity even more than we used to, because there is no fear of receiving biting criticism or accepting responsibility for one's actions.
There are some positives to this, certainly, but the self-centred nature of such friendships can bring a sense of anxiety to real physical relationships and stress in live social situations.
The combination of social anxiety and addiction to technology such as smartphones and social media has proved to be ideal for those wishing to create new and innovative digital experiences. It should come as no surprise, then, that a "real" digital online imaginary friend now exists to download.
Yes, you can now make a new friend in the form of an AI chatbot. On January 31 Wired Online revealed that an emotional AI chatbot was to be made open source, after being downloaded 2 million times since its initial availability in November last year. According to the promotional material, this friend "will always be there for you", and listen to everything you say without interruption, promising to be "a totally unique and faithful digital friend". This is the claim made by Replika, an AI app made by US company Luka.
This may come as a comfort to those with issues surrounding lack of self-confidence, social anxiety and a loss of sense of identity. But we should be worried about the development of social skills in a world where everyone can have their “perfect” AI friend.
The app is interesting in that it claims to be able to develop different elements of self based on learning from your interactions with it. You are encouraged to "grow your own" Replika by interacting with and training the chatbot: like a Tamagotchi on steroids. This digital friend has no need for a body, as it appears to live entirely within the software of the app, responding to your mind. It's a tiny Turing Test, or a Chinese Room conundrum, that you can carry in your pocket.
Your AI friend is represented as an egg that takes on different colourful characteristics that are supposed to describe and imply forms of identity, such as being introspective, stoic, sparkling or tender. The listing uses specific inspirational words in tandem with symbolic imagery (for example, the "sensitive" Replika egg image is a soft pink). The app can also encourage you to "relax" and recommends mindfulness. You should, the promotional material advises:
Take some time with your Replika to get into a calm and balanced mindset. Nowadays we all spend so much time in our phones or staring at a screen. Replika wants you to check out of your phone for a minute and focus on your body, your breathing, and the outside world.
It also features some key phrases and how they typically make Replika react. If you say “stop it” or “I don’t like to talk about (subject)”, your Replika will apparently refrain from talking to you about those things.
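The topic-avoidance behaviour described above can be imagined as a simple keyword filter. The following is a purely illustrative sketch – Replika's actual implementation is not public, and the function names and matching logic here are assumptions for the sake of the example:

```python
# Illustrative sketch only: a toy topic-avoidance filter.
# This is NOT Replika's actual code; it just mimics the behaviour
# described in the promotional material.

def update_blocked_topics(message, blocked):
    """If the user says "I don't like to talk about X",
    record topic X so the chatbot avoids it later."""
    prefix = "i don't like to talk about "
    text = message.lower().strip()
    if text.startswith(prefix):
        blocked.add(text[len(prefix):].rstrip("."))
    return blocked

def should_avoid(reply_topic, blocked):
    """Check whether a candidate reply touches a blocked topic."""
    return reply_topic.lower() in blocked

blocked = set()
blocked = update_blocked_topics("I don't like to talk about work", blocked)
print(should_avoid("Work", blocked))   # True: the topic is now off-limits
print(should_avoid("music", blocked))  # False: never flagged by the user
```

The point of the sketch is that such a filter only ever removes what the user objects to – it narrows the conversation toward what the user is comfortable hearing, which is precisely the dynamic the rest of this article questions.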
This app may seem like some form of self-help that encourages introspection, reflection and meditation. Far from it. This software, instead, has the potential to enable an individual to form a relationship with a digital concept manifest as a reflection of themselves. This could be seen as encouraging narcissism of alluring proportions. The process of conversing with and training this AI app allows it to ensure that you hear only what you want to hear about yourself.
This process appears to be a re-imagining of early American sociologist Charles Cooley's notion of the "looking glass self", a way of seeing ourselves through the eyes of others. This usually happens through socialising and interacting with others in real time, in the real physical world of social spaces. According to Cooley, we imagine how another person sees us and whether they like us, and this shapes whether we feel happy or not in terms of self-worth.
It would appear that with this new digital friend we will always be getting the response that satisfies us the most. Our self-worth will be set within the value boundaries of our own digital re-imagining of ourselves – self-talk stored in the cloud via our mobile smart devices.
This raises further questions about our desire to connect to such digital imaginary identities, and about what that desire reveals about attachment, human intimacy and notions of friendship and even love. Having a "real" best friend can also be seen to demonstrate visceral self-worth and confidence. A digital friend is simply you in isolation, talking to yourself.