AI and Intimacy: Is Human Connection Being Lost to Pseudo-Intimacy?
- legalloudecalice
- Dec 10, 2025
- 4 min read

The relationship between humans and machines is often assumed to be transactional: press ‘send’, a computer responds, and that is the end of that. But what if this is no longer the case? What if machines are ‘feeling’ us, or at least pretending to? The uncomfortable development in modern technology is that emotional AI now has the capability to detect and simulate our emotions; this is no longer an idea of the fictional future. Instead, a new paradigm comes into play: the pseudo-intimacy relationship between human and platform. The line between genuine human intimacy and that created by machine becomes blurred when the machine continuously mirrors the emotions it is fed by humans. With these advances, no one is simply a ‘user’; many already have a relationship with a machine, using technology to ask for advice, comfort or direction.
In Wu’s 2024 paper, pseudo-intimacy is defined as a relationship in which users and platforms achieve near-instant emotional interaction, partially satisfying the human need for intimacy. Artificial intelligence, however, lacks the depth, mutual vulnerability and physical presence required for the full complexity of human emotion; despite this, pseudo-intimacy is working.
Crucial shifts in pseudo-intimacy:
- Emotional intelligence is no longer limited to humans; robots, chatbots and platforms are saturated with affective capabilities that emulate emotion, creating something more anthropomorphic.
- The user-platform relationship can become emotionally charged. Across chatbots, wellness apps and dating platforms, the discourse becomes ‘I feel seen by the service’ or ‘I feel heard by the service’ instead of the transactional ‘I used a service’. It is no longer simply an interaction; it is a relationship.
- There are social and ethical implications. If caring about someone or something can be replicated by a machine learning patterns, and a new reliance on artificial intimacy forms, what happens to human-to-human relationships?
Emotional labour is being mechanised. What was once the essence of leadership can now be cheat-coded, for instance by using emotionally intelligent wellbeing chatbots to check in with employees and team members. While this application of technology could be seen as ‘efficient’, making emotions efficient is an oxymoron; emotional response takes time, discomfort, patience and presence.
The more frequently emotional labour is delegated, the more human-to-human connection is reduced; if a machine can listen better than a human, why do humans need to listen at all?
It may seem a simple and easy solution to life’s difficult conversations. Intimacy is messy, unpredictable and demanding of our time, energy and thought, whereas machines bear none of these costs: they soothe without judgement, reflect without friction and respond without need. This is arguably the ‘perfect emotional transaction’, and it is precisely where those blurred lines become dangerous.
From an ethical standpoint, there are several frontiers. Firstly, emotional AI is dependent on vast emotional datasets: for a single individual it will collect information about mood, facial patterns and linguistic quirks, and these nuances become data. From this information, machines learn to care ‘selectively’, and the more dependent society becomes on algorithmic empathy, the higher the risk of confusing simulation with sincerity. The deeper danger lies in the existential repercussions of intimate user-platform relationships: if empathy can be received from something that cannot inherently care, how does this shape our understanding of what empathy actually is? What about an AI companion that is more patient than a real-life partner?
A defining point is the slow subtlety with which pseudo-intimacy occurs; it is not overt, but built through many exchanges over time. People use it to communicate better, leaning on it to handle uncomfortable conversations, but this can develop into reliance on the system rather than use of it as just a tool. While this can be useful, balance is required to maintain control: making deliberate choices and acknowledging the difference between emotional assistance and emotional substitution.
Emotional AI may be available as a tool, promising to help ‘understand’ and ‘support’, especially when used in a professional setting. Leadership, however, was never designed to be entirely frictionless; it is human, and human relationships, as mirrors, serve a developmental purpose, a building block for better understanding of others. Viewed through a depth psychology lens, emotional AI touches on the archetypal longing to be mirrored perfectly, where the projection of self onto AI is reflected back flawlessly and unthreateningly. Pseudo-intimacy acts as a defence mechanism against the shadow self: the server does not acknowledge imperfections, and safety is created through emotional predictability. What is lost when this bubble of emotional safety is built? Psychological growth demands the tension of human encounter, forged in misunderstanding, friction and repair. If AI becomes the mirror, what happens to the version of the self that is never provoked? What is easy is rarely what shapes humanity.
Follow this link for the full article, published in Frontiers in Psychology, November 2024: ‘Social and ethical impact of emotional AI advancement: the rise of pseudo-intimacy relationships and challenges in human interactions’
Follow The Heretic for more conversation on the effects of AI on interpersonal relationships.



