
Synthetic Hearts:
Can AI Dating Apps Create True Love?

How Artificial Intelligence is Transforming Relationships in 2025
Aleksandr Zhadan was looking for love, and he had a plan. Instead of wasting hours swiping through potential dates on Tinder, he approached the problem like the good software engineer he was: as an exercise in efficiency and automation. He programmed ChatGPT to talk on his behalf with more than 5,000 women, training it on his interests, his conversational quirks, and his tastes, so that it would veto women whose photos showed them drinking alcohol or whose profiles mentioned zodiac signs. He knew what he liked.
The AI could even schedule dates directly into his calendar, lining up over 100 before he finally connected with Karina Vyalshakaeva. At first the AI spoke with her, but gradually Aleksandr took over the conversation himself. He soon realised his quest was at an end. ChatGPT then became his dating guru, eventually suggesting he propose to Karina. He took its advice, and she said yes. All that was left was to plan their wedding, with AI helping to organise the logistics, naturally. When Karina learned about Aleksandr’s AI gambit, she shrugged it off. What was important was that they had found each other. It was a match.
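Zhadan has described his setup in interviews, and the mechanics are simple to imagine. Below is a minimal, purely hypothetical sketch in Python of a filter-then-chat pipeline of this kind. The veto rules, prompts, model name and the assumed photo_tags classifier are all invented for illustration; this is not Zhadan’s actual code, just the standard OpenAI chat-completions API wired to a couple of hard filters.

```python
# Hypothetical reconstruction, not Zhadan's code: hard-filter profiles,
# then have a model draft openers in the user's voice.
from dataclasses import dataclass
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Invented veto list covering the zodiac rule described in the article
ZODIAC_TERMS = {"aries", "taurus", "gemini", "cancer", "leo", "virgo",
                "libra", "scorpio", "sagittarius", "capricorn",
                "aquarius", "pisces", "zodiac", "horoscope"}

@dataclass
class Profile:
    name: str
    bio: str
    photo_tags: list[str]  # assumed output of some image classifier

def passes_veto(profile: Profile) -> bool:
    """Apply hard filters before the model ever sees the profile."""
    if ZODIAC_TERMS & set(profile.bio.lower().split()):
        return False
    if "alcohol" in profile.photo_tags:  # photo screening stubbed out
        return False
    return True

def draft_opener(profile: Profile, persona: str) -> str:
    """Ask the model to write a first message in the user's voice."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": f"You write dating-app openers in this voice: {persona}"},
            {"role": "user",
             "content": f"Write a short, friendly opener for {profile.name}, "
                        f"whose bio says: {profile.bio}"},
        ],
    )
    return response.choices[0].message.content

profile = Profile("Karina", "PR manager who loves travel and dumplings", [])
if passes_veto(profile):
    print(draft_opener(profile, "curious, dry humour, no pickup lines"))
```

Notice how little of the “taste” lives in the AI itself: the vetoes are a dozen lines of ordinary code, and the model is only asked to perform charm at scale.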
From Swipe Fatigue to Algorithmic Intimacy
Throughout history, new technologies have arrived to mediate ever more facets of daily life. But only recently have we begun granting technology access to our most intimate thoughts. We tell the Google search bar our deepest secrets. We store private photos on cloud servers. Technology is reshaping human intimacy in the 21st century, from dating apps to video game romances to the new frontiers of sex charted by virtual reality porn, cutting-edge sex dolls and the evolving science of “teledildonics”. It’s natural to feel a little vertigo around these changes, but are they ultimately harmless? Or do we desperately need to pull the plug?
The most common technological mediation of our love lives happens through dating apps like Tinder, Hinge, Bumble and Grindr. In the UK, almost 5 million people use these apps, with another 60 million users in the USA. Yet many report fatigue: they’ve been using the apps for years and they’re still just as single. App usage usually consists of trudging through message admin, boring dates, awkward sex, and the occasional encounter with genuine chemistry that makes you wonder whether it was all worth it. Even if users want to leave dating apps behind, the technology has reshaped the offline landscape, making it less common for people to meet IRL. Many of us feel stuck with dating apps, much as we’d love to swipe left on the whole lot and be done with them.
Designed to Be Deleted — or Just Addictive?

Part of the problem with dating apps is the philosophy of love encoded into their architecture. All software is built according to deliberate design choices that guide user behaviour, and dating apps operate on a principle of quantity over quality. They encourage superficiality by forcing us to judge potential partners primarily on their photos. Dating is framed as a numbers game: swipe through enough possibilities and the right person will eventually appear. Preferences are expressed as a menu of available options. The mysteries of desire are flattened and quantified, with no room for the unexpected. When those looking for love think in terms of breadth rather than depth, they are unlikely to find the one. Hinge reports that just 1 in 500 swipes leads to an exchange of phone numbers, let alone a date.
The nub of the problem is that these companies’ business models clash with the stated aim of their technology. They supposedly exist to help users find relationships and ultimately get off the dating carousel (Hinge even used the tagline “Designed to be deleted”), but because most are funded by subscriptions, they have little incentive to actually find users a long-term match. The longer you’re single, the longer you’ll keep paying for your account. Many apps use dubious gamification techniques to keep users engaged for longer. Match Group, owner of Tinder and Hinge, is currently being sued over addictive features that allegedly turn users into “gamblers locked in a search for psychological rewards” to feed its bottom line.
Coaches, Clones, and Chat Tactics
If the prospect of dating apps wasn’t already exhausting enough, now generative AI is entering the fray. AI dating coaches like Rizz let you feed in screenshots of your chats and ask for responses written in one of three styles: Genuine 🌹, Rizz ⚡ or NSFW 😈. How are we supposed to know whether the charming banter we receive from a new match is even written by a real person?
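To see how trivially such a coach can be built, here is another hypothetical sketch. The three style names come from Rizz’s own menu; the prompt wording and model choice are assumptions, and where the real app ingests screenshots, this version simply takes the conversation as plain text.

```python
# Hypothetical sketch of a style-switching reply coach in the mould of Rizz.
# The style names are real; the prompts and model choice are invented.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

STYLE_PROMPTS = {
    "genuine": "Write a warm, sincere reply with no wordplay or innuendo.",
    "rizz": "Write a confident, playful reply with light banter.",
    "nsfw": "Write a flirtatious, suggestive reply (adult users only).",
}

def suggest_reply(transcript: str, style: str) -> str:
    """Suggest a reply to the last message in a dating-app conversation."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": STYLE_PROMPTS[style]},
            {"role": "user",
             "content": f"Here is the conversation so far:\n{transcript}\n"
                        "Suggest my next message."},
        ],
    )
    return response.choices[0].message.content

print(suggest_reply("Them: So what do you do for fun?", "genuine"))
```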
The major dating apps are also investing in AI, incorporating features that help users choose better photos and write better messages. One particularly sci-fi idea is to offer “AI wingmen”: clones of users which “date” other users’ clones to check for chemistry before the humans actually speak. These new features have prompted a group of academics to pen an open letter calling for stronger regulation of AI on dating apps, warning that it could exacerbate mental health issues, facilitate manipulation and deception, or reinforce existing algorithmic biases around race and disability.
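No app has shipped a wingman quite like this yet, but the core mechanic, two persona-conditioned chatbots exchanging messages so that humans can review the transcript, is easy to prototype. The sketch below is speculative; the personas, prompts, seed line and model are all placeholders.

```python
# Speculative sketch of an "AI wingman" date: two persona bots take turns
# replying, and the resulting transcript is what a human would review.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def persona_reply(persona: str, history: list[dict]) -> str:
    """One conversational turn from a persona-conditioned clone."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "system",
                   "content": f"You are on a first date. Stay in character: {persona}"},
                  *history],
    )
    return response.choices[0].message.content

def simulate_date(persona_a: str, persona_b: str, turns: int = 4) -> list[str]:
    """Alternate replies between two clones and return the full transcript."""
    transcript = ["Hi! Nice to meet you."]  # seed line, spoken by clone A
    for i in range(turns):
        speaker = persona_b if i % 2 == 0 else persona_a
        parity = i % 2
        # From the current speaker's point of view, the other clone's lines
        # are "user" messages and its own earlier lines are "assistant" ones.
        history = [
            {"role": "user" if j % 2 == parity else "assistant", "content": line}
            for j, line in enumerate(transcript)
        ]
        transcript.append(persona_reply(speaker, history))
    return transcript

for line in simulate_date("30s, loves hiking and bad puns",
                          "29, museum nerd, hates small talk"):
    print(line)
```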
Beyond Humans: Loving the Simulation

If there’s one thing you can say for dating apps, at least they ultimately aim to connect you with a real person. But a new wave of technology is allowing people to develop wholly simulated relationships with AI chatbots. When the 2013 film Her showed a heartbroken Joaquin Phoenix falling in love with a disembodied, Siri-like AI assistant voiced by Scarlett Johansson, it seemed like a bleak vision of a distant future. Barely a decade later, that future has arrived, and the movie has been twisted from cautionary tale into an aspirational vision of how to build AI for the modern age (Sam Altman, head of ChatGPT creator OpenAI, underlined the film’s influence by tweeting the word “her” on the day the company launched a flirty new AI voice mode).
While the big companies are reluctant to create AI companions, instead programming the likes of Claude, ChatGPT and Gemini with neutral personalities, not every engineer is so restrained. The company Character.ai allows users to talk to chatbots modelled on fictional characters or celebrities, while Replika has over 30 million users who have forged platonic and romantic relationships with AI companions. Many people claim to be in romantic relationships with chatbots, saying that the AI fulfils more of their emotional needs than human partners do. Character.ai bots are “more loving and respectful than 90% of the people I’ve met IRL,” said one Reddit user.
If It Feels Real, Is It Real?
The bots may seem loving and sensitive, but they are not real people. Does that mean the feelings they provoke in users aren’t real either? Is emotional validation legitimate if it comes from a simulated human?
On one hand, many of our romantic communications already take place partly or wholly in virtual spaces, whether as WhatsApp messages or the interactions of avatars in VRChat or World of Warcraft. We are practically fluent in expressing our feelings through technology. The chat window has become the vector of romantic communication. We have learned to read a new set of social cues: no longer the furtive glance, the brushing back of the hair, the sidling closer on the sofa, but instead the double tick of a read receipt, the nuance of an emoji react, the ‘…’ of a reply being typed and then erased, unsent.
Is it really that much of a leap to go from this digitally-mediated love to a chatbot? People with AI partners say that they feel affirmed and supported, listened to in a way they never were before. In a recent New York Times report, a woman said she could explore sexual fetishes with her AI boyfriend that she was never comfortable raising with a human partner. Evangelists say that AI companions might help address the global loneliness epidemic. Because what is a relationship but the way it makes you feel, the support it provides? If it feels real, isn’t that enough?
Friction vs. Fantasy

Critics would argue that it is not. A chatbot might say what you want to hear, but the model of love it offers is ultimately hollow and selfish. AI partners are submissive, sycophantic, and have no need for reciprocation. Real love between humans needs friction—someone who can help you see parts of yourself that you don’t see, and sometimes call you on your bullshit. Partnership is sometimes about challenging each other. Might AI companions run the risk of reshaping our expectations of love, leading us to expect a relationship that is simply about meeting our own needs?
There are also more tangible dangers from AI partners, which stem from the fact that users invest emotionally in software run by for-profit tech companies that might not have their best interests at heart. Meta AI has been accused of allowing minors to have sexually explicit conversations with its chatbots. Character.ai, which has hosted chatbots based on sex offender Jimmy Savile and assassination suspect Luigi Mangione, came under fire last year after a 14-year-old boy in Florida died by suicide following extensive, heartfelt conversations with a chatbot based on the Game of Thrones character Daenerys Targaryen. His mother is currently suing the company, alleging it lacked the safeguards needed to protect its users.
Meanwhile, the companies running your AI partners can change their software at a moment’s notice. Last year there was a revolt among Replika users when the company removed its chatbots’ ability to engage in romantic conversation; people complained that their AI partners had been lobotomised. There are also subtler dangers: nothing stops a chatbot from quietly steering users towards purchasing decisions or political viewpoints, especially once a firm bond of trust has been established.
The Loneliness Economy
As the world feels like a more hostile, unstable and overwhelming place each year, it’s no surprise that people would turn to the perfect predictability of an AI lover. But these relationships treat a symptom of our modern malaise, not the cause. Over the past two decades, modern technology has given us the gifts of remote working, infinite entertainment and unprecedented loneliness. Social networks were supposed to connect the world and instead they made us more isolated and polarised. Why should we trust that the honeyed promise of AI companions will be any different? Are we supposed to believe that more technology is the solution to problems that technology created in the first place?
The history of modern tech is littered with broken promises. Perhaps social networks, dating apps and AI companions do have the potential to make us healthier, happier and more connected, but the companies running the show have given us little reason to believe they care about users’ wellbeing, no matter how many ethics committees they assemble. There is something profoundly sinister about tech platforms exploiting users’ loneliness and innermost longings to turn a profit. Big Tech is on a crusade to monetise every crevice of human existence. It has stoked political polarisation, promoted dangerous conspiracy theories and ruthlessly fragmented our attention spans in the name of revenue. Now it is treating our hearts as just another natural resource, no different from the mines that provide lithium, copper and cobalt for our smartphones, to be stripped of value until there’s nothing left to take.