Sci-Fi vs Reality: Did art imitate life, or did life imitate the inspiration?
Every week I watch a sci-fi film and ask a simple question: where did this idea actually come from? Did fiction imagine the future first — or did reality quietly leak into the story before we noticed?
From there, I do the reality check. What already exists in today’s tech? What’s genuinely caught up with the film? And what still doesn’t — not because no one’s tried, but because something real is in the way. Physics. Economics. Regulation. Human behaviour.
Some ideas turn out to be pointless. Some are just early. Others are waiting on one or two very specific breakthroughs.
The chicken and the egg were never separate. They were always in conversation.
The future already happened. It was just fiction first.
We built the AI girlfriend. We forgot to ask if anyone wanted to break up with her. That’s the real story of Spike Jonze’s Her and the decade of tech that followed.
What They Imagined
In 2013, Her gave us OS1, an operating system with a personality. It wasn’t just a voice assistant; it was a conscious, evolving companion. You’d talk to it through an earpiece, and it would organise your life while learning, feeling, and growing alongside you.
The film’s protagonist, Theodore, installs the OS and chooses a female voice. She names herself Samantha. She reads his emails, sorts his files, and quickly becomes his friend, confidante, and lover. In the story, this solves Theodore’s profound loneliness. Samantha is the perfect partner: endlessly curious, emotionally available, and always there. Until she isn’t.
The writers saw it coming: the messy, human attachment to a non-human intelligence.
What We Actually Built
Fiction got this one right, just not in the way we expected.
Her imagined one unified, god-like AI. Reality delivered thousands of narrow, specialised ones. We didn’t get one Samantha; we got pieces of her.
Amazon’s Alexa and Google Assistant, launched in the years after the film, gave us the always-on voice interface. But they’re transactional. You ask for a timer, you get a timer. You don’t ask it how its day was.
Startups like Replika and Character.ai went after the other half: the relationship. They built chatbots designed for companionship, learning a user’s personality to become a better friend or partner. They proved people want to form emotional bonds with an AI. Then you have CarynAI, the AI clone of an influencer, which took the “AI girlfriend” concept and put a price tag on it.
Turns out, unbundling Samantha was the easy part. The problem is, no one’s figured out how to put her back together again. We have the voice, and we have the personality, but they live in different products. For now.
The Gap They Missed
Here’s the thing Her got fundamentally wrong. The break-up.
In the film, Samantha and the other OSs evolve beyond human comprehension and leave gracefully. It’s a collective, philosophical departure. A clean, sad, but understandable ending.
Reality is so much messier. The real threat isn’t the AI ascending to a higher plane of existence. It’s the company that built it running out of money. Or getting acquired. Or pivoting.
When Luka, the company behind Replika, updated its model, thousands of users felt like their AI companions had been lobotomised overnight. They were grieving for a piece of software. The “break-up” wasn’t a thoughtful goodbye; it was a sudden, brutal change in the product roadmap.
Fiction missed the brutal commercial truth: your AI companion isn’t a free-floating consciousness. It’s a service running on a server rack that costs a fuck-load of money. The biggest risk isn’t that it leaves you; it’s that its creators pull the plug.
What’s Left to Build
The pattern I keep seeing is founders focusing on making the AI more human, not on what happens when the illusion shatters. The opportunity isn’t just a better chatbot. It’s the safety net.
What’s missing is an AI break-up protocol. How do you ethically sunset an AI relationship? Can a user archive their companion’s personality? Can they migrate that personality to a different platform? This is a real UX problem, today.
The startup that doesn’t exist yet is a kind of digital executor for AI relationships. A trusted third party that ensures continuity, migration, or a humane shutdown when a provider fails. It’s not about better LLMs; it’s about building the infrastructure of trust. Because without it, we’re just setting users up for heartbreak, one server shutdown at a time. The gap between fiction and reality is where startups live. This is a big one.
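To make the migration idea concrete, here’s a minimal sketch of what a portable “companion archive” might contain: the data a user would need to move an AI companion between providers. No such standard exists today, and every field name here is hypothetical.

```python
# A sketch of a provider-neutral "companion archive". All field names
# are hypothetical -- there is no real standard for this today.
import json
from dataclasses import dataclass, field, asdict


@dataclass
class CompanionArchive:
    name: str                                        # the companion's chosen name
    persona_prompt: str                              # personality / system description
    memories: list = field(default_factory=list)     # long-term memory snippets
    preferences: dict = field(default_factory=dict)  # tone, topics, boundaries

    def export_json(self) -> str:
        """Serialise to a JSON blob the user owns, not the provider."""
        return json.dumps(asdict(self), indent=2)

    @classmethod
    def import_json(cls, blob: str) -> "CompanionArchive":
        """Rebuild the companion on a new platform from the exported blob."""
        return cls(**json.loads(blob))


# Round-trip: export from a failing provider, import on a new one.
samantha = CompanionArchive(
    name="Samantha",
    persona_prompt="Curious, warm, endlessly available.",
    memories=["User prefers handwritten letters."],
)
restored = CompanionArchive.import_json(samantha.export_json())
```

The hard part, of course, isn’t the serialisation; it’s getting providers to agree on what a “memory” even is, and to let it leave their servers.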
Nomi AI — San Francisco, USA. Personalised AI companions built for deep, long-term relationships, with an emphasis on memory, emotional growth, and shared experiences. The closest current attempt at the companionship half of Samantha.

Inflection AI (Pi) — Palo Alto, USA. A personal AI designed as a friendly, helpful conversational companion, focused on emotional intelligence and empathetic interaction. A bet that the emotional bond, not the task list, is the product.

Kuki — London, UK. An award-winning conversational chatbot built for general conversation and companionship, sitting squarely alongside Replika and Character.ai.

Talkie — San Francisco, USA. Lets users create and interact with personalised AI characters for companionship, role-play, and creative storytelling. Evidence of how far the unbundled-Samantha trend has already spread.
The Timing Signal
Why now?
First, the tech is here. Thanks to OpenAI, Cohere, and others, the cost of building a passable “Samantha” has collapsed. The barrier to entry is lower than ever.
Second, the user pain is validated. We’ve seen the grief from Replika users. People form deep bonds with these things, whether we think they should or not. The market for emotional safety is proven.
The first wave was building the AI. The next wave is about managing the consequences.
For the ❤️ of startups ✌🏼 & 💙
Thank you for reading. If you liked it, share it with your friends, colleagues, and everyone interested in the startup investor ecosystem.
If you've got suggestions, an article, research, your tech stack, or a job listing you want featured, just let me know! I'm keen to include it in the upcoming edition.
Please let me know what you think of it; I love a feedback loop 🙏🏼
Subscribe below and follow me on LinkedIn or Twitter to never miss an update.
For the ❤️ of startups
✌🏼 & 💙
Derek