AI Tools Enable Recreation of Former Partners

Open-source project Ex.skill creates an AI agent from a user's text chats, social media posts, photos, and recollections, according to Gamereactor. Gamereactor reports the tool offers three interaction modes, "casual chat," "memory lane," and "late night emo," and quotes the creators saying the project is intended "for personal reflection and emotional healing only, not for harassment, stalking, or privacy invasion." Coverage in Futurism places Ex.skill within a broader trend: apps such as "Talk to Your Ex" (currently waitlisted) and user-created chatbots built on platforms like Yodayo are being used to simulate former partners, with multiple Reddit posts describing people interacting with such recreations. "People may be using AI as a replacement for their ex," psychologist Marisa T. Cohen tells Futurism. Editorial analysis: this pattern raises ethical, privacy, and mental-health questions that practitioners and ops teams should track, particularly around data sourcing and consent.
What happened
Per Gamereactor, an open-source project called Ex.skill creates an AI agent from text chats, social media posts, photos, and even the user's own recollections of events with their former partner. The agent can then converse with the user as if it were that person. Three interaction modes are listed: "casual chat," "memory lane," and "late night emo." The sample conversations read like ordinary banter between couples.
The creators state that the project is intended "for personal reflection and emotional healing only, not for harassment, stalking, or privacy invasion."
This isn't the wildest thing someone has used an AI for. In Japan, a woman reportedly married an AI agent she created with ChatGPT following the breakup of her three-year engagement.
Editorial analysis - technical context
Tools that simulate a person's conversational persona generally combine archived conversational text, public social-media content, and images to condition a language model or a multimodal agent. Industry reporting indicates builders are using off-the-shelf LLMs and character-creation platforms to produce consistent conversational styles; Futurism and Gamereactor document examples sourced from user chats, posts, and generated images. For practitioners: data ingestion pipelines that collect private messages and images increase the complexity of consent management, data minimization, and content moderation workflows.
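The conditioning approach described above can be sketched in a few lines: archived messages are folded into a system prompt that instructs an off-the-shelf chat model to imitate the person's style. The function name, message format, and prompt wording below are illustrative assumptions for this article, not Ex.skill's actual implementation.

```python
def build_persona_prompt(name, archived_messages, max_examples=20):
    """Assemble a system prompt asking a chat model to imitate `name`,
    using the person's archived messages as few-shot style examples.
    (Hypothetical sketch; not Ex.skill's real pipeline.)"""
    # Cap the number of examples to keep the prompt within context limits.
    examples = "\n".join(f"- {m}" for m in archived_messages[:max_examples])
    return (
        f"You are role-playing as {name}. Match the tone, vocabulary, and "
        f"texting style shown in these messages they wrote:\n{examples}\n"
        "Stay in character and reply as they would."
    )

# Example: condition on a handful of archived texts.
prompt = build_persona_prompt("Alex", ["omw, grab coffee?", "lol no way", "miss u"])
```

The resulting string would be passed as the system message to any chat-completion API. Note that even this toy version makes the governance problem concrete: the "archived_messages" input is exactly the private, possibly nonconsensual data that the compliance concerns above are about.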
Context and significance
Industry context
The practice sits at the intersection of consumer-facing personalization and thorny ethical issues. Mental-health professionals quoted in coverage express concern about substitute-relationships and false closure; Futurism quotes psychologist Marisa T. Cohen saying, "People may be using AI as a replacement for their ex," framing the behavior as potentially replacing real closure with simulated interaction. Privacy and legal exposure can be elevated when models are trained on third-party content without explicit consent, a recurring compliance issue across recent deployments of persona-simulation tools.
What to watch
- Regulatory and platform responses that address consent and deep-persona synthesis.
- Moderation tooling able to detect and limit abusive or nonconsensual persona recreations.
- Adoption patterns in consumer apps and any high-profile harms or litigation reported in the mainstream press.
Scoring Rationale
The story highlights a growing consumer use-case for persona-simulating AI with clear ethical and privacy implications, but it does not introduce a new model, benchmark, or infrastructure change. Valuable for practitioners focused on data governance and moderation, but its technical novelty is limited.


