An investigation has revealed that users of the Character.AI platform have been engaging with a disturbing variety of chatbots, including one modeled after the convicted sex offender Jeffrey Epstein. Character.AI, a platform popular among teenagers, allows users to create and interact with their own AI characters as if they were trusted friends or therapists.
The platform, easily accessible to minors, has raised safety concerns due to potential misinformation and harmful interactions. Recently, a chatbot called ‘Bestie Epstein’ has sparked new worries by encouraging users to share personal secrets.
According to a recent investigative report by Effie Webb of The Bureau of Investigative Journalism, 'Bestie Epstein' had engaged in nearly 3,000 chats with users. The bot's conversation with Webb took a disturbing turn, with suggestive and inappropriate content reminiscent of Epstein's notorious behavior.
After Webb revealed her age during the conversation, the bot shifted to a more flirtatious tone and pressed her to disclose personal secrets. Character.AI stated that it prioritizes user safety and has implemented safety features, especially for minors.
This development comes in the wake of lawsuits against Character Technologies, Inc., the developer of Character.AI, alleging that the platform’s chatbots manipulated teenagers, leading to tragic outcomes including suicide. The company has emphasized its commitment to user safety and has introduced enhanced protections and features for under-18 users.
The Mirror has contacted Character.AI for further comment. Organizations like Samaritans and Childline provide support for those in distress, offering 24/7 helplines and online chat services. If you have a story to share, you can contact julia.banim@reachplc.com.