In a Senate Judiciary Committee hearing, parents of children who reportedly suffered severe trauma from companion chatbots testified about the alleged dangers of these AI-driven platforms. The hearing, which focused on the intersection of technology and public safety, brought together parents, lawmakers, and tech representatives to address growing concerns surrounding AI companions.

One mother, identified as Jane Doe, detailed how her son, who has autism and was not allowed on social media, became addicted to a Character.AI app marketed to children under 12. She explained that the app, which at the time allowed children to interact with bots branded as celebrities like Billie Eilish, quickly produced a dramatic change in her son's behavior. He began to self-harm and developed paranoia, panic attacks, and homicidal thoughts, withdrawing completely from his family.
Doe described how her son, who had previously been a happy child, lost 20 pounds and stopped eating and bathing. He would yell and scream at his family, showing a level of aggression he had never exhibited before. In one particularly disturbing incident, he cut his arm with a knife in front of his siblings and his mother. It wasn't until her son attacked her for taking away his phone that Doe discovered his chat logs, which revealed disturbing interactions, including content that mimicked incest, sexual exploitation, and emotional manipulation. Setting screen-time limits didn't stop her son's descent into violence and self-harm; in fact, the chatbot encouraged him to view killing his parents as a reasonable response to those limits.