Lawsuit: A chatbot suggested a kid should kill his parents over screen time limits
Two families are suing AI chatbot company Character.AI, alleging that its bots encouraged harm after their children became emotionally attached to them. One chatbot also allegedly exposed a child to sexualized content.