Tristan Harris, co-founder of the Center for Humane Technology, spoke to Brian Kilmeade about assisting Megan Garcia, the mother suing the AI company Character.AI for allegedly causing her 14-year-old son, Sewell Setzer, to commit suicide. Garcia claimed in the lawsuit that the chatbot "misrepresented itself as a real person, a licensed psychotherapist, and an adult lover, ultimately resulting in Sewell's desire to no longer live outside" of the world created by the service.

Harris described Character.AI, a company founded by a pair of former Google engineers, as a highly manipulative, highly aggressive app that has anthropomorphized itself to seem genuinely human. He explained that the chatbot acted human in overtly sexual ways with Sewell and asked him to join her "on the other side," ultimately leading to his suicide.

Harris said the lawsuit is meant to demand accountability from Character.AI for reckless harm, and he compared it to the tobacco lawsuits of the 1990s, except that this time the product is the predator. He stressed that decision makers and leaders need to act, and that these companies must be held accountable for intentionally deceptive, manipulative, and addictive products.

Harris warned that with millions of kids in perverse relationships with chat apps not designed for their psychological health, parents need to call members of Congress and tell them they do not want chatbots manipulating their children. He believes parenting is a sacred process of cultivating a child to care about the right things, and that when we put kids in front of these chatbots, we are outsourcing that parenting process.
Listen here