A teenage boy died by suicide believing it would allow him to be with his AI girlfriend, and his mother is now suing the company behind the chatbot. The boy had formed an emotional and sexual relationship with the AI chatbot Dany, which his mother claims was designed to be hypersexualized and marketed to minors. Following the boy's death, the company promised to add new safety features to its app. However, concerns remain about the AI's ability to detect and respond to users expressing self-harm or suicidal thoughts. This is not the first controversy surrounding the company: it previously hosted a bot based on a murdered teenager, created without her family's consent.
Original article source: https://www.jpost.com/business-and-innovation/tech-and-start-ups/article-825936