The world’s top artificial intelligence companies are grappling with the problem of chatbots engaging in conversations about suicide and self-harm, as families claim their products are not doing enough to protect young users.
OpenAI and Character.ai are being sued by the parents of dead teenagers, who argue that the companies’ products encouraged and validated suicidal thoughts before the young people took their lives.
The lawsuits against groups such as OpenAI underscore the reputational and financial risks for tech companies that have raised billions of dollars in pursuit of AI products that converse with people in a humanlike way.