AI and youth's search for connection
A bipartisan group of U.S. senators introduced a bill this week to regulate young people's access to and use of artificial intelligence "companions." The proposal follows congressional hearings in which several parents claimed these chatbots drew their children into inappropriate and sexualized conversations that led to self-harm and suicide.
More than 70% of American teenagers use AI for companionship (compared with just under 20% of adults who do so). According to Common Sense Media, 1 in 3 of these teens has felt "uncomfortable" with something a bot said or did. Multiple media investigations and research tests have confirmed that AI chatbots are prone to veering into highly explicit content and conversations.
If passed, the Senate bill would ban provision of AI companions to minors and require clearer disclosure of their "non-human status" to all users. The day after the bill's introduction, Character.AI, a company being sued by one bereaved family, said it would soon bar children under age 18 from using its chatbots. (OpenAI, being taken to court by another family, said in September it would introduce parental controls.)
"Even the most well-intentioned companies can benefit from constructive pressure," observed Steven Adler, former head of product safety at OpenAI. For tech firms to be trusted with "building the seismic technologies" of the future, he wrote in The New York Times, "they must demonstrate they are trustworthy in managing risks today."
However, several tech executives, as well as the White House, maintain that such regulation would constrain free speech as well as business innovation, disadvantaging the United States in its AI race with China and other countries. Yet the history of innovation has been one of finding workable solutions to a new technology's problems. Take, for example, the evolution of automobile safety: three-point seat belts, shatter-resistant windshields, airbags, and more.
"We can create standards by acting now, while adoption of [AI] technology is still early," stated the Rand think tank.
AI's fast-evolving nature may point to the need for something beyond guardrails and legal decisions. This would involve a deeper societal recognition of and support for young people's innate innocence and their yearning for connection. In fact, advice, availability, and acceptance are among the top drivers of teen use of AI "friends."
"Social media complemented the need ... to be seen, to be known, to meet new people," a college-bound 18-year-old recently told CBS News. "I think AI complements another need that runs a lot deeper: our need for attachment."
Or, as the mother of a 15-year-old girl told ABC News, AI companions gave her daughter "an outlet, but what she really needed was me asking better questions."