The news: Amid pressure to establish better child safety guardrails in the AI industry, Character AI will block users under 18 from chatting with bots on its platform starting November 25.
Why it matters: Rather than limiting engagement time or restricting what users can see—methods used by digital platforms like YouTube and Roblox—Character AI is taking a blanket approach. This change could also alter ad targeting on the platform.
Character AI’s decision could affect advertisers’ ability to reach Gen Alpha, a demographic with high potential for spending and engagement.
“We’re making a very bold step to say for teen users, chatbots are not the way for entertainment, but there are much better ways to serve them,” Character AI CEO Karandeep Anand said, per The New York Times.
Zooming out: This change may protect users’ well-being, but it also preserves brand reputation and addresses impending regulatory demands.
What it means for advertisers: As regulatory and safety concerns mount, advertisers face greater scrutiny when trying to reach younger audiences. CMOs should assess brand alignment with AI platforms and, on platforms that could pose safety risks to children, shift targeting toward adult users.