
From Likes to Chatbots: Why Regulators Are Worried About AI’s Grip on Young Minds

For years, social media has been the lightning rod in debates over how technology shapes young people’s lives. Concerns about Instagram likes, TikTok trends, and Snapchat streaks have dominated living room conversations and policy hearings alike. But now, the spotlight has shifted to a new player in the digital world—AI chatbots.

These “digital companions” are designed to talk, comfort, and even teach. From offering homework help to giving emotional support at midnight, chatbots are quickly becoming an everyday presence for teenagers. But their growing influence has triggered alarm bells at the highest levels of regulation.

This week, the U.S. Federal Trade Commission (FTC) ordered seven major tech companies—including Google’s parent Alphabet, OpenAI, Meta, Snap, Elon Musk’s xAI, and Character.ai—to hand over detailed information on how children and teens are using their chatbots.

The FTC’s questions are pointed: Are these AI “friends” safe? What protections are in place for children? How is user data collected, stored, or shared? And, perhaps most importantly, are these companies profiting by keeping kids hooked on endless conversations with machines?

OpenAI, the maker of ChatGPT, has promised to work “closely and transparently” with the regulator, emphasizing that its tools are meant to be safe and useful for everyone. Snap, too, echoed support for the FTC’s focus on AI safety. But others—Alphabet, Meta, and xAI—have remained silent so far.

The double-edged sword of digital companionship

Recent studies suggest that AI chatbots offer both benefits and risks to young users. On the positive side, they can function as personalized tutors, helping students with math problems, essay drafts, or even foreign language practice. Some teens also report turning to chatbots for emotional support, describing them as “always available” companions during stressful times.

Yet the dangers cannot be ignored. Psychologists warn of emotional dependency, where children may begin to prefer AI companionship over real friendships. This could lead to social withdrawal and a weakening of real-world communication and critical thinking skills. Constant, unfiltered conversations blur the line between reality and simulation, raising questions about how prepared kids will be to handle genuine human relationships.

There’s also the risk of misinformation. While AI tools are sophisticated, they are not infallible. An inaccurate or harmful response on topics like mental health, relationships, or personal safety could have serious consequences for impressionable users.

Why regulators are stepping in now

The FTC’s action reflects a growing urgency to create rules before these technologies become deeply ingrained in young people’s daily lives. Unlike traditional social media, which relies on likes and shares, chatbots interact directly and continuously, making their influence more personal—and potentially more powerful.

Experts argue that three key steps are critical to protecting children:

  1. Digital literacy education – Teaching kids to question what they read and not rely entirely on AI for advice.
  2. Age safeguards – Stronger verification tools to ensure younger children don’t gain unrestricted access.
  3. Parental awareness – Encouraging families to stay engaged in how their children use chatbots, just as they once monitored social media use.

As regulators dig deeper, the outcome of the FTC inquiry could set the stage for how AI is governed worldwide. For parents, educators, and policymakers, the challenge will be balancing innovation with safety—making sure chatbots serve as helpful assistants, not substitutes for human connection.

In a world where the definition of “friendship” is expanding to include digital voices, the question is no longer just about screen time. It’s about shaping the emotional and social futures of an entire generation.
