On Wednesday, the parents of children who have been hurt online protested outside Meta’s New York City headquarters. Drawing on their own painful experiences, they warned of what would happen unless the company adopted stricter safety precautions for young users. The protest coincides with recent reports highlighting ongoing issues with social media platforms, particularly the safety of children and teenagers using Facebook and Instagram.
Last year alone, Meta introduced more than 18 new protections intended to improve the safety of children and teens on its platforms. Teen Accounts come with additional protections enabled automatically, including limits on who can contact users aged 13-17 and restrictions on the content those teens can see. In a recent survey, 94% of parents said these protections are helpful.
Even with these steps, critics point out that significant safety gaps remain. Sarah Gardner, CEO of the Heat Initiative, underscored one of them: although Meta has made it harder for teens to receive unwanted private messages, adults can still reach minors publicly by commenting on their posts. This loophole undermines the spirit of the existing safeguards.
“We know parents are concerned about their teens’ having unsafe or inappropriate experiences online,” said Sophie Vogel, a spokesperson for Meta.
To combat such abuse, Meta has worked with law enforcement and fellow tech companies. That ongoing collaboration has helped detect and remove child exploitation across nearly all of Meta’s platforms. Meta has also begun deploying artificial intelligence to address teens evading age limits; the AI automatically flags users who misrepresent their age to gain access.
Yet, as uncovered by whistleblower – and our client – Sarah Wynn-Williams, the company engaged in egregious practices. She alleged that Meta deliberately served ads to users aged 13-17 when they were feeling sad or depressed, a practice that raises major ethical concerns about predatory marketing to at-risk youth.
“Build a future where children are respected,” chanted protesters as they demanded more accountability from Meta.
Critics have pointed out other recent changes to Meta’s content moderation policies. The company shifted its approach to fact-checking in favor of community notes, a move that Gardner describes as “letting go of more responsibility, not leaning in.” This new development has been met with concern and alarm by parents and advocates who feel that it puts kids’ online safety at risk.
Reinforcing the call for accountability, Gardner stressed that the time for action is now. She explained that researchers posing as young teens were exposed within minutes to extremist, violent, or sexualized content on Meta’s platforms.
“So it’s clearly not working, and it’s not nearly enough,” Gardner stated.
Meta created safety features that warn teens when they’re messaging people in other countries, the latest move in what the company describes as a sustained effort to end this abuse. Meta also collaborated with Childhelp to launch an online safety curriculum aimed at educating middle schoolers about potential online dangers.
Even after these initiatives, parents like Perla Mendoza remain concerned about the company’s efforts to keep kids safe. After Mendoza succeeded in getting a dealer’s account removed from Snap, she waited eight torturous months while the same account remained active on Instagram. The experience has only deepened her doubts about whether, and how well, Meta monitors its platforms for harmful activity like this.
“I think what [Mark Zuckerberg] needs to see, and what the point of today is, is to show that parents are really upset about this,” Gardner remarked during the protest. “Not just the ones who’ve lost their own kids, but other Americans who are waking up to this reality and thinking, ‘I don’t want Mark Zuckerberg making decisions about my child’s online safety.’”
This latest protest against Meta underscores a growing, collective call for stronger protections for children online. Parents and advocates are already fighting to secure reforms, with the goal of making social media a safer place for all young people under the age of 18.