FTC Confronts Meta Over Instagram’s Alleged Risks to Young Users’ Safety


The FTC has brought serious accusations against Meta, claiming the company endangered minors on Instagram amid internal clashes over how to allocate resources for user safety.

Key Takeaways

  • The FTC presented evidence in an antitrust trial, focusing on Instagram’s past safety issues related to child protection.
  • An internal report revealed Instagram algorithms inappropriately suggested minors to adults with predatory tendencies.
  • Emails and testimony suggest Mark Zuckerberg withheld resources for Instagram due to fears of overshadowing Facebook.
  • A Meta spokesperson claims safety measures for teens have been implemented, including default private accounts.
  • Meta says it has invested in safety efforts since 2018, supporting legislative updates for child protection.

FTC’s Allegations Against Meta

The Federal Trade Commission accused Meta of endangering young users on Instagram by allowing the platform’s algorithms to recommend minors to predatory adults. The claims surfaced during an antitrust trial in which Instagram’s past safety deficiencies came into focus. Citing a 2019 internal report, the FTC noted that 27% of the recommendations in question involved minors and that over two million problematic accounts were identified within a span of three months.

Trial testimony, supported by internal emails and reports, suggested that Mark Zuckerberg withheld resources needed for Instagram’s safety initiatives. These decisions were allegedly driven by concerns that ramping up investment in Instagram could slow Meta’s growth, which some internal voices feared might undermine Facebook’s standing.

Child Safety and Internal Turmoil

The FTC argues that Meta’s acquisition of Instagram led to significant underinvestment in user safety, ultimately harming consumers. Tension grew within the company as resources were unevenly split between Instagram and Facebook, leaving Instagram lagging well behind in addressing issues such as child exploitation.

The situation worsened: an analysis of 3.7 million user reports found that one-third came from minors, 54% of whom reported inappropriate interactions with adults. Meta executives themselves acknowledged these security lapses, revealing that Instagram’s safety teams were sorely understaffed and lacked adequate resources.

Meta’s Response

Despite these claims, a Meta spokesperson stated that the company has undertaken several child safety efforts since 2018, including making teen accounts private by default and limiting adult-minor interactions. Meta has also supported legislative updates aimed at strengthening child protection on digital platforms.

This reflects an attempt by Meta to assure the public and regulators that it remains committed to user safety, regardless of the accusations presented in court.

Sources:

  1. Meta Antitrust Trial: FTC Says Instagram Urged ‘Groomers’ to Connect With Minors – Bloomberg
  2. FTC Describes Instagram as a Groomer’s Paradise at Meta Antitrust Trial