November 29, 2024

Instagram: Former Facebook employee claims company does not “understand or actually address the harms” teens face online, Meta responds


A former employee of Facebook parent Meta testified before a US Senate subcommittee, claiming that the company failed to address the harassment and other harms facing teens on its platforms despite being aware of them.
“It’s time that the public and parents understand the true level of harm posed by these ‘products’ and it’s time that young users have the tools to report and suppress online abuse,” said Arturo Bejar, who worked on well-being for Instagram from 2019 to 2021.
Bejar was also a director of engineering for Facebook’s Protect and Care team from 2009 to 2015. He testified before the Senate Judiciary Subcommittee on Privacy, Technology and the Law at a hearing about social media and its impact on teen mental health.
His job was to tweak the design of Facebook and Instagram in ways that would nudge users toward more positive behaviours. He also had a role in providing tools for young people to manage unpleasant experiences on the platform.
Bejar said that he met regularly with senior executives at the company, including CEO Mark Zuckerberg, but that executives decided "time and time again to not tackle this issue."
“However, managers including Meta CEO Mark Zuckerberg do not seem to seek to understand or actually address the harms being discussed,” he said.
“I found it heartbreaking because it meant that they knew and that they were not acting on it,” said Bejar.
What Meta has to say
Meanwhile, Meta said in a statement that it is committed to protecting young people online.
“Every day countless people inside and outside of Meta are working on how to help keep young people safe online. All of this work continues,” the company said.
Project Lantern
The company also announced its participation in the Lantern program, launched by the Tech Coalition. The group, which includes Google and Snapchat parent Snap, enables technology companies to share signals about accounts and behaviours that violate their child safety policies.
Meta, a founding member of Lantern, provided the Tech Coalition with the technical infrastructure behind the program.


