Facebook’s vice president of global affairs, Nick Clegg, said Sunday that the company should adopt new management tools for its apps to steer teens away from harmful content. The statement comes as US lawmakers analyze how the social network and its affiliates, such as Instagram, are affecting the mental health of young people.
Clegg also said the company is open to the idea of letting regulators access the algorithms used to amplify the reach of posts on its networks.
Algorithms “should be held to account, if necessary through regulation, so that people can compare what our systems say they should do with what is actually happening,” Clegg told CNN’s State of the Union program.
The interview took place days after former Facebook employee and whistleblower Frances Haugen testified about how the company encourages users to keep browsing its social networks indefinitely, harming teens’ well-being in particular.
“We are going to introduce something that I think will make a huge difference, which is for our systems to recognize that a teen is seeing the same content over and over again, and that the content may not be good for their well-being, and we will encourage them to look at other content,” Clegg said.
Additionally, the executive said the company will introduce a “Take a Break” feature to encourage teens to stop using Instagram for a while.
Last week, US senators questioned Facebook about its plans to protect young people, citing internal research showing the social network knows Instagram is harmful to young people’s mental health.
Democratic Senator Amy Klobuchar, who chairs the Senate Judiciary Committee’s antitrust subcommittee, has been calling for more regulation against tech companies like Facebook.
However, Clegg said he could not determine whether the algorithms amplified the posts of the people who stormed the Capitol on January 6, another question facing the social network.