By Barbara Ortutay, AP Technology Writer
A Facebook executive is pushing back on a whistleblower's claims — supported by the company's own internal research — that the social network's products harm children and fuel polarization in the U.S.
Monika Bickert, Facebook's head of global policy management, told The Associated Press Wednesday that "we do not and we have not prioritized engagement over safety." Bickert said the reason Facebook researches teen well-being on Instagram is so that the company can build better products and features to support them.
Whistleblower Frances Haugen, however, testified before the Senate Tuesday that Facebook knows that vulnerable people are harmed by its systems and has not made meaningful changes to prevent it. The platform is designed to exploit negative emotions to keep users engaged, she said.
"They are aware of the side effects of the choices they have made around amplification," Haugen said. "They know that algorithmic-based rankings, or engagement-based rankings, keeps you on their sites longer. You have longer sessions, you show up more often, and that makes them more money."
Bickert pointed to features and tools Facebook has introduced over the years, such as hiding "like counts" on Instagram "which means when you post something, if you're a young person, you don't have to worry about how many people are going to like your post and whether people will see that."
But Facebook's own researchers found that hiding like counts did not help make teenagers feel better.
South Korea fines Meta $15 million for illegally collecting information on Facebook users
South Korea's privacy watchdog on Tuesday fined social media company Meta 21.6 billion won ($15 million) for illegally collecting sensitive personal information from Facebook users, including data about their political views and sexual orientation, and sharing it with thousands of advertisers.
It was the latest in a series of penalties against Meta by South Korean authorities in recent years as they increase their scrutiny of how the company, which also owns Instagram and WhatsApp, handles private information.
Following a four-year investigation, South Korea's Personal Information Protection Commission concluded that Meta unlawfully collected sensitive information about around 980,000 Facebook users, including their religion, political views and whether they were in same-sex unions, from July 2018 to March 2022.
It said the company shared the data with around 4,000 advertisers.
South Korea's privacy law provides strict protection for information related to personal beliefs, political views and sexual behavior, and bars companies from processing or using such data without the specific consent of the person involved.
The commission said Meta amassed sensitive information by analyzing the pages the Facebook users liked or the advertisements they clicked on.
The company categorized ads to identify users interested in themes such as specific religions, same-sex and transgender issues, and issues related to North Korean escapees, said Lee Eun Jung, a director at the commission who led the investigation into Meta.
"While Meta collected this sensitive information and used it for individualized services, they made only vague mentions of this use in their data policy and did not obtain specific consent," Lee said.