Facebook says it is working on a version of its Instagram app for kids under 13, who are technically not allowed to use the app in its current form due to federal privacy regulations, chiefly the Children's Online Privacy Protection Act.
The company confirmed an earlier report by BuzzFeed News on Friday, saying it is "exploring a parent-controlled experience" on Instagram.
The move came just a day after Facebook announced a slew of new measures intended to keep teenagers safe on Instagram — but that announcement made no mention of the plan to build an Instagram for kids.
Critics immediately raised concerns, saying a kid-friendly Instagram is just a way for Facebook to expand its user base and condition children to use its products so it can make money off them later.
"Facebook poses one of the biggest threats when it comes to children's privacy," said Rasha Abdul-Rahim, co-director of Amnesty Tech, an arm of the nonprofit Amnesty International. "Increasing safeguards for children online is paramount, but the fact remains that Facebook will be harvesting children's data and profiting off their detailed profiles."
Facebook launched the Messenger Kids app in 2017, pitching it as a way for children to chat with family members and friends approved by parents. It doesn't give kids separate Facebook or Messenger accounts. Rather, the app works as an extension of a parent's account, and parents get controls, such as the ability to decide who their kids can chat with. But many child-development experts urged the company to pull it, saying kids don't need to be on social media.
"Increasingly kids are asking their parents if they can join apps that help them keep up with their friends," Facebook said in a statement. "Right now there aren't many options for parents, so we're working on building additional products — like we did with Messenger Kids — that are suitable for kids, managed by parents."
When it launched Messenger Kids, Facebook said it wouldn't show ads or collect data for marketing to kids.
South Korea fines Meta $15 million for illegally collecting information on Facebook users
South Korea's privacy watchdog on Tuesday fined social media company Meta 21.6 billion won ($15 million) for illegally collecting sensitive personal information from Facebook users, including data about their political views and sexual orientation, and sharing it with thousands of advertisers.
It was the latest in a series of penalties against Meta by South Korean authorities in recent years as they increase their scrutiny of how the company, which also owns Instagram and WhatsApp, handles private information.
Following a four-year investigation, South Korea's Personal Information Protection Commission concluded that Meta unlawfully collected sensitive information about around 980,000 Facebook users, including their religion, political views and whether they were in same-sex unions, from July 2018 to March 2022.
It said the company shared the data with around 4,000 advertisers.
South Korea's privacy law provides strict protection for information related to personal beliefs, political views and sexual behavior, and bars companies from processing or using such data without the specific consent of the person involved.
The commission said Meta amassed sensitive information by analyzing the pages the Facebook users liked or the advertisements they clicked on.
The company categorized ads to identify users interested in themes such as specific religions, same-sex and transgender issues, and issues related to North Korean escapees, said Lee Eun Jung, a director at the commission who led the investigation on Meta.
"While Meta collected this sensitive information and used it for individualized services, they made only vague mentions of this use in their data policy and did not obtain specific consent," Lee said.