Instagram Kids

According to BuzzFeed News, Instagram head Adam Mosseri has confirmed that a version of the popular photo-sharing app for children under the age of 13 is in the works. The Facebook-owned company knows many kids want to use Instagram, but there isn’t a “detailed plan yet.”

But, as Mosseri told BuzzFeed News, “part of the solution is to create a version of Instagram for young people or kids where parents have transparency or control. It’s something we’re looking into.” Instagram’s current policy forbids children under the age of 13 from using the app.

In an email to The Verge, Facebook spokesperson Joe Osborne said, “Increasingly, kids are asking their parents if they can join apps that help them keep up with their friends. Right now, there aren’t many choices for parents, so we’re focusing on developing additional products — like Messenger Kids — that are appropriate for children and can be managed by parents. We’re exploring a parent-controlled Instagram experience to help kids stay in contact with friends, explore new hobbies and interests, and more.”

According to a message obtained by BuzzFeed News from an internal messaging board, Instagram vice president of product Vishal Shah said a “youth pillar” project has been listed as a company priority. Shah said the company’s Community Product Group will concentrate on privacy and safety concerns to ensure the best possible experience for teenagers. Mosseri and vice president Pavni Diwanji, who managed YouTube Kids at Google, will be in charge of the project.

Instagram posted a blog earlier this week about its efforts to make the app safer for its youngest users, but there was no mention of a new version for kids under the age of 13.

Targeting online products at children under the age of 13 raises not only privacy issues but also legal concerns. In September 2019, the Federal Trade Commission fined Google $170 million for breaching the Children’s Online Privacy Protection Act (COPPA) by recording children’s viewing histories on YouTube to deliver targeted advertisements. Musical.ly, the precursor of TikTok, was fined $5.7 million in February 2019 for violating COPPA.

In 2017, Facebook released an ad-free version of its Messenger chat app aimed at children aged 6 to 12. Children’s health advocates criticized the app as harmful and urged Facebook CEO Mark Zuckerberg to discontinue it. Then, in 2019, a bug in Messenger Kids allowed children to join group chats with strangers, placing thousands of children in chat rooms with people their parents had not approved. Facebook quietly shut down the affected chats, saying that only a “small number” of users were affected.


