Instagram on Tuesday announced new features for its parental control tool, allowing parents or legal guardians to more directly supervise the activities of their minor children.

Parents and guardians will now be able to send invitations to their teens offering to supervise their accounts; the teen must accept the request for the tool to be activated.

Until now, only the teens themselves could initiate parental supervision, an option that has been available in the United States since March.

“It is crucial for us to develop tools that respect young people’s right to privacy and autonomy while involving parents in the experience,” said Clotilde Briend, public affairs program manager at Meta (the parent company of Facebook and Instagram) in France.

With the new tool, parents will be able to limit screen time by setting daily app usage limits (from 15 minutes to two hours) or by scheduling breaks.

Teens will also be able to alert their parents or legal guardians if they encounter content that violates Instagram’s rules (incitement to hatred or violence, nudity, etc.).

Lastly, parents and guardians will have access to their children’s Instagram contacts, both their followers and the accounts they follow.

These features will be available by the end of June in France, Germany, the UK, Ireland, Japan, Canada and Australia, as well as the US, before rolling out to the rest of the world by the end of the year.

A family information center with expert advice and supervision resources will also be accessible from Instagram. A first version of this platform was launched in March for American users.

Parental control tools will also be deployed for Meta’s Oculus Quest virtual reality headsets.

Instagram is frequently accused by American elected officials and child protection associations of having harmful effects on its youngest users.

In the fall of 2021, Frances Haugen, a former Facebook employee, leaked internal documents showing that the platform’s leaders were aware of certain risks to minors, particularly the mental health of some young girls confronted with the myth of the ideal female body.

Instagram has since sought to demonstrate its commitment to protecting teenagers, as have other platforms popular with young audiences.

TikTok announced new features last week aimed at limiting the screen time of its underage users.