Meta Now Requires Parental Approval for Instagram Live Use by Younger Teens

Meta has updated Instagram to require parental permission via Family Center before teens under 16 can use the Live feature, expanding safeguards.

Meta is adjusting controls for its younger users on Instagram, specifically restricting access to the platform’s real-time video broadcasting feature, Instagram Live. Individuals confirmed to be under the age of 16 must now secure explicit permission via a linked parent account before initiating a livestream.

The gatekeeping mechanism operates through Meta’s established Family Center parental supervision tools, adding another layer of oversight for guardians.

This Instagram Live update, which Meta confirmed is now rolling out globally, was announced alongside a similar permission requirement for under-16s wanting to deactivate the automatic blurring of potential nudity in Instagram direct messages. That nudity filter, introduced globally in late 2024, is now the default setting for all teen accounts worldwide.

These Instagram-specific adjustments arrive as Meta prepares a broader, phased global expansion of its “teen account” framework—encompassing features like time limits and contact monitoring—onto Facebook and Messenger over the next few months, starting initially in the US, UK, Canada, and Australia. Meta reports high uptake for its existing Instagram system, stating that 54 million under-18s globally use the supervised accounts, with over 90% of those aged 13-15 keeping the initial, more restrictive default settings enabled.

Expanding Supervision Across Platforms

The underlying structure enabling these controls, Meta’s Family Center, was introduced in mid-2023. It provides parents with a dashboard to oversee aspects like privacy settings and app usage time, and it forms the backbone of the new Live permission system. Explaining the company’s tiered approach, in which 16- and 17-year-olds have more latitude to change defaults than younger teens, Instagram head Adam Mosseri said earlier that the goal is to “give parents meaningful controls” while providing older teens increasing “autonomy over their experience.”

This latest restriction on Instagram Live follows a series of safety enhancements implemented on the photo-sharing platform during 2024. Measures against sextortion included blocking the ability to screenshot or save temporary DM photos and videos, restricting interactions with suspicious accounts, and limiting visibility of follower lists for potentially harmful actors. These steps were designed to curb common scammer tactics.

While these tools add layers of protection, their overall effectiveness depends on factors outside the interface: accurate age verification, a persistent challenge for online platforms, and the extent to which parents actively use the supervision features offered. As several child safety groups have noted, parental engagement remains a key variable.

Navigating Regulatory and Safety Concerns

Meta’s ongoing adjustments to teen features occur under considerable regulatory watch, particularly in Europe and the UK. The company is facing formal proceedings initiated by the European Commission in May 2024 under the Digital Services Act (DSA). This official investigation scrutinizes Meta’s compliance regarding child protection on Facebook and Instagram, specifically examining potential addictive platform designs, the efficacy of age verification, and overall privacy safeguards for minors.

European Commission Executive Vice President Margrethe Vestager stated at the time, “Today we are taking another step to ensure safety for young online users. With the Digital Services Act, we established rules that can protect minors when they interact online.” Potential penalties for DSA violations are substantial.

Simultaneously, the UK is implementing its wide-ranging Online Safety Act, which mandates that platforms take steps against illegal content and shield minors from harmful material. Safety groups recently underscored this legislative backdrop, raising concerns that trade deals could weaken the act.

Reacting to Meta’s latest Instagram controls, the UK’s NSPCC acknowledged the changes but reiterated the need for broader action. “For these changes to be truly effective, they must be combined with proactive measures so dangerous content doesn’t proliferate on Instagram, Facebook and Messenger in the first place,” stated Matthew Sowemimo, the charity’s associate head of policy for child safety online, according to The Guardian.

Markus Kasanmascheff
Markus has been covering the tech industry for more than 15 years. He holds a Master's degree in International Economics and is the founder and managing editor of Winbuzzer.com.
