Instagram is tightening safety controls for teens with a broad set of changes aimed at keeping underage users away from harmful content. Anyone under 18 will now default to a PG-13 experience on the platform, filtering out themes like extreme violence, sexual nudity, and graphic drug use. Teens can’t switch this setting off unless a parent or guardian approves the change.
A new Limited Content filter goes even further. When the filter applies to a post, teens won’t be able to see it or comment on it. Instagram says similar safeguards are coming to chats with AI bots for accounts under the Limited Content setting, with some restrictions already applied to AI conversations under the PG-13 policy.
The crackdown extends across recommendations, follows, search, and DMs:
– Teens won’t be allowed to follow accounts that routinely share age-inappropriate material. If they already follow such accounts, they won’t see the content or interact with it, and those accounts won’t be able to interact with them.
– These accounts will also be removed from recommendations, making them harder for teens to discover.
– Direct messages linking to inappropriate material will be blocked from teen accounts.
– Instagram already limits content related to self-harm and eating disorders. It’s now adding broader keyword protections, blocking search terms such as “alcohol” and “gore” and guarding against deliberate misspellings intended to slip past the filters.
Parents will get new tools as well. Instagram is testing a way for caregivers using supervision features to flag posts they believe shouldn’t be recommended to teens; those reports will be reviewed by a moderation team.
The update rolls out starting today in the United States, United Kingdom, Australia, and Canada, with a global launch planned for next year. For families and younger users, the result should be a more age-appropriate feed, stricter controls on interactions, and clearer parental oversight, all designed to reduce exposure to harmful or mature content on the platform.