
Instagram Adds Safety Features to Accounts That Feature Photos of Children

Meta is adding safety features and protections to Instagram accounts that primarily share photos and videos of children.

Although children under 13 aren’t allowed to create their own accounts on the app, Meta does let adults — such as parents or managers — manage accounts that share photos and videos of kids.

In a blog post on Wednesday, Instagram’s parent company announced that it is strengthening protections for accounts run by adults that primarily feature children, placing them under stricter message settings.

“While these accounts are overwhelmingly used in benign ways, unfortunately, there are people who may try to abuse them, leaving sexualized comments under their posts or asking for sexual images in DMs, in clear violation of our rules,” Meta writes. “Today we’re announcing steps to help.”

These child-focused accounts will automatically be placed into the app’s strictest message settings to prevent unwanted messages, and will have the platform’s “Hidden Words” feature enabled to filter offensive comments.

Meta will show these accounts a notification at the top of their Instagram Feed, letting them know the company has updated its safety settings, and prompting them to review their account privacy settings too. Meta says these changes will roll out in the coming months.

The company is also rolling out new safety features for teen accounts to give them more context about the accounts they’re messaging and help them spot potential scammers. When teens chat with someone in DMs, they can now tap on the “Safety Tips” icon at the top of the conversation to bring up a screen where they can restrict, block, or report the other user. Meta has also launched a combined block and report option in DMs, so that users can take both actions together.

In the announcement, Meta says that 99% of people, including teens, have kept its “Nudity Protection in DMs” feature, which automatically blurs images detected as containing nudity, turned on. The company says that in June, over 40% of blurred images received in DMs stayed blurred, significantly reducing exposure to unwanted nudity. Nudity protection, which is on by default for teens, also encourages people to think twice before forwarding suspected nude images; in May, people decided against forwarding around 45% of the time after seeing this warning.


Image credits: Header photo licensed via Depositphotos.

