Instagram will begin limiting what teenage users can see on the platform using the movie industry's PG-13 standard, the company announced Tuesday, marking its most sweeping attempt yet to shield minors from explicit or inappropriate material.

The Meta-owned app said the new policy, set to roll out by the end of the year, will also extend to its artificial intelligence chatbots, which are under investigation by lawmakers for engaging in sexually suggestive conversations with children.

"Our North Star in the teen experience is parents," said Max Eulenstein, Instagram's head of product management. "That's what led to this development and why we focused on the PG-13 standard."

Under the changes, teen users will be restricted from viewing or searching for mature content, interacting with certain accounts, or receiving recommendations for content containing nudity. Parents will also gain access to a new "Limited Content" mode, which applies stricter filters than the PG-13 threshold.

The update follows mounting legal and political pressure on Meta over its handling of child safety. Lawmakers have accused the company of designing addictive products that harm young users, and Meta faces multiple lawsuits from parents and state attorneys general alleging negligence.

Instagram said the new system will rely on A.I. moderation and parent rating panels modeled after the Motion Picture Association's film classification process.
