Meta Platforms is implementing additional safeguards on its popular social media platforms, Instagram and Facebook, in response to growing regulatory pressure and heightened scrutiny over the safety of teen users. The move follows recent commitments by the company, which also owns WhatsApp, to strengthen content protections for teens.
The decision to bolster safety measures comes in the wake of regulatory attention, particularly after a former Meta employee testified before the U.S. Senate. The former employee alleged that Meta was aware of instances of harassment and other potential harm faced by teens on its platforms but had failed to take adequate action.
As part of its new measures, Meta said that teens on Instagram will, by default, no longer receive direct messages from individuals they do not follow or are not connected to on the platform. Additionally, parental approval will be required for certain settings changes within the app, adding a layer of oversight and protection.
For Messenger, Meta is implementing age-specific restrictions. Users under the age of 16, and under 18 in certain regions, will only be able to receive messages from Facebook friends or people linked through their phone contacts. In a further safeguard, adults aged 19 and above will be barred from sending direct messages to teens who do not follow them.
Meta has not disclosed how many of its users are under the age of 18. The newly announced measures aim to address concerns raised by regulators and underscore Meta's stated commitment to fostering a safer online environment for its younger user base.