Meta’s Efforts to Restrict Teen Content

Paresh Jadhav


Meta, the parent company of Facebook and Instagram, will restrict advertisers’ ability to use data about its users to target ads at teens. Gender will no longer be a factor in targeting ads to teens, and beginning in March, teens themselves will have more control over what content appears in their feeds.

Removing Sensitive Content from Teens’ Feeds

Meta, a leading social media company, has announced new tools to help parents monitor and restrict what content their teens access online. The company is also stepping up efforts to limit the time teens spend in its apps and to restrict sensitive content, such as material involving nudity or drug sales.

As of 2021, Facebook no longer permits advertisers to use gender and other demographic data when targeting ads at teens on Facebook and Instagram; instead, they can rely only on age and location data for targeting campaigns.

The changes are an attempt to quell concerns that technology is fuelling an addiction epidemic among children and contributing to a mental health crisis. Lawmakers have pushed for stricter regulations, and Surgeon General Vivek Murthy has called for social media health warnings similar to those found on cigarette packages.

Parental management and supervision tools only go so far, and even when parents enforce them they can easily be bypassed. That’s why Meta is providing reminders that encourage teens to step away from its apps after approximately 20 minutes, along with options to set daily time limits for scrolling.

Restricting Search Results for Everyone

Meta, the company formerly known as Facebook, has taken steps to safeguard young users by restricting what content they see in its apps. Under the updates for teens using Meta apps, certain types of content won’t appear in their feeds, search results related to self-harm and eating disorders will be hidden, and users will instead be offered support resources.

These changes come amid increasing accusations from experts such as psychologists and academics that teen-oriented apps are harmful and addictive. The new protections are based on their advice and research.

Meta reported that the update will be complete within weeks. It will prevent teens from seeing Reels or Instagram Explore content that depicts self-harm, eating disorders or other sensitive topics, even when it is shared by people they follow. It will also limit search queries related to suicide, self-harm or eating disorders and direct users toward expert resources for assistance.


Increasing Prompts to Update Privacy Settings

If a teenager searches for topics like suicide or eating disorders, they won’t see sensitive content even if it was shared by someone they follow; instead, they will be directed toward expert resources for assistance.

Teens will also receive reminders to review and update their privacy settings so they see the most pertinent ads on Instagram and Facebook, and they will be shown how much time they are spending in these apps.

Meta will also limit connections between teen accounts and suspicious adults who could use them for sextortion or other harmful activities. Such adults won’t appear in teens’ People You May Know recommendations and won’t be able to contact teens directly. Furthermore, new tools will enable minors to attach hashes, or digital fingerprints, to intimate images and videos so that family members or law enforcement can have them removed efficiently. A rough sketch of the fingerprinting idea follows below.
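As an illustration of the “digital fingerprint” idea, the sketch below computes a simple SHA-256 hash of an image file in Python. This is only an assumption for explanatory purposes: the function and file name are hypothetical, and Meta’s actual matching systems are not described here; in practice such programs tend to rely on perceptual hashes that survive resizing or re-encoding rather than a plain cryptographic digest.

```python
import hashlib

def image_fingerprint(path: str) -> str:
    """Return a SHA-256 hex digest of the file at `path`.

    Illustrative only: a cryptographic digest like this matches
    byte-identical copies, whereas production image-matching systems
    typically use perceptual hashing to tolerate re-encoding.
    """
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large videos don't need to fit in memory.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical usage: only the fingerprint is shared with the platform,
# never the image itself, and uploads are compared against it.
# print(image_fingerprint("photo_to_block.jpg"))
```

The point of hashing is that the sensitive image never has to leave the user’s device; only the fingerprint does.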

Removing Ads

Meta has pledged to remove ads that are “offensive or inappropriate for teens.” This move limits marketers’ ability to target this audience and distinguishes Meta from TikTok, which has drawn young consumers with viral challenges that have resulted in deaths and algorithms that push eating disorder content at young users.

Facebook and Instagram both prohibit anyone under 13 from creating an account and delete such accounts when they are found. But the platforms have come under increasing pressure since a whistleblower released internal data suggesting Instagram worsens suicidal thoughts and eating disorders among some teen girls.

These updates offer additional protections for teen users, such as defaults that limit who can repost their content, restrict messages from others and hide offensive comments. They also encourage teenagers to take a break from social media by showing them how long they have spent in an app and giving them options to set daily time limits on usage.


