NY AG Releases Proposed Rules for SAFE for Kids Act to Restrict Addictive Social Media Features

By Forum Staff

State Attorney General Tish James on Monday released proposed rules on how social media companies should restrict addictive features on their platforms to protect children’s mental health, as required by the Stop Addictive Feeds Exploitation (SAFE) for Kids Act.

Algorithmically personalized feeds are known to drive unhealthy levels of social media use among minors, which can harm their mental health. Research shows that children as young as 10 to 14 years old experience addictive use of social media, and the more time children spend online, the more likely they are to experience negative mental health outcomes such as depression, anxiety, and eating and sleep disorders.

The SAFE for Kids Act addresses these mental health concerns by requiring social media companies to restrict addictive feeds for users under 18. Instead of the default algorithmically personalized feeds designed to keep young people on the platform, users under 18 will be shown content only from accounts they follow or otherwise select, presented in a set sequence such as chronological order, unless they obtain parental consent for an algorithmically personalized feed. Users cannot be cut off from a platform simply because they do not want, or do not have, parental consent for an addictive feed; all users will still be able to access the same content they can access now.

The law also prohibits social media platforms from sending notifications to users under 18 from 12 a.m. to 6 a.m. without parental consent.

Age Assurance

Before allowing a user to access algorithmic feeds and/or nighttime notifications, social media companies must ascertain that the user is 18 or older. Companies may confirm a user’s age using a number of existing methods, as long as those methods are shown to be effective and to protect users’ data. Options include:

  • Requesting an uploaded image or video; or
  • Verifying a user’s email address or phone number to cross-check against other information that reflects the user’s age.

The proposed rules also set the following requirements:

  • Social media companies must offer at least one alternative method of age assurance besides providing a government-issued ID.
  • Any information used to determine age or obtain parental consent must not be used for any other purpose and must be deleted or de-identified immediately after its intended use.
  • Young users who turn 18 must have an option to update their age status on the platform.
  • Social media companies must choose an age assurance method with a high accuracy rate, conduct annual testing, and retain the testing results for at least 10 years.

Parental Consent

  • Social media companies must first receive a minor’s approval to request parental consent for algorithmic feeds and/or nighttime notifications. Once a minor approves, the platform may seek verifiable parental consent to allow a minor to access algorithmic feeds and/or nighttime notifications.
  • The platform may not block the minor from generally accessing the platform or its content, for example through searches, simply because the minor or their parent has declined to consent.
  • The platform is not required to show parents the user’s search history or topics of interest to obtain parental consent.
  • Parents and minors must also have the option to withdraw their consent at any time.

These proposed rules apply to companies that display user-generated content and whose users spend at least 20 percent of their time on the platform on addictive feeds.
