Instagram's new 'nighttime nudges' aim to reduce teens' time on the app
Instagram is introducing new nighttime nudges for teen accounts to limit their time on the app, the company announced on Thursday. The nudges will appear when teens have spent more than 10 minutes on Instagram late at night in places like Reels or DMs. The notice will remind teens that it's late and encourage them to close the app and go to sleep.
Teens will start to see a notice that says "Time for a break?" followed by the message "It's getting late. Consider closing Instagram for the night." The social network told TechCrunch in an email that the nudges will appear after 10 p.m. The nudges are shown automatically and can't be turned off, meaning teens can't opt out of seeing them. Of course, teens can simply dismiss the nudge and continue using the app.
TikTok rolled out a similar feature last March that reminds users when it's time to put the app down and go to sleep.
The new nighttime nudges join Instagram's other features aimed at reducing teens' time on its app. The app already has a "Take a Break" feature that shows teens full-screen reminders to take regular breaks from Instagram, and a "Quiet Mode" feature that lets teens mute notifications and notify others that they are unavailable for a block of time.
Last week, Meta announced that it would automatically limit the types of content that teens can see on Instagram and Facebook. Teen accounts will be restricted from seeing harmful content, such as posts about self-harm, graphic violence and eating disorders.
The new teen safety features come as Meta faces regulatory pressure to do more to protect children. The company is scheduled to testify before the Senate about child safety on January 31, alongside executives from X (formerly Twitter), TikTok, Snap and Discord. Committee members are expected to press the executives on their platforms' failure to protect children online.
In addition, more than 40 states are suing Meta, alleging that the company’s apps are contributing to young users’ mental health problems. Meta has also received another formal request for information (RFI) from European Union regulators who are seeking more information about the company’s response to child safety concerns on Instagram.
Today's announcement comes a day after TechCrunch reported on internal Meta documents about child safety, which revealed that the company not only intentionally marketed its apps to children, but was also aware of the significant amount of inappropriate and sexually explicit content being shared between adults and minors.