Instagram is implementing new measures to help teens limit their time on the app, the company announced on Thursday.
The latest addition to these efforts is a feature called “nighttime nudges.” These will appear when teens have been using Instagram for more than 10 minutes late at night, specifically in places like Reels or DMs.
The nudge will pop up with a message saying, “Time for a break?” followed by a reminder that it’s getting late and suggesting that the user close the app and go to sleep.
In an email to TechCrunch, a spokesperson for Instagram said that these nighttime nudges will start appearing after 10 PM and cannot be turned off. Teens cannot opt out of seeing them, although they can dismiss a nudge and continue using the app.
“We want to create a safe space for teens on Instagram, and part of that includes promoting healthy habits and reducing excessive screen time.” – Instagram
This idea is not new to social media, as TikTok also rolled out a similar feature last March to remind users when it’s time to put their phone down and get some rest.
These new nighttime nudges are just one of Instagram’s efforts to limit teens’ time on the app. The platform already has a “Take a Break” feature, which shows full-screen reminders to teens to take regular breaks from Instagram. Additionally, there is a “Quiet Mode” option that allows teens to mute notifications and let others know they are unavailable for a specified period of time.
Meta, the parent company of Instagram, recently announced that it will automatically restrict the types of content teen users can see on the app. This includes harmful content such as posts about self-harm, graphic violence, and eating disorders.
This move comes amid increasing regulatory pressure on Meta to do more to protect children online. The company is scheduled to testify before the Senate on child safety on January 31st, alongside representatives from other social media platforms, including X (formerly Twitter), TikTok, Snap, and Discord. These companies are expected to face tough questioning about their failure to adequately safeguard minors on their platforms.
In addition, more than 40 states have filed lawsuits against Meta, alleging that its apps contribute to mental health issues among young users. The company has also received a formal request for information from European Union regulators, who are seeking more details about Meta's response to child safety concerns on Instagram.
Just yesterday, TechCrunch reported on internal Meta documents revealing that the company knowingly marketed its apps to children and was aware of the large amount of inappropriate and sexually explicit content being shared between adults and minors on its platforms.