Instagram on Thursday announced a new feature called “quiet mode,” which aims to help users focus and set boundaries with friends and followers.
When the option is enabled, all notifications will be paused and the profile's activity status will change to "In quiet mode." If someone sends a direct message during this time, Instagram will send an automatic reply notifying the sender that quiet mode is activated.
While the feature is available to all users, Instagram appears to be focusing on teens. The company is pitching it as a tool to help with studying, and will prompt teens to turn on the feature "when they spend a specific amount of time on Instagram late at night."
The tool will roll out to users in the United States, United Kingdom, Ireland, Canada, Australia, and New Zealand, and Instagram plans to bring it to more countries in the future.
The tool is the latest example of Instagram offering users more ways to manage their usage, after years of scrutiny over how much time people — and especially teens — spend on social media applications, and the harm that time can pose to their mental health.
“These updates are part of our ongoing work to ensure people have experiences that work for them, and that they have more control over the time they spend online and the types of content they see,” the company said in a blog post.
As part of that effort, the platform is also introducing features to give users more control over what shows up in their Explore feed. For example, it’s now possible to mark content with a “Not Interested” label to prevent similar content from showing up in the future. Instagram is also introducing an option to block words or lists of words, emojis or hashtags, such as #fitness or #recipes, from being recommended in the Explore feed.
Instagram is updating its parental supervision tools, too. When a teen updates a setting, parents can receive a notification so they can talk to their teen about the change. Parents will also be able to view accounts their teen has blocked.
In a series of congressional hearings in 2021, executives from Instagram, Facebook, TikTok, and Snapchat faced tough questions from lawmakers over how their platforms can lead younger users to harmful content, damage mental health and body image (particularly among teenage girls), and lack sufficient parental controls and safeguards to protect teens.
The social media companies vowed to make changes, and Instagram in particular has made many. It has since introduced an educational hub for parents with resources, tips and articles from experts on user safety, and rolled out a tool that allows guardians to see how much time their kids spend on Instagram and set time limits.
Another Instagram feature encouraged users to take a break from the app, such as suggesting they take a deep breath, write something down, check a to-do list or listen to a song, after a predetermined amount of time. The company has also said it’s taking a “stricter approach” to the content it recommends to teens and actively nudges them toward different topics, such as architecture and travel destinations, if they’ve been dwelling on any type of content for too long.