Today, we are announcing a change to our account disable policy. Together with Facebook, we develop policies to ensure Instagram is a supportive place for everyone. These changes will help us quickly detect and remove accounts that repeatedly violate our policies.
Under our existing policy, we disable accounts that have a certain percentage of violating content. Under the new policy we are rolling out, in addition to disabling accounts with a certain percentage of violating content, we will also disable accounts with a certain number of violations within a window of time. Similar to how policies are enforced on Facebook, this change will allow us to enforce our policies more consistently and hold people accountable for what they post on Instagram.
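For readers who think in code, the two disable conditions can be pictured as a simple check like the sketch below. The threshold values, the length of the window, and the data shapes are hypothetical placeholders for illustration only; they are not the actual parameters or systems used for enforcement.

```python
from datetime import datetime, timedelta

# Hypothetical values for illustration only, not real enforcement parameters.
VIOLATION_SHARE_THRESHOLD = 0.30       # share of an account's content that violates policy
VIOLATION_COUNT_THRESHOLD = 5          # number of violations within the window
VIOLATION_WINDOW = timedelta(days=90)  # rolling window of time

def should_disable(total_posts: int, violation_times: list[datetime],
                   now: datetime | None = None) -> bool:
    """Return True if an account meets either disable condition:
    a certain percentage of violating content, or a certain number
    of violations within a window of time."""
    now = now or datetime.now()
    # Condition 1: percentage of the account's content that violates policy.
    if total_posts and len(violation_times) / total_posts >= VIOLATION_SHARE_THRESHOLD:
        return True
    # Condition 2: number of violations within the recent window of time.
    recent = [t for t in violation_times if now - t <= VIOLATION_WINDOW]
    return len(recent) >= VIOLATION_COUNT_THRESHOLD
```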
We are also introducing a new notification process to help people understand if their account is at risk of being disabled. This notification will also offer the opportunity to appeal deleted content. To start, appeals will be available for content removed for violations of our nudity and pornography, bullying and harassment, hate speech, drug sales, and counter-terrorism policies, but we’ll be expanding appeals in the coming months. If we find that content was removed in error, we will restore the post and remove the violation from the account’s record. We’ve always given people the option to appeal disabled accounts through our Help Center, and in the next few months, we’ll bring this experience directly within Instagram.
Today’s update is an important step in improving our policies and keeping our platforms safe and supportive places.