At Snap, we contribute to human progress by empowering people to express themselves, live in the moment, learn about the world, and have fun together. We care deeply about the wellbeing of our community and, when building products, we consider the privacy and safety of Snapchatters at the front end of the design process.

We have clear and thorough guidelines that support our mission by fostering the broadest range of self-expression while encouraging Snapchatters to use our services safely every day. Our Community Guidelines prohibit the spread of false information that can cause harm, hate speech, bullying, harassment, illegal activities, sexually explicit content, graphic violence, and much more. 

Our Transparency Report offers important insight into the violating content we enforce against, governmental requests for Snapchatters’ account information, and other legal notifications.

For more information about our approach to and resources for safety and privacy, please take a look at our About Transparency Reporting tab at the bottom of the page.

Account / Content Violations

More than four billion Snaps are created each day using our camera. From January 1 to June 30, 2020, we enforced against 3,872,218 pieces of content globally for violations of our Community Guidelines, which amounts to less than 0.012% of all Story postings. Our teams generally take action on such violations quickly, whether that means removing Snaps, deleting accounts, reporting information to the National Center for Missing & Exploited Children (NCMEC), or escalating to law enforcement. In the vast majority of cases, we enforce against content within two hours of receiving an in-app report.
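As a rough consistency check (our arithmetic, not a figure stated by Snap), the "less than 0.012%" enforcement rate implies a lower bound on total Story postings over the six-month period:

```python
# Figures taken from the report (H1 2020, January 1 - June 30).
content_enforced = 3_872_218
rate_ceiling = 0.012 / 100  # "less than 0.012% of all Story postings"

# If enforced content is under 0.012% of postings, postings must exceed this.
min_story_postings = content_enforced / rate_ceiling
print(f"{min_story_postings / 1e9:.1f} billion")  # roughly 32.3 billion
```

That lower bound of roughly 32 billion Story postings over 181 days is consistent with the report's separate claim of more than four billion Snaps created per day.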

| Total Content Reports* | Total Content Enforced | Total Unique Accounts Enforced |
| --- | --- | --- |
| 13,204,971 | 3,872,218 | 1,578,985 |

[Chart: H1'20 Content Enforced]

*Content & Account Reports reflect reports via Snap's in-app reporting mechanism.

**Turnaround Time reflects the median time in minutes to action on a user report.

Expanded Violations

Combating The Spread of False Information

We have always believed that when it comes to harmful content, it isn’t enough to think about policies and enforcement — platforms need to consider their fundamental architecture and product design. From the beginning, Snapchat was built differently than traditional social media platforms, without an open newsfeed where anyone can broadcast to a large audience without moderation.

Our Guidelines clearly prohibit the spread of false information that could cause harm, including misinformation that aims to undermine civic processes, like voter suppression; unsubstantiated medical claims; and the denial of tragic events. Our Guidelines and enforcement apply consistently to all Snapchatters — we don’t make special exceptions for politicians or other public figures. 

Globally during this period, Snapchat enforced against a combined total of 2,597 accounts and pieces of content for violations of our false information guidelines.

Chart Key

| # | Reason | Content Reports* | Content Enforced | Unique Accounts Enforced | Turnaround Time (hours)** |
| --- | --- | --- | --- | --- | --- |
| 1 | Harassment and Bullying | 857,493 | 175,815 | 145,445 | 0.4 |
| 2 | Hate Speech | 229,375 | 31,041 | 26,857 | 0.6 |
| 3 | Impersonation | 1,459,467 | 22,435 | 21,510 | 0.1 |
| 4 | Regulated Goods | 520,426 | 234,527 | 137,721 | 0.3 |
| 5 | Sexually Explicit Content | 8,522,585 | 3,119,948 | 1,160,881 | 0.2 |
| 6 | Spam | 552,733 | 104,523 | 59,131 | 0.2 |
| 7 | Threatening / Violence / Harm | 1,062,892 | 183,929 | 141,314 | 0.5 |

*Content Reports reflect alleged violations reported via our in-app reporting mechanism and support inquiries.

**Turnaround Time reflects the median time in hours to action on a user report.

Expanded Violations

Child Sexual Exploitation and Abuse

The exploitation of any member of our community, particularly young people, is unacceptable and is prohibited on Snapchat. Preventing, detecting, and eliminating abuse on our platform is a priority for us, and we are constantly improving our capabilities for combating this type of illegal activity.

Reports of child sexual abuse materials (CSAM) are quickly reviewed by our Trust and Safety team, and evidence of this activity results in account termination and reporting to the National Center for Missing & Exploited Children (NCMEC). We provide around-the-clock support to international law enforcement agencies who contact us for assistance with cases involving missing or endangered juveniles. 

We use PhotoDNA technology to proactively identify known child sexual exploitation and abuse imagery as it is uploaded, and we report any instances to the authorities. Of the total accounts enforced against for Community Guidelines violations, 2.99% were removed for CSAM. Moreover, Snap proactively deleted 70% of these accounts.

Total Account Deletions: 47,136
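As a quick consistency check (again, our arithmetic rather than a calculation Snap publishes), the 2.99% CSAM figure lines up with the totals reported elsewhere in this report:

```python
# Figures taken from the report (H1 2020).
total_accounts_enforced = 1_578_985  # total unique accounts enforced
csam_account_deletions = 47_136      # accounts deleted for CSAM

share = csam_account_deletions / total_accounts_enforced * 100
print(f"{share:.2f}%")  # 2.99%
```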

Terrorism 

Terrorist organizations and hate groups are prohibited on Snapchat, and we have zero tolerance for content that advocates or advances violent extremism or terrorism.

Total Account Deletions: <10

Country Overview

This section gives an overview of the enforcement of our rules in a sampling of individual countries. Our Community Guidelines apply to all content on Snapchat—and all Snapchatters—across the globe, regardless of location. 

Information for all other countries is available for download via the attached CSV file.

| Region | Content Reports* | Content Enforced | Unique Accounts Enforced |
| --- | --- | --- | --- |
| North America | 5,769,636 | 1,804,770 | 785,315 |
| Europe | 3,419,235 | 960,761 | 386,728 |
| Rest of World | 4,016,100 | 1,106,687 | 413,272 |
| Total | 13,204,971 | 3,872,218 | 1,578,985 |