At Snap, we contribute to human progress by empowering people to express themselves, live in the moment, learn about the world and have fun together. We care deeply about the well-being of our community and, when building products, we consider the privacy and safety of Snapchatters at the front end of the design process.

We have clear and thorough guidelines that support our mission by fostering the broadest range of self-expression while encouraging Snapchatters to use our services safely every day. Our Community Guidelines prohibit the spread of false information that can cause harm, hate speech, bullying, harassment, illegal activities, sexually explicit content, graphic violence and much more. 

Our Transparency Report offers important insight into the violating content we enforce against, governmental requests for Snapchatters’ account information and other legal notifications.

For more information about our approach to and resources for safety and privacy, please take a look at our About Transparency Reporting tab at the bottom of the page.

Account / content violations

More than four billion Snaps are created each day using our camera. From 1 January 2020 – 30 June 2020, we enforced against 3,872,218 pieces of content, globally, for violations of our Community Guidelines – which amounts to less than 0.012% of all Story postings. Our teams generally take action on such violations quickly, whether it is to remove Snaps, delete accounts, report information to the National Centre for Missing & Exploited Children (NCMEC) in the United States or escalate to law enforcement. In the vast majority of cases, we enforce against content within 2 hours of receiving an in-app report.

| Total content reports* | Total content enforced | Total unique accounts enforced |
| --- | --- | --- |
| 13,204,971 | 3,872,218 | 1,692,859 |

[Chart: H1'20 content enforced]

*The content reports reflect alleged violations via our in-app and support inquiries.


Expanded violations

Combating the spread of false information

We have always believed that when it comes to harmful content, it isn't enough just to think about policies and enforcement – platforms need to think about their fundamental architecture and product design. From the beginning, Snapchat was built differently from traditional social media platforms, to support our primary use case of talking with close friends – rather than an open newsfeed where anyone has the right to distribute anything to anyone without moderation.

As we explain in our introduction, our guidelines clearly prohibit the spread of false information that could cause harm, including misinformation that aims to undermine civic processes, like voter suppression, unsubstantiated medical claims and conspiracy theories such as the denial of tragic events. Our guidelines apply consistently to all Snapchatters – we don't have special exceptions for politicians or public figures.

Across our app, Snapchat limits virality, which removes incentives for harmful and sensationalised content and reduces the risk of such content spreading. We don't have an open newsfeed, and don't give unvetted content an opportunity to “go viral”. Our content platform, Discover, only features content from vetted media publishers and content creators.

In November 2020, we launched our new entertainment platform, Spotlight, and proactively moderate content to make sure it complies with our guidelines before it can reach a large audience.

We have long taken a different approach to political advertising as well. As with all content on Snapchat, we prohibit false information and deceptive practices in our advertising. All political ads, including election-related ads, issue advocacy ads and issue ads, must include a transparent “paid for” message that discloses the sponsoring organisation. We use human review to fact-check all political ads, and provide information about all ads that pass our review in our Political Ads library.

This approach isn't perfect, but it has helped us to protect Snapchat from the dramatic increase in misinformation in recent years, a trend that has been especially relevant during a period when false information about COVID-19 and the U.S. 2020 presidential election consumed many platforms.

Globally during this period, Snapchat enforced against 5,841 pieces of content and accounts for violations of our false information guidelines. In future reports, we plan to provide more detailed breakdowns of false information violations.

Given the heightened concern about efforts to undermine voting access and the election results in the U.S. in the summer of 2020, we formed an internal Task Force that focused on assessing any potential risk or vectors for misuse of our platform, monitored all developments and worked to ensure Snapchat was a source for factual news and information. These efforts included:

  • Updating our Community Guidelines to add manipulated media intended to mislead, such as deepfakes, to our categories of prohibited content;

  • Working with our Discover editorial partners to make sure publishers didn't inadvertently amplify any misinformation through news coverage;

  • Asking Snap Stars, whose content also appears on our Discover content platform, to make sure they complied with our Community Guidelines and didn't unintentionally spread false information;

  • Having clear enforcement outcomes for any violating content – rather than labelling content, we simply removed it, immediately reducing the harm of it being shared more widely; and

  • Proactively analysing entities and other sources of false information that could be used to distribute such information on Snapchat to assess risk and take preventative measures.

Throughout the COVID-19 pandemic, we have taken a similar approach to providing factual news and information, including through coverage provided by our Discover editorial partners, through PSAs and Q&As with public health officials and medical experts, and through creative tools, such as Augmented Reality Lenses and filters, reminding Snapchatters of expert public health guidance.

Chart key

| # | Reason | Content reports* | Content enforced | Unique accounts enforced | Turnaround time** |
| --- | --- | --- | --- | --- | --- |
| 1 | Harassment and bullying | 857,493 | 175,815 | 145,445 | 0.4 |
| 2 | Hate speech | 229,375 | 31,041 | 26,857 | 0.6 |
| 3 | Impersonation | 1,459,467 | 22,435 | 21,510 | 0.1 |
| 4 | Regulated goods | 520,426 | 234,527 | 137,721 | 0.3 |
| 5 | Sexually explicit content | 8,522,585 | 3,119,948 | 1,160,881 | 0.2 |
| 6 | Spam | 552,733 | 104,523 | 59,131 | 0.2 |
| 7 | Threatening / violence / harm | 1,062,892 | 183,929 | 141,314 | 0.5 |

*The content reports reflect alleged violations via our in-app and support inquiries.

**Turnaround time reflects the median time in hours to action on a user report.

Expanded violations

Child sexual exploitation and abuse

The exploitation of any member of our community – particularly young people – is totally unacceptable and prohibited on Snapchat. Preventing, detecting and eliminating abuse on our platform is a priority for us, and we are constantly improving our capabilities for combatting this type of illegal activity.

Reports of child sexual abuse materials (CSAM) are quickly reviewed by our trust and safety team, and evidence of this activity results in account termination and reporting to the National Centre for Missing & Exploited Children (NCMEC). We provide around-the-clock support to international law enforcement agencies that contact us for assistance with cases involving missing or endangered juveniles.

We use PhotoDNA technology to proactively identify known imagery of child sexual exploitation and abuse at upload, and we report any instances to the authorities. CSAM removals accounted for 2.99% of the total accounts enforced against for Community Guidelines violations, and Snap proactively detected and deleted 70% of those accounts.

Total account deletions: 47,136

Terrorism 

Terrorist organisations and hate groups are prohibited from Snapchat, and we have no tolerance for content that advocates or advances violent extremism or terrorism. 

Total account deletions: <10

Country overview

This section gives an overview of the enforcement of our rules in a sampling of individual countries. Our Community Guidelines apply to all content on Snapchat – and all Snapchatters – across the globe, regardless of location. 

Information for all other countries is available for download via the attached CSV file.

| Region | Content reports* | Content enforced | Unique accounts enforced |
| --- | --- | --- | --- |
| North America | 5,769,636 | 1,804,770 | 785,315 |
| Europe | 3,419,235 | 960,761 | 386,728 |
| Rest of world | 4,016,100 | 1,106,687 | 413,272 |
| Total | 13,204,971 | 3,872,218 | 1,578,985 |