
Every day, Snapchatters around the world use our app to talk to their close friends and express themselves creatively. Our goal is to design products and build technology that nurtures and supports real friendships in a healthy, safe and fun environment. We are constantly working to improve the ways we do that – from our policies and Community Guidelines, to our tools for preventing, detecting and enforcing against harmful content, to initiatives that help educate and empower our community. 

We are committed to providing more transparency about the prevalence of content that violates our guidelines, how we enforce our policies, how we respond to law enforcement and government requests for information and where we seek to provide more insight in the future. We publish transparency reports twice a year to provide insight into these efforts, and are also committed to making these reports more comprehensive and helpful to the many stakeholders who care deeply about online safety and transparency.

This report covers the second half of 2020 (1 July – 31 December). As with our previous reports, it shares data about our total violations globally during this period; the number of content reports we received and enforced against across specific categories of violations; how we supported and fulfilled requests from law enforcement and governments; and our enforcements broken down by country.

As part of our ongoing efforts to improve both our safety enforcements and our transparency reports themselves, this report also includes several new elements:

  • The Violative View Rate (VVR) of content, which offers a better understanding of the proportion of all Snaps (or views) that contained content that violated our policies;

  • Total content and account enforcements against false information globally – which was especially relevant during this time period, as the world continued to battle a global pandemic and confront efforts to undermine civic and democratic norms; and

  • Requests to support investigations into potential trademark violations.

We are working on a number of improvements that will enhance our ability to provide more detailed data in future reports. That includes expanding the subcategories of violating content. For example, we currently report violations related to regulated goods as a single category, which includes illegal drugs and weapons. Moving forward, we plan to report each in its own subcategory.

As new online threats and behaviours emerge, we will continue to improve our tools and tactics for fighting them. We constantly evaluate the risks and how we can advance our technological capabilities to better protect our community. We regularly seek guidance from security and safety experts about the ways we can stay a step ahead of bad actors – and are grateful to our growing list of partners who give invaluable feedback and push us to be better.

For more information about our approach to and resources for safety and privacy, please take a look at our About Transparency Reporting tab at the bottom of the page.

Overview of content and account violations

Our Community Guidelines prohibit harmful content, including misinformation; conspiracy theories that can cause harm; deceptive practices; illegal activities, including buying or selling illegal drugs, counterfeit goods, contraband or illegal weapons; hate speech, hate groups and terrorism; harassment and bullying; threats, violence and harm, including the glorification of self-harm; sexually explicit content; and child sexual exploitation.

On average, more than five billion Snaps are created using our Snapchat camera every day. From 1 July – 31 December 2020, we enforced against 5,543,281 pieces of content globally that violated our guidelines.

Enforcement actions could include removing the offending content; terminating or limiting the visibility of the account in question; and referring the content to law enforcement. If an account is terminated for violating our Guidelines, the account holder is not permitted to create a new account or use Snapchat again.

During the reporting period, we saw a Violative View Rate (VVR) of 0.08 percent, which means that out of every 10,000 views of content on Snap, eight contained content that violated our guidelines.
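To make the arithmetic behind that figure explicit (this is simply the published rate restated, not Snap's internal measurement methodology):

$$\mathrm{VVR} = \frac{\text{violating content views}}{\text{total content views}} = 0.08\% = \frac{8}{10{,}000}$$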

We offer in-app reporting tools that allow Snapchatters to quickly and easily report content to our Trust and Safety teams, who investigate the report and take appropriate action. Our teams work to take enforcement actions as quickly as possible and, in the vast majority of cases, take action within two hours of receiving an in-app report.

In addition to in-app reporting, we also offer online reporting options through our support site. Furthermore, our teams are constantly improving our capabilities for proactively detecting violating and illegal content, such as child sexual abuse material, content involving illegal drugs or weapons, and threats of violence. We outline specific details of our work to combat child sexual exploitation and abuse in this report.

As the tables below show, during the second half of 2020 we received the most in-app reports and requests for support about content involving impersonation or sexually explicit content. We also significantly improved our response times to reports of violations, in particular for regulated goods, which includes illegal drugs, counterfeit goods and weapons; sexually explicit content; and harassment and bullying.

| Total content reports* | Total content enforced | Total unique accounts enforced |
| --- | --- | --- |
| 10,131,891 | 5,543,281 | 2,100,124 |

| Reason | Content reports* | Content enforced | % of total content enforced | Unique accounts enforced | Turnaround time (hours)** |
| --- | --- | --- | --- | --- | --- |
| Sexually explicit content | 5,839,778 | 4,306,589 | 77.7% | 1,316,484 | 0.01 |
| Regulated goods | 523,390 | 427,272 | 7.7% | 209,230 | 0.01 |
| Threatening / violence / harm | 882,737 | 337,710 | 6.1% | 232,705 | 0.49 |
| Harassment and bullying | 723,784 | 238,997 | 4.3% | 182,414 | 0.75 |
| Spam | 387,604 | 132,134 | 2.4% | 75,421 | 0.21 |
| Hate speech | 222,263 | 77,587 | 1.4% | 61,912 | 0.66 |
| Impersonation | 1,552,335 | 22,992 | 0.4% | 21,958 | 0.33 |

*Content reports reflect alleged violations reported via our in-app tools and support inquiries.

**Turnaround time reflects the median time, in hours, to take action on a user report.

Expanded violations

Combating the spread of false information

We have always believed that when it comes to harmful content, it isn't enough just to think about policies and enforcement – platforms need to think about their fundamental architecture and product design. From the beginning, Snapchat was built differently from traditional social media platforms, to support our primary use case of talking with close friends – rather than an open newsfeed where anyone has the right to distribute anything to anyone without moderation.

As we explain in our introduction, our guidelines clearly prohibit the spread of false information that could cause harm, including misinformation that aims to undermine civic processes, like voter suppression, unsubstantiated medical claims and conspiracy theories such as the denial of tragic events. Our guidelines apply consistently to all Snapchatters – we don't have special exceptions for politicians or public figures.

Across our app, Snapchat limits virality, which removes incentives for harmful and sensationalised content and reduces opportunities for it to spread. We don't have an open newsfeed, and don't give unvetted content an opportunity to “go viral”. Our content platform, Discover, only features content from vetted media publishers and content creators.

In November 2020, we launched our new entertainment platform, Spotlight, and proactively moderate content to make sure it complies with our guidelines before it can reach a large audience.

We have long taken a different approach to political advertising as well. As with all content on Snapchat, we prohibit false information and deceptive practices in our advertising. All political ads, including election-related ads, issue advocacy ads and issue ads, must include a transparent “paid for” message that discloses the sponsoring organisation. We use human review to fact-check all political ads, and provide information about all ads that pass our review in our Political Ads library.

This approach isn't perfect, but it has helped us to protect Snapchat from the dramatic increase in misinformation in recent years, a trend that has been especially relevant during a period when false information about COVID-19 and the U.S. 2020 presidential election consumed many platforms.

Globally during this period, Snapchat enforced against 5,841 pieces of content and accounts for violations of our false information guidelines. In future reports, we plan to provide more detailed breakdowns of false information violations.

Given the heightened concern about efforts to undermine voting access and the election results in the U.S. in the summer of 2020, we formed an internal Task Force focused on assessing any potential risks and vectors for misuse of our platform, monitoring developments and working to ensure Snapchat was a source for factual news and information. These efforts included:

  • Updating our Community Guidelines to add manipulating media for misleading purposes, such as deepfakes, to our categories of prohibited content;

  • Working with our Discover editorial partners to make sure publishers didn't inadvertently amplify any misinformation through news coverage;

  • Asking Snap Stars, whose content also appears on our Discover content platform, to make sure they complied with our Community Guidelines and didn't unintentionally spread false information;

  • Having clear enforcement outcomes for any violating content – rather than labelling content, we simply removed it, immediately reducing the harm of it being shared more widely; and

  • Proactively analysing entities and other potential sources that could be used to distribute false information on Snapchat, to assess risk and take preventative measures.

Throughout the COVID-19 pandemic, we have taken a similar approach to providing factual news and information, including through coverage provided by our Discover editorial partners, through PSAs and Q&As with public health officials and medical experts and through creative tools, such as Augmented Reality Lenses and filters, reminding Snapchatters of expert public health guidance.

Total content & account enforcements: 5,841

Combating child sexual exploitation and abuse

The exploitation of any member of our community, especially young people and minors, is illegal, unacceptable and prohibited by our guidelines. Preventing, detecting and eliminating abuse on our platform is a top priority for us, and we continuously evolve our capabilities for combating Child Sexual Abuse Material (CSAM) and other types of exploitative content.

Our Trust and Safety teams use proactive detection tools, such as PhotoDNA technology, to identify known images of CSAM and report them to the National Center for Missing & Exploited Children (NCMEC). When we proactively detect or identify instances of CSAM, we preserve them and report them to NCMEC, which then reviews the reports and coordinates with law enforcement.
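To illustrate the general pattern this describes – hash incoming media, compare against a database of known signatures, and escalate matches for preservation and reporting – here is a minimal sketch. PhotoDNA itself is a proprietary perceptual hash that tolerates re-encoding and light alteration; the cryptographic hash and all names below are stand-in assumptions, not Snap's actual systems.

```python
import hashlib
from pathlib import Path

# Illustrative only: SHA-256 matches byte-identical files, whereas PhotoDNA's
# perceptual hash also matches visually similar media. Names are hypothetical.

def load_known_hashes(db_path: str) -> set[str]:
    """Load a database of known-bad media signatures (one hex digest per line)."""
    return {line.strip() for line in Path(db_path).read_text().splitlines() if line.strip()}

def media_signature(file_path: str) -> str:
    """Compute a signature for an uploaded media file."""
    return hashlib.sha256(Path(file_path).read_bytes()).hexdigest()

def check_upload(file_path: str, known_hashes: set[str]) -> bool:
    """Return True if the file matches a known signature and should be
    preserved and escalated for human review and reporting."""
    return media_signature(file_path) in known_hashes

# Matches are escalated rather than silently deleted, so that evidence can be
# preserved and the report forwarded to NCMEC.
known = load_known_hashes("known_signatures.txt")
if check_upload("incoming_upload.jpg", known):
    print("Match found: preserve evidence and escalate for report.")
```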

In the second half of 2020, 2.99 percent of the total accounts we took enforcement action against globally for violations of our Community Guidelines involved CSAM. Of this content, we proactively detected and took action on 73 percent. Overall, we deleted 47,550 accounts for CSAM violations and, in each case, reported that content to NCMEC.

During this time period, we took a number of steps to further combat CSAM. We adopted Google's Child Sexual Abuse Imagery (CSAI) technology for videos, allowing us to identify videos of CSAM and report them to NCMEC. Combined with our PhotoDNA detection for known CSAM imagery and industry hash databases, we can now proactively detect and report to authorities known video and photo imagery. This enhanced capability has allowed us to become much more efficient in our detection – and thus our reporting – of this criminal conduct.

In addition, we continued to expand our partnerships with industry experts and rolled out additional in-app features to help educate Snapchatters about the risks of contact with strangers and how to use in-app reporting to alert our Trust and Safety teams to any type of abuse. We continued to add partners to our trusted flagger programme, which provides vetted safety experts with a confidential channel to report emergency escalations, such as an imminent threat to life or a case involving CSAM. We also work closely with these partners to provide safety education, wellness resources and other reporting support, so they can effectively support the Snapchat community.

Additionally, we serve on the Board of Directors of the Technology Coalition, a group of tech industry leaders who seek to prevent and eradicate online child sexual exploitation and abuse, and we are constantly working with other platforms and safety experts to explore additional solutions to strengthen our collective efforts in this space.

Total account deletions: 47,550

Terrorist & extremist content

At Snap, monitoring developments in this space and mitigating any potential vectors for abuse on our platform were part of our U.S. election integrity Task Force work. Both our product architecture and the design of our Group Chat functionality limit the spread of harmful content and opportunities to organise. We offer Group Chats, but they are limited in size to several dozen members, are not recommended by algorithms, and are not discoverable on our platform to anyone who is not a member of that Group.
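A minimal sketch of those two design constraints, under stated assumptions (the cap value, class and method names are hypothetical; Snap has not published its implementation):

```python
from dataclasses import dataclass, field

MAX_GROUP_SIZE = 64  # assumption, standing in for "several dozen members"

@dataclass
class GroupChat:
    members: set[str] = field(default_factory=set)

    def add_member(self, user: str) -> bool:
        """Enforce a hard size cap; groups grow only by invitation,
        never by algorithmic recommendation."""
        if len(self.members) >= MAX_GROUP_SIZE:
            return False
        self.members.add(user)
        return True

    def is_visible_to(self, user: str) -> bool:
        """Groups are not discoverable: only existing members can see them."""
        return user in self.members
```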

During the second half of 2020, we removed eight accounts for violations of our prohibition of terrorism, hate speech and extremist content.

Total account deletions: 8

Country overview

This section gives an overview of the enforcement of our rules in a sampling of individual countries. Our Community Guidelines apply to all content on Snapchat – and all Snapchatters – across the globe, regardless of location. 

Information for all other countries is available for download via the attached CSV file.

| Region | Content reports* | Content enforced | Unique accounts enforced |
| --- | --- | --- | --- |
| North America | 4,230,320 | 2,538,416 | 928,980 |
| Europe | 2,634,878 | 1,417,649 | 535,649 |
| Rest of world | 3,266,693 | 1,587,216 | 431,407 |
| Total | 10,131,891 | 5,543,281 | 1,896,015 |