As Snapchat continues to grow, we aim to foster a platform that empowers people to express themselves, live in the moment, learn about the world, and have fun together, all in a safe and secure environment. To do so, we continually improve our safety and privacy practices — from our Terms of Service and Community Guidelines, to tools for preventing, detecting, and enforcing against harmful content, to initiatives that help educate and empower our community.
To provide insight into these efforts and visibility into the nature and volume of content reported on our platform, we publish transparency reports twice a year. We are committed to continuing to make these reports more comprehensive and informative to the many stakeholders who care deeply about online safety and transparency.
This report covers the first half of 2021 (January 1 - June 30). As with our previous reports, it shares data about the total number of policy violations globally during this period; the number of content and account-level reports we received and enforced against across specific categories of violations; how we responded to requests from law enforcement and governments; and our enforcement actions broken down by country. It also covers recent additions to this report, including the Violative View Rate of Snapchat content, potential trademark violations, and instances of false information on the platform.
As part of our ongoing efforts to improve both our safety enforcement and our transparency reports themselves, for the first time this report shares our median turnaround time to action user reports in minutes instead of hours, to provide more granular detail on our operational practices and efficacy.
We have continued to expand both our partnerships with industry experts and our in-app features that help educate Snapchatters about online risks and show them how to use in-app reporting to alert our Trust and Safety teams to any concern or policy violation.
In addition, we continued to add partners to our trusted flagger program, which provides vetted safety experts with a confidential channel to report emergency escalations. We work closely with these partners to provide safety education, wellness resources, and other reporting guidance so they can help support the Snapchat community.
For more information about our policies for combating online harms, and plans to continue evolving our reporting practices, please read our recent Safety & Impact blog about this transparency report.
To find additional resources for safety and privacy on Snapchat, see our About Transparency Reporting tab at the bottom of the page.
Every day, on average, more than five billion Snaps are created using our Snapchat camera. From January 1 to June 30, 2021, we enforced against 6,629,165 pieces of content globally that violated our policies.
Enforcement actions could include removing the offending content or terminating the account in question. If an account is terminated for violating our Guidelines, the account holder is not permitted to create a new account or use Snapchat again.
During the reporting period, we saw a Violative View Rate (VVR) of 0.10 percent, which means that out of every 10,000 Snap views on Snapchat, 10 contained content that violated our Guidelines.
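For readers who want the arithmetic behind this figure, the short sketch below is an illustration only; it assumes the simple definition that the VVR is the number of views of violating Snaps divided by the total number of Snap views, expressed as a percentage. The function name and sample counts are hypothetical and are not drawn from our measurement methodology.

```python
# Illustrative only: a minimal sketch of the Violative View Rate (VVR) arithmetic,
# assuming VVR = (views of violating Snaps) / (total Snap views), as a percentage.
# The function name and sample counts below are hypothetical.

def violative_view_rate(violating_views: int, total_views: int) -> float:
    """Return the percentage of Snap views that landed on violating content."""
    return 100.0 * violating_views / total_views

# With made-up counts in the same proportion as the figure above:
# 10 violating views out of 10,000 total views gives 0.10 percent.
print(violative_view_rate(10, 10_000))  # 0.1
```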
Total Content & Account Reports*: 12,101,326
Total Content Enforced: 6,629,165
Total Unique Accounts Enforced: 2,510,798
Reason | Content & Account Reports* | Content Enforced | % of Total Content Enforced | Unique Accounts Enforced | Turnaround Time (minutes)**
Sexually Explicit Content | 6,638,110 | 4,783,518 | 73.2% | 1,441,208 | <1
Regulated Goods | 776,806 | 620,083 | 9.5% | 274,883 | <1
Threatening / Violence / Harm | 1,077,311 | 465,422 | 7.1% | 288,091 | 13
Harassment and Bullying | 911,198 | 319,311 | 4.9% | 249,421 | 27
Spam | 560,509 | 243,729 | 3.7% | 120,898 | 5
Hate Speech | 241,332 | 121,639 | 1.9% | 92,314 | 15
Impersonation | 1,896,060 | 75,463 | 1.2% | 43,983 | 13
*Content & Account Reports reflect reports via Snap's in-app reporting mechanism.
**Turnaround Time reflects the median time in minutes to action on a user report.
We have always believed that when it comes to harmful content, it isn’t enough to think about policies and enforcement — platforms need to consider their fundamental architecture and product design. From the beginning, Snapchat was built differently than traditional social media platforms, without an open newsfeed where anyone can broadcast to a large audience without moderation.
Our Guidelines clearly prohibit the spread of false information that could cause harm, including misinformation that aims to undermine civic processes, like voter suppression; unsubstantiated medical claims; and the denial of tragic events. Our Guidelines and enforcement apply consistently to all Snapchatters — we don’t make special exceptions for politicians or other public figures.
Globally during this period, Snapchat enforced against a combined total of 2,597 accounts and pieces of content for violations of our false information guidelines.
Total Content & Account Enforcements: 2,597
The sexual exploitation of any member of our community, especially minors, is illegal, unacceptable, and prohibited by our Community Guidelines. Preventing, detecting, and eradicating Child Sexual Abuse Material (CSAM) on our platform is a top priority for us, and we continuously evolve our capabilities for combating CSAM and other types of child sexually exploitative content.
Our Trust and Safety teams use proactive detection technologies, such as PhotoDNA robust hash-matching and Google's Child Sexual Abuse Imagery (CSAI) Match, to identify known illegal images and videos of CSAM and report them to the U.S. National Center for Missing and Exploited Children (NCMEC), as required by law. NCMEC, in turn, coordinates with domestic or international law enforcement, as required.
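PhotoDNA and CSAI Match are proprietary systems, so the sketch below is only a generic, hypothetical illustration of the hash-matching pattern described above: a fingerprint of incoming media is compared against a database of fingerprints of known illegal content, and any match is escalated for review and reporting. The hash set, the fingerprint helper, and the use of a cryptographic hash as a stand-in for a robust perceptual hash are all assumptions made for this example.

```python
import hashlib

# Hypothetical illustration of hash-matching against a database of known content.
# Real systems such as PhotoDNA use robust (perceptual) hashes that tolerate resizing
# and re-encoding; the cryptographic hash below is only a simple stand-in.

KNOWN_VIOLATING_HASHES = set()  # hypothetical fingerprints sourced from a trusted database


def media_fingerprint(media_bytes: bytes) -> str:
    # Stand-in fingerprint; a production system would compute a perceptual hash instead.
    return hashlib.sha256(media_bytes).hexdigest()


def matches_known_content(media_bytes: bytes) -> bool:
    """Return True if the media's fingerprint matches a known-violating hash."""
    return media_fingerprint(media_bytes) in KNOWN_VIOLATING_HASHES

# In practice, a match would be escalated for human review and mandatory
# reporting to NCMEC rather than handled silently.
```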
In the first half of 2021, 5.43 percent of the total number of accounts enforced globally contained CSAM. Of these CSAM violations, we proactively detected and actioned 70 percent. This improved proactive detection capability, combined with an increase in spam attacks involving this type of content, resulted in a notable rise in enforcement in this category for this reporting period.
Total Account Deletions: 119,134
During the reporting period, we removed five accounts for violations of our prohibition of terrorist and violent extremist content, a slight decrease from the last reporting cycle.
Total Account Deletions: 5
This section provides an overview of the enforcement of our Community Guidelines in a sampling of geographic regions. Our Guidelines apply to all content on Snapchat—and all Snapchatters—across the globe, regardless of location.
Information for individual countries is available for download via the attached CSV file.
Region | Content Reports* | Content Enforced | Unique Accounts Enforced
North America | 5,095,831 | 3,133,534 | 1,119,681
Europe | 3,081,426 | 1,615,381 | 573,009
Rest of World | 3,924,069 | 1,880,250 | 554,727
Total | 12,101,326 | 6,629,165 | 2,247,417