1 July 2021 – 31 December 2021
Released: 1 April 2022
Updated: 1 April 2022
As Snapchat grows, our aim is to continue to empower people to express themselves, live in the moment, learn about the world and have fun together - all in a safe and secure environment. To do so, we continually improve our safety and privacy practices - including our Terms of Service and Community Guidelines; tools for preventing, detecting and enforcing against harmful content; and initiatives that help educate and empower our community.
To provide insight into these efforts and visibility into the nature and volume of content reported on our platform, we publish transparency reports twice a year. We are committed to continuing to make these reports more comprehensive and informative to the many stakeholders who care deeply about online safety and transparency.
This report covers the second half of 2021 (1 July - 31 December). As with our previous reports, it shares data about the global number of in-app content and account-level reports we received and enforced against across specific categories of violations; how we responded to requests from law enforcement and governments; and our enforcement actions broken down by country. It also carries forward recent additions to this report, including the Violative View Rate of Snapchat content, potential trademark violations and incidents of false information on the platform.
As part of our ongoing focus on improving our transparency reports, we are introducing several new elements to this report. For this instalment and going forward, we are breaking down drugs, weapons and regulated goods into their own categories, which will provide additional detail about their prevalence and our enforcement efforts.
For the first time, we have also created a new suicide and self-harm reporting category to provide insight into the total content and account reports we receive and act on. As part of our commitment to supporting the well-being of our community, our Trust and Safety teams share in-app resources with Snapchatters in need, and we're also sharing more details about that work here.
For more information about our policies for combating online harms, and plans to continue evolving our reporting practices, please read our recent Safety & Impact blog about this transparency report.
To find additional resources for safety and privacy on Snapchat, see our About Transparency Reporting tab at the bottom of the page.
Overview of Content and Account Violations
From 1 July - 31 December 2021, we enforced against 6,257,122 pieces of content globally that violated our policies. Enforcement actions include removing the offending content or terminating the account in question.
During the reporting period, we saw a Violative View Rate (VVR) of 0.08 per cent, which means that out of every 10,000 Snap and Story views on Snapchat, 8 contained content that violated our guidelines.
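To make the arithmetic behind that figure explicit, the VVR can be written as a simple ratio (the notation below is ours, not a formula Snap publishes):

```latex
\[
\mathrm{VVR} = \frac{\text{views of violating Snaps and Stories}}{\text{total Snap and Story views}}
= 0.08\% = \frac{8}{10\,000}
\]
```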
| Reason | Content and account reports | Content enforced | % of total content enforced | Unique accounts enforced | Median turnaround time (minutes) |
| --- | --- | --- | --- | --- | --- |
| Sexually explicit content | 7,605,480 | 4,869,272 | 77.8% | 1,716,547 | <1 |
| Harassment and bullying | 988,442 | 346,624 | 5.5% | 274,395 | 12 |
| Threats and violence | 678,192 | 232,565 | 3.7% | 159,214 | 12 |
| Other regulated goods | 56,505 | 38,860 | 0.6% | 26,736 | 6 |
| Self-harm and suicide | 164,571 | 33,063 | 0.5% | 29,222 | 12 |
Combating Child Sexual Abuse Material
The sexual exploitation of any member of our community, especially minors, is illegal, unacceptable and prohibited by our Community Guidelines. Preventing, detecting and eradicating Child Sexual Abuse Material (CSAM) on our platform is a top priority for us, and we continuously evolve our capabilities to address CSAM and other types of child sexually exploitative content.
Our Trust and Safety teams use active technology detection tools, such as PhotoDNA robust hash matching and Google's Child Sexual Abuse Imagery (CSAI) Match, to identify known illegal images and videos of CSAM and report them to the US National Center for Missing & Exploited Children (NCMEC), as required by law. NCMEC then, in turn, coordinates with domestic or international law enforcement, as required.
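Snap does not publish implementation details for these tools, but the general pattern of hash-list matching is straightforward: compute a digest of each uploaded image and check it against a vetted set of digests of known illegal material. The sketch below is a minimal illustration only; the set and function names are hypothetical, and production systems such as PhotoDNA use perceptual hashes that survive resizing and re-encoding rather than the exact cryptographic hash used here.

```python
import hashlib

# Hypothetical known-bad set: in production this would be a vetted list of
# digests of known illegal imagery supplied by bodies such as NCMEC.
KNOWN_BAD_HASHES: set[str] = set()

def matches_known_hash(image_bytes: bytes) -> bool:
    """Return True if this image's digest appears in the known-bad set.

    SHA-256 only illustrates the exact-match lookup; real systems
    (e.g. PhotoDNA, CSAI Match) use perceptual hashes instead.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES
```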
In the second half of 2021, we proactively detected and actioned 88 per cent of the total CSAM violations reported here. Advancements in our proactive detection capability resulted in a notable increase in this category for this reporting period.
| Total account deletions |
| --- |
| 198,109 |
Combating the spread of false information
We have always believed that when it comes to harmful content, it isn't enough to think about policies and enforcement - platforms need to consider their fundamental architecture and product design. From the beginning, Snapchat was built differently than traditional social media platforms, without an open newsfeed where anyone can broadcast to a large audience without moderation.
Our Guidelines clearly prohibit the spread of false information that could cause harm, including false information that aims to undermine civic processes; unsubstantiated medical claims; and the denial of tragic events. Our Guidelines and enforcement apply consistently to all Snapchatters - we don't make special exceptions for politicians or other public figures.
Globally during this period, Snapchat enforced against a combined total of 14,613 accounts and pieces of content for violations of our false information guidelines.
| Total content and account enforcements |
| --- |
| 14,613 |
Terrorist & violent extremist content
During the reporting period, we removed 22 accounts for violations of our prohibition of terrorist and violent extremist content.
At Snap, we remove terrorist and violent extremist content reported through multiple channels. Snapchatters can report terrorist and violent extremist content through our in-app reporting menu, and we work closely with law enforcement to address such content that may appear on Snap.
| Total account deletions |
| --- |
| 22 |
Self-harm & suicide content
We care deeply about the mental health and well-being of Snapchatters, which has informed many of our decisions about how to build Snapchat differently. As a platform designed to help real friends communicate, we believe Snapchat can play a unique role in empowering friends to help each other through these difficult moments.
When our Trust and Safety teams recognise a Snapchatter in distress, they can forward self-harm prevention and support resources, and notify emergency response personnel where appropriate. The resources we share come from our global list of safety resources, which is publicly available to all Snapchatters.
| Total times suicide resources shared |
| --- |
| 21,622 |
Country Overview
This section provides an overview of the enforcement of our Community Guidelines in a sampling of geographic regions. Our Guidelines apply to all content on Snapchat — and all Snapchatters — across the globe, regardless of location.
Information for individual countries is available for download via the attached CSV file.
| Region | Content reports* | Content enforced | Unique accounts enforced |
| --- | --- | --- | --- |
| Rest of world | 4,539,292 | 1,963,590 | 668,555 |
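For anyone working with the downloadable per-country data, a few lines of Python are enough to load and scan it. The filename and column names below are hypothetical, modelled on the table above; check them against the actual CSV.

```python
import csv

# Hypothetical filename and column labels; the CSV attached to the
# report may differ.
with open("snap-transparency-h2-2021-countries.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        print(row["Region"], row["Content enforced"], row["Unique accounts enforced"])
```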
Transparency report archives
- 1 July 2021 - 31 December 2021
- 1 January 2021 - 30 June 2021
- 1 July 2020 - 31 December 2020
- 1 January 2020 - 30 June 2020
- 1 July 2019 - 31 December 2019
- 1 January 2019 - 30 June 2019
- 1 July 2018 - 31 December 2018
- 1 January 2018 - 30 June 2018
- 1 July 2017 - 31 December 2017
- 1 January 2017 - 30 June 2017
- 1 July 2016 - 31 December 2016
- 1 January 2016 - 30 June 2016
- 1 July 2015 - 31 December 2015
- 1 January 2015 - 30 June 2015
- 1 November 2014 - 28 February 2015