
Snap Partners with It’s On Us for Sexual Assault Awareness Month


In February, Snapchat partnered with It’s On Us, a national non-profit dedicated to combating campus sexual assault through awareness and prevention education programs, to announce our important new Snap Map safety feature helping friends share their real-time location. 

Together with It’s On Us, we introduced this new tool to help Snapchatters look out for one another while they are on the go, whether they are on their way to meet up or on their way home at night – and already, more than three million members of our community use the feature to connect with their friends each week on average.

This April, for Sexual Assault Awareness Month, Snapchat and It’s On Us have joined forces again to continue our community education around this important issue with new in-app resources and content, including:

  • A Lens raising awareness around this important issue, reminding Snapchatters to look out for their friends;

  • An episode of Snapchat’s original news show, Good Luck America, where our host Peter Hamby explores what’s happening around Title IX and sexual assault on U.S. college campuses today; and 

  • Map Markers on our Snap Map. These unique, tappable icons highlight a handful of active university It’s On Us chapters. Our Snap Map Markers link seamlessly back to the Lens in our Camera to make it easy for Snapchatters to share the message with their friends. 

With many in our community heading back out and about, whether they are on their way to spring break or coming back to campus, we know that this is a critical moment to raise awareness around this important issue. We’re proud to partner with It’s On Us to help Snapchatters keep each other safe.

If you or a loved one is in need of extra support at this time, please know that you are not alone. Please head to https://www.itsonus.org/ where you can find additional resources.

Apply to join our growing Safety Advisory Board!

Since 2018, members of Snap’s Safety Advisory Board (SAB) have provided critical feedback on fostering the safety and well-being of our Snapchat community, and they’ve helped us navigate some complex safety issues. Thanks to the expert advice, guidance, and partnership of our SAB members, we’ve made progress over the last four years, implementing important awareness-raising and educational efforts.

Snap remains committed to helping parents, caregivers, safety advocates, and others better understand how young people experience our platform and how we approach vital issues that impact safety and trust. However, given the ever-changing online safety landscape, we believe we have an opportunity to “reinvent” and relaunch our SAB to include new members who reflect our global community and our growth across products, including augmented reality and hardware, as well as expertise regarding newer online risks facing young people and their families. 

With those goals in mind, today we are opening applications to join our new and expanded SAB, which we hope will include members from diverse geographies and safety disciplines, including research, academia, technology and related fields. In keeping with Snap’s commitment to being victim- and survivor-informed in safety matters, we also welcome applications from those who may have experienced hardship or tragedy related to online interactions. Indeed, we are open to all applicants who have a unique perspective to share and an interest in constructively advising our ongoing safety work.     

We believe this approach to fashioning an SAB is unique among technology platforms, and we’re eager to receive applications from all corners of the globe. The application process will remain open for about two months, after which we will invite a number of experts to join our board.

In keeping with past practice and to help ensure the independence of our SAB, members will not receive compensation for their participation. Commitments will include three board meetings per year of roughly two hours each, in addition to occasional email correspondence. Those invited to join the SAB will be asked to agree to Terms of Reference, outlining expectations of board members as well as Snap’s commitments to the SAB.    

If you are interested in applying or would like to recommend someone, please complete this short application form by Friday, July 22. As advocates for online safety, we’re excited about the next chapter as we grow our network of advisors and trusted partners. Learn more and apply here!

- Jacqueline Beauchere, Snap Global Head of Platform Safety

Our Transparency Report for the Second Half of 2021

We are committed to making each of our Transparency Reports more comprehensive than the last. It’s a responsibility we don’t take lightly, as we know our stakeholders care as deeply about online safety and accountability as we do. As part of these ongoing efforts, we have made several additions and improvements to our latest Transparency Report, which covers the second half of 2021.

First, we are offering new detail on the amount of content we enforced against drug-related violations. We have zero tolerance for promoting illicit drugs on Snapchat and prohibit the buying or selling of illegal or regulated drugs. 

Over the past year, we have been especially focused on combating the rise of illicit drug activity as part of the larger, growing fentanyl and opioid epidemic across the U.S. We take a holistic approach that includes deploying tools that proactively detect drug-related content, working with law enforcement to support their investigations, and providing in-app information and support to Snapchatters through our fentanyl-related education portal, Heads Up. Heads Up surfaces resources from expert organizations when Snapchatters search for a range of drug-related terms and their derivatives. As a result of these ongoing efforts, the vast majority of drug-related content we uncover is proactively detected by our machine learning and artificial intelligence technology, and we will continue working to eradicate drug activity from our platform.

When we find activity involving the sale of dangerous drugs, we promptly ban the account, block the offender from creating new accounts on Snapchat, and have the ability to preserve content related to the account to support law enforcement investigations. During this reporting period, seven percent of all content we enforced against globally, and 10 percent of all content we enforced against in the U.S., involved drug-related violations. Globally, we took enforcement action against these accounts within a median of 13 minutes of receiving a report.

Second, we have created a new suicide and self-harm category to share the total number of content and account reports that we received and took action on when our Trust & Safety teams determined that a Snapchatter may be in crisis. When our Trust & Safety team recognizes a Snapchatter in distress, they have the option to forward self-harm prevention and support resources, and to notify emergency response personnel where appropriate. We care deeply about the mental health and wellbeing of Snapchatters and believe we have a duty to support our community in these difficult moments. 

In addition to these new elements in our latest Transparency Report, our data shows reductions in two key areas: our Violative View Rate (VVR) and the number of accounts we enforced against for attempting to spread hate speech, violence, or harm. Our current VVR is 0.08 percent, meaning that out of every 10,000 Snap and Story views on Snapchat, eight contained content that violated our Community Guidelines. This is an improvement from our last reporting cycle, during which our VVR was 0.10 percent.
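For readers who want the arithmetic spelled out, the conversion between a VVR percentage and violating views per 10,000 views is straightforward. The sketch below is a simple illustration of that conversion, not Snap’s internal methodology:

```python
def violating_views_per_10k(vvr_percent: float) -> float:
    """Convert a Violative View Rate (VVR) percentage into the expected
    number of violating views per 10,000 content views."""
    return vvr_percent / 100 * 10_000

# A 0.08% VVR works out to 8 violating views per 10,000 views;
# the prior cycle's 0.10% VVR works out to 10 per 10,000.
current_cycle = violating_views_per_10k(0.08)
previous_cycle = violating_views_per_10k(0.10)
```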

The fundamental architecture of Snapchat protects against the ability for harmful content to go viral, which removes incentives for content that appeals to people’s worst instincts, and limits concerns associated with the spread of bad content such as disinformation, hate speech, self-harm content, or extremism. In the more public parts of Snapchat, such as our Discover content platform and our Spotlight entertainment platform, we curate or pre-moderate content to ensure it complies with our guidelines before it can reach a larger audience. 

We also continue to improve our human moderation, and as a result we have reduced the median enforcement turnaround time to 12 minutes in both the hate speech and the threats and violence or harm categories – an improvement of 25 percent and eight percent, respectively.

We believe keeping our community safe on Snapchat is our most important responsibility, and we are constantly strengthening our comprehensive efforts to do so. Our work here is never done, but we will continue communicating updates about our progress, and we are grateful to the many partners who regularly help us improve.

Announcing New Policies for Snap’s Developer Platform

We want Snapchatters to have fun and stay safe when using our services, and that goal drives the design of our products, our policies and our platforms for third-party developers. We also focus on building technologies that support real-life human connections and communications between close friends – a principle that helps create more secure and more positive online experiences. 

We first launched our Snap Kit developer platform to bring some of Snapchat’s most popular features to third-party applications and services. From the outset, we set safety and privacy standards for all participating apps, and required that developers go through a review and approval process when they first apply to work with us, so we can examine how their integration will work and review their customer support operations.

Among other things, our guidelines prohibit bullying, harassment, hate speech, threats and other types of harmful content – and we require that developers have adequate safeguards in place to protect their customers and take action on any reports of abuse. 

Last year, a lawsuit raised serious allegations about two integrated apps that included anonymous messaging features. At the time, we suspended both apps from Snap Kit, and began conducting an extensive review of the program’s standards and policies. 

As a result of this review, today we are announcing several changes to our developer platform that we believe are in the best interest of our community, and further aligned with our focus of supporting communications that reflect real-life friendships. 

Banning Anonymous Messaging 

First, we will prohibit apps that facilitate anonymous messaging from integrating with our platform. During our review, we determined that even with safeguards in place, anonymous apps pose risks for abuse that are impossible to mitigate at an acceptable level. 

While we know that most Snapchatters used these anonymous integrations in fun, engaging, and entirely appropriate ways, we believe some users might be more prone to engage in harmful behavior – such as bullying or harassment – if they have the shroud of anonymity. Under our new policy, we will not allow third-party apps to use a Snapchat integration to facilitate communication between users without registered and visible usernames and identities.

Age-Gating Friend Finding Apps to 18+ 

Our review was holistic and examined the privacy and safety of integrated apps well beyond anonymous messaging. Today we are also announcing that friend-finding apps will not be allowed unless they are age-gated and restricted to Snapchatters over 18. This change will better protect younger users and is more consistent with Snapchat’s use case – communications between close friends who already know each other. 

As a platform that works with a wide range of developers, we want to foster an ecosystem that helps apps protect user safety, privacy, and wellbeing, while unlocking product innovation for developers and helping them grow their businesses. 

We believe we can do both, and will continue to regularly evaluate our policies, monitor app compliance, and work with developers to better protect the wellbeing of our community.

Looking Out For Friends on the Snap Map


At Snap, we help friends stay connected no matter where they are, and we want to give our community even more tools to safely explore the world around them. So today, we are introducing a new safety feature for the Snap Map that will help Snapchatters look out for one another while they are on the go, whether they are on their way to meet up, or on their way home at night. 

Since 2017, Snapchatters have been able to opt in to share their location with their friends on the Snap Map, but until now the app needed to be open for their location to update. This new tool gives Snapchatters the option to share their real-time location with a close friend even while the app is closed. With this new buddy system, Snapchatters can throw their phone in their pocket and head out the door, feeling confident that the people they trust most are looking out for them while they're on the move.

Location sharing on the Snap Map has always been, and will continue to be, off by default, meaning that Snapchatters have to proactively opt in to share where they are. Importantly, Snapchatters can only ever share their whereabouts with their existing Snapchat friends – there is no option to broadcast their location to the wider Snapchat community. 

As a platform built for communicating with close friends, we know that location sharing can be an easy and impactful way for young people to stay connected and safe. In fact, according to feedback from our community, we know that Snapchatters feel even more connected to their friends when they see them on Snap Map, and that they are motivated to share their location with friends because they think it’s a safe and fun way to stay connected. 

We’ve built this new tool to offer Snapchatters a buddy system, and we’ve included several safety elements from the start, including:

  • A fast and clear way to activate, so Snapchatters can share their real-time location in an instant if they ever feel unsafe.

  • Limited-time sharing and notification-free pausing, so Snapchatters can easily turn sharing off when they’ve reached their destination – which also minimizes any undue pressure to share constantly.

  • Required two-way friendship meaning that only those who have mutually added each other as friends on Snapchat can share their location, keeping in line with our existing Snap Map policies. 

  • A safety notice that pops up when Snapchatters use the feature for the first time, ensuring our community knows that this is meant to be used only with close friends and family.

  • Ultra-clear design, so Snapchatters always understand their setting selections and who can see their location.

We are all adjusting to new ways of being out and about in the world – especially on college campuses, where Snapchat is widely used. Many students have gone back to campus to be with their friends despite remote or hybrid learning, but with schools expecting less activity on the grounds, there may be gaps in normal security and safeguards. That's why we are launching this new tool as part of a partnership with It’s On Us, a national non-profit dedicated to combating campus sexual assault through campus awareness and prevention education programs. Starting today, a new PSA from It’s On Us will debut in our app, encouraging our community to look out for one another.

We know many parents may have questions about how the Map works, who can see Snapchatters’ locations (if they choose to share them), and the policies and tools we have in place. So, we wanted to share more on the key safety and privacy features of the Snap Map:

  • Location Sharing is OFF by Default and Only for Friends: For all Snapchatters, location sharing is off by default and completely optional. Snapchatters can update their location-sharing preferences at any time by tapping the settings gear at the top of the Snap Map. There, they can hand-select which existing friends can see their location, or hide themselves completely with ‘Ghost Mode.’ Snapchatters who do decide to share their location on the Map will only be visible to those they have selected – we don’t give anyone the option to share their location publicly with people they have not proactively and mutually added as a friend.

  • Education & Reminders: Snapchatters are taken through a tutorial when they use Snap Map for the first time. Here, they can learn how to opt in to location sharing, how to select friends to share with, and how to update settings at any time. Snapchatters who choose to share their location with their friends receive periodic reminders asking them to confirm that they are still comfortable with their settings; if they are not, they can easily switch off location sharing without prompting other users.

  • Additional Privacy Safeguards: Only content that is proactively submitted to the Snap Map appears on it; Snaps between friends remain private. For Snapchatters who maintain our default privacy setting, content shown on the Map is automatically anonymized, so anyone looking at the Map cannot see the name, contact information, or exact location of the person who shared. We also protect sensitive businesses and locations on the Map.

We know that mobile location sharing is sensitive and needs to be used with caution, but we believe that with the right safeguards in place, it can be an impactful way for friends to not only stay connected, but also to help keep each other safe. We encourage you to visit our support page here for more information.

Safer Internet Day 2022: Your report matters!

Today is international Safer Internet Day (SID), an annual event dedicated to people coming together around the world to make the internet safer and healthier for everyone, especially young people. SID 2022 marks 19 straight years of Safer Internet Day celebrations, and the world is again rallying around the theme, “Together for a better internet.”

At Snap, we’re taking this opportunity to highlight the benefits and importance of letting us know when you see something on Snapchat that may be of concern to you. Snapchat is about sharing and communicating with close friends, and we want everyone to feel safe, confident and comfortable sending Snaps and Chats. Still, there may be times when people share content or behave in ways that conflict with our Community Guidelines.

When it comes to staying safe online, everyone has a role to play, and we want all Snapchatters to know that reporting abusive or harmful content and behavior – so that we can address it – improves the community experience for everyone. In fact, this is one of the most important things Snapchatters can do to help keep the platform free of bad actors and harmful content.

Reporting reluctance

Research shows young people may be unwilling to report content or behaviors for a variety of reasons. Some of these may be rooted in social dynamics, but platforms can also do a better job of debunking certain myths about reporting to foster comfort in contacting us. For example, in November 2021, we learned that just over a third of young people surveyed (34%) said they worry what their friends will think if they take action against bad behavior on social media. In addition, nearly two in five (39%) said they feel pressure not to act when someone they personally know behaves badly. These findings come from Managing the Narrative: Young People’s Use of Online Safety Tools, conducted by Harris Insights and Analytics for the Family Online Safety Institute (FOSI) and sponsored by Snap.

The FOSI research polled several cohorts of teens, aged 13 to 17, and young adults, aged 18 to 24, in the U.S. In addition to the quantitative components, the survey sought participants’ general views on reporting and other topics. One comment from an 18-year-old summed up a number of young people’s perspectives: “I guess I didn’t think the offense was extreme enough to report.”

Fast Facts about reporting on Snapchat

The FOSI findings suggest possible misconceptions about the importance of reporting to platforms and services in general. For Snapchatters, we hope to help clear those up with this handful of Fast Facts about our current reporting processes and procedures. 

  • What to report:  In the conversations and Stories portions of Snapchat, you can report photos, videos and accounts; in the more public Discover and Spotlight sections, you can report content. 

  • How to report:  Reporting photos and videos can be done directly in the Snapchat app (just press and hold on the content); you can also report content and accounts via our Support Site (simply complete a short webform).  

  • Reporting is confidential:  We don’t tell Snapchatters who reported them.

  • Reports are vital:  To improve the experiences of Snapchatters, reports are reviewed and actioned by our safety teams, which operate around the clock and around the globe. In most instances, our teams action reports within two hours. 

  • Enforcement can vary:  Depending on the type of Community Guidelines or Terms of Service violation, enforcement actions can range from a warning, up to and including account deletion. (No action is taken when an account is found not to have violated Snapchat’s Community Guidelines or Terms of Service.) 

We’re always looking for ways to improve, and we welcome your feedback and input. Feel free to share your thoughts with us using our Support Site webform.

To commemorate Safer Internet Day 2022, we suggest all Snapchatters review our Community Guidelines and Terms of Service to brush up on acceptable content and conduct. We’ve also created a new reporting Fact Sheet that includes a helpful FAQ, and we updated a recent “Safety Snapshot” episode on reporting. Safety Snapshot is a Discover channel that Snapchatters can subscribe to for fun and informative safety- and privacy-related content. For some added enjoyment to mark SID 2022, check out our new global filter, and look for additional improvements to our in-app reporting features in the coming months.    

New resource for parents 

Finally, we want to highlight a new resource we’re offering for parents and caregivers. In collaboration with our partners at MindUp: The Goldie Hawn Foundation, we’re pleased to share a new digital parenting course, “Digital Well-Being Basics,” which takes parents and caregivers through a series of modules about supporting and empowering healthy digital habits among teens. 

We look forward to sharing more of our new safety and digital well-being work in the coming months. In the meantime, consider doing at least one thing this Safer Internet Day to help keep yourself and others safe. Making a personal pledge to report would be a great start! 

- Jacqueline Beauchere, Global Head of Platform Safety

Data Privacy Day: Supporting the Privacy and Wellbeing of Snapchatters


Today marks Data Privacy Day, a global effort to raise awareness about the importance of respecting and safeguarding privacy. Privacy has always been central to Snapchat’s primary use case and mission.

We first built our app to help people connect with their real friends and feel comfortable expressing themselves authentically – without feeling pressure to curate a perfect image or measure themselves against others. We wanted to reflect the natural dynamics between friends in real life, where trust and privacy are essential to their relationships.

We designed Snapchat with fundamental privacy features baked into the app’s architecture, to help our community develop that trust with their real-life friends, and support their safety and wellbeing:

  • We focus on connecting people who are already friends in real life and require that, by default, two Snapchatters opt in to being friends in order to communicate.

  • We designed communications to delete by default to reflect the way people talk to their friends in real life, where they don’t keep a record of every single conversation for public consumption.

  • New features go through an intensive privacy- and safety-by-design product development process, where our in-house privacy experts work closely with our product and engineering teams to vet the privacy impacts.

We’re also constantly exploring what more we can do to help protect the privacy and safety of our community, including how to further educate them about online risks. To help us continue to do that, we recently commissioned global research to better understand how young people think about their online privacy. Among other things, the findings confirmed that almost 70% of participants said privacy makes them feel more comfortable expressing themselves online, and 59% of users said privacy and data security concerns impact their willingness to share on online platforms. You can read more of our findings here.

We feel a deep responsibility to help our community develop strong online privacy habits – and want to reach Snapchatters where they are through in-app education and resources. 

We regularly remind our community to enable two-factor authentication and use strong passwords – two important safeguards against account breaches – and today we are launching new content on our Discover platform with tips about creating unique account credentials and how to set up two-factor authentication.

We are also launching new privacy-focused creative tools, including our first-ever privacy-themed Bitmoji, stickers developed with the International Association of Privacy Professionals (IAPP), and a new Lens in partnership with the Future of Privacy Forum that shares helpful privacy tips.

In the coming months, we will continue to leverage our research findings to inform additional in-app privacy tools for our community.  

Meet Our Global Head of Platform Safety

Hello, Snapchat community! My name is Jacqueline Beauchere, and I joined Snap last fall as the company’s first Global Head of Platform Safety.

My role focuses on enhancing Snap’s overall approach to safety, including creating new programs and initiatives to help raise awareness of online risks; advising on internal policies, product tools and features; and listening to and engaging with external audiences – all to help support the safety and digital well-being of the Snapchat community. 

Since my role involves helping safety advocates, parents, educators and other key stakeholders understand how Snapchat works and to solicit their feedback, I thought it might be useful to share some of my initial learnings about the app; what surprised me; and some helpful tips, if you or someone close to you is an avid Snapchatter. 

 Initial Learnings – Snapchat and Safety 

After more than 20 years working in online safety at Microsoft, I’ve seen significant change in the risk landscape. In the early 2000s, issues like spam and phishing highlighted the need for awareness-raising to help educate consumers and minimize socially engineered risks. The advent of social media platforms – and people’s ability to post publicly – increased the need for built-in safety features and content moderation to help minimize exposure to illegal and potentially more harmful content and activity.  

Ten years ago, Snapchat came onto the scene. I knew the company and the app were “different,” but until I actually started working here, I didn’t realize just how different they are. From inception, Snapchat was designed to help people communicate with their real friends – meaning people they know “in real life” – rather than amassing large numbers of known (or unknown) followers. Snapchat is built around the camera. In fact, for non-first-generation Snapchatters (like me), the app’s very interface can be a bit mystifying because it opens directly to a camera and not a content feed like traditional social media platforms. 

There’s far more that goes into Snapchat’s design than one might expect, and that considered approach stems from the tremendous value the company places on safety and privacy. Safety is part of the company’s DNA and is baked into its mission: to empower people to express themselves, live in the moment, learn about the world and have fun together. Unless people feel safe, they won’t be comfortable expressing themselves freely when connecting with friends.

The belief that technology should be built to reflect real-life human behaviors and dynamics is a driving force at Snap. It’s also vital from a safety perspective. For example, by default, not just anyone can contact you on Snapchat; two people need to affirmatively accept each other as friends before they can begin communicating directly, similar to the way friends interact in real life.

Snap applies privacy-by-design principles when developing new features and was one of the first platforms to endorse and embrace safety-by-design, meaning safety is considered in the design phase of our features – no retro-fitting or bolting on safety machinery after the fact. How a product or feature might be misused or abused from a safety perspective is considered, appropriately so, at the earliest stages of development.  

What Surprised Me – Some Context Behind Some Key Features 

Given my time in online safety and working across industry, I’d heard some concerns about Snapchat. Below are a handful of examples and what I’ve learned over the past few months. 

Content that Deletes by Default 

Snapchat is probably best known for one of its earliest innovations: content that deletes by default. Like others, I made my own assumptions about this feature and, as it turns out, it’s not what I’d first presumed. Moreover, it reflects the real-life-friends dynamic.

Snapchat’s approach is rooted in human-centered design. In real life, conversations between and among friends aren’t saved, transcribed or recorded in perpetuity. Most of us are more at ease and can be our most authentic selves when we know we won’t be judged for every word we say or every piece of content we create. 

One misperception I’ve heard is that Snapchat’s delete-by-default approach makes it impossible to access evidence of illegal behavior for criminal investigations. This is incorrect. Snap has the ability to, and does, preserve content existing in an account when law enforcement sends us a lawful preservation request. For more information about how Snaps and Chats are deleted, see this article.

Strangers Finding Teens

A natural concern for any parent when it comes to online interactions is how strangers might find their teens. Again, Snapchat is designed for communications between and among real friends; it doesn’t facilitate connections with unfamiliar people like some social media platforms. Because the app was built for communicating with people we already know, by design, it’s difficult for strangers to find and contact specific individuals. Generally, people who are communicating on Snapchat have already accepted each other as friends. In addition, Snap has added protections to make it even more difficult for strangers to find minors, like banning public profiles for those under 18. Snapchat only allows minors to surface in friend-suggestion lists (Quick Add) or Search results if they have friends in common. 

A newer tool we want parents and caregivers to be aware of is Friend Check-Up, which prompts Snapchatters to review their friend lists to confirm those included are still people they want to be in contact with. Those you no longer want to communicate with can easily be removed. 

Snap Map and Location-Sharing

Along those same lines, I’ve heard concerns about the Snap Map – a personalized map that allows Snapchatters to share their location with friends, and to find locally relevant places and events, like restaurants and shows. By default, location settings on Snap Map are set to private (Ghost Mode) for all Snapchatters. Snapchatters have the option of sharing their location, but they can do so only with others whom they’ve already accepted as friends – and they can make location-sharing decisions specific to each friend. It’s not an “all-or-nothing” approach to sharing one’s location with friends. Another Snap Map plus for safety and privacy: if people haven’t used Snapchat for several hours, they’re no longer visible to their friends on the map.

Most importantly from a safety perspective, there’s no way for a Snapchatter to share their location on the Map with someone they’re not friends with, and Snapchatters have full control over which friends they share their location with – and whether they want to share their location at all.

Harmful Content

Early on, the company made a deliberate decision to treat private communications between friends, and public content available to wider audiences, differently. In the more public parts of Snapchat, where material is likely to be seen by a larger audience, content is curated or pre-moderated to prevent potentially harmful material from “going viral.” Two parts of Snapchat fall into this category: Discover, which includes content from vetted media publishers and content creators, and Spotlight, where Snapchatters share their own entertaining content with the larger community.

On Spotlight, all content is first reviewed with automated tools and then undergoes an extra layer of human moderation before it is eligible to be seen, currently, by more than a couple dozen people. This helps ensure the content complies with Snapchat’s policies and guidelines, and helps mitigate risks that automated moderation may have missed. By seeking to control virality, Snap lessens the appeal of publicly posting illegal or potentially harmful content, which, in turn, leads to significantly lower levels of exposure to hate speech, self-harm and violent extremist material, to name a few examples – as compared with other social media platforms.

Exposure to Drugs

Snapchat is one of many online platforms that drug dealers are abusing globally and, if you’ve seen any media coverage of parents and family members who’ve lost children to a fentanyl-laced counterfeit pill, you can appreciate how heartbreaking and terrifying this situation can be. We certainly do, and our hearts go out to those who’ve lost loved ones to this frightening epidemic.

Over the past year, Snap has been aggressively and comprehensively tackling the fentanyl and drug-related content issue in three key ways:

  • Developing and deploying new technology to detect drug-related activity on Snapchat to, in turn, identify and remove drug dealers who abuse the platform;

  • Reinforcing and taking steps to bolster our support for law enforcement investigations, so authorities can quickly bring perpetrators to justice; and

  • Raising awareness of the dangers of fentanyl with Snapchatters via public service announcements and educational content directly in the app. (You can learn more about all of these efforts here.)

We’re determined to make Snapchat a hostile environment for drug-related activity and will continue to expand on this work in the coming months. In the meantime, it’s important for parents, caregivers and young people to understand the pervasive threat of potentially fatal fake drugs that has spread across online platforms, and to talk with family and friends about the dangers and how to stay safe.

Snap has much planned on the safety and privacy fronts in 2022, including launching new research and safety features, as well as creating new resources and programs to inform and empower our community to adopt safer and healthier digital practices. Here’s to the start of a productive New Year, chock-full of learning, engagement, safety and fun!   

- Jacqueline Beauchere, Global Head of Platform Safety

Expanding our Work to Combat the Fentanyl Epidemic

Heads Up Portal

Late last year, the CDC announced that more than 100,000 people died from drug overdoses in the US over a 12-month period -- with fentanyl being a major driver of this spike. This staggering data hits home – we recognize the horrible human toll that the opioid epidemic is taking across the country, and the impact that fentanyl and adulterated drugs (often masked as counterfeit prescription drugs) are having on young people and their families in particular. We also know that drug dealers are constantly searching for ways to exploit messaging and social media apps, including trying to find new ways to abuse Snapchat and our community, to conduct their illegal and deadly commerce.

Our position on this has always been clear: we have absolutely zero tolerance for drug dealing on Snapchat. We are continuing to develop new measures to keep our community safe on Snapchat, and have made significant operational improvements over the past year toward our goal of eradicating drug dealers from our platform. Moreover, although Snapchat is just one of many communications platforms that drug dealers seek to abuse in order to distribute illicit substances, we still have a unique opportunity to use our voice, technology and resources to help address this scourge, which threatens the lives of our community members.

In October, we shared updates on the progress we have been making to crack down on drug-related activity and to promote broader public awareness about the threat of illicit drugs. We take a holistic approach that includes deploying tools that proactively detect drug-related content, working with law enforcement to support their investigations, and providing in-app information and support to Snapchatters who search for drug-related terms through a new education portal, Heads Up. 

Today, we’re expanding on this work in several ways. First, we will be welcoming two new partners to our Heads Up portal to provide important in-app resources to Snapchatters: Community Anti-Drug Coalitions of America (CADCA), a nonprofit organization that is committed to creating safe, healthy and drug-free communities; and Truth Initiative, a nonprofit dedicated to achieving a culture where all young people reject smoking, vaping and nicotine. Through its proven-effective and nationally recognized truth public education campaign, Truth Initiative has provided content addressing the youth vaping and opioid epidemics, which it has taken on in recent years. In the coming days we will also release the next episode of our special Good Luck America series focused on fentanyl, which is featured on our Discover content platform.

Second, we’re sharing updates on the progress we’ve made in proactively detecting drug-related content and more aggressively shutting down dealers. Over the past year:

  • We have increased our proactive detection rates by 390% -- an increase of 50% since our last public update in October. 

  • 88% of drug-related content we uncover is now proactively detected by our machine learning and artificial intelligence technology, with the remainder reported by our community. This is an increase of 33% since our previous update. When we find drug dealing activity, we promptly ban the account, use technology to block the offender from creating new accounts on Snapchat, and in some cases proactively refer the account to law enforcement for investigation. 

  • We have grown our law enforcement operations team by 74%. While we’ve always cooperated with law enforcement investigations by preserving and disclosing data in response to valid requests, this increased capacity has helped us significantly improve our response times to law enforcement inquiries by 85% over the past year, and we continue to improve these capabilities. You can learn more about our investments in our law enforcement work here.

Since this fall, we have also seen another important indicator of progress: a decline in community-reported content related to drug sales. In September, over 23% of drug-related reports from Snapchatters contained content specifically related to sales, and as a result of our proactive detection work, we have driven that down to 16% as of this month. This marks a relative decline of 31% in the share of drug-related reports involving sales. We will keep working to get this number as low as possible. 
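For readers wondering how a drop from 23% to 16% amounts to a 31% decline, the figure is a relative one – the change measured against the September starting point. A minimal sketch of the arithmetic, using the rounded percentages from this post (the published 31% presumably reflects unrounded underlying report counts):

```python
# Relative vs. absolute decline in the share of drug-related reports
# that involve sales. Inputs are the rounded figures cited in the post.
september_share = 23.0  # % of drug-related reports involving sales (September)
current_share = 16.0    # % of drug-related reports involving sales (this month)

# Absolute decline: simple difference, in percentage points.
absolute_decline = september_share - current_share

# Relative decline: the difference as a fraction of the starting share.
relative_decline = absolute_decline / september_share * 100

print(f"Absolute decline: {absolute_decline:.0f} percentage points")
print(f"Relative decline: {relative_decline:.1f}%")
```

With these rounded inputs the relative decline works out to about 30.4%, consistent with the reported 31% once rounding of the underlying counts is accounted for.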

Additionally, we continue to work with experts to regularly update the list of slang and drug-related terms we block from being visible in Snapchat. This is a constant, ongoing effort that not only blocks Search results for those terms, but also proactively surfaces the expert educational resources in our Heads Up tool. 

Third, we’re continuing to make our underlying products safer for minors. As a platform built for close friends, we designed Snapchat to make it difficult for strangers to find and connect with minors. For example, Snapchatters cannot see each other’s friend lists, we don’t allow browsable public profiles for anyone under 18 and, by default, you cannot receive a message from someone who isn’t already your friend. While we know that drug dealers seek to connect with potential customers on platforms outside of Snapchat, we want to do everything we can to keep minors from being discovered on Snapchat by people who may be engaging in illegal or harmful behavior. 

We recently added a new safeguard to Quick Add, our friend suggestion feature, to further protect 13-to-17-year-olds. To be discoverable in Quick Add by someone else, users under 18 will need to have a certain number of friends in common with that person -- making it more likely that this is a friend they know in real life. 

In the coming months, we will be sharing more details about the new parental tools we are developing, with the goal of giving parents more insight into who their teens are talking to on Snapchat, while still respecting their privacy. 

And we will continue to build on this critical work, with additional partnerships and operational improvements underway.

Investing in and Expanding our Law Enforcement Operations

LEO Asset

When we first launched this blog, we explained that one of our goals was to do a better job of talking to the many stakeholders who care deeply about the health and wellbeing of our community -- parents and other family members, educators and mentors, safety advocates, and law enforcement.  In this post, we wanted to provide information about steps we’ve taken to facilitate better communications with the law enforcement community.

Law enforcement at every level are crucial partners in our efforts to combat illegal or harmful activity on our platform. As part of our ongoing work to keep our community safe, we have an in-house Law Enforcement Operations team dedicated to reviewing and responding to law enforcement requests for data related to their investigations. For example:

  • While content on Snapchat is ephemeral, designed to reflect the nature of real-life conversations between friends, we have long offered law enforcement agencies the ability, consistent with applicable laws, to have available account information and content preserved in response to valid legal requests. 

  • We have always proactively escalated to law enforcement authorities any content that could involve imminent threats to life. 

  • Once we have received a valid legal request for Snapchat account records, we respond in compliance with applicable laws and privacy requirements.

Over the past year, we have been investing in growing this team and continuing to improve its ability to respond to valid law enforcement requests in a timely manner. The team has expanded by 74%, with many new team members joining across all levels, including some from careers as prosecutors and law enforcement officials with experience in youth safety. As a result of these investments, we have been able to significantly improve our response times for law enforcement investigations by 85% year-over-year. In the case of emergency disclosure requests -- some of the most critical requests, which involve the imminent danger of death or serious bodily injury -- our 24/7 team usually responds within 30 minutes. To learn more about the types and volume of law enforcement requests Snap receives, we publish a Transparency Report every six months to provide the public with these important insights. You can read our latest report, covering the first half of 2021, here.

Recognizing that Snapchat is built differently than traditional social media platforms, and many members of law enforcement may not be as familiar with how our products work and what capabilities we have for supporting their work, one of our top priorities is to provide more -- and ongoing -- educational resources to help this community better learn how our services and processes work. We recently took two important steps forward as part of this larger focus.

First, we welcomed Rahul Gupta to serve as our first Head of Law Enforcement Outreach. Rahul joined Snap after a distinguished career as a local prosecutor in California, with expertise in cybercrime, social media, and digital evidence. In this new role, Rahul will develop a global law enforcement outreach program to raise awareness about Snap’s policies for responding to legal data requests. He will also build relationships and seek regular feedback from law enforcement agencies as we continue to identify areas for improvement. 

Second, in October, we held our first-ever Snap Law Enforcement Summit to help build stronger connections and explain our services to U.S. law enforcement officials. More than 1,700 law enforcement officials from federal, state and local agencies participated. 

To help measure how useful our inaugural event was and identify areas of opportunity, we surveyed our attendees before and after the Summit. Prior to the Summit, we found that:

  • Only 27% of those surveyed were familiar with Snapchat’s safety measures;

  • 88% wanted to learn what kind of data Snapchat can provide in support of their investigations; and

  • 72% wanted to know the best process for working with Snapchat.

After the Summit:

  • 86% of attendees said they had a better understanding of our work with law enforcement;

  • 85% said they had a better understanding of the process to submit legal requests for data; and

  • 78% said they would want to attend future Snap law enforcement summits.  

We are deeply grateful to all those who attended, and in light of their feedback, we are pleased to share that we will be making our Snap Law Enforcement Summit an annual event in the U.S. We are also planning to expand our outreach to law enforcement agencies in certain countries outside the U.S.

Our long-term goal is to have a world-class Law Enforcement Operations team -- and we know we have to continue to make meaningful improvements to get there. We hope our inaugural Summit was the start of an important dialogue with law enforcement stakeholders about how we can continue to build on the progress we’re seeing -- and help keep Snapchatters safe.