Mental Health Awareness Month: Announcing an Industry-Wide Campaign to Combat the Nationwide Fentanyl Epidemic

Over the past year and a half, Snap has been deeply focused on doing our part to help combat the broader national fentanyl crisis, which has continued to intensify during the pandemic. The Centers for Disease Control and Prevention (CDC) estimates that over 100,000 people died due to drug overdose in the 12-month period ending November 2021. And according to a recent study published in JAMA, 77% of adolescent overdose deaths in 2021 involved fentanyl. 

At Snap, we have taken a holistic approach to eradicating drug dealers from our platform, strengthening our support for law enforcement investigations, and educating Snapchatters about the dangers of counterfeit pills laced with fentanyl. This work has involved close collaboration with parents, expert organizations, and law enforcement to learn how we can keep strengthening these efforts.

To help inform our in-app education efforts, last year we commissioned research from Morning Consult to better understand young people’s awareness of fentanyl and why they are increasingly turning to prescription pills. Our research not only found that young people were largely unaware of the extraordinary dangers of fentanyl and how pervasive it is in counterfeit prescription pills, but also brought to light the strong correlation between the larger mental health crisis and the increased use of prescription drugs. Teenagers are suffering from high levels of stress and anxiety, and as a result are experimenting with the non-medical use and abuse of prescription drugs as a coping strategy.

Through all of our ongoing work, it has become clear that a larger, industry-wide approach is needed to help educate both young people and parents about the dangers of fentanyl. 

Today we’re grateful to be collaborating with the Ad Council on an unprecedented public awareness campaign launching this summer to help Americans learn about the dangers of fentanyl. Snap, along with Meta and Google, will be funding this effort. With the help of additional media partners, we will also be donating media space and developing and distributing content designed to educate both young adults and parents on this growing fentanyl crisis. 

Additionally, we’re sharing updates on our continued work to crack down on drug-related activity by improving our underlying technology to better detect drug-related content, while continuing to increase public awareness through key partnerships and in-app educational resources. 

Since our last public update in January, we have put even stronger machine learning models in place for automatically detecting drug-related text, images, and emojis. As of March 2022, more than 90% of the dangerous drug-related content that we proactively detected using these tools has been removed within minutes.
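
To make the idea of proactive detection a bit more concrete, here is a minimal, purely illustrative sketch of how text and emoji signals might be scored and flagged for fast removal. The term list, scoring, and function names (drug_risk_score, review_content) are our own assumptions for illustration; the actual systems described above rely on trained machine learning models over text, images, and emojis.

```python
# Purely illustrative sketch -- not Snap's actual detection system, which uses
# trained machine learning models over text, images, and emojis.

DRUG_TERMS = {"fentanyl", "percs", "oxys"}   # hypothetical term list
PILL_EMOJI = "\U0001F48A"                    # pill emoji as one example signal

def drug_risk_score(text: str) -> float:
    """Crude score: counts known drug terms and emoji signals in a message."""
    tokens = set(text.lower().split())
    hits = len(tokens & DRUG_TERMS) + text.count(PILL_EMOJI)
    return min(1.0, hits / 2)

def review_content(content_id: str, text: str, threshold: float = 0.5) -> bool:
    """Flag content whose score crosses the threshold so it can be removed quickly."""
    if drug_risk_score(text) >= threshold:
        print(f"flagging {content_id} for immediate removal")  # placeholder action
        return True
    return False

review_content("snap-123", "selling percs \U0001F48A dm me")   # -> True
```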

We’re also expanding our educational resources for Snapchatters by: 

  • Welcoming new partners like SAFE Project – founded by Admiral James and Mary Winnefeld to combat the nation's catastrophic addiction epidemic – to Heads Up, our in-app portal that distributes expert resources to Snapchatters who search for a range of drug-related terms and slang. Since the launch of Heads Up, over 2.5 million Snapchatters have been proactively served educational content from trusted expert organizations like Song for Charlie, Shatterproof, the Substance Abuse and Mental Health Services Administration (SAMHSA), the Centers for Disease Control and Prevention (CDC), the Community Anti-Drug Coalitions of America (CADCA), and Truth Initiative.

  • Expanding our partnership with CADCA by collaborating with their National Advisory Youth Council to develop Heads Up resources specifically geared towards substance misuse, community engagement, and prevention advocacy.  

  • Teaming up with the Partnership to End Addiction – the nation’s leading organization dedicated to addiction prevention, treatment, and recovery – to develop a guide focused on educating parents and caregivers about the dangers of fentanyl and providing tips for how to discuss the risks with their teens. This will be available in English and Spanish on the Partnership to End Addiction’s resource page and on Snap’s Safety Center.

  • Releasing, in the coming weeks, the next episode of our ongoing special Good Luck America series focused on the fentanyl crisis, featuring an interview with Dr. Rahul Gupta, the Director of National Drug Control Policy at the White House, to help Snapchatters understand the drug-overdose epidemic and its impact on young Americans.

In addition to these new tools and resources, we are also updating our in-app reporting flow to make it even easier for Snapchatters to report harmful content – that way, we can act even faster to protect our community. We’re doing this by reducing the number of steps it takes to submit a report, providing detailed definitions of each reporting category so our corresponding policies are clear, increasing the number of reporting categories so there is more specificity around the type of abuse taking place, and closing the feedback loop by letting Snapchatters know what action we take on their reports.

In the upcoming months, we will also be rolling out our new parental tools, with the goal of giving parents more insight into who their teens are talking to on Snapchat, while still respecting the teen’s privacy. 

Additional Efforts to Support our Community’s Mental Health and Wellbeing 

With Mental Health Awareness Month underway, we are also announcing a slate of new partners and launching creative and educational tools to help Snapchatters look after their mental health and wellbeing, and to support their friends. We believe this is especially important given the strong correlation between mental health and young people self-medicating. These efforts include: 

  • Participating as a partner in the White House’s first-ever Mental Health Youth Action Forum alongside leading mental health non-profits in an effort to empower young people to drive action on mental health. 

  • Serving as a Founding Partner of Mental Health Action Day. In an effort to remind Snapchatters to take care of their own wellbeing, we will launch a new augmented reality (AR) Lens that encourages Snapchatters to take a wellness break and participate in a breathing exercise. 

  • In addition, we’re adding new partners to Here for You, our in-app mental health portal, including The Jed Foundation, The American Foundation for Suicide Prevention, Movember, and the National Alliance for Eating Disorders.

As we roll out these additional resources, we continue to prioritize the mental health and wellbeing of our community every day. As an app built to help people communicate with their real-life friends – who we know are critical support systems for those experiencing mental health challenges – we will continue to develop innovative tools and resources to help Snapchatters stay healthy and safe.

Snap Partners with It’s On Us for Sexual Assault Awareness Month


In February, Snapchat partnered with It’s On Us, a national non-profit dedicated to combating campus sexual assault through awareness and prevention education programs, to announce an important new Snap Map safety feature that helps friends share their real-time location.

Together with It’s On Us, we introduced this new tool to help Snapchatters look out for one another while they are on the go, whether they are on their way to meet up, or on their way home at night – and already, more than three million members of our community use the feature to connect with their friends each week on average.

This April, for Sexual Assault Awareness Month, Snapchat and It’s On Us have joined forces again to continue our community education around this important issue with new in-app resources and content, including:

  • A Lens raising awareness around this important issue, reminding Snapchatters to look out for their friends;

  • An episode of Snapchat’s original news show, Good Luck America, where our host Peter Hamby explores what’s happening around Title IX and sexual assault on U.S. college campuses today; and 

  • Map Markers on our Snap Map. These unique, tappable icons highlight a handful of active university It’s On Us chapters. Our Snap Map Markers link seamlessly back to the Lens in our Camera to make it easy for Snapchatters to share the message with their friends. 

With many in our community heading back out and about, whether they are on their way to spring break or coming back to campus, we know that this is a critical moment to raise awareness around this important issue. We’re proud to partner with It’s On Us to help Snapchatters keep each other safe.

If you or a loved one is in need of extra support at this time, please know that you are not alone. Please head to https://www.itsonus.org/ where you can find additional resources.

Apply to join our growing Safety Advisory Board!

Since 2018, members of Snap’s Safety Advisory Board (SAB) have provided critical feedback on fostering the safety and well-being of our Snapchat community, and they’ve helped us navigate some complex safety issues. Thanks to the expert advice and guidance of our SAB members and their partnership, we’ve made progress over the last four years, implementing important awareness-raising and educational efforts.

Snap remains committed to helping parents, caregivers, safety advocates, and others better understand how young people experience our platform and how we approach vital issues that impact safety and trust. However, given the ever-changing online safety landscape, we believe we have an opportunity to “reinvent” and relaunch our SAB to include new members who reflect our global community and our growth across products, including augmented reality and hardware, as well as expertise regarding newer online risks facing young people and their families. 

With those goals in mind, today we are opening applications to join our new and expanded SAB, which we hope will include members from diverse geographies and safety disciplines, including research, academia, technology and related fields. In keeping with Snap’s commitment to being victim- and survivor-informed in safety matters, we also welcome applications from those who may have experienced hardship or tragedy related to online interactions. Indeed, we are open to all applicants who have a unique perspective to share and an interest in constructively advising our ongoing safety work.     

We believe this approach to fashioning an SAB is unique among technology platforms, and we’re eager to receive applications from all corners of the globe. The application process will remain open for about two months, after which we will invite a number of experts to join our board.

In keeping with past practice and to help ensure the independence of our SAB, members will not receive compensation for their participation. Commitments will include three board meetings per year of roughly two hours each, in addition to occasional email correspondence. Those invited to join the SAB will be asked to agree to Terms of Reference, outlining expectations of board members as well as Snap’s commitments to the SAB.    

If you are interested in applying or would like to recommend someone, please complete this short application form by Friday, July 22. As advocates for online safety, we’re excited about the next chapter as we grow our network of advisors and trusted partners. Learn more and apply here!

- Jacqueline Beauchere, Snap Global Head of Platform Safety

Our Transparency Report for the Second Half of 2021

We are committed to making each of our Transparency Reports more comprehensive than the last. It’s a responsibility we don’t take lightly, as we know our stakeholders care as deeply about online safety and accountability as we do. As part of these ongoing efforts, we have made several additions and improvements to our latest Transparency Report, which covers the second half of 2021.

First, we are offering new detail on the amount of content we enforced against drug-related violations. We have zero tolerance for promoting illicit drugs on Snapchat and prohibit the buying or selling of illegal or regulated drugs. 

Over the past year, we have been especially focused on combating the rise of illicit drug activity as part of the larger growing fentanyl and opioid epidemic across the U.S. We take a holistic approach that includes deploying tools that proactively detect drug-related content, working with law enforcement to support their investigations, and providing in-app information and support to Snapchatters through our fentanyl-related education portal, Heads Up. Heads Up surfaces resources from expert organizations when Snapchatters search for a range of drug-related terms and their derivatives. As a result of these ongoing efforts, the vast majority of drug-related content we uncover is proactively detected by our machine learning and artificial intelligence technology, and we will continue working to eradicate drug activity from our platform.

When we find activity involving the sale of dangerous drugs, we promptly ban the account, block the offender from creating new accounts on Snapchat, and have the ability to preserve content related to the account to support law enforcement investigations. During this reporting period, seven percent of all content we enforced against globally, and 10 percent of all content we enforced against in the U.S., involved drug-related violations. Globally, our median turnaround time for taking enforcement action on this content was 13 minutes from receiving a report.

Second, we have created a new suicide and self-harm category to share the total number of content and account reports that we received and took action on when our Trust & Safety teams determined that a Snapchatter may be in crisis. When our Trust & Safety team recognizes a Snapchatter in distress, they can share self-harm prevention and support resources with that Snapchatter, and notify emergency response personnel where appropriate. We care deeply about the mental health and wellbeing of Snapchatters and believe we have a duty to support our community in these difficult moments.

In addition to these new elements in our latest Transparency Report, our data shows a reduction in two key areas: our Violative View Rate (VVR) and the number of accounts we enforced against for attempting to spread hate speech, violence, or harm. Our current VVR is 0.08 percent. This means that out of every 10,000 Snap and Story views on Snapchat, eight contained content that violated our Community Guidelines. This is an improvement from our last reporting cycle, during which our VVR was 0.10 percent.
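
As a quick worked check of that arithmetic, using only the figures reported above:

```python
# Worked check of the reported Violative View Rates, not new data.
views = 10_000
current_vvr = 0.08 / 100      # 0.08 percent this reporting period
previous_vvr = 0.10 / 100     # 0.10 percent last reporting period

print(round(views * current_vvr))    # -> 8 violating views per 10,000
print(round(views * previous_vvr))   # -> 10 violating views per 10,000
```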

The fundamental architecture of Snapchat protects against the ability for harmful content to go viral, which removes incentives for content that appeals to people’s worst instincts, and limits concerns associated with the spread of bad content such as disinformation, hate speech, self-harm content, or extremism. In the more public parts of Snapchat, such as our Discover content platform and our Spotlight entertainment platform, we curate or pre-moderate content to ensure it complies with our guidelines before it can reach a larger audience. 

We continue to work to improve our human moderation and, as a result, have reduced the median enforcement turnaround time to 12 minutes in both the hate speech and threats, violence or harm categories – an improvement of 25 percent and eight percent, respectively.
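
Reading those percentages as reductions relative to the prior medians – an interpretation we are assuming here, since the report does not spell it out – the implied earlier turnaround times were roughly 16 and 13 minutes:

```python
# Assumes "improved by X percent" means the new 12-minute median is X percent
# lower than the prior one; that interpretation is ours, not stated in the report.
new_median = 12   # minutes, both categories

print(round(new_median / (1 - 0.25), 1))   # hate speech: implied prior of ~16.0 minutes
print(round(new_median / (1 - 0.08), 1))   # threats, violence or harm: ~13.0 minutes
```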

We believe it's our most important responsibility to keep our community safe on Snapchat and we are constantly strengthening our comprehensive efforts to do that. Our work here is never done, but we will continue communicating updates about our progress and we are grateful to our many partners that regularly help us improve.

Announcing New Policies for Snap’s Developer Platform

We want Snapchatters to have fun and stay safe when using our services, and that goal drives the design of our products, our policies and our platforms for third-party developers. We also focus on building technologies that support real-life human connections and communications between close friends – a principle that helps create more secure and more positive online experiences. 

We first launched our Snap Kit developer platform to bring some of Snapchat’s most popular features to third-party applications and services. From the outset, we set safety and privacy standards for all participating apps and required developers to go through a review and approval process when they first apply to work with us, so we can examine how their integration will work and review their customer support operations.

Among other things, our guidelines prohibit bullying, harassment, hate speech, threats and other types of harmful content – and we require that developers have adequate safeguards in place to protect their customers and take action on any reports of abuse. 

Last year, a lawsuit raised serious allegations about two integrated apps that included anonymous messaging features. At the time, we suspended both apps from Snap Kit, and began conducting an extensive review of the program’s standards and policies. 

As a result of this review, today we are announcing several changes to our developer platform that we believe are in the best interest of our community, and further aligned with our focus on supporting communications that reflect real-life friendships.

Banning Anonymous Messaging 

First, we will prohibit apps that facilitate anonymous messaging from integrating with our platform. During our review, we determined that even with safeguards in place, anonymous apps pose risks for abuse that are impossible to mitigate at an acceptable level. 

While we know that most Snapchatters used these anonymous integrations in fun, engaging, and entirely appropriate ways, we believe some users might be more prone to engage in harmful behavior – such as bullying or harassment – if they have the shroud of anonymity. Under our new policy, we will not allow third-party apps to use a Snapchat integration to facilitate communication between users without registered and visible usernames and identities.

Age-Gating Friend Finding Apps to 18+ 

Our review was holistic and examined the privacy and safety of integrated apps well beyond anonymous messaging. Today we are also announcing that friend-finding apps will not be allowed unless they are age-gated and restricted to Snapchatters over 18. This change will better protect younger users and is more consistent with Snapchat’s use case – communications between close friends who already know each other. 

As a platform that works with a wide range of developers, we want to foster an ecosystem that helps apps protect user safety, privacy, and wellbeing, while unlocking product innovation for developers and helping them grow their businesses. 

We believe we can do both, and will continue to regularly evaluate our policies, monitor app compliance, and work with developers to better protect the wellbeing of our community.

Looking Out For Friends on the Snap Map


At Snap, we help friends stay connected no matter where they are, and we want to give our community even more tools to safely explore the world around them. So today, we are introducing a new safety feature for the Snap Map that will help Snapchatters look out for one another while they are on the go, whether they are on their way to meet up, or on their way home at night. 

Since 2017, Snapchatters have been able to opt in to share their location with their friends on the Snap Map, but until now the app needed to be open for their location to update. This new tool gives Snapchatters the option to share their real-time location with a close friend even while the app is closed. With this new buddy system, Snapchatters can throw their phone in their pocket and head out the door, feeling confident that the people they trust most are looking out for them while they're on the move.

Location sharing on the Snap Map has always been, and will continue to be, off by default, meaning that Snapchatters have to proactively opt in to share where they are. Importantly, Snapchatters can only ever share their whereabouts with their existing Snapchat friends – there is no option to broadcast their location to the wider Snapchat community. 
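
To illustrate those rules, here is a minimal sketch of what a location-visibility check could look like under the constraints described above: sharing is off by default, requires an explicit opt-in, and is only ever visible to mutually accepted friends the Snapchatter has selected. The class and function names are hypothetical, not Snap's code.

```python
# Illustrative sketch of the Snap Map rules described above; not Snap's implementation.
from dataclasses import dataclass, field

@dataclass
class Snapchatter:
    username: str
    friends: set = field(default_factory=set)       # mutually accepted friends
    sharing_enabled: bool = False                    # off by default
    share_with: set = field(default_factory=set)     # friends selected for sharing

def can_see_location(viewer: Snapchatter, subject: Snapchatter) -> bool:
    """Viewer sees subject's location only if subject opted in, they are mutual
    friends, and the viewer is among the friends the subject selected."""
    return (
        subject.sharing_enabled
        and viewer.username in subject.friends
        and subject.username in viewer.friends
        and viewer.username in subject.share_with
    )

alice = Snapchatter("alice", friends={"bob"})
bob = Snapchatter("bob", friends={"alice"})
print(can_see_location(bob, alice))        # False: sharing is off by default
alice.sharing_enabled = True
alice.share_with.add("bob")
print(can_see_location(bob, alice))        # True: opted in, mutual friends, selected
```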

As a platform built for communicating with close friends, we know that location sharing can be an easy and impactful way for young people to stay connected and safe. In fact, according to feedback from our community, we know that Snapchatters feel even more connected to their friends when they see them on Snap Map, and that they are motivated to share their location with friends because they think it’s a safe and fun way to stay connected. 

We’ve built this new tool to offer Snapchatters a buddy system, and we’ve included several safety elements from the start, including:

  • A fast and clear way to activate, so Snapchatters can share their real-time location in an instant if they ever feel unsafe.

  • Limited time sharing & notification-free pausing so Snapchatters can easily turn this off when they’ve reached their destination. Plus, this minimizes any undue pressure to constantly share. 

  • Required two-way friendship meaning that only those who have mutually added each other as friends on Snapchat can share their location, keeping in line with our existing Snap Map policies. 

  • A safety notice that pops up when Snapchatters use the feature for the first time, ensuring our community knows that this is meant to be used only with close friends and family.

  • Ultra clear design so Snapchatters always understand their setting selections and who can see their location.

We are all adjusting to new ways of being out and about in the world -- especially on college campuses, where Snapchat is widely used. Many students have gone back to campus to be with their friends despite remote or hybrid learning, but with schools expecting less activity on the grounds, there may be gaps in normal security and safeguards. That's why we are launching this new tool as part of a partnership with It’s On Us, a national non-profit dedicated to combating campus sexual assault through campus awareness and prevention education programs. Starting today, a new PSA from It’s On Us will debut in our app, encouraging our community to look out for one another. 

We know many parents may have questions about how the Map works, who can see Snapchatters’ locations (if they choose to share them), and the policies and tools we have in place. So, we wanted to share more on the key safety and privacy features of the Snap Map:

  • Location Sharing is OFF by Default and Only for Friends: For all Snapchatters, location-sharing is off by default and completely optional. Snapchatters can update their location sharing preferences at any time by tapping the settings gear at the top of the Snap Map. There, they can hand select which existing friends can see their location, or hide themselves completely with ‘Ghost Mode.’ Snapchatters who do decide to share their location on the Map will only be visible to those they have selected -- we don’t give anyone the option to share their location publicly with people they have not proactively and mutually added as a friend. 

  • Education & Reminders: Snapchatters are taken through a tutorial when they use Snap Map for the first time. Here, they can learn how to opt-in for location sharing, how to select friends to share with, and how to update settings at any time. Snapchatters who choose to share their location with their friends receive periodic reminders asking them to confirm that they are still comfortable with their settings and if they are not, they can easily switch off location sharing without prompting other users.

  • Additional Privacy Safeguards: Only content that is proactively submitted to the Snap Map appears on it; Snaps between friends remain private. For Snapchatters who maintain our default privacy setting, content shown on the Map is automatically anonymized, so anyone looking at the Map cannot see the name, contact information, or exact location of the person who shared. We also protect sensitive businesses and locations on the Map.

We know that mobile location sharing is sensitive and needs to be used with caution, but we believe that with the right safeguards in place, it can be an impactful way for friends to not only stay connected, but also to help keep each other safe. We encourage you to visit our support page here for more information.

Safer Internet Day 2022: Your report matters!

Today is international Safer Internet Day (SID), an annual event that brings people together around the world to make the internet safer and healthier for everyone, especially young people. SID 2022 marks 19 straight years of Safer Internet Day celebrations, and the world is again rallying around the theme, “Together for a better internet.”

At Snap, we’re taking this opportunity to highlight the benefits and importance of letting us know when you see something on Snapchat that may be of concern to you. Snapchat is about sharing and communicating with close friends, and we want everyone to feel safe, confident and comfortable sending Snaps and Chats. Still, there may be times when people may share content or behave in a way that conflicts with our Community Guidelines.

When it comes to staying safe online, everyone has a role to play, and we want all Snapchatters to know that reporting abusive or harmful content and behavior – so that we can address it – improves the community experience for everyone. In fact, this is one of the most important things Snapchatters can do to help keep the platform free of bad actors and harmful content.

Reporting reluctance

Research shows young people may be unwilling to report content or behaviors for a variety of reasons. Some of these may be rooted in social dynamics, but platforms can also do a better job of debunking certain myths about reporting to foster comfort in contacting us. For example, in November 2021, we learned that just over a third of young people surveyed (34%) said they worry what their friends will think if they take action against bad behavior on social media. In addition, nearly two in five (39%) said they feel pressure not to act when someone they personally know behaves badly. These findings come from Managing the Narrative: Young People’s Use of Online Safety Tools, conducted by Harris Insights and Analytics for the Family Online Safety Institute (FOSI) and sponsored by Snap.

The FOSI research polled several cohorts of teens, aged 13 to 17, and young adults, aged 18 to 24, in the U.S. In addition to the quantitative components, the survey sought participants’ general views on reporting and other topics. One comment from an 18-year-old summed up a number of young people’s perspectives, “I guess I didn’t think the offense was extreme enough to report.” 

Fast Facts about reporting on Snapchat

The FOSI findings suggest possible misconceptions about the importance of reporting to platforms and services in general. For Snapchatters, we hope to help clear those up with this handful of Fast Facts about our current reporting processes and procedures. 

  • What to report:  In the conversations and Stories portions of Snapchat, you can report photos, videos and accounts; in the more public Discover and Spotlight sections, you can report content. 

  • How to report:  Reporting photos and videos can be done directly in the Snapchat app (just press and hold on the content); you can also report content and accounts via our Support Site (simply complete a short webform).  

  • Reporting is confidential:  We don’t tell Snapchatters who reported them.

  • Reports are vital:  To improve the experiences of Snapchatters, reports are reviewed and actioned by our safety teams, which operate around the clock and around the globe. In most instances, our teams action reports within two hours. 

  • Enforcement can vary:  Depending on the type of Community Guidelines or Terms of Service violation, enforcement actions can range from a warning, up to and including account deletion. (No action is taken when an account is found not to have violated Snapchat’s Community Guidelines or Terms of Service.) 

We’re always looking for ways to improve, and we welcome your feedback and input. Feel free to share your thoughts with us using our Support Site webform.

To commemorate Safer Internet Day 2022, we suggest all Snapchatters review our Community Guidelines and Terms of Service to brush up on acceptable content and conduct. We’ve also created a new reporting Fact Sheet that includes a helpful FAQ, and we updated a recent “Safety Snapshot” episode on reporting. Safety Snapshot is a Discover channel that Snapchatters can subscribe to for fun and informative safety- and privacy-related content. For some added enjoyment to mark SID 2022, check out our new global filter, and look for additional improvements to our in-app reporting features in the coming months.    

New resource for parents 

Finally, we want to highlight a new resource we’re offering for parents and caregivers. In collaboration with our partners at MindUP: The Goldie Hawn Foundation, we’re pleased to share a new digital parenting course, “Digital Well-Being Basics,” which takes parents and caregivers through a series of modules about supporting and empowering healthy digital habits among teens.

We look forward to sharing more of our new safety and digital well-being work in the coming months. In the meantime, consider doing at least one thing this Safer Internet Day to help keep yourself and others safe. Making a personal pledge to report would be a great start! 

- Jacqueline Beauchere, Global Head of Platform Safety

Data Privacy Day: Supporting the Privacy and Wellbeing of Snapchatters


Today marks Data Privacy Day, a global effort to raise awareness about the importance of respecting and safeguarding privacy. Privacy has always been central to Snapchat’s primary use case and mission.

We first built our app to help people connect with their real friends and feel comfortable expressing themselves authentically – without feeling pressure to curate a perfect image or measure themselves against others. We wanted to reflect the natural dynamics between friends in real life, where trust and privacy are essential to their relationships.

We designed Snapchat with fundamental privacy features baked into the app’s architecture, to help our community develop that trust with their real-life friends, and support their safety and wellbeing:

  • We focus on connecting people who are already friends in real life, and require that, by default, two Snapchatters opt in to being friends in order to communicate.

  • We designed communications to delete by default to reflect the way people talk to their friends in real life, where they don’t keep a record of every single conversation for public consumption.

  • New features go through an intensive privacy- and safety-by-design product development process, where our in-house privacy experts work closely with our product and engineering teams to vet the privacy impacts.

We’re also constantly exploring what more we can do to help protect the privacy and safety of our community, including how to further educate them about online risks. To help us continue to do that, we recently commissioned global research to better understand how young people think about their online privacy. Among other things, the findings confirmed that almost 70% of participants said privacy makes them feel more comfortable expressing themselves online, and 59% said privacy and data security concerns impact their willingness to share on online platforms. You can read more of our findings here.

We feel a deep responsibility to help our community develop strong online privacy habits – and want to reach Snapchatters where they are through in-app education and resources. 

We regularly remind our community to enable two-factor authentication and use strong passwords – two important safeguards against account breaches – and today we are launching new content on our Discover platform with tips about creating unique account credentials and setting up two-factor authentication.

We are also launching new privacy-focused creative tools, including our first-ever privacy-themed Bitmoji stickers, developed with the International Association of Privacy Professionals (IAPP), and a new Lens in partnership with the Future of Privacy Forum that shares helpful privacy tips.

In the coming months, we will continue to leverage our research findings to inform additional in-app privacy tools for our community.  

Meet Our Global Head of Platform Safety

Hello, Snapchat community! My name is Jacqueline Beauchere and I joined Snap last fall as the company’s first Global Head of Platform Safety.

My role focuses on enhancing Snap’s overall approach to safety, including creating new programs and initiatives to help raise awareness of online risks; advising on internal policies, product tools and features; and listening to and engaging with external audiences – all to help support the safety and digital well-being of the Snapchat community. 

Since my role involves helping safety advocates, parents, educators and other key stakeholders understand how Snapchat works, and soliciting their feedback, I thought it might be useful to share some of my initial learnings about the app; what surprised me; and some helpful tips, if you or someone close to you is an avid Snapchatter.

 Initial Learnings – Snapchat and Safety 

After more than 20 years working in online safety at Microsoft, I’ve seen significant change in the risk landscape. In the early 2000s, issues like spam and phishing highlighted the need for awareness-raising to help educate consumers and minimize socially engineered risks. The advent of social media platforms – and people’s ability to post publicly – increased the need for built-in safety features and content moderation to help minimize exposure to illegal and potentially more harmful content and activity.  

Ten years ago, Snapchat came onto the scene. I knew the company and the app were “different,” but until I actually started working here, I didn’t realize just how different they are. From inception, Snapchat was designed to help people communicate with their real friends – meaning people they know “in real life” – rather than amassing large numbers of known (or unknown) followers. Snapchat is built around the camera. In fact, for non-first-generation Snapchatters (like me), the app’s very interface can be a bit mystifying because it opens directly to a camera and not a content feed like traditional social media platforms. 

There’s far more that goes into Snapchat’s design than one might expect, and that considered approach stems from the tremendous value the company places on safety and privacy. Safety is part of the company’s DNA and is baked into its mission: to empower people to express themselves, live in the moment, learn about the world and have fun together. Unless people feel safe, they won’t be comfortable expressing themselves freely when connecting with friends.

The belief that technology should be built to reflect real-life human behaviors and dynamics is a driving force at Snap. It’s also vital from a safety perspective. For example, by default, not just anyone can contact you on Snapchat; two people need to affirmatively accept each other as friends before they can begin communicating directly, similar to the way friends interact in real life.

Snap applies privacy-by-design principles when developing new features and was one of the first platforms to endorse and embrace safety-by-design, meaning safety is considered in the design phase of our features – no retro-fitting or bolting on safety machinery after the fact. How a product or feature might be misused or abused from a safety perspective is considered, appropriately so, at the earliest stages of development.  

What Surprised Me – Some Context Behind Some Key Features 

Given my time in online safety and working across industry, I’d heard some concerns about Snapchat. Below are a handful of examples and what I’ve learned over the past few months. 

Content that Deletes by Default 

Snapchat is probably most known for one of its earliest innovations: content that deletes by default. Like others, I made my own assumptions about this feature and, as it turns out, it’s quite different from what I’d first presumed. More than anything, it reflects the real-life-friends dynamic.

Snapchat’s approach is rooted in human-centered design. In real life, conversations between and among friends aren’t saved, transcribed or recorded in perpetuity. Most of us are more at ease and can be our most authentic selves when we know we won’t be judged for every word we say or every piece of content we create. 

One misperception I’ve heard is that Snapchat’s delete-by-default approach makes it impossible to access evidence of illegal behavior for criminal investigations. This is incorrect. Snap has the ability to, and does, preserve content existing in an account when law enforcement sends us a lawful preservation request. For more information about how Snaps and Chats are deleted, see this article.

Strangers Finding Teens

A natural concern for any parent when it comes to online interactions is how strangers might find their teens. Again, Snapchat is designed for communications between and among real friends; it doesn’t facilitate connections with unfamiliar people like some social media platforms. Because the app was built for communicating with people we already know, by design, it’s difficult for strangers to find and contact specific individuals. Generally, people who are communicating on Snapchat have already accepted each other as friends. In addition, Snap has added protections to make it even more difficult for strangers to find minors, like banning public profiles for those under 18. Snapchat only allows minors to surface in friend-suggestion lists (Quick Add) or Search results if they have friends in common with the person searching.

A newer tool we want parents and caregivers to be aware of is Friend Check-Up, which prompts Snapchatters to review their friend lists to confirm those included are still people they want to be in contact with. Those you no longer want to communicate with can easily be removed. 

Snap Map and Location-Sharing

Along those same lines, I’ve heard concerns about the Snap Map – a personalized map that allows Snapchatters to share their location with friends, and to find locally relevant places and events, like restaurants and shows. By default, location-settings on Snap Map are set to private (Ghost Mode) for all Snapchatters. Snapchatters have the option of sharing their location, but they can do so only with others whom they’ve already accepted as friends – and they can make location-sharing decisions specific to each friend. It’s not an “all-or-nothing” approach to sharing one’s location with friends. Another Snap Map plus for safety and privacy: If people haven’t used Snapchat for several hours, they’re no longer visible to their friends on the map.  

Most importantly from a safety perspective, there’s no ability for a Snapchatter to share their location on the Map with someone they’re not friends with, and Snapchatters have full control over the friends they choose to share their location with or if they want to share their location at all.

Harmful Content

Early on, the company made a deliberate decision to treat private communications between friends, and public content available to wider audiences, differently. In the more public parts of Snapchat, where material is likely to be seen by a larger audience, content is curated or pre-moderated to prevent potentially harmful material from “going viral.” Two parts of Snapchat fall into this category: Discover, which includes content from vetted media publishers and content creators, and Spotlight, where Snapchatters share their own entertaining content with the larger community.

On Spotlight, all content is reviewed with automated tools and then undergoes an extra layer of human moderation before it is eligible to be seen by more than a couple dozen people (the current threshold). This helps to ensure the content complies with Snapchat’s policies and guidelines, and helps to mitigate risks that may have been missed by automated moderation. By seeking to control virality, Snap lessens the appeal of publicly posting illegal or potentially harmful content, which, in turn, leads to significantly lower levels of exposure to hate speech, self-harm and violent extremist material, to name a few examples – as compared with other social media platforms.
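
One rough way to picture that two-stage approach is the sketch below; the viewer threshold and function names are assumptions for illustration only, not Snap’s actual moderation pipeline.

```python
# Illustrative two-stage moderation gate; threshold and names are assumptions.
SMALL_AUDIENCE_LIMIT = 25   # hypothetical "couple dozen" viewer threshold

def may_serve(current_views: int, passed_automated: bool, passed_human: bool) -> bool:
    """Content must pass automated review to be shown at all, and human review
    before it can reach more than a small audience."""
    if not passed_automated:
        return False
    if current_views >= SMALL_AUDIENCE_LIMIT and not passed_human:
        return False
    return True

print(may_serve(current_views=5, passed_automated=True, passed_human=False))    # True
print(may_serve(current_views=500, passed_automated=True, passed_human=False))  # False
print(may_serve(current_views=500, passed_automated=True, passed_human=True))   # True
```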

Exposure to Drugs

Snapchat is one of many online platforms that drug dealers are abusing globally and, if you’ve seen any media coverage of parents and family members who’ve lost children to a fentanyl-laced counterfeit pill, you can appreciate how heartbreaking and terrifying this situation can be. We certainly do, and our hearts go out to those who’ve lost loved ones to this frightening epidemic.

Over the past year, Snap has been aggressively and comprehensively tackling the fentanyl and drug-related content issue in three key ways:

  • Developing and deploying new technology to detect drug-related activity on Snapchat to, in turn, identify and remove drug dealers who abuse the platform;

  • Reinforcing and taking steps to bolster our support for law enforcement investigations, so authorities can quickly bring perpetrators to justice; and

  • Raising awareness of the dangers of fentanyl with Snapchatters via public service announcements and educational content directly in the app. (You can learn more about all of these efforts here.)

We’re determined to make Snapchat a hostile environment for drug-related activity and will continue to expand on this work in the coming months. In the meantime, it’s important for parents, caregivers and young people to understand the pervasive threat of potentially fatal fake drugs that has spread across online platforms, and to talk with family and friends about the dangers and how to stay safe.

Snap has much planned on the safety and privacy fronts in 2022, including launching new research and safety features, as well as creating new resources and programs to inform and empower our community to adopt safer and healthier digital practices. Here’s to the start of a productive New Year, chock-full of learning, engagement, safety and fun!   

- Jacqueline Beauchere, Global Head of Platform Safety

Expanding our Work to Combat the Fentanyl Epidemic


Late last year, the CDC announced that more than 100,000 people died from drug overdoses in the US over a 12-month period -- with fentanyl being a major driver of this spike. This staggering data hits home – we recognize the horrible human toll that the opioid epidemic is taking across the country, and the impact that fentanyl and adulterated drugs (often masked as counterfeit prescription drugs) are having on young people and their families in particular. We also know that drug dealers are constantly searching for ways to exploit messaging and social media apps, including trying to find new ways to abuse Snapchat and our community, to conduct their illegal and deadly commerce.

Our position on this has always been clear: we have absolutely zero tolerance for drug dealing on Snapchat. We are continuing to develop new measures to keep our community safe on Snapchat, and have made significant operational improvements over the past year toward our goal of eradicating drug dealers from our platform. Moreover, although Snapchat is just one of many communications platforms that drug dealers seek to abuse in order to distribute illicit substances, we still have a unique opportunity to use our voice, technology and resources to help address this scourge, which threatens the lives of our community members.

In October, we shared updates on the progress we have been making to crack down on drug-related activity and to promote broader public awareness about the threat of illicit drugs. We take a holistic approach that includes deploying tools that proactively detect drug-related content, working with law enforcement to support their investigations, and providing in-app information and support to Snapchatters who search for drug-related terms through a new education portal, Heads Up. 

Today, we’re expanding on this work, in several ways. First, we will be welcoming two new partners to our Heads Up portal to provide important in-app resources to Snapchatters: Community Anti-Drug Coalitions of America (CADCA), a nonprofit organization that is committed to creating safe, healthy and drug-free communities; and Truth Initiative, a nonprofit dedicated to achieving a culture where all young people reject smoking, vaping and nicotine. Through their proven-effective and nationally recognized truth public education campaign, Truth Initiative has provided content addressing the youth epidemics of vaping and opioids, which they’ve taken on in recent years. In the coming days we will also release the next episode of our special Good Luck America series focused on fentanyl, which is featured on our Discover content platform. 

Second, we’re sharing updates on the progress we’ve made in proactively detecting drug-related content and more aggressively shutting down dealers. Over the past year:

  • We have increased our proactive detection rates by 390% -- an increase of 50 percent since our last public update in October.

  • 88% of drug-related content we uncover is now proactively detected by our machine learning and artificial intelligence technology, with the remainder reported by our community. This is an increase of 33% since our previous update. When we find drug dealing activity, we promptly ban the account, use technology to block the offender from creating new accounts on Snapchat, and in some cases proactively refer the account to law enforcement for investigation.

  • We have grown our law enforcement operations team by 74%. While we’ve always cooperated with law enforcement investigations by preserving and disclosing data in response to valid requests, this increased capacity has helped us improve our response times to law enforcement inquiries by 85% over the past year, and we continue to improve these capabilities. You can learn more about our investments in our law enforcement work here.

Since this fall, we have also seen another important indicator of progress: a decline in community-reported content related to drug sales. In September, over 23% of drug-related reports from Snapchatters contained content specifically related to sales; as a result of our proactive detection work, we have driven that down to 16% as of this month – a relative decline of roughly 31% in the share of drug-related reports involving sales. We will keep working to get this number as low as possible.
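
With the rounded figures quoted above, that relative decline works out to about 30 percent; the published 31% presumably reflects unrounded values:

```python
# Share of drug-related reports from Snapchatters that involved sales content.
september_share = 0.23
current_share = 0.16
relative_decline = (september_share - current_share) / september_share
print(f"{relative_decline:.0%}")   # -> 30% with these rounded inputs (reported as 31%)
```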

Additionally, we continue to work with experts to regularly update the list of slang and drug-related terms we block from being visible in Snapchat. This is a constant, ongoing effort that not only prevents Snapchatters from getting Search results for those terms, but also proactively surfaces the expert educational resources in our Heads Up tool.
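
Conceptually, that block-and-educate behavior might look something like the sketch below; the term list, function names, and resource text are hypothetical placeholders, not Snap’s actual search system.

```python
# Illustrative sketch of blocked drug-related search terms surfacing Heads Up resources.
BLOCKED_DRUG_TERMS = {"fentanyl", "percs"}   # hypothetical; the real list is expert-curated

def search_index(query: str):
    """Placeholder for ordinary search results."""
    return [f"result for {query}"]

def handle_search(query: str):
    """Return (results, resources): blocked terms get no results, only Heads Up content."""
    if query.strip().lower() in BLOCKED_DRUG_TERMS:
        return [], ["Heads Up: resources from expert partners such as Song for Charlie and SAMHSA"]
    return search_index(query), []

print(handle_search("percs"))    # -> ([], ['Heads Up: ...'])
print(handle_search("pizza"))    # -> (['result for pizza'], [])
```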

Third, we’re continuing to make our underlying products safer for minors. As a platform built for close friends, we designed Snapchat to make it difficult for strangers to find and connect with minors. For example, Snapchatters cannot see each other’s friend lists, we don’t allow browsable public profiles for anyone under 18 and, by default, you cannot receive a message from someone who isn’t already your friend. While we know that drug dealers seek to connect with potential customers on platforms outside of Snapchat, we want to do everything we can to keep minors from being discovered on Snapchat by people who may be engaging in illegal or harmful behavior. 

We recently added a new safeguard to Quick Add, our friend suggestion feature, to further protect 13- to 17-year-olds. In order to be discoverable in Quick Add by someone else, users under 18 will need to have a certain number of friends in common with that person -- further ensuring it is a friend they know in real life.
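
As an illustration of that safeguard only – the actual threshold has not been published, and the names here are hypothetical – the rule could be sketched as:

```python
# Illustrative Quick Add discoverability rule for minors; threshold is a placeholder.
MIN_MUTUAL_FRIENDS = 3   # hypothetical -- the actual required number is not public

def discoverable_in_quick_add(candidate_age: int, mutual_friend_count: int) -> bool:
    """Under-18 accounts surface in someone's Quick Add only with enough friends in common."""
    if candidate_age < 18:
        return mutual_friend_count >= MIN_MUTUAL_FRIENDS
    return True

print(discoverable_in_quick_add(16, 1))   # False under the hypothetical threshold
print(discoverable_in_quick_add(16, 5))   # True
```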

In the coming months, we will be sharing more details about the new parental tools we are developing, with the goal of giving parents more insight into who their teens are talking to on Snapchat, while still respecting their privacy. 

And we will continue to build on this critical work, with additional partnerships and operational improvements underway.