Snapchat, Privacy & Safety: The Basics
Welcome to the Safety & Impact Blog! I’m Jen Stout, VP of Public Policy, and I’m thrilled to follow Evan’s introduction with this post, which digs into our approach.
Because Snapchat is an inherently different kind of platform, we recognize that it can feel especially difficult to figure out if you have never used it yourself and don’t plan to (which is okay!). We hope this blog becomes a helpful resource for anyone who wants to better understand how young people experience our product, and for the many stakeholders and advocates who seek to better understand how we approach critical issues around safety and trust.
For us, nothing is more important than the safety of our Snapchat community, and we have always believed we have a responsibility to help our community learn how to protect their security, privacy, and wellbeing when using our products. To date, we have done this through both online resources and in-app education efforts, and at the start of this year we kicked off a more expansive digital literacy campaign to double down on these efforts. We have been hyper-focused on better supporting Snapchatters, and we want to do a better job of talking to their support systems: the parents, family members, mentors, and other people in their lives who care deeply about them.
To start, we’ll cover more specifics on the ways Snapchat is designed differently than traditional social media platforms. As Evan laid out, our purpose is to design products and build technology that nurtures and supports real friendships in a healthy, safe, and fun environment. This goal drives the decisions we make about how we structure and operate our platform, develop new products, think about the future, and ultimately define our role in the technology sector.
The Architecture of Snapchat
We use product development processes that consider the privacy, safety, and ethical implications of a new feature at the front end of the design process -- and we don’t launch products that don’t pass our intensive reviews.
We don’t have an open newsfeed, where anyone can broadcast unvetted content to the whole world. In fact, Snapchat looks different when you first try the app because it opens to the camera -- essentially a blank canvas -- not a feed to scroll through.
We don’t allow unvetted content to ‘go viral’ on Snapchat. Our content platform, Discover, only features content from vetted media publishers and content creators. Our entertainment platform, Spotlight, is proactively moderated using human review before content can reach a large audience.
On Snapchat, Group Chats are limited to 64 members. These Groups are not discoverable on our platform to anyone who isn’t already a member. Just as with 1:1 Chats, you can’t join a Group Chat if you are not already friends with someone in the Group.
Safety and Privacy
Snapchat doesn’t have public comments or browsable profile photos -- it’s one of the ways we intentionally make it more difficult for strangers to reach people they shouldn’t on the app. To help protect Snapchatters under 18 in particular, we don’t allow them to create Public Profiles, and when someone searches for new friends on Snapchat we only display the user’s Bitmoji avatar, not an actual photo.
We have always required two-way friend authentication by default before you can send a chat to another user, because we believe friendship is mutual. It’s not one person following the other, or random strangers entering our lives without permission or invitation. This is also true for other features on Snapchat like our Snap Map -- you can’t see anyone on the Map and they can’t see you unless you are friends with each other and you’ve expressly chosen to share your location.
We focus on making our features private-by-default, because just like in real life, we think individual users should choose what information they want to share and when. For example, location-sharing is off by default for all users when they first use our Snap Map. They can choose to share it with their friends -- but never with strangers.
We design products with the goal of collecting less data from our users and retaining that data for shorter periods of time. While advertising is necessary to allow us to provide our service for free, we have found that we can provide value to business partners while still respecting the privacy of people’s friendships and not making them feel like we are turning their relationships into a commodity.
We offer fast and easy ways for users to report concerning content directly to us, using in-app tools, so our dedicated Trust and Safety team can investigate and take action.
Our Guidelines and Enforcement
From day one, we designed Snapchat to avoid incentives for those looking to spread harmful content. Our Community Guidelines have long prohibited bullying, hate speech, and the spread of misinformation that could cause harm, including conspiracy theories, false medical claims, and efforts to undermine civic processes.
We don’t tolerate misuse of our platform, and we have a dedicated infrastructure for effectively designing and enforcing our Guidelines. One way we do this is simple: when content violates our policies, we remove it. We don’t label offending content or make exceptions for public figures -- our Guidelines apply equally to all Snapchatters.
If violating content involves the safety of minors, we report it to the National Center for Missing and Exploited Children (NCMEC), and we have specific processes for working with and supporting law enforcement agencies.
This doesn’t make us perfect, and any app that facilitates communication has the potential to be abused. We have to constantly improve our tools and tactics in order to maintain and improve the safety of Snapchat. We constantly think about the risks and how we can advance our tech capabilities and practices to better protect our community. As part of these efforts, we regularly seek guidance from security, intelligence, and safety experts about the ways we can stay a step ahead of bad actors. We plan to dive deeper into many of these topics on this blog, so you can understand how we think about these problems and work to solve them.
As a parent myself, I spend a lot of time having these conversations with my own children, with my friends grappling with the role of technology and platforms in their kids’ lives, and with the many stakeholders we talk to as part of our work at Snap. I hope this post gives insight into some of the basics, and encourages parents and family members to also check out our Parents Guide for a detailed explanation of each of our products, along with additional resources for helping Snapchatters connect in a healthy, safe, and fun environment.