
Why Can't a Social Media Threat Alert Service Work With Snapchat?

Written by Social Sentinel

Since its launch in 2011, private-network social media provider Snapchat has risen to prominence as the preferred social media platform for teenagers. The platform's design makes it equally appealing to users and potentially frustrating for those trying to stay on top of possible threatening or harmful content to keep their students and community safe. For schools implementing a social media threat alert strategy, those inaccessible areas can become a serious blind spot.

For schools incorporating a social media threat alert system as part of their overall safety and security strategy, private or closed network applications pose a particular challenge.

Snap stats

Though it lacks the global adoption of social media titans like Facebook and Twitter, Snapchat has carved a niche for itself that attracts enthusiastic users and brands alike. Its most devoted users are kids between the ages of 12 and 17. With greater control over who receives their messages, and the ephemeral nature of those messages, it's easy to see why kids gravitate toward the platform.

Privacy is an equally attractive quality. To date, Snapchat's closed network design forms an impenetrable barrier for any organization's choice of social media threat alert technology, creating a blind spot that hides insights which might otherwise allow safety and security teams to take proactive steps against threats.


Before technologies like social media threat alert services can access posts, Snapchat must first create a bridge to its feed. This is typically done through an application programming interface (API) connection. APIs allow different applications to talk to one another in specific ways to send and retrieve data. Snapchat hasn't yet made an API available for any use case. Without that connection, no one has eyes on the conversations taking place.
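To make the idea concrete, here is a minimal sketch of how a threat alert service typically consumes a platform's API: it retrieves posts as structured data (usually JSON) and scans them against a watch list. Because Snapchat exposes no such API, the payload below is a stubbed, invented example; the field names and watch terms are illustrative assumptions, not any real platform's schema.

```python
import json

# Hypothetical: parse a JSON payload in the shape a public-feed API
# might return. The "posts"/"text" fields are invented for illustration.
def parse_posts(api_response: str) -> list:
    return json.loads(api_response)["posts"]

# Flag any post whose text contains a watch term (case-insensitive).
# Real services use far more sophisticated language analysis.
def flag_posts(posts: list, watch_terms: set) -> list:
    flagged = []
    for post in posts:
        text = post["text"].lower()
        if any(term in text for term in watch_terms):
            flagged.append(post)
    return flagged

# Stubbed response standing in for a live API call, since no real
# endpoint exists for this example.
sample = json.dumps({
    "posts": [
        {"id": 1, "text": "See everyone at the game tonight"},
        {"id": 2, "text": "I'm going to hurt someone tomorrow"},
    ]
})

alerts = flag_posts(parse_posts(sample), {"hurt", "weapon"})
```

Without the bridge that `parse_posts` stands in for, the rest of the pipeline has nothing to scan, which is exactly the gap a closed network creates.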


Fostering open communication around a closed network


What, then, should be done if private/closed platforms like Snapchat don't have publicly viewable content to review? You likely already promote a "see something, say something" culture in which kids and parents feel encouraged to report possible threats they receive in their private conversations. Promoting open communication and transparency at home and in school can also go a long way toward supporting that culture, especially if reporting is viewed as a positive way to help protect fellow students, or themselves, from harm.

In the digital space, "see something, say something" isn't much different. Similar to a student reporting details of a potentially threatening conversation to a counselor, some may opt to post screenshots of private chats to their public feeds, feeling they've done their duty to report trouble.

No solution on the market today can scan or provide insights into potentially harmful intent shared over private social media platforms. However, changes to the culture could encourage school kids to help surface threats before they have a chance to materialize as tragedies.
