Islamic State terrorists are using TikTok to recruit young women

On TikTok, high schoolers flaunt their VSCO style, sing along to Billie Eilish, and turn themselves into cowboys. Cat owners and dog lovers upload clips of their furry friends dancing to “Mr. Sandman.” On the “trending” page, you’ll find hashtagged challenges, lip-synched duets, and elaborate pranks.

But scroll past the light-hearted videos and there’s a small chance you could stumble across something much darker—a scary side of the app moderators work incessantly to quash. As The Wall Street Journal first reported, at least two dozen Islamic State militant accounts were recently banned for posting propaganda videos featuring “corpses paraded through streets, Islamic State fighters with guns, and women who call themselves jihadist and proud.” According to the outlet, many were set to Islamic State nasheeds. Some were filtered with pink hearts, sparkling glitter, flowers and emojis to appeal to young women.

ELLE.com has been unable to verify the existence of these videos (they were taken down after the Journal alerted TikTok), but a spokesperson for TikTok told me they had “very limited views” and that the company “aggressively moderates, targets, and shuts down [terrorist-related content].”

More will inevitably crop up, said Andrew Mines, a research fellow at George Washington University’s Program on Extremism, and the app’s response is a good indicator of how seriously it plans to take the problem, how willing it is to respond, and how hospitable it will be to ISIS supporters and official ISIS propaganda in the future.

Since the Islamic State began using social media to recruit members several years ago, behemoths like Facebook and Twitter have struggled with how to address hate-fueled violence and how to define terror-related content. Unlike its predecessors, the much younger TikTok, launched in 2017 by Chinese company ByteDance and worth an estimated $75 billion, seems to be taking a more immediate and active approach to moderation: it reportedly employs thousands of content moderators in the U.S. and China and has retained two former lawmakers to review its content-moderation policies.

Because its audience is composed mostly of Millennials and Gen-Zers looking for funny videos (30 percent of users are reportedly under the age of 18), TikTok gives ISIS an especially insidious way to appeal to potential members. The group also appears to be specifically targeting young girls through the app.

“The flowers and heart filters and sparkles all point to gender aspects of the conflict and the roles that women eventually grow up to play in the group,” Mines said. “This appeal to a younger audience, and a younger female audience, has been really effective for the group in prepping women for the roles they play in ISIS, which is really vital for the group’s strength and operation.”

While it’s often difficult to differentiate between what’s produced by the ISIS core media center and more grassroots recruitment efforts, Mines said, at some point it doesn’t really matter. Either way, TikTok explicitly prohibits criminal organizations from using the platform. The app, which has been downloaded over 950 million times, has a strict policy on terrorist-related content in its community guidelines: “Terrorist organizations and any other criminal organizations are strictly prohibited from using TikTok. DO NOT use TikTok to promote and support these organizations or individuals.”

The TikTok rep I spoke with explained that the industry-wide challenge they face is “complicated by bad actors who actively seek to circumvent protective measures,” but that TikTok has “a team dedicated to aggressively protecting against malicious behavior.”

To enforce those rules, TikTok has reportedly taken a cue from Facebook, Twitter, and YouTube, which have all invested in moderators to trawl for offensive content. Like Facebook, TikTok also uses a system of “classifiers,” such as imagery or music, to flag accounts that potentially violate its terms of service.

If an account does, TikTok terminates it and bans the associated device to prevent its owner from creating another account on that phone. Last month, The Guardian obtained leaked documents instructing TikTok moderators to take down videos mentioning Tiananmen Square, Tibetan independence, and the banned religious group Falun Gong, after it was reported that discussion of the Hong Kong protests was being censored on the app for political reasons. The revelation has raised concerns that TikTok is exporting Chinese-style censorship policies around the world.

In the past year, social media companies have come under immense pressure to limit hate speech. In response, Facebook, the world’s largest social network, announced last month that it was expanding its definition of terrorist organizations. A recent Twitter transparency report revealed that 10.8 million accounts were reported between July and December 2018; Twitter took action against 612,563 of them for violating six rules categories: abuse, child sexual exploitation, hateful conduct, private information, sensitive media, and violent threats.

TikTok’s operational constraints, like the inability to share URLs or distribute instructional material through the app, prevent it from becoming the kind of recruiting tool Twitter and Facebook once were for ISIS, Mines said, but that doesn’t mean the group won’t try to use it to spread its ideology.

“[ISIS] is going to use whatever it has in its toolbox to recruit,” Mines said, “to push out their message, and certainly that’ll be on one of the most popular platforms in the world, which is now TikTok.”

Source: Elle