TikTok has announced new measures to combat hate speech and violent extremism in its app. The platform has established a partnership with Violence Prevention Network, an organization dedicated to stopping violent extremism, and has also joined the Global Internet Forum to Counter Terrorism (GIFCT) to implement enhanced approaches to removing extremist content.
First off, TikTok says that it's working with Violence Prevention Network on a new series of resources aimed at building community resilience against extremist content, improving audience understanding, and raising awareness of how these groups operate.
As per TikTok:
“In the fight against online hate and violent extremism, we can use technology to help us reach and educate communities where they are. We already offer our global community access to resources that can help them develop media literacy skills and encourage them to view and create online content responsibly and critically.”
This new initiative will expand that educational push to better address violent extremism and hate speech across the app.
TikTok says that these new resources will initially be made available to users in Germany.
“[German users] will be able to find these resources in-app when they search for words related to violent extremism. We will evaluate the impact of this approach as we consider bringing it to other parts of the world.”
In addition to this, TikTok’s also joined the Global Internet Forum to Counter Terrorism (GIFCT), and will work with the group to develop improved approaches to tackling extremist content.
In combination, these partnerships will help to expand TikTok’s capacity to combat extremist content, and limit user exposure to harmful movements in the app.
TikTok also notes that its automated detection systems are getting much better at finding violative content of this kind, and removing it before anybody actually sees it.
In the first half of this year, TikTok says that:
- It removed over 6.5 million videos for violating its rules against violent and hateful organizations, with 98.9% of them being taken down before being reported by users. 94% of these videos were removed within 24 hours.
- It’s taken down 17 extremist networks, made up of more than 920 accounts dedicated to spreading hate.
The reach of social networks makes them an ideal vector for recruiting users to harmful causes, so it's good to see TikTok taking more steps to stop these groups from using its platform to spread their rhetoric.
And TikTok has been identified as a host of such content in the past. Previous research has found videos promoting hatred or extremism gaining reach in the app, while extremist groups have also been found using "cloaking" terms to avoid detection while spreading their messaging.
TikTok's systems are getting better at detecting and removing this content, but it remains an important area of focus.