An easy-to-integrate web service that filters toxic speech in your community forums and channels.
Toxic speech and hatred are eroding the value of online community spaces. Studies have shown that prolonged exposure to hateful content and messages harms users' mental health.
We trained a multilingual toxicity-classification model to filter out toxic messages and conversations.
We built our product to scale: the model is trained and deployed on AWS behind a load balancer to handle high volumes of traffic.
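To give a sense of what integrating a service like this might look like, here is a minimal sketch of a client-side helper. The endpoint, payload fields, response shape, and threshold are all assumptions for illustration, not the service's actual API.

```python
# Hypothetical integration sketch -- field names, response shape,
# and the 0.8 threshold are assumptions, not the real API contract.

def build_request(message: str, language: str = "auto") -> dict:
    """Build the JSON payload a toxicity-filter endpoint might expect."""
    return {"text": message, "language": language}

def should_filter(response: dict, threshold: float = 0.8) -> bool:
    """Decide whether to hide a message, given an assumed
    response like {"toxicity": 0.93} with a score in [0, 1]."""
    return response.get("toxicity", 0.0) >= threshold

# Example: a moderator bot would POST build_request(...) to the service,
# then call should_filter(...) on the JSON it gets back.
payload = build_request("example message")
decision = should_filter({"toxicity": 0.93})
```

In practice the threshold would be tuned per community: a stricter forum might filter at 0.5, while a more permissive one only removes near-certain toxicity.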
Tackle cyberbullying in online forums and channels
Help moderators maintain a safe community space for conversations