Toxicity Detection
The Toxicity Detection plugin provides content moderation by analyzing potentially harmful or inappropriate content in API requests with OpenAI's moderation API and filtering requests that are flagged. It can detect several categories of toxic content, including sexual content, violence, hate speech, self-harm, and illicit content.
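The sketch below illustrates the kind of call the plugin relies on: text is sent to OpenAI's moderation endpoint and the response indicates whether the content is flagged and which categories triggered. The function name `check_toxicity` and the use of the `requests` library are illustrative assumptions, not the plugin's actual implementation; the endpoint, model name, and response shape follow OpenAI's public moderation API.

```python
# Minimal sketch of a moderation check, assuming the `requests` library
# and an OPENAI_API_KEY environment variable are available.
import os
import requests

def check_toxicity(text: str) -> dict:
    """Send text to OpenAI's moderation endpoint and summarize the result."""
    response = requests.post(
        "https://api.openai.com/v1/moderations",
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        json={"model": "omni-moderation-latest", "input": text},
        timeout=10,
    )
    response.raise_for_status()
    result = response.json()["results"][0]
    # `flagged` is True when any category (sexual, violence, hate,
    # self-harm, illicit, ...) exceeds OpenAI's moderation thresholds.
    return {
        "flagged": result["flagged"],
        "categories": {name: hit for name, hit in result["categories"].items() if hit},
    }

if __name__ == "__main__":
    print(check_toxicity("Example request body to screen"))
```

A gateway plugin built on this pattern would typically run the check before forwarding the request and reject or log any request whose content comes back flagged.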