AJ Isaacs 188370b1fd Fix LLM scoring usernames as toxic content
The display name "Calm your tits" was being factored into toxicity
scores. Updated the analysis prompt to explicitly instruct the LLM
to ignore all usernames/display names when scoring messages.
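A minimal sketch of what this fix might look like. The function and prompt names here are illustrative assumptions, not taken from the repository: the idea is that the display name is kept out of the text the LLM sees, and the prompt explicitly tells the model to ignore any names that slip through.

```python
from typing import Optional

# Hypothetical prompt template; the instruction to ignore names is the fix.
TOXICITY_PROMPT = (
    "Rate the toxicity of the following chat message from 0 (benign) "
    "to 1 (severely toxic). Ignore all usernames and display names; "
    "score only the message text itself.\n\n"
    "Message: {message}"
)

def build_toxicity_prompt(message: str, display_name: Optional[str] = None) -> str:
    """Build the analysis prompt sent to the LLM.

    The display name is accepted for API compatibility but deliberately
    never interpolated into the prompt, so a name like "Calm your tits"
    cannot be factored into the toxicity score.
    """
    return TOXICITY_PROMPT.format(message=message)
```

With this shape, `build_toxicity_prompt("great game!", display_name="Calm your tits")` produces a prompt that contains the message but never mentions the display name.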

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-25 15:51:14 -05:00