You are a Trust & Safety Engineer at Meta. The moderation team needs a risk classification for users whose posts have been reported. By analyzing how many reports a user has, whether those reports involve severe violations (abuse or harassment), and how many resulted in action being taken, the team can prioritize investigations and set automated thresholds for content review queues.
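The scenario above can be sketched as a simple tiering function over three per-user signals: total reports, severe (abuse/harassment) reports, and reports that led to action. The `UserReportStats` type, the tier names, and every threshold below are illustrative assumptions for this sketch, not an actual Meta policy or the expected answer:

```python
from dataclasses import dataclass


@dataclass
class UserReportStats:
    report_count: int    # total reports against the user
    severe_count: int    # reports involving abuse or harassment
    actioned_count: int  # reports that resulted in enforcement action


def classify_risk(stats: UserReportStats) -> str:
    """Assign a coarse risk tier from a user's report history.

    Thresholds here (5 reports, 50% action rate) are placeholders
    a real system would tune against review-queue capacity.
    """
    # Fraction of reports that were upheld; 0 when there are no reports.
    action_rate = (
        stats.actioned_count / stats.report_count
        if stats.report_count
        else 0.0
    )

    if stats.severe_count > 0 and stats.actioned_count > 0:
        return "high"    # at least one confirmed severe violation
    if stats.report_count >= 5 and action_rate >= 0.5:
        return "medium"  # heavily reported and reports are often upheld
    if stats.report_count > 0:
        return "low"     # reported, but little or no confirmed harm
    return "none"
```

A rule-based tier like this is easy to audit and to turn into automated queue thresholds (e.g. route "high" users to human review first); a production system would likely layer it under or alongside a learned model.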