This is a remote position.
Key Responsibilities
Content Review & Moderation
- Review user-generated content including text posts, images, videos, and comments.
- Identify and remove content that violates community guidelines (e.g., harassment, hate speech, misinformation, graphic content, spam).
- Enforce platform policies consistently and fairly.
- Escalate complex or high-risk cases to senior moderation or policy teams.
Policy Enforcement & Decision-Making
- Interpret and apply content policies to real-world scenarios.
- Document moderation decisions and maintain review accuracy.
- Provide feedback to improve moderation processes and policy clarity.
Community Safety & Risk Monitoring
- Identify emerging harmful trends or coordinated abuse patterns.
- Help detect fraudulent accounts, bots, or suspicious behavior.
- Support user reports and appeals processes.
Quality & Performance
- Meet accuracy, speed, and quality benchmarks.
- Participate in regular calibration sessions and training updates.
- Maintain confidentiality when handling sensitive data.
Requirements
- Experience in content moderation, Trust & Safety, customer support, or community management.
- Familiarity with online safety standards and platform policies.
- Experience working in fast-paced or high-volume environments.
- Strong written and verbal English communication skills; multilingual abilities are a strong asset.
- Ability to clearly understand and assess nuanced content.
- Ability to make objective decisions using policy guidelines.
- High attention to detail and consistency.
- Ability to handle exposure to sensitive or disturbing content.
- Strong time management and organizational skills.
