QA Specialist, Minor Safety & Exploitative Content (MSEC)
Discord - Full Time
- Experienced (2+ Years)
Candidates should have 2+ years of experience in quality assurance, trust & safety, or content moderation, preferably within a tech or online platform environment, along with a deep understanding of issues related to minor safety, exploitative content, and global online safety trends. The role also requires strong analytical skills, excellent written and verbal communication, familiarity with moderation tools, audit processes, and metrics-driven performance tracking, and a calm, resilient demeanor when handling sensitive content.
As a QA Specialist for Minor Safety and Exploitative Content (MSEC) at Discord, you will review and audit moderation decisions to ensure they adhere to policy, collaborate with moderators and cross-functional teams to identify trends and gaps, provide constructive feedback to improve decision-making, lead calibration sessions, report on quality trends, and partner with policy and tooling teams to influence policy updates and improve internal tools. The role also involves supporting broader QA initiatives, including work with automation, machine learning systems, and LLM/AI technologies.
Voice, video, and text communication platform
Discord is a communication platform that allows users to connect through voice, video, and text. It serves as a space for friends and communities to gather and share their interests, catering to a wide range of groups such as artists, activists, study groups, and hobbyists. Users can join various communities, known as servers, which host discussions and activities related to their interests. Unlike many other platforms, Discord does not rely on advertising or selling user data; instead, it offers a premium subscription service called Nitro, which provides additional features like enhanced streaming quality and customization options. The goal of Discord is to create a welcoming environment where people can build connections and foster a sense of belonging.