Content moderators will have to review objectionable content on the platform, which has recently hosted live-streamed footage of murder, suicide, and rape.
Facebook directly employs content moderators but also outsources the task of screening posts in more than 50 languages across the globe to companies like Genpact.
Facebook already has 4,500 people around the world working on its community operations team, and the new hires are meant to improve the review process, which has come under fire both for inappropriately censoring content and for failing to remove extreme content quickly enough.
Facebook acknowledges that the work can often be difficult and says that every person in these roles is offered psychological support and wellness resources. The company also runs a dedicated support program for moderators, which is evaluated annually.
Facebook's move to hire more content moderators should help keep the social media platform free of fake and harmful content. It should also improve professional support for content creators on the platform.
Like Facebook, other internet giants such as Google are also actively hiring people for content-related roles to curb the spread of fake news online.