Social media giant Facebook put the safety of its content moderators at risk after a security lapse exposed their personal profiles to suspected terrorist users on the platform.
The flaw affected more than 1,000 workers across 22 departments at Facebook. These employees used the company's moderation software to review and remove inappropriate content, including sexual material, hate speech, and terrorist propaganda, from the platform.
The bug caused moderators' profiles to appear in the activity logs of the groups they were removing. The issue came to light when moderators began receiving friend requests from terror suspects on the platform. The lapse went unfixed for a month before Facebook resolved it. The company notified all employees believed to be affected and offered counselling through its employee assistance program.