Facebook Moderators Are Suffering from PTSD, Questioning Reality, and Morphing Into Conspiracy Theorists After Reviewing Sinister User Content
ARIZONA – Some of the people responsible for moderating content on Facebook are suffering from post-traumatic stress disorder while others are morphing into rabid conspiracy theorists, The Verge reported Monday.
Many of the employees at Facebook contractor Cognizant are having meltdowns while attempting to moderate the vast troves of content people post on the social media platform. Combing through people’s content is turning Cognizant’s Arizona office into a dark and sinister place, the report noted, citing interviews with moderators.
Several moderators told a reporter that conspiracy theories took strong root at the office. The 2018 Parkland shooting, which killed 17 people in Florida, initially horrified staff, moderators said. One person The Verge called Chloe, for instance, claimed her colleagues eventually began expressing doubts about the initial story as more conspiracy content was posted to Facebook and Instagram.
“People really started to believe these posts they were supposed to be moderating,” she said. “They were saying, ‘Oh gosh, they weren’t really there. Look at this CNN video of David Hogg — he’s too old to be in school.’ People started Googling things instead of doing their jobs and looking into conspiracy theories about them.”
Chloe added: “We were like, ‘Guys, no, this is the crazy stuff we’re supposed to be moderating. What are you doing?’” Other employees reported similar experiences. Moderating posts gave another moderator, whom The Verge called Randy, a form of PTSD and left him questioning reality.
After seeing vast numbers of videos claiming the 9/11 terrorist attack was an inside job, he came to believe them. Randy said conspiracy videos about the Las Vegas massacre were similarly persuasive; he now believes multiple shooters were responsible for that attack.
“I’m fucked up, man,” Randy told The Verge, referring to his mental state after working at the Arizona center for roughly a year. “My mental health — it’s just so up and down. One day I can be really happy, and doing really good. The next day, I’m more or less of a zombie. It’s not that I’m depressed. I’m just stuck.”
One moderator disputes this characterization. The majority of the content is benign, Brad, who holds the title of policy manager, told a reporter. “Most of the stuff we see is mild, very mild. It’s people going on rants,” he said. “It’s people reporting photos or videos simply because they don’t want to see it — not because there’s any issue with the content. That’s really the majority of the stuff that we see.”
This is not the first time a Facebook moderator has complained about the work. One former employee named Selena Scola filed a lawsuit against the Silicon Valley giant in December 2018, alleging that her job gave her PTSD; her lawyers are seeking class-action status for the suit.
Facebook employs more than 7,500 content reviewers worldwide to ensure its content is appropriate, but the company is pursuing artificial intelligence options that could one day replace human workers, Fox News reported. Facebook acknowledged that there are times when employees will be dissatisfied.
“We are taking steps to be extremely transparent about the expectations we have of our partners,” a company spokeswoman told The Daily Caller News Foundation.