The company previously stated that it took action against 9.6 million pieces of content in the first quarter of 2020, a significant increase over the 5.7 million in the quarter prior. While some of these posts are obvious enough to trigger automated blocking or removal, the rest are placed in a queue for human moderators to evaluate. The process of determining whether content is harmful can lead to mental health issues, and earlier this year Facebook settled a case with about 11,000 of its moderators with a $52 million payout. It also promised to update its content moderation software, muting audio by default and displaying videos in black and white.
With Facebook continuing to be the forum through which many people around the world communicate with their friends and family, its ability to react to fake and hateful content is essential to keeping its platform safe.
#Facebook #content #moderators