Facebook to target harmful ‘real’ accounts

Facebook is taking a more aggressive stance against harmful users, turning an anti-troll strategy originally designed for fake accounts on real ones.

The company told Reuters the new approach borrows the tactics its security teams usually reserve for wholesale shutdowns of networks engaged in influence operations that rely on fake accounts to manipulate public debate, such as Russian troll farms.

The move could have major implications for how the social media giant handles political and other coordinated movements that break its rules, at a time when Facebook's approach to abuse on its platforms is under heavy scrutiny from global lawmakers and citizen groups. The new approach will target real accounts that systematically break its rules through mass reporting, in which many users falsely report a target's content or account to get it shut down, or brigading, a form of online harassment in which users coordinate to target an individual with a flood of posts or comments.

The expansion means Facebook's security teams could take broader, more sweeping action than the company otherwise might when it removes individual posts or accounts.

High-profile instances of coordinated activity around last year's U.S. election, from teens and K-pop fans claiming they used TikTok to sabotage a rally for former President Donald Trump in Tulsa, Oklahoma, to political campaigns paying online meme-makers, have also sparked debate over how platforms should define and handle coordinated campaigns.