Twenty-one months after the 2016 presidential election, which many believe was swayed by Russian propaganda farms, and with the midterm elections just around the corner, Facebook is finally forming a holistic approach to fighting political disinformation on its platform. It's an imperfect strategy, but it may be the best the company can do. Facebook held a conference call with journalists on Tuesday to explain the strategy and detail how it's being implemented.
Regarding the false news that was rampant on the platform in 2016, Facebook says that it’s now partnering with fact checkers to find bogus content, dial down the visibility of hoaxes in news feeds, and provide more contextual information about questionable news stories.
What it won't do is completely remove false news content, even when it's been debunked by fact checkers. And for that it's let itself in for a boatload of criticism. As long as a piece of disinformation doesn't incite violence or violate other existing community guidelines, the company explains, it will stay on the platform for all to see.
"If you are who you say you are, we don't believe we should stop you from posting content," said Facebook product manager Tessa Lyons during the conference call. Put simply, Facebook doesn't want to be in the position of deciding whether a given piece of content is mostly true or mostly false. "Not everybody agrees where the line is," Lyons said. "But just because something is allowed on Facebook doesn't mean it should get distribution," she added.
Lyons explained that each piece of content Facebookers see in their news…