Facebook's 'Facing Facts' documentary pulls back curtain on its fight against 'false news'

The social network is hard at work making the News Feed safer, and making sure we know about it

By I-Hsien Sherwood. Published on May 24, 2018


Facebook is finally pulling back the curtain on some of its efforts to combat fake news, or "false news," as it's apparently called in Menlo Park. The specifics are detailed in an 11-minute video filmed over several months at the company's Silicon Valley headquarters by documentarian Morgan Neville.

Eyes focused slightly off-camera, a series of Facebook employees explain what they're doing to prune the thicket of misinformation and clickbait choking the site. It's a tough problem that requires a delicate touch, they declare.

"I think an extreme that would be bad would be if a group of Facebook employees reviewed everything that people tried to post and determined if the content of that post was true or false, and based on that determination, decided whether or not it could be on the platform," says Tessa Lyons, product manager, News Feed integrity. "What I think would also be bad is if we took absolutely no responsibility whatsoever, and allowed hate speech and violence to be broadly distributed."

What's the right balance? Facebook isn't prepared to give an answer. Instead, it's getting philosophical, defining the most dangerous misinformation as both inaccurate and intentionally misleading, as opposed to worthless drivel that's sincere but wrong, or cherry-picked info that's technically correct but still meant to unfairly sway public opinion.

Facebook is tweaking the algorithms that identify fake accounts, hoaxes, spam, bots and the occasional foreign intelligence agency, implementing a program called Remove, Reduce, Inform. The most egregious, verifiably false information will be deleted. Dubious content, like clickbait stories that might still be interesting to some people, will appear less frequently in most news feeds. And Facebook has added a button, stylized as an italicized letter "i," that gives the provenance of a particular piece of content and information about its publisher. Posts from McSweeney's, for example, now offer a link to the humor site's Wikipedia page.

Okay, it's not bulletproof. But it's a good deal more explicit than the vague assurances we received a few weeks ago. Still, all this talk of fighting misinformation does nothing to improve the public's trust in media and the value of objective facts. The real enemy is not misinformation, which can be corrected, but disinformation, which erodes the institutions that undergird a functional democracy. Let's hope it's not already too late.

