In the midst of the heated U.S. presidential race last summer, with hypercharged scrutiny of partisan propaganda on social media, Facebook Inc. Chief Executive Officer Mark Zuckerberg received a letter from a group of U.S. senators led by Massachusetts Democrat Elizabeth Warren that had nothing to do with elections. They were angry about a year-old piece of climate news.
A Washington Examiner article shared on Facebook in 2019 had denounced climate models, which are widely used by scientists around the world to measure and predict the impacts of warmer temperatures. Science Feedback, an outside organization Facebook works with on fact-checking, had labeled the story false. A review by five scientists found the story “highly misleading” because of “false factual assertions” and accused the authors of “cherry-picking datasets.” The conclusion meant Facebook posts linking to the story would now be saddled with a label saying it had been disputed.
But Facebook then said that because the article was designated as an op-ed, it was exempt from fact checks under the company’s policies. The “false” label was removed. Warren’s letter called the op-ed policy a “massive loophole” that “represents another unfortunate example of Facebook’s refusal to fully combat the deliberate spread of misinformation.”
Climate change has emerged as a key priority in Facebook’s quest to stamp out misinformation, a complicated effort that involves policing user posts while simultaneously defending free speech. In the past few months the company has started fighting climate misinformation with some of the same strategies used to battle Covid‑19 myths and election falsehoods—a sign of the topic’s growing importance internally. But Facebook’s misinformation policies have also left climate activists frustrated.
Zuckerberg got another letter last month, this time from 13 environmental groups including the Union of Concerned Scientists and Greenpeace, asking the company to commit to monitoring climate disinformation and releasing reports, among other things. “Climate change disinformation is spreading rapidly across Facebook’s social media platform, threatening the ability of citizens and policymakers to fight the climate crisis,” the groups wrote.
Unlike elections, which play out on fixed, short timelines, climate change is a long-term problem with no definitive endpoint. That means Facebook doesn’t consider lies about the climate an “imminent” threat of real-world harm, which is the threshold the company uses to determine whether a post containing misinformation should be removed from the service entirely.
“It is an immediate threat, and the fact that they don’t see that makes me mad,” says Naomi Oreskes, a professor of the history of science at Harvard and the co-author of Merchants of Doubt, a book about climate science being deliberately obscured. “We have enormous evidence now that many storms, floods, hurricanes, cyclones have been made worse by climate change.”