Facebook Expands Climate Science Center to More Regions, Ramps Up Climate Misinformation Detection


As the scientific evidence for human-caused climate change continues to strengthen, so too do efforts to push back against this growing body of knowledge through misinformation campaigns. These campaigns typically target policymakers and other decision-makers who are in a position to act on climate change, making them an audience of particular interest for Facebook. Last year, Facebook launched its Science Program, which invests in collaborations with leading scientists around the world to help answer some of the most pressing questions facing science today.

Part of that program is an effort to better understand how people talk about climate change by examining how the general public uses the term “climate change” as well as related terms such as “global warming.” With support from scientists working within its Science Program, Facebook is now extending this work to the global scientific community and plans to double the size of the Climate Change Communications team at its headquarters in Menlo Park, California.

According to a recent scientific paper analyzing how U.S. climate science has been discussed on social media, five distinct “tribes” are fighting over what we should do about human-caused climate change: Alarmed Early Adapters, Concerned Conservative Christians, Libertarians/Market Skeptics, New Age Environmentalists, and Traditional Environmentalists. These groups often use different terms for the same phenomena, or sometimes nearly opposite ones. Understanding these differences can provide insight into why misinformation campaigns are so effective. Using natural language processing techniques that have been applied successfully in other areas of science communication, Facebook aims to understand how these tribes use climate-related terms on social media and to identify misinformation targeting them.
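To make this concrete, here is a minimal sketch of how distinctive term usage could be measured across such groups. This is an illustration, not Facebook’s actual pipeline; the group labels, sample posts, and the log-odds scoring below are all assumptions chosen for the example.

```python
# A minimal sketch of group-level term-usage analysis, NOT Facebook's
# actual pipeline. The group labels and posts are hypothetical stand-ins
# for labeled social media data.
from collections import Counter
import math

posts = [
    ("libertarian_market_skeptic", "climate change models are unreliable and costly"),
    ("new_age_environmentalist", "the earth is healing and global warming hurts us all"),
    ("traditional_environmentalist", "global warming demands stronger emissions policy"),
]

# Count how often each term appears within each group.
group_counts = {}
for group, text in posts:
    group_counts.setdefault(group, Counter()).update(text.lower().split())

def distinctive_terms(group, top_n=3):
    """Log-odds of a term in one group versus all others, with add-one
    smoothing, to surface vocabulary that distinguishes the groups."""
    in_group = group_counts[group]
    out_group = Counter()
    for g, counts in group_counts.items():
        if g != group:
            out_group.update(counts)
    in_total, out_total = sum(in_group.values()), sum(out_group.values())
    scores = {
        term: math.log((in_group[term] + 1) / (in_total + 1))
              - math.log((out_group[term] + 1) / (out_total + 1))
        for term in in_group
    }
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

for g in group_counts:
    print(g, "->", distinctive_terms(g))
```

On real data, the same scoring would reveal, for instance, whether one tribe prefers “global warming” while another favors “climate change,” which is the kind of signal the research described above depends on.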

The success of one tribe’s messaging can affect the entire debate. For example, a small group of Libertarians/Market Skeptics accounts for about 25% of all posts using “climate change.” The New Age Environmentalist tribe makes up only 10% of the population but has an outsized impact: if it gets its message out successfully, it can sway opinions across many other groups. One way this happens is through polarizing language that triggers intuitive thinking and bypasses rational discourse. As a result, misinformation spreads most effectively among people who are least familiar with the topic. It also means that effective educational messages stand to benefit those same people.

The work at Facebook is part of a broader $20 million investment by the company in efforts to combat misinformation. According to its 2018 Science Program Annual Report, Facebook plans to “serve as an open platform for scientists, academics, and government agencies” studying climate change and to facilitate dialogue between these groups using social media. It is clear that Facebook sees this effort as critical, given all of the questions it has received about how it can help stem the flow of misinformation on its site. With more than a billion active users, Facebook reaches far beyond your friends and family members to include public figures, world leaders, businesses, advertisers, and politicians. As previously discussed on this blog, Facebook’s ability to influence political behavior is a major concern ahead of the 2020 election.

Recently, Facebook CEO Mark Zuckerberg confirmed that his company has identified foreign governments and other organizations attempting to interfere in this year’s elections by promoting misinformation on its platform. “When they do this,” he wrote in a post, “they emulate some of the same behaviors as people who spread nasty rumors about a neighbor carrying diseases…While we have found and taken down thousands of [inauthentic accounts] working in France, Austria, Germany, and Mexico [and are investigating others], for example, they are keeping their playbook open.” Beyond these familiar concerns about foreign interference during the 2020 election cycle, new questions are emerging about how social media may feed polarization and exacerbate the spread of misinformation.

For example, Facebook has said that it plans to improve the “Quality Filter” feature for its News Feed this year. With 1.4 billion daily active users, Facebook’s actions have significant consequences for the public conversation about climate change. As the company notes in its 2018 Science Program Annual Report: “We are interested not just in helping people become aware of existing research but also in supporting additional research aimed at understanding what kinds of news stories cause confusion.” The growing number of Facebook users, along with problems stemming from misuse by bad actors, raises major questions about how all online platforms should approach this complicated problem.

In a recent presentation on these issues at George Washington University, Facebook announced that it is offering funding to support university research on the spread of misinformation. The company is now accepting proposals for projects studying “misinformation and false news online” through its social science grants program.

This latest move follows increased support from Facebook for researchers in this area. Earlier this year, the company announced a new machine learning effort to better identify potentially false stories on its site. It is also funding efforts to improve digital literacy among users. Facebook’s 2018 Science Program Annual Report notes that user education efforts will continue this year, drawing on lessons learned about how best to reach different audiences.
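Facebook has not published the details of its models, but the general shape of such a system is well known from the research literature. Below is a hedged illustration of flagging potentially false stories with a simple text classifier; the training headlines and labels are invented examples, and a production system would be far larger and combine many more signals.

```python
# Toy sketch of a "potentially false story" classifier, assuming a small
# set of fact-checker-labeled headlines. This is an illustration of the
# general technique, not Facebook's actual model.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled headlines: 1 = flagged as likely false.
headlines = [
    "scientists confirm record global temperatures last decade",
    "secret memo proves climate data was fabricated worldwide",
    "new study links emissions to sea level rise",
    "miracle device ends global warming overnight, media silent",
]
labels = [0, 1, 0, 1]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(headlines, labels)

# Score an unseen headline; a high probability would typically route the
# story to human fact-checkers rather than trigger automatic removal.
prob_false = model.predict_proba(["leaked files show warming is a hoax"])[0][1]
print(f"probability of being flagged: {prob_false:.2f}")
```

The key design choice in systems like this is that the classifier prioritizes content for human review instead of making final decisions, since false positives against legitimate speech are costly.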

The announcement of the new grants at George Washington University also covered Facebook’s efforts to improve the detection of misinformation from those who may be sharing it for financial gain. In a July blog post, Monika Bickert, Facebook’s Head of Global Policy Management, wrote that the company is focused on “improving machine learning technology designed to detect fake accounts, just as we’ve done with spam and malware.”
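In the spirit of that spam/malware comparison, fake-account detection is often framed as classification over behavioral features of an account. Here is a rough sketch under that assumption; the features, numbers, and thresholds below are hypothetical, not Facebook’s actual signals.

```python
# Hypothetical feature-based fake-account scoring. The feature set and
# sample values are assumptions for illustration only.
from sklearn.ensemble import RandomForestClassifier

# Per-account features:
# [account_age_days, posts_per_day, friend_requests_per_day, profile_completeness]
accounts = [
    [1200, 2.1, 0.1, 0.9],    # long-lived, moderate activity -> genuine
    [3, 80.0, 45.0, 0.2],     # brand new, posting in bursts  -> fake
    [800, 1.0, 0.3, 0.8],     # genuine
    [7, 60.0, 30.0, 0.1],     # fake
]
is_fake = [0, 1, 0, 1]

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(accounts, is_fake)

# Score a new account; in practice such a score would feed a review queue.
print(clf.predict_proba([[5, 70.0, 40.0, 0.15]])[0][1])
```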

Facebook is not the only online platform taking action against misinformation. Twitter recently announced that it had taken down a network of more than 600 accounts engaging in coordinated manipulation, following its earlier efforts to crack down on bots and other automated accounts.
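Twitter has not said exactly how it found the network, but one common approach in the research literature is to look for accounts whose sharing behavior overlaps almost completely. The sketch below illustrates that idea with made-up account names and URLs; the similarity threshold is likewise an assumption.

```python
# Illustrative coordination detection via overlap in shared URLs.
# This reflects a common research technique, not Twitter's method.
from itertools import combinations

shared_urls = {
    "acct_a": {"example.com/1", "example.com/2", "example.com/3"},
    "acct_b": {"example.com/1", "example.com/2", "example.com/3"},
    "acct_c": {"example.com/9"},
}

def jaccard(s1, s2):
    """Fraction of URLs two accounts have in common."""
    return len(s1 & s2) / len(s1 | s2)

# Flag pairs whose sharing behavior overlaps almost completely.
for a, b in combinations(shared_urls, 2):
    score = jaccard(shared_urls[a], shared_urls[b])
    if score > 0.8:
        print(f"possible coordination: {a} and {b} (overlap {score:.2f})")
```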

For supporters of climate action, Facebook’s announcements should be viewed as positive steps. Our analysis suggests that Facebook has likely played a role in the spread of climate disinformation, but at the same time it is working to release more platform data for research purposes.
