Big Tech Targets ‘Climate Change Denial’ in New Policy

Google and Facebook recently introduced new initiatives related to climate science. The moves fueled further debate about the role – and responsibilities – of social media in public discourse.

Critics say that discouraging alternative viewpoints online can have negative consequences, largely because of the silo effects and bias reinforcement that already take place on social media.

Google announced it would begin enforcing a new policy related to climate-change content in November. In a statement attributed to the Google Ads Team, the company reported:

“We’re announcing a new monetization policy for Google advertisers, publishers and YouTube creators that will prohibit ads for, and monetization of, content that contradicts well-established scientific consensus around the existence and causes of climate change.

“This includes content referring to climate change as a hoax or a scam, claims denying that long-term trends show the global climate is warming, and claims denying that greenhouse gas emissions or human activity contribute to climate change.”

Facebook announced it would further expand its effort to label climate-related content and direct users toward an information center with posts on climate change.

“We’re expanding our Climate Science Center to more than 100 countries to connect more people with factual resources from leading climate organizations. We’ll continue adding informational labels to posts about climate change and directing people to the Climate Science Center,” it stated.

Nick Clegg, Facebook vice president of global affairs and communications, said, “We have a responsibility to tackle climate misinformation on our services, which is why we partner with more than 80 independent fact-checking organizations globally to review and rate content, including content about climate change.”

“When they rate content as false, we reduce its distribution so fewer people see it and we show a warning label with more context. And we apply penalties to people who repeatedly share false information,” he said.

The Trap of Echo Chambers

Two major concerns of social media observers are a phenomenon called “siloing” and online services’ tendency to reinforce confirmation bias.

Even when an online service starts with a highly diverse audience, “what ends up happening, and it happens very quickly, is that we create these silos of information. That means that we only show you information you agree with,” said David Schweidel, Rebecca Cheney McGreevy endowed chair and professor of marketing at Emory University in Atlanta, Ga.

Confirmation bias is the widely recognized human inclination to seek out information that confirms one’s own conclusions and beliefs. In social media, that bias is supported when users search for, access, “like” and share content that reinforces their personal convictions, while avoiding conflicting content.

Schweidel explained that online services like Google have discovered users are more likely to click on an ad or take action when they access information they agree with, in part because they access that type of information more often due to confirmation bias. Social media advertising algorithms end up directing users to viewpoints they favor in order to maximize revenue.
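The mechanism Schweidel describes can be illustrated with a toy sketch – this is not any platform's actual code, and the candidate items, click rates and bids are invented for illustration. An ad ranker that orders candidates by expected revenue (predicted click probability times the advertiser's bid) will keep surfacing whatever a user is predicted to click on, which confirmation bias makes the agreeable content:

```python
# Toy illustration of revenue-maximizing ad ranking (hypothetical data).
# Expected revenue = predicted click-through rate (CTR) x advertiser bid.
# Because users click more on content they agree with, the agreeable
# candidate wins even against a higher-bidding challenger.

def expected_revenue(candidate):
    """Expected revenue for showing this candidate once."""
    return candidate["predicted_ctr"] * candidate["bid"]

def rank_candidates(candidates):
    """Order candidates by expected revenue, highest first."""
    return sorted(candidates, key=expected_revenue, reverse=True)

candidates = [
    {"id": "agrees_with_user", "predicted_ctr": 0.08, "bid": 1.00},
    {"id": "challenges_user",  "predicted_ctr": 0.02, "bid": 1.50},
]

ranked = rank_candidates(candidates)
print([c["id"] for c in ranked])  # agreeable content ranks first
```

Nothing in the objective penalizes one-sidedness: as long as agreement predicts clicks, maximizing revenue and narrowing what a user sees are the same optimization.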

“The (ad) algorithm takes into account what cookies have been loaded into your browser, and it takes into account your search history,” said Shiv Ganesh, professor of communication studies in the Moody College of Communication at The University of Texas at Austin.

Social media and search services also can tap into user emails, purchase histories and other accessible online information, he noted.

“For me, as a researcher and educator over the years, I realized there is no neutral way to search for information over the Web,” Ganesh said.

“That is going to continue to affect what people see when they conduct searches. When you run searches, you do not get a neutral, non-curated list of results,” he observed.

Ganesh said, “One of the things I say to my students is, ‘You don’t search Google. Google searches you,’” an insight he attributed to Harvard researcher and author Shoshana Zuboff.

Ads and Algorithms

Google stated that its new climate-science advertising and monetization policy is tied to advertiser feedback.

“In recent years, we’ve heard directly from a growing number of our advertising and publisher partners who have expressed concerns about ads that run alongside or promote inaccurate claims about climate change. Advertisers simply don’t want their ads to appear next to this content. And publishers and creators don’t want ads promoting these claims to appear on their pages or videos,” the company reported.

In a hypothetical example, Schweidel said, Procter & Gamble might not “want their ads showing up next to content with climate denial. They don’t want people to make this spurious assumption that P&G doesn’t believe in climate change.”

“At the heart of all this are the algorithms. Algorithms aren’t smart. They do what people tell them to do,” he noted.

Adding a directive not to spread disinformation creates another algorithm parameter, with the caveat that “someone needs to define what ‘disinformation’ is, and that requires that someone make a judgment call,” Schweidel said.
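Schweidel's point – that the rule is just another parameter, while the judgment call lives elsewhere – can be sketched in a few lines. This is a hypothetical illustration, not Google's implementation; the `flagged_as_disinfo` label stands in for whatever human or fact-checker judgment feeds the system:

```python
# Hedged sketch: a "do not monetize disinformation" policy as an extra
# algorithm parameter. The code applies a label it is given; the hard
# part -- deciding what counts as disinformation -- happens upstream.

def eligible_for_ads(item, policy_enabled=True):
    """Return False for items flagged as disinformation while the
    policy is switched on; the function itself makes no judgment."""
    if policy_enabled and item.get("flagged_as_disinfo", False):
        return False
    return True

items = [
    {"id": "a", "flagged_as_disinfo": False},
    {"id": "b", "flagged_as_disinfo": True},
]
monetizable = [i["id"] for i in items if eligible_for_ads(i)]
print(monetizable)  # -> ['a']
```

The filter is trivial; the contested part is entirely in who sets the flag and by what definition.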

Google is “drawing a line in the sand that says, ‘We don’t care if you want to reach people who don’t believe in climate change. We think this is bad,’” he observed.

Ganesh noted that it’s somewhat ironic for Google to talk about not monetizing contrarian climate-science content.

“The fact is, they’re the ones doing the monetizing,” he said. “Monetization is such a deep part of Google’s structure that you can’t avoid it.”

He recalled that during the dot-com bust of the early 2000s, when many online services struggled to show a profit, “Google doubled down on the need to be financially viable.” It responded by introducing AdWords – now Google Ads – and later Gmail, he said.

In 2020, Google’s parent company Alphabet generated about $182.5 billion in revenue. Of that, around $147 billion – more than 80 percent – came from Alphabet’s ads business, according to the company’s annual 10-K report. Approximately $104.1 billion came from Google ads, $19.8 billion from YouTube and $23.1 billion from other Google members’ properties.

By comparison, ExxonMobil’s sales and operating revenue last year amounted to just under $178.6 billion. It was a down year for revenue at Exxon and other oil companies, while Alphabet’s total revenue has climbed steadily, by more than $20 billion annually in recent years.

Beyond the pressure to increase revenue, Google, Facebook and other social media/online services have faced intense examination and questioning from governments in several countries. Ganesh believes the scrutiny influenced Google’s recent moves.

“I think what they’re doing is responding to the political heat they’re getting from Washington, D.C. It sounds in this case like Google is responding to that pressure,” he said.

In its new climate-content policy, “what Google is addressing is related to what Facebook is dealing with right now around vaccine disinformation,” Schweidel said.

Google is not a public forum like Facebook, but online services are similar in the way they approach making money from advertising, he noted.

“At the core, it’s the same kind of algorithm in that it puts in front of you an ad that you are likely to click on,” Schweidel said, adding “it’s not that Google is shooting themselves in the foot, necessarily. They’re just going to serve up a different ad” to match with climate content.

Google pledged to “look carefully at the context in which (climate) claims are made, differentiating between content that states a false claim as fact, versus content that reports on or discusses that claim.”

“That seems to read like a relatively weak statement to me,” Schweidel said, “but I don’t want to underestimate the extent of the problem.”

He called it a difficult challenge – with so much material to assess, Google could end up “playing a game of whack-a-mole” as it tries to police content, he said.

“How much of this is really to appease advertisers compared to, is this a PR strategy aimed at making them look like the good guys?” Schweidel asked.

“I suspect it’s some of A and some of B,” he said.

Ganesh predicted that disincentivizing alternative online views, combined with siloing, will have negative long-term effects.

“I think it corrupts the marketplace of ideas, and without competing ideas you can’t have a functional democracy,” he said.

Without unbiased openness, “You have a political sphere that’s totally fragmented,” he added.

Managing Monopolies

Ganesh has been using the Startpage search engine as an alternative to Google search, and said, “I think we need to wean ourselves away from social media like Facebook and Google and find what other alternatives are out there.”

“When the advertisers drive the logic of the search system, then you do have a problem,” he said.

Where social-media power is headed “depends on how much public concern continues to focus on this,” he said. “I think the closest analogy we have to this right now is the breakup of the Bell System into the Baby Bells.”

Ganesh said Facebook, Amazon and Google are the Big Three of social media power and “act like monopolies.”

“I think those three need to break up. That’s an important political solution,” he said.

As Google’s parent company, Alphabet is attempting to navigate a financial and political maze with its new policies. Prohibiting ads for and monetization of alternative climate content is a tool, but not censorship, Schweidel said.

For material posted to YouTube, “They aren’t saying, ‘These videos are banned from the platform.’ They’re saying that they are going to change the way advertising is done,” he said.

How evaluations are made and how the rules are applied will matter.

“It would be fairly naïve to say that whatever happens on these platforms, this is free speech and people should be able to say whatever they want to,” Schweidel observed.

“This is one of those things where the devil is going to be in the details,” he said.

Comments (4)

Opposing views on climate change.
It is important to look at the credentials of those who render an opinion. (1) What makes that person an authority? (2) How is the argument defended? (3) What do we know for sure about the facts supporting an idea? (4) What are the qualifiers and assumptions in the idea, and in those that challenge it? (5) How would you assess the probabilities of those assumptions? (6) Do you think any significant parameters have been left out of the arguments, and why? Everyone thinks they are an expert. Even the so-called experts think they are infallible in their thinking. I have lived long enough to know that no one knows everything, and it is the assumptions that everyone accepts as facts but has not rethought that can make the whole lot of us look foolish down the road. And it is the unintended consequences of our actions that are so often lethal.
12/7/2021 1:50:13 PM
API's pivot on climate change
I recognize the concept of confirmation bias and agree it is a powerful one; I have learned the hard way that it is impossible to discuss some topics with some people. The alternative, however, is to allow inaccurate statements and outright lies to be published in the name of diversity of opinion. Is this what we really want? Lee is correct that Google is a private company and has a right to govern its site. But he describes its process as analogous to the medieval church burning people at the stake for stating that the Earth revolves around the sun and not vice versa. While I fully understand his point, I do not accept it as a valid reason to allow outright lies to be published in the name of fair and open discussion. When I ran a conference, I allowed publication of some papers by a person who, shall we say, had some off-the-wall ideas. When asked why, I noted that although I did not agree with the concepts, who knows – he may be correct. This does not mean I allowed papers on why the Earth is really flat.
12/7/2021 1:44:37 PM
Google Police
So Google will now police search results to ensure adherence to scientific orthodoxy. And the "settled science" adherents will cheer them on. Climate scientists today apparently think that science can only advance by crushing all dissent.
12/7/2021 12:28:54 PM
