Facebook’s fake news problem is about more than just ads

Chris Hughes’s attack on his co-founder won’t stop the spread of misinformation

Facebook CEO and founder Mark Zuckerberg on Capitol Hill

It seems like it should be quite the scandal: one co-founder of Facebook chastising another publicly for a business decision that has, allegedly, had major social reverberations. In response to Democratic presidential contender Elizabeth Warren calling out Facebook for loosening its restrictions on political advertising, Facebook co-founder Chris Hughes took to Twitter.

‘I have a feeling that many people in tech will see Warren’s thread implying FB empowers Trump over Warren as unfair,’ Hughes wrote. ‘But Mark [Zuckerberg], by deciding to allow outright lies in political ads to travel on Facebook, is embracing the philosophy behind Trumpism and thereby tipping the scales.’

Hughes, however, hasn’t had anything to do with Facebook for years, and criticizing what the social network has turned into is a very on-brand move for him as he attempts to cement himself as a civic innovator. Months ago, in a widely read New York Times op-ed called ‘It’s Time To Break Up Facebook’, Hughes called for government intervention in a company whose CEO’s ‘focus on growth led him to sacrifice security and civility for clicks.’

To state the obvious, I didn’t co-found Facebook, so there’s certainly insider knowledge I’m lacking. But it strikes me that curtailing political advertising on Facebook won’t actually solve the problem of misinformation on the platform. It’s considerably more widespread than that.

First, the impact of paid media in the game of political influence is overstated these days. To put it simply, people hate ads. They don’t trust them, either — especially on platforms like Facebook. Two-thirds of consumers say they don’t trust ads on social networks, according to Nielsen, and trust in them runs well below trust in TV commercials or billboards. Meanwhile, analysts have estimated that Trump received the equivalent of $2 billion in advertising simply through the amount of press coverage he garnered in the 2016 election — and the $7 million that Democratic billionaire Tom Steyer has spent on advertising his presidential campaign in early primary states has been unable to push him out of the single digits in polls.

That’s not to say that ads don’t work. They do. I wouldn’t work in advertising if they didn’t, because there wouldn’t be anything to pay my salary. But the online influence machine is just as likely — if not more likely — to be driven by forces far beyond the relatively narrow scope of media buyers. Plus, there’s plenty of content on Facebook that isn’t advertising, and likely has more impact because it isn’t advertising.

You can get a better idea of this by looking at other sectors of viral misinformation. Social networking platforms have banned ads that push anti-vaccine advocacy, and Facebook has also taken steps to reduce the reach of anti-vaccine activist groups and pages, as well as to show pop-ups with information from the World Health Organization to users who search for vaccine-related terms. One of the most prominent anti-vaccine activists on Facebook said he’d spent $35,000 on ads on the social network over the course of four years. That’s a drop in the bucket compared with political ad spending — and yet the misinformation machine spiraled out of control regardless.

Moreover, if a politician with millions of social media followers is going to use Facebook or Twitter to lie, he or she is probably going to do so whether ad spend can be put behind it or not. At last count, Donald Trump had 24.5 million followers on Facebook and 65.4 million on Twitter. Misinformation on social media is profitable whether or not there are restrictions on political ads. Even if Trump’s reelection campaign is unable to advertise on Facebook, Trump’s posts will still get seen and shared — and there will be ads from other advertisers running alongside them. So Facebook will profit either way.

One thing’s for sure: Facebook appears to have been genuinely blindsided by the tactics of political influence actors in the 2016 election, and it’s unclear how much it has learned since, because its approach to political ads is still evolving. As Elizabeth Warren highlighted, the company quietly loosened restrictions around false claims in ads from politicians and political parties, though other political entities (like advocacy groups) are not exempt from those restrictions. But Facebook has also considerably tightened controls around who’s allowed to buy political ads in the first place.

‘We don’t believe that it’s an appropriate role for us to referee political debates,’ a Facebook representative told the Guardian. ‘Nor do we think it would be appropriate to prevent a politician’s speech from reaching its audience and being subject to public debate and scrutiny.’

It’s as though Facebook has realized there’s not much it can do about the fact that all politicians lie.