Facebook "subsidizes" polarization

Facebook’s algorithms cause political polarization by creating “echo chambers” or “filter bubbles” that insulate people from opposing views about current events.

But how does this happen inside Facebook’s ad delivery process? And what is the economic impact on campaign advertisers and on Facebook itself?

Researchers from Northeastern University and the University of Southern California published a paper this week showing that Facebook essentially “subsidizes” partisanship.

The research team ran its own political ads on Facebook as experiments. While the delivery algorithms can’t be inspected directly, the experiments produced some striking results.

Facebook’s ad delivery algorithms predict whether a user is already aligned with an ad’s content. If a user is likely to be aligned (say, a Democrat shown a Bernie Sanders ad), the algorithm predicts that the user will be more valuable to Facebook (more engagement: likes, shares, and so on) than a user who isn’t aligned (say, a Republican shown the same ad). The result is a “discount” for serving the ad to a “relevant” user.
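To make the economics concrete, here is a minimal Python sketch of that pricing logic. It is a toy model, not Facebook’s actual auction: the `User` and `effective_cpm` names, the `quality_weight` parameter, and all the numbers are hypothetical, illustrating only how a higher predicted-engagement score can act like a subsidy that lowers the price of reaching an already-aligned user.

```python
# Toy model of relevance-based ad pricing (illustrative only).
# All names and numbers are hypothetical. The premise, per the research,
# is that the platform combines an advertiser's bid with a predicted
# engagement score, so higher predicted engagement lowers the effective
# price needed to win an impression.

from dataclasses import dataclass


@dataclass
class User:
    label: str
    predicted_engagement: float  # platform's estimate the user engages (0..1)


def effective_cpm(bid_cpm: float, user: User, quality_weight: float = 10.0) -> float:
    """Effective cost per 1,000 impressions after the relevance 'discount'.

    The engagement term is treated as a subsidy that reduces what the
    advertiser must pay to reach this user.
    """
    subsidy = quality_weight * user.predicted_engagement
    return max(bid_cpm - subsidy, 0.0)


aligned = User("Democrat shown a Sanders ad", predicted_engagement=0.8)
cross = User("Republican shown the same ad", predicted_engagement=0.1)

bid = 12.0  # advertiser bids $12 CPM for both audiences
print(f"{aligned.label}: effective CPM ${effective_cpm(bid, aligned):.2f}")
print(f"{cross.label}: effective CPM ${effective_cpm(bid, cross):.2f}")
# Aligned users are cheaper to reach, so budgets drift toward them --
# the "subsidy" for partisan-congruent delivery.
```

Under this toy model, the aligned user costs $4 per thousand impressions versus $11 for the cross-partisan user. That gap is the “discount”: the same budget buys far more impressions among people who already agree with the ad.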
