Facebook's algorithm doesn't create political echo chambers, users' choices do, says study
A new study from Facebook, published in Science, crunched data from 10.1 million US Facebook users and determined that users' own choices created homogeneous ideological environments. Facebook's News Feed algorithm reduced the amount of "cross-cutting" content (content from the opposing political viewpoint) shown to users by 15%, but users themselves clicked on that content 70% less often than on content they agreed with.
"Within the domain of political news encountered in social media, selective exposure appears to drive attention," the study's authors wrote. "Compared with algorithmic ranking, individuals’ choices played a stronger role in limiting exposure to cross-cutting content."
These results reflect positively on Facebook, which has in the past been accused of creating echo chambers that limit exposure to diverse opinions. It turns out, if this study is to be trusted, that's our own fault.
Facts about politics on the internet:
The internet's favorite pastime, venting anger at random strangers, doesn't quell people's frustration. In fact, a study showed that venting online just makes you angrier and more aggressive.
In 2012, a Facebook study showed that people were more likely to interact with acquaintances' content than with that of close friends on the social network. The researchers concluded, "The information we consume and share on Facebook is actually much more diverse in nature than conventional wisdom might suggest."
A 2011 TED Talk by Eli Pariser argued that we put ourselves in "filter bubbles" of content we agree with online.