Facebook Explains ‘Echo Chambers’ and ‘Filter Bubbles’ in New Study

  • Facebook Claims That Its Algorithms Aren’t Responsible For The User’s Online Echo Chamber

There have been reports that Facebook observes users' activity on the platform and knows what a particular user likes and dislikes.

In a 2011 TED Talk, Eli Pariser told a striking story about how social media networks shape our view of the world. He said that he had always gone out of his way to meet conservatives, yet one day he was surprised to realize he wasn't seeing any conservatives at all.

They had all disappeared from his Facebook feed. Pariser had been clicking on his liberal friends' posts more than on his conservative friends', and, he claims, Facebook was observing all of that.

He also said that, without consulting him, Facebook edited the conservative friends out of his feed, so it seemed as though they had disappeared. Pariser believes these social media sites now put a filter between users and the web, and only filtered material reaches them.

This filtering lets these companies show us more of what we like and less of what we don't. Once Facebook knows what a particular user likes, it can target ads to match those preferences.

Facebook, on the other hand, argues that it is not the platform that keeps you from reading opposing viewpoints but the user. In an official blog post, Facebook said it did not pass judgment on the normative value of cross-cutting exposure. That stance is a troubling one, because exposure to diverse news sources is foundational to democracy.

A Facebook source noted that while normative scholars often argue that exposure to a diverse marketplace of ideas is key to a healthy democracy, exposure to cross-cutting viewpoints has also been linked to lower levels of political participation. Facebook is therefore arguing that this reduced exposure can be seen as both a good thing and a bad thing.

The study also describes a curation algorithm that filters out hard news from diverse sources the user is less likely to agree with. Instead, Facebook surfaces stories from sources the user is more likely to agree with.
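To make that mechanism concrete, here is a minimal Python sketch of engagement-based curation of the kind described above. The names and data shapes (curate_feed, click_counts, the post dictionaries) are illustrative assumptions, not Facebook's actual code: posts from sources the user rarely clicks are ranked lower and silently drop out of a fixed-size feed.

```python
from collections import Counter

# Hypothetical sketch of engagement-based feed curation; not
# Facebook's actual implementation.

def curate_feed(posts, click_counts, keep=10):
    """Rank posts by how often the user has clicked each source,
    so rarely clicked sources sink out of the visible feed."""
    ranked = sorted(
        posts,
        key=lambda post: click_counts.get(post["source"], 0),
        reverse=True,
    )
    return ranked[:keep]

# Example: the user clicks liberal friends' posts far more often,
# so the conservative friend's story quietly falls out of view.
click_counts = Counter({"liberal_friend": 42, "conservative_friend": 1})
posts = [
    {"source": "liberal_friend", "headline": "Story A"},
    {"source": "conservative_friend", "headline": "Story B"},
    {"source": "liberal_friend", "headline": "Story C"},
]
print(curate_feed(posts, click_counts, keep=2))
```

No one is ever told a post was filtered out, which is exactly the point Pariser raises: the user only sees what survives the ranking cutoff.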


