Researchers have been investigating how being on social media affects the way we consume news.
In a study carried out by Facebook, the tech giant examined its news feed, which filters out stories users are unlikely to like or agree with.
The BBC reported that the sophisticated algorithms underpinning this particular part of its service do in fact prevent users from being exposed to articles that do not reflect their worldview.
The paper, which was published in the journal Science, found that this can have an impact on how we engage with information that might, at first, trigger a negative response.
"People are increasingly turning to their social networks for news and information," said co-author Solomon Messing, a data scientist at Facebook, told the broadcaster.
"We wanted to quantify the extent to which people are sharing ideologically diverse news content – and the extent to which people actually encounter and read it in social media."
This is a pressing issue, as the less willing we are to consult other sources of information, the less likely we are to develop a rounded perspective.
It’s not that we have to agree with what we’re reading, just that it’s important to see things from the other side and use that to better shape our own understanding.
Much of the time, our opinions and biases form without us really knowing why – ask someone why they believe a particular thing and sometimes they will simply reply, "I just do".
As the authors of the paper highlight, “exposure to news, opinion and civic information increasingly occurs through social media”.
If this continues – and given how important social media has become – then it’s vital that social networks like Facebook become less restrictive about what content is filtered through and what isn’t.