As technology integrates into almost every aspect of life, many people are turning to new mediums for their daily news. One of those mediums is social media, where users increasingly read up on current affairs through links posted by friends and by news outlets themselves. However, the exemplar of this trend, Facebook, may not be at fault for the ideological slant in users’ newsfeeds – it might just be the users themselves.
It is worrisome to think that sites such as Facebook could be using an algorithm to curate a newsfeed that reflects users’ prior clicks, potentially bypassing important news. The result would be a biased, uninformed citizen reading a page of news filled with everything that matches their political opinions. Currently, only one fifth of the posts on liberals’ timelines, and one third on conservatives’, present opposing political views.
A new big data study directly addressed this concern by compiling the activity of 10.1 million Facebook users who had previously indicated their political standing through a survey. Over six months, those users posted a total of 7 million links, 13 percent of which were news-related or political in nature.
Facebook does use a social algorithm, and the researchers were interested in how strongly it shapes what users see. They found that its effect on exposure to biased posts was far less severe than anticipated. In fact, users clicked on 1 percent fewer links to challenging views because of the algorithm’s suggestions, but 4 percent fewer because of their own choices about what to click.
For now, it seems users influence their biased Facebook timelines more than Facebook does. However, on April 21 of this year, the future applicability of these results may have been put in jeopardy: the social media giant announced that it will be updating its newsfeed ranking system, complete with a brand-new formula for analyzing what specific people would like to see.
Read the full article at www.sciencenews.org.