Over the years the internet has been celebrated for sharing information, disseminating knowledge, and promoting freedom and debate, thus contributing to the enthusiastic rhetoric of the so-called collective intelligence, a new form of intelligence that emerges from the collaboration and collective efforts of single individuals (Lévy and Bononno, 1997). Indeed, a hyperconnected environment such as the internet greatly facilitates communication among people, bringing down both temporal and spatial barriers. The small-world theory tested by Milgram in the 1960s (Milgram, 1967), which resulted in the famous six degrees of separation[1], was recently verified by researchers at Facebook, who found that the number had shrunk to 3.57, meaning that each person is now connected to every other person by an average of three and a half intermediaries (Bhagat et al., 2016). This is one example of how the advent of new technologies, and social networks in particular, has revolutionized the way we communicate and share information, while also allowing everyone to produce content and share their opinions.

As of the fourth quarter of 2018, Facebook had 2.32 billion monthly active users, generating 2.46 million posts and 1.8 million likes every minute[2]. Social media have rapidly established themselves as the main information source for many of their users, who prefer to access news through social media, search engines, or news aggregators rather than going directly to a news website. Over the years, traditional media such as print, radio, and television have been joined by a heterogeneous mass of alternative news sources, where information is no longer mediated. However, despite the increasing quantity of content, its quality may be poor, due to issues of content monetization and the persistent reduction of investment in news production and distribution.[3] Moreover, smartphones now account for a significant share of news access, which may affect the way we consume information and the time we devote to processing it.
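The "degrees of separation" figure is simply the average shortest-path (hop) distance between pairs of nodes in a friendship graph. A minimal sketch of that computation on a hypothetical five-person network (toy data, not the actual Facebook measurement) also shows how a single extra link shortens average separation:

```python
from collections import deque
from itertools import combinations

def bfs_distances(graph, start):
    """Breadth-first search: hop distance from start to every reachable node."""
    dist = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for neighbor in graph[node]:
            if neighbor not in dist:
                dist[neighbor] = dist[node] + 1
                queue.append(neighbor)
    return dist

def mean_separation(graph):
    """Average shortest-path length over all connected pairs of nodes."""
    total = pairs = 0
    for a, b in combinations(graph, 2):
        d = bfs_distances(graph, a).get(b)
        if d is not None:
            total += d
            pairs += 1
    return total / pairs

# Hypothetical friendship networks (undirected, as symmetric adjacency lists).
chain = {"A": ["B"], "B": ["A", "C"], "C": ["B", "D"],
         "D": ["C", "E"], "E": ["D"]}
# Same network plus one "shortcut" friendship between A and E.
ring = {"A": ["B", "E"], "B": ["A", "C"], "C": ["B", "D"],
        "D": ["C", "E"], "E": ["D", "A"]}

print(mean_separation(chain))  # 2.0
print(mean_separation(ring))   # 1.5 — one extra link lowers the average
```

At the scale of billions of users, exhaustive pairwise BFS is of course infeasible; the Facebook study relied on statistical estimation, but the quantity being estimated is the same one computed here.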
Such a context has contributed to the loss of reputation and trust suffered by traditional media, encouraging people to rely on alternative information sources that are not always qualified.
Since 2013, the World Economic Forum (WEF) has placed the global risk of massive digital misinformation at the core of technological and geopolitical risks such as rising religious fanaticism, cyber-attacks, and terrorism (Howell, 2013). On the internet, a huge amount of information competes for our limited attention, and it is often difficult to apply our abilities to analyze, reflect, and draw conclusions. Instead, our cognitive biases (i.e., the shortcuts and heuristics that we use to simplify reality and (re)act rapidly) emerge forcefully. As human beings, we need such biases to interpret reality. Unfortunately, while these cognitive mechanisms are often fundamental to our survival, they can also act as mental traps and mislead us. Among them, a fundamental role in information consumption and diffusion is played by confirmation bias, the human tendency to look for information that is consistent with one's existing system of beliefs. Indeed, despite the availability of a huge, virtually infinite variety of information, online users tend to fragment into bubbles, the so-called echo chambers (Del Vicario et al., 2016; Zollo & Quattrociocchi, 2018). Users in the same community share a common narrative and, immersed in echo chambers, select information coherent with their worldview, even when false (Bessi et al., 2015), while ignoring information dissenting from their beliefs. Users from different and contrasting communities rarely interact and, when they do, the debate degenerates, especially in longer discussions (Zollo et al., 2015). The response to debunking attempts is not dissimilar, and results in the well-known backfire effect (Nyhan and Reifler, 2010): correction is perceived as a further attempt to manipulate information, thus reinforcing users' original positions (Zollo et al., 2017). Such aspects are especially important from a media pluralism perspective.
Indeed, it is widely assumed that citizens and democracy benefit from a greater quantity of information available in the information system (Prat and Stromberg, 2013). However, due to users' confinement into echo chambers, information pluralism on social media does not appear to produce the positive effects generally associated with citizens' exposure to different points of view.[4]
It is possible to quantify the turnover of Facebook news sources by measuring the heterogeneity of users' activity. We observe that, for increasing levels of activity (number of likes) and lifetime (the temporal distance between a user's first and last interaction on the platform), users interact with increasingly fewer new sources (Schmidt et al., 2017). While users with very low lifetime and activity levels interact with about 100 pages in a year, 30 in a month, and ten in a week, the same values are far lower for more active and long-lived users, who interact with about ten pages in a year, and fewer than four monthly and weekly. News consumption on Facebook is therefore dominated by selective exposure, showing a natural tendency of users to confine their activity to a limited set of pages and focus their attention on certain topics (and claims), thus contributing to the formation of a highly polarized community structure. Such dynamics appear to be independent of the topic and also apply to online political debates (Adamic and Glance, 2005). In this regard, looking at users' behavior on Facebook pages engaged in the Brexit debate (Del Vicario, Zollo et al., 2017), we observe the spontaneous emergence of two well-separated communities (echo chambers), where connections among pages are a natural result of users' interactions with them. Users are divided into two distinct groups, each confining its attention to a specific narrative and seemingly ignoring the other. Similar patterns emerge around the debate on the Italian Constitutional Referendum, where five main communities of news pages are observed on Facebook and four on Twitter (Del Vicario, Gaito et al., 2017). Also in this case, users are strongly polarized and tend to confine their attention to a specific cluster (community) of pages.
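The two per-user quantities used above, lifetime and number of distinct sources, are straightforward to derive from raw interaction logs. A minimal sketch, assuming hypothetical like records of the form (user, page, day of activity); the names and data are illustrative, not taken from the studies cited:

```python
from collections import defaultdict

# Hypothetical like records: (user_id, page_id, day of activity).
likes = [
    ("u1", "p1", 3), ("u1", "p2", 40), ("u1", "p1", 300),
    ("u2", "p1", 10), ("u2", "p3", 12),
]

def user_engagement(likes):
    """Per-user lifetime (days between first and last like) and
    number of distinct pages (news sources) interacted with."""
    days = defaultdict(list)
    pages = defaultdict(set)
    for user, page, day in likes:
        days[user].append(day)
        pages[user].add(page)
    return {
        user: {"lifetime": max(d) - min(d), "n_pages": len(pages[user])}
        for user, d in days.items()
    }

print(user_engagement(likes))
# u1: long-lived (lifetime 297 days) but active on only 2 pages;
# u2: short-lived (lifetime 2 days), also 2 pages.
```

Binning users by lifetime and activity and averaging `n_pages` within each bin reproduces the kind of heterogeneity analysis described above.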
Empirical evidence suggests that the increasing segregation of users into echo chambers plays a pivotal role in (mis)information spreading on social media. To counter misinformation and encourage effective communication, smoothing polarization is thus essential. To this end, users' behavior and their interactions with information may be used to determine in advance the targets of hoaxes and fake news in the short term (Del Vicario et al., 2019). A timely identification of potential misinformation targets may allow for the design of tailored counter-narratives and appropriate communication strategies. In this direction, within the EU H2020 project QUEST[5], we are now working to analyze, design, test, and evaluate different strategies to improve science communication on social media, with special attention to delicate and polarizing topics that need to be addressed with care, such as climate change or vaccines (Schmidt et al., 2018). Finally, it is crucial to promote a culture of openness and to emphasize the importance of critical thinking, together with a better awareness of both digital tools and human limits (and biases).
Notes
[1] According to the theory, six is the number of intermediaries necessary to connect any two individuals in the world.
[2] Source: Statista 2019
[3] AGCOM, 2018. News vs. Fake in the Information System. Interim Report Sector Inquiry “Online platforms and the news system”.
[4] AGCOM, 2018. News vs. Fake in the Information System. Interim Report Sector Inquiry “Online platforms and the news system”.
[5] EU H2020 Project QUEST: https://questproject.eu