The development of emerging technologies, the increasing centrality of data to nation-state power, and the concentration of social, economic, and political influence in a few elite platforms are driving dramatic shifts in how technologies and data use are envisioned and governed. Data and technology are increasingly being configured around national politics, laws, and interests and are at the heart of geopolitical and diplomatic negotiations.
It is increasingly evident that the architecture and use of online platforms can impact democratic societies and institutions, market power, free speech, and national security. In response, governments and experts are grappling with how to regulate big tech and respond to online harms in ways that enable innovation while protecting rights. Trends like increased surveillance capabilities, internet shutdowns, and traceability requirements are raising concerns about the rise of digital authoritarianism. At the same time, a broader negotiation is taking place over what democratic governance of the internet should look like, with differences centered on security, data governance, emerging technology use, and end-user rights.
The splintering of the internet, balkanization, fragmentation, and digital state sovereignty are some of the terms used to describe these shifts. Questions of security, commerce, innovation, and rights are at the center of these debates - with the key actors being companies, governments, law enforcement, and citizens. In some contexts, communities are also becoming key actors in questions of data ownership and harm.
A wide variety of policy trends have emerged across contexts connected to these ideas. Some of these include:
- Development of domestic infrastructure including national and regional clouds;
- Prioritization of domestic technology and capacity;
- National and regional generation, sharing, use, ownership of, and access to non-personal and personal data;
- Applicability of domestic legislation to the internet and digital technologies, particularly with respect to the development and use of AI, online speech and behavior, online platforms, and the governance of personal data.
These shifts have also resulted in a global push to influence and define the standards that underpin emerging technologies.
The pandemic has underscored the importance of the digital in our everyday lives and the impact of digital divides. It has also highlighted the need to revamp digital infrastructure and leverage new technologies to ensure connectivity. But could the ‘splintering of the internet’ or ‘data sovereignty’ impact an individual’s ability to meaningfully access and use the digital?
With these shifts, individuals' experience of the digital may depend heavily on where they are located and on their nationality. Beyond the quality or type of infrastructure, hardware, or device, individuals may have access to different services and content depending on regimes around intermediary liability and content, privacy requirements, and security standards. To a certain extent, this is already the case: companies restrict access based on geography when required by law, and they treat European data differently to maintain GDPR compliance. These differences could deepen and expand into other services and sectors, such as cryptocurrency.
More subtly, if legislation like the proposed EU Digital Services Act and AI Regulation comes into force, users may have significantly different ‘algorithmic’ experiences and may or may not encounter AI technologies designated as ‘high risk’. The online rights afforded to end-users, and their ability to define their own online experience, will also vary - something already apparent under the GDPR through rights such as the right to an explanation and data portability, and that will continue under the proposed Digital Services Act, with its provisions allowing users to opt out of and shape recommendation systems.
It will also become increasingly difficult for multinational online platforms to navigate complex and differing regimes around data and technologies, with differing standards for personal data, permitted content, transparency (of content, algorithms, etc.), audits, risk assessments, and more. Multi-tiered regimes with requirements based on the size of the company (‘very large online platform’, ‘significant social media intermediary’, etc.) can add further confusion. This could result in companies offering different services and technologies based on region or nationality, and/or in global standards being set by default. It may also become riskier for companies to operate in certain contexts as a result of requirements to establish local offices.
It is important to note that the differentiation of user experiences and data collection, and the policies that dictate them, can be underpinned by national or political needs as well as by human rights considerations.
The present discourse and emerging trends can be seen as a political reframing of the digital by nation-states with competing agendas over national interests and over what rights are to be accorded to their citizens. Data has become an essential resource for innovation, business, and national interests. As with other resources, conflict starts with differing visions for the use of the resource, perceptions of ownership, and possibilities for reframing the existing system to one’s advantage.
As this reframing happens, the following principles can usefully guide decisions and directions for the governance of the digital:
- A common thread of human rights: Multiple visions for the internet can be imagined, but they should be tied together by a common commitment to human rights from companies and governments, guiding both innovation and regulation. Human rights impact assessments and feedback and redress mechanisms are key to identifying, documenting, and responding to harms.
- Multi-stakeholder cooperation across the ecosystem and contexts: This includes citizens, governments, civil society, academia, and industry. Challenges like disinformation are complex, heavily dependent on context, and require evidence-based solutions at multiple levels and cooperation between actors. Initiatives like the Freedom Online Coalition - which brings together 32 governments to advance internet freedom through coordination and the development of shared language - are one example of what this cooperation could look like. An emphasis on cooperation enables the balancing of interests and widens the toolbox from which solutions can be drawn.
As the vision for an open internet and the possibility of a common understanding for the use of emerging technologies splinter, the pandemic has underscored that we have to collaborate and coordinate to find solutions to shared problems - or, as in the case of the internet and data, for shared resources. At the same time, we have to evaluate whether individual national gains are worth the global human cost.