Too many summits are high-level, ineffective meetings filled with well-meaning but empty speeches. But this year's multistakeholder Christchurch Call summit differed greatly from the usual UN General Assembly gathering. New Zealand and France used their political capital to bring together representatives of different stakeholder groups for a discussion about countering terrorism online. To the participants' surprise, most of the interventions were conversational, and civil society was included on an equal footing.
This blog post presents Digital Medusa's opinions about the summit and the Christchurch Call.
In 2019, New Zealand and France convened the Christchurch Call to Action after the terrorist attack that killed 51 Muslim worshippers and injured 50 more at mosques in Christchurch. The horrendous attack was live-streamed on Facebook and spread across other tech corporations' platforms.
When the Call was launched in 2019, civil society organizations around the world criticized their lack of presence and complained of being treated as an afterthought. But over the past three years, we have witnessed slow but sustainable change and a move towards convening a true multistakeholder forum. The Call has become a forum that can go beyond mere lip service, that is genuinely multistakeholder, and that pursues the Christchurch Call commitments to preserve a secure, open, interoperable Internet while fighting terrorist content online.
The governments of New Zealand and France convened the Christchurch Call Advisory Network (CCAN), which brings together civil society organizations, including impacted communities and digital rights groups, as well as the technical community, in defense of a global, interoperable Internet. While the role of civil society in other similar forums is often contested and unclear, CCAN has made real progress towards meaningful participation. It is important that this progress now continue beyond merely attending a high-level meeting with world leaders.
Crisis Response Management and Internet Infrastructure
During the summit, we discussed crisis response management and protocols. Various countries have made considerable progress towards a voluntary but cohesive crisis response protocol that can adapt to the global nature of the Internet. However, we increasingly see calls for content moderation at the Internet infrastructure level (domain names, hosting providers, cloud services, etc.). Proportionate moderation of content at the infrastructure level may not be possible in all cases. Especially during a crisis, we have to be extremely careful with the techniques used to filter and block access to domains and websites, as such blunt measures can hamper access to other services online. We also need to evaluate the human rights impact of each measure at each stage of crisis response. A global, interoperable, and interconnected Internet needs a holistic approach to safety, one that does not focus exclusively on blocking, take-downs, and after-the-incident responses, but that offers a systematic way of tackling the issues raised by horrific content online.
Researchers' Access to Data
Perhaps surprisingly, Digital Medusa diverges from various researchers and civil society organizations on calls for researchers' access to data. While the Digital Services Act will facilitate such access, I do not believe we have the governance structures in place to validate research, nor the privacy-enhancing structures to prevent abuse of personal and private data. New Zealand, France, and a few others announced an initiative that could address this issue while also facilitating researchers' access to data. The effectiveness of such an initiative remains to be seen, especially as it primarily looks for technological fixes.
Internet of the Future
It is natural to think that, if bad content online is the source of danger, then all that is needed is to remove that content through moderation. But content removal does not on its own bring safety to the Internet. Future efforts need holistic approaches, and we need to work with impacted communities and Internet operators. Content removal and take-downs can have a major impact on the well-being of individuals: careless removal can cut off access to a host of essential online services, and it can also hamper uprisings and information sharing during a crisis. I hope that content moderation becomes only one tool among many (and not even the most important one), and that we come up with more innovative ways to govern our conduct on the Internet.