Session
Organizer 1: Nikita Volkov, Civic Chamber of the Russian Federation
Organizer 2: Alexander Malkevich
Organizer 3: Lydia Mikheeva
Speaker 1: Aminé Mounir Alaoui, Civil Society, African Group
Speaker 2: Nayla Abi Karam, Civil Society, Asia-Pacific Group
Speaker 3: Chahrazed Saadi, Private Sector, African Group
Speaker 4: Dramane Traoré, Civil Society, African Group
Speaker 5: Charlemagne Tomavo, Civil Society, African Group
Alexander Malkevich, Civil Society, Eastern European Group
Lydia Mikheeva, Civil Society, Eastern European Group
Nikita Volkov, Civil Society, Eastern European Group
Round Table - 90 Min
1. Is contemporary content moderation policy effective in fighting extremism, given its side effects, notably the fragmentation of the digital space?
2. Is it possible to formulate an exhaustive definition of malicious and harmful content, set clear criteria for it, and devise straightforward, transparent moderation guidelines?
3. Is it feasible to devise transparent multistakeholder mechanisms for oversight of content moderators on social networks? Could a more inclusive moderation policy towards disseminators of non-mainstream views be a viable alternative to bans and blocking on social media?
What will participants gain from attending this session? Workshop participants will become acquainted with the issue of non-transparent moderation policies exercised by major IT companies, which grant digital communication monopolies immense agenda-setting and agenda-framing powers. Attendees will also learn about multiple approaches to reshaping current content moderation mechanisms in the digital space to make them more inclusive, transparent, and accountable to institutional stakeholders and millions of Internet users.
Description:
The Internet infrastructure has long been considered a neutral foundation on which debates on various social, economic, and political issues could take place. Yet the rapid evolution of new technologies has altered the significance of the Internet infrastructure, and the major social media platforms have become irreplaceable and unique services, de facto resembling digital communication monopolies.
Millions of users across the globe post vast amounts of digital content daily, including malicious and harmful material, which obliges the administrations of social media platforms to react accordingly. As a result, major social media platforms, e.g., YouTube, Twitter, Facebook, and Instagram, have gained the capacity to control the global discursive space by exercising moderation policies based on unilaterally imposed, arbitrary, and non-transparent criteria, values, and beliefs.
Social media platforms possess the power to delete content expressing views that challenge the dominant narratives, or even to block media outlets and user accounts that disseminate non-mainstream views, thus violating the values of freedom of speech and the core principles of free information flow and unimpeded access to it.
Having the last word in de facto censorship practices grants the IT moguls immense agenda-setting and agenda-framing powers, consequently challenging the pursuit of democracy, neglecting social inclusion, marginalizing social groups, and, pivotally, leading to further fragmentation of the digital space.
Banned or blocked users tend to shift to alternative social media platforms, usually localized in their country of residence and compliant with national digital regulations. Driving hundreds of thousands of people out of the global digital communication infrastructure creates information bubbles and limits the outreach of internationally operating digital ecosystems, thus reproducing national borders in the digital space.
In this regard, it is necessary to initiate a public discussion on designing new mechanisms to protect freedom of speech in the digital space and preserve its integrity.
As a tangible outcome of the workshop, the organizers expect to produce a set of policy recommendations on the accountability of IT companies' content moderation policies, while also raising the awareness of society and the expert community of the issues discussed at the workshop and advocating for more transparent content administration practices on social media.
Hybrid Format: The onsite moderator will steer the discussion by allocating speaking slots of approximately 8-10 minutes to the onsite speakers and then turning to the online speakers, who will be overseen by the online moderator via Zoom, Microsoft Teams, or other platforms.
During the free discussion, the same rotation scheme will be used, but with shorter time slots.
To ensure the flow of the discussion and preserve the workshop's internal logic and coherence, both onsite and online speakers' reports and topics will be collected prior to the event, and the speaking order will be arranged according to the discussion plan.
The online moderator will monitor the live broadcast chats on the IGF platform, Facebook, Instagram, YouTube, and VK, as well as WhatsApp and Telegram chats, to collect questions and relevant feedback to be addressed by the speakers during the free discussion.
The onsite moderator will ask for attendees’ questions and comments and then forward them to the speakers.