IGF 2023 WS #209 Viewing Disinformation from a Global Governance Perspective

    Time
    Wednesday, 11th October, 2023 (04:00 UTC) - Wednesday, 11th October, 2023 (05:30 UTC)
    Room
    WS 5 – Room B-2

    Organizer 1: Jeanette Hofmann, Berlin Social Science Center
    Organizer 2: Anriette Esterhuysen, Association for Progressive Communications
    Organizer 3: William J. Drake, Columbia Institute for Tele-Information

    Speaker 1: Nighat Dad, Civil Society, Asia-Pacific Group
    Speaker 2: Clara Iglesias Keller, Civil Society, Latin American and Caribbean Group (GRULAC)
    Speaker 3: Aaron Maniam, Government, Asia-Pacific Group
    Speaker 4: David Kaye, Civil Society, Western European and Others Group (WEOG)
    Speaker 5: Jeanette Hofmann, Civil Society, Western European and Others Group (WEOG)

    Additional Speakers

    William J. Drake

    Moderator

    Anriette Esterhuysen, Civil Society, African Group

    Online Moderator

    William J. Drake, Civil Society, Western European and Others Group (WEOG)

    Rapporteur

    Jeanette Hofmann, Civil Society, Western European and Others Group (WEOG)

    Format

    Round Table - 90 Min

    Policy Question(s)

    1. What is disinformation and why is it a problem?

    2. How strong and clear a baseline do existing international instruments provide for the governance of disinformation?

    3. Concerning new governance initiatives, what sort of consultation and decision-making process is best suited to the governance of disinformation, and can the IGF assume a role in the process?

    What will participants gain from attending this session? The workshop will provide an overview of the controversy surrounding disinformation, clarify the state of knowledge about disinformation and its effects, and discuss current regulatory approaches and their impact. It will also survey the different models of cooperative governance currently being devised. Some of these are stakeholder-led efforts, such as the Facebook Oversight Board, the Christchurch Call, the DISARM Framework, and the various proposals for new international press councils. Others are intergovernmental ‘minilateral’ or regional initiatives, such as the implementation work of the European Union’s Digital Services Act, including the related Code of Practice on Disinformation. Still others are broader intergovernmental initiatives, such as UNESCO’s Guidelines for Regulating Digital Platforms. Participants will gain a sense of the complexity and cultural dependence of disinformation, as well as a better understanding of the significant challenges of building international consensus and designing effective institutions.

    Description:

    Disinformation is generally seen as a problem. One aspect concerns its definition and the empirical identification of the harms it causes. A second aspect concerns adequate responses: how can disinformation be addressed without undermining human rights? Some observers say disinformation is an urgent problem that can only be addressed through national regulation. Others believe it is a global challenge that also requires international cooperation among all stakeholders. Some believe that the scope and seriousness of disinformation are so profound that it presents a threat to democracy. Others believe that the problem is being overstated and that regulating it will restrict freedom of expression. In view of this controversy, the workshop will consider two questions: (a) the definition of disinformation as a policy problem, and (b) the establishment of shared governance responses at the national and global levels.

    (a) The definition of disinformation as a policy problem: While there is agreement that disinformation can be harmful, it is challenging to determine its causes and effects in ways that facilitate robust regulation. Current definitions of disinformation emphasize the intentional dissemination of false, misleading, or manipulative information aimed at influencing public opinion or people’s behaviour. In practice, disinformation proves to be a complex issue with nationally varying actors, motives, and effects. Even seemingly clear-cut attributes such as misleading information are culturally sensitive and open to different interpretations.

    (b) The establishment of shared governance responses: First, we will consider the trend in many states to regulate disinformation. Then we will look at international instruments that may directly or indirectly bear on disinformation. Next, we will explore the different models of cooperative governance that are currently being devised. Some of these are stakeholder-led efforts, others are intergovernmental ‘minilateral’ or regional initiatives, and still others are intergovernmental initiatives at the broad multilateral level.

    Expected Outcomes

    The ultimate goal of this workshop is to propose options for the evolving global Internet governance system to address the problem of disinformation in ways that respect human rights and reflect the diversity of interests at play, and, at the same time, connect interventions at national and global levels. In this way, the workshop hopes to contribute to the consensus-building process in understanding and adequately addressing the issue of disinformation.

    Hybrid Format: The interactive roundtable format allows a dynamic and flexible discussion. The organizers have extensive experience with managing such sessions at the IGF since 2006, as well as in related international venues. This includes substantial experience with managing hybrid sessions and ensuring the inclusive participation of people who are onsite and online. The moderator will keep an eye on raised hands in both spaces, give everyone a chance to speak in turn, set time limits and promote respectful interactions, ensure people can respond to points directed at them, read out typed questions if someone’s sound fails, and so on. We will use the Zoom chat to include virtual participants and encourage as many people in the room as are willing to also log into the Zoom session. We will recruit one or two participants to live-tweet the session so that people can also follow the conversation in that manner.

    Key Takeaways

    1. A more nuanced approach to disinformation is called for, one that does not focus only on social networks or digital platforms but also considers the wider media landscape. Furthermore, more empirical research is needed to realistically assess the dangers of disinformation. We should not simply take for granted the effect of disinformation on people's thinking and (voting) behaviour.

    2. There is no single global solution to disinformation that works in every instance or context, and it is unlikely that governments will agree on how to address it. What is needed, however, is a common set of principles guiding how we think of and act upon disinformation. Human rights and access to information must be front and center of such principles.

    Call to Action

    1. Regional human rights courts need to be resourced so that they can function as mechanisms for the regulation of disinformation.

    2. High-quality journalism is an effective means against the impact of disinformation but faces an uncertain future. More work needs to be done to strengthen independent journalism, particularly in countries with a high incidence of disinformation.

    Session Report

    Workshop Report - IGF 2023 WS #209: Viewing Disinformation from a Global Governance Perspective

    Workshop process

    Part 1

    The workshop opened with the moderator asking participants to stand and gather along an imagined line on the floor in the room based on the extent to which they agreed or disagreed with the following statement: "Disinformation is undermining democratic political participation". The moderator then walked around the room and asked people to share their views and why they agreed/disagreed or stood somewhere in the middle. They were encouraged to shift their position if the discussion led to them rethinking their initial opinion.

    Views in the room covered a range of positions, but almost all participants stood in the area signifying agreement with the statement. Several offered examples from their countries and wider experience that they believed demonstrated a strong causal link between disinformation and democratic erosion. Two people, including one of the speakers, stood in an intermediate position and argued that a nuanced and contextualized approach is needed in examining cases, so a binary choice between “causing” and “not causing” was not adequate. One person stood in the area signifying that disinformation has no impact.

    The moderator also asked the panelists to share their perspectives, and, in doing so, to respond to the question: “What is disinformation, is it a serious problem, and if so, why (or why not, if you believe it is not a serious problem)?”

    Interactive discussion on this question between participants and the panelists continued for about 25 minutes. One of the panelists responded by asking which impacts of disinformation we care about. He also suggested that disinformation is an umbrella term that is too broad to serve as a basis for regulation. A person from the audience added that disinformation is not new and that every medium has been abused for purposes of propaganda. One panelist pointed out that there is a lack of empirical evidence about the impact of disinformation: most of what we know concerns the production and dissemination of disinformation, while its effect on people’s worldviews and voting behaviour is mostly taken for granted. Recent research suggests that disinformation amplifies extremist beliefs rather than instigating them. As a closing question, the moderator asked participants whether any of them lived in contexts where disinformation does not have a major impact. Two people responded that in their countries disinformation does not appear to be causing much harm, owing to the presence of a serious and legitimized mass media and other factors. A panelist concluded that high-quality journalism is the best way to combat disinformation.

    Part 2

    The second question put to the panel and the participants was: “Can disinformation be regulated internationally? How strong and clear a baseline do existing international instruments provide for the governance of disinformation? What are the implications for rights to access information and freedom of expression?”

    There was no common view on whether disinformation can be regulated internationally. Panelists doubted whether there can be one solution for all the different forms of disinformation. There was agreement on the need for a common set of principles to guide how we think of and act upon disinformation. Human rights, particularly Article 19, which protects freedom of expression and information, must be front and center of such principles.

    One speaker briefly flagged three examples of efforts to devise international Internet governance responses to disinformation. These included some problematic proposals for binding treaty commitments among governments that have been floated in the UN cybersecurity and cybercrime discussions; the European Union’s Code of Practice on Disinformation; and the UN Secretary General’s proposed Code of Conduct for Information Integrity on Digital Platforms. It was pointed out that while the first example involves efforts to devise constraints on state behavior that would never be agreed in geopolitically divided UN negotiations, the latter two involve codes of practice pertaining mostly to the providers and users of digital platforms. It was noted that while platforms certainly have responsibilities, focusing largely on them rather than on the governments that produce or support the production of much disinformation is a significant limitation. There are also open questions around the reliance on codes and guidelines that are variously interpreted and implemented at the national level.

    Part 3

    The next question was: “Concerning new governance initiatives, what sort of consultation and decision-making process is best suited to the governance of disinformation, and can the IGF assume a role in the process?”

    This provoked a very interesting discussion. Participants involved in the Christchurch Call shared how they put multistakeholder consultation at the centre of their efforts to combat online extremism. The key lessons they shared that are relevant to responding to disinformation were: (1) the multistakeholder approach has been critical for creating trust among the actors involved; (2) take time and form partnerships with the diverse actors involved; (3) keep the scope and focus tight; and (4) do not rush into regulatory intervention.

    Part 4 - Closing

    In closing, panelists offered their main take-aways, including things they did and did not want to see.  There were calls for better empirical research and evidence about the effects of disinformation; for more nuanced policy responses, including avoidance of governments using “moral panics” on disinformation to justify restrictions of human rights; for multistakeholder participation in crafting governance responses; and for hefty fines on Elon Musk’s X for violations of the EU’s rules.