IGF 2020 WS #304 Reaffirming human rights in company responses to crisis

    Time
    Tuesday, 17th November, 2020 (10:10 UTC) - Tuesday, 17th November, 2020 (11:40 UTC)
    Room
    Room 1
    About this Session
    Even in normal times, companies are often erratic, inconsistent, and opaque about how they implement international human rights standards. Abnormal times amplify such challenges. This workshop will highlight trends in companies' responses to calamity, drawing from both COVID-19 and other emergencies in three areas: governance, freedom of expression/information, and privacy. Together, we will identify practices to be followed and avoided in future crises.
    Organizer 1: Jan Rydzak, Ranking Digital Rights
    Organizer 2: Amy Brouillette, Ranking Digital Rights
    Organizer 3: Jessica Dheere, Ranking Digital Rights
    Organizer 4: Rebecca MacKinnon, Ranking Digital Rights
    Organizer 5: Veszna Wessenauer, Ranking Digital Rights

    Speaker 1: Katarzyna Szymielewicz, Civil Society, Eastern European Group
    Speaker 2: Lene Wendland, Intergovernmental Organization, Intergovernmental Organization
    Speaker 3: 'Gbenga Sesan, Civil Society, African Group

    Additional Speakers

    1. Katarzyna Szymielewicz will be replaced by Dorota Głowacka (Panoptykon Foundation).

    2. Lene Wendland will be replaced by Isabel Ebert (B-Tech Project, OHCHR).


    Moderator

    Jan Rydzak, Civil Society, Eastern European Group

    Online Moderator

    Veszna Wessenauer, Civil Society, Eastern European Group

    Rapporteur

    Jan Rydzak, Civil Society, Eastern European Group

    Format

    Break-out Group Discussions - Flexible Seating - 90 Min

    Policy Question(s)

    1. How should companies effectively apply international human rights standards in times of crisis and potential threats to public health or safety, and how should they adapt their transparency practices to the emergency?

    2. How have companies responded to large-scale crises in the areas of governance (e.g., human rights due diligence), freedom of expression and information (e.g., harmful content, network shutdowns), and privacy (e.g., data collection, inference, and sharing)?

    3. What lessons can we glean from companies' responses to both public health emergencies and other crises, and what kind of consultation and information sharing is needed to raise awareness of good and bad practices in the session's areas of focus?

    How should companies apply international human rights standards in times of crisis? Even under normal circumstances, technology companies are inconsistent in their application of human rights standards to their operations. During major crises, the threat of inadvertent human rights violations is especially high, particularly when resources are strained and clear guidance is lacking. At the same time, the problems facing the users of those companies’ services take on new features in such situations. Misinformation flows revolve around a new set of themes; predatory advertising finds new targets; information is disrupted through deliberate network shutdowns; and personal data is aggregated and processed without appropriate due diligence mechanisms that would prevent downstream harms.

    These challenges have been a fixture of localized disasters and political crises that affect communication networks, but they are more salient than ever in the face of a global crisis. This presents a major opportunity to (a) survey how tech companies and telcos have responded to crises and (b) identify practices that follow international human rights standards as well as those that fall short.

    SDGs

    GOAL 3: Good Health and Well-Being
    GOAL 9: Industry, Innovation and Infrastructure
    GOAL 10: Reduced Inequalities
    GOAL 16: Peace, Justice and Strong Institutions
    GOAL 17: Partnerships for the Goals

    Description:

    The COVID-19 pandemic has generated new questions around human rights in crisis. A central theme of these questions is what standards of transparency and accountability technology companies should follow, both in adjusting their services to a global calamity and in responding to extraordinary requests to share user data. Such large-scale emergencies are the true test of companies’ responsibility to respect human rights without causing or contributing to harms, in line with the UN Guiding Principles on Business and Human Rights.

    The goal of this workshop is to clarify the ways in which international human rights standards should apply to technology companies in times of crisis. The UN Guiding Principles do not provide specific guidance that companies should follow in the midst of a sudden or protracted emergency, or clear ways in which such guidance should be implemented. As a result, company responses to COVID-19 and other crises have varied widely at both the national and international levels, and companies’ disclosure of how they enforce their own rules has weakened. The frequency of large-scale emergencies – of which the coronavirus pandemic is only the most far-reaching manifestation – creates a pressing need to remedy this gap.

    The session will be structured into three parts. In Part 1, the speakers will present a panorama of company responses to the COVID-19 crisis and other emergencies based on policy research. The three central axes of this overview will be company governance, freedom of expression and information, and privacy. The overview will be based on the speakers' areas of expertise in these three areas, Ranking Digital Rights’ research and policy tracking on the topic, and previous insight from large-scale public safety emergencies. In Part 2, participants will break out into three randomly assigned groups (Governance, Freedom of Expression and Information, and Privacy) to discuss major trends that have emerged among companies and identify ideal-case practices in their area, with grounding in international human rights principles. In Part 3, the rapporteurs from each breakout group will report back on the key challenges and recommendations identified in their group and open the floor to cross-examination by the other groups.

    Expected Outcomes

    1. Draft framework for human rights-based responses to large-scale crises by technology companies, including key categories of response (e.g., data collection, use, sharing, inference, and retention; content moderation; network shutdowns).

    2. Draft evaluation criteria for companies’ responses to crisis.

    3. Collaborative report on corporate responses across telecommunications companies and digital platforms.

    After the initial conversation, the session will be split into three breakout groups with three focal points: Governance (including human rights due diligence), Freedom of Expression, and Privacy. Each breakout will be moderated by one of the organizers. At the start of the breakout segment, participants will be encouraged to bring up use cases of companies that exemplify both responsible and flawed responses. The group will use these as touchstones for subtopics that will subsequently be discussed in the breakout (e.g., expanding Privacy into data collection, inference, use, sharing, and retention). In the final segment, the plenary will re-open for a “cross-examination” in which each group will summarize the ways in which the items they discussed can be applied to their category, while the other two groups will be tasked with finding gaps in those conclusions.

    Relevance to Internet Governance: This workshop aims to clarify and highlight shared principles and norms, grounded in international human rights standards, that should underlie companies’ activities in times of extreme uncertainty. The private sector has previously participated in similar discussions on standards in isolated contexts where large populations were abruptly exposed to extreme risk, such as during natural disasters and network shutdowns. However, this has not translated into any outcomes resembling a shared set of norms or evaluation standards beyond efforts by individual organizations. Thus, the internet governance community still has no definitive answers to questions such as how human rights due diligence should be adapted to crisis situations, what level and type of pushback is appropriate for overreaching data requests by government actors that are motivated by public safety concerns, and how each aspect of the data collection pipeline (including aggregation and inference) should or should not occur under exceptional circumstances. Preliminary responses to these questions will lay the groundwork for the Internet’s responses to future crises and provide a blueprint for smaller tech companies, in line with existing international human rights standards.

    Relevance to Theme: This session addresses the policies and practices that form the cornerstone of Track IV. It tackles the roles and responsibilities of technology companies as either core enablers of international human rights or exacerbators of human rights violations, roles that are especially salient during times of crisis. At such times, the likelihood and frequency of human rights blind spots increase, opening billions of users up to exploitation and ultimately eroding trust on all fronts. The lack of an established framework or decision-making process to help steer companies’ decisions can lead to unbridled collection of user data, unaccountable data sharing agreements that lack sunset clauses, and haphazard content moderation practices building on algorithmic systems that are typically opaque in the first place.

    Secondly, Track IV emphasizes the relationship between security and people’s fundamental freedoms and rights. Such trade-offs permeate companies’ responses to various degrees during large-scale crises. Major disruptions such as the COVID-19 crisis can also shatter existing collaboration to protect human rights, such as social media platforms’ partnerships with fact-checkers. This requires additional transparency and accountability as well as clarifications on how the existing standards that companies follow should be applied in a new reality.

    Online Participation

    Usage of IGF Official Tool.


    Agenda

    1. Kickoff and speaker introductions (10 min)
    2. Speaker presentations (15 min)
    3. Moderated discussion (20 min)
    4. Breakout groups (Governance, Freedom of Expression and Information, Privacy) (25 min)
    Key questions:
    - What trends in crisis response have emerged among tech companies and telcos in the topic under discussion? How are they (un)aligned with international human rights standards?
    - What specific practices do you want to see companies implement in this domain?

    5. Report back from breakouts and cross-group discussion (20 min)
    6. Wrap-up (4 min)

    1. Key Policy Questions and related issues
    How do companies fulfill or fall short of their responsibility to protect and respect human rights in times of crisis, and what best practice should they follow?
    What existing or potential mechanisms can the multi-stakeholder human rights community apply to improve companies' transparency and accountability during such crises?
    What policies and practices in the areas of governance, respect for freedom of expression, and privacy are critical for companies to safeguard human rights in crisis situations, and how can the crystallization of bad precedents be prevented?
    2. Summary of Issues Discussed

    Areas of broad agreement included:

    1. There is broad consensus on the importance of maintaining strong human rights due diligence practices in the midst of crises and in the build-up to crises. Participants also expressed robust support for mandatory due diligence requirements on a regional level, following regulation emerging from the European Union. Such efforts provide more tools and leverage to human rights defenders and affected rightsholders.
    2. When receiving excessive demands for user information or content removal, companies must not only exercise due diligence, but commit to pushing back to the maximum extent possible, guided by the principles of legality, necessity, and proportionality.
    3. Crises enable contagion effects: practices normalized during a crisis risk spilling over into non-crisis contexts. For instance, data collection that goes beyond the data minimization principle can set a precedent for similar practices once the emergency is over, and network shutdowns ostensibly carried out to counter violent protest can spill over into other situations.
    4. Crisis protocols should encompass a range of scenarios at all geographical levels.

    Areas that require more discussion included:

    1. Questions remain about the best way to implement, prioritize, and report on human rights due diligence efforts in an environment disrupted by conflict and volatility, especially when the capacity to conduct due diligence is limited and there are numerous salient risks to rightsholders. 
    2. Transparency reporting and access to remedy are two perennial issues. How should such reporting be conducted in light of global and local crises? How can companies strengthen their structures to better provide remedy?
    3. While most participants agreed that it is important to set limits on privacy-related practices such as data collection, inference, and retention, many broader questions remain about topics such as algorithmic transparency and how to ensure it (e.g., through disclosing source code, standalone policies, or audits).
    3. Key Takeaways

    In the context of COVID-19, it is critical for companies to outline plans for limiting any collection, inference, and retention of data that goes beyond their usual practices to the crisis context, and to precisely define the actions they will take once the crisis is over. As a general rule, companies should also commit to pushing back against excessive government requests - not just in isolated cases, but as a matter of policy. This should feed into their transparency reporting practices, which should follow the principle of reporting on such requests unless companies are legally barred from doing so, and refrain from presenting data in ways that obfuscate its meaning.

    Decision-makers at all levels must also understand that crisis breeds contagion. Both within and across countries, the spread of network shutdowns has been driven by governments learning from each other, adapting extreme restrictions to a plethora of circumstances in which these restrictions were a disproportionate measure. Similarly, the abrogation of the right to privacy in the context of a pandemic will likely encourage abuses in other contexts. Governments should refrain from redeploying tools that give them access to vast troves of user data in other situations. At the same time, companies should avoid becoming enablers of human rights violations and refuse to buckle when facing threats to block their services if they do not comply with excessive demands.

    Social media companies should be particularly vigilant and introspective given that human rights harms they may cause or contribute to can be insidious and difficult to trace. For instance, their human rights impact assessment processes should explicitly encompass the interaction of their algorithms (including, but not limited to, ranking and recommendation algorithms) with the environment in which they are deployed rather than focus exclusively on the risks of the environment itself.

    6. Final Speakers

    Speaker 1: Dorota Głowacka (Panoptykon Foundation), Civil Society, Eastern European Group
    Speaker 2: Isabel Ebert (OHCHR B-Tech Project), Intergovernmental Organization, Intergovernmental Organization
    Speaker 3: 'Gbenga Sesan (Paradigm Initiative), Civil Society, African Group

    7. Reflection to Gender Issues

    Although the session did not directly discuss gender issues, women are disproportionately affected by many of the crises discussed. Instability amid political turmoil and network disruptions erects additional barriers to online education and other services that have proven vital during the pandemic. Disruptions associated with these periods of crisis have a disproportionate effect on women and slow down efforts to reduce gender gaps, driving communities that are often already vulnerable into even greater vulnerability. This underscores the importance of companies pushing back against deliberate disruptions to access in times of overlapping emergencies.

    8. Session Outputs

    While it remains to be seen whether the organizers and speakers will produce joint outputs from the session, all of them will continue to publish relevant research and advocacy work. Ranking Digital Rights plans to discuss the relationship between company policies and crises in the 2020 RDR Index, to be published in February 2021. RDR’s 2019-20 methodology development process and the final methodology for the 2020 RDR Index can be found here: https://rankingdigitalrights.org/methodology-development/2020-revisions/. The Paradigm Initiative’s 2020 Digital Rights in Africa report will be released in April 2021. The Panoptykon Foundation and B-Tech Project will continue to release publications that engage with the topics discussed, ranging from privacy to access to remedy. Recent and forthcoming publications can be found on their respective websites.

    9. Group Photo
    WS #304 session screenshot (non-organizers and non-speakers blurred)