IGF 2020 WS #279 Digital Due Diligence: Tech Companies and Human Rights

    Subtheme

    Organizer 1: Chanu Peiris, Chatham House
    Organizer 2: Harriet Moynihan, Chatham House
    Organizer 3: Madeleine Forster, Chatham House
    Organizer 4: Marjorie Buscher, Chatham House

    Speaker 1: Harriet Moynihan, Civil Society, Western European and Others Group (WEOG)
    Speaker 2: Kate Jones, Civil Society, Western European and Others Group (WEOG)
    Speaker 3: Thiago Alves Pinto, Civil Society, Western European and Others Group (WEOG)

    Moderator

    Harriet Moynihan, Civil Society, Western European and Others Group (WEOG)

    Online Moderator

    Chanu Peiris, Civil Society, Western European and Others Group (WEOG)

    Rapporteur

    Madeleine Forster, Civil Society, Western European and Others Group (WEOG)

    Format

    Panel - Auditorium - 90 Min

    Policy Question(s)

    What role can States play in encouraging tech companies to uphold the UN Guiding Principles through the conduct of due diligence, and to what extent do actions taken by tech companies on their own initiative meet the global standard of expected conduct for companies?

    Regulators have struggled to keep pace with rapidly evolving technologies and online practices. Companies have attempted to address some of these governance gaps through community guidelines and rules. However, these standards are not necessarily compatible with international human rights law, and in some cases risk undermining the international human rights law framework by establishing parallel processes that fall short of universally recognised international standards developed over decades.

    The UN Guiding Principles on Business and Human Rights provide a framework for informing, engaging with, and holding to account dominant tech companies in relation to their responsibilities to individuals under international human rights law. Due diligence responsibilities under the UN Guiding Principles require companies to take preventative action, including conducting human rights impact assessments, and to introduce greater transparency about their procedures, for example in relation to the use of algorithms to boost particular content over other content. The international human rights law framework also acts as a lever to increase the accountability of dominant tech companies for their actions, including the right of users to seek a remedy where they suffer harm.

    The challenge is how to get tech companies to use the international human rights law framework to inform their processes in a meaningful way (for example, by carrying out human rights impact assessments in advance, and by identifying and mitigating risks to users at the product design stage). This panel offers an opportunity to increase awareness among the tech community of their human rights due diligence responsibilities and to explore measures to improve standards on procedures and processes, thereby increasing transparency and accountability.

    SDGs

    GOAL 9: Industry, Innovation and Infrastructure
    GOAL 16: Peace, Justice and Strong Institutions

    Description:

    Content and agenda: This session will explore two issues: (i) how the UN Guiding Principles on Business and Human Rights (UNGPs) can apply in practice to the policies and procedures of tech companies, with particular reference to dominant social media companies that wield great power over users; and (ii) the role that States can play in encouraging tech companies to uphold the UNGPs.

    In relation to (i), discussion will examine the extent to which the policies and procedures of tech companies such as Facebook, Google and Twitter currently meet the 'gold standard' set out in the UNGPs, and how they can be improved in order to do so, with reference to examples from other sectors including the extractive and garment industries. This includes UN Guiding Principle 18, which underlines that the purpose of human rights due diligence is 'to understand the specific impacts on specific people, given a specific context of operations' and requires (tech) companies to pay special attention to vulnerable groups, and UN Guiding Principle 21, which requires (tech) companies to 'both know and show that they respect human rights in practice', in particular 'by providing a measure of transparency and accountability to individuals or groups who may be impacted and to other relevant stakeholders'.

    In relation to (ii), discussion will cover the role that States can play in mandating or encouraging companies within their jurisdiction to carry out 'digital due diligence', i.e. due process in assessing and mitigating human rights risks and providing transparency and access to remedy, including with reference to recent legislation on mandatory due diligence in France and the UK, and to proposals in other European countries.
    This methodology supports the practical outcome of increasing attendees' knowledge of what human rights due diligence entails in practice in the tech sector, and of how the processes of tech companies (including on the use of algorithms and data) can be improved by benchmarking against the standards in the UNGPs. Discussion will be facilitated by the moderator, who will agree the perspective sought from each speaker in advance of the panel. Discussion between panellists will be limited to 30 minutes to allow 60 minutes for questions from the audience.

    Expected Outcomes

    The outputs from this panel will contribute to ongoing research projects on this subject matter, including one run by the B-Tech team at the UN's OHCHR. They will also help to shape the direction of a new Chatham House project and research paper on due diligence in the digital sphere. We hope the discussion encourages tech companies attending IGF and participating online to examine and explain the extent to which their processes and procedures conform to international human rights standards on due diligence.

    Dissemination of event: Chatham House will publicise the event through its website, social media and direct email to its networks to generate an audience for the panel. Design of the panel: We will provide 60 minutes for Q&A following presentations from each of the speakers. Questions will be taken from women and men in equal measure and the moderator will be directed to encourage participation from as broad a group as possible.

    Relevance to Internet Governance: Internet governance must ensure the protection of all fundamental rights and freedoms (see the Council of Europe’s ‘Declaration on Guiding Principles on Internet Governance’ and the Human Rights Council’s resolution on ‘the Promotion, Protection and Enjoyment of Human Rights on the Internet’). Given that the Internet’s effects on expression, association, information and privacy evolve over time, periodic reassessment of technologies and compliance mechanisms against human rights standards is essential. In this panel, we aim to provide concrete recommendations, with reference to case studies, on developing procedures for human rights due diligence in the tech sector.

    Relevance to Theme: The exponential growth in the gathering and use of personal data has resulted in a digital trust deficit. Redressing this deficit requires public confidence in the adequacy of the safeguards established by tech companies and States. International human rights law provides a well-tested framework for assessing the robustness and balance of any such safeguards.

    Online Participation


    Usage of IGF Official Tool. Additional Tools proposed: Chatham House has extensive networks and would like to livestream this session on its website. We intend to distribute notifications of the livestream and/or videos of the panel discussion to our network of over 100,000 stakeholders through social media and through direct email campaigns.