IGF 2019 Platform Values: Conflicting Rights, AI and Tax Avoidance

    DC

    Dynamic Coalition on Platform Responsibility

    Debate - Auditorium - 90 Min

    Description

    Platform regulations are having an enormous impact on the lives of several billion individuals, and this impact is poised to increase over the next decade. This session discusses three of the most crucial points of contention with regard to the values underlying the operation of digital platforms: the dispute resolution mechanisms they design and the ways such mechanisms are structured to deal with conflicting rights and principles; the values that can or should be baked into platforms’ automated decision-making and the rights of appropriation in relation to the development of artificial intelligence systems; and the tax avoidance strategies that are frequently pursued by tech giants to minimise their fiscal responsibility across the multiple jurisdictions in which they provide their services.

    This session will include presentations based on the papers featured in the special issue of the Computer Law & Security Review, celebrating five years of activities of the UN IGF Coalition on Platform Responsibility and dedicated to Platform Values: Conflicting Rights, Artificial Intelligence and Tax Avoidance. The special issue, which is the 2019 official outcome of the coalition, will also include the finalised Best Practices on Platforms' Implementation of the Right to an Effective Remedy, produced by the Coalition between May 2018 and March 2019.

    Over its first years of activity, the Coalition has explored the role of digital platforms as gateways to speech, innovation and value creation; it has highlighted that their ascendance as central elements of our society, economy and public sphere is redefining the concepts of “private” and “public”, and challenging conventional approaches to regulation and governance. Along those lines, the special issue starts from the consideration that, to guarantee the balance and sustainability of governance systems, the exercise of power should be constrained. To do so, a deliberative process over the aims, mechanisms and boundaries of regulation is needed. Accordingly, when private entities rise to the level of quasi-sovereigns or private regulators, it is natural to expect discussion, shared understanding and scrutiny of the choices and trade-offs embedded in their private ordering. In this perspective, the papers featured in the special issue provide analyses and put forward concrete solutions and policy proposals with regard to platform values. The call for papers was therefore aimed at contributions analysing conflicting rights, artificial intelligence systems and tax avoidance strategies with regard to digital platforms.

     The IGF session will have the following agenda:

    •          Introduction and presentation of the coalition and its work
    •          Brief presentation of the Best Practices 
    •          Presentations of the contributions to the special issue
    •          Discussion of next steps and priorities for the coalition
    Organizers

    Nicolo Zingales, Sussex University

    Speakers

     Opening remarks by Nicolo Zingales, University of Leeds, and Luca Belli, FGV 

    Part I- Platform Values, Freedom of Expression and Democracy

    • Keynote by Edison Lanza, Special Rapporteur for Freedom of Expression, Organization of American States
    • Nic Suzor, Queensland University of Technology
    • Monica Rosina, Facebook 

    Quick round of questions

    Part II: Platform values and content moderation

    • Chris Marsden, University of Sussex
    • Ivar Hartmann, FGV 
    • Giovanni De Gregorio, Università di Milano-Bicocca
    • Dragana Obradovic, Balkan Investigative Reporting Network 

    Quick round of questions

    Part III: Conflicting rights and values

    • Catherine Carnovale, Elsevier
    • Rolf H. Weber, University of Zurich
    • Catalina Goanta, Maastricht University
    • Yseult Marique, University of Essex

    • Open Debate

    SDGs

    GOAL 12: Responsible Consumption and Production

    1. Key Policy Questions and Expectations

    Is there a common understanding of the types of values that ought to be promoted by platform regulations?

    What are some of the best strategies to ensure that digital platforms aim to maximise not only shareholder value but also the values of the broader set of stakeholders affected by platform regulations?

    To what extent are platforms the best-placed entities to identify which rights should be privileged when regulating social and economic interactions, and how should balancing between conflicting rights and values be conducted?   

     

    2. Summary of Issues Discussed

    There was broad concern about freedom of expression on social platforms over the coming years. Some participants observed that companies should be called on to implement the principles on business and human rights. In particular, regulators could create procedural norms, such as due process requirements for content removal and greater transparency of algorithms and data applications. However, it was pointed out, on the one hand, that regulators are regulating without understanding the subject and, on the other hand, that it is a great worry that platforms are becoming private regulators. The fact that platforms can censor more content than the law requires was highlighted as a major concern.

    3. Policy Recommendations or Suggestions for the Way Forward

    The debate on content moderation should be less about the content that companies fail to remove and more about legal speech that platforms unlawfully censor. State regulation should focus less on individual instances of speech and more on the protection of procedural rules. Platform liability, in this sense, should attach more to failures of overarching procedure and architecture and less to individual instances of speech. Legislators and regulators should work on basic procedural rules. Decisions on the merits of specific instances of speech should be made less by platforms and more by their users. The discussion must be about user empowerment.

    4. Other Initiatives Addressing the Session Issues

    Participants mentioned several initiatives already underway and others that need attention, including the Santa Clara Principles, the elaboration of a general Digital Services Act, and the Facebook Oversight Board. However, there is an urgent need for companies to cooperate on transparency and to ensure freedom of speech in line with the public interest.

    5. Making Progress for Tackled Issues
    1. Disinformation is not about to go away, but it can be targeted with measures such as fact-checking.
    2. A simple handbook for regulators should be produced to inform them about what this debate is about.
    3. It is necessary to define the term "platform", as it is essential for establishing liability.
    6. Estimated Participation

    Onsite and online: 80 onsite and 119 online

    Women onsite: 40 

    7. Reflection to Gender Issues

    Women are, in general, a primary target of harassment and hate speech. Platforms could take this into account and cooperate on measures to promote greater equity in online participation.