IGF 2023 DCCOS Risk, opportunity and child safety in the age of AI

    Time
    Wednesday, 11th October, 2023 (04:00 UTC) - Wednesday, 11th October, 2023 (06:30 UTC)
    Room
    WS 11 – Room J
    DC

    Dynamic Coalition on Children's Rights in the Digital Environment

    Round Table - 90 Min

    Subtheme(s)

    Child Online Safety
    New Technologies and Risks to Online Security
    Online Hate Speech and Rights of Vulnerable People

    Description

    The overarching theme of IGF 2023 in Kyoto is The Internet We Want - Empowering All People. Our starting point for this discussion is clear: there can be no empowerment on the Internet without a foundation of safety, and the Internet we want, and need, is one where children's rights are protected. With AI and the Metaverse on the agenda of governments and increasingly embedded in the lives of digital platform users worldwide, tech legislation and safety policy are at a critical moment of transition. Different types and applications of these new and frontier technologies have the power to be transformative in positive ways, as well as potentially harmful in ways we can and cannot yet predict, including for children and young people. As a result, and as seen in other areas of technology, governments often find themselves playing catch-up, struggling to define the proper guardrails for their use across diverse environments from social media and online gaming to EdTech. Society will only be able to harness the benefits of the ongoing AI-based technological transition when proper safeguards are in place. We need to build a shared understanding of the risks and of how we can develop the right safeguards for children and young people. Nowhere has the misalignment between what is technically possible, socially acceptable and legally permissible been more evident than in the debate around generative AI models. Indeed, between the date of submitting this proposal and the delivery of the session in October 2023, the societal, legal and other debates around this are likely to undergo further rapid change.

    At the same time, there is a risk that conversations around AI as a ‘new’ or ‘complex’ landscape distract from the foundational safety issues that already put children in harm’s way in digital spaces that were not designed for them. For example, virtual worlds powered by AI and characterized by anonymity and open access will expand the opportunities for people to exploit children and other vulnerable groups, as already seen by law enforcement. For children, the psychological impact of abuse experienced in virtual worlds will present new and likely more intense forms of trauma. If a core goal of the Metaverse is to blur or remove the boundaries between physical and virtual realities, the differences between physical hands-on abuse and virtual abuse will vanish, with a hugely negative impact on victims and society at large.

    Either way, AI and the Metaverse, and the principles and models underpinning them, are environments in which child protection must be addressed in a holistic, rights-based and evidence-informed way. This will inform safety policy and awareness among children and parents, and help ensure global alignment between regulation for safety and regulation for AI, avoiding fragmentation and inefficiencies in our collective response. This is also based on General Comment No. 25 (2021) on children’s rights in relation to the digital environment,[1] which obliges States parties to protect children in digital environments from all forms of exploitation and abuse.

    This session will:

    1. Discuss whether and how different approaches to regulation are needed for different digital spaces such as social media, online gaming, communications platforms and EdTech.
    2. Discuss how existing safety nets and messaging meet the needs, aspirations and challenges voiced by children, young people and parents around the world.
    3. Address the following policy questions: How do you design robust and sustainable child safety policy in a rapidly changing tech landscape? How do you create meaningful dialogue around the design, implementation and accountability of AI towards children and society?

    Goals / outputs of the session:

    1. Identify the main impacts of AI and new technologies on children globally.
    2. Understand young people’s own perceptions of risks in virtual worlds.
    3. Create the basis for DC principles of AI regulation for child safety.
    4. Initiate DC messaging for parents to support their children in the digital space.
    5. Co-construct DC guidelines for modern, child rights-oriented child and youth protection in the digital environment.

    [1] https://www.ohchr.org/en/documents/general-comments-and-recommendations…

    The session will be run as a roundtable, with speakers to guide the key topics of conversation but an inclusive approach to discussion and idea-sharing. The online moderator will ensure a voice for those attending online, and the use of online polls and other techniques will ensure an effective hybrid experience. To answer the questions directly: 1) We will facilitate interaction between onsite and online speakers and attendees by inviting comments from both groups and bringing questions or comments from online attendees into the room. 2) We have designed the session as a roundtable to ensure that both online and onsite participants have the chance to have their voices heard. 3) We aim to use online surveys/polls to ensure an interactive session.

    Organizers

    Amy Crocker, ECPAT International, Civil Society, Asia Pacific (onsite moderator)
    Jim Prendergast, The Galway Strategy Group, Civil Society, WEOG (online moderator)
    Jutta Croll, Digital Opportunities Foundation, Civil Society, WEOG (rapporteur)
    Jennifer Chung, DotKids Foundation, Private/Civil Society, Asia Pacific

    Speakers

    1. Liz Thomas, Microsoft, Private sector, WEOG (onsite)
    2. Sophie Pohle, Deutsches Kinderhilfswerk e.V., Civil Society, WEOG
    3. Katsuhiko Takeda, ChildFund Japan, Civil Society, Asia Pacific
    4. Jenna Fung, Asia Pacific Youth IGF, Civil Society, Asia Pacific
    5. Patrick Burton, Centre for Justice and Crime Prevention, Civil Society, Africa

    Onsite Moderator

    Amy Crocker, ECPAT International, Civil Society, Asia Pacific

    Online Moderator

    Jim Prendergast, The Galway Strategy Group, Civil Society, WEOG

    Rapporteur

    Jutta Croll, Digital Opportunities Foundation, Civil Society, WEOG

    SDGs

    5.2
    16.2

    Targets: Target 5.2 - eliminate all forms of violence against all women and girls in public and private spheres: Violence against children, including its online manifestations, is a gendered phenomenon. While girls are affected in high numbers, addressing the dynamics of violence against and exploitation of boys and children of other genders and sexual identities is also a priority if we are to have a sustainable impact on violence and sexual exploitation. Child safety in digital environments will become increasingly complex in the face of AI and the Metaverse, for multiple reasons. This session seeks to identify and unpack some of these challenges.

    Target 16.2 - end abuse, exploitation, trafficking and all forms of violence and torture against children: We are living through an unprecedented period of technological change, and one of its most nefarious consequences is the increase in the reach and impact of child sexual exploitation and abuse. New and frontier technologies threaten and promise to change our world, yet there can be no sustainable, positive online world without safety, in particular for children. This session seeks to identify and understand the risks that children may face from AI and the Metaverse, and to propose concrete solutions that can help mitigate risk, prevent harm, and harness the positive benefits of the online world for children and society around the world.

    Key Takeaways

    • Children have the right to safe, inclusive and age-appropriate digital spaces; these must be in line with the evolving capacities of each child.

    • To create such a digital environment for children, we need a broad focus that takes into account, as much as possible, the perspectives of children, parents, educators, the private sector, researchers, policymakers and civil society.

    Call to Action

    • We call upon all IGF stakeholders and members of the DC Child Rights to engage their communities, including children and youth, in ensuring a child rights-based approach in their work.

    • We call on the IGF community to join the DC Child Rights to continue and enhance the dialogue initiated at IGF Kyoto 2023.

    Session Report

    Session Report 2023: Risk, opportunity and child safety in the age of AI

    Dynamic Coalition on Children's Rights in the Digital Environment

    Key Issues raised

    1. Obstacles to Child Rights in Practice: Ensuring child rights – as outlined in General Comment No. 25 (UNCRC) – requires a constant balancing exercise based on the best interests of the child. Regulation often prioritizes certain rights over others, such as protection over empowerment, and socio-cultural differences exist; research, including with children, is essential to assess and mitigate risk.
    2. Lack of knowledge about the impact of technology on society: There is an urgent need for improved understanding by children and adults in all settings of the risks and applications of AI and other advanced technologies in their lives.
    3. Regulation, corporate responsibility and diverse consultation: Regulation and codes of practice are urgently needed to ensure action against illegal content, conduct and contact such as child sexual abuse material; this needs to come alongside evidence-informed commitments and transparent processes.

    Key Intervention themes

    1. The views and evolving needs of children: Deutsches Kinderhilfswerk highlighted the results of meta-research focused on children aged 9-13. Reflecting a key principle of GC 25, younger children need a strong social support system, while older children are more comfortable using tools such as blocking and reporting but still need information on how to report and block people and how to protect themselves from uncomfortable or risky interactions.
    2. Public Awareness: ChildFund Japan presented a recent public opinion survey of people aged 15-79 in Japan. The results highlighted the internal conflict between human rights, especially child rights, and freedom of expression. Asked about computer/AI-generated CSAM, some respondents considered it acceptable as a method to prevent ‘real’ crime, demonstrating a lack of understanding of the multifaceted harm of CSAM. 20% of respondents said they did not know about the risks of AI, or did not even understand AI, which points to a need for much better education and awareness.
    3. Participation and Trust: The APAC Youth IGF perspective called for more youth voices in internet governance spaces. Even young adults cannot fully advocate for younger generations, who face more complex challenges, risks and harms than previous generations did. It is natural for children to turn to parents and caregivers, but many adults do not grasp the complex nuances of the risks. Fostering trust and ensuring fundamental digital knowledge are essential steps in creating a reliable and ethically responsible digital environment for younger generations.
    4. Investment in prevention alongside regulation: The CJCP highlighted that relying on children’s rights, even as they are contained in the CRC and in General Comment 25, is complex today because those rights are open to interpretation. Often, emphasis is placed on specific rights rather than an equitable embrace of all child rights. Regulation is essential but will not, on its own, resolve the challenges presented by emerging technologies, immersive spaces and AI. States must put proportionate investment into prevention: education and awareness-raising.
    5. Risk Assessment that embraces diversity to inform design: Microsoft highlighted the necessary balance between regulation and outcomes-based codes, which can offer different services more flexibility to navigate inevitable trade-offs and achieve the desired safety and privacy outcomes. Risk assessment and needs analysis also mean improving ways to understand impact through an age-appropriate and gender lens, among others. This requires greater consultation with children themselves to inform debates such as those around age assurance.

    Participatory Discussions

    • Children’s Rights in Policy and Practice: Children have the right to participate in a digital environment and to be protected when using digital services. The collection of data by services, the sharing of information about children without their consent by parents or others, and the risks of interacting with peers can all affect children's well-being and influence their use of digital services. Children need (more) media literacy, and parents, educators and other adults should be aware of children's rights and also acquire knowledge about the digital environment. Platforms play a central role, and we need to resolve the tension of providing age-appropriate, safe services for children without knowing the age of a service's users.
    • Regulation: Regulation takes a long time, which we do not have in this fast-changing landscape. All sectors need to be aligned on what child rights are, and tech companies must make transparent commitments based on risk assessment and mitigation, differentiated by service and product and grounded in safety-by-design principles and practice. Regulation is complex, and there are particular challenges in taking into account the individuality of each child and their developing capacities. It should not be forgotten that existing regulation, and the way it is implemented, can also lead to further disparities. Regardless, it seems appropriate to hold service providers accountable and to give them guidance on providing safe services for young people that uphold the rights of the child.
    • Geographical differences: Geography also has a deep impact on children’s rights. Participants from Brazil, Bangladesh and Afghanistan highlighted the challenges these countries face with technology, access and capacity building. International standards exist, but they are not applied equally: evidence shows that some children, especially in the Global South, have a much lower level of protection than those in the Global North. Young people themselves call for clear definitions and scope around online safety to frame the conversation equally. Furthermore, the evolving capacities of children are largely shaped by the contexts in which they live.
    • Research: More research is needed on children’s experiences and resilience. This will enable solutions that meet children's best interests and guard against legislation and policy that are out of step with reality. Safe spaces where children can implement their own ideas, unaffected by product guidelines or market-driven interests, are currently created by civil society organisations, communities and families. We therefore need research to inform policy and practice on safe social spaces, e.g. in the Metaverse.


    Participant suggestions regarding the way forward/ potential next steps

    • Use the IGF platform to help align, localize and make approaches inclusive: To create safe digital environments for children and young people, it is essential to understand how risks are regulated in different parts of the world. This also requires caution against transplanting legislative or policy approaches from one location to another without proper analysis of local context and culture. It is also important not to underestimate different needs, for example around gender and disability; doing so may result in services that fail to prioritize the best interests of every child. To bring children's rights and interests more strongly into discussions about the design and regulation of the internet, exchange and cooperation with other Dynamic Coalitions should be expanded. In parallel, the existing DC Child Rights should intensify its work to facilitate child rights debate within and across the IGF community, also serving as a reference point for others.
    • Improve public understanding of, and participation in, debates around technology and society: Many people do not understand how existing and new technologies such as Artificial Intelligence work. Improved education and awareness are needed for both children and adults. Governments, civil society and tech companies must do much more to bring children and young people into the center of these debates in a meaningful and outcome-driven way. Their generation-specific experiences will ensure a sustainable approach to tech policy and internet governance at this pivotal moment.
    • Consider clearer guidance on what the balance of rights, as well as trust and transparency, looks like for different people in the digital world: There is an urgent need to build and improve trust - trust in algorithms, in institutions, in companies, and in each other, whether youth peers or adults - to address the key challenges presented by today’s digital world. Trust must be built on transparency of design, decision-making, processes and action, which can enable informed public debate.