The following are the outputs of the real-time captioning taken during the Twelfth Annual Meeting of the Internet Governance Forum (IGF) in Geneva, Switzerland, from 17 to 21 December 2017. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record.
***
>>MARIE‑LAURE LEMINEU: I think we're going to start in one minute, if you want to take your seats. It's already 4:13. Yeah, yeah, 5:10. Do we have enough chairs?
So good afternoon. This is a packed table, but I think we can manage. My name is ‑‑ can we start? Is someone recording? Is there any technician around? I'm not sure. I think we are actually looking at what you're looking at on the screen now, John. Are you reading anything that you want to hide from us? No? Is it shareable?
Aren't we supposed to have some kind of transcript or something on the screen? No? Not in this room, okay. I'll start speaking and we'll see what happens. I hope it's recorded.
Can you check if there is? Because we have a live webcast, maybe to see if it's ‑‑ so that means it's recorded. Okay. Thank you. Excellent. So I can smile. We are being recorded. So I work for ECPAT International. This is an NGO, a network of NGOs. We have 103 members in 93 countries and we're combatting all forms of sexual exploitation, one of them being the sexual exploitation of children online. And we are chairing the Dynamic Coalition on Child Online Safety. We have over 40 members, all of them children's rights organizations.
I would like to thank you for attending the session today. The topic bringing us here today ‑‑ I find it very interesting because we actually wrote the proposal; but I think it's a challenging one, and I hope we can manage to have a very interactive session and discuss interesting points. It's the moderation of illegal content online and the welfare of the staff doing that work, and we will discuss a little bit the responsibility of the actors involved in this matter.
So we have different panelists today. Some of them you might know. We have with us John Carr, who is actually on my left-hand side, over there. John wears many hats. Some of you might know him. But today he's speaking on behalf of ECPAT International. Then we have Larry on my right-hand side. Larry also has a long CV and wears many hats. Today he's speaking in his capacity as a member of the safety advisory boards of Facebook, Twitter, Snapchat and, to some extent, Google.
Then we have Marco. He's the director of public policy for Google. Then we have Karuna Nain, head of the global safety policy program for Facebook. And last but not least we have Michael Tunks, who is a policy manager for the IWF, the Internet Watch Foundation, which is the UK reporting mechanism for illegal content.
So what we'll do today is that John is going to frame the topic a little bit, raise some questions, make some comments. Then we'll pass on the mic. The speaking order will be like this for the first round. Then we pass on the mic. Sorry, I just realized I forgot to introduce our remote moderator. I'm so sorry. David is helping us. He's from the eHelp Association in Hong Kong. He has been involved with the IGF and ICANN for a long time, and he's helping us. Thank you very much. He's our remote moderator.
So after John frames the debate, we will pass the mic to Michael so that he can explain to us a little bit what the welfare program is, and then we'll continue with the other speakers. We want to keep the session very interactive, so at any moment you want to ask a question or react to something that has been said after the first round, please feel free to intervene and raise your hand. It's not that structured, on purpose. It's more lively and interesting for all of us.
So John, over to you. Oh, I see the transcript now. Thank you.
>>PANELIST: So we got interested in this as an organization, that's to say ECPAT International, after I, originally, and then other people read a book written by an academic from Western University, in California, a book called Digital Refuse. And what she did in that book was look at where the rich countries of the world, the OECD member states predominantly, Western Europe, America, Japan, and so on, were sending their old computers, their old TV sets, old refrigerators, and all of these sorts of bits of metal and technology that were no longer useful, past their working life, and needed to be dumped, needed to be got rid of.
And what she found was that there were lots of ‑‑ this is not new. Everybody kind of knew this. There were businesses who had been set up specifically to transport these bits of metal to different parts of the world to unload them, to dump them. I think sometimes they would try and retrieve bits of precious metal that were in old mobile phones or old computers. But in essence, what she traced were the places around the planet where these old bits of technology, dead bits of technology, were being dumped.
And then she ‑‑ I don't know quite what inspired her to do it, but it was inspirational ‑‑ looked at where the bad images and the bad stuff appearing on the internet were being viewed by moderators, who were then having to make decisions, or be part of the decision-making chain in big internet companies, about whether or not particular content was either illegal or contravened the terms of service or standards of the company that hired them to do the moderation.
And what she found was there was almost an exact match. So the countries where the west was sending its old computers and TV sets to get buried in the ground, or whatever it might be, in parts of the developing world, were the same countries where people were being employed to look at the visual digital output, as it were, coming predominantly from those same rich countries. That was the case when the book was written, and I think it probably still largely is.
And then she described in the book some of the kinds of things that the moderators were having to look at: videos of beheadings, child abuse images, child pornography, whatever you want to call it, children being beaten up, all kinds of horrible, horrible things that these people, these moderators, were having to look at on behalf of the companies who were employing them.
I think typically in those days they were all working as subcontractors for moderation companies. Very few I think in those days were direct employees of the companies who would ask them to do the moderation.
And she was very concerned about what was happening to them as people. So in the name of protecting free speech, in the name of all of these sorts of things, the reality was that there were these people in third world countries or developing world countries who were having to look at this stuff, as it were, on our collective behalf.
And the wages were very low. The working conditions were not very good. And in one particular case, this company was not only employing subcontractors in these offices ‑‑ in particular she visited offices in Manila where some of these moderators were working ‑‑ she also went to the head office of the same company, where they employed a smaller number of moderators, but these people were in head office.
And what she found was a great difference in the way these moderators were being looked after. The ones in Manila had almost no kind of services, no kind of support, the screens were open, things of this kind.
The ones being employed in head office back in the U.S.A., their screens were all kind of sequestered and they were in a safe room, and it was very difficult for anybody else to see anything that was going on the screen. The only people who could see the stuff were the actual moderators themselves. So there's a big difference, in other words, between the terms and conditions under which the subcontracted moderators were doing this job and those applying to the moderators who were being employed in head office back in California or Seattle or wherever it might be.
And I think she also found some of these moderators were working from home. They were home workers. While they were doing this job, they were looking at this horrible, horrible stuff in their living rooms, or the equivalent of a living room, or whatever it was.
So we had these images of these poor people who were working for a dollar a day or something like that, maybe slightly more but not a great deal more, looking at this horrible stuff, maybe with their children running around in the house, or other people coming in and moving around. And just all of it seemed a little kind of off. So we thought it would be good to raise this as an issue at a meeting of this kind.
What these people are doing on our behalf is in the name of free speech. They are the ones who are actually at the coal face trying to make sure that the internet is the kind of place we all want it to be. But they're the ones who are paying a very heavy price. One of the people who Sarah Roberts interviewed was a psychiatrist or psychologist who got called in to help some of the people in Manila. And when the psychiatrist or psychologist heard the kinds of material that these people were having to look at, she had to go and get help herself, just, as it were, from this indirect contact with it.
So it just seemed to me that there were obvious personal welfare issues, child protection issues and so on and so forth that ought to be aired, because it's not an issue that's been discussed very much. And it's been given added momentum by recent announcements about the number of moderators the companies are going to be employing. We don't know how many moderators they're employing at the moment, but Google said they intend to employ another 10,000 by the end of next year and Facebook said they intend to increase the number of moderators they employ by 20,000. I don't know exactly what the timeframe is.
This is not ‑‑ I'm sure this is a good thing in general, because it will help deal with terrible content that's on there. But it seemed to me that, with more and more moderators being engaged in combination with artificial intelligence or whatever, this was a timely opportunity to think about some of the wider issues.
Final point on this: I was one of the original directors of the British Internet Watch Foundation back in 1996-97. I'm very old, you can probably tell that. At the time, we only had two staff. One was the chief executive and one was the deputy chief executive. Great titles. But they were the people who actually had to look at the images, and we became acutely aware, as their employers, if you see what I mean, that we had a duty of care to them.
Now, since that time, the office has expanded hugely. I don't know exactly what the arrangements are now, but I'm happy to hand over to Michael, who's going to tell us.
>>PANELIST: Thanks very much. I think those were very interesting points you picked up, certainly around the psychiatrist who was only indirectly engaging with this content and still needed to have access to help and support. And I think that really does show what an issue we are dealing with here.
First, I'd like to say thank you to Marie-Laure for inviting the IWF to participate in this panel. And for those of you that don't know about the IWF, we are the UK's hotline to remove child sexual abuse material. We have an online reporting portal where members of the public can report online child sexual abuse images or images they believe to be child sexual abuse.
And that is then assessed by our team of analysts in Cambridge. Our mission is to eliminate child sexual abuse material from the internet. We are a self-regulatory body that works with the internet industry, government, and law enforcement to tackle this issue.
Probably one of the biggest game-changer moments for the IWF was in 2014, when the former UK Prime Minister David Cameron authorized us to proactively search for content.
Comparing the figures from 2013, when we didn't do proactive searching, with the 2014 figures, when proactive searching was included, we saw a 138 percent increase in the material that our analysts were discovering. Clearly, this has had a significant impact on our staff.
To give you an idea, what our analysts see on a daily basis is truly appalling. They see everything from images of children being brutally raped and abused and beaten, to beheading videos that aren't within our remit but are sent in because members of the public are concerned about the content they are seeing, through to, perhaps, some hilarious videos and images coming in, for example of cats dancing around in people's underwear, that people believe just shouldn't be on the internet.
So there are lighter moments for our analysts, as well.
But on a serious note, our analysts do an extraordinary job. They're ordinary people doing an extraordinary job, and they are our superheroes, I think.
And last year, they took down 60,000 individual web pages of child sexual abuse material. And what's significant about that figure is that each individual web page can contain hundreds or even thousands of indecent images of children.
Of those images, we found that 50 percent involve children under the age of 10, and 10 percent children under the age of two. When we're assessing these images, we assess them in line with UK law using an A, B, C system: A being the most severe forms of abuse, which usually involve penetration; B probably involving oral abuse; and C involving any material that focuses specifically on a child's genital region. So it's really some quite hard-hitting stuff that we see.
65 percent of the content involving children under two is category A or B, so the most severe forms of abuse that you can imagine. And these are the very, very worst acts.
We have around 38 staff, so we've grown a lot since the days when John was a director, and we have around 20 people actively engaged in viewing child sexual abuse material in some form or another.
And there are, I suppose, two really big welfare issues for the IWF. The first is obviously the impact that this has on the analysts themselves, and I agree with John, there really hasn't been enough work done in this area to analyze what the impact is from a psychological point of view.
Secondly, there's also the welfare of the IWF itself to consider as well. One of the worst-case scenarios for the IWF would be if someone was doing this work for their own sexual gratification. So we make sure we monitor what the analysts are looking at in the hotline to spot any trends which might flag up peculiar behavior. So we look at that.
I just want to talk about the recruitment processes we go through to make sure we get the right people. The recruitment process starts off with a psychological assessment and an interview, really, to understand what support networks they have around them and whether they could cope with the level of imagery that I've just described.
Secondly, they're then put through what would be more of a competency-based interview to check they are able to do the job and have those analytical skills we're looking for. And finally, the last part of the process is that we show them some images, they usually come in on a Friday, and the images gradually get worse as they go through. Then they have the weekend to reflect on how they feel about those images.
And over the weekend, they have access to support from the IWF and to independent psychological support as well after that viewing session, to mitigate any impact on them.
Of course, it's not for everyone, so there's an opportunity to drop out of the process. Following that, if successful, there's a 6-month induction period for them. In the first week, they spend a lot of time with the hotline manager acclimatizing to the new environment they find themselves in. Again, this is no ordinary job.
We also try to ensure the working environment is as good as it can be for our staff. The analysts are able to take breaks whenever they want, and there are enforced breaks for certain activities such as image hashing, where they'll be seeing so many images flicking through at a very, very fast rate that every 15 minutes we actually say they need to take a break.
We also provide mandatory counseling support, so all of the analysts have to go to counseling once a month, and that reduces the stigma of someone going to counseling within the office where others might not. It takes that stigma away because they all have to do that.
We also run a series of away days which improve the relationship between those that work in the hotline and those that don't. Because it's strange: the only time I see some of those colleagues is in the kitchen, because we have a big wall down the middle of our office that absolutely divides us. So it's really great that, around three times a year, we get together and have away days, which improves the team bonding.
Our analysts are also subject to annual psychological assessments, which just check in with them in terms of their well-being. And finally, I just want to say a note on technology and how I think technology can improve some of the welfare conditions that we're facing.
So we know through our URL list and our image hash list that there are images of children that are repeated throughout the web. We do our best to take those down and make sure they're prevented from spreading again. But we have been developing a programme of web crawlers, which enables us to put those known images into a crawler to find duplicates and other places where those images appear, or might appear in the future. And eventually, where we're trying to get to with that is that those crawls will return other images that we don't know about, which will go through to assessment, but the known images will be blanked out, so our analysts won't have to see images in the future that they have already confirmed as being child sexual abuse images. So there's a real possibility for technology to expand further in this space.
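As a rough illustration of the kind of known-image filtering described above, here is a minimal sketch in Python. It is illustrative only, not a description of the IWF's actual crawler: the hash set, the crawl directory name, and the use of plain SHA-256 file hashes are assumptions (production systems typically rely on curated hash lists and perceptual hashing such as PhotoDNA, which also matches re-encoded or resized copies).

import hashlib
from pathlib import Path

# Hypothetical set of digests of images already confirmed as illegal.
# The value below is a placeholder for illustration only.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def triage(crawled_images):
    """Split crawled files into already-known items (suppressed so an
    analyst never sees them again) and unknown items needing human review."""
    known, needs_review = [], []
    for path in crawled_images:
        if sha256_of(path) in KNOWN_HASHES:
            known.append(path)          # blank out: analyst never sees it
        else:
            needs_review.append(path)   # queue for human assessment
    return known, needs_review

if __name__ == "__main__":
    known, to_review = triage(Path("crawl_output").glob("*.jpg"))
    print(f"{len(known)} known images suppressed, {len(to_review)} queued for review")

The design point is simply that exact or perceptual matching against a list of previously confirmed material lets the system suppress re-encounters automatically, so human review is reserved for content not seen before.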
>>MARIE‑LAURE LEMINEU: Thank you very much, Michael. Very interesting. Karuna, do you want to react on this and then Marco, maybe? Thank you.
>>PANELIST: Thank you so much for having Facebook as part of this discussion. As you can imagine, this is very, very important for us as a company to get right. The well-being of our moderators is super critical for us. At Facebook, the way in ‑‑ let me take a step back.
How it works on Facebook: when people see content on their news feed or a post that they think should not be on Facebook, they have the option to click report. They can also go to our help center to find stand-alone forms where they can report types of content directly to the teams. When they select the report option, they'll be asked a series of questions to help us figure out which is the best team that report should go to. And why is that important? We have teams of people trained on specific subject matters that we want handling these reports.
We also know that when people are clicking these options, they may not use the correct one. All our teams are trained across the areas. Even if I'm on suicide prevention, I should still know what the other content standards are, so that if a report does not violate the community standard it was reported for but violates something else, it would still get the review that it needs. That's how reporting on Facebook works.
We also try and make sure our reviewers are native language speakers, because we know that just because I learned Spanish for two years in college, I should not be reviewing a report in Spanish. I will not know the local cultural nuances. That is something we take care of. Our moderators are not only content experts; they're native language speakers, trained across our entire set of community standards.
We want to make sure they have psychological support and wellness programs in place so they can reach out whenever they need it.
One of the things we've learned is that there's no one size fits all. Each person reacts differently to content and to different pieces of content. I might find comments and text more triggering than a photograph or a video. Each person is different. When they require that kind of support is different, and what is going to impact them is going to be very different. We want to tailor it to each person's needs.
The way we do that is by making sure that wellness and psychological well-being is prioritized for the team as a whole. Managers are incentivized to focus on the wellness of their team members. The idea is that any team member should be able to go and talk to the others when they're feeling stressed; if they're looking at something which is causing them pain or making them uncomfortable, they should be able to speak to fellow team members and get the support they need.
We need to create that environment on the teams so they can come and have those conversations openly. It's also important to put things into perspective. Most of the content that comes to these reviewers is just cats and dogs and a lot of garbage. A very small percentage is the very, very violent or graphic content which is very, very hard to look at and very impactful. It's not like they're constantly getting a barrage of that content. It comes, but it's a very small percentage. I just want to put that into perspective. Sometimes you get the image that they're just sitting there seeing one piece of bad content after another; in fact it's a mixed bag that comes their way.
Our goal is to build very strong team structures. We want them to have the support and help they need, and to make sure they have psychological support and wellness programs in place that they can utilize when they need them. And we also want to make sure that we don't try to impose a one-size-fits-all approach. Each person's needs are very different.
In terms of our outsourcing partners, we've chosen to only work with reputable partners who take care of their employees and who have wellness and psychological programs in place. I think that's about it.
>>MARIE‑LAURE LEMINEU: So all of the moderators that you have now are working for outsourcing companies?
>>PANELIST: Not all. Some.
>>MARIE‑LAURE LEMINEU: Some of them. So if I understand from what you said, when you hire the outsourcing company, there are some prerequisite requirements. Yeah, that is correct. Okay. Thank you very much. And do you know how many hours they work, the ones that are based at Facebook itself?
>>PANELIST: It varies by the type of content they're going to be reviewing. For some content, we want to make sure there aren't as many hours. One of the coping mechanisms that got my attention: the reviewer wears a certain set of clothes and, when they're about to leave the office, they take that off and leave it at the office. They want to separate their work and personal life. Limiting the number of hours in certain queues is another. So there are different best practices which we have adopted. And the other thing we're doing is working very closely across industry. So for instance, at the Crimes Against Children Conference last year in Dallas, we had somebody from our team come and present on these best practices and various coping mechanisms, so that other people could also learn and share what they do.
The Technology Coalition has come up with a manual of best practices so that smaller companies can pick them up and utilize them for their teams as well.
>>MARIE‑LAURE LEMINEU: Thank you very much. Marco, do you want to speak now?
>>PANELIST: Thank you very much for having us today to discuss this important issue.
When I received your invitation, I went directly to our team dealing with these issues, because something that really bothers me sometimes is that we end up talking about things we don't know about, or that we were briefed on by other people, and that's not really ideal. So I went straight to the source, to one of our colleagues who works as a people manager in this area. He gave me a good review of what they're doing, and what I will try to do today, without the presumption of giving all the answers, is to give you a second-hand but real account of how Google deals with this content.
He said to me it was very clear that repeated exposure to this type of content has an impact on the people that do this job. That's something we need to acknowledge and work to mitigate.
Also, the best protection would be not to use humans for this job at all and, of course, to use technology. But we have heard today, and we hear very often these days, how important humans are in the process, in order to make sure that the decisions being taken are balanced and that algorithms are not deciding what content you can find online. We still believe that part of the solution to this problem is to work on technology.
From this point of view, we have made a lot of progress, for example in the identification of controversial content on YouTube, thanks to the algorithms that we are using. 98 percent of all the flags that our team receives for review come from automated detection on the platform.
What happens then is that of course humans still have to review the content, and because of this first automated pass we try to reduce as much as possible the exposure to the whole content for these individuals; for example, not showing the whole video but showing only specific frames of it.
So, taking into consideration that the human factor in this process is still relevant, what we are doing in order to mitigate the impact is the following. There are some best practices that have been developed. As was said before, they are process-oriented, so I'm going to repeat some of the things you already heard, but that's good news because it shows how this experience is shared.
There are different dimensions. First of all, the core of all the processes is to make sure that we are assuring a quality of day-to-day working life that is at the highest possible level. In order to do that, you need to take into consideration different factors. One of course is the fatigue of watching this kind of content. It is also about providing a sense of purpose: why the people reviewing this content are doing this job, the important social role these people are fulfilling, and therefore the importance of their contribution to society. And therefore also to work on their overall personal satisfaction with the activity and, of course, to avoid any kind of trauma.
And in doing that, we have set out a flow of different steps that can be taken. Some of them are about prevention, so taking active steps to prevent stress, focusing on things like, for example, mindfulness, which is something Google is very keen on at different levels, even in our team: trying to put yourself in a state of mind which goes beyond your single task, and recognizing how the single task that you are performing is affecting you.
This can be done through meditation, encouraging hobbies, spending time with friends, spending time away from the desk, exercise, and developing your personal safety plan. And that's something that can be done together with your manager, and can be done in the context of the organization.
Then it is also very important for the manager to identify any sign of stress, any feedback coming directly from the person or from peers or family. These signs can come from different sources; physical fatigue, for example, is something that the manager should take into consideration, as well as changes in behavior and, of course, any possible incident. All of these things are to be taken into consideration.
Then the third element is countering: taking action against the stress when it emerges, which means reconnecting with the value of the work we mentioned before and getting the kind of support, from experts and from your manager, that helps you relax, stretch, and really achieve a better state.
And of course, at the last stage, in case a talk with a professional is needed, we make sure all employees have a plan in order to get specialized counseling, to speak out, to build their resilience, and to take action to prevent the situation from getting worse.
All of this we try to do at Google, but we also extend it to all of our vendors. We make sure that in the contracts with our vendors, this flow and this kind of approach to working life is taken into consideration. And again, that's not to say the work is done, absolutely not; as I stressed before, there are different ways in which this could evolve and improve. But the message here is that we fully acknowledge the consequences of this work, and we are mindful of the importance of finding solutions.
>>MARIE‑LAURE LEMINEU: Very well. Thank you very much. Larry, I'll go to you.
>>PANELIST: First of all, I meant to congratulate the coalition for taking on this project. I want to especially thank John for bringing up the issue of how this is really impactful in the developing world, among people who are extremely underpaid and probably in many cases underserved by the kind of resources that Google and Facebook and certainly the Internet Watch Foundation are putting into this.
As it turns out, I was advertised as being on the advisory boards of several of these companies, and while that's true, I'm pleased to say the subject has never come up. When we advise these groups, we advise more on the actual direct treatment of the constituency of children and others.
We have not talked about this in any of the advisory groups. However, another hat I wear is that, for the past 20 years, I've been on the board of the National Center for Missing and Exploited Children. They pioneered the type of work all of these companies are doing many, many years ago, and have for well more than a decade, if not two, been offering counseling services and screening processes and really making sure that the employees that are doing this kind of work are getting the resources they need.
And just as in the case of the Internet Watch Foundation, where this work is done in an isolated, secure section of the building, this content, in addition to being extremely distasteful, is also illegal for anybody to access other than those who are tasked and approved to access it. In fact, we're in the process of considering moving to a new building.
And one of the things that we are looking at is to make sure that we can secure this area, and also to make sure that janitors and others that may come into the building to do work who are not employees don't have access to these areas of the building, which, by the way, is a 24-hour-a-day operation, or at least the reporting lines are.
I also want to point out, and this gets to a point John made, if you look at the clothing industry, Nike in particular had been under tremendous pressure to make sure that its contractors were treating employees the way you would expect a company like Nike to treat its own employees, in terms of pay and working conditions. Apple had been under similar pressure for manufacturing in China and elsewhere.
It's incumbent upon the tech industry, not just these companies at the table but the start-ups as well, to take responsibility for the conditions under which the people who are working on their behalf, whether employed or contracted, are doing this work, and to make sure they are getting the kind of treatment they need.
Even if you're a start-up, the people doing the work for you should not be working under the kind of conditions that John articulated.
One of the things my own small NGO, ConnectSafely, is trying to do is work with venture capitalists and others involved in the start-up process at day zero, day minus one so to speak, so that when companies start the process of rolling out their products, they think about safety on all levels.
To be honest with you, until today, it really hadn't occurred to me to be thinking about these kinds of issues. I've been thinking about how they build safety into their products for their end users, similar to the way automobile manufacturers make sure they don't ship a car until the brakes and air bags are installed. That often is not the case when it comes to apps that are starting out, serving the needs of young people. But until I heard this conversation, it hadn't occurred to me that we should be adding how they treat the contractors who are likely doing this kind of work.
I suspect that today, many of the companies that are employing these moderators are the start-ups and the smaller companies that don't have the resources that the Googles and Facebooks do. So I think it's really important that all of us in the developed world are fully cognizant of the impact that our garbage, so to speak, is having on those that have to pick through it.
I love that analogy from the book, because it's very similar, and I obviously applaud the work that's being done by the IWF and Facebook and Google and of course NCMEC in trying to take care of the people doing the work.
And it's already been mentioned that one of the more serious issues to look at is individuals who may themselves be exploiting the material, or getting some kind of sexual gratification, or some form of gratification, from looking at it. That can be a predilection that you try to detect in the recruitment process, or it can be an acquired behavior that comes as a result of the trauma they're going through.
So it's very important, and I'm sure you do this, that we are constantly re-evaluating our employees, and also thinking about, and I'm curious to ask the people on the panel here whether there's a rotation process, how long someone stays in this role. It strikes me this should not be the type of career you start in your 20s and retire from in your 60s. It's something you rotate into and out of so it doesn't become a lifetime job for you.
>>MARIE‑LAURE LEMINEU: Do you want to answer that?
>>PANELIST: Can I ask a question as well? Is there a trade association of moderation companies? Because I'm sure you guys have got contracts with the moderation companies that say, if you subcontract under this contract, you've got to ensure that the same standards are applied. But there must be a whole network of moderation companies springing up all over the world. And whether or not they're being held to the same standards is another matter. Maybe there's something we could all do collectively to try to intervene in that space.
>>MARIE‑LAURE LEMINEU: I think you just found your next job.
>>PANELIST: I can speak for Facebook, and I'm not sure whether we allow subcontracting, so I need to go back and check. I am honestly not sure we allow subcontracting.
>>PANELIST: Same thing for me on the subcontracting question. But in terms of the tools they are using, we also make sure they are included in the way the contractors work with us: providing, for example, privacy screens and controlled, separate spaces in order to prevent accidental exposure for people who don't need to engage with the material, and blurring and image distortion capabilities. So we incorporate technologies that limit the viewing.
We ensure the tools can be used only on corporate premises, in order to avoid the problem of working from home. Which, I agree with you, is the goal. In my experience at Google, I have already seen two people coming from this team and now moving to the policy team or moving to different roles. And that's absolutely part of their development.
That's without suggesting that the reviewing job these people are doing is not an important job. I also work day-to-day, every day, with the people from these teams, and I think they are amazing in their dedication, their understanding of the issues, and the contextual information you need in order to make an assessment, especially when we think about, for example, illegal hate speech. So it's a combination of things.
These are people who really feel that the work they're doing has an impact.
>>PANELIST: Obviously, if we come across someone that is struggling in that role, what we try and do, with the team environment that we have built, is rely on the analysts to keep an eye on each other. So we want analysts to spot problems and issues with each other. There is the ability to take people out of the hotline if they are struggling with any issues that they're facing.
So absolutely, our staff's well-being is of paramount importance. We do take them out of the hotline if we need to. And there are measures in place. Our manager, for example, monitors what our staff are looking at to spot any trends that may come out, if they're looking at inappropriate content and it's throwing up a trend. A good example is one of our former analysts who ended up looking at an awful lot of indecent images of males. So that was flagged up to our hotline manager.
And as it turned out, that person was removed from the hotline for a period of time to investigate what was going on. And as it turned out, they were doing nothing wrong. It was just the fact that they had hit a huge seam of child sexual abuse material and kept going and going and going down this seam until they found loads and loads of indecent images of males.
So it wasn't that they had a particular predilection for that sort of imagery. It was just that that was where they were being led. So that proves that those safeguarding mechanisms are in place and they work. But yes, going back to the point, there is always the ability to take those people out of the hotline and ensure they are supported.
>>PANELIST: But what about the notion of staying in that job for 40 years? You haven't been around for 40 years. But is that something that you guys think about, wanting to make sure this isn't a permanent position for these people?
>>PANELIST: I couldn't actually say.
>>MARIE‑LAURE LEMINEU: Because he started three weeks ago.
>>PANELIST: I started three months ago. But from a personal point of view, I do think ‑‑
>>PANELIST: They have a policy which requires all hotlines to have something similar. I doubt there's anybody else other than NCMEC and the IWF and maybe the Canadians who do it to the same standard, but it's meant to be in place for all of the hotlines. But I don't know if it's policed. I don't know how closely it's looked at. But it's meant to be there.
>>MARIE‑LAURE LEMINEU: Excellent. So we have 15 more minutes to go. We can open the floor if you have any particular questions.
>>AUDIENCE: My name is Natalie. I am from ‑‑ how effective do you think self-regulation of internet service providers is in terms of countering this type of illegal internet content?
>>MARIE‑LAURE LEMINEU: Internet service providers do not look at content.
>>AUDIENCE: I know but.
>>MARIE‑LAURE LEMINEU: So what would be the.
>>AUDIENCE: We have a code of conduct by which internet service providers can control access to certain types of websites. How effective do you think this would be in controlling the illegal content?
>>MARIE‑LAURE LEMINEU: Who is we? We didn't hear your name.
>>AUDIENCE: I represent an association of telecom operators in Georgia.
>>PANELIST: Some of the stuff that the IWF and the Canadian hotlines are doing now is extremely effective and very high tech. It's not done casually. It's done on an industrial scale, on a large base. But there are definitely issues around self-regulation and how ‑‑ and around transparency. The key to a lot of this is transparency.
From a technical point of view, I think we can do it very well, ma'am.
>>PANELIST: As John has alluded to, I think self-regulation has worked well in the space of child sexual abuse material, particularly in the UK. In 1996, the UK was responsible for 18 percent of child sexual abuse material. Now we're responsible for just 0.1 percent. So you can clearly see the impact that self-regulation has had in that field. We work with the internet industry, and I think that's been really effective in terms of taking down illegal content. And now we're seeing content removed usually within two hours of the provider being notified of it.
So I think certainly in this space, self-regulation has worked. But I agree with John, there are wider issues around transparency in some of this as well, transparency of services and of what is taken down, including by the IWF.
>>PANELIST: In the United States, internet providers are required by law to report to the CyberTipline. So while it is somewhat voluntary, it's also mandatory. So at the end of the day, they're there, complying with the law. Although there are some that don't, and it's a constant struggle to make sure some of the newer and smaller services are aware of the requirement and are in compliance.
>>MARIE‑LAURE LEMINEU: Does that answer your question? You're welcome. I think you're next please.
>>AUDIENCE: From the University of Cambridge. It strikes me that Facebook and Google are in a slightly different position from, say, the IWF, because it's directly users of your services who are disseminating this material. And that led me on to the question: it seems slightly odd that this is done through terms of service when you're talking about illegality, that this isn't simply addressed as a law enforcement issue.
Do you always report the individual to the police? Do you have procedures to escalate? And clearly, when things do get serious enough, what are your procedures for referral to law enforcement? And, to bring it back to the theme of the panel, are there welfare issues or safety issues for the people who are involved in that sort of relationship which need to be addressed?
>>PANELIST: As a U.S. company, we are required to report all cases. If we feel the child is in imminent danger or harm, we will reach out to law enforcement and work through those channels. But as a U.S. company, we are mandated to report it to the National Center for Missing and Exploited Children.
>>AUDIENCE: What about if the user is based in that country, not in the U.S.? And I thought Facebook was based in Ireland.
>>PANELIST: Our headquarters is in California. Our international headquarters is in Ireland, yeah.
>>PANELIST: Here we are talking about ‑‑ I don't think we look into these issues through an angle of jurisdiction. So as was said, we report directly to that organization because we know they have a direct relationship with law enforcement all over the world and they can act quickly in terms of actually saving the children that are being exploited.
So this is what is happening at the moment. We use them because they have these kinds of contacts, the same points of contact in every country. Otherwise, we would really run the risk of going to talk with law enforcement that may be specialized on technical issues but not able to run an investigation and to find the children.
Overall, there is a problem in trying to find the right combination between terms and conditions, which are global, and the respect of local laws, which does not mean that there is a conflict between the terms and conditions and the laws. But in practical terms, we have two channels. We can receive a fair amount of content reported because it's against our policies, and we review this content in relation to those kinds of referrals: the content is either against or not against our policy. And then there is also the possibility to report content that is considered illegal through a different channel. In that case, the content is reviewed by legal experts, and any kind of measure is considered according to the legal jurisdiction in the specific countries.
>>MARIE‑LAURE LEMINEU: Just a quick thing before we move on to the next question. In the case of hotlines, which are reporting mechanisms for what the internet user or the person reporting thinks is illegal or immoral, the role of the moderators, or the analysts, sorry, is to look at the content and, against the local legislation in place in that country, determine whether the content is illegal or legal. Then they forward it on to the police if it's considered illegal by the analysts. So they act as intermediaries.
>>PANELIST: I think that child pornography is a very clear case, and that's what we're focusing on in this session. But there does seem to be a bit of ambiguity: the terms and policies team is making a legal conclusion, and yet it's being done under the terms of service rather than as a legal issue.
>>PANELIST: I don't agree. As an internet user, if for whatever reason I bump into content that I think is illegal, that to me, as a non-legal person, shows up as such material, I can make a decision to report that content, even though I don't have to report it to any kind of authorities, and without my activity as a normal citizen being considered legal analysis. It's just the same when our team bumps into a piece of content and considers that this content should be reported. I don't consider this action ‑‑
>>PANELIST: That's a different situation. It is the same kind of problem, but actually, in this case, it's the other way around, because here the consideration that has to be made is first and foremost for law enforcement, because maybe they are running an investigation and they want the content to stay up. So on this one, John, I think it's a very different case.
>>MARIE‑LAURE LEMINEU: So sorry, we already have a gentleman over there in line and then it would be you; is that correct? Is that okay? Do you mind holding for a minute? Please.
>>PANELIST: My name is Catherine. I'm with the European Commission. On the terms of service issue, what is legal and what is not is stipulated in law. What the companies have chosen to do is also tell their users, in the terms of service, about the inspections they perform of the material and the actions they take; that's the part that's regulated in the terms of service.
From the law itself, there arises no obligation for the companies to do the monitoring. So also in terms of transparency to the users, we welcome the fact that this is clearly detailed in the terms of service, enabling them to take this action in the contractual relationship with their individual users. And then the reporting obligation is laid out in U.S. law, and the scope of the reporting obligation in terms of the illegality of the material is defined there. That's how it works between terms of service and the legal structure.
In terms of the forwarding to the national authorities, the National Center for Missing and Exploited Children has set up a secure computer connection so that law enforcement from other countries can have access to these notifications directly; I think something like 90 percent of the notifications actually go outside of the U.S. A lot of them go through Europol, which cross-matches and enriches them with whatever information is available and issues them to the members through a secure connection.
>>PANELIST: I think it's important to be careful to distinguish between illegal child abuse images and violations of terms of service. That's like comparing murder cases with parking tickets. Not that terms of service aren't important, but it becomes problematic when governments start to require companies to enforce their own terms of service, many of which may set a higher standard than free speech laws.
For example, Facebook and Google ban material that is absolutely legal in the United States, but they choose to ban it. It gets very dicey when governments start telling them to enforce these kinds of rules when the content itself may or may not be illegal.
I have rules on how I treat my children, but I don't want the police coming in and enforcing them.
>>MARIE‑LAURE LEMINEU: We have a friend here who has been waiting to ask a question. We have to move on. Thank you. We have two minutes left.
>>AUDIENCE: I'll be brief. I have to say that, having some background in copyright enforcement, I found the whole issue of the welfare of moderators quite new, quite fascinating. The two questions I have are actually focused on the procedures that were touched on by Facebook and Google.
So the first question is to the gentleman from Google. When you talked about algorithms and their use, you said that 98 percent of the finds, of the matches you get, come from algorithms. Was this about child-related content or was it about infringing content in general?
>>PANELIST: Controversial content. So the worst of the worst.
>>AUDIENCE: On to the second question, for the lady from Facebook. You said you have various teams dealing with moderation of various types of content. Is there some kind of fast track for stuff which is, like, you know, child pornography, which is something arguably much more abhorrent than copyright-infringing content? Is there a fast lane for that or not?
>>PANELIST: So when you choose the type of content you are trying to report to us, and when we triage to those teams, your answers help determine which team it should go to, and that determines how fast we need to turn it around. If it's suicide or something related to a child safety issue, that gets the fastest turnaround time.
We also have places on our help center where you can go and find specific reporting forms. For those reports which require a faster turnaround, you can go down that route as well. Those have a very, very fast turnaround time.
>>AUDIENCE: Just to build on that, when you make this determination of which content gets the fast lane, do you look towards your terms of service or do you look towards the laws of the U.S. or any other country when you make the fast-track determination?
>>PANELIST: I'm not the best person to answer that question, but my understanding is that it's a whole variety of factors that determines that. It's about real-world harm and whether the person needs immediate attention or not. Some of these cases are very easy, like child safety or suicide prevention; you know you have to react quickly.
>>MARIE‑LAURE LEMINEU: Unfortunately, we have to wrap up. Of course I'm left with the feeling that we could continue talking about this for many more hours, which is good, I think, in a way, because it means we definitely need to continue talking about it and organize more sessions, at the IGF next year or at other policy forums. And of course we're all aware that only half of the world is connected to the internet. The other half is going to be connected in a matter of years now.
So this will grow, and this topic is not going to go away. We really need to continue discussing that. Thank you very much to all of the speakers, and I look forward to seeing you at another session, another place, another country. Thank you very much. [APPLAUSE]