The following are the outputs of the real-time captioning taken during the Eleventh Annual Meeting of the Internet Governance Forum (IGF) in Jalisco, Mexico, from 5 to 9 December 2016. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record.
***
>> DAVID WRIGHT: Okay, ladies and gentlemen. If we can take some seats, we'll make a start.
Good morning and welcome to this session entitled Is Personal Data Mine or There to be Mined?
My name is David Wright. I'm from the U.K. Safer Internet Centre and the European Insafe network. Insafe is a network of 32 national awareness centres. We deal with illegal online content, provide supportive online services, and raise awareness of online child safety issues.
A great example of this awareness raising is Safer Internet Day, which next year will be on February 7. It is currently in, I think, its 14th year, celebrated in over 100 countries, and it is a great opportunity to raise awareness about online safety issues nationwide. You can find more information about Safer Internet Day, as well as the Insafe network, at our booth in the village, so do go and check that out.
We do need to press on. We only have an hour for this session and eight panelists to hear from, and I want to open the floor to questions as well.
So technology is astonishing. We have an insatiable appetite for apps, especially if they are free. Amongst us we probably have a huge range of free apps on our mobile or smart devices. Don't we love stuff that's free?
Fixated on accessing services or apps, all too often we tick to accept the terms and conditions, often with little or no regard to what is contained within them.
However, we also know there is no such thing as free. Given the knowledge in the room, we all know that if we're using a free service, we are the product. Whilst advertising appears in some apps, the business models go after personal data. We are going to focus on children and young people, as they often make full use of social media and mobile apps, especially free ones. How is their personal data protected, managed and used? What is the legislation and regulation in this area? That is what we are going to be looking at.
We will start off with John, who will initiate the discussion by providing a multifaceted analysis of the incoming European data protection regulation. By way of brief introduction to the panelists: John Carr will be familiar to many of you in the room. He works with the European NGO Alliance for Child Safety Online and London University.
Sonia Livingstone is secretary of the U.K. coalition for children's safety and works with the U.K. Council for Child Internet Safety. She is Professor in the Department of Media and Communications at the London School of Economics and Political Science, author of 20 books and reports on children's online opportunities and risks, and founder of the EU Kids Online network.
Auke Pals is a part-time student and a youth representative of European Digital Youth, bringing very much the European youth perspective.
We will hear from Larry Magid and Marsali Hancock, who will draw comparisons with the Children's Online Privacy Protection Act and the deliberations from the 1990s. Larry is a technology journalist and Internet safety advocate, Chief Executive Officer and cofounder of connectsafely.org, and an on-air technology analyst for CBS News.
Katherine Montgomery is a professor at American University in Washington, D.C., where she directs the Ph.D. program in communication. During the 1990s she spearheaded the efforts that led to the passage of COPPA. She is the author of Generation Digital.
Marsali Hancock has focused extensively on work with iKeepSafe on creating a new California student privacy assessment. Ana Neves and Louise Marie Hurel will balance the spectrum. Ana is from the Foundation for Science and Technology in Portugal, is a former member of the MAG, and has worked in the field for more than 25 years. Louise Marie Hurel is a researcher at the Center for Technology and Society at FGV in Brazil and part of the Youth IGF and Youth Observatory initiative, which promotes Internet Governance debates. She comes from a public sector background and is passionate about privacy.
Each of the panelists will have no more than five minutes to share their experience with us, after which we'll open it up to questions.
So John, to start off, is personal data mine or there to be mined?
>> JOHN CARR: Both. The European NGO Alliance for Child Safety Online recently, about six or seven months ago, published a report called When Free Isn't. It looks specifically at children as actors in the online eCommerce space. One of the tables we produced in this excellent publication, which is available online, showed the proportion of app revenue derived from apps that you actually paid for at the point of download versus the revenue derived from so-called free apps.
We had an interest in this, of course, because children are major users of apps. I think 90 percent, or in excess of 90 percent, of all revenues from apps came from apps that were described as free. So one of the points we make in our publication is that we think the use of the word "free", or even "freemium", is deceptive. The reality is that if you want to do anything interesting or useful with the app, you're going to have to pay at some point, and often quite quickly.
Anyway, the General Data Protection Regulation: here it is, I have it in my hand. It is the biggest single piece of legislative action taken by the European Union in its history. The informal consultations began around 2008, 2009. The formal consultation began in 2012. The final text was adopted in December 2015, and it becomes law in every Member State in May of 2018.
From a children's point of view, there are definitely some very good bits in here. We like, for example, Article 35, which specifically requires everybody involved in the provision of an Information Society service, which is what they call things like apps or indeed almost any commercial activity on the Internet, to carry out an impact assessment to determine whether, in providing this particular service, whatever it is they are selling or promoting, there is a risk of any harm being done; obviously in this case, harm to children.
How broadly do you define harm? Well, it could be very broad; we just don't know yet. But if you conclude that there is a risk of harm, you will then be under an obligation as a company, as a business, as an information service provider, to take steps to mitigate that harm. That we regard as a very positive aspect of the GDPR.
Less positive in some ways, although it is too soon to say with certainty, is Article 8, which introduces a default age of 16 below which young people cannot give consent themselves; verifiable parental consent is required before they can access an Information Society service.
Each Member State is given discretion to either stick with the default age of 16 or adopt a lower age of 15, 14, or 13; you can't go below 13. So for the first time in practical terms in Europe, we have the possibility of seeing different jurisdictions with different age limits for young people. How this is actually going to impact young people's experiences or exchanges with each other across national boundaries and jurisdictions is still very unclear. The simple truth is that, in the course of formulating this monumental piece of legislation, I'm afraid the legislators didn't think about that aspect. Certainly they didn't think about it in any depth, and absolutely they did not consult any children's organisations or child welfare organisations. So that is still a little bit hazy.
As I said at the beginning, the answer to the question is: both. In principle, the idea of consent for how your data is used is there for most categories of data. Not all, interestingly enough: if a company can establish that it has a legitimate interest, it doesn't need your specific consent to be able to collect or use that data. Analytical data, things of that kind.
I can see David's fingers are getting twitchy. There is a great deal to be said about this largest ever piece of legislation in the EU's history. That will perhaps give you a flavor of some of the things I think are important.
>> DAVID WRIGHT: Thank you, John. Keeping John to five minutes was always a worry; it was my main worry coming into this session.
(Chuckles.)
>> DAVID WRIGHT: I'll hand it over to Professor Livingstone.
>> SONIA LIVINGSTONE: Thank you. Could I have the slide, please?
Are you putting the slide up? Okay, great, thank you. I'm a researcher and I like to begin with data. So I just have one slide and I want to focus on the way -- I'm afraid this is from the U.K. because this is the only place I have the data. I want to focus on how U.K. children are using social media platforms by age, because age is the point at issue.
So what we see here is that one in five nine- to ten-year-olds and nearly half of 11- to 12-year-olds have social media profiles in Britain despite being underage. It is obvious that self-declaring your age is ineffective.
And what the GDPR, as well as similar legislation in other countries around the world, invites us to think about is whether data authorities can exert greater power to ensure that underage children are not using services, or whether in making any such legislation we are inviting children more often to declare a false age, and so to be treated as adults on social media platforms even though they have rights as children.
By 15, you can see that nine in ten U.K. children have a social media profile. I don't know what is going to happen if the EU keeps the GDPR age of 16, whether those children will be thrown off social media platforms. It's a good question whether social media sites will seek verified parental consent for 13- to 15-year-olds by May 2018, or whether they won't bother and will just delete all of those profiles.
What interests me about this graph, in addition to the number of underage users and the difficulty of managing the regulation, is that it is not smooth. It has some jumps in it. Look at the jump from 50 percent at 12 to 74 percent at age 13: this tells you a quarter of British teenagers are voluntarily complying by waiting until they are 13. Whether they would wait until 16, I don't know.
Look at the jump from 21 percent to 43 percent at age 11. Any Brit knows that's when a child goes from a small local school to a big anonymous school. It's the moment when they need social connections. If social media platforms are not to provide those connections, my question is: who, as a society, is going to provide them?
Then if we look at the jump from 9 percent at eight years old to 21 percent at nine years old, I think that tells you something about the age when parents are giving their children a mobile phone or connected devices. They are doing it because they want them to be able to communicate and be safe, but they are also giving the commercial world access to their kids.
So is the GDPR right to protect the under-16s? How am I doing, am I being gaveled? Oh, good, that's where I am!
So, something about the media literacy of children. All this data comes from the media regulator OFCOM, which reports literacy indicators for 12- to 15-year-olds, the age group at issue. What they tell us in Britain is that half know YouTube and Google are funded by advertising. Less than half can tell that the search results with the orange box with the word "ad" on it are adverts. Twenty-seven percent think the information returned by a Google search can be trusted. Seventeen percent say, I will give details about myself to a Web site to get what I want. Seventeen percent say getting more followers is more important than keeping my information private.
For younger children, media literacy is even lower. Should we throw them off, or consider also the benefits? Because all of these children in my graph love to communicate online. All of them are using the Internet to get information and services. Forty-four percent of 12- to 15-year-olds use the Internet to make a video, 18 percent to make music, 16 percent to make animation, 13 percent to make a Web site. This is a creative space. They are developing digital skills.
Thirty percent of 12- to 15-year-olds go online for civic activities like signing a petition, sharing news stories, writing comments, or talking online about the news. If we make the age 16 in Europe, we will prevent all of that activity as well.
So the right to participation and the right to protection are strongly in conflict. I hope someone in this room knows how to solve that problem.
>> DAVID WRIGHT: What a great point to finish on. Moving along, I'll hand it over to Auke.
>> AUKE PALS: In this session I would like to talk about personal data, not about children's data, because this session is called personal data. Two weeks ago, prior to this session, John Carr asked us all who was paying us, and I told him, yeah, I am being paid by the NL IGF, and I represent myself. We are all sitting here with interests, because this panel is mostly child protection people, and that is not really fair. But it happens everywhere, also with data collection. Everybody is sitting here because they have to represent someone, or maybe because they just have to be here; some people might be sitting on Amazon or just playing chess or something like that.
But yeah, I still remember Eric Schmidt saying in 2010 that every two days we now create as much data as we did from the dawn of civilization.
And that made me realise that storing our data is definitely not free. Someone has to pay for the storage, the servers and the cooling, and yet we can use services for free. We can use, for instance, Google or Facebook; I don't want to name them all, as there are a lot of other companies collecting our data.
But yeah, that made me realise that everyone is sitting here with some interests, and the large technology companies are too. My conclusion was: we are paying with our data and not with money. But what if we paid with money and the data collection just stopped? Would we feel better about it?
Some people would, because they are not giving away their data. And some don't really care about it and just want free services. But you can't change these large technology companies: they have so many interests in collecting data and selling it to other companies, for way more than they could charge for the service itself. I guess a government has to interfere in that, with a law or regulation saying we have to stop data collection at some point if we are paying for the service. Because if I buy a car, the car company won't collect all my data; they will just sell the car and take a small percentage on that. Now, we are sitting here with a panel of child protection people.
Now I want to say something about that. We have set some standards in the past: your criminal record is closed and you can start over again at the age of 18. In Holland our health insurance is free until the age of 18. So why not do that with data collection? Don't collect any data from children before the age of 18. When they are an adult, they can decide for themselves whether they want a service to be free and pay for it with their data, or pay for it with money.
And I guess that's a better solution than banning children from social media and other things. In the General Data Protection Regulation there is not a clear line about that: it can be 13 or 16, and the country itself can decide. I think every service, the whole Internet, should be open and free for everyone, whether they pay with data or pay with money, but every service has to be available for everybody. If somebody at the age of two wants to use Snapchat, for instance, they should be able to if their parents allow it, because they are still under the age of 18 and the parents have responsibility for them. So the parents can decide whether or not to let their children use such services. Thank you.
>> DAVID WRIGHT: Thank you. We've heard from three panelists about the changes happening across Europe at the moment with regard to data protection, and clearly the voice and perspective of a young person too.
Now we turn across the Atlantic to the experiences in America, starting with Larry.
>> LARRY MAGID: Thank you very much, David. Well, in the spirit of data mining being used as a way to promote things, I'm going to make two quick promotions. David mentioned Safer Internet Day. If anyone finds themselves on the Eastern Seaboard on February 7, join us at the National Constitution Center in Philadelphia, where we will be celebrating. ConnectSafely will be organizing that event and we would love to have all of you come. We are not mining your data; it's just a commercial.
The other commercial: on connectsafely.org we have a section of guides, including a parents' guide to student data privacy and an educator's guide to student data privacy. These contain more detail than I have time to discuss with you today in my few minutes.
Before I talk about the laws of the United States, I want to talk about unintended consequences. In almost every aspect of child protection, there is a tension between protecting children and protecting children's rights. That is also true when it comes to student data privacy. Of course, I applaud all efforts to give young people the ability to erase the data collected on them during their youth, and to protect the privacy of the data collected on them: not only what they post, perhaps on social media, but the vast amount of data, at least in the United States, that is collected on them by the school system, starting at a very young age and continuing at least through age 18, if not all the way through college and graduate school.
Having said that, it is important to point out that in this day of cloud-based applications, such as Google Docs and other cloud-based apps used by students and often sanctioned by schools, this data, which is admittedly stored by large corporations, is the property of the children. It may be of value to the children not only throughout their childhood but perhaps throughout their entire life, and it needs to be accessible by the children and deleted only with the permission of the children.
Unfortunately, in the United States, as with all aspects of child issues, children essentially have no direct rights; the rights are vested in their parents. As I understand it, and we have other experts here on the Family Educational Rights and Privacy Act, FERPA, it is possible for a parent to order the deletion of a child's papers and other information even if the child him or herself may not have wanted that deleted, or at that point in their life may not have been aware that they might want to access it later in life. It is always important in every discussion we have about child rights to make sure we are talking about child rights, not simply child protection.
Having said that, we have a piece of legislation in the United States which is extremely important in protecting the privacy of children, called FERPA. It applies to all schools that receive federal funds, which is not every school in America, but the vast majority of schools, covering the vast majority of enrolled students. What it does for the parent, not the child, is protect the right of the parent to review and inspect the child's records, request corrections, and consent to or deny disclosure of some, but not all, of the child's information.
It does permit schools to share information in certain situations, such as enrollment and transfer information and auditing. When I talk about student information, it runs the gamut from educational records, their grades for example, to participation in a school lunch programme, which can carry indications of their financial status, to potentially even their health or psychiatric records. There is a great deal of information that should be and must be protected, and it largely is protected by this federal law.
But that doesn't address the issue that others have raised so far, John and others, about the information that people disclose voluntarily, such as in apps, in public social media and in other forms of public disclosure. Therein lies the role for education. All of us in this field need to do everything we can to educate young people as well as adults in how to use the privacy tools built into social networking and in what is and is not appropriate to post.
Finally, I know I'm almost out of time. It is up to both government and industry to make sure that industry doesn't abuse that information. We have to be very certain that all of these companies, most of whom have very good privacy policies, actually stick to them, and that those who don't have adequate privacy policies, and there are some, are pressured to make sure they are doing everything to protect the data, not just of the most vulnerable but of all of us. Most protections that are good for children are good for adults too; I argue this is something that should go across the board.
Thank you so much. Only 19 seconds over.
>> DAVID WRIGHT: Fantastic. Thank you, Larry.
I am going to hand it straight on to Katherine Montgomery, for Katherine's five-minute perspective.
>> KATHERINE MONTGOMERY: That creates a lot of pressure. I'm glad to be here representing my university, American University, but also the Center for Digital Democracy, a U.S. NGO involved in privacy and consumer issues. It is really important to have this conversation internationally about children's rights and how they are being fostered in the digital media culture, because we are a global culture now. Too often these conversations have taken place within individual countries. I have made a commitment that I'm going to be having more of these cross-national, international conversations with my colleagues.
On that note, I also wanted to let you know, for those of you who came to the panel the other day on Internet-connected toys, that we have been working with a dozen or more NGOs in Europe and the U.S. which, in a collaborative strategic effort, have filed complaints with regulatory agencies in both the United States and the EU against some toy companies that are manufacturing dolls, robots and other products that are not secure and that are not in compliance with current laws.
I see this as the first of a number of efforts that we hope to make, involving academics as well as NGOs and civil society organisations.
So, just quickly, since sometimes I am called the mama of COPPA, because I led the campaign in the 1990s to establish some safeguards for children on the Internet: at that time the Internet was being commercialised very rapidly, and children were, as they are today, a lucrative target market. We saw practices being put in place that were potentially very manipulative and harmful for children.
Our goal at that time was to build in safeguards at the outset of that marketplace, to get buy-in from industry, and to have the safeguards established by government so that there would be a level playing field. Everybody understood that if you are going to do business with children on the Internet, you have to treat them fairly. We included in that law rules about minimizing data collection, and rules about not being able to condition a child's participation in a game or anything else on the disclosure of personal information, because there was a lot of that already going on.
And we also included parents in the law, so that parents would be providing permission and be involved in what their young children were doing. We set the age at under 13 based on the developmental psychology literature and on traditions around those protections.
So I think it has been important. It has helped guide the development of the children's media culture, the commercial culture online and we updated it a few years ago so it would apply to social media, to many of the practices taking place there.
Let me say, having done that, enacting any policy is always a humbling experience. You try to get it right, and you are continually working to think these things through: to do things that protect children on the one hand, but not in a way that inhibits their participation.
In terms of the GDPR, I do not support requiring parental permission for children who are 13 years old and over. Over 13, as we struggled with in the '90s, it is not appropriate. These are young teenagers. Am I not doing this right? Okay, sorry about that.
These are teenagers who need to be empowered. I have always called for more granular kinds of privacy policies so they can make their own decisions. I think that's still the case.
I don't think we can talk about any of this without taking into account the nature of the digital media commercial marketplace, and we've got to think about a couple of things. Where am I? We are almost out of time, so I'm going to cut to the chase: big data. We are talking about a system based on data monetization and personalisation, targeting individuals based on where they go and on their friends; not just what children disclose about themselves but how their behaviors are tracked everywhere they go. We are talking about enormous risks, and also about challenges to our traditional models of notice and consent. It is going to be tough in the big data era.
What I think is that the GDPR debate and controversy creates an opening for us to talk about building safeguards for teens and children into the digital marketplace, and to get industry to have that conversation with us, as well as youth and civil society. I think it can be done. I think it is going to be challenging.
Tough questions. No simple answers. I think we must do it.
>> DAVID WRIGHT: Katherine, thank you very much. And again, because of time, we will move for the final perspective to Marsali. Thank you.
>> MARSALI HANCOCK: Thank you very much, David. I'm pleased with where I'm positioned in the panel, because Larry spoke about FERPA, the education privacy law, and we heard about COPPA, the Children's Online Privacy Protection Act, which applies to all children, whether they are in or out of the classroom.
Let's put in context that everything we do online, absolutely everything, shapes our web experience moving forward. If we were all to search for a word, pick the word Oreo, each on our own devices, what you get and what you get and what you get will look different from what I get, depending on what each of us searched before.
So it is really important that we think about the role of our own data: the entire engine, economy, opportunity and risk of the web is that it either gives us something we are looking for, or someone wants us to find something they think we will find interesting. In the discussion around data, can it be mined or is it mine? I don't think we can begin to fathom the number of data points that each of us in this room has already released into the digital universe. I don't know how many of you took an Uber, but now Uber knows everywhere you have been and where you are now. Every time you turn on your phone, every time you click. The point is not to be paranoid, but to see it as an opportunity.
The one place where children do not have the ability to make choices about the data they are sharing and with whom is in the school setting. One of the biggest gaps we have is around accountability and transparency. In the United States, FERPA, the only federal child privacy law specific to all students in public institutions, institutions our public funds are paying for, has never once been enforced. Never.
So it is not a very high priority, either inside the classroom or in the schools or businesses; they have their own priorities and interests that they have to meet with stakeholders. I think, as a global community, we should identify ways to find, foster and reward the companies that are willing to be transparent and to have accountability measures that allow educators to select products that have been through a third-party review, so that we are not just guessing. When you look at the potential impact for good and for harm: I want the Internet and my Internet company to provide me the services I want. I want the advertisements that are interesting or healthy for me.
But when we are talking about our children, their lives will be absolutely shaped by what they have done as a child. Here is an example. The reading skills of a seven-year-old in the United States are a predictor of their success in all of their other academic studies. How many people on the planet do you want to have access to your child's reading scores? There are the records created on campus, and there are all the little things about us: the colors we like, the music, the style, the fonts. There is such a great opportunity to create, and in that creation we also generate data. So the goal is not to reduce creation or expression; it is transparency and accountability. You know where a child sits in the school room, but do you know where their data goes when it leaves? Do you know what happens to it? Will you get it back? Will you have an opportunity to review and look at it?
What we don't understand today is that people are making decisions about us and for us before we ever have an opportunity to know who and what they are. Here is another example. Some research found that people using Twitter in English who do not use proper grammar are more likely to default on a loan. And if you are deemed likely to default on a loan, the loan agency will charge you more for the same loan. Which means a child who is learning to read... this happened to my daughter in ninth grade. She had barely turned 13, and her teacher required that she answer on Twitter. Twitter is a public forum. So depending on how children are using Twitter, and at what age, it may mean that they will pay more for a mortgage, pay more for their car, or maybe miss an opportunity for a job.
Another one of my daughters was new in college. The English professor required that the students write confidential papers and then post them on social networks. So she talked to the professor and said, this really isn't a good idea; if you want a job in a law firm, putting this up on the web is going to create some problems. It was mandatory, so she had to drop the class.
So as we think about data, we have to consider both sides of the coin. We create data in schools about our children, and we also release enormous amounts of data about our students. That in itself can be a wonderful or a horrible thing, depending on whether we know who that company is, what they are doing with the data, who has access to it, and whether we have a chance to look at it and review it, because students don't have a voice or a choice in the tools their teachers ask or require them to use. They don't have a choice in the information systems that are used.
And when you look at the fastest-growing victims of ransomware, meaning you go on campus and all of a sudden you don't have access to any of your information, it is schools: one in ten were victims of ransomware this year. I have been cut off, but I tell you, as we think about the importance of data and mining data, there are solutions within that narrow Venn diagram of student data and accountability.
I'm happy to talk to anyone who is interested in having their country take a collaborative approach to this. Thanks, David.
>> DAVID WRIGHT: Marsali, thanks very much. Without further ado, we have two more perspectives to balance the spectrum. Ana, you have five minutes. Thank you.
>> ANA NEVES: Thank you very much. Thanks for the opportunity to share with all of you where we are in the discussions in this field in Portugal. First of all, we all have to be aware that there is not ... (Feedback.)
>> ANA NEVES: -- such a thing as online and offline worlds. They are totally interconnected. What one does in one has consequences in the other.
What is happening here?
Closer? Closer, okay.
So privacy on the Internet is more and more important, bearing in mind that more than 50 percent of the world population is connected, meaning around 3.5 billion people are connected to the Internet, with more about to come, as well as more interconnected devices. We quite often provide our personal data on a voluntary basis to use a social network, an application, a service, or just to make a purchase. But that is not the only data we are providing. There is so much other data collected by companies, such as your position and the date and time when you use a service, and this becomes metadata: data that provides information about other data. We should all be well aware of that.
Our children today are being socialized in a way that is vastly different from their parents. But does their ease with digital tools mean that they really understand their rights to protect their personal data? Children, and users in general, want to have access to social networks, services and apps. But who really reads the terms of the contract for such use? We are often giving away things of which we have no notion. The issue here is that it seems we have no choice: either we accept the rules of the game and have access to the service, social network or app, or we are just not part of it. Then children and youth become excluded, which is highly problematic in a connected digital society.
Lastly, I would like to raise briefly some issues that must be addressed in the near future at the worldwide level. What is data privacy and security? Until what age is a user on the Internet considered a child? Is it only a question of age? During the Portuguese initiative of the IGF, we discussed these questions, and I would like to share with you some of the messages that came out. The ubiquity of the Internet and the Information Society have changed habits, behaviors, aspirations, rules, jobs, fears, prejudices and citizens' needs. Therefore, the implementation of new privacy standards, for example this year's regulation of the European Parliament and the Council on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, which will enter into force in 2018, is particularly relevant to the various representatives of the Internet community.
Too often, issues surrounding privacy lack transparency, and more stakeholders should be involved in the discussion, implementation and regulation. We are at the Internet Governance Forum, so we have to discuss governance issues about privacy.
The argument that users must have full control over their own data implies, alongside it, preparing users and consumers for this process through proper skills and training. On the other hand, greater responsibility and accountability are required of the actors who provide services and products over the Internet.
Privacy by design and by default should be private sector priorities, which could eventually reduce government regulation. In addition to legal frameworks, governments should also prioritize citizens' digital literacy, training and education. Thank you very much.
>> DAVID WRIGHT: Ana, thank you very much indeed. Final input, Louise, I'll hand it over to you.
>> LOUISE MARIE HUREL: Thank you very much. It's a pleasure to be here. First of all, I would like to start by saying that the Internet, as well as the expansion and development of information and communication technologies, has profoundly changed our ways of communicating and interacting, of exchanging information and of taking part in social life. In the same way, these changes have also provoked deep changes in the relationship between children and the world around them. ICTs have become a crucial part of many people's lives, and growing up in a world where connectivity seems to be an undeniable trend carries its own challenges.
What does it mean to be born and raised in a hyper-connected world, where experiences are not limited to the devices we so commonly hold in our hands but are expanded to the most particular and intimate aspects of our lives?
Perhaps this is the next question we should ask ourselves when thinking about the relationship between security, privacy, and children's interaction with ICTs. One can trace many experiences from the relationship between children, adolescents and growing Internet access. When we talk about personal data, we are mobilizing a category, a classification of data, that is somehow attached to a notion of self.
I remember, several years ago now, when I first got my mobile phone. I was nine years old and it was mind-blowing: I could finally talk to my mother.
(Chuckles.)
>> LOUISE MARIE HUREL: A few years after that we saw Nokia's colorful cell phones, which seemed to flood the worlds of children and adolescents. We could have the pleasure of playing games, sending SMS and using ring tones. The attractiveness of tablets and notebooks is that the shiny screen not only interacts with those in front of it but serves as a venue for children to communicate with others and access information.
I would like to share a really brief personal note. Once I was with my little cousin, and she forcibly made me sit down with her and play an online game. All the users were some kind of animal, and each one of them, including my little cousin, could buy clothes for their animal avatar and buy their own houses. She was clearly familiar with surfing the web and playing games such as this.
However, while I was wondering about privacy, she was concerned with being able to access a particular service, a particular game. So it seems as if our online experiences are ever more attached to these big thematic parks where we sign in, accept the terms of service blindly, and dig into an experience where privacy and security are not central and constant takedowns are opaque processes. Algorithms select who and what is important. We condition ourselves to limited forms of expression that are translated into posts, images and videos, and mostly we are okay with that because, well, all of our friends are in there. So that's okay.
But being a digital native, being born into connectivity, does not mean that users' appropriation of ICTs includes knowing all about them. Parents, schools and adults have an important role as mediators and, in a certain way, caregivers: providing safety measures and helping children to expand their perspectives on aspects of Internet usage. Right now, according to the ICT Kids Online Brazil 2015 annual report, 87 percent of children and adolescents in Brazil from nine to 17 years old have an account on social networks.
What we've seen this past month in Brazil, for example, is that concern with children and child pornography can be articulated as a discourse, as a pathway to promoting bills that tend to overfeed law enforcement agencies while not necessarily tackling the most important fabric that helps us in thinking about, for example, how to connect the next billion: education.
One thing I would like to highlight quickly, and I know the time is ticking, is that these processes are not devoid of or exempt from underlying problematic disparities and asymmetries related to race, gender, stereotypes, physical appearance and socioeconomic status.
In 2015, 25 percent of online discrimination reported in Brazil among Internet users aged 9 to 17 was focused on race. However, the highest rates of discrimination in general are normally associated with adolescents between 15 and 17 years old.
Building one's identity is an ongoing process that happens both online and offline, and most certainly it depends not only on the child as a user but on the environment she or he is exposed to. These services and entertainment platforms, these islands, reinforce the individual character of Internet access as something that pertains to the routine of being connected, to being part of what is socially normal, to being autonomous and independent: playing games, navigating through different venues, accessing different services. How can we promote safe spaces and collaborative interaction between them? There are shared responsibilities between parents, teachers and caregivers as well, in order to foster an online environment that is more age-sensitive.
And we need an Internet Governance, at the international, regional, national and local levels, that provides institutional mechanisms for building and connecting the next billion, while bearing in mind what kind of connectivity and environment these future users are going to relate to. So thank you.
>> DAVID WRIGHT: Thanks very much. Thank you. In the last five minutes I'm going to open the floor to any questions. Have we got a microphone? Okay. If you could please say your name and any affiliation, that would be great. Thank you.
>> AUDIENCE: Hello. Can everybody hear? Hello?
My name is Juan Pedro, from Portugal. I'm an Insafe Youth Ambassador. I would like to speak to this because we talked about transparency, and we also spoke about the opportunities that we miss if we can't really engage on a social network or another type of site.
I brought a question, and since we are speaking about data ownership and collection, I think we should really address the big elephant in the room, which is terms and conditions. If on one side we are saying that data should be protected, we should also empower youth, and we are speaking about youth. We should empower them with the knowledge of what we are giving away, because, as we know, we don't really have the time to go through each point, specifically about privacy and what we are giving away. And it is not as if we can really choose whether to accept them, because there is no alternative. So there is also an initiative, the Youth Manifesto, which calls for the right to know what we are giving. Thank you very much.
>> DAVID WRIGHT: Okay. So, panelists. Does anyone else have another question? Okay, we'll take two questions and then answer them, and then we may well have run out of time.
>> AUDIENCE: Thank you. I'm Alan Lewine, here with Privacy Fundamentals, having recently left the corporate world. I am aware, and think often, of Stewart Brand's famous and perhaps apocryphal saying: information wants to be expensive. And it does; there are real costs involved in operating all these systems and in providing good, adequate privacy protections. As we implement good privacy protections, I think we need to think about, and I'm going to invite comment on, how we provide the economic incentives for the companies that provide these services and build the pipes, so that the protections are effective and actually implemented.
>> DAVID WRIGHT: Okay, thank you.
And you have a quick question as well? We'll take that and open it to the panelists as well.
>> AUDIENCE: I have a question off Twitter from Jan Hedrick: what do I want to achieve with data not being collected under the age of 13? My response: by law, children are not responsible for their actions, so being in a database forever wouldn't be fair. Becoming an adult is a process of gradually gaining responsibility, so being punished at a later age for those actions wouldn't be fair.
That was the question.
And the response.
>> DAVID WRIGHT: Okay. So, the question about terms and conditions? Sorry, there are three questions: terms and conditions, and then we'll come to the commercial question too. John?
>> JOHN CARR: I'll read one line from the GDPR, since that was what I was asked about. Quote: given that children merit specific protection, and that's referenced elsewhere in the document, any information and communication, where processing is addressed to a child, should be in such clear and plain language that the child can easily understand it. So that goes to the point about the terms and conditions.
That doesn't necessarily deal with the volume of words that can appear in terms and conditions. There is a famous story about a guy whose terms and conditions for a particular app extended to about 1,500 pages on the screen of a typical smartphone. On page 1,275, he inserted the following sentence: if you've read this, ring this telephone number and I will give you $5,000 U.S.
Nobody rang.
(Applause.)
>> DAVID WRIGHT: Larry?
>> LARRY MAGID: In response to Alan's question, it's a very interesting one. First of all, I think that companies have a right to make money, but we have a regulatory process that says they have a right to make money under certain circumstances. While I don't in any way believe in overregulation, I think holding people to their privacy promises is essential, and the government has a role in that.
Finally, I would like to look at traditional media: in the U.K. with the BBC, and in the United States with National Public Radio and public television, there has been a precedent for publicly supported media. Whether that works for social media or not, I don't know, and we would have to make sure that government had no control over those media, but it is an interesting model that we have not explored sufficiently when it comes to social media.
>> DAVID WRIGHT: Okay. And final response, I think, in like ten seconds, Katherine.
>> KATHERINE MONTGOMERY: Well, we are not talking about whether they can make money or not, as Larry says. But there are conditions, I think, that we need to place on companies in terms of corporate responsibilities to young people. We need to review those. I think it's time to really review them and identify some safeguards in terms of data limitation, in terms of certain marketing practices, and in terms of transparency and clarity in communication with young people. There is much that needs to be done in this area.
>> DAVID WRIGHT: Okay, everyone. The time is now up. And so we have lots of questions in this debate, and that will carry on and on and on.
Just a quick plug again, if I haven't mentioned already, Safer Internet Day on February 7. More information at the Insafe booth. I ask that you thank the panelists for their fantastic input. Thanks very much, everyone.
(Applause.)
(The session concluded at 11:47 a.m.)