The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.
***
>> AYDEN FERDELINE: Good morning from Kyoto. It's 8:30 a.m. on Thursday, October 12. My name is Ayden Ferdeline. I'm a public interest technologist and democracy Fellow and moderator of today's discussion on strengthening worker autonomy in the modern workplace. I'm pleased to be joined by five esteemed Panelists as we discuss the digital transformation of work and how we are redefining the relationships between employers and employees.
I won't introduce all of our Panelists upfront, but I will call upon our first speaker, Wilneida Negron, the Director of Policy and Research at Coworker. She has been instrumental in this space in providing terminology to describe a lot of what we are seeing in the evolution of workplace surveillance, new ways of measuring productivity, and other forms of datafication happening in workplaces.
Wilneida, good morning. Good evening for you.
The question I have for you is building off of a report that Coworker published in 2021 examining the impact of technologies on workers in the U.S.
You are now taking your research on little tech, as you term it, global. Can you maybe give us a little background as to what little tech is? How it differs from big tech? And now that this research is being taken global, what are some of the preliminary findings that you are seeing?
>> WILNEIDA NEGRON: Yes. Good morning to you in Kyoto as well. And I am happy to kick start this. As Ayden mentioned I'm Director of Research and Policy at Coworker. Coworker is considered to be the welcome mat to the labour movement.
Because we are not a union, we are agnostic, and we get to talk to workers from every industry: everything from tech workers at the bigger tech companies to workers in retail, workers in manufacturing, workers in hospitality.
So through conversations with a broad set of workers across many industries, we started collecting literally just a list of different apps and different vendors. Workers kept coming to us with questions, Starbucks workers, for example, asking whether they should download an app used to sign in to work, not really sure what the privacy issues were.
We really started collecting different products that we were hearing about from different workers across different industries.
At the time I was also in consumer privacy and big tech policy conversations, and obviously those conversations were focused around the five big tech companies. They play an outsized role in society, not to be ignored.
But it was five companies. It was the result of some 20 years of innovation that we got the Googles, the Facebooks, and the Amazons, et cetera.
What we uncovered from the conversations with workers, coming from the consumer privacy side, was that there was less attention on worker privacy and surveillance, and that the ecosystem and the marketplace workers are encountering is much more fragmented and much more expansive. What began as a short list of ten vendors grew into hundreds of different apps before we stopped counting. Yes, some of it was big tech; Amazon, for example, has a lot of apps that they designed.
Big tech was playing a role, but there were a lot of startups and apps. We wanted to quantify that by creating a database and getting a sense of what the ecosystem looks like.
When we began, we used the term little tech. It was somewhat ironic: it was smaller tech, but it was actually thousands of companies when you contrast the five that dominate consumer privacy conversations with the thousands that dominate worker privacy conversations. Because these are smaller vendors and unknown companies tailored to different sectors, they are not really known to workers. Workers don't have familiarity with them, so they were often very confused, and very concerned, actually more confused and concerned: what is this new technology? Who is this vendor? Are they trustworthy? How are they using my information? Et cetera.
Out of quantifying that ecosystem, and out of coining the term little tech to compete for air time with the big tech world, we developed three hypotheses that guided expanding this research globally, to understand what these technologies look like for workers across regions.
The first hypothesis was that this is an obviously unregulated marketplace of different products and vendors.
So we wanted to see if different countries have vast ecosystems of different workplace technologies being integrated, touching on every part of the labour process. That is really key, because what we learned is that there was obviously a lot of focus on the gig economy, and then on bossware surveillance, but the suite of intrusive products spanned everything from workplace benefits to workplace safety during COVID and other kinds of labour optimization products like automation, for example.
And productivity monitoring.
We created a taxonomy starting with hiring and recruitment. The second hypothesis, in other words, was that these technologies are touching on every part of the labour process: from hiring and recruitment, to productivity monitoring, which includes surveillance, to workplace benefits. It really has a lot of touchpoints in workers' lives.
The third hypothesis, as we looked globally, was that this is collecting a lot of sensitive data points on workers. We saw that with little tech in the U.S.: all of these products at every step of the labour process are collecting an increasing amount of really sensitive data. In the consumer privacy space, we know we are just starting to come to grips with how much is being collected. For workers, that ecosystem of awareness and political education is not as strong, but we are starting to uncover how many sensitive data points are involved.
And that is why we need to focus on the increasing number of data points, which range from biometrics to sentiment analysis to productivity monitoring outputs to time and attendance. What we are seeing now, which is problematic, is that in the wave of AI these data points are either being used to train AI models, as we have seen in call centre work, where workers' data, everything from sentiment to biometrics to productivity, has been collected to train AI models in particular sectors without their awareness, or collected for surveillance, the traditional privacy issues.
And it is being used increasingly to make predictions about which workers won't be a cultural fit, or which workers are at risk of everything from organizing to, you know, stealing sensitive data.
A lot of employers are worried, with the whistleblowing that has been going on and data like industry secrets being released, so there's a lot of risk analysis happening. There's a lot of predictive work on which workers are going to go rogue, which workers are at risk.
Again, it is the collection of sensitive data points combined with the sophistication of technology to make predictions that can affect workers with very limited recourse. It's problematic.
Those are the three things we went out to test. To wrap up, we focused our global research on Nigeria, Colombia, Kenya, and Brazil. We zeroed in on those countries because we were looking at the next wave of tech innovation unleashed by COVID. These particular countries had received a large share of venture capital money for tech innovation in the past four or five years; at the global level, they were in the top ten. We wanted to see what the innovation space looks like and what types of technologies are coming out.
What we are seeing is that the ecosystem of products in the marketplace is still mostly dominated by the gig economy, but there is an increasing number of products and companies, not necessarily based in those particular countries, sometimes Global North companies selling to global majority employers, that are being sold as traditional business tools to fulfill business outcomes: everything from processing payroll to timekeeping and some low-tech things.
Again, it is not only the business function we are looking at but also the types of data being collected. Those are the same patterns we saw in the U.S. And in global majority countries there are often no consumer data privacy laws, so for workers, awareness of what to do is a lot more limited.
I don't know if I should stop there to give other folks a chance to weigh in.
>> AYDEN FERDELINE: Thank you so much, Wilneida, for that excellent introduction. I want to bring Eliza into the discussion now. You published Digitally Divided. Some of the comments that Wilneida made about ghost work stood out to me. Perhaps, Eliza, you can comment on why Amnesty International is researching this area. It is newer terrain to investigate the intersection of labour rights and technology. What changed? What sparked the need for your Digitally Divided report and the case study on ghost work that you introduce in it? Maybe you can briefly summarize that for everyone in the room today.
>> ELIZA CAMPBELL: Thank you so much. Thank you, Wilneida, for that introduction. I can't follow that with quite the same level of specificity because the report covers a much broader set of issues at the intersection of inequality. I'd like to back out and start from a more high-level approach.
I will say, again, that I'm representing my views here and not Amnesty's. Our views are still in flux.
The human rights community doesn't have a strong history of talking about economic inequality writ large, including employer exploitation. That has been the case for quite some time, even prior to the advent of the digital era and the rollout of these more AI-based or data-based technologies that Wilneida was describing.
The concept of the sweatshop, the idea of workers in the global majority, in countries where labour laws are much more lax, creating value for companies in the Global North, has always existed. What I tried to show in the report is that the digital sweatshop, the term that others have used, is basically the same practice applied to a different case. As another point of reference on why this work is coming out of Amnesty right now: my work is part of a fellowship that specifically focused on the intersection of inequality and technology. It comes out of an issue within the human rights community and policy circles in general, where you now see inequality as a buzzword in all these grant-making schemes and different human development reports.
I do take issue with that term a little bit. I think it anonymizes the issue and makes it out to be a mystery that dropped out of nowhere. Poverty doesn't come out of nowhere. It comes from explicit, deliberate policy and tax practices. Amnesty has done interesting work on tax policy and how the lax or ineffectively enforced tax policies of countries around the world made possible austerity measures, with cuts to social programmes directly leading to enormous human rights abuses. What we show in the report is that all of this is coming in the context of two issues that Wilneida nicely laid out for me, which I think are essential context. One is that this is happening amid an unprecedented level of global wealth inequality. By the latest numbers, the world's poorest own 2 percent of wealth and the top 1 percent own 75 percent. Hard to believe. Since the outbreak of the pandemic, this has really accelerated. That happened in tandem with the roll-out of the different kinds of government and public sector applications of technologies that Wilneida alluded to.
So what we lay out in the report are three areas of concern for policymakers trying to understand the impact of technology on inequality in a very broad way. There is so much to cover within that, so we tried to narrow it down to a couple of core populations of concern. One is labour, which is why we will continue to talk about it as a core area.
The other is immigration and borders, movement of people and the right to asylum. The last one is criminal justice and policing.
The last thing I'll say before I finish, and this is a long answer to your question, is that it is interesting that this work is coming at this moment. Amnesty Tech specifically has preexisting work on issues of surveillance, by which we mean spyware, different kinds of predictive policing, and different issues around digitisation and automated decision making in the public sector.
In some ways that sets up the framework to talk about labour. When you think about criminal justice and predictive policing, labour, and migration, it is easy, if you look closely, to see how these things are related, and how data sharing between employers, schools, and local law enforcement agencies is going to be an increasingly common practice, particularly for populations for whom there are few or less enforced legal protections, especially for more vulnerable people.
What I show, and this is my last point, is that technology in some ways has become an accelerator and facilitator of inequality, the helpful cover story for why we allow poverty to persist. That's an introduction to the report. I'm happy to go into more detail and talk more about that.
>> AYDEN FERDELINE: Thank you so much, Eliza. I would love to go into a little more detail in the second half of our session. For now I will bring in Eduardo Carillo who is the ...
(Audio problems.)
>> AYDEN FERDELINE: What is your reaction to what Eliza has said? I find the comment that you made, Eliza, striking: poverty doesn't come out of nowhere; it comes out of deliberate policy interventions. What do you think about that, Eduardo? You have been doing research on the impact of algorithmic management on workers in Paraguay. Do you agree with that statement? In the research you have been doing in the transport and delivery space, how does that ring true?
>> EDUARDO CARILLO: Is this working? Yes. Thank you so much, Ayden. And great to connect with Eliza and Wilneida. Definitely, poverty doesn't come out of nowhere. We are in an unequal system that is now further perpetuated by the implementation of digital technologies. So I'm glad that this connection is happening, because I will digress from my presentation to talk about the context in Paraguay and the surveillance that workers already suffer. It is important to realise that it is not that technology creates a new surveillance and that workers weren't surveilled before. This is a situation in which already existing surveillance is being augmented and improved by digital systems.
It is important to also situate ourselves historically and recognize this is not happening out of the blue. It is just a way in which our current capitalist system is reinventing itself to exploit workers and extract as much surplus as possible.
So sorry for that.
For those who don't know me, thank you so much, my name is Eduardo and I'm the co-founder of TEDIC, based in Paraguay. Part of our work is advocating for a more just digital economy that includes workers' rights. We partnered last year with the Fairwork project, an action-research project that evaluates working conditions in the platform economy across countries. It is important to generate as much data as possible in order to compare the different platforms that we rate across the globe, platforms which most of the time repeat themselves in different contexts and different countries. I'm going to come back to that; there is an element of transnationality that I think is useful to reflect upon in this particular panel.
In this project we score the platforms against five principles. For those who don't know the research and the methodology, they are fair pay, fair conditions, fair contracts, fair management, and fair representation.
Now, going back a bit to what I was saying before getting into the core of my presentation, and why we focused on transportation and delivery apps within the gig economy: I think it is important to highlight that in Paraguay there are traditional ways in which workers are surveilled, and they reflect a complicated reality for workers.
For example, we researched a few years ago how companies, when they hire workers, ask for more health data than what is required by law. We are talking, for example, about HIV status. Year after year we hear how workers have been fired because their employers unlawfully accessed HIV status information. What I want to say is that workers' rights to data privacy and protection have been violated, and this is happening particularly in the gig economy.
Over the past years we have seen exponential growth of the platform economy in Paraguay, and this is why we partnered with Fairwork to look at platforms operating in Paraguay. Some of them are familiar to you all: the Ubers, the transnational delivery and transport platforms that are very dominant in Latin America. I don't want to spend too much time on the findings, but only two of the six studied platforms could score any points on the principles, out of a total of ten points. The best-scoring platform got only two out of ten. It is safe to say that gig economy workers had little to no possibility of meaningfully engaging with these platforms when they felt mistreated, nor any true capacity to scrutinize the algorithms that surveil and govern their everyday lives. That is what is reflected in the overall scoring we were able to gather.
Perhaps connecting these reflections with the broader topic of the panel, it is important to point out, and I'm going now to the issue of transnationality, that these platforms are transnational by nature. This transnationality poses an important data sovereignty question that we should also reflect upon when we are thinking about workers' rights and their intersection with data protection.
We are in a scenario now where Global South workers are providing a vast amount of sensitive data, for instance biometric data, used to train the algorithms of these platforms, so that in a not-so-distant future the platforms can become less dependent on workers' services in general. It is a circle that never ends; it shows how exploitation can reinvent itself.
Another thing I would like to leave as a final reflection: in Latin America in general, and I'm based in Paraguay but we also try to see this situation through a regional lens, when we think about workplace surveillance augmented by digital systems it is important to remember that surveillance inserts itself into precarious work environments where workers are excluded from redress mechanisms in general. In this scenario, digital interfaces that intermediate work or surveil the workforce tend to go unnoticed until they become difficult to roll back.
More importantly, and specifically in relation to the gig economy, in Global South countries there is a normalization of work precarity in a context of a lack of economic opportunities in general. There is a take-it-or-leave-it work philosophy installed that has counterproductive results for workers' rights.
Lastly, I don't know if I'm past my time, but two final reflections. We have a complicated scenario cut through by historic regulatory debts. Paraguay doesn't have a personal data protection law. We don't have a law against all forms of discrimination. That lack of regulatory certainty for traditional rights, when intersecting with digital technologies and workers' rights, tends to expand injustices for workers.
I think that a fair digital future, if we are going to aim to build one, will only happen through workers' collective organisation. So we need to fight for a truly surveillance-free workplace environment, for workers to genuinely exercise their rights to freedom of expression, association, autonomy, and collective organisation.
So thank you very much.
>> AYDEN FERDELINE: Thank you so much, Eduardo. Can I ask a super quick follow-up question on the very last point that you made, which gets to the heart of this discussion: resistance and pushing back.
What can workers actually do in Paraguay? Are there particular actions that workers have been able to use --
>> EDUARDO CARILLO: Sorry.
>> AYDEN FERDELINE: Let me --
>> EDUARDO CARILLO: What can workers do? That was the general question?
>> AYDEN FERDELINE: Yes, resistance, what does that look like in the Paraguayan context?
>> EDUARDO CARILLO: The thing is that Paraguay, not so curiously, is one of the countries with the lowest unionisation rates in Latin America. That already says a lot, in a context of precarity in general.
The private sector workforce has a unionisation rate of less than 1 percent. So the traditional way in which workers could organise, which is being in a union, is not something very common in the country. There needs to be a shift in that understanding among workers. It is a difficult cultural shift that is also associated with a lot of uncertainty, because most of the time, whenever workers do try to organise in the country, they tend to be fired before they can form a union.
We don't have data showing that this is happening currently in the gig economy environment, but it is definitely a cultural perception among people and workers in general. This is the first step we need to overcome to start generating organised spaces that also understand digital technologies and the complexities at their intersection with workers' rights.
>> AYDEN FERDELINE: Thank you, Eduardo. Raashi Saxena is based in India. The situation that Eduardo just described in Paraguay, does it sound familiar? I imagine the answer is yes: that gig economy platforms are being rolled out in India, treating workers similarly, as if resistance is futile and you are probably classified as an independent contractor rather than an employee.
What do you say? What does the situation look like on the ground for you?
>> RAASHI SAXENA: I do feel like we have historically had a lot of short-term contract work, long before the term gig economy. We have a very large informal sector, and since it is defined or labeled as informal, it is usually dubbed as not contributing to the economy.
But yes, there have been various instances of workers and policymakers pushing back against this exploitation. A lot of this exploitation has been exacerbated in India with the emergence of tech-based platforms, companies like Uber and ...
(Naming others.) more food delivery companies, and other companies that provide on-demand services. They offer a wide range of services such as ordering food, eCommerce, home services, and more.
A lot of concerns have been raised about the working conditions of these gig economy workers: issues of low pay, especially with the cutbacks in the ecosystem, job security, and ridiculously long working hours.
Of course, the lack of general social protections, such as healthcare or general pension benefits.
In response, a lot of gig economy workers have banded together and formed unions for better working conditions.
A lot of them banded together amongst themselves, and they are inspiring to me. In Mumbai, local taxi drivers, we call them yellow-and-black taxis, banded together and formed their own local app that offers slightly better wages. In Bangalore, where I reside, local transport drivers came together on an application called Namma Yatri, which offers fair pay and is also, surprisingly, more reliable according to those of us who are avid users.
In 2020 the Indian Federation of Transport Workers organised a strike for better pay conditions on a lot of these ride-hailing platforms. Recently a local government in India also came out with legislation under which gig workers should be given basic pension and social scheme benefits. And yes, there have been significant steps, I would say, by the Indian government, which introduced a labour code that aims to provide social security benefits to gig economy workers.
Again, a lot of it is an implementation issue in my country. There might be a lot of things that are well defined ...
(Poor audio.) I would say there have been a lot of instances and opportunities where things have been exploited, and most employers do the bare minimum when it comes to addressing these issues.
Of course, this leads to workers being treated unfairly. We've seen several instances, even during COVID, where most of the factories were shut down but there was no direct support to many industries. I hope that helps.
>> AYDEN FERDELINE: Absolutely. Thank you, Raashi. I'm going to bring Eliza back into the discussion. I hope some people are going to dive into your report.
You introduce three different case studies in the report, which are different examples than the ones that Eduardo and Raashi have highlighted. In one of them you spoke about the availability of generative AI tools, most of which rely upon hidden human labour, trained upon the labour and data of people around the world. These are points Wilneida also raised earlier today: that these tools pose risks to labour security and workers' rights in the gig economy, and that it seems likely this is only going to grow in scope and significance in the future.
I'm curious what you think is the future of the next five to ten years or even further. How do you see workplace surveillance technologies evolving, particularly with advancements in AI and other emerging technologies?
Is it as dystopian to you as it feels to me?
>> ELIZA CAMPBELL: Yes, I think we're all kept up at night by the vision of the next five to ten years. To address that question and some of the points raised, I keep thinking about the London-based venture capital firm that surveyed about 2,800 startups in the EU that report themselves to be AI-first companies and found that more than 40 percent of them weren't actually using AI in any meaningful way. It was a branding exercise.
So I think about that as the touchpoint for how I see the next phase: how this is going to be rolled out and impact people across the global economy.
Basically, what we are seeing right now is that even though there are remarkable advances in some of these generative AI tools, and I don't necessarily agree with calling them that, but some of the newer versions of automation for image or text generation, we have a remarkable lack of clarity about exactly how and in what ways those tools are developed, and with what models.
We know with some certainty that a lot of the models these tools are trained on are either committing plagiarism at a huge rate or are trained on data sets that don't take into consideration the vast amounts of inaccurate data and hate speech they might be absorbing.
It is also the case that a lot of workplaces and companies are now going to be under pressure as AI becomes, like I said, a very trendy marketing ploy by which companies have to assert themselves as being relevant to the market.
There is a race to the bottom in terms of how the marketing outpaces the tools themselves. At the same time, a lot of these companies are actually being forced to rely upon what others have termed ghost work, or the digital sweatshop that I mentioned earlier.
There have been cases shown in Finland, for example, where a company received a significant round of venture capital funding to create a model using incarcerated people in Finland to do image labeling, the kind of digital piece work that makes such tools possible.
There have also been cases of humanitarian organisations, or companies that purport to be assisting asylum seekers or refugees, using populations who don't have a lot of choices for work as image labelers or to do the work that makes these tools possible.
My fear is that we will see an increasingly bottomless need not just for data but for the capacity to do what AI purports to do, which as of now requires a tremendous amount of actual human labour that most companies want to keep hidden.
That is something I think about as we consider the future and where policymakers should be putting their attention: looking carefully at where there will be populations of precarious people, desperate and with very few options for how to make their living, and at where and in what ways they might be fed into the global supply chain of a lot of these companies and technologies. That's something I will watch with a little bit of fear, but I'll watch closely.
>> AYDEN FERDELINE: Thank you. Can I ask a really basic follow-up question which is why do companies want to keep that labour hidden?
>> ELIZA CAMPBELL: I think it's embarrassing to them. They don't want to admit that the tool they developed isn't as advanced as they purport it to be. One of the cases I saw that was most impactful was Meta. We still don't know, again, because a lot of the tools and the evidence about them are kept very non-transparent, but we have it on pretty decent authority that content moderation they said was automated was basically done by humans. When I think about that particular instance, I think it makes people feel safer and better about the tool they are using, and people naturally have a bias towards respecting something if it is tech-powered, frankly. I think that may be part of it. I also think people just don't want to have to consider the human labour that goes into the goods we consume, whether it is generative AI or the clothes we wear. It is easier not to think about where they came from and how they were made.
>> AYDEN FERDELINE: Thank you. Wilneida, we have an opportunity here at the IGF to provide some concrete actions to policymakers and to put some concrete actions into the messages coming out of the IGF.
What has come to your mind, given the increasing prevalence of AI surveillance and the trajectory that Eliza outlined? What steps would you recommend policymakers take to mitigate the changes in work that are happening around us?
>> WILNEIDA NEGRON: Sure. From the conversations in the U.S. and what we have been seeing in other regions, it is really going to require a mix of policy and regulatory action, and some of the policy work is going to need to be focused on establishing some basic protections for workers. A lot of the legislation coming out of the U.S., at the national and state levels, is focused on disclosure: requiring employers to disclose in a timely manner that they are using these technologies. But from the GDPR we know the consent model is problematic. And when dealing not with a small collection of big tech companies but with this fragmented marketplace, expecting workers to meaningfully consent to each of the different products they are interacting with is highly problematic.
So we know that workers need a basic level of protection that goes beyond consenting and disclosure requirements. We are seeing it, because it is fragmented, the policy solutions are fragmented right now. We are seeing a lot around focusing on algorithms, like trying to regulate the algorithmic tools being used. In hiring and recruitment, requiring vendors to undergo particular Dawits and impact assessments. There's a whole ecosystem who Dawits the auditors that are now being required by some agencies to sort of regulate some of these actions.
So I think that these are all worthwhile conversations. Looking at the particular kinds of really invasive uses that are algorithmic driven or that collect highly sensitive data like biometric data. Zeroing in on a sensitive types of data, really sensitive types of using and maybe going with the GDPR model where you focus on risk and regulate by risk an use cases.
Again, not forgetting that workers need a basic level of protection. In the U.S. unfortunately we don't have a consumer privacy law. It is hard right now, to encroach on private actors, employers in this case and acquire them. There's a whole lot happening on the he self-regulation. In the U.S., this goes to the economic dynamics that Eliza is talking about. We work in an interconnected will system where the state does not feel like it can intrude in the private matters of business in the private sector. So you're seeing a lot, as regulatory and policy try to figure out what particular assets of these technologies can be regulated or provide protections for or require employers to do due diligence on.
There is not much intrusion in sort of the economic private matters of companies in terms of requiring them to provide protections to workers.
So that goes into the reality of the market economy, and that goes into the last set of solutions: we spend a lot of time as well not only organizing workers but on the market dynamics, the drivers of these trends. We cannot continue to fight after the fact, when hundreds of products are coming out every year and there is still no legislation, as I mentioned.
So how can we tackle the market dynamics that have created that inequality? In the U.S. that looks like monopoly power and mergers and acquisitions, which happen a lot in the data brokerage industry in the U.S., with small data brokers collecting sensitive employment data being acquired by bigger data brokers.
Can we use merger and acquisition policy to kind of, for lack of a better word, try to tone down that market dynamic?
Another thing would be looking at, you know, the private capital space, the venture capital, requiring greater disclosures. Right now there is very low accountability in the private markets, and the private markets are where these companies, the Facebooks of the world, go before they IPO and hit the public markets.
So it is really critical to try to intervene at those early stages, when these future Facebooks of the world are in the private market space. There's a lot happening there, everything from ESG to other kinds of disclosures about what types of companies are being invested in.
Yeah, a lot of market and industry-focused dynamics. We cannot continue to fight this battle under the existing market conditions that drive this kind of innovation and these products.
And the state sort of struggles to intervene, in addition, again, to all the policy and multiagency work that is needed to regulate particularly harmful technologies and provide some kind of protection for workers beyond just disclosure and consent.
So there's a lot of work to be done.
(Laughter.)
>> WILNEIDA NEGRON: To the extent there are spaces, as we were saying, for us to strategize at the national and regional level, I think that is really very much needed.
>> AYDEN FERDELINE: Thank you so much, Wilneida. Eduardo, I would love for you to react to Wilneida's comments: do we need to tone down that market dynamic, do we need greater disclosure when it comes to venture capital?
If I can throw in an extra point that others have raised around the argument that the value of worker data is arguably in its collective use by both workers and employers.
And if we were to think about what it would look like if workers and/or their representatives, advocating for their legitimate interests, were to have collective rights to worker data (and I don't mean health data, for example; I mean different forms of data, whether that is around injury rates or other metrics), what might those rights look like? Would that be helpful? What do you think? What else should we be asking for?
>> EDUARDO CARILLO: I think that, I'll start from the last question and connect with some of the points that Wilneida and Eliza made which I think were quite interesting.
My worry in posing those very complex ways of governing shared access to data in a public interest way, to the benefit of workers, is that it is a bit difficult in the Global South context, where people are still learning how this ecosystem even works. Sometimes I feel that that is going to be the whitewashing that some companies could potentially do.
It could be dangerous, without any true meaning. So I feel that at this point we need to go back to the basics of what the workers' rights fight is in the Global South context. That means fighting for companies, in the context of gig economy platforms, to truly recognize that these are workers, in order for those other, more complex discussions to come to be.
Right now in Latin America, in general, none of the platforms recognize the workers' dependency. And most of the regulation at the moment, because there are regulatory efforts from governments trying to understand how the gig economy works, is starting to pose this question: Are they dependent workers or not? What kinds of hybrid models could coexist, in terms of dependent and non-dependent workers, that are interesting and could potentially pave the way for more safeguards for these workers?
And perhaps connecting with some of the things that, as I said, Eliza and Wilneida were saying, I think another tool that perhaps we don't think about a lot is how competition can help us improve the current data economy, which, as we know, is currently very concentrated.
Competition, in its intersection with data protection and privacy, is pretty much a novelty at the moment. I think we should fight for competition to also help in the fight to create better working conditions, and to establish that very unequal ways of treating workers, or very, let's say, predatory ways in which data is exploited and so on, can also be considered a competitive parameter.
I know there's a lot of resistance to trying to expand how competition currently works. But it is a conversation worth having, because we need to use as many elements as we have to improve the current digital economy ecosystem, and a lot of the problems that we have right now stem from the fact that it is highly concentrated.
And then of course that has an impact on the way that data protection is enforced and how people interact with platforms and so on.
And then I am very happy that the issue of the ghost sweatshops was mentioned by Eliza. One of the other reasons why platforms perhaps don't recognize that is that if you don't recognize the problem, then it is not a problem, right?
So you won't address those issues, and you won't address the current inequality and exploitation that these workers are facing, because you just don't acknowledge that that is a problem in its own right. And it is also connected with a lot of interest from governments: they want these companies to install those sweatshops in their countries in order to create jobs that, although precarious, are still jobs in a context of a lot of inequality.
I think, lastly, also in terms of the future, which was a question that you asked me at the beginning: I think that a lot of the future, and what other things workers could also do, is related to platform cooperativism. That's an interesting concept that is starting to become more present in different discussions among workers who are trying to build their own digital infrastructure and have more autonomy in the way they design the platforms that they will work on.
But those discussions have to be highly supported by national governments, which should invest in those kinds of programmes and allow this kind of exploration to happen, in order to build other sorts of business models that are more cooperative and just at their roots.
So yeah.
>> AYDEN FERDELINE: Eduardo, thank you. Raashi, I will bring you back into the conversation. Of course, you have the opportunity to respond to Eduardo's comments as well as the comments from Eliza and Wilneida. I'll also give you your own question to answer. We are in a UN forum. You have been following the WSIS+20 process and the Global Digital Compact and how it can potentially bridge the gender divide and promote the empowerment of women and girls.
Are these instruments that can help workers, particularly female workers?
>> RAASHI SAXENA: There definitely is potential. A lot of my responses will also build on the aspects that Wilneida brought out about the ecosystem and how fractured it is when it comes to providing support. I think globally only 7 percent of female founders are actually backed by these funds. In this ecosystem you need to be from an ABC ecosystem to be able to capture that funding. It is very insular in many ways.
Also, to come back to Eliza's point, I have seen that the labour practices of a lot of content moderators are usually very hidden, and in the VC ecosystem I have seen that the PR aspect of promoting something as AI would gather or harness more money. So it is more of a PR exercise, also kind of propagating that AI is this magic wand that will magically wash away a lot of the aspects of inequality.
But talking about the WSIS+20 process and the Global Digital Compact, I do feel it has the potential to significantly bridge the gendered digital divide, one way being by helping identify the barriers that prevent women from accessing digital technologies.
In India most households have devices shared among an entire family, so having your own device could help improve digital literacy skills, and also help cross the social and cultural barriers that women face when it comes to mobility. India also has very cheap Internet, so having access to these devices and the Internet could also help in promoting locally relevant content and services, and also employment.
One aspect that often gets missed out: a lot of people with disabilities, especially women, are disproportionately impacted and face more social stigma. So having proper access would help them participate in a social setting, in a cultural setting, and also give them a dignified livelihood.
The Global Digital Compact could also ensure that women have access to equal opportunities in the digital revolution, whether it is an initiative they want to promote on a small or a medium scale, and also support women in the startup ecosystem and help gender representation in digital leadership roles.
And one of the very important ones is addressing online violence and harassment. It is a growing phenomenon, especially with generative AI, doctored videos, synthetic videos, which for the longest time used to affect women in public life, but a lot of women such as myself and Wilneida and so many others could be targets of this. I do feel that having robust policies around this could help develop responses on how to combat it, promote digital safety and security, and perhaps collaborate with tech companies to ensure that victims have access to effective support mechanisms.
And lastly, of course, the Global Digital Compact and WSIS process could also encourage governments and other stakeholders, given that we are at the IGF, to take into account the specific needs and priorities of women in the digital sphere, to increase their participation in decision-making processes and help in the development, implementation and initiation of policies and programmes.
>> AYDEN FERDELINE: Thank you so much, Raashi. We are nearing the top of the hour. I would like to give each Panelist 30 seconds for brief closing remarks on how we can develop strategies that ensure an equitable workplace. I know that 30 seconds is not enough to answer that question, but Wilneida, please.
>> WILNEIDA NEGRON: Thank you. Building cross-class power with workers across regions, across classes, across industries.
There is a lot of connective tissue there, a lot of shared analysis that could be connected, and I think there is a lot of opportunity there that we are not tapping into.
>> AYDEN FERDELINE: Thank you, Eliza?
>> ELIZA CAMPBELL: I'll say as a final wrap-up that our community of people who work on digital rights and tech policy needs to do a lot more to expand the way we think across different sectors of the policy community and work with people who are working on unionisation and on climate change. There are increasing numbers of climate issues in the application of AI that we didn't even get into, and we should think about our sector as part of the global set of issues that are creating and exacerbating wealth inequality and racial inequality. And yeah, I'll stop there.
>> AYDEN FERDELINE: Thank you, Eliza. Eduardo?
>> EDUARDO CARILLO: I would say, definitely, in the context of the gig economy at least, we need more regulation that is collectively built with the voices of workers, not from a top-down approach. But also, more ownership of the digital infrastructure by workers is important and should be in the discussions as well.
>> AYDEN FERDELINE: You get last word, Raashi.
>> RAASHI SAXENA: I think there needs to be more conversation around this. There is a lot of cultural stigma about speaking up. We need to stop being in silos, start acting, have more conversations and share information around how we can effectively band together in places like this and hold companies accountable.
>> AYDEN FERDELINE: Thank you, everyone. It has been a pleasure being in your company today. I hope we can continue this discussion intersessionally and also at next year's Internet Governance Forum in Riyadh. For now, we can adjourn this session at 9:30 a.m. Thank you.