And we're live here at the MIT Media Lab. I'm Dazza Greenwood, a scientist here in the Human Dynamics research group, and I also run law.mit.edu. And of course, this is our very favorite week of the year, when we run the Computational Law course. As part of that course, I want to introduce everybody to a colleague and a new friend, Christina Colclough, who's a director at UNI Global Union, in the union space. And you're responsible for kind of everything related to the future of work there. We've been talking the last several months, as part of different affiliations and collaborations through the Media Lab and other partners, about very interesting work that you've been diving into with respect to workers' data rights. But just to get into things, maybe you could introduce yourself and talk a little bit about this future world of work.

Well, thanks, Dazza. And that's right: at UNI Global Union, we are a global union federation representing about 20 million people from around the world, and we have offices in various regions of the world. But I'm responsible for everything on a global scale in relation to how the world of work is changing because of technology. What is the role of artificial intelligence, of data? How can we see that job profiles are changing, with reskilling and upskilling? What does this mean for our social security systems, our welfare policies, and all of that? So that's the work I drive, on multiple levels.

That's cool. I'm sure you're very, very busy now. So the reason I wanted to invite you to help make this video, and to get a few points across, was primarily this initiative, or I guess program, that you launched and that we've been talking about, related to workers and data rights. Data is something near and dear to the essence of the research we do at the Media Lab on human dynamics. It's very much part of understanding how the law is transforming into this sort of next-generation cyber enterprise.
What does it mean from the perspective of workers, and just of that essential part of life in the digital age?

Well, this was exactly the entry point to all of this for me. It started by saying: we have all of this data that is being created continuously. No matter what we do, we walk down the street, we take the bus, the plane, we use our credit cards, we shop on this or that e-commerce site, we work in this direction, we have that bank. Everything we do is continuously creating data. And then we started hearing about something called management by algorithm: how algorithms, how data, are being used in human resource processes in companies. So in the hiring process, or in who gets fired, who gets disciplined, who gets a pay rise and who doesn't. And then you have to ask yourself the question: well, hang on, what data are they using here? Where are they getting it from? And what rights do the workers, or the applicants to a job, have over all of this data? So I started a couple of years ago looking really deeply into this, and was quite shocked to find out, in the beginning, that no law across the world on data protection includes specific paragraphs on workers' data rights.

Oh my.

None. So this was a huge regulatory void. And then you start thinking, well, you know, is it particularly different? Is there something we should be interested in? Then I'd like to ask, firstly, the people listening to this to take a couple of seconds here and consider: would you have the job you have today if an algorithm had hired you?

Wow. Okay. So that really brings it home. Because I've got to assume that most people right now would be thinking possibly, slash probably, not. And where does that leave us as a society and an economy? And where do we go from here? Because to an extent, algorithms are certainly useful. They're the essence of computation.
What questions would one ask, even just to start to frame this, so that we could begin to arrive at a framework that could work for everybody? Let's start with workers.

Yeah, exactly. And on workers, you know, I just want to say something here. Who's a worker? A lot of people think a worker is somebody who's in a factory, somebody who has a casual job somewhere. But in actual fact, the majority of us are workers. And I think we have to remember this, because we're kind of seduced into thinking: oh no, I'm a knowledge person, I'm an expert, I'm a specialist, I'm not a worker. That's exactly what you are. And I think this is something we just have to tap into. And to answer your question of what questions we should ask: well, we need to tap into the state of play first. And that is very much that, currently, you're giving data away when you apply for a job. Just think about that. You're giving away your history, your employment, your education. In many countries you have to state your age, your gender, your address, your email, your phone number. Now, all of this data is used a lot.

And just to pile on: increasingly, as a condition of surviving an interview to be hired, your Facebook account, your social media. So it becomes this 360-degree virtual dossier. And it's the dossier that keeps on giving, because of the continuous connections in our lives. And I don't want to break your stride, but it really is profound what we're giving up.

What we're giving up is profound. And to be honest with you, I think the majority of us have sleepwalked into this situation. We never really thought about why Facebook was given to us for free. Why was LinkedIn free? What is it we have been giving away? You know, my grandfather used to say: never accept anything for free, because it costs too much. Well, there you go. And I think we're waking up now to realizing what these costs potentially are.
And to your point, we're giving away a lot, and it takes on a life of its own. Now, this is the really important point. Because how is a company actually analyzing your CV or your data? What is it they're looking for? Are they looking at your opinions, the likes you make, the connections you have? Are your connections radical left-wing political people? Are they environmentalists? What does this say about you? So suddenly, interpretations of who you are, whether they're right or whether they're wrong, are being made. And you don't know. You don't know what they are looking at. What is in, what is out? What are the buzzwords, and what aren't? So this is just one tiny example in the world of work, but the question is much larger than that. The question is: who owns all of this data? The data that we produce as citizens, as workers, as we work. How quickly are you typing on the keyboard? How many products do you manage to sell? How much service do you manage to give? How often do you leave your desk? Every single movement is creating data. Where does it go? Who owns it? And you know, I spent a lot of time trying to follow the trace of the data, to find out who owns it. And of course you can't; it's not a tangible good. But I spoke to a couple of insiders: one was the previous Chief Technology Officer for President Obama, and the other was Reid Hoffman, the founder of LinkedIn. And they independently said to me that the Big Five own, or probably control, 60 to 70% of the world's combined data.

Oh my, oh my. So we're creating a monopoly of data. That would be a digital oligarchy the likes of which, you know, haven't really been seen in any prior empire. Empires have always been, you know, pretty focused on a continent. Sixty to seventy percent of the essential asset of the digital age is significant consolidation of power, ownership, and control.

Exactly.
And to a degree we've never seen before. Because what is data, essentially? It's information. Or it's knowledge, right knowledge or wrong knowledge. But it's creating a superiority that is beyond even the reach of governments. So, you know, we're having a monopolization of technical power, of knowledge, of social power, of economic power, and essentially also of democratic power.

Yes. So that was a nice bit of context there. So let's just say this is relevant, it's timely, and there's a sense of urgency. I would add: because as we move forward through this transition phase into the digital age, options narrow. Things that are patterns and conventions solidify into rules and architectures that become resistant, or almost impervious, to change. So now is the moment, while we still have some freedom of choice, while we still have some agency to be active participants, individually and in different groups and in society, in how to structure the role of technology and how to structure our environment, so that it's the environment we want to inhabit. What are the main questions you think we should be asking right now?

Yeah. And I think, you know, your sense of urgency is dead right. And usually I say we have two years to get this right. I'm shrinking that down to one.

Okay. So I think it's super. Let's go for it.

I think we have four or five really, really main questions, and it would be great if this group would discuss them.

Okay. So just as a quick aside to the crew in the Computational Law course: what Christina just said means the questions we're about to pose are your homework, in a sense. So there will be a test, a test of your mettle, among other things.
But we're going to be asking you, in fact I am hereby asking you, as the teacher, to go into the Pigeonhole on the page for Christina's talk and start providing your answers to these questions, and to vote on the answers of other students that you think are particularly worthy of getting some feedback as we go forward in this dialogue. Intervention concluded. Let's launch into the questions.

Okay. So, no doubt about that too; you've got some big things to do. So, I mean, we've outlined the current paradigm: the huge information asymmetry, this data ownership model, that it's unfair, that it's uncontrolled. So my first question would be, and this is almost an existential first principle: could we imagine a different way? Could we imagine that all of this around data, around AI, is structured and governed around other principles than the current kind? Okay, this is the first big question. Now, if I can give you a little hint on what you could think about: could we imagine an ownership which is more collective? Could we imagine a protection of data built on human rights foundations, or the Declaration of Human Rights?

Correct.

Could we imagine that we actually didn't change anything? Could we imagine a model where we're paid for our data? So just think really broadly about: if we want to change a paradigm, on what principles should we do that?

Okay. What else?

Then, keep that idea in your head, and do a little mind mapping on: what would the nightmare scenario be? Where could it go wrong? Where would this ideal situation you've thought up have its weaknesses? Where could it be exploited? So, what would be the unintended consequences of this model? I think that would be a really good second thought.

Okay. Number two: the unintended consequences, the nightmare scenarios. Let's say we created a new paradigm. Ask yourself the devil's advocate questions.
Could this lead to an increase in inequality? Could this lead to an increase in the wealth of the 1%?

Yeah.

When you were trying to do the opposite. So just think about it; play the devil's advocate. But then also really think about what the benefits of your paradigm would be. So, I could pose, from my point of view as a unionist: you know, I believe everybody should work. We should keep people in work. Artificial intelligence? Great, we can work with that. We can change our tasks, we can move, but people should work. What would happen to the social cohesion in our societies if a lot of people were unemployed? So that's my paradigm; I'm going to go in and create something with data around that. But what's yours? And what dream scenario are you creating?

Okay. So: number one, what is the appropriate, I'd almost say, high-level framework. Number two, what can you imagine as the unintended consequences, maybe how it could backfire. Number three, what would the benefits be. So not only how it could backfire, but also what the benefits are.

And here you could point at anything. You could point at the benefit for the local community, for the economy, for the environment. It could also be, on a balanced scale, what the benefits could be for global growth and equality and sustainability. So pick your thing. Okay. And then, enough of the grand thinking, and down to how we actually go about this. If you want to change a paradigm, what legal changes, what institutional changes need to happen? What do we need technically that we don't have, or maybe do have but need to apply in different ways? What will need to change socially? Big questions, I know, but pick and choose; just keep this in mind. And then of course comes a question that is essential, and this is the fifth point here. It's really essential for this class, I think: how do we engineer the legal responsibilities and relationships around this?
Do we have to change competition law, for example? Do we have to find new ways of defining human rights? Do we have to find new ways altogether? So: what relationships, roles, and responsibilities in a legal framework fit your new paradigm?

Okay. So I think we've got a nice assignment here, basically: can you postulate a framework? There's that old saw: if you don't change direction now, you're likely to end up where you're headed. So we've postulated some not-very-good aspects of the direction we're headed now, if there's just a linear extrapolation, and we've indicated there is an opportunity for course adjustment. So I feel like, in that framework, the first question is: if we were to adjust course, what would the landscape at this other destination be? Describe it a little bit. And then we come to some questions about the pros and the cons, in a sense. What are the risks and the costs? What are the benefits, and why would we value this more? Prioritize it. That's questions two and three. Number four gets to navigation to this new destination: how would one begin to manage it, to catalyze it, and then to make it happen? There are so many aspects of that: the economics of it, social ethics, and so forth. You sort of mentioned legal in there, but I want to suggest maybe going light on legal in number four, because finally now we come to the dessert, for legal hackers anyway, and for law.mit.edu, which is of course: can you postulate, let me go to this camera, here we go, can you postulate the legal framework that would support and reflect this proposed new destination? And there's so much to that. There are particular rules and regulations governing the workplace and hiring. Then you go back to all the commercial codes and statutes. What lines of judicial rules could be extrapolated? What would the contract clauses be? And then, what does it mean for interpretations at a constitutional level? Can we get there from our existing constitutional fabrics?
Would some reinterpretation be needed, some amendment? And then at the highest level, or maybe, in a sense, the most fundamental level: what is the social compact? What are the basic understandings and agreements that underpin all of this, almost like the philosophy kernel at the bottom of the stack? So, what a terrific exercise. And the way I would suggest doing it: you can team up with others, or you can do it yourself, but I think instead of answering the questions in sequence, answer them in one big gulp, just hitting questions one, two, three, four, and five together. And we'll work with you on a way to express that, so that we can start to share different ideas and then discuss, synthesize, iterate, evolve, and present at the end of class. And I wonder if we could maybe ask a little favor of you. I know you're very busy and you've got a lot on your plate, but if we were to come up with some potentially interesting ideas, could we share them with you for your feedback?

Absolutely. No, I would love that. And I think the more creative you can be, the more you can work on the benefits of this, on what we'll have to change to get there, you know, the navigation, the more I would love to give you feedback on that. And I'd love to also take that forward in my work, and hopefully urge you all to do the same. So yeah, absolutely. Thank you. Good luck with the course.

Appreciate it. The gauntlet has been thrown down, the challenge has been cast. Let us not shy from this. Good luck, everybody. Let's go get it.