Welcome everyone, good evening. Thank you for joining us, and thank you to our panelists for being here. Reagan McDonald leads Mozilla's policy work in the EU, covering a range of issues including privacy, data protection, content regulation and disinformation. Prior to joining Mozilla, Reagan worked at Access Now and before that at the European Digital Rights Institute. We're also joined by Annabelle, who leads Amazon Web Services' policy work on data in Asia Pacific. Welcome, Annabelle, we're really looking forward to hearing your views today. Sean, who isn't here yet, is a CIGI Senior Fellow. He's the co-founder of Digital Public, which builds legal trusts to protect and govern digital assets, and is the CEO of FrontlineSMS, a global technology social enterprise. He's also a fellow at the Duke Center on Law and Technology, a visiting fellow at Stanford's Digital Civil Society Lab, and an advisor to IEEE's Ethics and AI Committee. Welcome all. Before I get to asking questions, I wanted to set the stage for today's discussion. As the driver of growth in the digital economy, data is fairly obviously of strategic value. Incidents like Cambridge Analytica have exposed how tech firms collect and exploit data to influence our behavior, and reveal the power that data holds over our lives and democracy. Increasing awareness of our growing dependence on data, and of its value for society, has generated interest from policymakers seeking to control the market for data and set rules for how data is gathered and used. Two large data blocs are emerging. The first is led, of course, by the US, which has prioritized a free market approach to data governance. The second is the Chinese model of data governance, which is more state-oriented in terms of control over data. A third bloc is also emerging, led by the European Union, which is championing a rights-based approach to data governance.
From my research and from speaking to policymakers and other researchers working in this area, the second understanding is that at the heart of regulatory efforts for data governance there are two key objectives. The first is enabling access to data held by private platforms, since without access to vital data, neither the public nor the private sector will be able to exploit the benefits of data. The second objective that policymakers seem to be pursuing is to tackle the market challenges presented by data monopolies. Platforms that are effective due to their network effects become data monopolies as they consolidate their control over data. A regulatory response is required, and while some are pursuing antitrust action in this area, breaking up big tech also reduces the network effects which make these platforms valuable in the first place. Much of the debate on data governance in India and beyond has so far focused on protecting personal data. We've had the General Data Protection Regulation in the EU, which has carried forward the legacy of the Data Protection Directive and introduced a new sanctions regime which, as Professor Thomas Trines has noted, seems inspired by EU competition law. India too has been focusing on regulating personal data, at least since 2017, after the Supreme Court of India's decision in K.S. Puttaswamy v. Union of India, which held the right to privacy to be a fundamental right. A government-appointed committee has worked on a draft personal data protection bill, which is yet to become law. While the personal data of customers and citizens is protected by privacy laws across the world, non-personal data remains largely unregulated. Non-personal data includes anonymized data such as climate trends collected by a weather app, data collected by machines, or commuter patterns gathered by cab aggregators.
In India, a committee of experts constituted by the Ministry of Electronics and Information Technology to come up with a framework for non-personal data governance has released its report and is accepting comments until the end of this month. So, our objective in this panel is not to really analyze, and welcome Sean, sorry, I was distracted by my screen. So, again, this panel is not going to get into the nitty-gritty of the non-personal data framework; it's a long way from becoming law. Rather, we want to explore some of the concepts that seem to be informing policymakers around data governance and the strategies being built around it. At the heart of the committee's report on non-personal data, for example, is this idea that data is an economic resource that can be owned and accessed, and the question of who has ownership and control over the value generated through the production of non-personal data. So, I'm going to start with Reagan. We've seen that data is reordering markets, and its economic and social importance has led to data being compared to natural resources like oil, and we see the focus on trying to regulate data as a valuable resource. We've also seen a slew of new legislation in the EU: apart from the GDPR, we have the Digital Services Act, the Digital Markets Act and the Data Governance Act. So I was wondering if you could talk us through where the EU seems to be coming from and where it is headed in terms of regulating both personal and non-personal data. Thanks. Hi, good evening and good afternoon everyone. Indeed, the EU has been making a lot of headway in thinking, like many other governments including India, about what the future of data governance should look like, and there have been some really interesting regulatory proposals that go in that direction.
First, it's really vital to acknowledge that data can play a key role in industrial policy and also serve as the basis for insights and innovations that can advance the public interest. We've seen this thinking evident in the Data Governance Act, which the EU proposed just at the end of last year, where the EU is really trying to wield data as a tool to advance the public interest and also to regulate monopolies. There are a lot of good ideas here, but actually achieving that, making it practical and avoiding the risks, will require quite a few details to be worked out in this process. So, first, on the public interest piece: we do have the GDPR, as you've mentioned, but those rights, and data protection frameworks generally, at least in the EU, are still really centered around individual protection. We haven't quite gotten to the point of thinking about collective representation, collective protection and the mitigation of collective harms. Jodi, you had already mentioned Cambridge Analytica; that was also my example, because that's really where we're seeing a lot of these data-related harms that don't live at the level of the individual. In that case in particular, you didn't have to have your data taken or given to Cambridge Analytica to have experienced the harm. There's a lot more thinking now about how we can build on individual data protection rights to think about the collective, both in terms of management and in terms of protection. So that's one piece: at this moment those collective rights aren't quite where the GDPR has landed, but we hope that this will be part of the discussion in the EU.
On the industrial policy side, the EU is really, as I mentioned, looking to improve competition and open up markets with different data approaches, and many governments are becoming much more savvy about the role of data. We're seeing this not only in, for instance, the Data Governance Act but also in a lot of antitrust trends. There's much more awareness of how data and market power are intertwined, so opening up one can lead to competition or more open markets in another area. At the same time, while encouraging a lot of these ideas, we've been trying to be cautious not to duplicate existing problems in new frameworks. For instance, the EU is thinking about data intermediaries as a way of replacing these data monopolies or private companies, and we don't want to just replace US data guzzlers with EU ones. So this is also an opportunity to think about different ways of doing data governance; Mozilla has long been a champion of lean data practices, so maybe there are new ways to think about how we conceive of data, particularly in the context of innovation. And then there are also broader risks that we've been very public about, around the possible centralization of power and around increased privacy and security risks.
Just generally, around data stewardship models, there is a lack of practical examples of how these models, whether trusts or otherwise, can really work in practice. So like I said, I think there are a lot of promising ideas, and it's really a good direction, but there's quite a bit to be worked out in terms of the details. We have been advising governments that they should be consulting often and openly with experts, from civil liberties to security to business, to better understand how to forge a future way of thinking without recreating the same systemic problems that we have now. Thanks Reagan, I have so many questions for you, but I want to let my other panelists get a word in. Sean, as somebody who has been critical of the Data Governance Act as recently as 20 days ago, and who has been working on the idea of data trusts and community rights for quite a few years now: what is your reading of the EU strategy and the India strategy, this whole idea of trying to establish a framework of community rights over data, whether personal or non-personal, and where do you see it headed? Thank you, and thank you for inviting me to this. I think the old adage is that any panel with two McDonalds on it is at least bound for economic prosperity, so good news on that front. I have been so fascinated by the way that data trusts, and the rights associated with them, have been interpreted in these legislative frameworks. I just want to set out two or three of my biases really quickly, just to be clear, so the rest of it makes sense. The personal versus non-personal data distinction, I think, is an extremely precarious one in today's computing environment and today's data availability environment.
So when we talk about personal versus non-personal data, the distinction to me is a really blurry one, and there has been such great commentary throughout this series about the definitional ambiguities and the concerns there. I just want to say: if what we're trying to achieve are the sorts of social and economic outcomes that we're describing, then I think we have to grapple seriously with some of those definitional issues. The second piece of this is that there's a lot to empathize with here, because there is a need to create a professional management infrastructure for data as a cross-cutting industry. At the same time, a lot of what we talk about as ownership of data, or frankly a lot of what we talk about when we talk about data, full stop, is not really data the object. It's rights that emanate through or are associated with that data: my right to be represented in a particular way, or my right to access a particular service. My rights are what convey what in law is usually called standing, and that's what gives me the right to bring a claim against something, or to participate in the decision making around how data gets governed and used. I only start with those assumptions to say that a lot of what both the NPD report and the Data Governance Act are trying to do is start from an existing framework, which everybody knows has some familiar holes, and build this kind of professional edifice, a professional set of standards of treatment, on top of it. And my experience has been that exercising agency is the bigger problem than almost any of the individual issues that we talk about. So if you're a person who feels wronged or misrepresented, there are limits to what you can do to help correct the commercial ecosystem; it's very difficult to compel change in it or to seek resolution.
So you see data trusts becoming about stewarding data and not stewarding rights, and I think that's the big cleavage we start to see between what trusts historically are, which is the management of a very specific asset, and what stewards, representatives, attorneys, doctors and insurance brokers are: people who go forward, theoretically as a service to you, to represent your interests in a complex ecosystem. No matter what that profession is, it's your ability to manage your relationship with that service provider, and the standards not only of care, and it's lovely to see the standard of care articulated in the NPD report, but of loyalty, which I think is conspicuously missing: how it is that you enforce that these folks are actually governing in your best interests. Those are the issues for governance. That's what builds the trust, that's what builds the fidelity of the ecosystem, that's what creates effective rights brokerage. Articulating not only whose interests but what interests you're representing in that ecosystem is how you derive the standard of care in most settings. So what we have at this point is the tabletop without the legs. These proposals are trying to prop up and support commercial ecosystems while recognizing that realizing the rights attendant to those ecosystems is a gigantic undertaking. The policies build on really good ideas and, to really belabor this analogy, they clearly set the table, but there are really important missing legs, and so it'll be interesting to see how, as those proposals move toward legislation, they start to fill in some of those gaps.
And Sean, we are going to circle back to many of the issues that you raised, particularly this binary split between personal and non-personal data. But Annabelle, you are based in Asia Pacific, which has been the hub of some really interesting strategies around data in recent years, and you represent a transnational company that has to navigate these various strategies. What has been the impact of such strategies on the company's approach to data, and how are companies modifying their practices or behavior to cope with these multiple strategies and framings of data governance? Good evening, everybody, and thank you so much for having me today. Just for a bit of context, I think it's helpful for me to also state my own personal biases. I'm with Amazon Web Services, the cloud computing arm of Amazon, and that means that for the most part we don't really manage personal data. A lot of our customers are companies who may deal with personal data, but often we don't have any control or visibility over that data; that's our customers' data. So that's just setting the stage very quickly. Before my time at AWS, I was actually with the Singapore data protection authority, and DPAs out in Asia are very interesting: we were part of the same organization that was also the industry engagement arm of the Singapore government for the infocomm industry. In my time there, my portfolio had me dealing with very day-to-day issues: having written a law, we had companies asking many, many questions about how to comply with it, and of course there were also cross-border data flow restrictions in some shape or form in the Singapore law. So even though it's fairly permissive, there was still a lot of confusion from companies.
On the flip side of that, we had the data innovation mandate as well, trying to help Singapore companies who are interested in innovating with data to do so in a safe manner. So I very much come from the mindset that privacy and innovation are not necessarily mutually exclusive. That's not to say, though, that you can go out and do whatever you want with data and there won't be any ramifications. Ultimately, if you're a company operating in multiple jurisdictions and you're thinking about data governance as a whole, what is really important at the end of the day, from my perspective, both wearing a privacy hat and in terms of ensuring that the value of the data you're creating is being managed and maintained properly, is making sure that data is ultimately safe. What do we mean by safe? You could have all the best intentions in the world; we often focus on the Cambridge Analyticas of the world because they capture the imagination, and you do sometimes have genuinely bad actors, but most companies don't set out to do bad things with the data they've collected, usually for a very functional, boring purpose. What often happens, though, is that they don't think about how they're governing data well, and that results in a data breach. And if that data breach is not containable, the data really just goes out into the open. So when I was wearing my hat as part of the data protection authority, a lot of my focus was therefore on talking not only to the companies we were responsible for in my country, but also to regulators, privacy regulators in particular, about the importance of focusing on security controls, and privacy regulators do not like to think about security controls.
Security is a principle in the law, but it's very technical, and so usually the attitude is, we'll leave it to a security expert to think about, but you have to have a reasonable standard of security in place. The challenge I'm seeing, candidly speaking, is fragmentation. I know the question you posed to me: when you're in Asia and you look at it, you see maybe three big blocs, I don't know if you'd call them data blocs, but kind of data regulation framework blocs, and increasingly we're seeing fragmentation. With greater fragmentation, companies who operate in multiple jurisdictions need to think about how to create a global baseline framework, we can call it, and then build on top of that. That's maybe more feasible for a company that's well resourced, but it becomes incredibly difficult for smaller companies who may not have those resources. A lot of the time startups, for example, are dealing with lots and lots of potentially sensitive data, but they may not have the resources to build a full data governance team and think deeply about how they're going to meet those standards; they do what they feel is a reasonable best-effort approach and focus on that. So at the end of the day, the challenge of having many different laws, and of trying to figure out how to comply with them and with all the different principles entailed in these different laws, ultimately means that companies might take risks. And, wearing that data protection authority hat again, that's the worst thing that could happen: you create a framework that in theory sounds great on paper, but when you implement it in practice, people can't implement it, and then they don't implement it and don't focus on a lot of things.
So, at the end of the day, and this is a very roundabout way of me saying it, I think data security is really important. As a cloud service provider, we often worry that in order to comply with a set of requirements that seem really complicated, people take what in their view is the safer option, which is maybe keeping data in on-premises storage, in a data center they built themselves, a little server under their desk. That sometimes means they aren't able to maintain the level of security they should on that server, while it still carries a lot of really important data, and that increases the risk of data breaches as well, and then that cycle perpetuates. Briefly, then, I want to touch on what I felt as I listened to the really wonderful panels that have been put together on non-personal data. It's all well and good to talk about the rights. It's a very noble thing that the Indian government is trying to do: there is data that exists, so how can we access that data in some shape or form and do good things with it? That's a very noble aim and a very noble outcome. But the thing we're not talking about, and I think Sean touched on this briefly, is how scary it is if the governance frameworks aren't in place when whoever it is, for whatever reason, using whatever mechanism, is able to obtain that data. And I'd like to end my opening remarks really on this note. I think it's incredibly inspiring that the drafters of the Personal Data Protection Bill decided that instead of calling them data controllers, they would call them data fiduciaries. And when you think about a fiduciary, you think about fiduciary duty; Sean mentioned the duty of loyalty, and there's also the duty of care.
So the idea that you have that fiduciary duty in the personal data protection space, but somehow don't have that fiduciary duty laid out in the non-personal data space, is deeply confusing. I think we will spend more time talking about how that line is very artificial: ultimately, you could be using non-personal data to affect an individual, with really bad outcomes, even though it was maybe just weather data about the area somebody lives in. Combine that with enough information about an individual, even information that has been anonymized, and you can build some pretty granular profiles. So when you really think about all of that, it shouldn't matter whether the data is personal or non-personal. What should matter is that whoever is creating that data, and whoever is using that data, and those could be the same or different people or entities, still carry that fiduciary duty of care: putting the right protections in place on the one hand, but also making right and accountable decisions. That's obviously going to be challenging, but I think it should be the focus as more and more people start thinking about using data. So I'll just stop there. Thank you. It is quite interesting that the non-personal data report actually states that while the data protection authority's objective is the protection of data, the non-personal data authority's objective is to make more data available for innovation and economic and social benefits. Thinking about it while reading the report, I thought: these two regulators are constantly going to be fighting with each other, and we can see that kind of tension growing over the years. But Reagan, I want to circle back to you as someone working out of a jurisdiction where these conversations have been going on for longer.
Do you see this distinction between personal and non-personal data as something that is sustainable? We understand the objectives, and, like Sean mentioned, we probably empathize with where this is coming from. But (a) if you are trying to control data, is creating a personal versus non-personal data binary really the most sustainable way forward? And (b), and here I'm getting into a bit of speculation, has non-personal data become the object of regulation because the GDPR is in place and already defines rights and boundaries around personal data, so that the only way to access data is to create this alternative framework, along the lines of the distinction the non-personal data framework in India is drawing, where the data protection authority will take care of protection and we will focus on innovation and economic prosperity? Thanks, Jodi. I will try to answer both of these questions. These are really good questions, and I think they're really at the core of some of the challenges we're looking at in the discussion of the Data Governance Act and future data governance regulations. There are actually maybe two binaries here. One is the technical one: is anonymization possible, feasible or sustainable? And another one, which already came up in Annabelle's and Sean's points, is whether the same level of protection should apply to non-personal data as to personal data. On the first one, the anonymization binary: anonymization is generally a very good and encouraged privacy-enhancing technique. But the issue that arises, and what many security experts continue to say, is that it may not even be possible to achieve anonymization.
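The re-identification risk Reagan is pointing at can be illustrated with a toy linkage attack: a "de-identified" data set joined with a public list on shared quasi-identifiers. All records, names and field choices below are hypothetical, purely for illustration:

```python
# Toy linkage attack: names are stripped from one data set, but joining
# it with a public list on shared quasi-identifiers restores them.
deidentified = [
    {"zip": "110001", "birth_year": 1984, "gender": "F", "commute": "metro"},
    {"zip": "560038", "birth_year": 1990, "gender": "M", "commute": "cab"},
]
public_list = [
    {"name": "B. Rao", "zip": "110001", "birth_year": 1984, "gender": "F"},
    {"name": "A. Kumar", "zip": "560038", "birth_year": 1990, "gender": "M"},
]

def link(deid_rows, public_rows, keys=("zip", "birth_year", "gender")):
    """Attach a name to every de-identified row whose quasi-identifiers
    match exactly one person in the public list."""
    linked = []
    for row in deid_rows:
        matches = [p for p in public_rows if all(p[k] == row[k] for k in keys)]
        if len(matches) == 1:  # a unique match means re-identification
            linked.append({**row, "name": matches[0]["name"]})
    return linked

reidentified = link(deidentified, public_list)
# every supposedly anonymous commute record now carries a name
```

The attack needs no special access, only a second data set sharing a few attributes, which is why, as the panelists note, "anonymized" is rarely a permanent property.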
As we continue to live in a data-filled world where you can constantly add and combine data sets, even public pieces of data sets can lead to the identification of individuals, or of groups of people, which is just about as good as identifying an individual. It becomes a very difficult topic. In the context of regulation this is almost impossible, and I've seen some confusion around the use of the term "anonymized", which implies that the data cannot be re-identified, period, not possible. On the NPD report, we have noted in our public comments, and will reiterate in our forthcoming comments, that this binary in particular just does not exist, and that is something that really needs to be taken into account. The way the GDPR handles this is that there is no anonymization in the GDPR; it's called pseudonymization. So there is an open acknowledgement that anonymization might not be technically possible, and pseudonymization is used as a privacy-enhancing technique: companies, for instance, split the data sets and protect them in different ways, which makes re-identification harder, so that when you have a data breach you're not creating a whole bunch of new risks. Again, it's not perfect, but in the context of risk mitigation it can work. There are also other theories around considering the context and use of data and assigning risk accordingly, in addition to the re-identification aspects. So that's on that binary. On the other binary, it's a really good point to remember about non-personal data that there is, at least from what I've seen in the political discourse, this sort of understanding or hope from regulators that if it's non-personal data, it's a whole different thing: we're talking about data flows, we're talking about industrial data, none of this applies, this doesn't count under the GDPR.
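A minimal sketch of the splitting idea Reagan describes: the direct identifier is replaced with a keyed hash, and the key is held apart from the records. The field names and key handling here are hypothetical, just to show the shape of the technique:

```python
import hashlib
import hmac

# The key is stored separately from the pseudonymized records; whoever
# obtains only the records cannot reverse the tokens, and records
# pseudonymized under a different key cannot be linked to these.
SECRET_KEY = b"store-me-separately-and-rotate-me"

def pseudonymize(record, id_field="email"):
    """Replace the direct identifier with a keyed hash (HMAC-SHA256)."""
    token = hmac.new(SECRET_KEY, record[id_field].encode("utf-8"),
                     hashlib.sha256).hexdigest()
    out = dict(record)
    out[id_field] = token[:16]  # truncated, stable pseudonym
    return out

row = {"email": "user@example.com", "commute": "metro"}
safe = pseudonymize(row)
# the same input always maps to the same token, so analysis within one
# data set still works, but a breach of the records alone reveals no email
```

Because the tokens stay stable under one key, the data remains usable internally, which is exactly why, as Reagan says, this is risk mitigation rather than anonymization: whoever holds the key can still re-identify.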
It is, I think, a blurry line, but again, to the previous point about the age of data, where everything can be combined and identified, that must be taken into account in the equation. And while in the EU data protection and privacy are fundamental rights, and non-personal data sits in a different place, there should be a corresponding duty of loyalty and duty of care that applies, to ensure that this overall risk is accounted for. Otherwise we're creating a lot of risks and, in the EU context, undermining personal data protection, which would be counter to some of the main objectives. Thank you, that is a really useful perspective to have on this, and I hope that the committee members are watching this, or, since I believe you will be submitting comments, I hope this is reiterated again. So we've dwelt on this idea of ownership and this idea of data as an economic resource, but we have other framings as well. I want to ask you this question, Sean, because CIGI has done a paper capturing the various analogies and metaphors that are used around data. I was attending a session today where the committee member Parminder Jeet Singh was also presenting, and he talked about the rights framework and data as labor, and how those could possibly also be useful for informing work on data governance. So: are there various models for how you actually think about data trusts? Is the Indian model really different from the EU model? Is it, like you said, data-centric, or is it a data collector-centric model in the EU? If there is to be a data commons, what are some of the principles? Is a commons even possible with respect to data? And if we are to move in that direction, what are some of the principles we need to think through, that you're aware of?
Those are great questions, and I'll just reflect that I agree so much with both Annabelle's and Reagan's comments. One of the things that I wrote for CIGI, in a series they did on platform governance, was about fiduciary supply chains: this idea of how we trace loyalty through contracts. If I've signed twelve terms of service down through APIs, who at the top of the food chain, or the bottom, depending on how you think of it, is the holder of that fiduciary duty, and how does it travel through those digital relationships? So just to offer that, it's wonderful to see so much of the conversation starting to interrogate how that is and is not real. I think fiduciary duties, generally speaking, are kind of like directions: they work when they're really specific, and the less specific they become, the more frustrating and prone to conflict they become. So when we see broadly defined duties, the questions become: trustees now hold these duties, but do they have more access to courts? Is there a separate court that is going to adjudicate disputes between trustees? Are we just outsourcing the political and social risk to this future professional class of people, who are going to get stuck in the same access-to-justice issues as every other kind of rights protection provision? I know that's not exactly the question you're asking, but what I'm saying is that in law we have procedural justice and we also have substantive or normative justice, and what that means is that the system has to work to a certain standard in order for us to believe that any justice is happening at all.
And I think what we see right now is that things like the NPD framework, the Data Governance Act, and any number of other pieces of legislation really focus on building a market for exchange, but don't focus on any of the institutional investments necessary to adjudicate the very predictable disputes, legal grey areas, and questions that are going to come out of figuring out what these things mean in context. What happens when you have big procedural-mismatch issues is that you can have whatever theory of rights you want, but it's the theory of rights that is most effectively adjudicated that tends to win the day. Think about ownership: ownership is probably the best institutionally supported theory of rights in the world, because money moves through it and everybody likes to move money, so we make sure those systems work. But it's weird with data. If I sell you a data set, the assumption is not that you collect spreadsheets; there's no list of all the spreadsheets such that you just have to have them all. You are presumably ingesting things that you take as a representation of fact. You may put a confidence interval on it, you may add caveats, but when I give you data I'm not selling you an item, I'm telling you a thing, and the way that law treats how we tell each other things versus how we sell each other goods is a big difference. So the reason I keep focusing on fiduciary law and procedural justice is that India specifically, but many, many countries in the world, have huge civil-law backlogs. The mechanisms of institutional justice are simply overwhelmed, understandably, because we keep piling more and more things on them.
But something like the NPD framework needs a reinvestment in the legal sector, in the adjudicating authorities, in clarity; to Annabel's point, not only guidance at the beginning, but also guidance on how you go and settle problems if the initial guidance isn't working for you. You have to have the whole life cycle. And I think what we're doing is saying: let's make it a market issue, let's focus on the economic value because it's really legible to us, and then let's kick the can on how we're going to deal with the fallout. That's not working particularly well; I say that from Washington, DC, so, hello. There are real consequences to not taking the governance aspects of this seriously, and I'm certainly in a place that's been feeling them quite recently. There are different categories of rights, but no matter what the theory of rights is, in order for it to work you have to be able to adjudicate it, and a fiduciary becomes an important part of the procedural-justice infrastructure. I think there's a real opportunity to do important work here, but it's that piece that feels really unclear to me. "Trust the trustees" is my takeaway from that. And actually, on this whole notion that, at least for big tech, these sorts of regulations are aimed at tackling the competition issues, or the market challenges, that arise from data firms becoming monopolies:
it's also likely that these big firms are the ones that will be able to put in the infrastructure needed to cope with these various regimes. Particularly for startups, how is this going to play out in terms of innovation? Is taking this sort of approach actually good for innovation in the long term? What is your view of this, coming from the perspective of trade secrets or intellectual property? I know the second version of the report has gone into some level of clarification on what kind of data is not covered under the NPD framework. And going back to what Sean said: it's not just the data you're collecting. A company collects data in a certain context, and value is derived from that data set in that particular context. Even if you were to take that data set and make it available, it's not necessarily the case that you've tackled the fact that this big company is deriving a certain kind of value from it. So, just your thoughts: does the committee's thinking on this issue need more reflection, or is it suitably advanced in terms of providing the kind of regulatory clarity the committee is actually aiming for? Yeah, you've asked quite a number of very interesting questions; I'm going to try to pick up on a few, and if I leave anything out, please let me know. The first thing I want to say before we jump into anything is: it will 100 percent affect innovation. And the reason is not the underlying idea; it's a great idea to go and figure out how to do data trustees right, and to have good data-sharing frameworks and good data-governance frameworks in place.
All of those things are great, and a lot of governments around the world have been trying to think about whether there are things they can do to help build trust as companies try to share data with each other voluntarily, and whether there are missing parts in the legal system that may not address certain rights. For example, if you had a contract with somebody, and you agreed that they would only use the data in a certain way, and then it's used somewhere else, how do you pass through liability, or fiduciary duties, and so forth? So thinking about that framework is great, and thinking about that framework specifically would, I think, actually encourage innovation and data innovation. The issue with the NPD framework is not that it has raised a whole bunch of great questions around data governance. The issue with the NPD framework, the thing that's really harming innovation, is that it mandates the sharing of data in a very vague way, with a very vague notion of somebody, who could be a boogeyman, who could be called a trustee; we don't know who this is, and we don't know whether he or she knows how to manage, govern, and protect that data. That's what is causing uncertainty in the market. That's what makes VCs go, "Well, hang on. If I invest in this company, an Indian company, and they go big and do lots of cool, innovative things, the government could turn up tomorrow and say: I want your data, I'm going to take it, and I may or may not protect it properly, and I may choose to use it for whatever means. I may use it in a way it was never intended to be used and come out with really terrible analysis, but rely on it because I somehow think it's good data."
And then pass the buck back over to you if something goes wrong, and say, well, it's clearly your fault that my policy analysis failed, because I did evidence-based policymaking and relied on your data, and your data told me X, but it turned out to be Y. So there are so many problematic issues around data sharing, data governance, and data trustees that need to be addressed first, before you talk about mandating anything. If we think about the NPD conversation as it has evolved over the last two years or so, I think it raises a really great point, which honestly is: I don't think India knows how to do data governance here. For that matter, I don't think the world really knows how to do data governance here. I don't think the world knows how to ensure that fiduciary duty is being passed on, and the liability with it. How do you judge it if someone takes data that turns out to be bad data and makes some really bad, life-altering decisions for someone on that basis? Whose responsibility should that be? So I think there are some really important questions, and we see some of them mirrored in AI ethics and governance, where there's a really robust and deep conversation happening that we don't really see happening as much in the data-sharing world. Maybe that's because the term "AI" is sexier than "data sharing"; I don't know. But I think at some point that conversation needs to be had. So I would say no, I don't think the Committee of Experts' report goes anywhere close to addressing that, but I think it's a great starting point. And maybe the real focus for India, if it wanted to build innovation, if the government was starting to think about making some of its data sets publicly available as open data, or even just sharing them among government agencies,
maybe that is the question they should be answering first: how do we do data governance and data trusteeship well? That's what it really comes down to. I think there was another very interesting angle; I know we keep talking about personal versus non-personal data regulation. Can I just say this? It's been on my mind for a while, so I just want to get it off my chest. I think the problem is that the regulatory framework we're calling "non-personal data" should not be called non-personal data. Non-personal data is, I guess, the proposed scope of the framework, but it's not actually what the framework is trying to do. The framework is actually trying to enable data sharing, and in some cases mandate data sharing, and we really have to debate whether that mandating portion is necessary at this stage; my answer would be no. But let's call it a data-sharing framework, or a data-governance framework, or a data-accountability framework. That would make a lot more sense, and then we'd stop fighting about whether or not personal data and non-personal data are the same or different things. They're completely different regulatory frameworks. Personal data protection is about privacy; it's about the individual. A non-personal data framework is about how you share data better, data that incidentally happens not to identify the individual, at least as the focus of the framework. So I think that's another thing to call out, and to say very quickly. I'm not sure I've answered all your questions; if there's anything I missed, please let me know. I've been thinking through these questions as I speak, so I probably don't remember them all. That is a fantastic point about actually naming this for its intention: it should not be called non-personal data. Reagan, did you want to respond to something that Annabel said?
I did, actually. I think her points resonate, especially on the innovation piece, thinking as Mozilla, which is a medium-sized, and in some ways smaller, player on the data governance stuff. But before that: I think it's quite an eloquent idea, actually, to stop calling it non-personal data. That would solve a lot of problems, because it already embeds the risk into the name; it's just data. We know that inside it there will be personal data and non-personal data, and it's a mix anyway, so just acknowledging that, I like that idea. On the innovation piece, just from our experience of working with smaller companies complying with the GDPR, and with governments thinking about their data-governance regimes, there are two points to highlight, if the Indian government is watching. One, in terms of mandating sharing: I think we've also been public about this; we also don't think that's the right way. There needs to be value in the sharing, and in order to find that value there has to be genuine and meaningful consultation with small and medium-sized companies. I can say that in the EU this is often very difficult and doesn't always happen, because the big tech companies are always there and they're better resourced, and then you have associations and organizations that seem to be representing small companies but are not really actually representing them. So I would just say, to any government looking at these types of regulations: really try to find those companies, and get from them what they think would be valuable, so that they can actually derive value from these types of regulations.
And that leads to the second point, which is another problem we've seen with small companies: misunderstanding of the regulations themselves and how to use them. They don't have armies of lawyers; a lot of these smaller companies will have one, or two, or maybe five, but not a lot more. So what we've seen is a lot of misunderstanding and very conservative readings of this legislation, and that's where you actually get the innovation chill. It's not in the law itself, but in the way that the small companies and startups perceive it. So there needs to be a better connection between those who are crafting the laws and those they want to actually benefit, and less attention to the bigger companies, who are often crafting it for their own value. Sean, do you have anything to add? Yeah, sorry, I'm apparently unable to control the mute button on my trackpad. Two really wonderful comments. I actually run a small business called FrontlineSMS, which started as a last-mile technology company, and I have the unfortunate problem of being trained as a lawyer, so for the last ten years I've been the legal nerd in the kind of small company that doesn't have the resources to address these issues but definitely sees all of them. The thing I wanted to speak back to, actually, was the seizure of power, or sorry, the seizure of data, the mandated sharing of data, whichever term we're using. Historically speaking, there is enormous legal precedent for governments essentially compelling the disclosure of things from private entities, whether individuals, groups, or companies. But it is, as Reagan said, under specific circumstances, and it is an extension of what we think of as the emergency wing of government powers.
So, with full-blown emergency powers, there is typically oversight by an independent or checking authority, like a legislature or a court. Governments are able to seize entire tracts of land, and can lay waste to all kinds of rights, in pursuit of a sufficiently important public good. But we recognize that most goods aren't that important, and we don't want to let every government completely off the hook of accountability and checks on behavior. So we have reduced versions of that: things like subpoena authority, wiretapping authority, public disclosure, and taxes. These are all ways in which private entities contribute back, but they are all specific, described under a set of circumstances or relationships, and uniquely requested. And I think one of the things we're seeing in the move to fiduciaries is that, while there is a real positive advance in building infrastructure for bottom-up accountability, essentially for the first rung in the supply chain, you've got to be really careful with creating functionally emergency powers that operate under umbrellas as administratively and definitionally abstract as "the public good." So, to agree with both Annabel and Reagan: there is a framework, and one of the things happening in this conversation, in this report, and in the adoption of data trusts globally, is this moving of emergency powers from restricted, relatively checked, finite use in the digital realm and in data toward this more ambiguously administered public-good initiative.
And that's not to say there aren't public goods, but the specificity of the conversation matters dramatically, to Reagan's point, for whether they're realized and whether they're risky. True. We're already well over our time, but since I don't know when I'll next have the opportunity to put these kinds of questions to people this informed, can we quickly weigh in on this whole idea of data sovereignty? I guess it touches on what Sean mentioned, what Reagan talked about, and what you, Annabel, have been talking about in terms of fragmentation. Again, in the session Parminder was in, he said that the community-rights framework is probably an improvement on the idea of eminent domain, where the state can just come and say that all data of India's citizens belongs to India; but frankly, the report actually starts out with that acknowledgement, so is it really an improvement? Does the data-sovereignty approach actually benefit the global internet and the way things have been working? Clearly not; but where are we headed with this kind of fragmentation? The second question is about data infrastructures, and you can choose which to respond to, because we won't have time for all of it. A lot of governments have been investing in creating digital champions. How do you then negotiate how these digital champions are going to use laws to collectivize? Indian data infrastructure versus global data infrastructure: is that the World War Three situation we see playing out in the digital domain? Anyone can go for it; it's not directed at anyone specific. I'll do a quick thing and then hopefully let other people say smarter things, but I'll jump on that; that's a big question grenade.
Sovereignty is one of those things that people talk about legally, I think, and really rarely understand. You can't declare sovereignty, in the same way that you can't just declare bankruptcy; you can't just walk around shouting "bankruptcy," because that's not what does it. There's a whole process; you have to file paperwork. And so in the digital transformation of the relationship between a person and a state, we're seeing a lot of political philosophy and a lot of service-design philosophy play out. I wrote a piece in Foreign Policy a while ago that analogized this, very sloppily, to an earlier era of China's history, and someone who knows much more about it could do it better, but it was in many ways the definition of the modern state, because you had different factions that had to balance their interdependence, their economics, and their social well-being. A very broad, sweeping analogy, but the point is that we're in this place where different states are choosing different approaches at different levels, and some are calling it sovereignty, some are calling it localization, some are calling it adherence to international norms, or combating globalization. At some level it's very hard to do this unilaterally. The other part of this that I really wonder about is that this is essentially getting done through trade agreements or participation in international forums, and it's just first past the post, and then people sign on based on where their other contextual allegiances lie. And so with sovereignty, the term doesn't mean what most people think it means when they say it.
And, to Annabel's point about calling it what it is: we are defining the appropriate powers in the digital relationship between a person and a state. We should be having those conversations, but calling it something else, or standing it on some abstract thing that people can't really touch, separates and disempowers most people from conversations that are very visceral and impactful to their rights. So sovereignty: not really a thing, but if we all agree on it, then maybe it becomes one, I guess. I'm going to jump in now only because I'm actually pretty bad at really big abstract concepts; I always like to bring things down to what I, the little man, can understand. To me, in order for the digital economy to work, in all of the relationships, between individuals and governments, between governments and governments, between governments and corporations big and small, and between individuals and corporations, what you really need is good balance. Two things. First, you need a good concept and understanding of control of and access to data; control of and access to data is ultimately what everybody is fighting about and fighting for. And that needs to be balanced against what I will broadly call rights; it's not the best way to term it, but let's just use that word. That goes to individual rights, what the rights of the individual should be, and potentially to community rights as well. When we look at the term "community data" being used outside of India, across the rest of the world, it's actually used mostly in the context of First Peoples and Indigenous peoples and their community rights to data, which is a very interesting conception.
And that's because they are conferred a set of rights as a community in relation to everything, including data. Then we go beyond that, and we think about, and this is where it gets really complicated, maybe "complicated" isn't the right word, a lawyer in the room might find a better one for me, what kind of control the government should have, what kind of powers the government should reserve, as it tries on the one hand to balance the rights of individuals against companies, and at the same time to figure out how it can potentially do certain things better, including policymaking, and balance that against companies. Traditionally, as I think Sean mentioned, there is precedent for when data can be mandatorily obtained. Those things are well defined for a reason, and I think they should continue to be well defined for a reason. So maybe it's just a matter of sitting down and defining the circumstances in which governments feel they have a right to certain kinds of data for certain kinds of purposes. Today a lot of it in the regulatory world is focused on investigation and having the power to obtain information, so that you can find out whether a regulated entity is doing something it shouldn't be doing. Likewise, when we think about the data world: what is the similar right, and how do you define it in a way that's reasonable and that everybody can agree on? The reality is that it's already fragmented, so I think we'll set that aside. I feel like when the GDPR was passed, there was a broad acknowledgement in the data privacy world that everything was either GDPR-like or not like the GDPR. That sounds binary, and obviously in practice it's not, but fragmentation is just the way it works.
So I think the most important thing is ensuring that data can be used for its intended purpose. I know the tendency is to want to put physical and geographical borders around that, and I can understand why, because you need to tie legislation, laws, and control to something. But at the end of the day, if all those regulations and laws mean that data can't serve its purpose, then there's no point in collecting it, and data becomes pointless. Of course, again, that has to be balanced with all the rights we talked about. So control and access: thinking about those relationships, and what control and access should look like in each of them in every jurisdiction, really has to be the responsibility of every government trying to operate in the digital world. And that's one of the most striking points. In fact, I've been grappling with how making more data available is actually in conflict with data protection principles of purpose limitation and consent-based use. I'm not a lawyer, so I thought it was just me not being able to understand what's written in the report, but I'm glad that somebody like you, who's been working on these issues, is equally confused. Reagan, any thoughts on these very big questions? Very big questions. I'll be really brief, because I think Annabel and Sean covered a lot of it. But it's not just you; I have these same questions. I would just say that I think the concept of data sovereignty is problematic, and so, generally, is that of data ownership. And one thing I want to mention: what is core within data protection frameworks, and the purpose of having them, is to account for and mitigate against the risks of asymmetries of power.
That imbalance exists everywhere, at every level of society, whether it's patient and doctor, or broader class differences in society, or buying a home. Whenever you're sharing or handing over data there is always an imbalance of power, and data protection is supposed to guard against that. That's what purpose limitation is for; that's what all of those core data protection principles are about. Data sovereignty invokes, I think, some of the ideas we saw in the EU, especially after the Snowden revelations, around technological sovereignty. So it does provoke some good intentions: let's be independent, let's build our own technology, let's be in control of our technological destiny. I understand that, but it is inherently very problematic, and so is the ownership piece. I think what we want to get to is collective management: an understanding that individuals have data related to them that needs to be protected, that there are legal frameworks around that, and hopefully soon India will have one as well, and then, on top of that, thinking about how we collectively manage that data together. So it's not necessarily about ownership or sovereignty, but about collective management. Okay, any parting thoughts? Any big issue regarding this big problem that we have not touched upon? Small issues? One little thing: there's a really great quote from an early-stage venture capitalist, which is never a sentence I expected to say, which is that when you fail to consider the problems of a system pre-deployment, the system is most often characterized by its failures, as opposed to whatever value you'd intended for it. And I think that, to Reagan's point, and to Annabel's points as well,
it's that piece of this type of legislation: considering the problems ahead of time and planning for them in a proactive way. And to your point about digital champions: I think a lot of countries developed digital champions and kind of banked on the idea that they would be either politically independent or very nationalist, without really building any mechanism to say which, or why, or how, or to provide any barrier or protection around that. A lot of the fragmentation we're talking about is, to an extent, waking up to some realpolitik about how that works in practice. So we can do these things if we build the mechanisms to do them; but if we just hope it happens, and don't consider what could go wrong or build real preventions and real mechanisms to deal with it, as is sometimes the case in these frameworks, I think that's where you see them go off the rails quite commonly.