We will get started by having our three distinguished panelists talk a little bit about their work and get the ball rolling on some of the substantive issues that we want to put on the table today. So let me just very briefly introduce them. First we have Nani Jansen Reventlow, who is the founding director of the Digital Freedom Fund and has been an advisor to the clinic since 2016. Nani co-teaches a class at Columbia Law School with Can Yeğinsu, who also teaches at Georgetown and Koç University in Turkey, is a barrister at 4 New Square chambers in London, and practices commercial law and human rights law. We also have Vivek Krishnamurthy, late of the Cyberlaw Clinic, presently counsel at Foley Hoag, whose practice is in corporate social responsibility and who has engaged with these issues across the spectrum. I also want to remind folks that we are video recording and live broadcasting this event, so if you ask a question, we're going to ask you to use the mic, and just keep in mind that you will be on tape for that section. We're going to try to keep the discussion piece of the talk to 35 or 40 minutes and then open up for questions. So if you have burning questions before then, by all means raise your hands, but know that there will be time at the end as well. Perhaps we can get started with you, Nani. Could you talk a little bit about the Digital Freedom Fund and the role of strategic litigation in advancing these questions? Sure. Can we just talk like this, without a microphone? Actually, we probably should have them. [pause while microphones are set up] Thank you. Sorry, we're just tuning in; we had a really interesting conversation up to this point, and key issues have been discussed. Thank you so much for having me here.
I'm delighted to tell you a little bit about the Digital Freedom Fund, which was founded about a year ago to support strategic litigation on digital rights in Europe. The organization was founded based on an expressed wish in the digital rights field in Europe to do more with strategic litigation, to put more effort into that to advance digital rights in the region. And by digital rights we mean human rights in online and networked spaces: the full range of human rights as you would find it, for example, in the Universal Declaration of Human Rights. So we started a process where we spoke to all the different actors in the field to hear what their needs were, and based on their input we set a number of thematic focus areas for our litigation support. The idea is, of course, that strategic litigation by definition doesn't exist in a vacuum; it's litigation married to other efforts, such as campaigning and advocacy work. And that actually ties in nicely with the work that I did here at the Berkman Klein Center when I was a fellow in 2016 to 2017, where I looked at collaboration across different disciplines in strategic cases: how can you get lawyers, activists, technical experts, and advocacy people to work together and have more impact in their work? We ended up devising a really nice website called Catalysts for Collaboration, where we collected good examples of where that happened, but also set out a number of best practices for people to keep in mind when they work on strategic cases. For the Digital Freedom Fund, we now focus on three thematic areas. One is privacy. The second is the free flow of information online. And the third is accountability, transparency, and adherence to human rights standards in the design and development of technology.
And I guess we can talk about two examples specifically today, given the expertise that is in the room. One issue I want to flag is the GDPR, which of course ties into the privacy aspect, but also surveillance, which Can will be able to tell you more about through a specific case that he has done. And I think Vivek will tie in the transatlantic aspect of it all. So, the GDPR: I'm sure you've been bombarded with reflections on that and whether or not it's interesting, relevant, and useful. We think so on the other side of the ocean, anyway. The GDPR is a really interesting and useful tool to really give teeth to enforcing your right to privacy. It's a human rights based document. That's not much reflected in the more technical articles that you will find in the GDPR itself, but in the recitals you can very clearly see that it's very much rooted in the right to protection of your private life. And the GDPR itself gives a lot of opportunities to enforce that, because it lists a number of clear rights that you have as an individual and possibilities to enforce them vis-à-vis private entities, whose obligations are also really clearly listed in the GDPR. I made a list here of all the rights that you have under the GDPR: the right to be informed; the right to have access to your data, to request rectification, and also erasure under certain circumstances; the right to restrict the way that your personal data are being processed; the right to data portability, meaning that you can take your data from one provider to another; the right to object to the processing of your personal data; and also specific rights in relation to automated decision-making and profiling, which is really important, of course, with all the developments in that area. What obligations do the private entities have, on the other hand? Their data processing must be lawful, fair, and transparent.
They can only collect data for specified purposes that they have to get your explicit consent for beforehand. They have to process the minimum amount of data necessary, it has to be accurate, and they cannot keep it longer than is strictly necessary for the purposes for which the data were collected, which is a really important point as well. The GDPR also puts security obligations on the private entities: they have to make sure that the data remain confidential and that their integrity is protected. I think today there was a news report about the Marriott loyalty program; things like that should not happen, in more ways than one. And companies have to be able to show that they're compliant with the GDPR. What have we seen so far, case-wise? A case that was very high profile in the news, because it was filed the day the GDPR entered into force, was the case that Max Schrems filed against Facebook and Google, basically targeting that issue of consent. Many companies use obscure terms and conditions and don't really give you a choice as to whether or not to consent to the processing of your data, and Max Schrems has filed a case challenging exactly that at two of the biggest platforms. Why does this have the potential to have more teeth than previous cases? Under the GDPR you can get fines imposed on companies of up to 4% of their annual turnover. That's going to run into pretty big numbers if you look at the amount of money that the Facebooks and Googles of this world make. There are some examples pre-GDPR of good decisions that were handed down under the previous regime. The data protection authority in the UK handed down a fine against Facebook for its involvement in the Cambridge Analytica scandal of about half a million pounds, which was the highest you could get before the GDPR regime. So we're interested to see how that develops now that we have the GDPR.
The data protection authority in France fined an ad tech company for basically not allowing people to give explicit consent. They had really complicated terms and conditions; people weren't actually sure what they were signing up for, and their data were being shared with other ad tech companies. If you want to look at ad tech companies and how creepy they are, Privacy International has a really informative piece on that on their website, explaining exactly how that works and how the sharing of data, and the putting of it back together again, has an impact on your privacy. Recently, a plan was abandoned to use patient data collected by the National Health Service to basically report people who were suspected of being in violation of immigration rules. And that was because the conversation really got started when the UK was implementing the GDPR nationally, and an organization called the Migrants' Rights Network in the UK was also working on the possibility of filing a case on this. So that also shows how, just by threatening some solid litigation, you can already change practice and policy. I will now stop talking and hand the mic back to you. Thank you, Nani, that's a great overview. Jessica, thank you very much indeed, and thank you to the Berkman Klein Center for having me today. It's an absolute pleasure to be back at Harvard. Strategic litigation is the way that I came into this area. I'm afraid I was just a boring, conventional human rights lawyer, and then there landed on my desk a brief to intervene in a case in the United Kingdom, David Miranda's case. He, of course, was a key actor in breaking the Edward Snowden story about surveillance. So that's where life for me began in terms of considering these sorts of issues, but of course in that case it was all about the breaking of the story.
It was all about whether the government could take some of the sensitive information back from those seeking to raise awareness about these issues. Big Brother Watch was the next step, and some, if not all, of you in this room will be familiar with that case. It is a case that made it to the European Court of Human Rights; it was a challenge to mass surveillance in the United Kingdom. In terms of the practicalities of strategic litigation, just to give you an idea: you're dealing with technological change that is going very, very quickly, and you're having to deal with new issues or the evolution of existing issues. This is a case that was brought five years ago, and judgment was handed down earlier this year. The human rights system, certainly in Europe under the European Convention, is not well placed to respond very quickly to these sorts of cases, and Big Brother Watch was, relatively speaking, expedited. Of course, before you go to the European Court of Human Rights you usually have to exhaust domestic remedies, which means you have to have a go in your own courts. Big Brother Watch was fascinating because, of course, there is a US angle to mass surveillance in the UK, putting it lightly. The court considered the issues by reference to the way in which the UK had decided to draw the balance between national security on the one hand and privacy and the right to freedom of expression on the other, and the court found violations in respect of the right to privacy under Article 8 and the right to freedom of expression under Article 10.
As I was just telling Vivek before we started, it's a 212-page judgment, and rather than tell you absolutely everything now, I'm hoping that we can explore some of the themes. Europe does not speak with one voice on surveillance; member states, both in the European Union and the Council of Europe, have differing views about what is and what is not acceptable. But most importantly for today, we need to recognise that there is actually an, in places, subtle difference between the approach taken by the European Court of Human Rights, as shown in Big Brother Watch, and that taken by the Court of Justice of the European Union, which is of course not a human rights court but now has its own Charter of Fundamental Rights. Perhaps ironically, it is the Court of Justice of the European Union that has taken the more rights-protective approach to mass surveillance. The European Court has said, look, this is an area where states really have a margin of appreciation: it is for them in the first instance to balance these considerations, and then we as a supervisory court review it. Whereas the Court of Justice of the European Union has said that mass surveillance per se cannot be proportionate. So that's quite a difference, and I think sometimes in the coverage of these sorts of cases that's lost, because of course with the European Court of Human Rights there were findings of violations, but I think you have to look at the detail and see what the Court did not do. Anyway, I'll just pause there, because I could speak for 40 minutes about this case, but hopefully that's interesting by way of an opening on those cases and the role that strategic litigation can play in this area. Thank you, Can. I think that's a great point, to think about the competencies of and differences between courts; certainly an important consideration for the many organizations that the Digital Freedom Fund funds when thinking about where to file these cases. I know we have some Europeans in the room; I know we also have a lot of Americans and American
lawyers, so I'd like to turn it over to you, Vivek, to bring this discussion back across the Atlantic, as I promised European and US perspectives in the description, and to talk about perhaps some of the ways in which the GDPR or European case law around these issues has been persuasive here. Sure, thanks so much, Jess. I'm delighted to be back in my old stomping grounds; this is the first classroom that I taught in, actually, many years ago. Okay, so to frame what we've discussed, I'd like to recall the image from Greek mythology of Helen of Troy, but to substitute Edward Snowden's face for Helen's, because Edward Snowden launched a thousand different initiatives in the area of privacy. That applies both to government surveillance on the one hand, and on the other hand to what enabled the PRISM program and other similar programs, which was the mass private collection of data. So there were two different trends, and the Snowden revelations were the catalytic moment. We're now five years on. We've talked about the litigation path regarding surveillance, but what's been interesting in the consumer privacy space is that we started with this idea that the Europeans were way out ahead, and they were. Snowden catalyzed the discussions that led to the development of the GDPR; it led to a realization in Europe that the 1995 Data Protection Directive was good but had aged and needed reform. And what's been very interesting is that Americans and American companies have been very suspicious of the European approach to privacy for a very long time, partly for reasons of libertarian political culture in the United States, partly because it fit poorly with our First Amendment traditions, or so we thought. But this has been the year of convergence, I think. So let me take you to California. It's June of this year in Sacramento, and, really almost like a bolt out of the blue, California enacts a very significant privacy law, the California Consumer Privacy Act of 2018. And it was a great American
process that led to that law: the threat of a ballot initiative. Alastair Mactaggart and some other privacy activists in California had collected the signatures they needed to have a very strong law, modeled on the GDPR approach, enacted in California, and ultimately the legislature, within 48 hours of the deadline for pulling the measure, passed a compromise law that gave a lot of what Mactaggart, the particular catalytic force behind this law, wanted. So what we now have in our country's biggest, most populous state, and the state that is home to the tech companies, is GDPR lite. We have a framework that will come into effect in 2020 that is based on the idea of user consent to data processing. It's a data processing kind of approach: it gives users the right to access their data, to delete it under certain circumstances, and to prevent the sale and transfer of data, all somewhat weakened, not as strong as in the GDPR, and with some distinctly American features, such as a private right of action, so you can go sue if your rights have been violated, as we do in this country. So we now have this interesting convergence. That convergence, I think, is driven by consumer appetite, but also by the fact that Europe is now such a large market, and the business model of major companies is such that it makes sense to comply up: if you're a multinational company, you have to comply with the GDPR, and it's very hard to run your business in a way that complies with different standards and different laws in different places. So there are going to be growing pains as a company as you grow out of your home market, perhaps in the US, and have to comply with European law, and that's precisely what's driving the convergence. For the first time in a generation there is now a serious conversation happening in Washington, DC about a consumer privacy law; senators are holding hearings on it. This is unheard of. Six months ago this
seemed impossible, not just for reasons of gridlock in DC, but because the interests pushing against such regulatory innovation in the US were so strong. We seem to see that logjam breaking, and it will be very interesting to watch what the new Congress does. But to wrap this up: I think it was very fashionable after the Iraq war to talk about Europeans being from Venus and Americans being from Mars, and certainly it was that way on privacy for a long time, but perhaps our differences are overstated, both in terms of orientation and also in terms of where the substantive law is seen to be going. Thank you, I think that was a super helpful wrap-up. Now, interestingly, another connection is that I think the organizations that you represented in the Big Brother Watch case were American organizations. Can you talk a little bit about that and the role that they had in the case? Yes, I was counsel for CDT and PEN America in the Big Brother Watch case before the European Court of Human Rights, and one of my chief roles in the case was really to fill any US gaps: to assist the court in setting out what had happened here in respect of the two regimes that were being looked at, and to make sure that there were no submissions being made by the United Kingdom government that were out of date. So we had quite an American perspective, but it was more in terms of assisting the court, making sure that the court had everything before it, and also trying to put the drawbridge down to explain why this was relevant. The court was most interested, I think, to understand the checks and balances over here as they applied to information that was being shared with the United Kingdom government, which makes complete sense. But aside from that, of course, American organizations also want to take positions of principle, for the reasons they have explained, in a polity that is increasingly influential, and I think the European Court of Human Rights is one of those polities.
Another issue that you raised earlier is the question of timing, and how fast these technologies evolve in comparison to how fast our justice system works. You noted that the Big Brother Watch case, which took five years from inception, was expedited; certainly in the US I don't think we would expect a resolution from the highest competent court in five years. Nani, can you maybe talk a little bit about how you and the organizations that the Digital Freedom Fund works with think about timing, and how you plan lawsuits with respect to the evolution of technology, balancing those two very different timelines against each other? That's a really good question, actually, and not one to which there is a standard answer. I think what was interesting about the Big Brother Watch decision is that the legislation actually changed in the meantime. It wasn't even technological change; it was that a new, even worse surveillance law had been adopted in the UK. However, the issues of principle that the court looked at would apply across different circumstances. In that sense, if you mount a strategic case, I think it's important to frame it in such a way that parts of it can be transposed: even if the fact pattern that underlay the case originally has changed in the meantime, there are still parts of the judgment that you can take and make applicable to the new circumstances. Another thing to take into account is that justice moves at different paces in different jurisdictions. It's worth noting that the Court of Justice of the European Union is actually a really great court to go to with many digital rights issues at the moment; we're lucky that we have human rights minded judges who, generally speaking, get the tech, which is not something we can always say about the European Court of Human Rights. They're working on it, so I'm sure it will improve. But that's something you can think about, particularly if you have a directive, for example, that you want
to clarify certain provisions of: you can actually be very strategic and think about which national jurisdiction we would want to challenge this in, or where we would want to get this elucidated. Where would we get, first of all, an outcome that would be helpful to us, but also, for example, where might we be able to get a referral to the Court of Justice of the European Union? Within that system, it's possible for national courts to ask questions of the Court of Justice and basically refer a matter on while you're still in the middle of proceedings nationally, and the approach to that is very different depending on the member state. One of the landmark digital rights cases was Max Schrems' Facebook case, and there are of course a couple of really interesting cases from Ireland as well; those are jurisdictions that generally are more flexible in referring matters on to the Court of Justice of the European Union. So that's something to think about: if you are contemplating something that could have an impact Europe-wide, would you perhaps be able to mount a case in one of those jurisdictions initially? That's very helpful. Can, would you like to follow up? Yes, I'll follow up on that, both by reference to Big Brother Watch and more generally. Of course, it wasn't one applicant; it was 15 applications, and those 15 applications were grouped, I think, into three groups, and the applicants were not all on the same page as to how to deal with the case. There were a number of applicants who argued that they could go straight to the European Court of Human Rights, i.e.
without exhausting domestic remedies, which in the case of the UK would mean the IPT, a specialist tribunal set up to rule on these matters. So they went straight to Strasbourg, and there was another group that said, no, no, we've got to exhaust domestic remedies, which I can imagine led to some fraught conversations behind closed doors between those groups. So there's that, and I think it is very, very important to have a unified strategy and try to be on the same page. If you have to exhaust domestic remedies but can find a forum where the international courts are slightly more permissive about that, that's helpful. But ultimately, speed is also something that is political when it comes to a body like the European Court: if you've got advocacy taking place more broadly, especially in some of the related institutions, that can make a difference. In respect of the two courts that Nani and I have mentioned, the Court of Justice of the European Union and the European Court of Human Rights: as Nani mentioned, there is a specific preliminary reference procedure which enables a first instance court in an EU state to refer a point of law up to the Court of Justice of the European Union, that is, to stay the proceedings domestically and say, look, we want some guidance on this one issue of law. That can be done, at least in theory, relatively quickly; in any event, it must be a quicker process than having to exhaust all domestic remedies, first instance, appeal court, supreme court, constitutional court, and then bringing an application before the European Court of Human Rights. And of course these two courts are separate, but let's not forget that under the latest EU treaty there is an obligation on the EU to sign up to the European Convention on Human Rights. So whilst currently you have two courts, and they are separate, and one is purely a human rights court while the other has a charter, that relationship may also change in due course. Just something to think
about as we continue to look to Europe and watch developments there. Thank you, and that's fascinating to think about; I hadn't fully processed the implications of that. Vivek, I wonder if we might turn to a different way that we can engage with governments around issues of tech equity, and maybe step one very small step outside of the US and Europe, over to Canada, and also bring in issues around emerging AI technologies. I wonder if you would talk for just a few minutes about the report that you did with some others here around AI opportunities and risks. So, we've talked about litigation; we've talked about regulation coming from the top down. The third approach, and this is the situation a lot of governments find themselves in with regard to technology, is seeking expertise, because the technology is moving incredibly fast and it's not clear what the implications are, what needs to be regulated, and how to do that. So the Canadian government engaged with us at the Berkman Klein Center to help them try to figure out what to do about AI, and what they wanted us to do was to understand how artificial intelligence, as it's being used in the world today, impacts human rights. We're about to celebrate the 70th anniversary of the Universal Declaration of Human Rights, and so of the modern human rights movement; that's December 10th. So we were taking a 70-year-old body of law and seeing what the applicability of that law is to the most cutting-edge phenomenon there is right now. To make it short, you can go download the report if you'd like; it's on the Berkman Klein website. The basic conclusion of the report was that there are a lot of ways to look at the social impacts of AI. We've been having a lot of conversations about AI and ethics, about whether this or that is an ethical thing to do, and one of the thrusts of the report was to say, well, you know, we can also talk about it in terms of human rights. Human rights are a body of law that is
established, and we just discussed for the last half hour how you can litigate human rights claims and how human rights laws can be enacted. So there's something in that experience that allows us to say: look, these are impacts, these are good consequences or bad consequences, and the bad ones could be actionable. So we did that mapping for six current uses of AI in the world today, from medical diagnostics to the way that credit is now extended: not just a conventional credit score, but companies out there that look at thousands of variables beyond conventional credit scoring and are largely unregulated, and those have impacts on people's rights, including the right to non-discrimination but a lot of other rights too. So that was one part of the mapping. But at the same time, there are questions that AI, like any new technology, raises that are very difficult to answer using current law, and we found criminal justice is a great example of this. There's a lot of conversation here in the US about algorithms being used in criminal justice sentencing, and there are some basic trade-offs in the nature of how an algorithm is programmed: you can maximize for some things but not for others. So, to pose a hard case: if you have an algorithm that's going to reduce the prison population overall by 30% but increase racial disparities by 1%, which means that all ethnic groups are going to have a lower rate of incarceration but the gap between, let's say, the majority and the minority increases, what do you do? There's actually no coherent answer yet in human rights law on what to do there. There are processes we could use; part of the point was that you can go and litigate that and get a bunch of smart judges to speak on it, and/or as a society we need to make some decisions. So the purpose of the report was really to start that conversation, and it's interesting that a government went to an external group to do that thinking. Canada
and a few other countries are starting to develop national AI policies that try to get their heads around this very powerful but generalizable technology; it's kind of an empty vessel into which you can feed data and it's going to learn. That's a really difficult kind of public policy challenge to deal with, and it's great that governments are thinking about it at the beginning of the technology rather than when it's in much wider use. Thank you. So, sensitive to the time, I want to move us perhaps one stage even further: we started at litigation, which is after the fact, using the mechanisms of the state to redress grievances, and then thought about advising states. Now I'd like to ask each of you in turn to think about the most productive ways that you have seen, in the space of advocating for fairness, equity, and human rights in tech, of engaging directly with the companies. So, Nani, can I start with you? Any particular ideas that you have? And I'll offer one provocation, as a possible thing that's been in the news for you to respond to: what are your thoughts about Facebook's proposed content moderation supreme court? As a future president of that supreme court, would you accept a seat on it?
Probably not. Probably not, no. The reason that I find this a difficult question to answer is because I just haven't seen any examples yet of really successful efforts, and now I'm sitting next to GNI participants. I think those are nice initiatives; I just don't think that they have sufficient teeth. What I actually think is that what we need for tech companies is similar types of regulation as we have for financial companies, and until we have that kind of framework in place, everything else is nice but it's just not going to have the impact that we need. So I'm just going to leave it there for a moment and pass on the microphone to whoever wants it. I actually want to ask Can a question afterwards, because there's media regulation in the UK which I think is quite interesting. But first: a lot of what I do is working with companies and getting them in the same room as governments and their civil society interlocutors to try to set some standards. We call these multi-stakeholder organizations; it's a terrible word, and I wish someone would come up with a better term that's less of a mouthful, so the gauntlet has been dropped. The Global Network Initiative is the leading one in the tech space, and it formed about a decade ago to basically set some ground rules for how tech companies protect the free expression and privacy rights of their users against governments. I think it's been relatively successful in doing so, and the success is less visible: you don't see a lot of news coverage, many of you have probably never heard of it before, and it's hard to devise metrics of success. But what's interesting to me is that being part of these organizations has changed how companies make decisions internally, so there's a lot of process that happens now inside a company. You could actually see this in the strangest of places, in The Intercept story yesterday about Google's Dragonfly project, where you think, my god, here's a company that is doing completely the wrong thing
in designing a search engine that is easy for the Chinese government to censor. But what's interesting about that story is that it describes the process: Google had to do a privacy review, and the mid-level person, presumably, who was going to do that was squashed by the high-level person who was trying to drive the project forward, and there was an internal fight in the company, as we see playing out in the media and in public protest by employees now. I think that is exactly a sign that the process works in some way: there was an internal voice and a business process that said, hey, wait a minute, rather than the company just blundering in and then the thing leaking. So, one cheer for multi-stakeholderism. Great. I'm singularly the worst equipped person to answer this, because I think this is an area of law that really challenges current structures for the provision of legal advice. I actually think, based on my own experience, that in-house counsel have become increasingly sophisticated because they've had to, and there are certain areas where those not operating in-house, so those in private practice, have had to catch up a little bit, obviously no one in this room save for myself. But I think that, certainly in the United Kingdom, we almost have an outmoded system to discuss these things, though things are moving. In terms of effects, I participated in a media round table, but it was under the Chatham House Rule so I can't say anything about it; still, it was fascinating to see just how much thought had gone into obligations on publishers under the GDPR and under the new data protection legislation in the United Kingdom, which is meant to mirror the GDPR. There's a real shift. Previously it used to be a court that would consider an editor's compliance; it no longer is. It's now the media publisher who has that obligation, and that's a huge shift internally as to the way that companies have to grapple with these issues, both structurally and potentially
substantively as well. So I'm afraid I think it's a case of watch this space, but it's moving very, very quickly, and as someone who still wears a wig for his profession and has only just about got a laptop, I think we've got to catch up, certainly at the London bar. So now, to kind of backtrack on my initial comment: it's part of the puzzle, and all of those are positive. I just think that as long as we're in a situation where there's actually no clear regulation of the behavior of platforms, and they are clearly not putting enough resources into addressing the issues that are there (there are efforts and so on, but for companies that have huge annual turnovers, look at the amount of resources actually being allocated to address the issues), it's nice to have processes, but stories keep on coming out about all sorts of misconduct. I just feel that until there's a clear framework that also allows for proper enforcement, we will still have a problem. Can I just follow up on that?
Sorry, I took a bit of a pause there. But the Facebook Supreme Court proposal, the details of which I've not read, opens up, I think, a real divide in approach between Europe and the US, and I wonder if we will see a convergence. There is such a suspicion here of regulation of content that comes through government; that's not the same starting point in Europe, but it's obviously something that has to be dealt with. Facebook is a fantastic example. What's the answer, given the two starting points? Do we look to the state, or do we look to Facebook itself as the starting point? If there's going to be a supreme court, then what's the relationship between that supreme court and state actors? What's the supervision at the level of the state, and who is in charge of supervising that? Is it an ordinary jurisdiction, or is it a jurisdiction that needs to deal with these massive entities in a particular way? There are real structural questions, I think, that need to be addressed, and in terms of general convergence, I think that is an area where there have got to be some conversations, because it just seems that we're at two ends of the spectrum, at least in terms of starting points. I hesitate to interrupt, but I do want to save time for questions if there are any. Yep, here, microphone coming to you; the switch is on the bottom, Kendra. Here's one: I had a slightly different question on the overlap, the transatlantic issues involving the GDPR. In my work I've worked with Harvard as an institution on how groups that have both US law and the GDPR applying to them can work together. There's a lot of disconnect: US privacy law is very much a patchwork, the GDPR is very much principle-based, and there are also differences in what they protect. For example, sexual orientation, political party membership, and union membership are protected under the GDPR; they aren't here. It seems very hard to figure out how groups that have to follow both regimes can
work. What are your thoughts on how those might better overlap, if that makes sense? Other than hiring me? That's a really hard challenge, and there's a reason privacy lawyers have been incredibly busy for the last year, because you're absolutely right. The GDPR covers everyone and everything: it's a framework that says here is how you should treat data and here are the kinds of data that fall under it. It doesn't matter who you are; as long as you are a data processor, or you are holding the data, and so on, you have obligations. In the US, our approach is more sectoral. Take health data: if you are Apple and you're collecting my heartbeat information, there's almost no law that covers you, whereas if you go to a doctor and the doctor issues you some kind of device to check your heart, and that data is going into the doctor's file, then HIPAA, which is our federal health privacy law, does apply. It varies a bit, and it varies state to state: some states will say Apple does have an obligation, some will say it doesn't. So what we would do, if we were providing legal advice to a client, is first map the data that you have: what kinds of data are you collecting, and where are you geographically, and then what legal requirements apply to your handling of that data. At some point you have a choice: are you going to distinguish based on where the people are, or are you going to try to level up and create a framework inside your institution that allows you, with one mechanism, to comply with multiple regimes? That's where I think companies are moving as a practical matter. For example, it is possible to comply with both HIPAA and the GDPR with regard to health data: don't sell it, don't give it away without the user's consent. So you can figure out what kinds of protections and compliance mechanisms you need around that data to comply with both.
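The "map your data, then level up" approach described above can be sketched in code. This is a minimal illustrative sketch, not legal advice: the regime tables, data categories, and rule fields below are assumptions invented for the example, not real legal mappings, and a real data-mapping exercise would be far richer.

```python
# Hypothetical sketch of the "map your data, then level up" approach.
# All categories, regimes, and rules here are illustrative assumptions.

# Which regimes may apply to a given (data category, region) pair.
APPLICABLE_REGIMES = {
    ("health", "EU"): {"GDPR"},
    ("health", "US"): {"HIPAA"},   # simplified: only covered entities in reality
    ("marketing", "EU"): {"GDPR"},
    ("marketing", "US"): set(),    # largely unregulated at the federal level
}

# Simplified per-regime handling requirements.
REGIME_RULES = {
    "GDPR": {"consent_required": True, "sale_allowed": False},
    "HIPAA": {"consent_required": True, "sale_allowed": False},
}

def level_up(categories, regions):
    """Return one rule set strict enough to satisfy every applicable regime."""
    regimes = set()
    for cat in categories:
        for region in regions:
            regimes |= APPLICABLE_REGIMES.get((cat, region), set())
    # Start permissive, then tighten for each regime that applies.
    merged = {"consent_required": False, "sale_allowed": True}
    for regime in regimes:
        rules = REGIME_RULES[regime]
        merged["consent_required"] |= rules["consent_required"]
        merged["sale_allowed"] &= rules["sale_allowed"]
    return merged

# Health data held for users in both the EU and US: the merged policy
# is the strictest combination of HIPAA and GDPR requirements.
policy = level_up({"health"}, {"EU", "US"})
```

The design choice mirrors the quote: rather than branching per user location, the institution adopts one policy that is the intersection of what every applicable regime permits.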
So this is sort of a question going back to the point someone made earlier about the Marriott breach. One of the observations is that, to some extent, the major players in the personal data space generally, putting aside how they use the data, do a much better job of protecting it, Google and Facebook for example. And this kind of goes back to the point about distrust, especially in the US, of government institutional hold over that kind of data, but do you think there's a role for some sort of data brokerage? This has been floated in the past; some governments have said it would be nice to have a neutral, trusted third party that could handle people's data in a way that people would trust. Is that something that is attractive at all, practical, plausible, or just science fiction? Nani, you brought up the Marriott breach in the first place, and you've been the one arguing for regulation with teeth; do you want to handle this one? I can't really answer that; that was just an example. Sorry, I'm just not equipped to answer that; it's a broader policy question that is outside my area of expertise. What I would say is that I think it's very hard to imagine lots of different companies and social actors giving up their data. That being said, as a practical matter, we're all moving to the cloud. Even sophisticated services like Dropbox or Evernote, companies that handle a lot of our data, don't handle it themselves; they contract out the storage and the security to Amazon and Google. So effectively most of the world's data is being held by Amazon, Google, and Microsoft, including government data, which is why Amazon's HQ2 is outside Washington, DC. As a practical matter, those are the companies whose architectures are protecting data against hacking right now, with a caveat: there are different ways you can deploy your software as a
service if you're in the cloud. So it'll be interesting to find out more about the Marriott breach and how it happened, but I think there is a movement toward more centralization of security as a practical matter, because unless you're really on top of the ball, you can't manage that cybersecurity risk. I was just going to put my dystopian-novel-reading hat on, but to make that point far less eloquently: if you push everything into the cloud, it's all eggs in one basket, and then security concerns obviously come to the fore. In anticipation of today I read something on this trust point that you came to, and it might be that everyone's heard about this already, but Uber was fined in the United Kingdom quite recently, up to 385 thousand pounds, because it had not disclosed the existence of a data breach after it had become the victim of a cyber attack. Rather than telling customers and drivers, they spent a hundred grand paying off the people who had actually carried out the breach and asking them to delete whatever it was that they stole. It's the sort of thing you might read in a tabloid and think, oh well, that can't possibly be right, but I think trust is absolutely key, and everything is pushing up to the cloud. It's the same with the legal privilege issues we've been talking about: even in a small barristers' chambers in London, now everything is accessible, and we get attempts to hack us, and we're not a particularly important organisation in the general scheme of things, I should add. But it's a serious issue, a serious issue. We have time for one more question if there is one. Oh, you have another? Great, terrific. Just in terms of reading the tea leaves: what happens if the UK has a hard exit from Brexit and just drops out? Would they still keep the GDPR, do you think? That is the big question in respect of the EU withdrawal bill. I guess I'd reverse it and say: what on earth happens if they don't? How does that work? And I
do not think that enough thought has gone into that, so we'll have to see how the bill goes. But seeing as we have a brand new data protection law which was enacted this year, it might be seen as a pretty massive retrograde step to then enact something brand new thereafter. It will be practically impossible for them to continue with their services and business if they don't, so I just don't quite see, from a practical perspective, how, even if they don't make it as explicit as saying we adhere to the GDPR, they could avoid having the same type of rules in force, because whenever they want to do business with any type of customer, they will have to be able to comply. So, thank you. Well, we bit off a big topic for this hour: fairness, equity, human rights, and tech, from the US to the EU. I particularly loved, Nani, what you said about the necessity for action on various fronts, whether from human rights advocates who have a suspicion of multi-stakeholder organizations, with their own efforts acting simultaneously alongside those organizations as they move these things forward. These issues are so complex; the legislation, the regulation, and the technology are all moving quickly, so I think it's very exciting to have US and European lawyers more closely in dialogue and more closely coordinated, with relation to your Berkman project, and working with governments, as Vivek and the rest of your team here at Berkman are doing with the Canadian government, to push these issues forward. So thank all three of you for being with us here today, and thanks to all of you for being here to listen.