Hello everyone. Wherever in the world you are tuning in from or watching this video, welcome to the World Economic Forum's Sustainable Development Impact Summit. I'm Gideon Lichfield, the Global Editorial Director of Wired magazine, and we are starting our session on shaping a responsible data ecosystem. Every day, the world generates an estimated 25 quintillion data points, a number that is only growing. There is so much we could do with this data for good, especially during a global pandemic. We could design better public health interventions. We could develop vaccines and drugs more quickly. We could monitor the economic impacts and figure out which communities or industries need more help after the effects of the pandemic. But all too often that data just isn't getting to where it needs to be in order to be put to effective use. And obviously, there are all sorts of risks associated with generating, storing, sharing, and then applying all of that data: risks of privacy breaches, risks of misuse of data, risks of algorithmic bias. So it's a very fraught topic: so much potential, so much risk, as with so many things in technology. So how do we, and how do organizations, work to improve the responsible use of data for development and for the benefit of the Sustainable Development Goals? How might we collectively create a responsible data ecosystem? With me to discuss that question are three experts: Erin Egan, who is the Vice President of Public Policy and Privacy at Facebook; JoAnn Stonier, who is Chief Data Officer at MasterCard; and Quentin Palfrey, the President of the International Digital Accountability Council. So we have a mix of perspectives here: a social media platform, a financial services firm, and a nonprofit watchdog that oversees the digital marketplace. We did have a fourth panelist, Alain Bejjani of Majid Al Futtaim, but unfortunately he let us know just now that he couldn't be here.
So nonetheless, we're going to have a fascinating conversation with these three panelists. Before we talk to them, I'm going to ask you, the audience, to take part in a poll, which you should see appearing soon on your screens. You should see the link to it in the chat. Using that link, I'd like you to answer this question: what is the most effective way to encourage corporations to use data in a privacy-preserving, participatory and sustainable manner that can benefit the Sustainable Development Goals? I've put four options there, and let me talk a little bit about these ideas while you're answering that question. Adopt more GDPR-like laws: GDPR took hold in the EU, California has passed the similar CCPA, and various U.S. states and other countries have bills in progress that would do something similar to GDPR. Do these laws really effectively protect privacy, or do they just create a whole bunch of annoying notifications that you have to swipe away every time you go to a web page? Data responsibility rankings are an idea that is touted a lot, sometimes proposed as a way to increase transparency. But how many consumers would actually pay attention to these rankings? Would they make an impact? Industry standards, option C in that poll, raise the question: whose job would it be to set these standards and then to police them? And it seems as if companies these days suffer increasingly massive privacy breaches almost with impunity. Hundreds of millions of records can be put out on the web, and the company keeps on operating. So if we impose bigger fines or other penalties on companies, will they simply treat them as a cost of doing business, as happened in banking and with some of the big tech firms in the past? Or is there a more effective way to make data more secure? So those are the options in that poll. We will come back to them in a bit.
But right now, I'm going to turn to our speakers and ask them a few questions before we turn this into a more general discussion. So the first one I'm going to go to is Joanne. Joanne from Mastercard. Hello. Thank you for joining us. I'd like to ask you, Joanne: in an era where data sharing is necessary, essential really, for building new solutions and solving challenges like health, like a pandemic, like climate, economic and other challenges as well, what are the elements, do you think, of building a data sharing ecosystem for social impact purposes that organizations can tap into whilst also preserving security and privacy? So hi, Gideon. It's very nice to join you today. And I think this is one of the important things that we need to address if we're going to really innovate with data in a way that's going to address some of the most important social impact issues of our day, including the pandemic for health, including climate change, and just really making sure that we address all sorts of social impact issues, including financial inclusion in the financial services space. But when you think about what we're talking about when we talk about an ecosystem, we're talking about organizations that have data. So we'll talk about them as data donors, perhaps, right? We also have, on the other side, recipient organizations that would use that data for insights for the benefit of beneficiaries, individuals who need that information. And then there's a whole system in between of what we will call data enablers that have to be created along the way, that have to address some of the legal issues that are in your poll. I'll be interested to see the results later, right? So to address the privacy issues, to address the potential of breach as data or insights are being shared from one organization to another, the technology to move the data, and then really also the data science, right?
To make sure that we create the analytics, the algorithms, to actually use the information in a way that's impactful for the problems we're trying to solve. And so we need layers, if you will, of enablement. We need talent. We need technology. We need the data, obviously, right? And we also need the methodologies to do it. And while organizations are doing this, we have to think of an ecosystem. Today, organizations are using their data to solve problems, but they're oftentimes doing it as one-offs, right? One organization, such as MasterCard or Facebook or Microsoft or another, is having its data created into insights that can be shared to solve a problem. It may not be enough. We need to find ways that we can combine datasets in ways that are privacy preserving, that keep the information secure, that then can be used to solve multiple problems at scale. And that's really where the ecosystem needs to be designed: to enable multiple datasets to be used by multiple stakeholders to the benefit of multiple parties. And that's really what we're talking about in designing a responsible data ecosystem that serves multiple parties at the same time. And that's the larger challenge, right? And so what are the different types of institutions that need to participate in that design? So, Joanne, could you get specific? Could you give us an example, maybe, of a problem that MasterCard has been trying to solve, or that you think there is potential to solve, by having organizations come together in this way, but that just isn't possible right now, and what it would take to get there? Well, sure. I mean, we're seeing it today, right? We have participated in using our data to actually assist in the economic recovery after climatic events, right? So, whether that be after a snowstorm or after a hurricane or after a tornado, we help first responders in locations understand when services have been restored.
But that is using just our information to help local governments and neighborhoods understand where first responders can go for services, because we know that our transactions are back online and we're processing and we can tell where locations are happening. But imagine if we could get cooperation from multiple parties to actually generate more information, right? And so that's just a microcosm of what could be possible. We've seen over the past year the need for pandemic information. We've used our information to understand what the economic impacts have been as we've seen shutdowns roll from Asia first, then to Europe, then to the Americas, right? And we've tried to help businesses understand those impacts. But now the economies are shifting to opening up, right? We're one company with one point of view. Imagine if we take multiple datasets and actually combine them so that we can understand at a much more granular level, rather than using one dataset and then providing it to just our constituents, our banks, our merchants and governments. Imagine the power of that if we had the foresight and the system enabled to combine those datasets and have that information available for the different types of analyses so many constituents would want to actually- What kinds of datasets, for example? For example, econometric information, right? We understand purchases. That's really what MasterCard understands, right? At a very large scale. But it would be very interesting to also understand other economic information, employment information, for example. When are people going back into cities and locations, right? What is going on locally? Governmental data could be very powerfully added to the purchasing data that we have, and other financial market data could be added in too. That kind of economic information would be one source, right? What other types of information? Transportation information, right?
So that cities could plan for how they need to ramp up transportation systems as economies open up, right? Same thing for small business owners, so that they know when they should be reestablishing their businesses as city centers come back to life. All of that information right now is being put together in pieces by data sharing that is very helpful. But imagine if the data was available and the science was enabled in a dashboard where individuals, individual businesses and governments could go in and just have it at their fingertips. It would be a much easier, simpler way to solve some problems. Does that make sense? It does, thank you. So it raises lots of interesting questions, which I want to come back to you on. But first I'm going to go to Erin. Erin from Facebook. Thank you for joining us as well. So Erin, I'm going to ask you maybe to riff a little bit on what Joanne said, but in a similar vein. Billions of people obviously share their data with Facebook every day, and MasterCard's reach is almost as broad. You both know an enormous amount about an enormous number of people. So what do you see, from Facebook's point of view, as the greatest opportunities for responsible data sharing that can advance the sustainable development goals? And then how do you balance that with, obviously, privacy concerns and massively varying national data laws? Yes. Well, first of all, thank you to the forum for inviting me here today. And thank you, Gideon and my fellow panelists. It's so nice to be here. And I really thank you, Joanne, for those comments. And I'll build on them a little bit. I mean, I think your question, Gideon, it really hits home the tension around data sharing. We need to find the balance between being open and allowing data sharing for all the benefits, but also being really careful.
And when we think about opportunities to advance the SDGs with responsible data sharing, I think the most top-of-mind examples really do relate to how we can share data to advance public good. We have something at Facebook called Data for Good. And it's a program we've had for a long time, for years, where it sounds in many ways like what Joanne was talking about. In our case, we've shared aggregated data with public institutions about people's movement. So we have general aggregated location data, and we have information about movement to help these public institutions deal with crises, helping the Red Cross, for example, determine where to send supplies after an earthquake. Well, you can see how that information could be really useful during COVID. And it has been. Public health authorities have relied on what we call movement range maps. And they measure the effectiveness, for example, of COVID-19 stay-at-home policies, and whether or not people are actually staying at home. And we've been doing this in countries from Mexico to Indonesia. And we're able to provide this data while still protecting people's privacy by incorporating differential privacy into these maps. Because you know where people are logging into Facebook from, but then you can also disguise their identities. That's right, using differential privacy methods. And so we've been doing it in this aggregated way. I think what Joanne is talking about, how you combine it, is really interesting. We haven't been doing that yet. We've recently committed to working with gender equality organizations and experts to provide real-time data to help close the gender gap, where we're doing a survey on gender equality at home. There's a lot of folks at home right now. And we're doing surveys that can focus on the differential impacts of the pandemic on women-led businesses.
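The aggregate-then-add-noise approach Erin describes can be sketched in a few lines. This is a generic illustration of differential privacy, not Facebook's actual implementation: the region names, check-in data, and epsilon value below are invented for the example.

```python
import random
from collections import Counter

def dp_region_counts(user_regions, epsilon=1.0):
    """Count users per region, then add Laplace noise calibrated to epsilon.
    Each user appears in at most one region, so the count's sensitivity is 1
    and the noise scale is 1/epsilon."""
    counts = Counter(user_regions)
    noisy = {}
    for region, true_count in counts.items():
        # The difference of two exponential samples with rate epsilon is a
        # Laplace(0, 1/epsilon) sample.
        noise = random.expovariate(epsilon) - random.expovariate(epsilon)
        # Round and clip so the released figure still looks like a count.
        noisy[region] = max(0, round(true_count + noise))
    return noisy

# Hypothetical check-in data for two invented regions.
checkins = ["jakarta"] * 120 + ["mexico_city"] * 80
print(dp_region_counts(checkins, epsilon=0.5))
```

The smaller epsilon is, the more noise is added and the stronger the privacy guarantee; real movement range maps would combine this kind of noise with aggregation thresholds and spatial coarsening, so the published numbers reveal trends without exposing any individual.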
And again, we're looking at what kind of data can be useful, how we can share it in a responsible way, a privacy-protective way, using privacy-preserving techniques. I think the second part of your question is really important, though, right? How do we navigate the different laws, and how do we navigate doing this responsibly? Listen, we've all learned so much. Let's go back a few years, to Cambridge Analytica, right? That exposed major gaps in the legal, regulatory and commercial frameworks governing sharing of data. In that case, between platforms and third-party developers, but there are these really hard questions around data sharing that come with respect to any sharing that happens. It's like, who's responsible when? What do people know? How do we make sure they understand the entire spectrum of data, where it's going and from whom to whom? And we now have these obligations, really interesting obligations, under our agreement with the FTC, super robust. They include ongoing and periodic monitoring of third-party apps that have access to data. We've also done a lot of work internally, obviously, to fortify our own app review process and oversight programs. Now, that's with respect to platform developers, that's a specific context, but I think it's an important one that demonstrates that, even though we have those frameworks, there's such a need for multi-stakeholder projects like the WEF project we're talking about, to develop standards promoting responsible data sharing among businesses. I'm super curious how your poll comes out, because I do think in many cases we need several of these things. We need laws, interoperable laws.
We also need these kinds of multi-stakeholder projects that bring together leaders from different parts of the data ecosystem, industry, civil society, academia, to develop rules about the roles and responsibilities of different partners in this ecosystem that Joanne just talked about, and who's responsible when data misuse occurs. I'll end it there, but really, really interesting and hard questions that you're posing. Thank you. We brought up the poll. I'm going to come to the poll, but I'm going to keep it in suspense for just a little longer, because I want to throw a question at both Erin and Joanne together, which is, and this is putting you on the spot: is there some kind of scenario you can imagine where Facebook's data and MasterCard's data could come together to help in some specific situation, like maybe after a disaster of some kind? What would be an interesting, really helpful way to match those data sets, but then also, again, do it in a way that preserves privacy? So I'll try, based on what Erin just said. We use our transaction data that I just mentioned, data that's been processed. Where data has been processed tells us where things are open. And Erin just talked about her movement range maps, and I have no idea how that data is created, but she said it's privacy protected. Both of those data sets are aggregations, but they would be really important to be able to tell what's going on in locations. It would be very helpful, I think, for maybe both of those data sets to be combined into a dashboard to talk about what's going on in locations, especially as we see the variants of this virus in the United States. So let's just pick the United States. Locations are seeing the impacts of opening and closing. Now, we're trying to see a lot of things open, but we also know that we're seeing small businesses in particular have a real challenge with having workers available to them.
And so in a location that's very worker dependent, it might be really helpful to know when people are needing to stay home. Maybe it's because they are having symptoms. Maybe they need to quarantine. So I could see where combining those data sets, and I'm riffing here, Erin, because I don't know the data well at all. But I could see how both of those data sets, ours saying that things are open or closed, Erin's saying that people are staying home, might be very powerful for understanding how people are behaving, at least at an aggregated level, according to what Erin is saying, and what we're seeing from the economy, either processing transactions or not. And so I guess we could partner. I think a government or a location would be interested in that, and maybe provide that to, say, downtown areas to understand just how much transportation, again, the example I gave you, would be needed in that area. What should they be doing to prepare for reopening, potentially? So Erin, I don't know, maybe you want to take that. Yeah, no, I love that idea. I love the idea of this dashboard. And we do have a Data for Good program where we release data sets to nonprofits. And so we can imagine a situation where MasterCard and Facebook could release data together to some of these partners. We partner with universities or nonprofits to develop insights. So then they can take, like you were talking about, Joanne, they can take the data, and then they themselves can merge it, correlate it and come up with insights in a privacy-protective way that can be useful. So that could be a really interesting idea. Yeah. And let me just add to that. MasterCard has what we call data fellows, which are university researchers who come in with projects. And in a privacy-protective way, they work with some of our data scientists in academic sandboxes on different projects. So I guess that's foreseeable, even though they haven't done it yet.
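The joint dashboard the two panelists are imagining reduces, at its simplest, to a join of two independently aggregated, privacy-protected tables on a shared geography key. A minimal sketch follows; every field name and number here is hypothetical, not either company's real data schema:

```python
def build_dashboard(merchant_activity, staying_home):
    """Join two region-level aggregates on their common region key.

    merchant_activity: {region: share of merchants processing transactions}
    staying_home:      {region: share of people staying home}
    Both inputs are assumed to be aggregated and privacy-protected upstream.
    Only regions present in both data sets are reported, so neither party
    learns anything about the other's missing regions.
    """
    shared = sorted(set(merchant_activity) & set(staying_home))
    return [
        {
            "region": r,
            "merchants_open": merchant_activity[r],
            "staying_home": staying_home[r],
        }
        for r in shared
    ]

# Hypothetical inputs: a card network's view and a platform's view.
tx = {"downtown": 0.82, "suburbs": 0.95}
mv = {"downtown": 0.61, "airport": 0.12}
print(build_dashboard(tx, mv))
```

A row reading downtown, 82% of merchants open, 61% of people staying home would tell a transit planner that businesses have reopened faster than foot traffic has returned, which is the kind of reopening signal Joanne describes.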
Perhaps they could come up with a way to combine the data sets. So yeah. I think we have an idea, Gideon, coming from this. So yeah. There you go. Amazing. Generative ideas. So with that, I'm going to come to Quentin, who's been waiting patiently, from the International Digital Accountability Council. And Quentin, I think probably a lot of people listening to that exchange right now might have been saying to themselves, oh, wow, that's a really cool idea for how we could merge data sets in order to help in a humanitarian crisis. But they might also be saying, what, combine the data of Facebook and MasterCard? Hell no. I don't want that kind of digital invasion of privacy. So when you hear a scenario like that, what do you think about the potential of it? And what do you think about the concerns about it? And what would be a way to make something like that happen without it both spooking people and creating genuine privacy concerns? Sure. Well, first of all, thank you, Gideon. Thank you, Erin and Joanne. I'm really looking forward to this. And I want to give a shout-out to the forum, who have been real leaders in developing solutions to these problems. And I want to start out with where you began, Gideon, which is that if you think about the way that digital technologies have changed our lives over the last 20 or so years, it's really staggering the way they've affected our interaction with our health, with dating, with education, the way that we raise our kids. Every aspect of our life has been touched by digital technologies, by data sharing, and we've barely scratched the surface of this digital transformation: personalized education, personalized healthcare, the kinds of collaborations that Erin and Joanne are talking about in terms of tackling really big challenges like the pandemic or the climate crisis. The potential here is enormous.
And let me just highlight two major challenges that get in the way of us unlocking the transformative potential of the kinds of ideas that Erin and Joanne are exploring, and many of the other things that we can do with these digital devices. And the first one is trust. So fundamental to the functioning of this system, and fundamental to taking it even further, is the question of whether you, as a user, feel comfortable with the amount of data sharing that's happening. And we can see in survey data that a lot of individual users feel that they've lost control over the ways that their information is being shared in the digital world, and don't feel comfortable that these jargon-filled monstrosities of privacy policies are actually meaningfully protecting our data and making it possible for us to be really comfortable with the ways that these technologies can be taken one step further. And when you don't have that trust, it holds us back from the kinds of collaborations that we want to enable in this system. And then I wouldn't have a way of knowing whether to trust these people. Well, and there's good reason to think that the system is not trustworthy. And we see that in terms of individual examples of privacy breaches or cybersecurity breaches. We see some of the ways in which artificial intelligence can lead to systemic racial discrimination, among other kinds of challenges. So there are lots of ways in which it's not just trust, it's trustworthiness. But regulating the digital world is harder. You can't just apply the old ways of thinking about consumer protection and expect them to work well in a new environment. So when governments pass large pieces of legislation, and the US has been trying to pass privacy legislation for a long time, while Europe took a huge step forward with GDPR, these are often really broad, abstract sets of rules. And if you're thinking about it from a business perspective, it's very hard to navigate.
There are different rules in different jurisdictions. And there are different rules for different kinds of technologies. And you can really have a lot of sympathy for businesses not knowing what they're allowed to do and not do. And from a consumer perspective, there's a lot of risk. And the system doesn't work very well. So that takes us to the question: what can we do to make that better? And I think that, fundamentally, there are three elements to a system that would work better than the current system, and that would enable us to take on some of the collaborative possibilities that Erin and Joanne are talking about, some of these other things that we want to do with personalized medicine or personalized learning. And those three things are: you want clear rules, you want to develop some rules that people can understand, both from a user perspective and from the perspective of an app, a developer, a platform, participants in the systems. You need clear rules. The second thing you need is a process for training the people who are expected to follow those rules in what is expected of them. And then the third thing is accountability. There needs to be a system for putting in place watchdogs or other kinds of enforcement, so that we can feel comfortable that people are following the rules, and that when they don't follow the rules, there is a course correction that comes into place. And so one of the challenges, I think, that we have in the public policy community is to come up with that dynamic approach of clear rules, good training, and reasonable accountability at the speed of the internet. So let's take the U.S. environment. If you think about the way that rulemaking usually happens in the U.S., it typically takes about seven years for a notice-and-comment rulemaking to happen from beginning to end. But the challenges of the internet in 2014 are very different from the challenges of 2021.
So we can't solve those problems now and expect it to work well. You need a much more dynamic and iterative process. And as several people have alluded to today, you want that to be multi-stakeholder. You want businesses at the table. You want civil rights and civil society groups at the table. You want academics at the table. So we need to think differently about the way that we govern the internet, with the goal of enhancing trust to facilitate further transformation. Right. So now let's bring up that poll that we asked at the beginning of the session and see what the results were. Can we have that on the screen? Yes. So I actually need to make my screen bigger so that I can see the results myself. So clearly, there seems to be a strong preference here for setting industry-wide standards, for things like data interoperability. So actually, Quentin, do you want to comment on this? It's interesting that so many of the audience voted this way. What do you think? So I do think that we need industry-wide standards. You know, I had the honor to serve in the Obama administration, in the White House, as we were trying to develop the notion of a Consumer Privacy Bill of Rights. And the idea there was that you would have a series of background principles based on the Fair Information Practice Principles, things that are almost like a constitution, right? So they're broad. They don't necessarily tell you how you apply those rules in particular circumstances. And then you need to build out what we think of as codes of conduct. Codes of conduct are more specific rules that industry would have a role in shaping. You want developers at the table when you're asking apps to behave in a certain way. You want to have the platforms at the table, and you want to have others at the table.
And then you want to make sure that that has some teeth, and also that when you follow those rules, you can feel comfortable that you get some protection from liability. So I think that a good system does have industry-wide standards, but they need to have teeth. They need to be meaningful. There needs to be meaningful accountability, and it can't just be self-policing. Right. Okay, I'm going to go back to Joanne now, because I'm coming back to some of the things you were talking about initially. One of the challenges, obviously, is that these rules and laws vary from country to country. I don't know if that's how you perceive it, but it certainly seems as if the world is starting to become more disparate in the way that different jurisdictions treat data, some countries taking a much stricter view of how companies can behave, some a looser view. So how do you at Mastercard navigate that, particularly when you're thinking about a problem like how to put data to use for good, when you're trying to solve or help with problems that are global, but that cross multiple jurisdictions with different laws? Well, I'm fascinated, because I think it leads on from your polling question, Gideon, and even Quentin's answer and your question. So I love that the audience chose setting industry-wide standards; that got such a resounding response. And yet we have lots of laws, right? GDPR-like laws got a lower ranking, even though we have a patchwork of them, and standards are one level lower than laws. So it's interesting that there's this desire. And then principles got lower than that. So it's a really, really interesting conversation, because then, I think your question is, how do organizations react to this when we have a landscape that is so very different depending upon where you operate in the world?
Because you have kind of stricter, and then you have less strict, and then you have kind of none. And so what is a global company to do? And so what we've done, I think, is what many companies do. First of all, we hold ourselves to the highest level of legal compliance. And I think that's what most organizations do, just because we believe that most consumers really do want you to earn their trust. And even though we're really a B2B company with a B2C brand, and the impacts of our products aren't really direct to the individual, we really do believe that individuals should have rights when it comes to their data. And with Quentin talking about trust, somebody recently said to me that trust is earned in drops and lost in buckets. And I think that that's really true. And so what we've done is we've actually come up with responsibility principles that we design around, that really act as our North Star and help us align with all the laws. So we meet all the legal requirements. But what we've done is we've published our own data responsibility principles, to assist our product designers, our salespeople, our folks in HR and finance, in understanding how we are designing both from a commercial standpoint, but also in the social impact space. And we started with individuals: we believe individuals have the right to own their own data, that they should have the right to control their data, and that they should have an understanding of how they benefit from how their data is being used. And I think Quentin really spoke about that, that that's really the issue and why we kind of ding on the privacy notices. I think that was their intent. I think that they have way too many legal requirements in them for most people to read them. But the idea was: how do you benefit when MasterCard gets your data?
We stand in between the bank and the merchant, and we use data to really operate our network, but also to create fraud algorithms, to prevent fraud, to give security to the transactions, to keep the bad guys from getting in. And then also to do these econometric types of analyses. And I think most individuals, once they understand that, understand the benefit of fraud protection, right? They understand the benefit of security. But individuals also have the right to privacy and the right to security, right? Those are things that individuals should expect from any organization that they're going to give their data to. And so when we came up with those principles, everybody said, well, these are lovely, but then what are we doing as MasterCard, right? If we believe this for the individual, what are we doing? And so we have our own design principles, and some of them are, you know, the right to privacy and security. We will provide privacy and security. We will adhere to those standards. And then we will be accountable in how we use data. We will be transparent. We will have ways that individuals can interact with us to understand how we're using it. And we will have integrity in our practices. And Quentin talked about the world of AI, and really minimizing bias and understanding the quality of the information we're using to create advanced products and solutions; we really need to continuously advance our game. But one of our principles is also that we're going to innovate with data. And that means that we are going to keep moving as the market moves. We're going to innovate new fraud solutions. We're going to keep upping our game in security. We're going to keep upping our game in econometrics. And our last principle is that we're going to use data for social impact.
We're being really transparent that we believe that leveling the playing field, financially but also in other ways, is part of our mandate of being a good citizen on this planet. And so we're trying to be kind of open kimono that we use some of our commercial solutions in a social impact sense, and that we develop things in the social impact space and sometimes commercialize them, but that we believe that's part of how we will use our data to solve different problems in the world. So that's one of the ways we are trying to be more transparent and solve some of these issues, even as a B2B company with a B2B2C business and brand. So it's important that the ecosystem I described in the beginning really gets built, including codes like Quentin talked about, including standards, including responsible data practices, so that everybody in the chain who is going to be using data to create solutions to global problems (and this is where the WEF has done a really good job in leading) begins to really understand that this is a very, very important resource for solving problems, but that we have real responsibilities as we use it. All those intermediaries, as we pass data to each other to make these solutions work, have to be really, really careful in how we do it. And so that's really part and parcel of how we're approaching the law: those are our principles, and we comply with the law, but we go way further than that in how we actually act at MasterCard.

You raised an interesting point. Oh, sorry, go on.

No, sorry, Gideon. There's one piece that Joanne didn't mention, but I think it's really relevant and important to your question, Gideon, about these legal regimes. What we are seeing that is really troubling are these data localization requirements around the world, and this trend towards internet fragmentation.
I'll just spend 30 seconds on this, but the irony of what we're talking about today is all the benefits of sharing data, and that can only happen, for example, when you allow cross-border data flows. We'd be having a very different session today if data flows were no longer possible, or were really hard, because of legal regimes. So that is a trend we are seeing globally that I do think we collectively need to address in order to advance exactly what we're talking about today.

Yes. I mean, do you think it would be fair to say, though, that a lot of these restrictions we're starting to see on data flows are because companies, large companies, to some extent lost the trust of consumers and of governments in the way that data was flowing before?

It's a good question. I think a lot of the data flow issues we're seeing are because countries want to keep the data to themselves and have their core local businesses benefit from its use. So actually, I think countries recognize that data empowers innovation, right? Data enables businesses and small businesses to thrive. And I think one of the reasons we're seeing countries try to become more restrictive is a nationalistic effort to say: we want the data to be leveraged by businesses here, and we don't want to allow, for example, US companies to benefit from the data of our citizens. So that, Gideon, is more what we're seeing as the rationale behind it. Not to say your point isn't valid, but yeah.

Okay. Since we have only five minutes left, I'm going to throw in a couple of questions from the audience. One is asking: this is UN General Assembly week; what message would you send to governments to support the data ecosystem beyond just privacy laws? Any one of you can take that question.

So I have a thought on this, which is about harmonization.
I think it's really important for us to have interoperable laws, so that it is possible for companies to understand how to navigate US laws, European laws, laws around the world. If you are a business trying to launch a product that is going to be bought and sold on the internet or used globally, you'll often have to research the laws of a large number of different jurisdictions and guess as to whether you have complied with the laws in some others. That's a real challenge, and it's a challenge that has a consumer impact as well. This patchwork of laws across international jurisdictions, and even the patchwork of laws within the United States, is not good for consumers and certainly not good for businesses. So as we're coming together as an international community to take on things like the climate crisis, or as we're trying to take on the pandemic, we need to find ways to make our laws speak to one another. And I agree with Erin that this trend towards data localization is generally the wrong direction, and is generally motivated by more petty local interests. We need data to flow, and we need to think about ways we can work together as a world community.

Okay. Another question, from Amir Banifatemi, which I'll direct to Joanne and to Erin: would you commit to open data on an ongoing basis, and would you support oversight governance to make sure there is sustained access to data?

So I'm always leery of the phrase "open data," because we never share our raw transaction data; we don't share personal data, for all the reasons we've been talking about. What I would commit to is open insights, which is really what I think people want, right? So all the econometric information that I'm talking about: I think there's a version of insight sharing that needs to happen so that we can understand what helpful information should be shared.
I think there are different vehicles, data cooperatives, perhaps data collaboratives, that need to be created to make this easier. And then I do think we need tools to showcase that information and enable its use. So I do think mechanisms need to be created to make that information more easily available to a wide range of constituents. And I know the WEF has been working on this; I believe there's a project called the Data for Common Purpose Initiative that they have launched, and I think it is a great idea to try to engender exactly that type of data use.

Alright, Erin, what about you?

Yeah, thanks, Gideon. Just quickly: in general, I agree with Joanne. But one thing we've done is create a program called Facebook Open Research and Transparency, or FORT. The goal of FORT is to facilitate privacy-protective data sharing with academics for research, because we do support data sharing for research. But again, as we've all talked about, we have to do it in a really smart way, and not at the expense of people's privacy. And on the second point, about oversight, I think there's a really important role for oversight. That is something we've been talking about, even as part of this proposal, this multi-stakeholder initiative, and this goes to some of the points that were raised earlier: how can we ensure accountability, and can we have some form of external monitoring of compliance? We very much support the idea of an external monitor and think there's a real role for that as part of this ecosystem, to engender the trust we've all been talking about.

Great. Well, unfortunately, we are pretty much out of time, and for me it has been a fascinating discussion.
I'm very keen to see what comes out of this idea for a collaboration between Facebook and MasterCard, and whether you can make it work in a way that people trust. One of the key themes that has emerged for me today is the notion that people could actually see the benefit of data sharing, and could be happier with their data being shared, used, and remixed, if the trust were there and if they had some kind of concrete visibility into what those benefits are. So hopefully we can see some progress towards all of that. Thank you very much, Joanne and Erin and Quentin, and thank you to the World Economic Forum for organizing this panel.