Okay, we're going to go ahead and get started with our last segment of the day. Thanks so much for sticking around. Just a quick reminder, drinks are after this, so get excited: this is going to be, if not the best segment of the day, equally tied with all the other segments for best of the day. Yes, the Ravens Club, it's downtown, so it's about a 10-minute walk from here on Main Street. Yep. What is it next to? You can Google it. We'll all walk over together, so there'll be a migration. I'm going to introduce these fine folks here, but before I do, I want to say a special thank you to Elizabeth. We spoke over the summer when this symposium was just a baby idea, and she was a real guide in helping to find great folks to be panelists and in shaping some of the ideas that went into each of these sessions, so we're really grateful to you for your input and your guidance. This is Elizabeth Khalil. She has spent her career as an attorney focusing on bank regulatory compliance and risk management, with a particular concentration on privacy and emerging technology. She's a former federal banking regulator, having served in the Office of the Comptroller of the Currency and the FDIC in Washington, D.C. In private practice, as a partner at Dykema Gossett and a senior associate at Hogan Lovells, she advised numerous banks, credit unions, and fintech companies. She now focuses on retail and consumer issues, including digital initiatives, at CIBC in Chicago. To her right is Matt Homer. Until recently, Matt was head of policy and research at Quovo, a fintech data connectivity platform. He previously served in government at the Federal Deposit Insurance Corporation and the U.S. Agency for International Development. With Elizabeth. What a team. Amazing. At USAID, he developed two innovation programs. The first was a partnership between the U.S. government and the government of India to promote payment innovations that would lead to greater financial inclusion and improve lives for ordinary Indians.
The second, the RegTech for Regulators Accelerator, was a first-of-its-kind accelerator program designed exclusively for regulators in order to help them test new technologies. Matt recently also supported the stand-up of a new Rockefeller-supported effort focused on helping policymakers in emerging markets build new digital infrastructure and pursue efforts that give individuals more agency over their data. As part of this, he embedded with the team in India that built the country's digital identity program and India Stack. So please join me in welcoming our final panelists. Thank you so much. It's great to be back at my alma mater, Michigan Law, and to be inside this beautiful new building for the first time. It's great. So thank you. Thank you, everybody, for having me and having us. Matt and I thought that we would wrap up by highlighting some key themes and issues that we saw as threads throughout today's discussion. One thing that I noticed in particular is how great it is to have all these great minds together in one room to talk about these issues holistically, because as somebody who has grown up in her career alongside a lot of our financial privacy laws and regulations, I can tell you that that hasn't really been the underpinning of our financial privacy legal regime in this country. As I was coming out of law school in 2002, we had the Gramm-Leach-Bliley Act privacy and data security regulations and guidelines coming online. We were about to see the FACT Act, which amended the Fair Credit Reporting Act. So we have this patchwork of laws and regulations affecting privacy and data security in this country.
We didn't really go into a lot of depth as to what the current lay of the land is legally in this country, but just to give you a snapshot of that, the backdrop against which these discussions happen: we have no single, consistent, generally applicable, comprehensive privacy and/or data security law or regulation in the U.S. We have mostly a sectoral approach. So there's health privacy, financial privacy, and so on, and then some federal laws and regulations that apply more generally. There's no one comprehensive framework. The laws that we have, in some cases, reacted to certain situations and concerns that were happening at the time. So in 1970, we had concerns about credit reporting, consumer reporting agencies, aggregators of consumer data, so we got the Fair Credit Reporting Act. The Gramm-Leach-Bliley Act happened in 1999, expanding the powers of financial institutions, and as a little bit of a trade-off, and partly in response to a member of Congress's concern about his credit union possibly selling his contact information to the Victoria's Secret lingerie catalog, we got a little bit of data protection for consumers as part of that law. So we haven't really sat down and had all the key stakeholders at the table and thought really comprehensively, from square one, about financial privacy and data security. Here in this symposium, we have the luxury of thinking from the ground up: what are the important principles that we would want covered in a new regime that covers financial information, whether it's the magical dust of fintech, or whatever it was called, or other products and services that happen to incorporate financial information of consumers? So I think that is something really great, and that framework is really helpful and would be great for any actual decision makers to incorporate into a new legal or regulatory regime here.
Building on that, one thread and concept that we want to talk about is the philosophy of privacy: what are our privacy values as a nation, and how does this relate to financial information in particular? Professor Barr teed up that issue at the beginning of the day in saying that, you know, we haven't really thought, again from square one as a country, about what our privacy values are, what our philosophy is there. So I don't know if, Matt, you want to jump in with some thoughts there. Yeah, I was just reminded of some statistics when Michael teed up that question. The Clearing House, which is an association of banks, put out a survey last year on fintechs and found, and this is not going to be surprising to you, that 99% of respondents said they are concerned about data privacy, and 67% of respondents said that they're either extremely concerned or very concerned. Yet that same survey found that a vast majority of respondents feel comfortable sharing their mobile phone number, their email address, their physical address, and their date of birth. So I think that really highlights that there's a paradox in America when it comes to privacy. We say we really care about privacy, and yet our behavior and our actions don't necessarily reflect that. And I think part of that can be explained through some other survey work, which finds that people don't see any contradiction between convenience and privacy. People believe that they can have maximum convenience with maximum privacy, and they believe technology can be a means of helping them accomplish that. I think that's also true. But I think the challenge is that consumers are not really voting with their feet to force some action there. And I think this also relates to some issues that have been discussed throughout the day about what consumers are informed about and what choices they're given.
So often we hear, well, consumers don't really care about privacy, or at least in the U.S. they don't care about privacy; there's not this underlying, passionate commitment to privacy like there is in Europe. Under the Gramm-Leach-Bliley Act privacy rule, you have a notice and opt-out regime. So basically, there's no affirmative consent required there. It's more that the financial institution has to disclose certain things about what it's doing with consumer information, and the consumer has to have an opportunity to opt out of that use. And that's not really consent; that's sort of an opportunity to not consent. Very, very few people statistically use that opportunity to opt out. So some conclude, you know, people don't really care. And again, as we've heard many times today, people don't read privacy notices. Well, does that say that people don't care about privacy, or is it that they're not being given meaningful information and they're not being given a meaningful opportunity to control what happens to their data? Because GLBA notice and opt-out is not the kind of robust transparency, information, and consent regime that we've been talking about. What has to be disclosed under that regime is pretty minimal, in the sense of general categories of use of data. It doesn't get super granular into what exactly the financial institution is doing with the data. So when people aren't as informed as they could be in the first place, and then they're not allowed to affirmatively consent to every use of their data, it's just, here are these general categories, and do you want to opt out? It's important to realize that that is the regime. That's what's being given to consumers, take it or leave it. That's what we have right now. Yeah.
And I think that actually illustrates a broader point, which is that there's a difference between having certain rights over your data and being able to exercise those rights. The gap between those two things is particularly enormous when you go to developing or emerging markets, where I think there's a strong desire, and many of these markets are looking to GDPR as the model that they want to follow, yet there's a real question in those markets about whether the government has the capacity to make those rights meaningful to consumers. I think, as a result of that, you're going to see a lot of interesting innovation actually come out of emerging markets. India, where I've spent a lot of the last four or five years of my career, I think is interesting. They're considering new privacy rules, and at the same time, there's a group of technologists considering technology standards that would accompany those privacy rules. So the idea is that rulemaking, particularly in a low-capacity environment, is not sufficient to enable consumers to exercise these rights; consumers actually need some more tangible set of tools to make those rights meaningful. And we could talk about what those things could be: improved permissioning, improved consent flows, data command centers or whatever you want to call them, a place where you can go to actually proactively manage your data and your consents afterward. As well, I think Sean mentioned something really important: just the idea of a fiduciary standard for some of these companies, so that you shift some of the liability from the consumer to the companies holding their data and try to make data become seen or perceived as more of a liability than an asset for companies. I think you're absolutely going to see that in emerging markets.
And also in terms of control over your data, affirmative power over your data, again, that goes to the fundamental setup of the financial privacy framework we have right now, and it would be very, very different from what we have right now. What we have is this concept of transparency, a type of transparency: this is what we're doing, in general categories. FYI, here's a long privacy policy where we sort of describe what we do, but is it really helpful or meaningful? Sort of FYI, take it or leave it; we've done what we can do and now we're moving on. So it would be a very fundamental sea change. And just thinking in practical terms about how that changes, how does that work? You'd have to get the buy-in of Congress, banking regulators, you know, all the stakeholders that need to be part of that, whoever needs to be at the table. And so far it's really been very piecemeal in terms of who has been involved. Yeah, I think one way of thinking about the problems, and I'm kind of of the belief you can bucket the problems into two big categories: one is the information asymmetries that exist between consumers and those who hold their data, and the second category is the power asymmetries that exist between consumers and those who control their data. And I think consent is an interesting topic there. The New York Times, a couple of weeks ago, published an op-ed, or an editorial actually, the headline of which was putting the con in consent, right? It basically made the case, drawing on recent examples from large tech companies, that consent is basically completely meaningless, and that even when people do consent, there's no way they can ever be expected to understand what that consent means. And if that's true here, it's particularly true in other markets where people may not be able to read, right?
Or they may not be able to understand some of the terms even if they're read to them. So I think, if you get to the broader point of solving the information asymmetry, it just highlights the need to think beyond consent as we understand it now to other methods of solving that problem. And Professor Barr also teed up the idea of perhaps thinking of things that you can't consent to, setting substantive limits on using information in certain ways, or on using certain types of information. But again, I think it's important to remember that we basically do not have a consent regime here with regard to privacy generally. In some cases, certain types of consent are needed. In other cases, again, it's opt out. It's not exactly the type of consent that one might think of in, you know, a GDPR context or some other context. Yeah, and there are also no required disclosure forms. I mean, there may be required disclosures for very specific sectors or use cases. But someone previously mentioned the idea of a Schumer box for data, and there's no requirement for something like that. And I think even in a digital environment, we could probably start with a Schumer box, which is just a generic or standard template on which a consumer could see data presented the same way across different institutions, to enable apples-to-apples comparisons between multiple products. You could envision something like that. But I think, frankly, technology also makes it possible to take it a step further and think about dynamic Schumer boxes, right? So a consumer could have a dynamic disclosure they could go to at any time, and it's updated to that current minute to tell them how their data has been used, who it's been shared with, et cetera.
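To make the "dynamic Schumer box" idea concrete, here is a minimal sketch of what such a disclosure could look like as a data structure: a standard template that is appended to each time data is actually used or shared, rather than a static notice. This is purely illustrative; every class, field, and value name here is invented for the example, not drawn from any real regulatory form.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class DataUseEvent:
    """One concrete use or share of a consumer's data."""
    timestamp: datetime
    data_category: str   # e.g. "transaction history" (hypothetical category)
    recipient: str       # e.g. "XYZ Analytics" (hypothetical recipient)
    purpose: str         # e.g. "credit underwriting"

@dataclass
class DynamicDisclosure:
    """A 'dynamic Schumer box': a standard template, updated
    as data is actually used, rather than a one-time notice."""
    institution: str
    events: List[DataUseEvent] = field(default_factory=list)

    def record(self, event: DataUseEvent) -> None:
        """Log a use of the consumer's data as it happens."""
        self.events.append(event)

    def current_view(self) -> str:
        """Render an up-to-the-minute, apples-to-apples summary."""
        lines = [f"Data sharing disclosure for {self.institution}:"]
        for e in sorted(self.events, key=lambda e: e.timestamp):
            lines.append(f"  {e.timestamp:%Y-%m-%d}  {e.data_category} -> "
                         f"{e.recipient} ({e.purpose})")
        return "\n".join(lines)
```

Because every institution would render the same fields in the same layout, a consumer could compare two products' data practices side by side, which is the point of the original paper Schumer box.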
Yeah, I mean, we have the model privacy notice, which is a safe-harbor form of disclosure that incorporates Gramm-Leach-Bliley and Fair Credit Reporting Act disclosures. But again, as I was saying, it covers more general categories of use of covered data. In format, it's sort of Schumer box-like. But, you know, you're talking about getting much more specific, granular, and even dynamic in the actual content of it. So those are some of the key themes we wanted to talk about. We could talk about this all day, since obviously we have been talking about these issues all day. Well, the last thing I'll mention: I spoke a little bit about the information asymmetry. I think the other side of it is the power asymmetry, and how do you solve that? And I think the point was made earlier, and I think it's right, that even if you can level the bargaining power between the parties, it may not even solve the problem. But I think there are some interesting things happening around these issues that I wanted to mention from India. One is that the Reserve Bank of India is experimenting, actually they're moving forward and have licensed entities that they're calling account aggregators, which I would describe as consent intermediaries. The idea is, if I'm a consumer and I'm doing business with a bank, a consent intermediary would sit between us to collect my consent. That's their sole job: to manage consumer consent. And then that becomes the place I can go afterwards to see my consent across the landscape of financial institutions and to modify or revoke my consent. So I think that's an interesting model.
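The consent-intermediary flow described above can be sketched in a few lines of code: a neutral party that records grants of consent, gives the consumer one place to see them across institutions, and lets the consumer revoke them later. This is a toy illustration of the concept only, not the RBI's actual account aggregator specification; all names and method signatures are invented.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ConsentRecord:
    consumer_id: str
    institution: str
    scope: str            # e.g. "share balances with lender" (hypothetical)
    active: bool = True

class ConsentIntermediary:
    """Sketch of an account-aggregator-style consent intermediary:
    its sole job is to collect, surface, and revoke consumer consent."""

    def __init__(self) -> None:
        self._records: List[ConsentRecord] = []

    def grant(self, consumer_id: str, institution: str, scope: str) -> ConsentRecord:
        """Record a consumer's consent to a specific data use."""
        rec = ConsentRecord(consumer_id, institution, scope)
        self._records.append(rec)
        return rec

    def view_all(self, consumer_id: str) -> List[ConsentRecord]:
        """One place to see consents across the whole landscape."""
        return [r for r in self._records if r.consumer_id == consumer_id]

    def revoke(self, consumer_id: str, institution: str) -> int:
        """Revoke every active consent granted to one institution;
        returns the number of consents revoked."""
        revoked = 0
        for r in self._records:
            if (r.consumer_id == consumer_id
                    and r.institution == institution and r.active):
                r.active = False
                revoked += 1
        return revoked
```

The design point the panelists raise is visible even in this sketch: consent management lives outside both the consumer and the data holder, so revocation is one call to one party rather than a separate process at every institution.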
I think even outside of a formal licensing regime, you could imagine consent intermediaries playing an important role going forward. And then the second thing they're doing in India that's related is creating a digital locker functionality, which is a single point of access for anyone in the country to access their information. The data isn't actually stored there; it's connected via APIs to the actual data sources. They're starting with government data, so it's a single place you could go to get your driver's license, your business license, your birth certificate, which can be quite important for then getting loans or things like that. But it's not just a place to access it; you can then also permission it, right? So you could go there and permission your business license to a lender, and the lender would see that it's actually certified as authentic by the government. So they're starting with these types of government data, but then the plan is to add in other types of regulated data, including health data, financial data, et cetera. So I think that's also an interesting model. And I'm kind of of the belief that if you're never going to be able to perfect consent, you should at least build tools that enable consumers to exert more control over their data when they need to, at the right point in time. I think we all know that when we use apps, we're often trying to click through the onboarding process as quickly as possible to get where we want to be. But then when something goes wrong, or there's a data breach, or at key moments in our lives, we actually do want to exert control, right? And so that functionality needs to be there, in place, when we need it. We have some audience polling questions to enter the second phase of our closing remarks, so I need to get a clicker. All right. So this is where we're going to use the clickers. Okay. Let me just get this set up here.
It's pretty self-explanatory. I think you just push the, you probably have to turn it on. We'll be trying, so if you use your thumbprint, it's capturing your biometric and pulling it in. So yeah, turn it on, then let me click start, and then you just read the question and respond. We'll tell you later. You can go ahead and start. Yeah. So it looks like we've got nine responses, 12. It was all disclosed when you enrolled in this symposium; you clicked accept to come. This time next year, Americans will have: A, more control over their data; B, about the same level of control over their data; or C, less control over their data. All right. Let's give it until it says 45 seconds on the clock, and then we'll stop. All right. All right. Interesting. Okay. So less control over their data. And then someone selected option D. I don't know. Would someone like to share who voted for less control over their data, why they feel that way? Yeah, we'll start here. Anyone who said there'd be more control, would you be willing to rebut that? Yeah. If D was intentional, would you like to describe your thinking? All right. Okay. So pretty not super optimistic here, but all right. I have to figure out how to reset this. Okay. I'll show you the next question so you can start. All right. Here we go. When it comes to sharing personal data, consent is: A, useful but can be improved; B, useful but insufficient; C, ineffective and should be replaced with something else; D, ineffective, but nothing else is better. All right. Looks like we're still waiting on a few. All right. We'll give it to 45 seconds. Okay. Okay. So, let's see here. Yeah. So we need to find out who's voting E on all this stuff. I wish we were capturing biometrics now. So, for C, should be replaced with something else: seven people voted for that. Those of you who voted for that, do you have any ideas of what might be better? Yeah. Yeah. Right. Yeah.
So what would be your golden solution? Yeah. Yeah. Totally. Yeah. That's interesting. I mean, there's this concept of tiered KYC that exists in some markets, right? And you could imagine something like that around these issues generally, right? Was there another comment? Anyone who wants to comment on how consent could be improved? Yeah. Yeah. Well, that's interesting. That's another idea that comes from elsewhere: the concept of a learned intermediary, right? That you can rely on a learned intermediary. Right. Yeah. Yeah. And even in the health privacy context of consent, they've run into issues here and there with that. With a business model? Well, with going beyond, you know, what was consented to in the procedure. Yeah. Yeah. The business model is a good question. And with the model I mentioned in India, there have been questions about whether the business model has been clipped too much. Well, I think what's emerged so far, at least one of the companies that's been licensed is a bank utility. So similar to the way Visa or Mastercard started, right? Banks would come together and form a joint venture, realizing that it's more efficient. Consent could be something on which they could collaborate in that way. So you could envision a utility model there. I think this convenes every three years, right? So about three years from now, we'll probably have an answer. Yeah. So yeah, I think getting money from consumers would not be, I just don't know how many consumers would actually pay. But you could imagine a value-added service like insurance, right? I mean, this could be tied to some sort of insurance product, yeah. Yeah. So you're right, it could be a fun thing to explore in a session, the business models and revenue. Any other thoughts on this one before we move to the next?
Just, you know, I was just saying to disclose more, because they got no benefit out of the marketing stuff. Like, instead of just having a one-size-fits-all model, having tiered disclosure and a tiered, you know, benefit on it. Yeah. Interesting. Okay. Was there a comment up here? Was there? Okay. Can I jump on that real quick? Because I've been wanting to say this all day, which is that we, especially in the financial services regulatory environment, think about the value of data for risk mitigation on the industry side. I hate these things, by the way. Aren't I loud enough as it is? But we're not stepping back and thinking as explicitly about the value of risk mitigation on the consumer side, and how we're thinking about policy and societal objectives. Where at some level, Christine, you said it, right? The bargain. What's the trade-off? What's the give to get? If we're thinking about, assuming we don't already see industry using lots of data for risk mitigation, we want to make that more affirmative, in the way we're researching it around cash flow. What about thinking about obligations on entities that are getting the value of that data, that they then have the obligation of making sure that it is, at some level, improving consumer financial well-being? Pick your context. But that there is a give to get in a really explicit way, and that we, society, are thinking about how this is being structured so that it is beneficial on both sides of the coin. Unless that hasn't come up yet, but CRA would be the immediate example. And that was the first thing I thought about, right? To give credit, do we even have to give credit? Yeah, partly. So, when you think about the CRA in the context of cash flow, it might be in terms of populations rather than individuals. Actually, my colleague Larry White is... And it does consider it in populations, LMI. Right.
My colleague Larry White has written on this, and I think we need a better way of thinking about a lot of those issues. Yeah, I agree. Do I need to read from my phone, or could I just talk really loud? All right. Just two quick points. Today I mentioned the digital standard when I was on the panel, and one of its elements is that every piece of information collected about me should accrue to my benefit. It's stated a little bit differently, but it gets to that idea, right? And then there's this other idea that's been a sort of subliminal part of our conversation throughout the day, which is that, ultimately, privacy is only for rich people, right? So you look at what's gone on in California, where you have the CCPA, which actually enshrines in law the right to charge people more if they opt out of data sharing, right? And so as we're thinking about this conversation, I think we also need to think about those ethical elements, right? And I think it absolutely gets to the point. Yeah, I think that's a great point. I mean, I opt out of data sharing every time I can, and you've got to imagine that the people who aren't aware of being able to do that are the ones who are providing the data to enable these services to be improved. So those of us who are better off are benefiting from those who aren't. It's a good question. All right, we're going to move to the next one. Let me show it. Let me shut this. Over the next few years, who is most likely to have the biggest positive impact on the ability of U.S. individuals to exert greater control over their data? A, U.S. Congress. B, state lawmakers. C, private sector companies. D, federal regulators. And E, for you E people out there, we have an E this time: other. If you select E, you have to tell us who the other is. No, that's interesting. Okay, private sector companies. Wow. Who wants to tell us about that? What company, and what are they going to do?
You think they'll be driven more by reputational concerns or more by legislation? More by reputation? How do we, what about the values that we're going to bring to our customers? How do we ensure that we gain the trust of our customers? And what is a way that we could distinguish ourselves from some of the other players who are getting dinged left and right? How do we not become that, or how do we do that? I think that there could be, and also I see a lot of companies getting together to talk about this. In some of the other areas, I think it's a little bit too slow, potentially. And it's just amazing: three years ago, you'd go to a client and maybe you'd talk to a privacy lawyer and a few other people who cared about privacy. Now it's a C-suite level issue, pretty much across the board. So there is movement from the top down. I'm not saying that it's perfect, and not to say that all companies are this way, but I think there's real ability for effective and meaningful consumer choice in this relationship between the consumer and the companies that hold their data, because only they can understand the value of that data. I think that's the key here. You know, and I hope and pray that that's what's going to happen. I was glad that nobody picked A because, reflecting our earlier conversation, Congress has thus far been happy to avoid responsibility in this arena and will continue to do so, which is why I didn't pick A; I think it'll be consistent. Oh, and courts. Interesting. Okay. Interesting. Okay. I heard some conversation over here during this question. What did you guys say? Courts? Okay. Okay. Makes sense. Because, you know, as much as I would like to think that companies are going to get religion and build brand and all of that, I think there's just going to be, there's an asymmetry in what Matt talked about between convenience and protection. The asymmetry is, hey, you know, I want convenience and I want protection.
But as soon as things go wrong, that's when I go for the protection side of the scale. And that's what's going to go to the courts. It's going to be litigated, and there'll be lawsuits. So under what legal theory, under what law? Yeah. Well, that's a great question, given the ambiguity, our societal ambiguity. Yeah, it's not clear that there are actual violations of what the actual law is, because the actual law allows for so much already, right? So are you thinking of particular specific laws, or what? You guys are the lawyers. I'm not a lawyer. A taking, a taking. Yeah. So like a constitutional theory. And there's no private right of action under a lot of these, like GLBA. And so everybody has to have standing, right? Everybody. And so it has to be something like that. Sean, where were you going with that? Like a Spokeo, kind of. Spokeo. So we can see how much of a fine the FTC levies on Facebook; no matter how much money they collect, it's going to be a drop in the bucket from Facebook's perspective. So I just don't have confidence in that, so I'm not going to pick anything right now. Yeah, but what you're saying is that Facebook thinks the fines are just a cost of doing business. Exactly, so they're not going to be deterred; I mean, they'll pay the fines. So, I mean, so. So maybe it's injunctive relief? I'm just wondering, probably, because I picked C. I mean, I don't expect these big tech companies to change their behavior whatsoever, unless somebody in the first category, you know, makes some moves there, which nobody believes they can do. But I do think, you know, in the financial services sector, the situation is a bit different. I mean, Dick, in your presentation, you mentioned FDX.
So there is some, well, obviously no consensus whatsoever, but at least there is a forum where incumbents, you know, the Wells Fargos and Chases of the world, and tiny companies, fintechs, and, you know, the middlemen, the aggregators, are sitting at the same table and trying to figure things out. They probably started slow, just on standards, you know, what kind of specs people should use in APIs. But I think, naturally, there was this move on to what kind of ownership structure should be in place, and what kind of liability model should be in place. So broadly speaking, the Googles and Facebooks of the world, I have no idea how that's going to work, but in financial services, I'm a little bit more optimistic. I share that, because I think the trust that Melissa's talking about is a bigger issue for financial services than it is for others. I just want to make sure we're clear on one thing. The FDX APIs that the large institutions have built are supposedly only about technical specifications on data flows. But they are about data fields, and they define what data will actually flow. So even right there, the idea that we're seeing the industry and the big-name institutions be fully open about what data is able to flow, which is slightly different from consent, I think we have to be realistic about what we're already seeing. Whether it's for competitive purposes, they don't want their competitors knowing what their flow is like; it's already not what a true consumer-centric approach would ideally be. So if we do have time for the last two questions, we can vote on which question. They're kind of similar. All right, why don't we just do this one? Okay. In addition to personal data rights, which of the following is likely to most dramatically impact the ability of individuals to exert greater control over their data?
A, greater U.S. federal and state government supervision and enforcement? B, enhanced permissioning functionality for users? C, industry standards for data access and portability? Or D, other? Yeah, so whereas the other ones focused more on stakeholders, this is focused more on what the intervention should be. And if you don't like this question, we can do the next one. All right, I think we're, okay, good, I think we're probably there. Okay, so yeah, similar to the other one: industry standards. All right, well, okay, so someone voted for D and someone voted for E, which is pretty interesting. Yeah, other. Who picked D? Other? Yeah, Christina. So when I look at the other options, I was torn between C and D. And what I really think needs to happen, that's my D, is the state-level laws that we referenced in the earlier question. It's not really about supervision and enforcement; it's actually coming up with a data minimization law or some such tool for really preventing the type of sweeping sharing that you saw yesterday, that was exposed in the Wall Street Journal article. Okay, I think we're a few minutes over time. Does anyone else have any other comments on this, or just generally? Otherwise, we'll turn it over to you guys.