OK, well thanks for quieting down in advance, everybody. We are going to get started with our final panel of today. Last but certainly not least, moderating our panel will be University of Michigan Law School's very own Professor Gabriel Rauterberg. He is an assistant professor of law here at Michigan, where he teaches corporate law, capital markets regulation, and contracts. His research interests include financial trading, empirical research in corporate law, and nonprofit organizations. He is the co-author of two recent books, and he has been published in illustrious publications such as the Columbia Law Review, the Duke Law Journal, and the Michigan Law Review. And in each of the past two years, he has had an article selected among the top 10 best corporate and securities articles of the year by the Corporate Practice Commentator. His current projects include an empirical examination of the tailoring of corporate governance arrangements in public companies, and a project with the economics and computer science departments here at Michigan exploring manipulation in algorithmic trading markets. Please join me in welcoming Professor Rauterberg. All right, welcome everybody. Thank you so much for joining us. So first, I'd like to add my birthday wishes for the Michigan Technology Law Review, or MTLR. This is, I believe, their silver anniversary: 25 years. It was extraordinarily farsighted of the founders of this journal in 1994 to start what was one of the nation's first online law reviews, when the internet was first becoming a thing for ordinary people. And since then, technology has obviously only continued to further transform our lives, and MTLR has become one of the country's premier technology specialty journals. 
So I would like to thank the journal organizers, first and foremost, for organizing such a fascinating conference full of wonderful speakers, and on such a hugely important topic. And also thanks for inviting me. And a special thanks to our wonderful panelists. I'm going to introduce them and try not to be too embarrassing in my praise for them and their careers. The topic of this panel is balancing innovation and consumer protection. That's one of the essential questions of our lives; if people don't find it interesting, it's their problem, right? In fact, one of the questions I find most interesting is whether privacy will be of value to the next generation. I'm not so sure. But anyways, let me introduce the panelists briefly in alphabetical order. Then I will talk about a few things I think are central themes in this area, and then I will turn it over to them for zero to five minutes of opening remarks, so they can get out some general thoughts they want to share. They're free to go longer, but I will awkwardly cough if they go beyond five minutes, because I'd like to turn it over for questions; I will ask the whole group. And anyone who has a question, please feel free to ask it throughout. And at the end, we will certainly have reserved time for questions. Okay, so quickly: Sean Duff is a senior fellow at the Aspen Institute's Financial Security Program, where he serves as principal investigator for the project on pragmatic regulation and financial inclusion. Sean has a JD with honors from the University of Chicago Law School and a PhD from the Wharton School at the University of Pennsylvania, where his research focused on issues related to corporate strategy and financial regulation. He practiced law for more than a decade in the banking and financial regulatory departments at O'Melveny & Myers and Allen & Overy. 
And more recently, he co-founded a digital retirement savings platform and founded the Headlands Project, a boutique advisory firm. He is an elected member of the American Law Institute, and he seems a Renaissance person, or a jack-of-all-trades. Okay, Dan Quan was until recently the senior advisor to Richard Cordray at the US Consumer Financial Protection Bureau, where he led the FinTech office with its cool, very Skynet name, Project Catalyst. There he worked on identifying and promoting innovative technologies and, hopefully, helping the law itself to innovate, which I think will be of central importance for our society going forward: can the law improve its toolkit for balancing consumer protection and financial innovation? Before that, he was a research associate at Harvard Business School, where he worked with Peter Tufano, now Dean of the Saïd Business School at Oxford. Jill Miller is an attorney at Varnum LLP and part of the corporate practice team. She routinely provides clients with updates, information, and analysis on state and federal personal data protection law, and assists with the preparation of privacy policies, which specifically address how personal information is collected, controlled, and deleted; also incredibly important, she counsels clients regarding data breach notification laws and assists with compliance on a wide variety of notification requirements. And she has a unique background in aviation law, which I think is extremely cool. Maybe she remembers all of those great aviation law cases from the first-year contracts curriculum. Christina Tetreault, if I'm pronouncing that right, maybe I'm not. Tetreault, sorry, close. She is an attorney with Consumers Union's Financial Services Program. She's a payments expert, particularly so when it comes to the payments in which so many of us are simultaneously interested and ignorant: virtual currencies like Bitcoin. 
And she has authored several important publications in the electronic payments space. We're delighted to have her, too, as well as, equally so, David Thaw. David is an assistant professor at the University of Pittsburgh with appointments in the School of Law, the schools of computing and information, and public and international affairs. His research ranges across a dizzying range of data-relevant topics, including cybercrime, cybersecurity law, and more. He teaches administrative law, criminal law, criminal procedure, constitutional law, and other things as well. So maybe I'll turn it over to you for your sort of zero to seven minutes of whatever you would like to talk about, including your area of expertise on the topic. And then I'll throw a couple of questions out to the panel, and throughout, please feel free to ask questions as well. So, Sean, please. Thank you so much. I'm glad you're here and we're all energized; I didn't even need this second cup of coffee. And proving that data moves faster than we can keep up, I have two additional items for the CV. I'm currently general counsel of a fintech in San Francisco called Juvo. It's a mobile analytics company working primarily in emerging markets, and we have about 750 million prepaid mobile subscribers on our platform globally. So I feel great empathy for anyone engaged in a GDPR data inventory process; it is as hard as everyone has made it sound. The other thing I do in my spare time is consult with CGAP, the Consultative Group to Assist the Poor, which is a World Bank-affiliated think tank. In that capacity, I work with emerging market regulators on regulatory sandboxes, which were mentioned on the last panel, and I'll have some comments on those. So in my zero to six minutes, I think I have three points, maybe a fourth, all kind of law-ish, since we have law students in the room. The first is that process matters. 
So we're thinking about balancing innovation and consumer protection. And that's ultimately a question of how to trade off the potential benefits of an innovation in the marketplace against the potential harms to consumers, which might be unknown at the time the innovation is introduced. If we're thinking about that problem from a legal perspective, it's really about how to trade off type one and type two errors, if you all have taken administrative law: reducing the possibility that you're preventing good things from coming to the marketplace, while also avoiding bad things entering the marketplace to harm consumers. And on that procedural point, I think the notion of sandboxes, or the type of project that Melissa was talking about, or in general more evidence-based ex ante policymaking, is actually quite worthy of pause and reflection. One, because, as the day suggests, there's just tremendous dimensionality to the topic we're discussing today. We're discussing data in the context of privacy, but the reality is our entire world is being rewired: the way we communicate, the way we do financial transactions, the way we travel, the way we experience the world. And to not have that evidence base in front of a policymaker on a real-time basis is actually an important risk to how we even approach the question of regulation. So I just want to remind us of not just the substance around data, privacy, security, compliance, and so forth, but the process through which we uncover the best answers to those questions. So that's point one, harkening back to your Civ Pro and evidence classes and so forth. The second point I'd like to make is on the question of the harms we think about when we think about consumer protection. 
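The type one and type two framing above can be made concrete with a toy sketch. Everything here is invented for illustration: the product labels and the regulator's decisions are hypothetical, and the point is only to show which cell of the decision matrix each error names.

```python
# Toy illustration of the regulatory trade-off: each product is either
# "good" or "harmful", and the regulator either approves or blocks it.
# A type I error keeps a good innovation off the market; a type II error
# lets a harmful product reach consumers. The data below is invented.
decisions = [
    ("good", "approve"),
    ("good", "block"),       # type I error: innovation blocked
    ("harmful", "block"),
    ("harmful", "approve"),  # type II error: consumers exposed to harm
    ("good", "approve"),
    ("harmful", "block"),
]

type_1 = sum(q == "good" and d == "block" for q, d in decisions)
type_2 = sum(q == "harmful" and d == "approve" for q, d in decisions)

print(f"type I errors (good products blocked): {type_1}")
print(f"type II errors (harmful products approved): {type_2}")
```

Tightening the approval gate drives the second count down and the first count up, which is exactly the balance a sandbox tries to probe empirically.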
And the classic critique of sandboxes, which you heard from Marvin on the last panel, is: if I'm an innovator, I don't want to go to the FDA equivalent to have my product approved before I launch it in the marketplace. And many sandboxes, like the FCA's in the United Kingdom, approach the problem this way, where it's a threshold testing environment for someone with a new product or business model idea to get it blessed or consulted on by the regulator prior to launching it in the marketplace. And some folks view that gatekeeping as worrisome, as a hindrance to innovation. But there's another way to use the sandbox, which is to address not the harm of the product, but the potential harm of the rule. Melissa mentioned this a little bit in her cash flow discussion. So many of our financial regulations and rules are based on assumptions about an analog financial system, and they simply don't work, or they don't have obvious application, in a fully digital environment. Sandboxes can be used to tinker with those questions and to uncover rules that impose unnecessary frictions on a digital economy. For example, in many emerging markets you have know-your-customer rules, KYC rules, that are based on physical identity: presenting a driver's license when you open a bank account. But what happens if you're opening an account where you're not going to a branch but doing your transaction entirely through your phone? How do you verify your identity in this environment? There's a simple fix that happened in Malaysia through a sandbox, where someone could take a photo of their ID, take a selfie, and send them to the bank. The regulator looked at that solution, as simple as it seems, and blessed that type of KYC process for regulatory purposes. Sort of an interesting application of a sandbox. 
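The phone-based KYC flow just described (ID photo plus selfie) might be sketched roughly as follows. This is a hypothetical sketch, not the actual Malaysian rule: the registry, the face-match scorer, and the 0.85 threshold are all invented placeholders.

```python
# Hypothetical sketch of a remote, phone-based KYC check like the Malaysian
# sandbox example above. A real system would call a biometric matching
# service; face_match_score here is a stand-in that returns a fixed score.

def face_match_score(id_photo: bytes, selfie: bytes) -> float:
    """Placeholder for a real face-matching service."""
    return 0.93 if id_photo and selfie else 0.0

def remote_kyc(id_number: str, id_photo: bytes, selfie: bytes,
               registry: dict, threshold: float = 0.85) -> bool:
    """Approve account opening if the ID is known and the selfie matches."""
    if id_number not in registry:  # ID must exist in the (hypothetical) registry
        return False
    return face_match_score(id_photo, selfie) >= threshold

registry = {"A1234567": {"name": "Test Customer"}}
print(remote_kyc("A1234567", b"id.jpg", b"selfie.jpg", registry))
print(remote_kyc("ZZ000000", b"id.jpg", b"selfie.jpg", registry))
```

The sandbox's contribution is not the code, which is trivial, but the regulator's decision that clearing such a check satisfies the KYC rule.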
Third point, and this is just really a point for maybe the cocktail discussion: if we think about consumer protection in this context, I always think back to the fact that the original consumer protection regime was antitrust, right? That was our first take at consumer protection at the federal regulatory level. And I'm not going to have an answer to this, but you have to wonder a little bit about the concentration of data in a few actors in the marketplace. I think if we pulled the threads on many of our conversations today, they would go back to potential harms or concerns about a few large actors holding most of the data about most of the people in the world. And that has a different kind of regulatory complexion to it, one that feels more like an antitrust problem than anything else. But I won't comment on that more here. The fourth point is really a point that I tell myself every month when I pay my student loans from law school, which is that the law school experience still has relevance to things like this. Even though we've been talking about lots of disparate data points today, I think you can make much sense of this discussion through three English law cases and a Benjamin Cardozo case, which you all probably read in the first year of law school. The first is the classic stags-and-sheep line of cases from English property law. In the first case, a stag wanders from property A to property B and is killed on property B. The law says the deer belongs to the property B owner, because this was a wild animal in nature; no one had exercised dominion over it, it was killed on B, so landowner B owns the animal. The second fact pattern is that a sheep wanders from property A to property B, and the property B owner takes the sheep. The case goes the other way. Why? Because owner A had exercised dominion over the sheep. GDPR is basically stags and sheep. If you think about the pre-GDPR world, there were stags running everywhere. 
If you killed it, it was yours; post-GDPR, it's sheep. You can make much sense of the privacy landscape through that simple analogy. The second is less colorful, but it's the notion of attenuation and proximity, and that's a concept through which you can make much sense of the emerging privacy regulation. The classic Hadley v. Baxendale case in the contract setting, and Palsgraf v. Long Island Railroad in the tort setting, both explore the boundaries of liability and responsibility with respect to foreseeability. GDPR and other related privacy regimes have that same flavor: ownership of and responsibility for data is in some sense limited by the foreseeability of the harm or use of that data. And you can ground that concept in the contract-based legal bases for processing in GDPR. So I'll pause there just to say law school is definitely worth it, and first year pays long dividends. Over to you. Please. Hello, I am Jill Miller. I practice in electronic payments, so I represent people who sell credit card services to merchants. And in this whole acquiring side of payments, there arises the privacy issue and the management of data. In electronic payments there is PCI, an organization that is trying to set standards for merchants and other folks handling data in this environment, and for how they can be compliant with these PCI standards. In my experience, though, all the companies that have been breached were still compliant with PCI, so I'm not quite sure about its use and benefit. In terms of the airplane work I do: I've never flown on a private jet, but I have helped purchase many of them. My daughter, when we go to Disney, will get on the plane and say, why aren't we sitting in the front? We always sit in the back of the regular plane, too. So in terms of what I do with privacy, my clients will call me when there's a problem, when something's happened and there's a data breach. 
And at that point, because data breach notification is covered by state law as opposed to federal law, the first analysis begins. Typically it's someone in HR calling me; it could be for a variety of reasons. One thing that happened a lot last year was phishing scams, where employees were receiving emails from someone they thought was their CEO, and they were sending data. And these poor employees: the first thing you have to do as a counselor is say, it's okay, it has happened to many others like you, because these people feel like they have done something so terribly wrong. So you calm down the client first. And then the next step is looking at where the affected people are located, in which states; what the requirements are for each particular state; and whether we have to notify someone besides the individual. Do we have to notify a state regulator? In New Jersey, you have to notify the State Police. The notification letter is now essentially a form letter, and Massachusetts has its own little paragraph that has to go in it. Another thing I do is a lot of privacy policies, and people will ask, can you just draft a privacy policy and put it up on the website? And the answer is no: we really have to understand what data you're collecting and how you're collecting it. So my work is really counseling, helping clients deal with data breach notification, helping them with privacy policies, and helping them understand what data they're collecting, how they're collecting it, and how they're deleting it. And I'm gonna move it on down without the cough. All right, thanks everyone for showing up, staying through the day. I wanna give a special shout-out to the organizers. I've paid thousands of dollars to attend conferences that haven't been nearly as useful as these last couple hours. 
So, and also I just wanna thank whoever's gonna clean up the room where all the food is, because there's been so much food and it's been so great. But I really appreciate the opportunity to be here. And I have a little bit of a disclaimer: although I work for Consumers Union, which has now rebranded as Consumer Reports, and you're probably wondering why the lady who represents an organization that does ratings for dishwashers is here, we have started to build data privacy and data security into our work, and I'm gonna talk about that in just a minute. But today I'm only speaking on my own behalf, so nothing I say here should be attributed to the organization. I wanna talk about the context in which this conversation is happening. It's a time of immense financial precarity for many consumers, and I'll just rattle off a few statistics: four in 10 households can't cover a $400 emergency expense. One in five households has either a zero or negative net worth. And one in three households has zero savings, and that includes nearly one in 10 households with incomes over $100,000. So this is not an issue that's confined to low- and moderate-income consumers. But it is especially a problem if you are a person of color; the racial wealth gap is real. For example, the median African-American household has wealth of $3,600. The median Latino household has wealth of $6,600. The median white household has wealth of $147,000. That is 41 times the household wealth of African-Americans. There's also a gender wealth gap, and there's also a rural-urban wealth gap. I think that context is really important, because even though many of the financial technology products are tiny miracles, and they really, really are, the idea that somehow amassing and crunching a massive amount of consumer data can solve what are essentially structural problems is naive at the least and pernicious at the worst. So what are we gonna talk about? 
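The "41 times" figure above follows directly from the medians quoted in the talk; a quick check, using only those stated numbers:

```python
# Verifying the wealth-gap ratio quoted above from the stated medians.
median_wealth = {
    "white": 147_000,
    "african_american": 3_600,
    "latino": 6_600,
}

ratio = median_wealth["white"] / median_wealth["african_american"]
print(round(ratio))  # 41, matching the figure quoted in the talk
```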
Well, we're gonna talk about financial products and services, and there seems to be this idea that somehow adding the word fintech sprinkles magical fairy dust that takes something out of the category of a typical financial product. We've heard the reference from the earlier panel to old wine in new bottles. And I wanna say that usually the question is not whether the law applies; it may be, in very rare instances, how. I really think there's a lot of demystification to be done, and the idea that somehow lawyers and bureaucrats are not smart enough to understand technology I find personally offensive, because I actually think that many of the technologies are, if the providers are doing their job correctly, and this is why I loved Christie's presentation, actually pretty transparent. So there shouldn't be this total mystery about what's happening and how it's working. Now, there isn't an easy answer for a lot of the big questions that have been presented today, but thankfully so many folks have put forward essential ideas that we can really take and run with, and I don't purport to have the answers. But I do think everything Christie said at the end, challenging all of us to think about how we can be more transparent, more inclusive, and more privacy- and data-security-protective, is essential. I would go on to highlight that I think there is an opportunity for law and regulation, and we'll talk about that. And as was mentioned, I am particularly interested in cryptocurrencies, but I wanna make it clear that I think the idea that there should be an easy pass for some financial products or services, again because they have the fairy dust of technology, is really mistaken. And what I wanna end with is the idea that we are all charged to carry this responsibility forward. 
I know many of you are students, and you're gonna go out and work at amazing big firms, or become consumer advocates, or go to work in-house. And if you can carry forward some of the ideas that have been put forward today, and some of the responsibilities that you have to bring to your clients or to your colleagues, I think we'll all be better off, because fundamentally we are all consumers. That's part of why we need to think about this: it's not just your selfish benefit; you wanna think about the societal impact, because as we can see from some of the scandals that have rocked our country in the last couple of years about the misuse of data, and I think that article from yesterday about what Facebook is doing is just absolutely horrifying, it highlights the mess we can potentially be in if this continues much longer. So I'm the only non-lawyer on this panel, and I'm speaking at a law school, so whatever that means, you can interpret it. For the past five years I was serving as Director Cordray's right-hand person on everything FinTech, so I spent a lot of my time in San Francisco and Silicon Valley, racking up a lot of miles, of course. After I left the Bureau I set up my own little consulting shop, advising early-stage to growth-stage FinTech companies on business strategies, product strategies, and partnerships, and I'm also serving as a senior advisor to McKinsey's global banking practice, helping their large clients understand FinTech and partner with or acquire FinTech companies. So I guess I'm gonna just touch upon two things. Before I do: I got a text from my wife a couple minutes ago, and she said, well, don't be controversial. And I feel like, I'm no longer working for the government, I can finally speak my mind, why would I care? I don't have to put up a disclaimer. This is my own opinion, not my firm's. So that's fine, I guess. 
But still, as Sean knows, he moderated a panel a couple of months ago at the Boston Fed where I inadvertently said a curse word, and that was not received very well. So I'll try to be controversial to the extent possible while being very polite here. I'm gonna basically talk about two things. One is the sandbox as a way to balance innovation and consumer protection. This may sound a bit shocking, especially to people who know me very well: I designed the first sandbox for the CFPB before I left, and I personally have a lot of doubt about sandboxes, because I've seen how sandboxes are being operated in other jurisdictions. And in our own country, a lot of the regulators probably just think about the sandbox as a shiny toy, and they want to grab five minutes of fame. Hey, we have a sandbox and we are very pro-innovation, we are cool, we're sexy. I wrote an op-ed a couple of months ago in American Banker, and my point was that the sandbox actually has its own usefulness. If you think about the two areas where a sandbox can be useful, one is where there's a legitimate regulatory ambiguity or uncertainty, which is something that Melissa's nonprofit is trying to solve. Everybody wants to be fair to their customers, but as to how to be fair, the law is very, very ambiguous. The law doesn't tell you how to comply with fair lending. There's a standard, but the standard is not a line in the sand. If there were a line in the sand, it would be easy, right? If you're above the line, you're fine; if you're under the line, you know you're in trouble. So it would be easy for an organization to track that. Unfortunately, there's no such line, so it's very much up to the enforcer of the law. In that area, I think there's a lot of wiggle room. 
So especially with the development of AI and machine learning and the use of non-traditional data and the proliferation of data, I think as a regulator we need to be, no, they need to be very careful in understanding these new algorithms and how companies are using these new data sets to help expand access to credit and to help improve the predictive power of their models. So I do think there's a legitimate use of a sandbox in that area, but in my experience, and I've been looking at these things for about five or six years now, probably only 10% of the use cases are for legitimate regulatory uncertainty. Most of the potential use of a sandbox is really about helping regulators change their mindset. Professor Barr in the morning talked about the previous Comptroller of the Currency, Tom Curry, and his white paper from, I think, three years ago, the Responsible Innovation white paper. One of the things he addressed was really the culture of the agency, right? He admitted that the OCC had, and I believe still has, a culture issue. As the supervisor of the country's largest banks, it's natural for the agency to be very conservative, because your job is to protect the integrity of the financial system, not to experiment with these new things. At the same time, we all understand there are true financial inclusion issues in this country, right? 23 million small businesses may have trouble getting credit. More than 40 million Americans are thin-file or no-file, and a lot of them are young people, minorities, women. So these are legitimate issues, even for our country, which is one of the richest countries in the world; we still have a lot of people who just don't have access to credit. And we believe technology can play a very important role in helping address that issue. So a sandbox can actually be helpful in that regard as well, mostly in the sense of promoting more collaboration between banks and non-banks. And I think it's really funny. 
A couple years ago, a famous CEO of a fintech company once famously said, kill the banks. A couple of years later, that company wanted to be a bank. And now we have Square Capital, which resubmitted its application to be a bank. And the OCC FinTech charter has still attracted a decent amount of interest, despite the challenges from states and from the CSBS. At the end of the day, I have never met a company that can live outside of the traditional banking system. You still need a bank to originate the loans for you. Think about LendingClub, Prosper, or OnDeck: either they have to get some kind of license, and in most cases they don't, or they rely on a small, little-known bank to help them issue those loans. So you need a bank. And if you want to use Venmo, and probably everybody here uses Venmo, Venmo needs a bank to make those transfers; PayPal too. So you need a bank on the back end. Now, PayPal has been around for more than a decade; it's a very reputable company, so if you're a bank, you're comfortable working with PayPal. But if it's an upstart, there's a lot of risk there, and banks are very reluctant. Sitting in that seat, I so many times had companies coming to me saying, hey, my compliance officer has told me I'm okay, the bank's compliance officer told me I'm okay, we talked to the bank's lawyers and our lawyers, and they think we're okay. What do you think? Do you think we are okay or not? I'd say, well, if they're all saying you're okay, why would you need me to tell you whether you're okay or not? They'd say, well, if you say we're not okay, then the bank is not gonna have a partnership with me. So this is about the third-party risk management guidance. The prudential regulators and the CFPB all have those guidance documents out there. Again, guidance is not law, is not rule; the guidance tends to be very high level, not specific. 
So there's a lot of room for regulators to take a second look at these guidance documents to make sure banks can be more responsive to those partnerships. A lot of banks, especially community banks in today's world, are dying for new technology to help them compete more effectively with the JPMCs of the world, but they cannot do it on their own, so they need partnerships. I think this is an area where a FinTech sandbox can be helpful. The second point I wanna make is really about data access. About three years ago, Director Cordray went to Money 20/20 and delivered a speech on FinTech, and toward the end of the speech, and I helped write that paragraph, he basically said, this is the consumer's data; they have a right to their own data. That really was a bit of shock and awe, if you will. A lot of the people in the room, especially traditional bankers, felt shocked and awed. The point is, we believed, and I still believe today, that this is my own data, so I have the right to share it with anyone I want to share it with, to have a company provide the service that I would like to purchase or use. But again, financial services is a heavily regulated space. I may want to share this information with you, but if I don't know about your data and cyber security standards, and you don't keep my data in a safe place and you're hacked, who do I go to? I go to my bank and say, hey, you screwed up, I want you to make me whole. So I think there's a lot of legitimate concern from the bank's perspective: this information is secure with me; once the information leaves the door, I should not be responsible, but at the end of the day the regulators will probably force me to be responsible, so I might as well not allow the data to leave. 
So the question is what the CFPB or the regulators should do, or the Congress should do, or maybe the states can do, to facilitate a more conducive environment where industry players can really sit at the same table and think about a framework so that liabilities can be allocated appropriately among the different parties. At the end of the day, consumers should be sitting in the center, and they should not be harmed. But it's a tough one. We've seen the big experiment in Europe, in the UK, and when I spoke to them, this was a year or two ago, they told me they have no idea what's gonna happen when there's a big data breach. Everybody's waiting for the shoe to drop, right? Nobody can tell you when and how the data breach is gonna happen, and when it happens, whether there'll be adequate insurance to pay for it, and who is actually going to be punished for the data breach and shoulder the cost. I think that's a big unanswered question. And frankly, I don't think it's really possible for now in this country to have something like PSD2 or open banking, which is pushed by the CMA, the Competition and Markets Authority, in the UK; that kind of regulatory mandate or congressional mandate is not going to happen here. I think some solution may have to be arrived at by industry players working among themselves. But again, we have seen so many individual bilateral agreements, which are not sustainable; think about how many banks we have, right? But that's the reality we're living in today. I'm not giving you any answers; I'm just posing questions. Thank you. Great, David. I was thinking about what I wanted to say for my zero to six minutes, and I wanted to try to pick a couple of topics to keep it closer to zero minutes. I had a little trouble doing that, picking a couple of topics, which is sort of the first point that I want to make to everyone. Look at the title of this panel. 
And look at the topics we're discussing in this conference. They're extremely complex. There's an inherent human tendency to want to simplify things; there's tons of research in psychology on that, if you're curious. That's dangerous. At least as lawyers, but I would argue also those in the room who build things, engineers, scientists, et cetera, and those who think about the other aspects of the larger system, economists, public policy folks, et cetera, we have a responsibility to accept that these problems are complex and to look at that as a positive, as an opportunity, because society is complex. You want to address an issue like how we provide credit to a society that is extraordinarily diverse? That's a really complex question. We can't just look at one thing, focus just on privacy issues, or focus just on a particular credit scoring metric, because as we've seen, and a lot of evidence was presented today, it turns out that doesn't work at all. So complexity matters; we have to think about it, and we have to try not to oversimplify it. Law, by the way, does have a historical tendency of sometimes trying to oversimplify things that should not be oversimplified. The second point: you heard me mention the word privacy, and people have thrown around words like security, data security, data protection, cybersecurity, et cetera. Well, the comment in the introduction about my having taught all these different courses is really more a reflection of my not knowing how to say no to my associate dean, but the comment about the number of different areas of expertise is really much more a reflection of the fact that we have a lot of different words we use here, and they don't mean the same things. Cybercrime is not the same thing as cybersecurity, is not the same thing as data protection, and is not the same thing as privacy. Privacy is about a set of normative choices, policy decisions we make: what do we value, how, and in what way? Data protection attempts to implement those.
Cybercrime is a totally different area. It's about how we use the power of the criminal law to say society is going to condemn this, in an attempt to prevent it, and we'll back that with physical punishment. And cybersecurity is a scientific exercise trying to figure out, okay, given this set of things that are desired, whatever they are, cybersecurity doesn't care what they are, how can we give effect to those desired things? When you mash those all together, you conflate the questions that are being asked, and you run the risk of not making progress forward, not because you're not trying, but because you're talking about five different questions as one singular question. Which goes back to the issue of complexity. The last thing I'll say, to actually try and keep this brief, is, I should have counted, because it would have made this more fun, but I didn't count the number of times today that I've heard the word fair used. Fairness is great, it's a virtue. But I'd be shocked if there are two people in the world who have the same definition of the word fair. And that's where you start to run into problems. And I'm not saying, I really wanna stress this, I'm not saying fairness is a bad idea or that it should not be part of law. It's just that we need to recognize that the word fair does not have anything even close to a universally accepted definition. So it may be that, from a functional standpoint, using the word fair is intended to be a proxy for something else, but that's dangerous, because the people involved in the conversation, just like I said about how we throw around cybercrime, data protection, privacy, et cetera, and kind of mix them together, well, now we're just using one word. We all think it means the same thing, but it turns out we each actually mean something different.
And for anyone who has a little bit of reticence about believing me on that one, I encourage you to walk over to the department of theology at any major institution that has such a department and talk to the theologians there. They will give you an excellent history on this. There's also a great book by Mircea Eliade called Patterns in Comparative Religion. But seriously, this is something that we need to take very seriously as we consider these problems: what do we mean when we are talking about something, and making sure that we're speaking the same language in these conversations. Great, wonderful. So, let's start off by maybe asking questions at 50,000 feet, and then each person can take it to the context they feel most comfortable discussing. So, consumer protection, and I guess I'll propose what we just discussed: consumer protection can mean so many different things. Think about one of the central worries that people have about privacy in the United States today, the privacy of our Facebook data or our Google data. Google knows so much about me. What does it do with that data? What should it do with it? And what does consumer protection mean in that context? It could mean that we should police substantively the data protection policies that Google has. It could mean that the appropriate level of consumer protection is eliminating Google's quasi-monopoly status, using the antitrust laws as a consumer protection device, which Sean mentioned earlier. So what is the appropriate consumer protection regime? Are we anywhere close to it for the issues that you guys care most about? Just pick one. Is it antitrust? Is it disclosure? Is disclosure naive in this context, because it's retail consumers? Is it fiduciary standards? I'd just love for each of you, in the context that you most want to talk about, to say what you think consumer protection should look like today.
And then we can talk about balance after that, because it seems to me that we could talk about false positives and false negatives, but people have very different normative views about what the weights should be in balancing consumer protection and innovation. So I'd love to hear people's thoughts on that too, but let me start off with what consumer protection means. And maybe we can just go to whoever wants to talk first, or go down the panel. So please. Sean, you want to just start us off? Sure. Let's see. So what does consumer protection mean? Right, just pick one piece. Okay, so I think I'd say something about the fiduciary standards piece. I think that's an interesting thing to consider, because one topic that hasn't been discussed today is product design, and kind of how we interact with this, and what obligation an app provider or a financial institution has with respect to the sound design of a product and the way in which choices are presented. That includes the way things like privacy policies are presented as well. But I think that we need to press past the notion of just kind of cold consent and disclosure and think about the power that data provides to, I know FinTech, we'll put quotes around FinTech and note that that's maybe a suspect category. But we can learn a lot more about how consumers interact with our products and services now, and I'm wondering if that creates more of an affirmative obligation around things like financial health by design or financial inclusion by design, sort of the next step in privacy by design, and actually thinking substantively about the choices and the order in which we present them to consumers on screens. I would just say, on consumer protection, when I think about these cases, anytime there's a breach, it seems that I'm kind of numb to breaches anymore. And when it happens to my clients, they just seem, I mean, they're just following the rules.
They're not really concerned about the consumer; it's just, did I follow the state law? Did I send the notification? How long do I have to give them the ability to monitor their credit? And so it just seems to me that there's a lack of concern about consumer protection. It's about how I can easily get this notice out and comply with the laws, and how I prevent my company from being exposed to any financial penalty. So I want to harken back to something from, I believe it was Florin who talked about how, essentially, in some regards the evidence indicates that the disclosure regime is broken. And I know Professor Barr this morning in his opening remarks talked about a potential way forward. I'll bring it back to the idea of taking a consumer focus on this. Consumer Reports did a survey in 2017 where we asked people a series of questions, using our survey folks, and 70% of Americans lacked confidence that their data is secure and kept from the prying eyes of people who shouldn't have access to it. Today, given some of the abuses we've seen, I think that number is probably gonna be 100%. But Pew has done research that shows that there are different bargains that have different value, right? So it comes back to this idea that, when we think about consumer protection, I think we do have to think about this sense that there's a very real fear about unauthorized access to your data, but there is a great willingness to share that data with service providers, depending on what the bargain is, right? So what does that look like in terms of the disclosure model? Well, so much of consumer protection is built on disclosure. And we heard in Melissa's presentation that they're looking at new models, and testing new models, for ways to think about what meaningful disclosure is in the context of new financial products, or financial products that may not be entirely familiar.
The idea of credit extension is familiar, but perhaps cash flow underwriting is not. And so you have this opportunity, I think, for us to rethink some of the ideas of what consumer protection looks like, given that data is now the new oil, right? And there's the data exhaust that we leave, or the data footprint that we leave, right? What does appropriate recognition of that bargain look like, and what are the ways in which we engage consumers, and engage ourselves, in thinking about what that bargain looks like? So like Dan, I don't have a simple answer, but I think that's a really important way to think about how we move forward and what some of those markers should be. And I think consumer testing in all areas has to be a fundamental part of it, because, and I talk about this example a lot, so forgive me if you've heard me say this before, we did some focus groups with folks on an electronic payment product. And we asked the question, have you read the disclosures? Over the course of four different focus groups, only one person said that they had read the disclosures. Then we asked the question, what do you think happens when something goes wrong? And to a one, every single person in the room named the brand that they were interacting with. Had they read the disclosures, they would actually know that 75% of the time that is not who they call if something goes wrong, right? And so that level of disconnect between what consumers think they're getting and what they're actually getting says that we have a whole opportunity to rethink what consumer protection looks like, and we're gonna have to put on our really good thinking caps and take the tremendous ideas that have been put forward today and in other places and put them into action. Although some of us already are, so credit where credit is due. Yeah, so I guess I'm not gonna say anything new here. Maybe I can be a little bit more controversial.
Disclosure is dead, and has been for a long time. The CFPB spent a lot of time on it, and if you look at the consumer protection rules the CFPB is in charge of, somebody might estimate probably 40 to 50% of the rules are about disclosures. So if you think disclosure is dead, does that mean 40 to 50% of consumer protection rules are just not useful at all? Then why bother even enforcing these rules in the first place? But if you look at the enforcement actions in the past seven years under the CFPB, there were few enforcement actions about disclosure violations. Because, frankly, enforcement lawyers will say we don't care about disclosure violations; literally, a technical violation gets a slap on the wrist, nobody cares, and everybody's looking at the bigger-dollar items, like UDAAP, that sort of thing. But in this new world where data is the new gold, the new oil, I think disclosure is also very important, because I do wanna know how my data is being used, even though I think I'll be really bored if you tell me exactly how my data is being used, because I have no freaking idea how my data is being used, and even if you tell me, I'll still have no freaking clue. So that's really a sort of Catch-22. I know in the data access, data aggregation space, a lot of companies, as you saw in the earlier days, a couple of years ago, not that long ago, came up with this idea of a dashboard. Banks think they should be in charge, that they should design this dashboard so that their customers know who actually has access to their data and can easily turn that data access on or off. And some want to go even one step further: they want to actually present to the customers exactly what data fields are being used by the third-party service providers. And I'm not sure anyone here would like to know what data fields are being used.
I'm sure there will be hundreds, even thousands, of data fields. Are you really gonna uncheck all those boxes? I don't think so. So I think there's a balance around how much information is useful, is really informative, for consumers to make the right decision. On the question of what consumer protection is, I just have a very simple proposal here. We know the current status quo is not good enough. There's the issue of people being underserved, excluded from the mainstream electronic payment system, outside of the mainstream lending market, getting unfair deals when it comes to lending. So if we can figure out a way to incrementally improve that situation, not entirely fix it, I think that's gonna be a win for everybody. It should be a win for consumers, a win for regulators and policymakers, and a win for the industry, because there will be more money to be made, people will have more access to better products, and for policymakers, it's a change for the better of our society. But we have to do it very carefully, because there's always risk. Financial services is a risk-taking business. So how do you balance the risk and the improvement? On this matter, I'm always siding with those who encourage policymakers to have a more open mind, to allow industries to innovate, and maybe a sandbox, again, could be a framework where you are doing something good under a pair of watchful eyes, so you're not really gonna screw up, and if you do screw up, not in a big way. But I think we do need to allow technologies to further develop. I think the new things like blockchain or AI are really, really good fundamentally. I mean, I don't know much about blockchain, but I think AI is really something that can potentially fundamentally change our society, for better or for worse.
And if we don't allow this to happen, and I remember there was a Financial Times article from last week, or maybe earlier this week, really talking about the dark aspect of AI, how algorithms can be used to discriminate against consumers. I think that's definitely one prospect, and there's also the brighter side of AI. But the question is, as policymakers, if you don't engage, if policymakers are not engaging themselves in understanding this new technology, then you will be forever in the dark. We live in really interesting times right now. I don't know if that's a good thing or a bad thing; there is an old Chinese curse that comes to mind. When I was a kid, I was a huge Star Trek fan, and I grew up near the University of Connecticut. There was an article in the Hartford Courant by a University of Connecticut professor, I think a physics professor, talking about which technologies from Star Trek, and this is probably the late 1980s, early 1990s, were complete fiction and would never happen, and which of those might happen some number of decades or centuries out. He got it, I think it was a he, completely wrong. Something like 75% of the technologies that existed across all of the Star Trek series that were around at the time are now here. In fact, some are actually more advanced than what you see portrayed in the hypothetical 24th century. Why am I saying that? If we wanna figure out what consumer protection means, we probably need to start by asking a set of questions that law professors would probably say go to normative priors first. Because even something as high-level or abstract as equal bargaining power is probably not an adequately general question to cover all of what consumer protection is anymore. And I'll just use one, I hope, straightforward example to attack that.
There are other market effects going on which can distort the market in ways where, even if we equalize bargaining power between a consumer and a provider, market distortions will make it such that not only do they not know that they're not getting what they're bargaining for, but even if they did know, it's not clear they could correct for it. That's a bold claim. If we had more time, I'd draw a proof on the board, or something like it, of how effects in the way that machine learning builds on top of itself, et cetera, can bring you to a point like that. I ask you just to trust me on that for this purpose. Well, where does that take us? I think that takes us to this: if we're gonna answer the question of what consumer protection means, we need to have a serious policy discussion where we lay out an actual set of policy goals and say, here are the things we want to get to. Because if you get together any group of computer scientists and you ask them what is fair, or what is good consumer protection, if you take 100 computer scientists, you're gonna get no less than 500 different answers. Really. So we need to figure out, and this is hard, this is not something humans are very good at, but we need to figure out what our goals are first, and we need to make decisions about those, in ways that, in particular, the United States Congress does not like to do. Congress loves to kick things to agencies. I talk about this in my administrative law class every semester. Because it avoids making a hard political decision they can be held accountable for in the next election. Well, the problem is that there are a lot of good reasons, you can take my admin law class, I'll explain why, that we don't want agencies making all kinds of policy decisions. I'm probably the last person around who thinks non-delegation should be a very, very serious thing. That's not true, that's me being snarky. But we need our policy-making bodies to do their job.
That's not just a political statement, though I'm sure that's also true and there are lots of people who believe that. We need them to do their job because we're actually reaching a point where the consumer regulators, whether it's financial or healthcare or whatever the case is, can't, in a literal sense, because the directive that they are given, the policy directive, no longer has functional or descriptive meaning in a world as complex as this one is becoming. So what do we do there? Well, we need to demand that someone with the proper authority come and say, okay, here's what consumer protection means. Equal bargaining power is great, but we even need to define what that means. Maybe the answer is that consumers have certain specific rights, the EU has tried to do this to some extent, to be able to know certain things. Well, maybe not; we've talked about how disclosure doesn't really get us very far. It does actually have effects, lots of effects in cybersecurity, I could talk all day about that, most of which relate to driving encryption. But to make any regulatory actor, whether it is an actual state-empowered, force-of-law regulator, whether it is an industry self-regulatory body, whether it's an individual actor in the market responding to its perceptions of market demand, to enable them to respond in those ways, there needs to be greater clarity about what we want in this regard. I don't have an answer for that. That's also the fun part about being a law professor: I don't have to actually give you an answer on the policy goals right now. If I keep my law professor hat on, I can probably get away with that; if I put my computer science professor hat on, that might go away. Just nobody tell my deans, so that my promotion still goes through this year. But seriously, this is something that we need to have a serious debate about.
And I think if I were asked what consumer protection means, the first thing I would say is, find a mechanism to force the darn Congress to actually vote and specify what they mean, precisely. And now I'll try to actually be a little bit helpful. This may be a place, for the law students in the room, where the proverbial federalism experiment can be helpful. It didn't work with state data breach notification laws, because they all just cut and pasted California, except for Massachusetts, New York, a little bit of New Jersey, and I think Virginia was the other one that made some modifications, back when I was doing this full time for a living. But there are some opportunities there to see what has what effects. We're not gonna see what works; what works is not the metric, but we'll see the effects of certain types of laws at the state level. That might at least give us some insight, so that when the federal Congress takes this up more seriously, they're not just going from a blank slate with a whole bunch of people on panels, disclosure: I've been on these panels, saying, oh, I think you should do X, but here's the tiny amount of data I've got to support it, because this is a really complex problem. That's about the extent of what I've got for you on suggestions, but again, I'm a law professor now for a job, so I get to stray away from that a little. So I wanna leave time for people to ask questions. So maybe I'll just ask, we already have some, I have some more targeted questions, but maybe I'll turn it over already. Please, go ahead. This is right up my bailiwick. So, just to give you a sense, I've actually done this using NLP for all the financial regulation laws in the US and in the EU.
And so that is actually one way to start, if you wanna understand context: both the entity, what entity has the regulatory authority, and also what the buckets of actions are that they take, meaning the topical areas, so you can see changes in both entities and topics over time. Those are the ways in which you can actually say what consumer financial protection, quote unquote, means over time, because it's not the same. And you wouldn't wanna put a definition on it that was too precise, and you do wanna give agencies discretion, because it is a changing thing. What consumer protection meant back in the 1970s, when you had a very brick-and-mortar type of system, is not the same as what it means today, when so much of the data is online. So that is one thing, and there is a strategic dynamic in which one just delegates discretion. Delegating powers gives you both the model and all of the data associated with it, over time and across issue areas, and the new work using NLP and machine learning gives you that picture for financial regulation in particular. So I suggest you look at it. Please. So when you say NLP, I'm assuming you're referring to natural language processing. Okay, I just wanna make sure. Don't even get me started on the overloading, as it's called in computer science, of three-letter acronyms when you work in both law and computer science. I do agree there are a lot of people working in legal analytics. I have a colleague on my faculty at Pitt, also a lawyer and a computer scientist, who does some interesting work with this. And I agree with you, there's a lot we can learn from that, and it's a great tool. I completely disagree with you on the delegation part. Absolutely, completely disagree. As a sort of functionalist matter, does it get the job done? Sure, but it's incredibly dangerous.
It's wonderful to have things delegated when you agree with the administration that is in power. So for example, and I'm on quasi-public record on this, I was in very substantial agreement with President Obama's Department of Defense and Department of Homeland Security on a number of cybersecurity-related issues. I did a lot of work with them, and I thought there was a lot of great progress being made in that area. At the same time, when I was telling my administrative law students, here's what we're gonna cover a week from now instead of two days from now, because I'm gonna be gone, I'm gonna be on an airplane on Tuesday, I would explain to them: just because I agree with how President Obama instructed his Secretary of Defense at the time does not mean that I'm going to agree with the next Secretary of Defense. And that's where the danger comes in: designing a structural system. Well, maybe we can continue this conversation after the panel. So let me ask, maybe Christina and Sean can speak to this question. How much of this debate is people who care about privacy trying to tell people who don't that they should care about it? Revealed preferences seem to suggest that a lot of Americans, American youth, just don't care about how their data is used. So if we think about the costs of consumer protection, it may be just a generation gap, where those of us who grew up civil libertarians, thinking it was really important to care about our data, are speaking to a generation that's just like, let it go, and no longer has the value of privacy that we think they ought to have. So I wanted either or both of you, or anyone else, to say: what is your view on that? Is it just that people don't understand how much their data is used without their permission? Just thoughts. Sure, so I'll kick it off. I actually don't think that's right; sorry, there is research that sends a different message about whether or not the youth care about data. I think it's an open question.
And that's why I brought up the Pew study that talked about different bargains having different value. That's one. Two, I think, is that right now there's no Schumer box for your data. So do you think you can really understand what it means to log in with Facebook to your Glow app that's about your fertility, now that you know all of that's gonna be shared entirely across the internet? And even if you don't have a Facebook account, Facebook is gonna end up with that information. So I don't actually know that we know what people think about data privacy, exactly. I do think that there is a real difference between data security and data privacy that we haven't really explored today, and that I just conflated with that last example. Is my data secure from people who shouldn't have access to it? That's how I think of data security. And Consumer Reports and other groups are doing this open-source effort to stand up what's called the Digital Standard, a way of evaluating connected products for data privacy, data security, and other important consumer values. And I think the distinction is actually more clear if you look at the elements there. So I'm being a little bit reductive right now in talking about these differences, but I really would question the idea that kids today don't care, because you come back to the idea that everyone still has curtains, right? So you may put up your most salacious pictures on social media, but when you go home, you shut the door and you pull the curtains when you're gonna do whatever you're gonna do. So yeah, I would challenge that idea. Yeah, I think I'd sort of agree with that on the top line, and just also say, and I'm gonna go simple instead of complex, so I'll give it that disclaimer, that I think the most critical legal question to solve is the definition of the ownership right with respect to personal data.
A bunch of this other stuff sort of falls away when you say that here is the corpus of data that's owned by the person, and at some point there's a barrier where it becomes industrial data because it's so derived, or it's the sheep example: it becomes someone else's sheep at some point. But as soon as that ownership status is clearly defined and agreed upon, and it will be defined through litigation and enforcement actions and everything else, then you start to have market forces acting on that data, not just B2B market forces but B2C market forces. And I think a lot of these questions fall out from that. We talked about California with respect to the CCPA; our newly elected governor just recommended, or proposed with zero details, a digital dividend, right, which kind of gets at this market concept for data subject ownership. And I think that ownership piece will ultimately have more traction than the privacy piece, broadly, as we have these questions shake out. So Jill, maybe we can hear some more from you. Say they've given you the hat that allows you to control the legal system, and you're designing the ideal data privacy or data breach notification law. What are your biggest thoughts on designing these policies? Where would you like to see the law move on the issues that you work on? You know, I think a federal standard would be great. And again, I think the notification needs to change; right now it's sort of like the check-the-box disclosure you get with all your credit cards about how they're gonna share your information, which you just throw in the garbage. The data breach notification, again, I don't think is user friendly. I don't know that a lot of people take advantage of the requirements in there, of the ability to have monitoring on your credit. So from an exposure standpoint, from a client's perspective, I think having one standard and a consumer-friendly notification would be a great step in a data breach notification law. Wonderful.
Well, we only have a couple of minutes left, and we're gonna get sort of back on timing, so let's open it up for questions. And I see two, so please, and then Ali. So I guess this is sort of for David, but I'd love to hear everyone else's thoughts. I thought it was really interesting what you said about fairness and how fairness means something so different to so many different people. When you think about consumer protection, the agencies we think of as protectors of consumers are the FTC and then the state AGs, right? Which enforce their unfair and deceptive acts and practices standards when there are privacy violations, when there are security violations, et cetera. It's interesting, because for many, many years the FTC's case law precedent, and the state AG level, has focused mostly on the deceptive practices piece. So if you say something in your privacy policy and then you don't do it, you could get hit with a claim. You make promises about security and then you have a data breach, you're facing a claim. And in the last couple of years we have seen a trend toward looking at the unfairness prong and trying to figure out what that is. And I just found what you said to be particularly interesting, because with unfairness, or fairness, being such a nebulous standard, I just wanted to echo what you're saying: if we are moving toward that, it's probably important that we figure out, sooner rather than later, something else to replace it, because it's almost like we're moving in that direction, and I can understand the harms that you raised. Yeah, the very quick thing I would say about that, basically, is that deception is an example of a more clearly articulated policy goal. Unfairness is not, and I'm in that camp of probably, maybe, three law professors in the country who think that the FTC's unfairness jurisprudence is not a way to do cybersecurity, for that reason. Wonderful. Ali, please.
Yeah, so there's been some discussion about different regulatory models, like the sandboxing, which, you know, there's kind of an interesting critique offered. But I'm wondering about the role of disruption, and looking at what kind of regulatory models will be important moving forward, like examining the relationship between customers and local or smaller consumer banks, medium-sized or smaller banks operating on a regional basis. You know, I feel like the relationships are kind of changing in ways that consumers don't expect. For example, these smaller banks are maybe starting to partner with FinTech operations to develop more streamlined services, but the focus is on the FinTech and not on the consumer bank, and on the relationship between them and their customer data, and perhaps how they're going to use that data to develop these FinTech relationships as they look to be more competitive moving forward. So my question is whether there is a model to address that disruption, and if so, what does it look like? So, guys, I'll take the question. So you're right. I think a lot of community banks are really challenged, really fighting for their own survival, right? Think about the consolidation since the 2008 crisis. I think smaller banks, community banks, credit unions, many of them, I believe, are actually on the verge of extinction. They just cannot compete. They don't have the capital, they don't have the know-how, they don't have the budget to spend on their own IT infrastructure. So they have to work with a service provider. I was just on a call with the CEO and president of a small community bank in, I don't want to mention which state.
Anyway, so he basically told me, listen, I don't know how to do my job anymore, because I really want to survive, and I only have 30,000 or 40,000 customer accounts, and my product is no different from others, and I just cannot provide a competitive offering to my customers. So where should I go? What should we do? So naturally, think about FinTech. Now, I wouldn't call this kind of thing a disruption. I think this is just a gradual evolution, if you will: you have to find a more competitive model in order to survive. Maybe we don't need that many banks in this country; I don't know what the answer is. You look at the UK, the EU, Australia: there are not that many banks, and I think the reason there are so many regulatory movements in those countries is actually because they want more banks to challenge the domination of the big ones. And here, even though we have thousands of banks, if you pick the top 10 banks, they probably account for more than 80 or 90% of the deposit market share. So what's the point of having 2,000 banks when 1,900 of them just don't even matter, right? So I'm not sure if there's a right regulatory model to help address the issue. One thing I think about, and we just talked about sandboxes: really, I think you have to figure out a way to make those kinds of partnerships more feasible, because community banks on their own don't have the ability to really vet these solutions. So some of them actually are teaming up, right? They form alliances. There's an organization in Chicago called FinTech Forge, where 14 community banks have teamed up and are trying to rely on the organization to help them vet FinTech solutions, and there's BancAlliance, which is based in Chevy Chase, Maryland.
They have, I think, over 200 community banks trying to work together. But as a regulator, I think the challenge is really, and this is probably not the CFPB, this is more the FDIC or the Fed, because they oversee a lot of the small banks, that they really need to step up and figure out a way to encourage more collaboration between banks and FinTechs. Now, when it comes to the solution you talk about here, it is true that FinTechs tend to get all the credit, all the glory, right? Think about it: how many of you have heard of the bank called WebBank? Raise your hand. No, you guys don't count. No, none of you. How many of you have heard of Lending Club? Okay, all Lending Club loans are originated through WebBank, just so you know. Prosper, same thing. How many of you have heard of this bank called Cross River Bank? Very few, right? Cross River Bank is another very active community bank, if you will, that's heavily engaged in the FinTech space. Just Google them; they got big investments from Andreessen Horowitz and KKR, especially KKR most recently, I think a $50 or $60 million investment, so they are a different kind of bank. And these banks, you've never heard of them. They're sitting behind the scenes, and they're just becoming a utility. And some of the banks are fine with it: we are okay with being a pipe, right? It all depends on what your perspective is. If you still want to be the one to serve your customers, you have to figure out how to do it. And if you cannot be those guys in this game, then are you satisfied just being an enabler? Some of the banks are okay with it, and they are thriving as well. Well, we maybe have time for one lightning question and one answer, but I'm not sure if there are any left. Well, I wish we could hear more from our wonderful panelists, five fascinating people with different perspectives. But otherwise, let's just say thank you very much, and thank you.