Good morning. My name is Sophie Baum. I'm the editor-in-chief of the Michigan Technology Law Review. Before I introduce our first speaker, Dean Barr, I want to say a few words and acknowledge the many people and organizations that helped make the symposium possible. So my first thank you is to all of you. Thanks for coming. Thanks for showing up. I felt sometimes like I was planning a kid's birthday party and like no one was gonna come, but you guys came. So that's great. As a reward, we have lots of food for you today. We have Zingerman's breakfast down the hall, so feel free to go and grab some now before Dean Barr starts. We've got lunch coming from Jerusalem Garden, we've got Washtenaw Dairy, so make sure to stick around all day for that. I also want to thank our sponsors, of course, the University of Michigan Ford School of Public Policy and the Center on Finance Law and Policy. Christie Baer was a huge help with recruiting speakers and creating promotional materials for the symposium, so we're very grateful to her. Our networking snack breaks are generously sponsored by the University of Michigan Department of Information Technology Services, the Information Assurance Team, and Dissonance, a university-wide group that brings together technology researchers, scholars, and students for discussions and campus-wide events. Sol Bermann, the University of Michigan's Chief Privacy Officer and Interim Chief Information Security Officer, kindly worked to secure this donation, so we appreciate it. Thanks especially to our biggest sponsor, the University of Michigan Law School, for reviewing and selecting our symposium proposal. We're so fortunate to have had the law school's support and guidance throughout the entire process. Thanks especially to Dean Marti for his leadership and to Jenny Rickard, who, if you don't know her, is like a logistics rock star, wizard, queen, all of it. She's great. 
And finally, thank you so much to the members of the Michigan Technology Law Review for your enthusiasm in putting this day together. Andre Rulliard, standing right there, has been incredible throughout the process. He was involved in everything from conceptualizing panels to recruiting speakers, making dinner reservations, printing flyers, ordering catering. You get the idea, the whole deal. So he and the whole journal are really at the heart of today's symposium, and we couldn't have done it without all of you, so please enjoy. Thanks. Now, I know you'd all like to get the day started, so I won't talk for too much longer, but I did want to let you know that today is not just a symposium on data privacy and portability in financial technology. That is still a mouthful after all these months. It's also a celebration of our journal's 25th anniversary, and this may not sound like a long time compared to some law reviews, but for a tech journal, it's a pretty big deal. Some of our current members were not even alive when we got off the ground, so I'm thrilled that some of our alumni are here to celebrate with us too. One of them recently sent us a photo from 1996 with a note explaining that the editors shown in the photo were in our first office, which was the IT guys' closet in the library basement. The room had a shower in it, because apparently it was originally a makeshift dorm room for the women law students back in the day. So technically our office is still in the basement, but needless to say, we've come a long way. So to celebrate that, I want to remind everyone that there will be a happy hour to mark our anniversary after the symposium. We'll be walking over to Raven's Club downtown after we wrap up here. I'm under strict instructions to say that the event is not technically connected with the symposium, and we are not using any symposium money to fund it. So, check on compliance. And without further ado, I'd like to introduce our first speaker, Dean Barr. 
His reputation precedes him. Michael Barr is the Joan and Sanford Weill Dean of Public Policy at the University of Michigan Gerald R. Ford School of Public Policy and serves as the faculty director of the Center on Finance Law and Policy. He's a professor of public policy at the Ford School and the Roy F. and Jean Humphrey Proffitt Professor of Law at the University of Michigan Law School. Dean Barr is a senior fellow at the Center for American Progress, and previously was at the Brookings Institution. He served under President Obama as the U.S. Department of the Treasury's Assistant Secretary for Financial Institutions and was a key architect of the Dodd-Frank Wall Street Reform and Consumer Protection Act of 2010. In the Clinton administration, Dean Barr served as Special Advisor to the President, Deputy Assistant Secretary of the Treasury for Community Development Policy, Special Assistant to the Treasury Secretary, and Special Advisor and Counselor on the Policy Planning Staff at the State Department. So again, please join me in welcoming Dean Barr. Thank you very much, Sophie. And congratulations to you and to the other members of the Michigan Technology Law Review. It's really a great honor to be here and to see so many friends, and I look forward to hearing a little bit more from you today. I want to apologize in advance that I can't stay much after my talk this morning, but I look forward to hearing about the results of the day's events. And I guess I'll miss the completely separate happy hour later tonight. What I thought I would do is frame up the discussion for the conference today by talking a little bit about why data ownership matters. And in a big picture sense, I think the central questions we're talking about today involve empowering consumers to have more control over their financial lives. Figuring out ways that we can really make a difference in people's lives. 
So sometimes we get caught up in the technical questions about this or that privacy provision, this or that liability rule. But all of those, I think, ought to be in service of having a market that is competitive and vibrant and innovative and that empowers consumers to make better financial decisions, better choices, and choices that enhance their personal lives. So what does having better control over your financial data mean? In part, it means knowing what's going on in your life. Having a sense that instead of things happening to you, you understand the information around you and you're able to act on that information. It could mean budgeting more effectively. Having access to information can help individuals get better control over their lives. For many, many households, there's enormous uncertainty and variability in their daily financial lives. The flow of income that they get goes up and down quite a lot. For many households, their expenses are also really variable. They can get hit with a car bill or an unexpected illness that's not covered by insurance and that throws them off for a long time. So having a better ability to manage both cash flows in and cash flows out and potentially to be able to save a bit for those kinds of, you might think of them as fully anticipated emergencies since they happen so often, but to have a little cushion, a little emergency cushion is really important. There are all kinds of positive spillovers from having that ability to budget better and potentially to save. One of them, for example, is overdrafting on your bank account less often. So in any given year, overdraft costs consumers between $32 and $34 billion. It's a huge drain on the finances of people who can ill afford that kind of a cost or fee. So if we can give people better access to information, better ownership of their own information and tools to manage that, we're going to protect them from a lot of potential harm. 
Lastly, I think, or not lastly, but another important example of this, I think, is that having better control of one's financial data will make it easier to switch banks. Or as I put it in an op-ed a couple of years ago, it ought to be easier to dump your bank. It ought to be easier to switch from one bank account to another. Now, why does that matter? It matters because the stickiness of our relationship with our bank makes it easier for banks to impose all kinds of costs on us that a fully competitive market, where it was easy to switch, would reduce. If it were easier to move from one bank to another, it would be harder for banks to impose gotcha fees, to impose insufficient funds fees, to impose high overdraft fees, or just to have bad customer service. So a fully portable system, a system where we can easily take all of our data from our bank and move it to another bank, would be a much more competitive system and a system that would reduce fees, increase competition, and I think increase innovation in the market. So improved legal rules about data ownership, improved access to the information, improved ability to port your information from one financial institution to another can, I think, dramatically increase competition and innovation in the industry and reduce fees and costs on consumers. Now, there are all kinds of issues with developing new rules that would enable better consumer ownership. One of those key areas is privacy. A central concern for many is: how do I know that the data I've given away is being protected in some way? Protected from a security standpoint, which I'll get to in a second, but for now I mean protected from uses that I didn't anticipate. And in the current world we live in, the answer is: you can't. Much of the data that you provide to others is used for all kinds of purposes beyond whatever was in your head when you gave up that information. 
If you had something in your head about why you were giving up that information, it's widely used for other purposes anyway, and consumers routinely either consent to things they don't fully understand or are not required, in many instances, to consent to the use of that personal financial information. So I think that's an important area that needs to get fixed if we're going to have better portability of financial data. In addition to not knowing whether the information you provide is being used for the use you intended, the data you provide can also be used by others in unintended ways, even by those you've given it to. So the information can be used by thieves, of course, which we're worried about, but information might also flow to those whose behavior is not ideal in the marketplace: abusive debt collectors or predatory lenders who are taking advantage of that access to data to provide services that can be extremely harmful to consumers. So we're worried about it from that perspective as well. Data ownership also raises really important issues of access and equity. So unsophisticated consumers are more at risk. Low and moderate income consumers are more at risk. Minority communities are more at risk. Women are more at risk than men of having these abuses take place in the marketplace. And I think that raises important issues of equity. And beyond that, many uses of data end up reinforcing problems that already exist in the marketplace. So in many instances, data can unintentionally reinforce problems of lending discrimination. So we have this wonderful new system of thinking about how to use big data to expand the pool of people that are lent to. But big data can also end up reinforcing discrimination in markets. 
So if you feed into your big data machine a set of information about historical connections, even if the machine is told not to discriminate on the basis of race, the algorithm can end up replicating and reinforcing problems in lending discrimination. So access and equity issues are also extremely important in thinking about who owns data and what it is actually used for. I mentioned that in addition to privacy, we're worried about security. So in addition to worrying about what intended uses are being made of the data, we're also worried about whether the institutions that we're providing data to are themselves secure and can protect our data from illicit use. And there is a wide variety of capacity to protect data in the marketplace today, from institutions who are quite sophisticated and good at it to institutions that are struggling with it. But even institutions that are quite good at it have found themselves subject to significant data breaches, data breaches that expose our information to illicit use. And that, I think, is a problem that's not going to go away. It's getting worse. The attacks on the financial sector are becoming much more sophisticated and harder to protect against over time. There's also a differential ability among financial institutions to back up their commitment to protect security. So if you're a startup and you happen to have really good security mechanisms, you're really super committed to it, but a data breach nonetheless happens, there's less there to insulate consumers to protect consumers if something goes wrong. If you're a big institution, you have more capital, you can at least in theory recompense individuals who are harmed. I mentioned at the outset that it's often the case that consumers don't really fully understand what they're consenting to, or in some instances are not required to consent at all for the sharing of personal financial information. 
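The proxy problem described here can be sketched in a few lines of Python. Everything below is hypothetical (made-up ZIP codes and records, a deliberately toy "model"), but it illustrates the mechanism: a rule that never sees race can still reproduce a racial disparity when it keys on a variable that correlates with race in the historical lending data.

```python
from collections import Counter

# Hypothetical historical lending records. Race is recorded here only so we
# can audit the outcome afterward; the "model" below never sees it.
history = [
    # (zip_code, race, approved)
    ("48201", "black", False), ("48201", "black", False), ("48201", "black", True),
    ("48201", "white", False),
    ("48304", "white", True), ("48304", "white", True), ("48304", "white", False),
    ("48304", "black", True),
]

# "Train" a race-blind rule: approve if the historical approval rate in the
# applicant's ZIP code was at least 50%.
approvals = Counter()
totals = Counter()
for zip_code, _race, approved in history:
    totals[zip_code] += 1
    approvals[zip_code] += approved

def race_blind_model(zip_code: str) -> bool:
    return approvals[zip_code] / totals[zip_code] >= 0.5

# Audit: the rule never looks at race, but because ZIP code correlates with
# race in the historical data, approval rates still diverge by race.
by_race = {"black": [], "white": []}
for zip_code, race, _approved in history:
    by_race[race].append(race_blind_model(zip_code))

for race, decisions in by_race.items():
    print(race, sum(decisions) / len(decisions))
```

In this toy data the race-blind rule approves white applicants three times as often as black applicants, which is exactly the kind of replicated disparity a fair-lending audit would need to catch.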
And this is an issue that countries around the world struggle with, but I think we have to figure out some method for developing more meaningful consent, actual consent that people could at least in theory understand, although I'm going to give you a caveat in just a moment. So what are some of the kinds of elements that you might think about in thinking about consent? One of them, perhaps the most obvious, is comprehension. So can we develop ways of articulating what it is that consumers are consenting to in their forms? Is there a way of breaking down the consent in such a way that consumers might reasonably think that they understand what they're actually consenting to? But a second, I think, important consideration is reasonableness. That is, I don't think that we ought to have a situation where financial institutions can seek consent for something that is unreasonable. That is, it may be that consent is a good tool to help us choose between things that reasonable people might disagree about. They would want that or they wouldn't want this. But we ought to have some substantive limits on the ability of financial institutions to use data in ways that reasonable people would not expect as a use of that data. And I think that both a procedural test with respect to comprehension and a reasonableness test ought to be part of the picture. I think there are also two other techniques that are really worth exploring as we think about ways to make consent more meaningful. One of those is time-limited consent and the other is use-limited consent. So with time-limited consent, I would say: permit a financial institution to use my data for the next two days to decide whether or not to make me a loan. And the limits there are both time and use. So the time limit is that the firm can only use my data for two days, and the use limit is that it's for the purpose of deciding whether to make me a loan. 
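As a rough illustration of how time- and use-limited consent might be represented in software, a grant could carry both an expiry and a set of permitted uses, with every access checked against both. This is a minimal sketch under assumed names (the `ConsentGrant` class and the `"loan_underwriting"` use label are hypothetical, not any real standard):

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class ConsentGrant:
    """A consumer's grant of access to financial data, limited by time and use."""
    granted_at: datetime
    duration: timedelta          # time limit: how long the grant stays valid
    permitted_uses: frozenset    # use limit: e.g. {"loan_underwriting"}

    def allows(self, use: str, at: datetime) -> bool:
        within_time = at <= self.granted_at + self.duration
        within_use = use in self.permitted_uses
        return within_time and within_use

# Example: "use my data for the next two days to decide whether to make me a loan"
now = datetime(2019, 3, 15, tzinfo=timezone.utc)
grant = ConsentGrant(
    granted_at=now,
    duration=timedelta(days=2),
    permitted_uses=frozenset({"loan_underwriting"}),
)

print(grant.allows("loan_underwriting", now + timedelta(days=1)))  # in time, in use
print(grant.allows("marketing", now + timedelta(days=1)))          # use not permitted
print(grant.allows("loan_underwriting", now + timedelta(days=3)))  # time expired
```

The point of the sketch is that both checks are conjunctive: an access outside either the time window or the permitted-use set is denied, which is exactly the two-part limit described above.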
So time and use limitations might give us better control over why we're giving our financial information away. It can only be used for a certain time and for a certain purpose. I think that's an important potential tool. And lastly, there are lots of situations in which, essentially, consumers need to opt out if they don't want to have their information shared. And I think our basic model ought always to be an opt-in model. That is, if the financial institution or the provider would like to use your information, you need to affirmatively say that that is a permissible use of that information. And I think all these techniques would potentially make consent more robust. What are some of the key barriers to making progress in this space of data portability and data ownership? One major concern has been that in many instances banks are really reluctant to share the information, either with the consumer or with third parties. And you can see this from the bank's perspective. A bank is unlikely to want to share information with a third party provider for really two central reasons. The first central reason is that providing that information to a third party means that that third party might get the customer instead of the bank. So the bank may be giving up revenue by doing so. And the second is that sharing with that third party may open up the bank to concerns about security or privacy violations that will end up with liability at the bank itself. And those two reasons have really slowed data portability in the United States. There's been some progress in this regard with data sharing agreements that are being built one by one between third party providers and individual financial institutions. But there's no unified framework in the United States for resolving these questions, and that's inhibited the flow of data and increases in data portability. A second major concern with respect to the sharing of data has to do with problems in liability allocation. 
So the rules in the United States are quite unclear about how risk is allocated between banks and third party providers in the event of a data breach. And it is often the case that the sort of facts-and-circumstances tests used to decide liability are very difficult to work through. And that lack of clarity on the liability allocation inhibits the sharing of that information. And another key barrier, I think, is that there just isn't enough consumer voice at the table in conversations about how to develop the rules for data portability and how to share information between banks and third party providers. Even in the conversations that have happened, the ones that are advancing the ball between third party providers and banks, there's insufficient attention to having consumer voice at the table in those decisions. Consumer groups, community groups, state AGs, and others might be able to provide a perspective that is a consumer-focused, human-centered approach, not one focused just on the question of how the financial sector gets together on this. And if you think about these kinds of barriers, another significant issue is the wide variation in the ability of different kinds of financial sector firms to adapt to technology changes. So one of the things that has slowed the adoption of more efficient rules in the marketplace with respect to technology is that small banks and small credit unions are not well positioned currently to take advantage of those advances in technology. And because they're not well positioned to take advantage of them, the small banks and credit unions oppose efforts to develop these broad-based rules. So there's both a technical problem, with small banks and credit unions not being able to access the technology, and a political problem, in that it slows the adoption of more efficient rules. For example, on good funds availability. 
We still have today, in 2019, banks and thrifts and credit unions that can sit on your funds and earn the float from them and not provide instant access to funds, even though technologically we have the capacity to provide that access today in the United States. And a primary reason that good funds availability rules permit that kind of flexibility is this problem of differential access to technology among big and small firms in the United States. So it's essentially both a political and a technological problem. As you're going to hear about later today, there are lots of other countries that have made progress on these sets of issues. None of them perfect. All of them involving significant questions about trade-offs among different values. But just to give a flavor: obviously, you're going to hear quite a bit about GDPR in the European Union and the framework for privacy in that law. There's been huge progress in the developing world, and I'll just give a couple of examples. India developed something they call the India Stack, which is a set of rules designed to promote access and equity and privacy and security in the Indian financial context. And a core element of that was the development of a national ID program called Aadhaar, a unique ID that is provided to all Indian citizens and residents, and that any financial institution or government agency can use in order to open a bank account or provide government services. Now, India rolled this out fast. And they rolled it out in some ways with insufficient attention to the kinds of issues I mentioned before about consent and about privacy. And that caused some problems for them legally. In 2018, the Indian Supreme Court upheld the government's use of Aadhaar but said that when private institutions use it, they need to have real, meaningful, informed consent. And so the framework for Aadhaar is being renegotiated now to try and take account more deeply of these principles. 
Another example I'll give is Singapore. Singapore has really been a leader in open banking. In 2017, the government set up a system for the easy exchange of information through a centralized platform. And that permits you to switch your bank account quite easily. It permits you to link to third-party providers of apps for budgeting services really easily. And that centralized platform has really been, I think, an important model. Some of you may also be familiar with the open banking portal in the UK, which provides easy transitions between bank accounts the same way that you currently can switch your cell phone from carrier to carrier, unless you're on a long-term contract. You can keep your cell phone number, and we're not worried that you're switching from AT&T to Sprint, although you might worry for other reasons. But we don't have anything similar in the United States for switching your bank, even though it ought to be just as easy. We've made it extremely hard to do. So what are some of the potential paths forward? One of them is the Consumer Financial Protection Bureau's authority under section 1033 of the Dodd-Frank Act. So back when I was in Washington working on Dodd-Frank, we included a provision in the Consumer Financial Protection Bureau's authorities that gives consumers the right to have access to information about their bank account data and usage in a machine-readable format. And we put that provision in there to enhance competition in the marketplace, to give consumers greater control over their financial data. And at the time it wasn't really noticed; maybe we did too many controversial things in Dodd-Frank. But now that the dust has settled a bit on that, 1033 could provide a significant path, not a full path, not addressing all the issues that I've described, but a significant path forward to give consumers better control over their financial data. 
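To make "machine-readable format" concrete, here is a minimal sketch of what a portable export of account data might look like. The field names and the `"transactions-v1"` schema label are hypothetical illustrations; nothing in section 1033 or any implementing rule prescribes them. The point is simply that structured data like JSON, unlike a PDF statement, can be parsed directly by a competing bank or a budgeting app:

```python
import json

# Hypothetical internal records a bank might hold for one account.
# Amounts are in cents to avoid floating-point money arithmetic.
account_activity = [
    {"date": "2019-03-01", "description": "Direct deposit", "amount_cents": 152000},
    {"date": "2019-03-04", "description": "Overdraft fee", "amount_cents": -3500},
]

def export_account_data(activity):
    """Return the consumer's transaction history as machine-readable JSON."""
    return json.dumps({"schema": "transactions-v1", "transactions": activity}, indent=2)

portable = export_account_data(account_activity)

# A third party the consumer authorizes can now parse the export directly,
# e.g. to compute net cash flow for a budgeting tool:
parsed = json.loads(portable)
total_cents = sum(t["amount_cents"] for t in parsed["transactions"])
print(total_cents)
```

A shared schema like this, agreed across institutions, is what would let the data move between banks and third-party providers without one-off scraping arrangements.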
And I would hope that the CFPB would move forward on implementing it. I think more broadly we could develop in the United States a really comprehensive framework for data sharing, with an open banking platform, with an independent entity or an existing entity given the authority to oversee it, and with clear rules being established for privacy, for security, for consent, for liability, and for consumer protection. I think that kind of comprehensive framework would let us move forward in a way that truly enhances consumers' ability to own their own financial data, to use that data, and to, as I suggested at the outset, empower consumers to have more control over their financial lives, which I think is the core goal. So with that, I'm going to skip all these things. With that, let me just say it's really been a pleasure to be with you this morning. I think we have time perhaps for a few questions, which I'd be happy to take. So thank you very much. Melissa. That was great, Michael. Melissa Koide. How do we know, particularly in a data ecosystem with such unsettled business models, and uncertainty over what the value proposition is for consumers, whether it's data for financial management, a PFM tool, or data for underwriting, or data for how I'm making long-term investment options? With all the uncertainty about what those business models in this ecosystem look like, but yet we know data is flowing in, I'll say, highly unregulated ways, when do we know is the right time to act from a legal and regulatory standpoint? Like, by the time you got to the end of your remarks, you're like, we're there. But we hear this quite a bit, right? Like, wait, let things settle out a little bit before we think about regs in this space, or even new laws. Well, I guess let me answer it in a flip way and then maybe in a more serious way. If we think the right time is in five years, let's start legislating now. Because it takes forever to get anything done. 
So it's never, I don't think, too early to be developing the legal framework. But the slightly less flip point is, I think, that the key is to develop a legal framework that is agnostic to the developing innovations in the market. That is, one of the problems that India got into, which I didn't mention, is they developed in some ways a quite innovative set of structures for non-bank participation in the payment system, but they did it in a way that makes it extremely difficult for private parties to develop economically viable models. They're trying to rectify it; I would say they're in process. So the risk you point out is real, but I think that if you develop a framework that's open and is agnostic to those different business models, then you foster innovation, but you do it in a way that is set in a structure where everybody knows what the rules are. Liability allocation is clear. We're protecting consumers on the consent and privacy front. So I think having that framework in place in advance actually helps people innovate. You innovate around a uniform system and you do it in a way that is consumer-centered. That is, let's start from the proposition that what we're trying to do is empower consumers to have better control over their financial lives, and then let companies innovate on the basis of that principle. So that's what I would say. I'll bring up a question which is ill-formed because I never thought of it until you put up a slide. That's one of my favorite things to hear. Either the slide was worthwhile or it indicated how ignorant I am. I'm not sure which it is. If we say consumer-centric means I own my data and I can prevent you, my lender, from using it for more than two days or for any reason other than to decide whether to lend to me or not, does it exclude other things that might have systematic consumer benefits? For example, being able to audit the quality of loan portfolios. 
If that data goes away, how do you know whether decisions were good? Or, in the age supposedly of information allowing us to correct errors, if we don't have that information on hand, how do we learn anything new? Like, for example, that it's not true that you shouldn't lend to people in this neighborhood, because we've learned differently. So you might have consumer protection at my level, but systematically maybe it goes the other way. So that's a terrific question, and it demonstrates that you're not in fact ignorant, Eric, which is not shocking to me. So absolutely. It may be that the principle we enunciate at a broad level is time- and use-limited consent, but we have provisions that are broadly beneficial to consumers where we say, but yes, of course, you can also use it for these following purposes. So a classic is information maintained in a credit bureau for the purposes of establishing both the individual's creditworthiness and for developing the kind of broader, market-wide information that provides positive externalities to everybody, or, as you suggest, for purposes of regulatory auditing for issues of fair lending or discrimination. So the kind of broad principle that I enunciated would then need a set of clarifying limitations that say, when there are positive externalities to the market as a whole, that information might be used for those additional purposes. But it kind of flips it on its head. So instead of saying, as we do, not entirely today, but as our starting framework, that if you get consent, you can use it for anything, and that also has some positive spillovers, we would say: what are the positive spillovers? And then you can use it for those things. In areas where there are positive spillovers that are largely private, you can imagine different models. So obviously today, many apps and technologies in the world, from Facebook on, use the fact that we've provided data to them to generate the revenue that supports the free app. 
And that's a private business model. We could have a different kind of model where we said, no, actually, it's pay-per-use, it's micropayments, it's not a free app with our own data being the revenue generator. But those sets of questions, I think, are different from the kinds that you're raising, which are really about broad public goods provision. Dick? You said it's time to start legislating now. With state and local, municipal and federal governments, as you and I are all familiar with, where should we start? And how do we reconcile, in our own American house, the conflicts that might arise between or among those various branches of government? And then more broadly, when we think about it culturally, it does appear that Europe has a different approach and a different view of not just privacy, but, as you started your talk, about ownership and who owns it. And is there a concept of shared ownership under certain circumstances between the consumer or, for that matter, the business and the people with whom they transact? A lot of questions, but... Yeah, so those are three really good questions, at least. So some states are proceeding on their own. In California, there was a ballot initiative that was going to essentially enact GDPR wholesale into California law. And the California legislature, to preempt the ballot initiative, passed its own modified version of that. They're having a little bit of trouble figuring out what it means and how to implement it. And some of the provisions in the law, as originally drafted, are not feasible to implement by their own terms. So they're in revisions right now, working on that. But I do think that in the current political environment in the United States, having some state-level experimentation might be quite valuable, particularly if the state that's experimenting is California or New York. It's going to cause some short-term pain. 
Even if the law were perfectly written, it'll cause some short-term pain because of state-federal conflicts and conflicts among state laws. So I don't want to say it's kind of nirvana to do that approach. But I do think having California experiment with this will give us some good evidence about what might work and not work in the U.S. context. And I'd be happy to see experimentation not just on privacy, but on other issues. Federal preemption makes that tough in the financial sector, as you know. So it doesn't fully address these sets of questions, but I think it's worth pursuing. At the federal level, I actually think there's some ground for optimism about bipartisanship. I think that, not for all the issues that I raised, but for many of them, there are senators and representatives on both sides of the aisle who are interested in technology issues, who want to see progress being made in the United States, and who don't want to see the U.S. fall behind because our frameworks are not as developed as they could be. So you have, for example, Representative McHenry in the House, who has been a longtime supporter of technological innovation. Senator Peters from Michigan. So I could see bipartisan work in trying to make progress on these efforts at the federal level. Your last question is really about different cultural norms across the world, and I totally agree with you. I didn't mean to suggest that we should have the same privacy balancing as the UK or India or Singapore or the EU, all of which have made different choices from each other, just that we need to be much more serious about the choices that we've made. I don't think that we've really addressed in any deep way what our own privacy values are. So I'd like us to do that and then embody that in a coherent set of rules that will then foster innovation in the financial markets. Can you indulge me in a second question, or maybe a fourth? It's up to Sophie. Sorry. 
Our credit bureaus are a source of misinformation, because they take as given the information that they collect and don't really validate it, and that's the source of a lot of heartburn for consumers who are denied credit or harmed in other ways. Is there a legal remedy for that? Is there some kind of policy that we could think about that would make that more consumer-friendly?

Yeah, there's actually now enough authority that they could be better regulated, both through individual error-resolution procedures and, more importantly, through the CFPB's ability to provide oversight with respect to their activities. I would love to see the CFPB expand its activities in that space and try to, as you suggest, clean up some pretty bad practices. Even with greater supervision, it's super hard for us to make progress, because consumers don't really know what's going on in that space. Unless something goes horribly wrong in their personal financial lives because of it, you're not going to see those things rise to the top. So you see the tail end of really bad practices and really bad errors that cause harm, with less focus on the underlying question of whether the quality of the data is adequate, and less attention to issues of fairness and equity except at the level of extreme harm.

Eric wants to go back for more. Our students are not usually this shy. Don't be shy.

So I'll say I'm a bit confused, and I'm wondering if there are differences within legislation and regulation that apply to various tiers of consumers. When you started off, you talked specifically about the account switching and overdrafting of a specific individual, but then we also talked about the credit bureaus and aggregated data sets of people. Do you foresee different regulations at those different tiers? And do you think a common solution can exist across all of them?

Yeah, I think it makes sense to think about the system as a whole.
That is, both about individual-level protections and also about how the system as a whole operates. The role of different kinds of data aggregators is really important in that system, and credit bureaus are, in a sense, an important example of a data aggregator. So I do think it makes sense to think system-wide. And, going back a little bit to Eric's question, we could have a set of rules that permits the efficient use of data in an aggregated form but also protects consumers against misuse of that data. So I don't think of those as separate problems; I think of them as parts of an integrated system.

Can you say a little bit about cybersecurity? Part of what scares the crap out of me about India's digital identifier is the fact that it's linked to your biometrics. And there have been points in time when I've received no fewer than four different letters warning me that my data was exposed to some hacker or another. Fortunately, I don't use my real information for almost anything anyway, but how do we get around that?

Well, for those of you who don't know, that's Christy Barr, who is the director of the Center on Finance Law and Policy and does a lot of work for me. I'm really grateful. And one of the things she does is ask good questions. So, you should be really worried about cybersecurity. You should be more worried about it today than you were yesterday. The techniques for detecting cyber attacks have advanced, but the techniques for carrying out the attacks have in many ways advanced faster, and it's kind of a race. I also think there are particular problems with biometrics being used for identification, because once your biometrics are stolen, that's it. There's nothing left. Despite all the James Bond movies, people don't go around sanding off their fingers and replacing their eyeballs to change their biometric identity.
So that's quite concerning on lots of levels. There are new techniques being developed right now that are much more protective of individual identity. They're not really being deployed in the market yet, but there are techniques that, through advanced computation, permit an outside entity to perform calculations on encrypted data without decrypting the data first. Those techniques will eventually permit us to carry our identity in encrypted form on our person, in our phone, and to let an outside lender decide whether to provide credit to us by looking at that encrypted data in a masked form, and that will eventually be, I think, a more robust form of protection. There are also, at a company-wide level, advances in cybersecurity protection that shift the way in which an entity connects to the internet and to the market, slowly changing the nature of those connections in such a way that outside attackers can't tell how the network connection is changing and can't access it as readily. Those two kinds of techniques, I think, are important advances in cybersecurity protection that buy us some more time to maybe get ahead of cyber attacks. But this problem isn't going away, the arms race is going to continue, and the kinds of techniques people are developing are not foolproof. So you should be worried. There's not really a full answer to that unless you completely get off the grid, and there are some people in our society who have chosen to do that, for that and other reasons having to do with their own personal ideologies. But it's pretty hard to do in a modern economy; impossible, really, in a modern economy.

I think we probably have time for one more question. So, who's very confident that they have the best question? Okay, we can do two more. We'll go over here first. Thank you, Mike.
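The "calculations on encrypted data without decrypting it" described in the answer above refer to homomorphic encryption. As an illustrative aside, here is a minimal sketch of the Paillier cryptosystem, which is additively homomorphic: multiplying two ciphertexts yields an encryption of the sum of the plaintexts, so a third party can compute on values it can never read. The tiny fixed primes are for demonstration only and provide no real security.

```python
import random
from math import gcd

# Minimal Paillier cryptosystem sketch (demonstration only).
# The tiny fixed primes make this completely insecure; real
# deployments use primes of 1024+ bits chosen at random.
p, q = 293, 433
n = p * q
n2 = n * n
g = n + 1                                       # standard generator choice
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)    # lcm(p-1, q-1)

def L(x):
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)  # modular inverse; Python 3.8+

def encrypt(m):
    # Pick a random r coprime to n, then c = g^m * r^n mod n^2.
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

# Homomorphic property: multiplying ciphertexts adds the underlying
# plaintexts, so an outside entity can compute on encrypted data
# without ever decrypting it.
c_sum = (encrypt(42) * encrypt(17)) % n2
print(decrypt(c_sum))  # 59
```

A lender in the scenario the speaker describes would rely on richer schemes (fully homomorphic encryption supports multiplication as well as addition), but the core idea, computing on ciphertexts and decrypting only the final result, is the same.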
So, in my past life at the CFPB, or BCFP as they were called for a while, and now we're back to CFPB. They haven't changed the logo yet, actually; if you go to the headquarters, it's still BCFP, in the ugly brown color. Anyway, you have talked a lot about privacy issues, but when you talk to the aggregators and the banks, where the conversation sort of breaks down is when the liability question comes into play, right? Christy asked a question about security, and we all know that no matter how good your security is, at some point it will be breached, and you have probably been breached before you even realize it. So what are your thoughts on how liability should be allocated among the different parties? You talk about consumers being in control, but the consumer, frankly, doesn't want to bear any responsibility, right? If I give you my data, it's your job to safeguard it, and if there's a breach, I'm going to hold you responsible. And, reasonably, I think the banks will say, hey, we are always the last stop, so if anything goes wrong, we're on the hook for it, and what ability do we have to go after the third party or fourth party or fifth party, when, at least at this point, we often don't even know the source when data is breached?

My second question is really a follow-up on the conversation about data quality and the CRAs. Do you think the aggregators should be regulated or supervised as CRAs? If so, what do you think the responsibilities of the furnishers should be? I remember a meeting with a very large bank, after the election of course, so the attitude had changed, where the pushback was: if you make us a furnisher, we're going to stop sharing the data with customers through third parties. So I don't know if the new CFPB leadership is willing to take on that challenge.
Yeah, the two issues you raised are brutally hard, and, not surprisingly, they're really at the core of the current fights about how to make progress in this space. I think we're going to have to end up with a set of clear liability rules, and I think that at the end of the day those are likely to fall on regulated financial institutions, and in particular on depositories, in part because they are the institutions that have the wherewithal to be the last stop, and they are supervised, so we have higher confidence in their ability to perform. That doesn't mean there can't be intermediate levels of liability before the last stop. We might have loss-allocation rules that start with that principle but permit recovery from third parties where there's either fault or where we want to distribute the costs of the compliance system. So I can imagine a system where, at the first-order level, we're relying on supervised financial institutions for liability, and at a secondary level we're permitting putback to the vendor.
With respect to the furnishing question, I think it's basically the same set of issues. That is, we want aggregation to occur, but we want it to be accurate. How do you do that? You encourage the broadest provision of information, but then you have some set of rules about the extent to which the holder of that information is responsible for checking and verifying its accuracy. I'm not sure I would go all the way to saying that all data aggregators should be regulated the same way as credit bureaus, but then we don't have a fully developed regulatory system for credit bureaus themselves either. So I do think we need a comprehensive approach that would treat this class of activity the same, and certainly not leave it almost completely unregulated, which is more or less where it is now.

One more? Okay, you've got to make it quick. It's up to you, but it's going to be great, and I'll give a shorter answer.

First of all, Mike, thank you so much for this wonderful presentation. Since you talked about Aadhaar, I think I am the first person here who has actually gone through the Aadhaar card enrollment and everything. It is very tough to enroll people in those plans, or to do it at a massive scale, when we're talking about 1.4 billion people. What I saw while enrolling there is that there were portable devices where you were giving your biometrics, retina scans, and everything, and in the last couple of years you might have heard the news about the data breach: just pay $10 and you will get maybe 50,000 people's data in your hand. I think that was one of the biggest breaches ever to happen at this scale. And when we talk about India, most of the time it is a first-hand experience, because there are no precedents available in any other country for rolling out such a massive scheme for this many people. I think there
should be some law enforcement or a robust framework in place well before rolling out these schemes. So what is your take on that? Suppose the United States wants to come up with any such scheme; what should the steps be, in terms of public-private partnership or law enforcement, so that the process is smooth?

Yeah, I think one of the advantages that India had in that context is that it did a super-fast rollout. It developed the whole idea of the India Stack in a very brief period of time and tried it. But the disadvantage of that is they had a lot of problems in implementation. Things on the ground were really messy, and are messy. They had to go back to the drawing board on privacy and consent issues after the Supreme Court ruling, and they had an absolutely wacky demonetization effort, leaving aside what its motivation was, that was incredibly disruptive. So I wasn't holding up India as an example of perfection in this case, just noting that they've experimented in a way that is quite comprehensive and, I think, innovative. In the U.S. context, the challenge is not quite as stark, because most people in the United States have identification, most people already have a bank account, and the income distribution here is quite different from India's. So our challenges in the U.S. context are quite modest in comparison, and that gives me greater hope that implementation here could be done in a pretty easy way.

So, thank you all very much.