Peter, I know one of the things that you've commented on is this culture of sanctions and things. And I know you've often said that carrots perhaps work better. And I also know that you've got this ecosystem idea in mind. Do you want to?

Yeah. Well, I don't know. Carrots and sticks, I guess when I think about the whole problem, I think about it as an ecosystem. You know, we're moving from a world where, in the paper age, a lot of work was being done in this ecosystem of practical obscurity. And that work was being done without our really being focused on it from a policy point of view. As we've moved into, or been driven into, an electronic world, that electronic world pushes toward transparency. It's just in the nature of electronic information that it tends to be transparent, wants to be transparent. And it makes us much more sensitive; sometimes we become aware of it involuntarily, when it bites us in the butt, and sometimes we can get ahead of that. It makes us more sensitive to what was happening in that ecosystem of practical obscurity, where everybody could talk about the world being open, being public, but as a practical matter it wasn't. And a lot of interesting, important work was happening. So as we try to get a handle on this new world where practical obscurity is being sort of systematically killed, and I think it ultimately will be, what's happening is we're becoming aware of how we were regulating the system. And any system of regulation is going to use rules, and it's going to use technology, whether the technology of paper or the technology of electronic computer systems. And Linda's given us a wonderful sort of snapshot of the use of technology in a very meaningful way to try to address some of these issues, where you're building the technology at the front end. But she's absolutely right.
Once you've built the technology, there's a lock-in structure where people get used to that technology, and it's extremely difficult to solve problems with technology afterward. And then you go to the third mechanism of regulating the stuff, which is training. So you can come up with rules, you can have technology, and you can have training. Actually, what's happened in the federal system, which is where I mostly work, is that we had a fairly simple technology that we were working with in the PACER and electronic filing systems. We came up with it, and there wasn't a lot of focus at the front end on the complexity of what was going to happen when everyone jumped into the deep end of the swimming pool. And so the impulse is to create some rules after the fact, and the federal rules that are now sort of embodied in the rules of civil procedure do that: you know, don't file social security numbers in court filings, things like that. But then implementing it becomes extremely difficult. And the tendency on the part of the courts, particularly in an adversary system, which is what was regulating the world of practical obscurity reasonably well, is to put the burden on the lawyers. So the tendency is: okay, if lawyers file pleadings with social security numbers in them, there will be sanctions; you use a stick instead of a carrot. I think what is happening is a recognition that that model, the model of trying to regulate this animal through an adversary system, needs to be supplemented. It's not that the adversary system isn't an appropriate way of trying to get a handle on many of these problems, but it's not going to get a handle on people who don't have lawyers and whose information is in court, you know, victims, third parties whose sensitive information is relevant to a case. It's also not going to get a handle on another problem.
One that I think Stephen Schultz is going to be addressing a little later today, which is that there's also a good reason to make public information more accessible, to make it Google searchable, for instance. And I don't think that the federal system has done a particularly good job in that respect either. So the whole system is kind of built around a proxy of practical obscurity, simply because it's hard to work in the PACER system. The stuff that should be public should be free: you know, opinions by judges, many of the briefs that are filed by the attorneys. This is really stuff that the public ought to be able to access without having to pay for it, or if they pay for it, it ought to be very convenient and easy to get hold of. That's something that the adversary system just doesn't handle well. And we can see that with the implementation of sealing orders. I mean, the reality is, and again, you can have a very strict set of rules and policies, but it's very difficult to get lawyers and judges to follow those policies if the enforcement of rules takes place in an adversarial system. Because an adversarial system is set up around the resolution of disputes. And if there's no dispute that this thing can be filed under seal, the affected parties never get heard; in the sealing context that's usually the public, but it can also be, in a class action, some of the actual parties to the case. And there's an abuse that can take place when stuff is filed under seal that shouldn't be, which is exactly the mirror image, the other side of the coin, of the abuse that takes place when information affecting the privacy rights of third parties who are not represented gets filed in court publicly and accessed and abused in an improper way. And it's the failure of a system that's designed around the resolution of disputes between the parties before the court to address these interests.
I won't call them rights, because rights are only enforced in the context of these disputes, but they're interests that are important from a policy point of view: both the public's right and need to have access to important information that affects the civic dialogue of a community, and the interests of individual unrepresented parties whose privacy is affected. And that is sometimes even people who are represented, in a class action context, or clients of lawyers who aren't focused on them; we have lawyers filing their own clients' social security numbers. That's just because the lawyers aren't zeroed in on that. And what I argue in the law review article that I published is that I think we need to supplement this with more of an administrative approach, whether it's informally through committees that the judiciary appoints, or an actual administrative agency. At some level we're transitioning from one world to another, and I make an analogy with the world of environmental law. In the 19th century, nuisance lawsuits between individual parties could deal with all practical, serious environmental pollution issues. In the 20th century we realized that pollution, which is the externalities, the bad things that happen that often affect people who really aren't directly involved, our interest in breathing clean air, our interest in clean water, really can't be adequately addressed through the traditional nuisance doctrine. It's not that the nuisance doctrine gets replaced. It means that you have to have a separate agency like the Environmental Protection Agency focus on the effects on the many, many different groups of people who are going to have their interests affected by a particular important decision that's going to be made. Now, I'm not saying that we need an EPA, that we have to have environmental impact statements whenever we put information online.
But I am saying that the kind of focus that's required, the kind of rules that will be required, the kind of focus that I think Colorado exercised relatively well by asking, well, how is this technology going to affect the interests of unrepresented parties? How is it going to affect victims, for instance? Focusing on that ahead of time, by designing better technology, by coming up with a good set of rules, and then by the training that's implemented, all of which are really, really hard to do: that can't effectively be done within an adversarial system. It has to be done through some sort of body that tries to bring as many interested parties to the table as possible, gets comments from all the people who have an interest in the matter, and then tries to come up with something that makes sense from a public policy point of view. But we are still far from understanding what policies, what technology, and what training are best. I think a recognition is slowly growing that the process of engaging in that conversation has to take place in a more administrative model, where there's public comment, where the interested stakeholders all have a place at the table, and where policies can be adopted that really reflect the public interest, both in making sure information is publicly available and in making sure that individuals' privacy interests aren't violated when it's not a matter of legitimate public concern. It's easy to say that, harder to do.

Right.