is accessible, then certain things will come out. And perhaps because they come out faster, people will be able to catch problems earlier. But in and of itself, releasing data is not going to make that data better. I think that openness is a very good policy, and I think it's something that people will potentially pay much more attention to. So if I am going to put, again, my reputation on the line and say, this is my data, maybe I'm going to look at that data a little more frequently. Maybe I'm going to look at it a little more carefully, to make sure that my data looks as good as my paper looks. But the act of opening data, by itself? I'm not sure. I'm not sure. But hopefully one of the things that will happen is that it becomes a more important part of the paper, a more important part of my scholarly work.

The things that you want to be able to do are the same things you're doing with journals: making sure that you have the appropriate number of subjects of both sexes, and making sure that your statistics get looked at by a statistician. Every campus has a statistics department. Those people are available. I'm sure that you can seek some collaboration, or, if nothing else, get some help. When I was a graduate student, I remember I had to do this, because there were some numbers that were a little tricky, and I wasn't quite sure that I was doing the statistics correctly. So I went over to the stats department and said, hey, could somebody please help me look at this? And even though I was a grad student, somebody helped. So that was great.

But the other part is, of course, the authentication of key biological resources. So again, looking at the ways that we can actually know that this is the cell line that we're using, that this is an antibody that is actually working in the case that we think it should be working in.
So that piece of authentication: how do you know that this is actually working properly? There are different places that you can go. For cell lines, ICLAC (iclac.org) is a great resource. They have a lot of guidelines in terms of, how do I get this information? How do I figure out that this cell line is actually what it's supposed to be? And they have a few different things that they say. One of them is: sequence it immediately. Then do a spot check. Then, at the end of the study, sequence it again. It should match. All three of those time points should match. And if they don't match, then you know something is wrong with the methodology; something has happened with that cell line.

For antibodies, this is a little more tricky. There's a table that we've put together that really tries to outline how one might validate a particular antibody, how one might know that it's actually working properly. And this is even more difficult than just sequencing a cell line, because antibodies tend to fail, unfortunately, way too often. There have been some estimates that about half of them fail. And they don't always fail in a way that you expect them to fail; often they actually recognize some other epitope, for example.

So one thing that's really easy to check is: has the application that I'm using this antibody for actually been validated by the vendor or the manufacturer? That's an easy thing to be able to look up. Another thing: if the vendor or manufacturer doesn't have any information or any data about what that antibody has been validated for, don't use that antibody. Oh my goodness. That means that they're just buying this antibody from somewhere, or they're putting something crazy in a bottle. They should have the data. They should have the material datasheet. They should have the validation information. If they validated that antibody for a Western blot, great.
But if you're using it for immunohistochemistry, that's a problem. Because in a Western blot, that particular protein will be tweaked in a certain way, and in an immunohistochemistry preparation, it's going to be sitting inside of other tissue. The antibody might work great for Western, but it won't work in this case. So you have to validate per usage. You can't just validate for Western blot and say it'll work across anything. It may not work in immunoprecipitation; it may not work in immunohistochemistry. So you actually need to validate for immunohistochemistry as well.

And we have some ways that people can actually look at this. There's a paper in Nature Methods that was sort of a consensus-of-the-experts paper, which put together a nice table showing, okay, here's the gold-standard method of validating for every kind of experiment: immunoprecipitation, immunohistochemistry, et cetera. You can go by that table. It's relatively straightforward, if not simple. But these are just some ways that we can actually know that we are using what we think we're using.

And that's all that the NIH wants you to do. It wants you to think about the methods; it wants you to really pay attention to the methods. And I think good journals, and good advisors, and good scientists have always known this, and they've always been able to do this. I think science has lost its way a little bit, but we need to get back on the right road.