When I, and I think the other editors-in-chief, go to review papers, we take a two-step approach. The first step is just getting a sense of the science, the big picture. We're not studying the paper at this point; we're really just trying to understand what the authors are saying. There's something you may notice immediately, which is that if the paper is difficult to read, the language quality is poor, and you cannot decipher it sufficiently to understand the science, that is grounds for sending it back to the editors and letting us know why. If you have to translate it to understand it, there's just too much possibility for misinterpreting the science, so reviewing it anyway is just not a good approach. What will happen then is that the paper comes back to us, and we suggest to the authors that they work on improving the language quality and resubmit it so that we can actually review the manuscript appropriately. So if this is something you encounter, just send it back and let us know. Usually we try to pre-screen manuscripts sufficiently to catch that, but sometimes it's not immediately apparent.

From there, assuming the manuscript is readable for you, I think the abstract is a really important piece. You've already seen it: it's the big picture, what the authors are summarizing and what they're going to show you in the manuscript. So it's an important starting point, but also something to come back to at the end. Then it's important to read the introduction for a couple of key pieces of information. Why is the study important? What is the knowledge gap that the authors are trying to fill? What are the questions and hypotheses under investigation? These should be pretty clear to you. We all write papers, and we get trained to communicate these things, so they should be clear to you as a reader as well. And this is important for framing the results and figures that you will go to next. So what were the questions they were asking?
How were they addressed? What did the data show? Again, this first pass is big picture, getting the essence of the paper. And then from there, on the next slide...

Can I, sorry to interrupt? I just wanted to add that, although we pre-screen, sometimes we also miss problems with figure resolution. If you cannot decipher the figures because you cannot read the numbers, that's also something that needs to be returned. Many times people still try to review anyway, but it's very, very difficult to get any kind of realistic review, so please also consider that.

Yeah, and I should add that high-resolution figures will be required before publication anyway, so the figures should be high resolution to start with.

All right. Once you have the essence of the paper, the big picture, the questions, the big approaches and findings, this is when you dig into the paper, and this is what can take a fair amount of time, depending on your level of familiarity with the topic. In thinking about the results and methods, there are some key questions you should be asking as you evaluate. What are the experimental conditions? What is the experimental setup? This should be clear. Are there appropriate replicates for each experiment? Are they biological replicates or technical replicates? This should all be very clear as well. What's the sample size? Is the sample size representative of the larger population? That should also be clear. Are the appropriate statistics applied? Could you reproduce the experiments based on the methods as described? And are the authors drawing the right conclusions from the data that they're showing?

One thing to be aware of, as you're evaluating the results and methods in particular, is that often we're not expert in all of the methods applied in a paper. And that's fine as well.
It's very common for a reviewer to say, okay, I can evaluate X, Y, and Z in this paper, but not this last section on methodology X; this is beyond my expertise, and I haven't evaluated it. You can put that in the review. You can also include it as a comment to the editor, just so they know. That's perfectly fine to do. So if you are leaving a piece of the manuscript out of your review, just let us know. It's better to do that than to guess and not review or critique it appropriately.

Once you have a handle on the methods, and I should say it's not uncommon for this to require some background reading: sometimes that means going back to the authors' previous work, or to works they cite where those methodologies or approaches have been applied in other manuscripts. So it shouldn't feel unusual to have to read, or at least glance at, some other papers to get a handle on the methods and approaches.

From there, now that you have the data and the results: are the conclusions the authors are making supported by the data that they're showing? Are those conclusions then accurately synthesized within the literature? Are the authors placing the work in the right context? Are they claiming to be the first people to have done something when really it has been published before? Or, conversely, if they have discovered something amazing and they're not appropriately communicating that, that's also useful to take note of; maybe they're selling themselves short, and you can suggest that they highlight their new discovery. Did the authors answer the question that they set out to answer? In other words, is the whole manuscript a tight package? Does it make sense? Is the discussion well balanced? Have they considered alternative interpretations? Are they ignoring parts of the field or the literature that would conflict with what they're showing? This is what we mean when we say well balanced.
And are the conclusions overly speculative? Different people have different approaches to writing a discussion. Some people think this is the place for wild speculation, and I think there's certainly some room for speculation, but not over-speculation that gets a little too far afield. Kitty, Yana, do you want to jump in at any of these points here?

I think that summarizes it well.

Okay. Some of these were your points. So, okay. And then at the end, I like to go back to where I started: going back to the abstract and asking, okay, now I know what they said they were going to do, and I know what they said they did, so did they actually do this? Often what happens is that the abstract is a little overblown. It doesn't actually reflect what they did, it doesn't appropriately summarize the work, or maybe it overstates conclusions that aren't supported by the data. So it's really important to go back to the abstract and check those boxes. And similarly the title: this often gets ignored, and the title might overstate the findings as well, or be a little more grandiose or a little more vague than it should be for the data that are shown. Those are really the things to think about in terms of the critique that you're going to write up and submit.

Can I add something about the title? I think the title is extremely important because it's the first thing that people see when they're looking through a list of manuscripts and papers. It's in our interest as well that the right readers reach the right articles, because of course that's what moves science forward. So I think the title is extremely important to look at from your perspective when you're reviewing: is it appropriate, and does it cover the content of the manuscript? It's always great to get input on that from reviewers.