I'm having a bit of trouble locating the document. It seems I didn't have a document attached to the informational email. Okay, so the document is located on the NIH BRAIN website: in your search bar, search for NIH BRAIN, go to the new funding opportunities page, and you'll see a table. Go down to the row that says informational call, and in the middle there's a column with a link for EB-15-006; click on that, and that is the actual document, the RFA. Did you find it? It's the row that says informational call, not the row that has the actual RFA title. I might not be smart enough to apply for that, but this table is under the new initiative, models and analysis of complex data. Let me count now: it's the third row down, the one that says informational call with program directors, and there's a link next to it for EB-15-006. Has everyone found that? I found it. Well done, thank you.

I will start our discussion now. Welcome to the informational call for the BRAIN Initiative series, models and methods for analysis of complex data from the brain. This is our second informational call, which will be recorded and posted online. On the second slide you'll see that the overall goal of this initiative is really to develop analytical tools for understanding brain function. The deliverables from this RFA are tools that can be broadly disseminated to the community. Remember that the purpose of this initiative is to develop tools, and by tools we mean theories as tools, models, and methods. We hope that these tools can then be used more broadly by the larger neuroscience community. To repeat: the purpose of this RFA is to develop tools, not to promote analysis projects that might be tuned to your own laboratory. We intend this RFA to develop tools for the broader neuroscience community. The next slide starts with tips to applicants.
I just want to remind you to look at the bullets in section one of the guidelines. The next three slides will actually show that section and the bullets within it, and I would encourage you to identify the topics that resonate with your proposed project. These topics are listed in the bullets in section one. By identifying these topics, you will help us program folks, as well as the reviewers, understand the type of research being proposed in the application you're sending to us. You can choose to develop theories, models, or methods for this project, and you should also address how these tools can be used to facilitate other projects beyond your own. In a sense, by identifying these bullets for us, you'll be closing the loop: you let us and the reviewers know what topics you've identified, so we know exactly what you're proposing to do. I will be saying the same thing to the reviewers when we orient them, so everybody will be on the same page. The next bullet on this slide encourages you to form highly interactive partnerships that strongly integrate truly diverse expertise. We really would recommend that you include theorists, modelers, data scientists, experimentalists, and end users on your project team. Now, I've received a lot of questions about the weight to put on each of these types of experts, and it will depend on the science of your project and where you are in the pipeline of developing the tools for dissemination to the broader community. But we believe that by incorporating these experts into your project team, you'll develop a tool that can be helpful to the broader community, with input from each of these experts. You can describe the participation of this expertise in your research plan, in the biosketches, and also in the budget justification. In the last bullet, the emphasis is to tell you that experimental data is not expected.
But we will allow a limited amount of justifiable data collection for the purpose of validating your tool. Again, we don't expect much data collection to be proposed, but you can propose it if you are using the new data to validate your tool development. I would encourage you to read section 4 of the requirements in the guidelines very carefully. Section 4 describes specific instructions for applicants to follow in terms of what to include in the research plan. If you want to go to the guidelines right now, you can click on the link at the bottom of that slide, and you'll see that section 4 has a description of the specific instructions for applicants: what to include, what not to include, timelines, personnel effort. There's a specific description of the end user activity as well. Now, the next three slides, as I said, show the bullets from the topic areas. The first slide covers theories, ideas, and conceptual frameworks. Again, it would be nice for you to identify the bullet with which your project resonates; that way it will help close the loop with the reviewers on what exactly you're proposing to do. The second slide is on the category of models and associated statistical, analytical, and numerical methods to integrate information across large temporal and spatial scales in the nervous system. The third slide is on new methods for complex data analysis. The slide after that is on programmatic issues for this RFA, which I would... I'm sorry, can I ask a question before you go on? Yes, of course. Go ahead. On specifying these bullet points, how specific do you want us to be? If the wording in our research plan is specific, is that enough, or do you want us to really call out bullet point number two on page whatever?
I think as long as you have alerted the reviewers. I mean, you can say, "we are addressing the bullet" and then quote it: theories on how information is encoded in the chemical and electrical activity of neurons to implement behavior on both short and longer time scales. As long as it's identified, then we all know that you're addressing it. Are we supposed to address all of these bullets? No, no, no. You can address one, or any other bullet of your choosing that may not be listed here but is at an equally challenging level. We've provided examples of bullets that are general enough that, hopefully, most of them capture everything that will be proposed. But if you find that you're proposing something different and equally challenging, you're welcome to say that. The purpose is really to help us identify, programmatically, which area you are addressing in your proposal. As applicants, I know you are very much tuned to the specifics of your project, and you want to jump right in and describe it to us and to the reviewers without really introducing it in this way, where we programmatically know immediately that you're developing a new theory, a new model, or a new method for this, that, and the other. It's just a helpful way to add one sentence to your application to tell us what you're doing. I have a question. Can we address bullets from more than one category, for example, models and new methods? Yes, that's fine. We don't expect you to address that many categories; it's only a small proposal, but you're welcome to address more than one bullet. Can I also ask: you mentioned the project justification. Of course. Is this going in the cover letter? I'm sorry, there's some noise on the line; if I can ask others to mute their lines. Can you repeat your question, please? You mentioned writing the project justification. Where does it go? In the cover letter?
Project justification? You mean the bullets? No, you mentioned explaining your approach and so on in the project justification. Where does this come in? I'm not sure what you're referring to. You mentioned before, for example, that if you include other groups, you mention it in the project justification; I'm asking where this goes, which part of the application you put it in. Okay, so I think I was referring to bullet number three on the slide that starts with tips to applicants, on forming highly interactive partnerships between theorists, modelers, data scientists, experimentalists, and end users. I was suggesting that you could talk about their roles in the research plan section of the application. You can also include their biosketches and talk about their roles there, and also in the budget justification. Yes, but where is the project justification? Is there a special form for the project justification? I mean the budget justification. I'm not familiar with where I can put the explanation for the justification; is there a place in the application where I need to put this? So I was referring to the budget justification page. Ah, I see. I'm sorry if I wasn't clear. Okay, okay. I saw there's a special form we can add to explain and justify the approach. Right. If you have any additional quick questions like that after the call, you're welcome to email me and I can send you the instructions for the actual application; search for budget justification and it will give you the instructions for that page. Is the expectation for the application to address a single bullet? Sorry, the expectation? Yes, is the expectation to address a single bullet? I mean, is that what it's for? You select a single bullet out of all these bullets; is that the expectation? Well, we don't expect you to address too many bullets.
As I said to the previous questioner, you're welcome to address more than one bullet, but each of these bullets, we believe, presents some very difficult challenges, so we don't expect you to address that many of them. So the expectation is only a single bullet? Or not; like I said, it all depends on what you're proposing to do. Sure. You're welcome to present more than one bullet. The purpose of presenting the bullet is merely to close the loop programmatically: to tell us program people, to tell the reviewers, and to remind yourself what you're trying to propose in the next 12 pages, so that you don't just dive into the project without this very brief introduction. Okay. Thank you. I'm sorry, I should have said that there are other colleagues from NIH on the line. Dr. Jim Knott, who is my co-team lead for the team that developed this initiative, as well as Dr. Robert Elliott, who is the scientific review officer and organizer of the review, are on this call. Jim and Robert, if you wanted to add any comments, please do. So this is Jim, on one of the earlier questions about the cover letter. Please don't put anything in your cover letter that you expect to be part of your application; neither the reviewers nor program staff see your cover letter. So the cover letter is pretty much "please read my proposal," and that's it. I've recommended that people who are very concerned about the types of reviewer expertise involved can include the types of expertise they would like to have review their applications, but I would strongly, strongly remind you: please do not include any names, because any names you put in your cover letter will put those people in conflict. But it is a great place to list the areas of expertise, to help Robert in finding appropriate reviewers for your project. Robert, did you want to add anything? Thank you. Okay, thanks.
I will move on to the programmatic issues slide, and we'll close up quickly so we can answer more questions. On the programmatic issues, I've divided this up by the review criteria. Again, the significance is described by the topics of interest, and we've talked a lot about that. The bullets we just showed you in the previous three slides are written to stimulate new ideas for new analytical tools to understand the brain. We realize that these bullets pose very difficult challenges, so we are going to ask the reviewers to look at this and assess whether or not you, as investigators, have taken up any one of these challenges or something similarly challenging: to what degree will the proposed tools be widely used in the neuroscience community, and will they have a strong influence on fundamental approaches to understanding neuroscience data? By looking at the review criteria now, it might also help you organize your thoughts when you prepare your proposal. The next slide is on investigators. Again, I'm listing the review criteria that we are presenting to the reviewers so they can assess the following points. To what degree have you incorporated the roles of the collaborating end users? To what degree is there an appropriate effort with the theorists, modelers, data scientists, experimentalists, and end users for developing the tools for the wider community, and are the roles of each expert clearly delineated? So again, that's what the reviewers will look at in terms of your project team. They will also assess whether or not the end user contributions to the project will add substantive value to the project deliverables. The end users are incorporated so they can evaluate how reusable your tool will be for someone in the larger community. Having an end user participate on your project is the first step in sharing your tool.
So the end user contribution will be particularly important to help with that evaluation for tool sharing. Finally, the last bullet is: to what extent will the project inform experimental paradigms and drive future data collection? Again, this all goes to how the project will inform future use of your tool. On the approach... do we have a question? Yes. Where exactly do we describe these roles? Well, it's up to you to decide where in the research plan you put this information. For investigators, as I said, you can put it in the research plan, incorporate it into the biosketches, and include it in the budget justification. Okay, but the research plan would be the primary place? Yes. I actually have a question. Yes? Can we have the same person be both an experimentalist and an end user, if we think that some end users are experimentalists? I'm sorry, I missed the end part of your question. Can we have the same person serve both as an experimentalist and as an end user on the project, given that the project is pretty much about theory, so there's not so much for the experimentalist and the end user to do? Can the same person fill both roles, or do we need different people? So again, it will be up to you to weigh how much expertise you need for the tool development. We have these bullets here as review criteria for investigators, and the assumption is, well, we expect that the tools you develop from this will eventually be disseminated to third-party users who have not participated in developing your tool. So if you're developing a theory, will your theory be usable by other folks, other end users, who will then apply that theory? Similarly, the last bullet is about experimental paradigms.
If you incorporate some experimental expertise, those experts could help advise you on what types of paradigms your theories could be applied to, to drive future data collection. They don't necessarily have to have roles equal to the PI; they could be co-investigators or consultants. It really depends on what your needs are. But having these experts on your project team will help with these different aspects of disseminating your tool. The next slide is on the approach. Again, the review criterion is for the reviewers to assess whether or not your application has proposed adequate metrics and strategies for building confidence in the analytical tools and their predictive capabilities in their intended domain of use. So again, it would be the job of the end user role on your project to ask: do I believe in your tool? Will it be usable for what I need it for in the future? Are there metrics this end user can use to evaluate your tool, to build confidence to use it in the future? The second bullet is: to what degree are the end deliverables clearly defined and/or quantifiable? Again, it's all about the end deliverable: what are you going to present to the community at the end of your project? The third bullet is: are the sources of the data for the analysis appropriate for the project and readily available for replication studies? This goes to validated data sources and making sure that your tool is developed using data that's validated. And if you need extra data collection, is it used for validating your tool? And the last bullet is: are the use cases for the proposed tools well described and appropriately understood in terms of the end user needs? Hopefully, all of these points have really brought home to you that we want these tools to be delivered, shared, and reused by the broader community, for use cases other than the ones you're developing the tools on. The next slide is on tool sharing.
And so again, there are review criteria on the tool sharing plan. People have asked: where do I put the tool sharing plan and the software sharing plan? You can put them in the resource sharing section of your application; just please label them "tool sharing plan" and "software sharing plan." So the reviewers will assess: does the tool sharing plan include appropriate detail on modules, parameters, and data sets? Does the plan adequately address documentation, validation, and tool reproducibility? Does the software plan appropriately address goals for dissemination and free availability for use and modification, et cetera? Again, the reviewers are assessing your plans for these things. And that is... I'm sorry, can I ask a question on the tool sharing? Yes, of course. I know that typically in a research proposal you're not allowed to put a URL, but it seems like a lot of the ways these tools will be shared would be online. So for this particular RFA, are you allowed to include URLs in your proposal? I'll let Robert Elliott address that, but I thought applicants were free to include URLs; I think the caveat is that the reviewers are not required to actually click on them. But I'll let Robert address that. This is Robert. It's a long-standing bone of contention, I'm afraid, but for reasons of preserving anonymity, and of security, making sure that applications don't get augmented with additional material through websites, the reviewers are generally instructed not to go to any URLs included in an application. I know this is a problem in these sorts of instances where web utilization is integral, but it's an unfortunate conundrum that we have to deal with. In the review process in general, the policy I follow is to not have reviewers go to URLs.
The only place where URLs are now actually permitted is in the biosketch, and that's just to allow the reviewers a way to get to your list of publications; those are supposed to be deposited in a publicly accessible database, so it's nothing you should have any real control over. That's the long and short of it, unfortunately. It would be preferable if you did not put URLs in your application; again, I instruct the reviewers not to pay attention to them if they're there. There are other ways you can depict the information you're trying to get across. It handicaps you a little, maybe more than a little, but if you can put in screenshots that are representative of the resources, or diagrams, in your application, that's generally how people get around this issue. Just a quick question: you said where to put the tool sharing portion of your application, but I didn't catch it; I think I just missed what you said. Yes, there's a resource sharing section that you can include in the application, and under the resource sharing plans you can include both the tool sharing plan and the software sharing plan. Okay, so now we'll go to the questions that were submitted ahead of time. One more question, please. Yes. What exactly is the difference between a tool and software in this context? Can you give a couple of examples? I would say software is one means of delivering a tool, but the tools we're interested in for this initiative include theories as tools. You can argue that theories are not tools, but in this case we're saying that theories are tools, as are models, computational models, and methods. These tools could be provided in the form of equations or software, however you want to share them; software is just one means of sharing. I completely understand how to share software; I'm not sure how to share tools or equations. It would be in the papers.
That's obviously not what's meant here. I'm confused about what exactly could be written in a tool sharing plan. Remember, it's a plan for how you want to share it. For example, you could say that you don't plan to share it in the first two years because you're still developing it. Another possible sharing plan is that eventually you'll post it on a website, and you'll publish it, of course. You could say that you're going to distribute it to 10 more end users with certain types of expertise, for them to try it out on their own projects, either testing out a theory or... I'm just making things up at this point. It's a plan for how you want to share. You can use different media for sharing: you could provide a curriculum, you could do a webinar, there are lots of social media options. But it's really about getting the word out to the community that you have produced this tool. Another example might be in terms of delivering software. Maybe you're developing a novel algorithm, a new method for principal component analysis or something like that. You would want to thoroughly describe how your algorithm solves the problem and how it would generalize to various kinds of problems in neuroscience. In other words, to deliver the tool, you would show: here's a novel algorithm and here's how it could be used, well described so the reviewers can assess it. Okay, so I'll just try to run through some of the questions that were presented prior to the call. We, of course, welcome international participants. Foreign institutions are welcome to apply as primary applicants or as subcontractors. We cannot point to other types of previously funded grants as examples, because this is a one-of-a-kind RFA. This RFA is a one-time initiative. People have asked whether, if there is future BRAIN funding, we would possibly reissue it; that will depend on the needs of the community and the needs of the BRAIN Initiative.
It's a possibility, but at this point it's a one-time initiative. In terms of data acquisition, which I think I've already talked about, the question is: does this program apply to efforts to obtain data from the sensory environment that a brain has to deal with? I'm thinking about capturing details of motion and position across all of an animal's visual and auditory scenes. We're not limiting what type of brain application you apply your tool to. Again, the purpose of the RFA is to develop the tools; if they can be applied to a sensory environment to better understand how those brain circuits work, you're welcome to submit your proposal. Does the data have to be publicly available? People have asked whether the data you use to develop the tool needs to be publicly available. Ultimately, the tool must be publicly available. If you need the data to validate your tool, or others need that data for whatever reason to use your tool, then that data should be publicly available. If you only need some privately available data to develop the tool, then as long as the tool is publicly available, that's fine. If you have any overlap with a non-NIH proposal, I would recommend that you write it in a very transparent way within the proposal: say that this application has overlap with an NSF proposal, and that should both be funded, you would remove specific aim three, or whatever it is that has the overlap, so that the reviewers also know you would address the overlap after the review. Any other questions on the line? I have some. For those of us submitting to NIH for the first time, is there a place with a list of all the documents? Sorry, I missed the last part of your question. Is there some place online with a list of all the documents we need to include, like which forms to use and so on? Yes: if you do a search for NIH SF424, that's S as in Sam, F as in Frank, 424, you'll find all the forms.
You're also welcome to send me an email and I'll send you a link. Okay. I have a question on the duration of the project. Yes. It is listed as being three years; I'm wondering if there is any kind of flexibility on this. Could it be four, maybe? I'd have to... Jim, did we have any specific...? I think we expected the projects to be three years. If this is a hard limit, is there no way of extending it if needed? No. Okay, thank you. I do have a question about the early stage investigator. Is it possible to apply as an early stage investigator with this particular funding opportunity? You would self-declare yourself as an early stage investigator, and we will use that information in our own programmatic decisions. If you do get funded, I believe you will no longer qualify as an early stage investigator, because you would be a funded R01 awardee. If you're an early stage investigator on a multiple-PI project, the other PIs are not early stage investigators, and you get awarded, you're also disqualified from applying as an early stage investigator for future R01 awards. I have a question. Do you have any guidance on limitations for budgets? Specifically, do you require modular budgets or anything like that? We don't require modular budgets. Budgets are not capped, but we expect them to be around $150K to $250K in direct costs per year. Basically, we expect you to request the budget required to accomplish the work. Okay, thank you. I have a question. Yes? So, suppose a particular lab is able to develop the tools and test them, experimentally validate them, collect the data, and refine the tools, so that essentially the team would be one or two collaborators.
Would that be held against a lab that's not very collaborative with other labs? I'm not exactly sure of your question, but our funding decisions are based on a lot of different priorities, the first of which is your tool and how broadly it could be used to understand brain circuit function. In terms of collaboration, again, you should develop your project team based on what is needed to develop the tool. If you need 10 collaborators, feel free to bring in 10 collaborators. The lab does have the expertise to develop the tools, the quantitative background and expertise, and also the biology. My question is: would that be viewed negatively during peer review, simply because it doesn't have this wide collaboration? Actually, you can... I'm sorry, I can or I cannot? Somebody else is talking on the line; if they can mute their lines, I'd appreciate it. Your question... So this is Jim. Go ahead, Jim. You should be prepared to describe and justify why the team you put together is the best team to attack the problem you're posing. That may or may not be someone in your lab or someone down the hall, but if the best experts in the world to do what you're proposing are in your lab, then you should be prepared to justify that. Hopefully that answers your question. A related question: is it okay to have paid consultants on these applications? Yes, that's fine. Everything should emanate from the tool, the science that you're proposing; whatever types of collaborators or consultants you need should be incorporated and justified. So I have a related question. If you have some people that might be able to evaluate your tools, is there a means to have a letter of support added as supplemental data or something like that? So, are you saying that the people evaluating are not included as end users, or are they end users incorporated on your project team?
It's more that they're end users, but they're not necessarily on the team, the development team. So it would be nice to have them say, in a letter of support, that they will evaluate or are interested in this tool. We've gotten various questions on that particular topic. You're welcome to include letters of support from potential end users. Just remember that, with any additional letters of support, the people named in them would obviously be put into conflict for reviewing your proposal. Alternatively, you could list the types of experts you would bring in to evaluate your tool; you could say that you would bring in these types of experts. Again, put your reviewer's hat on when you look at what you write and consider how much information you would need. If you require specific types of expertise, you may be better off naming the actual person rather than listing the expertise. Hopefully that helps. That's helpful, thank you. I'm sorry, is there a question? Excuse me, regarding the letters of support: do you need letters of support from members of the team, or do they have to be from somebody outside the team? So I think the previous question was referring to people who are not on the team whom you would want as end users; you're welcome to have them provide letters of support. I suppose you're welcome to bring in letters of support from members of the team as well, to describe how... Robert, do you have any advice on that? Yeah, I was just going to chime in; thanks, Grace. I think letters of support have good uses, but I would be very careful about putting in 50 letters of support. First, they're probably not going to have any greater impact on the reviewers than putting in maybe your top five, or ten at the very most. And keep in mind that the people who provide letters of support are probably not going to be asked to review your application.
So pick people that you think are going to be representative of what you're trying to accomplish, and make the point, but don't belabor it by putting in reams and reams of letters of support, because ultimately that can disadvantage you and your application. Think about it. Thanks. I have a related question. I currently name a couple of experimentalists as collaborators. They're not going to be paid consultants, but they both provide expertise and data to develop and validate my tools. What is their status? Is collaborator the right term? Do I need letters from them? Can you elaborate a bit? Go ahead, Robert, if you want to chime in first. If there are people that you collaborate with on other projects but who are not involved in this particular project, a letter of support from them is going to be fine; but if there are people that are actually going to be involved somewhere in your application, then by all means they should be listed as collaborators. And in the budget justification, again, on the budget justification page, you're welcome to list your collaborators even if you're not providing monetary funding to them; you could list them as not being paid but providing the data. I would say that it's all in how you want to justify how you're developing your tool. Basically, what we're saying is that all these possibilities are fine; you just need to make it clear for the reviewers what roles these people are playing. Does that answer your question? If you have follow-up questions, you're welcome to email. Earlier I asked about the early-stage investigator status and how my team is put together. Okay, hopefully you can hear me; I've forgotten the question. If you are a new investigator and the sole PI, and your senior investigator collaborators are not PIs but are listed as co-investigators, consultants, or collaborators, then that's fine: you will be the sole PI. So if you get awarded, then...
I'm not sure if I'm answering your question. You would not be qualifying for new investigator status for a future R01. [The conference is now in talk mode.] I just wanted to make sure, because when I put together my team, I am the only PI, but I will have consultants and subcontracts in my personnel, and I want to make sure that I'm not violating the early-stage investigator policies. The early-stage investigator incentive applies to the regular R01 and regular mechanisms, and so I don't believe you'd be violating any of those policies, because you're responding to an RFA anyway. Okay, gotcha. No, I just wanted to make sure it doesn't hurt. But if you're self-declaring yourself as a new investigator, please go ahead and declare it, so we'll know when we're considering funding. Okay, good.

Can I add anything to that? Can you hear me? Yes. Okay. So, for the BRAIN Initiative, ESI or new investigator status will not be one of the specific criteria; as for all applicants, we consider career stage as one of the components, but ESI status will not be an official factor. [Crosstalk.] I'm sorry. But Grace is correct that if you are listed as a principal investigator, as either a PI or a co-PI, you would lose new investigator status with this award. Gotcha. I understand that. Okay. I understand. Thank you. Okay. Hopefully that answers your question.

This is Robert. Can I chime in with one more thing? That's correct, even if you brought in a similar application later as a new application, not as an amended application. So I think, as the others have said, the consequences are really pretty minimal. Thanks, Robert.

I have another question. Go ahead. And please, if others could mute their lines. Since this is an RFA and not, you know, a typical open call, are there any restrictions on the involvement of federally funded research and development centers?
I think the same restrictions apply, in that if you're referring to somebody from another government agency being part of your project, they cannot apply for salary, because they'd already be funded by the government. But you could apply for equipment and other things. So it would be the same kind of restrictions as for other government personnel or collaborators. The specific question is that, for instance, at federally funded research and development centers, people aren't on salary; they're going out and raising their own funding. In general, it's possible for them to apply to NIH because the calls are open; they're not for specific purposes. However, there are non-competition clauses that have to be addressed for certain kinds of opportunities, and in particular, I'm guessing RFAs would fall into that category. So I'm wondering if there are additional restrictions for federally funded research and development centers. We'd probably need to look deeper into this for you, if you wanted to send me an email with your specific details. But I don't know, Jim and Robert, if you wanted to chime in. My suggestion is that you send us an email with the specifics of what you are deliberating here, and then we can get an answer for you. All right, thanks very much.

I have a question, if we still have time for that. Yes, please go ahead. You said that international institutions were welcome. Yes. And so, we are an international institution with no experience whatsoever in NIH applications. Okay. So we were wondering if you could give us any assistance. For example, if we send you a one-page synopsis of the project, would you give us some feedback on whether we're going in the right direction, or is that just something you wouldn't do? That's true for everyone: we're happy to give feedback in the sense of appropriateness for this call. Of course, Jim and I can't really edit text or anything like that, but we could give you a kind of responsiveness take on your project.
Probably, if you've never applied to the NIH, we want to make sure that you're all registered to apply. I think one of the first questions we received was from a person who wanted to apply from Czechoslovakia and didn't have a DUNS number. I would ask you to send me an email to make sure that your institution is registered with NIH. We are, in fact, registered with SAM. Oh, okay. We have all the necessary registrations. Okay. We wouldn't want you to have your proposal all ready to go, and then you submit it and receive errors because of these administrative issues. That's all. Yeah. Thank you very much. I would encourage everyone to submit early, not to wait until the last minute on the last day to submit, October 21. Actually, some people are already submitting and receiving errors, and it's great to have this extra time to really work out the errors. It could be something very silly, like a zip code having been entered incorrectly or a wrong email address; it's good to have these errors worked out ahead of time so that that's not the reason why your application was not submitted. Okay.

So that is a structure that we've provided for you; you're welcome to use it as you wish. Ultimately, we want it to be a model sharing plan. So, how will you share your model? In terms of the model, do you have modules of the model that would be shareable, with the parameters and the algorithms you're using in the models to be shared? Software-wise, are there any IP issues or interoperability issues to be addressed? The structure of a heading called tool sharing versus software sharing is really for you to use, to help you make it clear to all of us what exactly you're sharing.

Question? Yes? I have one more question, regarding the reviewers. Will these be ad hoc reviewers, or will you assemble a study section?
I'm basically trying to get a better sense of the background of the reviewers, for instance, how mathematically technical they will be, or whether there are any ideas about the composition, like a mix of experimentalists and theoreticians, to guide my application in terms of how technical it should be, in terms of, you know, mathematical formulas and derivations, or whether I should emphasize more the intuition and the experimental side. Robert, do you want to take the lead on answering this one?

Yeah, sure. First of all, it's going to be ad hoc reviewers. This is a special emphasis panel, so it will be constructed specifically for this RFA. The mix of reviewers will reflect the mix of applications, and unfortunately I can't really be any more definitive than that, but I think it's safe to say that there will be a mix of theoreticians, modelers, and tool builders. So I would certainly try to make sure that your application is accessible to everybody, because everybody on the panel will ultimately be scoring it. Of course, it will be assigned to three, and perhaps four, reviewers within the panel, and they'll be selected based on having the most appropriate expertise for your application, but again, you do have to speak to the larger crowd as well. So I wouldn't over-engineer it, is what I guess I'm saying: try to make it as detailed as you think it needs to be to really have the quality shine, but at the same time, don't make it so narrowly pointed that people perhaps a little outside the field can't read it and gain an appreciation for the general motivations behind it.
I would agree with Robert completely, and as I mentioned before, you're welcome to use the cover letter of the application to list the areas of expertise that you wish to be included (again, no names, please, otherwise you'll put them in conflict), and that might help Robert in finding specific experts to review your application. Again, I would write your proposal to be as clear as possible to as many different communities as possible, because it's always a communication issue in terms of what you're trying to say and how the reviewers are interpreting what you're trying to say, and that was the whole reason behind that bullet-point identification. If you are all on the same page about what you're trying to say, then I think that helps with most of the review. So be as clear and simple as possible, but with enough detail to satisfy the reviewers' need to understand where you're coming from with your project.

This is Robert again. I'm going to chime in about conflicts and cover letters. In addition to not naming the names of people that you think could review the application, but just referring to the fields, the flip side is making sure that if there are individuals out there that you feel would be biased in the review of your application (and these have to be really well-validated problems that you've had with people in the past, or differences of those sorts), you include those names in your cover letter. It makes a much more compelling case for me to honor your request to put someone in conflict, and it is ultimately up to me to decide, but I'm much more likely to go with your request if it's stated up front. If you see a name on the roster that eventually gets published and say, ooh, I think that person's in conflict, I do look at those claims with a good deal more skepticism, because this isn't like getting a publication in a journal.
If you make a claim of a conflict, I will probably press you for specific details as to why that conflict should be instituted. So I'd say, if there are any conflicts you have with people in the community out there, make those clear and detailed up front, and I think you'll be more satisfied. Thanks, Robert. Are there any other final questions before we end the call? If not, I'll thank you all for participating in this recorded session, and we will post it online. Good luck with your proposal preparations, everyone. Thank you. Bye. Thank you. Thank you.