Go ahead and get started. Thank you for joining us today. I'm Cliff Lynch, I'm the director of the Coalition for Networked Information, and I'd like to welcome you to the start of week two of the CNI fall 2020 virtual member meeting. We start our themed weeks on Tuesdays and end, as we did yesterday, with a summary session late Monday afternoon trying to synthesize everything we heard during the week. I also want to note that as we start the second week, which is focused on transformations of organizations and professional roles, we are including not just synchronous sessions but also some prerecorded videos, and the ones that go with week two were all released earlier today, so I would invite you to have a look at those at your leisure as well. We're doing things just a tiny bit differently today. We're not going to use the Q&A tool, or at least not much; instead we're going to invite you to put questions in the chat, and we'll get to those as we go along. And Laura Brindle, our presenter from the University of Waterloo, will also be asking you a few things as we go along and inviting you to comment on those questions in the chat. So, with that, I should note that this is being recorded and it will be subsequently available as well. With that, I'll just introduce the topic, which is the evolution of bibliometrics and research impact services, and really deals with the challenges of supporting faculty who are increasingly being called upon to understand, demonstrate, and document their scholarly impact and the impact of their work. Laura is going to talk about how the strategy at the University of Waterloo has evolved in that area, so thank you so much for joining us, Laura. We really appreciate it, and over to you. All right. Thank you very much, Cliff. Yeah, as Cliff said, my name is Laura Brindle. Just a little background on me: I'm the Bibliometrics and Research Impact Librarian at the University of Waterloo.
The University of Waterloo, if you're not familiar with it, is located in Waterloo, Ontario, in the southwest part of Ontario, and we're having lots of wind today. So, if you're in the area, you're probably experiencing similarly odd weather as warm and cold fronts change. I just wanted to thank you for having me at this CNI fall 2020 virtual meeting. And, sorry, I'm just trying to advance my slide here. Okay, there we go. I also have my colleague Allison Hitchens listed here as a presenter; however, she had a conflict. I did just want to mention that Allison is my supervisor, and she's the Associate University Librarian for Collections, Technology and Scholarly Communications. I think this is important because this presentation is a little bit about the structure of bibliometric services at Waterloo, so understanding how my role works matters to that discussion. I also wanted to mention that this presentation is partly based on an ARL practice brief that's going to be published in the next few weeks, hopefully, so some of the information in this presentation will be available in more detail in writing in that ARL practice brief. All right, I just wanted to take a moment to briefly define bibliometrics for those who may not be as familiar with this area of specialization within libraries. Bibliometrics serves as one tool among many, used by universities, ranking organizations, funders, and others to measure research outputs. You might be familiar with these research outputs, such as publications, citations, and authorship. We also use these outputs and metrics to develop a whole host of indicators to look at research outputs and impact. Bibliometrics allows us to do some tracking of research, scholarship, and creative activity within an institution.
They also help us tell our story and promote and amplify our research and its impact. And they can make connections between researchers to help enhance collaboration, or help potential external partners identify research groups or institutions with the expertise that they might be interested in seeking out. However, bibliometrics are not a silver bullet. They should really be used as part of the picture; as our definition explains, it's one tool among many. You really should use bibliometrics with qualitative as well as quantitative information to help tell a broader story. We also have to keep in mind that the metrics are limited by the data set that they're based on, so no analysis will ever capture 100% of the data that's available. The data sets themselves are restricted by the journals that are indexed within them, as well as by how robust the bibliographic data is within those data sets. There are also known disciplinary discrepancies within the data: some disciplines aren't currently well represented, primarily due to differences in scholarship and publishing practices across these various disciplines. And we should be mindful of the biases that can be inherent in the data that concern diversity, equity, and inclusion. For example, we know that authors with female-sounding names tend to be less well cited than authors with male-sounding names. Okay, so getting started with bibliometrics and research impact at the institutional level. Stakeholders, including funders, ranking organizations, and various accountability organizations, are increasingly using bibliometrics as one way to understand research impact. For example, decisions on funding for programs like the Canada Excellence Research Chairs are highly dependent on institutions showing excellence and impact in the areas where the research chairs are awarded.
Another example: in Ontario, the Ministry of Training, Colleges and Universities developed strategic mandate agreements where institutions need to provide metrics, including research metrics, on how they are meeting goals in their key areas. So this makes it essential that institutions understand these measures and how they may be assessed by external stakeholders. The library here has been supporting individual researchers to explore the impact of their individual publications for a number of years. This started prior to 2011, when our optometry librarian developed a workshop and a guide that helped researchers calculate their academic footprint. And I just wanted to note that I've provided some links at the end of this presentation with URLs, so if you're interested in looking at some of the guides or resources that I mention during this presentation, they're listed at the end. Really, the "calculate your academic footprint" process was meant to create a master list of publications and citations, helping individual faculty keep current and accurate citation counts, develop a comprehensive list of those citation counts, and calculate a more accurate h-index. In 2012, the university realized that we needed to have an understanding at the institutional level. Therefore, we formed the working group on bibliometrics, which included the library, our office of research, and our unit called institutional analysis and planning, as well as representatives from all of the faculties. This working group still exists today. There was also an advisory group developed that was made up of the director of our institutional analysis and planning unit, the university librarian, and the vice president, university research.
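For listeners less familiar with the h-index mentioned here: an author has an h-index of h if h of their papers have each been cited at least h times. It's simple to compute from a list of citation counts. Here's a minimal Python sketch, an illustration only and not the library's actual tooling:

```python
def h_index(citations):
    """Return the h-index: the largest h such that h papers
    have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # at least `rank` papers have >= `rank` citations
        else:
            break
    return h

# Example: five papers with these citation counts.
# Four papers have at least 4 citations each, so h = 4.
print(h_index([10, 8, 5, 4, 3]))  # -> 4
```

Databases like Scopus and Web of Science report an h-index automatically, but as noted above the value depends on which journals each database indexes, which is part of why the "academic footprint" process aims at a more accurate count.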
The purpose of the working group is really to assist the university in understanding how bibliometrics are used, to provide resources to support researchers and administrators in using them more effectively and responsibly, and also to support internal strategic decision making. So I wanted to take a moment to break from me chatting at you. I'm really curious to know, among the attendees, whether your institution offers bibliometric services, so you can either say yes or try raising your hand in the participants panel, and give everyone an opportunity to see how widespread these types of services are. Okay, so there's a few here raising their hand. I see some raising and lowering. Okay. Yeah, so it's probably about a third of us that have some formal bibliometric services. Now, I worded this specifically as whether you offer services. There's a broad spectrum of services that happen amongst institutions, where you might consider bibliometric services as something as simple as a liaison librarian helping a faculty member find a citation count for a specific publication, or more broadly for a list of publications. Okay, I see one person here saying they're planning on hiring a librarian in this field. Yeah, and then there's a lot of ad hoc work where you fit it into existing expertise within your units. Okay, thank you. We're doing this without being able to see you all. Okay, so I just wanted to talk a little bit about the white paper, which was one of the first steps the working group took when they were first established. We created a white paper that was really meant to do a thorough analysis of research outputs and the use of bibliometrics.
The development of the white paper was followed by a three-stage consultation process that was really meant to gain input from the various stakeholder groups on campus. The consultation included groups such as the advisory group that I previously mentioned and the working group, as well as our Dean's Council, our assistant deans of research, our faculty association, and undergraduate and graduate student representatives; library staff were consulted as well. The institutional analysis and planning department as a whole was consulted, and then there was consultation within the campus at large. Finally, the white paper went to the Senate Graduate and Research Council, and also went to Senate. I think I mentioned already that the white paper is available within our institutional repository, and there are also links at the end of the presentation to a number of guides that are based on this white paper. The white paper, to talk a little bit more about what it is, is really an extensive literature search and contains detailed information on bibliometric sources, methodologies, and assessments. It was meant to provide some key messages, which I already mentioned a little when I was defining bibliometrics. So it highlights things like the need to work from a basket of measures (this is where that definition comes from) to gain a fuller picture of what research impact really is. It also recommends that you involve the people who are being measured in the process, talks about when to use bibliometrics and when not to, and the limitations to keep in mind when interpreting the data, and includes the reminder that you cannot, and really should not, compare across disciplines. And a note for those of you who might be a little more familiar with the metrics:
There are a whole host of gray areas and exceptions, and ways that you can maybe do things like comparisons, so I just wanted to make clear that it's not an absolute "you cannot and should not"; it really is a "use with caution" type of practice. Okay. One outcome of the campus partnership on bibliometrics, and in support of the transformational research theme of the campus strategic plan, was the recognition of the need for a campus expert. This led to a new specialization in the library. I'll include a link to the job description (actually my job description) if you're interested in more details about what my position entails. The way that Waterloo approached the bibliometrics and research impact librarian role is quite unique. Rather than being envisioned as a support for individual faculty members, my position very much works at the institutional level with key partners, specifically in the institutional analysis and planning (IAP) and office of research groups. Much of my time is spent working with or through these partners, with my work supporting requests that come into IAP or the office of research from executive leaders on campus. I'm really expected to be the campus expert on bibliometrics and the analysis tools. So I'll handle requests for bibliometric data analysis and replication of rankings methodologies, as well as supply individual and group instruction around bibliometrics, altmetrics, and the use of the bibliometric tools. I want to mention that I don't work solo. There are a couple of people within the library who support some of this work, especially some of the more data-heavy stuff, like if there are Python scripts or coding that needs to happen.
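As a hypothetical illustration of that kind of data-heavy scripting, and of the cautious cross-disciplinary comparisons just mentioned: a common approach is to normalize a paper's citations against the average for its field and year, in the spirit of indicators like SciVal's field-weighted citation impact. The sketch below is not a replication of any vendor's method, and all field names and baseline numbers are invented for illustration:

```python
# Sketch: field-normalized citation impact. A paper's score is its
# citation count divided by the average citations of papers in the
# same field and year, so a value of 1.0 means performance at the
# baseline average for that field and year. Data below are invented.

FIELD_BASELINES = {
    # (field, publication year) -> average citations per paper
    ("Engineering", 2018): 6.0,
    ("Optometry", 2018): 3.0,
}

def normalized_impact(papers):
    """Return the mean ratio of actual to expected citations
    across a list of papers (dicts with 'field', 'year', 'citations')."""
    ratios = [
        p["citations"] / FIELD_BASELINES[(p["field"], p["year"])]
        for p in papers
    ]
    return sum(ratios) / len(ratios)

papers = [
    {"field": "Engineering", "year": 2018, "citations": 12},  # 2.0x baseline
    {"field": "Optometry", "year": 2018, "citations": 3},     # 1.0x baseline
]
print(normalized_impact(papers))  # -> 1.5
```

The key design point is that the raw counts (12 vs. 3) are not compared directly; each is judged against its own discipline's publishing norms, which is why the two papers here come out much closer than their raw counts suggest.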
I also provide opportunities for co-op students, our Master of Library and Information Science co-op students, to support some of the work as well. So, why the library? The library is often sought out for this type of work because of our unique position on campus; it makes sense to centralize bibliometrics and research impact at the library because we really bring a deep knowledge of the bibliographic data and the tools. I just wanted to share an example: yesterday I was working with a colleague who was developing some bibliometrics data for the faculty of engineering, and what I was able to help them with was understanding how to use the advanced search methods in Scopus. This is something that is really unique to the library, where, as liaison librarians (I was previously a liaison librarian), we spend a lot of time helping people develop that expertise, so it's something this position really benefits from as well. We have expertise across various stages of the research life cycle and within scholarly communications work. We have an awareness of the disciplinary limitations, as I mentioned before, and we also have a long history of working closely with, and understanding the unique structures of, academic support units, faculties, and departments. And maybe our strongest and most influential aspect is that we are viewed as neutral, from the standpoint that the library doesn't have any high stakes in the impact data. This is really something that is often referenced during planning and consultation meetings. On the flip side, the library really benefits from these collaborations as well. It helps us build partnerships that often mobilize other important work and helps make connections and crossovers.
For example, this work can be connected with our copyright or digital repository work, or more generally with the research services provided through our liaison librarians. This work also raises our profile on campus and allows for valuable engagement and opportunities. Many of the partners we work with are senior-level administrators who heavily rely on, and really appreciate, having a central person who can coordinate and mobilize this really niche area of expertise. And just as another note, it also allows us to use evidence-based collection analysis within the library. Okay, so I just wanted to take another moment. Sorry, I'm just looking at some of the chat going on. If you do have formal bibliometric services, could you enter in the chat where these services are centralized and who you consider your key partners? I think everyone can see the chat, so we'll open it up here a little bit. Participants cannot see the chat. Okay, so the chat might be hiding in the menu of options; I found it was under the "more" area. And Laura, if I can, for just a second: attendees can set their own chat to be visible to all panelists and attendees. Okay. Yeah, they've set it that way so everyone can see it. So I'm seeing some things, yeah, the scholarly communications department. I see "T are"; I'm not familiar with that acronym. Yeah, and the office of the vice provost for research, absolutely. I know a couple of institutions within Canada that have some of this expertise centralized there as well. The University of Michigan has a research impact librarian, and then expertise among the subject librarians, yeah. Excellent. I'm going to move on, but feel free to continue to fill that out; that's really interesting to see.
Because, yeah, the way that we've done it at the University of Waterloo is fairly unique, but not completely unique. Moving on here. All right, so I just wanted to share a really high-level look at some of the projects that have come out of our working group. This is, as I mentioned, our more formal group on campus that has members from our institutional analysis and planning group and our office of research, but also representatives from a number of our faculties and institutes, so it's really meant to be a group that does high-level projects at the institutional level but also allows some of that expertise to trickle down into those specific groups. This all began with the development of the working group and then the offering of the white paper. After the white paper was completed, we were in a strategic plan cycle from 2013 to 2018, so they developed what they called the research metrics framework. This framework was really meant to develop some standard metrics that the institution could use to look at specific research themes that were identified in the strategic plan during that cycle. After that cycle was done, they were no longer doing those specific research metrics frameworks, but we're now in a new strategic plan cycle. So we did a big report through the working group to help identify what they're now calling research areas of excellence at the University of Waterloo, and there are metrics, or bibliometrics, associated with those research areas of excellence. That was done during the planning phase. We're now in the implementation phase, so they're looking at what they're calling research areas of excellence for this plan; these changed slightly.
There are six groups of them, but really they span about 16 different areas of excellence, so there are quite a lot of research areas that we're using bibliometrics to help inform. Of course, I should mention that, as with any good set of indicators, we are using various other indicators, not just the research output metrics that bibliometrics are based upon. You might also find it interesting that in 2019 the working group did a really comprehensive assessment of bibliometric tools. The intention of this report was to review all the existing tools, looking more closely at the ones that we currently subscribe to, but also to capture a lot of the use cases across campus in this one report, and then to have this information to help us decide which bibliometric assessment tools we should subscribe to. That resulted in us continuing to subscribe to the two tools that we already have access to: we initially subscribed to InCites, which is based on the Web of Science, and we have continued to subscribe to that. In 2018, we picked up SciVal, which is based on the Scopus data set, and we've continued to subscribe to that as well, so we have fairly robust access to bibliometric assessment tools here. Part of that was supported through this report, which was taken to our advisory groups and then our provost and approved through that process. All right, so just a little bit about some more recent work that's been really helping to mobilize the understanding and responsible use of bibliometrics and research impact indicators across our campus, which has been the development of... Oh, are we at time? Okay. Should I stop? Maybe wrap up quickly and maybe we have time for a question or two? That'd be great. Thanks. Sure. Okay. So we developed this community of practice; to make it super basic, it's a more informal group to share expertise across campus.
We just have fairly informal meetings with agendas where we do presentations and share information amongst each other, and it's meant to help spread the expertise across campus. Sorry, I'm rushing through these; I really wish I had more time. And this is just a slide about the expansion. Really, the idea of this slide is to show that we're expanding expertise within faculties and there is growing expertise across campus, and I'll just leave it at that. I'm really sorry. And this is just a little bit about what we don't do; I think I covered some of this. And yeah, I'll maybe end here and allow some time for questions. I just wanted to thank you and share my resources. Thank you, Laura. That was a really interesting talk. No, no worries. I think we all enjoyed the questions that you had for the attendees, and there is actually a question in the chat right now. So, I'm going to go back to Anna Park: we're considering cancelling SciVal and getting Symplectic Elements; the Altmetric platform looks interesting; any comments on that? So my understanding is that Symplectic Elements is actually a CRIS system, which is a little bit more like Pure, and the intention of Symplectic Elements and something like Pure, those CRIS systems, is slightly different than something like SciVal. Now, they are based on the same data sets. Well, I mean, Symplectic Elements, I'm not sure what it's based on, but SciVal is based on the Scopus data set, and it's pulling information from that. But the purposes of those two systems are quite different, from my understanding. SciVal really provides those metrics: it calculates those bibliometrics for you and provides really robust sets of indicators. Symplectic Elements, from my understanding, is meant more to display things like individual faculty profiles and make connections between researchers based on their research areas.
So, I don't know if, were you to cancel SciVal, you would get everything that you need from Symplectic Elements, unless you're only using it for the things that Symplectic Elements would supply. You know, we have access to Pure as well, and we're deeply in that conversation right now about what one does over the other and really what information it is that we're trying to pull from them. Okay, thank you. Thank you for that comment. As we are past time now, I think I will go ahead and close out the webinar, just reminding everyone that we will be putting these slides on the schedule page, and I think Laura's contact information is in the slides, so you can contact her through that. I think there are a lot of people who are eager to continue the conversation, and I'm sorry that we have to close it out, but Laura, thank you again for coming and presenting at CNI; it's a really interesting topic. Thank you. We really appreciate it. And to our attendees, thank you for making time out of your day to come to the CNI fall meeting. We look forward to seeing you at other sessions. Take care, everyone. Bye bye.