Hello, my name is Laura Bredel from the University of Waterloo, where I work as the Bibliometrics and Research Impact Librarian. It's such a pleasure to be able to share with you some insights into the Bibliometrics and Research Impact Services at the University of Waterloo today and, more specifically, to highlight the collaborations that make these services such a success. This presentation is one of two recordings covering collaborations and services related to bibliometrics and research impact, so I highly recommend that you also check out the presentation recorded by our colleagues at the University of Illinois Urbana-Champaign to learn more about their great work. Although our official presentation information includes the presenters here on this slide, including myself and my supervisor Allison Hitchens, I'd like to highlight that in this presentation you will be hearing directly from some of our colleagues at Waterloo with whom we collaborate in various ways. We hope you will find the ways that they are working with and applying bibliometrics and research impact in their home units inspiring.

I'd like to take just a brief moment to introduce you to the University of Waterloo. We are a fairly young institution by university standards; however, we are one of the largest universities in Canada, with a student population of over 42,000. We grant undergraduate, master's, and PhD degrees, and although the university is research intensive and known for innovation, it also has an international reputation for its comprehensive co-op program.

We'll now focus on bibliometrics at the University of Waterloo, where I will give you a brief overview of bibliometrics and our service structure. In 2016, the University of Waterloo published the white paper Measuring Research Outputs Through Bibliometrics, which we continue to use as a guiding document on campus for the responsible use of metrics. In this document, bibliometrics are defined as one tool among many used by universities, funders, ranking organizations, and others to measure research output. These metrics are used to help answer questions related to our individual and institutional research strengths and potential partnerships, helping to tell our story in order to promote or amplify our research, and also questions around who is citing our research and where we have focused our research and other creative activities.

However, we cannot talk about bibliometrics without acknowledging their limitations. We know that it is essential that these metrics only be used as part of the picture; they are not the entire research impact story. There are limitations to the data sets that we have access to. There are known disciplinary discrepancies, not only in the way research is captured in these systems but also in how the cultural norms within disciplines affect the types of research outputs they consider most impactful. And lastly, there are well-known biases in bibliometric data that are a result of inequities in the behaviors of researchers; for example, we know that researchers with female-sounding names are cited less than those with male-sounding names. For these reasons, we always preface discussions and reports by stating the limitations of the data, defining the metrics, and describing the methodology and sources of data.
At Waterloo, the library works closely with our key stakeholders from the Office of Institutional Analysis and Planning as well as our Office of Research. We also have a formal group on campus that mobilizes and advises on bibliometrics and research impact. The Working Group on Bibliometrics is made up of representatives from all six faculties and several of the research institutes and centers on campus. The individuals involved in the working group are primarily associate deans of research, people in similar positions, or their delegates. The Associate Vice-President, Research Oversight and Analysis, chairs this group. Recommendations and reports created through the working group are advised on and approved through the Advisory Group on Bibliometrics, made up of the Associate Provost, Institutional Data Analysis and Planning; the Vice-President, Research and International; and the University Librarian.

We also established a campus-wide bibliometrics and research impact community of practice in response to the growing interest and expertise at Waterloo. We recognized that information and skill sharing was going to be key to our service model. So far, the presentation-and-discussion style meetings have been highly appreciated. They have not only been strengthening the understanding of bibliometric tools and uses across units, but have also identified common areas of interest and concern, such as how COVID might be impacting the research productivity of various groups differently, and the common challenges of author disambiguation and keeping researcher lists up to date.

I'm now going to turn the presentation over to my colleagues from our Institutional Analysis and Planning office, the Faculty of Mathematics, and the Faculty of Engineering. From them, you will hear some really interesting applications of bibliometrics at Waterloo.

Hello, my name is Janet Carson. I work in Institutional Analysis and Planning at the University of Waterloo. I'm going to take you through a couple of examples of institution-level uses for bibliometric data. The first example uses the Leiden Ranking, which we use in Waterloo's strategic plan to monitor the percentage of our publications with industry partners. We compare ourselves with Canada's U15 universities, a group of research-intensive universities. The 2020 Leiden results show the percentage of publications with industry partners based on Web of Science data between 2015 and 2018, and as you can see on this slide, Waterloo publishes 6.3 percent of its publications with industry partners.

The next example monitors Waterloo's ranking results with one of the big three ranking organizations, QS. This slide shows Waterloo's results from 2016 to 2020 on the citations indicator in the computer science subject area, compared with our U15 peers. The citations indicator is worth 15 percent of the ranking in computer science, but this varies by subject. The citations indicator measures the number of citations per paper: the top-rated institution receives a score of 100, and all other institutions are scored relative to that top score.

The last example shows some data on Waterloo's areas of strength and has also informed strategic discussions at the university. This chart uses data from SciVal between 2015 and 2019. The data can be interpreted by first looking at the research areas with the most documents, in the rank column, indicating the productivity of the research area, and then looking at the citation impact.
Research areas that are darker green have relatively high impact in comparison to the other Canadian universities in this comparison. For example, the research area materials science has high productivity and high impact for Waterloo. However, this is a top-20 list, so all of the research areas could be interpreted as being highly productive; for that reason, we pay attention to any of the research areas with relatively high citation impact in this list. Thank you very much.

Hi, I'm Martha Polds. I'm the Director of Planning for the Faculty of Mathematics at the University of Waterloo, and I sit on the university's Working Group on Bibliometrics and the community of practice. I've created dynamic dashboards in Power BI to share bibliometric data with the Faculty of Mathematics leadership team and with our research advisory committee. To put this project in context, I curate dozens of measures and indicators about our students, faculty, staff, alumni, research, and teaching to support decision-making and to track progress toward our strategic goals. Because it is important to include many indicators in order to have a robust picture of our research portfolio, bibliometric data join information on funding, rankings, and graduate student-to-faculty ratios among the measures included to track our research.

My data source for these dashboards is SciVal. You can see from the screenshot here that I have an entity created in SciVal, comprised of our regular faculty members mapped to their home department or school. This allows for the creation of dashboards and internal analyses at the overall faculty level and for each unit. As I mentioned, one use of this SciVal entity is to produce data to populate a dynamic dashboard in Power BI that I then share with faculty leadership.

I apologize if this slide is a bit noisy. The current bibliometric dashboard is comprised of five unique themed pages, which obviously couldn't fit on one slide, so I've tried to include a few different elements from across those pages. There are some common features across the dashboard. Trend data is presented in overlapping five-year increments, and deeper dives into the current context always focus on the most recent complete five-year period. Every page also includes detailed notes; these include information about the data source, timing, and definitions, as well as cautions and explanations to ensure that the data isn't misinterpreted by end users. At Waterloo, the Faculty of Mathematics is home to many disparate research disciplines, from computer science to pure mathematics, which have very different citing cultures and practices, so it's important that these notes articulate the potential impact of those disparities on the data. Every dashboard also includes a drop-down slicer that allows users to review the information for the Faculty of Mathematics as a whole or for any given academic unit. The diverse disciplines that make up mathematics at Waterloo, coupled with size disparities among our five contributing academic units, make this level of disaggregation essential to making meaningful use of the data.

Each page of the dashboard addresses different metrics and indicators. These include scholarly output, including raw counts and the proportions in top citation and journal percentiles; citations, again including raw counts as well as a normalized count per publication; the field-weighted citation impact, both average and distribution; various collaboration indicators; and publication and citation counts by subject area and subcategory.
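To make these indicators a little more concrete, here is a minimal pandas sketch of how a few of them could be derived from a publication-level export. The file name and column names (Year, Citations, FWCI, InTop10Percentile) are hypothetical placeholders for illustration, not SciVal's actual export schema or the dashboard's data model.

```python
import pandas as pd

# Hypothetical publication-level export: one row per publication, with
# illustrative columns "Year", "Citations", "FWCI" (field-weighted citation
# impact), and "InTop10Percentile" (True if the paper is among the top 10%
# most cited for comparable publications).
pubs = pd.read_csv("faculty_publications.csv")

# Deeper dives focus on the most recent complete five-year period.
window = pubs[pubs["Year"].between(2015, 2019)]

indicators = {
    "scholarly_output": len(window),                           # raw publication count
    "total_citations": int(window["Citations"].sum()),         # raw citation count
    "citations_per_publication": window["Citations"].mean(),   # normalized count
    "average_fwci": window["FWCI"].mean(),
    "share_in_top_10_percent": window["InTop10Percentile"].mean(),
}
print(indicators)
```

A table like this, computed per academic unit and per five-year window, is the kind of input a Power BI slicer can then filter by unit.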
The end users for these dashboards include the Dean, the Associate Dean, Research, department chairs and the school director, and members of the research advisory committee. To date, they've been used, along with other research indicators, to better understand our research portfolio and to monitor progress over time and toward our strategic goals. They've also been used to seed discussions around our international standing and strategic directions. End-user testing and feedback were incorporated in determining what would be included in the dashboards, and I anticipate that they will continue to evolve based on ongoing user feedback. Thanks.

Hi, I'm here today to talk about the Times Higher Education Impact Rankings 2020: a University of Waterloo Faculty of Engineering case study using Power BI. I work in the Engineering Research Office as the Research Analysis and Proposal Development Officer. The Faculty of Engineering at the University of Waterloo was ranked among the top 50 engineering schools in the world in the QS 2020 World University Rankings. We have over 12,000 undergraduate and graduate students, 13 percent of whom are international, 348 faculty, eight departments, and over $96 million in research funding.

The Times Higher Education Impact Rankings is the only university ranking in the world that assesses universities against the 17 United Nations Sustainable Development Goals. It demonstrates the difference that universities are making in the world we live in. Universities that provide data on SDG 17 and at least three other goals, teach undergraduates, and can be validated are invited to apply. Universities are ranked using the four indicators of research, outreach, stewardship, and teaching. This case study focused only on the research indicator. The ranking occurs every year, and currently two cycles have been completed.

Microsoft Power BI is a business analytics service provided by Microsoft. It provides organizations with business intelligence and strategic planning capabilities. Clients need to purchase a Power BI service or report license, while report builders need to download the free Power BI Desktop to do the back-end data analysis. How it works is that data analysis is completed in Power BI Desktop, and then the reports are published to the Power BI cloud service in the form of interactive dashboards. It is a highly effective and secure platform, with row-level user restriction features available for all reports in the Power BI service cloud.

The purpose of this project was to visualize large amounts of data for strategic planning purposes in the Engineering Research Office. We were looking specifically at engineering's contributions to the University of Waterloo's Sustainable Development Goal publications over time. We wanted to understand engineering research excellence on the Sustainable Development Goals over time, and we wanted to understand which faculty members were having a high impact on the SDGs. We chose Power BI because of the easy connection to a wide range of data sources; it also uses a highly efficient columnar database. The visualization and reporting functions are highly intuitive, it has an interactive data interface, and we wanted to be consistent with other initiatives using Power BI across campus so we could build out models at scale if necessary.

How we used Power BI for this project, basically, was to export the data, store it in Excel workbooks, upload the workbooks into Power BI Desktop, create our analysis in Desktop, and publish our reports to the Power BI service.
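As a rough sketch of the "export, store in Excel, load into Power BI Desktop" step, the snippet below writes one worksheet per Sustainable Development Goal from a hypothetical publication export. The file names and the SDG, Department, Title, Authors, Journal, and Citations columns are illustrative assumptions, not the project's actual export format.

```python
import pandas as pd

# Hypothetical export: one row per publication-SDG pairing, with illustrative
# columns "SDG", "Department", "Title", "Authors", "Journal", and "Citations".
pubs = pd.read_csv("engineering_sdg_publications.csv")

# Store the data in an Excel workbook with one sheet per Sustainable
# Development Goal, ready to load into Power BI Desktop as separate tables.
with pd.ExcelWriter("sdg_publications.xlsx") as writer:
    for sdg, group in pubs.groupby("SDG"):
        # Excel sheet names are limited to 31 characters.
        group.to_excel(writer, sheet_name=str(sdg)[:31], index=False)
```

From there, a workbook like this can be connected to in Power BI Desktop and the resulting report published to the Power BI service, as described above.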
The Power BI Times Higher Education Impact Ranking report contained a report overview, overall data per Sustainable Development Goal, department data per Sustainable Development Goal, publications, and the top 10 publications for each Sustainable Development Goal. Here you'll see some screenshots of the different parts of our report. The first page we created covered the report purpose as well as the report overview, the specific data used, and any other data notes that we had. You can see over here on the left-hand side that there's a very easy table of contents to follow, and it shows you which page you're currently on in the report.

The second page we created was an overall data page. This was a very interactive page: users can click on each one of the Sustainable Development Goals in the slicer, and each of the four data areas will change depending on which Sustainable Development Goal is chosen. In these two graphs, we show the publication count and the distinct publication count as a percent of total University of Waterloo publications. We also have our distinct publication count in the top 10 percent most cited publications, and the proportion of engineering publications in the top 10 percent. This overview gave us a really good way to understand what share of total University of Waterloo publications engineering publications were making up, as well as research excellence within our engineering publications, by showing how many of those publications were falling in the top 10 percent most cited.

The next page was a publications page. It included the same standard slicer for the Sustainable Development Goals, but users could choose to view just total publications, or they could flip over here to this button for the top 10 percent publications. Using this publications button, depending on which Sustainable Development Goal was chosen, you can see which departments had the most publications in that Sustainable Development Goal. One other feature that was very handy for us on this page was the option to right-click on a bar and see only the publications for that specific department. So this is Systems Design Engineering: when we right-clicked on this bar, we could see all the Systems Design publications showing up, and among the metrics available for this specific bar we had the name of the publication, the authors, the journal title, and the citation count. This was a very easy way for users to understand exactly which publications were coming out of each department.

The last page we had was a top 10 publications page. Similar to the page I just showed you, when we click on each of the different Sustainable Development Goals, it will auto-populate which departments have publications within that Sustainable Development Goal. We can then quickly look at this top 10 list and see which departments have the publications with the highest citation counts out of the top 10. Generally, this gives us a really good perspective on where the top 10 publications in engineering for each Sustainable Development Goal are coming from.
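To illustrate how a "top 10 publications per SDG" view like this could be assembled from the underlying data, here is a minimal pandas sketch. It assumes the hypothetical per-SDG workbook from the earlier snippet, with illustrative column and sheet names rather than the report's actual data model.

```python
import pandas as pd

# Read every sheet of the hypothetical per-SDG workbook into a dict of
# DataFrames keyed by sheet (SDG) name.
sheets = pd.read_excel("sdg_publications.xlsx", sheet_name=None)

top10_by_sdg = {}
for sdg, df in sheets.items():
    # Keep the ten most-cited publications for this SDG, along with the
    # fields surfaced in the report: title, authors, journal, department,
    # and citation count.
    top10_by_sdg[sdg] = (
        df.sort_values("Citations", ascending=False)
          .head(10)[["Title", "Authors", "Journal", "Department", "Citations"]]
    )

# Example: inspect the list for one goal (sheet names are assumed here).
print(top10_by_sdg["SDG 7"])
```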
So we found, in general, that Power BI was a very helpful tool to develop for projects such as this, because there were so many components of data analysis that we wanted to do, but in the end there were really only a couple of high-level visualizations that we wanted to provide for an audience such as the Dean of Engineering or the Associate Dean of Research in Engineering. So thank you very much, and that's all I have today.

Wow, that was so great. Clearly, bibliometrics can be viewed through a variety of lenses, as we've just learned through Janet's institution-level applications and Martha's and Wendy's faculty-level applications, which looked at metrics from some very unique angles. And as we stated earlier, we only use bibliometrics as part of a basket of measures, but they can certainly help to answer some interesting questions about the research at an institution. It is these types of use cases that we share with our colleagues through our working group and our community of practice, all with the shared goals of increasing bibliometrics expertise at Waterloo, using bibliometrics responsibly, and breaking down silos so we may all work with and learn from one another. I hope you enjoyed this presentation, and I welcome you to reach out to myself or my colleagues here at Waterloo if you have any questions or comments. I'd also like to encourage you once again to check out the presentation by our colleagues from the University of Illinois Urbana-Champaign, if you have not done so already. Thank you.