Welcome to Communicating Results – Sharing Results with Stakeholders – part of the Research and Assessment Cycle Toolkit offered by the Association of Research Libraries and made possible by a grant from the U.S. Institute of Museum and Library Services. This presentation is part of a module that focuses on reflecting, communicating, and acting on the results of library assessment. It describes purposes of and techniques for communicating the results of library assessment projects to stakeholders, including users and library colleagues. We hope the content is useful to library practitioners seeking to conduct assessment projects. At the close of the presentation, you will find a link to a feedback form. Please let us know what elements were useful to you.

Once library assessment practitioners have assessment results in hand and have identified the target audiences and key messages that will focus their communication strategies, a next step is to plan for reporting results effectively and in ways that increase the likelihood of users, stakeholders, and colleagues responding to the results in helpful and beneficial ways. Knowing the purposes of reporting results, anticipating the preferences of various audiences, and using strategies and structures to build reports that speak to audience needs and goals can all be part of an effective communication strategy.

Reporting results is an important element of the overall assessment process. Often thought of as a last step, reporting is actually central to the assessment process, and shortcutting this step can negatively affect the ability of the assessment project to achieve its intended goals. Reporting is important from a process perspective: the reflection required to document and report an assessment helps library assessment practitioners think through their process and results.
The time and attention involved in preparing results to be reported often lead to the generation of new ideas, realizations, and inspirations.

Reporting is also essential from an external perspective. Unless assessment results are reported, it's likely that many people will be unaware of either the assessment itself or the use of its results. Transparency demands that assessment processes, accomplishments, and plans for improvement be shared widely, especially with those involved with or affected by them. Without this kind of transparency, the results of the assessment are less likely to have an impact. Moreover, any lack of transparency in completing and acting on assessments is detrimental to future participants' motivation to engage further in assessment. If an assessment doesn't lead to some kind of action—big, small, or even an informed decision not to act—there seems to be little point in it, from both an internal and an external perspective.

In the assessment cycle, reporting is often concurrent with the interpretation of data and the enacting of decisions. All of these processes—interpretation, reporting, and action—are mutually informing. The reporting process helps interpret data and leads to conclusions that call for decisions to be made or actions to be taken. By engaging with reported results, audiences often offer new interpretations of, or ideas for acting on, the results. And sometimes, tentative ideas for decision-making and action-taking require assessment practitioners to return to the results, the interpretation of results, or the reporting of results to reflect in nuanced ways. Thus, these stages of the assessment cycle tend to be commingled.

Next, let's consider how the audiences for reporting assessment results influence communication designs.
While assessment results are typically generated—at least initially—by assessment practitioners focused on the data at hand, a third piece of the communication mix must be considered in sharing assessment results. Along with assessment practitioners as authors and the results themselves as content, the third element of assessment reporting is the audience as receivers and, ideally, co-creators of reporting projects. Therefore, reporting strategies must make sense for, and be grounded in, audience contexts.

When creating assessment reports, library assessment practitioners should impart information with the intended audience's perspective, context, and values in mind. Before finalizing assessment reports, even an initial round of them, library assessment practitioners will want to check with representatives of an intended audience to ensure that the reports suit the goals, language, formats, level of detail, relevant timeframes, and preferred communication modes of the target audiences.

For example, when reporting to library or institutional administrators, all of these elements should help library assessment practitioners develop reports that resonate with decision-makers. Knowing as much as possible about the individual administrator's needs, perspective, intended goals, language preferences (including style and tone), data formatting preferences, desired level of detail, preferred ways of receiving information, and timeframes for needing the results is essential to ensuring that the message in the report comes through as intended.

As a general framework to start with, this three-part reporting strategy can be helpful. First, provide background about the assessment effort itself.
Of course, this information should be imparted, whenever possible, in advance of final reporting; ideally, it would be available to the administrator before the assessment takes place, especially if the administrator is a key stakeholder in the project. Assessment background information would include what was assessed, what type of method or approach was used in the assessment and why, and how the results will be applied and reported to other audiences. Second, the three-part reporting strategy calls for providing the results of the assessment. This step should also directly answer questions the administrator is likely to have. And third, the process should include a follow-up on the status of improvement efforts based on the assessment results and, eventually, on the effectiveness of those changes.

Finally, let's explore some major strategies for sharing assessment results with users, stakeholders, and colleagues. As with the entire assessment project, library assessment practitioners should keep the end in mind when preparing to report assessment results to various audiences. At the beginning of any reporting document or presentation, be sure to articulate the goals for reporting results. Why are you reporting results to a particular audience? What do you hope will occur as a result of this communication? When designing reporting options, these questions should drive the selection of reporting strategies. If you want library assessment reports to evoke a particular response, understanding, decision, or action, what reporting strategies are most likely to ensure that response occurs?

And of course, in all reporting, library assessment practitioners should strive for clarity and accuracy and use vetting, piloting, or other double-checking practices to ensure that reports align with the actual assessment results rather than reflecting what one might wish the results had said.
In making choices about strategies and formats for reporting, library assessment practitioners need to think through the message they want to convey, the audiences they want to reach, the best presentation formats for carrying that message to the intended audience, and the most suitable structure within the format for doing so.

In general, a typical structure for assessment reporting might look like the elements shared here on the right side of the screen: a title, a summary, an introduction followed by results, conclusions, and recommendations, and any appendices that support the preceding elements. Of course, this structure will look different across various types of reporting methods, as some are shorter than others, less formal than others, and so on. For a longer, more traditional reporting format, an outline like this offers a good starting point for library assessment practitioners. It includes the project title, abstract, acknowledgments, a table of contents and/or list of tables and figures, an introduction, a statement of the problem or need, the conceptual or theoretical frameworks underpinning and guiding the project, a rundown of the design, the results, conclusions, and end matter.

In some cases, an executive summary may be part of a longer project or report; at other times, it might stand alone. Executive summaries are usually a page or two in length and designed for a particular audience. Because they are short, they focus on the key messages and takeaways from the results, usually the ones that will inform the decisions of the intended audience. Executive summaries of assessment projects should include the motivations for the study, major aspects of the assessment, results, conclusions, recommendations, any asks that the reader is requested to act upon, and, of course, limitations.
In planning for assessment reporting, librarians will want to think through, carefully, who the responsible parties are for reporting, the timelines that matter, and potential costs. In addition to library assessment practitioners, administrators, support staff, and participant representatives may be included in reporting. Timelines might include deadlines for decision-making, strategic planning, forming collaborations, budget decisions, and the like. In terms of costs, the time and effort necessary for reporting must be thought through, along with considerations for printing or web layout, accessibility, and other potential resource expenses.

In getting ready to report assessment results, another set of considerations is important. Getting to know your data in preparation for meaning-making and reporting is essential. Library assessment practitioners need to understand their data and how it was analyzed and visualized. If you need help, get it. Start simple, and make sure you understand the implications fully before trying to report them to others. To get to know your data well, consider revisiting related professional literature; looking for patterns; identifying the data that tells you the most about your outcome, question, or need and will be most useful in making improvements; summarizing the most important takeaways or points for each outcome, variable, or concept of your assessment; and eliminating data that turns out not to be so relevant after all. Of course, do a quadruple check to ensure that individual participants cannot be identified in the data, and avoid any other unethical practices in reporting.

Another essential process in reporting assessment results is finding the stories the data tells. In reporting your results, it's a good idea to tell not only the stories of your participants but also the story of the assessment journey itself.
What did you learn, and how did it lead you to your conclusions and recommendations? What interested you about the results? What surprised you? Providing this context and commentary makes your results more understandable. It also allows others to examine the connections you made in reasoning, which provides another check on your conclusions from a user, stakeholder, or colleague perspective.

In telling the stories found in the data or the assessment process, take care not to use jargon or overwhelm an audience with irrelevant information. Check to be sure your visualizations are clear and easily understandable. Among the most important considerations to include are the limitations and flaws of the data and/or the overall process, and any corroborating information, such as multiple methods used to get at the assessment question, problem, or need.

And of course, one of the most important contributions reporting makes to the assessment process is its role in closing the loop by ensuring that understanding is gained, decisions are informed and made, and actions are taken as a result of an assessment process. In reporting, library assessment practitioners can increase the likelihood of the loop being closed by documenting where outcomes were met or not met, documenting considerations or decisions for improvements, and making recommendations for the next round of the assessment cycle.

Once more, in checking a draft assessment report, keep these basic tips in mind. Make as many things short and digestible as possible. Avoid jargon and technical or professional language that a given audience may be unfamiliar with. Use plenty of white space and relevant, not distracting, graphics. Use headings and subheadings, numbered lists, and bullets. Pilot the reporting strategy on members of the intended audience, and keep accessibility in mind.

At this point, it's important to acknowledge that the assessment path doesn't always run smoothly.
For example, sometimes the resulting data is just bad. "Bad" data, so to speak, might be faulty data, inconclusive data, or data that for some other reason shouldn't be used. In these cases, library assessment practitioners can explain what might have gone wrong with the assessment and describe what will be changed in the next iteration. Having said that, the goal of assessment is to make improvements in some way, so it's worth examining the data closely to see whether any improvements can still be made without causing harm by making decisions based on flawed data. As a quick reminder, "bad" data can often be avoided through planning: by consciously choosing an outcome, question, or need to guide the assessment; determining what you need to find out to ascertain whether the outcome, question, or need was met; matching outcomes, questions, or needs with appropriate assessment methods; and checking repeatedly on the links that connect the various stages of the assessment cycle.

In addition to problematic data, it's important to avoid poor assessment report writing. A few tips to keep in mind: Avoid sweeping statements that can't be backed up by evidence. Be precise, not vague; being vague leaves a lot of room for misunderstanding. Be organized throughout. Explain and describe thoroughly, even if those details are relegated to an appendix. Explain all connections and reasoning. Check repeatedly that assumptions and wished-for results are removed; stick to the assessment results as they are. Attribute and cite all your sources. Acknowledge all your partners.

In any assessment reporting, it's important for library assessment practitioners to be prepared. Interested audiences are likely to probe and ask questions about the outcome, question, or need that inspired the project; the methods used; the analysis and results; the conclusions; the recommendations; and any asks.
It's advisable to be prepared to explain, in detail, all of the elements of your project. Audiences who are interested will want to know more. Audiences who are involved in, will be affected by, or need to make decisions about the results of your project should want to know more. So be prepared to deliver.

Though the road to the point of reporting out on an assessment project can be a curvy or even a bumpy one, it's worth it to stay the course. Remember that the purpose is to gain understanding, learn more, make decisions, and take actions to improve library services, resources, and spaces, with the end goal of improving experiences, engagement, and outcomes for users. Communications that connect the dots for users, stakeholders, and colleagues will help everyone participate in the process. Of course, reporting out is not the end, but it is a milestone in any assessment process. Doing it well will enable the final step: realizing the outcomes of assessment.

Thank you for viewing this presentation on Reflecting, Communicating, and Acting on the Results of Library Assessment Projects. Please use the link provided to complete a feedback form on the usefulness of this information for your purposes.