If you want to transform learning, present it as over-the-counter. I'll explain. Data in the hands of educators is a powerful tool for transforming learning. However, data can also transform learning for the worse if educators misunderstand it and thus use it to make poor decisions. This presentation will examine a largely overlooked culprit that interferes with data's ability to help educators transform learning: the manner in which data is communicated to educators. You will learn how to add research-proven supports to your own data systems and reports so that educators using the data can do so with significantly increased accuracy. References to research will appear on slides as I make cited points and will also be listed on the final slides of this presentation.

Let's take a look at the problem we need to overcome. Data-informed decision-making, often called data-driven decision-making, is a process educators employ to transform learning and improve student success. First, educators view data in reports generated by a data system, meaning any computer system educators use to access and analyze student data. There are many different types of data systems, but they all run on a computer and give educators data-based feedback on entities such as students. They generally house data, as a data warehouse does, but they also arrange that data in topic-specific or data-specific reports, the goal being that educators can use those reports to inform decisions. Most educators have access to these data systems for generating and analyzing score reports. Forty-four percent of educators use these data systems directly, but the majority view printed versions of reports that others generate for them. Whether educators are viewing reports on a computer screen or holding printouts in their hands, these reports generally originate in data systems.

Overall, most educators are eager to analyze this data to make decisions, but they cannot make correct analyses if they do not understand how to do so. For example, one of the most comprehensive studies on educator data use was conducted by the U.S. Department of Education at nine school districts known for strong data use. Despite these reputations, researchers found teachers answered only 48 percent of questions correctly when interpreting data. In another study at 13 school districts also considered exemplars of data use, the U.S. Department of Education found teachers achieved only 48 percent accuracy when making inferences about data involving basic statistical concepts. The report noted it is unlikely other districts would perform any better. To improve student learning, educators must know how to correctly analyze student data, yet many do not. Many teachers and administrators do not know fundamental data analysis concepts, and 70 percent have never taken a college or postgraduate course in educational measurement.

The bottom line is that educators' data use impacts students. Data can be a good thing, but when educators misunderstand the data they get from data systems, those misunderstandings cripple data use in school districts. Sadly, there is an abundance of evidence that educators at all levels commonly make incorrect data analyses. This led me to research what is currently being done to improve educators' inaccurate data analyses.
In most literature, the responsibility for improving analyses was placed with the educators themselves. Most school districts employ solutions that fall into two categories. One is professional development, or PD: classes or training to teach educators how to better analyze data. The other is staffing: hires or arrangements such as strong data leaders, data teams, or data experts. PD and staffing tend to benefit data use, and they are definitely recommended. However, neither approach is foolproof for achieving data competence. For example, in one study where teachers received PD in measurement, all teachers struggled afterwards with statistical terms and measurement concepts. Likewise, staff supports do not always operate as intended. Knowledge management research indicates knowledge can be hard to share with others even when the intention to share is there, especially when power or status is involved.

Educators are already doing what they can to improve their data analyses, yet these highly educated and intelligent professionals continue to struggle despite added staff and PD. So we need to consider the tool these educators use for data analysis. There is extensive evidence that data systems do have an impact on educators' data use. They can include supports that likely improve educators' data use, but almost no data systems offer such supports.

So I researched a field where the product is known to feature comprehensive support in the use of its contents: over-the-counter medication, where it would be negligent not to help users understand how to use the product's contents. People using over-the-counter medication can learn how to use the product through varied supports, such as the product's detailed label. Someone wondering "How many pills should I take?" or "Is this medicine for the type of flu I have?" can immediately learn the answers through a variety of means. There is also evidence that the types of added textual support found on over-the-counter medication can be employed with non-medication products, where they also result in improved use. I considered: what if data systems contained such supports, essentially making over-the-counter data, in that data system users would be better supported in the data's use? Could these supports help educators using data?

There is already evidence of benefits for educators when a help system is added. If your data system does not feature a help system, you should ask for one, citing the sources cited here, or even create one yourself, as I once did for a district and for a data system, using help system software like ScreenSteps. There is also an abundance of evidence of the positive impact on educators' data analyses when student data is packaged or displayed more effectively. You should ask what sources your data provider uses to ensure research-based visualization and design practices specific to education. And of course there is evidence that reports need to contain appropriate contents. My own study, however, looked at what happens to educators' data analyses when data systems contain detailed labels next to data, such as report footers, or supplemental documentation, such as abstracts or interpretation guides, to help educators use the data correctly. Most data systems currently report data without added support in how to correctly analyze it.
They fail to include footers offering information pertinent to correct analyses; they fail to offer abstracts, which can be thought of as one-page reference sheets that accompany reports; and they fail to offer interpretation guides, which walk educators through the use of a report's data.

So I conducted a quantitative study to determine the degree to which including three different forms of data usage guidance within a data system reporting environment could help educators' data analyses. I employed a cross-sectional sampling procedure and incorporated responses from 211 educators at all school levels, spanning transitional kindergarten through 12th grade, with all levels of experience, working in varied roles, and at schools with a range of demographics. This study did not rely on participants' preferences or the perceived value of supports. Rather, the study measured how effective varied analysis supports are at improving educators' data analysis accuracy. It found these supports significantly improve educators' understanding of the data, much as including different forms of usage guidance with over-the-counter medication is needed to properly communicate how to use its contents.

All supports investigated in the study had a significant impact on educators' data analysis accuracy. All secondary independent variables concerning school and educator demographics had an insignificant impact on the supports' success. This means that regardless of an educator's role, background, or school site, these supports all proved significantly effective. Likewise, there were no significant differences between the effectiveness of the moderately different formats used for each support; either format garnered similar benefits.

In the control group, where participants viewed the same data as other participants but received no embedded supports, educators answered only 11% of data analysis questions correctly. 87% of them indicated they would have used added supports such as footers, abstracts, or interpretation guides if they had received them.

When a footer was simply present, regardless of whether it was used, data analysis was 307% more accurate. When footers were present, they were used 73% of the time, and when they were used, data analysis was 307% more accurate. When an abstract was simply present, regardless of whether it was used, data analysis was 205% more accurate. When abstracts were present, they were used 50% of the time, and when they were used, data analysis was 300% more accurate. When an interpretation guide was simply present, regardless of whether it was used, data analysis was 273% more accurate. When interpretation guides were present, they were used 52% of the time, and when they were used, data analysis was 436% more accurate. Overall, when any one of these three supports was simply present, regardless of whether it was used, data analysis was 264% more accurate. When supports were present, they were used 58% of the time overall, and when they were used, data analysis was, on average, 355% more accurate.

As the previous slides established, all three supports significantly improved educators' data analyses. In addition, educators reported wanting these supports when they did not have them, and the majority used the supports when they did have them.
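To give a feel for what those multipliers mean in practice, here is a small worked sketch. It assumes the "more accurate" percentages are measured relative to the 11% control-group baseline; that is my reading of the findings rather than a claim the study states outright, so confirm it against the study itself. Under that assumption, "307% more accurate" implies roughly 0.11 × 4.07 ≈ 45% of questions answered correctly.

```python
# A minimal sketch for translating the study's "X% more accurate" findings
# into implied accuracy rates. It ASSUMES the multipliers are relative to
# the 11% control-group accuracy; confirm that against the study itself.

CONTROL_ACCURACY = 0.11  # control group answered 11% of questions correctly

def implied_accuracy(percent_more_accurate: float) -> float:
    """e.g. 307 ("307% more accurate") -> 0.11 * (1 + 3.07) ~= 0.45"""
    return CONTROL_ACCURACY * (1 + percent_more_accurate / 100)

findings = {
    "footer present": 307,
    "abstract present": 205,
    "interpretation guide present": 273,
    "interpretation guide actually used": 436,
    "any one support present": 264,
}

for support, pct in findings.items():
    print(f"{support}: {implied_accuracy(pct):.0%} of questions correct")
```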
I also saw that educators struggle with data analyses even when they serve in leadership roles, consider their data analysis skills proficient, have received data analysis PD, or have taken graduate-level educational measurement courses. Each of the study's analysis supports proved effective when used with any of the report types and in answering any of the data analysis question types. This implication is supported by the support success findings noted earlier, combined with the fact that there were insignificant differences in educators' data analysis accuracy question to question and report to report. These benefits hold implications for data-informed decision-making, of which data analysis is a key step. The findings of this study also fill a void in the field's literature by providing samples of data analysis supports proven to be effective.

So let's take a look at how the supports can be structured. A footer specific to that particular report's data should be present at the bottom of each report. The footer should communicate only the most vital information an educator would need to analyze that particular data correctly, such as stopping him or her from making a mistake educators commonly make when analyzing that report's data. Since users who are analyzing the data incorrectly often don't know they are analyzing it incorrectly, the footer should not be easy to overlook: it should match the font of the rest of the report rather than shrinking into fine print. For example, if the report's data is displayed in size 10 Arial font, the footer should be displayed in size 10 Arial font. It should also follow length guidelines and aim for brevity; otherwise, it will not be read. Good guidelines are 224 to 324 characters with spaces, or one to three lines of text.

Report-specific abstracts and interpretation guides should also be present, accessible via links from the report. They should be printable, and educators should be able to save them as PDF files. Free templates are available on my website, and they will guide you or your data system provider through the creation of a report-specific abstract and interpretation guide. All abstracts you create and all interpretation guides you create should be consistent in appearance and in the type of content they provide, as this makes them easier for educators to use.

Next, we'll look at the type of content they should contain. At the top of an abstract, you have the title and description, as well as an image of the report so it's clear which report the abstract should be used with. Then you have the report's purpose, such as a list of key questions the report can help educators answer. In giving the report's focus, the abstract can answer who the intended audience of the report is, what data is reported, and how the data is reported. Finally, the warning addresses whatever concept educators most commonly misunderstand when analyzing data in that particular report, and it should provide educators with the correct way to avoid the common error.

As for a report's interpretation guide, it's like an expanded version of the abstract, since some users need more guidance than others and you want to support all types of data users. Thus, the first page of a report's interpretation guide is identical to the report's abstract, though page numbers have been added and it is labeled as the interpretation guide. After that page, you'll have an additional page or two or three, whatever you need to walk an educator through the use of the report.
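Before we turn to those added pages, here is a minimal sketch, assuming a Python-based data system, of how these supports could be modeled in code. Every name here (Footer, Abstract, InterpretationGuide, guideline_violations) is a hypothetical illustration of the guidelines just described, not a structure from the study or from any particular product.

```python
# Hypothetical models of the three over-the-counter data supports,
# reflecting the guidelines above: footers matching the report's font and
# staying within 224-324 characters, and abstracts/interpretation guides
# with the sections described in the presentation.

from dataclasses import dataclass

@dataclass
class Footer:
    text: str           # only the most vital, report-specific guidance
    font_family: str    # should match the report body, e.g. "Arial"
    font_size_pt: int   # should match the report body, e.g. 10

    def guideline_violations(self) -> list[str]:
        """Check the footer against the 224-324 characters-with-spaces guideline."""
        issues = []
        if not 224 <= len(self.text) <= 324:
            issues.append(
                f"footer is {len(self.text)} characters; aim for 224-324 (about 1-3 lines)"
            )
        return issues

@dataclass
class Abstract:
    title: str
    description: str
    report_image: str        # image of the report so users can match it to the report
    purpose: list[str]       # key questions the report can help educators answer
    focus: str               # intended audience, what data is reported, how it is reported
    warning: str             # most common misanalysis and how to avoid it

@dataclass
class InterpretationGuide:
    first_page: Abstract                 # identical to the abstract, plus page numbers
    how_to_read: str                     # page 2: instructions for reading the report
    essential_questions: dict[str, str]  # each purpose question -> a walkthrough
    more_info: str                       # where to turn for further assistance

# Example: a percentile-rank footer that passes the length check.
footer = Footer(
    text=(
        "Percentile ranks show how a student scored relative to peers, not "
        "the percentage of questions answered correctly. A student in the "
        "75th percentile scored as well as or better than 75% of the norm "
        "group; it does not mean he or she earned a 75% score."
    ),
    font_family="Arial",
    font_size_pt=10,
)
print(footer.guideline_violations())  # [] -> footer complies
```

A data system provider could run a check like guideline_violations as part of report publishing so that overlong footers are caught before educators ever see the report.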
Let's take a look at those added pages. At the top of page two, you have instructions on how to read the report. After that, you have the essential questions section, which may span multiple pages. Basically, you take each question featured in the first page's purpose section and walk the reader through how to use the report to answer that specific question: show the user where to look on the report, what to look for, and so on. Examples are particularly helpful. At the end of the interpretation guide, add a more info section to point the educator in the right direction for assistance with any likely needs he or she might have.

Since all three of these supports significantly improve educators' data analyses, it is recommended that all three accompany each report you or your data system provides. Educators are encouraged to take whichever steps are appropriate for their roles and circumstances to capitalize on the benefits of the three over-the-counter data supports investigated in this study. These steps involve encouraging or requiring data system providers to add supports, promoting related dialogue in educator communities, and personally adding supports to reports when none are otherwise provided. Ideally, you can persuade your data system provider to add them with evidence such as that provided in this presentation. Data system and report providers, such as data system vendors and also district staff who maintain in-house data systems, are encouraged to create a report-specific footer, abstract, and interpretation guide for each of the reports they provide. They should use the guidelines, examples, and templates provided in relation to this study and should give users direct access to the supports. After all, research has shown footers, abstracts, and interpretation guides improve educators' data analysis accuracy by no less than 205% and by as much as 436%.

We should continue to provide PD and staffing to help improve data analysis, but we owe it to our students to also provide these supports to educators. This can cost us nothing yet give us a significantly better shot at using data to transform learning. This concludes my presentation; references are listed on the subsequent slides. Thank you very much for your time and for all you do for students. I'm Jenny Grant Rankin.