Hi, I'm Andrea Scott, and this presentation is Beyond the Metrics, a look into a program review process for sustaining OER. I'm the OER coordinator for the Office of Learning Advancement, and I'm joined by my colleague, whom I'll ask to introduce herself.

Hi, my name is Jamila Alani, and I am the program review specialist at Salt Lake Community College, out of the Office of Institutional Effectiveness.

Great. Can we move to the next slide? Salt Lake Community College is Utah's largest college with the most diverse student body. It serves more than 60,000 students on 10 campuses across the Salt Lake Valley. Salt Lake Community College is located on the Native American shared territory of the Goshute, Navajo, Paiute, Shoshone, and Ute peoples. We honor the original ancestors of this land and also offer respect to the other tribal communities. We acknowledge this history to cultivate respect for, and advocate with, our Indigenous students and the communities still connected to this land.

Open SLCC is a large-scale OER program with an estimated $21 million in student cost savings across 11,730 sections to date. Open SLCC partnered with the Office of Strategic Analysis and Accreditation to perform a non-instructional program review. Our goal in this presentation is to walk you through the steps of the SLCC program review process and to share tips, lessons learned, and tools and resources. Can you move to the next slide?

I want to share with you some samples of what our tracking looked like in the beginning. These images tell the story of our assessment journey and document our first attempts to track and share data. At the time, we didn't know how to track, or even what insights we hoped to gain from tracking. We knew tracking data was crucial, but we didn't have a set data plan or a standardized methodology. So we started with what we knew and gathered as much non-invasive data as we could. These methods are basic and simple, but sharing them demonstrates that everybody starts somewhere, and this is our beginning. Can you move to the next slide, please?

Since the beginning of our initiative, our program has grown and our metrics have grown as well, but our assessment efforts have lagged. Starting in 2019, the OER program underwent several changes and transitions in leadership. In the summer of 2021, on the recommendation of the Open SLCC advisory committee and college leadership, the program was moved from Faculty Development and Transformational Educational Initiatives, under Institutional Effectiveness, to the Office of Learning Advancement, under Academic Affairs. The OER coordinator now leads the Open SLCC team, a cross-institutional collaborative team consisting of members and support from Library Services, Academic Affairs, Faculty Development and Transformational Educational Initiatives, and Student Services.

At the time of our transition, no official assessment or audit had been conducted on the program. Under new leadership, and with additional focus on OER, we made the decision to undergo a program review. The goal was to examine our program goals and outcomes; to explore how the program aligned with the institution's strategic plan; to reexamine our vision and statement of purpose; to develop methods for better understanding our strengths, weaknesses, and areas of opportunity for growth; and to provide an ongoing plan for assessment. Can you move to the next slide? So, how can program review assist OER programs with sustainability?
There are several thoughts here. Program review can provide insights that can be shared with your institution and the broader community. It encourages reflection, transparency, and accountability, and it encourages asking what's working, what's not, and why. It can be a strategy to promote further institutional buy-in, which is something OER programs do at times struggle with, so it shines additional light on that. It is also an exploration of further qualitative and quantitative assessment tools. Managing an OER program can be daunting, balancing limited resources against program needs, and the program review process provides an alternative view to help balance those efforts. Lastly, it brings in diverse levels of expertise and experience to enrich the quality of the feedback and results. Next slide, please.

I would like to go over the basic structure of program review here at Salt Lake Community College. Enhancing program review at our college was part of our strategic plan, to align with our goals. We saw a great need to acknowledge our different programs and their intended outcomes: how are these programs working together to better serve our community, our students, our faculty, and our staff? When we put this program review process in place, we wanted to see how all the programs fit together to serve our outcomes and goals. If everyone works independently, there's little cohesion, and we won't meet our goals or our outcomes. That is why our program review process was put in place.

This also helps us understand the progress of each department: what has the progression been over the past couple of years, and what are the needs of our students? How do we keep up with those adaptations? How can we integrate different departments and programs so they work together to meet their goals and needs? What kind of quality are we putting out there? Are we efficient in our work, and are we being effective? And most importantly, how are we serving our customer, the student? Are we meeting their needs? Are we hearing their voice? Are we working together as we should?

So what is a non-instructional program review? We have broken it into three components. The self-study is a lengthy document put together by the director or coordinator, giving the program's history, current projects, needs for improvement, and points of pride. The external review is a third party that comes in, evaluates your program, and gives you recommendations. Lastly, the action plan is your roadmap: what do you plan to do? We like to condense this into what can be done in about a two- to three-year time span.

The self-report, or self-study, is all of the information gathering, data collection, and program evaluation, using your expertise as to what's working and what's not, and where you want to be, defining your program and your purpose. This also relies heavily on defining your outcomes. Sometimes we get so goal-focused that we forget to ask: what is our outcome? What are we looking for to see whether the program is effective? We really want to highlight that. What are your department-level goals? What does resource allocation look like, your personnel, your budgeting? Where are you putting all of your efforts?
Are you wasting effort in areas that aren't that valuable to your students, your customers, your stakeholders? How can you better reallocate your resources? What are your areas of improvement, and how can they be improved? Once you start digging, you will find so much; it might be overwhelming and exciting at the same time. But stay focused on what you want to work on. Data and metrics are very important to include, and we'll go into that in more detail later on. As Andrea explained, OER has partnerships with library services and with faculty development: who do you work with? Are you meeting their needs, and are they meeting yours? Is the relationship working? What gaps are there, and how can you improve it? Identify the stakeholders within your program: who are you serving, are you meeting their expectations, and how can you find out whether you are?

External reviews: here at Salt Lake Community College, we require at least two external reviewers, one from in-state and one from out-of-state. This is the director or coordinator's choice, but the reviewers should obviously come from a cross-functional area related to your program, so they are knowledgeable about industry trends and can give you recommendations and input that will be valuable to you. The director or coordinator sets up meetings with various stakeholders. At times the director will be present, and at times we request that they not be, so that people may speak freely and we can get the most out of the analysis. This can be done on site or virtually, whichever you choose.

Lastly, the action plan is a compilation of the two processes: what do you plan to do in the next two to three years? What do you believe is a good roadmap to follow? It should address the major concerns from your self-study and external review. At Salt Lake Community College, we do a three-year follow-up to the program review, where we come back to the program and go over the action plan: where are you on this? Do you still want to stick to this plan now that you're actually working on it? Do you see it as possible? What do you want to change about it? Did you get the resources and assistance you needed from the college to complete your action plan, and if not, what do you need? We report this to our cabinet, our president and vice presidents, and our office presents on the program's behalf to show their needs and where they are in meeting their objectives.

Now, there are various ways to go about analyzing and improving your program. One thing to be mindful of is qualitative versus quantitative data. One misconception is to say, "I have no data; I have nothing to go off of." You always have something to go off of, whether that's numbers and tracking, or your expertise and intuition about how your program is going. Quantitative data would be things like: how many students are using OER resources? Do they come back semester after semester? Are they recurring? Qualitative would be more like the perceived outcome that OER assists in student completion. But does it really? How can we further investigate to answer that question?
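As one illustration of what that further investigation might look like, here is a minimal, hypothetical Python sketch. The section data, column names, and the simple group comparison are our assumptions for this example, not SLCC's actual data or method; a real analysis would also need to control for course, instructor, term, and student demographics.

```python
# Hypothetical sketch: comparing completion rates between OER and
# non-OER sections. All values and column names are invented for
# illustration only.
import pandas as pd

sections = pd.DataFrame({
    "section_id": [101, 102, 103, 104, 105, 106],
    "uses_oer":   [True, True, True, False, False, False],
    "enrolled":   [30, 28, 32, 29, 31, 27],
    "completed":  [27, 24, 29, 23, 25, 21],
})

# Per-section completion rate, then the average within each group.
sections["completion_rate"] = sections["completed"] / sections["enrolled"]
print(sections.groupby("uses_oer")["completion_rate"].mean())
```

A gap between those two averages is only a starting point: it suggests where to look next, not a causal answer.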
If you want more data than you have, create a data collection plan so that you can at least get it in the future. Use what you have now, and work toward what you would like to know later. Simple measures such as surveys or focus groups will give you a lot of feedback. Have focus groups with your faculty: why are some faculty members committed to working with OER while others are not, and how can you meet their needs to further your program and your relationships? Determine your goals and your outcomes. Andrea will share what the OER vision at Salt Lake Community College is, which is also an outcome: what we should be seeing from this program. How can we know that it's effective? What indicators are we looking for? Personally, I draw upon Lean Six Sigma program management tools to assist directors and coordinators through this program review process: how to measure, analyze, improve, and set controls in place. These tools help us understand one another, pick apart the various aspects of the program we want to improve, and break the work into sections so we can get the results we want.

Okay, so I'm going to discuss the highlights gained through the Open SLCC self-study. During the process, we redrafted our vision and statement of purpose to better reflect our current goals and desired outcomes. It gave us a chance to pause, gather and evaluate further feedback, better determine our resources and capacity, and prioritize our immediate needs. Within this process, we identified some inequities and inconsistencies in our stipend process. We conducted several landscape analyses of other OER programs, performed a program comparison, and identified strategies that other institutions were using to address similar challenges. We developed a data collection plan, gathering disaggregated data on underserved populations, and identified areas to build on based on ongoing assessment and surveys. We implemented the DOERS3 Equity Blueprint framework and rubric. Those are just a few examples of the highlights we gained. Can you move to the next slide please, Jamila? Thank you.

So, here you'll see that at the beginning of this process we did not have a vision statement for our program. It took the Open SLCC team a year-long process of gathering feedback. We hosted several focus groups, consulted with the Open SLCC advisory committee, and also spoke with people informally to get ideas about what people see as our future vision. We're very happy to share this: this is our new vision. Can you go to the next slide please?

We also redrafted our statement of purpose. At the top, you can see our original statement of purpose, drafted at the beginning of the program, I think in 2015. Although some of its context is still relevant, it doesn't capture the full picture of where our program is now or how we have grown. We realized we were missing some pieces, specifically in the way that we assist faculty: we serve students, but we also serve faculty. We were also missing pieces around the equity, diversity, and inclusion work we've been doing; it has been part of our work, but we wanted to make it more intentional and visible.
So, can you move to the next slide please? In the process we were also able to design and disseminate a survey. We partnered with Data Science and Analytics on this, and our faculty fellow at the time assisted with drafting the questions; we worked together as a team. We had three target populations: an OER faculty author survey, a department survey, and a non-OER faculty survey. The OER faculty survey mainly gathered information on the services faculty had received and any future services they would like to see. The department survey gathered information about our stipend and payment compensation process and any challenges around it. The non-OER faculty survey was really to gauge why some faculty are not currently participating in OER and what services they might want to see. Okay, you can move to the next slide please.

Thanks, Jamila. We also had the opportunity to work with Data Science and Analytics to create these demographic views of disaggregated data. We don't have all the examples here, just a few of the populations, but we did full breakdowns by age, ethnicity and race, first-generation status, and gender identity, to name a few. (A minimal sketch of this kind of disaggregation appears after these tips.) Can you move to the next slide please?

Finally, we'd like to leave you with some tips for conducting an OER program review. Conducting a self-study or a full program review can be daunting, and in some cases the resources may not be available. If resources are not available to perform a full program review, consider doing a self-study first. At the end, we'll share a link that gives you access to the process we use at SLCC for program review, including a self-study guideline.

Another thought is to determine your priorities and scope. We talked about this a bit, and I really struggled with it personally: it's not realistic to address all challenges within one review. Jamila and I had several sessions going back and forth where we discussed deferring certain items; we will address them, but they're not a priority at this moment.

In the beginning, we had our metrics and some information, but we also felt we were missing several pieces. So we had lots of conversations with faculty, some with students, with college leadership, and with other stakeholders, just to get a pulse on the culture. I haven't kept track of exactly how many, but we had lots of conversations; some were planned, and some were just walking down the hall and chatting with someone when the time was right.

Also, internal institutional departments may be able to help you with a self-study, so do a little digging and find out who might be available to help or who may want to partner with you in your efforts. We developed a timeline and planned for several delays. This may sound basic, but having set deadlines very much helped us, and we constantly revisited that timeline. Another benefit of our community is that we're very generous about sharing.
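As referenced above, here is a minimal, hypothetical sketch of the kind of disaggregation Andrea described. The populations match the ones named on the slide, but the field names and values are invented for illustration and are not SLCC's actual schema.

```python
# Hypothetical sketch: disaggregating OER participation by the
# populations mentioned above (e.g., first-generation status,
# gender identity). Field names and values are invented.
import pandas as pd

students = pd.DataFrame({
    "student_id":      [1, 2, 3, 4, 5, 6],
    "first_gen":       [True, True, False, False, True, False],
    "gender_identity": ["woman", "man", "woman", "nonbinary", "man", "woman"],
    "in_oer_section":  [True, False, True, True, False, True],
})

# Share of each subgroup enrolled in at least one OER section.
print(students.groupby("first_gen")["in_oer_section"].mean())
print(students.groupby("gender_identity")["in_oer_section"].mean())
```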
There are several tools out there that can be used for program review. Some of the tools we used were the DOERS3 Equity Blueprint framework and rubric, and also the rpk Group SUNY OER sustainability evaluation framework and evaluation plan. Can you move to the next slide please?

Okay, so why was program review helpful? We are not completely through the entire process; we are done with our self-study, so I can speak to those pieces. We now feel confident that we have been able to explore what gaps exist in our assessment, and we have a data plan moving forward for how we can further assess the effectiveness of our program. We are also on that three-year cycle, so the program will be reviewed again within three years.

I uncovered several insights affecting program effectiveness, and several of those were very surprising. I've worked with the program for several years, and, not to say that I know everything, but there were several pieces in there that surprised me. So that was great.

And then, this is maybe something you wouldn't normally associate with a program review, but I really feel that having those conversations and focus groups strengthened our internal relationships with our partners and our faculty, because it opened the door to transparent conversations about the challenges we're having and opportunities to work together in the future. That was a third point.

I also think program metrics are always a valuable tool, but when we look only at the metrics, we deny ourselves the opportunity to critically examine the deeper levels of assessment that are crucial to sustainability efforts. So those are my thoughts and some helpful tips. Can you move to the next slide?

Thank you so much for joining us. As Jamila and I mentioned, this is a resource created by Jamila's office, with Jamila as one of its creators. You can scan the QR code to gain access to these resources, and we hope they will be helpful for your institution. Thank you so much for joining us today. Thank you.