I'm Jerry Bain, multimedia producer for EDUCAUSE, and I'm here with Tom Cavanagh, John Fritz, and Cynthia Golden. Could you introduce yourselves, starting with you, Tom?

Sure. I'm Tom Cavanagh, Vice Provost for Digital Learning at the University of Central Florida in Orlando.

I'm John Fritz, Associate Vice President for Instructional Technology at the University of Maryland, Baltimore County (UMBC).

And I'm Cynthia Golden, Associate Vice Provost and Executive Director of the University Center for Teaching and Learning at the University of Pittsburgh.

I'm curious about assessment. How is your institution encouraging faculty to be more focused on learning outcomes? Assessment obviously becomes more complex given the situation we're in, an unplanned move to online. Can you talk about assessment and how you're each approaching that strategy?

I'll go ahead and start. It's been a series of increasingly urgent triages. When we moved completely online this spring, it was a bit of a fire drill, and we had a number of faculty who had never really thought about online assessment before. Particularly in the hard sciences and engineering, they were used to giving closed-book classroom exams, especially written exams where a student would work through a formula and the instructor would check the work, which is common in engineering and computer science disciplines. So for them, it was a bit of a challenge. Other faculty we had traditionally worked with on online course development didn't have as hard a time with it, because they already understood the affordances and limitations of online assessment and how to attempt authentic assessments as opposed to knowledge-recall assessments. But for faculty who had already created their assignments and assessments in disciplines not used to that kind of strategy, there was more to work through.
We did have to talk them through proctoring strategies and the technical challenges sometimes associated with proctoring, which were exacerbated by the fact that many of our students just didn't have the equipment. They didn't have webcams; they didn't have their laptops. When we went remote, UCF happened to be on spring break, so students' belongings were still in their dorms and they couldn't go get them. Even if they owned the equipment, in some cases they couldn't access it, and in other cases we had students who didn't know they would need to purchase a webcam when they signed up for the course the previous November. And then when the time came to obtain one, you couldn't get them. So it was a real struggle, trying to talk faculty through these different strategies while recognizing that not every student has the same homogeneous setup, whether it's equipment or bandwidth. Take something as simple as a test-delivery strategy: showing one question at a time in the LMS versus showing the whole test at once. Faculty like to show one question at a time, randomized; it's an academic integrity strategy. However, we started getting complaints from students whose bandwidth was so low that each question took too long to load, and by the end of the test period they hadn't gotten through it all. So we started encouraging faculty to show all the questions at once, which they weren't thrilled about. It was very practical issues like that that we were dealing with.

So we had some challenges, just as Tom described, in terms of what students had available to them. We had some students, especially in a computer science class, where the faculty member would say, you have to use our Respondus LockDown Browser and webcam monitor, and you couldn't buy a webcam.
And the faculty member was digging in on that requirement, so we had to have some conversations about how, even if students wanted to comply, they couldn't. It's been a matter of asking faculty to rethink how they do assessment, which is a short hop to the questions: what are your learning outcomes and goals, and are they in alignment? Regardless of the pandemic, students hate it when they're taught one way but assessed a different way, when those things are out of sync. "What do you mean it's a 50-question multiple-choice test? We never went over these kinds of things in the lectures." Those kinds of disconnects pop up quite a bit, especially in a situation like this. Getting faculty to think about how to do assessment at scale, maybe not everybody taking the same test, certainly not the same questions, randomizing those questions, has been a challenge for us, particularly in the STEM disciplines, because that's just what they've been accustomed to for years.

For us at Pitt, we had some similar issues. Our decision to go online happened while our students were on spring break, and we started our remote classes March 23, I think it was. Because our semester ends earlier than many others', we finish classes toward the end of April, and our commencement is usually the last weekend in April. Given that timing, we knew when the inquiries about remote proctoring started coming in that we were not going to have time to develop an enterprise solution. Some of our schools, our professional schools, had remote proctoring solutions in place. The university runs a large testing center for in-person proctored testing, but that obviously would not be open.
So what we tried to do was work with faculty and encourage them to think about other ways to approach assessment, alternatives like open-book exams, short answers, or writing a transformative essay. Those are the things our consultants suggested, and we spent time with faculty helping to move them in that direction. We also thought it might be a really good opportunity to rethink high-stakes exams, how to really measure student learning, and consider some alternatives to the 100-question fill-in-the-blank test. Most of our students at Pitt were able to come back to the residence halls and get their things during the week we took to prepare for remote learning, so we didn't have some of the issues that were described earlier, for example. But one of the things our central IT group did, in conjunction with the provost's office, was work to provide tablets and hotspots to students who didn't have robust enough access from the devices they had, or who didn't have devices they could use. Interestingly enough, we had some faculty in that situation too, mostly with the network; some of our faculty live in places where there just was not good, dependable network access. So we had to deal with some of those same issues at Pitt, and it was a really nice partnership between IT and the provost's office to make these devices and hotspots available to people.