Let me welcome everybody. I'm delighted to see so many of you here today to discuss a wonderfully deep and important topic with a fantastic guest. My name is Bryan Alexander. I'm the Future Trends Forum's creator, host, and chief cat-herder, and I'll be your guide for the next hour of conversation. I am absolutely delighted to welcome Richard Arum. Richard is a professor and a dean at UC Irvine. He's the author of one of the most important and controversial books published in the 21st century about higher education, Academically Adrift. He has been researching and exploring how to measure and think about student learning, and most recently he is finishing what I believe is the first phase of the Next Generation Undergraduate Success Measurement Project. If you'd like to learn more about that, well, we'll be doing it together, but in the bottom left corner of your screen you should see a kind of yellow or tan-colored button. Just press that button and it will pop up the homepage of that program. Now, let me welcome Professor Arum and bring him on stage.

Greetings! Thank you, Bryan. I'm really grateful for all the work you're doing with this forum in general, and for inviting me onto it today in particular.

Well, thank you. It's an honor, and just so you know, you're carrying a noble forum tradition forward, which is hosting guests with good beards. That's an honor. We ask people to introduce themselves using a particular method, so I'd like to ask what you're going to be working on for the rest of this year. What are the big projects and the big ideas that are going to be top of mind and soaking up most of your time?

Great question. Just to be complete, there are two projects I'm working on currently. One is a local project in Orange County: how do we help educators and social service agencies better serve foster youth and housing-insecure youth? I'll set that aside, since it's not higher-ed focused. The other project, my primary one and the one we'll talk about today, is the Next Generation Undergraduate Success Measurement Project, funded by the Mellon Foundation to set up a state-of-the-art measurement system that tracks undergraduate experiences, outcomes, and trajectories in a much more holistic, comprehensive manner than the existing measurement occurring in the sector.

Which is one of the main reasons I wanted to bring you here today: to dive into that.

I'm happy to jump in and explain some of the project goals, the way we're going about the measurement work, and our holistic framework for thinking about student success, and then also share some findings from this project that we can learn from and that can help inform policy and practice moving forward.

Fantastic. Friends, I have a couple of very quick questions to get the ball rolling, but the purpose of the forum is to be a platform for your own questions and comments. So again, just look at the bottom of the screen and press that raised-hand button if you want to join us on stage, bearded or not, you're all welcome, or just click the question mark and type in your question or comment. One quick question to start off: where are you in the process? I was trying to guess that you'd finished your first phase.

That's a very good attempt. We are, let's see now, almost 18 months into data collection.
We're just about to launch the second round of these innovative performance assessments we developed to measure growth in competencies we think are aligned with 21st-century workforce needs and liberal arts education values. Since we started talking about the performance assessments, let me talk about those, and let's also put a little flag in the conversation: being 18 months in means we started data collection before the pandemic, so by unfortunate accident we have a unique opportunity to understand the effects of the pandemic and the move to remote instruction on the lives and outcomes of undergraduate students as well. But let me stay with the performance assessments. They're one of many different data streams we're using in this project, but since I mentioned them, let me explain a little of what we did there.

We had the good fortune of partnering with ETS, the big assessment firm, and not just to use their off-the-shelf standardized assessment items: they were so interested and excited about our project that they partnered with us to co-design new assessments we thought were better aligned with our goals. We did use their off-the-shelf measure of critical thinking, called the HEIghten assessment; it's a commercial product that could be used by anyone. But then we designed a suite of other performance tasks for students to undertake.

First, we designed a collaborative problem-solving task. Four students are put into a virtual space and given a set of documents to examine. They solve the problem first individually, then they're given an opportunity to interact with each other, share information from the documents they have, and solve the problem again after the collaboration. We're able to observe the actual collaborative process, as well as the extent to which the collaborative problem-solving result at the end is aligned with the desirable outcomes.

Beyond collaborative problem solving, we're also very interested in perspective taking: can students develop the competency to take the perspective of someone different from themselves? They're given complex social scenarios in which they're asked to take the perspectives of diverse social actors. We're also very interested, at this historic moment in the 21st century, in confirmation bias: do individuals change their opinions when new information is presented to them? So we have a confirmation bias task to assess students' ability to change their opinion when new information is provided. And finally, we partnered with Sam Wineburg at Stanford, whose history education project there is doing excellent work on civic online reasoning. Students are given something they might see on Twitter, Facebook, or other social media: are they able to make sense of it and determine its reliability and validity?

We first ran these with students in fall 2019, and we're going to assess them again on these competencies this spring, in 2021, starting next month. And then we did a whole lot of other data collection beyond these performance assessments, and I'm happy to talk about those other strands as well.

Well, this is great: collaborative problem solving, taking others' perspectives, civic online reasoning, grappling with confirmation bias. And that's just one set of what you wanted.

Exactly, exactly.
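To make the shape of that collaborative task concrete, here is a minimal sketch of scoring a solve-alone, collaborate, solve-again protocol. The rubric scale, field names, and scores are all hypothetical; nothing here reflects ETS's actual scoring design, which isn't detailed in this conversation.

```python
# Sketch: scoring growth on a two-phase collaborative problem-solving task
# (solve alone, collaborate, solve again). All names and values hypothetical.
from dataclasses import dataclass

@dataclass
class Attempt:
    student_id: str
    group_id: str
    pre_score: float   # rubric score on the individual, pre-collaboration solution
    post_score: float  # rubric score after the group discussion

def individual_growth(a: Attempt) -> float:
    """Gain attributable to the collaboration phase for one student."""
    return a.post_score - a.pre_score

def group_alignment(attempts: list[Attempt], desired: float) -> float:
    """How close the group's mean post-collaboration score lands to the
    desirable outcome the task designers defined (1.0 = exact match)."""
    post = [a.post_score for a in attempts]
    return 1.0 - abs(sum(post) / len(post) - desired) / desired

group = [
    Attempt("s1", "g1", 2.0, 3.5),
    Attempt("s2", "g1", 1.5, 3.0),
    Attempt("s3", "g1", 3.0, 3.5),
    Attempt("s4", "g1", 2.5, 3.0),
]
print([individual_growth(a) for a in group])  # per-student gains
print(group_alignment(group, desired=4.0))    # group-level outcome alignment
```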
Tell us about a couple of the other strands.

Yes. One strand of the project is what we call the administrative data strand. That has what you can think of as the conventional data that colleges and universities routinely collect: course taking, grades, credit accumulation, choice of major, and the college admissions data, so we have some background information on students. Then there's other administrative data that some universities, like Georgia Tech, make use of and others don't: data on what, when, and where students are interacting with student support services. When they engage with an academic advisor, when they engage with tutoring services, when they engage with the career services office, and so on. We integrate all of this administrative data into our project.

We also make use of something that is still rare in higher education but is, I believe, an incredible opportunity: data generated in the learning management systems. Today, everyone is interacting through these systems, Canvas is one of them, Blackboard another. There are a few different ones, but much of the interaction between students and faculty is increasingly mediated by these learning management systems. That was true before the pandemic, and obviously even more so since. What we've been able to do with that data is generate academic engagement measures for every student in every course at UCI: how many hours they're spending in the course online, how many times they go on the site to access materials, the extent to which they interact with faculty on those sites and with peers on those sites. We also have access to the products they deposit there. So it's an authentic way to measure how students are engaging in the academic enterprise, and we integrate that data into this measurement system.

The third and final strand is extensive survey methods and experiential sampling. We look at the administrative data and the LMS data for all undergraduate students; beyond that, there are 1,200 students, freshmen and juniors, in the cohort we take on each year. Those students get the performance assessments and they also get extensive surveys, eight surveys over the course of each year: a background survey about their orientation, their goals, and their values starting college, and then at the end of each term, surveys on how they're experiencing their classes, what's occurred in their classes in terms of the type of instruction they've received, and again their goals and how those goals are evolving over time. A subset of a third of those students get even more: weekly surveys, and for two weeks during the year we do experiential sampling with them, where they get buzzed on their phones: what are you doing, who are you doing it with, and what's your emotional state? So we've got very fine-grained, detailed data on students' lived experiences at our college, at UCI, covering everything from their interactions with their peers, which are key to the holistic development you would want on a college campus, to experiences of discrimination and microaggressions.

We do our best to capture the full range of student experiences, and we do it for the following reason: if we are serious about serving our students well, we have to understand, in a broad, holistic, comprehensive framework, how they're experiencing undergraduate life, how those experiences track with outcomes, and what their trajectories are.
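To make the LMS strand concrete, here is a minimal sketch of rolling raw clickstream events into the kind of per-student, per-course engagement measures described above. The event types and column names are entirely hypothetical; the project's actual feature definitions aren't specified in this conversation.

```python
# Sketch: aggregate hypothetical LMS clickstream events into per-student,
# per-course engagement measures (hours on site, visits, interactions).
import pandas as pd

events = pd.DataFrame({
    "student_id": ["s1", "s1", "s2", "s2", "s2"],
    "course_id":  ["c1", "c1", "c1", "c1", "c1"],
    "event_type": ["view_material", "forum_post", "view_material",
                   "message_instructor", "view_material"],
    "duration_min": [12.0, 5.0, 30.0, 2.0, 8.0],
})

engagement = (
    events.groupby(["student_id", "course_id"])
    .agg(
        hours_on_site=("duration_min", lambda m: m.sum() / 60),
        site_visits=("event_type", "size"),
        peer_interactions=("event_type", lambda t: (t == "forum_post").sum()),
        faculty_interactions=("event_type",
                              lambda t: (t == "message_instructor").sum()),
    )
    .reset_index()
)
print(engagement)
```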
We can't intentionally design without that understanding. We can try to do it on the basis of ideology and hunches and so on, but every other industry in the world today uses data to better and intentionally design how it interacts with the clients it serves, its consumers, its customers. If Netflix does it, my gosh, in higher education, where we have something much more serious in our charge, the responsibility of supporting student growth and development to face the incredible challenges we're facing in society, the environmental crisis, the current public health crisis, the political polarization in our country, my gosh, do we have a responsibility and a need to figure out how best to serve students so that they develop in a way that prepares them to meet those challenges.

That's a very, very stirring call, and by God, that's an enormous amount of research being done. I've never heard of anything anywhere near this scale. This is true, but let me get out of the way. Thank you for giving us that introduction to the work. And again, remember, everybody: on the bottom left of the screen there's a button which will take you to the website for the project. But I wanted to share some of the questions that have come up so far, and again invite you all to add your own questions and thoughts. As you can tell, Professor Arum is definitely happy to share with you. So to begin with, we have a question from Paul Walsh at USM. Does the project give students a live, trending outlook, so students can see how they are progressing in relation to their efforts and to previous students?

Yeah, that's a great question, Paul. One way to use data like this is to personalize instruction and to feed the data back to students so they can better understand and take ownership of their own growth. Those are things we might aspire to down the road; it's not something we had the resources or ability to build into the system at the current stage of the project. Our goal is: let's collect this data so we can inform continuous institutional improvement efforts, and so that the educational research community, the social scientific research community, can better understand the value of liberal arts education broadly conceived, and also identify what works and what doesn't in these underlying educational processes. The data we're collecting is going to be de-identified and deposited at the University of Michigan, in the ICPSR data archive, for the larger social scientific community to make sense of.

Let me say one more thing about that, because if you've been reading the newspapers recently, Biden's talking about infrastructure. Let me tell you what the most important infrastructure we need in this country today is: infrastructure for how to deliver, measure, iterate on, and improve higher education. I can think of no greater infrastructure need than that, because individuals alone can't do this. We need to build the infrastructure so we can improve higher education, not just to do better by the students we currently serve, but to increase access dramatically, so that our country is prepared to meet the challenges we're going to face going forward. Shame on us in this sector if we aren't articulating that need to leaders in Washington and demanding that these kinds of investments get made.
Well said. Paul, thank you for the great question, and Richard, thank you for the multi-pronged response. Again, if you're new to the forum, this is a great example of the Q&A tool, so definitely use it. We have a question from the excellent Steve Ehrmann, who just published his new book. He asks: how do your measures attract students to work hard enough to illuminate their current capabilities?

Yeah, great question, Steve. One of the key challenges around performance assessment is motivation. If students aren't motivated to do well, the reliability and validity of the measurement gets called into question. Going back to the Academically Adrift work, one of the critiques was that the reason students didn't grow on those instruments was that they weren't motivated to take them seriously, and that the longer they spent in college, the less motivated they were to do something you put in front of them. As freshmen they're fresh, but by the end of their sophomore year, or their senior year, they were turned off to assessments and to academics in general. And in fact, I hate to say it, but the measurement from the Wabash National Study of Liberal Arts Education, a longitudinal study, suggests the same thing: the longer students spend in college, the less academically motivated they are. If you're a committed educator like myself, that's a really challenging finding; it means we need to do a lot of thinking about how academic engagement is structured so that it's relevant to students and keeps them motivated.

That's a long-winded way to get back to Steve's question. On a basic level, how do you get them to do these surveys and assessments in the first place? We provide some limited economic incentives, something like $50 for the year, for students to take the surveys and the performance assessments. The ones doing the weekly surveys and the experiential sampling are signed up for an independent study course, so they get course credit, essentially one class spread out over the year, to engage in these surveys, which are really about reflecting on their own education. So they're motivated in that way. Finally, in terms of the performance assessments themselves, I can't say this about the off-the-shelf critical thinking measure, the HEIghten measure we use from ETS, but the ones we co-designed with ETS, and the civic online reasoning assessment we borrowed from Sam Wineburg at Stanford, those tasks are very engaging and motivating in themselves. You're put into a group: which of these three candidates should we hire? Here's some information on them; interact with each other. It's an engaging task. Putting yourself into a complex social scenario: engaging. The confirmation bias task, a little less so. But the civic online reasoning one, I don't want to reveal the items, but again, it's quite engaging. You can imagine scraping social media to get a sample of what's out there and then asking students about its reliability and validity; it would be hard for that task not to be engaging.

That sounds like a lot of fun, a bit of role-playing. We have more questions coming in, and thank you for that one. Kind of building on that, we have one from Ray Garcelon.
He asks: do you think there are any useful strategies or guidelines to effectively integrate external measurements of student success as part of a holistic system of student learning, e.g., in a nursing program?

Yeah, thanks, that's great. I think a lot about that, Ray. In my day job as dean of an education school, we have a teacher education credentialing program where there's always an interest in seeing not just how students are performing in the program, and the competencies they're developing that are assessed there, but also in tracking them into their jobs as teachers and getting feedback from the districts and from the field on the extent to which they're actually able to engage in that work. So I think it's a very useful thing to do in general. Where we are in the project, we're going to get to some of that in the next two years. I mentioned we started data collection in fall 2019. The design of the study was that the 1,200 students in the sample receiving the extensive measurement are half freshmen, half juniors. Essentially half of the juniors, a quarter of the sample, are continuing juniors who started at UCI as freshmen; the other quarter are junior transfer students. In the UC system, for every two entering freshmen, you take in one junior transfer student, primarily from the community colleges. So from that fall 2019 cohort, roughly half were juniors, and the ones graduating on time will start to graduate in May, sorry, June this year. And the Mellon Foundation has invited a proposal from us that they'll decide on at their June board meeting, so I don't want to get ahead of the foundation. But if all goes well, we will have two years of additional funding to track those students after college: into the labor market, into grad school, into life. Because college is not just about labor market outcomes; it's about a broad set of human development. So we should really be interested in how they are faring after graduation in terms of a broad set of outcomes: human development, general flourishing, general well-being.

That's an ambitious answer. I really want to see that go forward.

Let me also be clear, since I haven't had a chance to say this yet: it's not just me, Bryan. We've got an interdisciplinary group working on this project at UCI. There are 12 faculty involved, leading people from developmental psychology like Jackie Eccles, the foremost national expert on student motivation, who is very involved in designing our surveys and taking a leading role there. We have Mark Warschauer, of the National Academy of Education, an expert on educational technology, guiding the learning management system analysis. And we brought three dozen leading experts from around the country to convenings at UCI prior to data collection, to advise us on developing this measurement system. So let me assure you, Bryan, this wasn't left up to just a sociologist like me, because I wouldn't have the ability to put in place a comprehensive project of this character, one that's really interdisciplinary at its core.

Well, my congratulations to the entire team, which sounds like a swarm of people.

Yeah, and then imagine the doctoral students; we also involve undergraduate research assistants. It's several dozen folks on campus actively working on the project. Oh, and let me say one other thing about it that I'm really proud of.
The undergraduate student senate at UCI formally endorsed the project. We brought it to them. We said: we're measuring undergraduate student experiences, trajectories, and outcomes for this reason; we'd like you to consider it and weigh in on whether or not this kind of work is in the interest of students. It's formally endorsed by our undergraduate student senate, because again, this is work that every college and university should be doing in one fashion or another.

We've got questions that address several different parts of your answer. Let me bring in our friend Tom Haymes, who asks one of them: is this data being used to inform the instructional redesign process, and, which isn't the same thing, is it being shared with faculty in real time?

Great question, Tom. It is our goal to develop what you've called real-time ways to share this data. One can imagine a set of data dashboards that faculty could have access to, where they're able to observe outcomes generated from our project. Some of that is indeed happening right now, in the following way. These kinds of dashboards are being set up by our division of undergraduate education, using data to inform instructional redesign. Some of it is fairly standard: where are the roadblock courses, the ones where students are getting Ds, Ws, and Fs? You're able to identify those in different majors. In real time, you're able to track on a dashboard what courses students take subsequent to a particular course and what their grades are in those courses. And the measurement we're doing is informing the types of measures they're building into these dashboards: the learning-management-system academic engagement measures we identified are going to be part of that internal data system at UCI. So that effort is underway.

We're also, hey, I'm going to be the only person in higher ed ever to say this: we have the good fortune of accreditation happening next year. That's a joke, because everyone in higher education dreads filling out the forms and doing the whole exercise. But at its heart, accreditation is about using data; it should be about reflecting on practice and improving practice. So coincidentally, our accreditation is coming due next year, and if you're an administrator at UCI trying to think about where to find learning-outcome data on student experiences, to see whether or not we're accomplishing these goals, lo and behold, we've got a state-of-the-art measurement project that can inform that effort, and hopefully be used the way accreditation works at its best: to inform instructional redesign, not just as a sham compliance exercise.

Yes. Well, what a great coincidence. And Tom, thank you for a question true to your approach to education.
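For a sense of what those roadblock-course dashboards compute, here is a minimal sketch with hypothetical tables and an illustrative threshold: flag courses by D/W/F rate, then look at the grade each student earns in whatever course they take the following term.

```python
# Sketch: flag "roadblock" courses by D/W/F rate and inspect downstream
# grades. Table layout, course IDs, and the 30% threshold are illustrative.
import pandas as pd

records = pd.DataFrame({
    "student_id": ["s1", "s1", "s2", "s2", "s3", "s3"],
    "course_id":  ["MATH2", "PHYS3", "MATH2", "PHYS3", "MATH2", "PHYS3"],
    "term":       [1, 2, 1, 2, 1, 2],
    "grade":      ["D", "C", "B", "B", "F", "D"],
})

# D/W/F rate per course, highest first.
dfw_rate = (
    records.assign(dfw=records["grade"].isin(["D", "F", "W"]))
    .groupby("course_id")["dfw"].mean()
    .sort_values(ascending=False)
)
roadblocks = dfw_rate[dfw_rate > 0.30]
print(roadblocks)

# Grade each student earns in the course they take the following term.
nxt = records.sort_values(["student_id", "term"]).copy()
nxt["next_grade"] = nxt.groupby("student_id")["grade"].shift(-1)
print(nxt.dropna(subset=["next_grade"]))
```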
Going back to the LMS, we have a question from the awesome James Shulman of the American Council of Learned Societies. He asks: from your early work with the learning management system, what do the prospects look like for deriving measures of engagement that can be built into the software itself? And what are the politics of the surveillance issues?

Yeah, that's a very good question from James Shulman, and I'll take a moment to out him here: he was formerly at the Mellon Foundation when this project was in its early design stages, so he helped inform our original thinking about the project and engaged the foundation in considering it. I'm very grateful for his prior efforts and his ongoing interest in the project.

Our original project goals were to improve measurement. I mentioned institutional improvement at UCI; I mentioned informing larger social scientific knowledge generation. But another goal of ours, from the start, was developing innovative measures that could be disseminated throughout the field of higher education more broadly. Take the LMS data. It was a major lift for us to make sense of that clickstream data: every click a student makes in the LMS. Not to get too technical, but we had to purchase space on a huge Amazon Web Services server to park all this data so we could analyze it; it would otherwise have overwhelmed even the computing systems of a major research university. And then you have to have the scientific expertise to code it and make sense of it, to translate all that big-data noise into coherent measures. We were able to do that at UCI because of the research capacity on campus, but certainly not every college and university in the country has that ability. UCI is exceptional in this respect; it's a major R1 university. It also, and as dean of the School of Education there I'd be remiss not to say it, has a great school of education, ranked 15th nationally as of Tuesday this week, the best public school of education in the country. Send your potential PhD students our way.

So back to the LMS question. We were able to generate this data, but not every campus can. The way to do it at scale: there are three major LMS providers that control 90% of the market, and it would be very low cost for them to take measures that have been scientifically generated and validated in our project, because we've got this data, we can triangulate it and make sure we're identifying things that track with other measures of academic engagement and the like, and then build them into their systems and automatically provide them to other colleges and campuses. That is our goal. Honestly, we're barely 18 months into data collection, and the pandemic created opportunities for us but also challenges. I'm really proud of what the team has generated so far, but we haven't yet gotten to the point of identifying measures we would recommend be provided to the field as a whole. We are going to get there; we haven't lost sight of that.

Now, James also raised the issue of surveillance. As in: what do you mean you're measuring academic engagement in my class? That seems a little creepy, a little like surveillance. I'd say two things about that. One is that we have to have real conversations about what the purpose of measurement is. We have to say loud and clear that it's not about accountability. We know from K-12 that attempts to use measures that are imperfect and limited to enforce accountability on schools and teachers have been disastrous and counterproductive. It wasn't scientifically justified in the first place; the measures weren't able to do what some well-intended stakeholders thought they were doing in enforcing those accountability regimes. So measurement can't be about that. It's got to be about something else: using it to intentionally design programs to better serve students.
So, one: let's be clear about what the goals are, agree on the goals we can share, and reject those that are not useful. The other thing I would say is that you build safeguards into the system so that individual privacy is maintained. All the data we're using at UCI has been de-identified by the time any researcher, including myself, gets to look at it. There is a person, a literal individual hired for the project, who takes the data from the system, merges the different strands together, and de-identifies it. By the time a faculty member, a grad student, or a researcher starts working on it, it's not identifiable down the road. So again, I think there are operating procedures to protect privacy and allay some of the fears of faculty worried about too much surveillance. And then I think we have to be really clear about the purpose and value of the measurement.

And I'll say one more thing: we have to put the needs of students first. Fifty percent of students starting college in the U.S. don't graduate in six years; it's probably a little closer to 40 or 45 percent today. But come on: there's no other sector in this country that would tolerate that level of underperformance year after year, decade after decade, without engaging in a broad-scale social movement to do better by those students. Part of doing better is to take student experiences seriously and use this measurement for good, in ways that can help us address these problems of student success, retention being one of them, and holistic development and growth another.

Hear, hear. James, thank you for the great pair of questions, and again, thank you, Richard, for the very deep and passionate answer.
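The de-identification workflow Richard describes, an honest broker merging the strands and stripping identifiers before researchers ever see the data, can be sketched minimally like this. Every field name here is hypothetical.

```python
# Sketch: honest-broker de-identification. Real IDs are swapped for random
# study IDs; direct identifiers are dropped; the crosswalk never leaves the
# broker. All field names are hypothetical.
import secrets

def build_crosswalk(student_ids):
    """Map each real ID to a random study ID. The crosswalk stays with the
    honest broker and is never shared with researchers."""
    return {sid: f"study_{secrets.token_hex(4)}" for sid in student_ids}

def deidentify(records, crosswalk,
               identifier_fields=("name", "email", "student_id")):
    """Return records with identifiers removed and study IDs attached."""
    out = []
    for rec in records:
        clean = {k: v for k, v in rec.items() if k not in identifier_fields}
        clean["study_id"] = crosswalk[rec["student_id"]]
        out.append(clean)
    return out

raw = [{"student_id": "12345", "name": "A. Student",
        "email": "a@example.edu", "gpa": 3.4}]
crosswalk = build_crosswalk([r["student_id"] for r in raw])
print(deidentify(raw, crosswalk))  # e.g. [{'gpa': 3.4, 'study_id': 'study_…'}]
```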
We have one quick clarifying question from an awesome person, and then we want to hear about your findings. This is from Roxann Riskin, a longtime friend and supporter of the program, who says: during the two-year tracking, student mood is mentioned. Can you talk about how a student's mood will be tracked, and how this information will be used to help students during the two-year study?

Great, great; that gets us into some findings, so thanks for that question. When you use the term mood, I think of our experiential sampling; that's the best data we have on it. We ask about emotions: what are you doing, who are you doing it with, what's your emotional state? That data really captures mood as real-time lived experience. It's not retrospective, how did you feel last week; it's how you feel right now, about 50 times, 25 pings one week and 25 another.

So here's my first finding for the group. More by accident than design, we deployed this experiential sampling method last February, before the pandemic. For people following the news there was some concern about something looming, but in Orange County, where we are, there was literally a handful of cases, no case on campus, and normal campus operations. So we have their activities and emotional states: they're all on campus, all feeling pretty good, hanging out as undergraduate students.

Then we sampled them again in April, and a lot happened in March: the World Health Organization declared a pandemic, President Trump declared a national emergency, the governor of California issued statewide stay-at-home orders, and the campus essentially moved to remote instruction and told everyone they should go home out of the dorms. So what was their mood, their emotional state, afterward? The obvious hypothesis: a lot has changed; they're worried, they're concerned. It turns out mood was remarkably stable. We saw very little change in emotional state before and after. And, you know, social scientists are really good at post hoc explanations: maybe it was still too early in the pandemic; maybe being back home with their families gave them some resiliency. Who knows. But the empirical fact is that their emotional moods changed very little.

Now, mood is different from psychological stress. We also track stress and mental health with a more standard battery in the surveys students get, and stress did increase during the pandemic. It was particularly elevated in the spring term right after the pandemic's onset, when everyone moved to remote instruction. This year it has come down, as we've gotten more used to the realities we're facing, but it has not yet returned to pre-pandemic levels. So we look at stress and mental health, but we also look at these emotions, which is what Roxann calls mood.
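The "remarkably stable" claim is the kind of thing a paired pre/post comparison would test: the same students' average momentary mood in February versus April. Here is an illustrative sketch; the numbers are invented for demonstration and are not the project's data, and the mood scale is hypothetical.

```python
# Sketch: paired pre/post comparison of per-student mean momentary mood.
# Values are fabricated for illustration only; scale is hypothetical (1-5).
from scipy import stats

feb_mood = [3.8, 4.1, 3.5, 4.0, 3.9, 3.7]  # mean of ~25 pings per student, Feb
apr_mood = [3.7, 4.0, 3.6, 3.9, 3.8, 3.6]  # same students, April

t, p = stats.ttest_rel(feb_mood, apr_mood)
print(f"paired t = {t:.2f}, p = {p:.3f}")  # large p ~ no detectable change
```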
That's great. Roxann, thank you so much for the great question, and Richard, thank you for sharing that finding. What else can you tell us? What else have you managed to discern?

Yeah, I'll try to share some surprises. Going into the pandemic, we were worried not just about student stress, but about students' ability to continue making academic progress, about their being at higher risk of dropping out. Colleges and universities have thought for a long time that the best way to keep students from dropping out is to have them socially enmeshed in college life: residential facilities, extracurricular activities. The more socially integrated and engaged they are, the less likely they are to drop out. And all of a sudden, they're dispersed to their homes; it's hard to do this stuff on Zoom. Many of our students at UCI, about 50 percent, are first-generation, and 85 percent are from backgrounds other than white, non-Hispanic, so many of the communities they lived in were deeply impacted by the pandemic. So we were really worried and concerned about academic progress and academic engagement, as well as mental health.

One of the big surprises is that, in fact, they've done incredibly well academically. There haven't been elevated dropout rates at UCI; if anything, retention seems to have improved a little. Grade point averages are up. Credit hours accumulated are up. And our measures of academic engagement, both from the surveys and from the LMS, are up. Fifteen percent of the courses at UCI were already online before the pandemic, and when you compare those online courses before and after the pandemic, academic engagement in the LMS goes up even in those courses. So academically, students are doing well, flourishing and thriving; those may seem too flowery as descriptions, but a lot of our worries turned out not to pan out.

Now, the question is why, and here this is hypothesis more than empirical finding. The university invested a lot in encouraging faculty to be accommodating and flexible in how they interact with students, and to move whenever possible to asynchronous formats that are much more convenient for students. Students weren't spending time commuting to campus. Students weren't spending a lot of time socializing with friends, which can be developmentally productive, but can also be developmentally counterproductive as well as academically distracting. Those are some of our hypotheses about why, with our particular selection of students and a particular institutional response at UCI, so I don't want to generalize this to schools nationally, we saw this phenomenon.

That's enormous. That's really striking. And it triggers a connection to another question, which comes to us from across the Atlantic. Brian Mulligan of the Institute of Technology, Sligo asks: what if we find that nothing works really well, which you didn't find, or that these practices are only affordable by elite institutions?

Well, the first seems too pessimistic for my take on things, and if you go back to Academically Adrift, you'll see my take on undergraduate education there. People might remember that book as being about limited learning and all the students we were underserving, and I do think there's about a third of the undergraduate student body in the U.S. that really isn't academically engaged and is really poorly served. But there was also, in that earlier work, a sense of success and flourishing among students who were applying themselves, taking advantage of their education, and staying academically engaged. Is that easier to do with more resources? Yes, resources are helpful. But I really don't believe it's only about resources; I think it's about pedagogical design.

Let me give you an example on pedagogical design. One of the things we tracked in our study, pre- and post-pandemic, is the percentage of class time students spent in lectures. And boy, did that drop going from in-person to remote. Remote instruction was much more interactive: much less time listening to lectures, much more time in peer and class discussion, connected to the practices people call active learning or progressive pedagogy; there's different terminology around it. But that increased a lot, and that's not just about money; that's about intentional instructional design. We as faculty all love hearing our own voices, and I say that tongue-in-cheek because I've been talking a lot this past hour. But there's a big disconnect between our interest in our own performances and what students are experiencing as learners. That central insight has huge implications for how we redesign programs to better meet students and their needs.
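The before-and-after comparison within already-online courses that Richard describes could look something like this. Course IDs, the engagement metric, and the values are hypothetical.

```python
# Sketch: engagement in courses that were already online, before vs. after
# the move to remote instruction. Data and column names are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "course_id": ["c1", "c1", "c2", "c2"],
    "period":    ["pre", "post", "pre", "post"],
    "mean_weekly_lms_hours": [2.1, 2.6, 1.8, 2.3],
})

change = (
    df.pivot(index="course_id", columns="period",
             values="mean_weekly_lms_hours")
    .assign(delta=lambda t: t["post"] - t["pre"])
)
print(change)  # positive delta = engagement rose even in online courses
```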
Richard, I'm afraid that's a great moment to pause on, because we have reached the end of our hour, and I hesitate to go further given everyone's commitments, and of course your time, given the roughly eight jobs I think you do. That's a great note to end on, about our focus on students and about what we can do well. Thank you so much for sharing this extraordinary project, and for being so gracious as to share your findings even as it's still advancing. What's the best way for everyone to keep up with you and with the project?

Yeah, so I don't do Twitter, I'm sorry, and I don't have Facebook, but we do have a web page for the project, which you were so gracious to put, I believe, in the bottom corner of the screen here. We're doing our best to keep it updated with the findings, the papers we're presenting, the media coverage around the project, and so on. So that's a great place to go to keep up with things. And people can of course email me, and I'll do my best to respond and be responsive to any questions that came in.

Fantastic. Well, thank you so much. Good luck continuing this project; we're really looking forward to seeing what comes next. And thank you for being here.

Thanks for having me, and thanks for this great forum you're doing, raising all these important issues with this larger audience.

Thank you so much. Thank you so much. But don't go away, friends. Let me just point you to where we're heading next. Over the next few weeks we have some great topics coming up, ranging from accreditation to educational technology to equity by race. If you want to learn more about those, just go to forum.futureofeducation.us. If you'd like to keep talking about this, to keep discussing questions like how do we measure student learning, what do we do with the data, or Brian Mulligan's question about applicability, we have many venues to continue the conversation: LinkedIn, Slack, Facebook, and of course Twitter. If you'd like to go back into the past and look at some of our previous programs that touch on these issues, like last week's session about data analytics, or previous sessions about student engagement, pedagogy, the LMS, and learning design, just head to tinyurl.com/FTFarchive. We have about 249 videos there. And before we go, let me thank you all for a stack of really good, very thoughtful, very probing questions. As always, it's an absolute pleasure to work with all of you and to think with all of you. I'm delighted that the forum can be this kind of base for your thinking together. But even more than all of that, please take care and be safe, all of you. We'll see you online next time. Bye-bye.