I have the top of the hour, so let's begin. Welcome, everyone, to the Future Trends Forum. I'm delighted to see so many of you here today, and I'm really looking forward to our conversation. I'm Brian Alexander, the forum's creator and host, your chief cat herder for the next hour, and I'll be your guide as we proceed. If you don't know Audrey Watters, you absolutely should. She is one of the smartest, most influential, thoughtful, and critical writers about technology and education in the world. She is an astonishing discussant, a fantastic writer, and a really perceptive researcher. And we're here to welcome her back because she has a new book out from MIT Press called Teaching Machines, a powerful, powerful book that looks at the history of how we tried to personalize learning through mechanisms going back a century. It's a groundbreaking book. It's incredibly engaging, impossible to stop reading, extensively researched, with fantastic primary source material that she dug up. For anyone working or thinking anywhere around education and technology, I think this book is mandatory. Now, we're also just grateful to bring Audrey back because, way back in February of 2016, she was the very first guest on the Future Trends Forum. So I wanted to honor her for that. And I wanted her back, quite simply, because she's amazing. Now, with that, let me bring her up on stage. Welcome back, Audrey. Thanks for letting me back on the show. Believe me, it's my pleasure. It's so good to see you. And this book is so amazing. And you have a bunch of fans already in the chat, people waving at you and clapping, a lot of really good vibes. You're coming to us from Oakland, California, where, apparently, the temperature, at least, is paradisiacal. It's very nice. It's quite lovely here. And no fires in our general immediate proximity, so we can still breathe. Good. 
Well, it's July. You've got time for that. We have time. Yes. I have so many questions to ask about this book. But as always, our tradition in the forum is to ask people to introduce themselves by describing their future, what they're going to be working on for the next year. So now that you have this book under your belt and in print, coming out into the world, what's next for you for the next year? What are the big ideas or big projects coming up for you? Wow, that's actually probably the most challenging question you could ask me. You worry when you write a book: you write it, and then you send it off to the editor, and it becomes part of the publishing machinery. You don't actually think about the book for a while, and then you get the proofs back, and then it goes away. And I'm always worried, doing these media interviews for the book now, that someone's going to ask me about a detail and, because it's been so long since I wrote it, I'll be like, is that in the book? So I was worried about getting that kind of question, but actually, asking me about the future is really quite tough for me right now. As many of us experienced, the pandemic was just not great. I experienced a profound personal tragedy, and I don't really know what I'm doing with my life. I turn 50 in a couple of weeks, and I guess I'm having, I don't know if it's a midlife crisis, it feels like a midlife renewal, but I don't actually know. I'm just going to sit with myself and focus on myself and not on ed tech, which I think will make for a healthier and happier Audrey. So I don't have a good answer, because I don't know. We caught you at quite a time. Yes, you did. I'm all in favor of a healthier and happier Audrey. That really matters a lot. 
I think probably a lot of people are, because it cuts them some slack, right? People know that things can happen in the world of ed tech. Coursera can go public and I have nothing to say. Bill Gates can be accused of sexual harassment and weird, creepy stuff during his divorce and I don't have to weigh in. So yeah, get it all out in the air while I'm taking a break, folks. Do your edX-2U mergers now, because I don't care. I mean, I care, but not enough to write something. We'll see what happens. John in the chat says it's hard to imagine ed tech conversations without your important voice, and I agree. I'm glad that we can at least hear your voice for this hour. Me too. I mean, I love what I do, but I just need to take a break. Understood, understood. Friends, we have a wonderful book here and we have a wonderful person who wrote it. And if you're new to the forum, the way this usually works is I ask a couple of unbelievably easy questions and then our guest gets to cut loose. But my job here is not to be the interviewer. My job is for you all to be the interviewers. I'm here to help you ask questions of our guest. So again, you can use that chat box, but best of all, you can use the raised hand to join us on stage, or you can type in the Q&A box, and I'd be delighted, delighted to bring those up here. One question I have for you, based on your book: on page 47, paragraph three, line... Well, I totally understand. I totally understand. I've had questions like that put to me, and it's a lot of fun. One of the things that struck me is, I know you've been very positive about Seymour Papert, who, for those of you who don't know, was a champion of constructionism and digital learning, offering a very different paradigm to, say, behaviorism. He also helped design the Logo programming language so that students, kids, would be able to control machines. 
As he put it, instead of having computers program students, it would be students programming computers. But at the end, you say this still locks us into a kind of computational thinking framework, so that even when we try to get out of the B.F. Skinner heritage, we're still stuck with all the limitations and affordances of computational thinking. Where do you see people getting away from that and still teaching? Yeah, I mean, I think that this is really the challenge with some of the stories that we've told ourselves about teaching machines in general. We tell a story as though new technologies come along and replace the old technologies, and we're done with those and have moved on to something better. That's the narrative that we like to tell about technology. It's a narrative about progress. And as Americans, and I should add a little side note here, this book is very much a history of education and ed tech in the US. But as Americans, we are deeply committed to the story of progress. And I think that we believe that computers are somehow the pinnacle of this, and that by working with computers we've somehow broken free, or we're able to break free, or potentially we can break free, from the past. And what I wanted to tell was a story showing that a lot of the ideas we work with are very much embedded in things that came before, that are deeply wedded to ideas that predate the computer by a number of decades, and that we've carried that with us. And we tell the same stories in other disciplines, in other fields as well. I think of it as like the way you read through the history of a discipline in a textbook: ideas are laid out one at a time, we move from one idea to another, and somehow the new idea just displaces the old one. 
So in psychology, we had Freud and Jung, for example, and then behaviorism kind of supplanted that. And then cognitive science came along and displaced behaviorism. And again, I don't think it works quite that neatly. Back to your question about Papert: I think that Papert was very much immersed in a world that was trying to show that we were going to break free from some of these preexisting ideas, but that was still very much enmeshed in the ideas of its time, enmeshed in behaviorism, enmeshed, even though new language was being used to talk about things, in what I think is a very 20th-century view of man plus machine together. And when we think about that, it does in some ways circumscribe what we do. I mean, we use all of this language to talk about teaching and learning that is actually quite mechanistic. It's almost like the play on words with teaching machines: there's this field in AI called machine learning, but there's also this sense that people who work in AI are teaching machines, teaching machines. And I think that we confuse a lot of these ideas. We use language that describes the way computers or machinery operate in ways that make us more machine-like, so that rather than fulfilling a human potential, they actually circumscribe us. So we think about memory now as computer memory. We think about learning as computation. We think about the mind as a computer. And it's not, it's not at all. 
I think that our minds are much more complex than the kinds of things that machinery can do to emulate the behavior of a human. Can't escape the behavior word, can we? We can't. That's a fantastic answer. Remind me, I need to figure out a way to transport you to my classes this fall. That's fantastic. Kisa Johnson in the chat says this reminds her of what she calls imitation culture, the emulation of bodies of knowledge that weren't studied properly. An example is famous Greek philosophers learning from African knowledge. And friends, I'm going to have a couple more quick questions, but this is your time to ask your own questions. Please feel free to bring in whichever topic or point you have, based on where you are thinking, where you're working, and what you hope to accomplish, especially as this next year looks crazier and crazier depending on how different things fall out. You remind me of a discussion of the power of the computational metaphor: how 18th-century thinking was so based on clocks and clockwork and machines and mechanisms. And it's interesting to see that kind of heritage appear in the early parts of your book, when we look at some of the early forms of machinery and how we are still so stuck with those. But one of the big differences now is that we live in the age of big data and data analytics at scale. And in your conclusion, you're very moving in your connection of teaching machines to Shoshana Zuboff, saying this is one of the great dangers of our time, all the dangers that we know well, thanks in part to your work, when we have the connection of big data and education. Where do you see this headed for the next year? 
Are we in a battle in higher education between people who want to obtain more and more data and process it more and more to get more and more actionable intelligence, versus people who think this is bad for our well-being, a dystopian nightmare, and want to stop it? Does this become even more the zone of struggle? Yeah, you know, one of the things I would say is that education has actually been trapped in a big data nightmare for over a century. The data wasn't as big, but this is something that I try to show in the book: this fantasy that if we can just gather as much data as possible about students, then we can personalize education to them and make it more efficient. This is something that people were thinking about in the early 20th century. This is of course connected to the rise of standardized testing, which is so deeply intertwined with the history of education technology. As we were developing teaching machines in the early 1920s, we were of course also developing testing machines. And Ben Wood, who is one of the characters in my book, is someone, interestingly, who isn't talked about a lot in ed tech circles, but he's very important in the history of educational testing. He went on to form AACE. He helped found ETS, the Educational Testing Service. He worked at Columbia University, and he was one of the first people to, well, he actually convinced Thomas Watson to donate a computational machine to Columbia in order for him to crunch the data on standardized tests. And it was really his vision. He was very clear that what we had to do was mechanize education. And what he meant was, again, we need to get as much data as we can through testing, and then educators would know their students better and be able to individualize education for them. 
And he meant standardized testing through achievement tests, aptitude tests, intelligence tests, personality tests. This was really part of the interest that he had, and then, by extension, part of the interest that IBM had, in creating products that they could sell to schools, but also in creating a system that corporations could use to better identify people who would be good IBM employees. So I would say that this big data thing has been something that's consumed education and ed tech throughout their whole history. Ostensibly, we have so much more data now, but I'm not sure that we have any more useful data than Ben Wood did, right? I mean, we gather a ton of data. We claim to be able to identify more precisely what students are thinking and what students are doing. But I don't know that we actually have better insights than they did in the 1920s and 30s. And I think that's something we have to wrestle with: we have been pursuing this policy of mechanized education, of automation of education and testing, for a very long time, and to what end, right? We certainly haven't humanized the university. We've built a machine, a machine that doesn't seem to really be interested in a practice of freedom. The opposite of freedom. Yeah. Which of course Skinner was like, right on, we have to get beyond that. We have a few folks who would like to join us on stage, and a few questions. I'd like to begin with David Hul at National University. Let's see if he can join us. Hello, David. Good morning, Audrey. Hello. Good afternoon, Brian, and to the swarm. Audrey, I just wanted to piggyback a little bit on your last comments and thoughts about personalized education. 
I come from an institution that recently had an initiative labeled Precision Education, with a massive amount of data collection in an attempt to personalize learning. And from the faculty side, we never really found an answer to what precision learning actually is. So, number one, I feel like it was a bit of a catchphrase, something that seemed to be the ed tech thing of the moment. And my question really is: what are your thoughts on how faculty can become more empowered, become a voice in getting away from the mechanization, as you said? I'd love to hear your critique of the mechanization, but also how faculty can support that in a creative way, the creativity of how faculty can reinvent this and be a part of this with ed tech. So I hope that's clear, but I'm just curious. That's a great question. And like so much of this, to me that's a labor question. I would say that making the university more human, more humane, requires a kind of labor, affective labor, that has so often historically been denigrated by the university. The university is not interested in that kind of work. And so there's a reason the university looks like this monstrous mechanized machinery: the practice of care, the ethics of care, is not prioritized financially or philosophically or any of it. And so to me the answer is, of course, for faculty to care, and that can sound quite cliche, but that also raises so many questions about labor, about who does the work of caring already, and about what it means if the university system is not set up to honor, recognize, and pay people to care. 
And we can see this in other kinds of systems too, not just in faculty and the interaction of teaching, which I think is a deeply caring act, but in the other kinds of services that universities are now wanting to outsource to machinery. Rather than having advisors, students can just use the algorithm to tell them what to do. Rather than having people who help students learn how to college, right, learn how to navigate this giant institution, we can have chatbots. And so I think that the university really doesn't want to pay for care. And academia, ooh, academia doesn't really do so well on that front either. Academics don't really care for each other, let alone for people lower on the hierarchy. So it's really hard, because it's just not part of the institution's core beliefs and practices, and that sucks. Well, that I certainly agree with you on. I'm sorry I don't have a good answer, then. I think that's a great answer. I'd like to truly believe that those of us here today, the faculty members here, the ed tech community, are a group of caring individuals, starting with Brian. And from here, I think this is a great source of those who care, and of those who want to rise up and start a march of caring. That's kind of what I hear you saying. There are many faculty members out there who care very deeply, but I also agree that many institutions, here in San Diego but also across the nation, are filled with those who are just moving along and may not have that element of caring. And that's the other piece of it. I mean, to tell an adjunct faculty member, for example, you should care more about your students, is ludicrous. The university should hire people full-time, with full-time pay and generous benefits. Asking marginalized members of the community to care more, that just leaves a bitter taste in my mouth. 
So I think that this is really part of the systemic ills of the university. And to tie it back to the book, I think this is something that people recognized as the machinery of education really took hold in the 1960s. What I'd like to see is more of that, a resurgence of people really stating loudly that they refuse to be part of this machine. Mario Savio, for example, on the steps of Sproul Hall, saying that we will stop the machine from working at all, that you cannot treat us this way. And "us" there was students, in his case, but I think "us" should be broader than that, yeah. I certainly agree. Thank you so much for that. And I think it's just so important to bring to light labor and faculty issues and how they're connected to ed tech. Thank you, thank you, thank you. Thank you, David, and good morning. If you're new to the forum, by the way, that's an example of the video chat. So if you want to join us and follow in David's footsteps, just press the raised hand button and you can join us up here. We also have a whole bunch of questions that have come in. And thank you, Audrey, for that very passionate answer. That reminds me a bit of the Biden administration's earlier budget proposals this year, which tried to incorporate caring labor under infrastructure. We have a stack of questions, which are more interesting than me. One comes from a participant at the Université Laval, who asks, if I press the right button: is the fight against mechanization also the fight against scaling up, or growth? I think that they're certainly intertwined today, but they're intertwined historically as well. And, sorry to keep saying "the book, the book," but really, for me, I was interested in telling a story that wasn't about computers. 
Partially because I wanted to break free of that progress narrative I talked about earlier, where so many books about the history of ed tech throw the pre-computer stuff into the first couple of chapters, and then it's sort of like, ta-da, the computer came and we're all saved. But if we look at the early 1900s, when this push for machines came, it really coincided with the expansion of public education, with the expansion of educational institutions and access to education and expectations about education. And there were scrambles, of course, to staff this, scrambles to figure out what it looks like, particularly in a period of mass immigration to the US, to teach everybody. How do we do that practically? How do we do that philosophically? But then, because we're Americans, God damn it, we can't all be mass educated, that would be European, right? We have to take care of our individualism, another one of these things that is so core to American ideology. We want to scale up. We want to make sure that everybody has access, or everybody, asterisk, not everybody. We want to expand public education, but we have to make sure that it's personalized. And that's this other paradox, I think, about teaching machines, then and now: the way we expand it is that we mechanize it, we automate it, and then, interestingly, by automating it, we personalize it. It doesn't make sense, but, I mean, it makes sense to someone like Mark Zuckerberg, right? And in that kind of thought, we get to personalization through standardization. It's the personalization where we give companies our data and, in exchange, they give something back to us. Exactly, exactly. We had several questions that respond to what you've been saying, and I want to make sure that we can bring them up at the same time. 
This is one from Judith Betchner. Judith, I'm probably mangling your last name, my apologies, I'm trying to get it right. I think it's Betcher. Does your book address how machines can be used to support critical thinking skills? Certainly, I think that all of the developers and inventors of machines believed that they were making machines to make education better. None of them had the vision that by mechanizing the process of education we were going to make things worse. The idea was that we will make education better. Students will move more rapidly through the materials, and students who struggle will be able to move through the material at their own pace, right? This very common thing that we hear about today was absolutely part of the vision of teaching machines. Skinner talked a lot about letting students move at their own pace. So, yeah, what counts as critical thinking skills is something that I'd want to poke at. I mean, even today, when you think about the kinds of platforms that offer personalized learning, right, that claim that this is what they do, I'm not sure that we've really advanced much farther than having, again, a set of standardized materials that you can move through at your own pace. No one is suggesting, not in Sal Khan's vision or Mark Zuckerberg's vision or Bill Gates' vision or B.F. Skinner's vision, that fifth graders should do Egyptology instead of algebra. Everyone says we're going to do algebra; you just get to move through algebra at your own pace. So this idea of personalization is still very much grounded in a lot of ideas about standardization. And so critical thinking within that, I think it depends on what you mean. So no, really, my book isn't about that; it's a history. 
It's less about today, although like any good work of nonfiction, it does have an introduction and conclusion that gesture to that. But really it's the story of ed tech, the history of personalized learning from the early 1900s through, let's just say, the early 1970s, when Skinner ostensibly went down in flames, although not really, darn it. He's a kind of specter now. Judith, thank you for that great probing question, and Audrey, what a fantastic answer. We have more questions on the side of economics or operations. This one is from David at the University of Denver, who asks: the money follows rankings, the rankings follow research. Who's going to break the chain so that money flows to education and caring? Or how? Great question. I think it would require a huge recommitment, first and foremost, to public education, really recommitting to the visions that we've had from time to time in certain places. We Americans have had a great commitment at times. I mean, I think that in some ways the idea that we do have a free education from kindergarten through 12th grade shows that we have decided, in places, to invest in education. I think that certain states had, and I'll put it in the past tense, sadly, a great commitment to public higher education that was accessible to everybody, free, and high quality, right? The state of California, for example, where I live now: the governor, you might have heard of him, his name was Ronald Reagan, decided that he wanted to dismantle that for political reasons. But first and foremost, I think we really do have to recommit to public education. 
And rethinking what that looks like will require us, I think, to evaluate some of the things that are foundational to these institutions, and some of those foundational things are beliefs and practices and values that we need to get rid of, that we need to dismantle, that we need to challenge. I mean, our institutions are built on white supremacy, for example, and I think that we will not ever move towards institutions that are grounded in care if we don't address the fact that we have built these institutions, at the K through 12 level as well as in higher education, on ideas of hierarchy, on ideas of difference, on ideas of exclusion. And so I think that we have a lot of work to do to get towards caring, but it's political work as well as personal work. But yeah, I hope that answers the question. That's a tough question. David asks good questions. Thank you for that, David. And Audrey, that's an equally tough answer, or at least a strong answer, I should say. Thank you. We have another question coming in from a longtime friend of the program, a brilliant CEO and commentator as well as a wonderful teacher, Maria Anderson. Let me bring her up on stage. Hello. Hello. Hi, Audrey. Maria, hello. Long time no talk. All right. I can't remember which question it was I was going to ask. Oh, I think I just wanted to make a comment. I think that part of the problem we're having is that we're in a chicken-and-egg cycle between technology and increased class sizes and increased loads and decreased full-time jobs and things like that, right? So it's kind of a question, at this point, of what's leading what. I think the economic pressures on education are forcing, or making it economically viable for, some of these technologies to be built, but they're not actually solving the problems of increasing learning. 
And then the other thing I just wanted to point out, Audrey, I think you're familiar with the sleeping-eating-bonding model of problem-solving, proposed, I think, by Christy. Did you put that in a book somewhere? And I really think that part of the problem we're seeing around ed tech is that technology companies are treating learning like it's a sleeping problem, like it's a very simple, A-B-solvable problem: try this, then try this, then try this; nothing else is going on, there are no other conditions at play, right? But learning is actually an eating problem. Motivation matters, your support group matters, everything about your life matters with learning. And here's where we have to separate learning and assessment. Assessment, trying to get at what a person knows at this sliver in time, is probably a sleeping problem. It's probably a well-enough-defined problem that we can get better at assessing things. But if we want to get better at students learning things, it's not something that's going to be solvable with simple technology, right? It's a completely different animal. Yeah, I mean, one of the things that was interesting to me when I was doing the research for this is that I spent a lot of time in archives going through letters that academics, education psychologists, wrote to one another. And looking back, I saw some of the same ideas that we have today, with people who decide that it all seems very fixable to them, right? There's the famous story that Skinner told. It would be a TED Talk if he were alive today; in fact, I think it actually is Sal Khan's TED Talk. He went into a classroom and he noticed that some students were able to do their math quickly. 
Some students were struggling, but the teacher moved at the same pace, gathered up the assignments, took them home to grade, and the students didn't get them back until the next day. And Skinner thought, aha, I should build a machine. I think that was either Skinner or Sal Khan, one or the other. But this idea that education is an engineering problem, right, that if we just turn the dials the right way, build a machine, and fiddle with the dials, we can fix it, is so common. And it's been common for a very long time. People say, why hasn't education changed, for example? Education has changed. It's changed immensely, but it hasn't changed in the way in which we expect the dials to be adjusted so that somehow everyone is learning at three times the speed they were learning a couple of months ago. It's a vastly complex institution, and individual human learning, teaching and learning, are incredibly complex processes. And so this idea that we can just gadget it, you know, appify it, big-data it, whatever verb you want to use, I think is really intertwined with the way Americans want to gadgetize things, but it's so much more complicated. Okay, so now I'm dying to ask you a different question, one I hadn't written in the chat, and then I'll get off the stage so somebody else can come up. As we look at how the workforce is changing because of machine learning and AI and automation, all of these things which got accelerated during the pandemic as well, we need to be able to on-ramp students into the workforce at a higher level of experience and understanding and analytical ability than we have in the past, because all of those entry-level jobs and entry-level skills are being replaced by machines. 
Do you think it's possible that we can continue to do a four-year college education that on-ramps students into the workplace at a higher level than it does now, without having a longer education? Can we skip all the low-level stuff and get to the high-level stuff? I'm just curious what your thoughts are. Yeah, I mean, I would say that's one of the problems we have: we have different expectations of what we think the function of education is, both at the K through 12 level and in higher education. And I'm not sure those functions and expectations always match. Is the function of the university to do job training? That's a different function than what this institution has historically been prepared for, unless your job was one of three things, right? Medicine, and not even medicine until fairly recently, the clergy, and law. Really, the university has been about something else. And ideally students come out with the ability to do any number of things. So I don't know that the expectation that it's a job training center matches historically what it's supposed to do, even at the K through 12 level. And when you look at so many courses at the university, they're, like, fact-based, especially in the STEM fields, right? Shove as many facts into you as possible in the shortest amount of time, right? And that doesn't really seem like it's preparing me for jobs either, or for being a citizen, all that memorizing. Yeah, I mean, memorizing freshman biology is not going to help me 10 years later when I'm an adult, you know? And it's hard. I think that the job market also exists out there, like it's this other thing that I'm not sure the university has much control over.
I think that some higher education institutions are much more closely aligned with the job market, the community college system, for example, because historically that has been part of its function, often at a very localized level: what are the demands of whatever the local industries might be, and how do we quickly turn out people who have the skills for that? But for four-year institutions, I don't think that's necessarily been the expectation. I think as the cost has gone up, it's become more and more the expectation from the consumer that they're being prepared to get a job. Yeah, and those are the stories I think the mainstream press likes to tell. They like to shake the finger at the university as though it's somehow the university's fault that the job market is shit, right? I mean, it doesn't matter what courses you offer, whether they're fact-based biology courses or courses in the history of Spanish courtly poetry or computer programming classes; those don't determine the job market at all. And so it's a real disconnect that we have. But yeah, I think we have put a lot of faith in this country, very aspirational faith, in education: that if you do the education thing, then at the other side of it you come out and you will get a good job, you will be able to buy a house, whatever the American dream looks like. But I'm not sure that the university is the place that is the deciding factor on that. It's a convenient scapegoat when things go to hell. But it's not really, I think, the thing that necessarily facilitates that. Boy, we sure have believed that, though, and we've been willing to go into a lot of debt in order to do it. But I think that the economic story is a lot more complicated.
And I think it's unfair to say that it's the fault of universities that things don't work out well. Yeah. Well, I think we could talk about this for another 20 minutes, but it's a great topic. Thank you, Maria, and keep the conversation going, of course. Thank you. And thank you, Audrey. My gosh, I have so many questions and comments myself, and I'm here so everybody else can ask theirs. While I'm bringing up our next question, one quick note: you remind me of Tressie McMillan Cottom's education gospel from her first great book, Lower Ed. Our next video conversant is Kim DiBacco, coming from UCLA. Kim, let's see. Hello. Hello. Hi, Brian, and hi, Audrey. Really nice to meet you, kind of sort of virtually in person. I wanted to take a turn back to caring, another take on that. And I was wondering if you could share some thoughts about how we might use or apply the discourse of diversity, equity, and inclusion as a critical lens for rethinking technology-enabled teaching and learning. You know, diversity, equity, inclusion is kind of sweeping through at the moment, and everyone's kind of getting woke and working with it and using it in different ways. And I just wondered what your thoughts about that were. Gosh. You know, I think that one of the things when it comes to ed tech specifically is that we expect more of machines than we do of each other. I think Sherry Turkle says this. Oh, Sherry Turkle. Right, and I think that that's part of it: we decided we were going to offload so much of this work onto machines. And perhaps rightly so, because we have failed, I think, to care enough. We have failed to reorient our institutions in ways that would be more just, that would be more inclusive. But I think that by deciding that we're going to outsource this and allow the machines to do it, we're offloading things in ways that will not get us there.
I think that it's really challenging. Like I said when I responded to David's question, the education system, K through 12 and higher education, seems on one hand like it would be about affective labor, like it would care about caring. But one of the reasons that machines were developed, and in fact that standardized testing was developed in the 1920s, along with the processes and practices around it, was fears about caring, right? So when they developed standardized tests, and the multiple-choice test in particular, which could be credited to a number of people, though Frederick Kelly is often who gets credited with coming up with it, the concern was that the K through 12 teachers, who were women by that stage, were going to care too much about their students and therefore not be objective when they assessed them. So education psychologists decided that we had to devise a way to take the caring out, right? We needed to come up with a system, and the multiple-choice test was a great way to do this, in which the teacher couldn't weigh in subjectively about the student; she could just grade a test that was standardized and objective. And so think about the ways in which, I mean, I use hard-coded as a metaphor, but also literally the ways in which that has become hard-coded into our system: a practice and then a technology that was designed to eliminate caring, right? We've spent over a century developing systems and practices that have made it so that caring was, by design, excluded from our practices and processes. And so we have to undo a lot. And I think that we have to, by extension, undo a lot of the education technology, right? If education technology is built on the legacy of standardized testing and testing machines, then we're not going to get to a practice of care.
And by extension, I don't think we're gonna address questions of justice, questions of diversity, without undoing what are really the foundational ideas of the field. And I think that's why, for me, the history is so important too. The standardized test is intertwined with eugenics, right? If eugenics and standardized testing are the foundation of ed tech, then we're in a lot of trouble. We're in a lot of trouble if we don't start to think about that, and think about it really quickly, rather than just moving on and acting as though, well, we don't practice that today. I mean, that's sort of ridiculous. This is the foundation upon which the house was built, and the foundation is rotten. You cannot build a solid structure on a rotten foundation, and the foundation of ed tech is rotten. So what are folks gonna do about that? Yeah. Thank you. Thank you, Kim. What a great question. And Audrey, what a powerful answer to leave us with. If the foundation is rotten, if it's shot through with all kinds of problems, what should we do moving forward? Should we start from scratch, or how should we proceed? Yeah, I mean, some days I think, to hell with it, set it on fire, burn it all down, start from scratch. I don't know what else to suggest, but I recognize that that actually probably leaves the most vulnerable among us more vulnerable. So not necessarily a great thing to do. I do think that there are other lineages that we can look to, other moments in history, other ways of being and thinking that we can look to and build upon. There are other stories. And one of the things I'm really proud of in the book, and let me back up here: I have read a lot of histories of education technology, and I don't know how many of them ever talk about race. It's just not mentioned. School segregation isn't mentioned. None of these ideas are really talked about.
Again, I think education technology sort of acts as though it exists in a space outside of politics, outside of the practical realities, the lived realities, of a lot of people. And so in the book, I wanted to show what some of the alternatives were at the time, in this pre-computer era, that people turned to in order to resist teaching machines, to resist what they saw as the encroachment, the mechanization, of education. And that to me is an interesting thing to think about: where else have people found the ability to resist the machine? And one of the most powerful examples of this, I think, was in the freedom schools, particularly during Freedom Summer in Mississippi. The idea among some in SNCC, the Student Nonviolent Coordinating Committee, was: why don't we use teaching machines? Why don't we use programmed instruction? Why don't we use these scientific methods to help, for example, with adult education? To help boost literacy among Black folks who couldn't vote because they couldn't pass the literacy tests. But I think they realized very quickly that this was not going to be a practice of freedom, that that wasn't how one would move forward: by adopting this technology of standardization, by adopting programmed instruction that was written by engineers who had no idea what the lived experience of a Black sharecropper in Mississippi would be. And so the freedom schools were about building teaching practices, building alternative education, educational practices that were community-based, that solved local problems, that were about inquiry, curiosity, self-fulfillment, but also community politics, and about thinking about systems that could resist and push back, even against the educational system itself, right? Against the kinds of practices that, even after Brown v. Board, would continue to make education separate and unequal.
And so the freedom schools to me are one of these sites that we can look to historically and recognize that people have resisted the mechanization of education. They have been resisting the mechanization of education for a very long time. We just haven't heard these stories, because I think the story of ed tech is often told from a very white perspective, and from a perspective that sees ed tech as the culmination of education and doesn't recognize that there are alternative stories in which resistance can actually be foregrounded. Benjamin Whitmore, that may be an answer to the question you asked before about students pushing back on data mechanization. Audrey, for everybody, that's a fantastic answer. What a wonderful way forward for all of us. And I have to close on that note, because we are past the top of the hour and you have been so generous with your time, so generous with your thinking. It's been both an instruction and an inspiration to be able to spend an hour with you. Well, thank you. And I'm excited. It feels very weird. I don't actually have a physical copy of the book yet, so it still kind of feels strange. I should get my author copies sometime this week, I hear. But I'm very excited for the book to be in people's hands. I hope to have more conversations with people about this, because I think it's an important book. That sounds weird and egotistical, but I think it's an important book, and I'm glad that it's gonna be out there. Writing a book is an act of egotism. It has to be, or you can't... That's true. And I just shared this in the chat, and I'll share a quick link again for those of you who haven't seen it yet, to where you can find the book from MIT Press, Teaching Machines. And on that note, Audrey, please take care. Enjoy the conversations that follow, and we'll be in touch. Looking forward to seeing you again. Be safe.
Everybody else, don't go yet, because we have to let you know what's happening over the next few weeks. Thank you all for your fantastic, fantastic questions. Looking ahead, we're gonna cover a whole range of topics, from digital reading to the educated underclass to rethinking teaching. If you'd like to keep talking about all of these issues, so, what does it mean if the foundation of ed tech is rotten? What can we learn from freedom schools? What does it mean to teach critical thinking with a teaching machine? Please feel free to keep the conversation going on Twitter using the hashtag FTTE. Throw something at our blog there; we'd love to hear from you. If you'd like to go back into the past, all the way back to Audrey's first appearance, or if you'd like to see our sessions on pedagogies of caring, our sessions on race and equity, please just head to tinyurl.com slash FTF archive. They're all there, freely available for you. In the meantime, thank you again for fantastic questions and thoughts. It's been an absolute treat to work with you. Again, everybody, please enjoy summer as best you can. Good luck planning for the fall. Take care, be safe, and we'll see you online. Bye-bye.