Good morning, everybody, and welcome as well to those joining us online, to the third day of the conference. I think we're in for a treat this morning, so it will have been well worth getting up after the night before. It's an absolute privilege this morning to be introducing to you Audrey Watters, who has come all the way from the west coast of the United States. She's an educational writer, who some of you may have come across, because she's written in many influential publications, including Inside Higher Ed, the Huffington Post and Educating Modern Learners. She tells me that she's currently working on a book called Teaching Machines, and that's reflected in the title of her talk, so maybe we're going to get a little bit of a sneak preview, I'm not sure. One other thing she does, which is absolutely fascinating, is write a blog called Hack Education, and I've stolen from there the introduction she has for herself. In the interest of reusing open resources, I thought rather than write my own I'd use hers. She describes herself there as an education writer, a recovering academic, a serial dropout, a rabble-rouser and, some days, Ed-Tech's Cassandra. I confess I had to look up who Cassandra was, but it turns out Cassandra was a prophet who was cursed by Apollo so that people didn't believe her prophecies. I'm hoping today, Audrey, that the rabble we have for you here will believe some of your prophecies, and I'm sure you'll have some interesting questions and discussions following the presentation. So I'd like to hand over then to Audrey, and please welcome her to the conference. Thank you, thank you very much. I'm thrilled to be here. On Monday, driving on our way up here to Coventry, we, and "we" means myself, my boyfriend and my mum, stopped at Bletchley Park, the site of the British government's Code and Cypher School during the Second World War and the current location of the National Museum of Computing.
My boyfriend is accompanying me, as he always does. We travel together; our tech worlds overlap quite nicely. Some of you might know him: Kin Lane, the API Evangelist. He's here because I'm here, but he's also here because he's a huge fan of Martin Hawksey's work. And he's a huge fan of drinking beer with David Kernohan. So it's very unlike him to miss out on an opportunity to come to the UK to be with you all. And my mum is here as well, because my family, on her side at least, is from England, and we've reached that difficult part in my granny's life, she's 90 years old, where we rotate through how we can come over from the States and be good children and good grandchildren and visit her. So I told her I was coming over and we timed our visits together, which is great because she's also volunteered to drive Kin and me around on the wrong side of the road. So I'm very appreciative of that. When we were planning the trip I said to my mum, you know, as we drive up from London, can we please stop at Bletchley Park? And she said, casually, oh, I think your grandfather might have worked there. And I said, what? That's a piece of family history I would have liked to have known about, as someone who writes about computers and thinks about computers a lot for a living. I hadn't really thought until that moment that there might be a family connection there, but it does make sense. My grandfather during the war was the station commander at a Chain Home Low station, part of the development of early-warning radar. He later became the Air Officer Commanding-in-Chief at Signals Command. He was knighted for his work in developing radar, actually, but I'm not sure he ever really talked that much about it. He certainly never mentioned, I think, any of the work that he might have done at Bletchley Park.
Of course, this was the huge secret. In fact my granny said that during the war she never knew what he did until afterwards. She never asked; they never talked about it. And he passed away, I think, before we could have any of these conversations that would have been really interesting to me, about how technology, how what we do with computers, fits into this other military project. I'm fascinated by these sorts of stories. The reason I say I'm Ed-Tech's Cassandra is definitely about making predictions and prophecies about the doom that might be on the horizon. But I'm also a folklorist by training, and I'm very interested in the stories that we tell. That's important to me. I'm a folklorist by training. I'm not an instructional designer. I'm not a software engineer. I'm not a business person. I'm not an investor in education technology. I'm not a computer scientist. I play a journalist on the internet. I'm a folklorist, and our disciplinary training helps shape how we see the world. I am a folklorist, and I am interested in the stories that we tell, particularly the hidden stories and the forgotten stories and the lost stories, much like my grandfather's involvement, perhaps, in Bletchley Park. Or, more broadly, the way in which we have forgotten to talk about the connections between computer history and surveillance and war. I'm really interested in what stories we tell and whose stories get told, and in how these stories reflect, and I think even construct and shape, our world: the worlds of science and politics, of culture, and of course of education. I try in my work to trace and retrace the connections through some of these different stories, these different narratives, and counter-narratives as well.
The stories of business, and then also the stories of bullshit, which is a lot of what you hear from the Silicon Valley tech sector. My keynote this morning is going to try to weave together lots of these different stories for you, and my apologies if you're hung over. If the keynote doesn't make sense, it's actually the booze's fault and not that I'm incoherent. I'm trying to pull together stories from history, from literature and from science. When I heard that the theme of the conference was "Riding Giants," I confess that I didn't actually think about waves, even though I live in Southern California, in the middle of surfer culture, and I didn't think of Isaac Newton's famous line about standing on the shoulders of giants. I thought about giants the way that a folklorist would. So as I was preparing my talk, I went off in a rather different direction with giants. What I want to talk to you about this morning is monsters. I want to talk about ed tech's monsters and machines. I want us to think about Bletchley Park, perhaps, on the road to where we are today, thinking about some of the different paths that have got us to this place in education technology. No doubt in the last few years we've witnessed a real resurgence of interest in education technology, a renewed and growing interest, particularly from those in Silicon Valley. I think folks here at this conference are certainly well aware that there is a lengthy history to education technology. But many of the folks I talk to, some of the newest proponents of education technology, particularly in the States, particularly in Silicon Valley, insist that ed tech has no history. They invented it. Ed tech has only a now and a future. There's really nothing to be talked about or learned from the past. And ed tech now, as they see it, is closely tied to venture capital, particularly in the States.
It's a very powerful form of storytelling that many of these ed tech proponents are engaged in, and the media has picked up a lot of these stories as well. We've seen this in particular with the MOOCs. The storytelling is about disruptive innovation mythology, entrepreneurial hagiography, design fiction and fantasy. It's really a fantasy about education technology, a fantasy about a computer industry that wants to stretch its tentacles throughout the world. We've been given a map, in some sense. Society has been handed a map of the world as drawn now by many of these technology companies. And they want us to think about the ways in which these brave new ed tech explorers, the brave new entrepreneurs, are going to conquer lands for us, divide up and rethink and reshape our digital spaces. They're doing this for us. This is their fantasy. And they warn us, down at the bottom of that map, of the technologies of the past, the dangerous, unexplored or overly explored places that we should reject because they're stagnant, no longer populated. There are dragons there. Hic sunt dracones, right? There be dragons, down in the land of forgotten websites. But I actually want to argue that we need to face our dragons in education technology. We need to face the monsters that have been created. We need to face the giants. And they aren't simply on the margins of the maps; these monsters are scattered throughout these lands. So I'm in the middle of writing a book about some of these questions, a book called Teaching Machines. It's a cultural history of the science and politics of education technology. I think of it as something of an anthropology of ed tech.
It's a book that looks at knowledge and power and practices, at learning and politics and pedagogy. I'm really interested in the long-running push for efficiency, this desire to build machines, the history of education technology throughout the 20th century, pre-computers, that was very much interested in automating instruction: tools like intelligent tutoring systems, artificially intelligent textbooks (that's one I saw recently), robo-graders and robo-readers. Some of this involves, I think, a nod to the father of computer science, Alan Turing, who worked at Bletchley Park, of course, and his profoundly significant question: can a machine think? And I want to ask in turn: can a machine teach? What happens when, and if, machines can think? And what happens when, and if, machines teach? What happens to labor? What happens to work? And what happens to learning, as we find these things more and more automated in our lives? And what exactly do we mean by those verbs, think and teach? When we see signs of thinking or teaching in machines, what does that really signal? Does it signal that machines are becoming more intelligent, or does it in fact signal that humans are becoming more mechanical? Rather than speculate on the future of teaching machines, I want to turn back to the past. Long before Bletchley Park or Alan Turing, machines have spoken in binary, in ones and zeros. Quite recently I literally got tattoos on my forearms to remind me, as I sit and type on my computer, of the way in which machines speak to us. On my left arm here I have an excerpt from Leaves of Grass, Walt Whitman in binary: resist much, obey little. Words to live by. And this one, rather more lengthy, is from Lord Byron's Song for the Luddites: Down with all kings but King Ludd.
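The binary on those tattoos is ordinary text encoding. As a rough illustration (my own sketch, not something from the talk), here is how a phrase becomes ones and zeros when each character's ASCII code is written out as eight bits:

```python
# Encode a phrase as the kind of binary you might tattoo on a forearm:
# each character's ASCII code rendered as an 8-bit binary string.
def to_binary(text):
    """Return the text as space-separated 8-bit binary codes."""
    return " ".join(format(ord(ch), "08b") for ch in text)

print(to_binary("resist much, obey little"))
```

The first eight bits printed, 01110010, are the letter "r" (ASCII 114).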
And I do appreciate the irony of having a song for the Luddites in binary. I'm really interested in these questions around poetry and storytelling and resistance and machines. Lord Byron is a particularly fascinating figure in all of this to me. He was, of course, one of the very few defenders of the Luddites. His only appearance in the House of Lords was when he gave a speech challenging the 1812 Frame Breaking Act, which made destruction of the mechanized looms punishable by death. The Luddites are really interesting, and of course much maligned: these 19th-century artisans who protested against the introduction of factory-owned, mechanized machines. They were upset about the labor-saving textile devices, and let's be clear, the emphasis for the Luddites was on the labor part of that problem, not necessarily on the machines. They wanted to protect their livelihoods. They demanded higher wages in a time of economic upheaval, mass unemployment and, of course, the long Napoleonic Wars. They were opposed to the factories not so much because of the technology, but because the corporations owned the technology and the means of production. The Luddites weren't really anti-technology per se, but that's what the word has come to mean. The wonderful Oxford English Dictionary gives the original meaning: a member of an organized band of English mechanics and their friends, 1811 through 1816, who set themselves to destroy manufacturing machinery in the Midlands and north of England; the etymology, from the proper name Ludd with the suffix -ite. According to Pellew's Life of Lord Sidmouth, Ned Ludd was a person of weak intellect who lived in a Leicestershire village about 1779, and who, in a fit of insane rage, rushed into a stockinger's house and destroyed two frames so completely that the saying "Ludd must have been here" came to be used throughout the hosiery districts whenever a stocking frame had undergone extraordinary damage.
The story lacks confirmation, but it appears that in 1811 through 1813 the nickname Captain Ludd or King Ludd was commonly given to the ringleaders of the Luddites. Ludd was, as this image shows, a giant. Today we use "Luddite" in what the OED calls its transferred sense: one who opposes the introduction of technology, specifically into the place of work. The sample usage in the OED entry comes from The Economist (of course, who else would you cite about Luddites?) from 1986: "By suggesting the modern world has lost control of its technology, these accidents help to strengthen the hands of the Luddites who would halt technology and therefore halt economic growth." I think that's the way the term is used today, as this pejorative. If you question technology, clearly you're against economic growth. To oppose technology, to fear or question automation: some folks, like The Economist, like venture capitalist Marc Andreessen, argue that this means you misunderstand how the economy actually works. I would suggest that perhaps the Luddites understood pretty well how the economy works. I would suggest that when it comes to questions of who owns the machinery, they nailed it. And I would say that the economy works quite well for venture capitalists like Marc Andreessen, and maybe we should question that. In 1984, the American novelist Thomas Pynchon asked, "Is it OK to be a Luddite?", suggesting that in the new computer age we may well have mostly lost our Luddite sensibility. We no longer resist or rage against the machine, but, he cautions, some day we might have to. He writes (and this is 1984, it was Reagan, that's right, so there was this question of whether the world would even survive): if our world survives, the next great challenge to watch out for will come, you heard it here first.
When the curves of research and development in artificial intelligence, molecular biology and robotics converge: oh boy, it will be amazing and unpredictable, and even the biggest of brass, let us devoutly hope, are going to be caught flat-footed. It is certainly something for all good Luddites to look forward to, God willing, should we live so long. Here we are now, 30 years after Pynchon's essay, facing these pronouncements and predictions again: that not just the factory jobs, not just the textile work, but all of our jobs, the white-collar jobs, are on the cusp of being automated. "We are entering a new phase in world history, one in which fewer and fewer workers will be needed to produce the goods and services for the global population." That's Erik Brynjolfsson and Andrew McAfee in their book Race Against the Machine. Before the end of the century, says Wired magazine, 70% of today's occupations will be replaced by automation. The Economist gives a more rapid timeline, as they would: nearly half of American jobs will be automated in the next decade. We are, some would say, on the cusp of a great revolution in artificial intelligence, in computers and robotics, and a great revolution, I suppose, in human labor. Of course, a little asterisk: folks in AI have been predicting a revolution that's always 20 years in the future, ever since the beginning of AI. It's always been 20 years off. It's always on the horizon. It's really for real this time, I'm sure. Like I said, these technological stories that we tell are fantasy, they are fantastic, always about the future. I think we do have to thank Alan Turing for laying some of the philosophical groundwork for artificial intelligence. And of course, ironically, despite his affinity with the Luddites, we do have to thank Lord Byron. He was the father of Ada Lovelace, who is generally considered the first computer programmer. She worked with Charles Babbage on his Analytical Engine.
I love these sorts of connections: one of the few people to stand up for the Luddites gave birth to the person who brought us to the world we have today. I think Byron is interesting too because, as we mark 200 years of the Luddites, we're coming up on another bicentenary in a couple of years, one that Byron was there for as well: the summer of 1816, which he and a small group of friends, Percy Shelley, John William Polidori, Claire Clairmont and Mary Godwin, spent at Lake Geneva in Switzerland. A "wet, ungenial summer," said Mary, in which they all decided to try their hands at writing ghost stories. And there Mary Godwin, later Mary Shelley, wrote Frankenstein, first published in 1818: arguably the first work of science fiction, and certainly one of the most important and influential texts for thinking about what it means to have science and technology and monsters come together. Monsters, of course; it's important that Frankenstein is about monsters and not machines. "However much of Frankenstein's longevity is owing to the unsung genius of James Whale, who translated it to film," writes Thomas Pynchon in his essay about Luddites, "it remains more than worthwhile reading, for all the reasons we read novels, as well as for the much more limited question of its Luddite value: that is, for its attempt, through literary means which are nocturnal and deal in disguise, to deny the machine." It's really interesting, because while the laboratory visualized in James Whale's great 1931 movie is full of equipment, full of equipment, the novel itself actually has very, very few machines in it. I think there's a passing mention of something that creates the galvanic twitch that lurches the creation to life, but really the novel doesn't do all of this fancy laboratory work. Pynchon argues that it's actually really important that there's very little technology, very little machinery, in the novel.
He says that this represents the Gothic interest in rejecting the certainties of science, in returning to an age of miracles. To insist on the miraculous, argues Pynchon, is to deny to the machine at least some of its claims on us, to assert the limited wish that living beings, earthly or otherwise, may on occasion become bad and big enough to take part in transcendence. Even without machines, however, Frankenstein is always read as a cautionary tale about science and about technology. The story has left an indelible impression upon us. We use "Franken-" now as shorthand to refer to all sorts of things we find to be abominations: Frankenfood and Frankenfish, the monster, the monstrosity, a technological crime against nature. I think it's very telling that in popular parlance we often confuse the scientist, Victor Frankenstein, with the creature; we call the creature itself Frankenstein. The sociologist Bruno Latour argues that we don't merely make that mistake, confusing the monster with the scientist; we actually get the whole crime wrong. The crime was not that Frankenstein invented a creature through some combination of hubris and technology, says Latour, but rather that he abandoned the creature to itself. The creature, again a giant, insists in the novel that he was not born a monster. He became monstrous after Frankenstein fled the laboratory in horror when the creature opened his dull yellow eyes, convulsed and breathed. "Remember that I am thy creature," he says when he confronts Frankenstein in the Alps. "I ought to be thy Adam, but I am rather the fallen angel, whom thou drivest from joy for no misdeed. Everywhere I see bliss, from which I alone am irrevocably excluded. I was benevolent and good; misery made me a fiend."
Latour says that, written at the dawn of the great technological revolutions that would define the 19th and 20th centuries, Frankenstein foresees that the gigantic sins that were to be committed would hide a much greater sin. It is not the case that we have failed to care for Creation; it is that we have failed to care for our technological creations. We confuse the monster with his creator; we blame our sins against nature upon our technological creations. But our sin is not that we created technologies; it is that we failed to love and care for them. It is as if we had decided that we were unable to follow through with the education of our children. Our gigantic sin, again, is that we failed to love and care for our technological creations. We must love and educate our children. We must love and care for our machines, lest they become monsters. Indeed, Frankenstein is also a novel about education. The novel is structured as a series of narratives about education: Captain Walton, who writes letters back to his sister, talks about his education; inside that is the story of Victor Frankenstein and his education; and tucked inside that is the story of the creature and the education he gets from observing people. It's the story of what happens when science goes awry, but it's also the story of what happens when education goes awry, when education isn't about guidance and love and care and support. "Oh, that I had remained forever in my native wood, nor known or felt beyond the sensations of hunger, thirst and heat," the creature says. In his article "Love Your Monsters," Bruno Latour says that Frankenstein is a good parable for thinking about political ecology. Again, the lesson is not that we should step away from technology, not that we should reject innovation or reject science, but rather that we have to strengthen our patience with, and our political commitment to, all of creation.
Capital-C Creation now includes our science and our machines. Frankenstein, I think, might also be an interesting parable for education technology in the same way. What are we going to make of ed tech's monsters? What are we going to do about our machines? Is there something to be said about pedagogy, technology and what we're seeing, increasingly perhaps, as an absence of care? We have 200 years of Luddites, we have 200 years of Frankenstein, and by my calculations at least, we have about 150 years of building teaching machines. To be clear, when I give a nod to the Luddites and to Frankenstein, it's not about rejecting technology, it's not about rejecting science, but it is about rejecting exploitation. It's about rejecting what's become a really uncritical and unexamined belief that these technologies equal progress. The problem isn't that science gives us monsters; it's that we have pretended that science is divorced from our responsibility, divorced from love, divorced from politics, that science simply gives us the truth. The problem, again, isn't that science gives us monsters; it's that it doesn't actually give us answers either. And that, I think, is the problem with ed tech's monsters. That's the problem with teaching machines. They want to give us precise, true answers. They're built on the idea that, if we're to automate education, we have to see knowledge in a certain way: as atomistic, fixed, hierarchical, measurable, non-negotiable. In order to build a lot of our education technologies, we've come to view the world as a very fixed thing. I love putting these three images on the slide. I'm sorry; my apologies go out to the Skinner family. Except not really. Although teaching machines predate his work by almost a century, they are most commonly associated with the wonderfully bad-guy-looking man up on the top right-hand side: Harvard psychologist B.F. Skinner.
Here's an excerpt from the wonderfully evil-looking Ayn Rand about Skinner, describing his 1971 book Beyond Freedom and Dignity. The book itself, she says, is like Boris Karloff's embodiment of Frankenstein's monster: a corpse patched with nuts, bolts and screws from the junkyard of philosophy, pragmatism, social Darwinism, positivism, linguistic analysis, with some nails made by Hume, some threads by Bertrand Russell, and glue from the New York Post. Damn, I hope someone reviews my book like that. The book's voice, like Karloff's, is an emission of inarticulate, moaning growls directed at a special enemy: autonomous man. Now, I quote Rand's criticism here, of course, because she invokes Frankenstein quite nicely, and I'm really fascinated that she's arguing, here we go, that B.F. Skinner, this monumental figure in education technology and educational psychology, is Frankenstein; that his science, behaviorism, is a misbegotten creature from a misbegotten science. B.F. Skinner is Frankenstein. And I think it's interesting how she plays into this notion from the film: that he dares to play God, that his creations are monsters, that Skinner is fixated on control, on a rejection of freedom, on the absence of emotion or care. But before you're all horrified that I quote Ayn Rand: I quote her with a great deal of irony, of course, because the Silicon Valley tech industry right now could not be more fascinated with her. The Valley right now is full of laissez-faire, Objectivist, libertarian capitalists who are embracing her vision of the world, which I would argue is monstrous in its own right. Rand uses Skinner as a way to argue that this is why we can't have federal funding of science. She says that in her book review.
This is a great example, she says, of why the government needs to stay out of research: we need to open up science to the free marketplace of ideas, and the free marketplace of ideas will make sure that nonsense like behaviorism loses. Of course, ironically, the free marketplace of ideas that libertarians love right now is chock-full of behaviorist crap. She criticizes Skinner because in his world there is no freedom, because Skinner always wants to control us, to lead our lives: control by scientific management, by technocrats who know best and who offer us positive reinforcement. She says that she will not stand for this, but these behaviorist technologies are threaded throughout the technologies we use today: gamification, notifications, nudges. No surprise, of course, because, given the rejection of history, many folks in Silicon Valley don't actually know who B.F. Skinner is, and wouldn't know behaviorism if it bit them. The Turing test, of course, that foundational thing in artificial intelligence, is in many ways a behaviorist test. As Alan Turing framed it, the question isn't really "can a machine think?" but "can a machine exhibit behaviors that convince a human, that fool a human, into thinking it can?" Again, monsters and machines. Before developing teaching machines, our friend B.F. Skinner worked on a number of projects, inventing as part of his graduate work, around 1930, what's now known as the Skinner box: the operant conditioning chamber, which he used to study and train animals to perform certain tasks. You do it correctly, you get a reward, just like so much ed tech today. I mean, often not literally candy. During World War II, and this is my favorite Skinner story, he actually worked on Project Pigeon, an experimental project to create pigeon-guided missiles. I'm not lying, truly.
I cannot begin to tell you how much I wish I could talk to my grandfather about the development of radar and his thoughts on pigeon-guided missiles. Even more than talking to him about Bletchley Park, I would love to have his thoughts on pigeons and war. But the military cancelled that project; in fact they cancelled and revived the pigeon project several times. "Our problem," said Skinner, "was that no one would take us seriously." By 1953 the military had devised machines that could guide missiles, and no longer needed to rely on animals to guide them. That same year, 1953, Skinner visited his daughter's fourth-grade classroom and came up with the idea for the teaching machine. He was struck by how inefficient the classroom was: not only were the students expected to move through their lessons at the same pace, but when it came to assignments and quizzes they sometimes had to wait a whole day to get feedback from their teachers. Skinner believed these were flaws in schooling that could be addressed through automation, through a machine. So he built a prototype, which he demonstrated at a psychology conference the next year. All these elements are part of B.F. Skinner's teaching machines: the elimination of the inefficiencies that come with human teachers; the delivery of immediate feedback; the ability for students to move at their own pace. I think today's education technologists call that personalization. Addressing social problems, including problems like school, said Skinner, meant addressing behaviors. As he wrote in Beyond Freedom and Dignity, we need to make vast changes in human behavior; what we need is a technology of behavior. Teaching machines, he said, would be one such technology. Teaching, with or without machines, Skinner said, was reliance on contingencies of reinforcement. The problem with human teachers, he argued, is that they just weren't consistent about how they reinforced things.
They didn't reinforce right away, like a machine could. Like I said, sometimes there was a delay between when a student did something and when they found out whether they were right or wrong. Also, teachers, he said, were too often focused on punishing bad behaviors rather than rewarding good ones. "Anyone who visits the lower grades of the average school today will observe that a change has been made, not from aversive to positive control, but from one form of aversive stimulation to another," Skinner writes. With the application of behaviorism and the development of teaching machines, there was no reason, he insisted, why the schoolroom should be any less mechanized than the kitchen. Maybe there are reasons. I would like to think that maybe monsters and Luddites can help us form a better response. According to Google Ngrams, one of my favorite non-scientific tools to play with, the tool that tracks the frequency of words within the corpus of books that Google has digitized, you can see that as society became more and more industrialized we talked about Luddites increasingly over the years. The pattern is much the same until, interestingly, the turn of the 21st century, when, according to Google at least, and I'm sure they have no interest in making it look this way, and to paraphrase Dr. Strangelove, we stopped worrying and learned to love the machine. But by "love" here I do wonder if we mean instead fascination with and attachment to the shiny and the new, with acquiescence perhaps. We're no longer engaging with our technologies scientifically, politically or sociologically, and again, this is not what Bruno Latour meant when he said we should love our monsters. As interest in the Luddites declines, I fear we face what Frankenstein counsels against: a refusal to take responsibility.
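What the Ngram viewer plots is, at heart, just relative word frequency per year. A toy analogue (my own simplified sketch, not Google's actual service or API, with invented miniature "corpus" slices) might look like this:

```python
from collections import Counter

# Hypothetical, tiny year-sliced corpus standing in for Google's
# digitized books; each value is the text published in that "year".
corpus_by_year = {
    1811: "the luddites broke the frames and the owners called in the army",
    1986: "the economist warns that the luddites would halt economic growth",
}

def relative_frequency(word, text):
    """Occurrences of `word` divided by total word count in `text`."""
    words = text.lower().split()
    return Counter(words)[word] / len(words)

# Plotting frequency against year is what produces the Ngram-style curve.
for year in sorted(corpus_by_year):
    print(year, round(relative_frequency("luddites", corpus_by_year[year]), 3))
```

The real tool additionally smooths the curve across neighboring years and normalizes against a far larger corpus, but the underlying quantity is the same ratio.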
We see technology as this autonomous creation that's going to enter the world and move society forward without any guidance, under its own steam. Wired magazine's Kevin Kelly is probably the best example of this. He said, in his book What Technology Wants, that technology has its own wants: it wants to sort itself out, it wants to self-assemble into hierarchies, just as most large, deeply interconnected systems do. It wants what every living system wants: it wants to perpetuate itself, it wants to keep going. As technology grows, these wants are gaining in complexity and force. That is monstrous. That is Frankenstein's monster. Kelly writes that we can choose to modify our legal, political and economic assumptions to meet the ordained trajectories of technology, but we cannot escape what technology wants. We should just throw up our hands, I guess. Surrender. Surrender to progress, surrender to the machine.

It's a sleight of hand, of course. It suggests that technological changes are what technology wants. It's an argument that obscures what industry wants, what business wants, what systems of power want. I think it's an intellectually disingenuous and perhaps politically dangerous argument to make. What does a teaching machine want, for example? What does a teaching machine demand?

I want to echo what Catherine said yesterday when she said that education demands our interest and our engagement. I'll insist to you that technology demands our political interest and our political engagement. Our sins are not that we have created technologies, but that we have failed to love and care for them. It is as if we decided we were unable to follow through on the education of our children. Political interest, political engagement: that is love. It is love for the world. Love, and a little bit of Luddism. I want to leave you with one final quotation from Hannah Arendt, who wrote: "Education is the point at which we decide whether or not we love the world enough to assume responsibility for it, and by the same
token save it from that ruin which, except for renewal, except for the coming of the new and young, would be inevitable. Education, too, is where we decide whether we love our children enough not to expel them from our world" (Frankenstein's sin) "and leave them to their own devices, nor to strike from their hands their chance of undertaking something new, something unforeseen by us, but to prepare them in advance for the task of renewing a common world." That is our task, I believe: to tell the stories, to build the society, that would place education technology in that same light, to renew our common world. We in education have to face these monsters we have created, these monsters we have inherited. We have inherited monsters from the technologies of Bletchley Park. We have inherited monsters from mass production and standardization. We have inherited monsters in education technology from behaviorism and control. These are the monsters I think we have to face, the monsters education technology lives with. We have to face them, we have to confront them, and we have to make sure we are engaged politically and not creating more monsters. Thank you.

Thank you very much, Audrey, for a fascinating, inspiring journey through stories and history. Ben Steples has tweeted, while she's been talking, that he's still going to be talking about this in six months' time. But we do have a few minutes to start the conversation now, so I'd like to invite any comments or questions. We have questions as well from our online audience, who have the online link at the end. OK, would anybody like to start us off with some questions or comments?

Our monsters in the UK are slightly different ones from the US's. Yes, there's been the narrative about efficiency and evidence-based practice, particularly since the New Labour government, but we've just been much less good at it. Educational research is much more qualitative, humanist and participative in the UK tradition, and the edtech
project has been more focused on the student experience and enhancing academic practice. So though we have monsters, they are slightly different ones.

No, I think that's actually a really great observation. One of the things that frightens me very much is that as we in the US export our technologies, we're exporting with them, oftentimes, the ideology that comes with those technologies, right? So it is a different sort of imperialism. I wouldn't necessarily call it cultural imperialism, but it is imperialism at the level of infrastructure. I worry deeply that other countries are not just adopting our ridiculously horrid education policy (which you seem to do quite well, following the US's ridiculous education policies), but that as we spread these new technologies throughout the world, and MOOCs are a great example of that, Facebook too, they carry with them this other ideology, particularly this sort of radical individualism. That's very much a Randian project, but it's very much an American project as well. That's a great point, thank you. I'm glad you have different monsters, I guess.

I think it's a really, really good point about the importance of storytelling and the stories that we tell. One interesting reflection: I think the fashionableness of MOOCs in the UK amongst policymakers came partly from David Willetts, the minister, visiting California and getting caught up in the excitement of the venture capitalists, and the story that these exciting new technologies are coming from the venture capitalists of California, about which, of course, probably most of us would tell a very different story. And I'm reflecting on the story told last year at this conference by Wendy Hall about openness, and that perhaps the most significant thing about the invention of the web was Tim
Berners-Lee making it open, deciding there wasn't going to be any monetary value attached to it, that it was part of the human intellectual commons. I'm just wondering how we tell that counter-story about openness and its real value in an effective way.

I think this is one of the great challenges for those of us who work in education and know the different stories we're faced with, particularly when journalists find the stories that the venture capitalists and the technology sector tell very compelling: school is broken, someone should fix it, buy my product. It fits quite nicely into the three-paragraph story that journalists tell. And I think that is our challenge: to present the counter-narrative, and to present it powerfully. Fortunately, I think social media can help crack open the dominant narrative; we're able to tell our own stories through our own voices now. But I do think the challenge is also to tell that story at the national, popular, mainstream-media level. All of us have to do a better job of making sure that our friends in journalism don't fall prey to this grand narrative of thank goodness for the fill-in-the-blank person from Stanford University who's going to rescue us from our institutions' failure to move forward. Many of us within education see the interesting innovations that happen all the time; they just don't look like the story that the media tells.

I showed a group of Chinese teachers around UK schools to look at how education technology was being used to improve teaching. These were people who typically taught between 90 and 120 children in a class, and, you know, if anybody was out of line they got beaten, so there were some downsides to the way they taught, but it was very
effective. And going in and looking honestly and objectively at what was going on in ed tech at the time, when the government was putting lots of money into it, it was very clear that a lot of the things being done were being done because there was technology that needed to be sold, not because there was a learning outcome that was going to be delivered better. What made the teaching work was a human being communicating, intuiting, understanding, caring, reading eyes, watching attention spans, all of those kinds of things; the quality of the teaching wasn't improved by the technology. As technologists we can say "you're not adopting the technology," but if teachers had adopted the technology every time they were given it, no teaching would have gone on at all. There's a humility in understanding that the best teaching goes on between one person and another person, and that whatever is being taught objectively, so much subjective stuff is going on, and that's the real learning. And as you're saying, exercising one tiny muscle isn't keeping anybody fit, and neither is breaking everything down into binary: you've learnt this, yes or no. That perspective on what we're doing is extremely refreshing and I think can't be said often enough.

Thank you very much. I think it's interesting that oftentimes technology, and not just education technology but technology in general, is used to make the practices we already do work a little more efficiently. We can do it faster, better, stronger, perhaps, some promise, more cheaply, but usually just a new set of people profit from it. And I think one of the challenges should be: how does technology transform our practice? It isn't simply how we use technology to do that old thing in a new way. Moving from print textbooks to textbooks on an iPad is not actually that exciting. I mean, it's exciting for Apple and Pearson, because they
want us to buy their new thing, but it's not actually transformation; the potential is to do things really differently.

Oh right, thank you. Nigel Ecclesfield from JISC. I'm very interested in the way you're starting to look at the vocabulary adopted by Silicon Valley, and trying to reclaim perfectly good words from the way they've been jargonized and colonized by the followers of Ayn Rand and others. It brings to my mind George Bush's wonderful comment that the problem with the French is that they didn't have a word for entrepreneur. I'm particularly exercised by a term that's come out of the States, from Harvard: "public value." I blog quite a lot on that, because I think both of those words need reclaiming from the Harvard people, who see public value as something designed and developed by senior managers in public services. I feel that in education the learners and the practitioners have a role in defining what the policies are, but we're in a position at the moment where, even in the operation of the great free services like Google, what we come to be able to access and use is what the algorithms have decided we should know and use. So it's really useful and very positive today to have somebody come in to reclaim good words like "Luddite." And just to finish: one of the great things the Luddites did when they were campaigning, before they started machine-smashing, was to propose a small tax on the cloth produced by the new machines, in order to re-educate, retrain and re-employ those people who were being thrown out of work in the cottage industries.
I'm really interested in the Luddites as well. Something that's also really important to me is thinking about how we can reclaim technologies: instead of giving up more and more of our personal data, more and more of our content, to these large providers, be they Facebook or Blackboard or Google, how can we have better control of and a better understanding of technology? And I think that's an interesting lesson from the Luddites as well. It's not about being opposed to mechanized looms; the problem is when Google owns the mechanized looms. The problem isn't the web; the problem is when the web becomes Google and Facebook. So how do we wrest back control of technological production, so that we have it again, so that it's not controlled by the new factories, the Google factory?

I think you have the blog and Twitter details as well. Could you join me in thanking Audrey again, please. Thank you very much.