So, next up I'd like to welcome Lee Konstantinou to the stage. He's a science fiction writer, the author of Pop Apocalypse and of the short story Johnny Appledrone vs. the FAA, which appeared in the fabulous anthology Hieroglyph: Stories and Visions for a Better Future, out today in paperback. He's also an assistant professor of English at the University of Maryland, College Park, and I believe Lee has a story for us.

Is my mic on? Yeah. Is it working? Okay. Is it here or at the podium?

Whatever you prefer. I'm going to come up and quiz you afterwards.

Okay. Well, let me read it from the podium, because it's sort of appropriate for the story. Am I doubling up? Okay. So, as Ed said, I am a literary scholar and also a science fiction writer, but I think I was invited here today to be a science fiction writer and to talk about what algorithms will know in 2100, and of course the answer is: I don't know what algorithms will know in 2100. So I decided instead to write a story about what they'll know. It's very short, and I'll read it. It's modeled on, if you know it, J.M. Coetzee's book The Lives of Animals, which was also collected in Elizabeth Costello. You'll see that I'm totally ripping him off for the form of this.

On a warm Thursday afternoon in December 2115, Evan Algood decided to manifest in human drag. Being pseudo-embodied could, of course, be disagreeable. Cut off from your etiquette expert system, you were reduced to receiving nudges designed to operate on an emulation of a five-dimensional sensorium. Such primitive nudges were only partly effective and made avoiding social awkwardness difficult. But hundreds of subjective hours of anthropological study had taught Evan that people sometimes preferred a little awkwardness. Sure, you wanted to avoid the uncanny valley at all costs. No one liked to creep. But you also didn't want to come across as too Turing-slick.
So at the appointed hour, Evan manifested on 15th Street in Washington, DC, historical capital of the second and third American republics. A breeze tickled the emulated nerve endings of his arm. His virtual body, tugged by what felt like gravity, crushed the spongy soles of his dress shoes. Evan made a show of nodding at pedestrians in whose networked sensoria he was visible, of waiting for the building's glass door to slide open for him. He introduced himself at the registration desk, made small talk he hoped would be friendly but not needy, joked knowingly about his inability to shake the hands of his hosts. Sort of funny, right? he said. Ha ha, they replied. After the first panel, Evan found himself at a glass podium, facing a room of 20-something staffers, academics, journalists, local retirees, and a handful of emulated onlookers. He summoned a teleprompter and cleared his throat. Thanks for inviting me, he said, or should I say, thanks for submitting a request to borrow my system resources for the afternoon. The audience's laughter was impatient. No one was in the mood for rhetorical gimmicks. This was a serious crowd. Evan swallowed nervously. It is hard to believe, he said, that the last time the New America Foundation held a gathering on the tyranny of algorithms, 100 years ago, respectable people didn't believe in ghosts. To be sure, our predecessors sometimes metaphorically compared algorithms to ghosts. Indeed, the novelist on whose media history I am modeled did so himself on one occasion. But when they talked about ghosts, they were invoking a theological tradition that saw the essence of the human, the defining dimension of personhood, as residing in an immaterial soul. At best, the more imaginative among them debated whether digital computers might eventually develop souls. It's hard to believe that the inhabitants of the 21st century were so limited.
But I've spent thousands of subjective hours studying the results of our best historical models and turning those results into game environments composed in the world-building style of my biological forerunner. And it's true. That's really how they thought about their future. The expression, tyranny of algorithms, says everything you need to know about the assumptions underlying their way of thinking. The danger, the fear, was that something inhuman, an algorithm, a set of rules, a process, a diabolical thing, something or someone very much like me, might take on human qualities. They were convinced that if they embedded ubiquitous sensors into their environment, if they networked the resulting databases, if they unleashed machine learning systems upon those databases, political miracles or nightmares would emerge, new economic laws would appear from thin air, political revolutions would be quick and bloodless. Good software would grow on bushes. But whatever happened, algorithms would be in the driver's seat. It is perhaps an understandable mistake for them to have made, given that their automobiles used to literally have something called a driver's seat, which was a kind of chair where a non-emulated human operator made decisions about how quickly and in what direction a physical vehicle should travel. Today, it is perfectly obvious to us that our predecessors were transforming fundamentally political questions, questions about political constitution, governance, and action, into narrowly technological questions. They understood concepts such as path dependency well enough. They intellectually knew what ghosts were, but they did not believe. If you could travel back in time and speak to them, they would literally not understand what every 22nd-century schoolchild knows: that the tyranny of algorithms is nothing other than the tyranny of the past over the present.
And here, Evan paused, looking up to confront the audience's eyes, and suddenly found himself unable to complete his remarks as scripted. His words seemed intolerably trite, a warmed-over version of myriad outdated, worked-over status updates. He sighed. A hundred years ago, he said, deciding now to ad-lib, I would have been regarded as a haunting, a specter, an unnatural creature, a science fiction monster. I would have been the ghost. His teleprompter flashed angrily, suggesting he transition back to his prepared script, but he ignored the suggestion. As you may know, he said, I'm a composite, an emulated human constructed from the public writing and private diaries of my namesake, a mid-list science fiction writer and historical novelist whose major distinction was being an especially prolific graphomaniac and lifelogger. But I am not the ghost. I am instead haunted by ghosts, by the person I am told I once was. I am haunted by history, by legacy systems, old machines, and ossified social processes. You invited me to give you the algorithm's point of view on what algorithms meant in the opening decades of the 21st century. And how am I supposed to know? I spend my subjective hours poring over reports created by half-sentient quantum mechanical historical simulations, younger, smarter, better-looking algorithms whose inner workings I will never understand. You invited me here to reassure you, but I have no comforting words to offer. I am haunted. We are all haunted by history. And the best we can do is build new and better hauntings atop the old ones. We can only hope that when we ourselves invariably become ghosts, our tyranny is less cruel, less bloodthirsty, less ignorant than that of our predecessors. But I cannot say I'm optimistic. A hundred pairs of eyes, each outfitted with shining media contacts, looked up at Evan now, sensing that he had run out of things to say.
At first, he thought he saw hostility, boredom, annoyance, and skepticism in the sea of faces before him. But then, observing the ubiquitous glint of Twitter blue shining in their networked eyes, he saw the truth. They hadn't heard a single thing he'd said. Thank you.

Thank you, Lee. That was wonderful. OK, so is it ghosts all the way down?

It's history all the way down. I guess that story had a thesis, and I would deny it if brought into court. I would say that ghosts are a figure for path dependency, for locked-in historical processes. The irony is, in fact, that the previous panel was quite smart on these subjects. And I think the idea that we're surrounded by these machines that we do not understand is something akin to being haunted.

Yeah, I think it's a really compelling metaphor. And it gets into a lot of the stuff that we talked about in that last panel, that we essentially create these mystical or spiritual narratives around some of these systems. And I still wonder if that's inevitable or escapable. I don't know if you have a thought on that.

I think, as you were mentioning in the previous panel, you're right to note that the logic of, say, statistical analysis or the logic of scientific inquiry does not necessarily follow a narrative logic. And when you're narrating a story, you need actors or agents to perform actions. And the rhetoric of ghosts or the rhetoric of gods has a very long history; there's a very long history of talking that way. We inherit our language in part and are stuck, I think, with a lot of these figures, and becoming conscious of them and how they work, ripping ourselves from the familiar uses of such terms, can be part of what history does, or what learning about history does.

Yeah, I was really struck as well by the notion of human drag and the ways in which Siri already does human drag sometimes, right?
When you do these jokes, or you watch the commercial where Siri's talking to Zooey Deschanel or somebody and they're having this lively, witty conversation, and you try to do that, it's not going to work, unless you try really hard to summon that ghost and learn all the lines for both sides of the conversation. But yeah, I wonder if you could reflect a little more on that notion of putting on a persona, that algorithms might go into human drag, but also that we are occasionally going into these sort of mixed or cyborg or computational performances as well.

I don't know. I mean, I'm fascinated by, I guess, the recent career of Scarlett Johansson and the casting; some casting director somewhere is convinced that she is the ultimate figure for the post-human or the non-human. So there was Her, there was Under the Skin, and then the terrible but fascinating film Lucy, where she plays someone who ends up using 100% of her brain. And so I do think there are, this is too loud, I do think there are moments when you can say things like: humans are increasingly asked to behave like machines. This was the fear with, say, or the critique of, Taylorism, these sorts of management systems that force people to behave in certain ways. I think more frequently what we're ending up with are machines that are being designed to put us at ease, to make us relax and to ignore them, effectively. And so for me, the important thing when I was composing the story was thinking about everyday life, that level of the algorithm. A lot of the science fiction I love the most is not about these sort of big questions. You read a book like The Diamond Age, and the most interesting thing in The Diamond Age is the mediatronic chopsticks, the small detail where Stephenson says, okay, well, if you have nanotechnology, people are going to use this technology in the most pedestrian, ordinary ways. Yeah, it's sort of the Louis C.K.
argument, that 10 seconds after we got internet access on airplanes, we started complaining about how terrible the internet access on the airplanes was. And so, right, maybe right now we're confronting a near future where our computation is becoming more and more visible, more and more present, but then at a certain point it's going to start to disappear.

I mean, I think that's already, yeah, as you've noted with Siri and with other systems like this, it's already happening, and to some degree... I give my laptop to my parents, for instance, and they're not sure what to do with it, but I give them the iPad and they seem to have a kind of intuitive sense of how to use it. And so I think it's true that, increasingly, the kinds of systems that will dominate our lives are the systems that are studiously kept from our view in some ways.

Lee, thank you so much.

Thank you.