So thank you all for coming. My name is Ed Finn. I am the Director of the Center for Science and the Imagination at Arizona State University. I'm also the Academic Director of Future Tense, which is a partnership between ASU, the New America Foundation, and Slate magazine that explores emerging technologies and their transformative effects on society and public policy. Central to that partnership is a series of events in Washington, D.C., and New York City and a blog on Slate. In addition to the regular editorial content we have on Slate, we have also launched Futurography, a hybrid of journalism and digital learning, where every month we choose a new technology or idea and break it down, asking about the state of the science and inviting experts to reflect on what's happening and what the major themes in the policy and research debates are. And our theme for January is the Spawn of Frankenstein, funnily enough. You can follow today's conversation with the hashtag #ItsAlive and follow Future Tense on Twitter at @FutureTenseNow. As you saw from the documentary footage of our recent editorial meeting, our work on a new edition of the novel for scientists, engineers, and creators of all kinds is forthcoming in May from MIT Press. My co-editor Dave Guston will be here later. He's the funny one. A few housekeeping items: please silence your cell phones. We're live streaming this event, so please, audience, that is you, I'm reading my notes, wait for the microphone before you ask your question, and please make your question in the form of a question, with a question mark at the end of the question. And most importantly, please stick around after the program because we will be having drinks. Yay! So we call this event the Spawn of Frankenstein. Mary Shelley's novel has been an incredibly successful modern myth. 
And so this conversation today is not just about what happened 200 years ago, but about the remarkable ways in which that moment and that set of ideas has continued to percolate and evolve and re-form in culture and technological research and ethics since then. In February of 1817, Mary Shelley was 19 years old. She was finishing up the first draft of her book. She'd given birth to two kids, I believe, and lost one of them already. And she was an incredibly unusual person in her time. She had a very unusual upbringing. And the novel that she wrote reflects that. It brings in all of these cutting-edge, exciting, revolutionary things that were happening, such as the French Revolution. It brings in her bizarre upbringing as the somewhat benignly neglected child of the atheist philosopher and radical free thinker William Godwin, and the looming shadow of her mother, Mary Wollstonecraft. Their home was visited by the leading intellectuals of the day, and she brought chemistry, electricity, medical science, all of these rapidly changing fields, into her work. And in many ways it was a philosophical exercise as much as it was a novel or, as some argue, the first work of science fiction in English. The prompt for this was a dare in the summer of 1816 on the shores of Lake Geneva to write a ghost story. And in some ways this started out as a ghost story about Mary's own lost child. But it was also a ghost story about all sorts of missing parents and missing children. And the specters of Victor Frankenstein and his creature today are very real. They seem to be getting more tangible every moment with every new breakthrough in synthetic biology, artificial intelligence, robotics. Not even poker is safe anymore from machine learning. And the questions that haunted Shelley when she first began composing this work are getting more pressing, I would argue, as we begin in very real, very pragmatic, in some ways almost quotidian ways to create life in all sorts of different ways. 
And so I'd like to open this event by arguing that this is not just a story about hubris, about man stealing fire from the gods, but also a reflection on scientific creativity and responsibility. The ways in which scientific discovery is not so different from other kinds of reproduction, from biological reproduction. The ways in which giving birth to new knowledge is like having a parental relationship. And the ways in which our creations and our responsibilities continually surprise us, especially in a world that is getting more complex and more interconnected. After the book came out, it went viral in a very 19th-century way. It was immediately ripped off for the stage and for translations. Very quickly after it came out, it started to be used as a metaphor in political debates and all sorts of other cultural forms and memes. And perhaps most interestingly, Victor Frankenstein predated the word scientist by about 20 years. So even before we had this notion of the modern figure of the technical investigator, the scientist, we had this flawed figure, this person who balances cutting-edge modern research on the one hand with the ancient, mystical, alchemical arts that Mary Shelley balanced in her book in the context of natural philosophy. So the spawn of Frankenstein is legion: the myths, the figures of the mad scientist, the thoughtless creator, and the creature, the monster, the demon. The abiding questions that stick with us: what it means to be alive, what it means to be human, what our responsibilities as creators are. These are the things that we're going to be talking about today. So we will lead in with Patric Verrone talking about the notion of playing God. Patric is a writer and producer. You may know his work from Futurama. We spent the first 10 minutes of my arrival trying to mic me, and now I've been de-miked. So give me a moment to make sure that, quite literally, this thing is on. 
Let me begin by talking about my personal background with Frankenstein. There are three salient facts about me. The first time I ever went trick-or-treating by myself was in 1965, in a Ben Cooper Frankenstein mask and a one-piece, tie-in-the-back pajama costume. And I distinctly remember being told by my mother that it was Herman Munster, but it was Frankenstein's monster. The first movie I ever paid to see in a theater without my parents was in 1974, when I went with friends to see Young Frankenstein. My first and only professional stage performance was in, thank you for laughing at my failure to have ever had an acting career. My first and only professional stage performance was in a live reading of a 1945 radio play from a show called The Weird Circle, based on Frankenstein. I played a constable killed by the creature. My big line was, so now in addition to the direct personal connection that I have to Frankenstein, I of course have scholarly bona fides, having read the novel for the first time 14 hours ago on my flight from D.C. to L.A. So I come to the text, and especially to the theme of playing God, which is the subject of the first panel, from the position not of a scholar, but of a writer, a creator myself. In fact, when I tell people the story of how I got into show business, how I got into writing, I recall I was a young lawyer in South Florida listening to an NPR interview. Some writer was saying that when we are creative, we are at our most God-like. And I thought at the time, oh, I'd like to be God. And so 30 years later I write TV cartoons, and that's God-like. Now, among the most relevant work I've done to the subject at hand was Simpsons creator Matt Groening's other show, Futurama. Several times during the series, the main character, Professor Hubert Farnsworth, dabbled in Frankensteinian (Frankenstone was the character on The Flintstones) creation. I have a clip that they asked me to show. 
This was from the first episode of Futurama that we did after a five-year hiatus. And we had to begin the series again, rebooted, after having killed off all the major characters in the last episode that had aired. Now that's God-like. So ignore the credits and roll clip, please. So there are a couple of things I want to emphasize from that scene. Clearly it was designed to invoke Frankenstein, hoping that somebody who worked on the show had actually read the book, unlike me. With the lightning switch. But that's from the movies, not from the book itself. The second thing is that the creation of life going on here is this sort of sub-notion of rebirth or reanimation, not, you know, random creation. Which is kind of a mitigating factor in the ethos of playing God. But even having that in, we tried to make it a little more controversial by using this new technique, stem cells, and by saying that Farnsworth killed people to get those stem cells, thus undercutting the mitigating factor that might have made it less controversial. So I mention those elements of the clip to highlight the sort of shorthand that's developed over the past 200 years for invoking the themes inherent in Frankenstein. My personal belief is that this shorthand has developed because of the sheer number of times Frankenstein has been adapted or produced in other media, and that the themes and metaphors of playing God have worked their way into popular culture. So let me do a little recap. As Ed said, five years after the 1818 publication, there was a play produced called Presumption; or, The Fate of Frankenstein, which really, for about 70 years, was the major way that you could see a performance outside of, you know, reading the book itself. There were a couple of burlesque shows based on Frankenstein, but they're kind of lost to history, probably for the better. Those first 70 years, I mean, that all changed with the advent of cinema. 
There were three silent films made in the 1910s and 1920s, one of them by Thomas Edison himself. The dam of course broke in 1931 with the iconic adaptation we saw the clip from, with Boris Karloff as the monster. It had six sequels, all produced by Universal, in which the monster met his bride, his ghost, Dracula, the Wolf Man, and Abbott and Costello. The British producer Hammer Films made seven films in the fifties and sixties. I found more than 40 films with variants of Frankenstein in the title, including Teenage, Space, Young, Black, and even one about his Great Aunt Tillie. There's a film out there called Frankenstein's Great Aunt Tillie. The most recent was 2015's Victor Frankenstein, which was told from Igor's point of view, Igor of course not being in the book. And Universal is set to release a shared-universe monster series with Javier Bardem as the creature in the years to come. TV has been just as fruitful for adaptation and thematic inspiration. IMDb lists more than 160 appearances by the monster, most notably, at least in my worldview, 1973's Frankenstein: The True Story, which I believe was early reality television. There have been other plays, novels, comic books, video games, more than a dozen apps that invoke the name Frankenstein; most of them rely heavily on building things or making yourself look like Boris Karloff. And it all sort of follows Moore's Law: there was one adaptation in the first 70 years, a handful over the next decade, dozens in the years to follow that, and hundreds in the recent past. We will now call it Franken-Moore's Law. Which brings me to another way that Frankenstein has spawned creation. The name Frankenstein itself has come to mean creation, well, it means creation by assembling or cobbling together parts that weren't necessarily meant to go together. And then there's the prefix Franken-, which attaches even more specifically to otherwise natural things that are man-made. 
Frankenvirus, Frankengun, Frankenfood, Frankenberry, which is a Frankenfood. In the political world, Frankenstein is an editorial cartoonist's best friend. Anytime something is developed that gets out of hand and turns on its creator, Frankenstein's monster, typically the Karloff variant, rears his ugly head. And allow me to throw out the first Trump of the conference. In September of 2016, after months of cartoons showing then-candidate Donald Trump as the GOP's Frankenstein monster, Senate Minority Leader Harry Reid actually used those very words on the Senate floor. Now, incidentally, when you enter Franken-Trump into Google, you get 10,400 results, as opposed to 1,290 for Frankenbama, which I guess means people preferred Baracula, and 6,040 for Frankenbush, which means something else. But when you enter Trumpenstein, you get 32,300 entries, and Urban Dictionary has defined a Trumpenstein as someone who voted for Trump. So in this portmanteau, he's the doctor, not the monster. It's a rare occasion where you get to be both sides of the creation. So, one final thought, and then we'll get to our panel. I believe quite firmly that Mary Shelley meant this to be the impact of the book. She subtitled it The Modern Prometheus. Prometheus, the great god who stole lightning, created humans, and caught hell for it from Zeus. Modern, and we can scoff at the notion that this woman, writing the year that Queen Victoria was born, in pre-industrial England, would consider herself modern, but the fact is we are as far from her as she was from Shakespeare, which puts it all in, not a funny context, but an interesting context. In Shelley's vision, Frankenstein was the modern Prometheus: the hip, up-to-date, learned, vital god who chose to create human life and paid the dire consequences. To Shelley, gods create, and for humans to do that is bad, bad for others, but especially bad for the creator. 
So we'll now hand the discussion back to Ed Finn to introduce my co-panelists. Forgive us if we create a scene; we are only trying to be God-like. Thank you. Thank you, Patric. So let me invite Patric up here, along with Nancy Kress, a science fiction writer, author of the Probability trilogy, Beggars in Spain, and Yesterday's Kin, among other books, and Josephine Johnston, director of research and research scholar at the Hastings Center. Thank you. So while Patric has his electrical fluids readjusted here. Is it live? It's live. Let me get things started. Thank you. That was a really nice introduction to our topic for this panel, the question of playing God. And you alluded to this in your opening comments, Patric, that there's a trope around this. There's a set of gestures we can make now to allude to the playing God myth, especially in the Frankenstein context, that you can do very efficiently. You know, you can have the big switch, the electricity or the lightning, the laboratory equipment. A lot of it comes from the James Whale Universal productions. But this is a story that we've been telling for a long time. It's a human obsession. It's been a human obsession for so long that at times it seems like a kind of cliche. So my question for all of you is: is this an idea that's become so familiar that it's lost its moral force? Here's the short answer. The idea that we still have the option of playing God, if anything, is more relevant now, in terms of what can be done with genetic engineering and with science, than it was for Mary Shelley. Her science, of course, was ridiculous, but we forgive her that because, given the context of the day and what Galvani and Volta were doing, that was the best she had available. What we have available today are, in many ways, genuine God-like powers, and I know Josephine is probably going to want to comment on this a lot, too. I write about genetic engineering all the time. 
And there is enormous potential here, as well as, of course, enormous danger. And neither one is particularly well understood, which was also true of Mary Shelley's monster. Josephine? Yeah, it's interesting, because I work in a field called bioethics, which not everybody has heard of, but it's really looking at ethical, in my case sometimes legal, and policy and social issues in science and medicine. And you'll have to forgive the accent. I'm not going to do an American accent for you. And in my field, to characterize arguments as playing God arguments is actually often to dismiss them as meaningless; they're said to be irrational, said to hinge on particular religious ideas that do not and cannot be used in a secular society. So it's really interesting and somewhat disorientating to be here and feel this idea having some sort of serious weight. Now, it's disorientating in a good way for me, because I've actually come kind of full circle in my own thinking about what this argument can do. You know, just like everybody else, I was a utilitarian for a long time. But then I grew up and I thought, well, you know, maybe it's not all just about harms and benefits. And so I started to think that playing God is actually a really useful metaphor or idea that just needs some modern translation sometimes. And I'm sure becoming a parent was one of those moments, where you realize that you're in this position of being a creator, of having made somebody. And so it's really caused me to reflect differently on it. But like I said, in bioethics, if you raise concerns about things like human dignity and playing God and hubris, you can be laughed out of the room, because those ideas are not officially said to have any weight in a kind of rational, liberal academic community. This is a question of definition. What do we mean when we say playing God? One way to look at it is remaking nature from what it would ordinarily be. 
And if you want to take that on the very simplest level, antibiotics are playing God. We're remaking nature. Many, many people would have died without antibiotics being available. But if you want to go deeper and say, well, yes, but they were there, penicillin was there until Fleming just happened to discover it on a bunch of moldy bread. Well, his peanut butter sandwich was overdue. But if you talk about actually remaking nature at a genetic level, then you are playing God. And there have been tremendous benefits from this. Modified E. coli now produces insulin in enough quantity that people with diabetes have it available at a reasonable cost, which they did not before this happened. That's only one very simple example. There are many, many others. And it has produced tremendous benefits and can go on producing tremendous benefits. For me, the Frankenstein novel has value even though I don't actually like it as fiction. And now, do you want me to leave? I'll explain later why I don't like it as fiction. But its value for me is that it shows both sides of technology. This can be good. This can create good things. This can be misused. But that's true of all technology. The day man discovered fire, the crime of arson became a possibility. It's a two-edged sword, and it isn't do we use it, it's how we use it. Can I say something? It's your turn. It also seems to me, and this alludes to the clip from Futurama, that there are several subtexts of playing God that are not just making something out of whole cloth, that include bringing things back to life. And then also, isn't there an element of playing God in keeping something alive? Or, alternatively, is it playing God to kill something prematurely? The same way that The Wizard of Oz has been used as a metaphor for everything, starting with William Jennings Bryan's international monetary policy, to fit whatever the issue is. 
I think Frankenstein becomes a metaphor for, you end up Frankensteining the metaphor to fit the story itself or the real-world scientific development that's happened. What fascinated me when I read the book, for the chapter that I wrote for the volume that was mentioned, was not just that by playing God, by making life, Frankenstein had unleashed harm on himself and the people he loved, but that the experience of being that person who made had really changed him. And I think that's a really interesting aspect of what playing God can code for, which is not just the fact that you take power and use it and that has consequences for others and you change the world around you, but that in the process of being somebody who creates, who makes, who does that, you change who you are. And I think about this in the context of things like stem cell research, or IVF, or pre-implantation genetic diagnosis, or even the potential use of gene editing in a reproductive context, which are all issues I work on, where we're also thinking not just about the consequences of using these technologies for those that are made or created, but also about what it does to you as a parent, or to be someone who has that degree of control over someone else, how it changes you. And in the book, you know, he really suffers, physically suffers, from the experience of playing God. So what's the reason I don't like the book? And I'm not sure I should be saying this at a forum that's dedicated to Frankenstein, but I don't like the book. She was a teenager. I don't like the book for two reasons. The first reason is that I don't believe the psychology. Okay, the monster is rejected by Frankenstein. He has all of these terrible experiences of rejection and cruelty and barbarity, so he turns into a killer. I'm willing to buy that. 
What I'm not willing to buy is the repentance at the end, which happens for no particular reason; suddenly he's completely upset that he did all of these things and repentant about it. Sociopaths don't repent that way. And as a novel, the construction doesn't work, because there's no foreshadowing that this repentance is going to come about. That's one reason that I dislike it. And it isn't as though, well, it was an early novel, yes, but Jane Austen was even earlier, and her psychology holds up beautifully. So that's the first reason. The second reason is what you just said. Yes, Frankenstein suffers, and the way he suffers is that every time something awful happens, he falls into a dead faint and goes into some sort of horrors that last for months, which is really convenient for the plot. And so I have problems with the construction of it as a novel, as well as with the psychology of it as a novel, as well as with the science of it as a novel. So what do I like about it? I like the basic idea, which is why it has persisted for so long, the basic questions it raises, as you pointed out. It raises those basic questions of playing God that resonate down through the ages. But I like the Prometheus version better. So, sorry. Oh, no. One of the interesting things here is the way in which the playing God motif can be used to sweep things under the rug, but it's also a very familiar story to tell. And so I wonder how each of you has seen people use this as a kind of, you know, tool, a storytelling tool, and what the rules are for telling a good playing God story. Because, you know, Futurama, for example, is full of these moments, right? Heads in jars and playing God things. Yeah, well, I mean, the irony of working in animation is that, you know, as you struggle with the concept of trying to be creative, you're only working with your words and images. You have the voices of real people. 
But unlike, you know, stage, unlike live-action TV or film, you are creating an entire world, an entire universe. And with Futurama, we were trying to do on a regular basis both, you know, long-standing tropes and things that were kind of topical. And the difficulty with just the production of animation is that, you know, you're writing something today and the soonest you're going to be able to release it is nine, ten months from now. So we had to deal with things that were somewhat evergreen. And, I mean, that clip that I showed was one of about three or four times, I found, when we dipped into sort of Frankenstein lore. Those of you who watched will remember there was an episode where the Professor was supposed to bring Fry's dog back to life, and Fry then decided he did not want the Professor to play dog god. And we got more nasty letters from people because we ended up flashing back and showing that the dog died of natural causes. The dog died of natural causes. And people got mad at us for showing a dog that ended up dying of natural causes. Well, but it was natural. You would rather we just didn't show it? No, I'm saying you can't kill dogs. Right, you would rather we just didn't. But the fact is it was a thousand years later. What do you think, the dog was going to live forever? That would have been unnatural. And then, you know, we also did episodes that involved, you know, cyber technology, implants, you know, the kind of robotics that help to prolong life and extend powers and whatnot. And even in those, you know, inevitably you put somebody on an operating table and you strap them down, James Whale fashion. And, you know, you've got everything; I guess even in the year 3000, power is mostly generated from lightning bolts as opposed to any other means. 
So, I mean, we, yeah, we were very cognizant of the notion of prolonging or extending or revitalizing life being, you know, starting with Frankenstein, not with Prometheus. We weren't, we didn't go that far back. In the field of bioethics now, just, I mean, if this is, if the notion of playing God is largely sort of vacated, or do people, it seems in popular culture that it's still, you know, alive and well. And so, how, what's that interface look like? How do people respond to it or what kinds of issues, what kinds of arguments do they make in lieu of this sort of, we're, you know, cloudy notion of, you know, what is God and what are we talking about here? Yeah, well, I think people in bioethics don't always respond very well to the fact that playing God argument means something to most people, ordinary person on the street, pretty much knows what it means. And they sort of often will have that as a concern, not like, not necessarily a concern of like, because of this, I will not go near the technology, but they're like, oh, that's interesting technology, but I have this little concern about, or this concern about playing God. And that, most of the time, is just met with like, well, that's just because people don't understand science or what do they want, they don't want, you know, we have vaccinations now, that's playing God, so the whole argument is dismissed. And so, I don't think it's met with a whole lot of sympathy. And often, and I think also people in my field take the God part pretty seriously, so they're like, well, you must think there is a God if you think that there's an argument called playing God and then that means something. And yet, I think, so in place of that argument, so certainly there's heats of attention to like, well, do we really know what we're doing when we do something? And lots of examples from science of not thinking something through clearly enough. 
So you've seen it most recently, you're seeing it in real time right now, with the reaction to gene editing technology. In February 2015, so two years ago, a group of scientists and others published an article in Science calling for, you know, attention to the uses of CRISPR-Cas9 technology, and that led to an international summit that took place in D.C. in December of 2015, and now the National Academies of Sciences is, in about 10 days, going to release a report on the uses of gene editing in humans. So the reason that they're paying all this attention is that they recognize that there could be dangers associated with it, very practical dangers, like how do you know that it's going to be safe? How do you know it would be safe across generations, in humans or in other organisms? They did a report also on gene drives in non-human organisms. So there's a clear and large body of literature and people paying attention to concrete risks and benefits, you know, what are the safety risks associated with it, and sort of a broad understanding of that. The other stuff that's encapsulated in a playing God argument, I think, or that's hinted at in it, is not about that. Imagine if it was safe, imagine if Frankenstein's creation turned out to be great, right? Like he was awesome, he was kind, and I mean not Frankenstein, the monster, the creature, and Frankenstein himself felt great about it and was heralded as a hero and all the things he was hoping. So if safety is taken care of, what else remains? And what is that stuff? And that is something I think people in academia are pretty bad at talking about. But again, I think it gets back to this stuff about what it means to be someone who has more control and more power in a creation relationship than we've had in the past. 
And it's easiest to see that in parental relationships, in what it means to be a parent: as parents gain more control, you know, how does that feel, what is that like, does it change the meaning of their own lives in any way? And that's a sort of flourishing-type concern rather than a straight-up safety-related issue. And then you see, yeah, so that's kind of how I think we are responding. One of the responses has been in terms of safety, and that's from the FBI. You can now order online, if you find it on sale, for $140, a CRISPR kit for editing bacterial genes. This is being done in high schools in some places. The FBI has a unit now whose job is to follow up on possible uses of this to create pathogens, airborne or otherwise, out of bacteria, which is frankly not that hard to do. We have the genie out of the bottle, and there can be any number of reports issued and there can even be any number of laws passed, and that doesn't mean it's going to put the genie back in the bottle, because it's not. And as for the ultimate playing God, which is human germline editing, egg and sperm, rather than bacteria, although editing bacteria carries the possibility of ending all the boys if they do it right. If they do it wrong. Yes. Sorry. There are reliable reports coming out of China, some of which were referenced not too long ago in MIT Technology Review, that there is editing of human embryos going on in China. In fact, they did try to edit an embryo in order to remove a gene that creates a genetic blood disease. As a follow-up to this, MIT Technology Review did a survey of Americans trying to find out how Americans feel about editing human genes in embryo. Forty-six percent said that if it were to control diseases, they would be in favor of going in this direction. That's playing God with a vengeance. As for what we can do now: it's illegal here to edit genes in embryos. 
It's not illegal, however, to multiply the number of embryos that you have, and of course you must deal with this all the time: create them artificially using fertility drugs and then scan them genetically and choose out the ones that are not carrying, for instance, inherited genetic markers for Huntington's chorea or Tay-Sachs or any of those. That's the first step: you scan them so that you choose those that have the genes you want. The second step would be to knock out a single gene and replace it with another one. We know how to do this. We do it all the time in mice. Knockout mice are a staple of medical research. You create mice without immune systems so that you can then do the medical research to test various drugs for conditions that you give them. We could do it. It's illegal to do it now. That isn't to say that it isn't going to be happening offshore. Again, I don't mean to be a downer, because even though it doesn't sound like it, I'm in favor of genetic engineering, although not necessarily in human embryos. I think it's necessary in order to feed the third world. I think it's necessary in order to clean up pollution. I think it's necessary to advance medically. But we have the genie out of the bottle now. The knowledge is out there. It's not that everybody can use it, but more and more people can use it. It's not like creating an atom bomb, where you have to round up a bunch of plutonium and get a big facility to do it. It can be done in a basement, which is why the FBI's unit is now tracking down these kinds of reports that they get from professors in biology and in genetic engineering who feel that some students may be doing something slightly suspicious, or that there are a lot of supplies disappearing from labs, more than are being used. They have units tracking this down now. The genie is out of the bottle. 
One of the areas where I think there's the largest gap between the boundaries of technical ability and the social and legal frameworks, between what we culturally understand as possible and what we actually think about, is in the biological sciences: genetic modification, synthetic biology, this whole arena. So what are the responsibilities of science fiction writers and storytellers in trying to bridge some of that gap? Is this something that you've talked about and thought about, Nancy?

I've thought about it a lot, because here's the problem when you write a story: fiction is about stuff that gets screwed up. Nobody wants to watch a long movie or read a 400-page novel about a character for whom everything goes right. You want your life to look like that, but you don't want to read about it. Fiction is about stuff that gets screwed up. So the temptation, the easy way out, is to take the scientific advances and show them being screwed up, which is how you get Jurassic Park, and all of that kind of thing. I just finished an incredibly good novel by the Chinese writer Wang Jinkang, which has only been in this country for a year. It's called Pathological. It's an entirely different take on the idea of biowarfare, and again, he has to write about it being screwed up, because otherwise you don't have a story. But this bothers me, because again, I think we need this kind of genetic engineering, and it bothers me so much that I try to create outcomes in my stories that are at least balanced: some gain, some loss, which is what I think usually happens. Otherwise they kill each other in the book and everybody is dead. It could be Hamlet, the end with the stage littered with all these bodies.

Yeah. So, Patrick, I feel like Hollywood has an incredible power to normalize certain things or frame conversations.

Here's a partial defense. I agree with the general premise. I think it was Hitchcock who said that drama is life with the boring parts cut out.
So you do want your life to be normal, but you don't want your drama to be like that, because then you're not going to get the viewership or the readership. And to your point, I was trying to think of an example of bioengineering to the good, and what came to mind is the bridge between three of the original-series Star Trek movies, where they go to the Genesis Planet, the planet where the Genesis device is detonated and terraforms the entire planet. The scientist who did it sees it for the first time, this beautiful lush rainforest, and she says, "Boy, can I cook." It's not "Boy, can I play God." And it comes after Spock is killed at the end of the second movie. What? Spoiler alert, from 1982. The first Spock, not the current Spock, dies and is left on the Genesis Planet, where through some nonsense, both mumbo and jumbo, he's brought back to life. But it's your point: it's not as interesting to do stories where everything works out. Despite the Hollywood-ending notion that everything is supposed to have a happy ending, these are thrillers; these are ultimately tragedies. And what is comedy but tragedy with a happy ending? Tragedy just doesn't have that happy ending. Nobody wants a virus movie where the entire Earth isn't destroyed, just a portion of it, made up of the people we don't like.
It's funny, because I often feel like Hollywood is the only place where some of the negatives of technology seem to be taken seriously and vividly brought to life. I do read science fiction; I've read it my whole life, and I'm reading it now. But I feel like it's in movies and stories, it's from artists, that some of the downsides of technology are actually made vivid and real, in a way that isn't always there in academic writing and science journalism. So I've been kind of grateful that those things have been explored.

I'll tell you, though, in series television, shows like Battlestar Galactica begin with a dystopic setup, and then, in the interest of keeping characters you like alive and continuing to tell stories, you end up not having a completely fatalistic, end-of-the-world approach. But in feature films that are not part of a series, you typically get the Frankenstein story in a recent form. Because of the serialized fashion, TV has a greater need than film to happy things up.

In the spirit of populism, let us open this up for questions. Yes, thank you.

There was a discussion at the beginning about playing God, and it sounds as if the panel is saying playing God is bad and that it violates the laws of nature. This is exactly what I wanted to get at; you're nodding, so if you would discuss that. Because if we were just to leave everything to nature, we would not be able to address illness and other kinds of things. Can you actually violate the laws of nature, and will that cause negative effects?

I'm all in favor of violating nature. I think there's a lot we can improve on. I guess I would just say: violate nature sometimes, and don't violate nature other times. And the problem with that is that you actually have to think about it every time you're considering doing it, and people don't like that. It would be so much easier than "sometimes natural is good and sometimes natural is bad." You have to think it through; you actually have to decide. We play God, in a sense, all the time. We vaccinate our children, which, by the way, is an enhancement, which is something that is sometimes demonized as bad too. Like: it would be okay to make changes to embryos that would make them immune to diseases, to not pass on the cancer-causing genes that we've identified, but we wouldn't want to do anything that would be an enhancement. Well, actually, we do enhancements a lot. The question is whether it's the kind of enhancement that we think would be good for us and that we want to be engaging in, or not. And we have to stop and think about it, because it's not just a simple question of the good and the bad. So I think the problem with the playing God argument is that you can throw it right out: okay, so I played God yesterday when I vaccinated my daughter. The playing God argument is bogus; it's too easy. The trick is playing God and not losing.

Well said. Yeah. Another question?

In regards to the CRISPR-Cas9 complex that you mentioned: as I'm sure y'all are aware, Dr.
Doudna ultimately called for a moratorium on her own technology, which I think you could see, in the terms of the book, as her own creation. So I wondered if you could offer any commentary on the process of a creator ultimately rejecting the life that she's created.

I don't think that's fair to her. She called for a moratorium on its use in humans; it hadn't actually been used directly in humans at that time, but that's what she was worried about. There are so many uses of it that aren't in human organisms, or even in anything human at all, so she really just wanted a pause on its potential use in adults, children, sperm, eggs, embryos. And I think it was brave, actually, to do what she did, because there would have been a lot of pressure not to, a lot of temptation not to open up the idea that there was anything negative about this. I think it was really brave and important that she said: this is a complex thing, and I need everybody to pay some attention here and help us figure out how to use this wisely. That seems like exactly the kind of thing we should be encouraging and rewarding. It is incredibly rare, when you think about human nature, incredibly rare, to see somebody open the door and then say, no, I'm not going to walk through it.

You know, when you look at the history of recent technological research, I mean, we did blow up a few of them. You can invent something incredibly powerful and understand that it could have all sorts of uses, and that it's now going to need some conversation, and still not regret inventing it, but really need all of us to put our brains to work to figure out how to use it well. Oppenheimer famously said he had become the destroyer of worlds; he had some regrets about it. Really bad that the scientists just tore up their way; that may well have happened. Interesting question.

So part of the ability to create life and watch it blossom seems to be okay, generally, even in storytelling, but this jump-starting of life, or bringing something
back to life, and then whether or not to kill it, or whether, now that you've created it, it has to live out its life: the ethical issues related to gene therapy and enhancements keep going. Could you speak maybe to the ethical issues related to social inequalities? The wealthy will be able to have the special treatments, but maybe the poor won't, things of that nature.

I wrote a trilogy about this, starting with my novel Beggars in Spain. I wrote it out of jealousy: I need a lot of sleep, I resent it, and other people get more life. But where the novel goes: I wanted to create a genetic enhancement, as you said, that has no downside. These people are not monsters. They do not develop weird cancers. They are not strange in any way, nor do they develop telepathy or anything like that. They just don't sleep. But I do. And what I did was go for the bifurcation of the human race: because the gene, the genetic tinkering, is dominant, you end up with one strain that needs to sleep and one strain that doesn't, one that has an evolutionary advantage. And of course it's the wealthy, as well as the children of scientists, who have access to this kind of thing. That's where my story went, because again, I had to have a conflict of some sort.

I know there will be drinks later; I think I mentioned that. Thank you. Now let me invite Joey Eschrich up here to introduce our next panel and to lead us off with another reflection. Joey is the editor and program manager of the Center for Science and the Imagination at Arizona State University.

Well, thank you, Ed, and thank you to the first panel. I'm here to talk to you about a really bad Frankenstein adaptation that I love: Splice. Has anybody seen this movie? Show of hands. There are like six or seven Splice heads in here. Very exciting. Splice is a science fiction and horror hybrid released in 2009, and the film follows the efforts of a married couple of genetic engineers, played by Adrien Brody and the very talented Sarah Polley, who work for a big
pharmaceutical company. Their job is to create genetic hybrid creatures for medical applications. They start off creating these kind of worm-like beings, but they're not satisfied with that, so they decide to splice human DNA into the mix, hoping, in a kind of Victor Frankenstein, hand-wavy way, to revolutionize the human race: they want to create an organism that will produce genetic material that could cure cancer, that could cure Parkinson's, that would, in some, again, very hand-wavy way, just solve all the problems that we have. And they end up creating something sentient, and it's kind of cute, in a creepy, squid-like way, so they decide to raise it in secret, of course, because, as Nancy said, something horrible has to happen right off the bat or else you don't have a story.

So Splice is a modern-day Frankenstein story, and for those of you who are science fiction and horror heads, it's crossed with the gruesome bio-horror of classic science fiction movies like Alien and The Fly. It's also frothy and overwrought; it's a little nuts; it goes totally off the rails near the end. And that messiness is precisely why I love it so much. Bad but kind of smart movies like it tell us a lot about the moment we live in, in this case, I think, about the sense of distrust and paranoia we have about biotechnology and the other Frankenstein technologies like AI and geoengineering, in this moment, as we've started to talk about already, of great possibility and perhaps great peril as well.

So in adapting Frankenstein to this contemporary moment of actual human-pig hybrids, for those of you who have been reading your science and tech news this week, and with designer babies, as Nancy talked about, on the horizon, the filmmakers behind Splice made important decisions about which elements of Shelley's novel to carry through and which to transform or leave out, just like any adapters of a myth or well-worn story: they wanted to tailor it to their own social, and in this case technological, moment. My basic premise is that these decisions are really meaningful. In this case they shape the themes and ethical messages of the film, and they shape the ways it departs from its source material. So today I want to talk about one really important departure that Splice makes from Shelley's novel, as a way to set up this panel, which is about unintended consequences. So without further ado, here's a brief clip. This is happening when the creature, which is developing at a vastly accelerated rate, is very young. Rush out and see it. Seriously.

Okay. So, names are really important. Giving something a name, whether it's a human, a child, a pet, or, like, your car, an inanimate object, lets us imbue it with a personality, to see it as an independent being with goals and emotions, deserving of our attention and affection. It's no surprise that so many of our friendly technology conglomerates these days are creating virtual assistants with names and personalities; they're encouraging us to build emotional connections with their brands, to imbue those brands with all kinds of associations about desires and senses of humor and things like that.

In Frankenstein, Shelley has Victor never give his creation a name, and I think this is indeed quite intentional. It's awkward, as a novelist, to have a major character with no name, and it makes the writing harder. When referring to the creature, Shelley has Victor use a bunch of different substitutes for a name: he calls the creature a wretch, a demon, a monster, and many other terrible, insulting things. Shelley goes to all this trouble, I think, because the lack of a name symbolizes, in a really powerful way, Victor's rejection of the creature. He abandons it right after he brings it to life; he makes no attempt to care for it, to teach it. But in the
novel, the creature becomes violent and vengeful precisely because he's rejected, first by Victor, then by other people, largely because he's so large, so ugly. He's scary-looking, right? His lack of a name brings home the idea that he's barred and shunned from human society, and the pain of that exclusion is what turns him bad. He's not born bad.

Which brings us to Splice. In this movie, on the other hand, and you can start to see it here, Dren is socialized, educated, loved. Later in the film the scientists hide with her in a barn, where they create a sort of grotesque, Lynchian parody of a traditional '50s suburban nuclear family. This movie has a kind of dark comedic underside, and it really comes out in this pastiche of nuclear family life. These aren't perfect parents by a long shot, but they do try, and they care for Dren. They screw up a lot, but they try. And you can really see, of course, in this clip, Sarah Polley's character starting to build a bond with this creature. It's a really pivotal scene, because you can see in the conflict between the two scientists that this is the start of the creature transitioning from being a specimen to being a daughter; that name, "specimen," becomes a real sticking point between the two of them. But of course this ends in violent mayhem. The movie ends horribly, just like Frankenstein, with death, and actually with a really shocking, brutal sexual assault. Sarah Polley's character ends up alone and despondent, just like Victor at the end of the novel. So we end up in the same place.

To go back to the novel, the lesson I drew from it, and this is one reading anyway, is that Victor's sin wasn't in being too ambitious, not necessarily in playing God. It was in failing to care for the being he created, failing to take responsibility and provide the creature what it needed to thrive, to reach its potential, to be a positive development for society instead of a disaster. Splice, on the other hand, has a very different ethical program, a very different lesson for us. It says that some lines shouldn't be crossed, that some technologies are too dangerous to meddle with. It's possible for these sort of well-meaning scientists, whom we kind of like (and we like the actors), to fall victim to hubris. They can shoot too high, and even though they try their best, even though these characters do something truly groundbreaking, they fail to predict and understand the consequences of their actions. They avoid Victor's mistake: they stick around and hold the creature close. But the unintended consequences of their actions are still catastrophic.

And as we've already started to talk about, we're in a moment when these Frankensteinian technologies seem to be becoming more and more real. AI, genetic engineering, robotics, and geoengineering promise to make us healthier and more efficient, and even to help combat climate change. But Splice warns us that if we try to do these radically ambitious things, even if we make an earnest effort to do them right, we might unleash terrible unintended consequences anyway. We might wipe out the economy; we might give rise to the robot uprising that everybody likes to reference in their Future Tense pieces; we might wreck our environment even faster. For Splice, it's just not about how responsibly we do science, or whether we stick around and care and love; the idea is that some innovations are simply a bridge too far.

So to help me continue to explore this theme of unintended consequences, I'd like to welcome our three expert panelists to the stage. First, Samuel Arbesman is scientist in residence at Lux Capital and the author of the book Overcomplicated: Technology at the Limits of Comprehension. Susan Tyler Hitchcock is the senior editor of books for the National Geographic Society and the author of the book Frankenstein: A Cultural History, which has just been immensely helpful to me in
understanding and untangling all of this. And Kara LaPoint is an engineer who has worked with autonomous systems for both science and defense applications, across development, fielding, operations, and policy. So thank you all so much for being here.

Okay. I'm interested, whether you're new to Frankenstein, like Patrick, or someone who's lived and breathed Frankenstein your whole life, in what got you interested in the first place. Susan, you have this entire, very encyclopedic and helpful book about the Frankenstein phenomenon. Sam, your work with inventors and technology startups seems to me to be evocative of some of the themes of the story: these creators at the cusp of something new. And Kara, I'm interested to hear what resonates with you. To start us off, Susan?

So my fascination with Frankenstein goes back to my graduate, well, no, really my whole education: my fascination with the literature of the Romantics, the British Romantics. They represent a time of culture wars as interesting as the '60s, which is when I started my fascination with these characters and their literature, and as today. I mean, there were a lot of amazing things happening in their day. I began with an interest in Percy Bysshe Shelley. I ultimately taught humanities to engineering school students, and I had the great opportunity, on Halloween day, of teaching a class on Frankenstein. For that class I actually brought a Halloween mask, a green, ugly, plastic Frankenstein mask, and we started talking about the difference between the novel and the current cultural interpretation. That's what started me. From that point on I started collecting Frankensteiniana, and I have hundreds of objects, and then I wrote a book.

We should have done this at your house. [laughter] Sam, how about you?

I have them hidden away. [laughter]

So I guess my interest in the themes of Frankenstein, the themes of the societal implications of technology more generally, began through influences from my grandfather. He's 99, and he's been reading science fiction essentially since the modern dawn of the genre; he read Dune when it was serialized, before it was actually a book, and he gave me my first copy. A lot of the science fiction that I've been especially drawn to is the kind that really tries to understand the societal implications of the gadgets of the future, not just the gadgets themselves. In my role at Lux, which is a VC firm that does early-stage investments in anything at the frontier of science and technology, one of my jobs is to connect groups of people who are not traditionally connected to the world of venture and startups. Related to that: when a lot of technologists and people in the world of Silicon Valley are building things, there's often this techno-utopian sense that if you build something, it's an unalloyed good, it must be wonderful. But of course there are a lot of people thinking about the social, regulatory, ethical, and legal implications of all these different technologies; they're often in the world of academia, and they're often not talking to the people who are building. What I'm trying to do is connect these two worlds, to make sure both parties are as engaged as possible. And going back to the science fiction part: since science fiction looks more holistically at the implications of these kinds of things, as opposed to just saying the future is the following three gadgets, it's really good at saying, okay, here is a scenario, let's actually play it out. I've been working to get people involved in, well, I don't think I've gotten people involved in explicitly Frankensteinian stories yet.

But yes, everybody gets money from you
guys to watch the slides. [laughter]

Well, it's interesting that Sam talks about this kind of holistic approach. I'm an autonomous systems engineer, but I've worked in developing systems, in using systems, and on the policy implications, so I come at autonomous systems from a lot of different angles. What's really interesting to me about the question of creation, should we or should we not create the technology, which I think was brought up in the first panel: when it comes to autonomy, when it comes to artificial intelligence, this technology is being developed. So I think it's more productive to think about the ethics of how, where, when, and why you're going to use these types of technologies. As somebody said, the genie is out of the bottle; these things are being developed. So what do all these technologies actually do? The thing about autonomous systems is that we've used machines for a long time to do things, right, but now we're starting to get to a place where machines can move into the cognitive space, the decision-making space.

There's a really interesting construct we sometimes use in the defense world called the OODA loop: observe, orient, decide, and act. It's a way to describe doing anything. You observe; orienting is understanding what you're sensing; then you decide what you want to do to achieve whatever your purpose is; and then you act. We've used machines for a long time to do the sensing; we have all kinds of cameras and other types of sensors. And we've used machines to act for us for a long time. But what's really interesting about technology today is that we're on a cusp where machines can move into those cognitive functions. So figuring out where and when and how we want machines to move there is interesting, and I think even from very early on, Frankenstein was bringing up those ideas of what happens when you bring something
into that cognitive space. So that's why I think it's pretty fascinating.

So, Susan, I was hoping you could ground us in how people at Mary Shelley's historical moment were thinking about unintended consequences. As Ed said, the word "scientist" wasn't even in use yet, but were there other ways people were thinking and talking about the ethics of creation and responsibility? What was the context she was building on to create this theme in Frankenstein and develop it?

Yeah, well, there's an interesting intersection between her legacy from her parents and the science going on. Her father I find a really important influence on the novel, because William Godwin was really, I think of him as the father of our modern-day liberal concept that people aren't evil, that bad actions come because people have been influenced by hatred, by anger, by negative outside influences; that is, that evil is made, not born. And I think that really carries through. It's as if Mary Shelley wanted to animate that philosophy of her father's. But at the same time there were these fascinating experiments going on: Galvani, the whole idea of the spark of life. What is the spark of life? There were amazing experiments, not only with frogs, which is the famous one, but even with corpses, introducing electrical stimuli to bodies and making them move, making the eyes of a corpse open, making it sit up, that sort of thing. Those things were being done at the time, and they were kind of like sideshow events that the public would go to. So there was a lot of science happening that opened up the question: should we really be doing this? And that is a lot of the inspiration behind Frankenstein as well. You don't really see it happening in the novel, but it's so interesting that instantly the retellings of the story bring electricity in as the spark of life.

So that point about social context, and the sort of social constructionist beliefs of William Godwin, is
really appropriate, I think, and also something her mother, Mary Wollstonecraft, was very adamant about. She wrote a lot about women's education and the idea that society had socialized women to be submissive; she called them intellectually malformed, things like that, this idea that they were violently socialized away from being intellectuals and citizens and full members of society. Sam and Kara, I think you both have some interaction, Sam through your book and through Lux, and Kara through your engineering work, with systems that learn and adapt, systems that work in a social context. This sort of social constructionist thinking, this idea that the social context for the operation of these technologies actually affects the way they work and what they become: how do we react to that in this moment?

One of the clear examples of this kind of thing is artificial intelligence and machine learning, especially deep learning; we're having a deep learning moment right now. With these systems, even though the algorithms for how they learn are well understood, the resulting system, once you pour a whole bunch of data into it, might be very powerful, might be very predictive. It can identify objects in images, or help cars drive by themselves, or do cool things with voice recognition. But how they actually work, the underlying components, the actual threads within the networks, is not always entirely understood. And because of that, there are moments like the Microsoft chatbot Tay, a little more than a year ago, which was designed to be a teenage girl and ended up being a white supremacist. There was a mismatch between the data they thought the system was going to get and what it actually got; the socialization, in this case, went wrong.

You can actually see this also in the situation with IBM Watson, where the engineers involved wanted the system to better understand slang and everyday language, so to teach it, they poured in Urban Dictionary, and it ended up cursing out its creators. That was also not intended. So there are a lot of these cases where you recognize that the environment you expose the system to, and the way it assimilates that environment, is going to affect its behavior, and sometimes you only discover that when you actually interact with it. So, as opposed to "oh no, it sucks, I give up and run away," I think in technology, ideally, there's this iterative process of understanding: you build something, you learn from it, you find out there's a mismatch between how you thought it was going to work and how it actually does work, embodied in glitches and failures and bugs, and then you debug it and make it better. Rather than viewing it as something we fully understand, or something we never actually want to build.

There's a lack of knowledge, though, of the forces that you're putting onto the creature, or the systems. Maybe we don't have the capability of fully understanding or fully knowing; pouring the Urban Dictionary in, they didn't know what influences they were exerting on the thing.

And actually, related to that, there's this idea from physics, a term used when looking at complex technological systems, or complex systems in general: "robust yet fragile." The idea is that when you build a system, it's often extremely robust to all the different eventualities you've planned for, but it can be incredibly fragile to pretty much anything you didn't think about. So there are all these different exceptions and edge cases that you've built in, and you're really proud of handling them, and suddenly there's some tiny little
thing that just makes the entire thing fall apart. It's about recognizing the limits of how you actually designed it.

I think it's really interesting to think about this word "system." We're using "system" to talk about a machine that's being created, but when I think of a system, I actually think of the interaction between machines and people. I mean, time and time again in history, innovative, emerging technologies come in and actually change the fabric of our lives. Think about the whole Industrial Revolution. I live 30 miles outside of DC, but I can drive in every day. Then think of the personal computer; think of the internet. You actually live your life differently because of these technologies. And we're on the cusp of the same kind of social change when it comes to autonomous systems. Autonomy is going to change the fabric of our lives. I don't know what it's going to look like, but I can tell you it is going to change the fabric of our lives over the coming decades. So when you're talking about a system, it's interesting to understand that it's not one-way; it's not just how we're teaching a machine, but how we collectively, as a system, evolve. I think that's an interesting way to frame it as you move forward talking about these types of technologies.

What do you mean when you say autonomy is going to be shaping our future? What is the autonomy you're talking about?

So, autonomy. You know, there is no common definition of autonomy; many days of my life have been spent in debates about what "autonomy" and "autonomous" mean. It's about machines moving into the cognitive space, machines starting to make decisions about how they're going to act. The example I love to use, because a lot of people have had them, is the Roomba vacuum. I got a Roomba; I love it. It's funny, because with a traditional vacuum, you turn it on, and what is it doing? Its job is to suck up the dirt, and you move it and decide where it's going to go on the floor. The Roomba, though, decides the pattern it's going to follow around your room, or around whatever the set space is, to clean it. So autonomy is machines starting to get into the decision space. And I think one of the things we really need to address and figure out as these machines come in, and it's much more than just a technical challenge, is trust. We need them to be predictable, right? We have a kind of intuitive trust of other people, and we know they're not going to be perfect all the time. We have an understanding that what a toddler is going to do is different from what a teenager is going to do, which is different from what an adult is going to do, right? You have this intuitive knowledge. But as we develop these autonomous systems that can act in different ways: when I turn that Roomba on, I don't know the path it's going to take around the room. I don't know if it goes straight, or goes left, or does a little circle. I have three kids and a dog, so it does a lot of the little circles where it finds those dirt patches, right? I don't know, just looking at it instantaneously, whether it's doing the right thing; I have to wait until it's done its job to evaluate. Evaluating systems is going to be fundamentally different with autonomous systems, and this to me is one of the real challenges we are facing as a society.

Think about autonomy in self-driving cars. A lot of people like to talk about self-driving cars, right, and this is a technology that is developing apace. Well, what are the challenges? The challenges are: how do you integrate these into the existing human system we already have? How do you trust the self-driving car? A lot of drivers who have had an accident are still trusted to drive around, right? But we don't have that same kind of intuitive understanding of
what is predictable and reliable.

Related to that — within machine learning, going back to what I mentioned earlier, a lot of these systems are making somewhat esoteric decisions: they work, but we are not entirely sure why they make the decisions they do, and that makes their decision-making process opaque. To relate it to self-driving cars: we have a pretty decent intuitive sense that when I meet someone at an intersection, I know roughly how they're going to interact with me, my car versus their car. They're not entirely rational, but I have a sense of it. If it's a self-driving car, I'm not really sure of its decision-making process, and that's really important. I think back to the history of technology. The first computer my family had was the Commodore VIC-20 — William Shatner called it "the wonder computer of the 1980s"; he was the pitchman for it. I was too young to program at the time, but one of the ways you would get programs was through these things called type-ins: you would get a magazine, there would be code printed in it, and you would type it in, so you could see what the computer was doing. Now we have these really powerful technologies, but I no longer have that connection; there's a certain distance between us and them. I think we need to find ways of creating a gateway into peeking under the hood. I'm not entirely sure what those things would be — it could be something as simple as a progress bar, although I guess those are only tangentially connected to reality — but we need more of those kinds of things in order to actually create that sort of trust.

It seems to me that the ruling aesthetic is magic, right? Netflix works according to magic. The iPhone — so much of what happens is under the hood, and it's sort of for your protection: you don't need to worry about it. But I think we're realizing, especially with something like cybersecurity — which is a big unintended-consequences problem; we offloaded everything onto the internet to become more efficient, and suddenly everything seems at risk and insecure — that we might need to know a little bit more about how this stuff actually works. Maybe magic isn't good enough all the time.

And one of the few times you see through it is failure. You're like: oh, the chatbot is becoming racist — now we actually realize it was assimilating data in ways we didn't expect. These kinds of bugs show us what the system is actually doing. Which brings us back to Frankenstein, because Victor was so fascinated and excited and proud and delighted with what he was doing — and then, when he saw what he had done, it's like looking at something horrible: the end of his fascination and delight, and the beginning of his downfall. I wanted to say — and I'm going to prompt you, Sam — Victor Frankenstein is very haughty about his creature. You can read it psychologically as a defense mechanism: he's so haughty later about the creature, so disdainful of it. He distances himself from it and from all the negative, unintended consequences it causes, and he works really hard to convince his listeners, and the reader, that he's not responsible for them — as if not thinking ahead somehow absolves him. But Sam, in your book Overcomplicated you talk a bit about this concept of humility, which dates all the way back to the medieval period, and I feel like the conversation we've been having reminds me of that concept: how to live with this complexity in a way that's not scornful, but that's also not mystified and helpless.

Yeah. So when I was writing about humility in the face of technology, I was contrasting it with two extremes, which we often tend toward when we're confronted with
technology we don't fully understand. One is fear in the face of the unknown: oh my god, self-driving cars are going to kill us all, the robots are going to rise up. The other extreme is the magic of Netflix, or the beautiful mind of Google — this almost religious, reverential sense of awe: these things are beautiful, they must be perfect. And of course they're not perfect; they're built by imperfect beings. Both extremes cut off questioning. When we're so fearful that we can't really process the systems we're dealing with, we don't actually try to understand them; and when we think a system is perfect and wonderful and worthy of our awe, we also don't query it. So I use humility as a sort of halfway point, one that actually is productive: it allows us to try to query our systems while recognizing there are going to be limits. Going back to the medieval thing: Maimonides, from the 12th century, was a philosopher, physician, and rabbi, and in one of his books, The Guide for the Perplexed, he wrote that there are clear limits to what we can understand — and that's fine. He had made his peace with it. In later centuries there was a sort of scientific triumphalism: if we apply our minds to the world around us, we'll understand everything. In many ways we've actually been extremely successful, which is great. But I think we are recognizing that there are certain limits, certain things in the technological realm we're not going to be able to understand — even in systems we ourselves have built. It's one thing to say, okay, I don't understand the iPhone in my pocket. But if no one understands the iPhone completely, including the people who created it and work with it on a daily basis, that's an interesting sort of thing. And this humility is powerful in the sense that it's better to work with that and recognize our limits from the outset, so that we can build upon them and understand the system, as opposed to thinking we're going to fully understand it and then being blindsided by all of these unintended consequences.

So Susan, I'm going to query you on this first — I feel like the other two are going to have stuff to say too, but I want to get the Frankenstein angle on it. What do we do? What does Frankenstein tell us about how we prepare for unintended consequences, since they're clearly inevitable? We're innovating and discovering very quickly; things are changing quickly. Should we ask scientists and engineers to regulate themselves? Should we create rigid laws? Do researchers need more flexible norms that they agree on? What does Frankenstein — this modern myth that we're constantly using to frame these debates — have to say about all of this?

Oh gosh. I have in my mind something that you said — maybe you should say it. You said something when we were talking in advance, when I was picking your brains about this, about how secretive Victor is, about how he removes himself from his colleagues.

Well, it's true. Yes indeed, Victor is representative of a scientist who works in secret, all by himself, and does not share. As a matter of fact, even in the James Whale film it's the same thing: Victor goes up into a tower and locks the door, and his beloved Elizabeth has to knock on the door to ever see him. This whole idea of a science that is solo and not shared is perpetuated in the retellings of Frankenstein. And thanks for the prompt, because maybe that's a good idea: that we share the science, that we talk about it. And I think that means sharing it not only with other scientists but also with philosophers, psychologists, humanists — people who think about bioethics,
people who think about these questions from different vantage points — and talking about them as the science is being developed, about what human beings could do. I think that's about the best we could do. This idea of sharing is really critical.

From the developer-operator perspective — and I come from a Navy background, a military background — it's really important that you get the people who are developing systems talking to the people who are using systems. We get into trouble when people have an idea of "oh, this is what somebody would want" and go off and develop it in isolation — maybe not in secret, but in isolation. There are a lot of stovepipes in large organizations, and it's really important to create robust feedback loops. We have this saying in the Navy that sailors can break any system — when you build something, you want to make it sailor-proof. But it's really a fabulous thing to take a new design, take a prototype, and give it to sailors, because there's nothing like 19- and 20-year-olds to completely take apart what you just gave them, tell you why all the reasons you thought it was going to be great are completely stupid and useless, and tell you the thousand other things you could do with the system. So this idea of sharing in development means talking to the people who are operators, and talking to the people who are the infrastructure developers. Going back to the self-driving car: think about how we interact with the driving infrastructure. When you come to a stoplight, what are you doing? You are visually looking at a stoplight that tells you to stop or go. Do you think that is the best way for a computer to interact with infrastructure? It's really, really hard for a computer to look, visually see a differently colored light, and take from that the instruction of whether to stop or go. So you have to include the people who are developing the infrastructure and developing the technology, to make sure you're developing in a way that works, in a way that's useful, in a way that's actually the right way to go with the technology. And I think that's a really good example from Frankenstein: because he's solo, designing something that to him is brilliant, and maybe if he had stopped and talked to anybody about it, they would have said, hey, maybe that's not the most brilliant idea in the world.

This relates to the open-source software movement, where there's this maxim that with enough eyes, all bugs will be rooted out and discovered. That's not entirely true — there are bugs that can be quite serious and last for a decade or more — but by and large, you want more people actually looking at the technology. Going back to the robust-versus-fragile idea: you want to make the system as robust as possible, and to do that you need as many people involved as possible, to deal with all the different eventualities. But you also need the different kinds of people, to make the system not just as robust as possible but as well thought out as possible.

I think that's a really important thing: crowdsourcing your development. If you think about what's going on with self-driving cars, one of the most important things happening today that's going to feed into them is the autonomous features in other cars that are already being deployed, and all the information gathering that comes with them — because there are so many people and so many cars out there. They help you park, they help you stay in the lane, they help you do all these other things. Those can all have unintended consequences, but you learn from them, and the more widely you're testing — this kind of incremental approach; I like to say "revolution through evolution": you build a little, test a little, learn a lot — I think that's a really good way to try to prevent
unintended consequences. So instead of just talking about managing unintended consequences when they happen, let's think through what the possible consequences could be and try to mitigate them along the way.

And it relates to the process of science more broadly. People have been talking a lot recently about the reproducibility crisis — the fact that certain scientific research can't be reproduced — and I think that really speaks to the importance of opening science up: making sure we can share data, putting your computer code out there, and letting people see the entire process, the wonderful messiness that is science, as opposed to trying to sweep it under the rug. I think that's really important, making sure everyone is involved in that kind of thing.

So we have time for one more quick question. I actually want to address it to you, Susan, at least first — and hopefully we'll get quick answers so we can go to questions from everybody else. As I'm listening to you all talk about diversifying this conversation and engaging non-specialists, it strikes me that there's an irony there: the prospect of disastrous outcomes — that we might transgress the bounds of acceptable human ambition — is actually a roadblock to having a constructive conversation, in a way. All of these themes we're talking about today, unintended consequences and playing God, are in fact difficult for people to grapple with in big groups. I wonder if you have any thoughts about that. Are there other ways to think of the novel, maybe to recode it for people?

Yeah, you know, I think that culture has done the novel a disservice, because I actually think the novel does not end with everybody dead. (Nor does Splice, by the way — there are a couple of people left at the end of Splice, and of course the company; there's a pregnant woman at the end.)

That is true. That's what I'm thinking about.

So Frankenstein ends with the monster — the creature, whatever we want to call him, good or bad — going off into the distance and potentially living forever. And Victor Frankenstein, yes indeed, is saying "I shouldn't have done that." And Walton, our narrator, who's been heading to the North Pole, listens to him and still wants to go to the North Pole — but his crew says no, no, we want to go home, we're too cold; they're worried they're going to die. But there are still these figures in the novel, both the creature and, to some extent, Walton, who are still questing.

Still questing, yeah. I don't know how I got onto that from your question, but you refuse to see the novel as purely black at the end, I think.

Yeah, I don't. And — oh, I know what I was going to say: culture has done the novel a disservice, because culture has simplified and simplified the story to say science is bad, pushing the limits is bad, this is a bad guy and he shouldn't have done it. I don't think it is that simple, frankly — in the novel, or today for that matter.

All right, well, I am going to ask if anybody out here has questions for any of our panelists.

Thank you very much for a great discussion. I'm curious about what segment of society really wants self-driving cars. One of the concerns is that a lethargy will come upon the rider, the one who's in the car, who's not really ready to act. Say your Roomba couldn't get through somewhere and stalled — you have to interact with it to reset it. So I'm wondering, in a self-driving car, if you're not going to have to do anything, then maybe you're not going to be aware of what's really going on around you. And Musk is the one who started the whole idea — yet is it going to target just a certain segment of society, as opposed to everyone having to be in a self-driving car?

I'm not
going to speak to who's driving them — I think a lot of people have been driving self-driving cars — but your point about people who were formerly driving becoming passengers is important. One of the most dangerous parts of any autonomous system is the handoff: the handoff of control between a machine and a person. It doesn't matter whether you're talking about cars or other systems — an autopilot handing off to a pilot on a plane is a perfect example — it's the lack of full situational awareness. When you have this handoff, that's a really dangerous time for any system. So I think this is one of the challenges: you can't just define the machine, you have to define the system — how the machine and the person are going to work together.

I think cognitive load is a really big issue for engineers as well. We live in an age of so much information — how much information can a person process? You have so many sensors; you can bring in so much data. How do you take that data, extract the knowledge from it, and turn it into information? That's really part of the art of this: how you take so much data, turn it into information, and deliver the right information at the right time to the human part of the system — or even the machine part of the system — to make the overall system successful.

To relate to that: the computer scientist Danny Hillis has argued that we were living in the Enlightenment — we applied our brains to understand the world around us — and that we've moved from the Enlightenment to the Entanglement, this era in which everything is hopelessly interconnected and we're no longer going to fully understand it. To a certain degree, we've actually been in that world already for some time. It's not that self-driving cars are going to herald this new era; we're already there. The question is how to be conscious of it, try our best to make sure we're getting the relevant information, and constantly, iteratively try to understand our systems as best we can.

That goes back to thinking about what understanding means for these systems. It's not a binary situation — it's not either complete understanding or total ignorance and mystery. There's a spectrum: you can understand certain components; you can understand the lay of the land without understanding all the details. Our goal when we design these technologies should be to make sure we have the ability to move along that spectrum toward greater understanding, even if we never get all the way there — and in many cases, that's fine.

I'd like to move on to our next question. I think we have one in the back, or one over here.

Hi. I want to dig in a little on this dialogue that we may all agree it would be a good idea to involve more people in at the start of conceiving these technologies — and ideally, I think we might agree that some public morality would be a good element to include. But say, hypothetically, we lived in a society where, practically speaking, we're not very good at having conversations with the public that are thorny, especially ones that include technical details. Just say that happened to be the case. I want to clarify: is the value in broad public consensus and input, or is the value more in having a diversity of representative thought processes? If the value is in something like openness and transparency, that might call for one kind of infrastructure or feedback, whereas if it's in diversity of thought, you might think of a sort of council where you have a philosopher and a humanist and so on. I think oftentimes
we end up saying something like "we should have a broad conversation about this, and that's how we'll move forward." So I'm digging into what that might actually look like, and how to get the best value in our current society.

Thank you for that question. I'm just going to ask that we keep our responses quick so we can take one more — and they're not mutually exclusive, so feel free to add anything.

One thing I would say — this is maybe a little bit to the side of it — is that people have actually looked at what happens when you bring a diversity of opinions to innovation. Oftentimes, the more diverse the opinions, the lower the average value of the output, but the higher the variance. The idea is that, for the most part, when you bring together lots of people who speak different languages and jargons, it often fails — but when it does succeed, it succeeds in spectacular fashion, in a way it wouldn't have otherwise. So I think we should aim toward that, but recognize that sometimes these conversations involve a lot of people talking past each other, and we need to do our best to make sure that doesn't happen.

Specifically, I always like to tell people: autonomy, autonomous systems — it's not a technical problem. It's not as if I could put a bunch of engineers in a room for a couple of months and solve it. There are all these other aspects to it, so you need to make sure you bring in all the other people: the lawyers, the ethicists, the users, all the different people. You have to be very thoughtful, whenever you're developing technology, to bring all those voices in at an early stage.

I'm Ted Daly; this is for you, Kara. In the last panel, Nancy Kress made what I thought was a very complex, sophisticated argument that genetic engineering has great benefits but also enormous risks. I think Nancy said some aspects of gene editing are illegal — but then she said, of course, you can go offshore. So I want to ask you to address those same things, Kara, about autonomous systems. I think you've made clear that they have both risks and great benefits. Do you think they ought to be regulated at all? And if so, who should do the regulating, given that if Country A does some regulation, in our globalized world it's the easiest thing in the world to go to Country B?

I think it's a great question — it's something we talk a lot about internally. The thing to understand about autonomy is that autonomy is ultimately software. It is software that you're putting into hardware systems. And every piece of autonomy you develop — the software you develop — is dual use. That was my earlier point: I don't think it's really useful to talk about regulating development, because autonomy is being developed for a lot of different things. What you really need to think about is: okay, this technology is being developed, so how, where, and when should it be used? Those are the useful conversations to have — how it's regulated, where autonomy is allowed to be used and where it's not. But the idea that you could somehow regulate the development of autonomy — I just don't think it's feasible or realistic.

With a heavy heart, I have to say that we're out of time. We will all be around during the happy hour afterwards, so we'd love to keep talking with you, answering your questions, and hearing what you have to say. And thank you to all of you for being up here with me and for sharing your thoughts with us.

To wrap up, I'd like to introduce our next presenter, who is an editorial fellow here at New America. Jacob also writes brilliantly about technology and culture for Slate, and he's here to talk to you about a fantastic Frankenstein adaptation.

Before I shifted to writing, I was an academic, and I think the two-hour mark of
conferences was the point at which I faded, to the extent that I was paying attention before. So if you'll forgive me, I'm going to try to gross you out just a little bit, maybe to wake you up. Earlier this week I visited the National Zoo here in Washington, DC, where an extraordinarily patient researcher tried to show me samples of a tiny nematode skin parasite — a little worm that appears to be infecting an animal species that was already facing extinction. There are several reasons why this was difficult for me. One of them, as I pointed out — not at all humbly, I'm afraid — to the researcher: I have extremely long eyelashes. I don't know if you can see from where you are, but they're quite luscious, and because of that it's quite difficult to lean into a microscope. She kept saying, "Pull your head back," but I just really wanted to lean into this experience of looking at this parasite. The other problem, probably more pressing: as you may know if you've looked through a microscope recently, and are also someone like me who barely remembers his biology classes, most skin samples under a microscope don't actually look like anything unless you really know what you're looking at. So I'm looking at these slides, trying to take it in stride, to recognize that this murky pink blot I'm looking at is in fact some kind of tiny worm — and it's not really getting through to me, this worm, until the researcher, who had a very helpful little arrow she could move around on the slide, points out a tiny line of black dots inside what appears to be an air bubble. They're minuscule, even at the microscopic scale at which we're studying them. But she zooms in just a bit more, and I can see that the black line is made up of these small little balls. "See that?" she says. "Those are worm eggs." All of a sudden, just like that, I catch myself itching at my own skin, convinced that I too am literally filthy with worm eggs — which I probably am. "Is there any chance," I say, "that this stuff is on my skin?" Well, no, of course not — this particular nematode is not on my skin. But, thinking back to the trouble with my eyelashes, she responds, "Have you ever heard about the mites that live in your eyelashes? They're mostly harmless." Just like that, I realize: I'm a landscape. I'm a habitat. I'm an ecosystem. And that means my body is not my own — at some fundamental level, I'm not me. Thanks, science. That's what I needed to know today.

Now I'm going to ask them to roll a clip that you will at first think has nothing to do with nematodes. "Listen to me very carefully."

Watching this scene from Terminator 2, I'm drawn to one thing in particular. Not the grotesque horror of Arnold paring away his flesh — although apparently it was even grosser in the original cut of the film — but rather the reactions of the two characters watching him, Miles and Tarissa Dyson. Even before they know what's happening, and all the more so after they do, they respond with a kind of outsized horror, screaming and writhing as if they were the ones under the knife, as if Arnold were peeling away the flesh from their arms. Now, you might dismiss this, perhaps reasonably, as a bit of comical overacting. I'm not convinced, though, that that's what we're seeing here. I'd suggest we're witnessing something else — something more like what I felt peering at those nematode eggs under the microscope. What they're seeing as he peels away his arm isn't horrifying because it's gross, but because it's uncanny, in something like the Freudian sense. The uncanny, Freud wrote in 1919, is that class of the terrifying which leads back to something long known to us, something once familiar. He's trading here on the German word for the uncanny, unheimlich, which as I understand it means something literally like "unhomely." For Freud, the unhomely is not the negation or opposite of home, but that which shows us that home has always been something other than what we thought it was. Uncanny things are those things that
estrange us from ourselves and our world precisely because they call us back to where we came from. Freud, in other words — and maybe not surprisingly, if you know your Freud — thought that we feel something is uncanny when it brings to mind sentiments or ideas that we've repressed. While I'm not here to talk about the repressed, this is still important for our purposes today, I think, because Freud thought it wasn't the unknown that scared us most, but the known. And here it's worth noting that the panel you're about to listen to is on the fear of the unknown. Freud wrote that some new things are frightening, but not by any means all. That is, it's not novelty that frightens us, although it sometimes seems to — not the newness of science, or of technology in particular. As I learned staring through the microscope earlier this week, when the new is frightening, it's mostly because the more we learn, the more alien old things begin to seem. This sentiment is also central, I think, to the scientific horror of a book like Frankenstein, where the quest for knowledge mostly serves to teach us how much larger the world is than we realized. I see something similar at work in this scene from Terminator 2 — something I'm inclined to call the technological uncanny. The technological uncanny would be what we experience when new discoveries imbue familiar things with unfamiliar qualities. What the Dysons — those two horrified characters quaking on the floor — are realizing in this scene is that they may not be who they think they are. Or what they think they are. Sorry, that was tacky of me; my papers were stuck together. There's a sort of sequel to this scene from Terminator 2 in the recent film Ex Machina, where Domhnall Gleeson's character slices into his own arm with a razor and peers into his mouth in the mirror, trying to convince himself that he is not also a robot. And that's the trick: once we start to realize how much hides beneath the surface of the visible world — how many "slimy things did crawl with legs upon the slimy sea," as Mary Shelley's beloved Samuel Taylor Coleridge puts it — we also begin to learn how little we know. So this is the lesson I take from this scene from Terminator 2: it's not the strangeness of new technologies that frightens us, but the way technology threatens to make us strangers to ourselves. In a semi-Freudian spirit, then, I'd like to propose that where Frankenstein and its spawn are concerned, our fear of the unknown may really be about our discomfort with knowing. Thanks. I'll invite up — I think Bina will introduce everyone else? Yes, perfect.

Thank you, Jacob. To lead us fearlessly into a conversation about the fear of the unknown, let me introduce Bina Venkataraman, Carnegie Fellow here at New America and Director of Global Policy Initiatives at the Broad Institute of MIT and Harvard.

Thank you. We'll invite up our panel and introduce them once you can see their faces — if anyone needs to do a seventh-inning stretch, feel free to move around in your chairs. So we have here with us today: Dave Guston, who is the founding director of, and a professor in, the School for the Future of Innovation in Society at Arizona State University; Charlotte Gordon, who is the author of Romantic Outlaws: The Extraordinary Lives of Mary Wollstonecraft and Her Daughter Mary Shelley, and also a distinguished professor of humanities at Endicott College; and Annalee Newitz, who is the tech culture editor at Ars Technica and the author of several books, most recently Scatter, Adapt, and Remember: How Humans Will Survive a Mass Extinction. So we have a great panel here, and despite the billing that this is about the fear of the unknown, talking amongst ourselves I think maybe a more apt name for what we're going to discuss would be "fear and loathing, from Frankenstein to the future." We really want to explore fear writ large — fear of the unknown, fear of the known, as Jacob aptly brought up, fear of the hideous and the strange — and look at how the themes
that carry from Frankenstein into how we think about science and technology today, and how we think about the future. So Charlotte, if you would not mind kicking us off as our expert on the life of Mary Shelley: what do we really know about how Mary Shelley intended to represent fear in Frankenstein? What was she trying to say about her own fear, about the fear of others, and what can that tell us?

Thank you. Did you set Jacob up ahead of time? No? We talked not at all? That was perfect. It's funny, because Frankenstein has gone down in history as the first novel of science fiction: it's about science, it's about innovation, it's about the future. But I think another very interesting way to read Frankenstein is to see it as political commentary, and a real exploration of what Mary Shelley feared about what she knew, which is why I thought Jacob's introduction to our panel was so perfect. And what was it that Mary Shelley knew? Well, she was born in 1797, and if we just pause for a second and think about what it meant to be a woman in 1797, it wasn't a lot of fun. Women could not own their own property, they couldn't have money, they could not initiate a divorce, and their children were considered their husbands'. According to English political theory and English thought, nothing was more dangerous and harmful to the kingdom than disorder, and so the role of men in English society was to keep women, their daughters, their sisters, and their wives, under control. It was the responsibility of the man, of the husband, to discipline his wife; the only rule was that the whip he used could not be thicker than his thumb. That's where we get the idea of the rule of thumb. So in many ways you can read Frankenstein as a real exploration and a real condemnation. This might be a little bit hard, since we don't have the text in front of us, and you're not my students, so I can't browbeat you into opening the text. We do have the text in front of us, really! But remember
that most of Mary Shelley's contemporaries did not think that the situation for women was bad; that was just how it was. But Mary was different, because not only, as Susan pointed out, was she the daughter of William Godwin, rock star political philosopher; even more importantly, she was the daughter (thank you, Joey) of Mary Wollstonecraft, radical feminist, who wrote A Vindication of the Rights of Woman and was an international superstar amongst liberals and radicals, and was called a whore and "a hyena in petticoats" by everyone else, and there were a lot of them. Unfortunately for little Mary Shelley, her mother died 10 days after giving birth to her. But Mary read all of her mother's books, and by the time she was 12 or 13 years old she had decided: I am going to live according to my mother's ideals. I am going to be a beacon of freedom. I am going to fight for justice for all people. And she dedicated her life, in fact, to living according to her mother's ideals. Really interestingly, as she's writing Frankenstein, she suffers two terrible experiences: two young women that she was close to killed themselves. Why did they kill themselves? And her own mother, Mary Wollstonecraft, had also tried to kill herself twice. Why?
Because throughout the 18th century and into the 19th century, what was more monstrous to English society than an unwed mother, or, worse even than that, a woman who thought for herself: an independent woman, an ambitious woman, an intellectual woman? So Mary Shelley, when she's writing Frankenstein, is thinking, and I know it's strange to think about, but she's thinking less about technology than she is about the social ills that she herself has endured and that those close to her have endured. And she's really describing a world, if you think about it, in which there are no mothers, in which the ideals, as she saw it, of femininity or of women, and we would complicate that discussion now, but for Mary Shelley, did not exist. So Victor Frankenstein works in isolation rather than in a community. He's driven by ambition and by self-promotion, in that sense. And then he doesn't nurture or educate his creation; he's a horrible parent. This is terrible, thanks, Mary Shelley. And the only voice of sanity in the entire novel is off stage: it's Walton's sister, whom he writes to, who writes letters to him. And in fact that's really how the novel ends, I just have to say: Walton listens to his sister, who says, stop with the ambition already, leave the North Pole, come home and be with your family, and live in community like a normal human being, like a good human being, is the implication. So what is Frankenstein really about? It's about all the things we've been talking about today, but it's an incredibly damning political commentary about life in England during Mary Shelley's life, and on into now too. There are, of course, implications that we've been reading this wrong.

So when we think, aside from reading it wrong, when we think about the fear of the unknown, we carry on this idea that Frankenstein is synonymous with horror and fear, right? Are we reading this wrong in thinking that she was trying to tell us to be afraid? Is she actually condemning being afraid of the unknown, in this case the unknown
being femininity or women?

That's a great question. Well, you know, I'm a writer, I'm a thinker, so I don't think any reading is ever wrong. So no, I loved everything we've been talking about, and I think the applications are all there for us. But I do think that if there is a sort of moral to Frankenstein, then who did Mary Shelley herself most identify with? Probably the creature, you know, as Joey said, the unnamed creature. Why? Because that's how people responded to her, as an intellectual woman and as an unwed mother. She was called a whore. When people found out that she wrote Frankenstein, they said, what kind of woman would write such a book? There must be something wrong with her; there's something perverse about a woman who would write such a book. So later in her life she says, oh, I actually wrote it, but that's because the idea came to me in a dream. And we know that isn't true, because we have her notebooks. She in fact thought of the idea, and she worked on it really hard. She worked on it really hard while young women around her were killing themselves, and also, incidentally, while she was reading the history of slavery. So she's dedicating herself to the ideas of social injustice and the suffering of those who are considered monstrous by their own society, herself included. So she sees herself, as a woman who wants to publish and be smart in her world, as someone who's going to evoke feelings of, is it a feeling of monstrosity? People will react to her as though she's a monster, and she's saying: don't do that. Don't do that.

It's interesting too, if you think about the way Frankenstein has been adapted, especially throughout the 20th century, that slowly the narratives have become a lot more team monster rather than team Frankenstein, and it's been fun to watch, yes, because we're slowly starting to sympathize more with the monster, and I think we can look at that in the way...

I've always been team monster, I just want to lay that out right now. And I think the
monster, we should feel terrible for the monster, because the monster is horrifically abused. Like you said, if we look at the monster as a stand-in for a woman who's thinking independently, or someone who's been enslaved, this is a creature who is responding to circumstance. The creature has only ever experienced cruelty, so of course he dishes it out. But I think if you really want to complete the arc and go all the way up to the present day and think about a modern Frankenstein narrative, Westworld, the new TV series, is a fantastic example. And in fact the great thing is that of course Westworld is also an adaptation of a previous narrative, which had male protagonists, and now it has female protagonists and African-American protagonists, and it's very much explicitly about what happens when you're a creature who's been made by an indifferent kind of corporate science. I mean, there are some complications there, and we could talk about Westworld over drinks afterwards, but I think that's a narrative where we are fully sympathetic to the creations and the creatures, and we understand that they're being abused, that their minds are being tormented by the act of creation, and that their only medium is violent uprising. That's their only hope: to gain control over the means of their own production, which actually happens in a fantastic episode, sorry, spoilers, where one of the characters learns how to take control of her own programming. And it's such a fantastic scene. It would be like if Frankenstein said, alright, screw you, sorry, if Frankenstein's monster said, screw you, Frankenstein, I'm going to remake myself. I think we have seen that come full circle, and I do think that that is a response to cultural changes and to how people understand the process of making an artificial being, which of course is becoming more and more realistic in an age of robots.

That is one of the holes in the plot of the original novel: the creature actually uncovers Victor's notes, which
had not been revealed to the reader previously, and they say: smart, do it yourself, don't ask the guy to do it for you. But to turn it back to this framing of the fear of the unknown, I want to know who's responsible for that. It wasn't Joey, probably. It wasn't Ed. But it's Frankenstein? It was by committee, okay. So by framing it as a fear of the unknown, it puts the knowers in control of whether there is fear, of what exactly the response is. To tick down a list of what characters in the novel are actually afraid of: these are real threats to dearly held values. There's fear of loss and loneliness, there is fear of the other, there's fear of disharmony, there's fear of female sexuality and power, there's fear of the inversion of the master-slave relationship. These are real things that are really operative, sort of pushed off to the side, the same way that we think of, say, Luddism as a movement of ignorance and reaction, when there were real values there: real people being able to put food on their table and provide for their families. We do something that's probably not good for the way we want to ask questions about contemporary science and technology.

Well, I think this is a great point, and it's important to us. And I think contemporary psychologists would say, well, actually, we're more afraid of what's salient. So we're more afraid of a terrorist attack because we see it on the news and it's constantly being reiterated for us, an availability heuristic is the terminology, but we're less afraid of a distant problem like climate change, because we see less of it, though we're seeing more. Do you think it's the fear of the unknown that complicates and leads to most of the controversies we see around new areas of technology, like nanotechnology or artificial intelligence, or do you think there's something else at play?

I think there's a lot more at play, and actually what I would also want to do around that kind of framing is turn it
around to figure out who the "we" is as well. The "we" is sort of this composite, monstrous kind of thing, and we have a variety of social and technical means to try to figure out who that "we" is. So we just had an interesting election, and we heard about Trump and Stein and Franken-Trump previously. We have these interesting composites of machines and people and orders and behaviors and rules that try to create what the "we" is, who "we the people" are in this country. And it turns out it gets really messy when you try to do that: you use one set of rules, say the number of people who voted for somebody, and you get one result; you use a different set of rules, the number of people in certain states that total up certain numbers of electors, and you get a different result. So figuring out who that "we" is, that is feeling what it is that we want them to feel, is incredibly difficult too.

So let me put it to you another way: how do we, on the part of the public or politicians, factor fear into the development of science and technology? What can you tell us about recent developments in science and technology and how you see fear playing a role in their development or non-development?

You know, I'm going to push back again. Usually the way this is cast is that there are the scientists and the engineers and the technologists, and the public is either accepting of or afraid of the kinds of things that they're offered. Well, why do a lot of things get created anyway? Because of fear: fear of loneliness, fear of death. And so it's not the case that fear and knowledge operate in a dichotomy; the people with knowledge operate out of fear as well. And the people who are normally cast as afraid in the Frankenstein films, the folks with the pitchforks, have rational responses in many instances. And so I actually want not just to move the unknown off to the side; I want to move the fear off to the side also, because I think it's fairly destructive in talking
about how it is, to go back to the previous panel, that a whole mess of different people can come together and talk about what it is that we want out of new science and technology, and not simply a reaction between the rational and the afraid.

Well, it's an interesting question you raise, and I'm going to pierce through that dichotomy as well, because I think we assume that fear is irrational, but aren't there a lot of fears that are perfectly rational? If you're standing on the edge of a cliff, it's pretty rational to be afraid, right? So what level of fear about emerging technologies, and Annalee, I'd love to hear your perspective on this, is rational to bring into the conversation, or is productive? And how should we be talking about our fears in a way that helps guide us?

It's a good question. I will always remember when I met the synthetic biologist Drew Endy, who's at Stanford now, and he told me, "I hate Frankenstein, because it's wrecked my career." And he was quite passionate about it. He really felt that, because he was doing genetic engineering, which of course was not designed to be used on humans; it was going to be used on really quite safe projects, projects where they built safety concerns into their experiments from the beginning. But he felt that whenever you used the phrase "genetic engineering," people would immediately jump to the Frankenstein fear, and they'd say, you're playing God, or you're trying to destroy nature, as if nature is something that we can quantify, and as if we haven't been changing nature for, I guess, over 8,000 years with agriculture, and haven't radically transformed tons of animal species and plant species for a long time. And so I think that there are a number of questions that we have to bring to any project, whether it's autonomous cars or genetic engineering, around safety and around unexpected outcomes. And that kind of goes back to telling different stories other than
Frankenstein. Because Frankenstein is a scenario, if you want to look at it that way. It's a scenario where a guy who's a total dick makes a monster, and, you know, maybe he's actually full of fear, maybe he's building it out of a fear of loneliness or something like that, but he creates a creature and turns it into a monster. He neglects it and he mistreats it, and it turns out to be a really bad situation. But there might have been another way out. There might have been a way of rearing that creature as a child and sending it to school and doing all kinds of nice things for it, maybe giving it a college education and health care. All those things might have really helped the outcome for the monster.

Maybe give it a name too, while you're at it.

And so I think that a great idea would be to have a whole set of stories that aren't just Frankenstein, that are counter-stories about what would be a good outcome. Like, what would be a version of this kind of scientific experiment where we do the right thing, or where we kind of, as Sam was saying in the previous panel, have a middle-of-the-road approach, where we're not totally ignorant and fearful and we're not completely in control, but we kind of iterate? And there's a project going on right now about the Anthropocene, which is the kind of geological era that we're in now that is sort of human-created, and there are a lot of fears about climate change, obviously, and about what our role in it as humans is. But a bunch of universities have gotten together and created a set of stories about how to have a good Anthropocene, and actually you can find it online; it's called goodanthropocenes.net or something like that. And their stories are taken from real life but also science fiction, about moves toward actually caring for the environment in a sustainable way, using technology or using other techniques, and just sort of
thinking about, well, what would be a good outcome? Not a perfect outcome where we're all living in a super land where we're all eight feet tall and live forever, but one where we do okay: we screw up the world, but we muddle through, and we manage to repair the things that we do wrong.

That's fascinating, but, oh, jump in, Dave.

Yeah, the two pieces from the Frankenstein genre that appeal to me in exactly that way are, of course, Young Frankenstein, highlighted when Gene Wilder goes into the cell with the creature and warns them not to open the door no matter what happens. You know, he panics and pounds on the door, and then he says, "Hello, handsome," and it's that moment when he decides to love his creature. And of course that goes to Frankenweenie, which is, you know, not a Frankenstein story but a boy-and-his-dog story. But the point, of course, is that the creator is working with the being that he has this deep connection with and loves to the ends of the earth, and that's the sort of perfect loving-the-creature interpretation.

Also Frankenhooker.

Highly recommended, everybody, highly recommended. Actually a very underrated narrative from the 1990s, Frankenhooker, which actually ends with a kind of collective uprising of the women who've been abused and turned into this creature, so it's kind of awesome.

Which is also where Penny Dreadful is going.

Yes, Penny Dreadful, another key Frankenstein narrative. The thing that's interesting about Penny Dreadful is what she embeds... I love this idea of many stories. I think you're so right that that's kind of where hope lies, and that it will help us get to this sort of middle ground, or the complicated ground, or help us with our entangled ground. And I think one of the things that Mary Shelley embeds in the novel is the creature's innocence, because if you think about it, when he begins, he's very well read, he gives himself essentially his creator's education, he
reads Paradise Lost. He spent a lot of time hoping, I mean, that's the sort of irony of it: he spent all that time hoping for this beautiful connection, which is why I love that we're on team monster for a while there. And so I think you're so right that if we have these other stories... In my mind, what I did most recently for Slate, I ended up changing it, it wasn't an open letter from Dr. Frankenstein to Elon Musk, but that's how I first thought of it. What I ended up doing instead was I just said, here are three lessons that Victor Frankenstein should have learned, and that you, Elon Musk, with OpenAI, can benefit from. You know, the idea of not working in isolation; the idea of putting many brains on a project, not just to stop the bad tyrant who's going to get hold of the invention, whatever it is, but to sort through the bugs, and also, of course, well, I'm the historian, I'm not the futurist, so I won't talk about what all those minds can do. But the third one was funding. Frankenstein had very poor funding, and that's why the monster, I hate saying monster, that's why the creation was kind of funny looking. I mean, if you had to go dig up an arm here and a leg there from a bunch of different graves, that's not going to make a beautiful thing. Maybe Frankenstein needs to testify before the House.

That could be really beautiful. I think that would be great, because the other thing Frankenstein didn't do, with the poor funding and the isolation, is that Frankenstein did not prepare the public for the advent of his creation. He was not out there telling people, this kind of funny-looking thing is going to come charging out of the woods, don't be scared, say, greetings, happy to meet you, have you read Paradise Lost? Right? So he didn't talk to the users. Nor, to be fair, did he talk to his creation once it became alive. And I
think that's the difference between the kind of science we're talking about now and the science we may be doing in 50 years, where we're actually creating entities like human consciousness. Maybe it won't be exactly like it, and maybe it won't be 50 years, maybe it will be 200 years or a thousand years or whatever, but how do you do that? What kind of rules and ethics do you have when you're creating a being that is a human equivalent? I think then maybe you have to go toward thinking about children and how we treat children.

The previous panel, the second panel, around autonomy, just didn't quite reach it, because I don't think you can read this novel and have an uncomplicated idea of autonomy. You can even think about trashing the word, in a way, no matter whether you're talking about humans or about systems that humans create, because a whole part of this is wrapped up in how it is that creatures, whether they're animals or not, are constituted through interactions, and the very concept of autonomy sort of bleeds away when you have that complication. So if you're thinking about creating an autonomous system out of hardware and software, well, the autonomy that you've made is one that is constituted through a whole lot of programmers, a whole lot of people who put stuff together, a whole lot of users. What does the concept of autonomy actually mean when this thing is constituted in all these diverse kinds of ways?

I think you're raising a really important point, which I might cast a little bit under the veil of predictability and unpredictability, and sort of loss of control, if you want to go back to the loss framing. And I think that one of the things that's so challenging when we talk about areas of science and technology, in my area of climate change, is that there's a lack of control for any individual person over how you address it, and then there's a whole lot of unpredictability in how it plays
out over the future. And I'm just thinking about your book, Annalee, and how you're trying to project us into this future where we face mass extinction. How do we even get at these problems that are unknown and uncertain, where we have very little control? It seems to me that the fear can either just hit the panic button or cause us to completely look away. Is there any way that we can get past that kind of fear when we think about the future?

I mean, I think that's what was so interesting about the introduction about the uncanny: how what we're really afraid of is the stuff that we're basically forcing ourselves to forget. It's stuff that we already know, but we're trying to look away, as you said. And I think with things like climate change, the way forward, part of it is storytelling, as I said, sort of thinking ahead by having a narrative. And that narrative doesn't have to be science fiction; it can be something that we get from climate modeling. A lot of the best work that we have on the future of the climate is coming from people who are taking data and using it to project into the future what we might see from sea level rise. Those are stories about what's going to happen to the shape of the planet. We also have models of what happens to an ecosystem when enough species die out: you get to a point where, if enough species die out, then you get more species dying out, and more species dying out, and we can model that too. All of that sounds horrible, and people don't want to think about it, and that's the moment when people kind of shut down and they want to forget. And that's why I think, again, it's important not just to say how terrible we are and how we're killing everything and everything is doomed, versus, you know, everything is fine. We have to have a sense of, yeah, we're going to screw up, we screwed up, but we can fix it, and when we fix it, well, maybe it won't be perfect, but it'll be a little
better. And so there has to be a certain amount, in our stories and our thinking about the future, and even maybe our science, of a willingness to forgive and try again and try again. And when I say forgive, part of that is forgiving ourselves for screwing up, because if you keep telling yourself you're terrible for having screwed things up, that is when you go into the shutdown mode. And that's what happens to Frankenstein: he realizes he's screwed up and he just runs away from the problem.

So, do you disagree with that? Nancy, on an earlier panel, was saying, you know, when things go well, it just makes for a crappy story, right? But I hear you bringing up a couple of times now that there can be a positive story about the implications of technology, a positive story we can tell about the future. What's the way to make that a good story?

A great example would be Kim Stanley Robinson's novel 2312, which is about the year 2312, so it's set in the future, and it's a story about a future solar system. Earth is there, and Earth has kind of fixed a lot of its environmental problems. They're doing things like raising Florida back up out of the ocean, and they've de-extincted wolves by building asteroids where they have artificial environments to de-extinct animals, and they shoot all these wolves down onto the planet, in safety bubbles, no wolves are harmed in the re-populating of the wilds. But they screw up all the time. People kill each other. There's a horrific conspiracy to wreck all of these plans. There's no simple good story; it has to be just kind of a good-enough story. And so it becomes an interesting story when you think about all the conflicts involved in trying to fix the mess: the mess is there, we're trying to fix it, and in the process of trying to fix it we're going to make a new mess. When we raise Florida up out of the ocean, that's going to wreck a bunch of new ecosystems that have
formed. If all of the ice melts in the Arctic, say, a whole new economy will spring up in the Arctic. So there are always new ways to screw things up, even if you're fixing old problems, and I think that's the acceptance we have to have: things are always going to be ambiguous.

I want to open it up to questions, but do you have a final thought on that?

I wish Mary Shelley were here. By the way, her name was not Mary Shelley when she wrote this book; nor was it Mrs. Percy B. Shelley. She was not married; she was an unwed mother, I just want to remind you of that. You know, she didn't stop with Frankenstein; there was more spawn. The next book that she wrote is called The Last Man, which is about a mysterious disease that wipes out all of humanity except for a few people who have many wooden conversations. I mean, it's not a great novel, but she finds her way out of the very dystopian dilemmas that she keeps dreaming up. Her novels, honestly, are just not fun to read, because she's really a philosopher and a thinker, so ideas dominate, and people being real and having actual conversations that we can follow just aren't important to her. But by the end of her fiction-writing career, and she goes on and writes five more novels, what ends up happening is very complicated, but essentially it's the idea of community, of coming together in community. By her last novel there are no heroes; all of the men in her novels are weak, and what ends up saving people is women coming together and saving men from their ambition and learning how to live together in a kind of communitarian, flawed, but still better-than-The-Last-Man universe. So I wish she could be on the panel, because she'd have a lot to say. She would have gotten like an A++ on the Bechdel test, where the women have to have conversations; she totally would have passed.

All right, so let's open up to some
questions. I'm sure there are some interesting thoughts and questions that we've encouraged, with question marks of course. Is anyone afraid of asking a question? Why are you afraid?

I'm not sure how to frame this, but I've just been thinking a lot about the ways that Mary Godwin Shelley's novel has had unintended consequences of its own, in the way it's been interpreted and reinterpreted so many times in culture, and I'm sure that sort of thing comes up all the time with Frankenstein. But I suppose I would be particularly interested to hear anyone on this panel's thoughts about that. Was there something inherent in the novel that intended the consequences, the sort of massive mythological effect that it's had, or has culture done a real disservice to the novel, creating a myth that it itself has needed and continues to need?

I just want to answer historically to begin with, before we talk about the metaphysical implications. Mary Shelley made not a penny out of Frankenstein. Only 500 copies were published in the first edition, and not all of them sold. She made no money. And the reason the name Frankenstein and the story became famous is that in England during that time, playwrights could just rip off novels freely, right? So the story itself gets popular on the stage, and no one has to adhere to the novel itself, but the name Frankenstein and the story therefore become known by the public. But people aren't sitting around reading the novel. So right, almost from the beginning, there's a detachment between the actual text and the story as it's told, and I think that's where "Frankenstein is the monster, isn't he?" comes from, you know, that mirror thing between Frankenstein and his creation. But so, I just have to be a historian.

Yeah, and I think that's such an interesting point, because I think that Frankenstein has become so well known because of all these adaptations. I mean, it's kind of the first
fanfic phenomenon. Fanfic is when fans write their own versions of stories, or stories set in the universe of their favorite stories. And so a lot of the adaptations of Frankenstein end up having the same kinds of conversations, in a weird way, that we've been having: are we team monster or are we team Frankenstein? And it allows us to have this kind of productive debate, but in the realm of fiction, so it's not too scary. We don't have to be talking about actual scientists and actual politicians; we can say, well, why was the monster wronged, and what did Frankenstein do wrong? And I think it's all out there in these adaptations: some of them are super team monster, some of them are just like, the monster is horrible and can barely talk, and it's really all about poor Frankenstein; some of them create whole new characters.

But there is, as you said about Drew Endy, this issue of scientists perceiving it as an anti-science screed, when, you know, when people talk to scientists and engineers, a lot of the motivation for them to do what they do, to do science for good in the world, actually comes from science fiction, undifferentiated between whether it's sort of pro-science science fiction or anti-science science fiction. And my favorite example from Frankenstein is not from the novel but from the 1931 film, where Earl Bakken, who was the inventor of the transistorized pacemaker, saw the 1931 film as an 8- or 9-year-old boy, became fascinated with the connection between electricity and life, and created this essentially Frankenstein technology in his garage in the mid-1950s. And so I don't know if there's anybody from NSF in the room right now, but Ed and I have fought this little battle with NSF about how much we identify the word Frankenstein, the reference to the novel and the films, with work that is being sponsored by the National Science Foundation, because of this issue.

It's a male-centered novel, even though, you know, there are no women really, and if there are, they die. The other thing is that it's half
of a marital battle that she was having with Percy. So, for the literary people in here: Percy goes on and writes a long poem called Prometheus Unbound, which is almost exactly the opposite vision of Frankenstein's science.

The thing that scares Victor the most in the novel, it seems, is to create a female creature, to create a bride. And so it strikes me, and maybe the points that you brought up, Charlotte, about femininity and politics at the time explain that, but it seems striking to me that the thing that seems to horrify him the most, beyond words really, and causes him to become violent, is that specter.

I would say, super briefly, that I think that is less a fear about femininity than it is a fear about the racial other, because what he's afraid of is that these two monsters are going to go off and make babies and they're going to take over Europe. Oh my god, what do you think that's about? I don't know, it doesn't sound relevant. That would have been precisely what I would have said as well: it's a new race. It's about the potential female creature being so strong that she could take any mate that she wanted if she rejected the creature. Yeah, so miscegenation too. That's a dark place.

Okay, we're going to take one last quick question, we just have a couple of minutes, from the gentleman in the glasses here who had his hand up, and then we get to get the booze, so we have an incentive to speed it up.

The two things that have been going through my mind in the last few minutes are the industrial revolution and also the huge upheaval in the democratic and monarchical structures in Europe at that time, and I wondered if that sort of fear is implied anywhere in Frankenstein.

I'm sorry to jump right in. Mary Shelley was deeply involved in the political systems, and in protest against the political systems, because of her own beliefs but also because of her father and, ultimately, her husband Percy. So I think that Frankenstein can be read, as I
think I've said for the five thousandth time, as a political novel. This is a novel about race; it's about gender; it's about the social injustices as she saw them. I mean, it's a real cry out against social injustice. As for the industrial revolution, it worried the Shelleys, but they were excited by innovation, technology, and science in general. She herself, if she were here on the panel, which I'm imagining, would be pretty pro-science. It's unchecked male ambition, racism, and inequality that she's against, not science per se.

It's interesting, because I think Frankenstein, especially later, gets read as a figure for the proletariat. Not Frankenstein, the monster, sorry, I have fallen totally into the trap. The monster is read as a figure for the proletariat, especially given the fact that the monster is made from parts, and I think that's part of the kind of assembly-line idea: this is a creature that almost sums up what industrial production is like. And oftentimes the Frankenstein monster is contrasted with Dracula, who comes at the end of the century and is the sort of aristocratic monster, although he also represents a bunch of other stuff too. So I think that the industrial revolution does kind of haunt Frankenstein and the monster, but especially later, especially in later adaptations, especially because Frankenstein does rise up, you know, and kind of slay his maker. The creature, I'm sorry, I apologize in advance.

That's fascinating, that's fascinating. Yeah, and, I'm a political scientist by training, and there's this wonderful frontispiece, and there's actually real research still to be done on this stuff, to some edition of Hobbes's Leviathan, which, I mean, the book had to have been in Mary's father's library, whether this particular edition was there or not, which shows the Leviathan, which represents the body politic as a
composite being. Okay, so we've got all these different composite beings running around: not just the creature but the governments that we constitute in liberal society, and the corporations that we constitute, which are bodies, fictional bodies, that earn profits and can endure past the lifetimes of their owners. And so we have all these composites that we haven't figured out how to control yet. We can't fail to find resonances with that in Frankenstein today; I don't know how we could.

Thank you all so much for a wonderful, wonderful panel, and let me just say thank you all for coming. If you're interested in learning more about Frankenstein, visit the Future Tense channel on Slate for cheat sheets, quizzes, and a video about Frankenstein. You can also learn more about what we're doing at ASU around the Frankenstein Bicentennial Project at frankenstein.asu.edu. And now I invite you to please join us in the foyer for refreshments.