It seems like there's some sort of force that slows down STEM graduate students and keeps them from graduating. I guess you could call it science fiction. Even if you're not a fan of science fiction, you should probably acknowledge that the genre is important for at least a couple of reasons.

First, science fiction, more so than many other genres, provides a high-contrast context for thinking about big questions. Asking "what is love?" while watching a romantic comedy might help you appreciate some of your more complex feelings for Richard Gere, but when you're watching a man fall in love with a smartphone app, you're exploring the absolute boundaries of what love really is for you. Of course, just having flying cars or robots in a movie doesn't mean you're going to inspire an existential crisis in your audience, but sci-fi does seem to be a regular backdrop for exploring important issues.

Second, science fiction governs how people think about the future in a very concrete and measurable fashion. I'm not talking about kids reading Jules Verne and wanting to be engineers when they grow up. I mean that the visions of the future detailed in science fiction stories are the specifications that your world is being built to. One of the primary reasons you own a cell phone is the representation of communicators in Star Trek: The Original Series. You only know the word "robot" because of a play from the 1920s about human-like machines, and many people think the internet only happened because William Gibson wrote Neuromancer. So sci-fi is both a fantastic vehicle for introspection about complex issues and an important part of how we think about the future.

Unfortunately, sometimes media creators just want to write stories about flying cars and robots and don't want to deal with all the crazy implications of the technologies they propose. So they just sort of write around them. A machine that disassembles you and reassembles you somewhere else?
Man, that would be crazy. What if it reassembled you out of different particles? What if it duplicated your particles and then there were two of you? No, you know what? Forget it. Let's just have it move your bits somewhere else. Wait, so the holodeck lets you play God and create an entire universe to your specifications? Let's just not talk about that, okay?

That's not to say that this sort of sidestepping makes a story bad. Asking big, open-ended questions without good answers makes it really difficult to get a neat, tidy, satisfying ending. But the thing is, by repeatedly approaching and then evading questions about what complex scenarios science or technology might present us with, I think these more simplistic stories are conditioning people to believe that they can safely ignore those issues, that everything will just sort of work itself out before we cut to commercial. I've made several videos about topics like artificial intelligence, time travel, and artificial reality that are meant to address some commonly held beliefs based on media that's maybe one part science to a hundred parts fiction. Just as an example, how many people have you heard say they're genuinely concerned that Siri or Google or something is actually becoming Skynet? How many of the two-thirds of Americans opposed to genetic engineering do you think have, say, legitimate concerns about societal inequality, and how many do you think are just thinking of Jurassic Park?

Of course, attitudes do gradually acclimate to new technologies over time. But some of the subjects that previously were only covered in science fiction are really just around the corner, and we might not have a lot of time to prepare for them if they do take off. Let's look at some things that we're used to brushing off in sci-fi that we are dangerously close to needing to deal with today.
At the Game Developers Conference a few weeks ago, the spotlight was dominated by several companies that are currently steaming toward very realistic consumer virtual reality products. There are a ton of these devices, and they're all getting really good. The general easy way out in fiction is to say that human beings just can't handle perfect virtual worlds, or that we'll always prefer the real thing. But there are very few reasons to expect that either of those is true, and a lot to worry about if they're not. What happens to society when, for about $1,000, anybody can have a convincing virtual reality environment in their living room? If you were given the ability to create any sort of virtual environment that would feel real, what would you make? And what if that virtual world was genuinely better than the real one?

Next, we've all experienced Google's spooky ability to figure out what we mean from a truly bizarre set of search terms. But that's just the tip of the iceberg when it comes to artificial intelligence. The first deep learning algorithms have been using the massive volume of information on the internet to achieve things that would take human programmers thousands upon thousands of man-hours to code. We already live in a world where computers are grasping trends and correlations that we don't understand. The easy way out in fiction is to say that there are certain things computers are just fundamentally incapable of understanding: things like bluffing, or emotion, or irrational behavior. And so humans will always have an edge. But that supposedly categorical edge seems to erode a little bit every year. What happens if a powerful enough AI grants us the ability to manipulate large-scale variables of our world? What if that power allowed for a government without any people in it, just a remarkably efficient and effective algorithm? What would we want that algorithm to do? And what happens when those programs start creating the parameters for other programs?
Conscious computers might be a long way off for human programmers, but we don't have to code them ourselves. We just have to program the programs that will program them.

Finally, as I've noted before, the idea of the 40-hour work week was developed in a time before assembly lines, let alone 5-axis robots or 3D printers. The efficiency and adaptability of automated manufacturing processes continues to rise meteorically. The number of industrial robots shipped every year grows, and with each robot shipped, a couple of human jobs are eliminated. And that's just industrial-type jobs. Artificial intelligences are already playing the stock market game at a level that even the most savvy trader from 20 years ago couldn't possibly comprehend. Are computers going to be better than us at everything? The easy way out is to say that there will always be a demand for jobs that human beings will always be better at. But what if there isn't? Many political candidates campaign on job creation, but it's totally feasible that at some point in the future there isn't going to be enough work to go around without simply digging holes and paying people to fill them in. What will happen to a civilization built atop working for money when there are enough robots to keep everything running without us? What would we do all day long if we didn't need to work for a living?

Those are just three things that might be on the horizon, and there are many more unsettling implications that you can derive from just those three. Even if you're not a fan of sci-fi, the importance of thinking about these sorts of issues can't be overstated. If just one of these things comes to pass and we're unprepared because we were expecting Star Trek and got Terminator, we're going to be in a lot of trouble. So I guess what I'm saying is: queue up some Firefly. The future may depend on it.
If you have any questions or comments about any of those potential problems, or if you just want to tell me what sort of sci-fi you prefer, please leave a comment below and let me know what you think. Thank you very much for watching. Don't forget to blah, blah, subscribe, blah, share, and don't stop thinking.