 Hi, welcome to Teens On Topic. I'm your host, Emma Arnston, and today we are joined by... Hello, my name is Ramiro. I'm Andy Knox. My name is Anantamashiana. And today we're talking a little bit about self-driving cars, so let's see what people in Davis have to say about it. Let's do it. So our question this week is, what kind of impact do you think self-driving cars have had or will have on either our daily lives or society as a whole? Self-driving cars. I don't live around here. Actually, I've lived in Japan most of the time, so they don't have them over there. So I haven't really experienced them, but it seems like that's where we're going to go and that's the wave of the future. And for sure, they're going to have a huge impact. I can see it helping reduce congestion and kind of making certain aspects of life easier. The only big concern I have is potential job losses. Self-driving cars and trucks would mean taxi drivers and truck drivers would be put out of a job. But for my personal day-to-day life, I think it would actually be more convenient. I think that it's an exciting new technology, but maybe in small towns like Davis, where it's such a walking-oriented town, the technology needs to develop a little bit more before it's implemented. I kind of feel the same way as he said, but I think that that's just where we're going with our technology, and the good can outweigh the bad. But it is definitely a safety issue right now. I've been in places before where people have been killed by self-driving cars, and it's scary, but it's also exciting at the same time. As for society as a whole, I think overall humans are generally really bad drivers. I think we get easily distracted, and a lot of people like to fixate on the number of accidents that self-driving cars have had, but those numbers are still statistically significantly lower than the ones caused by human error. So I think it would actually be safer in the long run. Anything to add to that? 
Okay, well, we're back. I think people in general were pretty pro self-driving cars. I mean, they brought up similar cons, which were possible accidents and also job loss. So what do you guys think about the cons of self-driving cars? I feel like most people, at least here in Davis, tend to put more weight on the pros than they do the cons. If I were to think about it, I think traffic. We have a huge traffic problem in America right now. If we have self-driving cars, people are going to feel more accommodated and they're going to be able to travel further distances to their jobs, therefore increasing traffic. So I feel like cars you drive yourself should still be a thing. I feel like we shouldn't have self-driving cars. Sorry, how would self-driving cars increase traffic? Say you take 45 minutes to commute from Davis to Roseville, correct? And the commute sometimes takes an hour because of traffic. With self-driving cars, you wouldn't even have to worry about that. Actually, it's more of a headache. It's more of a personal kind of thing. Do you see what I mean? Personally, I don't like having to drive in traffic. If I had a self-driving car, the trip might take 15 more minutes, but it would drive itself to where I want to go. I'd totally do that. I'd go for that. So you think it could result in increased car usage overall? Exactly. I mean, I see what you're saying. I feel like in terms of traffic jams, though, you would see a lot less of that. I mean, as it currently works, people can't really trust each other, and there's a certain reaction time between the person ahead of me going, okay, now I'm going to go. I think that wouldn't be as much of a problem with self-driving cars if they detect that a green light's on and they can go ahead. So I mean, I think in terms of time, that would be overall beneficial, but I agree that it would probably result in more people using cars, because there's less of a disincentive to do so. 
So I mean, I think that could actually have some potential effects on the environment. So hopefully that kind of technological progress goes hand in hand with environmental progress. Yeah, I'm not a huge fan of the idea of self-driving cars, mostly because I agree with the one person who talked about the accident issue. I think it's a safety issue. If I'm taking an Uber, I want a human to drive me. I don't want a robot driving me, because even if there's a small chance that the robot is going to malfunction, I don't feel comfortable with that at all. Or if you're going to have self-driving cars, it should be easy to override the self-driving function and start driving yourself. Well, I mean, I really think you should probably be less worried about getting hit by a self-driving car than by a person, because the manufacturers of self-driving cars are going to do everything in their power to make it so those cars aren't going to kill people. If someone is killed by a regular car, there's going to be no question that you're going to blame the driver, and even if you can prove there's something wrong with the car, you'd probably still sue the driver for being so irresponsible as to have a malfunctioning car that he hasn't had checked, right? But if a self-driving car kills someone, I think the company is absolutely going to take the fall for it. So I think companies are going to try to make them as safe as possible just to avoid liability, which I think is helpful. I agree, but there's also that, you know, little room, that little chance that an accident may happen. Say you're driving in harsh weather conditions such as snow or heavy rain. Are these cars also going to be programmed for tropical storms or thunderstorms or winter storms? 
You see what I'm trying to get at? If it's snowing like crazy and, well, a human has more experience driving through snow than a robot, then what would be more beneficial, you know, a self-driving car or a human? Yeah, I mean, of course the manufacturers, like you said, are going to do everything in their power to make the cars as safe as they can. But I don't think that's any different from what they do today. I mean, people make normal cars. They obviously want them to be as safe as possible today as it is. But my point is, it's a lot more reliable that a human is not going to malfunction than it is that a robot is not going to malfunction. I don't think so, because I feel like when a human, like, malfunctions, or just kind of goes crazy and falls asleep at the wheel and drifts into oncoming traffic, or just happens to ram into someone who's crossing the street when they're not paying close attention, right? The fault of that, in the eyes of the justice system, which would be overseeing some kind of lawsuit in this case, would be put onto the human, whereas I think the fault of a crash, when it's caused by an automated vehicle, is going to be put on the company. And so if you notice that there's a company whose cars are, oh wow, like five times as likely to kill someone as someone regularly driving a car, probably, A, not a lot of people are going to want their self-driving cars, because they'd be afraid of killing people, and B, the company would immediately do everything in its power to make them safer, because it doesn't want to get sued again. Yeah, but still, like what Ramiro was saying about the inclement weather, I mean, of course there are going to be state-of-the-art high-tech sensors, but what happens when your sensor gets covered in, I don't know, bird poop or something? 
Like, I think that sensors can mess up, and there's always a chance that they do mess up, and I don't feel comfortable with it. That's just how I am. Yeah, well, I mean, I think that also kind of shows that there's probably going to be a market for non-self-driving cars for a long time, because there are going to be people, this sounds kind of rude, but people like you who are just untrusting of self-driving cars, which is understandable. That's probably going to be something that lasts for a long time. I mean, but also, I'm sure there's a lot of technology today that, when it first came out, people were untrusting of. They were like, this isn't safe, this isn't going to work, this business isn't going to succeed, but look where we are today. And I would definitely be willing to change my mind if they would run trials, which I'm sure they will; before they put self-driving cars on the road, they're going to have to run hundreds of thousands of trials. But I would want to see the data, see the accident rate, the malfunction rate. And I mean, you're right, a human can fall asleep while driving or get distracted and go on their phone; those are things that a robot doesn't do. So I'd like to see the statistics. I think there are already some examples of that nowadays. Tesla, I believe the Model X, if I'm not wrong, already has these driving features. They're not 100% automated, but they do help you keep yourself in the lane; say you're driving and, like you said, you become sleepy or you just kind of swerve off, it'll correct you and get you back into the lane. 
I totally agree with that, but self-driving cars that are, like, 100% automated, where you get in and it takes you from point A to point B, I don't know, I just don't feel comfortable with it. And it also comes down to tradition, you know; I genuinely like driving a car, and I feel like hundreds of thousands of people, if not millions of people, also like that feeling of having control, and losing that control is, you know, kind of scary for some. Going back to what you were saying earlier about the Tesla ones that are partially self-driving, I'm kind of most worried about the middle ground in between that and ones that are totally self-driving, because I feel like that is where people, even if the instruction manual says to pay attention the whole time, because you still have some control over the car and there's potential that it could malfunction, I think probably a lot of people would kind of ignore that warning. And there may be an increased rate of people going on their phones in those partially self-driving cars, so that when the human is needed for whatever reason, they won't be there. Yeah, I mean, I don't remember who it was, but I think it was Uber that was developing their self-driving cars, and there's already been an accident; I think it was an older woman that was killed. 
So I mean, I think the issue, at least in my opinion, with self-driving cars is that there's a big difference between when you're initially developing the technology and when I think it would be safe for people to use it, and there's obviously a time in between where you need to test stuff out. And if you want to test in actual conditions, where it's on the road, there's a danger that you have to assume, like, for example, the woman dying. And I mean, I'm not sure of the total story, but I think the issue was at least partly both people; I think the person behind the wheel wasn't paying attention, so even though they should have been paying attention, like what you were saying, they weren't. Because 99% of the time you're probably not going to need to. Yeah, yeah, but there's always that 1%, so I think, at least for me, until it's, like, 100% safe and tested that thoroughly, I'd just rather drive myself. Yeah, like Andy said, the thing is, it's going to tell you to pay attention, but people aren't going to. I mean, I read a news article a couple years ago about a guy who was in his Tesla and put it on autopilot, and I'm pretty sure Teslas tell you that even if it's on autopilot you're supposed to continue paying attention, but I think he started either reading a book or watching a movie on the screen, and he crashed. So. Yeah, it's inevitable. But another thing, something I was just thinking about right now: what if you have to go off-road or drive on a road that the GPS hasn't mapped yet? You see what I mean? How would you control that car and say go left, go right, go at this many miles per hour, et cetera, et cetera? I feel like it's very controversial, very hard to manage. 
Well, I feel like a self-driving car would probably be fairly different from, like, a GPS that freaks out when it thinks you're going through a river when you're actually going over a bridge. I mean, it's probably going to be able to adapt to its surroundings. So sure, you'd probably have to pay some more attention when you're driving on some country road and turning onto some rather new bridge that's not on all the maps to get to some mountain to go climb it. But I don't think that you would really have to worry about it thinking that there's not something there when there is; I mean, thinking that there's not a bridge there when there is. Well, until then, we're going to have to just wait and hope for the best, right? I mean, I guess that's true. I think it's kind of inevitable. It's a technology that's going to come out. Like, I don't think anyone could sit here and tell you self-driving cars are never going to be a thing. I mean, it's already in development; there are already things happening. So I think the issue now is just figuring out how we're going to implement it and what's the safest way to do self-driving cars. Absolutely. Well, thank you all for being on the show. This was a really fun topic to talk about, and I hope you guys come back at some point. Tune in next time; it'll be next year, 2019, when we're going to talk about foreign intervention. Happy holidays. Happy holidays. It's been a pleasure, thank you. Yeah, thanks for having us. Thank you.