Cars that can drive themselves are something many of us have been dreaming about since we first saw Knight Rider. It's a dream Elon Musk told us would come true by the year 2020, but of course that didn't happen, because as it turns out, building a car that can drive itself safely is a lot harder than people thought. That's especially true when all of the self-driving tech has to be crammed into a single car, instead of building some kind of self-driving network of vehicles that communicate with each other, tell each other where they are on the roadways in real time, and have their own designated highways to drive on to reduce the number of random variables they might encounter.

But instead of admitting that self-driving tech and artificial intelligence haven't come far enough yet for full self-driving to be a reality, Elon has chosen to keep calling the technology "full self-driving." In fact, Tesla has doubled down on its full self-driving tech with the release of FSD version 12, because they're no longer calling the software a beta. We've all dealt with beta software before, whether it's a video game or a bleeding-edge version of some program. When something is in beta, the end user understands, or at least should understand, that the software is incomplete and probably has bugs. The fact that FSD was in beta signaled to the more sensible Tesla drivers that this stuff is basically glorified cruise control, not something you should let drive you around while you're asleep or otherwise unable to grab the wheel immediately. Now, I'm sure that somewhere in the terms of service for FSD 12, it's going to tell you that this isn't the kind of full self-driving where you can hand total control to the car.
Okay, I'm sure Tesla's lawyers have them covered there, but we've already seen plenty of cases where Tesla owners have used their cars irresponsibly, sleeping or doing other stuff while full self-driving drives them around, and I'd bet even more people are going to abuse this tech now that the beta tag is gone.

Another really concerning change with FSD 12 is that it now relies entirely on neural networks, or "AI" as so many people like to call it, rather than hand-written computer code. If you watch recordings of the FSD 12 demo that Elon Musk streamed on Twitter a few days ago, he mentions several times that there are no human-written lines of code in FSD 12, which means fixing bugs is going to get a lot harder, and human engineers having a really good understanding of how FSD actually works is going to become impossible.

Now, I won't go into deep detail about how neural networks like this get trained; there are much better videos on YouTube for that. But at a high level, they show the neural network lots of videos of people driving, which is what they call training data, and Tesla has a lot of training data because, surprise, surprise, if you drive a Tesla, it's always recording you, and those recordings get sent to Tesla's servers. They probably sort these videos into categories, like good driving and bad driving at the very least, and then train the neural networks on what to do and what not to do in different scenarios. Now, I'm not an expert on neural networks by any means, but what I've noticed with the ones I've looked at is that they often end up accomplishing the goal you set them, but in some pretty unorthodox ways, and then you have to go in and try to correct them by setting new requirements for what counts as a failure or a success.
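To make that "sort clips into good and bad, then train on the labels" idea concrete, here's a toy Python sketch. Everything in it is invented for illustration; Tesla's real pipeline isn't public, and a real system would train a deep network on video, not a perceptron on two numbers. The point is just the loop: curate labels from recorded clips, fit a model to reproduce them, and note that "what counts as a failure" lives in the labeling rule, which you'd have to revise whenever the model learns something unorthodox.

```python
# Toy sketch of the curation + training loop described above.
# All names and labeling rules are hypothetical, not Tesla's.
import random

def label_clip(clip):
    """Toy curation step: call a clip 'good' driving (1) only if no hard
    braking and no lane departures were logged, otherwise 'bad' (0).
    Changing this rule is how you'd redefine failure vs. success."""
    return 1 if clip["hard_brakes"] == 0 and clip["lane_departures"] == 0 else 0

def train(clips, epochs=20, lr=0.1):
    """Tiny stand-in for neural-network training: a perceptron learns
    weights that reproduce the curated labels from two clip features."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for clip in clips:
            x = [clip["hard_brakes"], clip["lane_departures"]]
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = label_clip(clip) - pred          # 0 when already correct
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Simulated fleet recordings: each "clip" is reduced to two logged counts.
random.seed(0)
clips = [{"hard_brakes": random.randint(0, 3),
          "lane_departures": random.randint(0, 2)} for _ in range(100)]

w, b = train(clips)
predict = lambda c: 1 if w[0] * c["hard_brakes"] + w[1] * c["lane_departures"] + b > 0 else 0
accuracy = sum(predict(c) == label_clip(c) for c in clips) / len(clips)
```

The model ends up agreeing with the labeling rule on this toy data, which is exactly the catch: it's only ever as good as the curation step, so a bad or incomplete labeling rule gets baked straight into the trained behavior.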
You have to adjust it to deal with new variables as they come up, and debugging a neural network is much harder because you're not getting feedback from a compiler or an interpreter telling you what line your error is on, whether it's a syntax error, what kind of bug it is. Patching bugs in a neural network is probably more akin to doing brain surgery, because you're literally fixing a digital brain.

So in the FSD 12 demo, which you can watch on Twitter, or X, or whatever it's called this week, Elon, being the meme lord he is, decided to livestream a full trip from Twitter headquarters to Mark Zuckerberg's house in Palo Alto, joking that he was going to challenge the Zuck to a fight when he got there. But it's a little hypocritical to show Zuckerberg's address on a popular stream, basically doxing him, when Elon banned the @ElonJet account from Twitter shortly after he took over the platform. That was an account that used publicly available flight data to report whenever Elon Musk's private jet took off from one place and landed in another. Elon banned it, saying it wasn't free speech and that it was a privacy issue, but then he went and doxed old Zuckie by googling his address and putting it into his Tesla's GPS.

But anyway, there is one positive thing I can definitely say: the demo seemed pretty thorough to me for an urban environment. The car drove near a university, went through a couple of roundabouts, had lots of people walk out in front of it unexpectedly, jaywalking and such, and it seemed to handle traffic pretty well. It maintained its lane, and things were fine for the most part. There was one instance where Elon had to intervene and hit the brakes, because the Tesla apparently misread a traffic light and accelerated while it was red.
And right after that intervention, he mentioned that they would have to start training the neural network on more videos of that kind of intersection to try and fix the bug. Hearing him say that got me thinking: I noticed there were a lot of Teslas on the road where he was driving, which makes sense in a place like Palo Alto, since it's a really rich area. But if there are a lot of Teslas around there, and all Teslas are sending training data back to Tesla to be curated and used to train the neural network further, it's probably safe to say that a good amount of data from that area had already gone into training the full self-driving program. I'm pretty sure there's plenty of data from Twitter employees commuting to headquarters, since Elon issued a return-to-office mandate for Twitter employees back in March, so the training set likely already covered most of the commute he livestreamed. And yet there was still one intervention.

There were also a couple of times when things looked close, like one moment where the light was turning yellow and Elon, in a rushed voice, said, "Oh, I hope it stops." It almost seemed like he was going to intervene and hit the brakes, but the car finally kicked in and slowed down. So what is full self-driving supposed to do when it encounters something it has little to no training data on? One thing I've noticed personally when driving around rural Virginia is that there are a lot fewer Teslas on the road than there are in Boston, Massachusetts. In fact, once you get maybe 30 miles south of Richmond, you're probably not going to see any Teslas at all, because there are no charging stations around.
But the reason I bring this up is that the traffic and road conditions in small towns are very different from big cities, and I doubt there's a lot of good training data for that stuff. There are a lot of winding roads around here that don't have lines on them, and I'd be very curious to see how a Tesla, or any other autonomous vehicle, handles that. Has a Tesla camera ever laid eyes on a tractor, a combine, or a log truck? Would the AI know how to handle them? Would it realize that some of these vehicles have very wide turning radii? Agricultural equipment, and other things you encounter on country roads like animals, are going to be a growing pain Tesla has to deal with as their vehicles get more range, more charging stations get built, and these fancy autonomous vehicles start making it down to my neck of the woods, especially if Tesla can successfully branch out into trucks, because then you're going to have more people around here actually wanting to buy them. And that growing pain is going to be maximized by the way Elon is marketing the product, because he's acting like full self-driving is already full self-driving when it nearly T-boned a vehicle after accelerating at a red light, in exactly the kind of environment it should be really good at driving in, since, like I said, so many people in Palo Alto own Teslas that are feeding the system training data.

What do you guys think about the latest FSD demo and the future of self-driving cars? Personally, I think self-driving tech that's actually good is inevitable. Even in this demo it seemed pretty good, and it might eventually be safe enough to use as an AI taxi, like Elon Musk said we would be doing by the year 2020. Maybe by 2050 or 2060 it'll be possible. If you enjoyed this video, please give it a like and share it with others to hack the algorithm, and follow me on Odysee. Have a great day!