When we run this on the pre-trained model, you can see that it runs for a while. It runs faster than the training runs because it's not doing backpropagation; it's just doing the forward passes, just doing inference. And you can see it shows what the loss is, which is down between -0.8 and -0.9, closer to -0.9. That's quite good: in this case, for hinge loss, as we'll describe later, the very best it could possibly do is -1, so it's getting quite close. And we see an accuracy of around 96%, so an error of around 4%: out of every hundred guesses, we get four of them wrong. Not great, not terrible. It's not really a contender for any of the top-performing methods on the MNIST digits leaderboard that we saw earlier. Then again, we haven't tried very hard yet. We haven't tweaked our parameters, we haven't done any hyperparameter optimization, we're using just two convolution layers, and we're only using 16 kernels per layer. These are relatively small. There are a lot of tricks still up our sleeves. This is not an attempt to overtake any of those records. This is just a minimal example: firing up the go-kart and getting it running down the road, so that we can supercharge it later for fun. But we start by really understanding, down to the nut and the bolt, what it's doing and how it works.

In the spirit of understanding our neural network down to the nut and the bolt, it's helpful to do some additional reporting, some additional visualization, to understand what's going on. For this we have a separate report script, and in this one we're going to do some plotting.
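The evaluation loop described above, forward passes only and an accuracy count, could be sketched like this. The names `forward`, `test_images`, and `test_labels` are hypothetical stand-ins, not Cottonwood's actual API:

```python
import numpy as np

def evaluate(forward, test_images, test_labels):
    """Inference-only evaluation: no backpropagation, just forward passes.

    forward: a function mapping one image to an array of class scores
             (a hypothetical stand-in for the trained model).
    Returns the fraction of test examples classified correctly.
    """
    n_correct = 0
    for image, label in zip(test_images, test_labels):
        scores = forward(image)              # forward pass only
        if int(np.argmax(scores)) == int(label):
            n_correct += 1
    return n_correct / len(test_labels)
```

An accuracy of 0.96 from a loop like this corresponds to the "four wrong out of every hundred guesses" figure above.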
So we import pyplot and patches (the ability to draw patches in a plot), NumPy of course for everything, and we get our testing data in. We bring in a Cottonwood tool called visualize structure, which creates the structure diagram we've been referring to, then use our load structure again to be able to pull the model in, and then a tool from the toolbox that we're going to use to create a text summarization of the structure as well.

The very first thing I do is switch the backend to Agg. This is a nice, general matplotlib backend that tends to work almost everywhere for saving images to files, which is what we're going to do. We're after maximum portability with this option.

In this script, we're going to create images showing some correct examples and some incorrect examples. We get to specify the number we want to show. There will be six per plot, and I just arbitrarily chose 36 of each to show. So we calculate how many plots that will end up being; if we chose a number of examples that was not divisible by six, we round up to the nearest six. Then we start describing the plot: we give it a height of nine inches and a width of 16 inches, and we start specifying the positions of its various parts.

I won't go through every bit of the visualization code here to describe what each line does, but I'll describe at a high level what each chunk does. The parameters in this section specify the layout: how big each piece is and where it sits relative to the other pieces. I like having this up at the top of the script because, again, as I'm developing, this is the part that I'm tweaking. I run it, I don't quite like something, I adjust one of those numbers a little bit, I run it again. They're all right here and I don't have to hunt for them. We choose a background color of ivory.
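A minimal sketch of that setup, the Agg backend switch, the tweakable layout parameters at the top, and the round-up-to-a-full-plot calculation. The parameter names and values here are illustrative, not the report script's actual code:

```python
import matplotlib
matplotlib.use("Agg")  # portable backend for saving plots straight to files
import matplotlib.pyplot as plt

# Layout parameters up top, where they're easy to find and tweak between runs.
fig_height = 9          # inches
fig_width = 16          # inches
background_color = "ivory"  # a little warmer than white
examples_per_plot = 6
n_examples = 36         # arbitrary choice of examples of each kind to show

# Ceiling division: a count not divisible by six still gets a final,
# partially filled plot (e.g. 37 examples -> 7 plots).
n_plots = -(-n_examples // examples_per_plot)

fig = plt.figure(figsize=(fig_width, fig_height), facecolor=background_color)
```

The `-(-a // b)` idiom is just integer ceiling division; `math.ceil(a / b)` would work equally well.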
It's a little bit warmer than white. And we specify where to drop these reports: because there could be quite a few plots, we'll give them their own directory called examples, and we make sure to create that if it doesn't already exist. Then we load the structure, render the results, pull the examples out and create the plots, create a text summarization of all these results, and then create our structure diagram.
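Creating that output directory and dropping one image file per plot into it might look something like this. The directory name "examples" comes from the transcript; the filename pattern and `save_example_plot` helper are made up for illustration:

```python
import os
import matplotlib
matplotlib.use("Agg")
import matplotlib.pyplot as plt

# Give the example plots their own directory, since there can be quite a few.
reports_dir = "examples"
os.makedirs(reports_dir, exist_ok=True)  # no error if it already exists

def save_example_plot(fig, i_plot):
    """Save one plot of examples as a numbered image file."""
    filename = os.path.join(reports_dir, f"examples_{i_plot:02d}.png")
    fig.savefig(filename, facecolor=fig.get_facecolor())
    plt.close(fig)  # release the figure's memory between plots
```

`exist_ok=True` makes the script safe to re-run, which matters in a tweak-and-rerun workflow like the one described above.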