Hi, welcome to Visual Studio Toolbox. I'm your host, Robert Green, and joining me today is Andrew Hall. Hey, Andrew. Hi, Robert. Welcome back to the show. Thank you. You've been on the show a number of times. I have. Doing episodes mostly on debugging. Yeah. But now you have a new job, you're with a new team. I am, yeah. So I moved to the web development tools team. We do all the tooling for ASP.NET, the web editors, HTML, CSS, JSON, and then we also build tools for Azure App Service. Cool. So we're going to talk today about the tools inside Visual Studio for doing Azure Functions. Yeah. Azure Functions is very interesting. I know it's been covered on other shows like Cloud Cover, but you're going to show us tools for building them. I am. But let's start with a review of what Azure Functions are and why we're interested in them. Yeah. So Azure Functions are often called serverless computing. Which is ironic, since they run on a server somewhere. It is, and I'll explain what that means. The reason it's called serverless computing is that you don't have a dedicated process that's up and running. The idea is that it's event driven: the function only runs, and only consumes resources on the server, when it actually has an event to respond to. So you can do an HTTP trigger, which is generally how you'd interact with a traditional ASP.NET app. But there are tons of other events that we'll talk about. You can respond to things being put into queues and blob storage, and you can wire up to listen for when things change in GitHub, things like that. The idea is they're the glue that stitches other things together. So it's cool because in cloud computing there are things happening in the cloud. Like you mentioned, something goes into blob storage, which is an event that happens in the cloud, and Azure Functions give you a way of hooking into those events. Right? Correct. Cool. Yeah.
So with that said, the idea is that you only pay for the resources you're using while your code is actually running. Right. Azure Functions basically charge you by the hundred milliseconds that your application is running. Whereas normally, if you had something like a regular ASP.NET process and you wanted it to be able to scale and always be available, it's basically up there and always running. You can have it scale up or scale down, but you always have some amount of code that's taking up server resources and available at all times. So say you want to write some code that is triggered when somebody adds a picture or a document to file storage, as an example. Maybe somebody in the field is taking a photograph of something, an adjuster goes out and looks at your vehicle, uploads photos, and at that point some process kicks off. But rather than, as you said, sitting there constantly spinning the meter, or writing some file-watcher type thing that is constantly querying, there's an event that gets triggered, and then you get to write code against that. Correct. Yes. Let's take your adjuster example, because that's a great one. You have an insurance adjuster that goes out, they fill out a report, they take some pictures, they upload it. What a function can do is watch for something changing in a blob storage location. You can say, okay, whenever something goes into this, let me know. The function code picks that up, puts the images in the right place, maybe processes the form they uploaded, sticks that information in a database, and then the function goes away. It can send out notifications, do any number of things. So when someone runs the nightly report, all that information is just there and ready.
So instead of having to do it as a batch job, where you say every hour we're going to pick up and look for stuff, you can process things nicely on demand. It's always there, you don't have weird refresh problems, you don't make people on the other end wait. Because if you can't do it based on the event, you have to do it on some sort of cadence. You might say twice a day, but that means if somebody runs a report at 11 a.m. and you run the job at noon and 5 p.m., it's out of date. Or if you say, look for stuff in the queue and process it when they run that report, it's now going to take them longer to run that report. Or if you're waiting for something, you don't have to go look for it every five minutes to see if it's there; you can be told when it's there. Exactly. Azure Functions have a lot of great input and output bindings, so you could say, hey, when I finish processing this, I'm going to send an email to the relevant party. Maybe one of the fields in the report the adjuster uploads is the email address of the person on the back end that needs to know when it's available. So the Azure Function processes the information, puts it in the database, and the last task it does is send an email that says, hey, this information is now available. Cool. Now, most of the demos I've seen of Azure Functions, you actually write the code in the portal. Correct. Which is cool, of course. Nothing wrong with that. Nothing wrong with writing code outside of Visual Studio, but for those of us that want to write our code inside Visual Studio, that's what you're going to show us right now. That's what I'm going to show, yeah.
So the feedback we've heard is that the portal's great for getting up and started really quickly, but most of us that have done development for a while realize that as soon as you want to start building up meaningful amounts of code that you're stitching together, working in an IDE like Visual Studio is going to be much more productive. Plus the context switch, right? I'm writing the rest of the app, so why do I have to switch over to the portal to write one piece of the app's functionality and then come all the way back into Visual Studio? And here's the million dollar question, Robert: how do you debug it in the portal? That is a good question. I predict we're going to find out how I would debug it in Visual Studio. Yeah, that's a good prediction. Something about my past life, maybe. All right. I want to point out, before I get going, that the tooling we have today is a preview, and it's available only for Visual Studio 2015 as a separate download. You can go to our blog, the web developer blog, blogs.msdn.com slash webdev, and near the top you'll find my post on Visual Studio Tools for Azure Functions. This has the instructions. Everything you need is basically Visual Studio 2015 Update 3, the Azure SDK 2.9.6, which is the most recent version of the Azure SDK, or later, and then you need to download and install our preview tooling on top of that, which is what this link is right here. So if you're already doing Azure development, you probably already have those two out of the way, and then you just need to download and install our preview bits on top of that. And Visual Studio won't automatically install any of this for you? No, we won't automatically install the Azure SDK unless you're working with Azure. Right. We don't recommend you install the Azure SDK just for the sake of it.
Yeah, but if you've ever installed it, then you'll get a notification that tells you there's a new version of it. Correct, that's correct. Yeah, so that's what we're calling out. You're not going to go to vanilla Visual Studio 2015 Update 3 and find what I'm about to show you. You're not even going to install the most recent Azure SDK and find what I'm about to show you. You actually have to also go install the special preview tools on top of that. Okay, and in 2017, what's the story? Yeah, so the preview right now exists for Visual Studio 2015 only. We're working on Visual Studio 2017 right now; we're in the RC phase of that. Functions tooling that's not a preview isn't going to be available in 2017 when we ship the RTM product. It will come sometime after that, and we won't take the preview label off of it until it works in 2017. The reality is just: do few things and do them well. As a team right now, we're really focused on finishing 2017 at high quality, especially finishing the tools for .NET Core. Right. Once we finish that, once we ship that, we'll be able to dedicate our focus to getting the Azure Functions tooling up to snuff and ready for a v1 state. All right, so I'm in the New Project dialog. Under the Cloud node here, under C#, I can pick Azure Functions, and notice it has the preview label on it. Let's go ahead and call this, how about, Toolbox Functions. All right, so we're going to create the new project. Takes a couple seconds. You'll notice that as this comes up, I have a sort of vanilla project here with a couple of files in it. I have appsettings.json. This is where I'm going to stick the configuration data that my application is going to use, like the connection string. For example, every function type except for an HTTP trigger is required to have a storage account associated with it.
Then there's this AzureWebJobsDashboard key. That's not actually particularly interesting at the moment; it just tells it where to pipe log output so it shows up in the portal. If I'm running locally, I don't really need that. But when we eventually want to connect up to blob storage or a queue or something like that, we're going to put the connection string here in this AzureWebJobsStorage setting. And then whenever we create functions, we'll actually just give them the name of the key here. So let me go ahead and copy this, because we'll create a function in a second. Then we have host.json, which is a blank JSON file right now, but we can use it to configure properties about how the functions behave, and we'll show an example of that in a minute. So let's get started. I'm going to right click and say Add, New Azure Function. It's a special item type today. And you can see that anything you can do in the portal from a language perspective, we support in Visual Studio. I will mention though... Including Bash. Including Bash. Which is command line stuff, right? Yeah, generally. Cool. We're going to focus on C# today, so we'll filter this list down to that. I'll mention that for most of what I'm going to show you in Visual Studio, you can create and edit any of these languages, but what you have installed will dictate whether you get IntelliSense or not. Obviously we're going to get IntelliSense for C# pretty much no matter what, since in Visual Studio 2015 C# is always installed. And debugging only works for C# functions today. Okay. So let's create a queue trigger, because it demonstrates everything we want to talk about in a nice lightweight fashion. I have an Azure storage account open here already, and I have a queue that's called my-queue-test. So let's say the name of the queue is going to be my-queue-test.
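For reference, the appsettings.json file described here looks roughly like the sketch below. This is the shape the preview tooling generated at the time; the account name and key are placeholders, not real values, and the exact property names should be checked against your generated project.

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=<your-account>;AccountKey=<your-key>",
    "AzureWebJobsDashboard": ""
  }
}
```

The important part is the key name AzureWebJobsStorage: the function bindings refer to this key, not to the connection string itself.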
Remember I mentioned that key-value pair a second ago? AzureWebJobsStorage: that's the name of the key that's going to contain the connection string. Okay, so that's not the actual connection string; that's the key that's going to contain the connection string. Got it, that's good to know. And that's so I can share, well, we'll talk about the structure a little more after I generate this, but that's so I can share connection strings across multiple functions, because of the way they work today. So, QueueTriggerCSharp, that's good enough for now. Let's say Create. You can see I get a folder here that's dedicated to the function I just created. So you can have multiple functions in the same solution. Correct, you can have as many functions as you want. If you wanted, you could have a JavaScript function, a C# function, an F# function; they can all exist happily in the same solution today. Let's close the Properties window here. So what I have here is function.json. There you go; I tried to drag it, apparently, instead of double clicking it. This contains the same information we would have entered in the portal. It tells the runtime how to bind the function entry points. So the name, that's pretty self-explanatory; that's the parameter name. The type: what are we listening to? It's a queue trigger. Then we have my-queue-test, the name of the queue. And then, as I mentioned, this is the name of the key that's ultimately going to contain the connection string. The reason for that is, if I put the connection string in here every time, then when I moved from, for example, test to production, I might have to go update the connection string in 20 or 30 different functions. But by pointing them all at the same key, I only have to update it once. Today that key is blank, so this isn't going to work.
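A queue-trigger function.json like the one generated here looks roughly like this. The queue name and parameter name follow the demo; treat the exact layout as approximate for the preview tooling.

```json
{
  "disabled": false,
  "bindings": [
    {
      "name": "myQueueItem",
      "type": "queueTrigger",
      "direction": "in",
      "queueName": "my-queue-test",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
```

Note that "connection" holds the name of an app-settings key, not a connection string, which is the indirection being described: many functions can point at the one AzureWebJobsStorage key, so moving from test to production means updating the string in one place.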
So let's go here to Azure Storage Explorer, go up to this particular storage account, and grab our connection string. Back in Visual Studio, we'll paste that in. Perfect, so that connection string should work now. And you can see my function, at the moment, is this run.csx file. It doesn't do anything particularly interesting; it just lets you know that it ran by printing something into the log. So let's hit F5 and just watch it run, and see if it works as we'd expect. This command line that comes up is the Functions runtime. These functions are dynamically compiled currently, so if I had any compilation errors, which hopefully I don't because it's just the vanilla template, I'd expect to see them here. It's going to tell me that the job host has started, and so we're waiting for something to happen. Oh, I forgot one thing. I mentioned host.json. By default, for a lot of these things in Azure, there's a time period for how often they actually check for updates before they get triggered. At the moment, the default behavior for the queue is that it's going to check once a minute. So what I actually want to do is come in here and update it to say, let's check once a second, so we don't have to wait for a minute. It's kind of dead air time, right? Right. The nice thing is we have IntelliSense for this. So I think it's queues, let's do that, and then down here I want to say maxPollingInterval, and let's do 1000; this is in milliseconds. That should change our polling interval to once a second instead of once a minute. All right, it's going to come up and run. Perfect. Now, if we did everything correctly, I should be able to go push a message into this queue. Let me go ahead and add something.
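As an aside, the host.json edit being described would look roughly like this; maxPollingInterval is the longest interval, in milliseconds, that the runtime will wait between checks of the queue.

```json
{
  "queues": {
    "maxPollingInterval": 1000
  }
}
```

With the default of roughly a minute, a demo like this would sit idle after a message arrives; dropping it to 1000 ms makes the trigger feel immediate at the cost of more frequent polling against the storage account.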
Let's say "hi toolbox" and click OK. Now when I go back here to watch, I can see it's telling me that the C# queue trigger function processed "hi toolbox". So we've actually just connected up, and I want to point out that I'm using Azure Storage Explorer; this is nothing running locally, I'm actually working against a real Azure storage account in the cloud, but I'm running everything else locally. Now, going back to our question before, and this is one of the things you can't do in the portal today: what if I want to debug this? Notice my breakpoint bound; that's a good sign. Let's go back to Storage Explorer, add another message, let's say "breakpoint" is going to be my message, and back here in Visual Studio I can see that the breakpoint hit. And I'm doing regular debugging: I can hit F10, I can step, I can use the debugger to inspect my variable values. So as I'm developing these things, I can actually run, test, and debug them locally using the Visual Studio tools in a really tight loop. Sorry, what? No, you finish. Oh, I see, whereas if I'm writing code in the portal, not only am I using the portal editor, which is actually a pretty good editor, they have some IntelliSense and stuff like that, but if I want to debug, I'm relying predominantly on logging. Right. Basically what I have here is log.Info; I call it printf debugging, or Console.Write debugging. That's kind of what you're forced to do up in the cloud, or you can remote debug it from Visual Studio. Cool. And then presumably it'd be easier to put these functions into source control doing it inside Visual Studio. Absolutely, because it's just a regular project.
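The run.csx in the queue-trigger template is essentially just the following. This is a sketch of the generated C# script; the Functions host compiles and invokes it, so it isn't a standalone program, and the exact template text may differ slightly between preview versions.

```csharp
using System;

// The runtime binds myQueueItem to the message it pulled from the queue
// named in function.json, and passes in a TraceWriter for logging.
public static void Run(string myQueueItem, TraceWriter log)
{
    // This is the "printf debugging" mentioned above: log output shows up
    // in the local runtime console, or in the portal's streaming logs.
    log.Info($"C# Queue trigger function processed: {myQueueItem}");
}
```

Setting a breakpoint on the log.Info line is what makes the local F5 debugging loop work: push a message into the queue, the host invokes Run, and the debugger stops there with myQueueItem inspectable.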
Functions are fully integrated with the Azure Kudu system, so you can hook them up for continuous integration, and we have two ways of deploying them to the cloud. I'll show web deploy here in a second: I can right click on this, say Publish, and publish my functions that way. Or, if I hook up continuous integration through a Git repository, using VSTS or GitHub or multiple other source control systems, I can make changes, just check them in, and they'll automatically be continuously deployed into production for me. No manual step required. But let's publish this up to the cloud right now. So let's say Publish. I'm going to publish to Microsoft Azure App Service. Let's create a new resource group just for the purposes of this, to make it nice and easy. So the web app name, toolbox functions, probably good enough. Azure resource group, let's create another one just for us. Central US. Let's go with a new App Service plan; I'll talk about these in a second. This is the idea of what size machine you want to run these on. What I would probably recommend, and it's one of the cool things about Functions, is called a consumption plan. A consumption plan is the idea that you only pay for what you use while your function runs. You don't have a dedicated VM up there running like you do with these other plans. If none of your functions are being hit, they're never going to run, and you have no dedicated resources associated with them. But then they can scale up to as much scale as they need, and you can go in and set some limits, like, yes, scale them as much as needed, but I don't want to pay more than $50 a month or whatever. Now, does that impact startup time or anything? What's the downside of doing consumption versus those other plans?
That's a good question. I think the downside of the consumption plan is that you have to go in and manually configure spending limits, or you can end up with a really, really big bill. With the dedicated plans, you pay for the size of the dedicated machine you're running your functions on. With a consumption plan, you're not paying when you're not using them, but you could potentially run up a very large bill, because it will basically infinitely scale unless you set a price limit on it. And once you hit the price limit, you stop processing for that month. Okay. Does that make sense? So it's a risk-reward tradeoff, if that's quite the right way to talk about it: do you want to be able to infinitely scale, knowing that if you set a price cap, you're potentially going to stop processing stuff if you ever hit it? Got it. Whereas the dedicated plans are sort of like my laptop here: it's just going to continually run and churn through things as fast as it can. It's not going to scale well, but I also have a fixed cost, and it'll just keep chugging; things may take a lot longer, but eventually it'll process through everything in the queue. All right. Let's just go ahead with the consumption plan. I'm here in the West US, so let's put it in the West US. Let's say OK, and pick a storage account. Why is it not happy? Ah, the name can only contain lowercase letters. Let's go ahead and create that. These are preview bits. These are preview bits, that's correct. Although I vaguely remember that bug with the lowercase thing; I think it was actually an issue on the Azure side. Yeah, they wouldn't accept uppercase letters for some reason. I guess calling ToLower was hard, I don't remember. You call ToLower. No, you call ToLower.
Well, it always gets dangerous to change what people type in. Yeah, I guess. Some things are case sensitive and some things are not, and changing someone's casing without them knowing it could always result in something bad happening. All right, so we now have it. Let's click Publish. This should take about 10 or 15 seconds. Any questions while we're going? So you're now publishing this to Azure; before, were you running it locally or running it in Azure? Before, I was running the function locally. Locally, okay. Yeah, and what I just did is publish it to Azure. So now, if I go to the portal, I would expect to see, let's go to resource groups. I created my new resource group called toolbox, and I can see that I have a function app in here. Once that loads, we should see our QueueTriggerCSharp function, and I can see the code. That's my code. Which you could edit here. Which I could edit here. Would it get automatically updated in Visual Studio? No, it's not going to get automatically updated in Visual Studio. If I make any edits here, then unless I manually bring them back to Visual Studio, they're going to get overridden the next time I publish. Oh, okay. So basically, once you're in Visual Studio, Visual Studio becomes the source of truth that you choose to publish from. And actually, if you hook it up to the continuous integration I mentioned using Git source control, it won't let you edit in the portal at all. It's a read-only view. Okay. It will say, this is coming from source control; we don't have the ability to push changes back to source control. Let's go ahead and look at the logs. So we're connected to the streaming log service. For now, I would expect that if I go push something into my queue, it's going to work in the portal. So let's refresh. Ta-da-da.
I would expect the breakpoint message to be gone, because we processed it before when we hit the breakpoint. So let's call this one "should show in portal". Okay. And if I go back up here to my portal, it should get processed at some point. So we have to click refresh. Or am I still running locally? That could always happen too. Yeah, how does this know? Everything should be keying off the same queue. So let's try this again. Okay, it's possible I also have some previous function I created that's actually hooked up to this queue. That happens sometimes, because I didn't create a new dedicated one. Basically, the first one to check the queue and process the message wins. That's going to be my guess as to what's happening. I guess the other question is, let's go to our app settings and see; remember, these are preview bits, so it's possible our connection string didn't get passed in correctly. Let's let this finish. And AzureWebJobsStorage, that should have gotten picked up correctly. That looks probably right, but let's go back to Visual Studio and double check. Where's my appsettings.json? Oh, nope, that doesn't look like the same connection string. So that must actually be my issue. I mentioned these are preview bits; in theory, like we do with ASP.NET, we try to pick up stuff from your app settings and set the connection string appropriately, but I guess in this case it got connected to a different account. So let's just update this. Perfect. Boom, let's save that change. I can see that I still have some messages in the portal, and I'm guessing with that change, it should pick up and start processing them. So let's go back to my function, and once the streaming logs get up and connected, it should take just a second to pick up those changes. Oh, you know what I just did? Sorry, I broke myself here.
Because I connected the storage queue up to a different account than the one the source code is stored in. So it's now saying, hey, you changed the storage location, and I can't access the source code, because the source code for the function actually gets stored in the storage account. Sorry, that was my bad. Let's actually just reuse the same storage account we're using over here. So this is function ACBA. Now in Visual Studio, let's create a new publish profile. Let's say Publish, create a new profile. We can use our toolbox resource group. Let's say New, and this one should be dynamic. And then the storage account, let's go with function ACBA, so this is the same one we're pulling off the queue from. Okay, so you chose the wrong storage account when you published. Or I could have added a different key. The problem is, this value here, sorry, this window is covering it, but that Azure function storage setting is where it stores the source code. I could have had a different key and pointed it at a different storage account. But by going in and changing it in the portal, I actually screwed up where it looks for the source code; again, it's saying, I can't find the source code. So this republish should fix it. Well, that's good. One of the things I love doing on this show is showing what it's like in real life to use these things. If you come in here and it's a perfectly polished demo, my first thought is, okay, right, that doesn't happen in the real world. What would it look like in the real world? So now we're seeing that. Yeah. What I could have done here, alternately, is added a different key, something like a queue connection, and pointed the queue binding at that. And then... yeah, deployment failed: it couldn't use the specified process because the web process is still starting on the server. Yeah, let's just try it again.
Let's see what happens. Profile, it's the most recent one. All right, let's see what happens again. If not, we'll go another route here. All right, it worked that time. Sometimes these things fail, right? So now if I come back here, let's refresh, maybe just close this out. Let's go into our toolbox resource group, the one we used. And this looks like the one we just published. I created a new app, because I forgot to reuse the app name, so I created a new instance of it. Perfect, the C# queue trigger is there. Let's look at our logs. Here we go, and now it worked correctly. We can see that it's processed the messages. Cool. Did I explain myself well enough when I screwed up? Yep. All right, so now we actually see it running in the portal, whereas before we ran and debugged it locally. A nice tip and trick for when you're working with Azure: don't accept the names it gives you, with all those numbers. Use your own names so you can keep track of what's what. Yeah, and the reason it gives you those sort of crazy numbers is because this is actually a public endpoint. Right, it has to be unique. Because I didn't add any authentication to it. And yeah, it has to be unique within the Azure system, and the easiest way to do that is just to add a bunch of random numbers at the end. Which is great as a default, but then it confuses you when you're looking at your own stuff. Correct; fine if you have one, but if you have more than one, which one did I actually create? Right. So the last thing I want to show: I showed publishing this to the portal. Let's say, because this happens sometimes, right, that I have the wrong connection string or something like that. I get it up here and it's getting triggered, but it isn't behaving quite like I expected. I can go back to Visual Studio, go to Cloud Explorer, and I'm going to pin this here. This should come up.
And I can go down here and look at my App Services. Let me collapse this. I can see the function apps I just created, and this is the one we're currently on. I can right click on this and debug it. So let's say Attach Debugger. I can come back here to my run.csx; let me unpin this for a second. And this is so you can debug the version running in Azure, as opposed to the version running locally. Correct, because I've had it happen when I've been writing some of these. Anybody that's ever done server development runs into this: it runs great on your machine, and for some reason when you publish it up to the test server, or anything that's not on your machine, it's not working quite right. That's what remote debugging lets you get at. I've heard of that. I've heard of that happening. Never happened to you. I think it's an urban legend, personally. I know people who swear by it. Right. And I mentioned before, the ability to run and debug locally is really valuable, because while we offer the ability to remote debug, you can see that we're dealing with the cloud, and there can be a lot of latency involved. Now, the risk with the dynamic plan, and I realize as I say this that I may be getting myself in trouble, is that you can get switched back and forth between machines, whereas with a dedicated VM... going back to that question... wait, why did my code go away? Something weird happened. I don't know why that would have made it disappear. That's weird. That is very odd. These are preview bits. But the remote debugging should not have... yeah... should not have done that. Something weird went on there. As you mentioned, this is a real-life show. I have no explanation for that, because my code just went away. Maybe a bug we just found live on the air. In theory, what should have happened, and it's always worked before for me, is I should have been able to just right click...
...say Attach Debugger, and then I should have had that breakpoint bind and hit. Let's go here. I can browse my files through here. So this is toolbox functions; I can go here and say Files, and it's unable to retrieve them. Yeah, which makes sense, because it's basically telling us there are no files. I have no idea what happened, Robert. Interesting. I'm as confused as anybody. Well, I guess we won't be showing that. But it's in there. It's in there. It should work. Well, apparently we now know that, until we figure out whatever just happened... Well, it did work. Full disclosure: we redid this episode, because the first time we did it, neither of us was really happy with it. And it worked then, right? That's correct, yeah. Okay. All right, so let's go up here and just say Publish. I don't need to wait for the preview to populate. Let's try this one more time, just because I'm stubborn and refuse to accept failure. All right, so it's up there. If I go back to the portal, go back here, and click this one, we see source code there. Good. All right, let's refresh. It's still there. That's good. Awesome. Let's go back to Visual Studio, back to Cloud Explorer. Oh, well, that's not going to... there's actually a bug in Cloud Explorer right now; we fixed it for the next version, but it's not yet in this particular build. So let's say Attach Debugger and see what happens. As I mentioned, debugging is only going to be reliable when you have a dedicated machine. The interesting thing about the dynamic plan is that because it scales, it can hop around between instances in Azure. I don't know if we exposed a bug related to that, but it's always possible that you'll never hit your breakpoint. It's possible, not likely, but possible, that if the function's under load, you could attach to instance A, and if I go push something into the queue, it could get processed by instance B. All right, so let's see.
Let's refresh this. And that's only something under the plan you chose, versus the other plans? Yeah, that's only with the dynamic plan, because you don't have a dedicated machine; you're potentially getting scaled out onto multiple machines. So when I right click and say Attach Debugger, it picks a particular instance. As long as the function's not under load, in practice it should only ever run on one particular instance. And I can see that my breakpoint bound, as we would expect. So let's add something to the queue, "remote debugging", and push that into the queue. And the breakpoint hit. You can see that we're attached remotely. Cool. I have no idea what happened the first time, but I want to point out that we did succeed. Very nice. And the message says "remote debugging", exclamation point. And notice that we're running up in Azure. So let's let that finish. Now when I refresh, I can see it's gone out of the queue. It's gone out of the queue. All right, so that is cool. Azure Functions: very cool functionality. If you're doing any kind of cloud development, you should definitely look into it. And these tools, despite being in a preview state, work pretty well. We ran into a couple issues, but I think they're in decent shape for people to play around with. Yeah, I think so. Some of the problems were my own: picking the wrong storage account, changing my storage account, things like that. Cool. All right, thanks so much. Thank you very much. All right, hope you enjoyed that. We'll see you next time on Visual Studio Toolbox.