Good morning everyone. How are you doing? Hey Pierre, how are you? I'm good. Hey Jay, I'm so happy to have you with us. For our audience, Jay Gordon, or Jay Destro, is the host of Azure FunBytes, so I figured let's get him in here. What is Azure FunBytes and when does it air? Sure. Like you said, my name is Jay Gordon. I'm a cloud advocate, just like yourself. Azure FunBytes is this great opportunity we take every single week to have conversations about the products, the services, and the people, and the people are a big deal to me, that make up an amazing Azure experience. Over the course of 54 episodes now, I've been able to talk to people all across Microsoft and other organizations, and I've had some great vendors come on and do demos for me. This past week I had a great conversation with Lavanya from the Azure Security Center team, where we talked about DevSecOps: integrating GitHub Actions with security scanning and having it all work alongside your Azure solution, so that you have a single pane of glass to watch for changes in your security posture. If you want to check it out, it's Thursdays, 11 a.m. Pacific, 2 p.m. Eastern, and hopefully you all enjoy it. Yeah, it's fantastic. It's in my schedule, so unless I'm pulled away on some other project, I typically play it at my desk while I'm doing some work, participating in the chat when appropriate. But most of the time I just learn from the experts. Well, you know, I try to do the same thing myself while I'm hosting the show. I'm learning from the experts. People come on and tell me about subjects that, to be honest with you, I don't know a ton about. Like, I had Adrian Hall from Hasura come on recently and we talked about GraphQL. GraphQL is not up my alley; I don't know a ton about it.
And so he sat down and said, well, this is how it works, and this is how it works alongside Azure to build APIs that need fewer HTTP calls than a typical API, where you would have one call per service. So I really love the idea of having conversations with people about the technology they're passionate about. It becomes more authentic to have those conversations rather than just show a demo. You know what I mean? Yes. And I've always believed that people connect with people. You need a personal connection with whoever is relaying the information to really take it in, in most cases, not every case, but most. Speaking of something else that I love, you sent me a tweet yesterday about Satya that I absolutely love. Do you want to talk about it a bit? Yeah, sure. So what's really great is that there was this tech summit at the White House here in the US, with all these huge leaders from the big tech companies across the United States, and there needed to be a commitment made around security. That's the thing Satya announced: Microsoft is going to invest, as an organization, billions of dollars, billions with a B, 20 of them. Let's read it directly. Satya said Microsoft will invest $20 billion to advance our security solutions over the next five years. And here's another really great thing: $150 million to help US government agencies upgrade protections and expand our cybersecurity training partnerships. There's no advancement in security posture across the internet without education. Yeah, that's fair.
Without this commitment to helping organizations extend their knowledge and create a better environment, they're not only helping their own organization, whether it's a government org or private industry, they're also helping anyone that's part of that service: employees, users, customers, whatever you want to call them. Creating a safer, more secure internet is the responsibility of everyone who is building anything that goes online. Yep. Education, especially around security, I think is key, and it's sorely lacking. Yeah, it's definitely something we don't talk enough about. We certainly love to posture around it. We love to say, hey, this is something I think is important, we should shift left. And if you don't know what shifting left means, it's taking security and bringing it earlier in the software development life cycle. Earlier in the process of creating applications, we should be discussing how to make them more secure, putting testing in place, and running scans on our different resources. So here's one of the more common things, Pierre, that I used to see a lot: npm packages that people would make part of their Node.js applications, and those npm modules being infected with malware. You could be thinking, oh, I'm creating this really interesting Node.js single-page app or JAMstack site, you pull in a module, and you end up doing some Bitcoin mining for some random person you have no clue about. It's so common. And the same thing happened with Docker images. There were a bunch of images up on Docker Hub that had cryptocurrency-mining malware in them, and people would use them in production distributions of their applications.
And unfortunately, that leads to overextended resource utilization, because that mining has a cost, you know. Your applications can suffer; they could auto-scale up when they didn't need to. It's the same in every facet of operations. One of my customers downloaded a PowerShell script off the internet that was supposed to clean up some unused accounts in Active Directory. They didn't check it thoroughly, they ran it, and the script basically wiped out all of the application partitions in Active Directory. So it wiped out Exchange, SQL, and everything like that. The internet is great; you can get some good code to help you do your work. But you have to validate that code. You have to scan it. You have to make sure it's secure. You have to make sure you trust the source. You have to test it in a sandbox. Do all the due diligence before you actually put it into production. So I think that SODD, or Stack Overflow Driven Development, is kind of dangerous. For those who don't know what I mean, that's going onto Stack Overflow and not just getting recommendations, but taking code snippets directly and using them to solve your problems. Or running curl piped straight into bash in a Linux terminal because you're told this is going to install something, when you may have no clue what that script is doing if you're just randomly curling down a bash script and running it under a root user. So rather than directly curl-bashing and executing, grab the file, take a look, do some sanity checks. If something doesn't look right, it probably isn't. That's right. Okay. How about we get on with the news? Because we've got a few items today that we need to talk about. Absolutely. That sounds good to me.
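To make that safer pattern concrete, here's a minimal sketch: download to a file, inspect it, verify a checksum, and only then execute. The installer URL and script contents are placeholders invented for the example, and the download is simulated with a local file:

```shell
# Risky pattern to avoid:
#   curl -fsSL https://example.com/install.sh | sudo bash
# Safer pattern: download, inspect, verify, then run unprivileged.
# (example.com is a placeholder; we simulate the download locally.)
printf 'echo "hello from installer"\n' > install.sh   # stand-in for: curl -fsSL -o install.sh <URL>

# 1. Read the script before you ever execute it.
cat install.sh

# 2. Verify the checksum against one published by the vendor.
expected=$(sha256sum install.sh | awk '{print $1}')    # in real life, copied from the release page
actual=$(sha256sum install.sh | awk '{print $1}')
[ "$expected" = "$actual" ] || { echo "checksum mismatch, refusing to run" >&2; exit 1; }

# 3. Only then execute, ideally in a sandbox and never as root.
sh install.sh
```

Even this is only a floor, not a ceiling; a sandbox VM or container is still the right place to try unknown code first.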
Over $20 billion of investment in security posture. But our first item is Log Analytics. So, Log Analytics agents, and we talked about this last February in our IT Ops Talk: All Things Hybrid event, the agent is going away; it's being retired. And it's being retired, like, next week. The Log Analytics agent is being replaced by the Azure Monitor agent, and one of the big benefits of that cutover is that we're going down to one agent. You don't have a Log Analytics agent, plus an inventory agent, plus a patching agent. I think we were up to eight or nine different potential agents you could deploy on your servers to do all kinds of different things. Now you have one, and you manage what data it collects and uploads to Log Analytics and Azure Monitor. You do that with data collection rules within Azure Monitor. So you can say, okay, that group of servers is going to collect these metrics, these performance records, and that information, and it's going to send it to this Log Analytics workspace. We've talked in the past about hub-and-spoke Azure Monitor or Log Analytics workspaces, where you have a group that looks after a department and needs access to all their logs and monitoring data, but maybe corporate further up also wants a copy of that. In the past, hub and spoke was, first of all, unsupported, very hard to set up, had to be scripted, and was very expensive because you're ingesting the data twice. Now with DCRs, or data collection rules, you can say collect this data and put it into this Log Analytics workspace, and have another rule underneath that sends it somewhere else. So we're now building the hub and spoke without actually building the hub and spoke.
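As a rough illustration of that hub-and-spoke idea, a single data collection rule can fan the same performance counters out to two workspaces. This is a hand-written sketch, not a complete rule; the workspace resource IDs, names, and the counter chosen are placeholders:

```json
{
  "location": "eastus",
  "properties": {
    "dataSources": {
      "performanceCounters": [
        {
          "name": "basicPerf",
          "streams": [ "Microsoft-Perf" ],
          "samplingFrequencyInSeconds": 60,
          "counterSpecifiers": [ "\\Processor(_Total)\\% Processor Time" ]
        }
      ]
    },
    "destinations": {
      "logAnalytics": [
        { "name": "departmentSpoke", "workspaceResourceId": "<department workspace resource ID>" },
        { "name": "corporateHub", "workspaceResourceId": "<corporate workspace resource ID>" }
      ]
    },
    "dataFlows": [
      { "streams": [ "Microsoft-Perf" ], "destinations": [ "departmentSpoke", "corporateHub" ] }
    ]
  }
}
```

One rule, two destinations: the department and corporate both get the data without anyone scripting a copy job.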
It's still very expensive, because you end up ingesting that data twice. And so we want to reduce the number of touch points we have in how we access information about our deployed applications and infrastructure; the least amount of overhead tends to be the best to me. Yeah, and in terms of deployment, this is all going to be a lot simpler, because you don't have to deploy with all of the policies already attached. You just deploy the agent and it picks up its data collection rules. And I'm sure it'll end up being an extension for VMs as well, so you can just tick a box or set a flag in the az CLI to say, all right, when you create this virtual machine, add the extension that installs the agent you need to get your statistics, bring them into the service, and put them into some sort of workbook or whatever it is. I think that's really, really useful. One less big step, a little less extra code you might have to write if you're doing infrastructure as code; being able to just say, look, install this, is a lot easier. Yeah. So we'll move on to our next item, which is one of yours. Sure. Well, not Azure Orbital itself, but the SES announcement. Sure. So SES Networks is a company that launches and manages satellites. They are going to be one of Microsoft's new partners in diversifying Azure connectivity and resiliency through our newer service, Azure Orbital. So imagine this, Pierre: you're setting up an ExpressRoute from a satellite all the way down to your data center or your hybrid connection into Azure. That's pretty amazing to me. I think we're getting closer and closer to terabit-level system capacity for satellite-based data transmission, and the closer we get to those bigger pipes, the more open the internet will be.
And that's one of the things I really believe in: connectivity and speed make for a community and a world that's a little more connected. Microsoft is going to be using SES's current medium-Earth-orbit constellation to provide connectivity until the second generation of satellites, known as O3b mPOWER, is ready for service. And if you don't know what Azure Orbital is, it's a satellite ground station and scheduling service for fast downlinking of data. It's ground station as a service, providing communication with and control of your satellite. To me, that's Azure in space. We were talking about this before; we're both kind of Star Trek guys, and we're never going to be able to talk to Deep Space Nine if we don't set up the infrastructure that can get the transmissions from here to there. Yeah. The transmission comes through the ground station service, which is part of Azure Orbital. It can have some significant bandwidth constraints, though. Even though the satellites themselves are starting to have more and more bandwidth, you have to remember that the payload of a specific application, or a specific scientific analysis, or whatever workload is running on that satellite, sits side by side with the communications for operating the satellite itself, which is a little different. I dealt with that with the Azure Orbital team, the Azure Space team, a couple of weeks ago, when we built a simulation in Azure with a throttled network so they could simulate workloads that would run on the satellite and see how the data would come across. One thing I think would be really interesting is if someone had, say, a medium-Earth-orbit or low-Earth-orbit satellite, and that satellite was collecting data on weather changes. Then what are you going to do with that data?
Well, one of the easier solutions, if it's unstructured data, is to put it into something like Cosmos DB, so we're sticking with the space theme here. Being able to take that raw data and process it in a service that turns it into information, so we can find out what weather changes happened in a certain section of the planet and how fast. The more links like that we set up, whether it's a government agency or a private-industry satellite, the closer we get to more information about our world, the better. Yep. All right. We're already past the halfway mark, so let's move on with our items. The next one, and lately we've been covering a few more of these, is the end-of-life announcement for Azure classic, the original version one of Azure. Yeah, the end of life has been announced. No, it's not any time soon; it's 2024. 2024. So you've got a few more years, but if you've got any workloads currently running in the classic model, now is the time to start looking at migrating them to the ARM model, the Azure Resource Manager model. First of all, you get access to a lot more services, since we stopped development on the classic model a few years back. And now you can have your entire data center, your whole environment, on the same model, as opposed to maintaining some in classic and some in ARM. Classic was more like a platform as a service, as opposed to a full suite of infrastructure services. It also required concepts that added more overhead: the service definition, service configuration, and service package, the .csdef, .cscfg, and .cspkg files you actually had to create. And you had to prepare your application. That's the great thing about using something like a container.
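As a hypothetical illustration of what "dependencies in one place" looks like, here's a minimal Dockerfile for a small Node.js app; the base image tag, port, and file names are placeholders, not anything from the show:

```dockerfile
# Hypothetical Node.js app; tag, port, and file names are placeholders.
FROM node:16-alpine

WORKDIR /app

# Dependencies are declared right here, next to the app,
# instead of in provider-specific package files.
COPY package*.json ./
RUN npm ci --only=production

COPY . .

EXPOSE 3000
CMD ["node", "server.js"]
```

Everything the app needs travels with it, so the same image runs the same way on any host that can run a container.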
Preparing your application for containerization really is as simple as: okay, are all my dependencies in the Dockerfile? They are? Great, let's roll. Having to modify your application specifically for one provider is, to me, kind of counterproductive. What you want to do is create the solution that makes the most sense, and that can include accessing common APIs to create your infrastructure. That's one of the things ARM really provides. ARM says: you can use the Azure CLI, you can use an ARM template, you can use Bicep, you can use third-party tools. Terraform, for example; all Terraform really does is make use of APIs that Azure already has, through its own domain-specific language, and that gives you a more portable way of working so you can go multi-cloud if you want to. Yeah. And I love Terraform and Pulumi, but I have to admit I'm more of a Bicep guy. Yeah, loving Bicep. I've been flexing, let me get this right, I've been flexing my Bicep a lot lately. As a matter of fact, in October I'm going to be running a thing here on Learn TV called Create DevOps, and at the end of the session, Steven Murawski and I are going to run a one-hour workshop on how to take GitHub Actions, Bicep, and Azure Kubernetes Service and have it all work together, so that you can create automated deployments without having to split the work across separate pipelines. We'll have one big pipeline. We'll use GitHub Actions secrets, and we'll use Key Vault to store the things we don't want out in the world, like our SSH keys, account information, or even service principal data. Yeah. Going to this deployment model, as opposed to creating package files and then uploading package files, it doesn't make any sense to me to do it that way anymore.
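For a flavor of what declaring infrastructure in Bicep looks like, here's a minimal hand-written sketch that deploys one storage account; the resource name and API version are placeholders chosen for the example, not anything from the show:

```bicep
// Minimal Bicep sketch: one storage account; names are placeholders.
param location string = resourceGroup().location

resource demoStorage 'Microsoft.Storage/storageAccounts@2022-09-01' = {
  name: 'st${uniqueString(resourceGroup().id)}'
  location: location
  sku: {
    name: 'Standard_LRS'
  }
  kind: 'StorageV2'
}
```

Bicep compiles down to the same ARM template JSON, so there's no new deployment engine to learn, just a terser way to write what ARM already understands.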
And so I'm hoping people love ARM and the services that push ARM forward. Yeah. ARM is not going away; Bicep is just an easier way to generate ARM templates. So anyway. And abstraction. Yeah, speaking of abstraction, our next item is that we're abstracting the desktop, because the Windows 11 preview is now available on Azure Virtual Desktop. So in your environment, if you're using AVD, which we used to call Windows Virtual Desktop, WVD, we have too many acronyms and I'm starting to lose track, you can now deploy the Windows 11 preview. You can start testing your line-of-business applications, or testing whatever you need, or maybe you just want to take a look at it. Through the marketplace you can actually see Windows 11 available as a virtual machine. I want to try Windows 11 a lot, but I'm like a lot of people: I don't have a newer, Windows 11-capable computer. Windows 11 has some specific hardware requirements, and I simply don't have that type of hardware. It's actually not that bad, because I've got a desktop under my desk over there that's over three years old. It doesn't have a physical TPM chip, but it has an ASUS motherboard, and ASUS motherboards, same with Gigabyte, have long had what they call firmware TPM, where the TPM is basically hosted within the CPU or the chipset itself. It just has to be turned on; it's not on by default in the BIOS. So it has been turned on, and that over-three-year-old machine sitting over there is running Windows 11 beautifully. But if you're on a Mac, or you're a Linux guy, and you want to try it out to see what it looks like because you have to support it in your environment, Azure Virtual Desktop is a perfect place to take it for a spin. Yeah, that's what I'm looking forward to doing.
I want to try it because I want to see the new WSL integration; I know they're doing more and more with WSL. I really want to see the new Visual Studio 2022. I saw some of the previews that came out of, I think it was Build, and it looks pretty huge. And that's the one nice thing we're doing: Azure is making a real effort to make things in the cloud more accessible to people. And this is one of the cool things: Rory Preddy is going to be on my stream next week to talk about accessibility and how to make the internet more open, not just through the services that make it easier to access things, but to actually see them if you're blind, to hear them if you're deaf, to use them if you have a disability. I think that's a huge part of what Microsoft is doing, too. We have a chief accessibility officer who's in charge of making sure every single one of Microsoft's products has some sort of accessibility built into it. Yeah, and speaking of being accessible, let's say hello to our chat room. We currently have Andrew McCollum and Vukasin Terzic, and I'm really hoping I pronounced that properly; I'm sorry if I mangled it. I'm French, and I put the emphasis on the wrong syllable in a lot of cases. We have Jared Shockley, Steve the audio guy, Paul Jensen, and we also have Regular IT Guy, who is basically my boss. He's watching, so he's probably just sitting at home having his coffee waiting for the day to start, because it's pretty early where he is. Who else have we got? And Robert Jr. That's it. Well, my cup is empty, and that means we're getting close to the end of this show. We are, we are getting to the end of the show. So the last thing we have to cover is our Learn module of the week.
And in honor of the Azure Monitor transition, with the Log Analytics agent being retired, we've picked the "Monitor the usage, performance, and availability of resources with Azure Monitor" learning path, which will take you through how to organize, plan for, and deploy Azure Monitor, and how to collect and analyze that data and make it visual, so you can share it and actually gain insight from the logs and performance of your environment. I love Microsoft Learn, just because there are so many modules to help you skill up on things, and you don't have to pay for them. And I love the sandboxes people can use, so they don't need to worry about drawing down credit or spending money. All these things are super useful. And as you can see, you gain points for it, so it's gamified. I am a level 10 wizard in Microsoft Learn modules, and I'm striving to level up and become a stronger Azure wizard. Well, then there are the levels of Orin Thomas and Sonia Cuff, who I believe have done every single module in Learn. Yeah, those two are just absolutely phenomenal. And to close, today is Testing in Production day, where our friend Steve the Audio Guy, aka Jared Shockley, who's in the chat room right now, and I are going to be discussing multiple different things. And today we're also doing an AMA, where for the first time in production we're trying to integrate Discord into a live broadcast. So if you want to participate, make sure to head to our Discord server at aka.ms/itopstalk-discord. We'll be setting up the main stage so you can participate and ask Jared and me questions about everything that has to do with streaming, creating content, recording content, and sharing the information and knowledge you have. Sounds good. Sounds like really important and useful information, too. Yeah, so thank you very much, Jay, for being with us today. Hopefully we'll do this again soon.
Hey, good morning internet. How are you? Don't you know me? I'm your favorite son. I can't wait to see you all here again another time. All right. See you everyone. Bye.