Now, one problem that we have with any server management system is inventory. How do you know how many servers you have, how many servers Ansible will manage, and on which server Ansible will do what? There will be cases where, let us say, you have four web servers and a couple of database servers, and all of them need to be configured differently. Some servers need to be memory optimized, some need to be CPU optimized, and so on and so forth. So there is really a lot of configuration. For certain servers some users might have access, some users might not, so you have to configure them like that. To understand which server does what, you create a static inventory. A static inventory is basically the first step that any Ansible user, or in fact any configuration management user, would take. It is a text file in which you write the addresses of all your servers. For Ansible it is a static text file in INI style; you know what INI style is: a heading in square brackets, lines below it, then another square-bracket heading. Now, this static inventory can consist of host addresses, where an address can be an IP or a resolvable domain name, anything. It can have optional variables; for example, in certain cases you can apply variables like datacenter=a or datacenter=b and so on. And last, we can do host grouping. What is host grouping? Basically, all four web servers come under one heading and both database servers come under another heading. You are grouping your hosts so that you can manage them together. That is host grouping.
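To make that format concrete, here is what such a static inventory file might look like; the host names, addresses, and the datacenter variable are made-up examples, not from the demo:

```ini
# Group headings go in square brackets; one host per line,
# with optional variables after the address.
[webservers]
web1.example.com datacenter=a
web2.example.com datacenter=a
10.0.1.15        datacenter=b
10.0.1.16        datacenter=b

# A second group: both database servers under one heading.
[dbservers]
db1.example.com
10.0.2.21
```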
One distinct characteristic of a static inventory is that it has to be managed manually. So, let us say you add a server: you have to add a line under the appropriate host group heading, and so on. Now, in a data center kind of environment, building a server is not a matter of minutes; it takes weeks, and if your configuration is a little eccentric it can take months. In those cases, where you have to make a couple of entries a month, it is OK. You will find it inconvenient, but you will not care a lot. But now imagine today's infrastructure, where you are on AWS, GCP, DigitalOcean and so on, and getting an instance is a matter of a minute or two. What is going to happen is that you will end up creating a lot of instances, because the cost of an instance has dropped and there is no procurement to go through. It is really cheap; on DigitalOcean you can get an instance for a few dollars, your restaurant bill is more than that. So people will end up building a lot of servers, and at a much higher rate. And because we now have a huge number of addresses to manage, the static inventory often goes out of sync, because you just booted ten more servers but forgot to add those IPs; the process is now so frequent that a mistake is inevitable at some point. Right? So I am just going to show you a sample of a static inventory. This is a very basic static inventory, and as you can imagine, as we go on adding new servers this can easily go out of sync. So what is the next best thing we can do? Basically, the plan is to generate the inventory dynamically at runtime. When you actually call Ansible to execute something, let us say you want to install nginx on a particular set of servers, you do not know the IP addresses beforehand.
Maybe you missed something, or maybe you just do not have a static inventory prepared. In those cases you can use a dynamic inventory, which will build the inventory on the fly, in memory; it is not written to a file. We build it in memory so that the chance of going out of sync is eliminated. Ansible, as a project, actually comes feature-packed with a lot of inventory options: EC2, GCP, DigitalOcean, a whole bunch of them. You just need to download a script and put it in the appropriate place before calling Ansible. And the good part about Ansible inventories is that Ansible does not put any restrictions on how you build a dynamic inventory. You can say my favorite programming language is Python, or Ruby, or Go; you can write your dynamic inventory in any programming language, as per your own convenience. The only restriction we have is that each dynamic inventory should return a valid JSON inventory when executed with a particular flag: my_inventory --list should emit a valid JSON inventory, and then Ansible will take over. So, just as an example, I will show you how the default inventory works. This is what Ansible actually provides; I have not made any changes here. And remember, this is going to parse my entire EC2 account, so it is going to take a couple of seconds, maybe more than a couple, and the output that you will see here is going to be big. How many of you are aware of EC2, by the way? How many of you are not? Sorry, I should explain. Amazon EC2 is a web service from AWS, which is a cloud provider.
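As a sketch of that --list contract, a minimal dynamic inventory script could look like the following; the group names and addresses are hard-coded placeholders, where a real script would query an API instead:

```python
#!/usr/bin/env python
# Minimal dynamic inventory: must print a valid JSON inventory
# when invoked as `./my_inventory --list`.
import json
import sys

def build_inventory():
    # A real script would discover hosts from a cloud API here;
    # these groups and addresses are invented placeholders.
    return {
        "webservers": {"hosts": ["10.0.1.15", "10.0.1.16"]},
        "dbservers": {"hosts": ["10.0.2.21"]},
        # _meta/hostvars lets Ansible skip a per-host --host call.
        "_meta": {"hostvars": {"10.0.1.15": {"datacenter": "a"}}},
    }

if __name__ == "__main__":
    if len(sys.argv) > 1 and sys.argv[1] == "--list":
        print(json.dumps(build_inventory()))
    else:
        print(json.dumps({}))
```

Saved as an executable file and passed to Ansible with -i, this would behave like the EC2 script does, just with static data.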
EC2 is a service within that cloud provider using which you can boot virtual machines in a very short time. It will take you a minute to boot a virtual machine, or maybe a hundred virtual machines, or a thousand. So this script is going to go through each of the virtual machines in my account. It is going to pull out all the tags (in EC2 you can tag your machines), it is going to pull out IP addresses, regions, subnets, zones and so on, and build a big inventory for me which I can use. I have a bunch of machines running here in different VPCs. So that is the inventory; what do we do with it? Now I am going to show you how we can actually execute a command using this dynamic inventory on just one server. If you notice, there are a bunch of IPs, but I do not want to execute something on all those IPs. I am going to execute a command that gets the date from one remote server. So basically, I am calling Ansible using ec2.py, which is the inventory file that Ansible provides by default, and I am going to use one particular group within the list of IPs that I got: any server that has a tag with key Name and value elasticsearch. I will be logging in as a demo user and using the shell module to run the date command. All right, sounds good? So how is ec2.py contacting AWS? We are going to write a dynamic inventory after this, so you will get an idea of that, but the short answer is: using APIs, AWS's APIs. These scripts use a library which contacts those APIs. OK, so while this runs, it will take a couple of seconds, maybe ten, to get the output. Do you guys have any questions? Yes: this inventory is always built on your laptop, the one executing the Ansible command.
So, not on the remote machine; on the local machine. OK, the question is: where is the inventory created, and which piece of code runs where? Basically, what happens is that once I execute this, ec2.py runs on my machine; it will be executed on my machine here. It will contact the AWS APIs; it has credentials written down, it will use those credentials, contact the AWS APIs, get the details of all the machines that I have, and then figure out which machine has this particular tag and which machines are part of this particular group. All of this is happening on my laptop here. Once this has happened, Ansible is going to create Python code on the fly, and that Python code will be uploaded to the remote machine, where it is responsible for actually executing the date command. All right. Next question: if you did something using a static inventory during the first execution, will it affect the upcoming execution? Between runs, variables are not persisted in Ansible, but changes do persist, because they are being made on the remote machine. So if you install a package called nginx during the first run, it will still be there when you execute again. But if you were using a variable called my_package, provided by your inventory or provided by someone else, it might not be available on the second run, depending on how you are running. All right. So, basically, this is the time on the remote machine. Yes. It is not necessarily your user name; here I am using a demo user name. Or I can configure it in one of the Ansible configuration files or in the SSH configuration file.
I have not configured anything, and this is just a demo user; that is why I have to pass -u. But you can set it as a default in the configuration file, and then you do not need to pass it. This one is based on tagging, but there are other kinds of groups: you can group based on tags, on data center, on pretty much anything; there are CIDRs, subnets, VPCs and so on. I have only configured the SSH keys. Yes, that is OK, as long as your SSH keys are authorized, or whatever method you use. No, the SSH key is actually picked up from an SSH agent. You can specify it in the inventory, but it is not mandatory. If you have multiple SSH keys, you have to run it separately, sorry, or you have to add all the keys to your SSH agent. All right. OK, so going forward. What I have shown you is the default script that Ansible provides; I have not made any changes to it. That is something you can download right now and start working with. But I actually want to go a bit deeper. So I have written a small inventory script, a very small inventory script; it is easier to learn from, because the default one is complicated. I want to share it with you so you can use it as a sort of starting point for writing your own scripts. This inventory is actually simpler than what Ansible does by default. It goes over all the instances, but it only looks for instances that have a particular Ansible tag; it will ignore all the other instances. Using this, you can selectively say that only this set of machines will be managed by Ansible; I do not want Ansible to touch all the other machines. Which is something that I do not recommend; I think all your servers should be managed by some sort of software, and not manually. But this is what a basic inventory script will do. I am using Boto 3. I am a Python developer, so I am using Python.
But rest assured, you can write it in any language that you want. Basically, I am using a bunch of libraries: boto3, json, a config parser and so on. There is a small function to get the IP address: you pass it the JSON of an instance, and it figures out and returns the public IP if one is available; if a public IP is not available, it returns the private IP. And these code lines look for a file called ec2.ini, just to get the credentials for the API. All right, let us go with that. That is it; the entire logic is basically this tiny bit, a few lines of code. That is all. What this code does is create a boto client; boto is the AWS library, by the way, in case you are wondering. It is going to connect to EC2 with the access key and secret which it got from that config file, and then it is going to ask for all the instances: describe the instances. From describe_instances you basically get reservations; from those I extract the instances, and from those I extract the addresses. Once I have done that, I check for a tag called ansible_role. If ansible_role is present, the script takes the tag's value and puts the host under that entry. All right. Yes, sorry, yes. So basically this is actually slightly dumber than you think: it just returns whatever value you tagged. All right, just to give you an example: out of the hundred-odd instances, I have not actually tagged ansible_role on all of them; I have tagged just two instances. The elasticsearch one, if you look at it, is the same machine as before. So, using this, we can execute the same command. This is the inventory file, and the grouping is now not based on the raw EC2 tags; it is based on these groups that I created with the ansible_role tag, and the group is called elasticsearch.
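The core of that loop can be sketched as a small function over the shape of a describe_instances response; the sample data below is invented for illustration, and in the real script the response dict would come from boto3's describe_instances call rather than a literal:

```python
# Build Ansible groups from an EC2 describe_instances-style
# response: put each instance's address under the value of its
# ansible_role tag, skipping untagged instances entirely.
def group_by_role(response, tag_key="ansible_role"):
    inventory = {}
    for reservation in response["Reservations"]:
        for instance in reservation["Instances"]:
            # Prefer the public IP, fall back to the private one.
            addr = instance.get("PublicIpAddress") or instance.get("PrivateIpAddress")
            tags = {t["Key"]: t["Value"] for t in instance.get("Tags", [])}
            role = tags.get(tag_key)
            if role and addr:
                inventory.setdefault(role, {"hosts": []})["hosts"].append(addr)
    return inventory

# Invented sample response in the shape boto3 returns.
sample = {"Reservations": [{"Instances": [
    {"PublicIpAddress": "54.0.0.1",
     "Tags": [{"Key": "ansible_role", "Value": "elasticsearch"}]},
    {"PrivateIpAddress": "10.0.0.7", "Tags": []},  # untagged: ignored
]}]}
```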
You could also use the default EC2 script's tag groups, because that one is much more comprehensive. Yeah. All right. So this is how you can use it. The entire script is available to you, so you can check it out, play with it, feel free to modify it and do whatever you want to do. That's all I have. Yes? How do you implement this for local data centers? That's a great question. The Ansible way is basically to use a combination of a database like SQLite and a small web service: write a small Flask or Sinatra app. Do you know Flask? Micro... microframeworks. So use that, backed by SQLite if you have small numbers; if you have a large inventory, MySQL or Postgres. And basically your inventory script will now call your Sinatra or Flask app, which will read everything from the database, so the database becomes the source of truth. Can you not do the same thing with remote facts, the gathered facts, so that we don't need to write a pipeline? As far as I know, fact collection happens separately; generating the inventory and gathering facts are not the same process. If you use this one, the one that I wrote, you will lose a lot of the facts that come from AWS; you will still have the host facts. The facts come from different levels. If you use the AWS default inventory script, you will get a lot more information, because it is actually fetching a lot more things. So what I mean to say is: with remote facts, you can see the inventory tag or whatever tag name you want, and with that you can then apply your other components. You will still get host facts, but I don't think those will have the reservation data.
The host facts that I am talking about are the facts which come from inside the instance, like the kernel version; I am talking about those facts. I think you are talking about the AWS facts. Yeah, those, with my script, you will not get. With the one that AWS has, you get a bunch of them, and you don't need to do anything. So remember, when you write a playbook you write hosts: my_hosts; it is basically that lookup; it is not itself creating the dynamic inventory. So, is there a fee for creating the dynamic inventory? No, it's free. Well, that depends on what kind of inventory you are using; if you are using the one provided for AWS, the describe calls it makes to AWS are not charged. The AWS script is flexible, with different options: you can group on the basis of, let's say, subnets, availability zones, regions, VPC IDs, operating systems, OS versions. It's huge. It's so big that I can't possibly demo everything, because you can imagine the amount of time it would take to actually walk through it; the one that AWS has is very comprehensive. So, the tags that we created, were they created in Ansible? No, the tag was created on EC2; it was not created by Ansible. It can be created by anyone, but it has to be on EC2. Whether it was created by Ansible, by hand, by Puppet or Chef or whatever, it will all work. Any other questions? OK, I'll take only two more questions, short ones, and then you can catch me afterwards. Yes: the question is about how the inventory is communicated when Ansible connects to remotes. If it's a static inventory, then it's normal text parsing. If it is dynamic, then it is actually JSON. So basically, even the text one is converted to JSON in memory, and the dynamic one gives you JSON anyway.
So the JSON inventory is basically passed around inside Ansible in memory. Certain sections, like the host address, or any variables, are used accordingly; there can be connection variables like which SSH user to use, which set of keys to use, and so on. Yes, last one. Yes. I am not sure, I have never tried it. As far as I can foresee, for Windows you would have to check separately, because on Windows Ansible does not use Python the way it does on Linux. In any case, I am available afterwards; you can catch me.
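Coming back to the earlier question about local data centers: the database-backed idea can be sketched with SQLite from the Python standard library. This is a minimal sketch with an invented table layout and invented host rows; as mentioned, a real setup might put a Flask or Sinatra service in front of the database, which the inventory script would then call:

```python
# The database is the source of truth; the inventory script just
# reads groups and hosts out of it and emits a JSON inventory.
import json
import sqlite3

def load_inventory(conn):
    inventory = {}
    for group, host in conn.execute("SELECT grp, host FROM hosts"):
        inventory.setdefault(group, {"hosts": []})["hosts"].append(host)
    return inventory

# Demo with an in-memory database and made-up rows; a data-center
# deployment would point this at a shared SQLite/MySQL/Postgres DB.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE hosts (grp TEXT, host TEXT)")
conn.executemany("INSERT INTO hosts VALUES (?, ?)",
                 [("webservers", "10.0.1.15"),
                  ("webservers", "10.0.1.16"),
                  ("dbservers", "10.0.2.21")])
inventory_json = json.dumps(load_inventory(conn))
```

Adding or retiring a machine then means one database row, and every subsequent Ansible run picks it up automatically, which is exactly the out-of-sync problem the static file had.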