So the second speaker of the session is going to be Frank Becker. Frank has been developing Linux systems for more than 10 years and specializes in the field of testing and benchmarking. Since 2013 he has been working for Amazon Web Services, and I see a couple of Amazon t-shirts in the room, so wow. Frank has also held tutorials and talks at the German PyCon about IPython and Django. In his spare time, which by his own account is very rare, he produces the German-speaking podcast "import this". Today Frank is going to introduce the library Boto, which makes it very easy to use AWS from Python. Please welcome Frank Becker with "Managing the Cloud with a Few Lines of Python".

Hi everyone, thanks for your interest in this talk. We already lost four minutes, so I'll have to skip through some of the material. As I was already kindly introduced: I have worked for AWS since 2013, so that's one and a half years now. I'm with the operating systems group, which is based in Dresden, 200 kilometers south of where we are right now. We are desperately looking for talent, so in case you're interested in something like this, please talk to us.
We have a booth downstairs, but we also have development centers at locations all over Europe where there is a lot of very interesting work. If you especially like to work at scale, that's probably a good opportunity. I've been working with Python since about version 2.4. As already mentioned, we put some effort into this podcast; a friend of mine, Marcus, who gave a talk earlier about localization, is also involved, and he already promised that in the next couple of weeks there will be some new episodes.

But back to the talk. The idea for this talk I actually got at a little local Linux conference, where I was talking to a couple of Debian developers. They said: look, we get those AWS credits, and those guys at least didn't have any idea what to do with them. Which is sad, because AWS is giving the credits out for them to improve Debian, and I think they can use that. So we had a little chat, and, as many of you probably would agree, they didn't fancy clicking through web applications to launch instances and whatnot. So they asked: how can we automate that? They also had some Python background, so I introduced Boto to both of them, and I had the feeling this is something that could also be helpful for others.

Before we actually talk about Boto, two other little things. The first one: humans like abstractions. The word "cloud" is kind of a buzzword, and different people have different opinions about what it really means. So let me define it for this talk only; maybe in half an hour I would have a different opinion. What I mean by it is that you have dynamic, or in AWS speak elastic, IT resources. That can be storage; that can be compute, meaning virtual hosts; that can be networking.
So if you need a content delivery network, it's just there waiting for you. But it also could be routing, or packet filters, which we call security groups. You can easily have databases and, as I will show you in a second, messaging systems. The key here is that you can scale those things up and down. Why scale up? Sure, you have to pay more. But when you scale down, the idea is that you don't pay anything for the stuff you don't use. As mentioned previously, all of that has to be scriptable, because nobody really wants to do it by hand, and Python, I believe, is a perfect language to do so.

If you now think "well, I want to write my own S3 uploader": there's a tool for that already. Boto itself comes with a command line tool, but I would recommend the AWS CLI tool, which is also written in Python; there is a different talk for that. So if all you need actually fits into a simple shell script, then maybe Boto isn't the right thing and you might be much faster using the CLI.

Boto was started by Mitch Garnaat. He also used to work for AWS, but unfortunately he left the company, so now the project is managed by AWS, which means we make sure the code is up to date. But we are also very happy about contributions.
I checked on GitHub last week: we have had nearly 400 contributors to the project and over 6,000 commits, and that is just the GitHub history since somewhere in 2010. And the name actually comes from this little dolphin here, the boto, which lives in the Amazon river.

Which brings me to the first example I want to show you. Many of you are probably familiar with the storage service called S3, the Simple Storage Service. The idea is that you can dump stuff into what we call buckets. You could also think of a bucket as a namespace or a directory. In a bucket you create a key, and attached to this key you have an object, which can be a stream of whatever: files or anything else. What we make sure of is that you actually get your data back; the term for that is durability. What AWS guarantees you is 99.999999999%, and that's really eleven nines, of a chance that you get your data back. If you look up what your hard drive gives you and do some calculation with RAID arrays, you will see that it's hard to reach this number.

I dare to do live demos; I have a couple of IPython notebooks prepared. I just have to see if this works, because we had problems setting up the display here. What you basically do in the first place is import Boto. Can you see the mouse pointer? Good. So I execute that. Then, I have a little file on my hard drive called dolphin.jpeg. What we do here is we first try to create a bucket; if that fails, we just connect to the existing one. Then, as I mentioned, we create a key for the object we want to upload; this key is also called dolphin.jpeg. And then we just upload the content. Let's do that. This little star here turned into a number, so it's done. And yeah, that's some IPython magic.
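The bucket-and-key steps from the notebook can be sketched roughly like this with the boto 2 API; the bucket name and file name here are placeholders of my own choosing:

```python
import boto
from boto.exception import S3CreateError

def upload_file(bucket_name, filename):
    """Create (or reuse) an S3 bucket and upload a local file under a key."""
    conn = boto.connect_s3()  # credentials come from ~/.boto or env variables
    try:
        bucket = conn.create_bucket(bucket_name)
    except S3CreateError:
        # the bucket already exists -- just connect to it instead
        bucket = conn.get_bucket(bucket_name)
    key = bucket.new_key(filename)  # the key the object will be stored under
    key.set_contents_from_filename(filename)
    return key
```

Calling `upload_file('my-demo-bucket', 'dolphin.jpeg')` would reproduce what the notebook does in its first cells.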
Next we generate, well, what we do here is we get this bucket again and go through the list of keys, which is not really so relevant here, but this line then generates a URL that is valid for only 120 seconds. Of course you can generate URLs that are valid forever, but sometimes you just want to share a file and you do not want everyone else to download it too; you want the link to be valid only for a certain amount of time, and that is what this does. All goes well, you can actually see the signature attached to this URL, and we download the file.

Who of you knows how to create torrent files with S3? You know, BitTorrent, the thing that the music industry got a little wrong, but actually a very helpful protocol. Let me make this bigger; I prepared it already. You may remember that I gave you this link down here for this presentation; this of course also comes from S3. The only thing you do is append "?torrent" to your S3 link, and what you get back is the torrent file. With the limitations of the wireless we have here I'm not sure if clients can talk to each other, but that would improve downloads like this a lot.

Okay, next example. I talked about message queues; the service is called SQS. What it basically does, and there are many other implementations of this, is: you dump a message into a queue, and somewhere else you take it out. That's the basic concept, and it's very useful in distributed systems. As I mentioned, there are many open source projects that do kind of the same thing, but if you want that to be scalable, if you want high availability, you will find out it's not so simple to set up anymore. And it can actually be quite costly if you have to distribute the service yourself.
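Before moving on to SQS: the URL-signing demo just described could look roughly like this in boto 2; bucket and key names are again placeholders:

```python
import boto

def share_for(bucket_name, key_name, seconds=120):
    """Generate a signed S3 URL that expires after the given number of seconds."""
    conn = boto.connect_s3()
    bucket = conn.get_bucket(bucket_name)
    key = bucket.get_key(key_name)
    # expires_in is relative, in seconds; after that the signature is invalid
    return key.generate_url(expires_in=seconds)
```

For the torrent trick no extra code is needed: S3 itself serves the torrent file when "?torrent" is appended to the object's URL.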
With Boto it's quite easy to do that. Let me go to the IPython notebook again; I'll try to make it a little bigger. This time we use the sqs module out of Boto. We create a connection; this is always per region, so we go to the European region of AWS to use the service. We create a queue, which we label "europython14", and we set the visibility timeout, I'll come to that in a bit. Let me execute that: first one, second one. In the next block we actually add a message: we import the Message class, we instantiate one, we set the body to "I am a message in a queue", and we write it to the queue we created before.

Now let's assume we are somewhere else, on a totally different system. We create a remote queue object, we get all the messages, we print the message body, and we print the queue count. I execute it again, I mean, I tested it before, that's why I started there. Of course we get the message "I am a message in a queue" back, and we also get a queue count of zero. Now we wait a little, this visibility timeout, and check again how many messages we have in this queue, and big surprise, now it's one.

The idea there is: if for whatever reason the service that was actually dealing with the message, receiving, I don't know, a chunk of JSON and doing something with it, failed or crashed, you do not want your message to be gone forever. You actually want the service to delete the message only once it has dealt with it, and that's the last block. You just get the messages again, you iterate over them, and you delete them; the delete returns True for each one, and next time the queue is empty. That was example two, SQS. Now let's launch a virtual instance.
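The whole SQS round trip from the notebook can be sketched like this with boto 2; queue name and region are the ones from the demo, but any would do:

```python
import boto.sqs
from boto.sqs.message import Message

def sqs_roundtrip(queue_name='europython14', region='eu-west-1'):
    """Write a message to an SQS queue, then read it back and delete it."""
    conn = boto.sqs.connect_to_region(region)
    queue = conn.create_queue(queue_name, visibility_timeout=60)

    msg = Message()
    msg.set_body('I am a message in a queue')
    queue.write(msg)

    # "somewhere else": fetch the queue by name and drain it
    remote = conn.get_queue(queue_name)
    for received in remote.get_messages():
        print(received.get_body())
        remote.delete_message(received)  # delete only once it is processed
    print(remote.count())  # approximate number of messages left
```

If the consumer crashed before `delete_message`, the message would reappear in the queue after the visibility timeout, which is exactly the behavior demonstrated in the talk.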
Let's launch virtual instance That brings me to the next iPass notebook Again, we this time import the EC2 module out of Porto Let's actually do it that you also see which kind of Help Porto is that you know, you do not have to generate all those Ape XML stuff and us for yourself This time we enable logging so my basic do is I input logging and set the The log level to to debug Which both of you pick up I again create a connection Now I have debugging and Boto actually tells me that it found the config I didn't touch that there are several ways how I can put in your AWS keys and In case some of that should show up. I have temporary keys for this presentation. So In case you want to reuse them, don't try So and actually all it takes to actually run this instance now is This command on this line I have to say this parameter first parameter is the Actually image we want to launch so the term there is army Amazon machine image And that really defines what you actually get if you get a Linux system what Linux system What's being installed there? So as I mentioned earlier. Oh, did I forget it? Probably I forgot it. So we actually have our own Linux distribution where you make sure that runs best in on AWS It's called Amazon Linux, but you also can have redhead to the devian or what not into So The thing here to notice is you That's actually all this thing now Boto generates for us to launch it. 
I'll come to availability zones in a bit, but we have a kernel ID, security groups and all that, the architecture, root devices; we don't really have the time to get into all of that. The thing now is: the instance is launched in really a couple of seconds, and then the system boots, and that is kind of the handoff from us to you as the customer, where we do not touch it anymore. Then we don't really know if your instance really boots up or not; there's a monitoring service for that, of course, but that's a different topic.

What you get back is a so-called reservation, whose ID you can use later on to see if your instance has turned to the state "running", and which instance ID it got. Every virtual host, an instance is the same thing, gets an ID, of course, so that you can find it again. And that is what... oh, the logging changed, really? Yeah, that's why I don't do live demos. Well, then I'll show it in the slides. In the example here, I actually get back four of those instances, and once I have the object for an instance, I have a couple of methods: I can get the public DNS name and so on, and later on I can also terminate those instances, check them, or whatever.

Before I can start with the next example, I have to introduce a few concepts, namely the so-called Virtual Private Cloud, which you could roughly say is a LAN, but in the cloud. I already talked about regions: a region is really data centers at a geographical point. A region is divided into so-called availability zones, which means that if you want a highly available service, you will launch into different availability zones, and if one goes down, you are at least ensured that the other one will still be up and running. That's the idea behind it. And then, in a VPC, you want to have your
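The "wait until it's running, then use it" pattern described above might be sketched like this; the polling loop and timeout are my own framing, not from the talk:

```python
import time
import boto.ec2

def wait_until_running(reservation, timeout=300):
    """Poll the first instance of a reservation until it reaches 'running'."""
    instance = reservation.instances[0]
    deadline = time.time() + timeout
    while instance.state != 'running':
        if time.time() > deadline:
            raise RuntimeError('instance %s did not come up' % instance.id)
        time.sleep(5)
        instance.update()  # refresh the state from the EC2 API
    print(instance.id, instance.public_dns_name)
    return instance

# once done with it: instance.terminate()
```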
You do not want to see traffic from Our management or from other customers or what whatever And therefore you basically launch with private IP addresses. You have subnets in there the subnets are Availability zone and If you want to have those instances exposed to the internet you can always attach Public IP address and they're visible again or you can go through this internet gateway and router and you know There are different things like load balancers and stuff you can use here It's important for what I want to do now so This example actually shows how to just launch 10 of those hosts 10 instances Installed this CC and all the stuff you need to to build a Linux kernel Set up this CC and then this CC has a functionality where actually can broadcast and find other other nodes and then you can compile I'm afraid we do not have the time to really show that because the launches and all that takes a little So let me just show you quickly Oh great now Firefox doesn't want to work. I'll have it back up What you basically do is That here is important this time you say I don't want to have just one instance. I want to have 10 That's this thing You say Exquisitely which instance type you want to have so C3 X large has a little bit of compute power You have to give a subnet I talked about it earlier We want to have monitoring this time and yeah, well, that's done. You get a couple of instances I used fabric to you know SSH into them install the stuff Start the CC Well, it's again done with this fabric I actually kick off the the compile and after After a little less than two minutes the whole thing was over and I actually can shut down everything and and done I Very very quickly It's going through the last example where you could say well, but maybe ten instances a little too much So I want to have this more flexible. 
Let's say you have a compile service or something. The key there is another tool called Auto Scaling. What it basically is: you have a so-called launch configuration, where you define the instance size of your virtual hosts, which AMI you want to use, and all that. You need a so-called Auto Scaling group, which defines, for instance, the availability zones, the minimal size of your cluster, the maximum size, and this launch configuration. That is all started when you kick this off. You can imagine that now four instances are being launched, and that is exactly what you see at the bottom of the slide here, with the get_all_activities method. Then you have a scaling policy for scaling up and down. You have two kinds of triggers, or alarms, for that, and that's done here in these alarms: you give it a threshold for CPU utilization which has to trigger for a certain amount of time, that's twice 60 seconds here, and if this triggers, it actually scales up, that's this parameter up here, by one instance. This goes on and on until you reach the maximum number of instances. All right, and to shut this whole thing down again: three lines of Python.

I'm through my slides. Actually, for all the services you see here, Boto is the API, or rather the tool, to use. Thank you.

Thank you, thank you a lot, Frank. We got time for one very quick question. Okay, then may I ask a quick question myself: how long would it take me to set up a service running on Boto from scratch?

Well, you have to click together an AWS account, which means some verification that you are who you claim to be. From there you get a key, so you have to configure Boto; that's just two strings, two keys, that you have to give it as a shell variable or whatever. And then you use one of those lines here and you're up and running.

Okay, thank you a lot. Let's thank the speaker again and prepare for the next part.