So yeah, I'm the last speaker, so I'll quickly wrap things up so we can spend the rest of our Saturday. My topic is triggering analytics with serverless functions. The last time I gave a talk here, it was also about Google Cloud Functions. In case you didn't know, Google has this service called Google Cloud Functions: essentially, you write a simple function, you throw it to the cloud, and it runs in response to different triggers — for example, an HTTP call can make the function run. And in case you didn't know, Google Cloud Functions is now generally available, so you can build applications on it without worrying that Google will pull the service out from under you. In my previous talk here, I showed how you can use Slack to ping Google Cloud Functions via an HTTP call to trigger a workload — in that case, generating stats from Meetup. But Google Cloud Functions has various other triggers, so this talk is about those other triggers, like Cloud Storage; the Firebase triggers are more for mobile apps. Okay, so here's the scenario. You have one department that compiles data reports, and it depends on the other departments in the company for the underlying data. They talk to each other: department A, which compiles the reports, tells the other departments to send over, say, a CSV file. So how was this done previously?
Departments B and C each send their file over, and department A compiles everything and writes the final report. But with a manual process — say someone sends a CSV file over — they miss a column or mistype something, and then department A has to go back and say, "can you fix your file?", and it becomes an endless copy-and-check exercise. So, looking at this scenario, we can improve things. Doing it manually — where one department controls the whole compilation process and has to chase the other departments just to collect the data and compile it properly — can take a very long time. What we can introduce is a compute service that does this checking for us. One way would be to spin up a server with some kind of front-end page, tell the departments "drop your files here," and have everything compiled accordingly. But setting up a server costs money and time: someone needs to maintain it, and someone needs to develop the front-end application that runs on it. So it would be nice to have some small function that does these mini checks for us. That's where Google Cloud Functions comes in. So let's look at the scenario where department A creates report A and sends it over to the compiler: they drop the CSV file into some kind of bucket to get things going. The flow starts with department A creating a CSV file and dropping it into the bucket. And as I said on the first slide, Google Cloud Functions can be triggered by activity in the bucket.
So what happens is that the moment you drop the file in, the Cloud Storage bucket informs Google Cloud Functions: hey, there's a file in the bucket you might be interested in — take a look and decide whether you want to continue processing it. And if the data is not of good quality, the function can immediately inform department A: hey, your data is bad, go and fix it. Okay, before we do any demo, let's quickly look through some code. If you want to follow along with the code base, the links are there, but I'll just focus on the more interesting portions. For Google Cloud Functions, when we're not dealing with the HTTP trigger, you need to write a function with two parameters: data and context. These parameters contain a whole bunch of information that the event provides. For example, if you use Cloud Storage to trigger your Google Cloud Function, the data will contain the bucket name, the name of the file that was dropped into the bucket, that kind of thing. The whole structure of the data format is available on the website — you can just check the documentation. Or an easier way, when you're trying it for the first time, is to just log it out and see what's inside, and continue playing along from there. This next section is fairly basic: essentially, the bucket event goes into the function, along with our config. In our case, we will be using Slack to notify the department that its file is bad. We also send an initial message when the file is dropped into the bucket — "okay, we have received the CSV file, and we will be checking it" — and if the file is not good, we tell them on Slack: hey, your file is not good. So this next part is the more important thing to know.
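As a rough sketch of what such a background function looks like — the `bucket` and `name` fields follow the Cloud Storage event payload, while the function name `check_report` is a hypothetical one for this demo:

```python
# Minimal sketch of a Storage-triggered background Cloud Function
# (Python runtime). `data` is the Cloud Storage event payload and
# `context` carries event metadata such as the event ID.
def check_report(data, context):
    print(f"Event ID: {context.event_id}")
    print(f"Triggered by: gs://{data['bucket']}/{data['name']}")
    # Returning the object path just to make the sketch easy to poke at.
    return f"gs://{data['bucket']}/{data['name']}"
```

Logging `data` and `context` on a first run, as suggested above, is the quickest way to see exactly which fields your trigger provides.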
For Google Cloud Functions, when you try to save files, you can only save them in the /tmp folder, because the Cloud Functions file system is essentially read-only — it doesn't allow you to manipulate files except in /tmp. So any temporary file you want to manipulate needs to go there. In our case, when the CSV file is dropped in the bucket, we first want to download the file into our Cloud Function before we can load it up with pandas to do the check. So after the Cloud Function downloads the file from the bucket, it runs the check: we have this analytics_check.run_check, which basically just checks whether the expected set of columns is there or not. So now I'm going to run the demo — hopefully it works. I'm going to drop a couple of files here, and then the messages will appear. Okay, hold on. What we're going to do now is upload the incorrect file first, and then the one that should pass. On the very first run, it will take a while because of something called cold start time: essentially, it takes a while for Cloud Functions to go from zero instances to one. But once the cold start is over and the function is warm, any function call after that will be much, much faster. If you've been following cloud functions for a while, you'll know there's this whole debate about which cloud provider has the shortest cold start, because people don't want to wait too long for this kind of thing. So, as I was saying: I drop the incorrect file into the bucket, that informs the Cloud Function to run the check, and it says — okay, the state column is missing.
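That flow can be sketched like this, assuming a hypothetical required-column set and with the actual Storage download left as a comment. The talk loads the file with pandas; this sketch reads just the CSV header with the stdlib csv module to stay dependency-free:

```python
import csv
import os

REQUIRED_COLUMNS = {"state", "value"}  # hypothetical columns for the demo


def run_check(csv_path):
    """Return the set of required columns missing from the CSV header."""
    with open(csv_path, newline="") as f:
        header = next(csv.reader(f), [])
    return REQUIRED_COLUMNS - set(header)


def on_file_uploaded(data, context):
    # The function's file system is read-only except for /tmp, so the
    # uploaded object must be downloaded there before we can open it.
    local_path = os.path.join("/tmp", os.path.basename(data["name"]))
    # The real download would use the google-cloud-storage client, e.g.:
    #   storage.Client().bucket(data["bucket"]).blob(data["name"]) \
    #       .download_to_filename(local_path)
    missing = run_check(local_path)
    if missing:
        print(f"Bad file, missing columns: {sorted(missing)}")  # -> Slack in the demo
    return missing
```

With the demo's bad file, `run_check` would report that "state" is missing, which is what gets posted back to the department on Slack.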
Okay, now the department that sent it knows: we have a state column missing, let's go and correct it. So we send the corrected data — and now you see the response is way faster, because the function has been warmed up. So now everything is good; we've done the first check of the whole pipeline. Okay, next — you see this red box here. You'll notice I put the instantiation of the storage client outside the main function. When you first learn Cloud Functions, you expect only the code inside the function to run. But if you check the documentation, there's a section on performance that says: use global variables if you want to reuse objects in future invocations. So essentially, to reduce the time of future invocations, you can put the instantiation of the Google Cloud Storage client outside the function. During the cold start it will still take a while — you instantiate your clients and everything — but from the moment the function is warm, that client can be reused over and over again, which reduces the time of subsequent function calls. Then there's a part about using dependencies wisely. If you watched Google Cloud Next, there was this demo where they showed some Node.js code: normally, when you write code, you put all the imports at the top of the script and write your function calls after that, so everything gets imported up front before your functions run. But if some of your functions only depend on — okay, I'll skip that point. Sorry, just forget it. Okay, just one more thing.
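The global-variable tip can be sketched as a module-level cache that pays the construction cost once per cold start. The `factory` argument here is a stand-in for something like `storage.Client()`:

```python
# Global (module-level) state survives across warm invocations of the
# same function instance, so expensive clients should be built once
# here rather than inside the handler.
_client = None


def get_client(factory):
    """Create the client on first use, then reuse the cached instance."""
    global _client
    if _client is None:
        _client = factory()  # e.g. storage.Client() in the real function
    return _client
```

Every warm invocation that calls `get_client` skips the construction entirely, which is exactly the saving the documentation's performance section describes.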
So, just now we were talking about cold start, and how after that the function is warm. Essentially, what happens with Cloud Functions is that once the function instance has started, it keeps running from that moment on — Google doesn't spin it down immediately. So if you downloaded some file, then to prevent it from polluting your /tmp folder, you have to clean up after you're done; otherwise it can affect future invocations of the function. Say I have this file and I download it into the Cloud Function — if I don't clean it up, a later invocation might end up using the same stale file if I'm not careful. Okay, so we're back to this diagram: report A, report B, report C, and then compile everything. What we've done so far is only the initial section — receive report A and check whether report A is okay. But we want to proceed all the way to the actual compilation and the final check. So what we could do is run the checks on the reports from departments A, B, and C; once each check is done, we store that state somewhere — record that this file has been checked. And once all the files are checked, we can trigger another function to run the final compilation step. That's what I was getting at: once we're done checking reports A, B, and C, we store that state in some kind of database, and then we publish a message which triggers the function that runs the final compilation step. Yeah. So I'll just skip to the demo. In this demo, I'm essentially simulating the three departments.
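Both ideas — cleaning up /tmp so a warm instance never reuses a stale file, and recording per-report check state until all three reports are in — can be sketched like this. The in-memory dict is a stand-in for the Datastore records used in the demo, and the report names are hypothetical:

```python
import os

# Stand-in for the Datastore records in the demo: one entry per report.
checked = {}
ALL_REPORTS = ("report_a", "report_b", "report_c")


def cleanup_tmp(path):
    """Delete a temp file so it can't leak into the next warm invocation."""
    if os.path.exists(path):
        os.remove(path)


def mark_checked(report_name):
    """Record a passing check; return True once every report has passed."""
    checked[report_name] = True
    ready = all(checked.get(r) for r in ALL_REPORTS)
    # When `ready`, the demo publishes a Pub/Sub message that triggers
    # the final compilation function.
    return ready
```

The real function replaces the dict with Datastore entities, but the shape of the logic is the same: check, record, and fire the final step only when all three records are in.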
Each of the three departments sends its correct data in, and then a function checks each of the files. Once a file is done being checked, the function informs Datastore that this file has been checked and is okay to be compiled. Once all three are done, there's a final function which says: okay, all three files are okay, we can run the final compilation step — and that function is triggered by Cloud Pub/Sub. Okay, let me run this demo. So I just sent three files to this bucket — no, hold on, this bucket — and refresh. I put them into this folder: that's one, two, and three, representing the three different departments, and the correct data is in here. If you look back at Slack, you can see what's happening. The first function basically just does a simple check — hey, this file is okay — and if it's not okay, it informs the department that sent the wrong file to go and redo it. The final function is another function, triggered by Cloud Pub/Sub, which in this case just counts the number of rows in each of the files and sums them up. Each file sent here only has one row, so the total number of rows it reports is three. It's just a simple example, but it can be expanded much further, with everything triggered by events: one is the bucket trigger, which sends an event over to Google Cloud Functions to run a compute job, and another is Pub/Sub triggering Google Cloud Functions. Okay, back to the slides.
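The final compilation step in the demo just counts data rows across the three files; a sketch of that, taking the CSV contents as strings — in the real function they would be read back from the bucket:

```python
def compile_reports(csv_texts):
    """Sum the number of data rows across the reports, excluding headers."""
    total = 0
    for text in csv_texts:
        rows = [line for line in text.strip().splitlines() if line]
        total += max(len(rows) - 1, 0)  # first row is the header
    return total
```

With the demo's three one-row files, the compiled total comes out as three, matching what the talk shows.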
Okay, so if you want to find out more about all these functions and the demo, you can check out the repository. And if you're interested in how to deploy these Google Cloud Functions in an automated fashion — just now Nijan was mentioning Google Cloud Build — this set of functions is deployed via Google Cloud Build, and I have another write-up which describes how to go about setting up that automated build workflow. The rest of the slides — there are two more — are basically about some of the future work Google has in store. One is Cloud Scheduler. If you look at the diagram of triggers I showed you, there's no trigger for running a function based on time. But for a lot of use cases you need that — say you want to do daily reports and run the function at 4 a.m. every day. You can't do that right now, but Google has announced Cloud Scheduler, which will be able to trigger a Google Cloud Function on a schedule. I believe it's currently in early access, so if you want, you can find the link and apply for it — I don't have it to show you right now. And then there's one more thing, called serverless containers. It's quite a new thing. Just now I was talking about Google Cloud Functions, which runs on the environments that Google provides — right now, that's Node.js and Python, and that's it. If you've seen the videos out there, there's potentially a Go runtime for Google Cloud Functions coming, but I'm not sure when it's going out; I think it's currently in early access.
But serverless containers, essentially: say you have a Docker container, into which you can put anything you want — ImageMagick, Rust, whatever — you can pass it to Google and tell Google to run it on your behalf, and it runs the way Google Cloud Functions does, billed based on how much you use. That saves a lot of money compared to maintaining your own environment. Yeah, I'm still waiting for this — hopefully they come out with it soon. Okay, that's the end for me. Any questions?

[Audience] For your final step — did Datastore trigger the workflow when it received the three files?

Okay, in my case, the last function essentially asks Datastore whether the records say each file has passed, and when all three records say their file has passed, it triggers the final step via Pub/Sub. How did that function know to check? It basically keeps checking on each invocation. Now, there is this thing called Google Cloud Firestore, where putting a data point into Firestore can itself trigger a Google Cloud Function — that would be a possibility. But at some point in the past, I activated Datastore, and Firestore versus Datastore mode is an either-or: once the mode is set, the setting is permanent and I can't change it back. So I can't play around with that feature anymore. Yeah, okay. Any other questions?

[Audience] Is it supported for all languages, or only specific languages?

You mean Google Cloud Functions? Cloud Functions currently supports Node.js 6, Node.js 8, and Python. Node.js 8 and Python are at the beta stage; the only one that is generally available is Node.js 6. Any more questions? Okay, one last question.