Okay, so in this session we are going to cover what a cron job is, the disadvantages of using cron jobs, what the Queue API and queue workers are, why they are useful, and how we can trigger a queue using cron and manually.

So, what is a cron? During your workflow you must have come across a screen like this where you run a cron. A cron is basically any scheduled job, for example database maintenance or sending bulk emails, and you can set a time frame, like having the cron run once every hour, every 3 hours, or every 6 hours. Any particular job that you want to run regularly on your website can be a cron job. It is best suited to short-running tasks that are not resource intensive, where you are not calling out to any third-party resources.

Using cron jobs comes with its own disadvantages. One is that if you have 3-4 cron jobs for a module and the cron job for one module fails, then all the cron jobs you have created can fail with it. There is usually no logging information in a cron, so you cannot log what went wrong. And you cannot run 2 or 3 cron jobs simultaneously. When I say "cron job" here, I mean the hook in your module: when you are creating a module, you have a hook where you define whatever job you want to run, and that is basically a cron job.

Due to all these disadvantages we use the Queue API, which is a better process. You can think of a queue as a system holding a number of items that you work through one by one. Queue workers have their advantages: you can set a time limit for each queue. For example, if you create 5 queues, you can set a separate time limit for each one of them. And if one queue fails, you can still run the other queues inside your module.
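The hook mentioned above can be sketched like this; a minimal example assuming a hypothetical module named "mymodule", with an illustrative task inside:

```php
<?php

/**
 * Implements hook_cron().
 *
 * A minimal sketch of the cron hook described above, for a hypothetical
 * module named "mymodule". Drupal calls this every time cron runs, so it
 * should only do short, non-resource-intensive work.
 */
function mymodule_cron() {
  // Illustrative short-running task: purge temporary records older
  // than one day from a (hypothetical) custom table.
  \Drupal::database()->delete('mymodule_temp')
    ->condition('created', \Drupal::time()->getRequestTime() - 86400, '<')
    ->execute();
}
```

If an exception escapes this hook, the rest of the site's cron run can abort, which is exactly the failure mode described above.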
It's efficient compared to cron because it can handle resource-intensive tasks, so if inside a queue you are using some third-party resource, this is more powerful. You can also run multiple queues without them interfering with each other. And you can log information too: you can attach some metadata and know why an item failed.

So the Queue API has a few components. One is QueueInterface: any class that implements this interface, when you create an object of that class, gives you a queue. By default a queue is backed by the database queue, which works first in, first out: when you have a set of items, it always picks up the first one, processes it, and then removes it. And when you have items in a queue, you can release an item and come back to it later. So the role of the queue object is to create items, claim them from the queue, process them, and then delete or release them.

We also have the Queue UI module, which is very powerful; I use it to see what queues exist and what items are inside each queue. You can also use it to delete items if you don't want to process them. Here is a screenshot of the queue manager, where you get a list of all the queues; if you want to see what's inside one queue, you can view it, and you can also release or delete its items.

So those were the basic concepts, and now we are going to go through a demo. Does anyone have any questions so far?

Question: if you have a long-running process, let's say a binary like SQLite, and we spin it up in a queue on an AWS pod, will it keep that pod alive while it's processing?

I have not worked with that setup, but you have access to this API inside a Drupal module, so you can run your own queries, create items, and add any number of items to a queue. I can show in the demo how we can use this API.
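The create/claim/process/delete/release cycle just described can be sketched with the core queue service; the queue name "example_queue" and the item shape are assumptions for illustration:

```php
<?php

// Get a queue (backed by the database queue by default).
$queue = \Drupal::queue('example_queue');
$queue->createQueue();

// Create an item; any serializable data works.
$queue->createItem(['nid' => 42]);

// Claim items FIFO. The lease time (60s) is how long a claimed item
// stays invisible to other workers before it can be claimed again.
while ($item = $queue->claimItem(60)) {
  try {
    // ... process $item->data here ...
    $queue->deleteItem($item);    // success: remove it from the queue
  }
  catch (\Exception $e) {
    $queue->releaseItem($item);   // failure: put it back for later
  }
}
```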
This particular example I created is a small module: when you insert a node and it is not published, the module adds it to a queue. The name of the queue is cron_node_publisher; whenever a node is not published, it gets added to that queue. Over here you can see that cron_node_publisher has eight items in it.

Question: I have a fundamental question. You have this on the Drupal site and it's showing you the queues you set up, but Drupal still has all of its own cron stuff running, right? It's not magically part of the queue system when you enable queues?

No; these are the queues that we are creating. Drupal has its own cron running in the background for scheduled maintenance and similar things; that is separate. And if Drupal's cron fails, it does not affect your queues; this is totally independent, a queue that you create for processing your own items.

So I created a queue here which has eight items in it, and we can also set a limit: for example, I set a 10-second limit on it, so the worker gets 10 seconds of processing time per cron run. If you look at my module structure, there is a Plugin/QueueWorker directory, and inside it I have two queue workers: a node publisher and a manual one. In the node publisher you have an annotation with metadata declaring the cron time of 10 seconds, the title, and the ID. The worker is an implementation of QueueWorkerBase, and it has a processItem() function where you process the particular item. What I'm doing here is: I have a node ID, I load the node, and then I publish it. I created a function that publishes the particular node ID which I have already added to the queue.
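The queue worker just described looks roughly like this; the module name "mymodule" is an assumption, while the annotation carries the 10-second cron time, title, and ID from the demo:

```php
<?php

namespace Drupal\mymodule\Plugin\QueueWorker;

use Drupal\Core\Queue\QueueWorkerBase;
use Drupal\node\Entity\Node;

/**
 * Publishes queued nodes on cron (a sketch of the demo's worker).
 *
 * @QueueWorker(
 *   id = "cron_node_publisher",
 *   title = @Translation("Cron node publisher"),
 *   cron = {"time" = 10}
 * )
 */
class NodePublisher extends QueueWorkerBase {

  /**
   * {@inheritdoc}
   */
  public function processItem($data) {
    // Each queue item is assumed to be an array with a 'nid' key.
    if ($node = Node::load($data['nid'])) {
      $node->setPublished();
      $node->save();
    }
  }

}
```

The `cron = {"time" = 10}` line is what makes cron pick this worker up automatically; remove it and the queue is only processed when you run it yourself.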
If I go and run the cron, now there are zero items, and all the nodes that I added to the queue are published. So this is how you can use a queue worker with cron. Do you have any questions?

Another way, besides cron, is using a form: we can create a form and then process the items in a queue from it. I'm going to add a couple of items; now you can see there are two items inside that particular queue. We are going to process these items with the form, not by running cron. Inside the module I created a form, and I am using the other queue worker that I created, manual_node_publisher. If I go to this URL, it tells me how many items it will process; here it says that on submit this form will process 30 items. Inside the submit function I defined how it works: it gets the queue worker that I created, manual_node_publisher, loops through all the claimed items inside a while loop, and processes each one; once an item is processed, it deletes it from the queue, and if there is any exception it can release that item back to the queue. So basically it just clears the items from here; I can take a look later why.

Do you have any questions so far?

Question: is that job still going to be triggered by cron as well in this example?

No; once you remove the cron metadata from the annotation, it will not get triggered by cron. Although you could have a use case where you might want to run it manually and also have cron on it. You can also use Drush: there are Drush commands you can use to run queues, get the list of all the queues you have, and delete them as well. The Queue UI module is just a more user-friendly interface for the same thing.
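The form-based processing just described can be sketched like this; the class name, module name, and markup are assumptions, but the claim/process/delete/release loop is the pattern from the talk:

```php
<?php

namespace Drupal\mymodule\Form;

use Drupal\Core\Form\FormBase;
use Drupal\Core\Form\FormStateInterface;

/**
 * Processes the manual_node_publisher queue on submit (a sketch).
 */
class ProcessQueueForm extends FormBase {

  public function getFormId() {
    return 'mymodule_process_queue_form';
  }

  public function buildForm(array $form, FormStateInterface $form_state) {
    $count = \Drupal::queue('manual_node_publisher')->numberOfItems();
    $form['info'] = [
      '#markup' => $this->t('On submit, this will process @count items.', ['@count' => $count]),
    ];
    $form['submit'] = ['#type' => 'submit', '#value' => $this->t('Process queue')];
    return $form;
  }

  public function submitForm(array &$form, FormStateInterface $form_state) {
    $queue = \Drupal::queue('manual_node_publisher');
    // The queue worker plugin with the matching ID does the publishing.
    $worker = \Drupal::service('plugin.manager.queue_worker')
      ->createInstance('manual_node_publisher');

    while ($item = $queue->claimItem()) {
      try {
        $worker->processItem($item->data);
        $queue->deleteItem($item);     // processed: remove from the queue
      }
      catch (\Exception $e) {
        $queue->releaseItem($item);    // failed: return it for a later run
        break;
      }
    }
  }

}
```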
That is the custom module I created using the API, so that was the demonstration of how I used queues, with some example queues created for it. You can also go ahead and inspect the items inside a queue, and if they are not required you can release or delete them.

That was just a demonstration, but I can share a real scenario where I used the Queue API and how I used it. This is Hoover's website, and they had a bunch of sections and data on their main page and all the other pages. We had to keep the speed of the website really fast, and on top of that we could not rely on plain caching because they were updating the website content regularly. So we came up with a process where we generated static HTML files twice a day, when traffic was lower, so that they get cached and the pages load fast.

For each page that fetched its content from a view, we created a JSON view paragraph. Each JSON view paragraph has a URL associated with it; for example, this is the events page, and it has a JSON view URL, featured-carousel-events. This view pulls its content from events, like five or ten featured events. What we did is create a queue which sends a request to that URL and creates an HTML file twice a day, so the content gets refreshed twice a day, and then on the front end we simply load that content with JavaScript. How this helped us is that the file was cached and the page speed stayed fast. We created a different queue for each page (right now there are zero items), so that for all the views inside a page, a static HTML file is created and then loaded on the page.
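Inside a queue worker, the static-file generation could look roughly like this; the `$data` keys, file path, and directory handling are assumptions about the real code, not the actual implementation:

```php
<?php

// Sketch of a processItem() body for the static-file queues described
// above: request the paragraph's JSON view URL and save the response as
// a static HTML file that the front end then loads with JavaScript.
// The $data keys ('json_view_url', 'key') are hypothetical names.
public function processItem($data) {
  $response = \Drupal::httpClient()->get($data['json_view_url']);

  // Make sure the target directory exists before writing the file.
  $directory = 'public://static';
  \Drupal::service('file_system')->prepareDirectory(
    $directory,
    \Drupal\Core\File\FileSystemInterface::CREATE_DIRECTORY
  );

  file_put_contents(
    $directory . '/' . $data['key'] . '.html',
    (string) $response->getBody()
  );
}
```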
This is an example of how we process an item: we enter the node ID to add it to the queue, and then we set it to run three times a day so that the content is always up to date; the static HTML files only get regenerated at those times, so the pages always have fresh content. Do you have any questions?

Question: maybe this is a contrived model, but is there a way to chain queues so that they depend on each other?

No, they are independent; each queue has its own items.

Question: would there be any way to trigger one queue from the completion of another, though?

Yes: you could have a queue that runs and then creates queue items for another queue, and that second queue then runs. So results, or some kind of context, can be passed from one queue to the other. In theory you could have chains where data is passed along: something happens, and then the next thing happens in turn, but you do that by creating other queues. When one worker finishes, it says "okay, I'm done with my thing" and creates items for the next step, and then that will run. You can use createItem() for this; that function helps you create an item, and you can have any number of queues. For this project we created a number of queues inside the cron, and we added different content to each.

Question: in practice, it's a fascinating idea to give someone a button to kick off a process like this, but could the items in the queue be so vast that clicking that button brings the system to its knees? How would we manage that?

The good thing is, this happened with us as well, because the
research section alone for Hoover had about one lakh (100,000) items. In that case you can set a time limit, so if a run fails you can just come back to it later; it keeps adding items, and in your queue manager you can still see what items are in there, but it won't stop your system. That was one of the reasons we went this route, so that the website was not slowed down: one of their requirements was to fix the Drupal 7 website, which was timing out again and again. They wanted it done politely, so that it doesn't time out; that is why we went the static-file route.

In theory, instead of trying to process all the items, you could set a limit, because you are just moving through the queue. Say you want to run it overnight and only process a thousand things: you can do that. There are also other techniques, like making your queue items a certain size: instead of each queue item being one thing, you can put a hundred things in each item and then only run so many of those at a time. I have found that sometimes, depending on what you're doing, this can strain memory, but the good news is you can temporarily increase the memory for a run. So let's say you have a Drush command that triggers those runs: you can increase the memory, run the job, and it will go back to what it was afterwards, just in case you need a little extra juice.

Question: what if you have unlimited resources and you want to spin up new images, say here on Acquia Cloud Next, but you don't want those to go away; will the Queue API keep those resources running?

Yeah, we have something similar; that's where we started. We've got this situation where we're unzipping very large files, and what happens on Acquia Cloud Next is that it'll spin up a new pod and it'll
start using that as the resource in the background, but then it doesn't recognize that there's a process happening, because it only recognizes Apache processes, so the binary will fail, and the media entity that's getting updated can't be finished; the whole pod that was processing it goes away. So you want to keep it alive.

I mean, we used this Queue API because cron was timing out a lot, and this particular website has so many research items and events, so we wanted to process each item. Even the Drupal views, when we were calling them on a page, were rendering the page slowly because of processing those items; that's why we went the route of creating static files. And we wanted them to be updated regularly, because the content changes often: new events are coming in, new resources appear on the home page. That's why we went to the Queue API.

Question: I like the queue admin module; did you guys create that? There are other modules, Ultimate Cron and such, that will track a lot of that stuff, but I found them really kind of buggy.

No, Queue UI is a contributed module that I found when I was trying to access the items inside a queue. It's a really good module; it helped a lot, even just to view the items. I was not able to add metadata myself, but it is also possible to add some metadata to each item, so you can know, for instance, whether you want to process a given item. This module also really helped us because we were using the Feeds module: for Hoover there were tens of thousands of feeds, and the Feeds module also uses a queue worker, so you can see all of its items. Right now there are 800 items that it is still not able to process, because there is too much data inside the feeds. So this
also helped in visualizing how much content is still left, and whether we need to run the cron at odd hours just to process those items.

Question: I saw something about the Batch API; were you guys using that?

No, we were not using Batch, because we were accessing each node individually, so we didn't need the Batch API. We created paragraphs, and inside each paragraph we accessed the URL, sent the request, created the static files, and then loaded them on the page. At least two or three sections on each page are rendered that way, through a static HTML file, and we use this on all the pages.

So, this was the first time I was giving this session; thank you for joining. Earlier I said I wanted to give a 20-minute session, but they extended it to 45. I knew it would wrap up in 20 minutes; I just wanted to show the module.