All right, we're gonna keep things moving along. I'm excited by our next speaker. I've been looking forward to this for a while because I know something you don't know about what he's presenting. David Sottimano lives in Colombia, presenting to us from Colombia today. How cool is that? One of the nicest people in SEO. He's a stand-up guy, 15 years of experience in SEO, and he's making quite a name for himself right now in the technical world of automation. He's got some really cool things to reveal in this next talk. We're gonna be talking about him on Twitter and some other places. Please welcome David Sottimano.

So there was this Russian engineer who apparently automated everything that he did. He lived in a terminal, and when he left the company, his colleagues had a look at his repository. They actually didn't know this, but the local network was connected to the coffee machine. It was awesome. He could actually brew his coffee directly from his terminal. That's the kind of automation I aspire to. We're not talking about that today, though. We're talking about marketing automation.

So here's an example of an average tech SEO workflow. We fire up Screaming Frog, we check the crawl config, we probably have to adjust things, we let the crawl run through, we analyze the site, and we write it up. Pretty simple, standard stuff, right? What would it take to automate even just a part of this — just to get the crawl config right so you understand how the server will respond? Well, you could do something like change your IP, crawl with different user agents, crawl all the different variations. This is really time consuming, and on top of that, it's boring. And this is something that would take a senior technical SEO at least a couple of hours, and they may not actually get it right. So why are we doing this? Why are we doing something that a computer can actually do better? We should be working on the processes and the models at a higher level. We should be able to abstract this work. And this is why no-code and low-code automation workflow tools are so promising and exciting.

Let me show you ElectroNeek, which is pretty cool because it can actually watch your behavior for repetitive tasks. But take a look at the left-hand side here. You see do-while loop, if-then — these are programming concepts. So you can't really just jump into low-code, no-code automation without understanding what you're doing. Now, this kind of industry, from what I gather, is really going towards robotic process automation — that's the hot term. So this is the place to be, the place to watch. But you still have to understand the fundamentals to be able to do all this really cool stuff. So that's what we're gonna walk through today, and I'm gonna show you some really cool automations as well.

Now, a few notes on this presentation. Everything that I'm going to be doing is simple. It's going to be free, because I don't think you should be paying while you're learning. And I'm not gonna automate for you, but I'm gonna show you some of the possibilities, and then hopefully you can either tell somebody to do this for you, or you can have a go and start using this no-code, low-code automation yourself. Do not take any notes — this deck will be online. Just sit back, relax and enjoy.

So let's talk about automation lingo. One of the very first building blocks is HTTP requests. You all know what they do, right? There's two that I wanna show you that are important.
You're gonna see them quite often in automation workflows. So we're gonna do something a little bit fun here and build an app so I can show you what this looks like. Here I've got a new wit.ai application, and what we're gonna be using this for is classifying keywords. This is based on Britney Muller's Whiteboard Friday. So if I go into the utterances and the understanding, and I type in Moz Pro, you'll see — because I've already classified this — that it correctly identifies it as navigational. I'm gonna go ahead and train and validate, just to make sure that confidence level goes up. But what if I had something a little bit different? Here I've got two different readings — informational and navigational — for "is Moz Pro expensive." In my opinion, this would probably be better classified as commercial investigation. If you didn't have this entity, you could easily add it in here. So I'm gonna go ahead and do that, and I'm going to train and validate. How cool is that?

So we're talking about HTTP requests, and the first one you need to know is the GET request. This is exactly what your browser does every time you punch in a URL — we're literally just getting something. Let's go ahead and explore that. Now that we have our wit.ai application, I want to do something a little bit smarter: I want to be able to return the meaning of a sentence without logging into the interface. So here we are at the API documentation, and we can see that this is the right call — "return the meaning of a sentence." And here we've got certain arguments, or parameters you can call them. The q one is really important here because it's required: what do we want to classify, what word do we want to send? If you look on the right-hand side here, we've got our endpoint, which is right here. And then the full endpoint looks a little bit different, because we've got this q, which is our query — here we're asking "how many people between Tuesday and Friday." And then we have this header: we need to send this Authorization header with a Bearer token. Token refers to, in this API documentation, the API key. This is actually pretty easy: if you go into your app in wit.ai, your access token is right on the settings page. And then you can just do this. So what I want to do is classify "301 redirect," and now it's giving me a nice curl command. I'm just gonna copy that, and without me having to set up any kind of environment, I'm gonna paste it in, run it, and see if that works. Hey, look at that — it worked. Now let's make this a little bit better and pipe it through this thing called json_pp, which is just going to pretty-print the actual result. And you can see here that the phrase we wanted to classify was "301 redirect," and it came back as informational, which is pretty good, and at a really high confidence.

So the second HTTP request you need to know is the POST. This is when you're sending information to a website, like a contact form, and that information gets stored in a database somewhere. That's it — it just means putting information somewhere. Let's explore POST requests together. What we wanna do here is add more keywords to the commercial intent in this application. In the MozCon deck, you'll see a link to the Repl.it for a MozCon curl POST example.
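Just so you can see the shape of both calls in one place, here's a rough sketch of the two curl commands from this section. The token, the "301 redirect" query and the entity name "commercial" are placeholders standing in for whatever your own app uses — this isn't the exact command from the deck:

    # GET: ask wit.ai for the meaning of a phrase ("301 redirect" here).
    # $WIT_TOKEN stands in for the access token from your app's settings page.
    curl -s 'https://api.wit.ai/message?v=20200513&q=301%20redirect' \
      -H "Authorization: Bearer $WIT_TOKEN" | json_pp

    # POST: add a keyword plus synonyms to an entity. The entity name in the
    # path ("commercial" here) is whatever you created in your own app.
    curl -s -X POST 'https://api.wit.ai/entities/commercial/keywords' \
      -H "Authorization: Bearer $WIT_TOKEN" \
      -H 'Content-Type: application/json' \
      -d '{"keyword": "buy", "synonyms": ["buy", "purchase"]}'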
And so what we're doing here is we're posting to this endpoint. Again, this weird thing here that you see is essentially a placeholder for the entity that we want to post to — in my case, we're gonna go with commercial. The next two lines basically say we're sending across headers: this API requires an Authorization header, and of course here's my API key, and Content-Type is application/json. And then after that we have the data, which is our JSON object. What we're doing is adding the "buy" keyword and a few synonyms. So just go ahead and click on run here — and that came back with a response. Let's go back to wit.ai. We should be able to refresh the page and see that our new "buy" keyword has been added to the commercial intent, along with its synonyms.

So the most common data format you'll see with APIs is going to be JSON, and it's definitely worth learning — not least because you're already familiar with it through the use of schema in JSON-LD, right? Once you do learn JSON, it's actually gonna open up a world of opportunity and automation for you. If you're gonna use things like scale.com, which is doing things like APIs for driverless cars — and they have text classification — that's gonna be an API: you'll be using GET and POST requests, and you'll be using JSON. So this automatically opens you up to all of that. On top of that, there are other really cool things like the OpenAI beta, which is text generation that may or may not put a lot of jobs at risk. And again: API, JSON, GET and POST requests. So it's pretty cool. Now, this is a placeholder slide and you can look at it offline. There's not a lot to learn about JSON, but there are a few concepts that you need to understand.

Let's go to the next building block of automation workflows. Whenever we wanna do something on a schedule or a timer, the word you wanna use here is cron. So let's go ahead and take all the building blocks we've gone over and build something cool. Here we are at the Search Engine Land homepage, and what I want to do here is make an API out of this homepage that essentially pulls things into a Google Sheet. This is another no-code solution: we're gonna use something called Simple Scraper. I've got this installed, and I'm gonna go ahead and scrape this website. It's gonna ask me for a property up here in the top left, so I'm gonna add a property, call it headline, and then just visually select it, which is fantastic. Then I'm gonna add another property called desc — short for description — and highlight this part. Great, so now we can view our results, and as you can see, it's extracted everything properly. Now I want to save this recipe; I call it SEL, the URL and all that kind of stuff is fine, and I can create the recipe. So what we've done now, in fact, is created an API that we can run whenever we want. The other good thing about Simple Scraper is that it integrates with Google Sheets pretty easily. Once you've authenticated, I can append results to any existing data, or I can replace the previous results. So I'm gonna open the spreadsheet, and I can see there's nothing in here at the moment. So we're gonna go to the API — and this is actually your API, which is really, really cool because it just works. And of course, right here, I am submitting a GET request.
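To give you an idea of the shape of that call — this is a sketch, not the exact URL; Simple Scraper shows you your real one on the recipe's API tab, and the recipe ID and apikey here are placeholders. The run_now parameter is the one we're about to use, and underneath it is the classic cron syntax for the schedule we're about to set up:

    # Rough shape of a Simple Scraper recipe API call (IDs and params are placeholders):
    curl -s "https://simplescraper.io/api/<recipe_id>?apikey=<your_key>&run_now=true"

    # Classic cron syntax for "every day at one o'clock in the morning":
    # 0 1 * * *   (minute 0, hour 1, every day, every month, every weekday)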
And as you can see here, it's formatted as a JSON object. This is good, but it's not actually running the scrape right now. So the next thing we have to do is use this parameter, run_now=true. I'm gonna paste that in here, and I'm gonna run it. On top of that, while that's running, I'm gonna go to cron-job.org. Here I'm gonna create a cron job and call it "SEL scrape," and our URL is a GET request with our API key. And I'm gonna say every day at one o'clock in the morning. I'm not sure why they would update their homepage at one o'clock in the morning, but let's just go with it. So I've created a cron job that's gonna run the API call at one o'clock every morning. You can see here this has been returned. Let's go back to our integration and open our Google Sheet again. And hey, look at that — that's pretty cool.

Webhooks. They look like regular URLs, but they have superpowers. It's kind of like: when something is done, ping this URL and then have it email me. It's kind of hard to explain, because there are different definitions and some people interpret it differently. So why don't we go ahead and build our very own? Let's have a look. Okay, so we're gonna have a little bit of fun and create our own webhook. What I need you to do is this: in the deck, you'll get a link to the Google spreadsheet, and from there just click on "Make a copy." I've already got one up here. The next thing you do is go to Tools > Script editor, and you'll see this code already ready for you. The only thing we need to do to get this running is deploy it as a web app. What's happening here is that we're gonna give it a public URL. It's gonna go through and ask you for some authorization permissions. Don't worry about it, because this is gonna be your code — there's nothing unsafe about it. So just go through the process and allow that to happen. So that's all done. Before I show you what's going on, I want you to look at this function name, because it should kind of give it away: doGet. So we're handling a GET request, and there's a parameter called url. What's going to happen is that we're just going to append whatever value we put into url into our spreadsheet. So let's do this really quickly. There is our URL, and then I'm gonna add the parameter url equals "testing MozCon." Perfect — I get a JSON response back, there's my message, and if we go back — hey, look at that. That's pretty cool.

So let's do one more thing and make this a little bit more useful. Now that I have my project key, which is here — I've still got it pasted — I can take this and just change this bit to the URL parameter. That's fine. And now what I can do is make a little bookmarklet out of this. So let's go here and edit a bookmark. I'm just gonna paste that in there, and that looks pretty good. Let's see if that works. So let's go to Google.com, I'm gonna "save that to sheet," and I'm gonna go back to my sheet — and hey, look, it's done.

It's worth mentioning a popular no-code automation platform like Zapier, because they do cron and webhooks really well. But if you look at the screenshot here, in the second input you'll see "picking off a child key." That's JSON. So you need to understand that kind of stuff.
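To recap the webhook we just built: the bookmarklet and the one-liner below are doing exactly the same thing — a plain GET with a url parameter. The deployment ID is a placeholder for whatever your own "deploy as web app" step gives you:

    # Ping the deployed Apps Script web app. doGet() reads the url parameter,
    # appends its value to the spreadsheet, and answers with a small JSON message.
    # <deployment_id> is a placeholder -- use the URL from your own deployment.
    curl -s "https://script.google.com/macros/s/<deployment_id>/exec?url=https://www.google.com"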
And at this point, you might be thinking: why am I learning this? I'm not gonna be a programmer. Well, even if you don't wanna be a programmer and you wanna do this kind of automation, you're still gonna have to talk to a developer, and developers need specifics. The more specific you can be — because you understand at least some of these parts — the easier it's going to be to explain what you want.

Doom and gloom, okay? I've been around and I've seen lots of different automations, and I'm a little bit worried here, because I can't see machine learning not taking over parts of our jobs in the near future. For example, Distilled's SearchPilot: they're constantly running tests and understanding which tests work — why can that not be automated? Think about how that would potentially work. Most of our sites are heavily templated already, so why wouldn't a machine be able to execute one of these tests — write some new meta descriptions, for example — run it, and see if it got a lift? It's very possible. So the last part is just being able to understand how to use these no-code, low-code automation tools. This is why you need to learn the fundamentals — and you don't need to be a programmer.

So the next building block, which is really popular, is ETL: Extract, Transform and Load. I fell in love with a company called Stitch Data, not just because they have a really generous free tier of five million rows, but because they have a great platform. They have lots of good connectors — Google Analytics, that kind of thing — but I also found one which is probably my favorite, because I can pass it any JSON data I want. Let's go ahead and explore Stitch. Once you've logged in, one of the things you need to do is add a destination. Mine is BigQuery, and I'm all hooked up here. The next thing you need to do is add an integration. We've got a few things running here: the GA test is essentially just pulling from my personal website, and it gets loaded into my BigQuery every night, which is great. This is good, and there are a lot of good integrations, but what I really wanna show you — the magic of Stitch, I think — is the ability to load things into BigQuery through webhooks, which is fantastic. So I'm just gonna name one here called "delete," if I can spell, and save that. I guess I already have one called "delete me," so let's call it "delete me 11." And this webhook URL is all I need, which is great. So I'm gonna continue — and that's it.

So I did have one called MozCon, and I've been loading things into it, and I just wanna show you how easy this is. I've already got this loaded — again, this link is in the deck — and all we're doing here is posting to Stitch. It's gonna automatically create the BigQuery schema and enter these rows. We already have Cyrus in there, we're gonna add Britney Muller, and we're gonna go ahead and run that. If everything worked out, you'll get a response from Stitch saying it's accepted. Now, going into my BigQuery table — I was fooling around earlier, and the schema has already been set by Stitch, which is awesome — I have a few Fred Flintstones and I have Cyrus Shepard, and in a moment or so, I will also have Britney Muller.

Everybody loves browser automation, but it's kind of out of reach for a lot of people because it's complicated. Like, have you ever tried to write Puppeteer scripts? You need to know JavaScript, and even then it's not that easy. So I did find a tool that I think is gonna make everyone's life a lot easier, and I'm excited to show you. So let's have a look.
Here we are at testim.io, which I think is the easiest browser automation tool I've found, period. What I'm gonna do here is create a test, and what I wanna automate is essentially just storing Google Search Console link data into my Google Drive. So I'm gonna create a navigation action — this is already at my Search Console data for my personal blog — and then I'm going to record. And here I can literally just do this; it's recording the test as we speak. I'm gonna click on Links, go to export external links and more sample links, and literally just do that. And that is it. We can see now that this is automatically stored in my Google Drive, so I'm just gonna go back here and stop that. So this has already been set up, and it's a really quick automation. Let's see here: I'm gonna click on this play button, but I'm gonna run locally. The reason I wanna do this is because I'm already logged in on my machine — I don't wanna fumble around with Search Console authentication, because I've got two-factor, that kind of thing. So I can just run locally, and we can see that Testim has taken over the browser. There you go — it's just repeated my entire process. So I can do this once a week or whenever I need to. A really, really simple browser automation.

No talk in 2020 is complete without talking about machine learning, right? And honestly, some of the setup with some of the libraries puts me off, and that's why I use BigML.com. That way I can just focus on the mathematics and the output, rather than the entire setup and figuring out these different libraries. So I'm really excited to show you a feature that I didn't know existed before — I actually had some one-on-one help from BigML to be able to showcase this. And again, they have a generous free plan, so none of this needs to be paid for. Let's go ahead and look at association discovery.

Hopefully this data looks familiar to you: it's actually the Screaming Frog internal_all report, except we're missing the Address column because we don't really need it. We've got a few extra metrics in here, like Moz link metrics — equity external links, Page Authority — and sessions, that kind of thing. But the one I want you to focus on is "crawled during period." Every single one of these rows represents a URL and how many times it was crawled during this log reporting period. So this would be URL A, and it was crawled five times. What I wanna do is try to figure out if I missed anything: are there any reasons why this was crawled more often, or not at all? In comes BigML to the rescue. We're gonna be using something called association discovery, which, as I say, discovers relations between variables in high-dimensional data sets. For me, this part is more important: associations go beyond simple variable correlations, and they also reveal a complex set of rules. This will make sense in a second. Once you've logged into your BigML free account, what you need to do is click on the upload data source and upload your CSV. I've already done this here. Once it's been uploaded, all you need to do is click on the down arrow and choose the one-click dataset. The one-click dataset looks like this: you have a histogram of all the different values. From here, what we wanna do is configure an association. This setting is the maximum number of fields that will contribute to the rules.
So I'm gonna lower this down to two to make this a little bit faster. On top of that, what I wanna do is really focus on the "crawled during period" field — I want the rules to describe what's happening with this specific field. You don't really have to do too much — you can study a lot more on BigML — but from here you can just create an association. That'll take a few minutes, but here's the output. This will already make a lot of sense to most people working in SEO. If we look at something like the consequent where "crawled during period" is less than or equal to 0.36, a rule that describes that is indexability = non-indexable. That makes a lot of sense. And if something is indexable, it generally gets crawled more than 1.82 times. You can look at all these different metrics, and BigML will actually give you a full description of what they mean. The other way to do this is by visualizing it through a graph. Now, this is a little bit messy, so we're gonna add some labels, then we're gonna increase the leverage all the way to the top, so we have a really good idea of what we're looking at. And of course, this makes sense: if it was crawled less than 0.36 times, it's probably because it's non-indexable. Now, this is a data set that you know, but this is really, really useful for getting to grips with a data set that you don't know. In my opinion, this is one of the most powerful use cases for machine learning, and on top of that, BigML makes it incredibly easy with no-code solutions. I encourage you to try this out with any data set you have, to sense-check what you're looking at.

All right, you made it through your vegetables — now it's time for cake. I'm excited about this because you're gonna be able to deploy your own version of something like IFTTT in the cloud within a couple of minutes, and we're gonna be using something called n8n.io and Heroku, which is awesome. So let me show you probably the coolest thing in this presentation. During the course of the research for this presentation, I trialed every automation tool I could. In the end, I settled on n8n.io — short for "nodemation" — because it's free and I see a lot of promise in this tool. What we're gonna be building is essentially this: you enter in URLs, we loop over them, we go to the PageSpeed API, grab Lighthouse metrics, and send them to Stitch, which then stores them in our BigQuery database. To get this working, you're going to need the link in the deck to my GitHub repository, a free account with Heroku, a PageSpeed Insights API key, and of course your Stitch Data account hooked up to BigQuery — and you're gonna need your webhook URL. Let's go ahead and deploy this to Heroku and see just how easy it is. Here I'm gonna name it "mozcon-tests." Down at the bottom, you'll need to change your app name three times, and that's it — we're gonna go ahead and deploy. Okay, so our app is deployed. The default credentials are "user" and "pass" — which you should probably change — and if we log in with those, that's it, we're in. The next part is going back to the repository, where I've got some code already ready for you. You can just open it up in a new window, copy everything, and watch this magic. Bang — I can just paste that in. Now, there are gonna be a few things you'll need to change. Obviously, we don't have your API key.
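For context, under the hood this workflow is really just two HTTP calls chained together. Roughly, they look like this — the PageSpeed endpoint is the real v5 one, while the key, the Stitch webhook URL, and the sample payload are placeholders standing in for your own:

    # 1) GET Lighthouse metrics from the PageSpeed Insights API
    #    (<api_key> is a placeholder for your key).
    curl -s "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=https://www.example.com&key=<api_key>"

    # 2) POST the metrics you pulled out of that response to your Stitch webhook;
    #    Stitch creates the BigQuery schema and loads the row for you.
    #    The URL below is a placeholder -- use the one your Stitch integration gives you.
    curl -s -X POST "https://webhooks.stitchdata.com/v1/clients/<client_id>/token/<token>" \
      -H 'Content-Type: application/json' \
      -d '{"url": "https://www.example.com", "performance_score": 0.93}'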
So within the PageSpeed module here, down at the bottom, I can click on this and replace my PageSpeed API key. Additionally, you'd have to do the same thing with your Stitch module. Here's my working version of this workflow. To start it, all I need to do is make sure the start button is connected, and I can execute the workflow. So what is happening here? I've got my URLs; it's gonna structure the data in a way that I can split it and loop over it; I'm gonna go through the PageSpeed API; I'm going to set the Lighthouse metrics; and then everything is going to get pinged to the webhook for Stitch and placed in my BigQuery instance. Let's say I wanted to add an extra URL. I would go into this module, and let's ping Google.com as well. I can click on this X, and that's already saved for us, which is great. Now, one other thing: I don't really want this start button here — it's not really useful. So I'm gonna add a cron trigger, which will drop straight in, and I can add a cron time. Let's just say every day at two o'clock, and that's it. From here, I'm gonna connect that, I'm gonna activate it, and that's it. Now this workflow will run every single day at two o'clock without me even looking, which is incredible.

A few closing thoughts. Computers can't automate this fully yet, but don't get comfortable. Automation workflows need to have good checks and balances and error reporting, because sometimes you think something executed, but actually you didn't get what you thought you were getting. In terms of programming, I'm not saying you have to learn to program, but if you did enjoy some of the things I showed you here today and you want access to this kind of automation, you don't have to go very far: whatever language you choose, learn to parse JSON and learn how to make HTTP requests, and that'll get you in a really good spot.

A bit of a bonus here: I got asked by Cyrus and Britney to build the Moz API for Sheets, and this is a way to easily get Linkscape data into Sheets without any code. There are a few helpful functions in there as well. You can download it today for free through this link, and you can also check out the post on the Moz blog.

Now, one more thing. This wouldn't be a presentation on automation if I didn't automate something, right? I talked about a lot of things here, and the deck might be hard to follow, but I can email you a full list of instructions. Stick your email in this URL as a GET parameter and click on send — I promise not to spam you, I will just send you the tutorial. A big thank you to every single company that I've mentioned here today; without them, I would never be able to do this stuff.

In terms of your automation journey, you should start by doing little projects, and here are a few ideas to get you started. Don't worry about automating everything that you do — just take small bits at a time as you're learning. That's it from me. Hopefully you enjoy the rest of MozCon. Thank you very much for listening.