No Code SEO Automations

Episode Summary.

SEO Automation allows you to complete your daily SEO tasks faster. It gives you more time to focus on the creative aspects of optimization.  

This can be very useful if you’re working with a large website. 

Guest Profile.

 

🔨 Name: Lazarina Stoy

🔨 What Lazarina Does: SEO & Data Science Manager at Intrepid Digital

🔨 Company: Intrepid Digital

🔨 Noteworthy: Lazarina creates educational content in the SEO, data science, and analytics niche, as well as resources that can help SEOs and digital analysts be more efficient with their time. Lazarina Stoy is also a freelance SEO Consultant and storyteller.

 

 

Key Insights.

💡You can combine multiple CSV files into one using a simple script in less than a minute

Lazarina says you can combine multiple Excel or CSV exports into a single file in less than a minute. No need for copy-pasting: get all your data in one place.

Copy-paste the code into Google Colab. Run the notebook. Type the directory where your files are located. Get your updated sheet in the same folder.
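The steps above can be sketched in plain Python. This is a simplified, hypothetical stand-in, not the actual script from Lazarina's GitHub (hers is pandas-based and also merges Excel files); the `combine_exports` name and the CSV-only scope are assumptions for illustration:

```python
import csv
import glob
import os

def combine_exports(folder: str, out_name: str = "combined.csv") -> list[dict]:
    """Merge every CSV export in `folder` into one file, adding a
    file_name column so each row stays traceable to its seed keyword.
    Assumes all exports share the same columns."""
    rows = []
    for path in sorted(glob.glob(os.path.join(folder, "*.csv"))):
        stem = os.path.splitext(os.path.basename(path))[0]
        if stem == os.path.splitext(out_name)[0]:
            continue  # skip our own output if the script is re-run
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                row["file_name"] = stem  # e.g. "cat toys", "dog toys"
                rows.append(row)
    if rows:
        out_path = os.path.join(folder, out_name)
        with open(out_path, "w", newline="", encoding="utf-8") as f:
            writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
            writer.writeheader()
            writer.writerows(rows)
    return rows
```

Pointing a function like this at a Downloads subfolder of SEMrush exports reproduces the merge-then-trace-the-seed-keyword workflow walked through in the transcript below.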

 

💡Use GPT-3 to classify keyword context.

GPT-3 is a machine-learning language model that uses deep learning to produce human-like text. The script in Google Sheets can help you classify keywords at scale.
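Under the hood, scripts like this send each keyword to the GPT-3 completions endpoint wrapped in a classification prompt. Here is a minimal sketch of the prompt-building step; the function name and wording are illustrative, not taken from the actual Sheets script:

```python
def build_classification_prompt(keyword: str, categories: list[str]) -> str:
    """Assemble the text sent to the model; it is asked to reply with
    exactly one category label for the given keyword."""
    options = ", ".join(categories)
    return (
        f"Classify the keyword into exactly one of these categories: {options}.\n"
        f"Keyword: {keyword}\n"
        "Category:"
    )
```

In the Sheets version, a custom formula posts a prompt like this to the API with your key and writes the returned label into the cell.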

 

💡You can export a sitemap into Google Sheets.

Lazarina explains that when performing site audits, you can crawl and export a sitemap to Google Sheets using Google Colab.
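The crawl step reduces to downloading the sitemap XML and flattening each `<url>` entry into a row before pushing it to Sheets. A stdlib-only sketch of the parsing half (the function name is assumed; the real notebook also handles fetching the file and the Sheets upload):

```python
import xml.etree.ElementTree as ET

# Standard sitemap namespace from sitemaps.org
SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def parse_sitemap(xml_text: str) -> list[dict]:
    """Return one dict per <url> entry with its loc and lastmod values."""
    root = ET.fromstring(xml_text)
    rows = []
    for url in root.findall("sm:url", SITEMAP_NS):
        rows.append({
            "loc": url.findtext("sm:loc", namespaces=SITEMAP_NS),
            "lastmod": url.findtext("sm:lastmod", namespaces=SITEMAP_NS),
        })
    return rows
```

Each dict then becomes one spreadsheet row, which is what the data frame in the Colab demo holds.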

Episode Summary.

The Importance of Topic Modelling.

We all know how important topic clusters are for SEO. Using topic modelling, you can very quickly identify gaps in the internal linking structure that you currently have on your website. You can also identify topic gaps, because the topics you have identified may not align with the brand's idea of where it needs to go.


You can write meta descriptions automatically.

Lazarina explains that you can automatically generate meta descriptions for pages that don't have one, based on the copy of the page, via text extraction.
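One crude way to do that extraction is to take leading sentences of the page copy until the typical ~155-character snippet budget runs out. This is a naive sketch of the idea, not the actual tooling referenced in the episode:

```python
import re

def draft_meta_description(page_copy: str, max_len: int = 155) -> str:
    """Extractive draft: accumulate whole sentences from the start of the
    page copy without exceeding the max_len character budget."""
    text = re.sub(r"\s+", " ", page_copy).strip()
    sentences = re.split(r"(?<=[.!?])\s+", text)
    out = ""
    for s in sentences:
        if len(out) + len(s) + (1 if out else 0) > max_len:
            break
        out = f"{out} {s}".strip()
    # Fall back to a hard truncation if even the first sentence is too long.
    return out or text[:max_len]
```

A real pipeline would draft these in bulk for every page missing a description, then have a human review them before publishing.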

You can identify existing topics in content.

Lazarina explains that you can crawl and export the content on your website, upload the files to the web app, fine-tune the model's performance, then download the files, explore them, and build them into the deliverable of your internal linking audit.
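The web app itself runs a proper topic model (see the LDA discussion later in the transcript). As a toy illustration of "identify existing topics", here is a frequency-based stand-in; the stopword list and function name are invented for the example:

```python
import re
from collections import Counter

# Minimal illustrative stopword list; the app lets you upload a full one.
STOPWORDS = {"the", "a", "an", "and", "or", "to", "of", "for",
             "in", "on", "is", "how", "do", "your", "with"}

def top_topic_terms(titles: list[str], n: int = 5) -> list[str]:
    """Surface the n most frequent non-stopword terms across page titles,
    a crude proxy for the dominant topics in the content."""
    counts = Counter()
    for title in titles:
        for tok in re.findall(r"[a-z0-9']+", title.lower()):
            if tok not in STOPWORDS:
                counts[tok] += 1
    return [term for term, _ in counts.most_common(n)]
```

This only counts words; a real topic model additionally groups co-occurring terms into distinct topics, which is what makes the gap analysis possible.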

 

Resources.

Free resources.

Connect with Lazarina:

Episode Transcriptions.

Lazarina Stoy 0:03

So, basically, my idea for this webinar was just to go through the different use cases for low code or no code automations, and the things that you can use them for. So I’ve organised a list of kinds of resources. And basically different things that you can do with resources that are already created in the community, or guides and different tools that exist that require very little knowledge of programming and programming languages, but at the same time, can help you automate, kind of ease the way that you do things. And so I will not be going through demonstrations for all of them, just because I think that I will not be able to do it justice in a very short webinar like today’s.

 

But at the same time, I would like to kind of show you different use cases. Some of them I will show you, others I will not, but for all of them, you will see that you have a guide referenced as well, and essentially who the author is, what they suggest this particular tool or script should be used for, and things like that.

 

So I think that there’s definitely everything in this little guide for you to get started. And I hope that the video only adds to that, as opposed to making more noise. So just to kind of start, the first thing that I want to show you is a very quick script that I have created for combining multiple CSVs and Excel files into one. The code is based essentially on a script that I found for doing this task.

 

And I’ve very gently modified it in order to allow for those different things to be done. So when you open the GitHub link, essentially, you can copy-paste the code, and you’re going to see a file like this. Essentially, what it does is merge multiple Excel files; there isn’t really anything that you need to do besides run this file. But before we do that, we need to talk about the different files that you can use with the script, and essentially the different ways that this can be useful in your routine. Now, most of you, I would imagine, are familiar with SEMrush, their platform for keyword research or topic research. And something that is very useful is actually their Keyword Magic Tool.

 

This is something that is very useful when you’re doing keyword research. And here I’ve just entered the keywords so that I could show you what it looks like when you get that. Typically, what you can do is just export that directly; when you export, it’s going to export an XLSX or Excel file, or a CSV. You can choose Excel for this demonstration. And we can also do the same for dog toys. I’ve already kind of exported that in my folder, but essentially the process is the same: you enter your seed keyword, you can see all of these keywords, and if you want, you can remove some of these and kind of filter down using the platform.

 

But I just find that it’s a lot easier to find patterns when you are doing this kind of analysis in Google Sheets, because you can combine it with different scripts as well. So let’s say we’re entering our keywords, we are selecting everything, and we are exporting. So the next thing that we would like to do is essentially go into our Downloads folder. And in our Downloads folder, we can see those two files, dog toys and cat toys. Of course, this is a very small demonstration of keyword research, just for the purposes of this demonstration. But let’s say we want to combine them in the same folder, and type keywords.

 

That should be enough for this demonstration. Typically, what you might find is that you have a lot of different files like this. You might also use the same kind of process when you’re doing topic research as well. There are definitely a lot of programmes in SEMrush, and also in Ahrefs, that allow you to do bulk exports, which you can then use to create a keyword universe like this one. And so after we have our little folder, essentially we need to know what the path is.

 

So we know that this is in our Downloads folder. And the way that we can also see what the path is, is to get the info, and we can see it’s in Users, Lazarina Stoy, Downloads. So what we do now would be to go here, press play for this, and we can see the prompt that we have on the screen: just enter the path to the CSV files in this format. So we type in Users, Lazarina Stoy.

 

Then we have Downloads. And then we have keywords. As far as I can remember, let me double check. Yes, that is the one. And then we press Enter. And what we are going to see is that in this keywords folder, we will have a new file come up, which is an Excel file. And in this Excel file we will essentially have the two files combined into one.

 

Unknown Speaker 5:33

I think that’s just loading

 

Unknown Speaker 5:34

Now bear with me. Okay, so this is how our file would look in Excel when we have that combined. And as we can see here, we have all of our cat toys and dog toys kind of broad match from here. Now, something very cool that we can do here would be to add a seed keyword column. We can just copy-paste from here, add that, change it from file name to seed keyword, and we can remove, here, via Find and Replace.

 

We can remove essentially these things that SEMrush adds to the end of the file name. So the file name that the script keeps is very helpful, because it allows us to see what the seed keyword is that we entered, and how that maps to the keyword that we see in the column that SEMrush returns from their platform. So here we would like to enter broad match, underscore, US, underscore, 22, and I think that is 2022, dash 08, dash 15.

 

Unknown Speaker 6:56

Dot xlsx. Perfect.

 

Lazarina Stoy 7:06

 

And then we replace it with nothing, so Find All, Replace All. I think we have probably made a mistake. I can see where the mistake is: if you see, broad match is with a dash, and we’ve put an underscore.

Okay, we can see that that fixed it; it wasn’t a typo here. And now we do the same for cat toys, but we just replace the dash with a space, and Replace All. Amazing. And now we can see that we have a very good sheet here with the seed keyword, and the other seed keywords from the second sheet. Now, imagine if you’re doing keyword research for a number of different keywords: this is something that could help you very easily automate and create a keyword universe file, which is the first step in getting your insights to the client.

 

Now, the second thing that I want to show you: essentially, we’re going to copy a few of these keywords, and I’m going to paste them in a sheet that is for classifying keywords using GPT-3. So we are taking a little bit from the cat toys spreadsheet.

 

And we are taking a little bit from the dog toys spreadsheet, just to show you what the sheet can do. And now what we are going to do is essentially use a script that has been shared by Danny Richman. Danny is really great. In the guide that I have sent over here, you can see that script as number two: classify keywords using GPT-3.

 

And I have also connected the Google Sheet and also the blog post. And Danny has also been really kind to create a video about how you can use that. So essentially, the first thing that we would like to do is to go into our settings tab and paste, you know, an API key.

 

I would ask you not to use this API key, because it is something that can incur charges. And essentially, we classify the keywords based on prompts; so essentially, we are going to classify whether it is dog leash, or another pet niche. And so essentially, we provide all of this, and we get the prediction. So based on the settings formula, I think Danny mentioned here that we can do the predictions via the GPT API formula.

 

And so we enter the formula here, and we wait for the result. As you can see, this very nicely classifies keywords. From what I can see, the only kind of thing that it does not classify that well are very technical terms, and terms that you don’t really have a lot of information about online.

 

So it really does depend on how useful the script is going to be, based on the kind of niche that you work in, whether it is, you know, applicable to B2B, and whether there’s a lot of information out there or not. So this is kind of the second thing that I wanted to show you. I will close this now. Now, the third thing is exporting a sitemap into Google Sheets.

 

If you bear with me just one moment, I will show you the script for this. This is, again, another script that you can find in my GitHub. Essentially, with this, let’s say you have a very large website, and you want to export all the URLs that are part of the sitemap. This can be very useful if you want to check, for instance, whether all the URLs are contained in the sitemap, if you want to compare that with a crawl that you have. It can also be very useful if you want to check whether all the URLs in the sitemap are indexed or not.

 

And so, from that point of view, the first thing that you want to do is get the URLs from the sitemap. Here, I have done a demo with my own website, and then you open this script in Google Colab. And the only thing that you have to do is essentially come here and run it; it’s also completely no code, like the previous kind of demo that I did. And so this is going to run automatically.

 

Essentially, what it does is it installs a connection between Google Sheets and your machine, the Colab machine. And so it creates a data frame, which is essentially a table, a spreadsheet, with all of those files, and it uploads that spreadsheet into Google Sheets. And here, we can see that we have the sitemap here.

 

And we can see that we are getting a location for the sitemap, so what kind of URL are we looking at; essentially, when was it last modified; is it an image, or is it an image location, so essentially the URL of the image, or not; and then we can see what kind of sitemap we extracted. This can be very, very useful if you’re doing, like I said, kind of large-scale automation.

 

The next thing that I want to show you is how to check whether the URLs from your sitemap have been indexed, and a very simple way to do this via Apps Script. The reason why I wanted to show you that it matters whether you’re using Python code or Apps Script is because both of these can be done with very little code.

 

The only kind of difference is that you essentially have a different mode of accessing this code. But essentially, so many Apps Scripts exist for different use cases that you can use in the sphere of SEO. And for instance, I have found this by simply searching for whether someone has created something like this.

 

So definitely lean on people that have been, you know, developing those kinds of tools when you’re a beginner, because there’s essentially a lot of people that are developing tools like this. And you know, these tools, scripts and apps exist. So you just need to define your use case, find the script and utilise it.

 

So let’s go back to the sheet and see how we can utilise this. So obviously, we go into the Apps Script. I don’t know if you’ve seen that; I might have clicked too early. But essentially, it’s from Extensions, and then Apps Script. You create your project, and you copy-paste the code that I have added there.

 

So the formula that we are going to use is "is page indexed". And essentially, we save the project here, we go back to the sheet, and we add a new column here: indexed, question mark. I’m just going to move this out of the way so it’s a little bit easier to read. And we enter our formula, is page indexed, and we reference the URL.

 

What we’re going to see is that all of this is going to start loading. Essentially, it’s going to check whether this URL exists in Google search results or not, and it’s going to give us an outcome of whether it is indexed or maybe not indexed. I would imagine that all of these are indexed, but you know, if you are working with a spreadsheet, sorry, with a site that is a little bit bigger and you’re unfamiliar with it, it would be nice to check for indexation beforehand. And this is a very quick way to do so.
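The formula described here essentially issues a Google `site:` search for each URL and checks whether any result comes back. The query-building step could look like this in Python (illustration only; the real check lives in Apps Script, and scraping Google results at scale is against their terms, so the function name and approach are a sketch):

```python
from urllib.parse import quote_plus

def site_query_url(page_url: str) -> str:
    """Build the Google search URL for a `site:` query on one page.
    An indexed page returns at least one result; a non-indexed one
    returns none."""
    return "https://www.google.com/search?q=" + quote_plus(f"site:{page_url}")
```

For reliable indexation data at scale, the URL Inspection endpoint of the Google Search Console API is the supported route.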

 

Now, there are also a few different use cases here which I believe I will not be able to do justice to if I explain them as part of this tutorial. So in terms of how to write meta descriptions automatically, I’ve referenced two different resources, one from Andrew Charlton, who has a Google Sheets extension company; he definitely has very polished extensions as part of his Keywords in Sheets plugin.

 

So I definitely recommend checking that. I also have my script here about how to generate them using Word. You can also generate alt text automatically; here I’m referencing another creator’s script for doing that. Definitely very simple to use, and it works on the same principle as the one that I showed you in Google Colab, where essentially you just have to run all of the cells, provided that you give the script an input, and then it will automatically add the alt text and captions to your images.

 

There is also the possibility to identify existing topics in content. And you can do this as part of internal linking; you can do this as part of research. I have a video on YouTube with step-by-step instructions on how to do this, which I think is going to be very helpful for most of you.

 

There’s also a very in-depth guide on internal linking and topic clusters with machine learning, which I think is going to do this kind of demonstration justice. But just to kind of show you very quickly what I’m referring to here, when I say a no-code machine learning app, I will show you on the screen what I mean. So essentially, let’s imagine that we have a website.

 

And here, we have either scraped the content of that website, or we essentially have the titles for this. Maybe it can be titles for YouTube videos; it can be keywords that you want to cluster and do topic modelling on. In my guide, you can see kind of my best practices and suggestions about doing topic modelling. So there are certain things that work better than others. Topic modelling definitely works better on long, long-form text, which is essentially

 

Unknown Speaker 17:47

paragraphs, which is blog posts, which is the entire content of the page, as opposed to just working on the titles. But for the purposes of this demonstration, let’s work on the titles only. What I have done is exported this as a CSV, which are just the titles of all the content that I have on the website, and I have uploaded it here.

 

And I have also downloaded a list of stopwords; you can find this list of stopwords in the linked resource. And I have uploaded it. What I can see immediately is that I’m having a lot of stopwords which are essentially the brand name. But I can remove them from here; I can remove, for instance, "blog", because that’s not really a keyword that adds a lot of value. And then I can choose how many topics I want.

 

So if I want the 10 biggest topics that this dataset has, I can see them on the left-hand side here. And by running a few different iterations, I can see how specific a given word is to a specific topic. And if it’s a one, then that means that this word only exists in one topic. And this is a very, very good way to have a no-code automation that is completely free, by the way.

 

It is a state-of-the-art kind of topic modelling algorithm that has been developed and used for a very long time in the machine learning community. And it’s completely accessible to people that don’t code. And of course, with the downloads, you have different kinds of files that you can download, and you can essentially do analysis based off of this. And we all know how important topics and topic clusters are for SEO.

 

And so, using this kind of topic modelling, you can very quickly identify, first of all, the gaps in the internal linking structure that you have on your website at present. But you can also identify topic gaps, because maybe the topics that you have identified here don’t truly align with the brand’s idea of where it needs to go; they don’t truly align with their desire to pursue new topics.

 

So maybe, for instance, they’re ranking for employee productivity, training, digital adoption, but they would like to rank more for change management. And if your kind of research identifies this as a very small topic, then this is a very good way to kind of start working on this as well. So, as I mentioned, I’m definitely not doing this justice, so I would like to refer you to the guide and the materials that are included in that guide.

 

I also have a template here that you can use to follow the instructions and organise the outputs, the different downloads that you get from this app. And then you can provide that to your clients. And you can see, for instance, how often certain topics relate to one another; you can use this information to provide recommendations about internal linking; you can also use the information to see what URL and what content is being recognised as part of a certain topic. And essentially, I think it’s a very good use of this machine learning app.

 

Now, the next thing that I want to show you is essentially how to identify mismatches in migrations. Again, I have a video on this, a complete guide to fuzzy matching, and there’s also a template that you can use. Essentially, this sheet is going to show you, with practical examples, how you can do string similarity comparisons, internal link opportunity identification, redirect checks, and content duplication checks.
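String similarity of this kind can be sketched with the standard library's difflib; this is an illustration of the fuzzy-matching idea, not the template's actual implementation, and the function name is invented:

```python
from difflib import SequenceMatcher

def best_redirect_target(broken_url: str, live_urls: list[str]) -> tuple[str, float]:
    """Score each live URL against the broken one and return the closest
    candidate together with its similarity ratio (0 to 1)."""
    scored = [(u, SequenceMatcher(None, broken_url, u).ratio()) for u in live_urls]
    return max(scored, key=lambda pair: pair[1])
```

A ratio close to 1 means a near-identical slug; in a migration or backlink audit, you would review anything below a chosen threshold by hand.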

 

All of these can be very useful in your SEO work. You can also use the same script to find the most suitable page to redirect a broken backlink to when you’re doing backlink audits and analysis. And the last thing that I want to refer you to, kind of a no-code process, is getting page-level PageSpeed Insights in Google Sheets. I think this is definitely very good for a beginner-level kind of analysis that you might do.

 

And typically, you might find sometimes that it’s not worth it to go through the effort of doing a crawl and fine-tuning the settings of that crawl tool, especially if you’re working on a very large site. And so having the possibility to audit page-level metrics in Google Sheets is something that I think is very valuable.

 

So I will show you in a moment; you can see in the article what it looks like. Essentially, for each URL, you get the Time to Interactive, First Contentful Paint, the score, and the last time that the report was run. And this is a great tool as well as a script, because you can also schedule reports to run.
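Scripts like this talk to the PageSpeed Insights v5 API: each page is a single GET request whose JSON response carries the Lighthouse metrics mentioned. A sketch of the request construction (the function name is invented, and error handling and quota management are omitted):

```python
from urllib.parse import urlencode

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url: str, api_key: str, strategy: str = "mobile") -> str:
    """Build the PageSpeed Insights v5 request URL for one page; the JSON
    response includes lighthouseResult audits such as Time to Interactive
    and First Contentful Paint."""
    return PSI_ENDPOINT + "?" + urlencode(
        {"url": page_url, "key": api_key, "strategy": strategy}
    )
```

Looping this over a column of URLs (and respecting the API's rate limits) is what gives you the page-level sheet described here.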

 

So this is something that I could, for instance, use to connect to a Data Studio dashboard and have page-level data records in Data Studio, which is something that you typically cannot get if you’re using the Chrome UX connector for Data Studio, which only reports on site-level metrics.

 

So it’s a very holistic overview, especially when you have perhaps a site that is not generating enough visits to be included in the Chrome UX assessment. Or maybe you have a website that is too big, and that assessment is not really relevant to them, because the site is so big and you’re only interested in the performance of a specific section; that kind of report is going to be very, very valuable. And of course, you can use that spreadsheet to do a number of different things and kind of report on different data, benchmark, and things like that. So I hope that you found that useful.

 

There’s definitely a lot more kind of learning to do and testing of all of these scripts. I hope that this is useful for you to kind of learn a little bit more about no-code automation. And you can definitely always reach out to me if you’re struggling with anything as part of this webinar. And definitely feel free to reach out to the authors of these tools.

 

Thank them, you know, for their time in developing these resources, because they’re definitely very cool. Some other people that I would recommend checking out, once you have a little bit more muscle working with tools like this, are Lee Foot,

 

Greg Bernhardt, and Charlie Rector. They create a lot of no-code tools, Streamlit apps that you can use. And they’re very active on Twitter as well, so you can follow them there and kind of start learning a little bit more about how to do no-code or low-code automations as part of your SEO work. And so I hope that this was useful. Thank you so much for watching, and feel free to reach out if there are any questions.

 

Transcribed by https://otter.ai