Winning the featured snippet for a target keyword means increased traffic to that page, and you can use STAT to achieve those wins. In this week’s episode of Whiteboard Friday, Moz Learning and Development Specialist Zoe Pegler walks you through how you can do so in five easy steps.
Hi. I’m Zoe from Moz’s Learning Team. Today I’m going to be showing you how to use STAT to identify featured snippet opportunities. If you’re not familiar with STAT, it’s a rank tracking tool that’s particularly good at handling large volumes of keyword data.
What’s a featured snippet?
For those of you that might not know what a featured snippet is, it’s one of those answer boxes that appear at the top of a search results page. It’s the result that shows up directly beneath the ads after the search is performed. So, for example, if you did a search for something like “Is coffee good for you,” you’re going to see an answer box saying, “Recent studies found that coffee drinkers are less likely to die from some of the leading causes of death.”
Websites that have URLs ranked in the featured snippet often experience heightened brand visibility and the majority of available traffic from the associated keyword. Where do you start if you want to become a part of that featured snippet box? How do you target those opportunities? Well, the first step here is keyword research.
1. Upload keywords to STAT and filter
You want to discover keywords that you can start monitoring and optimizing for. Ideally, you want to find keywords that you rank on page one for that also have a featured snippet. STAT’s keywords tab is a great place to start with this. In this feature, you can upload a bunch of keywords, and once you’ve allowed some time for the data to gather, you can really dig into which of your keywords are triggering answer boxes and what opportunities there are.
There’s an extremely useful feature in STAT where you can filter a table of keywords to show earned SERP features, and specifically answers. You can filter for specific answer subtypes too. STAT currently parses lists, paragraphs, tables, carousels, and videos.
So you can check out all of these. You should also filter for keywords specifically on page one. So do that. Filter the “Rank” column to show results ranking between one and 10. Once you have found all of those keywords, there’s a really smart, useful way of collecting them all together, and that’s by putting them into a dynamic tag.
2. Create a dynamic tag
This lets you group those keywords together and label them. You could call this tag “featured snippet opportunities,” for example. The magic of putting the keywords into that dynamic tag is that it acts like a smart playlist. These fancy segments automatically populate each day with keywords that match the specific criteria you set for them, making it quick and easy to see which of your keywords are featured snippet opportunities.
Being able to segment keywords into these dynamic tags is a big part of what makes STAT so valuable. Being able to create reports at a granular keyword level is powerful stuff.
3. Check the data set over time
Okay, so what’s the next step? Prioritizing your featured snippet opportunities by the keywords with the highest potential ROI. It’s usually much easier to take a featured snippet, or to steal one, if you’re also on page one.
Taking a look at STAT’s SERP Features tab can help out here. There’s a nifty graph which allows you to see how the answer boxes appearing for your keywords have changed over time. Using this will help you to assess opportunity. You can then start pulling out and comparing some of that data and digging into things like average monthly search volume, current featured snippet URLs, and the featured snippet type.
Is it a paragraph, a list, or a table? Is there any markup? What’s your rank? How does the page look in general? You might want to start investigating which long-tail keywords you could potentially optimize your site for. There are a couple of reports you can pull in STAT which can definitely help you in this research.
4. Set up reports
The People Also Ask report will show you questions and their rank within the box, as well as the URL sourced in each answer. It’s worth taking a look at the Related Searches report as well, to see the related search queries Google offers, which users may also be searching for. Once you’ve identified long-tail keywords you want to track and keep an eye on, you can copy and paste those keywords into Google Keyword Planner or even back into STAT.
That way you can see what the rankings, search volume, and CPC look like. You can use one of those smart dynamic tags in STAT to group and label them again as you start optimizing for the keywords you think will be valuable to your site. Once you’ve identified your opportunities and optimized your site, you’ll want to keep careful watch over your hard work, so monitor it.
5. Set up and monitor alerts
I recommend setting up alerts for this. STAT lets you do this so you’ll be notified any time your ranking goes up or down for your featured snippet target keywords, meaning you’re not going to miss seeing an opportunity. I hope this has been helpful and you’re feeling more prepared to try some of this.
If you already have a STAT subscription and want to get even more familiar with the tool features, think about taking the STAT Fundamentals Certification course. Have a great day, and thank you for watching this edition of Whiteboard Friday.
At Google’s Search On event in October last year, Prabhakar Raghavan explained that 15% of daily queries are ones that have never been searched before. If we take the latest figures from Internet Live Stats, which state 3.5 billion queries are searched every day, that means that 525 million of those queries are brand new.
That is a huge number of opportunities waiting to be identified and worked into strategies, optimization, and content plans. The trouble is, all of the usual keyword research tools are, at best, a month behind with the data they can provide. Even then, the volumes they report need to be taken with a grain of salt – you’re telling me there are only 140 searches per month for “women’s discount designer clothing”? – and if you work in B2B industries, those searches are generally much smaller volumes to begin with.
So, we know there are huge amounts of searches available, with more and more being added every day, but without the data to see volumes, how do we know what we should be working into strategies? And how do we find these opportunities in the first place?
Finding the opportunities
The usual tools we turn to aren’t going to be much use for keywords and topics that haven’t been searched in volume previously. So, we need to get a little creative — both in where we look, and in how we identify the potential of queries in order to start prioritizing and working them into strategies. This means doing things like:
Mining People Also Ask
Scraping autosuggest
Drilling into related keyword themes
Mining People Also Ask
People Also Ask is a great place to start looking for new keywords, and tends to be more up to date than the various tools you would normally use for research. The trap most marketers fall into is looking at this data on a small scale, realizing that (being longer-tail terms) they don’t have much volume, and discounting them from approaches. But when you follow a larger-scale process, you can get much more information about the themes and topics that users are searching for and can start plotting this over time to see emerging topics faster than you would from standard tools.
1. Use SerpAPI to run your keywords through the API call (their demo interface lets you try it for yourself).
2. Export the “related questions” features returned in the API call and map them to overall topics using a spreadsheet.
3. Export the “related search boxes” and map these to overall topics as well.
4. Look for consistent themes in the topics being returned across related questions and searches.
5. Add these overall themes to your preferred research tool to identify additional related opportunities. For example, we can see coffee + health is a consistent topic area, so you can add that as an overall theme to explore further through advanced search parameters and modifiers.
6. Add these as seed terms to your preferred research tool to pull out related queries, using broad match (+coffee health) and phrase match (“coffee health”) modifiers to return more relevant queries.
This then gives you a set of additional “suggested queries” to broaden your search (e.g. coffee benefits) as well as related keyword ideas you can explore further.
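The PAA-mining process above can be sketched in code. This is a minimal illustration, not SerpAPI’s official client: the `related_questions` and `related_searches` keys mirror SerpAPI’s JSON response format, but the sample response and the topic-matching rules are invented for the example.

```python
# Bucket "related questions" and "related search boxes" from a
# SerpAPI-style JSON response into broad topics by keyword matching.
# The topic rules and sample data below are illustrative only.

TOPIC_RULES = {
    "health": ["good for you", "healthy", "benefits"],
    "caffeine": ["caffeine", "strongest"],
}

def bucket(text, rules):
    """Return the first topic whose trigger words appear in the text."""
    lowered = text.lower()
    for topic, triggers in rules.items():
        if any(t in lowered for t in triggers):
            return topic
    return "unmapped"

def extract_topics(serp_json, rules=TOPIC_RULES):
    rows = []
    for q in serp_json.get("related_questions", []):
        rows.append(("question", q["question"], bucket(q["question"], rules)))
    for s in serp_json.get("related_searches", []):
        rows.append(("search", s["query"], bucket(s["query"], rules)))
    return rows

sample = {  # stand-in for a real SerpAPI response dict
    "related_questions": [
        {"question": "Is coffee good for you?"},
        {"question": "Which coffee has the most caffeine?"},
    ],
    "related_searches": [{"query": "coffee health benefits"}],
}

for kind, text, topic in extract_topics(sample):
    print(f"{kind}\t{topic}\t{text}")
```

Run at scale over your whole keyword set, the topic column is what lets you spot the consistent themes described in the steps above.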
This is also a great place to start for identifying differences in search queries by location. If you want to see the different topics people are searching for in the UK vs. the US, for example, SerpAPI allows you to do that at a larger scale.
If you’re looking to do this on a smaller scale, or without the need to set up an API, you can also use this really handy tool from Candour – Also Asked – which pulls out the related questions for a broad topic and allows you to save the data as a .csv or an image for quick review.
Once you’ve identified all of the topics people are searching for, you can start drilling into new keyword opportunities around them and assess how they change over time. Many of these opportunities don’t have swathes of historical data reported in the usual research tools, but we know that people are searching for them and can use them to inform future content topics as well as immediate keyword opportunities.
You can also track these People Also Ask features to identify when your competitors are appearing in them, and get a better idea of how they’re changing their strategies over time and what kind of content and keywords they might also be targeting. At Found, we use our bespoke SERP Real Estate tool to do just that (and much more) so we can spot these opportunities quickly and work them into our approaches.
Scraping autosuggest
This one doesn’t need an API, but you’ll need to be careful with how frequently you use it, so you don’t start triggering the dreaded captchas.
Similar to People Also Ask, you can scrape the autosuggest queries from Google to quickly identify related searches people are entering. This tends to work better on a small scale, just because of the manual process behind it. You can try setting up a crawl with various parameters entered and a custom extraction, but Google will be pretty quick to pick up on what you’re doing.
To scrape autosuggest, you use a URL query string along the lines of suggestqueries.google.com/complete/search?output=toolbar&gl=uk&q=. Okay, it doesn’t look that simple, but it’s essentially a search query that outputs all of the suggested queries for your seed query.
So, if you were to enter “cyber security” after the “q=”, you would get back the suggestions for that term.
This gives you the most common suggested queries for your seed term. Not only is this a goldmine for identifying additional queries, but it can show some of the newer queries that have started trending, as well as information related to those queries that the usual tools won’t provide data for.
For example, if you want to know what people are searching for related to COVID-19, you can’t get that data in Keyword Planner or most tools that utilize the platform, because of the advertising restrictions around it. But if you add it to the suggest queries string, you can see what people are actually searching around it.
This can give you a starting point for new queries to cover without relying on historical volume. And it doesn’t just give you suggestions for broad topics – you can add whatever query you want and see what related suggestions are returned.
If you want to take this to another level, you can change the location settings in the query string, so instead of “gl=uk” you can use “gl=us” and see the suggested queries from the US. This then opens up another opportunity to look for differences in search behavior across different locations, and start identifying differences in the type of content you should be focusing on in different regions — particularly if you’re working on international websites or targeting international audiences.
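Here is a small sketch of that scrape. The endpoint and the XML shape (`suggestion data="…"`) reflect Google’s unofficial suggest service, which is undocumented and can change without notice, so treat both as assumptions; the sample response below is illustrative rather than a live capture.

```python
# Build a suggest-service URL for a seed term and parse the XML it
# returns into a plain list of suggested queries.
import urllib.parse
import xml.etree.ElementTree as ET

def suggest_url(seed, country="uk"):
    """Assemble the (unofficial) Google suggest URL for a seed query."""
    params = urllib.parse.urlencode({"output": "toolbar", "gl": country, "q": seed})
    return "https://suggestqueries.google.com/complete/search?" + params

def parse_suggestions(xml_text):
    """Pull the data attribute out of every <suggestion> element."""
    root = ET.fromstring(xml_text)
    return [s.get("data") for s in root.iter("suggestion")]

print(suggest_url("cyber security"))

# Illustrative stand-in for the XML the service returns:
sample_xml = """<toplevel>
  <CompleteSuggestion><suggestion data="cyber security courses"/></CompleteSuggestion>
  <CompleteSuggestion><suggestion data="cyber security jobs"/></CompleteSuggestion>
</toplevel>"""
print(parse_suggestions(sample_xml))
```

Swapping the `country` argument (e.g. `"us"`) gives you the per-location comparison described above; just keep requests infrequent to avoid captchas.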
Refining topic research
Although the usual tools won’t give you that much information on brand new queries, they can be a goldmine for identifying additional opportunities around a topic. So, if you have mined the PAA feature, scraped autosuggest, and grouped all of your new opportunities into topics and themes, you can enter these identified “topics” as seed terms to most keyword tools.
Google Ads Keyword Planner
Currently in beta, Google Ads now offers a “Refine keywords” feature as part of their Keyword Ideas tool, which is great for identifying keywords related to an overarching topic.
Below is an example of the types of keywords returned for a “coffee” search.
Here we can see the keyword ideas have been grouped into:
Brand or Non-Brand – keywords relating to specific companies
Drink – types of coffee, e.g. espresso, iced coffee, brewed coffee
Product – capsules, pods, instant, ground
Method – e.g. cold brew, French press, drip coffee
These topic groupings are fantastic for finding additional areas to explore. You can either:
Start here with an overarching topic to identify related terms and then go through the PAA/autosuggest identification process.
Start with the PAA/autosuggest identification process and put your new topics into Keyword Planner.
Whichever way you go about it, I’d recommend doing a few runs so you can get as many new ideas as possible. Once you’ve identified the topics, run them through the refine keywords beta to pull out more related topics, then run them through the PAA/autosuggest process to get more topics, and repeat a few times depending how many areas you want to explore or how in-depth you need your research to be.
Google Trends
Trends data is one of the most up-to-date sets you can look at for topics and specific queries. However, it is worth noting that for some topics, it doesn’t hold any data, so you might run into problems with more niche areas.
Using “travel ban” as an example, we can see the trends in searches as well as related topics and specific related queries.
Now, for new opportunities, you aren’t going to find a huge amount of data, but if you’ve grouped your opportunities into overarching topics and themes, you’ll be able to find some additional opportunities from the “Related topics” and “Related queries” sections.
In the example above we see these sections include specific locations and specific mentions of coronavirus – something that Keyword Planner won’t provide data on as you can’t bid on it.
Drilling into the different related topics and queries here will give you a bit more insight into additional areas to explore that you may not have otherwise been able to identify (or validate) through other Google platforms.
Moz Keyword Explorer
The Moz interface is a great starting point for validating keyword opportunities, as well as identifying what’s currently appearing in the SERPs for those terms. For example, a search for “london theatre” returns a full breakdown of suggestions, difficulty, volume, and the current SERP.
From here, you can drill into the keyword suggestions and start grouping them into themes, as well as reviewing the current SERP to see what kind of content is appearing. This is particularly useful when it comes to understanding the intent behind the terms, to make sure you’re looking at the opportunities from the right angle — if a lot more ticket sellers are showing than news and guides, for example, then you want to focus these opportunities on more commercial pages than informational content.
There are a variety of other tools you can use to further refine your keyword topics and identify new related ideas, including the likes of Semrush, Ahrefs, AnswerThePublic, Ubersuggest, and Sistrix, all offering relatively similar methods of refinement.
The key is identifying the opportunities you want to explore further, looking through the PAA and autosuggest queries, grouping them into themes, and then drilling into those themes.
Keyword research is an ever-evolving process, and the ways in which you can find opportunities are always changing, so how do you then start planning these new opportunities into strategies?
Forming a plan
Once you’ve got all of the data, you need to be able to formalize it into a plan to know when to start creating content, when to optimize pages, and when to put them on the back burner for a later date.
A quick (and consistent) way you can easily plot these new opportunities into your existing plans and strategies is to follow this process:
Identify new searches and group them into themes.
Monitor changes in new searches: run the exercise once a month to see how much they change over time.
Plot trends in changes alongside industry developments. Was there an event that changed what people were searching for?
Group the opportunities into actions: create, update, optimize.
Group the opportunities into time-based categories: topical, interest, evergreen, growing, etc.
Plot timeframes around the content pieces. Anything topical gets moved to the top of the list, growing themes can be plotted in around them, interest-based pieces can be slotted in throughout the year, and evergreen pieces can be turned into more hero-style content.
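The scheduling step above boils down to a simple sort, with topical items bubbling to the top of the plan. Here is a minimal sketch; the opportunity data is invented for illustration.

```python
# Order content opportunities by their time-based category:
# topical first, then growing, interest, and evergreen.
PRIORITY = {"topical": 0, "growing": 1, "interest": 2, "evergreen": 3}

opportunities = [  # illustrative rows from a research spreadsheet
    {"theme": "coffee health", "action": "update", "category": "evergreen"},
    {"theme": "travel ban", "action": "create", "category": "topical"},
    {"theme": "home brewing kit", "action": "create", "category": "growing"},
]

plan = sorted(opportunities, key=lambda o: PRIORITY[o["category"]])
for item in plan:
    print(item["category"], item["action"], item["theme"])
```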
Then you end up with a plan that covers:
All of your planned content.
All of your existing content and any updates you might want to make to include the new opportunities.
A revised optimization approach to work in new keywords on existing landing pages.
A revised FAQ structure to answer queries people are searching for (before your competitors do).
Developing themes of content for hubs and category page expansion.
Finding new keyword opportunities is imperative to staying ahead of the competition. New keywords mean new ways of searching, new information your audience needs, and new requirements to meet. With the processes outlined above, you’ll be able to keep on top of these emerging topics to plan your strategies and priorities around them. The world of search will always change, but the needs of your audience — and what they are searching for — should always be at the center of your plans.
We all know how effective link building efforts can be, but it can be an intimidating, frustrating process — and sometimes even a chore. In this popular Whiteboard Friday originally published in 2017, Rand Fishkin builds out a framework you can still use today to streamline and simplify the link building process for you, your teammates, and yes, even your interns.
Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. As you can see, I’m missing my moustache, but never mind. We’ve got tons of important things to get through, and so we’ll leave the facial hair to the inevitable comments.
I want to talk today about how to prioritize your link building efforts and opportunities. I think this comes as a big challenge for many marketers and SEOs because link building can just seem so daunting. So it’s tough to know how to get started, and then it’s tough to know once you’ve gotten into the practice of link building, how do you build up a consistent, useful system to do it? That’s what I want to walk you through today.
Step 1: Tie your goals to the link’s potential value
So first off, step one. What I’m going to ask you to do is tie your SEO goals to the reasons that you’re building links. So you have some reason that you want links. It is almost certainly to accomplish one of these five things. There might be other things on the list too, but it’s almost always one of these areas.
A) Rank higher for keyword X. You’re trying to get links that point to a particular page on your site, that contain a particular anchor text, so that you can rank better for that. Makes total sense. There we go.
B) You want to grow the ranking authority of a particular domain, your website, or maybe a subdomain on your website, or a subfolder of that website. Google does sort of have some separate considerations for different folders and subdomains. So you might be trying to earn links to those different sections to help grow those. Pretty similar to (A), but not necessarily as much of a need to get the direct link to the exact URL.
C) Grow the ranking authority of a specific page, so that it can rank for many keywords, not just one. Pretty similar again, but focused on a single URL rather than a whole domain or section.

D) Growing topical authority. So this is essentially saying, “Hey, around this subject area or keyword area, I know that my website needs some more authority. I’m not very influential in this space yet, at least not from Google’s perspective. If I can get some of these links, I can help to prove to Google and, potentially, to some of these visitors, as well, that I have some subject matter authority in this space.”
E) I want to get some visibility to an amplification-likely or a high-value audience. So this would be things like a lot of social media sites, a lot of submission type sites, places like a Product Hunt or a Reddit, where you’re trying to get in front of an audience, that then might come to your site and be likely to amplify it if they love what they see.
Okay. So these are our goals.
Step 2: Estimate the likelihood that the link target will influence that goal
Second, I’m going to ask you to estimate the likelihood that the link target will pass value to the page or to the section of your site. This relies on a bunch of different judgments.
You can choose whether you want to wrap these all up in sort of a single number that you estimate, maybe like a 0 to 10, where 0 is not at all valuable, and 10 is super, super valuable. Or you could even take a bunch of these metrics and actually use them directly, so things like domain authority, or linking root domains to the URL, or page authority, the content relevance.
You could be asking:
Is this a nofollowed or a followed link?
Is it passing the anchor text that I’m looking for or anchor text that I control or influence at all?
Is it going to send me direct traffic?
If the answers to these are all positive, that’s going to bump that up, and you might say, “Wow, this is high authority. It’s passing great anchor text. It’s sending me good traffic. It’s a followed link. The relevance is high. I’m going to give this a 10.”
Or that might not be the case. This might be low authority. Maybe it is followed, but the relevance is not quite there. You don’t control the anchor text, and so anchor text is just the name of your brand, or it just says “site” or something like that. It’s not going to send much traffic. Maybe that’s more like a three.
Then you’re going to ask a couple of questions about the page that they’re linking to or your website.
Is that the right page on your site? If so, that’s going to bump up this number. If it’s not, it might bring it down a little bit.
Does it have high relevance? If not, you may need to make some modifications or change the link path.
Is there any link risk around this? So if this is a — let’s put it delicately — potentially valuable, but also potentially risky page, you might want to reduce the value in there.
I’ll leave it up to you to determine how much link risk you’re willing to take in your link building profile. Personally, I’m willing to accept none at all.
Step 3: Build a prioritization spreadsheet
Then step three, you build a prioritization spreadsheet that looks something like this. So you have which goal or goals are being accomplished by acquiring this link. You have the target and the page on your site. You’ve got your chance of earning that link. That’s going to be something you estimate, and over time you’ll get better and better at this estimation. Same with the value. We talked about using a number out of 10 over here. You can do that in this column, or you could just take a bunch of these metrics and shove them all into the spreadsheet if you prefer.
Then you have the tactic you’re going to pursue. So this is direct outreach, this one’s submit and hope that it does well, and who it’s assigned to. Maybe it’s only you because you’re the only link builder, or maybe you have a number of people in your organization, or PR people who are going to do outreach, or someone, a founder or an executive who has a connection to some of these folks, and they’re going to do the outreach, whatever the case.
Then you can start to prioritize. You can build that prioritization by doing one of a couple things. You could take some amalgamation of these numbers, so like a high chance of earning and a high estimated value. We’ll do some simple multiplication, and we’ll make that our prioritization. Or you might give different goals. Like you might say, “Hey, you know what? (A) is worth a lot more to me right now than (C). So, therefore, I’m going to rank the ones that are the (A) goal much higher up.” That is a fine way to go about this as well. Then you can sort your spreadsheet in this fashion and go down the list. Start at the top, work your way down, and start checking off links as you get them or don’t get them. That’s a pretty high percentage, I’m doing real well here. But you get the idea.
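That spreadsheet logic can be sketched in a few lines: priority is the chance × value multiplication described above, optionally weighted by how much each goal (A through E) matters to you right now. All of the targets, weights, and numbers here are invented for illustration.

```python
# Score link prospects: priority = chance of earning the link (0-1)
# times estimated value (0-10), scaled by a per-goal weight.
GOAL_WEIGHT = {"A": 2.0, "B": 1.0, "C": 0.5, "D": 1.0, "E": 1.0}

prospects = [  # illustrative rows from a prioritization spreadsheet
    {"target": "news site", "goal": "A", "chance": 0.2, "value": 9},
    {"target": "partner blog", "goal": "C", "chance": 0.9, "value": 4},
    {"target": "industry forum", "goal": "E", "chance": 0.6, "value": 5},
]

for p in prospects:
    p["priority"] = p["chance"] * p["value"] * GOAL_WEIGHT[p["goal"]]

# Work the list top-down, checking prospects off as you go.
for p in sorted(prospects, key=lambda x: x["priority"], reverse=True):
    print(f'{p["priority"]:.1f}  {p["goal"]}  {p["target"]}')
```

The same sheet structure (goal, target, chance, value, tactic, owner) is what makes the process easy to hand to teammates or interns.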
This turns link building from this sort of questionable, frustrating, what should I do next, am I following the right path, into a simple process that not only can you follow, but you can train other people to follow. This is really important, because link building is an essential part of SEO, still a very valuable part of SEO, but it’s also a slog. So, to the degree that you can leverage other help in your organization, hire an intern and help train them up, work with your PR teams and have them understand it, have multiple people in the organization all sharing this spreadsheet, all understanding what needs to be done next, that is a huge help.
I look forward to hearing about your link building prioritization, goals, what you’ve seen work well, what metrics you’ve used. We will see you again next week for another edition of Whiteboard Friday. Take care.
Have you ever wished there were an easy way to see all the top keywords your site is ranking for? How about a competitor’s? What about those times when you’re stumped trying to come up with keywords related to your core topic, or want to know the questions people are asking around your keywords?
There’s plenty of keyword research workflow gold to be uncovered in Keyword Explorer. It’s a tool that can save you a ton of time when it comes to both general keyword research and the nitty-gritty details. And time and again, we hear from folks who are surprised that a tool they use all the time can do [insert cool and helpful thing here] — they had no idea!
Well, let’s remedy that! Starting with today’s post, we’ll be publishing a series of quick videos put together by our own brilliant SEO scientist (and, according to Google, the smartest SEO in the world) Britney Muller. Each one will highlight one super useful workflow to solve a keyword research problem, and most are quick — just under a couple of minutes. Take a gander at the videos or skim the transcripts to find a workflow that catches your eye, and if you’re the type of person who likes to try it out in real time, head to the tool and give it a spin (if you have a Moz Community account like most Moz Blog readers, you already have free access).
1. How to do general keyword research
You can do this a couple of ways. One is just to enter a head keyword term that you want to explore — so maybe that’s “SEO” — and click Search. From here, you can go to Keyword Suggestions, where you can find all sorts of other keywords relevant to the keyword “SEO.”
We have a couple filters available to help you narrow down that search a little bit better. Here, without doing any filtering, you can see all of these keywords, and they’re ranked by relevancy and then search volume. So you do tend to see the higher search volume keywords at the top.
Save keyword suggestions in a list
But you can go through here and click the keywords that you want to save for your list. You can also do some filtering. We could group keywords by low lexical similarity. What this means is it’s basically just going to take somewhat similar keywords and batch them together for you to make it a bit easier.
Here you can see there are 141 keywords grouped under “SEO.” Fifty keywords have fallen under “SEO services,” and so on. This gives you a higher-level, topical awareness of what the keywords look like. If you were to select these groups, you could add these to a list.
When I say add list, I mean you can just save them in a keyword list that you can refer back to time and time again. These lists are amazing, one of my favorite features. What you would basically do is create a new list. I’m just going to call it Test. That adds all of your selected keywords to a list. You can continue adding keywords by different filters.
Filter by which keywords are questions
One of my other favorite things to filter by is “Are questions.” This will give you keywords that are actual questions, and it’s really neat to be able to try to bake these into your content marketing or an FAQ page. Really helpful. You can select all up here. Then I can just add that to that SEO Test list that we already created. I hope this gives you an idea of how to use some of these general filters.
Filter based on closely or broadly related topics and synonyms
You can also filter based on closely related topics, broadly related topics, and synonyms. The “keywords with similar result pages” option is very interesting, too. You can really play around with all of these filters.
Filter by volume
You can also filter by volume. If you’re trying to go after high-volume keywords, you might set a minimum volume filter here. Maybe you’re looking for long-tail keywords instead, and then you’ll look a little bit at the smaller search volume end. These can all help in playing around and discovering more keywords.
Find the keywords a domain currently ranks for
Another thing you can do to expand your keyword research is to enter a domain. You can see that this changed to root domain when I entered moz.com. If you click Search, you’re going to get all of the keywords that the domain currently ranks for, which is really powerful. You can see all of the ranking keywords, add them to a list, and monitor how your website is performing.
Find competitors’ keywords
If you want to get really strategic, you can plug in some of your competitor sites and see what their keywords are. These are all things that you can do to expand your keyword research set. From there, you’re going to hopefully have one or a couple keyword lists that house all of this data for you to better strategically route your SEO strategy.
If you know which related questions are occurring most often, you can create strategic content around them. The opportunities here with these filters and sorts for keyword opportunities are endless.
2. How to discover ranking keywords for a particular domain or an exact page
See all the keywords a particular domain ranks for
This is super easy to do in Keyword Explorer. You just go to the main search bar. Let’s just throw in moz.com for example. I can see all the keywords that currently rank for moz.com.
We’re seeing over 114,000, and we get this really beautiful, high-level overview as to what that looks like. You can see the ranking distribution, and then you can even go into all of those ranking keywords in this tab here, which is really cool.
See all the keywords a specific page ranks for
You can do the same exact thing for a specific page. So let’s take the Beginner’s Guide. This will toggle to Exact Page, and you just click Search. Here we’re going to see that it ranks for 804 keywords. You get to know exactly what those are, what the difficulty is, the monthly search volume.
Keep track of those keywords in a list
You can add these things to a list to keep an eye on. It’s also great to do for competitive pages that appear to be doing very well or popular things occurring in your space. But this is just a quick and easy way to see what root domains or exact pages are currently ranking for.
3. How to quickly find keyword opportunities for a URL or a specific page
Find lower-ranking keywords that could be improved upon
I’m just going to paste in the URL to the Beginner’s Guide to SEO in Keyword Explorer. I’m going to look at all of the ranking keywords for this URL, and what I want to do is I want to sort by rank.
I want to see what’s ranking between 4 and 50 and see which keywords aren’t doing so well that we could improve upon. Right away we’re seeing this huge monthly search volume keyword, “SEO best practices,” and we’re ranking number 4.
It can definitely be improved upon. You can also go ahead and take a look at keywords that you rank for outside of page 1, meaning you rank 11 or beyond for these keywords. These could definitely also be improved upon. You can save these keywords to a list.
You can export them and strategically create content to improve those results.
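The filter-and-sort step above boils down to a few lines once you’ve exported your ranking keyword data. Here’s a minimal sketch in Python, assuming illustrative field names (`keyword`, `rank`, `volume`) for the exported rows:

```python
def keyword_opportunities(rows, min_rank=4, max_rank=50):
    """Filter exported keyword rows down to improvable rankings
    (positions 4-50), sorted so the highest-volume opportunities
    surface first. Field names here are illustrative."""
    hits = [r for r in rows if min_rank <= r["rank"] <= max_rank]
    return sorted(hits, key=lambda r: r["volume"], reverse=True)
```

Passing `min_rank=11` instead would isolate the keywords ranking beyond page 1 mentioned above.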
4. How to check rankings for a set of keywords
Use keyword lists to check rankings for a subset of keywords
This is pretty easy. So let’s say you have a keyword list for your target keywords. Here I’ve got an SEO Test keyword list. I want to see how Moz is ranking for these keywords.
This is where you would just add Check Rankings For and add your URL. I’m just going to put moz.com, check rankings, and I can immediately see how well we’re doing for these specific keywords.
I can filter highest to lowest and vice versa.
5. How to track your keywords
Set up a Campaign
If you don’t already have a list of your keywords that you would like to track, I suggest watching the General Keyword Research video above to help discover some of those keywords. But if you already have the keywords you know that you want to track for a particular site, definitely set up an account with Moz Pro and set up a Campaign.
It walks you through all of the steps to set up a particular Campaign for a URL. If you already have your Campaign set up, for example this is my Moz Campaign and I want to add say a new list of keywords to track, what you can do is you can come into this dashboard view and then go to Rankings.
If you scroll down here, you can add keywords. So let’s say Moz is breaking into the conversion rate optimization space. I can paste in a list of my CRO keywords, and then I can add a label.
Use keyword labels to track progress on topics over time
Now that’s going to append that tag so I can filter by just CRO keywords. Then I’m going to click Add Keywords. This is going to take a little while to start to kick into gear basically.
But once these keywords are added and tracking kicks in, you’ll get to see them historically over time and even see how you stack up against your competitors. It’s a really great way to monitor how you’re doing with keywords, where you’re seeing big drops or gains, and how you can better pivot your strategy to target those things.
Discover anything new or especially useful? Let us know on Twitter or here in the comments, and keep an eye out for more quick and fun keyword research workflow videos in the coming weeks — we’ve got some good stuff coming your way, from finding organic CTR for a keyword to discovering SERP feature opportunities and more.
Image search results used to give you the option to “view image” without having to navigate to the site the image was hosted on.
When the button was introduced in 2013, sites saw a 63% decline in organic traffic from image results, because there was no need to click through when the image could be viewed in full from within the search results.
And then everything changed
In February 2018, Google decided to remove the “view image” button. Now searchers must visit the site hosting that image directly, restoring image results to their former organic search driving power.
According to some recent studies, this change has increased organic image traffic a massive 37%.
Given image results’ return to value, marketers are asking themselves how they can make the most out of this search mechanism.
So what are some new ways we can leverage tools to better understand how to optimize images for ranking?
To explore this, I decided to see if Google’s Vision AI could assist in unearthing hidden information about what matters to image ranking. Specifically, I wondered what Google’s image topic modeling would reveal about the images that rank for individual keyword searches, as well as groups of thematically related keywords aggregated around a specific topic or niche.
Here’s what I did — and what I found.
A deep dive on “hunting gear”
I began by pulling out 10 to 15 top keywords in our niche. For this article, we chose “hunting gear” as a category and pulled high-intent, high-value, high-volume keywords. The keywords we selected were:
Bow hunting gear
Cheap hunting gear
Coyote hunting gear
Dans hunting gear
Deer hunting gear
Discount hunting gear
Duck hunting gear
Hunting rain gear
Sitka hunting gear
Turkey hunting gear
Upland hunting gear
Womens hunting gear
I then pulled the image results for the top 50 ranking positions for each of these keywords, yielding roughly 650 images to give to Google’s image analysis API. I made note of the ranking position of each image in our data (this is important for later).
Learning from labels
The first, and perhaps most actionable, use of the API is labeling images. It utilizes state-of-the-art image recognition models to parse each image and return labels for everything it can identify within that image. Most images had between 4 and 10 identifiable objects. For the “hunting gear” related keywords listed above, this was the distribution of labels:
At a high level, this gives us plenty of information about Google’s understanding of what images that rank for these terms should depict. A few takeaways:
The top ranking images across all of these top keywords have a pretty even distribution across labels.
Clothing, and specifically camouflage, is highly represented, with nearly 5% of all images containing camo-style clothing. Perhaps this seems obvious, but it’s instructive: including images containing camo gear in blog posts related to these hunting keywords likely improves your odds of having one of your images included in top ranking image results.
Outdoor labels are also overrepresented: wildlife, trees, plants, animals, etc. Images of hunters in camo, out in the wild, and with animals near them are disproportionately represented.
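A distribution like the one above falls out of the API’s per-image labels with a small aggregation step. Here’s a sketch, where `annotate` stands in for any labeling callable — e.g. a thin wrapper around the Cloud Vision client’s label detection (client setup and credentials not shown, and assumed):

```python
from collections import Counter

def collect_labels(annotate, image_urls):
    """Run each ranked image URL through a labeling function.
    `annotate` is any callable mapping an image URL to a list of
    label strings (e.g. wrapping a Vision API call)."""
    return {url: annotate(url) for url in image_urls}

def label_distribution(labels_by_url):
    """Aggregate per-image labels into an overall frequency share,
    most common first."""
    counts = Counter()
    for labels in labels_by_url.values():
        counts.update(labels)
    total = sum(counts.values())
    return {label: n / total for label, n in counts.most_common()}
```

Keeping the labeling call behind a plain callable also makes the aggregation easy to test without hitting the API.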
Looking closer at the distribution of labels by keyword category can give us a deeper understanding of how the ranking images differ between similar keywords.
For “turkey hunting gear” and “duck hunting gear,” having birds in your images seems very important, with the other keywords rarely including images with birds.
Easy comparisons are possible with the interactive Tableau dashboards, giving you an “at a glance” understanding of what image distributions look like for an individual keyword vs. any other or all others. Below I highlighted just “duck hunting gear,” and you can see a distribution of the most prevalent labels similar to the other keywords at the top. Hugely overrepresented, however, are “water bird,” “duck,” “bird,” “waders,” “hunting dog,” “hunting decoy,” etc., providing ample ideas for great images to include in the body of your content.
Here we can see that some labels seem preferred for top rankings. For instance:
Clothing-related labels are much more common amongst the best ranking images.
Animal-related labels are less common amongst the best ranking images but more common amongst the lower ranking images.
Guns seem significantly more likely to appear in top ranking images.
By investigating trends in labels across your keywords, you can gain many interesting insights into the images most likely to rank for your particular niche. These insights will be different for any set of keywords, but a close examination of the results will yield more than a few actionable insights.
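One way to surface these preferred-label patterns is to compare label frequency between the best-ranking images and the rest. A sketch, assuming each image record carries its ranking position and labels (field names are illustrative):

```python
from collections import Counter

def label_lift(images, cutoff=10):
    """Compare label frequency between top-ranked images
    (position <= cutoff) and lower-ranked ones. Each image is a dict
    with 'position' (int) and 'labels' (list of strings). Returns
    (label, lift) pairs sorted so labels overrepresented in top
    results come first."""
    top = Counter(l for i in images if i["position"] <= cutoff for l in i["labels"])
    low = Counter(l for i in images if i["position"] > cutoff for l in i["labels"])
    n_top = max(sum(top.values()), 1)  # guard against empty buckets
    n_low = max(sum(low.values()), 1)
    lift = {l: top[l] / n_top - low[l] / n_low for l in set(top) | set(low)}
    return sorted(lift.items(), key=lambda kv: kv[1], reverse=True)
```

Labels at the head of the list (clothing, guns in this niche) are candidates to feature in your own images; labels at the tail appear mostly in lower-ranking results.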
Not surprisingly, there are ways to go even deeper in your analysis with other artificial intelligence APIs. Let’s take a look at how we can further supplement our efforts.
An even deeper analysis for understanding
Deepai.org has an amazing suite of APIs that can be easily accessed to provide additional image labeling capabilities. One such API is “Image Captioning,” which is similar to Google’s image labeling, but instead of providing single labels, it provides descriptive labels, like “the man is holding a gun.”
We ran all of the same images as the Google label detection through this API and got some great additional detail for each image.
Just as with the label analysis, I broke the captions up and analyzed their distributions by keyword and by overall frequency across all of the selected keywords. Then I compared top and bottom ranking images.
A final interesting finding
Google sometimes ranks YouTube video thumbnails in image search results. Below is an example I found in the hunting gear image searches.
It seems likely that at least some of Google’s understanding of why this thumbnail should rank for hunting gear comes from its image label detection. Though other factors, like having “hunting gear” in the title and coming from the NRA (high topical authority) certainly help, the fact that this thumbnail depicts many of the same labels as other top-ranking images must also play a role.
The lesson here is that the right video thumbnail choice can help that thumbnail to rank for competitive terms, so apply your learnings from doing image search result label and caption analysis to your video SEO strategy!
In the case of either video thumbnails or standard images, don’t overlook the ranking potential of the elements featured — it could make a difference in your SERP positions.
Once you’ve identified where the opportunity to nab a featured snippet lies, how do you go about targeting it? Part One of our “Featured Snippet Opportunities” series focused on how to discover places where you may be able to win a snippet, but today we’re focusing on how to actually make changes that’ll help you do that.
For those of you that need a little brush-up, what’s a featured snippet? Let’s say you do a search for something like, “Are pigs smarter than dogs?” You’re going to see an answer box that says, “Pigs outperform three-year old human children on cognitive tests and are smarter than any domestic animal. Animal experts consider them more trainable than cats or dogs.” How cool is that? But you’ll likely see these answer boxes for all sorts of things. So something to sort of keep an eye on. How do you become a part of that featured snippet box? How do you target those opportunities?
Last time, we talked about finding keywords that you rank on page one for that also have a featured snippet. There are a couple ways to do that. We talk about it in the first video. Something I do want to mention, in doing some of that the last couple weeks, is that Ahrefs can help you discover your featured snippet opportunities. I had no idea that was possible. Really cool, go check them out. If you don’t have Ahrefs and maybe you have Moz or SEMrush, don’t worry, you can do the same sort of thing with a Vlookup.
So I know this looks a little crazy for those of you that aren’t familiar. Super easy. It basically allows you to combine two sets of data to show you where some of those opportunities are. So happy to link to some of those resources down below or make a follow-up video on how to do just that.
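That VLOOKUP step is really just a key-based join of two exports — your ranking keywords and your featured-snippet keywords. Here’s a minimal Python sketch of the same idea, with illustrative column names:

```python
def vlookup(left_rows, right_rows, key):
    """Combine two keyword exports the way a spreadsheet VLOOKUP
    would: for each row in the first set, pull in the matching row
    (if any) from the second set by a shared key such as the
    keyword itself."""
    index = {r[key]: r for r in right_rows}
    merged = []
    for row in left_rows:
        match = index.get(row[key], {})
        # keep the left row's key column, merge in the rest
        merged.append({**row, **{k: v for k, v in match.items() if k != key}})
    return merged
```

Rows that end up with both a page-one rank and a snippet flag are your opportunity list.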
All right. So step one is identifying these opportunities. You want to find the keywords that you’re on page one for that also have this answer box. You want to weigh the competitive search volume against qualified traffic. Initially, you might want to just go after search volume. I highly suggest you sort of reconsider and evaluate where might the qualified traffic come from and start to go after those.
From there, you really just want to understand the intent, more so even beyond this table that I have suggested for you. To be totally honest, I’m doing all of this with you. It’s been a struggle, and it’s been fun, but sometimes this isn’t very helpful. Sometimes it is. But a lot of times I’m not even looking at some of this stuff when I’m comparing the current featured snippet page and the page that we currently rank on page one for. I’ll tell you what I mean in a second.
So we have an example of how I’ve been able to already steal one. Hopefully, it helps you. How do you target your keywords that have the featured snippet?
Simplifying and cleaning up your pages does wonders. Google wants to provide a very simple, cohesive, quick answer for searchers and for voice searches. So definitely try to mold the content in a way that’s easy to consume.
Summaries do well. Whether they’re at the top of the page or at the bottom, they tend to do very, very well.
Competitive markup: if you see a current featured snippet that is marked up in a particular way, you can mark your page up similarly to be a little bit more competitive.
Provide unique info
Dig deeper, go that extra mile, provide something else. Provide that value.
How To Target Featured Snippet Examples
What are some examples? So these are just some examples that I personally have been running into and I’ve been working on cleaning up.
Roman numerals. I am trying to target a list result, and the page we currently rank on number one for has Roman numerals. Maybe it’s a big deal, maybe it’s not. I just changed them to numbers to see what’s going to happen. I’ll keep you posted.
Fix broken links. But I’m also just going through our page and cleaning it. We have a lot of older content. I’m fixing broken links. I have the Check My Links tool. It’s a Chrome add-on plugin that I just click and it tells me what’s a 404 or what I might need to update.
Fix spelling or grammatical errors that may have slipped through editors’ eyes. I use Grammarly. I have the free version. It works really well, super easy. I’ve even found some super old posts that have double or triple spacing after a period. It drives me crazy, but I’m cleaning some of that stuff up.
Delete extra markup. You might see some additional breaks, not necessarily like that ampersand, but you know what I mean in WordPress, where there’s that weird little bit of markup for a break in the space — you can clean those out. Extra, empty header markup, feel free to delete too. You’re just cleaning and simplifying and improving your page.
One interesting thing that I’ve come across recently was for the keyword “MozRank.” Our page is beautifully written, perfectly optimized. It has all the things in place to be that featured snippet, but it’s not. That is when I fell back and I started to rely on some of this data. I saw that the current featured snippet page has all these links.
So I started to look into what are some easy backlinks I might be able to grab for that page. I came across Quora that had a question about MozRank, and I noticed that — this is a side tip — you can suggest edits to Quora now, which is amazing. So I suggested a link to our Moz page, and within the notes I said, “Hello, so and so. I found this great resource on MozRank. It completely confirms your wonderful answer. Thank you so much, Britney.”
I don’t know if that’s going to work. I know it’s a nofollow. I hope it can send some qualified traffic. I’ll keep you posted on that. But kind of a fun tip to be aware of.
How we nabbed the “find backlinks” featured snippet
All right. How did I nab the featured snippet “find backlinks”? This surprised me, because I hardly changed much at all, and we were able to steal that featured snippet quite easily. We were sitting in the fourth position, and this was the old post that was in the fourth position. These are the updates I made that are now in the featured snippet.
Clean up the title
So we go from the title “How to Find Your Competitor’s Backlinks Next Level” to “How to Find Backlinks.” I’m just simplifying, cleaning it up.
Clean up the H2s
The first H2, “How to Check the Backlinks of a Site.” Clean it up, “How to Find Backlinks?” That’s it. I don’t change step one. These are all in H3s. I leave them in the H3s. I’m just tweaking text a little bit here and there.
Simplify and clarify your explanations/remove redundancies
I changed “Enter your competitor’s domain URL” — it felt a little redundant — to “Enter your competitor’s URL.” Let’s see. “Export results into CSV,” what kind of results? I changed that to “export backlink data into CSV.” “Compile CSV results from all competitors,” what kind of results? “Compile backlink CSV results from all competitors.”
So you can look through this. All I’m doing is simplifying and adding the word “backlink” to clarify some of it, and we were able to nab that.
Too often, you see SEO analyses and decisions being made without considering the context of the marketing channel mix. Equally often, you see large budgets being poured into paid ads in ways that seem to forget there’s a whole lot to gain from catering to popular search demand.
Both instances can lead to leaky conversion funnels and missed opportunities for long-term traffic flows. This article will show you a case of an SEO context analysis we used to determine the importance and role of SEO.
This analysis was one of our deliverables for a marketing agency client who hired us to inform SEO decisions. We then turned it into a report template for you to get inspired by and duplicate.
The traffic analyzed is for a monetizing blog whose marketing team also happens to be one of the most fun to work for. For the sake of this case study, we’re giving them a spectacular undercover name — “The Broze Fellaz.”
For context, this blog started off with content for the first two years before they launched their flagship product. Now, they sell a catalogue of products highly relevant to their content and, thanks to one of the most entertaining Shark Tank episodes ever aired, they have acquired investments and a highly engaged niche community.
As you’ll see below, organic search is their biggest channel in many ways. Facebook also runs both as organic and paid, and the team spends many an hour inside the platform. Email has elaborate automated flows that strive to leverage subscribers who come from the stellar content on the website. We therefore chose the three — organic search, Facebook, and email — as a combination that would yield a comprehensive analysis with insights we can easily act on.
Ingredients for the SEO analysis
This analysis is a result of a long-term retainer relationship with “The Broze Fellaz” as our ongoing analytics client. A great deal was required in order for data-driven action to happen, but we assure you, it’s all doable.
From the analysis best practice drawer, we used:
2 cups of relevant channels for context and analysis via comparison.
3 cups of different touch points to identify channel roles — bringing in traffic, generating opt-ins, closing sales, etc.
5 heads of open-minded lettuce and readiness to change current status quo, for a team that can execute.
457 oz of focus-on-finding what is going on with organic search, why it is going on, and what we can do about it (otherwise, we’d end up with another scorecard export).
Imperial units used in arbitrary numbers that are hard to imagine and thus feel very large.
1 to 2 heads of your analyst brain, baked into the analysis. You’re not making an automated report — even a HubSpot intern can do that. You’re being a human and you’re analyzing. You’re making human analysis. This helps avoid having your job stolen by a robot.
Full tray of Data Studio visualizations that appeal to the eye.
Sprinkles of benchmarks, for highlighting significance of performance differences.
From the measurement setup and stack toolbox, we used:
Google Analytics with tailored channel definitions, enhanced e-commerce and Search Console integration.
UTM routine for social and email traffic implemented via Google Sheets & UTM.io.
Google Data Studio. This is my favorite visualization tool. Despite its flaws and gaps (as it’s still in beta), I say it is better than its paid counterparts, and it keeps getting better. For data sources, we used the native connectors for Google Analytics and Google Sheets, then Facebook community connectors by Supermetrics.
Keyword Hero. Thanks to semantic algorithms and data aggregation, you are indeed able to see 95 percent of your organic search queries (check out Onpage Hero, too, you’ll be amazed).
Inspiration for my approach comes from Lea Pica, Avinash, the Google Data Studio newsletter, and Chris Penn, along with our dear clients and the questions they have us answer for them.
Ready? Let’s dive in.
Analysis of the client’s SEO on the context of their channel mix
1) Insight: Before the visit
What’s going on and why is it happening?
Organic search traffic volume blows the other channels out of the water. This is normal for sites with quality regular content; yet, the difference is stark considering the active effort that goes into Facebook and email campaigns.
The CTR of organic search is up to par with Facebook. That’s a lot to say when comparing an organic channel to a channel with high level of targeting control.
It looks like email flows are the clear winner in terms of CTR to the website. The site has a highly engaged community of users who return fairly often and advocate passionately, along with a product and content that are incredibly relevant to its users — something few other companies appear to be good at.
A high CTR on search engine results pages often indicates that organic search may support funnel stages beyond just the top.
As well, email flows are sent to a very warm audience — interested users who went through a double opt-in. It is to be expected for this CTR to be high.
What’s been done already?
There’s an active effort and budget allocation being put towards Facebook Ads and email automation. A content plan has been put in place and is being executed diligently.
What we recommend next
Approach SEO in a way as systematic as what you do for Facebook and email flows.
Optimize meta titles and descriptions via testing tools such as Sanity Check. The organic search CTR may become consistently higher than that of Facebook ads.
Run a technical audit and optimize accordingly. Knowing that you haven’t done that in a long time, and seeing how much traffic you get anyway, there’ll be quick, big wins to enjoy.
Results we expect
You can easily increase the organic CTR by at least 5 percent. You could also clean up the technical state of your site in the eyes of crawlers — you’ll then see faster indexing by search engines when you publish new content, and increased impressions for existing content. As a result, you may enjoy a major spike within a month.
2) Insight: Engagement and options during the visit
With over 70 percent of traffic coming to this website from organic search, the metrics in this analysis will be heavily skewed towards organic search. So, comparing the rate for organic search to site-wide is sometimes conclusive, other times not conclusive.
Adjusted bounce rate — via GTM events in the measurement framework used, we do not count a visit as a bounce if the visit lasts 45 seconds or longer. We prefer this approach because such an adjusted bounce rate is much more actionable for content sites. Users who find what they were searching for often read the page they land on for several minutes without clicking to another page. However, this is still a memorable visit for the user. Further, staying on the landing page for a while, or keeping the page open in a browser tab, are both good indicators for distinguishing quality, interested traffic from all traffic.
We included all Facebook traffic here, not just paid. We know from the client’s data that the majority is from paid content; they have a solid UTM routine in place. But due to boosted posts, we’ve experienced big inaccuracies when splitting paid and organic Facebook for the purposes of channel attribution.
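The adjusted-bounce definition above reduces to a simple rule over session data. Here’s a minimal sketch, with illustrative session fields; the 45-second threshold is the one described in the methodology:

```python
def adjusted_bounce_rate(sessions, threshold_seconds=45):
    """Compute an adjusted bounce rate: a single-page session only
    counts as a bounce if it also lasted under the threshold.
    Each session is a dict with 'pageviews' (int) and 'duration'
    (seconds); field names are illustrative."""
    if not sessions:
        return 0.0
    bounces = sum(
        1 for s in sessions
        if s["pageviews"] == 1 and s["duration"] < threshold_seconds
    )
    return bounces / len(sessions)
```

In practice the thresholding happens inside GTM (a timer event fires at 45 seconds and marks the session as engaged), but the resulting metric is the same.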
What’s going on and why is it happening?
It looks like organic search has a bounce rate worse than the email flows — that’s to be expected and not actionable, considering that the emails are only sent to recent visitors who have gone through a double opt-in. What is meaningful, however, is that organic has a better bounce rate than Facebook. It is safe to say that organic search visitors will be more likely to remember the website than the Facebook visitors.
Opt-in rates for Facebook are right above site average, and those for organic search are right below, while organic is bringing in a majority of email opt-ins despite its lower opt-in rate.
Google’s algorithms and the draw of the content on this website are doing better at winning users’ attention than the detailed targeting applied on Facebook. The organic traffic will have a higher likelihood of remembering the website and coming back. Across all of our clients, we find that organic search can be a great retargeting channel, particularly if you consider that the site will come up higher in search results for its recent visitors.
What’s been done already?
The Facebook ad campaigns of “The Broze Fellaz” have been built and optimized for driving content opt-ins. Site content that ranks in organic search is less intentional than that.
Opt-in placements have been tested on some of the biggest organic traffic magnets.
Thorough, creative and consistent content calendars have been in place as a foundation for all channels.
What we recommend next
It’s great to keep using organic search as a way to introduce new users to the site. Now, you can try to be more intentional about using it for driving opt-ins. It’s already serving both of the stages of the funnel.
Test and optimize opt-in placements on more traffic magnets.
Test and optimize opt-in copy for top 10 traffic magnets.
Once your opt-in rates have improved, focus on growing the channel. Add to the content work with a 3-month sprint of an extensive SEO project.
Assign Google Analytics goal values to non-e-commerce actions on your site. The current opt-ins have different roles and levels of importance and there’s also a handful of other actions people can take that lead to marketing results down the road. Analyzing goal values will help you create better flows toward pre-purchase actions.
Facebook campaigns seem to be at a point where you can pour more budget into them and expect proportionate increase in opt-in count.
Results we expect
Growth in your opt-ins from Facebook should be proportionate to increase in budget, with a near-immediate effect. At the same time, it’s fairly realistic to bring the opt-in rate of organic search closer to site average.
3) Insight: Closing the deal
For channel attribution with money involved, you want to make sure that your Google Analytics channel definitions, view filters, and UTM’s are in top shape.
What’s going on and why is it happening?
Transaction rate, as well as per session value, is higher for organic search than it is for Facebook (paid and organic combined).
Organic search contributes to far more last-click revenue than Facebook and email combined. For its relatively low volume of traffic, email flows are outstanding in the volume of revenue they bring in.
Thanks to the integration of Keyword Hero with Google Analytics for this client, we can see that about 30 percent of organic search visits are from branded keywords, which tends to drive the transaction rate up.
So, why is this happening? Most of the product on the site is highly relevant to the information people search for on Google.
Multi-channel reports in Google Analytics also show that people often discover the site in organic search, then come back by typing in the URL or clicking a bookmark. That makes organic a source of conversions where, very often, no other channels are even needed.
We can conclude that Facebook posts and campaigns of this client are built to drive content opt-ins, not e-commerce transactions. Email flows are built specifically to close sales.
What’s been done already?
There is dedicated staff for Facebook campaigns and posts, as well a thorough system dedicated to automated email flows.
A consistent content routine is in place, with experienced staff at the helm. A piece has been published every week for the last few years, with the content calendar filled with ready-to-publish content for the next few months. The community is highly engaged, reading times are high, comment count soaring, and usefulness of content outstanding. This, along with partnerships with influencers, helps “The Broze Fellaz” take up half of the first page on the SERP for several lucrative topics. They’ve been achieving this even without a comprehensive SEO project. Content seems to be king indeed.
Google Shopping has been tried. The campaign looked promising but didn’t yield incremental sales. There’s much more search demand for informational queries than there is for product queries.
What we recommend next
Organic traffic is ready to grow. If there is no budget left, resource allocation should be considered. In paid search, you can often simply increase budgets. Here, with stellar content already performing well, a comprehensive SEO project is begging for your attention. Focus can be put into structure and technical aspects, as well as content that better caters to search demand. Think optimizing the site’s information architecture, interlinking content for cornerstone structure, log analysis, and technical cleanup, meta text testing for CTR gains that would also lead to ranking gains, strategic ranking of long tail topics, intentional growing of the backlink profile.
Three- or six-month intensive sprint of comprehensive SEO work would be appropriate.
Results we expect
Increasing last click revenue from organic search and direct by 25 percent would lead to a gain as high as all of the current revenue from automated email flows. Considering how large the growth has been already, this gain is more than achievable in 3–6 months.
Wrapping it up
The organic search presence of “The Broze Fellaz” should continue to play the number-one role for bringing new people to the site and bringing people back to the site. Doing so supports sales that happen with the contribution of other channels, e.g. email flows. The analysis also points out that organic search is effective at playing the role of the last-click channel for transactions, oftentimes without the help of other channels.
We’ve worked with this client for a few years, and, based on our knowledge of their marketing focus, this analysis points us to a confident conclusion that a dedicated, comprehensive SEO project will lead to high incremental growth.
In drawing analytical conclusions and acting on them, there’s always more than one way to shoe a horse. Let us know what conclusions you would’ve drawn instead. Copy the layout of our SEO Channel Context Comparison analysis template and show us what it helped you do for your SEO efforts — create a similar analysis for a paid or owned channel in your mix. Whether it’s comments below, tweeting our way, or sending a smoke signal, we’ll be all ears. And eyes.
We’ve been talking a lot about search intent this week, and if you’ve been following along, you’re likely already aware of how “search intent” is essential for a robust SEO strategy. If, however, you’ve ever laboured for hours classifying keywords by topic and search intent, only to end up with a ton of data you don’t really know what to do with, then this post is for you.
I’m going to share how to take all that sweet keyword data you’ve categorized, put it into a Power BI dashboard, and start slicing and dicing to uncover a ton of insights — faster than you ever could before.
Building your keyword list
Every great search analysis starts with keyword research and this one is no different. I’m not going to go into excruciating detail about how to build your keyword list. However, I will mention a few of my favorite tools that I’m sure most of you are using already:
Search Query Report — What better place to look first than the search terms already driving clicks and (hopefully) conversions to your site.
Answer The Public — Great for pulling a ton of suggested terms, questions and phrases related to a single search term.
InfiniteSuggest — Like Answer The Public, but faster and allows you to build based on a continuous list of seed keywords.
MergeWords — Quickly expand your keywords by adding modifiers upon modifiers.
Grep Words — A suite of keyword tools for expanding keywords, pulling search volume, and more.
Please note that these tools are a great way to scale your keyword collecting, but each will require you to comb through and clean your data to ensure all keywords are at least somewhat relevant to your business and audience.
Once I have an initial keyword list built, I’ll upload it to STAT and let it run for a couple days to get an initial data pull. This allows me to pull the ‘People Also Ask’ and ‘Related Searches’ reports in STAT to further build out my keyword list. All in all, I’m aiming to get to at least 5,000 keywords, but the more the merrier.
For the purposes of this blog post I have about 19,000 keywords I collected for a client in the window treatments space.
Categorizing your keywords by topic
Bucketing keywords into categories is an age-old challenge for most digital marketers but it’s a critical step in understanding the distribution of your data. One of the best ways to segment your keywords is by shared words. If you’re short on AI and machine learning capabilities, look no further than a trusty Ngram analyzer. I love to use this Ngram Tool from guidetodatamining.com — it ain’t much to look at, but it’s fast and trustworthy.
After dropping my 19,000 keywords into the tool and analyzing by unigram (or 1-word phrases), I manually select categories that fit with my client’s business and audience. I also make sure the unigram accounts for a decent amount of keywords (e.g. I wouldn’t pick a unigram that has a count of only 2 keywords).
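As a rough illustration of what an ngram tool is doing under the hood, here is a minimal Python sketch that counts how many keywords contain each unigram; the sample keywords are made up for illustration:

```python
# Minimal unigram analysis sketch: count how many keywords contain each
# word, then review the top words as candidate category "trigger words".
from collections import Counter

def unigram_counts(keywords):
    counts = Counter()
    for kw in keywords:
        # Count each word once per keyword, so the count reflects
        # "number of keywords containing this word".
        counts.update(set(kw.lower().split()))
    return counts

# Sample keywords for illustration only
keywords = [
    "blackout curtains for bedroom",
    "white curtains 84 inch",
    "how to hang drapes",
    "living room blinds ideas",
]
print(unigram_counts(keywords).most_common(3))
```

In practice you would feed in the full keyword list and eyeball the top unigrams, discarding stop words and low-count terms just as described above.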
Using this data, I then create a Category Mapping table and map a unigram, or “trigger word”, to a Category like the following:
You’ll notice that for “curtain” and “drapes” I mapped both to the Curtains category. For my client’s business, they treat these as the same product, and doing this allows me to account for variations in keywords but ultimately group them how I want for this analysis.
Using this method, I create a Trigger Word-Category mapping based on my entire dataset. It’s possible that not every keyword will fall into a category and that’s okay — it likely means that keyword is not relevant or significant enough to be accounted for.
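The mapping step itself reduces to a simple lookup. This sketch assumes a hypothetical trigger-word table like the one described above; your mapping comes from your own unigram review:

```python
# Hypothetical Trigger Word -> Category mapping for illustration
CATEGORY_MAP = {
    "curtain": "Curtains",
    "curtains": "Curtains",
    "drapes": "Curtains",  # client treats drapes and curtains as one product
    "blinds": "Blinds",
    "shades": "Shades",
}

def categorize(keyword):
    """Return the first matching category, or None if no trigger word hits."""
    for word in keyword.lower().split():
        if word in CATEGORY_MAP:
            return CATEGORY_MAP[word]
    return None  # uncategorized keywords drop out of the analysis

print(categorize("how to hang drapes"))  # Curtains
print(categorize("best coffee maker"))   # None
```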
Creating a keyword intent map
Just as I identified common topics by which to group keywords, I’m going to follow the same process, but this time with the goal of grouping keywords by intent modifier.
Search intent is the end goal of a person using a search engine. Digital marketers can leverage these terms and modifiers to infer what types of results or actions a consumer is aiming for.
For example, if a person searches for “white blinds near me”, it is safe to infer that this person is looking to buy white blinds, as they are looking for a physical location that sells them. In this case, I would classify “near me” as a “Transactional” modifier. If, however, the person searched “living room blinds ideas”, I would infer their intent is to see images or read blog posts on the topic of living room blinds. I might classify this search term as being at the “Inspirational” stage, where a person is still deciding what products they might be interested in and, therefore, isn’t quite ready to buy yet.
There is a lot of research on generally accepted intent modifiers in search, and I don’t intend to reinvent the wheel. This handy guide (originally published in STAT) provides a good review of intent modifiers you can start with.
I followed the same process as building out categories to build out my intent mapping and the result is a table of intent triggers and their corresponding Intent stage.
Note: it’s not about the tool necessarily (although Power BI is a super powerful one). It’s more about being able to look at all of this data in one place and pull insights from it at speeds which Excel just won’t give you. If you’re still skeptical of trying a new tool like Power BI at the end of this post, I urge you to get the free download from Microsoft and give it a try.
Setting up your data in Power BI
Power BI’s power comes from linking multiple datasets together based on common “keys.” Think back to your Microsoft Access days and this should all start to sound familiar.
Step 1: Upload your data sources
First, open Power BI and you’ll see a button called “Get Data” in the top ribbon. Click that and then select the data format you want to upload. All of my data for this analysis is in CSV format, so I will select the Text/CSV option for each data source. Follow these steps for every data source, clicking “Load” each time.
Step 2: Clean your data
In the Power BI ribbon menu, click the button called “Edit Queries.” This will open the Query Editor where we will make all of our data transformations.
The main things you’ll want to do in the Query Editor are the following:
Make sure all data formats make sense (e.g. keywords are formatted as text, numbers are formatted as decimals or whole numbers).
Rename columns as needed.
Create a domain column in your Top 20 report based on the URL column.
Close and apply your changes by hitting the “Close & Apply” button.
Step 3: Create relationships between data sources
On the left side of Power BI is a vertical bar with icons for different views. Click the third one to see your relationships view.
In this view, we are going to connect all data sources to our ‘Keywords Bridge’ table by clicking and dragging a line from the field ‘Keyword’ in each table to ‘Keyword’ in the ‘Keywords Bridge’ table (note that for the PPC Data, I have connected ‘Search Term’, as this is the PPC equivalent of a keyword as we’re using it here).
The last thing we need to do for our relationships is double-click on each line to ensure the following options are selected for each so that our dashboard works properly:
The cardinality is Many to 1
The relationship is “active”
The cross filter direction is set to “both”
We are now ready to start building our Intent Dashboard and analyzing our data.
Building the search intent dashboard
In this section I’ll walk you through each visual in the Search Intent Dashboard (as seen below):
Top domains by count of keywords
Visual type: Stacked Bar Chart
Axis: I’ve nested URL under Domain so I can drill down to see this same breakdown by URL for a specific Domain
Value: Distinct count of keywords
Legend: Result Types
Filter: Top 10 filter on Domains by count of distinct keywords
Keyword breakdown by result type
Visual type: Donut chart
Legend: Result Types
Value: Count of distinct keywords, shown as Percent of grand total
Sum of Distinct MSV
Because the Top 20 report shows each keyword 20 times, we need to create a calculated measure in Power BI to only sum MSV for the unique list of keywords. Use this formula for that calculated measure:
Sum Distinct MSV = SUMX(DISTINCT('Table'[Keywords]), FIRSTNONBLANK('Table'[MSV], 0))
Keyword count card
This is just a distinct count of keywords.
Slicer: PPC Conversions
Visual type: Slicer
Drop your PPC Conversions field into a slicer and set the format to “Between” to get this nifty slider visual.
Category and Intent Stage tables
Visual type: Table or Matrix (a matrix allows for drilling down, similar to a pivot table in Excel)
Values: Here I have Category or Intent Stage and then the distinct count of keywords.
Pulling insights from your search intent dashboard
This dashboard is now a Swiss Army knife of data that allows you to slice and dice to your heart’s content. Below are a couple examples of how I use this dashboard to pull out opportunities and insights for my clients.
Where are competitors winning?
With this data we can quickly see who the top competing domains are, but what’s more valuable is seeing who the competitors are for a particular intent stage and category.
I start by filtering to the “Informational” stage, since it represents the most keywords in our dataset. I also filter to the top category for this intent stage which is “Blinds”. Looking at my Keyword Count card, I can now see that I’m looking at a subset of 641 keywords.
Note: To filter multiple visuals in Power BI, you need to press and hold the “Ctrl” button each time you click a new visual to maintain all the filters you clicked previously.
The top competing subdomain here is videos.blinds.com with visibility in the top 20 for over 250 keywords, most of which are for video results. I hit ctrl+click on the Video results portion of videos.blinds.com to update the keywords table to only keywords where videos.blinds.com is ranking in the top 20 with a video result.
From all this, I can now say that videos.blinds.com is ranking in the top 20 positions for about 30 percent of keywords that fall into the “Blinds” category and the “Informational” intent stage. I can also see that most of the keywords here start with “how to”, which tells me that people searching for blinds in an informational stage are most likely looking for how-to instructions, and that video may be a desired content format.
Where should I focus my time?
Whether you’re in-house or at an agency, time is always a hot commodity. You can use this dashboard to quickly identify opportunities that you should be prioritizing first — opportunities that can guarantee you’ll deliver bottom-line results.
To find these bottom-line results, we’re going to filter our data using the PPC conversions slicer so that our data only includes keywords that have converted at least once in our PPC campaigns.
Once I do that, I can see I’m working with a pretty limited set of keywords that have been bucketed into intent stages, but I can continue by drilling into the “Transactional” intent stage because I want to target queries that are linked to a possible purchase.
Note: Not every keyword will fall into an intent stage if it doesn’t meet the criteria we set. These keywords will still appear in the data, but this is the reason why your total keyword count might not always match the total keyword count in the intent stages or category tables.
From there I want to focus on those “Transactional” keywords that are triggering answer boxes to make sure I have good visibility, since they are converting for me on PPC. To do that, I filter to only show keywords triggering answer boxes. Based on these filters I can look at my keyword table and see most (if not all) of the keywords are “installation” keywords and I don’t see my client’s domain in the top list of competitors. This is now an area of focus for me to start driving organic conversions.
I’ve only just scratched the surface — there’s tons that can be done with this data inside a tool like Power BI. Having a solid data set of keywords and visuals that I can revisit repeatedly for a client and continuously pull out opportunities to help fuel our strategy is, for me, invaluable. I can work efficiently without having to go back to keyword tools whenever I need an idea. Hopefully you find this makes building an intent-based strategy more efficient and sound for your business or clients.
I use web crawlers on a daily basis. While they are very useful, they only imitate search engine crawlers’ behavior, which means you aren’t always getting the full picture.
The only source that can give you a real overview of how search engines crawl your site is your log files. Despite this, many people are still obsessed with crawl budget — the number of URLs Googlebot can and wants to crawl.
Log file analysis may discover URLs on your site that you had no idea about but that search engines are crawling anyway — a major waste of Google server resources (Google Webmaster Blog):
“Wasting server resources on pages like these will drain crawl activity from pages that do actually have value, which may cause a significant delay in discovering great content on a site.”
While it’s a fascinating topic, the fact is that most sites don’t need to worry that much about crawl budget — an observation shared by John Mueller (Webmaster Trends Analyst at Google) quite a few times already.
There’s still a huge value in analyzing logs produced from those crawls, though. It will show what pages Google is crawling and if anything needs to be fixed.
When you know exactly what your log files are telling you, you’ll gain valuable insights about how Google crawls and views your site, which means you can optimize for this data to increase traffic. And the bigger the site, the greater the impact fixing these issues will have.
What are server logs?
A log file is a recording of everything that goes in and out of a server. Think of it as a ledger of requests made by crawlers and real users. You can see exactly what resources Google is crawling on your site.
You can also see what errors need your attention. For instance, one of the issues we uncovered with our analysis was that our CMS created two URLs for each page and Google discovered both. This led to duplicate content issues because two URLs with the same content were competing against each other.
Analyzing logs is not rocket science — the logic is the same as when working with tables in Excel or Google Sheets. The hardest part is getting access to them — exporting and filtering that data.
Looking at a log file for the first time may also feel somewhat daunting because when you open one, you see something like this:
Calm down and take a closer look at a single line:
[08/Dec/2017:04:54:20 -0400] is the Timestamp (when)
GET is the Method
/contact/ is the Requested URL (what)
200 is the Status Code (result)
11179 is the Bytes Transferred (size)
“-” is the Referrer URL (source) — it’s empty because this request was made by a crawler
Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html) is the User Agent (signature) — this is the user agent of Googlebot (Desktop)
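To make the breakdown concrete, here is a minimal Python sketch that parses a line in this combined-log style into the fields above; log formats vary by server, so the regex would need adjusting for yours:

```python
# Parse one combined-log-style line into named fields.
import re

LINE_RE = re.compile(
    r'\[(?P<timestamp>[^\]]+)\]\s+'              # when
    r'"(?P<method>\w+)\s+(?P<url>\S+)[^"]*"\s+'  # method + requested URL
    r'(?P<status>\d{3})\s+'                      # status code (result)
    r'(?P<bytes>\d+|-)\s+'                       # bytes transferred (size)
    r'"(?P<referrer>[^"]*)"\s+'                  # referrer ("-" for crawlers)
    r'"(?P<user_agent>[^"]*)"'                   # user agent (signature)
)

line = ('[08/Dec/2017:04:54:20 -0400] "GET /contact/ HTTP/1.1" 200 11179 '
        '"-" "Mozilla/5.0 (compatible; Googlebot/2.1; '
        '+http://www.google.com/bot.html)"')

match = LINE_RE.search(line)
print(match.group("url"), match.group("status"))  # /contact/ 200
```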
Once you know what each line is composed of, it’s not so scary. It’s just a lot of information. But that’s where the next step comes in handy.
Tools you can use
There are many tools you can choose from that will help you analyze your log files. I won’t give you a full run-down of available ones, but it’s important to know the difference between static and real-time tools.
Static — This only analyzes a static file. You can’t extend the time frame. Want to analyze another period? You need to request a new log file. My favourite tool for analyzing static log files is Power BI.
Real-time — Gives you direct access to logs. I really like the open source ELK Stack (Elasticsearch, Logstash, and Kibana). It takes a moderate effort to implement, but once the stack is ready, it allows me to change the time frame based on my needs without contacting our developers.
Don’t just dive into logs with a hope to find something — start asking questions. If you don’t formulate your questions at the beginning, you will end up in a rabbit hole with no direction and no real insights.
Here are a few samples of questions I use at the start of my analysis:
Which search engines crawl my website?
Which URLs are crawled most often?
Which content types are crawled most often?
Which status codes are returned?
If you see that Google is crawling non-existent pages (404), you can start asking which of those requested URLs return a 404 status code.
Order the list by the number of requests, evaluate the ones with the highest number to find the pages with the highest priority (the more requests, the higher priority), and consider whether to redirect that URL or do any other action.
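That triage can be sketched in a few lines of Python, assuming (url, status) pairs already extracted from the logs; the URLs here are sample data:

```python
# Count requests per URL that returned 404, then sort so the
# most-requested (highest-priority) URLs come first.
from collections import Counter

# (url, status) pairs as they would come out of a log parser; sample data
requests = [
    ("/old-page/", 404),
    ("/old-page/", 404),
    ("/contact/", 200),
    ("/retired-product/", 404),
]

not_found = Counter(url for url, status in requests if status == 404)
for url, hits in not_found.most_common():
    print(hits, url)  # highest request count first = highest priority
```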
If you use a CDN or cache server, you need to get that data as well to get the full picture.
Segment your data
Grouping data into segments provides aggregate numbers that give you the big picture. This makes it easier to spot trends you might have missed by looking only at individual URLs. You can locate problematic sections and drill down if needed.
There are various ways to group URLs:
Group by content type (single product pages vs. category pages)
Group by language (English pages vs. French pages)
Group by storefront (Canadian store vs. US store)
Group by file format (JS vs. images vs. CSS)
Don’t forget to slice your data by user-agent. Looking at Google Desktop, Google Smartphone, and Bing all together won’t surface any useful insights.
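A minimal sketch of this kind of segmentation, using made-up (user agent, URL) rows; the grouping rules are illustrative, not a fixed taxonomy:

```python
# Aggregate request counts by user agent and by file format,
# instead of eyeballing individual URLs.
from collections import Counter

rows = [  # (user_agent_group, url) pairs; sample data
    ("Googlebot Smartphone", "/product/blinds.html"),
    ("Googlebot Desktop", "/assets/app.js"),
    ("Googlebot Smartphone", "/assets/style.css"),
    ("bingbot", "/product/blinds.html"),
]

def file_format(url):
    # Crude extension-based bucketing for illustration
    ext = url.rsplit(".", 1)[-1] if "." in url else "html"
    return {"js": "JS", "css": "CSS", "png": "images", "jpg": "images"}.get(ext, "pages")

by_agent = Counter(agent for agent, _ in rows)
by_format = Counter(file_format(url) for _, url in rows)
print(by_agent)
print(by_format)
```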
Monitor behavior changes over time
Your site changes over time, which means so will crawlers’ behavior. Googlebot often decreases or increases the crawl rate based on factors such as a page’s speed, internal link structure, and the existence of crawl traps.
It’s a good idea to check in with your log files throughout the year or when executing website changes. I look at logs almost on a weekly basis when releasing significant changes for large websites.
By analyzing server logs twice a year, at the very least, you’ll surface changes in crawler’s behavior.
Watch for spoofing
Spambots and scrapers don’t like being blocked, so they may fake their identity — they leverage Googlebot’s user agent to avoid spam filters.
To verify if a web crawler accessing your server really is Googlebot, you can run a reverse DNS lookup and then a forward DNS lookup. More on this topic can be found in Google Webmaster Help Center.
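A sketch of that two-step check in Python; the lookup functions are injectable so the logic can be exercised without live DNS, and the defaults use the standard library’s socket module:

```python
# Reverse-then-forward DNS verification of a claimed Googlebot IP.
import socket

def is_real_googlebot(ip,
                      reverse=lambda ip: socket.gethostbyaddr(ip)[0],
                      forward=socket.gethostbyname):
    try:
        host = reverse(ip)  # step 1: reverse DNS lookup on the client IP
    except OSError:
        return False
    # Genuine Googlebot hosts resolve under googlebot.com or google.com
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        return forward(host) == ip  # step 2: forward lookup must match the IP
    except OSError:
        return False
```

A spoofed request fails either the hostname check or the forward lookup, since the scraper doesn’t control Google’s DNS records.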
Merge logs with other data sources
While it’s not necessary to connect to other data sources, doing so will unlock another level of insight and context that regular log analysis might not be able to give you. The ability to easily connect multiple datasets and extract insights from them is the main reason why Power BI is my tool of choice, but you can use any tool that you’re familiar with (e.g. Tableau).
Blend server logs with multiple other sources such as Google Analytics data, keyword ranking, sitemaps, crawl data, and start asking questions like:
What pages are not included in the sitemap.xml but are crawled extensively?
What pages are included in the sitemap.xml file but are not crawled?
Are revenue-driving pages crawled often?
Is the majority of crawled pages indexable?
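The first two questions reduce to set differences once you have the two URL lists; the URLs below are sample data:

```python
# Compare URLs Googlebot crawled (from logs) with URLs in sitemap.xml.
crawled = {"/blinds/", "/curtains/", "/old-campaign/", "/cart/"}  # sample
in_sitemap = {"/blinds/", "/curtains/", "/shades/"}               # sample

crawled_not_in_sitemap = crawled - in_sitemap  # crawl-waste candidates
in_sitemap_not_crawled = in_sitemap - crawled  # possible discovery problems

print(sorted(crawled_not_in_sitemap))  # ['/cart/', '/old-campaign/']
print(sorted(in_sitemap_not_crawled))  # ['/shades/']
```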
You may be surprised by the insights you’ll uncover that can help strengthen your SEO strategy. For instance, discovering that almost 70 percent of Googlebot requests are for pages that are not indexable is an insight you can act on.
Don’t think of server logs as just another SEO tool. Logs are also an invaluable source of information that can help pinpoint technical errors before they become a larger problem.
Last year, Google Analytics reported a drop in organic traffic for our branded search queries. But our keyword tracking tool, STAT Search Analytics, and other tools showed no movement that would have warranted the drop. So, what was going on?
Server logs helped us understand the situation: There was no real drop in traffic. It was our newly deployed WAF (Web Application Firewall) that was overriding the referrer, which caused some organic traffic to be incorrectly classified as direct traffic in Google Analytics.
Using log files in conjunction with keyword tracking in STAT helped us uncover the whole story and diagnose this issue quickly.
Putting it all together
Log analysis is a must-do, especially once you start working with large websites.
My advice is to start with segmenting data and monitoring changes over time. Once you feel ready, explore the possibilities of blending logs with your crawl data or Google Analytics. That’s where great insights are hidden.
Ready to learn how to get cracking and tracking some more? Reach out and request a demo to get your very own tailored walkthrough of STAT.