Google shook up the SEO world by announcing big changes to how publishers should mark nofollow links. The changes — while beneficial to help Google understand the web — nonetheless caused confusion and raised a number of questions. We’ve got the answers to many of your questions here.
14 years after its introduction, Google today announced significant changes to how they treat the “nofollow” link attribute. The big points:
Nofollow can now be specified with 3 different attributes — “nofollow”, “sponsored”, and “ugc” — each signifying a different meaning.
For ranking purposes, Google now treats each of the nofollow attributes as “hints” — meaning they likely won’t impact ranking, but Google may choose to ignore the directive and use nofollow links for rankings.
Google continues to ignore nofollow links for crawling and indexing purposes, but this strict behavior changes March 1, 2020, at which point Google begins treating nofollow attributes as “hints”, meaning they may choose to crawl them.
You can use the new attributes in combination with each other. For example, rel=”nofollow sponsored ugc” is valid.
Paid links must either use the nofollow or sponsored attribute (either alone or in combination.) Simply using “ugc” on paid links could presumably lead to a penalty.
Publishers don’t have to do anything. Google offers no incentive for changing, or punishment for not changing.
Publishers using nofollow to control crawling may need to reconsider their strategy.
Why did Google change nofollow?
Google wants to take back the link graph.
Google introduced the nofollow attribute in 2005 as a way for publishers to address comment spam and shady links from user-generated content (UGC). Linking to spam or low-quality sites could hurt you, and nofollow offered publishers a way to protect themselves.
Google also required nofollow for paid or sponsored links. If you were caught accepting anything of value in exchange for linking out without the nofollow attribute, Google could penalize you.
The system generally worked, but huge portions of the web—sites like Forbes and Wikipedia—applied nofollow sitewide for fear of being penalized or of not being able to properly police UGC.
This made entire portions of the link graph less useful for Google. Should curated links from trusted Wikipedia contributors really not count? Perhaps Google could better understand the web if they changed how they consider nofollow links.
By treating nofollow attributes as “hints”, they allow themselves to better incorporate these signals into their algorithms.
Hopefully, this is a positive step for deserving content creators, as a broader swath of the link graph opens up to more potential ranking influence. (Though for most sites, it doesn’t seem much will change.)
What is the ranking impact of nofollow links?
Prior to today, SEOs generally believed nofollow links worked like this:
Not used for crawling and indexing (Google didn’t follow them.)
Might be used for ranking, though the observed effect was typically small or non-existent
To be fair, there’s a lot of debate and speculation around the second statement, and Google has been opaque on the issue. Experimental data and anecdotal evidence suggest Google has long considered nofollow links as a potential ranking signal.
As of today, Google’s guidance states the nofollow attributes—including sponsored and ugc—are treated like this:
Still not used for crawling and indexing (see the changes taking place in the future below)
For ranking purposes, all nofollow directives are now officially a “hint” — meaning Google may choose to ignore them and use those links for ranking purposes. Many SEOs believe this is how Google has been treating nofollow for quite some time.
Beginning March 1, 2020, nofollow attributes will be treated as hints across the board, meaning:
In some cases, they may be used for crawling and indexing
In some cases, they may be used for ranking
Emphasis on the word “some.” Google is very explicit that in most cases they will continue to ignore nofollow links as usual.
Do publishers need to make changes?
For most sites, the answer is no — only if they want to. Google isn’t requiring sites to make changes, and as of yet, there is no business case to be made.
That said, there are a couple of cases where site owners may want to implement the new attributes:
Sites that want to help Google better understand the sites they—or their contributors—are linking to. For example, it could be to everyone’s benefit for sites like Wikipedia to adopt these changes. Or maybe Moz could change how it marks up links in the user-generated Q&A section (which often links to high-quality sources.)
Sites that use nofollow for crawl control. For sites with large faceted navigation, nofollow is sometimes an effective tool for preventing Google from wasting crawl budget. It’s too early to tell if publishers using nofollow this way will need to change anything before Google starts treating nofollow as a crawling “hint”, but it’s something to pay attention to.
To be clear, if a site is properly using nofollow today, SEOs do not need to recommend any changes be made. Though sites are free to do so, they should not expect any rankings boost for doing so, or new penalties for not changing.
That said, Google’s use of nofollow may evolve, and it will be interesting to see in the future—through study and analysis—if a ranking benefit does emerge from using nofollow attributes in a certain way.
Which nofollow attribute should you use?
If you choose to change your nofollow links to be more specific, Google’s guidelines are very clear, so we won’t repeat them in-depth here. In brief, your choices are:
rel=”sponsored” – For paid or sponsored links. This would presumably include affiliate links, although Google hasn’t explicitly said.
rel=”ugc” – Links within all user-generated content. Google has stated if UGC is created by a trusted contributor, this may not be necessary.
rel=”nofollow” – A catchall for all nofollow links. As with the other nofollow directives, these links generally won’t be used for ranking, crawling, or indexing purposes.
Additionally, attributes can be used in combination with one another. This means a declaration such as rel=”nofollow sponsored” is 100% valid.
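To make that concrete, here’s what the markup might look like in practice (the URLs below are placeholders):

```html
<!-- A paid, sponsored, or affiliate placement -->
<a href="https://example.com/partner-offer" rel="sponsored">Partner offer</a>

<!-- A link left by a user in a comment, forum post, or other UGC -->
<a href="https://example.com/commenter-site" rel="ugc">Commenter's site</a>

<!-- The classic catchall -->
<a href="https://example.com/some-page" rel="nofollow">Some page</a>

<!-- Attributes combined: a safe default for paid links that appear inside UGC -->
<a href="https://example.com/affiliate-product" rel="nofollow sponsored">Product</a>
```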
Can you be penalized for not marking paid links?
Yes, you can still be penalized, and this is where it gets tricky.
Google advises marking up paid/sponsored links with either “sponsored” or “nofollow” (alone or in combination); using “ugc” alone is not enough.
This adds an extra layer of confusion. What if your UGC contributors are including paid or affiliate links in their content/comments? Google, so far, hasn’t been clear on this.
For this reason, we will likely see publishers continue to mark up UGC with “nofollow” as a default, or possibly “nofollow ugc”.
Can you use the nofollow attributes to control crawling and indexing?
Nofollow has always been a very, very poor way to prevent Google from indexing your content, and it continues to be that way.
If you want to prevent Google from indexing your content, it’s recommended to use one of several other methods, most typically some form of “noindex”.
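For example, a page-level noindex is just a meta robots tag in the page’s <head> (for non-HTML files like PDFs, the equivalent is an X-Robots-Tag: noindex HTTP response header):

```html
<!-- Tells search engines not to include this page in their index -->
<meta name="robots" content="noindex">
```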
Crawling, on the other hand, is a slightly different story. Many SEOs use nofollow on large sites to preserve crawl budget, or to prevent Google from crawling unnecessary pages within faceted navigation.
Based on Google’s statements, it seems you can still attempt to use the nofollow attributes this way, but after March 1, 2020, Google may choose to ignore it. Any SEO using nofollow this way may need to get creative in order to prevent Google from crawling unwanted sections of their sites.
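The most common fallback is robots.txt, which Google still treats as a directive for crawling rather than a hint. A minimal sketch, assuming hypothetical faceted-navigation parameters:

```
# Block crawling of faceted/filtered URLs (parameter names are made up for illustration)
User-agent: *
Disallow: /*?color=
Disallow: /*?size=
Disallow: /*&sort=
```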
Final thoughts: Should you implement the new nofollow attributes?
While there is no obvious compelling reason to do so, this is a decision every SEO will have to make for themselves.
Given the initial confusion and lack of clear benefits, many publishers will undoubtedly wait until we have better information.
That said, it certainly shouldn’t hurt to make the change (as long as you mark paid links appropriately with “nofollow” or “sponsored”.) For example, the Moz Blog may someday change comment links below to rel=”ugc”, or more likely rel=”nofollow ugc”.
Finally, will anyone actually use the “sponsored” attribute, at the risk of giving more exposure to paid links? Time will tell.
What are your thoughts on Google’s new nofollow attributes? Let us know in the comments below.
It’s too easy to fall into a rut with your SEO audits. If it doesn’t meet best practices it ought to be fixed, right? Not always. Though an SEO audit is essentially a checklist, it’s important to both customize your approach and prioritize your fixes to be efficient and effective with your time and effort. In today’s Whiteboard Friday, Kameron Jenkins teaches us her methods for saying adios to generic, less effective SEO audits and howdy to a better way of improving your site.
Click on the whiteboard image above to open a high resolution version in a new tab!
Hey, everybody. Welcome to this week’s edition of Whiteboard Friday. My name is Kameron Jenkins, and today we’re going to be talking about the SEO audit. We’re going to be talking about how to take it from its kind of current generic state to something that’s a little bit more customized and it has prioritization baked in so hopefully we’re going to be doing these SEO audits for higher impact than we’re currently doing them.
What is an SEO audit?
So I think it’s safe to start with a definition of what an SEO audit is. Now, depending on who you ask, an SEO audit can mean a lot of different things. So if you were to boil it down to just its barest bones, here’s what I would say an SEO audit usually is. This is what someone means when they say SEO audit.
An SEO audit is a checklist to see if your site is compliant
So it’s a list of checks basically. You have all of these things that are SEO best practices, and you run your site through this sieve and you try to see is my site compliant or not compliant essentially.
So you have things like: Missing H1s, yes or no? Broken links, yes or no? Duplicate title tags, yes or no? So you end up with this whole big, long list of things that are wrong and not according to SEO best practices on your site.
Purpose = improving SEO metrics
The whole purpose of this is usually to improve some kind of SEO metrics.
Maybe you’re trying to correct a traffic drop or something like that. So you have this whole laundry list of things now that you need to fix as a result of this SEO audit. So usually what you end up saying is, hey, dev team or client or whoever you’re giving this to, “You need to fix these things because they’re SEO best practice.” What’s wrong with this though?
“Fix it because it’s SEO best practice.” What’s wrong with this picture?
I think there are a couple things wrong with this.
1. May or may not be hurting you
Number one, it means that we’re addressing things that may or may not actually be the culprit of whatever issue we’re facing. It’s just a list of things that didn’t meet a best practices list, but we don’t really know and we’re not really sure if these things are actually causing the issues that we’re seeing on our site.
2. May or may not have an impact
So because we don’t know if these are the culprit and the things that are hurting us, they may or may not have an impact when we actually spend our time on them.
3. May be wasting time
Number three, that leads to a lot of potential wasted time. This is especially true, well, for everyone. Everyone is very busy. But this is especially true for people who work at enterprises with a very large website and maybe a development team that’s really strapped for time and resources. If you give them a list of fixes and you say, “Hey, fix these things because it’s SEO best practices,” they are just going to say, “Yeah, sorry, no. I don’t have time for that, and I don’t see the value in it. I don’t really know why I’m doing this.”
So I think there’s a better way. Move over to this side.
How to customize
Customization and prioritization I think are a lot better alternatives to doing our SEO audits. So there are three kind of main ways that I like to customize my SEO audits.
1. Don’t look at everything
Number one, it may sound a little bit counterintuitive, but don’t look at everything. There are plenty of times when you do an SEO audit and it makes sense to do a kind of comprehensive audit, look through all kinds of things.
You’re doing links. You’re doing content. You’re doing the site architecture. You’re doing all kinds of things. Usually I do this when I’m taking over a new client and I want to get to know it and I want to get to know the website and its issues a little bit better. I think that’s a totally valid time to do that. But a lot of times we’re doing more work than we actually have to be doing when we look at the entire website and every single scenario on the website.
So maybe don’t look at everything.
2. Start with a problem statement
Instead I think it could be a good idea to start with a goal or a problem statement. So a lot of times SEO audits kind of come in response to something. Maybe your client is saying, “Hey, our competitor keeps beating us for this. Why are they beating us?” Or, “Hey, we’ve had a year-over-year decline in traffic. What’s going on? Can you do an SEO audit?”
So I think it’s a good idea to start with that as kind of a goal or a problem statement so that you can narrow and target your SEO audit to focus on the things that are actually the issue and why you’re performing the audit.
3. Segment to isolate
Number three, I think it’s a really good idea to segment your site in order to isolate the actual source of the problem. So by segment, I mean dividing your site into logical chunks based on their different purposes.
So, for example, maybe you have product pages. Maybe you have category pages. You have a blog section and user-generated content. There are all these different sections of your website. Segment those, isolate them, and look at them in isolation to see if maybe one of the sections is the culprit and actually experiencing issues, because a lot of times you find that, oh, maybe it’s the product pages that are actually causing my issues and it’s not the blog posts or anything else at all.
So that way you’re able to really waste less time and take a more targeted, focused look at what’s actually going on with your website. So once you’ve kind of audited your site through that lens, through a more customized lens, it’s time to prioritize, because you still have a list of things that you need to fix. You can’t just heap them all onto whatever team you’re passing this on to and say, “Here, fix these all.”
How to prioritize
It’s a lot better to prioritize and tell them what’s more important and why. So here’s how I like to do that. I would plot this out on a matrix. So a pretty simple matrix. At the top, your goal goes there. It keeps you really focused. All of these little things, say pretend these are just the findings from our SEO audit.
On the y-axis, we have impact. On the x-axis, we have time. So essentially we’re ordering every single finding by what kind of impact it’s going to have and how much time it’s going to take to complete. So you’re going to end up with these four quadrants of tasks.
So in this green quadrant here, you have your quick wins.
These are the things that you should do right now, because they’re going to have a really high impact and they’re not going to take a lot of time to do. So definitely prioritize those things.
Schedule & tackle in sprints
In this blue quadrant here, you have things that are going to make a really high impact, but they also take a lot of time. So schedule those after the green quadrant if you can. I would also suggest breaking those larger, time-intensive tasks into smaller, bite-sized chunks.
This is a good idea no matter what you’re doing, but this is especially helpful if you’re working with a development team who probably runs in two-week sprints anyway. It’s a really good idea to segment and tackle those little bits at a time. Just get it on the schedule.
In this orange down here, we have things to maybe deprioritize. Still put them on the schedule, but they’re not as important as the rest of the tasks.
So these are things that aren’t going to make that high of an impact, some impact, but not that high, and they’re not going to take that much time to do. Put them on the schedule, but they’re not as important.
Just don’t do it
Then in this last quadrant here, we have the just don’t do it quadrant. Hopefully, if you’re taking this really nice targeted look at your site and your audit through this lens, you won’t have too many of these, if any.
But if something is going to take you a lot of time and it’s not going to make that big of an impact, no one really has time for that. We want to try to avoid those types of tasks if at all possible. Now I will say there’s a caveat here for urgency. Sometimes we have to work on things regardless of what kind of impact they’re going to make on our site.
Maybe it’s because a client has a deadline, or it’s something in their contract and we just have to get something done because it’s a fire. We all have love/hate relationships with those fires. We don’t want to be handling them all of the time. If at all possible, let’s make sure to make those the exception and not the rule so that we actually get these priority tasks, these important things that are going to move the needle done and we’re not constantly pushing those down for fires.
One last thing, I will say impact is something that trips up a lot of people, myself included. How do you actually determine how much of an impact something is going to have before you do it? So that can be kind of tricky, and it’s not an exact science. But there are two main ways that I kind of like to do that. Number one, look for correlations on your website.
So if you’re looking at your website through the lens of these pages are performing really well, and they have these things true about them, and they’re on your list of things to fix on these other pages, you can go into that with a certain degree of certainty, knowing that, hey, if it works for these pages, there is a chance that this will make a high impact on these other pages as well.
So look at the data on your own website and see what’s already performing and what qualities are true about those pages. Number two, I would say one of the biggest things you can do is just to start small and test. Sometimes you really don’t know what kind of an impact something is going to make until you test on a small section. Then if it does have a high impact, great. Put it here and then roll it out to the rest of your site.
But if it doesn’t have a good impact or it has minimal impact, you still learn something from that. At least now you know not to prioritize it, not to spend all of your time on it and roll it out to your entire website, because that could potentially be a waste of time. So that’s how I prioritize and customize my SEO audits. I think a lot of us struggle with: What even is an SEO audit?
How do I do it? Where do I even look? Is this even going to make a difference? So that’s how I kind of try to make a higher impact with my SEO audits by taking a more targeted approach. If you have a way that you do SEO audits that you think is super helpful, pop it in the comments, share it with all of us. I think it’s really good to share and get on the same page about the different ways we could perform SEO audits for higher impact.
So hopefully that was helpful for you. That’s it for this week’s Whiteboard Friday. Please come back again next week for another one.
How many of these have you heard over the years? Convincing clients and stakeholders that SEO is worth it is half the battle. From doubts about the value of its traffic to concerns over time and competition with other channels, it seems like there’s an argument against our jobs at every turn.
In today’s Whiteboard Friday, Kameron Jenkins covers the five most common objections to SEO and how to counter them with smart, researched, fact-based responses.
Click on the whiteboard image above to open a high-resolution version in a new tab!
Hey, everybody. Welcome to this week’s edition of Whiteboard Friday. My name is Kameron Jenkins, and today we’re going to be going through five common objections to SEO and how to respond. Now I know, if you’re watching this and you’re an SEO, you have faced some of these very objections before and probably a lot of others.
This is not an exhaustive list. I’m sure you’ve faced a ton of other objections, whether you’re talking to a potential client, maybe you’re talking to your friend or your family member. A lot of people have misunderstandings about SEO and that causes them to object to wanting to invest in it. So I thought I’d go through some of the ones that I hear the most and how I tend to respond in those situations. Hopefully, you’ll find that helpful.
1. “[Other channel] drives more traffic/conversions, so it’s better.”
Let’s dive in. The number one objection I hear a lot of the time is that some other channel, whether that be PPC, social, whatever, drives more traffic or conversions, therefore it’s better than SEO. I tend to respond in a few different ways, depending on the situation.
Success follows investment
So the number one thing I would usually say is that don’t forget that success follows investment.
So if you are investing a lot of time and money and talent into your PPC or social and you’re not really doing much with organic, you’re kind of just letting it go, usually that means, yeah, that other channel is going to be a lot more successful. So just keep that in mind. It’s not inherently successful or not. It kind of reflects the effort you’re putting into it.
Every channel serves a different purpose
Number two, I would say that every channel serves a different purpose. You’re not going to expect social media to drive conversions a lot of the time, because a lot of the time social is for engagement. It’s for more top of the funnel. It’s for more audience development. SEO, a lot of the time that lives at your top and mid-funnel efforts. It can convert, but not always.
So just keep that in mind. Every channel serves a different purpose.
Assists vs last click only
The last thing I would say, kind of dovetailing off of that, is that I know assists versus last click only is a debate when it comes to attribution. But just keep in mind that when SEO and organic search doesn’t convert as the last click before conversion, it still usually assists in the process. So look at your assisted conversions and see how SEO is contributing.
2. “SEO is dead because the SERPs are full of ads.”
The number two objection I usually hear is SEO is dead because the SERPs are full of ads. To that, I would respond with a question.
What SERPs are you looking at?
It really depends on what you’re querying. If you’re only looking at those bottom funnel, high cost per click, your money keywords, absolutely those are monetized.
Those are going to be heavily monetized, because those are at the bottom of the funnel. So if you’re only ever looking at that, you might be pessimistic when it comes to your SEO. You might not be thinking that SEO has any kind of value, because organic search, those organic results are pushed down really low when you’re looking at those bottom funnel terms. So I think these two pieces of research are really interesting to look at in tandem when it comes to a response to this question.
I think this was put out sometime last year by Varn Research, and it said that 60% of people, when they see ads on the search results, they don’t even recognize that they’re ads. That’s actually probably higher now that Google changed it from green to black and it kind of blends in a little bit better with the rest of it. But then this data from Jumpshot says that only about 2% to 3% of all search clicks go to PPC.
So how can these things coexist? Well, they can coexist because the vast majority of searches don’t trigger ads. A lot more searches are informational and navigational more so than commercial.
People research before buying
So just keep in mind that people are doing a lot of research before buying.
A lot of times they’re looking to learn more information. They’re looking to compare. Keep in mind your buyer’s entire journey, their entire funnel and focus on that. Don’t just focus on the bottom of the funnel, because you will get discouraged when it comes to SEO if you’re only looking there.
Also, they’re just better together. There are a lot of studies that show that PPC and SEO are more effective when they’re both shown on the search results together for a single company.
I’m thinking of one that Seer did recently, which showed the CTR is higher for both when they’re on the page together. So just keep that in mind.
3. “Organic drives traffic, just not the right kind.”
The number three objection I hear a lot is that organic drives traffic, just not the right kind of traffic. People usually mean a few different things when they say that.
Branded vs non-branded
Number one, they could mean that organic drives traffic, but it’s usually just branded traffic anyway.
It’s just people who know about us already, and they’re searching our business name and they’re finding us. That could be true. But again, that’s probably because you’re not investing in SEO, not because SEO is not valuable. I would also say that a lot of times this is pretty easily debunked. A lot of times inadvertently people are ranking for non-branded terms that they didn’t even know they were ranking for.
So go into Google Search Console, look at their non-branded queries and see what’s driving impressions and clicks to the website.
Assists are important too
Number two, again, just to say this one more time, assists are important too. They play a part in the eventual conversion or purchase. So even if organic drives traffic that doesn’t convert as the last click before conversion, it still usually plays a role.
It can be highly qualified
Number three, it can be highly qualified. Again, this is that following the investment thing. If you are actually paying attention to your audience, you know the ways they search, how they search, what terms they search for, what’s important to your brand, then you can bring in really highly qualified traffic that’s more inclined to convert if you’re paying attention and being strategic with your SEO.
4. “SEO takes too long”
Moving on to number four, that objection I hear is SEO takes too long. That’s honestly one of the most common objections you hear about SEO.
SEO is not a growth hack
In response to that, I would say it’s not a growth hack. A lot of people who are really antsy about SEO and like “why isn’t it working right now” are really looking for those instant results.
They want a tactic they can sprinkle on their website for instant whatever they want. Usually it’s conversions and revenue and growth. I would say it’s not a growth hack. If you’re looking at it that way, it’s going to disappoint you.
Methodology + time = growth
But I will say that SEO is more methodology than tactic. It’s something that should be ingrained and embedded into everything you do so that over time, when it’s baked into everything you’re doing, you’re going to achieve sustained growth.
So that’s how I respond to that one.
5. “You can’t measure the ROI.”
Number five, the last one and probably one of the most frustrating, I’m sure this is not exclusive to SEO. I know social hears it a lot. You can’t measure the ROI, therefore I don’t want to invest in it, because I don’t have proof that I’m getting a return on this investment. So people kind of tend to mean, I think, two things when they say this.
A) Predicting ROI
Number one, they really want to be able to predict ROI before they even dive in. They want assurances that if they invest in this, they’re going to get X in return. There are, I think, a lot of inherent problems with that, but there are some ways you can get close to gauging what you’re going to get for your efforts. So what I would do in this situation is use your own website’s data to build yourself a click-through rate curve, so that you know the click-through rate at your various rank positions.
By knowing that and combining that with the search volume of a keyword or a phrase that you want to go after, you can multiply the two and just say, “Hey, here’s the expected traffic we will get if you will let me work on improving our rank position from 9 to 2 or 1” or whatever that is. So there are ways to estimate and get close.
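For illustration, with made-up numbers: if your CTR curve shows roughly a 25% click-through rate at position 2, and the target keyword gets 2,000 searches a month, you can project about 2,000 × 0.25 = 500 organic visits a month from winning that position.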
A lot of times, when you focus on improving one term, you’re likely going to get a lot more traffic than what you’re estimating, because you tend to end up ranking for so many more longer-tail keywords that bring in a lot of additional search volume. So, if anything, you’re probably going to underestimate when you do this. But that’s one way you can predict ROI.
B) Measuring ROI
Number two here, measuring ROI is a lot of times what people want to be doing.
They want to be able to prove that what they’re doing is beneficial in terms of revenue. So one way to do this is to get the lifetime value of the customer and multiply that by the close rate, so that you have a goal value. Now, turn on your conversions and set up your goals in Google Analytics, which I think you should be doing. This all assumes that you’re not an e-commerce site.
There’s different tracking for that, but a similar type of methodology applies. If you apply these things, you can have a goal value. So that way, when people convert on your site, you start to rack up the actual dollar value, the estimated dollar value that whatever channel is producing. So you can go to your source/medium report and see Google organic and see how many conversions it’s producing and how much value.
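To put illustrative numbers on it: if a customer’s lifetime value is $5,000 and your team closes 10% of the leads who convert on the site, each goal completion is worth roughly $5,000 × 0.10 = $500, and every organic conversion then racks up $500 of estimated value in that report.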
This same thing applies if you go to your assisted conversions report. You can see how much value is in there as well. I think that’s really beneficial just to be able to show people like, “Look, it is generating revenue. My SEO that’s getting you organic search traffic is generating value and real dollars and cents for you.” So those are some of the most common objections that I hear.
I want to know what are some of the ones that you hear too. So pop those in the comments. Let me know the objections you hear a lot of the time and include how you’re either struggling to respond or find the right response to people or something that you found works as a response. Share that with us. We’d all love to know. Let’s make SEO better and something that people understand a lot better. So that’s it for this week’s Whiteboard Friday.
Search for information about SEO, and you’ll quickly discover three big themes: content, user experience, and links. If you’re just getting started with SEO, that last theme will likely seem a lot more confusing and challenging than the others. That’s because, while content and user experience are under the realm of our control, links aren’t… at least not completely.
Think of this post as a quick-and-dirty version of The Beginner’s Guide to SEO’s chapter on link building. We definitely recommend you read through that as well, but if you’re short on time, this condensed version gives you a quick overview of the basics as well as actionable tips that can help you get started.
Let’s get to it!
What does “building links” mean?
Link building is a term used in SEO to describe the process of increasing the quantity of good links from other websites to your own.
Why are links so important? They’re one of the main (although not the only!) criteria Google uses to determine the quality and trustworthiness of a page. You want links from reputable, relevant websites to bolster your own site’s authority in search engines.
“Building links” is common SEO vernacular, but it deserves unpacking or else you may get the wrong idea about this practice. Google wants people to link to pages out of their own volition, because they value the content on that page. Google does not want people to link to pages because they were paid or incentivized to do so, or create links to their websites themselves — those types of links should use the “nofollow” attribute. You can read more about what Google thinks about links in their webmaster guidelines.
The main thing to remember is that links to your pages are an important part of SEO, but Google doesn’t want you paying or self-creating them, so the practice of “building links” is really more a process of “earning links” — let’s dive in.
How do I build links?
If Google doesn’t want you creating links yourself or paying for them, how do you go about getting them? There are a lot of different methods, but we’ll explore some of the basics.
Link gap analysis
One popular method for getting started with link building is to look at the links your competitors have but you don’t. This is often referred to as a competitor backlink analysis or a link gap analysis. You can perform one of these using Moz Link Explorer’s Link Intersect tool.
Link Intersect gives you a glimpse into your competitor’s link strategy. My pal Miriam and I wrote a guide that explains how to use Link Explorer and what to do with the links you find. It’s specifically geared toward local businesses, but it’s helpful for anyone just getting started with link building.
A skill you’ll definitely need for link building is email outreach. Remember, links to your site should be created by others, so to get them to link to your content, you need to tell them about it! Cold outreach is always going to be hit-or-miss, but here are a few things that can help:
Make a genuine connection: People are much more inclined to help you out if they know you. Consider connecting with them on social media and building a relationship before you ask them for a link.
Offer something of value: Don’t just ask someone to link to you — tell them how they’ll benefit! Example: offering a guest post to a content-desperate publisher.
Be someone people would want to link to: Before you ask anyone to link to your content, ask yourself questions like, “Would I find this valuable enough to link to?” and “Is this the type of content this person likes to link to?”
There are tons more articles on the Moz Blog you can check out if you’re looking to learn more about making your email outreach effective.
Contribute your expertise using services like HARO
When you’re just getting started, services like Help a Reporter Out (HARO) are great. When you sign up as a source, you’ll start getting requests from journalists who need quotes for their articles. Not all requests will be relevant to you, but be on the lookout for those that are. If the journalist likes your pitch, they may feature your quote in their article with a link back to your website.
Where do I go from here?
I hope this was a helpful crash-course into the world of link building! If you want to keep learning, we recommend checking out this free video course from HubSpot Academy that walks you through finding the right SEO strategy, including how to use Moz Link Explorer for link building.
Disclaimer: I’m currently the Director of Demand Generation at Nextiva, and writing this case study post-mortem as the former VP of Marketing at Sales Hacker (Jan. 2017 – Sept. 2018).
Every B2B company is investing in content marketing right now. Why? Because they all want the same thing: Search traffic that leads to website conversions, which leads to money.
But here’s the challenge: Companies are struggling to get traction because competition has reached an all-time high. Keyword difficulty (and CPC) has skyrocketed in most verticals. In my current space, Unified Communication as a Service (UCaaS), some of the CPCs have nearly doubled since 2017, with many keywords hovering close to $300 per click.
We validated our hard work by measuring organic growth (traffic and keywords) against our email list growth and revenue, which correlated positively, as we expected.
Organic Growth Highlights
January 2017–June 2018
I was hired at Sales Hacker as Director of Marketing, and I began making SEO improvements from day one. While I didn’t waste any time, you’ll also notice that there was no silver bullet.
This was the result of daily blocking and tackling. Pure execution and no growth hacks or gimmicks. However, I firmly believe that the homepage redesign (in July 2017) was a tremendous enabler of growth.
Organic Growth to Present Day
I officially left Sales Hacker in August of 2018, when the company was acquired by Outreach.io. However, I thought it would be interesting to see the lasting impact of my work by sharing a present-day screenshot of the organic traffic trend, via Google Analytics. There appears to be a dip immediately following my departure; however, it looks like my successor, Colin Campbell, has picked up the slack and gotten the train back on the rails. Well done!
Unique considerations — Some context behind Sales Hacker’s growth
Before I dive into our findings, here’s a little context behind Sales Hacker’s growth:
Sales Hacker’s blog is 100 percent community-generated — This means we didn’t pay “content marketers” to write for us. Sales Hacker is a publishing hub led by B2B sales, marketing, and customer success contributors. This can be a blessing and a curse at the same time — on one hand, the site gets loads of amazing free content. On the other hand, the posts are not even close to being optimized upon receiving the first draft. That means the editorial process is intense and laborious.
Aggressive publishing cadence (4–5x per week) — Sales Hacker built an incredible reputation in the B2B Sales Tech niche — we became known as the go-to destination for unbiased thought leadership for practitioners in the space (think of Sales Hacker as the sales equivalent to Growth Hackers). Due to high demand and popularity, we had more content available than we could handle. While it’s a good problem to have, we realized we needed to keep shipping content in order to avoid a content pipeline blockage and a backlog of unhappy contributors.
We had to “reverse engineer” SEO — In short, we got free community-generated and sponsored content from top sales and marketing leaders at SaaS companies like Intercom, HubSpot, Pipedrive, LinkedIn, Adobe and many others, but none of it was strategically built for SEO out of the box. We also had contributors like John Barrows, Richard Harris, Lauren Bailey, Tito Bohrt, and Trish Bertuzzi giving us a treasure trove of amazing content to work with. However, we had to collaborate with each contributor from beginning to end and guide them through the entire process: topical ideation (based on what they were qualified to write about), keyword research, content structure, content type, etc. So, the real secret sauce was in our editorial process. Shout out to my teammate Alina Benny for learning and inheriting my SEO process after we hired her to run content marketing. She crushed it for us!
Almost all content was evergreen and highly tactical — I made it a rule that we’d never agree to publish fluffy pieces, whether it was sponsored or not. Plain and simple. Because we didn’t allow “content marketers” to publish with us, our content had a positive reputation, since it was coming from highly respected practitioners. We focused on evergreen content strategies in order to fuel our organic growth. Salespeople don’t want fluff. They want actionable and tactical advice they can implement immediately. I firmly believe that achieving audience satisfaction with our content was a major factor in our SEO success.
Outranking the “big guys” — If you look at the highest-ranking sales content, it’s the usual suspects. HubSpot, Salesforce, Forbes, Inc, and many other sites that were far more powerful than Sales Hacker. But it didn’t matter as much as traditional SEO wisdom tells us, largely due to the fact that we had authenticity and rawness to our content. We realized most sales practitioners would rather read insights from their peers in their community, above the traditional “Ultimate Guides,” which tended to be a tad dry.
We did VERY little manual link building — Our link building was literally an email from me, or our CEO, to a site we had a great relationship with. “Yo, can we get a link?” It was that simple. We never did large-scale outreach to build links. We were a very lean, remote digital marketing team, and therefore lacked the bandwidth to allocate resources to link building. However, we knew that we would acquire links naturally due to the popularity of our brand and the highly tactical nature of our content.
Our social media and brand firepower helped us to naturally acquire links — It helps A LOT when you have a popular brand on social media and a well-known CEO who authored an essential book called “Hacking Sales”. Most of Sales Hacker’s articles would get widely circulated by 50+ SaaS partners, which would help drive natural links.
Updating stale content was the lowest hanging fruit — The biggest chunk of our new-found organic traffic came from updating / refreshing old posts. We have specific examples of this coming up later in the post.
Email list growth was the “north star” metric — Because Sales Hacker is not a SaaS company, and the “product” is the audience, there was no need for aggressive website CTAs like “book a demo.” Instead, we built a very relationship heavy, referral-based sales cadence that was supported by marketing automation, so list growth was the metric to pay attention to. This was also a key component to positioning Sales Hacker for acquisition. Here’s how the email growth progression was trending.
So, now that I’ve set the stage, let’s dive into exactly how I built this SEO strategy.
Think about it: B2B tech sales is all about numbers and selling stuff. Very few brands are really taking the time to learn about the types of content their audiences would like to consume.
When I was asking people if I could talk to them about their media and content interests, their response was: “So, wait, you’re actually not trying to sell me something? Sure! Let’s talk!”
Here’s what I set out to learn:
Goal 1 — Find one major brand messaging insight.
Goal 2 — Find one major audience development insight.
Goal 3 — Find one major content strategy insight.
Goal 4 — Find one major UX / website navigation insight.
Goal 5 — Find one major email marketing insight.
In short, I accomplished all of these learning goals and implemented changes based on what the audience told me.
If you’re curious, you can check out my entire UX research process for yourself, but here are some of the key learnings:
Based on these outcomes, I was able to determine the following:
Topical “buckets” to focus on — Based on the most common daily tasks, the data told us to build content on sales prospecting, building partnerships and referral programs, outbound sales, sales management, sales leadership, sales training, and sales ops.
Thought leadership — 62 percent of site visitors said they kept coming back purely due to thought leadership content, so we had to double down on that.
Content Types — Step by step guides, checklists, and templates were highly desired. This told me that fluffy BS content had to be ruthlessly eliminated at all costs.
Sales Hacker Podcast — 76 percent of respondents said they would listen to the Sales Hacker Podcast (if it existed), so we had to launch it!
2) SEO site audit — Key findings
I can’t fully break down how to do an SEO site audit step by step in this post (because it would be way too much information), but I will share the key findings and takeaways from our own Site Audit that led to some major improvements in our website performance.
Lack of referring domain growth
Sales Hacker was not able to acquire referring domains at the same rate as competitors. I knew this wasn’t because of a link building acquisition problem, but due to a content quality problem.
Lack of organic keyword growth
Sales Hacker had been publishing blog content for years (before I joined) and there wasn’t much to show for it from an organic traffic standpoint. However, I do feel the brand experienced a remarkable social media uplift by building content that was helpful and engaging.
Sales Hacker did happen to get lucky and rank for some non-branded keywords by accident, but the amount of content published versus the amount of traffic they were getting wasn’t making sense.
To me, this immediately screamed that there was an issue with on-page optimization and keyword targeting. It wasn’t anyone’s fault – this was largely due to a startup founder thinking about building a community first, and then bringing SEO into the picture later.
At the end of the day, Sales Hacker was only ranking for 6k keywords at an estimated organic traffic cost of $8.9k — which is nothing. By the time Sales Hacker got acquired, the site had an organic traffic cost of $122k.
This is common among startups that are just looking to get content out. This is just one example, but truth be told, there was a whole mess of non-descriptive URLs that had to get cleaned up.
Poor internal linking structure
The internal linking concentration was poorly distributed. Most of the equity was pointing to some of the lowest value pages on the site.
Poor taxonomy, site structure, and navigation
I created a mind-map of how I envisioned the new site structure and internal linking scheme. I wanted all the content pages to be organized into categories and subcategories.
My goals with the new proposed taxonomy were to accomplish the following:
Increase engagement from natural site visitor exploration
Allow users to navigate to the most important content on the site
Improve landing page visibility through an increase in relevant internal links pointing to them
Topical directories and category pages eliminated with redirects
Topical landing pages used to exist on SalesHacker.com, but they were eliminated with 301 redirects and disallowed in robots.txt. I didn’t agree with this configuration. Example: /social-selling/
Trailing slash vs. non-trailing slash duplicate content with canonical errors
Multiple pages existed for the same exact intent, and the canonical version was never specified.
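The standard fix is to pick one version and declare it canonical on both, so search engines consolidate signals to a single URL. For instance (the path here is a hypothetical example):

```html
<!-- Served on both /sales-operations and /sales-operations/ -->
<link rel="canonical" href="https://www.saleshacker.com/sales-operations/">
```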
Branded search problems — “Sales Hacker Webinar”
Some of the site’s most important content is not discoverable from search due to technical problems. For example, a search for “Sales Hacker Webinar” returns irrelevant results in Google because there isn’t an optimized indexable hub page for webinar content. It doesn’t get that much search volume (0–10 monthly volume according to Keyword Explorer), but still, that’s 10 potential customers you are pissing off every month by not fixing this.
Fast forward six months later, and this was the new homepage we built after doing audience and customer research…
New homepage goals
Tell people EXACTLY what Sales Hacker is and what we do.
Make it stupidly simple to sign up for the email list.
Allow visitors to easily and quickly find the content they want.
Add social proof.
Improve internal linking.
I’m proud to say that it all went according to plan. I’m also proud to say that, as a result, organic traffic skyrocketed shortly after.
Special Note: Major shout out to Joshua Giardino, the lead developer who worked with me on the homepage redesign. Josh is one of my closest friends and my marketing mentor. I would not be writing this case study today without him!
There wasn’t one super measurable thing we isolated in order to prove this. We just knew intuitively that there was a positive correlation with organic traffic growth, and figured it was due to the internal linking improvements and increased average session duration from improving the UX.
4) Updating and optimizing existing content
Special note: We enforced “Ditch the Pitch”
Before I get into the nitty-gritty SEO stuff, I’ll tell you right now that one of the most important things we did was block contributors and sponsors from linking to product pages and injecting screenshots of product features into blog articles, webinars, etc.
Side note: One thing we also had to do was add a nofollow attribute to all outbound links within sponsored content that sent referral traffic back to partner websites (which is no longer applicable due to the acquisition).
The #1 complaint we discovered in our audience research was that people were getting irritated with content that was “too salesy” or “too pitchy” — and rightfully so, because who wants to get pitched at all day?
So we made it all about value. Pure education. School of hard knocks style insights. Actionable and tactical. No fluff. No nonsense. To the point.
And that’s where things really started to take off.
This is what the post originally looked like (and it didn’t rank well for “best sales books”).
And then after…
And the result…
Before and after: “Sales operations”
What we noticed here was a crappy article attempting to explain the role of sales operations.
Here are the steps we took to rank #1 for “Sales Operations:”
Built a super optimized mega guide on the topic.
Since the old crappy article had some decent links, we figured let’s 301 redirect it to the new mega guide.
Promote it on social, email and normal channels.
Here’s what the new guide on Sales Ops looks like…
And the result…
5) New content opportunities
One thing I quickly realized Sales Hacker had to its advantage was topical authority. Exploiting this was going to be our secret weapon, and boy, did we do it well:
We knew we could win this SERP by creating content that was super actionable and tactical with examples.
Most of the competing articles in the SERP were definition style and theory-based, or low-value roundups from domains with high authority.
In this case, DA doesn’t really matter. The better man wins.
“Best sales tools”
Because Sales Hacker is an aggregator website, we had the advantage of easily out-ranking vendor websites for best and top queries.
Of course, it also helps when you build a super helpful mega list of tools. We included 150+ options to choose from in the list, whereas SERP competitors did not even come close.
Notice how Sales Hacker’s article from 2017 still beats HubSpot’s 2019 version. Why? Because we probably satisfied user intent better than they did.
For this query, we figured out that users really want to know about Direct Sales vs Channel Sales, and how they intersect.
HubSpot went for the generic, “factory style” Ultimate Guide tactic.
Don’t get me wrong, it works very well for them (especially with their 91 DA), but here is another example where nailing the user intent wins.
“Sales excel templates”
This was pure lead gen gold for us. Everyone loves templates, especially sales excel templates.
The SERP was easily winnable because the competition was so BORING in their copy. Not only did we build a better content experience, but we used numbers, lists, and power words that salespeople like to see, such as FAST and Pipeline Growth.
Special note: We never used long intros
The one trend you’ll notice is that all of our content gets RIGHT TO THE POINT. This is inherently obvious, but we also uncovered it during audience surveying. Salespeople don’t have time for fluff. They need to cut to the chase ASAP, get what they came for, and get back to selling. It’s really that straightforward.
When you figure out something THAT important to your audience, (like keeping intros short and sweet), and then you continuously leverage it to your advantage, it’s really powerful.
6) Featured Snippets
Featured snippets became a huge part of our quest for SERP dominance. Even for SERPs where organic clicks have declined, we didn’t mind as much, because we knew we were getting the snippet and free brand exposure.
Here are some of the best featured snippets we got!
Featured snippet: “Channel sales”
Featured snippet: “Sales pipeline management”
Featured snippet: “BANT”
Featured snippet: “Customer success manager”
Featured snippet: “How to manage a sales team”
Featured snippet: “How to get past the gatekeeper”
1. It may be worth acquiring a niche media brand in your space
2. It may be worth starting your own niche media brand in your space
I feel like most B2B companies (not all, but most) come across as only trying to sell a product — because most of them are. You don’t see the majority of B2B brands doing a good job on social. They don’t know how to market to emotion. They completely ignore top-funnel in many cases and, as a result, get minimal engagement with their content.
There are really so many areas of opportunity to exploit in B2B marketing if you know how to leverage that human emotion — it’s easy to stand out if you have a soul. Sales Hacker became that “soul” for Outreach — that voice and community.
But one final reason why a SaaS company would buy a media brand is to get the edge over a rival competitor. Especially in a niche where two giants are battling over the top spot.
In this case, it’s Outreach’s good old arch-nemesis, Salesloft. You see, both Outreach and Salesloft are fighting tooth and nail to win a new category called “Sales Engagement”.
As part of the acquisition process, I prepared a deck that highlighted how beneficial it would be for Outreach to acquire Sales Hacker, purely based on the traffic advantage it would give them over Salesloft.
Sales Hacker vs. Salesloft vs Outreach — Total organic keywords
This chart from 2018 (data exported via SEMrush) displays that Sales Hacker is ranking for more total organic keywords than Salesloft and Outreach combined.
Sales Hacker vs. Salesloft vs Outreach — Estimated traffic cost
This chart from 2018 (data exported via SEMrush) displays the cost of the organic traffic compared by domain. Sales Hacker had the highest traffic cost because it ranked for more commercial terms.
Sales Hacker vs. Salesloft vs Outreach — Rank zone distributions
This chart from 2018 (data exported via SEMrush) displays the rank zone distribution by domain. Sales Hacker ranked for more organic keywords across all search positions.
Sales Hacker vs. Salesloft vs Outreach — Support vs. demand keywords
This chart from 2018 (data exported via SEMrush) displays support vs demand keywords by domain. Because Sales Hacker did not have a support portal, all its keywords were inherently demand focused.
Meanwhile, Outreach was mostly ranking for support keywords at the time. Compared to Salesloft, they were at a massive disadvantage.
I wouldn’t be writing this right now without the help, support, and trust that I got from so many people along the way.
Joshua Giardino — Lead developer at Sales Hacker, my marketing mentor and older brother I never had. Couldn’t have done this without you!
Max Altschuler — Founder of Sales Hacker, and the man who gave me a shot at the big leagues. You built an incredible platform and I am eternally grateful to have been a part of it.
Scott Barker — Head of Partnerships at Sales Hacker. Thanks for being in the trenches with me! It’s a pleasure to look back on this wild ride, and wonder how we pulled this off.
Alina Benny — My marketing protege. Super proud of your growth! You came into Sales Hacker with no fear and seized the opportunity.
Mike King — Founder of iPullRank, and the man who gave me my very first shot in SEO. Thanks for taking a chance on an unproven kid from the Bronx who was always late to work.
Yaniv Masjedi — Our phenomenal CMO at Nextiva. Thank you for always believing in me and encouraging me to flex my thought leadership muscle. Your support has enabled me to truly become a high-impact growth marketer.
Thanks for reading — tell me what you think below in the comments!
It’s easy to find writers; they’re everywhere — from a one-second Google search to asking on LinkedIn.
But hiring the best ones? That’s the daunting task marketers and business owners face. And you don’t just need writers; you need exceptional SEO content writers.
Mainly because that’s what Google (aka the largest traffic driver of most sites) has clearly been calling for since their Panda update in 2011, RankBrain in 2015, and their “Fred” update in March 2017. (By the way, Gary Illyes from Google coined “Fred” as the name for every unnamed Google update.)
It’s obvious how each of these major updates communicates Google’s preference for excellent SEO writers:
If you’re a frequent Moz reader, you probably know how they work — but if not: Panda penalizes every webpage with content that adds little to no value to people online, giving more visibility to content pieces that do. RankBrain, for its part, made Google almost as smart as humans when it comes to choosing the most relevant and high-quality content to rank on page one of search engine result pages (SERPs).
The “Fred” update further tackled sites with low-quality content that aren’t doing anything beyond providing information that’s already available on the internet. It also penalized sites that prioritized revenue above user experience.
Through these core updates, Google has effectively been requiring brands, publishers, and marketers to work with SEO content writers who know their onions: the ones who know how to write with on-page SEO mastery.
But how do you find these exceptional wordsmiths? Without a plan, you will have to screen tens (or even hundreds) of them to find those who are a good fit.
But let’s make it easier for you. Essentially, your ideal SEO writers should have two key traits:
Good on-page SEO expertise
A great eye for user experience (i.e. adding relevant images, formatting, etc.)
A writer with these two skills is a great SEO writer. But let’s dig a bit deeper into what that means.
(Note: this post is about hiring exceptional SEO content writers — i.e., wordsmiths who don’t need you monitoring them to do great work. So, things can get a bit techie as you read on. I’ll be assuming your ideal writer understands or is responsible for things like formatting, on-page SEO, and correctly uploading content into your CMS.)
1. On-page SEO knowledge
By now, you know what on-page SEO is. But if not, it’s simply the elements you put on a site or web page to let search engines understand that you have content on specific topics people are searching for.
So, how do you know if a writer has good on-page SEO knowledge?
Frankly, “Can you send me your previous writing samples?” is the ideal question to ask any writer you’re considering hiring. Once they show their samples, have them walk you through each one, and ask yourself the following questions:
Question A: Do they have ‘focus keywords’ in their previous samples?
Several factors come into play when trying to rank any page, but your ideal writer must know how to hold things down on the keyword side of things.
Look through their samples; see if they have optimized any content piece for a specific keyword in the past so you can know if they’ll be able to do the same for your content.
Question B: How do they use title tags?
Search engines use the title tag and heading tags to understand what your content is about.
You know how it works: put "SEO strategy" — for example — in the title and a few relevant headings on a page, and search engines will understand the page is teaching SEO strategy.
Essentially, your ideal SEO writer should understand how to use these tags to improve your rankings and attract clicks from your potential customers in search results.
Are title tags really that important? They are. Ahrefs, for instance, made their title tag on a page more descriptive and this alone upped their traffic by 37.58%.
So, look through the titles in your candidate's samples, especially the title tag and h1. Here's what you should look for when examining how a candidate uses these HTML tags:
i. Title tags should, ideally, be no more than 60 characters. This is to avoid truncated results that look like this in SERPs:
(three dots in front of your titles constitute bad UX — which Google frowns upon)
ii. The subheadings should be h2 (not strictly necessary, but it's a plus)
iii. Headings under subtopics should be h3 (also not strictly necessary, but it's a plus)
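For instance, a page that meets all three criteria might be marked up like this (a hypothetical example):
<title>SEO Strategy: A Beginner's Guide (2019 Update)</title> <!-- 46 characters, safely under 60 -->
<h1>SEO Strategy: A Beginner's Guide</h1>
<h2>Step 1: Keyword research</h2>
<h3>Finding seed keywords</h3>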
Look for these qualities in your candidate’s work and you’ll be able to confirm that they properly implement title tags in their content, and can do the same for you.
But some writers may not have control over the title tags in their published works — that is, the sites they wrote for probably didn’t give them such access. In this case, request samples they published on their own site, where they actually have control over these tags.
Question C: What do they know about internal linking?
Orbit Media once shared how they used internal linking to shoot a blog post from position #29 up to #4.
So, it’s important that your writers know how to contextually link to your older content pieces while writing new content. And it works for good reason; internal linking helps you:
Communicate the relevance and value of your pages to Google (the more links a page gets, the more authority it has in Google’s eyes)
Demonstrate to Google that your site contains in-depth content about any specific topic
Tell Google your site has easy navigation — which means it has good UX and is well-structured.
Internal linking is a major key to search ranking, so you need writers who have internal linking in their toolkit. But also ensure they do it using proper anchor text; in a recent LinkedIn post, expert editor Rennie Sanusi pointed to two key anchor text qualities to look for in your candidate's samples:
[Anchor texts] should clearly explain where they'll take your reader
[Anchor texts] shouldn’t be too long
Question D: Do they write long-form content?
The average first-page Google result is 1,800+ words long, according to research from Backlinko.
Google has been all about in-depth content since its inception; you're probably familiar with their mission statement: "to organize the world's information and make it universally accessible and useful."
Every algorithm change they make is geared toward achieving this mission, and ranking long-form content helps them get there.
Because, to them, longer content means you're packing more of the information searchers are looking for into your content.
So you need writers who can produce long-form content. Check their samples and confirm they know how to write long-form content on a regular basis.
Question E: Have they ranked for any important keywords?
Ultimately, you need to see examples of important keywords your ideal content writer has ranked for in the past. This is the utmost test of their ability to actually drive search traffic your way.
That's it for finding writers who know on-page SEO. But as you know, that's only one part of the skill set that makes a great SEO content writer.
The other important bit is their ability to write content that engages humans. In other words, they need to know how to keep people reading a page for several minutes (or even hours), leading them to take actions that are important to your business.
2. A great eye for user experience
Keeping readers on a page for long durations also improves your ranking.
In the aforementioned Backlinko study, researchers analyzed 100,000 sites and found that “websites with low average bounce rates are strongly correlated with higher rankings.”
And you know what that means; your ideal SEO writer should not only write to rank on search engines, they must also write to attract and keep the attention of your target audience.
So, look for the following in their samples:
Headlines and introductions that hook readers
You need writers who are expert enough to know the types of headlines and opening paragraphs that work.
It’s not a hard skill to spot; look through their samples. If their titles and introductions don’t hook you, they probably won’t hook your audience. It’s really that simple.
Explainer images and visuals
The report also revealed that "Content with at least one image significantly outperformed content without any images."
But of course, they have to be relevant images (or other visual types). And many times (if not most of the time), that means explainer images — so look out for those in their samples. Here are two examples of explainer images:
Example #1: Explainer images with text and pointers
This one has elements (an arrow and text) on it that explain how the image relates to the topic of the content.
Example #2: Explainer images without text and pointers
Why does this image not have any text or arrows on it? It’s a self-explanatory screenshot, that’s why.
As long as it’s used appropriately — where the “online sales of Nike products” is mentioned in the content — it gets its message across.
In general, your ideal SEO writers need to know how to use tools like Skitch and Canva to create these images. Remember, you’re on a hunt for the exceptional ones.
References and citing resources
Your ideal writer should link to stats or studies that make their points stronger. This one’s pretty self-explanatory. Check the links in their samples and make sure they cite genuine resources.
Illustrations make understanding easier. Especially if you’re in a technical industry (and most industries have their geeky side), your ideal writer should know how to explain their points with examples.
Simply search their samples — using Command + F (or Ctrl + F if you're using Windows) — for "example," "instance," or "illustration." This works because writers usually mention things like "for example" or "for instance" when providing illustrations.
Excellent SEO content writers = Higher search rankings
SEO content writers with all the skills I've mentioned in this article are possible to find. And hiring them means higher search rankings for your content. These writers are, again, everywhere. But here's the thing — and you've probably heard it before: You get what you pay for.
Exceptional SEO content writers are your best bet, but they’re not cheap. They can send your search traffic through the roof, but, like you: They want to work for people who can afford the quality they provide. So, if you’re going on a hunt for them, ready your wallet.
But ensure you get their samples and ask the questions in this guide as you deem fit. If you’re paying for content that’ll help you rank higher on Google, then you really should get what you pay for.
Did you find any of my tips helpful? Let me know in the comments below!
Editor’s note: This post first appeared in April of 2017, but because SEO (and Google) changes so quickly, we figured it was time for a refresh!
Meta tags represent the beginning of most SEO training, for better or for worse. I contemplated exactly how to introduce this topic because we always hear about the bad side of meta tags — namely, the keywords meta tag. One of the first things dissected in any site review is the misuse of meta tags, mainly because they’re at the top of every page in the header and are therefore the first thing seen. But we don’t want to get too negative; meta tags are some of the best tools in a search marketer’s repertoire.
There are meta tags beyond just description and keywords, though those two are picked on the most. I’ve broken down the most-used (in my experience) by the good, the bad, and the indifferent. You’ll notice that the list gets longer as we get to the bad ones. I didn’t get to cover all of the meta tags possible to add, but there’s a comprehensive meta tag resource you should check out if you’re interested in everything that’s out there.
It's important to note that in 2019, your meta tags still matter, but not all of them can help you. It's my experience, and I think anyone in SEO would agree, that if you want to rank high in search, your meta tags need to accompany high-quality content that focuses on user satisfaction.
My main piece of advice: stick to the core minimum. Don’t add meta tags you don’t need — they just take up code space. The less code you have, the better. Think of your page code as a set of step-by-step directions to get somewhere, but for a browser. Extraneous meta tags are the annoying “Go straight for 200 feet” line items in driving directions that simply tell you to stay on the same road you’re already on!
The good meta tags
These are the meta tags that should be on every page, no matter what. Notice that this is a small list; these are the only ones that are required, so if you can work with just these, please do.
Meta content type – This tag is necessary to declare your character set for the page and should be present on every page. Leaving this out could impact how your page renders in the browser. A few options are listed below, but your web designer should know what’s best for your site.
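For example, either the modern HTML5 declaration or the older HTTP-equiv form will do the job (assuming UTF-8, the most common character set); use one or the other, not both:
<meta charset="utf-8">
<meta http-equiv="Content-Type" content="text/html; charset=utf-8" />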
Meta description – The infamous meta description tag is used for one major purpose: to describe the page to searchers as they read through the SERPs. This tag doesn’t influence ranking, but it’s very important regardless. It’s the ad copy that will determine if users click on your result. Keep it within 160 characters, and write it to catch the user’s attention. Sell the page — get them to click on the result. Here’s a great article on meta descriptions that goes into more detail.
The indifferent meta tags
Different sites will need to use these in specific circumstances, but if you can go without, please do.
Social meta tags – I’m leaving these out. OpenGraph and Twitter data are important to sharing but are not required per se.
Robots – One huge misconception is that you have to have a robots meta tag. Let’s make this clear: In terms of indexing and link following, if you don’t specify a meta robots tag, they read that as index,follow. It’s only if you want to change one of those two commands that you need to add meta robots. Therefore, if you want to noindex but follow the links on the page, you would add the following tag with only the noindex, as the follow is implied. Only change what you want to be different from the norm.
<meta name="robots" content="noindex" />
Specific bots (Googlebot) – These tags are used to give a specific bot instructions like noodp (forcing them not to use your DMOZ listing information, RIP) and noydir (same, but for the Yahoo Directory listing information). Generally, the search engines are really good at this kind of thing on their own, but if you think you need it, feel free. I've seen some cases where it's necessary, but if you can, consider using the overall robots tag listed above instead.
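For reference, a bot-specific tag looks just like the general robots tag but is addressed to a single crawler; for example, to keep only Googlebot from indexing a page:
<meta name="googlebot" content="noindex" />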
Language – The only reason to use this tag is if you're working internationally and need to declare the main language used on the page. Check out this meta languages resource for a full list of languages you can declare.
Keywords – Yes, I put this on the “indifferent” list. While no good SEO is going to recommend spending any time on this tag, there’s some very small possibility it could help you somewhere. Please leave it out if you’re building a site, but if it’s automated, there’s no reason to remove it.
Refresh – This is the poor man's redirect and should be avoided if at all possible. You should always use a server-side 301 redirect instead. I know that sometimes things need to happen now, but Google is NOT a fan.
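For reference, this is the tag in question, shown only so you can recognize it and replace it with a proper 301 (the destination URL here is hypothetical):
<meta http-equiv="refresh" content="0; url=https://www.example.com/new-page/" />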
Site verification – Your site is verified with Google and Bing, right? Who has the verification meta tags on their homepage? These are sometimes necessary because you can’t get the other forms of site verification loaded, but if at all possible try to verify another way. Google allows you to verify by DNS, external file, or by linking your Google Analytics account. Bing still only allows by XML file or meta tag, so go with the file if you can.
The bad meta tags
Nothing bad will happen to your site if you use these — let me just make that clear. They’re a waste of space though; even Google says so (and that was 12 years ago now!). If you’re ready and willing, it might be time for some spring cleaning of your <head> area.
Author/web author – This tag is used to name the author of the page. It’s just not necessary on the page.
Revisit after – This meta tag is a command to the robots to return to a page after a specific period of time. It’s not followed by any major search engine.
Rating – This tag is used to denote the maturity rating of content. I wrote a post about how to tag a page with adult images using a very confusing system that has since been updated (see the post’s comments). It seems as if the best way to note bad images is to place them on a separate directory from other images on your site and alert Google.
Expiration/date – “Expiration” is used to note when the page expires, and “date” is the date the page was made. Are any of your pages going to expire? Just remove them if they are (but please don’t keep updating content, even contests — make it an annual contest instead!). And for “date,” make an XML sitemap and keep it up to date. It’s much more useful.
Copyright – That Google article debates this with me a bit, but look at the footer of your site. I would guess it says “Copyright 20xx” in some form. Why say it twice?
Abstract – This tag is sometimes used to hold an abstract of the content and is used mainly by educational institutions.
Distribution – The “distribution” value is supposedly used to control who can access the document, typically set to “global.” It’s inherently implied that if the page is open (not password-protected, like on an intranet) that it’s meant for the world. Go with it, and leave the tag off the page.
Generator – This is used to note what program created the page. Like “author,” it’s useless.
Cache-control – This tag is set in hopes of controlling when and how often a page is cached in the browser. It’s best to do this in the HTTP header.
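A sketch of the equivalent HTTP response header, with an illustrative one-day lifetime:
Cache-Control: public, max-age=86400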
Resource type – This is used to name the type of resource the page is, like “document.” Save yourself time, as the DTD declaration does it for you.
There are so many meta tags out there, I’d love to hear about any you think need to be added or even removed! Shout out in the comments with suggestions or questions.
Log File Analysis should be a part of every SEO pro’s tool belt, but most SEOs have never conducted one. Which means most SEOs are missing out on unique and invaluable insights that regular crawling tools just can’t produce.
Let’s demystify Log File Analysis so it’s not so intimidating. If you’re interested in the wonderful world of log files and what they can bring to your site audits, this guide is definitely for you.
What are Log Files?
Log Files are files containing detailed logs on who and what is making requests to your website server. Every time a bot makes a request to your site, data (such as the time, date, IP address, user agent, etc.) is stored in the log. This valuable data allows any SEO to find out what Googlebot and other crawlers are doing on your site. Unlike a regular crawl, such as with the Screaming Frog SEO Spider, this is real-world data — not an estimation of how your site is being crawled, but an exact record of it.
Having this accurate data can help you identify areas of crawl budget waste, easily find access errors, understand how your SEO efforts are affecting crawling and much, much more. The best part is that, in most cases, you can do this with simple spreadsheet software.
In this guide, we will be focusing on Excel to perform Log File Analysis, but I'll also discuss other tools such as Screaming Frog's less well-known Log File Analyser, which can make the job a bit easier and faster by helping you manage larger data sets.
Note: owning any software other than Excel is not a requirement to follow this guide or get your hands dirty with Log Files.
How to Open Log Files
Rename .log to .csv
When you get a log file with a .log extension, it really is as easy as renaming the file extension to .csv and opening the file in spreadsheet software. Remember to set your operating system to show file extensions if you want to edit these.
How to open split log files
Log files can come in either one big log or multiple files, depending on the server configuration of your site. Some servers will use server load balancing to distribute traffic across a pool or farm of servers, causing log files to be split up. The good news is that it’s really easy to combine, and you can use one of these three methods to combine them and then open them as normal:
1) Use the command line. In Windows, Shift + right-click in the folder containing your log files and select "Open PowerShell window here"
Then run the following command:
copy *.log mylogfiles.csv
You can now open mylogfiles.csv and it will contain all your log data.
Or if you are a Mac user, first use the cd command to go to the directory of your log files:
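For example, with a hypothetical folder name:
cd ~/Documents/MyLogFiles/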
Then, use the cat or concatenate command to join up your files:
cat *.log > mylogfiles.csv
2) Use the free tool Log File Merge to combine all the log files, then edit the file extension to .csv and open as normal.
3) Open the log files with the Screaming Frog Log File Analyser, which is as simple as dragging and dropping the log files:
(Please note: This step isn’t required if you are using Screaming Frog’s Log File Analyser)
Once you have your log file open, you’re going to need to split the cumbersome text in each cell into columns for easier sorting later.
Excel's Text to Columns function comes in handy here, and is as easy as selecting all the filled cells (Ctrl/Cmd + A), going to Data > Text to Columns, selecting the "Delimited" option, and setting the delimiter to a Space character.
Once you’ve separated this out, you may also want to sort by time and date — you can do so in the Time and Date stamp column, commonly separating the data with the “:” colon delimiter.
Your file should look similar to the one below:
As mentioned before, don’t worry if your log file doesn’t look exactly the same — different log files have different formats. As long as you have the basic data there (time and date, URL, user-agent, etc.) you’re good to go!
Understanding Log Files
Now that your log files are ready for analysis, we can dive in and start to understand our data. There are many formats that log files can take with multiple different data points, but they generally include the following:
Date and time
Server request method (e.g. GET / POST)
HTTP status code
Requested URL (the URI stem)
User-agent
IP address of the client making the request
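For example, a single hit in a W3C-format log (the format used in this guide's examples) might look like this, with hypothetical values:
#Fields: date time c-ip cs-method cs-uri-stem sc-status sc-bytes time-taken cs(User-Agent)
2019-05-01 09:15:02 66.249.66.1 GET /blog/ 200 25043 312 Mozilla/5.0+(compatible;+Googlebot/2.1;++http://www.google.com/bot.html)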
More details on the common formats can be found below if you’re interested in the nitty gritty details:
Apache and NGINX
Amazon Elastic Load Balancing
How to quickly reveal crawl budget waste
As a quick recap, Crawl Budget is the number of pages a search engine crawls upon each visit to your site. Numerous factors affect crawl budget, including link equity or domain authority, site speed, and more. With Log File Analysis, we will be able to see what sort of crawl budget your website has and where there are problems causing crawl budget to be wasted.
Ideally, we want to give crawlers the most efficient crawling experience possible. Crawling shouldn't be wasted on low-value pages and URLs, and priority pages (product pages, for example) shouldn't have slower indexation and crawl rates because a website has so many dead weight pages. The name of the game is crawl budget conservation, and with good crawl budget conservation comes better organic search performance.
See crawled URLs by user agent
Seeing how frequently URLs of the site are being crawled can quickly reveal where search engines are putting their time into crawling.
If you're interested in seeing the behavior of a single user agent, it's as easy as filtering the relevant column in Excel. In this case, with a W3C format log file, I'm filtering the cs(User-Agent) column by Googlebot:
And then filtering the URI column to show the number of times Googlebot crawled the home page of this example site:
This is a fast way of seeing if there are any problem areas by URI stem for a single user-agent. You can take this a step further by looking at the filtering options for the URI stem column, which in this case is cs-uri-stem:
From this basic menu, we can see what URLs, including resource files, are being crawled to quickly identify any problem URLs (parameterized URLs that shouldn’t be being crawled for example).
You can also do broader analyses with Pivot tables. To get the number of times a particular user agent has crawled a specific URL, select the whole table (Ctrl/cmd + A), go to Insert > Pivot Table and then use the following options:
All we’re doing is filtering by User Agent, with the URL stems as rows, and then counting the number of times each User-agent occurs.
With my example log file, I got the following:
Then, to filter by specific User-Agent, I clicked the drop-down icon on the cell containing “(All),” and selected Googlebot:
Understanding what different bots are crawling, how mobile bots are crawling differently to desktop, and where the most crawling is occurring can help you see immediately where there is crawl budget waste and what areas of the site need improvement.
Find low-value add URLs
Crawl budget should not be wasted on low value-add URLs, which are normally caused by session IDs, infinite crawl spaces, and faceted navigation.
To find them, go back to your log file and filter the URL column (the one containing the URI stem) for URLs that contain a "?" or question mark symbol. To do this in Excel, remember to use "~?" (tilde question mark), as shown below:
A single "?" or question mark, as stated in the auto filter window, represents any single character; adding the tilde acts as an escape and ensures you filter for the literal question mark symbol itself.
Isn’t that easy?
Find duplicate URLs
Duplicate URLs can be a crawl budget waste and a big SEO issue, but finding them can be a pain. URLs can sometimes have slight variants (such as a trailing slash vs a non-trailing slash version of a URL).
Ultimately, the best way to find duplicate URLs is also the least fun way to do so — you have to sort by site URL stem alphabetically and manually eyeball it.
One way you can find trailing and non-trailing slash versions of the same URL is to use the SUBSTITUTE function in another column and use it to remove all forward slashes:
=SUBSTITUTE(C2, "/", "")
In my case, the target cell is C2, as the stem data is in the third column.
Then, use conditional formatting to identify duplicate values and highlight them.
However, eyeballing is, unfortunately, the best method for now.
See the crawl frequency of subdirectories
Finding out which subdirectories are getting crawled the most is another quick way to reveal crawl budget waste. Although keep in mind, just because a client’s blog has never earned a single backlink and only gets three views a year from the business owner’s grandma doesn’t mean you should consider it crawl budget waste — internal linking structure should be consistently good throughout the site and there might be a strong reason for that content from the client’s perspective.
To find out crawl frequency by subdirectory level, you will need to mostly eyeball it but the following formula can help:
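Here's a minimal version of that formula, assuming your URL stems are in column C:
=IF(RIGHT(C2,1)="/",LEN(C2)-LEN(SUBSTITUTE(C2,"/",""))-2,LEN(C2)-LEN(SUBSTITUTE(C2,"/",""))-1)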
The above formula looks like a bit of a doozy, but all it does is check whether there is a trailing slash and, depending on the answer, count the number of slashes and subtract either 2 or 1. This formula could be shortened if you removed all trailing slashes from your URL list using the RIGHT formula — but who has the time? What you're left with is the subdirectory count (starting from 0 as the first subdirectory).
Replace C2 with the first URL stem / URL cell and then copy the formula down your entire list to get it working.
Make sure you replace all of the C2s with the appropriate starting cell and then sort the new subdirectory counting column by smallest to largest to get a good list of folders in a logical order, or easily filter by subdirectory level. For example, as shown in the below screenshots:
The above image is subdirectories sorted by level.
The above image is subdirectories sorted by depth.
If you’re not dealing with a lot of URLs, you could simply sort the URLs by alphabetical order but then you won’t get the subdirectory count filtering which can be a lot faster for larger sites.
See crawl frequency by content type
Finding out what content is getting crawled, or if there are any content types that are hogging crawl budget, is a great check to spot crawl budget waste. Frequent crawling on unnecessary or low priority CSS and JS files, or how crawling is occurring on images if you are trying to optimize for image search, can easily be spotted with this tactic.
In Excel, seeing crawl frequency by content type is as easy as filtering by URL or URI stem using the Ends With filtering option.
Quick Tip: You can also use the "Does Not End With" filter with a .html extension to see how non-HTML page files are being crawled — always worth checking in case of crawl budget waste on unnecessary js or css files, or even images and image variations (looking at you, WordPress). Also, if you have a site with both trailing and non-trailing slash URLs, remember to take that into account with the "or" operator when filtering.
Spying on bots: Understand site crawl behavior
Log File Analysis allows us to understand how bots behave by giving us an idea of how they prioritize. How do different bots behave in different situations? With this knowledge, you can not only deepen your understanding of SEO and crawling, but also take a huge leap in understanding the effectiveness of your site architecture.
See most and least crawled URLs
This strategy was touched on previously when looking at crawled URLs by user-agent, but this way is even faster.
In Excel, select a cell in your table and then click Insert > Pivot Table, make sure the selection contains the necessary columns (in this case, the URL or URI stem and the user-agent) and click OK.
Once you have your pivot table created, set the rows to the URL or URI stem and the values to a count of the user-agent.
From there, you can right-click in the user-agent column and sort the URLs from largest to smallest by crawl count:
Now you’ll have a great table to make charts from or quickly review and look for any problematic areas:
A question to ask yourself when reviewing this data is: Are the pages you or the client would want crawled actually being crawled? How often? Frequent crawling doesn't necessarily mean better results, but it can be an indication of what Google and other crawlers prioritize most.
Crawl frequency per day, week, or month
Checking crawling activity over a given period can help you identify issues where there has been a loss of visibility, such as after a Google update or in an emergency, and inform you where the problem might be. This is as simple as selecting the "date" column, making sure the column is in the "date" format type, and then using the date filtering options on the date column. If you're looking to analyze a whole week, just select the corresponding days with the filtering options available.
Crawl frequency by directive
Understanding what directives are being followed (for instance, if you are using a disallow or even a no-index directive in robots.txt) by Google is essential to any SEO audit or campaign. If a site is using disallows with faceted navigation URLs, for example, you’ll want to make sure these are being obeyed. If they aren’t, recommend a better solution such as on-page directives like meta robots tags.
To see crawl frequency by directive, you’ll need to combine a crawl report with your log file analysis.
(Warning: We’re going to be using VLOOKUP, but it’s really not as complicated as people make it out to be)
To get the combined data, do the following:
Crawl your site using your favorite crawling software. I might be biased, but I'm a big fan of the Screaming Frog SEO Spider, so I'm going to use that.
If you’re also using the spider, follow the steps verbatim, but otherwise, make your own call to get the same results.
Export the Internal HTML report from the SEO Spider (Internal Tab > “Filter: HTML”) and open up the “internal_all.xlsx” file.
From there, you can filter the "Indexability Status" column and remove all blank cells. To do this, use the "does not contain" filter and just leave it blank. You can also add the "and" operator and filter out redirected URLs by setting a second "does not contain" filter to "Redirected", as shown below:
This will show you canonicalized and noindexed (by meta robots) URLs.
Copy this new table out (with just the Address and Indexability Status columns) and paste it in another sheet of your log file analysis export.
Now for some VLOOKUP magic. First, we need to make sure the URI or URL column data is in the same format as the crawl data.
Log Files don't generally have the root domain or protocol in the URL, so we either need to remove the head of the URL using "Find and Replace" in our newly made sheet, or make a new column in your log file analysis sheet and append the protocol and root domain to the URI stem. I prefer this method because you can then quickly copy and paste a URL that you are seeing problems with and take a look. However, if you have a massive log file, the "Find and Replace" method is probably a lot less CPU-intensive.
To get your full URLs, use the following formula, with the domain changed to whichever site you are analyzing (and make sure the protocol is correct as well). You'll also want to change D2 to the first cell of your URL column.
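A sketch, using the hypothetical domain www.example.com with URI stems in column D:
="https://www.example.com"&D2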
Then copy the formula down to the end of your Log file table to get a nice list of full URLs:
Now, create another column and call it "Indexability Status". In the first cell, use a VLOOKUP similar to the following: =VLOOKUP(E2,CrawlSheet!A$1:B$1128,2,FALSE). Replace E2 with the first cell of your "Full URL" column, then point the lookup table at your new crawl sheet. Remember to use the dollar signs so that the lookup table doesn't shift as you apply the formula to further rows. Then, select the correct column (1 would be the first column of the lookup table, so number 2 is the one we are after). Use the FALSE range lookup mode for exact matching. Now you have a nice tidy list of URLs and their indexability status matched with crawl data:
Crawl frequency by depth and internal links
This analysis allows us to see how a site’s architecture is performing in terms of crawl budget and crawlability. The main aim is to see if you have far more URLs than you do requests — and if you do then you have a problem. Bots shouldn’t be “giving up” on crawling your entire site and not discovering important content or wasting crawl budget on content that is not important.
Tip: It is also worth using a crawl visualization tool alongside this analysis to see the overall architecture of the site and see where there are “off-shoots” or pages with poor internal linking.
To get this all-important data, do the following:
Crawl your site with your preferred crawling tool and export whichever report has both the click depth and number of internal links with each URL.
In my case, I'm using the Screaming Frog SEO Spider and exporting the Internal report:
Use a VLOOKUP to match your URL with the Crawl Depth column and the number of Inlinks, which will give you something like this:
Depending on the type of data you want to see, you might want to filter out only URLs returning a 200 response code at this point, or make them filterable options in the pivot table we create later. If you're checking an e-commerce site, you might want to focus solely on product URLs; or, if you're optimizing the crawling of images, you can filter by file type by matching the URI column of your log file against the "Content-Type" column of your crawl export, and make it an option to filter on in a pivot table. As with all of these checks, you have plenty of options!
Using a pivot table, you can now analyze crawl rate by crawl depth (filtering by the particular bot in this case) with the following options:
To get something like the following:
Better data than Search Console? Identifying crawl issues
Search Console might be a go-to for every SEO, but it certainly has flaws. Historical data is harder to get, and there are limits on the number of rows you can view (at the time of writing, 1,000). But with Log File Analysis, the sky's the limit. With the following checks, we're going to discover crawl and response errors to give your site a full health check.
Discover Crawl Errors
An obvious and quick check to add to your arsenal: all you have to do is filter the status column of your log file (in my case "sc-status", with a W3C log file type) for 4xx and 5xx errors:
Find inconsistent server responses
A particular URL may have varying server responses over time. This can be normal behavior, such as when a broken link has been fixed, or a sign of a serious server issue, such as when heavy traffic to your site causes a lot more internal server errors and affects your site's crawlability.
Analyzing server responses is as easy as filtering by URL and by Date:
Alternatively, if you want to quickly see how a URL's response code varies, you can use a pivot table with the rows set to the URL, the columns set to the response codes, and the values counting the number of times a URL has produced each response code. To achieve this setup, create a pivot table with the following settings:
This will produce the following:
As you can see in the table above, "/inconsistent.html" (highlighted in the red box) has varying response codes.
View Errors by Subdirectory
To find which subdirectories are producing the most problems, we just need to do some simple URL filtering. Filter the URI column (in my case "cs-uri-stem") and use the "contains" filtering option to select a particular subdirectory and any pages within that subdirectory (with the wildcard *):
For me, I checked out the blog subdirectory, and this produced the following:
View Errors by User Agent
Finding which bots are struggling can be useful for numerous reasons including seeing the differences in website performance for mobile and desktop bots, or which search engines are best able to crawl more of your site.
You might want to see which particular URLs are causing issues with a particular bot. The easiest way to do this is with a pivot table that allows for filtering the number of times a particular response code occurs per URI. To achieve this make a pivot table with the following settings:
From there, you can filter by your chosen bot and response code type, as in the image below, where I'm filtering for Googlebot desktop to seek out 404 errors:
Alternatively, you can also use a pivot table to see how many times a specific bot produces different response codes as a whole, by creating a pivot table that filters by bot, counts by URI occurrence, and uses response codes as rows. To achieve this, use the settings below:
For example, in the pivot table (below), I’m looking at how many of each response code Googlebot is receiving:
Diagnose on-page problems
Websites need to be designed not just for humans, but for bots. Pages shouldn't be slow to load or be a huge download, and with log file analysis you can see both of these metrics per URL from a bot's perspective.
Find slow & large pages
While you can sort your log file by the "time taken" or "loading time" column from largest to smallest to find the slowest-loading pages, it's better to look at the average load time per URL, as factors other than the web page's actual speed could have contributed to a single slow request.
To do this, create a pivot table with the rows set to the URI stem or URL and the summed value set to the time taken to load or load time:
Then, using the drop-down arrow (in this case, where it says "Sum of time-taken"), go to "Value Field Settings":
In the new window, select “Average” and you’re all set:
Now you should have something similar to the following when you sort the URI stems by largest to smallest and average time taken:
Find large pages
You can now add the download size column (in my case "sc-bytes") using the settings shown below. Remember to set the size to the average or the sum, depending on what you would like to see. For me, I've used the average:
And you should get something similar to the following:
Bot behavior: Verifying and analyzing bots
The best and easiest way to understand bot and crawl behavior is with log file analysis as you are again getting real-world data, and it’s a lot less hassle than other methods.
Find un-crawled URLs
Simply take a crawl of your website with your tool of choice, then take your log file and compare the URLs to find unique paths. You can do this with the "Remove Duplicates" feature of Excel or with conditional formatting, although the former is a lot less CPU-intensive, especially for larger log files. Easy!
Identify spam bots
Unnecessary server strain from spam and spoof bots is easily identified with log files and some basic command line operators. Most requests will also have an IP associated with them, so using your IP column (in my case titled "c-ip" in a W3C format log), remove all duplicates to find each individual requesting IP.
From there, you should follow the process outlined in Google’s document for verifying IPs (note: For Windows users, use the nslookup command):
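The check boils down to a reverse DNS lookup followed by a forward lookup to confirm the two match. A sketch from the command line on Mac/Linux (Windows users would substitute nslookup), using a hypothetical Googlebot IP from your log:
host 66.249.66.1
1.66.249.66.in-addr.arpa domain name pointer crawl-66-249-66-1.googlebot.com.
host crawl-66-249-66-1.googlebot.com
crawl-66-249-66-1.googlebot.com has address 66.249.66.1
If the hostname doesn't end in googlebot.com, or the forward lookup doesn't return the original IP, you've found a spoof bot.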
Conclusion: Log File Analysis — not as scary as it sounds
With some simple tools at your disposal, you can dive deep into how Googlebot behaves. When you understand how a website handles crawling, you can diagnose more problems than you'll have time to fix — but the real power of Log File Analysis lies in being able to test your theories about Googlebot and extending the above techniques to gather your own insights and revelations.
What theories would you test using log file analysis? What insights could you gather from log files other than the ones listed above? Let me know in the comments below.
After you've put in the work with technical SEO and made your discoveries, there's one thing left to do: present your findings to the client and agree on next steps. And like many things in our industry, that's easier said than done. In this week's episode of Whiteboard Friday, Benjamin Estes from Distilled presents his framework for making technical recommendations to clients and stakeholders to best position you for success.
Click on the whiteboard image above to open a high-resolution version in a new tab!
Hi. My name is Ben. I’m a principal consultant at a company called Distilled. Welcome to Whiteboard Friday. Today I’d like to talk to you about something a bit different than most Whiteboard Fridays.
I’d like to talk about how to work with clients or bosses in a different way. Instead of thinking about technical SEO and how to make technical discoveries or see what problems are, I want to talk about how to present your findings to your client after you’ve done that discovery.
What’s the problem that we’re dealing with here? Well, the scenario is that we’ve got a recommendation and we’re presenting it to a client or a boss.
Easy enough. But what’s the goal of that situation? I would argue that there’s a very specific goal, and the best way to look at it is the goal is to change the action of the individual or the organization. Now, what if that wasn’t the case? You know, what if you worked with a client and none of their actions changed as a result of that engagement? Well, what was the point?
You know, should they have even trusted you in the first place to come in and help them? So if this is the specific goal that we’re trying to accomplish, what’s the best way to do that? Most people jump right to persuasion. They say, “If only I could something, the client would listen to me.” “If only I could present the forecast.”
If only I could justify the ROI, something, some mysterious research that probably hasn’t been done yet and maybe can’t even be done at all. My argument here is that the idea of persuasion is toxic. When you say, “If only I could this,” really what you mean is, “If only I had the evidence, the client would have to do as I say.” You’re trying to get control over the client when you say these things.
It turns out that human beings basically do whatever they want to do, and no matter how well you make your case, if it’s made for your reasons and not the client’s, they’re still not going to want to do the thing that you recommend. So I’ve introduced a framework at Distilled that helps us get past this, and that’s what I’d like to share with you right now.
The key to this method is that at each step of the process you allow the client to solve the problem for themselves. You give them the opportunity to see the problem from their own perspective and maybe even come up with their own solution. There are three steps to this.
First, you suggest the problem.
When I say “suggest,” I don’t mean suggest a solution. I mean you plant the idea in their mind that this is a problem that needs solving. It’s almost like inception. So you first say, “Here is what I see.” Hold up the mirror to them. Make the observations that they haven’t yet made themselves.
Step two, demonstrate, and what demonstrate means is you’re allowing them to emulate your behavior.
You’re demonstrating what you would do in that situation if you had to deal with the same problem. So you say, “Here’s what I would do if I were in your shoes.”
Finally, you elaborate. You say, “Here’s why I think this is a reasonable activity.” Now I’ve got to be honest. Most of the time, in my experience, if you use this framework, you never even make it to elaboration, because the client solves the problem somewhere back here and you can just end the meeting.
The key, again, is to let the client solve the problem for themselves, for their own reason, in the way that they feel most comfortable.
Let’s look at an example, because that is, again, kind of abstract. So let’s say that you’ve made an observation in Google Search Console. The client has all these pages that Google has discovered, but they shouldn’t really be in the index or indexable or discoverable at all.
Start by suggesting
So you start by suggesting. "I see in Search Console that Google has discovered 18 million pages," when it should be, let's say, 10,000. "This is from your faceted navigation." Now notice there's no judgment. There's no hint at what should be done about this or even the severity of the problem. You're just presenting the numbers.
Now we’re already sort of at a turning point. Maybe the client hears this and they do a sort of a head slap and they say, “Of course. You know, I hadn’t seen that problem before. But here’s what I think we should do about it.” You reach some sort of agreement, and the problem is solved and the meeting is over and you get that hour back in your day. But maybe they sort of have some sort of questions about what this means, what this implies, and they want to hear your solution.
Demonstrate what you would do
Well, now it’s time to demonstrate what you would do when presented with that fact. You say, “This would be fixed by adding ‘nofollow’ to links to that faceted content.” Maybe they see how this is an obvious solution to the problem that’s completely compatible with their tech stack, and again you get 50 minutes back in your day because the meeting is done.
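(In markup terms, that recommendation might look something like this, with a hypothetical facet URL:
<a href="/shirts?color=red&size=xl" rel="nofollow">Red XL shirts</a>)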
You’ve done your job. Or maybe they don’t. Maybe they don’t understand why that would be a good solution.
So finally, you get to this stage, which is elaboration. “Here’s why I think this is a good idea. These pages are important for user experience. You don’t want to get rid of the faceted navigation in your e-commerce store, but you do want to not link to those pages for SEO reasons, because maybe there’s no search volume for related terms.”
So for a particular cost range for an item or something like that, there’s just no associated search activity. You need the pages still. So you say, “These pages are important for user experience, but they don’t satisfy any search intent.” At that point, the client says, “Of course. You’ve come up with the ideal solution, and I’m going to implement your recommendation exactly as you’ve given it to me.”
Or they don’t. If they don’t, you’re no worse off. You can basically walk out of that meeting saying, “I’ve done everything possible to get the client on board with my recommendation, but it just didn’t work out.” That feeling of being able to know that you did the right thing has been a very powerful one, at least in my experience. I’ve been consulting for about eight years, and just going through this process helps me sleep better at night knowing that I really did my job.
We’ve also found that this has a really high success rate with clients too. Finally, you’ll discover that it’s much, much easier to put together presentations if you know that this is the format that you’re going to be presenting in. So if you think that your job is to give the evidence to the client to convince them of something, there’s really no end to the evidence that you could gather.
You could always gather more evidence, and when you get to that final meeting, you can say, "Oh, it's not because I saw the problem in the wrong way or communicated it in the wrong way. It's that I didn't justify the ROI enough." There's no end to that; the rabbit hole just keeps going. So again, this method has been extremely successful for Distilled. If you're interested in engaging with this more, you can read more at this URL, dis.tl/present, where I give a more thorough write-up on this.
Of course, I’d love to hear any thoughts or experiences that you have with this method. Thank you very much.
Can your marketing agency make a profit working with low-budget clients in rural areas?
Could you be overlooking a source of referrals, publicity, and professional satisfaction if you’re mainly focused on landing larger clients in urban locales? Clients in least-populated areas need to capture every customer they can get to be viable, including locals, new neighbors, and passers-through. Basic Local SEO can go a long way toward helping with this, and even if package offerings aren’t your agency’s typical approach, a simple product that emphasizes education could be exactly what’s called for.
Today, I’d like to help you explore your opportunities of serving rural and very small town clients. I’ve pulled together a sample spreadsheet and a ton of other resources that I hope will empower you to develop a bare-bones but high-quality local search marketing package that will work for most and could significantly benefit your agency in some remarkable ways.
Everything in moderation
The linchpin fundamental to the rural client/agency relationship is that the needs of these businesses are so exceedingly moderate. The competitive bar is set so low in a small-town-and-country setting, that, with few exceptions, clients can make a strong local showing with a pared-down marketing plan.
Let’s be honest — many businesses in this scenario can squeak by on a website design package from some giant web hosting agency. A few minutes spent with Google’s non-urban local packs attest to this. But I’m personally dissatisfied by independent businesses ending up being treated like numbers because it’s so antithetical to the way they operate. The local hardware store doesn’t put you on hold for 45 minutes to answer a question. The local farm stand doesn’t route you overseas to buy heirloom tomatoes. Few small town institutions stay in business for 150 years by overpromising and under-delivering.
Let’s assume that many rural clients will have some kind of website. If they don’t, you can recommend some sort of freebie or cheapie solution. It will be enough to get them placed somewhere in Google’s results, but if they never move beyond this, the maximum conversions they need to stay in business could be missed.
I’ve come to believe that the small-to-medium local marketing agency is the best fit for the small-to-medium rural brand because of shared work ethics and a similar way of doing business. But both entities need to survive monetarily and that means playing a very smart game with a budget on both sides.
It’s a question of organizing an agency offering that delivers maximum value with a modest investment of your time and the client’s money.
Constructing a square deal
When you take on a substantial client in a large town or city, you pull out all the stops. You dive deeply into auditing the business, its market, its assets. You look at everything from technical errors to creative strengths before beginning to build a strategy or implement campaigns, and there may be many months or years of work ahead for you with these clients. This is all entirely appropriate for big, lucrative contracts.
For your rural roster, prepare to scale way back. Here is your working plan:
1. Schedule your first 15-minute phone call with the client
Avoid the whole issue of having to lollygag around waiting for a busy small business owner to fill out a form. Schedule an appointment and have the client be at their place of business in front of a computer at the time of the call. Confirm the following ultra-basic data about the client:
Business model (single location brick-and-mortar, SAB, etc.)
Are there any other businesses at this address?
Main products/services offered
If SAB, list of cities served
Most obvious search phrase they want to rank for
Year established and year they first took the business online
Have they ever been aware of a penalty on their website or had Google tell them they were removing a listing?
Finally, have the client (who is in front of their computer at their place of business) search for the search term that’s the most obviously important and read off to you the names and URLs of the businesses ranking in the local pack and on the first page of the organic results.
And that’s it. If you pay yourself $100/hr, this quick session yields a charge of $25.
2. Make a one-time investment in writing a bare-bones guide to Local SEO
Spend less than one working day putting together a .pdf file or Google doc written in the least-technical language containing the following:
Your briefest, clearest definition of what local SEO is and how it brings customers to local businesses. Inspiration here.
An overview of 3 key business models: brick & mortar, SAB, and home-based so the client can easily identify which of these models is theirs.
Foolproof instructions for creating a Google account and creating and claiming a GMB listing. Show the process step-by-step so that anyone can understand it. Inspiration here.
A list of top general industry citation platforms with links to the forms for getting listed on them. Inspiration here and if the client can hit at least a few of these, they will be off to a good start.
An overview of the role of review acquisition and response, with a few simple tips for earning reviews and a list of the top general industry review platforms. Inspiration here and here.
An overview of the role of building offline relationships to earn a few online linktations. Inspiration here.
Links to the Google My Business forum and the main Google support platforms including their phone number (844.491.9665), Facebook, Twitter, and online chat. Tell the client this is where to go if they encounter a problem with their Google listing in the future.
Your agency’s complete contact information so that the business can remember who you are and engage you for further consulting down the road, if ever necessary.
If you pay yourself $100 an hour, investing in creating this guide will cost you less than $1,000. That's a modest amount that you can quickly earn back from clients. Hopefully, the inspirational links I've included will give you a big head start. Avoid covering anything trendy (like some brand new Google feature) so that the only time you should have to update the guide in the near future will be if Google makes some major changes to their guidelines or dashboard.
Deliver this asset to every rural client as their basic training in the bare essentials of local marketing.
3. Create a competitive audit spreadsheet once and fill it out ad infinitum
What you want here is something that lets you swiftly fill in the blanks.
For the competitive audit, you’ll be stacking up your client’s metrics against the metrics of the business they told you was ranking at the top of the local pack when they searched from their location. You can come up with your own metrics, or you can make a copy of this template I’ve created for you and add to it/subtract from it as you like.
Make a copy of the ultra-basic competitive local audit template — you can do so right here.
You’ll notice that my sample sheet does not delve deeply into some of the more technical or creative areas you might explore for clients in tougher markets. With few exceptions, rural clients just don’t need that level of insight to compete.
Give yourself 45 focused minutes filling in the data in the spreadsheet. You’ve now invested 1 hour of time with the client. So let’s give that a value of $100.
4. Transfer the findings of your audit into a custom report
Here’s another one-time investment. Spend no more than one workday creating a .pdf or Google Docs template that takes the fields of your audit and presents them in a readable format for the client. I’m going to leave exact formatting up to you, but here are the sections I would recommend structuring the report around:
A side-by-side comparison of the client vs. competitor metrics, bucketed by topic (Website, GMB, Reputation, Links, Citations, etc.)
A very basic explanation of what those metrics mean
A clear recommendation of what the client should do to improve their metrics
For example, your section on reputation might pair the two businesses’ review counts and ratings, briefly explain why those numbers matter, and end with one concrete next step for the client.
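If you like to automate the fill-in step, a small script along these lines could assemble that section straight from your audit fields. This is a hypothetical sketch; the wording and figures are placeholders to adapt, not copy.

```python
# Illustrative only: turn audit fields into the "Reputation" section of
# the client report. The copy and numbers below are placeholders.

def reputation_section(client_reviews, competitor_reviews,
                       client_rating, competitor_rating):
    if client_reviews < competitor_reviews:
        next_step = ("Ask your happiest customers for a Google review "
                     "each week until you close the gap.")
    else:
        next_step = "Keep up your current review acquisition pace."
    return ("REPUTATION\n"
            f"  Your Google reviews: {client_reviews} (avg rating {client_rating})\n"
            f"  Competitor's reviews: {competitor_reviews} (avg rating {competitor_rating})\n"
            "  What this means: review count and rating influence both local "
            "rankings and customer decisions.\n"
            f"  What to do: {next_step}")

print(reputation_section(12, 47, 4.2, 4.6))
```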
The beauty of this is that, once you have the template, all you have to do is fill it out and then spend an hour making intelligent observations based on your findings.
Constructing the template should take you less than one workday, so it’s a one-time investment of less than $1,000 if you’re paying yourself $100/hr.
Transferring the findings of your audit from the spreadsheet to the report for each client should take about 1 hour. So, we’re now up to two total hours of effort for each unique client.
5. Excelling at value
So, you’ve now had a 15-minute conversation with a client, given them an introductory guide to the basics of local search marketing, and delivered a customized report filled with your observations and their to-dos. Many agencies might call it a day and leave the client to interpret the report on their own.
But you won’t do that, because you don’t want to waste an incredible opportunity to build a firm relationship with a business. Instead, spend one more hour on the phone with the owner, going over the report with them page by page and allowing a few minutes for any of their questions. This is where you have the chance to deliver exceptional value to the client, telling them exactly what you think will be most helpful for them to know in a true teaching moment.
At the end of this, you will have become a memorable ally, someone they trust, and someone they will confidently refer to their colleagues, family members, and neighbors.
You’ve made an overall investment of less than $2,000 to create your rural/small town marketing program.
Packaging up the guide, the report, and the 1:1 phone consulting, you have a base price of $300 for the product: three hours of per-client work if you pay yourself $100/hour.
However, I’m going to suggest that, based on the level of local SEO expertise you bring to the scenario, you create a price point somewhere between $300–$500 for the package. If you are still relatively green at local SEO, $300 could be a fair price for three hours of consulting. If you’re an industry adept, scale it up a bit, because you bring a rare level of insight to every client interaction, even if you’re sticking to the absolute basics. Sell several of these packages in a week, and it will start totaling up to a good monthly revenue stream.
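If you want to sanity-check the math at your own rates, here’s a quick back-of-the-envelope sketch. The figures are the assumptions from this post; adjust them freely.

```python
# Rough economics for the rural package, using this post's assumed figures.

HOURLY_RATE = 100       # what you pay yourself, in $/hour
ONE_TIME_HOURS = 16     # ~1 workday for the guide + ~1 for the report template
PER_CLIENT_HOURS = 3    # 15-min intake + 45-min audit + 1 hr report + 1 hr call
PRICE = 400             # anywhere in the suggested $300-$500 range

one_time_cost = ONE_TIME_HOURS * HOURLY_RATE      # $1,600, under the $2,000 ceiling
per_client_cost = PER_CLIENT_HOURS * HOURLY_RATE  # $300 of your time per package
margin = PRICE - per_client_cost                  # what repays the setup cost

if margin > 0:
    breakeven = -(-one_time_cost // margin)  # ceiling division
    print(f"Sell {breakeven} packages at ${PRICE} to recoup the ${one_time_cost} setup cost.")
else:
    print("This price only covers your per-client hours; the setup cost never repays.")
```

At $400, for example, each sale leaves $100 after your three hours of time, so the sixteenth package pays off the setup work.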
As a marketer, I’ve generally shied away from packages, because whenever you dig deeply into a client’s scenario, the nuances end up requiring so much custom research and communication. But for the very smallest clients in the least competitive markets, packages can hit the spot.
Considerable benefits for your agency
The client is going to walk away from the relationship with a good deal … and likely a lot to do. If they follow your recommendations, it will typically be just what they needed to establish themselves on the web to the extent that neighbors and travelers can easily find them and choose them for transactions. Good job!
But you’re going to walk away with some amazing benefits, too, some of which you might not have considered before. To wit:
1. Relationships and the ripple effect
A client you’ve treated very well on the phone is a client who is likely to remember you for future needs and recommend you. I’ve had businesses send me lovely gifts on top of my consulting fee because I’ve taken the time to really listen and answer questions. SEO agencies are always looking for ways to build authentic relationships. Don’t overlook the small client as a centroid of referrals throughout a tight-knit community and beyond it to their urban colleagues, friends, and family.
2. Big data for insights and bragging rights
If your package becomes popular, a ton of data is going to start passing through your hands. The more of these audits you do, the more time you’re spending actively observing Google’s handling of the localized SERPs. Imagine the blog posts your agency can begin publishing by anonymizing and aggregating this data, pulling insights of value to our industry. There is no end to the potential for you to grow your knowledge.
Apart from case studies, think of the way this package can both build up your proud client roster and serve as a source of client reviews. The friendly relationship you’ve built with that 1:1 time can now become a font of very positive portfolio content and testimonials for you to publish on your website.
3. Agency pride from helping rebuild rural America
Have you noticed the recent spate of hit TV shows that hinge on rebuilding dilapidated American towns? Industry consolidation is most often cited as the root of rural collapse, with small farmers and independent businesses no longer able to create a tax base to support basic community needs like hospitals, fire departments, and schools. Few of us rejoice at the idea of Main Streets — long-cherished hallmarks not just of Americana but of shared American identity — becoming ghost towns.
It can be a source of professional satisfaction for your marketing agency to offer these brave and hard-working business owners a good deal and the education they need to present themselves well on the web. I live in a rural area, and I know just how much a little solid advice can help. I feel extra good knowing I’m contributing to America’s rural comeback story.
Promoting your rural local SEO package
Once you’ve got your guide and templates created, what next? Here are some simple tips:
Create a terrific landing page on your website specifically for this package and call it out on your homepage as well. Wherever appropriate, build internal links to it.
Promote on social media.
Blog about why you’ve created the package, aligning your agency as an ally to the rebuilding of rural communities.
If, like me, you live in a rural area, consider presenting at local community events that will put you in front of small business owners.
Don’t overlook old-school media like community message boards at the local post office, or even fliers tacked to electric poles.
If you’re a city slicker, consider how far you’d have to travel to get to the nearest rural community to participate in events.
Advertising both offline and online in rural papers can be quite economical. There are also place-of-worship print bulletins, local school papers, and other publications that welcome sponsors. Give it a try.
And, of course, ask happy clients to refer you, telling them what it means to your business. You might even develop a referral program.
The truth is that your agency may not be able to live by rural clients alone. You may still be targeting the bulk of your campaigns towards urban enterprises, because just a few highly competitive clients can bring welcome security to your bank account.
But maybe this is a good day to start looking beyond the fast food franchise, the NY attorney and the LA dermatology group. The more one reads about rural entrepreneurs, the more one tends to empathize with them, and empathy is the best foundation I know of for building rewarding business relationships.