After you’ve put in the work with technical SEO and made your discoveries, there’s one thing left to do: present your findings to the client and agree on next steps. And like many things in our industry, that’s easier said than done. In this week’s episode of Whiteboard Friday, Benjamin Estes from Distilled presents his framework for making technical recommendations to clients and stakeholders to best position you for success.
Click on the whiteboard image above to open a high-resolution version in a new tab!
Hi. My name is Ben. I’m a principal consultant at a company called Distilled. Welcome to Whiteboard Friday. Today I’d like to talk to you about something a bit different than most Whiteboard Fridays.
I’d like to talk about how to work with clients or bosses in a different way. Instead of thinking about technical SEO and how to make technical discoveries or see what problems are, I want to talk about how to present your findings to your client after you’ve done that discovery.
What’s the problem that we’re dealing with here? Well, the scenario is that we’ve got a recommendation and we’re presenting it to a client or a boss.
Easy enough. But what’s the goal of that situation? I would argue that there’s a very specific goal, and the best way to look at it is the goal is to change the action of the individual or the organization. Now, what if that wasn’t the case? You know, what if you worked with a client and none of their actions changed as a result of that engagement? Well, what was the point?
You know, should they have even trusted you in the first place to come in and help them? So if this is the specific goal that we’re trying to accomplish, what’s the best way to do that? Most people jump right to persuasion. They say, “If only I could something, the client would listen to me.” “If only I could present the forecast.”
If only I could justify the ROI, something, some mysterious research that probably hasn’t been done yet and maybe can’t even be done at all. My argument here is that the idea of persuasion is toxic. When you say, “If only I could this,” really what you mean is, “If only I had the evidence, the client would have to do as I say.” You’re trying to get control over the client when you say these things.
It turns out that human beings basically do whatever they want to do, and no matter how well you make your case, if it’s made for your reasons and not the client’s, they’re still not going to want to do the thing that you recommend. So I’ve introduced a framework at Distilled that helps us get past this, and that’s what I’d like to share with you right now.
The key to this method is that at each step of the process you allow the client to solve the problem for themselves. You give them the opportunity to see the problem from their own perspective and maybe even come up with their own solution. There are three steps to this.
First, you suggest the problem.
When I say “suggest,” I don’t mean suggest a solution. I mean you plant the idea in their mind that this is a problem that needs solving. It’s almost like inception. So you first say, “Here is what I see.” Hold up the mirror to them. Make the observations that they haven’t yet made themselves.
Step two, demonstrate, and what demonstrate means is you’re allowing them to emulate your behavior.
You’re demonstrating what you would do in that situation if you had to deal with the same problem. So you say, “Here’s what I would do if I were in your shoes.”
Finally, you elaborate. You say, “Here’s why I think this is a reasonable activity.” Now I’ve got to be honest. Most of the time, in my experience, if you use this framework, you never even make it to elaboration, because the client solves the problem somewhere back here and you can just end the meeting.
The key, again, is to let the client solve the problem for themselves, for their own reason, in the way that they feel most comfortable.
Let’s look at an example, because that is, again, kind of abstract. So let’s say that you’ve made an observation in Google Search Console. The client has all these pages that Google has discovered, but they shouldn’t really be in the index or indexable or discoverable at all.
Start by suggesting
So you start by suggesting. “I see in Search Console that Google has discovered 18 million pages,” when it should be, let’s say, 10,000. “This is from your faceted navigation.” Now notice there’s no judgment. There’s no hint at what should be done about this or even the severity of the problem. You’re just presenting the numbers.
Now we’re already sort of at a turning point. Maybe the client hears this and they do a sort of a head slap and they say, “Of course. You know, I hadn’t seen that problem before. But here’s what I think we should do about it.” You reach some sort of agreement, and the problem is solved and the meeting is over and you get that hour back in your day. But maybe they have some questions about what this means and what this implies, and they want to hear your solution.
Demonstrate what you would do
Well, now it’s time to demonstrate what you would do when presented with that fact. You say, “This would be fixed by adding ‘nofollow’ to links to that faceted content.” Maybe they see how this is an obvious solution to the problem that’s completely compatible with their tech stack, and again you get 50 minutes back in your day because the meeting is done.
You’ve done your job. Or maybe they don’t. Maybe they don’t understand why that would be a good solution.
So finally, you get to this stage, which is elaboration. “Here’s why I think this is a good idea. These pages are important for user experience. You don’t want to get rid of the faceted navigation in your e-commerce store, but you do want to not link to those pages for SEO reasons, because maybe there’s no search volume for related terms.”
So for a particular cost range for an item or something like that, there’s just no associated search activity. You need the pages still. So you say, “These pages are important for user experience, but they don’t satisfy any search intent.” At that point, the client says, “Of course. You’ve come up with the ideal solution, and I’m going to implement your recommendation exactly as you’ve given it to me.”
Or they don’t. If they don’t, you’re no worse off. You can basically walk out of that meeting saying, “I’ve done everything possible to get the client on board with my recommendation, but it just didn’t work out.” That feeling of being able to know that you did the right thing has been a very powerful one, at least in my experience. I’ve been consulting for about eight years, and just going through this process helps me sleep better at night knowing that I really did my job.
We’ve also found that this has a really high success rate with clients too. Finally, you’ll discover that it’s much, much easier to put together presentations if you know that this is the format that you’re going to be presenting in. So if you think that your job is to give the evidence to the client to convince them of something, there’s really no end to the evidence that you could gather.
You could always gather more evidence, and when you get to that final meeting, you can say, “Oh, it’s not because I saw the problem in the wrong way or I communicated it in the wrong way. It’s that I didn’t justify the ROI enough.” There’s no end to that rabbit hole; it just keeps going and going. So again, this method has been extremely successful for Distilled. If you’re interested in engaging with this more, you can read at this URL, dis.tl/present, where I give a more thorough write-up on this.
Of course, I’d love to hear any thoughts or experiences that you have with this method. Thank you very much.
Agencies, are you set up for ongoing Google Tag Manager success? GTM isn’t the easiest tool in the world to work with, but if you know how to use it, it can make your life much easier. Make your future self happier and more productive by setting up your GTM containers the right way today. Dana DiTomaso shares more tips and hints in this edition of Whiteboard Friday.
Click on the whiteboard image above to open a high-resolution version in a new tab!
Hi, Moz fans. My name is Dana DiTomaso. I am President and partner at Kick Point, which is a digital marketing agency based in Edmonton, Alberta. Today I’m going to be talking to you about Google Tag Manager and what your default container in Google Tag Manager should contain. I think if you’re in SEO, there are certainly a lot of things Google Tag Manager can do for you.
But if you’ve kind of said to yourself, “You know, Google Tag Manager is not the easiest thing to work with,” which is fair, it is not, and it used to be a lot worse, but the newer versions are pretty good, then you might have been a little intimidated by going in there and doing stuff. But I really recommend that you include these things by default because later you is going to be really happy that current you put this stuff in. So I’m going to go through what’s in Kick Point’s default Google Tag Manager container, and then hopefully you can take some of this and apply it to your own stuff.
Agencies, if you are watching, you are going to want to create a default container and use it again and again, trust me.
So we’re going to start with how this stuff is laid out. So what we have are tags and then triggers. The way that this works is the tag is sort of the thing that’s going to happen when a trigger occurs.
So tags that we have in our default container are the conversion linker, which helps ad-click conversions get attributed correctly in browsers like Safari that restrict third-party cookies.
Then we need to track a number of events. You can certainly track these things as custom dimensions or custom metrics if that floats your boat. I mean that’s up to you. If you are familiar with using custom dimensions and custom metrics, then I assume you probably know how to do this. But if you’re just getting started with Tag Manager, just start with events and then you can roll your way up to being an expert after a while.
So under events, we always track external links, so anything that points out to a domain that isn’t yours.
The way that we track this is we’re looking at every single link that’s clicked, and if it does not contain our client’s domain name, then we record it as an external link, and that’s an event that we record. Now be careful here. I’ve seen accidents where someone doesn’t put in the client’s domain, and then every single click to a different page on the client’s own website gets tracked as an external link. That’s bad.
When you transfer from HTTP to HTTPS, if you don’t update Google Tag Manager, it will start recording links incorrectly. Also bad. But what this is really useful for are things like when you link out to other websites, as you should when you’re writing articles, telling people to find out more information. Or you can track clicks out to your different social properties and see if people are actually clicking on that Facebook icon that you stuck in the header of your website.
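In GTM you would normally build this with a “Just Links” click trigger and a filter on the Click URL variable, but the underlying check is easy to sketch in plain JavaScript. Everything here, including the example.com hostname, is illustrative:

```javascript
// Minimal sketch of the external-link check described above.
// "example.com" is a placeholder: use your client's real domain,
// or you will tag every internal click as external (the accident
// mentioned in the text).
function isExternalLink(href, siteHostname) {
  var url;
  try {
    url = new URL(href);
  } catch (e) {
    return false; // relative URLs and anchors stay internal
  }
  // Treat the bare domain and any subdomain (www, blog, ...) as internal.
  var host = url.hostname;
  return host !== siteHostname && !host.endsWith('.' + siteHostname);
}

console.log(isExternalLink('https://twitter.com/kickpoint', 'example.com')); // true
console.log(isExternalLink('https://www.example.com/about', 'example.com')); // false
console.log(isExternalLink('/contact', 'example.com'));                      // false
```

Comparing hostnames rather than doing a raw substring match also avoids false internals like `https://not-example.com`.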
The next thing to track is PDF downloads.
Now there’s a limitation to this, of course, in that if people google something and your PDF comes up and they click on it directly from the search results, that’s not going to show up in your Analytics. It can show up in Search Console, but you won’t get it in Analytics. So just keep that in mind. This tracks when someone clicks through to your PDF from a page on your website. Again, you’re inspecting the clicked link: if the URL contains “.pdf”, then you fire the event.
Then we also do scroll tracking. Scroll tracking is when people scroll down the page, and you can fire an event at, say, 25%, 50%, 75%, and 100% of the way down. Now the thing with this is that your mileage is going to vary. You will probably pick different percentages. By default, in all of our containers we put 25%, 50%, 75%, and 100%. Based on the client, we might change this.
An advanced, sort of level up tactic would be to pick specific elements and then when they enter the viewport, then you can fire an event. So let’s say, for example, you have a really important call to action and because different devices are different sizes, it’s going to be a different percentage of the way down the page when it shows up, but you want to see if people got to that main CTA. Then you would want to add an event that would show whether or not that CTA was shown in the viewport.
If you google Google Tag Manager and tracking things in the viewport, there are some great articles out there on how to do it. It’s not that difficult to set up.
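GTM’s built-in Element Visibility trigger handles this without custom code, but if you are curious what the mechanism looks like, here is a rough JavaScript sketch using IntersectionObserver. The “main-cta” element id and the event name are placeholders, not anything GTM requires:

```javascript
// Sketch of element-visibility tracking. The callback is a plain
// function that pushes a GTM dataLayer event for each element that
// enters the viewport, so it can be reasoned about on its own.
function onElementVisible(entries, dataLayer) {
  entries.forEach(function (entry) {
    if (entry.isIntersecting) {
      dataLayer.push({
        event: 'element_visible',   // placeholder event name
        elementId: entry.target.id  // e.g. "main-cta"
      });
    }
  });
}

// Browser-only wiring (IntersectionObserver does not exist outside
// the browser). window.dataLayer is created by the GTM snippet.
if (typeof IntersectionObserver !== 'undefined') {
  var observer = new IntersectionObserver(function (entries) {
    onElementVisible(entries, window.dataLayer || []);
  });
  observer.observe(document.getElementById('main-cta'));
}
```

Because the observer fires whenever the element enters the viewport, regardless of screen size, it solves exactly the “different percentage on different devices” problem described above.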
Then also form submits. Of course, you’re going to want to customize this. But by default put form submits in your container, because I guarantee that when someone is making your container let’s say for a brand-new website, they will forget about tracking form submits unless you put it in your default container and they look at it and say, “Oh, right, I have to edit that.” So always put form submits in there.
Tel: & mailto: links
Of course you want to track telephone links and mailto: links. Telephone links should always, always be tappable, and that’s something I see a lot of mistakes with. Particularly in local SEO, when we’re dealing with really small business websites, they don’t make the telephone links tappable. It’s probably because people don’t know how. In case you don’t know how, you just write tel, then a colon, and then the telephone number.
<a href="tel:+5555555555">(555) 555-5555</a>
That’s it. That’s all you need to do. Just like a link, except rather than going out to an https:// URL, you’re going out to a telephone number. That is going to make your visitors’ lives so much easier, particularly on mobile devices. You always want to have those be tappable. So then you can track the number of people who tap on telephone links, and mailto: links work exactly the same way. Now something that I do have to say, though, is that if you are using a call tracking provider, like CallRail for example, which is one that we use, then you’re going to want to shut this off, because otherwise you could end up double counting.
CallRail has an Analytics integration that tracks every call made from your website, so you would be recording calls there and might also be recording telephone link taps as separate events. You can still track taps if you want to compare how many people tap versus picking up the phone and calling the old-fashioned way on a landline. That’s entirely up to you. But just keep the double counting in mind if you are going to track telephone links.
All pages tracking
Then, of course, all pages tracking. Make sure you’re tracking all of the pages on your website through Google Analytics. So those are the tags.
Next up are the triggers. So I have a tag of external links. Then I need a trigger for external links. The trigger says when somebody clicks an external link, then I want this event to happen.
So the event is where you structure the category and then the action and the label.
The way that we would structure external links, for example: the category is external link, the action is click, and the label is the actual link that was clicked. Then you can go through each of these and see where this is happening.
Then on things like form submit, for example, our label could be the specific form.
Tel: & mailto:
On telephone and mailto:, we might track the phone number.
On other things, like PDFs, we might track the page that this happened on.
For scroll tracking, for example, we would want to track the page that someone scrolled down on. What I recommend when you’re setting up the event tracking for page scroll, the category should be page scroll, the action should be the percentage of which people scroll down, and then the label should be the URL.
Really think of it in terms of events: the category is what happened, the action is what the person did, and the label tells you more information about it. Actions are typically things like scroll, click, and tap, if you’re going to be fancy and track mobile versus desktop. It could be things like form submit, for example, or just submit. Really basic stuff. So the two things that really differentiate your events are the category and the label; the action is just the action that happened.
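To make the category / action / label shape concrete, here is a small illustrative helper. The field names follow the classic Universal Analytics event fields, and the “ga_event” event name is a placeholder you would match in your own trigger:

```javascript
// Sketch of the category / action / label event structure
// described above, as it might be pushed to GTM's dataLayer.
function buildEvent(category, action, label) {
  return {
    event: 'ga_event',       // placeholder custom event name
    eventCategory: category, // what happened
    eventAction: action,     // what the person did
    eventLabel: label        // extra detail, e.g. the URL
  };
}

// The examples mirror the ones in the text:
console.log(buildEvent('Page Scroll', '75%', 'https://example.com/blog/post'));
console.log(buildEvent('External Link', 'Click', 'https://twitter.com/kickpoint'));
```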
I’m really pedantic when it comes to setting up events, but I think in the long term, again, future you is going to thank you if you set this stuff up properly from the beginning. So you can really see that the tag goes to this trigger. Tag to trigger, tag to trigger, etc. So really think about making sure that every one of your tags has a corresponding trigger if it makes sense. So now we’re going to leave you with some tips on how to set up your Tag Manager account.
1. Use a Google Analytics ID variable
So the first tip is use a Google Analytics ID variable. It’s one of the built-in variables. When you go into Tag Manager and you click on Variables, it’s one of the built-in variables in there. I really recommend using that, because if you hardcode in the GA ID and something happens and you have to change it in the future or you copy that for someone else or whatever it might be, you’re going to forget.
I guarantee you, you will forget. So you’re going to want to put that variable in there so you change it once and it’s everywhere. You’re saving yourself so much time and suffering. Just use a Google Analytics ID variable. If you have a really old container, maybe the variable wasn’t a thing when you first set it up. So one of the things I would recommend is go check and make sure you’re using a variable. If you’re not, then make a to-do for yourself to rip out all the hardcoded instances of your GA ID and instead replace them with a variable.
It will save you so many headaches.
2. Create a default container to import
So the next thing, and agencies, this is for you: create a default container to import. Obviously, if you’re working in-house, you’re probably not making Google Tag Manager containers all that often, unless you work at, say, a homebuilder and you’re making microsites for every new home development. Then you might want to create a default container for yourself. But agency side, for sure, you want to have a default container that you maintain. Every cool idea you think of, everything you decide you need to track, goes into your default container, and then when you’re grabbing it to make one for a client, you can decide, oh, we don’t need this, or yes, we need this.
It’s going to save you a ton of time when you’re setting up containers, because I find that the most labor-intensive part of working with a new Tag Manager container is thinking about, “What is all the stuff I want to include?” So you want to make sure that your default container has all the tips and tricks you’ve accumulated over the years in there, documented of course, and then decide on a client-by-client basis what you’re going to keep and what you’re going to leave out.
3. Use a naming scheme and folders
Also use a naming scheme and folders, again because you may not be working there forever, and somebody in the future is going to want to look at this and think, “Why did they set it up like this? What does this word mean? Why is this variable called foo?” You know, things that have annoyed me about developers for years and years and years, developers I love you, but please stop naming things foo. It makes no sense to anyone other than you. So our naming scheme, and you can totally steal this if you want, is we go product, result, and then what.
So, for example, we would have our tag for a Google Analytics PDF download. It would say Google Analytics, which is the product the data is going to. Event is the result of this thing existing. Then the what is the PDF download. Then it’s really clear: okay, something is weird with PDF downloads, I need to fix this thing, and I know exactly where to go.
The same goes for folders. Let’s say you’ve implemented something such as content consumption, which is a Google Tag Manager recipe that you can grab on our website at Kickpoint.ca, and I’ll make sure to link to it in the transcript. Then you’re going to want to take all the different tags and triggers that come along with content consumption, toss them into their own folder, and separate them out from all of your basic stuff.
Even if you start with everything in a folder called Basics or Events, or Analytics versus Call Tracking versus any of the other billion different tracking pixels that you have on your website, it’s a good idea to just keep it all organized. I know it’s an extra two minutes now, but it is saving you a lifetime of suffering in the future. Whether it’s future you working there or somebody who ends up taking your job five years from now, just make it easier on them.
Think about how long Google Analytics has been around now. When I go back and look at some of the very first analytics I set up, I might think, “Why was I doing that?” But if you have documentation, at least you’re going to know why you did that really weird thing back in 2008. Or when you’re looking at this in 2029 and thinking, “Why did I do this thing in 2019?” you’re going to have documentation for it. So really keep that in mind.
4. Audit regularly!
Then the last thing is auditing regularly, and that means once every 3, 6, or 12 months. Pick a time period that makes sense for how often you’re going into the container. You go in and you take a look at every single tag, every single trigger, and every single variable. Simo Ahava has a really nice Google Tag Manager sort of auditing tool.
I’ll make sure to link to that in the transcript as well. You can use that to just go through your container and see what’s up. Let’s say you tested out some sort of screen recording, like you installed Hotjar six months ago and you ended up deciding on say another product instead, like FullStory, so then you want to make sure you remove the Hotjar. How many times have you found that you look at a new website and you’re like, “Why is this on here?”
No one at the client can tell you. They’re like, “I don’t know where that code came from.” So this is where auditing can be really handy, because remember, every one of those funny little pixels left over from a product you tested and ended up not going with is weighing down your page. Maybe it’s just a couple of milliseconds each, but that stuff adds up. So you really do want to go in and audit regularly and remove anything you’re not using anymore. Keep your Google Tag Manager container clean.
A lot of this is focused on obviously making future you very happy. Auditing will also make future you very happy. So hopefully, out of this, you can create a Google Tag Manager default container that’s going to work for you. I’m going to make sure as well, when the transcript is out for this, that I’m going to include some of the links that I talked about as well as a link to some more tips on how to add in things like conversion linker and make sure I’m updating it for when this video is published.
When you publish new content, you want users to find it ranking in search results as fast as possible. Fortunately, there are a number of tips and tricks in the SEO toolbox to help you accomplish this goal. Sit back, turn up your volume, and let Cyrus Shepard show you exactly how in this week’s Whiteboard Friday.
[Note: #3 isn’t covered in the video, but we’ve included it in the post below. Enjoy!]
Click on the whiteboard image above to open a high-resolution version in a new tab!
Howdy, Moz fans. Welcome to another edition of Whiteboard Friday. I’m Cyrus Shepard, back in front of the whiteboard. So excited to be here today. We’re talking about ten tips to index and rank new content faster.
You publish some new content on your blog, on your website, and you sit around and you wait. You wait for it to be in Google’s index. You wait for it to rank. It’s a frustrating process that can take weeks or months to see those rankings increase. There are a few simple things we can do to help nudge Google along, to help them index it and rank it faster. Some very basic things and some more advanced things too. We’re going to dive right in.
1. URL Inspection / Fetch & Render
So basically, indexing content is not that hard in Google. Google provides us with a number of tools. The simplest and fastest is probably the URL Inspection tool. It’s in the new Search Console, replacing the old Fetch and Render tool. As of this filming, both tools still exist, but Google is deprecating Fetch and Render. The URL Inspection tool allows you to submit a URL and ask Google to crawl it. When you do that, Google puts it in its priority crawl queue. Google keeps a list of URLs to crawl; yours goes into the priority queue, so it gets crawled and indexed faster.
2. Sitemaps
Another common technique is simply using sitemaps. If you’re not using sitemaps, it’s one of the easiest, quickest ways to get your URLs indexed. When you have URLs in your sitemap, you want to let Google know that they’re actually there. There are a number of different techniques that can optimize this process a little bit more.
The first and the most basic one that everybody talks about is simply putting it in your robots.txt file. In your robots.txt, you have a list of directives, and at the end of your robots.txt, you simply say sitemap and you tell Google where your sitemaps are. You can do that for sitemap index files. You can list multiple sitemaps. It’s really easy.
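A minimal robots.txt along those lines might look like this (example.com and the sitemap filenames are placeholders):

```text
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
Sitemap: https://www.example.com/sitemap-index.xml
```

The Sitemap lines must use absolute URLs, and you can list as many of them, including sitemap index files, as you need.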
You can also do it using the Search Console Sitemap Report, another report in the new Search Console. You can go in there and submit sitemaps, remove sitemaps, and validate them. You can also do this via the Search Console API.
But a really cool way of informing Google of your sitemaps, one that a lot of people don’t use, is simply pinging Google. You can do this right in your browser’s URL bar. You type in google.com/ping and append your sitemap URL as a parameter. You can try this out right now with your current sitemaps: type it into the browser bar, and Google will instantly queue that sitemap for crawling, and all the URLs in there should get indexed quickly if they meet Google’s quality standard.
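The ping is a plain GET request. With a sitemap at https://www.example.com/sitemap.xml (a placeholder), the URL you would type is:

```text
https://www.google.com/ping?sitemap=https://www.example.com/sitemap.xml
```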
(BONUS: This wasn’t in the video, but we wanted to include it because it’s pretty awesome)
3. Indexing APIs
Within the past few months, both Google and Bing have introduced new APIs to help speed up and automate the crawling and indexing of URLs.
Both of these solutions allow for the potential of massively speeding up indexing by submitting hundreds or thousands of URLs via an API.
While the Bing API is intended for any new or updated URL, Google states that their API is specifically for “either job posting or livestream structured data.” That said, many SEOs like David Sottimano have experimented with the Google API and found that it works with a variety of content types.
If you want to use these indexing APIs yourself, you have a number of potential options.
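As one illustration of what a submission looks like: Google’s Indexing API takes a POST to its urlNotifications:publish endpoint with a small JSON body. The sketch below only builds that body; authentication with a service-account OAuth token and the HTTP call itself are omitted, and the example URL is made up:

```javascript
// Sketch of the JSON body the Google Indexing API expects at
// POST https://indexing.googleapis.com/v3/urlNotifications:publish
// (per Google's Indexing API docs). Auth is omitted here.
function buildIndexingNotification(url, deleted) {
  return {
    url: url,
    type: deleted ? 'URL_DELETED' : 'URL_UPDATED'
  };
}

console.log(JSON.stringify(buildIndexingNotification('https://example.com/jobs/123', false)));
// {"url":"https://example.com/jobs/123","type":"URL_UPDATED"}
```

To submit hundreds of URLs at once, you would loop over your URL list and send these bodies in a batch request.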
That’s talking about indexing. Now there are some other ways that you can get your content indexed faster and help it to rank a little higher at the same time.
4. Links from important pages
When you publish new content, the most basic step, if you do nothing else, is to make sure that you are linking to it from important pages. Important pages may be your homepage, your blog, your resources page. You don’t want to orphan those new pages on your site with no incoming links.
Adding the links tells Google two things. It says we need to crawl this link sometime in the future, so it gets put in the regular crawling queue. But it also makes the new page more important. Google can say, “Well, we have important pages linking to this. We have some quality signals to help us determine how to rank it.” So link from important pages.
5. Update old content
But a step that people oftentimes forget is not only link from your important pages, but you want to go back to your older content and find relevant places to put those links. A lot of people use a link on their homepage or link to older articles, but they forget that step of going back to the older articles on your site and adding links to the new content.
Now what pages should you add from? One of my favorite techniques is to use this search operator here, where you type in the keywords that your content is about and then you do a site:example.com. This allows you to find relevant pages on your site that are about your target keywords, and those make really good targets to add those links to from your older content.
6. Share socially
Really obvious step: sharing socially. When you have new content, share it socially; there’s a high correlation between social shares and content ranking. Especially when you share on content aggregators like Reddit and Hacker News, those create actual links for Google to crawl. Google can see those signals and that social activity, and it does the same thing as adding links from your own content, except it’s even a little better because they’re external links, external signals.
7. Generate traffic to the URL
This is kind of an advanced technique, which is a little controversial in terms of its effectiveness, but we see it anecdotally working time and time again. That’s simply generating traffic to the new content.
Now there is some debate whether traffic is a ranking signal. There are some old Google patents that talk about measuring traffic, and Google can certainly measure traffic using Chrome. They can see where those sites are coming from. But as an example, Facebook ads, you launch some new content and you drive a massive amount of traffic to it via Facebook ads. You’re paying for that traffic, but in theory Google can see that traffic because they’re measuring things using the Chrome browser.
When they see all that traffic going to a page, they can say, “Hey, maybe this is a page that we need to have in our index and maybe we need to rank it appropriately.”
Once we get our content indexed, let’s talk about a few ideas for ranking your content faster.
8. Generate search clicks
Along with generating traffic to the URL, you can actually generate search clicks.
Now what do I mean by that? Imagine you share a URL on Twitter. Instead of linking directly to the URL, you link to a Google search result for the keywords you’re trying to rank for. People click the link, land on that search result, and then click on your result.
You see television commercials do this, like in a Super Bowl commercial they’ll say, “Go to Google and search for Toyota cars 2019.” What this does is Google can see that searcher behavior. Instead of going directly to the page, they’re seeing people click on Google and choosing your result.
This does a couple of things. It helps increase your click-through rate, which may or may not be a ranking signal. But it also helps you rank for auto-suggest queries. So when Google sees people search for “best cars 2019 Toyota,” that might appear in the suggest bar, which also helps you to rank if you’re ranking for those terms. So generating search clicks instead of linking directly to your URL is one of those advanced techniques that some SEOs use.
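As a quick sketch of the mechanics, here is how you might build such a shareable search URL; the query is just the example from above:

```javascript
// Build a Google search URL to share instead of a direct link,
// so clicks arrive via a search result page.
function searchUrl(query) {
  return 'https://www.google.com/search?q=' + encodeURIComponent(query);
}

console.log(searchUrl('best cars 2019 Toyota'));
// https://www.google.com/search?q=best%20cars%202019%20Toyota
```

encodeURIComponent keeps multi-word queries and special characters safe inside the q= parameter.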
9. Target query deserves freshness
When you’re creating the new content, you can help it to rank sooner if you pick terms that Google thinks deserve freshness. It’s best maybe if I just use a couple of examples here.
Consider a user searching for the term “cafes open Christmas 2019.” That’s a result that Google wants to deliver a very fresh result for. You want the freshest news about cafes and restaurants that are going to be open Christmas 2019. Google is going to preference pages that are created more recently. So when you target those queries, you can maybe rank a little faster.
Compare that to a query like “history of the Bible.” If you Google that right now, you’ll probably find a lot of very old pages, Wikipedia pages. Those results don’t update much, and that’s going to be harder for you to crack into those SERPs with newer content.
The way to tell this is simply to type in the queries that you're trying to rank for and see how old the most recent results are. That will give you an indication of how much freshness Google thinks this query deserves. Choose queries that deserve a little more freshness and you might be able to get in a little sooner.
10. Leverage URL structure
Finally, last tip, this is something a lot of sites do and a lot of sites don’t do because they’re simply not aware of it. Leverage URL structure. When Google sees a new URL, a new page to index, they don’t have all the signals yet to rank it. They have a lot of algorithms that try to guess where they should rank it. They’ve indicated in the past that they leverage the URL structure to determine some of that.
Consider how The New York Times puts all its book reviews under the same URL path, newyorktimes.com/book-reviews. They have a lot of established ranking signals for all of these URLs. When a new URL is published using the same structure, Google can assign it some temporary signals to rank it appropriately.
If you have URLs that are high authority, maybe it’s your blog, maybe it’s your resources on your site, and you’re leveraging an existing URL structure, new content published using the same structure might have a little bit of a ranking advantage, at least in the short run, until Google can figure these things out.
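As a quick illustration of the idea, here's a hypothetical helper (the domain and paths are made up) that checks whether a new piece of content is being published under an established, high-authority section:

```python
from urllib.parse import urlparse

def uses_established_section(new_url, section_prefix):
    """Return True if the new page lives under an established URL section
    (e.g. /book-reviews/), where existing ranking signals may give it a
    short-term boost while Google gathers page-level signals."""
    return urlparse(new_url).path.startswith(section_prefix)

# Hypothetical example: publish new reviews under the proven section.
print(uses_established_section(
    "https://www.example.com/book-reviews/new-novel", "/book-reviews/"))  # True
print(uses_established_section(
    "https://www.example.com/misc/new-novel", "/book-reviews/"))  # False
```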
These are only a few of the ways to get your content indexed and ranking quicker. It is by no means a comprehensive list. There are a lot of other ways. We’d love to hear some of your ideas and tips. Please let us know in the comments below. If you like this video, please share it for me. Thanks, everybody.
We know how important page speed is to Google, but why is that, exactly? With increasing benefits to SEO, UX, and customer loyalty that inevitably translates to revenue, there are more reasons than ever to both focus on site speed and become adept at communicating its value to devs and stakeholders. In today’s Whiteboard Friday, Sam Marsden takes us point-by-point through how Google understands speed metrics, the best ways to access and visualize that data, and why it all matters.
Click on the whiteboard image above to open a high-resolution version in a new tab!
Hi, Moz fans, and welcome to another Whiteboard Friday. My name is Sam Marsden, and I work as an SEO at web crawling platform DeepCrawl. Today we’re going to be talking about how Google understands speed and also how we can visualize some of the performance metrics that they provide to benefit things like SEO, to improve user experience, and to ultimately generate more revenue from your site.
Google & speed
Let’s start by taking a look at how Google actually understands speed. We all know that a faster site generally results in a better user experience. But Google didn’t directly incorporate that into their algorithms until recently. It wasn’t until the mobile speed update, back in July, that Google really started looking at speed. It’s likely only a secondary ranking signal, because relevance is always going to be much more important than how quickly the page actually loads.
But the interesting thing with this update was that Google has actually confirmed some of the details about how they understand speed. We know that it’s a mix of lab and field data. They’re bringing in lab data from Lighthouse, from the Chrome dev tools and mixing that with data from anonymized Chrome users. So this is available in the Chrome User Experience Report, otherwise known as CrUX.
Now this is a publicly available database, and it includes five different metrics. You’ve got first paint, which is when anything loads on the page. You’ve then got first contentful paint, which is when some text or an image loads. Then you’ve got DOM content loaded, which is, as the name suggests, once the DOM is loaded. You’ve also got onload, which is when any additional scripts have loaded. That’s kind of like the full page load. The fifth and final metric is first input delay, and that’s the time between when a user first interacts with your site and when the browser actually responds to that interaction.
These are the metrics that make up the CrUX database, and you can actually access this CrUX data in a number of different ways.
Where is CrUX data?
1. PageSpeed Insights
The first and easiest way is to go to PageSpeed Insights. You just plug in whatever page you’re interested in, and it’s going to return some of the CrUX metrics along with Lighthouse data and a bunch of recommendations about how you can actually improve the performance of your site. That’s really useful, but it just provides a snapshot; it’s not really suited to ongoing monitoring as such.
2. CrUX dashboard
Another way that you can access CrUX data is through the CrUX dashboard, and this provides all of the five different metrics from the CrUX database. What it does is it looks at the percentage of page loads, splitting them out into slow, average, and fast loads. This also trends it from month to month so you can see how you’re tracking, whether you’re getting better or worse over time. So that’s really good. But the problem with this is you can’t actually manipulate the visualization of that data all that much.
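If you have raw load timings of your own, the slow/average/fast split the dashboard does can be sketched like this (the 1-second and 3-second cut-offs here are illustrative assumptions, not the dashboard's exact per-metric definitions):

```python
def bucket_page_loads(fcp_times_ms, fast_ms=1000, slow_ms=3000):
    """Split a list of first-contentful-paint timings (in ms) into
    fast/average/slow shares, CrUX-dashboard style.

    The default thresholds are assumptions for illustration only.
    """
    total = len(fcp_times_ms)
    fast = sum(1 for t in fcp_times_ms if t < fast_ms)
    slow = sum(1 for t in fcp_times_ms if t >= slow_ms)
    average = total - fast - slow
    return {
        "fast": fast / total,
        "average": average / total,
        "slow": slow / total,
    }

print(bucket_page_loads([800, 1200, 2500, 4000]))
# {'fast': 0.25, 'average': 0.5, 'slow': 0.25}
```

Trending these shares month to month is exactly what the dashboard automates for you.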
3. Accessing the raw data
To do that and get the most out of the CrUX database, you need to query the raw data. Because it’s a freely available database, you can write a SQL query, put it into BigQuery, and run it against the CrUX dataset. You can then export the results into Google Sheets, pull that into Data Studio, and create all of these amazing graphs to visualize the performance of your site over time.
It might sound like a bit of a complicated process, but there are a load of great guides out there. Paul Calvano has a number of video tutorials for getting started with this process. There’s also Rick Viscomi, who has a CrUX Cookbook, which is a set of templated SQL queries where you just plug in the domains you’re interested in and put them straight into BigQuery.
Also, if you wanted to automate this process, rather than exporting it into Google Sheets, you could pull this into Google Cloud Storage and also update the SQL query so this pulls in on a monthly basis. That’s where you kind of want to get to with that.
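As a rough sketch of what one of those templated queries looks like, here's a hypothetical Python helper that builds a query against the public `chrome-ux-report` dataset (the origin is made up, and you should double-check the schema against the CrUX Cookbook before running it):

```python
def build_crux_query(origin, yyyymm):
    """Build a SQL string for the public CrUX dataset in BigQuery.

    Sums the density of first-contentful-paint histogram bins that start
    under one second, i.e. the share of "fast" FCP page loads. Paste the
    result into the BigQuery console (or run it via a client library).
    """
    return f"""
SELECT
  SUM(fcp.density) AS fast_fcp_share
FROM
  `chrome-ux-report.all.{yyyymm}`,
  UNNEST(first_contentful_paint.histogram.bin) AS fcp
WHERE
  fcp.start < 1000
  AND origin = '{origin}'
""".strip()

# Hypothetical origin; swap in the site you're measuring.
print(build_crux_query("https://www.example.com", "201901"))
```

Generating the SQL programmatically like this is also what makes the monthly automation mentioned above straightforward: just loop over the `yyyymm` table names.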
Once you’ve got to this stage and you’re able to visualize the data, what should you actually do with it? Well, I’ve got a few different use cases here.
1. Get buy-in
The first is you can get buy-in from management, from clients, whoever you report into, for various optimization work. If you can show that you’re lagging behind competitors, for example, that might be a good basis for getting some optimization initiatives rolling. You can also use the Revenue Impact Calculator, a really simple Google tool which lets you put in some details about your site and then shows you how much more money you could be making if your site were X% faster.
2. Inform devs
Once you’ve got the buy-in, you can use the CrUX visualizations to inform developers. What you want to do here is show exactly where your site is falling down. Where are these problem areas? It might be, for example, that first contentful paint is suffering. You can go to the developers and say, “Hey, look, we need to fix this.” If they come back and say, “Well, our independent tests show that the site is performing fine,” you can point to the fact that this data comes from real users. This is how people are actually experiencing your site.
3. Communicate impact
Thirdly and finally, once you’ve got these optimization initiatives going, you can communicate the impact they’re actually having on performance and on business metrics. You could trend these various performance metrics from month to month and then overlay various business metrics. You might want to look at conversion rates, bounce rates, etc., showing those side-by-side so you can see whether they improve as the performance of the site improves.
Faster site = better UX, better customer loyalty, and growing SEO benefit
These are different ways that you can visualize the CrUX database, and it’s really worthwhile, because if you have a faster site, then it’s going to result in better user experience. It’s going to result in better customer loyalty, because if you’re providing your users with a great experience, then they’re actually more likely to come back to you rather than going to one of your competitors.
There’s also a growing SEO benefit. We don’t know how Google is going to change their algorithms going forward, but I wouldn’t be surprised if speed is coming in more and more as a ranking signal.
This is how Google understands page speed, some ways that you can visualize the data from the CrUX database, and some of the reasons why you would want to do that.
I hope that’s been helpful. It’s been a pleasure doing this. Until the next time, thank you very much.
Contrary to popular belief, SEO and PPC aren’t at opposite ends of the spectrum. There are plenty of ways the two search disciplines can work together for benefits all around, especially when it comes to optimizing your Google Ads. In this week’s edition of Whiteboard Friday, we’re thrilled to welcome Dana DiTomaso as she explains how you can harness the power of both SEO and PPC for a better Google experience overall.
Click on the whiteboard image above to open a high-resolution version in a new tab!
Hey, Moz readers. My name is Dana DiTomaso, and I’m President and partner at Kick Point. We’re a digital marketing agency way up in the frozen wilds of Edmonton, Alberta. Today I’m going to be talking to you about PPC, and I know you’re thinking, “This is an SEO blog. What are you doing here talking about PPC?”
But one of my resolutions for 2019 is to bring together SEO and PPC people, because SEO can learn a lot from PPC, and yes, PPC, you also can learn a lot from SEO. I know PPC people are like, “We just do paid. It’s so great.” But trust me, both can work together. In our agency, we do both SEO and PPC, and we work with a lot of companies who have one person, sometimes two and they’re doing everything.
One of the things we try to do is help them run better Ads campaigns. Here I have tips on things that we see all the time when we start working with a new Ads account, things that we end up fixing, and hopefully I can pass this on to you so you can fix it before you have to call an agency to come and fix it for you. One thing to note: this is actually a much longer piece than I can present on this whiteboard. There’s only so much room.
There is actually a blog post on our website, which you can find here. Please check that out and that will have the full nine tips. But I’m just going to break it down to a few today.
1. Too many keywords
First thing, too many keywords. We see this a lot. Google’s guidance says to group together keywords that have the same sort of theme.
But your theme can be really specific, or it can be kind of vague. This is an example, a real example that we got, where the keyword examples were all lawyer themes, so “defense lawyer,” “criminal lawyer,” “dui lawyer,” “assault lawyer,” “sexual assault lawyer.” Technically, they all have the same theme of “lawyer,” but that’s way too vague for it to be all in one single ad group, because what kind of ad are you going to show?
“We are lawyers. Call us.” It’s not specific enough. Take for example “dui lawyer,” which I know is a really competitive niche. You can do [dui lawyer], [dui lawyer seattle], and then “dui lawyer” and +dui +lawyer +seattle, spelled out a little bit differently. I’ll talk about that in a second. By taking this one thing and then breaking it down into much more specific ad groups, you can really have much more control.
A consistent theme in all the tips I talk about is much more control over where you’re spending your money, what keywords you’re spending it on, and what your ads are, plus having a much better landing-page-to-ad match, which is also really important. It just makes your ad life so much easier when you’ve got it in all of those ad groups. I know it might seem intimidating. It’s like, “Well, I have three ad groups now. If I follow your tips, I’m going to have 40.”
But at the same time, it’s way easier to manage 40 well organized groups than it is to manage 3 really badly organized groups. Keep that in mind.
2. Picking the right match type
The next thing is picking the right match type. You can see here I’ve got this bracket stuff and this phrase stuff and these plus signs. There are really four match types.
There’s broad match, which is terrible; don’t ever use it. Broad match is just you writing out the keyword, and then Google displays it for whatever it feels is relevant to that particular search. For example, we’ve seen a catering company that had “catering” as a keyword, and they were showing up for all sorts of catering-related searches where they couldn’t actually provide catering, such as searches for a venue that only does in-house catering. Or they’re spending money on a catering conference or just totally irrelevant stuff. Do not use broad match.
Broad match modifier (BMM)
The upgrade from that is what’s called broad match modifier or BMM, and that’s where these plus signs come in. This is really the words dui, lawyer, and seattle in any order, but they all have to exist and other things can exist around that. It could be, “I need a DUI lawyer in Seattle.” “I live in Seattle. I need a DUI lawyer.” That would also work for that particular keyword.
The next type is phrase, and that’s in the quotes. This “dui lawyer” is the example here, and then you can have anything before it or you can have anything after it, but you can’t have something in between it. It couldn’t be “dui who is really great at being a lawyer” for example. Weak example, but you get the idea. You can’t just shove stuff in the middle of a phrase match.
Then exact match is what’s in the brackets here, and that is just those words and nothing else. If I have [dui lawyer] but didn’t have [dui lawyer seattle], the [dui lawyer] keyword would not trigger if somebody searches for “dui lawyer seattle.” That’s as specific as possible. You really want to try that for your most competitive keywords.
This is the really expensive stuff, because you do not want to waste one single penny on anything that is irrelevant to that particular search. These are your head terms, where every click is really expensive, so you’ve got to make sure you’re getting the most possible out of those clicks. That’s where you really want to use exact match.
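The match types above can be sketched as a toy matcher. This is a simplification: real Google Ads matching also handles close variants, plurals, and misspellings, and broad match has no simple rule at all, so it's not simulated here:

```python
def matches(keyword, query):
    """Roughly simulate Google Ads match types (illustrative only).

    [dui lawyer]           exact:  the query must be exactly those words
    "dui lawyer"           phrase: words must appear in order, uninterrupted
    +dui +lawyer +seattle  BMM:    every +word must appear, in any order
    """
    q_words = query.lower().split()
    if keyword.startswith("[") and keyword.endswith("]"):
        return q_words == keyword[1:-1].lower().split()
    if keyword.startswith('"') and keyword.endswith('"'):
        k_words = keyword[1:-1].lower().split()
        n = len(k_words)
        return any(q_words[i:i + n] == k_words
                   for i in range(len(q_words) - n + 1))
    if "+" in keyword:
        required = [w.lstrip("+").lower() for w in keyword.split()]
        return all(w in q_words for w in required)
    raise ValueError("broad match is up to Google, not a simple rule")

print(matches("[dui lawyer]", "dui lawyer seattle"))                       # False
print(matches('"dui lawyer"', "seattle dui lawyer help"))                  # True
print(matches("+dui +lawyer +seattle", "I need a DUI lawyer in Seattle"))  # True
```

Note how the exact-match keyword refuses the longer query, which is exactly why you need both [dui lawyer] and [dui lawyer seattle] in the account.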
3. Only one ad per group
Next, tips. The next thing is what we see is a lot of people who have only one ad per group.
Have at least 3 ads per group
This is not a tip. This is a criticism up here. The thing is that maybe, again, you think one ad is easier to manage, but it’s really hard to see what’s going to work, because if you’re not always testing, how are you going to know if you could do better? Make sure to have at least three ads per group.
Add emotional triggers into your ad copy
Then look at your ad copy. We see a lot of just generic like, “We are the best lawyers. Call us.” There’s nothing there that says I need to call these people. Really think about how you can add those emotional triggers into your copy. Talk to your client or your team, if you work in-house, and find out what are the things that people say when they call. What are the things where they say, “Wow, you really helped me with this” or, “I was feeling like this and then you came in and I just felt so much better.”
That can really help to spice up your ads. We don’t want to get too fancy with this, but we certainly want to make something that’s going to help you stand out. Really add those emotional triggers into your ad copy.
Make sure to have a call to action
Then the next thing is making sure to have a call to action, which seems basic because you think it’s an ad. If you click it, that’s the call to action. But sometimes people on the Internet, they’re not necessarily thinking. You just want to say, “You know what? Just call me or email me or we’re open 24 hours.”
Just be really specific on what you want the person to do when they look at the ad. Just spell it out for them. I know it seems silly. Just tell them. Just tell them what you want them to do. That’s all you need to do.
Then make sure you add in all of the extensions. In Google Ads, if you’re not super familiar with the platform, there’s a section called Extensions. These are things like when the address shows up under an ad, or you’ve got those little links that come up, or you’ve got somebody saying we’re open 24 hours, for example. There are all sorts of different extensions that you can use. Just put in all the extensions that you possibly can for every single one of your groups.
They won’t all trigger at the same time, but at least they’re there and it’s possible that they could trigger. If they do, that gives your ad more real estate versus your competition, which is really great on mobile because ads take up a lot of space at the top of a mobile search. You want to push your competition as far down that search page as you possibly can so you own as much of that property as possible. One thing that I do see people doing incorrectly with extensions, though, is setting extensions at, say, the campaign level when you have different ad groups that cover different themes.
Going back to this example over here, with the different types of lawyers, let’s say you had an extension that talks specifically about DUI law, but then it was triggering on say sexual assault law. You don’t want that to happen. Make sure you have really fine-tuned control over your different extensions so you’re showing the right extension with the right type of keyword and the right type of ad. The other thing that we see a lot is where people have location extensions and they’re showing all the location extensions where they should not be showing all the location extensions.
You’ve got an ad group for, say, Seattle, and it’s talking about this new home development that you have, and because you just loaded in all of your location extensions, suddenly you’re showing extensions for something in say San Francisco. It’s just because you haven’t filtered properly. Really double-check to make sure that you’ve got your filter set up properly for your location extensions and that you’re showing the right location extension for the right ad.
I know that Google says, “We’ll pick the locations closest to the client.” But you don’t know where that person is searching right there. They could be in San Francisco at that moment and searching for new home builds in Seattle, because maybe they’re thinking about moving from San Francisco to Seattle. You don’t want them to see the stuff that’s there. You want them to see the stuff that’s at the place where they’re intending to be. Really make sure you control that.
4. Keep display and search separate
Last, but not least, keep display and search separate.
By default, Google so helpfully says, “We’ll just show your ads everywhere. It’s totally cool. This is what we want everyone to do.” Don’t do that. This is what makes Google money. It does not make you money. The reason why is because display network, which is where you’re going to a website and then you see an ad, and search network, when you type in the stuff and you see an ad, are two totally different beasts.
Avoid showing text ads on the display network for greater campaign control
It’s really a different type of experience. To be honest, if you take your search campaigns, which are text-based ads, and show them on websites, you’re showing a boring text ad on a website that already has 50 blinky things saying “click here.” People are probably not seeing it, and maybe they have an ad blocker installed. But if they are seeing it, certainly your text ad, which is kind of boring and not intended for that medium, is not going to be the thing that stands out.
Really you’re just wasting your money, because you’ll end up with lower relevancy and fewer clicks, and then Google thinks that your group is bad. Then you’ll end up paying more because Google thinks your group is bad. Keeping them separate really gives you that extra control: “This is the search campaign. It’s only on search. This is the display campaign. It’s only on display.” Keep the two of them totally separate. Then you have lots of control over the search ads being for search and the display ads being for display.
Don’t mix those two up. Make sure to uncheck that default. There are definitely more tips on our blog here, but I hope this will help you get started. SEOs, if you’ve never done a PPC campaign in your life, I recommend just setting one up. Put 50 bucks behind that thing and just try it out. I think what will really help you is understanding more of how people search. As we get less and less keyword data from the tools we use to figure out what the heck people are googling when they try to search for our business, ads give you some of that data back.
That’s where ads can be a really great ally in trying to get better SEO results. I hope you found this enjoyable. Thanks so much.
Do you need to disavow links in the modern age of Google? Is it safe? If so, which links should you disavow? In this Whiteboard Friday, Cyrus Shepard answers all these questions and more. While he makes it clear that the majority of sites shouldn’t have to use Google’s Disavow Tool, he provides his personal strategies for those times when using the tool makes sense.
How do you decide when to disavow? We’d love to hear your process in the comments below!
Click on the whiteboard image above to open a high-resolution version in a new tab!
Howdy, Moz fans. Welcome to another edition of Whiteboard Friday. I’m Cyrus Shepard. Today we’re going to be talking about a big topic — Google’s Disavow Tool. We’re going to be discussing when you should use it and what links you should target.
Now, this is kind of a scary topic to a lot of SEOs and webmasters. They’re kind of scared of the Disavow Tool. They think, “It’s not necessary. It can be dangerous. You shouldn’t use it.” But it’s a real tool. It exists for a reason, and Google maintains it exactly for webmasters to use. So today we’re going to cover the scenarios in which you might consider using it and which links you should target.
Disclaimer! The vast majority of sites don’t need to disavow *anything*
Now I want to start out with a big disclaimer. I want this to be approved by the Google spokespeople. So the big disclaimer is the vast majority of sites don’t need to disavow anything. Google has made tremendous progress over the last few years of determining what links to simply ignore. In fact, that was one of the big points of the last Penguin 4.0 algorithm update.
Before Penguin, you had to disavow links all the time. But after Penguin 4.0, Google simply ignored most bad links, emphasis on the word “most.” It’s not a perfect system. They don’t ignore all bad links. We’ll come back to that point in a minute. There is a danger in using the Disavow Tool of disavowing good links.
The biggest problem I see with people who use the disavow tool is that it’s really hard to determine what Google counts as a bad or harmful link and what they count as a good link. So a lot of people over-disavow and disavow too many things. That’s something you need to look out for. My final point in the disclaimer is that large, healthy sites with good link profiles are more immune to bad links.
So if you are The New York Times or Wikipedia and you have a few spam links pointing to you, it’s really not going to hurt you. But if your link profile isn’t as healthy, that’s something you need to consider. So with those disclaimers out of the way, let’s talk about the opposite sort of situations, situations where you’re going to want to consider using the Disavow Tool.
Good candidates for using the Disavow Tool
Obviously, if you have a manual penalty. Now, these have decreased significantly since Penguin 4.0. But they still exist. People still get manual penalties. Definitely, that’s what the Disavow Tool is for. But there are other situations.
There was a conversation with Marie Haynes, published not too long ago, in which she asked in a Google hangout, “Are there situations other than a penalty where you can use the disavow, situations where your links may be hurting you algorithmically?”
John Mueller said this certainly was the case: if you disavow those obviously dodgy links that could be hurting you algorithmically, it might help Google trust your link profile a little more. If your link profile isn’t that healthy in the first place, if you only have a handful of links and some of those are dodgy, you don’t have a lot to fall back on.
So disavowing those dodgy links can help Google trust the rest of your link profile a little more.
1. Penalty examples
Okay, with those caveats and the situations where you do want to disavow out of the way, a big question people have is, “Well, what should I disavow?” I’ve done this for a number of sites, and these are my standards, which I’ll share with you. As for good candidates to disavow, the best examples are often the ones Google gives you when they penalize you.
Again it’s a little more rare, but when you do get a link penalty, Google will often provide sample links. They don’t tell you all of the links to disavow. But they’ll give you sample links, and you can go through and you can look for patterns in your links to see what matches what Google is considering a spammy link. You definitely want to include those in your disavow file.
2. Link schemes
If you’ve suffered a drop in traffic, or you think Google is hurting you algorithmically because of your links, obviously if you’ve participated in link schemes, if you’ve been a little bit naughty and violated Google’s Webmaster Guidelines, you definitely want to take a look at those.
We’re talking about links that you paid for or someone else paid for. It’s possible someone bought some shady links to try to bring you down, although Google is good at ignoring a lot of those. The same goes if you use PBNs. Now I know a lot of black hat SEOs use PBNs and swear by them. But when they don’t work, when you’ve been hurt algorithmically or penalized or your traffic is down and you’re using PBNs, those are good candidates to put in your disavow file.
3. Non-editorial links
Google has a whole list of non-editorial links. We’re going to link to it in the transcript below. But these are links that the webmaster didn’t intentionally place, things like widgets, forum spam, signature spam, really shady, dodgy links that you control. A good judge of all of these links is often in the anchor text.
4. $$ Anchor text
Is it a money anchor text? Are these money, high-value keywords? Do you control the anchor text? You can generally tell a really shady link by looking at the anchor text. Is it optimized? Could I potentially benefit? Do I control that?
If the answer is yes to those questions, it’s usually a good candidate for the disavow file.
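Those anchor-text questions can be turned into a crude first-pass filter. This sketch just checks anchors against a hand-built list of money keywords; the terms here are made up, and you'd build your own list from the keywords you (or a competitor) would actually pay for:

```python
def looks_like_money_anchor(anchor, money_terms):
    """Flag anchors that exactly target high-value commercial keywords.

    A link whose anchor is an optimized money phrase you control is a
    candidate for the disavow file; natural anchors ("click here", brand
    names) usually aren't.
    """
    normalized = " ".join(anchor.lower().split())
    return normalized in money_terms

# Hypothetical money-keyword list for the lawyer example above.
MONEY = {"dui lawyer seattle", "best dui lawyer"}
print(looks_like_money_anchor("DUI Lawyer  Seattle", MONEY))  # True
print(looks_like_money_anchor("click here", MONEY))           # False
```

This only surfaces candidates for manual review; it can't judge whether you actually control the link.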
The “maybe” candidates for using the Disavow Tool
Then there’s a whole set of links in a bucket that I call the “maybe” file. You might want to disavow. I oftentimes do, but not necessarily.
1. Malware
So a lot of these would be malware. You click on a link and it gives you a red browser warning that the site contains spam, or your computer freezes up. Those are toxic links.
If I were Google, I probably wouldn’t want to see those types of links linking to a site. I don’t like them linking to me. I would probably throw them in the disavow.
2. Cloaked sites
These are sites that show Google one set of content but show users something different when you click on the link. The way you find these is, when you’re reviewing your links, it’s usually a good idea to look at them using a Googlebot user agent.
If you use Chrome, you can get a browser extension. We’ll link to some of these in the post below. But look at everything and see everything through Google’s eyes using a Googlebot user agent and you can find those cloaked pages. They’re kind of a red flag in terms of link quality.
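A hedged sketch of that comparison: fetch the page once normally and once with a Googlebot user agent, then diff the two responses. The user-agent string below is Googlebot's published one, but the 0.5 similarity threshold is an arbitrary assumption, not an industry standard:

```python
import difflib

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def cloaking_suspected(html_as_user, html_as_googlebot, threshold=0.5):
    """Compare what a page serves a normal browser vs. a Googlebot user
    agent. A low similarity ratio suggests the page may be cloaked."""
    ratio = difflib.SequenceMatcher(
        None, html_as_user, html_as_googlebot).ratio()
    return ratio < threshold

# In practice you'd fetch the URL twice, once sending GOOGLEBOT_UA in the
# request headers, and pass both response bodies in.
print(cloaking_suspected("<p>same page</p>", "<p>same page</p>"))  # False
```

A browser extension that switches the user agent, as mentioned above, gets you the same check interactively.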
3. Shady 404s
Now, what do I mean by a shady 404? You click on the link and the page isn’t there; in fact, maybe the whole domain isn’t there, and you’ve got a whole bunch of these. Something just seems off about these 404s. The reason I throw these in the disavow file is that usually there’s no record of what the link was. It was usually some sort of spammy link.
They were trying to rank for something, and then, for whatever reason, they removed the entire domain or it’s removed by the domain registrar. Because I don’t know what was there, I usually disavow it. It’s not going to help me in the future when Google discovers that it’s gone anyway. So it’s usually a safe bet to disavow those shady 404s.
4. Bad neighborhood spam
Finally, sometimes you find those bad neighborhood links in your link profile.
These are things like pills, poker, porn, the three P’s of bad neighborhoods. If I were Google and I saw porn linking to my non-porn site, I would consider that pretty shady. Now maybe they’ll just ignore it, but I just don’t feel comfortable having a lot of these bad, spammy neighborhoods linking to me. So I might consider these to throw in the disavow file as well.
Probably okay — don’t necessarily need to disavow
Now finally, we often see a lot of people disavowing links that maybe aren’t that bad. Again, I want to go back to the point it’s hard to tell what Google considers a good link, a valuable link and a poor link. There is a danger in throwing too much in your disavow file, which a lot of people do. They just throw the whole kitchen sink in there.
If you do that, those links aren’t going to count, and your traffic might go down.
1. Scraper sites
So one thing I don’t personally put in my disavow file are scraper sites. You get a good link in an online magazine, and then a hundred other sites copy it. These are scraper sites. Google is picking them up. I don’t put those in the disavow file because Google is getting better and better at assigning the authority of those links to the original site. I don’t find that putting them in the disavow file has really helped, at least with the sites I work with.
2. Feeds
The same goes for feeds. You see a lot of feed links in your link report. These are just raw HTML feeds, RSS feeds. For the same reason, I leave them alone, unless they’re feeds or scrapers of the bad sites from this list over here. If they’re feeds and scrapers of good sites, there’s no need to disavow.
3. Auto-generated spam
These are sites that are automatically generated by robots and programs. They’re usually pretty harmless. Google is pretty good at ignoring them. You can tell the difference between auto-generated spam and link scheme again by the anchor text.
Auto-generated spam usually does not have optimized anchor text. It’s usually your page title. It’s usually broken. These are really low-quality pages that Google generally ignores, that I would not put in a disavow.
4. Simple low quality
These are things like directories, pages that you look at and you’re like, “Oh, wow, they only have three pages on their site. No one is linking to them.”
Leave it up to Google to ignore those (it generally does a pretty good job) or to count them if it sees fit. For things like this, unless it's obvious that they're violating Google's guidelines, I like to leave them alone and not include them in the disavow. So we've got our list.
Pro tips for your disavow file
A few pro tips when you actually put your disavow file together if you choose to do so.
If you find one bad link on a spammy domain, it’s usually a good idea to disavow the entire domain, because there’s a good chance that there are other links on there that you’re just not spotting.
So using the domain operator in your disavow file is usually a good idea, unless it’s a site like WordPress or something with a lot of subdomains.
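For reference, a disavow file is just a plain-text list, one entry per line, where lines starting with "#" are comments and the "domain:" operator covers every link from that host. The domains below are invented for illustration:

```text
# Disavow a single spammy URL
http://spammy-directory.example/widgets-page.html

# Disavow an entire domain once you've found one bad link on it
# (usually the safer choice, per the tip above)
domain:spammy-directory.example
```

You then upload this file through Google's disavow links tool in Search Console; it applies per property, so each domain you manage needs its own file.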
Use Search Console & third-party tools
Where do you find your links to disavow? First choice is generally Search Console, the link report in Search Console, because that’s the links that Google is actually using. It is helpful to use third-party tools, such as Moz Link Explorer, Ahrefs, SEMrush, whatever your link index is, and that’s because you can sort through the anchor text.
When Google gives you their link report, they don't include the anchor text. It's very helpful to use those anchor text reports, such as you would get in Moz Link Explorer, so you can sort through and find your over-optimized, spammy anchor text. Spotting those patterns makes it much easier to organize your information.
Try removing links
This happens on a lot of older sites: if you're auditing a site, it's a really good idea to check whether a disavow file already exists. It may have been created prior to Penguin 4.0, and it may already contain a lot of good links. You can try removing links from that disavow file and see if it helps your rankings, because those older disavow files often contain links that are actually helping you.
Record everything and treat it as an experiment
Finally, record everything. Treat this like any other SEO process and think of it as an experiment. If you disavow something by mistake and your rankings drop, or if they go up, you want to know what caused that. Be responsible for it and be a good SEO. All right, that's all we have for today.
Leave your own disavow comments below. If you like this video, please share. Thanks, everybody.
Bonus: I really liked these posts for detailing alternative ways of finding links to disavow, so I thought I’d share:
The final episode in our six-part One-Hour Guide to SEO series deals with a topic that’s a perennial favorite among SEOs: link building. Today, learn why links are important to both SEO and to Google, how Google likely measures the value of links, and a few key ways to begin earning your own.
Click on the whiteboard image above to open a high resolution version in a new tab!
Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. We are back with our final part in the One-Hour Guide to SEO, and this week talking about why links matter to search engines, how you can earn links, and things to consider when doing link building.
Why are links important to SEO?
So we’ve discussed sort of how search engines rank pages based on the value they provide to users. We’ve talked about how they consider keyword use and relevant topics and content on the page. But search engines also have this tool of being able to look at all of the links across the web and how they link to other pages, how they point between pages.
So it turns out that Google had this insight early on that what other people say about you is more important, at least to them, than what you say about yourself. So you may say, “I am the best resource on the web for learning about web marketing.” But it turns out Google is not going to believe you unless many other sources, that they also trust, say the same thing. Google’s big innovation, back in 1997 and 1998, when Sergey Brin and Larry Page came out with their search engine, Google, was PageRank, this idea that by looking at all the links that point to all the pages on the internet and then sort of doing this recursive process of seeing which are the most important and most linked to pages, they could give each page on the web a weight, an amount of PageRank.
Then those pages that had a lot of PageRank, because many people linked to them or many powerful people linked to them, would then pass more weight on when they linked. That understanding of the web is still in place today. It’s still a way that Google thinks about links. They’ve almost certainly moved on from the very simplistic PageRank formula that came out in the late ’90s, but that thinking underlies everything they’re doing.
How does Google measure the value of links?
Today, Google measures the value of links in many very sophisticated ways, which I’m not going to try and get into, and they’re not public about most of these anyway. But there is a lot of intelligence that we have about how they think about links, including things like more important, more authoritative, more well-linked-to pages are going to pass more weight when they link.
A.) More important, authoritative, well-linked-to pages pass more weight when they link
That’s true of both individual URLs, an individual page, and websites, a whole website. So for example, if a page on The New York Times links to yoursite.com, that is almost certainly going to be vastly more powerful and influential in moving your rankings or moving your ability to rank in the future than if randstinysite.info — which I haven’t yet registered, but I’ll get on that — links to yoursite.com.
This weighting, this understanding that there are more and less powerful, important, and authoritative websites, and that the more powerful ones tend to provide more ranking value, is why so many SEOs and marketers use metrics like Moz's Domain Authority, or similar metrics from Moz's competitors in the software space, to try to intuit how powerful and influential a link from a given domain will be.
B.) Diversity of domains, rate of link growth, and editorial nature of links ALL matter
So the different kinds of domains and the rate of link growth and the editorial nature of those links all matter. So, for example, if I get many new links from many new websites that have never linked to me before and they are editorially given, meaning I haven’t spammed to place them, I haven’t paid to place them, they were granted to me because of interesting things that I did or because those sites wanted to editorially endorse my work or my resources, and I do that over time in greater quantities and at a greater rate of acceleration than my competitors, I am likely to outrank them for the words and phrases related to those topics, assuming that all the other smart SEO things that we’ve talked about in this One-Hour Guide have also been done.
C.) HTML-readable links that don’t have rel=”nofollow” and contain relevant anchor text on indexable pages pass link benefit
HTML-readable links, meaning links that a simple text browser or a simple bot like Googlebot can parse (Googlebot can be much more complex, as we talked about in the technical SEO section, but not necessarily all the time), pass value when they don't carry the rel="nofollow" attribute. rel="nofollow" is something you can append to a link to say, "I don't editorially endorse this," and many, many websites do.
If you post a link to Twitter or to Facebook or to LinkedIn or to YouTube, it's going to carry rel="nofollow", with the platform saying, "I, YouTube, don't editorially endorse this website that this random user has uploaded a video about." Okay. Well, it's hard to get a followed link from YouTube. A link that also contains relevant anchor text, on an indexable page that Google can actually crawl and see, is going to provide the maximum link benefit.
So <a href="https://yoursite.com">great tool for audience intelligence</a>, that would be the ideal link for my new startup, for example, which is SparkToro, because we do audience intelligence, and someone saying we're a tool is perfect. This is a link that Google can read, and it provides this information about what we do.
It says great tool for audience intelligence. Awesome. That is powerful anchor text that will help us rank for those words and phrases. There are loads more. There are things like which pages linked to and which pages linked from. There are spam characteristics and trustworthiness of the sources. Alt attributes, when they’re used in image tags, serve as the anchor text for the link, if the image is a link.
There's the relationship, the topical relationship, of the linking page and linking site. There's the text surrounding the link, which I think some tools out there offer you information about. There's the location on the page. All of this stuff is used by Google, along with hundreds more factors, to weight links. The important part for us, when we think about links, is that, generally speaking, if you cover your bases here (the link is indexable, carries good anchor text, comes from diverse domains at a good pace, is editorially given in nature, and is from important, authoritative, and well-linked-to sites), you're going to be golden 99% of the time.
Are links still important to Google?
Many folks I think ask wisely, “Are links still that important to Google? It seems like the search engine has grown in its understanding of the web and its capacities.” Well, there is some pretty solid evidence that links are still very powerful. I think the two most compelling to me are, one, the correlation of link metrics over time.
So like Google, Moz itself produces an index of the web. It is billions and billions of pages. I think it’s actually trillions of pages, trillions of links across hundreds of billions of pages. Moz produces metrics like number of linking root domains to any given domain on the web or any given page on the web.
Moz has a metric called Domain Authority or DA, which sort of tries to best replicate or best correlate to Google’s own rankings. So metrics like these, over time, have been shockingly stable. If it were the case someday that Google demoted the value of links in their ranking systems, basically said links are not worth that much, you would expect to see a rapid drop.
But from 2007 to 2019, we’ve never really seen that. It’s fluctuated. Mostly it fluctuates based on the size of the link index. So for many years Ahrefs and Majestic were bigger link indices than Moz. They had better link data, and their metrics were better correlated.
Now Moz, since 2018, is much bigger and has higher correlation than they do. So the various tools are sort of warring with each other, trying to get better and better for their customers. You can see those correlations with Google pretty high, pretty standard, especially for a system that supposedly contains hundreds, if not thousands of elements.
When you see a correlation of 0.25 or 0.3 with one number, like linking root domains or Page Authority, that's pretty surprising. The second piece of evidence is something many SEOs observe, and I think it's why so many SEO firms and companies pitch their clients this way: the number of new, high-quality, editorially given linking root domains (The New York Times linked to me, now The Washington Post linked to me, now wired.com linked to me, these high-quality, different domains) correlates very nicely with ranking positions.
So if you are ranking number 12 for a keyword phrase and suddenly that page generates many new links from high-quality sources, you can expect to see rapid movement up toward page one, position one, two, or three, and this is very frequent.
How do I get links?
Obviously, this is not alone, but very common. So I think the next reasonable question to ask is, “Okay, Rand, you’ve convinced me. Links are important. How do I get some?” Glad you asked. There are an infinite number of ways to earn new links, and I will not be able to represent them here. But professional SEOs and professional web marketers often use tactics that fall under a few buckets, and this is certainly not an exhaustive list, but can give you some starting points.
1. Content & outreach
The first one is content and outreach. Essentially, the marketer finds a resource that they could produce, that is relevant to their business, what they provide for customers, data that they have, interesting insights that they have, and they produce that resource knowing that there are people and publications out there that are likely to want to link to it once it exists.
Then they let those people and publications know. This is essentially how press and PR work. This is how a lot of content building and link outreach work. You produce the content itself, the resource, whatever it is, the tool, the dataset, the report, and then you message the people and publications who are likely to want to cover it or link to it or talk about it. That process is tried-and-true. It has worked very well for many, many marketers.
2. Link reclamation
Second is link reclamation. So this is essentially the process of saying, “Gosh, there are websites out there that used to link to me, that stopped linking.” The link broke. The link points to a 404, a page that no longer loads on my website.
The link was supposed to be a link, but they didn’t include the link. They said SparkToro, but they forgot to actually point to the SparkToro website. I should drop them a line. Maybe I’ll tweet at them, at the reporter who wrote about it and be like, “Hey, you forgot the link.” Those types of link reclamation processes can be very effective as well.
They’re often some of the easiest, lowest hanging fruit in the link building world.
3. Directories, resource pages, groups, events, etc.
Directories, resource pages, groups, events, things that you can join and participate in, both online or online and offline, so long as they have a website, often link to your site. The process is simply joining or submitting or sponsoring or what have you.
Most of the time, for example, when I get invited to speak at an event, they will take my biography, a short, three-sentence blurb, that includes a link to my website and what I do, and they will put it on their site. So pitching to speak at events is a way to get included in these groups. I started Moz with my mom, Gillian Muessig, and Moz has forever been a woman-owned business, and so there are women-owned business directories.
I don't think we actually did this, but we could easily go, "Hey, you should include Moz as a woman-owned business. We should be part of your directory here in Seattle." Great, that's a group we could absolutely join and get links from.
4. Competitors’ links
So this is basically the practice of finding and replicating the links your competitors have earned. You will almost certainly need to use tools to do this well, though there are some free ways to do it.
The simple, free way to do it is to say, "I have competitor 1's brand name and competitor 2's brand name. I'm going to search for the combination of those two in Google, and I'm going to look for places that have written about and linked to both of them, and see if I can also replicate the tactics that got them coverage." The slightly more sophisticated way is to go use a tool. Moz's Link Explorer does this.
So do tools from people like Majestic and Ahrefs. I’m not sure if SEMrush does. But basically you can plug in, “Here’s me. Here’s my competitors. Tell me who links to them and does not link to me.” Moz’s tool calls this the Link Intersect function. But you don’t even need the link intersect function.
You just plug in a competitor’s domain and look at here are all the links that point to them, and then you start to replicate their tactics. There are hundreds more and many, many resources on Moz’s website and other great websites about SEO out there that talk about many of these tactics, and you can certainly invest in those. Or you could conceivably hire someone who knows what they’re doing to go do this for you. Links are still powerful.
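The "link intersect" idea described above can be sketched as simple set arithmetic. The domain names here are invented for illustration; in practice the sets would come from a link index export:

```python
# Minimal link-intersect sketch: domains that link to every competitor
# but not to you are often the warmest outreach targets.
my_links = {"nytimes.com", "blog-a.example"}
competitor_links = {
    "competitor1.example": {"nytimes.com", "wired.com", "blog-b.example"},
    "competitor2.example": {"wired.com", "blog-b.example", "blog-c.example"},
}

# Intersect the competitors' backlink sets, then remove domains
# that already link to me.
links_to_all = set.intersection(*competitor_links.values())
targets = sorted(links_to_all - my_links)
print(targets)  # ['blog-b.example', 'wired.com']
```

This is exactly what the tools automate at scale, with the added work of crawling and deduplicating the link data for you.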
Okay. Thank you so much. I want to say a huge amount of appreciation to Moz and to Tyler, who’s behind the camera — he’s waving right now, you can’t see it, but he looks adorable waving — and to everyone who has helped make this possible, including Cyrus Shepard and Britney Muller and many others.
Hopefully, this one-hour segment on SEO can help you upgrade your skills dramatically. Hopefully, you’ll send it to some other folks who might need to upgrade their understanding and their skills around the practice. And I’ll see you again next week for another edition of Whiteboard Friday. Take care.
We’ve arrived at one of the meatiest SEO topics in our series: technical SEO. In this fifth part of the One-Hour Guide to SEO, Rand covers essential technical topics from crawlability to internal link structure to subfolders and far more. Watch on for a firmer grasp of technical SEO fundamentals!
Click on the whiteboard image above to open a high resolution version in a new tab!
Howdy, Moz fans, and welcome back to our special One-Hour Guide to SEO Whiteboard Friday series. This is Part V – Technical SEO. I want to be totally upfront. Technical SEO is a vast and deep discipline like any of the things we’ve been talking about in this One-Hour Guide.
There is no way in the next 10 minutes that I can give you everything that you’ll ever need to know about technical SEO, but we can cover many of the big, important, structural fundamentals. So that’s what we’re going to tackle today. You will come out of this having at least a good idea of what you need to be thinking about, and then you can go explore more resources from Moz and many other wonderful websites in the SEO world that can help you along these paths.
1. Every page on the website is unique & uniquely valuable
First off, every page on a website should be two things — unique, unique from all the other pages on that website, and uniquely valuable, meaning it provides some value that a user, a searcher would actually desire and want. Sometimes the degree to which it’s uniquely valuable may not be enough, and we’ll need to do some intelligent things.
So, for example, imagine we've got a page about X, Y, and Z versus a bunch of near-duplicates: "Oh, this is a little bit of a combination of X and Y that you can get to through searching and then filtering this way. Here's another copy of that XY page, but it's a slightly different version. Here's one with YZ. This is a page that has almost nothing on it, but we sort of need it to exist for some internal reason, and no one would ever want to find it through search engines."
Okay, when you encounter these types of pages as opposed to these unique and uniquely valuable ones, you want to think about: Should I be canonicalizing those, meaning point this one back to this one for search engine purposes? Maybe YZ just isn’t different enough from Z for it to be a separate page in Google’s eyes and in searchers’ eyes. So I’m going to use something called the rel=canonical tag to point this YZ page back to Z.
Maybe I want to remove these pages. Oh, this is totally non-valuable to anyone. 404 it. Get it out of here. Maybe I want to block bots from accessing this section of our site. Maybe these are search results that make sense if you’ve performed this query on our site, but they don’t make any sense to be indexed in Google. I’ll keep Google out of it using the robots.txt file or the meta robots or other things.
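For the "keep Google out of internal search results" case, a sketch of the two mechanisms mentioned (the paths are illustrative; match them to your own site's URL structure):

```text
# robots.txt at the site root: blocks crawling of the
# internal search results section entirely
User-agent: *
Disallow: /search/
```

Alternatively, a meta robots tag with content="noindex" on the individual pages lets crawlers fetch them but keeps them out of the index, which is the better choice when the pages are already indexed and you want them removed.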
2. Pages are accessible to crawlers, load fast, and can be fully parsed in a text-based browser
Secondarily, pages are accessible to crawlers. They should be accessible to crawlers. They should load fast, as fast as you possibly can. There’s a ton of resources about optimizing images and optimizing server response times and optimizing first paint and first meaningful paint and all these different things that go into speed.
But speed is good not only because of technical SEO issues, meaning Google can crawl your pages faster, which oftentimes when people speed up the load speed of their pages, they find that Google crawls more from them and crawls them more frequently, which is a wonderful thing, but also because pages that load fast make users happier. When you make users happier, you make it more likely that they will link and amplify and share and come back and keep loading and not click the back button, all these positive things and avoiding all these negative things.
3. Thin content, duplicate content, spider traps/infinite loops are eliminated
Thin content and duplicate content — thin content meaning content that doesn’t provide meaningfully useful, differentiated value, and duplicate content meaning it’s exactly the same as something else — spider traps and infinite loops, like calendaring systems, these should generally speaking be eliminated. If you have those duplicate versions and they exist for some reason, for example maybe you have a printer-friendly version of an article and the regular version of the article and the mobile version of the article, okay, there should probably be some canonicalization going on there, the rel=canonical tag being used to say this is the original version and here’s the mobile friendly version and those kinds of things.
If you have search results in the search results, Google generally prefers that you don’t do that. If you have slight variations, Google would prefer that you canonicalize those, especially if the filters on them are not meaningfully and usefully different for searchers.
4. Pages with valuable content are accessible through a shallow, thorough internal links structure
Number four, pages with valuable content on them should be accessible through just a few clicks, in a shallow but thorough internal link structure.
Now this is an idealized version. You’re probably rarely going to encounter exactly this. But let’s say I’m on my homepage and my homepage has 100 links to unique pages on it. That gets me to 100 pages. One hundred more links per page gets me to 10,000 pages, and 100 more gets me to 1 million.
So that's only three clicks from homepage to one million pages. You might say, "Well, Rand, that's a little bit of a perfect pyramid structure." I agree. Fair enough. Still, three to four clicks to any page on any website of nearly any size, unless we're talking about a site with hundreds of millions of pages or more, should be the general rule. I should be able to follow that path through either a sitemap or the site's own link structure.
If you have a complex structure and you need to use a sitemap, that’s fine. Google is fine with you using an HTML page-level sitemap. Or alternatively, you can just have a good link structure internally that gets everyone easily, within a few clicks, to every page on your site. You don’t want to have these holes that require, “Oh, yeah, if you wanted to reach that page, you could, but you’d have to go to our blog and then you’d have to click back to result 9, and then you’d have to click to result 18 and then to result 27, and then you can find it.”
No, that’s not ideal. That’s too many clicks to force people to make to get to a page that’s just a little ways back in your structure.
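The arithmetic behind the pyramid example above can be sketched in a couple of lines:

```python
# With `links_per_page` unique internal links on every page, `depth`
# clicks from the homepage can reach links_per_page ** depth pages
# (the idealized pyramid from the example above).
def reachable_pages(links_per_page: int, depth: int) -> int:
    return links_per_page ** depth

print(reachable_pages(100, 3))  # 1000000 -> a million pages in three clicks
```

Real sites never achieve this perfectly unique fan-out, which is why three to four clicks, rather than exactly three, is the practical rule of thumb.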
5. Pages should be optimized to display cleanly and clearly on any device, even at slow connection speeds
Five, I think this is obvious, but for many reasons, including the fact that Google considers mobile friendliness in its ranking systems, you want to have a page that loads clearly and cleanly on any device, even at slow connection speeds, optimized for both mobile and desktop, optimized for 4G and also optimized for 2G and no G.
6. Permanent redirects should use the 301 status code, dead pages the 404, temporarily unavailable the 503, and all okay should use the 200 status code
Permanent redirects. So this page was here. Now it’s over here. This old content, we’ve created a new version of it. Okay, old content, what do we do with you? Well, we might leave you there if we think you’re valuable, but we may redirect you. If you’re redirecting old stuff for any reason, it should generally use the 301 status code.
If you have a dead page, it should use the 404 status code. You could sometimes use 410 (Gone, permanently removed) as well. Temporarily unavailable, like we're having some downtime this weekend while we do maintenance, 503 is what you want. Everything is okay, everything is great, that's a 200. All of your pages that have meaningful content on them should return a 200 code.
These status codes, anything else beyond these, and maybe the 410, generally speaking should be avoided. There are some very occasional, rare, edge use cases. But if you find status codes other than these, for example if you’re using Moz, which crawls your website and reports all this data to you and does this technical audit every week, if you see status codes other than these, Moz or other software like it, Screaming Frog or Ryte or DeepCrawl or these other kinds, they’ll say, “Hey, this looks problematic to us. You should probably do something about this.”
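As one hedged illustration, assuming an Apache server (nginx and other servers have equivalent directives), the 301 and 404 handling above might be configured like this; the paths are placeholders:

```apache
# Permanent redirect: the old URL moved, so send a 301 to the new one
Redirect 301 /old-page /new-page

# Serve a friendly custom page for dead URLs, still with a 404 status
ErrorDocument 404 /not-found.html
```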
7. Use HTTPS (and make your site secure)
When you are building a website that you want to rank in search engines, it is very wise to use a security certificate and to have HTTPS rather than HTTP, the non-secure version. Those should also be canonicalized. There should never be a time when HTTP is the one that is loading preferably. Google also gives a small reward — I’m not even sure it’s that small anymore, it might be fairly significant at this point — to pages that use HTTPS or a penalty to those that don’t.
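As a sketch of that HTTP-to-HTTPS canonicalization, assuming an Apache server with mod_rewrite enabled (other servers have their own equivalents), the standard pattern is a site-wide 301:

```apache
# Assuming Apache + mod_rewrite: permanently redirect any
# non-HTTPS request to the HTTPS version of the same URL
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

Using a 301 here matters: it both canonicalizes the protocol for users and tells Google that HTTPS is the version to index.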
8. Use one domain, and subfolders rather than subdomains
In general, well, I don't even want to say in general. It is nearly universal, with a few edge cases — if you're a very advanced SEO, you might be able to ignore a little bit of this — but it is generally the case that you want one domain, not several. Allmystuff.com, not allmyseattlestuff.com, allmyportlandstuff.com, and allmylastuff.com.
Allmystuff.com is preferable for many, many technical reasons and also because the challenge of ranking multiple websites is so significant compared to the challenge of ranking one.
You want subfolders, not subdomains, meaning I want allmystuff.com/seattle, /la, and /portland, not seattle.allmystuff.com.
Why is this? Google's representatives have sometimes said that it doesn't really matter and that you should do whatever is easy for you. But I have so many cases over the years, case studies of folks who moved from a subdomain to a subfolder, where rankings increased overnight. Credit to Google's reps.
I’m sure they’re getting their information from somewhere. But very frankly, in the real world, it just works all the time to put it in a subfolder. I have never seen a problem being in the subfolder versus the subdomain, where there are so many problems and there are so many issues that I would strongly, strongly urge you against it. I think 95% of professional SEOs, who have ever had a case like this, would do likewise.
Relevant folders should be used rather than long, hyphenated URLs. This is one where we agree with Google. Google generally says that if you have allmystuff.com/seattle/storage-facilities/top-10-places, that is far better than /seattle-storage-facilities-top-10-places. Google is simply good at folder structure analysis and organization, users like it as well, and good breadcrumbs come from there.
There’s a bunch of benefits. Generally using this folder structure is preferred to very, very long URLs, especially if you have multiple pages in those folders.
9. Use breadcrumbs wisely on larger/deeper-structured sites
Last, but not least, at least last that we’ll talk about in this technical SEO discussion is using breadcrumbs wisely. So breadcrumbs, actually both technical and on-page, it’s good for this.
Google generally learns some things from the structure of your website from using breadcrumbs. They also give you this nice benefit in the search results, where they show your URL in this friendly way, especially on mobile, mobile more so than desktop. They’ll show home > seattle > storage facilities. Great, looks beautiful. Works nicely for users. It helps Google as well.
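For the markup side, Google's structured data documentation recommends JSON-LD BreadcrumbList markup for that "home > seattle > storage facilities" display; the URLs below are placeholders based on the running allmystuff.com example:

```html
<!-- schema.org BreadcrumbList in JSON-LD, placed anywhere in the page.
     The final item (the current page) may omit its "item" URL. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home",
      "item": "https://allmystuff.com/" },
    { "@type": "ListItem", "position": 2, "name": "Seattle",
      "item": "https://allmystuff.com/seattle/" },
    { "@type": "ListItem", "position": 3, "name": "Storage Facilities" }
  ]
}
</script>
```

Pair this with visible breadcrumb links on the page itself; the markup describes the trail, but users still need the clickable version.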
So there are plenty more in-depth resources that we can go into on many of these topics and others around technical SEO, but this is a good starting point. From here, we will take you to Part VI, our last one, on link building next week. Take care.
We’ve covered strategy, keyword research, and how to satisfy searcher intent — now it’s time to tackle optimizing the webpage itself! In the fourth part of the One-Hour Guide to SEO, Rand offers up an on-page SEO checklist to start you off on your way towards perfectly optimized and keyword-targeted pages.
If you missed them, check out the other episodes in the series so far:
Click on the whiteboard image above to open a high resolution version in a new tab!
Howdy, Moz fans. Welcome to another edition of our special One-Hour Guide to SEO. We are now on Part IV – Keyword Targeting and On-Page Optimization. So hopefully, you’ve watched Part III, where we talked about searcher satisfaction, how to make sure searchers are happy with the page content that you create and the user experience that you build for them, as well as Part II, where we talked about keyword research and how to make sure that you are targeting the right words and phrases that searchers are actually looking for, that you think you can actually rank for, and that actually get real organic click-through rate, because Google’s zero-click searches are rising.
Now we’re into on-page SEO. So this is essentially taking the words and phrases that we know we want to rank for with the content that we know will help searchers accomplish their task. Now how do we make sure that the page is optimal for ranking in Google?
On-page SEO has evolved
Well, this is very different from the way it was years ago. A long time ago, and unfortunately many people still believe this to be true of SEO, it was: How do I stuff my keywords into all the right tags and places on the page? How do I take advantage of things like the meta keywords tag, which search engines haven't used in a decade, maybe two? How do I stuff all my words and phrases into my title, my URL, my description, my headline, my H2 through H6 tags, all these kinds of things?
Most of that does not matter, but some of it still does. Some of it is still important, and we need to run through what those are so that you give yourself the best possible chance for ranking.
The on-page SEO checklist
So what I've done here is created a brief on-page SEO checklist. It is not comprehensive, especially on the technical side, because we're saving that for Part V of this Guide, the technical SEO section. But some of the most important items are here.
☑ Descriptive, compelling, keyword-rich title element
Many of the most important things are on here, and those include things like a descriptive, compelling, keyword-rich but not stuffed title element, also called the page title or a title tag. So, for example, if I am a tool website, like toolsource.com — I made that domain name up, I assume it's registered to somebody — and I want to rank for the "best online survey tools," well, "The Best Online Survey Tools for 2019" is a great title tag, and it's very different from "best online survey tools, best online survey software, best online survey software 2019." You've seen title tags like that. You've seen pages that contain stuff like that. That is no longer good SEO practice.
So we want a title that’s descriptive, compelling, and makes me want to click. Remember that this title is also going to show up in the search results as the title of the snippet that your website appears in.
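One practical wrinkle: Google truncates long titles in the snippet. A minimal sketch of a pre-publish check, assuming a rule-of-thumb limit of roughly 60 characters (Google actually truncates by pixel width, around 600px, so this is a heuristic, not an official cutoff):

```python
# Rough check that a page title is unlikely to be truncated in the SERP.
# The 60-character limit is a rule of thumb, not a Google-documented cutoff:
# Google truncates by rendered pixel width (~600px), so character count
# is only an approximation.

MAX_TITLE_CHARS = 60  # heuristic limit

def title_fits(title: str, limit: int = MAX_TITLE_CHARS) -> bool:
    """Return True if the title probably displays in full in the snippet."""
    return len(title) <= limit

print(title_fits("The Best Online Survey Tools for 2019"))
print(title_fits("Best Online Survey Tools, Best Online Survey Software, "
                 "Best Online Survey Software 2019"))
```

A check like this is useful in a CMS publishing workflow, flagging titles that will likely get cut off mid-phrase in the results.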
☑ Meta description designed to draw the click
Second, a meta description. This is still used by search engines, not for rankings though. Sort of think of it like ad text. You are drawing a click, or you’re attempting to draw the click. So what you want to do is have a description that tells people what’s on the page and inspires them, incites them, makes them want to click on your result instead of somebody else’s. That’s your chance to say, “Here’s why we’re valuable and useful.”
☑ Easy-to-read, sensible, short URL
An easy-to-read, sensible, short URL. For example, toolsource.com/reviews/best-online-surveys-2019. Perfect, very legible, very readable. I see that in the results, I think, “Okay, I know what that page is going to be.” I see that copied and pasted somewhere on the web, I think, “I know what’s going to be at that URL. That looks relevant to me.”
Or reviews.best-online-tools.info. Okay, well, first off, that’s a freaking terrible domain name. Then /oldseqs?ide=17, a bunch of weird letters, a “tabdetail=” this, and a UTM parameter that. I don’t know what this is. I don’t know what all this means. By the way, having more than one or two URL parameters correlates poorly with ranking in search results and is not recommended. So you want to try and rewrite these to be friendlier, shorter, more sensible, and readable by a human being. That will help Google as well.
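Rewriting a headline into that kind of readable slug is easy to automate. A minimal sketch (the toolsource.com URL is the made-up example from above):

```python
import re

def slugify(text: str) -> str:
    """Turn a headline into a short, readable URL slug:
    lowercase, hyphen-separated, no punctuation or parameters."""
    slug = text.lower()
    # Collapse every run of non-alphanumeric characters into one hyphen.
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")

slug = slugify("The Best Online Surveys (2019)")
print(f"https://toolsource.com/reviews/{slug}")
```

Most CMSs do something like this for you out of the box, but it’s worth checking that yours isn’t appending IDs or session parameters on top of the slug.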
☑ First paragraph optimized for appearing in featured snippets
That first paragraph, the first paragraph of the content or the first few words of the page, should be optimized for appearing in what Google calls featured snippets. A featured snippet is when I perform a search and, for many queries, I don’t just see a list of pages. Sometimes I’ll see a box, often with an image and a bunch of descriptive text drawn from the page, often from the first paragraph or two. To get that featured snippet, you have to be able to rank on page one, and you need to be optimized to answer the query right in your first paragraph. But this is an opportunity: you can be ranking in position three or four or five and still have the featured snippet answer above all the other results. It’s awesome when you can do this in SEO, a very, very powerful thing. For featured snippet optimization, there are a bunch of resources on Moz’s website that we can point you to as well.
☑ Use the keyword target intelligently in…
☑ The headline
So if I’m trying to rank for “best online survey tools,” I would try and use that in my headline. Generally speaking, I like to have the headline and the title of the piece nearly the same or exactly the same so that when someone clicks on that title, they get the same headline on the page and they don’t get this cognitive dissonance between the two.
☑ The first paragraph
The first paragraph, we talked about.
☑ The page content
The page’s content, you don’t want to have a page that’s talking about best online survey tools and you never mention online surveys. That would be a little weird.
☑ Internal link anchors
An internal link anchor. So if other places on your website talk about online survey tools, you should be linking to this page. This is helpful for Google finding it, helpful for visitors finding it, and helpful to say this is the page that is about this on our website.
I do strongly recommend taking the following advice, which is that we are no longer in a world where it makes sense to target one keyword per page. For example, “best online survey tools,” “best online survey software,” and “best online survey tools 2019” are technically three unique keyword phrases. They have different search volumes, and slightly different results will show up for each of them. But it is no longer the case, as it was maybe a decade ago, that I would go create a page for each one of those separate things.
Instead, because these all share the same searcher intent, I want to go with one page, just a single URL that targets all the keywords that share the exact same searcher intent. If searchers are looking to find exactly the same thing but with slightly modified or slight variations in how they phrase things, you should have a page that serves all of those keywords with that same searcher intent rather than multiple pages that try to break those up, for a bunch of reasons. One, it’s really hard to get links to all those different pages. Getting links just period is very challenging, and you need them to rank.
Second, the difference between those pages is going to be very, very subtle, and it will look awkward to Google that you have these slight variations of almost the same thing. It might even look to them like duplicate, very similar, or low-quality content, which can get you down-ranked. So stick to one page per set of shared-intent keywords.
☑ Leverage appropriate rich snippet options
Next, you want to leverage appropriate rich snippet options. So, for example, if you are in the recipes space, you can use a schema markup for recipes to show Google that you’ve got a picture of the recipe and a cooking time and all these different details. Google offers this in a wide variety of places. When you’re doing reviews, they offer you the star ratings. Schema.org has a full list of these, and Google’s rich snippets markup page offers a bunch more. So we’ll point you to both of those as well.
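Structured data like this is usually embedded in the page as a JSON-LD script block. A minimal sketch for the recipe example, using schema.org Recipe properties — all of the values here are illustrative, and you should check Google’s structured data documentation for which properties are required for a given rich result:

```python
import json

# Minimal JSON-LD sketch for a recipe rich snippet, built from schema.org
# Recipe properties. Values are illustrative placeholders, not real data.
recipe = {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Classic Margherita Pizza",
    "image": "https://example.com/images/margherita-pizza.jpg",
    "cookTime": "PT25M",  # ISO 8601 duration: 25 minutes
    "aggregateRating": {
        "@type": "AggregateRating",  # this is what powers star ratings
        "ratingValue": "4.8",
        "ratingCount": "214",
    },
}

# Embed in the page's HTML as:
#   <script type="application/ld+json"> ...this JSON... </script>
print(json.dumps(recipe, indent=2))
```

Google’s Rich Results Test will tell you whether markup like this actually qualifies the page for an enhanced listing.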
☑ Images on the page employ…
Last, but certainly not least, because image search is such a huge portion of where Google’s search traffic comes from and goes to, it is very wise to optimize the images on the page. Image search can now send significant traffic to you, and optimizing for images can sometimes mean that other people will find your images through Google Images, put them on their own website, and link back to you, which solves a huge problem. Getting links is very hard, and images are a great way to do it.
☑ Descriptive, keyword-rich filenames
The images on your page should employ descriptive, keyword-rich filenames, meaning if I have one for Typeform, I don’t want it to be named pic1, pic2, or pic3. I want it to be typeform-logo or typeform-survey-software as the name of the file.
☑ Descriptive alt attributes
The alt attribute (often informally called the alt tag) is part of how you describe the image for screen readers and other accessibility-focused devices, and Google also uses that text too.
☑ Caption text (if appropriate)
Caption text, if that’s appropriate, if you have like a photograph and a caption describing it, you want to be descriptive of what’s actually in the picture.
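Putting the filename, alt attribute, and caption items together, the markup you’re aiming for is a figure with all three in place. A minimal sketch that assembles it — the helper function, filenames, and alt text here are all hypothetical examples, not part of any real site:

```python
# Sketch of image markup covering three checklist items: a descriptive,
# keyword-rich filename, an alt attribute, and a caption. All names and
# text below are illustrative.
def image_figure(filename: str, alt: str, caption: str) -> str:
    """Return an HTML <figure> with a descriptive src, alt, and caption."""
    return (
        f'<figure>\n'
        f'  <img src="/images/{filename}" alt="{alt}">\n'
        f'  <figcaption>{caption}</figcaption>\n'
        f'</figure>'
    )

html = image_figure(
    "typeform-survey-software.png",                 # descriptive filename
    "Typeform's online survey builder interface",   # alt for accessibility
    "Building a customer survey in Typeform",       # caption describing the image
)
print(html)
```

Note the src path keeps the image on the site’s own domain, which matters for the next checklist item.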
☑ Stored in same domain and subdomain
These files, in order to perform well, they generally need to be hosted on the same domain and subdomain. If, for example, all your images are stored on an Amazon Web Services domain and you don’t bother rewriting or making sure that the domain looks like it’s on toolsource.com/photos or /images here, that can cause real ranking problems. Oftentimes you won’t perform at all in Google images because they don’t associate the image with the same domain. Same subdomain as well is preferable.
If you do all these things and you nail searcher intent and you’ve got your keyword research, you are ready to move on to technical SEO and link building and then start ranking. So we’ll see you for that next edition next week. Take care.
Satisfying your searchers is a big part of what it means to be successful in modern SEO. And optimal searcher satisfaction means gaining a deep understanding of them and the queries they use to search. In this section of the One-Hour Guide to SEO, Rand covers everything you need to know about how to satisfy searchers, including the top four priorities you need to have and tips on how to avoid pogo-sticking in the SERPs.
Click on the whiteboard image above to open a high-resolution version in a new tab!
Howdy, Moz fans, and welcome to our special edition One-Hour Guide to SEO Part III on searcher satisfaction. So historically, if we were doing a guide to SEO in the long-ago past, we probably wouldn’t even be talking about searcher satisfaction.
What do searchers want from Google’s results?
But Google has made such a significant number of advances in the last 5 to 10 years that searcher satisfaction is now a huge part of how you can be successful in SEO. I’ll explain what I mean here. Let’s say our friend Arlen here is thinking about going on vacation to Italy.
So she goes to Google. She types in “best places to visit in Italy,” and she gets a list of results. Now Google sorts those results in a number of ways. They sort them by the most authoritative, the most comprehensive. They use links and link data in a lot of different ways to try and get at that. They use content data, what’s on the page, and keyword data.
They use historical performance data about which sites have done well for searchers in the past. All of these things sort of feed into searcher satisfaction. So when Arlen performs this query, she has a bunch of questions in her head, things like I want a list of popular Italian vacation destinations, and I want some comparison of those locations.
Maybe I want the ability to sort and filter based on my personal preferences. I want to know the best times of year to go. I want to know the weather forecast and what to see and do and hotel and lodging info and transportation and accessibility information and cultural tips and probably dozens more questions that I can’t even list out here. But when you, as a content creator and as a search engine optimization professional, are creating and crafting content and trying to optimize that content so that it performs well in Google’s results, you need to be considering what are all of these questions.
How to craft content that satisfies your searchers
This is why searcher empathy, customer empathy, being able to get inside Arlen’s head or your customer’s head and say, “What does she want? What is she looking for?” is one of the most powerful ways to craft content that performs better than your competition in search engines, because it turns out a lot of people don’t do this.
Priority 1: Answer the searcher’s questions comprehensively and with authority
So if I’m planning my page, what is the best page I could possibly craft to try and rank for “best places to visit in Italy,” which is a very popular search term, extremely competitive? I would think about obviously there’s all sorts of keyword stuff and on-page optimization stuff, which we will talk about in Part IV, but my priorities are answer the searcher’s primary questions comprehensively and authoritatively. If I can do that, I am in good shape. I’m ahead of a lot of the pack.
Priority 2: Provide an easy-to-use, fast-loading, well-designed interface that’s a pleasure to interact with
Second, I want to provide a great user experience. That means easy to use, fast-loading, well-designed, that’s a pleasure to interact with. I want the experience of a visitor, a searcher who lands on this page to be, “Wow, this is much better than the typical experience that I get when I land on a lot of other sites.”
Priority 3: Solve the searcher’s next tasks and questions with content, tools, or links
Priority three, I want to solve the searcher’s next tasks and questions with either content on my own site or tools and resources or links or the ability to do them right here so that they don’t have to go back to Google and do other things or visit other websites to try and accomplish the tasks, like figuring out a good hotel or figuring out the weather forecast. A lot of sites don’t do this comprehensively today, which is why it’s an advantage if you do.
Priority 4: Consider creative elements that may give you a long-term competitive advantage
Priority four is to consider some creative elements, maybe interactive tools or an interactive map or sorting and filtering options, that could give you a long-term competitive advantage, something that’s difficult for other people who want to rank for this search term to build.
Maybe that’s the data that you get. Maybe it’s the editorial content. Maybe it’s your photographs. Maybe it’s your tools and interactive elements. Whatever the case.
Do NOT give searchers a reason to click that back button!
One of the biggest goals of searcher satisfaction is to make sure that this scenario does not happen to you. You do not want to give searchers a reason to click that back button and choose someone else.
The search engine literature calls this “pogo-sticking.” Basically, if I do a search for “best places to visit in Italy” and I click on, let’s say, U.S. News & World Report, and I find that that page does not do a great job answering my query, or it does a fine job but it’s got a bunch of annoying popovers, it’s slow to load, and it has all these things it’s trying to sell me, so I click the back button and choose a different result from Touropia or Earth Trackers.
Over time, Google will figure out that U.S. News & World Report is not doing a good job of answering the searcher’s query, of providing a satisfactory experience, and they will push them down in the results and push these other ones, the ones that are doing a good job, up in the results. You want to be the result that satisfies a searcher, that gets into their head, answers their questions, and helps them solve their task, and that will give you an advantage over time in Google’s rankings.
All right, we’ll see you next time for Part IV on on-page optimization. Take care.