When you’ve got one of Google’s most helpful and empathetic voices willing to answer your most pressing SEO questions, what do you ask? Will Critchlow recently had the honor of interviewing Google’s John Mueller at SearchLove London, and in this week’s edition of Whiteboard Friday he shares his best lessons from that session, covering the concept of Domain Authority, the great subdomain versus subfolder debate, and a view into the technical workings of noindex/nofollow.
Hi, Whiteboard Friday fans. I’m Will Critchlow from Distilled. I found myself in Seattle and wanted to record another Whiteboard Friday video to talk through some things I learned when I got to sit down with John Mueller from Google at our recent SearchLove London conference.
So I got to interview John on stage, and, as many of you may know, John is a webmaster relations guy at Google and really a point of contact for many of us in the industry when there are technical questions or questions about how Google is treating different things. If you followed some of the stuff that I’ve written and talked about in the past, you’ll know that I’ve always been a little bit suspicious of some of the official lines that come out of Google and felt like either we don’t get the full story or we haven’t been able to drill in deep enough and really figure out what’s going on.
I was under no illusions that I might be able to completely fix this in one go, but I did want to grill John on a couple of specific things where I felt like we hadn’t asked clearly enough or got the full story. Today I wanted to run through a few things that I learned when John and I sat down together. A little side note: I found it really fascinating doing this kind of interview. I sat on stage in a kind of journalistic setting, which I had never done before. Maybe I’ll do a follow-up Whiteboard Friday one day on things I learned about how to run interviews.
1. Does Google have a “Domain Authority” concept?
But the first thing that I wanted to quiz John about was this domain authority idea. So here we are on Moz, which has a proprietary metric called Domain Authority (DA). I feel like when we as an industry have asked Google, and John in particular, whether they have a concept of domain authority, the question has got bundled up with a feeling that he’s had an easy way out: he can answer, “No, no, that’s a proprietary Moz metric. We don’t have that.”
I felt like that had got a bit confusing, because our suspicion is that there is some kind of an authority or a trust metric that Google has and holds at a domain level. We think that’s true, but we felt like they had always been able to wriggle out of answering the question. So I said to John, “Okay, I am not asking you do you use Moz’s domain authority metric in your ranking factors. Like we know that isn’t the case. But do you have something a little bit like it?”
Yes, Google has metrics that map into similar things
John said yes. His exact quote was that they have metrics that “map into similar things.” My way of phrasing this was: it’s stuff held at the domain level, based on things like link authority, and used to understand performance or to rank content across an entire domain. John confirmed they have something similar to that.
New content inherits those metrics
They use it in particular when they discover new content on an existing domain. New content, in some sense, can inherit some of the authority from the domain. This is part of the reason why we figured they must have something like this: we’ve seen identical content perform differently on different sites, so we know there’s something to it. John confirmed that until a piece of content has been around long enough to accrue its own link metrics and usage metrics, it can inherit some of this stuff from the domain in the intervening time.
Not wholly link-based
He did also just confirm that it’s not just link-based. This is not just a domain-level PageRank type thing.
2. Subdomains versus subfolders
This led me into the second thing that I really wanted to get out of him, which was — and when I raised this, I got kind of an eye roll, “Are we really going down this rabbit hole” — the subdomain versus subfolder question. You might have seen me talk about this. You might have seen people like Rand talk about this, where we’ve seen cases and we have case studies of moving blog.example.com to example.com/blog and changing nothing else and getting an uplift.
We know something must be going on, and yet the official line out of Google has for a very long time been: “We don’t treat these things differently. There is nothing special about subfolders. We’re perfectly happy with subdomains. Do whatever is right for your business.” We’ve had this kind of back-and-forth a few times. The way I put it to John was I said, “We have seen these case studies. How would you explain this?”
They try to figure out what belongs to the site
To his credit, John said, “Yes, we’ve seen them as well.” So he said, yes, Google has also seen these things. He acknowledged this is true. He acknowledged that it happens. The way he explained it connects back into this Domain Authority thing in my mind, which is to say that the way they think about it is: Are these pages on this subdomain part of the same website as things on the main domain?
That’s kind of the main question. They try and figure out, as he put it, “what belongs to this site.” We all know of sites where subdomains are entirely different sites. If you think about a blogspot.com or a WordPress.com domain, subdomains might be owned and managed by entirely different people, and there would be no reason for that authority to pass across. But what Google is trying to figure out is, “Is this subdomain part of this main site?”
Sometimes this includes subdomains and sometimes not
He said sometimes they determine that it is, and sometimes they determine that it is not. If it is part of the site, in their estimation, then they will treat it as equivalent to a subfolder. This, for me, pretty much closes this loop. I think we understand each other now, which is Google is saying, in these certain circumstances, they will be treated identically, but there are circumstances where it can be treated differently.
My recommendation stays what it’s always been, which is 100% if you’re starting from the outset, put it on a subfolder. There’s no upside to the subdomain. Why would you risk the fact that Google might treat it as a separate site? If it is currently on a subdomain, then it’s a little trickier to make that case. I would personally be arguing for the integration and for making that move.
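If you do decide to make that move, the mechanics are a permanent (301) redirect from every subdomain URL to its subfolder equivalent, followed by the usual migration hygiene (internal links, sitemaps, canonicals). A minimal sketch, assuming a hypothetical nginx setup and the example.com names used above:

```nginx
# Permanently redirect every blog.example.com URL to its
# example.com/blog equivalent, preserving path and query string.
server {
    server_name blog.example.com;
    return 301 https://example.com/blog$request_uri;
}
```

The same redirect can be expressed in Apache, a CDN rule, or your CMS; the key is that it’s a 301 (permanent) and that it maps paths one-to-one rather than sending everything to the blog homepage.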
If it’s treated as part of the site, a subdomain is equivalent to a subfolder
Unfortunately, but somewhat predictably, I couldn’t tie John down to any particular way of telling if this is the case. If your content is currently on a subdomain, there isn’t really any way of telling whether Google is treating it differently, which is a shame. But at least we understand each other now, and I think we’ve got to the root of the confusion. These case studies are real. This is a real thing. Certainly, in certain circumstances, moving from a subdomain to a subfolder can improve performance.
3. Noindex’s impact on nofollow
The third thing that I want to talk about is a little bit more geeked out and technical, and, in some sense, it leads to some bigger-picture lessons and thinking. A little while ago John kind of caught us out by talking about how, if you have a page that you noindex and keep it that way for a long time, Google will eventually treat it equivalently to a noindex, nofollow.
In the long-run, a noindex page’s links effectively become nofollow
In other words, the links off that page, even if you’ve set it as noindex, follow, will effectively be nofollowed. We found that a little bit confusing and surprising. I certainly had assumed it didn’t work that way, simply because the noindex, follow directive exists, and the fact that it’s a thing seems to suggest that the links ought to keep being followed.
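For reference, the directive in question is the standard robots meta tag. A noindex, follow page carries generic markup like this (standard robots meta syntax, not anything Google-specific):

```html
<!-- Tells crawlers: don't index this page, but do follow its links.
     Per John's comments, Google eventually treats a long-standing
     noindex page as if it were noindex, nofollow anyway. -->
<meta name="robots" content="noindex, follow">
```

Note that "follow" is the default behavior, so `content="noindex"` alone has the same effect; the explicit form just makes the intent visible.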
It’s been this way for a long time
It wasn’t really so much about the specifics of this, but more the like: How did we not know this? How did this come about and so forth? John talked about how, firstly, it has been this way for a long time. I think he was making the point none of you all noticed, so how big a deal can this really be? I put it back to him that this is kind of a subtle thing and very hard to test, very hard to extract out the different confounding factors that might be going on.
I’m not surprised that, as an industry, we missed it. But the point being it’s been this way for a long time, and Google’s view and certainly John’s view was that this hadn’t been hidden from us so much as the people who knew this hadn’t realized that they needed to tell anyone. The actual engineers working on the search algorithm, they had a curse of knowledge.
The curse of knowledge: engineers didn’t realize webmasters had the wrong idea
They knew it worked this way, and they had never realized that webmasters didn’t know that or thought any differently. One of the things I was trying to push John on a little more was to say, “More of this, please. Give us more access to the engineers. Give us more insight into their way of thinking. Get them to answer more questions, because out of that we’ll spot the stuff where we can be like, ‘Oh, hey, that thing there, that was something I didn’t know.’ Then we can drill deeper into that.”
That led us into a little bit of a conversation about how John operates when he doesn’t know the answer, and so there were some bits and pieces that were new to me at least about how this works. John said he himself is generally not attending search quality meetings. The way he works is largely off his knowledge and knowledge base type of content, but he has access to engineers.
They’re not dedicated to the webmaster relations operation. He’s just going around the organization, finding individual Google engineers to answer these questions. It was somewhat interesting to me at least to find that out. I think hopefully, over time, we can generally push and say, “Let’s look for those engineers. John, bring them to the front whenever they want to be visible, because they’re able to answer these kinds of questions that might just be that curse of knowledge that they knew this all along and we as marketers hadn’t figured out this was how things worked.”
That was my quick run-through of some of the things that I learned when I interviewed John. We’ll link over to more resources and transcripts and so forth. But it’s been a blast. Take care.
…but please don’t come away with the wrong storyline from this statistic.
As local brands and their marketers watch Google play Trojan horse, shifting from top benefactor to top competitor by replacing former “free” publicity with paid packs, Local Service Ads, zero-click SERPs, and related structures, it’s no surprise to see forum members asking, “Do I even need a website anymore?”
Our answer to this question is, “Yes, you’ve never needed a website more than you will in 2019.” In this post, we’ll examine:
- Why it looks like local businesses don’t need websites
- Statistical proofs of why local businesses need websites now more than ever
- The current status of local business websites and most-needed improvements
How Google stopped bearing so many gifts
Within recent memory, a Google query with local intent brought up a big pack of ten nearby businesses, with each entry taking the user directly to these brands’ websites for all of their next steps. A modest amount of marketing effort was rewarded with a shower of Google gifts in the form of rankings, traffic, and conversions.
Then these generous SERPs shrank to seven spots, and then three, with the mobile sea change thrown into the bargain: layers and layers of Google-owned interfaces instead of direct-to-website links. In 2018, when we rustle through the wrapping paper, the presents we find from Google look cheaper, smaller, and less magnificent.
Consider these five key developments:
1) Zero-click mobile SERPs
This slide from a recent presentation by Rand Fishkin encapsulates his findings regarding the growth of zero-click SERPs between 2016 and 2018. Mobile users have experienced a 20% increase in delivery of search engine results that don’t require them to go any deeper than Google’s own interface.
2) The encroachment of paid ads into local packs
When Dr. Peter J. Meyers surveyed 11,000 SERPs in 2018, he found that 35% of competitive local packs feature ads.
3) Google becoming a lead gen agency
At last count, Google’s Local Service Ads program, via which they interpose themselves as the paid lead gen agent between businesses and consumers, has taken over 23 business categories in 77 US cities.
4) Even your branded SERPs don’t belong to you
When a user specifically searches for your brand and your Google Knowledge Panel pops up, you can likely cope with the long-standing “People Also Search For” set of competitors at the bottom of it. But that’s not the same as Google allowing Groupon to advertise at the top of your KP, or putting lead gen from DoorDash and GrubHub front and center to nickel and dime you on your own customers’ orders.
5) Google is being called the new “homepage” for local businesses
As highlighted at the beginning of this post, 64% of marketers agree that Google is becoming the new “homepage” for local businesses. This concept, coined by Mike Blumenthal, signifies that a user looking at a Google Knowledge Panel can get basic business info, make a phone call, get directions, book something, ask a question, take a virtual tour, read microblog posts, see hours of operation, thumb through photos, see busy times, and read and leave reviews. Without ever having to click through to a brand’s domain, the user may be fully satisfied.
“Nothing is enough for the man to whom enough is too little.”
There are many more examples we could gather, but they can all be summed up in one way: None of Google’s most recent local initiatives are about driving customers to brands’ own websites. Local SERPs have shrunk and have been re-engineered to keep users within Google’s platforms to generate maximum revenue for Google and their partners.
You may be as philosophical as Epicurus about this and say that Google has every right to be as profitable as they can with their own product, even if they don’t really need to siphon more revenue off local businesses. But if Google’s recent trajectory causes your brand or agency to conclude that websites have become obsolete in this heavily controlled environment, please keep reading.
Your website is your bedrock
Businesses that rank highly organically are very likely to have high associated local pack rankings. In the following screenshot, if you take away the directory-type platforms, you’ll see that the brand websites ranking on page 1 for “deli athens ga” are also the two businesses that have made it into Google’s local pack:
How often do the top 3 Google local pack results also have 1st-page organic rankings?
In a small study, we looked at 15 head keywords across 7 US cities and towns. This yielded 315 possible entries in Google’s local packs. Of those 315, 235 of the businesses ranking in the local packs also had page 1 organic rankings. That’s roughly a 75% overlap between organic website rankings and local pack presence.
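The arithmetic behind that figure is simple to verify:

```python
# Reproduce the overlap figure from the small study above.
pack_entries = 315       # 15 keywords x 7 cities x 3 local pack positions
also_on_page_one = 235   # pack businesses that also ranked on page 1 organically

overlap_pct = also_on_page_one / pack_entries * 100
print(f"{overlap_pct:.1f}%")  # 74.6%, i.e. roughly 75%
```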
*It’s worth noting that where local and organic results did not correlate, it was sometimes due to the presence of spam GMB listings, or to mystery SERPs that did not make sense at first glance, perhaps as a result of Google testing in some cases.
Additionally, many local businesses are not making it to the first page of Google anymore in some categories because the organic SERPs are inundated with best-of lists and directories. Often, local business websites were pushed down to the second page of the organic results. In other words, if spam, “best-ofs,” and mysteries were removed, the local-organic correlation would likely be much higher than 75%.
Further, one recent study found that even when Google’s Local Service Ads are present, 43.9% of clicks went to the organic SERPs. Obviously, if you can make it to the top of the organic SERPs, this puts you in very good CTR shape from a purely organic standpoint.
Your takeaway from this
The local businesses you market may not be able to stave off the onslaught of Google’s zero-click SERPs, paid SERPs, and lead gen features, but where “free” local 3-packs still exist, your very best bet for being included in them is to have the strongest possible website. Moreover, organic SERPs remain a substantial source of clicks.
Far from it being the case that websites have become obsolete, they are the firmest bedrock for maintaining free local SERP visibility amidst an increasing scarcity of opportunities.
This calls for an industry-wide doubling down on organic metrics that matter most.
Bridging the local-organic gap
“We are what we repeatedly do. Excellence, then, is not an act, but a habit.”
A 2017 CNBC survey found that 45% of small businesses have no website, and, while most large enterprises have websites, many local businesses qualify as “small.”
Moreover, a recent audit of 9,392 Google My Business listings found that 27% have no website link.
When 1,411 marketers were asked which one task they want clients to devote more resources to, it’s no coincidence that 66% named a website-oriented asset. This includes local content development, on-site optimization, local link building, technical analysis of rankings/traffic/conversions, and website design, as shown in the following Moz survey graphic:
In an environment in which websites are table stakes for competitive local pack rankings, virtually all local businesses not only need one, but they need it to be as strong as possible so that it achieves maximum organic rankings.
What makes a website strong?
The Moz Beginner’s Guide to SEO offers incredibly detailed guidelines for creating the best possible website. While we recommend that everyone marketing a local business read through this in-depth guide, we can sum up its contents here by stating that strong websites combine:
- Technical basics
- Excellent usability
- On-site optimization
- Relevant content publication
For our present purpose, let’s take a special look at those last three elements.
On-site optimization and relevant content publication
There was a time when on-site SEO and content development were treated almost independently of one another. And while local businesses will need to make a little extra effort to put their basic contact information in prominent places on their websites (such as the footer and Contact Us page), publication and optimization should be viewed as a single topic. A modern strategy takes all of the following into account:
- Keyword and real-world research tell a local business what consumers want
- These consumer desires are then reflected in what the business publishes on its website, including its homepage, location landing pages, about page, blog and other components
- Full reflection of consumer desires includes ensuring that human language (discovered via keyword and real-world research) is implemented in all elements of each page, including its tags, headings, descriptions, text, and in some cases, markup
What we’re describing here isn’t a set of disconnected efforts. It’s a single effort that’s integral to researching, writing, and publishing the website. Far from stuffing keywords into a tag or a page’s content, focus has shifted to building topical authority in the eyes of search engines like Google by building an authoritative resource for a particular consumer demographic. The more closely a business is able to reflect customers’ needs (including the language of their needs), in every possible component of its website, the more relevant it becomes.
A hypothetical example of this would be a large medical clinic in Dallas. Last year, their phone staff was inundated with basic questions about flu shots, like where and when to get them, what they cost, would they cause side effects, what about side effects on people with pre-existing health conditions, etc. This year, the medical center’s marketing team took a look at Moz Keyword Explorer and saw that there’s an enormous volume of questions surrounding flu shots:
This tiny segment of the findings of the free keyword research tool, Answer the Public, further illustrates how many questions people have about flu shots:
The medical clinic need not compete nationally for these topics, but at a local level, a page on the website can answer nearly every question a nearby patient could have about this subject. The page, created properly, will reflect human language in its tags, headings, descriptions, text, and markup. It will tell all patients where to come and when to come for this procedure. It has the potential to cut down on time-consuming phone calls.
And, finally, it will build topical authority in the eyes of Google to strengthen the clinic’s chances of ranking well organically… which can then translate to improved local rankings.
It’s important to note that keyword research tools typically do not reflect location very accurately, so research is typically done at a national level, and then adjusted to reflect regional or local language differences and geographic terms, after the fact. In other words, a keyword tool may not accurately reflect exactly how many local consumers in Dallas are asking “Where do I get a flu shot?”, but keyword and real-world research signals that this type of question is definitely being asked. The local business website can reflect this question while also adding in the necessary geographic terms.
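That localization step can be sketched in code. The helper below is hypothetical (not any particular tool’s API); it simply illustrates layering geographic terms onto nationally researched questions:

```python
def localize_queries(national_queries, geo_terms):
    """Append geographic modifiers to nationally researched questions,
    producing the locally phrased variants a business page should reflect."""
    return [f"{query} {geo}" for query in national_queries for geo in geo_terms]

# Hypothetical example for the Dallas clinic discussed above.
questions = ["where do I get a flu shot", "flu shot cost"]
localized = localize_queries(questions, ["Dallas", "Dallas TX"])
print(localized[0])  # prints: where do I get a flu shot Dallas
```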
Local link building must be brought to the fore of publicity efforts
Moz’s industry survey found that more than one-third of respondents had no local link building strategy in place. Meanwhile, link building was listed as one of the top three tasks to which marketers want their clients to devote more resources. There’s clearly a disconnect going on here. Given the fundamental role links play in building Domain Authority, organic rankings, and subsequent local rankings, building strong websites means bridging this gap.
First, it might help to examine old prejudices that could cause local business marketers and their clients to feel dubious about link building. These most likely stem from link spam which has gotten so out of hand in the general world of SEO that Google has had to penalize it and filter it to the best of their ability.
Not long ago, many digital-only businesses were having a heyday with paid links, link farms, reciprocal links, abusive link anchor text, and the like. An online company might accrue thousands of links from completely irrelevant sources, all in hopes of escalating rank. Clearly, these practices aren’t ones an ethical business can feel good about investing in, but they do serve as an interesting object lesson, especially when a local marketer can point out to a client that the best local links typically result from real-world relationship-building.
Local businesses are truly special because they serve a distinct, physical community made up of their own neighbors. The more involved a local business is in its own community, the more naturally link opportunities arise from things like local:
- Event participation and hosting
- Online news
- Business associations
- B2B cross-promotions
There are so many ways a local business can build genuine topical and domain authority in a given community by dint of the relationships it develops with neighbors.
An excellent way to get started on this effort is to look at high-ranking local businesses in the same or similar business categories to discover what work they’ve put in to achieve a supportive backlink profile. Moz Link Intersect is an extremely actionable resource for this, enabling a business to input its top competitors to find who is linking to them.
In the following example, a small B&B in Albuquerque looks up two luxurious Tribal resorts in its city:
Link Intersect then lists out a blueprint of opportunities, showing which links one or both competitors have earned. Drilling down, the B&B finds that Marriott.com is linking to both Tribal resorts on an Albuquerque things-to-do page:
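Conceptually, a link intersect is just set arithmetic over backlink sources. A toy sketch with hypothetical domains (not real Link Intersect output):

```python
# Domains linking to each competitor (hypothetical backlink data).
competitor_a = {"marriott.com", "visitalbuquerque.org", "yelp.com"}
competitor_b = {"marriott.com", "abqjournal.com", "yelp.com"}
mine = {"yelp.com"}  # domains already linking to the B&B

# Sites linking to BOTH competitors but not yet to you: strongest prospects.
both = (competitor_a & competitor_b) - mine
# Sites linking to EITHER competitor but not yet to you: the wider blueprint.
either = (competitor_a | competitor_b) - mine

print(sorted(both))  # prints: ['marriott.com']
```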
The small B&B can then try to earn a spot on that same page, because it hosts lavish tea parties as a thing-to-do. Outreach could depend on the B&B owner knowing someone who works at the local Marriott personally. It could include meeting with them in person, or on the phone, or even via email. If this outreach succeeds, an excellent, relevant link will have been earned to boost organic rank, underpinning local rank.
Then, repeat the process. Aristotle might well have been speaking of link building when he said we are what we repeatedly do and that excellence is a habit. Good marketers can teach customers to have excellent habits in recognizing a good link opportunity when they see it.
Without a website, a local business lacks the brand-controlled publishing and link-earning platform that so strongly influences organic rankings. In the absence of this, the chances of ranking well in competitive local packs will be significantly less. Taken altogether, the case is clear for local businesses investing substantially in their websites.
Acting now is actually a strategy for the future
“There is nothing permanent except change.”
You’ve now determined that strong websites are fundamental to local rankings in competitive markets. You’ve absorbed numerous reasons to encourage local businesses you market to prioritize care of their domains. But there’s one more thing you’ll need to be able to convey, and that’s a sense of urgency.
Right now, every single customer you can still earn from a free local pack listing is immensely valuable for the future.
This isn’t a customer you’ve had to pay Google for, as you very well might six months, a year, or five years from now. Yes, you’ve had to invest plenty in developing the strong website that contributed to the high local ranking, but you haven’t paid a penny directly to Google for this particular lead. Soon, you may be having to fork over commissions to Google for a large portion of your new customers, so acting now is like insurance against future spend.
For this to work out properly, local businesses must take the leads Google is sending them right now for free and convert them into long-term, loyal customers with an ultimate value of multiple future transactions, without Google as the middleman. And if these freely won customers can be inspired to act as word-of-mouth advocates for your brand, you will have done something substantial to develop a stream of non-Google-dependent revenue.
This offer may well expire as time goes by. When it comes to the capricious local SERPs, marketers resemble the Greek philosophers who knew that change is the only constant. The Trojan horse has rolled into every US city, and it’s a gift with a questionable shelf life. We can’t predict if or when free packs might become obsolete, but we share your concerns about the way the wind is blowing.
What we can see clearly right now is that websites will be anything but obsolete in 2019. Rather, they are the building blocks of local rankings, precious free leads, and loyal revenue, regardless of how SERPs may alter in future.
For more insights into where local businesses should focus in 2019, be sure to explore the Moz State of Local SEO industry report:
Keyword research has been around as long as the SEO industry has. Search engines built a system that revolves around users entering a term or query into a text entry field, hitting return, and receiving a list of relevant results. As the online search market expanded, one clear leader emerged — Google — and with it they brought AdWords (now Google Ads), an advertising platform that allowed organizations to appear on search results pages for keywords that organically they might not.
Within Google Ads came a tool that enabled businesses to look at how many searches there were per month for almost any query. Google Keyword Planner became the de facto tool for keyword research in the industry, and with good reason: it was Google’s data. Not only that, Google gave us the ability to gather further insights via the other metrics Keyword Planner provided: competition and suggested bid. Whilst these were Google Ads-oriented metrics, they gave the SEO industry an indication of how competitive a keyword was.
The reason is obvious. If a keyword or phrase has higher competition (i.e. more advertisers bidding to appear for that term), it’s likely to be more competitive from an organic perspective. Similarly, a term with a higher suggested bid is more likely to be competitive. SEOs dined on this data for years, but when the industry started digging a bit deeper, we soon realized that while useful, it was not always wholly accurate. Moz, SEMrush, and other tools all started to develop alternative volume and competitive metrics using clickstream data to give marketers more insights.
Now industry professionals have several software tools and data outlets to conduct their keyword research. These software companies will only improve in the accuracy of their data outputs. Google’s data is unlikely to significantly change; their goal is to sell ad space, not make life easy for SEOs. In fact, they’ve made life harder by using volume ranges for Google Ads accounts with low activity. SEO tools have investors and customers to appease and must continually improve their products to reduce churn and grow their customer base. This makes things rosy for content-led SEO, right?
Well, not really.
The problem with historical keyword research is twofold:
1. SEOs spend too much time thinking about the decision stage of the buyer’s journey (more on that later).
2. SEOs spend too much time thinking about keywords, rather than categories or topics.
The industry, to its credit, is doing a lot to tackle issue number two. “Topics over keywords” is something that is not new as I’ll briefly come to later. Frameworks for topic-based SEO have started to appear over the last few years. This is a step in the right direction. Organizing site content into categories, adding appropriate internal linking, and understanding that one piece of content can rank for several variations of a phrase is becoming far more commonplace.
What is less well known (but starting to gain traction) is point one. But in order to understand this further, we should dive into what the buyer’s journey actually is.
What is the buyer’s journey?
The buyer’s or customer’s journey is not new. Open marketing textbooks from years gone by, take a college degree in marketing, or even just browse general marketing blogs and you’ll see it crop up. There are lots of variations of this journey, but they all say a similar thing: no matter what product or service is bought, every buyer goes through it, online or offline. The main difference is that depending on the product, person, or situation, the amount of time the journey takes will vary. But what is it, exactly? For the purpose of this article, we’ll focus on three stages: awareness, consideration, and decision.
The awareness stage of the buyer’s journey is similar to problem discovery, where a potential customer realizes that they have a problem (or an opportunity) but they may not have figured out exactly what that is yet.
Search terms at this stage are often question-based — users are researching around a particular area.
The consideration stage is where a potential consumer has defined what their problem or opportunity is and has begun to look for potential solutions to help solve the issue they face.
The decision stage is where most organizations focus their attention. Normally consumers are ready to buy at this stage and are often doing product or vendor comparisons, looking at reviews, and searching for pricing information.
To illustrate this process, let’s take two examples: buying an ice cream and buying a holiday.
Being low-value, the former is not a particularly considered purchase, but this journey still takes place. The latter is more considered. It can often take several weeks or months for a consumer to decide on what destination they want to visit, let alone a hotel or excursions. But how does this affect keyword research, and the content which we as marketers should provide?
At each stage, a buyer will have a different thought process. It’s key to note that not every buyer of the same product will have the same thought process, but you can see how we can start to formulate a process.
The Buyer’s Journey – Holiday Purchase
The above table illustrates the sort of queries or terms that consumers might use at different stages of their journey. The problem is that most organizations focus all of their efforts on the decision end of the spectrum. This is entirely the right approach to take at the start because you’re targeting consumers who are interested in your product or service then and there. However, in an increasingly competitive online space you should try and find ways to diversify and bring people into your marketing funnel (which in most cases is your website) at different stages.
I agree with the argument that creating content for people earlier in the journey will likely mean lower conversion rates from visitor to customer, but my counter to this would be that you’re also potentially missing out on people who will become customers. Further possibilities to at least get these people into your funnel include offering content downloads (gated content) to capture users’ information, or remarketing activity via Facebook, Google Ads, or other retargeting platforms.
Moving from keywords to topics
I’m not going to bang this drum too loudly. I think many in the SEO community have signed up to the approach that topics are more important than keywords. There are quite a few resources on this online, but what drove it home for me was Cyrus Shepard’s Moz article in 2014. Much, if not all, of that post still holds true today.
What I will cover is an adoption of HubSpot’s Topic Cluster model. For those unaccustomed to their model, HubSpot’s approach formalizes and labels what many search marketers have been doing for a while now. The basic premise is instead of having your site fragmented with lots of content across multiple sections, all hyperlinking to each other, you create one really in-depth content piece that covers a topic area broadly (and covers shorter-tail keywords with high search volume), and then supplement this page with content targeting the long-tail, such as blog posts, FAQs, or opinion pieces. HubSpot calls this “pillar” and “cluster” content respectively.
The process then involves taking these cluster pages and linking back to the pillar page using keyword-rich anchor text. There’s nothing particularly new about this approach aside from formalizing it a bit more. Instead of having your site’s content structured in such a way that it’s fragmented and interlinking between lots of different pages and topics, you keep the internal linking within its topic, or content cluster. This video explains this methodology further. While we accept this model may not fit every situation, nor is it completely perfect, it’s a great way of understanding how search engines are now interpreting content.
At Aira, we’ve taken this approach and tried to evolve it a bit further, tying these topics into the stages of the buyer’s journey while utilizing several data points to make sure our outputs are based off as much data as we can get our hands on. Furthermore, because pillar pages tend to target shorter-tail keywords with high search volume, they’re often either awareness- or consideration-stage content, and thus not applicable for decision stage. We term our key decision pages “target pages,” as this should be a primary focus of any activity we conduct.
We’ll also look at the semantic relativity of the keywords reviewed, so that we have a “parent” keyword that we’re targeting a page to rank for, and then children of that keyword or phrase that the page may also rank for, due to its similarity to the parent. Every keyword is categorized according to its stage in the buyer’s journey and whether it’s appropriate for a pillar, target, or cluster page. We also add two further classifications to our keywords: track & monitor and ignore. Definitions for these five keyword types are listed below:
Pillar page
A pillar page covers all aspects of a topic on a single page, with room for more in-depth reporting in more detailed cluster blog posts that hyperlink back to the pillar page. A keyword tagged with pillar page will be the primary topic and the focus of a page on the website. Pillar pages should be awareness- or consideration-stage content.
A great pillar page example I often refer to is HubSpot’s Facebook marketing guide or Mosi-guard’s insect bites guide (disclaimer: probably don’t click through if you don’t like close-up shots of insects!).
Cluster page
A cluster topic page for the pillar focuses on providing more detail for a specific long-tail keyword related to the main topic. This type of page is normally associated with a blog article but could be another type of content, like an FAQ page.
Good examples within the Facebook marketing topic listed above are HubSpot’s posts:
For Mosi-guard, they’re not utilizing internal links within the copy of their other blog posts, but the “older posts” section at the bottom of the blog references this guide:
Target page
Normally a keyword or phrase linked to a product or service page, e.g. nike trainers or seo services. Target pages are decision-stage content pieces.
Track & monitor
A keyword or phrase that is not the main focus of a page, but could still rank due to its similarity to the target page keyword. A good example of this might be seo services as the target page keyword, but this page could also rank for seo agency, seo company, etc.
Ignore
A keyword or phrase that has been reviewed but is not recommended for optimization, possibly due to a lack of search volume, high competition, or low profitability.
Once the keyword research is complete, we then map our keywords to existing website pages. This gives us a list of mapped keywords and a list of unmapped keywords, which in turn creates a content gap analysis that often leads to a content plan that could last for three, six, or twelve-plus months.
Putting it into practice
I’m a firm believer in giving an example of how this would work in practice, so I’m going to walk through one with screenshots. I’ll also provide a template of our keyword research document for you to take away.
1. Harvesting keywords
The first step in the process is similar, if not identical, to every other keyword research project. You start off with a batch of keywords from the client or other stakeholders that the site wants to rank for. Most of the industry calls this a seed keyword list. That keyword list is normally a minimum of 15–20 keywords, but can often be more if you’re dealing with an e-commerce website with multiple product lines.
This list is often based off nothing more than opinion: “What do we think our potential customers will search for?” It’s a good starting point, but you need the rest of the process to follow on to make sure you’re optimizing based off data, not opinion.
2. Expanding the list
Once you’ve got that keyword list, it’s time to start utilizing some of the tools you have at your disposal. There are lots, of course! We tend to use a combination of Moz Keyword Explorer, Answer the Public, Keywords Everywhere, Google Search Console, Google Analytics, Google Ads, ranking tools, and SEMrush.
The idea of this list is to start thinking about keywords that the organization may not have considered before. Your expanded list will include obvious synonyms from your list. Take the example below:
ski chalet rental
ski chalet hire
ski chalet [location name]
There are other examples that should be considered. A client I worked with once gave a seed keyword of “biomass boilers,” but keyword research revealed that a more colloquial term for “biomass boilers” in the UK is “wood burners.” This is an important distinction and should be picked up as early in the process as possible. Keyword research tools are not infallible, so if budget and resources allow, you may wish to consult current and potential customers about which terms they might use to find the products or services being offered.
3. Filtering out irrelevant keywords
Once you’ve expanded the seed keyword list, it’s time to start filtering out irrelevant keywords. This is pretty labor-intensive and involves sorting through rows of data. We tend to use Moz’s Keyword Explorer, filter by relevancy, and work our way down. As we go, we’ll add keywords to lists within the platform and start to try and sort things by topic. Topics are fairly subjective, and often you’ll get overlap between them. We’ll group similar keywords and phrases together in a topic based off the semantic relativity of those phrases. For example:
ski chalet rental
ski chalet hire
ski chalet [location name]
luxury catered chalet
catered chalet rental
catered chalet hire
catered chalet [location name]
cheap ski accommodation
budget ski accommodation
ski accommodation [location name]
Many of the above keywords are decision-based keywords — particularly those with rental or hire in them. They’re showing buying intent. We’ll then try to put ourselves in the mind of the buyer and come up with keywords towards the start of the buyer’s journey.
best ski resorts
ski resorts europe
ski resorts usa
ski resorts canada
top ski resorts
cheap ski resorts
luxury ski resorts
skiing beginner’s guide
family winter holidays
This helps us cater to customers that might not be in the frame of mind to purchase just yet — they’re just doing research. It means we cast the net wider. Conversion rates for these keywords are unlikely to be high (at least, for purchases or enquiries) but if utilized as part of a wider marketing strategy, we should look to capture some form of information, primarily an email address, so we can send people relevant information via email or remarketing ads later down the line.
4. Pulling in data
Once you’ve expanded the seed keywords out, Keyword Explorer’s handy list function enables you to break things down into separate topics. You can then export that data into a CSV and start combining it with other data sources. If you have SEMrush API access, Dave Sottimano’s API Library is a great time saver; otherwise, you may want to consider uploading the keywords into the Keywords Everywhere Chrome extension and manually exporting the data and combining everything together. You should then have a spreadsheet that looks something like this:
You could then add in additional data sources. There’s no reason you couldn’t combine the above with volumes and competition metrics from other SEO tools. Consider including existing keyword ranking information or Google Ads data in this process. Keywords that convert well on PPC are likely to convert well organically too, and should therefore be considered. Wil Reynolds talks about this particular tactic a lot.
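As an illustration of this combining step, here’s a minimal Python sketch that merges per-keyword metrics from two hypothetical tool exports into one dataset. The tool names, metric fields, and numbers are made up for the example; in practice each dict would be loaded from a CSV export.

```python
# Hypothetical per-keyword metrics exported from two different tools.
moz = {
    "ski chalet rental": {"moz_volume": 880, "difficulty": 38},
    "ski chalet hire": {"moz_volume": 590, "difficulty": 35},
}
keywords_everywhere = {
    "ski chalet rental": {"kwe_volume": 720},
    "cheap ski accommodation": {"kwe_volume": 480},
}

# Merge on the keyword so nothing from either export is dropped;
# metrics from later sources are added alongside earlier ones.
combined = {}
for source in (moz, keywords_everywhere):
    for kw, metrics in source.items():
        combined.setdefault(kw, {}).update(metrics)

for kw, metrics in sorted(combined.items()):
    print(kw, metrics)
```

The same pattern scales to any number of sources; keywords that only appear in one tool simply carry fewer columns, which itself can be a useful signal.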
5. Aligning phrases to the buyer’s journey
The next stage of the process is to start categorizing the keywords into the stage of the buyer’s journey. Something we’ve found at Aira is that keywords don’t always fit into a predefined stage. Someone looking for “marketing services” could be doing research about what marketing services are, but they could also be looking for a provider. You may get keywords that could be either awareness/consideration or consideration/decision. Use your judgement, and remember this is subjective. Once complete, you should end up with some data that looks similar to this:
This categorization is important, as it starts to frame what type of content is most appropriate for that keyword or phrase.
The next stage of this process is to start noticing patterns in keyphrases and where they get mapped to in the buyer’s journey. Often you’ll see keywords like “price” or ”cost” at the decision stage and phrases like “how to” at the awareness stage. Once you start identifying these patterns, possibly using a variation of Tom Casano’s keyword clustering approach, you can then try to find a way to automate so that when these terms appear in your keyword column, the intent automatically gets updated.
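As a minimal sketch of that kind of automation, here’s a rule-based tagger in Python. The patterns and their stage assignments below are illustrative starting points only, not a definitive mapping; you’d refine them against your own keyword set.

```python
import re

# Illustrative pattern rules mapping phrase fragments to a journey stage.
# First match wins, so more specific (decision) rules come first.
STAGE_PATTERNS = [
    (re.compile(r"\b(price|cost|cheap|buy|hire|rental)\b"), "decision"),
    (re.compile(r"\b(best|top|vs|review|compare)\b"), "consideration"),
    (re.compile(r"\b(how to|what is|guide|tips)\b"), "awareness"),
]

def tag_stage(keyword: str) -> str:
    """Return the first matching stage, or 'unclassified' for manual review."""
    for pattern, stage in STAGE_PATTERNS:
        if pattern.search(keyword.lower()):
            return stage
    return "unclassified"

for kw in ["catered chalet hire", "best ski resorts", "skiing beginner's guide"]:
    print(kw, "->", tag_stage(kw))
```

Anything that falls through to “unclassified” goes back to a human — which matches the point above that stage assignment is ultimately a judgement call.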
Once completed, we can then start to define each of our keywords and give them a type:
- Pillar page
- Cluster page
- Target page
- Track & monitor
We use this document to start thinking about what type of content is most effective for that piece given the search volume available, how competitive that term is, how profitable the keyword could be, and what stage the buyer might be at. We’re trying to find that sweet spot between having enough search volume, ensuring we can actually rank for that keyphrase (there’s no point in a small e-commerce startup trying to rank for “buy nike trainers”), and how important/profitable that phrase could be for the business. The below Venn diagram illustrates this nicely:
We also reorder the keywords so keywords that are semantically similar are bucketed together into parent and child keywords. This helps to inform our on-page recommendations:
From the example above, you can see “digital marketing agency” as the main keyword, but “digital marketing services” & “digital marketing agency uk” sit underneath.
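One simple way to prototype that parent/child bucketing is a greedy grouping pass. The sketch below uses Python’s difflib string similarity as a crude stand-in for true semantic relativity (a real implementation would want search data or embeddings); the keywords and threshold are from the example above and are illustrative.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    # Character-level similarity in [0, 1]; a rough proxy only.
    return SequenceMatcher(None, a, b).ratio()

def bucket(keywords, threshold=0.6):
    """Greedy grouping: each keyword joins the first parent it resembles,
    otherwise it becomes a new parent. Feed keywords in descending volume
    order so the highest-volume phrase in each bucket becomes the parent."""
    parents = {}  # parent keyword -> list of child keywords
    for kw in keywords:
        for parent in parents:
            if similarity(kw, parent) >= threshold:
                parents[parent].append(kw)
                break
        else:
            parents[kw] = []
    return parents

kws = ["digital marketing agency", "digital marketing services",
       "digital marketing agency uk", "seo services"]
print(bucket(kws))
```

Here “digital marketing services” and “digital marketing agency uk” fall under the “digital marketing agency” parent, while “seo services” starts its own bucket.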
We also use conditional formatting to help identify keyword page types:
And then sheets to separate topics out:
Once this is complete, we have a data-rich spreadsheet of keywords that we then work with clients on to make sure we’ve not missed anything. The document can get pretty big, particularly when you’re dealing with e-commerce websites that have thousands of products.
6. Keyword mapping and content gap analysis
We then map these keywords to existing content to ensure that the site hasn’t already written about the subject in the past. We often use Google Search Console data to do this so we understand how any existing content is being interpreted by the search engines. By doing this we’re creating our own content gap analysis. An example output can be seen below:
The above process takes our keyword research and then applies the usual on-page concepts (such as optimizing meta titles, URLs, descriptions, headings, etc) to existing pages. We’re also ensuring that we’re mapping our user intent and type of page (pillar, cluster, target, etc), which helps us decide what sort of content the piece should be (such as a blog post, webinar, e-book, etc). This process helps us understand what keywords and phrases the site is not already being found for, or is not targeted to.
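The mapped/unmapped split itself is just a lookup of researched keywords against a keyword-to-URL map. A minimal Python sketch (the keywords and URLs are made up for illustration; in practice the map would come from Search Console or crawl data):

```python
# Hypothetical keyword-to-URL mapping built from existing site content.
keyword_map = {
    "ski chalet rental": "/chalets/",
    "catered chalet hire": "/chalets/catered/",
}
researched = ["ski chalet rental", "catered chalet hire",
              "best ski resorts", "skiing beginner's guide"]

# Keywords with a page are "mapped"; the rest form the content gap.
mapped = [kw for kw in researched if kw in keyword_map]
unmapped = [kw for kw in researched if kw not in keyword_map]

print("Mapped:", mapped)
print("Content gap:", unmapped)
```

The unmapped list, joined back to stage and page-type labels, becomes the backbone of the three-, six-, or twelve-month content plan mentioned earlier.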
I promised a template Google Sheet earlier in this blog post and you can find that here.
Do you have any questions on this process? Ways to improve it? Feel free to post in the comments below or ping me over on Twitter!
When Google says they prefer comprehensive, complete content, what does that really mean? In this week’s episode of Whiteboard Friday, Kameron Jenkins explores actionable ways to translate the demands of the search engines into valuable, quality content that should help you rank.
Hey, guys. Welcome to this week’s edition of Whiteboard Friday. My name is Kameron Jenkins, and I work here at Moz.
Today we’re going to be talking about the quality of content comprehensiveness and what that means and why sometimes it can be confusing. I want to use an example scenario of a conversation that tends to go on between SEOs and Google. So here we go.
An SEO usually says something like, “Okay, Google, you say you want to rank high-quality content. But what does that really mean? What is high quality, because we need more specifics than that.”
Then Google goes, “Okay, high quality is something that’s comprehensive and complete. Yeah, it’s really comprehensive.” SEOs go, “Well, wait. What does that even mean?”
That’s kind of what this was born out of. Just kind of an explanation of what is comprehensive, what does Google mean when they say that, and how that differs depending on the query.
Here we have an example page, and I’ll kind of walk you through it. It’s just going to serve to demonstrate why when Google says “comprehensive,” that can mean something different for an e-commerce page than it would for a history of soccer page. It’s really going to differ depending on the query, because people want all sorts of different kinds of things. Their intent is going to be different depending on what they’re searching in Google. So the criteria is going to be different for comprehensiveness. So hopefully, by way of example, we’ll be able to kind of walk you through what comprehensiveness looks like for this one particular query. So let’s just dive in.
All right. So first I’m going to talk about intent. I have here a Complete Guide to Buying a House. This is the query I used as an example. Before we dive in, even before we look into keyword research tools or anything like that, I think it’s really important to just like let the query sit with you for a little bit. So “guide to buying a house,” okay, I’m going to think about that and think about what the searcher probably wanted based on the query.
So first of all, I noticed “guide.” The word “guide” to me makes it sound like someone wants something very complete, very thorough. They don’t just want quick tips. They don’t want a quick bullet list. This can be longer, because someone is searching for a comprehensive guide.
“To buying a house,” that’s a process. That’s not like an add-to-cart like Amazon. It’s a step-by-step. There are multiple phases to that type of process. It’s really important to realize here that they’re probably looking for something a little lengthier and something that is maybe a step-by-step process.
And too, you just look at the query, “guide to buying a house,” people are probably searching that if they’ve never bought a house before. So if they’ve never bought a house before, it’s just good to remember that your audience is in a phase where they have no idea what they’re doing. It’s important to understand your audience and understand that this is something that they’re going to need very, very comprehensive, start-to-finish information on it.
Two, implications. This is again also before we get into any keyword research tools. By implications, I mean what is going to be the effect on someone after reading this? So the implications here, a guide to buying a house, that is a big financial decision. That’s a big financial purchase. It’s going to affect people’s finances and happiness and well-being, and Google actually has a name for that. In their Quality Rater Guidelines, they call that YMYL. So that stands for “your money, your life.”
Those types of pages are held to a really high standard, and rightfully so. If someone reads this, they’re going to get advice about how to spend their money. It’s important for us, as SEOs and writers crafting these types of pages, to understand that these are going to be held to a really high standard. I think what that could look like on the page is, because they’re making a big purchase like this, it might be a good sign of trustworthiness to maybe have some expert quotes in here. Maybe you kind of sprinkle those throughout your page. Maybe you actually have it written by an expert author instead of just Joe Schmoe blogger. Those are just some ideas for making a page really trustworthy, and I think that’s a key to comprehensiveness.
Number three here we have subtopics. There are two ways that I’ll walk you through finding subtopics to fit within your umbrella topic. I’m going to use Moz Keyword Explorer as an example of this.
Use Keyword Explorer to reveal subtopics
In Moz Keyword Explorer, you can search for different keywords and related keywords two different ways. You can type in a query. So you can type in something like “buy a house” or “home buying” or something like that. You start with your main topic, and what you’ll get as a result is a bunch of suggested keywords that you can also incorporate on your page, terms that are related to the term that you searched. This is going to be really great, because you’re going to start to notice themes emerge. Some of the themes I noticed were that people tend to search for “home buying calculator,” like a can-I-afford-it type of calculator. A lot of people obviously search financial-related things: bad credit, “I filed for bankruptcy, can I still buy a house?” You’ll start to see subthemes emerge.
Then I also wanted to mention that, in Moz Keyword Explorer, you can also search by URL. What I might do is query my term that I’m trying to target on my page. I’m going to pick the top three URLs that are ranking. You pop them into Keyword Explorer, and you can compare them and you can see the areas of most overlap. So what you’ll get essentially is a list of keywords that the top ranking pages for that term also rank for. That’s going to be a really good way to mine some extra keyword ideas for your page to make it more comprehensive.
Then here we go. We have step four. After we’ve come up with some subtopics, I think it’s also a really good idea to mine questions and try to find what questions our audience is actually asking. So, for these, I like to use Answer the Public and Keywords Everywhere. Those are two really great tools that I kind of like to use in tandem.
Use Answer the Public to mine questions
Answer the Public, if you’ve never used it, is a really fun tool. You can put in a keyword, and you get a huge list. Depending on how vague your query is, you might get a ton of ideas. If your query is really specific, you might not get as many keyword ideas back. But it’s a really great way to type in a keyword, like “buying a house” or “buy a house” or “home buying” or something like that, and get a whole, big, long list of questions that your audience is asking. People that want to know how to buy a house, they’re also asking these questions.
I think a comprehensive page will answer those questions. But it can be a little bit overwhelming. There’s going to be probably a lot of questions potentially to answer. So how do you prioritize and choose which questions are the best to address on your page?
Use Keywords Everywhere to highlight keywords on a page
That’s where the Keywords Everywhere plug-in comes in handy. I use it in Chrome. You can have it highlight the keywords on the page. I think I have mine set to highlight anything that’s searched 50 or more times a month. That’s a really good way to gauge, just right off the bat you can see, okay, now there are these 10 instead of these 100 questions to potentially answer on my page.
So examples of questions here, I have questions like: Can I afford this? Is now the right time to buy? So you can kind of fit those into your page and answer those questions.
Then finally here I have trends. I think this is a really commonly missed step. It’s important to remember that a lot of terms have seasonality attached to them. So what I did with this query, I queried “buy a house,” and I wanted to see if there were any trends for home buying-type of research queries in Google Trends. I zoomed out to five years to see if I could see year-over-year if there were any trends that emerged.
That was totally the case. When people are searching “buy a house,” it’s at its peak kind of around January into spring, and then in the summer it starts to dive, and then it’s at its lowest during the holidays. That kind of shows you that people are researching at the beginning of the year. They’re kind of probably moving into their house during the summertime, and then during the holidays they’ve had all the time to move in and now they’re just enjoying the holidays. That’s kind of the trend flow that it follows. That’s really key information, if you’re going to build a comprehensive page, to kind of understand that there’s seasonality attached with your term.
Because I know now that there’s seasonality with my term, I can incorporate information like what are the pros and cons of buying in peak season versus off-season for buying a house. Maybe what’s the best time of year to buy. Those are, again, other ideas for things that you can incorporate on your page to make it more comprehensive.
This page is not comprehensive. I didn’t have enough room to fit some things. So you don’t just stop at this phase. If you’re really building a comprehensive page on this topic, don’t stop where I stopped. But this is kind of just an example of how to go about thinking through what Google means when they say make a page comprehensive. It’s going to mean something different depending on your query and just keep that in mind. Just think about the query, think about what your audience wanted based on what they searched, and you’ll be off to a great start building a comprehensive page.
I hope that was helpful. If you have any ideas for building your own comprehensive page, how you do that, maybe how it differs in different industries that you’ve worked in, pop it in the comments. That would be really good for us to share that information. Come back again next week for another edition of Whiteboard Friday.
When you’ve accomplished step one in your local search marketing, how do you take step two?
You already know that any local business you market has to have the table stakes of accurate structured citations on major platforms like Facebook, Yelp, Infogroup, Acxiom, and YP.
But what can local SEO practitioners do once they’ve got these formal listings created and a system in place for managing them? Our customers often come to us once they’ve gotten well underway with Moz Local and ask, “What’s next? What can I do to move the needle?” This blog post will give you the actionable strategy and a complete step-by-step tutorial to answer this important question.
A quick refresher on citations
Listings on formal directories are called “structured citations.” When other types of platforms (like online news, blogs, best-of lists, etc.) reference a local business’ complete or partial contact information, that’s called an “unstructured citation.” And the best unstructured citations of all include links, of course!
For example, the San Francisco branch of a natural foods grocery store gets a linked unstructured citation from a major medical center in their city via a blog post about stocking a pantry with the right ingredients for healthier eating. Google and consumers encounter this reference and understand that trust and authority are being conveyed and earned.
The more often websites that are relevant to your location or industry link to you within their own content, the better your chances of ranking well in Google’s organic and local search engine results.
Why linked unstructured citations are growing in importance right now
Link building is as old as organic SEO. Structured citation building is as old as local SEO. Both practices have long sought to influence Google rankings. But a close read of the local search marketing community these days points up an increasing emphasis on the value of unstructured citations. In fact, local links were one of the top three takeaways from the 2018 Local Search Ranking Factors survey. Why is this?
- Google has become the dominant force in local consumer experiences, keeping as many actions as possible within their own interface instead of sending searchers to company websites. Because links influence rank within that interface, most local businesses will need to move beyond traditional structured citations to impress Google with mentions on a diverse variety of relevant websites. While structured citations are rightly referred to as “table stakes” for all local businesses, it’s the unstructured ones that can be competitive difference-makers in tough markets.
- Meanwhile, Google is increasingly monetizing local search results. A prime example of this is their Local Service Ads (LSA) program which acts as lead gen between Google and service area businesses like plumbing and housekeeping companies. Savvy local brands (including brick-and-mortar models) will see the way the wind is blowing with this and work to form non-Google-dependent sources of traffic and lead generation. A good linked unstructured citation on a highly relevant publication can drive business without having to pay Google a dime.
Your goal with linked unstructured citations is to build your community footprint and your authority simultaneously. All you need is the right tools for the research phase!
Fishing for opportunities with Link Intersect
For the sake of this tutorial, let’s choose at random a small B&B in Albuquerque — Bottger.com — as our hypothetical client. Let’s say that the innkeeper wants to know how the big Tribal resort casinos are earning publicity and links, in the hopes of finding opportunities for a smaller hospitality business, too. *Note that these aren’t absolutely direct competitors, but they share a city and an overall industry.
We’re going to use Moz’s Link Intersect tool to do this research for Bottger Mansion. This tool could help Bottger uncover all kinds of links and unstructured linked citation opportunities, depending on how it’s used. For example, the tool could surface:
- Links that direct or near-direct competitors have, but that Bottger doesn’t
- Locally relevant links from domains/pages about Bottger’s locale
- Industry-relevant links from domains/pages about the hospitality industry
Step 1: Find the “big fish”
A client may already know who the “big fish” in their community are, or you can cast a net by identifying popular local events and seeing which businesses sponsor them. Sponsorships can be pricey, depending on the event, so if a local company sponsors a big event, it’s an indication that they’re a larger enterprise with the budget to pursue a wide array of creative PR ideas. Larger enterprises can serve as models for small business emulation, at scale.
In our case study, we know that Bottger is located in Albuquerque, so we decided to locate sponsors of the famous Albuquerque International Balloon Fiesta. Right away, we spotted two lavish Albuquerque resort-casinos — Isleta and Sandia. These are the “big fish” we want our smaller client to look to for inspiration.
Step 2: Input domains in Link Intersect
We’re going to compare Bottger’s domain to Isleta and Sandia’s domains. In Moz Pro, navigate to “Link Explorer” and then select “Link Intersect” from the left navigation. Input your domain in the top and the domains you want to mine link ideas from in the fields beneath, as depicted below.
Next to Bottger’s domain, we’ve selected “root domain,” as that will surface every competitor link whose source hasn’t linked to any page on our site at all. We’re also going to select “root domain” for the resort domains, so we can see all of their backlinks, rather than just links to particular pages on their sites.
Moz’s Link Intersect tool will let you compare your site with up to 5 competitors. It’s totally up to you how many sites you want to evaluate at once. If you’re just getting started with link building, you may want to start with just one domain, as this should yield plenty of link opportunities to start with. If you’ve already been doing some link building, have more time to dedicate to it, or would just generally rather have more options to work with, go ahead and put in multiple domains to compare.
Step 3: Find link opportunities
Once you’ve input your domain and your competitor(s) domains, click “Find Opportunities.” That will yield a list of sites that link to your competitors, but do not link to you.
In this example, we’re comparing our client’s domain against two other domains: A (Isleta) and B (Sandia). In the “Sites that intersect” column, you will see whether Site A has the link, Site B has it, or if they both have it.
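Under the hood, an “intersect” report like this is just set arithmetic: referring domains that link to one or more competitors, minus the domains that already link to you. Here’s a minimal sketch of that logic, using invented domain lists for illustration (Link Intersect’s actual implementation is Moz’s own):

```python
# A minimal sketch of the set logic behind a "link intersect" report:
# domains that link to a competitor but not to you. All domain lists
# below are hypothetical examples, not real backlink data.

def link_intersect(your_links, competitor_links):
    """Return {domain: [labels of competitors that have the link]}."""
    yours = set(your_links)
    opportunities = {}
    for label, links in competitor_links.items():
        for domain in set(links) - yours:
            opportunities.setdefault(domain, []).append(label)
    return opportunities

bottger = ["yelp.com", "tripadvisor.com"]
competitors = {
    "A (Isleta)": ["marriott.com", "abqjournal.com", "yelp.com"],
    "B (Sandia)": ["marriott.com", "balloonfiesta.com"],
}

for domain, sources in sorted(link_intersect(bottger, competitors).items()):
    print(domain, "->", sources)
# marriott.com is flagged for both A and B; yelp.com is excluded
# because Bottger already has that link.
```

The “Sites that intersect” column in the tool corresponds to the list of competitor labels each domain maps to here.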
Step 4: The link selection process
Now that we have a list of link ideas from Isleta and Sandia’s backlink profiles, it’s time to decide which ones might yield good opportunities for our B&B. That’s right — just because something is in a competitor’s link profile doesn’t necessarily mean you want it!
View the referring pages
The first step is to drill down and get more detail about links the big resorts have. Select the arrow to expand this section and view the exact page the link is coming from.
In this example, both Sandia and Isleta have links from the root domain marriott.com. By using the “expand” feature, we can see the exact pages those links are located on.
Identify follow or no-follow
You can use the MozBar Chrome plugin to view whether your competitor’s link is no-followed or followed. Since only followed links pass authority, you may want to prioritize those, but no-followed links can also have value in the form of generating traffic to your site and could get picked up by others who do eventually link to your site with a follow link.
Select the MozBar icon from your browser and click the pencil icon. If you want to see Followed links, select “Followed” and the MozBar will highlight these links on the page in green. To find No-Followed links, click “No-Followed” and MozBar will highlight these links on the page in pink.
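If you’d rather check programmatically than eyeball each page, the same follow/no-follow distinction comes down to whether an anchor tag’s `rel` attribute contains `nofollow`. A rough sketch using only Python’s standard-library HTML parser, fed a hypothetical HTML snippet:

```python
# Bucket each anchor on a page by whether its rel attribute contains
# "nofollow". The HTML snippet here is a made-up example.
from html.parser import HTMLParser

class LinkClassifier(HTMLParser):
    def __init__(self):
        super().__init__()
        self.followed, self.nofollowed = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        rel = (attrs.get("rel") or "").lower().split()
        (self.nofollowed if "nofollow" in rel else self.followed).append(href)

html = """
<a href="https://example-bnb.com">editorial link</a>
<a href="https://example-bnb.com/menu" rel="nofollow">directory link</a>
"""
parser = LinkClassifier()
parser.feed(html)
print("followed:", parser.followed)
print("nofollowed:", parser.nofollowed)
```

In practice you’d fetch the competitor’s referring page first; this sketch only shows the classification step.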
If this is your first foray into link building for local businesses, you may be unfamiliar with the types of sites you’ll see in Link Intersect. While no two link profiles are exactly the same, many local businesses use similar methods for building links, so there are some common categories to be aware of. Knowing these will help you decipher the results Link Intersect will show you.
Types of links and what you can do with them:
Press releases
Press release sites like PRweb.com and PRNewswire.com are fairly common among local businesses that want to spread the word about their initiatives. Whether someone at the business won an award or the business started a new community outreach program, local businesses often pay companies like PRweb.com to distribute this news on their platform and to their partners. These are no-followed links (they don’t pass link authority, aka “SEO value”), but they can offer valuable traffic and could even get picked up by sites that do link with a followed link.
If your competitor is utilizing press releases, you may want to consider distributing your newsworthy information this way!
Structured citations / directories
One of the primary types of domains you’ll see in a local business’ backlink profile is directories — structured citation websites like yellowpages.com that list a business’ name, address, and phone number (NAP) with a link back to the business’ website. Because these listings are self-created rather than editorially given, they are, like press releases, often no-followed. However, having consistent and accurate citations across major directory websites is a key foundational step in local search performance.
If you see these types of sites in Link Intersect, it may indicate your need for a listings management solution like Moz Local that can ensure your NAP is accurate and available across major directories. Typically, you’ll want to have these table stakes before focusing on unstructured linked citations.
Media coverage
Another favorite among local businesses is local media coverage (or just media coverage in general — it’s not always local). HARO (Help a Reporter Out) is a popular service for connecting journalists with subject matter experts who may be valuable sources for their articles. The journalists will typically link your quote back to your website. Aside from services like HARO, local businesses would do well to make media contacts, such as by forming relationships with local news correspondents. As news surfaces, those contacts will start reaching out to you for comment!
If you see news coverage in your competitor’s backlink profile, you can get ideas of what types of publications want content and information that you can provide.
Local / industry coverage
Blogs, hobby sites, DIY sites, and other platforms can feature content that depicts city life or interest in a topic. For example, a chef might author a popular blog covering their dining experiences in San Francisco. For a local restaurant, being cited by this publication could be valuable.
If you see popular local or industry sites in your competitor’s backlink profile, it’s a good signal of opportunity for your business to build a relationship with the authors in hopes of gaining links.
Trade organizations / affiliations
Most local businesses are affiliated with some type of governing/regulating body, trade organization, award organization, etc. Many of these organizations have websites themselves, and they often list the businesses they’re affiliated with.
If your competitor is involved with an organization, that means your business is likely suited to be involved as well! Use these links to get ideas of which organizations to join.
Community involvement
Community organizations are a great local validator for search engines, and many local businesses have taken notice. You’ll likely find these types of organizations’ websites in your competitor’s backlink profile, such as Chamber of Commerce websites or the local YMCA.
As a local business, your competitors are in the same locale as you, so take note of these community organizations and consider joining them. You’ll not only get the benefit of better community involvement, but you can get a link out of it too!
Sponsorships / event participation
Local businesses can sponsor, donate to, host or participate in community events, teams, and other cherished local resources, which can lead to both online and offline publicity.
Local businesses can earn great links from online press surrounding these groups and happenings. If an event or team page highlights you but doesn’t actually link to its benefactors or participants, don’t be shy about politely requesting a link.
Scholarships / .edu sites
A popular strategy used by many local businesses and non-local businesses alike is scholarship link building. Businesses figured out that if they offered a scholarship, they could get a link back to their site on education websites, such as .edu domains. Everyone seemed to catch on — so much so that many schools stopped featuring these scholarships on their site. It’s also important to note that .edu domains don’t inherently have more value than domains on any other TLD.
If your business wants to offer a scholarship, that is a great thing! We encourage you to pursue this for the benefit it could offer students, rather than primarily for the purpose of gaining links. Scholarship link building has become very saturated, and could be a strategy with diminishing returns, so don’t put all your eggs in this basket, and do it first and foremost for students instead of links.
Business partnerships
Businesses may sometimes partner with each other for mutually beneficial link opportunities. Co-marketing opportunities that are a byproduct of genuine relationships can present valuable link opportunities, but link exchanges are against Google’s quality guidelines.
Stay away from “you link to me, I’ll link to you” opportunities as Google can see it as an attempt to manipulate your site’s ranking in search, but don’t be afraid to pursue genuine connections with other businesses that can turn into linking opportunities.
Just because your competitor has that link doesn’t mean you want it too! In Link Intersect, pay attention to the domain’s Spam Score and DA. A high spam score and/or low DA can indicate that the link wouldn’t be valuable for your site, and may even harm it.
Also watch out for links generated from comments. If your competitor has links in their backlink profile coming from comments, you can safely ignore these as they do not present real opportunities for earning links that will move the needle in the right direction.
Now that you’re familiar with popular types of local backlinks and what you can do with them, let’s actually dig into Isleta and Sandia’s backlinks to see which might be good prospects for us.
Step 5: Imitation is the sincerest form of flattery
Both the Albuquerque Marriott and Hilton Garden Inn link to Isleta and Sandia on their “Local Things to Do” pages. This could be a great prospect for Bottger! In many cases, “things to do” pages will include lists of local restaurants, historic sites, attractions, shops, and more. Note how their addresses are included on the following pages, making them powerful unstructured linked citations. Bottger hosts fancy tea parties in a lovely setting, which could be a fun thing for tourists to do.
Isleta and Sandia also have links from a wedding website. If Bottger uses their property as a wedding venue, offers special wedding or engagement packages, or something similar, this could be a great prospect as well.
Link Intersect also yielded links to various travel guide websites. There are plenty of links on websites like these to local attractions. In the following example, you can see an Albuquerque travel guide that’s broken up by category, “hotels” being one of them:
Isleta and Sandia also have been featured in the Albuquerque Journal. In this example, a local reporter covered news that Isleta was opening expanded bingo and poker rooms. This seems to be a journalist who covers local businesses, so she could be a great connection to make!
Many other links in Isleta and Sandia’s backlink profiles came from sources like events websites, since these resorts are large enough to serve as the venue for major events like concerts and MMA matches. Although Bottger isn’t large enough to host an event of that magnitude, it could spark good ideas for link building opportunities in the future. Maybe Bottger could host a small community tea tasting event featuring locally sourced herbal teas and get in touch with a local reporter to promote it. Even competitor links that you can’t directly pursue can spark your creativity for related link building opportunities.
And let’s not forget how we found out about Isleta and Sandia in the first place: the Albuquerque International Balloon Fiesta! Event sponsors are featured on an “official sponsors” page with links to their websites. This is a classic, locally relevant opportunity for any Albuquerque business.
Step 6: Compile your link prospects in Link Tracking Lists
If you’re thinking, “This sounds great, but it also sounds like a lot of work. How am I ever going to keep track of all this?” — we’ve got you covered!
Moz Pro’s “Link Tracking Lists” was built for just this purpose.
In Link Intersect, you’ll see little check boxes next to all your competitors’ links. When you find one you want to target, check the box. When you’re done going through all the links and have checked the boxes next to the domains you want to pursue, click “Add to Link Tracking List” at the top right.
Since we’ve never done link building for Bottger before, we’re going to select “Create New List” from the dropdown, and label it something descriptive.
Make sure to put your client’s domain in the “target URL” field. For Step 3, since we’ve just selected the links we want to track from Link Intersect, those will already be populated in this field, so no further action is needed other than to click “Save.”
We’ll come back to Link Tracking Lists when we talk about outreach, but for now, all you need to know is that you can add the desirable competitor links (in our case, links from Isleta and Sandia) to Link Tracking lists straight from Link Intersect, making it easy to manage your data.
Step 7: Find out how to connect with your link prospects
Now it’s time to connect the dots: how do you go from knowing about your competitor’s links to getting those types of links for yourself?
There are three main ways you can get unstructured linked citations to your local business’ website, and those categories are what’s going to dictate the strategy you need to take to secure that opportunity for yourself.
- Self-created: Self-created links are like voting for yourself, so sites that accept these types of submissions, like Yelp.com, will NoFollow the link to your business’ website. Visitors are still referred to your website through that link, but the link doesn’t pass authority from Yelp.com to your domain. You should only get authority from a website if they link to you on their own (what Google calls “editorially placed” links). Neither NoFollow nor Follow links are inherently good or bad on their own. They are just intended for different purposes, and it’s the misuse of followed links that can get you in trouble with Google. We’ll talk more about that in the later section “Risks of ignoring Google’s ‘link schemes’ guidelines.”
- Prompted by outreach: In many cases, people won’t know about your content until you tell them. These links are editorially placed by the site owner (not self-created), but the site owner was only made aware of your content because you reached out to them.
- Organically earned: Sometimes, you get links even without asking for them. If you have a popular piece of content on your site that receives lots of traffic, for example, people may link to that on their own because they find it valuable.
Since this tutorial is about proactively pursuing link opportunities, we’re going to focus on unstructured linked citations types one and two.
If your competitor has been featured in an article from, say, a local journalist or blogger, then your outreach will be focused on making a connection with that writer or publication for future link opportunities, rather than on getting the exact link your competitor has. That’s because the article has already been written, so it’s unlikely that the writer will go back and edit their story just to add your link.
The one exception to this rule would be if the article links to your competitor, but your competitor’s link is now broken. In this scenario, you could reach out to the writer and say something like, “Hey! I notice in your article [article title] you link to [competitor’s link], but that link doesn’t seem to be working. I have similar content on my website [your URL]. If you find it valuable, please feel free to use it as a replacement for that broken link!”
Sometimes the contact information of the writer will be right next to the article itself. For example:
If there’s no email address or contact form in the writer’s bio, you can usually find a link to one of their social media accounts, like Twitter, and you can connect with them there via a public or direct message. If you live in a small, tight-knit community, you may even be able to meet with the author in person.
If you notice your competitors are issuing a lot of press releases and you want to try that out for yourself, you’ll likely need to sign up for an account, as these are primarily self-serve platforms. Most quality press release sites charge per release, and the price can differ depending on length.
Citations / directories
You’ll either want to sign up for a citation service like Moz Local that distributes your data to these types of listings programmatically, or if you do it manually, you’ll want to find the link to create your listings. Please note that your business may already be on the directory even if you haven’t set up a profile. Before creating a new listing, search for your business name and its variants, your phone number, and current and former addresses to see if there are existing listings you can claim and update.
Most businesses will make it easy to contact them. If you’re trying to contact another business for the purpose of proposing teaming up for a co-marketing opportunity, look in their footer (the very bottom of the website). If there’s no contact information there, search for a “Contact Us” or “About” page. You may not find an email address, but you may be able to find a contact form or phone number. Below is an example from Albuquerque Little Theater, where they have contact information on the right and advertising information in the top navigation for businesses that are interested in taking out ads in their printed show programs. Not an unstructured linked citation, but a great way to get your business known to the community!
Most organizations will make it easy for those who want to join, unless they are more exclusive or invitation-only. In the event that you do wish to get involved in an invitation-only organization that has no public-facing contact information, try viewing a member list and seeing if there’s anyone you know. Or maybe you know someone who can introduce you to one of the members. Genuine connections are key for this type of organization.
Step 8: Writing a good outreach email (for unstructured linked citations requiring outreach)
Outreach emails are necessary when the link opportunity you’re pursuing isn’t a link you could create yourself, or if the link source is one where you can’t make face-to-face contact with decision-makers. One of the most important questions you should be asking yourself for these opportunities is, “Why would this website link to me?”
Here’s how Bottger might go about sending an outreach email:
Greeting that matches the nature of the outreach target
“Hey Jill!” might be fine when outreaching to the author of a blog, while “Hello Ms. Smith” might be better for more professional outreach.
The introduction
Give a brief summary of who you are, what you do, and your interest in contacting them. For example: “I work with Bottger Mansion, a historic Bed & Breakfast in Old Town Albuquerque. I found your page about Albuquerque activities — you’ve really captured a lot of what Albuquerque has to offer!”
The ask, and the value add
This is where you’ll actually ask for the link. It’s a good idea here to add value. Don’t just ask for something; offer to give something back!
To continue the same example: “As long-time residents of Old Town, we’d love to provide you with a comprehensive list of activities in the city’s historic district! We feel an Old Town Activities list would be a great addition to your page. Bottger Mansion regularly hosts high tea, for example, which we’d love to let more people know about with a spot on your list!”
The sign-off
Wish them well, thank them for their time, and sign off. Make sure that it’s easy for them to find information about you by including your full name, title, organization, and website/social links in your email signature.
Don’t be afraid to get on the phone, either! Hearing your voice can add a human element to the outreach attempt and offer a better conversion rate than a more impersonal email (we all get so many of those a day that ones from people we don’t know are easy to ignore).
And remember that local businesses have a particular advantage in accruing unstructured linked citations. Lively participation in the life of your community can continuously introduce you to decision-makers at popular local publications, paving the way towards neighborly outreach on your part. Learn to see the opportunities and think of ways your business can add value to the content that is being written about your town or city.
Step 9: Tracking your wins
Next-to-last, we’re going to jump back to Link Tracking Lists for a second, because that’s going to come in extremely handy here. Remember when we created the list with Sandia and Isleta’s links that we were interested in pursuing? Those will now show up when we go to Moz Pro > Link Explorer > Link Tracking Lists.
Every time Bottger successfully secures a link that they’ve added to their Link Tracking List, the red X in the “Links to target URL?” column will turn blue, indicating that the site links to Bottger’s root domain. If we were pursuing links to individual pages, and a link prospect linked to our target page, the red X would turn green.
Another handy feature is the “Notes” dropdown. This allows you to keep track of your outreach attempts, which can be one of the trickiest parts about link building!
Avoiding the bad fish: Some words of caution before you get started
Before starting this process for yourself, familiarize yourself with these four risks so that your fishing trip doesn’t result in a basket of bad catches that could waste your resources or get your website penalized.
1. Risks of a “copy only” strategy
Link Intersect can be amazingly helpful for discovering new, relevant link opportunities for your local business, but link builders beware. If all you ever do is copy your competitors, the most you’ll ever achieve is becoming the second-best version of them. Use this method to keep tabs on strategies your competition is using, and even use it to spark your own creativity, but avoid copying everything your competitors do, and nothing else. Why be the second-best version of your competition when you can be the best version of yourself?
2. Risks of a “blindly follow” strategy
Comparing your site’s backlink profile with your direct competitors’ backlink profiles will return a list of links that they have and you don’t, but don’t use Link Intersect results as an exact checklist of links to pursue. Your competitors might have bad backlinks in their profile. For example, avoid pursuing opportunities from domains with a high Spam Score or low Domain or Page Authority (DA/PA). Learn more about how to evaluate sites by their Spam Score or DA/PA.
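The quantitative part of that vetting can be sketched as a simple first-pass filter. The thresholds and sample records below are purely illustrative, not Moz guidance; a prospect that passes the filter still deserves a manual look:

```python
# First-pass triage of link prospects by Spam Score and Domain Authority.
# Thresholds and sample data are invented for illustration only.

def triage(prospects, max_spam=30, min_da=20):
    keep, skip = [], []
    for p in prospects:
        ok = p["spam_score"] <= max_spam and p["da"] >= min_da
        (keep if ok else skip).append(p["domain"])
    return keep, skip

prospects = [
    {"domain": "abqjournal.com", "spam_score": 2, "da": 60},
    {"domain": "spammy-directory.biz", "spam_score": 78, "da": 12},
    {"domain": "local-blog.example", "spam_score": 5, "da": 25},
]
keep, skip = triage(prospects)
print("pursue:", keep)
print("avoid:", skip)
```

A filter like this only screens out obvious risks; relevance to your business still has to be judged by a human.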
They might also have great backlinks that aren’t the right opportunity for your business, and those should be avoided too! Do you remember Isleta and Sandia’s links for events like MMA matches? If Bottger were to blindly take those resorts’ link profiles as directives, they might think they have to host a fight at their B&B, too!
Take what you find with a grain of salt. Evaluate every link opportunity on its own merit, rather than deeming it a good opportunity simply because your competitor has it.
3. Risks of an “apples to oranges” strategy
Choose the domains and pages you want to compare yourself against wisely. As a small local B&B, Bottger wouldn’t want to compare their backlink profile to that of Wikipedia or The New York Times, for example. Those sites are popular, but not relevant in any way to the types of unstructured linked citations Bottger would want to pursue, such as links that are locally relevant or industry-relevant.
In other words, just because a site is popular doesn’t mean it will yield relevant unstructured linked citation opportunities for you. Here in this tutorial, we’ve outlined one potential use-case for Link Intersect: finding unstructured linked citations your local business competitors have. However, this is not the only use for Link Intersect. Instead of comparing your site against competitors or near-competitors, you could compare it against:
If you know what types of links you’re trying to find, choosing sites to evaluate against your own should be a lot easier, and yield more relevant opportunities.
4. Risks of ignoring Google’s “link schemes” guidelines
If you’ve never embarked on link building before, we encourage you to read through Google’s quality guidelines for webmasters, specifically its section on “Link schemes.” If you were to distill those link guidelines down into a single principle, it would be: don’t create links for the purpose of manipulating your site’s ranking in Google search. That’s right. Google doesn’t want anyone embarking on any marketing initiatives solely for the purpose of improving their ranking. Google wants links to be the natural byproduct of the quality work you’re doing for your audience. Google can penalize sites that participate in activities such as:
- Buying links that pass PageRank (“followed” links)
- Excessive “you link to me and I’ll link to you” exchanges
- Self-created followed links that weren’t editorially placed by the site owner
This underscores that the activities that are just good business, like being involved in the local community, are also the ones that can produce the links that Google likes. Site owners might need a little nudge, which is why we’ve included a section on outreach, but that doesn’t mean the links are unnatural. Unstructured linked citations should be a byproduct of the good work local businesses are doing in their communities.
At Moz, we’re strong believers in authenticity, and there is no better pond for building meaningful marketing relationships than the local one. Focusing on unstructured linked citations can be viewed as a prompt to grow your community relationships — with journalists, bloggers, event hosts, business associations, and customers. It’s a chance for a real-world fishing trip that can reel in a basket of publicity for your local brand beyond what money can buy. Your genuine desire to serve and build community will stand you in good stead for the long haul.
Grab yourself a cup of coffee (or two) and buckle up, because we’re doing maths today.
Back it on up…
A quick refresher from last time: I pulled data from 50 keyword-targeted articles written on Brafton’s blog between January and June of 2018.
We wrote these articles using a technique, published earlier on Moz, that generates some seriously awesome results (we’re talking more than doubling our organic traffic in the last six months, but we’ll get to that in another publication).
We pulled this data again… Only I updated and reran all the data manually, doubling the dataset. No APIs. My brain is Swiss cheese.
We wanted to see how newly written, original content performs over time, and which factors may have impacted that performance.
Why do this the hard way, dude?
“Why not just pull hundreds (or thousands!) of data points from search results to broaden your dataset?”, you might be thinking. It’s been done successfully quite a few times!
Trust me, I was thinking the same thing while weeping tears into my keyboard.
The answer was simple: I wanted to do something different from the massive aggregate studies. I wanted a level of control over as many potentially influential variables as possible.
By using our own data, the study benefited from:
- The same root Domain Authority across all content.
- Similar individual URL link profiles (some laughs on that later).
- Known original publish dates, with no reoptimization efforts or tinkering.
- Known original keyword targets for each blog (rather than guessing).
- Known and consistent content depth/quality scores (MarketMuse).
- Similar content writing techniques for targeting specific keywords for each blog.
You will never eliminate the possibility of misinterpreting correlation as causation. But controlling some of the variables can help.
As Rand once said in a Whiteboard Friday, “Correlation does not imply causation (but it sure is a hint).”
What we gained in control, we lost in sample size. A sample size of 96 is much less useful than ten thousand, or a hundred thousand. So look at the data carefully and use discretion when considering the ranking factors you find most likely to be true.
This resource can help you gauge the confidence you should put into each Pearson correlation value. Generally, the stronger the relationship, the smaller the sample size needed to be confident in the results.
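One standard way to relate a Pearson r to sample size is to convert it into a t statistic with n − 2 degrees of freedom; the larger |t| is, the harder the observed relationship is to explain by chance. A small sketch (the r value of −0.4 is an arbitrary example, not a result from this study):

```python
# Convert a Pearson correlation r into a t statistic for sample size n:
#   t = r * sqrt((n - 2) / (1 - r^2))
# The same r becomes far more convincing as n grows.
import math

def pearson_t(r, n):
    return r * math.sqrt((n - 2) / (1 - r * r))

for n in (10, 96, 1000):
    print(n, round(pearson_t(-0.4, n), 2))
```

This is why the post urges caution: with 96 data points, only moderately strong correlations clear the bar that a huge aggregate study would clear with weak ones.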
So what exactly have you done here?
We have generated hints at what may influence the organic performance of newly created content. No more, and no less. But they are indeed interesting hints and maybe worth further discussion or research.
What have you not done?
We have not published sweeping generalizations about Google’s algorithm. This post should not be read as a definitive guide to Google’s algorithm, nor should you assume that your site will demonstrate the same correlations.
So what should I do with this data?
The best way to read this article is to observe the potential correlations we found in our data and consider how they may or may not apply to your own content and strategy.
I’m hoping that this study takes a new approach to studying individual URLs and stimulates constructive debate and conversation.
Your constructive criticism is welcome, and hopefully pushes these conversations forward!
The stat sheet
So quit jabbering and show me the goods, you say? Alright, let’s start with our stat sheet, formatted like a baseball card, because why not?
And as always, here is the original data set if you care to reproduce my results.
So now the part you have been waiting for…
1. Time and performance
I started with a question: “Do blogs age like a Macallan 18 served up neat on a warm summer Friday afternoon, or like tepid milk on a hot summer Tuesday?”
Does the time indexed play a role in how a piece of content performs?
Correlation 1: Time and target keyword position
First we will map the target keyword ranking positions against the number of days its corresponding blog has been indexed. Visually, if there is any correlation we will see some sort of negative or positive linear relationship.
There is a clear negative relationship between the two variables, which suggests they may be related. But we need to go beyond visuals and use the Pearson correlation coefficient (PCC).
Days live vs. target keyword position
The data shows a moderate relationship between how long a blog has been indexed and the positional ranking of the target keyword.
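For anyone who wants to reproduce this kind of check, the PCC is straightforward to compute by hand. The (days indexed, target keyword position) pairs below are a tiny invented sample, far cleaner than real ranking data, just to show the mechanics:

```python
# Compute the Pearson correlation coefficient from scratch on a tiny,
# made-up sample of (days indexed, target keyword position) pairs.
import math

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

days = [30, 60, 90, 120, 150, 180]
position = [48, 35, 30, 22, 15, 9]  # position improves (falls) as days grow
r = pearson(days, position)
print(round(r, 3))  # strongly negative on this idealized sample
```

Note that a negative r is the “good” direction here, since position 1 is the best ranking.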
But before getting carried away, we shouldn’t solely trust one statistical method and call it a day. Let’s take a look at things another way: Let’s compare the average age of articles whose target keywords rank in the top ten against the average age of articles whose target keywords rank outside the top ten.
Average age of articles based on position
Target KW position ≤ 10
Target KW position > 10
Now a story is starting to become clear: Our newly written content takes a significant amount of time to fully mature.
But for the sake of exhausting this hint, let’s look at the data one final way. We will group the data into buckets of target keyword positions, and days indexed, then apply them to a heatmap.
This should show us a clear visual clustering of how articles perform over time.
This chart, quite literally, paints a picture. According to the data, we shouldn’t expect a new article to realize its full potential until at least 100 days, and likely longer. As a blog post ages, it appears to gain more favorable target keyword positioning.
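A heatmap like this is really just a two-dimensional histogram: bucket each article by days indexed and by target keyword position, then count the articles in each cell. A sketch with invented article data:

```python
# Build the counts behind a (days indexed x keyword position) heatmap.
# The article tuples are invented sample data.
from collections import Counter

def bucket(value, edges):
    """Return the label of the first bucket whose upper edge bounds value."""
    for label, upper in edges:
        if value <= upper:
            return label
    return edges[-1][0]

day_edges = [("0-50", 50), ("51-100", 100), ("101+", float("inf"))]
pos_edges = [("1-10", 10), ("11-20", 20), ("21+", float("inf"))]

# (days indexed, target keyword position) per article
articles = [(30, 45), (70, 18), (110, 8), (140, 4), (160, 9), (45, 22)]
grid = Counter((bucket(d, day_edges), bucket(p, pos_edges)) for d, p in articles)

for cell, count in sorted(grid.items()):
    print(cell, count)
# in this toy sample, top-ten rankings cluster in the 101+ days column
```

Feeding each cell’s count into a plotting library as a color intensity gives the heatmap itself.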
Correlation 2: Time and total ranking keywords on URL
You’ll find that when you write an article, it will (hopefully) rank for the keyword you target. But oftentimes it will also rank for other keywords. Some of these are variants of the target keyword, some are tangentially related, and some are purely random noise.
Instinct will tell you that you want your articles to rank for as many keywords as possible (ideally variants and tangentially related keywords).
Predictably, we have found that the relationship between the number of keywords an article ranks for and its estimated monthly organic traffic (per SEMrush) is strong (.447).
We want all of our articles to do things like this:
We want lots of variants each with significant search volume. But, does an article increase the total number of keywords it ranks for over time? Let’s take a look.
Visually this graph looks a little murky due to the existence of two clear outliers on the far right. We will first run the analysis with the outliers, and again without. With the outliers, we observe the following:
Days live vs. total keywords ranking on URL (w/outliers)
There appears to be a relationship between the two variables, but it isn’t as strong. Let’s see what happens when we remove those two outliers:
Visually, the relationship looks stronger. Let’s look at the PCC:
Days live vs. total keywords ranking on URL (without outliers)
The relationship appears to be much stronger with the two outliers removed.
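The outlier-trimming step can be sketched like this. Tukey's IQR fences are one reasonable rule for it; the article doesn't specify the exact method we used, so treat both the rule and the data below as illustrative assumptions:

```python
import statistics

def remove_outliers(pairs, k=1.5):
    """Drop pairs whose x-value lies outside the Tukey fences (Q1 - k*IQR, Q3 + k*IQR)."""
    xs = sorted(x for x, _ in pairs)
    q1, _, q3 = statistics.quantiles(xs, n=4)
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [(x, y) for x, y in pairs if lo <= x <= hi]

# Hypothetical (days live, total ranking keywords) pairs with two far-right outliers
data = [(20, 3), (40, 6), (60, 9), (80, 12), (100, 15),
        (120, 17), (140, 19), (160, 21), (700, 400), (800, 450)]
print(len(remove_outliers(data)))  # -> 8, the two extreme points are dropped
```

After filtering, you would recompute the PCC on the remaining pairs to see whether the relationship strengthens, as it did in our case.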
But again, let’s look at things another way.
Let’s look at the average age of the top 25% of articles and compare them to the average age of the bottom 25% of articles:
Average age of top 25% of articles versus bottom 25%
This is exactly why we look at data multiple ways! The top 25% of blog posts with the most ranking keywords have been indexed an average of 149 days, while the bottom 25% have been indexed 74 days — roughly half.
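That top-25%-versus-bottom-25% comparison can be reproduced with a short helper. The article tuples below are hypothetical sample values:

```python
def quartile_means(pairs):
    """Sort articles by performance (second element), then compare the mean
    days indexed of the top 25% against the bottom 25%."""
    ranked = sorted(pairs, key=lambda p: p[1], reverse=True)
    q = max(1, len(ranked) // 4)
    top, bottom = ranked[:q], ranked[-q:]

    def mean_days(group):
        return sum(d for d, _ in group) / len(group)

    return mean_days(top), mean_days(bottom)

# Hypothetical articles: (days indexed, total keywords ranking on URL)
articles = [(150, 60), (140, 55), (130, 48), (160, 52),
            (70, 10), (65, 8), (80, 12), (75, 9)]
top_age, bottom_age = quartile_means(articles)
print(top_age, bottom_age)  # -> 145.0 70.0
```

With this toy data the top quartile averages roughly twice the age of the bottom quartile, mirroring the 149-versus-74-day pattern we observed.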
To be fully sure, let’s again cluster the data into a heatmap to observe where performance falls on the time continuum:
We see a very similar pattern as in our previous analysis: a clustering of top-performing blogs starting at around 100 days.
Time and performance assumptions
You still with me? Good, because we are saying something BIG here. In our observation, it takes between 3 and 5 months for new content to perform in organic search. Or at the very least, mature.
To look at this one final way, I’ve created a scatterplot of only the top 25% of highest performing blogs and compared them to their time indexed:
There are 48 data plots on this chart, the blue plots represent the top 25% of articles in terms of strongest target keyword ranking position. The orange plots represent the top 25% of articles with the highest number of keyword rankings on their URL. (These can be, and some are, the same URL.)
Looking at the data a little more closely, we see the following:
90% of the top 25% of highest-performing content took at least 100 days to mature, and only two articles took less than 75 days.
Time and performance conclusion
For those of you just starting a content marketing program, remember that you may not see the full organic potential for your first piece of content until month 3 at the earliest. And, it takes at least a couple months of content production to make a true impact, so you really should wait a minimum of 6 months to look for any sort of results.
In conclusion, we expect new content to take at least 100 days to fully mature.
But wait, some of you may be saying. What about links, buddy? Articles build links over time, too!
It stands to reason that a blog will gain links (and ranking potential) over time. Links matter, and higher-positioned rankings gain links at a faster rate. Thus, we are at risk of mistaking correlation for causation if we don’t look at this carefully.
But what none of you know, that I know, is that being the terrible SEO that I am, I had no linking strategy with this campaign.
And I mean zero strategy. The average article generated 1.3 links from .5 linking domains.
Linking domains vs. target keyword position
Average linking domains to top 25% of articles
Average linking domains to bottom 25% of articles
The one thing consistent across all the articles was a shocking and embarrassing lack of inbound links. This is demonstrated by an insignificant correlation coefficient of -.022. The same goes for the total number of links per URL, with a correlation coefficient of -.029.
These articles appear to have performed primarily on their content rather than inbound links.
(And they certainly would have performed much better with a strong, or any, linking strategy. Nobody is arguing the value of links here.) But mostly…
Shame on me.
Shame. Shame. Shame.
But on a positive note, we were able to generate a more controlled experiment on the effects of time and blog performance. So, don’t fire me just yet?
Note: It would be interesting to pull link quality metrics into the discussion (for the precious few links we did earn) rather than total volume. However, after a cursory look at the data, nothing stood out as being significant.
3. Word count
Content marketers and SEOs love talking about word count. And for good reason. When we collectively agreed that “quality content” was the key to rankings, it would stand to reason that longer content would be more comprehensive, and thus do a better job of satisfying searcher intent. So let’s test that theory.
Correlation 1: Target keyword position versus total word count
Will longer articles increase the likelihood of ranking for the keyword you are targeting?
Not in our case. To be sure, let’s run a similar analysis as before.
Word count vs. target keyword position
Average word count of top 25% articles
Average word count of bottom 25% articles
The data shows no impact on rankings based on the length of our articles.
Correlation 2: Total keywords ranking on URL versus word count
One would think that longer content would result in additional ranking keywords, right? Even by accident, you would think that the more related topics you discuss in an article, the more keywords you will rank for. Let’s see if that’s true:
Total keywords ranking on URL vs. word count
Not in this case.
Word count, speculative tangent
So how can it be that so many studies demonstrate higher word counts result in more favorable rankings? Some reconciliation is in order, so allow me to speculate on what I think may be happening in these studies.
- Most likely: Measurement techniques. These studies generally look at one factor relative to rankings: average absolute word count based on position. (And, there actually isn’t much of a difference in average word count between position one and ten.)
- Likely: High quality content is longer, by nature. We know that “quality content” is discussed in terms of how well a piece satisfies the intent of the reader. In an ideal scenario, you will create content that fully satisfies everything a searcher would want to know about a given topic. Ideally you own the resource center for the topic, and the searcher does not need to revisit SERPs and weave together answers from multiple sources. By nature, this type of comprehensive content is quite lengthy. Long-form content is arguably a byproduct of creating for quality. Cyrus Shepard does a better job of explaining this likelihood here.
- Less likely: Long-form threshold. The articles we wrote for this study ranged from just under 1,000 words to nearly as high as 4,000 words. One could consider all of these as “long-form content,” and perhaps Google does as well. Perhaps there is a word count threshold that Google uses.
As we are demonstrating in this article, there may be many other factors at play that need to be isolated and tested for correlations in order to get the full picture, such as: time indexed, on-page SEO (to be discussed later), Domain Authority, link profile, and depth/quality of content (also to be discussed later with MarketMuse as a measure). It’s possible that correlation does not imply causation, and by using word count averages as the single method of measure, we may be painting with too broad a stroke.
This is all speculation. What we can say for certain is that all of our content is 900 words and up, and we saw no incremental benefit from additional length.
Feel free to disagree with any (or all) of my speculations about these discrepancies; with the information available, I tend to share Brian Dean’s opinion.
At this point, most of you are familiar with MarketMuse. They have created a number of AI-powered tools that help with content planning and optimization.
We use the Content Optimizer tool, which evaluates the top 20 results for any keyword and generates an outline of all the major topics being discussed in SERPs. This helps you create content that is more comprehensive than your competitors, which can lead to better performance in search.
Based on the competitive landscape, the tool will generate a recommended content score (their proprietary algorithm) that you should hit in order to compete with the competing pages ranking in SERPs.
But… if you’re a competitive fellow, what happens if you want to blow the recommended score out of the water? Do higher scores have an impact on rankings? Does it make a difference if your competition has a very low average score?
We pulled every article’s content score, along with MarketMuse’s recommended scores and the average competitor scores, to answer these questions.
Correlation 1: Overall MarketMuse content score
Does a higher overall content score result in better rankings? Let’s take a look:
Absolute MarketMuse score vs. target keyword position
A perfect zero! We weren’t able to beat the system by racking up points. I also checked to see if a higher absolute score would result in a larger number of keywords ranking on the URL — it doesn’t.
Correlation 2: Beating the recommended score
As mentioned, based on the competitive landscape, MarketMuse will generate a recommended content score. What happens if you blow the recommended score out of the water? Do you get bonus points?
In order to calculate this correlation, we pulled the content score percentage attainment and compared it to the target keyword position. For example, if we scored a 30 against a recommended 25, we hit 120% attainment. Let’s see if it matters:
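The attainment calculation itself is trivial:

```python
def attainment(actual, recommended):
    """Percentage attainment of a content score against its recommended target."""
    return round(actual / recommended * 100)

# The article's examples: 30 against a recommended 25, and 30 against a competitor average of 10
print(attainment(30, 25))  # -> 120
print(attainment(30, 10))  # -> 300
```

We then correlated this attainment percentage against target keyword position.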
Percentage content score attainment vs. target keyword position
No bonus points for doing extra credit!
Correlation 3: Beating the average competitors’ scores
Okay, if you beat MarketMuse’s recommendations, you don’t get any added benefit, but what if you completely destroy your competitors’ average content scores?
We will calculate this correlation the same way we previously did, with percentage attainment over the average competitor. For example, if we scored a 30 over the average of 10, we hit 300% attainment. Let’s see if that matters:
Percentage attainment over average competitor score versus target KW position
That didn’t work either! Seems that there are no hacks or shortcuts here.
We know that MarketMuse works, but it seems that there are no additional tricks to this tool.
If you regularly hit the recommended score as we did (average 110% attainment, with 81% of blogs hitting 100% attainment or better) and cover the topics prescribed, you should do well. But don’t fixate on competitor scores or blowing the recommended score out of the water. You may just be wasting your time.
Note: It’s worth noting that we probably would have shown stronger correlations had we intentionally bombed a few MarketMuse scores. Perhaps a test for another day.
5. On-page optimization
Ah, old-school technical SEO. This type of work warms the cockles of a seasoned SEO’s heart. But does it still have a place in our constantly evolving world? Has Google advanced to the point where it doesn’t need technical cues from SEOs to understand what a page is about?
To find out, I have pulled Moz’s on-page optimization score for every article and compared them to the target keywords’ positional rankings:
Let’s take a look at the scatterplot for all the keyword targets.
Now looking at the math:
On-page optimization score vs. target keyword position
Average on-page score for top 25%
Average on-page score for bottom 25%
If you have a keen eye, you may have noticed a few strong outliers on the scatterplot. If we remove the three largest outliers, the correlation strengthens to -.435, a strong relationship.
Before we jump to conclusions, let’s look at this data one final way.
Let’s take a look at the percentage of articles with their target keywords ranking 1–10 that also have a 90% on-page score or better. We will compare that number to the percentage of articles ranking outside the top ten that also have a 90% on-page score or better.
If our assumption is correct, we will see a much higher percentage of keywords ranking 1–10 with an on-page score of 90% or better, and a lower number for articles ranking greater than 10.
On-page optimization score by rankings
Percentage of KWs ranking 1–10 with ≥ 90% score
Percentage of keywords ranking >10 with ≥ 90% score
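Here is a minimal sketch of that bucketed comparison; the sample data is made up for illustration:

```python
def share_with_min_score(articles, min_score=90):
    """Split articles by target keyword rank (1-10 vs. >10) and return the
    percentage in each bucket whose on-page score meets the threshold."""
    # articles: list of (target_keyword_position, on_page_score_percent)
    top = [s for pos, s in articles if pos <= 10]
    rest = [s for pos, s in articles if pos > 10]

    def pct(group):
        return 100 * sum(s >= min_score for s in group) / len(group) if group else 0.0

    return pct(top), pct(rest)

# Hypothetical sample of (position, on-page score)
sample = [(3, 95), (7, 92), (9, 88), (15, 91), (22, 70), (40, 65)]
print(share_with_min_score(sample))
```

If the assumption holds, the first percentage comes out well above the second, which is exactly the gap we saw in our data.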
This is enough of a hint for me. I’m implementing a 90% minimum on-page score from here on out.
Old school SEOs, rejoice!
6. The competition’s average word count
We won’t put this “word count” argument to bed just yet…
Let’s ask ourselves, “Does it matter how long the average content of the top 20 results is?”
Is there a relationship between the length of your content versus the average competitor?
What if your competitors are writing very short form, and you want to beat them with long-form content?
We will measure this the same way as before, with percentage attainment. For example, if the average word count of the top 20 results for “content marketing agency” is 300, and our piece is 450 words, we hit 150% attainment.
Let’s see if you can “out-verbose” your opponents.
Percentage word count attainment versus target KW position
Alright, I’ll put word count to bed now, I promise.
7. Keyword density
You’ve made it to the last analysis. Congratulations! How many cups of coffee have you consumed? No judgment; this report was responsible for entire coffee farms being completely decimated by yours truly.
For selfish reasons, I couldn’t resist the temptation to dispel this ancient tactic of “using target keywords” in blog content. You know what I’m talking about: when someone says “This blog doesn’t FEEL optimized… did you use the target keyword enough?”
There are still far too many people that believe that littering target keywords throughout a piece of content will yield results. And misguided SEO agencies, along with certain SEO tools, perpetuate this belief.
Yoast has a tool in WordPress that some digital marketers live and die by. They don’t think that a blog is complete until Yoast shows the magical green light, indicating that the content has satisfied the majority of its SEO recommendations:
Uh oh, keyword density is too low! Let’s see if that ACTUALLY matters.
Not looking so good, my keyword-stuffing friends! Let’s take a look at the PCC:
Target keyword ranking position vs. Yoast keyword density
Believers would like to see a negative relationship here: as keyword density increases, the ranking position number decreases (better rankings), producing a downward-sloping line.
What we are looking at is a slightly upward-sloping line, which would indicate losing rankings by keyword stuffing — but fortunately not TOO upward sloping, given the low correlation value.
Okay, so PLEASE let that be the end of “keyword density.” This practice has been disproven in past studies, as referenced by Zyppy. Let’s confidently put this to bed, forever. Please.
Oh, and just for kicks, the Flesch Reading Ease score has no bearing on rankings either (-.03 correlation). Write to a third grade level, or a college level, it doesn’t matter.
TL;DR (I don’t blame you)
What we learned from our data
- Time: It took 100 days or more for an article to fully mature and show its true potential. A content marketing program probably shouldn’t be fully scrutinized until month 5 or 6 at the very earliest.
- Links: Links matter, I’m just terrible at generating them. Shame.
- Word count: It’s not about the length of the content, in absolute terms or relative to the competition. It’s about what is written and how resourceful it is.
- MarketMuse: We have proven that MarketMuse works as it prescribes, but there is no added benefit to breaking records.
- On-page SEO: Our data demonstrates that it still matters. We all still have a job.
- Competitor content length: We weren’t successful at blowing our competitors out of the water with longer content.
- Keyword density: Just stop. Join us in modern times. The water is warm.
In conclusion, some reasonable guidance we agree on is:
Wait at least 100 days to evaluate the performance of your content marketing program, write comprehensive content, and make sure your on-page SEO score is 90%+.
Oh, and build links. Unlike me. Shame.
Now go take a nap.
In the past year, local SEO has run at a startling and near-constant pace of change. From an explosion of new Google My Business features to an ever-increasing emphasis on the importance of reviews, it’s almost too much to keep up with. In today’s Whiteboard Friday, we welcome our friend Darren Shaw to explain what local is like today, dive into the key takeaways from his 2018 Local Search Ranking Factors survey, and offer us a glimpse into the future according to the local SEO experts.
Howdy, Moz fans. I’m Darren Shaw from Whitespark, and today I want to talk to you about the local search ranking factors. So this is a survey that David Mihm has run for the past like 10 years. Last year, I took it over, and it’s a survey of the top local search practitioners, about 40 of them. They all contribute their answers, and I aggregate the data and say what’s driving local search. So this is what the opinion of the local search practitioners is, and I’ll kind of break it down for you.
Local search today
So these are the results of this year’s survey. We had Google My Business factors at about 25%. That was the biggest piece of the pie. We have review factors at 15%, links at 16%, on-page factors at 14%, behavioral at 10%, citations at 11%, personalization and social at 6% and 3%. So that’s basically the makeup of the local search algorithm today, based on the opinions of the people that participated in the survey.
The big story this year is Google My Business. Google My Business factors are way up, compared to last year, a 32% increase in Google My Business signals. I’ll talk about that a little bit more over in the takeaways. Review signals are also up, so more emphasis on reviews this year from the practitioners. Citation signals are down again, and that makes sense. They continue to decline I think for a number of reasons. They used to be the go-to factor for local search. You just built out as many citations as you could. Now the local search algorithm is so much more complicated and there’s so much more to it that it’s being diluted by all of the other factors. Plus it used to be a real competitive difference-maker. Now it’s not, because everyone is pretty much getting citations. They’re considered table stakes now. By seeing a drop here, it doesn’t mean you should stop doing them. They’re just not the competitive difference-maker they used to be. You still need to get listed on all of the important sites.
All right, so let’s talk about the key takeaways.
1. Google My Business
The real story this year was Google My Business, Google My Business, Google My Business. Everyone in the comments was talking about the benefits they’re seeing from investing in a lot of these new features that Google has been adding.
Google has been adding a ton of new features lately — services, descriptions, Google Posts, Google Q&A. There’s a ton of stuff going on in Google My Business now that allows you to populate Google My Business with a ton of extra data. So this was a big one.
✓ Take advantage of Google Posts
Everyone talked about Google Posts, how they’re seeing Google Posts driving rankings. There are a couple of things there. One is the semantic content that you’re providing Google in a Google post is definitely helping Google associate those keywords with your business. Engagement with Google Posts as well could be driving rankings up, and maybe just being an active business user continuing to post stuff and logging in to your account is also helping to lift your business entity and improve your rankings. So definitely, if you’re not on Google Posts, get on it now.
If you search for your category, you’ll see a ton of businesses are not doing it. So it’s also a great competitive difference-maker right now.
✓ Seed your Google Q&A
Google Q&A, a lot of businesses are not even aware this exists. There’s a Q&A section now. Your customers are often asking questions, and they’re being answered by not you. So it’s valuable for you to get in there and make sure you’re answering your questions and also seed the Q&A with your own questions. So add all of your own content. If you have a frequently asked questions section on your website, take that content and put it into Google Q&A. So now you’re giving lots more content to Google.
✓ Post photos and videos
Photos and videos, continually post photos and videos, maybe even encourage your customers to do that. All of that activity is helpful. A lot of people don’t know that you can now post videos to Google My Business. So get on that if you have any videos for your business.
✓ Fill out every field
There are so many new fields in Google My Business. If you haven’t edited your listing in a couple of years, there’s a lot more stuff in there that you can now populate and give Google more data about your business. All of that really leads to engagement. All of these extra engagement signals that you’re now feeding Google, from being a business owner that’s engaged with your listing and adding stuff and from users, you’re giving them more stuff to look at, click on, and dwell on your listing for a longer time, all that helps with your rankings.
2. Reviews
✓ Get more Google reviews
Reviews continue to increase in importance in local search, so, obviously, getting more Google reviews. It used to be a bit more of a competitive difference-maker. It’s becoming more and more table stakes, because everybody seems to be having lots of reviews. So you definitely want to make sure that you are competing with your competition on review count and lots of high-quality reviews.
✓ Keywords in reviews
Getting keywords in reviews, so rather than just asking for a review, it’s useful to ask your customers to mention what service they had provided or whatever so you can get those keywords in your reviews.
✓ Respond to reviews (users get notified now!)
Responding to reviews. Google recently started notifying users that if the owner has responded to you, you’ll get an email. So all of that is really great, and those responses, it’s another signal to Google that you’re an engaged business.
✓ Diversify beyond Google My Business for reviews
Diversify. Don’t just focus on Google My Business. Look at other sites in your industry that are prominent review sites. You can find them if you just look for your own business name plus reviews, if you search that in Google, you’re going to see the sites that Google is saying are important for your particular business.
You can also find out like what are the sites that your competitors are getting reviews on. Then if you just do a search like keyword plus city, like “lawyers + Denver,” you might find sites that are important for your industry as well that you should be listed on. So check out a couple of your keywords and make sure you’re getting reviews on more sites than just Google.
3. Links
Then links, of course, links continue to drive local search. A lot of people in the comments talked about how a handful of local links have been really valuable. This is a great competitive difference-maker, because a lot of businesses don’t have any links other than citations. So when you get a few of these, it can really have an impact.
✓ From local industry sites and sponsorships
They really talk about focusing on local-specific sites and industry-specific sites. You can get a lot of those from sponsorships. They’re kind of the go-to tactic. If you do a search for intitle:"sponsors" plus your city name, you’re going to find a lot of sites that are listing their sponsors, and those are opportunities for you: you could sponsor that event or organization in your city as well and get a link.
All right. So I also asked in the survey: Where do you see Google going in the future? We got a lot of great responses, and I tried to summarize that into three main themes here for you.
1. Keeping users on Google
This is a really big one. Google does not want to send its users to your website to get the answer. Google wants to have the answer right on Google so that they don’t have to click. It’s this zero-click search result. So you see Rand Fishkin talking about this. This has been happening in local for a long time, and it’s really amplified with all of these new features Google has been adding. They want to have all of your data so that they don’t have to send users to find it somewhere else. Then that means in the future less traffic to your website.
So Mike Blumenthal and David Mihm also talk about Google as your new homepage, and this concept is like branded search.
- What does your branded search look like?
- So what sites are you getting reviews on?
- What does your knowledge panel look like?
Make that all look really good, because Google doesn’t want to send people to your website.
2. More emphasis on behavioral signals
David Mihm is a strong voice in this. He talks about how Google is trying to diversify how they rank businesses based on what’s happening in the real world. They’re looking for real-world signals that actual humans care about this business and they’re engaging with this business.
So there’s a number of things that they can do to track that — so branded search, how many people are searching for your brand name, how many people are clicking to call your business, driving directions. This stuff is all kind of hard to manipulate, whereas you can add more links, you can get more reviews. But this stuff, this is a great signal for Google to rely on.
Engagement with your listing, engagement with your website, and actual humans in your business. If you’ve seen on the knowledge panel sometimes for brick-and-mortar business, it will be like busy times. They know when people are actually at your business. They have counts of how many people are going into your business. So that’s a great signal for them to use to understand the prominence of your business. Is this a busy business compared to all the other ones in the city?
3. Google will monetize everything
Then, of course, a trend to monetize as much as they can. Google is a publicly traded company. They want to make as much money as possible. They’re on a constant growth path. So there are a few things that we see coming down the pipeline.
Local service ads are expanding across the country and globally and in different industries. So this is like a paid program. You have to apply to get into it, and then Google takes a cut of leads. So if you are a member of this, then Google will send leads to you. But you have to be verified to be in there, and you have to pay to be in there.
Then taking a cut from bookings, you can now book directly on Google for a lot of different businesses. If you think about Google Flights and Google Hotels, Google is looking for a way to monetize all of this local search opportunity. That’s why they’re investing heavily in local search so they can make money from it. So seeing more of these kinds of features rolling out in the future is definitely coming. Transactions from other things. So if I did book something, then Google will take a cut for it.
So that’s the future. That’s sort of the news of the local search ranking factors this year. I hope it’s been helpful. If you have any questions, just leave some comments and I’ll make sure to respond to them all. Thanks, everybody.
If you missed our recent webinar on the Local Search Ranking Factors survey with Darren Shaw and Dr. Pete, don’t worry! You can still catch the recording here:
You’ll be in for a jam-packed hour of deeper insights and takeaways from the survey, as well as some great audience-contributed Q&A.
A thousand thanks to the 1,411 respondents who gave of their time and knowledge in contributing to this major survey! You’ve created a vivid image of what real-life, everyday local search marketers and local business owners are observing on a day-to-day basis, what strategies are working for them right now, and where some frankly stunning opportunities for improvement reside. Now, we’re ready to share your insights into:
- Google Updates
- Company infrastructure
- Tool usage
- And a great deal more…
This survey pooled the observations of everyone from people working to market a single small business, to agency marketers with large local business clients:
Thanks to you, this free report is a window into the industry. Bring these statistics to teammates and clients to earn the buy-in you need to effectively reach local consumers in 2019.
There are so many stories here worthy of your time
Let’s pick just one, to give a sense of the industry intelligence you’ll access in this report. Likely you’ve now seen the Local Search Ranking Factors 2018 Survey, undertaken by Whitespark in conjunction with Moz. In that poll of experts, we saw Google My Business signals being cited as the most influential local ranking component. But what was #2? Link building.
You might come away from that excellent survey believing that, since link building is so important, all local businesses must be doing it. But not so. The State of the Local SEO Industry Report reveals that:
When asked what’s working best for them as a method for earning links, 35% of local businesses and their marketers admitted to having no link building strategy in place at all:
And that, Moz friends, is what opportunity looks like. Get your meaningful local link building strategy in place in the new year, and prepare to leave ⅓ of your competitors behind, wondering how you surpassed them in the local and organic results.
The full report contains 30+ findings like this one. Rivet the attention of decision-makers at your agency, quote persuasive statistics to hesitant clients, and share this report with teammates who need to be brought up to industry speed. When read in tandem with the Local Search Ranking Factors survey, this report will help your business or agency understand both what experts are saying and what practitioners are experiencing.
Sometimes, local search marketing can be a lonely road to travel. You may find yourself wondering, “Does anyone understand what I do? Is anyone else struggling with this task? How do I benchmark myself?” You’ll find both confirmation and affirmation today, and Moz’s best hope is that you’ll come away a better, bolder, more effective local marketer. Let’s begin!
Correlation studies have been a staple of the search engine optimization community for many years. Each time a new study is released, a chorus of naysayers seem to come magically out of the woodwork to remind us of the one thing they remember from high school statistics — that “correlation doesn’t mean causation.” They are, of course, right in their protestations and, to their credit, an unfortunate number of times it seems that those conducting the correlation studies have forgotten this simple aphorism.
That being said, correlation studies are not altogether fruitless simply because they don’t necessarily uncover causal relationships (ie: actual ranking factors). What correlation studies discover or confirm are correlates.
Correlates are simply measurements that share some relationship with the independent variable (in this case, the order of search results on a page). For example, we know that backlink counts are correlates of rank order. We also know that social shares are correlates of rank order.
Correlation studies also provide us with direction of the relationship. For example, ice cream sales are positive correlates with temperature and winter jackets are negative correlates with temperature — that is to say, when the temperature goes up, ice cream sales go up but winter jacket sales go down.
Finally, correlation studies can help us rule out proposed ranking factors. This is often overlooked, but it is an incredibly important part of correlation studies. Research that provides a negative result is often just as valuable as research that yields a positive result. We’ve been able to rule out many types of potential factors — like keyword density and the meta keywords tag — using correlation studies.
Unfortunately, the value of correlation studies tends to end there. In particular, we still want to know whether a correlate causes the rankings or is spurious. Spurious is just a fancy-sounding word for “false” or “fake.” A good example of a spurious relationship would be that ice cream sales cause an increase in drownings. In reality, the heat of the summer increases both ice cream sales and the number of people who go for a swim, and swimming can lead to drownings. So while ice cream sales are a correlate of drownings, the relationship is *spurious*: it does not cause the drowning.
How might we go about teasing out the difference between causal and spurious relationships? One thing we know is that a cause happens before its effect, which means that a causal variable should predict a future change.
An alternative model for correlation studies
I propose an alternate methodology for conducting correlation studies. Rather than measure the correlation between a factor (like links or shares) and a SERP, we can measure the correlation between a factor and changes in the SERP over time.
The process works like this:
- Collect a SERP on day 1
- Collect the link counts for each of the URLs in that SERP
- Look for any URLs that are out of order with respect to links; for example, if position 2 has fewer links than position 3
- Record that anomaly
- Collect the same SERP in 14 days
- Record if the anomaly has been corrected (ie: position 3 now out-ranks position 2)
- Repeat across ten thousand keywords and test a variety of factors (backlinks, social shares, etc.)
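The steps above can be sketched in a few lines of Python. This is a minimal illustration, not the study's actual code: `find_anomalies` and `count_corrected` are hypothetical helpers, and the SERP/link data would come from whatever rank tracker and link index you use.

```python
def find_anomalies(serp, link_counts):
    """Return adjacent URL pairs ranked out of order relative to the factor.

    `serp` is a list of URLs in rank order; `link_counts` maps URL -> factor
    value (link count, shares, etc.). An anomaly is a better-ranked URL whose
    factor value is lower than the URL immediately below it.
    """
    anomalies = []
    for i in range(len(serp) - 1):
        upper, lower = serp[i], serp[i + 1]
        if link_counts[upper] < link_counts[lower]:
            anomalies.append((upper, lower))
    return anomalies


def count_corrected(anomalies, later_serp):
    """Count anomalies where the lower URL now outranks the upper URL
    in the SERP collected 14 days later."""
    pos = {url: i for i, url in enumerate(later_serp)}
    fixed = 0
    for upper, lower in anomalies:
        if upper in pos and lower in pos and pos[lower] < pos[upper]:
            fixed += 1
    return fixed


# Toy example: position 2 has fewer links than position 3 on day 1...
day1 = ["a.com/post", "b.com/post", "c.com/post"]
links = {"a.com/post": 100, "b.com/post": 30, "c.com/post": 50}
anomalies = find_anomalies(day1, links)  # [("b.com/post", "c.com/post")]

# ...and by day 14 Google has swapped them.
day14 = ["a.com/post", "c.com/post", "b.com/post"]
corrected = count_corrected(anomalies, day14)  # 1
```

Run across ten thousand keywords, the ratio of corrected anomalies to total anomalies gives the "percent corrected" figure reported in the results table below.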
So what are the benefits of this methodology? By looking at change over time, we can see whether the ranking factor (correlate) is a leading or lagging feature. A lagging feature can automatically be ruled out as causal. A leading factor has the potential to be a causal factor.
Following this methodology, we tested 3 different common correlates produced by ranking factors studies: Facebook shares, number of root linking domains, and Page Authority. The first step involved collecting 10,000 SERPs from randomly selected keywords in our Keyword Explorer corpus. We then recorded Facebook Shares, Root Linking Domains, and Page Authority for every URL. We noted every example where 2 adjacent URLs (like positions 2 and 3 or 7 and 8) were flipped with respect to the expected order predicted by the correlating factor. For example, if the #2 position had 30 shares while the #3 position had 50 shares, we noted that pair. Finally, 2 weeks later, we captured the same SERPs and identified the percent of times that Google rearranged the pair of URLs to match the expected correlation. We also randomly selected pairs of URLs to get a baseline percent likelihood that any 2 adjacent URLs would switch positions. Here were the results…
It’s important to note that it is incredibly rare to expect a leading factor to show up strongly in an analysis like this. While the experimental method is sound, it’s not as simple as a factor predicting future rankings — it assumes that in some cases we will know about a factor before Google does. The underlying assumption is that in some cases we have seen a ranking factor (like an increase in links or social shares) before Googlebot has, and that in the 2-week period Google will catch up and correct the incorrectly ordered results. As you can imagine, this is a rare occasion. However, with a sufficient number of observations, we should be able to see a statistically significant difference between lagging and leading results. Keep in mind that the methodology only detects cases where a factor is leading and Moz Link Explorer discovered the relevant factor before Google did.
| Factor | Percent Corrected | P-Value | 95% Min | 95% Max |
|---|---|---|---|---|
| Facebook Shares Controlled for PA | 18.31% | 0.00001 | -0.6849 | -0.5551 |
| Root Linking Domains | 20.58% | 0.00001 | 0.016268 | 0.016732 |
In order to create a control, we randomly selected adjacent URL pairs in the first SERP collection and determined the likelihood that the second will outrank the first in the final SERP collection. Approximately 18.93% of the time the worse ranking URL would overtake the better ranking URL. By setting this control, we can determine if any of the potential correlates are leading factors – that is to say that they are potential causes of improved rankings.
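To judge whether a factor's "percent corrected" rate genuinely beats the random baseline, one can run a standard two-proportion z-test. The sketch below uses only the standard library; note that the sample sizes are hypothetical, since the post doesn't report how many anomalous pairs were observed per factor.

```python
import math


def two_proportion_z(p1, n1, p2, n2):
    """Z-statistic for the difference between two observed proportions.

    p1, p2 are the observed rates (e.g. percent corrected vs. the random
    baseline); n1, n2 are the number of pairs behind each rate.
    """
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se


# Root linking domains (20.58%) vs. the random control (18.93%),
# assuming ~5,000 pairs in each group (a made-up sample size).
z = two_proportion_z(0.2058, 5000, 0.1893, 5000)
# z > ~1.96 corresponds to significance at the 5% level (two-tailed)
```

With these assumed sample sizes the z-statistic comes out around 2.1, i.e. a difference of this size would be significant at the 5% level; with the study's actual (presumably much larger) pair counts, the reported p-values would be far smaller.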
Facebook Shares performed the worst of the three tested variables. Facebook Shares actually performed worse than random (18.31% vs. 18.93%), meaning that randomly selected pairs were more likely to switch than pairs where the second URL had more shares than the first. This is not altogether surprising, as the general industry consensus is that social signals are lagging factors — that is to say, traffic from higher rankings drives higher social shares, not the other way around. Consequently, we would expect to see the ranking change before we see the increase in social shares.
Raw root linking domain counts performed substantially better than shares at ~20.5%. As I indicated before, this type of analysis is incredibly subtle because it only detects cases where a factor is both leading and discovered by Moz Link Explorer before Google saw it. Nevertheless, this result was statistically significant, with a P-value < 0.0001 and a 95% confidence interval that RLDs will predict future ranking changes around 1.5% more often than random.
By far, the highest-performing factor was Page Authority. At 21.5%, PA correctly predicted changes in SERPs 2.6% better than random. This is a strong indication of a leading factor, greatly outperforming social shares and beating the best predictive raw metric, root linking domains. This is not surprising. Page Authority is built to predict rankings, so we should expect it to outperform raw metrics in identifying when a shift in rankings might occur. Now, this is not to say that Google uses Moz Page Authority to rank sites, but rather that Moz Page Authority is a relatively good approximation of whatever link metrics Google is using to rank sites.
There are so many different experimental designs we can use to help improve our research industry-wide, and this is just one of the methods that can help us tease out the differences between causal ranking factors and lagging correlates. Experimental design does not need to be elaborate and the statistics to determine reliability do not need to be cutting edge. While machine learning offers much promise for improving our predictive models, simple statistics can do the trick when we’re establishing the fundamentals.
Now, get out there and do some great research!
Around 2005 or so, corporate blogs became the thing to do. Big players in the business world touted that such platforms could “drive swarms of traffic to your main website, generate more product sales” and even “create an additional stream of advertising income” (Entrepreneur Magazine circa 2006). With promises like that, what marketer or exec wouldn’t jump on the blog bandwagon?
Unfortunately, initial forays into branded content did not always dwell on minor issues like “quality” or “entertainment,” instead focusing on sheer bulk and, of course, all the keywords. Now we have learned better, and many corporate blogs are less prolific and offer more value. But on some sites, behind many, many “next page” clicks, this old content can still be found lurking in the background.
This situation leaves current SEOs and content teams in a bit of a pickle. What should you do if your site has excessive quantities of old blog posts? Are they okay just sitting there? Do you need to do something about them?
Why bother addressing old blog posts?
On many sites, the sheer number of pages is the biggest reason to consider improving or scaling back old content. If past content managers chose quantity over quality, heaps of old posts eventually get buried, every evergreen topic has been written about before, and it becomes increasingly hard to keep inventory of your content.
From a technical perspective, depending on the scale of the old content you’re dealing with, pruning back the number of pages that you put forward can help increase your crawl efficiency. If Google has to crawl 1,000 URLs to find 100 good pieces of content, they are going to take note and not spend as much time combing through your content in the future.
From a marketing perspective, your content represents your brand, and improving the set of content that you put forward helps shape the way customers see you as an authority in your space. Optimizing and curating your existing content can give your collection a clearer topical focus, make it more easily discoverable, and ensure that it provides value for users and the business.
Zooming out for a second to look at this from a higher level: If you’ve already decided that it’s worth investing in blog content for your company, it’s worth getting the most from your existing resources and ensuring that they aren’t holding you back.
Decide what to keep: Inventory and assessment
The first thing to do before assessing your blog posts is to make sure you know what you have. A full list of URLs and corresponding metadata is incredibly helpful for both evaluating and documenting.
Depending on the content management system that you use, obtaining this list can be as simple as exporting a database field. Alternatively, URLs can be gleaned from a combination of Google Analytics data, Webmaster Tools, and a comprehensive crawl with a tool such as Screaming Frog. This post gives a good outline of how to get the data you need from these sources.
Regardless of whether you have a list of URLs yet, it’s also good to do a full crawl of your blog to see what the linking structure looks like at this point, and how that may differ from what you see in the CMS.
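Comparing the CMS export against the crawl is worth automating at any real scale. The sketch below is one way to do it with the standard library; the file paths and the `url` column name are assumptions about how your exports happen to be structured.

```python
import csv


def load_urls(path, column="url"):
    """Read a CSV export (CMS dump, crawler export, analytics report)
    into a set of normalized URLs."""
    with open(path, newline="") as f:
        return {row[column].strip().rstrip("/") for row in csv.DictReader(f)}


def compare_inventories(cms_urls, crawl_urls):
    """Split the two URL sets into discrepancies worth investigating:
    - orphaned: the CMS knows the post exists, but the crawler never
      reached it (no internal links point to it)
    - stray: the crawler found a URL the CMS export doesn't list
      (old templates, tag pages, leftovers from a migration)
    """
    return {
        "orphaned": cms_urls - crawl_urls,
        "stray": crawl_urls - cms_urls,
    }


# Hypothetical usage with exports from your CMS and crawler:
# report = compare_inventories(load_urls("cms_export.csv"),
#                              load_urls("crawl_export.csv", column="address"))
```

Orphaned posts are easy to miss in a purely crawl-based inventory, which is exactly why it pays to reconcile the two lists rather than trust either one alone.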
Once you know what you have, it’s time to assess the content and decide if it’s worth holding on to. When I do this, I like to ask these 5 questions:
1. Is it beneficial for users?
Content that’s beneficial for users is helpful, informative, or entertaining. It answers questions, helps them solve problems, or keeps them interested. This could be anything from a walkthrough for troubleshooting to a collection of inspirational photos.
2. Is it beneficial for us?
Content that is beneficial to us is earning organic rankings, traffic, or backlinks, or is providing business value by helping drive conversions. Additionally, content that can help establish branding or effectively build topical authority is great to have on any site.
3. Is it good?
While this may be a bit of a subjective question to ask about any content, it’s obvious when you read content that isn’t good. This is about fundamentals: content that doesn’t make sense, has tons of grammatical errors, is organized poorly, or doesn’t seem to have a point to it.
4. Is it relevant?
If content isn’t at least tangentially relevant to your site, industry, or customers, you should have a really good reason to keep it. If it doesn’t meet any of the former qualifications already, it probably isn’t worth holding on to.
5. Is it causing any issues?
Problematic content may include duplicate content, duplicate targeting, plagiarized text, content that is a legal liability, or any other number of issues that you probably don’t want to deal with on your site. I find that the assessment phase is a particularly good opportunity to identify posts that target the same topic, so that you can consolidate them.
Using these criteria, you can divide your old blog posts into buckets of “keep” and “don’t keep.” The “don’t keep” posts can be 301 redirected to either the most relevant related post or the blog homepage. Then it’s time to further address the others.
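Once the "don't keep" bucket is settled, generating the 301 rules mechanically avoids typos at scale. A minimal sketch, assuming nginx as the server and made-up URLs; the same map could just as easily be rendered as Apache `Redirect 301` lines.

```python
def build_redirect_map(dont_keep, related_post=None, blog_home="/blog/"):
    """Map each retired URL to a 301 target: the most relevant related
    post when one was identified during assessment, otherwise the
    blog homepage as a fallback."""
    related_post = related_post or {}
    return {url: related_post.get(url, blog_home) for url in dont_keep}


def to_nginx_rules(redirect_map):
    """Render the map as nginx rewrite directives (`permanent` = 301)."""
    return [f"rewrite ^{src}$ {dst} permanent;"
            for src, dst in sorted(redirect_map.items())]


# Hypothetical usage:
mapping = build_redirect_map(
    ["/2006/keyword-stuffed-post", "/2007/duplicate-topic-post"],
    related_post={"/2007/duplicate-topic-post": "/guides/consolidated-topic"},
)
for rule in to_nginx_rules(mapping):
    print(rule)
```

Keeping the mapping in code (or a spreadsheet it's generated from) also doubles as documentation of what was retired and where its link equity was pointed.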
What to do with the posts you keep
So now you have a pile of “keep” posts to sort out! All the posts that made it this far have already been established to have value of some kind. Now we want to make the most of that value by improving, expanding, updating, and promoting the content.
When setting out to improve an old post that has good bones, it can be good to start with improvements on targeting and general writing and grammar. You want to make sure that your blog post has a clear point, is targeting a specific topic and terms, and is doing so in proper English (or whatever language your blog may be in).
Once the content itself is in good shape, make sure to add any technical improvements that the piece may need, such as relevant interlinking, alt text, or schema markup.
Then it’s time to make sure it’s pretty. Visual improvements such as adding line breaks, pull quotes, and imagery impact user experience and can keep people on the page longer.
Expand or update
Not all old blog posts are necessarily in poor shape, which can offer a great opportunity. Another way to get more value out of them is to repurpose or update the information that they contain to make old content fresh again. Data says that this is well worth the effort, with business bloggers that update older posts being 74% more likely to report strong results.
A few ways to expand or update a post are to explore a different take on the initial thesis, add newer data, or integrate more recent developments or changed opinions. Alternatively, you could expand on a piece of content by reinterpreting it in another medium, such as new imagery, engaging video, or even as audio content.
If you’ve invested resources in content creation and optimization, it only makes sense to try to get as many eyes as possible on the finished product. This can be done in a few different ways, such as sharing and re-sharing on branded social channels, resurfacing posts to the front page of your blog, or even a bit of external promotion through outreach.
Once your blog has been pruned and you’re working on getting the most value out of your existing content, an important final step is to keep tabs on the effect these changes are having.
The most significant measure of success is organic traffic; even if your blog is designed for lead generation or other specific goals, the number of eyes on the page should correlate strongly with the content’s success by other measures as well. For the best representation of traffic totals, I monitor organic sessions by landing page in Google Analytics.
I also like to keep an eye on organic rankings, as you can get an early glimpse of whether a piece is gaining traction around a particular topic before it’s successful enough to earn organic traffic with those terms.
Remember that regardless of what changes you’ve made, it will usually take Google a few months to sort out the relevance and rankings of the updated content. So be patient, monitor, and keep expanding, updating, and promoting!