How does Google decide what goes into the local pack? It doesn’t have to be a black box — there’s logic behind the order. In this week’s Whiteboard Friday, renowned local SEO expert Mary Bowling lays out the three factors that drive Google’s local algorithm and local rankings in a simple and concise way anyone can understand.
Hi, Moz fans. This is Mary Bowling from Ignitor Digital, and today I want to talk to you about the local algorithm. I’d like to make this as simple as possible for people to understand, because I think it’s a very confusing thing for a lot of SEOs who don’t do this every day.
The local algorithm has always been based on relevance, prominence, and proximity.
For relevance, what the algorithm is asking is, “Does this business do or sell or have the attributes that the searcher is looking for?” That’s pretty simple. So that gives us all these businesses over here that might be relevant. For prominence, the algorithm is asking, “Which businesses are the most popular and the most well regarded in their local market area?”
For proximity, the question really is, “Is the business close enough to the searcher to be considered to be a good answer for this query?” This is what trips people up. This is what really defines the local algorithm — proximity. So I’m going to try to explain that in very simple terms here today.
Let’s say we have a searcher in a particular location, and she’s really hungry today and she wants some egg rolls. So her query is egg rolls. If she were to ask for egg rolls near me, these businesses are the ones that the algorithm would favor.
They are the closest to her, and Google would rank them most likely by their prominence. If she were to ask for something in a particular place, let’s say this is a downtown area and she asked for egg rolls downtown because she didn’t want to be away from work too long, then the algorithm is actually going to favor the businesses that sell egg rolls in the downtown area even though that’s further away from where the searcher is.
If she were to ask for egg rolls open now, there might be a business here and a business here and a business here that are open now, and they would be the ones that the algorithm would consider. So relevance is kicking in on the query. If she were to ask for the cheapest egg rolls, that might be here and here.
If she were to ask for the best egg rolls, that might be very, very far away, or it could be a combination of all kinds of locations. So you really need to think of proximity as a fluid thing. It’s like a rubber band, and depending on…
- the query,
- the searcher’s location,
- the relevance to the query,
- and the prominence of the business
…that determines what Google is going to show in that local pack.
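One way to internalize the rubber-band idea is as a toy scoring function. This is purely illustrative (Google's actual algorithm and weights are not public; the businesses, weights, and radius values here are invented): proximity acts as a query-dependent filter whose radius stretches or shrinks, and prominence then orders the businesses that remain.

```python
from dataclasses import dataclass

@dataclass
class Business:
    name: str
    relevance: float   # 0-1: does it match what the searcher asked for?
    prominence: float  # 0-1: reviews, links, reputation in the market
    miles_away: float  # distance from the searcher

def local_score(b: Business, max_radius: float) -> float:
    """Toy score: proximity gates inclusion, prominence drives the order."""
    if b.relevance == 0 or b.miles_away > max_radius:
        return 0.0  # not relevant, or outside the rubber band
    proximity = 1 - b.miles_away / max_radius
    return b.relevance * (0.9 * b.prominence + 0.1 * proximity)

shops = [
    Business("Egg Roll Hut", 1.0, 0.9, 0.5),
    Business("Downtown Wok", 1.0, 0.7, 2.0),
    Business("Far Famous Rolls", 1.0, 1.0, 9.0),
]

# "egg rolls near me": tight radius, so nearby shops win
near_me = sorted(shops, key=lambda b: -local_score(b, 3))
# "best egg rolls": the rubber band stretches, so prominence dominates
best = sorted(shops, key=lambda b: -local_score(b, 20))
```

With the tight radius, the famous-but-far shop is filtered out entirely; with the stretched radius, it jumps to the top, which is the behavior the rubber-band metaphor describes.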
I hope that makes it much clearer to those of you who haven’t understood the Local Algorithm. If you have some comments or suggestions, please make them below and thanks for listening.
It’s that time of year again. Professional development budgets are being distributed and you’re daydreaming of Roger hugs and fist bumps. Well, this is a call to arms! It’s time to get down to business and convince your boss that you HAVE to go to MozCon 2020.
You’re already well acquainted with the benefits of MozCon. Maybe you’re a MozCon alumnus. You may have lurked the hashtag once or twice for inside tips and you’ve likely followed the work of some of the speakers for a while. But how are you going to relay that to your boss in a way that sells? Don’t worry, we’ve got a plan.
(And if you want to skip ahead to the letter template, here it is!)
Step #1 – Gather evidence
Alright, so just going in and saying “Rand Fishkin is brilliant and have you seen any of Britney Muller’s Whiteboard Fridays lately?!” probably won’t do the trick — we need some cold hard facts that you can present.
MozCon delivers actionable insights
It’s easy to say that MozCon provides actionable insights, but how do you prove it? A quick scroll through our Facebook Group can prove to anyone that not only is MozCon a gathering of the greatest minds in search, but it also acts as an incubator and facilitator for SEO strategies.
If you can’t get your boss on Facebook, just direct them to the blog post written by Croud: Four things I changed immediately after MozCon. Talk about actionable! A quick Google (or LinkedIn) search will return dozens of similar recaps. Gather a few of these to have in your toolbelt just in case.
Or, if you have the time, pick out some of the event tweets from previous years that relate most to your company. The MozCon hashtag (#MozCon) has plenty of tweets to choose from — things like research findings, workflows, and useful tools are all covered. Some of our favorites from last year are listed below.
Attendees are often given access to exclusive tools and betas by the speakers, and that is something you don’t want to miss!
The networking is unbeatable
The potential knowledge gain doesn’t end with keynote speeches. Many of our speakers stick around for the entire conference and host niche- and vertical-specific Birds of a Feather tables over lunch, in addition to attending the networking events. If you find yourself with questions about their strategies, you’ll often have the ability to ask them directly.
But the speakers aren’t the only folks worth networking with. We hand-select industry vendors to attend the conference and showcase their products. These vendors are also available for training and showcases throughout the entire conference.
Lastly, your peers! There’s no better way to learn than from those who overcome the same obstacles as you. Opportunities for collaboration and peer-to-peer learning are often invaluable (especially those that happen over yummy snacks) and can lead to better workflows, new business, and even exciting partnerships.
Step #2 – Break down the costs
This is where the majority of the conversation will be focused, but fear not, Roger has already done most of the heavy lifting. So let’s cut to the chase. The goal of MozCon isn’t to make money — the goal is to break even and lift up our friends in search.
Every year we work with our speakers to bring cutting-edge content to the stage. You can be sure that the content you’ll be exposed to will set you up for a year of success.
Videos for everyone
While your coworkers won’t be able to enjoy Top Pot doughnuts or KuKuRuZa popcorn, they will be able to see all of the talks via professional video and audio. Your ticket to MozCon includes a professional video package which allows you (and your whole team) to watch every single talk post-conference, for free. (It’s a $350 value for the videos alone!)
MozCon doesn’t do anything halfway. We strive to go above and beyond in everything we do, and the food options are no exception. MozCon works with local vendors to ensure there are tasty, sustainable meals for everyone, including those with special diets. From breakfast to lunch and all the snacks in between, MozCon has you covered (and saves your T&E budget a few bucks, as well).
Not to brag, but our swag is pretty great. In addition to special MozCon memorabilia, you can look forward to other useful and fun items that vary from year to year. Previous gifts include “conference health” fanny packs (complete with Emergen-C!), moleskin notebooks, reusable water bottles, and phone chargers.
This is probably the detail that’ll make your boss’s ears perk up. There are indeed discounts available for MozCon tickets! If you’re buying now through January 31st, 2020, Early Bird pricing is in effect, which saves you a cool $200 off regular ticket costs. If you’ve got a team interested in attending, we offer group discounts for parties of 5+ as well. And my final top-secret tip: if your company already subscribes to a Moz product, you can save even more — up to $700 off per regular-priced ticket if you snag Early Bird pricing, or $500 off after January 31st. That’s a real chunk of change!
Step #3 – Be prepared to prove value
It’s important to go into the conference with a plan to bring back value. It’s easy to come to any conference and just enjoy the food and company (especially this one), but it’s harder to take the information gained and implement change.
Make a plan
Before approaching your boss, make sure you have a plan on how you’re going to show off all of the insights you gather at MozCon! Obviously, you’ll be taking notes — whether it’s to the tune of live tweets, bullet journals, or doodles, those notes are most valuable when they’re backed up by action.
Putting it into action
Set expectations with your boss. “After each day, I’ll select three takeaways and create a plan on how to execute them.” Who could turn down nine potential business-changing strategies?!
And it really isn’t that hard! Especially not with the content that you’ll have access to. At the close of each day, we recommend you look back over your notes and do a brain-dump.
- How did today’s content relate to your business?
- Which sessions resonated and would bring the most value to your team?
- Which strategies can easily be executed?
- Which would make the biggest impact?
After you identify those strategies, create a plan of action that will get you on track for implementing change.
(Fun fact — if you’re traveling, this can actually be done on the plane ride home!)
If you have clients on retainer, ongoing training for employees is something those clients should appreciate — it ensures you’re staying ahead of the game. Offer to not only debrief your in-house SEO team, but to also present to your clients. This sort of presentation is a value add that many clients don’t get and can set your business apart.
These presentations can be short blurbs at the beginning of a regular meeting or a chance to gather up all of your clients and enjoy a bit of networking and education.
Still not enough?
Give the boss a taste of MozCon by having them check out some videos from years past to see the caliber of our speakers. And if you’re looking to break into the speaking circuit, you can also take your shot at securing a community speaker spot onstage. Most years, the call for community speakers opens up in early springtime — keep an eye on the Moz Blog for your chance to pitch!
Lastly, the reviews speak for themselves. MozCon is perfect for SEOs of any level and we even factor in time for you to get a little work done in-between sessions — Vaneese can tell you!
Our fingers are crossed!
Alright friend, now is your time to shine. We’ve equipped you with some super-persuasive tools and we’ll be crossing our fingers that the boss gives you the “okay!” Be sure to grab the letter template and make your case the easy way:
If you can make it, we promise to spoil you to the tune of endless Starbucks coffee, tons of new friends, and an experience that will change your perspective on search. We hope to see your smiling face at MozCon 2020!
Image credit: Visit Lakeland
Reporting fake and duplicate listings to Google sounds hard. Sometimes it can be. But very often, it’s as easy as falling off a log, takes only a modest session of spam fighting and can yield significant local ranking improvements.
If your local business (or the local brands your agency markets) isn’t using spam fighting as a ranking tactic because you feel you lack the time or skills, please sit down with me for a sec.
What if I told you I spent about an hour yesterday doing something that moved a Home Depot location up 3 spots in a competitive market in Google’s local rankings less than 24 hours later? What if, for you, moving up a spot or two would get you out of Google’s local finder limbo and into the actual local pack limelight?
Today I’m going to show you exactly what I did to fight spam, how fast and easy it was to sweep out junk listings, and how rewarding it can be to see results transform in favor of the legitimate businesses you market.
Washing up the shady world of window blinds
Image credit: Aqua Mechanical
Who knew that shopping for window coverings would lead me into a den of spammers throwing shade all over Google?
The story of Google My Business spam is now more than a decade in the making, with scandalous examples like fake listings for locksmiths and addiction treatment centers proving how unsafe and unacceptable local business platforms can become when left unguarded.
But even in non-YMYL industries, spam listings deceive the public, waste consumers’ time, inhibit legitimate businesses from being discovered, and erode trust in the spam-hosting platform. I saw all of this in action when I was shopping to replace some broken blinds in my home, and it was such a hassle trying to find an actual vendor amid the chaff of broken, duplicate, and lead gen listings that I decided to do something about it.
I selected an SF Bay area branch of Home Depot as my hypothetical “client.” I knew they had a legitimate location in the city of Vallejo, CA — a place I don’t live but sometimes travel to, thereby excluding the influence of proximity from my study. I knew that they were only earning an 8th place ranking in Google’s Local Finder, pushed down by spam. I wanted to see how quickly I could impact Home Depot’s surprisingly bad ranking.
I took the following steps, and encourage you to take them for any local business you’re marketing, too:
Step 1: Search
While located at the place of business you’re marketing, perform a Google search (or have your client perform it) for the keyword phrase for which you most desire improved local rankings. Of course, if you’re already ranking as well as you want to for the searchers nearest you, you can still follow this process to investigate somewhat more distant areas within your potential reach where you want to increase visibility.
In the results from your search, click on the “more businesses” link at the bottom of the local pack, and you’ll be taken to the interface commonly called the “Local Finder.”
The Local Finder isn’t typically 100% identical to the local pack in exact ranking order, but it’s the best place I know of to see how things stand beyond the first 3 results that make up Google’s local packs, telling a business which companies they need to surpass to move up towards local pack inclusion.
Find yourself in the local finder. In my case, the Home Depot location was at position 8. I hope you’re somewhere within the first set of 20 results Google typically gives, but if you’re not, keep paging through until you locate your listing. If you don’t find yourself at all, you may need to troubleshoot whether an eligibility issue, suspension, or filter is at play. But, hopefully that’s not you today.
Step 2: Create a spreadsheet
Next, create a custom spreadsheet to record your findings. Or, much easier, just make a copy of mine!
Populate the spreadsheet by cutting and pasting the basic NAP (name, address, phone) for every competitor ranking above you, and include your own listing, too, of course! If you work for an agency, you’ll need to get the client to help you with this step by filling the spreadsheet out based on their search from their place of business.
In my case, I recorded everything in the first 20 results of the Local Finder, because I saw spam both above and below my “client,” and wanted to see the total movement resulting from my work in that result set.
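If you prefer scripting to hand-copying, a minimal sketch like this can seed the same spreadsheet as a CSV. The column layout mirrors the process described above; the business names, addresses, and phone numbers here are invented purely for illustration.

```python
import csv

# Columns mirror the audit spreadsheet: Local Finder position plus NAP,
# with room for spam notes, the action you took, and the result.
COLUMNS = ["Position", "Name", "Address", "Phone", "Notes", "Action taken", "Result"]

# Hypothetical competitors copied from a Local Finder result set
competitors = [
    (1, "Blinds R Us", "123 Main St, Vallejo, CA", "707-555-0101", "", "", ""),
    (2, "Custom Window Treatments in Fairfield, CA Hunter Douglas Dealer",
     "456 Oak Ave, Fairfield, CA", "707-555-0102", "Keyword-stuffed name", "", ""),
]

with open("local_finder_audit.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(COLUMNS)
    writer.writerows(competitors)
```

From there, you (or your client) fill in the Notes column as you work through the checks in the next step.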
Step 3: Identify obvious spam
We want to catch the easy fish today. You can go down rabbit holes another day, trying to ferret out weirdly woven webs of lead gen sites spanning the nation, but today, we’re just looking to weed out listings that clearly, blatantly don’t belong in the Local Finder.
Go through these five easy steps:
- Look at the Google Streetview image for each business outranking you.
Do you see a business with signage that matches the name on the listing? Move on. But if you see a house, an empty parking lot, or Google is marking the listing as “location approximate”, jot that down in the Notes section of your spreadsheet. For example, I saw a supposed window coverings showroom that Streetview was locating in an empty lot on a military base. Big red flag there.
- Make note of any businesses that share an address, phone number, or very similar name.
Make note of anything with an overly long name that seems more like a string of keywords than a brand. For example, a listing in my set was called: Custom Window Treatments in Fairfield, CA Hunter Douglas Dealer.
- For every business you noted down in steps one and two, get on the phone.
Is the number a working number? If someone answers, do they answer with the name of the business? Note it down. Say, “Hi, where is your shop located?” If the answer is that it’s not a shop but a mobile business, note that down. Finally, if anything seems off, check the Guidelines for representing your business on Google to see what’s allowed in the industry you’re investigating. For example, it’s perfectly okay for a window blinds dealer to operate out of their home, but if they’re operating out of 5 homes in the same city, it’s likely a violation. In my case, just a couple of minutes on the phone identified multiple listings with phone numbers that were no longer in service.
- Visit the iffy websites.
Now that you’re narrowing your spreadsheet down to a set of businesses that are either obviously legitimate or “iffy,” visit the websites of the iffy ones. Does the name on the listing match the name on the website? Does anything else look odd? Note it down.
- Highlight businesses that are clearly spammy.
Your dive hasn’t been deep, but by now, it may have identified one or more listings that you strongly believe don’t belong because they have spammy names, fake addresses, or out-of-service phone numbers. My lightning-quick pass through my data set showed that six of the twenty listings were clearly junk. That’s 30% of Google’s info being worthless! I suggest marking these in red text in your spreadsheet to make the next step fast and easy.
Step 4: Report it!
If you want to become a spam-fighting ace later, you’ll need to become familiar with Google’s Business Redressal Complaint Form which gives you lots of room for sharing your documentation of why a listing should be removed. In fact, if an aggravating spammer remains in the Local Finder despite what we’re doing in this session, this form is where you’d head next for a more concerted effort.
But, today, I promised the easiness of falling off a log, so our first effort at impacting the results will simply focus on the “suggest an edit” function you’ll see on each listing you’re trying to get rid of. This is how you do it:
After you click the “suggest an edit” button on the listing, a popup will appear. If you’re reporting something like a spammy name, click the “change name or other details” option and fill out the form. If you’ve determined a listing represents a non-existent, closed, unreachable, or duplicate entity, choose the “remove this place” option and then select the dropdown entry that most closely matches the problem. You can add a screenshot or other image if you like, but in my quick pass through the data, I didn’t bother.
Record the exact action you took for each spam listing in the “Actions” column of the spreadsheet. In my case, I was reporting a mixture of non-existent buildings, out-of-service phone numbers, and one duplicate listing with a spammy name.
Finally, hit the “send” button and you’re done.
Step 5: Record the results
Within an hour of filing my reports with Google, I received an email like this for 5 of the 6 entries I had flagged:
The only entry I received no email for was the duplicate listing with the spammy name. But I didn’t let this worry me. I went about the rest of my day and checked back in the morning.
I’m not fond of calling out businesses in public. Sometimes, there are good folks who are honestly confused about what’s allowed and what isn’t. Also, I sometimes find screenshots of the local finder overwhelmingly cluttered and endlessly long to look at. Instead, I created a bare-bones representational schematic of the total outcome of my hour of spam-fighting work.
The red markers are legit businesses. The grey ones are spam. The green one is the Home Depot I was trying to positively impact. I attributed a letter of the alphabet to each listing, to better help me see how the order changed from day one to day two. The lines show the movement over the course of the 24 hours.
The results were that:
- A stayed the same. B and C swapped positions, but that was unlikely to be due to my work; local rankings can fluctuate like this from hour to hour.
- Five out of six spam listings I reported disappeared. The keyword-stuffed duplicate listing which was initially at position K was replaced by the brand’s legitimate listing one spot lower than it had been.
- The majority of the legitimate businesses enjoyed upward movement, with the exception of position I which went down, and M and R which disappeared. Perhaps new businesses moving into the Local Finder triggered a filter, or perhaps it was just the endless tide of position changes and they’ll be back tomorrow.
- Seven new listings made it into the top 20. Unfortunately, at a glance, it looked to me like 3 of these new listings were new spam. Dang, Google!
- Most rewardingly, my hypothetical client, Home Depot, moved up 3 spots. What a super easy win!
Fill out the final column in your spreadsheet with your results.
What we’ve learned
You battle upstream every day for your business or clients. You twist yourself like a paperclip complying with Google’s guidelines, seeking new link and unstructured citation opportunities, straining your brain to shake out new content, monitoring reviews like a chef trying to keep a cream sauce from separating. You do all this in the struggle for better, broader visibility, hoping that each effort will incrementally improve reputation, rankings, traffic, and conversions.
Catch your breath. Not everything in life has to be so hard. The river of work ahead is always wide, but don’t overlook the simplest stepping stones. Saunter past the spam listings without breaking a sweat and enjoy the easy upward progress!
I’d like to close today with three meditations:
1. Google is in over their heads with spam
Google is in over their heads with spam. My single local search for a single keyword phrase yielded 30% worthless data in their top local results. Google says they process 63,000 searches per second and that as much as 50% of mobile queries have a local intent. I don’t know any other way to look at Google at this point than as an under-regulated public utility.
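To put those quoted figures in perspective, here is a quick back-of-envelope calculation. Note it treats the 50% figure (which Google stated for mobile queries specifically) as a rough share of all searches, so it is only a ballpark upper bound on daily local-intent volume.

```python
# Figures quoted above: 63,000 searches per second, ~50% local intent
searches_per_second = 63_000
local_intent_share = 0.50  # stated for mobile queries; used here as a rough bound

searches_per_day = searches_per_second * 60 * 60 * 24
local_searches_per_day = int(searches_per_day * local_intent_share)

print(f"~{searches_per_day:,} searches/day; ~{local_searches_per_day:,} potentially local")
```

Billions of potentially local queries a day is the scale at which 30% junk results becomes a serious public problem.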
Expert local SEOs can spot spam listings in query after query, industry after industry, but Google has yet to staff a workforce or design an algorithm sufficient to address bad data that has direct, real-world impacts on businesses and customers. I don’t know if they lack the skills or the will to take responsibility for this enormous problem they’ve created, but the problem is plain. Until Google steps up, my best advice is to do the smart and civic work of watchdogging the results that most affect the local community you serve. It’s a positive not just for your brand, but for every legitimate business and every neighbor near you.
2. You may get in over your head with spam
You may get in over your head with spam. Today’s session was as simple as possible, but GMB spam can stem from complex, global networks. The Home Depot location I randomly rewarded with a 3-place jump in Local Finder rankings clearly isn’t dedicating sufficient resources to spam fighting or they would’ve done this work themselves.
But the extent of spam is severe. If your market is one that’s heavily spammed, you can quickly become overwhelmed by the problem. In such cases, I recommend that you:
- Read this excellent recent article by Jessie Low on the many forms spam can take, plus some great tips for more strenuous fighting than we’ve covered today.
- Follow Joy Hawkins, Mike Blumenthal, and Jason Brown, all of whom publish ongoing information on this subject. If you wade into a spam network, I recommend reporting it to one or more of these experts on Twitter, and, if you wish to become a skilled spam fighter yourself, you will learn a lot from what these three have published.
- If you don’t want to fight spam yourself, hire an agency that has the smarts to be offering this as a service.
- You can also report listing spam to the Google My Business Community Forum, but it’s a crowded place and it can sometimes be hard to get your issue seen.
- Finally, if the effect of spam in your market is egregious enough, your ability to publicize it may be your greatest hope. Major media have now repeatedly featured broadcasts and stories on this topic, and shame will sometimes move Google to action when no other motivation appears to.
3. Try to build a local anti-spam movement
What if you built a local movement? What if you and your friendlier competitors joined forces to knock spam out of Google together? Imagine all of the florists, hair salons, or medical practitioners in a town coming together to watch the local SERPs in shifts so that everyone in their market could benefit from bad actors being reported.
Maybe you’re already in a local business association with many hands that could lighten the work of protecting a whole community from unethical business practices. Maybe your town could then join up with the nearest major city, and that city could begin putting pressure on legislators. Maybe legislators would begin to realize the extent of the impacts when legitimate businesses face competition from fake entities and illegal practices. Maybe new anti-trust and communications regulations would ensue.
Now, I promised you “simple,” and this isn’t it, is it? But every time I see a fake listing, I know I’m looking at a single pebble and I’m beginning to think it may take an avalanche to bring about change great enough to protect both local brands and consumers. Google is now 15 years into this dynamic with no serious commitment in sight to resolve it.
At least in your own backyard, in your own community, you can be one small part of the solution with the easy tactics I’ve shared today, but maybe it’s time for local commerce to begin both doing more and expecting more in the way of protections.
I’m ready for that. And you?
What are “fraggles” in SEO and how do they relate to mobile-first indexing, entities, the Knowledge Graph, and your day-to-day work? In this glimpse into her 2019 MozCon talk, Cindy Krum explains everything you need to understand about fraggles in this edition of Whiteboard Friday.
Hi, Moz fans. My name is Cindy Krum, and I’m the CEO of MobileMoxie, based in Denver, Colorado. We do mobile SEO and ASO consulting. I’m here in Seattle, speaking at MozCon, but also recording this Whiteboard Friday for you today, and we are talking about fraggles.
So Fraggles are obviously a name that I’m borrowing from Jim Henson, who created “Fraggle Rock.” But it’s a combination of two words: fragment and handle. I talk about fraggles as a new element or thing that Google is indexing.
Fraggles and mobile-first indexing
Let’s start with the idea of mobile-first indexing, because you have to kind of understand that before you can go on to understand fraggles. So I believe mobile-first indexing is about a little bit more than what Google says. Google says that mobile-first indexing was just a change of the crawler.
They had a desktop crawler that was primarily crawling and indexing, and now they have a mobile crawler that’s doing the heavy lifting for crawling and indexing. While I think that’s true, I think there’s more going on behind the scenes that they’re not talking about, and we’ve seen a lot of evidence of this. So what I believe is that mobile-first indexing was also about indexing, hence the name.
Knowledge Graph and entities
So I think that Google has reorganized their index around entities or around specifically entities in the Knowledge Graph. So this is kind of my rough diagram of a very simplified Knowledge Graph. But Knowledge Graph is all about person, place, thing, or idea.
Nouns are entities. The Knowledge Graph has nodes for all of the major person, place, thing, or idea entities out there. But it also organizes the relationships of this idea to that idea, or this thing to that thing. What’s useful to Google is that these concepts and relationships stay true in all languages, and that’s how entities work, because entities happen before keywords.
This can be a hard concept for SEOs to wrap their brains around because we’re so used to dealing with keywords. But if you think about an entity as something that’s described by a keyword and can be language-agnostic, that’s how Google thinks about entities, because entities in the Knowledge Graph aren’t stored as words per se: their unique identifier isn’t a word, it’s a number, and numbers are language-agnostic.
But if we think about an entity like mother, mother is a concept that exists in all languages, but we have different words to describe it. Regardless of what language you’re speaking, mother is related to father, is related to daughter, is related to grandfather, all in the same ways, even if we’re speaking different languages. So if Google can use what they call the “topic layer” and entities as a way to filter in information and understand the world, then they can do it in languages where they’re strong and say, “We know that this is true absolutely 100% of the time.”
Then they can apply that understanding to languages that they have a harder time indexing or understanding, where they’re just not as strong, or where the algorithm isn’t built to handle the complexities of the language, like German, with its really long compound words, or languages that use lots of short words to mean different things or to modify other words.
Languages all work differently. But if they can use their translation API and their natural language APIs to build out the Knowledge Graph in places where they’re strong, then they can use it with machine learning to also build it and do a better job of answering questions in places or languages where they’re weak. So when you understand that, then it’s easy to think about mobile-first indexing as a massive Knowledge Graph build-out.
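To make the “numbers, not words” idea concrete, here is a toy sketch of language-agnostic entities. The entity IDs, labels, and relation names are all invented for illustration; real Knowledge Graph identifiers are opaque machine IDs, not friendly words.

```python
# Entities keyed by opaque numeric IDs, with per-language labels.
entities = {
    1001: {"en": "mother", "de": "Mutter", "es": "madre"},
    1002: {"en": "father", "de": "Vater", "es": "padre"},
    1003: {"en": "daughter", "de": "Tochter", "es": "hija"},
}

# Relationships are stated once, over IDs, so they hold no matter
# which language eventually renders the labels.
relations = {(1001, "spouse_of", 1002), (1003, "child_of", 1001)}

def describe(subj: int, rel: str, obj: int, lang: str) -> str:
    """Render a language-independent relation in a chosen language."""
    return f"{entities[subj][lang]} {rel} {entities[obj][lang]}"
```

The point of the sketch: the graph learned in one language transfers for free, because only the label lookup changes, never the relationship itself.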
We’ve seen this happening statistically. There are more Knowledge Graph results and more other things that seem to be related to Knowledge Graph results, like People Also Ask, People Also Search For, and related searches. Those are all describing different elements or different nodes on the Knowledge Graph. So when you see those things in the search, I want you to think, hey, this is the Knowledge Graph showing me how this topic is related to other topics.
When you put this in that context, it makes more sense. Google wants the entity understanding, or knows that entity understanding is really important, so hreflang is also really important. So that’s enough of that. Now let’s talk about fraggles.
Fraggles = fragment + handle
So fraggles, as I said, are a fragment plus a handle. It’s important to know — let me go over here — that there are lots of things out there that have fragments. You can think of native apps, databases, websites, podcasts, and videos. Those can all be fragmented.
Even though they don’t have a URL, they might be useful content, because Google says its goal is to organize the world’s information, not to organize the world’s websites. I think that, historically, Google has kind of been locked into this crawling and indexing of websites and that that’s bothered it, that it wants to be able to show other stuff, but it couldn’t do that because they all needed URLs.
But with fragments, potentially they don’t have to have a URL. So keep these things in mind — apps, databases and stuff like that — and then look at this.
So this is a traditional page. If you think about a page, Google has kind of been forced, historically by their infrastructure, to surface pages and to rank pages. But pages sometimes struggle to rank if they have too many topics on them.
So for instance, what I’ve shown you here is a page about vegetables. This page may be the best page about vegetables, and it may have the best information about lettuce, celery, and radishes. But because it’s got those topics and maybe more topics on it, they all kind of dilute each other, and this great page may struggle to rank because it’s not focused on the one topic, on one thing at a time.
Google wants to rank the best things. But historically they’ve kind of pushed us to put the best things on one page at a time and to break them out. So what that’s created is this “content is king, I need more content, build more pages” mentality in SEO. The problem is everyone can be building more and more pages for every keyword that they want to rank for or every keyword group that they want to rank for, but only one is going to rank number one.
Google still has to crawl all of those pages that it told us to build, and that creates this character over here, I think, Marjory the Trash Heap, which if you remember the Fraggles, Marjory the Trash Heap was the all-knowing oracle. But when we’re all creating kind of low- to mid-quality content just to have a separate page for every topic, then that makes Google’s life harder, and that of course makes our life harder.
So why are we doing all of this work? The answer is because Google can only index pages, and if the page is too long or covers too many topics, Google gets confused. So we've been enabling Google to do this. But let's pretend, go with me on this, because this is a theory, I can't prove it. But if Google didn't have to index a full page, or wasn't locked into that, and could just index a piece of a page, then that makes it easier for Google to understand the relationships of different topics to one page, but also to organize the bits of the page into different pieces of the Knowledge Graph.
So this page about vegetables could be indexed and organized under the vegetable node of the Knowledge Graph. But that doesn’t mean that the lettuce part of the page couldn’t be indexed separately under the lettuce portion of the Knowledge Graph and so on, celery to celery and radish to radish. Now I know this is novel, and it’s hard to think about if you’ve been doing SEO for a long time.
But let’s think about why Google would want to do this. Google has been moving towards all of these new kinds of search experiences where we have voice search, we have the Google Home Hub kind of situation with a screen, or we have mobile searches. If you think about what Google has been doing, we’ve seen the increase in people also ask, and we’ve seen the increase in featured snippets.
They’ve actually been kind of, sort of making fragments for a long time or indexing fragments and showing them in featured snippets. The difference between that and fraggles is that when you click through on a fraggle, when it ranks in a search result, Google scrolls to that portion of the page automatically. That’s the handle portion.
So handles you may have heard of before. They’re kind of old-school web building. We call them bookmarks, anchor links, anchor jump links, stuff like that. It’s when it automatically scrolls to the right portion of the page. But what we’ve seen with fraggles is Google is lifting bits of text, and when you click on it, they’re scrolling directly to that piece of text on a page.
So we see this already happening in some results. What's interesting is Google is overlaying the link. You don't have to program the jump link in there. Google actually finds it and puts it there for you. So Google is already doing this, especially with AMP featured snippets. If you have an AMP featured snippet, so a featured snippet that's lifted from an AMP page, when you click through, Google is actually scrolling and highlighting the featured snippet so that you can read it in context on the page.
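As an aside, the scroll-and-highlight behavior described here is, in Chrome at least, exposed through the text fragment URL syntax; the page URL below is just a placeholder:

```
https://example.com/vegetables#:~:text=lettuce%20is%20a%20leafy%20green
```

A supporting browser scrolls to the first occurrence of the quoted text and highlights it, with no author-side anchor or jump link required.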
But it's also happening in other, more nuanced situations, especially with forums and conversations where they can pick a best answer. The difference between a fraggle and something like a jump link is that Google is overlaying the scrolling portion. The difference between a fraggle and a site link is that site links link to other pages, while fraggles link to multiple pieces of the same long page.
So we want to avoid continuing to build up low-quality or mid-quality pages that might go to Marjory the Trash Heap. We want to start thinking in terms of can Google find and identify the right portion of the page about a specific topic, and are these topics related enough that they’ll be understood when indexing them towards the Knowledge Graph.
Knowledge Graph build-out into different areas
So I personally think that we’re seeing the build-out of the Knowledge Graph in a lot of different things. I think featured snippets are kind of facts or ideas that are looking for a home or validation in the Knowledge Graph. People also ask seem to be the related nodes. People also search for, same thing. Related searches, same thing. Featured snippets, oh, they’re on there twice, two featured snippets. Found on the web, which is another way where Google is putting expanders by topic and then giving you a carousel of featured snippets to click through on.
So we’re seeing all of those things, and some SEOs are getting kind of upset that Google is lifting so much content and putting it in the search results and that you’re not getting the click. We know that 61% of mobile searches don’t get a click anymore, and it’s because people are finding the information that they want directly in a SERP.
That's tough for SEOs, but great for Google, because it means Google is providing exactly what the user wants. So they're probably going to continue to do this. I think SEOs are going to change their minds and want to be in that windowed, lifted content, because when Google starts doing this kind of thing for native apps, databases, websites, podcasts, and other content, those are new competitors you didn't have to deal with when only websites could rank. Those are more engaging kinds of content that Google will be lifting and showing in a SERP even if they don't have URLs, because Google can just window them and show them.
So you’d rather be lifted than not shown at all. So that’s it for me and featured snippets. I’d love to answer your questions in the comments, and thanks very much. I hope you like the theory about fraggles.
In link building, few things are more frustrating than finding the perfect link opportunity but being completely unable to find a contact email address.
It’s probably happened to you — if you’re trying to build links or do any sort of outreach, it almost always entails sending out a fairly significant amount of emails. There are plenty of good articles out there about building relationships within the context of link building, but it’s hard to build relationships when you can’t even find a contact email address.
So, for today, I want to focus on how you can become better at finding those important email addresses.
Link builders spend a lot of time just trying to find contact info, and it’s often a frustrating process, just because sussing out email addresses can indeed be quite difficult. The site you’re targeting might not even have a contact page in the first place. Or, if the site does have a contact page, it might only display a generic email address. And, sometimes, the site may list too many email addresses. There are eight different people with similar-sounding job titles — should you reach out to the PR person, the marketing director, or the webmaster? It’s not clear.
Whatever the case may be, finding the right email address is absolutely imperative to any successful outreach campaign. In our industry, the numbers around outreach and replies aren’t great. Frankly, it’s shocking to hear the industry standard — only 8.5% of outreach emails receive a response.
I can’t help but wonder how many mistakes are made along the way to such a low response rate.
While there are certainly instances where there is simply no clear and obvious contact method, that should be the exception — not the rule! An experienced link builder understands that finding relevant contact information is essential to their success.
That’s why I’ve put together a quick list of tips and tools that will help you to find the email addresses and contact information you need when you’re building links.
And, if you follow my advice, here is a glimpse of the results you could expect:
We don’t track clicks, in case you were wondering 😉
ALWAYS start by looking around!
First, let’s start with my golden rule: Before you fire up any tool, you should always manually look for the correct contact email yourself.
Based on my experience, tools and automation are a last resort. If you rely solely upon tools and automated solutions, you’ll end up with many more misfired emails than if you were to go the manual route. There’s a simple reason for this: the email address listed on your target website may, surprisingly, belong to the right person you should contact!
Now, if you are using a tool, it may generate dozens of email addresses, and you may never end up actually emailing the correct individual. Another reason I advocate manually looking for emails is that many email-finding tools are limited and can only find email addresses that are associated with a domain name. So, if a webmaster happens to use an @gmail.com email address, the email-finding tool will not find it.
It’s also important to only reach out to people you strongly believe will have an interest in your email in order to stay GDPR compliant.
So, always start your manual search by looking around the site. Usually, there will be a link to the contact page in the header, footer, or sidebar. If there’s not a page explicitly named “contact,” or if the contact page only has generic email addresses, that’s when I would recommend jumping to an “About Us” page, should there be one.
You always want to find a personal email, not a generic one or a contact form. Outreach is more effective when you can address a specific individual, not whoever is checking [email protected] that day.
If you encounter too many emails and aren’t sure who the best person to contact is, I suggest sending an email to your best hunch that goes something like this:
And who knows, you may even get a reply like this:
If you weren’t able to locate an email address at this point, I’d move on to the next section.
Ask search engines for help
Perhaps the contact page you were looking for was well-hidden; maybe they don’t want to be contacted that much or they’re in desperate need of a new UX person.
You can turn to search engines for help.
My go-to search engine lately is Startpage. Dubbed as the world’s most private search engine, they display Google SERPs in a way that doesn’t make you feel like you just stepped into Times Square. They also have a cool option to browse the search results anonymously with “Anonymous View.”
For our purposes, I would use the site: search operator just like this:
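The query takes a shape like this (the domain is a placeholder):

```
site:example.com contact
site:example.com "email"
```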
If there is in fact a contact page or email somewhere on their website that you were not able to find, any competent search engine will find it for you. If the above site query doesn’t return any results, then I’d start expanding my search to other corners of the web.
Use the search bar and type:
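For example, to surface any address at a given domain (placeholder shown):

```
"@example.com"
```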
If you’re looking for the email of a specific person, type their name before or after the quotation marks.
With this query you can find non-domain email addresses:
If that person’s email address is publicly available somewhere, you will likely be able to find it within the search results.
There are many, many excellent email finding tools to choose from. The first one I want to talk about is Hunter.
Hunter has a Chrome extension that’s really easy to use. After you’ve downloaded the extension, there’s not much more that needs to be done.
Go to the site which you are thinking about sending an email to, click on the extension in the top right corner of your screen, and Hunter, well, hunts.
It returns every email address it can find associated with that domain, and also allows you to filter the results based on categories.
Did I say “email address?” I meant to say email address, name, job title, etc. Essentially, it’s a one-click fix to get everything you need to send outreach.
Because I use Hunter regularly (and for good reason, as you can see), it’s the one I’m most familiar with. You can also use Hunter’s online app to look up emails in bulk.
The major downside of working in bulk is coming up with an effective formula to sift through all the emails. Hunter may generate dozens of emails for one site, leaving you to essentially guess which email address is best for outreach. And if you’re relying on guess-work, chances are pretty high you’re leaving perfectly good prospects on the table.
There are several other email finding tools to pick from and I would be remiss to not mention them. Here are 5 alternative email-finding tools:
Even though I personally try not to be too dependent on tools, the fact of the matter is that they provide the easiest, most convenient route in many cases.
The guessing game
I know there’s no word in the digital marketing world that produces more shudders than “guessing.” However, there are times when guessing is easier.
Let’s be real: there aren’t too many different ways that companies both large and small format their email addresses. It’s usually going to be something like:
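To make those patterns concrete, here's a minimal Python sketch that generates the usual candidates; the name, domain, and pattern list are illustrative, not exhaustive:

```python
def email_guesses(first, last, domain):
    """Generate common corporate email-address patterns for a person."""
    f, l = first.lower(), last.lower()
    patterns = [
        f"{f}.{l}",    # john.smith@
        f"{f}{l}",     # johnsmith@
        f"{f[0]}{l}",  # jsmith@
        f"{f}{l[0]}",  # johns@
        f,             # john@
        l,             # smith@
    ]
    return [f"{p}@{domain}" for p in patterns]

print(email_guesses("John", "Smith", "example.com"))
```

Pair the output with a verification step (covered below) rather than firing all six guesses at once.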
If you’ve ever worked for a living, you know most of the variations. But, in case you need some help, there’s a tool for that.
Now, I’m not suggesting that you just pick any one of these random addresses, send your email, cross your fingers, and hope for the best. Far from it. There are actually tools that you can use that will indicate when you’ve selected the right one.
Sales Navigator is one such tool. It's a Gmail extension that is easy to use. Simply enter the name of the person you're looking for, and it will return all of the standard variations that they may use for their email address. Then, you can actually test the address from your Gmail account. When you type the address into the proper line, a sidebar will appear on your screen. If there is no information in that sidebar, you have the wrong address. If, however, you get a return that looks like this:
Congratulations! You’ve found the right email address.
Obviously, this method only works if you know the name of the person you want to email, but just don’t have their email address. Still, in those scenarios, Sales Navigator works like a charm.
Trust, but verify
There’s nothing more annoying than when you think you’ve finally struck gold, but the gold turned out to be pyrite. Getting an email that bounces back because it wasn’t the correct address is frustrating. And even worse, if it happens too often, your email can end up on email blacklists and destroy your email deliverability.
There are ways to verify, however. At my company, we use NeverBounce. It's effective and incredibly easy to use. With NeverBounce, you can enter either individual email addresses or bulk lists, and voila!
It will let you know if that email address is currently Valid, Invalid, or Unknown. It’s that easy. Here are some other email verifiers:
Subscribe to their newsletter
Here’s one final out-of-the-box approach. This approach works more often with sites where one person clearly does most, if not all, of the work. A site where someone’s name is the domain name, for example.
If you come across a site like davidfarkas.com and you see a newsletter that can be subscribed to, hit that subscribe button. Once that’s done, you can simply reply to one iteration of the newsletter.
This method has an added benefit. An effective way of building links is building relationships, just like I said in the opening. When you can demonstrate that you’re already subscribing to a webmaster’s newsletter, you’ll be currying favor with that webmaster.
When you send a link building outreach email, you want to make sure it’s going to a real person and, even more importantly, ending up in the right hands. Sending an email to an incorrect contact periodically may seem like a negligible waste of time, but when you send emails at the volume a link builder should, the waste adds up very quickly. In fact, enough waste can kill everything else that you’re trying to accomplish.
It's well worth your time to make sure you're getting it right by putting in the effort to find the right email address. Be a picky link builder. Don't just choose the first email that comes your way, and never rely solely on tools. If you email the wrong person, it will look to them like you didn't care enough to spend time on their site, and in return, they will ignore you and your pitch.
With the tips outlined above, you’ll avoid these issues and be on your way to more successful outreach.
For some organizations, mobile apps can be an important means to capturing new leads and customers, so it can be alarming when you notice your app visits are declining.
However, while there is content on how to optimize your app, otherwise known as ASO (App Store Optimization), there is little information out there on the steps required to diagnose a drop in app visits.
Although there are overlaps with traditional search, there are unique factors that play a role in app store visibility.
The aim of this blog is to give you a solid foundation for investigating a drop in app store visits, and then we'll go through some quick-fire opportunities to win that traffic back.
We’ll go through the process of investigating why your app traffic declined, including:
- Identifying potential external factors
- Identifying the type of keywords that dropped in visits
- Analyzing app user engagement metrics
And we’ll go through some ways to help you win traffic back including:
- Spying on your competitors
- Optimizing your store listing
- Investing in localisation
Investigating why your app traffic declined
Step 1. Identify potential external factors
Some industries/businesses will have certain periods of the year where traffic may drop due to external factors, such as seasonality.
Before you begin investigating a traffic drop further:
- Talk to your point of contact and ask whether seasonality impacts their business, or whether there are general industry trends at play. For example, aggregator sites like SkyScanner may see a drop in app visits after the busy period at the start of the year.
- Identify whether app installs actually dropped. If they didn't, then you probably don't need to worry about the drop in visits too much; it could be Google's and Apple's algorithms better aligning search results with the intent of search terms.
Step 2. Identify the type of keywords that dropped in visits
Like traditional search, identifying the type of keywords (branded and non-branded), as well as the individual keywords that saw the biggest drop in app store visits, will provide much needed context and help shape the direction of your investigation. For instance:
If branded terms saw the biggest drop-off in visits this could suggest:
- There has been a decrease in the amount of advertising spend that builds brand/product awareness
- Competitors are bidding on your branded terms
- The app name/brand has changed and hasn’t been able to mop up all previous branded traffic
If non-branded terms saw the biggest drop off in visits this could suggest:
- You’ve made recent optimisation changes that have had a negative impact
- User engagement signals, such as app crashes, or app reviews have changed for the worse
- Your competition have better optimised their app and/or provide a better user experience (particularly relevant if an app receives a majority of its traffic from a small set of keywords)
- Your app has been hit by an algorithm update
If both branded and non-branded terms saw the biggest drop off in visits this could suggest:
- You’ve violated Google’s policies on promoting your app.
- There are external factors at play
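As a quick illustration of that branded vs. non-branded split, here's a small Python sketch; the keywords, brand terms, and visit numbers are all made up:

```python
def visits_by_class(keyword_visits, brand_terms):
    """Split keyword-level visit counts into branded vs. non-branded totals."""
    totals = {"branded": 0, "non-branded": 0}
    for keyword, visits in keyword_visits.items():
        bucket = "branded" if any(b in keyword.lower() for b in brand_terms) else "non-branded"
        totals[bucket] += visits
    return totals

# Hypothetical month-over-month keyword data for a flight-comparison app
last_month = {"skyscanner app": 900, "cheap flights": 400, "flight comparison": 250}
this_month = {"skyscanner app": 850, "cheap flights": 150, "flight comparison": 120}
brand = ["skyscanner"]

before, after = visits_by_class(last_month, brand), visits_by_class(this_month, brand)
for bucket in before:
    drop = before[bucket] - after[bucket]
    print(bucket, f"-{drop} visits ({drop / before[bucket]:.0%})")
```

In this invented example the non-branded bucket fell far harder than the branded one, which would point the investigation toward optimisation changes, engagement signals, or an algorithm update rather than brand awareness.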
To get data for your Android app
To get data for your Android app, sign into your Google Play Console account.
Google Play Console provides a wealth of data on the performance of your Android app, with particularly useful insights on user engagement metrics that influence app store ranking (more on these later).
However, keyword specific data will be limited. Google Play Console will show you the individual keywords that delivered the most downloads for your app, but the majority of keyword visits will likely be unclassified: mid to long-tail keywords that generate downloads, but don’t generate enough downloads to appear as isolated keywords. These keywords will be classified as “other”.
Your chart might look like the below. Repeat the same process for branded terms.
To get data for your iOS app
To get data on the performance of your iOS app, Apple has App Store Connect. Like Google Play Console, you'll be able to get your hands on user engagement metrics that can influence the ranking of your app.
However, keyword data is even scarcer than in Google Play Console. You'll only be able to see the total number of impressions your app's icon has received on the App Store. If you've seen a drop in visits for both your Android and iOS apps, then you could use Google Play Console data as a proxy for keyword performance.
If you use an app rank tracking tool, such as TheTool, you can somewhat plug gaps in knowledge for the keywords that are potentially driving visits to your app.
Step 3. Analyze app user engagement metrics
User engagement metrics that underpin a good user experience have a strong influence on how your app ranks and both Apple and Google are open about this.
Google states that user engagement metrics like app crashes, ANR rates (application not responding) and poor reviews can limit exposure opportunities on Google Play.
While Apple isn’t quite as forthcoming as Google when it comes to providing information on engagement metrics, they do state that app ratings and reviews can influence app store visibility.
Ultimately, Apple wants to ensure iOS apps provide a good user experience, so it's likely they use a range of additional user engagement metrics to rank an app in the App Store.
As part of your investigation, you should look into how the below user engagement metrics may have changed around the time period you saw a drop in visits to your app.
- App rating
- Number of ratings (newer/fresh ratings will be weighted more for Google)
- Number of downloads
- Installs vs uninstalls
- App crashes and application not responding
You’ll be able to get data for the above metrics in Google Play Console and App Store Connect, or you may have access to this data internally.
Even if your analysis doesn't reveal insights, metrics like app rating influence conversion and where your app ranks in the app pack SERP feature, so it's well worth investing time in developing a strategy to improve these metrics.
One simple tactic could be to ensure you respond to negative reviews and reviews with questions. In fact, users increase their rating by +0.7 stars on average after receiving a reply.
Apple offers a few tips on asking for ratings and reviews for iOS apps.
Help win your app traffic back
Step 1. Spy on your competitors
Find out who’s ranking
When trying to identify opportunities to improve app store visibility, I always like to compare the top 5 ranking competitor apps for some priority non-branded keywords.
All you need to do is search for these keywords in Google Play and the App Store and grab the publicly available ranking factors from each app listing. You should have something like the below.
(Table columns compared: app rating, title character length, number of reviews, number of installs, and description character length, for your brand's listing and the top five competitors.)

Above: anonymized table of a client's Google Play competitors
From this, you may get some indications as to why an app ranks above you. For instance, we see “Competitor 1” not only has the best app rating, but has the longest title and description. Perhaps they better optimized their title and description?
We can also see that competitors that rank above us generally have a larger number of total reviews and installs, which aligns with both Google’s and Apple’s statements about the importance of user engagement metrics.
With the above comparison information, you can dig a little deeper, which leads us on nicely to the next section.
Optimize your app text fields
Keywords you add to text fields can have a significant impact on app store discoverability.
As part of your analysis, you should look into how your keyword optimization differs from competitors and identify any opportunities.
For Google Play, adding keywords to the below text fields can influence rankings:
- Keywords in the app title (50 characters)
- Keywords in the app description (4,000 characters)
- Keywords in short description (80 characters)
- Keywords in URL
- Keywords in your app name
When it comes to the App Store, adding keywords to the below text fields can influence rankings:
- Keywords in the app title (30 characters)
- Using the 100 character keywords field (a dedicated 100-character field to place keywords you want to rank for)
- Keywords in your app name
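If you want to sanity-check your copy against these limits before submitting, a trivial helper does the job. The limits encoded here are the ones listed above; both stores change them occasionally, so verify against the current Google Play and App Store documentation:

```python
# Character limits as listed above; these change over time, so verify
# against the current Google Play / App Store documentation.
LIMITS = {
    "play_title": 50,
    "play_description": 4000,
    "play_short_description": 80,
    "ios_title": 30,
    "ios_keywords": 100,
}

def over_limit(field, text):
    """Return how many characters a text field is over its limit (0 if OK)."""
    return max(0, len(text) - LIMITS[field])

print(over_limit("play_short_description", "Find your next job fast."))  # 0
```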
To better understand how your optimisation tactics hold up, I recommend comparing your app text fields to competitors'.
For example, if I wanted to know the frequency of target keywords in their app descriptions on Google Play (keywords in the description field are a ranking factor), then I'd create a table like the one below.
Above: anonymized table of a client’s Google Play competitors
From the above table, I can see that the number 1 ranking competitor (competitor 1) has more mentions of “job search” and “employment app” than I do.
Whilst there are many factors that decide the position at which an app ranks, I could deduce that I need to increase the frequency of said keywords in my Google Play app description to help improve ranking.
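As a rough way to build such a table, here's a small Python sketch; the descriptions and target phrases below are invented stand-ins for real competitor data:

```python
import re

def phrase_counts(description, phrases):
    """Count occurrences of each target phrase in an app description."""
    text = description.lower()
    return {p: len(re.findall(re.escape(p.lower()), text)) for p in phrases}

# Hypothetical description copy for a job-search app and one competitor
mine = "Our employment app helps you with your job search. Start your job search today."
competitor = "The #1 job search and employment app. Job search by salary, job search by title."

for name, desc in [("mine", mine), ("competitor 1", competitor)]:
    print(name, phrase_counts(desc, ["job search", "employment app"]))
```

Run this over each competitor's description and each text field, and the resulting counts drop straight into a comparison table.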
Be careful though: writing unnatural, keyword stuffed descriptions and titles will likely have an adverse effect.
Remember, as well as being optimized for machines, text fields like your app title and description are meant to be a compelling "advertisement" of your app for users.
I’d repeat this process for other text fields to uncover other keyword insights.
Step 2. Optimize your store listing
Your store listing is the home of your app on Google Play. It's where users can learn about your app, read reviews, and more. Surprisingly, not all apps take full advantage of developing an immersive store listing experience.
Whilst Google doesn't seem to directly state that fully utilizing the majority of store listing features impacts your app's discoverability, it's fair to speculate that there may be some ranking consideration behind this.
At the very least, investing in your store listing could improve conversion and you can even run A/B tests to measure the impact of your changes.
You can improve the overall user experience and the content found in the store listing by adding video trailers of your app, quality creative assets, a distinctive app icon (you'll want your icon to stand out amongst a sea of other app icons), and more.
You can read Google’s best practice guide on creating a compelling Google Play store listing to learn more.
Step 3. Invest in localization
The saying goes “think global, act local” and this is certainly true of apps.
Previous studies have revealed that 72.4% of global consumers preferred to use their native language when shopping online and that 56.2% of consumers said that the ability to obtain information in their own language is more important than price.
It makes logical sense. The better you can personalize your product for your audience, the better your results will be, so go the extra mile and localize your Google Play and App Store listings.
A drop in visits of any kind causes alarm and panic. Hopefully this blog gives you a good starting point if you ever need to investigate why an app's traffic has dropped, as well as providing some quick-fire opportunities to win it back.
Gone are the days of optimizing content solely for search engines. For modern SEO, your content needs to please both robots and humans. But how do you know that what you’re writing can check the boxes for both man and machine?
In today’s Whiteboard Friday, Ruth Burr Reedy focuses on part of her recent MozCon 2019 talk and teaches us all about how Google uses NLP (natural language processing) to truly understand content, plus how you can harness that knowledge to better optimize what you write for people and bots alike.
Howdy, Moz fans. I’m Ruth Burr Reedy, and I am the Vice President of Strategy at UpBuild, a boutique technical marketing agency specializing in technical SEO and advanced web analytics. I recently spoke at MozCon on a basic framework for SEO and approaching changes to our industry that thinks about SEO in the light of we are humans who are marketing to humans, but we are using a machine as the intermediary.
Those videos will be available online at some point. [Editor’s note: that point is now!] But today I wanted to talk about one point from my talk that I found really interesting and that has kind of changed the way that I approach content creation, and that is the idea that writing content that is easier for Google, a robot, to understand can actually make you a better writer and help you write better content for humans. It is a win-win.
The relationships between entities, words, and how people search
To understand how Google is currently approaching parsing content and understanding what content is about, Google is spending a lot of time and a lot of energy and a lot of money on things like neural matching and natural language processing, which seek to understand basically when people talk, what are they talking about?
This goes along with the evolution of search to be more conversational. But there are a lot of times when someone is searching, but they don’t totally know what they want, and Google still wants them to get what they want because that’s how Google makes money. They are spending a lot of time trying to understand the relationships between entities and between words and how people use words to search.
The example that Danny Sullivan gave online, that I think is a really great example, is if someone is experiencing the soap opera effect on their TV. If you’ve ever seen a soap opera, you’ve noticed that they look kind of weird. Someone might be experiencing that, and not knowing what that’s called they can’t Google soap opera effect because they don’t know about it.
They might search something like, “Why does my TV look funny?” Neural matching helps Google understand that when somebody is searching “Why does my TV look funny?” one possible answer might be the soap opera effect. So they can serve up that result, and people are happy.
As we’re thinking about natural language processing, a core component of natural language processing is understanding salience.
Salience, content, and entities
Salience is a one-word way to sum up the question: to what extent is this piece of content about this specific entity? At this point Google is really good at extracting entities from a piece of content. Entities are basically nouns: people, places, things, proper nouns, regular nouns.
Entities are things, people, etc., numbers, things like that. Google is really good at taking those out and saying, “Okay, here are all of the entities that are contained within this piece of content.” Salience attempts to understand how they’re related to each other, because what Google is really trying to understand when they’re crawling a page is: What is this page about, and is this a good example of a page about this topic?
Salience really gets at that second piece: to what extent is any given entity the topic of a piece of content? It’s often amazing the degree to which a piece of content that a person has created is not actually about anything. I think we’ve all experienced that.
You’re searching and you come to a page and you’re like, “This was too vague. This was too broad. This said that it was about one thing, but it was actually about something else. I didn’t find what I needed. This wasn’t good information for me.” As marketers, we’re often on the other side of that, trying to get our clients to say what their product actually does on their website or say, “I know you think that you created a guide to Instagram for the holidays. But you actually wrote one paragraph about the holidays and then seven paragraphs about your new Instagram tool. This is not actually a blog post about Instagram for the holidays. It’s a piece of content about your tool.” These are the kinds of battles that we fight as marketers.
Natural Language Processing (NLP) APIs
Fortunately, there are now a number of different APIs that you can use to explore natural language processing, such as Google’s Cloud Natural Language API.
Is it as sophisticated as what they’re using on their own stuff? Probably not. But you can test it out. Put in a piece of content and see (a) what entities Google is able to extract from it, and (b) how salient Google feels each of these entities is to the piece of content as a whole. Again, to what degree is this piece of content about this thing?
So this natural language processing API, which you can try for free (and it’s actually not that expensive for an API if you want to build a tool with it), will assign each entity that it can extract a salience score between 0 and 1, saying, “Okay, how sure are we that this piece of content is about this thing versus just containing it?”
So the closer you get to 1, the more confident the tool is that this piece of content is about this thing. 0.9 would be really, really good. 0.01 means the entity is there, but the tool isn’t sure how strongly it’s related.
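To make the idea concrete, here is a minimal sketch of working with salience scores. The entity names and numbers are made up for illustration, not real API output; a real response would come from a natural language processing API such as Google’s.

```python
# Illustrative sketch: ranking extracted entities by salience.
# The entities and scores below are hypothetical sample data,
# shaped like what an NLP API might return.
sample_entities = [
    {"name": "chocolate chip cookie recipe", "salience": 0.62},
    {"name": "butter", "salience": 0.12},
    {"name": "sugar", "salience": 0.09},
    {"name": "oven", "salience": 0.01},
]

def rank_by_salience(entities, min_salience=0.05):
    """Drop weakly related entities and sort the rest, most salient first."""
    kept = [e for e in entities if e["salience"] >= min_salience]
    return sorted(kept, key=lambda e: e["salience"], reverse=True)

ranked = rank_by_salience(sample_entities)
print([e["name"] for e in ranked])
# → ['chocolate chip cookie recipe', 'butter', 'sugar']
```

Here “oven” is dropped because its score of 0.01 means the API found the entity but isn’t confident it’s related, which is exactly the signal you’d investigate when auditing a page.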
A delicious example of how salience and entities work
The example I have here (not taken from a real piece of content; these numbers are made up, it’s just an example) is a chocolate chip cookie recipe. You would want “chocolate chip cookie recipe” or “chocolate chip cookies,” something like that, to be the number one, most salient entity, and you would want it to have a pretty high salience score.
You would want the tool to feel pretty confident: yes, this piece of content is about this topic. But you can also see the other entities it extracts and to what degree they are also salient to the topic. If you have a chocolate chip cookie recipe, you would expect to see things like cookie, butter, sugar, 350 (the temperature you heat your oven to), all of the different things that come together to make a chocolate chip cookie recipe.
But I think that it’s really, really important for us as SEOs to understand that salience is the future of related keywords. We’re beyond the time when to optimize for chocolate chip cookie recipe, we would also be looking for things like chocolate recipe, chocolate chips, chocolate cookie recipe, things like that. Stems, variants, TF-IDF, these are all older methodologies for understanding what a piece of content is about.
Instead, what we need to understand is: which entities does Google, using its vast body of knowledge (things like Freebase and large portions of the internet), see co-occur at such a rate that it feels reasonably confident a piece of content about one entity, in order to be salient to that entity, would also include these other entities?
Using an expert is the best way to create content that’s salient to a topic
So for a chocolate chip cookie recipe, we’re now also making sure we’re adding things like butter, flour, and sugar. This is actually really easy to do if you have an actual chocolate chip cookie recipe to put up there. I think this is what we’re going to start seeing as a content trend in SEO: the best way to create content that is salient to a topic is to have an actual expert in that topic create that content.
Somebody with deep knowledge of a topic is naturally going to include co-occurring terms, because they know how to create something that’s about what it’s supposed to be about. I think what we’re going to start seeing is that people are going to have to start paying more for content marketing, frankly. Unfortunately, a lot of companies seem to think that content marketing is and should be cheap.
Content marketers, I feel you on that. It sucks, but cheap content is no longer going to cut it. We need to start investing in content and in experts to create that content, so that they can create the deep, rich, salient content that everybody really needs.
How can you use this API to improve your own SEO?
One of the things that I like to do with this kind of information (and this is something that I’ve done for years, just not in this context) is look at a prime optimization target in general: pages that rank for a topic, but rank on page 2.
What this often means is that Google understands that that keyword is a topic of the page, but it doesn’t necessarily understand that the page is a good piece of content on that topic, that the page is actually about that topic, that it’s a good resource. In other words, the signal is there, but it’s weak.
What you can do is take content that ranks, but not well, run it through this natural language API or another natural language processing tool, and look at how the entities are extracted and how Google is determining that they’re related to each other. Sometimes it might be that you need to do some disambiguation. In this example, you’ll notice that while chocolate cookies is classified as a “work of art” (and I agree), cookie here is actually classified as “other.”
This is because cookie means more than one thing. There’s cookies, the baked good, but then there’s also cookies, the packet of data. Both of those are legitimate uses of the word “cookie.” Words have multiple meanings. If you notice that Google, that this natural language processing API is having trouble correctly classifying your entities, that’s a good time to go in and do some disambiguation.
Make sure that the terms surrounding that term are clearly saying, “No, I mean the baked good, not the software piece of data.” That’s a really great way to kind of bump up your salience. Look at whether or not you have a strong salient score for your primary entity. You’d be amazed at how many pieces of content you can plug into this tool and the top, most salient entity is still only like a 0.01, a 0.14.
A lot of times the API is like “I think this is what it’s about,” but it’s not sure. This is a great time to go in and bump up that content, make it more robust, and look at ways that you can make those entities easier to both extract and to relate to each other. This brings me to my second point, which is my new favorite thing in the world.
Writing for humans and writing for machines: you can now do both at the same time. You no longer have to choose (and you really haven’t had to in a long time); the idea that you might keyword stuff or otherwise create content for Google that your users might not see or care about is way, way, way over.
Now you can create content for Google that also is better for users, because the tenets of machine readability and human readability are moving closer and closer together.
Tips for writing for human and machine readability:
What I’ve done here is some research, not on natural language processing, but on writing for human readability: advice from writers and writing experts on how to write better, clearer, easier-to-read, easier-to-understand content. Then I pulled out the pieces of advice that also work as advice for writing for natural language processing. Natural language processing, again, is the process by which Google, or really anything that might be processing language, tries to understand how entities are related to each other within a given body of content.
Short, simple sentences
Short, simple sentences. Write simply. Don’t use a lot of flowery language. Short sentences and try to keep it to one idea per sentence.
One idea per sentence
If you’re running on, if you’ve got a lot of different clauses, if you’re using a lot of pronouns and it’s becoming confusing what you’re talking about, that’s not great for readers.
It also makes it harder for machines to parse your content.
Connect questions to answers
Then closely connecting questions to answers. So don’t say, “What is the best temperature to bake cookies? Well, let me tell you a story about my grandmother and my childhood,” and 500 words later here’s the answer. Connect questions to answers.
What all three of those readability tips have in common is they boil down to reducing the semantic distance between entities.
If you want natural language processing to understand that two entities in your content are closely related, move them closer together in the sentence. Move the words closer together. Reduce the clutter, reduce the fluff, reduce the number of semantic hops that a robot might have to take between one entity and another to understand the relationship, and you’ve now created content that is more readable because it’s shorter and easier to skim, but also easier for a robot to parse and understand.
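As a crude, hypothetical illustration of this idea (this is not how any real NLP system measures relatedness), you can count how many words apart two entities sit in a sentence; tightening the sentence shrinks that distance.

```python
# Crude illustration of "semantic distance": how many tokens apart
# two terms sit in a sentence. A real parser does far more, but the
# intuition holds: fewer hops between entities is easier to relate.
def token_distance(sentence, term_a, term_b):
    tokens = sentence.lower().split()
    return abs(tokens.index(term_a) - tokens.index(term_b))

wordy = "cookies which my grandmother always insisted on baking are best at 350"
tight = "bake cookies at 350"

print(token_distance(wordy, "cookies", "350"))  # → 11
print(token_distance(tight, "cookies", "350"))  # → 2
```

In the wordy sentence, “cookies” and “350” are eleven words apart; in the tight one, two. A human skimmer and a parser both benefit from the shorter path.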
Be specific first, then explain nuance
Going back to the example of “What is the best temperature to bake chocolate chip cookies at?” Now the real answer to what is the best temperature to bake chocolate cookies is it depends. Hello. Hi, I’m an SEO, and I just answered a question with it depends. It does depend.
That is true, and that is real, but it is not a good answer. It is also not the kind of thing that a robot could extract and reproduce in, for example, voice search or a featured snippet. If somebody says, “Okay, Google, what is a good temperature to bake cookies at?” and Google says, “It depends,” that helps nobody even though it’s true. So in order to write for both machine and human readability, be specific first and then you can explain nuance.
Then you can go into the details. So a better, just as correct answer to “What is the temperature to bake chocolate chip cookies?” is the best temperature to bake chocolate chip cookies is usually between 325 and 425 degrees, depending on your altitude and how crisp you like your cookie. That is just as true as it depends and, in fact, means the same thing as it depends, but it’s a lot more specific.
It’s a lot more precise. It uses real numbers. It provides a real answer. I’ve shortened the distance between the question and the answer. I didn’t say it depends first. I said it depends at the end. That’s the kind of thing that you can do to improve readability and understanding for both humans and machines.
Get to the point (don’t bury the lede)
Get to the point. Don’t bury the lede. All of you journalists who tried to become content marketers, and then everybody in content marketing said, “Oh, you need to wait till the end to get to your point or they won’t read the whole thing,” and you were like, “Don’t bury the lede,” you are correct. For those of you who aren’t familiar with journalism speak, not burying the lede basically means get to the point upfront, at the top.
Include all the information that somebody would really need to get from that piece of content. If they don’t read anything else, they read that one paragraph and they’ve gotten the gist. Then people who want to go deep can go deep. That’s how people actually like to consume content, and surprisingly it doesn’t mean they won’t read the content. It just means they don’t have to read it if they don’t have time, if they need a quick answer.
The same is true with machines. Get to the point upfront. Make it clear right away what the primary entity, the primary topic, the primary focus of your content is and then get into the details. You’ll have a much better structured piece of content that’s easier to parse on all sides.
Avoid jargon and “marketing speak”
Avoid jargon. Avoid marketing speak. Not only is it terrible, it’s very hard to understand. You see this a lot. I’m going back again to the example of getting your clients to say what their products do. If you work with a lot of B2B companies, you will often run into this. “Yes, but what does it do?” “It provides solutions to streamline the workflow and blah, blah.” “Okay, but what does it do?” This is the kind of thing that can be really, really hard for companies to get out of their own heads about, but it’s so important for users and for machines.
Avoid jargon. Avoid marketing speak. Not to get too tautological, but the more esoteric a word is, the less commonly it’s used. That’s actually what esoteric means. What that means is the less commonly a word is used, the less likely it is that Google is going to understand its semantic relationships to other entities.
Keep it simple. Be specific. Say what you mean. Wipe out all of the jargon. By wiping out jargon and kind of marketing speak and kind of the fluff that can happen in your content, you’re also, once again, reducing the semantic distances between entities, making them easier to parse.
Organize your information to match the user journey
Organize it and map it out to the user journey. Think about the information somebody might need and the order in which they might need it.
Break out subtopics with headings
Then break it out with subheadings. This is like very, very basic writing advice, and yet you all aren’t doing it. So if you’re not going to do it for your users, do it for machines.
Format lists with bullets or numbers
You can also really impact skimmability for users by breaking out lists with bullets or numbers.
The great thing about that is that breaking out a list with bullets or numbers also makes information easier for a robot to parse and extract. If a lot of these tips seem like they’re the same tips that you would use to get featured snippets, they are, because featured snippets are actually a pretty good indicator that you’re creating content that a robot can find, parse, understand, and extract, and that’s what you want.
So if you’re targeting featured snippets, you’re probably already doing a lot of these things, good job.
Grammar and spelling count!
The last thing, which I shouldn’t have to say, but I’m going to say is that grammar and spelling and punctuation and things like that absolutely do count. They count to users. They don’t count to all users, but they count to users. They also count to search engines.
Things like grammar, spelling, and punctuation are very, very easy signals for a machine to find and parse. Google has been specific, in things like the “Quality Rater Guidelines,” that a well-written, well-structured, correctly spelled, grammatically correct document is a sign of authoritativeness. I’m not saying that a perfectly spelled document means you immediately rocket to the top of the results.
I am saying that if you’re not on that stuff, it’s probably going to hurt you. So take the time to make sure everything is nice and tidy. You can use vernacular English. You don’t have to be perfect “AP Style Guide” all the time. But make sure that you are formatting things properly from a grammatical standpoint as well as a technical standpoint. What I love about all of this, this is just good writing.
This is good writing. It’s easy to understand. It’s easy to parse. It’s still so hard, especially in the marketing world, to get out of that world of jargon, to get to the point, to stop writing 2,000 words because we think we need 2,000 words, to really think about are we creating content that’s about what we think it’s about.
Use these tools to understand how readable, parsable, and understandable your content is
So my hope for the SEO world and for you is that you can use these tools not just to think about how to dial in the perfect keyword density or whatever to get an almost perfect score on the salience in the natural language processing API. What I’m hoping is that you will use these tools to help yourself understand how readable, how parsable, and how understandable your content is, how much your content is about what you say it’s about and what you think it’s about so you can create better stuff for users.
It makes the internet a better place, and it will probably make you some money as well. So these are my thoughts. I’d love to hear in the comments if you’re using the natural language processing API now, if you’ve built a tool with it, if you want to build a tool with it, what do you think about this, how do you use this, how has it gone. Tell me all about it. Holla atcha girl.
Have a great Friday.
Find Ranking Keywords, Uncover Opportunities, Check Rankings, & More: 5 Workflows for Easier Keyword Research
Have you ever wished there were an easy way to see all the top keywords your site is ranking for? How about a competitor’s? What about those times when you’re stumped trying to come up with keywords related to your core topic, or want to know the questions people are asking around your keywords?
There’s plenty of keyword research workflow gold to be uncovered in Keyword Explorer. It’s a tool that can save you a ton of time when it comes to both general keyword research and the nitty-gritty details. And time and again, we hear from folks who are surprised that a tool they use all the time can do [insert cool and helpful thing here] — they had no idea!
Well, let’s remedy that! Starting with today’s post, we’ll be publishing a series of quick videos put together by our own brilliant SEO scientist (and, according to Google, the smartest SEO in the world) Britney Muller. Each one will highlight one super useful workflow to solve a keyword research problem, and most are quick — just under a couple of minutes. Take a gander at the videos or skim the transcripts to find a workflow that catches your eye, and if you’re the type of person who likes to try it out in real time, head to the tool and give it a spin (if you have a Moz Community account like most Moz Blog readers, you already have free access):
1. How to do general keyword research
Find relevant keywords
You can do this a couple of ways. One is just to enter in a head keyword term that you want to explore — so maybe that’s “SEO” — and you can click Search. From here, you can go to Keyword Suggestions, where you can find all sorts of other keywords relevant to the keyword “SEO.”
We have a couple filters available to help you narrow down that search a little bit better. Here, without doing any filtering, you can see all of these keywords, and they’re ranked by relevancy and then search volume. So you do tend to see the higher search volume keywords at the top.
Save keyword suggestions in a list
But you can go through here and click the keywords that you want to save for your list. You can also do some filtering. We could group keywords by low lexical similarity. What this means is it’s basically just going to take somewhat similar keywords and batch them together for you to make it a bit easier.
Here you can see there are 141 keywords grouped under “SEO.” Fifty keywords have fallen under “SEO services,” and so on. This gives you a higher-level, topical awareness of what the keywords look like. If you were to select these groups, you could add them to a list.
When I say add to a list, I mean you can just save them in a keyword list that you can refer back to time and time again. These lists are amazing; they’re one of my favorite features. What you would basically do is create a new list. I’m just going to call it Test. That adds all of your selected keywords to a list. You can continue adding keywords using different filters.
Filter by which keywords are questions
One of my other favorite things to filter by is “Are questions.” This will give you keywords that are actual questions, and it’s really neat to be able to try to bake these into your content marketing or an FAQ page. Really helpful. You can select all up here. Then I can just add that to that SEO Test list that we already created. I hope this gives you an idea of how to use some of these general filters.
Filter based on closely or broadly related topics and synonyms
You can also filter based on closely related topics, broadly related topics and synonyms. Keywords with similar result pages is very interesting. You can really play around with both of these filters.
Filter by volume
You can also filter by volume. If you are trying to go after those high volume keywords, maybe you set a filter for here. Maybe you’re looking for long tail keywords, and then you’re going to look a little bit on the smaller search volume end here. These can all help in playing around and discovering more keywords.
Find the keywords a domain currently ranks for
Another thing that you can do to expand your keyword research is to enter a domain. You can see that this changed to root domain when I entered moz.com. If you click Search, you’re going to get all of the keywords that that domain currently ranks for, which is really powerful. You can see all of the ranking keywords, add them to a list, and monitor how your website is performing.
Find competitors’ keywords
If you want to get really strategic, you can plug in some of your competitor sites and see what their keywords are. These are all things that you can do to expand your keyword research set. From there, you’re going to hopefully have one or a couple keyword lists that house all of this data for you to better strategically route your SEO strategy.
If we know which related questions occur most often, you can create strategic content around that. The opportunities here with these filters and sorts are endless.
2. How to discover ranking keywords for a particular domain or an exact page
See all the keywords a particular domain ranks for
This is super easy to do in Keyword Explorer. You just go to the main search bar. Let’s just throw in moz.com for example. I can see all the keywords that currently rank for moz.com.
We’re seeing over 114,000, and we get this really beautiful, high-level overview as to what that looks like. You can see the ranking distribution, and then you can even go into all of those ranking keywords in this tab here, which is really cool.
See all the keywords a specific page ranks for
You can do the same exact thing for a specific page. So let’s take the Beginner’s Guide. This will toggle to Exact Page, and you just click Search. Here we’re going to see that it ranks for 804 keywords. You get to know exactly what those are, what the difficulty is, the monthly search volume.
Keep track of those keywords in a list
You can add these things to a list to keep an eye on. It’s also great to do for competitive pages that appear to be doing very well or popular things occurring in your space. But this is just a quick and easy way to see what root domains or exact pages are currently ranking for.
3. How to quickly find keyword opportunities for a URL or a specific page
Find lower-ranking keywords that could be improved upon
I’m just going to paste in the URL to the Beginner’s Guide to SEO in Keyword Explorer. I’m going to look at all of the ranking keywords for this URL, and what I want to do is I want to sort by rank.
I want to see what’s ranking between 4 and 50 and see where or what keywords aren’t doing so well that we could improve upon. Right away we’re seeing this huge monthly search volume keyword, “SEO best practices,” and we’re ranking number 4.
It can definitely be improved upon. You can also go ahead and take a look at keywords that you rank for outside of page 1, meaning you rank 11 or beyond for these keywords. These could definitely also be improved upon. You can save these keywords to a list.
You can export them and strategically create content to improve those results.
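If you export those ranking keywords, the “improvable” filter above is easy to reproduce in a few lines. This is a sketch using hypothetical sample data; the field names are assumptions, not the actual export format.

```python
# Sketch of the "near miss" filter: given keywords exported from a
# rank tracker (hypothetical sample rows), keep those ranking 4-50
# and sort by search volume so the biggest opportunities surface first.
ranking_keywords = [
    {"keyword": "seo best practices", "rank": 4, "volume": 6500},
    {"keyword": "what is seo", "rank": 2, "volume": 9300},
    {"keyword": "seo checklist", "rank": 14, "volume": 1700},
    {"keyword": "seo niche tips", "rank": 55, "volume": 120},
]

def near_misses(keywords, lo=4, hi=50):
    """Keep keywords ranking in [lo, hi], highest volume first."""
    hits = [k for k in keywords if lo <= k["rank"] <= hi]
    return sorted(hits, key=lambda k: k["volume"], reverse=True)

for k in near_misses(ranking_keywords):
    print(k["keyword"], k["rank"], k["volume"])
```

Rows ranking in the top 3 or beyond position 50 drop out, leaving a volume-ordered shortlist of pages worth improving.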
4. How to check rankings for a set of keywords
Use keyword lists to check rankings for a subset of keywords
This is pretty easy. So let’s say you have a keyword list for your target keywords. Here I’ve got an SEO Test keyword list. I want to see how Moz is ranking for these keywords.
This is where you would just add Check Rankings For and add your URL. I’m just going to put moz.com, check rankings, and I can immediately see how well we’re doing for these specific keywords.
I can filter highest to lowest and vice versa.
5. How to track your keywords
Set up a Campaign
If you don’t already have a list of your keywords that you would like to track, I suggest watching the General Keyword Research video above to help discover some of those keywords. But if you already have the keywords you know that you want to track for a particular site, definitely set up an account with Moz Pro and set up a Campaign.
It walks you through all of the steps to set up a particular Campaign for a URL. If you already have your Campaign set up, for example this is my Moz Campaign and I want to add say a new list of keywords to track, what you can do is you can come into this dashboard view and then go to Rankings.
If you scroll down here, you can add keywords. So let’s say Moz is breaking into the conversion rate optimization space. I can paste in a list of my CRO keywords, and then I can add a label.
Use keyword labels to track progress on topics over time
Now that’s going to append that tag so I can filter by just CRO keywords. Then I’m going to click Add Keywords. This is going to take a little while to start to kick into gear basically.
But once it starts tracking, once these keywords are added, you’ll get to see them historically over time and even you against your competitors. It’s a really great way to monitor how you’re doing with keywords, where you’re seeing big drops or gains, and how you can better pivot your strategy to target those things.
Discover anything new or especially useful? Let us know on Twitter or here in the comments, and keep an eye out for more quick and fun keyword research workflow videos in the coming weeks — we’ve got some good stuff coming your way, from finding organic CTR for a keyword to discovering SERP feature opportunities and more.
Content and links — to successfully leverage search as a marketing channel you need useful content and relevant links.
In fact, a Google employee straight up told us that content and links are two of the three most important ranking factors in its search algorithm (the other being RankBrain).
So why do we seem to overcomplicate SEO by chasing new trends and tactics, overreacting to fluctuations in rankings, and obsessing over the length of our title tags? SEO is simple — it’s content and it’s links.
Now, this is a simple concept, but it is much more nuanced and complex to execute well. However, I believe that by getting back to basics and focusing on these two pillars of SEO we can all spend more time doing the work that will be most impactful, creating a better, more connected web, and elevating SEO as a practice within the marketing realm.
To support this movement, I want to provide you with strategic, actionable takeaways that you can leverage in your own content marketing and link building campaigns. So, without further ado, let’s look at how you can be successful in search with content and links.
Building the right content
As the Wu-Tang Clan famously said, “Content rules everything around me, C.R.E.A.M,” …well, it was something like that. The point is, everything in SEO begins and ends with content. Whether it’s a blog post, infographic, video, in-depth guide, interactive tool, or something else, content truly rules everything around us online.
Content attracts and engages visitors, building positive associations with your brand and inspiring them to take desired actions. Content also helps search engines better understand what your website is about and how they should rank your pages within their search results.
So where do you start with something as wide-reaching and important as a content strategy? Well, if everything in SEO begins and ends with content, then everything in content strategy begins and ends with keyword research.
Proper keyword research is the difference between a targeted content strategy that drives organic visibility and simply creating content for the sake of creating content. But don’t just take my word for it — check out this client project where keyword research was executed after a year of publishing content that wasn’t backed by keyword analysis:
(Note: Each line represents content published within a given year, not total organic sessions of the site.)
In 2018, we started creating content based on keyword opportunities. The performance of that content has quickly surpassed (in terms of organic sessions) the older pages that were created without strategic research.
Start with keyword research
The concept of keyword research is straightforward — find the key terms and phrases that your audience uses to find information related to your business online. However, the execution of keyword research can be a bit more nuanced, and simply starting is often the most difficult part.
The best place to start is with the keywords that are already bringing people to your site, which you can find within Google Search Console.
Beyond the keywords that already bring people to your website, a baseline list of seed keywords can help you expand your keyword reach.
Seed keywords are the foundational terms that are related to your business and brand.
As a running example, let’s use Quip, a brand that sells oral care products. Quip’s seed keywords would be:
- [toothbrush set]
- [electric toothbrush]
- [electric toothbrush set]
- [toothbrush subscription]
These are some of the most basic head terms related to Quip’s products and services. From here, the list could be expanded, using keyword tools such as Moz’s Keyword Explorer, to find granular long-tail keywords and other related terms.
Expanded keyword research and analysis
The first step in keyword research and expanding your organic reach is to identify current rankings that can and should be improved.
Here are some examples of terms Moz’s Keyword Explorer reports Quip has top 50 rankings for:
- [teeth whitening]
- [sensitive teeth]
- [whiten teeth]
- [automatic toothbrush]
- [tooth sensitivity]
- [how often should you change your toothbrush]
These keywords represent “near-miss” opportunities for Quip, where it ranks on page two or three. Optimization and updates to existing pages could help Quip earn page one rankings and substantially more traffic.
For example, here are the first page results for [how often should you change your toothbrush]:
As expected, the results here are hyper-focused on answering the question of how often a toothbrush needs to be changed, and there is a rich snippet that answers the question directly.
Now, look at Quip’s page where we can see there is room for improvement in answering searcher intent:
The title of the page isn’t optimized for the main query, and a simple title change could help this page earn more visibility. Moz reports 1.7k–2.9k monthly search volume for [how often should you change your toothbrush]:
This is a stark contrast to the volume reported by Moz for [why is a fresh brush head so important] which is “no data” (which usually means very small):
Quip’s page is already ranking on page two for [how often should you change your toothbrush], so optimizing the title could help the page crack the top ten.
Furthermore, the content on the page is not optimized either:
Rather than answering the question of how often to change a toothbrush concisely (like the page that has earned the rich snippet), the content is closer to ad copy. Putting a direct, clear answer to this question at the beginning of the content could help this page rank better.
And that’s just one query and one page!
Keyword research should uncover these types of opportunities, and with Moz’s Keyword Explorer you can also find ideas for new content through “Keyword Suggestions.”
Using Quip as an example again, we can plug in their seed keyword [toothbrush] and get multiple suggestions (MSV = monthly search volume):
- [toothbrush holder] – MSV: 6.5k–9.3k
- [how to properly brush your teeth] – MSV: 851–1.7k
- [toothbrush cover] – MSV: 851–1.7k
- [toothbrush for braces] – MSV: 501–850
- [electric toothbrush holder] – MSV: 501–850
- [toothbrush timer] – MSV: 501–850
- [soft vs medium toothbrush] – MSV: 201–500
- [electric toothbrush for braces] – MSV: 201–500
- [electric toothbrush head holder] – MSV: 101–200
- [toothbrush delivery] – MSV: 101–200
Using this method, we can extrapolate one seed keyword into ten more granular and related long-tail keywords — each of which may require a new page.
This handful of terms generates a wealth of content ideas and different ways Quip could address pain points and reach its audience.
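When a suggestion list like the one above grows into the hundreds, it helps to prioritize programmatically. A quick sketch that parses Moz-style volume ranges (an en dash between a low and high estimate, with "k" for thousands) and sorts by midpoint; the figures are the illustrative ranges from the list, not live data:

```python
# Sketch: rank keyword suggestions by the midpoint of their reported
# monthly-search-volume (MSV) range. Illustrative figures only.

def parse_msv(rng):
    """Turn a range like '6.5k–9.3k' or '851–1.7k' into (low, high) ints."""
    def num(s):
        s = s.strip().lower()
        return int(round(float(s[:-1]) * 1000)) if s.endswith("k") else int(s)
    low, high = rng.split("–")  # ranges use an en dash
    return num(low), num(high)

def by_volume(suggestions):
    """Sort (keyword, msv_range) pairs, highest midpoint first."""
    return sorted(suggestions, key=lambda kv: -sum(parse_msv(kv[1])))

suggestions = [
    ("toothbrush timer", "501–850"),
    ("toothbrush holder", "6.5k–9.3k"),
    ("soft vs medium toothbrush", "201–500"),
]
print([kw for kw, _ in by_volume(suggestions)])
# → ['toothbrush holder', 'toothbrush timer', 'soft vs medium toothbrush']
```

Volume is only one axis, of course; in practice you'd weigh it against difficulty and intent before committing to new pages.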
Another source for keyword opportunities and inspiration is your competitors. For Quip, one of its strongest competitors is Colgate, a household name brand. Moz demonstrates the difference in market position with its “Competitor Overlap” tool:
Although many of Colgate’s keywords aren’t relevant to Quip, there are still opportunities to be gleaned here. One such example is [sensitive teeth], where Colgate is ranking top five, but Quip is on page two:
While many of the other keywords show Quip is ranking outside of the top 50, this is an opportunity that Quip could potentially capitalize on.
To analyze this opportunity, let’s look at the actual search results first.
It’s immediately clear that the intent here is informational — something to note when we examine Quip’s page. Also, scrolling down we can see that Colgate has two pages ranking on page one:
One of these pages is from a separate domain for hygienists and other dental professionals, but it still carries the Colgate brand and further demonstrates Colgate’s investment into this query, signaling this is a quality opportunity.
The next step for investigating this opportunity is to examine Colgate’s ranking page and check if it’s realistic for Quip to beat what they have. Here is Colgate’s page:
This page is essentially a blog post:
If this page is ranking, it’s reasonable to believe that Quip could craft something that would be at least as good a result for the query, and there is room for improvement in terms of design and formatting.
One thing that is likely helping this page rank is the clear definition of “tooth sensitivity” and the list of signs and symptoms in the sidebar:
Now, let’s look at Quip’s page:
This appears to be a blog-esque page as well.
This page offers solid information on sensitive teeth, which matches the query’s intent and is likely why the page ranks on page two. However, the page appears to be targeted at [tooth sensitivity]:
This is another great keyword opportunity for Quip:
However, this should be a secondary opportunity to [sensitive teeth] and should be mixed into the copy on the page rather than being the focal point. Also, the page one results for [tooth sensitivity] are largely the same as those for [sensitive teeth], including Colgate’s page:
So, one optimization Quip could make to the page could be to change some of these headers to include “sensitive teeth” (also, these are all H3s, and the page has no H2s, which isn’t optimal). Quip could draw inspiration from the questions that Google lists in the “People also ask” section of the SERP:
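Heading-hierarchy issues like the H3s-with-no-H2s problem mentioned above are easy to catch programmatically. A small sketch using Python's built-in HTML parser; the markup at the bottom is a stand-in, not Quip's actual page:

```python
# Sketch: audit a page's heading hierarchy, flagging skipped levels
# (e.g. an H3 appearing with no H2 above it). Sample markup is made up.
from html.parser import HTMLParser

class HeadingAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.levels = []

    def handle_starttag(self, tag, attrs):
        # Collect h1-h6 levels in document order
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self.levels.append(int(tag[1]))

def skipped_levels(html):
    """Return (previous, current) pairs where a heading level was skipped."""
    audit = HeadingAudit()
    audit.feed(html)
    gaps, prev = [], 0
    for lvl in audit.levels:
        if lvl > prev + 1:
            gaps.append((prev, lvl))
        prev = lvl
    return gaps

page = "<h1>Sensitive Teeth</h1><h3>Causes</h3><h3>Treatments</h3>"
print(skipped_levels(page))  # → [(1, 3)]
```

An empty result means the headings step down one level at a time; a gap like `(1, 3)` is the jump-from-H1-to-H3 pattern noted above.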
Also, a quick takeaway I had was that Quip’s page does not lead off with a definition of sensitive teeth or tooth sensitivity. We learned from Colgate’s page that quickly defining the term (sensitive teeth) and the associated symptoms could help the page rank better.
These are just a few of the options available to Quip to optimize its page, and as mentioned before, an investment into a sleek, easy to digest design could separate its page from the pack.
If Quip were able to move its page onto the first page of search results for [sensitive teeth], the increase in organic traffic could be significant. And [sensitive teeth] is just the tip of the proverbial iceberg — there is a wealth of opportunity in associated keywords that Quip could also rank well for:
Executing well on these content opportunities and repeating the process over and over for relevant keywords is how you scale keyword-focused content that will perform well in search and bring more organic visitors.
At Page One Power, we’ve leveraged this strategy and seen great results for clients. Here is an example of a client that is primarily focused on content creation and their corresponding growth in organic sessions:
These 15 pages were all published in January, and you can see that roughly one month after publishing, these pages started taking off in terms of organic traffic. This is because these pages are backed by keyword research and optimized so well that even with few external backlinks, they can rank on or near page one for multiple queries.
However, this doesn’t mean you should ignore backlinks and link acquisition. While the above pages rank well without many links, the domain they’re on has a substantial backlink profile cultivated through strategic link building. Securing relevant, worthwhile links is still a major part of a successful SEO campaign.
Earning real links and credibility
The other half of this complicated “it’s content and it’s links” equation is… links, and while it seems straightforward, successful execution is rather difficult — particularly when it comes to link acquisition.
While there are tools and processes that can increase organization and efficiency, at the end of the day link building takes a lot of time and a lot of work — you must manually email real website owners to earn real links. As Matt Cutts famously said (we miss you, Matt!), “Link building is sweat, plus creativity.”
However, you can greatly improve your chances for success with link acquisition if you identify which pages (existing or need to be created) on your site are link-worthy and promote them for links.
Spoiler alert: these are not your “money pages.”
Converting pages certainly have a function on your website, but they typically have limited opportunities when it comes to link acquisition. Instead, you can support these pages — and other content on your site — through internal linking from more linkable pages.
So how do you identify linkable assets? Well, there are some general characteristics that directly correlate with link-worthiness:
- Usefulness — concept explanation, step-by-step guide, collection of resources and advice, etc.
- Uniqueness — a new or fresh perspective on an established topic, original research or data, prevailing coverage of a newsworthy event, etc.
- Entertainment — a novel game or quiz, a humorous take on a typically serious subject, an interactive tool, etc.
Along with these characteristics, you also need to consider the size of your potential linking audience. The further you move down your marketing funnel, the smaller the linking audience size; converting pages are traditionally difficult to earn links to because they serve a small audience of people looking to buy.
Instead, focus on assets that exist at the top of your marketing funnel and serve large audiences looking for information. The keywords associated with these pages are typically head terms that may prove difficult to rank for, but if your content is strong you can still earn links through targeted, manual outreach to relevant sites.
Ironically, your most linkable pages aren’t always the pages that will rank well for you in search, since larger audiences also mean more competition. However, using linkable assets to secure worthwhile links will help grow the authority and credibility of your brand and domain, supporting rankings for your keyword-focused and converting pages.
Going back to our Quip example, we see a page on their site that has the potential to be a linkable asset:
Currently, this page is geared more towards conversions, which hurts linkability. However, Quip could easily move conversion-focused elements to another page and internally link from this page to maintain a pathway to conversion while improving link-worthiness.
To truly make this page a linkable asset, Quip would need to add depth on the topic of how to brush your teeth and home in on a more specific audience. As the page currently stands, it is targeted at everybody who brushes, but to make the page more linkable Quip could focus on a specific age group (toddlers, young children, elderly, etc.) or perhaps a profession or group that works odd hours or travels frequently and doesn’t have the convenience of brushing at home. An increased focus on audience will help with linkability, making this page one that shares useful information in a way that is unique and entertaining.
It also happens that [how to properly brush your teeth] was one of the opportunities we identified earlier in our (light) keyword research, so this could be a great opportunity to earn keyword rankings and links!
Putting it all together and simplifying our message
Now before we put it all together and solve SEO once and for all, you might be thinking, “What about technical and on-page SEO?!?”
And to that, I say, well those are just makeu…just kidding!
Technical and on-page elements play a major role in successful SEO and getting these elements wrong can derail the success of any content you create and undermine the equity of the links you secure.
Let’s be clear: if Google can’t crawl your site, you’re not showing up in its search results.
However, I categorize these optimizations under the umbrella of “content” within our content and links formula. If you’re not considering how search engines consume your content, along with human readers, then your content likely won’t perform well in the results of said search engines.
Rather than dive into the deep and complex world of technical and on-page SEO in this post, I recommend reading some of the great resources here on Moz to ensure your content is set up for success from a technical standpoint.
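On the crawlability point above, one quick sanity check is making sure key pages aren’t accidentally blocked by robots.txt. Python’s standard library can verify this; the rules and URLs below are hypothetical, not any real site’s configuration:

```python
# Sketch: check whether URLs are crawlable under a robots.txt ruleset,
# using Python's built-in parser. Rules and domain are made up.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /cart/
Disallow: /checkout/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://example.com/blog/sensitive-teeth"))  # → True
print(rp.can_fetch("*", "https://example.com/cart/"))                 # → False
```

Running a check like this against your important keyword-focused pages before publishing catches the most embarrassing technical failure mode: great content that search engines are told not to crawl.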
But to review the strategy I’ve laid out here, to be successful in search you need to:
- Research your keywords and niche – Having the right content for your audience is critical to earning search visibility and business. Before you start creating content or updating existing pages, make sure you take the time to research your keywords and niche to better understand your current rankings and position in the search marketplace.
- Analyze and expand keyword opportunities – Beyond understanding your current rankings, you also need to identify and prioritize available keyword opportunities. Using tools like Moz you can uncover hidden opportunities with long-tail and related key terms, ensuring your content strategy is targeting your best opportunities.
- Craft strategic content that serves your search goals – Using keyword analysis to inform content creation, you can build content that addresses underserved queries and helpful guides that attract links. An essential aspect of a successful content plan is balancing keyword-focused content with broader, more linkable content and ensuring you’re addressing both SEO goals.
- Promote your pages for relevant links – Countless new pages go live every day, and without proper promotion, even the best pages will be buried in the sea of content online. Strategic promotion of your pages will net you powerful backlinks and extra visibility from your audience.
Again, these concepts seem simple but are quite difficult to execute well. However, by drilling down to the two main factors for search visibility — content and links — you can avoid being overwhelmed or focusing on the wrong priorities and instead put all your efforts into the strategies that will provide the most SEO impact.
However, along with refocusing our own efforts, as SEOs we also need to simplify our message to the uninitiated (or as they’re also known, the other 99% of the population). I know from personal experience how quickly the eyes start to glaze over when I get into the nitty-gritty of SEO, so I typically pivot to focus on the most basic concepts: content and links.
People can wrap their minds around the simple process of creating good pages that answer a specific set of questions and then promoting those pages to acquire endorsements (backlinks). I suggest we embrace this same approach, on a broader scale, as an industry.
When we talk to potential and existing clients, colleagues, executives, etc., let’s keep things simple. If we focus on the two concepts that are the easiest to explain we will get better understanding and more buy-in for the work we do (it also happens that these two factors are the biggest drivers of success).
So go out, shout it from the rooftops — CONTENT AND LINKS — and let’s continue to do the work that will drive positive results for our websites and help secure SEOs’ rightful seat at the marketing table.
Reporting is central to our jobs as SEOs and helps us to communicate the value of our work to stakeholders and clients alike. Without good reporting, it can be a challenge to illustrate our success in search. We know how important it is — but it can also be painful and clunky.
Am I the only one who moderately dreads what we might call “reporting season?” The timing of that season might vary — based on who you work for, what a reporting cycle looks like, and other factors — but ultimately it’s the time of year when we have to get our ducks in a row and report to our stakeholders: not only on the SEO progress that we’ve made, but what that progress equates to in terms of real-world implications.
For me, one of the biggest time-black-holes when building reports is having to collect data from disparate sources to paint a full picture of my SEO work. I find myself grabbing screenshots from various tools, pulling them into a template that I’ve built, and wishing I had a streamlined process for it all … then, repeating the exact same data-wild-goose-chase-and-template-building-acrobatics for each site I track. Ugh.
A solution (which I admit I’m a totally biased fan of) has launched in Moz Pro this week. Within a Campaign’s custom reports, we’ve introduced nine custom report templates to help you report on what matters to your stakeholders. Just select a template and dive into the insights.
These templates are rooted in workflows that are popular within the Moz Pro app. Our team also conducted tons of customer interviews to identify what kinds of templates we needed to build. While you can edit templates to suit your individual needs, they come pre-loaded with descriptive insights and data that stands on its own to tell a story. If you have a Medium-level plan or higher, you’ve already got instant access to these templates.
Use one of Moz’s new report templates to pull together the data you need—depending on exactly what your reader needs to know. Choose from one of our nine most popular templates to tell your SEO story. Here’s what we’ve got:
1. Competitive Analysis Overview Report
The Competitive Analysis Overview Report provides a brief overview of how your site compares to your competitors. It highlights competitive metrics like search visibility and compares your site’s featured snippets, link profiles, and tracked keywords to your competitors. As an overview report, it will help quickly show stakeholders how your site compares to your competitors.
2. Full Competitive Analysis Report
The Full Competitive Analysis Report gives a complete and thorough view of how your site stacks up against the competition. More in-depth and detailed than the aforementioned overview report, this one is perfect for stakeholders who want to know all the details about your SEO competition. It highlights competitive metrics, as well as in-depth comparisons across links, keyword performance, Domain Authority, and more.
3. Campaign Overview Report
The Campaign Overview Report is perfect to provide to any team members or clients who want exactly that—an overview of your site’s Campaign. The report includes a view of your Campaign dashboard, Search Visibility, and a look at site health, link data, and traffic.
4. Link Analysis Report
The Link Analysis Report is ideal to pass along to any stakeholder who is particularly interested in link data. It provides an in-depth look at your own site’s links, as well as how your site stacks up against its competitors when it comes to link profiles. This report includes many important link metrics, including discovered & lost links, linking domains, anchor text, Domain Authority, and more.
5. Rankings Analysis Report
The Rankings Analysis Report will be great for anyone who is curious about your site’s ranking performance, especially when it comes to top keywords. The report highlights a high-level overview of keyword performance, and then digs in to best- and worst-performing keywords, Search Visibility, traffic, and keyword opportunities.
6. Ranking Opportunities Report
The Ranking Opportunities Report is ideal for the stakeholder in your life who wants to know what the next steps might be for your keyword strategy. This report identifies some of the top keyword opportunities pulled in from Keyword Explorer and your Campaign, based on your site’s current performance. By highlighting keywords your site is already ranking for that you aren’t tracking, and opportunities to rank for new keywords, this is an easy report to pass along for consideration around future keyword strategy.
7. Full Site Audit Report
The Full Site Audit Report provides a very thorough, in-depth look at your site’s health. This report is ideal for any stakeholder or client who wants to know precisely how the site is doing and what outstanding work still needs to be done. Based on your site crawl in Moz Pro, this highlights actionable insights such as new and critical issues, crawler warnings, redirect issues, and metadata/content issues.
8. Quick Site Audit Report
The Quick Site Audit Report is a briefer version of the aforementioned Full Site Audit Report. This report is easily digestible for any stakeholders who just want a high-level view of your site’s health and link profile. It highlights top-level crawl metrics, new site crawl issues, and quick link metrics.
9. Search Visibility Report
The Search Visibility Report is ideal for a client or boss who just wants to know the answer to the age-old question: “How visible is my site?” This report provides a quick overview of your Moz Campaign before diving into trending search visibility and a comparison against competitors. Provide a clear answer to the question of how visible your site is with this concise report.
Feeling ready to jump into year-end reporting? We’re looking forward to your feedback. How do the new templates fit into your reporting workflows? Got other ideas on how we can continue to improve your reporting? Please feel free to share in the comments!