The excitement of finishing a competitive keyword research project often gives way to panic as an avalanche of opportunities comes crashing down. Without an organizing principle, a spreadsheet full of keywords is a bottomless to-do list. It’s not enough to know what your competitors are ranking for — you need to know what content is powering those rankings and how you’re currently competing with that content. You need a blueprint to craft those keywords into a compelling structure.
Recently, I wrote a post about the current state of long-tail SEO. While I had an angle for the piece in mind, I also knew it was a topic Moz and others had covered many times. I needed to understand the competitive landscape and make sure I wasn’t cannibalizing our own content.
This post covers one method to perform that competitive content research, using Google’s advanced search operators. For simplicity’s sake, we’ll pare down the keyword research and start our journey with just one phrase: “long tail seo.”
Find your best content (site:)
long tail seo site:moz.com
“long tail seo” site:moz.com
First, what has Moz already published on the subject? By pairing your target keywords with the [site:] operator, you can search for matching content only on your own site. I usually start with a broad-match search, but if your target phrases are made up of common words, you could also use quotation marks for an exact-match search. Here’s the first piece of content I see:
Our best match on the subject is a Whiteboard Friday from five years ago. If I had nothing new to add to the subject and/or I was considering doing a video, this might end my journey. I don’t really want to compete with my own content that’s already performing well. In this case, I decide that I’ve got a fresh take, and I move forward.
Target a specific folder (inurl:)
long tail seo site:moz.com inurl:learn
long tail seo site:moz.com/learn
For larger sites, you might want to focus on a specific section, like the blog, or in Moz’s case, our Learning Center. You have a couple of options here. You could use the [inurl:] operator with the folder name, but that may result in false alarms, like:
This may be useful, in some cases, but when you need to specifically focus on a sub-folder, just add that sub-folder to the [site:] operator. The handy thing about the [site:] operator is that anything left off is essentially a wild card, so [site:moz.com/learn] will return anything in the /learn folder.
Find all competing pages (-site:)
long tail seo -site:moz.com
Now that you have a sense of your own currently ranking content, you can start to dig into the competition. I like to start broad, simply using negative match [-site:] to remove my own site from the list. I get back something like this:
This is great for a big-picture view, but you’re probably going to want to focus in on just a couple or a handful of known competitors. So, let’s narrow down the results …
Explore key competitors (site: OR site:)
long tail seo (site:ahrefs.com OR site:semrush.com)
By using the [OR] operator with [site:] and putting the result in parentheses, you can target a specific group of competitors. Now, I get back something like this:
Is this really different from targeting one competitor at a time? Yes, in one important way: now I can see how these competitors rank against each other.
Explore related content #1 (-“phrase”)
long tail seo -“long tail seo”
As you get into longer, more targeted phrases, it’s possible to miss relevant or related content. Hopefully, you’ve done a thorough job of your initial keyword research, but it’s still worth checking for gaps. One approach I use is to search for your main phrase with broad match, but exclude the exact match phrase. This leaves results like:
Just glancing at page one of results, I can see multiple mentions of “long tail keywords” (as well as “long-tail” with a hyphen), and other variants like “long tail keyword research” and “long tail organic traffic.” Even if you’ve turned these up in your initial keyword research, this combination of Google search operators gives you a quick way to cover a lot of variants and potentially relevant content.
Explore related content #2 (intext: -intitle:)
intext:”long tail seo” -intitle:”long tail seo”
Another handy trick is to use the [intext:] operator to target your phrase in the body of the content, but then use [-intitle:] to exclude results with the exact-match phrase in the title. While the results will overlap with the previous trick, you can sometimes turn up some interesting side discussions and related topics. Of course, you can also use [intitle:] to laser-target your search on content titles.
Find pages by dates (####..####)
long tail seo 2010..2015
In some cases, you might want to target your search on a date-range. You can combine the four-digit years with the range operator [..] to target a time period. Note that this will search for the years as numbers anywhere in the content. While the [daterange:] operator is theoretically your most precise option, it relies on Google being able to correctly identify the publication date of a piece, and I’ve found it difficult to use and a bit unpredictable. The range operator usually does the job.
Find top X lists (intitle:”#..#”)
intitle:"top 11..15" long tail seo
This can get a little silly, but I just want to illustrate the power of combining operators. Let’s say you’re working on a top X list about long-tail SEO, but want to make sure there isn’t too much competition for the 11-15 item range you’re landing in. Using a combo of [intitle:] plus the range operator [..], you might get something like this:
Note that operator combos can get weird, and results may vary depending on the order of the operators. Some operators can’t be used in combination (or at least the results are highly suspicious), so always gut-check what you see.
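Because these operators compose so predictably, a small helper can generate the query strings programmatically before you paste them into Google. This is a minimal sketch; the `build_query` function and its parameter names are hypothetical conveniences, not part of any Google tooling:

```python
def build_query(phrase, include_sites=None, exclude_sites=None,
                exact_match=False, exclude_exact=False):
    """Compose a Google advanced-operator query string."""
    parts = [f'"{phrase}"' if exact_match else phrase]
    if exclude_exact:
        # Broad match on the phrase, minus exact-match results
        parts.append(f'-"{phrase}"')
    if include_sites:
        if len(include_sites) == 1:
            parts.append(f"site:{include_sites[0]}")
        else:
            # Group multiple competitors with OR inside parentheses
            parts.append("(" + " OR ".join(f"site:{s}" for s in include_sites) + ")")
    for s in exclude_sites or []:
        parts.append(f"-site:{s}")
    return " ".join(parts)

print(build_query("long tail seo", include_sites=["moz.com"]))
# long tail seo site:moz.com
print(build_query("long tail seo", include_sites=["ahrefs.com", "semrush.com"]))
# long tail seo (site:ahrefs.com OR site:semrush.com)
print(build_query("long tail seo", exclude_exact=True))
# long tail seo -"long tail seo"
```

Each call reproduces one of the hand-typed queries from the sections above, which makes it easy to loop over a whole keyword list.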
Putting all of the data to work
If you approach this process in an organized way (if I can do it, you can do it, because, frankly, I’m not that organized), what you should end up with is a list of relevant topics you might have missed, a list of your currently top-performing pages, a list of your relevant competitors, and a list of your competitors’ top-performing pages. With this bundle of related data, you can answer questions like the following:
Are you at risk of competing with your own relevant content?
Should you create new content or improve on existing content?
Is there outdated content you should remove or 301-redirect?
What competitors are most relevant in this content space?
What effort/cost will it take to clear the competitive bar?
What niches haven’t been covered by your competitors?
No tool will magically answer these questions, but by using your existing keyword research tools and Google’s advanced search operators methodically, you should be able to put your human intelligence to work and create a specific and actionable content strategy around your chosen topic.
If you’d like to learn more about Google’s advanced search operators, check out our comprehensive Learning Center page or my post with 67 search operator tricks. I’d love to hear more about how you put these tools to work in your own competitive research.
Life rushed back into Jayda’s lungs, sharp and unforgiving. To her left, shards of a thousand synonyms. To her right, the crumbling remains of a mountain of long-tail keywords. As the air filled her lungs, the memories came rushing back, and with them the crushing realization that her team was buried beneath the debris. After months of effort, they had finally finished their competitive keyword research, but at what cost?
Anyone who does SEO as part of their job knows that there’s a lot of value in analyzing which queries are and are not sending traffic to specific pages on a site.
The most common uses for these datasets are to align on-page optimizations with existing rankings and traffic, and to identify gaps in ranking keywords.
However, working with this data is extremely tedious because it’s only available in the Google Search Console interface, where you can only look at one page at a time.
On top of that, to get information on the text included in the ranking page, you either need to manually review it or extract it with a tool like Screaming Frog.
You need this kind of view:
…but even the above view would only be viable one page at a time, and as mentioned, the actual text extraction would still have to be done separately.
Given these apparent issues with the readily available data at the SEO community’s disposal, the data engineering team at Inseev Interactive has been spending a lot of time thinking about how we can improve these processes at scale.
One specific example that we’ll be reviewing in this post is a simple script that allows you to get the above data in a flexible format for many great analytical views.
Better yet, this will all be available with only a few single input variables.
A quick rundown of tool functionality
The tool automatically compares the text on-page to the Google Search Console top queries at the page-level to let you know which queries are on-page as well as how many times they appear on the page. An optional XPath variable also allows you to specify the part of the page you want to analyze text on.
This means you’ll know exactly what queries are driving clicks/impressions that are not in your <title>, <h1>, or even something as specific as the first paragraph within the main content (MC). The sky’s the limit.
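At its core, that comparison boils down to counting phrase occurrences in extracted page text. Here’s a minimal sketch, with invented sample data standing in for the real GSC queries and scraped content the script pulls automatically:

```python
import re

def count_query_occurrences(page_text, queries):
    """Count whole-phrase, case-insensitive occurrences of each query in the text."""
    text = page_text.lower()
    return {q: len(re.findall(re.escape(q.lower()), text)) for q in queries}

# Placeholder page text and query list, purely for illustration
page_text = "Long-tail SEO explained: why long tail keywords still matter."
queries = ["long tail keywords", "long tail seo", "keyword research"]
print(count_query_occurrences(page_text, queries))
# {'long tail keywords': 1, 'long tail seo': 0, 'keyword research': 0}
```

Note that the hyphenated "Long-tail SEO" doesn’t match the unhyphenated query, which is exactly the kind of gap this analysis is designed to surface.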
For those of you not familiar, we’ve also provided some quick XPath expressions you can use, as well as how to create site-specific XPath expressions within the “Input Variables” section of the post.
Post setup usage & datasets
Once the process is set up, all that’s required is filling out a short list of variables and the rest is automated for you.
The output dataset includes multiple automated CSV datasets, as well as a structured file format to keep things organized. A simple pivot of the core analysis automated CSV can provide you with the below dataset and many other useful layouts.
… Even some “new metrics”?
Okay, not technically “new,” but if you exclusively use the Google Search Console user interface, then you likely haven’t had access to metrics like these before: “Max Position,” “Min Position,” and “Count Position” for the specified date range – all of which are explained in the “Running your first analysis” section of the post.
To really demonstrate the impact and usefulness of this dataset, in the video below we use the Colab tool to:
[3 Minutes] — Find non-brand <title> optimization opportunities for https://www.inseev.com/ (around 30 pages in video, but you could do any number of pages)
[3 Minutes] — Convert the CSV to a more usable format
[1 Minute] — Optimize the first title with the resulting dataset
Okay, you’re all set for the initial rundown. Hopefully we were able to get you excited before moving into the somewhat dull setup process.
Keep in mind that at the end of the post, there is also a section including a few helpful use cases and an example template! To jump directly to each section of this post, please use the following links:
[Quick Consideration #2] — This tool has been heavily tested by the members of the Inseev team. Most bugs [specifically with the web scraper] have been found and fixed, but like any other program, it is possible that other issues may come up.
If you encounter any errors, feel free to reach out to us directly at [email protected] or [email protected], and either myself or one of the other members of the data engineering team at Inseev would be happy to help you out.
If new errors are encountered and fixed, we will always upload the updated script to the code repository linked in the sections below so the most up-to-date code can be utilized by all!
One-time setup of the script in Google Colab (in less than 20 minutes)
Things you’ll need:
Google Cloud Platform account
Google Search Console access
Video walkthrough: tool setup process
Below you’ll find step-by-step editorial instructions for setting up the entire process. However, if following editorial instructions isn’t your preferred method, we recorded a video of the setup process as well.
As you’ll see, we start with a brand new Gmail and set up the entire process in approximately 12 minutes, and the output is completely worth the time.
Keep in mind that the setup is one-off, and once set up, the tool should work on command from there on!
Editorial walkthrough: tool setup process
Download the files from Github and set up in Google Drive
Set up a Google Cloud Platform (GCP) Project (skip if you already have an account)
Create the OAuth 2.0 client ID for the Google Search Console (GSC) API (skip if you already have an OAuth client ID with the Search Console API enabled)
Add the OAuth 2.0 credentials to the Config.py file
Part one: Download the files from Github and set up in Google Drive
2. After you log in to your desired Google Cloud account, click “ENABLE”.
3. Configure the consent screen.
In the consent screen creation process, select “External,” then continue onto the “App Information.”
Example below of minimum requirements:
Add the email(s) you’ll use for the Search Console API authentication into the “Test Users” section. These can be emails other than the one that owns the Google Drive. An example may be a client’s email where you access the Google Search Console UI to view their KPIs.
4. In the left-rail navigation, click into “Credentials” > “CREATE CREDENTIALS” > “OAuth Client ID” (Not in image).
5. Within the “Create OAuth client ID” form, fill in:
6. Save the “Client ID” and “Client Secret” — as these will be added into the “api” folder config.py file from the Github files we downloaded.
These should have appeared in a popup after hitting “CREATE”.
The “Client Secret” is functionally the password to your Google Cloud (DO NOT post this to the public/share it online)
Part four: Add the OAuth 2.0 credentials to the Config.py file
1. Return to Google Drive and navigate into the “api” folder.
2. Click into config.py.
3. Choose to open with “Text Editor” (or another app of your choice) to modify the config.py file.
4. Update the three areas highlighted below with your:
CLIENT_ID: From the OAuth 2.0 client ID setup process
CLIENT_SECRET: From the OAuth 2.0 client ID setup process
GOOGLE_CREDENTIALS: Email that corresponds with your CLIENT_ID & CLIENT_SECRET
5. Save the file once updated!
Congratulations, the boring stuff is over. You are now ready to start using the Google Colab file!
Running your first analysis
Running your first analysis may be a little intimidating, but stick with it and it gets easier quickly.
Below, we’ve provided details regarding the input variables required, as well as notes on things to keep in mind when running the script and analyzing the resulting dataset.
After we walk through these items, there are also a few example projects and video walkthroughs showcasing ways to utilize these datasets for client deliverables.
Setting up the input variables
XPath extraction with the “xpath_selector” variable
Have you ever wanted to know every query driving clicks and impressions to a webpage that aren’t in your <title> or <h1> tag? Well, this parameter will allow you to do just that.
While optional, using this is highly encouraged, and we feel it “supercharges” the analysis. Simply define site sections with XPath expressions and the script will do the rest.
In the above video, you’ll find examples of how to create site-specific extractions. In addition, below are some universal extractions that should work on almost any site on the web:
'//title' # Identifies a <title> tag
'//h1' # Identifies an <h1> tag
'//h2' # Identifies an <h2> tag
Site-Specific: How to scrape only the main content (MC)?
Chaining XPaths – Add a "|" Between XPaths
'//title | //h1' # Gets you both the <title> and <h1> tags in one run
'//h1 | //h2 | //h3' # Gets you the <h1>, <h2>, and <h3> tags in one run
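Assuming lxml as the parser (it comes preinstalled in Google Colab), a chained expression behaves as a single union query returning nodes in document order. The HTML snippet below is a stand-in page, not real scraped content:

```python
from lxml import html

# Minimal stand-in page for demonstrating XPath chaining
page = html.fromstring("""
<html><head><title>Long Tail SEO Guide</title></head>
<body><h1>Long Tail SEO</h1><h2>Why it matters</h2><p>Body text.</p></body></html>
""")

# "//title | //h1" grabs both node sets in one pass, in document order
nodes = page.xpath("//title | //h1")
print([n.text_content().strip() for n in nodes])
# ['Long Tail SEO Guide', 'Long Tail SEO']
```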
Here’s a video overview of the other variables with a short description of each.
‘colab_path’ [Required] – The path in which the Colab file lives. This should be “/content/drive/My Drive/Colab Notebooks/”.
‘domain_lookup’ [Required] – Homepage of the website utilized for analysis.
‘startdate’ & ‘enddate’[Required] – Date range for the analysis period.
‘gsc_sorting_field’ [Required] – The tool pulls the top N pages as defined by the user. The “top” is defined by either “clicks_sum” or “impressions_sum.” Please review the video for a more detailed description.
‘gsc_limit_pages_number’ [Required] – Numeric value that represents the number of resulting pages you’d like within the dataset.
‘brand_exclusions’ [Optional] – The string sequence(s) that commonly result in branded queries (e.g., anything containing “inseev” will be branded queries for “Inseev Interactive”).
‘impressions_exclusion’ [Optional] – Numeric value used to exclude queries that are potentially irrelevant due to the lack of pre-existing impressions. This is primarily relevant for domains with strong pre-existing rankings across a large number of pages.
‘page_inclusions’ [Optional] – The string sequence(s) that are found within the desired analysis page type. If you’d like to analyze the entire domain, leave this section blank.
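Put together, a filled-out input-variable cell might look like the sketch below. Every value is an illustrative placeholder rather than a recommendation:

```python
# Illustrative input variables for the Colab script; all values are placeholders.
colab_path = "/content/drive/My Drive/Colab Notebooks/"  # required: fixed Colab path
domain_lookup = "https://www.example.com/"               # required: homepage to analyze
startdate, enddate = "2021-01-01", "2021-03-31"          # required: analysis date range
gsc_sorting_field = "clicks_sum"                         # required: or "impressions_sum"
gsc_limit_pages_number = 25                              # required: top N pages to pull
brand_exclusions = ["example"]                           # optional: brand query substrings
impressions_exclusion = 10                               # optional: minimum impressions
page_inclusions = ["/blog/"]                             # optional: URL substrings to keep
xpath_selector = "//title | //h1"                        # optional: page sections to scrape

# Simple sanity check on the sorting field
assert gsc_sorting_field in ("clicks_sum", "impressions_sum")
```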
Running the script
Keep in mind that once the script finishes running, you’re generally going to use the “step3_query-optimizer_domain-YYYY-MM-DD.csv” file for analysis, but there are others with the raw datasets to browse as well.
That said, there are a few important things to note while testing things out:
2. Google Drive / GSC API Auth: The first time you run the script in each new session it will prompt you to authenticate both the Google Drive and the Google Search Console credentials.
GSC authentication: Authenticate whichever email has permission to use the desired Google Search Console account.
If you attempt to authenticate and get an error like the one below, please revisit “Add the email(s) you’ll use for the Search Console API authentication into the ‘Test Users’ section” from Part 3, step 3 in the process above: setting up the consent screen.
Quick tip: The Google Drive account and the GSC Authentication DO NOT have to be the same email, but they do require separate authentications with OAuth.
3. Running the script: Either navigate to “Runtime” > “Restart and Run All” or use the keyboard shortcut Ctrl + F9 to start running the script.
4. Populated datasets/folder structure: There are three CSVs populated by the script – all nested within a folder structure based on the “domain_lookup” input variable.
Automated Organization [Folders]: Each time you rerun the script on a new domain, it will create a new folder structure in order to keep things organized.
Automated Organization [File Naming]: The CSVs include the date of the export appended to the end, so you’ll always know when the process ran as well as the date range for the dataset.
5. Date range for dataset: Inside of the dataset there is a “gsc_datasetID” column generated, which includes the date range of the extraction.
6. Unfamiliar metrics: The resulting dataset has all the KPIs we know and love – e.g. clicks, impressions, average (mean) position — but there are also a few you cannot get directly from the GSC UI:
‘count_instances_gsc’ — the number of days the query received at least one impression during the specified date range. Scenario example: GSC tells you that you were in an average position of 6 for a large keyword like “flower delivery,” yet you only received 20 impressions in a 30-day date range. It doesn’t seem possible that you were really in position 6, right? Well, now you can see that this was potentially because you only actually showed up on one day in that 30-day range (e.g., count_instances_gsc = 1).
Quick tip #1: Large variance in max/min may tell you that your keyword has been fluctuating heavily.
Quick tip #2: These KPIs, in conjunction with the “count_instances_gsc”, can exponentially further your understanding of query performance and opportunity.
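These per-query position metrics are straightforward to derive once you have daily GSC rows. Here’s a hedged sketch using pandas, with an invented sample standing in for real Search Console data:

```python
import pandas as pd

# Invented daily GSC-style rows, purely for illustration
daily = pd.DataFrame({
    "query": ["flower delivery"] * 3 + ["roses online"] * 2,
    "date": ["2021-03-01", "2021-03-02", "2021-03-03", "2021-03-01", "2021-03-02"],
    "position": [5.0, 7.0, 6.0, 2.0, 2.5],
    "impressions": [8, 6, 6, 40, 35],
})

# Aggregate per query: max/min position plus a count of days with impressions
summary = daily.groupby("query").agg(
    max_position=("position", "max"),
    min_position=("position", "min"),
    count_instances_gsc=("position", "count"),  # days with >= 1 impression
    impressions_sum=("impressions", "sum"),
).reset_index()
print(summary)
```

A large spread between `max_position` and `min_position` here corresponds to the heavy fluctuation flagged in Quick tip #1 above.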
Recommended use: Download the file and use it with Excel. Subjectively speaking, I believe Excel has much more user-friendly pivot table functionality than Google Sheets, which is critical for using this template.
Alternative use: If you do not have Microsoft Excel or you prefer a different tool, you can use most spreadsheet apps that contain pivot functionality.
For those who opt for an alternative spreadsheet software/app:
Below are the pivot fields to mimic upon setup.
You may have to adjust the Vlookup functions found on the “Step 3 _ Analysis Final Doc” tab, depending on whether your updated pivot columns align with the current pivot I’ve supplied.
Project example: Title & H1 re-optimizations (video walkthrough)
Project description: Locate keywords that are driving clicks and impressions to high value pages and that do not exist within the <title> and <h1> tags by reviewing GSC query KPIs vs. current page elements. Use the resulting findings to re-optimize both the <title> and <h1> tags for pre-existing pages.
Project assumptions: This process assumes that inserting keywords into both the <title> and <h1> tags is a strong SEO practice for relevancy optimization, and that it’s important to include related keyword variants into these areas (e.g. non-exact match keywords with matching SERP intent).
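The gap check at the heart of this project can be sketched in a few lines. The title and query list below are invented for illustration; the real script compares GSC queries against the scraped `<title>` and `<h1>` text:

```python
def queries_missing_from_element(queries, element_text):
    """Return queries that do not appear (case-insensitively) in the element text."""
    text = element_text.lower()
    return [q for q in queries if q.lower() not in text]

# Placeholder title and top GSC queries for a hypothetical page
title = "Flower Delivery in Austin | Example Florist"
top_queries = ["flower delivery austin", "same day flower delivery", "flower delivery"]
print(queries_missing_from_element(top_queries, title))
# ['flower delivery austin', 'same day flower delivery']
```

The flagged queries are the re-optimization candidates: phrases already earning clicks or impressions that the current title doesn’t contain.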
Project example: On-page text refresh/re-optimization
Project description: Locate keywords that are driving clicks and impressions to editorial pieces of content that DO NOT exist within the first paragraph within the body of the main content (MC). Perform an on-page refresh of introductory content within editorial pages to include high value keyword opportunities.
Project assumptions: This process assumes that inserting keywords into the first several sentences of a piece of content is a strong SEO practice for relevancy optimization, and that it’s important to include related keyword variants into these areas (e.g. non-exact match keywords with matching SERP intent).
We hope this post has been helpful and opened you up to the idea of using Python and Google Colab to supercharge your relevancy optimization strategy.
As mentioned throughout the post, keep the following in mind:
Github repository will be updated with any changes we make in the future.
There is the possibility of undiscovered errors. If these occur, Inseev is happy to help; in fact, we would appreciate you reaching out so we can investigate and fix them. That way, others won’t run into the same problems.
Other than the above, if you have any ideas on ways to Colab (pun intended) on data analytics projects, feel free to reach out with ideas.
The author’s views are entirely his or her own (excluding the unlikely event of hypnosis) and may not always reflect the views of Moz.
As SEO professionals, we can easily fall behind on the latest Google search engine result page (SERP) features. Frequent updates keep us on our toes, and also keep our jobs interesting.
Recently, I teamed up with fellow Moz writer and all-round brilliant SEO, Izzi Smith, to create a new SEO quiz series named “SERP Pursuit”. The quiz is still open, if you’d like to test your knowledge of Google’s SERP features.
The outcome of the quiz was a collection of insights from the SEO community about different SERP feature topics, including the questions where participants may have struggled or become confused.
Thanks to everyone who shared and participated in the quiz! The top question in the series received 825 answers, providing a strong sample size. For each question, the sample size is included alongside the question.
The six common misconceptions found in our data relate to structured data (in terms of schema.org), Featured Snippets, unpaid Shopping tab listings (now referred to as “free product listings”), and also Web Stories.
Here are the questions and their answers, along with further details explaining why each correct answer is correct.
FAQ and How-To Schema rich results
Using too much Structured Data markup
Structured Data influencing Featured Snippets
Scroll-to-text with Featured Snippets
Unpaid Shopping tab listing inputs
Web Story device type rich results
1. What is the maximum number of FAQ and How-To schema rich results that can appear on the first page of Google?
The maximum number of FAQ and How-To schema rich results that can appear on the first page of Google is three. This has been proven for both FAQ schema and How-To schema rich results across mobile and desktop search results. Filtering will happen if more than three rich results are eligible.
According to our question sample of 775 answers, the most popular answer at 39% was that there is no limit in place for FAQ and How-To rich results (incorrect). The correct answer of “3” was selected by 34% of respondents.
Participants may have been drawn to the “no limit” response because this has historically been the answer for rich results other than FAQ and How-To schema. For instance, review snippets with Product schema don’t have limitations regarding what Google results page they can appear on, or the amount that can appear at the same time.
I am, however, glad to see the amount of participants that gave the correct answer, as this is a topic I’ve written about extensively over the past couple of years. Filtering is a very common reason for FAQ and How-To rich results not appearing, and being aware of the limitations can be a big time-saver for troubleshooting.
2. Is it possible to have too much structured data markup on a single page?
No, it is not possible to have too much structured data on a single page. But just because there are no repercussions from Google for excessive usage, time is often better spent elsewhere. Ultimately, you should focus on what provides value to your site: using valid schema containing information used by Google.
According to our question sample of 604 answers, the most popular answer at 55% was that it is possible to have too much structured data on a single page (incorrect answer). The correct answer of “False” was selected by 45% of respondents.
This question is one that comes down to semantics, but does cause confusion among the SEO community. In the context of SEO, there is no generic ranking factor associated with structured data usage. But using the right schema types, with a good spread of usage, can provide relevant results for users. It’s also good to keep in mind what does and does not yield rich results.
For the most part, it is my opinion that structured data shouldn’t be a task that requires a significant and regular investment of time. If using WordPress, there are tools such as Yoast that have already solved many of the ongoing structured data issues faced by sites. Their plugin provides Google with plenty of structured data signals without an extensive time investment.
3. Is structured data (in the context of schema.org) used to generate Featured Snippets on Google?
No, structured data (in the context of schema.org) is not used to generate Featured Snippets on Google. The structure of content on a page is, however, often a contributing factor. Google’s systems determine whether content is or isn’t suitable for inclusion in Featured Snippets.
According to our question sample of 579 answers, the most popular answer at 52% was that structured data is used to generate Featured Snippets (incorrect answer). The correct answer of “False” was selected by 48% of respondents.
The misconception of structured data influencing Featured Snippets is one that I come across often. It is often based on experiments where structured data is added to a page, and the page is then seen appearing in a Featured Snippet. With an understanding of how Featured Snippets operate, this connection doesn’t make sense, even as a contributing factor.
In my last article for Moz, I showed how some sites can be prevented from ranking within Featured Snippets. This shows the complexity around how content is presented prominently at the top of Google’s search results, but structured data is something Google has repeatedly said doesn’t influence Featured Snippets, as far back as 2015.
4. In which scenarios will the yellow text highlight and scroll-to functionality trigger once a Featured Snippet result is clicked?
Currently, scroll-to-text will only ever be triggered for Featured Snippets on Google in two separate scenarios: when the Chrome browser is in use (on both mobile and desktop), and when the URL is built using Accelerated Mobile Pages (AMP) and opened on a mobile device, with any browser in use.
According to our quiz sample of 527 answers, the most popular answer at 48% was the correct answer on the quiz. The closest answer to this, mentioning that this will only ever happen on Chrome on desktop, was selected by 36% of respondents.
Although the highest percentage of respondents answered this question correctly, I believe it is still worthwhile to discuss, since it wasn’t over half of respondents. Featured Snippet highlights have been happening as far back as December 2019, but were originally exclusive to pages built with AMP.
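For reference, the highlight-and-scroll behavior is powered by the Text Fragments URL syntax (`#:~:text=`), which you can construct yourself to test how it renders. A small sketch; the URL and snippet below are placeholders:

```python
from urllib.parse import quote

def text_fragment_url(url, snippet):
    """Append a Text Fragments directive so supporting browsers scroll to and highlight the snippet."""
    # "#:~:text=" is the fragment directive syntax Chrome implements
    return f"{url}#:~:text={quote(snippet)}"

print(text_fragment_url("https://example.com/guide", "long tail keywords"))
# https://example.com/guide#:~:text=long%20tail%20keywords
```

Opening a link like this in Chrome reproduces the same yellow-highlight jump that Featured Snippet clicks trigger.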
5. With unpaid Google Shopping tab listings, are product feeds submitted via Merchant Center and structured data used as inputs?
The unpaid listings that appear within Google’s Shopping tab are based exclusively on data submitted via product feeds in Merchant Center. Support originally covered both product feeds and structured data, but Google’s documentation was updated in May of 2020 to list product feeds exclusively.
According to our question sample of 468 answers, the most popular answer at 78% was that both product feeds and structured data are used as inputs (incorrect answer). The correct answer of “False” was selected by only 22% of respondents.
Out of all questions in the quiz, this was the one that tripped up the most respondents. Prior to May 2020, it was the case that both product feeds and structured data were used as inputs for unpaid Shopping tab listings. Because this change was only shown in Google’s documentation, and wasn’t part of the original announcement, I can see how it could have confused respondents.
A recent announcement from Google changed the naming from “surfaces across Google” to “free product listings” or “free listings”, which is also good to keep in mind for this feature. But if you’re ever trying to troubleshoot issues related to the “free listings” within the Shopping tab, spend your time investigating your Merchant Center data, not your product structured data.
6. When it comes to rich results, do Web Stories make your search result stand out more prominently on both desktop and mobile?
While Google’s Web Stories are a feature that can rank on both mobile and desktop search results, the rich result element only comes into play on mobile. Using the AMP Test, which now has support for Web Stories, you can preview how your Web Story rich results will appear on mobile devices.
According to our question sample of 407 answers, the most popular answer at 69% was that Web Story rich results show on both desktop and mobile (incorrect answer). The correct answer of “False” was selected by 31% of respondents.
When a Web Story surfaces in Google’s search results while using a desktop device, it can be an… odd experience for users. They select the result, then they’re suddenly catapulted into the immersive, full-screen Web Story experience. This happens without any prompt for the user (because the URL looks like a standard web search listing), a UX issue that Google still needs to address.
Web Stories are an interesting format that’s worth experimenting with, considering the rich results benefit on mobile for standing out more, as well as the prominence they’re often given within Google’s Discover feed. I’ve written about several Web Stories SEO tips for publishers to keep in mind when creating them with Google’s plugin.
The questions covered in this post are the top areas where respondents of our quiz struggled most relating to structured data, Featured Snippets, free product listings, and Web Stories. If some of these SERP feature questions had you feeling confused, don’t be too hard on yourself; use this as an opportunity to improve your understanding!
Nailing the areas mentioned in this post is a great start to expanding your Google SERP feature knowledge. SERP features change frequently on Google. If you’re wanting to keep updated on the latest changes to Google’s documentation, I’d highly suggest bookmarking this page, which features key changes as they’re made.
The author’s views are entirely his or her own (excluding the unlikely event of hypnosis) and may not always reflect the views of Moz.
How should you handle Google My Business listings when circumstances force a multi-location brand to consolidate?
This question is one I’m now increasingly receiving from local enterprises. Brands which made rapid adaptations in 2020 to continue serving the public are now having to make longer-term decisions based on the COVID-19 recession, altered consumer behavior, and budget.
Uprooting branches is painful. I believe it’s still too early to predict whether customers’ habits have been permanently changed by the pandemic and adoption of emergent service methodologies, such as telemeetings or home delivery. Forever is a very long time. That being said, Black Friday 2020 saw foot traffic cut in half, and some multi-location brands, faced with this reality, are having to evaluate how to consolidate their bases of operation. A brand which formerly maintained five storefronts in a single city wants to know if it can weather the storm and build a future from just one physical locale.
Each business scenario is different, but there are general questions you should ask prior to consolidation, and there are specific steps to take if you determine the business you’re marketing must retrench. I want to be sure to mention that this article deals with permanent location closure. If you need to temporarily close a location due to COVID, read Google’s guidance on this.
Today, we’ll help you consider important factors in the decision-making process about permanent closure of locations, walk you through managing Google My Business listing consolidation with help from a Google Gold Product Expert, and prepare you for changes you may experience as a result of reducing your local footprint while working to make the best of a bad situation.
One of the challenging aspects of this tough scenario is that the business you’re marketing will need to choose which locations to close and which ones should remain open. I recommend asking these four questions, because the answers will be different for each brand and each market:
1. Is bias towards a city or industry centroid appearing to impact the local results for my top search phrases?
Go to Google and look up the name of a city in which you’re considering consolidation. Click on the map and identify where Google is locating the name of the city on the map. That is roughly Google’s idea of the center, or centroid, of the city:
Now, from a remote location (not at or too near your place of business), perform some of your most important searches and evaluate whether Google appears to be clustering the local pack, local finder, and Google Maps results around that centroid, or if they look fairly evenly distributed around the city. Document your findings.
Next, evaluate whether there is an industry centroid appearing to exert influence on the results for your searches. For example, in this search for auto dealerships, Google is clustering the lion’s share of the results around the auto row in this town, though there are many other dealerships in other parts of the city:
If one of your locations in a particular city is close to a city or industry centroid, and these points on the map appear to be influencing local search results for your most important search phrases, count this as a vote for keeping this location open while closing others that aren’t as close to these centroids.
2. Which location has historically done the highest volume of business, and does this still hold true today?
Take whatever practical data you have about the real-world performance of all your locations within a city. Compare pre-pandemic rates of foot traffic, transactions, phone calls, and any other metrics you have to these same figures today, even if you transition the data points to cover adaptations like curbside pickups, requests for delivery, or telemeetings. Document your findings. Count a vote for the winner of these benchmarks.
3. Which location is performing best in Google’s local results?
Here, you want to identify whether the GMB listing for one of your locations is outperforming the others in a city for core search phrases. Perhaps it’s the one with the highest star rating, the most reviews, the most owner responses, the closest proximity to a centroid, the best photos, more Q&A, or more regularly-scheduled Google Posts. Whatever factors are driving it to rank best for the business, document your findings.
If one of your listings stands out as the strongest, count that as a vote for it.
4. Which location, if any, has amenities that have strengthened its ability to serve during an emergency?
It may be the location with the biggest parking lot that is facilitating easier curbside pickup. Or the one with the drive-thru window. Or the one with the biggest storeroom to house products telehelp experts can interact with during customer service meetings.
If one of your locations has been better able to safely and effectively serve the public during the public health emergency, count that as a vote for it being the one that remains open.
To finalize, look at which location won the most votes in these four questions, and include that information in your final deliberations about which place should remain open while others are closed.
COVID-19 has created scenarios that local businesses have never faced before. Google has done a good job rolling out reactive features, like new GMB attributes and post types, but I think they are still trying to play catch-up in terms of updating guidelines for unforeseen scenarios. Normally, when I encounter a novel situation, I directly contact Google staff to be sure the advice I might give to a business has their official stamp of approval. But, for the past few months, I haven’t received responses to my requests for comment and guidance via the usual channels.
This led me to post about the emergence of COVID-driven location consolidation in Google’s help forum, and fortunately for all of us, volunteer Gold Product Expert, Krystal Taing, took the time to respond with thought and care. If you’ve determined that consolidating your locations within a city is necessary, here is Krystal’s advice on how to follow through digitally with your Google My Business listings:
Go to Google Maps and click the “suggest an edit” button. Mark the closed locations as “moved” to the location that will remain open. See the option for this in the following screenshot:
Send a request to Google to ask if they will transfer the reviews of the closed locations to the remaining location. Here is Google’s process for doing this. It’s important to know that it’s not guaranteed that Google will move your reviews, but it’s worth it to ask. Be sure to include the Maps URLs or listing CID numbers from both the closed listings and the remaining listing in your request to be clear about which locations you mean.
Sincere thanks to Krystal Taing for providing a process for all brands who are facing this dilemma. The volunteers in Google’s forum provide so much guidance, for free, and I sincerely hope Google will evaluate the emergence of COVID consolidation and release official guidelines for it sometime this year.
I want to address this very important question by first expressing my sympathy for any brand owners who have found themselves in this situation, and for their marketers who are trying to give good advice in difficult times. I’m sorry. You likely already know in your gut that having to close locations will have a negative impact on your business, and you’re right to suspect that having to close your GMB listings could be detrimental to your overall online visibility, as well.
I want to re-emphasize that every business scenario is different, but my prediction would be that brands which will suffer the greatest losses of digital visibility will be those whose models depend most on Google’s user-to-business proximity bias.
Because of this, if your business model is something like a convenience store that formerly had five locations in a city to ensure that customers could quickly get to you in every neighborhood, and you’re now reducing to a single location, it’s less likely that Google will continue to surface your remaining listing to people in the more distant neighborhoods you’ve now vacated. You’re likely to see significant losses of rankings, traffic, and transactions, due to the reduction of your local footprint.
However, if your business model is something like a big home improvement store or a restaurant with a rare menu, and people wouldn’t normally mind driving across town to get to you, then having to consolidate may have less impact on your visibility. Loss of locations may mean a bit less convenience for hyperlocal customers who formerly enjoyed being able to hike to your door, but in many cases, where the local market isn’t oversaturated with near-identical options, you’ll be able to retain good visibility on the maps.
As Krystal Taing points out:
“The label on the listings will still display as ‘Permanently Closed’ since that is true. However, there will be a fairly short period of time when this will still display to users unless they have direct links to the business profile. Marking the business as moved will help Google understand the alternative listings to display in results when they would have normally displayed that store to a user.”
So, there is some reason to hope that for many business models, the negative impacts of location and listing consolidation may not be as dire as we might fear, but I don’t want to give false hope here. You should plan to see drop-off across multiple metrics, and should be doing everything you can to ameliorate outcomes.
Crucial to this will be communication with your existing consumer base. You want to prevent loyal customers who encounter that big, ugly “permanently closed” label on listings from misapprehending that your brand has gone out of business. Let’s take a look at your options.
Can anything good come of all this difficult pruning? We can make a start with five proactive steps you can take to reduce the chance that customers mistake closure of some locations for total brand closure:
If your number of overall brand locations was limited (perhaps 10 or fewer) to begin with, announce the consolidation on the homepage of your website and on a site-wide banner. Address customers who formerly frequented the closed locations and offer clear directions to your new location, including written directions, maps, and photos. Word this as a warm welcome and let customers know you value their business and want to see them at your open location. If you had a large number of former locations, a better place for this messaging would be the location landing pages of the former locations, pointing users to the pages of those that remain open.
Email the same message to your email database prior to closing each location. Then, follow up with another email blast over the next few months. Remind customers that you are still in town and ready to serve them at a different location.
Be as active as possible on social media to communicate the change. You might even consider offering a special promotion for customers who make the switch from shopping at your old locations to patronizing the remaining one.
Be sure you have edited all of your citations on other platforms to reflect the change, reducing the chance of inconveniencing customers. If you’ve been using Moz Local to manage your listings across the Internet, you can automate this work of closure to save time and hassle.
Don’t assume everyone has the Internet. Find offline channels in your community that neighbors rely on for news and work to get the message out about where your open business is located.
Finally, if there is any silver lining to consolidation, it’s that you can now focus the strength of your marketing into fewer locations. Three points to consider:
Major improvements at smaller scale
With overhead significantly reduced, is now the time to invest in better e-commerce and a great home delivery system to ensure you can still serve customers across a city, or even throughout several neighboring communities? Read up on the pros and cons of in-house delivery vs. third party last-mile fulfillment. Or, if your business deals in services rather than products, are there strides you can make in teleservices that would make your brand the most accessible and satisfying one with which a community can transact? With fewer locations to manage, focus in on best-in-town customer service.
Another idea: could any of your former locations be replaced by a kiosk? Many are eligible for GMB inclusion and the existence of kiosks could regrow your local footprint in new ways.
Location landing page adjustments
What will you do with the landing pages of locations that are now permanently closed? The answer to this depends on your business model. If your business is geared towards services instead of products, and you are continuing to serve the areas where you’ve had to close locations, then there could be good reason to maintain these pages, diversifying them as much as possible with fresh, hyperlocal content.
If, however, the business is product-oriented, it’s likely that you’ll want to permanently 301 redirect these pages to the landing page for the branches that remain open. You might want to do this in stages, first keeping the pages live for a time as a place to announce the changes and to point website visitors to locations that remain open. You might have this be the structure for six months or a year. After whatever period of adjustment you feel is reasonable to ensure a given community knows of the changes, your final step could then be 301 redirection.
The bright side of this is that any link power the multiple old pages earned will now flow through to the one landing page for the open location, which could give it a new ranking boost once Google has had time to process the change.
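As a minimal sketch of that final step, the redirect logic amounts to a permanent (301) mapping from each closed location’s page to the remaining branch. The paths below are hypothetical, and in practice you would usually configure this in your web server or CMS rather than in application code:

```python
# Hypothetical URL paths: each closed location's landing page maps to the
# landing page of the branch that remains open.
REDIRECTS = {
    "/locations/downtown": "/locations/main-street",
    "/locations/eastside": "/locations/main-street",
}

def resolve(path: str) -> tuple[int, str]:
    """Return (HTTP status, target path) for a request.

    301 signals a permanent move, which lets search engines transfer
    the old page's equity to the remaining location's page.
    """
    if path in REDIRECTS:
        return (301, REDIRECTS[path])
    return (200, path)

print(resolve("/locations/downtown"))  # → (301, '/locations/main-street')
```

The same mapping could be expressed as `Redirect 301` lines in an Apache config or `return 301` blocks in nginx; the key point is the permanent status code, not the mechanism.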
Over the next few years, your consolidated business will need to continuously evaluate opportunities for growth, even if the goal is no longer re-opening the same number of locations you operated prior to the pandemic. Rather, you might find new ways to become part of the essential fabric of local communities, expand your team nationally or even internationally to include greater levels of expertise because technology makes it possible, or develop the next app that solves a pain point you’ve had to experience first-hand and that you know your peers are struggling with, too. As you prune and trim, a new and solid conception of your business may emerge over time.
As I mentioned above, forever is a long time. I don’t think any economist or marketer can realistically predict what the new normal will be when we have hopefully put these hard times behind us. We’re all guessing. What I would bet on, though, is that the entrepreneurial spark that helped you grow your business to its greatest heights before the crisis is still burning bright. Your observational powers, business acumen, and drive to contribute to our joint recovery matter, and the communities that you serve will be counting on you to lead the way forward. Wishing you success, every step of the way.
Google must be one of the most experimental enterprises the world has ever known. When it comes to the company’s local search interfaces, rather than rolling them all out as a single, cohesive whole, they have emerged in piecemeal fashion over two decades with different but related feature sets, unique URLs, and separate branding. Small wonder that confusion arises in dialog about aspects of local search. You, your agency coworkers, and your clients may find yourselves talking at cross-purposes about local rankings simply because you’re all looking at them on different interfaces!
Such is certainly the case with Google Maps vs. the object we call the Google Local Finder. Even highly skilled organic SEOs at your agency may not understand that these are two different entities which can feature substantially different local business rankings.
Today we’re going to clear this up, with a side-by-side comparison of the two user experiences, expert quotes, and a small, original case study that demonstrates and quantifies just how different rankings are between these important interfaces.
I manually gathered both Google Maps and Local Finder rankings across ten different types of geo-modified, local intent search phrases and ten different towns and cities across the state of California. I looked at differences both across search phrase and across locale, observing those brands which ranked in the top 10 positions for each query. My queries were remote (not performed within the city nearest me) to remove the influence of proximity and establish a remote baseline of ranking order for each entry. I tabulated all data in a spreadsheet to discover the percentage of difference in the ranked results.
Results of my study of Google Maps vs. the Local Finder
Before I roll out the results, I want to be sure I’ve offered a good definition of these two similar but unique Google platforms. Any user performing a local search (like “best tacos san jose”) can take two paths for deep local results:
Path one starts with a local pack, typically made up of three results near the top of the organic search results. If clicked on, the local pack takes the user to the Local Finder, which expands on the local pack to feature multiple listings, accompanied by a map. These types of results exist on google.com/search.
Path two may start on any Android device that features Google Maps by default, or it can begin on a desktop device by clicking the “Maps” tab above the organic SERPs. These types of results look quite similar to the Local Finder, with their list of ranked businesses and associated map, but they exist on google.com/maps.
Here’s a side-by-side comparison:
At first glance, these two user experiences look fairly similar with some minor formatting and content differences, but the URLs are distinct, and what you might also notice in this screenshot is that the rankings, themselves, are different. In this example, the results are, in fact, startlingly different.
I’d long wanted to quantify for myself just how different Maps and Local Finder results are, and so I created a spreadsheet to track the following:
Ten search phrases of different types including some head terms and some longer-tail terms with more refined intent.
Ten towns and cities from all parts of the big state of California covering a wide population range. Angels Camp, for example, has a population of just 3,875 residents, while LA is home to nearly 4 million people.
I found that, taken altogether, the average difference in Local Finder vs. Maps results was 18.2% across all cities. The average difference was 18.5% across all search phrases. In other words, nearly one-fifth of the results on the two platforms didn’t match.
Here’s a further breakdown of the data:
Average percentage of difference by search phrase
grocery store (19%)
personal injury attorney (18%)
house cleaning service (10%)
electric vehicle dealer (16%)
best tacos (11%)
cheapest tax accountant (41%)
nearby attractions (8%)
women’s clothing (39%)
Average percentage of difference by city
Angels Camp (28%)
San Jose (15%)
San Rafael (24%)
San Francisco (4%)
Los Angeles (25%)
San Diego (16%)
Grass Valley (15%)
While many keyword/location combos showed 0% difference between the two platforms, others featured degrees of difference of 20%, 30%, 50%, 70%, and even 100%.
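The study doesn’t publish its exact formula, but a per-combo difference like those above can be approximated as the share of listings that appear in one platform’s top 10 but not the other’s. A minimal sketch, using made-up business names:

```python
def ranking_difference(finder: list[str], maps: list[str]) -> float:
    """Percent of listings appearing in one top-10 list but not the other.

    This is one reasonable interpretation of "percentage of difference";
    it ignores ordering and only measures membership mismatch.
    """
    finder_set, maps_set = set(finder), set(maps)
    mismatched = len(finder_set.symmetric_difference(maps_set))
    total = len(finder_set) + len(maps_set)
    return 100 * mismatched / total

# Hypothetical example: two of ten listings differ between the platforms.
finder = ["A", "B", "C", "D", "E", "F", "G", "H", "I", "J"]
maps = ["A", "B", "C", "D", "E", "F", "G", "H", "X", "Y"]
print(ranking_difference(finder, maps))  # → 20.0
```

Identical lists score 0%, and fully disjoint lists score 100%, matching the range of outcomes observed in the study.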
It would have been lovely if this small study surfaced any reliable patterns for us. For example, looking at the fact that the small, rural town of Angels Camp was the locale with the most diverse SERPs (28%), one might think that the smaller the community, the greater the variance in rankings. But such an idea founders when observing that the city with the second-most variability is LA (25%).
Similarly, looking at the fact that a longer-tail search like “cheapest tax accountant” featured the most differences (41%), it could be tempting to theorize that greater refinement in search intent yields more varied results. But then we see that “best tacos” results were only 11% different across Google Maps and the Local Finder. So, to my eyes, there is no discernible pattern from this limited data set. Perhaps narratives might emerge if we pulled thousands of SERPs.
For now, all we can say with confidence is that the rankings a business enjoys in Google’s Local Finder frequently won’t match its rankings in Google Maps. Individual results sets for keyword/locale combos may vary not at all, somewhat, substantially, or totally.
Maps vs. Finders: What’s the diff, and why?
The above findings from our study naturally lead to the question: why are the results for the same query different on the two Google platforms? For commentary on this, I asked three of my favorite local SEOs for theories on the source of the variance, and any other notable variables they’ve observed.
“I think that the differences are driven by the subtle differences of the ‘view port’ aspect ratio and size differences in the two environments. The viewport effectively defines the cohort of listings that are relevant enough to show. If it is larger, then there are likely more listings eligible, and if one of those happens to be strong, then the results will vary.”
Here’s an illustration of what Mike is describing. When we look at the results for the same search in the Local Finder and Google Maps, side by side, we often see that the area shown on the map is different at the automatic zoom level:
“Typically when I begin searches in Maps, I am seeing a broader area of results being served as well as categories of businesses. The results in the Local Finder are usually more specific and display more detail about the businesses. The Maps-based results are delivered in a manner that show users desire discovery and browsing. This is different from the Local Finder in that these results tend to be more absolute and about Google pushing pre-determined businesses and information to be evaluated by the user.”
Krystal is a GMB Gold Product Expert, and her comment was the first time I’d ever heard an expert of her caliber define how Google might view the intent of Maps vs. Finder searchers differently. Fascinating insight!
“What varies is mainly the features that Google shows. For example, products will show up on the listing in the Local Finder but not on Google Maps and attribute icons (women-led, Black-owned, etc.) show up on Google Maps but not in the Local Finder. Additionally, searches done in the Local Finder get lumped in with search in Google My Business (GMB) Insights whereas searches on Maps are reported on separately. Google is now segmenting it by platform and device as well.”
In sum, Google Maps vs. Local Finder searchers can have a unique UX, at least in part, because Google may surface a differently-mapped area of search and can highlight different listing elements. Meanwhile, local business owners and their marketers will discover variance in how Google reports activity surrounding these platforms.
What should you do about the Google Maps vs. Local Finder variables?
As always, there is nothing an individual can do to cause Google to change how it displays local search results. Local SEO best practices can help you move up in whatever Google displays, but you can’t cause Google to change the radius of search it is showing on a given platform.
That being said, there are three things I recommend for your consideration, based on what we’ve learned from this study.
1. See if Google Maps is casting a wider net than the Local Finder for any of your desired search phrases.
I want to show you the most extreme example of the difference between Maps and the Local Finder that I discovered during my research. First, the marker here locates the town of Angels Camp in the Sierra foothills in east California:
For the search “personal injury attorney angels camp”, note the area covered by the map at the automatic zoom level accompanying the Local Finder results:
The greatest distance between any two points in this radius of results is about 100 miles.
Now, contrast this with the same search as it appears at the automatic zoom level on Google Maps:
Astonishingly, Google is returning a tri-state result for this search in Maps. The greatest distance between two pins on this map is nearly 1,000 miles!
As I mentioned, this was the most extreme case I saw. Like most local SEOs, I’ve spent considerable time explaining to clients who want to rank beyond their location that the further a user gets from the brand’s place of business, the less likely they are to see it come up in their local results. Typically, your best chance of local pack rankings begins with your own neighborhood, with a decent chance for some rankings within your city, and then a lesser chance beyond your city’s borders.
But the different behavior of Maps could yield unique opportunities. Even if what’s happening in your market is more moderate in terms of the radius of results, my advice is to study the net Google is casting for your search terms in Maps. If it is even somewhat wider than what the Local Finder yields, and there is an aspect of the business that would make it valuable to bring in customers from further afield, some strategic marketing activities could strengthen your position in these unusual results.
For example, one of the more distantly-located attorneys in our example might work harder to get clients from Angels Camp to mention this town name in their Google-based reviews, or might publish some Google posts about Angels Camp clients looking for the best possible lawyer regardless of distance, or publish some website content on the same topic, or look to build some new relationships and links within this more distant community. All of this is very experimental, but quite intriguing to my mind. We’re in somewhat unfamiliar territory here, so don’t be afraid to try and test things!
As always, bear in mind that all local search rankings are fluid. For verticals which primarily rely on the narrowest user-to-business proximity ratios for the bulk of transactions, more remote visibility may have no value. A convenience store, for example, is unlikely to garner much interest from faraway searchers. But for many industries, any one of these three criteria could make a larger local ranking radius extremely welcome:
The business model is traditionally associated with traveling some distance to get to it, like hotels or attractions (thinking post-pandemic here).
Rarity of the goods or services being offered makes the business worth driving to from a longer distance. This is extremely common in rural areas with few nearby options.
The business has implemented digital shopping on its website due to the pandemic and would now like to sell to as many customers as possible in a wider region with either driver delivery or traditional shipping as the method of fulfillment.
If any of those scenarios fits a local brand you’re marketing, definitely look at Google Maps behavior for focus search phrases.
2. Flood Google with every possible detail about the local businesses you’re marketing
As Joy Hawkins mentioned above, there can be many subtle differences between the elements Google displays within listings on their two platforms. Look at how hours are included in the Maps listing for this taco shop but absent from the Finder. The truth is, Google changes the contents of the various local interfaces so often that even the experts are constantly asking themselves and one another if some element is new.
The good news is, you don’t need to spend a minute worrying about minutiae here if you make just 5 commitments:
Add to this a modest investment in non-dashboard elements like Google Questions and Answers which exist on the Google Business Profile
Be sure your website is optimized for the terms you want to rank for
Earn publicity on the third-party websites Google uses as the “web results” references on your listings.
I realize this is a tall order, but it’s also basic, good local search marketing and if you put in the work, Google will have plenty to surface about your locations, regardless of platform variables.
3. Study Google Maps with an eye to the future
Google Maps, as an entity, launched in 2005, with mobile app development spanning the next few years. The Local Finder, by contrast, has only been with us since 2015. Because local packs default to the Local Finder, it’s my impression that local SEO industry study has given the lion’s share of research to these interfaces, rather than to Google Maps.
I would suggest that 2021 is a good year to spend more time looking at Google Maps, interacting with it, and going down its rabbit holes into the weird walled garden Google continues to build into this massive interface. I recommend this, because I feel it’s only a matter of time before Google tidies up its piecemeal, multi-decade rollout of disconnected local interfaces via consolidation, and Maps has the history at Google to become the dominant version.
We’ve learned today that Google Maps rankings are, on average, nearly 20% different than Local Finder rankings, that this may stem, in part, from unique view port ratios, that it’s possible Google may view the intent of users on the two platforms differently, and that there are demonstrable variables in the listing content Google displays when we look at two listings side-by-side. We’ve also looked at some scenarios in which verticals that could benefit from a wider consumer radius would be smart to study Google Maps in the year ahead.
I want to close with some encouragement for everyone participating in the grand experiment of Google’s mapping project. The above photo is of the Bedolina Map, which was engraved on a rock in the Italian alps sometime around 500 BC. It is one of the oldest-known topographic maps, plotting out pathways, agricultural fields, villages, and the people who lived there. Consider it the Street View of the Iron Age.
I’m sharing this image because it’s such a good reminder that your work as a local SEO linked to digital cartography is just one leg of a very long journey which, by nature, requires a willingness to function in an experimental environment. If you can communicate this state of permanent change to clients, it can decrease stress on both sides of your next Zoom meeting. Rankings rise and fall, and as we’ve seen, they even differ across closely-related platforms, making patience essential and a big-picture view of overall growth very grounding. Keep studying, and help us all out on the mapped path ahead by sharing what you learn with our community.
Looking to increase your general knowledge of local search marketing? Read The Essential Local SEO Strategy Guide
Google My Business is both a free tool and a suite of interfaces that encompasses a dashboard, local business profiles, and a volunteer-driven support forum with this branding. Google My Business and the associated Google Maps make up the core of Google’s free local search marketing options for eligible local businesses.
Today, we’re doing foundational learning! Share this simple, comprehensive article with incoming clients and team members to get off on the right foot with this important local business digital asset.
An introduction to the basics of Google My Business
First, let’s get on the same page regarding what Google My Business is and how to be part of it.
What is Google My Business?
Google My Business (GMB) is a multi-layered platform that enables you to submit information about local businesses, to manage interactive features like reviews and questions, and to publish a variety of media like photos, posts, and videos.
What is GMB eligibility?
Eligibility to be listed within the Google My Business setting is governed by the Guidelines for representing your business on Google, which is a living document that undergoes frequent changes. Before listing any business, you should consult the guidelines to avoid violations that can result in penalties or the removal of your listings.
You need a Google account to get started
You will need a Google account to use Google’s products, and you can create one here if you don’t already have one. It’s best for each local business to have its own company account, instead of marketing agencies using their own accounts to manage clients’ local business profiles.
When a local business you’re marketing has a large in-house marketing department or works with third party agencies, Google My Business permits you to add and remove listing owners and managers so that multiple people can be given a variety of permissions to contribute to listings management.
How to create and claim/verify a Google My Business profile
Once the business you’re marketing has a Google account and has determined that it’s eligible for Google My Business inclusion, you can create a single local business profile by starting here, using Google’s walkthrough wizard to get listed.
Fill out as many fields as possible in creating your profile. This guide will help you understand how best to fill out many of the fields and utilize many of the features. Once you’ve provided as much information as you can, you’ll be given options to verify your listing so that you can control and edit it going forward.
Where your Google My Business information can display
Once your data has been accepted into the GMB system, it will begin showing up in a variety of Google’s local search displays, including the mobile and desktop versions of:
Google Business Profiles
Your comprehensive Google Business Profile (GBP) will most typically appear when you search for a business by its brand name, often with a city name included in your search language (e.g. “Amy’s Drive Thru Corte Madera”). In some cases, GBPs will show for non-branded searches as well (e.g. “vegan burger near me”). This can happen if there is low competition for a search term, or if Google believes (rightly or wrongly) that a search phrase has the intent of finding a specific brand instead of a variety of results.
Google Business Profiles are extremely lengthy, but a truncated view looks something like this, located to the right of the organic search engine results:
Google Local Packs
Local packs are one of the chief displays Google uses to rank and present the local business information in their index. Local packs are shown any time Google believes a search phrase has a local intent (e.g. “best vegan burger near me”, “plant-based burger in corte madera”, “onion rings downtown”). The searcher does not have to include geographic terms in their phrase for Google to presume the intent is local.
Most typically these days, a local pack is made up of three business listings, with the option to click on a map or a “view all” button to see further listings. On occasion, local packs may feature fewer than three listings, and the types of information Google presents in them varies.
Local pack results look something like this on desktop search, generally located above the organic search results:
Google Local Finders
When a searcher clicks through on the map or the “view all” link in a local pack, they will be taken to the display commonly known as the Local Finder. Here, many listings can be displayed, typically paginated in groups of ten, and the searcher can zoom in and out on the map to see their options change.
The URL of this type of result begins with google.com/search. Some industries, like hospitality, have unique displays, but most local business categories will have a local finder display that looks like this, with the ranked list of results to the left and the map to the right:
Google Maps
Google Maps is the default display on Android mobile phones, and desktop users can also choose to search via this interface instead of through Google’s general search. You’ll notice a “maps” link at the top of Google’s desktop display, like this:
Searches made via Google Maps yield results that look rather similar to the local finder results, though there are some differences. It’s a distinct possibility that Google could, at some point, consolidate the user experience and have local packs default to Google Maps instead of the local finder.
The URL of these results begins with google.com/maps instead of google.com/search, and on desktop, Google’s ranked Maps display looks like this:
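Since the most reliable visible difference between the two displays is the URL prefix noted above (google.com/search for the local finder vs. google.com/maps for Maps), you could tag result URLs in your own reporting with a quick heuristic. This is just a sketch; the function name and category labels are my own, not anything Google provides:

```python
from urllib.parse import urlparse

def classify_google_result(url: str) -> str:
    """Roughly classify a Google result URL by the path prefixes
    described above. A heuristic only, not an official API."""
    parsed = urlparse(url)
    if "google." not in parsed.netloc:
        return "not google"
    if parsed.path.startswith("/maps"):
        return "maps"
    if parsed.path.startswith("/search"):
        return "local finder / search"
    return "other"

print(classify_google_result("https://www.google.com/maps/search/vegan+burger"))
# maps
```

A tag like this can be handy when you export click URLs from analytics and want to separate Maps traffic from local finder traffic.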
The GMB dashboard is where you manage most of this
Once you’ve created and claimed your Google Business Profiles, you’ll have access to managing most (but not all) of the features they contain in your Google My Business dashboard, which looks like this:
The GMB dashboard has components for ongoing management of your basic contact info, reviews, posts, images, products and other features.
The GMB dashboard also hosts the analytical features called GMB Insights. It’s a very useful interface, though the titles and functions of some of its components can be opaque. Some of the data you’ll see in GMB Insights includes:
How many impressions came from searches for your business name or location (called Direct), from general searches that don’t specify your company by name but relate to what you offer (called Discovery), and from searches relating to brands your business carries (called Branded).
Customer actions, like website visits, phone calls, messaging, and requests for driving directions.
Search terms people used that resulted in an impression of your business.
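If you export impression data and want to group it by the three search types above, a simple tally does the job. The sample records here are hypothetical, purely to illustrate the Direct / Discovery / Branded breakdown:

```python
from collections import Counter

# Hypothetical impression records labeled with the GMB Insights
# search types described above (Direct, Discovery, Branded).
impressions = [
    {"query": "amy's drive thru corte madera", "type": "Direct"},
    {"query": "vegan burger near me", "type": "Discovery"},
    {"query": "vegan burger near me", "type": "Discovery"},
    {"query": "impossible burger", "type": "Branded"},
]

# Count impressions per search type, most common first.
counts = Counter(rec["type"] for rec in impressions)
for search_type, n in counts.most_common():
    print(search_type, n)
```

A high Discovery share usually suggests visibility for non-branded, category-style searches, while a high Direct share suggests most searchers already know the brand.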
There are multiple other GMB Insights features, and I highly recommend this tutorial by Joy Hawkins for a next-level understanding of why reporting from this interface can be conflicting and confusing. There’s really important data in GMB Insights, but interpreting it properly deserves a post of its own and a bit of patience with some imperfections.
When things go wrong with Google My Business
When engaging in GMB marketing, you’re bound to encounter problems and find that all kinds of questions arise from your day-to-day work. Google relies heavily on volunteer support in their Google My Business Help Community Forum, and you can post most issues there in hopes of a reply from the general public or from volunteer contributors titled Gold Product Experts.
In some cases, however, problems with your listings will necessitate speaking directly with Google or filling out forms. Download the free Local SEO Cheat Sheet for robust documentation of your various GMB support options.
How to use Google My Business as a digital marketing tool
Let’s gain a quick, no-frills understanding of how GMB can be used as one of your most important local marketing tools.
How to drive local business growth with Google’s local features
While each local business will need to take a nuanced approach to using Google My Business and Google Maps to market itself, most brands will maximize their growth potential on these platforms by following these seven basic steps:
1) Determine the business model (brick-and-mortar, service area business, home-based business, or hybrid). Need help? Try this guide.
3) Before you create GMB profiles, be certain you are working from a canonical source of data that has been vetted by all relevant parties at the business you’re marketing. This means that you’ve checked and double-checked that the name, address, phone number, hours of operation, business categories and other data you have about the company you are listing is 100% accurate.
4) Create and claim a profile for each of the locations you’re marketing. Depending on the business model, you may also be eligible for additional listings for practitioners at the business or multiple departments at a location. Some models, like car dealerships, are even allowed multiple listings for the car makes they sell. Consult the guidelines. Provide as much high quality, accurate, and complete information as possible in creating your profiles.
5) Once your listings are live, it’s time to begin managing them on an ongoing basis. Management tasks will include:
Analyzing chosen categories on an ongoing basis to be sure you’ve selected the best and most influential ones, and know of any new categories that appear over time for your industry.
Committing to a Google Posts schedule, publishing micro-blog-style content on an ongoing basis to increase awareness about products, services, events, and news surrounding the locations you’re marketing.
Populating Google Questions & Answers with company FAQs, providing simple replies to queries your staff receives all the time. Then, answer any incoming questions from the public on an ongoing basis.
7) In addition to managing your own local business profiles, you’ll need to learn to view them in the dynamic context of competitive local markets. You’ll have competitors for each search phrase for which you want to increase your visibility and your customers will see different pack, finder, and maps results based on their locations at the time of search. Don’t get stuck on the goal of being #1, but do learn to do basic local competitive audits so that you can identify patterns of how dominant competitors are winning.
In sum, providing Google with great and appropriate data at the outset, following up with ongoing management of all relevant GMB features, and making a commitment to ongoing local SEO education is the right recipe for creating a growth engine that’s a top asset for the local brands you market.
How to optimize Google My Business listings
This SEO forum FAQ is actually a bit tricky, because so many resources talk about GMB optimization without enough context. Let’s get a handle on this topic together.
Google uses calculations known as “algorithms” to determine the order in which they list businesses for public viewing. Local SEOs and local business owners are always working to better understand the secret ranking factors in Google’s local algorithm so that the locations they’re marketing can achieve maximum visibility in packs, finders, and maps.
Many local SEO experts feel that there are very few fields you can fill out in a Google Business Profile that actually have any impact on ranking. While most experts agree that it’s pretty evident the business name field, the primary chosen category, the linked website URL, and some aspects of reviews may be ranking factors, the Internet is full of confusing advice about “optimizing” service radii, business descriptions, and other features with no evidence that these elements influence rank.
My personal take is that this conversation about GMB optimization matters, but I prefer to think more holistically about the features working in concert to drive visibility, conversions, and growth, rather than speculating too much about how an individual feature may or may not impact rank.
Whether answering a GMB Q&A query delivers a direct lead, or writing a post moves a searcher further along the buyer journey, or choosing a different primary category boosts visibility for certain searches, or responding to a review to demonstrate empathy wins back an unhappy customer, you want it all. If it contributes to business growth, it matters.
Why Google My Business plays a major role in local search marketing strategy
Local businesses seeking to capture their share of local search queries and become visible in their geographic markets must know how to incorporate Google My Business marketing into their local SEO campaigns.
A definition of local search engine optimization (local SEO)
Local SEO is the practice of optimizing a business’s web presence for increased visibility in local and localized organic search engine results. It’s core to providing modern customer service, ensuring today’s businesses can be found and chosen on the internet. Small and local businesses make up the largest business sector in the United States, making local SEO the most prevalent form of SEO.
Local SEO and Google My Business marketing are not the same thing, but learning to utilize GMB as a tool and asset is key to driving local business growth, because of Google’s near monopoly.
A complete local SEO campaign will include management of the many components of the Google My Business profile, as well as managing listings on other location data and review platforms, social media publication, image and video production and distribution, and a strong focus on the organic and local optimization of the company website. Comprehensive local search marketing campaigns also encompass all the offline efforts a business makes to be found and chosen.
When trying to prioritize, it can help to think of the website as the #1 digital asset of most brands you’ll market, but that GMB marketing will be #2. And within the local search marketing framework, it’s the customer and their satisfaction that must be centered at every stage of on-and-offline promotion.
Focus on GMB but diversify beyond Google
Every aspect of marketing a brand contains pluses, minuses, and pitfalls. Google My Business is no exception. Let’s categorize this scenario into four parts for a realistic take on the terrain.
1) The positive
The most positive aspect of GMB is that it serves the core goal owners and marketers share: helping local businesses get found and chosen. At the end of the day, this is the goal of nearly all marketing tactics, and Google’s huge market share makes their platforms a peerless place to compete for customers’ attention and selection.
What Google has developed is a wonder of technology. With modest effort on your part, GMB lets you digitize a business so that it can be ever-present to communities, facilitate conversations with the public which generate loyalty and underpin everything from inventory development to quality control, and build the kind of online reputation that makes brands local household names in the offline world.
2) The negative
The most obvious negative aspect of GMB is that its very dominance has cut Google too much slack in letting issues like listing and review spam undermine results quality. Without a real competitor, Google hasn’t demonstrated the internal will to solve problems like these, which have real-world impacts on local brands and communities.
Meanwhile, a dry-eyed appraisal of Google’s local strategy observes that the company is increasingly monetizing their results. For now, GMB profiles are free, but expanding programs like Local Service Ads point the way to a more costly local SEO future for small businesses on tight budgets.
Finally, local brands and marketers (as well as Google’s own employees) are finding themselves increasingly confronted with ethical concerns surrounding Google that have made them the subject of company walkouts, public protests, major lawsuits, and government investigations. If you’re devoting your professional life to building diverse, inclusive local communities that cherish human rights, you may sometimes encounter a fundamental disconnect between your goals and Google’s.
3) The pitfall
Managing your Google-based assets takes time, but don’t let it take all of your time. Because local business owners are so busy and Google is so omnipresent, a pitfall has developed where it can appear that GMB is the only game in town.
The old adage about eggs in baskets comes into play every time Google has a frustrating bug, monetizes a formerly-free business category, or lets competitors and lead generators park their advertising in what you felt was your space. Sometimes, Google’s vision of local simply doesn’t match real-world realities, and something like a missing category or an undeveloped feature you need is standing in the way of fully communicating what your business offers.
The pitfall is that Google’s walls can be so high that the limits and limitations of their platforms can be mistaken as all there is to local search marketing.
4) The path to success
My article on how to feed, fight, and flip Google was one of the most-read here on the Moz blog in 2020. With nearly 14,000 unique page views, this message is one I am doubling down on in 2021:
Feed Google everything they need to view the businesses you’re marketing as the most relevant answers to people in close proximity to brand locations so that the companies you promote become the prominent local resources in Google’s index.
Fight spam in the communities you’re marketing to so that you’re weeding out fake and ineligible competitors and protecting neighbors from scams, and take principled stands on the issues that matter to you and your customers, building affinity with the public and a better future where you work and live.
Flip the online scenario where Google controls so much local business fate into a one-on-one environment in which you have full control over creating customer experiences exceptional enough to win repeat business and WOM recommendations, outside the GMB loop. Turn every customer Google sends you into a keeper who comes directly to you — not Google — for multiple transactions.
GMB is vital, but there’s so much to see beyond it! Get listed on multiple platforms and deeply engage in your reviews across them. Add generous value on neighborhood sites like Nextdoor, or on old-school fora that nobody but locals use. Forge B2B alliances and join the Buy Local movement to become a local business advocate and community sponsor. Help a Reporter Out. Evaluate whether image, video, or podcasting media could boost your brand to local fame. Profoundly grow your email base. Be part of the home delivery revival, fill the hungry longing for bygone quality and expertise, or invest in your website like never before and make the leap into digital sales. The options and opportunities are enticing and there’s a right fit for every local brand.
Key takeaway: don’t get stuck in Google’s world — build your own with your customers from a place of openness to possibilities.
A glance at the future of Google My Business
By now, you’ve likely decided that investing time and resources into your GMB assets is a basic necessity to marketing a local business. But will your efforts pay off for a long time to come? Is GMB built to last, and where is Google heading with their vision of local?
Barring unforeseen circumstances, yes, Google My Business is here to stay, though it could be rebranded, as Google has often rebranded their local features in the past. Here are eight developments I believe we could see over the next half decade:
As mentioned above, Google could default local packs to Maps instead of the local finder, making their network a bit tidier. This is a good time to learn more about Google Maps, because some aspects of it are quite different.
Pay-to-play visibility will become increasingly prevalent in packs, organic, and Maps, including lead generation features and trust badges.
If Apple Maps manages to make Google feel anxious, they may decide to invest in better spam filters for both listings and reviews to defend the quality of their index.
Location-based image filters and search features will grow, so photograph your inventory.
Google will make further strides into local commerce by surfacing, and possibly even beginning to take commissions from, sales of real time inventory. The brands you market will need to decide whether to sell via Google, via their own company websites, or both.
Google could release a feature depicting the mapped delivery radii of brick-and-mortar brands. Home delivery is here to stay, and if it’s relevant to brands you market, now is the time to dive in.
Google has a limited time window to see if they can drive adoption of Google Messaging as a major brand-to-consumer communications platform. The next five years will be telling, in this regard, and brands you market should discuss whether they wish to invite Google into their conversations with customers.
Google could add public commenting on Google Posts to increase their interactivity and push brands into greater use of this feature. Nextdoor has this functionality on their posts and it’s a bit of a surprise that Google doesn’t yet.
What I’m not seeing on the near horizon is a real commitment to better one-on-one support for the local business owners whose data makes up Google’s vast and profitable local index. While the company has substantially increased the amount of automated communications it sends GMB listing owners, Google’s vision of local as an open-source, DIY free-for-all appears to remain their stance on this evolving venture.
Your job, then, is to be vigilant about both the best and worst aspects of the fascinating Google My Business platform, taking as much control as you can of how customers experience your brand in Google’s territory. This is no easy task, but with ongoing education, supporting tools, and a primary focus on serving the customer, your investment in Google My Business marketing can yield exceptional rewards!
I’m completely fascinated by Google’s Discover Feed. Besides the fact that it serves highly-relevant content, it also seems beyond the reach of being gamed. In a way, it almost seems beyond the reach of pure SEO (which makes it downright tantalizing to me).
It all made me want to understand what makes the feed tick.
So I did what any sensible person would do. I spent the better part of two months running all sorts of queries in all sorts of different ways to see how it impacted my Discover Feed.
Here are my ramblings.
My approach to analyzing Google’s Discover Feed
Let me explain what I did and how I did it, to both give you a better understanding of this analysis and point out its gaping limitations.
For five days a week, and over the course of two months, I executed all sorts of user behavior aimed at influencing my Discover Feed.
I ran queries on specific topics on mobile, searched for other topics on desktop… clicked on results… didn’t click on results… went directly to sites and clicked… went directly to sites and didn’t click anything, etc.
In other words, I wanted to see how Google reacted to my various behaviors. I wanted to see if one behavior influenced what showed in my Discover Feed more than other behaviors.
To do this, I searched for things I would generally never search for, went to sites I would normally never visit, and limited my normal search behavior at times so as not to influence the feed.
For example, I hate celebrity news and gossip with a passion, so I went to people.com every day (outside of the weekends) and scrolled through the site without clicking a thing. I then recorded if related material (i.e. celebrity nonsense) ended up in my Discover Feed the next day.
I recorded all of my various “web behaviors” in the same way. I would execute a given behavior (e.g. search for things related to a specific topic on mobile, but without clicking any results) and record what happened in my Discover Feed as time went on.
Here’s a breakdown of the various behaviors I executed along with the topics associated with each behavior. (For the record, each behavior corresponds to a single topic or website so I could determine the impact of that behavior on my Discover Feed.)
Allow me to quickly elaborate on the behaviors above:
These are all topics/sites that I am in no way interested or involved in (particularly self-help car repair).
When I clicked a YouTube video, I watched the entire thing (I mean, I didn’t actually watch half the time, but Google doesn’t know that… or do they?)
When I visited a site, I did scroll through the content and stay on the page for a bit.
A search for a “segment of a topic already existing in Discover Feed” means that the overall topic was something that regularly appeared in my feed (in this case, NFL football and MLB baseball). However, the subtopics, in this case the Cowboys and Marlins, were topics I never specifically searched for and did not appear in my feed. Also, the data for these two categories only reflects one month of experimentation.
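The author doesn’t describe any tooling, but the bookkeeping implied by the setup above is easy to imagine: each behavior maps to exactly one topic or site, and each day you record whether related content showed up in the feed. Here is a minimal sketch of that record-keeping, with entirely hypothetical entries:

```python
import datetime

# Hypothetical reconstruction of the experiment's bookkeeping:
# each behavior corresponds to a single topic/site, so the effect
# of that behavior on the Discover Feed can be isolated.
behavior_to_topic = {
    "visit_desktop_click": "cooking",            # e.g. foodnetwork.com
    "visit_mobile_scroll": "japanese news",      # e.g. japantimes.co.jp
    "visit_desktop_no_click": "celebrity news",  # e.g. people.com
    "youtube_desktop": "sewing",
}

feed_log = []  # (date, behavior, appeared_in_feed)

def record_day(date, behavior, appeared):
    """Log whether the behavior's topic appeared in the feed that day."""
    feed_log.append((date, behavior, appeared))

record_day(datetime.date(2020, 10, 5), "visit_desktop_click", True)
record_day(datetime.date(2020, 10, 6), "visit_mobile_scroll", False)

hits = sum(1 for _, _, appeared in feed_log if appeared)
print(f"{hits} feed appearance(s) logged")
```

The one-behavior-per-topic constraint is the key design choice here: without it, you couldn’t tell which behavior caused a given topic to appear.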
All of this points to various limitations.
Is it possible that Google sees a topic like entertainment news as more “Discover-worthy” than sewing? It is.
Is it possible that going to a site like Fandango during a pandemic (when many theaters were closed) influenced Google’s decision to include or exclude things related to the topic matter dealt with by the site? It is.
What if I hadn’t skipped the weekends and had executed the above every single day? Would that have made a difference? I don’t know.
I’m not trying to portray any of the data I’ll present as being overly-conclusive. This is merely what I did, what I found, and what it all made me think.
Let’s have at it then.
How user behavior impacted my Discover Feed
Before I dive into the “data”, I want to point out that the heart of my observations isn’t found in the data itself, but in some of the things I noticed in my Discover Feed along the way.
More than that, this data is far from conclusive or concrete, and in most ways speaks to my unique user-profile. That said, let’s have a look at the data, because there just may be some general takeaways.
As I mentioned, I wanted to see the impact of the various online behaviors on my Discover Feed. That is, how frequently did Google insert content related to the topics associated with each specific behavior into my feed?
For all the times I went to japantimes.co.jp how often was there content in my feed related to Japanese news? For all the times I searched for and watched YouTube videos on lawn care, how often did Google show me such content in Discover?
Here are some of the most intriguing highlights reflected in the graph above:
Watching YouTube videos on mobile had no impact on my feed whatsoever (though it certainly did on what YouTube ads I was served).
Watching YouTube videos on desktop had little impact (in fact, any insertion of “sewing” into my feed was only as a topic card which contained no URL).
Searching on Google alone, without clicking a result, was ineffective.
Visiting a desktop site and clicking around was very effective at filling my feed with “cooking” content; however, the same was not true on mobile.
Watching YouTube videos (desktop) about sewing was only successful in getting Google to include the topic in its “Discover more” cards.
I want to emphasize that when I say things like “YouTube mobile watches had no impact”, I don’t mean that as a general statement. Rather, such a statement only reflects the way I engaged with YouTube (one video watch per day). Clearly, if you watch a large number of YouTube videos around one topic in a short time, Discover will pick this up.
I did, in fact, test this.
I gave my kids free rein at various moments to take my phone and watch a large number of videos related to specific topics (surprisingly, they were happy to oblige and to watch vast amounts of YouTube).
I have twin 9-year-old boys. One watched an obscene number of YouTube videos and executed an insane number of searches related to airplanes and flight simulators. I am still awaiting the day when my feed stops showing cards related to this topic. Here’s my search history to prove it:
The other little fellow watched videos about the weather and animal behavior that results from it for a few hours straight (hey, it was during the height of quarantine). That same day, this is what I saw in my feed:
You don’t need me to tell you that if Google thinks you’re going gaga over a specific topic, it will throw said topic into your Discover Feed posthaste.
My goal in all of this was not to find the quickest way to get Google to update the topics it shows in your Discover Feed. The point of my methodology was to see whether there was one type of behavior that Google seemed to take more seriously than another vis-a-vis inserting new topics into my Discover Feed.
To that, Google did react differently to my various behaviors.
That doesn’t mean I can draw many firm conclusions from the above data. That said, Google clearly saw my going to foodnetwork.com and clicking on an article each day as a strong signal, an endorsement of wanting “cooking” content in my Discover Feed.
At the same time, Google completely ignored that behavior on mobile. Each day I went to japantimes.co.jp and scrolled through an article. Yet, not once did Google include anything even remotely related to Japanese news in my feed.
I suspect that the topic here was too far removed from overall search behavior. So while it was reasonable for Google to assume I wanted cooking-related material in my feed, the same did not hold true for topics related to Japan.
I think this is the same reason why the topic associated with my visiting a site on desktop without clicking anything made it into my feed. The topic here was celebrity news, and I imagine that Google has profiled this topic as being one that is highly-relevant to Discover. So much so that Google tested including it in my feed at various points.
Despite never clicking on an article when visiting people.com each day, Google still flirted with showing celebrity news content in my Discover Feed.
That said, there is some reason to believe that desktop behavior has more of an impact than mobile user behavior.
The case for desktop Discover Feed dominance
About a month into my little experiment I wondered what would happen if I started searching for and clicking on things that were segments of topics that already appeared in my feed.
Deciding on these segments was quite easy. My feed is constantly filled with material on baseball and American football. Thus, I decided to search for and click on two teams I have no interest in. This way, while the topic overall was already in my feed, I would be able to see the impact of my behavior.
Specifically, on desktop I searched for things related to the Dallas Cowboys, clicking on a search result each time. Similarly, I did the same for the Miami Marlins baseball team on mobile.
Again, in both cases, content specific to these teams had yet to appear in my feed.
Here are the results of this activity:
Over a 30-day period, I found 10 instances of content related to the Dallas Cowboys in my feed and 6 instances of content about the Miami Marlins.
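Expressed as daily appearance rates, those 30-day counts make the desktop/mobile gap easy to see. This is just the arithmetic on the two figures reported above:

```python
# The 30-day counts reported above, as daily appearance rates.
days = 30
cowboys_desktop = 10  # Dallas Cowboys: searched + clicked on desktop
marlins_mobile = 6    # Miami Marlins: searched + clicked on mobile

for team, count in [("Cowboys (desktop)", cowboys_desktop),
                    ("Marlins (mobile)", marlins_mobile)]:
    print(f"{team}: {count}/{days} days = {count / days:.0%}")
```

Roughly a third of days for the desktop behavior versus a fifth for the mobile one; a real gap, though from a sample far too small to generalize.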
Again, just as in the first set of data I presented, a disparity between mobile and desktop exists.
Is this a general rule? Is this based on my particular profile? I don’t know. It’s just an interesting point that should be investigated further.
I will say that I doubt the content itself played a role. If anything, there should have been more results on mobile about the Marlins, as I was very much caught up in the World Series that was taking place at the time of my activity.
What does this data actually mean?
There are so many factors at play, that using any of the data above is a bit “hard.” Yes, I think there are some trends or indicators within it. However, that’s not really what I want you to take away from all of this. (Also, is it such a crime to consume data solely because it’s interesting to see some of what’s going on?)
What do I want you to take away, then?
As part of my data analysis (if you’ll even call it that) I looked at how long it took for a behavior to result in Discover Feed inclusion. Surprisingly, the numbers were pretty consistent:
Discounting the 31 behavior instances it took for my "Search Desktop No Click" activity (i.e., searching for all things related to "fishing" but clicking on nothing) to impact my feed, Google picked up on what I was doing fairly quickly.
Generally speaking, it took less than 10 behaviors for Google to think it should update the topics shown in my feed.
That's really the point. Despite the things I normally search for and engage with both regularly and heavily (SEO, for example), Google took this "lighter" yet consistent behavior as a signal to update my feed.
Google was very aware of what I was doing and acted on it pretty quickly all things considered. In the case of “food/cooking” content, as shown earlier, Google took my behavior very seriously and consistently showed such content in my feed.
Forget which behavior on which device produced more cards in my feed. The fact that it varied at all is telling. It shows Google is looking at the type of engagement and where it happens in the context of your overall profile.
Personally, I think if you (yes, you, the person reading this) did this experiment, you would get different results. Maybe some of the trends might align, but I would imagine that would be it.
And now for the really interesting part of all this.
Diving into what was and what wasn’t in my Discover Feed
As I’ve mentioned, the data is interesting in some of the possible trends it alludes to and in that it shows how closely Google is watching your behavior. However, the most interesting facets of this little project of mine came from seeing what Google was and was not showing day-in and day-out.
Is Google profiling users utilizing the same account?
The first month of this study coincided with a lockdown due to COVID-19. That meant my kids were home, all day, for a month. It also meant they watched a lot of YouTube. From Wild Kratts to Woody Woodpecker, my kids consumed a heap of cartoons and they did so using my Google account (so I could see what they were watching).
Wouldn’t you know, a funny thing happened. There was no “cartoon” content in my Discover Feed. I checked my feed religiously that month and not once did I notice a card about a cartoon.
Isn’t that odd?
Not if Google is profiling my account according to the devices being used or even according to the content being consumed. All signs point to Google being well aware that the content my kids were watching was not being consumed by the one using Discover (me).
This isn’t a stretch at all. The same happens in my YouTube feed all the time. While my desktop feed is filled to the brim with Fireman Sam, the YouTube app on my phone is a mixture of news and sports (I don’t “SEO” on YouTube) as my kids generally don’t watch their “programs” on my phone.
The URLs I visited were absent from Discover
There was one other thing missing from my Discover Feed and this one has enormous implications.
Virtually none of the URLs I visited during my two-month experiment popped up in my Discover Feed!
I visited the Food Network's website some 40 times, each time clicking and reading (pretending to read, to be fair) an article or recipe. By the time I was nearing the end of my experiment, Discover was showing me some sort of food/cooking-related content every day.
Through all of that, not once did Google show me a URL from the Food Network! Do you like apples? Well, how do you like them apples? (Cooked slowly with a hint of cinnamon.)
This was the general trend for each type of behavior that produced new topics in my feed. I visited a few websites about car repair, and Google threw some cards about the topic into my feed… none of which were from the sites I visited.
The only sites I visited that also appeared in my Discover Feed were ESPN, for some of the sports queries I ran, and people.com, which I visited every day. However, I think that was entirely coincidental, as both sites are top sources of content in their spaces.
Yes, some sites I visit regularly do appear in my feed in general. For example, there were some local news sites that I visited multiple times a day for the better part of a month so as to track COVID-19 in my area. I freely admit it was a compulsion. One that Google picked up on.
In other words, it took a heck of a lot for Google to think I wanted that specific site or sites in my feed. Moreover, it would seem that Google doesn’t want to simply show content from the URLs you visit unless the signal otherwise is immense.
This leads me to my next question…
Is Discover really an SEO issue?
What can you do to optimize for Google Discover? It's almost an absurd question. I visited the same site every day and Google still didn't include its URL in my feed. (Again, I am aware that certain behaviors will trigger a specific URL; my point is that Google is not as apt to do so as you might think.) It all points to a certain lack of control. It all points to Google specifically not wanting to pigeonhole the content it shows in Discover.
In other words, you can’t create content specifically for Discover. There’s no such concept. There’s no such control. There is no set of standardized “ranking signals” that you can try to optimize for.
Optimizing your images to make sure they’re high-quality or ensuring they’re at least 1,200 pixels wide and so forth isn’t really “optimizing” for Discover. It’s merely making yourself eligible to get into the ballpark. There is no standardized path to actually get on the field.
The entire idea of Discover is to offer content that’s specifically relevant to one user and all of their various idiosyncrasies. The notion of “optimizing” for something like that almost doesn’t compute.
Like with optimizing your images for Discover, all you can really do is position yourself.
And how does one position themselves for inclusion into the Discover Feed?
One of the sites that kept popping up in my feed was dallascowboys.com. This makes sense as I was searching for things related to the Dallas Cowboys and clicking on all sorts of results as a consequence. However, in my “travels” I specifically did not visit dallascowboys.com. Yet, once Google saw I was interested in the Cowboys, it was one of the first sites I was served with.
You don’t need to be a rocket scientist to see why. What other site is more relevant and more authoritative than the official site of the franchise?
If you want your site to be included in Discover, you need to be incredibly relevant and authoritative on whatever it is your site deals with.
That means investing time and resources into creating unique and substantial content. It means crafting an entire strategy around creating topical identity. After all, the idea is to get Google to understand that your site deals with a given topic, deals with it in-depth, and deals with it often (i.e., the topic is closely related to who you are as a site).
That sounds a heck of a lot more like “content marketing” than pure SEO, at least it does to me.
A cross-discipline marketing mind meld
Discover, to me, is the poster child for the merging of pure content creation and SEO. It speaks to the idea of needing a more abstract understanding of what a sound content strategy is, in order to be effective in the “Google-verse.”
It’s perhaps a different sort of motion than what you might typically find in the world of pure SEO. As opposed to diving into the minute details (be it a specific technical problem or a specific aspect of content optimization), Discover urges us to take a more holistic approach, to take a step back.
The way Discover is constructed advocates for a broader approach based on a meta-analysis of how a site is perceived by Google and what can be done to create a stronger profile. It’s almost the perfect blend of content, marketing, and an understanding of how Google works (SEO).
While Google Posts aren’t a ranking factor, they can still be an incredibly effective resource for increasing local business conversions — when used correctly. This week’s Whiteboard Friday host, Greg Gifford, shows you how to put your best post forward.
Click on the whiteboard image above to open a high resolution version in a new tab!
Howdy, Moz fans. Welcome to another edition of Whiteboard Friday. I'm Greg Gifford, the Vice President of Search at SearchLab, a boutique digital marketing agency specializing in local SEO and paid search. I'm here today to talk about — you guessed it — Google Posts, the feature on Google My Business that lets you post interesting and attractive things to attract potential customers.
The importance of Google My Business
Mike Blumenthal said it first. Your Google My Business listing is your new homepage. Then we all kind of stole it, and everybody says it now. But it’s totally true. It’s the first impression that you make with potential customers. If someone wants your phone number, they don’t have to go to your site to get it anymore. Or if they need your address to get directions or if they want to check out photos of your business or they want to see hours or reviews, they can do it all right there on the search engine results page.
If you’re a local business, one that serves customers face-to-face at a physical storefront location or that serves customers at their location, like a plumber or an electrician, then you’re eligible to have a Google My Business listing, and that listing is a major element of your local SEO strategy. You need to stand out from competitors and show potential customers why they should check you out. Google Posts are one of the best ways to do just that thing.
How to use Google Posts effectively
For those of you who don't know about Google Posts, they were released back in 2016, and they used to show up at the top of your Google My Business panel, and most businesses went crazy over them. In October of 2018, Google moved them down to the very bottom of the GMB panel on desktop and out of the overview panel on mobile results, and most people lost interest because they assumed there would be a huge loss of visibility.
But honestly, it doesn’t matter. They’re still incredibly effective when they’re used correctly.
Posts are basically free advertising on Google. You heard that right: free advertising. They show up in Google search results, and they're especially effective on mobile, where they're mixed in with other organic results.
But even on desktop, they help your business attract potential customers and stand out from other local competitors. More importantly, they can drive pre-site conversions. You’ve heard about zero-click search. Now people can convert without getting to your site. They appear as a thumbnail, an image with a little bit of text underneath. Then when the user clicks on the thumbnail, the whole post pops up in a pop-up window that basically fills the window on either mobile or desktop.
Now they have no influence on ranking. They’re a conversion factor, not a ranking factor. Think of it this way though. If it takes you 10 minutes to create a post and you do only one a week, that’s just 40 minutes a month. If you get a conversion, isn’t it worth doing? If you do them correctly, you can get a lot more than just one conversion.
In the past, I would have told you that posts stay live in your profile for seven days, unless you use one of the post templates that includes a date range, in which case they stay live for the entire date range. But it looks like Google has changed the way that posts work, and now Google displays your 10 most recent posts in a carousel with a little arrow to scroll through. Then when you get to the end of those 10 posts, it has a link to view all of your older posts.
Now you shouldn’t pay attention to most of what you see online about Posts because there’s a ridiculous amount of misinformation or simply outdated information out there.
Avoid words on the “no-no” list
Quick tip: Be careful about the text that you use. Anything with sexual connotation will get your post denied. This is really frustrating for some industries. If you put up a post about weather stripping, you get vetoed because of the word “stripping.” Or if you’re a plumber and you post about “toilet repairs” or “unclogging a toilet”, you get denied for using the word “toilet.”
So be careful if you have anything that might be on that no-no, naughty list.
Use an enticing thumbnail
The full post contains an image and up to 1,500 characters of text, and that's where most people focus their attention. But the post thumbnail is the key to success. No one is going to see the full post if the thumbnail isn't enticing enough to click on.
Think of it like you’re creating a paid search campaign. You need really compelling copy if you want more clicks on your ad or a really awesome image to attract attention if it’s a banner image. The same principle applies to posts.
Make them promotional
It’s also important to be sure that your posts are promotional. People are seeing these posts in the search results before they go to your site. So in most cases they have no idea who you are yet.
The typical social fluff that you share on other social platforms doesn’t work. Don’t share links to blog posts or a simple “Hey, we sell this” message because those don’t work. Remember, your users are shopping around and trying to figure out where they want to buy, so you want to grab their attention with something promotional.
Pick the right template
Most of the stuff out there will tell you that the post thumbnail displays 100 characters of text or about 16 words broken into 4 distinct lines. But in reality, it’s different depending on which post template you use and whether or not you include a call to action link, which then replaces that last line of text.
But, hey, we’re all marketers. So why wouldn’t we include a CTA link, right?
There are three main post types. In the vast majority of cases, you want to use the What’s New post template. That’s the one that allows for the most text in the thumbnail view, so it’s easier to write something compelling. Now with the What’s New post, once you include that call to action, it replaces that last line so you end up with three full lines of available text space.
Both the Event and Offer post templates include a title and then a date range. Some people dig the date range because the post stays visible for that whole date range. But now that posts stay live and visible forever, there’s no advantage there. Both of those post types have that separate title line, then a separate date range line, and then the call to action link is going to be on the fourth line, which leaves you only a single line of text or just a few words to write something compelling.
Sure, the Offer post has a cool little price tag emoji there next to the title and some limited coupon functionality, but that's not a reason to use it. You should have full coupon functionality on your site. So it's better to write something compelling with a "What's New" post template and then have the user click through on the call to action link to get to your site to get more information and convert there.
There’s also a new COVID update post type, but you don’t want to use it. It shows up a lot higher on your Google My Business profile, actually just below your top line information, but it’s text only. Only text, no image. If you’ve got an active COVID post, Google hides all of your other active posts. So if you want to share a COVID info post or updates about COVID, it’s better to use the What’s New post template instead.
Pay attention to image cropping
The image is the frustrating part of things. Cropping is super wonky and really inconsistent. In fact, you could post the same image multiple times and it will crop slightly differently each time. The fact that the crop is slightly higher than vertical center and also a different size between mobile and desktop makes it really frustrating.
The important areas of your image can get cropped out, so half of your product ends up being gone, or your text gets cropped out, or things get really hard to read. Now there's a rudimentary cropping tool built into the image upload function with posts, but it's not locked to an aspect ratio. So you're going to end up with black bars either on the top or on the side if you don't crop to the correct aspect ratio, which is, by the way, 1,200 pixels wide by 900 pixels high.
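Those dimensions imply a 4:3 aspect ratio. If you'd rather pre-crop images yourself than wrestle with the built-in tool, a small helper can work out the largest centered 4:3 crop box for any source image (an illustrative sketch, not anything Google provides — the function name and return format are my own):

```python
def crop_box_4x3(width, height):
    """Return (left, top, right, bottom) for the largest centered 4:3 crop.

    Cropping to 4:3 before upload avoids the black bars Google adds
    when the aspect ratio doesn't match the 1200x900 target.
    """
    target = 4 / 3
    if width / height > target:
        # Image is too wide: trim the sides equally.
        new_w = int(height * target)
        left = (width - new_w) // 2
        return (left, 0, left + new_w, height)
    # Image is too tall: trim the top and bottom equally.
    new_h = int(width / target)
    top = (height - new_h) // 2
    return (0, top, width, top + new_h)

print(crop_box_4x3(1920, 1080))  # 16:9 source -> (240, 0, 1680, 1080)
```

You can feed the resulting box straight into any image editor or library that accepts a crop rectangle.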
You need to have a handle on what the safe area is within the image. So to make things easier, we created this Google Posts Cropping Guide. It’s a Photoshop document with built-in guides to show you what the safe area is. You can download it at bit.ly/posts-image-guide. Make sure you put that in lowercase because it’s case sensitive.
But it looks like this. Anything within that white grid is safe, and that's what's going to show up in the post thumbnail. But when the user opens the full post, the rest of the image shows up. So you can get really creative: for example, the thumbnail shows the main image, but when the full post pops up, additional text appears at the bottom.
Include UTM tracking
Now, for the call to action link, you need to be sure that you include UTM tracking, because Google Analytics doesn’t always attribute that traffic correctly, especially on mobile.
If you include UTM tagging, you can ensure that the clicks are attributed to Google organic. You can then use the campaign variable to differentiate between the posts you've published, see which posts generated more click-throughs or conversions, and adjust your strategy moving forward to use the more effective post types.
So for those of you that aren’t super familiar with UTM tagging, it’s basically adding a query string like this to the end of the URL that you’re tagging so it forces Google Analytics to attribute the session a certain way that you’re specifying.
So here's the structure that I recommend using when you do Google Posts. It's your domain on the left. Then utm_source is GMB.Post, so post traffic is separated out. Then utm_medium is organic, and utm_campaign is some sort of post identifier. Some people like to use Google as the source.
But at a high level, that traffic then gets lumped together with everything else from Google in your source/medium report, which can be confusing for clients who don't realize they can use secondary dimensions to break the traffic apart. With a distinct source, it's easier to see your post traffic separately in the default source/medium report.
You want to leave organic as your medium so that it’s lumped and grouped correctly on the default channel report with all organic traffic. Then you enter some sort of identifier, some sort of text string or date that can let you know which post you’re talking about with that campaign variable. So make sure it’s something unique so that you know which post you’re talking about, whether it’s car post, oil post, or a date range or the title of the post so you know when you’re looking in Google Analytics.
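Building those tagged URLs by hand invites typos. If you want to script it, the convention above (source GMB.Post, medium organic, a unique campaign per post) can be assembled with Python's standard library — a sketch, with the domain and campaign value as placeholders:

```python
from urllib.parse import urlencode

def gmb_post_url(page_url, campaign):
    """Append UTM parameters so GMB post clicks are attributed to
    Google organic, with a per-post campaign identifier."""
    params = {
        "utm_source": "GMB.Post",   # keeps post traffic separate from plain "google"
        "utm_medium": "organic",    # groups with organic in the default channel report
        "utm_campaign": campaign,   # unique per post, e.g. a date or post title
    }
    sep = "&" if "?" in page_url else "?"
    return page_url + sep + urlencode(params)

print(gmb_post_url("https://www.example.com/offers", "oil-change-post"))
```

Paste the result into the post's call to action link, and the click shows up under its own campaign in Google Analytics.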
It's also important to mention that Google My Business Insights will show you the number of views and clicks, but it's a bit convoluted because multiple impressions and/or multiple clicks from the same users are counted independently. That's why adding UTM tagging is so important for tracking your performance accurately.
Final note: you can also upload videos, so a video shows in the thumbnail and in the post.
When users see that thumbnail with the little play button and click it, the video plays in the pop-up post. The limits are 30 seconds or 75 MB, which, if you have commercials, is basically a perfect fit. So even though they've been around for a few years, most businesses still ignore Posts. Now you know how to rock Posts, so you'll stand out from competitors and generate more click-throughs.
Hopefully you enjoyed the video. If you’ve got any additional tips to share, please throw them in the comments down below. Thanks for watching, and I’ll see you again next time.
Way back in 2015, I published an article giving away a free, simple, forecasting tool, and talking through use cases for forecasting in SEO. It was a quick, effective way to see if a change to your site traffic is some kind of seasonality you can ignore, something to celebrate, or a worrying sign of traffic loss.
In short: you could enter in a series of data, and it would plot it out on a graph like the image above.
Five years later, I still get people — from former colleagues to complete strangers — asking me about this tool, and more often than not, I’m asked for a version that works directly in spreadsheets.
I find this easy to sympathize with: a spreadsheet is more flexible, easier to debug, easier to expand upon, easier to maintain, and a format that people are very familiar with.
The tradeoff when optimizing for those things is, although I’ve improved on that tool from a few years ago, I’ve still had to keep things manageable in the famously fickle programming environment that is Excel/Google Sheets. That means the template shared in this post uses a simpler, slightly less performant model than some tools with external code execution (e.g. Forecast Forge).
In this post, I'm going to give away a free template, show you how it works and how to use it, and then show you how to build your own (better?) version. (If you need a refresher on when to use forecasting in general, and concepts like confidence intervals, refer to the original article linked above.)
Types of SEO forecast
There is one thing I want to expand on before we get into the spreadsheet stuff: the different types of SEO forecast.
Broadly, I think you can put SEO forecasts into three groups:
“I’m feeling optimistic — add 20% to this year” or similar flat changes to existing figures. More complex versions might only add 20% to certain groups of pages or keywords. I think a lot of agencies use this kind of forecast in pitches, and it comes down to drawing on experience.
Keyword/CTR models, when you estimate a ranking change (or sweeping set of ranking changes), then extrapolate the resulting change in traffic from search volume and CTR data (you can see a similar methodology here). Again, more complex versions might have some basis for the ranking change (e.g. “What if we swapped places with competitor A in every keyword of group X where they currently outrank us?”).
Statistical forecast based on historical data, when you extrapolate from previous trends and seasonality to see what would happen if everything remained constant (same level of marketing activity by you and competitors, etc.).
Type two has its merits, but if you compare the likes of Ahrefs/SEMRush/Sistrix data to your own analytics, you’ll see how hard this is to generalize. As an aside, I don’t think type one is as ridiculous as it looks, but it’s not something I’ll be exploring any further in this post. In any case, the template in this post fits into type three.
What makes this an SEO forecast?
Why, nothing at all. One thing you’ll notice about my description of type three above is that it doesn’t mention anything SEO-specific. It could equally apply to direct traffic, for example. That said, there are a couple of reasons I’m suggesting this specifically as an SEO forecast:
We’re on the Moz Blog and I’m an SEO consultant.
There are better methodologies available for a lot of other channels.
I mentioned that type two above is very challenging, and this is because of the highly non-deterministic nature of SEO and the generally poor quality of detailed data in Search Console and other SEO-specific platforms. In addition, to get an accurate idea of seasonality, you’d need to have been warehousing your Search Console data for at least a couple of years.
For many other channels, high quality, detailed historic data does exist, and relationships are far more predictable, allowing more granular forecasts. For example, for paid search, the Forecast Forge tool I mentioned above builds in factors like keyword-level conversion data and cost-per-click based on your historical data, in a way that would be wildly impractical for SEO.
That said, we can still combine multiple types of forecast in the template below. For example, rather than forecasting the traffic of your site as a whole, you might forecast subfolders separately, or brand/non-brand separately, and you might then apply percentage growth to certain areas or build in anticipated ranking changes. But, we’re getting ahead of ourselves…
The first thing you’ll need to do is make a copy (under the “File” menu in the top left, but automatic with the link I’ve included). This means you can enter your own data and play around to your heart’s content, and you can always come back and get a fresh copy later if you need one.
Then, on the first tab, you’ll notice some cells have a green or blue highlight:
You should only be changing values in the colored cells.
The blue cells in column E are basically to make sure everything ends up correctly labelled in the output. So, for example, if you’re pasting session data, or click data, or revenue data, you can set that label. Similarly, if you enter a start month of 2018-01 and 36 months of historic data, the forecast output will begin in January 2021.
On that note, it needs to be monthly data — that’s one of the tradeoffs for simplicity I mentioned earlier. You can paste up to a decade of historic monthly data into column B, starting at cell B2, but there are a couple of things you need to be careful of:
You need at least 24 months of data for the model to have a good idea of seasonality. (If there’s only one January in your historic data, and it was a traffic spike, how am I supposed to know if it was a one-off thing, or an annual thing?)
You need complete months. So if it’s March 25, 2021 when you’re reading this, the last month of data you should include is February 2021.
Make sure you also delete any leftovers of my example data in column B.
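The constraints above (monthly granularity, complete months only, at least 24 of them) can be sketched as a quick validation step. This is an illustrative helper, not part of the spreadsheet; it assumes the history is a plain list of monthly values starting at a given year and month:

```python
from datetime import date

def validate_history(values, start_year, start_month, today):
    """Check a monthly traffic series meets the template's requirements."""
    if len(values) < 24:
        return "need at least 24 months to estimate seasonality"
    # Work out which calendar month the series ends in.
    end_year = start_year + (start_month - 1 + len(values) - 1) // 12
    end_month = (start_month - 1 + len(values) - 1) % 12 + 1
    # The series must stop at the last *complete* month.
    if (end_year, end_month) >= (today.year, today.month):
        return "drop the current (incomplete) month from the data"
    return "ok"

print(validate_history([100] * 36, 2018, 1, date(2021, 3, 25)))  # "ok"
```

Running this before pasting data into column B catches the two mistakes that most often produce nonsense forecasts.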
Once you’ve done that, you can head over to the “Outputs” tab, where you’ll see something like this:
Column C is probably the one you’re interested in. Keep in mind that it’s full of formulas here, but you can copy and paste as values into another sheet, or just go to File > Download > Comma-separated values to get the raw data.
You'll notice I'm only showing 15 months of forecast in that graph by default, and I'd recommend you do the same. As I mentioned above, the implicit assumption of a forecast is that historical context carries over, unless you explicitly include changed scenarios like COVID lockdowns in your model (more on that in a moment!). The chance of this assumption holding two or three years into the future is low, so even though I've provided forecast values further into the future, you should keep that in mind.
The upper and lower bounds shown are 95% confidence intervals — again, you can recap on what that means in my previous post if you so wish.
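The template computes its bounds internally, but as a rough sketch of where a 95% interval comes from: under a normal-error assumption, it's the point forecast plus or minus about 1.96 standard deviations of the model's residuals (this is a simplification of what the spreadsheet does, for intuition only):

```python
import statistics

def interval_95(point_forecast, residuals):
    """Normal-approximation 95% interval from in-sample residuals."""
    sd = statistics.pstdev(residuals)  # how far off the model typically is
    margin = 1.96 * sd                 # ~95% of a normal lies within 1.96 sd
    return (point_forecast - margin, point_forecast + margin)

lo, hi = interval_95(1000, [-50, 30, -20, 40, -10, 10])
print(round(lo), round(hi))  # 940 1060
```

The noisier the historical fit, the wider the band: big residuals mean the model should be less sure about the future.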
Advanced use cases
You may by now have noticed the “Advanced” tab:
Although I said I wanted to keep this simple, I felt that given everything that happened in 2020, many people would need to incorporate major external factors into their model.
In the example above, I’ve filled in column B with a variable for whether or not the UK was under COVID lockdown. I’ve used “0.5” to represent that we entered lockdown halfway through March.
You can probably make a better go of this for the relevant factors for your business, but there are a few important things to keep in mind with this tab:
It’s fine to leave it completely untouched if you don’t want to add these extra variables.
Go from left to right — it’s fine to leave column C blank if you’re using column B, but it’s not fine to leave B blank if you’re using C.
If you’re using a “dummy” variable (e.g. “1” for something being active), you need to make sure you fill in the 0s in other cells for at least the period of your historic data.
You can enter future values — for example, if you predict a COVID lockdown in March 2021 (you bastard!), you can enter something in that cell so it’s incorporated into the forecast.
If you don’t enter future values, the model will predict based on this number being zero in the future. So if you’ve entered “branded PPC active” as a dummy variable for historic data, and then left it blank for future periods, the model will assume you have branded PPC turned off in the future.
Adding too much data here for too few historic periods will result in something called “overfit” — I don’t want to get into detail on this, which is why this tab is called “Advanced”, but try not to get carried away.
Here’s some example use cases of this tab for you to consider:
Enter whether branded PPC was active (0 or 1)
Enter whether you’re running TV ads or not
Enter COVID lockdowns
Enter algorithm updates that were significant to your business (one column per update)
Why are my estimates different from your old tool's? Is one of them wrong?
There are two major differences in method between this template and my old tool:
The old tool captured non-linear trends by using time period squared as a predictive variable (e.g. month 1 = 1, month 2 = 4, month 3 = 9, etc.) and trying to fit the traffic curve to that curve. This is called a quadratic regression. The new tool captures non-linear trends by fitting each time period as a multiple of the previous time period (e.g. month 2 = X * month 1, where X can be any value). This is called an AR(1) model.
If you’re seeing a significant difference in the forecast values between the two, it almost certainly comes down to the second reason, and although it adds a little complexity, in the vast majority of cases the new technique is more realistic and flexible.
It’s also far less likely to predict zero or negative traffic in the case of a severe downwards trend, which is nice.
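To make the AR(1) idea concrete, here's a deliberately stripped-down sketch: it estimates the month-over-month multiplier X by least squares through the origin and then compounds it forward. The real template also fits an intercept, a time trend, seasonality, and the advanced variables, so treat this as intuition rather than the template's actual math:

```python
def fit_ar1(series):
    """Estimate X in "this month ~ X * last month" by least squares
    through the origin (a stripped-down stand-in for the full model)."""
    num = sum(prev * curr for prev, curr in zip(series, series[1:]))
    den = sum(prev * prev for prev in series[:-1])
    return num / den

def forecast_ar1(series, periods):
    """Extrapolate by applying the fitted multiplier repeatedly."""
    x = fit_ar1(series)
    out, last = [], series[-1]
    for _ in range(periods):
        last *= x
        out.append(round(last, 1))
    return out

# A series growing ~10% per month: the fitted multiplier lands near 1.1,
# so the forecast keeps compounding instead of following a fixed curve.
print(forecast_ar1([100, 110, 121, 133.1], 3))
```

Note that because each prediction feeds into the next, a multiplier below 1 decays toward zero rather than crossing into negative traffic, which is exactly the behavior described above.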
How does it work?
There’s a hidden tab in the template where you can take a peek, but the short version is the “LINEST()” spreadsheet formula.
The inputs I’m using are:
Whatever you put as column B in the inputs tab (like traffic)
Linear passing of time
Previous period’s traffic
Dummy variables for 11 months (12th month is represented by the other 11 variables all being 0)
Up to three “advanced” variables
The formula then gives a series of “coefficients” as outputs, which can be multiplied with values and added together to form a prediction like:
“Time period 10” traffic = Intercept + (Time Coefficient * 10) + (Previous Period Coefficient * Period 9 traffic)
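In spreadsheet terms, LINEST hands back the coefficients and you do the multiply-and-add yourself. The same step in code, with made-up coefficient values purely to show the arithmetic (the names and numbers here are illustrative, not the template's actual coefficients):

```python
def predict_period(coefs, period, prev_traffic, month_dummies):
    """One-step prediction from LINEST-style coefficients:
    intercept + time trend + AR(1) term + seasonal dummy terms."""
    value = coefs["intercept"]
    value += coefs["time"] * period          # linear passing of time
    value += coefs["prev"] * prev_traffic    # previous period's traffic
    for name, flag in month_dummies.items():
        value += coefs[name] * flag          # 0/1 seasonal dummies
    return value

# Hypothetical coefficients for a site with a January spike.
coefs = {"intercept": 200.0, "time": 5.0, "prev": 0.8, "jan": 150.0}
print(predict_period(coefs, 10, 1000, {"jan": 1}))  # 1200.0
```

To forecast several months out, you'd feed each prediction back in as the next period's `prev_traffic`, exactly as the hidden tab does.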
You can see in that hidden sheet I’ve labelled and color-coded a lot of the outputs from the Linest formula, which may help you to get started if you want to play around with it yourself.
If you do want to play around with this yourself, here are some areas I personally have in mind for further expansion that you might find interesting:
Daily data instead of monthly, with weekly seasonality (e.g. dip every Sunday)
Built-in growth targets (e.g. enter 20% growth by end of 2021)
Richard Fergie, whose Forecast Forge tool I mentioned a couple of times above, also provided some great suggestions for improving forecast accuracy with fairly limited extra complexity:
Smooth data and avoid negative predictions in extreme cases by taking the log() of inputs and exponentiating the outputs (smoothing may or may not be a good thing, depending on your perspective!).
Regress on the previous 12 months, instead of using the previous 1 month + seasonality (this requires 3 years’ minimum historical data)
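Richard's first suggestion is easy to sketch: fit whatever model you like on the log of the series, then exponentiate the predictions, which guarantees the back-transformed forecast can never go negative. In this toy example a plain straight-line fit stands in for the full model:

```python
import math

def log_linear_forecast(series, periods):
    """Fit a straight line to log(traffic) over time, then exponentiate.

    A steep downward trend then asymptotes toward zero instead of
    crossing into negative traffic."""
    n = len(series)
    xs = range(n)
    ys = [math.log(v) for v in series]
    x_mean = sum(xs) / n
    y_mean = sum(ys) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
             / sum((x - x_mean) ** 2 for x in xs))
    intercept = y_mean - slope * x_mean
    return [math.exp(intercept + slope * (n + i)) for i in range(periods)]

# Traffic halving every month: the forecast keeps halving, never negative.
print(log_linear_forecast([1000, 500, 250, 125], 4))
```

A linear fit on the raw numbers here would predict negative traffic within a couple of months; the log transform keeps every prediction positive by construction.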
I may or may not include some or all of the above myself over time, but if so I’ll make sure I use the same link and make a note of it in the spreadsheet, so this article always links to the most up-to-date version.
If you’ve made it this far, what would you like to see? Let me know in the comments!
A new client comes to your digital marketing agency and says their competitors are stuck to the local packs like mussels cleaved to coastal rock.
“How do we edge our way up in Google’s local finder, and find our place above the tideline? We don’t even know where to begin,” the local business owner says.
The rough truth is that Google’s local search engine results often don’t make sense at first glance, or even at second or third glances. Local brands are left to puzzle out how to achieve maximum growth when they’re consistently being outranked by sticky competition for their core search phrases.
About a year ago, I decided to run a study in which I’d track a local finder for a single query — “breakfast (X city)” — to see if anything brands or the public did over the course of 12 months would shift the top eatery out of its #1 spot. I chose a small SF Bay Area city where I’m not physically located (to remove the influence of proximity from the mix) and repeated the same query across time to see what we might learn from trying to explain the results at the end of the test period. I did my searches manually and tracked them in a spreadsheet.
My anonymized data and takeaways are at the service of your agency as you work to increase local clients’ visibility so that they can achieve optimum growth.
Visualizing a year of movement in the local finder
Google’s local finder results are paginated in sets of ten. In the following chart, you’ll see the top 10 competitors as they stood in January, moved throughout the year, and finished in December. A total of fifteen brands saw some visibility in the top 10 local finder results over the course of the year, and each brand is represented by its own color.
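If you run a similar study yourself, even a tiny script can keep the bookkeeping honest. This sketch (with invented brand names and positions, not the study's actual data) shows one way to record manual monthly observations and compute each brand's net movement across the year:

```python
# Hypothetical rank-tracking log: brand -> {month: local finder position}.
# Names and positions are invented for illustration only.
ranks = {
    "Tansy":    {"Jan": 1, "Jun": 1, "Dec": 1},
    "Lovage":   {"Jan": 6, "Jun": 5, "Dec": 5},
    "Rosemary": {"Jan": 9, "Jun": 11, "Dec": 10},  # dipped past the top 10 mid-year
}

def movement(brand):
    """Net change from January to December (negative = moved up)."""
    series = ranks[brand]
    return series["Dec"] - series["Jan"]

for brand in ranks:
    print(brand, movement(brand))
```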
1) Nothing anyone did in 2020 shifted Brand #1 (which I’ll call Tansy) out of its top spot. No amount of links, reviews, photos, posts, category changes, or any other activities over the course of the year unseated Tansy.
When I see a results set like this, I suspect a sluggish market in which no one is making a strong enough marketing effort to surpass a business like Tansy. If you find a sluggish market, your client can become a winner with the right strategy.
2) The higher a business appeared in the local finder, the more stable it tended to be throughout the year. The lower a business appeared in the local finder, the more erratic its position was as the year moved along. One notable exception to this was the dramatic eight-spot drop in April experienced by the business that started out in position #2. My theory was that this outlier may have been tied to changes at the business at the onset of COVID-19. But other than this, the lower levels of the top 10 local finder results are very volatile, with some brands even vanishing while new ones popped into view.
It’s clear that local results are extremely dynamic, sometimes even changing from hour to hour within a single day, but you seem to be most secure at the top. Even if you only report on rankings to clients a few times a year, remind them that variation is the norm, and that they shouldn’t sweat the small ranking stuff because it’s tracking overall upward growth that matters most to their brand.
3) Of the 15 total brands that won a spot in the top 10 results over the course of the study, three began and ended the year in the same position. Two that maintained a presence in the results throughout the study ended up in a higher position at the end of the year than they’d begun with, and two ended up lower. Two latecomers began the year not in the top ten but achieved a top ten placement by year’s end. Finally, two that began the year in the top ten fell out of the set by year’s end and two latecomers made brief appearances at some point, only to disappear later.
I wasn’t expecting the pandemic to happen when I began this study, but my takeaway at the end of the test period was that this set of restaurants had done an amazing job adapting. Though a few restaurants lost their spot “above the tideline” of the top 10, most remained operational and visible, and some even made small gains. Anything you can do to help clients remain safely viable will be work of real value for the duration of COVID-19.
Seek strategic clues to unseat a sticky local competitor
So, a year has gone by in which no other brand was able to unseat Tansy. This leaves us with two questions:
What could restaurants lower in this local finder do in the year ahead to mount a challenge to Tansy’s dominance?
As a side query, how do I feel about the results quality? Is it fitting that Tansy is ranking #1 for this search phrase, or are Google’s results inexplicable and/or of low quality?
To investigate this, I did a competitive audit of Tansy at #1, the brand at #5 which I’ll call Lovage, and the brand at #10 which I’ll call Rosemary, to continue my herbal theme.
Obviously, I don’t have access to the analytics of any of these brands, but I was able to analyze 48 data points for each brand to discover where Tansy is winning and where Lovage and Rosemary would need to improve to pry Tansy out of its spot, if possible. I won’t cover all 48 points here, but let’s look for answers together in this data set:
All three brands are within Google’s mapped city borders, though Rosemary is right at the edge, just within the red perimeter.
No location shares an address with a competitor with the same primary category, though Lovage is within a block of businesses with the same primary category.
Lovage is moderately winning on proximity to the city centroid, at .1 miles from it. Tansy is .4 miles, and Rosemary is much further away at 4.5 miles.
There are no signs of spam, in terms of location. Everything is legit.
No business has the word “breakfast” or the name of the city in its business title, so no one is either spamming or winning an advantage here.
Tansy is categorized as “Breakfast Restaurant”, but both Lovage and Rosemary are categorized as “American Restaurant”.
This is our first big a-ha. If Lovage or Rosemary see breakfast-related queries as their primary queries (their head terms), they would likely need to change their primary category to compete with Tansy. Right now, Tansy is getting a big win here.
Tansy is also doing a better job with secondary categories, according to GMBspy, having selected brunch restaurant, american restaurant, and family restaurant to let Google know more about their relevance. Lovage has only selected the rather redundant restaurant as a secondary category, and Rosemary has no secondary categories. Lovage and Rosemary are leaving opportunities on the table here to enrich their secondary categories.
Tansy comes out ahead again by having uploaded about 20 photos and accumulated 400+ total photos from the public. Lovage has uploaded 0 photos, though the public has stepped in with 100+ uploads. Rosemary has also uploaded 0 photos, and has only amassed 20 public pics.
In terms of quality, I saw lots of good shots for Tansy and Lovage, but Rosemary’s user-uploaded photos are fuzzy, unflattering, and in need of work. Both Lovage and Rosemary need to invest the time in uploading a great photo set to enrich their listing and improve conversions.
All three brands have achieved a laudable 4.6 star rating, so there is no clear winner here, but Tansy has 510 reviews, Lovage has 245, and Rosemary has 109.
Tansy is running away with the review game. Lovage needs to double its review corpus and Rosemary needs 5x the reviews it currently has to achieve comparable metrics.
Tansy is also ahead in terms of review recency, with their most recent review being 6 days ago, while it’s been 3 weeks since Lovage was reviewed and 2 weeks for Rosemary.
None of our three competitors have responded to a single Google review. This would seem to shore up the theory that owner responses don’t directly impact local pack rankings, because clearly a lack of responses isn’t preventing high placement in this local finder. That being said, ignoring conversations your customers are starting each time they review your business is not good customer service and could erode reputation and ratings over time. There’s opportunity here for Lovage and Rosemary to become more active than Tansy in ways that could improve customer experience and conversions.
I saw no signs of spam in Tansy’s body of Google reviews, so nothing can be reported by lower competitors to gain an advantage.
Meanwhile, over at Yelp, Tansy is ranking #2, with a 4-star rating, and 673 reviews. Lovage comes in at #7, with a 4-star rating, and 223 reviews. Rosemary is way down at #24, with a 4-star rating, and just 100 reviews. The high stars of all three brands could be doing something to shore up their rankings over in Google’s product, but this is just speculation on my part. I find Rosemary’s top 10 Google visibility a bit more mysterious after looking at Yelp.
Place Topics
I consider this an experimental area of Google’s review interface. It surfaces and quantifies the subjects reviewers are discussing. I like to look at it to gauge how Google might derive signals of relevance in relationship to the search phrase.
Tansy’s top ranking for a breakfast query might be somewhat supported by 43 mentions of “french toast” and 5 of “breakfast burritos”, but Lovage looks to be in the best shape here with 52 mentions of “breakfast”. Rosemary has received 11 mentions of “breakfast”.
I found the Place Topics for Lovage especially interesting. Elsewhere in my audit, I had accumulated some rather low metrics for them, which surprised me given their good #5 ranking, so I revisited this data point. Could this winning number of mentions of “breakfast” be doing a great deal to support Lovage’s ranking for my search term, even though their metrics are severely lacking in other areas of the audit? Food for thought!
Don’t forget that review acquisition campaigns can shape response language by the way requests are phrased. Tansy should secure their relevance by asking patrons to specifically comment about breakfast, and Rosemary needs to keep working on breakfast mentions as they increase their overall review count.
Google Posts, Q&A and menus
Tansy is in the lead again, with minor usage of Google Posts, and 4 questions asked with some response from the brand. Our other two competitors have never published a Google Post or received, published, or answered a question.
Lovage and Rosemary could shine here with a moderate effort put into Google Posts since Tansy’s usage has been lukewarm, and it would take about 15 minutes for the two lower-ranked competitors to put up 10 FAQs and answer them to take on a more active appearance than Tansy.
As for menu links, only Tansy had posted one. Smartly, it was a link to the menu on their own website rather than on a third-party platform.
Here’s where I got a fairly significant audit shock.
Tansy has a real website that’s been around for 4 years, with a Domain Authority of 19 and a GMB landing page Page Authority of 20. They’ve earned 70 links from 43 root domains. The basic contact info on the website matched the GMB contact info. The GMB landing page title tag did not include my search phrase. The site passed Google’s mobile-friendly test but does not pass secure HTTPS muster. The top link the site has earned is from a local online newspaper with a PA of 40, according to Moz Link Explorer.
But Lovage has no website at all, and isn’t linking its Google My Business listing to anything.
Meanwhile, Rosemary has a sketchy two-month-old subdomain on some sort of free website builder with a concerning backlink portfolio of 7,324 links from 74 root domains. The actual DA of the website builder domain is 22 and the GMB landing page PA is 15. The GMB NAP matched the landing page NAP but the GMB landing page title tag optimization did not include my search phrase. The site was neither mobile-friendly nor secure. Moz Link Explorer found that the top link followed to the site was from a completely unrelated web page on a lifestyle site about life in another state, with a PA of 43.
Tansy’s content was minimal, lacked the search phrase in its title tag, and was in what I’d consider pretty poor SEO shape. But it was better than having no website, like Lovage, or the single subdomain page that Rosemary has.
So, this is one of those good but startling audit surprises. No one has a strong website, and despite this, Lovage is ranking #5 with no website and Rosemary is managing top 10 visibility without a real website of their own. There is certainly opportunity for a competitor with a strong, optimized website and a solid backlink profile to make headway in a market like this where high rankings are being awarded despite minimal organic effort.
I looked at a variety of other points, like hours of operation, price attributes, and the sites Google was surfacing from around the web on the GMB profiles, but I didn’t see any major wins or losses here.
From the overall audit process, what I did see was that:
Tansy’s win is clear
Of the 21 factors in which one of the three competitors scored a clear win, Tansy won 17, Lovage won 2, and Rosemary won 2.
Nobody but Google knows what all the local ranking factors are, but as far as my auditing process can measure, it made sense that Tansy’s 17 wins were translating to the top ranking among these three competitors. As far as I can measure as a local SEO without access to behavioral signals and other analytics, the top result, at least, makes sense.
Lovage and Rosemary’s claims to visibility are cloudier
Things fall apart a bit after acknowledging that Tansy deserves to be #1. Google is measuring Lovage as being a better result than Rosemary, despite the former having no website and the latter having at least a free subdomain page it is designating as home.
Maybe Google is as suspicious of that backlink profile on the free website builder as I am and is pushing Rosemary below Lovage because of it. Maybe Lovage’s winning Place Topic mentions of “breakfast” are keeping it in the running for my breakfast query, and are even moderately representative of Google’s overall understanding of this entity’s relevance to searches for breakfast in this city.
The trouble is, within the first 10 results of the Local Finder, I saw Lovage outranking restaurants with higher metrics in many areas I haven’t described in my summary, and so, Google’s weighting of ranking factors remains frustratingly vague in this test, as it does in so many real-world cases.
Lovage or Rosemary could unseat Tansy if they chose to
Despite the opacity of Google’s local algorithm, there is clearly room for improvement for both Lovage and Rosemary. If either of these brands were your agency’s client, you would need to take Tansy’s wins column and build your strategy from it. Your strategy could include recommendations for:
Primary category adjustment based on ranking goals
Website development and optimization
Review acquisition, including both numbers and recency, as well as review language
Customer service improvements via owner responses and Q&A usage
The main thing is that Tansy’s effort has not been so enormous that it can’t be overcome. It has remained at the top for a year due to a modest presence — not an insurmountable one.
X factors and Google’s local SERP quality
For nearly two decades, local SEOs have been trying to identify and assign weight to the various local search ranking factors. The truth is, whenever I have occasion to conduct an audit, I realize that:
I’m confident that we know some of the factors, but certainly not all of them. I think there are X factors out there still to be discovered.
I have little confidence that we know the weight Google assigns to individual factors, and I strongly suspect that Google weights unique factors differently in different industries.
More on that second point: in this data set, I’ll reveal that the business which ranked #8 in December is IHOP — a large, corporate competitor with a Domain Authority of 68, and nearly 700,000 links from nearly 20,000 root domains. Yet, it ended up being outranked by both Tansy and Lovage, which are single location, independently-owned eateries. How does that happen?
I strongly believe that organic factors have a huge impact on local rankings, but it doesn’t play out that way in this local finder. I also strongly believe that review count matters, but Lovage is beating IHOP with fewer reviews. I still moderately believe that for remote searches, distance to city centroid continues to have some effect, but IHOP is very centrally located in this instance. And so on and so forth.
Overall, I feel Google’s results are, indeed, delivering a good quality experience for a person searching for breakfast in this city. The searcher is certain to find a decent variety of nearby options for a meal, and I saw no spam lying in wait for them in this particular top 10 of the local finder. But as to the individual placement of each restaurant, I did see mysteries that I couldn’t easily solve for myself and that agencies like yours would likely find difficult to explain to clients.
Creating a strong plan of action for clients, despite any ranking mysteries
No one factor will “do the trick” in any local finder. Just as flour doesn’t become bread without yeast, salt, and water, a single local ingredient won’t equal rankings without attention to the whole recipe.
Your agency will encounter sluggish packs where no brand is taking substantial action to challenge the top competitor, meaning achievable wins are totally possible with a few good ingredients. You’ll uncover local finders so riddled with spam that reporting bad actors will be core to your strategy. And you’ll encounter SERPs so actively managed by mighty competitors that making any headway for your client will require throwing everything but the kitchen sink at the problem.
Regardless of scenario, you can create the strongest plan of action with these steps:
The top-ranked businesses in my study were the recipients of tons of love from the public. Their food, their service, their adaptations to the pandemic, and many other human factors really sang out in the reviews. The foundation of success, both offline and online, is positive real-world relationships. Be sure you make this message central to what you teach all clients.
Identify the top competitor’s wins, and prioritize your local marketing strategy based on which factors you believe are having the most impact in the client’s unique market, whether that’s reviews, photos, or what have you. Even if there are mystery rankings, you’ll typically get the best results by applying best practices to presumed ranking factors, hoping to see cumulative rewards. But don’t take anyone’s word for it. Keep experimenting when you encounter mysteries. It may be your agency that unlocks an X factor.
Some agencies don’t report on rankings at all. If yours does, be sure you’re not overreporting, because the constant variation in ranking order can cause clients needless worry. Rather, use rankings mostly as internal benchmarks, and be sure you’re tracking how the work you’re doing is leading to upward growth in conversions and revenue.
Be sure incoming clients understand the influence of user-to-business proximity, meaning that there are no static #1 rankings. This yields many, many chances for your client to rank because customers are multi-located, mobile, and being served up highly dynamic local SERPs.
What would your agency add to my to-do list? What have you seen in your own year-long or multi-year local SERP tracking? Do you suspect the identity of an X factor no one is talking about? If you’ll share in the comments, we can all keep learning together!