That’s why the mantra “link building is relationship building” exists. Often, before you build a link, you have to build a relationship with the site owner first. That can mean anything from following them on Twitter to commenting thoughtfully on their posts to emailing them about their content without pitching links. It’s a productive strategy, but also a time-intensive one.
However, there’s another — relatively quick — link building strategy.
Is your ear itching? If you’re the superstitious type, this means that someone is talking about you.
Sometimes a webmaster will publish your brand name, products, or target keywords on their site without actually linking to your site. In SEO, these are known as “fresh mention” opportunities. These are typically some of the easiest link building opportunities available, since you don’t really have to explain yourself to the site owner. Mostly, you just have to ask them to put an <a href> tag in the code.
But how do you find these fresh mentions? There are multiple methods and tools, but today I’m going to highlight the one I use most often: Google Alerts.
Google Alerts is beneficial in a myriad of ways beyond the world of link building and SEO, but there’s no doubt that it’s the best way to stay on top of your fresh mention opportunities. Allow me to explain how you can use it!
Setting up Google Alerts
First off, the obvious: you need to be in the right place. To start, head over to the Google Alerts homepage. You can technically set up alerts without a Gmail account, but I would recommend having one; if you don’t, Google’s help pages explain how to set one up.
When you have an account set up and land on Google Alerts, you will see a page that looks like this:
No, there’s not much to see. Not yet anyway.
Let’s take a basic example. Say you want to create an alert for mentions of link building. Simply type the phrase into the bar at the top.
You will see something similar to the image above, even before you click anything else. The first box asks which email address you want the alerts sent to (I’ve erased mine for the purposes of this article, but trust me, it’s there). Below that are examples of recent results for your query.
Click the “Create Alert” button, and alerts will be sent to your selected inbox going forward. However, you can customize a few settings before you do so. Click the “Show options” dropdown next to the button to see a list of settings you can adjust:
Each item is auto-filled with the default setting. You can adjust the settings so that you only get alerts from specific regions, for certain types of content, and more. In general, I have found the default settings to suffice, but there are valid reasons you might want to change them (if you’re only interested in video content, for example).
When you’re done with the settings, you can create the alert!
Google Alert tips
From that point on, assuming you stuck with the default option of once-a-day emails, you’ll get an email every 24 hours that looks like this:
Notice that the results in this example include pages that mention each individual word from the query (here, the word “link” and the word “building”) rather than the phrase itself. Obviously, this isn’t helpful, and sifting through these results is a waste of time.
So, how can you make sure that you only get results for an exact phrase? Quotation marks!
I (intentionally) made this mistake when setting up this alert. Notice in the image from the first section that “link building” didn’t include quotation marks around it. Without them, Google Alerts will return results like the ones in the image above.
The quotation marks indicate that you’re looking for an exact match of that phrase, so when you set up an alert using them you will get something that looks like this:
Much better, right?
Note that you can combine terms with and without quotation marks in one alert. Say for example I was looking for content related to link building around images. Instead of “link building images,” a phrase not likely to occur too often, I could use:
This will return results that include both the exact phrase “link building” AND the term “images”.
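The difference between quoted and unquoted terms can be sketched in a few lines of code. This is an illustrative model of the matching behavior, not the Google Alerts API; the headlines and function names are made up:

```python
# Illustrative sketch of why quotation marks matter in an alert query.
# Unquoted terms match pages containing the words anywhere, in any order;
# a quoted term requires the exact phrase.

headlines = [
    "10 Link Building Strategies for 2024",
    "Building a Tiny House: Which Link Fits Your Budget?",
    "Link Building with Images: A Beginner's Guide",
]

def loose_match(text, query):
    """Unquoted behavior: every word appears somewhere, in any order."""
    return all(word in text.lower() for word in query.lower().split())

def exact_match(text, phrase):
    """Quoted behavior: the phrase appears verbatim."""
    return phrase.lower() in text.lower()

# Unquoted 'link building' also matches the irrelevant tiny-house headline:
print([h for h in headlines if loose_match(h, "link building")])

# '"link building" images' style query: exact phrase plus a loose extra term
print([h for h in headlines
       if exact_match(h, "link building") and loose_match(h, "images")])
```

Running this, the unquoted query matches all three headlines, while the combined quoted-plus-loose query narrows the list to just the images guide, which is exactly the behavior you want from the alert.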
Set up multiple alerts
If you’re using Google Alerts for link building, I recommend setting up more than one alert. Consider some of the following:
Your brand name
Your products or services
Your focus keywords
Personalities associated with your brand
If you’re concerned about all the emails flooding your inbox, adjust the settings to decrease the frequency or stagger delivery days. You can also set up a separate Gmail account that only serves to receive these emails. I personally find the former to be the better option, but I know people who do the latter.
Consider setting up alerts for your competitors as well. Doing so may give you a window into their link building and publicity strategies that you can learn from. Along with that, you might find new potential target sites that aren’t mentioning you. If they mention your competitor, it’s likely they are relevant to your niche.
Also include common misspellings of any of the list items above. While Google’s algorithm is typically smart enough to correct such misspellings in search, a few valuable mentions may still slip through the cracks without a dedicated alert.
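If you have a long list of brand and product names, generating misspelling candidates by hand gets tedious. Here’s a hypothetical helper that produces simple one-character variants you can paste into extra alerts; it’s a sketch, not an exhaustive typo model:

```python
# Hypothetical helper: generate simple one-character misspellings of a
# term (adjacent swaps and single deletions) for extra Google Alerts.

def simple_misspellings(term):
    variants = set()
    for i in range(len(term) - 1):
        # adjacent-letter swaps, e.g. "brand" -> "barnd"
        variants.add(term[:i] + term[i + 1] + term[i] + term[i + 2:])
    for i in range(len(term)):
        # single-letter deletions, e.g. "google" -> "gogle"
        variants.add(term[:i] + term[i + 1:])
    variants.discard(term)
    return sorted(variants)

print(simple_misspellings("google"))
```

You’d still want to prune variants that collide with unrelated real words before creating alerts for them.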
Google Alerts can be helpful for purposes other than link building, too. If you’re engaged in an online reputation management campaign, it’s a necessity, and some people use alerts to track the kind of publicity their competitors are getting.
There are other excellent link building tools out there that can complement your “fresh mention” strategy, but Google Alerts is essential. I hope you find it as helpful for link building as I have. If you have other tools or suggestions, please mention them in the comments below.
We want to fix things and believe we’re in control. When your house is filling with water, you grab a bucket. If there’s a hole in your roof, the bucket might help. If your sink is overflowing, the bucket is distracting you from the real problem. If the river is overflowing, that distraction could be deadly.
When traffic is falling, it’s easy to panic and focus on what you can control. Traffic isn’t just a nice-to-have — it puts food on the table and the roof over your head that keeps the water out. In the rush to solve the problem, though, we often don’t take the time to validate the problem we’re solving. Fixing the wrong problem is at best a waste of time and money, but at worst could deepen the crisis.
In any crisis, and especially a global one, the first question you need to ask is: is it just me, or is it the whole world? The answer won’t magically solve your problems, but it can keep you from making costly mistakes and start you on the path to a solution. Let’s start with a fundamental question:
(1) Did your traffic really drop?
My “fundamental” question might sound like a stupid question, especially given the wide impact of the COVID-19 pandemic, but it’s important to remember that traffic fluctuates all the time — there are weekends and seasonality and plain, old regression to the mean. What goes up must come down, and as much as we’d like it to be true, business is not perpetually up and to the right.
Using Google Analytics, let’s consider some ways to validate a traffic drop. Here are four weeks of GA data (March 1-28) for a site that was seriously impacted by COVID-19:
Given the known timeline of COVID-19 (the WHO declared it a pandemic on March 11), this is about as clean a picture of a traffic drop in the presence of a known cause as you’re going to get. Most situations are far messier. Even here, we’ve got the impact of weekends and day-to-day fluctuations. One quick way to get a cleaner view is to summarize the data by week (make sure your date-range covers full weeks, or this data will be skewed).
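The weekly roll-up is easy to reproduce outside of GA’s interface. Here’s a minimal sketch in pandas, assuming a daily export with “date” and “sessions” columns (the column names and the session counts are my invention, not a GA standard):

```python
# Sketch: summarize daily GA sessions by week, assuming a Sun-Sat week.
# The numbers are invented to mimic a COVID-era drop.
import pandas as pd

daily = pd.DataFrame({
    "date": pd.date_range("2020-03-01", "2020-03-28"),  # 4 full Sun-Sat weeks
    "sessions": [1000] * 14 + [600] * 7 + [450] * 7,
})

weekly = (
    daily.set_index("date")["sessions"]
    .resample("W-SAT")  # bins ending Saturday, matching GA's Sun-Sat weeks
    .sum()
)
print(weekly)
```

Note that if the date range included a partial week at either end, that week’s total would be artificially low, which is exactly the skew the caveat above warns about.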
The trend is much clearer now. In a two week period, this site lost more than half of its traffic. I’m restricting the timeline for clarity, but as we gather more data, we can validate the trend pretty easily. The graph above covers all traffic sources. From an SEO perspective, let’s add in a traffic segment for Google traffic:
This graph is just eight data points, but it tells us a lot. First, we can clearly see the trend. Second, we can see that the trend is almost identical for both Google traffic and overall traffic. Third, we can see that this site is very dependent on Google for traffic. Don’t underestimate what you can learn from small data, if it’s the right data.
This isn’t meant to be a GA primer, but let’s look at one last question: is this traffic drop seasonal? Usually, your own industry experience and intuition come into play, but one quick way to spot seasonality is to compare year-over-year traffic. One note: match full weeks so that you’re covering the same number of weekdays and weekend days. In this case, I’ve shifted the 2019 range to the four full weeks of March 3-30 …
This isn’t the easiest graph to read, and I probably wouldn’t put it in a report to a client, but you can see from the green and purple lines that both overall traffic and Google traffic for this site were relatively flat last year during March. This really does seem to be an unusual situation. Even if we knew nothing about the context and COVID-19, we could tell from just a few minutes of analysis that something serious is going on here.
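Aligning those full weeks across years is fiddly to do by hand. Here’s a small sketch that rolls a start date forward to the next Sunday and returns a range covering whole Sun-Sat weeks (the helper name is mine, and it assumes pandas date handling):

```python
# Sketch: pick a year-over-year date range covering full Sun-Sat weeks,
# so both years compare the same mix of weekdays and weekend days.
import pandas as pd

def full_week_range(start, weeks=4):
    """Return (start, end) covering `weeks` full Sun-Sat weeks from `start`."""
    start = pd.Timestamp(start)
    # roll forward to the next Sunday if needed (pandas: Monday=0 .. Sunday=6)
    start = start + pd.Timedelta(days=(6 - start.weekday()) % 7)
    end = start + pd.Timedelta(days=weeks * 7 - 1)
    return start, end

print(full_week_range("2020-03-01"))  # already a Sunday: Mar 1 - Mar 28
print(full_week_range("2019-03-01"))  # rolls forward to Mar 3 - Mar 30
```

For the 2019 comparison this reproduces the March 3-30 shift used above.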
(1b) Did your rankings drop?
As a search marketer, and given that we’ve clearly measured a Google traffic drop, the next question is whether this drop was due to a loss of rankings (we’ll get to other explanations in a moment). In Moz Pro, one quick way to assess overall weekly search visibility is to use either the main view under “Rankings” or go to the “Competition” tab. I like the competitive view, because you can quickly see if any changes impacted your broader industry …
I’ve simplified this view a little bit (and removed the site’s and competitors’ names for privacy reasons), but the basic story is clear — neither the site in question nor its competitors seemed to have any drop in visibility during March.
For a richer view, go back to the “Rankings” tab and select “Rankings” (instead of “Search Visibility”) from the drop-down. You’ll see a graph that looks something like this …
This visualization takes some getting used to, but it contains a wealth of information. The bars represent total ranking keywords/phrases, and the color blocks show you the ranking range (see the legend). Here we can see that overall rankings have been relatively stable, with even some small gains in the #1-3 bucket.
If your account is connected to Google Analytics, you can also overlay traffic during the same period, which is shown by the dark gray line. Dual-scale graphs can get tricky, but this visualization really makes it clear that there’s a mismatch between the traffic drop for this site and their search rankings.
(2) Did Google do something?!
Usually, when we ask [demand / shout / sob] this question, we mean “Did Google do something to the algorithm to make my life miserable?” We can argue about whether Google is trying to make your life miserable at another time (preferably, when the bars re-open), but the core question is valid. Did Google change the algorithmic rules in a way that’s negatively impacting your site?
For large-scale algorithm updates, you can check our own Google Algorithm History page. For smaller/daily updates, you can check our MozCast research project. While having a gut-check against major changes can be very useful, the messy truth is that Google rankings are a real-time phenomenon that’s changing minute-by-minute. In 2018 alone, Google reported 3,234 “improvements” to search.
Keep in mind that all Google algorithm tracking tools are based, to some degree, on fluctuations in rankings. In our example scenario, we’re not seeing ranking shifts. Let’s pretend, though, that we have seen a traffic drop with a corresponding ranking drop, and we’re trying to determine if it’s just us or if something changed with Google.
Here’s a graph of MozCast data from my analysis of the January 2020 Core Update …
In this case, we’ve got a pretty clear three-day period of ranking fluctuations. If our traffic dropped during this period, it’s not absolute proof that an algorithm update is to blame, but it’s a solid, educated guess and a useful starting point.
Let’s look at the two weeks around when COVID-19 was declared a global pandemic …
I’ve kept the same scale and 30-day average reference (from a relatively quiet period early this year). Note that algorithmic activity (i.e. ranking flux) is way up compared to the period before and after the January Core Update. One day (March 18) doesn’t even fit on the scale of the original graph and came in at 104°F on MozCast.
What does all of this mean? It’s possible that Google is changing the algorithm rapidly to address the broader changes in the world, but I strongly suspect that the world itself is impacting this flux. Sites are changing rapidly, adding and removing products and content, news sources have dramatically shifted their coverage, and some businesses are closing completely. On top of that, we’re seeing an unprecedented shift in searcher and consumer behavior.
Algorithm flux can be a useful answer to the question “Is it just me, or is it Google?” during normal times, but all that it’s telling us right now is that the world has turned upside-down. While that’s an accurate assessment, it’s not particularly helpful. If you’d like to hear more about the impact of COVID-19 on Google rankings, check out “SEOs talk COVID-19 search disruption” from Barry Schwartz with myself, Marie Haynes, Olga Andrienko, and Mordy Oberstein.
If traffic has dropped, but rankings haven’t, it’s also possible that the behavior of searchers has changed. We can get some insights into this by using Google Search Console. Here’s the graph of total clicks for our example site from March 1-28 (corresponding with the GA data) …
As expected, total clicks on Google results show roughly the same trend as Google organic traffic in GA. Total clicks are a function of two variables, though: (1) search impressions, and (2) click-through rate (CTR). Let’s look at those individually. Here’s the graph of total impressions for the same time period …
Now we’re getting somewhere — there’s an overall drop in impressions. This isn’t just about the example site, but searcher behavior before they even see or click on that site. People are searching less for the phrases that drive traffic to our example site. Finally, let’s look at CTR …
CTR has also dropped, even a bit steeper than impressions. This is a bit harder to interpret. Knowing what we know, it’s likely that people are clicking less because of overall lack of interest. This is consistent with the COVID-19 scenario. People are less likely to be looking for the service this site offers. On the other hand, it could be that something about the site or the competitive landscape has changed that’s driving down CTR.
If you see a CTR drop without a corresponding impression drop, review recent changes to the site, especially changes that could impact what’s displayed in search results (including your TITLE tags and META descriptions). In this case, though, it’s reasonable to assume that we’re looking at an overall drop in demand.
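Since clicks are just impressions multiplied by CTR, you can attribute a traffic change to the two factors directly. A quick sketch with invented numbers (these are not the example site’s actual GSC figures):

```python
# Clicks = impressions x CTR, so a clicks drop decomposes into the two
# factors. All numbers here are invented for illustration.

before = {"impressions": 50_000, "ctr": 0.040}
after = {"impressions": 30_000, "ctr": 0.030}

clicks_before = before["impressions"] * before["ctr"]  # 2,000 clicks
clicks_after = after["impressions"] * after["ctr"]     # 900 clicks

impressions_change = after["impressions"] / before["impressions"] - 1
ctr_change = after["ctr"] / before["ctr"] - 1
clicks_change = clicks_after / clicks_before - 1

# Relative changes compound multiplicatively:
# (1 - 0.40) * (1 - 0.25) = 0.45, i.e. a 55% drop in clicks.
print(f"impressions {impressions_change:+.0%}, CTR {ctr_change:+.0%}, "
      f"clicks {clicks_change:+.0%}")
```

In this made-up case, a 40% impressions drop and a 25% CTR drop combine into a 55% clicks drop, which is why it pays to look at both graphs instead of clicks alone.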
(3) Has the world gone mad?
Spoiler alert: yes, yes it has.
The Google Search Console data above has already suggested that we’re seeing a shift in the wider world and searcher behavior, but if you want to get outside of your own data, you can explore the world a bit with Google Trends. For example, here’s a Google Trends search for “movie tickets” for March 1-28 …
Not surprisingly, searcher interest in movie tickets declined sharply after the COVID-19 outbreak. People who aren’t going to movies aren’t going to be searching for showtimes and ticket prices. Google Trends data can be spotty in the long-tail, and we can’t necessarily attribute a trend to an event, but non-brand trends are a good supporting data point for whether your traffic drop is isolated to your site or is impacting your broader industry.
One final tip — everything discussed in this post can also be used to explore a traffic increase. Even during COVID-19, traffic has gone up for many topics and sites. For example, here’s the Google Trends data for “how to cut hair” from the same March 1-28 time period …
Whether or not cutting your own hair is a good idea, people are definitely showing more interest in the topic (I admit I’ve watched a couple of YouTube videos myself). We don’t typically dive deep into traffic increases — it’s too easy to just sit back and take the credit. I think this is a big mistake. Understanding whether a traffic increase was driven by changes you made or broader market shifts can help you understand what you’ve done right so that you can replicate that success.
The big picture is everything
Over the last few years, I’ve heard more people say things like “I don’t care about traffic, I care about conversions!” or “I don’t care about Google rankings, as long as I’m getting traffic!” Our gradual move toward bottom-of-funnel metrics makes sense — we’re all trying to make a living. Taken to the extreme, though, we lose valuable information. Focusing on conversions is certainly better than focusing on “hits” a la 1998, but no single metric tells the whole story.
Let’s say that the only thing you track is leads. Leads are where the money is. Sales are up, leads are up, times are good. Great. Inevitably, disaster strikes (even if it’s a minor disaster), and your leads drop. What do you do? You’ve cut off your ability to read anything but the last chapter of the story. You know how it ends, but you don’t know how you got there. Without understanding the path from leads back to visits back to rankings back to impressions, you’re not going to see the whole story, and you’re not going to know where things went wrong.
Even when times are good, this approach is short-sighted. Sales-focused culture creates a tendency to celebrate the wins and not ask too many questions. If traffic is going up, why is it going up? What content or keywords are driving that traffic? What industry trends are driving that traffic? If you can answer those questions, you can replicate success. If you can’t, then you’re going to have to start from scratch as soon as the celebration ends (and the celebration always ends).
It may be cold comfort to know that your entire industry or the whole world is suffering with you, but I hope that this process at least prevents you from fixing the wrong things and making costly mistakes. Ideally, this process can help you uncover areas that may be trending upward or at least help you focus your time and money on what’s working.
There are several studies (and lots of data) out there about how people use Google SERPs, what they ignore, and what they focus on. An example is Moz’s recent experiment testing whether SEOs should continue optimizing for featured snippets or not (especially now that Google has announced that a URL shown in a featured snippet no longer appears again on the first page of results).
Two things I have never seen tested, though, are users’ actual reactions to SERPs and their behavior on them. My team and I set out to test these ourselves, and this is where biometric technology comes into play.
What is biometric technology and how can marketers use it?
Biometric technology measures physical and behavioral characteristics. By combining the data from eye tracking devices, galvanic skin response monitors (which measure your sweat levels, allowing us to measure subconscious reactions), and facial recognition software, we can gain useful insight into behavioral patterns.
We’re learning that biometrics can be used in a broad range of settings, from UX testing for websites, to evaluating consumer engagement with brand collateral, and even to measuring emotional responses to TV advertisements. In this test, we also wanted to see if it could be used to help give us an understanding of how people actually interact with Google SERPs, and provide insight into searching behavior more generally.
The goal of the research was to assess the impact that SERP layouts and design have on user searching behavior and information retrieval in Google.
To simulate natural searching behavior, our UX and biometrics expert Tom Pretty carried out a small user testing experiment. Users were asked to perform a number of Google searches with the purpose of researching and buying a new mobile phone. One of the goals was to capture data from every point of a customer journey.
Participants were given tasks with specific search terms at various stages of purchasing intent. While prescribing search terms limited natural searching behavior, it was a sacrifice made to ensure the study had the best chance of achieving consistency in the SERPs presented, and so aggregated results could be gained.
The tests were run on desktop, although in the future we have plans to expand the study on mobile.
Users began each task on the Google homepage. From there, they informed the moderator when they found the information they were looking for. At that point they proceeded to the next task.
Facial expression analysis
Galvanic skin response (GSR)
Understand gaze behavior on SERPs (where people look when searching)
Understand engagement behavior on SERPs (where people click when searching)
Identify any emotional responses to SERPs (what happens when users are presented with ads?)
Interaction analysis with different types of results (e.g. ads, shopping results, map packs, Knowledge Graph, rich snippets, PAAs, etc.).
Research scenario and tasks
We told participants they were looking to buy a new phone and were particularly interested in an iPhone XS. They were then provided with a list of tasks to complete, each focused on searches someone might make when buying a new phone. Using the suggested search terms for each task was a stipulation of participation.
Find out the screen size and resolution of the iPhone XS. Search term: “iPhone XS size and resolution”
Find out the talk time battery life of the iPhone XS. Search term: “iPhone XS talk time”
Find reviews for the iPhone XS that give a quick list of pros and cons. Search term: “iPhone XS reviews”
Find the address and phone number of a phone shop in the town center that may be able to sell you an iPhone XS. Search term: “Phone shops near me”
Find what you feel is the cheapest price for a new iPhone XS (handset only). Search term: “Cheapest iPhone XS deals”
Find and go on to buy a used iPhone XS online (stop at point of data entry). Search term: “Buy used iPhone XS”
We chose the search terms ourselves for two reasons. First, for ease of correlating data: if everyone had searched for whatever they wanted, we might not have gotten certain SERP designs displayed. Second, so we could make sure that everyone who took part got exactly the same results within Google. We needed the searches to return a featured snippet, the Google Knowledge Graph, Google’s “People also ask” feature, as well as shopping feeds and PPC ads.
On the whole, this was successful, although in a few cases there were small variations in the SERP presented (even when the same search term had been used from the same location with a clear cache).
“When designing a study, a key concern is balancing natural behaviors and giving participants freedom to interact naturally, with ensuring we have assets at the end that can be effectively reported on and give us the insights we require.” — Tom Pretty, UX Consultant, Coast Digital
This was the finding our in-house SEOs were most interested in. According to a study by Ahrefs, featured snippets get 8.6% of clicks while 19.6% go to the first natural result below them; when no featured snippet is present, 26% of clicks go to the first result. By those numbers, winning a featured snippet wasn’t terrible, especially if you could claim one without ranking first for the term. After all, who doesn’t want real estate above a competitor?
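The Ahrefs figures above translate into simple back-of-envelope math. Here’s a sketch for a hypothetical query with 10,000 monthly searches (the search volume is invented; only the CTR percentages come from the cited study):

```python
# Back-of-envelope comparison using the Ahrefs CTR figures quoted above.
# The 10,000 monthly searches figure is hypothetical.

searches = 10_000
snippet_ctr = 0.086          # clicks on the featured snippet itself
first_below_ctr = 0.196      # first organic result under a snippet
first_no_snippet_ctr = 0.26  # first result when no snippet is shown

no_snippet_first = searches * first_no_snippet_ctr  # rank #1, no snippet
below_snippet_first = searches * first_below_ctr    # rank #1 under a snippet
own_snippet = searches * snippet_ctr                # own the snippet itself

print(f"#1, no snippet on page:   {no_snippet_first:.0f} clicks")
print(f"#1 beneath a snippet:     {below_snippet_first:.0f} clicks")
print(f"own the snippet:          {own_snippet:.0f} clicks")
```

The arithmetic makes the trade-off concrete: a snippet alone earns fewer clicks than the top natural result, but if you weren’t going to rank first anyway, the snippet is real estate you’d otherwise cede to a competitor.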
In the information-based searches, we found that featured snippets actually attracted the most fixations. They were consistently the first element viewed by users and were where users spent the most time gazing. These tasks were also some of the fastest to be completed, indicating that featured snippets are successful in giving users their desired answer quickly and effectively.
All of this indicates that featured snippets are hugely important real estate within a SERP (especially if you are targeting question-based keywords and more informational search intent).
In both information-based tasks, the featured snippet was the first element to be viewed (within two seconds). It was viewed by the highest number of respondents (96% fixated in the area on average), and was also clicked most (66% of users clicked on average).
People also ask
The “People also ask” (PAA) element is an ideal place to find answers to question-based search terms that people are actively looking for, but do users interact with them?
What did we find out?
From the results, after looking at a featured snippet, searchers skipped over the PAA element to the standard organic results. Participants did gaze back at them, but clicks in those areas were extremely low, thus showing limited engagement. This behavior indicates that they are not distracting users or impacting how they journey through the SERP in any significant way.
One task involved participants searching using a keyword that would return the Google Knowledge Graph. The goal was to find out the interaction rate, as well as where the main interaction happened and where the gaze went.
What did we find out?
Our findings indicate that when a search with purchase intent is made (e.g. “deals”), then the Knowledge Graph attracts attention sooner, potentially because it includes visible prices.
By also introducing heat map data, we can see that the pricing area on the Knowledge Graph picked up significant engagement, but there was still a lot of attention focused on the organic results.
Essentially, this shows that while the Knowledge Graph is useful space, it does not wholly detract from the main SERP column. Users still turn to paid ads and organic listings to find what they are looking for.
We have all seen “near me” appear under certain keywords in Google Search Console data, and there is an ongoing discussion of why, and how, to optimize for them. From a pay-per-click (PPC) point of view, should you even bother trying to appear for them? By introducing such a search term into the study, we were hoping to answer some of these questions.
What did we find out?
From the fixation data, we found that most attention was dedicated to the local listings rather than the map or organic listings. This would indicate that the greater amount of detail in the local listings was more engaging.
However, in a different SERP variant, the addition of the product row led to users spending a longer time reviewing the SERP and expressing more negative emotions. This product row addition also changed gaze patterns, causing users to progress through each element in turn, rather than skipping straight to the local results (which appeared to be more useful in the previous search).
Being presented with results they deemed irrelevant or less important could be the main cause of the searchers’ negative emotion and, more broadly, could indicate general frustration at having obstacles placed between them and the answer.
Purchase intent searching
For this element of the study, participants were given queries that indicate someone is actively looking to buy. At this point, they have carried out the educational search, maybe even the review search, and now they are intent on purchasing.
What did we find out?
For “buy” based searches, the horizontal product bar operates effectively, picking up good engagement and clicks. Users still focused on organic listings first, however, before returning to the shopping bar.
The addition of Knowledge Graph results for this type of search wasn’t very effective, picking up little engagement in the overall picture.
These results indicate that the shopping results presented at the top of the page play a useful role when searching with purchasing intent. However, in both variations, the first result was the most-clicked element in the SERP, showing that a traditional PPC or organic listing remains highly effective at this point in the customer journey.
Galvanic skin response
Looking at GSR when participants were on the various SERPs, there is some correlation between the self-reported “most difficult” tasks and a higher than normal GSR.
For the “talk time” task in particular, the featured snippet presented information for the iPhone XS Max, not the iPhone XS, which likely caused both the negative reaction and the high difficulty rating: with incorrect data in the snippet, participants had to spend longer digging into multiple information sources.
What does it all mean?
Admittedly, this wasn’t the largest study in the world, but it was a start. Running it again with greater numbers would be ideal and would help firm up some of the findings (and I, for one, would love to see a huge group of people take part).
That being said, there are some solid conclusions that we can take away:
The nature of the search greatly changes engagement behavior, even when similar SERP layouts are displayed (which is probably why Google split tests them so heavily).
Featured snippets are highly effective for information-based searching, and while they led to some 33% of users choosing not to follow through to the site after finding the answer, two-thirds still clicked through to the website (which is very different from the data we have seen in previous studies).
Local listings (especially when served without a shopping bar) are engaging and give users essential information in an effective format.
Even with the addition of Knowledge Graph, “People also ask”, and featured snippets, more traditional PPC ads and SEO listings still play a big role in searching behavior.
Featured snippets are not the worst thing in the world (contrary to the popular knee-jerk reaction from the SEO industry after Google’s announcement). All that has changed is that now you have to work out which featured snippets are worth winning for your business, instead of trying to claim all of them. On purely informational or educational searches, they actually performed really well: people stayed fixated on them for a fairly lengthy period of time, and 66% clicked through. However, we also have an example of people reacting badly to a featured snippet when it contained irrelevant or incorrect information.
The findings also lend weight to the idea that a lot of SEO is now about context. What do users expect to see when they search a certain way? They generally expect lots of shopping feeds for a purchase-intent keyword, for example, but they wouldn’t expect to see them in an educational search.
Hopefully, you found this study useful and learned something new about search behavior. Our next goal is to increase the number of people in the study to see if a bigger data pool confirms our findings, or shows us something completely unexpected.
Note: This post was co-authored by Cyrus Shepard and Rida Abidi.
Everyone wants to win Google featured snippets. Right?
At least, it used to be that way. Winning the featured snippet typically meant extra traffic, in part because Google showed your URL twice: once in the featured snippet and again in regular search results. For publishers, this was known as “double-dipping.”
All that changed in January when Google announced they would de-duplicate search results to show the featured snippet URL only once on the first page of results. No more double-dips.
Publishers worried because older studies suggested winning featured snippets drove less actual traffic than the “natural” top ranking result. With the new change, winning the featured snippet might actually now lead to less traffic, not more.
This led many SEOs to speculate: should you opt-out of featured snippets altogether? Are featured snippets causing publishers to lose more traffic than they potentially gain?
Here’s how we found the answer.
Working with the team at SearchPilot, we devised an A/B split test experiment to remove Moz Blog posts from Google featured snippets, and measure the impact on traffic.
Using Google’s data-nosnippet tag, we identified blog pages with winning featured snippets and applied the tag to the main content of the page.
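Applying the tag is straightforward: `data-nosnippet` is a boolean HTML attribute that tells Google not to use the enclosed markup for snippets. Here's a minimal sketch — the wrapper element and class name are illustrative, not Moz's actual markup:

```html
<!-- Anything inside an element carrying data-nosnippet is excluded
     from Google's snippets, including featured snippets. -->
<div class="post-content" data-nosnippet>
  <p>The main content of the blog post goes here...</p>
</div>
```

Note that Google honors `data-nosnippet` on `div`, `span`, and `section` elements.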
Our working hypothesis was that these pages would lose their featured snippets and return to the “regular” search results below. A majority of us also expected to see a negative impact on traffic, but wanted to measure exactly how much, and identify whether the featured snippets would return after we removed the tag.
In this example, Moz lost the featured snippet almost immediately. The snippet was instead awarded to Content King and Moz returned to the top “natural” position.
Here is another example of what happened in search results. After launching the test, the featured snippet was awarded to Backlinko and we returned to the top of the natural results.
One important thing to keep in mind is that, while these keywords triggered a featured snippet, pages can rank for hundreds or thousands of different keywords in different positions. So the impact of losing a single featured snippet can be somewhat softened when your URL ranks for many different keywords — some of which earn featured snippets and some of which don’t.
After adding the data-nosnippet tag, our variant URLs quickly lost their featured snippets.
How did this impact traffic? Instead of gaining traffic by opting-out of featured snippets, we found we actually lost a significant amount of traffic quite quickly.
Overall, we measured an estimated 12% drop in traffic for all affected pages after losing featured snippets (95% confidence level).
What did we learn?
With the addition of the “data-nosnippet” attribute, the test had a significantly negative impact on organic traffic. In this experiment, owning the featured snippet and not ranking in the top results provides more value to these pages in terms of clicks than not owning the featured snippet and ranking in the top results.
By adding the “data-nosnippet” attribute, we were not only able to stop Google from pulling content from that section of the HTML page to use as a snippet, but also able to confirm that we would rank again in the SERP, whether in position one or lower.
As an additional tool, we were also tracking keywords using STAT Search Analytics. We were able to monitor changes in ranking for pages that had featured snippets, and noticed that it took about seven days or more from the time we launched the test for Google to process the changes we made and for the featured snippets to be overtaken by another ranking page, if another page was awarded the featured snippet spot at all. The turnaround was quicker after we ended the test, though, as some of these featured snippets returned as quickly as the next day.
However, one negative outcome of running this test was that, even after some pages were recrawled and indexed with the tag removed, their featured snippets either passed permanently to competing pages or simply never returned at all.
To summarize the significant findings of this test:
Google’s nosnippet tags can effectively opt publishers out of featured snippets.
In this test, we measured an estimated 12% drop in traffic for all affected pages after losing featured snippets.
After ending the test, we failed to win back a portion of the featured snippets we previously ranked for.
For the vast majority of publishers, winning the featured snippet likely remains the smart strategy. There are undoubtedly exceptions, but as a general best practice, if a keyword triggers a featured snippet, it’s typically in your best interest to rank for it.
What are your experiences with winning featured snippets? Let us know in the comments below.
Join Moz SEO Scientist, Dr. Pete Meyers, Wednesdays in April at 1:30 p.m. PT on Twitter and ask your most pressing questions about how to navigate SEO changes and challenges in a COVID-19 world. Tweet your questions all week long to @Moz using the hashtag #AskMoz.
If you’ve been an SEO for even a short time, you’re likely familiar with Google Search Console (GSC). It’s a valuable tool for getting information about your website and its performance in organic search. That said, it does have its limitations.
In this article, you’ll learn how to get better-connected data out of Google Search Console as well as increase the size of your exports by 400%.
Google Search Console limitations
While GSC has a number of sections, we’ll be focusing on the “Performance” report. From the GSC dashboard, there are two ways you can access this report:
Once inside the “Performance” report, data for queries and pages can be accessed:
This reveals one of the issues with GSC: Query and page data is separated.
In other words, if I want to see the queries a specific page is ranking for, I have to first click “Pages,” select the page, and then click “back” to “Queries.” It’s a very cumbersome experience.
The other (two-part) issue is with exporting:
Performance data for queries and pages must be exported separately.
Exports are limited to 1,000 rows.
We’ll look to solve these issues by utilizing the GSC API.
What is the Google Search Console API?
Now we know the GSC user interface does have limitations: Connecting query data with page data is tricky, and exports are limited.
If the GSC UI represents the factory default, the GSC API represents our custom settings. It takes a bit more effort, but gives us more control and opens up more possibilities (at least in the realm of query and page data).
The GSC API is a way for us to connect to the data within our account, make more customized requests, and get more customized output. We can even bypass those factory default settings like exports limited to 1,000 rows, for instance.
Why use it?
Remember how I said earlier that query and page data is separated in the “vanilla” GSC UI? Well, with the API, we can connect query data with the page that query ranks for, so no more clicking back and forth and waiting for things to load.
Additionally, we saw that exports are limited to 1,000 rows. With the API, we can request up to 5,000 rows, an increase of 400%!
So let’s hook in, make our request, and get back a more robust and meaningful data set.
Log in to the appropriate GSC account on this page (upper right corner). For instance, if my website is example.com and I can view that Search Console account under [email protected], that’s the account I’ll sign into.
Enter the URL of the appropriate GSC account:
Set up your request:
Set startDate and endDate. These should be formatted as: YYYY-MM-DD.
Set dimensions. A dimension can be: query, page, device, or country.
Set filters (optional). A filter must include:
dimension (a dimension can be: query, page, device, or country)
operator (an operator can be: contains, notContains, equals, notEquals)
expression (an expression can be any value associated with the dimensions)
Set the rowLimit. With the GSC API, you can request up to 5,000!
The page shared in step one makes all of this setup pretty easy, but it can be tedious and even confusing for some. I’ve done all the fussing for you and have created JSON you can edit quickly and easily to get the API return you’d like.
The following request will be unfiltered. We’ll set our preferred dates, dimensions, and a row limit, and then make our request.
The order in which you place your dimensions is the order in which they’ll be returned.
The API will return data for desktop, mobile, and tablet, separated out. The numbers you see in the GSC user interface — clicks, for instance — are an aggregate of all three (unless you apply device filtering).
Remember, your dimensions can also include “country” if you’d like.
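As a sketch of what the unfiltered template looks like, here is a minimal request body for the Search Analytics API — the dates are placeholders to swap for your own, and the dimensions and rowLimit match the settings described above:

```json
{
  "startDate": "2020-01-01",
  "endDate": "2020-03-31",
  "dimensions": ["query", "page", "device"],
  "rowLimit": 5000
}
```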
This version of our request will include filters in order to be more specific about what is returned.
Filters are stated as dimension/operator/expression. Here are some examples to show what’s possible:
query contains go fish digital
page equals https://gofishdigital.com/
device notContains tablet
It looks like you can only apply one filter per dimension, just like in the normal GSC user interface, but if you know differently, let us know in the comments!
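Filters like the examples above live in the request body under `dimensionFilterGroups`. A sketch of the filtered template — the expression values are placeholders to replace with your own:

```json
{
  "startDate": "2020-01-01",
  "endDate": "2020-03-31",
  "dimensions": ["query", "page"],
  "dimensionFilterGroups": [
    {
      "filters": [
        { "dimension": "query", "operator": "contains", "expression": "go fish digital" },
        { "dimension": "device", "operator": "notContains", "expression": "TABLET" }
      ]
    }
  ],
  "rowLimit": 5000
}
```

Multiple filters within a group are combined with AND, so this request returns only rows matching both conditions.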
Choose a template, unfiltered or filtered, and fill in your custom values (anything after a colon should be updated as your own value, unless you like my presets).
Execute the request
So there you have it! Two request templates for you to choose from and edit to your liking. Now it’s time to make the request. Click into the “Request body”, select all, and paste in your custom JSON:
This is where you could manually set up your request keys and values, but as I stated earlier, this can be tedious and a little confusing, so I’ve done that work for you.
Scroll down and click “Execute.” You may be prompted to sign in here as well.
If everything was entered correctly and the request could be satisfied, the API will return your data. If you get an error, audit your request first, then any other steps and inputs if necessary.
Click into the box in the lower right (this is the response from the API), select all, and copy the information.
Convert from JSON to CSV
Excel or Sheets will be a much better way to work with the data, so let’s convert our JSON output to CSV.
Use a converter like this one and paste in your JSON output. You can now export a CSV. Update your column headers as desired.
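If you'd rather not rely on an online converter, the flattening is simple enough to script yourself. Here's a sketch in Python using only the standard library — the sample response below is made up, but mirrors the shape the API returns: each row has a `keys` array (one entry per dimension, in the order you requested them) plus the four metrics.

```python
import csv
import io
import json

# A made-up sample in the shape of a Search Analytics API response.
sample_json = """
{
  "rows": [
    {"keys": ["gsc api tutorial", "https://example.com/gsc-api", "MOBILE"],
     "clicks": 120, "impressions": 1500, "ctr": 0.08, "position": 3.2},
    {"keys": ["export search console data", "https://example.com/exports", "DESKTOP"],
     "clicks": 45, "impressions": 900, "ctr": 0.05, "position": 7.1}
  ]
}
"""

def rows_to_csv(response, dimension_names):
    """Flatten API rows into CSV text: one column per dimension, then the metrics."""
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(dimension_names + ["clicks", "impressions", "ctr", "position"])
    for row in response.get("rows", []):
        writer.writerow(list(row["keys"]) +
                        [row["clicks"], row["impressions"], row["ctr"], row["position"]])
    return out.getvalue()

csv_text = rows_to_csv(json.loads(sample_json), ["query", "page", "device"])
print(csv_text)
```

One thing to watch: pass `dimension_names` in the same order as the `dimensions` in your request, since the `keys` array comes back in that order.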
Query your own data
Most SEOs are pretty comfortable in Excel, so you can now query your request output any way you’d like.
One of the most common tasks performed is looking for data associated with a specific set of pages. This is done by adding a sheet with your page set and using VLOOKUP to indicate a match.
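As a concrete sketch, suppose your export has page URLs in column B and you add a sheet named “Pages” with your target page set in column A (the sheet name and cell references here are assumptions about your layout). Filling this formula down flags which exported rows belong to your set:

```
=IFERROR(VLOOKUP(B2, Pages!A:A, 1, FALSE), "no match")
```

It returns the matching URL when B2 appears in your page set, and “no match” otherwise.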
The API output being in a spreadsheet also allows for the most common actions in Excel like sorting, filtering, and chart creation.
Get more out of Google Search Console
GSC offers important data for SEOs, and the GSC API output offers not only more of that data, but in a format that is far less cumbersome and more cohesive.
Today, we overcame two obstacles we often face in the standard GSC user interface: the query/page connection and limited exports. My hope is that utilizing the Google Search Console API will take your analyses and insights to the next level.
While my JSON templates will cover the most common scenarios in terms of what you’ll be interested in requesting, Google does offer documentation that covers a bit more ground if you’re interested.
Do you have another way of using the GSC API? Is there another API you commonly use as an SEO? Let me know in the comments!
For local businesses today, there are numerous different ways to market your brand online. The majority of your potential customers still use Google to find local businesses near them — businesses where they will spend their hard-earned money. In fact, 80% of searches with “local intent” result in a conversion.
This raises the question: “What’s the best way to catch the attention of local searchers on Google?”
The answer: through Google Maps marketing.
What is Google Maps marketing?
Google Maps marketing is the process of optimizing the online presence of your brand in Google Maps, with the goal of increasing your brand’s online visibility.
When you search a query on Google that has local intent, you often see something like this:
Google Maps marketing utilizes a number of strategies and tactics to help your business become one of those three positions on local map packs.
Why is marketing important for Google Maps?
The reason every local business should care about ranking in Google Maps is simple: potential brand visibility.
It’s no surprise that Google is by far the most popular search engine. But what about Google Maps specifically?
Thus, any business that is serious about getting found in this day and age needs to utilize the power behind Google Maps marketing. This is why we at Ratynski Digital focus much of our local SEO time on getting our clients to rank both in Google Maps AND organic search results.
Google My Business (GMB) is a free platform provided by Google where local businesses can create a profile that is displayed across a variety of Google products.
In order to qualify for a GMB profile you must make in-person contact with your customers during your stated business hours. This may mean that you have a brick-and-mortar location where customers come to see you, or perhaps you travel to see your customers.
A GMB profile can display a variety of information about your business such as:
Business category or industry
Locations that you serve
Products and services
And much more depending on your industry!
The purpose of creating a Google My Business profile for your brand is to increase your rankings, traffic, and revenue.
Click on the blue button that says “Manage now” (be sure you are signed into your Google account).
Step 2: Create the listing and name your business profile.
Name your new listing and start adding all of your important business information.
It’s important to note that before you create your GMB profile, you should familiarize yourself with Google’s guidelines. And please, don’t create GMB spam. Not only will creating fake or spammy listings offer a horrible user experience for your potential customers, but it also puts you at risk for penalties and suspensions.
Step 3: Add as much relevant information about your business as possible.
Remember all those different types of information I mentioned above? This is when you get to add those to your profile. Take advantage of this free platform and try to include as much relevant information as you can. Keep in mind, you will want to avoid adding GMB categories that are NOT relevant to your business. You should also work to keep all of your Google My Business contact information accurate, and make sure that it matches your website.
Step 4: Verify your profile.
If this is a brand new account, you will need to verify the physical address with a postcard that will be sent via mail by Google.
If you are claiming a listing that already exists on Google Maps but is not verified, you may be able to verify the profile via email or phone.
Step 5: Pop the champagne — you did it! Easy peasy.
Now that we are all set up, let’s dive into Google Maps SEO.
Top Google Maps ranking factors
It’s important to have a firm understanding of Google Maps ranking factors before you can expect to see high-ranking results. Once you understand how it works, Google Maps marketing becomes as easy as operating your 7-year-old’s Easy Bake Oven.
Okay, maybe not that easy, but everything will be much more clear. For a deep dive, I recommend checking out Moz’s 2018 local ranking factors study, but I’ll cover the top factors here.
In a nutshell, there are eight ranking factors that contribute to ranking in Google Maps and the local pack:
Google’s local algorithm analyzes all of the signals listed above and ranks listings based on the following three areas:
Proximity: How close is the business to the searcher?
Prominence: How popular or authoritative is the business in the area?
Relevance: How closely does the listing match the searcher’s query?
Now that you have a handle on how the local algorithm works and its many ranking factors, let’s talk about specific ways to optimize your GMB profile to improve your ranking in Google Maps.
How to optimize for Google Maps
To kick off your optimizations, double-check that ALL of your business information is filled out in full and 100% accurate. This includes adding the many services that you might offer as well as descriptions of those services.
Sherri Bonelli wrote a comprehensive post on optimizing the information on your GMB listing. She did a great job covering that topic, so I am going to focus instead on three more factors that will make the biggest impact in the shortest amount of time:
1. Get more online reviews
Reviews continue to be one of the most important components for ranking in Google Maps, but the benefit of building more reviews is not purely for the purpose of SEO (not by a long shot).
Reviews offer a much better customer experience. They help to build up social proof, manage customer expectations, and they can sell your product or service before you even get in touch with your customer.
With 82% of consumers reading online reviews for local businesses, every business owner needs to understand the importance and power of reviews.
Google understands the customer’s desire to read reviews before they visit a store or trust a brand. They have heavily factored reviews into the local algorithm because of this (reviews from both Google and third parties).
Keep in mind that the “review factor” is not simply a measurement of who has the most reviews. That is certainly a piece of the puzzle, but Google also takes into consideration many other aspects like:
Whether a review has text along with the star rating or not.
The words chosen to write the review.
The overall star rating given to the business.
The consistency of reviews.
Overall review sentiment.
Business owners must regularly train themselves (and their team) to ask their customers for reviews. It’s important to set up systems and processes to make review generation a regular occurrence.
I also recommend setting up a process or purchasing a service that helps with review management. For example, Moz Local offers the ability to monitor the flow of reviews as well as comment and reply to those reviews as they come in (all in one cohesive dashboard). Always reply to your reviews!
Pro Tip: Don’t ask for a review too early. Too many businesses ask for a review for a product or service before their customer has had the opportunity to fully experience it (and actually benefit from it). Only after they have had the chance to solve their problem with your product or service should you ask for a review.
2. Build local links
Links are still one of the largest ranking factors in Google’s algorithm (both in organic ranking and in Google Maps). In fact, building local links is especially important if you want to rank in Google Maps.
It’s true that any link that isn’t marked as nofollow will pass “authority”, which will likely help with rankings. However, local links are especially important because they have a much higher probability of driving actual business.
One of the best ways to start building local links is to utilize your local relationships around town. Think about other businesses that you work closely with, organizations that you support, or even companies that might qualify as a “shoulder niche”.
For the highest success rate, start with businesses that you already have a relationship with or know well. You could offer to write or record a testimonial in exchange for a link, or perhaps you could co-create a piece of content that benefits both of your audiences.
Here’s exactly how to do it:
Create a list of niches that offer services that complement (but don’t compete with) your business.
Consider how you might be able to incorporate these other companies into your content outreach.
For example, a carpet cleaning business may decide to create a really helpful piece of content about cost-effective ways to increase a home’s value in a specific market. They might include advice about landscaping, painting, and of course, carpet cleaning. Before writing the content, they could reach out to a few local painting, landscaping, or home service businesses in the area and ask if those businesses would be willing to collaborate on the content and perhaps add a link to their resource pages.
This process can also work even if you don’t have an existing relationship with the business currently. Here’s a basic outreach template you can use:
My name is [YOUR NAME] from [BUSINESS]. We are actually business neighbors in a way, as we are located not too far from you in [CITY]. I often pass by [THEIR BUSINESS] on my way to [LOCAL LANDMARK/DESTINATION].
I thought it was finally time to reach out and say hello, and let you know that if there’s ever anything you or your team need, please let us know.
Also, I am working on writing an article about [INSERT BLOG TOPIC HERE]. Since our businesses both serve a similar audience and complement each other nicely, I was wondering if you’d like to be featured in the article?
I am going to include a section about [TOPIC ABOUT THEIR INDUSTRY], and would like to use a sentence or two with your advice coming from the [THEIR INDUSTRY]. It might even make a great addition to the resource page on your website. Please let me know if this is something you’d be interested in.
Either way, thanks for your time, and great to meet you!
Pro Tip: If you are working to build links on a budget, it may help to get approval for the link before you invest the time and resources in content collaborations.
3. Fight off GMB spam in the map
This final optimization is less of an “optimization” and more of a tactic. This tactic is powerful because unlike most GMB optimizations, the goal is not to do something better than your competition, it’s to remove the competitors that are trying to cheat their way to higher rankings.
Just how powerful is this approach? Very.
Let’s take a look at this Google Maps SERP as an example:
At first glance, all of these listings seem legitimate. However, after about two minutes of investigating, you can quickly discern that a few are fake. One of them doesn’t have a website of its own and instead links to NerdWallet, some are using fake reviews, and some are even using fake addresses (one is using the DMV’s address).
Now imagine you are DCAP Insurance (a real company) and you are trying to rank higher in Google Maps. If you successfully remove the top four spam listings, you have now jumped to the #1 position without making any additional optimizations.
Starting to see the logic behind this approach?
Unfortunately, Google Maps still has quite a bit of spam throughout its ecosystem. In fact, out of the top 20 spots in the example above, I was able to find seven fake listings and three more that were extremely questionable. This approach can work whether a listing is using an improper business name, keyword stuffing, or is a fake location entirely.
How to remove or edit Google My Business spam
Create a detailed record of each GMB listing you find and what edits are necessary. This will help later on if the changes keep getting reverted back.
Next, head over to Google Maps, find the listing, and click on “Suggest an Edit”.
Depending on the issue at hand you can either select:
“Change name or other details”
“Remove this place”
If you’re trying to remove keyword stuffing from a listing’s business name, you simply select “change name or other details” and make the necessary edits.
If you’re dealing with spam of some sort, you will need to select “Remove this place” and then select the exact issue from the drop-down list.
When suggesting an edit doesn’t get the job done
Unfortunately, submitting an edit about spam doesn’t always cut it. When this happens, the best way to handle these spam listings is to use Google’s Business Redressal Complaint Form.
When using the redressal form, you’ll need to provide evidence before the required action takes place. For more information, be sure to check out this helpful resource.
Google Maps SEO checklist
At this point, you likely understand the importance of filling out your Google My Business profile to completion. But that’s not all it takes to rank in Google Maps — ranking requires comprehensive optimization on a variety of levels, and there is rarely just one magic fix.
To help you cover all your bases, I created this Google Maps SEO Checklist that will help you pinpoint specific areas for improvement.
Tracking results and GMB analytics
Tracking your results is crucial in every aspect of SEO and online marketing, and Google My Business is no different. Most of your profile analytics will be found in your Google My Business account.
You can find this information by logging into your account and selecting “insights” on the far left side. Here is an example of what that looks like for Roadside Dental Marketing’s Google My Business account.
From there, you should be able to see things like:
Which specific search queries triggered your listing.
How often your listing appeared in Google search.
How often your listing appeared in Google Maps.
What kind of customer actions were taken (e.g. visiting your website, requesting directions, phone calls).
Where customers are requesting business information from.
Which days of the week get the most calls.
How many photos have been viewed, and how that number compares to your competition.
The one thing that GMB analytics does NOT offer is any sort of rank tracking. Thankfully, the brilliant people at Moz are working on Local Market Analytics (beta). LMA not only offers rank tracking on a local level, but it also contains a plethora of competitor information within a target market.
While covering the GMB basics is fine and dandy, comprehensive optimizations coupled with making ongoing improvements is what truly separates the wheat from the chaff. Regularly test different optimizations within your industry and market and closely monitor your results. If you’re ever in doubt, do whatever is in the best interest of your customer. They must always come first.
By investing in Google Maps marketing, you’ll be able to drive local leads to your business on a consistent basis. If you find yourself with any questions, let me know in the comments below or on Twitter and I will happily answer them!
While Google’s mission has always been to surface high-quality content, over the past few years the company has worked especially hard to ensure that its search results are also consistently accurate, credible, and trustworthy.
Reducing false and misleading information has been a top priority for Google since concerns over misinformation surfaced during the 2016 US presidential election. The search giant is investing huge sums of money and brain power into organizing the ever-increasing amounts of content on the web in a way that prioritizes accuracy and credibility.
In a 30-page whitepaper published last year, Google delineates specifically how it fights against bad actors and misinformation across Google Search, News, Youtube, Ads, and other Google products.
In this whitepaper, Google explains how Knowledge Panels — a common organic search feature — are part of its initiative to show “context and diversity of perspectives to form their own views.” With Knowledge Panel results, Google provides answers to queries with content displayed directly in its organic search results (often without including a link to a corresponding organic result), potentially eliminating the need for users to click through to a website to find an answer to their query. While this feature benefits users by answering their questions even more quickly, it brings with it the danger of providing quick answers that might be misleading or incorrect.
Another feature with this issue is Featured Snippets, where Google pulls website content directly into the search results. Google maintains specific policies for Featured Snippets, prohibiting the display of content that is sexually explicit, hateful, violent, dangerous, or in violation of expert consensus on civic, medical, scientific, or historical topics. However, this doesn’t mean the content included in Featured Snippets is always entirely accurate.
According to data pulled by Dr. Pete Meyers, based on a sample set of 10,000 keywords, Google has increased the frequency with which it displays Featured Snippets as part of the search results. In the beginning of 2018, Google displayed Featured Snippets in approximately 12% of search results; in early 2020, that number hovers around 16%.
Google has also rolled out several core algorithm updates in the past two years, with the stated goal of “delivering on [their] mission to present relevant and authoritative content to searchers.” What makes these recent algorithm updates particularly interesting is how much E-A-T (expertise, authoritativeness, and trustworthiness) appears to be playing a role in website performance, particularly for YMYL (your money, your life) websites.
As a result of Google’s dedication to combating misinformation and fake news, we could reasonably expect searchers to agree that Google has improved in its ability to surface credible and trusted content. But does the average searcher actually feel that way? At Path Interactive, we conducted a survey to find out how users feel about the information they encounter in Google’s organic results.
About our survey respondents and methodology
Out of 1,100 respondents, 70% live in the United States, 21% in India, and 5% in Europe. 63% of respondents are between the ages of 18 and 35, and 17% are over the age of 46. All respondent data is self-reported.
For all questions involving specific search results or types of SERP features, respondents were provided with screenshots of those features. For questions related to levels of trustworthiness or the extent to which the respondent agreed with the statement, respondents were presented with answers on a scale of 1-5.
Trustworthiness in the medical, political, financial, and legal categories
Given how much fluctuation we’ve seen in the YMYL category of Google with recent algorithm updates, we thought it would be interesting to ask respondents how much they trust the medical, political, financial, and legal information they find on Google.
We started by asking respondents about the extent to which they have made important financial, legal, or medical decisions based on information they found in organic search. The majority (51%) of respondents indicated that they “very frequently” or “often” make important financial decisions based on Google information, while 39% make important legal decisions, and 46% make important medical decisions. Only 10-13% of respondents indicated that they never make these types of important life decisions based on the information they’ve found on Google.
As it relates to medical searches, 72% of users agree or strongly agree that Google has improved at showing accurate medical results over time.
Breaking down these responses by age, a few interesting patterns emerge:
The youngest searchers (ages 18-25) are 94% more likely than the oldest searchers (65+) to strongly believe that Google’s medical results have improved over time.
75% of the youngest searchers (ages 18-25) agree or strongly agree that Google has improved in showing accurate medical searches over time, whereas only 54% of the oldest searchers (65+) feel the same way.
Searchers ages 46-64 are the most likely to disagree that Google’s medical results are improving over time.
Next, we wanted to know if Google’s emphasis on surfacing medical content from trusted medical publications — such as WebMD and the Mayo Clinic — is resonating with its users. One outcome of recent core algorithm updates is that Google’s algorithms appear to be deprioritizing content that contradicts scientific and medical consensus (consistently described as a negative quality indicator throughout their Search Quality Guidelines).
The majority (66%) of respondents agree that it is very important to them that Google surfaces content from highly trusted medical websites. However, 14% indicated they would rather not see these results, and another 14% indicated they’d rather see more diverse results, such as content from natural medicine websites. These numbers suggest that more than a quarter of respondents may be unsatisfied with Google’s current health initiatives aimed at surfacing medical content from a set of acclaimed partners who support the scientific consensus.
We asked survey respondents about Symptom Cards, in which information related to medical symptoms or specific medical conditions is surfaced directly within the search results.
Our question aimed to gather how much searchers felt the content within Symptom Cards can be trusted.
The vast majority (76%) of respondents indicated they trust or strongly trust the content within Symptom Cards.
When looking at the responses by age, younger searchers once again reveal that they are much more likely than older searchers to strongly trust the medical content found within Google. In fact, the youngest bracket of searchers (ages 18-25) are 138% more likely than the oldest searchers (65+) to strongly trust the medical content found in Symptom Cards.
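Figures like "138% more likely" are relative differences between two groups' response shares. As a quick sketch of the arithmetic (the shares below are hypothetical placeholders, not the survey's actual data):

```python
# Sketch: deriving a "X% more likely" figure from two groups' response shares.
# The example shares are invented for illustration only.

def pct_more_likely(share_a: float, share_b: float) -> float:
    """How much more likely (in %) group A is than group B to give a response."""
    return (share_a - share_b) / share_b * 100

# e.g. if 38% of 18-25s and 16% of 65+ respondents chose "strongly trust":
print(round(pct_more_likely(0.38, 0.16)))  # prints 138
```

Note that the figure is relative, not absolute: a jump from 16% to 38% is a 22-point absolute difference but a 138% relative one.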
News and political searches
The majority of respondents (61%) agree or strongly agree that Google has improved at showing high-quality, trustworthy news and political content over time. Only 13% disagree or strongly disagree with this statement.
Breaking the same question down by age reveals interesting trends:
The majority (67%) of the youngest searchers (ages 18-25) agree that the quality of Google’s news and political content has improved over time, whereas the majority (61%) of the oldest age group (65+) only somewhat agrees or disagrees.
The youngest searchers (ages 18-25) are 250% more likely than the oldest searchers to strongly agree that the quality of news and political content on Google is improving over time.
Given Google’s emphasis on combating misinformation in its search results, we also wanted to ask respondents about the extent to which they feel they still encounter dangerous or highly untrustworthy information on Google.
Interestingly, the vast majority of respondents (70%) feel that they have encountered misinformation on Google at least sometimes, although 29% indicate they rarely or never see misinformation in the results.
Segmenting the responses by age groups reveals a clear pattern that the older the searcher, the more likely they are to indicate that they have seen misinformation in Google’s search results. In fact, the oldest searchers (65+) are 138% more likely than the youngest searchers (18-25) to say they’ve encountered misinformation on Google either often or very frequently.
Throughout the responses to all questions related to YMYL topics such as health, politics, and news, a consistent pattern emerged that the youngest searchers appear to have more trust in the content Google displays for these queries, and that older searchers are more skeptical.
This aligns with our findings from a similar survey we conducted last year, which found that younger searchers were more likely to take much of the content displayed directly in the SERP at face value, whereas older searchers were more likely to browse deeper into the organic results to find answers to their queries.
This information is alarming, especially given another question we posed asking about the extent to which searchers believe the information they find on Google influences their political opinions and outlook on the world.
The question revealed some interesting trends related to the oldest searchers: according to the results, the oldest searchers (65+) are 450% more likely than the youngest searchers to strongly disagree that information they find on Google influences their worldview.
However, the oldest searchers are also most likely to agree with this statement; 11% of respondents ages 65+ strongly agree that Google information influences their worldview. On both ends of the spectrum, the oldest searchers appear to hold stronger opinions about the extent to which Google influences their political opinions and outlook than respondents from other age brackets.
Featured Snippets and the Knowledge Graph
We also wanted to understand the extent to which respondents found the content contained within Featured Snippets to be trustworthy, and to segment those responses by age brackets. As with the other scale-based questions, respondents were asked to indicate how much they trusted these features on a scale of 1-5 (Likert scale).
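To make the segmentation concrete, here is a minimal sketch of the kind of tabulation behind these comparisons: Likert responses (1-5, where 5 = "very trustworthy") grouped by age bracket, with the share of top ratings compared between brackets. All response data here is invented for illustration, not the survey's actual dataset.

```python
# Hypothetical Likert responses: (age_bracket, score on a 1-5 trust scale).
responses = [
    ("18-25", 5), ("18-25", 5), ("18-25", 4), ("18-25", 3),
    ("65+", 5), ("65+", 3), ("65+", 2), ("65+", 4),
]

def top_rating_share(rows, bracket, top=5):
    """Fraction of a bracket's respondents who gave the top rating."""
    scores = [score for age, score in rows if age == bracket]
    return sum(1 for score in scores if score == top) / len(scores)

young = top_rating_share(responses, "18-25")  # 0.5
old = top_rating_share(responses, "65+")      # 0.25
print(f"18-25 are {(young - old) / old:.0%} more likely to rate 5/5")
```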
According to the results, the youngest searchers (ages 18-25) are 100% more likely than the oldest searchers (ages 65+) to find the content within Featured Snippets to be very trustworthy. This aligns with a similar discovery we found in our survey from last year: “The youngest searchers (13–18) are 220 percent more likely than the oldest searchers (70–100) to consider their question answered without clicking on the snippet (or any) result.”
For Knowledge Graph results, the results are less conclusive when segmented by age. 95% of respondents across all age groups find the Knowledge Panel results to be at least “trustworthy.”
Conclusion: Young users trust search results more than older users
In general, the majority of survey respondents appear to trust the information they find on Google — both in terms of the results themselves, as well as the content they find within SERP features such as the Knowledge Panel and Featured Snippets. However, there still appears to be a small subset of searchers who are dissatisfied with Google’s results. This subset consists of mostly older searchers who appear to be more skeptical about taking Google’s information at face value, especially for YMYL queries.
Across almost all survey questions, there is a clear pattern that the youngest searchers tend to trust the information they find on Google more so than the older respondents. This aligns with a similar survey we conducted last year, which indicated that younger searchers were more likely to accept the content in Featured Snippets and Knowledge Panels without needing to click on additional results on Google.
It is unclear whether younger searchers trust information from Google more because the information itself has improved, or because they are generally more trusting of information they find online. These results may also be due to older searchers not having grown up with the ability to rely on internet search engines to answer their questions. Either way, the results raise an interesting question about the future of information online: will searchers become less skeptical of online information over time?
How should I get listed in Google My Business if I’ve got multiple businesses at the same address? How many listings am I eligible for if I’m legitimately running more than one business at my location? What determines eligibility, and what penalties might I incur if I make a mistake? How should I name my businesses at the same address?
The FAQs surrounding this single, big topic fill local SEO forums across the web, year after year.
Today, let’s quickly tackle the commonest FAQs that local business owners and marketers raise related to this scenario, and if you have further questions, please ask in the comments!
Q: I have more than one business at the same address. Can I have more than one Google My Business listing?
A: If you are legitimately operating multiple, legally distinct businesses, you can typically create a Google My Business listing for each of them. It’s not at all uncommon for more than one business to be located at a shared address. However, keep reading for further details and provisos.
Q: How do I know if my multiple businesses at the same address are distinct enough to be eligible for separate Google My Business listings?
A: If each brick-and-mortar business you operate is separately registered with appropriate state and federal agencies, has a unique tax ID with which you file separate taxes, meets face-to-face with customers, and has a unique phone number, then it’s typically eligible for a distinct GMB listing. However, keep reading for more information.
Q: Can service area businesses list multiple businesses at the same address?
A: Google has historically treated SABs differently than brick-and-mortar businesses. While no official guideline forbids listing multiple SABs — like plumbers and locksmiths — at the same location, it’s not considered an industry best practice to do so. Google appears to be more active in issuing hard suspensions to SABs in this scenario, even if the businesses are legitimate and distinct. Because of this, it’s better strategy not to co-locate SABs.
Q: What would make me ineligible for more than one Google My Business listing at the same address?
A: If your businesses aren’t registered as legally distinct entities or if you lack unique phone numbers for them, you are ineligible to list them separately. Also, if your businesses are simply representative of different product lines or services you offer under the umbrella of a single business — like a handyman who repairs both water heaters and air conditioners — they aren’t eligible for separate listings. Additionally, do not list multiple businesses at PO boxes, virtual offices, mailboxes at remote locations, or at locations you don’t have the authority to represent.
Q: Will I be penalized if I list multiple ineligible businesses at the same address?
A: Yes, you could be. Google could issue a hard suspension on one or more of your ineligible listings at any time. A hard suspension means that Google has removed your listing and its associated reviews.
Q: Will suite numbers help me convince Google I actually have two locations so that I can have more than one GMB listing?
A: No. Google doesn’t pay attention to suite numbers, whether legitimate or created fictitiously. Don’t waste time attempting to make a single location appear like multiple locations by assigning different suite numbers to the entities in hopes of qualifying for multiple listings.
Q: Can I list my business at a co-working space, even though there are multiple businesses at the same address?
A: If your business has a unique, direct phone number answered by you and you are staffing the co-working space with your own staff at your listed hours, yes, you are typically eligible for a Google My Business listing. However, if any of the other businesses at the location share your categories or are competing for the same search terms, it is likely that you or your competitors will be filtered out of Google’s mapping product due to the shared elements.
Q: How many GMB listings can I have if there are multiple seasonal businesses at my address?
A: If your property hosts an organic fruit stand in summer and a Christmas tree farm in the winter, you need to closely follow Google’s requirements for seasonal businesses. In order for each entity to qualify for a listing, it must have year-round signage and set and then remove its GMB hours at the opening and closing of its season. Each entity should have a distinct name, phone number and Google categories.
Q: How should I name my multiple businesses at the same address?
A: To decrease the risk of filtering or penalties, co-located businesses must pay meticulous attention to allowed naming conventions. Questions surrounding this typically fall into five categories:
If one business is contained inside another, as in the case of a McDonald’s inside a Walmart, the Google My Business names should be “McDonald’s” and “Walmart,” not “McDonald’s in Walmart.”
If co-located brands like a Taco Bell and a Dunkin’ Donuts share the same location, they should not combine their brand names for the listing. They should either create a single listing with just one of the brand names, or, if the brands operate independently, a unique listing for each separate brand.
If multiple listings actually reflect eligible departments within a business — like the sales and parts departments of a Chevrolet dealership — then it’s correct to name the listings Chevrolet Sales Department and Chevrolet Parts Department. No penalties should result from the shared branding elements, so long as the different departments have some distinct words in their names, distinct phone numbers and distinct GMB categories.
If a brand sells another brand’s products — like Big-O selling Firestone Tires — don’t include the branding of the product being sold in the GMB business name. However, Google stipulates that if the business location is an authorized and fully dedicated seller of the branded product or service (sometimes known as a “franchisee”), you may use the underlying brand name when creating the listing, such as “TCC Verizon Wireless Premium Retailer.”
If an owner is starting out with several new businesses at the same location, it would be a best practice to keep their names distinct. For example, a person operating a pottery studio and a pet grooming station out of the same building can lessen the chance of filters, penalties, and other problems by avoiding naming conventions like “Rainbow Pottery” and “Rainbow Pet Grooming” at the same location.
Q: Can I create separate listings for classes, meetings, or events that share a location?
A: Unfortunately, the guidelines on this topic lack definition. Google says not to create such listings for any location you don’t own or have the authority to represent. But even if you do own the building, the guidelines can lead to confusion. For example, a college can create separate listings for different departments on campus, but should not create a listing for every class being offered, even if the owners of the college do have authority to represent it.
Another example would be a yoga instructor who teaches at three different locations. If the building owners give them permission to list themselves at the locations, along with other instructors, the guidelines appear to permit creating multiple listings of this kind. However, such activity could end up being perceived as spam, could be filtered out because of shared elements with other yoga classes at a location, and could end up competing with the building’s own listing.
Because the guidelines are not terribly clear, there is some leeway in this regard. Use your discretion in creating such listings and view them as experimental in case Google should remove them at some point.
Q: How do I set GMB hours for co-located business features that serve different functions?
A: A limited number of business models have to worry about this issue of having two sets of hours for specific features of a business that exist on the same premises but serve unique purposes. For example, a gas station can have a convenience market that is open 6 AM to 10 PM, but pumps that operate 24 hours a day. Google sums up the shortlist for such scenarios this way, which I’ll quote verbatim:
Banks: Use lobby hours if possible. Otherwise, use drive-through hours. An ATM attached to a bank can use its own separate listing with its own, different hours.
Car dealerships: Use car sales hours. If hours for new car sales and pre-owned car sales differ, use the new sales hours.
Gas stations: Use the hours for your gas pumps.
Restaurants: Use the hours when diners can sit down and dine in your restaurant. Otherwise, use takeout hours. If neither of those is possible, use drive-through hours, or, as a last resort, delivery hours.
Storage facilities: Use office hours. Otherwise, use front gate hours.
Q: Could the details of my Google listing get mixed up with another business at my location?
A: Not long ago, local SEO blogs frequently documented cases of listing “conflation”. Details like similar or shared names, addresses or phone numbers could cause Google to merge two listings together, resulting in strange outcomes like the reviews for one company appearing on the listing of another. This buggy mayhem, thankfully, has died down to the extent that I haven’t seen a report of listing conflation in some years. However, it’s good to remember that errors like these made it clear that each business you operate should always have its own phone number, naming should be as unique as possible, and categories should always be carefully evaluated.
Q: Why is only one of my multiple businesses at the same location ranking in Google’s local results?
A: The commonest cause of this is that Google is filtering out all but one of your businesses from ranking because of listing element similarity. If you attempt to create multiple listings for businesses that share Google categories or are competing for the same keyword phrases at the same address, Google’s filters will typically make all but one of the entities invisible at the automatic zoom level of their mapping product. For this reason, creating multiple GMB listings for businesses that share categories or industries is not a best practice and should be avoided.
Q: My GMB listing is being filtered due to co-location. What should I do?
A: This topic has come to the fore especially since Google’s rollout of the Possum filter on Sept 1, 2016. Businesses at the same address (or even in the same neighborhood) that share a category and are competing for the same search phrases often have the disappointment of discovering that their GMB listing appears to be missing from the map while a co-located or nearby competitor ranks highly. Google’s effort to deliver diversity causes them to filter out companies that they deem too similar when they’re in close proximity to one another.
If you find yourself currently in a scenario where you happen to be sharing a building with a competitor, and you’ve been puzzled as to why you seem invisible on Google’s maps, zoom in on the map and see if your listing suddenly appears. If it does, chances are, you’re experiencing filtering.
If this is your predicament, you have a few options for addressing it. As a measure of last resort, you could relocate your company to a part of town where you don’t have to share a location and have no nearby competitors, but this would be an extreme solution. More practically speaking, you will need to audit your competitor, comparing their metrics to yours to discover why Google sees them as the stronger search result. From the results of your audit, you can create a strategy for surpassing your opponent so that Google decides it’s your business that deserves not to be filtered out.
There’s nothing wrong with multiple businesses sharing an address. Google’s local index is filled with businesses in this exact situation ranking just fine without fear of penalization. But the key to success and safety in this scenario is definitely in the details.
Assessing eligibility, accurately and honestly representing your brand, adhering to guidelines and best practices, and working hard to beat the filters will stand you in good stead.
If you own or market a business location that makes a real-world community more serviceable, diverse, and strong, I’m on your side.
I love interesting towns and cities, with a wide array of useful goods and services. Nothing in my career satisfies me more than advising any brand that’s determined to improve life quality in some spot on the map. It does my heart good to see it, but here’s my completely unsentimental take on the challenges you face:
The Internet, and Google’s local platforms in particular, is a complete mess.
Google is the biggest house on the local block; you can’t ignore it. Yet, the entries into the platform are poorly lit, the open-source concept is cluttered with spam, and growing litigation makes one wonder if there are bats in the belfry.
Google presents both risk and tremendous opportunity for local businesses and their marketers. Succeeding in 2020 means becoming a clear-eyed surveyor of any structural issues as well as seeing the “good bones” potential, so that you can flip dilapidation into dollars. And something beyond dollars, too: civic satisfaction.
Grab your tools and get your teammates and clients together to build local success in the new year by sharing my 3-level plan and 4-quarter strategy.
Residents, new neighbors, and travelers seeking what you offer will almost certainly find something about your company online, whether it’s a stray mention on social media, an unclaimed local business listing generated by a platform or the public, or a full set of website pages and claimed listings you’ve actively published.
Right now, running the most successful local business possible means acquiring the largest share you can of those estimated 1 trillion annual local searches. How do you do this?
By feeding Google:
Website content about your business location, products, services, and attributes
Corroborating info about your company on other websites
Local business listing content
Social media content
Remember, without your content and the content of others, Google does not exist. Local business owners can often feel uncomfortably dependent on Google, but it’s really Google who is dependent on them.
Whether the business you’re marketing is small or large, declare 2020 the year you go to the drafting board to render a clear blueprint for a content architecture that spans your entire neighborhood of the Internet, including your website and relevant third-party sites, platforms, and apps.
I recommend organizing your plan like this, making use of the links I’m including:
Begin with a rock-solid foundation of business information on your website. Tell customers everything they could want to know to choose and transact with your business. Cover every location, service, product, and desirable attribute of your company. There’s no chance you won’t have enough to write about when you take into account everything your customers ask you on a daily basis, plus everything you believe makes your company the best choice in the local market. Be sure the site loads fast, is mobile-friendly, and is as technically error-free as possible.
Build out your listings (aka structured citations) on the major platforms. Automate the work of both developing and monitoring them for sentiment and change via a product like Moz Local.
Monitor and respond to all reviews as quickly as possible on all platforms. These equal your online reputation and are, perhaps, the most important content about your business on the Internet. Know that reviews are a two-way conversation and learn to inspire customers to edit negative reviews. Moz Local automates review monitoring and facilitates easy responses. If you need help earning reviews, check out Alpine Software Group’s two good products: GatherUp and Grade.Us.
Audit your competition. In competitive markets, come check out our beta of Local Market Analytics for a multi-sampled understanding of who your competitors actually are for each location of your business, depending on searcher locale.
Once you’ve found your competitors, audit them to understand the:
quality, authority and rate of ongoing publication you need to surpass
number and quality of Google posts, videos, products, and other content you need to publish
social engagement you need to create.
As to the substance of your content, focus directly on your customers’ needs. Local Market Analytics is breaking ground in delivering actual local keyword volumes, and the end point of all of your research, whether via keyword tools, consumer surveys, or years of business experience, should be content that acts as customer service, turning seekers into shoppers.
Use any leftover time to sketch in the finer details. For example, I’m less excited about schema for 2020 than I was in 2019 because of Google removing some of the benefits of review schema. Local business schema is still a good idea, though, if you have time for it. Meanwhile, pursuing relevant featured snippets could certainly be smart in the new year. I’d go strong on video this year, particularly YouTube, if there’s applicability and demand in your market.
The customer is the focus of everything you publish. Google is simply the conduit. Your content efforts may need to be modest or major to win the greatest possible share of the searches that matter to you. It depends entirely on the level of competition in your markets. Find that level, know your customers, and commit to feeding Google a steady, balanced diet of what they say they want so that it can be conveyed to the people you want to serve.
Let’s keep it real: ethical local companies which pride themselves on playing fair have good reason to be dubious about doing business with Google. Once you’ve put in the effort to feed Google all the right info to begin competing for rankings, you may well find yourself having to do online battle on an ongoing basis.
There are two fronts on which many people end up grappling with Google:
Problematic aspects within products
Litigation and protests against the brand.
Let’s break these down to prepare you:
Google has taken on the scale of a public utility — one that’s replaced most of North America’s former reliance on telephone directories and directory assistance numbers.
When you are marketing a local business, there’s a strong chance you will face one or more of the following issues while attempting to compete in Google’s local products:
Being outranked by businesses violating Google’s own guidelines with practices such as keyword-stuffed business titles and creating listings to represent non-existent locations or lead-gen companies. (Example)
Being the target of listing hijacking in which another company overtakes some aspect of your listing to populate it with their own details. (Example)
Being the target of a reputation attack by competitors or members of the public posting fake negative reviews of your business. (Example)
Being the target of negative images uploaded to your listing by competitors or the public. (Example)
Having Google display third-party lead-gen information on your listings, driving business away from you to others. (Example)
Having Google randomly experiment with local features with direct negative impacts on you, such as booking functions that reserve tables for your patrons without informing your business. (Example)
Being unable to access adequately trained Google staff or achieve timely resolution when things go wrong. (Example)
Honest local business owners don’t operate this way. They don’t make money off of fooling the public, or maliciously attack neighboring shops, or give the cold shoulder to people in trouble. Only Google’s underregulated monopoly status has allowed them to stay in business while conducting their affairs this way.
Brilliant people work for Google and some of their innovations are truly visionary. But the Google brand, as a whole, can be troubling to anyone firmly tied to the idea of ethical business practices. I would best describe the future of Google, in its present underregulated state of monopoly, as uncertain.
In their very short history, Google has been:
I can’t predict where all this is headed. What I do know is that nearly every local business I’ve ever consulted with has been overwhelmingly reliant on Google for profits. Whether you personally favor strong regulation or not, I recommend that every local business owner and marketer keep apprised of the increasing calls by governing bodies, organizations, and even the company’s own staff to break Google up, tax it, end contracts on the basis of human rights, and prosecute it over privacy, antitrust, and a host of other concerns.
Pick your battles
With Google so deeply embedded in your company’s online visibility, traffic, reputation and transactions, concerns with the brand and products don’t exist in some far-off place; they are right on your own doorstep. Here’s how to fight well:
1. Fight the spam
To face off with Google’s local spam, earn/defend the rankings your business needs, and help clean polluted SERPs up for the communities you serve, here are my best links for you:
2. Stay informed
If you’re ready to move beyond your local premises to the larger, ongoing ethical debate surrounding Google, here are my best links for you:
Whether your degree of engagement goes no further than local business listings or extends to your community, state, nation, or the world, I recommend increased awareness of the whole picture of Google in 2020. Education is power.
You’ve fed Google. You’ve fought Google. Now, I want you to flip this whole scenario to your advantage.
My 2020 local SEO blueprint has you working hard for every customer you win from the Internet. So far, the ball has been almost entirely in Google’s court, but when all of this effort culminates in a face-to-face meeting with another human being, we are finally at your party under your roof, where you have all the control. This is where you turn Internet-driven customers into in-store keepers.
I encourage you to make 2020 the year you draft a strategy for making a larger portion of your sales as Google-independent as possible, flipping their risky edifice into su casa, built of sturdy bricks like community, pride, service, and loyalty.
How can you do this? Here’s a four-quarter plan you can customize to fit your exact business scenario:
The foundation of all business success is giving the customer exactly what they want. Hoping and guessing are no substitute for a survey of your actual customers.
If you already have an email database, great. If not, you could start collecting one in Q1 and run your survey at the end of the quarter when you have enough addresses. Alternatively, you could ask each customer if they would kindly take a very short printed survey while you ring up their purchase.
Imagine you’re marketing an independent bookstore. Whittle the survey down to just the data points you most want to gather from customers to make business decisions.
Have pens ready and a drop box for each customer to deposit their card. Make it as convenient and anonymous as possible, for the customer’s comfort.
In this survey and listening phase of the new year, I also recommend that you:
Spend more time as the business owner speaking directly to your customers, really listening to their needs and complaints and then logging them in a spreadsheet. Speak with determination to discover how your business could help each customer more.
Have all phone staff log the questions/requests/complaints they receive.
Have all floor/field staff log the questions/requests/complaints they receive.
Audit your entire online review corpus to identify dominant sentiment, both positive and negative.
If the business you’re marketing is large and competitive, now is the time to go in for a full-fledged consumer analysis project with mobile surveys, customer personae, etc.
End of Q1 Goal: Know exactly what customers want so that they’ll come to us for repeat business without any reliance on Google.
In this quarter, you’ll implement as many of the requests you’ve gleaned from Q1 as feasible. You’ll have put solutions in place to rectify any complaint themes, and will have upped your game wherever customers have called for it.
In addition to the fine details of your business, large or small, life as a local SEO has taught me that these six elements are basic requirements for local business longevity:
A crystal-clear USP
Adequate, well-trained, personable staff
An in-demand inventory of products/services
Accessibility for complaint resolution
Cleanliness/orderliness of premises/services
The lack of any of these six essentials results in negative experiences that can either cause the business to shed silent customers in person or erode online reputation to the point that the brand begins to fail.
With the bare minimums of customers’ requirements met, Q2 is where we get to the fun part. This is where you take your basic USP and add your special flourish to it that makes your brand unique, memorable, and desirable within the community you serve.
A short tale of two yarn shops in my neck of the woods: At shop A, the premises are dark and dusty. Customer projects are on display, but aren’t very inspiring. Staff sits at a table knitting, and doesn’t get up when customers enter. At shop B, the lighting and organization are inviting, displayed projects are mouthwatering, and though the staff here also sits at a table knitting, they leap up to meet, guide, and serve. Guess which shop now knows me by name? Guess which shop has staff so friendly that they have lent me their own knitting needles for a tough project? Guess which shop I gave a five-star review to? Guess where I’ve spent more money than I really should?
This quarter, seek vision for what going above-and-beyond would look like to your customers. What would bring them in again and again for years to come? Keep in mind that computers are machines, but you and your staff are people serving people. Harness human connection.
End of Q2 Goal: Have implemented customers’ basic requests and gone beyond them to provide delightful human experiences Google cannot replicate.
Q3: Participate, educate, appreciate
Now you know your customers, are meeting their specified needs, and doing your best to become one of their favorite businesses. It’s time to walk out your front door into the greater community to see where you can make common cause with a neighborhood, town, or city, as a whole.
2020 is the year you become a joiner. Analyze all of the following sources at a local level:
Print and TV news
School newsletters and papers
Place of worship newsletters and bulletins
Local business organization newsletters
Any form of publication surrounding charity, non-profits, activism, and government
Create a list of the things your community worries about, cares about, and aspires to. For example, a city near me became deeply involved in a battle over putting an industrial plant in a wetland. Another town is fundraising for a no-kill animal shelter and a walk for Alzheimer’s. Another is hosting interfaith dinners between Christians and Muslims.
Pick the efforts that feel best to you and show up, donate, host, speak, sponsor, and support in any way you can. Build real relationships so that the customers coming through your door aren’t just the ones you sell to, but the ones you’ve manned a booth with on the 4th of July, attended a workshop with, or cheered with at their children’s soccer match. This is how community is made.
Once you’re participating in community life, it’s time to educate your customers about how supporting your business makes life better in the place they live (get a bunch of good stats on this here). Take the very best things that you do and promote awareness of them face-to-face with every person you transact with.
For my fictitious bookseller client, just 10 minutes spent on Canva (you have to try Canva!) helped me whip together this free flyer I could give to every customer, highlighting stats about how supporting independent businesses improves communities:
If you’re marketing a larger enterprise, a flyer like this could focus on green practices you’re implementing at scale, philanthropic endeavors, and positive community involvement.
Finally, with the holiday season fast approaching in the coming quarter, this is the time to let customers know how much you appreciate their business. Recently, I wrote about businesses turning kindness into a form of local currency. Brands are out there delivering surprise flowers and birthday cakes to customers, picking them up when they’re stranded on roadsides, washing town signage, and replacing “you will be towed” plaques with ones that read “you’re welcome to park here.” Loyalty programs, coupons, discounts, sales, free events, parties, freebies, and fun are all at your disposal to say “Thank you, please come again!” to your customers.
End of Q3 Goal: Have integrated more deeply into community life, motivated customers to choose our business for aspirational reasons beyond sales, and have offered memorable acts of gratitude for their business, completely independent of Google.
Q4: Share customers and sell
Every year, local consumer surveys indicate that 80–90% of people trust online reviews as much as they trust recommendations from friends and family. But I’ve yet to see a survey poll how much people trust recommendations they receive from trustworthy business owners.
You spent all of Q3 becoming a true ally to your community, getting personally involved in the struggles and dreams of the people you serve. At this point, if you’ve done a good job, the people who make up your brand have come closer to deserving the word “friend” from customers. As we move into Q4, it’s time to deepen alliances — this time with related local businesses.
In the classic movie Miracle on 34th Street, the owners of Macy’s and Gimbel’s begin sending shoppers to one another when either business lacks what the customer wants. They even create catalogues of their competitors’ inventory to assist with these referrals. In Q3, I’m hoping you joined a local business alliance that’s begun to acquaint you with other brands that feature goods/services that relate to yours so that you can begin dedicated outreach.
Q4, with Black Friday and Small Business Saturday, is traditionally the quarter in which local businesses expect to get out of the red. But how many more wedding cakes would you sell if all the caterers in town were referring to you? How many more tires would you vend if the muffler shops sent all their customers your way? How many more therapeutic massages might you book if every holistic medical center in your city confidently gave out your name?
Formalize B2B customer referrals in this quarter in seven easy steps:
Create a spreadsheet headed with your contact information and an itemized list of the main goods, services, and brands you sell. Include specialties of your business. Create additional rows to be filled out with the information of other businesses.
Create a list of every local business that could tie in with yours in any way for a customer’s needs.
Invite the owners or qualified reps of each business on your list to a meeting at a neutral location, like a community center or restaurant.
Bring your spreadsheet to the meeting.
Discuss with your guests how a commitment to sharing customers will benefit all of you.
If others commit, have them fill out their row of the spreadsheet. Share print and digital copies with all participants.
Whenever a customer asks for something you don’t offer, refer to the spreadsheet to make a recommendation. Encourage your colleagues to do likewise, and to train staff to use the spreadsheet to increase customer sharing and satisfaction.
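The shared referral sheet in the steps above is just structured data, so it can live in any spreadsheet tool or even a plain CSV. As a minimal sketch, here's how the lookup might work in Python. All business names, contacts, and offerings below are hypothetical examples, not from the article:

```python
import csv

# Hypothetical referral partners: your business plus two neighbors.
businesses = [
    {"name": "Main Street Books", "contact": "555-0100", "offers": "new books; author events"},
    {"name": "Corner Yarn Shop", "contact": "555-0101", "offers": "yarn; knitting classes"},
    {"name": "Riverside Cafe", "contact": "555-0102", "offers": "coffee; light lunch"},
]

# Step 1: write the shared spreadsheet so every participant gets a copy.
with open("referral_sheet.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["name", "contact", "offers"])
    writer.writeheader()
    writer.writerows(businesses)

def find_referral(query, rows):
    """Step 7: when a customer asks for something you don't offer,
    return the partners whose listed offerings mention it."""
    return [b["name"] for b in rows if query.lower() in b["offers"].lower()]

print(find_referral("yarn", businesses))  # ['Corner Yarn Shop']
```

The point isn't the tooling; it's that once every participant keeps their row current, any staff member can answer "who should I send this customer to?" in seconds.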
House flipping is a runaway phenomenon in the US that has remodeled communities and sparked dozens of hit TV shows. Unfortunately, there’s a downside to the activity, as it can create negative gentrification, making life harder for existing residents.
You need have no fear of this when you flip Google, because turning their house into yours actually strengthens your real-world neighborhood, town, or city. It gives the residents who already live there more stable resources, more positive human contact, and a more closely knit community.
Truth: Google will remain dominant in the discovery-related phases of your consumers’ journeys for the foreseeable future. For new neighbors and travelers, Google will remain a valuable source of your business being found in the first place. Even if governing bodies break the company up at some point, the truth is that most local businesses need to utilize Google as a search utility for discovery.
Dare: Draw a line on the pavement outside your front door this year, with transactional experiences on your side of the line. Google wants to own the transaction phase of your customers’ journey. Bookings, lead gen, local ads, and related features show where they are headed with this. If Google could, I’m sure they’d be glad to take a cut of every sale you make, and you’ll likely have to participate in their transactional aspirations to some degree. But…
In 2020, dare yourself to turn every customer you serve into a keeper, cutting out Google as the middleman wherever you can and building a truly local, regenerative base of loyalty, referrals, and community.
Wishing you a local 2020 of daring vision and self-made success!
Does reading that title give you a mini-panic attack?
Having gone through exactly what the title suggests, I can guarantee your anxiety is fully warranted.
If you care to relive my nightmare with me — perhaps as equal parts catharsis and SEO study — we will walk through the events chronologically.
Are you ready?
August 4th, 2019
It was a Sunday morning. I was drinking my coffee and screwing around in our SEO tools, like normal, not expecting a damned thing. Then … BAM!
What. The. Hell?
As SEOs, we’re all used to seeing natural fluctuations in rankings. Fluctuations, not disappearances.
Step 1: Denial
Immediately, my mind went to one place: it must be a mistake. So I jumped into some other tools to confirm whether or not Ahrefs was losing its mind.
Google Analytics also showed a corresponding drop in traffic, confirming something was definitely up. So as an SEO, I naturally assumed the worst…
Step 2: Algo panic
Algorithm update. Please, please don’t let it be an algo update.
I jumped into Barracuda’s Panguin Tool to see if our issue coincided with a confirmed update.
No updates. Phew.
Step 3: Diagnosis
Nobody ever thinks clearly when their reptile brain is engaged. You panic, you think irrationally and you make poor decisions. Zero chill.
I finally gathered some presence of mind to think clearly about what happened: It’s highly unusual for keyword rankings to disappear completely. It must be technical.
It must be indexing.
A quick Google search for the pages that lost keyword rankings confirmed that the pages had, in fact, disappeared. Search Console reported the same:
Notice the warning at the bottom:
No: ‘noindex’ detected in ‘robots’ meta tag
Now we were getting somewhere. Next, it was time to confirm this finding in the source code.
Our pages were marked for de-indexing. But how many pages were actually de-indexed so far?
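Confirming a stray `noindex` in the source, as described above, just means looking for a `<meta name="robots">` tag whose content includes "noindex". A minimal sketch of that check with Python's standard-library HTML parser (the function name and sample markup are my own, for illustration):

```python
from html.parser import HTMLParser

class NoindexDetector(HTMLParser):
    """Flags a page whose <meta name="robots"> content includes 'noindex'."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            name = (d.get("name") or "").lower()
            content = (d.get("content") or "").lower()
            if name == "robots" and "noindex" in content:
                self.noindex = True

def has_noindex(html):
    """Return True if the page's robots meta tag blocks indexing."""
    parser = NoindexDetector()
    parser.feed(html)
    return parser.noindex

print(has_noindex('<html><head><meta name="robots" content="noindex,follow"></head></html>'))  # True
```

Run a check like this against every template in a staging build before a sprint ships, and an accidental site-wide `noindex` never reaches production in the first place.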
Step 4: Surveying the damage
All of them. After sending a few frantic notes to our developer, he confirmed that a sprint deployed on Thursday evening (August 1, 2019), almost three days prior, had accidentally pushed the code live on every page.
But was the whole site de-indexed?
It’s highly unlikely, because in order for that to happen, Google would have had to crawl every page of the site within three days in order to find the ‘noindex’ markup. Search Console would be no help in this regard, as its data will always be lagging and may never pick up the changes before they are fixed.
Even looking back now, we see that Search Console only picked up a maximum of 249 affected pages, of over 8,000 indexed. Which is impossible, considering our search presence was cut by one-third an entire week after the incident was fixed.
Note: I will never be certain how many pages were fully de-indexed in Google, but what I do know is that EVERY page had ‘noindex’ markup, and I vaguely remember Googling ‘site:brafton.com’ and seeing roughly one-eighth of our pages indexed. Sure wish I had a screenshot. Sorry.
Step 1: Fix the problem
Once the problem was identified, our developer rolled back the update and pushed the site live as it was before the ‘noindex’ markup. Next came the issue of re-indexing our content.
Step 2: Get the site recrawled ASAP
I deleted the old sitemap, built a new one and re-uploaded to Search Console. I also grabbed most of our core product landing pages and manually requested re-indexing (which I don’t fully believe does anything since the most recent SC update).
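Rebuilding the sitemap, as in the step above, only requires emitting the standard sitemaps.org XML format: a `urlset` of `url` entries, each with a `loc` and optionally a `lastmod`. A minimal sketch with Python's standard library (the URLs are placeholders, not the site in question):

```python
from datetime import date
from xml.etree import ElementTree as ET

def build_sitemap(urls, path="sitemap.xml"):
    """Write a minimal XML sitemap conforming to the sitemaps.org protocol."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    today = date.today().isoformat()
    for u in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = u
        # lastmod hints to crawlers that the page is fresh.
        ET.SubElement(url_el, "lastmod").text = today
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

# Placeholder URLs for illustration.
build_sitemap(["https://example.com/", "https://example.com/products/"])
```

Upload the resulting file to the site root and resubmit it in Search Console's Sitemaps report; that's the whole "deleted the old one, built a new one" loop.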
Step 3: Wait
There was nothing else we could do at this point, other than wait. There were so many questions:
Will the pages rank for the same keywords as they did previously?
Will they rank in the same positions?
Will Google “penalize” the pages in some way for briefly disappearing?
Only time would tell.
August 8th, 2019 (one week) – 33% drop in search presence
In assessing the damage, I’m going to use the date in which the erroring code was fully deployed and populated on live pages (August 2nd) as ground zero. So the first measurement will be seven days completed, August 2nd through August 8th.
Search Console would likely give me the best indication as to how much our search presence had suffered.
We had lost about 33.2% of our search traffic. Ouch.
Fortunately, this would mark the peak level of damage we experienced throughout the entire ordeal.
August 15th, 2019 (two weeks) – 23% drop in traffic
During this period I was keeping an eye on two things: search traffic and indexed pages. Despite re-submitting my sitemap and manually fetching pages in Search Console, many pages were still not being indexed — even core landing pages. This will become a theme throughout this timeline.
As a result of our remaining unindexed pages, our traffic was still suffering.
Two weeks after the incident, we were still 8% down, and our revenue-generating conversions fell with the traffic (despite increased conversion rates).
August 22nd, 2019 (three weeks) – 13% drop in traffic
Our pages were still indexing slowly. Painfully slowly, while I was watching my commercial targets drop through the floor.
At least it was clear that our search presence was recovering. But how it was recovering was of particular interest to me.
Were all the pages re-indexed, but with decreased search presence?
Were only a portion of the pages re-indexed with fully restored search presence?
To answer this question, I took a look at pages that were de-indexed, and re-indexed, individually. Here is an example of one of those pages:
Here’s an example of a page that was de-indexed for a much shorter period of time:
In every instance I could find, each page was fully restored to its original search presence. So it didn’t seem to be a matter of whether pages would recover; it was a matter of when they would be re-indexed.
Speaking of which, Search Console has a new feature in which it will “validate” erroring pages. I started this process on August 26th. After this point, SC slowly recrawled (I presume) these pages to the tune of about 10 pages per week. Is that even faster than a normally scheduled crawl? Do these tools in SC even do anything?
What I knew for certain was there were a number of pages still de-indexed after three weeks, including commercial landing pages that I counted on to drive traffic. More on that later.
August 29th, 2019 (four weeks) – 9% drop in traffic
At this point I was getting very frustrated, because there were only about 150 pages remaining to be re-indexed, and no matter how many times I inspected and requested a new indexing in Search Console, it wouldn’t work.
These pages were fully capable of being indexed (as reported by SC URL inspection), yet they wouldn’t get crawled. As a result, we were still 9% below baseline, after nearly a month.
One particular page simply refused to be re-indexed. This was a high commercial value product page that I counted on for conversions.
In my attempts to force re-indexing, I tried:
URL inspection and requesting indexing (15 times over the month).
Updating the publish date, then requesting indexing.
Updating the content and publish date, then requesting indexing.
Resubmitting sitemaps to SC.
Nothing worked. This page would not re-index. Same story for over one hundred other less commercially impactful URLs.
Note: This page would not re-index until October 1st, two full months after it was de-indexed.
By the way, here’s what our overall recovery progress looked like after four weeks:
September 5th, 2019 (five weeks) – 10.4% drop in traffic
The great plateau. At this point we had re-indexed all of our pages, save for the ~150 or so supposedly being “validated.”
They weren’t. And they weren’t being recrawled either.
It seemed that we would likely fully recover, but the timing was in Google’s hands, and there was nothing I could do to impact it.
September 12th, 2019 (six weeks) – 5.3% gain in traffic
It took about six weeks before we fully recovered our traffic.
But in truth, we still hadn’t fully recovered our traffic, in that some content overperformed and was overcompensating for a number of pages that were not yet indexed. Notably, our product page that wouldn’t be indexed for another ~2.5 weeks.
On balance, our search presence recovered after six weeks. But our content wasn’t fully re-indexed until eight-plus weeks after fixing the problem.
For starters, definitely don’t de-index your site by accident, for an experiment, or for any other reason. It stings. I estimate that we lost about 12% of all organic traffic, with an equally proportionate drop in commercial conversions.
What did we learn?
Once pages re-indexed, they were fully restored in terms of search visibility. The biggest issue was getting them re-indexed.
Some main questions we answered with this accidental experiment:
Did we recover?
Yes, we fully recovered and all URLs seem to drive the same search visibility.
How long did it take?
Search visibility returned to baseline after six weeks. All pages re-indexed after about eight to nine weeks.