The author’s views are entirely his or her own (excluding the unlikely event of hypnosis) and may not always reflect the views of Moz.
A decade ago, you could define SEO to a layperson by establishing the relationship between “search” and “text.” Fast-forward to the present day, and a sizable chunk of web traffic and online purchases now comes from searches initiated by voice prompt. Because users ask for content differently when they use Siri or Alexa — compared to when they type a search query into a browser — optimizing content to capture more of that traffic is going to work a bit differently.
Voice search is different from browser search
You have to make a distinction early on between voice searches that simply transcribe a voice prompt into a search bar and return a list of results, and voice actions that trigger a specific command on a digital assistant-style platform. Most content isn’t going to be able to accommodate optimizations for both the Google search bar and an Alexa voice command at the same time, and some content can’t be engaged by voice-enabled devices at all; a screen-free home smart speaker, for example, can’t display an article or play a video. Instead, if you want to reach audiences while they interact with voice-enabled devices, think of voice-optimized content as another arrow in your quiver.
Not all content needs to be voice friendly
Creating content specifically geared to be findable and consumable via voice search is going to be more important for some sites than others. As screen-free devices and voice-enabled search become more ubiquitous, some sites and pages will likely benefit from becoming more Alexa-friendly. For example, location-based businesses have a huge opportunity to increase their foot traffic by optimizing their online presence to be discoverable via voice search. Every day there are more users likely to ask Siri or Alexa to “find a pizza shop nearby” than to navigate to Yelp or Google Maps and perform a text search for “pizza delivery.”
That said, voice searchability isn’t necessarily what you should build your entire SEO strategy around, even for those users likely to benefit the most from high voice search rankings. That’s because voice isn’t exactly replacing text search — it’s supplementing it.
For example, Siri will update a user on the score of a game, but won’t narrate the action blow-by-blow. If you want a page to rank because you want to serve ads to users interested in sports commentary, then trying to optimize all of your content to accommodate voice may not be the most effective way to drive engagement.
However, if you want to boost foot traffic for a retail sandwich shop, then you can absolutely optimize the business listing to be easier to find when users ask for “lunch spots near me” via voice command while driving, and tailor your approach with that goal in mind.
Smart device and voice search usage is growing, but doesn’t yet dominate
Voice search is arriving quickly but has not yet hit critical mass, creating some low-hanging fruit for early adopters with specific content goals.
In July 2019, Adobe released a study suggesting that around 48% of consumers use voice search for general web searches. The study did not differentiate between digital assistants on smartphones and smart speakers, but the takeaways apply to both.
In Adobe’s study, 85% of those respondents used voice controls on their smartphones, and the top use case for voice commands was to get directions, with 52% of navigational searches performed via voice. Consistent with Adobe’s findings, Microsoft also released a study in 2019 reporting that 72% of smartphone owners used digital assistants, with 65% of all road navigation searches being done by voice prompt.
A 2018 voice search survey conducted by BrightLocal broke out some common use cases by device:
58% of U.S. consumers had done a voice search for a local business on a smartphone
74% of those voice search users use voice to search for local businesses at least weekly
76% of voice search users search on smart speakers for local businesses at least once a week, with the majority doing so daily
But mass adoption of voice tech is still lagging, despite inroads made during the COVID-19 pandemic. While the 2020 Smart Audio Report by NPR and Edison Research found that a third of smart speaker owners consumed more news and entertainment on these devices in early 2020, a two-thirds majority of non-owners were “not at all likely” to purchase a voice-enabled speaker in the next six months, and nearly half of non-owners who use voice commands felt the same. According to Microsoft’s 2019 study, people who own smart speakers still perform plenty of traditional text searches, and not everyone with access to voice command tech likes to use it for every basic function.
Part of the delay in mass adoption may be attributed to unresolved trust and privacy questions that come with being asked to fill our homes with microphones. A majority of smart speaker owners (52%) and a majority of smartphone voice users (57%) are bothered that their device is “always listening.” The silver lining is that roughly the same proportion of users for each device trust the companies that make their smart speaker or smartphone to keep their information secure.
Market share of digital assistants across search
There are four major smart assistants processing the majority of voice search requests at the time of publication, each with their own search algorithms, but with some overlap and data sources in common.
Understanding the market share for each assistant can help you align your optimization strategy with your top growth objectives. Each of these digital assistants is tied to different hardware brands with a slightly different appeal and user base, so you can likely focus your analytics tracking efforts on just one or two platforms, depending on the audience you’re targeting.
The Microsoft 2019 Voice Report asked respondents to list which digital assistants they had used before, which provides a broad idea of how much voice search traffic we can expect to come from each of these engines. Siri and Google Assistant tied for first place, commanding 36% of the market each. Amazon Alexa accounts for 25% of all digital assistant usage, while Microsoft Cortana ranked third place, powering 19% of devices.
An interesting thing to note here is that the engine powering Cortana leans largely on a partnership with Amazon Alexa. Cortana provides voice command functionality to laptops and personal computers, such as “Cortana, read my new emails”, while Alexa sees more smart-speaker requests like “Turn on the lights” or “Play NPR.”
Optimizing for voice search vs. voice actions
Voice commands actually fall into two categories — voice search and voice actions — and each looks for different criteria to determine which response will be returned first for any given voice request. It’s really important to define which one you’re talking about when assessing an SEO plan for voice search, because they process content very differently.
A voice search essentially just replaces a keyboard input with a spoken search phrase to return results in a browser, such as using the “OK Google” command in a smartphone browser. This may impact how you tailor your keyword phrases, based on the user’s tendency to phrase queries more conversationally when interacting with a voice AI.
Voice actions, on the other hand, are specific voice commands or questions from the user that trigger certain apps or automations, such as placing an order for takeout via smart speaker or checking the weather from your car. Screen-free devices like home smart speakers and some car assistants use voice actions. These commands don’t return a ranked page of results, but often a single spoken result, with a prompt for further action. If you ask an Echo Dot device for the weather, it will describe the weather out loud based on data pulled from a predetermined source. It can’t return a list of popular weather forecast sites, because there is no screen to display a Search Engine Results Page (SERP). This is an important distinction.
Smart assistants often pull data from secondary sites to return these vocal snippet results, like pinging WolframAlpha for mathematical conversions or Yelp for local business listings. One such use case would be a voice search for “order a pizza.” The AI would route the query to Yelp or Google Maps, and verbally return one result such as “I found a pizzeria nearby with five stars on Yelp. Would you like to call Joe’s Pizza to place an order or look up driving directions?” This is sometimes known as “position zero,” when a search engine returns an abstract or snippet from within the content itself to answer a direct question without necessarily sending the user to the page.
Achieving position zero depends on the device
Ranking position zero for a voice action prompt depends on where those results are being pulled from. Improving the voice search ranking for driving directions to a specific physical storefront, for example, is often a matter of improving that business’s visibility on listing sites like Google Maps and Yelp, which you may already be doing as part of your SEO plan anyway.
The data source depends on the platform running the voice search. Google and Android devices utilize Google Local Pack, while Siri crawls Yelp to return results when prompted for “the best” in any specific category, otherwise prioritizing the closest results. Since Alexa pulls local results from Bing, Yelp, and Yext, having filled-out profiles and robust listings on those platforms will help a business rank highly in Alexa search results.
Each assistant also pulls NAP data (the name, address, and phone number on a business’s online listing) for location-based results, drawing profiles from slightly different and sometimes overlapping sources:
Siri pulls local recommendations from the NAP profiles on Yelp, Bing, Apple Maps, and TripAdvisor
Android devices and Google Assistant pull NAP profiles from Google My Business
Alexa pulls NAP profiles from Yelp, Bing, and Yext
Cortana, powered by Alexa, pulls from Yelp and Bing
Someone hoping to optimize their business page for voice search will want to max out their NAP profiles across all platforms by making sure that their listings at business.google.com, bingmapsportal.com, and mapsconnect.apple.com are completely filled out. This is also where a reputation management product like Moz Local can help businesses looking to improve their rankings.
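Beyond the listing platforms themselves, a business can make its NAP data unambiguous to crawlers by marking it up with schema.org structured data on its own site. A minimal sketch (the business details below are invented for illustration):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Joe's Pizza",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example Street",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  },
  "telephone": "+1-555-0100",
  "url": "https://www.example.com"
}
</script>
```

Keeping the name, address, and phone number identical across this markup and every listing platform helps the assistants match those profiles to a single business.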
Should you go after the voice snippet feature?
Again, many of the strategies you’d use to achieve first position on a text-based web search still apply to optimizing voice search. To improve voice performance specifically and appear in SERP features and voice snippets, on-page content should be structured so it’s easy to extract, basically reverse engineering the featured snippet you want to produce. But the question is, will it actually help you to rank well in that kind of search? That depends on your goal.
If the page you’re optimizing is built to sell more pizza to local customers, then yes, a featured snippet that pulls your NAP data from Google My Business and provides the pizzeria’s phone number to a hungry local parked nearby is a very good thing. But if the page in question is intended to serve sponsored content about diabetes management to drive clicks to an affiliate link for glucose monitoring strips, then you don’t necessarily want to build a page that helps Siri define Type II diabetes aloud to an eighth grader completing their homework.
Structuring the content headings with a question, followed by a concise answer in the paragraph below, makes it more likely that Siri will recite content from a given page when asked a similarly worded question by the user. The first answers a digital assistant gives when responding to a voice search query are typically the same type of snippets that show up in SERP features such as “People Also Ask” and Knowledge Graph results from Google.
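As a rough sketch, that question-plus-concise-answer structure might look like this in a page’s markup (the copy is invented for illustration):

```html
<h2>Is sugar really bad for children with ADHD?</h2>
<p>
  Research has not found a consistent link between sugar intake and ADHD
  symptoms in children, though individual sensitivities can vary.
</p>
<!-- A short, self-contained answer directly under the question heading
     gives the engine an easy span to lift as a spoken snippet. -->
```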
In other words, Siri is unlikely to return your website to answer the voice prompt “What is the chemical composition of sugar?”, but you could rank highly with a featured snippet to answer a search like “Is sugar really bad for children with ADHD?”
The most valuable content for those seeking on-page visitors is the kind that addresses questions that are hard to answer with a single spoken response.
Rand Fishkin made his predictions on the role of the vocal snippet in search results as voice search was ramping up in 2016, and provided some advice on how you can plan your content around it in this Whiteboard Friday. According to Fishkin, it depends on whether you’re in the “safe” or “dangerous” zone for the content you’re trying to rank for, based on how easily a voice response can address the user’s query without sending them to your page.
“I think Google and Apple and Amazon and Alexa and all of these engines that participate in this will be continuing to disintermediate simplistic data and answer publishers,” Fishkin wrote.
He advises publishers to question the types of information they’re publishing, adding that if X percent of queries that result in traffic can be answered in fewer than Y words, or with “a quick image or a quick graphic, a quick number,” then the engine is going to do it themselves.
“They don’t need you, and very frankly they’re faster than you are,” Fishkin summarized. “They can answer that more quickly, more directly than you can. So I think it pays to consider: Are you in the safe or dangerous portion of this strategic framework with the current content that you publish and with the content plans that you have out in the future?”
Voice-enabled devices are gradually becoming more embedded in consumers’ daily lives, but that doesn’t mean we should prioritize our content as though voice is bearing down on the traditional search engine results page and threatening to replace text search altogether. Even if smart assistants and voice-enabled devices continue to become more popular year over year, they still fill a relatively niche role in most consumers’ gadget ecosystem at this time. That could change as voice AIs become more sophisticated and talking to our gadgets starts to feel more normal, but the industry is still grappling with some serious growing pains.
Voice search and voice action technology still has some really exciting applications looming on the horizon, and marketers are already finding clever ways to insert their brands into the hands-free experience. Optimizing content for voice search is just one piece of that puzzle.
Give us your hottest takes and wildest predictions on where voice search is headed in 2021 in the comments!
In this blog, we’ll examine some of the technical SEO implementations you can use for your site before, during, and after your link building campaign, to optimize the performance and long-term impact of each campaign. These will allow you to increase the short-term performance and build lasting benefits from a concentrated burst of brand interest and attention.
Build your strategy around new user retention
When undertaking a link building campaign, it’s important to remember that much of the traffic you generate will be from first-time users. These users are less likely to wait around for slow-loading content and less likely to return if you don’t make a good impression. So whether your campaign is a brand-focused PR push, a content-focused outreach strategy, or something in between, it’s worth investing in a robust technical SEO framework that helps users connect to and engage with your brand for years to come.
To make the most of campaign traffic, ensure that users can smoothly connect and engage with your site from a range of channels and sources. Your SEO priorities should be broadly divided into tasks that encourage social shareability, create opportunities for site-wide SEO gains, and maximize the viability of your campaign as part of your broader marketing activity.
Let’s explore the SEO tactics you can use to optimize and improve the performance of your link building campaign. As part of your campaign, you should aim to:
Increase shareability by making the site load faster and display more consistently for the predominantly mobile audience that discovers your content via social media channels.
Improve opportunity for site-wide SEO gains through optimizations to internal linking, improvements to E-A-T indicators, and on-page SEO from landing pages.
Optimize for the viability of the campaign by improving tracking, channel integration, and planning for long-term link traffic.
In addition to being great for users, retention-based optimizations are efficient to manage because they can be planned and put in place well ahead of core campaigns. They’re also applicable to every type of link building campaign, and work in tandem with fundamental backlink management tactics.
Lay the groundwork with a backlink audit
Before undertaking the work to gain new backlinks, carry out a backlink audit for actionable data on the quantity of your existing backlinks and the quality of your referrers.
In the campaign planning stages, popular content identified in your audit can give you strategic insights into the kind of content that will perform well with your existing audiences and networks. The overall quantity of links will give you valuable benchmarks for evidencing campaign performance, and the quality and distribution of your inbound links will help your team set targets for which referrers, target pages, and anchor text will give you the most benefit.
Use a backlink audit tool to find out:
How many external sites are linking to your pages?
Which external sites are linking to your pages?
What is the quality of each external site’s backlink profile?
Which pages have existing external backlinks?
Which backlinks are identified as “target errors” because they point to 404 pages?
Which backlinks point to 301 pages?
Use a backlink audit tool like Moz Pro Link Explorer to review your links.
If your inbound links include a significant proportion of toxic or poor quality links, then you may wish to take corrective actions like creating and submitting a disavow list in Google Search Console before the campaign starts in earnest. But if your links are of decent quality, then your next focus should be to reclaim broken links to 404s and reduce links to 301 pages.
Sort links by Status Code or Error to find 404s and 301s.
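The triage step is simple enough to script. A minimal sketch, assuming you’ve exported the audit to rows with a status code per link (the field names and URLs here are illustrative, not any particular tool’s schema):

```python
# Hypothetical rows exported from a backlink audit tool.
backlinks = [
    {"source": "https://blog.example.com/best-pizza", "target": "/menu", "status": 200},
    {"source": "https://news.example.net/review", "target": "/old-menu", "status": 404},
    {"source": "https://forum.example.org/thread/9", "target": "/home", "status": 301},
]

def links_needing_fixes(rows):
    """Split audited backlinks into 404 'target errors' and 301 redirects."""
    errors = [r for r in rows if r["status"] == 404]       # need a new 301 target
    redirects = [r for r in rows if r["status"] == 301]    # should point at a 200
    return errors, redirects

errors, redirects = links_needing_fixes(backlinks)
```

Healthy 200 links drop out of both buckets; everything else becomes your reclamation worklist.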
Why should you fix backlink navigation errors?
You should fix backlink errors to 301s and 404s to regain the link value and build momentum ahead of your campaign. When traffic arrives on a site via a 301 redirect link, you may lose some PageRank value from that connection and receive almost zero PageRank value from links pointing to a 404 page. Fixing these links re-establishes these connections and can help increase the organic performance of your campaign target pages by improving the overall domain authority.
These improvements also reduce user connection times and improve tracking data. When traffic passes through a redirect, tools like Google Analytics find it difficult to attribute the original source of the click, often leading to referral traffic being incorrectly attributed as direct, which is less than helpful for marketers who want to know who their best-performing referrers are.
How should you fix backlink navigation errors?
During the link reclamation process, you will want to update links to the best possible new URL. To improve the value of links you already have, carry out these actions:
Assign 301 redirects to any backlink “target errors” that point to 404 pages. When assigning the new page, try to match it with like-for-like content: an old link to a page about “shoes” should not be redirected to a page about “sharks”; it should go to a page that is also about shoes.
Ensure that any existing 301s are linking to the final destination. Where possible, redirect chains should be removed to maintain as much link value as possible, reduce demands on the server, and optimize for crawl budget.
Where you have control over the linking page — for instance, on social profiles, directory listings, internal project sites, or partner sites — update any 301 or 404 links to a relevant 200 URL. This is something that’s often overlooked, but it’s common for sites that have recently adopted HTTPS to have the old HTTP link on their social profiles and listings. You can and should update these.
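On an Apache server, the like-for-like 301s described above typically go in the site’s .htaccess file. A minimal sketch (the paths are invented for illustration):

```apacheconf
# Reclaim a backlink pointing at a removed page by sending it to the
# closest like-for-like replacement, not the homepage.
Redirect 301 /old-shoes-guide /shoes/buying-guide

# Collapse a redirect chain: point the first hop straight at the final
# 200 destination instead of at another redirect.
Redirect 301 /summer-sale-2019 /promotions
```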
Improving these links makes it easier for bots to crawl and index your site, improves user connection speeds, and gives more consistency for your brand messaging before you begin your outreach activity.
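To find chains worth collapsing, you can walk a redirect map exported from a crawl. A minimal sketch (the URLs are invented for illustration):

```python
def final_destination(url, redirect_map):
    """Follow a {source: target} redirect map to its final URL.

    Returns (final_url, hop_count); raises on a redirect loop.
    """
    seen = set()
    hops = 0
    while url in redirect_map:
        if url in seen:
            raise ValueError(f"redirect loop at {url}")
        seen.add(url)
        url = redirect_map[url]
        hops += 1
    return url, hops

# Example chain: old HTTP URL -> HTTPS -> renamed page.
redirects = {
    "http://example.com/shoes": "https://example.com/shoes",
    "https://example.com/shoes": "https://example.com/footwear/shoes",
}

final, hops = final_destination("http://example.com/shoes", redirects)
# Any source with hops > 1 is a chain: repoint its first redirect (and
# any backlinks you control) straight at `final`.
```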
SEO optimizations to improve social share effectiveness during link building campaigns
Securing network connections to reduce delays from referrals
Improving page speed for mobiles
Updating open graph data to improve social shareability
Improve security to reduce connection delays
Security is an element of technical SEO that is often oversimplified. Yes, HTTPS is a ranking signal, but many believe this is achieved simply by obtaining an SSL certificate. That is part of it — you do need an SSL, but it’s only the start.
Security optimizations often involve server-side updates that can improve the speed and quality of your connections across the web by streamlining the security verification process.
This is particularly important for social media sites like Facebook, which have high levels of encryption and security on their end and equally high expectations for referred domains. Security layers can slow down connection times and affect user trust, so investing time here can help campaign performance overall.
Upgrade your SSL to reduce connection time
Not all SSLs are equal, and the benefits of a secure website extend beyond the lock icon in the address bar. An optimized SSL certificate uses the latest Transport Layer Security (TLS) protocols and is error-free.
Test the quality of your SSL
There are several tools that allow you to check the performance of your existing SSL. Each tool runs different diagnostics, so it’s worth running them in tandem to get a full picture of where action should be taken. It’s possible to pass an inspection on one tool and not on another, so taking a layered approach will yield better results.
First perform a top-level SSL check
For a quick snapshot, run your site through the SSL Checker from SSL Shopper. They give you a top-level rundown of expiry dates and whether your SSL is trusted by all web browsers. You want all green ticks here. Any errors should be flagged and addressed directly with your certificate issuer.
Simply put, a healthy certificate chain will have all green arrows.
In my experience, fixing errors here can have a real impact on site speed. One client saw an almost 70% reduction in overall server connection time and a more than 60% reduction in average page load time on Chrome after resolving chain issues.
Example impact of SSL Certificate Updates.
Run a full diagnostic
Using the SSL Server Test, you can get a more detailed diagnostic of your security configuration. This test gives your site a grade and assesses a range of security indicators. To support a link building campaign, you’ll want to confirm that your server is running the “modern” Transport Layer Security (TLS) protocols, specifically TLS 1.2 and ideally TLS 1.3.
Test your security protocols
TLS 1.3 came into use around 2018 and has since become the preferred connection protocol for large-scale CDNs and operating systems. Not only is it more secure, but TLS 1.3 can also improve connection latency by around 45% by removing a round trip from the connection handshake.
Cloudflare, Facebook, and Android use this protocol as a default, and a matching upgrade for your site could improve performance for a significant number of web users. As a point of reference, Cloudflare alone is used by around 16% of all websites worldwide and 81% of those with CDNs, so upgrading your TLS could help more users to quickly access your site from each new (and established) link.
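You can spot-check which protocol a server actually negotiates with a few lines of Python’s standard library (the hostname is a placeholder; run it against your own domain):

```python
import socket
import ssl

MODERN_TLS = {"TLSv1.2", "TLSv1.3"}

def negotiated_tls_version(host, port=443, timeout=5):
    """Open a TLS connection and report the negotiated protocol version."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=timeout) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.version()  # e.g. "TLSv1.3"

def is_modern(version):
    """True if the negotiated protocol meets the 'modern' TLS 1.2/1.3 bar."""
    return version in MODERN_TLS

# Example usage (requires network access):
#     print(negotiated_tls_version("example.com"))
```

If the reported version falls below TLS 1.2, that’s a signal to update your server or CDN configuration before driving campaign traffic at the site.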
With the introduction of Google’s Core Web Vitals (CWV), speed metrics and upcoming algorithm updates have put additional emphasis on mobile page loading times in general. In a link building campaign, you could see a spike in traffic to a single page. But Google is watching, and if your page doesn’t deliver a high-quality UX as measured by CWV for at least 75% of users, this traffic could affect your page ranking.
Social shares will almost certainly form part of your link building campaign — with good reason. With updates to your Open Graph (OG) tags, you can tailor your tags for better rendering and performance.
Optimize and customize OG meta tags for social posts.
If the hero image for your page doesn’t display when you post links to social media, then you need to update your OG meta tags. It could be that your page doesn’t have the tags in place, or that the fields are not populating automatically. In either case, this is something you can fix.
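A baseline set of OG tags for a campaign page might look like this (the URLs and copy are invented for illustration):

```html
<head>
  <!-- Open Graph tags control how the link unfurls when shared. -->
  <meta property="og:title" content="The 2021 Industry Salary Report" />
  <meta property="og:description" content="Key findings from our survey of 2,000 professionals." />
  <meta property="og:image" content="https://www.example.com/images/salary-report-hero.jpg" />
  <meta property="og:url" content="https://www.example.com/salary-report" />
  <meta property="og:type" content="article" />
</head>
```

Note that most platforms cache share previews, so after updating the tags, re-scrape the URL with the platform’s debugging tool (such as Facebook’s Sharing Debugger) before judging the result.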
Your link building campaign can bring benefits to other pages on your site with careful planning.
Whether your link building campaign is based on a short-term campaign landing page, a thought leadership piece, or a core service, your on-page SEO should be optimized to encourage organic traffic as well.
This is because humans forget things.
So while a user may discover a campaign or promotion via direct link building activities, they should also be able to find it again or look it up to transfer any offline buzz into online knowledge.
Your target page should be supported by solid on-page SEO that includes:
Keyword-optimized content with H1s and H2s
Relevant images optimized with alt text, titles, and structured data
Metadata that is optimized for campaign keywords
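Pulled together, those on-page elements might look like this for a campaign target page (the keywords and copy are invented for illustration):

```html
<head>
  <title>2021 Remote Work Survey Results | Example Co</title>
  <meta name="description"
        content="Findings from our survey of 2,000 remote workers, including salary and productivity data." />
</head>
<body>
  <h1>2021 Remote Work Survey Results</h1>
  <h2>How has remote work changed salaries?</h2>
  <!-- Descriptive alt text keeps images findable and accessible. -->
  <img src="/images/salary-chart.png"
       alt="Bar chart of average remote salaries by industry, 2019 to 2021"
       title="Remote salaries by industry" />
</body>
```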
If you’ve received a juicy backlink from a relevant source, don’t let the benefits stop at a single page. Ensure that your target page has links to other, relevant content across your site that will keep users coming back.
Build links from your target page
Map out your internal links to ensure that any pages being used as backlink landing pages include links to similarly-themed pages on your site. You’ll get the most SEO benefit if your internal link structure includes links to pages with:
Optimizations for keywords that are semantically similar
This allows users and web crawlers that are enabled for natural language search to understand that your content is part of a wider bank of knowledge and expertise. It also makes it more likely that users will return to other content on your site in the future.
Build links to your target page for better indexing
Internal linking goes both ways, so don’t forget to create internal links into campaign content. For short-term campaigns, marketers sometimes create bespoke campaign landing pages with minimal links across the wider site.
This can result in slower indexing for campaign pages, making it more difficult for users to find you via search. To address this, create an announcement-style blog post that links to the campaign target page. In this way, both pages can be entered into the updated sitemap and submitted for indexing.
Example Landing Page and Announcement Blog
As mentioned previously, humans may not remember every detail of your campaign target page when they try to find it again. However, they may remember the person who shared, created, or was featured in the content. So include information about the team behind the content or campaign in order to build Expertise, Authoritativeness, and Trustworthiness (E-A-T) for your brand, and to increase the impact of social shares. This is particularly useful for thought leadership campaigns, where expertise signals like author biographies can be optimized with structured data.
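For example, an author biography can be marked up with schema.org Person data so engines can connect the content to a named expert. A minimal sketch (the names and URLs are invented for illustration):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "The 2021 Industry Salary Report",
  "author": {
    "@type": "Person",
    "name": "Jane Doe",
    "jobTitle": "Head of Research",
    "sameAs": "https://www.linkedin.com/in/janedoe-example"
  }
}
</script>
```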
For short-term campaigns like those for new events or products, including trust indicators like dynamic reviews can assist with conversions.
Optimizing for the viability of the campaign
You aren’t building links for links’ sake. You’re doing so to meet wider business objectives like driving sales, increasing market share, or generating leads.
Review and customize tracking
Once you’ve built your links, traffic to your target page becomes an opportunity to generate valuable data for your conversions funnel. Your tracking should be designed to give you data that supports your overall business objectives.
During your campaign, you should keep your business goals in mind and understand how your target page contributes to those goals. Update the tracking for your target pages to include metrics that correspond to your aims and the content type. Users visiting a data-rich 5,000-word industry report are at a different stage in the customer funnel than those visiting from a quality niche directory.
Creating target page metrics can also help with KPIs, reporting, and evidencing campaign ROI, which clients and managers adore.
Once your business metrics are in place, use the data you’ve collected from link traffic to inform performance on other channels. For example:
Scroll depth data from users who access a piece of high-quality content could be used for re-marketing content on YouTube or display advertising.
Demographic data from users who visit for niche relevant “awareness month” content could improve audience targeting for advertising or PR activity around the same topic.
Email signups for an outreach event can be added to Facebook as Custom and Lookalike audiences for more direct conversions.
Planning for long-term link traffic
Consider the lifecycle of your link building campaign target page.
Ideally, you want to be driving traffic to a URL that can accrue page authority over time. If you spend time creating traffic for a URL that needs to change soon after the campaign ends, you’ll eventually be driving traffic to a 301 link. As we discussed before, this doesn’t give as much PageRank value for your site as an active 200 link does.
So, plan for ways to keep a consistent, live URL for an extended period of time. Depending on your link building strategy, you may be able to employ one of the following techniques:
Use evergreen URLs for long-term content: Thought leadership, cornerstone content like a white paper, or one-off reports are likely to remain on your site for a long time. For this content, consider removing dates from your URL to make the content more evergreen, as this allows for content to be updated and reduces future redirect requirements.
Create permanent pages for recurring campaigns: Landing pages for recurring outreach content like annual event sponsorship or awareness campaigns should become part of permanent navigation. This allows you to build links every year to a well-optimized, annually updated, static page, rather than starting from scratch with blogs to different URLs every year.
Avoid building links to PDFs: Put any downloadable resources into an HTML landing page and build links to that page. Links to PDFs can be difficult to redirect because of how they’re configured in your htaccess file.
Plan for any unavoidable redirects: For short-term campaigns like sales promotions, plan for which page will become the permanent destination. Include common copy on both pages to help Google understand that the redirect is not a soft 404.
Technical SEO can help you gain and maintain backlinks, connect with mobile users, and improve the quality of your connections when you:
Secure network connections to reduce delays from referrals
Improve page speed for mobile users
Update open graph data to improve social shareability
Update on-page SEO
Optimize internal links
Include E-A-T optimizations
Plan for channel integration
Plan for long-term link traffic
These tactics will help the links you build to add value for your customers, your rankings, and your business for many years to come.
Generation Z’s behaviors differ from those of the cohorts that came before it, creating a new challenge for businesses marketing to these consumers. Gen Z’s presence is also growing in the marketing industry itself, so learning how to work with and appeal to these young people is a critical step to take sooner rather than later.
Who is Generation Z?
Social media stars might be the first people who come to mind when you think of Gen Z (also affectionately called Zoomers), but this age group is more than just TikTokers and YouTubers. Although the purported birth years of this generation vary across different sources, Pew Research defines them as individuals born in 1997 or later. With that in mind, it may come as a surprise that these Americans now make up about 28.7% of the total population. For context, Baby Boomers now account for a smaller proportion of just 21.8%, and Millennials around 22%.
Even more shocking than these statistics may be the fact that the oldest members of Generation Z are now well into their twenties. While it’s easy to think of this group as teenagers and children, they’ve grown up quickly, and are now major players in the world’s economy. In fact, this group has an annual spending power of around $143 billion, and currently accounts for approximately 40% of global consumers.
It’s well known that members of this cohort are digital natives and have been raised alongside technology. In 2014, the UK’s Office of Communications tested the technological proficiency of children versus adults only to find that the average 6-year-old outperformed those in their 40s. It’s safe to assume most members of this new generation have a solid grasp of technology, and a skill set that rivals people much older. This may be even more prevalent now with the rising use of digital resources due to the COVID-19 crisis.
Pew Social Trends noted in a recent essay that much like Millennials, who faced the Great Recession during their coming-of-age years, Gen Z will be affected by the pandemic for a long time to come. With a job market that is more competitive than ever and digital skills in high demand, a career in search may become increasingly attractive. Although search engine optimization is ever-changing, its importance has been unwavering for nearly two decades, making it a stable option in an unpredictable world.
How do Zoomers interact with marketing as a whole?
When it comes to targeting this cohort, its members are creating new challenges for businesses. First and foremost, their relationships with brands are very different than those of the generations that came before them. Reports from IBM in association with the National Retail Federation found that, for Gen Z, brand loyalty must be earned. Zoomers are looking for a reflection of their personal values in brands and are prepared to hold them accountable. Beyond their resistance to conventional brand loyalty, research has also found that they are more difficult to engage.
Generally speaking, consumers today are bombarded with thousands of ads a day and have become harder to reach. As such, it’s not shocking that a commonly cited statistic claims members of Gen Z have the shortest attention spans, at just eight seconds. However, Fast Company presents this information in a new light by explaining that they actually have “8-second filters”. These filters allow them to quickly process the tremendous amounts of information they encounter each day and home in on what they actually care about, uniquely preparing them to tune out advertising attempts (as they’ve been conditioned to do practically since birth).
To combat this trend, marketers have been pursuing a variety of novel strategies and methods. For example, experiential marketing has proven to be effective with Gen Z, and they’re also especially excited by virtual reality.
While there are many new marketing opportunities available, social media continues to be a major channel for Gen Z engagement. This is especially true when it comes to video content on sites like YouTube and TikTok. All in all, as these consumers move away from traditional television viewership, the need for alternative marketing avenues grows.
How does Gen Z use search?
With all of this background information in mind, it’s easy to see that search is well-positioned to access this target demographic. Generation Z may not be as responsive to direct advertisements, but they’re accustomed to searching.
As a matter of fact, search engines have been around longer than Gen Z has, with the first search engine appearing in 1990, so it’s no surprise that their use is second nature to this age group. Zoomers fully understand how to use search tools, and they have the capacity to quickly evaluate SERPs prior to deciding on which link will get their click.
They’ve always had the answers to any question readily available, so they also use search for more intentional discovery. Despite their noted “8-second filter”, Fast Company additionally found that they could become deeply focused on topics they find to be worthwhile. Furthermore, their nonchalance towards brand loyalty means they may be less likely to opt for a big brand website over others.
Finally, their use and reliance on mobile devices can’t be overlooked or overstated. The stereotype that people are now glued to their phones has some merit, and companies like Google have taken notice. They’ve already begun catering towards this shift, with things like mobile-first indexing and AMP pages now taking on greater importance. IBM and NRF discovered that, in a global survey of 15,600 Gen Z-ers, 60% would not use an app or site that loads too slowly. This puts the importance of mobile site speed into a greater perspective for SEOs hoping to capture this demographic through search.
The findings of a recent Fractl survey clearly align with each of these trends. They found that out of all the generations, Gen Z has the highest preference for long-tail queries. They know that a short-tail query will produce broad results, and they may not find what they’re looking for. In addition, their mobile usage has created an uptick in voice assistant search functions, which utilize these multi-word phrases as well.
Zoomers working as SEOs
Although this age group is well equipped to use search engines, it’s likely that the concept of SEO still remains somewhat foreign to them. A quick Coursera search shows that there are almost no SEO-specific college courses currently available to students. While some general digital marketing classes may have a chapter or section on SEO, that information can oftentimes be outdated due to the ever-changing nature of search. There are also a few certificate programs and online workshops, but the aforementioned issue is still present. In summary, the most accessible way for students to learn is through their own research, an internship, or some other similar experience that they happen upon.
That said, this industry can provide a fantastic career path for members of Gen Z, should they discover and choose to pursue it. Working in search allows you to develop a variety of skills from critical thinking to problem-solving and data analysis. Those in the SEO community are always up to date on the latest tech and trends, which is valuable in many facets of business. Furthermore, working within an agency provides the opportunity to learn about a vast range of industries and niches. Many SEOs even pick up web development, data science, and programming experience along the way, and these are three competencies that are in very high demand. All things considered, the many hard and soft skills that can be developed through SEO work are the foundations for being successful throughout a career.
Zoomers already have an aptitude for work in technology-based spaces, and those with the determination can pick up expertise quickly in this field. Prime examples of this include the use of SEO tools and content management systems. For instance, once a CMS such as WordPress is learned, that knowledge can be easily transferred to others like Drupal, HubSpot, and so on. The same can be said for tools like Google Analytics and Search Console, because understanding how to evaluate data within those platforms can be translated to a variety of others. In essence, SEO and Gen Z could truly be a match made in digital marketing heaven.
Understanding client-side Gen Z-ers
While SEO may not yet be a mainstream career path for most young people, those in the digital marketing field will likely encounter it at some point. As such, it’s important to keep in mind that members of this generation will also be working on the client side of search.
As previously mentioned, some Zoomers are already part of the workforce, and the presence of this cohort will only continue to grow. In the year 2020 alone, Gen Z made up approximately 24% of the worldwide workforce.
With an influx of new workers on the horizon, working with them may be a unique experience given their strong grasp of technology. On top of that, they’re also more familiar with concepts like analytics and data science, as those careers are seeing a boom in the higher education sector. Members of this age group shouldn’t be underestimated when it comes to absorbing the ins and outs of SEO from the client’s point of view.
As Gen Z continues entering the workforce, likely in entry-level positions, it’s important to remember that they’ll be decision-makers in a few short years. They’ll have an increasing ability to influence budgeting decisions, so it’s absolutely critical to think about ways to connect with them now and communicate the value of SEO to save time, energy, and money in the long run. A few steps to work through are as follows:
Understand that they’re eager to learn and can do so quickly.
Walk them through the reasoning behind each recommendation to build their knowledge over time. As with clients of any age, this improves trust and helps them to see how SEO really works.
Take them seriously and listen to their insights.
They may have concerns, as any client might when it comes to SEO strategies and how they play into the overall marketing plan. Listen to what they have to say, as they may be new, but they could still provide impactful insights.
Embrace novel ideas and creative thinking.
Fresh ideas are never a bad thing, but it can be easy to feel resistant towards those that seem to come out of left field. Fight the impulse to immediately shut these down and instead seriously consider how they could be incorporated into the project.
Don’t shy away from using new tools and technologies.
As mentioned above, Gen Z isn’t intimidated by new forms of technology. Share interesting findings from tools like HotJar, Tableau, or Google Tag Manager to make SEO more exciting for them.
Be candid and transparent about performance analytics.
Be up front about the state of the site’s performance to build their confidence and appreciation for search. In the age of instant gratification, there are few things more satisfying than a positive trend line. On the other side of that, be sure to research and determine the causes for any downturns.
While Gen Z may be a mystery in many ways, two things are certain: they are well on their way to dominating many industries, and they shouldn’t be overlooked. If you’re not preparing for their arrival, you might already be falling behind.
Give these findings and tips some thought, and if there are already Gen Z-ers in your organization, try to take time to pick their brains. Go ahead and learn to embrace the change – as we so often do in SEO – because these TikTokers and YouTubers will only be growing in influence.
As a Senior Analytics Strategist working at an agency with clients across industries, I’ve seen wildly contrasting performance throughout the pandemic. Certain online retailers and auto sites were far surpassing any historical performance, while others had to cut back budgets significantly. The variances in revenue performance also correlated with time frames when the public received more support, in the form of stimulus checks.
My team at Portent conducted the study detailed below to verify our hypotheses that the pandemic caused revenue increases in online retailers and auto industries, and that those spikes correlated with stimulus distributions. We discovered a few specific factors that increased the probability of confirming our hypothesis along the way.
“Unprecedented” has undoubtedly been the word of the year, and it’s touched all aspects of life and business. There have been changes in consumer behavior across all industries — we’ve unfortunately seen swaths of shutdowns in particular markets while others have sustained or are even thriving. This post will provide some observations in online behavior along with some consumer data that should be used as predictive indicators through the rest of the pandemic.
The observations of changes in online behavior were pulled and anonymized from 16 of our clients across 8 different industries. We narrowed those 8 industries down to three categories defined by Google Analytics for the purposes of this analysis: Shopping (10), Travel (3), and Autos & Vehicles (3).
The sites included in this analysis were limited to the US where possible and ranged in monthly revenue from $16K to $103K and in monthly sessions from 4K to 44K.
Observation #1: Stimulus checks resulted in increases in online behavior
Stimulus checks initiated the first revival of spending since the start of the pandemic. Granted, it was only about a month between the first notice of a lockdown and the beginning of stimulus payments. However, that increase in spend remained at higher levels after the majority of stimulus checks were distributed for most sites in this analysis—of course, excluding the Travel sites.
There was a noticeable jump in both sessions and revenue during the (1) week of April 13th, when roughly 80 million stimulus payments were deposited for taxpayers who had direct deposit set up. By the (2) week of April 20th, additional rounds of deposits were made to those who manually set up direct deposits through the IRS. And by the (3) week of June 3rd, the IRS had delivered $270B in stimulus checks to Americans. At this point, revenue and sessions began to normalize below that period of stimulus distributions until the undeniable Black Friday sales occurred.
Observation #2: The impact depends on the market
There were obvious industries that were impacted most by the changes in consumer behavior and are still barely recovering: travel, in-store retail, and restaurants, to name those that were hit the hardest. On the other hand, some industries are actually performing better than before, such as online retail sales and food and beverage stores.
What came as a surprise, however, was that the Autos & Vehicles sites actually sustained higher averages than the Shopping sites. The sites in this industry saw 26.8% higher sessions and 36.8% higher revenue compared to the dip seen in the beginning of the pandemic and also well above prior levels in the beginning of the year.
The stark jump in sessions and revenue also aligned with when the distribution of stimulus checks began. In hindsight, the increase in consumer spending in this industry could have been anticipated considering the limitations and fear associated with traveling by plane. Online behavior is higher in the summer months as well, as those who were becoming restless from quarantining began to take road trips to satisfy their wanderlust.
There were a few other predictable trends that we identified in our study:
There was one quick spike in athletic wear purchases with the average sustaining higher than pre-pandemic levels.
There was a prolonged spike in revenue and especially traffic for baking goods and flour purchases, which remained at much higher levels compared to prior numbers. Then there was the seasonal influx of interest during the holidays. Interestingly, the conversion rate was 123% higher during the holidays compared to its peak in the beginning of the pandemic.
With parents and families stuck at home, there was an exponential and lengthy growth in online behavior for children’s toys. Although the growth has tapered, it continues to see an upward trend.
As expected, travel sites have taken the largest hit in our study, with a significant drop that has seen little to no recovery. The increase in revenue and sessions in the summer is almost entirely attributable to a single resort, which benefited from the same road-trip wanderlust seen across the Autos & Vehicles sites.
Observation #3: There’s a positive correlation between AOV and % change in revenue
There was a fairly strong correlation of 0.76 between average order value (AOV) and % change in revenue YoY* for Shopping sites only. The consumer behavior on Autos & Vehicles sites was more dependent on stimulus checks and weather while the behavior on Travel sites was dependent on the feeling of safety.
*The data in this chart was pulled with the following notes:
YoY comparison was for November 2019 vs. November 2020 (November being the highest performing month based on seasonality)
Two anomalies were excluded from this data: (1) A flour company and, (2) A company with an AOV of nearly $2K
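The 0.76 figure above is a standard Pearson correlation, and the same calculation can be run on any comparable dataset. Here is a minimal sketch; the AOV and YoY figures below are made-up illustrative values, since the study’s underlying data isn’t public.

```python
# Pearson correlation between average order value (AOV) and
# YoY % change in revenue, computed from scratch.

def pearson(xs, ys):
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Illustrative data only: AOV in USD, YoY revenue change in %
aov = [45, 60, 80, 120, 150, 200]
yoy = [5, 12, 18, 30, 28, 45]

r = pearson(aov, yoy)
print(f"Pearson r = {r:.2f}")
```

A value near +1 (as in the study’s 0.76) indicates that sites with higher order values tended to see larger YoY revenue gains.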
This one makes intuitive sense if you think about who’s been disproportionately impacted by the pandemic. Industries that are still thriving are ones that were able to more easily transition to 100% remote work and ones that had enough funds to weather some dropoff in clientele. Those who were fortunate enough to be employed in those industries during this time are also more likely to be paid higher than the median. In fact, only 30% of parents earning $200K or more lost their jobs since the pandemic, compared to 65% of parents earning less than $25K. Those who lost jobs in higher income brackets were also more likely to be able to find work again.
High-income spenders weren’t significantly impacted by the pandemic aside from the first few months, during which the change in consumer spending came from uncertainty. Although high- and low-income brackets both saw significant drops in spending initially, high-income consumers returned to levels comparable to January 2020 while low-income consumers were still about 10% below on average through September 2020.
Considerations to forecast future performance
Quite honestly, there’s nothing in this analysis that hasn’t already been surfaced through market research, and these observations have been corroborated by economic data. The key takeaways here are to pay attention to the trends we’re seeing, think about how they relate to your target audience or customer, and watch for new developments that may signal a shift toward normalcy as you re-enter the digital marketplace.
It’s important to narrow down your research to your target audience. If your business is international, you likely won’t be as impacted by future stimulus checks in the US. However, different international markets will recover at differing rates.
Similarly, it’s important to keep your industry in mind. S&P is already estimating that the most-affected industries may not recover fully until 2022. This means that industries like in-store retail, travel, and service will have to find alternative ways to pivot during this time to return to normal levels.
Once you’ve considered your market and industry, weigh the risks based on your AOV and the income level of your average consumer. The higher the average income level, the more likely it is that your market has already recovered or the higher your chances are of being able to adjust successfully.
Additional federal support
Although the support from the US government throughout the pandemic has been lackluster at best, there’s a possibility of additional support. The recent round of stimulus checks were more limited than the first, meaning the impact on consumer behavior might be less noticeable. Economists are guessing that consumers would rather save this smaller amount than put it back into the economy. However, these bills should be accounted for in forecasting with the hopeful potential of additional (and more significant) federal support.
The distribution of vaccinations is likely to take at least several months to be impactful and possibly even longer to reach herd immunity. During this time of forward movement in the pandemic, we will all need to monitor and predict consumer behavior in unprecedented ways until we begin to see normalcy again.
With so many customization options in your Google My Business profile, it can be tough to decide what to focus on. But when it comes to ranking on the SERP, there are actually only four GMB fields that influence where your business will land.
In this brand new Whiteboard Friday, MozCon speaker and owner/founder of Sterling Sky, Joy Hawkins, takes us through the fields she and her team have found do (and do not) affect rankings.
Click on the whiteboard image above to open a high resolution version in a new tab!
Hello, Moz fans. My name is Joy Hawkins, and today I’m going to be talking about which Google My Business fields impact ranking in the local pack. At my agency, Sterling Sky, we do a lot of testing to try and figure out what things actually influence ranking and what things do not.
We’ve come to the conclusion that there are only four things inside the Google My Business dashboard that a business owner or a marketing agency can edit that will have a direct influence on where they rank in the local results on Google.
1. Business name
So to start us out, I’m going to start with the first thing that we found has impacted ranking, which is the business name. Now this is one that’s kind of frustrating because I don’t think it should have so much of an influence, but it does.
This year in the local search ranking factors study I actually put this as my number one. Of all the things that influence ranking, this one, in my experience, has the most weight, which is again unfortunate. So as a business owner, obviously you’re thinking, “I can’t really change my business name very easily”. If you do happen to have a keyword rich business name, you will see an advantage there.
But the real action item would be to kind of look to see if your competitors are taking advantage of this by adding descriptive words into their business name and then submitting corrections to Google for it, because it is against the guidelines. So I’m not saying go out there and add a whole bunch of keywords to your business name on Google. Don’t do that. But you should keep an eye on your competitors just to see if they’re doing this, and if they are, you can report it to Google using the Google business complaint redressal form.
Now one thing that’s kind of a tip here — it has nothing to do with Google — but we’ve seen the same thing on Bing, which doesn’t get talked about a whole lot, but on Bing you’re actually allowed to have descriptors in your business name, so go ahead and do it there.
No impact: Q&A
Now I’m going to switch over to something that we found has not influenced ranking at all, which is Q&A. I kind of shoved it over to the section over there because it’s not actually in the dashboard currently. There isn’t a Q&A section in there, but it is on the knowledge panel on Google, and it is something that you should get an email alert about if somebody posts a question to your listing.
So we did a bunch of testing on Q&A and found, despite putting random keywords and very specific things in questions that we posted and also in the answers, there was no measurable impact on ranking.
So, unfortunately, that is not one area where you can kind of manipulate ranking for your clients.
2. Categories
Moving on to the second thing that we have found influences ranking — categories. Categories might sound kind of simple, because you go and you pick your categories.
There are 10 that you can add on there, but one thing I want to point out is that Google has around 4,000 categories currently, and they keep adding categories, and then they also sometimes remove them.
So we have been tracking this month over month, and we usually find that there are about two to 10 (on average) changes every month to the categories. Sometimes they add ones that didn’t exist before. For example, we found in the last year there have been a lot of restaurant categories added as well as auto dealer categories. But there are also some industries like dentists, for example, that got a new one a couple of months ago for dental implants.
So it is something that you want to kind of keep track of, and hopefully we will have a resource published soon where we can actually log all of the changes for you.
No impact: services
Now moving on to another thing that does not impact ranking, we’ll move over here to services.
So the services section — at first glance it looks like an SEO dream. You can put all kinds of descriptive words in there. You can tell Google a lot about the different services you offer.
But we have found that whatever you put there has no actual bearing on where you rank. So it’s not something I would spend a lot of time on. Also, it’s not very visible. Currently it’s not really visible on desktop at all. Then if you go onto a mobile device, it’s kind of hidden off in a tab. It’s not something we have found really has a lot of weight, so spend a few minutes on it, but it’s not something I would revisit very often.
3. Website
Then moving back to the things that do impact ranking, number three would be the website field.
So this is something where you do want to kind of think and possibly even test what page on your website to link your Google My Business listing to. Often people link to the homepage, which is fine. But we have also found with multi-location businesses sometimes it is better to link to a location page.
So you do want to kind of test that out. If you’re a business that has lots of different listings — like you have departments or you have practitioner listings — you also want to try and make sure that you link those to different pages on your site, to kind of maximize your exposure and make sure that you’re just not trying to rank all the listings for the same thing, because that won’t happen. They’ll just get filtered. So that is a section that I would definitely suggest doing some testing on and see what works best for you and your industry.
No impact: products
Now moving on to something that we have found did not impact rankings — products.
So this is a feature that Google launched about a year or so ago. It’s available on most listings. They are actually slowly rolling it out at the moment to all listings, with the exception of a few categories that don’t have it. This section is kind of cool because it’s very visual.
If you’re a business that offers products or even if you offer services, you can technically list them in this section with photos. One of the neat things about the products section is that they are very visible on the knowledge panel on both desktop and on mobile. So it is something you want to fill out, but unfortunately we have found it doesn’t impact ranking. However, it does have an impact on conversions for certain industries.
So if you’re a business like a florist or a car dealer, it definitely makes sense to fill out that section and keep it up to date based on what products you’re currently offering.
4. Reviews
Then moving back to the final thing that we found: number four for what influences ranking would be reviews (which is probably not going to be shocking to most of you). But we have found that review quantity does make an impact on ranking.
But that being said, we’ve also found that it has kind of diminishing returns. So for example, if you’re a business and you go from having no reviews to, let’s say, 20 or 30 reviews, you might start to see your business rank further away from your office, which is great. But if you go from, let’s say, 30 to 70, you may not see the same lift. So that’s something to kind of keep in mind.
But there are lots of reasons as a business, obviously, why you want to focus on reviews, and we do see that they actually have a direct impact on ranking.
There was an article that I wrote a couple of years ago that is still relevant, on Search Engine Land, that talks about the changes that I saw when a whole bunch of businesses lost reviews and just watching how their ranking actually dropped within a 24 to 48-hour period. So that is still true and still relevant, but it’s something that I would also keep in mind when you’re coming up with a strategy for your business.
So in summary, the four things that you need to remember that you can actually utilize inside Google My Business to influence your ranking: first is the business name, second would be the categories, third would be the website field, and finally the review section on Google.
Thanks for listening. If you have any questions, please hit me up in the comments.
Ready for more?
You’ll uncover even more SEO goodness in the MozCon 2020 video bundle. At this year’s special low price of $129, this is invaluable content you can access again and again throughout the year to inspire and ignite your SEO strategy:
21 full-length videos from some of the brightest minds in digital marketing
Instant downloads and streaming to your computer, tablet, or mobile device
The poet Burns once observed that the best laid plans “gang aft agley.” At Moz, we were about to publish our State of Local SEO industry report, based on our local search marketing survey to which hundreds of you generously replied. Then the public health emergency unexpectedly arose, and we decided to pause in our planning.
The findings of the survey, as they currently stand, contain valuable and surprising insights which are as relevant today as they were pre-COVID-19. Yet, in order to reflect the substantial changes the local business community is currently weathering, we are reaching out to you with a timely additional request.
If you market local businesses in any capacity, whether in-house or for an agency, please take our quick, supplementary six-question survey. Your answers will help everyone gauge the impacts of the past few weeks on our industry, and hopefully help in planning for the future. We would be so grateful for just a few minutes of your time to be sure the final report reflects the full picture of local business marketing.
Google shook up the SEO world by announcing big changes to how publishers should mark nofollow links. The changes — while beneficial to help Google understand the web — nonetheless caused confusion and raised a number of questions. We’ve got the answers to many of your questions here.
14 years after its introduction, Google today announced significant changes to how they treat the “nofollow” link attribute. The big points:
Nofollow can now be specified with 3 different attributes — “nofollow”, “sponsored”, and “ugc” — each signifying a different meaning.
For ranking purposes, Google now treats each of the nofollow attributes as “hints” — meaning they likely won’t impact ranking, but Google may choose to ignore the directive and use nofollow links for rankings.
Google continues to ignore nofollow links for crawling and indexing purposes, but this strict behavior changes March 1, 2020, at which point Google begins treating nofollow attributes as “hints”, meaning they may choose to crawl them.
You can use the new attributes in combination with each other. For example, rel=”nofollow sponsored ugc” is valid.
Paid links must use either the nofollow or sponsored attribute (alone or in combination). Simply using “ugc” on paid links could presumably lead to a penalty.
Publishers don’t have to do anything. Google offers no incentive for changing, or punishment for not changing.
Publishers using nofollow to control crawling may need to reconsider their strategy.
Why did Google change nofollow?
Google wants to take back the link graph.
Google introduced the nofollow attribute in 2005 as a way for publishers to address comment spam and shady links from user-generated content (UGC). Linking to spam or low-quality sites could hurt you, and nofollow offered publishers a way to protect themselves.
Google also required nofollow for paid or sponsored links. If you were caught accepting anything of value in exchange for linking out without the nofollow attribute, Google could penalize you.
The system generally worked, but huge portions of the web—sites like Forbes and Wikipedia—applied nofollow across their entire site for fear of being penalized, or not being able to properly police UGC.
This made entire portions of the link graph less useful for Google. Should curated links from trusted Wikipedia contributors really not count? Perhaps Google could better understand the web if they changed how they consider nofollow links.
By treating nofollow attributes as “hints”, they allow themselves to better incorporate these signals into their algorithms.
Hopefully, this is a positive step for deserving content creators, as a broader swath of the link graph opens up to more potential ranking influence. (Though for most sites, it doesn’t seem much will change.)
What is the ranking impact of nofollow links?
Prior to today, SEOs generally believed nofollow links worked like this:
Not used for crawling and indexing (Google didn’t follow them.)
Might be used for ranking, though the observed effect was typically small or non-existent
To be fair, there’s a lot of debate and speculation around the second statement, and Google has been opaque on the issue. Experimental data and anecdotal evidence suggest Google has long considered nofollow links as a potential ranking signal.
As of today, Google’s guidance states nofollowed attributes—including sponsored and ugc—are treated like this:
Still not used for crawling and indexing (see the changes taking place in the future below)
For ranking purposes, all nofollow directives are now officially a “hint” — meaning Google may choose to ignore the directive and use the link for ranking purposes. Many SEOs believe this is how Google has been treating nofollow for quite some time.
Beginning March 1, 2020, nofollow attributes will be treated as hints across the board, meaning:
In some cases, they may be used for crawling and indexing
In some cases, they may be used for ranking
Emphasis on the word “some.” Google is very explicit that in most cases they will continue to ignore nofollow links as usual.
Do publishers need to make changes?
For most sites, the answer is no — only if they want to. Google isn’t requiring sites to make changes, and as of yet, there is no business case to be made.
That said, there are a couple of cases where site owners may want to implement the new attributes:
Sites that want to help Google better understand the sites they—or their contributors—are linking to. For example, it could be to everyone’s benefit for sites like Wikipedia to adopt these changes. Or maybe Moz could change how it marks up links in the user-generated Q&A section (which often links to high-quality sources).
Sites that use nofollow for crawl control. For sites with large faceted navigation, nofollow is sometimes an effective tool for preventing Google from wasting crawl budget. It’s too early to tell whether publishers using nofollow this way will need to change anything before Google starts treating nofollow as a crawling “hint,” but it may be important to pay attention.
To be clear, if a site is properly using nofollow today, SEOs do not need to recommend any changes be made. Though sites are free to do so, they should not expect any rankings boost for doing so, or new penalties for not changing.
That said, Google’s use of nofollow may evolve, and it will be interesting to see in the future—through study and analysis—if a ranking benefit does emerge from using nofollow attributes in a certain way.
Which nofollow attribute should you use?
If you choose to change your nofollow links to be more specific, Google’s guidelines are very clear, so we won’t repeat them in-depth here. In brief, your choices are:
rel=”sponsored” – For paid or sponsored links. This would presumably include affiliate links, although Google hasn’t explicitly said.
rel=”ugc” – Links within all user-generated content. Google has stated if UGC is created by a trusted contributor, this may not be necessary.
rel=”nofollow” – A catchall for all nofollow links. As with the other nofollow directives, these links generally won’t be used for ranking, crawling, or indexing purposes.
Additionally, attributes can be used in combination with one another. This means a declaration such as rel=”nofollow sponsored” is 100% valid.
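To make the options concrete, here is a small markup sketch. The URLs and anchor text are hypothetical placeholders; the rel values themselves follow Google’s published guidance:

```html
<!-- A paid, sponsored, or (presumably) affiliate placement -->
<a href="https://example.com/partner" rel="sponsored">Partner offer</a>

<!-- A link left by a user in comments, forums, or other UGC -->
<a href="https://example.com/commenter-site" rel="ugc">Commenter's site</a>

<!-- The classic catchall for any link you don't vouch for -->
<a href="https://example.com/unvetted" rel="nofollow">Unvetted source</a>

<!-- Attributes may be combined, e.g. a paid link inside user comments -->
<a href="https://example.com/paid-comment" rel="nofollow sponsored">Sponsored link in UGC</a>
```

Note that the combined value in the last example still satisfies the paid-link requirement, because it contains “sponsored.”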
Can you be penalized for not marking paid links?
Yes, you can still be penalized, and this is where it gets tricky.
Google advises to mark up paid/sponsored links with either “sponsored” or “nofollow” only, but not “ugc”.
This adds an extra layer of confusion. What if your UGC contributors are including paid or affiliate links in their content/comments? Google, so far, hasn’t been clear on this.
For this reason, we will likely see publishers continue to mark up UGC with “nofollow” as a default, or possibly “nofollow ugc”.
Can you use the nofollow attributes to control crawling and indexing?
Nofollow has always been a very, very poor way to prevent Google from indexing your content, and it continues to be that way.
If you want to prevent Google from indexing your content, it’s recommended to use one of several other methods, most typically some form of “noindex”.
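For reference, the standard way to do this is a robots meta tag in the page, or the equivalent HTTP header for non-HTML files; this is general practice, not something specific to the nofollow changes:

```html
<!-- In the <head> of the page: keep this URL out of Google's index.
     The page must remain crawlable (not blocked in robots.txt),
     or Googlebot will never see the directive. -->
<meta name="robots" content="noindex">

<!-- For PDFs and other non-HTML resources, the equivalent is an
     HTTP response header:  X-Robots-Tag: noindex -->
```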
Crawling, on the other hand, is a slightly different story. Many SEOs use nofollow on large sites to preserve crawl budget, or to prevent Google from crawling unnecessary pages within faceted navigation.
Based on Google’s statements, it seems you can still attempt to use the nofollow attributes in this way, but after March 1, 2020, Google may choose to ignore it. Any SEO using nofollow this way may need to get creative in order to prevent Google from crawling unwanted sections of their sites.
Final thoughts: Should you implement the new nofollow attributes?
While there is no obvious compelling reason to do so, this is a decision every SEO will have to make for themselves.
Given the initial confusion and lack of clear benefits, many publishers will undoubtedly wait until we have better information.
That said, it certainly shouldn’t hurt to make the change (as long as you mark paid links appropriately with “nofollow” or “sponsored”.) For example, the Moz Blog may someday change comment links below to rel=”ugc”, or more likely rel=”nofollow ugc”.
Finally, will anyone actually use the “sponsored” attribute, at the risk of giving more exposure to paid links? Time will tell.
What are your thoughts on Google’s new nofollow attributes? Let us know in the comments below.
While SEOs have been doubling down on content and quality signals for their websites, Google was building the foundation of a new reality for crawling, indexing, and ranking. Though many believe deep in their hearts that “Content is King,” the reality is that Mobile-First Indexing enables a new kind of search result. This search result focuses on surfacing and re-publishing content in ways that feed Google’s cross-device monetization opportunities better than simple websites ever could.
For two years, Google honed and changed their messaging about Mobile-First Indexing, mostly de-emphasizing the risk that good, well-optimized, Responsive-Design sites would face. Instead, the search engine giant focused more on the use of the Smartphone bot for indexing, which led to an emphasis on the importance of matching SEO-relevant site assets between desktop and mobile versions (or renderings) of a page. Things got a bit tricky when Google had to explain that the Mobile-First Indexing process would not necessarily be bad for desktop-oriented content, but all of Google’s shifting and positioning eventually validated my long-stated belief: That Mobile-First Indexing is not really about mobile phones, per se, but mobile content.
I would like to propose an alternative to the predominant view, a speculative theory, about what has been going on with Google in the past two years, and it is the thesis of my 2019 MozCon talk — something we are calling Fraggles and Fraggle-based Indexing.
I’ll go through Fraggles and Fraggle-based indexing, and how this new method of indexing has made web content more ‘liftable’ for Google. I’ll also outline how Fraggles impact the Search Results Pages (SERPs), and why it fits with Google’s promotion of Progressive Web Apps. Next, I will provide information about how astute SEOs can adapt their understanding of SEO and leverage Fraggles and Fraggle-Based Indexing to meet the needs of their clients and companies. Finally, I’ll go over the implications that this new method of indexing will have on Google’s monetization and technology strategy as a whole.
Ready? Let’s dive in.
Fraggles & Fraggle-based indexing
The SERP has changed in many ways. These changes can be thought of and discussed separately, but I believe that they are all part of a larger shift at Google. This shift includes “Entity-First Indexing” of crawled information around the existing structure of Google’s Knowledge Graph, and the concept of “Portable-prioritized Organization of Information,” which favors information that is easy to lift and re-present in Google’s properties — Google describes these two things together as “Mobile-First Indexing.”
Fraggles represent individual parts (fragments) of a page for which Google overlaid a “handle” or “jump-link” (aka named-anchor, bookmark, etc.) so that a click on the result takes the users directly to the part of the page where the relevant fragment of text is located. These Fraggles are then organized around the relevant nodes on the Knowledge Graph, so that the mapping of the relationships between different topics can be vetted, built-out, and maintained over time, but also so that the structure can be used and reused, internationally — even if different content is ranking.
More than one Fraggle can rank for a page, and the format can vary from a text-link with a “Jump to” label, an unlabeled text link, a site-link carousel, a site-link carousel with pictures, or occasionally horizontal or vertical expansion boxes for the different items on a page.
The easiest way for an SEO to think about a Fragment is within the example of an AJAX expansion box: The piece of text or information that is fetched from the server to populate the AJAX expander when clicked could be described as a Fragment. Alternatively, if it is indexed for Mobile-First Indexing, it is a Fraggle.
We have also recently discovered that Google has begun to index URLs with a # jump-link, after years of not doing so, and is reporting on them separately from the primary URL in Search Console. As you can see below from our data, they aren’t getting a lot of clicks, but they are getting impressions. This is likely because of the low average position.
Why index fragments & Fraggles?
If you’re used to thinking of rankings with the smallest increment being a URL, this idea can be hard to wrap your brain around. To help, consider this thought experiment: How useful would it be for Google to rank a page that gave detailed information about all different kinds of fruits and vegetables? It would be easy for a query like “fruits and vegetables,” that’s for sure. But if the query is changed to “lettuce” or “types of lettuce,” then the page would struggle to rank, even if it had the best, most authoritative information.
This is because the “lettuce” keywords would be diluted by all the other fruit and vegetable content. It would be more useful for Google to rank the part of the page that is about lettuce for queries related to lettuce, and the part of the page about radishes well for queries about radishes. But since users don’t want to scroll through the entire page of fruits and vegetables to find the information about the particular vegetable they searched for, Google prioritizes pages with keyword focus and density, as they relate to the query. Google will rarely rank long pages that cover multiple topics, even if they are more authoritative.
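In markup terms, a page like the one in this thought experiment only needs ordinary fragment identifiers for Google to have something to “jump” to. A hypothetical sketch:

```html
<!-- One long page, with each vegetable addressable as its own fragment -->
<h2 id="lettuce">Lettuce</h2>
<p>Varieties of lettuce, growing conditions, nutrition...</p>

<h2 id="radishes">Radishes</h2>
<p>Varieties of radishes...</p>
```

A “Jump to” result for a lettuce query could then deep-link straight to a URL like https://example.com/vegetables#lettuce (a placeholder address), skipping the rest of the page.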
With featured snippets, AMP featured snippets, and Fraggles, it’s clear that Google can already find the important parts of a page that answers a specific question — they’ve actually been able to do this for a while. So, if Google can organize and index content like that, what would the benefit be in maintaining an index that was based only on per-pages statistics and ranking? Why would Google want to rank entire pages when they could rank just the best parts of pages that are most related to the query?
To address these concerns, historically, SEOs have worked to break individual topics out into separate pages, with one page focused on each topic or keyword cluster. So, with our vegetable example, this would ensure that the lettuce page could rank for lettuce queries and the radish page could rank for radish queries. With each website creating a new page for every possible topic that they would like to rank for, there’s a lot of redundant and repetitive work for webmasters. It also likely adds a lot of low-quality, unnecessary pages to the index. Realistically, how many individual pages on lettuce does the internet really need, and how would Google determine which one is the best? The fact is, Google wanted to shift to an algorithm that focused less on links and more on topical authority to surface only the best content — and the scrolling feature in Fraggles lets Google circumvent this page-proliferation problem.
Even though the effort to switch to Fraggle-based indexing, and organize the information around the Knowledge Graph, was massive, the long-term benefits of the switch far out-pace the costs to Google because they make Google’s system more flexible, monetizable, and sustainable, especially as the amount of information and the number of connected devices expands exponentially. It also helps Google identify, serve, and monetize new cross-device search opportunities as they continue to expand. This includes search results on TVs, connected screens, and spoken results from connected speakers. A few relevant costs and benefits are outlined below for you to contemplate, keeping Google’s long-term perspective in mind:
Why Fraggles and Fraggle-based indexing are important for PWAs
What also makes the shift to Fraggle-based Indexing relevant to SEOs is how it fits in with Google’s championing of Progressive Web Apps or AMP Progressive Web Apps (aka PWAs and PWA-AMP websites/web apps). These types of sites have become the core focus of Google’s Chrome Developer summits and other smaller Google conferences.
This is because PWAs require ServiceWorkers, which allow Fraggles and Fraggle-based indexing to take the burden off crawling and indexing complex web content.
ServiceWorkers and SEO
For a PWA to be indexed, Google requires webmasters to ‘register their app in Firebase,’ but they used to require webmasters to “register their ServiceWorker.” Firebase is the Google platform that allows webmasters to set up and manage indexing and deep linking for their native apps, chat-bots and, now, PWAs.
Direct communication with a PWA specialist at Google a few years ago revealed that Google didn’t crawl the ServiceWorker itself, but crawled the API to the ServiceWorker. It’s likely that when webmasters register their ServiceWorker with Google, Google is actually creating an API to the ServiceWorker, so that the content can be quickly and easily indexed and cached on Google’s servers. Since Google has already launched an Indexing API and appears to now favor APIs over traditional crawling, we believe Google will begin pushing the use of ServiceWorkers to improve page speed, since they can be used on non-PWA sites — but this will actually be to help ease the burden on Google to crawl and index the content manually.
It’s important to remember that AMP, Schema, and many other types of powerful SEO functionality started with a limited launch like this; beyond that, some great SEOs have already tested submitting other types of content in the API and seen success. Submitting to APIs skips Google’s process of blindly crawling the web for new content and allows webmasters to feed the information to them directly.
It is possible that the new Indexing API follows a similar structure or process to PWA indexing. Submitted URLs can already get some kinds of content indexed or removed from Google’s index, usually in about an hour, and while it is only currently officially available for the two kinds of content, we expect it to be expanded broadly.
How will this impact SEO strategy?
Of course, every SEO wants to know how to leverage this speculative theory — how can we make the changes in Google to our benefit?
The first thing to do is take a good, long, honest look at a mobile search result. Position #1 in the organic rankings is just not what it used to be. There’s a ton of engaging content that is often pushing it down, but not counting as an organic ranking position in Search Console. This means that you may be maintaining all your organic rankings while also losing a massive amount of traffic to SERP features like Knowledge Graph results, Featured Snippets, Google My Business, maps, apps, Found on the Web, and other similar items that rank outside of the normal organic results.
These results, as well as Pay-per-Click results (PPC), are more impactful on mobile because they are stacked above organic rankings. Rather than being off to the side, as they might be in a desktop view of the search, they push organic rankings further down the results page. There has been some great reporting recently about the statistical and large-scale impact of changes to the SERP and how these changes have resulted in changes to user-behavior in search, especially from Dr. Pete Meyers, Rand Fishkin, and JumpTap.
Dr. Pete has focused on the increasing number of changes to the Google Algorithm recorded in his MozCast, which heated up at the end of 2016 when Google started working on Mobile-First Indexing, and again after it launched the Medic update in 2018.
Rand, on the other hand, focused on how the new types of rankings are pushing traditional organic results down, resulting in less traffic to websites, especially on mobile. All this great data from these two really set the stage for a fundamental shift in SEO strategy as it relates to Mobile-First Indexing.
The research shows that Google re-organized its index to suit a different presentation of information — especially if they are able to index that information around an entity-concept in the Knowledge Graph. Fraggle-based Indexing makes all of the information that Google crawls even more portable because it is intelligently nested among related Knowledge Graph nodes, which can be surfaced in a variety of different ways. Since Fraggle-based Indexing focuses more on the meaningful organization of data than it does on pages and URLs, the results are a more “windowed” presentation of the information in the SERP. SEOs need to understand that search results are now based on entities and use-cases (think micro-moments), instead of pages and domains.
Google’s Knowledge Graph
To really grasp how this new method of indexing will impact your SEO strategy, you first have to understand how Google’s Knowledge Graph works.
Since it is an actual “graph,” all Knowledge Graph entries (nodes) include both vertical and lateral relationships. For instance, an entry for “bread” can include lateral relationships to related topics like cheese, butter, and cake, but may also include vertical relationships like “standard ingredients in bread” or “types of bread.”
Lateral relationships can be thought of as related nodes on the Knowledge Graph, and hint at “Related Topics,” whereas vertical relationships point to a broadening or narrowing of the topic, which hints at the most likely filters within it. In the case of bread, a vertical relationship upward would include topics like “baking,” and downward would include topics like “flour” and other ingredients used to make bread, or “sourdough” and other specific types of bread.
SEOs should note that Knowledge Graph entries can now include an increasingly wide variety of filters and tabs that narrow the topic information to benefit different types of searcher intent. This includes things like helping searchers find videos, books, images, quotes, locations, but in the case of filters, it can be topic-specific and unpredictable (informed by active machine learning). This is the crux of Google’s goal with Fraggle-based Indexing: To be able to organize the information of the web-based on Knowledge Graph entries or nodes, otherwise discussed in SEO circles as “entities.”
Since the relationships of one entity to another remain the same, regardless of the language a person is speaking or searching in, the Knowledge Graph information is language-agnostic, and thus easily used for aggregation and machine learning in all languages at the same time. Using the Knowledge Graph as a cornerstone for indexing is, therefore, a much more useful and efficient means for Google to access and serve information in multiple languages for consumption and ranking around the world. In the long-term, it’s far superior to the previous method of indexing.
Examples of Fraggle-based indexing in the SERPs
Google has dramatically increased the number of Knowledge Graph entries and the categories and relationships within them. The build-out is especially prominent for topics for which Google has a high amount of structured data and information already. This includes topics like:
TV and Movies — from Google Play
Food and Recipe — from Recipe Schema, recipe AMP pages, and external food and nutrition databases
Science and medicine — from trusted sources (like WebMD)
Businesses — from Google My Business.
Google is adding more and more nodes and relationships to their graph and existing entries are also being built-out with more tabs and carousels to break a single topic into smaller, more granular topics or type of information.
As you can see below, the build-out of the Knowledge Graph has also added to the number of filters and drill-down options within many queries, even outside of the Knowledge Graph. This increase can be seen throughout all of the Google properties, including Google My Business and Shopping, both of which we believe are now sections of the Knowledge Graph:
Other similar examples include the additional filters and “Related Topics” results in Google Images, which we also believe to represent nodes on the Knowledge Graph:
The Knowledge Graph is also being presented in a variety of different ways. Sometimes there’s a sticky navigation that persists at the top of the SERP, as seen in many media-oriented queries, and sometimes it’s broken up to show different information throughout the SERP, as you may have noticed in many of the local business-oriented search results, both shown below.
Since the launch of Fraggle-based indexing is essentially a major Knowledge Graph build-out, Knowledge Graph results have also begun including more engaging content which makes it even less likely that users will click through to a website. Assets like playable video and audio, live sports scores, and location-specific information such as transportation information and TV time-tables can all be accessed directly in the search results. There’s more to the story, though.
Companies that want to leverage the Knowledge Graph should take every opportunity to create their own assets, like AR models and AMP Stories, so that Google has no reason to create or source them elsewhere. Beyond that, companies should submit accurate information directly to Google whenever they can. The easiest way to do this is through Google My Business (GMB). Whatever types of information are requested in GMB should be added or uploaded. If Google Posts are available in your business category, you should be doing Posts regularly, and making sure that they link back to your site with a call to action. If you have videos or photos that are relevant for your company, upload them to GMB. Start to think of GMB as a social network or newsletter — any assets that are shared on Facebook or Twitter can also be shared on Google Posts, or at least uploaded to the GMB account.
You should also investigate the current Knowledge Graph entries that are related to your industry, and work to become associated with recognized companies or entities in that industry. This could be from links or citations on the entity websites, but it can also include being linked by third-party lists that give industry-specific advice and recommendations, such as being listed among the top competitors in your industry (“Best Plumbers in Denver,” “Best Shoe Deals on the Web,” or “Top 15 Best Reality TV Shows”). Links from these posts also help but are not required — especially if you can get your company name on enough lists with the other top players. Verify that any links or citations from authoritative third-party sites like Wikipedia, Better Business Bureau, industry directories, and lists are all pointing to live, active, relevant pages on the site, and not going through a 301 redirect.
While this is just speculation and not a proven SEO strategy, you might also want to make sure that your domain is correctly classified in Google’s records by checking the industries that it is associated with. You can do so in Google’s MarketFinder tool. Make updates or recommend new categories as necessary. Then, look into the filters and relationships that are given as part of Knowledge Graph entries and make sure you are using the topic and filter words as keywords on your site.
Featured Snippets or “Answers” first surfaced in 2014 and have also expanded quite a bit, as shown in the graph below. It is useful to think of Featured Snippets as rogue facts, ideas or concepts that don’t have a full Knowledge Graph result, though they might actually be associated with certain existing nodes on the Knowledge Graph (or they could be in the vetting process for eventual Knowledge Graph build-out).
Featured Snippets seem to surface when the information comes from a source that Google does not have an incredibly high level of trust for, like it does for Wikipedia, and often they come from third-party sites that may or may not have a monetary interest in the topic — something that makes Google want to vet the information more thoroughly and may prevent Google from using it if a less biased option is available.
Like the Knowledge Graph, Featured Snippets results have grown very rapidly in the past year or so, and have also begun to include carousels — something that Rob Bucci writes about extensively here. We believe that these carousels represent potentially related topics that Google knows about from the Knowledge Graph. Featured Snippets now look even more like mini-Knowledge Graph entries: Carousels appear to include both lateral and vertically related topics, and their appearance and maintenance seem to be driven by click volume and subsequent searches. However, this may also be influenced by aggregated engagement data for People Also Ask and Related Search data.
The build-out of Featured Snippets has been so aggressive that sometimes the answers that Google lifts are obviously wrong, as you can see in the example image below. It is also important to understand that Featured Snippet results can change from location to location and are not language-agnostic, and thus, are not translated to match the Search Language or the Phone Language settings. Google also does not hold themselves to any standard of consistency, so one Featured Snippet for one query might present an answer one way, and a similar query for the same fact could present a Featured Snippet with slightly different information. For instance, a query for “how long to boil an egg” could result in an answer that says “5 minutes” and a different query for “how to make a hard-boiled egg” could result in an answer that says “boil for 1 minute, and leave the egg in the water until it is back to room temperature.”
The data below was collected by Moz and represents an average of roughly 10,000 keywords that skews slightly towards ‘head’ terms.
SEO strategy for featured snippets
All of the standard recommendations for driving Featured Snippets apply here. This includes making sure that you keep the information that you are trying to get ranked in a Featured Snippet clear, direct, and within the recommended character count. It also includes using simple tables, ordered lists, and bullets to make the data easier to consume, as well as modeling your content after existing Featured Snippet results in your industry.
This is still speculative, but it seems likely that the inclusion of Speakable Schema markup for things like “How To,” “FAQ,” and “Q&A” may also drive Featured Snippets. These kinds of results are specially designated as content that works well in a voice-search. Since Google has been adamant that there is not more than one index, and Google is heavily focused on improving voice-results from Google Assistant devices, anything that could be a good result in the Google Assistant, and ranks well, might also have a stronger chance at ranking in a Featured Snippet.
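If you want to experiment with this, Speakable is implemented as ordinary JSON-LD markup. In the sketch below, the page name, URL, and CSS selectors are hypothetical placeholders; the SpeakableSpecification type and its cssSelector property come from schema.org:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebPage",
  "name": "How long to boil an egg",
  "speakable": {
    "@type": "SpeakableSpecification",
    "cssSelector": [".short-answer", ".steps"]
  },
  "url": "https://example.com/boil-an-egg"
}
</script>
```

The cssSelector array (or, alternatively, an xpath array) tells the assistant which sections of the page are suitable to be read aloud.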
People Also Ask & Related Searches
Finally, the increased occurrence of “Related Searches,” as well as the inclusion of People Also Ask (PAA) questions just below most Knowledge Graph and Featured Snippet results, is undeniable. The Earl Tea screenshot shows that PAAs, along with Interesting Finds, are both part of the Knowledge Graph too.
The graph below shows the steady increase in PAAs. PAA results appear to be an expansion of Featured Snippets, because once expanded, the answer to the question is displayed, with the citation below it. Similarly, some Related Search results also now include a result that looks like a Featured Snippet, instead of simply linking over to a different search result. You can now find ‘Related Searches’ throughout the SERP, often as part of a Knowledge Graph result, but sometimes also in a carousel in the middle of the SERP, and always at the bottom of the SERP — sometimes with images and expansion buttons to surface Featured Snippets within the Related Search results directly in the existing SERP.
Boxes with Related Searches are now also included with Image Search results. It’s interesting to note that Related Search results in Google Images started surfacing at the same time that Google began translating image Title Tags and Alt Tags. This fits well with the concept of Entity-First Indexing, the idea that entities and the Knowledge Graph are language-agnostic, and the idea that Related Searches are somehow tied to the Knowledge Graph.
This data was collected by Moz and represents an average of roughly 10,000 keywords that skews slightly towards ‘head’ terms.
SEO strategy for PAA and related searches
Since PAAs and some Related Searches now appear to simply include Featured Snippets, driving Featured Snippet results for your site is also a strong strategy here. It often appears that PAA results include at least two versions of the same question, re-stated with a different language, before including questions that are more related to lateral and vertical nodes on the Knowledge Graph. If you include information on your site that Google thinks is related to the topic, based on Related Searches and PAA questions, it could help make your site appear relevant and authoritative.
Finally, it is crucial to remember that you don’t have to have a website to rank in Google now, and SEOs should consider non-website rankings part of their job too.
If a business doesn’t have a website, or if you just want to cover all the bases, you can let Google host your content directly — in as many places as possible. We have seen that Google-hosted content generally seems to get preferential treatment in Google search results and Google Discover, especially when compared to the decreasing traffic from traditional organic results. Google is now heavily focused on surfacing multimedia content, so anything you might previously have created a new page on your website for should now also be considered as a candidate for video.
Google My Business (GMB) is great for companies that don’t have websites, or that want to host their websites directly with Google. YouTube is great for videos, TV, video podcasts, clips, animations, and tutorials. If you have an app, a book, an audiobook, a podcast, a movie, a TV show, a class, music, or a PWA, you can submit it directly to Google Play (much of the video content in Google Play is now cross-populated in YouTube and YouTube TV, but this is not necessarily true of the other assets). This strategy could also include books in Google Books, flights in Google Flights, hotels in Google Hotel listings, and attractions in Google Explore. It also includes having valid AMP code, since Google hosts AMP content, and Google News if your site is an approved news provider.
Changes to SEO tracking for Fraggle-based indexing
The biggest problem for SEOs is not just the missing organic traffic; it is also that current methods of tracking organic results generally don’t show whether things like Knowledge Graph results, Featured Snippets, PAAs, Found on the Web, or other types of results are appearing at the top of the SERP or somewhere above your organic result. Position one in organic results is not what it used to be, nor is anything below it, so you can’t expect those rankings to drive the same traffic. If Google is going to keep lifting and re-presenting everyone’s content, the traffic will never arrive at the site, and SEOs won’t know whether their efforts are still returning the same monetary value. This problem is especially painful for publishers, who have historically only been able to sell advertising on their websites based on the traffic the website was expected to drive.
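As a rough illustration of why “position one” no longer means the first thing a user sees, the sketch below adjusts an organic rank by the SERP features rendered above it. Both the feature names and the one-slot-per-feature weighting are assumptions made for illustration, not measured values from any rank tracker.

```python
# Illustrative only: each SERP feature is assumed to push organic
# results down by one visible slot. Real pixel offsets vary by feature.
FEATURE_SLOTS = {
    "knowledge_graph": 1,
    "featured_snippet": 1,
    "paa": 1,
    "related_searches_carousel": 1,
}

def effective_position(organic_rank, serp_features):
    """Rough 'true' slot of an organic result once SERP features above it are counted."""
    return organic_rank + sum(FEATURE_SLOTS.get(f, 1) for f in serp_features)
```

Under these assumptions, a number-one organic ranking beneath a Featured Snippet and a PAA box is effectively the third thing on the page, which is why rank reports that ignore SERP features overstate expected traffic.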
The other thing to remember is that results differ, especially on mobile: they vary from device to device (generally based on screen size), and can also vary based on the phone’s OS. They can also change significantly based on the location or language settings of the phone, and they definitely do not always match desktop results for the same query. Most SEOs don’t know much about the reality of their mobile search results because most SEO reporting tools still focus heavily on desktop results, even though Google has switched to Mobile-First Indexing.
In addition, SEO tools generally only report on rankings from one location (the location of their servers), rather than testing from the different locations where users actually search.
The best thing SEOs can do to address this problem is to use tools like the MobileMoxie SERP Test to check what rankings look like for top keywords in all the locations where their users may be searching. While the free tool only provides results for one location at a time, subscribers can test search results in multiple locations, based on a service-area radius or an uploaded CSV of addresses. The tool has integrations with Google Sheets and a connector for Data Studio to help with SEO reporting, and APIs are also available for deeper integrations in content-editing tools, dashboards, and other SEO tools.
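To make the “service-area radius” idea concrete, the sketch below generates a set of test coordinates around a business location (the center plus one point per compass bearing at the edge of the radius), at which the same keyword could then be checked. The function names are hypothetical and this is standard great-circle math, not MobileMoxie’s actual API.

```python
import math

EARTH_RADIUS_KM = 6371.0

def offset_point(lat, lng, distance_km, bearing_deg):
    """Lat/lng reached by travelling distance_km from (lat, lng) on the given bearing."""
    lat1, lng1 = math.radians(lat), math.radians(lng)
    bearing = math.radians(bearing_deg)
    d = distance_km / EARTH_RADIUS_KM  # angular distance
    lat2 = math.asin(math.sin(lat1) * math.cos(d) +
                     math.cos(lat1) * math.sin(d) * math.cos(bearing))
    lng2 = lng1 + math.atan2(math.sin(bearing) * math.sin(d) * math.cos(lat1),
                             math.cos(d) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lat2), math.degrees(lng2)

def sample_service_area(lat, lng, radius_km, spokes=8):
    """Center point plus `spokes` evenly spaced points on the service-radius edge."""
    points = [(lat, lng)]
    for i in range(spokes):
        points.append(offset_point(lat, lng, radius_km, i * 360.0 / spokes))
    return points

# Hypothetical pizza shop in Manhattan with a 10 km delivery radius
checks = sample_service_area(40.7128, -74.0060, 10)
```

Running the same keyword from each of these coordinates, rather than from a single tool-server location, is what exposes how much a local SERP actually varies across a service area.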
At MozCon 2017, I expressed my belief that the impact of Mobile-First Indexing requires a re-interpretation of the words “Mobile,” “First,” and “Indexing.” Re-defined in the context of Mobile-First Indexing, the words should be understood to mean “portable,” “preferred,” and “organization of information.” The potential of a shift to Fraggle-based indexing and the recent changes to the SERPs, especially in the past year, certainly seem to bear this theory out. And though they have been in the works for more than two years, the changes to the SERP now seem to be rolling out faster, making the SERP unrecognizable from what it was only three or four years ago.
SEOs need to consider the opportunities and change the way we view our overall indexing strategy, and our jobs as a whole. If Google is organizing the index around the Knowledge Graph, that makes it much easier for Google to constantly mention nearby nodes of the Knowledge Graph in “Related Searches” carousels, links from the Knowledge Graph, and topics in PAAs. It might also make it easier to believe that Featured Snippets are simply pieces of information being vetted (via Google’s click-crowdsourcing) for inclusion or reference in the Knowledge Graph.
Fraggles and Fraggled indexing re-frame the switch to Mobile-First Indexing, which means that SEOs and SEO tool companies need to start thinking mobile-first — that is, about the portability of their information. While it is likely that pages and domains still carry strong ranking signals, the changes in the SERP all seem to focus less on entire pages and more on pieces of pages, similar to the ones surfaced in Featured Snippets, PAAs, and some Related Searches. If Google focuses more on windowing content and being an “answer engine” instead of a “search engine,” then this fits well with their stated identity and their desire to build a more efficient, sustainable, international engine.
SEOs also need to find ways to serve their users better, by focusing more on the reality of the mobile SERP and how much it can vary for real users. While Google may not call the smallest rankable units Fraggles, that is what we call them, and we think they are critical to the future of SEO.