Google Archives - Bruce Clay, Inc.
https://www.bruceclay.com/blog/tag/google/
SEO and Internet Marketing | Mon, 01 Apr 2024 15:42:13 +0000

Google’s Explosive March Updates: What I Think
https://www.bruceclay.com/blog/googles-explosive-march-updates/
Thu, 28 Mar 2024 17:37:33 +0000

Google’s flurry of March updates created volatility for SEO. After an initial analysis, here are my thoughts on what’s going on.

On March 5, Google launched its first core algorithm update of 2024, in addition to unveiling updated spam policies and a spam update. Plus, its helpful content system was integrated into its core algorithm update system.

Google told Search Engine Land it would reduce unhelpful content in the search results by up to 40%.

In addition to sites that were negatively impacted by the core update, Google unleashed a slew of manual actions on sites that violated its most current spam policies.

Some sites were completely de-indexed. Many of these sites were likely driven by AI content.


After an initial analysis of the updates and Google’s documentation, I am going to share what’s going on with a large part of the latest updates.

The Focus Is Largely on AI Content

Are you using AI content for your website? Now is a good time to evaluate your methods (if Google hasn’t already done that for you in its latest core update).

If you sailed through this last update, does that mean you are immune? Probably not. Google’s algorithms continue to evolve, and Google’s main goal is to take out the trash.

With the rise in popularity of AI content in recent times, Google knew it had a looming problem on its hands.

I discussed at Search Engine Land a while back that AI content could create a world where the quality of an answer in the search results would be average at best — only as good as the AI would allow.

Consider that many AI content tools are connected to the web: the AI ends up reading its own generated content to come up with ideas for more generated content.

Google has faced quality problems like this before. Over the years, Google has combated many different tactics that degrade its search results.

This is just another example of something that has become “spam” – AI content. But there are still ways to do it right.

What Google Is Now Saying About AI Content

So what did Google have to say about AI content in its recent announcements?

Here, Google talks about how it is addressing the “abusive behavior” of using AI content:

To better address these techniques, we’re strengthening our policy to focus on this abusive behavior — producing content at scale to boost search ranking — whether automation, humans or a combination are involved.

There’s a lot to unpack in that one statement. Let’s break it down.

Producing Content to Rank: Is It Spam?

Google says it is targeting content produced at scale that’s intended to boost search rankings.

But isn’t all content created for SEO intended to rank?

Google has an opinion, but doesn’t go into too much detail:

There are some things you could do that are specifically meant to help search engines better discover and understand your content. Collectively, this is called “search engine optimization” or SEO, for short. Google’s own SEO guide covers best practices to consider. SEO can be a helpful activity when it is applied to people-first content, rather than search engine-first content.

It sounds like what Google is saying is that SEO is an afterthought.

Google seems to downplay SEO so that people follow its guidelines for creating helpful content first, then tack on SEO to make it perform better.

Sounds logical.

Except we all know that SEO isn’t just an add-on; we start with keyword research and trending topics to capitalize on what already performs before creating new content.

So, again, the question is: Aren’t we already producing content to boost search rankings?

Here’s what I think: Automated mass production of content with no regard for adding value is the problem.

Human-Edited AI Content: Is It Spam?

Google says it’s targeting websites with content that is totally automated, written by humans or a combination.

So AI content can be bad even when there is a human touch to it. That means if you are using AI tools, tread carefully.

AI tools are not inherently bad but abusing them is.

Here, Google talks more about the idea of “scaled content abuse”:

Examples of scaled content abuse include, but are not limited to:

  • Using generative AI tools or other similar tools to generate many pages without adding value for users
  • Scraping feeds, search results, or other content to generate many pages (including through automated transformations like synonymizing, translating, or other obfuscation techniques), where little value is provided to users
  • Stitching or combining content from different web pages without adding value
  • Creating multiple sites with the intent of hiding the scaled nature of the content
  • Creating many pages where the content makes little or no sense to a reader but contains search keywords

The first bullet is key: You can use AI tools all you want, and you can edit AI-generated content all you want, but if you’re not adding something unique – an expert perspective, personal experience, etc., then your content could be a fair target for Google enforcing its spam policies.

The third bullet elaborates: Don’t use AI tools that merely stitch together the information in the search results into a new article without adding some extra value.

It is fine to take keywords, make a list, make a unique outline, and from there devise your own content … just do not plagiarize or create generic content.

Even before the use of AI, content creators would look at the top-ranked pages as research for what they write.

However, even that has potentially caused an existential crisis for Google. Others have written about the downward spiral of the quality of the search results, and Google has been ramping up efforts to surface better content.

In a nutshell, you need to have something original, regardless of how you create the content.

The main point here is to differentiate your content. What is happening with AI content is generic content that doesn’t add anything new to the conversation.

So, What Is Content Spam Now?

Google’s job is to weed out the garbage no matter how the content is created. So it will lean heavily on its algorithms to identify what is quality content.

Google has taken many approaches to this in the past, and it will continue to evolve. In its latest iteration, Google has clarified what spam content is, and had this to say recently about AI content and new spam policies:

Our long-standing spam policy has been that use of automation, including generative AI, is spam if the primary purpose is manipulating ranking in Search results. The updated policy is in the same spirit of our previous policy and based on the same principle. It’s been expanded to account for more sophisticated scaled content creation methods where it isn’t always clear whether low quality content was created purely through automation. [Emphasis added.]

What methods might Google use to further determine quality?

Many things. Perhaps high bounce rates, poor sentiment in reviews, low site trust, a high degree of similarity to other documents, no inbound mentions and lack of website maintenance.
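Of those possible signals, similarity to other documents is the easiest to make concrete. As a rough illustration only (not Google’s actual method, which is not public), near-duplicate detection is often sketched with word shingles and Jaccard similarity:

```python
# Illustrative sketch: compare word-shingle sets with Jaccard similarity to
# quantify how similar two documents are. The sample sentences are invented.

def shingles(text, k=3):
    """Return the set of k-word shingles (overlapping word n-grams)."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets (0.0 = disjoint, 1.0 = identical)."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

original = "google rolled out its first core update of the year in march"
reworded = "google rolled out its first core update of 2024 in early march"
unrelated = "our bakery uses locally sourced flour for every sourdough loaf"

print(round(jaccard(shingles(original), shingles(reworded)), 2))   # high overlap
print(round(jaccard(shingles(original), shingles(unrelated)), 2))  # no overlap
```

A lightly reworded copy scores far above an unrelated document, which is why “copy/paste/reword” content is easy to flag at scale.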

Google also had this to say:

This will allow us to take action on more types of content with little to no value created at scale, like pages that pretend to have answers to popular searches but fail to deliver helpful content.

SEL spoke to Google, and a Google rep clarified:

What are examples of pages that pretend to have answers but fail to deliver? Tucker [a Google rep] explained that those are pages that start off by stating they will answer your question, lead you on with low-quality content, and never end up giving you the answer:

  • “Our new policy is meant to help people focus more clearly on the idea that producing content at scale is abusive if done for the purpose of manipulating search rankings and that this applies whether automation or humans are involved.”

Surviving Google’s Algorithms

We are in the midst of another shift in SEO, where we continue to define quality.

There is no room for flying under the AI radar; attempting it is a certain crash. It’s vitally important that organizations stay white hat and keep current on SEO-related changes to remain visible and relevant within search results.

Assess and Pivot

For the near future, protection is job No. 1. Navigating changes in search algorithms requires businesses to periodically reevaluate their SEO tactics.

We recommend comprehensive SEO audits to surface any current or potential threats.

Adjusting to algorithm updates, adapting to rules as they change, and abandoning manipulative tactics even if they once worked are all key parts of maintaining high rankings in search engines.

Adopt Best Practices

Adopting best practices is indispensable to top rankings. Just because something is a common SEO practice in your industry doesn’t mean it is good.

Yes, that’s a bit obvious, but if everyone followed best practices, there would not be chaos over these updates.

The zone of acceptance is like a water balloon – shifting shape as the rules change. Adhering to core principles can make you almost immune to algorithm updates.

Create Quality Content

Quality content remains key in building a lasting online presence.

By producing user-centric content that offers something unique and of value, you can build trust, authority and credibility within your niche, not only helping with algorithm changes but also strengthening your overall online presence.

Spend time thinking about and creating helpful, people-first content.

Relevant topics are key for maintaining audience engagement and producing high-quality content that resonates with audiences.

By understanding your target audience’s needs and conducting extensive research using SEO tools, you can quickly locate trending or pertinent subjects that resonate with them.

But just be sure you are adding something unique to the conversation, and it’s not a “copy/paste/reword” approach.

See my AI content beginner’s guide if you want to continue leveraging AI’s benefits in your content creation.

Track Progress

Taking a set-it-and-forget-it approach to publishing content will harm you in the end.

As Google releases core updates, you need to understand how your site is faring after the dust settles.

You must use analytics tools to see what is working. Monitor progress, identify issues and pivot as needed.

Have you been impacted by Google’s March updates? Our SEO experts can help you get your SEO program back on track. Reach out to us today.

FAQ: How can businesses’ search rankings survive Google’s explosive March core and spam updates?

Google’s constant algorithm updates can wreak havoc on a business’s search rankings. But there are strategic tactics you can implement to ensure your website survives the explosive March core and spam updates (and any future update). Let’s go over them:

Understand Google’s updates: Familiarize yourself with the specific changes introduced in the March core and spam updates. Stay updated on the latest algorithm adjustments to align your SEO strategy accordingly.

Enhance website content: Craft high-quality, engaging content that caters to your target audience’s needs. Provide valuable information, incorporate relevant keywords (including variants and stemmed versions) and consider search intent to rank higher.

Focus on site speed: Google prioritizes fast-loading pages, as user experience is paramount. Optimize your website speed by compressing images, minifying code and leveraging caching techniques.
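To make the compression point concrete, here is a small Python sketch (the markup is invented) comparing raw and gzip-compressed sizes of a repetitive HTML fragment; the same principle is behind enabling gzip or Brotli compression on your server:

```python
# Compare raw vs. gzip-compressed byte sizes for a chunk of repetitive HTML
# to see why compressing text assets matters. The markup is made up.
import gzip

html = ("<div class='product-card'><h3>Widget</h3>"
        "<p>Great widget for widget lovers.</p></div>") * 200

raw_bytes = len(html.encode("utf-8"))
gz_bytes = len(gzip.compress(html.encode("utf-8")))

print(f"raw: {raw_bytes} bytes, gzipped: {gz_bytes} bytes")
print(f"savings: {100 * (1 - gz_bytes / raw_bytes):.0f}%")
```

Repetitive markup compresses extremely well, so even minimal server configuration changes can cut transfer sizes dramatically.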

Improve mobile experience: With mobile devices dominating search, ensure your website is mobile-friendly and responsive. Design intuitive navigation, optimize images and make sure your content is easily readable on smaller screens.

Build authoritative backlinks: High-quality backlinks from reputable websites signal trustworthiness to Google. Implement a robust link-building strategy, focusing on acquiring links from relevant and influential sources.

Optimize on-page elements: Pay attention to on-page elements like meta tags, title tags, headers and URLs. Include primary and related keywords naturally, providing search engines with clear signals about your content.
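If you want to sanity-check those on-page elements programmatically, a small standard-library script can pull the title and meta description and flag lengths that commonly get truncated in search results. The 60- and 160-character thresholds are rough community conventions, not Google rules, and the sample page is invented:

```python
# Minimal on-page audit sketch: extract <title> and meta description from an
# HTML document using only the standard library, then check common lengths.
from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

page = """<html><head>
<title>Blue Widgets | Acme Widget Co.</title>
<meta name="description" content="Compare Acme's blue widgets by size and price.">
</head><body><h1>Blue Widgets</h1></body></html>"""

audit = OnPageAudit()
audit.feed(page)
print("title ok:", len(audit.title) <= 60)
print("description ok:", len(audit.description) <= 160)
```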

Leverage social media: Engage with your audience on relevant social media platforms. Sharing and promoting your content can increase visibility, reach and potentially attract natural backlinks.

Monitor and analyze performance: Utilize web analytics tools to track your website’s performance and make data-driven decisions. Regularly monitor keyword rankings, organic traffic and user behavior to identify areas for improvement.
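As a simple illustration of tracking keyword rankings, the sketch below diffs two position snapshots and lists the biggest movers; the keywords and positions are invented:

```python
# Diff two ranking snapshots (keyword -> search position) and sort by change.
# A positive delta means improvement (the page moved closer to position 1).

def ranking_deltas(before, after):
    """Return (keyword, delta) pairs for shared keywords, worst movers first."""
    return sorted(
        ((kw, before[kw] - after[kw]) for kw in before.keys() & after.keys()),
        key=lambda pair: pair[1],
    )

before = {"seo audit": 8, "core update": 3, "ai content": 12}
after = {"seo audit": 5, "core update": 9, "ai content": 11}

for keyword, delta in ranking_deltas(before, after):
    print(f"{keyword}: {delta:+d}")
```

In practice the snapshots would come from a rank tracker or Search Console export, but the diff logic is the same.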

Businesses need to be proactive and adaptable to survive Google updates. Implementing these tips will help safeguard your search rankings and maintain visibility.

Step-by-Step Procedure:

  1. Stay up-to-date with Google’s algorithm updates, including the March core and spam updates.
  2. Analyze your website’s content and improve its quality, relevance and optimization for targeted keywords.
  3. Conduct keyword research to identify primary and related keywords to incorporate throughout your website.
  4. Evaluate your website’s speed and optimize it by compressing images, reducing server response time and leveraging caching techniques.
  5. Ensure your website is mobile-friendly and responsive, providing a seamless experience across all devices.
  6. Implement a comprehensive link-building strategy to acquire high-quality backlinks from authoritative websites.
  7. Optimize on-page elements such as meta tags, title tags, headers and URLs, incorporating target keywords effectively.
  8. Engage with your audience on social media platforms, sharing and promoting your content to increase visibility and attract natural backlinks.
  9. Monitor your website’s performance using web analytics tools, tracking keyword rankings, organic traffic and user behavior.
  10. Analyze the data to identify areas for improvement and make necessary adjustments to your SEO strategy.
  11. Continuously monitor and stay updated on the latest trends and best practices in SEO.
  12. Regularly audit your website for technical SEO issues, such as broken links, duplicate content, or improper redirects, and resolve them promptly.
  13. Stay proactive in optimizing your website’s loading speed by regularly monitoring and optimizing page elements.
  14. Constantly evaluate and adjust your content strategy to match search intent and deliver valuable information to your target audience.
  15. Stay engaged in your industry by participating in relevant forums, guest blogging, or collaborating with influencers to increase your online presence.
  16. Consider hiring a reputable SEO agency or consultant for professional guidance and support.

Remember, surviving Google’s updates requires continuous effort and adaptation. Follow these steps to ensure your search rankings remain strong amidst algorithm changes.

What Are Google’s Top Ranking Factors?
https://www.bruceclay.com/blog/what-are-googles-top-ranking-factors/
Wed, 22 Nov 2023 18:20:36 +0000

Discover the profound impact of content quality on website rankings. Learn expert strategies to optimize your content for enhanced search engine visibility and user engagement.


Back in 2016, a Google engineer famously said that Google’s top three ranking factors were content, links and RankBrain. However, this was later disputed by more than one Googler.

Sure, those three factors are likely a huge part of how Google determines rankings, but that’s not an exhaustive list.

It is wrong to think that the algorithm is made up only of these three factors and that each of them carries one-third weight for each query. That’s because the factors in Google’s algorithm change from query to query.

That said, I’ll spend the rest of this article shedding more light on what I believe are the top factors in how your website ranks.

Content

If I were to choose the No. 1 spot among the top ranking factors, it would be content. Content is the fabric of the web. Without content, Google’s search results simply wouldn’t exist.

Now, any website can have content, but quality content is how you rank. That’s because the content you create is important to the efficacy of Google’s search results.

Google wants its users to have a good experience. To do that, the websites featured in its search results must offer good answers to their users’ queries.

To ensure that it features the best content in its search results, Google has created things like its Search Quality Evaluator Guidelines and the helpful content system.

Technical Factors

If your site is not crawlable and/or doesn’t perform well, it will likely not do well in the search results.

One of the most important things you can do is make sure your site is optimized from the ground up so that search engines can access, crawl and understand it with ease. And it must provide visitors with a good user experience.

Case in point: Google’s Gary Illyes once said:

“I really wish SEOs went back to the basics (i.e. MAKE THAT DAMN SITE CRAWLABLE) instead of focusing on silly updates and made-up terms by the rank trackers and that they talked more with the developers of the website once done with the first part of this sentence.”
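Crawlability starts with robots.txt. As a quick sketch, Python’s standard library can check whether a given path is blocked; the rules below are a made-up example, not any real site’s file:

```python
# Check which URLs a crawler may fetch under a robots.txt ruleset, using the
# standard library. The rules and URLs here are invented for illustration.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /search
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))    # True
print(parser.can_fetch("Googlebot", "https://example.com/admin/login"))  # False
```

A check like this belongs in any basic technical audit: an accidental Disallow can silently remove whole sections of a site from the index.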

In Google’s Search Quality Evaluator Guidelines, it states that unmaintained sites are low quality:

… unmaintained/abandoned “old” websites or unmaintained and inaccurate/misleading content is a reason for a low Page Quality rating.

In 2021, Google rolled out its page experience update, outlining various technical factors that should be followed to ensure a good user experience. If you want to be rewarded with better ranking, optimize your site to address these signals.


Links

If content is the fabric of the web, links are the strings that tie it together. Ever since PageRank, links have been a significant way search engines determine rankings. Why? Links have always served as a “vote” from one website for another.

Even though Google says links have less impact now than they used to, they are still important.

However, not all links are created equal. Google doesn’t give every link to your site an equal vote. In fact, some links can even result in a negative impact on your website’s ability to rank.

Today, links are no longer a numbers game. Even though Backlinko research shows that the No. 1 result in Google has an average of 3.8 times more backlinks than positions No. 2 to 10, we have seen sites with fewer but higher quality links outrank sites with more.

To get links right, you want to focus on link earning rather than link building and the quality and relevance of links versus the quantity. Link earning first starts with creating excellent content and a trusted site so that it naturally earns the links it deserves.

In 2020, Google’s John Mueller said links were “definitely not the most important SEO factor.” What does that mean? Google wants people to focus on making a great site first and not worry too much about building tons of links.


RankBrain

RankBrain is probably one of the most misunderstood ranking factors. RankBrain is a machine learning component of Google’s algorithm. It uses multiple data points at the time of a search to help the search engine better interpret the intent of search queries and serve the most relevant search results.

With RankBrain, Google can go beyond factors like the quality of the content or the links to a webpage to help identify the very best answer to a search.

So to survive RankBrain, you have a lot of work to do to ensure that you are creating the type of content that satisfies the query behind your keywords. If you don’t, you risk RankBrain deciding that your content is not relevant to a search, and you can lose rankings.


There are countless factors that go into ranking a webpage, video, or image. And those factors change based on the search query. Understanding the top factors, though, is an important first step in knowing how search works.

Unlock your website’s potential with our expert SEO strategies tailored to Google’s dynamic ranking factors—let’s boost your visibility and climb the search results together. Talk to us today.

FAQ: How can I increase the ranking of my website with Google using its top ranking factors?

Today’s digital world necessitates an effective online presence for any business to remain viable, and Google’s top ranking factors play a pivotal role in your website’s SERP ranking. So how can businesses use these factors to their benefit? Let’s go over them.

First, it’s essential to recognize that Google uses a sophisticated algorithm to rank websites. Although this algorithm incorporates numerous factors, here are a few primary ones to focus on:

  1. Content Quality: Producing high-quality, engaging and relevant content is of utmost importance; Google rewards websites that provide valuable information to their visitors. Invest time in producing engaging articles, blog posts, videos and other forms of media.
  2. Backlinks: An authoritative backlink network can make all the difference for improving website rankings. Partner with prominent industry websites when possible and prioritize quality over quantity when it comes to backlinks.
  3. Mobile Responsiveness: With mobile-first indexing, a mobile-responsive website is essential. Google prioritizes websites that deliver a seamless user experience on all devices, so optimize both design and functionality for mobile visitors.
  4. Page Loading Speed: Slow websites can lead to poor user experiences and reduce Google ranking, so make sure yours loads quickly by optimizing images, minifying code and employing caching strategies.
  5. User Experience: Google prioritizes websites that focus on user experience. Focusing on elements like easy navigation, intuitive design and clear calls-to-action will not only increase rankings but will also foster engagement and conversions.

Moving forward to enhance your website’s ranking:

  1. Leverage Keywords Precisely: Conduct keyword research to identify relevant search terms, then incorporate these naturally into the content, meta tags and headings on your website.
  2. Optimize meta tags: Create attractive titles and descriptions that accurately capture your webpage content while encouraging viewers to visit it.
  3. Take Advantage of Header Tags: Header tags can help structure and simplify your content for users as well as search engines alike, making it more readable for both audiences.
  4. Include alt text with images: Alt text can help search engines understand your images more readily, making your website both accessible and search-engine-friendly.
  5. Regularly update content: Stagnant websites tend not to rank well; to stay ahead, ensure yours remains fresh by regularly publishing articles, blog posts or product updates.
  6. Leverage Social Media: Utilize social media channels such as Twitter and Facebook to increase exposure, drive visitors directly to your website, and potentially secure backlinks from outside sources.
  7. Monitor and Assess Website Performance: Utilizing tools like Google Analytics can be useful in monitoring website performance, user activity and traffic sources — providing valuable data that reveals areas requiring improvement and potential problem areas.
  8. Engage Your Audience: Respond to user comments, queries and feedback; building those relationships is vital to maintaining your website’s visibility and reputation.
  9. Increase Website Security: Search engines prefer secure websites. Use SSL encryption and apply software updates regularly to protect your site against malware attacks.
  10. Optimize for local search: If your business has a physical presence, pay special attention to local SEO factors like Google My Business listings, customer reviews and local keywords.
  11. Generate an XML Sitemap: Generating an XML sitemap helps search engines understand and index your website more effectively.
  12. Stay Abreast of Your Competitors: Keep tabs on what strategies, keywords and tactics your competitors employ in order to increase the ranking of their website.
  13. Create an effective internal linking structure: Internal links enable search engines to navigate your website more effectively while also helping establish its hierarchy, dispersing authority more evenly among pages and improving user navigation.
  14. Strive for positive user experience: Prioritize providing visitors with an engaging user experience on your website by prioritizing fast loading pages, intuitive interfaces and fast access to pertinent data.
  15. Stay current: With search engine algorithms ever evolving, staying informed on latest trends, updates and best practices is key for maintaining and improving the ranking of your website.
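For the XML sitemap step above, a minimal generator can be sketched with the standard library. The URLs are placeholders, and real sitemaps can also carry lastmod and changefreq elements per the sitemaps.org protocol:

```python
# Bare-bones XML sitemap generator using the standard library. The URLs are
# placeholders; a production sitemap would be written to a file and linked
# from robots.txt or submitted in Search Console.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    "https://example.com/",
    "https://example.com/blog/",
])
print(sitemap)
```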

Utilizing Google’s top ranking factors is the cornerstone of improving website rankings. Apply these expert strategies, deliver valuable content with exceptional user experiences and watch as your rankings soar higher and higher on search engine result pages.

An Up-to-Date History of Google Algorithm Updates
https://www.bruceclay.com/blog/google-algorithm-updates/
Thu, 09 Nov 2023 18:00:42 +0000

The only constant is change — especially when it comes to Google algorithm updates. We’ve collected all the most notable updates from Google since 2000. With this history, we provide guidance on what to do once an update hits.

Algorithm update: A change in the search engine’s ranking formulas that may or may not cause noticeable seismic shifts in which webpages appear at the top of search results, but which is meant to improve the quality of results overall.


An SEO Perspective on Algo Fluctuations

The only thing that’s constant in search engine optimization is change. In one year (2020), for example, Google reported running 600,000+ experiments that resulted in more than 4,500 improvements to Search. That’s a lot of volatility, folks.

Here is our running list of the notable confirmed and major unconfirmed algorithm updates of all time. Below the list, we also explain how to watch for algorithm updates and what to do if you think your site has been impacted.

The following are the major updates, in our view, of all time — the ones that have shaped the face of search and SEO.

If you want to jump to a particular year, be my guest:
2023 | 2022 | 2021 | 2020 | 2019 | 2018 | 2017 | 2016 | 2015 | 2014 | 2013 | 2012 | 2011 | 2010 | 2009 | Pre-2009

Google Algorithm Updates by Year

2023 ALGORITHM UPDATES

November 2023 – Core Update

In a rare development, Google announced its second major core update in as many months on November 2. The rollout is still ongoing and expected to be completed within the next few weeks.

October 2023 – Core Update

Google completed the third core update of the year on October 19.

October 2023 – Spam Update

The spam update began October 4 and ended October 19. This update addressed the increase in spam being reported by users when searching in their native languages.

September 2023 – Helpful Content System Update

The helpful content system update finished rollout on September 28. Google updated its guidance on AI-generated content, shifting its attitude to be consistent with generating helpful content. Google also revised guidelines on hosting third-party content. It now recommends blocking this content from being indexed if it is unrelated to the website’s main purpose. Learn more about the helpful content system here.

August 2023 – Core Update

The second core update of 2023 began on August 22 and completed September 7. See how it compared to previous updates.

May 2023 – Topic Authority System

Google announced its topic authority system as a way to “better surface relevant, expert, and knowledgeable content” in news queries. This system is not new — Google has been using it for several years, but they are discussing it now to bring more transparency into how ranking works.

April 2023 – Reviews Update

Google updated the name of this update from “product reviews” to just “reviews.” It now evaluates reviews of services, businesses, media — any topic that is reviewable. Google revised the language in its guidance documentation to apply to reviews of all kinds. This update was completed April 25. Learn more here.

March 2023 – Core Update

Google rolled out this update on March 15. It was the first of 2023 and finished rolling out March 28. This update caused more volatility than previous ones.

February 2023 – Product Reviews Update

Google’s product reviews update promotes review content that is of higher quality than the typical standard review information found online. The goal of this update is to “provide users with content that provides insightful analysis and original research, content written by experts or enthusiasts who know the topic well,” according to Google. This update added support for several new languages, including English, Spanish, German, French and Italian. Google completed the rollout on March 7.

2022 ALGORITHM UPDATES

December 2022 – Link Spam Update

Hot on the heels of the helpful content system update, Google launched a link spam update on December 14. Google is targeting spammy links with SpamBrain, its AI-based spam-prevention system. Learn more about this update here.

December 2022 – Helpful Content System Update

This is the first update to Google’s helpful content system since its launch in August 2022. It began rolling out December 6, adding new signals and updating it worldwide to include all languages.

October 2022 – Spam Update

On October 19, Google quietly rolled out a spam update to improve detection of search spam. The update finished on October 21. Read more about Google’s spam update here.

September 2022 – Product Reviews Update

Google completed its second product reviews update of 2022 on September 26. Like the first update in July, this update rewards review-based content that is helpful and informative to shoppers.

September 2022 – Core Update

Google began rolling out its second core update of the year on September 12. It was completed two weeks later on September 26. This update appeared to have less of an impact than previous core updates.

August 2022 – Helpful Content Update

Google announced the helpful content update on August 18. This new signal rewards content that Google believes is helpful and informative, rather than content that is purely meant to rank well in search engines. The update rolled out August 25 and was completed on September 9.

Check out our video What to know about the Helpful Content Update to get Bruce’s take on the new signal.

July 2022 – Product Reviews Update

The Product Reviews Update was announced on July 27. While it was expected to take several weeks to roll out, it only took six days, fully rolling out on August 2. This update promotes quality review content to help shoppers learn more about products before purchasing. See Google’s help document on how to write high quality product reviews.

May 2022 – Core Update

This algo update acknowledged by Google caused some noticeable ranking fluctuations, more than some other recent core updates. It rolled out starting May 25 and finished June 9, 2022.

February 2022 – Page Experience Update for Desktop

Core Web Vitals and the page experience ranking factor applied to desktop as well as mobile as of this update, which finished rolling out on March 3, 2022.

2021 ALGORITHM UPDATES

June 2021 – Page Experience Update

Rolled out between mid-June and early September 2021, this update had less impact on rankings than Google’s core updates. Yet despite being somewhat downplayed in the SEO community, Google’s John Mueller confirmed it was more than simply a “tie-breaker” when ranking webpages. He also noted that webmasters who downplay the update may be downplaying the impact these ranking factors have on users.

What it did was create a new page experience ranking factor combining at least six signals related to how a webpage performs for mobile users. (Note that though it initially applied to mobile ranking only, Google rolled out the page experience update for desktop as well in February 2022.)

Core Web Vitals, the three performance metrics introduced with this update, measure loading speed (Largest Contentful Paint), interactivity (First Input Delay), and visual stability (Cumulative Layout Shift). Websites that meet certain performance thresholds can gain some competitive advantage and also improve their site’s user experience.

While improving page experience factors is often technical back-end work, top sites continue to improve their scores. So especially if you’re in a competitive industry, I highly recommend you check your site and get started.

For more details, see our comprehensive e-book Google’s Page Experience Update: A Complete Guide and these resources:

June 2021 – Core Update

In June 2021, Google released a core update and announced another one would be coming the following month in July.

Google told Search Engine Land that the reason the rollout was broken into two phases was that not all of the planned improvements for the June 2021 update were ready. So Google decided to release the parts that were ready and push out the rest the following month.

Many in the industry felt that this was a big update, according to a roundup of data published at Search Engine Land. Subsequently, many felt like the link spam update in July–August 2021 did not have as big of an impact, which is why we aren’t listing it separately here.

Was the June core update related to “your money or your life” webpages? Some thought so.

Google released a blog post that coincided with the June core update, stating that: “core updates are designed to increase the overall relevancy of our search results. In terms of the traffic we send, it’s largely a net exchange. Some content might do less well, but other content gains.”

February 2021 – Passage Ranking

On February 11, Google announced it had launched passage ranking for U.S. queries in English.

Passage ranking helps Google Search better choose and rank the most relevant webpages with passages (like blocks of text) that answer very specific queries.

Before, the search engine may have ranked articles that gave general information on the query; now, Google can find and rank the pages that best answer it, even if the answer sits within a single block of text on the webpage.

Google said this about passage ranking:

Very specific searches can be the hardest to get right, since sometimes the single sentence that answers your question might be buried deep in a web page. We’ve recently made a breakthrough in ranking and are now able to better understand the relevancy of specific passages. By understanding passages in addition to the relevancy of the overall page, we can find that needle-in-a-haystack information you’re looking for.

Google stated that when fully rolled out globally, passage ranking would impact 7% of search queries.

2020 ALGORITHM UPDATES

December 2020 – Core Update

In early December 2020, Google released a new broad core update. Many industry commentators stated it was a big core update — one of the largest yet — with many sites seeing extreme traffic gains and losses.

As with previous broad core updates, there was no specific ranking factor targeted; rather, broad core updates are an update to how sites are evaluated.

While many webmasters were anticipating this update as a way to recover from losses from the May 2020 update, many were also concerned about the timing of this update, as it occurred during the holiday period.

You can check out an analysis of this core update at Search Engine Land: “Google’s December 2020 core update was big, even bigger than May 2020, say data providers.”

November 2020 – Subtopics Ranking

Some websites may have experienced ranking changes on or around mid-November, and the subtopics ranking change may have been the reason.

Google did not announce this algorithm update (but did discuss it back in October). Google’s Danny Sullivan later confirmed in 2021 that the ranking change went live in November 2020.

It’s worth noting that subtopics ranking is a new feature of the algorithm, as opposed to an update of existing processes.

Google discussed subtopics ranking in October 2020, saying the following:

We’ve applied neural nets to understand subtopics around an interest, which helps deliver a greater diversity of content when you search for something broad. As an example, if you search for “home exercise equipment,” we can now understand relevant subtopics, such as budget equipment, premium picks, or small space ideas, and show a wider range of content for you on the search results page. We’ll start rolling this out by the end of this year.

In other words, the subtopics ranking feature is designed to help Google understand how subtopics relate to a query.

As another example, if someone were to search for “SEO,” Google can now understand relevant subtopics such as agencies, conferences, tools, and Google algorithm updates. With this information, it can then show wider-ranging content in the search engine results pages.

For more, see the Search Engine Land article: “Google launched subtopics ranking in mid-November.”

May 2020 – Core Update

Google announced a broad core update via Twitter. This update took approximately two weeks to fully roll out.

Many felt that the May update was significant, even for a core update, with many sites seeing significant losses or gains in traffic. Many algorithm-tracking tools registered extreme volatility.

A core update, according to Google, is a broad algorithm update that does not target specific types of queries or pages. Instead, the update is about improving how the search engine assesses content overall to make results more relevant. We are told to compare a core update to refreshing a list of top 100 movies that you made a few years back. Naturally, new items would appear in your list today, and other titles on your list would shift up or down.

Consequently, Google says that “pages that drop after a core update don’t have anything wrong to fix.” And some pages that were “previously under-rewarded” will rank higher.

Moz gives an analysis of the winners and losers of this core update here.

January 2020 – Featured Snippets Update

Google confirmed that results shown in featured snippets would no longer be repeated on the first page of search results. Previously, a featured snippet could be found in “Position 0” as well as one of the top organic listings on the page.

2019 ALGORITHM UPDATES

October 2019 – BERT

Google announced BERT — its deep learning algorithm also known internally at Google as DeepRank — on October 25, 2019. BERT impacts conversational search queries by helping Google to better understand context, including how words like “for” and “to” change the meaning of a search. Google later confirmed there was nothing specific to optimize for, and that BERT affects nearly every search performed.

September 2019 Core Update

Google announced this broad core algorithm update ahead of time. The industry weighed in after it rolled out, and many hypothesized it targeted link spam.

June 2019 Core Update & Site Diversity Update

Google pre-announced this update, which seemed to focus on correcting the way the algorithm evaluated links. It put more weight on the domain authority and trustworthiness of a site’s incoming links.

In my opinion, the “trustworthiness” component of the E-A-T factors rose in importance. And it spilled beyond SEO to include sentiment detection. Note that this firmly places the blame for a ranking loss on marketing in general … No wonder Google has said that there is nothing sites can do specifically to “fix” their rankings after a core update runs. Mess with user sentiment, link sentiment, and quality, and prepare to die.

The search engine simultaneously released a separate site diversity update. The stated goal was to make search results more diverse. For most queries, users no longer see a domain appear more than twice in the top-ranked results, making it harder for a site to “saturate” a topic organically.

Hazards: Sites with too low a percentage of backlinks from trusted websites may have dropped. Those that used to have many pages ranking for a single query also lost some SERP real estate.

Winners: Pages may have risen that were previously under-rewarded. (Aren’t we all?)

March 2019 Core Update

Google confirmed this update, which seemed to fine-tune broad core algorithm changes of the past. Data showed the majority of the websites that were impacted were also impacted by the March 2018 core update and the August 2018 “Medic” update. To prevent naming confusion, Google tweeted the update’s name the same day it was released.

However, the losers weren’t impacted as much as the winners. Research found that sites whose search traffic increased experienced higher rankings site-wide, with no increase in the number of keywords ranked.

On the flip side, the update hurt many sites that provide a poor user experience (due to excessive pop-ups, poor navigation, over-optimization, and so forth).

And, of course, trust was a significant signal. Sites dealing with YMYL (Your Money or Your Life) topics took ranking hits if they were participating in untrusted activities.

Hazards: Google’s March 2019 Core Update behaved like an evolution of previous algorithms, negatively affecting sites that were over-optimized.

Winners: Reputable sites ranking for queries related to particularly sensitive topics, like those about health questions, benefited. According to SearchMetrics, “websites with a strong brand profile and a broad topical focus” also advanced. See, trust matters.

2018 ALGORITHM UPDATES

September 2018 – Small Update

Google confirmed there was a small update made, and reiterated that it wasn’t anything major.

August 2018 Core Update – Medic

Google announced the release of this broad core algorithm update, coined by the industry as “Medic.” Google’s advice? Be more relevant for searches — and not just page by page, but as a whole site. More advice stated that small tweaks to content may not suffice and that holistic changes to the site may be required.

I believe the Medic update was a significant step focused on trust (part of Google’s “E-A-T” quality factors). In my opinion, trusted sites that were interlinked with untrusted sites were penalized.

Learn more:

July 2018 – Speed Update

Fast performance creates a better user experience for searchers clicking on results. After alerting website owners months in advance, Google announced its Speed Update on July 9, 2018. Previously, page load speed factored into only desktop search results. Since this update, slow performance on mobile devices can hurt a site’s mobile rankings.

April 2018 – Broad Core Update

Google confirmed this broad core update. As is the case with all broad core updates, Google indicates that there’s nothing specific to do. However, as indicated in a later confirmation of the March core update (below), it may have been around relevance.

March 2018 – Broad Core Update

Google confirmed this broad core update and reminded webmasters to continue building great content. Later, Google explained that most of its updates are around relevance, not how good or bad a site is.

2017 ALGORITHM UPDATES

December 2017 – Several Updates

Google responded to the SEO community, which had dubbed December’s ranking fluctuations “Maccabees.” The search engine said it wasn’t a single update but several minor improvements, and that there weren’t any major changes as a result.

March 2017 – Fred

Google confirmed a string of updates with the caveat that it makes three updates per day on average. When asked if the updates had a name, Google jokingly said all updates would be called Fred, and the name stuck. Fred was wide-reaching and focused on quality across a variety of factors, not just a single process.

One specific target was sites using aggressive monetization tactics that provided a bad user experience. Poor-quality links were also targeted by Fred. Link to an untrusted site, and it lowers your trust … and rankings.

Hazards: Sites with thin, affiliate-heavy, or ad-centered content were targeted.

Winners: Many websites featuring quality, high-value content with minimal ads benefited.

January 2017 – Interstitial Updates

Google pre-announced that it would make updates in January 2017 impacting sites with pop-ups that created a poor user experience. It confirmed the algorithm update on January 10, 2017.

2016 ALGORITHM UPDATES

September 2016 – Penguin Integrated into Core Algorithm

Google confirmed that its webspam algorithm dubbed “Penguin” was rolled into the core algorithm. That meant that instead of manual refreshes, it would now work in real-time. Penguin would now devalue spam links instead of demoting whole sites. It also became more granular in how it worked.

Rather than demoting a site for having bad inbound links, the new Penguin tried to just do away with link spam.

Now, if a site has inbound links from known spam sites, Google just devalues (ignores) the links.

However, if a site’s backlink profile is bad enough, Google may still apply a manual action for unnatural links to the site. John Mueller has also said that an algorithmic penalty can still occur if a site has “a bunch of really bad links” (h/t Marie Haynes).

Friendlier Penguin has not proven to be 100% effective. As a result, many businesses still need help cleaning up their link profiles to restore lost rankings. Google has said that you should not need to disavow links, yet it still accepts disavow files. To me, that is a very clear signal that we should not rely on the algorithm alone when it comes to backlinks.
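For reference, a disavow file is a plain UTF-8 text file uploaded through Search Console, per Google’s documented format. A minimal sketch (the domains and URL here are placeholders):

```text
# Lines starting with "#" are comments and are ignored.
# Disavow every link from an entire domain:
domain:spammy-links.example.com
# Disavow links from a single page:
https://another-site.example.com/paid-links-page.html
```

Google cautions that the tool is intended for sites with a manual action or a clearly unnatural link profile, not routine link hygiene.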

Hazards: Sites that had purchased links were targets, as well as those with spammy or irrelevant links, or incoming links with over-optimized anchor text.

Winners: Sites with mostly natural inbound links from relevant webpages got to rise in the SERPs.

See the detailed history of Penguin in the Penguin Algorithm Updates section.

September 1, 2016 – Possum

Possum is an unconfirmed yet widely documented Google algorithm update. This update targeted the local pack. Unlike confirmed updates, the details of the Possum update are a bit less clear. SEOs believe it sought to bring more variety into local SERPs and help increase the visibility of local companies.

With this update, Google seemed to change how it filtered duplicate listings. Before Possum, Google omitted results as duplicates if they shared the same phone number or website. With Possum, Google filtered listings that shared the same address. This created intense competition between neighboring or location-sharing businesses.

Hazards: Businesses with intense competition in their target location could be pushed out of the local results.

Winners: Businesses outside physical city limits had a chance to appear in local listings.

May 2016 – Mobile Update

Google announced ahead of time that it would increase the mobile-friendliness ranking signal in May. Google confirmed the rollout was completed on May 12.

Learn more: Mobile Friendly SEO Ranking Boost Gets Boosted in May

January 2016 – Panda Integrated into Core Algorithm

Google revealed that the Panda algorithm targeting quality was a part of the core algorithm. It was not clear when this happened exactly. Google also confirmed that even though it was part of the core algorithm, it did not operate in real-time.

This significant update is detailed below in the Panda Updates section.

Learn more: Google Explains What It Means To Be Part Of The “Core” Algorithm

2015 ALGORITHM UPDATES

October 2015 – RankBrain

Google revealed to Bloomberg that RankBrain — Google’s artificial intelligence system — was one of its top three ranking signals. Reportedly, 15 percent of queries per day have never been seen by Google before. Initially, RankBrain helped interpret those queries, but it may now be involved in every query. It’s also possible with RankBrain that searchers’ engagement with the search results is a factor in how it determines the relevancy of a result.

According to Google’s Gary Illyes on Reddit,

RankBrain is a PR-sexy machine learning ranking component that uses historical search data to predict what would a user most likely click on for a previously unseen query. It is a really cool piece of engineering that saved our butts countless times whenever traditional algos were like, e.g. “oh look a “not” in the query string! let’s ignore the hell out of it!”, but it’s generally just relying on (sometimes) months old data about what happened on the results page itself, not on the landing page. Dwell time, CTR, … those are generally made up crap. Search is much more simple than people think.

OK, if it changes the target, then it is always right. If you change the results to match a user-intent profile, then in the future, all clicks would match that profile since that is all there is. Are all searches for a particular keyword always informational? RankBrain may think so and push out ecommerce sites from the results. Fortunately, it is often correct.

Hazards: No specific losers, although sites won’t be found relevant that have shallow content, poor UX, or unfocused subject matter.

Winners: Sites creating niche content and focusing on keyword intent have a better chance of ranking.

Learn more:

July 2015 – Panda Update 4.2

For more information on this update, please see the Panda section below.

May 2015 – Quality Update to Core Algorithm

Google confirmed a change to its algorithm (although not right away) on how it processed quality signals. Google stated that the update wasn’t intended to target any particular sites or class of sites, but was an update to the overall ranking algorithm itself.

April 2015 – Mobile-Friendly Update (“Mobilegeddon”)

Google announced ahead of time in February, and then confirmed in April, that its mobile-friendly update was rolling out to boost the rankings of mobile-friendly pages. The update established mobile-friendliness as a ranking signal and laid the foundation for the way Google’s mobile-first search works today. It was fun watching ecommerce sites try to fit 700 navigation links into a mobile menu. (Side note: There are better ways to handle mobile navigation.)

Hazards: Sites without a mobile-friendly version of the page, or with poor mobile usability, suffered.

Winners: Responsive sites and pages with an existing mobile-friendly version benefited.
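One baseline requirement for passing as mobile-friendly was (and remains) a viewport declaration, so the page scales to the device width instead of rendering at desktop size:

```html
<!-- Tells mobile browsers to render at the device's width rather than a
     default desktop width, so text is readable without pinch-zooming. -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```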

2014 ALGORITHM UPDATES

October 2014 – Penguin Update 3.0

For more information on this update, please see the Penguin section below.

September 2014 – Panda Update 4.1

For more information on this update, please see the Panda section below.

July 2014 – “Pigeon” Local Search Algorithm Update

The update dubbed Pigeon shook up the local organic results in Google Web and Map searches.

Google told Search Engine Land that it had made an update to its local ranking signals to provide better results for users. Dubbed “Pigeon,” the update improved distance and location ranking parameters (“near me”) and incorporated more of the ranking signals used in Google’s main web search algorithms. Google said it probably wouldn’t detail any more changes to the local search algorithm in the future.

Hazards: Local businesses with poor on- and off-page SEO suffered.

Winners: Local companies with accurate NAP information and other SEO factors in place gained rankings.

Learn more: How Do I Rank Higher in Google Local Search? Bruce Clay’s Checklist for Local SEOs

June 2014 – Payday Loan Algorithm Update (3.0)

Google confirmed that the third iteration of the “Payday Loan” algorithm that targets heavily spammed queries was rolling out.

May 2014 – Payday Loan Algorithm Update (2.0)

Google confirmed the second iteration of the “Payday Loan” algorithm, impacting about 0.2% of English queries.

May 2014 – Panda Update 4.0

For more information on this update, please see the Panda section below.

February 2014 – Page Layout Algorithm Refresh

Google announced a refresh of its page layout algorithm. Its impact was not given.

2013 ALGORITHM UPDATES

October 2013 – Penguin Update 2.1

For more information on this update, please see the Penguin Update section below.

August 2013 – Hummingbird

Hummingbird was announced in September 2013, although it had been live since August. It was essentially a complete overhaul of Google’s algorithm (not an added-on update) and marked the beginning of semantic search as we know it today.

Google needed a way to better understand the user intent behind a search query. Queries with similar wording but different intent often generated less-than-desirable results. Take the word “hammer” as an example. Is the searcher looking for the musician, the museum, or a tool to pound nails with?

Google’s Knowledge Graph was a first step. Released the year before Hummingbird, the Knowledge Graph mapped the relationships between different pieces of information about “entities.” It helped the search engine connect the dots and improve the logic of search results.

Hummingbird used semantic search to provide better results that matched the searcher’s intent. It helped Google understand conversational language, such as long-tail queries formed as questions. It impacted an estimated 90% of searches and introduced things like conversational language queries, voice search, and more.

Hazards: Pages with keyword stuffing or low-quality content couldn’t fool Google anymore.

Winners: Pages with natural-sounding, conversational writing and Q&A-style content benefited.

Learn more: Google Hummingbird & The Keyword: What You Need To Know

July 2013 – Panda Update – Recovery

For more information on this update, please see the Panda section below.

June 2013 – Payday Loan Algorithm Update

Google announced a new algorithm to address the quality of results for heavily spammed queries such as “payday loans,” “viagra” and pornography-related keywords. Sites impacted tended to be those involved in link schemes, webspam, and often illegal activities.

Learn more: What You Need to Know About the Google Payday Loan Algorithm Update

May 2013 – Penguin Update 2.0

For more information on this update, please see the Penguin section below.

March 2013 – Panda Update #25

For more information on this update, please see the Panda section below.

January 2013 – Panda Update #24

For more information on this update, please see the Panda section below.

2012 ALGORITHM UPDATES

December 2012 – Panda Update #23

For more information on this update, please see the Panda section below.

November 21, 2012 – Panda Update #22

For more information on this update, please see the Panda section below.

November 5, 2012 – Panda Update #21

For more information on this update, please see the Panda section below.

October 2012 – Penguin Update 1.2

For more information on this update, please see the Penguin section below.

October 2012 – Page Layout Algorithm Update #2

Google announced an update to its page layout algorithm update and confirmed it impacted about 0.7% of English queries.

September 27, 2012 – Panda Update #20

See the Panda section below for more details.

September 18, 2012 – Panda Update 3.9.2

See the Panda section below for more details.

September 2012 – Exact-Match Domain Algorithm Update

Google announced an algorithm update dubbed “Exact-Match Domain” that aimed to reduce low-quality exact-match domains in the search results.

Learn more: The EMD Update: Like Panda & Penguin, Expect Further Refreshes To Come

August 2012 – Panda Update 3.9.1

See the Panda section below for more details.

July 2012 – Panda Update 3.9

See the Panda section below for more details.

June 25, 2012 – Panda Update 3.8

See the Panda section below for more details.

June 8, 2012 – Panda Update 3.7

See the Panda section below for more details.

May 2012 – Penguin Update 1.1

For more information on this update, see the Penguin Updates section below.

May 2012 – Knowledge Graph Release

In what Google described as a “critical first step towards building the next generation of search,” it began rolling out the Knowledge Graph. This is basically a knowledge base designed to match keywords to real-world entities.

April 27, 2012 – Panda Update 3.6

See the Panda section below for more details.

April 19, 2012 – Panda Update 3.5

See the Panda section below for more details.

April 2012 – Webspam Update (Penguin)

Google announced an algorithm designed to target sites that were directly violating its quality guidelines. With “Penguin,” link spam became the target of Google’s efforts. This significant update is detailed below in the Penguin Updates section.

March 2012 – Panda Update 3.4

See the Panda section below for more details.

February 27, 2012 – Venice Update

Google announced improvements to ranking for local search results. Dubbed “Venice,” this update to local search took into account the user’s physical location or IP address. This was a major change to how local search worked.

February 2012 – Panda Update 3.3

See the Panda section below for more details.

January 19, 2012 – Page Layout Algorithm Update

Google confirmed that it would be updating its page layout algorithm to penalize sites with overly aggressive “above-the-fold” ads.

January 2012 – Panda Update 3.2

See the Panda section below for more details.

2011 ALGORITHM UPDATES

November 2011 – Panda Update 3.1

See the Panda section below for more details.

November 2011 – Freshness Update

To give users the freshest, most recent search results, Google announced that it would be improving its ranking algorithm to prioritize freshness for certain queries. Google said it “noticeably impacts six to 10 percent of searches, depending on the language and domain you’re searching on.”

October 2011 – Panda Update 3.0

See the Panda section below for more details.

September 2011 – Panda Update 2.5

See the Panda section below for more details.

August 2011 – Panda Update 2.4

See the Panda section below for more details.

July 2011 – Panda Update 2.3

See the Panda section below for more details.

June 2011 – Panda Update 2.2

See the Panda section below for more details.

May 2011 – Panda Update 2.1

See the Panda section below for more details.

April 2011 – Panda Update 2.0

See the Panda section below for more details.

February 2011 – Panda Quality Update

Google announced on its official blog that a new update to reduce rankings for low-quality content had been introduced. Dubbed “Panda,” it took particular aim at content produced by so-called “content farms.”

The initial rollout impacted about 12% of English queries. (You’ll find detailed history in the Panda Algorithm Update section below.)

Hazards: Websites lost rankings if they had duplicate, plagiarized or thin content; user-generated spam; keyword stuffing.

Winners: Original, high-quality, high-relevance content often gained rankings.

January 2011 – Attribution Update

In an effort to reduce spam, Google updated its algorithm to better detect scrapers. Matt Cutts, Google’s head of webspam at the time, revealed the change on his personal blog, saying it was a “pretty targeted launch: slightly over 2% of queries change in some way, but less than half a percent of search results change enough that someone might really notice.”

2010 ALGORITHM UPDATES

June 2010 – Caffeine

Google completed a significant new web indexing system named Caffeine (originally announced in 2009). It enabled Google to speed up its search engine, as well as provide users with fresher content.

Learn more:

May 2010 – Mayday Update

Search Engine Land reported that at an industry event, Google had confirmed the so-called Mayday update. This update significantly reduced long-tail traffic for some sites.

Learn more:

2009 ALGORITHM UPDATES

December 2009 – Real-Time Search Launch

Google announced the release of real-time search, a “dynamic stream of real-time content” that allowed users to see the most recent and relevant tweets, news stories, and more.

Pre-2009 ALGORITHM UPDATES

December 2005 – “Big Daddy”

This infrastructure update worked to improve the quality of search results. It was visible in December and 100% live by March of 2006, as reported by Google’s then-head of webspam, Matt Cutts.

Learn more: Was Big Daddy Too Much for Google to Handle?

September 2005 – “Jagger” Updates Begin

In a series of three updates (September, October, and November), “Jagger” was meant to deal with the increasing amount of webspam in the search results.

Learn more:

November 2003 – Webspam Update (Florida)

The update called “Florida” targeted webspam and was Google’s first major update, putting the kibosh on tactics used in previous years to manipulate rankings.

Learn more: What Happened to My Site on Google?

A Note on Algorithm Changes Pre-”Florida”
Between 2000 and 2003, PageRank would usually be updated monthly, and rankings would fluctuate. Webmasters would often post their findings on Webmaster World (before the days of confirmations or announcements from Google).

Learn more: A Brief History of SEO

Panda Algorithm Update

Panda was rolled out in February of 2011, aimed at placing a higher emphasis on quality content. The update reduced the amount of thin and inexpert material in the search results. The Panda filter took particular aim at content produced by so-called “content farms.”

With Panda, Google also introduced a quality classification for pages that became a ranking factor. This classification took its structure from human-generated quality ratings (as documented in its Search Quality Evaluator Guidelines).

Websites that dropped in the SERPs after each iteration of Panda were forced to improve their content in order to recover. Panda was rolled into Google’s core algorithm in January 2016.

History of Panda Updates

From 2011 to 2016, Panda had many data refreshes and updates before being rolled into the core algorithm.

  • Core Algorithm Integration – January 2016: Google revealed through SEM Post, and later confirmed, that Panda was integrated into the core algorithm.
  • Update 4.2 – July 2015: Google revealed to Search Engine Land that a slow rollout was underway. This refresh affected 2 to 3 percent of English queries and gave a second chance to sites penalized by the previous refresh.
  • Update 4.1 – September 2014: Google announced that this refresh was meant to further “help Panda identify low-quality content more precisely.” The refresh impacted 3 to 5 percent of queries.
  • Update 4.0 – May 2014: Google announced a major update that impacted 7.5% of English queries.
  • “Recovery” – July 2013: Google confirmed to Search Engine Land that this refresh was “more finely targeted” than previous ones.
  • Update No. 25 – March 2013: Search Engine Land reported that Google’s Matt Cutts announced a Panda update for March 15, 2013, during the SMX West panel. Tests suggest it happened, though Google never confirmed it.
  • Update No. 24 – January 2013: Google announced a refresh that would affect 1.2% of English queries.
  • Update No. 23 – December 2012: Google announced a refresh that would affect 1.3% of English queries.
  • Update No. 22 – November 21, 2012: Google confirmed this update to Search Engine Land and said that 0.8% of English queries were impacted.
  • Update No. 21 – November 5, 2012: Google confirmed to Search Engine Land that an update took place. This refresh affected 0.4 percent of queries worldwide and 1.1% of English queries in the U.S.
  • Panda 20 – September 27, 2012: Google confirmed to Search Engine Land this relatively major update (more than a data refresh) that took more than a week to roll out. The update impacted 2.4% of English queries. Panda 20 marked a change in the naming convention of the update.
  • Update 3.9.2 – September 18, 2012: Google announced a refresh that “noticeably” affected less than 0.7% of queries. Google also said to “expect some flux over the next few days.”
  • Update 3.9.1 – August 2012: Google confirmed a refresh that impacted about 1% of queries.
  • Update 3.9 – July 2012: Google announced a refresh that impacted about 1% of search results.
  • Update 3.8 – June 25, 2012: Google announced a refresh that “noticeably” affected about 1% of queries worldwide.
  • Update 3.7 – June 8, 2012: Google belatedly confirmed this refresh, which noticeably impacted less than 1% of queries in the U.S. and about 1% worldwide. Ranking tools suggested that this refresh hit harder than others.
  • Update 3.6 – April 27, 2012: Google confirmed to Search Engine Land that it pushed out a refresh on this day, and said it affected very few sites.
  • Update 3.5 – April 19, 2012: Google confirmed that a refresh happened. Search Engine Land published a list of the “winners and losers.”
  • Update 3.4 – March 2012: Google announced a refresh that affected about 1.6% of queries.
  • Update 3.3 – February 2012: Google announced the refresh on its Inside Search blog, saying it would make Panda “more accurate and more sensitive to changes on the web.”
  • Update 3.2 – January 2012: Google confirmed to Search Engine Land that a data refresh had taken place.
  • Update 3.1 – November 2011: Google confirmed that a minor update went out and impacted less than 1% of searches.
  • Update 3.0 – October 2011: Google announced that people should “expect Panda-related flux in the next few weeks,” but that it would have less impact than previous updates at about 2 percent. The update included new signals in the Panda algorithm and a recalculation of how the algorithm affected websites.
  • Update 2.5 – September 2011: Google confirmed to WebProNews that a refresh happened, though declined to share details about the sites impacted by it.
  • Update 2.4 – August 2011: Google announced on its Webmaster Central blog that the Panda update had been rolled out internationally to English-speaking and non-English-speaking countries (except for Japan, Korea, and China). The update impacted 6 to 9 percent of queries in most languages.
  • Update 2.3 – July 2011: Google confirmed to Search Engine Land that it implemented a small data refresh.
  • Update 2.2 – June 2011: Google confirmed to Search Engine Land that a data refresh occurred.
  • Update 2.1 – May 2011: The industry first thought this was a much larger update and could be Panda 3.0, but Google clarified that it was only a small data refresh.
  • Update 2.0 – April 2011: Google announced the first core Panda update, which incorporated additional signals and rolled the algorithm out to all English-speaking Google users worldwide. Only about 2% of U.S. queries were affected.
  • Update 1.0 – February 2011: Google announced on its official blog that a new update to reduce rankings for low-quality sites had been introduced, impacting about 12% of English queries.

Learn more: Understanding Google Panda: Definitive Algo Guide for SEOs

Penguin Algorithm Update

The Penguin update worked to target link spam.

Before rolling out Penguin, Google weighted a page’s link volume heavily while crawling webpages. This made it possible for low-quality pages with a lot of incoming links to rank more prominently than they should have.

Penguin helped with the mission to make valuable search results as visible as possible by penalizing low-quality content and link spam. Many sites cleaned up their links. But they could stay in Penguin jail for months, unable to regain their lost rankings until Google ran the next update.

Google made Penguin part of its real-time algorithm in September 2016, and a friendlier version emerged.

History of Penguin Updates

From 2012 to 2016, Penguin had several data refreshes and updates before rolling into the core algorithm.

  • Update 4.0 – September 2016: Google announced on its Webmaster Central blog that Penguin was now part of the core algorithm. This meant Penguin worked in real-time and was also more granular.
  • Update 3.0 – October 2014: Google confirmed to Search Engine Land that a data refresh had occurred. Google later said that the update impacted less than 1 percent of English queries.
  • Update 2.1 – October 2013: Google confirmed a data refresh happened. About 1 percent of searches were noticeably affected.
  • Update 2.0 – May 2013: Google’s Matt Cutts confirmed on “This Week in Google” that a significant update took place and impacted 2.3% of English queries.
  • Update 1.2 – October 2012: Google announced that a small refresh was happening. Only 0.3% of English queries would be affected.
  • Update 1.1 – May 2012: Google announced that the first Penguin data refresh had occurred. Less than 0.1% of English searches were impacted.
  • Update 1.0 – April 2012: Google announced on its Inside Search and Webmaster Central blogs that the Penguin update was launched and designed to catch spammers and those going against publisher guidelines. The update would impact about 3% of search queries.

How to Watch for Google Algorithm Changes

With the exception of recent broad core updates, Google rarely announces its algorithm updates. And when it does, it is usually only after others discover them.

With so many tweaks going on daily, it is possible that Google doesn’t know that some changes will be significant enough to mention.

Often the first indication you have is your own website. If your search traffic suddenly jumps or dives, chances are good that Google made an algo update that affected your search rankings.

Where can you go for information when your online world gets rocked? Here’s what I recommend …

Have a “seismograph” in place on your website.

To detect search traffic fluctuations on your own website, you need analytics software. If you haven’t already, install Google Analytics and Google Search Console on your website. They’re free, and they’re indispensable for SEO.
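
If you are adding Google Analytics by hand, the standard GA4 installation is a small gtag.js snippet pasted into every page’s `<head>`. A sketch of what it looks like, with `G-XXXXXXXXXX` as a placeholder for your own measurement ID:

```html
<!-- Google tag (gtag.js): paste near the top of the <head> on every page -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'G-XXXXXXXXXX'); /* replace with your GA4 measurement ID */
</script>
```

Search Console, by contrast, requires no on-page code beyond verifying ownership of your site (for example, via a DNS record or an HTML file upload).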

Watch the SERP weather reports.

RankRanger SERP fluctuations chart.

Various websites and tools monitor ranking changes across categories and search markets and report on SERP volatility. Checking these “SERP weather” trackers regularly can give you early warning of a search ranking algorithm update.

Follow industry resources.

I’m always reading as an SEO. For the latest Google news, follow the established search industry publications and Google’s official announcements.

What To Do After a Google Update

Think that an algorithm update has penalized your site?

Don’t panic. Remember — nobody truly understands the algorithm. Whatever you’re experiencing may or may not be due to a Google fluctuation. And Google may “fix” it tomorrow, or next week.

With this in mind, get intentional. And stay calm. Decide whether you need to act before you do.

Here’s a plan to follow after an algorithm update …

  1. Stay calm.
  2. Get into puzzle-solving mode. Do NOT react or make changes hastily. Instead, gather data. Determine whether your site was impacted by the change and not something else, such as a technical SEO issue. Or it could be that your rankings dived because your competitors moved up in the SERPs. Depending on the cause, you need to do something different in response.
  3. Learn about the update from several sources (see my suggested resources above). Find out what other SEO experts are saying and experiencing.
  4. Adjust your SEO strategy accordingly.
  5. Remember that Google’s ranking algorithms change all the time. What impacts your site today could reverse itself in a month.
  6. Change what makes sense on your website.
  7. Re-evaluate your impact.
  8. If no results have changed, now you can panic.
  9. Call us.

Last Thoughts: You Don’t Need to Beat the Algorithm

Google’s algorithm updates are constant, often unverified, and difficult to anticipate. That doesn’t mean you have to be afraid.

Don’t spend your time trying to figure out a way to beat the algorithm. You’ll waste hours chasing your tail and missing the things that truly matter, like creating a high-quality website that is worthy of ranking.

I like to tell a story to illustrate this …

Imagine you’re out camping with a friend, and a bear shows up. You both take off running, the bear in hot pursuit.

In this situation, do you have to be an Olympic runner to survive?

No — you just have to be faster than your buddy.

In the world of SEO, your mission is to be better than your competition. You don’t need to beat the bear.

So don’t let algorithm updates cause you to make knee-jerk decisions. Instead, be strategic about how and when, but stay informed so you can make these decisions properly.

If you found this helpful, please subscribe to our blog. If you’d like assistance with your website SEO, contact us for a free quote.

FAQ: What steps should I take if my site’s ranking is impacted after an update?

Search engine algorithm updates can significantly impact a website’s ranking, leading to fluctuations in search results. If your site’s ranking has been affected after a recent update, there are crucial steps to take for recovery and reclaiming lost visibility.

Understanding the Impact

The first step is to carefully assess the extent of the ranking drop and identify the specific pages or keywords affected. Analyze the timing of the decline with recent algorithm updates. By understanding the impact and the changes made in the update, you can develop a targeted plan for recovery.

Site Audit and Optimization

Conduct a comprehensive site audit to identify any technical issues or violations of search engine guidelines that may have caused the ranking drop. Focus on factors like page load speed, mobile-friendliness, and broken links. Optimize your website’s content by incorporating relevant keywords naturally and ensuring high-quality, valuable content that aligns with user intent.

Link Profile Analysis

Evaluate your website’s backlink profile to check for suspicious or low-quality links that might have triggered a penalty. Disavow toxic links, and work on acquiring high-quality backlinks from reputable sources to improve your site’s authority.
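
As a concrete illustration of the disavow step: Google Search Console’s disavow tool accepts a plain-text file with one entry per line, where a `domain:` prefix disavows every link from that domain. The domains below are hypothetical:

```text
# Hypothetical disavow file (UTF-8 plain text, one entry per line).
# Lines beginning with "#" are comments and are ignored.

# Disavow all links from an entire domain:
domain:spammy-link-network.example

# Disavow links from a single page:
https://low-quality-directory.example/listing/page7.html
```

Disavowing is a last resort: use it only for links you cannot get removed and that you believe are actively harming your site.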

User Experience Enhancement

A positive user experience is crucial for ranking well in search results. Ensure your website is visually appealing, easy to navigate, and features clear calls to action to improve its overall user experience. Doing this will encourage more visitors to remain on your site for extended periods.

Engaging Content Strategy

Craft a content strategy that provides valuable, engaging, and informative content for your audience. Regularly update your website with fresh posts in order to establish authority in your niche area and demonstrate expertise.

Step-by-Step Procedure: How to Recover Lost Rankings After an Update

  1. Evaluate the Ranking Drop
  2. Identify the Affected Pages or Keywords
  3. Understand the Algorithm Update
  4. Plan a Targeted Recovery Strategy
  5. Conduct a Comprehensive Site Audit
  6. Address Technical Issues and Violations
  7. Optimize Website Content
  8. Focus on Mobile-Friendliness
  9. Improve Page Load Speed
  10. Fix Broken Links
  11. Analyze the Backlink Profile
  12. Disavow Toxic Links
  13. Acquire High-Quality Backlinks
  14. Enhance User Experience
  15. Optimize Site Navigation
  16. Create Clear Calls-to-Action
  17. Ensure a Visually Appealing Design
  18. Implement an Engaging Content Strategy
  19. Publish Valuable and Informative Content
  20. Monitor and Analyze Progress

By following these expert strategies and taking appropriate action, you can navigate through algorithm updates and recover lost rankings effectively. Remember to continually monitor your website’s performance and adapt your SEO efforts to stay resilient in the ever-changing landscape of search engine optimization.

The post An Up-to-Date History of Google Algorithm Updates appeared first on Bruce Clay, Inc..

]]>
https://www.bruceclay.com/blog/google-algorithm-updates/feed/ 62
Why SEO Basics Still Matter + Evergreen SEO Tips https://www.bruceclay.com/blog/why-seo-basics-matter/ https://www.bruceclay.com/blog/why-seo-basics-matter/#comments Fri, 11 Aug 2023 15:00:44 +0000 http://www.bruceclay.com/blog/?p=22296 There are a ton of advanced web marketing tactics these days, and the evolution of the field has brought us to a very healthy, holistic approach to digital marketing. But it’s equally important not to lose sight of the basics that allow a website to reach its full potential. We see it time and time again: sites that don’t implement the fundamentals of SEO find obstacles creeping into various parts of their sites, their businesses and their strategies. That’s why SEO basics are the foundation of any successful website.

At the upcoming Search Engine Strategies in San Francisco this August, Bruce Clay presents the session, “Getting Started with SEO.” Conferences host these types of sessions time and time again because the basics of SEO are still very relevant.

This is because:

  • Large brands with complicated websites are unable to take their sites to the next level without implementing the basics of SEO.
  • Small business site owners are just getting started in search engine optimization and need to understand why these tactics exist and how to implement them.

Inspired by Bruce’s upcoming presentation, I thought we’d use this post to look at what SEO basics still matter and why. But first, let’s explore the “lasting” side of SEO – the approach to SEO that stands the test of time.

Read more of Why SEO Basics Still Matter.

The post Why SEO Basics Still Matter + Evergreen SEO Tips appeared first on Bruce Clay, Inc..

]]>

Now more than ever, it seems the world is changing. Not too long ago, I spoke about a tipping point for SEO and digital marketing – one that is pushing high-growth companies to move even faster.

And then, of course, Google changes an average of 12 things per day in Search, and websites are being impacted for things they aren’t even doing wrong.

Yet with all these changes, the basics of SEO remain the same. It is important not to lose sight of the fundamentals that we know make a difference. They make a difference in Google’s ranking algorithm and make a difference to businesses trying to remain relevant and competitive online.

That is why SEO basics are the foundation of any successful website. Let’s look at this idea in more detail.

What Is an Evergreen SEO Strategy?

No matter what changes are thrown our way, a basic and dependable approach to SEO helps a company’s online presence weather those changes.

That approach includes the following SEO principles:

  1. Beat the competition, not the algorithm
  2. Focus on a whole-SERP SEO strategy
  3. Optimize to be the least imperfect

1. Beat the Competition, Not the Algorithm

There are countless ranking signals in Google’s algorithm, with thousands of updates made yearly to Search. So it’s safe to say that no one will know exactly what the search engine algorithm is looking for.

We do the best we can with the information that’s available to us and the wisdom we have from years of practice, success and failure. We stay on top of developments and test them over and over again.

However, instead of focusing on every little algorithmic possibility, we instead focus on our competition. Our competition is the content ranking on Page 1 of the search results. And we discover that competition through keyword research (more on that later).

For more, read:

2. Focus on a Whole-SERP SEO Strategy

With keywords in tow, you will be able to see what types of search engine results are most prominent for those queries. You will prioritize your SEO and content development efforts there.

For example, some search queries will have more prominent video results, images, or perhaps featured snippets.

Google search engine results page for the query "how to get kool aid out of carpet."
Google SERP for the query “how to get kool aid out of carpet”

In other words, don’t just focus on the “blue links.” When you approach SEO and content development by understanding which features Google believes are the most relevant for a query, you evergreen your SEO program.

For more, read:

3. Optimize To Be the Least Imperfect

Analyze the content and the webpages that are ranking for your desired keywords. What are they doing well? What can you do better?

As mentioned, the goal of SEO is not to try to beat the ranking algorithm, which is infinitely large. The goal of SEO is to beat the competition. And the way to do that is to be least imperfect compared to the competition. Every website is imperfect against the Google algorithm. And when Google evaluates which pages to serve in its search results, it chooses the least imperfect compared to others for that search.

For more, read:

What Are Some Basic SEO Strategies That Get Results?

Here are five basic SEO strategies that are proven to get results:

  1. Keyword and audience research
  2. Information architecture aka SEO siloing
  3. Quality content
  4. Technical and on-page optimization
  5. Linking practices

1. Keyword and Audience Research

SEO keywords are single words or short phrases that represent the search queries that people use in a search engine. Once you have identified your keywords and explored the intent behind them (i.e. what a person is trying to accomplish when they use them in Google), you can do the following and more:

  • Identify and speak the language of the target market
  • Create useful content for your target audience
  • Communicate to Google that a webpage is a relevant match for a query
  • Drive more qualified traffic to appropriate webpages

For more, read:

2. Information Architecture aka SEO Siloing

How you organize the content on your site matters for search engines and users. SEO siloing builds relevance for a website and positions it as an authority on a topic. It also helps website visitors navigate the content with ease, and get complete answers to their questions.

For more, read:

3. Quality Content

Google wants to display the most useful content to its search engine users. So quality content is likely the most important ranking factor to get right.

For more, read:

4. Technical and On-Page Optimization

The performance of your website matters to website visitors. That is why you need to make sure that the site provides a good user experience. You do this through technical SEO practices that optimize the back-end of the website.

In addition, you want to help Google understand what a webpage is about, and this can be accomplished through on-page SEO.
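
On-page SEO largely comes down to the signals in your HTML. A minimal sketch of the two most visible elements (the page title and description below are made up for illustration):

```html
<head>
  <!-- Title tag: unique per page, with the target keyword phrase near the front -->
  <title>How To Fix Broken Links on Your Website | Example Site</title>

  <!-- Meta description: a short summary that often becomes the SERP snippet -->
  <meta name="description" content="A step-by-step guide to finding and fixing broken links, with free tools and best practices.">
</head>
```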

For more, read:

5. Linking Practices

How you link to other webpages matters in SEO. There are three primary strategies in linking:

  1. Internal links: How you link to pages within your website
  2. Outbound links: Which websites you link to
  3. Inbound links: Which websites link to you

Graphic illustrating the difference between internal links, inbound links and external links.

Each type of linking strategy has its own best practices. It is important to strive for quality and relevance with every link.
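
In HTML terms, the three link types differ only in where they live and where they point. A hypothetical example from the perspective of example.com:

```html
<!-- Internal link: lives on example.com and points to another example.com page -->
<a href="/services/seo-audit/">Learn about our SEO audit service</a>

<!-- Outbound link: lives on example.com and points to another site;
     rel attributes such as "nofollow" or "sponsored" qualify the link when needed -->
<a href="https://www.w3.org/" rel="nofollow">W3C</a>

<!-- An inbound link uses the same markup but lives on someone else's site
     and points to example.com; you earn it rather than write it yourself -->
```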

For more, read:

Without the basics of SEO, websites suffer when Google changes things or we experience economic or market downturns. This results in knee-jerk reactions that end up costing businesses more in the end than an upfront investment in evergreen SEO strategies.

Looking to add an evergreen SEO strategy to your company plan? Talk to us. We can help.

FAQ: How can I implement an effective evergreen SEO strategy to stay competitive in an ever-changing digital landscape?

With search engine algorithms frequently changing and competition intensifying, adopting an evergreen SEO strategy is essential to remain competitive in the long run.

An evergreen SEO strategy focuses on core principles that remain effective regardless of algorithmic shifts. The key is prioritizing beating the competition rather than chasing after ever-changing search engine algorithms. Conducting thorough keyword research to identify and understand your competition is the first step. By analyzing what content ranks on Page 1 of search results for your target keywords, you gain valuable insights into your competitors’ strategies and areas for improvement.

Adopting a Whole-SERP approach is a crucial aspect of a successful evergreen SEO strategy. It goes beyond focusing solely on traditional blue links in search results. Instead, consider the various features that Google deems relevant for a specific query, such as featured snippets, images, videos, or knowledge graphs. By understanding which elements are prominent for your target keywords, you can tailor your SEO and content development efforts accordingly to maximize visibility and engagement.

Optimizing to be the least imperfect compared to your competition is another fundamental principle of an evergreen SEO strategy. Instead of obsessing over trying to outsmart the ever-changing ranking algorithms, concentrate on delivering the best possible user experience and content quality. Analyze the webpages that rank for your desired keywords and learn from their strengths. By providing more value to your audience and addressing their needs comprehensively, you increase your chances of ranking higher in search results.

Implementing evergreen SEO strategies involves five proven practices:

  1. Conduct comprehensive keyword and audience research to understand user intent and create targeted content.
  2. Optimize your website’s information architecture using SEO siloing to establish authority on specific topics and improve navigation.
  3. Prioritize producing high-quality content that caters to your audience’s needs, as it remains the most significant ranking factor.
  4. Ensure technical and on-page optimization to enhance user experience and facilitate search engine understanding.
  5. Adopt effective linking practices including internal, outbound and inbound links to build authority and relevance for your website.

Mastering the art of evergreen SEO is essential for staying competitive in the ever-changing digital landscape. By focusing on beating the competition, understanding user intent and delivering high-quality, valuable content, businesses can create a solid foundation for long-term online success.

Step-by-Step Evergreen SEO Strategy Implementation: 

  1. Conduct comprehensive keyword research to identify relevant target keywords.
  2. Analyze competitor content that ranks on Page 1 of search results for your chosen keywords.
  3. Identify areas for improvement and potential content gaps in your niche.
  4. Develop a Whole-SERP strategy by considering different types of search engine results for your target queries.
  5. Prioritize content creation that aligns with your audience’s search intent.
  6. Implement SEO siloing to organize your website content and establish topic authority.
  7. Optimize website navigation for user-friendly access to relevant content.
  8. Focus on producing high-quality, valuable content that addresses your audience’s needs and queries.
  9. Ensure technical SEO practices to improve website performance and user experience.
  10. Optimize on-page elements such as meta tags, headers and content structure.
  11. Utilize internal linking to guide users to related content within your website.
  12. Employ outbound links to reputable sources that support and complement your content.
  13. Seek quality inbound links from authoritative websites in your industry.
  14. Monitor and analyze website performance and rankings regularly.
  15. Continuously update and improve your content to remain relevant and valuable.
  16. Stay up-to-date with industry trends and algorithm changes to adapt your strategy.
  17. Leverage social media and other promotional channels to drive traffic to your content.
  18. Encourage user engagement and interaction with your content.
  19. Monitor and respond to feedback and comments from your audience.
  20. Continuously evaluate and refine your evergreen SEO strategy based on performance metrics and changing market demands.

The post Why SEO Basics Still Matter + Evergreen SEO Tips appeared first on Bruce Clay, Inc..

]]>
https://www.bruceclay.com/blog/why-seo-basics-matter/feed/ 9
Do WordPress Sites Do Better in Google Search? https://www.bruceclay.com/blog/wordpress-sites-better-google-search/ https://www.bruceclay.com/blog/wordpress-sites-better-google-search/#comments Thu, 13 Jul 2023 18:14:45 +0000 https://www.bruceclay.com/?p=194476 Explore the reasons behind Google's embrace of WordPress, the impact of their partnership, and what makes WordPress sites competitive.

The post Do WordPress Sites Do Better in Google Search? appeared first on Bruce Clay, Inc..

]]>

For several years, people have speculated that websites built on WordPress do better in Google’s search results. So, does this mean Google favors WordPress sites?

Let’s dive into:

Why Google Embraces WordPress

WordPress is behind 43% of all websites on the internet, and its CMS market share is roughly 64%, according to W3Techs.

That makes it the most popular content management system – and it is still growing. From January 2020 to April 2022, WordPress usage grew from 35% to 43% of all websites, a relative increase of nearly 23% in just over two years.

And WordPress isn’t just for small businesses. It powers some of the biggest brands and most popular websites, including Zoom.us, Alibaba.com, Bloomberg.com, and Salesforce.com.

Google endorsed WordPress early on, stating back in 2009 that the CMS solved up to 90% of SEO mechanics right out of the box, with features such as an SEO-friendly site structure, mobile friendliness, and page optimization via meta tags.

Additionally, WordPress plugins enable you to further tailor functionality (including our own Bruce Clay WordPress SEO plugin), providing plenty of ways for optimizing a WordPress website for SEO.

Even with all of these features, WordPress still isn’t perfect for organic search and needs work if you really want to improve your search performance.

In 2017, Google presented findings at a WordCamp event that showed WordPress performed poorly compared to non-WordPress webpages on a lot of key performance indicators.

Graph from Google presentation showing median speed index between WordPress and non-WordPress sites.
Image source: Performance Is User Experience, Google, WordCamp, 2017
Graph from Google presentation showing Time to First Interactive metric for WordPress and non-WordPress sites.
Image source: Performance Is User Experience, Google, WordCamp, 2017
Graph from Google presentation showing First Meaningful Paint metric for WordPress and non-WordPress sites.
Image source: Performance Is User Experience, Google, WordCamp, 2017

(In case you didn’t notice, those performance indicators are the same ones found in Google’s “page experience” algorithm update that hit in 2021.)

So in 2018, Googler Alberto Medina announced a partnership between Google and WordPress to help improve the WordPress ecosystem. (The announcement on his personal blog seems to be currently unavailable.)

Why partner together?

As WordPress expands in market share and becomes the website platform of choice for more websites, Google wants to ensure that WordPress sites perform well in search results.

That equates to a better experience for Google users. And this is what Google really cares about.

Does Google Give WordPress Sites Better Rankings?

Maybe – but it’s not likely due to a ranking signal in its algorithm. And, in fact, Google has stated otherwise.

In 2016, Google’s John Mueller said as much on Twitter, and he repeated the point in 2021: using WordPress does not, by itself, earn a site better rankings.

WordPress sites may tend to rank better because of WordPress’s SEO-friendly features. Not every CMS allows you to do advanced SEO – and that capability is exactly what you want in a CMS.

As WordPress and its community — including Google — continue working together to enhance it, WordPress sites will be more competitive in search results compared to other website builders.

If you are choosing between WordPress and Wix for creating a website, my advice is to make sure your choice lets you compete effectively online.

And, if all your competitors use WordPress, it would be advantageous for all parties involved to be competing on an even playing field. You don’t get there by using a website builder that does not enable you to do advanced SEO.

Contact us today to learn more about how you can develop or update your WordPress site to rank in Google.

FAQ: How does the partnership between Google and WordPress benefit search results?

In 2018, Google formed a partnership with WordPress, offering multiple benefits for website owners and marketers. The partnership positively impacts search visibility for WordPress users in many ways, including:

Enhanced Crawling and Indexing

The partnership between Google and WordPress has resulted in improved crawling and indexing of WordPress websites. Googlebot can quickly identify and index new or updated pages on well-structured WordPress sites, so new blog posts can appear in relevant search results soon after they are published or updated.

Optimized Site Structure and Mobile-Friendliness

WordPress is widely admired for its user-friendly interface and robust architecture, offering a host of SEO-friendly themes and plugins that can optimize website structure, increase speed and enhance mobile friendliness. This integration aligns perfectly with Google’s emphasis on mobile-first indexing and page experience signals. By leveraging WordPress’s features and Google’s guidelines, website owners can create highly responsive and user-friendly websites that are favored by search engines.

Streamlined Content Publishing and SEO

Content is king in the digital realm, and WordPress empowers website owners to create and publish high-quality content seamlessly. The partnership with Google further enhances this process by providing valuable insights and recommendations for optimizing content for search results. WordPress plugins, such as Yoast SEO, integrate with Google Search Console to offer real-time suggestions for improving keyword targeting, meta descriptions and overall content quality. This collaborative approach ensures that your WordPress site aligns with Google’s search algorithm, maximizing visibility and engagement.

Improved Schema Markup and Rich Snippets

Schema markup, also known as structured data, provides search engines with additional information about your website’s content, enabling them to display rich snippets in search results. WordPress offers numerous plugins that simplify the implementation of schema markup, ensuring that your content stands out in search listings. The Google-WordPress partnership has facilitated the adoption of standardized schema types and improved compatibility between WordPress and Google’s search engine. As a result, website owners can effectively communicate the relevance and context of their content to Google, leading to enhanced visibility and click-through rates.

The integration of Google’s search capabilities with the WordPress platform empowers website owners to create user-friendly, SEO-optimized websites that rank well in search results, ultimately driving more organic traffic and increasing online visibility.

The post Do WordPress Sites Do Better in Google Search? appeared first on Bruce Clay, Inc..

]]>
https://www.bruceclay.com/blog/wordpress-sites-better-google-search/feed/ 4
Google Analytics 4: What It Is and How To Get Started https://www.bruceclay.com/blog/ga4-what-it-is-how-to-get-started/ https://www.bruceclay.com/blog/ga4-what-it-is-how-to-get-started/#comments Tue, 13 Jun 2023 22:43:06 +0000 https://www.bruceclay.com/?p=192715 Have you made the jump from Universal Analytics to Google Analytics 4? This step-by-step guide explains what GA4 is, how to install it and how to get started.

The post Google Analytics 4: What It Is and How To Get Started appeared first on Bruce Clay, Inc..

]]>
Website data and analytics displayed on a laptop computer.
Google Analytics 4 (GA4) is a completely reimagined analytics platform – the biggest change to Google’s analytics product since its inception in 2005.

Even though there are exciting new features in the GA4 platform, for many, this change is daunting. Migrating to a new platform takes work and many are unsure of where to start.

This task is made even more daunting with an approaching deadline – on July 1, 2023, all standard Universal Analytics properties will stop processing data. For premium users of Universal Analytics 360, Google has extended the sunset date from October 1, 2023, to July 1, 2024.
(You can learn more about significant dates in this help file from Google.)

In this article, we will give a high-level overview of GA4 so you can get a better understanding of the changes that you can expect to see, and how to get started. Here are some quick links so you can jump to a specific section:

What Is GA4?

Google Analytics 4 is not just an update to Universal Analytics – it’s a whole new platform.

GA4 was designed to meet evolving needs; website owners need to be able to track the customer journey across many channels and devices (like websites and apps). In addition, privacy is a growing concern.

Google says this of GA4:

It allows businesses to see unified user journeys across their websites and apps, use Google’s machine learning technology to surface and predict new insights, and most importantly, it’s built to keep up with a changing ecosystem.

Google goes on to point out some of the main differences between Universal Analytics and GA4:

Universal Analytics was built for a generation of online measurement that was anchored in the desktop web, independent sessions and more easily observable data from cookies. This measurement methodology is quickly becoming obsolete. Meanwhile, Google Analytics 4 operates across platforms, does not rely exclusively on cookies and uses an event-based data model to deliver user-centric measurement.
And though Universal Analytics offers a variety of privacy controls, Google Analytics 4 is designed with privacy at its core to provide a better experience for both our customers and their users.

When it comes to the nitty-gritty details, there are several ways that UA and GA4 differ in how they collect data, and in the metrics reported. Let’s look at that next.

Data Model Differences

Let’s go over what we believe to be the most significant change to Google’s analytics in how it collects data: Events.

This change impacts everything from the way sessions are recorded to the way the reports are set up. In other words: It’s a fundamental change.

Events

One of the biggest differences between UA and GA4 is “events.” GA4 is based on the idea that any interaction (such as page hits, ecommerce hits and social hits in UA) is an event. So in GA4, all interactions are recorded as events.
Google explains:

A Universal Analytics event has a Category, Action, and Label and is its own hit type. In Google Analytics 4 properties, every “hit” is an event; there is no distinction between hit types. For example, when someone views one of your website pages, a page_view event is triggered.

Google Analytics 4 events have no notion of Category, Action, and Label and, unlike Universal Analytics reports, Google Analytics 4 reports do not display Category, Action, and Label. Therefore, it’s better to rethink your data collection in terms of the Google Analytics 4 model rather than port your existing event structure to Google Analytics 4.

Some of the old hit types in UA have been converted to GA4 events. For example, a page view hit would be converted to a page view event.

Screenshot of Google table comparing Universal Analytics hit type metrics vs. GA4 events.
Image credit: [UA→GA4] Comparing metrics: Google Analytics 4 vs Universal Analytics, Google.
But some measurements have exact equivalents between UA and GA4, as shown in the illustration:

Screenshot of Google table comparing pageview attribute between UA and GA4.
Image credit: [UA→GA4] Comparing metrics: Google Analytics 4 vs Universal Analytics, Google.
In terms of the user experience, the biggest impact will likely be in accessing the reports.

UA had report categories such as “Acquisition,” “Behavior,” etc. – all the associated reports were in those sections. GA4 doesn’t have that (for the most part).

For example, the data for the page views report in GA4 is in Engagement > Events > page_view.

Users will need to recreate some of those reports using event counts. If you want it to look exactly the same in GA4 as in UA, you need to create the report or use the Explorations option in GA4.

Metrics Updates

When it comes to metrics, there are a lot of little changes that will add up to a big change overall for tracking. Here are a few changes we think are significant:

  • Sessions
  • Engagement rate
  • Conversions

Sessions

Sessions are counted differently in GA4 versus UA. For example, there isn’t a midnight cutoff for sessions in GA4 like UA had, and GA4 doesn’t start new sessions for users who come in from different campaigns.

Google table comparing Session metrics between UA and GA4.
Image credit: [UA→GA4] Comparing metrics: Google Analytics 4 vs Universal Analytics, Google.
Google says session counts could be lower in GA4 than in UA: “This is because Google Analytics 4 does not create a new session when the campaign source changes mid-session, while Universal Analytics does create a new session under that circumstance.”

The statistical estimates that GA4 uses for sessions, however, should deliver higher accuracy and lower error rates in data reporting.

Engagement Rate

Bounces are measured differently in GA4. In the new platform, the bounce rate is the percentage of sessions that were not engaged sessions.

An engaged session in GA4 lasts 10 seconds or more, has one or more conversion events or has two or more page or screen views.

If a session doesn’t meet any of the criteria listed, then it is considered a bounce.

Contrast that with the traditional bounce rate in UA, which measured if someone only visited one page on a website and didn’t trigger any other event.
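For illustration only, the engaged-session rule above can be expressed as a simple predicate. This is a sketch of the definition in the text, not a GA4 API; the session object shape is hypothetical.

```javascript
// Sketch of GA4's engaged-session rule: a session is "engaged" if it lasts
// 10+ seconds, has at least one conversion event, or has two or more
// page/screen views. Every other session counts as a bounce.
function isEngagedSession(session) {
  return (
    session.durationSeconds >= 10 ||
    session.conversionEvents >= 1 ||
    session.views >= 2
  );
}

function isBounce(session) {
  return !isEngagedSession(session);
}

console.log(isBounce({ durationSeconds: 4, conversionEvents: 0, views: 1 })); // true
```

A one-page visit that lasts 12 seconds would therefore not be a bounce in GA4, even though it would have been one in UA.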

Google table comparing bounce rate metric between UA and GA4.
Image credit: [UA→GA4] Comparing metrics: Google Analytics 4 vs Universal Analytics, Google.

Conversions

Those who are used to tracking goals in UA will need to get familiar with conversion events in GA4.

In GA4, you will identify a key event important to your business. Once that event is hooked up on your website, it can be promoted to a conversion event inside of GA4.

Depending on how your goals are set up in UA, you may get a close equivalent in GA4.

But Google notes that there are some differences between UA and GA4 that may make it difficult to do an apples-to-apples comparison:

Google table comparing conversions metric between UA and GA4.
Image credit: [UA→GA4] Comparing metrics: Google Analytics 4 vs Universal Analytics, Google.
Google notes:

Universal Analytics supports five goal types: destination, duration, pages/session, smart goals, and event goals. GA4, in contrast, only supports conversion events. It may not always be possible to use GA4 conversion events to precisely duplicate some UA goal types. For example, it’s not possible to duplicate a smart or duration goal using GA4 conversion events.

UA counts only one conversion per session for the same goal. GA4 counts multiple conversions per session for the same conversion event.

Your UA reports may be excluding data based on view filters.

You can find out more on how to set up a conversion in GA here.

Ways to Get Started with GA4

Here’s a high-level overview of how to get going with GA4.

Google outlines three ways to get started with GA4:

  1. If you’re a new analytics user (meaning you have not used Google Analytics in the past on a website or app).
  2. If you’re already a Google Universal Analytics user. This option will use your pre-existing UA tag to populate data into your new GA4 property. The GA4 Setup Assistant helps with this step.
  3. If you’re adding GA4 to a website builder platform, such as WordPress, Wix, Shopify, etc. This is how you get and input the new GA4 measurement ID into your platform.

To encourage people through the process, Google provides an outline overview of how to make the switch to GA4, with labels that show how much effort is involved in each step.

Google outline overview explains how to make the switch to GA4.
Image credit: [GA4] Make the switch to Google Analytics 4, Google.
Google also provides tools to help you get set up with its GA4 Setup Assistant, which has many handy features for those who already have UA.

Screenshot of GA4 Setup Assistant.
Setup Assistant for GA4.

The Setup Assistant (from Google):

  • Creates your new GA4 property.
  • Copies the property name, website URL, timezone, and currency settings from your Universal Analytics property.
  • Activates enhanced measurement in your GA4 property.
  • Creates a connection between your Universal Analytics and GA4 properties. This connection makes it possible for you to use Setup Assistant in your Google Analytics 4 property to migrate configurations from your Universal Analytics property to your GA4 property.
  • Sets the GA4 property to receive data from your existing Google tag, if you choose to reuse an existing site tag.

The GA4 Setup Assistant wizard does not backfill your new GA4 property with historical data. Your GA4 property only collects data going forward. To see historical data, use the reports in your Universal Analytics property.

A word of warning before you get started: Google has already sent emails to Google Analytics users saying they would be automatically migrated to GA4 starting the first week of March 2023 if they hadn’t already completed the process.

(Check out: Google’s GA4 Auto Migration: Here’s Why You Should Opt Out at Search Engine Land for a good overview on this topic.)

That means, for some, that Google will have already configured a GA4 property with basic settings.

You will want to check to see where you stand before proceeding. And you will most definitely want to ensure that the settings made for you are, in fact, what you wanted.

Verifying and Customizing Your GA4 Install

Once you have installed GA4, you will want to verify the install and customize settings as needed.

Verifying the Install

You can test the GA4 install by using the GA4 Tag Assistant Chrome extension or the GA4 DeBugView.

You can verify that the tracking code is properly sending data to the GA4 account by heading to Reports > Realtime to see that the data is loading.

Realtime report menu in GA4.
Realtime report menu in GA4.

As with any system, you need to check often to ensure that data is being collected and processed correctly, and that the GA4 property is properly configured. Validate the installation and operation of GA4 on a regular basis – at least once a week.

Address any issues identified during validation as soon as possible to ensure accurate data collection.

At this point the system is installed and gathering data. Expect to wait at least a day or two before you see meaningful data coming in.

Customizing the Settings

Using the GA4 Setup Assistant, you can configure your property settings.

This includes:

Property settings in GA4 Setup Assistant.
Property settings in GA4 Setup Assistant.

Conversions: The Setup Assistant can migrate goals from UA to conversions in GA4 for you, but you should still review them to make sure they’re correct. The migration isn’t always perfect. Go to Admin > Property Settings > Conversions to verify.

Custom dimensions and metrics: Custom dimensions and metrics can be created by defining their name and scope and assigning them to the relevant data stream. Go to Admin > Property > Custom definitions to modify.

Take note: Unrelated to any particular settings in the Setup Assistant, you should be aware of any data protection laws and regulations that may apply to your website and ensure compliance. GA4 should be compliant out of the box, but any extra data you capture may not be.

So make sure that you are not storing any information about a user that isn’t spelled out in your privacy policy, and that your privacy policy is compliant with whatever laws you are subject to (GDPR, CPRA or something else).

Getting Started with GA4 if You Don’t Have UA or Google Tag Manager

The following section is helpful for those who don’t already have a pre-existing UA account and/or don’t use Google Tag Manager, nor have the help of a CMS to install GA4. Follow the steps in this section to get started with GA4.


Step 1: Create a GA4 Account and Get the Tracking Code

  1. Go to the Google Analytics website (analytics.google.com) and sign in with your Google credentials.
  2. Click the “Admin” button (bottom left corner of the page).
  3. Under the “Property” section, click the “+ Create Property” button.
  4. Select “Web” as the type of property you want to create.
  5. Enter a name and default URL for your website.
  6. Accept the terms and conditions (read them first, of course 😊), and click on the “Create” button.
  7. Once your property is created, then click on the “Data Streams” tab.
  8. Click Add Stream > Web and fill out the form with your website details.
  9. Click “Create Stream” to create the data stream.
  10. Once created, under “Data Streams”, click on the stream you created and you will see your GA4 Measurement ID – it will start with “G-“.
  11. Get and copy the tracking code by clicking “Get Tag Instructions” and “Install Manually,” then paste it into the header of your website.
  12. Once you have added the tracking code to your website, you can verify that it is working by using the GA4 Tag Assistant Chrome extension or the GA4 DeBugView.
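For step 11, the manually installed tag is the standard gtag.js bootstrap. A minimal sketch of what it looks like is below; G-XXXXXXXXXX is a placeholder for the Measurement ID from step 10, so substitute your own.

```javascript
// Standard gtag.js bootstrap (a sketch of what "Install Manually" provides).
// On a live page this runs inside <script> tags in the <head>, after an async
// <script src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXXXX"> tag.
var dataLayer = dataLayer || [];         // command queue read by gtag.js
function gtag() { dataLayer.push(arguments); }

gtag('js', new Date());                  // timestamp the page load
gtag('config', 'G-XXXXXXXXXX');          // placeholder Measurement ID
```

Until the Google script loads, `gtag()` simply queues commands in `dataLayer`; gtag.js then processes the queue and starts sending data to the property identified by the Measurement ID.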

Step 2: Install the Tracking Code on the Website

  1. Make sure you set up the GA4 account and tracking code (see previous section).
  2. Add the GA4 code to all pages/templates on your site – every page, but only once per page.
  3. Save the changes to your server if necessary – usually a webmaster task.
  4. Verify that the tracking code has been installed across your website correctly.
  5. In your GA4 account under Data Streams, you should link your website to your GA4 account.
  6. Wait up to 48 hours after installing the GA4 tracking code to verify that data is coming in for your website.
  7. Check the “Realtime” report under the “Reports” tab to see if data is being received in near real-time.
  8. Check the user reports under the “Reports” tab to see if the data is showing users and sessions.
  9. Use the GA4 Tag Assistant Chrome extension to verify that the GA4 tracking code is installed correctly and working properly.
  10. Use the GA4 DeBugView to troubleshoot any potential issues with the tracking code installation.
  11. Check your GA4 account’s data settings to make sure that data collection is turned on and that the correct data stream is selected.
  12. Make sure you are aware of any data protection laws and regulations in your area and that your website is compliant with them.

Step 3: Configure GA4 Conversions and Custom Dimensions

  1. Login to your GA4 account.
  2. Click on the “Admin” link (bottom left corner).
  3. Under the “Property” section, select the website that you want to configure.
  4. Select “Events” from the menu and click “Create Event,” then click “Create.”
  5. Give the event a name and fill out its conditions. This can either be a fired event on the website or a “page_view” event for a certain page.
  6. Click “Create” in the upper right.
  7. Back on the list of Events, find the event you created and toggle the entry to “Mark as conversion.”
  8. In the Property menu, go to “Custom Definitions” and set up custom dimensions and metrics by defining their name and scope.
  9. Monitor your GA4 account’s settings and make tweaks as needed.
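On the website side, an event like the one defined in steps 4–5 can also be sent directly with gtag.js once the bootstrap from Step 2 is on the page. This is a hedged sketch: the event name "request_quote" and the "form_id" parameter are hypothetical examples, not GA4-mandated names, and the stub at the top stands in for the bootstrap already installed on the page.

```javascript
// Stub of the gtag bootstrap already installed on the page (see Step 2):
var dataLayer = dataLayer || [];
function gtag() { dataLayer.push(arguments); }

// Fire a custom event that can later be toggled to "Mark as conversion"
// (step 7 above). The name and parameters here are illustrative.
gtag('event', 'request_quote', {
  form_id: 'contact-footer',
  value: 1
});
```

Once this event starts arriving, it appears in the Events list in GA4, where you can promote it to a conversion.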

Step 4: Monitor the GA4 Account for Data Accuracy

  1. Schedule frequent check-ins to review the data in your GA4 account.
  2. Use the Realtime report to see any unusual spikes or drops in website traffic.
  3. Compare data to identify any trends or patterns.
  4. Use segments to analyze specific subsets of your data.
  5. Use GA4’s machine learning features to identify trends and segments in your data.
  6. Use the GA4 DeBugView to troubleshoot any potential issues with the tracking code installation.
  7. To ensure data accuracy, review your configuration and adjust filters, custom dimensions or conversion events as needed.

Get Started Now

With GA4 deadlines looming for Universal Analytics users, it’s important to start the process of setting up GA4 right away. While you are doing this, you may still reference UA data until you make the switch.

Migrating to GA4 is a big change, but the insights you’ll get into your customer lifecycle will be well worth it.

Get Your Free GA4 E-Book

We’ve packaged all of this information into a handy 20-page e-book that you can save and refer to over and over. Get your free copy of Google Analytics 4: What It Is and How To Get Started today!

Inundated with SEO data and unsure how to apply what it is telling you? Our SEO experts can help you make sense of it all, plus give you recommendations to implement in your program for better results. Contact us for a free consultation.

FAQ: How do I migrate to Google Analytics 4 (GA4) from Universal Analytics, and what are the key differences between the two platforms?

As the landscape of digital analytics evolves, migrating from Universal Analytics to Google Analytics 4 (GA4) is becoming increasingly essential. GA4 is more than an upgrade to Universal Analytics; it’s an entirely reimagined platform designed to meet the evolving needs of website owners. This guide will assist with your migration and help you understand the differences between the two platforms.

Understanding the Key Differences

The first step in migrating to GA4 is understanding the fundamental differences between the two analytics platforms. While Universal Analytics is anchored in the desktop web and relies on cookies for data collection, GA4 operates across platforms and employs an event-based data model. This shift in approach allows GA4 to provide more user-centric measurement and better adapt to a changing digital ecosystem.

Migrating to GA4: Step-by-Step Guide

  1. Create a new GA4 property: Start by logging into your Google Analytics account and navigating to the Admin section. Click “Create Property”, then “Web” for the type of property that you would like to create. Follow the instructions to set up your new GA4 property.
  2. Add GA4 tracking code to your website: Once your GA4 property is created, you will receive a Measurement ID starting with “G-.” Add the GA4 tracking code to the header of every page on your website.
  3. Verify the installation: Use tools like GA4 Tag Assistant Chrome extension or GA4 DeBugView to ensure that the tracking code is sending data to your GA4 account correctly.
  4. Set up data streams: Under the “Data Streams” tab in your GA4 property, click on “Add Stream” and select “Web.” Fill out the necessary details to create a data stream for your website.
  5. Configure conversion events: Identify key events important to your business and mark them as conversions in GA4. Review and adjust them to ensure accurate tracking.
  6. Manage custom dimensions and metrics: Set up custom dimensions and metrics by defining their names and scope. This will help you collect more specific data for your analysis.
  7. Enable enhanced measurement: Activate enhanced measurement in your GA4 property to access more detailed insights about user interactions.

Data Model Differences

One of the major differences between Universal Analytics and GA4 is the shift from session- and hit-based measurement to an event-based model. In GA4, every interaction on your website or app is recorded as an event, providing a more granular view of user behavior. While this change may require rethinking your data collection strategy, it offers deeper insights into user journeys across different channels.

Metrics Updates

GA4 introduces changes in metrics like sessions, engagement rate, and conversions. Sessions are now counted differently, without a midnight cutoff, which may affect session counts. The engagement rate now considers engaged sessions lasting 10 seconds or more, or those with conversion events or multiple page views. Conversions are now represented by conversion events, which may not always duplicate the same goals as Universal Analytics.

Tips for a Smooth Migration

  1. Plan ahead: Before starting the migration process, thoroughly plan and understand your analytics goals to ensure a seamless transition.
  2. Backup data: Save a copy of your Universal Analytics data for reference and comparison during and after the migration.
  3. Review reports and data accuracy: After the migration, review GA4 reports, and cross-check data accuracy to ensure a smooth transition.
  4. Stay informed: Keep yourself updated with GA4’s latest features and changes to leverage its full potential.

Step-by-Step Migration Procedure:

  1. Sign into your Google Analytics account.
  2. Navigate to the Admin section.
  3. Click on “Create Property.”
  4. Select “Web” as the type of property.
  5. Enter a name and the default URL for your website.
  6. Click “Accept” when asked about the terms and conditions.
  7. Click “Create.”
  8. Click “Data Streams.”
  9. Click “Add Stream” > “Web.”
  10. Fill out the form with your website details.
  11. Click “Create Stream” to create the data stream.
  12. Under “Data Streams,” click on the stream you created.
  13. Find your GA4 Measurement ID starting with “G-.”
  14. Get and copy the tracking code.
  15. Add the tracking code to the header of your website.
  16. Use tools like GA4 Tag Assistant or DeBugView to verify the installation.
  17. Set up conversion events and custom dimensions.
  18. Enable enhanced measurement.
  19. Monitor data accuracy and make necessary adjustments.
  20. Stay informed about GA4 updates and features for ongoing optimization.

The post Google Analytics 4: What It Is and How To Get Started appeared first on Bruce Clay, Inc..

]]>
https://www.bruceclay.com/blog/ga4-what-it-is-how-to-get-started/feed/ 1
Advanced Search Operators for Bing and Google (Guide and Cheat Sheet) https://www.bruceclay.com/blog/bing-google-advanced-search-operators/ https://www.bruceclay.com/blog/bing-google-advanced-search-operators/#comments Thu, 09 Feb 2023 18:11:53 +0000 http://www.bruceclay.com/blog/?p=41102 Learn how to use advanced search operators and search like a pro! Filter results in Bing and Google for SEO research (cheat sheet included).

The post Advanced Search Operators for Bing and Google (Guide and Cheat Sheet) appeared first on Bruce Clay, Inc..

]]>
User typing a site search operator on laptop.
When you search on Google or Bing, do you find exactly what you’re looking for the first time? Probably not.

Enter advanced search operators. These commands help you extract everything the search engine knows about a specific subject – and efficiently.

These tricks of the trade can definitely save you time, especially if you’re doing any kind of competitor analysis or SEO research.

Soon you’ll be searching like a pro as you learn:

What Are Search Engine Operators?

Search operators are commands that use special characters along with a query to make the search engine results more specific. Essentially, they work like filters that you can customize as needed.

To use a search operator, add the command into the search box and search as you normally would. The results are entirely different from the average search.

Why Should I Use a Search Operator?

SEOs routinely use search operators to filter results from a search engine. These advanced search skills let you easily:

  • Locate something specific online
  • Research a site you’re optimizing
  • Investigate the competitive field

When you get comfortable with a few of these commands, you can find what you’re looking for much faster.

How Do I Use Advanced Search Operators?

Enter search operators in the search bar along with your regular query, but with some modifications.

A search operator typically has:

  • A prefix: Something that comes before the search query
  • An addition: Something that is appended to the search query and contains special characters

For example, you can use the cache: prefix in front of the query, or you can use the OR command in between two words in a query.

In many instances – but not all – you want to ensure you do not put a space between the search operator character and the query.

So if you were using the site: command you would want it to look like this:

site:bruceclay.com page experience update

And not like this:

site: bruceclay.com page experience update
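For illustration only, the spacing rule can be captured in a tiny helper. This is plain JavaScript string-building, not any search-engine API; the function name is hypothetical.

```javascript
// Illustrative helper: assembles an operator query, enforcing the rule that
// no space goes between the operator's colon and the value.
function operatorQuery(operator, value, topic) {
  const prefix = `${operator}:${value}`;   // no space after the colon
  return topic ? `${prefix} ${topic}` : prefix;
}

console.log(operatorQuery('site', 'bruceclay.com', 'page experience update'));
// → "site:bruceclay.com page experience update"
```

The same helper works for prefix-only commands such as cache:, where there is no trailing topic.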

OK, all this information is helpful … but how about some examples?

Example 1: Quotation Marks

Quotation marks (“) help you to match an exact phrase. So searching for “advanced search tips” as an example (with the quotes) finds only pages that contain those words used as a phrase.

Example 2: Site Search

The site: command filters your search results to just one website. In other words: You are searching only one domain for the information you need.

Start with the command, which is site: then add the domain name you want to search and finally the topic you want to search the domain for.

In the example below, site: tells the search engine you want to browse a particular domain (bruceclay.com), and siloing is the topic you are interested in finding.

Your results would look something like the screenshot below. Google found 362 pages about siloing on BruceClay.com:

Google search results of bruceclay.com site search for siloing.

Example 3: Combining Search Operators

You can combine search operators to refine results even further. For example, you can combine site search with quotation marks to search for a longer phrase within a particular website.

Google site search animation.

This search found 157 pages. Without the quotation marks, the query would return way too many results. For instance, the search engine would find pages about “voice” or “search” — so nearly all the pages on our site.

Bing and Google Search Operator Documentation

Each search engine has its own set of advanced search operators. Here’s the official documentation from the two major search engines for your reference:

Search Operators Used in SEO Research

Here are seven ways to use the search commands for SEO research:

  1. Analyze the competition
  2. Find information about a specific page or site
  3. Discover indexing problems
  4. Help with site maintenance
  5. Further refine results
  6. Find social profiles
  7. Find potential internal links

In the examples below, the search query is in bold.

1. Analyze the Competition

related:bruceclay.com
The related: operator gives you a glimpse of competitor content. You’ll see a small selection of what Google considers to be similar. Then you can analyze their SEO metrics — including word count, keyword use, meta data and inbound links — so that you can make your page equal to and then better than the competition.

allintitle:seo blog
This query brings up webpages that have both “SEO” and “blog” in their metadata title. We could use this to find competing blogs to our own. The search operators allintitle: and intitle: let you find pages using your keywords in title tags.

Similarly, the commands allinurl: and inurl: let you identify competitors that use keywords in URLs. (Note that as of this writing, the intitle: command works in both Google and Bing searches, but allintitle:, allinurl: and inurl: work only in Google.)

cache:https://www.bruceclay.com/seo/
The cache: command shows you a search engine’s cached version of a page. It’s a way to check how the search engine actually sees your page. Cache shows what page content the search engine considers relevant to retrieve, making this Google search operator a valuable SEO diagnostic tool.

2. Find Information About a Specific Page or Site

info:competitorsite.com
Using the info: command in Bing gives you results that seem like a collection of these advanced search operators. It’s a one-stop shop to access a variety of onsite and offsite results about a website. Note: Google deprecated the info: operator in 2017.

3. Discover Indexing Problems

site:yourdomain.com
A site: command shows how many pages the search engine has indexed. Though the total number of results is only an approximation, it is a quick way to find out if you have an indexing problem — either too few or too many pages in the index.

site:yourdomain.com/blog/*
Specify a particular subfolder of your site to see how many pages it contains. For instance, adding the wildcard * finds all pages under /blog/.

4. Help with Site Maintenance

site:yourdomain.com contains:pdf
The contains: Bing search operator gives you a powerful tool to find links within a site that point to a particular type of file. For example, the query above lets you locate every page on your site that has a link to a PDF file.

5. Further Refine Results

cats -musical
A minus sign (-) before a keyword removes any results with that word. Again, it’s a way to help filter results when a query might be ambiguous. If you’re looking for info about cats the animal, but there’s a showing of Cats the musical in your town, you can search cats -musical to remove results about the theater production.

intitle:keyword -site:yourdomain.com
You can use the minus sign (-) before a search command, too. The above example finds webpages that have your keyword in the title tag, excluding those on your own site. This reduces the clutter when doing competitor research.
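The same pattern of pairing an operator with a minus-sign exclusion can be sketched as a small helper. This is a hypothetical illustration; the keyword and domain are placeholders:

```python
# Hypothetical helper combining intitle: with a -site: exclusion,
# mirroring the competitor-research query above.

def competitor_title_query(keyword, own_domain):
    """Pages with the keyword in the title tag, excluding your own site."""
    return "intitle:{} -site:{}".format(keyword, own_domain)

print(competitor_title_query("keyword", "yourdomain.com"))
# intitle:keyword -site:yourdomain.com
```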

6. Find Social Profiles

john doe (site:linkedin.com | site:twitter.com)
If you want to get in touch with someone via their social profiles, you can use the site: for social media profiles along with the person’s name (and company name if you have it). This will search any of the social media channels you want to look up for that person. The example above would show LinkedIn and Twitter.
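The social-profile search above can also be generated for a list of networks. A rough Python sketch, where the name and network list are just examples:

```python
# Sketch of the social-profile search above; the name and networks are examples.

def social_profile_query(name, networks=("linkedin.com", "twitter.com")):
    """OR together site: restrictions for each social network."""
    sites = " | ".join("site:{}".format(n) for n in networks)
    return "{} ({})".format(name, sites)

print(social_profile_query("john doe"))
# john doe (site:linkedin.com | site:twitter.com)
```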

7. Find Potential Internal Links

site:bruceclay.com/blog -site:bruceclay.com/seo/siloing/ intext:"siloing"
If you’ve followed me for any amount of time, you know that I recommend siloing as an SEO strategy. A key part of siloing is internal linking.

This advanced search is useful to find potential linking opportunities within a website. The example above combines the site: command with intext:, the minus sign (-) and exact match quotations (“).

What this particular search would do is find any webpages on the blog that mention siloing so that we could link to the main siloing page on the site. It uses the minus sign to exclude the page we want to link to from other pages.
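The internal-link discovery query above lends itself to templating when you have many silo pages to work through. A minimal sketch (the paths come from the example; the helper itself is hypothetical):

```python
# Sketch of the internal-link discovery query described above; the paths are
# taken from the example, and the helper itself is hypothetical.

def internal_link_query(section, exclude_page, phrase):
    """Combine site:, -site:, and an exact-match intext: into one query."""
    return 'site:{} -site:{} intext:"{}"'.format(section, exclude_page, phrase)

print(internal_link_query("bruceclay.com/blog", "bruceclay.com/seo/siloing/", "siloing"))
# site:bruceclay.com/blog -site:bruceclay.com/seo/siloing/ intext:"siloing"
```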

List of Advanced Search Operators for SEO (Cheat Sheet)

In the table below, you’ll find the search engine operators that we routinely use in SEO research. (This is not an all-inclusive list.)

Google | Bing | Result
allinanchor: | | Finds pages that include all query words in the anchor text of inbound links.
allintext: | | Returns webpages with all the words somewhere on the webpage.
allintitle: | | Finds pages that include all query words as part of the indexed title tag.
allinurl: | | Finds a specific URL in the search engine’s index. Also can be used to find pages whose URLs contain all the specified words.
AROUND() | | Finds webpages with words that are in a certain proximity to one another.
cache: | | Shows the version of the webpage from Google’s cache.
 | contains: | Finds webpages that contain links to a particular type of file (such as pdf, mp3). This function is unique to Bing.
define: | | Presents a dictionary definition.
ext: | ext: | Returns only webpages with the file extension you specify (such as htm). Note: Bing includes this operator in its current list, but our tests could not produce reliable results.
filetype: | filetype: | Finds results of a single type only (such as pdf).
 | feed: | Finds RSS / Atom feeds on a site for the search term.
 | hasfeed: | Finds webpages with an RSS / Atom feed on the search term.
in | | Converts units of measure such as temperature, currency, etc.
 | info: | Presents some information that Bing has about a webpage, such as related pages from the site, external pages talking about the page, and related results. This operator is not listed in the current Bing documentation, but our tests show that it continues to work.
intext: | | Shows pages that contain a specific word in their body text.
intitle: | intitle: | Finds pages that include a specific word as part of the indexed title tag.
inurl: | | Finds pages that include a specific keyword in their indexed URLs.
 | inanchor: | Finds webpages that use a specified keyword as anchor text in a link from the page.
 | inbody: | Finds webpages that use a specified keyword in the body section of the page.
 | ip: | Finds sites hosted by a certain IP address.
 | language: | Finds webpages in a specified language.
 | location: | Finds webpages from a certain country / region.
map: | | Finds a map result for the query.
movie: | | Finds information about movies.
OR | OR | Finds webpages that match either query when used between two queries. Must be capitalized to work correctly.
 | prefer: | Adds emphasis to a search term to refine the results further.
related: | | Finds sites related to the domain you input.
site: | site: | Restricts the search to pages within a particular domain and all its subdomains.
source: | | Finds news results from a specific news source in Google News.
stocks: | | Displays stock information for a specific ticker symbol.
 | url: | Checks if a domain is in the Bing index.
weather: | | Shows weather for a specific location.
* | * | Acts like a wildcard that can take the place of any word or phrase. Example: tallest * in the world
- | - | Excludes results that contain the word following the minus sign. Place this operation at the end of your search query.
" " | " " | Finds instances of the exact text within the quotation marks everywhere it appears in the search engine’s index.
@ | | Searches social media for a certain query when put in front of the word(s).
$ | | Searches for a price when put in front of the query.
# | | Searches for hashtags.
.. | | Searches a range of numbers when put between two numbers.
 | () | Finds or excludes webpages with a group of words contained within the parentheses.

Doing research takes time, especially when there are so many search engine results to sift through. These advanced search operators will get you searching like a pro – more efficiently and with better results.


If you liked this advanced search operators guide, please share it!
We have lots of tips for search marketing. (It’s what we do!) Learn more ways to improve your SEO, PPC, and overall ROI by subscribing to our blog.

FAQ: How can I use Advanced Search Operators to extract precise information from search engines?

Becoming proficient with advanced search operators will enable you to extract accurate and pertinent data from search engines. These commands act like filters, allowing you to tailor queries and get more precise results.

To begin, consider using quotation marks around your search query to find an exact phrase match. This can be especially helpful when researching specific topics or phrases. For example, searching for “advanced search tips” (with the quotes) will retrieve pages containing that exact phrase.

Another powerful operator is the “site:” command, which allows you to search within a specific website. This is particularly useful when conducting competitor analysis or investigating a particular site. For instance, by using “site:example.com SEO tips,” you can narrow down your search to SEO-related content within the specified domain.

Combining multiple search operators can further refine your results. For instance, combining the site search with quotation marks can help you find longer phrases within a particular website. This is invaluable for in-depth research.

Other search engines, like Bing, offer their own suite of Advanced Search Operators that you can utilize. You can learn more by consulting the documentation provided by these search engines. It’s important to note that while these operators enhance your search experience, precision and relevance may vary. Therefore, it’s essential to become familiar with the various operators and practice using them to refine your results effectively.

Mastering advanced search operators allows you to extract precise information from search engine results quickly and accurately, saving time while meeting research and analytical requirements more effectively. With the right combinations and a little practice, you can surface information that ordinary searches leave buried.

Step-by-Step Procedure: How to Use Advanced Search Operators to Extract Precise Information from Search Engines:

  1. Understanding the Basics: Familiarize yourself with the concept and functionality of Advanced Search Operators found within search engines.
  2. Identify Key Operators: Explore the most commonly used Advanced Search Operators, such as quotation marks and site search.
  3. Use Quotation Marks: Practice using quotation marks to search for exact phrase matches, which are particularly useful for specific research topics.
  4. Master the Site Search: Learn how to use the “site:” command to filter your search results within a specific website or domain.
  5. Combine Operators: Experiment with combining multiple operators to refine your search results further.
  6. Utilize Other Operators: Explore additional operators, such as “inurl,” “intitle,” and “cache,” to enhance your search capabilities.
  7. Study Documentation: Refer to the official documentation provided by search engines like Google and Bing to understand the full range of available operators.
  8. Analyze Competitors: Use Advanced Search Operators to conduct competitor analysis, gaining insights into their SEO strategies and content.
  9. Discover Indexing Problems: Use the “site:” command to identify potential indexing issues with your website or specific subfolders.
  10. Find Internal Links: Use operators like “intext” and “inurl” to locate potential internal linking opportunities within your website.
  11. Analyze Social Profiles: Utilize the “site:” operator with social media platforms to find and connect with individuals through their profiles.
  12. Refine Search Results: Practice using the minus sign (-) to exclude specific keywords from your search queries, improving result relevance.
  13. Test Different Queries: Experiment with various combinations of Advanced Search Operators to find the most effective search results for your needs.
  14. Verify Information: Double-check the accuracy and relevance of the information obtained through Advanced Search Operators.
  15. Enhance Research Efficiency: Incorporate Advanced Search Operators into your regular research workflow to save time and improve data accuracy.
  16. Stay Updated: Stay informed about changes or updates to search engine operators to adapt and improve your search techniques.
  17. Share Knowledge: Educate your team or colleagues about Advanced Search Operators to enhance overall research productivity.
  18. Practice and Patience: Become proficient in using Advanced Search Operators through consistent practice and patience.
  19. Continuous Learning: Stay curious and keep exploring new ways to leverage Advanced Search Operators for research and analysis.
  20. Mastering Advanced Search Operators: Achieve expertise and efficiency by mastering Advanced Search Operators and becoming a proficient researcher in your field.

The post Advanced Search Operators for Bing and Google (Guide and Cheat Sheet) appeared first on Bruce Clay, Inc..

]]>
https://www.bruceclay.com/blog/bing-google-advanced-search-operators/feed/ 83
Does Google Favor Brands Regardless of SEO? https://www.bruceclay.com/blog/does-google-favor-brands-regardless-seo/ https://www.bruceclay.com/blog/does-google-favor-brands-regardless-seo/#comments Thu, 22 Sep 2022 17:32:00 +0000 https://www.bruceclay.com/?p=169983 There are many factors that can contribute to why big brands tend to show up in the search results. But are the giants always better than the little guys? The answer is no, but that means smaller brands have to work much harder.

The post Does Google Favor Brands Regardless of SEO? appeared first on Bruce Clay, Inc..

]]>
Google homepage on a laptop.
Ever since the “Vince” algorithm update in 2009, big brands seem to have gotten a leg up in the search results.

But it’s not just about being a big brand. It’s about the signals that come along with having a strong brand: expertise, authority, and trust.

Presumably, big brands have bigger budgets and more resources, allowing them to do better SEO and compete better in the search results.

Feedback Loop for Google

A more aggressive SEO strategy with all the right components can increase expertise, authority, and trust signals to Google. So, Google will reward that brand’s website.

If that brand ends up in position one on the search results, it will begin to garner more clickthrough than the other results.

And if people dwell on that site for a significant period of time, that could also be a signal that the page is very relevant to users.

All this can then act as a feedback loop for Google to continue to reward that page.

Don’t forget also that if you have a strong brand, people will recognize that, and it may impact their willingness to click on a result, too.

Familiarity Matters

One study showed that 26% of respondents said a familiar brand was a reason for clicking on search ads. Even though that study refers to ads, it is telling nonetheless how people may react to brands in the organic search results.

So, there are many factors that can contribute to why big brands tend to show up in search results.

But are the giants always better than the little guys? The answer is no. In fact, a lot of times, big brands have more turnover and are less agile than the smaller guys.

However, despite a smaller company’s best efforts, there are times when big brands still outrank them.

For example, take RankBrain, Google Search’s machine learning component applied to its search results. It *may* enhance the favoritism of brands in the results, as I’ve written about here:

Because Google tends to favor big brands online for a variety of reasons, with RankBrain things like the site’s engagement rate, mentions of the brand across many social sites and so on could further enhance favoritism here. This could happen despite the fact that some bigger brands may have a weaker link profile than other websites in their space.

So, what do you do when you think you can do better than the big guys that are ranking number one? You try it!

Despite the fact that it can take a lot of work (and perhaps a bigger SEO budget), you can try to be “least imperfect” compared to the brand that is ranking for your desired keywords in the search results.

Remember, SEO is about beating the competition, not the algorithm. Also, remember that agility and speed to implement changes is a smaller company’s weapon against the big guys. Don’t be surprised by the traffic you didn’t get for the SEO work you didn’t do.

Will it always work? No. There are situations where it can be near impossible.

For example, if you are trying to compete for product-related keywords against a big brand, or if you have a history of spam or a manual penalty of some sort.

One way we’ve been able to help clients go up against big brands is by rethinking their keyword strategy altogether. Historically, brands have not been very good at owning long-tail keywords (three- to five-word phrases).

There is a lot of opportunity when you go after these keywords — you can get a good amount of traffic.

So does Google favor brands in the search results regardless of their SEO?

It occurred to me that this article could have been just one word – YES! But keeping to SEO answer standards – IT DEPENDS. Google tries to rank the highest quality, most relevant pages for a query. And major brands tend to have a lot of signals that Google is looking for.

Google is not perfect, however. That means smaller brands need to work harder to compete.

I have written about this topic at length in the past. So for in-depth SEO tips, see my article on how to beat the giants in the search results. And check out 50 “white hat” ways to get authority links to your site.

This article was inspired by a question I received during our live monthly Q&A sessions for SEOtraining.com members. Each month we meet to discuss all things SEO and answer member questions directly. If you’d like me to answer your SEO question live, head over to SEOtraining.com and sign up for membership.

FAQ: How do big brands leverage their advantage in SEO to dominate search results?

Big brands have mastered the art of leveraging their inherent advantages to secure top positions in search results. As an authoritative voice in the realm of SEO, I am excited to unveil the tactics that enable these brands to establish and maintain their dominance.

Harnessing the Power of Brand Signals

Big brands have a distinct advantage due to their reputation, expertise, and trustworthiness. These brand signals send a strong message to search engines, affirming the credibility and relevance of their content. By consistently delivering high-quality and authoritative content, big brands build a solid foundation that propels them to the forefront of search results.

Strategic Content Creation and Optimization

One of the cornerstones of big brands’ SEO success lies in their strategic content creation and optimization approach. These brands craft meticulously researched and informative content that resonates with their target audience. Incorporating relevant keywords and semantic variations, they optimize their content to align seamlessly with user intent. This strategic alignment significantly enhances their visibility in search results.

Elevating User Experience and Engagement

User experience plays a pivotal role in SEO dominance. Big brands invest in creating user-friendly websites that offer intuitive navigation, fast loading times, and mobile responsiveness. Providing a seamless and engaging browsing experience encourages users to spend more time on their site. This extended dwell time signals to search engines that the content is valuable and relevant, further solidifying their position in search rankings.

Strategies for Building Quality Backlinks

Big brands understand the importance of building a robust backlink profile. They forge strategic partnerships, collaborate with influencers, and engage in authoritative guest posting to acquire high-quality backlinks from reputable sources. These backlinks serve as a vote of confidence from the online community, enhancing their authority and visibility in search results.

Staying Ahead of Algorithm Changes

Search engine algorithms are constantly evolving, and big brands quickly adapt. They regularly monitor algorithm updates and industry trends and adjust their strategies to align with the latest best practices. By staying ahead of the curve, they maintain their competitive edge and continue to dominate search results.

The dominance of big brands in search results comes from strategic and multifaceted SEO techniques. By capitalizing on brand signals, creating optimized content, prioritizing user experience, building quality backlinks, and staying agile in the face of algorithm changes, these brands carve a path to search supremacy. For businesses aspiring to emulate their success, a comprehensive and strategic SEO approach is the key to unlocking similar achievements.

Step-by-Step Procedure for Leveraging SEO Advantage for Search Dominance:

  1. Establish a robust brand identity focusing on expertise, authority, and trustworthiness.
  2. Conduct thorough keyword research to identify relevant search terms and user intent.
  3. Create high-quality, informative content aligned with user needs and search intent.
  4. Optimize content using relevant keywords, semantic variations, and structured data.
  5. Ensure a seamless user experience with a mobile-friendly and user-friendly website design.
  6. Monitor website speed and performance to enhance user engagement and dwell time.
  7. Develop a strategic backlink acquisition plan, including influencer collaborations and guest posting.
  8. Focus on building quality backlinks from reputable and authoritative sources.
  9. Regularly audit and optimize existing content to maintain relevance and search visibility.
  10. Stay updated with search engine algorithm changes and industry trends.
  11. Adapt SEO strategies in response to algorithm updates to maintain a competitive edge.
  12. Continuously track and analyze website performance using analytics tools.
  13. Monitor keyword rankings and organic search traffic to assess SEO effectiveness.
  14. Leverage data-driven insights to refine content and SEO strategies over time.
  15. Engage with your target audience through social media and online communities.
  16. Encourage user-generated content and reviews to enhance brand credibility.
  17. Foster partnerships and collaborations within your industry to amplify your brand’s reach.
  18. Regularly update and refresh content to provide up-to-date and relevant information.
  19. Implement structured data markup to enhance search engine understanding of your content.
  20. Cultivate a continuous learning and optimization culture within your SEO team.

The post Does Google Favor Brands Regardless of SEO? appeared first on Bruce Clay, Inc..

]]>
https://www.bruceclay.com/blog/does-google-favor-brands-regardless-seo/feed/ 10
Search Quality Rater Guidelines Checklist: Evaluator Considerations https://www.bruceclay.com/blog/google-search-quality-rating-guidelines/ https://www.bruceclay.com/blog/google-search-quality-rating-guidelines/#comments Tue, 19 Oct 2021 19:45:00 +0000 http://www.bruceclay.com/blog/?p=38971 Google’s Search Quality Evaluator Guidelines give us clues about what the search engine focuses on, and consequently, what SEOs must focus on, too. For years, the buzzword for search engine optimization was “relevance” — making your site the most relevant result for a searcher’s query. But it’s all about usefulness today and moving forward. The goal of the search engine is simple: increase searcher satisfaction.

Here’s our checklist for making sure your SEO campaign aligns with Google’s priorities.

The post Search Quality Rater Guidelines Checklist: Evaluator Considerations appeared first on Bruce Clay, Inc..

]]>
EDITOR’S NOTE: You can always find Google’s current Search Quality Rater Guidelines here.

Google’s update of its Search Quality Rater Guidelines shows a shift in focus for the search engine and, consequently, for SEOs. BTW, the Google PDF file name says Evaluator and not Rater… but it is Rater.

For years, the buzzword for search engine optimization has been “relevance” — making your site the most relevant result for a searcher’s query. But as Duane Forrester, our former VP of organic search operations, observed: “It’s all about usefulness today and moving forward. The goal of the search engine is simple: increase searcher satisfaction. That’s why ‘useful’ is the new watchword. Previously we said ‘relevant,’ but really we all meant ‘useful.’”

Google regularly updates its internal guidelines document that tells hired human quality raters how to evaluate sites as part of Google’s ongoing experiments. We in the search industry usually get only leaked tidbits and summaries to read. But last month, in a rare gesture, Google published the guidelines as a PDF for all to read.

While it doesn’t reveal any ranking formulas or algo secrets, the 175-page document complete with many examples and screenshots does offer a coveted view of what the search engine considers priority. As Google’s announcement states, “the guidelines reflect what Google thinks search users want” and therefore can help webmasters and business owners “understand what Google looks for in web pages.”

The guidelines are not the algorithm. But they show what Google focuses on, and that’s worth paying attention to.

What’s important for business owners is not all of the nitty-gritty technical details. Leave those to your SEO. Instead, business decision-makers need to glean what Google’s focus is so they can allot budgets and assign priorities correctly in a website strategy that’s aligned with what Google rewards.

It’s all about usefulness today and moving forward

Aligning Your Website with Google’s Priorities

Search engine priorities change over time, and your SEO strategy has to adapt. When you compare this 2015 version to previously leaked Google quality rater’s guidelines (as Jennifer Slegg does here and here), the differences point out how Google’s focus is shifting. The two biggest changes are:

  • Mobile everything: Not only is there a whole new section devoted to mobile quality, but also most of the examples now show screenshots taken on mobile devices.
  • Needs Met focus: A new rating scale judges how fully a web page result meets a mobile searcher’s need. Any site that is NOT mobile-friendly automatically fails this criterion. The entirely new section for judging Needs Met proves that Google is all about satisfying the searcher’s needs.

Here’s our checklist for making sure your SEO campaign aligns with Google’s priorities.

Mobile, Front and Center

Is your site really mobile-friendly?

Earning a passing grade on Google’s Mobile-Friendly Test tool is the bare minimum required for all web pages and apps now. Beyond this, you must make sure that tasks are easy to accomplish with a mobile device. From the guidelines, here’s a checklist you can use to evaluate how your site performs with a smartphone:

  • How easy/hard is it to fill out forms or enter data?
  • How does the site or app behave on a small screen? Are all features usable?
  • Is the content legible without requiring left-to-right scrolling to read text?
  • Do images fit on a small screen?
  • How easily can someone navigate? Are menus, buttons and links large enough?
  • What happens on your site when Internet connectivity is inconsistent or slow?

Needs Met or Not

How well does your site anticipate and fulfill a mobile user’s needs?

Another entirely new section added to Google’s quality rating guidelines is called “Needs Met Rating Guideline.” Here’s the description, which is clearly targeting MOBILE users’ needs (from Section 13.0):

Needs Met rating tasks ask you to focus on mobile user needs and think about how helpful and satisfying the result is for the mobile users.

To get a high quality rating in the Needs Met category, a search result and its landing page should:

  • Require minimal effort for users to immediately get or use what they’re looking for.
  • Satisfy all or almost all users looking for the same thing (so that they wouldn’t need to see additional results).
  • Provide trustworthy, authoritative, and/or complete information that is helpful.

A mobile user’s intent differs from that of a desktop or even tablet user. (Tip: Aaron Levy’s SMX presentation covers mobile audiences in depth.) Evidence of this is found in the new mobile section of Google’s Search Quality Rating Guidelines, where page after page of examples show what mobile users probably want when they search for various spoken or typed queries. At one point, raters are instructed to “think about mobile users when deciding if queries are [a particular type]. Use your judgment here.”

The takeaway for mobile SEO marketers as well as for app/website owners is this: Think about what mobile users may be trying to do, and make sure that your site fulfills these things as directly as possible. Google is all about satisfying mobile users’ needs; you should be, too.

Answering this question takes some serious thought, but ultimately pays off in spades.

Purpose-Driven Pages

Does the webpage have a clear purpose, and how well is it achieved?

One of the first tasks a rater must do is figure out what a webpage is for and then decide how well the page achieves that purpose. For example, the purpose of a news site homepage is to display news; the purpose of a shopping page is to sell or give information about a product; etc. Google has very different standards for different types of pages, so understanding a page’s purpose lays the foundation for assessing its quality.

How helpful is the page’s purpose?

Google wants each page to be geared to helping users. Helpfulness factors heavily into quality ratings. On the low end of the quality scale would be pages that harm or deceive users (even though they may be fulfilling their designed purpose).

To be deemed high quality, a page must have a helpful purpose, such as:

  • To share information about a topic
  • To share personal or social information
  • To share pictures, videos, or other forms of media
  • To entertain
  • To express an opinion or point of view
  • To sell products or services
  • To allow users to share files or download software
  • … many others.

Is the purpose of the website as a whole clear, on and off site?

Make sure that your website’s overall purpose is explained clearly, ideally on the About page. The rating guidelines include examples of pages with “non-obvious purposes” — pages that seemed pointless or inaccurate on their own, until the rater referred to the About or FAQ page and discovered they were actually beneficial (see Section 2.2).

In addition, Google looks at independent sources to see whether the site’s reputation matches what it claims about itself. If there’s conflict, Google will tend to believe what the outside sources have to say. For small businesses or organizations, a lack of reviews or reputation information does not mean the site is low quality (see Section 2.7).

Meaty Main Content and Helpful Secondary Content

Does the page have quality main content?

A webpage’s main content (which excludes ads, sidebars, and other supplementary parts that do not directly fulfill the page’s purpose) can earn a high quality rating if ALL of these are true:

  • There is a satisfying amount of high quality main content on the page.
  • The page and site have a high level of E-E-A-T (experience, expertise, authoritativeness and trustworthiness).
  • The site has a good reputation for the page’s topic.

There are no hard and fast rules, and no minimum number of words per page. The guidelines encourage raters to decide whether the main content fulfills the purpose of the page satisfactorily.

Is there any supplementary content on the page that is helpful to users?

Google recognizes that supplementary content “can be a large part of what makes a High quality page very satisfying for its purpose.” Consider what you can include to offer related information, ways to find other cool stuff, or specialized content that could be helpful to people visiting that page.

YMYL Pages Have Higher Standards

How high quality are your site’s YMYL pages?

Pages that can impact a person’s “future happiness, health, or wealth” are known as Your Money or Your Life (YMYL) pages. Google first introduced this concept in the 2014 Search Quality Rating Guidelines, which held these types of pages to a much higher standard across all quality criteria. Examples include pages for shopping transactions, financial information, medical advice, legal information, and many more.

Google specifies “needs met” ratings that judge how well a webpage fulfills a searcher’s needs. If you have YMYL pages, needs met is particularly important.

Maintaining Your Site

Does your site look alive and well-maintained?

Raters are instructed to “poke around” to see whether a site is being maintained. Here are a few signs of life Google expects of a well-maintained, quality website:

  • Links should work.
  • Images should load.
  • Pages should continue to function well for users as web browsers change.

How fresh is your content?

Google’s algorithm is known to look at “freshness” as a ranking factor for many types of queries. When Googlebot gets to your site, does it find any recently added or updated content?

For blog posts and other content that is dated, don’t try to game the system by setting up a program to automatically change dates to make things look recent; Google’s on to that scheme. Raters are even instructed to manually check the Wayback Machine to investigate suspicious dates to see whether content is copied or original (see Section 7.4.7). By the way, Google’s algorithm doesn’t need the Wayback Machine to recognize original content, so don’t even try to cheat.

A healthy website frequently adds new content and/or updates old content to keep things fresh and useful for site visitors.

How expert is your content?

Thomas the really useful engine
Thomas the Tank Engine had the right idea all along.
(photo credit: Tommy Stubbs/Random House)

We know from the 2014 guidelines that Google quality raters look for signs of E-A-T, which stands for expertise, authoritativeness and trustworthiness (the newer E-E-A-T adds an “E” for experience). The newest guidelines reinforce this concept, but define “expertise” differently depending on the topic of the page (according to Section 4.3):

  • There are “expert” websites of all types, even gossip sites, forums, etc.
  • Topics such as medical, legal, financial or tax advice, home remodeling, or parenting “should come from expert sources.”
  • Topics on hobbies, such as photography or learning to play an instrument, “also require expertise.”
  • Ordinary people may have “everyday expertise” on topics where they have life experience, such as people who write extremely detailed reviews, tips posted on forums, personal experiences, etc.

Make sure your expert content is “maintained and updated” to increase your site’s E-E-A-T rating.

About Advertising

If you have ads or monetized links on your site, are they appropriate for the page’s purpose?

The guidelines state that “the presence or absence of Ads is not by itself a reason for a High or Low quality rating” because Google realizes that many websites and apps owe their existence to ad income. However, Google “will consider a website responsible for the overall quality of the Ads displayed” (see Section 2.4.3). So keep an eye on the amount and use of affiliate, display, or other types of advertising. Make sure that ads don’t overwhelm the more useful main content (and supplementary content, if any) that each page contains.

Wrapping Up Your Quality Review

The old saying goes that there’s always room for improvement. This post is by no means a complete SEO checklist. We hope that as you apply these points from the 2015 search quality rater guidelines, which reflect Google’s priorities, you’ll begin to view your online properties with a new SEO point of view and make your sites and apps more useful.

If you’re eyeing the best way to improve your website quality and would like to have a free consultation, fill out a quote request, and we’ll give you a call.

FAQ: How can I align my website with Google’s priorities using the Search Quality Guidelines?

Creating a website that resonates with Google’s evolving priorities is crucial for sustainable online success. Google Search Quality Guidelines provide a roadmap to effectively understand and implement these priorities.

Mobile-friendliness is a pivotal factor in Google’s ranking algorithm. Websites that offer a seamless experience across various devices garner higher search visibility. As Google emphasizes mobile-first indexing, responsive design becomes essential. Mobile-friendly pages enhance user experience and cater to increasing mobile search users.

Content quality is another cornerstone. Google’s emphasis on “usefulness” encourages webmasters to provide informative, engaging, and relevant content. Comprehensive, well-researched articles showcase expertise, boosting credibility and user engagement. Balancing text with multimedia elements like images and videos enhances the content’s appeal.

User intent is at the heart of Google’s priorities. Ensuring your website meets user needs is vital. Analyze your audience’s queries and preferences to provide solutions that resonate. Optimize conversational queries by incorporating natural language in your content. Addressing user intent fosters longer time spent on your site, positively impacting ranking signals.

E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) matters greatly. Establish your expertise through author bios, showcasing qualifications, and linking to reputable sources. Leverage authoritative backlinks to credible sites, enhancing your website’s trustworthiness. Regularly update content to demonstrate an ongoing commitment to accuracy and relevance.

User experience encompasses site speed, navigation, and design. A smooth browsing experience reduces bounce rates and increases user satisfaction. Minimize page loading times, ensure intuitive navigation, and maintain a clean design. A visually appealing and user-friendly website fosters positive interactions and contributes to higher rankings.

Step-by-Step Procedure: Aligning Your Website with Google’s Priorities

  1. Mobile-Friendly Optimization: Implement responsive design, ensuring a consistent experience across devices.
  2. Content Quality Enhancement: Craft well-researched, engaging content that caters to user needs.
  3. Address User Intent: Analyze user queries to provide relevant solutions and foster engagement.
  4. Establish E-E-A-T: Showcase experience, expertise, authoritativeness, and trustworthiness through bios and backlinks.
  5. Optimize User Experience: Prioritize site speed, intuitive navigation, and a visually appealing design.
  6. Keyword Research: Identify relevant keywords to target user queries effectively.
  7. Semantic Search Integration: Incorporate natural language and contextually relevant terms in content.
  8. Structured Data Implementation: Use structured data markup to enhance search results display.
  9. Internal Linking Strategy: Establish a logical hierarchy of internal links for easy navigation.
  10. Backlink Acquisition: Acquire authoritative backlinks from reputable websites in your niche.
  11. Regular Content Updates: Keep content fresh and accurate to demonstrate ongoing commitment.
  12. Local SEO Optimization: Optimize for local searches with accurate business information.
  13. Social Media Integration: Share content across social platforms to increase visibility and engagement.
  14. Mobile Page Speed Optimization: Optimize images and minimize code to improve mobile load times.
  15. Schema Markup Utilization: Implement schema markup for rich snippets and enhanced search visibility.
  16. User Engagement Analytics: Monitor user behavior to refine content and design strategies.
  17. Competitor Analysis: Study successful competitors for insights into effective strategies.
  18. Secure Website: Implement HTTPS for improved security and higher trustworthiness.
  19. Reduce Bounce Rates: Create engaging landing pages that address user needs promptly.
  20. Ongoing Monitoring and Adaptation: Continuously analyze performance metrics and adjust strategies accordingly.

The post Search Quality Rater Guidelines Checklist: Evaluator Considerations appeared first on Bruce Clay, Inc..

]]>
https://www.bruceclay.com/blog/google-search-quality-rating-guidelines/feed/ 25
What Are Sitelinks? Best Practices for Google Sitelinks https://www.bruceclay.com/blog/what-are-sitelinks-best-practices-for-google-sitelinks/ https://www.bruceclay.com/blog/what-are-sitelinks-best-practices-for-google-sitelinks/#comments Mon, 26 Jul 2021 17:42:50 +0000 https://www.bruceclay.com/?p=102445 Sitelinks presented on the search engine results page can earn your website more clicks and an above-average click rate. Learn how to utilize this important feature.

The post What Are Sitelinks? Best Practices for Google Sitelinks appeared first on Bruce Clay, Inc..

]]>
Professional excited to see his website's sitelinks present on Google's SERP.
Pop quiz: Which feature can give you more real estate on the search engine results page while driving more clicks to your website? The answer is sitelinks.

According to April 2021 data from Searchmetrics, sitelinks show up for 50% of keywords on desktop and 22% on mobile. Sistrix data showed that when sitelinks are present on the SERP, they yield an above-average click rate of 46.9%.

So how do you get sitelinks? I’ll explain that and more in today’s article:

What Are Sitelinks?

Sitelinks appear as part of a website’s search results listing and are links to other webpages within the website. Sitelinks usually appear for branded (aka “navigational”) search terms, like when a person just types the name of the brand. But they can appear on informational queries as well.

Here’s an example of sitelinks showing for a search for my company, Bruce Clay Inc.:

Google sitelinks for BruceClay.com’s search listing for the search “bruce clay inc”

Clicking on any of those links above will bring you to their subsequent pages on our website. For example, the “blog” link takes you to our blog homepage.

Webmasters don’t have manual control over which sitelinks show up, and much of that has to do with Google’s algorithms. That can annoy some people. But there are some things you can do to influence them, which I’ll discuss later.

A Google help file explains that sitelinks won’t always show up:

We only show sitelinks for results when we think they’ll be useful to the user. If the structure of your site doesn’t allow our algorithms to find good sitelinks, or we don’t think that the sitelinks for your site are relevant for the user’s query, we won’t show them.

How Sitelinks Help SEO

In a nutshell: Sitelinks can improve the user experience, and that is almost always good for SEO. Here are a few ways sitelinks help SEO:

More SERP Real Estate

The more space your result takes up visually on a search engine results page (SERP), the better. Those sitelinks under your listing push the rest of the organic search results farther down the page, so searchers are presented with your navigational options, not someone else’s.

Improved Click-Through Rate

As mentioned in the introduction of this article, search results that appear with sitelinks tend to earn more clicks.

When sitelinks are present on a SERP, Sistrix data shows the average click rate is 46.9% for the results with sitelinks. Position 2 only gets 14% of clicks.

Click-through rates for SERPs with sitelinks, Sistrix

Not only that, but when compared to the average click rate of a SERP without sitelinks, the first position with sitelinks garners 46.9% of clicks versus 28.5% when the SERP is just 10 blue links.

Average click-through rates for SERPs with 10 blue links only, Sistrix

It’s worth noting that while Google usually shows sitelinks for just the top result, some SERPs include sitelinks on several of the top results. For example, positions 1 to 3 have sitelinks in the results for the general information query “about seo”:

Sitelinks can sometimes appear in multiple top-ranked search results.

Drive People to Key Webpages

Hopefully, Google is able to discern the key webpages on your site and include those in your sitelinks. When Google doesn’t get it right, that may be a sign that your internal linking and site structure need improvement.

Nevertheless, sitelinks can help your users discover more than just the homepage on your site right out of the gate. And that means more organic traffic going to various pages within your website, not just the homepage.

You can look in Google Search Console to see how many of your key pages are garnering clicks. Start from the Performance on Search Results report by Pages to see your top pages; then you can click each top URL and view the Queries report to see what terms attracted searchers.

Types of Sitelinks

Sitelinks can show up in the search results in a few different ways.

Vertical, Two-Column Sitelinks
These sitelinks show below the main listing for a website, and can include up to six sitelinks in a two-column, vertical format.

Google vertical, two-column sitelinks for BruceClay.com search listing

Horizontal Row of Sitelinks
These sitelinks show below the main listing for a website and can include up to four hyperlinks in a row. This can show up for both branded and informational queries.

For informational queries, sitelinks may point to sections within a webpage on a topic. This is especially true for lengthy articles, such as the following Wikipedia entry:

Horizontal row of sitelinks for Wikipedia.com article

For branded queries, it can direct people to other pages on your website that are relevant to the query. Google shows the following sitelinks for a search for “bruce clay seo”:

Google displays sitelinks for the query "bruce clay seo."

Sitelinks Extensions (Paid)
Sitelinks extensions are specifically for paid search ads. While this article focuses on organic sitelinks, it’s worth mentioning ad sitelinks since the terms can be confused.

What Is the Sitelinks Search Box?

The sitelinks search box is another automated feature produced by Google in the search results. A search box shows up above your sitelinks so that users can search your site directly from the search results page.

Sitelinks search box showing for Target.com search listing.

The default for this search box is that it is powered by Google Search. But you can use specific markup language that will integrate your website’s internal site search instead.
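As a sketch of what that markup looks like (using example.com as a placeholder domain; your `urlTemplate` must point at your site’s real internal search URL), Google’s documented approach is `WebSite` structured data with a `SearchAction`:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebSite",
  "url": "https://www.example.com/",
  "potentialAction": {
    "@type": "SearchAction",
    "target": {
      "@type": "EntryPoint",
      "urlTemplate": "https://www.example.com/search?q={search_term_string}"
    },
    "query-input": "required name=search_term_string"
  }
}
</script>
```

The `{search_term_string}` placeholder is where Google inserts the user’s query; the `query-input` property tells Google that parameter is required. This goes on your homepage.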

Even by doing this, however, you’re not guaranteed that a sitelinks search box will show up. Google says:

Google doesn’t guarantee that a sitelinks search box will be shown in search results. Additionally, using the sitelinks search box markup doesn’t make it more likely that a sitelinks search box will be shown.

If you do not want the sitelinks search box to show up in SERPs, you have a little more control over this feature versus the regular sitelinks. You can add a specific meta tag to the homepage to prevent a sitelinks search box, per Google:

<meta name="google" content="nositelinkssearchbox" />

Google Sitelinks Best Practices

As I said, you cannot directly control what sitelinks the search engine chooses to display for your site. However, you can influence them with the following best practices.

Good Site Architecture

I believe that the best way to tell Google which pages are the most important on your website is to have a good site architecture that’s driven by best practices in SEO siloing. This includes how you set up your navigation and internal links.

In its help file on sitelinks, Google does mention a few things that can help them understand which pages to choose for sitelinks, and one of them is internal linking:

There are best practices you can follow, however, to improve the quality of your sitelinks. For example, for your site’s internal links, make sure you use anchor text and alt text that’s informative, compact, and avoids repetition.

For more on SEO siloing, see:

Sitemap(s)

An XML sitemap does not directly influence sitelinks, but it can help search engines more easily discover key pages on your website. Similarly, an HTML sitemap helps search engines crawl your site and understand which of your pages are the most important.
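For illustration, a minimal XML sitemap looks like this (example.com and the dates are placeholders; list your own key URLs):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want search engines to discover -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2021-07-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/</loc>
    <lastmod>2021-07-20</lastmod>
  </url>
</urlset>
```

Reference the sitemap file in your robots.txt (`Sitemap: https://www.example.com/sitemap.xml`) or submit it through Google Search Console so search engines can find it.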

To learn more about sitemaps, see:

Structured Data Markup

The more you can tell search engines about your website and help clarify the information for them, the better. Structured data markup does just that. For example, the SiteNavigationElement can help search engines better understand your site structure.
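Google doesn’t guarantee that this markup influences sitelinks, but as a sketch, a site’s main navigation could be described with schema.org’s SiteNavigationElement type like so (the names and example.com URLs are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "ItemList",
  "itemListElement": [
    {
      "@type": "SiteNavigationElement",
      "position": 1,
      "name": "SEO Services",
      "url": "https://www.example.com/seo/"
    },
    {
      "@type": "SiteNavigationElement",
      "position": 2,
      "name": "Blog",
      "url": "https://www.example.com/blog/"
    }
  ]
}
</script>
```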

Good Title Tags

Make sure the title tag is unique and describes the webpage ― for every webpage on your website. This is a best practice, in general, to help search engines understand what the page is about.

Title tags may also show up as-is in the sitelinks menu in the search results. In that case, you want them to look good. So make sure your title tag style is consistent throughout the site. For example, don’t have some in all caps, some in title case and others in lowercase, etc.
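For example, a consistent title tag pattern might look like this (the brand name is a placeholder; the point is the uniform “Page Topic – Brand” style across all pages):

```html
<head>
  <!-- Unique, descriptive, and styled consistently with every other page -->
  <title>What Are Sitelinks? Best Practices for Google Sitelinks - Example Company</title>
</head>
```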

For more, read:

Anchor Links within the Page

I believe that including anchor links to page fragments is a best practice, especially on an informational page. Include a table of contents or just a bulleted list near the top of the article linking down to the main section headers. Not only does this help readers understand the structure of your page and jump to what interests them, but it also helps search engines clarify what’s there and possibly pull sitelinks from your list.
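A table of contents built from fragment links might look like this (the `id` values are placeholders; match them to your actual section headings):

```html
<!-- Table of contents near the top of the article -->
<ul>
  <li><a href="#what-are-sitelinks">What Are Sitelinks?</a></li>
  <li><a href="#best-practices">Google Sitelinks Best Practices</a></li>
</ul>

<!-- Each entry points to an id on a heading further down the page -->
<h2 id="what-are-sitelinks">What Are Sitelinks?</h2>
<h2 id="best-practices">Google Sitelinks Best Practices</h2>
```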

As an example, Google gives sitelinks in the SERP result to our guide on image search ranking. These are pulled from anchor links to headings on that page:

Sitelinks for BruceClay.com article in SERP result for “image search ranking”

Your Website or Company Name

This is something that you may or may not be able to change. But know that when your website or company name is generic or ambiguous, Google will have a harder time determining that your site is indeed the one that should rank No. 1 in the search results. And if your site isn’t the clear No. 1 result, Google won’t show sitelinks for it.

Content Maintenance

This isn’t really a best practice for sitelinks, per se, but more of a best practice for ensuring a good user experience.

Once you know which sitelinks are showing for your website, you want to ensure the pages are maintained well and offer a good user experience.

Sitelinks: Love ‘Em or Hate ‘Em

Some website publishers may not enjoy the sitelinks Google has generated for them. While you can’t disable sitelinks, you can rejoice in the fact that having them can actually be a good sign that Google sees your website and/or brand as reputable.

So your best bet to earn sitelinks or to influence them is to ensure you’re communicating the key pages on your website to Google through all the best practices mentioned above.

If you need assistance with your website SEO, please contact us today and let’s talk about it.

FAQ: How can I optimize my website to benefit from sitelinks and enhance user experience?

As an experienced authority in the realm of SEO and website optimization, I am excited to share my insights on optimizing your website to benefit from sitelinks and elevate user experience. The strategic utilization of sitelinks can significantly impact your website’s visibility in search engine results while providing users with streamlined navigation.

One of the key aspects of optimizing your website for sitelinks is establishing a clear and organized site architecture. Crafting a logical hierarchy of pages and sections not only aids search engines in understanding your content but also enhances user navigation. By categorizing your website’s content into relevant sections, you create a roadmap that directs both search engines and users to the most valuable information.

An often-overlooked factor in sitelinks optimization is the creation of compelling and concise title tags for each page. Craft title tags that accurately reflect the page’s content while being concise enough to capture users’ attention. Well-crafted title tags improve the chances of sitelinks appearing and entice users to explore further.

Engaging in effective internal linking is another vital strategy for sitelinks optimization. By interconnecting relevant pages within your website using descriptive anchor text, you guide users to discover related content. This practice enhances user experience by allowing seamless navigation and provides search engines with valuable information about the relationships between different pages.

One practical tip to enhance the user experience and encourage click-throughs on sitelinks is to update and maintain the linked pages regularly. Keep the content fresh, relevant, and valuable to ensure users find what they seek. Moreover, analyze the performance of sitelinks in Google Search Console to understand which pages garner the most clicks and adjust your optimization efforts accordingly.

Optimizing your website for sitelinks is a multidimensional approach that requires a combination of strategic site architecture, engaging title tags, smart internal linking, and consistent content updates. By integrating these expert strategies, you can elevate your website’s user experience, boost search engine visibility, and reap the benefits of enhanced click-through rates.

Step-by-Step Procedure: Optimizing Your Website for Sitelinks and Enhanced User Experience

  1. Evaluate your current site architecture to identify areas for improvement.
  2. Categorize your website’s content into logical sections and sub-sections.
  3. Craft descriptive and concise title tags for each page, focusing on relevance.
  4. Audit existing internal links and strategically interlink relevant pages using descriptive anchor text.
  5. Create an XML sitemap to assist search engines in discovering and indexing your content.
  6. Leverage structured data markup, such as SiteNavigationElement, to enhance search engine understanding.
  7. Implement anchor links within longer pages to aid both user navigation and sitelinks selection.
  8. Regularly analyze sitelinks performance in Google Search Console.
  9. Identify top-performing sitelinks and ensure their linked pages are up-to-date and valuable.
  10. Continuously update and optimize content to maintain relevancy and engagement.
  11. Monitor user behavior and adjust internal linking strategies based on insights.
  12. Ensure your website’s or company name is unique and distinguishable.
  13. Implement clear navigation menus that guide users through important sections.
  14. Utilize breadcrumb navigation to provide users with context and improve navigation.
  15. Leverage high-quality images and descriptive alt text to enhance visual content.
  16. Optimize site speed and mobile responsiveness for an improved user experience.
  17. Regularly conduct user testing to gather feedback on navigation and usability.
  18. Seek professional assistance to ensure optimal site architecture and SEO practices.
  19. Stay updated on industry trends and algorithm changes to adapt your optimization strategies.
  20. Monitor and assess the impact of your optimization efforts on sitelinks appearance and user engagement.

The post What Are Sitelinks? Best Practices for Google Sitelinks appeared first on Bruce Clay, Inc..

]]>
https://www.bruceclay.com/blog/what-are-sitelinks-best-practices-for-google-sitelinks/feed/ 5