SEO guidelines Archives - Bruce Clay, Inc. https://www.bruceclay.com/blog/tag/seo-guidelines/ SEO and Internet Marketing Tue, 14 Nov 2023 07:37:40 +0000 en-US hourly 1 Search Engine Optimization Best Practices: Standards and Spam Discussion https://www.bruceclay.com/seo/standards/ Tue, 05 Oct 2021 23:43:37 +0000 https://www.bruceclay.com/?page_id=82668

The post Search Engine Optimization Best Practices: Standards and Spam Discussion appeared first on Bruce Clay, Inc.

As an SEO, you’re no doubt aware that the definition of spam changes over time. Thanks to black-hat SEOs, search engines are constantly updating their definitions of spam. Even worse, these definitions may vary between major search engines like Google, Bing, and Yahoo!.

To be an effective SEO, avoiding spam is key. Because of the severe penalties associated with spamming search engines, it’s always best to play it safe. But to play by the rules, you need to know what the rules are.

This article identifies SEO standards that stand the test of time. It defines the most common types of spam you should be aware of. Then we go over some essential ways to optimize your site and its contents without relying on tricks.

We’ve divided this article into a few sections so you can skip around as needed.

Who Fights SEO Spam and Penalties

The SEO is responsible for making sure that you do not have spam on your website or in your website strategy. SEOs work to protect and build your website’s experience, expertise, authoritativeness, and trustworthiness (E-E-A-T), ensure there are no penalties, and repair the site should any appear. Spam runs counter to all of those goals.

The definition of SEO varies. We have an entire SEO Guide that dives into the specifics of how to do search engine optimization and gives you a good foundation.

To start off our discussion of SEO standards, let’s pose a situation.

Situation: A Story of Two Sites

Site A is quite well written and exceptionally relevant for the search keyword “W.” Site B is not as well written, not as content-rich, and nowhere near as relevant. The search engines will not like B.

Site B uses search engine optimization (SEO) technology and a few borderline spam tricks. Suddenly site B outranks site A for the search “W.” This hurts the user experience and lowers user satisfaction with the results from that search engine. Search engines see this as a slap in the face, since their job is to ensure that visitors see relevant content and are happy.

Is it any wonder that search engines are constantly tightening their spam rules? It is one matter to improve the quality, presentation, and general use of keyword phrases on a webpage. It is a different matter to trick the engines into higher rankings without editing the site content.

It is the position of the search engines that the role of the SEO practitioner is to improve the quality, quantity, clarity, and value of content. Quality content allows search engines to select worthy sites based on their unique relevancy factors. SEO practitioners should help search engines by making sites more relevant, clear, and accessible. SEOs should not use spam techniques to inflate the perceived relevancy of inferior sites.

Don’t disguise an inferior site – fix it. Don’t make site B appear to be more relevant than site A; actually make it more relevant.

While some search engines reward off-page SEO technologies, the improvements are often short-lived and of diminishing benefit. The cutoff for what is acceptable is also changing each day. Tricks that work today can get you unlisted tomorrow. Pages that are informative and contribute to the content, usability, and indexability of any site are the goal of SEO.

For too long SEO practitioners engaged in an arms race. Some saw their role as inventing more and more devious technology to trick search engines and beat competitors. Today, search engines have aggressive anti-spam programs, making this strategy ineffective. The news is out — if you want to get search engine rankings for your clients, you have to play well within the rules. And those rules are “no tricks allowed.”

Simply put, work on honest relevancy and win. All others will fade away.

Case in Point: Doorway Pages

At one time, the “doorway page” was used as a portal to dynamic content, and several major engines even endorsed doorway pages as a way to organize and display content on your site. By 2002, however, the search engines had reversed their opinions. Today, search engines consider doorway pages to be spam.

This proves that what works today may not work tomorrow. If you play with fire, you will regret it.

Our advice is to always play in the center of the acceptable area. We also advise you to not experiment with new ways to fool the engines and earn overnight rankings. Even research for the purposes of self-education can cause long-term issues with your page rankings.

General SEO Standards to Practice

There are many different technologies and methodologies used by SEO practitioners. It is not the intent of a Code of Ethics to define HOW the code is met, but rather to set the bounds of compliance. Search engine acceptance depends upon meeting these codes plus SEO standards defined by each search engine. In general, if actions are in compliance with the Code of Ethics and meet the SEO standards of the search engines, then they are allowed.

But remember that what is an allowable trick today may be blacklisted tomorrow. It is better to focus on honest SEO than waste your time on something that will need to be abandoned soon.

These are general guidelines that may vary from search engine to search engine:

  1. Keywords should be relevant, applicable, and clearly associated with page body content.
  2. Keywords should be used as allowed and accepted by the search engines (placement, color, etc.).
  3. Keywords should not be utilized too many times on a page (frequency, density, distribution, etc.). They should be used naturally within the page content.
  4. Redirection technology (if used) should facilitate and improve the user experience. But this is almost always considered a trick and is a frequent cause for removal from an index.
  5. Redirection technology (if used) should always display a page where the body content contains the appropriate keywords (no bait and switch).
  6. Redirection technology (if used) may not alter the URL (redirect) or affect the browser BACK button (cause a loop). It also may not display page information that is not the property of the site owner without sound technical justification (e.g., language redirects).
  7. Pages should not be submitted to the search engines too frequently.
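As a rough illustration of point 3 above, keyword frequency and density can be sanity-checked with a few lines of code. This is a simple word-count sketch, not any search engine’s actual formula, and the sample text is purely illustrative:

```python
import re

def keyword_density(text, keyword):
    """Percentage of words in `text` that exactly match `keyword`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    matches = sum(1 for word in words if word == keyword.lower())
    return 100.0 * matches / len(words)

page = ("SEO standards help search engines rank honest pages. "
        "Good SEO avoids tricks and keeps keyword use natural.")
print(round(keyword_density(page, "seo"), 1))  # 2 of 17 words -> 11.8
```

The point is not to hit a magic number; it is to notice when a keyword is repeated far beyond natural usage.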

Every search engine should support at least the Robots Exclusion Standard. This is not always the case, but it should be.
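For reference, the Robots Exclusion Standard is implemented as a plain-text robots.txt file served from the site root. A minimal, hypothetical example (the domain and paths are placeholders):

```
User-agent: *
Disallow: /private/

Sitemap: https://www.example.com/sitemap.xml
```

Well-behaved crawlers read this file before fetching pages; it is a request, not an enforcement mechanism.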

Guidelines for a search engine or directory may further discuss relevance, spamming, cloaking, or redirection. Usually, these will be discussed in a way that relates to user experience. In general, revising or adding content is good if it improves the user experience. This is the subjective area we must all interpret, and it is why the rules change so often.

We recommend reading Google’s Search Quality Evaluator Guidelines document for specific examples of what the search engine considers high- and low-quality web content for search visitors.

The Players

There are three main players when it comes to search engine optimization:

  1. Clients – Owners of the website. Client emphasis is on sales, holding users (sticky), and user experience. There is also an emphasis on getting the visitor to take a desired action.
  2. Search Engines – Emphasis is on providing a positive user experience. This is achieved through relevance (controlled by algorithms) and minimal negative impact as a result of bait-and-switch technologies.
  3. SEO Firms – Professionals who obtain traffic for client sites as a result of a search engine query. This involves understanding the SE ranking algorithms and beating the competing SEO firms optimizing other clients for the same terms. It is important to follow SEO standards and remain within the “No Spam” boundaries (play within the rules) while doing this. SEO practitioners are paid by clients and are rewarded for rankings at almost any price.

Unfortunately, if the rules change, sites may be dropped from search engine indexes. If algorithms change, sites may be lowered in the rankings. If competing SEO firms are successful in finding a new trick within the rules, site rankings may fall. If new competing client sites enter the market, site rankings may drop. If the client site uploads altered pages or changes server technology, site rankings may drop.

SEO Processes

There are four main page-centric SEO processes used by search engine optimization firms:

  1. Editing Client Webpages: This means making revisions to a client site’s pages so that they can rank higher in the search engines. This is honest SEO work and involves editing real, honest website pages. The improvements better serve users and raise the quality of the page. This is the bread-and-butter of legitimate SEO firms. It is the clear winner when it comes to obtaining meaningful and long-lasting rankings.
  2. Man-Made Pages: These are a “doorway-like” technology (shadow page) that is keyword intensive. When visited, these pages should present an honest site page. This is a labor-intensive process that copies a real honest page and then alters it to emphasize keywords found on the honest page (page presented). In some implementations, this page loads the presented page into a frameset; in others, it redirects. This is not to be confused with web design, which adds extra content to a site that is intended for human visitors. ANY man-made page that is not intended for human visitors, no matter how great the content, is considered spam.
  3. Machine-Made Pages: These are often a “doorway-like” page where content is pulled from other site content based upon keywords. This content is then compiled by a software tool. Some generate pages using gibberish or templates that are easily detected by the search engines. This type of tool could generate thousands of pages in minutes. ANY machine-generated page that is not intended for human visitors, no matter how great the content, is considered spam.
  4. Cloaking: This is often associated with sites doing IP and USER-AGENT serving where the internet server will present a page that will vary based upon the visitor characteristics. This technology can be used to present different content to each search engine or browser. Because of this, a search engine seldom sees the same content that is presented to a browser. While there are acceptable reasons to do cloaking (such as legal restrictions of what content may be shown based on a visitor’s age or location), the use of cloaking that filters content based on whether the visitor is a spider or a human, no matter how great the content, is likely to be considered spam.
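To make the cloaking pattern concrete, here is a minimal sketch of user-agent-based serving (illustration only; the crawler tokens are real, but the file names are hypothetical). This is precisely the pattern that search engines flag as spam when the two pages differ:

```python
# ILLUSTRATION ONLY: serving spiders different content than humans
# is the cloaking pattern search engines treat as spam.
SPIDER_TOKENS = ("googlebot", "bingbot")

def choose_page(user_agent):
    """Return a different page for crawlers than for human visitors."""
    if any(token in user_agent.lower() for token in SPIDER_TOKENS):
        return "keyword-stuffed-page.html"  # shown only to crawlers
    return "regular-page.html"             # shown to real visitors

print(choose_page("Mozilla/5.0 (compatible; Googlebot/2.1)"))
```

Acceptable variations (legal age or location gates) key off visitor characteristics, not off whether the visitor is a search engine spider.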

Editing Focus/Methodology

The primary methods used to improve search engine ranking are discussed on our site. This section lists a couple of areas that are affected by current SEO standards and are called out for special notice.

  1. Navigation: the use of links to encourage spiders to locate content within the website, and to support popularity algorithms.
  2. Content: the inclusion or focus on words, phrases and themes associated with search engine query strings.
  3. Transfers: pages that display (or transfer) to a real honest page. These pages are keyword-rich and theme-based for search engines. Yet they provide a reformatted page for the browser. This is very much like a standard Frames implementation in conjunction with a search engine optimized no-frames area. This practice includes URL-switching where the displayed page has a browser address that is different from the URL in the search engine link (thus is a redirection). It also includes instances where the browser back button causes a loop.

Bad Practice Issues

What makes a bad search engine optimization practice? When asking SEOs this question, spam and cloaking seem to be the leading answers. We present these items as bad practices and encourage others to submit ideas for this list as well. Some of these SEO practices were once accepted by the search engines but have become “bad” over time as the search engines have evolved to combat their individual notions of webspam.

Transparent, hidden, misleading and inconspicuous links — This includes the use of any transparent image for a link and the use of hidden links (possibly in div/layers). It also includes any link associated with a graphic without words/symbols that can be interpreted as representing the effect of taking the link. Also included are inconspicuous links like 1×1 pixel graphics or the use of links on punctuation. All of these would be considered “spam” and a cause for removal from a search engine index.
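For illustration, here is what the link patterns described above can look like in markup (the URLs and file names are hypothetical). All of these risk removal from a search engine index:

```html
<!-- Hidden link inside an invisible layer -->
<div style="display:none"><a href="https://example.com/page">keyword phrase</a></div>

<!-- Inconspicuous 1x1-pixel image link -->
<a href="https://example.com/page"><img src="dot.gif" width="1" height="1" alt=""></a>

<!-- Link placed on punctuation only -->
Read our guide<a href="https://example.com/page">.</a>
```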

“Machine generated” pages – While many content management systems do create webpages, this entry refers to deceptive content pages generated to alter search engine results. These pages are typically created through software that takes keywords and uses them to assemble a high-ranking page. Pages like this are unconditionally spam because of their low quality and relevancy to any search.

Cloaking – When cloaking is used to deceive search engine user-agents for the purposes of ranking, it violates SEO standards and is considered spam. The only exception is when there is no impact (deletion, formatting, or insertion) on content delivered to the visitor versus the search engine spider. Where the stated objective of the tool [filtering by IP number or user agent] is to facilitate the delivery of differing content based upon visitor/search engine identification processes, the implementation of cloaking technology is considered BAD.

Search engines may remove cloaked sites from their index where deception is involved.

Content Farms/Article Directories – Before Google’s “Panda” update in 2011, article directories were a common way to increase PageRank. An article directory is a site that collects content about a specific subject. While collecting articles on a particular subject is not bad, many sites were “content farms.” These content farms churned out low-quality content on a particular topic to trick search engines into increasing their PageRank. The Panda update demoted sites that engaged in this practice. Today, search engines continue to filter out these low-quality pages as part of their core algorithms.

Spam – Spam runs from white-on-white to overloading the web with free webpages/sites developed to boost popularity through links. This category needs a clear definition, but it is the most easily defined in “black and white” rules.

External factors such as sites with numerous, unnecessary host names may also be caught. Some other common spam techniques include excessive cross-linking sites to inflate perceived popularity and the inclusion of obligated links as part of an affiliate program.

What the Engines Think Is Spam

Google

Google frequently updates its definitions of low-quality pages and webspam in its Search Quality Evaluator Guidelines. The latest edition also discusses low-quality pages that are not spam and simply miss the mark. For the purposes of this article, we will continue to focus on the definition as it relates to spam and not poorly made, well-intentioned pages.

Google has directed quality raters to rate a page as “low quality” if it contains low-quality MC (main content) or if the “title of the MC is exaggerated or shocking.” This specifically cracks down on clickbait types of headlines, which are less likely to gain a spot on the Google front page. Google provides additional context on this, noting that:

Exaggerated or shocking titles can entice users to click on pages in search results. If pages do not live up to the exaggerated or shocking title or images, the experience leaves users feeling surprised and confused.

Further explanations of low-quality content focus on whether there is “an unsatisfying amount of website information or information about the creator of the main content.” So contact information for the site owner should be easy to locate on a site, such as the business name, address, and phone number. And if an author is named for an article, enough background information should be shown to establish the person’s credibility and support the quality of the content.

More resources are available here:

Bing

Bing defines spam as pages that “have characteristics that artificially manipulate the way search and advertising systems work in order to distort their relevance relative to pages that offer more relevant information.” Bing has stated that if pages are found to contain spam, Bing may remove them at their discretion. They may also adjust the Bing algorithms to focus on more useful information and improve the user experience. Yahoo gets its search results from Bing, so any spam standards that apply to Bing will also apply to Yahoo.

Bing information and reporting tool: https://www.microsoft.com/en-us/concern/bing

Summary

Sites that are not in compliance with SEO standards of quality are in danger of being removed from the search engine indexes. Should all search engines enforce the same standards, many websites would be scrambling for honest SEO firms to optimize their sites. This creates an opportunity for SEO practitioners to set a standard for the future.

We encourage all who read this to be vocal with their staff, clients, and SEO providers about working toward compliance.

FAQ: How can I ensure my SEO strategies align with the best practices of SEO standards analysis?

Let’s explore some key insights to help you align your SEO strategies with industry-leading practices.

Understanding SEO Best Practices

To begin, it’s crucial to grasp the essence of SEO best practices. These practices are guidelines established by search engines and digital marketing experts that ensure your website is easily discoverable and provides valuable content to users. They cover aspects like keyword research, quality backlink building, mobile responsiveness, and user experience.

Navigating the SEO Standards Landscape

SEO standards are intricate, often guided by updates from search engines like Google. Keeping up with these changes is imperative. Regularly monitor industry news, follow reputable SEO blogs, and attend webinars to stay informed. Embrace a proactive approach, adjusting your strategies in response to algorithm shifts.

Data-Driven Decision Making

Incorporate data analysis into your SEO strategy. Leverage tools like Google Analytics and Search Console to gain insights into user behavior, keyword performance, and traffic sources. This data empowers you to refine your strategies based on real-time information, improving your site’s ranking and relevance.

Content Quality and Relevance

Content remains the heart of SEO success. Crafting high-quality, relevant content that addresses user intent enhances user experience and signals to search engines that your website is authoritative. Regularly update and optimize existing content to align with current search trends.

Technical SEO Excellence

Technical SEO is the backbone that supports your content’s visibility. Ensure your website is well-structured, with clean URLs, optimized images, and fast loading times. Implement schema markup to help search engines understand your content better. A technically sound website enhances user experience and contributes to better search rankings.
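As an example of schema markup, structured data is commonly embedded as a JSON-LD block in the page’s HTML. The vocabulary comes from schema.org; the values below are hypothetical:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Search Engine Optimization Best Practices",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2021-10-05"
}
</script>
```

Google’s Rich Results Test can confirm that markup like this parses correctly.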

By seamlessly integrating these insights, your SEO strategy can go beyond conventional practices, yielding sustainable results. Remember, SEO is an ongoing journey, not a one-time task. Stay adaptable, continuously refine your approach, and you’ll confidently navigate the dynamic world of SEO.

Step-by-Step Procedure: Optimizing SEO Strategies with Best Practices of SEO Standards Analysis

  1. Familiarize yourself with fundamental SEO concepts and practices.
  2. Stay updated on the latest industry trends through reliable SEO news sources.
  3. Research and understand the search engine algorithms, especially those of major players like Google.
  4. Attend webinars, conferences, and workshops related to SEO standards analysis.
  5. Regularly monitor your website’s performance using tools like Google Analytics and Search Console.
  6. Analyze user behavior data to identify areas for improvement in your SEO strategy.
  7. Conduct comprehensive keyword research to identify relevant search terms for your target audience.
  8. Develop a content strategy that aligns with user intent and incorporates high-value keywords.
  9. Ensure your website’s technical aspects, like page speed and mobile-friendliness, are optimized.
  10. Implement schema markup to provide search engines with additional context about your content.
  11. Focus on building high-quality backlinks from authoritative websites in your industry.
  12. Create and publish fresh, informative, and engaging content on a consistent basis.
  13. Regularly update and optimize existing content to reflect current search trends and user interests.
  14. Collaborate with web developers to address any technical issues that might impact SEO.
  15. Monitor your website’s ranking and traffic patterns to gauge the effectiveness of your strategies.
  16. Stay adaptable and adjust your strategies in response to changes in search engine algorithms.
  17. Leverage social media and other platforms to amplify the reach of your content and engage with your audience.
  18. Consider investing in paid search advertising to complement your organic SEO efforts.
  19. Engage with your audience through comments, shares, and social interactions to build a strong online community.
  20. Continuously educate yourself about evolving SEO standards and adapt your strategies accordingly.

Paid Guest Posting: More Proof That It’s Bad for Business https://www.bruceclay.com/blog/paid-guest-posts-more-proof-bad-for-business/ Tue, 30 Jun 2020 18:26:14 +0000 https://www.bruceclay.com/?p=81799

The post Paid Guest Posting: More Proof That It’s Bad for Business appeared first on Bruce Clay, Inc.


Google has been repeating itself for years: paid links are spam. Yet an entire economy has sprung up around paid guest posting — essentially just another form of paid links.

What is the difference between selling or buying a link on a webpage and paying someone to write a webpage with the link in it? There is no difference to Google.

Yet paid article writers have been selling linked articles to the naive marketer for years. However, Google has drawn a hard line: paid guest posts are spam. And it doesn’t matter who you paid … if any fee is involved, then you are in the danger zone.

The latest developments send a clear message that paid guest posting as a way to build links will not be tolerated. Here are the lessons we’ve learned.

Lesson No. 1: Guest Posting Services Selling Links Are Spam

On June 3, 2020, SEMrush (a popular SEO tools SaaS company) received a tweet from Google’s John Mueller. The message? Your guest posting services are spam.

You can read more about how this came about here at Search Engine Roundtable for context. SEMrush was caught up in the turmoil because it had a service clearly called out by Google. They promptly responded to the community saying that the links through their service were not paid placements. Were the posts and included links free? Of course not — someone somehow made money.

Exactly who did not receive payment? No one can argue that some form of payment wasn’t involved.

Looking at Google’s advice on steering clear of link schemes, we can see some examples here of what to avoid. In Mueller’s tweet to SEMrush, he pointed to this article on links in large-scale article campaigns.

Some relevant excerpts from that article include:

Google does not discourage these types of articles in the cases when they inform users, educate another site’s audience or bring awareness to your cause or company. However, what does violate Google’s guidelines on link schemes is when the main intent is to build links in a large-scale way back to the author’s site. …

For websites creating articles made for links, Google takes action on this behavior because it’s bad for the Web as a whole. When link building comes first, the quality of the articles can suffer and create a bad experience for users.

It’s pretty clear that Google does not want people or businesses to manipulate their rankings using links on paid guest posts. That’s due to the long history of guest posting as a way to extend and disguise link spam.

If Google does not take action to curb link spam, then pretty soon the first page of results will go to those that have the most money to buy guest posts instead of those worthy of ranking. Sounds like a breach of trust to me.

Ultimately, SEMrush decided to rethink its guest posting service and sent out a message to its community about it.

At the end of the day, this is not about pointing the finger at SEMrush, but about the lessons we can continue to learn about guest posting and link spam. SEMrush, like many others, simply got caught up in the moment and in the fact that Google had been quiet about the practice for so long. Now the signal is clear.

So this first lesson is understanding what link spam is.

I talked about this in a recent article on guest posting and manual penalties. It’s not just the people who are placing paid guest posts that should be aware of link spam. Websites that accept guest posts stand to suffer the most from Google penalties on their websites.

Back in 2017 (link earlier), Google said:

Sites accepting and publishing such articles should carefully vet them, asking questions like: Do I know this person? Does this person’s message fit with my site’s audience? Does the article contain useful content? If there are links of questionable intent in the article, has the author used rel=”nofollow” on them?

To that list, add:

  • Am I getting a fee? If yes, then it’s spam.
  • Am I paying for placement? If yes, then it’s spam.
  • Is this a link I would normally use and support? If no, then spam.

That brings us to our next lesson about paid guest posting and link spam: How to actually handle the links.

Lesson No. 2: Know When to Tell Google About the Links in Guest Posts

On June 3 and then again on June 11, 2020, John Mueller described how people should be handling links in guest posts: with “nofollow” or the newer attribute, “sponsored.”

Mueller followed up in the same thread with a bit more clarification.

In typical Google fashion, the message is not 100% clear. One would assume that he meant that we were to disclose to Google the links pointing to the guest poster’s website.

But his message seems to say all links, even if they are “natural” (which could mean links to supporting research, too).

I assume he is doing this to make it easy for guest posters not to intentionally or unintentionally have spam links. If you “nofollow” all links, you have less of a chance of harming the site.

There are plenty of people who disagree with Google’s latest suggestion. Is it that big of a deal to go ahead and comply by using those attributes? Not really. Because even with those attributes, Google is likely to figure out more about the links on its own.

In March 2020, Google began treating these attributes (“nofollow” and “sponsored”) as merely “hints” when considering the links they are applied to. But a hint of what? A hint that you are probably selling links?

From its announcement of this change back in September 2019:

When nofollow was introduced, Google would not count any link marked this way as a signal to use within our search algorithms. This has now changed. All the link attributes — sponsored, UGC and nofollow — are treated as hints about which links to consider or exclude within Search. We’ll use these hints — along with other signals — as a way to better understand how to appropriately analyze and use links within our systems.

Why not completely ignore such links, as had been the case with nofollow? Links contain valuable information that can help us improve search, such as how the words within links describe content they point at. Looking at all the links we encounter can also help us better understand unnatural linking patterns. By shifting to a hint model, we no longer lose this important information, while still allowing site owners to indicate that some links shouldn’t be given the weight of a first-party endorsement.

In other words, as Danny Sullivan at Google mentioned in a tweet, using these attributes is now a way to send more granular signals to Google (which, of course, is good for Google and, some say, not so good for others).

And then Google can determine whether the link is good and will count, or if it’s a paid link and spam that will hurt the site.

And one issue remains: according to the FTC, this may be an advertorial, and it needs to be clearly identified as a paid page. I really think this deserves an entirely new blog post, so I only mention it here.

Takeaways for Guest Posting

So what’s the takeaway for a business that wants to guest post? Do it for reasons other than link building. Do it for traffic and users.

If you want to contribute content as a way to add value to a community, that is fine. But don’t expect it to build quality links or boost your website’s authority.

A better strategy is building links to your site by creating great content published on your own site. As I wrote in that post:

You want to build authority and you need links. But there are good links, bad links and downright ugly links. The good links you earn naturally by creating great content on your site that people want to link to. The bad or ugly links are usually those that come out of a “link building” program.

If you’re a website publisher who is accepting guest posts, create guest post guidelines that meet Google Webmaster Guidelines. If a link in the article is off-topic, the quality of the linking site (i.e., yours) will likely suffer.

Only accept posts that don’t diminish your website’s expertise, authority and trust. And make sure you and an SEO pro review all posts before publishing.


Only accept guest posts that you wish you had written. Otherwise, consider them poison.
When in doubt, it’s a good practice to use a “nofollow” or “sponsored” attribute on relevant links in guest posts.
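In markup, qualifying a guest-post link is just a rel attribute on the anchor tag (the URLs here are hypothetical):

```html
<!-- A paid or sponsored placement -->
<a href="https://example.com/product" rel="sponsored">product name</a>

<!-- A link you do not vouch for -->
<a href="https://example.com/resource" rel="nofollow">related resource</a>

<!-- The values can be combined -->
<a href="https://example.com/page" rel="sponsored nofollow">anchor text</a>
```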

But keep in mind that Google may catch spam links anyway.

This is something I talked about in my article on guest posting and manual penalties:

In most cases, “nofollow” is a hint to Google. But on Your Money or Your Life (YMYL) pages like news, finance, health and so on (see the list in Google’s guidelines), Google may ignore “nofollow” entirely. In other words, if you have a spam link on a YMYL website, consider yourself open to more scrutiny by Google and potential penalties.

Also, have you ever gotten an email telling you about all the wonderful sites you can get a paid guest post on? Think about it: Who owns Gmail? Do you really think Google does not know who is manipulating the link landscape and the sites to ignore?

I think that selling posts on high domain authority (DA/DR) sites for the value of links that Google probably ignores is unethical.

So does that mean every single link in a guest post needs to have “nofollow” or “sponsored”? If you want to be safe and are unsure, yes. The reader cannot tell if the link is followed or not, so making them all nofollow is a viable choice.

But for those who are savvier when it comes to the nature of link spam, you can make the call on which links should have which attributes.

As a reminder, here is Google’s help file on qualifying your outbound links.

Google table of rel attributes for link tags.
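For those making the call link by link, a quick audit can surface which outbound links currently pass value. Here is a minimal Python sketch (not an official Google tool) using the standard library’s `html.parser` to flag external links that lack a qualifying `rel` value of “nofollow,” “sponsored,” or “ugc”; the URLs are hypothetical:

```python
from html.parser import HTMLParser

class LinkAuditor(HTMLParser):
    """Collect external links that have no nofollow/sponsored/ugc rel value."""
    def __init__(self):
        super().__init__()
        self.followed_links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href") or ""
        rel = set((attrs.get("rel") or "").split())
        # External links with none of the qualifying rel values still pass value
        if href.startswith("http") and not rel & {"nofollow", "sponsored", "ugc"}:
            self.followed_links.append(href)

guest_post_html = """
<p>Thanks to <a href="https://example.com/" rel="sponsored">our sponsor</a>
and <a href="https://example.org/partner">this partner</a>.</p>
"""
auditor = LinkAuditor()
auditor.feed(guest_post_html)
print(auditor.followed_links)  # the un-attributed partner link is worth reviewing
```

Links the auditor flags aren’t necessarily spam; they are simply the ones where you need to decide whether an attribute belongs.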

Note: If you receive guest blog post spam requests, you can report them to Google as “paid links” webspam.

Closing Thought

Remember Penguin? Now think about what happens when (not if) Google cracks down on paid guest blogging.

I am imagining a world where, like the Penguin penalty, Google marks paid guest posts as spam. Not simply a zero value, but rather a hard loss of rankings that takes websites years to recover from.

Google does not want link manipulation by any means. What would you do if you were Google trying to protect your product?

If you need help deciding on the most effective ways to increase your website traffic and revenue, contact us today for a free quote and consultation.

The post Paid Guest Posting: More Proof That It’s Bad for Business appeared first on Bruce Clay, Inc..

]]>
https://www.bruceclay.com/blog/paid-guest-posts-more-proof-bad-for-business/feed/ 23
Avoiding Google Penalties https://www.bruceclay.com/seo/avoiding-google-penalties/ Wed, 13 Mar 2019 03:51:47 +0000 https://www.bruceclay.com/?page_id=62876 SEO Guide Step 14 What is a Google penalty   •  Top 3 guidelines for avoiding Google penalties   •  ​Google penalties by name   •  Manual actions FAQ: How can I prevent Google penalties to maintain my website’s SEO visibility? A penalty is when you get caught breaking the search engine’s rules, and they punish you. As the saying […]

The post Avoiding Google Penalties appeared first on Bruce Clay, Inc..

]]>
SEO Guide Step 14

Referee with penalty flag

A penalty is when you get caught breaking the search engine’s rules, and they punish you. As the saying goes, if you can’t do the time, then don’t do the crime. And losing all rankings for years makes for a tough lesson.

How can you avoid Google penalties and protect your website’s SEO visibility? This topic is vital because a search engine penalty can reduce or even wipe out your search traffic — and it can be costly and difficult to recover.

With every new algorithm update, Google seems to tighten its standards a little more. Websites operating just outside search engine guidelines can get caught. If your site is penalized, you can expect your rankings to slip and revenues to fall.

What Is a Google Penalty

From an SEO perspective, the term “penalty” means any negative impact on a website’s organic search rankings caused by an algorithm update or a manual action.

Anything that directly violates Google’s Webmaster Guidelines can result in a penalty against your website. The two main types are algorithmic penalties, which are applied automatically, and manual actions, which a human reviewer applies for “black hat” tactics.

Though Google doesn’t call algorithmic hits “penalties” per se, the result feels the same to a website owner. It’s as if somebody threw down a penalty flag against your site.

Top 3 Guidelines for Avoiding Google Penalties

You can take steps to minimize the factors that contribute to SEO penalties. Here are the top 3 guidelines to avoid a penalty:

  1. Don’t buy links. Links passing SEO value to your site should look natural. Make sure a “nofollow” attribute is added to any paid links (such as ads), and stay away from link schemes.
  2. Don’t overuse keywords. Stuffing your content with exact repetitions of a key phrase can hurt your search engine rankings. Keywords are important, but use them sparingly and in a natural-sounding way.
  3. Make original, quality content. You must have unique, high-quality web pages to rank.

Google Penalties by Name

Penalties range from a slight, temporary ranking hit (a slap on the wrist) all the way to expulsion from the search engine’s index.

Don’t let the cute black-and-white animals fool you. Search engine algorithms have teeth in them and can bite sites that seem to be breaking the rules.

Note: We go into detail on Google algorithm updates and how to prepare for them on our blog.

Panda penalty

Google Panda Penalty

Google first rolled out the Panda update in February 2011, with many refreshes to follow. By early 2016, Panda had grown up and become part of Google’s core algorithm.

The Google Panda algorithm aims to prevent poor-quality content from reaching the top of search results.

If Panda thinks your website provides low-quality content, it will be hard for your web pages to rank. Low-quality web content includes: “thin” pages with little or no added value, product pages with manufacturer-provided descriptions and no original text, and widespread duplicate content.

(The SEM Post’s guide to Google Panda is a good resource for more details.)

How to Avoid a Panda Penalty: Keep Panda fed and happy by providing original content that will satisfy searchers.

With a Panda issue, you can earn your way back up the search results by fattening up (rewriting/improving) your thin content, eliminating duplicate pages (or blocking them from search engines), and generally providing high-quality content for your site visitors.
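For the duplicate-page cleanup mentioned above, two standard mechanisms are a canonical link (pointing search engines at the preferred version) and a robots “noindex” meta tag (keeping the duplicate out of the index). A minimal sketch, with placeholder example.com URLs:

```html
<!-- On a duplicate page, declare the preferred version: -->
<link rel="canonical" href="https://www.example.com/original-page/">

<!-- Or keep the duplicate page out of the index entirely: -->
<meta name="robots" content="noindex">
```

Use one approach or the other per page; a canonical consolidates signals to the preferred URL, while noindex removes the page from search results altogether.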


Penguin penalty

Google Penguin Penalty

The Google Penguin algorithm combats webspam by detecting link spam.

When a site’s backlink profile (i.e., the full list of links coming from external sites) includes too many unnatural-looking links, Google suspects that site of trying to manipulate search rankings — and Penguin’s feathers get understandably ruffled.

Google launched the first Penguin update in April 2012. Several agonizingly slow rollouts later, Google announced a final update in September 2016. Penguin now operates in real-time as part of Google’s core ranking algorithm.

Google says that the new Penguin algorithm no longer gives penalties. Rather than demoting a site with low-quality backlinks, Penguin now just devalues bad incoming links so they don’t affect the site’s rankings.

However, your link profile is still your responsibility. Having a large percentage of poor backlinks is a low-trust signal. Many sites that were hit with a Penguin penalty still need to take steps to recover.

How to Avoid a Penguin or Link-Related Penalty: Don’t buy or sell links, negotiate link exchanges, participate in link farms, or engage in any other kind of unnatural linking. A site that does any of these should expect to get penalized if Google detects it.

Also, if your organic search traffic drops suddenly, it could be due to a link-related penalty. The traffic chart below shows a tragic but typical example:

Search traffic dropoff after penalty

To avoid Penguin issues, regularly monitor and clean up your backlinks. (Stay tuned — you’ll learn how to do this in the next lesson!)

Intrusive Interstitial Penalty

Launched in January 2017, the Intrusive Interstitial Penalty affects mobile search results only.

Google penalizes sites that show an intrusive ad, popup, or standalone interstitial immediately after a mobile user clicks through from a search result.

In general, Google demotes web pages that block searchers from easily seeing the content. Certain types of interstitials aren’t penalized, such as login forms and legally necessary gates (for age verification, for example).

This image shows three examples (provided by Google) of intrusive interstitials that would cause a penalty:

Intrusive interstitials examples per Google

How to Avoid an Interstitial Penalty: Give mobile searchers a good user experience. Avoid ads and popups that block too much of the screen right after a searcher arrives.

Payday Loan sign

Photo by Jason Comely (CC BY 2.0), modified

Payday Loan Penalty

Google updated its algorithm in June 2013 specifically to address the quality of results for heavily spammed queries such as “payday loans,” “viagra” and pornography-related keywords.

How to Avoid a Payday Loan Penalty: Sites penalized by the Payday Loan update tend to be heavily involved in link schemes, web spam, and often illegal activities. Steer clear of these, or you risk losing your organic search engine rankings.

This list covers Google’s major algorithmic penalties to this point. For a complete list of all the known updates, see Moz’s Google Algo Change History.

Manual Actions

Besides all of the algorithmic penalties, it’s also possible for a search engine employee to manually cause your website to go either up or down in the rankings. Google’s webspam team members can manually review websites and levy penalties or even kick a site out of the index.

What triggers a manual review? The search engine may be taking a closer look because of suspected foul play. (Did you know a majority of spam is reported by competitors?) Or, Google’s team may be re-evaluating a website that has requested “reconsideration” after cleaning up its penalty issues.

How to Check for Any Manual Actions: If your site receives a manual action notice, Google courteously tells you so. In fact, Google sent over 9 million webspam notifications to webmasters in one year alone.

Here’s how to see if any messages (good or bad) exist for your site:

  1. Log in to your account in Google Search Console.
  2. Open the Manual Actions report.
  3. Read any messages to find out the specific reasons for the manual action and what parts of your site are affected by the penalty.

How to Avoid a Manual Action Penalty: Manual actions can result from anything that directly violates Google’s Webmaster Guidelines. They often relate to thin content (see Panda Penalty section), unnatural links (see Penguin Penalty), pure web spam (see Payday Loan Penalty), or other noncompliance issues.

If your site gets a manual penalty, resolve the issue(s) causing the manual action sooner rather than later.

Once you clean up your site, submit a reconsideration request. Google will examine your site again and, if it looks good, lift the penalty.

SEO GUIDE BONUS VIDEO

Google’s Gary Illyes explains how the Penguin algorithm works now that it’s a real-time component of the core ranking algorithm.

SEO GUIDE BONUS VIDEO #2

“How can you prepare for the future of SEO?” Bruce Clay addressed this question at a search conference, and his response was fortunately captured on video.

His advice — In the short term, monitor and get rid of “junk links.” Long term, reinforce your subject relevance through proper site structure. These strategies help you avoid Google Penguin penalties and other “disasters.”

Watch this brief interview to hear Bruce’s full answer.

Building on the important topic of avoiding Google penalties, the next step in the SEO guide shows you how to monitor your backlinks and remove unwanted links that could sink your SEO ship if a ​link penalty occurs.


Related blog posts and articles:

Do you think you’ve been hit with Google penalties? We can help. Read about our SEO Penalty Audit and optional link pruning services that can help businesses identify and recover from Google penalties.

FAQ: How can I prevent Google penalties to maintain my website’s SEO visibility?

Modern businesses must maintain an effective online presence through search engine optimization (SEO). Unfortunately, Google’s constantly shifting algorithms present businesses with numerous challenges when optimizing for search. Here are key strategies to safeguard your website from Google penalties while preserving your SEO visibility.

Understanding Google Penalties

Google penalties can cripple your website’s ranking and visibility, reducing organic traffic. These penalties are often triggered by violations of Google’s guidelines, such as keyword stuffing, cloaking, or spammy backlink practices. To safeguard your site, adopt a holistic approach to SEO that prioritizes user experience, valuable content, and ethical link-building.

Quality Content is Paramount

Google recognizes and rewards relevant, high-quality content that meets user needs. Create engaging and thorough pieces that resonate well with your audience. Incorporate well-researched keywords naturally, avoiding over-optimization. Regularly update your content to reflect industry trends, ensuring your site remains a valuable resource.

Ethical Link-building Strategies

Backlinks play a pivotal role in SEO, but the quality of these links matters more than quantity. Engage in ethical link-building by collaborating with reputable websites in your niche. Avoid buying links or participating in link schemes, as these can trigger penalties. Focus on building relationships with influencers and thought leaders for genuine link opportunities.

Technical SEO and Website Performance

A well-optimized website structure and swift loading speeds are crucial for user satisfaction and SEO success. Regularly audit your website for broken links, crawl errors, and mobile responsiveness. A technically sound website enhances user experience and reduces the risk of penalties.

Monitoring and Adaptation

Staying informed about Google’s algorithm updates is essential. Continuously monitor your website’s performance using tools like Google Search Console. Regularly assess your SEO strategies and adapt them to align with the latest guidelines and best practices. Flexibility and a willingness to evolve are key to preventing penalties and maintaining visibility.

Step-by-Step Procedure: Preventing Google Penalties and Ensuring SEO Visibility

  1. Familiarize yourself with Google’s Webmaster Guidelines.
  2. Conduct an audit of your website to identify existing SEO issues.
  3. Optimize on-page elements, including title tags, meta descriptions, and headers.
  4. Create a content strategy that emphasizes high-quality, relevant content creation.
  5. Use keyword research tools to identify valuable and appropriate keywords for your niche.
  6. Integrate keywords naturally into your content, avoiding keyword stuffing.
  7. Prioritize user experience by ensuring easy navigation and mobile responsiveness.
  8. Regularly monitor your website’s backlink profile and disavow spammy or irrelevant links.
  9. Build relationships with authoritative websites for potential backlink opportunities.
  10. Stay updated on Google algorithm changes and SEO trends.
  11. Use tools like Google Search Console to identify and address technical issues.
  12. Implement proper redirects for any removed or changed pages.
  13. Monitor your website’s loading speed and optimize images and scripts as needed.
  14. Ensure your website’s original content is not duplicated across multiple pages.
  15. Avoid engaging in link schemes or purchasing backlinks.
  16. Regularly update and refresh your content to reflect industry changes.
  17. Employ social media to engage your target audience and promote your content.
  18. Analyze metrics like bounce rates, organic traffic volume, and conversion rate to make decisions.
  19. Adapt your SEO strategies based on performance data and algorithm updates.
  20. Consider seeking guidance from reputable SEO experts or agencies for specialized advice.
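The disavow step in the list above (step 8) uses a plain-text file uploaded through Google’s Disavow Links tool. Here is a minimal sketch of the documented format; the domains and URLs are hypothetical:

```text
# Comment lines start with "#".
# Disavow every link from an entire domain:
domain:spammy-link-network.example
# Disavow links from a single page:
https://blog.example/bad-guest-post-page/
```

Treat disavowal as a last resort for links you can’t get removed at the source; disavowing healthy links can hurt rather than help.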

By following these comprehensive steps, you’ll be equipped to navigate the dynamic realm of SEO, effectively preventing Google penalties and maintaining strong online visibility for your website.

The post Avoiding Google Penalties appeared first on Bruce Clay, Inc..

]]>