Robert Ramirez, Author at Bruce Clay, Inc. (SEO and Internet Marketing)
https://www.bruceclay.com/blog/author/rramirez/
Last updated: Sat, 16 Dec 2023

Mobile Friendly SEO Ranking Boost Gets Boosted in May
https://www.bruceclay.com/blog/mobile-friendly-seo-ranking/
Published: Thu, 17 Mar 2016

Generally speaking, Google’s April 2015 mobile-friendly algorithm update (dubbed “Mobilegeddon” by the SEO industry) was sort of a bust. Months of talk about an organic ranking boost to mobile-friendly, aka mobile SEO compliant, websites turned out to be mostly hype.


While the April 2015 Mobile-Friendly Update did spur many sites to make their sites better for smartphone users, there was not a lot of movement across mobile search engine results pages, especially at the top of SERPs. The mobile-friendly “boost” was seemingly implemented as a tiebreaker among sites that were deemed to have equal ranking strength — a condition that rarely occurs.
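The tiebreaker idea can be made concrete with a toy sketch (purely illustrative; the scores, field names, and sorting scheme are my own assumptions, not Google's actual algorithm): pages are ordered by base ranking strength, and mobile-friendliness only decides the order between pages whose strength is equal.

```python
# Toy model of a "tiebreaker" ranking signal (illustrative only -- not
# Google's actual algorithm). Results are ordered by base ranking score;
# mobile-friendliness only breaks ties between otherwise equal pages.

def rank(pages):
    """Sort by descending score; mobile-friendly wins ties."""
    return sorted(pages, key=lambda p: (-p["score"], not p["mobile_friendly"]))

serp = [
    {"url": "a.example", "score": 90, "mobile_friendly": False},
    {"url": "b.example", "score": 80, "mobile_friendly": True},   # tied with c
    {"url": "c.example", "score": 80, "mobile_friendly": False},  # tied with b
]

ordered = [p["url"] for p in rank(serp)]
print(ordered)  # a.example keeps #1; b.example wins only the 80-point tie
```

In this model a mobile-friendly page can never leapfrog a stronger non-mobile-friendly page, which is consistent with the "rarely occurs" observation: the signal only matters when base strength is exactly equal.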

Google Turns Up the Volume on the Mobile-Friendly Boost

Another mobile ranking boost announcement has just come out of Google. It says that Google will be increasing the effect of the mobile-friendly ranking signal in May of this year:

“Today we’re announcing that beginning in May, we’ll start rolling out an update to mobile search results that increases the effect of the ranking signal to help our users find even more pages that are relevant and mobile-friendly.”

And a later clarification: “If you’ve already made your site mobile-friendly, you will not be impacted by this update.”

So, if you’re mobile-friendly, you’re safe. And if we take Google at their word, we can assume that there are no ranking boosts for different “degrees” of mobile-friendliness.

An example of degrees of mobile friendliness might involve Google favoring sites that have implemented AMP or responsive design — two mobile-friendly implementations that Google has gone out of their way to endorse — with more pronounced ranking boosts than other “mobile-friendly” sites.

For now, it appears that the mobile-friendly status (and resulting ranking increase) is still a binary consideration; you either have it and enjoy the benefits or you don’t.

Come May, we’ll see if this latest change actually starts to shuffle the rankings more dramatically. We don’t suspect that Google wants to have two sets of ranking results — one for desktop and another for mobile — but that could be on the horizon. Certainly, this news is cause to motivate any businesses still not mobile-friendly to move ahead toward that goal.

Your Next Steps

How Google Removing Right-Side Ads from SERPs Affects Organic SEO
https://www.bruceclay.com/blog/how-google-removing-right-side-ads-effects-seo/
Published: Thu, 25 Feb 2016

Late last week, Google announced that they were dramatically changing the layout of their SERPs by removing right-side ads from the page. At the same time, they added an extra ad to the top of the fold for “highly commercial queries” and three text ads to the bottom of the SERPs. This reduces the number of paid ads for certain types of SERPs from as many as 11 to a maximum of seven.

There has been much speculation on what impact this will have in the PPC world, but what do these changes mean for SEOs who are more concerned with organic rankings?

Should SEOs be changing their tactics in light of the SERP shakeup? Or is it business as usual? There certainly seems to be a general consensus that organic SEO is the loser with all of these changes, but I’m not so sure that that is the case. Let me explain why.


The SERP Is Still Made Up of 10 Blue Links

You might assume that, since Google is adding as many as four paid ads to the body of organic SERPs, some organic positions are getting squeezed off the page. This isn’t the case. While organic results are getting pushed down the page slightly (more on this in a bit), the total number of organic results on all SERPs is remaining consistent.

SERP elements like “In The News” boxes, related questions, answer boxes, and blended image search results can cause SERPs to include fewer than the standard 10 results (usually down to nine), but the new ads above and below the SERPs have not lowered the organic SERP result count any further. The same goes for navigational branded searches, which usually only have six or seven organic results. These numbers have remained consistent despite the removal of the right-side ads and the addition (in some instances) of ads above and below the fold.

So speculation that organic results are being profoundly impacted, and somehow severely decreased, by this change is not accurate. Insofar as the SERP was ever made up of “10 blue links,” it still is.

One Fewer Organic Result Above the Fold

The top of the fold in Google desktop search results has long been dominated by paid ads. The addition of non-paid SERP enhancements like answer boxes, featured snippets and, most recently, Twitter results continues to push the top organic results down the page.

Google’s most recent change stands to impact SERPs that now contain a fourth paid ad above the fold. These queries (what Google calls “highly commercial queries”) now have one organic result displaying above the fold instead of two.

Statistics from W3Schools indicate that the most common screen resolution among Internet users is currently 1366 x 768; 35 percent of Internet users have this specific resolution. Here is what the above the fold section of a SERP looked like with three ads for the search query “all inclusive Hawaii trips” for this specific screen resolution:

[Image: SERP for “all inclusive Hawaii trips” with three ads above the fold]

And the same query, with four ads above the fold:

[Image: the same “all inclusive Hawaii trips” SERP with four ads above the fold]

Essentially, the second organic result is being lost in the second example. The result is partially visible above the fold, but it certainly isn’t legible or as clickable as it was before the change. It should be noted that much of this depends on the number of ad enhancements included in the four paid ads above the fold. Ads with more enhancements take up a bit more space, and in some instances the second organic result is totally lost because of them.

Here’s another example of a SERP that had a blended image search result included as the second result for “above ground pools”:

[Image: SERP for “above ground pools” with a blended image result in the second organic position]

And the same query with four ads above the fold:

[Image: the same “above ground pools” SERP with four ads above the fold]

Here we again see that the second organic result is just about completely lost. There’s no doubt that above-the-fold organic real estate is at a premium following this change. In fact, it has been effectively cut in half with the removal of the second result.
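The fold math behind these screenshots is simple arithmetic. A rough sketch (every pixel height below is an assumption chosen only to illustrate the effect, not a measured value):

```python
# Back-of-the-envelope fold math. All pixel heights are rough assumptions
# for illustration; real SERP element heights vary with ad enhancements.

FOLD_PX = 768     # viewport height on a 1366 x 768 screen
CHROME_PX = 150   # browser UI plus Google's search bar/header (assumed)
AD_PX = 100       # one text ad with minimal enhancements (assumed)
RESULT_PX = 120   # one organic result snippet (assumed)

def organic_results_above_fold(num_ads):
    """How many whole organic snippets fit above the fold under num_ads ads."""
    visible = FOLD_PX - CHROME_PX - num_ads * AD_PX
    return max(0, visible // RESULT_PX)

print(organic_results_above_fold(3))  # 3 ads -> 2 organic results visible
print(organic_results_above_fold(4))  # 4 ads -> 1 organic result visible
```

Under these assumed heights, adding the fourth ad is exactly what pushes the second organic result below the fold, matching what the screenshots show.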

Is SEO Dead (Again)?

Many will go screaming for the exits at this point, proclaiming that SEO is dead. While it is true that the above-the-fold organic real estate has been compromised, there’s an incorrect presumption that the average user doesn’t scroll down the SERP, choosing instead to click on whatever results are displayed when the SERP first loads. In fact, there’s reason to believe that the structure of the new desktop SERP may inspire a modified behavior.

The most glaring change in the new SERP is clearly the loss of the right-side ads. Many have remarked that the SERP results look a bit barren without the sidebar ads included, and I would tend to agree. To a certain degree, ad blindness caused those results to be largely ignored by searchers, which is a big reason why Google pulled the trigger on removing them. But their absence has a very clear effect: bringing focus to the body of the SERP.

This is clearly a very intentional move by Google. Drawing more eyes to the top-of-fold ads should result in them being clicked more.

However, the ancillary effect of this is that the organic space gets additional attention from searchers as well. With no distractions on the right side, the eyes have no choice but to focus on this space. I imagine this will actually result in higher CTR for organic first-page results.

Mobile Is King, Even in the New SERP

One of Google’s goals in changing the SERP and removing the right-side ads is to create a unified search experience for both mobile and desktop searchers.

Indeed, the removal of the right-side ads makes the desktop experience more closely resemble what you get in mobile SERPs. If we assume that desktop searcher behavior will start to mimic mobile searcher behavior, we can anticipate that the below-the-fold results will get more attention from searchers.

Advanced Web Ranking tracks organic click-through rates on SERPs for a large set of queries for both mobile and desktop results:

[Image: Advanced Web Ranking organic CTR by position, desktop vs. mobile]

While first-position results are clicked more often by desktop searchers, mobile searchers click on second to seventh position results more often. This tendency even extends to the second and third pages of results, which mobile users are much more likely to navigate to and tap than desktop searchers:

[Image: Advanced Web Ranking CTR data for second- and third-page results, desktop vs. mobile]
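A rough model shows what a flatter, mobile-style click curve means for positions below #1. The CTR numbers here are invented placeholders standing in for the Advanced Web Ranking data, chosen only to make the shape of the argument concrete:

```python
# Illustrative CTR curves by organic position (made-up numbers, not
# Advanced Web Ranking's actual figures): mobile curves are flatter,
# so a larger share of clicks lands on positions 2-7.

desktop_ctr = [0.31, 0.14, 0.10, 0.07, 0.05, 0.04, 0.03]  # positions 1-7
mobile_ctr  = [0.24, 0.15, 0.11, 0.09, 0.07, 0.06, 0.05]  # flatter curve

def clicks_beyond_first(ctr):
    """Share of modeled clicks that go to positions 2-7."""
    return sum(ctr[1:]) / sum(ctr)

print(round(clicks_beyond_first(desktop_ctr), 2))  # -> 0.58
print(round(clicks_beyond_first(mobile_ctr), 2))   # -> 0.69
```

If desktop behavior drifts toward the mobile curve, the share of clicks going past position one grows, which is the opportunity for results sitting just below the fold.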

So will this behavior extend to desktop searchers now that the SERP more closely resembles its mobile counterpart? Only time will tell, but there is little doubt that Google’s intent with these recent SERP changes is to influence searcher behavior. Google is attempting to condition searchers to engage with paid ads and eliminate the ad blindness that has historically been a knock on the paid section of SERPs.

If there is one takeaway for organic SEOs, it is to clearly understand Google’s intention: if the query is transactional in nature, Google wants to make sure that users are clicking on ads to complete the process and make their purchase. That helps justify PPC ad budgets by increasing ROI, and it protects Google’s corporate objective of increasing revenue from paid search ads.

Ecommerce SEO Just Got a Little Bit Harder

If there is a “loser” in Google’s recent SERP shake-up, it has to be the SEO who works exclusively on ecommerce sites. All of the changes that Google has made affect the queries they spend their time optimizing for. Among those changes:

  1. A fourth ad at the top of the SERP, which allows for
  2. an increase in paid ad enhancements, which help to draw attention to paid ads while simultaneously
  3. eliminating 50 percent of the organic results above the fold and
  4. removing right-side text ads, which helps to feature Product Listing Ads (PLAs).

[Image: full SERP for “running shoes” after the changes]

This is one crowded SERP, and there isn’t a real organic result until well below the fold. This is what the top of fold for this SERP looks like on a 1366 x 768 screen:

[Image: top of the fold for the “running shoes” SERP at 1366 x 768]

So what’s an organic SEO to do? Throw in the towel? Pack up and go home? While winning in the ecommerce space clearly got a little tougher organically, there is still very real opportunity to make a splash.

One could argue that good SEO and top-of-first-page organic placement is going to have even more value in light of these changes. This is especially the case if CPC for paid ads at the top of the page for ultra-competitive head terms skyrockets (as many are speculating it will) — making paid inclusion on the page an option reserved exclusively for businesses with the largest of paid ad budgets.

Organic SEO may be the small-to-medium-sized business’s best option for leveling the playing field and continuing to grab a piece of the pie for the high volume transactional queries that are seeing the most SERP layout flux. Larger brands will be aiming to rank organically to double-down on first page SERP real estate, owning both above-the-fold paid advertisements and top 10 organic rankings — a result that can exponentially increase conversion rates.

And that’s to say nothing of the opportunities to target longer tail queries …

Long-Tail Optimization For The Win!

While highly competitive head terms are becoming more competitive than ever, the longer-tail queries still offer excellent opportunities for optimization. Long-tail queries are the organic SEO’s best friend; they are less competitive than head terms, but offer a much higher conversion rate.
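The head-term vs. long-tail trade-off comes down to simple expected-value arithmetic. A hypothetical comparison (the search volumes, achievable CTRs, and conversion rates below are all assumed figures, not real keyword data):

```python
# Hypothetical head-term vs. long-tail comparison. All figures are
# assumptions chosen to make the trade-off concrete: the head term has
# huge volume but a low achievable CTR (buried rankings) and a weak
# conversion rate; the long-tail term is the opposite.

def expected_conversions(monthly_searches, ctr, conversion_rate):
    return monthly_searches * ctr * conversion_rate

head      = expected_conversions(100_000, 0.01, 0.005)  # e.g. "running shoes"
long_tail = expected_conversions(5_000,   0.12, 0.04)   # e.g. a brand + modifier query

print(round(head, 1), round(long_tail, 1))  # -> 5.0 24.0
```

Even with a twentieth of the volume, the long-tail term wins under these assumptions because both CTR and conversion rate are far higher, which is the whole argument for long-tail optimization.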

A small tweak to our ultra-competitive head term “running shoes” to “discount running shoes” reveals a much different SERP:

[Image: SERP for “discount running shoes” with two extra organic results above the fold]

By tweaking our target keyword, we’ve retrieved two extra organic results above the fold. Another tweak, this time adding a specific brand to the query, results in a SERP with even more organic real estate to take advantage of — “Nike discount running shoes”:

[Image: SERP for “Nike discount running shoes” with even more organic real estate]

Conclusion

While it is too early to gauge the true effect of Google’s recent SERP layout changes, it is important for organic SEOs to remain calm and base their recommendations on the data available to them. As I have illustrated in this blog post, the net effect of the recent changes is negligible, and there’s reason to believe that ultimately, the changes may benefit organic SEO click-through rates.

It is also extremely important that SEOs, especially those in the ecommerce space, continue to identify and optimize for less competitive longer-tail queries and not fall into the trap of chasing head-term rankings that may increase traffic, but have a negligible impact on conversion rates.

Whether it’s the search algorithm or the SERPs themselves, the dynamic nature of our industry is what makes it such a challenge and what makes achieving great results so rewarding. The one absolute in organic SEO is change.

For more on the changes to organic SEO, check out our post What To Do in the Near Future of SEO.


Are you interested in hearing directly from Google reps how the change will affect organic click-through? Us too. It’s a topic sure to be discussed at SMX West, where Googlers will be taking the stage and answering questions posed by Danny Sullivan and SEOs. Subscribe to the blog for email updates of our liveblog coverage of the event taking place all next week.

Is Google about to Unplug Its Penguin?
https://www.bruceclay.com/blog/is-google-about-to-kill-penguin/
Published: Thu, 12 Nov 2015

Editor’s note, Oct. 6, 2016: Google spokesperson Gary Illyes made it official. The prediction posed in this post by Robert Ramirez came to pass with the September 2016 Penguin 4.0 update.

Inorganic links are not a negative ranking signal, but instead are ignored in Google’s ranking calculation. Therefore, this article’s theory currently stands as fact.

View the conversation where Illyes clarifies “demote” vs. “devalue”.

TL;DR – A theory: The next Google Penguin update will kill link spam outright by eliminating the signals associated with inorganic backlinks. Google will selectively pass link equity based on the topical relevance of linked sites, made possible by semantic analysis. Google will reward organic links and perhaps even mentions from authoritative sites in any niche. As a side effect, link-based negative SEO and Penguin “penalization” will be eliminated.

[Image: Is Google about to kill Penguin?]

Is the End of Link Spam Upon Us?

Google’s Gary Illyes has recently gone on record regarding Google’s next Penguin update. What he’s saying has many in the SEO industry taking note:

  1. The Penguin update will launch before the end of 2015. Since it’s been more than a year since the last update, this would be a welcome release. (Editor’s note on Dec. 7, 2015: Since publication, a Google spokesperson said, “With the holidays upon us, it looks like the penguins won’t march until next year.”)
  2. The next Penguin will be a “real-time” version of the algorithm.

Many anticipate that once Penguin is rolled into the standard ranking algorithm, ranking decreases and increases will be doled out in near real-time as Google considers negative and positive backlink signals. Presumably, this would include a more immediate impact from disavow file submissions — a tool that has been the topic of much debate in the SEO industry.

But what if Google’s plan is to actually change the way Penguin works altogether? What if we lived in a world where inorganic backlinks didn’t penalize a site, but were instead simply ignored by Google’s algorithm and offered no value? What if the next iteration of Penguin, the one that is set to run as part of the algorithm, is actually Google’s opportunity to kill the Penguin algorithm altogether and change the way they consider links by leveraging their knowledge of authority and semantic relationships on the web?

We at Bruce Clay, Inc. have arrived at this theory after much discussion, supposition and, like any good SEO company, reverse engineering. Let’s start with the main problems that the Penguin penalty was designed to address, leading to our hypothesis on how a newly designed algorithm would deal with them more effectively.

Working Backwards: The Problems with Penguin

Of all of the algorithmic changes geared at addressing webspam, the Penguin penalty has been the most problematic for webmasters and Google alike.

It’s been problematic for webmasters because of how difficult it is to get out from under. If some webmasters had known just how difficult it would be to recover from Penguin penalties starting in April of 2012, they might have decided to scrap their sites and start from scratch. Unlike manual webspam penalties, where (we’re told) a Google employee reviews link pruning and disavow file work, recovery from algorithmic actions depends on Google refreshing the algorithm. Refreshes have only happened four times since the original Penguin penalty was released, making opportunities for contrition few and far between.

Penguin has been problematic for Google because, at the end of the day, Penguin penalizations and the effects they have on businesses both large and small have been a PR nightmare for the search engine. Many would argue that Google couldn’t care less about negative sentiment among the digital marketing (specifically SEO) community, but the ire toward Google doesn’t stop there; many major mainstream publications like The Wall Street Journal, Forbes and CNBC have featured articles that highlight Penguin penalization and its negative effect on small businesses.

Dealing with Link Spam & Negative SEO Problems

Because of the effectiveness that link building had before 2012 (and, to a degree, since), Google has been dealing with a huge link spam problem. Let’s be clear about this: Google created this monster when it rewarded inorganic links in the first place. For quite some time, link building worked like a charm. If I can borrow a quote from my boss, Bruce Clay: “The old way of thinking was he who dies with the most links wins.”

This tactic was so effective that it literally changed the face of the Internet. Blog spam, comment spam, scraper sites – none of them would exist if Google’s algorithm didn’t, for quite some time, reward the acquisition of links (regardless of source) with higher rankings.

[Image: person in a black hood]
Negative SEO: a problem that Google says doesn’t exist, while many documented examples indicate otherwise.

And then there’s negative SEO — the problem that Google has gone on record as saying is not a problem, while there have been many documented examples that indicate otherwise. Google even released the disavow tool, designed in part to address the negative SEO problem they deny exists.

The Penguin algorithm, intended to address Google’s original link spam issues, has fallen well short of solving the problem of link spam; when you add in the PR headache that Penguin has become, you could argue that Penguin has been an abject failure, ultimately causing more problems than it has solved. All things considered, Google is highly motivated to rethink how they handle link signals. Put simply, they need to build a better mousetrap – and the launch of a “new Penguin” is an opportunity to do just that.

A Solution: Penguin Reimagined

Given these problems, what is the collection of PhDs in Mountain View, CA, to do? What if, rather than policing spammers, they could change the rules and disqualify spammers from the game altogether?

By changing their algorithm to no longer penalize nor reward inorganic linking, Google can, in one fell swoop, solve their link problem once and for all. The motivation for spammy link building would be removed because it simply would not work any longer. Negative SEO based on building spammy backlinks to competitors would no longer work if inorganic links cease to pass negative trust signals.

Search Engine Technologies Defined

Knowledge Graph, Hummingbird and RankBrain — Oh My!

What is the Knowledge Graph?
The Knowledge Graph is Google’s database of semantic facts about people, places and things (called entities). Knowledge Graph can also refer to a boxed area on a Google search results page where summary information about an entity is displayed.

What is Google Hummingbird?
Google Hummingbird is the name of the Google search algorithm. It was launched in 2013 as an overhaul of the engine powering search results, allowing Google to understand the meaning behind words and relationships between synonyms (rather than matching results to keywords) and to process conversational (spoken style) queries.

What is RankBrain?
RankBrain is the name of Google’s artificial intelligence technology used to process search results with machine learning capabilities. Machine learning is the process where a computer teaches itself by collecting and interpreting data; in the case of a ranking algorithm, a machine learning algorithm may refine search results based on feedback from user interaction with those results.

What prevents Google from accomplishing this is that it requires the ability to accurately judge which links are relevant for any site or, as the case may be, subject. Developing this ability to judge link relevance is easier said than done, you say – and I agree. But, looking at the most recent changes that Google has made to their algorithm, we see that the groundwork for this type of algorithmic framework may already be in place. In fact, one could infer that Google has been working towards this solution for quite some time now.

The Semantic Web, Hummingbird & Machine Learning

In case you haven’t noticed, Google has made substantial investments to increase their understanding of the semantic relationships between entities on the web.

With the introduction of the Knowledge Graph in May of 2012, the launch of Hummingbird in September of 2013 and the recent confirmation of the RankBrain machine learning algorithm, Google has recently taken quantum leaps forward in their ability to recognize the relationships between objects and their attributes.

Google understands semantic relationships by examining and extracting data from existing web pages and by leveraging insights from the queries that searchers use on their search engine.

Google’s search algorithm has been getting “smarter” for quite some time now, but as far as we know, these advances are not being applied to one of Google’s core ranking signals – external links. We’ve had no reason to suspect that the main tenets of PageRank have changed since they were first introduced by Sergey Brin and Larry Page back in 1998.

Why not now?

What if Google could leverage their semantic understanding of the web to not only identify the relationships between keywords, topics and themes, but also the relationships between the websites that discuss them? Now take things a step further; is it possible that Google could identify whether a link should pass equity (link juice) to its target based on topic relevance and authority?

Bill Slawski, the SEO industry’s foremost Google patent analyzer, has written countless articles about the semantic web, detailing Google’s process for extracting and associating facts and entities from web pages. It is fascinating (and complicated) analysis with major implications for SEO.

For our purposes, we will simplify things a bit. We know that Google has developed a method for understanding entities and the relationship that they have to specific web pages. An entity, in this case, is “a specifically named person, place, or thing (including ideas and objects) that could be connected to other entities based upon relationships between them.” This sounds an awful lot like the type of algorithmic heavy lifting that would need to be done if Google intended to leverage its knowledge of the authoritativeness of websites in analyzing the value of backlinks based on their relevance and authority to a subject.
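One way to picture how link equity might be gated by topical relevance is the sketch below. This is pure speculation rendered in code, not Google's actual algorithm: the sites, topic dimensions, vector weights, and threshold are all invented. Each site is represented as a vector of topic weights, and a link passes equity only when the linking and linked sites are similar enough.

```python
import math

# Toy sketch of topical link relevance (speculative illustration of the
# idea discussed above, not Google's algorithm): each site gets a topic
# vector, and a link only passes equity when the sites are similar enough.

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

# Dimensions: (running, shoes, finance) -- hypothetical topics.
site_topics = {
    "marathon-blog.example": (0.9, 0.7, 0.0),
    "shoe-store.example":    (0.6, 0.9, 0.1),
    "payday-loans.example":  (0.0, 0.1, 1.0),
}

RELEVANCE_THRESHOLD = 0.5  # invented cutoff

def link_passes_equity(source, target):
    return cosine(site_topics[source], site_topics[target]) >= RELEVANCE_THRESHOLD

print(link_passes_equity("marathon-blog.example", "shoe-store.example"))  # True
print(link_passes_equity("payday-loans.example", "shoe-store.example"))   # False
```

In a model like this, a spammy off-topic link is neither rewarded nor penalized; it is simply ignored, which is exactly the "devalue rather than demote" behavior the editor's note says Penguin 4.0 ultimately adopted.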

Moving Beyond Links

SEOs are hyper-focused on backlinks, and with good reason; correlation studies that analyze ranking factors continue to score quality backlinks as one of Google’s major ranking influences. It was this correlation that started the influx of inorganic linking that landed us in our current state of affairs.

But, what if Google could move beyond building or earning links to a model that also rewarded mentions from authoritative sites in any niche? De-emphasizing links while still rewarding references from pertinent sources would expand the signals that Google relied on to gauge relevance and authority and help move them away from their dependence on links as a ranking factor. It would also, presumably, be harder to “game” as true authorities on any subject would be unlikely to reference brands or sites that weren’t worthy of the mention.

This is an important point. In the current environment, websites have very little motivation to link to outside sources. This has been a problem that Google has never been able to solve. Authorities have never been motivated to link out to potential competitors, and the lack of organic links in niches has led to a climate where the buying and selling of links can seem to be the only viable link acquisition option for some websites. Why limit the passage of link equity to a hyperlink? Isn’t a mention from a true authority just as strong a signal?

There is definitely precedent for this concept. “Co-occurrence” and “co-citation” are terms that have been used by SEOs for years now, but Google has never confirmed that they are ranking factors. Recently, however, Google began to list unlinked mentions in the “latest links” report in Search Console. John Mueller indicated in a series of tweets that Google does in fact pick up URL mentions from text, but that those mentions do not pass PageRank.

What’s notable here is not only that Google is monitoring text-only domain mentions, but also that they are associating those mentions with the domain that they reference. If Google can connect the dots in this fashion, can they expand beyond URLs that appear as text on a page to entity references, as well? The same references that trigger Google’s Knowledge Graph, perhaps?

In Summary

We’ve built a case based on much supposition and conjecture, but we certainly hope that this is the direction in which Google is taking their algorithm. Whether Google acknowledges it or not, the link spam problem has not yet been resolved. Penguin penalties are punitive in nature and exceedingly difficult to escape from, and the fact of the matter is that penalizing wrongdoers doesn’t address the problem at its source. The motivation to build inorganic backlinks will exist as long as the tactic is perceived to work. Under the current algorithm, we can expect to continue seeing shady SEOs selling snake oil, and unsuspecting businesses finding themselves penalized.

Google’s best option is to remove the negative signals attached to inorganic links and only reward links that they identify as relevant. By doing so, they immediately eviscerate spam link builders, whose only quick, scalable option for building links is placing them on websites that have little to no real value.

By tweaking their algorithm to only reward links that have expertness, authority and trust in the relevant niche, Google can move closer than ever before to solving their link spam problem.

Editor’s note, Dec. 7, 2015: On Dec. 3, we learned that, despite previous comments by Google suggesting otherwise, the next Penguin update will not happen before the year’s end. We’ve updated this article to reflect this.

The post Is Google about to Unplug Its Penguin? appeared first on Bruce Clay, Inc..

Unprecedented Google Announcement of a Mobile-Friendly Algorithm Change https://www.bruceclay.com/blog/google-mobile-friendly-algorithm-change/ https://www.bruceclay.com/blog/google-mobile-friendly-algorithm-change/#comments Fri, 27 Feb 2015 01:30:09 +0000 http://www.bruceclay.com/blog/?p=35792

This is a huge announcement, guys. Circle your calendars — April 21, 2015.

Google made an announcement today regarding their mobile search algorithm. In an unprecedented move, they have announced the exact date that they intend to change their mobile organic SERP algorithm to more heavily weigh “mobile friendliness” as a ranking signal. That date is April 21.

Here’s an excerpt from the Feb. 26 announcement on the Google Webmaster Central Blog, with emphasis in red (mine):

Google blog about mobile friendly search

Update from Feb. 27: Google Webmaster Trends Analyst John Mueller addressed questions about the mobile friendliness announcement in a live, hour-long “office hours” Hangout On Air this morning. Scroll to the bottom of this post to watch the full video.

In Google’s history, I can NEVER remember them naming a DAY that they WILL be making an algorithm change. Unprecedented. Their language is also telling: “have a significant impact in our search results.”

This is a game-changing announcement. We need to treat it as such.

What Is Changing?

Prior to this, the mobile rankings for a website were usually tied to the ranking strength of the desktop site. If you ranked well on the desktop SERP, you usually ranked well on the mobile SERP as well. Google has always alluded to the fact, however, that the mobile-friendliness of your website could (would?) impact your organic rankings. This is Google definitively following through with that promise.

Starting on April 21, we can assume that mobile-friendly sites will see a dramatic boost in rankings, especially in spaces where their competition has not taken the time to get their “mobile houses” in order and do not enjoy the mobile-friendly distinction. To be clear, this blog article specifically talks about mobile search rankings — NOT desktop rankings.

What We Don’t Know (Yet)

Although the language of this announcement indicates that this is a change to mobile search results, there has been speculation that mobile friendliness will also impact desktop rankings in the future. (Some believe it already does to a small degree.) While this announcement stops short of indicating that this will occur on April 21, if mobile usability doesn’t begin to affect desktop rankings on that date, I expect it will one day soon.

What Google doesn’t indicate in their announcement is if the mobile-friendly ranking shift will apply on a site-wide or page-by-page basis. This distinction is especially important for websites using dynamic serving or separate mobile sites that contain mobile versions of some (but not all) content. We do know that the “mobile-friendly” label in SERPs is awarded to individual pages on a domain. It is not an all-or-nothing annotation. So the question is this: can we assume (always dangerous with Google) that the mobile search algorithm will judge website pages on their individual merits as well? Or, if the percentage of mobile-friendly pages on a domain is too low, will the entire domain see a demotion after April 21?

What This Means for YOU

We’ve expanded on the advantages of responsive design in the past. Responsively designed websites have a one-to-one relationship between desktop and mobile pages because they are one and the same. Since responsive design is Google’s preferred method of serving content to mobile users, we can assume that responsive sites will be favored by Google in search results going forward, and this is the first real step in that process.

If going responsive before April 21 is not an option for you, it is of vital importance that you consider the mobile solution you have in place and address its deficiencies as soon as possible. Google has gone to great lengths to help webmasters identify mobile site pitfalls and issues by adding things like the Mobile Usability Report to Google Webmaster Tools. That report details mobile usability errors that are specific to your domain. Google has also released the Mobile Friendly Testing Tool, which will analyze a URL and report if the specific page has a mobile-friendly design. Use the tools and resources available to earn the mobile-friendly badge across your website.


The post Unprecedented Google Announcement of a Mobile-Friendly Algorithm Change appeared first on Bruce Clay, Inc..

POLL: Does Google Have a Responsibility to Refresh Its Penguin Algorithm? https://www.bruceclay.com/blog/google-penguin-algorithm-update-responsibility/ https://www.bruceclay.com/blog/google-penguin-algorithm-update-responsibility/#comments Tue, 12 Aug 2014 15:45:12 +0000 http://www.bruceclay.com/blog/?p=32670

And so we wait. . .

In the past 2 years we’ve had an increase in clients that come to our firm because they have been affected by an algorithmic or manual penalty. We offer many of these clients what we call Penalty Assessments, which are a series of deep-dive engineering documents that identify the type of penalty that the site is suffering from, offer a road map for recovery from the penalty as well as actionable recommendations for mitigating future risk. We work with penalized sites of all sizes, some attached to large corporations, others belonging to small to mid-sized businesses.

We’ve become really good at tasks like penalty identification and backlink profile clean up. We’ve gotten a number of clients out from under the revenue depressing weight of algorithmic and manual penalties alike. But lately, a number of our penalized clients are becoming impatient. It’s not anything we’ve done, and it’s not due to anything we can do. We, along with the rest of the SEO industry, have been waiting 10 months for the next Google Penguin update.

(Poll embedded here via SurveyMonkey.)

As professionals in the search marketing field are aware, in order to truly recover from a Penguin penalty, Google needs to refresh that portion of their algorithm. Historically, Google has refreshed Penguin approximately every 6 months:

  • Penguin 1.0 – April 24, 2012
  • Penguin 1.1 – May 26, 2012
  • Penguin 1.2 – October 5, 2012
  • Penguin 2.0 – May 22, 2013
  • Penguin 2.1 – October 4, 2013

Typically Penguin refreshes have stuck to a general May/October refresh schedule. However, the last refresh occurred more than 10 months ago. Reactions from vocal contingents in the SEO industry have run the gamut, with many expressing frustration on behalf of their penalized clients, while others defend Google’s right as a private company to tweak their product as they see fit.

Add your voice to the debate through the poll above.

With Great Power Comes Great Responsibility

There are strong arguments to support the position that Google owes the webmaster community a refresh, and soon. To say that Google dominates online search is an understatement. At last check, Google’s reported search market share was near 68%, but most industry pundits believe Google’s true share is north of 80% — 90% in some verticals.

There are also many who believe that Google aims to make cheaters pay for their crimes with an unforgettable punishment, and that this delay does just that, especially if there is no update until 2015. Google is essentially the only game in town when it comes to online marketing. Some argue that diversifying your online income funnels is the key to removing yourself from under Google’s thumb, but I see no viable second option to the visibility that Google can offer a business.

lady justice with scales
Photo by Dan4th Nicholas (CC BY 2.0), modified

Even more frustratingly, Google has seemingly passed judgment on webmasters everywhere by framing their algorithmic changes in an ethical light. While “ethics” and “morals” both relate to right and wrong, ethics are guiding principles enforced on an individual by an external source (think religion, government or, in this case, Google). For that external source to enforce an ethical standard on a community, it needs power. In this case, that power is granted to Google by its widespread use. Fair or not, the profitability of too many businesses and the livelihoods of too many individuals hinge on the fluctuations of Google’s search algorithms.

Google seemingly embraces this role by using language like that in Matt Cutts’ April 2012 article announcing the original Penguin update, “Another step to reward high-quality sites.” In the article, Cutts explained that Google is interested in rewarding the “good guys” on the Internet:

“The goal of many of our ranking changes is to help searchers find sites that provide a great user experience and fulfill their information needs. We also want the ‘good guys’ making great sites for users, not just algorithms, to see their effort rewarded.”

Clearly, here Google has framed the conversation regarding their algorithmic updates in black and white. Do good and be rewarded; try to cheat Google’s algorithm and you’ll be singled-out and punished. Google’s corporate motto “Don’t be evil” aspires to be more than a mission statement and instead serves as a moral code which they have placed at the heart of all they do. Indeed, Google even offers the penalized webmasters an avenue for confessing their sins and receiving penance, having outlined the process for recovery in their Help Forums.

However, if Google is going to offer this remedy, then don’t they have a responsibility to hold up their end of the bargain and reward the contrition of the offending websites?

How many businesses, anticipating a refresh in May, have done their due diligence in scrubbing their link profiles spotless, doing Google the huge favor of helping to clean the Internet of inorganic links in the process, only to still be under penalty after nearly a year’s time, their business’ profits decimated in the interim?

The Dark Unknown of a Refresh

I would speculate that Google’s inability to refresh its Penguin algorithm is not based on intentional malice. It is much more likely that, as they incorporate the data from the hundreds of thousands (millions?) of disavow files they’ve acquired over the past year, trial SERPs are getting markedly worse, not better. This is an almost predictable result of the blind disavowing that many webmasters (and SEOs) engaged in after being penalized. If it’s Google’s intent to use the disavow data to identify low-value sites and improve SERPs, they have quite a task ahead of them: sorting through this mess while trying to return the best search results possible in a post-Penguin world.

It should also be noted that even worse than this current climate of frustration is the possibility of the unknown. Those who expect to see a benefit from their link pruning efforts are eager for the algorithmic refresh, but it’s possible we’ll see another unexpected outcome altogether. What if, when Google hits reset on its backlink calculations, it makes a number of other changes at the same time? With each Penguin iteration, Google’s webspam classifier becomes more restrictive. There’s a good chance that will happen again in the next refresh, with Google moving the line and lowering its tolerance for what is an acceptable backlink profile.

If this happened along with a refresh, would everyone who hopes to see gains be satisfied? And if it were to happen now, just as the holiday marketing season is set to begin, what kind of panic and chaos would we witness? Perhaps the devil we know is better than the devil we don’t know.

Predictions and speculation aside, all we can do is wait for Google, who first allowed sites to be rewarded for building links, but later penalized those same links (and sites) for being a bit too effective at influencing rankings. And we have thousands upon thousands of businesses who have had their profits decimated by Penguin penalties, either through ignorance of the guidelines or through their intentional manipulation, devoting substantial time, effort and resources to link pruning in the hopes of lifting the penalty and returning to Google’s good graces. We can only hope that when we do one day see the payoff of our link pruning work, our sites are deemed the better for it.

“Don’t be evil” implies the power to forgive when a website “repents” of its sins. Sites have worked hard to repent. They have learned their lessons. They want and need to be forgiven.

And so we wait. . .

The post POLL: Does Google Have a Responsibility to Refresh Its Penguin Algorithm? appeared first on Bruce Clay, Inc..

9 Tips for Getting Your Manual Link Penalty Overturned https://www.bruceclay.com/blog/9-tips-manual-link-penalty-removal/ https://www.bruceclay.com/blog/9-tips-manual-link-penalty-removal/#comments Tue, 06 May 2014 15:30:15 +0000 http://www.bruceclay.com/blog/?p=31718

The effect that an unnatural link penalty can have on a website can be crippling. Make no mistake, there is a punitive aspect to these actions. Google is looking to teach webmasters a lesson, one that ensures they will not think about violating the search giant’s quality guidelines in the future. To drive their point home, Google makes the process of recovering from these penalties very difficult.

Link penalty recovery takes time, effort, and a substantial commitment of resources. Depending on your specific situation, you could end up reviewing and/or removing hundreds of thousands (millions?) of links. And generally speaking, there is no shortcut to forgiveness.

A good number of clients that approached our firm for SEO services over the past year came to us suffering from some form of manual or algorithmic penalty. The good news is that we have seen a high degree of success in getting penalties overturned. What follows is a list of tips for getting a specific type of penalty removed: a manual link penalty.

These tips are my own and reflect one analyst’s experience dealing with these specific penalties. Every situation is unique, but generally speaking, following these tips will substantially increase your chances of recovering from the penalty.

1. Removing Links Is Job #1

There’s a reason that this is the first item on the list. If you are suffering from a manual link penalty (or any link penalty, for that matter) your best chance of having it overturned is the removal (or nofollowing) of inorganic links from the Internet. Do not rely on the disavow tool; it’s considered a last resort tool by Google and should only be used when every effort has been made to remove poor backlinks manually.

While Google has never confirmed their exact method of processing reconsideration requests, my experience tells me that one of the main metrics they consider is the amount of live inorganic links that are removed over time.

2. Be Thorough in Your Pruning Effort

The more backlinks you can gather for auditing, the better chance you have of offering Google a complete pruning effort. You should never rely solely on reported backlinks in Google Webmaster Tools. In fact, consider adding a second or third source of links. Bing Webmaster Tools, MajesticSEO, Ahrefs, and Moz’s Open Site Explorer are all excellent sources of backlink data.

Use Multiple Backlink Data Sources

There are plenty of stories out there that describe how Google has reported inorganic backlinks in denied reconsideration requests that are not listed in Google Webmaster Tools’ backlink report. It behooves webmasters to make an effort to monitor backlinks and create a more complete list if they hope to recover from a link penalty. If you paid someone to build links in the past, contacting them to try and obtain the original work log that listed the links built can be exceedingly valuable.

You can take this a step further and use Google itself to search for boilerplate language that link builders used to create backlinks to your site. Snippets from articles or searches for forum profile names and descriptions can uncover links that even the backlink reporting services may have missed.
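
To work from a single master list, the exports from these tools can be merged and de-duplicated with a short script. This is a minimal sketch, not part of the original article; the `URL` column name and the function name are assumptions, so adjust them to match your exports.

```python
import csv

def merge_backlink_exports(paths, url_column="URL"):
    """Merge backlink CSV exports (e.g. from Google Webmaster Tools,
    Majestic, Ahrefs) into one de-duplicated list of linking URLs.
    The column name is an assumption -- rename it to match your files."""
    seen = set()
    merged = []
    for path in paths:
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                # Normalize lightly so trivial variants collapse together
                url = (row.get(url_column) or "").strip().rstrip("/").lower()
                if url and url not in seen:
                    seen.add(url)
                    merged.append(url)
    return merged
```

Feeding the merged list into your auditing spreadsheet gives you one complete list to work through instead of three overlapping partial ones.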

3. Don’t Prune Nofollowed Links

Part of being thorough is making sure that you have gathered all of the pertinent data to assess your link profile. Nofollowed links do not pass PageRank and as such, they do not require removal or disavowing. You’d be shocked by how many clients I have worked with (many of whom run large online listing directories) who have nofollowed all external links on their site, but still receive removal requests.

It is important to note that Google Webmaster Tools does not offer data related to nofollowed or previously disavowed links (Matt Cutts, are you listening?), so you’ll need to crawl the links provided on those pages to gather details about them (Screaming Frog does a great job of this). Most link reporting tools like MajesticSEO, Ahrefs, etc., have a column that indicates if a link is nofollowed or not, but depending on the freshness of that data, the links themselves may not be live, which leads us to #4. . .
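
If you crawl those pages yourself, identifying nofollowed links is straightforward: parse each page’s HTML and check the `rel` attribute of anchors pointing at your domain. Here is a stdlib-only sketch (the class and function names are my own, purely illustrative):

```python
from html.parser import HTMLParser

class BacklinkParser(HTMLParser):
    """Collect links to `target_domain`, noting whether each is nofollowed."""

    def __init__(self, target_domain):
        super().__init__()
        self.target_domain = target_domain
        self.links = []  # list of (href, is_nofollow) tuples

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attr_map = dict(attrs)
        href = attr_map.get("href") or ""
        if self.target_domain in href:
            rel = (attr_map.get("rel") or "").lower()
            self.links.append((href, "nofollow" in rel))

def audit_backlinks(html, target_domain):
    parser = BacklinkParser(target_domain)
    parser.feed(html)
    return parser.links
```

Links flagged `True` are already nofollowed and can be dropped from your pruning sheet.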

4. Make Sure the Links on Your Sheet Are LIVE

Just because a link is being reported by Google Webmaster Tools or another backlink data provider, it doesn’t mean that the link is active and live on the Internet. Depending on when your link report was run, the links in question could be long gone (and so could the site they appeared on). You can automate the process of crawling a list of links to ensure they are live fairly easily using tools like Screaming Frog or SEOTools For Excel.
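
The same check can be scripted: a link belongs on your sheet only if the page still loads and still mentions your domain. A rough sketch under those assumptions, with the fetching step injectable so it can be tested offline (production code should add retries and a polite crawl delay):

```python
from urllib.request import urlopen

def link_is_live(page_url, target_domain, fetch=None):
    """Return True if `page_url` still loads and still mentions
    `target_domain`. `fetch` may be swapped out for testing."""
    if fetch is None:
        def fetch(url):
            # Default fetcher: plain HTTP GET with a timeout
            with urlopen(url, timeout=10) as resp:
                return resp.status, resp.read().decode("utf-8", "replace")
    try:
        status, body = fetch(page_url)
    except OSError:  # DNS failure, refused connection, HTTP error, etc.
        return False
    return status == 200 and target_domain in body
```

Run it over your merged list and mark any dead entries as already removed before you start outreach.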

5. Want Your Link Nofollowed or Removed? Ask Nicely

Some webmasters get inundated with link pruning requests and it’s not because they run a spammy site. Oftentimes their only sin was allowing followed comment links or trackbacks to be published on their site. Put yourself in the shoes of those webmasters when you craft your removal/nofollow request letter. How does the old saying go? “You catch more flies with honey than you do with vinegar”? This certainly applies to removal requests, so when crafting yours, be considerate and professional.

I do want to mention that there are some webmasters out there that will be less than amiable when they respond to your requests. Some will ask for money to remove your links while others will virtually scream, curse, spit and complain about your request. We advise our clients to ignore these responses, preserve their professionalism, and put the offending domains directly into the disavow list.

6. Disavow on the Domain Level

If you do have to resort to disavowing a link or set of links, it is probably better to disavow the entire domain as opposed to the individual URLs. Links that appear on blogs can often be republished on tag or archive pages and those pages aren’t always listed in the backlink report you might be working from. If you’ve decided the links in question have no value and can make the same claim for the entire domain, a domain level disavow is advised.
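
Google’s disavow file format supports this directly: a `domain:` line disavows every link from that domain, a bare URL disavows only that page, and lines beginning with `#` are comments. For example (the domain names are placeholders):

```text
# Owner of spamdomain1.com asked for payment to remove links;
# disavowing the entire domain
domain:spamdomain1.com

# Single low-value page; rest of the domain is fine
http://spamdomain2.com/page-with-link.html
```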

7. Track and Share Your Work

A proper link pruning spreadsheet details:

  • All of the data that you’ve gathered about a link
  • Your assessment of the value of that link (and reasoning behind it)
  • Your history in trying to get it removed or nofollowed
  • The result of those efforts

Keeping vigilant track of this information in one location and then uploading that file to Google Drive to share with the Google engineer who will be reviewing your reconsideration request is vital to getting your penalty removed.

Don’t forget to make sure that anybody with the link to your link pruning spreadsheet can view the entire document. List links to your link pruning spreadsheet in your disavow file and in the text of your reconsideration request as well.

8. Admit Your Sin, Detail Your Penance, Promise to Never Do It Again

The Google webspam team published a set of videos discussing manual action penalties. In them, Matt Cutts repeatedly indicates that there are 3 aspects to getting your manual penalty overturned.

  1. Give background on how and why you got the penalty
  2. Detail how you fixed the issue
  3. Assure the search giant that you will NOT violate their quality guidelines again

The body of your reconsideration request should discuss these 3 points in detail. Make your letter thorough, but also consider hitting these 3 points in the first few paragraphs. While we would like to think that the Google engineers who review reconsideration requests read every word we write, it makes sense to give them a summary of the work performed early in the letter.

When crafting reconsideration requests I ask myself this question: If the Google engineer only has time to read the first few paragraphs of this reconsideration request, would they have all of the information they need to overturn the penalty? If they don’t, redraft your letter so they do.

9. Be Patient; Auditing Links Takes Time

If there’s a common theme among failed link pruning campaigns, it’s the reluctance to roll up your sleeves and do the work necessary to fix the problem. Auditing links takes time and there’s no way to fake it. While it is true that you can exclude certain links from review (either because they are clearly valuable or harmful) the vast majority of links will require a website visit and a manual audit.

Unless you are Interflora, Rap Genius, or JCPenney, the process for recovering from a penalty is made intentionally difficult by Google. As I alluded to earlier, there is definitely a punitive aspect to manual actions – they are intended to make a large enough impact on the health and profitability of a website that the “offending” webmasters think twice before breaking Google’s Quality Guidelines again.

Google has even tweaked their reconsideration request denial letters to indicate that they expect you to spend time pruning links before you submit additional requests. They want to see webmasters put in substantial work in order to recover from these penalties.

Some believe that the first rounds of reconsideration requests are automatically denied. While we have not seen that to necessarily be the case, we do see that denied reconsideration requests all share some common characteristics – most notably quick, sloppy, artificial link pruning efforts that rely heavily on disavowing instead of the very difficult work involved in manually removing the offending links.

Tough Choices and a Word of Warning

Link pruning takes time and is an iterative process. Depending on the overall disposition of your link profile, you may have to make some hard decisions about links that have some value. We see this quite often as clients who engaged in link building, not link earning, campaigns are forced to prune or nofollow links that would ordinarily be considered organic. That decision is not easily made and should probably only come after initial pruning efforts have failed to remove the penalty in question.

There is no way around the fact that link pruning weakens websites. While the worst of the pruned links were probably not helping the website to begin with, there may be niche-appropriate links on less-than-stellar websites that do offer some value but have to be removed because the business engaged in inorganic link building and, as a result, cast doubt on its entire link profile. Often, the decision to remove those links can only be made after initial reconsideration requests have failed.

If you’re facing a manual link penalty or suspect a penalty of some kind is hindering your rankings in Google and would like to speak with an experienced professional, don’t hesitate to learn more about our SEO penalty assessment and removal services. 

The post 9 Tips for Getting Your Manual Link Penalty Overturned appeared first on Bruce Clay, Inc..

Breaking Bad SEO Habits: Unlocking Success for Online Marketing https://www.bruceclay.com/blog/5-seo-bad-habits/ https://www.bruceclay.com/blog/5-seo-bad-habits/#comments Fri, 10 Jan 2014 16:45:06 +0000 http://www.bruceclay.com/blog/?p=29768 Habits get a bad rap. They aren’t bad on their own. We develop them so we can get more done without having to think carefully about everything we do each day.

After any substantial amount of time in the SEO industry, you develop certain habits. Many of those habits are born of necessity, others reinforced by positive experiences, still others implemented for the sake of convenience. Regardless of their source, these tendencies influence the way we SEOs manage our client’s websites and behave online — and not always for the best.

Here are 5 habits that even the most experienced SEO analysts can fall into that could be harming your ability to successfully direct and manage a client’s online marketing campaigns.


The post Breaking Bad SEO Habits: Unlocking Success for Online Marketing appeared first on Bruce Clay, Inc..

Habits are often viewed negatively, but they have a purpose: they simplify our daily routines and help us achieve more without constant deliberation. Professionals in the SEO world develop habits with time, too. Some are born of necessity, while positive experiences reinforce others. However, certain habits can make it difficult to manage and direct online marketing campaigns for clients. In this article, you will learn about five common bad SEO habits that even experienced analysts can fall into and how to break them.

Overcoming Personalization Bias in Search Results

SEO analysts are taught to limit personalization when checking search results: to evaluate a client’s rankings fairly, you strip out the influence of search history and location. But relying only on non-personalized results can hide important insights into how users actually search.

Understanding the impact of Google+, localization, and blended search results can provide valuable SEO information. The majority of your client’s potential customers perform Google searches with personalization on. While it is impossible to replicate their exact experience, understanding the variations in the SERPs they encounter on the way to your website is essential to an effective online strategy. Remember that the user experience begins with the SERP, and that experience is crucial to the success of your marketing campaigns.

Move Beyond Daily Ranking Checking

Rankings are a key factor in the success of an SEO campaign. Ignoring how search engines rank your client’s website for important search terms could have disastrous consequences. But constantly monitoring rankings and being overly reactive to them can be detrimental to your campaign.

The pursuit of a higher ranking can compromise an online marketing strategy. Reactive tactics based on daily ranking fluctuations often produce poor results. It is important to stick to the plan, even if temporary ranking dips occur when launching an SEO campaign. SEO professionals are familiar with long-term best practices. Do not get caught up in minor changes in rankings, and have faith in the process. SEO is a marathon, not a sprint.

Strategic Implementation for SEO Enhancements

Any successful SEO campaign will have the enhancement of your client’s site as its primary objective. It is important to make sure that these enhancements are in line with your client’s business model and make logical sense.

Every aspect of your website should be tuned to work harmoniously with the larger framework. The ultimate goal is converting visitors into customers. Unfortunately, this goal is not always reached. Some SEOs will try to implement authorship, rating, and review markups even if they are not relevant or beneficial to the client.

Recent Google changes, such as the reduction of rich snippets on SERPs, highlight potential harm caused by these tactics. Rich snippets that are overused or misused can be viewed by search engines as a low-value signal, which could lead to ranking loss. You can lose focus if you chase the latest SEO trends. Prioritize targeting relevant audiences and directing these visitors to your website.

Seeing the world through fresh eyes

It’s said that “you can’t see the forest for the trees,” and this is true of SEO professionals as well as the websites they optimize. SEOs need to revisit their clients’ websites with a fresh perspective, just as a writer needs to step away from their work in order to evaluate it objectively. This helps identify issues that could hinder the conversion of website visitors into customers.

It is important to visit your clients’ websites regularly, but it is also crucial to periodically look at your own work with fresh eyes. It can be very valuable to collaborate with knowledgeable SEO analysts or to seek advice from colleagues. Bruce Clay, Inc. analysts are often involved in fruitful discussions where they share insights and best practices. Take advantage of the chance to gain new perspectives and avoid getting bogged down in the minutiae of on-page optimization.

Prioritizing user experience over search engine optimization

This is an important point that SEO professionals tend to overlook. It is difficult to strike a balance between optimization and conversion, but many SEOs sacrifice the usability of websites by focusing too much on ranking for specific keywords. Making changes for the sole purpose of optimizing keywords can have negative effects on a website’s usability.

Keyword optimization is becoming less important. Ranking for relevant search terms remains important, but its weight in SEO is decreasing. In-depth research shows that users are finding websites by using unconventional and nonintuitive search terms. Long-tail keywords are the most effective for conversions. Capturing long-tail traffic can be achieved by creating comprehensive content that is naturally and intelligently tailored to a topic rather than over-optimizing for a single keyword.

Google’s algorithmic updates, especially the introduction of Google Hummingbird, have shifted the emphasis from keyword-based to topic-based search engine optimization. Search engines now focus on providing complete information to users and answering their queries instead of matching keywords alone. To capitalize on this shift, sites should prioritize comprehensive content that covers topics in depth.

SEO Habits Adapted to an Evolving Landscape

All SEO professionals are committed to promoting the best interests of their clients. It is important to realize that certain habits can hinder the success of managed sites. SEO is a dynamic area that constantly changes with search engines. This requires a constant and critical evaluation of the tactics used.

Reflect momentarily: Are you guilty of any bad SEO habits that need to be broken? By recognizing these habits and actively working on them, you can improve your effectiveness as an SEO professional and drive more success in online marketing campaigns. Share your experiences or confessions in the comments.

Break free from detrimental SEO habits today to elevate your online marketing success. Prioritize user experience, embrace strategic implementation, and adapt to an evolving landscape for lasting SEO achievements. Contact us.

FAQ: Unveiling Negative SEO Practices and Online Marketing Success

SEO is a constantly evolving field that requires professionals to adapt to the changing demands of search engine algorithms. Certain ingrained habits, however, can negatively impact a client’s success in online marketing. In this FAQ, we will explore five common SEO habits that can be a trap for even experienced analysts. By analyzing and understanding these habits, we hope to equip SEO professionals with modern approaches for optimizing websites and driving successful online marketing campaigns.

1) Habit: Limiting Personalization Bias in Search Results

SEO analysts have traditionally been advised to focus solely on non-personalized results in order to accurately assess website rankings. This approach is a good way to evaluate ranking strength, but it does not consider factors that affect the search experience of the average user. Localization, personalized searches, and blended results all play a significant role in determining the SERP experience of users. Ignoring these factors can hamper the development of comprehensive marketing strategies. To combat this, SEO professionals need to tailor their campaigns according to the SERP variations that users encounter.

2) Habit: Daily rank checking is excessive

Rankings are unquestionably a key indicator of SEO success, but a focus on daily rankings can lead to a reactionary approach that hurts overall campaign performance. SEO professionals should instead adhere to a clearly defined strategy and avoid being swayed by minor fluctuations in rankings. It is important to maintain a long-term outlook and trust in established best practices. SEO is a process that takes time and requires patience.

3) Habit: Implementing the “Newest” SEO enhancements blindly

Any SEO campaign should aim to improve a client’s site. Implementing the latest SEO trends blindly without considering their relevance and compatibility with a client’s business can be harmful. SEO professionals need to ensure that all website components contribute to the overall goal of turning visitors into customers. Implementing authorship and rating markups without relevant content or context is counterproductive. SEO strategies should be based on a thorough understanding of the target market and aligned with the client’s objectives.

4) Habit: Inability To See The Big Picture

SEO professionals who visit a client’s site frequently may fail to spot larger issues that hinder conversion rates. It is important to periodically revisit the website from a fresh perspective. Collaborating with colleagues and asking for their advice can help you gain valuable insights and identify blind spots. By gaining a broader perspective, SEO professionals can overcome tunnel vision and tackle underlying issues that impact the website’s conversion rate.

5) Habit: Prioritizing Search Engine Optimization Over User Experience

It is difficult to achieve a balance between search engine optimization and user experience. Prioritizing keyword optimization above usability can compromise the overall effectiveness of a site. Over-optimization for specific keywords can lead to poor user experiences and reduce conversions. Search engines are increasingly prioritizing topic-based SEO. It is, therefore, important to create comprehensive, relevant content that answers users’ questions. By focusing more on valuable information and answers than on specific keywords, websites can better cater to users’ needs and align themselves with the changing search engine algorithms.

SEO professionals need to be aware of the habits that can hamper the success of online campaigns. By letting go of these habits, SEO professionals can optimize websites more effectively and achieve better results. An effective holistic SEO approach means prioritizing user experience over pure optimization, accounting for personalized search, taking a long-term perspective aligned with the client’s business model, and seeking out fresh perspectives. Professionals must stay ahead of the ever-evolving SEO landscape to remain competitive while giving their clients optimal results.

Step-by-Step Procedure to Break Bad SEO Habits and Unlock Success for Online Marketing:

Step 1: Overcoming Personalization Bias when Searching Results

– Emphasize the importance of evaluating search rankings without personalization to gain valuable insights.

– Explain that understanding variations in SERPs encountered by users is crucial for effective online strategies.

– Highlight the significance of user experience starting from the SERP and its impact on marketing campaigns.

Step 2: Move Beyond Daily Ranking Checking

– Stress the importance of rankings in SEO campaigns, but caution against being overly reactive to daily fluctuations.

– Encourage sticking to a long-term plan and best practices, even if temporary ranking dips occur.

– Emphasize that SEO is a process requiring patience and a focus on overall campaign goals.

Step 3: Strategic Implementation for SEO Enhancements

– Explain the primary objective of enhancing the client’s website and aligning improvements with their business model.

– Caution against implementing trendy SEO tactics without considering relevance and benefits to the client.

– Highlight the potential harm caused by overusing or misusing rich snippets and the need to target relevant audiences.

Step 4: Seeing the World Through Fresh Eyes

– Encourage SEO professionals to evaluate their clients’ websites periodically with a new perspective.

– Suggest collaborating with knowledgeable colleagues or seeking advice to gain fresh insights.

– Highlight the importance of avoiding getting too focused on on-page optimization details and minutiae.

Step 5: Prioritizing User Experience over Search Engine Optimization

– Stress the need to strike a balance between optimization and conversion while prioritizing usability.

– Explain that keyword optimization is becoming less important, and comprehensive content tailored to topics is crucial.

– Highlight the shift from keyword-based SEO to topic-based SEO and the importance of providing complete information to users.

Step 6: SEO Habits Adapted to an Evolving Landscape

– Emphasize the dynamic nature of SEO and the need for constant evaluation and adaptation of tactics.

– Encourage reflection on personal SEO habits and identification of bad habits that need to be broken.

– Highlight the importance of actively improving effectiveness as an SEO professional to drive success in online marketing campaigns.

This article was updated on December 16, 2023.  

The post Breaking Bad SEO Habits: Unlocking Success for Online Marketing appeared first on Bruce Clay, Inc..

]]>
https://www.bruceclay.com/blog/5-seo-bad-habits/feed/ 12
The Importance of Site Structure in the Absence of Keyword Data https://www.bruceclay.com/blog/importance-site-structure-absence-keyword-data/ https://www.bruceclay.com/blog/importance-site-structure-absence-keyword-data/#comments Mon, 21 Oct 2013 16:00:52 +0000 http://www.bruceclay.com/blog/?p=28678 Google sent shockwaves through the SEO community recently when it decided to encrypt all of its search query data and push “Not Provided” keyword results to 100%. While this change has been a long time coming, many SEOs are now struck with the stark realization that they are going to have to devise new ways to offer their clients the type of analysis and valuable metrics that they have become accustomed to with almost no keyword data.

Like so many aspects of SEO, internet marketing requires us to extrapolate conclusions from incomplete data. The complete lack of access to referring keyword data is another obstacle that must be overcome, but it also presents a unique opportunity to improve and leverage the structure of your website to help you claim some of that lost keyword data back.

Read more of The Importance of Site Structure in the Absence of Keyword Data.

The post The Importance of Site Structure in the Absence of Keyword Data appeared first on Bruce Clay, Inc..

]]>
Google sent shock waves through the SEO community recently when it decided to encrypt all of its search query data and push “Not Provided” keyword results to 100%. While this change has been a long time coming, many SEOs are now struck with the stark realization that they are going to have to devise new ways to offer their clients the type of analysis and valuable metrics that they have become accustomed to with almost no keyword data.

Like so many aspects of SEO, Internet marketing requires us to extrapolate conclusions from incomplete data. The complete lack of access to referring keyword data is another obstacle that must be overcome, but it also presents a unique opportunity to improve and leverage the structure of your website to help you claim some of that lost keyword data back.

A properly implemented site architecture can help chase those keyword demons away and allow you to track your online marketing campaigns with an accuracy and effectiveness reserved for the long-past days when keyword data wasn’t the endangered species that it is today.

Siloing: It’s All about the Structure

Website siloing is a way of organizing your site’s content to establish clear themes. Proper site structuring can go a long way towards improving your site’s usability and visibility. Bruce Clay has been training website owners on the intricacies of site siloing for more than 7 years, as the SEO siloing article on bruceclay.com explains (shameless plug intended):

“In order to rank for keywords within Google, Yahoo and Bing, a site must provide information that is organized in a clear structure and language that search engines understand… The term siloing originated as a way to identify the concept of grouping related information into distinct sections within a website. Much like the chapters in a book, a silo represents a group of themed or subject-specific content on your site…”

A siloed website architecture builds themes around keyword sets. By optimizing specific sections of your website for a set of keywords in a particular theme, you can make assumptions about the sources of organic landing page traffic to that “silo” or section. Take this a step further by assigning your highest-value keywords to specific pages in your silo (something you should be doing already), and you’ll have detailed data about the keywords searchers use to land on specific pages of your site.
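The page-to-keyword assignment described above can be sketched as a simple lookup table that reverses the mapping at analysis time: given the landing page a visitor arrived on, you infer the keyword it was optimized for. This is a minimal illustration only; the URLs and keywords below are hypothetical, not taken from the actual site.

```python
# Hypothetical keyword-to-page assignments for an /seo/ silo
# (URLs and keywords are illustrative, not from bruceclay.com itself).
keyword_map = {
    "/seo/": "search engine optimization",
    "/seo/silo.htm": "seo siloing",
    "/seo/local.htm": "local seo services",
}

def assumed_keyword(landing_page):
    """Infer the keyword a visitor likely searched, from the page they landed on."""
    return keyword_map.get(landing_page, "(unassigned long-tail query)")

print(assumed_keyword("/seo/silo.htm"))  # prints the keyword assigned to that page
```

The fallback value matters: any landing page without an assigned keyword is flagged as probable long-tail traffic rather than silently misattributed.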

Siloing in Action

We’ll use the bruceclay.com website to illustrate our point. You’ll notice that our website is siloed across several themes. For the purposes of this exercise, we’ll look at the SEO silo, whose pages are contained in the directory bruceclay.com/seo/.

siloed site navigation

Our site has a physical silo structure, which means that the siloed pages all appear in an actual subdirectory on the site, in this case, /seo/; this is an important factor in our ability to track organic traffic through analytics.

In order to view our traffic on a silo basis, we go to the organic traffic report and make our primary dimension “Landing Page.” Next, we add a filter that includes only results containing the silo’s directory. In our case, we add an inclusive filter with the value “/seo/”. This allows us to view landing page traffic for pages that appear in the /seo/ subdirectory only.
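The same silo filter can be reproduced offline against an exported landing-page report. Here is a minimal Python sketch; the CSV column names (“Landing Page,” “Sessions”) and the figures are illustrative assumptions, not real analytics output.

```python
import csv
import io

# Illustrative export of an organic landing-page report
# (column names and numbers are hypothetical, not real analytics data).
sample_export = """Landing Page,Sessions
/seo/,1200
/seo/search-engine-optimization.htm,850
/blog/some-post/,400
/ppc/,300
"""

def silo_traffic(report_csv, silo_dir="/seo/"):
    """Total organic sessions for landing pages inside one silo directory."""
    reader = csv.DictReader(io.StringIO(report_csv))
    return sum(
        int(row["Sessions"])
        for row in reader
        if row["Landing Page"].startswith(silo_dir)
    )

print(silo_traffic(sample_export))  # → 2050 sessions landing in the /seo/ silo
```

Because the silo is a physical subdirectory, a simple prefix match on the landing page URL is all the filter needs, which is exactly why the physical structure matters for tracking.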

organic traffic report

The resulting report offers us the organic search traffic for our SEO silo, whose pages all have specific keywords assigned to them. Armed with historical ranking data, we can begin to assign traffic averages to keyword ranking positions for specific time periods. This data can be extremely valuable, as it offers specific data on traffic fluctuations as they relate to increases and decreases in rankings.
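Folding historical ranking data into that silo traffic might look like the following sketch, which averages observed sessions at each ranking position. The rank and session figures are invented for illustration; real inputs would come from a rank tracker paired with the landing-page report.

```python
from collections import defaultdict

# Hypothetical daily history for one silo page:
# (ranking position that day, organic sessions that day).
history = [
    (3, 120), (3, 130),
    (2, 210), (2, 190),
    (1, 400), (1, 380),
]

def avg_sessions_by_rank(history):
    """Average daily sessions observed at each ranking position."""
    totals = defaultdict(lambda: [0, 0])  # rank -> [session_sum, day_count]
    for rank, sessions in history:
        totals[rank][0] += sessions
        totals[rank][1] += 1
    return {rank: s / n for rank, (s, n) in sorted(totals.items())}

print(avg_sessions_by_rank(history))  # → {1: 390.0, 2: 200.0, 3: 125.0}
```

With enough history, these per-position averages let you estimate how much traffic a movement from, say, position 3 to position 1 is worth for a given page.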

Essentially, we have created a new type of keyword report, one that looks at a targeted section of our site and gives us insight into keyword referral data by concerning itself with a finite list of keyword phrase possibilities.

While this type of analysis does not replace true keyword referral data, it does give us great, actionable insights into the effectiveness and shortcomings of our SEO campaigns. Long-tail traffic and specific keyword variations can be difficult to track with this type of analysis, but as explained earlier in this post, in the absence of true visitor keyword data, SEOs are forced to extrapolate conclusions from incomplete data.

Keywords have always been used as an important metric in the analysis of the effectiveness of SEO campaigns. As access to that data has shrunk over the past several years, it has been important that SEOs adapt their strategies and reporting to analyze the data we do have access to, specifically data that websites themselves own (as opposed to data being provided from outside sources, like search engines). Our agency has long preached that SEO campaigns should be judged by their ability to increase traffic, and little about that philosophy will change, regardless of how search engines treat the data that they report about searchers.


The post The Importance of Site Structure in the Absence of Keyword Data appeared first on Bruce Clay, Inc..

]]>
https://www.bruceclay.com/blog/importance-site-structure-absence-keyword-data/feed/ 2