BruceClay - John Alexander, former SEO analyst

How to Optimize for Google Home NOW #OKGoogle

Google’s recent debut of Google Home — and the impact it will have on search — has kept my mind reeling all week. I haven’t been able to shake this sense that we are on the cusp of a real change in the way that people interact with technology. (It’s also one step closer to the Star Trek computer Amit Singhal wants to make reality.)

Read on to find out how Google Home will deeply impact our interaction with search engine results pages (SERPs) and, subsequently, digital marketing strategy.

How Google Home Will Impact Our Interaction With the SERP

The Star Trek computer isn’t a bot that analyzes external data and catalogs instances of things to return a list of entries that users have to peruse. It’s a knowledge base, much like Google’s knowledge graph. It’s simple, intuitive, and omnipresent. In the world of Star Trek, people spend very little time looking at lists of options; the computer makes the decisions for them.

Google Home bases in seven colors.
Photo Credit: Google.

So how do we get to the 24th-century computer from here? The announcements Google made on Oct. 4 took a big step in that direction. Both Google Home and Google Assistant (the heart of the new Pixel phone) bring Google’s experience with artificial intelligence to bear — and Google is training us to use technology in new ways.

The Google Assistant landing page invites visitors to use voice queries: “Ask it questions. Tell it to do things. Tell your Assistant to play jazz on the living room speakers, set your ‘go to gym’ alarm, make a reservation …”

SEOs see that and wonder: Where is there space for a SERP in there?

A SERP presents many results and lets a searcher click their choice. But voice searchers talking to Google Home have a different experience. Google wants to let the Assistant eliminate choices when there’s a clear best option — and “best” is defined by the Assistant.

Obviously, not all queries can have a single response. As you might imagine, a lot of the things we search for need a selection of answers or opinions. But do we need 15,000,000 opinions? Do we even need 10?

Example: I have a pretty small selection of power tools in a very small garage, but I’m getting into some simple woodworking. I recently needed to figure out how to quickly cut a 4″ diameter hole in some 1″ x 8″ pine. So I Googled it.

I really had to comb through the results based on my limited tool collection; a lot of the answers I ignored because they just weren’t helpful.

Now imagine doing that as a voice search with Google Home. I might hear many options the first time I searched. But because of machine learning, Google would eventually recall the particular site (or group of sites) that caters to my skill level, and work out that of the 15,000,000 results for “how to cut circles in wood” there might be 4 that are actually useful to me. That’s important information for Assistant; if I ask Google Home a question while I’m in the garage plugging my jigsaw in, I don’t really want it reading 10 articles to me.

How Will Google Home Affect My SEO Strategy?

A lot of businesses have been doing online marketing, SEO, and PPC for long enough that it’s easy to think we have lots of time to catch up to, or surpass, competitors.

While I’m not saying the SERPs are going away anytime soon, I do think that the increased emphasis on personalization is only going to make it harder to find new customers.

If you make a living off of publishing restaurant reviews, and people start using Assistant to find out about a new burger joint instead of Googling it, then Assistant will (likely) pull from one source to get reviews. New users might not even see your site as an option.

By the way, we have no clue yet where AdWords may one day fit into a Google Home result.

Be Indispensable

Is the rise of spoken search results bad news for sites that aren’t Yelp, Wikipedia, or YouTube?

No! But it’s bad news for businesses that aren’t putting in the work to understand their audience. It’s bad news for businesses that aren’t willing to grow with their customers’ evolving needs.

If your business is willing to talk to your customers, to find out what your competitors are missing, then this new search technology is good news. Because the only way to be algorithm proof, the only way to secure a lasting position in the evolving world of search, is to be indispensable. So ask yourself this: Would the SERPs be lacking without your site?

Do some user experience testing. Survey your customers. Talk to your customer service reps to identify common questions or complaints, then address them. Figure out what your customers do just before and just after converting on your site; if you can help them perform some of those repetitive actions, you’ve suddenly simplified their lives.

This can be as simple as a good “People also bought” widget that anticipates the next need. If the user adds a nail gun to the cart, why not suggest some popular nails that fit the gun? Or, if it’s a pneumatic nail gun, maybe the user would like to know about a sale you’re having on air compressors.

Understand Your Analytics

The other thing to keep in mind here is that less traffic isn’t always a bad thing. A broad trend some of us in SEO have noticed is that many sites aren’t ranking for as many queries as they used to, which at first seems like terrible news. But many of those same sites are actually seeing better rankings for more specific queries, and a concurrent increase in conversions. As the search engines get better at understanding user intent, and as search becomes more and more personalized, rankings will be harder to track, and (in many instances) harder to get. But if your visits drop while your conversion rate improves, then that’s a net gain (assuming that you’re in business to make money, of course).

The one exception to this, of course, is sites that depend on page views for revenue (i.e., ad-heavy sites). Now would be a good time to start developing a secondary revenue stream, another way to monetize your site that you can grow over time, as people spend less and less time on the SERPs.


Missed the Oct. 4 press conference where Google announced the debut of Google Home, Google Pixel and more? Watch the announcement below.

Apps 101: Deep Linking, App Indexing, and Why They Matter

Apps are largely considered the new SEO frontier. Mobile app usage is growing at an incredible rate and shows no sign of slowing. Overall app usage grew by 76 percent in 2014, and lifestyle apps in particular saw 174 percent growth.

If you have an app or are developing an app, then app indexation and deep linking are things you definitely need to be paying attention to. Basically, Google wants to treat your app like a website. It wants to crawl it and index it so that search results can return specific pages from an app in mobile searches. That ability to return specific pages within an app? That’s called deep linking.

This article is for those just dipping their toes into app indexation. Read on for a breakdown of what app indexing and deep linking actually are, as well as helpful examples of deep linking in action (and how they will affect your SEO). You’ll also learn basic requirements for Android and iOS setup.

Dip Your Toe Into Deep Linking and App Indexing

First, Some Definitions: What Are App Indexing and Deep Linking?

What Is Deep Linking?

Deep linking, in a general sense, involves linking to specific content within a website or app, rather than to the homepage. Here we’re talking in particular about getting specific elements of an app to show up in search results on a mobile device, allowing users to open an app directly from a search results page. Note: Users will only see this prompt if they have the particular app installed.

What Is App Indexing?

App indexing is the process of getting your app’s pages into Google’s index, which is what enables deep linking. By allowing Google to index pages within your app, features (or promotions) within the app can begin showing up in users’ mobile searches, driving visits (and hopefully conversions) to the app.

Deep Linking in Action

Let’s say you search for “Jurassic World” on a mobile device, and you’re offered IMDB’s Jurassic World page rather than the IMDB homepage — this is deep linking in action. You, as the user, have the IMDB app installed on your smartphone, so you’re pleased to find among the top results the page for “Jurassic World” in your app, as well as a listing on IMDB.com directly.

Furthermore, if you wanted to read some reviews for the movie, you might type in “Jurassic World reviews” in a mobile search.

This result doesn’t give you the option to open the reviews in the mobile app. It would be a great opportunity to drive you to the app rather than the website, but that option simply doesn’t exist. Now, what if IMDB’s reviews page weren’t ranking on the first page? Deep linking in this instance would be a great way for IMDB to keep driving people back to its app, since Google gives preference to apps that users have installed.

Next Steps: Getting Your App Indexed

To begin with, there are general setup requirements for Android and iOS:

Android Setup Requirements

  1. Must be developed with minSdkVersion 17 or lower.
  2. Only available on searches using Google app version 2.8 or higher, and Chrome for Android 4.1 or higher.
  3. Users must be signed in for deep linking to work.

iOS Setup Requirements

  1. Developed on a base SDK of iOS 8.
  2. Only available on searches using Google app version 5.3 or higher, and Chrome for iOS.
  3. Users must be signed in for deep linking to work.

From there, Google offers specific guides for setting up Android apps and iOS apps for indexing. It is a technical process, but investing the time and effort can drive more users into your app and increase your relevance and visibility.
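To give you a feel for what’s involved, here’s a rough sketch of the Android markup; the package name and URLs below are hypothetical. On the web side, each page advertises its in-app counterpart with a rel="alternate" link using the android-app:// URI scheme from Google’s App Indexing documentation:

```html
<!-- On the web page: point Google at the matching screen in the app.
     URI format: android-app://{package_name}/{scheme}/{host_path} -->
<link rel="alternate"
      href="android-app://com.example.recipes/http/example.com/recipes/lasagna" />
```

On the app side, the corresponding activity declares an intent filter in AndroidManifest.xml so it can actually handle those URLs when a searcher taps the deep link:

```xml
<!-- Inside the <activity> that displays recipe pages -->
<intent-filter>
  <action android:name="android.intent.action.VIEW" />
  <category android:name="android.intent.category.DEFAULT" />
  <category android:name="android.intent.category.BROWSABLE" />
  <data android:scheme="http"
        android:host="example.com"
        android:pathPrefix="/recipes" />
</intent-filter>
```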

Have a specific question about app indexing and deep linking, or mobile SEO in general? Ask us in the comments! We’re always here to help.

Google Wants You to Make Your Site Faster or They’ll Do It For You. Will You Like the Result?

In April, around the time of Google’s “Mobilegeddon” mobile ranking update, the search engine announced another mobile optimization in testing. Via the Webmaster Central Blog, Google said they’d “developed a way to optimize pages to be faster and lighter, while preserving most of the relevant content.” In other words, if you don’t optimize your site so that it loads quickly for mobile devices, Google will try to do it for you.

(Get your All-In-One Mobile SEO and Design Checklist here.)

Called transcoding, Google says it’s a feature intended to help deliver results quickly to searchers on slow mobile connections. Google’s early tests show that transcoding returns pages with 80 percent fewer bytes and 50 percent faster load times. Indonesia has been the staging ground for early field tests, displaying transcoded sites when a mobile searcher is on a slow connection, like 2G.

Sounds cool, right? Now website owners and SEOs don’t need to worry about optimizing sites to be fast; Google is going to do it for us! What a magnanimous thing for Google to do. Except that there are a couple of reasons why Google’s low bandwidth transcoder should give developers and webmasters pause.

The Cons of Transcoding

  1. Google says the optimized versions preserve “most of the relevant content.” There are two editorial decisions in that phrase: most and relevant. The biggest problem here is that Google, not you, decides which of your content is relevant, and how much of it to show.
  2. You probably didn’t hire a bot to design your website; do you want a bot optimizing it?

The Pros of Transcoding

Cons notwithstanding, some sites could see real benefits from Google’s transcoding:

  1. For many website owners, users seeing a stripped-down version of a site is better than users not seeing it at all.
  2. Google includes a link to the original page on the transcoded version, so users have the option to see the page how you built it.
  3. Transcoded pages are undoubtedly fast. For a detailed comparison, I ran the two pages above through GTmetrix and came up with the following results:
                     Page Load Time    Total Page Size    Total Requests
Original Version     3.71 seconds      1.94 MB            155
Transcoded Page      0.56 seconds      17.3 KB            2

Viewing Your Transcoded Page

If you’re curious to see how your site renders when it’s been transcoded by Google, there’s a tool to show you just that. You’ll need to do a little workaround if you’re outside of Indonesia:

  1. Using the Chrome browser, go to the Low Bandwidth Transcoder emulator at https://www.google.com/webmasters/tools/transcoder.
  2. Click the toggle menu in the top right corner of the browser window, then click More tools and then Developer tools.
  3. Along the top of the window you’ll see two drop-down menus: Device and Network. Select any smartphone device from the menu selection, like the Google Nexus 4.
  4. Enter the URL you want to test in the “Your website” field and click the Preview button.
  5. Click the URL that appears below the text “Transcoded page:” and you will see how the page renders as Google transcodes it.

When Will We See Transcoding in the Wild?

A lot of this has to do with being lightweight; it’s not enough to use CDNs or have a high-end server. Google cares about the experience of people who access the web at the narrow end of a bottleneck, which is completely out of the hands of web developers. Your fast server only means so much to users on a 2G network.

At this point, we have no idea when, if ever, this functionality will be implemented anywhere outside of Indonesia. However, Google’s underlying statement is clear: websites should be really, really fast.

For information on how to optimize your pages for speed and mobile SEO, we recommend starting with these resources:

All-In-One Mobile SEO and Design Checklist

New Mac OS Allows Seamless Cross-Device Internet Experience; Just Another Reason to Shift to Entity SEO

Semantic Search, Hummingbird and Mobile Devices

It’s been just over a year since Google updated its algorithm with Hummingbird, making it better equipped to serve conversational search queries.

The day is coming when the majority of searches will be conducted with natural language; most queries will be long-tail; and optimizing for a set of short keyword phrases won’t be sufficient. Some have been saying this same thing for a long time now. But have we all been listening?

New technology released by Apple in October is bringing us another step closer to device agnostic user experience. For search marketers, this is yet another reason to optimize for concepts over keywords. Here I’ll describe that technology, and also share some recent stats on voice search to help us shift our thinking toward entity SEO, optimization geared for semantic search.

New Apple Features Allow Seamless Cross-Device Internet Use

With the release of new operating systems for Macs and its i-devices (iPads and iPhones), Apple has created a seamless experience for Internet use (texts, emails and phone calls) with a feature-set called Continuity. One feature, Handoff, allows you to start and stop a task on one device, be it iPad, iPhone, or Mac computer, and then restart on another Apple device.

Mac Handoff icon in the Dock

Start typing an email on your phone then realize it would be easier to add attachments from your laptop? Handoff lets your laptop pick up right where your phone left off, provided both devices are near each other and on the same network. In the email example I used, when you look at your laptop you’ll see an icon representing the email you were just writing on your phone; clicking on it brings up the email right where you left off.

Through Continuity, you can take phone calls on your laptop, or send and receive text messages from your laptop. Plus, all of the text messages on your iPhone appear on your laptop. So you can click on a phone number pretty much anywhere on your Mac and send an SMS or iMessage.

Climbing Voice Search Stats

While I haven’t found any statistics on how many people currently perform voice searches on desktop versus on a mobile phone, I did discover Google’s own study showing how many teens use voice search, compared to adults. Some highlights:

  • More than half of the teens surveyed use voice search daily.
  • 41% of the adults said they “talk to their phones every day.”

How Cross-Device User Behavior and Voice Search Affect SEO

Prediction 1: Device tracking becomes obsolete.

I expect we’ll move away from the distinction between mobile and desktop search as new technologies blur the line; phablets and wearables already occupy a confusing middle ground. Sure, analytics can track what types of devices are visiting pages, and we as an industry can parse apart how users are accessing our websites. But Apple’s Continuity moves user behavior toward the device agnostic, which will naturally affect how people search.

Prediction 2: SEO strategy will evolve for semantic search technology. Here are three ways:

Entity SEO emerges. Optimize for complete coverage of concepts, sometimes called entities, over keywords. It’s already been well established that someone performing a voice search on their phone tends to use conversational language, and focuses less on keywords. Semantic search, natural-language queries, and the underlying need to understand the connection between online concepts is exactly the basis of Google’s big Hummingbird Update last year.

More than ever, understand your audience. In the midst of optimizing Meta tags, checking page load times, and monitoring backlinks, don’t forget to do good market research, develop well-informed user personas, and maybe run a survey.

Answer questions, don’t rank for queries. In “Google Hummingbird & The Keyword” published last November, Jim Yu explains that previously SEO sought to answer the question “How do I rank for this query?” SEO today must solve the problem, “How do I best answer the questions my users have?” He says that if you’ve been staying up to date with trends in SEO, Hummingbird only reinforced the work you’d been doing.

Of course, that’s a pretty big if. And I think that a lot of SEOs fall outside that if, as do a lot of site owners. The transition from keywords to concepts is happening, and this latest move from Apple is the next clarion call that mobile is taking over. More than ever, it’s time to optimize for concepts rather than keywords. A unified user experience between desktop, laptop, tablet, and phone is the latest advance with major potential to shift how people interact with their gadgets, and ultimately, how they search.

New Structured Snippets: An Enhanced SERP Snippet Is Just a Table Away

On September 22, the Google Research Blog announced Structured Snippets, a feature that “incorporates facts into individual results snippets in Web Search.” What it amounts to is elements from an HTML table being shown right in the SERP. If this sounds at all like Structured Data, it should. At least, sort of. Google displays data from your website on their results page, yet it doesn’t require schema markup or any other specialized coding. All you need is a table. Oh, and relevant data.
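To make that concrete, here’s a minimal sketch of the sort of markup we’re talking about: a hypothetical camera spec table (the product values are invented for illustration), with no schema.org or other structured markup anywhere in sight:

```html
<!-- A plain HTML table; no structured-data markup required -->
<table>
  <tr><th>Sensor resolution</th> <td>24.1 MP</td></tr>
  <tr><th>ISO range</th>         <td>100 to 6400</td></tr>
  <tr><th>Weight</th>            <td>675 g</td></tr>
</table>
```

If Google judges those facts relevant to a query, it can lift them straight into the result’s snippet.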

The Google post has an example of a Structured Snippet for the query “Nikon d7100”:
structured snippet in google serp for nikon d7100

In order to test these results out, we found the table below from Car and Driver. It’s formatted as a classic table, without any structured markup.

car and driver fast facts

And here is how it appears in the SERP:
google structured snippet serp for dodge challenger

As you can see, the data about the Dodge Challenger in the SERP listing above isn’t quite as easy to read as Google’s Nikon example, but the information is there.

This announcement has been greeted with a fair amount of skepticism, as many webmasters and content creators are frustrated that Google has found yet another way to take data from websites and present it on the search engine’s own pages, consequently stealing clicks from websites that actually published the data originally. But the fact is that there are several reasons to welcome this latest innovation.

Optimizing Structured Snippets

As is often the case, whether you welcome or dread it, this change has a lot to do with perspective. Google introduced this change to improve user experience, so SEOs and webmasters should have the same goal in mind when thinking of how to include interesting information in tables on their website to garner more attention in the SERPs. Here are some benefits to Structured Snippets:

  1. Challenges webmasters, designers and marketers to reexamine how we present information. A well-made table is an engagement object. It’s helpful for users, and it breaks up long blocks of text. Tables just became another tool in your content utility belt.
  2. No special markup required. Google said it, and based on all the examples we’ve seen, it’s true; you don’t need to learn some new technology to make the most of this change. Got data that would look good/be easier to read in a table? Great. Make that table.
  3. More real estate on the SERP. So far I haven’t heard anyone mention this, but in some instances, like in the Nikon example above, the amount of space for your entry nearly doubles. While it’s possible that Google pulling data from your website and putting it in SERPs may lower your click-through rate, it’s also possible that getting a larger entry in the SERP could help your CTR.

What Structured Snippets Mean for the Future of Search

First off, I’ll tell you what it doesn’t mean: the death of structured data. This isn’t cause for letting your schema markup fall by the wayside; if anything, Structured Snippets reinforce the importance of structured data overall. Why? Because both tools enable search engines to determine A) what your page is about, and B) how relevant it is to search queries. Search engines, as they’re always pointing out, exist to serve users, not webmasters. All of this structuring allows search engine spiders to efficiently crawl your site and figure out who’s looking for what you’re offering.

It’s possible, and I’m really speculating here, that Meta tags (Title, Description, and the seldom-used Keywords tags) will become less and less important over time. Search engines know that it’s too easy to offer over-optimized Titles (can you say “clickbait”?) and so they’re beginning to look directly into your content; after all, how long has Google been presenting snippets of content in the SERP, where it used to always just be your Meta description? Structured Snippets are one more way to let spiders, and users, get your data quickly and easily.

Search Engine Land points out, however, that Structured Snippets could cause some difficulty for websites that use responsive design, as tables are tough to format for mobile devices. It’s probably worth taking a page from Wikipedia’s playbook in formatting tables for a variety of devices.

One thing is certain: those who make the most of this new tool stand to gain the most ground over those who are slow to adapt.

Nowhere Left to Hide: Blocking Content from Search Engine Spiders

TL;DR
  1. If you’re considering excluding content from search engines, first make sure you’re doing it for the right reasons.
  2. Don’t make the mistake of assuming you can hide content in a language or format the bots won’t comprehend; that’s a short-sighted strategy. Be up front with them by using the robots.txt file or Meta Robots tag.
  3. Don’t assume that just because you’re using the recommended methods to block content, you’re safe. Understand how blocking content will make your site appear to the bots.

When and How to Exclude Content from a Search Engine Index

A major facet of SEO is convincing search engines that your website is reputable and provides real value to searchers. And for search engines to determine the value and relevance of your content, they have to put themselves in the shoes of a user.

Now, the software that looks at your site has certain limitations which SEOs have traditionally exploited to keep certain resources hidden from the search engines. The bots continue to develop, however, and are continuously getting more sophisticated in their efforts to see your web page like a human user would on a browser. It’s time to re-examine the content on your site that’s unavailable to search engine bots, as well as the reasons why it’s unavailable. There are still limitations in the bots and webmasters have legitimate reasons for blocking or externalizing certain pieces of content. Since the search engines are looking for sites that give quality content to users, let the user experience guide your projects and the rest will fall into place.

Why Block Content at All?

when to block search engine spiders
Photo by Steven Ferris (CC BY 2.0), modified
  1. Private content. Getting pages indexed means that they are available to show up in search results, and are therefore visible to the public. If you have private pages (customers’ account information, contact information for individuals, etc.) you want to keep them out of the index. (Some whois-type sites display registrant information in JavaScript to stop scraper bots from stealing personal info.)
  2. Duplicated content. Whether snippets of text (trademark information, slogans or descriptions) or entire pages (e.g., custom search results within your site), if you have content that shows up on several URLs on your site, search engine spiders might see that as low-quality. You can use one of the available options to block those pages (or individual resources on a page) from being indexed. You can keep them visible to users but blocked from search results, which won’t hurt your rankings for the content you do want showing up in search.
  3. Content from other sources. Content such as ads, generated by third-party sources and duplicated in several places throughout the web, isn’t part of a page’s primary content. A webmaster may want to keep that widely duplicated ad content from being viewed as part of the page.

That Takes Care of Why, How About How?

I’m so glad you asked. One method that’s been used to keep content out of the index is to load the content from a blocked external source using a language that bots can’t parse or execute; it’s like when you spell out words to another adult because you don’t want the toddler in the room to know what you’re talking about. The problem is, the toddler in this situation is getting smarter. For a long time, if you wanted to hide something from the search engines, you could use JavaScript to load that content, meaning users get it, bots don’t.

But Google is not being at all coy about their desire to parse JavaScript with their bots. And they’re beginning to do it; the Fetch as Google tool in Webmaster Tools allows you to see individual pages as Google’s bots see them.

screenshot of Fetch as Google Webmaster Tool

If you’re using JavaScript to block content on your site, you should check some pages in this tool; chances are, Google sees it.

Keep in mind, however, that just because Google can render content in JavaScript doesn’t mean that content is being cached. The “Fetch and Render” tool shows you what the bot can see; to find out what is being indexed you should still check the cached version of the page.

screenshot of how to find your site's Google cache

There are plenty of other methods for externalizing content that people discuss: iframes, AJAX, jQuery. But as far back as 2012, experiments were showing that Google could crawl links placed in iframes; so there goes that technique. In fact, the days of speaking a language that bots couldn’t understand are nearing an end.

But what if you politely ask the bots to avoid looking at certain things? Blocking or disallowing elements in your robots.txt or a Meta Robots tag is the only certain way (short of password-protecting server directories) of keeping elements or pages from being indexed.
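As a concrete sketch (the directory names are hypothetical), the robots.txt approach looks like this:

```
# robots.txt at the site root: ask all spiders to skip these directories
User-agent: *
Disallow: /internal-search/
Disallow: /account/
```

For page-level control, the Meta Robots tag goes in the page’s head. Note that for a noindex directive to be seen at all, the page itself has to remain crawlable:

```html
<!-- Let spiders follow the links on this page, but keep the page out of the index -->
<meta name="robots" content="noindex, follow">
```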

John Mueller recently commented that content generated with AJAX/JSON feeds would be “invisible to [Google] if you disallowed crawling of your JavaScript.” He further goes on to clarify that simply blocking CSS or JavaScript will not necessarily hurt your ranking: “There’s definitely no simple ‘CSS or JavaScript is disallowed from crawling, therefore the quality algorithms view the site negatively’ relationship.” So the best way to keep content out of the index is simply asking the search engines not to index your content. This can be individual URLs, directories, or external files.

This, then, brings us back to the beginning: why. Before deciding to block any of your content, make sure you know why you’re doing it, as well as the risks. First of all, blocking your CSS or JavaScript files (especially ones that contribute substantially to your site’s layout) is risky; it can, among other things, prevent search engines from seeing whether your pages are optimized for mobile. Not only that, but after the rollout of Panda 4.0, some sites that got hit hard were able to rebound by unblocking their CSS and JavaScript, which would indicate that they were specifically targeted by Google’s algorithm for blocking these elements from bots.

One more risk that you run when blocking content: search engine spiders may not be able to see what is being blocked, but they know that something is being blocked, so they may be forced to make assumptions about what that content is. They know that ads, for instance, are often hidden in iframes or even CSS; so if you have too much blocked content near the top of a page, you run the risk of getting hit by the “Top Heavy” Page Layout Algorithm. Any webmasters reading this who are considering using iframes should strongly consider consulting with a reputable SEO first. (Insert shameless BCI promo here.)

A Cheat Sheet for Mobile Design: Responsive Design, Dynamic Serving and Mobile Sites

Editor’s update (July 2015): In April 2015, Google made a pre-announced update to the algorithm that ranks mobile search results. A website’s mobile friendliness is a confirmed ranking signal for mobile search rankings. Take note and make your website mobile friendly with the help of the following article.

We’ve all heard the statistics: 2014 is the year when more people access the Internet on a smartphone than on a computer or laptop. Mobile design is the future. You don’t want your site left behind, but how exactly do you program for this increasingly mobile Internet? There are three main options, each with its own benefits and drawbacks. In this post, I’ll break down your mobile-readiness options, giving you the pros and cons of each to help you choose the best path forward for your website and SEO strategy.

Option 1: Responsive Design

Responsive design determines the resolution of the screen on which a page is being viewed using media queries, then adjusts the size and layout of the page appropriately. Google has stated it prefers responsive web design, which makes it the heavyweight in this discussion.
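Under the hood, the mechanism is just CSS. Here’s a minimal sketch (the class names and breakpoint are made up for illustration): a two-column desktop layout that stacks into a single column on narrow screens:

```css
/* Desktop default: content and sidebar sit side by side */
.main-content { float: left;  width: 70%; }
.sidebar      { float: right; width: 28%; }

/* Screens 640px wide or narrower: stack everything full-width */
@media only screen and (max-width: 640px) {
  .main-content,
  .sidebar { float: none; width: 100%; }
}
```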

Pros:

  1. There’s only one version of each page. The same page adapts to the type of device displaying it (rather than detecting the type of device and then serving different content based on that). Having the same HTML and URL across all devices simplifies your site maintenance.
  2. Responsive design doesn’t rely on user-agent detection, as the other two options do. User-agent detection (i.e., detecting what browser or device is requesting a web page) isn’t bad in itself, but it’s not perfect, and if there’s a glitch in the process, users may get served the wrong version of your site. In addition, this saves the search engine spiders from having to crawl your site as several different user-agents — meaning more of your site gets crawled.
  3. Responsive generally loads more quickly in browsers. Because all devices get the same content, there’s no chain of request, user-agent detection, and possible redirection. And as anyone who’s ever been hungry and looked for a good restaurant on their smartphone knows, speed counts.

Cons:

  1. It can be a long and intensive process to redesign an existing site. So, if you’ve got a big site, moving to responsive may not be the best choice.
  2. Depending on the layout of your site, it may simply be too difficult to cram the contents onto a mobile screen. Sites like NYTimes.com maintain separate mobile sites because it’s easier to break the content up than it is to put it into a single page.
  3. Navigation elements don’t always resize well; hover-over elements don’t work on a touch-screen at all. So going responsive may mean changing your navigation.
  4. There have been instances where image-heavy pages have loaded more slowly with responsive design. I should stress that this is not the norm, but it has happened.

Should you opt for responsive design, keep in mind that you’ll want to optimize your images and test your site on various browsers and devices (or use a good user-agent emulator) before pushing it live.

Option 2: Dynamic Serving

Sometimes referred to as user-agent “sniffing,” dynamic serving can be done in two ways and is tricky to implement. In fact, Google outlines some common mistakes made with dynamic serving. What this technique does is detect a visitor’s user-agent (i.e., what device they’re using to view your site) and then redirects at the server level. One way to implement dynamic serving is unidirectional redirecting, in which links to a site default to the desktop site, but mobile devices get redirected from the desktop site to the mobile site.
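To make the unidirectional flavor concrete, here’s a rough sketch of server-level detection in an Apache .htaccess file. The user-agent pattern is deliberately simplistic; real lists run much longer, which foreshadows the maintenance burden discussed under the cons below:

```apache
# Send obviously mobile user-agents to the mobile host (simplified pattern)
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (iPhone|Android|Windows\sPhone) [NC]
RewriteRule ^(.*)$ http://m.example.com/$1 [R=302,L]
```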

The second type, bidirectional redirecting, has code on both the desktop and mobile sites, making sure that any visitor, regardless of device, is served the appropriate content. (These pieces of code are sometimes called switchboard tags.) Implementation means putting a rel=”alternate” tag on the desktop, pointing to the mobile site; and, on the mobile site, putting a rel=”canonical” tag pointing to the desktop site.
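The switchboard tags themselves amount to one line on each side, mirroring the annotations Google documents for separate mobile URLs (the example.com URLs are placeholders):

```html
<!-- On the desktop page (www.example.com/page): -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="http://m.example.com/page">

<!-- On the mobile page (m.example.com/page): -->
<link rel="canonical" href="http://www.example.com/page">
```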

Pros:

  1. Because the redirection is done at the server level, you only need one URL per page.
  2. Dynamic serving also works well for feature phones. As defined by PCMag.com, a feature phone is a “cellphone that contains a fixed set of functions beyond voice calling and text messaging, but is not as extensive as a smartphone.” For example, feature phones typically can’t download apps, but usually have some web browsing capability. Per Google, the biggest difference is that feature phones can’t process CSS, so they can’t handle responsive design very well. So it’s important to know your audience and what type of mobile devices they’re using.
  3. If you want to have a separate set of keywords specifically for your mobile users, then this option will let you do that since mobile users and desktop users each see distinct HTML. (Search Engine Land has a great article that discusses mobile-specific keywords.)

Cons:

  1. Dynamic redirecting doubles your site maintenance work because it sets up a separate site for mobile, with a separate set of indexed HTML requiring a separate SEO project.
  2. The necessary list of user-agent strings also requires constant maintenance, since new strings have to be added whenever a new mobile device is released.
  3. Lastly, keep in mind that you’ll need to serve a “Vary: User-Agent” HTTP header if your site serves content dynamically. The header helps content get served properly and helps engines cache it properly. Google has details on how to add this header.
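If you’re running Apache with mod_headers enabled, adding that header is a one-liner; this is a sketch, so adjust for your own server setup:

```apache
# Tell caches and crawlers that responses at this URL vary by user-agent
Header append Vary User-Agent
```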

Option 3: A Mobile Site

This option, as the name implies, involves creating a separate domain specifically for mobile users. The most common examples are m.domain.com or mobile.domain.com. It’s a popular option for large retailers; Bridget Randolph points out that “73% of websites ranked in the Quantcast Top 100,000 sites used URL redirects to a mobile specific URL.” Like dynamic serving, this technique involves developing content specifically for visitors using a mobile device; however, a separate mobile site’s URLs are distinct, so there is no server-level redirection.

Pros:

  1. For larger sites with page counts in the hundreds of thousands or millions, implementing responsive design may simply be too much work. A mobile site allows you to tailor your user’s experience, and slowly build up a unique mobile experience.
  2. Like dynamic serving, a mobile site is better for feature phones than responsive design. Depending on your site’s demographic, this may not be a criterion; but for some businesses, it’s an important consideration.

Cons:

  1. Your mobile site won’t benefit from any positive backlink profile that your desktop site has built up (unless you implement bidirectional redirects). So if you’re looking to get your mobile users to find you in organic search, this may be a real setback.
  2. Your mobile site will require some extra SEO work. You’ll have to submit a separate XML Sitemap to Google and Bing Webmaster Tools. Plus, smaller screens mean smaller SERPs, so you may need to edit your Meta tags. Mobile-specific Meta tags should be shorter than those for a desktop site.


In sifting through all of this information to make the right choice for your site, don’t forget to ask yourself how many of your visitors are using mobile devices to access the site. Check your analytics. If the total percentage of mobile traffic is under five percent, then you can probably wait to implement mobile design. For now. If the predictions are correct, then mobile usage will only continue to claim more and more Internet traffic.

For information on how to optimize your pages for speed and mobile SEO, we recommend starting with these resources:

All-In-One Mobile SEO and Design Checklist
