6 Useful Tools for Getting Structured Data Right

Discover the top tools for perfecting structured data on your website. From Schema Markup Generator to Ryte Structured Data Helper, ensure your data stands out in search results.


You’re excited about structured data and all the potential benefits to your website. But how do you know you’re doing it right? Enter the structured data toolbox. Here are six useful tools to get the job done.

Structured Data Markup Help

If you need help generating the code for your structured data, look no further than the following tools:

1. Merkle’s Schema Markup Generator

Merkle’s Schema Markup Generator creates JSON-LD markups for rich results, including all of the required item properties and more.

Screenshot of the Schema Markup Generator

You can test the code within the app and quickly copy the code to paste it elsewhere with a click of a button.
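To give you an idea of the output, here is a minimal JSON-LD snippet for an Article of the kind a generator like this typically produces. Treat it as a sketch: every name, URL and date below is a placeholder value, not real data.

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example Article Headline",
  "image": "https://www.example.com/images/example-image.jpg",
  "datePublished": "2023-11-20",
  "author": {
    "@type": "Person",
    "name": "Jane Doe"
  },
  "publisher": {
    "@type": "Organization",
    "name": "Example Publisher",
    "logo": {
      "@type": "ImageObject",
      "url": "https://www.example.com/images/logo.png"
    }
  }
}
</script>

You would paste a snippet like this into the HTML of the page it describes, then run it through a validator (such as the Rich Results Test covered below) before publishing.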

Supported types include:

  • Article
  • Breadcrumb
  • Event
  • FAQ page
  • How to
  • Job posting
  • Local business
  • Organization
  • Person
  • Product
  • Recipe
  • Video
  • Website

2. Google’s Data Highlighter

Data Highlighter can be accessed in Search Console and is a simple way to tag the data on a webpage. From Google:

“Data Highlighter is a webmaster tool for teaching Google about the pattern of structured data on your website. You simply use Data Highlighter to tag the data fields on your site with a mouse. Then Google can present your data more attractively — and in new ways — in search results and in other products such as the Google Knowledge Graph. …

For example, if your site contains event listings you can use Data Highlighter to tag data (name, location, date, and so on) for the events on your site. The next time Google crawls your site, the event data will be available for rich snippets on search results pages.”

Data Highlighter supports the following types:

  • Articles
  • Events
  • Local business
  • Restaurants
  • Products
  • Software applications
  • Movies
  • TV episodes
  • Books

Structured Data Testing Tools

If you need help validating the work you’ve done, here are some handy tools:

3. Google’s Rich Results Test

Validate your markup using Google’s Rich Results test. You can submit the URL of the page or a snippet of code. You can even choose which user agent.

Screenshot of the Rich Results Test

This test supports structured data in JSON-LD, RDFa and Microdata. Google recommends that you use the URL Inspection tool after using the Rich Results test to view how Google sees the page.

4. Schema Markup Tester

The Schema Markup Tester allows you to scan the markup on two pages for comparison. This is useful if you want to see which type of markup your competitor has that you don’t, for example. Or, if you want to compare two pages on your own website.

Screenshot of the Schema Markup Tester

Structured Data Testing Extensions

As part of your testing suite, here are some Chrome extensions worth checking out.

5. Ryte Structured Data Helper

The Ryte Structured Data Helper “highlights syntax errors, missing required properties, and displays all nested data in one location, so that you never need to leave the page.” Errors are flagged in red and warnings in orange. Plus, you can click an underlined label name to visit the Schema.org documentation and read more about requirements.

Screenshot of the Ryte Structured Data Helper

6. Search Console Rich Results Status Reports

In Google Search Console, you can view rich results status reports. This will tell you which rich results Google could or could not read and give help on troubleshooting errors.

So there you have it: Six handy tools that everyone should have when they are creating structured data markup. Did I leave any good ones out? Tell me in the comments below.

Need help with creating your structured data markup? Our SEO strategy includes structured data markup for all of our clients. Schedule a free 1:1 with us.

FAQ: How can I ensure I’m using structured data tools effectively for my website?

Structured data is the backbone of modern SEO strategies, and employing the right tools is paramount for success.

To ensure optimal utilization, start by selecting tools that align with your website’s content and goals. Merkle’s Schema Markup Generator, for instance, stands out for its versatility, creating JSON-LD markups for various content types, from articles to events.

Moving beyond tool selection, understanding the purpose of each tool is essential.

Google’s Data Highlighter simplifies the tagging process, teaching Google about the pattern of structured data on your site. This aids in presenting your data attractively in search results and other Google products like the Knowledge Graph. Note that this tool supports various content types, including articles, events and local businesses.

Validation is a crucial step in ensuring the correctness of your structured data implementation. Google’s Rich Results Test offers a comprehensive examination, supporting different markup formats. After this test, utilize the URL Inspection tool to view how Google interprets your page, ensuring a seamless presentation.

For a competitive edge, compare your structured data markup against competitors or different pages on your website using tools like Schema Markup Tester.

This comparative analysis provides insights into your markup’s strengths and areas of improvement. Chrome extensions like Ryte Structured Data Helper highlight syntax errors and missing properties, streamlining the validation process.

Finally, integrate Search Console Rich Results Status Reports into your routine. This feature offers a clear overview of how Google interprets your structured data, providing valuable troubleshooting insights. Regularly monitoring these reports ensures ongoing optimization and helps address any emerging issues promptly.

Mastering structured data tools involves a strategic approach to tool selection, understanding, validation and ongoing monitoring. Incorporating these tips into your SEO strategy empowers your website with the structured data needed to stand out in search results.

Step-by-Step Procedure:

  1. Select appropriate tools: Choose tools like Merkle’s Schema Markup Generator and Google’s Data Highlighter based on your website’s content.
  2. Understand tool purpose: Gain a deep understanding of each tool’s functions, such as the tagging process with Google’s Data Highlighter.
  3. Validation with Rich Results Test: Validate your markup using Google’s Rich Results Test, submitting the URL or code snippet.
  4. Utilize URL Inspection Tool: After using the Rich Results test, employ the URL Inspection Tool to understand how Google interprets your page.
  5. Comparative analysis: Use Schema Markup Tester to compare your markup against competitors or different pages on your website.
  6. Chrome extensions for validation: Leverage Chrome extensions like Ryte Structured Data Helper for highlighting errors and missing properties.
  7. Monitor with Search Console: Regularly check the Search Console Rich Results Status Reports for insights into Google’s interpretation of your structured data.

7 Proven Strategies To Increase Website Traffic for Your Business

Learn how to increase your website traffic, leads and sales with these effective digital marketing strategies.


Consumers today are increasingly self-sufficient — they want to soak up as much information about your business as possible before making the decision to buy.

So, where do they go for their research? Your website. That means businesses must focus on generating traffic in order to compete and make money.

In this blog, we will cover actionable ways to increase traffic to your website and land more sales:

  1. Search Engine Optimization
  2. Content Marketing
  3. Social Media Marketing
  4. Public Relations
  5. Email Marketing
  6. Targeted Online Advertising
  7. Analysis and Adaptation

FAQ: How can SEO increase traffic?

1. Search Engine Optimization

This one is a no-brainer. The goal of search engine optimization (SEO) is to drive traffic to your website. SEO improves websites so they can rank higher in search results. And when you rank higher, you yield more traffic to your website.

SEO comprises different approaches, including technical SEO, on-page SEO and off-page SEO. Some of the tactics include:

  • Conducting thorough keyword research
  • Developing a high-quality SEO content strategy
  • Doing thorough competitive research to inform your SEO program
  • Optimizing the performance of your website
  • Monitoring and fixing technical issues
  • Creating a clear website structure to benefit visitors and search engines
  • Optimizing each critical webpage with SEO best practices

When you look at your brand in a popular search engine such as Google, you want to review what searchers will see. How well does the description of your site match what you do and who you are? What pages show up under your main page when you search for your brand? As an example, you can see how “Bruce Clay” appears in Google when searched. It’s clear that Bruce Clay is an SEO company that offers search marketing services worldwide.

Google search result for the query "Bruce Clay."
Google search for Bruce Clay as a brand

As you work to increase website traffic for your business, you also want to ensure that people see what you want them to see with that increased traffic.

2. Content Marketing

“Content marketing is like a first date. If you only talk about yourself, there won’t be a second one.” – David Beebe, branded content producer.

Content is critical for ranking in the search results and providing value to your target audience.

But today, content is held to extremely high standards by Google. The content you create must provide value and must be helpful.

To rank well in search engines, your content must demonstrate experience, expertise, authoritativeness and trust, otherwise known as E-E-A-T. For some subjects, direct experience is critical.

Providing helpful information along the customer journey will lead visitors to your website when they’re ready to purchase.


3. Social Media Marketing

Social media can be an incredibly effective tool for increasing traffic to your site.

Establish yourself as an authority on platforms relevant to your target market and regularly post engaging updates that pique followers’ interest.

In addition, you need to manage your presence in the social media communities by interacting with followers and utilizing paid promotional campaigns to broaden your reach and bring in additional visitors to your website.

It’s important to remember that social media marketing extends beyond providing updates and information about your products or services. Of course, you want to establish your brand, but you also need to interact with your followers. Building an audience on social media includes engaging with the people who show up to your account.

As your audience grows, we recommend hiring a community manager who specifically focuses on engaging with your followers. Building a brand relationship with people interested in your brand is one of the best ways to increase conversion rates.

4. Public Relations

Gaining mentions and publications on influential websites in your niche or high-profile media outlets can significantly enhance your brand while driving traffic back to your website.

In the context of SEO, there are some rules to follow regarding public relations, specifically press releases. To learn more about that, see Why Press Releases Still Matter to SEO and How to Write a Press Release That Entices Media.

5. Email Marketing

Email marketing is a very effective strategy for reaching your audience and bringing traffic to your site. It also gives you a high return on investment — research shows that email marketing ROI yields $36 for every $1 spent.

Emailing your subscribers keeps them informed with news, updates, information or special promotions, and it makes them more likely to become repeat visitors to your website or customers of your products.

Email marketing is a complex strategy that involves different messaging for different audiences, so be sure that anything you send is helpful and relevant to the people receiving it.

6. Targeted Online Advertising

Google estimates that for every $1 you spend on Google Ads, you get $8 in return.

Digital advertising like Google Ads or social media ads lets you focus on targeting a specific audience to deliver uber-relevant ads.

Ad platforms allow advertisers to target potential customers based on demographics, interests and behavioral characteristics, creating highly effective digital campaigns.

This, in turn, allows you to drive targeted traffic to your site.

7. Analysis and Adaptation

Regularly monitoring website traffic with tools like Google Analytics and Google Search Console can help identify patterns, trends and areas ripe for optimization.

Studying visitor behaviors and the performance of your webpages gives you insights to adapt strategies accordingly.

Remember: Digital marketing is never done – it only evolves.

Increasing website traffic requires time and dedication, so be patient while staying aware of current digital trends.

Our SEO experts can help you increase website traffic, leads, conversions, sales and revenue. Reach out to us for a free consultation and let’s discuss how we can improve your SEO.

FAQ: How can SEO increase traffic?

SEO is a popular strategy for increasing traffic to your website and converting that traffic into customers. That is why the most successful businesses take SEO seriously; it has played a major part in their growth.

SEO is crucial because it involves optimizing content and website structure to increase your brand’s visibility in search engines. With the right SEO plan rooted in the proper techniques, traffic and conversions can grow significantly.

Search engines use complex algorithms to evaluate websites. By aligning your website with what these algorithms reward, you increase your chances of appearing higher in search results, leading to more people seeing and clicking on your website.

The first part of SEO is keyword research. Keywords are words or phrases people type into search engines when seeking information. By identifying relevant keywords related to your website content and including them in titles, headings and content for optimization purposes, you can help search engines understand what the site is about while increasing its visibility through relevant searches.

Content creation is another critical SEO tactic, with search engines favoring websites offering visitors valuable and pertinent information. By regularly adding fresh and engaging content, you can bring in more visitors while increasing the time they stay on your website; both will improve search rankings while simultaneously building up a loyal following for future campaigns.

Alongside optimizing your site’s content, you must prioritize building high-quality backlinks from other reputable websites pointing back to it. When other trustworthy sites link back to your website, this signals to search engines that your content is trustworthy and valuable, improving rankings and driving more organic traffic to your business.

As with all SEO-related activities, your efforts require constant monitoring and evaluation. Tools like Google Analytics measure website performance by tracking metrics such as visitor count, bounce rate and conversion rate, giving insight into areas for improvement while you fine-tune your SEO strategies.

By implementing these SEO techniques and staying abreast of current trends and best practices, you can drive more visitors to your website while increasing its presence in search engine results. Remember, SEO is an ongoing process; continue refining and optimizing strategies until long-term success has been reached.

Step-by-step Procedure for Optimizing Content on Websites:

  1. Research keywords to uncover relevant phrases and words for your website.
  2. Be sure to include these keywords throughout the content on your website, from title tags and meta descriptions to headings and subheadings.
  3. Ensure that the content provided to visitors is helpful, well-organized and intuitive for them to access.
  4. Building an authoritative network with high-quality links in your field is key to long-term business success.
  5. Consider writing guest blog posts on reliable sites and engaging your target market via social media and online communities.
  6. Analytical tools provide an invaluable way to monitor and analyze the performance of your website. Monitor who is visiting, their behaviors and any traffic sources resulting from visits.
  7. Analyze the collected data to quickly recognize trends and user preferences.
  8. Adopt a data-driven approach to optimizing your SEO strategy.
  9. Optimize your content based on feedback from visitors.
  10. Stay current on the search engine algorithm changes and SEO best practices.
  11. Optimize your Core Web Vitals page speed to enhance user experience and SEO rankings.
  12. Use social media to spread information about your website content.
  13. Utilize email marketing to inform your target audience about your website to increase traffic.
  14. Stay ahead of your competitors by performing regular analyses to pinpoint areas for improvement.
  15. Put different SEO techniques through their paces and observe how they impact website traffic.
  16. Tweak title tags and meta descriptions on your website to improve click-through rates from the search results.
  17. Employ structured data markups to elevate your website in search engine result pages and boost its rank.
  18. Continually review, analyze and adapt SEO strategies to maintain and expand visibility and traffic for long-term, sustainable growth.

7 Keyword Suggestion and Research Tools to Figure Out the Right Keywords for Your SEO Strategy

Revolutionize your SEO strategy by learning how to optimize content and target your audience effectively for enhanced search engine visibility. Try these essential tools and techniques.


Keyword research is a big job, and you need the right tools to do it successfully. There are plenty of keyword suggestions and research tools on the market — here are seven of them (in alphabetical order) you should be using to better meet your audience’s needs and rank better in search.

  1. Answer The Public
  2. Bing Webmaster Tools Keyword Research
  3. Google Ads Keyword Planner 
  4. Google Trends
  5. Keyword Generator (Ahrefs)
  6. Keyword Suggestions (SEOToolSet)
  7. QuestionDB

FAQ: How can I leverage keyword research tools to enhance my content strategy and SEO efforts?

1. Answer The Public

Answer The Public serves questions and phrases related to your keyword by using autocomplete data from search engines.

Add the data to your keyword research list, then write content to address each relevant question or phrase. You can write a separate webpage for each question or phrase, or include a bunch of them within one article (you might even rank for a featured snippet if you do it the right way).

Answer The Public search results for the query "cat sweaters."
Image source: AnswerThePublic.com

You can do a couple of searches for free each day, or upgrade to a paid subscription.

2. Bing Webmaster Tools Keyword Research

If you’re trying to compete in Bing, then you’ll want to use its Webmaster Tools keyword research tool to discover the keywords and phrases people are using on the Bing search engine.

Take your seed list of keywords and input them into the tool. The tool will suggest matching or related keywords, along with search volume and trend data. Keyword suggestions fall into three categories: related, questions, or newly discovered.

Bing Webmaster Tools keyword research results for the query “how to do SEO”

The tool also provides the top-ranked URLs for the root keywords and can give data on the keywords that are already driving traffic to your website. There are all sorts of filters you can apply to refine the data as well.

Bing Webmaster Tools keyword research feature is free for those who have an account.

3. Google Ads Keyword Planner

The Google Ads Keyword Planner is not just for advertisers; it’s also a handy SEO keyword research tool for websites competing in Google.

Input your seed list and the keyword suggest tool will help you find the most relevant keywords. Or, you can enter your website and Google will look for keywords related to the content.

The Keyword Planner gives data on search volume and will provide bid estimates for advertisers (which can also help gauge how competitive a keyword in organic search will be). You can narrow down your search by using various filters as well.

Screenshot of Google Ads Keyword Planner results for the query "SEO services."
Image source: Google Ads

Keyword Planner is free for anyone who has a Google Ads account.

4. Google Trends

Google Trends offers — you guessed it — trends on search queries. You can enter a given keyword and get various data points on it, including:

  • Interest over time
  • Interest by region
  • Related topics
  • Related queries

You can also apply filters like what type of search — web search, image search, news search, etc.

Screenshot of Google Trends comparison results for "SEO services" and "how to do SEO."
Image source: Google Trends

You can also check out the Trending Now page to see what’s currently trending around the world. This can be useful for writing on timely topics.

Google Trends is a free keyword research tool.

5. Keyword Generator (Ahrefs)

Ahrefs’ Keyword Generator is a keyword suggestion tool that lets you enter up to 10 words or phrases and generates keywords for the search engine of your choice (there are nine of them), including Google, Bing, YouTube, Amazon and more. You can also filter by country.

The data is organized into six categories of keyword types: phrase match, keywords containing the same terms as your seed list, keywords the top-competing pages also rank for, search suggestions via autocomplete, newly discovered keywords and question formats.

Get data like keyword difficulty scores and search volume for each keyword listed. You can use filters to find those keywords with good search volume and low competition (the sweet spot!).

Screenshot of Ahrefs keyword idea results for "SEO."
Image source: Ahrefs.com

Also, find out any SERP features associated with that keyword (a must when you are doing a whole-SERP SEO strategy). And, you can use this tool to find out SEO metrics on the top-ranked webpages for a term, too.

You can use the free version with limited functionality, or try a paid trial for a small fee. After that, if you upgrade to a subscription, the Lite version is $99 per month, and the Standard version is $199 per month at the time of writing.

6. Keyword Suggestions (SEOToolSet)

Of course, we have to mention our solution to keyword research, too, and that’s our Keyword Suggestions tool, which is part of our SEOToolSet®.

With the Keyword Suggestions tool, you can find terms that are semantically related to the keywords in your seed list. The tool can provide search activity for each keyword, three metrics to indicate competitiveness, categories, and a trending chart.

Results from the Bruce Clay SEOToolSet Keyword Suggestions Tool.
Image source: SEOToolSet Keyword Suggestions Tool

The SEOToolSet allows you to research keywords further by presenting more data, including keyword relative “activity” as a search query.

You can use the Keyword Suggestions tool for:

  • SEO research
  • Content planning
  • Discovering word associations for video descriptions
  • Finding keywords you might want to exclude from your PPC campaign, and more.

The SEOToolSet offers a free trial and costs $24.95 per month after that. The free version of the Keyword Suggestions tool will serve up five related words and phrases for your keyword (entered one at a time), pulled from search engine data.

7. QuestionDB

QuestionDB allows you to enter a broad keyword and find relevant questions related to it, pulled from its database of more than 32 million questions that “have been asked on various websites over time.”

QuestionDB sample results for the query "protein powder."
Image source: QuestionDB.com

The generated list will give you the relevant questions for keywords, plus volume and difficulty data from DataForSeo, an SEO data API provider. (“Keyword difficulty represents the median backlink profile strength of the top 10 webpages ranking for a specific keyword,” according to DataForSEO.) From here, you can view related topics and download the list of questions.

QuestionDB is free with limited functionality, offering 50 results per search. You can upgrade for deeper dives into the data and unlimited searches for $15 per month at the time of writing.

Keyword research is no small feat, and the right tools can help you get the keywords most relevant to your audience and your business.

Don’t have the time or resources to conduct thorough keyword research? Let our SEO experts do the work for you. Schedule a free 1:1 consultation to discuss how we can help.

FAQ: How can I leverage keyword research tools to enhance my content strategy and SEO efforts?

Navigating the complexities of content strategy and search engine optimization demands strategic precision. Keyword research tools are beacons of insight, guiding your content toward relevance and your SEO efforts toward success.

Engaging in a comprehensive content strategy necessitates more than just high-quality writing; it involves strategic keyword integration. Keyword research tools serve as the compass in this journey. By delving into these tools, you uncover trending keywords and gain insights into what your target audience seeks. This valuable understanding empowers you to curate content that resonates deeply with your readers.

Seamless integration of meticulously researched keywords directly influences your search engine rankings. Search engines thrive on relevancy, and these tools offer you the means to align your content with user intent. As you strategically pepper your articles with these keywords, you provide search engines with clear signals, ultimately enhancing your chances of climbing the results ladder.

One key advantage of leveraging keyword research tools lies in uncovering untapped niches. These tools illuminate the uncharted terrain of long-tail keywords, where competition might be less fierce and relevance is more attainable. Incorporating these hidden gems into your content enriches your strategy and helps you reach a more targeted audience.

Beyond keyword discovery, these tools facilitate monitoring and adaptation. Continuously tracking keyword performance offers insights into user behavior shifts. As search trends evolve, these tools empower you to adjust your strategy in real time, ensuring your content remains aligned with your audience’s evolving needs.

Keyword research tools transcend mere words; they are the compass, guide and beacon illuminating your path to successful content strategy and SEO. By harnessing their power, you position your content to resonate with audiences and increase search engine ranks.

Step-by-Step Procedure: How To Leverage Keyword Research Tools for Enhanced Content Strategy and SEO

  1. Understand the significance of keyword research in content strategy and SEO.
  2. Familiarize yourself with different keyword research tools available in the market.
  3. Research your target audience’s preferences, interests and pain points.
  4. Explore trending keywords related to your niche using keyword research tools.
  5. Identify long-tail keywords that align with your content’s focus.
  6. Analyze the search volume and competition level of selected keywords.
  7. Prioritize keywords with a balance of search volume and competition.
  8. Incorporate relevant keywords organically into your content.
  9. Ensure that keywords match user intent and enhance the overall reader experience.
  10. Monitor the performance of integrated keywords using tracking tools.
  11. Adapt your content strategy based on emerging keyword trends.
  12. Utilize keyword research tools to identify content gaps and potential niches.
  13. Create content that addresses specific long-tail keywords and user queries.
  14. Incorporate keywords in key areas such as headings, subheadings and meta descriptions.
  15. Regularly update your content to reflect evolving keyword trends.
  16. Engage in competitor analysis to identify keywords driving their success.
  17. Leverage keyword research tools to refine and optimize paid advertising campaigns.
  18. Collaborate with your SEO team to align keyword research with technical optimization.
  19. Continuously educate yourself on emerging SEO and content trends.
  20. Keep a pulse on your audience’s preferences to consistently refine your keyword strategy.

How to Remove a Webpage

Sometimes it is necessary to remove a webpage from your site. But taking pages down could impact SEO or user experience. Learn how to remove webpages without causing SEO or user experience issues on your site.

Sometimes webpages are no longer relevant to your business. Or you could have other reasons to remove them. But taking pages down could cause SEO or user experience issues.

What do you do?

In this article, I will cover why you might remove webpages and how to go about doing it to minimize the impact on SEO.

Why Remove Webpages?

Outdated content is one of the most common reasons for wanting to remove a webpage. That’s logical when a page isn’t worth the effort to update or rewrite.

Sometimes, you may want to remove a webpage because it’s no longer relevant. Other times, you might need to remove an entire section of your website, for example, if you no longer provide the service the pages are talking about.

Of course, some website owners may need to remove content that is harming their site. But that is another topic.

SEO Impact of Removing Webpages on Your Site

When you publish a webpage, it can build SEO value over time.

If it earns links and other ranking signals, and then ranks and brings in traffic, taking that page down cuts off those rankings and that traffic.

If a person follows the link to your webpage from another site and the page no longer exists, they’ll get a “404 not found” message.

This can impact the user experience. The person will likely click away from your site (more lost traffic) unless you handle the 404 well. More on that later.

So you want to be careful about how you handle webpage removals. Luckily, there are easy ways to preserve the value of those pages so that your website and users still benefit.

How Do You Remove a Webpage?

When you want to get rid of an old webpage from your site, you have four options:

  • Update the content.
  • Use a 301 redirect.
  • Unpublish the webpage (and send any visitor to your custom 404 error page).
  • Use a 410 status code.

Option 1: Update the Content

OK, so updating the content is not removing it. But updating content is often the best solution to outdated content.

In fact, refreshing old content is an SEO best practice, and keeping web content up to date can improve relevancy and rankings. (No doubt you’ve heard me emphasize content maintenance in a healthy SEO program.)

So before you remove a webpage, first figure out if you can refresh the content on that URL to make the information current.

You’ll want to make sure that the content stays on the same topic it covers today. In other words, don’t change the page to a new topic; rewrite the content as needed.

Example: If a website republishes a new research report every year, the page keeps the same focus but with new data. The website should simply update the content at that same URL with this year’s report and highlights.

Other times, certain topics can be refreshed to make them more evergreen. This might look like updating the statistics, current trends, and the angle to bring it up to speed.

Remember, Google rewards webpages that keep their content up to date.

Option 2: Do a 301 Redirect

Before you kill a page, find out if there is a next-best webpage on your site to redirect that page to. If so, you can do a 301 redirect.

A 301 redirect sends the user from Page A (the page you want to drop) to Page B (the new location).

A 301 redirect also makes the search engine index the new page (Page B) and drop the old page (Page A) from the index. Plus, it transfers the inbound link authority of Page A to Page B in the process.

All in all, it’s a win-win for SEO.
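How you implement the redirect depends on your server and CMS. As a rough sketch, on an Apache server with mod_alias enabled you might add a rule like this to your .htaccess file (both paths here are placeholders):

# Permanently redirect the retired page (Page A) to its replacement (Page B)
Redirect 301 /old-page/ https://www.example.com/new-page/

On nginx the equivalent is a return 301 rule inside the relevant location block, and many CMSs (including WordPress) have redirect plugins that manage this for you.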

One caveat: Make sure that the page you are redirecting to is topically relevant to the original webpage. Otherwise, it may be confusing to users and search engines as to why they are being brought to a webpage that is not relevant to what they were expecting.

If you cannot redirect to a relevant webpage, in some cases, you might redirect to a relevant category page on the website.

A last resort is to redirect to the homepage, but this is done on a case-by-case basis.


Option 3: Unpublish the Webpage (and Create a Custom 404 Page)

When you’ve exhausted other options like updating the webpage’s content or 301 redirecting the page, sometimes all that is left is to delete the page.

In these cases, your server should return a 404 “not found” error message to anyone who follows a link to that page from somewhere else.

It’s a bummer for user experience, but it does present a unique opportunity to help users explore new content. And that is by creating a custom 404 page.

A custom 404 page is a webpage that is served when a user gets a 404 error. This webpage should have helpful info on what the user can do next. For example, you can give links to other resources on your website instead.

Custom 404 page on BruceClay.com

This could help capture some of the traffic you would have otherwise lost if you didn’t have an engaging 404 page.
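Pointing visitors to your custom page is often a one-line server setting. For example, on an Apache server you might specify the error document in .htaccess like this (the file path is a placeholder; WordPress themes usually handle this for you through a 404.php template instead):

# Serve the custom error page whenever a requested URL returns 404 Not Found
ErrorDocument 404 /custom-404.html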

In terms of SEO, pages that are deleted and serve a 404 will usually be removed from the Google index when the site is crawled again. Usually, that’s just what you want after you remove a webpage. But keep in mind that you’re not going to rank for those keywords anymore unless you have another, better page on the same topic.

That said, I recommend running a report to identify 404s regularly, and then seeing if any of them can be a 301 redirect instead. Google Search Console is a good place to start.

But in any case, a custom 404 page will work nicely to redirect users to a new resource.


Option 4: Use a 410 Status Code

A 410 status code tells the search engines that the page is permanently gone.

Google treats 404s and 410s similarly. And Google has clarified this on more than one occasion (see this Search Engine Journal article and this Search Engine Land article for more details on Google’s stance).

Most sites default to using 404 errors for not-found pages. One exception is the Salesforce Commerce Cloud e-commerce platform, and there may be others. So if you use 410 status codes on your site, remember that you also need a custom 410 error page for users, which can be just like your custom 404 page.
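As a hedged example of how a 410 might be returned, an Apache server with mod_alias enabled can mark an individual URL as gone via .htaccess (the path below is a placeholder):

# Return HTTP 410 Gone for a permanently removed page, rather than 404 Not Found
Redirect gone /discontinued-page/

Other servers and platforms have their own equivalents, so check your documentation before relying on this exact syntax.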

Final Thoughts

Removing webpages is sometimes a necessary thing. Whether it’s just regular website maintenance or the need to get rid of no-longer-valid content, rest assured there are ways to handle this to minimize the SEO and user experience impact.

If you’d like help identifying weak areas in your website content, our expert SEO and content teams can help. Contact us to get a free quote and services consultation.

FAQ: How can I ensure successful webpage removal without compromising SEO using strategic methods?

When it comes to removing webpages from your website, maintaining a delicate balance between enhancing user experience and preserving SEO integrity is paramount. Webpage removal is a necessary task, but if not handled strategically, it can lead to negative consequences. Here, we delve into proven strategies to ensure successful webpage removal without compromising your site’s SEO efforts.

Before you embark on webpage removal, it’s crucial to assess the reasons behind it. Outdated content, irrelevance, or the need to enhance site performance are common triggers. Start by conducting a comprehensive content audit to identify which pages warrant removal. By understanding the underlying reasons, you can strategically plan your removal approach.

One of the most effective strategies is to utilize 301 redirects. When removing a webpage, redirecting its traffic to a relevant, existing page helps maintain user experience and preserves SEO value. Carefully map out redirections to ensure they lead users to related content, preventing frustration and preserving valuable inbound link authority.

Creating custom 404 error pages is another strategic method. While a 404 error signifies a missing page, a well-designed custom 404 page can turn this into an opportunity. Guide users to explore other relevant resources on your site, enhancing their engagement and minimizing the negative impact of a missing page.

Implementing a 410 status code is a more decisive approach, signaling to search engines that a page is permanently gone. While similar to a 404, a 410 communicates the page’s intentional removal, aiding in faster deindexing by search engines. Be mindful, however, of providing alternative resources or redirects to prevent a negative user experience.

Regularly monitor the effects of your removal strategies using tools like Google Search Console. Identify any potential issues or negative impacts on SEO performance and address them promptly. By proactively managing the aftermath of webpage removal, you can make data-driven adjustments to fine-tune your approach and further optimize your site.

Step-by-Step Procedure: Ensuring Successful Webpage Removal Without Compromising SEO

  1. Identify the Reasons: Conduct a thorough analysis to understand why a webpage needs to be removed.
  2. Content Audit: Perform a comprehensive content audit to identify pages that require removal.
  3. Prioritize Pages: Determine which pages have the least SEO value or relevance.
  4. 301 Redirects: Redirect traffic from removed pages to relevant, existing content.
  5. Map Redirections: Ensure that redirected pages align thematically and contextually.
  6. Custom 404 Pages: Create engaging custom 404 error pages with helpful links.
  7. Prevent Frustration: Guide users to alternative resources on custom 404 pages.
  8. 410 Status Code: Implement a 410 status code for pages that need to be permanently removed.
  9. Search Engine Communication: Signal intentional removal to search engines with a 410 code.
  10. User-Friendly 404s: Enhance the user experience on 404 and 410 error pages.
  11. Regular Monitoring: Use Google Search Console to track the impact of removals.
  12. Analyze Data: Identify any negative effects on SEO or user engagement.
  13. Adjust Strategies: Based on data, fine-tune removal methods and redirections.
  14. User Experience Enhancement: Continuously optimize custom error pages for user engagement.
  15. Inbound Link Preservation: Maintain valuable inbound link authority through redirections.
  16. Redirect Relevance: Ensure redirected pages are thematically similar to the original content.
  17. Data-Driven Decisions: Make informed adjustments based on search console data.
  18. Monitor for Success: Continuously assess the impact of removal strategies.
  19. Adapt and Refine: Regularly review and optimize webpage removal techniques.
  20. SEO-Friendly Site: Maintain a healthy site structure and user experience post-removal.

By following these steps, you can confidently navigate webpage removal while safeguarding your site’s SEO performance and user experience. Your strategic approach will help maintain a cohesive and optimized online presence.

How to Get Rid of Duplicate Content in WordPress

In SEO, two webpages that appear too similar can cause double trouble. Find out how to get rid of duplicate content issues in WordPress with an SEO plugin.


It is often said that two are better than one. But in SEO, two webpages that appear too similar can cause double the trouble. This is what we call duplicate content.

What exactly is duplicate content? Google defines it in the following way:

Duplicate content generally refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar.

There are things that can cause duplicate content on your website. For example, you could have:

  1. Meta information duplicates
  2. Similar content across pages
  3. Boilerplate content on all pages
  4. Two different site versions
  5. A separate mobile site
  6. Trailing slashes on URLs
  7. CMS problems
  8. Parameterized pages
  9. Product description issues

These are all common issues for websites, and you can fix them. For a deeper dive (especially on what types of duplicate content cause problems for SEO), I recommend you read our guide on duplicate content.

Today, I’m going to dive into how to deal with the issue on a WordPress site specifically.

Now let’s focus on how to avoid duplicate content issues that fall into the category of duplicate meta information. And you’ll be able to do this easily with a WordPress SEO plugin.

A Quick Primer on Meta Info and Duplicate Content Issues

If you’re already well-versed on meta info and duplicate content issues, you can skip to the next section to learn about how you can fix it with a WordPress SEO plugin.

If you’re still here, let’s talk about the duplicate content issue.

Meta tags are on the code side of a webpage. They are meta information (or meta data) about a page, including the title and description. Meta data is one of the first pieces of code that a search engine encounters when it crawls a page.

Meta tags in the HTML page source of a BruceClay.com article

The meta tags tell the search engine what the page is about. That’s why it’s so important that the titles and meta descriptions be accurate and unique within your website.
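For reference, the title and meta description live in the head section of a page’s HTML. A minimal example, with placeholder text, looks like this:

<head>
  <!-- The title tag tells search engines (and searchers) what the page is about -->
  <title>Blue Widgets for Small Gardens | Example Store</title>
  <!-- The meta description is often used as the snippet shown in search results -->
  <meta name="description" content="Compare lightweight blue widgets designed for small gardens, with sizing tips and care instructions.">
</head>

Each page on your site should have its own unique pair of these tags.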

As smart as Google’s ranking algorithm is at figuring out the topic of a webpage, it still needs help with context. Because it can’t read like you or I can.

So the meta information, especially the meta title, is critical for communicating the topic of the page. This, in turn, helps Google understand that the webpage is a good match for a search query.

It’s a common SEO issue: meta tags on a site have duplicate or similar text. This can be especially true on large websites with hundreds or thousands of webpages.

And it can happen for a variety of reasons: meta tags may not be a priority, automation software may generate the same tags across many webpages, and so on.

Regardless of the reason, the same or similar meta tags can cause duplicate content issues within a site. Again, this type of duplicate content is especially important to fix because meta tags are the first chance at communicating to the search engine what the page is about.

Will you suffer a Google penalty because of it? No. Google does not have a duplicate content penalty unless the duplicate content is deceptive.

From Google:

Duplicate content on a site is not grounds for action on that site unless it appears that the intent of the duplicate content is to be deceptive and manipulate search engine results.

This usually occurs when sites copy content from other sites and claim it as their own.

For duplicate content within a website, though, you can suffer some consequences. And that consequence is your webpages being filtered from the search results.

When faced with two pages whose contents appear too similar, Google picks the page that it believes to be the best for the query. And it leaves the other page out of the results.

But this may or may not be the page you want to show up in the search results, so you want to avoid filtering.

How to Get Rid of Duplicate Meta Data with a WordPress Plugin

Until now, it has not been easy for WordPress users to identify duplicate content issues right in WordPress. But with our WordPress SEO plugin, it’s simple to get this data.

1. Install the Bruce Clay SEO WP Plugin

If you’re not already a user of our WordPress SEO plugin, here’s how you can get started:

Get a free trial here. We offer an affordable monthly plan at $24.95 thereafter with access to all WordPress SEO functionality plus our SEOToolSet® if you want more analytics and reports.

Installation is quick and easy, and you have two options. One way is to download the Bruce Clay SEO plugin from the WordPress repository here.

Another way is to install the plugin from within your WordPress site by going to WP admin > Plugins > Add New and searching for “Bruce Clay.”

2. Set Up and Sync the Plugin

This step will sync all published content on your website with the toolset. You’ll synchronize your content when you first set up the plugin, from the Settings tab.

Synchronize content in plugin settings.

3. Review the Activity Tab for Duplicate Titles and Descriptions

See which pages on your site pose duplicate content issues at the metadata level. Our WordPress SEO plugin runs a check when pages are published or synched.

(A sync happens when the page is published, the “analyze content” button is pressed, or when a sync is manually run from the Settings screen.)

It then reports on any potential duplicate content issues from the same meta information.

Activity tab in Bruce Clay SEO plugin.

4. Click on the Page or Post

Select a page or post and automatically bring it up in the WordPress editor so you can adjust the title or description.

Click the post title to edit content.

5. Make Changes to the Meta Info

If you are using Yoast SEO, you would adjust the title and description there. If you are using our title and description feature, you can adjust it there. (Note: If Yoast is active on your site, the Bruce Clay SEO plugin automatically hides these fields to prevent any confusion.)

Title and meta description editable in plugin.
Bruce Clay SEO WP plugin’s title and description editor

Yoast SEO title and description fields.
Yoast SEO plugin’s title and description editor

6. Mark as Done

Mark the alert as “done” in the Activity tab. The alert will go away immediately.

Alert marked as done in Activity tab.


Get rid of the duplicate content in your WordPress site starting today. Get your free trial of our WordPress SEO plugin now.

FAQ: How can I effectively manage duplicate content issues on my website?

Duplicate content issues can significantly impact your website’s search engine ranking and user experience. Effectively managing these issues requires a combination of technical expertise and strategic planning. This article offers valuable insights to help you tackle duplicate content and optimize your website for better performance.

Identifying Duplicate Content

Identifying duplicate content is the first step in managing the issue. Utilize tools such as website crawlers to analyze your site’s content and identify instances of duplication. Look for similar content across different pages, repetitive meta information, and boilerplate text that may negatively affect your SEO efforts.

Addressing Duplicate Content

Once identified, it’s crucial to address duplicate content promptly. Begin by creating unique and compelling meta titles and descriptions for each page. Update or rewrite repetitive content to provide valuable insights and engage your audience. Ensuring that each page offers distinct and valuable information enhances your site’s relevance and credibility.

Preventing Duplicate Content

Prevention is key to maintaining a website free of duplicate content. Implement best practices such as canonical tags to indicate the preferred version of a page to search engines. Regularly audit your site for redundant content and use redirects to consolidate similar pages. You enhance your site’s overall SEO performance by proactively preventing duplicate content.
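A canonical tag is a single line in the head section of a page that points search engines to the version you want indexed. For example (the URL is a placeholder):

<!-- On each duplicate or near-duplicate variant, reference the preferred URL -->
<link rel="canonical" href="https://www.example.com/preferred-page/">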

User-Friendly Navigation

A well-structured website with intuitive navigation can also help prevent duplicate content issues. Ensure that your site’s URLs are clear and descriptive, making it easier for both users and search engines to understand the hierarchy of your content. Utilize breadcrumb navigation and internal linking to guide users through your site while maintaining a coherent content structure.

Regular Monitoring and Optimization

Managing duplicate content is an ongoing process. Regularly monitor your website for any new instances of duplication that may arise. Stay updated with the latest SEO guidelines and best practices to optimize your content strategy continuously. By staying vigilant and proactive, you can maintain a high-quality website that delivers value to both users and search engines.

Step-by-Step Procedure: How to Effectively Manage Duplicate Content Issues on Your Website

  1. Identify duplicate content using website crawling tools.
  2. Analyze similarities across pages and examine meta-information.
  3. Determine the scope and extent of the duplicate content issue.
  4. Prioritize pages with the highest impact on SEO and user experience.
  5. Create unique and informative meta titles and descriptions.
  6. Rewrite or update redundant content to offer distinct value.
  7. Utilize canonical tags to guide search engines to the preferred page.
  8. Implement 301 redirects to consolidate similar content.
  9. Audit your website’s navigation and URL structure.
  10. Optimize breadcrumb navigation for easy user navigation.
  11. Use internal linking to guide users through your content.
  12. Regularly monitor your website for new instances of duplication.
  13. Stay updated with evolving SEO guidelines and best practices.
  14. Continuously optimize your content strategy to prevent future issues.

What Is robots.txt? A Beginner’s Guide to Nailing It with Examples

The one technical SEO element you don’t want to get wrong is robots.txt. So here’s a handy guide that explains why every website needs it and how to create one.

Ah, robots.txt — one teeny tiny file with big implications. This is one technical SEO element you don’t want to get wrong, folks.

In this article, I will explain why every website needs a robots.txt and how to create one (without causing problems for SEO). I’ll answer common FAQs and include examples of how to execute it properly for your website. I’ll also give you a downloadable guide that covers all the details.


What Is robots.txt?

Robots.txt is a text file that website publishers create and save at the root of their website. Its purpose is to tell automated web crawlers, such as search engine bots, which pages not to crawl on the website. This is also known as the robots exclusion protocol.

Robots.txt does not guarantee that excluded URLs won’t be indexed for search. That’s because search engine spiders can still find out those pages exist via other webpages that are linking to them. Or, the pages may still be indexed from the past (more on that later).

Robots.txt also does not absolutely guarantee a bot won’t crawl an excluded page, since this is a voluntary system. It would be rare for major search engine bots not to adhere to your directives. But bad web robots, like spambots, malware and spyware, often do not follow the rules.

Remember, the robots.txt file is publicly accessible. You can just add /robots.txt to the end of a domain URL to see its robots.txt file (like ours here). So do not include any files or folders that may include business-critical information. And do not rely on the robots.txt file to protect private or sensitive data from search engines.

OK, with those caveats out of the way, let’s go on…

Why Is robots.txt Important?

Search engine bots have the directive to crawl and index webpages. With a robots.txt file, you can selectively exclude pages, directories, or the entire site from being crawled.

This can be handy in many different situations. Here are some situations you’ll want to use your robots.txt:

  • To block certain pages or files that should not be crawled/indexed (such as unimportant or similar pages)
  • To stop crawling certain parts of the website while you’re updating them
  • To tell the search engines the location of your sitemap
  • To tell the search engines to ignore certain files on the site, like videos, audio files, images, PDFs, etc., and not have them show up in the search results
  • To help ensure your server is not overwhelmed with requests*

*Using robots.txt to block off unnecessary crawling is one way to reduce the strain on your server and help bots more efficiently find your good content. Google provides a handy chart here. Also, Bing supports the crawl-delay directive, which can help to prevent too many requests and avoid overwhelming the server.
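As a small illustration, a robots.txt file can declare the sitemap location for all crawlers and ask Bing’s crawler to wait between requests (Google ignores the crawl-delay directive). The URL and delay value below are placeholders:

User-agent: bingbot
Crawl-delay: 10
Disallow:

User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml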

Of course, there are many applications of robots.txt, and I’ll outline more of them in this article.

But, Is robots.txt Necessary?

Every website should have a robots.txt file, even if it is blank. When search engine bots come to your website, the first thing they look for is a robots.txt file.

If none exists, then the spiders are served a 404 (not found) error. Although Google says that Googlebot can go on and crawl the site even if there’s no robots.txt file, we believe it is better for the first file a bot requests to load successfully than to return a 404 error.

What Problems Can Occur with robots.txt?

This simple little file can cause problems for SEO if you’re not careful. Here are a couple of situations to watch out for.

1. Blocking your whole site by accident

This gotcha happens more often than you’d think. Developers can use robots.txt to hide a new or redesigned section of the site while they’re developing it, but then forget to unblock it after launch. If it’s an existing site, this mistake can cause search engine rankings to suddenly tank.

It’s handy to be able to turn off crawling while you’re preparing a new site or site section for launch. Just remember to change that command in your robots.txt when the site goes live.

2. Excluding pages that are already indexed

Blocking already-indexed pages in robots.txt causes them to get stuck in Google’s index.

If you exclude pages that are already in the search engine’s index, they’ll stay there. In order to actually remove them from the index, you should set a meta robots “noindex” tag on the pages themselves and let Google crawl and process that. Once the pages are dropped from the index, then block them in robots.txt to prevent Google from requesting them in the future.
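
As a sketch of that sequence, the tag you would place in the <head> of each page to be removed looks like this:

<meta name="robots" content="noindex">

Once Google has recrawled the pages and dropped them from the index, the Disallow rule for those URLs can safely be added to robots.txt.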

How Does robots.txt Work?

To create a robots.txt file, you can use a simple application like Notepad or TextEdit. Save it with the filename robots.txt and upload it to the root of your website as www.domain.com/robots.txt, since this is where spiders will look for it.

A simple robots.txt file would look something like this:

User-agent: *
Disallow: /directory-name/

Google gives a good explanation of what the different lines in a group mean within the robots.txt file in its help file on creating robots.txt:

Each group consists of multiple rules or directives (instructions), one directive per line.

A group gives the following information:

  • Who the group applies to (the user agent)
  • Which directories or files that agent can access
  • Which directories or files that agent cannot access

I’ll explain more about the different directives in a robots.txt file next.

Robots.txt Directives

Common syntax used within robots.txt includes the following:

User-agent

User-agent refers to the bot to which you are giving the commands (for example, Googlebot or Bingbot). You can have multiple directives for different user agents. But when you use the * character (as shown in the previous section), that is a catch-all that means all user agents. You can see a list of user agents here.
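
For illustration, a file with separate groups for specific bots plus a catch-all group might look like this (the folder names are hypothetical):

User-agent: googlebot
Disallow: /testing/

User-agent: bingbot
Disallow: /testing/
Disallow: /archive/

User-agent: *
Disallow: /archive/

A crawler obeys only the most specific group that matches its user agent, so in this sketch Googlebot would follow the googlebot group and ignore the catch-all group.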

Disallow

The Disallow rule specifies the file, folder or entire directory to exclude from web robot access. Examples include the following:

Allow robots to spider the entire website:

User-agent: *
Disallow:

Disallow all robots from the entire website:

User-agent: *
Disallow: /

Disallow all robots from “/myfolder/” and all subdirectories of “myfolder”:

User-agent: *
Disallow: /myfolder/

Disallow all robots from accessing any file beginning with “myfile.html”:

User-agent: *
Disallow: /myfile.html

Disallow Googlebot from accessing files and folders beginning with “my”:

User-agent: googlebot
Disallow: /my

Allow

This command tells Googlebot (and other major crawlers that support the Allow rule) that it can access a subdirectory folder or webpage even when its parent directory or webpage is disallowed.

Take the following example: Disallow all robots from the /scripts/ folder except page.php:

User-agent: *
Disallow: /scripts/
Allow: /scripts/page.php

Crawl-delay

This tells bots how many seconds to wait between requests to a webpage. Websites might use this to preserve server bandwidth. Googlebot does not recognize this command, and Google asks that you change the crawl rate via Search Console instead. Avoid Crawl-delay if possible, or use it with care, as it can significantly impact the timely and effective crawling of a website.
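
For crawlers that do support it, such as Bingbot, the directive sits inside a user-agent group and gives the number of seconds to wait between requests. A minimal sketch:

User-agent: bingbot
Crawl-delay: 10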

Sitemap

Tell search engine bots where to find your XML sitemap in your robots.txt file. Example:

User-agent: *
Disallow: /directory-name/
Sitemap: https://www.domain.com/sitemap.xml

To learn more about creating XML sitemaps, see this: What Is an XML Sitemap and How do I Make One?

Wildcard Characters

There are two characters that can help direct robots on how to handle specific URL types:

The * character. As mentioned earlier, it can apply directives to multiple robots with one set of rules. It is also used to match any sequence of characters within a URL in order to disallow URLs that share a pattern.

For example, the following rule would disallow Googlebot from accessing any URL containing “page”:

User-agent: googlebot
Disallow: /*page

The $ character. The $ tells robots that the pattern must match at the very end of a URL. For example, you might want to block the crawling of all PDFs on the website:

User-agent: *
Disallow: /*.pdf$

Note that the $ and * wildcard characters can be combined, and they work in both Allow and Disallow directives.

For example, disallow all URLs ending in “asp”:

User-agent: *
Disallow: /*asp$

Because the $ designates the end of the URL, this rule will not exclude folder URLs or files with query strings:

  • /pretty-wasp is excluded (the * wildcard matches everything before “asp,” and the URL ends in “asp”)
  • /login.asp is excluded for the same reason
  • /login.asp?forgotten-password=1 is not excluded (the URL ends with a query string, not “asp”)

Not Crawling vs. Not Indexing

If you do not want Google to index a page, there are remedies for that other than the robots.txt file. As Google points out here:

Which method should I use to block crawlers?

  • robots.txt: Use it if crawling of your content is causing issues on your server. For example, you may want to disallow crawling of infinite calendar scripts. You should not use the robots.txt to block private content (use server-side authentication instead), or handle canonicalization. To make sure that a URL is not indexed, use the robots meta tag or X-Robots-Tag HTTP header instead.
  • robots meta tag: Use it if you need to control how an individual HTML page is shown in search results (or to make sure that it’s not shown).
  • X-Robots-Tag HTTP header: Use it if you need to control how non-HTML content is shown in search results (or to make sure that it’s not shown).

And here is more guidance from Google:

Blocking Google from crawling a page is likely to remove the page from Google’s index.
However, robots.txt Disallow does not guarantee that a page will not appear in results: Google may still decide, based on external information such as incoming links, that it is relevant. If you wish to explicitly block a page from being indexed, you should instead use the noindex robots meta tag or X-Robots-Tag HTTP header. In this case, you should not disallow the page in robots.txt, because the page must be crawled in order for the tag to be seen and obeyed.
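
For non-HTML files such as PDFs, the noindex signal must be sent as an HTTP response header rather than a meta tag. As one illustration, on an Apache server (assuming mod_headers is enabled; the file-matching pattern here is just an example), the header can be set like this:

<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex, nofollow"
</FilesMatch>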

Tips for Creating a robots.txt without Errors

Here are some tips to keep in mind as you create your robots.txt file:

  • Capitalize directive names by convention (Disallow, Allow), and remember that path values are case-sensitive: /Folder/ and /folder/ are not the same.
  • Always include a space after the colon in the command.
  • When excluding an entire directory, put a forward slash before and after the directory name, like so: /directory-name/
  • All files not specifically excluded will be included for bots to crawl.
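
Putting those tips together, a minimal illustrative file (the folder and file names are hypothetical) might look like this:

User-agent: *
Disallow: /staging/
Disallow: /internal-search/
Allow: /staging/launch-faq.html
Sitemap: https://www.domain.com/sitemap.xml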

The robots.txt Tester

Always test your robots.txt file. It is more common than you might think for website publishers to get this wrong, which can destroy your SEO strategy (for example, by disallowing the crawling of important pages or the entire website).

Use Google’s robots.txt Tester tool. You can find information about that here.

Robots Exclusion Protocol Guide

If you need a deeper dive than this article, download our Robots Exclusion Protocol Guide. It’s a free PDF that you can save and print for reference to give you lots of specifics on how to build your robots.txt.

Closing Thoughts

The robots.txt file is a seemingly simple file, but it allows website publishers to give complex directives on how they want bots to crawl a website. Getting this file right is critical, as it could obliterate your SEO program if done wrong.

Because there are so many nuances on how to use robots.txt, be sure to read Google’s introduction to robots.txt.

Do you have indexing problems or other issues that need technical SEO expertise? If you’d like a free consultation and services quote, contact us today.

FAQ: How can I optimize my website’s performance with an effective robots.txt file?

Ensuring your website’s optimal performance is paramount to success. A key aspect often overlooked is the strategic use of a robots.txt file. This unassuming text document wields the power to significantly impact your site’s search engine optimization (SEO) and overall performance.

At its core, a robots.txt file is a gatekeeper for search engine bots, guiding them on which parts of your website to crawl and index. By skillfully crafting this file, you can strategically control how search engines interact with your content. This optimization technique is vital for preventing unnecessary strain on your server, ensuring that valuable resources are allocated efficiently.

One essential application of robots.txt optimization is the ability to exclude specific pages or directories from being crawled. This is particularly useful for hiding unimportant or redundant pages, preventing search engines from wasting resources on irrelevant content. For instance, you can keep video or audio files from being crawled, preserving your server’s bandwidth for more critical components.

Updating your website can be delicate, often requiring temporary withdrawal of specific pages. By utilizing robots.txt optimization, you can gracefully handle this situation without affecting SEO rankings. Temporarily blocking crawling on pages undergoing updates ensures that search engines won’t index incomplete or inconsistent content, maintaining your site’s credibility.

Moreover, robots.txt optimization empowers you to guide search engines toward your sitemap’s location. This simple step helps search engine bots navigate your site’s structure efficiently, ensuring no valuable content is overlooked. Strategically placing your sitemap in robots.txt enhances the discoverability of your most important pages.

While the benefits of robots.txt optimization are substantial, it’s crucial to proceed cautiously. Improper configuration can inadvertently block important pages, leading to declining search engine rankings. Therefore, seeking the guidance of SEO experts or referring to reputable resources, such as Google’s guidelines, is highly recommended before implementing changes.

A well-crafted robots.txt file is a powerful tool in your SEO arsenal. By optimizing this seemingly unassuming element, you can exert control over how search engines interact with your website, ultimately enhancing performance, resource allocation, and overall user experience.

Step-by-Step Procedure for robots.txt Optimization:

  1. Understand the role of robots.txt in SEO and website performance.
  2. Identify the pages or directories you would like to exclude from crawling.
  3. Create a robots.txt file using any plain-text editor like Notepad or TextEdit.
  4. Specify user-agent directives to target search engine bots (e.g., User-agent: Googlebot).
  5. Utilize the Disallow directive to block access to pages or directories you want to exclude (e.g., Disallow: /videos/).
  6. Implement the Allow directive for specific pages within blocked directories (e.g., Allow: /videos/index.html).
  7. Use the Crawl-delay directive to control the rate at which bots crawl your site, if necessary.
  8. Include the Sitemap directive to guide search engines to your XML sitemap (e.g., Sitemap: https://www.domain.com/sitemap.xml).
  9. Test your robots.txt file using Google’s robots.txt Tester tool to identify any issues or errors.
  10. Upload the robots.txt file to the root directory of your website via FTP or your content management system (CMS).
  11. Monitor your website’s performance and search engine rankings after implementing robots.txt optimization.
  12. Regularly update and refine your robots.txt file as your website’s structure and content evolve.
  13. Consult SEO experts or reputable resources for guidance on best practices and advanced optimization techniques.
  14. Review and analyze your website’s crawl and index statistics to ensure effective robots.txt optimization.
  15. Adjust directives as needed based on changes in your website’s content and goals.
  16. Avoid blocking critical pages that are essential for search engine visibility and user experience.
  17. Continuously stay informed about updates and changes to search engine algorithms that may impact robots.txt optimization.
  18. Prioritize user experience and ensure that any exclusions align with your website’s content strategy.
  19. Regularly audit and maintain your robots.txt file to ensure ongoing optimization and performance.
  20. Keep abreast of emerging trends and best practices in SEO and robots.txt optimization for sustained success.

Search Quality Rater Guidelines Checklist: Evaluator Considerations https://www.bruceclay.com/blog/google-search-quality-rating-guidelines/ https://www.bruceclay.com/blog/google-search-quality-rating-guidelines/#comments Tue, 19 Oct 2021 19:45:00 +0000 http://www.bruceclay.com/blog/?p=38971 Google’s Search Quality Evaluator Guidelines give us clues about what the search engine focuses on, and consequently, what SEOs must focus on, too. For years, the buzzword for search engine optimization was “relevance” — making your site the most relevant result for a searcher’s query. But it’s all about usefulness today and moving forward. The goal of the search engine is simple: increase searcher satisfaction.

Here’s our checklist for making sure your SEO campaign aligns with Google’s priorities.

The post Search Quality Rater Guidelines Checklist: Evaluator Considerations appeared first on Bruce Clay, Inc..

EDITOR’S NOTE: You can always find Google’s current Search Quality Rater Guidelines here.

Google’s update of its Search Quality Rater Guidelines shows a shift in focus for the search engine and, consequently, for SEOs. BTW, the Google PDF file name says Evaluator and not Rater… but it is Rater.

For years, the buzzword for search engine optimization has been “relevance” — making your site the most relevant result for a searcher’s query. But as Duane Forrester, our former VP of organic search operations, observed: “It’s all about usefulness today and moving forward. The goal of the search engine is simple: increase searcher satisfaction. That’s why ‘useful’ is the new watchword. Previously we said ‘relevant,’ but really we all meant ‘useful.’”

Useful sign
Google regularly updates its internal guidelines document that tells hired human quality raters how to evaluate sites as part of Google’s ongoing experiments. We in the search industry usually get only leaked tidbits and summaries to read. But last month, in a rare gesture, Google published the guidelines as a PDF for all to read.

While it doesn’t reveal any ranking formulas or algo secrets, the 175-page document complete with many examples and screenshots does offer a coveted view of what the search engine considers priority. As Google’s announcement states, “the guidelines reflect what Google thinks search users want” and therefore can help webmasters and business owners “understand what Google looks for in web pages.”

The guidelines are not the algorithm. But they show what Google focuses on, and that’s worth paying attention to.

What’s important for business owners is not all of the nitty-gritty technical details. Leave those to your SEO. Instead, business decision-makers need to glean what Google’s focus is so they can allot budgets and assign priorities correctly in a website strategy that’s aligned with what Google rewards.

It’s all about usefulness today and moving forward

Aligning Your Website with Google’s Priorities

Search engine priorities change over time, and your SEO strategy has to adapt. When you compare this 2015 version to previously leaked Google quality rater’s guidelines (as Jennifer Slegg does here and here), the differences point out how Google’s focus is shifting. The two biggest changes are:

  • Mobile everything: Not only is there a whole new section devoted to mobile quality, but also most of the examples now show screenshots taken on mobile devices.
  • Needs Met focus: A new rating scale judges how fully a web page result meets a mobile searcher’s need. Any site that is NOT mobile-friendly automatically fails this criterion. The entirely new section for judging Needs Met proves that Google is all about satisfying the searcher’s needs.

Here’s our checklist for making sure your SEO campaign aligns with Google’s priorities.

Mobile, Front and Center

Is your site really mobile-friendly?

Earning a passing grade on Google’s Mobile-Friendly Test tool is the bare minimum required for all web pages and apps now. Beyond this, you must make sure that tasks are easy to accomplish with a mobile device. From the guidelines, here’s a checklist you can use to evaluate how your site performs with a smartphone:

  • How easy/hard is it to fill out forms or enter data?
  • How does the site or app behave on a small screen? Are all features usable?
  • Is the content legible without requiring left-to-right scrolling to read text?
  • Do images fit on a small screen?
  • How easily can someone navigate? Are menus, buttons and links large enough?
  • What happens on your site when Internet connectivity is inconsistent or slow?

Needs Met or Not

How well does your site anticipate and fulfill a mobile user’s needs?

Another entirely new section added to Google’s quality rating guidelines is called “Needs Met Rating Guideline.” Here’s the description, which is clearly targeting MOBILE users’ needs (from Section 13.0):

Needs Met rating tasks ask you to focus on mobile user needs and think about how helpful and satisfying the result is for the mobile users.

To get a high quality rating in the Needs Met category, a search result and its landing page should:

  • Require minimal effort for users to immediately get or use what they’re looking for.
  • Satisfy all or almost all users looking for the same thing (so that they wouldn’t need to see additional results).
  • Provide trustworthy, authoritative, and/or complete information that is helpful.

A mobile user’s intent differs from that of a desktop or even tablet user. (Tip: Aaron Levy’s SMX presentation covers mobile audiences in depth.) Evidence of this is found in the new mobile section of Google’s Search Quality Rating Guidelines, where page after page of examples show what mobile users probably want when they search for various spoken or typed queries. At one point, raters are instructed to “think about mobile users when deciding if queries are [a particular type]. Use your judgment here.”

The takeaway for mobile SEO marketers as well as for app/website owners is this: Think about what mobile users may be trying to do, and make sure that your site fulfills these things as directly as possible. Google is all about satisfying mobile users’ needs; you should be, too.

Answering this question takes some serious thought, but ultimately pays off in spades.

Purpose-Driven Pages

Does the webpage have a clear purpose, and how well is it achieved?

One of the first tasks a rater must do is figure out what a webpage is for and then decide how well the page achieves that purpose. For example, the purpose of a news site homepage is to display news; the purpose of a shopping page is to sell or give information about a product; etc. Google has very different standards for different types of pages, so understanding a page’s purpose lays the foundation for assessing its quality.

How helpful is the page’s purpose?

Google wants each page to be geared to helping users. Helpfulness factors heavily into quality ratings. On the low end of the quality scale would be pages that harm or deceive users (even though they may be fulfilling their designed purpose).

To be deemed high quality, a page must have a helpful purpose, such as:

  • To share information about a topic
  • To share personal or social information
  • To share pictures, videos, or other forms of media
  • To entertain
  • To express an opinion or point of view
  • To sell products or services
  • To allow users to share files or download software
  • … many others.

Is the purpose of the website as a whole clear, on and off site?

Make sure that your website’s overall purpose is explained clearly, ideally on the About page. The rating guidelines include examples of pages with “non-obvious purposes” — pages that seemed pointless or inaccurate on their own, until the rater referred to the About or FAQ page and discovered they were actually beneficial (see Section 2.2).

In addition, Google looks at independent sources to see whether the site’s reputation matches what it claims about itself. If there’s conflict, Google will tend to believe what the outside sources have to say. For small businesses or organizations, a lack of reviews or reputation information does not mean the site is low quality (see Section 2.7).

Meaty Main Content and Helpful Secondary Content

Does the page have quality main content?

A webpage’s main content (which excludes ads, sidebars, and other supplementary parts that do not directly fulfill the page’s purpose) can earn a high quality rating if ALL of these are true:

  • There is a satisfying amount of high quality main content on the page.
  • The page and site have a high level of E-E-A-T (experience, expertise, authoritativeness and trustworthiness).
  • The site has a good reputation for the page’s topic.

There are no hard and fast rules, and no minimum number of words per page. The guidelines encourage raters to decide whether the main content fulfills the purpose of the page satisfactorily.

Is there any supplementary content on the page that is helpful to users?

Google recognizes that supplementary content “can be a large part of what makes a High quality page very satisfying for its purpose.” Consider what you can include to offer related information, ways to find other cool stuff, or specialized content that could be helpful to people visiting that page.

YMYL Pages Have Higher Standards

How high quality are your site’s YMYL pages?

Pages that can impact a person’s “future happiness, health, or wealth” are known as Your Money or Your Life (YMYL) pages. Google first introduced this concept in the 2014 Search Quality Rating Guidelines, which held these types of pages to a much higher standard across all quality criteria. Examples include pages for shopping transactions, financial information, medical advice, legal information, and many more.

Google specifies “needs met” ratings that judge how well a webpage fulfills a searcher’s needs. If you have YMYL pages, needs met is particularly important.

Maintaining Your Site

Does your site look alive and well-maintained?

Raters are instructed to “poke around” to see whether a site is being maintained. Here are a few signs of life Google expects of a well-maintained, quality website:

  • Links should work.
  • Images should load.
  • Pages should continue to function well for users as web browsers change.

How fresh is your content?

Google’s algorithm is known to look at “freshness” as a ranking factor for many types of queries. When Googlebot gets to your site, does it find any recently added or updated content?

For blog posts and other content that is dated, don’t try to game the system by setting up a program to automatically change dates to make things look recent; Google’s on to that scheme. Raters are even instructed to manually check the Wayback Machine to investigate suspicious dates to see whether content is copied or original (see Section 7.4.7). By the way, Google’s algorithm doesn’t need the Wayback Machine to recognize original content, so don’t even try to cheat.

A healthy website frequently adds new content and/or updates old content to keep things fresh and useful for site visitors.

How expert is your content?

Thomas the really useful engine
Thomas the Tank Engine had the right idea all along.
(photo credit: Tommy Stubbs/Random House)

We know from the 2014 guidelines that Google quality raters look for signs of E-E-A-T, which stands for experience, expertise, authoritativeness and trustworthiness. The newest guidelines reinforce this concept, but define “expertise” differently depending on the topic of the page (according to Section 4.3):

  • There are “expert” websites of all types, even gossip sites, forums, etc.
  • Topics such as medical, legal, financial or tax advice, home remodeling, or parenting “should come from expert sources.”
  • Topics on hobbies, such as photography or learning to play an instrument, “also require expertise.”
  • Ordinary people may have “everyday expertise” on topics where they have life experience, such as people who write extremely detailed reviews, tips posted on forums, personal experiences, etc.

Make sure your expert content is “maintained and updated” to increase your site’s E-E-A-T rating.

About Advertising

If you have ads or monetized links on your site, are they appropriate for the page’s purpose?

The guidelines state that “the presence or absence of Ads is not by itself a reason for a High or Low quality rating” because Google realizes that many websites and apps owe their existence to ad income. However, Google “will consider a website responsible for the overall quality of the Ads displayed” (see Section 2.4.3). So keep an eye on the amount and use of affiliate, display, or other types of advertising. Make sure that ads don’t overwhelm the more useful main content (and supplementary content, if any) that each page contains.

Wrapping Up Your Quality Review

The old saying goes that there’s always room for improvement. This post is by no means a complete SEO checklist. We hope that as you apply these points from the 2015 search quality rating guidelines, which are based on Google’s priorities, you’ll begin to view your online properties with a new SEO point of view — and make your sites and apps more useful.

If you’re eyeing the best way to improve your website quality and would like to have a free consultation, fill out a quote request, and we’ll give you a call.

FAQ: How can I align my website with Google’s priorities using the Search Quality Guidelines?

Creating a website that resonates with Google’s evolving priorities is crucial for sustainable online success. Google Search Quality Guidelines provide a roadmap to effectively understand and implement these priorities.

Mobile-friendliness is a pivotal factor in Google’s ranking algorithm. Websites that offer a seamless experience across various devices garner higher search visibility. As Google emphasizes mobile-first indexing, responsive design becomes essential. Mobile-friendly pages enhance user experience and cater to increasing mobile search users.

Content quality is another cornerstone. Google’s emphasis on “usefulness” encourages webmasters to provide informative, engaging, and relevant content. Comprehensive, well-researched articles showcase expertise, boosting credibility and user engagement. Balancing text with multimedia elements like images and videos enhances the content appeal.

User intent is at the heart of Google’s priorities. Ensuring your website meets user needs is vital. Analyze your audience’s queries and preferences to provide solutions that resonate. Optimize conversational queries by incorporating natural language in your content. Addressing user intent fosters longer time spent on your site, positively impacting ranking signals.

E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) matters greatly. Establish your expertise through author bios, showcasing qualifications, and linking to reputable sources. Leverage authoritative backlinks from credible sites, enhancing your website’s trustworthiness. Regularly update content to demonstrate an ongoing commitment to accuracy and relevance.

User experience encompasses site speed, navigation, and design. A smooth browsing experience reduces bounce rates and increases user satisfaction. Minimize page loading times, ensure intuitive navigation, and maintain a clean design. A visually appealing and user-friendly website fosters positive interactions and contributes to higher rankings.

Step-by-Step Procedure: Aligning Your Website with Google’s Priorities

  1. Mobile-Friendly Optimization: Implement responsive design, ensuring a consistent device experience.
  2. Content Quality Enhancement: Craft well-researched, engaging content that caters to user needs.
  3. Address User Intent: Analyze user queries to provide relevant solutions and foster engagement.
  4. Establish E-E-A-T: Showcase experience, expertise, authoritativeness, and trustworthiness through bios and backlinks.
  5. Optimize User Experience: Prioritize site speed, intuitive navigation, and a visually appealing design.
  6. Keyword Research: Identify relevant keywords to target user queries effectively.
  7. Semantic Search Integration: Incorporate natural language and contextually relevant terms in content.
  8. Structured Data Implementation: Use structured data markup to enhance search results display.
  9. Internal Linking Strategy: Establish a logical hierarchy of internal links for easy navigation.
  10. Backlink Acquisition: Acquire authoritative backlinks from reputable websites in your niche.
  11. Regular Content Updates: Keep content fresh and accurate to demonstrate ongoing commitment.
  12. Local SEO Optimization: Optimize for local searches with accurate business information.
  13. Social Media Integration: Share content across social platforms to increase visibility and engagement.
  14. Mobile Page Speed Optimization: Optimize images and minimize code to improve mobile load times.
  15. Schema Markup Utilization: Implement schema markup for rich snippets and enhanced search visibility.
  16. User Engagement Analytics: Monitor user behavior to refine content and design strategies.
  17. Competitor Analysis: Study successful competitors for insights into effective strategies.
  18. Secure Website: Implement HTTPS for improved security and higher trustworthiness.
  19. Reduce Bounce Rates: Create engaging landing pages that address user needs promptly.
  20. Ongoing Monitoring and Adaptation: Continuously analyze performance metrics and adjust strategies accordingly.

SEO Website Migration Guide https://www.bruceclay.com/blog/seo-website-migration-guide/ https://www.bruceclay.com/blog/seo-website-migration-guide/#comments Wed, 15 Sep 2021 15:00:21 +0000 https://www.bruceclay.com/?p=106876 Thinking about doing a website migration? Currently going through one? We've developed a handy guide to help you protect your SEO and ensure your website migration is a smooth success.

The post SEO Website Migration Guide appeared first on Bruce Clay, Inc..

Birds migrating south for the winter.
Site migrations are a huge undertaking, and anyone who doesn’t think so hasn’t been through one. Unfortunately, many businesses make major changes to their websites without considering the impact on the SEO and overall performance of their site.

To be fair, it’s usually not their fault; many don’t realize the risks involved. Something as seemingly harmless as a site redesign, for example, can hurt the performance of the website and the business. (Case in point: check out this client case study where we helped turn around a failed redesign.)

So I’ve created a handy reference guide for website migrations to ensure you are following best practices for SEO during this long journey.

What is a Website Migration?

Website migration is a term used to describe when a website undergoes major changes, such as URL updates, redesigns, or content management system or hosting provider changes. Google defines site moves as either those with a URL change or those without.

Why Is SEO Important in a Website Migration?

A website migration is a major change to a website that can impact rankings and traffic. It is important to have an SEO professional oversee the many details that go into a website migration so that there is as little impact on the performance of the site as possible and so that the “new” site can outperform the old one.

Types of Website Migrations

Website migrations tend to fall into one of three categories: URL changes, design changes, or platform changes.

Here are some scenarios that prompt a website migration:

  • Moving a website from HTTP to HTTPS (for more on why this is important see: HTTPS for Users and Rankings)
  • Renaming URLs (aka URL migration)
  • Consolidating webpages and implementing 301 redirects and/or URL changes
  • Changing domain names
  • Merging with another website
  • Rearranging the website structure / navigation (for more, see: SEO Siloing: What, Why, How)
  • Redesigning the website and changing the code
  • Switching to a new content management system
  • Switching to a new hosting provider

SEO Checklist for a Website Migration

There are three phases to a website migration: pre-launch, launch, and post-launch. Each phase should have a defined set of activities. Below are just some of the steps you don’t want to miss in each phase.

Pre-Launch Phase

As you are planning your site migration, steps in the pre-launch phase should include the following.

Create a plan: Here, you are going to assemble all the people who will be involved in the site migration. Each person will have a list of things that need to be accomplished in each of the three phases — the pre-launch, launch and post-launch. Define goals for the website migration so that you can measure success. Pick a launch date, too — try to do it when website traffic levels are at their lowest. Having a tool to help manage all the tasks from a bird’s eye view will be helpful here.

Benchmark the website: Benchmark performance in different areas of the website. Then later, you can compare post-launch performance and quickly identify issues so you can address them right away. This includes running PageSpeed Insights and recording Core Web Vitals scores for your homepage and other important pages on your site.

I recommend you benchmark your PageSpeed Insights scores for each of these pages.

And for Core Web Vitals, benchmark the following metrics: Largest Contentful Paint (LCP), First Input Delay (FID) and Cumulative Layout Shift (CLS).

Crawl the website: Use an SEO crawler to find and document any current issues with the website that you want to address during the site migration process. A few tools I recommend include our SEOToolSetⓇ site crawler, DeepCrawl and Screaming Frog.

Review the content: Do a content review of the site using a tool that can help you understand which pages perform well and which don’t (something simple like an export from Google Analytics can work here). You will want to make sure you know which pages are the top performers so that you preserve any traffic and conversions they provide. And for those that aren’t performing, you can decide if they need a rewrite, need to be folded into another, larger piece of related content, and/or need to be 301 redirected. (Doing a content review is something I talk more about in an unrelated article, here.)

Do a link review: Before the launch is a good time to analyze your inbound link profile. Get rid of any links that may not serve your site well moving forward and identify new link opportunities for the site as well. For more, see our guide to monitoring backlinks and link pruning.

Map 301 redirects: Map out any pages that you will no longer need, and which pages they will redirect to. Make sure you test the redirects in the staging environment before you go live. For more, see: How to Do a 301 Redirect.
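
As a sketch of what an implemented redirect map can look like on an Apache server (assuming .htaccess with mod_alias is available; the paths shown are hypothetical), each retired URL gets a one-line rule pointing to its replacement:

Redirect 301 /old-services/ https://www.domain.com/services/
Redirect 301 /2020-pricing.html https://www.domain.com/pricing/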

Review page speed issues: Page speed issues (like those surfaced in Google Analytics Site Speed reports) can be identified and addressed during the migration process.

Review mobile-friendliness: Make sure you review the mobile-friendliness of your website, including things like website configuration, user experience and content. For more, see: Page Experience Matters: The Mobile-Friendly Site.

Prepare for page experience: Google’s page experience ranking update comprises all sorts of signals that you can optimize for ahead of the launch. For more, see: Google’s Page Experience Update 2021 — A Complete Guide.

Review design changes: Website design can impact SEO in a number of ways, for example, the code used, the way the new webpages will be set up or the navigation. Make sure there is an SEO review of any design changes.

Back up the old site: As insurance, you’ll want to make a backup of your site. Download all the images and any other assets so you have them stored just in case, and make a backup of your database if applicable.

Set up a testing environment: Most web developers know to create a separate environment to make and test changes before they go live on the site. Make sure that is happening and that the search engines can’t access the staging site.

Prepare a new HTML sitemap: This will be for users and will help them navigate the site with ease as needed. For more, see: How to Create a Sitemap.

Change campaign URLs: Even if you do implement 301 redirects, you will want to review any marketing campaigns driving traffic to specific URLs on the site and make sure they will have the new URLs.

Check structured data: If you have structured data on your webpages, test and address issues ahead of the launch. You can use Google’s Structured Data Testing Tool.

Check page titles, meta descriptions and headers. Make sure each page has unique meta information and that page headers (H1, H2, H3, etc.) are set up properly on the webpages. For more, see our articles on meta tags and heading tags.

Analytics review: Make sure that analytics tracking is set up properly. Here, it can be really useful to hire analytics specialists if none exist on the team. That’s because you want to set up tracking in a way that will help you see that you are reaching the various goals you have set for the website and the business.

Set up Search Console: If it’s not already, make sure Search Console is set up for the website. Google has a handy getting started guide, too.

Robots directives: Double-check to make sure that the robots.txt file is set up properly. See Google’s help files on robots.txt and our Robots Exclusion Protocol Reference Guide for more.

Launch Phase

On launch day, here are some things you can and should do …

Crawl the website and address issues: Do another crawl of the website to surface any errors that may be coming up. As mentioned earlier, here are a few tools I recommend: SEOToolSet’s site crawler, DeepCrawl and/or Screaming Frog.

Run Search Console tests: As soon as the site is live, you can perform different test functions in Search Console. Upload your XML sitemaps, configure URL parameters, upload an updated disavow file (as needed), use the URL inspection tool and so on.

Post-Launch Phase

After the launch, here are some things to consider …

Do pre-launch checks again. Go through your pre-launch list and make sure that everything went off without a hitch.

Rerun all the benchmarks for improvements/declines: Rerun the benchmark reports to determine if there was an improvement or decline on each of the metrics. Compare these with the pre-launch benchmark reports … How did you do?

Check crawl stats. In Search Console, check the crawl stats to make sure Google is crawling new web pages.

Use Search Console. Check out all the useful features in Search Console, and use them in the post-launch phase.

Test and tackle page speed. See how fast your webpages are by using Google Analytics’ Site Speed reports, the PageSpeed Insights tool or Google’s new Page Experience report.

Measure performance: You can begin to track progress right away, but things may be shaky for a while. Depending on the size of the site and the complexity of the migration, you will need to decide when you can start measuring true performance — which could be several months out. Look at things like rankings, traffic, user experience metrics, and conversions. (And, of course, all the goals and reports set up in Google Analytics in the pre-launch phase.)

Create your content strategy. There’s no doubt you will be adding in more content over time. Now is the time to get clear on how you will approach adding new pages to the website, making sure that the SEO professional/team is involved in all of the new content plans. This ensures you keep the site organized, optimized, and driving traffic.

Closing Thoughts

When done poorly, a website migration can cause a loss in traffic and revenue not just in the short term but even in the long run. When done well, however, a website migration can set up your website for better performance for years to come and ultimately set up your business for more success. Want an example? See this SEO case study on a successful site migration that led to a 166% jump in traffic!

We’ve helped hundreds of clients successfully complete a website migration and stay competitive in the search results. If you’d like help with your site move, please reach out to us.

FAQ: How can I ensure a smooth website migration while maintaining SEO performance?

Website migrations are pivotal moments that can either boost your online presence or hinder your search engine rankings. When considering a website migration, keeping SEO at the forefront is essential to ensure a smooth transition and maintain your hard-earned rankings.

Begin by meticulously planning your migration strategy. Evaluate your current website performance and identify areas for improvement. As you assemble a team for the migration, involve SEO professionals who understand the intricacies of preserving rankings. Setting clear goals and a timeline will help guide the process.

Prioritize content preservation during migration. Map out your existing URLs and create a comprehensive list of redirects to ensure a seamless transition for both users and search engines. This proactive approach prevents broken links and maintains your site’s credibility in the eyes of search engines.

Mitigate risks by benchmarking your website’s performance before the migration. Assess critical metrics such as page speed, mobile-friendliness, and Core Web Vitals. This benchmarking allows you to accurately measure post-migration improvements or setbacks, enabling you to address issues promptly.

During the migration, monitor closely for any unforeseen issues. Use tools like Google Search Console to identify and fix crawl errors promptly. Keep SEO specialists engaged to ensure that the migration process aligns with best practices and that search engines index the new site effectively.

Post-migration, continue monitoring your website’s performance closely. Analyze changes in rankings, traffic, and user engagement. Address any unexpected drops in rankings promptly by identifying and rectifying potential issues.

Step-by-Step Procedure: Ensuring a Smooth Website Migration with SEO Performance

1. Planning Phase:
   – Assemble a migration team that includes SEO professionals.
   – Define clear goals and establish a timeline.
   – Evaluate the current website’s performance and identify areas for improvement.

2. Content Preservation:
   – Map out existing URLs and create a comprehensive list of redirects.
   – Prioritize content preservation to maintain user experience and SEO value.

3. Benchmarking Performance:
   – Evaluate critical metrics such as page speed and mobile-friendliness.
   – Benchmark performance to measure post-migration improvements accurately.

4. Migration Process:
   – Monitor the migration process closely for any unexpected issues.
   – Utilize Google Search Console to identify and rectify crawl errors promptly.

5. Post-Migration Analysis:
   – Continuously monitor website performance after migration.
   – Analyze changes in rankings, traffic, and user engagement.
   – Address any drops in rankings promptly by identifying and resolving potential issues.

Website migrations require careful planning, execution, and ongoing monitoring. By following these expert strategies, you can ensure a smooth migration while maintaining your site’s SEO performance. Remember, a well-executed migration can improve rankings and overall success in the digital landscape.

Core Web Vitals: First Input Delay – What It Is and How to Improve It for SEO https://www.bruceclay.com/blog/core-web-vitals-fid/ https://www.bruceclay.com/blog/core-web-vitals-fid/#comments Wed, 01 Sep 2021 17:00:40 +0000 https://www.bruceclay.com/?p=85549 “Core web vitals” is a set of core webpage functionalities that impact user experience. Google’s ranking algorithm update called page experience, which Google rolled out from June to August 2021, incorporates core web vitals as NEW ranking factors for SEO. The current set of core web vitals includes: Largest Contentful Paint (LCP) First Input Delay […]

The post Core Web Vitals: First Input Delay – What It Is and How to Improve It for SEO appeared first on Bruce Clay, Inc..

“Core web vitals” is a set of core webpage functionalities that impact user experience. Google’s ranking algorithm update called page experience, which Google rolled out from June to August 2021, incorporates core web vitals as NEW ranking factors for SEO.

The current set of core web vitals includes:

  • Largest Contentful Paint (LCP)
  • First Input Delay (FID)
  • Cumulative Layout Shift (CLS)

In this article, part of our series covering the page experience update, I’ll discuss the first input delay, or FID.

What Is First Input Delay (FID)?

FID measures the responsiveness of a page to user events. While technically, this could happen throughout the lifespan of a user session on a webpage, in practice, most interactivity problems occur during the initial page load. That is because this is when most resources are being downloaded, parsed, executed, and rendered.

Google discusses FID here:

First Input Delay (FID) is an important, user-centric metric for measuring load responsiveness because it quantifies the experience users feel when trying to interact with unresponsive pages—a low FID helps ensure that the page is usable. …

FID measures the time from when a user first interacts with a page (i.e., when they click a link, tap on a button, or use a custom, JavaScript-powered control) to the time when the browser is actually able to begin processing event handlers in response to that interaction.

Google cites two important reasons why the first input delay is important:

  • The first input delay will be the user’s first impression of your site’s responsiveness, and first impressions are critical in shaping our overall impression of a site’s quality and reliability.
  • The biggest interactivity issues we see on the web today occur during page load. Therefore, we believe initially focusing on improving site’s first user interaction will have the greatest impact on improving the overall interactivity of the web.

Keep in mind that FID will not apply to every situation, as Google points out here:

Not all users will interact with your site every time they visit. And not all interactions are relevant to FID … How you track, report on, and analyze FID will probably be quite a bit different from other metrics you may be used to.

How Is First Input Delay (FID) Measured?

FID measures the first impression of your site’s interactivity and responsiveness. It analyzes things like clicks, taps and key presses, which fall under the “responsiveness” category. It does not measure things like scrolling and zooming, which are related to animation.

Google recommends webpages aim for an FID of 100 milliseconds or less*, which means the page would be able to respond to an interactive event within that time frame. To be considered “good,” the threshold should be met for at least 75 percent of page loads, segmented across mobile and desktop devices. You can learn more about how Google creates thresholds here.
*Updated threshold per Google as of 2/18/2021

FID score range from Google.
Google’s FID score

While the official threshold is 75% of page loads, Google says that for FID in particular, they “strongly recommend looking at the 95th to 99th percentiles, as those will correspond to the particularly bad first experiences users are having with your site. And it will show you the areas that need the most improvement.” This is true for both desktop and mobile users.

For developers, it’s important to understand that Google only measures the delay in event processing, not the “event processing time itself nor the time it takes the browser to update the UI after running event handlers.”

In other words, Google only measures how long the browser takes to start executing the event process. So, if you click on a link, it’s the delay between the time you click and the time the browser starts processing that click.
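
To make that concrete, the browser’s Event Timing API exposes exactly this value as the gap between when the user interacted (startTime) and when the browser could begin running event handlers (processingStart). A minimal JavaScript sketch for observing it:

new PerformanceObserver((entryList) => {
  for (const entry of entryList.getEntries()) {
    // Delay only: time from the interaction to the start of event handling,
    // not the handler's own processing time.
    console.log('FID (ms):', entry.processingStart - entry.startTime);
  }
}).observe({ type: 'first-input', buffered: true });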

And when you’re ready to start improving FID, you’ll use tools that can help measure real data in the wild.

How Does First Input Delay (FID) Impact SEO?

Improving FID is another way to speed up your webpages for visitors. Consider that fast page loading was already a best practice for SEO and a ranking factor long before we heard of core web vitals. FID helps keep visitors on your site because they can interact with the content faster.

When people bounce from your site, they may never come back, and you can lose potential revenue. Not only that, but a sluggish site can also impact your rankings. That’s because Google’s AI, RankBrain, may take into account how a user engages with the search results.

Over time, if a website has enough visitors who go to the page from the search results and bounce back quickly, this could indicate they didn’t find what they were looking for. Because RankBrain’s goal is to analyze and serve the most relevant search results, rankings could suffer.

The good news is that most sites may already be OK when it comes to FID. In a study by Screaming Frog, 89% of mobile and 99% of desktop URLs fell within the threshold. The average was around 56 milliseconds on mobile and 13 milliseconds on desktop.

When looking at FID and search rankings correlation, Screaming Frog says that there’s much less of a correlation than for other core web vitals. But you need to recall that 2021 is when this becomes an important factor, and we would not expect an impact yet.

First input delay (FID) data from ScreamingFrog study.
“How Many Sites Pass the Core Web Vitals Assessment?,” Screamingfrog.co.uk

How Do I Improve My First Input Delay (FID) Score?

Google provides tools to measure FID, including PageSpeed Insights and the Chrome User Experience (CrUX) Report, both of which report field data from real users.

You can also measure FID with the web-vitals JavaScript library and learn more about that here. If you are serious about improving CWV, this is the best way to get real-time feedback from actual user sessions to determine how to fix FID in the field.
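
As a sketch using the library’s v2-era API (the getFID export; later releases renamed these functions), you can capture each real user’s value and send it to your own analytics endpoint (the endpoint path here is hypothetical):

import { getFID } from 'web-vitals';

getFID((metric) => {
  // metric.value is this user's first input delay in milliseconds.
  navigator.sendBeacon('/analytics', JSON.stringify({ name: metric.name, value: metric.value }));
});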

The primary cause of a bad FID score is heavy JavaScript execution. “Optimizing how JavaScript parses, compiles, and executes on your web page will directly reduce FID,” says Google. Reducing the amount of JavaScript and/or optimizing the running of JavaScript has always been a good idea for SEO.

If a user clicks while a JavaScript file is being processed, the browser can’t react, and the user feels blocked. If your FID score is in the red, you may need to split up your JavaScript files so the browser can go back and forth between JavaScript processing and reacting to the user.
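
Bundler-level code splitting is one approach; another is to break long tasks inside a script into smaller chunks and yield back to the main thread between them, so clicks and taps can be handled in the gaps. A sketch of that pattern:

function processInChunks(items, chunkSize, handleItem) {
  let index = 0;
  function runChunk() {
    const end = Math.min(index + chunkSize, items.length);
    for (; index < end; index++) {
      handleItem(items[index]);
    }
    if (index < items.length) {
      // Yield to the main thread so user input can be processed before the next chunk.
      setTimeout(runChunk, 0);
    }
  }
  runChunk();
}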

Optimizing your JavaScript reduces page bloat, improves page performance, and provides Google with an easier path to index the correct content. That’s because Google will not have to process as much JavaScript to figure out what it needs.

In our experience, the more you can give Googlebot what it needs right away without having to process too many things, the better Google will index your site the way you think it should be indexed. Indexing is hugely important for SEO as it influences what pages Google determines are valid or not.

To optimize the FID score, Google recommends running a Lighthouse performance audit and looking at the opportunities uncovered, and it gives more detail on how to optimize JavaScript here.

Find out more about the update by reading our page experience series:

  1. What’s the Page Experience Update?
  2. How to Make a Mobile-Friendly Site
  3. Intrusive Interstitials & Why They’re Bad for SEO
  4. HTTPS for Users and Ranking
  5. Core Web Vitals Overview
  6. Core Web Vitals: LCP (Largest Contentful Paint)
  7. Core Web Vitals: FID (First Input Delay)
  8. Core Web Vitals: CLS (Cumulative Layout Shift)

Watch our on-demand webinar 3 Expert Tips to Improve Core Web Vitals to get more in-depth help on this timely SEO topic.

FAQ: What is the significance of First Input Delay in user experience?

First Input Delay (FID) is a pivotal metric that defines how responsive and interactive a webpage is upon user engagement. FID measures the time interval between a user’s initial interaction—such as clicking a link or tapping a button—and the browser’s ability to respond. It is, essentially, the first impression users have of a webpage’s responsiveness.

A seamless and prompt response to user input is integral to retaining visitors’ interest and ensuring their satisfaction. Users perceive a website as highly responsive when FID is minimized, leading to a positive user experience. Research indicates that user patience is limited, and even a slight delay in response can result in frustration, leading to higher bounce rates and decreased engagement.

FID’s significance extends beyond user satisfaction. Search engines, particularly Google, recognize the importance of user experience in determining search rankings. As part of its ranking algorithm, Google considers FID as a user-centric metric to gauge a website’s responsiveness. Websites that provide a smoother user experience by minimizing FID are more likely to rank higher in search results, gaining increased visibility and organic traffic.

To optimize FID, meticulous attention to website performance is crucial. Heavy JavaScript execution often contributes to delayed responses. By optimizing JavaScript code, reducing its size, and improving its execution efficiency, websites can significantly enhance FID scores. Identifying and addressing resource-intensive processes that impact FID is essential, as even seemingly insignificant elements can accumulate and cause delays.

Moreover, focusing on FID optimization aligns with the broader goal of improving website performance. A well-optimized website delivers a superior user experience and positively impacts SEO efforts. Websites that load swiftly and respond promptly to user input create a positive feedback loop, enhancing engagement, lowering bounce rates, and improving search rankings.

The significance of First Input Delay in user experience cannot be overstated. A quick and seamless response to user input is fundamental in retaining users, improving engagement, and enhancing overall website performance. Prioritizing FID optimization not only elevates user satisfaction but also contributes to improved search rankings and organic traffic.

Step-by-Step Procedure: The Significance of First Input Delay in User Experience

  1. Introduction to FID: Define what First Input Delay (FID) is and its importance in user experience.
  2. Measurement of FID: Explain how FID is measured and its impact on user interactions.
  3. User Perception: Discuss how FID influences user perception of website responsiveness.
  4. Impact on Engagement: Describe the correlation between low FID and higher user engagement.
  5. Search Engine Ranking: Explain how Google incorporates FID into its ranking algorithm.
  6. FID and SEO: Discuss the relationship between FID optimization and improved search rankings.
  7. Optimizing JavaScript: Provide insights into how heavy JavaScript execution affects FID.
  8. Reducing JavaScript Load: Detail strategies for optimizing JavaScript code to improve FID (a code-splitting sketch follows this list).
  9. Resource-Intensive Processes: Identify elements contributing to delayed responses and how to mitigate them.
  10. Holistic Website Performance: Highlight the broader benefits of FID optimization on overall website performance.
  11. User Experience Impact: Discuss the direct connection between FID optimization and enhanced user experience.
  12. Bounce Rate Reduction: Explain how FID optimization leads to lower bounce rates and higher engagement.
  13. Positive Feedback Loop: Illustrate how FID optimization fosters a positive user satisfaction and interaction cycle.
  14. Strategic Importance: Emphasize the strategic significance of FID optimization in digital marketing efforts.
  15. Real-world Examples: Provide case studies demonstrating the impact of FID optimization on user engagement and search rankings.
  16. Tools and Resources: List tools and resources available for measuring and improving FID.
  17. JavaScript Optimization Tools: Recommend tools for optimizing JavaScript execution to enhance FID.
  18. Performance Audits: Guide readers on how to conduct a Lighthouse performance audit for FID insights.
  19. Continuous Improvement: Stress the importance of ongoing monitoring and improvement of FID.
  20. Conclusion: Summarize the importance of FID optimization in delivering exceptional user experiences and driving SEO success.
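
To illustrate step 8, non-critical JavaScript can often be pulled off the critical path with a dynamic import() that most bundlers split into a separate file, so it only loads when the user needs it. This is a sketch under that assumption; the ./chat-widget module and its initChat export are hypothetical.

```ts
// Sketch: defer a non-critical feature until the user actually asks for it.
// "./chat-widget" and its initChat export are hypothetical; adapt to your own modules.
const chatButton = document.querySelector<HTMLButtonElement>('#open-chat');

chatButton?.addEventListener('click', async () => {
  // The chat bundle is only downloaded, parsed, and executed on first click,
  // keeping it out of the work that competes with First Input Delay.
  const { initChat } = await import('./chat-widget');
  initChat(chatButton);
});
```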

The post Core Web Vitals: First Input Delay – What It Is and How to Improve It for SEO appeared first on Bruce Clay, Inc..

What Are ‘Your Money or Your Life’ (YMYL) Webpages? https://www.bruceclay.com/blog/what-are-your-money-or-your-life-ymyl-webpages/ https://www.bruceclay.com/blog/what-are-your-money-or-your-life-ymyl-webpages/#comments Thu, 05 Aug 2021 16:36:02 +0000 https://www.bruceclay.com/?p=103408 "Your money or your life" may sound funny, but the implications of YMYL content are no joke. Here's how to keep within Google's lofty expectations and get rewarded with higher rankings.

The post What Are ‘Your Money or Your Life’ (YMYL) Webpages? appeared first on Bruce Clay, Inc..

Question mark written on a chalkboard.

“Your money or your life…” It sounds like something you’d hear in an old spaghetti Western, doesn’t it? This content type is no joke though.

Your money or your life (YMYL) is how Google describes “types of pages or topics that could potentially impact a person’s future happiness, health, financial stability, or safety.”

Understanding the implications of YMYL content and how to keep within Google’s expectations is essential for website publishers and SEO professionals.

In this post, you’ll find answers to the most pressing questions about YMYL content:

What Are Your Money or Your Life (YMYL) Webpages?

YMYL webpages contain information that could potentially impact a person’s life in serious ways. In its most recent Search Quality Evaluator Guidelines, Google explains that its Page Quality rating standards for YMYL content are “very high” because low-quality pages could have negative consequences for searchers.

YMYL pages aren’t just places where you can make a purchase. Examples of YMYL content include:

  • News about important topics and current events, especially in business, politics, science, and technology.
  • Information about civics, government, and the law, such as content about voting or social services.
  • Financial advice and information about taxes, savings, investments, etc.
  • Shopping pages, whether for products or services.
  • Health and safety content including medical information and anything to do with drugs, emergency preparedness, hospitals, and more.
  • Pages about groups of people, specifically claims or information about religion, sexuality, race, or other shared traits/demographics.

Google notes that the list in its guidelines is not comprehensive. Any webpage with information, advice, or claims “related to big decisions or important aspects of people’s lives” can be considered YMYL.

Why Is YMYL Important to SEO?

Google holds YMYL pages to a higher standard than other types of content. If you want webpages on these potentially sensitive topics to outrank the competition, all of the content on the page must pass the search engine’s rigorous quality checks.

This includes:

  • Main Content (MC): The part of the page that directly helps the page achieve its purpose, including text, images, videos, reviews, and other user-generated content.
  • Supplementary Content (SC): Parts of the page that contribute to the user experience but don’t directly serve the page’s purpose, such as navigational links.
  • Ads: Paid media and monetization designed to make money from the webpage, including affiliates or any other monetized links.

Website publishers must understand that even a YMYL webpage with high-quality content may not be up to snuff if there are affiliate links or ads directing readers to dubious information that could negatively impact them.

Think in terms of the overall experience a searcher has on the webpage in question.

How Do You Optimize for YMYL?

You can’t talk YMYL without running into another impactful acronym: E-E-A-T (experience, expertise, authoritativeness, and trustworthiness). E-E-A-T is not a ranking factor; rather, it is how Google describes what it’s looking for in quality web content.

E-E-A-T is important on all types of websites and pages, from forums with strictly user-generated content to fashion blogs and gossip websites.

It is even more important on webpages that users turn to for advice and information about the topics that could seriously impact their livelihood — YMYL pages.

From Google’s guidelines:

The amount of information needed for E-E-A-T assessment depends on the type of website. For example, YMYL websites demand a high degree of trust, so they generally need satisfying information about who is responsible for the content of the site.

There are a few aspects Google looks at in particular when considering E-E-A-T. Make sure your webpage clearly answers these important questions:

  • What is the purpose of this page?
  • How does it demonstrate expertise, authoritativeness, and trustworthiness?
  • Is the quality and amount of the main content indicative of a high-quality resource on this topic?
  • Who is responsible for the main content on this page?
  • What sort of reputation does this website and the person responsible for the page content have?
  • Does this site sell ads or offer links to questionable and, thus, potentially harmful content or sites?

Publishers will also want to pay particular attention to “Needs Met,” a section in the Search Quality Evaluator Guidelines. Needs Met measures, on a scale from “Fails to Meet” to “Fully Meets,” how well a search result and its associated landing page satisfy a mobile searcher’s intent.

Optimizing YMYL content, pages, and sites means making it as easy as possible for both site visitors and search engines to see that your resource will not negatively impact readers. Use these tips to improve your page’s E-E-A-T.

Experience

In December 2022, Google extended the acronym to E-E-A-T, emphasizing Experience. This shift highlights the significance of firsthand, real-life expertise among writers discussing specific subjects. Pages crafted by individuals possessing ample personal experience tend to be trustworthy and proficiently achieve their goals. For example, evaluating the reliability of a product review is influenced by whether it comes from someone who has used the item versus someone who hasn’t.

Expertise

Google is specifically looking at the expertise of the person who created the page’s MC. Expertise is particularly important in YMYL topics. Google wants to see that the person who created the content has professional experience, accreditation, education, first-hand experience, and/or other qualifications that make them an expert in the topic.

To help improve your content’s expertise:

  • Be clear about who created the main content and what makes them an expert on the topic. This may mean hiring experts, and you may want to include author names with brief bios in your content (see the author markup sketch after this list).
  • Properly source and cite credible information to support any claims that are not common knowledge. Remember that outbound links to expert sources boost your credibility.
  • Where content is user-generated, ensure there is an oversight process so that low-quality, potentially harmful information is removed.
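
One lightweight way to make authorship explicit to both visitors and search engines is schema.org author markup in JSON-LD. The sketch below injects such a block at render time; every name, title, and URL in it is a hypothetical placeholder, and many sites would simply hard-code the JSON-LD into the page template instead.

```ts
// Sketch: inject schema.org Article markup that names the author and their credentials.
// All values below are hypothetical placeholders.
const articleMarkup = {
  '@context': 'https://schema.org',
  '@type': 'Article',
  headline: 'How to Choose a Retirement Account',
  author: {
    '@type': 'Person',
    name: 'Jane Doe',
    jobTitle: 'Certified Financial Planner',
    url: 'https://www.example.com/authors/jane-doe',
  },
};

const script = document.createElement('script');
script.type = 'application/ld+json';
script.textContent = JSON.stringify(articleMarkup);
document.head.appendChild(script);
```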

Authoritativeness

In gauging authority, Google is looking at the authoritativeness of the creator of the MC, the MC itself, and the website as a whole.

To improve your authoritativeness:

  • Ask yourself, does this page leave questions unanswered? While word count or the volume of main content is not a ranking factor, Google does want to see that the amount of content is sufficient to satisfy the reader’s needs. (By the way, you can see the word count for the top-ranked pages by using our WordPress SEO plugin.)
  • Attract links from other reputable, highly authoritative websites in your niche, such as industry associations, recognized experts, and reputable publications. For more, see The CMO’s Guide to the New Link Building Strategy.

Trustworthiness

As with authoritativeness, Google is considering the trustworthiness of the MC creator, the MC itself, and the website. This is particularly important where the content offers advice on personal finance, legal issues, taxes, etc.

Ways to improve a website’s trustworthiness include:

  • For sales and store pages in particular, Google is looking for a “satisfying” customer service experience. Make sure the page proactively answers common questions and that the path to service — whether by online chat, phone, or any other method — is crystal clear.
  • Incorporate testimonials and reviews into pages rather than leaving them only on third-party websites or on one dedicated page. Don’t make users go looking for them; chances are, they won’t bother. Google says, “We consider a large number of positive user reviews as evidence of positive reputation.”
  • Display any industry associations, relevant qualifications, badges that illustrate how you protect visitor/shopper data, refund policy information, and other information that builds trust with visitors.
  • Google says that “when a high level of authoritativeness or expertise is needed, the reputation of a website should be judged on what expert opinions have to say.” What would a top industry expert say about your YMYL webpage?

Finally, look at your webpages on YMYL topics with a critical eye:

  • Would I give this website my money?
  • Would I consent to a medical treatment or investment plan based on what I see here?
  • How confident am I in the accuracy of this information?

Read our Complete Guide to the Fundamentals of Google’s E-E-A-T for more helpful tips and examples. And also see: 5 Times When SEO Siloing Can Make or Break Your Rankings for how siloing can help with YMYL webpages.

Raise the Bar for YMYL Webpages

The bar rises substantially when you’re optimizing YMYL sites and content. Be that reputable, trustworthy source for your visitors, and Google will reward you with higher rankings and greater online visibility.

Make sure you revisit and reevaluate YMYL pages often, particularly in fast-moving spaces such as medical technology and treatments, politics, or investments, for example.

What was a great resource last year may be outdated today. You can maximize the ROI of your content by consistently updating what you’ve already invested in creating to keep it performing year after year.

If you need expert help with your SEO or content, please contact us today for a free quote and consultation.

FAQ: How can I ensure my web content meets Google’s YMYL standards for experience, expertise, authoritativeness, and trustworthiness?

Creating web content that meets Google’s YMYL standards is essential for maintaining a strong online presence and building trust with your audience. YMYL, which stands for Your Money or Your Life, encompasses content that can directly impact a user’s well-being, finances, or safety.

Here are key strategies to ensure your content adheres to these critical standards:

Understanding YMYL Content

Start by comprehending what falls under the umbrella of YMYL content. Financial advice, medical information, legal guidance, and news are prime examples. Recognize the importance of expertise, authoritativeness, and trustworthiness in these niches, as Google holds such content to higher standards due to its potential impact on users’ lives.

Adding Experience 

Share your personal experience and opinions. High-quality content frequently showcases the creator’s direct, firsthand involvement in the subject matter, which signals that recommendations have actually been tested and that the insights are authentic.

Elevating Expertise

Demonstrate your expertise through in-depth research, citing reliable sources, and showcasing your qualifications. Include author bios and credentials to lend credibility to your content. Collaborate with industry experts for guest posts or interviews, adding diverse perspectives and enhancing your content’s value.

Establishing Authoritativeness

Build authoritativeness by consistently producing high-quality content over time. Create a content calendar that covers important topics within your niche. Back up your claims with data, case studies, and references to authoritative sources. Encourage user engagement through comments and discussions to show that your audience respects and values your content.

Cultivating Trustworthiness

Transparency is key to gaining trust. Disclosure of affiliations, sponsors, and potential conflicts of interest is paramount. Provide accurate and up-to-date information – any outdated or inaccurate details could severely undermine your credibility. Implement strong security measures on your website to protect user data, assuring visitors that their privacy is a priority.

Regularly Updating and Reviewing

Stay current with industry trends and changes in Google’s algorithms. Regularly review and update your content to ensure accuracy and relevance. Collaborate with peers for peer reviews to gain insights and perspectives that can further enhance your content’s quality.

Step-by-Step Procedure: Ensuring YMYL Compliance

  1. Identify Your Niche: Determine if your content falls under YMYL categories such as health, finance, legal, or safety.
  2. Research Thoroughly: Conduct comprehensive research on your chosen topic, citing reputable sources.
  3. Highlight Credentials: Showcase your expertise and credentials in author bios and content introductions.
  4. Collaborate with Experts: Engage with industry experts for collaborations, interviews, or guest posts.
  5. Create a Content Calendar: Plan a schedule to consistently produce authoritative content within your niche.
  6. Reference Authoritative Sources: Support your claims with data, studies, and references from reliable sources.
  7. Encourage Engagement: Foster user engagement through comments and discussions on your content.
  8. Ensure Transparency: Disclose affiliations, sponsorships, and potential conflicts of interest.
  9. Provide Accurate Information: Regularly update your content to maintain accuracy and relevance.
  10. Prioritize Security: Implement strong security measures to protect user data and privacy.
  11. Stay Updated: Keep up with industry trends and Google algorithm changes that affect YMYL content.
  12. Review Content: Conduct periodic content reviews for accuracy and quality.
  13. Peer Reviews: Collaborate with peers for feedback and insights on your content.
  14. Improve User Experience: Give visitors an effortless browsing experience with clear navigation and a clean, attractive design.
  15. Mobile Optimization: Optimize your site so it displays and functions well on mobile devices.
  16. Page Speed: Improve page loading speed for a better user experience and SEO ranking (see the field-measurement sketch after this list).
  17. Structured Data: Implement structured data to enhance search engine understanding of your content.
  18. Avoid Clickbait: Craft honest, accurate headlines that reflect the content’s purpose.
  19. Minimize Ads: Avoid excessive ads that can detract from the main content and user experience.
  20. Monitor Analytics: Regularly analyze user engagement, bounce rates, and traffic sources to refine your strategy.
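
For steps 16 and 20, field data from real visitors is more telling than one-off lab tests. The sketch below assumes the open-source web-vitals npm package (v3-style onLCP/onCLS/onINP callbacks; function names differ slightly between versions) and a hypothetical /analytics endpoint.

```ts
// Sketch: report real-user Core Web Vitals to your own analytics endpoint.
// Assumes the "web-vitals" npm package (v3-style API); "/analytics" is a placeholder.
import { onLCP, onCLS, onINP, type Metric } from 'web-vitals';

function report(metric: Metric): void {
  const body = JSON.stringify({ name: metric.name, value: metric.value, id: metric.id });
  // sendBeacon survives page unloads better than fetch for this kind of reporting.
  navigator.sendBeacon('/analytics', body);
}

onLCP(report);
onCLS(report);
onINP(report);
```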

Ensuring your web content meets Google’s YMYL standards requires a holistic approach that prioritizes experience, expertise, authoritativeness, and trustworthiness. Following these strategies and steps can enhance your online credibility and provide valuable content that genuinely benefits your audience while aligning with Google’s guidelines.

The post What Are ‘Your Money or Your Life’ (YMYL) Webpages? appeared first on Bruce Clay, Inc..
