SEO best practices Archives - Bruce Clay, Inc.
https://www.bruceclay.com/blog/tag/seo-best-practices/

Validate Your SEO Advice Using Google's Recommendations
Published Mon, 12 Feb 2024
https://www.bruceclay.com/blog/validate-seo-advice-googles-recommendations/

Learn the best SEO practices and strategies for optimizing your website to improve search engine visibility and drive organic traffic. Find out how to prove that SEO works to your company's stakeholders.

The post Validate Your SEO Advice Using Google’s Recommendations appeared first on Bruce Clay, Inc..

Professionals working together in an office.

“How do we know SEO is going to work?” This question or some form of it is often asked when presenting digital strategies to your company’s internal stakeholders. And if they are not saying it out loud, they’re probably thinking it.

One of the best ways to get your SEO recommendations implemented at your company is to prove that search engines also view your tactics as a best practice. It’s your job as the professional to educate and show evidence that what you are recommending works.

So in this article, I will tie some common SEO best practices to the Google advice found in its SEO Starter Guide and other sources.


Search Engine Optimization

First, it’s useful to show that Google believes SEO is actually a good thing when done right. In its SEO Starter Guide, Google defines SEO as “the process of making your site better for search engines.”

Once you’ve established that SEO is a partnership between websites and search engines, you lay the groundwork for the recommendations to follow.

Getting Started with SEO

In its guide, Google outlines a few basic questions that website publishers want to explore:

  • Is my website showing up on Google?
  • Do I serve high-quality content to users?
  • Is my local business showing up on Google?
  • Is my content fast and easy to access on all devices?
  • Is my website secure?

SEO tactics that can help with each of those are as follows:

Making sure your website shows up. There are a number of reasons why search engines can’t crawl and index a site (for example, your robots.txt file). Technical SEO can help determine the problem.

Serving high-quality content to users. Google assesses the quality of webpages, and it all starts with creating quality content from qualified experts and authorities on the matter.

This is especially true for "your money or your life" (YMYL) webpages. Google's John Mueller reiterated how important E-E-A-T is for YMYL pages in a 2021 Google SEO office hours session.

Local business marketing online. Local SEO caters to small businesses with brick-and-mortar locations so that they can show up in the search results.

In 2021, Google's John Mueller reiterated how important it is to optimize your Google My Business listing.

Fast content for mobile users. Optimizing for things like Google’s “core web vitals” as well as ensuring websites cater to mobile users is important to compete in the search results.

In 2022, Google's John Mueller stated that Core Web Vitals are key for well-performing websites.

On Reddit, Mueller also clarified that core web vitals adherence is more than a “tie breaker” when it comes to ranking:

John Mueller response to the Reddit post "Anyone else not buying Core Web Vitals?".

Secure websites. HTTPS is the gold standard to secure the data that’s exchanged between a web browser (such as Chrome) and a web server (which stores, processes and delivers your webpages to a user). In addition to this security measure, you want to implement controls to ensure your site is safe from hackers. And yes, this is a part of SEO!

Technical SEO

Technical SEO is the practice of optimizing the “back end” of a site so that search engines like Google can better crawl and index the website. Among other things, Google outlines the following:

  • Sitemaps. A sitemap tells search engines about the pages, images and videos on your site. It helps ensure more thorough crawling and indexing. Make sure your site has one. Learn more by reading our article: What Is an XML Sitemap and How Do I Make One?
  • Robots.txt. Make sure you are using this properly. Blocking important files can prevent search engines from crawling and indexing your important webpages.
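To make the sitemap advice concrete, here is a minimal sketch of generating an XML sitemap from a list of URLs. The URL list and function name are illustrative, not from any real site; see sitemaps.org for the full protocol.

```javascript
// Minimal sketch: build an XML sitemap string from a list of URLs.
// The example URLs are hypothetical.
function buildSitemap(urls) {
  const entries = urls
    .map((u) => `  <url><loc>${u}</loc></url>`)
    .join("\n");
  return `<?xml version="1.0" encoding="UTF-8"?>\n` +
    `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
    `${entries}\n</urlset>`;
}

console.log(buildSitemap(["https://www.example.com/"]));
```

A real sitemap would typically also carry optional `<lastmod>` entries and be referenced from robots.txt.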

Google’s Gary Illyes once said on a Reddit thread:

“I really wish SEOs went back to the basics (i.e. MAKE THAT DAMN SITE CRAWLABLE) instead of focusing on silly updates and made up terms by the rank trackers, and that they talked more with the developers of the website once done with the first part of this sentence.”

On-Page SEO

On-page SEO is the practice of optimizing webpages from top to bottom. In doing so, you are accomplishing two things: 1) giving pages a better chance of showing up in the search results and 2) creating a better user experience for website visitors.

Meta Information

Google says to pay attention to your webpages' meta tags, including the title and description.

Titles

Here, Google says to “create unique, accurate page titles.” Google goes on to say that the titles should …

“Accurately describe the page’s content. Choose a title that reads naturally and effectively communicates the topic of the page’s content.”

  • Brief explanation: You want to make the title relevant to the page content so that 1) Google can quickly understand what the page is about, 2) you can get more click-through from the search results and 3) you can avoid a bounce because the page delivers on what the title says. You also want to avoid violating Google’s webmaster guidelines by attempting to keyword-stuff the title.

“Create unique titles for each page. Each page on your site should ideally have a unique title, which helps Google know how the page is distinct from the others on your site. If your site uses separate mobile pages, remember to use good titles on the mobile versions too.”

  • Brief explanation: Avoid duplicate content, plain and simple. Duplicate content can work against you by filtering your webpages from the search results.

“Use brief but descriptive titles. … If the title is too long or otherwise deemed less relevant, Google may show only a portion of it or one that’s automatically generated in the search result. Google may also show different titles depending on the user’s query or device used for searching.”

  • Brief explanation: You want to keep titles consistent with character count best practices, so that your titles are not cut off (aka “truncated”) in the search results. Think of the title as an advertisement for your webpage. Follow best practices but also make it compelling so people want to click through. Know that even with your best efforts, Google may change how it is displayed in the search results.

Description Tag

Google says: “Use the ‘description’ meta tag.”

Description meta tags are important because Google might use them as snippets for your pages. Note that we say “might” because Google may choose to use a relevant section of your page’s visible text if it does a good job of matching up with a user’s query. Adding description meta tags to each of your pages is always a good practice in case Google cannot find a good selection of text to use in the snippet.

Google goes on to say …

“Accurately describe the page content. Write a description that would both inform and interest users if they saw your description meta tag as a snippet in a search result. While there’s no minimal or maximal length for the text in a description meta tag, we recommend making sure that it’s long enough to be fully shown in Search … and contains all the relevant information users would need to determine whether the page will be useful and relevant to them.”

  • Brief explanation: Just like the title tag, you want to view this tag as an opportunity to promote the webpage. Making it compelling is key. Google says there are no character count limits, though we recommend keeping within best practices to ensure the entire snippet is shown in the search results.

“Use unique descriptions for each page. Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain …”

  • Brief explanation: Meta tags can also contribute to the pesky duplicate content issue that can happen when Google sees similar content on a page. Since the description tag (along with the title tag) is usually one of the first pieces of content a search engine encounters on the page, you want it to be original and relevant to the page.
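The length guidance above can be sketched as a simple checker. The character limits below are common industry rules of thumb for avoiding truncation, not official Google cutoffs, and the function name is our own.

```javascript
// Rough sketch: flag titles and meta descriptions likely to be
// truncated in search results. Limits are rules of thumb, not
// official Google values.
const LIMITS = { title: 60, description: 160 };

function checkSnippet(kind, text) {
  const limit = LIMITS[kind];
  return {
    kind,
    length: text.length,
    ok: text.length > 0 && text.length <= limit,
  };
}

console.log(checkSnippet("title", "What Is an XML Sitemap and How Do I Make One?"));
```

Even a title that passes this check may be rewritten by Google in the search results, as noted above.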


Heading Tags

Google says:

Use meaningful headings to indicate important topics, and help create a hierarchical structure for your content, making it easier for users to navigate through your document.

Google goes on to say:

“Imagine you’re writing an outline. Similar to writing an outline for a large paper, put some thought into what the main points and sub-points of the content on the page will be and decide where to use heading tags appropriately.”

  • Brief explanation: Headings carry weight in how both search engines and users categorize your webpage. The H1 is considered the most important heading on the page (usually the title), followed by H2 for a subsection, H3 for a sub-subsection, and so on down to H6. As a rule, use only one H1 per page.
  • Our recommendation is to maintain a hierarchy and not go deeper than H4. Beyond H4, you are usually too detailed and should consider an e-book instead.
  • And yes, you can use more than one H1 on a portal page where each section links to another page. We have done it and it works just fine. For detailed content pages, though, it is usually better to have a single H1. Google's John Mueller has stated that having more than one H1 doesn't matter.
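The outline rule described above can be illustrated with a small validator. This is our own sketch, not a Google tool: given heading levels in document order, it checks for a single H1 and for skipped levels.

```javascript
// Illustrative check of heading hierarchy: exactly one H1, and no
// level skipped on the way down (e.g. no H2 jumping straight to H4).
function validateOutline(levels) {
  const problems = [];
  if (levels.filter((l) => l === 1).length !== 1) {
    problems.push("expected exactly one H1");
  }
  for (let i = 1; i < levels.length; i++) {
    if (levels[i] > levels[i - 1] + 1) {
      problems.push(`H${levels[i - 1]} jumps to H${levels[i]}`);
    }
  }
  return problems;
}

console.log(validateOutline([1, 2, 3, 2, 3])); // []
```

As the bullets note, a portal page with multiple H1s can still be fine; treat the check as a guideline, not a hard rule.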

Images

Google says:

  • Use HTML image elements
  • Use the “alt” attribute
  • Use brief but descriptive filenames and alt text
  • Use standard image formats
  • Use an image sitemap

All of these recommendations highlight the fact that optimizing images is an important part of optimizing the content on a page.
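As a quick illustration of the alt-attribute advice, here is a naive audit helper of our own. A real audit would parse the DOM; the regex here is only good enough for a demonstration.

```javascript
// Naive illustration: scan an HTML string for <img> tags missing
// an alt attribute. Regex parsing of HTML is fragile; this is a
// sketch, not a production audit tool.
function findImagesMissingAlt(html) {
  const imgs = html.match(/<img\b[^>]*>/gi) || [];
  return imgs.filter((tag) => !/\balt\s*=/i.test(tag));
}

const sample =
  '<img src="reel.jpg" alt="Movie reel"><img src="logo.png">';
console.log(findImagesMissingAlt(sample)); // [ '<img src="logo.png">' ]
```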


Structured Data

Google says:

“Structured data is code that you can add to your sites’ pages to describe your content to search engines, so they can better understand what’s on your pages. Search engines can use this understanding to display your content in useful (and eye-catching!) ways in search results. That, in turn, can help you attract just the right kind of customers for your business.”

  • Brief explanation: Structured data helps to further clarify to the search engine what the content on the page is about. It also has the benefit of enhancing the snippets in the search results in many cases. This can improve your click-through rate from the search results.

How important is it? It probably depends. But Google’s John Mueller says it’s an “extremely light signal.” In a since-deleted tweet:

“What about the non-RR SD that’s not absolutely clear from the page? It can be helpful, but it’s also limited in the extra value it provides. How do you rank something purely from SD hints? It’s an extremely light signal. If you’re worried, make the content more obvious.”
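To show what structured data looks like in practice, here is a sketch that builds JSON-LD for an article and the script tag that would carry it. The field values are made up for illustration; see schema.org's Article type for the full vocabulary.

```javascript
// Sketch: build Article JSON-LD structured data and the <script>
// tag that carries it. Field values are hypothetical.
function articleJsonLd({ headline, author, datePublished }) {
  const data = {
    "@context": "https://schema.org",
    "@type": "Article",
    headline,
    author: { "@type": "Person", name: author },
    datePublished,
  };
  return '<script type="application/ld+json">' +
    JSON.stringify(data) +
    "</script>";
}

console.log(articleJsonLd({
  headline: "Validate Your SEO Advice",
  author: "Example Author",
  datePublished: "2024-02-12",
}));
```

The resulting tag goes in the page's HTML; Google's Rich Results Test can then confirm the markup is readable.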


Site Structure and Navigation

How you structure your website can impact both visitors to your site and the search engine’s ability to determine relevance. Google says:

The navigation of a website is important in helping visitors quickly find the content they want. It can also help search engines understand what content the website owner thinks is important. Although Google’s search results are provided at a page level, Google also likes to have a sense of what role a page plays in the bigger picture of the site.

Google goes on to say:

“Create a simple directory structure. Use a directory structure that organizes your content well and makes it easy for visitors to know where they’re at on your site. Try using your directory structure to indicate the type of content found at that URL.”

And:

“Make it as easy as possible for users to go from general content to the more specific content they want on your site. Add navigation pages when it makes sense and effectively work these into your internal link structure. Make sure all of the pages on your site are reachable through links, and that they don’t require an internal “search” functionality to be found. Link to related pages, where appropriate, to allow users to discover similar content.”

  • Brief explanation: SEO siloing is a search engine optimization technique that structures a website’s content by grouping related webpages together in hierarchical categories based upon how people search. SEO siloing shapes the directories of your website and its internal linking structure to help Google and visitors better find the content.
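One small way to surface a siloed directory structure to users is a breadcrumb trail derived from the URL path. The category names below are hypothetical; this is our own illustration of the directory-structure advice, not a Google recommendation.

```javascript
// Illustration: derive a breadcrumb trail from a siloed URL path.
// The path segments are hypothetical.
function breadcrumbFromPath(pathname) {
  const parts = pathname.split("/").filter(Boolean);
  return parts.map((part, i) => ({
    label: part.replace(/-/g, " "),
    href: "/" + parts.slice(0, i + 1).join("/") + "/",
  }));
}

console.log(breadcrumbFromPath("/seo/on-page/title-tags/"));
```

Each crumb links one level up the silo, which reinforces the general-to-specific navigation Google describes.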


“Create a navigational page for users, a sitemap for search engines.”

  • Brief explanation: As mentioned earlier, an XML sitemap is for search engines, while an HTML site map page gives users a bird's-eye view of the content on your site and improves the user experience.

“Show useful 404 pages.”

  • Another user experience-focused recommendation is to have a 404 page that will render if a link is broken on your site. This page can serve up other relevant content or suggest a next step to the visitor.
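The "useful 404" idea can be sketched as a tiny routing function: when a path is unknown, return a 404 with suggested related pages instead of a dead end. The routes and matching logic below are simplified and hypothetical.

```javascript
// Sketch of a "useful 404": unknown paths get a 404 response with
// suggestions from the same section. Routes are hypothetical.
function respond(pathname, routes) {
  if (routes.includes(pathname)) {
    return { status: 200, body: `Content for ${pathname}` };
  }
  const topDir = "/" + (pathname.split("/").filter(Boolean)[0] || "");
  const suggestions = routes.filter((r) => r.startsWith(topDir));
  return {
    status: 404,
    body: "Page not found. You might be looking for: " +
      (suggestions.join(", ") || "/"),
  };
}

const routes = ["/seo/", "/seo/titles/", "/ppc/"];
console.log(respond("/seo/title/", routes).status); // 404
```

The key point is the body: a helpful 404 serves next steps, not just an error code.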


Content

Content is one of the most important things to get right when it comes to SEO. Remember, Google wants only the highest quality webpages in its search results. So creating quality content can enhance your ability to compete in the search results.

Google says:

Creating compelling and useful content will likely influence your website more than any of the other factors discussed here. Users know good content when they see it and will likely want to direct other users to it. This could be through blog posts, social media services, email, forums, or other means. Organic or word-of-mouth buzz is what helps build your site’s reputation with both users and Google, and it rarely comes without quality content.

Google goes on to say:

“Know what your readers want (and give it to them). Think about the words that a user might search for to find a piece of your content.”

  • Brief explanation: Doing good keyword research will lay the groundwork for your content creation efforts. Making sure you target the right keywords can make or break your SEO strategy. And then, of course, optimize well.

In a 2021 Google SEO Office Hours session, John Mueller reiterated where to put keywords on the page so Google will understand what the page is about.

“Act in a way that cultivates user trust. Users feel comfortable visiting your site if they feel that it’s trustworthy. A site with a good reputation is trustworthy. Cultivate a reputation for expertise and trustworthiness in a specific area.”

And:

“Make expertise and authoritativeness clear. Expertise and authoritativeness of a site increases its quality. Be sure that content on your site is created or edited by people with expertise in the topic.”

  • Brief explanation: Both of the statements outlined above refer to what Google calls "expertise, authoritativeness and trustworthiness" in its Search Quality Evaluator Guidelines. Each of these factors contributes to the overall quality of a webpage and a website, according to Google. Creating content with these in mind can help you compete in the search results.


“Provide an appropriate amount of content for your subject.”

  • Brief explanation: Content takes time and effort. Rather than just guess how much content is appropriate for the topic, be sure to find out what the top-ranked pages are doing for the keyword or topic you are writing about.


Mobile-Friendly Websites

A mobile-friendly website creates a good experience for people who visit a website from a smartphone or tablet.

Google says:

Most people are searching on Google using a mobile device. The desktop version of a site might be difficult to view and use on a mobile device. As a result, having a mobile-ready site is critical to your online presence.

  • Brief explanation: Google completed its switch to mobile-first indexing in 2021. That means it analyzes and ranks the mobile version of websites, not the desktop version as it historically did. In addition, mobile friendliness is a factor in Google's page experience ranking update. So having a mobile-friendly website is critical to compete in the search results today.


Hiring an SEO

Hiring the right SEO professional or agency can pay dividends. Hiring the wrong one can harm your website. Google weighs in on this topic as well:

An SEO (“search engine optimization”) expert is someone trained to improve your visibility on search engines. By following this guide, you should learn enough to be well on your way to an optimized site. In addition to that, you may want to consider hiring an SEO professional that can help you audit your pages. Deciding to hire an SEO is a big decision that can potentially improve your site and save time. Make sure to research the potential advantages of hiring an SEO, as well as the damage that an irresponsible SEO can do to your site.

I disagree that simply reading Google's starter guide is enough direction to compete in the search results. SEO is quite literally a full-time job that not only takes a lot of work but also a lot of learning. Search engines change on a dime, and we must continuously research, test and inform our efforts.

This is just a sample of the advice that Google gives on how to improve a website. Showing internal stakeholders this and other advice from Google, like on its Google Search Central channel on YouTube, can help support your ongoing SEO efforts and the ability to get things implemented.

Our SEO experts can work with you to develop an SEO program that gets better results — more qualified traffic, better search ranking and increased revenue. Contact us today to schedule a FREE 1:1 consultation.

FAQ: How does validating SEO advice with Google’s recommendations ensure it gets implemented?

The best way you can ensure your SEO strategies get implemented correctly is to validate the advice you give with Google’s own recommendations.

Google holds valuable insights into how it ranks websites and what factors influence search visibility. By aligning your strategies with their recommendations, you can increase the likelihood of success. Let’s dive deeper to better understand the significance and benefits of validating SEO advice with Google.

The Authority of Google’s Recommendations: Google’s recommendations are based on extensive research, data and algorithms designed to provide the best user experience. As a leading search engine, Google has a keen interest in guiding webmasters and marketers to follow SEO practices that align with their search algorithm.

SEO Best Practices: Google offers detailed guidelines on various aspects of SEO, such as improving website speed, creating quality content, optimizing metadata and utilizing structured data. You are more likely to achieve better search visibility and higher rankings by adhering to these best practices.

Improving User Experience: Google’s focus has always been on delivering the most relevant and useful results to its users. When you align your SEO strategies with Google’s recommendations, it ensures a better user experience on your website, leading to improved engagement and conversions.

Staying Within Guidelines: Google periodically updates its search algorithm to provide better search results and combat spammy tactics. By validating SEO advice with Google’s recommendations, you minimize the risk of employing tactics that may be considered manipulative or against their guidelines. This helps in building long-term, sustainable SEO strategies.

Avoiding Penalties: If you implement SEO strategies that do not align with Google’s guidelines, there’s a chance your website could be penalized. Penalties can lead to a significant drop in organic traffic and rankings, making it crucial to adhere to Google’s recommendations and avoid any tactics that may be regarded as spammy or deceptive.

Trustworthiness and Credibility: Following Google’s advice establishes trust and credibility with both search engines and users. When search engines view your website as trustworthy and relevant, they are more likely to rank it higher in search results. Users also tend to trust websites that adhere to best practices, resulting in increased clicks and conversions.

Staying Ahead of the Curve: The landscape of SEO is constantly evolving. Validating your SEO advice with Google’s recommendations keeps you informed about the latest trends and updates. This allows you to adapt your strategies accordingly, ensuring that you remain competitive and maintain a strong online presence.

Continuous Improvement: Google’s recommendations act as a benchmark for SEO. Use these benchmarks to identify areas of improvement and adjust your approach accordingly. This iterative process helps you continuously enhance your website’s search visibility and overall performance.

Step-by-Step Procedure:

  1. Familiarize yourself with Google’s Webmaster Guidelines, which cover essential aspects of SEO practices.
  2. Regularly visit Google’s official blog and webmaster-focused forums for updates on algorithm changes and new recommendations.
  3. Review Google’s recommendations regarding website speed optimization and analyze your site’s performance using tools like PageSpeed Insights.
  4. Optimize your website’s metadata, including title tags and meta descriptions, based on Google’s guidelines for improved visibility in search results.
  5. Ensure your site is mobile-friendly by following Google’s mobile-friendly guidelines and use their Mobile-Friendly Test tool to assess your website’s compatibility.
  6. Implement structured data markup to help search engines understand your content and enhance its appearance in search results. Refer to Google’s Structured Data Guidelines for detailed instructions.
  7. Focus on creating high-quality, unique and valuable content that aligns with Google’s content guidelines. Emphasize relevance, usefulness and originality.
  8. Optimize your website’s internal linking structure and navigation to improve user experience and facilitate search engine crawling.
  9. Monitor Google Search Console regularly to identify any technical issues or penalties affecting your website’s performance. Resolve these issues promptly following Google’s recommendations.
  10. Conduct regular keyword research using tools like Google Keyword Planner to identify relevant keywords and optimize your content accordingly.
  11. Leverage Google Analytics to track and analyze website performance metrics. Use these insights to identify areas for improvement and align your strategies with Google’s recommendations.
  12. Engage with the SEO community and attend industry conferences, where you can learn from experts who also adhere to Google’s recommendations.
  13. Stay vigilant for algorithm updates and refresh your knowledge by reviewing Google’s documentation regarding these updates and their impact on SEO strategies.
  14. Test and iterate your SEO strategies based on the data and insights you gather. Regularly review your website’s performance in search results and make adjustments as necessary.
  15. Engage in ethical link building practices that follow Google’s guidelines to establish authority and credibility within your industry.
  16. Regularly perform website audits to identify and resolve any issues affecting your website’s search visibility. Consider using crawling tools like Screaming Frog or DeepCrawl for comprehensive analysis.
  17. Stay informed about Google’s advancements in AI and machine learning, as these technologies can influence SEO strategies in the future.
  18. Document and track changes made to your website based on Google’s recommendations. This allows you to measure the impact of these changes on your SEO performance.
  19. Continuously educate yourself and your team about SEO best practices by exploring Google’s free resources, attending webinars and reading reputable industry blogs.
  20. Remember that while Google’s recommendations are a valuable benchmark, it’s essential to analyze your specific audience and business goals to tailor your SEO strategies accordingly.


Improve Your Website Performance in Two Simple Steps
Published Thu, 21 Dec 2023
https://www.bruceclay.com/blog/improve-website-performance-two-simple-steps/

Learn how to boost your website's performance with simple steps: externalize CSS and JavaScript. Speed up your site and improve search rankings effortlessly.

The post Improve Your Website Performance in Two Simple Steps appeared first on Bruce Clay, Inc..

Man updating website on a laptop in an office.

So you want a better-performing website. At the most basic level, you need to make sure that search engines can access the body content on a webpage as quickly as possible and that the page loads fast.

To that end, there are two simple SEO best practices that can help with both of those requirements:

  1. Externalizing CSS (cascading style sheets)
  2. Externalizing JavaScript

In this article, I’ll explain why and give some simple steps to get started.

What Is CSS?

Cascading Style Sheets (CSS) describe how HTML code should be displayed on a webpage to create the look and feel of a website, for example, fonts and colors.

What Is JavaScript?

JavaScript is a programming language that enables interactivity on webpages, for example, a search box, audio and video, or maps.

Why Externalize JavaScript and CSS?

You want your website code to be search engine friendly. So, you need to be sure that the underlying code makes it easy for search engine spiders to crawl and understand what the webpages are about.

This needs to happen so search engines can determine relevance to a search query. One of the first things search engines should encounter is the body content of a webpage, not unnecessary lines of code.

You also want your website to be fast. Search engines like Google care about webpage performance for user experience — so much so that they released their page experience algorithm update with ranking signals devoted to it.

Both CSS and JavaScript can clutter up a webpage, make it slower to load and harder for search engines to crawl. You want the actual body content on a webpage to be accessible in the first hundred lines of code.

Externalizing these files is an easy way to remedy the problems I just mentioned. Doing this can speed up page load time, significantly help rankings and save crawl budget.

Benefits of Externalizing CSS

Creating an external CSS file gives you one place to control the look of the website, so it’s much more efficient than editing every single page of a website when you want to make a change.

When you have a CSS file, you only need to make changes to the external file and those changes are applied to the entire site.

Having an external CSS file has other benefits, too. It allows you to remove inline formatting, such as font tags, and replace them with CSS tags that instruct what style to apply. This results in less code cluttering the webpage.

Less code means smaller file sizes. Smaller file sizes mean web pages load faster.

Benefits of Externalizing JavaScript

Creating an external file for JavaScript has similar benefits. When you move the JavaScript off individual webpages and into an external file, your webpages only need a single line of code that calls the JavaScript file for information.

JavaScript tends to be long and cumbersome, so doing this one simple thing could cut the size of a webpage in half.

Are Your JavaScript and CSS Externalized Already?

It's easy to check whether your CSS and JavaScript are externalized. Go to your website's homepage and look at the source code. To view the source, right-click on the page and select "View page source."

Screenshot of “view page source” on BruceClay.com.
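The manual view-source check above can also be scripted. Here is a quick-and-dirty counter of our own for inline script and style blocks in an HTML string; regex parsing of HTML is fragile, so treat it as a rough audit aid, not a crawler.

```javascript
// Rough audit aid: count inline <script> blocks (no src attribute)
// and <style> blocks in an HTML string.
function countInlineBlocks(html) {
  const scripts = (html.match(/<script\b(?![^>]*\bsrc=)[^>]*>/gi) || []).length;
  const styles = (html.match(/<style\b[^>]*>/gi) || []).length;
  return { inlineScripts: scripts, styleBlocks: styles };
}

const page =
  '<script src="/app.js"></script><script>var x=1;</script>' +
  "<style>h1{color:red}</style>";
console.log(countInlineBlocks(page)); // { inlineScripts: 1, styleBlocks: 1 }
```

High counts on a page are a hint that externalization (described below for JavaScript and CSS) is worth the effort.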

Here is an example of a sample of code from non-externalized JavaScript on a webpage:

<script language="javascript">
var _pn="Your+No-Fault+Rights"; //page name(s)
var _mlc="No+Fault+Advice"; //multi-level content category
var _cp="null"; //campaign
var _acct="WE531126G4MC09EN3"; //account number(s)
var _pndef="title"; //default page name
var _ctdef="full"; //default content category
var _prc=""; //commerce price
var _oid=""; //commerce order
var _dlf=""; //download filter
var _elf=""; //exit link filter
var _epg=""; //event page identifier
</script>

This is just a sample of how JavaScript can take up valuable space within the page code. It's common for websites to have many lines (30+) of JavaScript that should be externalized.

Alternatively, here is what it would look like with an external JavaScript file:

<script type="text/javascript" src="/sample.js"></script>

Here is an example of a non-externalized style sheet:

.content {
  clear: left;
  background-color: #ffffff;
  background-image: url("/images/movie_reel2.jpg"); /* small reel */
  /* background-image: url("/images/movie_reel.JPG"); large reel */
  background-position: 100% 100%; /* lower right corner */
  background-repeat: no-repeat;
  border: 2px solid #666666;
  border-style: solid solid none solid;
  padding: .5em 1em 1em 1em;
  margin-bottom: 0em;
  margin-top: 0em;
  text-align: left;
}

h1 {
  font-family: Georgia, Times New Roman, Times, serif;
  font-size: 18px;
  font-style: italic;
  font-weight: bold;
  color: #003399;
  text-align: center;
}

It is common for websites to have CSS hardcoded into a webpage rather than including it in a single file to be called from numerous pages within a website.

Here is what it would look like with an external CSS file:

<link rel="stylesheet" href="styles.css">

How to Externalize JavaScript

The first thing to mention is that sometimes it’s necessary to have JavaScript code on a webpage, for example, for accurate tracking or page functionality.

Aside from tracking and functionality, not all JavaScript is detrimental to page speed, and not all of it should be externalized.

In fact, in some cases using inline JavaScript can improve page load time and performance, for instance, when an external JavaScript file would otherwise block rendering of the page.

Using inline JavaScript at the top of the page can make content at the top of the page visible to users without making them wait for large JavaScript files to load.

So, here’s how to externalize JavaScript:

  1. Identify the JavaScript code that you want to externalize by locating the opening and closing <script> tags within the HTML source code of the webpage.
  2. Cut the JavaScript code between <script> and </script>.
  3. Using a text editor like Notepad, create a new document and paste the JS code into the new blank document.
  4. Save the file with the file extension “.js”.
  5. Upload the file to your server and make a note of its path.
  6. Go back to your original html file and insert the following, where “path/filesource.js” is the URL of the newly created .js file:
    <script language="JavaScript" src="path/filesource.js"></script>
  7. Now, the embedded JavaScript code is replaced with just one line.
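The steps above can be sketched as a small transform: pull the inline JavaScript out of an HTML string and replace it with a single external reference. This is our own simplified illustration; a real build step would handle multiple blocks and many edge cases.

```javascript
// Sketch of the externalization steps: extract the first inline
// <script> block and replace it with one external reference.
function externalizeScript(html, src) {
  const match = html.match(/<script\b[^>]*>([\s\S]*?)<\/script>/i);
  if (!match) return { html, js: "" };
  const rewritten = html.replace(
    match[0],
    `<script src="${src}"></script>`
  );
  return { html: rewritten, js: match[1].trim() };
}

const page = "<head><script>var _pn = 'example';</script></head>";
const result = externalizeScript(page, "/sample.js");
console.log(result.html); // <head><script src="/sample.js"></script></head>
console.log(result.js);   // var _pn = 'example';
```

The extracted `js` string is what you would save as the `.js` file and upload to your server in steps 3 to 5.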

When search engines crawl the webpage, they will only have one line of code to read before they go on to the rest of the page.

In turn, this can help with Google’s page experience ranking signals, specifically First Input Delay. (Note that FID will be replaced by Interaction to Next Paint (INP) in March 2024.)

How to Externalize CSS

To externalize style sheets, simply follow the same instructions as for the JavaScript file, except save the file with a .css file extension. In the original webpage code you’ll replace all the CSS coding with the following:

<link href="cssfilename" rel="stylesheet" type="text/css">

Once again you are left with one line of code on your original page, allowing search engine spiders to index the page more easily.
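The same idea can be sketched for CSS: cut the rules out of an inline <style> block and reference them with a single <link> line. The file name "styles.css" here is illustrative.

```javascript
// Sketch of externalizing CSS, assuming the page markup is a string.
const pageWithCss = `<head><style>
h1 { color: #003399; }
</style></head>`;

// Cut the rules out of the inline <style> block (save them as styles.css).
const cssRules = pageWithCss.match(/<style>([\s\S]*?)<\/style>/)[1].trim();

// Replace the block with a single external reference.
const slimmed = pageWithCss.replace(
  /<style>[\s\S]*?<\/style>/,
  '<link href="styles.css" rel="stylesheet" type="text/css">'
);
```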

Small Steps, Big Impact

Externalizing JavaScript and CSS won’t fix all of your website’s performance problems, but it’s a great start.

One client we worked with implemented these two best practices and reduced 20,000 lines of code to 1,500. The website subsequently saw a significant improvement in rankings and moved to the top of organic search results for many keywords.

So, yes, it is effective and a good place to start as you are improving the performance of your web pages.

Need help boosting your website’s speed and search ranking? Contact us for a free consultation.

FAQ: How can I enhance my website’s performance using two simple steps involving CSS and JavaScript?

Optimizing your website’s performance is crucial for user satisfaction and search engine ranking in the dynamic landscape of web development. Leveraging CSS and JavaScript effectively can significantly elevate your site’s speed and functionality. Let’s delve into two simple yet powerful techniques to boost your website’s performance.

Minify and Concatenate CSS and JavaScript Files

When it comes to optimizing website speed, reducing file sizes is paramount. Combining multiple CSS or JavaScript files into one minimizes HTTP requests, enhancing load times. Minification removes unnecessary characters (whitespace, comments) without altering code functionality.

Insightful Tip: Employ build tools like Grunt or Gulp for automated minification and concatenation processes. Consider using Content Delivery Networks (CDNs) for quicker file delivery.
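As a rough illustration of what minification does, the sketch below strips comments and collapses whitespace from a stylesheet. Real minifiers, such as the ones Grunt or Gulp plugins run, go much further (renaming variables, removing dead code); this only shows the basic idea.

```javascript
// Naive minifier sketch: removes /* */ comments and collapses whitespace.
function naiveMinifyCss(css) {
  return css
    .replace(/\/\*[\s\S]*?\*\//g, '')  // strip comments
    .replace(/\s+/g, ' ')              // collapse runs of whitespace
    .replace(/\s*([{}:;,])\s*/g, '$1') // drop spaces around punctuation
    .trim();
}

const source = `
/* heading style */
h1 {
  color: #003399;
  text-align: center;
}`;
console.log(naiveMinifyCss(source)); // h1{color:#003399;text-align:center;}
```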

Implement Lazy Loading for Non-Critical Resources

Lazy loading postpones the loading of non-essential resources until they’re needed. For instance, images below the fold or secondary JavaScript can be loaded asynchronously, boosting initial page load speed.

Insightful Tip: Utilize the `loading="lazy"` attribute for images to instruct browsers to load them only when they come into the viewport, optimizing user experience and load times.
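A minimal sketch of the idea, with placeholder file names: only images below the fold get the lazy attribute, while critical above-the-fold images load immediately.

```javascript
// Emit loading="lazy" only for below-the-fold images. The image file
// names and the aboveFold flags are illustrative placeholders.
const images = [
  { src: 'hero.jpg', aboveFold: true },       // critical, load immediately
  { src: 'gallery-1.jpg', aboveFold: false }, // defer until near viewport
];

const tags = images.map(img =>
  `<img src="${img.src}"${img.aboveFold ? '' : ' loading="lazy"'} alt="">`
);
console.log(tags[1]); // <img src="gallery-1.jpg" loading="lazy" alt="">
```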

Buyer Intent Search Terms and Their Role

Understanding buyer intent search terms is pivotal for effective optimization. Terms like “website speed improvement techniques,” “CSS and JavaScript optimization,” or “lazy loading implementation” showcase a user’s intent to enhance website performance. Integrating these terms into your content aligns with user queries, improving visibility.

The Impact of Performance on User Experience and SEO

Optimized website performance directly influences user experience, increasing engagement and reducing bounce rates. Moreover, search engines prioritize faster-loading sites, positively impacting SEO rankings.

Balancing Aesthetics with Performance

While optimizing performance is crucial, maintaining a visually appealing website is equally vital. Finding the balance between aesthetics and functionality ensures an engaging user experience without compromising speed.

The Role of Continuous Monitoring and Testing

Regularly monitoring website performance metrics and conducting tests is key to sustaining optimal functionality. Tools like Google PageSpeed Insights and Lighthouse assist in identifying performance bottlenecks, enabling timely adjustments.

Future Trends: Evolving Strategies for Optimal Performance

As technology advances, emerging trends like HTTP/3 and enhanced JavaScript frameworks continue to reshape performance optimization strategies. Staying updated on these trends ensures your website remains competitive and well-optimized.

Enhancing your website’s performance through CSS and JavaScript optimizations is a cornerstone of successful web development. By employing techniques like file minification, lazy loading and staying attuned to evolving trends, you’re enriching user experience and bolstering your site’s visibility in the digital realm.

Step-by-Step Procedure: Enhancing Website Performance

  1. Assess Current Performance Metrics: Use tools like Google PageSpeed Insights to identify areas for improvement.
  2. Implement CSS and JavaScript Minification: Utilize build tools or online services to compress and combine files.
  3. Consider Content Delivery Networks (CDNs): Opt for CDNs for faster file delivery.
  4. Evaluate Lazy Loading Opportunities: Identify non-critical resources suitable for lazy loading.
  5. Implement Lazy Loading: Integrate the `loading="lazy"` attribute for images or asynchronous loading of secondary scripts.
  6. Research Buyer Intent Search Terms: Understand user queries related to website performance.
  7. Integrate Buyer Intent Keywords: Incorporate relevant terms in your content for improved visibility.
  8. Monitor Performance Metrics: Regularly analyze metrics and adjust strategies accordingly.
  9. Balance Aesthetics and Functionality: Ensure visual appeal without compromising speed.
  10. Stay Updated on Emerging Trends: Follow developments in web optimization for future-proofing strategies.

This comprehensive step-by-step guide provides a detailed roadmap to effectively enhance your website’s performance using CSS and JavaScript optimization techniques. Following these steps ensures a streamlined, high-performing website that resonates with user expectations and search engine algorithms.

The post Improve Your Website Performance in Two Simple Steps appeared first on Bruce Clay, Inc..

]]>
https://www.bruceclay.com/blog/improve-website-performance-two-simple-steps/feed/ 12
7 SEO Best Practices You Can’t Ignore if You Want to Rank in the Organic Search Results https://www.bruceclay.com/blog/seo-best-practices-you-cant-ignore/ https://www.bruceclay.com/blog/seo-best-practices-you-cant-ignore/#comments Wed, 13 Sep 2023 15:58:18 +0000 https://www.bruceclay.com/?p=199025 SEO is a fast-moving industry that is always evolving. One Google announcement, a current event or a change in the competitive landscape can alter how you go about your SEO strategy in an instant. But, we do have best practices that stand the test of time. The way we go about doing those best practices […]

The post 7 SEO Best Practices You Can’t Ignore if You Want to Rank in the Organic Search Results appeared first on Bruce Clay, Inc..

]]>
Data and analytics displayed on laptop, tablet and phone.

SEO is a fast-moving industry that is always evolving. One Google announcement, a current event or a change in the competitive landscape can alter how you go about your SEO strategy in an instant.

But, we do have best practices that stand the test of time. The way we go about doing those best practices might evolve, but they are still rooted in the fundamentals of good SEO. And, when following these best practices, you can better weather any storms that may come your way.

Here are seven SEO best practices you can’t ignore if you want to compete in the organic search results.

  1. Create the Right Type of Content
  2. Meet or Beat the Top-Ranked Content
  3. Create a Good User Experience
  4. Optimize Your Images
  5. Silo Your Website
  6. Focus on Link Earning Not Link Building
  7. Manage Duplicate Content

FAQ: How does duplicate content impact search rankings and what types of duplicate content should be managed? 

1. Create the Right Type of Content

Every search query/keyword has a different intent behind it — what the search engine user is trying to do. Google knows this and serves up different types of content to meet those needs.

There will always be the blue links, which lead to webpages. But often, there are other types of content as well, like video, images and much more. This is what we call engagement objects – SERP features that engage and ultimately make money for Google.

An engagement object is a feature shown on a search engine results page (SERP) that falls outside of the traditional organic search results (i.e., the blue links).

Google Search results showing different types of SERP features for the query “how to get Kool-Aid out of carpet.”

Searchmetrics keeps track of the most common SERP features that show up throughout the year with its SERP Features Monitor.

Screenshot of SERP Features Monitor from Searchmetrics.
Image source: SERP Features Monitor, Searchmetrics.com

So, how do you create and optimize the right type of content to match the search query? Through what we call a whole-SERP SEO strategy.

A whole-SERP SEO strategy analyzes the features that show up most in the search results for target keywords and then optimizes for them.

The first step is to take the keywords you want to rank for, then analyze the content in the search results that is showing up for them. Is it mostly blue links? Are there videos? Images? What else?

Google search results for the query “cute hamsters.”

This will help you set the content strategy for the type of content you are going to create. A whole-SERP SEO strategy gives you a roadmap for the type of content you need in your SEO program.

This strategy can also help combat the phenomenon of “zero clicks.” A zero-click search result happens when Google is able to answer a search query or facilitate an action right within the search results page.

2. Meet or Beat the Top-Ranked Content

Knowing what type of content to create is the first step. How you create and optimize the content for search engines and users is the next step.

SEO is a game of being the least imperfect. I say least imperfect because no one is going to optimize a piece of content precisely to Google’s algorithms. So, all content in the search results is imperfect when it comes to optimizing.

That said, the goal is to be least imperfect compared to your competition. All SEO programs should work to beat the competition, not the algorithm.

Here, you want to understand what makes the top-ranked content for your keyword tick. Start analyzing the top results for each keyword. Of course, you could do this manually, but SEO tools are going to save you a lot of time and effort here.

Google search results for the query “surf lessons ventura county.”

For example, you could use an SEO tool like our Multi-Page Information tool (free version) and see the on-page SEO factors of multiple competitors.

Screenshot of data from the Bruce Clay SEOToolSet’s free Multi-Page Information tool.

Or, if you are using a WordPress site, you can use our WordPress SEO plugin to get real-time data on the top-ranked pages for your keywords.

That means customized SEO data for your content versus following best practices that are typically generic.

It also means knowing how many words to include in your meta tags and your body content, plus the readability score — all based on the top-ranked content.

Screenshot of Bruce Clay SEO Plugin for WordPress dashboard showing content ranking.
Screenshot of Bruce Clay SEO Plugin dashboard showing keyword ranking and traffic data.

These types of tools will help you understand how to optimize the content you are creating. But you should also look closer at the nature of the top-ranked content as well before you start writing.

Google values experience, expertise, authoritativeness and trustworthiness (E-E-A-T) as outlined in its Search Quality Evaluator Guidelines. A component of E-E-A-T is to have shared attributes in the information you are sharing with the top-ranked or highest-quality webpages on the topic.

In other words, Google says in its Search Quality Evaluator Guidelines:

Very high quality MC should be highly satisfying for people visiting the page. Very high quality MC shows evidence of a high level of effort, originality, talent, or skill. For informational pages, very high quality MC must be accurate, clearly communicated, and consistent with well-established expert consensus when it exists. Very high quality MC represents some of the most outstanding content on a topic or type that’s available online. The standards for Highest quality MC may be very different depending on the purpose, topic, and type of website.

I discussed what this means practically in The Complete Guide to the Basics of Google’s E-E-A-T.

For instance, say you have content that states that blueberries can cure cancer. Even if you feel you have the authority to make this claim, you will not be considered an expert for a YMYL (Your Money or Your Life) query about cancer, because the claim is not supported elsewhere.

And don’t forget: Once you have created a great piece of content, don’t skimp on the headline. A good headline can get you more clicks and drive more traffic than a lackluster one.

Much of the advice and tools I’ve discussed so far apply to getting data for and optimizing standard web pages (the blue links). If you are up against videos, for example, you will also need to examine them closely and think about your YouTube SEO efforts.

3. Create a Good User Experience

Once a person reaches your website from the organic search results, will they have a good experience?

You should care about user experience because you want to make sure you get the most out of the traffic that you send to your website. If all those efforts lead to a bad webpage and the user quickly leaves, then you have wasted your time and money.

Google wants to make sure websites are providing a good user experience, too. So Google has developed ranking signals to ensure only the websites that provide the best experience will compete on Page 1 of the search results.

One thing that Google may look at is when a large percentage of users from the search results go to your webpage and then immediately click back to the search results. This could be an indication of a poor user experience and may impact your future rankings.

Then you have the page experience algorithm update, which hit in 2021 and combines pre-existing ranking signals such as:

  • Mobile-friendliness
  • HTTPS (secure websites)
  • Non-intrusive interstitials

… with new rankings signals that include what Google calls “core web vitals.” Core web vitals look at things like:

  • Page load performance
  • Responsiveness
  • Visual stability
Screenshot of Google's search signals for page experience, including core web vitals.
Image source: “Evaluating page experience for a better web,” Google Webmaster Central Blog

There is much to do in this area to optimize a website for user experience. You can download our e-book: Google’s Page Experience Update: A Complete Guide, to learn more about how to get your website up to speed.

Cover of the e-book "Google's Page Experience Update: A Complete Guide" by Bruce Clay.

4. Optimize Your Images

You need to optimize all your content assets so that they have the opportunity to rank. That includes images.

Visual search and Google Images have been a focal point for Google for some time. More and more images are showing up in response to search queries. seoClarity reports that in 2021, more than 55% of keywords result in image results.

Google wants to rank great images, but it also wants to ensure those images are within the context of great content, too. I wrote about this in an earlier article on how to improve image search ranking:

We’ve all had the experience of finding an image and clicking through to a not-so-great webpage. To prevent this, the Google Images algorithm now considers not only the image but also the website where it’s embedded.

Images attached to great content can now do better in Google Images. Specifically, the image-ranking algorithm weighs these factors (besides the image itself):

Authority: The authority of the webpage itself is now a signal for ranking an image.

Context: The ranking algorithm takes into account the context of the search. Google uses the example of an image search for “DIY shelving.” Results should return images within sites related to DIY projects … so the searcher can find other helpful information beyond just a picture.

Freshness: Google prioritizes fresher content. So ranking images will likely come from a site (a site in general, but we believe the actual webpage in question) that’s been updated recently. This is probably a minor signal.

Position on page: Top-ranked images will likely be central to the webpage they’re part of. For example, a product page for a particular shoe should rank above a category page for shoes.

Of course, there are all sorts of optimization techniques you can do to improve image ranking. Read more here for 17 important ways you can optimize your images for search, which includes:

  1. Tracking image traffic
  2. Creating high-quality, original content
  3. Using relevant images
  4. Having a proper file format
  5. Optimizing your images
  6. Always creating Alt text
  7. Making use of the image title
  8. Creating an image caption
  9. Using a descriptive file name
  10. Implementing structured data
  11. Considering image placement on the page
  12. Analyzing the content around the image
  13. Being careful with embedded text
  14. Creating page metadata
  15. Ensuring fast load time
  16. Making sure images are accessible
  17. Creating an image sitemap

5. Silo Your Website

Creating and optimizing quality content is really important. But just as important is how you organize all the content on your website.

Google has indicated more than once that it not only looks at the quality of a webpage but also the site as a whole when ranking content.

In its Search Engine Optimization Starter Guide, Google says:

Although Google’s search results are provided at a page level, Google also likes to have a sense of what role a page plays in the bigger picture of the site.

In its Search Quality Evaluator Guidelines, Google says it looks at the website as a whole to determine if the website is an authority on topics.

So what does this mean? When someone searches for something on Google, one of the ways that the search engine can determine the most relevant webpage for a search is to analyze not only the webpage but also the overall website.

Google may be looking to see if a website has enough supporting content for the keywords/search terms on the website overall. Enough, clearly organized, information-rich content helps create relevance for a search.

We call this SEO siloing. SEO siloing is a way to organize your website content based on the way people search for your site’s topics. Its goal is to make a site relevant for a search query so that it has a better chance of ranking.

Illustration of a siloed website structured around power tools.

The goal of SEO siloing is to build a library of content around primary and long-tail keywords on your website and then connect them via your internal linking structure.

Google advocates for what SEO siloing does. In its Search Engine Optimization Starter Guide, Google says:

Make it as easy as possible for users to go from general content to the more specific content they want on your site. Add navigation pages when it makes sense and effectively work these into your internal link structure. Make sure all of the pages on your site are reachable through links, and that they don’t require an internal “search” functionality to be found. Link to related pages, where appropriate, to allow users to discover similar content.

There is a lot that goes into SEO siloing, and I recommend reading these articles:

6. Focus on Link Earning Not Link Building

Link building is not a numbers game anymore. Search engines want to see that a website has quality, relevant links.

Google’s John Mueller confirmed this in a video, stating that:

“We try to understand what is relevant for a website, how much should we weigh these individual links, and the total number of links doesn’t matter at all. Because you could go off and create millions of links across millions of websites if you wanted to, and we could just ignore them all.”

You can view that video clip here:

We have seen client websites with fewer but more quality inbound links outperform their competition. So, what does “link earning” look like?

  • Avoiding all those known spammy link-building tactics, mass email requests for links, purchasing links, participating in link farms, etc.
  • Understanding the “gray areas” of what could be considered link spam, such as paid guest posting.
  • Creating quality content that earns relevant links.
  • Getting creative with how you earn links and being diligent about how you maintain them.

You can learn more about how to create a good link-earning SEO program in our e-book, “The New Link Building Manifesto: How To Earn Links That Count.” In it, you’ll find a roadmap for earning quality, relevant links, including 50 ways to earn links safely and effectively.

Cover for the Bruce Clay e-book "The New Link Building Manifesto: How To Earn Links That Count."

7. Manage Duplicate Content

Duplicate content can impact your rankings. And, depending on what type of duplicate content you have on your website, it can even trigger a manual action by Google if it’s considered spam.

There are two types of duplicate content:

  1. Duplicate content involving webpages on your site only
  2. Duplicate content involving webpages on your site and other sites

When you have duplicate content involving your website and other sites, Google may flag this as spam (for example, if your site is scraping or copying content from another).

The good news is that most websites are only dealing with non-spammy duplicate content on their own websites. This is when you have content (two or more webpages) that is the same or similar.

This can impact your rankings. When Google is presented with two of your webpages that appear to be too similar, the search engine will choose the page it believes is the most relevant and filter the other page or pages out of the results.

Google’s Mueller explains in a video:

“With that kind of duplicate content it’s not so much that there’s a negative score associated with it. It’s more that, if we find exactly the same information on multiple pages on the web, and someone searches specifically for that piece of information, then we’ll try to find the best matching page.

So if you have the same content on multiple pages then we won’t show all of these pages. We’ll try to pick one of them and show that. So it’s not that there’s any negative signal associated with that. In a lot of cases that’s kind of normal that you have some amount of shared content across some of the pages.”

You can watch that video clip here:

So, what to do? We’ve found that the most common types of duplicate content are the following:

  • Two site versions
  • Separate mobile site
  • Trailing slashes on URLs
  • CMS problems
  • Meta information duplication
  • Similar content
  • Boilerplate content
  • Parameterized pages
  • Product descriptions
  • Content syndication

And of those common types, we tend to see duplicate meta information as a top culprit. So, it’s important to always create unique meta tags.

If you’re on a WordPress site, you can use our WordPress SEO plugin to help monitor and detect duplicate content issues in your meta tags.

For more on how to address the common types of duplicate content on your website, see Understanding Duplicate Content and How to Avoid It.

Closing Thoughts

These SEO best practices are not the end of your SEO work, but they are the beginning of creating a winning SEO strategy that will respond to any curve ball that comes your way.

Schedule a free 1:1 consultation to learn more about how you can boost your SEO profile and maximize your visibility online. 

FAQ: How does duplicate content impact search rankings, and what types of duplicate content should be managed?

Duplicate content has long been a concern for website owners and digital marketing professionals, who worry that duplicated material could negatively affect search rankings. If duplicates appear within results pages, search engines struggle to provide users with relevant, original information that answers their queries.

When search engines encounter duplicate content, they face a dilemma: they must determine which version is more relevant and deserves priority in ranking. Pages with duplicate content may be filtered out of the results or ranked lower. This can adversely affect website visibility and organic traffic, making it crucial for webmasters to address duplicate content issues.

Website owners should be aware of and manage several types of duplicate content. The first type is identical content found on multiple pages within the same website. This can occur when a website generates multiple URLs for the same content, leading to duplicate versions. Search engines might struggle to decide which URL to prioritize, potentially diluting the page’s ranking potential.

Another type of duplicate content is syndicated or copied content from other websites. While syndication can be a legitimate practice, ensuring that the content is properly attributed and adds value to the website is essential. Otherwise, search engines may consider it duplicate and penalize the website for duplicate content.

Product descriptions and e-commerce websites often face challenges with duplicate content. Similar products may have identical or nearly identical descriptions, leading to duplicate content issues. It is advisable to provide unique, compelling descriptions for each product to avoid these problems and improve search rankings.

Finally, duplicate content can also arise from printer-friendly versions, mobile versions, or session IDs appended to URLs. These variations in URLs can confuse search engines and result in duplicated content. Implementing canonical tags and managing URL parameters can help resolve these issues and ensure search engines understand the preferred version of the content.
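As a sketch of the normalization involved, the function below maps several duplicate URL variants to one canonical form before a 301 redirect or rel=canonical tag would point at it. The parameter names and URLs are illustrative assumptions, not a complete list.

```javascript
// Normalize duplicate URL variants (session IDs, tracking parameters,
// trailing slashes) to one canonical URL. Rules here are illustrative.
function canonicalUrl(raw) {
  const url = new URL(raw);
  // Drop common session/tracking parameters that create duplicate pages.
  ['sessionid', 'utm_source', 'utm_medium'].forEach(p =>
    url.searchParams.delete(p)
  );
  // Normalize the trailing slash so /shoes and /shoes/ become one URL.
  if (url.pathname.length > 1 && url.pathname.endsWith('/')) {
    url.pathname = url.pathname.slice(0, -1);
  }
  return url.toString();
}

console.log(canonicalUrl('https://example.com/shoes/?sessionid=abc123'));
// https://example.com/shoes
```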

To manage duplicate content effectively, website owners should take proactive steps. Conducting regular content audits to identify and address duplicate content is essential. Utilizing tools such as site crawlers and duplicate content checkers can aid in this process by scanning the website for duplicate instances and providing recommendations for improvement.

Once identified, duplicate content issues can be resolved through various means. One approach is consolidating duplicate pages by redirecting or consolidating the content under a single URL. Implementing 301 redirects or rel=canonical tags can help guide search engines to the preferred version of the content and consolidate ranking signals.

For e-commerce websites, ensuring unique product descriptions and optimizing metadata can go a long way in avoiding duplicate content penalties. Additionally, monitoring syndicated content and implementing proper attribution can help maintain a healthy balance between original and duplicate content.

Regularly monitoring website performance, traffic patterns and search engine rankings is crucial for detecting any potential duplicate content issues. Prompt action and continuous improvement will help maintain strong search rankings and enhance the overall user experience.

Duplicate content can significantly impact search rankings by confusing search engines and diluting ranking potential. Various types of duplicate content, such as identical pages within a website, syndicated content and product descriptions, should be managed effectively.

Step-by-Step Procedure:

  1. Conduct a comprehensive content audit to identify instances of duplicate content on your website.
  2. Use site crawlers or duplicate content checkers to scan the website and identify duplicate content.
  3. Prioritize resolving duplicate content issues on the website’s pages before addressing external sources.
  4. For identical content found on multiple pages within the same website, determine the primary URL and implement 301 redirects from the secondary URLs.
  5. Ensure that all syndicated or copied content from other websites is properly attributed and adds value to your website.
  6. Review product descriptions on e-commerce websites and make them unique and compelling for each product.
  7. Regularly monitor the website for printer-friendly versions, mobile versions, or session IDs appended to URLs, and implement canonical tags to indicate the preferred version of the content.
  8. Manage URL parameters to eliminate duplicate content issues caused by session IDs or tracking parameters.
  9. Utilize tools such as site crawlers and duplicate content checkers to scan the website for new instances of duplicate content periodically.
  10. Review the recommendations provided by the tools and implement necessary changes to address duplicate content.
  11. Consolidate duplicate pages by redirecting or consolidating the content under a single URL.
  12. Implement 301 redirects from duplicate pages to the preferred version to guide search engines and consolidate ranking signals.
  13. Ensure that each product on e-commerce websites has a unique description and optimized metadata.
  14. Monitor syndicated content and verify that proper attribution is in place to differentiate it from duplicate content.
  15. Continuously monitor website performance, traffic patterns and search engine rankings to identify any new duplicate content issues.
  16. Take prompt action to resolve duplicate content problems as they arise.
  17. Regularly review and update your content strategy to avoid unintentional creation of duplicate content.
  18. Provide a seamless user experience by eliminating duplicate content, which can confuse and frustrate visitors.
  19. Stay updated with search engine guidelines and best practices to address duplicate content effectively.
  20. Continuously improve your website’s content and ensure it remains unique, relevant and valuable to users.

The post 7 SEO Best Practices You Can’t Ignore if You Want to Rank in the Organic Search Results appeared first on Bruce Clay, Inc..

]]>
https://www.bruceclay.com/blog/seo-best-practices-you-cant-ignore/feed/ 7
Why Press Releases Still Matter to SEO and How to Write a Press Release That Entices Media https://www.bruceclay.com/blog/why-press-releases-still-matter-to-seo-and-how-to-write-one/ https://www.bruceclay.com/blog/why-press-releases-still-matter-to-seo-and-how-to-write-one/#comments Wed, 12 Jan 2022 18:54:03 +0000 https://www.bruceclay.com/?p=116945 A press release is a valuable publicity tool that impacts brand awareness and web traffic. Learn how to write a successful press release with these SEO and media best practices.

The post Why Press Releases Still Matter to SEO and How to Write a Press Release That Entices Media appeared first on Bruce Clay, Inc..

]]>
Sign displays "read all about it," advertising newspapers and magazines.
A press release is a valuable publicity tool. It can impact your brand awareness and website traffic. But can it help with SEO?

A lot has changed since the days SEOs used press releases for stuffing keyword-rich links to their websites. Today, issuing a press release takes a lot of care. You have to keep in mind both SEO and media best practices if you want it to be effective. I’ll explain how in the following sections.

Brief History of Press Releases and SEO

For a while, press releases stuffed with keyword-rich links were the norm. Writing a press release was one way to try to boost SEO value of the URLs on your own website. But, of course, those links were self-serving and not natural.

Google spoke about this tactic on several occasions. Back in 2011, former Google representative Matt Cutts said that “the links in the press releases themselves don’t count for PageRank value, but if a journalist reads the release and then writes about the site, any links in that news article will then count.”

Cutts reinforced this in a 2012 Google Webmaster Help Forum post, writing: “I wouldn’t expect links from press release websites to benefit your rankings.”

Then, in a Google Webmaster Help video from 2012, Cutts went into detail about the difference in value between a press release (at the low end) and an article in the New York Times (at the high end) of the “continuum of content and the quality of that content and what defines the value add for a user.”

However, some evidence suggests that a link from a press release could pass some value as a ranking factor. So SEOs pressed on.

For example, SEOPressor conducted an experiment where they issued a press release. In it, nonsensical anchor text — “leasreepressmm” — linked to Matt Cutts’ blog, and the Matt Cutts blog ranked No. 4 for “leasreepressmm.”

This suggested that the links in a press release had some noticeable effect, especially for noncompetitive and infrequently used search terms.

Google realized keyword-stuffed links in press releases were a problem. So in 2013, Google devalued links coming from press releases. In a Google Webmaster Central Hangout, Search Engine Land’s Barry Schwartz asked Googler John Mueller for more details about the update of the Webmaster Guidelines to include link schemes related to press releases.

“This (update is) just following up with other changes that we’ve made in the past,” Mueller said. “These are links that were essentially placed by the webmasters themselves; that’s something that we would consider … unnatural. A lot of these (press releases) cases kind of fall into that, where essentially … the webmaster is generally creating a bunch of links.”

Links in a press release aren’t “something that an external person is … recommending,” Mueller continued. “It’s more something that webmasters are creating themselves to promote their websites.”

Mueller concluded that “generally speaking, promoting your website is perfectly fine and a reasonable thing to do, but (press release links aren’t considered) natural.”

Today, this policy holds. Certain links in press releases are considered spam. According to Google, these are the links that violate their guidelines:

Links with optimized anchor text in articles or press releases distributed on other sites.

For example:
There are many wedding rings on the market. If you want to have a wedding, you will have to pick the best ring. You will also need to buy flowers and a wedding dress.

Why Press Releases Still Matter

Press releases have been around forever, and they are still good for what they were originally created for: increasing your visibility. If you have something newsworthy to announce, writing a press release still makes sense.

In terms of results, press releases can lead to increased traffic and branding.

Press releases have strong branding value, especially if a journalist turns your press release into an article that will reach the masses and live online.

Plus, press releases can contribute to your business’s and website’s perceived experience, expertise, authoritativeness, and trustworthiness (E-E-A-T) — which are quality metrics from Google.

If a business is getting a lot of mentions, especially in the press, it can help with the authority factor. I wrote about this more in my guide on the basics of E-E-A-T.

Press Releases: SEO Best Practices

In the grand scheme of press releases, links do come into play since companies naturally point back to themselves. For example, “For more information, visit www.Macys.com.”

Here, it’s important to distinguish between navigational and transactional links. (Skip this part if you’re well-versed in SEO jargon already.)

  • Navigational links use a domain name, a company name, or “click here” as the anchor text. They point to an entity and usually take a person to the homepage of a website.
  • Transactional links use keywords in the anchor text. This passes some additional information in the link about why a person would click it, such as “best ski blog” or “buy snowboards here.”

As an SEO best practice, having one navigational link in a press release pointing to your website is OK. But steer clear of transactional links, or at least ensure their link tag includes a specific attribute such as rel=”nofollow” or rel=”sponsored.” This makes it clear to search engines that the link is not trying to pass PageRank value for SEO ranking purposes.
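In HTML, the distinction looks like this (the domain and anchor text are hypothetical examples):

```html
<!-- Navigational link: company name or domain as the anchor text -->
<a href="https://www.example.com/">Example Corp</a>

<!-- Transactional link: keyword anchor text, so it is flagged with
     rel attributes that tell search engines not to pass ranking credit -->
<a href="https://www.example.com/snowboards" rel="nofollow sponsored">buy snowboards here</a>
```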

I’ll note that some press release distribution services still tout the “link-building benefits” of press releases. Some even encourage the use of “keyword-rich anchor text” in those links. But don’t be fooled.

If you have previously used press releases as a link-building tactic to try to influence rankings (with followed links), then you might want to start the process of disavowing potentially harmful links. Google requires that paid links include the proper attribution; see Qualify your outbound links to Google for more details.

Tips for Press Release SEO

Here are some SEO best practices to follow when it comes to press releases:

  • Research keywords: Perform keyword research on the topic you are writing about. Include those keywords in your press release, making sure they show up in the headline and first 250 words.
  • Make it unique: Focus on creating unique content. Press releases are no exception to this general rule. If you create a compelling press release, then it is more likely to get picked up for an article, and that article may earn you quality links.
  • Limit links: As I stated above, use only navigational links in your press releases. In most cases, there will be only one link. There are some exceptions: if a press release announces a merger, for example, it would make sense to link to both companies’ websites. Be sure to use the company name or domain as the anchor text.
  • Nofollow links: Make sure your press release distribution service uses “nofollow” or an equivalent attribute on your links. If they don’t, then it’s time to get a new press release distribution service.
  • Distribute it, then post and share: Use a reputable press release distribution service to send out your article, and they should provide statistics on media outlets that picked it up. Once it’s out in the world, you can share it in your social media channels and also post it (or a summary, if you choose) on your own website with a link to the main PR post. For an example, check out our press page.

Now that you know some SEO dos and don’ts, let’s talk about how to write a really effective PR.

How to Write a Press Release That Entices Media

When a press release gets picked up, it’s not by chance. Press releases that get turned into stories are often written with the publication’s editor and the journalist in mind. They’re relevant, concise, engaging, and error-free.

If you want your own press release to stand out, make sure that you:

  • Create original, engaging content
  • Issue newsworthy press releases
  • Follow formatting best practices
  • Make it engaging

Let’s look at each of these in more detail …

Create Original, Engaging Content

Press release distribution sites send out thousands of press releases per day. So you can imagine that journalists experience some sort of press release fatigue. Translation: You need to stand out.

Step 1 is writing original content about something people would want to read. Don’t think of it as just an ad for your company or product.

Issue Newsworthy Press Releases

Press releases often serve a dual purpose: inform your own audience and get media exposure. You want to issue news about your company and have it on your website to keep your target audience and visitors up to speed on your business. Beyond that, though, you would love to spread the word to new audiences through media coverage.

Not every press release topic is going to fulfill both goals. So you might accept that a press release about an award might be good for your website’s newsroom, but it might not be the stuff of media headlines.

That said, if you are trying to entice the media to pick up your story, make sure it’s newsworthy. Keep in mind that a company that continuously issues press releases that aren’t newsworthy may be more quickly dismissed by journalists when the company puts real news out.

So what are some newsworthy topics?

  • Proprietary research findings
  • Crises and how you manage them
  • A new product or service
  • A grand opening or company event worth highlighting
  • A donation or volunteer effort
  • A merger or acquisition
  • VIP hires or departures
  • A partnership announcement

As you are writing the press release, think about the angle too. In its guide to writing a great press release, Cision says it should be written like a news article with “a clear news angle.”

An angle is the story’s main theme or perspective. This can make the press release more appealing and more of a story. For example, the angle could be meant to invoke emotion, address a conflict, or highlight progress somehow.

Follow Formatting Best Practices

Journalists answer the who, what, where, when, why, and sometimes, how as quickly as possible in their news stories — usually within the first paragraph. So you should too.

Make it very clear what the story is right away in your press release. Check out the inverted pyramid style of writing for more tips on how to do this.

You’ll want to consider length too. Conventional wisdom says to keep press releases between 300 and 500 words (or roughly one page).

Another thing you want to be diligent about is spelling and grammar. If you want to lose credibility fast, then having a press release with spelling and grammatical errors is one way to do it.

Furthermore, make sure the press release has the proper press release structure, including writing it in Associated Press (AP) style.

In its guide to writing a great press release (linked to earlier), Cision offers the following tips and more:

  • Keep headlines and subheadlines brief and shareable — and include your company name.
  • Start with a dateline.
  • Make your call to action obvious (the sooner, the better!) if you have one for this story.
  • Use headers and lists to segment your release, especially if it’s long.
  • Limit paragraphs to four or fewer sentences, and vary sentence length and structure.
  • End with contact information.

Make It Engaging

As mentioned earlier, you want to write a press release as if it were a news story. So think about what sort of engaging information you can include to grab the reader’s (aka the journalist’s) interest.

This includes things like:

  • Statistics
  • Quotes from key stakeholders
  • Image(s)

A journalist can reuse all of these when they pick up the story, and including them makes their job easier.

Final Thoughts

I’ll come back to the question I opened with: Can press releases help with SEO? The answer is yes, but indirectly.

There’s a lot that goes into writing a press release. You need to make it stand out in hopes of getting media coverage, and you also need to follow SEO best practices. Done right, a press release can turn into a valuable asset for your brand, raising awareness, driving traffic and contributing to your brand’s authority.

If you found this article helpful, we invite you to subscribe to our blog.

FAQ: How can I optimize press releases for search engines and media exposure to enhance brand visibility and authority?

Brand recognition and authority are pivotal. Optimizing press releases for search engines and media exposure has become crucial. By meticulously aligning your press release strategy with SEO best practices and media preferences, you can effectively boost your brand’s visibility and enhance its authority in the industry.

To embark on this journey of press release optimization, start by conducting thorough keyword research. Identify relevant keywords that resonate with your brand and industry. Seamlessly integrate these keywords into your press release’s headline and the opening paragraph. A well-crafted headline captures readers’ attention and is a vital SEO element that search engines consider.

Crafting an engaging press release is an art that balances newsworthiness and brand messaging. News angles that invoke emotion, address conflicts, or highlight progress are more likely to resonate with both media outlets and their audiences. By focusing on originality and engaging content, you enhance the likelihood of your press release being picked up by journalists and shared widely across platforms.

The role of links within press releases should not be overlooked. Utilize navigational links that direct readers to your website’s relevant pages. Avoid transactional links laden with keyword-rich anchor text, as they can look manipulative to search engines. Remember, the goal is to create a seamless experience for both readers and search engine crawlers, ultimately contributing to your brand’s online authority.

A critical factor in the success of your press release lies in its distribution strategy. Partner with reputable press release distribution services that understand the nuances of SEO. Ensure that any links embedded within your press release are tagged appropriately, using “nofollow” or “sponsored” attributes when necessary. This adheres to search engine guidelines and safeguards your brand’s online reputation.

Optimizing press releases requires a strategic blend of SEO tactics and media appeal. By crafting original, engaging, and informative content while following best practices for links and distribution, you can elevate your brand’s visibility and establish authority within your industry.

Step-by-Step Procedure:

  1. Conduct thorough keyword research to identify relevant keywords.
  2. Seamlessly integrate selected keywords into the press release headline and opening paragraph.
  3. Craft an engaging press release with a compelling news angle.
  4. Focus on originality and create content that resonates with both media outlets and their audiences.
  5. Utilize navigational links within the press release to guide readers to relevant pages on your website.
  6. Avoid transactional links with keyword-rich anchor text that may seem manipulative to search engines.
  7. Partner with reputable press release distribution services that understand SEO nuances.
  8. Ensure any embedded links in the press release are tagged appropriately with “nofollow” or “sponsored” attributes.
  9. Balance newsworthiness and brand messaging to create a captivating press release.
  10. Opt for news angles that invoke emotion, address conflicts, or highlight progress to resonate with readers and media outlets.
  11. Craft a well-crafted headline that captures attention and aligns with SEO practices.
  12. Focus on delivering informative and engaging content that offers value to readers.
  13. Follow best practices for link usage, adhering to guidelines set by search engines.
  14. Maintain a seamless experience for both readers and search engine crawlers.
  15. Choose navigational links to enhance user experience and brand credibility.
  16. Embrace reputable distribution services to share your press release effectively.
  17. Ensure all links are tagged appropriately to uphold your brand’s online reputation.
  18. Consistently monitor and adapt your press release strategy to evolving SEO trends.
  19. Evaluate how effectively your press releases enhance visibility and authority.
  20. Continuously refine your approach based on insights and feedback for optimal results.

The post Why Press Releases Still Matter to SEO and How to Write a Press Release That Entices Media appeared first on Bruce Clay, Inc..

]]>
https://www.bruceclay.com/blog/why-press-releases-still-matter-to-seo-and-how-to-write-one/feed/ 20
What Is an XML Sitemap and How Do I Make One? https://www.bruceclay.com/blog/what-is-xml-sitemap/ https://www.bruceclay.com/blog/what-is-xml-sitemap/#comments Thu, 03 Dec 2020 18:55:27 +0000 https://www.bruceclay.com/?p=86909 An XML sitemap is a file that webmasters create and put on their site to tell search engines like Google and Bing about the pages, images, and videos that are on the site. This content list works like a map, helping ensure more thorough crawling and indexing. The XML sitemap is created using XML (Extensible […]

The post What Is an XML Sitemap and How Do I Make One? appeared first on Bruce Clay, Inc..

]]>
Looking at a map.

An XML sitemap is a file that webmasters create and put on their site to tell search engines like Google and Bing about the pages, images, and videos that are on the site. This content list works like a map, helping ensure more thorough crawling and indexing.

The XML sitemap is created using XML (Extensible Markup Language), a type of markup language used on the web in which tags share information.

Not only do XML sitemaps tell the search engines all the URLs that you want indexed and crawled, but they also provide other information, such as how frequently you update the pages.

An XML sitemap differs from an HTML sitemap in that the XML sitemap is just for search engines. On the other hand, an HTML sitemap is a webpage on your site that contains links to help visitors navigate to the important pages on your site.

Now that you have definitions in hand, let’s talk about:

XML Sitemaps: An SEO Best Practice with Benefits

For SEO purposes, you must build an XML sitemap and keep it up to date to help ensure that search engines index and crawl all the important pages on your site.

While some view an XML sitemap as “nice to have,” it’s actually an SEO best practice for every site to have at least one — even though Google says that sites under 500 pages might not need it.

Sure, search engines should be able to find all the pages by following links on the site, but the reality is that many sites don’t follow proper linking architecture. So, it can be hard for search engines to discover the content.

Benefits of XML sitemaps include:

  • They improve the crawl rate and indexation on a site.
  • They can help you spot problems (for Google, check Search Console).
  • They provide other useful information to search engines about your site.
  • They alert search engines to new pages and hopefully get them indexed sooner than if you waited for search engines to find them.

Now that we’ve gone over some benefits, it’s worth noting what XML sitemaps cannot do:

  • An XML sitemap does not guarantee that a search engine will index or crawl all pages, nor will it pass any link popularity or help with subject theming.
  • An XML sitemap will not impact your rankings. However, having a higher number of pages indexed in the search engines may increase your chances of ranking.
  • An XML sitemap should not be relied upon as a way to fix crawling issues. If there is an indexation problem, also look at other factors like the architecture of the site or the quality of the content and its links rather than relying on an XML sitemap alone.

Special XML Sitemaps to Know

Other than the standard XML sitemap, it’s good to know about key sitemaps specific to content such as news, images, and video. Here, we’ll primarily go over Google sitemaps. You can learn more about Bing sitemaps here.

News XML Sitemaps

If you are a publisher of news-related content and don’t have a news sitemap, you may not be getting the visibility you want. A news sitemap contains URLs for articles published in the past two days. Create news sitemaps in addition to your generic XML sitemaps. For more, learn how to create a news sitemap.

Video XML Sitemaps

Video sitemaps can help Google find and understand your video content by telling it exactly where and what the video content is on your site. Video content includes webpages that embed videos, URLs to video players, or the URLs of raw video content. If Google cannot discover the video content at the URLs provided, it will ignore them. Note that while Google recommends using video sitemaps and schema.org’s VideoObject to mark up videos, it also supports mRSS. You can also view Bing’s guidelines on video sitemaps here.

Image XML Sitemaps

Image sitemaps help Google discover images on your site — especially those reached via JavaScript. You can suggest the most important images on your page that you want included in Google.

How to Create an XML Sitemap

You could create a sitemap manually, but using a sitemap generator makes the job easier. And to help you, there are many good third-party tools for creating XML sitemaps. One is Microsoft Bing’s free server-side Bing XML Sitemap Plugin, which can automatically generate two types of XML sitemaps that any search engine can read:

  • Comprehensive sitemap, which includes all files (except any you disallow in your robots.txt file)
  • Recently updated sitemap, which includes URLs of changed files only (useful for your own tracking or for prioritizing the pages that search engines should crawl)
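To illustrate what a generator produces under the hood, here is a minimal Python sketch (the page URLs are hypothetical) that builds a standard sitemap using only the standard library:

```python
import xml.etree.ElementTree as ET

def build_sitemap(pages):
    """Build a minimal sitemap XML string from (loc, lastmod) pairs."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc        # required: the page URL
        ET.SubElement(url, "lastmod").text = lastmod  # optional: last modified date
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical pages on your site
pages = [
    ("https://www.example.com/", "2024-01-15"),
    ("https://www.example.com/about", "2023-11-02"),
]
print(build_sitemap(pages))
```

A real generator would crawl your site or query your CMS for the URL list instead of hard-coding it.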

Here’s a useful video from Google on creating an XML sitemap:

We also cover how to create sitemaps (both XML and HTML versions) in our SEO Guide.

For Large Websites

XML sitemaps are especially useful for large sites to make sure all the URLs are discoverable by search engines.

Large websites may need to break their list of URLs into multiple XML sitemaps. This keeps each file within the protocol’s limits: a single XML sitemap can contain up to 50,000 page URLs and must be no larger than 50 MB uncompressed.

You can have separate XML files by media type if you have original videos, news, images, etc., that you want to be indexed. So, for example, if you have videos on your site, create a specialized video XML sitemap to help make sure the search engines find your video files.

As a bonus, if you break down your XML sitemaps into smaller sitemap files, maybe by site sections, it allows you to watch your indexation performance for each section of your site and identify where indexation issues exist.

You can then create a sitemap index file that lists all the sitemap files on your site. To optimize sitemap files, you can also compress the file using gzip.
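A sitemap index file is itself a small XML file. Here is a sketch (the domain and filenames are hypothetical) referencing two gzip-compressed sitemaps:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-products.xml.gz</loc>
    <lastmod>2024-01-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-blog.xml.gz</loc>
    <lastmod>2024-01-10</lastmod>
  </sitemap>
</sitemapindex>
```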

XML Sitemap Tips

The required XML tags are: <urlset>, <url>, and <loc>. The tags <urlset> and <url> are for formatting the XML, and <loc> is for identifying the URL.

Optional metadata tags are:

  • <lastmod> – last modified date
  • <changefreq> – how often the page changes (such as hourly, daily, monthly, never)
  • <priority> – how important the page is from 0 (the lowest) to 1 (the highest)

Site owners aren’t required to use the optional tags, but some engines may consult them when deciding how often to recrawl pages. Google states that it does not use the <priority> or <changefreq> tags at all, and while it may consider <lastmod>, it does not base crawling decisions on that tag alone.

If you use these tags, keep them accurate to help the search engines better crawl your site. Pages that you are optimizing should be set to a higher priority. If you have archived pages that you haven’t updated in years, set them to a low priority with a <changefreq> of “never.”
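Putting the required and optional tags together, a minimal sitemap entry looks like this (the URL and tag values are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/services</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```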

Upload to the Site

Once you have created the ​sitemap file, upload it to the root of your website (for example: https://www.your-domain-name.com/sitemap.xml). Now it’s time to let the search engines know about it using your robots.txt file.

A robots.txt file is simply a text file saved at the root of your website that gives instructions to visiting search engine spiders.

Your robots.txt file should look like this, with a sitemap directive line for each of your different XML sitemaps:

User-agent: *
Disallow: /tmp/
Disallow: /filename.html
Sitemap: http://website.com/my-sitemap1.xml
Sitemap: http://website.com/my-sitemap2.xml

If you have multiple sitemaps, or if your CMS generates files with some unique names, then all you have to do is mention them by name in the robots.txt file, one per line. Or if you have created a sitemap index file, then you can specify just the index file location in the robots.txt and list all your separate sitemap files in the sitemap index.

That’s it! Now you can let the search engines do the rest.

Submitting a Sitemap

Some people prefer to submit their sitemaps manually, primarily for timing reasons: a manual submission can prompt crawling and indexing sooner than waiting for the search engine to get to your site on its own.

Another reason to submit a new sitemap to Google is to check it for errors. Google tries to continue parsing a sitemap file even if it has minor errors. However, if the XML is badly formed, then it could cause Google to ignore all entries after the badly formed entry (like a missing “>” or “</url>” tag). The Sitemaps report in Search Console will tell you if any problems were encountered, such as:

Has errors: The sitemap could be parsed but has one or more errors; any URLs that could be parsed from the sitemap will be queued for crawling.
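Before submitting, you can check a sitemap for well-formedness locally. A minimal Python sketch (the sitemap text is a hypothetical example with a deliberately missing closing tag):

```python
import xml.etree.ElementTree as ET

def is_well_formed(xml_text):
    """Return (True, None) if the XML parses, else (False, parser error)."""
    try:
        ET.fromstring(xml_text)
        return True, None
    except ET.ParseError as err:
        return False, str(err)

# A sitemap entry missing its closing </url> tag
broken = (
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
    "<url><loc>https://www.example.com/</loc>"
    "</urlset>"
)
print(is_well_formed(broken)[0])  # False — the parser reports a mismatched tag
```

This only catches structural XML errors; Search Console’s report also flags sitemap-specific issues such as invalid URLs.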

You can proactively submit your XML sitemap(s) to Google and Bing as follows:

  • Google: Log in to your Google Search Console account and go to Sitemaps.
  • Bing: Log in to Bing Webmaster Tools. Then see the Sitemap widget on the dashboard or go to the Sitemaps feature.

Sitemaps report in Google Search Console.
Google Search Console’s Sitemaps tool lets you submit a sitemap and view history.

XML Sitemap Case Study

After diagnosing that a client with a large website had only 20% of pages indexed, we implemented several tactics to help. We resubmitted their standard XML sitemap and fixed a large number of errors coming up on the client’s Search Console account.

We also submitted specialized XML sitemaps and implemented canonical tags throughout the entire site, as it had a large amount of duplicate content.

Indexation results jumped from 24% to 68%! And this percentage keeps growing, resulting in significant improvements in organic search traffic.

Final Thoughts

The goal of XML sitemaps is to help search engines crawl efficiently and thoroughly. You facilitate this by creating a sitemap and using the appropriate tags so the engines can understand how to best crawl your site.

As a final note: Be sure to keep your XML sitemaps up to date. If you add or remove pages, make sure your sitemap reflects that. You should also check Google Search Console frequently to ensure that Google is not finding any errors in your sitemap.

You can find more information about the sitemaps protocol at sitemaps.org.

If you need help with your website’s organic search performance, contact us for a free quote for SEO services.

FAQ: How do XML sitemaps enhance search engine optimization, and what are the benefits they offer?

XML Sitemaps have emerged as a fundamental tool for bolstering search engine optimization efforts. These structured documents serve as a roadmap for search engine crawlers, directing them to all the essential pages of your website. This proactive approach ensures that search engines can efficiently discover and index your content, thereby improving your website’s overall visibility in search results.

One notable benefit of XML Sitemaps is their ability to prioritize content. You can guide search engine bots toward your most critical pages by assigning priority levels to different pages. This becomes particularly advantageous when dealing with large websites or those with intricate structures, preventing vital pages from being overlooked during crawling.

Furthermore, XML Sitemaps let you include additional metadata about each page, such as when it was last changed and how often it is updated relative to other pages on your site. Search engines can consult this metadata when deciding which pages to recrawl and index, helping your latest and most relevant content get attention sooner.

Incorporating XML Sitemaps also leads to quicker indexation of new or updated content. When you publish fresh material or change existing pages, the sitemap signals to search engines, prompting them to crawl and index the changes sooner. This feature is invaluable for time-sensitive content, such as news articles or limited-time promotions.

In a competitive online landscape, user experience is paramount. XML Sitemaps indirectly improve user experience by ensuring visitors can easily find your content through search engines. When your website ranks higher in search results due to effective indexing, it attracts more organic traffic, which in turn can lead to higher engagement, conversions, and business success.

XML Sitemaps are an essential asset in modern SEO strategies. Their ability to guide search engine crawlers, prioritize content, and expedite indexation significantly impacts a website’s visibility and user engagement. By integrating XML Sitemaps into your SEO practices, you harness a powerful tool that enhances your online presence and sets the stage for sustainable growth.

Step-by-Step Procedure: Enhancing SEO with XML Sitemaps

  1. Understand XML Sitemaps – Get acquainted with XML Sitemaps, as they serve as navigational tools for crawlers.
  2. Generate XML Sitemaps using either a CMS plugin or an XML Sitemap Generator Tool.
  3. Include Essential Pages: Ensure that your XML Sitemap includes all crucial pages of your website, including main content, category pages, and important internal links.
  4. Set Priority and Frequency: Assign priority levels and update frequencies to pages within the XML Sitemap to guide search engine crawlers toward important content.
  5. Add Last Modification Dates: Incorporate the last modification dates for each page in the XML Sitemap to indicate when the content was last updated.
  6. Submit Sitemaps to Major Search Engines: Submit your XML sitemaps to major search engines such as Google and Bing via their respective webmaster tools.
  7. Update Your Sitemap Regularly: Update your XML sitemap whenever you add new content or make changes to your website.
  8. Monitor Crawl Errors: Periodically check for crawl errors and issues related to your XML Sitemap in the webmaster tools.
  9. Optimize Site Structure: Ensure your website’s structure is organized and user-friendly, which indirectly benefits XML Sitemap functionality.
  10. Prioritize User Experience: Focus on providing valuable and relevant content to enhance user experience, which contributes to improved search rankings.
  11. Create High-Quality Content: Craft high-quality, informative content that naturally incorporates relevant keywords to attract organic traffic.
  12. Utilize Internal Linking: Implement effective internal linking strategies to guide users and search engine crawlers through your website.
  13. Monitor Analytics: Regularly monitor website analytics to track the impact of XML Sitemaps on search engine visibility, traffic, and engagement.
  14. Stay Updated on SEO Trends: Stay informed about evolving SEO practices and algorithm changes to adapt your XML Sitemap strategy accordingly.
  15. Optimize for Mobile: Ensure your website is mobile-responsive, as mobile-friendliness is crucial in search engine rankings.
  16. Promote Social Sharing: Encourage social sharing of your content, as social signals indirectly influence search engine rankings.
  17. Manage Duplicate Content: Address duplicate content issues, as they can affect your website’s search engine performance.
  18. Optimize Images: Compress and optimize images to improve website loading speed, positively impacting user experience and SEO.
  19. Secure Your Website: Implement security measures such as SSL certificates to establish a secure browsing environment, which can enhance search rankings.
  20. Regularly Audit and Update: Regularly audit your XML Sitemaps, content, and SEO strategies to adapt to changing search engine algorithms and user behavior.
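As a reference for the steps above, a minimal XML Sitemap is just a UTF-8 text file listing your URLs with optional metadata. Here is a sketch with two hypothetical URLs (the domain, paths, dates, and values are placeholders — substitute your own):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2019-03-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/blog/xml-sitemaps/</loc>
    <lastmod>2019-02-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Save the file as sitemap.xml in your site’s root directory, reference it from robots.txt, and submit it in Google Search Console and Bing Webmaster Tools.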

The post What Is an XML Sitemap and How Do I Make One? appeared first on Bruce Clay, Inc..

]]>
https://www.bruceclay.com/blog/what-is-xml-sitemap/feed/ 5
How to Optimize Images, Videos & Other Media Content for SEO https://www.bruceclay.com/seo/optimize-multimedia-content/ Wed, 13 Mar 2019 03:52:06 +0000 https://www.bruceclay.com/?page_id=62890 SEO Guide Step 11 Page experience: images, videos, and other multimedia content Image optimization Video optimization Audio optimization FAQ: What are the key elements to consider when optimizing images for SEO?  Page Experience: Images, Videos, and Other Multimedia Content In the Step 10 lesson, you learned why having engaging images, videos, and other rich media […]

The post How to Optimize Images, Videos & Other Media Content for SEO appeared first on Bruce Clay, Inc..

]]>
SEO Guide Step 11

Page Experience: Images, Videos, and Other Multimedia Content

In the Step 10 lesson, you learned why having engaging images, videos, and other rich media objects on your webpages matters for search engine ranking. And let’s face it — without them, plain text content is boring! Optimizing these elements is crucial for human visitors as well as search engines.

Dog looking bored captioned Don't be boring

Photo by Josh (CC by 2.0), modified

Non-text elements such as images, videos, audio, and other types of rich media help engage and retain a visitor’s interest. They also raise the quality of your webpage. Search engines are getting better at reading these non-text engagement objects, but it’s the job of SEO to clearly communicate what the content is about.

This lesson focuses on the most frequently used and SEO-significant types of rich media and how to optimize them …

Image Optimization Best Practices for SEO

Content with images performs better than plain text content. Yet, a website’s images are an area of optimization that businesses frequently ignore.

Check for these essential elements when optimizing images for search:

Image attributes in WordPress

Image attributes are easy to set in WordPress.

  • Image selection: Choose an image that relates to the subject matter of your content. Beyond being relevant, an image that’s original (such as a photo you stage and shoot yourself) has more value from an SEO and branding perspective since it will be unique.
  • Honor copyrights: If you do use stock photos, make sure to pay any required license fees and give proper attribution. (SEO Tip: You can search for royalty-free or “creative commons” images, but be careful since each individual image may have its own usage requirements.) Images with trademarks or logos can also be tricky; check the owner’s legal requirements to make sure your use is permissible.
  • File format: Save the image in a format search engines can index. With most image editors, you can save the file as a PNG, JPEG or GIF.
  • Name: Describe the picture using the appropriate keyword(s) within the file name.

Use text or use image with ALT attribute

  • Alt text: Include brief text describing the image in an alt attribute (in the HTML image tag). This text can be read by both search engines and people (especially the visually impaired, who use screen-reader software to “read” a page).
    SEO Tip: With linked images, search engines treat the alt attribute as the link anchor text.
  • Text support: Give the image context by describing it in a caption and in the surrounding text, including the keywords you used in the file name and alt attribute.
  • Size: File size matters on the web. Keeping image file sizes small helps your pages load quickly, which is vital particularly for a good mobile user experience.
    SEO Tips: Make the browser’s job of displaying your images easier. Resize each image before uploading it to your website, and specify height and width attributes in the image tag.
  • Sitemap: (Optional) Create an image XML Sitemap that lists all your images, and then submit it to the search engines for increased visibility within image search. (See Step 9 for more on creating XML Sitemaps.)
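The file-name, alt-text, and size recommendations above come together in the image tag itself. A minimal sketch (the file name, alt text, dimensions, and caption are illustrative placeholders):

```html
<figure>
  <!-- Descriptive, keyword-rich file name; brief alt text; explicit dimensions -->
  <img src="/images/red-velvet-cake.jpg"
       alt="Slice of red velvet cake with cream cheese frosting"
       width="800" height="600">
  <!-- Support the image with a keyword-relevant caption -->
  <figcaption>Our red velvet cake recipe, step 4: frosting.</figcaption>
</figure>
```

Specifying width and height lets the browser reserve space before the image loads, which helps pages render smoothly.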

See our in-depth image SEO guide for more recommendations.

Image Types: The Choice is Yours

There are many types of images to choose from. A photo grabs a reader’s attention well, especially if it contains people’s faces. Beyond photographs, consider diagrams, artwork, illustrations, charts, graphs, logos, screenshots, memes — basically, any visual rich media element that communicates your subject matter and engages your user.

Another type of image that’s particularly popular (and gets reshared a lot) is the infographic. Infographics are relatively easy to build — you can find data through research (citing your sources, of course), and then put it together using one of the many free infographic tools available online. As a segue to our next section on video optimization, here’s an example infographic we made using Piktochart:

Why Use Video Infographic

Video Optimization Best Practices for SEO

As the “Why Use Video?” infographic above explains, video is a key factor in SEO. Bruce Clay considers it to be the most important engagement element to have in your multimedia content arsenal. In fact, Google has a vested interest in video since it owns YouTube, which has become the second most popular search engine on the web.

Making videos doesn’t require a huge investment in equipment and software, or even a lot of tech savvy anymore. Some simple low-end options include smartphone videos, screen-capture videos, and live video conferences, in which multiple participants have a conversation from remote locations, filmed with their laptop mics and cameras, and automatically turned into a video.

Whether your videos are recorded with your webcam or high-end productions, you’ll need to follow some SEO tips to help your videos rank better in search:

  • Format: Save your video in a format search engines can read, such as MPG, MPEG, MOV, M4V and WMV among several others.
  • Hosting: Use YouTube (or a similar video hosting site, like Vimeo) to host your video and then embed it on your site. This enables the video to be found in YouTube searches, as well as web and video searches in Google, Bing, etc.
  • Branding: Make your brand name or website URL visible if you host the video on a third-party site like YouTube. For instance, upload it to your brand’s YouTube channel and show your brand name in the video (a subtle reference near the end works well).
  • Tags: Place keywords in the video’s file name, description and keyword tags.
  • Surrounding text: Optimize text around the embedded video with relevant keywords. Describe what the video is about so readers know what to expect and search engines can make sense of your video.
  • Transcript: Create and upload a transcript, or use YouTube subtitles and captions as a transcript alternative. (SEO tip: Your keyword targets should be mentioned in the video so they’ll naturally appear in the transcript.)
  • Sitemap: Create and submit a video XML sitemap to make it easier for search engines to find and index your video content. (See Step 9 for more on creating XML sitemaps.)
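To illustrate the sitemap tip above, a video XML Sitemap entry adds video-specific tags inside a standard sitemap URL element. A sketch using Google’s video sitemap namespace (all URLs, titles, and values are placeholders):

```xml
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>https://www.example.com/videos/why-use-video/</loc>
    <video:video>
      <video:thumbnail_loc>https://www.example.com/thumbs/why-use-video.jpg</video:thumbnail_loc>
      <video:title>Why Use Video?</video:title>
      <video:description>A short explainer on why video matters for SEO.</video:description>
      <video:content_loc>https://www.example.com/media/why-use-video.mp4</video:content_loc>
      <video:duration>120</video:duration>
    </video:video>
  </url>
</urlset>
```

The loc element points to the page where the video is embedded, while content_loc points to the media file itself.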

Audio and Podcast Optimization for SEO

Audio is a great way to add value to your existing content while enhancing the user experience.

Music sites are not the only ones that can offer quality audio files for visitors to listen to. Non-music sites can find ways to enrich the visitor’s experience using audio. (Note: Let the visitor control whether audio plays, rather than starting the sound automatically, as a courtesy.)

A podcast is a digital audio file that can be downloaded or streamed from the web. Widespread internet access in cars and phones has enabled podcasts to surge as a popular way to consume content. Millions of people listen to podcasts every day to learn new skills, get news and entertainment, or hear an audiobook — among many other reasons to listen to podcasts.

Consider hosting a podcast if you have lessons, news recaps, interviews, or some other type of audio content that would be interesting and useful for your audience to digest on a regular basis.

Here are a few tips for incorporating audio files (podcasts, music, or other) as multimedia content on your site:

  • Quality: Choose audio files with good sound quality.
  • File naming: Optimize audio file names with relevant keywords just as you would for image or video files.
  • Relevant text: As with video, you can create and provide a transcript that includes mentions of your keywords. Also, use the text surrounding the podcast/audio file to describe its contents and incorporate keywords.
  • Title, description: Make sure each episode of your podcast has its own unique title and description.

SEO GUIDE BONUS

Find out why producing expert, competent content is necessary for your search marketing strategy. Optimizing multimedia elements helps that strategy succeed. Here Bruce Clay speaks to Murray Newlands in an interview for Search Engine Journal.

 

Next up in the SEO guide, you’ll learn how to organize your website using siloing. Bruce Clay was the first to introduce this key technique, and it alone has helped many brands see rapid, measurable improvements in search engine rankings.

Need more SEO tips?
See more on Video Optimization or Search Engine Optimization

Related blog posts and articles:

FAQ: What are the key elements to consider when optimizing images for SEO?

Search engines like Google rely on various factors to rank webpages, and images play a crucial role. To ensure your images contribute positively to your SEO efforts, it’s essential to consider the following key elements:

  1. File Format Selection: 

Begin by choosing the right file format. JPEG is ideal for photographs and images with many colors, while PNG is suitable for transparent graphics and images. Selecting the appropriate format can significantly impact loading times and user experience.

  2. Image Size and Dimensions:

Keep image dimensions reasonable. Overly large images can slow down your website, affecting SEO and user satisfaction. Resize images to fit their display dimensions and use responsive design for mobile optimization.

  3. Compression Techniques:

Utilize image compression to reduce file sizes without compromising quality. This not only improves website speed but also positively influences your SEO ranking.

  4. Descriptive File Names:

Optimize your image filenames by using descriptive, relevant keywords. Avoid generic names like “image123.jpg.” Instead, opt for names that reflect the image’s content, such as “red-velvet-cake.jpg.”

  5. Alt Text:

Alt text is essential for accessibility and SEO. Provide accurate and concise alt text that describes the image’s content, incorporating relevant keywords naturally.

Attention to these elements will enhance your website’s SEO performance and user experience, ultimately increasing search engine ranking. Optimizing images goes beyond SEO; it makes your site more accessible and user-friendly.

Step-by-Step Procedure: Image SEO Optimization

  1. Select the Right File Format: Choose JPEG or PNG based on your image’s characteristics.
  2. Resize Images: Adjust image dimensions to match their display size and use responsive design.
  3. Compress Images: Use compression tools to reduce file sizes while preserving quality.
  4. Optimize Filenames: Rename images with descriptive, keyword-rich names.
  5. Alt Text: Provide accurate and concise alt text describing image content and incorporating relevant keywords.
  6. Implement Image Sitemaps: Create and submit image sitemaps to search engines.
  7. Lazy Loading: Implement lazy loading to improve page loading times.
  8. CDN Integration: Consider using a Content Delivery Network (CDN) to speed up image delivery.
  9. Structured Data: Add structured data markup (schema.org) for images.
  10. Monitor Performance: Regularly check website speed and image loading times.
  11. Mobile Optimization: Ensure images are optimized for mobile devices.
  12. Accessibility Compliance: Verify that images are accessible to all users, including those with disabilities.
  13. Keyword Research: Conduct keyword research to identify relevant keywords for image optimization.
  14. Competitor Analysis: Analyze competitors’ image SEO strategies.
  15. User Experience Testing: Test how images affect user experience on your website.
  16. Backlink Building: Secure backlinks to image-rich pages for improved SEO.
  17. Regular Updates: Keep images updated and relevant to your content.
  18. Analytics Monitoring: Use analytics tools to track image performance and make necessary adjustments.
  19. User Engagement: Encourage user engagement with images through social sharing and comments.
  20. Adherence to SEO Best Practices: Stay updated with SEO best practices and adapt your image optimization strategy accordingly.
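Several of the steps above (resizing, responsive design, lazy loading) can be expressed directly in markup. A sketch using standard HTML attributes — the file names and widths are placeholders, and native loading="lazy" support assumes a modern browser:

```html
<!-- srcset lets the browser pick an appropriately sized file;
     loading="lazy" defers offscreen images until the user scrolls near them -->
<img src="/images/red-velvet-cake-800.jpg"
     srcset="/images/red-velvet-cake-400.jpg 400w,
             /images/red-velvet-cake-800.jpg 800w"
     sizes="(max-width: 600px) 400px, 800px"
     alt="Slice of red velvet cake"
     width="800" height="600"
     loading="lazy">
```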

By following this comprehensive procedure, you can optimize your website’s images for SEO effectively, enhancing your search engine rankings and user experience.

The post How to Optimize Images, Videos & Other Media Content for SEO appeared first on Bruce Clay, Inc..

]]>
Technical SEO Tips https://www.bruceclay.com/seo/technical-seo-tips/ Wed, 13 Mar 2019 03:49:37 +0000 https://www.bruceclay.com/?page_id=62786 SEO Guide Step 16 Technical is not on-page Checking your instrument panel Cloaking Redirects Duplicate content Custom 404 error page Plagiarism Site performance Robots.txt appropriately Hacked content & UG spam Structured data FAQ: What are the risks of cloaking in SEO, and how can I avoid it? Technical Is Not On-Page Technical SEO is the […]

The post Technical SEO Tips appeared first on Bruce Clay, Inc..

]]>
SEO Guide Step 16

Technical Is Not On-Page

Technical SEO is the practice of optimizing the “back end” of a site so that search engines like Google can better crawl and index the website.

Please read this post on Technical SEO vs. On-Page SEO: The Differences as a starter.

Up to this point, this SEO Guide has primarily focused on developing quality content that earns links naturally and is optimized for search.

Now, we’re going to shift gears. This lesson covers technical SEO tips on various issues that are critical to ranking success. It is a part of our much larger multi-step SEO Guide, helping you learn how to do search engine optimization.

Without keeping an eye on a few technical things, you could watch the hard work you put into optimizing your website go to waste — like a leak that ends up sinking an otherwise seaworthy vessel.

Sinking ship

The search engines must be able to find, crawl, and index your website properly. In this lesson, we’ve assembled a list of technical SEO tips you need to know to avoid mistakes that could sink your online ship.

Checking Your Instrument Panel

Before we cast off and start talking technical, let’s make sure your instruments are working.

To do SEO well, you must have analytics installed on your website. Analytics data is the driving force of online marketing, helping you better understand how users are interacting with your site.

We recommend you install this free software: Google Analytics, and possibly Bing Webmaster Tools (or a third-party tool). Set up goals in your analytics account to track activities that count as conversions on your site.

Your analytics instrument panels will show you: which pages are visited most; what types of people come to the site; where visitors come from; traffic patterns over time; and much more.

Getting analytics (and Google Search Console, as well) set up is one of the most important technical SEO steps. Seeing your site performance data will help you steer your search engine optimization.

Casting Off … Technical Issues to Watch for

1. Avoid Cloaking

First, keep your site free from cloaking. Cloaking means showing one version of a page to users but a different version to search engines.

Search engines want to see exactly what users see, and they treat mismatches with suspicion. Avoid any hidden text, hidden links, or cloaking. These types of deceptive web practices frequently result in penalties.

You can check your site for cloaking issues using our free SEO Cloaking Checker tool.

We suggest you run your main URL through it on a monthly or regular basis (so bookmark this page).

Free Tool – SEO Cloaking Checker

 

2. Use Redirects Properly

When you need to move a webpage to a different URL, you want to direct users to the most appropriate (subject-related) page. Also make sure you’re using the right type of redirect.

As a technical SEO tip, we recommend always using 301 (permanent) redirects. A 301 tells the search engine to drop the old page from its index and replace it with the new URL. Search engines transfer most of the link equity from the old page to the new one, so you won’t suffer a loss in rankings.

Mistakes are common with redirects. A webmaster, for example, might delete a webpage but neglect to set up a redirect for its URL. This causes people to get a “Page Not Found” 404 error.

Furthermore, sneaky redirects in any form, whether they are user agent/IP-based or redirects through JavaScript or meta refreshes, frequently cause ranking penalties.

In addition, we recommend avoiding 302 redirects. Though Google says it attempts to treat 302s as 301s, a 302 is a temporary redirect. It’s meant to signal that the move will be short-lived, and therefore search engines may not transfer link equity to the new page. Both the lack of link equity and the potential filtering of the duplicated page can hurt your rankings.

Read more: How to Properly Implement a 301 Redirect
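On an Apache server, a 301 redirect can be declared in the .htaccess file. A minimal sketch — the paths and domain shown are placeholders for your own URLs:

```apache
# Permanently redirect a single moved page to its new URL
Redirect 301 /old-page.html https://www.example.com/new-page/

# Or, with mod_rewrite, redirect an entire renamed directory
RewriteEngine On
RewriteRule ^old-directory/(.*)$ https://www.example.com/new-directory/$1 [R=301,L]
```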

3. Prevent Duplicate Content

It’s a good idea to fix and prevent duplicate content issues within your site.

Search engines may get confused about which version of a page to index and rank if the same content appears on multiple pages. Ideally, you should have only one URL for one piece of content.

When you have duplicated pages, search engines pick the version they think is best and filter out all the rest. You lose out on having more of your content ranked, and also risk having “thin or duplicated” content, something Google’s Panda algorithm filter penalizes. (See Step 14 of this SEO Guide for more detail on penalties.)

If your duplicate content is internal, such as multiple URLs leading to the same content, then you can decide for the search engines by a number of methods. You can:

  • Delete unneeded duplicate pages. Then 301-redirect the URLs to another relevant page.
  • Apply a canonical link element (commonly referred to as a canonical tag) to communicate which is the primary URL.
  • Specify which parameters should not be indexed. Use Google Search Console’s URL Parameters tool if the duplicate content is caused by parameters being added to the end of your URLs. (You can read more about this in Search Console Help.)
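The canonical link element mentioned above is a single line in the head of each duplicate page, pointing at the primary URL (the URL here is a placeholder):

```html
<head>
  <!-- Tells search engines which URL is the primary version of this content -->
  <link rel="canonical" href="https://www.example.com/primary-page/">
</head>
```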

Any of these solutions should be used with care. We help our SEO clients with these types of technical issues, so let us know if you want a free quote for assistance.

Read more: Understanding Duplicate Content and How to Avoid It (10 Ways) and Is Duplicate Content Bad for Search Engine Rankings?

Help visitors not jump ship

4. Create a Custom 404 Error Page

When someone clicks a bad link or types in a wrong address on your website, what experience do they have?

Let’s find out: Try going to a nonexistent page on your site by typing http://www.[yourdomain].com/bogus into the address bar of your browser. What do you get?

If you see an ugly, standard “Page Not Found” HTML Error 404 message (such as the one shown below), then this technical SEO tip is for you!

standard 404 error

Most website visitors simply click the back button when they see that standard 404 error and leave your site forever.

It’s inevitable that mistakes happen and people will get stuck sometimes. So you need a way to help them at their point of need.

To keep people from jumping ship, create a custom 404 error page for your website.

First, make the page. A custom 404 page should do more than just say the URL doesn’t exist. While some kind of polite error feedback is necessary, your customized page can also help steer people toward pages they may want with links and other options.

Additionally, you want your 404 page to reassure wayward visitors that they’re still on your site, so make the page look just like your other pages (using the same colors, fonts and layout) and offer the same side and top navigation menus.

In the body of the 404 page, here are some helpful items you might include:

  • Apology for the error
  • Home page link
  • Links to your most popular or main pages
  • Link to view your sitemap
  • Site-search box
  • Image or other engaging element

Since your 404 page may be accessed from anywhere on your website, be sure to make all links fully qualified (starting with http).

Next, tell your server. Once you’ve created a helpful, customized error page, the next step is to set up this pretty new page to work as your 404 error message.

The setup instructions differ depending on what type of website server you use. For Apache servers, you modify the .htaccess file to specify the page’s location. If your site runs on a Microsoft IIS server, you set up your custom 404 page using the Internet Information Services (IIS) Manager. WordPress sites have yet another procedure. (Use the “Read more” links below to see detailed technical help.)
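On an Apache server, for example, pointing the server at your custom page takes one line in .htaccess (the path is a placeholder for wherever you saved your 404 page):

```apache
# Serve the custom page whenever a requested URL is not found
ErrorDocument 404 /custom-404.html
```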

We should note that some smaller website hosts do not permit custom error 404 pages. But if yours does, it’s worth the effort to create a page you’ve carefully worded and designed to serve your site visitors’ needs. You’ll minimize the number of misdirected travelers who go overboard, and help them remain happily on your site.

Read more: How to Configure a Custom 404 Page on an Apache Server, How to Configure a Custom 404 Page in Microsoft IIS, and Google’s help topic Create useful 404 pages

5. Watch Out for Plagiarism (There are pirates in these waters …)

Face it; there are unscrupulous people out there who don’t think twice about stealing and republishing your valuable content as their own. These villains can create many duplicates of your web pages that search engines have to sort through.

Search engines can usually tell whose version of a page is the original in their index. But if your site is scraped by a prominent site, it could cause your page to be filtered out of search engine results pages (SERPs).

We suggest two methods to detect plagiarism (content theft):

  • Exact-match search: Copy a long text snippet from your page and search for it within quotation marks in Google. The results will reveal all web pages indexed with that exact text.
  • Copyscape: This free plagiarism detection service can help you identify instances of content theft. Just paste the URL of your original content, and Copyscape will take care of the rest.

Try to remedy the plagiarism issue before it results in having your pages mistakenly filtered out of SERPs as duplicate content. Ask the site owner to remove your stolen content from their website. You could also consider revising your content so that it’s no longer duplicated. (SEO Tip: If you can’t locate contact information on a website, look up the domain on Whois.net to find out the registrant’s name and contact info.)

Read more: About Scraper Sites

Fast sites make users happy quote

6. Protect Site Performance

How long does it take your website to display a page?

Your website’s server speed and page loading time (collectively called “site performance”) affect the user experience and impact SEO, as well.

Google uses page load time as a ranking factor in mobile search. It’s also a site accessibility issue for users and search engines.

The longer the web server response time, the longer it takes for your web pages to load. Slow page-loading times can reduce conversion rates (because your site visitors get bored and leave), slow down search engine spiders so less of your site gets indexed, and hurt your rankings.

You need a fast, high-performance server that allows search engine spiders to crawl more pages per sequence and that satisfies your human visitors, as well. Web design issues can also sink your site performance, so if page-loading speed is a problem, talk to your webmaster.

SEO tip: Use Google’s free tool PageSpeed Insights to analyze a site’s performance.

SEO GUIDE BONUS VIDEO

This Google Webmasters video explains that page speed can be a factor in Google’s algorithm, particularly as a tie-breaker between otherwise equal results.

There is an SEO optimization benefit to fast site performance — and conversely, great harm to your users and bottom line if your site is too slow.

7. Use robots.txt Appropriately

What’s the first thing a search engine looks for upon arriving at your site? It’s robots.txt, a text file kept in the root directory of a website that instructs spiders which directories can and cannot be crawled.

With simple “disallow” commands, a robots.txt file is where you can block crawling of:

  • Private directories you don’t want the public to find
  • Temporary or auto-generated pages (such as search results pages)
  • Advertisements you may host (such as AdSense ads)
  • Under-construction sections of your site

Every site should put a robots.txt file in its root directory, even if the file is blank, since that’s the first thing on the spiders’ checklist.
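A small robots.txt covering the cases listed above might look like this (the directory paths are placeholders; the Sitemap line is optional but recommended):

```txt
User-agent: *
Disallow: /private/
Disallow: /search-results/
Disallow: /ads/
Disallow: /under-construction/

Sitemap: https://www.example.com/sitemap.xml
```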

But handle your robots.txt with great care; it’s like a small rudder capable of steering a huge ship. A single disallow command applied to the root directory can stop all crawling — which is very useful, for instance, for a staging site or a brand new version of your site that isn’t ready for prime time yet. However, we’ve seen entire websites inadvertently sink without a trace in the SERPs simply because the webmaster forgot to remove that disallow command when the site went live.

SEO Tip: Some content management systems (e.g., WordPress) come with a prefabricated robots.txt file. Make sure that you update it to meet your site’s needs.

Google offers a robots.txt Tester in Google Search Console that checks your robots.txt file to make sure it’s working as you desire. We also suggest running the Fetch as Google tool if there’s any question about how a particular URL may be indexed. This tool simulates how Google crawls URLs on your website, even rendering your pages to show you whether the spiders can correctly process the various types of code and elements you have on your page.

Read more: Robots Exclusion Protocol Reference Guide

Look out for spam

8. Be on the Lookout for Hacked Content & User-Generated Spam

Websites can attract hacked content like a ship’s hull attracts barnacles — and the bigger the site, the more it may attract.

Hacked content is any content that’s placed on your website without your permission.

Hackers work through vulnerabilities in your site’s security to try to place their own content on your URLs. The injected content may or may not be malicious, but you don’t want it regardless. Some of the worst cases happen when a hacker gains access to a server and redirects URLs to a spammy site. Other cases involve bogus pages being added to a site’s blog, or hidden text being inserted on a page.

Google recommends that webmasters look out for hacked content and remove it ASAP.

Similar to this problem, user-generated spam needs to be kept to a minimum.

Your website’s public-access points, such as blog comments, should be monitored. Set up a system to approve blog comments, and keep watch to protect your site from unwanted stowaways.

Google often gives sites the benefit of the doubt and warns them, via Google Search Console, when it finds spam. However, if there’s too much user-generated spam, your whole website could receive a manual penalty.

Read more: What is hacking or hacked content? and User-generated spam (from Google Webmaster Help)

9. Use Structured Data

Structured data markup can be a web marketer’s best mate. It works like this. You mark up your website content with additional bits of HTML code, and the search engines read these notes to learn what’s what on your site.

The markup code gives search engines the type of context only a human would normally understand. The biggest SEO optimization benefit is search results may display more relevant information from your site — those extra “rich snippets” of information that sometimes appear below the title and description — which increases your click-through rates.

Structured data markup is available for many categories of content (based on Schema.org standards), so don’t miss this opportunity to improve your site’s visibility in search results by helping your SERP listings stand out.
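As a concrete sketch, here is what Schema.org markup might look like for an article, using the JSON-LD format Google recommends (all values are placeholders):

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Technical SEO Tips",
  "author": {
    "@type": "Person",
    "name": "Jane Author"
  },
  "datePublished": "2019-03-13",
  "image": "https://www.example.com/images/technical-seo.jpg"
}
```

Place the object inside a script tag with type="application/ld+json" in the page’s HTML so search engines can parse it.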

Read more: How to Use Structured Data to Improve Website Visibility

Land, Ho!

Just two more lessons to go! In the next lesson, we’ll cover essential mobile SEO tips for making sure your site can rank in mobile search.

Related blog posts and articles:


SEO Training — We offer SEO training as well as SEO tools so that you can better optimize your sites. If you want a great SEO education, please check out our SEO training course. You will not be sorry!

FAQ: What are the risks of cloaking in SEO, and how can I avoid it?

One common tactic that can lead to severe consequences if not handled correctly is cloaking. In this blog, we’ll delve into the risks associated with cloaking in SEO and guide you on how to avoid falling into its treacherous trap.

Understanding Cloaking in SEO

Cloaking involves presenting different content to search engines and human visitors, a practice frowned upon by search engines. While it might seem like a shortcut to better rankings, it comes with substantial risks. Search engines, like Google, can penalize your website if they detect cloaking, resulting in lower rankings or even removal from search results.

Risks of Cloaking

  1. Penalties: The most significant risk is that search engines can penalize your site, causing a drop in rankings or complete removal from search results. This can be devastating for your online presence.
  2. Loss of Trust: Cloaking erodes trust with both search engines and users. If visitors don’t find what they expect when they click your link, your bounce rate will increase.
  3. Reputation Damage: Your reputation may suffer, and gaining trust from your audience may become harder. Negative online chatter about your site can also impact your brand.

Avoiding Cloaking: Expert Tips

  1. Follow Guidelines: Keep up with search engine guidelines and best practices. Google’s Webmaster Guidelines will help.
  2. Consistency: Serve the same content to all of your website’s users and to search engines; transparency is key.
  3. Use the “rel=canonical” Tag: Implement this tag to indicate the preferred version of a page if you have similar content. It helps search engines understand your intentions.
  4. Monitor Your Site: Regularly check for any unintentional cloaking issues on your site. Tools like Google Search Console can help detect anomalies.
  5. Quality Content: Focus on creating high-quality, user-centric content. This not only pleases search engines but also engages your audience.

Cloaking may seem tempting as a way to boost your SEO rankings, but the risks far outweigh the benefits. Instead, invest your efforts in ethical SEO practices to sustain your website’s growth over time. By following the expert tips outlined in this article, you can safeguard your site’s reputation and ensure a lasting presence in the digital landscape.

Step-by-Step Procedure: Avoiding Cloaking in SEO

  1. Familiarize yourself with search engine guidelines, especially Google’s Webmaster Guidelines.
  2. Maintain consistency in content across all versions of your website.
  3. Implement the “rel=canonical” tag to indicate preferred content versions.
  4. Regularly monitor your site for unintentional cloaking issues using tools like Google Search Console.
  5. Focus on creating high-quality, user-centric content for your website.

The post Technical SEO Tips appeared first on Bruce Clay, Inc..

]]>