SEO Academy


There’s so much more to creating an online presence than simply uploading content onto a webpage. Getting noticed in search results is the best way to grow a business and brand name, and creating good content is a key to ranking for multiple search queries. However, search engine crawlers use HTML tags to read and understand the web pages they index. This means that it is not just your content that is important, but how your content reads on both the frontend and the backend of your website. 

It’s best to think of HTML as Google crawlers’ native language, and this guide will explain how to use your SEO HTML tags to better communicate to Google the relevance of your content to searchers.

What Are HTML Tags?

HTML tags are the foundation of any website. They are small snippets of code that are embedded into the back-end of a website, and there are different HTML uses for different components on a webpage. 

An HTML tag is characterized by <> and </> surrounding the word or phrase.

Why Are HTML Tags So Important for SEO?

In order to understand why HTML tags are important for SEO purposes, it is important to understand the fundamentals of how a search engine works. 

Simply put, a search engine’s job is not only to provide informative answers to its users, but also to find relevant and timely content based on the searcher’s query.

There are over 200 ranking factors that go into how a search engine promotes relevant results in its search engine result pages (SERPs). In fact, Google regularly changes its algorithm to improve user experience and the quality of search results. Unfortunately, Google keeps these algorithm updates under lock and key, but following SEO HTML best practices is a sure-fire method of communicating information about each page so search engines can read your content accordingly.

One incredibly important SEO best practice is investing time and energy into learning exactly how to implement the right HTML tags on the backend of your website.

The HTML process is as follows:

  1. The website owner creates the page’s content.
  2. The website developer implements HTML code into the backend of the website.
  3. The web page is published.
  4. Search engine bots arrive on the page of the website and read the HTML code.
  5. The search engine bots store and index this information about the web page.
  6. When a user searches for a keyword phrase that is relevant to what the HTML code communicated to crawlers, the web page has a better chance of showing up in the SERPs.

Not all HTML elements are created equal. Some are more important for showing up in search results, and how you craft them depends on your target keyword and the specific topic of the given page.

The Most Important HTML Tags for Ranking in Google

Here we explain the most important HTML tags to direct your SEO efforts, and the best practices on how to do so for each.

Page Titles or “Title Tags”

Ask any SEO expert out there and they will say that your page title, or title tag, is arguably the most important HTML snippet to include on your website. After all, if you don’t have a title, how will Google or users know what your page is about? HTML specifically tells Google “hey, here is the title of this page,” and once indexed, the title becomes the clickable headline in the SERPs.

The HTML code for a title tag is: <title>your title here</title>

3 Ways to Optimize Page Titles

Technically, Google can choose any snippet of text to be the page title in the SERPs. But in order to ensure Google indexes the proper SEO title, there are certain best practices you should follow for all your title tags.

Keyword Optimize

Before you optimize any HTML element, the first step is to identify your focus keyword for the page. Then you will want to put your target keyword into the page title. Not only will this provide informational context for the reader, but it will also give an additional signal to the search engine crawlers about what each page is about.

But do be careful about keyword stuffing and overusing keyphrases in your titles. Like with the rest of the content on your page, too many similar keywords in one place will send warning signals to the search engine that you may be spam.

Keep It Short

Google will only show the first 50-60 characters of your SEO page titles. A good title tag is short and sweet, preventing your title from being cut off and possibly confusing prospective customers. You’ll have plenty more space in the heading tags and general content to expand.
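As a rough sketch, a small Python helper like this can trim titles that risk truncation (the 60-character limit here is an approximation; Google actually truncates based on pixel width rather than a fixed character count):

```python
def truncate_title(title: str, limit: int = 60) -> str:
    """Trim a page title at a word boundary so it fits the SERP display
    window. The 60-character default is an approximation of Google's
    pixel-based cutoff."""
    if len(title) <= limit:
        return title
    # Cut at the limit, then back up to the last complete word.
    cut = title[:limit].rsplit(" ", 1)[0]
    return cut.rstrip() + "..."

print(truncate_title("SEO Academy"))  # SEO Academy
```

A check like this is handy in a CMS pipeline: it warns you before a long title gets cut off mid-word in the search results.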

Set Up Proper Expectations

At the end of the day, you want to be as helpful to your clients as you can. Your website should not only be a representation of your brand, but an informative resource for all website visitors. This means your page titles should be clear, concise, and adequately reflect what the content of the page is about.

So although a unique title that sparks the curiosity of searchers can seem like the right approach, in reality, users are looking to get the answer to their question as quickly as possible. In the long run, clear, relevant page titles will help improve CTR, which can help secure higher rankings for your website overall.

Meta Descriptions

It is best to think of your page’s meta description as the synopsis on the back of a book. Meta descriptions are short, quick, easily digestible sentences that explain in more depth what the page content is about.

Where a page’s title grabs the attention of the user, a meta description adds more context and background information. 

A meta description is found within the SERPs directly under the page’s clickable URL. Implementing a meta description will provide Google with the right information they need, without Google having to take snippets of text from the same page and create one themselves. When this happens, your user may not get the most accurate description, and it can cause them to lose interest in your brand name.

The HTML code for a meta description tag is: <meta name="description" content="your description" />

Meta Description and Click-Through-Rates

Data shows that a well-crafted meta description undoubtedly entices users to click over to your page. According to Backlinko, website pages with meta descriptions had about a 6% higher CTR than those that did not.

Now, if you are thinking this isn’t that big of a percentage, consider it this way. If your page shows up 500 times in a Google search per month, that is 30 more clients clicking over to your page than if you didn’t have a simple meta description! Adding a meta description is an easy way to get new customers headed your way.

3 Ways to Optimize Meta Descriptions for SEO

Even if you do have a meta description, there is a small risk that Google will choose another sentence or two from that page that they think is more specific and relevant. But to prevent this, you can follow some SEO best practices.

Use the Same Focus Keyword That Is In the Page Title

You really want to drive home to the search engine spiders and your consumer that your page is about a specific keyword or phrase. Where SEO page titles are used for rankings, meta descriptions are more user-focused. Google does have an expected CTR as a ranking factor, so with this in mind, it is crucial to keep the important keywords consistent throughout all the SEO HTML tags, title tag and meta description included.

Be Mindful of the Length

Just like with SEO titles, you need to watch the length of your meta description. There’s only limited space available in the search engine result pages, so Google has to cut off meta descriptions around 150-160 characters. That’s not to say that you have to match your description up perfectly with the character count, but do your best so the description is easily understood.
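To make the length rule concrete, here is a minimal sketch that builds the meta description tag and rejects overly long text (the 155-character default is my conservative cutoff within Google's roughly 150-160 character window):

```python
from html import escape

def meta_description_tag(text: str, limit: int = 155) -> str:
    """Build a meta description tag. The 155-character default is a
    conservative cutoff within Google's ~150-160 character window."""
    if len(text) > limit:
        raise ValueError(f"description is {len(text)} chars; aim for under {limit}")
    # Escape quotes so the attribute value stays well-formed HTML.
    return f'<meta name="description" content="{escape(text, quote=True)}" />'

print(meta_description_tag("Learn SEO basics."))
# <meta name="description" content="Learn SEO basics." />
```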

Use a Call to Action

Internet users have very short attention spans, so it’s a good idea to remind them of why they need to enter your site! Calls to action don’t have to be overly complex or unique; a simple “learn more here” or “contact us today” can work wonders for your click-through rate.

Headlines or “Heading Tags”

Just like having a strategy for choosing what key phrases you want to incorporate on your page, you have to develop a plan for how you structure that information in subheadings. For maximum readability, for both users and search engine crawlers, you cannot just put a ton of information down on a page. There needs to be structure, and that’s where headlines, or heading tags (H1 through H6), come in.

Users don’t always read the entire page of content, rather they scroll through the page and see if any of the different sections answer their questions. They’ll browse briefly, read the section that appeals to them the most, then leave to complete another action. And if your page isn’t split into multiple sections and is instead one long winding piece of content, then the user will bounce from your page before they even get to reading.

This is why headlines are so important; they are the foundation to your landing page or blog post’s construction.

A header tag looks like: <h1>your heading here</h1>

Tips on Using Keywords, Synonyms, or LSI terms in Headlines

So how do you utilize keyphrases in your multiple headlines without keyword stuffing or sounding spammy? The answer comes with LSI keywords.

LSI stands for Latent Semantic Indexing; LSI keywords are terms related to the main keyword(s) you are trying to target. Sprinkling LSI keywords throughout your content makes it easier for search engines and users to get a general idea of what your content is about.

However, it is important to note that LSI keywords are not always synonyms of your keyword, rather LSI keywords are related phrases to your topic. For example, if your keyword is coat, a synonym would be jacket. LSI keywords for coat would be winter, spring, feather down, puffy, warm, light, etc. 

Just as your headlines give structure to the page as a whole, LSI terms give more context to the content. Considering Google takes a look at your entire page before indexing and categorizing it, utilizing LSI keywords and synonyms in your SEO HTML tags will work to drive home the meaning and messaging of your content.

You can use tools like our landing page optimizer to identify related keyphrases and focus words for your heading tags. Just enter your keywords into the tool, and our software provides a list of keywords that have strong topical relevance to the keyphrase. The dropdown menu will provide you plenty of terms to choose from.

3 Ways to Optimize Headings for SEO

Thinking in terms of how search rankings work is pivotal for creating headlines that convert. Here are three tips on how to optimize your heading tags for SEO purposes.

Don’t Use More Than One H1

There are multiple sizes of header tags you can use: H1, H2, H3, H4, H5, and H6. The higher the number, the smaller the text will be, and the less important it is to use a keyword phrase. Using more than one H1 tag can confuse search engines, as they see the H1s as being the title of the page. 

A quick note: H1 tags are not to be confused with title tags as mentioned above. Title tags are shown in the search engine result pages, whereas H1s are only shown on the web page itself.

Stay Consistent

You should write your headlines so they are consistent and concise. It is always a good practice to write your headlines in a way that if you were to remove all other content, the headings would read like a list.
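The "headings should read like a list" test is easy to automate. This Python sketch uses the standard library's html.parser to pull every heading out of a page so you can review the outline (the sample HTML is illustrative):

```python
from html.parser import HTMLParser

class HeadingOutline(HTMLParser):
    """Collect h1-h6 text so a page's headings can be read as a list."""
    def __init__(self):
        super().__init__()
        self.headings = []   # (level, text) pairs, in document order
        self._level = None
        self._buf = []

    def handle_starttag(self, tag, attrs):
        if tag in {"h1", "h2", "h3", "h4", "h5", "h6"}:
            self._level = int(tag[1])
            self._buf = []

    def handle_data(self, data):
        if self._level is not None:
            self._buf.append(data)

    def handle_endtag(self, tag):
        if self._level is not None and tag == f"h{self._level}":
            self.headings.append((self._level, "".join(self._buf).strip()))
            self._level = None

outline = HeadingOutline()
outline.feed("<h1>SEO Guide</h1><p>Intro paragraph.</p>"
             "<h2>Title Tags</h2><h2>Meta Descriptions</h2>")
# outline.headings -> [(1, "SEO Guide"), (2, "Title Tags"), (2, "Meta Descriptions")]
```

Reading the collected pairs aloud is a quick sanity check for both consistency and the single-H1 rule mentioned above.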

Write Headings as if They Are Queries

Because headings are noticed and ranked by search engine bots, you should always use this space on your website to your advantage and write content that can help with rankings. Many users enter keyphrases as questions, so writing headlines that resemble queries a searcher would ask, or that offer a helpful answer to a question, is an optimization strategy that works well.

Other Important SEO HTML Tags

There are more SEO HTML tags that can also function as ranking signals. Here are other important elements of your HTML code to pay attention to.

Alt Tags

An alt tag (alt attribute) is essentially an image tag: your own description or explanation of what the images on your website include or are about. When you think from an SEO perspective, you know that crawlers can’t see your images, so a little bit of alt text is the only way they will understand the relevance of that image to the keyphrases that users enter into their Google search bar.

The goal of alt text is to allow Google to know what the image is about, but also to help the user in case they are visually impaired or the image does not load. There is more to an alt tag than an accessibility factor, however, as alt tags help search engine crawlers read the images themselves and index them. This is why you sometimes see images from multiple brands when you click the “image search” tab during a Google search. 

So as a rule of thumb, make sure to use an alt tag for anything visual on your website.
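An alt-text audit can also be automated. This is a minimal sketch using Python's built-in HTML parser to flag images with no alt attribute (the sample markup and file names are made up for illustration):

```python
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    """Collect the src of every <img> tag that has no alt text."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            # Empty or absent alt attributes both count as missing.
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "?"))

audit = AltAudit()
audit.feed('<img src="logo.png" alt="Acme Co. logo"><img src="hero.jpg">')
# audit.missing -> ["hero.jpg"]
```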

Robots Tag

A website owner can set up parameters for how search engine bots crawl their website via robots tags. These tags give direction on which pages can be crawled and which should be ignored from an indexing perspective. The nofollow attribute prevents Google crawlers from following internal links to other pages of your site. Robots tags are useful if you have seasonal pages that may not always be relevant, or if you are currently working on updating a webpage.

A robots tag looks like: <meta name="robots" content="noindex, nofollow">

Canonical Tags

Google is very strict when it comes to unique content, and will penalize you if you have duplicate content or thin content on your page. A canonical tag ensures that this doesn’t happen.

A canonical tag is a tag that you can put on a page to label it as the “master” version. Similar pages, like product pages, can end up with multiple unique URLs for the same content, which can confuse search engines about which page to show in the SERPs. You may have multiple pages for many different reasons, but you need to use your source code to tell Google which page to crawl and rank in the search results pages. A canonical tag looks like: <link rel="canonical" href="https://yourwebsite.com/master-page" />

Adding a canonical tag to a page tells the search engine to ignore any other duplicate content on the website, which will prevent you from being docked in the rankings.

The SEO Benefits of HTML Tags

As mentioned previously, there are plenty of search engine optimization benefits of optimizing the source code of your website. They include:

  • Allowing search engine crawlers to read your pages and index them more efficiently, which boosts your website in search.
  • Helping users see and understand every single piece of content on a page so they see your result as the better match over your competitor.
  • Emphasizing relevant keywords and similar keyphrases.
  • Encouraging a higher click-through rate from the search engine result pages to your website.

While SEO HTML tags may seem overwhelming to webmasters at first, rest assured that with a little practice, they will become much easier to implement. You can add tools like Yoast SEO Premium to your WordPress site to make sure you’re implementing your metadata correctly. In turn, your website will see more keyword rankings, growing your market share with each new post you publish.

Our experts are here to help your business stand out on the Internet. From link building to on-page SEO to keyphrase research, we can address all the important factors needed to get your website ranking in no time.

When it comes to search engine optimization, the details matter. Why? Search engines analyze and index page details and these details add up to a recipe for success in search engine results. While many people overlook the importance of URLs, we urge you not to underestimate them. Creating SEO-friendly URLs from the time of a page’s creation can eliminate compounding issues in the future and give you a competitive edge in the SERPs.

TLDR: URL best practices will save you time, money, and stress while improving your site’s crawlability. And this article will teach you how to optimize every URL you create.

What Exactly is a URL?

Every URL, or uniform resource locator, is unique. Why? Because every URL is associated with only one web page. As the name implies, a URL is a unique address that a web user enters into a web browser (such as Google Chrome). After hitting ‘enter’ in the address bar, your web browser sends a request to the server where the URL’s data lives. The server then locates the web page (or “resource”) and sends the page data to the browser to present to the user.

Understanding the Parts of a URL

You likely noticed that every URL is a string of characters, often including letters, numbers, periods, slashes, and colons. Furthermore, you likely noticed that these characters follow a pattern. This pattern denotes specific information.

The Protocol

The first element of every URL is the protocol–either http:// or https://. This is the set of instructions for how data sent between a web browser and a server is handled. HTTPS is a more secure information delivery system with data encryption.

The Subdomain

A subdomain is a part of the main domain that delineates different versions or parts of a website. 

For example, a website that has a shop may have a subdomain denoting that the page the user is on is part of the online shop rather than the blog. You will also find language subdomains such as en, de, and es.

The Domain 

A domain name is the root of the URL and represents the overall website.

The TLD

The TLD or top-level domain indicates the category or type of website (to an extent). For example, “com” stands for commercial.

Subfolders

Many URLs also contain subfolders (also called subdirectories); these often act as one more level of organization (similar to how folders on your Google Drive allow you to organize individual files).

The Slug

The final element of the URL in our example is the page designation, path, or “slug.” This is where you must refine the text while optimizing your URL for SEO.

How URLs Affect SEO

Search webcrawlers, including Google’s, use a wide array of elements on a page to better understand the content on that page and the overall website. Two elements of a web page that search engine bots analyze to index a page are the URL text and URL structure.

More importantly, writing a URL in a search-engine-friendly manner the first time helps you avoid rewriting it in the future and requiring a redirect. Why avoid redirects? Redirects require a server to first look in one location and then move on to another location to retrieve the web page data. And while this often takes only a fraction of a second, the time can compound during a search engine crawl, leading to site load speed issues that worsen the user experience.

How to Write Search Engine-Friendly URLs

Luckily, writing SEO-friendly URLs is straightforward and easy once you understand the foundations. This section will outline how you can create evergreen URLs that webcrawlers can easily understand and that will last the lifetime of your site.

1. Use Your Target Keywords Wisely

Keyword-driven content is the heart of good SEO. In fact, keywords are so pivotal to SEO that they should extend to your URL text. Your primary or target keyword should appear in your URL text. If it makes sense to put it at the beginning of the page locator, then do so, but do not force it.

You should also avoid keyword stuffing your URL. To do so, be sure you use a relevant keyword in relation to your page content. Descriptive keywords also result in a better experience for users and webcrawlers.

Examples:

Good: unlawn.org/never-cut-wet-grass-again

Bad (keyword stuffing a URL): unlawn.org/never-cut-wet-grass-again-avoid-cutting-wet-grass-stop-wet-grass-cutting

2. Keep It Short & to the Point

Shorter URLs are more user-friendly than long URLs. They also pare back any information that may be confusing to search engine crawlers. Using a shortened version of your page title often results in an accurate and short URL.

While Google denies URL length as a ranking factor, studies show that the highest-ranking search results often contain a total of 50 to 70 characters, including the root domain, subdomain, and page text.

When deciding what to keep and what to omit when crafting your URLs, you will want to drop extraneous words or characters, including:

  • articles (a, an, the)
  • conjunctions (and, but, so, because, since, etc.)

Shorter URLs improve the user experience and make your pages easier for search engine crawlers to understand.
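The rules above can be rolled into a small slug generator. This is a Python sketch; the stop-word list is an illustrative subset, not an official one:

```python
import re

# Words to drop from slugs: articles and conjunctions (illustrative list).
STOP_WORDS = {"a", "an", "the", "and", "but", "so", "or", "because", "since"}

def slugify(title: str) -> str:
    """Turn a page title into a short, lowercase, hyphen-separated slug."""
    # Keep only runs of letters and digits; this also strips special characters.
    words = re.findall(r"[a-z0-9]+", title.lower())
    return "-".join(w for w in words if w not in STOP_WORDS)

print(slugify("Never Cut Wet Grass Again!"))  # never-cut-wet-grass-again
```

Note that prepositions like "to" are deliberately kept (see tip 5 below on important modifiers): slugify("A Guide to SEO") yields "guide-to-seo".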

3. Use Hyphens to Separate Words

When words are shoved together without separation, web crawlers’ NLPs struggle to understand the individual words. To avoid this problem, place hyphens between words in your URL. This makes your URLs easier to read for webcrawlers and searchers.

Example:

Good: yourwebsite.com/blog/seo-outsourcing-guide/

Bad: yourwebsite.com/seooutsourcingguide

Why does readability matter for searchers? Your URL path appears in the SERPs. And searchers do read them in order to understand the page they’re going to click onto. The hyphens in the URL improve the user experience of the person scrolling through search results.

Why not use underscores instead of hyphens in your slug? To put it simply, Google recommends using hyphens to make a URL easier for their bots to understand. Underscores add complexity which can confuse and slow down a crawler.

Why not use spaces in a slug? Spaces in a URL are converted into code. You may have even noticed them in the past. They transform into %20.

For example: example.com/how%20grooming%20your%20dog%20at%20home. As you can see, this switch renders the URL more difficult to read and much longer–neither of which is ideal for a web address.

4. Use Lowercase Letters

Many people don’t realize that URLs are, in fact, case-sensitive. And while Google’s John Mueller contends that Google doesn’t care if you have capital letters in your URL when it comes to ranking signals, we argue that all lowercase is still best practice. And here’s why:

1) A mix of uppercase and lowercase letters looks bad and is generally a little more difficult to read. All-lowercase letters result in a more readable URL. (We’ve also noticed that people who mix cases often skip hyphens too.)

2) Having all uppercase letters looks like the URL is yelling at the user.

3) People have to put in more effort to type with caps. (If you’re thinking “but they get to the same page either way,” keep in mind there are users that do not know that. And note our next point).

4) A URL with differing capitalization counts as a different page, which can cause duplicate content issues to crop up unless you employ correct canonical tags. Without correct canonicals, you may get flagged by Google for duplicate content–or find the wrong version of your page appearing in the SERPs.

Just to help you better understand #1 and #2 above, here are some examples:

1) Good: yourwebsite.com/best-seo-practices-for-beginners

Bad: yourwebsite.com/Best-SEO-Practiced-For-Beginners or yourwebsite.com/BestSEOPracticesForBeginners

2)  Good: yourwebsite.com/best-seo-practices-for-beginners

Bad: yourwebsite.com/BESTSEOPRACTICESFORBEGINNERS

5. Do Not Omit Important Modifiers

For a very long time, you would hear that you should always leave out “stop words” in your URLs because they just take up space. And this was true–for a while–and still is when it comes to articles and conjunctions. 

However, with Google’s BERT algorithm (an NLP algorithm that processes human language), Google often ignored prepositions since they were seen as minor signals in relation to content meaning. However, over time, the Google team realized that these words often add a lot of contextual meaning to searchers’ intent. 

For example, someone looking for “restaurants near downtown Nashville” has a slightly different intent than “restaurants in downtown Nashville.”

So, if the word is important to understanding the category of the content on the page, keep it in your URL.

6. Avoid using special characters

Special characters tend to clutter your URL. They’re also a bit more ambiguous in meaning than words. Therefore, it’s best to leave them out. Another way to think about this rule is that you never want a user to be confused if you were to tell them what to type. 

For example: if you have the URL example.com/SEO-&-so-much-more, the user would likely type out “and” instead of the “&”.

7. Omit unnecessary numbers

Why remove numbers from your URL text? Content re-optimization is a possibility for any blog post. When you have a number in your URL, you’re not leaving your content room to grow in the future. And if you do change the number of items on a list-style blog, you wind up needing to create a new URL and a redirect.

For example: numbers in a list (ex: 7-ways-to-improve-your-running-form)

8. Reduce the number of subfolders

When it comes to subfolder best practices, keep it simple. While a short URL is easier for searchers to scan, subfolders tell Google more about the organization of your page through category names. However, Google does warn that a long chain of subfolders can lead to a long URL that isn’t as easy for users to understand at first glance in the SERPs. Having several subfolders in your URL structure can also signal to Google that the content is less important.

When creating subfolders, remember to keep your category names concise to reduce the risk of an overly long URL slug.

Other Questions Regarding SEO and URLs

So, when it comes to creating URLs going forward, use the guidelines above. However, we know that sometimes we don’t learn about the magic of SEO until we’re 100+ pages deep. We’ve got you covered. This section answers the questions many people have about existing URLs and SEO.

Q: I have a lot of pages that don’t follow URL best practices. What should I do?

This question comes up time and time again: “Should I change my URLs just for SEO?” The answer is, “it depends.” When it comes to URLs, you have a few choices for re-optimizing (or not) existing URLs. 

  1. How ugly/bad is it? If it’s really bad, then change it, but make sure you set up a redirect.
  2. Has it been redirected before? The last thing you want is a chain of redirects. If this is the case, and the URL isn’t that bad–leave it alone.
  3. Did the content change? If the URL is no longer relevant to the page, change the URL.
  4. Is the URL creating keyword cannibalization? Then change the URL to reflect another keyword that the page could rank for.

Q: Does my domain name affect my ranking?

No. Google doesn’t count a site’s domain name as a ranking factor according to Google’s John Mueller. However, while search engine crawlers don’t care about your domain name, users do. A good domain name can improve brand awareness and visitors’ ability to remember your site. To create a good domain, avoid confusing spelling, unsafe characters, and make it memorable.

Q: What are dynamic URLs?

Dynamic URLs are URLs that change with each request, usually generated by a server-side scripting language. They are used to track user interactions and to pass data between pages. You will often see these on share URLs (for example, on Amazon and Zillow). This is because the unique tracking parameter allows sites to better understand how the shared link is used.

Q: Does anchor text affect how Google sees my destination page?

Yes. Google uses the anchor text as well as annotation text to better understand the content of your destination page. We recommend using diverse (but on-topic) anchor text and Focus-Term-rich content surrounding the link.

You Are Ready to Create URLs for SEO Success

Whether you’re the site owner of a small business or a marketing content writer for an enterprise-level company, your URLs matter. A concise URL slug can help Google and other search engines better understand your content.

For the best search engine rankings, always refer to URL best practices and use the best SEO tools for content creation. High-quality content with a URL that reflects the topic is a winning combination for landing pages, blogs, and product pages.

To learn more about SEO tools and SEO strategy, check out our SEO starter guide. With SEO from the address bar to your internal links, you can begin to improve your site’s overall search engine rankings with confidence.

 

There are three directives (commands) that you can use to dictate how search engines discover, store, and serve information from your site as search results:

  • NoIndex: Don’t add my page to the search results.
  • NoFollow: Don’t look at the links on this page.
  • Disallow: Don’t look at this page at all.

These directives allow you to control which of your site pages can be crawled by search engines and appear in search.

What does No Index mean?

The noindex directive tells search crawlers, like googlebot, not to include a webpage in its search results.

Indexing is the process by which content that Google discovers while scanning, or ‘crawling,’ the internet is added to the search engine’s library of search-accessible content.

How Do You Mark A Page NoIndex?

There are two ways to issue a noindex directive:

  1. Add a noindex meta tag to the page’s HTML code
  2. Return a noindex header in the HTTP response

By using the “no index” meta tag for a page, or as an HTTP response header, you are essentially hiding the page from search.
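For the header route, the response needs an X-Robots-Tag header. Here is a small Python sketch that builds the header value; how you attach it depends on your server or framework, and the function itself is illustrative:

```python
def robots_header(noindex: bool = True, nofollow: bool = False) -> dict:
    """Build the X-Robots-Tag HTTP response header that mirrors the
    robots meta tag. Attach the returned header in your server or
    framework of choice."""
    directives = [d for d, on in (("noindex", noindex), ("nofollow", nofollow)) if on]
    return {"X-Robots-Tag": ", ".join(directives)} if directives else {}

print(robots_header(nofollow=True))
# {'X-Robots-Tag': 'noindex, nofollow'}
```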

The noindex directive can also be used to block only specific search engines. For example, you could block Google from indexing a page but still allow Bing:

Example: Blocking All Search Engines

<meta name="robots" content="noindex">

Example: Blocking Only Google

<meta name="googlebot" content="noindex">

Please note: As of September 2019, Google no longer respects noindex directives in the robots.txt file. Noindex now MUST be issued via HTML meta tag or HTTP response header. For more advanced users, disallow still works for now, although not for all use cases.

What is the difference between noindex and nofollow?

It’s a difference between storing content, and discovering content:

noindex is applied at the page-level and tells a search engine crawler not to index and serve a page in the search results.

nofollow is applied at the page or link level and tells a search engine crawler not to follow (discover) the links.

Essentially the noindex tag removes a page from the search index, and a nofollow attribute removes a link from the search engine’s link graph.

NoFollow As a Page Attribute

Using nofollow at a page level means that crawlers will not follow any of the links on that page to discover additional content, and the crawlers will not use the links as ranking signals for the target sites.

<meta name="robots" content="nofollow">

NoFollow as a Link Attribute

Using nofollow at a link level prevents crawlers from exploring a specific link, and prevents that link from being used as a ranking signal.

The nofollow directive is applied at a link level using a rel attribute within the anchor (<a href>) tag:

<a href="https://domain.com" rel="nofollow">

For Google specifically, using the nofollow link attribute will prevent your site from passing PageRank to the destination URLs.
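If your site generates links in code (for example, around user-submitted URLs), a small helper keeps the attribute consistent. This is a sketch; the function name and escaping choices are mine:

```python
from html import escape

def nofollow_link(url: str, text: str) -> str:
    """Render an anchor tag with rel="nofollow" so the link is neither
    followed for discovery nor counted as a ranking signal."""
    # Escape the URL and the link text so the markup stays well-formed.
    return f'<a href="{escape(url, quote=True)}" rel="nofollow">{escape(text)}</a>'

print(nofollow_link("https://domain.com", "Example Site"))
# <a href="https://domain.com" rel="nofollow">Example Site</a>
```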

 

However, Google did recently announce that as of March 1, 2020 the search engine will begin to treat NoFollow links as “hints” that contribute to a site’s overall search authority.

Why Should You Mark a Page as NoFollow?

For the majority of use cases, you should not mark an entire page as nofollow – marking individual links as nofollow will suffice.

You would mark an entire page as nofollow if you did not want Google to view the links on the page, or if you thought the links on the page could hurt your site.

In most cases blanket page-level nofollow directives are used when you do not have control over the content being posted to a page (ex: user generated content can be posted to the page).

Some high-end publishers have also been blanket applying the nofollow directive to their pages to dissuade their writers from placing sponsored links within their content.

How Do I Use NoIndex Pages?

Mark pages as noindex that are unlikely to provide value to users and should not show up as search results. For example, pages that exist for pagination are unlikely to have the same content displayed on them over time.

domain.com/category/results?page=2 is unlikely to show a user better results than domain.com/category/results?page=1, and the two pages would only compete with each other in search. It’s best to noindex pages whose only purpose is pagination.

Here are types of pages you should consider noindexing:

  • Pages used for pagination
  • Internal search pages
  • Ad-optimized landing pages
    • Ex: Pages that only display a pitch and sign-up form, with no main navigation
    • Ex: Duplicate variations of the same content, used only for ads
  • Archived author pages
  • Pages in checkout flows
  • Confirmation pages
    • Ex: Thank-you pages
    • Ex: Order-complete pages
    • Ex: “Success!” pages
  • Some plugin-generated pages that are not relevant to your site (ex: if you use a commerce plugin but don’t use its regular product pages)
  • Admin pages and admin login pages

Marking a Page Noindex and Nofollow

A page marked both noindex and nofollow will block a crawler from indexing that page, and block a crawler from exploring the links on the page.

Essentially, the image below demonstrates what a search engine will see on a webpage depending on how you’ve used noindex and nofollow directives:

Marking an Already Indexed Page as NoIndex

If a search engine has already indexed a page, and you mark it as noindex, then next time the page is crawled it will be removed from the search results.

For this method of removing a page from the index to work, you must not be blocking (disallowing) the crawler with your robots.txt file.

If you are telling a crawler not to read the page, it will never see the noindex marker, and the page will stay indexed although its content will not be refreshed.

How do I stop search engines from indexing my site?

If you want to remove a page from the search index after it has already been indexed, complete the following steps:

  1. Apply the noindex directive. Add the noindex attribute to the meta tag or HTTP response header.
  2. Request that the search engine crawl the page. For Google, you can request re-indexing of the page in Search Console. This will trigger Googlebot to crawl the page, where it will discover the noindex directive. You will need to do this for each search engine from which you want the page removed.
  3. Confirm the page has been removed from search. Once you’ve requested the crawler revisit your webpage, give it some time, and then confirm that your page has been removed from the search results. You can do this by going to any search engine and searching for site: followed by the target URL, like in the image below. If your search returns no results, then your page has been removed from that search index.
  4. If the page has not been removed, check that you do not have a “disallow” directive for it in your robots.txt file. Google and other search engines cannot read the noindex directive if they are not allowed to crawl the page. If you do, remove the disallow directive for the target page, and then request crawling again.
  5. Set a disallow directive for the target page in your robots.txt file: Disallow: /page$
     You’ll need to put the dollar sign at the end of the URL in your robots.txt file, or you may accidentally disallow any pages under that page, as well as any pages that begin with the same string. Ex: Disallow: /sweater will also disallow /sweater-weather and /sweater/green, but Disallow: /sweater$ will only disallow the exact page /sweater.
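The prefix-versus-exact matching behavior of the dollar sign can be sketched with a tiny Python helper. Note that rule_matches is a hypothetical function for illustration only; real crawlers like Googlebot also support * wildcards inside rules, which this sketch ignores.

```python
def rule_matches(rule: str, path: str) -> bool:
    """Check whether a Disallow rule matches a URL path.

    With a trailing $ anchor, the rule must match the entire path;
    without it, the rule is a simple prefix match.
    (Simplified: ignores * wildcards that real crawlers support.)
    """
    if rule.endswith("$"):
        return path == rule[:-1]
    return path.startswith(rule)

# Without the $ anchor, /sweater also blocks pages that merely share the prefix:
print(rule_matches("/sweater", "/sweater-weather"))  # True
print(rule_matches("/sweater", "/sweater/green"))    # True

# With the $ anchor, only the exact page is blocked:
print(rule_matches("/sweater$", "/sweater"))          # True
print(rule_matches("/sweater$", "/sweater-weather"))  # False
```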

How to Remove a Page from Google Search

If the page you want removed from search is on a site that you own or manage, most sites can use the Webmaster URL Removal Tool.

The Webmaster URL removal tool only removes content from search for about 90 days; if you want a more permanent solution, you’ll need to use a noindex directive, disallow crawling from your robots.txt, or remove the page from your site. Google provides additional instructions for permanent URL removal here.

If you’re trying to have a page removed from search for a site that you do not own, you can request that Google remove the page from search if it meets the following criteria:

  • Displays personal information like your credit card or social security number
  • The page is part of a malware or phishing scheme
  • The page violates the law
  • The page violates a copyright

If the page does not meet one of the criteria above, you can contact an SEO firm or PR company for help with online reputation management.

Should you noindex category pages?

It is usually not recommended to noindex category pages, unless you are an enterprise-level organization spinning up category pages programmatically based on user-generated searches or tags and the duplicate content is getting unwieldy.

For the most part if you are tagging your content intelligently, in a way that helps users better navigate your site and find what they need, then you’ll be okay.

In fact, category pages can be goldmines for SEO as they typically show a depth of content under the category topics.

Take a look at this analysis we did in December, 2018 to quantify the value of category pages for a handful of online publications.

We found that category landing pages ranked for hundreds of page 1 keywords, and brought in thousands of organic visitors each month.

The most valuable category pages for each site often brought in thousands of organic visitors each.

Take a look at EW.com below, we measured the traffic to each page (represented by the size of the circle) and the value of the traffic to each page (represented by the color of the circle).

Monthly Organic Traffic to Page = Size
Monthly Organic Value of Page = Depth of Color

Now imagine the same charts, but for product-based sites where visitors are likely to make active purchases.

That being said, if your categories are similar enough to cause user confusion, or to compete with each other in search, then you may need to make a change:

  • If you are setting the categories yourself, then we would recommend migrating content from one category to the other and reducing the total number of categories you have overall.
  • If you are allowing users to spin up categories, then you may want to noindex the user generated category pages, at least until the new categories have undergone a review process.

How do I stop Google from indexing subdomains?

There are a few options to stop Google from indexing subdomains:

  • You can add a password using an .htpasswd file
  • You can disallow crawlers with a robots.txt file
  • You can add a noindex directive to every page in the subdomain
  • You can 404 all of the subdomain pages

Adding a Password to Block Indexing

If your subdomains are for development purposes, then adding an .htpasswd file to the root directory of your subdomain is the perfect option. The login wall will prevent crawlers from indexing content on the subdomain, and it will prevent unauthorized user access.

Example use cases:

  • dev.domain.com
  • staging.domain.com
  • testing.domain.com
  • qa.domain.com
  • uat.domain.com

Using robots.txt to Block Indexing

If your subdomains serve other purposes, then you can add a robots.txt file to the root directory of your subdomain. It should then be accessible as follows:

https://subdomain.domain.com/robots.txt

You will need to add a robots.txt file to each subdomain that you are trying to block from search. Example:

https://help.domain.com/robots.txt

https://public.domain.com/robots.txt

In each case the robots.txt file should disallow crawlers. To block most crawlers with a single command, use the following code:

User-agent: *

Disallow: /

The star (*) after User-agent: is called a wildcard; it will match any sequence of characters. Using a wildcard sends the disallow directive that follows to all user agents regardless of their name, from Googlebot to Yandex.

The forward slash (/) after Disallow: tells the crawler that all pages on the subdomain are included in the disallow directive.
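Before deploying a robots.txt file like this, you can sanity-check its effect with Python’s standard-library urllib.robotparser. The subdomain URLs below are placeholders:

```python
from urllib.robotparser import RobotFileParser

# The blanket-disallow robots.txt from above, parsed directly from a
# string rather than fetched from a live subdomain.
robots_txt = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Every user agent is refused every path on the subdomain:
print(parser.can_fetch("Googlebot", "https://staging.domain.com/"))          # False
print(parser.can_fetch("YandexBot", "https://staging.domain.com/any/page"))  # False
```

Note that robotparser only answers “may this agent crawl this URL?” — it does not tell you whether already-indexed pages will drop out of search.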

How to Selectively Block Indexing of Subdomain Pages

If you would like some pages from a subdomain to show up in search, but not others, you have two options:

  • Use page-level noindex directives
  • Use folder or directory-level disallow directives

Page-level noindex directives will be more cumbersome to implement, as the directive needs to be added to the HTML or HTTP response header of every page. However, noindex directives will stop Google from indexing a subdomain whether the subdomain has already been indexed or not.

Directory-level disallow directives are easier to implement, but will only work if the subdomain pages are not in the search index already. Simply update the subdomain’s robots.txt file to disallow crawling of the applicable directories or subfolders.

How Do I Know if My Pages are NoIndexed?

Accidentally adding a noindex directive to a page on your site can have drastic consequences for your search rankings and search visibility.

If you find a page isn’t seeing any organic traffic despite good content and backlinks, first spot check that you haven’t accidentally disallowed crawlers from your robots.txt file. If that doesn’t solve your issue, you’ll need to check the individual pages for noindex directives.
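As a quick way to spot-check individual pages, a short script can scan a page’s HTML for a robots or googlebot noindex meta tag. This is a minimal sketch using only the standard library; has_noindex is a hypothetical helper, and it does not catch noindex sent via the X-Robots-Tag response header:

```python
from html.parser import HTMLParser


class NoindexChecker(HTMLParser):
    """Scan an HTML document for a robots/googlebot noindex meta tag."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        name = (attrs.get("name") or "").lower()
        content = (attrs.get("content") or "").lower()
        # Only meta tags aimed at crawlers count, and only if the
        # content attribute actually contains the noindex directive.
        if name in ("robots", "googlebot") and "noindex" in content:
            self.noindex = True


def has_noindex(html: str) -> bool:
    checker = NoindexChecker()
    checker.feed(html)
    return checker.noindex


print(has_noindex('<html><head><meta name="robots" content="noindex"></head></html>'))  # True
print(has_noindex('<html><head><title>OK</title></head></html>'))                       # False
```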

Checking for NoIndex on WordPress Pages

WordPress makes it easy to add or remove this tag on your pages. The first step in checking for noindex on your pages is simply toggling the Search Engine Visibility setting within the “Reading” tab of the “Settings” menu.

This will likely solve the problem; however, this setting works as a ‘suggestion’ rather than a rule, and some of your content may end up being indexed anyway.

To ensure absolute privacy for your files and content, you will have to take one final step: password protecting your site, using either cPanel management tools (if available) or a simple plugin.

Likewise, removing this tag from your content can be done by removing the password protection and unchecking the visibility setting.

Checking for NoIndex on Squarespace

Squarespace pages are also easily noindexed using the platform’s Code Injection capability. Like WordPress, Squarespace content can be blocked from routine searches using password protection; however, the platform advises against relying on this step to protect the integrity of your content.

By adding the NoIndex line of code within each page you want to hide from internet search engines and to each subpage below it, you can ensure the safety of secured content that should be barred from public access. Like other platforms, removing this tag is also fairly straightforward: simply using the Code Injection feature to take the code back out is all you will need to do.

Squarespace is unusual here: its competitors offer this option primarily as part of the settings in their page-management tools, while Squarespace allows you to edit the code directly. The upside is that you can see the exact change you are making to your page, unlike on other platforms in this space.

Checking for NoIndex on Wix

Wix also allows for a simple and fast fix for NoIndexing issues. In the “Menus & Pages” settings, you can simply deactivate the option to ‘show this page in search results’ if you want to NoIndex a single page within your site.

As with its competitors, Wix also suggests password protecting your pages or entire site for extra privacy. However, Wix departs from the others in that the support team does not prescribe parallel action on both fronts in order to secure content from the crawler. Wix makes a particular note about the difference between hiding a page from your menu and hiding it from search criteria.

This is particularly useful advice for less experienced website builders, who may not initially realize that removing a page from your site menu makes it unreachable through the site’s navigation, but not through a Google search.

How Google Determines Relevance

Search engines want to prioritize the most relevant content possible for different searches and search terms (also known as keywords). One of the ways search engines are able to tell that content on a webpage is relevant for a particular search term, is that the term appears in the body of the content on the page. You’ll hear the terms on-page content and on-site content used in the SEO world. On-page content refers more to the content that is visible on the page itself, while on-site content can sometimes be used more broadly to include content found in meta data or schema markup.

Onsite content refers to both VISIBLE (page copy) and INVISIBLE (meta data) content.

Keywords

If you were interested in finding an Italian restaurant in NYC, you might type Italian Restaurant NYC into the search bar. Google could quickly find pages that include all of those terms, but there are going to be hundreds of pages that include the words “italian”, “restaurant”, and “NYC” (or variations on NYC, such as “new york” and “manhattan”).

To surface the most relevant results (as opposed to just a diner in NYC which has an italian sub on the menu), Google looks for additional “focus” keywords in the copy of the page to help determine that the ENTIRE page is relevant to the search term, as opposed to just one line on the page.

Focus Keywords

If I searched Italian Restaurant NYC Google would expect a relevant page to have some terms on it like pasta and parmesan, maybe burrata.

The more terms included in the copy of a page that Google knows are relevant to Italian restaurants, the more likely Google is to prioritize that result in the search results compared to a page that has fewer focus keywords.

Search engines also take into account the frequency of certain terms on a page. For example, how often terms like entree or course are used can help a search engine understand if a page with the term italian is more relevant to a restaurant search as opposed to a sandwich search.
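The idea of term frequency can be sketched in a few lines of Python. Here term_frequencies is a hypothetical helper using whole-word, case-insensitive matching, far simpler than what a real search engine does:

```python
import re
from collections import Counter


def term_frequencies(copy: str, terms: list[str]) -> Counter:
    """Count how often each focus term appears in the page copy.

    Whole-word, case-insensitive matching; a real SEO tool would also
    handle stemming and multi-word phrases more robustly.
    """
    counts = Counter()
    text = copy.lower()
    for term in terms:
        counts[term] = len(re.findall(r"\b" + re.escape(term.lower()) + r"\b", text))
    return counts


copy = ("Our Italian restaurant serves every entree with a first course "
        "of antipasti. Each entree pairs with an Italian wine.")
freqs = term_frequencies(copy, ["entree", "course", "italian"])
print(freqs["entree"], freqs["course"], freqs["italian"])  # 2 1 2
```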

Focus keywords help search engines determine how relevant a page is to the term that was searched.

Selecting Keywords

When optimizing content for search engines, you cannot be all things to all people. A page is unlikely to rank for both fancy birdhouses and Italian restaurants in NYC; those two terms will have almost no overlapping focus keywords.

This means that sites need to optimize content for the types of users (and searches) which are most likely to bring converting traffic to a website.

Targeting Volume

For a site whose profit model runs on advertising and/or readership, it makes sense to prioritize keywords purely based on their volume. The more eyeballs on a site, the more likely content is to be shared, and the more advertisements will get viewed.

However, for most sites, search intent should be taken into account before volume.

Targeting Conversions

For sites where the profit model functions off of purchase, or participation, we want to look at the exact search terms being used. When it comes to converting traffic, the intent behind keywords becomes much more important.

Targeting Search Intent

Take, for example, two searches: one is used jeep dealers south detroit and the other is used blue cars. The first term indicates a much clearer intent to purchase, as the person already knows the type of vehicle they’re looking for, and is looking for a physical location where they can see the vehicles.

For your own site, if the goal is to attract more converting users to the site, you want to target keywords that suggest an intent to engage, purchase, donate, or otherwise complete a goal relevant to your business.

If you sell umbrellas a converting user is a user who will buy an umbrella. The term rain might be related to umbrellas, and have a huge search volume, but it’s unlikely to be the term that a user looking to purchase an umbrella would use for a search. Take a look at the keywords below.

Rain has the highest search volume, but not the highest CPC. The other keywords also have much clearer search intent (looking for umbrellas, specific types of umbrellas). Given the organic ranking difficulty of the keyword, the search volume, the CPC, and the search intent — custom umbrella would be the best term to target. However, if you don’t offer custom printed umbrellas, the term would not be converting for your site because the searcher would not be able to find the product they were looking for.

You should only target keywords that are likely to bring in CONVERTING traffic to your site.

Improving Content

To have your site receive more converting traffic, we want to target terms that are likely to convert for the site, and then create content that Google (and other search engines) will recognize as hyper relevant for those searches.

Four-Step Process

  • Keyword Research
  • Ranking Analysis
  • Content Optimization
  • Adding Links
Keyword Research

What terms would you want to rank for? What types of keywords or search terms would a user looking for your business put into the search bar? Selecting these keywords is done through keyword research, where you look at where competitor sites are getting traffic, as well as how currently converting traffic is coming into your domain.

Ranking Analysis

Look at the keyword metrics to determine how difficult it will be for your site to rank for each of the identified terms. More difficult terms (terms with more competition) will have to have longer content, with more focus keywords included, for your site to be able to make it onto the first page of search results. 

Content Optimization

There are two parts to content optimization:

The first part is creating useful, meaningful, content that provides value to your desired audience. To get some ideas for what content might be useful to your audience, you can take a look at trending topics, frequently asked questions, and search volume for different topics or terms.

The second part is optimizing that content for search engines so that Google, Bing, and others know exactly what the content on the page is about and when to surface your content as a search result.

Adding Links

Search engines use internal links to help understand the subject matter and importance of pages within your site. The pages with the most internal links are viewed as the most important pages on your site.

Importance: Pages linked in your main navigation are linked to from every page on your site, and should be the ones you and your users consider most important.

Subject Matter: The anchor text that you use for internal links sends signals to search engines, telling them the subject matter or topic of the page. If you call a page “SEO Services” in your main navigation, and that text links users to a page – both the users and search engines expect that page to be about “SEO Services.”

Search engines use external links, in part, to evaluate the quality of the content you’re providing users. If you’re writing a piece on a topic, and linking to sites that a search engine already knows are authoritative on the same topic, it demonstrates that you are providing good resources. This increases a search engine’s trust in the relevancy of your content.

Recap

To optimize content we need to:

  • Establish which searches you want your content to appear for – these are your target keywords; you should pick 1-3 of them.
  • Establish the focus keywords that you need on your page to rank for each of those target keywords.
  • Incorporate those focus keywords into your page copy in a way that does not detract from the value of your original content.

Italian Restaurant Example

Let’s pretend we own an Italian restaurant in NYC and we want to capture people who are looking to select an Italian Restaurant in NYC.

Keyword Research

1. Identify Target Keywords

Remember: Target keywords should be the 1-2 search terms that get typed into Google where you REALLY want to be the result a user clicks on.

We’ll start by popping a few potential search terms into a keyword explorer (Ahrefs in this case):

The first thing to note is that we did pretty well with our initial keyword guesses! We can see we managed to include the parent topic. The parent topic is the keyword related to our search that gets the most monthly search volume.

Sometimes how YOU think about searching for something and how AVERAGE users think about searching for something will be different.

Pro-tip: Always keep an eye on the parent topic to see if there are additional keywords you could explore.

We’ll pick italian restaurants nyc because out of this set of keywords it has:

  1. High volume (4,300 monthly searches)
  2. Low keyword difficulty (8)
  3. Reasonably high CPC ($6)

 

Target Keyword Selected: italian restaurants nyc

How do we know which additional terms (focus keywords) need to be on the page for Google to recognize your content is hyper relevant for a search (target keyword)? We run that search ourselves, and then look at the top 10 results, and the content/terms on each of those pages.*

*For this example, we’re going to use the dashboard’s FREE Content Optimizer tool.

Ranking Analysis

Once we’ve identified our target keywords, the search terms we want our page to rank for, we’ll want to analyze the pages currently ranking in the top 20 search positions for those terms.

This will help us identify the topics and focus keywords that need to be on our page if we want it to rank. Incorporating focus keywords will improve the relevancy of our page for the target keyword and, as a result, our page’s position in the search engine results pages (SERPs).

2. Establish a List of Focus Keywords

Focus keywords are the terms that support our target keywords and help search engines recognize that our page is hyper-relevant for related searches.

Below you’ll find the current copy on our Italian restaurant’s home page. If we compare this copy to the copy on the first twenty search results for italian restaurants NYC we can see that our content currently only shares a few keywords with those pages (focus keywords are highlighted below in yellow).

Since the 19th century, townsfolk in Vatican City used to take a portion of all food brought in from local harvests to give to the needy. In 1935, the Aribetta family opened a restaurant for dinner in the nearby neighborhood, La Decima (The Tithe) in honor of the tradition and the people who continued to maintain it. Many iterations of their family later, in 2015, the descendants of the original founders of La Decima opened a satellite location in Brooklyn. Both restaurants adhere to two principles: stay close to the roots of their traditional recipes but with refreshed presentations, and local sourcing that gives back to the community.

TIME OUT NEW YORK

La Decima offers dishes like spaghetti cacio e pepe, roasted lamb and Italian ham served with hot, crispy mozzarella, accompanied by a list of all Italian wines.

NEW YORK OBSERVER

Even the pasta, which is hard to present in a way that gives proper credit to the effort needed to produce it, comes across well. The cacio e pepe, in which pecorino and Parmesan bind themselves to thick al dente strands of homemade spaghetti, is phenomenal.

NEW YORK TIMES

Where do you eat in Rome? Is it right that you love the Trastevere district and especially the La Decima restaurant, which has now opened a U.S. branch in the Park Slope section of Brooklyn, in New York City?

FODOR’S – ONLINE

Following the lead of the many lauded NYC chefs who have opened second and third restaurants here, one of Rome’s most celebrated restaurants, La Decima, just opened its first stateside outpost not in Manhattan, but in the Brooklyn neighborhood of Park Slope.

Our menus feature homemade pastas, homemade desserts, the finest Italian imported cheeses and olive oils, and farm-to-table ingredients.

*No outside bottles of wine are permitted*

3. Narrow Down to Missing Focus Keywords

Review the focus keywords already incorporated.
Out of 92 potential focus keywords, our content only uses 6, and is not going to be recognized as very relevant for our target term.

  • New York City
  • Olive oils
  • Travel guide
  • Al dente
  • Park Slope
  • Time Out New York
4. Categorize the Missing Focus Keywords

Here are some additional terms (focus keywords) shared by a number of the pages ranking in the top 10 positions in google, that we can consider incorporating into our content:

  • Del postos
  • East villages
  • Italian restaurant
  • Via carota
  • Il bucos
  • Meatpacking district
  • Rita sodi
  • Cheap eats
  • Village restaurants
  • Midtown west
  • Romantic restaurants
  • Central park
  • Brick oven
  • Traditional italian
  • Date night
  • Italian spot
  • Tasting menus
  • City guides
  • Special occasion
  • Wine bar
  • Fine dining
  • Cocktail bar
  • Italian food
  • Times square
  • Little italy
  • Locanda Verde
  • Danny Meyers
  • Restaurants in New York

Not all of these terms will make sense for our page, but we want to incorporate as many focus keywords as possible that will fit naturally into our page. Separate the list into two categories: off topic and on topic.

You may find that you see competitor brands as suggested focus terms, or terms which are not relevant for your product or service offerings. These terms will likely go into your “will not incorporate” category, unless you add them in with content or features like product comparison tables.
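The comparison between your existing copy and a focus-keyword list can be sketched as a simple coverage check. keyword_coverage is a hypothetical helper; real content tools also handle stemming and plural variants:

```python
def keyword_coverage(copy: str, focus_keywords: list[str]) -> tuple[list[str], list[str]]:
    """Split focus keywords into those already in the copy and those missing.

    Uses naive case-insensitive substring matching for illustration.
    """
    text = copy.lower()
    present = [kw for kw in focus_keywords if kw.lower() in text]
    missing = [kw for kw in focus_keywords if kw.lower() not in text]
    return present, missing


copy = "La Decima opened in Park Slope, serving al dente pastas and olive oils."
focus = ["Park Slope", "al dente", "olive oils", "wine bar", "date night"]
present, missing = keyword_coverage(copy, focus)
print(present)  # ['Park Slope', 'al dente', 'olive oils']
print(missing)  # ['wine bar', 'date night']
```

The missing list is what you would then categorize into on-topic terms to incorporate and off-topic terms to skip.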

Content Optimization

5. Incorporate Additional Focus Keywords

Below you’ll see the same copy from before, but this time with additional focus keywords incorporated into the copy. This content is now better optimized for search.

You can see the additional focus keywords we added in BLUE.

Since the 19th century, townsfolk in Vatican City used to take a portion of all food brought in from local harvests to give to the needy. In 1935, the Aribetta family opened an Italian restaurant for dinner in the nearby neighborhood, La Decima (The Tithe) in honor of the tradition and the people who continued to maintain it. Many iterations of their family later, in 2015, the descendants of the original founders of La Decima opened a satellite location in Brooklyn. Both restaurants adhere to two principles: stay close to the roots of their traditional italian recipes but with refreshed presentations, and local sourcing that gives back to the community, which have made it one of the best Italian Restaurants in New York City.

But where the original La Decima might be a bit far for your special occasion night out in NYC, if you’re looking for cozy romantic restaurants to offset the rush of a day out sightseeing in Central Park or Times Square, consider skipping Little Italy and heading out to the quieter Park Slope for authentic traditional Italian without the rush.

See what reviewers have to say about our cozy brick oven style Italian food and wine bar that make La Decima the perfect Italian spot for date night.

TIME OUT NEW YORK

La Decima offers dishes like spaghetti cacio e pepe, roasted lamb and Italian ham served with hot, crispy mozzarella, accompanied by a list of all Italian wines.

NEW YORK OBSERVER

Even the pasta, which is hard to present in a way that gives proper credit to the effort needed to produce it, comes across well. The cacio e pepe, in which pecorino and Parmesan bind themselves to thick al dente strands of homemade spaghetti, is phenomenal.

NEW YORK TIMES

Where do you eat in Rome? Is it right that you love the Trastevere district and especially the La Decima restaurant, which has now opened a U.S. branch in the Park Slope section of Brooklyn, in New York City?

FODOR’S – ONLINE TRAVEL GUIDE

Following the lead of the many lauded NYC chefs who have opened second and third restaurants here, one of Rome’s most celebrated restaurants, La Decima, just opened its first stateside outpost not in Manhattan, but in the Brooklyn neighborhood of Park Slope.

Our menus feature homemade pastas all perfectly al dente, homemade desserts, the finest Italian imported cheeses and olive oils, and farm-to-table ingredients. Our rotating tasting menus and full cocktail bar make La Decima the perfect place whether you’re looking for drinks, tapas, or fine dining restaurants in New York. Our brick oven pizzas are great for lunch, and you’ll be glad you got out of the east villages / west villages to sate your olive oil cravings!

Curious about our full date night recommendations for New York City? Check out our neighborhood travel guide, and let us know ahead of time if you’re celebrating a special occasion.

*No outside bottles of wine are permitted* Please check out the selection from our wine bar instead.

6. Check Content Ranking Power

Once you’ve added a number of additional focus keywords to your content, run it through a content tool again and see if your copy has a higher score, and is now capable of ranking on the first page of the search results.

We have now incorporated 26 focus keywords, and this content is capable of ranking in the top 10 Google search results.

Terms incorporated:

  • East villages
  • Romantic restaurants
  • Central park
  • Brick oven
  • Traditional italian
  • Date night
  • Italian spot
  • Tasting menus
  • City guides
  • Special occasion
  • Wine bar
  • Fine dining
  • Cocktail bar
  • Italian food
  • Times square
  • Little italy
  • Restaurants in New York
  • New York City
  • Olive oils
  • Travel guide
  • Al dente
  • Park Slope
  • Time Out New York
7. Add More Focus Keywords if Needed

Sometimes simple edits are not enough to incorporate relevant focus terms. In some cases, missing a large swath of focus terms is an indication that you’re missing a block of content users would find helpful.

In our Italian restaurant example, if we wanted to incorporate more terms and get ourselves into the top 3 results in Google, we would need to add a section that allows us to mention more locations in Manhattan. A great way to do this would be to add a section about how to travel to our restaurant from different areas of the city.

To rank on the first page of the search results, your page copy needs to include your target keyword AND supporting focus keywords.

Adding Links

8. Add Links

Once your content has been optimized for your target keyword(s) you should look for opportunities to insert links into your copy. You want to link internally to other pages using relevant anchor text, and externally to resources that will help your users.

For our Italian restaurant example, we might link internally to our menus and externally to our Yelp reviews or Open Table booking service.

 

Adding Links

There are two types of links that you’ll want to consider adding:

  • Internal Links: these are links that will bring a user from one page on your site, to a different page which is also on your site.
  • Outbound links: these are links that will bring a user from your site to another, different, site.
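The internal-versus-outbound distinction can be sketched as a small classifier. The domain ladecima.com is a made-up example, and relative URLs (no hostname) are treated as internal:

```python
from urllib.parse import urlparse


def classify_link(href: str, site_domain: str) -> str:
    """Label a link internal or outbound relative to a site's domain.

    Relative URLs have no hostname and are treated as internal;
    subdomains of the site's domain also count as internal.
    """
    host = urlparse(href).netloc
    if not host or host == site_domain or host.endswith("." + site_domain):
        return "internal"
    return "outbound"


print(classify_link("/menus/catering", "ladecima.com"))                     # internal
print(classify_link("https://ladecima.com/contact", "ladecima.com"))        # internal
print(classify_link("https://www.yelp.com/biz/la-decima", "ladecima.com"))  # outbound
```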

Internal Linking

We add internal links to:

  • Help users access additional content
  • Increase conversions
  • Help search engines crawl and index the site
  • Help distribute SEO value throughout the site
Help Users Access Additional Content

When adding links to a page, the first thing you’ll want to do is think of other RELEVANT content you’ve created that a user might want to access. For example, if we had a page on our italian restaurant’s website talking about how we cater events, it would make sense to link to our catering menu. Basically, you want to make it easy for a user to find all relevant content and information on your site.

Increasing Conversions

Review each page for conversion opportunities. Link to sign-up forms and scheduling forms, and prompt users to call you, email you, or make a purchase. Calls to action (CTAs) should be placed wherever a user might find them helpful. For example, on our catering page it would make sense to add a CTA linking the user to a page where they could request a quote or place an order.
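As an illustration, the CTA on that catering page could be as simple as a prominently styled link (the path and class name here are hypothetical):

```html
<a href="/catering/request-a-quote" class="cta-button">Request a Catering Quote</a>
```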

Helping Search Engines Index Your Site

Internal links also help search engines discover content on your site, and understand how that content is connected. Ideally all pages would be linked from the main navigation, and corresponding sub-navigations. Search engines discover site content in part by using site crawlers to map your site.

A site crawler is a program (also known as a bot) that lands on a page of your site and follows every link on it, indexing each page it finds. For each new page it discovers, the crawler repeats the same process: following every link and indexing any new pages it finds. Pages linked from the main navigation, and pages that are linked to frequently, are easy for crawlers to discover and index. Pages that are never linked to may not get indexed at all.

Distributing Search Equity

Lastly, internal links help distribute search equity through your site. If a search engine ranks one page on your site very highly, any other (relevant) pages you link to will also see a boost in the search results. This is partially because if Google thinks a page is relevant to users, it assumes that the additional content you link to will also be relevant to the user.

Outbound Linking

You may have heard a theory that linking out to other sites will cause you to lose search equity. While you don’t want to go crazy with links, being associated with other quality sites will increase your search equity.

It’s like the phrase “you’re known by the company you keep”: you are judged by the people you associate with. The same is true in search; good quality sites link out to high-quality resources.

It’s the difference between someone suggesting a medical procedure and giving you the results of relevant clinical trials, and someone suggesting a medical procedure and linking to a friend’s Instagram page who swears it works. The better the sources you reference, the more trust search engines will have in your content.

Quality Sites Reference Quality Resources


Search engines view this as a trust signal for two reasons: it shows that you

  1. Know enough about the topic or area to recognize quality content.
  2. Are focused on helping the user by surfacing other quality content.

Remember! Quality content is content that provides value to a user – it can be as simple as well curated comedy clips, or as complex as the results of cutting edge clinical trials. It all depends on what will best serve the user.

Actionable Take-Aways

Get more words ON the page

You need room for copy on your pages. You’ll need enough room to include your target keyword, and enough focus keywords to get your page ranking. The more competitive the keyword (aka the higher the keyword difficulty) the more focus keywords you will need to incorporate.

Pro tips:
  • You should try to include your target keyword(s) in the first paragraph of content on your page.
  • Google prioritizes in-depth content, so if you want a page to rank well, consider hitting the 2,000-4,000 word mark, or creating a series of subpages that support the main content or topic.
  • User engagement counts for more than you think, so add media, links, and CTAs that will cause users to take action (click, watch, share, buy, etc).
  • More weight is given to the content in your headers and titles than to the rest of your page copy, so give them extra consideration (we’ll cover this in more depth in our on-site technicals 101 piece).
  • If you’re posting a video, consider including a transcript on the page (even if it’s in an accordion/collapsible tile).

Remember: search engines can extract meaning from the use of synonyms, the context in which the keyword appears, and the frequency with which specific word combinations are mentioned.

Link Between Pages on Your Site

Don’t go crazy, but help the user access the information they’re looking for as easily as possible.

Pro tip:
  • Remember: crawlers can generally only find a page on your site if another page links to it (or it appears in your XML sitemap)!

Add Calls to Action (CTAs)

Calls to Action help users engage more effectively with your business and site. Prompt them to sign up, call now, email you, contact you, learn more, get started, schedule an appointment, or buy now!

Pro tips:
  • It’s good to have a CTA in your main nav, typically as a button on the right-hand side.
  • Placing a CTA at the bottom of your page gives users a clear next step when they’ve scrolled to the bottom of the page content.
  • Placing a CTA at the top of a page will help a user discover the action they should take, and gives them something to return to once they’ve read your content.
  • If you are a more sophisticated site, and have heatmap/tracking activated, putting a CTA right where you start to see a high percentage of drop-offs will help keep that audience engaged.

Keep the Content Valuable

Content on your site should be:

  • User-friendly
  • Unique
  • Authoritative and Trustworthy
  • Aligned with the search intent of your users
  • Updated regularly!

Pro tip:
  • Trends, information, and design standards change over time. You will need to review and update your content regularly to account for these changes and ensure your content continues to be valuable to your users.

Author Authority has become even more important in SEO to communicate the quality of your website content and the trustworthiness of your website as a whole.

But what is author authority and how is it understood?

Here is a guide to this content-quality standard and what it means for your SEO efforts.

What is Author Authority?

Author authority is a measurement of how much credibility and expertise a given author has on the topic they are writing about.

Say you’re not feeling well, and you go to an article online to get medical advice. When you get to the bottom of the article, you see that rather than being written by a health professional, the author bio describes the writer as working in real estate.

Likely, you would feel the content you just read has less trustworthiness, because the author doesn’t have that specific topic as their area of expertise.

Author authority is particularly important in fields like healthcare, law, finance, or any more technical niche. 

The bottom line is, when it comes to the issues that matter the most to us, we want to hear from experts.

Is Author Authority a Ranking Factor?

Technically, author authority is not a ranking factor. 

But the importance of author reputation has been growing in recent years. In its recently updated Search Quality Rater Guidelines, Google emphasized the importance of the creator of the main content.

Section 2.6 states, “An important part of the PQ rating is understanding the reputation of the website. If the creator of the MC [main content] is different from the creator of the website, it’s important to understand the reputation of the creator as well.”

So although author authority is not used directly in Google’s ranking algorithm, it is one way that Google understands quality content. And Google does evaluate content quality when deciding how to rank search results.

Who writes your content is something that searchers and Google’s quality raters do pay attention to, and thus webmasters who want to show up in search engines should pay attention to it as well.

How Does Google Measure Author Authority?

So how does Google know whether or not the byline of a piece of content is an authoritative source?

Here are the factors that you should include in your content to make sure Google sees your content creators as authoritative.

1. Author Bylines and Bios on Every Piece of Content

Every blog post or in-depth article on your website should be accompanied by a clearly displayed author byline and bio.

Although your primary service or product pages don’t need clearly displayed authorship, long-form SEO content that explores a topic or related subfields in depth should be accompanied by an author bio.

This also makes for more effective content. Per the previous example, you want your target audience to get to the bottom of an article and see that the particular author does have subject matter expertise.

2. Author About Page or Bio Page

Every content creator who has a blog post or article on your website should also have a bio page that communicates that they are a trusted expert.

Author information on an author page can include any of the following: job title, education, areas of expertise, the kind of content they create, and mentions of other trustworthy websites where that creator has published content.

If the content creator has written multiple blog posts or articles for your website, it’s good to link to all of their contributions from their author page.

3. Links to Social Media Profiles

You should also be including links to your content creators’ social media profiles on their bio pages.

This makes it easy for search quality raters and website visitors to further research your authors and cross reference their content to better evaluate whether or not they are true experts.

Having social media links can also make it easier for your expert authors to get verified on social media websites like Twitter.

It can help them earn Google Author panels and help elevate their status as experts, and thus your reputation as a webmaster who features expert authorship.

4. Author Schema on the Author Page

Schema markup makes it easier for Google to extract specific information about authors and display it in their search results.

You’ll often see that author bios that appear in search engine results are pulled directly from bios on the publications where those authors appear.

You can use the Schema Creator in your dashboard to easily generate author schema and add it to your bio pages.

Simply select the “Person” option and complete all of the required JSON-LD properties. Then copy and paste the markup into the HTML head section of your author pages.
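As a rough sketch, the generated markup might look something like the following (the name, URLs, and property values here are placeholders, and your generated output may include additional fields):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Person",
  "name": "Jane Doe",
  "jobTitle": "Registered Dietitian",
  "url": "https://www.example.com/authors/jane-doe",
  "sameAs": [
    "https://twitter.com/janedoe",
    "https://www.linkedin.com/in/janedoe"
  ]
}
</script>
```

The sameAs property is also a natural place to surface the social media profile links discussed above.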

Final Thoughts on Author Authority

The more content a website publishes, the more website owners should focus on establishing the expertise of their authors in specific fields.

If your website publishes content on different topics, author authority is also very valuable.

It will give your target audience full confidence that they are reading reliable information when they discover your content through search engine results.

Internal links allow Google to rank your site more accurately and index your site more effectively. 

Your website’s internal links not only improve the user experience; they also communicate your site architecture to web crawlers and show how your web content interrelates.

Without a strong, strategic internal linking structure, your site may lose SEO value and struggle to rank in search engines.

Here is a guide on SEO best practices for internal links, and some mistakes you might be making that could be impacting your organic visibility.

What are Internal Links?

An internal link is a hyperlink that points to a different page on the same website.

They are commonly used to help users navigate between different pages of a website, but can also be used for SEO purposes.

Internal links help to keep visitors on your website longer, which can improve your site’s SEO performance.

What are the Different Types of Internal Links?

There are a few different types of internal links you likely have on your website right now. 

Some of them will bring more SEO value than others, so it’s good to know the difference between each.

Menu/Navigation

The links in your menu/navigation bar are some of the most important internal links. These links remain consistent no matter where a site visitor travels across your website.

They should point to the most important pages (e.g. product categories, primary services, blog, about, etc.) and should give users a high-level overview of what type of content is on your website.

Because the majority of your link equity is most likely on your homepage, these internal links will distribute a significant amount of page rank across your website, so make sure the pages linked there are the most important and the ones you want to rank.

The internal links you include here will also communicate to those users visiting your website for the first time where to go next. 
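For illustration, a simple navigation menu for our Italian restaurant example might look like this (the page paths are hypothetical):

```html
<nav>
  <ul>
    <li><a href="/">Home</a></li>
    <li><a href="/menu">Menu</a></li>
    <li><a href="/catering">Catering</a></li>
    <li><a href="/about">About Us</a></li>
    <li><a href="/reservations">Book a Table</a></li>
  </ul>
</nav>
```

Because these links appear on every page, each destination receives a steady share of the site’s link equity.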

Footer Links

Footer links are at the bottom of your web pages. Like the nav bar, the footer is like an anchor that remains consistent across your website.

There may be some repetition between the links in your navigation menu and your footer, and that’s okay. Footer links will also send quite a bit of link equity from your homepage to the pages linked there.

If users reach the bottom of a web page and have not found a place to click next, you want them to find what they are looking for in the footer.

Buttons/CTA Links

The internal links that you include on your buttons or CTAs are important for shaping the user or buyer journey across your website and for conversion rate optimization.

Most likely, CTA links are pointing to web pages that push users further down the conversion funnel, whether that is to a web page to book a meeting, request a demo, submit an email address, or add an item to a cart.

The anchor text of these internal links will be primarily user and conversion focused.

Sidebar Internal Links

Sidebar links are often used to offer users relevant content options or suggest which page they could visit next.

For publishers that feature a lot of content on their website, sidebar links can help site visitors who are browsing your website without necessarily looking for something specific, but are just exploring the various content you offer.

Sidebar links are very common on news sites, recipe sites, or those that want the opportunity to show users multiple pages (and thus multiple advertisements).

In-Article Links

In-article links are those that are included in the body of blog posts or long-form articles. They point to relevant content that can provide users with more context or information.

These types of links are very common because they have loads of SEO value. 

If you are not linking to other relevant articles on your website within each blog post, you’re missing out on opportunities to improve your ranking positions and search engine visibility.

Why are Internal Links Important for SEO?

The SEO benefits of internal links are significant, and can improve your search engine visibility for a variety of reasons.

1. Direct Users & Google to your Most Important Pages

Internal links let Google know the most important content on your website. You can use internal links to help Google understand which pages to promote in the SERPs.

2. Help Google Find and Index your Pages

When indexing sites, search engine crawlers begin on your homepage and spread out from there, using internal links as their navigational guide. 

When you have a strong internal linking system, Google is more likely to find and index all your URLs, so your newest content has ranking potential.

3. Communicate Topical Relevance Through Anchor Text

You may wonder how Google knows what your site and landing pages are about. 

Google’s web crawlers use the anchor text from internal linking to understand the purpose and meaning of your content and its relevance to specific search terms.

Anchor text best practices can improve your SEO.

4. Maximized Crawl Budget

Strategic use of noindex and nofollow tags with your internal links can help you ensure that Google is crawling and indexing your most important pages.

For pages that don’t need to be indexed, like thank you or confirmation pages, internal links with nofollow directives can prevent low-value or low-converting pages from ending up in Google’s index. 

It also leaves room in your website’s crawl budget for Google to index those pages that you do want to rank.
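As a sketch, a thank-you page could be kept out of the index with a robots meta tag, and internal links pointing to it could carry a nofollow hint (the path is hypothetical):

```html
<!-- In the <head> of the thank-you page: ask search engines not to index it -->
<meta name="robots" content="noindex">

<!-- On pages linking to it: hint that crawlers need not follow the link -->
<a href="/thank-you" rel="nofollow">Thank you</a>
```

Note that for the noindex tag to be seen, the page must remain crawlable (i.e. not blocked in robots.txt).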

5. Better User Experience

Internal links also make your website a better place for site visitors.

Navigation links guide users along a conversion journey after they find you in the SERPs, and in-content links can point them to other relevant pages.

6. Displays Topical Depth and Breadth

Interlinking your topically related pages can turn your website into a topical powerhouse.

Having lots of internal links in your blog posts to related topics or subtopics shows Google crawlers that your website has topical authority, and is a go-to expert source in a particular industry niche or topic area.

How to Analyze Your Internal Links for SEO

If you are not sure whether or not you have internal link issues on your website, a site crawler or site audit tool can help you identify any issues.

To run a site audit, do the following.

  1. Navigate to the Site Audit tool in your dashboard
  2. Enter your homepage URL into the Auditor and click “Audit Site”
  3. Select your preferred User Agent, Crawl Speed, and Crawl Budget
  4. Wait for your audit to generate. Depending on the size of your website, it may take up to a day for the auditor to crawl all of your pages. You’ll receive an email when your site audit is ready.
  5. Look for your homepage in the Sites List and click “View Audit.”

If you are not comfortable using our software on your own, you can also order an Internal Linking Analysis in our order builder. Our technical SEO experts will determine if there are any link issues on your site and provide a roadmap for how to optimize your internal linking profile for better organic visibility.

 

Common Issues with Internal Links

You can use the Site Auditor to see whether or not you are utilizing internal linking best practices. 

Our report will flag any internal linking issues that may be preventing your web pages from earning higher keyword rankings in the SERPs.

Not Enough Internal Links

One of the most common mistakes that new or unoptimized websites make is that they do not include enough internal links on their web pages.

If your web pages fail to include enough internal links, it will be flagged in your site audit report.

This may or may not be an easy fix, depending on the number of web pages you have on your website. 

To resolve the issue, do the following:

  • If you already have relevant content on your website but you are just not linking to it, adding internal links to those pages is the first step to resolving this issue.
  • But if you are a newer website, you will need to write and publish relevant content on your website, and it will need to be high-quality in order to bring SEO value. Then, once the content is live on your website, you can take the next step of adding internal links.

Too Many Internal Links

Although you want to include internal links on your web pages, too many outlinks on a page (both external and internal) can appear like over-optimization to Google.

Make sure that you are only including links to relevant, helpful content. And don’t overdo it by stuffing your navigation menu or footer with too many internal links. 

Reserve those links for the most important pages on your website – the ones you really want to rank in the SERPS.

Broken Internal Links

Another very common issue that may be flagged in your site audit report is broken internal links.

A broken internal link occurs when you move or delete a page on your website and do not update existing internal links with the new destination URL.

As a result, those internal links point to 404 pages. Sending Google crawlers and users to a dead page is not good for SEO or for the user experience.

Broken internal links are very common with large enterprise or ecommerce websites that are constantly updating their content. 

To resolve a broken internal link, take one of the following actions:

  1. Restore the dead/deleted page
  2. Update the internal link with a new destination URL

Internal Links with Redirects

Sometimes, webmasters may not be worried about internal links because they use 301 redirects whenever they move or delete a page.

Although 301 redirects are good for SEO in terms of the links from other websites that point to your web pages, internal links with 301 redirects are not considered SEO best practice.

Why? Because redirecting internal links slows down your website and causes Google crawlers to have to move through your website at a slower pace.

Whenever you move a page, part of your website maintenance needs to be updating any internal links with the new destination URL.
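For example, if /old-menu now 301-redirects to /menu, internal links should be updated to point straight at the final destination (paths hypothetical):

```html
<!-- Before: forces crawlers and users through a redirect hop -->
<a href="/old-menu">View our menu</a>

<!-- After: links directly to the live page -->
<a href="/menu">View our menu</a>
```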

This shows Google crawlers that you are an attentive webmaster, and thus makes them more likely to promote your pages.

Unoptimized Anchor Text

The anchor text that you use to internally link your pages is also important to your keyword rankings and your user experience.

Anchor text lets Google know what your other web pages are about, how your content interrelates, and displays the many valuable pieces of content that live permanently on your website.
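A quick illustration of the difference between generic and descriptive anchor text (the URL is hypothetical):

```html
<!-- Generic anchor text tells crawlers little about the destination -->
To learn about our pizza, <a href="/blog/wood-fired-pizza">click here</a>.

<!-- Descriptive anchor text communicates the destination page's topic -->
Learn how we make our <a href="/blog/wood-fired-pizza">wood-fired Neapolitan pizza</a>.
```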

Final Thoughts on Internal Links

Your website’s internal link profile is essential to optimize if you want to rank for high-value keywords in your industry.

Taking the time to audit your internal links and repair any issues can be all the difference in your ranking positions.

 

We all know that linking to other pages within your own website architecture, also known as internal linking, matters for ranking purposes. It helps search engine crawlers index your site, and the more you link to a page internally, the more important search engines believe that page is to your site (the better that page’s chances are of being prioritized in search).

External linking can also be helpful to your SEO and ranking. However, many companies, agencies, and small businesses are still hesitant about linking to outside sources from their own pages for fear of losing users, or losing search equity.

Outbound Linking Barrier #1

The fear of losing search equity demonstrates a slight misunderstanding of how links work in terms of SEO.

The Hose Myth

Many people think of links like hoses that search equity flows through. In this mental model, search equity originates from users, who bestow it on a site by visiting and engaging, and that search equity then flows to other pages and sites via links. The problem with this view is that it treats search equity as a finite commodity, leading you to believe you “lose” search equity every time you link to another site (this is false).

A Better Mental Model

Think of a link like a recommendation. One site is recommending another site to their users by linking to that site.

Let’s take that metaphor a little further with a scenario: suppose your friend asks you for recommendations on someone to hire. Consider the following two outcomes:

  • You give them recommendations for two really good candidates.
  • You give them recommendations for the full 523 people you had in your phone contacts.

In the first outcome your friend probably found you very helpful, and would come to you for help again. In the second outcome, your friend probably didn’t find that useful at all, and they are unlikely to return to you for help.

Search engines are similar. If they see a site link out to high-quality, reputable resources, then they feel like that site is helpful, and they’ll reward that helpfulness in search.

To summarize, here are three key reasons why outbound links work for companies of any size.

  • Search engines judge you by the company you keep (especially Google). Your reputation can benefit from being associated with other sites that are well known for being reputable on related topics.
  • Users prefer information to be curated for them rather than having to find it themselves. Whenever you have the opportunity to do so, link to the most relevant resources in your field. This will encourage a user to bookmark your page or share your content.
  • Linking shows that you know which resources are most relevant. This enables you to highlight to Google that you know authoritative content when you see it.

Outbound Linking Barrier #2

The next major barrier most businesses face to outbound linking is the concern that they’ll lose converting users to other sites.

The Lost Traffic Concern

As pointed out by Moz, it’s true that by linking to another website, you’re directing some traffic away from your own page.

Benefits Outweigh the Cost

Most sites set external links to open in a new tab, reducing the chance of the user being truly pulled away from the site. Additionally, users who are still in the research phase are unlikely to have converted during their session anyway.
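Opening an external link in a new tab is a one-attribute change; rel="noopener" is commonly added alongside it for security (the URL is hypothetical):

```html
<a href="https://www.example.com/clinical-trial-results" target="_blank" rel="noopener">
  relevant clinical trial results
</a>
```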

In your site’s overall SEO strategy, each page is an opportunity to showcase your expertise and depth of knowledge about the topic, space, or industry through your content. When you reference other authoritative sources via outbound links it builds trust for your own website with users, and sends content quality signals to search engines. In this way, outbound linking helps improve the SEO health of your website and reduces behaviors that negatively impact SEO, like u-turns and bounces.

Furthermore, posting an outbound link to a site you find valuable is also a way of extending your hand for potential partnership. This can be a solid way to start building relationships with bloggers, writers, and businesses in the same niche, location, or complementary industry. If you’re a local business, suggesting or recommending other local businesses can even help search engines recognize your page better for local search. In a way, you’re asking Google and other search engines to associate your page with that of other related sites and their SEO efforts/attributes (like location or authority).

How to Select Sites for Outbound Linking

First, only ever select resources that will provide value to your user (informational value, entertainment value, etc).

The easiest place to figure out how to rank in Google? Google itself. Complete a preliminary search of your top keywords and see what Google currently thinks is worth promoting.

Tools for Checking Site Authority

If there are sites that you already know about and want to check their authority, you can use a Domain Authority (DA) checker. DA is a metric created by Moz that scores websites on a scale from 1 to 100. The higher the score, the more domain authority that website holds, making it a strong candidate for an outbound link.

Another great place to start is Ahrefs: using its Site Explorer, you can check the backlink profile of any site already ranking well for your target keyword(s), or of sites already linking to a source you know is authoritative. As a starting point, look for sites with a high Domain Rating (DR).

Bonus: Screaming Frog, which offers a free version, can crawl a competitor’s site and provide you a list of their outbound links.

Considerations for Site Selection

Here are a few questions to keep in mind when choosing sites to link out to:

  • What one-to-three pages help to support my claims or share related content?
  • Which other pages cover the topic well?
  • Do these pages also have good domain authority/domain rating?
  • Are these pages operated by bloggers or domain owners in related or similar niches?
  • Are these pages ones that get regular traffic and social shares from others in my niche?
  • Do I find these sites to be valuable sources of information I trust?

Going through these questions can help you pinpoint whether or not another site is a good choice for a link.

The most valuable sites to link to are those with strong domain authority. Google prioritizes high authority and high organic traffic (OT) metrics. For example, if you were writing a page for your chiropractic business about post-car-accident back injuries, linking to research from the Mayo Clinic could be valuable because you’re backing up the statements you shared on your page with a trusted source of medical information. With Bing, the kings of content are sites that end in .gov and .edu. This is because they’re often associated with government agencies, research, and universities.

If a certain topic is trending in the news, linking to a site that has less domain authority but is the primary source of coverage for this topic can help you jump on the trend while the topic is still fresh in your readers’ minds.

Linking to Your Own Earned Media

A wonderful chance to link to outbound content while also building on your own traction is to link to other websites that have mentioned or profiled your company or your own site. Any form of earned media, such as a mention in a reported piece, a guest blog, or an interview on someone else’s podcast, lets you benefit from the other site’s link to your website while also giving you a reason to write a recap on your own site that links back to theirs.

Over time, a strategy like this signals to the search engines that you’ve “shown up” as a trusted link by many others in your niche. This is also a much more organic way of building traffic and SEO traction than outdated spammy methods like link farms or linking parties.

You’ll sometimes see sites create an entire section for news or press to highlight earned media.

Outbound Linking Helps Secure Inbound Links

When you create quality content on your own site, you’re also likely to become a hub for outbound links from other people, too! Establishing your site as a worthwhile resource and home for quality content means that, over time, you’ll continue to post outbound links to other valuable websites. Your own site might also pick up some backlinks of its own as other people connect to your content. At that stage, link building becomes a cycle, and it’s much easier to build on your own results.

One method we discussed earlier in this article is content curation: creating a page that links to all the best resources on a topic and helps users quickly navigate those resources by providing either brief color commentary or high-level organization. An example of this would be an article like “The 10 Best Places to Visit When Traveling to Arlington, VA” or “The 20 Best Resources for Getting Started with Inbound Marketing.”

The Outbound Link “Strategy” to Avoid

Avoid two-way backlinking schemes run by private blog networks, sometimes referred to as “linking parties.” In recent updates, Google has been penalizing efforts to game the system with links shared between blogs. (Google calls these practices “link schemes.”)

Worried about your site? If you haven’t participated in any of the following acts, you should be fine:

  • Excessive link exchanges: This is what we were talking about in the paragraph above. Google will come after you for participating in “link to me and I’ll link to you” schemes. You should also avoid partnering with a page, influencer, or blogger that doesn’t make sense for your company. Google is watching for relevance, so if you get your link on a site that just doesn’t make sense, Google will flag it. For example, a link to an automotive shop shouldn’t show up within the blog pages of a bakery.
  • Large-scale article marketing or guest-posting campaigns: Sometimes businesses think they can sneakily reuse one article several times in hopes of getting a few links out of it. Reworking a few words here and there isn’t going to fool Google.
  • Exchanging goods or services for links: Google will know if you sent someone a “free” product in exchange for a link. If you do partner on product reviews, the coverage should fit your brand, and any resulting links should be disclosed (ideally marked with rel="sponsored" or nofollow).

The best way to get on Google’s good side is to create unique, relevant content that your audience will genuinely love. We don’t care what you’ve heard: creating good content pays off. Remember, you’re in this for the long game.

As pointed out in a July 2019 edition of #AskGoogleWebmasters, outbound linking should always be done without getting involved in any link schemes, and extra care should be taken with outbound links in user-generated content and links in ads.

Final Thoughts

The best way to become a trusted source in your niche is to publish regular high-quality content of your own. Forming relationships with other writers and bloggers in your niche by following their content and commenting can also open the doors for future link-building opportunities.

Remember, Google is evolving all of the time. The company isn’t doing this to punish you or take away your hard-earned followers. The algorithm changes to filter out spammers. Rule of thumb? Do your research. Take the tips we’ve laid out in this article to heart. Google doesn’t play, and it will penalize your site for a variety of reasons, including joining the wrong link directory, article marketing (spinning the exact same article multiple times in hopes of ranking), keyword stuffing, and unnatural anchor text. (You wouldn’t want unnatural text repping your brand anyway, right?)

Your SEO efforts are best spent on pages that are not yet ranking on Page 1 for target search terms. Once you’ve seen what is working on your page, flex your new SEO muscles by selecting an under-performing page and test out how you can make improvements to that page. You can then track whether you’re able to boost your rankings for that page. We have a great article with advice on creating great on-page content.

Learning how to properly use 301 redirects for SEO can make it so your website maintains keyword rankings and organic traffic even as you make changes to your content or site architecture.

The reality is, our websites are constantly changing. Good and attentive webmasters will add new and updated content over time to make sure they are providing the highest quality content and page experience to users. 

As a result, redirects become necessary to make sure users and search engine crawlers can find your content. But improper use of redirects can result in lost keyword rankings, lost link equity, and a poor user experience for your website visitors.

When implemented with SEO best practices, 301 redirects shouldn’t undermine your SEO efforts, but ensure that your search visibility is maintained. Here’s a guide to 301 redirects and how to implement them correctly.

Types of Redirects

Here are all of the redirects that you might want to know about, particularly if they are mentioned in your dashboard’s site auditor report.

  • 301 = “Moved Permanently” – best for SEO
  • 302 = “Moved Temporarily” – often used during website redesigns
  • Meta Refresh = page-level redirect that is not recommended for SEO
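For reference, a meta refresh lives in the page itself rather than in a server response. A minimal example (the destination URL is hypothetical) looks like this:

```html
<!-- Page-level redirect after 0 seconds; not recommended for SEO -->
<meta http-equiv="refresh" content="0; url=https://example.com/new-page">
```

Because it happens at the page level rather than the server level, search engines treat it less reliably than a proper 301.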

As a general rule, if a page is important and you want it to rank, then you should use a 301 redirect if the page is ever moved.

What is a 301 Redirect?

301 redirects are used to tell browsers and search engines that a web page has been permanently moved to a new location.

For example, https://website.com/why-anchor-text-diversity-is-good-for-your-backlink-profile redirects to https://website.com/anchor-text-diversity

301 redirects ensure that users and search engines are always directed to the most current and relevant content. A 301 redirect tells the search engine that the page has been moved, and the old page can be safely removed from the search engine’s index, while the new page should be indexed instead.
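On an Apache server, a permanent redirect like the example above can be declared with a single directive in the site’s .htaccess file (a sketch using the example URLs from this section):

```apache
# 301 (permanent) redirect from the old path to the new URL
Redirect 301 /why-anchor-text-diversity-is-good-for-your-backlink-profile https://website.com/anchor-text-diversity
```

When a browser or crawler requests the old path, the server answers with a 301 status code and the new location, and search engines update their index accordingly.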

Why Do 301 Redirects Matter for SEO?

There are a few ways that 301 redirects can impact your web pages’ SEO performance.

  • Ensures that the most up-to-date versions of your web pages are what get indexed and shown to searchers
  • Protects the search visibility of your web pages during and after site migrations
  • Helps you maintain the majority of the link equity the original page has earned through backlinks or previous link building efforts

Common Redirect Issues

There are some common issues that occur with redirects that can impact SEO performance. It’s possible that one or more of these issues will be flagged if you run an SEO audit using the Site Auditor in your dashboard.

Broken Redirects

Broken redirects are those that point to 404 or dead pages. When this happens, users and crawlers land on a 404 error message instead of the content they were looking for.

The negative impact of a broken page for users and search engines is pretty clear, so you want to avoid sending either to a dead page at all costs.

Unfortunately, broken redirects are hard to detect without the use of a site auditor. And if you’re a webmaster for an ecommerce website with thousands of product pages that are constantly being added or deleted, broken redirects are more common than you might think.

Here are the two ways you can resolve this issue.

  1. Reinstate the dead page so the redirect is no longer broken
  2. If you want the dead page to stay dead, you need to remove every internal link on your website that points to that dead page

Redirect Chains & Redirect Loops

Although redirects are good for SEO when used sparingly, they can also harm your SEO performance if used excessively.

A redirect chain occurs whenever more than one redirect sits between the original URL and the final destination URL.

Google does not want to see redirect chains on your website, as they slow down your website and make it take longer for Google to crawl your website.

Redirect loops occur when your redirects point to URLs that redirect back to earlier URLs in the path, sending spiders in a loop where they never arrive at a destination page at all.

As a general rule, it should never take more than one redirect to get to a destination page. 

The best way to avoid excessive redirect chains is to make sure you use SEO-friendly URLs from the start and stick to them after you update the content.

However, if you do need to resolve a redirect chain or loop, take the following steps.

  • For redirect chains: Replace the redirect chain with a single redirect straight to the final destination URL
  • For redirect loops: Point the redirect at a final destination URL that does not itself redirect
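The chain-versus-loop distinction above can be sketched in a few lines of code. This is a hypothetical helper (not part of any real auditing tool) that walks a mapping of source URLs to redirect targets and classifies the path from a starting URL:

```python
# Hypothetical sketch: classify a redirect path as direct, a chain, or a loop.
def classify_redirect(redirects, start):
    """Follow `redirects` (a dict of source URL -> target URL) from `start`.

    Returns a tuple of ("direct" | "chain" | "loop", number_of_hops).
    """
    seen = {start}
    url = start
    hops = 0
    while url in redirects:
        url = redirects[url]
        hops += 1
        if url in seen:               # revisited a URL: a redirect loop
            return ("loop", hops)
        seen.add(url)
    if hops <= 1:                     # zero or one hop is fine for SEO
        return ("direct", hops)
    return ("chain", hops)            # more than one hop: a redirect chain

# Example redirect maps (URL paths are made up for illustration):
print(classify_redirect({"/old": "/newer", "/newer": "/final"}, "/old"))  # chain
print(classify_redirect({"/a": "/b", "/b": "/a"}, "/a"))                  # loop
```

A site auditor does essentially this at scale, following every redirect it finds and flagging any path that takes more than one hop to resolve.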

Redirecting to HTTP instead of HTTPS

It’s important that HTTP pages always redirect to HTTPS protocols. HTTPS provides users with a safer browsing experience and it is a confirmed ranking factor.
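On Apache, one common way to force HTTPS is a rewrite rule in .htaccess. This is a sketch that assumes mod_rewrite is enabled on your server:

```apache
RewriteEngine On
# Redirect any plain-HTTP request to the HTTPS version of the same URL
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

The R=301 flag makes the redirect permanent, so search engines consolidate ranking signals onto the HTTPS versions of your pages.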

For more info on getting an SSL certificate and redirecting an HTTP site to HTTPS, read our detailed guide on HTTPS.

Internal Links with Redirects

If you have internal links that redirect, it is likely slowing down your website and costing you valuable link equity.

The site audit report will let you know if this issue is present on any of your pages.

After adding a new version of a page or deleting a page, part of your regular website maintenance should be updating all of the internal links that previously pointed to those pages so they point directly to the new destination URL.

This can take some time, particularly if you have a lot of web pages and are using internal links to elevate your SEO performance.

But it shows Google that you’re an active webmaster that is doing the necessary work to make your website the best place for visitors.

Redirects in XML Sitemaps

There should be no pages in your XML sitemap that redirect to other destination urls. You should be updating your sitemap instead with the new destination so Google crawlers are directed straight to the newest version of the page that you want indexed.

Other Redirect Errors

There are a few other redirect issues that might be flagged in your Site Audit report.

Redirect URLs should be lowercase.

And all of the protocol variants (HTTP and HTTPS) should redirect to the same destination URL.

How to Set Up a 301 Redirect

There are many ways to implement a redirect depending on your content management system. Some CMS like WordPress will automatically set up a 301 redirect when you make changes to the url path of an existing page. 

There are also many plugins that you can add to your WP site that help confirm on-page SEO best practices with redirects.

But to add a redirect manually, you will need to edit your .htaccess file.

A .htaccess file is a powerful website configuration file that is used by Apache web servers. It is located in the root directory of your website. The root directory may be in a folder labeled public_html, www, htdocs, or httpdocs, depending on your hosting provider.

To edit the file, all you need is the old page’s URL and the new page’s URL.

  1. Log in to your website’s hosting account.
  2. Locate the .htaccess file in your site’s root directory.
  3. Add a redirect rule that contains the old page’s URL and the new page’s URL.
  4. Save the file and upload it back to your website’s server.
  5. Test the redirect by visiting the old page’s URL. You should be redirected to the new page.

Conclusion

If you have a large website and you haven’t been thinking about redirects until recently, there may be quite a bit of technical work you need to do to get your site on track.

If you are unable to resolve the redirect issues identified in your Site Audit report on your own, reach out to our technical SEO team.

Alt text often finds its way into SEO content optimization discussions. Designed as a means to increase a site’s accessibility, these seemingly inconsequential alt attributes can have an impact on your site’s SEO and usability. To help you make the most of your alt text, we will cover how to write alt text to maximize SEO potential and improve your site’s accessibility.

What is Alt Text?

Alt text, or alternative text, is a written image description placed within the alt attribute of an image’s <img> tag in HTML code.

Also referred to as “alt attributes” or “alt descriptions,” these text descriptions provide information about the appearance and function of images on a web page should the image not load or should the user be visually impaired.

Uses for Alt Text

These alt attributes may be at the forefront of on-page SEO checklists. However, the impetus for alt text began in 2006, when the United Nations audited the world’s most popular websites and found very few offered equal access to the information they provided for visually impaired users. Since then, this text has primarily been utilized for:

Alt Text for Accessibility

Internet users with visual impairments from blindness to color-blindness rely on alt text to gain full access to a website’s content. Screen reader users and users of other assistive technology have alt text read aloud. This provides screen reader users with a clearer picture of all the information on the page.

Using a screen reader to explore sites can provide you with a better understanding of what a user would experience should they rely on a screen reader.

Alt Text for Loading Issues & User Experience

If an image file cannot load, its alt text will be displayed in its absence. This can be quite useful should a user have low bandwidth or choose to turn off their browser images to save data. Just as visually impaired users rely on this alt text to fill them in on the purpose and content of an image, users with slower internet connections do not miss out on the image through the use of alt text for an overall better user experience.

Additionally, when alt text stands in for an image, it enriches your content and provides the reader a more well-rounded understanding of the text.

Alt Text for Image SEO

Web crawlers use NLP to read the alt text HTML to better understand what the image is, the purpose of the image, and the context of the image for better indexing and better image search results.

This gives the crawler a better understanding of your web page and gives your image the opportunity to appear in a Google image search.

Examples of Alt Text

Ironically, understanding how to create good alt text often requires a show-don’t-tell approach. So, here are some examples of images with their alt tag texts:

alt="Beagle standing in a frosty field on a cold morning."


alt="Dignity of Earth and Sky Statue"

alt="Clear evidence: Atlantic currents carry the Gulf Stream"

If you want to find out whether there is alt text on a web page, you can use an alt text tester to check.

Formatting Alt Text

Most CMSs will format your alt text into HTML for you. However, to implement alt text you can insert the following code into your IMG tag:

<img src="file" alt="add text" width="" height="">

How to Write Good Alt Text

Writing good alt text doesn’t require expertise in creative writing or coding. It does require that you look at images through a new lens, though.

One way to do this is to imagine you’re describing the picture to someone over the phone. As you do so, keep in mind whether or not your listener would benefit from an explanation of the image’s purpose.

How can you make your alt text better with accessibility and SEO in mind?

1. Be as descriptive as possible. 

More descriptive alt text provides the users with a better understanding of the image. As you construct your descriptive alt text, include what makes the image important, unique, and how it enriches the text.

We can all agree that representation matters. Screen reader users also want to know when a brand is inclusive in its imagery. So, be sure to include gender and ethnicity when it’s relevant within your descriptions.

2. Keep it concise but not too short. Leave out extraneous information. 

Alt text that runs too long would benefit from the use of a caption or a long description instead.

The best alt text is a phrase or two at most (or a line of alt text). When constructing your alt text, consider what’s a given, what the informational priorities are, and how it informs the webpage content. Reduce redundancy by omitting anything included in the content.

Again, considering the purpose of the image and article for context is key.

Keep in mind that alt text is not a caption. If you need to provide source credit or a source citation, use a caption for that information.

3. Use your target keywords.

If your target keyword is evident in the image, include it in your alt text. As we pointed out, web crawlers will read these attributes to gain a better understanding of your content.

Keep in mind that long-tail keywords are easier to rank for, even when it comes to image searches. 

For example, instead of ranking for “whale shark,” you could try to rank for “whale shark with its mouth open.”

4. Do not stuff your keywords. 

Keyword stuffing is never a good idea. Especially when it leads the user astray as to what the image depicts. Always aim for appropriate and informative alt text that will substitute meaning in place of images when required.

Additionally, keep in mind that Google NLP is great at figuring out semantic relationships between words, so if your image is related to your target keyword, your alt text should be, too, and the result should be a natural signal to Google’s indexing system.

For example, alt text that mentions “heavy duty dog chew toys” can help an image display in search queries for “dog toys for heavy chewers,” which is semantically related to the original phrase.

Bad alt text = keyword stuffing: alt="custom dog tag, custom dog ID tag, customized dog ID."

5. No need to explain that it’s a photo.

One mistake many people make is to include “photo of,” “picture of,” or “image of” in their alt text. This is not needed. Screen readers already announce that the element is an image, so these phrases just add unnecessary verbiage and redundancy.

6. Use longdesc="" for lengthy descriptions.

There are times when an image benefits from a longer description within the alt text resulting in a better user experience. For example, an infographic that is not accompanied by a blog doesn’t add value unless explained clearly. 

For these instances, you will want to use the longdesc="" attribute.
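As a sketch (the file names here are hypothetical), the longdesc attribute points to a separate page or fragment containing the full description, while the alt attribute stays concise:

```html
<img src="survey-infographic.png"
     alt="Infographic summarizing the survey results"
     longdesc="survey-infographic-description.html">
```

Note that assistive technology support for longdesc varies, so some sites also pair the image with a visible text summary on the page.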

7. Describe buttons too.

Buttons are often images with text embedded. These fall under the category of images of text, which means you need to let your user know what they say in order for them to be useful. 

Provide your user with an accessible alternative for buttons with:

<input type="" src="" name="" height="" width="" alt="text on button">

8. Avoid typos and misspelled words.

Typos and misspelled words can hinder a screen reader’s ability to correctly convey the meaning of your image, so proofread carefully. Additionally, typos in your alt attributes can become an image SEO disaster if left unchecked.

9. Consider the type of image.

While you don’t need to point out that you are describing an image, you may want to mention if the type of image is unique. Some image forms you may want to mention include:

  • Illustration
  • Graphs and charts
  • Paintings or other fine art
  • Maps
  • Infographics
  • Gifs and animations

What Else Do You Need to Know About Alt Text?

Writing effective alt text will become second nature over time. However, knowing when to use image alt text, when to skip it, and other image best practices can also improve your site’s SEO and accessibility.

Avoid images of just text.


It can be tempting to add a screenshot, PNG, or JPEG of text. However, this text will never be read by web crawlers. Additionally, because you do not want to exclude the visually impaired from the information in an image, you will want to type that image’s text into the alt attribute.

When Not to Add Alt Text

Decorative images do not need to include alt text. This is because the content of the image doesn’t add to the meaning of the webpage’s content. However, you should include an empty or null alt attribute in your HTML. This null alt text will signal to the screen reader to not read a description of the image.

You can write a null alt attribute as: alt=" " or alt=""

You may also want to use a null alt attribute with an image that is a link with a text version beside it. 
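For example, a purely decorative divider image (the file name here is hypothetical) would be marked up with a null alt attribute so screen readers skip it:

```html
<!-- Decorative image: the empty alt tells screen readers to skip it -->
<img src="divider-flourish.png" alt="">
```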

Do you need Alt Text for videos?

No, but you will want to include a transcription of the video for hearing impaired users, those that speak other languages, and those viewers who cannot play the video with audio on.

How to Check Your Images’ Alt Text or Another Site’s Alt Text

To read the alt text of an image, all you need to do is right click the image and select “Inspect” or “Inspect Element.” This will open the HTML and CSS element inspection tool. On a Mac, you can also use Control + click.

You can also use an accessibility checker for accessibility issues.

Always Consider the Context of the Image

When it comes to providing thoughtful alt text, consider the purpose of the image. This also presents you with a few more opportunities to use your target keywords. 

For example, if the purpose of a blog is to compare the quality of dog treats, your keyword is premium dog food, and your image is two bowls of dog food for comparison, you can use the alt text, “a bowl of premium dog food beside a bowl of lower quality for comparison purposes.” This allows you to smoothly integrate your keyword without stuffing.

Is alt text the same as an image caption?

No. Image captions are visible to site users even when the image loads, while alt text lives only in your HTML. The purpose of captions is to provide copyright information or an explanation that is needed to understand the content of the image.

How to Add Alt Text in WordPress

Adding image alt text in WordPress is simple. When you upload an image, you can add your image alt text before inserting it into the page. Some versions of WordPress include the image alt attributes menu beside the image thumbnails. Others include the menu at the bottom of the thumbnail screen.

Some Models of Decent to Effective Alt Text

When it comes to alt text, there are varying levels of quality. You can settle for decent alt text or you can strive to provide the best alt text for your users and SEO. Here are some examples of basic alt text models:

Bad: alt=”dog”

Better: alt=”brown dog with leash”

Best: alt=”Tan poodle happily playing in the grass with its leash still attached”

Bad: alt=”people with books”

Better: alt=”mother and son doing homework”

Best: alt=”illustration of a black mother helping her son with his homework to demonstrate the power of involved parents.”

Bad: alt=”picture of a cup, napkin, and pen”

Better: alt=”a blue coffee mug next to a napkin with writing and pen”

Best: alt=”a blue coffee mug with coffee to the left sitting on a wooden table with a pen opposite and a napkin between with the words set goals, not limits”

Alternative Text: A Better User Experience & SEO

It can be easy to skip or rush constructing your images’ alt text. But doing so would be a disservice to your webpage visitors and your SEO. We urge you to think of your alt text as a way to improve every webpage. Optimizing your images for search engines includes providing web crawlers with context through alt text. Additionally, many people rely on alt text to fully understand and interact with your website. Alt text increases accessibility by taking the place of the image should it not load or a user has visual or cognitive disabilities.

Since 2020, over 58% of site visits have come from mobile search traffic. If you aren’t taking mobile into account heavily enough, it’s likely hurting your business.

The use of mobile devices is rapidly changing the way customers are searching, engaging, and buying. Consumers have access to faster Internet while they’re on-the-go. That means Internet traffic is increasing through mobile devices. Beyond social engagement and consuming content, they’re also making buying decisions.

Mobile Search is Often the First Step for Purchases

According to Morgan Stanley, 91% of adults keep their smartphones within arm’s reach. That’s ninety-one percent of ALL adults, and it’s shifting both business culture and research practices. Rather than dedicating time to research a topic, users now perform micro-searches on the go, and then follow-up on those initially discovered options or solutions later on.

How big is this trend? An IDG Global Solutions survey found that 92% of senior execs own a smartphone used for business, that 77% of those research business purchases from their mobile device, and that 95% then finalize related purchases via laptop or desktop. That’s a huge portion of the B2B purchase pool starting their journey from mobile. Missing a user during their initial mobile-based exploration may mean your business is losing out on a huge portion of the market.

Mobile Search is Often Location-Oriented

This trend is even more pronounced for local businesses, as 58% of mobile users search for local businesses daily. What’s more? 89% of those users search for a local business at least once per month. We also learn from HubSpot that, when consumers do a local search, 72% of them visit a store within five miles. What does this mean for businesses with an Internet presence? It’s time to make it mobile-friendly.

What Does the Rise of Mobile Search Mean for Businesses?

Websites now need to be responsively designed so they can serve mobile users just as well as desktop users. Responsive design is a design that adapts to the size of the user’s viewport (i.e. screen), by changing font sizes, adjusting images, and even collapsing page elements to make navigation simpler. Responsive websites that follow modern design standards help users access and understand the information they need more quickly.

Additionally, users now view responsive functionality as a trust signal. A study conducted by socPub indicates that 57% of Internet users will not recommend a business that has a poorly designed mobile site.

Because mobile users comprise an increasing number of searches and site visits, they now represent the largest source of traffic in a slew of markets (new industry segments falling into this bucket each month). Our clients regularly pick up market share with simple mobile-friendly design updates, especially within industries that are traditionally late-adopters.

Your Website is Now Your Storefront

Your site is now your storefront. If your site looks terrible or functions poorly, users will leave instead of working to get at your information – it costs a user nothing to click the next result in search.

Google Prioritizes Mobile-Optimized Sites

Google has switched over to mobile-first indexing, which means Google predominantly uses the mobile version of a site’s content for indexing and ranking. Even if your target consumers aren’t heavy mobile users yet, your site still needs to be mobile-optimized if you want to show up higher in the search results (even for desktop-based searches).

Users Are Making Purchase Decisions from Search Alone

With mobile devices rapidly changing the way consumers access information, your offsite optimizations are also becoming critical. For example, most users performing local searches never go past the search results themselves (i.e., they don’t actually click into websites anymore). Local search users are typically able to surface the information they want directly within the search results through features like the local Map Pack.

How Can I Improve My Mobile SEO?

The first step toward reaching mobile users is having a mobile-friendly website. Currently, in 2021, responsive web design is the best design approach for mobile-friendliness. Responsive design is the best approach for mobile design because:

  • You will serve the same content to both mobile and desktop users
  • The content will adapt responsively to all screen sizes and mobile device types
  • Search equity is centralized to a single URL for all pages
  • It’s a better user experience
  • Google prefers responsive design

What exactly is responsive design?

Responsive design is an approach for creating web pages where layouts and content dynamically adapt to the size and orientation of the screen or viewport being used.

In the example below, you can see that in the desktop version of this responsive site the text and video are displayed side-by-side, while in the mobile version of the site those elements have been stacked.

This responsive theme adjusts to the width of different devices from smartphones to tablets, even large wide-screen viewports, by rearranging and resizing the design elements.
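In code, this kind of behavior is typically achieved with fluid layouts and CSS media queries. A minimal sketch (the class name is hypothetical) that places two elements side by side on wide screens and stacks them on narrow ones:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  .media-row { display: flex; }      /* text and video side by side */
  @media (max-width: 640px) {
    .media-row { display: block; }   /* stack the elements on small screens */
  }
</style>
```

The viewport meta tag tells mobile browsers to render at the device’s actual width, and the media query swaps layouts once the viewport drops below the chosen breakpoint.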

There have been a few ways to handle mobile sites since the invention of smartphones; the first two mobile design waves were plagued with usability issues and hard to maintain. Let’s take a look at what didn’t work, and why you should consider migrating to a responsive design if you’re still employing one of these outdated mobile design tactics.

Outdated Approach #1: Mobile Subdomain, Separate Mobile Website

The first wave of design involved creating a different site entirely to serve as the mobile site. This approach involved serving a mobile version of the site using a different URL, a mobile URL. For those of you who have been around long enough, you may remember pages you visited from a mobile device redirecting from domain.com to m.domain.com.

This approach required setting up canonical tags for every page, as each mobile web page contained content duplicative to the desktop page. This approach also split the search equity for each page as desktop users interacted with the desktop site, and mobile users interacted with the mobile website.

When users shared pages from the site, creating backlinks they were split between the mobile subdomain and the regular site domain as separate URLs were being served to each user group. It also meant that every time an edit was made to content on the desktop site, a second round of edits had to be made on the separate mobile site. Mobile pages under this paradigm often provided a worse user experience as they typically served less content than the full desktop site did for desktop users.
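Under this separate-URL paradigm, the two versions of each page had to be connected with link annotations so search engines could treat them as one document (the URLs here are hypothetical):

```html
<!-- On the desktop page (www.example.com/page): -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page">

<!-- On the mobile page (m.example.com/page): -->
<link rel="canonical" href="https://www.example.com/page">
```

Every page on both sites needed its matching pair, which is part of why this approach was so tedious to maintain.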

Outdated Approach #2: Dynamic Serving of Mobile Sites

The next wave of design consolidated pages under a single URL, but dynamically served cached pages based on the user’s device type using an HTTP response header.

This iteration of mobile design allowed sites to consolidate search equity between their desktop site and mobile site. It also did away with the need for canonical tags on virtually every site page.

However, it meant that every time a device came out with new dimensions, a new instance of the site had to be spun up, formatted, and tested before it could be served to users. This system became increasingly difficult to maintain as the market diversified and the dimensions for mobile screens rapidly became non-standard. Dynamically serving a mobile version of a site was plagued with problems, including repeated instances of the desktop version being served to mobile users.
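Dynamically served sites also had to tell caches and crawlers that the response differed by device, typically via the Vary HTTP header. On Apache, that might look like this (a sketch assuming mod_headers is enabled):

```apache
<IfModule mod_headers.c>
  # Signal that the response content varies by requesting device
  Header append Vary User-Agent
</IfModule>
```

Without this header, a cache could serve the desktop version to a mobile user, one of the recurring failures of this approach.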

Current Best Practice: Responsive Design

Responsive design consolidates the mobile version of a webpage and the desktop version of a webpage under a single URL. It also serves the same instance of code, regardless of the size of the mobile screen or desktop viewport.

This allows site owners to combine their desktop SEO and Mobile SEO efforts, employing a single set of SEO best practices and strategies. Responsive design is easier to maintain as you don’t have to manage different content or code for a single page.

Instead all elements fluidly rearrange to suit mobile visitors and desktop visitors as needed. If a user switches from full screen to half-screen with their browser, the design elements will shift accordingly so the user experience is largely unchanged.

How to Check If Your Mobile Site is Google-Friendly

In July 2019, there were over 1.69 billion more mobile searches than desktop searches performed in the US alone. Search itself has become mobile-first. The first place you’ll start when checking your site for mobile optimization is checking out how Google views your site.

Mobile SEO Strategy is All About Google

Google holds over 90% of the market share for mobile search traffic in the U.S., because Google has spent years optimizing search specifically for mobile users. Many of Google’s search results are so well optimized, that mobile users don’t even need to click into an actual result to find the information they need.

Rich snippets and rich results now display enough information for users to take action based off of the search results alone, from finding movie times to the addresses of local businesses, to how to troubleshoot tech problems.

How did Google get so far ahead of the competition with mobile search? They started testing and prioritizing mobile features years ago, and as mobile search volume overcame desktop search volume, Google shifted to prioritizing mobile users over desktop users.

A Brief History of Google’s Mobile Search Results

In 2015 Google rolled out mobile-friendly search results, serving a separate set of search results to mobile devices. This update, often called Mobilegeddon, prioritized mobile-friendly websites in the search results.

In 2016 Google began to experiment with mobile-first indexing, cataloging the mobile version of page content, rather than the desktop version.

In March of 2018 Google formally began rolling out mobile-first indexing, and migrating over to the mobile-version of pages for sites that it had already indexed as desktop versions. To quote Google themselves, “Mobile-first indexing means that we’ll use the mobile version of the page for indexing and ranking, to better help our – primarily mobile – users find what they’re looking for.” Essentially the entire index is going mobile-first. This process of migrating over to indexing the mobile version of websites is still underway. Websites are being notified in Search Console when they’ve been migrated under Google’s mobile-first index.

In July of 2018 Google rolled out page speed as a mobile ranking factor, ranking sites with slow load times lower in the search results.

Figuring Out Which Trends Will Last

Over the past decade Google has also continually rolled out additional data-rich mobile-first search features from movie times, to reviews, to product images. Google often pivots when rolling out new features, as it continually tests and then prioritizes what works best for serving users the most valuable information.

For example, Google originally published a guide helping webmasters create separate mobile sites under the m.domain.com URL – a tacit approval of the process, only to pivot within a year to formally recommending responsive design under a single unified URL.

Similarly, the AMP (Accelerated Mobile Pages) standard has been pushed heavily in the past few years. AMP pages, which load in a fraction of the time of normal pages, seem to be struggling with many of the issues that m.domain.com mobile pages had back in the day.

Sites using AMP pages are often managing two sets of page content, with one set slimmed down to meet the AMP standard. There are also challenges with AMP pages being served from a Google URL rather than the site’s own domain. While Google recently addressed some of these concerns with signed exchanges, questions remain about whether link equity is being split between the AMP viewer URL, the original AMP source, and the AMP cache URL.

Trends that are here to stay? Responsive design, quality content that gets right to the point, and making sites as fast as humanly possible.

Check if Google is Flagging Mobile Issues

So what should you pay the most attention to in terms of mobile optimization? If you already have a website, start with Google’s Mobile-Friendly Test. This tool will give you an aggregate rating of whether or not Google considers your site mobile-friendly, and will also prompt you to view a full usability report in Google Search Console.

If you want to access this report directly from Search Console, log in to your account for the domain and use the left-hand navigation to click into “Mobile Usability” under Enhancements.

Here you will find a list of the mobile issues Google has detected on your site. Examples include text being too small to read, clickable elements being too close together, and content being wider than the screen.

Click into any of these issues, and you’ll see more granular information to help you improve your mobile SEO, such as the pages where the errors are found. You’ll also see a space to validate that the error has been fixed once you make adjustments to your site.

These are errors Google is specifically recognizing and calling out for your site. From a search rankings perspective, these should be at the top of your list to fix.

Check if Google Is Indexing Your Webpages

Google can’t serve pages in the search results that it can’t see. Make sure that Google is indexing your pages for search.

Enable Crawl by Googlebot

Check your robots.txt file, and make sure that it’s not blocking Googlebot. Your robots.txt file can be used to block certain types of bots and crawlers, but if you’re trying to rank highly in the SERPs, Googlebot should not be one of them.

To check if your robots.txt file is blocking Googlebot, you can either use a free robots.txt tester or use the URL inspection tool in Search Console.
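You can also eyeball the file yourself at yourdomain.com/robots.txt. A sketch of what to look for, using hypothetical paths (your actual file will differ):

```text
# BAD: this rule blocks Googlebot from the entire site
User-agent: Googlebot
Disallow: /

# BETTER: allow crawling, blocking only private areas
User-agent: *
Disallow: /admin/
Sitemap: https://www.example.com/sitemap.xml
```

A bare `Disallow: /` under `User-agent: Googlebot` (or `User-agent: *`) is the red flag to watch for.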

NoIndex

A few years ago you could check blocked resources straight from Google Search Console in a consolidated view, but as these issues became less prevalent, Google dropped the aggregate view. Secondary tools like Screaming Frog can still give you a full list of NOINDEX and NOFOLLOW pages from your site. Alternatively, you can check the status of individual pages straight from Search Console using the URL inspection tool.

This tool also allows you to manually submit links and request indexing of new pages, revised pages, and pages that crawlers have yet to discover.
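If a page is unexpectedly missing from the index, it's also worth checking its <head> for a stray robots meta tag. The directives look like this:

```html
<!-- Keeps the page out of Google's index entirely -->
<meta name="robots" content="noindex">

<!-- Indexable, but links on the page pass no equity -->
<meta name="robots" content="nofollow">

<!-- The default behavior; safe to omit -->
<meta name="robots" content="index, follow">
```

A leftover noindex tag from a staging site is one of the most common causes of pages silently disappearing from search.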

Checking if Your Mobile Site is User-Friendly

Now that you’ve resolved a majority of the technical usability issues, it’s a good idea to check for issues mobile users face that may not have been caught by Google.

How Does Your Site Appear on Mobile?

Start by taking a look at how your site appears on different devices. Free tools like Responsinator will let you select from a variety of mobile devices and desktops to give you a full sense of how your site looks on each.

You should quickly be able to see any major issues with formatting that could be hindering the mobile user experience, or making your site look unprofessional. Examples include poorly formatted text, grainy or stretched images, or overlapping page elements.

Work with your webmaster or web development team to clean up any design elements that aren’t displaying well on mobile. Once your site layout is mobile optimized, you’ll want to check that your site is compelling to mobile searchers on the Google search results page.

Are the Visible Portions of Page Titles and Metas Compelling?

Users only click into a site from search if the rich snippet, page title, and/or meta description are compelling. Your page title needs to front-load your target keyword(s), and your meta description should put the most pertinent information about your page first.

Page titles can be very similar between pages, so meta descriptions can often make the difference for which result or results site visitors click.

Also keep in mind that rich snippets can provide even less space for title tags and meta descriptions. In the example below you can see how each result only displays about 3-4 words from the page title.

If you use a major platform like WordPress, there are SEO plugins that will help you manage your title tags and meta tags. If your site is custom, you may need to edit this information directly in the HTML code.
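On a custom site, the title and meta description live in the page's <head>. A sketch with hypothetical keyword and copy (lengths are rough guidelines, not hard limits):

```html
<head>
  <!-- Front-load the target keyword; aim for roughly 50-60 characters -->
  <title>Mobile SEO Checklist: 10 Fixes for Faster Rankings</title>
  <!-- Lead with the most pertinent information; roughly 150-160 characters -->
  <meta name="description" content="A step-by-step mobile SEO checklist: fix usability errors, speed up load times, and win more clicks from mobile searchers.">
</head>
```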

If you’re seeing a good amount of organic traffic from your target keywords, the next step is to make sure that traffic is actually seeing your mobile optimized content.

Are You Losing Visitors to Page Speed?

Over half of mobile searchers will abandon a page that takes longer than three seconds to load. Separately, for every additional second it takes a page to load, conversions fall by 12%.

To check your mobile page speed use Google’s PageSpeed Insights Tool, and see how quickly your site loads on a 4G connection. This tool will give you a granular breakdown of all speed issues you can address to improve your site speed.

Most major website platforms (WordPress, Squarespace, Wix, etc.) have native features and plugins that will automatically optimize image files for mobile devices to reduce page load times.
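If your platform doesn't handle this for you, HTML itself can serve smaller files to smaller screens. A sketch using hypothetical image names:

```html
<!-- srcset lets the browser pick the smallest file that fits the screen;
     lazy loading defers offscreen images until the user scrolls near them -->
<img src="hero-800.jpg"
     srcset="hero-400.jpg 400w, hero-800.jpg 800w, hero-1600.jpg 1600w"
     sizes="(max-width: 600px) 100vw, 800px"
     alt="Storefront of Example Bakery"
     loading="lazy"
     width="800" height="450">
```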

Do Any Pages Have Super High Mobile Bounce Rates?

Bounce rates are a great indicator that a page is not providing value to users. If bounce rates are much higher on specific pages for mobile users than for desktop users, it's a sign the page may have issues with mobile formatting or mobile load times, or that the relevant content takes too long to scroll to on mobile.

To check bounce rates, simply log in to your Google Analytics dashboard. You’ll be able to view aggregate bounce rates for your site, bounce rates by page, and track how bounce rates change as you make adjustments to webpage content.

Avoid Intrusive Pop Ups

Intrusive and poorly designed pop ups can increase your bounce rates on mobile and tablet devices. Intrusive pop ups can also hurt your organic search rankings, especially with Google: an update Google rolled out in 2016 devalues mobile pages that have intrusive pop ups, lowering their rankings in the search results.

There are two major pop up issues that can raise bounce rates and get a page devalued in the SERPs. Pop ups that have not been optimized for mobile traffic can be impossible to close on small screens, causing mobile searchers to bounce from your site. And pop ups that prevent a user from accessing content on page load will hurt your mobile SEO, especially with Google, which considers pop ups that block site visitors from content to be “intrusive.”

Examples of intrusive pop-ups and interstitials:

  • A pop up that displays immediately, or while the user is trying to read through content
  • An interstitial that has to be exited before the user can access the main content
  • A full-screen interstitial that has to be scrolled past to access the main content

That doesn’t mean you should abandon popups entirely. Used correctly, and designed with mobile UX in mind, pop ups can help improve your conversion rate. These pop ups are ones that help the mobile user along their journey, are contextually relevant to the content, or are a legal requirement. Pop ups that appear as a user is looking to complete the next step in their journey are generally fine as well.

Examples of pop-ups and interstitials that are okay:

  • Pop ups that notify mobile searchers that a site uses cookies.
  • Pop ups that confirm a user’s age for restricted content or services.
  • Pop ups that take up a reasonable amount of room and are easy to dismiss.

Optimize Your Site for Voice Search

A report issued by PwC found that 71 percent of respondents prefer voice search over typing a traditional query. Now that we know users prefer voice search, let’s look at how to optimize our websites to reach them.

  1. Be concise. The average voice search answer is less than 30 words long. Avoid filler or unnecessary terms like “however” or “thus,” and be as direct as possible while completely answering the question. Google has an entire guide outlining the type of responses selected for voice searches, and the biggest takeaway is that answers should be brief and direct.
  2. Target featured snippets. Voice searches pull in part from “featured snippets”: when someone asks a question using voice search, Google pulls the answer from one of these snippets approximately 30 percent of the time.
  3. Consider the user’s intent. When crafting your content, ask yourself what users are searching for before landing on your site. Doing this will help enhance the content’s relevance. Therefore, if you’re optimizing your page for a specific featured snippet, your goal should be understanding your visitor’s intent and providing them with an answer immediately.
  4. Use long tail keywords and questions in headers. Often, voice searches sound as though the user is speaking to a human. Short, choppy keywords are rarely used; long-tail keywords and phrases are how people talk. So, when optimizing your site, consider using these phrases in conjunction with questions. That way, your website will pop up more often when users are trying to solve a problem, find a product, or use a service.
  5. Optimize for local searches. Users are going to search using local SEO. According to Small Business Trends, 58 percent of mobile users find local businesses using voice searches. Adding phrases to your content like, “near me” or your geographic area will help boost your rankings.
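In practice, pairing a question-style heading with a concise, direct answer covers several of the points above in one stroke. A hypothetical example:

```html
<h2>How long does it take to repair a cracked phone screen?</h2>
<!-- Lead with a direct, sub-30-word answer a voice assistant can read aloud -->
<p>Most cracked phone screens can be repaired in under an hour at a
   local repair shop near you, depending on the model and parts in stock.</p>
```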

Are You Addressing the Customer’s Journey?

Mobile-friendly websites must think through the customer’s journey. Ask yourself these three questions:

  • What types of users hit my site? (Who are they, how old are they, what are their roles?)
  • What would those users want from my site? (ex: to check pricing, to find my business location, to complete an online purchase, to share a story)
  • Can each user easily complete their journey using only the main nav?

Your main navigation should help users quickly and easily get what they want from your site, without a user needing to use site search or “click around.” Once you have a handle on your audience segmentation and goals, you should confirm that your users are not facing any major barriers along each journey.

There are a few ways to do that, here are two:

  • If you have a program like Hotjar or Lucky Orange installed that allows you to view your own users’ onsite journeys – you can watch user recordings to see if users are struggling to complete tasks.
    • Ex: Users abandon scrolling because information is too far down a page
    • Ex: Users have a lot of “U-Turns” – pressing back almost immediately because what they wanted wasn’t on the page they clicked into.
    • Ex: Users rage-click an element that’s not opening or functioning correctly.
    • Ex: You see error messages displayed to the user from your site.
    • Ex: You see users begin conversion, but abandon forms or carts.
  • You can conduct direct user research:
    • Recruit users that you’re able to interact with directly
    • Request they complete specific tasks on the site
    • Have them explain their thinking and reactions as they interact with your site

Your marketing shouldn’t be only about what devices your potential customer is using, it should be about the journey they’re taking. What are their lifestyles, habits, and device preferences? Conduct research, surveys, and interviews with your current audience. This marketing tactic is an excellent opportunity to develop a relationship with your existing customer base. Offer incentives and prizes to those who choose to participate.

Create Journey-Driven Designs

Designing websites for mobile users means we have drastically less real estate, so minimalism is critical. The last thing a user wants to do is scroll endlessly through or resize your pages. According to a scrolling and attention study conducted by the Nielsen Norman Group, 74 percent of users’ viewing time is spent on the first two screens of content. Responsive design is therefore the solution. You can accomplish this in a variety of ways, including:

  • Hiding content under sliders
  • Using sticky live chat or feedback widgets
  • Implementing mobile pop-ups
  • Redirecting to social media
  • Creating a bare-bones presentation
  • Eliminating sidebars
  • Taking advantage of banner space
  • Replacing graphics with a search bar

Pro-Tip: For mobile users, one often overlooked detail is that tap areas need to be large enough for users to hit interactive elements (links, buttons, drop-downs) with precision.
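A common rule of thumb is a tap target of roughly 48x48 CSS pixels with breathing room between neighbors. A minimal CSS sketch (class names are hypothetical):

```html
<style>
  /* Make buttons and nav links comfortably tappable on touch screens */
  a.nav-link,
  button {
    min-height: 48px;
    min-width: 48px;
    padding: 12px 16px;
    margin: 8px; /* keep neighboring tap targets from crowding each other */
  }
</style>
```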

Mobile User Experience Optimization Recap

For local business:

  • Make sure to include NAP (name, address or service area, phone number) on your website.
  • Claim and complete your Google My Business (GMB) listing and your Bing Places account.
  • Optimize pages to include names of local cities and landmarks
  • Focus on location-based rich snippets like the Map Pack

For all businesses:

  • Make use of structured data to leverage Google Search’s rich snippet features.
  • Confirm your responsive design is acting as-expected.
    • You can use a tool like this Responsive Design Checker to confirm how your site looks at the most common breakpoints
    • You can check out alerts and mobile feedback directly from Google through your site’s Google Search Console
    • Install a user-session recording software
    • Hotjar, for example, will let you see if your users are struggling in any areas (ex: pages are too long and users abandon before hitting content critical to conversion).
  • Focus on SPEED:
    • Optimize images for mobile (reduce file size)
      • Pro-tip: start out with a responsive design or theme and it should handle this for you.
    • Minify CSS
    • Leverage caching
    • Enable Accelerated Mobile Pages (AMP)
    • Switch anything you have in Flash over to HTML5 instead
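For example, a local business can mark up its NAP details for rich snippets using JSON-LD structured data. A sketch with hypothetical values (generate and validate your own markup with Google's Rich Results Test):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Bakery",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  },
  "telephone": "+1-555-555-0199",
  "url": "https://www.example.com"
}
</script>
```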

Final Thoughts

Mobile search remains the leader because everyone loves the convenience of their devices. Your audience is busy, on the go, and living in a digitally driven world, and their mobile queries will continue to rise. Even though mobile searches are similar to desktop searches, your site must be optimized for the way your audience visits. Your brand should be easy to use and support your customer’s journey, and a mobile-friendly, responsive design should be your goal.

There are times when a PDF file is the type of content that brings the most value to your audience. And just like with your traditional HTML pages, SEO for PDFs can help them earn keyword rankings so your target audience can discover them through organic search.

Although PDFs are not always the greatest for SEO, Google does index them, and sometimes ranks them, meaning your executive reports, white papers, survey results, and other PDF content should follow SEO best practices and be optimized for search.

How Does Google See PDF Files?

PDF files are treated like regular pages by search engine crawlers. Google converts PDFs into HTML, and then indexes the HTML version of the page.

As Google’s John Mueller has put it: “FWIW we convert PDFs & other similar document types into HTML for indexing too, so theoretically there wouldn’t be too much difference.”

In the SERPs, a PDF label will be visible next to the result’s title. This lets users know they will be directed to an indexed version of a PDF rather than a traditional web page.

How to Optimize PDFs for SEO

Because Google does treat PDF files like regular web pages, that means all of the same on-page SEO best practices still apply.

Have an SEO-Friendly File Name

Just like Google looks to file names of your images to understand their relevance to your web page content, your PDF file name should communicate to Google what your content is about. 

Best practices for optimized file names include:

  • Short and Sweet
  • Includes target keyword
  • Accurate description of PDF content

Optimize your PDF Title and Meta Description for your Target Keywords

SEO meta tags like the title and meta description of your PDF will be visible to users in the SERPs, and Google will use them to understand the primary topic of your PDF.

Including your target keywords in these elements of your PDF, and those with Keyword Difficulty scores that are achievable for your website, will help improve your ranking potential.

If you are using Adobe Acrobat Pro, you can edit the title of the PDF via:

  • File > Properties
  • Edit your title in the Title Field

Similarly, you can also edit your meta description by clicking:

  • File > Properties
  • Click Additional Metadata
  • Edit your meta description in the Description field

Designate Headings in your PDF Documents

Just like you use heading tags to help users navigate your web pages (and search engines understand the topical depth of your content), PDFs should also leverage headings.

Most likely, your white paper, report, or PDF document has some natural structure separated by headings. Make sure that you take the time to specify h1-h6 tags in Adobe Acrobat.

  • Navigate to the Tags icon on the left sidebar
  • Click on the text you want to specify as an h1-h6
  • Right click on the tag > Properties
  • Select the relevant heading from the Type dropdown list

Include Internal Links to your Website

Having internal links will help direct users to other relevant pages on your website. 

They also signal to Google the range of content that your website offers and what other topic areas you cover.

Google will follow the links in your PDFs, meaning you can use them to spread link equity to other pages.

Don’t Save PDFs as Images

Google is going to struggle to crawl and render the content of your PDF if it is saved as an image file. It also makes it more difficult for users if they want to highlight text.

PDF Issues Flagged in the Site Auditor

The Site Auditor will flag an issue if a web page is linking to a .doc file instead of a .pdf.

Linking to .doc and .docx files is not an SEO best practice because they will not be included in Google’s index, which means missed opportunities to get your content in front of more audiences.

And because .doc files are not compatible for all users the way PDFs are, including PDFs instead provides a better experience for website visitors.

Should I Be Using PDFs?

In general, a web page is more likely to get in front of more audiences because of how easily it can be understood and indexed by search engines.

PDFs are generally seen as not great for SEO. But when you do have PDF content, you should make the most of it!

So the short answer is don’t make PDFs a big part of your SEO content strategy. But for content like annual reports and white papers that are in PDF form, optimize them!

In its everlasting quest to provide users with the best results for search queries, Google added Page Experience metrics to its ranking algorithms. The Google Page Experience Update made factors such as mobile-friendliness, web safety, interstitials, and a site’s overall UI/UX official ranking factors. The rollout started in early June 2021 and ended on September 2nd, and it was the first update to focus heavily on a user’s experience with each part of a web page.

Google’s motivation behind the update was to improve the overall search experience through the websites they promote in Google search. As a result, websites that prioritize creating a high-quality and engaging page experience saw an improvement in their overall rankings. Those that didn’t adapt, well, they dropped in their keyword rankings.

If you’re not sure whether your web pages provide a high-quality page experience for users, this article is for you. Our guide will walk you through how websites that have maintained their search visibility responded to the Page Experience Update, so you can replicate their strategy on your own website for improved SEO performance.

Where Does the Page Experience Fit Into Google’s Algorithm Updates?

The Page Experience Update truly shook up the SEO world in 2021. Why? It added a new layer to how SEO experts prioritize the usability of websites. As a result, Google is focused on promoting not only relevant pages, but those that provide enhanced speed, less element shifting, and improved responsiveness. The value of a web page is not only in its relevance, but in how it performs for the user, and most experts agree this update is a change for the better.

Other Google Updates

This is not the first update Google has launched to its algorithms. Google has a long and varied history of updating its algorithm. In 2018 alone, Google launched over 3,000 updates to how the search engine produces results. These updates range from large to small, and they usually include changes to indexation, data, search UIs, webmaster tools, and ranking factors.

How Algorithms Affect Internet Searches

All of these updates play into the many algorithms that power every search. Google uses many algorithms, each fulfilling a specific function, grouped into one larger core algorithm. Sound complex? We promise it’s not. Here is a breakdown of the different types of ranking factors used by Google:

  • Content: The most popular content algorithm is known as Panda, and it helps Google judge relevant content, penalizing and rewarding content based on specific parameters.
  • Backlinks: The Penguin update helps Google determine if a link is spammy and deserves to be factored in with the crawling and indexing process.
  • Organizing: All this information has to be stored somewhere, and there are specific algorithms to help with that.
  • User Experience: In addition to your great content, Google needs to see if your website brings valuable information to users. It does this by rating your website’s user experience (UX) and factoring it into the search engine organic results.

What is UX & Why Does It Matter?

Simply put, user experience is the study of how users interact with your website. User experience targets potential users at all steps of their journey and helps you get into your customer’s minds before they come to your website, during their time on the site, and after they leave.

For many business owners, a good user experience equates to a pretty website. While it is always a good idea to have an aesthetically pleasing website, a few pretty graphics won’t cause your customers to convert. Instead, your website’s interface needs to be optimized with the consumers in mind.

The Impact of Better UX

Here are some user experience statistics that drive home the sheer importance of creating a good page experience:

  • 88% of all consumers report that they would be less likely to return to a website after having a poor user experience.
  • It’s estimated that businesses with poor user experience lose about 50% of potential sales.
  • Consumers form about 75% of their judgment of a company based on its website’s usability and design.
  • Customers are routinely choosing to browse the Internet from their phones, with 48% of users being annoyed with poorly optimized pages and 53% of users leaving a mobile site if it doesn’t load in three seconds.
  • A well-optimized user interface can improve conversions by up to 200%.

When it comes to your website, there are likely hundreds, if not thousands, of competitors offering products and services similar to yours. With this in mind, you can’t risk that your potential customer’s first impression of you is impacted by low-quality UX. Staying on top of user experience trends and best practices has always been important to earning new customers, but it will now be essential to showing up in search results.

What Is the Google 2021 Page Experience Update?

Unlike with many of its algorithm updates, Google released a lot of information and tools to help users prepare for and respond to this one, which is now considered among Google’s largest.

Due to trade secrets and proprietary information, Google only released some information about their updated algorithms. But as 2021 unfolded, web developers and SEO experts inferred how to make optimizations to best match the new ranking factors.

Luckily, we’ve done the heavy lifting for you by outlining the key information you need to know to ensure your website provides the kind of page experience that will be most valued by Google.

The New Core Web Vitals

Google released Core Web Vitals, a set of metrics that measure a website’s speed and loading time, responsiveness and interactivity, and visual stability. These metrics were released in May, fully functional in June, and remain the foundation of the 2021 algorithm release.

The Core Web Vitals comprise three benchmarks: Largest Contentful Paint, First Input Delay, and Cumulative Layout Shift. Together, they help site owners measure a website’s holistic user experience.

While we know that these new measures are subject to change and can still evolve, since June of 2021, they have remained consistent. Here’s the breakdown of the three basic metrics:

Largest Contentful Paint (LCP)

(measures loading)

Largest Contentful Paint reports the render time of the largest image or block of text visible within a web page’s viewport. Simply put, it is the time it takes for your webpage to load the biggest piece of content on the page. An ideal LCP is within 2.5 seconds of the page starting to load.

First Input Delay (FID)

(measures interactivity)

First input delay measures a consumer’s first impression of your website’s interactivity and responsiveness. It does so by monitoring how long it takes from when a user first interacts with a web page (i.e., clicking on a button) to how long it takes for the browser to respond to that action. Think of it as how long it takes for a user to press a button and for that information to appear. An ideal FID is under 100 milliseconds.

Cumulative Layout Shift (CLS)

(measures visual stability)

Have you ever been scrolling on a website and are just about to click on a button, when the layout moves and you are all of a sudden in a different portion of the page? That is a layout shift, and if your website has a lot of them, it can hamper your user experience. Cumulative layout shift measures the combined effect of this movement on one webpage.

Visual stability is exactly that—how stable the webpage is when it loads—and if the page stays steady throughout a consumer’s scroll. CLS measures how many times a user experiences unexpected layout shifts, with the ideal metric for this being less than 0.1.
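The most common CLS culprit is media loading without reserved space. Declaring dimensions up front lets the browser reserve the slot before the file arrives, so nothing jumps. A sketch:

```html
<!-- Explicit width/height let the browser reserve space, preventing shifts -->
<img src="chart.png" alt="Quarterly traffic chart" width="640" height="360">

<!-- Reserve room for late-loading ads or embeds the same way -->
<div style="min-height: 250px">
  <!-- ad slot renders here without pushing content down -->
</div>
```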

As a best practice, Google recommends measuring each of these metrics at the 75th percentile of page loads, so that the target is met for the large majority of visits to your site. It is important to understand that these Core Web Vitals are new, user-centered metrics that provide real-world data on how users interact with your website.

What We Know So Far – Page Experience Signals

A better page experience leads to deeper engagement and allows consumers to get more done. There are already existing page experience metrics that Google uses to help webmasters monitor their performance, including:

Mobile Friendliness: Not all searches come from desktops, meaning your website should perform on mobile phones at the same level it does on desktop. This signal now factors more heavily into SEO.

Safe Browsing: This metric ensures the security and safety of your website, verifying there is not any harmful content on it.

HTTPS Security: Serving your website over HTTPS means it is safe and secure for users, and their information isn’t at risk of being stolen.

Intrusive Interstitial Guidelines: Many websites have a ton of intrusive pop-ups that get in the way of a user finding the information they need. Because of this, Google has created a set of guidelines on how to include pop-ups on a webpage without severely hampering the user’s experience as a whole.

How to Optimize for Google’s Page Experience Update

All this information on search engine functionality and algorithms may sound complicated, but don’t worry: there are many easy steps anyone can take to prepare a website for the most important aspects of the Page Experience Update.

Here are a few of the steps you can take to maintain and improve your SEO.

1. Know and Use the Tools Available to You

There are plenty of free tools available to you that will allow you to monitor these new ranking factors on your website. Using them to consistently monitor your own website will not only help your user experience metrics soar but bring more potential customers to convert. A few examples include:

  • Lighthouse: This tool targets the Core Web Vitals metrics for each page on your website. In many ways, Lighthouse has become the best way to generate a Core Web Vitals report.

  • PageSpeed Insights: Here you can check multiple metrics and reports that go into your entire website’s page speed rating and the Core Web Vitals.

  • Mobile-Friendly Test: Check here if your website performs as well on mobile as it does on desktop.

  • Chrome User Experience Report: This report collects real-time data for each Core Web Vital, as listed above.

  • Google Search Console: This gives you a glimpse into what is happening within your website, based on real-world usage from actual consumers for accurate and nearly live reports.

  • Web Vitals JavaScript: This tool measures all Core Web Vitals in JavaScript using APIs.

2. Audit Your Site Across Users’ Devices

If you have both a smartphone and a computer, then you likely know that different devices load pages differently, both in terms of visuals and page speed. There are some tools that can help you audit your website without having to purchase a truckload of devices.

  • The Lighthouse tool has an easy selection button at the bottom that allows you to switch between running your report for mobile and desktop. You can use the report’s visual indicators to target individual components of your page experience for improvement.

  • PageSpeed Insights also allows you to toggle between your mobile and desktop performance stats.
  • With Responsinator, you can test out how your website looks on a plethora of mobile devices, from phones to tablets. This is a great, free way to ensure that the actual rendering of your page is not lost in translation between different devices.
  • CrossBrowserTesting allows you to test out both the appearance and performance of your website on over 2,000 different browsers and devices. This is a great way to ensure that your site not only looks but also performs optimally across a range of formats.

3. Improve Your PageSpeed Insights Score

Google’s PageSpeed Insights (PSI) tool lets you know how well your website performs for both desktop and mobile browsers. It also provides detailed information that can be used to deliver a faster user experience. If you find that your PSI score is less than ideal (anything below 90), then you’ll want to take some measures to boost your page speed. Here are some ideas to consider:

  • Compress Your Images: Large image files are a significant contributor to longer load times. Luckily, there are many free tools available that can help you compress your files and diminish the time it takes to load them. If you host your page on WordPress, then Smush is a handy plugin to optimize and compress images, one you don’t need to be an SEO expert to understand.
  • Use a Browser Cache: Browser caching is another simple fix that significantly improves the speed of your page. Essentially, a browser cache allows a web browser to remember commonly occurring elements of your site, such as header and footer material. This way, users won’t have to reload this material every time they click on a new page on your site. For WordPress users, W3 Total Cache is a tool we’ve found useful.

  • Implement Accelerated Mobile Pages (AMP): Originally used for news sites, AMP pages are essentially stripped-down versions of existing pages that load more quickly on mobile devices. While not necessary for pages that already load optimally, AMP can be a boon to pages that are currently lagging. It’s likely you’ve already encountered AMP on your phone, noted by the little encircled lightning bolt in the page’s corner.
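As a sketch of the browser-caching fix above, here is what the relevant rules might look like on an Apache server using mod_expires (caching plugins like W3 Total Cache generate similar rules for you; the durations below are illustrative, not recommendations):

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  # Images rarely change, so let browsers keep them for a month
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png  "access plus 1 month"
  # CSS and JavaScript change more often, so cache them for a week
  ExpiresByType text/css   "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```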

4. Have a Benchmark

It is of the utmost importance to understand where your website stands before you make changes. We all know that the top spot in the search engine result pages is the ultimate goal, but the rollout of this new algorithm means it is time to shift focus to include the user’s experience.

So you need to test, test, and test! Use the free tools above on each page of your site and move slowly. Take note of what is working and what isn’t in order to be best prepared. This way, whenever you make changes, you’ll be able to track your results easily and won’t be sidelined by future updates to Google’s search algorithm.

5. Optimize Your Content

Your website is nothing if not a place for your potential customers to gain information, so be sure to optimize your content, one of the most important Google search ranking factors.

The SEO Content Assistant is the best way to improve your on-page content (you can access it by setting up a free account). Using this tool, you can target up to five keywords and take immediate steps to give your content more topical depth and authority.

But you can’t just put content on your page without any organization; this is where header tags come in. The proper use of title tags and header tags will not only segment your information into easily digestible chunks, it will also make your pages easier for Google to crawl and index. The SEO Content Assistant will let you know which focus terms should appear in headings.

These subheadings do double duty. They’re also a great way to optimize for your target keywords: the more prominent those keywords are on your page and in your URL, the more Google will believe you are creating valuable content.
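A well-organized page might be structured like this (the topic and wording are placeholders, not a prescribed template):

```html
<head>
  <!-- The title tag describes the whole page and appears in the SERPs -->
  <title>Weekday Brunch Guide: 10 Spots Open Monday to Friday</title>
</head>
<body>
  <!-- One h1 per page, with h2 subheadings segmenting the content -->
  <h1>Weekday Brunch Guide</h1>
  <h2>Why Weekday Brunch Is Easier to Book</h2>
  <p>...</p>
  <h2>Our 10 Favorite Weekday Brunch Spots</h2>
  <p>...</p>
</body>
```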

6. Don’t Forget Images

Yes, it is important to have original written content on your website. However, it is just as important to diversify the types of content you use. Images are a significant ranking factor, in addition to how they engage the searcher and create a great page experience. Plus, you cannot appear in Google Images results if you do not have optimized images.

The easiest way to use images is to put them at the top of the page, grabbing the user’s attention as soon as they land on a specific page. However, keep in mind the Largest Contentful Paint metric, and make sure to reduce image load times by compressing your files. You should also incorporate relevant keywords into the alt text where appropriate, so that if the page fails to load, or if visually impaired users visit your site with a screen reader, the purpose of each photo is still clear.
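An optimized hero image might look like this (the filename and dimensions are illustrative):

```html
<!-- Compressed file, explicit dimensions, and descriptive,
     keyword-relevant alt text for screen readers and crawlers -->
<img src="/images/weekday-brunch-pancakes.jpg"
     alt="Stack of blueberry pancakes served at a weekday brunch"
     width="1200" height="630">
```

Setting explicit width and height also helps your Cumulative Layout Shift score, since the browser can reserve space before the image loads.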

Get Started and Stay Informed

What did we learn from this rollout? Details and milliseconds matter… and updating your website in response to Google’s Page Experience update is a win-win for you and your web visitors. They receive a better user experience, and your website is rewarded with positive signals to Google’s web crawlers.

Site owners who focus their efforts on following proper user experience best practices have sailed through the update without major negative impacts to their overall search visibility.

So, get started with amping up your website’s mobile friendliness, responsiveness, and other fixes for a great page experience.

It’s essential to closely monitor your website, even long after new metric rollouts. Be sure to keep tabs on ranking changes. It can take weeks (and sometimes months) for Google to register changes to a page and change your ranking for a Google Search, so you’ll want to check up on your GSC Insights reports.

What are users actually searching for when they type in a keyword into Google? This is the question that millions of marketers are trying to answer when they create a piece of content that they want to rank in the SERPs. 

Similarly, Google also wants to understand what users want. We are all trying to understand “search intent,” or what a user is ACTUALLY looking for when they type something into the Google search bar.

Search intent optimization is the effort to create content that satisfies what users are actually looking for when they type a specific search term into Google. This definitive guide provides an in-depth look into what search intent optimization is, the different types of search intent, and how to use search intent to its full potential. 

Keep reading to get an inside look into how to optimize your website for search intent.

What is Search Intent?

Search intent is the true intention behind the user’s search query. Providing content that matches the user’s search intent helps the user find the exact content they are looking for quickly and easily and leaves them with a “satisfying” experience. Satisfying search intent is considered one of the highest quality indicators by Google.

What’s an Example of Search Intent?

For example, let’s say a user types “brunch” into the Google search bar. What information are they actually looking for? Places in their neighborhood to get brunch? The history of brunch? Brunch recipes? 

In this example, Google doesn’t seem to have a complete understanding of the user’s intent behind this term. As a result, it provides SERP results that can satisfy all of those various intents.

In contrast, if a user types the search term “brunch places near me open during weekdays,” Google has a much better understanding of what the user is looking for. It shows the user a list of restaurants in the Map Pack that are open on weekdays. In addition, the top-ranking web page result is a complete list of all of the weekday brunch restaurants in the searcher’s location.

In short, understanding search intent is important to both Google, which wants to give the user the best search experience, and marketers, who want their content to rank on page 1.

Figuring out what a user wants, and providing them EXACTLY that, is the foundation of search intent optimization.

Why Does Search Intent Matter?

Why do people even turn to Google when they need information? Because Google is so very good at giving users the exact information they are looking for.

This ability to satisfy the user’s intent has all sorts of beneficial outcomes for all of the various stakeholders involved in search engines: 

  • For users: A quick, easy, and satisfying experience when using the search engine
  • For Google: Satisfied users who return to the search engine again and again
  • For Webmasters: The opportunity to connect with their target audience 

By understanding search intent, it’s much more possible to create content that will meet the needs of the user. 

But sometimes, that is easier said than done, because every user is different. However, thinking about the main types of search intent, and creating content accordingly, can help your website match the intention of the majority of users.

The 4 Types of Search Intent

To understand and discuss search intent, digital marketers have developed four primary terms that can categorize all of the billions of searches that happen every day in search engines.

1. Informational

Informational search intent is when the user is looking for information on a certain topic. 

This is the most common type of search intent. All of the below search terms can be categorized as informational:

  • “who was president in 1879”
  • “what does privatization mean”
  • “what is Jeff Bezos net worth”

Clearly, these users are looking for very specific answers to their questions. So for searches like these, Google will often show a featured snippet that provides the answer to the user very, very quickly.

In addition, Google is showing results that may answer the next questions users have about that same topic.

These types of searches often result in a zero-click search: one where the user doesn’t actually click through to any web page because they immediately see the answer they are looking for. However, sometimes users are not necessarily searching for a specific answer, but simply for more information about a given topic.

The below keyword queries, then, also fall under the “informational” category. 

  • “saving for retirement”
  • “top paying careers”
  • “benefits of yoga”

With informational searches like these, Google is likely to show a wider range of resources that explore the topic in-depth, giving the user the ability to choose the web page that appears most relevant to their intentions.

2. Navigational

Navigational search intent is when the user is looking for a specific web page or website. 

For example, if a user types in a company name, they are likely looking for that company’s website. Or, maybe the user visits the same website multiple times, and is looking for a specific page on that specific website.

Here are some examples of keywords that fall under the navigational type:

  • “mgm vegas restaurants list”
  • “buzzfeed news”
  • “salesforce blog”

With navigational searches, Google will most likely rank the specific pages that the user appears to be looking for.

As seen in the examples above, navigational searches most often include a brand name or the name of a website.

But for enterprise brands that have a large online presence, branded searches may produce other types of results, like Wikipedia pages, news articles, stock prices, etc.

Also, large brands will often have a Knowledge Panel appear in their branded searches. This panel includes key information about the company that Google has gathered from various sources across its index.

3. Transactional

Transactional search intent is when the user is looking to purchase something specific. In these cases, users are further down the sales funnel and closer to a purchase decision.

For these types of searches, Google most often ranks specific product pages, like this result for “2017 grand cherokee blue.”

Transactional searches will very often return rich results, meaning results beyond just the traditional blue link. They will include product images, prices, reviews, and other information about the product.

Also, because transactional searches often represent users with the intention to make a purchase, they are often accompanied by Google Search ads.

4. Commercial

Commercial search intent can resemble both informational and transactional searches. This category represents users who are looking to make a purchase or to educate themselves in order to make a future purchase. 

For this example, a user might be doing research for something they plan to buy in the future. Some examples might include: 

  • “best supplements for hair growth”
  • “standing desk comparison”
  • “taylormade stealth vs callaway rogue” 

In the above cases, Google is likely to show web pages that compare products or detail features, reviews, or additional information about the products or services. Google knows that the user is not ready to buy a specific product but is still in the research phase.

And because Google knows from this search that the user is still in the consideration stage of the marketing funnel, Google might promote various types of results.

In the above case, Google is showing ads, specific products, as well as blogs that compare various features and benefits of multiple brands of hair growth supplements. 

By doing so, Google is able to satisfy the various search intentions of different users.

What Are the Best Practices for Search Intent?

By understanding the actual search intent of keywords, content marketers can create content that is more relevant and valuable to their target customers. 

So how do you actually optimize for search intent?

Here are three basic steps that will help your content creators better satisfy search intent.

1. Keyword Research

A keyword can tell you a lot about what a user wants.

By understanding the topics and queries that users are searching for, companies can create content that is more likely to rank and be seen by their target audience.

This can be done through a variety of tools, including keyword research tools and analytics, to identify the keywords that are most relevant to your target audience. 

More specifically, look at the keyword variations provided by the dashboard tools to see if there are other similar keywords that may be more relevant to the content you are creating.

2. Top-Ranking Content

Not sure what the search intent of a keyword is? Look at the content that is already ranking on page 1.

Because not only is satisfying search intent important to you, it’s also important to Google. Looking at what Google is already promoting gives you a hint of the type of search intent they associated with the keyword. This should clue you in to the type of content to create if you also want to rank for that particular search term.

You can look at the Keyword Magic Tool to review the type of content that has been ranking on page one over time.

3. Analytics Tracking

Google Analytics can tell you how long your organic visitors are staying on a particular page, whether they click on additional pages, or how many sessions those users have.

These are all vital pieces of analytics that can help you understand whether or not your content is satisfying search intent.

You can also use other behavior analytics tools like HotJar or Mouseflow to see how users are actually interacting with your content.

For your high-value, high-converting pages, these tools can give you essential insights to improve your content and user experience and to better satisfy your users’ intentions.

Conclusion

In conclusion, thinking about the search intent of keywords can help you create better content and reach your target audience more effectively!


Updated for 2022

For website owners, marketers, and business owners who want to improve a site’s Google search visibility, there is no better data available than what comes from Google Search Console. What makes these important metrics priceless to site owners and webmasters? They come straight from Google. So, if you have yet to use Google Search Console to improve your site’s impressions and clicks in Google search results, now’s the time.

This guide will cover how to create a GSC account as well as how to make the most of your GSC account to improve your keyword rankings.

What You’re Missing If You’re Not Using Google Search Console

There are many SEO software tools out there that track keyword rankings, but Google has the most accurate and up-to-date information about the keywords your site ranks for on a daily basis. In this way, Google, or software built on Google’s API, is the only source of truth for how your URLs are performing in search right now.

If you want to earn more keyword rankings and drive organic traffic from search, Google Search Console and GSC-powered analytics are the best tools available. Beyond tracking keywords and providing site owners with crawl-request capabilities, the platform empowers you to perform key SEO actions that will improve site visibility and ensure that Google is crawling and indexing your site as efficiently as possible.

What is Google Search Console?

Google Search Console is a free platform provided by Google to measure website performance in search engine results pages. 

Formerly known as Google Webmasters, Google Search Console allows site owners to check the indexing status of their web pages, track their keyword rankings, and improve the overall visibility of their websites in search engine results pages. 

In September of 2019, Google Search Console went through an update that removed many of the old reports as well as dashboard options. So, if you read a reference to the “old” vs. “new” Google Search Console, the new GSC refers to the version introduced in 2019.

However, the new user interface still provides most of the same functionality and capabilities as the old version, just under new names. Google Search Console continues to improve search query results for searchers and site owners by allowing site owners to see their site through the eyes of a Googlebot.

Key Features of Google Search Console

There are many features that Google Search Console offers webmasters to understand their search engine performance and elevate their usability to meet Google’s standards.

  • Get a general overview of your website’s SEO performance
  • Inspect any URL in your domain for search query data, indexing, crawling, and mobile usability
  • Access key SEO metrics like clicks, impressions, click-through-rate (CTR), and average position
  • Browse the search queries that your landing pages rank for
  • Access performance metrics for Google Discover
  • Submit XML Sitemaps
  • Submit Disavow files
  • Monitor Core Web Vitals for desktop and mobile versions of your website
  • Be notified by Google of any security issues with your website
  • Browse all of your backlinks, internal links, and top linking anchor text

Getting Started: How to Set Up Google Search Console

If you’re brand new to tracking your website in the GSC platform, setting up your account and getting started is simple. Here is a run-through of all of the first steps to get your account properly linked to your domain, adding users, and linking your GSC account with other useful platforms.

How to Verify Your Site & Add Your Domain Property 

One of the biggest hindrances for many site owners is connecting their GSC account to their domain. While this preliminary step may seem intimidating, it’s not technically complicated. Additionally, you only need to do it once for each of your domains.

Add Your Property in GSC

Once you’ve created an account with Google Search Console, open the drop-down menu and select “Add Property”. 

You will be given a selection of five different types of properties and various ways to verify your property, as well as the subsequent steps for each verification method. These verification methods depend on the property type you’re adding.

5 Ways to Set Up Your Domain Property for GSC

  • HTML file upload: Simply upload a verification HTML file to a designated location on your website.
  • Domain name provider: For users of WordPress, GoDaddy, and other domain registrars, sign in to your domain account and add the DNS TXT record by pasting the text within the text field provided. You can also verify your site from GSC with a domain name provider.
  • Google Analytics tracking code: Use the same tracking code provided within your GA account. You may need to first change your GA permissions for this code to work within GSC.
  • HTML tag: Add a <meta> tag to your site’s HTML <head> section.
  • Google Tag Manager container snippet code: Copy and paste the GTM container snippet code connected to your site. You can find this under View > Edit > Manage container-level permissions.

All of the above are simple verification methods. Google provides detailed instructions for each in Google webmaster tools.
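For instance, the HTML tag method boils down to pasting one line into your site’s head section (the content token below is a placeholder for the one GSC generates for you):

```html
<head>
  <!-- Verification tag copied from Google Search Console -->
  <meta name="google-site-verification" content="YOUR-TOKEN-FROM-GSC">
  <title>Example Page</title>
</head>
```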

Add all domain versions to your GSC account and indicate which is your preferred version. This includes versions of your site with “www” and without. Otherwise, GSC may not give you a complete collection of your metrics. You can also set up a 301 redirect from your non-preferred domain to your preferred one.
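On an Apache server, that 301 redirect might look like the following — a hedged sketch, with example.com standing in for your own domain (other servers and hosts have their own redirect mechanisms):

```apache
RewriteEngine On
# Send the non-www version of the domain to the preferred www version
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```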

Other Tasks to Make the Most of Your GSC Account

After you’ve established your GSC account, you can begin exploring all the features. Spend time looking through all the reports and functions. Do not worry about ‘breaking’ your account. There are very few, if any, ways you can mess up your GSC account.

How to Add Users to Google Search Console

If you have multiple team members or individuals monitoring the same domain, it’s easy to add additional users to your Google Search Console account. Property owners can designate owners, users (full or restricted), or associates, and each role comes with different permissions. It’s important to consider what level of access each user should be granted before adding them to the property.

If you’re a site owner that is working with an SEO or digital marketing agency to execute your SEO strategy, you can add your account owners or agency representatives as users or associates.

Users vs Owners in GSC

Verified Owner: There is only one verified owner per GSC property. This should be a person who will remain with the company long-term.

Added Owners: Can change and control all aspects of their connected GSC properties.  They can also see all of the Google Search Console data and can take most actions that a property owner can, but only the verified property owner can make fundamental changes to the account, such as adding or removing property owners, changing an address, or submitting a sitemap. 

Added owners can add more owners, users, and associates if they choose.

Users: Users have access to all data on a GSC property and perform some actions. However, users cannot add more users. You will find there are restricted users and full users. As the name implies, restricted users have more limited access to data and actions.

Associates: This title can seem a bit misleading. It is for associating a Google Analytics property with a GSC account. This association gives you access to GSC reports within your GA account (and vice versa). See below for instructions on linking your GSC and GA accounts.

A GSC property can have up to 100 users and 1 GA property.

How to Add Users to Your GSC Account

Here are the Google webmaster instructions for how to add a user to an account.

How to Link your Google Search Console account with Google Analytics

Google Analytics is another great free tool that Google provides to site owners and one you should already be using to understand your website traffic and other key performance indicators. Google Analytics focuses on metrics related to all traffic on your website, not just the users who arrive from organic search.

If you want to simplify your life and have all of your analytics in one location, you can link your Google Search Console account to your Google Analytics account. By doing so, Google will import all of your GSC data into your Google Analytics reporting.

To do so, you will have to have both accounts already set up. Then, log in to your Google Analytics account.

  1. Press “Admin”
  2. Select “Property Settings”
  3. Scroll to Search Console Settings and select “Adjust Search Console”
  4. Choose your preferred reporting view and click “Save”

Once these steps are complete, you will have additional reporting available and your GSC data in your Google Analytics account.

How to Link Your Google Search Console Account with our Dashboard GSC Insights

While the data from GSC is still the best, because it comes straight from Google, the dashboard and reports are rather limited in comparison to other SEO software. This is one of the few downsides of GSC. If you want access to a more comprehensive dashboard with more advanced data visualizations, our dashboard GSC Insights tool combines a more engaging user interface with the same powerful Google Search Console Data. 

Within the tool, you will also find a multitude of unique reports based on your GSC data, such as the economic value of your organic traffic (compared to how much that traffic would cost in a CPC campaign). 

GSC Insights is built on Google’s API, meaning you have access to the same real-time data you find in GSC, which is a great advantage.

To access the tool, create an account (you will also get access to our other SEO tools like our SEO Content Assistant and Keyword Researcher). 

Select the “Google Search Console” tool on the left side of the dashboard, where you will be able to link a new account via the “Projects” tab.

The Metrics: How to Use Google Search Console

Once you’ve added your domain property, users, and linked your primary Google accounts, you’re ready to start tracking your website’s performance in the SERPs.

Much of the learning curve related to GSC is getting used to the jargon. Here is an overview of the primary metrics that Google tracks organized by how they are structured in your Google Search Console Account.

The Overview Section

The overview section is a summary of the three primary categories of metrics that Google Search Console tracks: Performance, Coverage, and Enhancements. Here’s what they mean:

  • Performance: Metrics associated with the performance of your URLs in search results
  • Coverage: Metrics related to crawling and indexing of your URLs
  • Enhancements: Metrics related to technical elements of your website that could improve your performance and coverage

URL Inspection

The URL inspection tool is a way to quickly inspect the indexed version of any given web page on your domain. 

There are three ways to access and use the tool in your Google Search Console account.

  1. Use the Inspect URL search bar at the top of the page
  2. Select “URL Inspection” in the toolbar on the left side of the page
  3. Select any URL that appears in a list

Once you inspect a URL, GSC will provide you with an overview of information from its last crawl attempt, including whether or not the URL appears in search results, whether it appears in search results but has issues, where the page was found, and more.

Watch the below video for more information on how to make the most of the URL inspection tool and the data it provides.

Performance: Search Results

The Performance section of your Google Search Console account will provide all of the metrics related to your URLs’ search appearances. The primary data graph will provide four key metrics that quantify how your site is performing overall across all web pages.

  • Total clicks: The total number of clicks your web pages have received across all search queries for the selected date range
  • Total impressions: The total number of times your web pages appeared in search results and were seen by users during the selected date range
  • Average CTR: The average click-through rate (percentage of impressions that resulted in a click) of your web pages across all search queries for the selected date range
  • Average position: The average ranking position of your web pages across all search queries for the selected date range

If you want to filter these search performance metrics for specific keywords, landing pages, days, or some other factor, you can set your parameters from the table located below the primary graph.

The table will populate based on the filters or combination of filters you select. You can filter your clicks, impressions, average position, and average CTR by any of the following parameters.

  • Queries: View your search performance for any keyword or search phrase that your URLs currently rank for in the SERPs
  • Pages: View your search performance for individual pages on your site
  • Countries: View your search performance by country
  • Devices: View your search performance by device type
  • Search Appearances: View your search performance by rich results, AMP non-rich results, job listings, and job details
  • Dates: View your search performance metrics for any day in the entire history of your domain

How best to utilize all of this data will depend on your keyword targets, strategy, and SEO goals. Overall, the data is designed to provide you with both a big picture summary and many granular pictures of the strengths and weaknesses of your search performance.

Index: Coverage, Sitemaps, and Removals

Before Google can rank your URLs for keywords, it has to crawl and index your pages. The Index section of your Google account will give you all of the information you need to understand how Google is indexing your pages, as well as tools to improve how Google is understanding your web pages.

Coverage: Index Coverage Report

This feature of the Google Search Console tool allows site owners to see which of their landing pages have been indexed and whether Google bots encountered any problems when crawling them. Since 2018, Google has used mobile-first indexing data when available.

Ideally, there will be zero errors on your coverage report, but if not, Google will identify what pages had errors and the type of error so you can attempt to correct them.

How to Identify Site Errors & Indexing Issues with Google Search Console

To review your site errors, simply select Coverage under the Index section of the sidebar.

After the report opens, you can identify individual pages with crawl errors.

Sitemaps: How to Add a Sitemap to Google Search Console

A sitemap is exactly what it sounds like — a roadmap for all of the web pages in your domain. It communicates to Google’s crawlers what the most important pages of your website are, even if you don’t have internal links pointing to all of your pages.

Essentially, a sitemap helps Google crawl your website more intelligently and efficiently, which benefits your keyword rankings in the long run. You can generate an XML sitemap using a sitemap generator (try the Yoast plugin). If you are unsure of which pages to include in your sitemap, consult one of our SEO strategists before generating your file.
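A minimal sitemap file looks like this (the URLs and dates are placeholders; generators like Yoast produce the same format automatically):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One url entry per page you want Google to prioritize -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2022-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/</loc>
    <lastmod>2022-01-10</lastmod>
  </url>
</urlset>
```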

You are certainly not required to add a sitemap to your GSC account, and for smaller sites with only a few landing pages, it may not be necessary. However, owners of larger websites (like e-commerce sites with lots of product pages) can really see their keyword rankings benefit from adding a sitemap.

To upload your XML sitemap in your Google Search Console account, first select the “Sitemaps” tab.

You will be directed to a page where you can paste the link to your sitemap file and click “Submit.”

In Google Search Console, you will be notified when there are any crawl or indexing errors on your website. Checking back in on your sitemap is important as Google won’t necessarily crawl every page on your website with the same frequency. 

Also, as your website grows or anything changes with your overall site architecture, you will want to update your sitemap so Google continues to understand which pages are central to your site architecture.

Removals: Temporarily Block URLs in Google

One of the newest features of Google Search Console, the Removal tool allows site owners to both remove content from the SERPs and see which content has been removed due to the requests of third parties.

The removal tool offers three options to site owners:

  1. Temporarily hide URLs from showing up in the search results
  2. See which URLs are not showing in Google because they are “outdated content”
  3. Identify whether URLs have been filtered by Google’s “SafeSearch” adult filter

The majority of site owners will only ever access the first feature of the tool. If for some reason you need to temporarily remove a URL from Google, you can submit your request via the removal tool. The request can be submitted for a specific URL or any URL containing the same prefix, and they will be removed from Google for six months.

This tool does not permanently remove web pages from Google’s index. To do so, site owners need to either use the robots noindex meta tag or delete the content from their site.
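For reference, the permanent approach is a one-line directive in the page’s head (or the equivalent X-Robots-Tag HTTP header):

```html
<!-- Tells all crawlers not to index this page -->
<meta name="robots" content="noindex">

<!-- Or target Google's crawler specifically -->
<meta name="googlebot" content="noindex">
```

Note that the page must remain crawlable (not blocked in robots.txt) for Google to see the directive at all.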

Core Web Vitals: Website Page Experience

In 2020, Google replaced its PageSpeed report with the Core Web Vitals report. This report is all about the technical performance of your website. Google bots measure website performance and page experience through the Core Web Vitals metrics. 

Google Search Console provides Core Web Vitals reports for both the mobile and desktop versions of your website.

The following metrics make up Core Web Vitals:

  • Largest Contentful Paint (LCP): Measures load times and performance
  • First Input Delay (FID): Measures interactivity
  • Cumulative Layout Shift (CLS): Measures visual stability

Google recommends focusing on improving affected URLs that receive “Poor” and “Needs Improvement” ratings. Click on an item in the Details list to get more information about the issue type that is resulting in the subpar rating.
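You can also spot-check one of these metrics yourself in the browser. This sketch uses the standard PerformanceObserver API to log Largest Contentful Paint candidates to the console; keep in mind GSC’s report is built from real-user Chrome data, so treat this as a local sanity check only:

```html
<script>
  // Logs each LCP candidate; the last entry reported before
  // the first user interaction is the page's final LCP
  new PerformanceObserver((entryList) => {
    for (const entry of entryList.getEntries()) {
      console.log('LCP candidate (ms):', entry.startTime, entry.element);
    }
  }).observe({ type: 'largest-contentful-paint', buffered: true });
</script>
```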

How to Notify Google That You’ve Fixed Core Web Vitals Issues

Once you’ve attempted to fix the issue, you can verify whether Google is seeing the issue as resolved across your entire site or notify Google of individual fixes.

Simply select the status of the fix under the Validation column in the Details menu beneath the Core Web Vitals report.

As you work through these fixes and check off that you’re ready for the fix to be validated, the item will drop to the bottom of the status table. Once all items are accounted for, Google will track usage data for 28 days to see whether the issue is fixed.

Some improvements you may be able to make on the backend of your WordPress site (or whatever CMS you use), but other errors may require you to work more closely with a developer.

Mobile Usability Report

The mobile usability report provides site owners with insights into how their website performs on mobile devices. If any of your pages do have mobile usability issues, the “Details” table will highlight the type and status of the error. Errors that may be noted on your mobile usability report include:

  • Viewport not configured: This error signifies that your website is not responsive and is not resizing to fit the screen size of the user’s device
  • Fixed-width viewport: This error signifies that although your website was responsive and resized, it did not do so correctly for all devices
  • Content not sized to viewport: This error means that elements of your website layout are not reflowing to fit the new page size
  • Small font size: This error communicates that your font is too small to read on certain devices
  • Touch elements too close: This error signifies that elements of your website that users may interact with — such as CTA buttons or input fields — do not have proper spacing on mobile devices
  • Flash usage: This error will only appear on websites that use Adobe Flash content

Some of these errors are simple to fix. Others are not. But it is important to address the errors as best as possible, particularly if your website traffic primarily comes from a mobile device. With mobile-first indexing, Google uses the mobile version of your website for indexing and ranking, meaning solving these issues is of great importance for getting higher rankings.
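The first two errors in the list above usually trace back to a missing or hard-coded viewport declaration. The standard fix is a responsive viewport meta tag in every page’s head:

```html
<!-- Responsive: page width follows the device width -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<!-- Avoid fixed-width viewports like the one below, which trigger
     the "Fixed-width viewport" error on other screen sizes -->
<!-- <meta name="viewport" content="width=600"> -->
```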

AMP: Accelerated Mobile Pages

AMP pages are essentially a streamlined version of HTML that results in faster loading times on mobile devices. If you want to create accelerated mobile pages for your website, your web developer will need to follow the AMP HTML specification. Then, you can use Google’s AMP test tool to confirm that your AMP pages are properly set up and recognized by Googlebot.
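For a sense of what your developer will be working with, a valid AMP page starts from required boilerplate defined in the AMP HTML specification. A stripped-down skeleton looks roughly like this (the canonical URL is a placeholder, and the mandatory amp-boilerplate style rules are omitted for brevity):

```html
<!doctype html>
<html amp lang="en">
  <head>
    <meta charset="utf-8">
    <!-- The AMP runtime script is required on every AMP page -->
    <script async src="https://cdn.ampproject.org/v0.js"></script>
    <title>Hello AMP</title>
    <!-- Points to the regular (non-AMP) version of this page -->
    <link rel="canonical" href="https://www.example.com/hello.html">
    <meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1">
    <!-- required <style amp-boilerplate> rules omitted; see the AMP HTML spec -->
  </head>
  <body>
    <h1>Hello, AMP world</h1>
  </body>
</html>
```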

Enhancements: Improve your Rich Results with Structured Data Markup

The enhancements section of the Google Search Console account provides site-wide insights and performance reports that can help site owners improve their rich results, meaning those SERP results that include carousels, images, or other non-text elements.

Rich results can help your search results look more attractive to users, but your site can only appear in rich results if you have structured data markup, a standardized format of website code for classifying the page content. 

What you see in the “Enhancements” portion of the Google Search console toolbar will depend on what structured data types you have implemented across your site. You can work with your web developer or an SEO agency to implement structured data on the backend of your website.

What Are Enhancements in GSC?

Enhancements are essentially HTML improvements that elevate the performance of your pages for users and help Google’s bots better understand the primary purpose of each of your web pages. It is worth the effort to use structured data markup if you want to improve your overall appearance in the search engine results pages.

What Features Will You Find in Enhancements?

First, explore Schema.org to find structured data or schema markup that signals to Google that your site or pages contain a specific form of information. 

These include:

  • Breadcrumbs
  • Prices
  • Recipes
  • Location
  • Hours
  • Availability
  • Reviews
  • and more

Breadcrumbs: Help Google Understand Your Site Architecture

Breadcrumbs are a rich results type that help Google’s crawlers and users understand your site structure, or where your web pages are located in your website’s hierarchy. 

Site owners in any industry can benefit from adding breadcrumbs to their websites for SEO purposes and to improve the user experience. However, large sites with multiple levels of landing pages stand to benefit the most.

How Do Breadcrumbs Improve SEO and Your User Experience?

The first page of your website that users arrive at through organic search may be a blog post or a resource page, not necessarily your homepage or primary category pages. With breadcrumbs, users (and Google bots) always have a sense of where they are located on your website.
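A breadcrumb trail such as Home › Guitars › Electric Guitars is communicated to Google with the schema.org BreadcrumbList type, usually as a JSON-LD script in the page’s head (the names and URLs below are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1,
      "name": "Home", "item": "https://www.example.com/" },
    { "@type": "ListItem", "position": 2,
      "name": "Guitars", "item": "https://www.example.com/guitars/" },
    { "@type": "ListItem", "position": 3,
      "name": "Electric Guitars" }
  ]
}
</script>
```

The final item (the current page) can omit its "item" URL, since the user is already there.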

Once you have implemented the breadcrumbs structured data markup, Google will provide a report that lets you know whether or not there are any errors and how to resolve them.

Logos: Improve your Brand Impression

The Logos structured data markup helps Google understand the logo associated with your business or website. Adding the Logos structured data markup ensures that your logo appears in branded search results for your company and helps Google know to show your brand logo in knowledge panels.
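The markup itself is a short Organization snippet pointing at a crawlable image file (all values here are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Guitars",
  "url": "https://www.example.com/",
  "logo": "https://www.example.com/images/logo.png"
}
</script>
```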

In the Google Search Console Tool, the logos report will let you know whether or not your logo is appearing correctly in rich results and if not, how to correct the issue.

Local Business Enhancement: Get More Calls and Foot Traffic

The local business structured data markup is really important for any local or small business with a brick-and-mortar store that users will likely find via mobile search.

This structured data markup type provides essential information about your local business, like your physical address, hours of operation, and online reviews, and makes it easy for searchers to contact you by phone straight from the result. If there are any errors in your search result, Google will notify you in the GSC dashboard.
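A basic LocalBusiness snippet covering those essentials might look like this (every value below is illustrative):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Guitar Shop",
  "telephone": "+1-555-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Austin",
    "addressRegion": "TX",
    "postalCode": "78701",
    "addressCountry": "US"
  },
  "openingHours": "Mo-Sa 10:00-19:00"
}
</script>
```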

Security & Manual Actions

Having a safe, secure website creates a higher-quality web experience for users, which is why Google provides warnings of any security issues it detects on your website. Google will flag any issues that could harm users it sends to your website and will notify you of which pages the security issues were detected on.

Google Search Console provides details on how to resolve the specific security issues it detects. It is the site owner’s responsibility to resolve those security issues in order for their rankings to improve. 

How to Notify Google of Manual Penalty and Security Fixes

In order to test whether or not the security issues have been resolved, simply select “Request Review” in your Security Issues report and provide the following information in your request:

  • Explain the security issue
  • Describe how you have attempted to resolve the security issue
  • Provide detailed information on the outcome of your efforts

It usually takes 2-3 weeks before Google Search Console notifies site owners by email of whether the issues have been resolved.

GSC Manual Actions

The manual actions report is meant to deter websites from using black hat or unethical tactics to manipulate their search results. If your site is issued any manual actions, some or all of your site will not be shown in search results.

There are a variety of reasons why your website may be issued a manual action, but they all fall under the umbrella of failing to follow Google’s webmaster quality guidelines. 

To fix the issue, follow these guidelines outlined in Google Search Console’s help center.

Legacy Tools & Reports

Some legacy tools and reports do not yet have a replacement in the new Google Search Console dashboard. Google no longer makes the older version of GSC available, but site owners can still access these tools via the links Google provides in its help center:

  • Robots.txt tester – Test the syntax of your robots.txt file. You can also use the dashboard’s free robots tool to generate robots.txt
  • URL parameters tool – Communicate to Google any special URL parameters
  • International Targeting – Debug hreflang settings or set country targets
  • Data Highlighter – Use when structured data markup is not helping Google extract information from your webpages
  • Crawl Rate settings – Allows you to lower Google’s crawling requests
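If you do build a robots.txt file by hand before running it through the tester, the syntax is just user-agent blocks with allow/disallow rules (the paths below are examples):

```text
# Applies to all crawlers
User-agent: *
Disallow: /cart/
Disallow: /checkout/

# Point crawlers at your sitemap
Sitemap: https://www.example.com/sitemap.xml
```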

Links: See your Backlink Profile at a Glance

The Links section of Google Search Console allows site owners to see their backlinks, internal links, top linking sites, top linked pages, as well as the most common anchor text other site owners use when linking to theirs. Site owners can export a CSV file of all of their backlinks by clicking the “Export External Links” button to get a closer look at those sites that are linking to theirs.

If you have a large amount of low-quality links that you think might be impacting your rankings, you can submit a disavow file using the disavow tool. This will not remove the links, but instead, make it so Google no longer considers them in their evaluation of your website. You can use our backlink analyzer to generate disavow text for your disavow file.

Only advanced SEOs should be disavowing links, as you don’t want to accidentally harm your site by disavowing the wrong links. It’s also important to use this tool sparingly, as Google wants site owners to find other ways to get those links removed or to instead focus on earning more high-quality links to their site.

How to Use Google Search Console Effectively

With so much data and performance reporting available in Google Search Console, it can be overwhelming for new site owners. How do you make the most of all that valuable information to improve your search results? Here are the primary ways to think about using the Google Search Console dashboard. In conjunction with our GSC tool and our software suite of tools, you can make the most of Google’s data to improve your search results more effectively and quickly.

Keyword Rank Tracking

The most straightforward purpose of using GSC is to see the specific keywords your site is ranking for. If you are doing the work of SEO correctly, you create and publish content on your site with specific keywords in mind. You can use GSC to confirm whether your content is high-enough quality to earn those rankings.

After you optimize your landing pages, use Google Search Console to measure how well your site performs for those keyword targets. If you make additional on-page optimizations or build off-site backlinks to those pages, measure how your keyword rankings change.

By elevating your site’s presence with Search Console data, you can improve your search traffic across multiple relevant keywords for the long term.

Understand the Strengths and Weaknesses of your On-Site Content

Understanding the best and worst-performing pages of your website helps you determine your next step for improving your number of impressions. Whether you need to create new content, redirect specific pages, optimize an HTML tag, re-optimize relevant pages, resolve keyword cannibalization, or more, you can identify the different ways you can improve your content with our Google Search Console.

Using GSC Insights to Maximize the Value of GSC Data

To automate and more easily track your SEO performance, GSC Insights makes tracking campaigns, keywords, and pages even easier.

GSC Insights and our dashboard can help you identify SEO potential based on competitor data, sites with potential keyword cannibalization (often caused by duplicate content), and indexing issues, including a sitemap report.

Backlink Monitoring

In addition to SERP data, SEO professionals encourage site owners to monitor off-site SEO metrics such as backlinks.

The likelihood of your pages appearing as a top result in Google’s SERPs depends in part on your site’s domain rating, and backlinks are essential to building that rating. 

Directing SEO efforts off-site can elevate your site’s ranking more quickly than anything else. Backlinks are key in the algorithm and can improve your performance for different keywords because your domain name will be perceived as more authoritative in Google’s algorithm. 

To increase your domain rating, you need backlinks. 

After you connect your Search Console account to GSC Insights, you are able to monitor your backlink profiles to ensure that your website is not keeping company with any questionable web properties. 

Our Dashboard makes monitoring your backlinks easy, and the software suite provides you with backlink outreach opportunities, so you can become discoverable in more relevant queries.

SEO A/B Testing

For more advanced SEO specialists, our Dashboard is a great platform for running A/B tests on your website without the need for spreadsheets.

How to Use GSC Insights to Track Campaigns

Site events allow you to easily mark and then track changes you make to your site within the GSC Insights tool. Whether you change a minor element in a piece of content, such as a title tag, meta tag, or meta description, or edit a number of pages, you can create an event to mark the change.

From there, you can track months or even years of daily data to monitor the effectiveness of your efforts. You can isolate data such as page rank, keyword changes, impressions, and CTR.

Because our Dashboard’s keyword data is more up-to-date than tools like Ahrefs or Semrush, you can perform better keyword research and see more quickly and accurately whether your changes improve your site’s search performance.

Final Thoughts on How to Use Google Search Console

If you want to improve your site’s SEO, the first thing you need is full access to the important information that helps you track your search engine optimization efforts. The Google Search Console website has all the basics and is the easiest way to track SEO and site settings for both large and small sites.

Paired with GSC Insights, you can understand Google results for your site’s pages and site’s keywords with even more granularity. For more pro tips on how to make sure Google‘s tools are working best for you, talk to one of our SEO specialists. We can help you understand your search console reports and provide managed SEO plans that will elevate your website URL in the search results pages.

Is your SEO strategy failing to drive enough conversion-oriented organic traffic to your e-commerce site? Or are you just getting started with a new e-commerce store?
This article covers e-commerce specific SEO strategies and considerations. By applying these advanced on page SEO tactics and best practices, you can expect your online business or online store to gain search visibility, better organic rankings, higher organic search traffic, and ultimately more online shoppers.

We’ll walk you through the same basic SEO elements you’ve undoubtedly read about countless times before, but address their application for an ecommerce store. These practical SEO tips will help you significantly increase your new customers and online sales by making it easier to reach your target audience without spending a dime on PPC advertising.

How to Set IA for E-commerce Sites

Category-based information architecture (IA) and site architecture are critical for ecommerce sites, and this IA should inform your main navigation.

Online users have no patience. If they can’t find what they’re looking for quickly, they’ll bounce back and try the next site in the search results.

As your product catalog grows, you will need to put more effort into making it easier for users to find the right products!

Your site must have well-thought-out UX (user experience) elements such as filters, navigation links, breadcrumbs, and product categories and subcategories, as well as clear URL structures and product naming conventions.

Ecommerce IA Is All About Product Categories

Good UX makes it easy for a user to understand where they are in your site, and what products and/or services your online store offers. For ecommerce SEO, site structure is typically based on product categories, product collections, and products/product filters.

  • Step 1: Determine Product Categories
  • Step 2: Determine Product Sub-Categories and/or Collections
  • Step 3: Determine Product Names

Product categorization for your e-commerce website may need to be different from your product categorization operationally. Your ecommerce SEO strategy needs to reflect how your consumers view your products, not how your business views your products.

IA Should Reflect How Your Customers Think

In general, your site structure should reflect how your customers think about your products and services, even down to your actual product names.

A common mistake that e-commerce sites make is organizing their products online the way they view those products from a production or operational perspective.

How everyday shoppers think about your products may be different than how you think about your products.

For example, you may think of a piece your company makes as “Breville part #: BJE510XL/45” but your everyday shopper may search for “Breville Filter Basket Replacement”.

So how do you establish and/or close the gap between how you think about your products, and how your prospective customers search for your products?

When In Doubt, Ask Your Customers!

Pull from a usability best practice — have direct conversations with existing customers. Ask them how they’d describe your products or services, how they mentally categorize your offerings, or how they searched for your business to begin with. Their input will help you understand the language your customers are using, and how they think about your products or services.

If you need a starting place for understanding how your consumers view your products, you can have them complete a card sorting exercise. Card sorting is a UX (user experience) tactic that helps you prioritize and group information based on how your customers see it – by literally giving them all the elements and asking for their feedback.

Account for Broader Market Trends

Next, complete keyword research. Keep in mind a handful of responses can be very helpful for gaining insights, but they may not reflect the broader market. Spot check search volume for keywords using the language your customers use, the language you would use, and be open to discovering additional ways the market overall searches for your products/services.

How to Conduct Keyword Research for an E-commerce Store

Keyword research helps you understand how the market thinks about products and services, and which search terms are likely to convert if you can attract those shoppers to your ecommerce site.

Start With a Keyword Research Tool

There are a number of tools to help you with keyword research. If you already use Google Ads, you could use the Google Keyword Planner tool as a starting point for establishing the best keywords (search terms) to target. You can also use our keyword volume or keyword tracker tools.

Make a Starter List of Product-Specific Keywords

Your goal should be to identify a list of keywords that describe your products and have high search volume. This set of terms represents how your consumer base thinks about your products, and this is the language that you should use throughout your ecommerce website (ex: for product categories or collections).

You need to know the long-tail keywords, the specific terms people use when searching for the exact products that you sell.

Narrowing Down Keywords By Search Intent

The types of keywords you would use to optimize an e-commerce site aren’t necessarily the types you would use in another niche. You want to attract users towards the bottom of the purchase funnel. To do this, you need to identify a user’s search intent.

As the name implies, the term “search intent” refers to the reason someone is performing a search. This also influences the words they choose when performing a search. Search intent can be broken down into four categories: informational, navigational, commercial, and transactional.

  • Informational searches use keywords that indicate a user wants to learn about a particular topic.
  • Navigational searches include the domain that results should be surfaced from, such as The New York Times or Twitter.
  • Commercial searches suggest a user is interested in a general product or service, but hasn’t chosen a specific option yet. This type of search might include keywords such as “best guitar for beginners” or “guitar reviews.” The person performing this search is clearly thinking about making a purchase in the future. They’re simply conducting initial research first.
  • Transactional searches are usually performed when a potential customer knows precisely what they wish to buy. As such, words such as “buy” and “for sale” often show up in these searches. To continue with the example above, after conducting research and deciding which guitar to buy, the user in question might search for “buy Yamaha Gigmaker EG Electric Guitar Pack” or “Yamaha EG Electric Guitar Pack for sale.”

It’s important that you select keywords that have commercial or transactional search intent. These are the terms that will convert for your site. Do NOT simply pick keywords that have high search volumes. After all, your core goal is actual ecommerce sales. You want to attract more potential customers, not just generic organic traffic!

Select Keywords with Commercial Search Intent

In general, ecommerce keywords belong to the commercial and transactional categories. It’s easy to understand why you’d want to focus primarily on researching transactional (and, to some degree, commercial) keywords. These are simply the types of keywords people frequently use when they are ready to make a purchase. This makes them ideal for ecommerce websites.

Select Keywords That Are Very Specific

It’s also worth noting that ecommerce keywords are often very specific, or long tail. Long-tail keywords are keywords that have modifier terms around the basic keyword. Identifying these more specific keywords can be extremely valuable.

In the example below we see how more specific searches can have a higher cost per click. This is a market indicator that the term is higher-converting. You may also notice that there is less search volume on long-tailed, or specific, keywords. Broad searches have higher search volume because there is a much wider range of reasons those searches could be performed. Specific searches have much clearer intent.

A very specific search phrase, for example one that includes model type, size, brand name, or location indicates a user knows exactly what they want and they are ready to make a purchase. Very specific searches tend to be more high-converting, and more impactful on your bottom-line.

Select Keywords That Have High CPC

One more indicator that a keyword is likely to be used by potential customers and have a higher conversion rate is that the term has a high Pay Per Click (PPC) or Cost Per Click (CPC) value. These are terms that the market has already validated as conversion-oriented. However, you cannot rely on CPC alone; you still need to check the relevancy of terms against your own product list and/or services. Converting for the market will not always mean converting for your specific site or business.

Set Up Your E-commerce Website Architecture

Once you understand the search queries your customers are using, at both a category and product level, you’re ready to finalize your site architecture. Use broad high-volume terms as product categories/product collections, and then more specific terms for sub categories and individual products.

You’ll use this architecture to set up the structure of your site, from your home page, to category-based landing pages, sub category landing pages, collection pages, and finally product pages. This architecture will also inform how your main navigation and sub navigations are structured.

Create Your Category Landing Pages

Once you’ve determined your product categories and subcategories, consider creating related landing pages for each category and/or subcategory. A recent study has shown that sites that increase their landing pages from 10 to 15 see a 55% increase in leads. These pages can be used to target keywords that are broader than your product-specific keywords. For example, this page on Guitar Center for electric guitars:

This landing page (and its URL) targets the broader and higher-volume term “electric guitars.” Category pages help your site capture traffic from higher-volume terms while the individual product pages target much more specific long-tailed keywords (aka higher-converting keywords).

Set Up Your Main Navigation

Category-based site structures also help users navigate quickly to relevant products/services right from the homepage. For ecommerce websites of any size, each category page can be a main nav or sub nav item. This strategy adds the related category keywords to every page on your site and increases the page rank of these pages through internal linking, as the main navigation is repeated on every page of your site.

Take Guitar Center’s site, for example: all of these sub-category pages for “guitars” are listed (linked) in the main navigation, and therefore all the terms you see here are “read” by search engines on every single page of Guitar Center’s site — not just their homepage. This site structure also makes it very easy for users to find the exact product they’re looking for and even discover new products. This boosts your pages’ odds of appearing in relevant search results.

Implement Category-Based Breadcrumb Navigation

For larger ecommerce sites, adding all product categories or subcategories to the main navigation may not be feasible. In this instance, breadcrumbs can provide an alternative method for leveraging internal linking and product pages to help users navigate deeper sites.

Guitar Center’s main nav does not display or internally link to any pages below the “electric guitars” category. However, there are additional product subcategories within the electric guitars product category. To improve usability and discoverability of these additional subcategory pages, breadcrumb navigation has been added (highlighted in red below).

Breadcrumbs are especially useful on product pages, as they can help users discover a full product line, clarify the website structure, and provide a secondary navigation link to bounce a user back multiple site levels without having to press the back button multiple times.

Amazon, as another example, uses breadcrumbs to provide users with secondary navigation on almost every product page.

Employ Category-Based Redirects

Finally, category pages can be helpful for ecommerce sites, as you can set up category-based redirects. Category-based redirects allow out-of-stock products to be redirected to the main category page. This improves the user experience and reduces the chances of Google, Bing, or other major search engines reading any of your pages as returning 404 errors.
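On an Apache server, for example, a single .htaccess rule handles this kind of redirect; the paths below are placeholders, and most ecommerce platforms expose the same idea through their own redirect settings:

```text
# Permanently redirect a retired product page to its category page
Redirect 301 /guitars/discontinued-model /guitars/
```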

Determine Your Site’s URL Structure

A well-thought-through site structure will also enable you to programmatically generate custom product URLs that include relevant keywords (i.e., describe the product) and are easy for users to read and understand.

Use Plain-Language URLs

Keyword-based URLs are more “clickworthy” than URLs that consist of seemingly random characters such as product SKUs. URLs featuring keywords essentially “tell” a search engine algorithm more about what type of product is featured on the page.

URLs Should Include the Product Name

In ecommerce SEO, the URL slug will typically be the main product keyword — usually the product name. Your main keyword for the product should also be included in your H1. Continuing with the earlier example of a user searching for a beginner guitar set, this page from Guitar Center demonstrates the right way to generate a product URL:

The URL slug is the name of the product. It’s also the H1 (not just stylistically but also with the HTML H1 tag applied). The result? This is the first page to appear in a Google SERP when users search for “yamaha gigmaker eg electric guitar pack.”

Popular Ecommerce URL Formats

The URL also illustrates a popular ecommerce URL format: domain.com/category/product. Other options to consider include domain.com/collection/category/product and simply domain.com/product.

Determining which format to use requires deciding whether your products belong to specific categories and collections, or whether they stand on their own. The chosen format works in this example because the product is an electric guitar pack belonging to a specific brand.

Explore the traditional URL structure of various ecommerce platforms when reviewing your options, but keep in mind that you can often change the structure by choosing the right theme or directly editing the code.

Optimize Individual Product Pages

I’m sure you’ve spent time optimizing your homepage already, but did you know it’s even more crucial to optimize the SEO of your individual product pages? These are the pages that need to appear in relevant search results, and you want them to be strong enough to convince visitors to make a purchase.

Product-Focused Technical SEO Elements

Once you’ve done some preliminary keyword research, begin to optimize your individual product pages by addressing your Technical SEO:

  • Page Title
  • Meta Description
  • H1
  • Clear search intent

E-commerce Title Tags

Page titles, also known as title tags, need to accurately represent what a product is. They also need to feature the primary keyword you most want to rank for. Make sure this keyword or phrase is front-loaded in the title tag so it's more noticeable on a small mobile screen, and so the key information from the title tag still displays even in rich snippets.

Take a look at the examples below. In the first, only the first two words of the page title are visible. In the second, subcategories are highlighted.
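To sanity-check front-loading, you can preview roughly how a title truncates in the search results. The 60-character cutoff below is an approximation; actual display limits vary by device and pixel width:

```python
def preview_title(title, limit=60):
    """Approximate how a title tag displays when truncated in the SERP."""
    if len(title) <= limit:
        return title
    # Cut at the limit, then back up to the last whole word
    return title[:limit].rsplit(" ", 1)[0] + "..."

title = "Yamaha GigMaker EG Electric Guitar Pack | Guitar Center"
print(preview_title(title))
```

If your primary keyword disappears in the preview, move it closer to the front of the title.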

E-commerce Meta Descriptions

You should also include relevant keywords in the meta description for your product page. Your meta description should encourage the user to click into the search result, clarify what the user can expect from the page, and include product-relevant keywords to help your page rank.

It’s important to keep in mind that Technical SEO is primarily about helping users navigate online. The meta information on your product pages (including page titles, descriptions, and headers) serves much the same purpose as highway signs or signage at an airport: it helps users reach their intended destination. Thus, you should attempt to be as informative as possible while also being brief and direct.

Improve Product Metas with “Modifier” Words

When thinking about what information to include in your title, meta description, and/or H1, it can help to think about product modifiers. These are terms (often included in long-tail keyword searches) that further describe the product. They often include items such as:

  • Price
  • Size(s)
  • Color(s)
  • Material(s)
  • Whether an item is for a particular gender and/or age group
  • Discounts (you may want to include keywords like “as low as”)
  • Shipping options

Remember, many of the users you’re targeting know exactly what they’re looking for in detail. You thus need to provide them with information demonstrating you’re selling exactly what they’re looking for.

Images Sell Products, Alt Tags Help

Including images of your products is crucial to ecommerce SEO.

  • Images significantly increase product sales, and help users form an idea of what they’ll be receiving for their money.
  • Well-optimized images with fast load speeds help send signals to search engines that your site has been optimized for mobile users.
  • Alt tags and image title tags can help your images show up in image searches and send additional keyword-relevancy signals to search engines.
  • Images are a prerequisite for being included in Google’s rich snippets at the top of the search results.

Images Provide Customers Better Context

Images allow you to display your products in dynamic ways. If you’re selling apparel, product images where items are being worn by models provide customers a better sense of how an article of clothing looks when worn.

Images can also provide context for products. For instance, maybe you sell furniture and fixtures. An image of a product in a room (ideally surrounded by a few other items) will give users a better idea of its size, and how it will look in their own homes.

Images Need to Be Size and Speed Optimized

Product images that have small file sizes, that don’t slow page load speeds, and that adjust responsively display well on mobile. Google is continuing to switch sites over to mobile-first indexing, and both Google and Bing noticeably reward sites with images and rich media in search.

Notes on Drafting Alt Tags

Image alt text tags are simply descriptions of images on your site, and they play a significant role in ecommerce SEO. Alt text tells a user what an image depicts when the image doesn’t load or when the user relies on a screen reader, but it also gives search engines more information about what the image is relevant for in search.

Where appropriate, ALT tags should feature the keywords you want to rank for without adding confusion to the image description itself. Keep in mind the point of alt text is still to help people with accessibility issues, so keep alt text relatively short (no more than 125 characters) and try to be specific about any key product features highlighted in the image (such as product name, size, materials, and any other relevant information). This is another way in which you can tell a search engine what type of content appears on the page.
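Those guidelines can be expressed as a quick illustrative check. The `check_alt_text` helper is hypothetical, not part of any CMS or platform:

```python
def check_alt_text(alt):
    """Flag common alt-text problems: missing text or overly long descriptions."""
    issues = []
    if not alt.strip():
        issues.append("empty alt text")
    if len(alt) > 125:
        issues.append("longer than 125 characters")
    return issues

# A specific, keyword-relevant description that stays under the length limit
good = "Yamaha GigMaker EG electric guitar pack in black with amp and gig bag"
print(check_alt_text(good))  # []
```

A check like this is easy to run over an export of your product catalog to find images missing descriptions.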

Lengthen Product Descriptions

Including a short product description right next to the product image is a smart way to capture a potential customer’s attention and improve your on-page SEO. They’ll see both the product and the most important information about it at the same time.

How to Avoid Thin Content

Review this example to understand why this method is effective. The image shows off the product, while the copy provides a user with the basic essential information.

Scroll down, however, and you’ll find a lengthier product description. A longer product description section gives you the opportunity to include more keywords in your content and ensures your page won’t be held back in the rankings for thin content.

Additionally, great content is more likely to engage customers and build brand awareness. When you only have thin content, users spend less time on the page and less time considering your product. Not only does more time spent on the page improve your rankings in the SERPs, but great descriptions will also help guests better understand why a product is valuable.

The right word count for descriptions depends on how much content exists on a blank product page. What this refers to is the sum of text in the navigation, header, footer, etc. before product information is added. Making sure product descriptions are longer than the sum of the base page content is a good starting point.

Never Use The Manufacturer’s Product Description

One thing to always note, though: never use the manufacturer’s product description. Ensure yours is unique and not copied from the manufacturer or another site. This is important because search engines won’t show the exact same content (duplicate content) from multiple sites. They instead only display content from the site that is deemed most “trustworthy.” This can end up being the site that has had the content up the longest, the site with the most traffic, the site with the most backlinks, or the site with the most users.

How to Avoid Duplicate Content

If you have multiple pages for essentially the same product (ex: the same product in different colors, or the same product in different sizes), you’ll need to make some choices so that search engines are not confused by what are essentially duplicative pages/duplicative content:

  • Keep the product description on each page relatively the same, but set one page as canonical.
  • Create entirely unique product descriptions/unique content for each page.
  • Vary a percentage of the content under the descriptions using LSI keywords (latent semantic indexing keywords).

To avoid duplicate content when creating product descriptions, try breaking up the content into multiple sections. For example, one section could describe the story behind the product. Another could list its key features. Yet another could feature customer testimonials, just like the Biossance example above.
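The canonical option above boils down to one HTML element in the head of each variant page. Here's a minimal sketch, with invented product URLs:

```python
def canonical_tag(canonical_url):
    """Render the <link rel="canonical"> element for a product variant page."""
    return f'<link rel="canonical" href="{canonical_url}">'

# Every color variant points search engines at one canonical product page
canonical = "https://example.com/products/classic-tee"
variants = [
    "https://example.com/products/classic-tee-red",
    "https://example.com/products/classic-tee-blue",
]
for url in variants:
    print(f"{url} -> {canonical_tag(canonical)}")
```

With this in place, search engines consolidate ranking signals onto the canonical page instead of splitting them across near-duplicate variants.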

Include LSI Keywords

LSI keywords are terms that search engines expect to see on a page that is related to a particular topic. Often called focus terms, LSI keywords help Google understand the focus of a page. Read more about focus terms and content optimization here.

LSI keywords can help you tailor each page to rank better for the longtail keyword or term you’ve selected. We even have our own content optimization tool that you can use for free.

Break Product Descriptions Into Sections

You should also consider that different people care about different product features. Breaking your descriptions up into sections makes it easier to appeal to all users. For example, if you were selling a garment, various users might care about such information as size, durability, warranties, shipping options, color options, special features (such as water-resistance), and more. Use your longer product descriptions to provide all information you believe potential buyers would be interested in. If your product descriptions are exceptionally long, you can even use internal links called jump links to help users navigate down to relevant content more quickly.

Implementing Product Schema for an eCommerce Site

Schema markup refers to HTML tags you can add to your content to generate rich snippets. When used correctly, schema can increase CTR by as much as 677% and boost traffic 20-30% by providing users with more valuable information about your content when it shows up in Google search results.

For example, maybe a user’s search results include one of your product pages. With schema markup, you could include customer ratings in your organic search result – or even show up in the product rich snippets that appear at the top of the search results. It’s also worth noting that Google’s own John Mueller has confirmed that schema is important to SEO.

How to Implement Product Schema

Product schema can substantially improve your SEO. It’s also fairly easy to implement. The following are two simple ways to do so:

Install a Plugin

Do you use a major platform like Shopify or WooCommerce to manage your ecommerce site? If so, you can simply install a plugin for schema markup. It will allow you to add the necessary schema with ease.

Other platforms, like WordPress, have their own schema plugins, too. So does Squarespace. While these platforms don’t allow for extensive schema markup on their own, plugins can expand their capabilities.

Use a Schema Markup Platform

If you have a custom site that is NOT managed via WordPress, you could instead use SchemaApp, which allows you to organize your schema markup data on one platform. You can also use this tool if you host your e-commerce site through platforms such as Shopify, WooCommerce, BigCommerce, and Squarespace.

There’s also Google’s Structured Data Markup Helper, which you can follow along with after selecting “Products” from the main screen.

Does your organization have substantial in-house technical resources? If so, you can coordinate with a web developer to add schema markup to your site via Schema.org. This allows you to exercise a greater degree of control over the purpose of the schema you wish to add. It’s not an option for all businesses, but it’s worth considering if you have the necessary resources.
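If you do work with a developer, hand-written product schema is typically JSON-LD embedded in a script tag. Here's a minimal sketch following Schema.org's Product type; the product details, price, ratings, and URLs are all invented for illustration:

```python
import json

# Minimal Product schema; property names follow Schema.org's Product type
product_schema = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Yamaha GigMaker EG Electric Guitar Pack",
    "image": "https://example.com/images/gigmaker-eg.jpg",
    "description": "Complete electric guitar starter pack with amp and gig bag.",
    "offers": {
        "@type": "Offer",
        "priceCurrency": "USD",
        "price": "299.99",
        "availability": "https://schema.org/InStock",
    },
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.7",
        "reviewCount": "212",
    },
}

# The JSON-LD is embedded in the page head or body inside a script tag
snippet = ('<script type="application/ld+json">\n'
           + json.dumps(product_schema, indent=2)
           + "\n</script>")
print(snippet)
```

You can paste output like this into Google's Rich Results Test to confirm the markup is eligible for rich snippets before deploying it.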

Build Backlinks

Building a strong backlink profile is part of an effective SEO strategy for any site. Ecommerce sites are no exception. Inbound links (also known as backlinks) are critical for improving your SERP rankings, and can help you bump terms stuck on the second search engine results page up to the first.

What are inbound links or backlinks? A backlink is when another (external) site links back to your site, referencing your products, services, or content. In essence, another site is referring its own users to your site because your site provides value, and search engines view that link as a positive reference from a real person. Link signals are weighted heavily in SEO: each backlink your site receives increases the value of your site in the eyes of Google, and thus improves your rankings.

As more domains link back to your site, your own site’s domain authority will increase. As your domain authority increases, so does your site’s SEO value. Search engines use these ranking signals (backlinks) to determine which sites are most relevant online for related topics. Adding link authority boosts your page’s SEO value and its ranking in the search results.

A strong backlink profile improves brand awareness and captures top of funnel web visitors who may encounter your site/brand via another initial source.

How to Build Backlinks

Link building starts with creating quality content that will be used and shared by people outside of your own site. Securing inbound links from reputable sites tells search engines your site is also reputable. These links may also provide additional opportunities for your products to display in rich snippets, and direct more traffic back to you.

There are several ways you can build backlinks for an ecommerce site. The following are a few methods we’ve found to be successful for e-commerce sites:

Submit Your Products to Product Lists & Pages

Many sites routinely post lists such as “Best Holiday Gifts for College Students,” “25 Life-Changing Products Under $25,” or “What to Get the Person Who Has Everything.” There are also sites specifically designed to help users discover new products, such as Uncommon Goods, Product Hunt, Pinterest, and Wish. Submitting your products to these sites boosts your odds of showing up on such lists.

Post Strong Blog Content

A blog featuring valuable content can be a very useful tool for building backlinks. We recommend starting by identifying a list of ideas for blog posts. Each blog post should be tailored to a frequently asked question, or frequently searched topic, relevant to your business and your target consumer. Popular blog content, such as “Top 10 Gift Ideas for Father’s Day 2020” can encourage others to link back to this type of entry if it is fairly comprehensive.

Additionally, you could submit guest blogs to other sites, linking back to your products in the content. For further reading, SEMRush has a great guide to guest blogging as a linking strategy.

Pitch Product Pages

A more advanced approach would involve pitching product pages. Think about the kind of sites and publications that are likely to cover your products. Check their writer profiles and masthead to find their contact information, and submit a product for review. Each site will have its own process, so research publications and influencers or discuss this with an editor or other relevant individual before submitting your products blindly.

Image Above: Example of a Product Write-Up Included in a List

You may also want to coordinate with influencers in your niche. This guide gives an excellent intro to reaching out to influencers. Search for social media influencers in your industry. If a popular Internet personality recommends your products, that will generate more backlinks and drive overall interest in your brand.

Next Steps

Once you’ve optimized the basics (product descriptions, URLs, schema, etc.), make sure that your site has enough trust signals and social proof that consumers feel confident purchasing from your online store:

  • Your site design needs to look professional, from the home page to your checkout page.
  • You need to provide security indicators around payment portals such as SSL certificates (ex: https vs. http).
  • Add product reviews to your site!

Customers typically trust user reviews on third-party platforms (such as reviews on Google My Business or Amazon) more than curated product reviews you post to your own site. However, testimonials that display the customer’s full name can still be a great first step, or a useful addition alongside third-party customer reviews pulled onto your own site.

Depending on how you’ve set up your product schema, you’ll also be able to display your aggregate star rating directly in the search results (especially in Google).

These points are all important to keep in mind when developing an ecommerce SEO strategy for your site. The right combination of tools and techniques can be the key to ranking higher, attracting more organic traffic, improving your conversion rate, and ultimately increasing your e-commerce sales.


Many Shopify store owners are not leveraging their full SEO potential. You might think that you don’t need Shopify SEO or that it is too difficult to set up, but you could be missing out on a large number of potential customers browsing your online store. If you want to help your e-commerce store show up in Google, here are some SEO Shopify tips to help you get started. 

What is Shopify SEO?

Shopify SEO is the process of making your online store more visible in the search engine results pages (SERPs). This means that when people search for the product that you sell, you want to rank high so that you get more organic traffic and increased sales.

Best SEO Shopify Tips for 2022

In order to optimize your Shopify store effectively, follow these 9 tips.

Tip #1: Categorize Your Products

Category pages allow online shoppers to easily navigate your products. You’ll want to target keywords that your shoppers would be interested in. 

You’ll first want to start with title tags and meta descriptions. As PracticalEcommerce states, “The title tag is the most influential on-page element that sets your page’s keyword theme and, combined with the meta description, influences the search terms the page ranks for.” 

Another element you’ll want to optimize is your heading tags. Heading tags are parts of the page that tell the reader what certain parts of the page contain. 

The Media Captain offers these tips for more effective headlines:

  • Only have a single H1 on the page.
  • H2-H4s should elaborate on the core theme and follow a clear hierarchy of information.
  • Use relevant keywords in your headings.
  • Make your headings relevant and captivating; you want your readers to be intrigued and informed on the content within that section.

Tip #2: Optimize Your Shopify Store’s URL Structure

If you want to rank well, another thing you have to keep in mind is your URL structure. A clear URL makes it easy for visitors and search engines to navigate your site. 

ContentKing states a URL is generally considered good if it’s: 

  • Descriptive and easy to read
  • Brief
  • Consistent
  • Lowercase

This way, users can see the whole URL and its keywords in the search results before they click.

You should also try to include keywords in your URLs, so rather than something like this:

example.com/categories/jk13d3

You should have a URL that contains keywords from your product page. So if ID jk13d3 stands for “Levi Jeans” on your site, you could change the URL to:

example.com/categories/levi-jeans 

Another tip is to make sure you separate the words in your URLs with hyphens. Google may not be able to parse your URLs (e.g. example.com/soccershoes) if you don’t separate the words. Keep in mind that Google treats hyphens as word separators (e.g. example.com/soccer-shoes) but treats underscores as characters that join words together.

Tip #3: Choose Your Keywords Wisely

The keywords that you incorporate into your eCommerce store are incredibly important. That’s why it’s imperative to do research and see what your direct competitors are using. This can be done just by going to their websites and compiling data or using a keyword research tool. Here are just a few tools that might work for you:

  • Google Keyword Planner – Get help with keyword research and selecting the right terms.
  • WordStream Keyword Tool – Discover and export new keywords and performance data to help you succeed in Google Ads and Bing Ads.
  • Our SEO Dashboard – A popular SEO tool that specializes in keyword research, competitor analysis, and content optimization.

When picking what keywords to target there are a couple of factors you’ll want to look at:

  1. Search Volume: the number of times a keyword is searched for within a certain timeframe.
  2. Keyword Difficulty: how hard it will be to rank for a certain keyword.
  3. CPC: Cost-per-click is the amount advertisers are paying to target the keyword in Google Ads, and is a good indicator of conversion potential.

Usually, you want to target keywords with a high search volume, high CPCs, and less competition. 
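One way to weigh those three factors together is a rough priority score. The formula and the example metrics below are an invented illustration, not an industry standard:

```python
def keyword_score(volume, cpc, difficulty):
    """Illustrative priority score: reward search volume and commercial
    value (CPC), penalize competition (keyword difficulty)."""
    return round((volume * cpc) / max(difficulty, 1), 1)

# Hypothetical metrics: (monthly search volume, CPC in USD, keyword difficulty)
candidates = {
    "enterprise seo": (1900, 25.0, 81),
    "enterprise seo services": (480, 30.0, 54),
}
for keyword, metrics in candidates.items():
    print(keyword, keyword_score(*metrics))
```

A score like this is only a starting point for sorting a long candidate list; always sanity-check the winners against search intent.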

Once you know what keywords you’re targeting, start incorporating them into product descriptions, landing pages, and headlines. You’ll want to continue to monitor these keywords and see how they’re performing, and make adjustments when necessary.

Tip #4: Attract Buyers With Your Page Title

Your page title is the first thing online shoppers see when searching Google, and it influences whether or not someone clicks on your result. And what’s the right length for a title tag? As Moz states, “While Google does not specify a recommended length for title tags, most desktop and mobile browsers are able to display the first 50–60 characters of a title tag.”

Here’s a formula for writing a great page title:

Keyword | Additional Keyword | Business Name

Here’s what this looks like when you search “cashmere sweater” on Google:

As you can see, these page titles are short and sweet. It’s important not to stuff your titles full of keywords, as this can get you into trouble with search engines.
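The formula above can be sketched as a small helper that joins the parts and flags titles longer than the roughly 60 characters browsers display. The example keywords and business name are invented:

```python
def build_title(keyword, additional_keyword, business_name, limit=60):
    """Assemble a 'Keyword | Additional Keyword | Business Name' page title."""
    title = " | ".join([keyword, additional_keyword, business_name])
    if len(title) > limit:
        print(f"Warning: {len(title)} chars; may be truncated in the SERP")
    return title

print(build_title("Cashmere Sweater", "Women's Knitwear", "Example Co"))
# Cashmere Sweater | Women's Knitwear | Example Co
```

Keeping the primary keyword first means it survives even if the business name gets cut off.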

Tip #5: Develop A Strategic Content Plan

To create a great content strategy you can’t just create blog content, stuff in a bunch of keywords, and call it a day. Instead, you’ll want to start off by thinking about your target audience and what type of content resonates most with them.

Dive into your analytics and look at things like the age, gender, and location of people who buy your products. Another big factor is how your audience likes to consume your content. Do they use Instagram? Prefer videos or white papers? Short or longer-form content?

Start thinking about what sets you apart from your competition. From there, you can develop original content that aligns with your brand and your audience’s interests. Make sure to do keyword research, SEO competitor analysis, and look at keyword gaps. And finally, get started on writing your unique content! Content can be for:

  • Social Media
  • Email
  • Guides/Ebooks
  • Infographics
  • Videos
  • Blog Posts

Once published on your Shopify website, you’ll want to monitor performance to see what page content your audience is enjoying and interacting with. This will impact your content plan moving forward.

Tip #6: Keyword Density

Keyword density – also called keyword frequency – is the number of times a keyword appears on a webpage relative to the overall word count. In the past, keyword stuffing (packing as many keywords as possible into content) was very common. Now Google penalizes the rankings of pages that keyword stuff.

Want to figure out your keyword density? It’s easy to do! Simply divide the number of times a keyword is used on your page by the total number of words on the page. 

For example, say your keyword appeared 30 times in 2,000 words of content:

30 / 2,000 = 0.015

Multiply that by 100 to get the percentage: 1.5%.

For keyword density, there’s no perfect amount although “…many SEOs recommend using approximately one keyword for each 200 words of copy.”
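The calculation above, as a small helper:

```python
def keyword_density(keyword_count, total_words):
    """Keyword occurrences as a percentage of total words on the page."""
    return keyword_count / total_words * 100

print(keyword_density(30, 2000))  # 1.5
```

At the commonly cited pace of roughly one keyword per 200 words, that works out to a density of about 0.5%.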

There are also some free tools to help you calculate keyword density if you want to streamline the process:

  • SureOak: offers free keyword analysis for the SEO of any page on your website.
  • SmallSEOTools: analyzes the density of your text just as a search engine would do.
  • SEO Content Assistant: Recommends a frequency for key topical terms based on your target keyword

Tip #7: Acquire Trustworthy & Authoritative Backlinks

What’s a backlink? Backlinks are links from one website to another. They’re important because these links tell Google and other search engines that your site has domain authority. 

So when a bunch of other websites link to your website this increases your website’s domain authority with search engines, therefore also increasing your SEO. 

Not all backlinks hold the same weight though. You want backlinks from websites that have high domain authority. For example, a backlink from a page on Shopify would be extremely valuable since it is an authoritative website.

There are a couple of strategies you can use when link building:

  • Guest posting: research blogs or companies that relate to your business and email them to see if they would be interested in a guest post. 
  • Become a source for reporters: HARO (Help a Reporter Out) connects journalists and publications that need sources with experts like you. It’s free to sign up, and you can filter by industry. 
  • Write great content: If you write quality content that resonates with your audience others might start linking to it without you even needing to reach out.
  • Infographics: according to Search Engine Journal, an infographic is 30 times more likely to be read than a text article. Companies love including them in blogs and ebooks so if you’re able to create a visually appealing infographic it can go a long way for your e-commerce website.

Tip #8: Optimize Image Alt Text

Today, nearly 38% of Google’s SERPs show images. That’s why it’s important to capitalize on another source of organic traffic – images.

Alt text, or an alt tag, is the copy that appears if an image fails to load. If someone is using a screen reader, the image will be described using this copy. 

To add alt text to your products, Shopify provides these instructions:

  1. From your Shopify admin, go to Products > All products.
  2. Click the name of the product that you want to edit.
  3. From the product details page, click a product media item to see the Preview media page.
  4. Click Add ALT text.
  5. Enter your alt text, and then click Save ALT text.
  6. Click the X to exit the preview page.

Again, you won’t want to keyword stuff your alt text. Instead, focus on only one or two keywords and be descriptive of the product. If a product has text on it, make sure to include that.

 

A good example is the picture above of a bag of Doritos. In the Alt text they include the product name, flavor, size, and amount of product included. This tells the reader exactly what the image is without going overboard with copy.

Tip 9: Offer a Great User Experience 

You may have heard the term UX thrown around before; it stands for user experience. For an eCommerce website, this means your site is easy to navigate, visitors can find the products they’re looking for, and it’s easy to add items to a shopping cart and check out.

How do SEO and UX work together? SEO drives traffic to your site, and UX gets that traffic to convert. 

You should take these factors into consideration to create a great user experience:

  • Navigation – you want your navigation to be as simple as possible. Visitors should be able to go to any page and know how to get to a different location. A great feature to include is a search function. Providing intuitive, easy-to-use navigation can improve the customer search experience and boost conversion rates.
  • Site Speed – consumers say their ideal page load speed for ecommerce is two seconds. If your site is too slow, there’s a good chance shoppers will leave and go to a competitor instead. PageSpeed Insights is a great free tool for checking your site’s speed. 

  • Mobile Friendly – in 2021, 72.9 percent of all retail e-commerce is expected to be generated via m-commerce. That’s why your site needs to work and look great on both desktop and mobile. Make sure your homepage, navigation, images, and copy all display correctly on mobile devices. Go through the process of purchasing a product and ensure that process is seamless. Shoppers will immediately switch to a different site if they experience any issues.

Final Thoughts on SEO Shopify Tips

Deploying these tips across your Shopify store can be all the difference in driving real clicks to your website. If you are not quite ready to take your SEO strategy into your own hands, reach out to an e-commerce SEO expert to get started.

For anyone who wants to drive organic traffic to their website from search engines, it’s important to have a strong understanding of the role keywords play in getting your website in front of your target audience. There are thousands of ways that users might search for products or services like yours. So what are keywords and their role in SEO strategy?

What are Keywords in SEO?

Keywords are the words and phrases that people enter into search engines. Searchers use keywords to find what they are looking for, and search engine bots use keywords to understand what kind of content users want.

Another common way of thinking about keywords is the term search intent. Keywords communicate the search intent of the user, for example if they want to make a purchase, compare products, read reviews, or something else.

What Are the Different Types of Keywords?

There are four primary types of search intent every business owner should attempt to address with their web page content:

  • Informational. These come from users who are searching for an answer to a question or want more information on a product or service.
  • Commercial. These come from users researching a specific brand, product, or company before making a decision.
  • Navigational. Consumers looking to find a specific site or page.
  • Transactional. Customers who are looking and are ready to complete a purchase.

Within these four categories, there are two popular subsets of keywords:

  • Head terms: These are short terms of either one or two words.
  • Long-tail: These are longer phrases and can be anywhere from a few words to an entire sentence long.

It’s important to note that more than 70% of all keyword searches are long-tail. The majority of people use conversational queries when using search engines. Publishing multiple pages on your website that target long-tail keywords can be an easy, affordable way to expand your market share.

Using Keywords

When a user performs a search relevant to your business, you want that user to see your website listed in the search results. It doesn’t matter if you have the best products or services on the planet if a potential customer can’t find you.

There are two ways to use keywords to show up in searches:

  • Buy Search Ads (Google Ads, Bing Ads)
  • Optimize for Organic Search

Method for Using Keywords #1: Buying Search Ads

You can show up in the search results for various keywords by creating search ads that will display whenever a user searches for the keywords you’ve selected.

Example of a paid search advertisement 

After you create an ad or ad campaign, the platform you are using (ex: Google Ads) will ask you to enter your target keywords. Pro tip: if you’re just getting started with ads, less is more when it comes to keyword targeting; it’s best to start with just a handful of high-value terms.

Method for Using Keywords #2: Optimize for Organic Search

The other way to show up in the search results for a keyword is to optimize your site for that keyword. This is a two-part approach: first, make sure your site has content relevant to that keyword; second, get other sites linking back to yours referencing that content.

Buying ads costs direct money (you pay per click), and optimizing for organic search costs indirect money (the time and effort to create content and optimize pages). So you want to narrow your focus to just the keywords that are likely to provide your business the most value.

Remember: ads can be quick wins, but they will cost you more over time; they’re like renting space in the search results. SEO investments may take slightly more time, but you’re building search equity and earning your spot more permanently.

 

Keyword Metrics

Let’s put search intent aside for a moment, and look at the other ways we can establish which keywords are good targets for your business.

Here are examples of important keyword metrics as seen in the Keyword Researcher. Let’s walk through what each of these metrics are and what they mean.

Keyword Difficulty (KD)

This is a metric that measures how difficult it is to rank in search results for this term organically. The scale goes from 1-100 with 100 being most difficult. Websites with higher Domain Authority stand a better chance of ranking for more competitive keywords.

How do I use this info?

The best way to use Keyword Difficulty is as a part of your decision for whether or not you’re going to target the keyword with organic optimization.

  • Terms with a keyword difficulty of 35 or less are usually relatively easy to move with onsite and offsite SEO improvements.
  • Terms with a keyword difficulty of 60 or more are going to take a LOT of SEO investment if you want your site to show up on the first page of search results.

If the Keyword Difficulty score is really high, but the CPC is really low, you may be better off buying ads than trying to rank organically.

Example from the Above Keyword Searches

The keyword enterprise seo has a KD of 81. This is a very competitive keyword to rank for organically, so for a website just starting out, this keyword would likely be too competitive to target.

However, the keyword enterprise seo services has a KD of 54, meaning less competition for organic positions with very similar search intent.

Cost Per Click (CPC)

This metric is how much advertisers are paying to target the keyword in Google Ads campaigns. A high CPC tells you that other businesses think searches using this keyword will result in a sale. The higher the CPC, the more valuable the sale is to other businesses.

How do I use this info?

For paid media, the benefit of low CPCs is obvious — lower costs. But for SEO campaigns, CPC can be used to help determine keywords that are good candidates for organic optimization.

  • Low CPCs indicate keywords with more informational or navigational search intent. Low CPCs equate to lower conversion potential.
  • High CPCs indicate keywords with more transactional or commercial search intent. High CPCs equate to higher conversion potential.

Using CPC in conjunction with KD is a great way to find organic keyword targets that are realistic to rank for but still have strong buying intent.

Example from the Above Searches

The keyword enterprise seo services has lower keyword difficulty, but the highest CPC on the above list. If your brand invested in on-page SEO for your landing page, off-site SEO, and some supporting blog content, your brand could probably rank on the first page of search results for this term within 3-6 months. In the long run, that might be more cost-effective than paying for search ads over time.

Search Volume

This metric tells you how many users are performing searches for the keyword each month. A high search volume tells you that if you rank on the first page of the search results for the keyword, you are likely to get a lot of users hitting your site.

How do I use this info?

If you’re looking at two keywords that have similar keyword difficulties, similar CPCs, similar search intent, but one has higher search volume – that’s the one you should target with organic search optimizations (i.e. improving your onsite content to target that term).

Example from the Above Searches

All of the above searches have a meaningful search volume. The significantly higher Keyword Difficulty scores for the keywords enterprise seo and seo for enterprise are likely due to their higher search volumes. Although ranking for those keywords would be great, remember that they are more competitive. So for organic SEO purposes, targeting the terms with 480 and 500 search volumes can still have tremendous benefits.

The true VALUE of a keyword is the number of converting users that keyword can draw into your site. The number of users is measured by search volume and the conversion likelihood is measured by CPCs and search intent.

Remember – don’t get lost in the data! A keyword could have great volume, high CPC, low difficulty, and STILL be a bad choice for your specific business if the term has nothing to do with what you sell or offer or the content on your website.

High-value keywords are ones that bring CONVERTING users to your site.

4 Rules for Choosing Keywords for SEO

Rule #1: Always try to target keywords that are likely to bring users to your site who convert.

Indicators that a keyword is likely to convert:

  • High CPC
  • Very specific (long-tail)
  • Aligns with your products/services
  • Clear search intent
  • You’ve seen the term convert for your site already (through Google Analytics, Google Ads, or Google Search Console)

Rule #2: Try to pick keywords that will pull the MOST converting users into your site.

500 conversions are better than 50 conversions. Remember, even in the best-case scenario, ranking in position 1 of the search results will earn a click-through rate averaging about 33% of all traffic for the keyword (e.g., a keyword with 1,000 monthly searches would yield roughly 330 clicks from position 1). For example, here is the traffic share by position for the keyword enterprise SEO.

Make sure you choose keywords that have meaningful search volume, otherwise you will be optimizing your website for no one.

Rule #3: Always do extensive keyword research before you invest in organic optimizations.

You can use keyword research tools like Google Keyword Planner or our dashboard to do keyword research. Ideally, each one of the landing pages on your website should target a different keyword, meaning you’ll need to find multiple relevant keywords to deploy a comprehensive SEO strategy.

You can also use the Content Researcher tool in your dashboard to identify whether or not a keyword is a realistic target for your website. Here’s a video tutorial walking you through the process.

Rule #4: Spend wisely – know when to invest in SEO improvements vs. paid ads.

If you are still in the early stages of determining whether to spend your digital marketing budget on SEO or paid ads, make sure you consider all of the important keyword information above. Ideally, you will want some combination of both strategies.

Invest in SEO Improvements in the following scenarios:

Low-Hanging Fruit
Quick wins.

  • CPC is high
  • Search volume is high
  • Keyword difficulty is low
  • Search intent is clear

Precision Strikes
Quick wins on high-value, low-volume terms.

  • CPC is very high
  • Search intent is clear
  • Keyword difficulty is low
  • Search volume is low

High-Return Investments
These are terms that will take you longer to rank for, but will have a huge impact on your business.

  • CPC is high
  • Search intent is very clear
  • Keyword difficulty is medium or high
  • Search volume is high

Consider buying Ads in the following scenarios:

Work Smarter Not Harder

  • Keyword difficulty is high
  • CPC is low
  • Search intent is clear
  • Note: volume doesn’t matter as much with PPC ads

While We Wait

  • CPC is high
  • Search intent is very clear
  • Keyword difficulty is medium or high
  • Search volume is high
  • You’ve decided to make an SEO investment in the term, but it’s going to be a while before you’re ranking on the first page of the search results.

Our Dashboard’s Keyword Researcher Tool

If you’re interested in learning more about the best keywords to target for your site, sign up for a free account and check out our Keyword Researcher.

Here are some of the key metrics you can check with the help of the Keyword Researcher:

  • Keyword Difficulty, Search Volume, and CPCs
  • Search volume trend over time
  • Search volume by country
  • Top-ranking web pages
  • Suggested keywords, related autocompletes and questions

You can easily add keywords to a list and export to a spreadsheet to share with key members of your team.

If you want to target multiple keywords with every web page on your website, you may want to consider a keyword clustering strategy. If you need help with choosing the right keywords for your SEO campaigns, schedule a meeting with one of our SEO strategists.

If you’re looking for ways to improve the SEO of your website, content pruning is an effective method to consider. 

Content pruning is all about removing any unnecessary content from your website, which can help your website rank higher in search engine results and can also improve user experience.

In this article, you will learn how to do content pruning for SEO, from identifying what needs to be pruned to taking steps to ensure that it’s done correctly. Keep reading to find out more.

What is Content Pruning?

Content pruning involves a thorough review of all content on a website to identify any content that could be seen as irrelevant or low quality from the perspective of search engine algorithms. Once identified, content pruning involves removing those low-quality pages from the website or replacing them with better content.

A part of regular website maintenance is making sure that all of the content on the website is up-to-date and relevant to the website’s goals. This can help ensure that a website has content that is useful to its visitors and provides value to users who arrive from search engines. 

By removing low-quality or outdated content, websites can see improved search visibility for their highest-value, highest-converting web pages.

Why Remove Content From My Website?

Content production takes time and resources. So you may be wondering: Why remove content from my website after all the work it took to create it?

Although it may feel counterintuitive to your content strategy, content pruning can actually have major benefits to your search engine performance. This is even more true for websites that have a robust SEO content strategy and are publishing new web pages on a regular basis.

Some of those benefits include the following:

  • Improve a website’s visibility by allowing search engines to index your best and highest-converting web pages
  • Ensure visitors are presented with the most up-to-date information
  • Provide higher-quality content and a better user experience
  • Prevent visitors from seeing any low-quality pages
  • Ensure your crawl budget is spent on rank-worthy content

Any content that sits on your website that doesn’t pull its weight in either traffic or conversions isn’t actually bringing value to your business.

By taking the time to regularly prune their content, content managers can ensure that their website is performing at its best. That’s why sometimes content pruning is the right choice for your content strategy.

What Makes a Web Page Prunable?

Here are some of the qualities to look for when searching for content on your website that may need to be pruned.

Low-Quality

Bad content can have a negative impact on a website’s rankings. Low-quality content can include pages with duplicate content, thin content, or other shortcomings. It can also include pages that are not user-friendly or are low on useful information. Content pruning can help to identify and remove such pages, allowing search engines to easily index the more relevant, higher-quality content on the website.

Duplicate Content

Duplicate content refers to web pages with the same or similar content that search engine crawlers cannot identify as distinct. Google does not want to see duplicate content on a website unless the preferred version has been clearly identified via a canonical tag.
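When a near-duplicate page has to stay live (a printer-friendly version, a URL with tracking parameters, and so on), a canonical tag tells crawlers which version should be indexed. A minimal sketch, using a hypothetical example.com URL:

```html
<!-- Placed in the <head> of the duplicate or variant page. -->
<!-- Tells search engine crawlers that the preferred, indexable version lives at this URL. -->
<link rel="canonical" href="https://www.example.com/blue-widgets/" />
```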

Thin Content

Thin content is often short-form content that doesn’t provide any real value to the user. Although there is no exact content length that Google privileges in search engine results, experiments have shown that longer, in-depth content tends to rank higher.

Most often, thin content can be combined with other pages on your website to provide a more comprehensive, in-depth answer to a topic, question, or keyword query. Combining content, or redirecting thin pages to more in-depth ones, is also part of the content pruning process.

Outdated Content

The reality is, the content on your website will become outdated over time. This is why creating evergreen content is important; however, it’s unlikely that your long-form content will last forever without needing updates. Trends, technologies, and knowledge change, and web pages should include the most up-to-date, useful information for search engines.

Also, outdated information can be confusing for visitors and lead to a poor user experience. Removing outdated content can ensure that visitors are presented with the most relevant and useful information.

Under-performing Content

If you have a web page that does not get traffic or conversions, what value is it bringing your business? If the page does not rank in search results, convert users, or play a vital part in the buyer journey, it doesn’t really have a place on your website unless you take the time to improve its content.

How to Find Pages for Content Pruning

You can use the Page Pruning tool in the dashboard to discover pages that may be eligible for content pruning.

To find the tool, navigate to Site Auditor > Page Pruning.

This tool will show you any pages that are eligible for pruning due to a variety of reasons:

  • Low organic impressions
  • Low clicks
  • Indexability
  • Total ranking keywords
  • Average position
  • Content quality/scores

Remember, just because a page appears on this list doesn’t mean that it has to be pruned/deleted, but that it may be eligible based on its performance metrics.

Next Steps for Page Pruning

Once you have reviewed the software’s suggestions and confirmed that the pages are eligible for pruning, here are your next options.

1. Improve the Content on the Page

The page’s underperformance may stem from content that is thin or does not provide a comprehensive answer to the user’s questions.

You can look to the “Boostable” tab in the Page Pruning tool to identify those pages that just might need a slight content score boost.

The URLs that are listed here are already earning organic impressions but are not seeing as much success in organic traffic. Most likely, Google sees those pages as relevant but is not ranking them on the first page as a result of the content.  

You can use the SEO Content Assistant in your dashboard to discover ways to strengthen and improve your content. Or, use the on-page audit tool to see what on-page elements may be impacting your performance.

Follow the guided suggestions for focus terms, headings, questions, and internal links. Include them on the page to make the content more rank-worthy.

2. Update the Content to be More Evergreen

If your content covers trends or keywords that have seasonal search volume, that may impact their underperformance.

Consider updating the content with more evergreen information so the content has a longer shelf life in the SERPs.

Also, make sure that the information on the page is up-to-date with accurate, relevant information. Over time, links may break or content may become outdated. Updating your blogs and articles every 1-2 years should be a part of your regular website maintenance.

3. Build Backlinks to the Page

If both the SEO Content Assistant and the on-page audit tool confirm that your content has high scores and is rank-worthy, you may just need a bit more of a link boost.

Backlinks are Google’s number one ranking factor. If you don’t have very many backlinks pointing to your web page, that may be a reason why it is not ranking on the first page.

You can use the backlink research tool in your dashboard to see which of your web pages have the least amount of link equity.

Consider investing in a link-building campaign in order to improve the off-site signals of the content. Doing so is likely to improve the overall keyword rankings, impressions, and organic clicks.

4. Reoptimize the Page for a Different Keyword

Another possible explanation for your content’s poor performance may be keyword related. 

Some keywords are more competitive than others. If you optimized the page for a keyword that is out of reach, reoptimization may be your next step.

When choosing keywords for SEO, you want to make sure your website has a realistic chance of ranking for the target keyword. Websites with higher Domain Authority will stand a better chance of ranking for competitive keywords.

We suggest targeting keywords with a Keyword Difficulty score that is less than or equal to your website’s Domain Authority.

So once you find a more realistic target keyword, optimize for it instead. This will likely involve changing the metadata, website copy, and headings on the page, but it can make a huge difference in improving organic performance.

5. Redirect the Page to a More High-Quality One

A page may be flagged for pruning because Google is ranking a more helpful piece of content on your website.

This is known as keyword cannibalization. It happens when two pieces of content are very similar and Google doesn’t know which to promote. If there is a similar page that ranks less often, you can do your “content pruning” by adding a 301 redirect from the less comprehensive page to the better-performing one.
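On an Apache server, the 301 redirect described above can be added to the site’s .htaccess file; the paths here are hypothetical placeholders, and most CMS platforms offer redirect plugins that accomplish the same thing:

```apache
# Permanently redirect the thinner, cannibalizing page to the stronger one
# so link equity and rankings consolidate on a single URL.
Redirect 301 /old-thin-page/ https://www.example.com/comprehensive-guide/
```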

6. Combine Thin Content into a More Comprehensive Resource

If you have a series of pages that are thin on content but relate to a similar cluster of keywords, consider combining those pages into a more useful, long-form resource.

Why? Because Google likes to promote content that satisfies users’ search intent. That means not only answering their initial questions but all of the additional questions that might follow regarding that primary topic. 

So before giving up on that set of keywords entirely, combine those various pages into one page. Then, see if the overall keyword rankings improve.

7. Consider Removing the Page Entirely

This is the last step you want to consider after you have concluded that none of the above steps help to elevate content performance.

The reality is, if a piece of content is not driving website traffic, converting users, or serving as an essential part of the buying journey, it doesn’t really deserve space on your website.

Take the ultimate step and trim that content branch off your tree.

Conclusion

Making content pruning a regular part of your website maintenance is a good habit to get into. This is especially true for websites that publish a lot of content and have a robust SEO strategy.

There is so much that goes into launching a new website. Purchasing a domain name, finding a great web designer, and writing website copy are all essential steps in a successful website launch. Most site owners invest a lot of money and time into the design and readability of their websites. But without any website traffic, no one will even see the high-quality website they have worked so hard to build. 

That’s why SEO is so important for new websites. Executing a careful SEO process with your new URL can help you earn keyword rankings, protect against Google penalties, and save you time in the long run. Starting with a good SEO plan can prevent site owners from having to change foundational elements of their website down the road, like the URL structure, pillar pages, or overall site architecture.

Those new site owners who implement the best practices of search engine optimization are better situated to quickly show up in search results and start earning new site visitors and customers. For new site owners who want their hard work to be rewarded with organic traffic, these 10 SEO practices are the best approach for building your new website on a solid foundation.

If you don’t have the resources in-house, connect with SEO professionals to ensure your new website has what it takes to launch straight into the search results.

New Website SEO Checklists:

Keyword Research | Site Architecture | On-Page SEO | SEO Tracking | Page Experience | XML Sitemap | Local SEO | Link Building | PPC | Content Strategy

#1: Do your Keyword Homework for your New Website

Any new business or website entering a niche market needs to have a concrete understanding of their target audience. In the world of SEO, businesses connect with their target audiences through keywords and search terms. Keyword Research is the process of identifying and selecting the search terms that your target audience is already using to search for products, services, and content like yours. 

Although getting traffic from search is the ultimate goal, increased traffic is not helpful if the visitors who arrive at your new site aren’t actually in the market for the products or services your brand offers. If you select keywords that are irrelevant to your industry niche, organic traffic is unlikely to translate into increased conversions. If you select keywords with low search volume, it is unlikely you will drive many clicks simply from a lack of impressions. 

The best keywords, then, will be those that your website stands a strong chance of ranking for and converting from. Keywords with higher CPCs are more likely to drive qualified traffic and leads to your website. Why? Because if there are companies willing to pay top dollar to rank in a PPC campaign for specific keywords, it’s likely the individuals using those search terms are converting.

It’s also important to consider the keyword difficulty of your target keywords. One of the most common mistakes that new site owners make is targeting keywords that are far too competitive. Like any new business or product, your website has to prove its value and usefulness to internet users over time. Choosing keywords with SERPs that are dominated by well-established websites with huge backlink profiles is not a smart strategy for new websites. Instead, identify longer-tail keyword phrases that are less competitive and provide opportunities for your new website to start ranking right now.

In this example, “business intelligence services,” is a great keyword target because it has lower organic difficulty, reasonable search volume, and higher cost-per-click

As you grow your backlink profile and domain authority over time, you can reoptimize your landing pages and web content for those more competitive search terms. The majority of searchers never make it past the first page of the search results, so if your website is unlikely to get to page 1 for your main keyword target, it’s best to go back to the drawing board and find a more realistic keyword.  

Keyword Research Checklist for New website SEO:

  • Use a keyword tool to identify the high-value keywords in your industry that will bring qualified traffic to your new website
  • Choose keywords that have a strong search volume, a lower keyword difficulty, and higher cost-per-click
  • Optimize your pillar pages — home page, services pages, or category pages — for those primary keywords
  • Develop a content strategy so you can continue to target other relevant search terms or long-tail keywords to earn more search engine rankings and more traffic in the long-term

#2: Build an SEO-Friendly Site Architecture

Site architecture refers to the way your new website is structured. Not only is site architecture essential for users to navigate your website with ease, it’s also important for the search engine bots that crawl and index your landing pages. Because your site’s architecture is an essential part of whether or not your landing pages show up in Google search results, it’s important to identify the primary keywords you want to rank for and structure your website accordingly. 

The homepage is the natural starting point of your new website, but it’s not necessarily the first page users will come across in search results. A strategic site architecture will give new site visitors strong footing regardless of which page they land on first. Users should always know where they are in your website and where they need to click for the content they seek next. 

A common metaphor for site architecture is the pyramid. Your homepage forms the top of the pyramid, and beneath it sit the primary category pages of your site. These category pages should be able to encompass all of the content on your website. Beneath those category pages, you will have the specific pages that naturally fit within each category (e.g., for ecommerce sites, product pages). Your URL paths should not only reflect your site architecture, but should also be short and keyword-rich to be as SEO-friendly as possible.

The header and footer of your new site should include links to key pages so users can quickly navigate to important, valuable content. Your internal linking structure will provide the pathways for Google’s crawlers as well, and it will also distribute PageRank across your website, an important Google Ranking factor. Having breadcrumbs on your site will also help users and Google bots understand your website’s architecture. 
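Breadcrumbs can also be described to crawlers explicitly with structured data. A minimal sketch using schema.org’s BreadcrumbList in JSON-LD, with hypothetical example.com pages:

```html
<!-- Visible breadcrumb trail for users -->
<nav>
  <a href="https://www.example.com/">Home</a> &gt;
  <a href="https://www.example.com/services/">Services</a> &gt;
  <span>SEO Audits</span>
</nav>

<!-- Matching structured data so crawlers understand the site hierarchy -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://www.example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Services", "item": "https://www.example.com/services/" },
    { "@type": "ListItem", "position": 3, "name": "SEO Audits" }
  ]
}
</script>
```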

All of these SEO site structure tips can be handled by your web developer on the backend of your website. Popular CMS platforms like WordPress make it simple to add breadcrumbs. If you don’t think about these structural elements early on, you could be forced to make changes down the road that require an extensive site migration with redirects, URL changes, or even a website redesign. This can cost money and time, and can even cause a loss of organic search rankings.

Site Architecture Checklist for New Websites:

  • Structure your website with your homepage at the top, category pages beneath the homepage, and then individual pages or product pages within those categories 
  • Keep your URLs short and keyword-rich
  • Link the most important pages of your website in your header and footer
  • Find CMS plugins to set up breadcrumbs so users and search engine robots can easily understand how your website is structured

#3: Cover your On-Page SEO Basics

Once you have identified the keywords your ideal audience is using, you need to optimize your content for those keywords. On-Page content includes all of the visible and invisible content of your website. Google crawls all of it to understand the quality, relevance, and authority of your content in relation to your competitors.

Optimizing your landing page copy is easy with a tool like the Landing Page Optimizer. Enter your target keyword into the tool, and our software will scan the top ranking SERP results to identify key terms and phrases that Google associates with the search term. Add the suggested Focus Terms into your content to improve the topical depth, overall content score, and elevate your chances of ranking.

If you’re using a popular CMS like WordPress, Magento, or Shopify, optimizing the backend of your website is fairly simple. Still, make sure that the web developer you are working with has SEO knowledge and can ensure the HTML elements of your website, like your page titles, meta descriptions, and heading tags, are optimized for your target keywords. Even better, have them use schema.org markup on the backend of your new website so Google crawlers can more easily crawl and index your web pages and improve the quality of your search appearances. 
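Those core HTML elements look something like the following; the keyword, agency name, and image path are hypothetical placeholders:

```html
<head>
  <!-- Page title: lead with the target keyword, keep it under roughly 60 characters -->
  <title>Enterprise SEO Services | Example Agency</title>
  <!-- Meta description: a compelling summary that can appear in search results -->
  <meta name="description" content="Scalable enterprise SEO services that grow organic traffic for large websites.">
</head>
<body>
  <!-- One H1 per page, optimized for the primary keyword -->
  <h1>Enterprise SEO Services</h1>
  <!-- Subheadings target related phrases and structure the content -->
  <h2>What Our Enterprise SEO Services Include</h2>
  <!-- Descriptive alt text helps crawlers understand images -->
  <img src="/images/seo-dashboard.png" alt="Enterprise SEO reporting dashboard">
</body>
```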

As you add new landing pages to your website, optimize those pages for other relevant keywords or long-tail phrases that will expand your market and increase your overall keyword rankings. Also, make sure that each piece of content you create has good information architecture that includes internal links back to your primary category pages. By doing so, you will help Google better understand how all of your web pages relate. 

On-Page SEO Checklist for New Websites:

  • Use the Landing Page Optimizer SEO tool and include the suggested Focus Terms in your website copy. Get your content score over 80 if you want to rank
  • Demonstrate topical-depth and authority with long-form, original content 
  • Optimize title tags, meta descriptions, heading tags, and alt text for your target keywords
  • Use schema.org to improve your overall search results

Learn more about the best practices of on-page content optimization with our comprehensive ebook

#4: Start Tracking the Performance of your New Website SEO

There are many different tools and software products that you can use to track the performance of your website, but Google Analytics and Google Search Console are the primary tools every site owner should set up to track how their website is performing. With the key data and metrics these free tools provide, you can start understanding how users are interacting with your new site and how Google crawlers are ranking your web pages.

Google Analytics allows you to understand where your website traffic comes from, how long users stay on your website, how many pages they view, and other key metrics. With Google Search Console, you can measure your keyword rankings and search engine performance. Metrics like impressions, clicks, and click-through-rates can help you understand which content on your website is performing best in search engines and which pages are driving the most traffic to your website.

There are other tools and software that can help you monitor your overall SEO strategy. Once you begin to acquire backlinks for your new website, a tool like the backlink analyzer can help you identify spammy or toxic backlinks, perform competitor benchmarking, or identify new link building opportunities.

New Website SEO Tracking Checklist:

  • The first step is to set up Google Analytics and Google Search Console Accounts for your domain name
  • For B2Bs, link your Google Analytics account to your CRM platform to track your leads coming from organic search
  • Set up your account to access the SEO dashboard and start tracking your SEO performance

#5: Make sure your New Website is High-Performing

An optimized website with high-quality content and high-performing UI/UX is key to driving organic traffic to your site. Users are going to press the back button on their browsers if your web pages take too long to load, don’t perform well on mobile devices, or don’t provide a high-quality user experience as they click from landing page to landing page. 

In 2021, the page experience of a website became a factor in Google’s ranking algorithms. Google’s crawlers evaluate the page experience and usability of your new website primarily through the Core Web Vitals metrics, which measure speed, load times, interactivity, and visual stability. 

You can monitor your new website’s Core Web Vitals in your Google Search Console account. GSC will provide a score of “Good”, “Needs Improvement”, or “Poor” for every page of your website. If you see multiple pages with a “Poor” rating, you may need to make some upgrades to your web hosting, web design, or other technical aspects of your website. 

Page Experience Checklist for New Websites: 

  • Make sure your website performs well across all devices — mobile, tablets, and desktop — and that all webpages, visuals, graphics, and animations load promptly
  • Use Google’s Free PageSpeed Insights Tool to check your page speed
  • Monitor your Core Web Vitals in Google Search Console and make any necessary improvements
  • Invest in a quality UI/UX designer to improve the overall page experience of your site 
  • Read this blog post for some tips on how to make your website faster

#6: Submit an XML Sitemap so Google Understands your New Website

The more new pages you add to your website, the more important an XML sitemap will be. Sitemaps communicate to Google crawlers which web pages of your website are the most important, make it easier for Google to find and crawl all of your pages, and help Google better understand the hierarchy of your website. 
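At its core, an XML sitemap is just a list of the URLs you want crawled. A minimal sketch following the sitemaps.org protocol, with hypothetical URLs and dates:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want Google to crawl and index -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2023-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services/</loc>
    <lastmod>2023-01-10</lastmod>
  </url>
</urlset>
```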

There are tools that can help you make your own sitemap, but consulting with an SEO expert is a good idea to make sure you accurately identify the highest-performing, most important pages of your website.

For larger and enterprise-level websites with hundreds or thousands of landing pages, there may be many pages that don’t generate conversions and don’t need to show up in search results. An SEO expert can implement noindex tags to ensure Google doesn’t rank those pages in its search engine results pages.
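A noindex directive is a single robots meta tag in the page’s head (it can also be sent as an X-Robots-Tag HTTP header):

```html
<!-- Allows crawlers to fetch the page but keeps it out of search results -->
<meta name="robots" content="noindex">
```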

Once your sitemap is complete, submit it in Google Search Console. GSC will notify you of crawl errors so you can make adjustments accordingly.

XML Sitemap Checklist for New Websites:

  • Identify the most important pages on your website that you want Google to crawl, index, and prioritize in keyword rankings
  • Create your XML sitemap with the help of a sitemap generation tool or SEO professional
  • Upload your sitemap into Google Search Console
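A minimal XML sitemap follows the sitemaps.org protocol; the URLs and dates below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2021-06-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services/</loc>
    <lastmod>2021-06-01</lastmod>
  </url>
</urlset>
```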

#7: Create Profiles on Third-Party Review Sites and Get Local Citations

After your on-page optimization is complete, the next step is to work on implementing an off-site SEO strategy for your new website. Off-site SEO is all about creating signals on other websites that show search engines like Google and Bing that your new website is trusted by lots of people on the internet. These off-site signals play a big part in Google’s ranking algorithm. 

An easy way for your website to start building these signals is to get listed across business directories in your industry. For local businesses, local citations with up-to-date, accurate information about your business are particularly important for appearing in location-based searches. A local citation service can get your new website and business information listed across hundreds of business and online directories really quickly.

And for websites of all sizes, not just local businesses, creating profiles on the most important third-party review sites in your industry is essential for building your off-site signals. Google crawls and aggregates data from these sites and considers the quality of your reviews in its ranking algorithms. Make sure you monitor them regularly for any negative content.

It’s also a good idea to incentivize customers you’re confident had a positive experience to leave reviews. Point them toward the review sites that will have the most impact and are likely to show up in the top-ranking spots for searches with your brand name.

Local SEO Checklist for New Websites:

  • Create business profiles on the important review sites in your industry
  • Sign up for a local citation service to get your site listed across thousands of business directories
  • Incentivize customers to leave positive reviews to build offsite signals on the most important review sites

#8: Launch a Link Building Campaign

One of the most important search engine ranking factors is your overall site authority. When your new website launches, your Domain Authority score will be zero, because you have not yet earned a reputation among internet users, search engines, and other webmasters. If you want to improve your Domain Authority score, you’ll need to earn backlinks.

Google measures site authority primarily through backlinks, or links to your website from other web pages. The logic goes that if another website links to yours, they must find your content trustworthy and valuable. Earning links on other websites is one of the most important parts of any off-site SEO strategy, and it’s commonly known as link building. 

Creating original content that includes links to your website, and pitching that content to other websites, is one of the best ways to earn backlinks in a way that is Google compliant. Webmasters are always on the hunt for good content, and links that come from reputable websites in your industry will be more valuable and move the needle on your site authority even faster.

There are many link building strategies out there. From guest blog posts to broken link building, getting creative is key in earning new links that are outside of your existing network. Building up your backlink profile is one of the most essential steps in improving your keyword rankings and driving organic traffic to your new site.

Link Building Checklist for New Websites:

  • Make sure you have high-quality content on your website that is worthy of links!
  • Start getting traction with off-site SEO by asking site owners in your own personal network to build links to your new site
  • Outreach to other reputable site owners in your industry to identify link building opportunities or guest blogger partnerships
  • Launch a link building campaign with us to earn high-quality links that have contextual anchor text and are earned through original content

#9: Launch a PPC Campaign to Get Traffic Right Away and Improve your SEO

The reality is, search engine optimization takes time. Getting your web pages ranking organically for competitive keywords does not happen overnight. It will likely take months for your new site to establish its reputation with search engines. So in the meantime, a carefully executed PPC campaign can help you start generating clicks, traffic, leads, and revenue while you continue to do the work of link building and content creation.

A good way to think about PPC campaigns is like “renting” traffic: once your budget runs out, those clicks stop coming. Overall, SEO is still a better long-term strategy than Google Ads campaigns because owning those top organic spots means more traffic over time. For this reason, you can use your PPC campaigns to prospect traffic quality for SEO. If the keywords you pay to rank for result in conversions, it will be worth the effort to try to rank organically for those same terms.

PPC can get very expensive, though, so before you devote the majority of your new website’s marketing budget to PPC, make sure you are working with a reputable digital marketing agency that has a proven track record of creating optimized PPC campaigns. 

PPC Checklist for New Website SEO:

  • Identify the keyword terms that will drive qualified traffic and leads to your new site
  • Optimize your ads for relevance and conversion so you can improve your Quality Score and pay lower CPCs in the ad auction
  • Use your PPC campaigns to prospect traffic quality, and then redeploy those keywords in your SEO campaigns

#10: Develop an SEO Content Strategy for your New Website

Most site owners launch their new website with a homepage, their primary category pages, and other key product pages. However, if you never add new pages to your website, you will struggle to drive traffic for the long-term. Internet users are always looking for new content, and if your website doesn’t offer anything new, users will quickly grow uninterested and will probably not return to your site.

Not only do new landing pages mean additional keyword rankings, but multiple studies have shown that websites with over 40 landing pages have higher conversion rates. Having many long-form, high-quality landing pages that are valuable and useful to your site visitors gives those visitors more reasons to navigate your site without having to bounce back to the SERPs or search for content somewhere else. 

Blog posts are an ideal way to constantly create new content that targets long-tail keywords, creates content marketing opportunities, and builds authority, expertise, and trustworthiness. A strong content strategy also benefits your off-site SEO, because great content assets that live permanently on your website give more reasons for others to link to your new site. Overall, a strong content development strategy is essential to moving your website from launch to long-term traffic. 

Content Strategy Checklist for New Websites:

  • Create an editorial calendar and start publishing more content on your website on a consistent basis
  • Use new web content and landing pages to optimize for other related keywords you hope to rank for
  • Promote your content through other channels like email marketing, social media campaigns on LinkedIn and Facebook, or by pitching through PR and link building campaigns.

We all know that keywords are essential when it comes to optimizing your website for better rankings in search engines. It was once common practice to choose one keyword to optimize each piece of content for. But with newer, more complex search engine technology, optimizing for keyword clusters has become standard SEO practice.

Google has gotten better at understanding content, so your keyword strategy also needs to get more advanced. Using a keyword cluster model to drive your overall content strategy can help bolster your search engine results and help you outrank competitors.

Here is a guide on everything you should know about keyword clustering.

What is a Keyword Cluster?

In short, keyword clusters are keywords that represent searches with similar consumer intent.

Also known as keyword groupings, these “bunches” of keywords are paired together because they represent the same, overarching intention.

Users don’t always search for products, services, or answers to their questions in the same way. For example, say your business is an e-commerce brand that sells women’s athletic shoes. Users might search for your products in any of the following ways:

  • “women’s athletic shoes”
  • “running shoes for women”
  • “best athletic shoes for women”
  • “buy women’s sneakers online”

All of the above search queries display the same intent, which is to purchase athletic shoes for women. If you only optimize your content for one of these terms, you’ll miss out on thousands of users who are looking for products and services like yours.

The reality is, Google usually ends up ranking your web pages for multiple keyword phrases. With a keyword cluster model, you can be more strategic in making sure Google ranks your web pages for the similar terms your target audience is actually using.

How to Create a Keyword Cluster

Creating a keyword cluster involves being more thorough with your research and more strategic about keyword targeting. It requires a strong understanding of your audience and the types of terms they use to find products, services, or content like yours.

1. Do Your Keyword Research

Any SEO professional understands that before you start with any type of keyword targeting, you must do your research.

You not only need to see what keywords users are searching for around your search term or topic. You also need to know which of those terms are the most valuable and display the greatest conversion potential.

Example of Keyword Research in Google Sheets

And when we say research, we don’t just mean finding a few keywords. Done correctly, extensive keyword research involves putting together a list of hundreds to even thousands of keywords that might bring potential customers to your website.

When thinking about what kinds of keywords to add to your list, ask yourself the following questions:

  • What products and services do I offer?
  • What problems can I solve for my customers?
  • Why would a consumer choose my company over my competitors?
  • If I was brand new to the industry, what words would I be searching to learn more?
  • What type of customers do I usually get?
  • Do I have any content currently that can answer users’ most common questions?

Once you have some ideas about what primary keywords and overarching topics you want to target, take your time to identify as many variations of the keyword and topic as possible.

This means all long-tail keyword phrases, pillar topics, synonyms, and related subtopics. While there is no perfect number to shoot for, when you finish your keyword research you should have a couple hundred keywords to work from. This will give you a good number to help you build out multiple keyword clusters.
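One simple way to rough out clusters from a raw keyword list is to group terms that share enough words in common. The keywords, token-overlap threshold, and apostrophe handling below are all illustrative; real clustering tools use more sophisticated semantic matching:

```python
KEYWORDS = [
    "women's running shoes",
    "running shoes for women",
    "best women's athletic shoes",
    "buy athletic shoes online",
    "women's trail running shoes",
]

def cluster_by_overlap(keywords, min_shared=2):
    """Group keywords that share at least `min_shared` tokens with a cluster."""
    clusters = []
    for kw in keywords:
        # Normalize: lowercase and drop possessive "'s" before tokenizing.
        tokens = set(kw.lower().replace("'s", "").split())
        for cluster in clusters:
            if len(tokens & cluster["tokens"]) >= min_shared:
                cluster["keywords"].append(kw)
                cluster["tokens"] |= tokens  # grow the cluster's vocabulary
                break
        else:
            clusters.append({"tokens": tokens, "keywords": [kw]})
    return [c["keywords"] for c in clusters]
```

Run on the sample list above, all five phrases fall into a single cluster, which matches the intuition that they all express the same purchase intent.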

2. Use a Keyword Tool

To find essential keyword metrics for the many keywords you’ll need in your list, you will want to use a tool like our keyword researcher. Features like “Suggested Keywords” will help you find similar keywords quickly and then easily add them to a list that you can eventually export to a CSV file.

Our Keyword Researcher tool simplifies your keyword strategy by giving you all the data and insights you will need to create keyword clusters to improve your website’s content.

Our tool will provide the data for:

  • Monthly Search Volume: The average number of users who enter the keyword into the Google search bar every month.
  • Keyword Difficulty: The competitive landscape of the keyword on a 0-100 scale. Higher Keyword Difficulty scores mean more competition.
  • Search Volume by Country: If you need the search metrics for a specific country or region, the tool can be filtered by country.
  • Cost-per-click: The price that digital advertisers are paying in Google Ads to rank their website at the top of the SERPs. The higher the CPC, the higher the chances that the keyword will bring qualified traffic your way.

3. Identify Themes and Groupings

Once your list of keywords is complete, take your list and identify similar themes. Chances are, you already noticed some themes pop up while you were collecting your research.

The various patterns you might see will guide your keyword clusters. Some examples to look for are:

Relevance

This goes back to natural language processing. Are there certain groups of words that are synonyms and share the same search intent? The more similar the keywords, the easier it is for Google to crawl your landing page and gain insight on its subject matter.

Search Volume

The core keywords in your clusters need to have a reasonable search volume. This shows that users are actually searching for those terms.

While long-tail phrases will naturally have lower search volume due to their specificity, make sure any long-tail terms you include in your cluster still display strong conversion potential in their cost-per-click metrics.

Most people know what keywords are, but understanding how to choose keywords can be a bit more mysterious. When presented with search volume, cost-per-click, and keyword difficulty, it’s easy to feel overwhelmed, especially when you are creating your content strategy. When it comes to keyword metrics, keyword difficulty is often considered the most challenging to understand.

So, what is keyword difficulty, and how does keyword difficulty factor into your SEO content strategy? This article will explain how this metric is calculated and how you can use keyword difficulty to improve your website’s search visibility.

What Is Keyword Difficulty?

Keyword difficulty is a metric used in search engine optimization that estimates how much of a challenge it would be to rank for that keyword. Some platforms use the terms SEO difficulty and keyword competition instead of keyword difficulty.

The higher a keyword’s difficulty score, the harder it will be to appear on the first page of Google’s search engine results page (SERP). A high score means there is fierce competition among the websites already ranking for that keyword.

Keyword difficulty is measured on a scale of 0 to 100. The closer to 100, the more difficult the keyword would be to rank for. The nearer to 0, the easier the keyword is to rank for.

How Is Keyword Difficulty Calculated?

So, where does this number come from? Most SEO tools use a range of inputs to calculate the score, including the ratings of the competing domains and your own domain rating.

We use a weighted formula that takes into account the rating of the domains that rank for the specific keyword, the traffic share that frequents the top-ranking SERPs, and other nuanced factors. Domain rating is calculated by the number and quality of backlinks to that particular website.
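The exact formula is proprietary, but a hedged sketch of how a weighted blend of domain ratings and traffic share could produce a 0-100 score might look like this. The weights, inputs, and function name here are hypothetical, not our actual formula:

```python
def keyword_difficulty(top_domain_ratings, traffic_shares, w_dr=0.7, w_ts=0.3):
    """Illustrative keyword difficulty score (0-100).

    top_domain_ratings: domain ratings (0-100) of the top-ranking pages.
    traffic_shares: fraction of SERP traffic each of those pages captures.
    The weights are hypothetical; real tools use proprietary formulas.
    """
    avg_dr = sum(top_domain_ratings) / len(top_domain_ratings)
    # Concentration of traffic in the top result: high concentration
    # suggests entrenched competitors, so it pushes the score up.
    concentration = max(traffic_shares)
    score = w_dr * avg_dr + w_ts * (concentration * 100)
    return round(min(max(score, 0), 100))
```

For a SERP dominated by strong domains (ratings of 80, 70, and 90) where the top result takes half the traffic, this sketch yields a difficulty score of 71, i.e. a “Hard” keyword.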

Why Does Keyword Difficulty Matter in SEO?

You want to compete in your own weight class. If your competition is a domain rating giant, your content won’t be able to compete, and it will become buried in the SERPs. Keyword difficulty helps you identify the keywords you can realistically be discovered for on Google. This, of course, leads to higher traffic and increased conversions.

Furthermore, keyword difficulty allows you to strategically invest your time, money, and other resources in efforts that will pay off. This makes creating an SEO content strategy easier.

How to Choose the Best Keywords for Your Website

Choosing the right keywords is key to SEO success. When selecting what keywords you can rank for, you will want to take into account keyword difficulty, search volume, CPC, and search intent.

1. Evaluate Keywords to Rank For & Form a Winning SEO Strategy

The first step to creating a successful SEO content campaign is narrowing down where to focus your efforts. This is part of the beauty of keyword difficulty: it allows you to identify words you can realistically rank for, saving you time and earning you organic traffic.

Pillar Pages & Cluster Content

Most SEO content specialists suggest starting with pillar pages, then clustering supporting target keywords around them. A pillar page is a long-form guide, landing page, or blog that focuses on a primary or core aspect of your business or website. Cluster content relates to the pillar topic but gives you the opportunity to expand upon smaller details or aspects of the pillar topic.

You should have a handful of pillar pages that relate to your industry. You can add cluster content as needed and as new keyword opportunities arise. 

A pillar structure allows Google’s web crawlers to more easily assess your website’s semantic content and give you kudos for overall topical depth.

For example, if you run a spa, your pillar topics will feature the primary aspects of your business. These may be massage, skincare, acupuncture, hot stone treatments, and waxing. From there, you can expand upon the topic. For example, you can have cluster content pieces for the benefits of each individual type of massage.

Once you’ve defined your pillar topics, you can then begin to create cluster content ideas based on keyword research.
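The spa example above can be sketched as a simple pillar-to-cluster map. The URLs are hypothetical, but a structure like this makes it easy to plan internal links from each cluster page back to its pillar:

```python
# Hypothetical pillar pages mapped to their cluster content URLs
SITE_STRUCTURE = {
    "/massage/": [
        "/massage/deep-tissue-benefits/",
        "/massage/hot-stone-vs-swedish/",
    ],
    "/skincare/": [
        "/skincare/chemical-peels-explained/",
        "/skincare/facial-aftercare-tips/",
    ],
}

def pillar_for(cluster_url):
    """Find the pillar page a cluster URL should link back to."""
    for pillar, clusters in SITE_STRUCTURE.items():
        if cluster_url in clusters:
            return pillar
    return None
```

Keeping the map in one place means every new cluster page automatically knows which pillar it supports.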

2. Perform Keyword Research

With established pillar topics, you can begin your keyword research for your content clusters. When performing your research, your goal is to find keywords, or search terms, that are the most relevant to your business and that you will be able to rank for. 

Often the simplest way to choose keywords is to list subtopics for your pillar pages. Then, plug those topics into a keyword research tool to explore potential keyword choices.

The SEO suite allows you to quickly perform research and nail down your supporting keywords. Within the suite, the Keyword Discovery tool is an excellent choice for this step.

Simply type what keyword you want to research into the search box. Then, explore suggested keywords by viewing all. If you have a brick-and-mortar business or only ship within one geographic zone, you will want to refine your keyword location.

Review the list of keywords and select keywords by Search Volume (SV) and Pay-Per-Click Difficulty (PPCD), which is similar to keyword difficulty but takes into account using a paid campaign. 

Select keywords with a difficulty score that is lower than your domain rating or domain authority. Keep in mind that low-competition keywords have low difficulty scores. This ensures you have the best chance of appearing in the search results for that keyword. Watch this video for more information on how to use Keyword Difficulty and Domain Authority in your keyword selection.

3. Turn Your Research into Keyword Optimized Content

Once you have a list of keywords you want to create cluster pages around, you can use SEO content creation tools to optimize for those keywords.

In the suite, navigate to the Content Assistant tool. Then select “create a new article.” Enter your selected keyword into the Add keyword field. The Content Assistant will run that keyword through our proprietary analyses and display the keyword’s:

  • Search Volume (SV)
  • Keyword Difficulty (KD)
  • Cost-Per-Click (CPC)

If you hover over the keyword, you can select the graph icon to display a full report of the keyword’s metrics.

You will also find suggested target keywords you can add to the same article. You can add up to 5 target keywords per piece of content.

From there, you will be given a list of focus terms to include that relate to each particular keyword. These terms help improve the value of your content for search engine algorithms.

Keyword Difficulty Scale

With the keyword researcher tool, you will find the keyword difficulty scale prominently displayed below the keyword you are researching. This scale is color-coded by difficulty level and labels the difficulty score of the particular keyword.

KD Difficulty Scale

0-25 = Easy

26-50 = Average

51-75 = Hard

76-100 = Very Hard
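The scale above maps directly to a small lookup function, useful if you are bucketing a keyword list exported to CSV:

```python
def kd_label(score):
    """Map a 0-100 keyword difficulty score to the scale's label."""
    if not 0 <= score <= 100:
        raise ValueError("keyword difficulty is scored 0-100")
    if score <= 25:
        return "Easy"
    if score <= 50:
        return "Average"
    if score <= 75:
        return "Hard"
    return "Very Hard"
```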

Keyword Difficulty Tools

A simple search will turn up a wide array of keyword difficulty tools. While each tool will use a slightly different method of calculating keyword difficulty scores, you will gain insight from whichever you choose. When comparing competitor software, keep in mind that you want to choose the best tool for your overall SEO needs. This often includes:

  • An intuitive user interface, including a keyword overview
  • A reliable keyword research tool that displays volume, keyword difficulty (KD), PPC, and KW competitors
  • The ability to change keyword difficulty metrics location to another country or region
  • Page authority for competing URLs and domain authority scores
  • Backlinks profile analysis
  • Which keywords should appear in headings
  • Content idea suggestions based on keywords
  • A keyword mosaic to easily visualize how your competitors use your targeted keywords
  • A content creation tool to ensure your content is optimized for search results potential and content quality

Keyword Difficulty Tool

Our Dashboard offers a full suite of SEO tools, including several that help users select competitive keywords they have the highest chance of appearing in the top organic search results for.

There, you will find:

  • The SEO Content Assistant helps you optimize your content around the right keywords.
  • The Content Ideas tool helps you plan content that engages your target audience.
  • The Content Researcher displays your ranking potential for keywords, KD, a terms grid, keyword discovery, the content score for top-ranking URLs, SV, CPC, readability and word count for your competition, and domain rating.
  • The Keyword Researcher tool streamlines the keyword research process and helps you understand your keywords’ KD.

Free Keyword Difficulty Tools & Free Keyword Checkers

There are a plethora of free keyword difficulty software choices available, including the Keyword Research Tool and the SEO Content Assistant. 

Keyword Difficulty & Your Website

With the right keywords, you can take your content from underperforming and undiscoverable to Google’s first page of search results. Become the author of your site’s future by taking the next step toward mastering keyword selection, page authority, and SERP rankings. Our Dashboard makes keyword difficulty metrics easy to understand and guides you through the steps of producing the best content in your niche.

When you enter a phrase, question, or topic into the Google Search bar, Google returns relevant results that offer you an answer or solution. So how does Google know exactly what you’re asking for?

In order to improve the relevance of the SERPs, Google has had to get better at understanding human language. Thanks to natural language processing algorithms and machine learning, Google is one of the most literate robots out there. 

So what can website owners do to leverage the power of Google’s NLP models? It’s called semantic SEO, and it’s all about creating content that shows Google topical depth, relevance, and quality.

What is Semantic SEO?

Semantic SEO is the process of creating your content around topics instead of one or two keywords. It means building more meaning into the content you create, and it involves thinking about the true intent of your readers and how the various landing pages on your website interrelate. 

A piece of content that is optimized properly for semantic SEO not only will answer the question a user has now but will answer the second, third, and fourth question they may have after reading. It is all about adding more depth, meaning, and reason to your content.

There are 3 problems that content creators face when dealing with content on a search engine:

  • Google is smart, but it’s still a robot. Several Google algorithm updates have focused on helping Google crawlers understand human language. However, they are bots. While their machine learning is quite advanced, they can’t truly speak human language. This is where semantic writing comes in handy.

  • Other brands are vying for your target audience. With more content fighting for valuable search engine real estate, it is hard to stand out to a searcher. Google uses authority, quality, and page experience to determine whether or not your relevant content is more valuable than your competitors.
  • Google can promote your content, but you have to do the rest. Search engines can usually figure out what your content is about. But it’s up to you to answer consumers’ questions before they even ask them and drive them toward conversion.

The concept of semantic search can help with all 3 of the above problems. It not only helps a content creator develop topics that answer their users’ intent, but also lets them position content where more than one question can be answered in more than one way. The goal is to add depth to your content and to frame content around longer queries and topic clusters rather than a literal keyword match.

Semantic SEO and Natural Language Processing

To get a better understanding of what exactly semantic SEO is, it’s important to understand how the data behind language is processed. 

Natural language processing (NLP) is how computers work to understand human language and infer meanings behind what is spoken. 

NLP models are building blocks of communication between humans and computers. New advancements in NLP are happening all the time, like with SMITH and GPT-3. 

With every new algorithm update, search engines like Google get better at understanding human language.

How Do Search Engines Use Semantics and Natural Language Processing?

Google released an update back in 2019 that aided it in semantic analysis and NLP. The BERT update (Bidirectional Encoder Representations from Transformers) was one of the most important algorithm updates in years and affected 10% of all search queries. 

In short, BERT helped Google better understand what the words in a sentence mean. It also helps Googlebot gain insight into the context around certain words.

Here are a few different examples of linguistic artificial intelligence capabilities that are used by NLP models like BERT.

Semantic Mapping

Semantic mapping is the act of exploring connections between a word or phrase and a related set of words and concepts. A semantic map visualizes how terms work together, as well as the search intent of the consumer.

For example, if a person simply searches for the term “pizza for dinner,” they could be looking to order a pizza from a local restaurant. Or, they could be looking for a recipe to make one at home. 

So how does Google know what the true search intent is? It’s all about the terms the consumer searches alongside the main “pizza” keyword. If they are looking to make one at home, they might search for ingredients and proper oven temperatures. Or, if they’re looking to order, we can assume their search query will include terms like “near me” or descriptive words like “best” or “great.”
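The pizza example can be sketched as a toy intent classifier. The signal words and labels below are illustrative; real NLP models infer intent from far richer context than keyword overlap:

```python
# Hypothetical signal words for two different intents behind "pizza" searches
RECIPE_SIGNALS = {"recipe", "ingredients", "oven", "temperature", "homemade"}
LOCAL_SIGNALS = {"near", "me", "best", "delivery", "order"}

def infer_pizza_intent(query):
    """Guess the intent behind a pizza-related query from its extra terms."""
    tokens = set(query.lower().split())
    if tokens & LOCAL_SIGNALS:
        return "transactional-local"   # wants to order from a restaurant
    if tokens & RECIPE_SIGNALS:
        return "informational-recipe"  # wants to make one at home
    return "ambiguous"
```

A bare “pizza” query comes back ambiguous, which is exactly why the surrounding terms matter so much to search engines.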

Semantic Coding

This is the process of using coding to better explain to Google what types of information can be found on each different page.

One popular example of semantic coding is schema.org. Schema markup is a semantic vocabulary of tags, or microdata, that you can add to your HTML. It improves the way search engines read and represent your web pages in relevant search results. When you use schema markup, you tell Google exactly what is on your page, and how you want to present it. 
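For example, a schema.org Product markup block embedded in a page’s HTML looks like this (the product details are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Women's Trail Running Shoe",
  "description": "Lightweight trail running shoe for women.",
  "offers": {
    "@type": "Offer",
    "price": "89.99",
    "priceCurrency": "USD"
  }
}
</script>
```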

There are other vocabularies and tags that you can add to help Google understand your content. Header tags such as H1-H6 map out the multiple subheadings and breaks in the content. 

There are also other semantic tags that can be used for emphasis, quotations, citations, acronyms, definitions, and thematic breaks.
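In HTML, a few of those semantic tags look like this:

```html
<h2>Subheading that breaks up the content</h2>
<p>
  <em>Emphasized text</em>,
  <abbr title="Natural Language Processing">NLP</abbr> as an acronym,
  a <dfn>definition</dfn> of a new term,
  and a <cite>cited work</cite>.
</p>
<blockquote cite="https://example.com/source">A quotation.</blockquote>
<hr> <!-- thematic break -->
```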

How Can You Improve Your SEO with NLP?

As mentioned before, there are many benefits to crafting content with natural language processing in mind. By crafting your content towards how NLP models work, you can earn more keyword rankings, better SERP positions, and more organic traffic.

Add Structured Data Markup

Google wants site owners to add structured data to their website so they can better understand their content. If you have specific products, events, or jobs that you’re trying to promote, there is no reason not to add the appropriate markups so your pages can appear in rich results.

Use Internal Links With Contextual Anchor Text

When a user clicks onto your website, they want to find a solution to their problem. They’re looking for the easiest and best answer, and if they have to spend time hunting around for information, they will most likely move on to another website.

That’s why you’ll want to provide as much relevant, and contextual information to them as soon as they land on your website.

This means not only using content to create meaning around a particular topic, but also including helpful internal links and anchor text. 

You need to think of your customer’s journey. Instead of putting all the information they may need on one web page, a strong internal link structure can work wonders. It both answers users’ questions and boosts your own SEO. However, you will want to create links with relevant anchor text that matters to the reader, to ensure they click on them!
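The difference between generic and contextual anchor text is easiest to see side by side (the URL is a placeholder):

```html
<!-- Generic anchor text: tells the reader and Google nothing -->
<a href="/blog/keyword-clustering/">Click here</a>

<!-- Contextual anchor text: describes the destination page -->
Learn more in <a href="/blog/keyword-clustering/">our guide to keyword clustering</a>.
```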

Think Bigger About Keywords

Keyword research is, in a nutshell, complex. Most digital marketers just do simple keyword research on individual keywords that are relevant to their products and services. 

But just think of how many different types of words there are in the English language! Your keyword research should really aim to capture all the different ways that users may search for products or services like yours. For example, there are Chrome extensions that can help you better understand the other ways users search for similar queries.

This includes verbs, adjectives, related questions and phrases, subtopics, and LSI keywords. LSI keywords, also known as latent semantic indexing keywords, are search terms that are relevant to the main, single keyword you are targeting.

When you enhance your keyword research and double down on content creation, you will have great success in creating relevant content. That content will also appear higher up in the search results for more relevant keyword phrases.

The Benefits of Writing Content with Topical Depth

There are many benefits to writing content with topical depth. Although longer content is not technically a ranking factor, the advantages of creating more in-depth content are clear in the SEO research and the rankings.

Rank for More Keywords

Even if you only want to rank for one primary keyword, Google usually ends up ranking your web pages for multiple keyword phrases. Why not hit multiple birds with one stone when writing your content? 

Pieces of content with topical depth tend to explore more subtopics and questions related to the primary keyword target. This broadens the reach of the content within the SERPs. 

The idea is simple: the more you write about multiple related subtopics, the better your chances of improving your visibility across multiple search results. 

Above anything else, optimizing for multiple keywords will ensure that you have more opportunities to drive traffic to your website.

Decrease Bounce Rate and Increase Scroll Depth

Not only does Google read the content you promote on your website, it also looks at how people are consuming it. 

In a quest to ensure the best user experience possible, Google only wants to promote high-quality web pages. If it sees that visitors are bouncing from your website almost as soon as they land there, it will conclude that your website isn’t relevant or valuable.

Because in-depth content requires longer landing pages, your visitors will scroll further and spend longer on the page. That is, as long as the text, images, and rich media are quick to load.

Topical depth allows you to dive deep into the topic at hand. But as you add topical depth to a page, make sure that you don’t sacrifice other key parts of the user’s experience, including site architecture, links, and easy navigation elements like jump links.

Improve Readability

Have you ever read a piece of web content that’s stuffed full of the keyword the site owner wants to rank for? Not only is it difficult to read, it devalues the entire experience.

As stated before, semantic SEO is not hyper-focused on one keyword. By writing with semantic SEO in mind, you actually improve the readability of your content.

When you use words related to your main topic, you give the reader more context, which greatly improves readability. Readability is important for both user intent and SEO, as it improves engagement with the page.

Engagement is one of the most important metrics Google pays attention to, and it ties directly into improved search rankings and visibility. And who wouldn’t want that?

Allow for Easy Conversions

The whole purpose of writing content is to get some sort of conversion, whether it is a phone call, an email subscriber, or a purchase. 

So every bit of content you create will need to serve its own purpose. If your content is stagnant and doesn’t inspire users, then what is the point? Content ties all of your marketing efforts together, and fantastic content will allow for an easy conversion. 

When users find content that is in-depth and answers their questions, they will be more likely to buy. They will also more likely see your brand as an industry authority.

Using the SEO Content Assistant to Improve Semantic SEO

Our SEO Content Assistant uses NLP algorithms and machine learning to help site owners practice semantic SEO.

With this tool, any digital marketer can take their target keyword and create high-quality, in-depth content. The higher the content score, the more likely the content will rank well in Google and push the user toward conversion.

Our SEO Content Assistant uses semantic technology. It compiles a wide variety of related phrases, LSI keywords, and shoulder topics from your specific keyword and main topic. Additionally, the software will suggest focus terms, multiple on-page elements to use, and keyword frequency that your content needs. This will help your web page outrank competitors for high-value keywords in your industry. 

As a result, you will have dozens of Focus Terms to use in your content that help enhance its topical depth. The SEO Content Assistant allows you to optimize your content for up to five keywords, but make sure those target keywords are related and have similar relevancy.

Our easy interface makes it seamless to import your content from existing URLs, export new content into Google Docs, and collaborate on your projects with other members of your team.

All in all, this tool will enhance your content and make it a valuable asset to your digital marketing goals.

Final Thoughts on Semantic SEO

Semantic SEO is a fantastic tool not only for search engine optimization purposes, but also for reaching your users at the perfect time for conversion.

By creating content that speaks to consumers in natural language instead of stuffing keywords unnaturally, you make it easier for readers to grasp the overall context of your main topic.

Pillar pages and topic clusters have become increasingly popular in the world of SEO, offering a powerful way to boost your website’s search engine rankings. This approach to SEO content enables marketers to create comprehensive content hubs with multiple related topics, linking them together to create a powerful library of content that optimizes their website for more overall visibility. Continue reading to learn more about pillar pages and topic clusters as well as how our SEO agency can help you use a topic cluster model to your advantage.

What are Topic Clusters?

For starters, topic clusters are a popular content marketing strategy used to create an information hierarchy that is easily navigable for readers. They are made up of the main page, known as a pillar page, and several other pages, known as cluster pages. The pillar page serves as an overview of a specific subject while the cluster pages provide more detailed information.

Topic clusters are connected by internal links, which allow readers to easily find other topically-related content. This also helps to keep readers on a website as well as improve their understanding of the subject. 

For example, we have a topic cluster on our website centered on link building, which is one of our agency services. Our link building page is considered the “pillar page,” and we have created a variety of “cluster content” that explores that main topic more in-depth. Some of the blog posts in that cluster include:

  • “12 Link Building Strategies for This Year”
  • “How to Choose the Best Quality Link Building Services”
  • “White Hat Link Building Versus Links to Avoid”

All of these blog posts link back to the primary pillar page, helping create a cluster of content that Google understands is topically related and full of in-depth information about the topic of link building.

Topic clusters can help to increase website visibility in search engines, as their algorithms tend to favor websites that are well-organized, authoritative on a topic, and have a strong internal linking profile.

Examples of Topic Clusters

As an example, let’s say that your business is a personalized travel agency catered toward ethical tourism. 

One of your pillar content pieces might be a page optimized for the keyword “ethical travel companies.” Within that pillar piece, you’ll also want to target keywords like “ethical travel agency” or “eco travel company.” The total number of topic clusters on a website will be determined by the services the business offers.

For example, you can create pages about sustainable travel packages or reasons why ethical travel is important. Your business can also create another content cluster around a larger topic, such as how to become a more “sustainable traveler.” This cluster could potentially explore the following topics:

  • How to Fly Sustainably
  • Tips for Respecting Local Cultures as a Traveler
  • 4 Ways to Travel Ethically
  • How to Become a Responsible Traveler

This is just one example of how you can create a well-structured, easy-to-scan cluster. This makes it easier for search engines to crawl your content and for potential customers to browse your cluster topics. It’s all about diving deeper into the informational side of your services and showing a search engine that you can provide value to your target audience.

Do SEO Topic Clusters Improve Rankings?

Yes. In the past, websites would simply create blog posts to match popular keyword phrases. Each web page would be optimized to rank for a specific keyword, and there was no attention placed on the content of the site in its entirety. 

But search algorithms began to change. The updates made them better at determining the expertise of a website. Now, they search for in-depth content, shifting their focus from keywords to topic areas. As a result, content strategists have now moved toward using topic clusters to organize their content. By building a content plan and implementing it appropriately, strategists have learned how to create a significant impact on SEO results, improving overall site authority.

With the current state of algorithms, not using a topic cluster strategy would put your webpage at a disadvantage, particularly if your competitors are enterprise-level. Creating content with the quality indicators that Google looks for will produce the best results:

  • In-depth, topically-rich content
  • Content that exhibits industry expertise
  • Content with internal links to other relevant content
  • Created by expert authors who have experience in the topic
  • Articles that fall under the website’s primary niche

It’s important to realize that topic clusters are a long-term strategy that allows your site to gain search rankings for broad, overarching keywords. Building a content cluster out on your website won’t happen overnight. In the beginning, topic clusters can help your website gain search traffic for longer tail keywords that are less competitive and elevate the performance of pillar content for more competitive, higher search volume keywords in the long term.

Topic clusters will also help your brand build E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) in certain topics so that Google sees your website as an industry leader.

How Do I Use Topic Clusters to Improve SEO?

To create a successful SEO topic cluster, it is important to first research relevant keywords and phrases using a keyword research tool. This will help identify the topics that are most relevant to your website’s target audience. By creating a cluster of related topics, the website’s SEO will improve, as it provides more opportunities for search engines to index and crawl the website’s pages. 

Below, you’ll find step-by-step instructions on how you can use topic clusters to improve your search engine optimization.

How to Build Topic Clusters on My Website

If you want to build topic clusters on your website, follow these steps:

  1. Identify your primary keyword targets for your pillar pages and topic clusters.
  2. Plan out your pillar page and determine subtopics with related terms and smaller search volumes.
  3. Create the cluster pages using the subtopics you’ve listed.
  4. Build your pillar page for the main hub of content.
  5. Start linking your cluster pages and repeat the process.

Look below to find a more detailed breakdown of each step.

Keyword Research

Begin by choosing a topic and doing some keyword research. This topic area should be relevant to your products, services, content, or areas of expertise. For example, if you are a real estate broker, some relevant topics may be “real estate agents,” “commercial real estate,” or “investment properties.”

You can use the content planner tool from the dashboard to help you identify possible subtopics for your topic cluster.

For example, the following keyword clusters were generated from the keyword input “commercial real estate.”

You can also use the built-in AI content generator to help you develop a relevant title for your cluster content. For example, here is a suggested article title for the keyword cluster “small warehouse space.”

Keyword cluster and Topic Idea generated in the Content Planner


Content Creation

Once you’ve identified keywords and a blog topic idea, it’s time to start creating high-quality content. This is essential for every website page, and you’ll want to use the appropriate list of related keywords provided by the content planner. When creating any web page, it’s important to make sure that it’s well-optimized for your target keywords, so make sure to use a content optimizer like the SEO Content Assistant to confirm your content has strong ranking potential.

Content optimization in the SEO Content Assistant

This tool will show you recommended focus terms and internal links to include in your content in order to improve its topical depth. If it’s going to rank well, your cluster content will need to provide value and meet the search intent of your audience.

Add Internal Links

The final step is to link your cluster content to your pillar page. This means creating links from your pillar landing page to your other content pages, and from your other content pages back to your landing page. You can also link to other pages within that same cluster. 

Internal linking is an important part of the SEO topic cluster optimization process, as it helps to establish the overall hierarchy of your content. By following these steps, you can create a well-optimized topic cluster for your website. And by doing so, you can improve your website’s search engine ranking and attract more website visitors.
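
The pillar-and-cluster linking pattern described above can be sketched as a simple data structure. The URLs below are invented for illustration; the point is the reciprocal links between the pillar page and each cluster page:

```python
# Sketch of a topic cluster's internal linking pattern.
# All URLs are hypothetical examples.
pillar = "/link-building/"
cluster_pages = [
    "/blog/link-building-strategies/",
    "/blog/choosing-link-building-services/",
    "/blog/white-hat-link-building/",
]

links = []
for page in cluster_pages:
    # Every cluster page links back to the pillar page...
    links.append((page, pillar))
    # ...and the pillar page links out to every cluster page.
    links.append((pillar, page))

print(len(links))  # 6 internal links for a 3-page cluster
```

Cluster pages can also link to one another, but the links to and from the pillar page are what establish the hierarchy.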

Repeat the Process

You can continue building out topic clusters on your website for as many topic areas as are relevant to your products or services. For example, we have topic clusters related to our primary agency services, including link building, content marketing, technical SEO, and more.

Eventually, your website will look like this:

Topic Clusters Internal Linking Profile

4 Additional Tips for Creating Topic Clusters

Here are some things to keep in mind to improve the performance of your topic clusters even more.

1. Focus on Creating a High Quality Pillar Page

Creating a pillar page is a bit different from creating a traditional blog post. This page aims to offer well-organized content that keeps visitors on the site longer and provides better signals to Google that your site is authoritative. It’s essential to create a well-thought-out pillar page to drive up engagement, increase page views, and appear as an expert on a particular subject.

2. Link to your Pillar Pages from your Homepage

The entire goal of a topic cluster is to elevate the rankings of your pillar pages. Those pillar pages should sit at the top of the cluster hierarchy, and you will want to send link juice from your homepage directly to them.

As a general rule, pillar pages should target more competitive keywords while your cluster pages target long-tail versions of keywords that represent subtopics of that topic area. Again, your pillar page is the most important, and you want your cluster content to elevate those pillar pages, not compete with them.

3. Be Careful with your Internal Links

Linking to a pillar page of another topic cluster may confuse search engine crawlers about which page is the most important. Although you may want to link to pages outside of the topic cluster, avoid linking to a pillar page of a separate cluster.

4. Watch out for Keyword Cannibalization

When used incorrectly, a topic cluster strategy can sometimes result in keyword cannibalization. This occurs when multiple web pages on a single website target similar keywords or phrases, so Google doesn’t know which one it should promote. Search engines will then often prioritize the page with the most optimized content, and the other pages may not appear in search results at all.

When identifying cluster content, make sure that your subtopics are distinct enough for search engines to understand they need to be promoted for a different set of keywords. If you end up experiencing some type of keyword cannibalization, you can correct this by combining some of your similarly focused cluster content into one page.

Then, add 301 redirects from those old pages to the newer, more comprehensive page.
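
For example, assuming an Apache server, the consolidation might be paired with redirect rules like these in an .htaccess file (the paths are hypothetical):

```apache
# Hypothetical example: permanently redirect two overlapping
# cluster pages to the new, combined guide.
Redirect 301 /blog/what-is-ethical-travel/ /blog/ethical-travel-guide/
Redirect 301 /blog/why-ethical-travel-matters/ /blog/ethical-travel-guide/
```

The 301 status tells search engines the move is permanent, so link equity pointing at the old pages is passed to the consolidated page.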

How to Track the Performance of an SEO Topic Cluster

You can use the GSC Insights tool in your dashboard to track the performance of your topic clusters. This can help you get a good idea of how your website is performing for the key “topic areas” that search engine crawlers associate with your website.

Navigate to the “Page Groupings” feature and start assigning your topic cluster content to the same group as the accompanying pillar page.

Tracking Topic Cluster Performance in GSC Insights

From the above data, I can see that our link building topic cluster is performing very well, meaning Google sees our website as authoritative in this topic area. If we add new link building content to our website, Google is more likely to promote it, as we have already proven ourselves authoritative in this topic area.

Adding more cluster content to our Technical SEO and Content Creation clusters, as well as investing in link building to those content clusters, will help Google start seeing us as more authoritative in those sub-niches of SEO as well.

Conclusion

Remember that SEO topic clusters do not happen immediately and require consistent content creation efforts. However, taking the time to build out all of these pieces of content can have drastic impacts on your organic rankings and help Google rank your web pages for years to come.

Domain Rating (DR) and Domain Authority (DA) are two metrics SEO experts rely on to understand the authority of websites. SEO strategists regularly use DA and DR as key performance indicators for their campaigns or to benchmark how websites measure up against competitors. 

DA and DR are separate authority scores with their own unique calculations, but both are regularly used in the SEO industry to represent ranking potential. In general, high DA and high DR scores are seen as reliable predictors that a website will perform well in search results.

As a result, many SEO professionals have asked which metric is more reliable. Is DA or DR a more accurate predictor of SEO performance? The below study does not settle that question definitively, but we set out to compare DR and DA metrics and better understand their relationship to other SEO performance indicators like organic traffic.

What is Domain Authority?

Domain Authority is a metric created by Moz that predicts the ranking potential of a website. It is scored on a 0-100 scale. According to Moz, the higher a domain’s DA score, the more likely web pages from that domain will rank well in search engine results.

DA is calculated by evaluating several factors, but most significantly, the inbound links pointing to a domain. The link data used to calculate DA scores comes from Moz’s Link Explorer web index.

Domain Authority is designed to be a comparative metric, meaning earning a 100 score is not the ultimate goal. Site owners can use DA to measure the ranking potential of their own website in comparison to other domains in their industries.

What is Domain Rating?

Domain Rating (DR) is a proprietary metric from Ahrefs that is designed to measure the strength of a domain’s backlink profile via total number of backlinks and total unique referring domains. In addition to quantity, Ahrefs also considers the quality of the inbound links pointing to a domain while calculating DR scores.

DR is on a logarithmic 0-100 scale with higher scores indicating stronger, more robust backlink profiles. The link data used to calculate DR scores comes from Ahrefs’ large link index.

Unlike DA, DR is best described as a representation of backlink profile strength rather than ranking potential. But because backlinks and unique referring domains are some of Google’s top ranking factors, Ahrefs’ DR scores are also often used in the SEO community to quantify how well a domain will rank in search engine results.

DA/DR and Google

It’s important to note that neither of these metrics is a Google ranking factor. However, because both authority metrics rely on factors similar to those Google uses in its ranking algorithm, higher scores often correlate with better search engine rankings.

The Dataset

For this DR vs. DA comparison, we used the below metrics for 9,739 active websites.

  • Moz Domain Authority Score
  • Ahrefs Domain Rating Score
  • Ahrefs estimated total monthly organic traffic from the top 100 search results

Domain Rating vs Domain Authority Scatter Plot

The below scatter plot charts Domain Rating and Domain Authority scores along with estimated monthly organic traffic. Organic traffic is charted on a logarithmic scale with green representing higher organic traffic and red lower organic traffic.

Based on the upward trend, we can make the following conclusions:

  • There is a positive relationship between DA and DR scores. As DA scores increase, DR scores tend to increase as well, reflecting that both metrics are relying on similar factors to calculate scores.
  • There is a positive relationship with both DA/DR scores and organic traffic. As both DA and DR scores increase, the amount of estimated monthly organic traffic also increases.

Where the data diverges from this trend is the group of domains whose DA scores are significantly higher than their DR scores.

This outcome may be due to Ahrefs’ greater emphasis on higher-quality links and more authoritative referring domains. Ahrefs may simply not assign as much value to the links pointing to those domains as Moz does, resulting in higher DA scores but lower DR scores.

Regressions

Regression analysis is a mathematical way to understand whether one variable impacts another. In the below regression analyses, we attempted to isolate DA and DR to understand their unique relationships to organic traffic.

Organic Traffic Explained By DR

Organic traffic as explained by DR | intercept = -141730 slope = 6709.6

As expected, there is a positive correlation between higher DR metrics and higher organic traffic.

Organic Traffic Explained By DA

Organic traffic as explained by Moz DA | intercept = -295370 slope = 7908.4

Like DR, higher DA scores also correlate with higher organic traffic. DA showed a slightly stronger relationship with organic traffic than DR, but only minimally so.
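
To make the fitted lines concrete, here is a small sketch that simply evaluates the two regression equations reported above. It is not the original analysis, just the arithmetic of the fitted lines:

```python
# Evaluate the fitted regression lines from the study above.
def traffic_from_dr(dr: float) -> float:
    # Organic traffic as explained by Ahrefs DR.
    return -141730 + 6709.6 * dr

def traffic_from_da(da: float) -> float:
    # Organic traffic as explained by Moz DA.
    return -295370 + 7908.4 * da

# A score of 70 on either scale implies a six-figure monthly traffic estimate:
print(round(traffic_from_dr(70)))  # roughly 327,942
print(round(traffic_from_da(70)))  # roughly 258,218
```

Note that the negative intercepts mean the linear fits predict negative traffic at very low scores, a reminder that these are trend lines across a large dataset, not per-site forecasts.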

Final Thoughts on DR vs. DA

When comparing Domain Rating vs Domain Authority, it is clear that both have earned their place in the SEO community as useful metrics for predicting SEO performance. DA and DR have a positive relationship with each other as well as with higher organic traffic to a domain.

Utilizing both metrics for comparative and SEO analysis can be beneficial, and increasing domain authority and domain rating scores can have a positive relationship with other SEO metrics like total keyword rankings and increased organic traffic.

Have you ever heard SEOs use the term “backlink profile,” but are not quite sure what it means?

Improving the strength of your backlink profile can help Google perceive your website as more trustworthy and authoritative.

If you’re new to backlink profile analysis, this guide will cover all of the terminology related to backlink profiles and teach you how to analyze the strengths and weaknesses of your backlinks.

Then, you can leverage that information to develop a more impactful off-site SEO strategy.

What is a Backlink and Why is it Important?

A backlink is a link from another website that points to your website.

Backlinks are important because they are one of the main factors that search engines use to rank web pages. The more high-quality backlinks a web page has, the more likely that web page is to rank in the SERPs for relevant keywords.

If your website has lots of backlinks, it shows Google that other webmasters on the internet find your content valuable. It’s the primary way that search engine crawlers determine whether a website is trustworthy, reputable, and worthy of being promoted in the SERPs.

What is an Example of a Backlink?

Let’s say I was writing an article about Google search advertising and wanted to mention a statistic about search ad conversion rates. To cite my claims and provide more information to my audience, I may choose to link to this informative list of marketing statistics from Hubspot.

By doing so, I just created a backlink for Hubspot.

If I were to link to another relevant page on my same website, that would be considered an internal link. Although good for SEO, internal links are an on-page SEO signal and should not be confused with backlinks.

An inbound link is only considered a backlink if it comes from another website.

What is a Backlink Profile?

A backlink profile is the complete list of all of the backlinks pointing to your website.

The number and quality of these backlinks is a key factor in how well your website ranks in search engine results pages (SERPs).

But there are other key factors that SEOs use to evaluate the overall strength of your backlink profile. To do backlink analysis successfully, it’s important you understand the terminology associated with this evaluation process.

Total Backlinks

Total backlinks is the total number of backlinks pointing to your website.

In general, websites with more backlinks will outrank websites with fewer backlinks.

However, it’s not just a numbers game. If your competitor has more backlinks than you, that doesn’t guarantee better keyword rankings.

The quality of the websites that link to yours is a big factor in determining backlink profile health and ranking potential.

Referring Domains

When a website links to yours, that website is considered one of your website’s linking domains, or referring domains.

Essentially, that website can send you referral traffic because of the link they have included to your website in their content.

Say your website has a thousand backlinks, but they all come from the same three websites. Although that is a good number of backlinks, it does not signal a strong backlink profile, because only three other websites on the internet find your content valuable and trustworthy enough to link to.

That’s why in addition to backlinks, the total unique referring domains in your backlink profile is another top Google ranking factor.

The more unique referring domains in your backlink profile, the more likely Google is to promote your web pages, as your content is clearly trusted by many people across the internet.

Domain Authority of Referring Domain

A quality indicator for referring domains is their Domain Authority or Domain Rating scores.

Domain Authority is a metric from Moz that estimates the site authority (or overall reputation) of a website.

Backlinks that come from authoritative domains with higher site authority scores will be even more valuable.

Backlinks from websites with very low DA or DR scores can actually weaken your backlink profile, as Google crawlers will see that your website is keeping company with questionable web properties.

Organic Traffic of Referring Domain

If your referring domains also rank for multiple keywords and earn organic traffic, that also increases the value of a backlink.

Why? Because if searchers are finding that domain’s content valuable, and that domain is linking to your content, then search engines understand that your content must also be valuable.

In layman’s terms: I know a guy who knows a guy.

Page Authority of Linking Page

Every time a website links to another website, it sends along a portion of its PageRank (or link equity) to the linked page. The more PageRank the linking page has, the more link equity that backlink passes along to you.

PageRank is a patented metric that Google uses to understand the value of each individual page in its index. The more PageRank a page has, the more likely it is to rank in search engines.

Google used to show users how much PageRank a web page had, but it no longer does. Page Authority, then, is a metric SEOs now use to estimate PageRank.

Topical Relevance of Referring Domain

Backlinks that come from referring domains with topical relevance to yours will be more valuable than backlinks from websites outside your industry.

For example, if you sell pet products on your website, links from veterinarian clinics, pet stores, pet blogs, or animal publications will all be topically relevant.

But if your backlinks come from websites focused on appliance repair, from the comment section of random blog posts, or other irrelevant sites, Google may suspect your website of purchasing those links or trying to falsely elevate your backlink profile strength.

When you start looking for link building opportunities, you will want to focus on websites that share content relevance.

Top Anchor Text

The anchor text of your backlinks also influences the health of your backlink profile.

Although you do not have control over how other webmasters link to your site, Google will be looking at the anchor text of your backlinks to understand what your content is about.

The majority of your anchor text will include your brand or business name, but a healthy diversity of anchor text signals that the backlinks your website is earning are through organic, Google-compliant practices.

On-Site Link Location

Google also considers where your backlinks are on the linking page. Is it in the body of the text? The comment section? In the caption for an image?

The linking location tells Google quite a bit about your website. If the webmaster of a high-quality website links to yours in the body of their article, they likely trust your reputation.

If a user links to your website in a comment on another webmaster’s blog post, Google considers that questionable linking and will trust your website less.

Toxic Backlinks

Toxic backlinks are backlinks that harm your backlink profile for any of the reasons listed above. Some signals of a toxic backlink may include:

  • Low DA or DR score of referring domain
  • High Toxicity scores (if you’re using our dashboard for backlink analysis, look at the spam score)
  • Irrelevant, explicit, or spammy anchor text
  • No organic traffic signals of the referring domain
  • Backlinks from linking websites with no industry relevance to yours

A big part of backlink profile analysis is identifying any toxic links and trying to reduce their impact. Historically, SEOs have used Google’s Disavow tool to discount the impact of toxic links, but Google’s 2021 Link Spam update has helped Google crawlers better identify and nullify low-quality links.

This means that although the low-quality site may be linking to yours, it’s not harming your backlink profile health, because Google is not counting it against you.
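
To illustrate, a first-pass screen for the toxicity signals listed above might look like the following sketch. The field names and thresholds are hypothetical; a real audit would use the data exported from your backlink analysis tool:

```python
# Hypothetical first-pass screen for toxic backlinks.
# Field names and thresholds are illustrative only.
def is_toxic(link: dict) -> bool:
    return (
        link["referring_domain_da"] < 10          # very low authority score
        or link["spam_score"] > 60                # high toxicity/spam score
        or link["referring_domain_traffic"] == 0  # no organic traffic signals
        or not link["topically_relevant"]         # no industry relevance
    )

backlinks = [
    {"referring_domain_da": 55, "spam_score": 2,
     "referring_domain_traffic": 12000, "topically_relevant": True},
    {"referring_domain_da": 4, "spam_score": 85,
     "referring_domain_traffic": 0, "topically_relevant": False},
]
flagged = [b for b in backlinks if is_toxic(b)]
print(len(flagged))  # 1 link flagged for review
```

Links flagged this way are candidates for closer manual review, not automatic disavowal.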

Why Do a Backlink Analysis?

So now that you understand the terminology associated with your backlink profile, you can successfully analyze it to determine its strengths, weaknesses, and overall health.

How can you then leverage that information to improve your own SEO performance?

1. Understand what it will take to rank

If your competitors have stronger backlink profiles than yours, you are unlikely to outrank them for search queries with higher Keyword Difficulty scores.

By analyzing the total links and total referring domains of your backlink profile against your competitors’, you can get a sense of how aggressive you’ll need to be with your own link building strategy.

2. Find Outreach Targets

Seeing who is linking to your competitors can give you a list of websites that you may want to reach out to for link building.

If those referring domains are finding your competitors’ content valuable, they may be willing to link to your content as well (as long as your content is similarly high-quality).

You can use the Link Gap Analysis tool in your dashboard to identify common referring domains among your competitors.

This can help you easily and quickly get a list of outreach targets and then reach out to those relevant webmasters and bloggers.

3. See if you Need a Link Building Campaign

One of the best ways to improve your website’s backlink profile is to actively pursue high-quality backlinks from authoritative websites, a practice known as link building.

You can do this by publishing high-quality content that other websites will want to link to, participating in relevant online communities, or engaging in more technical strategies like broken link building.

Final Thoughts on your Backlink Profile

Now that you understand what a backlink profile is and how to analyze it, you can use a backlink profile analysis tool to evaluate your website’s backlinks more closely. You’ll be able to leverage your new knowledge and our software together to start building a healthier, stronger backlink profile.

If you care about SEO, you probably know how important backlinks are in your website’s ability to rank in search engine results. But what you might not know is the significant impact that the anchor text of those backlinks has on your SEO performance. Just like the Domain Authority and topical relevance of a linking website impacts how Google perceives you, your backlink profile’s anchor text diversity, or lack of it, can influence your SEO performance.

Why is Anchor Text Diversity Important?

According to Moz, anchor text is an important attribute that determines a link’s value. Google pays close attention to anchor text and relies on it to understand what web content is about.

When other websites link to yours, they don’t always link in the same way. That’s why a wider variety of anchor text is beneficial to your backlink profile, because it looks more natural and organic to Google crawlers.

In contrast, if all of your backlinks use the same anchor text, or keyword-rich anchors, that can look like manipulation or black-hat link building. If Google suspects your website is trying to unnaturally elevate its rankings through suspicious techniques, your SEO performance can suffer in the long term.

The biggest challenge of monitoring the anchor text in your backlink profile is that, for the most part, you have zero control over how other webmasters choose to link to your content.

Link building is already hard enough work as it is, and the added necessity of being strategic about anchor text can make it feel harder. But you should already be monitoring every backlink you earn–whether organically or through outreach–to ensure each one ends up helping you rather than causing harm.

So if you have already started earning backlinks but have not yet started paying attention to anchor text, it's time to dive in. Here's an introduction to how Google views anchor text, along with some best practices to make sure your backlinks don't sink you into the internet void. We want to make sure you swim right up to the top of the SERPs.

Everything You Need to Know About Anchor Text

Anchor text is the clickable text in a hyperlink that directs to another website or location. Google relies on anchor text and the words that surround it to understand the subject matter of the linked page.

But not all types of anchor text bring the same value to your backlink profile. In the eyes of Google, keyword-rich anchors, generic anchors, and naked URLs each carry their own nuance.

As you begin to accumulate backlinks, it’s a good idea to familiarize yourself with the different types of anchor text so you can ensure that your backlink profile always has a healthy level of variety.

Anchor Text Types

There are multiple types of anchor text that might be appearing in your backlink profile. You can find a more detailed run-down of each type here, but these are the most common types you will begin to see as you embark on your link building efforts.

  • Branded – Links containing your brand name
  • Exact Match – Links with the exact keyword you want to rank for
  • Partial Match – Links with a variation of the keyword you’re trying to rank for
  • Phrase Match – Links with the keyword phrase you want to rank for
  • Generic – Links with non-specific information like “Click here,” or “Info”
  • Naked URL – Links with the full URL serving as the anchor text
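As a rough illustration of how these categories relate to one another, here is a minimal classifier sketch. The brand name, target keyword, and generic-phrase list are hypothetical stand-ins, and real backlink tools use far more sophisticated rules:

```python
import re

# Hypothetical examples: substitute your own brand and target keyword.
BRAND = "acme analytics"
TARGET_KEYWORD = "seo software"
GENERIC_PHRASES = {"click here", "read this", "info", "learn more", "this website"}

def classify_anchor(anchor: str) -> str:
    """Assign an anchor text string to one of the common anchor types."""
    text = anchor.strip().lower()
    if re.match(r"^(https?://|www\.)", text):
        return "naked url"
    if text in GENERIC_PHRASES:
        return "generic"
    if BRAND in text:
        return "branded"
    if text == TARGET_KEYWORD:
        return "exact match"
    if TARGET_KEYWORD in text:
        return "phrase match"  # keyword phrase plus extra words
    if set(TARGET_KEYWORD.split()) & set(text.split()):
        return "partial match"  # shares some words with the keyword
    return "other"
```

For example, under these assumptions "best seo software for beginners" would be classified as a phrase match, while "marketing software reviews" would only be a partial match.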

How to Use Anchor Text Wisely

These are some core SEO anchor text principles that you should apply to both the linking you do on your site and the links you accumulate from across the web.

Relevance is King

The more relevant the anchor text, the more powerful the backlink.

This is because relevant anchor text not only communicates stronger relevance signals to Google, it also contributes to a better user experience.

Users who follow an exact match anchor will be annoyed if they arrive at a new web page that has nothing to do with the content implied by the link. And even if a user clicks a less specific, generic anchor like "Read this," they still expect their new destination to be closely relevant to their previous one.

So although we are focusing on the importance of anchor text diversity, your anchor texts should never veer off the relevance road in terms of content or industry niche.

Contextual Anchors Are Powerful

You want your anchor text to both accurately describe the linked page to Google and make your user interested in clicking on it. Contextual anchor text, as opposed to generic anchor text like "click here" or "learn more," can accomplish both: it provides Google and users more context about the subject of the linked page.

Google will also look to the text before and after the anchor text to understand the relevance of the linked content. So make sure that you are looking closely at the sentences where your anchor text appears and that you don’t miss out on any opportunities to provide contextual clues to Google.

Avoid too many Keywords

In the early stages of your website’s growth, it seems logical to strive to earn backlinks that have anchor text with the keywords you want to rank for. But too many keyword-dense anchor texts can taint your link profile in the long run, as it may start to look as if your backlinks were not acquired naturally.

In the past, webmasters used exact match anchor text to manipulate their backlink profiles and catapult to the top of SERPs, regardless of whether their site had any relevance to the search query. Algorithm updates like Penguin have enabled Google to easily identify anchor text manipulation. And natural language updates like BERT mean Google is getting even better at contextualizing the language surrounding anchor text.

If you have an SEO professional guiding your link building campaign, they should ensure that any backlinks you earn come from white-hat techniques. Although link building campaigns allow for slightly more agency in the types of anchor text used, backlinks still ultimately come from third parties. 

This means they should always have variety and not rely on the same keyword in order to look natural.

The Importance of Anchor Text Diversity, and How to Know If Your Backlink Profile Has It

Building a diverse backlink profile of anchor texts can have a really positive impact on your SEO performance. In the below graphic, you can see that the SEO community agrees that the anchor text of both backlinks and internal links are significant ranking factors.

So a regular part of your own website maintenance should involve reviewing backlinks to make sure they are not coming from low-quality, spammy sites, and to also ensure that even the best of your inbound links are using anchor text that helps build diversity.

Although there is no magic number for exactly how many of each anchor type you should have, Semrush offers some realistic benchmarks:

  • 30-40% branded anchors
  • 30-40% partial match anchors
  • 20-40% generic, related, naked, random, exact match, and others
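Checking your own profile against ranges like these is simple arithmetic. The sketch below uses a hypothetical sample of anchor types (as a real backlink report might export them) and flags any category that falls outside the benchmark range; the numbers and thresholds are illustrative, not official targets:

```python
from collections import Counter

# Hypothetical anchor types pulled from a backlink report (100 links).
anchors = (["branded"] * 35 + ["partial match"] * 30 + ["exact match"] * 10
           + ["generic"] * 15 + ["naked url"] * 10)

counts = Counter(anchors)
total = len(anchors)
distribution = {t: round(100 * n / total) for t, n in counts.items()}

# The benchmark ranges above, expressed as (min %, max %).
benchmarks = {
    "branded": (30, 40),
    "partial match": (30, 40),
}
other_share = 100 - distribution.get("branded", 0) - distribution.get("partial match", 0)

for anchor_type, (low, high) in benchmarks.items():
    share = distribution.get(anchor_type, 0)
    status = "ok" if low <= share <= high else "review"
    print(f"{anchor_type}: {share}% ({status})")
print(f"all other types: {other_share}%")
```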

If you are not quite sure how the distribution of your backlink anchor text is impacting your own site, you do have some options via the help of SEO tools and professionals.

Use a Backlink Analyzer

A backlink analyzer can give you some much-needed insight into not only the quality of your links, but on the type of anchor text being used. You can use a free backlink analyzer like ours to see where your top competitors earn backlinks and to compare their anchor text diversity with your own.

Frequently revisiting this backlink data provided by SEO tools should quickly become a regular part of your SEO practice. If building links is currently a part of your SEO campaign, you can harness the power of this data to shape the types of target keywords and target anchors that you go after.

Get an SEO Audit

Seeking out the help of professionals in the SEO world is always a good idea. An audit can identify technical SEO problems that you may not be able to identify on your own. It's possible that your problems don't lie in your link profile at all, but in other areas like page speed, HTML tags, or targeting keywords that are far too competitive.

But the important thing is identifying the problems before they result in long term consequences. Optimizing is not easy work, but the good news is, there is always someone who can help.

3 Options for Dealing with Over-Optimized Anchor Text

If you learn through an SEO audit or tool that your backlink anchor text has too much keyword density, there are some immediate steps that you can take to change the impact it might be having on your search rank.

You want to resolve over-optimization quickly, because it’s possible that Google could red flag your website if there is too much anchor text optimization. The Penguin update specifically targeted the previously common practice of anchor text manipulation. 

So although you can’t choose the anchor text other webpages use to link to you, you can certainly monitor and control the ratio of the different types of anchors in your link profile.

Option 1: Remove Backlinks with Anchor Text that may be Harming your Rankings

You should always strive to remove toxic or spammy links that you acquire. Backlinks that come from sites with no logical connection to yours should also be removed. Generic anchor text isn't bad, but the average reputable site is more likely to use your brand name or keyword-rich anchors when linking to your content.

As your link profile grows, you will have a lot more leeway. But in the early stages of your site, exact match anchors, or anchors without any relevance, can get you into trouble with Google.

Even if you are desperate for any link you can get your hands on, you need to practice discipline and evaluate how each one can impact you in the long run. It's better to be picky early on when your domain authority is low.

Your website will thank you later with traffic.

Option 2: Reach Out to Referring Webmasters to Request Anchor Text be Changed or Removed

Before disavowing any backlinks, most individuals reach out to the webmaster of the referring domain and request that the link be removed.

So if you earn a quality link but are concerned about the anchor text, you can attempt a similar strategy. However, you should do so sparingly, and really only if there is an obvious error of some kind (like a misspelling of a brand name).

You can send a thank-you for the backlink along with a request for a slight anchor text change. If the content writer feels your concern is legitimate, they may gladly change it. If they think you're just being picky because they used an exact match anchor or a generic word, they may get annoyed. They have the power to remove the link altogether, so use discretion.

This strategy should not be overused. If Google suddenly sees all of the anchor text of your backlinks changing, it’s going to look suspicious.

Option 3: Build New Backlinks with the Intention of More Anchor Text Diversity

This is by far the best and smartest option for diversifying your backlink anchor text. However, it does require more time and financial investment. It is not a quick fix like the previous options, but it has much better ROI in helping you secure those top spots in the SERPs.

Link building campaigns give you the option of building links with target anchor texts in mind–whether keyword-rich anchors or branded anchors. Strategies like pitching and placing content such as blog posts, long-form articles, or guest posts give you far more control over the anchor text you need to diversify.

Best Practices for Backlink Anchor Text

Since targeted link building campaigns are far and away the best strategy for diversifying your anchor text, we are going to take some time to break down the best practices for how to use anchor text while link building. When done correctly, you can target specific anchor texts and use them to build a natural backlink profile.

Mix in Long-Tail Keywords with your Exact Match Anchor Texts

Long-tail keywords allow you to include exact match anchors without the risk of being flagged by that pesky Penguin.

For example, if you want to rank for the term, “seo software,” you could use variations like these:

  • best new seo software
  • user-friendly seo software
  • seo software for beginners

The more pages that use similar anchor text to link to your landing page, the more likely that page will rank for those keywords in search engine results. So this strategy allows you to target the exact match anchor through new links with less risk, while also adding anchor text diversity.

Look to your Competitors’ Anchor Text and Follow their Example

The more you know about your competition, the better. Looking at the types of anchor text that have helped your competitors can give you some insights into potential link building targets.

While you’re at it, check out what their ratios of anchor text types are. You can also see what sites they are earning links from, as this will give you some additional information for your link building campaign.

Take note though: Websites with higher domain authority have already built a good reputation with search engines. For this reason, they can have a lot of keyword-rich anchors or exact match anchors without causing any harm.

Remember, the lower your page authority, the more selective you should be about any backlink you earn.

Use Synonyms, LSI Terms, and Relevant Keywords for Anchor Text

The great bonus of Google getting so much better at understanding natural language is that good, thoughtful web writing is more often rewarded.

Lazy SEO tricks of the past like anchor text manipulation and keyword stuffing often resulted in content that was just unreadable. But now that our search engines can look at both our backlink anchor text and contextualize the words surrounding them, we actually have more options, not less, for how we choose our target keywords and increase our ability to appear for more queries.

So one way to avoid over-optimization and achieve more anchor text diversity during your link building campaign is to use synonyms, LSI Terms, and relevant keywords for your anchor text. LSI terms are conceptually related to your target keyword. Google will understand their relevance, but will not mark them as “exact,” and therefore not penalize you.

When you have accumulated too many exact match anchor texts, consider embracing your literary side and use some synonyms instead. Overall, this strategy will make for happier search engines, happier reading, and happier link profiles.

Use Exact Matches, Keyword Variations, and Branded Anchor Text on your Best Performing Pages

For both internal linking and link building, how you use anchor text should vary depending on whether you are linking to your homepage, a landing page, or a blog post.

In general, you want to use keyword-dense anchors on pages with higher PageRank. These are most likely the pages that promote your brand or service, and therefore earn higher quality links. Exact-match anchors do better on web pages with better PageRank because those pages have earned credibility and trust. If Google sees a keyword-dense anchor on a well-performing page, it is less likely it will penalize you for over-optimization. 

If you have landing pages like blog posts with good quality content, use more generic links, partial matches, and keyword variations. The purpose of a blog post is not usually to promote a product or service, but to provide useful information or in-depth knowledge. For this reason, less keyword-focused anchors fit the medium and look far more natural to crawlers.

Relish the Randomness

Overall, more anchor text variety can give you a powerful SEO boost. Just like backlinks vouch for your reputation, anchor text provides an objective description to Google about your site’s content. Because anchor text is chosen by a neutral third party, Google values the input.

A quality link building campaign will focus not only on high-domain authority websites, but strategic, diverse anchor text. So keep anchor text in mind as you continue your journey in link building and website growth.

There are few SEO elements more complex than links. While it’s common knowledge that strong internal linking structures, anchor text, and high-quality backlinks are vital to strong SEO, many content writers tend to gloss over annotation text. This can be a long-lasting and costly mistake. Annotation text for SEO may be the key to better use of anchor text.

Let’s all commit to maximizing the value of our links with well-optimized annotation text. The process is pretty simple. In fact, as an SEO writer, you’re likely 90% of the way there. As for the remaining 10%, this article will walk you through how to write quality annotation text that helps pack every possible ounce of link equity into your internal links, external links, and backlinks.

What is SEO Annotation Text?

Annotation text is the text that surrounds an outbound link that a web crawler can use to better understand the linked page. It is highly likely web crawlers will index text beyond the sentence level in relation to a hyperlink. This suggests annotation text can include words and phrases that span the section, paragraph, or even entire document where the hyperlink appears.

Most importantly, Google’s algorithm patent suggests this text is a ranking factor.

Diving into Google’s Web Crawler System & New Annotation Description Patent

Annotation text is an SEO element that emerged from Bill Slawski’s research into an update of Google’s algorithm patent. Slawski, who analyzed Google’s 2007 anchor text patent, realized there was an interesting discrepancy in how Google described its anchor text indexing. First, the patent describes that the anchor text indexing system will store at least one term in association with the outbound link. However, the patent goes on to explain that the annotation includes a text passage within a predetermined distance of an outbound link.

Bill Slawski noticed that Google added geo-semantic indexing for text surrounding the anchor text.

This means that the text within a certain distance of the anchor text is indexed to better understand the meaning of the linked page.
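The patent does not specify what "predetermined distance" means, but the idea can be sketched as a fixed word window around each link. Everything below, including the regex-based parsing and the ten-word default, is an illustrative assumption, not a description of how Google's crawler actually works:

```python
import re

def annotation_window(html_text: str, window: int = 10) -> list[tuple[str, str]]:
    """Return (anchor text, surrounding words) pairs for each link.

    The word window is an arbitrary stand-in for the patent's
    unspecified "predetermined distance."
    """
    results = []
    pattern = re.compile(r"<a\b[^>]*>(.*?)</a>", re.IGNORECASE | re.DOTALL)
    for match in pattern.finditer(html_text):
        # Strip any nested tags from the anchor text itself.
        anchor = re.sub(r"<[^>]+>", "", match.group(1)).strip()
        # Collect the words before and after the <a> element.
        before = re.sub(r"<[^>]+>", " ", html_text[:match.start()]).split()
        after = re.sub(r"<[^>]+>", " ", html_text[match.end():]).split()
        context = before[-window:] + after[:window]
        results.append((anchor, " ".join(context)))
    return results
```

Running this over a paragraph shows exactly which surrounding words a window-based indexer would associate with each anchor, which is a useful sanity check when writing annotation text.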

How Does Annotation Text Affect SEO?

Google’s NLP algorithms use artificial intelligence to better understand human language. This provides Google with a better understanding of any given web page’s content. As Google’s bots crawl a page they index these NLP signals in order to provide searchers with better search results.

While anchor text is a primary indicator of the content on the linked page, Google’s NLP algorithms also index information from the surrounding text as a secondary indicator.

By including surrounding text in Google bot indexing, Google is able to provide the best user experience since the surrounding text provides insight into the relevance of the outbound link. The better Google understands the linked page, the more precise its search results can be.

Your Site’s SEO & Text Around Your Links

Providing Google’s web crawlers with more information about a page is always a better approach to trying to rank for Target Keywords, especially when it comes to off-site syndication and guest posts.

Furthermore, because most SEO content creators are solely focused on optimizing anchor text, you can gain a competitive advantage by optimizing for annotation text in addition to anchor text.

TLDR: Annotation text gives your outbound links contextual meaning.

How Strongly Does Annotation Text Affect the Meaning of a Link?

This is where things get a little sticky. As we know, Google keeps all of the signals used for PageRank’s indexing secret. “But there’s a patent…” Well, just because Google registered a patent that describes annotation text doesn’t guarantee that PageRank uses this particular patent.

It’s likely that Google does since this patent was registered around the time they were developing their SMITH algorithm. However, Google has never confirmed it.

The Structure of Annotation Text

The word annotation means notes or in-text notation. So, when Google describes the process of pulling annotations through indexing, their crawler is likely pulling hints as to the link meaning from within the text. This, like taking notes on a literary piece, has an amorphous structure.

When it comes to the relationship between anchor text and annotation text, I often liken this to an atom. The anchor text is the dense center: the nucleus. The annotation text is the electrons circling this center. Both the center and the surrounding text define the atom; however, the anchor text is the most reliable in terms of its relationship to meaning, since the electrons can be more difficult to pin down.

This often means there is no perfect version of a page. The web crawler system will pull data from a “predetermined distance of an anchor tag.” So common sense says to put the most important words near your anchor text while maintaining the quality of your content throughout the page.

What Is Anchor Text?

Unsure of what anchor text is? It’s a simple concept. Anchor text is the clickable text presented within a webpage that, when clicked, directs the browser to a URL. Most often, the text is a description or explanation of the information found on the linked webpage.

For example, “an update of Google’s algorithm patent” in the previous section is the anchor text that links to the referenced patent in the Patent Database.

Google’s web crawlers use anchor text to determine the topic of the page to which the link is pointing. This information is gathered as the bot works its way through a sitemap.

Is Annotation Text the Same as an Annotation Link?

No. While annotation links or annotated links have many meanings in the world of web development, the phrase is not interchangeable with annotation text.

Annotation links can be hyperlinks that allow video viewers to skip ahead to the section of the video that’s most relevant to their needs. Some people use “hyperlink annotation” as a synonym for “anchor link” when a photo, video, or audio file stands in place of text.

Techniques for the Best SEO Annotation Text

So, how can you make the most of annotation text for improved SEO? Most SEO content optimization software, like Yoast SEO, will ensure you have the keyword basics covered. However, these tools won’t ensure you’ve optimized your anchor text or the surrounding text. With some attention to the text surrounding your links, you can turn a basic paragraph into one that fully optimizes your links.

1. Use Focus Terms Near Your Links On-Page and Off-Site

Annotation text gives you the opportunity to refine how Google understands a webpage. To make the most of your internal links, employ annotation text throughout your site whenever you internally link to another one of your pages.

Additionally, when you’re implementing an off-site backlinking campaign, pay careful attention to what you include in the surrounding text.

2. Identify Focus Terms to Include

Like developing the best quality content, you want to determine the best Focus Terms to use. To do this, you will want to perform keyword research using SEO software. 

For this article, we used the dashboard’s SEO Content Assistant. Use your highest-importance terms in the text near your anchor text for the biggest impact. When selecting the most important terms, consider the topical relevance of the Focus Term in relation to the linked page. Even if a Focus Term is your most important term, do not include it if it’s not topically related to the linked page.

3. Incorporate Your Focus Terms Naturally

It likely won’t come as a surprise to you that Google has an aversion to keyword stuffing. This continues to ring true when it comes to the best quality annotation text. While you want to include Focus Terms, you must be strategic about it.

Consider including the most impactful Focus Terms in your annotation text. These are often the most semantically related to your anchor text.

4. Develop the Surrounding Paragraph

If you’re going to make the most of annotation text, you must consider where the anchor link falls within a piece of content. Furthermore, you must also develop the content surrounding the anchor link.

In order to do so, avoid:

  • Placing your backlink or most important internal link at the tail end of a section
  • Using a backlink or high-value internal link in a thin section
  • Linking to your site in a bullet point in an off-site article
  • Using a high-value link in a chart or table

Annotation Text = Better SEO

When it comes to the best content and the best links, annotation text provides context to your readers about your link. And Google likely uses your annotation text in addition to anchor text to better understand the page you’re linking to. This results in a better experience for your readers and more depth for Google’s algorithms to parse.

When you optimize both the anchor text and the annotation text of your off-site outbound links, you have a better chance of ranking at #1 on the SERP.

Any business owner investing in SEO wants to improve their website’s chances of appearing at the top of the search engine result pages. But how do you measure SEO success? 

Because SEO is not as immediate as other digital marketing channels like PPC or social media advertising, it can sometimes feel difficult to measure its impact. But it is incredibly important for site owners to understand how to evaluate and measure their SEO results in order to see what optimizations are working, what they need to improve upon, or what tactics they should retire. Also, understanding whether your SEO is effective is essential to quantifying the ROI of your digital marketing spend. 

Measuring the impact of your SEO optimizations is actually very simple if you have the right tools. You can also start measuring the impact of your optimizations within 1-2 weeks. Here’s how to accurately evaluate and measure your website’s SEO performance as a whole.

Why Site Owners Need to Measure their SEO Results

If your business advertises a service but then fails to deliver on your promises, you will likely lose clients. Everyone wants to see results, because results prove the value of your efforts. This is just one reason for site owners to be closely monitoring the impact of their website optimizations, but there are several more.

Organic Search is Often the Largest Source of Traffic

Google owns about 75% of the Internet’s search market. Additionally, Google averages over 40,000 web searches every second, which equates to a whopping 1.2 trillion searches every single year. All of these statistics go to show that your potential customers will most likely go to the Internet to find you first, making it incredibly important to ensure your website follows as many search engine optimization best practices as you can.

Because Google will most likely be your primary source of website traffic, taking the time to understand the details of how each page of your website is performing will give you the data you need to refine your SEO strategy and make improvements. The more refined your strategy gets, the more organic traffic that Google will drive to your website.

Good SEO Results in a Positive User Experience

Sure, you want to make sure your website shows up in search engines. But that is the first half of the battle. Once potential consumers land on your website, you’ll want to do everything in your power to keep them there and ultimately convert them. Tracking your SEO performance not only helps you understand how Google sees your content, but it gives you a lot of information about whether users are finding your content valuable. 

It is important to understand how Google works when deciding what to implement on your website. Google’s overall goal is to provide consumers with the best experience possible, so it has an army of web crawlers that index every website on the World Wide Web. Those web crawlers then figure out what each website is about and match its content to search queries in order to surface the most relevant websites for the user. 

With this in mind, it is important for a website to have relevant, informative content, not only for Google to find and index, but for your users to learn about you! With the right SEO metrics, site owners can understand which pages of their website users are finding the most valuable, and then use those pages as models for other pages on their site.

SEO Changes as Google Changes

SEO best practices don’t really tend to change drastically, but Google updates its ranking algorithm several times a year. Having a more granular understanding of your landing pages’ performance will help you ride those waves of Core Updates without drastic changes to your keyword rankings. If you are consistent in evaluating your SEO strategy, you will be able to more easily find those components that need to be changed or tweaked as Google improves on its algorithm from year to year.

How to Measure SEO Success: The 9 Definitive Factors

Keep the following factors in mind when you create a plan for measuring SEO success. You can access all of the below SEO metrics for your website in Google Search Console or Google Analytics.

Keyword Rankings

Keyword rankings are the basis of any SEO strategy, simply because they are what users type into the search engines to find your website! So it is safe to say that a website’s keyword rankings offer a great snapshot into how your website is performing within the search engine result pages. 

Usually, the first thing any website owner does when creating their website is complete keyword research. The same goes for measuring whether your digital marketing is working or not: keywords say it all. 

Keyword rankings map to search result pages as follows:

  • Ranking from 1-10 means that page is showing up on page 1. 
  • Ranking from 11-20 means that page is showing up on page 2. 
  • Ranking from 21-30 means that page is showing up on page 3.
  • And so on and so forth.
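With ten organic results per page, the mapping above is just a ceiling division. A one-line sketch (the ten-results-per-page default is the traditional layout and is an assumption here):

```python
import math

def serp_page(rank: int, results_per_page: int = 10) -> int:
    """Map an organic ranking position to the SERP page it appears on."""
    return math.ceil(rank / results_per_page)
```

So a keyword ranking at position 7 appears on page 1, position 11 on page 2, and position 25 on page 3.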

Under the “Search Performance” tab in Google Search Console, you are able to see the queries users made that allowed your website to show up in the results. It is a good practice to take a look at these every month, and make sure that you use this extra information to your advantage.

Maybe there are keywords you didn’t expect your landing page to rank for — that can be a good or a bad thing! If the search intent is related to your content, that’s great. But if not, you may need to rework the on-page content or HTML tags to communicate more clearly to Googlebot what the content is about.

When looking at keyword performance, it is important to look at growth over time. SEO is a long game, and it takes, on average, three to six months to really see significant movement after making a change. You can monitor the progress of certain keywords and phrases, and these insights will tell you which content is working and showing up in the SERPs, and which content may need a little extra TLC.

Having the most relevant content for what your users are looking for is essential to seeing all of your SEO metrics improve. There is a lot that goes into using Google Search Console to track keyword rankings, so take the time to familiarize yourself with the “Search Performance” section of this tool.

Organic Traffic

Organic traffic is the traffic that comes to your website solely from search engines. It is a powerful metric that can tell you if your website is not only performing well, but if it is worthwhile traffic that has a higher potential to convert. 

Consistency is key when it comes to SEO, so it is common to expect a consistent source of traffic, no matter your digital marketing efforts. However, if you notice a huge change, such as an influx of traffic, chances are Google may have discovered a few new pages and is starting to rank them within the SERPs. On the other hand, if you notice a big decrease in traffic, your website could have been penalized due to an algorithm change. 

Ideally, as your content earns more keyword rankings, your organic traffic will continue to grow. However, if your organic traffic stays flat despite additional keyword rankings, your landing pages may not be ranking high enough, or your page titles and meta descriptions may not be enticing users to click. Paying close attention to which pages are not only ranking but also getting clicks will help you hone in on the on-page elements that satisfy both Google and users.

Economic Value of Traffic

Not all traffic is created equally, and some traffic channels are more expensive than others. If you’ve ever run a PPC campaign on Google Ads, you know it can be expensive to generate clicks. Economic value of traffic helps you understand how much you would pay for those organic clicks if you had targeted them in a Google Ads campaign. 

CPC often correlates to conversion potential. If advertisers are willing to pay a high price to target that keyword in a PPC campaign, it’s likely because the users who click have high search intent and are likely to convert. If you secure organic rankings for keywords with high CPCs, you are getting those same high-quality clicks, but at a much cheaper cost. 
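
To make this concrete, here is a minimal Python sketch of the calculation described above. All keyword names, click counts, and CPC values are made up for illustration:

```python
# Hypothetical keyword data: monthly organic clicks and the CPC advertisers
# pay to target the same keyword in Google Ads (all numbers are illustrative).
keywords = [
    {"keyword": "seo reporting software", "organic_clicks": 320, "cpc": 4.50},
    {"keyword": "keyword rank tracker", "organic_clicks": 150, "cpc": 6.20},
    {"keyword": "what is domain authority", "organic_clicks": 900, "cpc": 1.10},
]

def economic_value(keywords):
    """Estimate what the same clicks would have cost in a PPC campaign."""
    return sum(k["organic_clicks"] * k["cpc"] for k in keywords)

print(f"${economic_value(keywords):,.2f}")  # -> $3,360.00
```

In other words, ranking organically for keywords that carry a high CPC is, in effect, ad spend you no longer need.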

As you evaluate the economic value of your traffic, it is important to crunch the numbers and ask yourself the following questions.

  • Which keywords are driving the most valuable traffic to your site? Can you work on improving the rankings for the landing pages that are ranking for those keywords?
  • Once users land on your page, are they converting? If not, you may need to improve that page with CTAs or lead capture forms. You can use your Google Analytics account to understand whether or not those organic clicks are actually converting on your website.

Yes, SEO requires a lot of work upfront and regular maintenance, but it is ultimately much cheaper than PPC and longer-lasting. You can incorporate these costs into your metrics and get a better grasp on the true value that organic traffic brings.

Conversions

A website’s ultimate goal is to make sure the traffic you’re getting actually converts. View your conversions through the lens of quality vs. quantity: while it is always great to have a bunch of new users on your site, if only one person out of 100 converts, there is a bigger problem to solve.

If this is the case, take a step back and look at your website through the lens of a consumer. What is getting in the way of converting? Could it be a lack of information, a slow website, problems checking out, or contact information that is hard to find? Just determining the cause of the problem can be the difference between a handful of conversions a quarter and hundreds of conversions a month.

Other Key Metrics to Measure your SEO Efforts

While the factors above are very important when you measure SEO success, the following metrics should also be taken into consideration.

Domain Authority (DA)

Domain Authority (DA) is a metric developed by the SEO software company Moz that predicts how likely a website is to rank on the search engine result pages. DA is an algorithmic score from 0 to 100, and while it is not a metric Google uses officially, keeping tabs on your own website’s score can show you how you stack up against other websites in your industry.

There are a lot of separate factors that go into compiling a DA score, including the number of linking root domains, the total number of backlinks, and the overall strength of a website’s backlink profile. Domain Authority is a great benchmarking tool and a good overall metric for your website’s ranking potential. You can check your domain authority score with our tool.

Click-Through Rate (CTR)

A website’s click-through rate measures the percentage of searchers who clicked through to your page from the search results. A high click-through rate indicates that your page title and meta description are properly optimized and enticing to potential customers. The higher the click-through rate, the better the chances that engaged, ready-to-convert customers are landing on your page.
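
As a quick illustration, CTR is simply clicks divided by impressions, expressed as a percentage. A minimal sketch with made-up Search Console numbers:

```python
def click_through_rate(clicks, impressions):
    """CTR as a percentage: the share of searchers who clicked your result."""
    if impressions == 0:
        return 0.0  # avoid division by zero for pages with no impressions
    return 100 * clicks / impressions

# Example: 45 clicks from 1,500 impressions.
print(click_through_rate(45, 1500))  # -> 3.0
```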

Bounce Rate

Bounce rate is the percentage of users who visited your website and left without interacting anywhere on the page. A high bounce rate indicates that users visited your website but didn’t find the information they were looking for, so they left. This can happen for a variety of reasons, from a poorly mapped site structure to content that fails to answer users’ questions before they get a chance to ask them.

To prevent a high bounce rate, do what you can to create content that grabs potential customers the minute they land on your page. Use multimedia and graphics to add some extra visual appeal.

Scroll Depth

Scroll depth is exactly what it sounds like: it measures how far down visitors scroll on each webpage. You can monitor each page’s scroll depth to see whether users actually reach your most important content and read what you have to say.

One way to encourage a long scroll depth is to add pictures, graphics, and larger headlines, and to break up large blocks of text. The more pizazz you can add to the page, the better!

Backlink Profile

Google views a link to your website as a measure of authority. Every time another website links to yours, Google sees it as a vote of trust and a signal that your website should show up in search results.

A healthy backlink profile includes backlinks from a variety of websites with differing DA scores, as this variety suggests your links were acquired naturally and organically. It is an SEO best practice to earn links from other websites, and a diverse backlink profile shows that multiple websites see you as an authority in your space.

How to Evaluate SEO Progress Over Time

Again, the most important part of SEO measurement is what the data tells you about what you are doing right and wrong. If you don’t take the time to measure your SEO efforts in as much detail as possible, you will never have an accurate picture or understanding of their value.

It is normal for your SEO strategy and goals to change as your business ebbs and flows. But as long as you stay consistent in your SEO efforts, your website will become a reflection of your business online, and you will be able to build a strong presence in the digital world.

You do not have to do your entire SEO strategy by yourself. Our SEO experts are here to understand your business needs and goals, and our team is here to see your business thrive online. Give us a call and we’ll get started creating a comprehensive SEO strategy to bring your business to new heights.

SEO reporting software is indispensable for keeping your SEO campaigns organized and on track. 

The ideal SEO reporting software allows you to provide clients with clear, concise, and visually appealing reports. However, knowing which software does this best can be a time-consuming challenge.

Additionally, you want to invest in SEO software that will grow with your agency or business while supporting your search engine optimization needs. This includes providing you with real-time insight, automatically tracking key performance indicators (KPIs), and graphing analytics. So, with so many SEO reporting tools available, how can you decide which will work for you?

In this article, we will cover what to look for when testing and choosing an SEO reporting tool, what reports to provide your clients, and how top-rated software compares.

The Importance of SEO Reporting for Agencies & Freelancers

Search engine optimization isn’t the simplest service to explain and demonstrate to non-SEO professionals. This can present a hurdle when working with clients, especially those without much SEO knowledge. Being able to provide these clients with easy-to-understand reports is vital.

SEO reports allow you to:

  • Provide the reasoning behind your SEO strategy
  • Present evidence of short-term and long-term results
  • Explain upcoming shifts to your strategy
  • Provide evidence for a rate increase or contract renewal
  • Present a monetary value of the organic traffic your efforts have generated
  • Keep your clients up to date
  • Increase client confidence in your SEO skills

When to Present SEO Reports to Your Clients & What KPIs to Include

Just as college students can anticipate midterm and final grades, it’s important to establish when you will present your clients with SEO progress reports. In fact, most agencies outline in their contracts when clients should expect updates. Below are the best times to create reports for your SEO campaigns and the metrics to provide at each.

Pitch to Potential Clients with Preliminary Reports

Presenting reports when pitching to a potential client is a smart move. It’s difficult to beat a data-driven pitch when attempting to secure a new client. But how much data should you provide during the customer acquisition phase?

The key to an SEO pitch report is to demonstrate your capabilities and the client’s SEO potential without spending excessive time on research. To achieve this, stick to providing the potential client with the basics.

A preliminary report for a potential client should include:

  • Their domain rating
  • Their competition’s domain ratings
  • A page speed audit (desktop & mobile)
  • 5 keywords they could potentially rank for

Providing these shows the potential client that you’re willing to go above and beyond.

SEO Case Study Reports: Pitch without Pitching

Not every SEO or agency has the time or resources to create custom reports for all of their pitches. Additionally, consumers now do a lot of research on their own before reaching out. This is when high-quality case studies can benefit your business. 

Every agency should have case studies on hand or available to point to on its website. 

When creating the visualizations and charts for your case studies, keep in mind that they need to be self-explanatory and attractive. You also want to highlight your greatest achievements and tell the story of how you improved your client’s business through your services.

When presenting a case study, you can essentially repurpose a year-end report from several of your most successful campaigns.

On-Boarding SEO Reports

Once you’ve landed a client, it’s important to record the client’s current SEO standing since this will be your starting point. A broad analysis will give you the data you need to create a smart, actionable SEO strategy tailored specifically to your client.

On-boarding SEO reports should include:

  • Domain authority
  • Keyword Positions
  • Traffic sources by country
  • Current keyword rankings
  • Organic website traffic
  • Estimated Value of Traffic (EVOT)
  • Number of referring backlinks
  • Indexing speed
  • Overall technical score
  • Total site issues
  • Backlink profile

After outlining your client’s current SEO status, it’s easier to create a strategy as well as measure growth. From here, you can go more in-depth in the areas where your specific client needs more work, like identifying and cleaning up toxic backlinks or finding opportunities to quickly increase keyword rankings with new or refreshed content.

Active Campaign Reports

The majority of clients want to receive updates for active SEO campaigns. These can be monthly, quarterly, semi-annual, or annual. For these, report templates and report automation will save you time and energy while creating uniformity.

Instead of wrestling with spreadsheets and building a custom report each month, you can use SEO software like GSC Insights with integrated real-time data, KPI tracking, and exportable reports or report templates.

GSC Insights improves upon the data made available through Google Search Console. This software analyzes Google Search Console’s metrics to create easy-to-read and insightful SEO reports. This allows you to create beautiful growth visualizations for your clients while making it easy for you to notice fluctuations in traffic. 

GSC Insights also allows you to customize your reports by date range, so you can demonstrate changes month-over-month (MoM) or quarter-over-quarter.
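
Behind any MoM comparison is simple percentage math. Here is a minimal sketch using made-up click counts rather than real GSC Insights output:

```python
def mom_change(previous, current):
    """Month-over-month percentage change for a metric such as organic clicks."""
    if previous == 0:
        return float("inf")  # no baseline month to compare against
    return 100 * (current - previous) / previous

# Example: 1,840 organic clicks last month vs. 2,300 this month.
print(mom_change(1840, 2300))  # -> 25.0
```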

SEO reports for clients’ active campaigns should include:

  • Keyword Position History, which shows the site’s overall keyword positions

Note that this graph marks major Google algorithm updates. This can be quite useful should your client incur a drop in traffic as a result of these updates.

  • Change in organic traffic
  • Growth in search impressions
  • Change in total keywords the site is ranking for
  • Overall rank change

You can also include:

  • Top performing keywords
  • Top performing pages
  • The economic value of the site’s organic traffic
  • Keyword movements

Depending on your contract, you can also utilize other dashboard tools to create Technical SEO reports for:

  • Backlinks
  • Broken links
  • Indexing Speed
  • Page Speed
  • Overall Content Score
  • Inaccessible and orphaned pages

With GSC Insights, you can export a complete report that can be easily inserted into an email, Google Doc, or another software. You can also export a wide array of SEO performance analytics to CSVs. All of these file formats can be easily exported to PDF or printed to PDF.

As-Needed Reports

Sometimes there’s a massive change in the SEO ecosystem. These changes could be a competitor dramatically raising PPC investment and therefore ad prices, a Google algorithm update requiring technical SEO work, or regulatory changes in the client’s industry that impact social media.

When these disruptions happen, a standard weekly or monthly report won’t be enough. A full site audit would be too much. These client reports need to be focused on the SEO strategy that specifically addresses this new challenge.

Contract-Renewals & End of Campaign Recaps

As contracts come up for renewal, or as seasonal, temporary, or other campaigns come to an end, providing your clients with a clear report can have a major impact. A well-designed report can result in a successful renewal, a service expansion, or a positive review and recommendation.

In these cases, a post-campaign audit can highlight not just your ongoing work but also show the client their ROI for the campaign. For these reports, you’ll want to focus on the value you provided.

Campaign metrics to highlight in a final report include:

  • The total economic value of the traffic you generated
  • The overall increase in organic search traffic
  • The total growth in rankings
  • Any increase in conversions that the client has shared
  • Increases in impressions and CTR

We recommend also including a brief written summary of the campaign’s strategy and results along with how you could provide for the client’s SEO needs in the future.

This report needs to show how you took the client to the next level through your specific services.

How to Produce SEO Reports (Including PDFs)

Creating SEO reports for clients is only as easy as the software you use allows. For example, with an SEO tracking tool that does not let you export data, you can anticipate spending hours each month manually transferring numbers into spreadsheets. With a platform like ours, you can simply hit the “Export to PNG” or “Export to CSV” button to create an automated report. 

Here, we will cover how to create a variety of active campaign reports and customized reports using GSC Insights:

How to Create SEO Reports with GSC Insights

This report will include all of the data presented on the GSC Insights dashboard:

  1. First, navigate to GSC Insights and simply click the Export to PNG button.
  2. After you’ve exported the PNG, you can insert it into a Word Doc, Google Doc, PowerPoint, or other software.

How to Create SEO Reports in PDF for Clients

  1. Follow Step 1 above and export the GSC Insights dashboard to PNG.
  2. Next, open Google Docs.
  3. Then, create a new blank document.
  4. Go to Insert > Image > Upload from Computer.
  5. Navigate to the folder where the PNG is saved (most often your Downloads folder).
  6. After the PNG is loaded, you can crop it to the metrics you want to highlight. You may need to insert it multiple times.
  7. To create a PDF, go to File > Download > PDF Document (.pdf).

How to Turn CSVs or Spreadsheet SEO Reports into PDFs

  1. Navigate to the data set you would like to export.
  2. Click on the Export to CSV button.
  3. Once the file is downloaded, drag and drop it into Google Drive, then open it with Google Sheets.
  4. In Google Sheets, select File > Download > PDF Document (.pdf).

What to look for in SEO Reporting Software for Clients

When your SEO software creates your reports, you’re able to maximize your productivity. However, not all SEO tools have these capabilities, so when choosing the right SEO software for your needs, keep these criteria in mind.

1. Accurate data from trusted sources

SEO reporting tools use different data sources. Ahrefs touts the extensive web crawls that make up the source of its information. Google Data Studio, naturally, pulls from Google Analytics and Google Search Console. Our suite draws data from a wide array of trusted sources, including Google Search Console.

2. Real-Time Data

Working with the latest data makes a major difference when it comes to SEO. Not only can you react and plan better when your metrics are up to date, but you can also share the most recent and accurate data with your clients.

3. Site Crawls

It’s nearly impossible to perform many SEO tasks without the ability to access and use indexing data gathered from crawls. While some content editor tools, like Clearscope, leave this feature out, you will want another tool that makes up for it.

4. Clear Visualizations

To really deliver the impact of your work, you need to show your results in a way that speaks to your clients. SEO reporting software should be able to convert the raw data, whether it comes from Google or from proprietary research, into engaging charts and graphs that non-SEOs understand.

5. Fresh Insights into Performance

Sometimes it takes a fresh insight to highlight your hard work. Therefore, you’ll want to look into how each SEO software tool allows you to approach the numbers in a different way.

For example, GSC Insights offers Page Groupings, which allows you to see how certain groups of pages perform. Instead of manually compiling data for, say, blog posts vs static pages, GSC Insights lets you group pages together to get impression and traffic data for those aggregates, saving you time.

6. Local SEO Data

If your clients need local SEO, then you’ll want to keep this in mind when choosing your SEO tools, including your reporting tools. Your clients may appreciate being able to compare a list of keywords for each city or state, or seeing SERPs for distinct geographic areas.

7. Formatting Options

As reports are used for different purposes, it’s helpful to have a reporting tool that supports multiple reporting formats. PDF reports are great for emails or in-person meetings. CSVs are a best practice when you’re combining information from multiple sources for a custom report. 

Exporting to CSV for Google Sheets can be extremely convenient as well. Check that your provider offers exports in formats that you need.

8. Multiple Client & Campaign Profiles

Finally, SEOs need to be able to monitor the progress of all of their clients on a regular basis. Having one dashboard to access all of your clients is a must. Ideally, you can toggle between client accounts to easily produce your reports at designated intervals.

Consider White Label SEO Reporting Software for Clients

As a digital marketer, you want to spend time with clients or working on projects, not creating SEO reports. White-label reports can save you time. White label SEO reports are often automated reports that export to a PDF format. These effective SEO reports have great scalability and are perfect for clients with a basic understanding of good SEO.

Comparing Top SEO Tools for Reporting and Tracking

You will find a wide variety of SEO tools out there, each with its own capabilities. However, you will want to be sure you invest in the one that provides you with unique insight as well as report options that clients will appreciate. Here’s how the most popular SEO software options stack up:

1. Our Dashboard

Our dashboard is an all-in-one SEO suite that allows agencies, freelancers, and site owners to optimize their sites based on real-time data and analytics. 

What sets our dashboard apart from its competitors is that it has a tool or feature for nearly every aspect of SEO, including:

  • Site auditing tools
  • Keyword research tools & data
  • A content editor
  • A DA checker
  • Organic traffic data
  • Backlink data
  • A schema creator
  • A local search results tool
  • Social media tool (coming soon)

Reporting options: We offer a wide array of reporting options, with an export feature for CSVs and PNGs. You can also easily track your SEO campaigns alongside your SEO performance metrics with the Site Events feature.

2. SEMrush

SEMrush is a popular SEO software that provides users with a collection of tools and features for technical SEO and content SEO. SEMrush provides easy exporting to PDF and spreadsheets for most of its reports.

However, the biggest drawback of this platform is users must pay a hefty price for access to the full range of tools.

Reporting options: SEMrush allows you to export to PDF, XLSX, or CSV for more data.

3. Ahrefs

Like us, Ahrefs provides all of its users with access to its full suite of tools. Unlike us, Ahrefs doesn’t have as many capabilities; notably, it does not provide content creation tools or a local SEO tool.

Ahrefs is one of the few SEO software options that doesn’t offer a free trial. 

Reports: Ahrefs allows you to export most reports to CSVs. Their reporting options include a quick-view, full-report, or custom report.

4. Moz Pro

When it comes to SEO, most people are familiar with Moz Pro, and often for good reason. Moz Pro was one of the first extremely popular SEO software options. Additionally, Moz Pro’s dashboard is clean and easy to use.

However, Moz Pro does not support SEO content creation and lacks several of the other bells and whistles you will find in other suites. This means if you have a full-service SEO agency, you will need to invest in multiple SEO platforms.

Reporting options: (Not all available at the base-level subscription) Custom reports, templates, branded reports, automated reports, PDFs, CSVs

SEO Reporting Software for Clients Makes a Difference

As an SEO agency or professional, your time is one of your most valuable assets. When you choose an SEO tool, it’s vital to review its exporting and report options. Keep in mind that you want the data to tell the story of your SEO skills seamlessly. 

While there are many SEO tools out there, such as Ahrefs and Moz, we offer the greatest capabilities for all aspects of SEO. However, the easiest way to determine which software is best for your company is to choose a few and try them out.

We all know the importance of keywords to a business’s search engine optimization (SEO) efforts. Ranking for valuable keywords in your industry is a sustainable, affordable marketing strategy to boost your brand’s visibility online and drive leads and sales. The process of keyword tracking helps digital marketers measure whether their SEO strategy is succeeding.

Here is a breakdown of the importance of keyword tracking to your SEO strategy, how to choose keyword tracking software, and some of the top keyword tracking tools used by SEO professionals and digital marketers today.

What is Keyword Tracking?

Keyword tracking is the process of using software tools to monitor the organic keywords your website ranks for in search engine results. It also involves tracking the ranking positions for those keywords and monitoring how those rankings change over time.  

But keyword tracking is not just about watching your keywords and hoping something good happens. Instead, the right keyword tracking tools will provide important data and metrics that offer insights into how to grow and scale your web presence in Google. 

Your ideal goal should be to rank in the number one position of the search engine results pages (SERPs) for the keywords your target audience is using. However, certain keywords are more competitive than others, and other site owners will try to improve their content in order to outrank your brand.

As a result, you should constantly monitor your ranking fluctuations in order to respond when your ranking positions decrease or when your content stops being promoted entirely.

Why Keyword Tracking Matters to Successful SEO

To understand exactly why keyword tracking is an important factor within a successful SEO campaign, it’s critical to understand how the SERPs work, how target keywords tie into rankings, and the most important keyword metrics you’ll want to monitor.

How Do Search Engines Work?

Search engines work by crawling all the web pages on the Internet and then reading, categorizing, and indexing them into a large database. While there are a ton of different technical elements to a website that can make it easier for the search engine bots to find you, the keywords you include on your web pages help the bots understand the content on your web pages and whether or not to promote them in the SERPs.

Similarly, users rely on keywords when searching for new content or information about products and services. But users don’t always search the same way. There are a variety of keywords they might use to find relevant content. If your web pages provide high-quality information or answers to the questions users are asking, Google is more likely to promote it to users. 

How Do Keywords Tie Into SERP Rankings?

As mentioned before, your keywords are ideas and topics that define what your content and web pages are all about. They are a fundamental part of SEO. When used properly, keywords drive organic traffic to your website and can make or break your business. 

The more often that you create useful, informative content, the more opportunities you’ll have to get your brand name in front of potential customers by showing up in Google searches for relevant keyword queries.

Common Keyword Metrics

There are a few common keyword metrics that SEOs or site owners use to determine which keywords they want to rank for. They include: 

  • Search Volume: This is the number of monthly searches a keyword receives
  • CPC: This is the price advertisers pay per click to target the keyword in Google Ads. High CPCs are a sign of strong conversion potential.
  • Keyword Difficulty: The competitive landscape of the keyword. Higher keyword difficulty scores mean more competition to rank.
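
These three metrics are often weighed together when prioritizing keywords. The scoring formula below is just one possible heuristic, not a standard industry metric, and all keyword data is invented for illustration:

```python
# Hypothetical candidate keywords with search volume, CPC, and difficulty.
candidates = [
    {"keyword": "buy running shoes", "volume": 12000, "cpc": 2.80, "difficulty": 78},
    {"keyword": "best trail running shoes", "volume": 4400, "cpc": 1.90, "difficulty": 41},
    {"keyword": "how to lace running shoes", "volume": 1900, "cpc": 0.30, "difficulty": 12},
]

def opportunity_score(kw):
    """Reward volume and CPC; discount by difficulty (0-100 scale)."""
    return kw["volume"] * kw["cpc"] * (100 - kw["difficulty"]) / 100

# Sort the candidate keywords from highest to lowest opportunity.
for kw in sorted(candidates, key=opportunity_score, reverse=True):
    print(kw["keyword"], round(opportunity_score(kw), 1))
```

A formula like this surfaces keywords whose traffic value outweighs their competitiveness, rather than simply chasing the highest search volume.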

Site owners often aim to rank for keywords that have higher search volume and stronger conversion potential. A smart SEO strategy will also include keywords that a brand can realistically rank for so they can start ranking and driving clicks sooner.

Types of Keyword Search Intent

In addition to the above metrics, search intent also plays into the value of keywords. Stronger search intent can mean a greater need for rank tracking.

The four ways that SEOs usually categorize search intent are:

  • Informational: Searches where the user is looking for information.
  • Navigational: Searches where the user is looking for a specific website or web page.
  • Transactional: Searches where the user is looking to make a purchase.
  • Commercial: Searches where the user is looking for information that will help them make a purchase decision.

Keyword searches that fall into the latter two categories of search intent are often more competitive because they have higher economic value. For businesses that use SEO to grow sales and revenue, these are the keywords they will most likely want to track.

What Metrics Should I Monitor When Rank Tracking?

Once you optimize your content for your target keywords and publish it on your site, there are four metrics to keep an eye on when rank tracking. They include:

  • Total Keywords: This is the total number of different keywords that a web page ranks for in the SERPs
  • Current Position: The ranking position of a web page for a specific keyword query
  • Average Position: This is the average ranking position of a web page across all of the keywords that it ranks for in Google 
  • Rank Change: This is the number of positions gained or lost since the previous day or the previous rankings update
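
Given a simple position history, these metrics can be computed directly. Here is a sketch with hypothetical daily positions (lower numbers are better, most recent entry last):

```python
# Hypothetical daily ranking positions per keyword (most recent last).
history = {
    "seo reporting software": [9, 8, 8, 6],
    "white label seo reports": [15, 14, 12, 11],
    "seo report template": [4, 4, 5, 4],
}

total_keywords = len(history)

def rank_change(positions):
    """Positions gained (+) or lost (-) since the previous update."""
    return positions[-2] - positions[-1]  # e.g. moving from 8 to 6 is +2

# Average of each keyword's current (most recent) position.
average_position = sum(p[-1] for p in history.values()) / total_keywords

print(total_keywords, average_position, rank_change(history["seo reporting software"]))
# -> 3 7.0 2
```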

By using all of these factors to monitor and track your keywords, you’ll be better equipped to hone in on your strategy and promote successful SEO.

How Often Should You Track Your Keyword Rankings?

Arguably, keywords are the most volatile of any marketing element and change the most often. So much so that a keyword’s ranking can shift within hours or even minutes, which makes it necessary to monitor them often.

But each business is different. The word “often” can be subjective. There are a lot of factors to consider when deciding how to monitor your keyword rankings. Our rule of thumb is the more organic clicks that a keyword drives to your website, the more frequently you should check your ranking position for that keyword.

Why? Because for some keywords, a change in rank position can mean hundreds, if not thousands, of fewer clicks to your website.

For smaller businesses and those just getting off the ground in terms of content production, we recommend first giving your content time to be promoted on Google. On average, it can take Google up to 90 days to find, crawl, and index your web pages. If you are only starting out with your content implementation, have some patience. 

Until Google has found your pages, there won’t be much movement in your rankings, so sit tight. Give the search engine time to crawl and index, and once you have been found, we recommend checking your rankings every week. 

But for larger, more established websites, keyword rankings should be checked daily. When a keyword is high-performing (meaning it has strong conversion potential), other site owners will dedicate a lot of time to optimizing their content to rank on page 1. 

Just because you earn a top ranking spot doesn’t mean you’ll stay there.

Getting Started With Keyword Rank Tracking

Just as you wouldn’t jump right into any other part of your content marketing strategy, you’ll need to have a game plan when it comes to your keyword rank tracking. Here are some steps that will help you get started.

1. Determine Your Most Important Keywords

The first step is to decide what keywords are the most important for your business. That list can include keywords driving traffic to your website right now and those keywords that you hope to rank for in the future.

Are you earning lots of organic clicks from a specific keyword? Do you earn more conversions from specific queries? If the answer is yes, you’ll want to make sure your content stays in top positions for those keyword terms.

2. Identify Key Position Opportunities

Although the ultimate goal is always to rank in position 1, this is not always realistic. This is especially true when vying for the more competitive keywords in your niche. But because higher ranking positions have higher CTRs, just one position improvement for a keyword can mean loads of additional traffic to your website. 

So in addition to the important keywords, you’ll also want to identify keyword position improvements that present huge organic traffic opportunities for your website. You’ll have to improve the quality of your content and page experience of the pages targeting those keywords to see changes in position. But, you’ll want to have those keywords clearly identified so you can include them in your keyword tracking. 

This is particularly true for SEOs or digital marketers providing SEO services to clients. If one keyword position change produces huge influxes of traffic for a client, you’ll want to make sure you show it in your keyword tracking reports.
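To see why a single position improvement matters, you can roughly estimate the traffic impact before choosing which keywords to chase. The sketch below uses hypothetical average CTRs per position; the figures are illustrative, not an official benchmark:

```python
# Illustrative average CTRs by rank position (hypothetical figures).
AVG_CTR = {1: 0.285, 2: 0.15, 3: 0.11, 4: 0.08, 5: 0.07,
           6: 0.05, 7: 0.04, 8: 0.03, 9: 0.03, 10: 0.03}

def estimated_monthly_clicks(search_volume: int, position: int) -> int:
    """Rough expected clicks: monthly searches times average CTR for that position."""
    return round(search_volume * AVG_CTR.get(position, 0.0))

# Moving a 5,000-search/month keyword from position 4 to position 2:
gain = estimated_monthly_clicks(5000, 2) - estimated_monthly_clicks(5000, 4)
print(gain)  # 350 additional clicks per month under these assumptions
```

Even with rough numbers, an estimate like this helps you rank opportunities by expected payoff rather than by position alone.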

3. Choose a Keyword Rank Tracker

What rank tracker you choose will depend on the features that are most helpful to your own SEO strategy. Not all keyword rank tracker tools have the same accuracy or features. That’s why you’ll want to make sure that the software you ultimately choose provides the ranking data you need to stay on top of your keyword rankings and positions.

What Makes a Quality Keyword Tracking Software?

Remember, you want your keyword tracking software to work for you. For the best results, the rank tracker you choose should include the metrics and functionality you need in an easy-to-find place. The tool should also provide easily comprehensible data.

When looking for a rank tracker tool, you’ll want to ask yourself the following questions to ensure it is a good fit for your needs:

  • Does this software track 100% of your site’s keyword rankings?
  • How often are the keyword positions updated? 
  • Do I have to pay extra for tracking more keywords? 
  • Does the software include historical data so I can see all keyword positions since Google first indexed pages on my website?
  • Does the software allow me to generate keyword reports, and are they helpful?
  • Are there metrics to help me estimate how my rank changes will impact my organic traffic?
  • Can I track rankings for different countries or geographic areas?
  • How many keywords can I track at once?
  • Can I monitor my direct competitors’ keyword rankings and performance?
  • Can I set up alerts for my most important keywords?

Most Important Rank Tracking Features

You might not be able to answer yes to all of those questions based on your current keyword tracking software. 

However, there are some absolute musts that you will want to prioritize in order to do keyword rank tracking in the most impactful way.

1. Number of Keywords

For small businesses with only one primary service or product, you may not need a rank tracking software that provides position tracking for every single keyword your website ranks for.

But for enterprise organizations or digital marketing agencies, having 100% of your keyword data can be the difference between proving the value of SEO efforts to key stakeholders or clients.

Not all keyword tracking software provides comprehensive keyword tracking, and some tools charge higher prices as you add keywords. Make sure you choose software that provides access to as many keywords as possible.

2. Daily Position Updates

Imagine this experience: your keyword tracking software states your web page is in position 4 for a high-value keyword in your industry. But then, when you type that keyword into Google, your page shows up in position 11.

That’s likely because your current keyword tracking software doesn’t provide daily position updates. Some tools only update daily for keywords with higher search volumes, meaning position data can be weeks, and sometimes even months, out of date.

If you are in a highly competitive market or a niche industry with keywords with lower search volume, delayed position updates can be very problematic and prevent you from responding to rankings drops promptly and effectively.
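The kind of prompt response that daily updates make possible boils down to comparing two snapshots and flagging drops. A minimal sketch, with hypothetical keyword data:

```python
# Hypothetical daily position snapshots: keyword -> rank position.
yesterday = {"rank tracker": 3, "organic ctr": 7, "seo tips": 12}
today     = {"rank tracker": 3, "organic ctr": 11, "seo tips": 12}

def ranking_drops(before, after, threshold=2):
    """Keywords that fell by more than `threshold` positions between snapshots."""
    return {k: (before[k], after[k]) for k in before
            if k in after and after[k] - before[k] > threshold}

print(ranking_drops(yesterday, today))  # {'organic ctr': (7, 11)}
```

With weekly or monthly updates, a drop like the one above could cost you traffic for days before you even notice it.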

3. SERP Scraping or Google Data

The above two features are predicated on one primary factor: Whether your rank tracking software uses SERP scraping to gather data or uses Google’s API. 

Some keyword rank tracking tools use bots to scrape the SERPs and gather position information. Most of these tools don’t have the capability to scrape the SERPs for every keyword every day.

Google, however, has the most complete keyword dataset in the world. If your keyword tracking software is using Google data, that means you can see more keywords and see changes on a daily basis.

Keyword Tracking Tools for Site Owners

Based on the above features, here are three different rank and position tracking tools that meet most of the above criteria. They vary from completely free to a higher price point. But they offer a range of features that empower site owners to make the most of keyword rank tracking.

Google Search Console

Google Search Console is the original keyword tracking software. This free platform gives site owners a nice introduction to keyword tracking and its importance to SEO. 

In addition to keyword tracking, this platform helps site owners troubleshoot page experience issues. You can also use it to monitor mobile usability, submit disavow files, and see your backlinks.

Best Features of Google Search Console

  • The world’s largest SEO dataset: No other rank tracking platform has as much keyword data as Google. If you want to see every single keyword your website ranks for, this is the place to do it.
  • Impressions Data: An impression happens every time a user sees your SERP result, even if they don’t click on it. Not all keyword tracking tools include impressions data, but impressions can be a useful metric for benchmarking your SEO strategy.
  • Daily Updates: Google updates keyword rankings data for your website on a daily basis. If there are any rankings changes that might significantly impact your site traffic, you can know almost immediately with this free platform.

Unfortunately, Google’s keyword tracking platform isn’t the most user friendly and is pretty bare bones in terms of UI/UX. It is free, after all, so it’s no surprise that there are no advanced features here. 

But in terms of accuracy and real-time position information, nothing can beat the power of Google’s platform.

GSC Insights on the Dashboard

Ideal for enterprise websites and agencies, GSC Insights is one of the most comprehensive keyword tracking platforms. Because the tool is built on Google’s API, it provides the same daily updates and data as Google Search Console, but with more advanced features and data visualizations.

By linking GSC Insights with your Google Search Console account, you can get a fresh look at your keyword ranking data. The data visualizations that GSC Insights includes help you not only track your keyword rankings, but also make insightful decisions for improvement. 

In addition to daily updates and full impressions data, GSC Insights also has these additional features.

Best Features of GSC Insights

  • Total Keyword History: See the total keywords your website has ranked for over time. Although Google Search Console contains all of your data, it doesn’t chart total keyword rankings, only total clicks and impressions.
  • Overall Rank Change: Shows you the total number of position changes across all of your keywords. This is a great feature to see the impact of sitewide optimizations like link building or page experience improvements.
  • Full Historical Data: GSC Insights charts all of your historical keyword data for a web page in one easily viewable chart.
  • Traffic and CTR by Rank Position: This feature shows average CTR and total organic traffic by position (all of the keywords where your web pages rank in position 1, position 2, and so on), helping site owners identify which keywords have the most untapped organic traffic.
  • Keyword Cannibalization: Sometimes, you may have multiple web pages that rank for the same keyword. This happens when Google doesn’t know which page to promote. This feature identifies which pages are cannibalizing each other so you can resolve it.

For those who really want to leverage the power of keyword data to improve their SEO, GSC Insights is the ideal platform for position tracking. With various subscription levels, site owners and digital marketers can find the level that fits best for their team.

Moz Rank Tracker

Moz’s Rank Tracker offers a very simple and straightforward keyword rank tracking platform. For non-SEOs or beginners to keyword tracking, Rank Tracker offers only the high-level metrics you absolutely need to understand whether your SEO performance is improving or not.

However, because Moz relies on SERP scraping to gather keyword ranking data, it only offers 300 tracked keywords and charges extra for each additional 200. As a result, this rank tracking software is a better option for smaller sites or local businesses with less robust content or keyword goals.

Additional Features of Moz Rank Tracker:

  • Easy On-Boarding: Get started with Rank Tracker quickly thanks to an easy on-boarding process.
  • Multiple Search Engines: If your brand also aims to perform well in search engines besides Google, Moz Rank Tracker provides keyword rankings and position data for Bing, Yahoo, and Google Mobile as well.
  • On-Demand Daily Keyword Checks: For an additional cost, Rank Tracker provides on-demand query checks so you can see the positions for the keywords that are most important to your strategy right now.

Conclusion

Every site owner executing an SEO strategy should be engaging in regular keyword tracking. Staying on top of keyword positions can be the difference between maintaining your organic web traffic or losing out on significant sales or lead generation.

All of the above keyword tracking tools have trial versions so you can see which one is the right fit for your business or SEO strategy. When done well, keyword tracking can help you achieve more success and maintain your top positions well into the future.

We all know how important search engine optimization is, but ranking on the first page of the SERPs can only get your business so far. Getting organic clicks is how you get the most value from SEO, so trying to improve your organic click-through rates (CTR) will always be a worthwhile effort.

There is a strong relationship between SERP position and the overall amount of clicks a SERP result receives. Understanding that relationship, and doing your best to use it to your advantage, can be a surefire way to increase your overall organic traffic from search engines.

What is Organic Click-Through-Rate (CTR)?

To completely understand the definition of organic click-through-rate, it is imperative to know the following SEO terms.

  • SERP – Also known as a search engine results page. There are typically 10 organic listings on every SERP. Everything we discuss in this post will focus on improving CTRs for organic listings that appear below paid Google Ads.
  • Web Searchers – These are potential customers who are using Google to find you! They will enter a keyword or phrase into Google’s search bar and will be shown relevant SERP results depending on what they search.
  • Impressions – This is the number of times your website shows up in the SERPs. An impression occurs whenever your website appears for any relevant keyword phrase.
  • Clicks – This is when a consumer actually clicks on your result and is brought to a landing page.

That brings us to our definition:

Organic Click-Through-Rate is the number of web searchers who click on your result in the SERPs divided by the number of impressions your result receives.
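Put as a formula, CTR = clicks ÷ impressions × 100. A minimal sketch of that calculation (the click and impression counts below are made up for illustration):

```python
def organic_ctr(clicks: int, impressions: int) -> float:
    """Organic CTR as a percentage: clicks divided by impressions."""
    if impressions == 0:
        return 0.0  # no impressions means no measurable CTR
    return round(clicks / impressions * 100, 2)

# A result seen 4,000 times in the SERPs that earned 140 clicks:
print(organic_ctr(140, 4000))  # 3.5
```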

So with that in mind, a higher CTR can mean all the difference between capturing traffic and having your potential consumers click on your competitor’s website. The name of the game is to shoot for a high CTR and to constantly improve upon it.

Improving your website’s organic CTR isn’t the easiest thing to do. In today’s digital world, the quicker you can capture a consumer, no matter the stage of the marketing funnel they are in, the better. But, since Google has incorporated paid search ads above organic listings, search ads earn over 40% of clicks.

This means you have to do what you can to squeeze as much organic traffic as possible, and a high CTR will help you achieve this.

Is Click-Through Rate a Ranking Factor?

Whether click-through rate is an official ranking factor has been debated in the SEO industry for years.

There are plenty of CTR experiments out there that have tested out the relationship between CTR and ranking. Many have shown a strong correlation.

If a user clicks on your SERP result over a competitor that ranks higher than yours, it shows that your result is highly relevant or appealing to the user. A strong click-through rate shows Google that your result is meeting the needs and desires of users.

Regardless, it takes clicks to get customers. There is no downside to improving your CTR across all of your web pages.

What is the Relationship Between CTR and Rank Position?

Simply speaking, a higher CTR and a higher rank position go hand in hand. According to a 2020 study, the first organic search result has an average CTR of a whopping 28.5%.

The study analyzed over 80 million keywords and billions of search results to see how consumers engage with the SERPs. As a whole, they found that traffic and CTR declined significantly after the first organic search result.

All of this information isn’t particularly shocking. Most digital marketers are familiar with the relationship between SERP position and the overall number of clicks.

But what was most interesting about this study is that it examined how much the traffic generated by websites on page one varies by position. For example, while a result in position 2 would have a 15% CTR, that website will generate three times more clicks than the website in the sixth position.

So what does this mean? Well, that the relationship between CTR and rank position is exponential, and even if your website shows up on page 1 of the SERPs, this does not guarantee you will experience a drastic influx of traffic.

While online visibility is great, focus your efforts on increasing the number of clicks to your website by trying to rank in the first and second positions, rather than the easier-to-attain positions of 8, 9, and 10.

Key Facts About CTR & Ranking Position

Here are some specifics about CTR and ranking position.

  • Websites that show up in the first position are 10 times more likely to receive a click
  • The CTR for positions 7-10 is largely the same and has little impact for brands
  • Only 0.78% of all Google users click on a website located on the second page.
  • According to Moz, 72% of Google users prefer to click only on the organic search results, instead of banner ads and PPC ads. This results in a higher click-through rate for all organic searches.
  • In the same study, 75% of respondents reported that they instinctively clicked on the first two results, making the click-through rate of positions 1 & 2 the highest on the SERPs, including paid search ads.
  • URLs that include a keyword have a 45% higher CTR than URLs that do not contain any keywords.
  • Title tags that have a question in them boast a 14.1% higher CTR, and titles with a strong emotional statement, either positive or negative, improved CTR by at least 7%.

All of this evidence shows how important ranking position is to improve CTR and organic traffic. But to understand CTR and organic traffic for your website more specifically, turn to the right keyword tracking tools.

How to Analyze CTR with SEO CTR Software 

One of the best tools available for tracking your CTR across multiple keywords is our GSC Insights Tool. GSC Insights is built on Google’s API. That means real-time, accurate data to help you measure your SEO improvements on a daily basis.

With our Traffic by Rank Position feature, you’ll be able to see how your CTR measures up against the standard. This chart tracks your organic traffic and CTR by rank position across all the keywords on your website, helping site owners identify where they can make improvements and thus raise their average click-through rate.

Comparing CTR by Rank Position Charts

Let’s look at a few Traffic and CTR by Rank Position charts. They can show us how GSC Insights helps site owners strategize CTR and organic traffic improvements for their own web pages.

In this first example, we see a website with click-through rates that do not reflect the standard metrics. This website actually has a higher CTR for keywords where they rank in position 3 & 4 rather than positions 1 & 2.

Because we know most clicks go to the top 2 results, this site owner would want to review their page titles and meta descriptions for the keywords where they rank in the 1 & 2 positions to try to improve their relevance and click-ability.

In the next example, we have a website with CTRs that more closely follow expected outcomes. They have a higher CTR for keywords that rank in position 1 & 2. They also have relatively the same CTR for positions 7-10.

However, what’s most noticeable about the above chart is the large amount of traffic that this site is getting from keywords where they rank in position 3. Most likely, the keywords where they rank in position 3 have larger search volume and thus stand a better chance of earning more clicks.

If this website could get their webpages in position 2 instead of 3 for those same keywords, they would see a huge influx of organic traffic.

In the third example, we see a website that has fairly low click-through rates across all rank positions. This means that their page titles and meta descriptions are likely unoptimized. An SEO strategist working for this client could make an argument for how much they would benefit from an SEO campaign.

See all that traffic coming from the 10 & 11 rank positions? Imagine how much traffic this website would get if it could improve those same keyword rankings to positions 1-5.

In our final example, we see a website that has strong CTR metrics that reflect what is expected. But there are some positions where the CTR doesn’t meet expectations: positions 1 & 3.

Why does this website meet or exceed average CTR in position 2 but underperform in position 1 (where CTR should be 8% higher)? This site owner should compare the keywords where they rank in positions 1 & 2 and see which page titles or meta descriptions are driving the higher percentage of clicks.

Do those meta descriptions include questions? More keywords? Are they the appropriate length? After identifying why position 2 drives more clicks, they should model their page titles and descriptions for the pages where they rank in position 1 accordingly.
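The comparisons above boil down to checking each position’s observed CTR against an expected figure. A minimal sketch, with hypothetical observed and benchmark numbers (the benchmark values are illustrative averages, not an official standard):

```python
# Illustrative average CTR (%) per rank position, and one site's observed CTRs.
BENCHMARK = {1: 28.5, 2: 15.0, 3: 11.0, 4: 8.0}
observed = {1: 12.0, 2: 16.0, 3: 10.5, 4: 8.2}

def underperforming_positions(observed, benchmark, tolerance=1.0):
    """Positions where observed CTR trails the benchmark by more than `tolerance` points."""
    return [pos for pos, ctr in sorted(observed.items())
            if benchmark.get(pos, 0) - ctr > tolerance]

print(underperforming_positions(observed, BENCHMARK))  # [1]
```

Here position 1 lags the benchmark badly, so that site’s page titles and meta descriptions for position-1 keywords would be the first thing to review.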

6 Strategies for Improving Click-Through-Rates

So now that we know why CTR is an important metric to track and how to use GSC Insights to identify areas for improvement, how do you actually increase your click-through rates as they currently stand?

Here are some proven strategies to help you achieve a good click-through rate.

1. Create Your Content Based on What Your Target Audience is Already Asking

Your landing pages need to answer your target audience’s questions before they even have a chance to ask them. To see what they are already looking for, take a look at content that is already ranking for your target keywords.

What are potential consumers looking for when they enter that keyword? Use our SEO Content Assistant tool to review your competitors’ content and improve your web pages’ relevance and topical depth.

2. Remove Any Keyword Cannibalization

Keyword cannibalization is when you have multiple pages that all rank for the same keyword. You don’t want this to happen, because it means only a few of your website’s pages benefit from your digital marketing efforts.

Instead, you’ll want to spread out your keywords throughout every single page of your website, so you have multiple pages of relevant, important information. This way, you’ll also be able to spread out your organic clicks and overall conversion rate.
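A rough way to spot cannibalization is to group your ranking pages by keyword and flag any keyword with more than one ranking URL. A minimal sketch, assuming you can export (keyword, page) pairs from your rank tracker (the rows below are hypothetical):

```python
from collections import defaultdict

# Hypothetical export rows: (keyword, ranking page) pairs.
rows = [
    ("keyword tracking", "/blog/rank-tracking"),
    ("keyword tracking", "/tools/rank-tracker"),
    ("organic ctr", "/blog/organic-ctr"),
]

def find_cannibalization(rows):
    """Return keywords for which more than one page ranks."""
    pages_by_keyword = defaultdict(set)
    for keyword, page in rows:
        pages_by_keyword[keyword].add(page)
    return {k: sorted(v) for k, v in pages_by_keyword.items() if len(v) > 1}

print(find_cannibalization(rows))
# {'keyword tracking': ['/blog/rank-tracking', '/tools/rank-tracker']}
```

Once flagged, you can decide which page should own the keyword and retarget or consolidate the others.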

3. Focus on Your Meta Copy

As we already mentioned, your page titles and meta descriptions play a very important role in your click-through rate. Make sure to create a headline with a CTA that prompts your consumer to take the desired action of clicking through to your website.

You’ll also want to include a CTA within your meta description, as these few sentences are the best first impression of your brand. Some helpful tips for making your SERP result stick out are:

  • Avoid heavy title tags; you’ll want your headlines to be quick and concise.
  • Use a number. Research has shown that using numbers in your title increases your CTR by 36%.
  • Write the current calendar year. You’ll want to show that the content you are promoting is timely.
  • Add brackets to your titles. A HubSpot survey found that doing so increases CTR by 40%.
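The tips above can be turned into a quick pre-publish checklist. A rough heuristic sketch (the 60-character limit is a common rule of thumb for concise titles, not a Google rule):

```python
import re
from datetime import date

def title_checks(title: str) -> dict:
    """Apply the title-tag tips above as simple boolean checks (heuristics only)."""
    return {
        "concise": len(title) <= 60,                      # avoid heavy title tags
        "has_number": bool(re.search(r"\d", title)),      # use a number
        "has_current_year": str(date.today().year) in title,  # show timeliness
        "has_brackets": "[" in title and "]" in title,    # add brackets
    }

print(title_checks("7 Keyword Tracking Tips [Updated Guide]"))
```

A title doesn’t need to pass every check, but running drafts through something like this keeps the tips from being forgotten.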

4. Create Descriptive URLs

Consumers want to know exactly what they are clicking on, and a specific link lets them do exactly that. Include as many specifics about the landing page as possible, and make it relevant to the page itself. As a best practice, if your title is short, make your URL match the title exactly. URLs are a ranking factor within the Google search network, so make sure not to forget this step!
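As a sketch of what “descriptive” means in practice, a slug generator might lowercase the title and collapse everything that isn’t a letter or digit into hyphens (a common convention, not a Google requirement):

```python
import re

def slugify(title: str) -> str:
    """Turn a page title into a short, descriptive URL slug."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse non-alphanumerics into hyphens
    return slug.strip("-")

print(slugify("6 Strategies for Improving Click-Through-Rates"))
# 6-strategies-for-improving-click-through-rates
```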

5. Research Long-Tail Keywords

When you are creating relevant content, you will need to use a whole variety of keywords, from shorter terms to long-tail keywords. Considering that consumers will sometimes search using longer phrases or even complete questions, it is a good idea to include as many of these as you can without keyword stuffing.

6. Implement Structured Data

Structured data, otherwise known as Schema Markup, helps you appear in rich results, so you’ll want to utilize it correctly!

Example of Products Rich Result on a SERP

As Google uses structured data to accurately present your website’s information, not including structured markup can be a detriment to enticing those extra clicks. The most common types of schema are:

  • Organization
  • Person
  • Local Business
  • Breadcrumbs
  • Article
  • Video
  • Event
  • FAQ
  • Product & Offer

Using the above techniques will help your website achieve a higher CTR.

Conclusion

In summary, your website’s CTR is one of the most important metrics any website owner should track. Understanding your organic click-through rates, and the strategies that can help you boost them, is imperative for any business owner.


To the untrained eye, SEO might seem like a guessing game. You choose your keywords wisely, build some links, and––presto!––suddenly, your website sits atop the first page of a Google search. In truth, search engine optimization is anything but magic. When a Google bot scans your website, crawling through each of its pages, there are specific criteria it looks for, like content quality, site authority, and page responsiveness. What’s more, search engines are continually adjusting their algorithms, refining how they categorize and rank pages.

But what does this mean for your business? A digital agency that specializes in SEO can help your business maximize returns on your web presence. Much like the search engines they work with, SEO is a rapidly evolving industry, and the right agency can wield their expertise to suit your company’s needs, allowing you to focus on the rest of your business.

It’s essential for starters to understand that search engine optimization, while complex, is not magic but rather science. Through a deliberate approach of link building, content development, and web page optimization, new websites can use SEO to increase both organic traffic and conversions. The good news is you can remove the wand and crystal ball from your Amazon cart. The better news is that you can implement an effective SEO strategy with the help of a professional SEO company.

Once you consult with your team and decide to hire an SEO company, the question remains––Which SEO agency best suits my business? Currently, there are hundreds of options at your disposal, and more cropping up each day. Between SEO freelancers, in-house specialists, and full-fledged agencies, you’ll have plenty of options to choose from. Likely, you’ll start by browsing the web; however, we recommend asking your friends and colleagues. Which services have delivered results for them?

When you reach out to an agency, they’ll likely begin with a conversation, one in which they’ll assess the needs of your business and outline a plan of attack. During this call, be sure to ask them the following questions.

#1. What is your overall SEO strategy, and how do you produce results?

The goal of any SEO approach is, at its core, the same: to increase organic traffic to your website. With that said, there are many different ways of achieving this, so when consulting with an SEO agency, it’s important to consider the ways through which they plan to boost your search rankings.

Essentially, there are three main categories of SEO, and any SEO strategy worth its salt will account for each of them.

On-Page SEO

As the name implies, on-page SEO refers to the content visible on your page. This includes landing pages, rich media, blogs, your services pages, and other website content that your user will engage with. A solid SEO company will make sure to optimize all of this content for you, which can be done through a variety of means.

Keyword research consists of finding the best keywords to target on a page in order to boost its search engine rankings. Especially if your website is relatively new, you’ll want to select a company that targets keywords with attainable ranking difficulties. For example, if you’re hoping to boost traffic to your meal kit service, a good SEO company will find high-value keywords that drive qualified, converting traffic to your page, effectively carving a niche for your business, even in crowded markets. 

Technical SEO

Technical SEO refers to the backend of your website, that is, the parts not visible to the average user. While it’s tempting to settle for optimized on-page content, what happens under the hood is of equal importance. The user experience matters for human visitors, but they are not the only ones reviewing your site.

In order to categorize and rank your site, search engines crawl each of your pages, “reading” them and rating them on their own scales of readability, so to speak. Especially with Google’s page experience update, speed, mobile-friendliness, site architecture, and security all factor heavily into your page’s ranking, and a good SEO service should account for this.

Off-Site SEO

The third, and arguably most difficult type of SEO, occurs off-site. It consists of all things related to building your website’s reputation within the greater community of the internet. By building a network of links to your site, links from authoritative and relevant sources, you’ll ultimately bolster the authority of your own page, all of which ties back to higher SERP placements.

This is where an SEO service really comes in handy, as building a backlink profile is as difficult and time-consuming as it sounds. With this in mind, you’ll want to look into a search engine optimization strategy that can see you through for the long haul.

Remember, not all SEO companies will offer solutions for all three of these elements, so be sure to assess the particular needs of your business before hiring their services.

#2. Are your SEO tactics Google compliant?

Just as a library has the Dewey Decimal System, Google and other search engines have their own systems with which they crawl and index web pages. Unlike your average library, however, Google has the mind-boggling task of organizing the trillions of pages available on the internet. What’s more, Google must rank these pages. To accomplish this, Google has a set of rules and criteria that pages must abide by in order to show up in searches.  

At first glance, it might sound tempting to manipulate these criteria. Cloaking content, stuffing irrelevant keywords into your copy, and plagiarism are all examples of what we’d call black hat SEO. While these methods may boost your rankings in the short term, they will ultimately damage how search engines regard your site.

Bona fide SEO specialists will use white hat SEO tactics that abide by Google’s webmaster guidelines, which are rooted in the quality of both content and user experience. Sure, strategies such as link-building and keyword analysis take a lot of time and effort, but if you’re going to shell out for an SEO company, then you’ll want to make sure they comply with Google; otherwise, the benefits of their services will be short-lived.

#3. Does your SEO company know how to rank in my industry?

Although all SEO is centered around boosting organic traffic, SEO is not a one-size-fits-all type of service. Even if a search engine optimization company increases your traffic, there’s no guarantee that this correlates with increased revenue. A professional SEO company might land you at the top of the rankings for cloud computing software, but this will do you little good if you’re in the business of CRM platforms.

Solid professional SEO services will be familiar with your industry. Certain industries, such as insurance, are highly competitive in terms of SEO. Others, like call center software, while less saturated, possess a high search intent, thus driving up the CPC for associated keywords. An SEO company should be able to understand the types of search idiosyncrasies in your industry and create a strategy accordingly. Armed with industry-specific knowledge, they’ll attract the type of potential customers that drive conversion.

#4. How does your SEO agency approach keyword research?

There are two ways in which SEO experts utilize keywords: buying search ads and page optimization.

Search Ads

The more straightforward of the two methods, buying search ads allows you to show up in the search results for specific keywords you’ve selected. While this can certainly vault you atop the Google rankings, you’ll be charged for every click, which can quickly become a costly form of paid advertising, not to mention the perils of click fraud.

Especially for small and medium-sized businesses, Google Ads can be a hefty price to pay with little guarantee of a return on the investment. In the short term, it will surely bring a larger audience to your page; however, this will only hold for as long as you keep paying for clicks. On the other hand, SEO, while more complicated, provides a long-term solution, ultimately boosting organic traffic at lower cost. 

Optimize for Organic Search

The more laborious option is to optimize your page for organic search results. This is done by first making sure your site has content that is both relevant and of high quality. Then you’ll have to get other websites to reference your content and link back to it. All of this plays into Google’s algorithm.

Most professional SEO services will likely focus more on optimization for organic search. This process begins with keyword research, the best way to figure out which terms to target. One might assume that a personal injury law firm should naturally target the term “personal injury”; however, with its high keyword difficulty rating (62) and high costs ($44 per click), there is likely a more effective approach. A true SEO expert will take the time to figure out with you, the client, alternative search terms that will both save you money and increase your likelihood of SERP visibility.

#5. What previous clients or business owners has your company worked with? What were the results?

Nowadays, there are hundreds of digital marketing agencies that offer SEO services, and it can be difficult to figure out which one will suit your needs. A good place to start is by reviewing who they’ve worked with. Most SEO specialist services will list prior clients on their website. They provide detailed case studies of their clients, in which they outline their methods and provide their tangible measures of success. Take a look at who a company has worked with. If they have a history of success in your industry, then it’s likely they’ll be a good fit for you.

In addition to the specific type of industry, you’ll also want to look into their success metrics. Did the SEO firm simply boost traffic? How did this traffic affect conversion? Did they utilize social media, email campaigns, local SEO, graphic design, or competitor analysis? For the best results, you’ll want to have your SEO goals in mind so that you can better observe a similar pattern of success in professional SEO services.

#6. For link building services: How do you get your backlinks, and how long do they last?

Link building is the backbone of any content marketing strategy. It improves your position on major search engines and expands both your web presence and authority. With that said, not all links are created equally, and you should pay close attention to how an SEO firm obtains backlinks for their clients.

Simply put, the way to obtain backlinks is by creating quality content. A good SEO company will review the entirety of your website and determine which pieces of your online presence hold a high linkable value. From there, they will optimize your content for both overall quality and target keywords in addition to creating new content that they’ll work to place in authoritative publications. It’s important to note that all of this outreach should be 100% organic. While there are certainly ways that you can pay in order to place an article, the most effective (albeit time-consuming) method is through pitching content to publications. With a scalable link building campaign, an agency’s editorial team will draft original, high-quality content and pitch it directly to authoritative web publications, building the authority of your own web presence in the process.

Some companies will obtain links that only last for 6-12 months, giving you little benefit in the long run. Since Google’s algorithm can detect other unnatural link building tactics (that is, black hat SEO), it’s important to make sure that a professional SEO company plays by the rules. Bad links are penalized by search engines, whereas high-quality links that appear in useful content can drive organic traffic to your page for years to come.

#7. How will I know what changes your SEO company has made to my website?

It can be scary, essentially handing over the keys to your company’s web presence, especially for those not well versed in the language of Google, Bing, and the almighty algorithm. A professional SEO company understands this, and they’ll take steps to keep you in the loop on their process and overall progress of your SEO campaign. This can be achieved through a variety of means such as weekly reports of keyword rankings, conversion rates, backlink numbers, and traffic quality.

With an SEO reporting tool, users can keep tabs on key analytics and better manage their SEO campaigns. For those looking to maximize their ROI, this kind of tool functions as a handy dashboard, providing insights into technical SEO, backlinks, meta tags, and keyword performance. Some companies portray SEO as a nebulous concept, a tendency you should be wary of. In truth, effective search engine optimization is driven by data, which is why we aren’t afraid to quantify and display findings, providing clients with the utmost transparency.

#8. What is your SEO firm’s pricing model, and do you require long-term contracts?

While the cost of SEO services varies widely, based on the size of a project, its duration, and the specific SEO pros behind it, it will behoove you to consider a company’s pricing model. This includes initial charges, hourly rates, and monthly retainer fees for longer-term agreements.

Again, this all comes back to knowing the goals of your SEO strategy. If all you’re looking for is to establish a backlink profile, then this can likely be done in a couple of months. More in-depth tasks, such as overhauling your site’s user interface or producing original content, will likely take longer and, naturally, come with a higher price tag.

Many of the larger, more established professional SEO companies will require you to lock into a long-term contract of up to two years. This might not be a bad thing; however, many younger agencies and independent consultants will be more flexible with their SEO marketing services, allowing you to opt out of your agreement in the event that you’re unsatisfied with their results. You and your team should establish some tangible KPIs and evaluate your progress accordingly. With that said, it’s important to remember that the most worthwhile SEO strategies take time to come to fruition. SEO work takes time, patience, and years of experience.

#9. How involved does your company need to be in the process?

While SEO companies may all refer to themselves as such, there’s often a wide discrepancy in the services they offer. Some agencies specialize specifically in link building, whereas others offer full SEO overhauls, from web design to content creation and direct marketing. When taking stock of a company’s services, you should consider not only your goals but also the inner workings of your own operation.

If your interior design team is literally just you and your installers, then you’ll want to consider seeking an agency that can plan and execute an SEO campaign from the ground up. On the other hand, if your company is on the larger side (perhaps you already have a graphics team and web developer), then you can pick and choose the desired services, integrating them with your existing team.

In an ideal world, an SEO agency could handle everything without you having to lift a finger; however, you should be careful of companies that ask nothing of you. At the very least, they should ask you for access to your CMS, Google Analytics account, Webmaster Tools, and social media accounts. Otherwise, it’s hard to imagine they’re doing much of anything at all.

#10. Why are you the best SEO company, and why should I choose your services over competitors?

Ultimately, the best indicator of a good SEO company is transparency. From inception to execution, an agency should keep you fully aware of their research, strategies, implementation, and success metrics.

Unfortunately, it’s not so uncommon for SEO agencies to do more harm than good. Through black hat strategies, some companies are able to sell you on short-term traffic gains and then leave you holding the bag when the algorithm recognizes the shoddy nature of their techniques and penalizes your page.

Like many B2B services, SEO marketing can become a fixture of your business. The road to the top of Google is both long and complicated; however, with a solid SEO company as your partner, you can get there.

Some webmasters can get intimidated when it comes to working on the backend of their websites. But the metadata you include on the page (and how it’s formatted) can have a significant impact on your SEO performance. Of the many types of SEO-friendly metadata, one of the most powerful is schema.org markup.

Schema markup is a form of structured data that helps search engines read your web pages better. It also improves the appearance and clickability of your search result. Anyone can add schema.org markup to their website, and you don’t have to be a web developer to do it.

Here’s a complete guide to understanding the SEO power of this data markup and a detailed explanation about how to add it to your website.

What is Schema Markup?

In simple terms, schema markup is a type of semantic vocabulary code. You can place it on your website to help search engines create more informative and relevant results for users.

On the backend of your website, schema.org markup is a specific type of structured data in your HTML code. On the front end, that schema markup results in a rich result in Google, or a prominent SERP display that provides more information and context for your audience.

A normal snippet in the search engine result pages (also known as the SERPs) shows very basic information about the website, such as the title of the page, the URL, and the meta description.

A rich snippet is a bit more complex and includes additional information highly relevant to search intent that you want to appear within the SERPs. Some examples of rich snippet information include hours of operation, star ratings, event details, and ingredients for a recipe. Schema is the code that allows for the rich snippet to populate with this extra information on the search result pages.

In order to use schema markup properly, you need to use a specific vocabulary of data. Luckily, the main search engines (Google, Yahoo!, Bing, and Yandex) created this vocabulary on a centralized website, schema.org. They did so in order to establish a shared standard of language so their search engines can perform properly.

This is a free resource and is used by digital marketing analysts to propel their website to better rankings and more clicks. On schema.org, you’ll be able to find plenty of tags, with specific categories, that can help you describe your business, products, reviews, job postings, and contact pages. We’ll get into this in more detail later on in this article.

SEO Benefits of Schema.org

There are many SEO benefits to utilizing schema.org vocabulary. Despite the benefits, it’s estimated that only 33% of marketers are actually utilizing this powerful optimization. By adding schema markup to your site, you can get a leg up on your competitors in a variety of ways. Here are some of the benefits:

Schema tells the search engines what the data means.

Think of schema as a way to translate to the search engines what the data on your website means.

Search engines work by crawling and indexing websites, which lets them surface those web pages in the SERPs when a relevant keyword is entered into the search bar. However, there’s more to crawling a website than simply reading the text on it.

You also need to make sure your website’s HTML code and formatting can be read correctly, so that the information you want to highlight about your website is displayed properly. Schema is a free tool that does just that.

Schema is a data type that creates informative results.

Consumers have very short attention spans. In order to stand out in Google, you’ll have to give your prospective audience information in the format they want, when they want it. All this extra information, provided by schema vocabulary, creates what is known as an “enhanced search result.”

Businesses, especially local businesses, only have a few seconds to make a good impression, and providing as much informative text as possible can mean all the difference when it comes to converting potential customers.

Schema improves your webpage’s click-through rate (CTR).

As stated above, the more informative your website is within the SERPs, the easier it is to improve one of the most important metrics for your website: your click-through rate. Creating multiple web pages can only take you so far if they aren’t converting the visitors you need.

There’s more to digital marketing than creating content and putting it on a web page. You have to make sure each page works towards a specific marketing goal. Your about me page will have a different goal than your homepage, your blog posts, and your services page.

Schema is one of the easiest ways to help each page stand out on its own in Google search results. Since each page has a specific function, there are different schema types that relay different information in the rich search results. As a result, prospective consumers will be given more specific information for each web page they find. This increases the likelihood they will click through to your website and convert.

Schema boosts your local SEO efforts, especially on mobile.

We all know how important it is for our website to be mobile responsive, considering how many consumers use mobile devices to shop and scroll every single day. There’s a benefit to mobile rich snippets, too, as they take up more space within the mobile SERPs, where screen real estate is at a premium.

When schema is implemented correctly, searches for certain types of local businesses, such as local restaurants and cafes, movie theaters, and small retail shops will pop up showing a full list of items within the rich snippet to educate their consumers.

These design elements are implemented in something that is known as a carousel, where the user can quickly scroll through and click to the right web page they are looking for. As a result, this type of metadata allows for your local business to take up a good chunk of the important mobile SERP real estate, boosting your brand authority and awareness.

Schema is a little-known secret in the marketing world.

Many businesses know about schema but don’t always implement it. In fact, only about one-third of Google search results incorporate rich snippets, meaning they use this type of source code. Across the rest of the major search engines, even fewer pages use any type of schema markup.

In other words, there are a ton of website owners out there – literally millions – that are missing out on this massive source of SEO potential. And if you use it, you’ll be on your way to standing out amongst your competitors in no time at all.

Most Popular Schema Markup Types

There are many different types of markups that you can use within the realm of the schema vocabulary. The goal is to structure the markup type to fit one of three categories: people, places, and things.

The most popular types of schema are used to indicate the following item types:

  • Articles
  • Events
  • People
  • Products
  • Organizations
  • Local Businesses
  • Product reviews
  • Medical conditions
  • Recipes
  • Breadcrumbs within the website
  • Job postings
  • FAQ pages
  • Job training
  • Books
  • Podcasts
  • How-to
  • Logos
  • Movies
  • Sitelinks search box
  • Subscription and paywall content
  • Videos
  • Image license metadata

Once added to your website, these pieces of microdata will then be turned into a rich snippet, or what is also known as a rich result.

One of the great things about schema code is that it is completely customizable to your brand and business, no matter your industry. There is a lot of microdata that can be implemented in schema code, so the above are just common themes. The following data vocabularies are more niche uses of schema, under the themes outlined above.

Creative Works

This is the library of markups used for multiple forms of creative content, such as books, movies, video games, and music, to name a few examples. A movie website’s schema, for instance, would include movie-specific elements that highlight the star rating, the genre, and nearby theaters showing the film.

RDFa

RDFa stands for Resource Description Framework in Attributes. It is a set of attributes added to the HTML code that already exists on your web pages, and you can use it in any HTML, XHTML, or XML-based document. Some examples of RDFa attributes include:

  • Rel and rev, signifying a relationship and a reverse relationship with another resource.
  • About, which explains what the microdata is about.
  • Content, to override the content of the element when using the property attribute.
  • Datatype, to specify the datatype of the content when using the property attribute.
  • Typeof, to specify the type of the resource being described.

Microdata

Implementation for microdata is the same as RDFa, except that it uses its own set of attributes. You can use the following microdata attributes on your website:

  • Itemscope, which creates the item and indicates that the rest of the element describes it.
  • Itemtype, which describes the item itself using the schema.org vocabulary.
  • Itemid, a unique identifier for the item.
  • Itemref, to reference specific properties within an element.
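
As a minimal sketch (the values are placeholders; note that the standard microdata syntax also uses the itemprop attribute to label each property), the same Organization could be marked up like this:

```html
<!-- Hypothetical example: microdata describing an Organization -->
<div itemscope itemtype="https://schema.org/Organization">
  <span itemprop="name">Example Business</span>
  <a itemprop="url" href="https://www.example.com">Visit our site</a>
</div>
```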

JSON-LD

Standing for JavaScript Object Notation for Linked Data, this is an annotation type that can simply be copied and pasted into the head or body of a web document. All you have to do is use the “@context” and “@type” attributes when specifying which schema.org vocabulary you want. According to SEO experts, it is pivotal to use this JSON-LD format as often as possible, as it is considered the easiest way to implement schema markup for beginners.
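
As an illustrative sketch (the organization name and URL below are placeholder values), a simple JSON-LD block pasted into a page’s head might look like this:

```html
<!-- Hypothetical example: JSON-LD using "@context" and "@type" -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Business",
  "url": "https://www.example.com"
}
</script>
```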

How to Choose the Right Schema Markup for Your Web Pages

In order to choose the right schema markup for your website, you will have to zoom out and consider your overall digital marketing strategy for each web page. You first need to figure out what web pages you will want to optimize, and what part of the schema.org vocabulary you’ll use to get the best organic traffic. But how?

The easiest way to think of schema is as a way of telling a story on your website, a story told across multiple related pages that all tie back to your overall goals. Here are some tips that will help you decide what schema markup is the best for you.

1. Identify the key details of your business.

This may seem obvious, but in order to choose the right schema markup, you’ll need to determine what your business is all about, what search terms you want to rank for, and how you want to tell the world about them. Typically, this includes your contact information, products, product reviews, FAQS, and thought-leadership pieces about what your business does. It’s a good idea to make a list of every page type on your website, and then categorize them based on what “business purpose” they fit into.

2. Map your web pages to the proper schema.org vocabulary.

Now take your list and map every single webpage to the proper schema.org vocabulary. There are a few tools that help you do this (we’ll get into them later!), but for now, take the time to meticulously map all your data out so you have everything in one place.

3. Evaluate each page for recurrence.

This is different from mapping your pages to each schema.org website option because this step is about recurrence. To figure this out, you can simply ask yourself: “Does this page have content that is published somewhere else on the website?” If so, you’ll need to use a different data format for your schema implementation. A good rule of thumb is that if your website has more than 5 pages of similar content, then that content theme is recurring. If the content only appears once, it can be classified as a single page.

4. Connect your content.

You’ll now need to connect the dots between your metadata so you don’t have an empty text string. Your goal here is to create a knowledge graph so any search engine can easily read your website and understand the context between your content and how it all relates to one another.

When a search engine understands exactly who you are and what you do, you are sure to get an SEO boost. That’s because Google tends to show the most relevant information it can find within the first page of the organic search rankings for a query.

There are many tools that can help you connect your schema paths, such as this one from SchemaApp.

How to Use a Schema Markup Generator

Luckily, there are a lot of fantastic online tools to use when creating your website’s schema. The Schema Markup Generator is one of these options and is an easy way to boost your SEO efforts overnight. In most cases, these tools will write all of the code snippets you need, including HTML tags, and all you have to do is place them in the backend of your website.

Our markup generator is quite easy to use. Follow these steps for the best results:

  1. Log in.
  2. Select “Schema Creator.”
  3. Select the schema type you determined based on the page content you want to promote and input it into the field. For example, “local business.”
  4. The tool outputs JSON-LD, so copy and paste the resulting markup into the head section of your web page.
  5. And there you go, you have successfully added schema to your website!
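
For example, if you selected the “local business” type, the generated JSON-LD might look something like this (all of the business details below are placeholder values for illustration):

```html
<!-- Hypothetical example: generated LocalBusiness JSON-LD -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Cafe",
  "telephone": "+1-555-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield"
  },
  "openingHours": "Mo-Fr 08:00-17:00"
}
</script>
```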

As a way to double-check your work, input your schema markup into Google’s Rich Results Test tool. This test is a wonderful resource, as it will identify any problems with your schema code and confirm whether or not Google is able to generate rich results from your markup.

In addition to Google’s data testing tool, here are some other options for checking your work:

  • SEMRush Audit Tool, which checks for markup and tells you what percentage of your website currently uses schema. This gives you information that can help you identify opportunities for improvement.
  • Google’s Content Markup Guide, made specifically for Creative Work schema with the goal of acquiring more rich results.
  • Checking out any new releases from the Schema.org website to help you stay on top of industry updates.

Final Thoughts on Schema.org Markup and Rich Results

With all the free tools available to you, it is surprising how many businesses do not take advantage of the rich results that come with implementing the different types of schema markup. Even though it may seem a bit intimidating to work with schema code at first, these tools, especially a schema markup generator, can really help elevate your website to the next level and increase your rankings for multiple keywords. And what more could you ask for?

There are plenty of options available to you, as long as you stay dedicated to learning. Remember, SEO is similar to a stock market: the effort you put in equates to what you get out of it, and schema is one of the best ways to stand out among your competitors.

As always, our team of SEO experts and web developers are here to help you with any and all of your schema needs. Contact us today for more information about how we can bring your website to new heights.

An important element that many websites forget to include on their web pages is open graph meta tags. These tags are important for ensuring that your content is appealing and clickable when shared on popular social media sites.

Here is a guide to what open graph tags are, why they matter, and how to implement them across your web pages.

What Are Open Graph Tags?

Open Graph tags are a set of HTML tags that help you control how your content is displayed on social media websites.

When someone shares one of your web pages on a social network like Facebook or LinkedIn, open graph communicates how to display and format the shared content.   

On the frontend, open graph tags make your content appear with the title, image, and display that you prefer, like this:

But without Open Graph meta tags on your web pages, your content on social media sites may look like this:

Which post would you be more likely to click on?

Are Open Graph Tags Important?

Open Graph is important because it allows you to control how your content appears when it is shared on social media.

These tags allow platforms to pull in specific details about the website and its contents to create a richer, more engaging social media post. In that sense, they are very similar to schema.org markup.

Open Graph is a great way to ensure that your content looks its best, displays your branding, and projects your content as valuable whenever it is shared online.

Are Open Graph Tags a Ranking Factor?

Open Graph is not a ranking factor that Google relies on when promoting web pages. But these tags can still indirectly impact your SEO.

How? Because social media is a great avenue for sharing content to grow brand visibility and site authority. If a webmaster sees your content on a social media site and enjoys it, they may choose to link to it on their own web pages, giving you a backlink and valuable link equity.

But if your content doesn’t utilize open graph meta tags, the content looks less appealing whenever shared on social networks or other popular sites, making users less likely to click on it, and webmasters less likely to link to it.

Who Uses OG Tags?

Open Graph tags are used across major social media platforms; some of the most important are:

  • Facebook
  • LinkedIn
  • Google+
  • WhatsApp
  • Slack

Twitter has its own version of Open Graph called Twitter Cards. To learn more, check out our post on Twitter Cards best practices.

Types of Open Graph

There are a number of different tags that you should be implementing on your web page.

If you do not have these open graph tags present on your web pages, they will be flagged in your Site Audit Report as “Missing”.

og:title

The title of the webpage is important for helping users understand the purpose and focus of your content. In HTML, the og:title tag looks like this:
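
Here is a minimal sketch (the title text is a placeholder, not from the original article):

```html
<meta property="og:title" content="10 Easy Weeknight Dinner Recipes" />
```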

Best practices for og:titles include the following:

  • Should be present on all shareable web pages on your website
  • Character count should be between 40-60
  • Focused on accuracy, brevity, and clickability

og:description

Similar to a meta description for SEO, the og:description should offer a brief description of the web page’s content to give the user more reason to click.

In HTML, the og:description looks like this:
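
Here is a minimal sketch (the description text is a placeholder):

```html
<meta property="og:description" content="Quick, family-friendly dinner ideas you can cook in 30 minutes or less." />
```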

Best practices for og:description include:

  • Should be present on all shareable web pages on your website
  • Character count should be between 120-160
  • Accompanies the title to make a clickable, sharable piece of content

og:image

The og:image is arguably one of the most important open graph tags because of the visual nature of social media platforms.

The og:image tag includes the URL of an image that represents the content of the webpage. In HTML, it will look like this:
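
Here is a minimal sketch (the image URL is a placeholder):

```html
<meta property="og:image" content="https://www.example.com/images/preview.jpg" />
```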

Ideally, you should also use the og:image:width and og:image:height tags to let the social media websites know the size of your image.
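
For instance, a 1200 x 630 pixel image (a common choice that approximates the 1.91:1 ratio mentioned below) could be declared like this:

```html
<meta property="og:image:width" content="1200" />
<meta property="og:image:height" content="630" />
```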

Best practices for og:image open graph include:

  • Should be present on all shareable web pages on your website 
  • Uses images with a 1.91:1 ratio for the best clarity on desktop and mobile
  • Also includes og:image:width and og:image:height to ensure that the image loads properly when it is shared

og:url

The og:url tag tells social media websites the url of the webpage. It looks like this in HTML:
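
Here is a minimal sketch (the URL is a placeholder):

```html
<meta property="og:url" content="https://www.example.com/blog/weeknight-dinners/" />
```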

Best practices for og:url include the following:

  • Follows SEO-friendly url best practices
  • Points to the canonical url, especially if there are multiple versions of the page

og:type

The og:type tag helps social media sites understand the type of webpage (e.g. “article”, “blog”, “website”, “video”, etc.). In HTML, it looks like this:
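
Here is a minimal sketch using the “article” type:

```html
<meta property="og:type" content="article" />
```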

Best practices for og:type include the following:

  • Use “article” for blogs and “website” for all other pages
  • Only define one type per page

og:locale

For international websites with content in multiple languages, the og:locale tag indicates the language of the web page content. In HTML, it looks like this:
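
Here is a minimal sketch declaring a French-language page (the locale value is an illustrative choice):

```html
<meta property="og:locale" content="fr_FR" />
```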

Best practices for og:locale are pretty simple:

  • Use the appropriate locale code (Here’s a complete list of HTML country codes)
  • Only use for pages that are not written in English

Common Mistakes with Open Graph

There are a few common issues that webmasters make with open graph tags that will be flagged in your Site Auditor report.

Missing Open Graph Tags

The most common mistake that webmasters make is they simply do not include open graph tags on their web pages. 

If an open graph tag is missing from a web page, you will see this issue displayed in your site audit report.

This is an easy fix, and simply involves adding the missing open graph tag into the HTML head of your web page (according to the best practices listed above).

Improper Character Counts

For og:title and og:description, your character counts will be flagged if they do not follow best practices.

To resolve this issue, you will need to either shorten or lengthen the character count of your open graph tags. 

This can be done by editing the HTML of your individual web pages.

How to Implement Open Graph on your Web Pages

There are two primary ways that you can add open graph tags to your web pages.

The first is utilizing a popular plugin. For WordPress websites, the plugin most commonly used by webmasters is Yoast SEO.

Other popular CMS platforms like Wix, Shopify, and others have plugins that make adding open graph tags simple.

You can also add open graph tags manually by editing your web page HTML code. Open graph tags should go in the <head> section of your website.
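
Putting it together, a minimal hand-coded set of open graph tags belongs in the page’s <head> and might look like this (all values are placeholders):

```html
<head>
  <meta property="og:title" content="10 Easy Weeknight Dinner Recipes" />
  <meta property="og:description" content="Quick, family-friendly dinner ideas you can cook in 30 minutes or less." />
  <meta property="og:image" content="https://www.example.com/images/preview.jpg" />
  <meta property="og:url" content="https://www.example.com/blog/weeknight-dinners/" />
  <meta property="og:type" content="article" />
</head>
```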

To ensure you have the required fields, you can use a tool like an open graph generator.

Conclusion

If you need assistance adding open graph to your website, our team of web development experts are here to help. Simply request a proposal from our team.

You can also run a site audit yourself after registering for a trial of our SEO software.

Including Twitter Cards on your web pages is a great way to optimize the performance of your content on the popular social media platform.

Similar to open graph tags, Twitter Cards make your content more engaging and clickable to Twitter users.

Here is a guide to adding Twitter cards to your web pages and how it can help with your overall online visibility.

What are Twitter Cards?

Twitter Cards are a way to enhance your tweets whenever you (or someone else) shares a link to your web page content.

When you include a Twitter Card on a web page, it will show a preview of that content when shared on Twitter. Then, the user can click on the tweet to see the full content on your website.

Without Twitter Cards, your content will look less appealing when shared on Twitter. 

For example, here is a web page that is missing an essential Twitter Card. As a result, it doesn’t populate the Twitter post in a way that makes it desirable for users to click or engage with.

Which of these tweets would you be more likely to click on?

Benefits of Twitter Cards

The upside of Twitter Cards to your social media presence and visibility is fairly obvious.

  • More clickable content
  • Greater control of your own branding
  • More likes, shares, and retweets
  • More potential backlinks

But this meta tag can also have ancillary benefits to your SEO visibility as well. 

How? Well, if a Twitter user clicks on your content and finds it valuable, they may choose to link to that content on their own website. That link can bring valuable link equity and improve your website’s ranking potential.

But without that initial click, those users will never see your content. So although not a ranking factor, using Twitter Cards can make it so there are fewer barriers to other people finding, clicking, and sharing your content.

If you are missing essential Twitter cards or there are issues with implementation, the issue will be flagged in your audit report.

Types of Twitter Cards

There are four Twitter card types available to webmasters:

  • Summary Card
  • Summary Card with Large Image
  • App Card
  • Player Card 

We recommend using the first or second option for your content, and you will find details on how to implement each below.

Summary Card

Here is an example of what the summary card looks like in Twitter:

This is the simplest Twitter Card and is very easy to implement with just two Twitter card properties: “twitter:card” and “twitter:title”.
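
As a minimal sketch (the title text is a placeholder), those two properties look like this in HTML:

```html
<meta name="twitter:card" content="summary" />
<meta name="twitter:title" content="10 Easy Weeknight Dinner Recipes" />
```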

Summary Card with Large Image

The Twitter Summary Card with Large Image will make the image more prominent in the display. 

If your content has great visual assets, you may want to consider this Twitter card. Here is an example of the Summary Card with Large Image from Search Engine Journal:

And here is what the Twitter card implementation looks like in HTML:
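
Here is a hedged sketch of that markup (all values are placeholders; the card type value "summary_large_image" is what distinguishes it from the standard Summary Card):

```html
<meta name="twitter:card" content="summary_large_image" />
<meta name="twitter:title" content="10 Easy Weeknight Dinner Recipes" />
<meta name="twitter:description" content="Quick, family-friendly dinner ideas." />
<meta name="twitter:image" content="https://www.example.com/images/preview.jpg" />
```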

The required properties for this card are the same as the Summary Card, but you will need to specify a different twitter:card type.

Twitter Card Properties

There are only two required properties for a valid twitter card. 

  • twitter:card
  • twitter:title

However, you may want to consider more to make your content more appealing when shared on Twitter.

Here is a rundown on each property.

twitter:card

This is the most important property because it tells Twitter which of the four types of Twitter cards you are using.

This is a required property in order to produce a valid Twitter Card.

twitter:title

The other required property for validation is twitter:title. This should be the title of your content and give users a good idea of what your content is about.

If you don’t define your title, Twitter will use your Open Graph title (“og:title”) for the title of your content.

For best practices with this property:

  • Give users a good idea of what your content is about
  • Keep it under 55 characters
  • Make it clickable and engaging for users

twitter:description

The description will be visible to users below the title and provides a summary about your content.

Best Practices for this property include:

  • Clickable, engaging description
  • Less than 125 characters

If you do not define twitter:description, Twitter will use your og:description first, then your meta description.

twitter:site

This property communicates the Twitter account associated with the website.

For best practices with this property:

  • A working Twitter account and ideally the Twitter account handle of your business or brand
  • Should be included to help communicate the relationship between a website and a Twitter page

If you don’t define twitter:site, you should define twitter:site:id. The ID is the unique numeric value associated with your Twitter account.

twitter:image

Arguably one of the most important properties, this will tell Twitter which image to display when the content is shared.

The size of the image is very important, and your dimensions will differ depending on whether you are implementing a Summary Card or a Summary Card with Large Image.

  • For Summary Card: Aspect ratio 1:1, Minimum size 144 x 144 pixels, Maximum size 4096 x 4096 pixels
  • For Summary Card with Large Image: Aspect ratio 2:1, Minimum size 300 x 157 pixels, Maximum size 4096 x 4096 pixels

In addition to proper sizing, best practices with this property include:

  • JPG, PNG, WEBP, or GIF files

twitter:creator

The twitter:creator property defines the author of the content. 

With author authority becoming even more important in communicating the quality of your content, we recommend including this property on your web pages.

Best practices with this property include:

  • Should include a working, active Twitter account
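Putting the properties above together, a complete implementation might look like this hedged sketch (all handles, URLs, and text are placeholders):

```html
<meta name="twitter:card" content="summary_large_image" />
<meta name="twitter:title" content="10 HTML Tags Every SEO Should Know" />
<meta name="twitter:description" content="A quick guide to the HTML tags that matter most for search visibility." />
<meta name="twitter:site" content="@yourbrand" />
<meta name="twitter:creator" content="@authorhandle" />
<meta name="twitter:image" content="https://example.com/images/html-tags-guide.png" />
```

Note that the title stays under 55 characters and the description under 125, per the best practices above.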

Common Issues with Twitter Cards

There will be two types of issues related to Twitter Cards that you may see flagged in your site audit report.

Missing Twitter Properties

The most common issue is that webmasters simply don’t add Twitter Cards to their web pages, or are missing specific properties required for a valid implementation.

This is an easy fix that simply involves adding the required property to your HTML.

Improper Character Counts

The second issue you will see flagged in the site auditor is when your Twitter properties do not follow character count best practices.

To fix this issue, you will need to edit the length of your title or description in your Twitter Card implementation.

Conclusion

Although not an SEO ranking factor, adding Twitter Cards to your shareable content is worth the effort!

For more information on how to use Twitter Cards, visit the Twitter Developers website.

Full-scope search engine optimization can feel like juggling cats at times. Often, site owners have a knack for either the technical side or the content side–rarely both. And creating a sitemap for Google is definitely on the more technical side of things. It can also be one of the more time-consuming SEO tasks if you’ve never created one before.

This guide will eliminate the guesswork and teach you how to create an XML sitemap with just a few simple steps. We will also cover how to submit your sitemap in your Google Search Console account.

What is a Sitemap?

As the name implies, a sitemap is a map that tells Google’s webcrawlers what route to take through your website. This XML file helps web crawlers and search engines find and index the pages on your website.

More specifically, a sitemap is a list of the pages on a website with hierarchical signals so Google understands the structure of your website and which pages are the most important. Sitemaps contain all subdomains that you want to appear in Google search results.

What is XML?

XML (Extensible Markup Language) is a markup language that defines a set of rules for encoding documents in a format that is both human-readable and machine-readable. It’s the language webcrawlers are fluent in, and XML is used for data storage and definitions.

XML was designed to be both simple and extensible, making it easy to create and interpret documents for people and search engine bots. 

How Do Webcrawlers Use a Sitemap File?

There are a few ways that web crawlers can use a sitemap file.

The most common way is to crawl through the links in the sitemap file to find pages to crawl on the website. This is especially helpful if the website has a lot of pages that are not linked to from the main navigation.

Another way that web crawlers can use a sitemap file is to find new pages to crawl on the website. If the website has recently been updated with new pages, the web crawler can identify and find those pages by looking through the sitemap file.

Finally, web crawlers can use a sitemap file to get an overview of the website. This is especially helpful if the website is very large and has a lot of pages. By looking at the sitemap file, the web crawler can see which pages are the most important and which pages need the most attention.

How Many Webpages Can You Include in a Sitemap?

You can include up to 50,000 URLs (and up to 50MB, uncompressed) in a single sitemap file. If your site is larger than that, you can split the remaining pages across multiple sitemap files and tie them together with a sitemap index.
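When your URLs are split across several files, a sitemap index file lists each child sitemap. A minimal sketch (filenames are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap-posts.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-pages.xml</loc>
  </sitemap>
</sitemapindex>
```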

How Does a Sitemap Affect a Website’s SEO?  

A sitemap is an important tool to help improve a website’s search engine optimization (SEO). It allows search engines to crawl and index your website’s pages more efficiently, resulting in better search engine results visibility for your website.

The structure of your website’s sitemap can have a significant impact on your website’s SEO. If your website’s pages are not well organized and easy to navigate, the search engines may have a difficult time indexing them. A well-constructed sitemap will help the search engines find and index all of your website’s pages, which will improve your website’s ranking in the search results.  

A sitemap is also an important tool for improving the usability of your website. A well-organized sitemap will help your website’s visitors find the information they are looking for quickly and easily.

Why do you need an optimized sitemap?

One of the most important aspects of SEO is the creation and optimization of a site’s sitemap. When your website’s sitemap is properly optimized, it can help improve your website’s search engine ranking and visibility. You will find instructions on how to optimize your sitemap further down in this article.

What’s the difference between a sitemap and a robots.txt file?

A sitemap is a list of the URLs on a website that you want indexed, along with optional metadata such as when each page was last updated. A robots.txt file tells web crawlers which URLs they are allowed to crawl and which should remain off the SERPs.

Your robots.txt can tell webcrawlers where to locate your sitemap (which comes in handy when you have multiple). Note that robots.txt controls crawling, not indexing: to keep a page out of search results, add a noindex meta tag to that page rather than relying on robots.txt alone.
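As a sketch, a robots.txt that blocks one directory from crawling and points crawlers to the sitemap might read (paths are placeholders):

```
# Applies to all crawlers
User-agent: *
# Keep crawlers out of this directory
Disallow: /admin/
# Location of the sitemap file
Sitemap: https://example.com/sitemap.xml
```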

Where does a sitemap live on a website?

A sitemap lives in the website’s root directory. The root directory is the topmost directory of a website, which contains all of the other directories and files, including the sitemap index file. You can find your site’s root directory by looking at the website’s source code, or by using a web browser’s “view source” feature. However, only site owners can add or modify files in the root directory.

For instructions on how to add your sitemap files to a WordPress site, see the instructions for Creating a Sitemap Manually below.

Do You Need to Submit a Sitemap to Google?

Yes! A sitemap is a great way to ensure that Google knows about all the pages on your website so they can be indexed and ranked properly. And while it may seem intimidating, creating a sitemap is easy when you have the right tools!

What’s a Sitemap URL?

A Sitemap URL is the web address where your sitemap file can be found. The standard format is <domain>/sitemap.xml. The sitemap file itself isn’t meant to rank in search results; it exists for crawlers, not searchers.

How to Create a Sitemap for Google

By following these tips, you can ensure that your website’s sitemap is properly optimized and will help you improve your website’s search engine ranking and visibility.

1. Create a Sitemap File Using the Yoast Plugin for WordPress

There are many ways to create a sitemap for your WordPress site, but we recommend the Yoast plugin. Why? Yoast is one of the most popular WordPress SEO plugins, so there’s a good chance you already use it for blog content. Beyond simple on-page optimization, the plugin also includes a sitemap builder, which makes it easy to create a sitemap for your WordPress site.

To create a sitemap using the Yoast plugin, follow these steps:

  1. Install and activate the Yoast plugin.
  2. From your WordPress dashboard, go to Yoast SEO in the sidebar. Then select General.
  3. In the Features tab, locate the XML sitemaps setting.
  4. Switch it to “On” to enable the sitemap builder.
  5. Click the “Save Changes” button.

That’s it! The Yoast plugin will automatically create a sitemap for your WordPress site.

Using Yoast on Other CMS Platforms

Yoast also works for creating sitemaps in other e-commerce platforms, including Shopify, Squarespace, and BigCommerce. While Yoast SEO works in Wix, most users don’t find it extremely intuitive.

2. How to Submit a Sitemap in Google Search Console

Google used to offer a sitemap generator as a free webmaster tool. However, most people preferred Yoast and other third-party products, so Google retired it. Luckily, site owners weren’t left without alternatives: while Google Search Console can’t generate a sitemap for you, it’s where you submit the one you’ve created so Google knows where to find it.

If you’re using Search Console:

  1. Sign in to Google Search Console. Be sure you’re on the correct search console property.
  2. Click on the “Sitemaps” tab.
  3. Locate “Add a new sitemap.” Enter the URL of your website’s sitemap, then click “Submit.”

3. How to Use the Free Version of Screaming Frog to Create a Sitemap

The free version of Screaming Frog SEO Spider is a great tool for quickly creating a sitemap of your website. (The free license crawls up to 500 URLs, which is plenty for smaller sites.)

To create a sitemap with Screaming Frog, follow these steps:

  1. After downloading Screaming Frog, open Screaming Frog SEO Spider, enter your website’s URL in the crawl bar, and click “Start.”
  2. Wait for the crawl to complete.
  3. In the top menu, open “Sitemaps” and select “XML Sitemap.”
  4. Choose which pages to include (by default, only indexable pages are included).
  5. Click “Next,” then save the sitemap.xml file to your computer.
  6. Upload the file to your website’s root directory.

4. Create a Sitemap Manually

If you’re looking to brush up on your technical SEO skills, you can create a sitemap.xml file without the use of any tools. This process is a bit more time-intensive, but it offers insight into your website’s structure. It’s also a great exercise if you’re looking to learn XML.

If you’re creating a sitemap manually:

  1. Create a file that lists the URLs of all the pages on your website you want to potentially appear in search results (your canonicals), marked up with the XML elements the sitemap protocol requires. This can take some time, depending on the number of pages contained on your site. One way to cut down on the time is to export a CSV file from Google Search Console.
  2. Save the file as “sitemap.xml” or “sitemap_index.xml.”
  3. Upload the file to your website’s root directory on WordPress by going to WP File Manager in the sidebar of your WordPress dashboard.

Then navigate to the root folder of your website. Select “Upload” and then your sitemap.xml file. If you use another e-commerce site builder or CMS, such as Shopify or Wix, you will find the steps are quite similar.
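The file you upload should be valid XML rather than a bare list of URLs. A minimal sitemap.xml with two pages might look like this (URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2023-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/about/</loc>
    <lastmod>2023-01-10</lastmod>
  </url>
</urlset>
```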

Common Mistakes in Sitemap Creation

There are a few common mistakes that people make when creating a sitemap. And the unfortunate reality is if you make a simple mistake when it comes to your sitemap file, you could hide a page (or your entire site) from search results.

To avoid producing sitemap errors, avoid these common mistakes:

  1. Not including all of the pages you want indexed. Make sure every page you want to appear in search results is in your sitemap, including pages that are not linked from the main navigation.
  2. Not updating your sitemap when you add new pages or make changes to your site. Make sure to update your sitemap whenever you make changes to your site so that search engines can crawl your new pages.
  3. Including too many or too few pages in your sitemap. Include your important, canonical pages, and leave out duplicates, redirects, and noindexed pages. If you stuff in every URL variant, search engines may not crawl all of them; if you include too few pages, your site may not be fully indexed.
  4. Creating a sitemap that is difficult to read or understand. Make sure to create a sitemap that is easy to read and understand so that people can quickly find the information they need.
  5. Not using the correct format (the XML version) for your sitemap. Make sure to use the correct format for your sitemap so that Googlebots can crawl your pages.
  6. Not taking advantage of sitemap generators. Unlike most of us, they speak XML fluently and can create a sitemap index almost instantly. 

How to Optimize Your Website’s Sitemap

Optimizing your sitemap can help your page perform better in Google search results. While most sitemap generators will do these for you, they’re still good to know. 

There are a few key things to keep in mind when optimizing your website’s sitemap:

  1. Make sure your sitemap is up to date.

Make sure you are constantly updating your sitemap as you add new pages or make changes to your website. This will ensure that search engine crawlers are always aware of the latest changes on your website.

  2. Use the correct file format.

Your sitemap should be in XML format. This is the format that search engine crawlers prefer and will be best for helping them index your website.

  3. Include the right information.

Your sitemap should include the following information for each page on your website:

  • The page’s URL (the <loc> element)
  • The date the page was last updated (the <lastmod> element)
  • Optionally, how often the page changes (<changefreq>) and its relative priority (<priority>)
  4. Optimize your Titles and Descriptions. Make sure each page’s title tag and meta description are keyword-rich and accurately reflect the content of the page. This will help improve your website’s search engine ranking and visibility.
  5. Only include the canonical version of your website’s content. And don’t skip your canonical tags on your site’s pages.
  6. Don’t forget to submit your sitemap to Google.

How to Submit Your Sitemap to Google

Submitting your sitemap to Google is as easy as logging into Search Console and clicking a few buttons.

  1. Sign in to Google Search Console.
  2. Click on the “Sitemaps” tab.
  3. Under “Add a new sitemap,” enter the URL of your website’s sitemap.
  4. Click “Submit.”

What are some plugin options for creating an XML sitemap?

There are a few different plugin options for creating an XML sitemap for your WordPress site. 

The Yoast SEO plugin has a sitemap feature that allows you to create an XML sitemap for your site. The plugin also includes features to help you optimize your site for SEO.

The Google XML Sitemaps plugin also allows you to create an XML sitemap for your WordPress site. The plugin automatically submits your sitemap to Google and other search engines, and it notifies you when your sitemap has been updated.

The WP Sitemap Page plugin allows you to create a sitemap for specific pages or posts on your site. The plugin also includes options to exclude certain pages or posts from your sitemap and to specify the priority of each page or post.

Checking Your Sitemap

It’s a good idea to check that your sitemap is functioning properly. This ensures that all your different URLs can be indexed by a Googlebot and potentially wind up in the SERPs. SEO tools, such as our Dashboard, provide simple ways to ensure your canonical URLs are part of your sitemap index.

To run a sitemap report:

  1. Log into the dashboard.
  2. Locate the Site Audit tool in the navbar.
  3. Select report then XML sitemaps. From here, you can request a new crawl or identify non-indexable pages on your site.

Your Sitemap is a Ticket to SERP Success

If you have yet to create and submit your sitemap to Search Console, now is the time to do it. Your sitemap, which is an XML file, tells Googlebots how you would like your site to be indexed. If you’re creating your own site map, remember to use taxonomies to distinguish different types of content and their differing importance.

A canonical tag, often referred to as rel=“canonical,” is an HTML tag that tells search engines which URL is the primary version or “master copy” of content. These straightforward tags give site owners the ability to suggest one URL for Google to designate as the preferred page to appear in searches. Canonical tags also prevent SEO issues that arise from duplicate content.

These simple HTML link elements play a major role in your site’s SEO. They’re also easy to use, but only work when used correctly. If you’re unfamiliar with canonical tags, this article will help you learn how, when, and why to use canonical tags and how to avoid canonical tag issues.

What is a Canonical Tag?

A canonical tag is an HTML link element inserted into a page’s header or <head>. These tags were developed by search engines and rolled out in 2009. They’re one of those great examples of search engines working with site owners to improve the quality of search results.

Canonical tags tell search engines one of the following:

  1. That content on a page is a duplicate of another page, and which of the pages should be considered the primary version.
  2. For single pages with multiple URLs, the tag tells Googlebots or Bing bots which exact URL is the correct one to index.

This tag tells the crawler to index the primary page rather than the duplicate. The canonical URL indicates which page Google should display in search engine results, so the primary version is the one that receives the organic search visibility.

Keep in mind that while you can tell Google which URL to index, Google may not follow your recommendation.

A canonical tag looks like this:

<link rel="canonical" href="https://example.com" />

What Are the Parts of a Canonical Tag?

A canonical tag is also referred to as a canonical link element–a somewhat more intuitive label for this unique HTML code. Why? Because a canonical tag provides a canonical link and defines the relationship between the page and the link.

In HTML, rel tells the Googlebot that there’s a relationship between the page and a linked resource. In this case, the relationship identifies the canonical page which appears after the href attribute (href is a hypertext REFerence).

What Is a Canonical URL?

A canonical URL is the primary version of a webpage–the one site owners want search engines to recognize as the source of the content and index accordingly. In the link element, the canonical URL appears as the value of the href attribute.

Is a Canonical Tag the Same as a Canonical URL?

The canonical URL appears within the canonical tag. The canonical URL is the hyperlink reference element within the canonical tag. This denotes the exact URL that should be considered the canonical version of the source content.

Why Does Canonicalization Matter?

When it comes to e-commerce sites and sites that generate ad revenue, you want to be sure you take every opportunity to put your best URL forward in the search engine results pages (SERPs). And canonicalization does just this by telling Google which site should be indexed. Not only can you gain more control over your site, but you’re also able to funnel users to the highest value page.

Should You Use Self-Referential Canonical Tags?

Even a webpage that may seem unique can be found under a variety of URLs. For example:

  • http://yourwebsite.com
  • https://yourwebsite.com
  • http://www.yourwebsite.com
  • https://www.yourwebsite.com

While each of these URLs will display the same homepage, each is also technically its own URL. This can lead to the same issues as having duplicate content on a third-party website. Without a canonical tag, search engine algorithms will not know which is the preferred URL to display to searchers.

Making things even more confusing to search engines, dynamic pages often have a wide array of tags, each of which has its own URL. Content management systems (CMSs) like WordPress often automatically embed tags into web pages, too. So, even a basic page will wind up with a multitude of URLs–each perfectly indexable by search engines.

So, your best bet is to place a canonical tag within the canonical URL’s header, as well.
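A self-referencing canonical on the preferred homepage URL might look like this (the domain is a placeholder):

```html
<head>
  <!-- The preferred URL points to itself, confirming it as the canonical version -->
  <link rel="canonical" href="https://www.yourwebsite.com/" />
</head>
```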

Canonical Tags Streamline Data Collection and Analytics

Furthermore, as you’re tracking your search metrics, you want to compile all organic searches for one page under the same URL. Your canonical tags ensure only the specified page will receive search result metrics.

Prevent SEO Conflicts with Syndicated Content

Many websites build backlinks through content syndication. Creating content can be a time-consuming and costly investment, but through syndicated relationships, you can provide users with your existing high-quality content on third-party sites, expanding your brand visibility while you continue to build your own site’s content library. 

However, without canonical tags, search engines will not know whether to index your site for the article or the third party. Canonical tags allow you and your syndication partner to simplify this problem. Note: you can also use the noindex tag on one of the pages to prevent duplication.

What’s the Problem with Duplicate Content?

Duplicate content can cause various issues related to SEO. When Googlebots index webpages with identical or very similar content it can:

  • Slow down the indexing process resulting in less of your site being indexed.
  • Register as a negative ranking signal back to Google, causing your pages to rank further down the SERPs.
  • Confuse the search engine as to which page it should display to searchers.

How Canonical Tags Help SEO

First and foremost, canonical tags are one of the few ways you can influence how Google presents your site to searchers. Canonicalization also prevents you from getting ‘docked’ in PageRank for having duplicate content–although Google does not directly penalize duplicate content, it does prioritize original content that’s well-organized.

Finally, they also allow you to provide users beyond your website with excellent content for backlinking and brand building.

What Is Duplicate Content?

Duplicate content isn’t just copy-and-pasted text. It can be written text, images, and other media that are exactly the same, similar, or reordered. Google can also treat placeholder text and images from a CMS as duplicate content if they’re published to the web.

Basic information, such as copyright text, on every page on your site can even be flagged as duplicates. 

How to Use Canonical Tags

Ultimately, for the best SEO results, you will want to use canonical tags throughout your website. Once you update your existing pages, you will want to continue to implement canonicalization best practices.

The first step is to identify which URL version of your site pages should be the canonical URLs. Google prefers if your canonical links are consistent in formatting. So, if you use the “www.” in your homepage’s canonical link, include it in your other canonical URLs. 

For example, a site might use the “https” protocol in all of its canonical tags but leave out the “www.”; what matters is that the formatting is consistent across every canonical URL.

This will solve any issues with multiple URLs pointing to the same page.

Next, you will want to tag or eliminate any duplicate content within your site. You can do this with the Site Audit tool. It’s as simple as viewing your Content/Duplicates report.

Finally, you will want to find any remaining duplicate content on third-party sites. You can use a tool such as Copyscape to do so. Once you’ve identified content elsewhere on the web, you will want to decide if:

  1. Your content has been stolen and republished without permission
  2. You have accidentally plagiarized pre-existing content or written content that is too similar to another page’s
  3. A syndicated page is registering as a duplicate
  4. You have pages with identical content but it’s appropriate, such as a product appearing on two different category pages

Then, you will want to respond with the corresponding solutions:

  1. Report the duplicate to Google
  2. Immediately remove the content and produce original, high-quality content
  3. Discuss with your syndication partner which page should be the canonical URL, then implement a canonical tag that reflects the correct canonical
  4. Employ the canonical tag with the designated canonical URL

When to Use Canonical Tags

When it comes to canonical tags, you can reduce duplicate content issues by always using canonical tags. However, if you’re updating your site, you want to prioritize:

  • Product category pages with variation filtering: this includes differing sizes, brands, colors, and quantities. Each of these variations generates its own URL.
  • Articles and pages that use pagination: often these are long blogs that have been divided into multiple pages.
  • Product pages that appear on multiple category pages.
  • Pages that have similar content, such as information about your business.

Implementing Canonical Tags on Your Website

Do you have to be a webmaster to implement canonical tags? Not necessarily. If you’re comfortable with working on your site’s HTML code, you can implement canonical tags on your own. 

Here’s how to set up canonical tags:

Canonical Tags in the HTML <head>

The easiest way to employ your canonical tags is to insert the link tag into the <head> section of your page’s HTML. (A canonical can also be declared in an HTTP header, but for most sites the HTML tag is simpler.)

  1. Identify your preferred canonical URL.
  2. Add a rel=canonical link tag to the non-canonical page’s <head> section with the correct canonical URL link inserted into the HTML link tag.

It should look like this (copy-and-paste version):

<link rel="canonical" href="https://yoursite.com/canonicalpage" />

That’s all there is to it. There’s no need to be a webmaster to link to the canonical version of a page.

Verifying Your Canonical Tag

To check if you’ve correctly implemented your canonical tag with the correct URL, you will need to view the source code of your webpage. This process is easy. 

  1. First, navigate to the version of a web page or piece of content you want to check using your browser.
  2. Then, right-click anywhere on the page and select Inspect. This opens the page’s source code–you can do this on your own site or any other site to view canonical link elements.
  3. After the source code panel opens, press Ctrl + F on Windows or Command + F on Mac. Then type “canonical” into the search bar (find by string, selector, or XPath).
  4. The word “canonical” will come into view highlighted in yellow, making the tag easy to spot for verification. Check that the canonicalized URL is correct. If no results appear, the page does not have a canonical tag.

Other Ways to Verify Canonical Tags

Google Search Console and GSC Insights are great tools for finding pages that have been incorrectly tagged. As you’re looking through your organic traffic stats and notice search traffic arriving at a non-canonical page, your canonical tags may be incorrect. 

To fix these pages, you will want to navigate to the specific URL then inspect the page.

Canonical URLs in Your Sitemap

When creating or updating your sitemap, do not include duplicate URLs. You only need to include your canonical URLs. Including only the canonical version of a page in your sitemap hints to Google’s bots which version of the content is preferred.

Should You Exclude Duplicate Pages in Your Robots.txt File?

You should not disallow duplicate pages in your robots.txt file. This would block Google from using these pages’ ranking signals. When you correctly implement canonical tags, ranking signals, such as engagements (clicks, scrolls, text entering) and content signals will count toward the canonical page’s metrics.

How to Use Canonical Tags in Your CMS

You may edit your site via a CMS platform such as WordPress, Shopify, Wix, or BigCommerce. Most of these CMSs have specific instructions for adding canonical link tags without directly editing your HTML document. We will cover the most common CMS platforms.

Using Yoast for Canonical Tags in Wix, Shopify, or WordPress Sites

Using Yoast SEO plugin for WordPress, Shopify, or Wix, you can easily edit and add the preferred URL as your canonical tag.

  1. After adding the Yoast SEO plugin, you will find the Advanced menu at the bottom of the Yoast editing panel. Open this menu.
  2. Enter the version of the URL you want to designate as the canonical URL.

Don’t Make These 8 Canonical Tag Mistakes

Canonical tags only work well when implemented correctly–and incorrect implementation can be a disaster. Luckily, there are common mistakes you can avoid to ensure your e-commerce site or ad revenue site makes the most of your next Google crawl.

If you notice that you’re receiving organic traffic to a non-preferred version of a page, you will want to check for the following problems:

1. Do Not Use 301 Redirects Instead of Canonical Links

Google and other search engines created canonical attributes to improve the organization of websites and improve the user experience. When you use 301 redirects, you will increase your page load time, because the browser must follow the redirect before it can load the destination version of the page.

Additionally, when you opt for a redirect instead of a canonical attribute, you’re sending the wrong signal to Googlebots.

2. Internal Links & Canonical Tags

Do not select a page without any internal links pointing to it as your canonical version. Canonical tags are just hints to crawlers, and a canonical URL with no internal links (and no sitemap entry) may never be crawled or indexed.

3. Using ‘noindex’ on Any of your Duplicate Pages

There’s no need to prevent Googlebots from indexing your duplicate pages. In fact, you want your duplicate pages to pass their link equity and other quality signals onto your canonical page.

Noindex should be reserved for gated content and other content you want to hide from search results.

4. Prevent 4XX Status Codes for the Canonicalized URL

Be sure to enter the URL for your canonical link correctly. If you’re unsure of what version to use, consider making the absolute URL your default.

An absolute URL should include the protocol (HTTPS), the domain name (www.yourhomepage.com), and any subfolders (/subfolder). Remember that you want to use the HTTPS protocol to demonstrate your site has SSL security for your users.

And always check that your preferred URL has been spelled correctly. This is the most common reason for a 404 error.

5. Canonicalizing All Paginated Pages to the Root Page

When creating blog posts or guides with multiple web pages, do not canonically link to the first page in the series from the subsequent pages. This will prevent Googlebot from indexing the full series. Instead, give each page in the series a self-referencing canonical tag. (Google once recommended rel=”prev” and rel=”next” for paginated series, but announced in 2019 that it no longer uses them as an indexing signal.)

6. Not Using Canonicals with Hreflang Tags

Hreflang tags tell Google that a page appears in multiple languages to better serve a diverse and multi-regional audience. Differing language versions can be viewed as content duplicates. Therefore, Google asks that webmasters use hreflang tags in conjunction with self-referencing canonical tags–each language version should canonicalize to itself, not to another language’s URL.
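As a sketch, the head of the English version of a page might combine a self-referencing canonical with hreflang alternates (all URLs here are hypothetical):

```html
<!-- English version at /en/page/; each language version lists all
     alternates and canonicalizes to itself -->
<link rel="canonical" href="https://www.example.com/en/page/" />
<link rel="alternate" hreflang="en" href="https://www.example.com/en/page/" />
<link rel="alternate" hreflang="es" href="https://www.example.com/es/page/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/en/page/" />
```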

7. Using Multiple Canonical Tags on One Page

An often overlooked issue is accidentally using more than one rel=canonical tag. This problem can arise when more than one person edits a page. Luckily, it’s easy to fix and easy to avoid if you’re aware of it.

8. Basic Typos in the Canonical URL

If you insert a canonical tag, but notice organic traffic arriving at the non-preferred page, double-check that all elements are placed correctly. Note that one of the most commonly skipped characters is the end slash.

Embrace Canonical Tags & Enjoy Better SEO Results

If you’re not using canonical tags, you’re likely missing out. Canonical tags can prevent a multitude of duplicate content issues that arise from URL variants, resulting in better SEO performance and a more organized site for Google to crawl. Furthermore, when you implement canonical tags, all of your search metrics will be compiled into one tidy page rather than countless variants.

Stay ahead on your search metrics and make the most of your consolidated data with the best keyword tracking tool available.

Whenever a user types a web page URL into their web browser and hits enter, they send a request to the web server to access that specific website. The web server responds with the requested page (plus any additional resources, like images or scripts, that the page needs), and the browser displays the page. The server also returns an HTTP status code with every response.

Most of the time, these HTTP status codes are not shown because the request was successful. However, when the server cannot access the requested resource, it will provide an explanation of why it wasn’t successful via a specific response status code.

This list of HTTP status codes will define the most common types of response codes you might see and those that may be impacting your SEO performance.

What are HTTP Status Codes?

The HTTP status code is a three-digit number that tells the browser what happened when it tried to connect to the server. HTTP status codes communicate to the web browser and its user whether or not their request was successful.

HTTP Status codes are a big part of SEO because successful requests to the origin server make a better experience for search engine crawlers and for website visitors.

In contrast, response status codes that indicate errors or a missing target resource can signal to users, and Google, that the website owner is not doing the necessary maintenance of their website.

Types of Status Codes

There are five different series of status codes. All status codes are three digits. The beginning digit highlights the type of status code returned by the server.

  • 1xx: Provides Information
  • 2xx: Indicates Success
  • 3xx: Redirected page, meaning the page has been moved to another URL
  • 4xx: Client error, meaning something is wrong with the request or the requested web page
  • 5xx: Server error, meaning something went wrong with the server or its connection
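The five series above can be sketched in a few lines of Python; this is an illustrative helper (not any particular tool's API) that uses the standard library's table of reason phrases:

```python
from http import HTTPStatus

# First digit of the code maps to its series
SERIES = {
    1: "Informational",
    2: "Success",
    3: "Redirection",
    4: "Client error",
    5: "Server error",
}

def describe(code: int) -> str:
    """Return the code, its standard reason phrase, and its series."""
    series = SERIES.get(code // 100, "Unknown")
    try:
        phrase = HTTPStatus(code).phrase  # e.g. 404 -> "Not Found"
    except ValueError:
        phrase = "Unassigned"  # not every three-digit number is defined
    return f"{code} {phrase} ({series})"

print(describe(200))  # 200 OK (Success)
print(describe(404))  # 404 Not Found (Client error)
```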

Most Common HTTP Status Codes

There are 60+ possible status codes, but some are more common than others. Some are also important when thinking about search engine crawlers and what happens when they follow links to various URLs on your website.

200: Success

Status codes in the 200 series are what you are aiming for. They communicate that the request was successful (a 201, for example, means the server created a new resource). 2xx codes indicate that the server is working properly and that the site visitor and the website are connecting properly.

Whenever a page returns a status code other than 200, the Site Auditor will flag it in your report with the following message:

  • Status code not 200

301: Permanent Redirect

Arguably one of the most important status codes for SEO purposes, a 301 redirect communicates that a web page has been permanently moved to a new location or a new URL. When a user enters the old URL in their browser, or clicks on a link with the old URL, they will be redirected to the page’s new URL.

301 Redirects, when used properly, can help improve your SEO. They ensure that you do not lose link equity when moving or updating content on your website. For this reason, the Site Auditor does flag issues related to 301 redirects when crawling and analyzing your site.

Some issues related to 301s that you might see highlighted in your issues report include:

  • 301 Does Not Redirect to HTTPS: 301 redirects should take users to the HTTPS version of a web page, as it provides a safer browsing experience for users.
  • Redirect URL not lowercase: Redirect URLs should be lowercase so search engine crawlers do not mistake the new page for duplicate content or a duplicate version of the page.
  • Internal links with 301 redirects: Google looks down upon internal links with 301 redirects. It prefers that webmasters update their links with the new URLs of the relocated pages.
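As a sketch, on an Apache server a permanent redirect can be declared in an .htaccess file (the paths and domain below are hypothetical; nginx and other servers use different syntax):

```apache
# Permanently redirect an old page to its new URL
Redirect 301 /old-page https://www.example.com/new-page

# Send all HTTP traffic to HTTPS (requires mod_rewrite)
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```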

404: Not Found

Status codes in the 400 series are generally used when the client has made a request that the server can’t fulfill.

For example, the 400 status code is used when the client sends a malformed request the server can’t understand, while the 404 status code is used when the client requests a resource that doesn’t exist. The 401 status code is used when the client doesn’t have the appropriate authentication credentials. The 408 status code is used when the client takes longer to send its request than the server is willing to wait.

404s are not only bad for the user experience of your website, they are particularly bad for your SEO performance. If search engine crawlers are being repeatedly sent to unavailable or dead pages, Google is less likely to see your website as providing valuable content or a high quality page experience to users.

For this reason, the following 404 status code errors will be flagged in your site auditor report:

  • Url gives soft 404

What Causes a 404 Response Code?

Here are some of the potential reasons why a URL might be 404ing and how to resolve the issue:

  • Deleted/Moved page: The page’s content may have been deleted or moved, causing a broken link. To fix this, add a 301 redirect that sends users and search engine crawlers to the new version of the page.
  • Incorrect URL: The URL was incorrectly typed into the browser’s address bar, or the wrong URL was added to a link. Double-check that your links are using the right URLs.
  • Caching problems: The browser may cache the 404 error page instead of the actual content. Therefore, you keep seeing the error even when the website works for everyone else.
  • Missing Asset: If there is a missing asset, such as an image, CSS, or JavaScript file, it can generate a 404 error. The missing asset needs to be updated or replaced.

500: Internal Server Error

Status codes in the 500 series are general error messages. They are used when the server encounters an error while processing the request. These errors can often feel a bit like a mystery.

For example, the 500 status code is a generic error used when the server hits an unexpected condition while processing the request. The 501 status code is used when the server doesn’t support the functionality needed to fulfill the request. The 502 status code is used when a server acting as a gateway receives an invalid response from the server upstream, and the 503 status code is used when the server is overloaded or down for maintenance.

If your web page is returning a 500 status code error, try the following fixes:

  • Refresh your browser: This is the best place to start. A second request to the server may produce a successful HTTP status code.
  • Delete web browser cookies: Doing this may help reload the web page.
  • Deactivate a plugin: Especially if the 500 HTTP status code appeared shortly after the installation of a plugin. It is possible that the plugin conflicts with other software, or that a software update made the system incompatible.
  • Come back later: It’s possible that requests at a later time will be successful.

How Do You Know What the HTTP Status Code Is?

It’s important to investigate URLs on your website that are producing an invalid response. Why? Because they can prevent users from arriving at the requested resource.

Resolving them can mean better keyword rankings and fewer site visitors bouncing away from your website.

There are two primary ways that you can check the response codes of your web pages.

Use your Google Search Console Account

In your GSC account, navigate to Index > Pages.

You’ll find a display summarizing various errors related to indexing. Messages about 404s or 500 errors will appear in this list.

Click on the error to then analyze the impacted pages more closely.

Dashboard Site Auditor Report

The Site Auditor will check the HTTP response codes of your webpages. It will also flag any issues it identifies in relation to the status code.

After running your site audit, navigate to the Issues tab in the Site Auditor dashboard.

Click on the “Page URL” category.

Look for any error messages mentioning HTTP status codes, and then click, “See Affected Pages.”

You’ll see a complete list of all of the pages on your website that are not returning 200 status codes.

Hand this list to your web developer to resolve the issue, or connect with one of our SEO experts to determine your next steps for resolving the issues.

Conclusion

Now that you understand the most important HTTP status codes for SEO, you can hopefully resolve any errors on your web pages.

But if you are still not quite sure why your web page URLs are returning specific HTTP status codes, you may want to reach out to a technical SEO agency to see if they can help you resolve the issue.

A crawl budget may seem like a foreign concept when you’re first learning about how search engine bots work. While it’s not the easiest SEO concept, it’s less complicated than it may seem. Once you understand what a crawl budget is and how search engine crawling works, you can begin to optimize your website for crawlability. This process will help your site achieve its highest potential for ranking in Google’s search results.

What Is a Crawl Budget?

A crawl budget is the number of URLs from one website that search engine bots can crawl and index within one session. The “budget” of a crawl session differs from website to website based on each individual site’s size, traffic metrics, and page load speed.

If you’ve gotten this far and the SEO terms are unfamiliar to you, use our SEO glossary to become more familiar with the definitions.

What Factors Affect a Website’s Crawl Budget?

Google doesn’t devote the same amount of time or number of crawls to every website on the internet. Webcrawlers determine which pages they crawl, how often, and for how long based on several factors:

  • Popularity: The more a site or page is visited, the more often it should be analyzed for updates. Furthermore, more popular pages will accrue more inbound links more rapidly.
  • Size: Large websites and pages with more data-intense elements take longer to crawl. 
  • Health/Issues: When a webcrawler reaches a dead-end through internal links, it takes time for it to find a new starting point–or it abandons the crawl. 404 errors, redirects, and slow loading times slow down and stymie webcrawlers.

How Does Your Crawl Budget Affect SEO?

The webcrawler indexing process makes search possible. If your content cannot be found and then indexed by Google’s webcrawlers, your web pages–and your website–will not be discoverable by searchers. This would lead to your site missing out on a lot of search traffic.

Why Does Google Crawl Websites?

Googlebots systematically go through a website’s pages to determine what the page and overall website are about. The webcrawlers process, categorize, and organize data from that website page-by-page in order to create a cache of URLs along with their content, so Google can determine which search results should appear in response to a search query. 

Additionally, Google uses this information to determine which search results best fit the search query in order to determine where each search result should appear in the hierarchical search results list.

What Happens During a Crawl?

Google allots a set amount of time for a Googlebot to process a website. Because of this limitation, the bot likely will not crawl an entire site during one crawl session. Instead, it will work its way through all the site’s pages based on the robots.txt file and other factors (such as the popularity of a page). 

During the crawl session, a Googlebot will use a systematic approach to understanding the content of each page it processes. 

This includes indexing specific attributes, such as:

  1. Meta tags and using NLP to determine their meaning
  2. Links and anchor text
  3. Rich media files for image searches and video searches
  4. Schema markup
  5. HTML markup

The web crawler will also run a check to determine if the content on the page is a duplicate of a canonical page. If so, Google will move the URL to a low-priority crawl, so it doesn’t waste time crawling the page as often.

What are Crawl Rate and Crawl Demand?

Google’s web crawlers assign a certain amount of time to every crawl they perform. As a website owner, you have no control over this amount of time. However, you can change how quickly they crawl individual pages on your site while they’re on your site. This number is called your crawl rate.

Crawl demand is how often Google crawls your site. This frequency is based on the demand of your site by internet users and how often your site’s content needs to be updated on search. You can discover how often Google crawls your site using a log file analysis (see #2 below).

How Can I Determine My Site’s Crawl Budget?

Because Google limits how often and for how long it crawls your site, you want to be aware of what your crawl budget is. Unfortunately, Google doesn’t provide site owners with this data, which is a problem if your budget is so narrow that new content won’t hit the SERPs in a timely manner. That can be disastrous for important new content, like the product pages that make you money.

To understand if your site is facing crawl budget limitations (or to confirm that your site is A-OK), you will want to:

  1. Get an inventory of how many URLs are on your site. If you use Yoast, your total will be listed at the top of your sitemap URL.
  2. Once you have this number, use the “Settings” > “Crawl stats” section of Google Search Console to determine how many pages Google crawls on your site daily.
  3. Divide the number of pages on your sitemap by the average number of pages crawled per day.
  4. If the result is below 10, your crawl budget should be fine. However, if your number is higher than 10, you could benefit from optimizing your crawl budget.
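The calculation in steps 3 and 4 is simple division; here is a sketch with made-up numbers:

```python
def crawl_ratio(total_urls: int, avg_pages_crawled_per_day: float) -> float:
    """Sitemap URL count divided by Google's average daily crawl count."""
    return total_urls / avg_pages_crawled_per_day

# Hypothetical figures: 4,500 URLs in the sitemap, ~600 pages crawled daily
ratio = crawl_ratio(4500, 600)
print(round(ratio, 1))  # 7.5 -> under 10, so the crawl budget is likely fine
print("optimize" if ratio > 10 else "ok")
```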

How Can You Optimize for Your Crawl Budget?

When the time comes that your site has become too big for its crawl budget–you will need to dive into crawl budget optimization. Because you cannot tell Google to crawl your site more often or for a longer amount of time, you must focus on what you can control. 

Crawl budget optimization requires a multi-faceted approach and an understanding of Google best practices. Where should you start when it comes to making the most of your crawl rate? This comprehensive list is written in hierarchical order, so begin at the top.

1. Consider Increasing Your Site’s Crawl Rate Limit

Google sends requests simultaneously to multiple pages on your site. However, Google tries to be courteous and not bog down your server, which would result in slower load times for your site visitors. If you notice your site lagging out of nowhere, this may be the problem.

To avoid harming your users’ experience, Google allows you to reduce your crawl rate. Doing so will limit how many pages Google can index simultaneously.

Interestingly enough, though, Google also allows you to raise your crawl rate limit–the effect being that it can pull more pages at once, resulting in more URLs being crawled simultaneously. That said, reports suggest Google is slow to respond to a crawl rate limit increase, and raising the limit doesn’t guarantee that Google will crawl more pages at once.

How to Increase Your Crawl Rate Limit:

  1. In the Search Console, go to “Settings.”
  2. From there, you can view if your crawl rate is optimal or otherwise.
  3. Then you can increase the limit to a more rapid crawl rate for 90 days.

2. Perform a Log File Analysis

A log file analysis is a report from the server that reflects every request sent to the server. This report will tell you exactly what Googlebots do on your site. While this process is often performed by technical SEOs, you can talk to your server administrator to obtain one.

Using your Log File Analysis or server log file, you will learn:

  • How frequently Google crawls your site
  • What pages get crawled the most
  • Which pages have an unresponsive or missing server code

Once you have this information, you can use it to perform #3 through #7.

3. Keep Your XML Sitemap and Robots.txt Updated

If your Log File shows that Google is spending too much time crawling pages you do not want appearing in the SERPs, you can request that Google’s crawlers skip these pages. This frees up some of your crawl budget for more important pages.

Your sitemap (which you can obtain from Google Search Console or our dashboard) gives Googlebots a list of all the pages on your site that you want Google to index so they can appear in search results. Keeping your sitemap updated with all the web pages you want search engines to find and omitting those that you do not want them to find can maximize how webcrawlers spend their time on your site.

Your robots.txt file tells search engine crawlers which pages you want and do not want them to crawl. If you have pages that don’t make good landing pages or pages that are gated, block their URLs with a Disallow rule in your robots.txt file, or add a noindex meta tag to the pages themselves (Google no longer supports noindex directives inside robots.txt). Googlebot will skip disallowed URLs and drop noindexed pages from search results.
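A minimal robots.txt sketch that steers crawlers away from low-value sections (the paths below are hypothetical; adjust them to your own site):

```
# robots.txt - hypothetical example
User-agent: *
Disallow: /thank-you/
Disallow: /internal-search/

Sitemap: https://www.example.com/sitemap.xml
```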

4. Reduce Redirects & Redirect Chains

In addition to freeing up the crawl budget by excluding unnecessary pages from search engine crawls, you can also maximize crawls by reducing or eliminating redirects. These will be any URLs that result in a 3xx status code.

Redirected URLs take longer for a Googlebot to retrieve, since the server responds with the redirect and the bot must then request the new page. While one redirect takes just a few milliseconds, they can add up and make crawling your site take longer overall. This amount of time is multiplied when a Googlebot runs into a chain of URL redirects.
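To see why chains add up, here is a toy simulation (the URL map is invented); each hop in the chain is another round trip for the crawler, so collapsing the chain into a single redirect saves time:

```python
# Toy redirect map: each key 301-redirects to its value (hypothetical URLs)
REDIRECTS = {
    "/old-post": "/blog/old-post",
    "/blog/old-post": "/blog/new-post",
}

def resolve(url: str, max_hops: int = 10) -> tuple[str, int]:
    """Follow the redirect chain and count the extra round trips."""
    hops = 0
    while url in REDIRECTS and hops < max_hops:
        url = REDIRECTS[url]
        hops += 1
    return url, hops

final, hops = resolve("/old-post")
print(final, hops)  # /blog/new-post 2 -> collapse this chain to one hop
```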

To reduce redirects and redirect chains, be mindful of your content creation strategy and carefully select the text for your slugs.

5. Fix Broken Links

The way Google often explores a site is by navigating via your internal link structure. As it works its way through your pages, it will note if a link leads to a non-existent page. (If a dead page returns a 200 status code instead of a 404, it’s referred to as a soft 404 error.) It will then move on, not wanting to waste time indexing said page.

The links to these pages need to be updated to send the user or Googlebot to a real page. Or (while it’s hard to believe) the Googlebot may have misidentified a page as a 4xx or 404 error when the page actually exists. When this happens, check that the URL doesn’t have any typos, then submit a crawl request for that URL through your Google Search Console account.

To stay current with these crawl errors, you can use your Google Search Console account’s Index > Coverage report. Or use the Site Audit tool to find your site error report to pass along to your web developer.

Note: New URLs may not appear in your Log File Analysis right away. Give Google some time to find them before requesting a crawl.

6. Work on Improving Page Load Speeds

Search engine bots can move through a site at a rapid pace. However, if your site speed isn’t up to par, it can take a major toll on your crawl budget. Use your Log File Analysis and PageSpeed Insights to determine if your site’s load time is negatively affecting your search visibility.

To improve your site’s response time, follow Google’s Core Web Vitals best practices. This can include image optimization for media above the fold.

If the site speed issue is on the server-side, you may want to invest in other server resources such as:

  • A dedicated server (especially for large sites)
  • Upgrading to newer server hardware
  • Increasing RAM

These improvements will also give your user experience a boost, which can help your site perform better in Google search, since site speed is a ranking signal.

7. Don’t Forget to Use Canonical Tags

Duplicate content is frowned upon by Google—at least when you don’t acknowledge that the duplicate content has a source page. Why? Googlebot crawls every page unless told to do otherwise. However, when it comes across a duplicate page or a copy of something it’s familiar with (on your page or off-site), it will stop crawling that page. And while this saves time, you should save the crawler even more time by using a canonical tag that identifies the canonical URL.

Canonicals tell the Googlebot to not bother using your crawl time period to index that content. This gives the search engine bot more time to examine your other pages.

8. Focus on Your Internal Linking Structure

Having a well-structured linking practice within your site can increase the efficiency of a Google crawl. Internal links tell Google which pages on your site are the most important, and these links help the crawlers find pages more easily.

The best linking structures connect users and Googlebots to content throughout your website. Always use relevant anchor text and place your links naturally throughout your content.

For e-commerce sites, Google has best practices for faceted navigation options to maximize crawls. Faceted navigation allows site users to filter products by attributes, making shopping a better experience. Following these best practices helps avoid canonical confusion and duplicate content issues, in addition to excess URL crawls.

9. Prune Unnecessary Content

Googlebots can only move so fast and index so many pages each time they crawl a site. If you have a high number of pages that do not receive traffic or have outdated or low-quality content–cut them! The pruning process lets you cut away your site’s excess baggage that can be weighing it down.

Having excessive pages on your site can divert Googlebots onto unimportant pages while your important ones go ignored.

Just remember to redirect any links to these pages, so you don’t wind up with crawl errors.

10. Accrue More Backlinks

Just as Googlebots arrive on your site and begin to index pages based on internal links, they also use external links in the indexing process. If other sites link to yours, Googlebot will travel over to your site and index pages in order to better understand the linked-from content.

Additionally, backlinks give your site a bit more popularity and recency, which Google uses to determine how often your site needs to be indexed.

11. Eliminate Orphan Pages

Because Google’s crawler hops from page to page through internal links, it can effortlessly find pages that are linked to. However, pages that are not linked to from anywhere on your site often go unnoticed by Google. These are referred to as “orphan pages.”

When is an orphan page appropriate? When it’s a landing page with a very specific purpose or audience. For example, if you send out an email to golfers who live in Miami with a landing page that only applies to them, you may not want to link to that page from anywhere else on your site.

The Best Tools for Crawl Budget Optimization

Search Console and Google Analytics can come in quite handy when it comes to optimizing your crawl budget. Search Console allows you to request crawls of specific pages and track your crawl stats. Google Analytics helps you see how visitors move through your internal links.

Other SEO tools, such as the dashboard, allow you to find crawl issues easily through the Site Audit tools. With one report, you can see your site’s:

  • Indexability Crawl Report
  • Index Depth
  • Page speed
  • Duplicate Content
  • XML Sitemap
  • Links

Optimize Your Crawl Budget & Become a Search Engine Top-Performer

While you cannot control how often search engines index your site or for how long, you can optimize your site to make the most of each of your search engine crawls. Begin with your server logs and take a closer look at your crawl report on Search Console. Then dive into fixing any crawl errors, your link structure, and page speed issues.

As you work through your GSC crawl activity, focus on the rest of your SEO strategy, including link building and adding quality content. Over time, you will find your landing pages climb the search engine results pages.

What Are Core Web Vitals? (+4 Tips for Improvement)

When it comes to Search Engine Optimization (SEO), Google’s Core Web Vitals are a make-or-break ranking factor. Unlike many other SEO metrics, these metrics measure the performance of a website from the user’s perspective. The Core Web Vitals update is about making websites more enjoyable, more responsive, and snappier.

While Google’s Core Web Vitals were designed to help web developers make the world wide web a happier place for everyone, Google also uses them as a ranking metric for search engine results. This means these straightforward metrics offer the opportunity for web developers to improve a website while SEO experts have one more way to improve a website’s ranking.

Core Web Vitals establish a win-win for businesses and their visitors. We can all agree the web experience is better without delayed load times, shifting content, or other issues. It’s also awesome that optimizing your website’s Core Web Vitals can result in a tangible impact on organic visitors and your business’s conversion rates.

What Are Core Web Vitals?

When a doctor checks their patient’s vitals, they get a snapshot of that person’s overall health without performing a full examination. Instead, they measure the most important indicators: temperature, heart rate, and blood pressure. This is similar to how Google’s Core Web Vitals give you a sneak peek of your website’s usability health and page experience.

Through three indicating factors, Core Web Vitals allow web developers and site owners to decide where updates need to be applied first.

What Do Core Web Vitals Measure?

While your health vitals include heart rate, temperature, and blood pressure, a website’s Core Vitals include:

  • Largest Contentful Paint (LCP)
  • Cumulative Layout Shift (CLS)
  • First Input Delay (FID)

If you read these terms and felt suddenly overwhelmed, don’t worry. These, like so many other technical SEO and web development terms, are less complicated than they sound. And we’ll explain each in-depth for you.

Why Did Google Develop Core Web Vitals?

Google realized that many top-ranking websites didn’t provide an optimal experience for visitors–despite providing relevant content. It was difficult to quantify the usability of a website from a visitor’s perspective with the existing metrics. So, Google made it a mission to fix both.

Was the web experience really that bad? Yes and no. When Google first rolled out their Core Web Vitals, 98% of top-ranking sites did not pass Core Web Vitals standards, and some websites took huge hits in ranking positions after the Page Experience update went live in June 2021.

It’s important to remember that Google wants to improve the overall health of the web ecosystem. The internet is built for users, after all.

In addition to setting a standard for measuring the biggest complaints among users, Core Web Vitals address another challenge: web development techniques have become extremely diverse, and user-centric metrics apply consistently across all the many ways to architect a website.

What Other Advantages Are There for Examining Core Web Vitals?

Google views these metrics as a way to initiate a sea-change for the entire industry. According to Annie Sullivan of Google’s Core Web Vitals program, by emphasizing the user experience in relation to load time, third-party content management systems like WordPress, Shopify, and BigCommerce will be motivated to improve their products. This benefits small businesses, users, and web innovation.

Breaking Down LCP, CLS, and FID

Core Web Vitals are defined by three main metrics: LCP, CLS, and FID. These aspects of a website all impact how long a user interacts with a website. 

By focusing on these issues, web developers can greatly improve the user experience. By targeting these metrics, SEO experts can better optimize websites.

Largest Contentful Paint (LCP)

LCP is the render time for the biggest content element visible to the user–this differs from First Contentful Paint (FCP). The LCP does not count any elements that fall below the “fold line.” This is to say that if the user has to scroll to see it, it won’t count against your LCP.

So what may be identified as a website’s LCP? Most often your LCP is the page’s featured image or <h1> tag.

However, it could also be:

  • A video preview
  • An image, including a background image
  • A block of text
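One common LCP improvement is telling the browser to fetch the largest element as early as possible. A sketch, assuming the LCP element is a hero image (the filename is hypothetical):

```html
<!-- In the <head>: start downloading the hero image immediately -->
<link rel="preload" as="image" href="hero.jpg" fetchpriority="high" />
```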
What Is an Ideal LCP Time?

When it comes to speed, seconds matter to your user’s experience. Seconds can feel like ages to a user anxiously waiting for content they’re interested in to load. Ideally, you want your LCP to be under 2.5 seconds.

Why does LCP Matter?

If the most important elements on a page take too long to load, users may back out, causing your bounce rate to balloon. Furthermore, if users are waiting too long for elements to load, you could be losing sales.

Cumulative Layout Shift (CLS)

Have you ever been reading an article and lost your place because the text shifted in the browser? This is likely because an image or video took some time to load somewhere above, leading to poor visual stability. CLS quantifies how large and how frequent these unexpected layout shifts are.

CLS can also result in misclicks. For example, while logging into a website, if the login button shifts, you may wind up clicking “forgot my password.” For obvious reasons, this is frustrating.

What Causes Layout Shifts?

Some of the most common elements that result in high CLS include:

  • Images without specified dimensions
  • Ads and embeds without reserved space (such as iframes with no set dimensions)
  • Too much JavaScript
  • Web fonts loading too late
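For example, giving images explicit dimensions and reserving space for ads lets the browser lay out the page before those assets load. A sketch (the filename and sizes are placeholders):

```html
<!-- width/height let the browser reserve space before the image loads -->
<img src="hero.jpg" width="1200" height="630" alt="Hero image" />

<!-- Reserve a fixed slot for an ad so it can't push content around -->
<div style="min-height: 250px;">
  <!-- ad script injects here -->
</div>
```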
What Is an Ideal CLS Score?

You want your CLS score to be under 0.1. (Unlike LCP and FID, CLS is a unitless score rather than a time.)

Why Does CLS Matter?

The last thing you want is visitors to your website feeling frustrated. When a page shifts while in use, visitors wind up with a negative impression of your website and business.

First Input Delay (FID)

FID measures the delay between a user’s first interaction with a page–such as clicking a link or tapping a button–and the moment the browser can begin responding to it. FID also accounts for contingent entry fields and delays between when you press play and when a video begins.

This web vital was frightening for many web developers because many third-party widgets lag–and most devs use these tools to reduce laborious and cumbersome tasks. However, this metric will motivate CMS and widget developers to up their game.

You will notice that many tools, such as PageSpeed Insights and Lighthouse, use the phrases “Time to Interactive” and “Total Blocking Time.” These often stand in for FID since some web pages do not have interaction opportunities.

What are input interactions?

Input interactions are any actions that let a user interact with your website and that the site needs to respond to. These include:

  • Links
  • Buttons
  • Text fields
  • Drop-down menus
  • Check-boxes
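If you want to see real input delays on your own pages, modern browsers expose them through the PerformanceObserver API. A minimal sketch you could paste into a page (this illustrates the standard web API, not any specific vendor tool):

```html
<script>
  // Log the delay of the visitor's first interaction with the page.
  new PerformanceObserver((list) => {
    for (const entry of list.getEntries()) {
      // Time between the interaction and when its handler could run.
      const delay = entry.processingStart - entry.startTime;
      console.log('First input delay: ' + delay.toFixed(1) + ' ms');
    }
  }).observe({ type: 'first-input', buffered: true });
</script>
```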
What Is an Ideal FID Time?

You want your FID to be under 100 milliseconds.

Why Does First Input Delay Matter?

Poor FID can lead to a major loss in conversions since many users will back out of a sales funnel due to wait times (and having more time to rethink their purchase).

Why Do Core Web Vitals Matter to SEO?

Core Web Vitals are a ranking metric for Google search results. And while there are over 200 factors that determine how your page will rank, poor Web Vitals metrics can have a cumulative negative effect on your website’s performance. 

Additionally, Google representatives have announced that Web Vitals will continue to play a role in how the search engine ranks websites for years to come. And we can expect these metrics to become even more important as they fine-tune and refine the search engine.

Bounce Rate and Brand Reputation

Additionally, an e-commerce website is similar to a brick-and-mortar business in that a user’s first impression matters. If you were to walk into a business and not be greeted, or feel confused by the layout, you would likely turn and leave.

The same is true of the e-commerce experience. When visitors leave quickly, your time-on-page drops and your bounce rate climbs. We have no doubt that improving your website’s SEO through its Web Vitals metrics will increase your conversions.

How to Audit Your Website’s Core Web Vitals

The easiest way to audit your website’s Core Web Vitals is to perform a free SEO audit to identify the issues lowering your score through field data metrics, then fix those problems.

Your Core Web Vitals Grading Scale

  • 90 to 100 (green): Best
  • 50 to 89 (orange): Moderate
  • 0 to 49 (red): Poor

However, you can also check your website’s CWVs using a few different approaches.

Auditing Your Website Using Web Tools

Lighthouse in Google Chrome

  1. Switch to Google Chrome’s Incognito mode.
  2. Navigate to your website.
  3. Open page inspection: Right-click on the page and then select “inspect.”

Or, hold Ctrl + Shift + C. Then select “inspect.”

  4. After the inspection menu appears, select the double-arrow icon and then select “Lighthouse.”
  5. After the menu loads, select the “Generate report” button and choose which device type you want the report to reflect. We recommend looking at both.
  6. Once the report appears, you will see a score for each category. Note that you will need to use “Total Blocking Time” in place of FID. You can select the expanded list button for more details.

Using PageSpeed Insights

  1. Using the PageSpeed Insights tool, paste your URL into the text field. Then select “Analyze.”
  2. You will see that a report appears with the Web Vitals appearing as red, orange, or green. You will also notice the Core Web Vitals have a blue bookmark icon next to them, and a button you can select to expand each category to learn more.
  3. When you scroll down, you will find suggestions to improve your website’s performance.

Using the Dashboard

If you are a dashboard user, you can easily audit your Core Web Vitals in the GSC Insights tool.

Simply find the page you want to review metrics for and select, “Page Insights.”

In addition to the Historical Data for the page, the tool also shows you whether the page is meeting Google’s Core Web Vitals standards.

4 Tips for Improving Core Web Vitals Across Your Web Pages

When you first see your Core Web Vitals report, it’s easy to feel overwhelmed. However, it’s important to see Web Vitals as an opportunity. And some of the biggest fixes are also the easiest. Here are four we recommend you do sooner rather than later:

 

  1. Know your numbers and set goals: Use our dashboard, Google Search Console, Lighthouse, or PageSpeed Insights to understand your current Web Vitals scores. Identify if your website struggles with a common issue. If you find a commonality, such as images that are not optimized, you can add web development commit hooks to your CMS to automatically fix these issues.
  2. Review your real-users data often: While lab numbers are insightful, it’s important to check your reports regularly to understand what your real visitors experience. Remember to run reports after updates to ensure your changes don’t have a negative effect on your metrics.
  3. Identify your Largest Contentful Paint and optimize it: You can identify your LCP element by selecting the “Performance” menu and then following the instructions to record the page load. Once identified, compress or otherwise optimize that element.
  4. Consider content delivery networks for international business: While we don’t often think about physical distance regarding the Internet, it can cause a loading issue. If your server is countries away from a user, you may be better off using a content delivery network since CDNs can deliver content faster to more remote regions.

Final Thoughts on Core Web Vitals

Core Web Vitals are metrics that measure how enjoyable your website is for users. They are measured by the load time of the Largest Contentful Paint, the First Input Delay, and the shifting of website elements. While Core Web Vitals are one of many ways Google decides where a website should appear in its search engine results, improving these metrics also improves many businesses’ conversion rates and reputations.

 

How quickly does your website load for desktop and mobile users? If the answer is more than two or three seconds, you may be losing traffic as visitors bounce back to the search results page and choose a faster-loading page. The question we can help you answer is how to improve website performance for a better user experience and higher Google rankings.

When it comes to website speed for e-commerce websites, time is money. A few extra seconds of page load time could have a major impact on your ability to engage visitors, make sales, and boost your overall conversion rate. However, if you’re looking to boost your site’s speed on searchers’ browsers, you’ve come to the right page. This article will cover how to transform your website’s page load speed from laggy to snappy for a better user experience.

How Website Speed Impacts Your Business and SEO

When it comes to your site’s speed, load time does more than just make your users wait–it affects your site’s ranking, your visitors’ user experience, and more.

Page Load Time is a PageRank Factor

Google is on a mission to make searching the internet a better experience for all. One way it does this is by prioritizing fast-loading pages in its search results. Page load time is a Google ranking factor that’s become even more prominent with the release of Google’s Core Web Vitals.

This is all to say that a fast site speed is essential if you want your site to rank higher and for more keywords. So, when you improve your website performance, you’re improving your chances of ranking and earning organic traffic.

Long Load Times Lead to Less Revenue

It should be no surprise that the longer the delay in page load time, the more traffic a site will lose. A slow website can result in lost sales opportunities, lost revenue, and lost growth potential. According to Google’s mobile speed research, 53% of mobile users will exit a page if it takes longer than 3 seconds to load.

Slow page speed also disrupts the user experience, often impacting buying decisions. 

Conversely, increasing site speed corresponds to higher conversion rates, increased revenue, and better brand credibility. 

Here’s how page speed has impacted some of the best-known, enterprise-level websites:

  • Amazon reported a 1% loss in revenue for every 100 milliseconds of page load delay
  • Walmart saw a 2% increase in conversion rate for every second of page speed improvement
  • Mozilla increased page load speed by 2.2 seconds, and Firefox downloads increased by 15.4% (or 10 million in a year)
  • Shopzilla decreased load time from 7 to 2 seconds and saw a 50% decrease in its operational budget

Companies of all sizes experience positive business results related to increasing site speed. Even for smaller sites, improving load times needs to be a priority in your search engine optimization (SEO) efforts.

Search Engines Prefer Providing Searchers Fast Websites

Website speed figures significantly into the algorithms used to rank sites in search engine results. The faster your site loads, especially for mobile searches, the better your position in the SERPs.

Site load time is part of Google’s search ranking algorithm. And, because of its mobile-first policy, load times on mobile sites now take precedence over desktop systems. 


What Is an Ideal Load Time According to Google?

Google offers these benchmarks to help site owners set the bar for page speed:

Average speed index (how quickly a mobile page displays to a user): 3 seconds

Average request count (number of content pieces needed to display the entire mobile page): Fewer than 50

Average page weight (total size of a mobile web page in bytes): Less than 500K

And Google’s Core Web Vitals outline optimal load times for the most impactful elements: an LCP under 2.5 seconds, an FID under 100 milliseconds, and a CLS score under 0.1.

The bottom line is that site speed, SEO, and business growth are interlinked. If Google downgrades your site due to page speed issues, your rankings will drop, and so will your page views. In extreme cases, your pages may barely surface in the SERPs at all.

This loss of visibility can translate to:

  • Lower ad revenue
  • Fewer conversions
  • Fewer sales
  • Poor brand reputation

Improving your site speed is a key business growth strategy that needs your focus now.

How to Improve Your Website Performance on Desktop & Mobile

It’s obvious that page speed matters. But when it comes to speeding up your pages and overall site, it’s often easier said than done. Why? There is not a one-size-fits-all solution to making every site perform at its optimal speed. Furthermore, site owners, web developers, and SEOs all have their individual technical abilities. 

However, selecting the most appropriate tactics for your site from the list below can help you troubleshoot and then fix page speed errors.

1. Test Your Current Website Speed

There are a number of online tools to test how fast your website runs. Free access to PageSpeed Insights allows every site owner to identify any elements that may be slowing their site down. 

The Google PageSpeed Insights (PSI) tool is the one most commonly used by website owners. It provides you with a report card and excellent insight into what is slowing your site down. Another magical aspect of using PSI is that it provides the same data that Google uses. This gives you a peek into how Googlebot will score your speed while indexing.

Keep in mind that your browser and internet connection will affect your PSI score.

How to Use PSI

Simply enter the URL you want to test into the text field and hit Analyze. For the most accurate data, disable any extensions in your browser.

Then, the PSI tool returns a report on your site’s Web Core Vitals for mobile devices. To see your site’s performance on desktop, select the desktop icon at the top of the screen.

When you scroll down, you will find a speed analysis of your page, including some opportunities that Google suggests. These are recommended ways to improve desktop and mobile page load times.

Key outputs of a PageSpeed Insights report include:

A performance score that summarizes the page’s overall performance. 

  • 90 or above is “fast” 
  • between 50 and 89 is “moderate” 
  • lower than 50 is “slow”

Opportunities for site improvement. Focus on the highest items on the list first. Click the drop-down arrow to the right of an opportunity item to discover tips on how to fix the identified problem.

Diagnostics for technical issues, with the same option to expand each item for more details and an explanation of how to fix it.

What to Do with Your PSI Data

Evaluating your site’s Core Web Vitals and Google’s other page experience signals allows you to prioritize site changes. If you’re tech-savvy, you can fix some of these issues yourself. Otherwise, you may want to consider hiring a web developer for page speed optimization.

If you’re evaluating how to best use your web development budget, perform a URL analysis on your landing pages and popular blogs. Note any overarching themes in your metrics. For example, if your FCP is too high on average, you may have data-rich media above the fold on the majority of your pages. 

You can also keep a list of your pages with the lowest performance scores.

2. Consider changing your web host

If you have a shared hosting plan like those on BlueHost, consider switching to a dedicated server or cloud hosting. 

Though shared hosting invariably comes with a lower price tag, it can also affect site speed because resources like memory and bandwidth are shared across a number (and sometimes quite a large number) of websites. Additionally, you can never account for traffic spikes on another site on the server, which will affect the performance of your website.

Switching to a dedicated server or cloud hosting as the sole website owner can increase site speed because resources are no longer being tapped by multiple sites. This is especially important for enterprise-level organizations that have a high bandwidth requirement in order to serve a robust amount of content.

3. Update your website theme

If you use a content management system (CMS) like WordPress, switch to a current WordPress theme that is already optimized for speed. Such themes are light and flexible, and some are focused only on including elements that support search engine optimization best practices.

While you’re making changes to your website, consider removing unnecessary widgets that require a lot of data to load and run.

4. Minimize HTTP requests

HTTP requests occur when a user first visits your site. They are sent to your server (on your hosting platform), requesting the files needed to render your site on the user’s screen. The more requests needed to get all the files for your site, the more time that web page will take to load. Redirects, such as 301s, add extra requests on top of this.

What are redirects?

Redirects are code instructions that forward your user from one location on your site to another. When you have a number of requests in a chain, it can take the webserver a lot of time to return the right data to your visitors’ browsers. 

One way you can think of redirects is if you’ve asked one of your children to go find a specific book. They arrive at the room where they expect to find the book, but instead, they find a note saying it’s in another room. This detour delays their delivery of the book.

Redirects are commonly used for site migrations, website redesigns, or when content pruning, but each redirect adds to how long it takes for a web page to load. 

How to reduce redirects

It’s best to avoid redirects when you can since they’re one of the easiest ways to slow down website performance. But if you do have some, Google advises that you:

  • Never require more than one redirect to get to any of your resources; and
  • Never link to a page that you know has a redirect on it.

If you have spare time, you can also go through your internal links and point them directly at the final URLs instead of redirected ones. You can also request that your referral sites do the same.
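For example, a chain like http → https → www can usually be collapsed into a single hop. A hedged sketch in Apache .htaccess syntax (the domain is illustrative, and the exact rules depend on your server setup):

```apacheconf
# Send every non-HTTPS or non-www variant straight to the final URL
# in one 301 redirect, instead of chaining two separate redirects.
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]
```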

5. Compress your files

Compressing your site files reduces the amount of data each HTTP request has to transfer. You may see response times decrease by as much as 70 percent. Gzip is a free tool used by web developers to effectively compress site files and improve how quickly a website loads.

This works exceptionally well for text-based resources like HTML, CSS, and JavaScript files; images are usually already compressed and are better served by the optimization tips in the next section.
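To see the effect for yourself, you can gzip a text file from the command line (a rough demonstration; live sites enable compression in the server configuration rather than compressing files by hand):

```shell
# Build a sample HTML file with repetitive markup, which compresses well.
for i in $(seq 1 200); do
  echo '<div class="card"><p>Sample product description text</p></div>'
done > page.html

# Compress at the highest level, keeping the original file (-k).
gzip -9 -k page.html

# Compare the byte counts before and after.
echo "original:   $(wc -c < page.html) bytes"
echo "compressed: $(wc -c < page.html.gz) bytes"
```

On a live site you would enable this in your server configuration instead, for example with nginx’s gzip directives or Apache’s mod_deflate.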

6. Optimize your images, videos, and other media

Images, videos, and other rich media are often the culprits when it comes to slow loading times. On the other hand, compression is often the easiest way to fix slow media loading times.

Save site images in the smallest possible file size without reducing image quality on the user’s end. Some recommendations for optimizing images include:

  • Using JPEG or .jpg format for colorful images, PNG for simple images, and GIF for animated images.
  • Reducing file dimensions to a suitable size that is visible and clear on multiple devices
  • Using an image compression tool like TinyPNG or JPEG Mini to compress images.

You should also use lazy loading for any images or larger elements below the fold. Lazy loading reduces the amount of time it takes the most important elements to render, and it also reduces the number of HTTP requests on the initial load. It works by using the initial bandwidth to prioritize the elements visitors will see first.
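Putting those recommendations together, a compressed below-the-fold image might be marked up like this (file name and dimensions are illustrative):

```html
<!-- Explicit dimensions prevent layout shift; loading="lazy" defers the
     download until the image is about to scroll into view. -->
<img src="team-photo.jpg" alt="Our team"
     width="800" height="533" loading="lazy" />
```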

7. Consider using a content delivery network (CDN)

In today’s digital world we often do not think about data traveling across physical distance, but it does. A CDN is a distributed network of servers that caches your content and delivers it from locations geographically close to your visitors.

Close proximity decreases transmission time, which can improve the user experience by increasing the speed and site performance of desktop and mobile sites.

For example, if your site serves users in Florida, your CDN should have a node within the US, ideally close to Florida. This will reduce page load times since the data for the server request can return to the browser with minimal delay.

8. Check your plugins

Each plugin you add to your site adds to your web page rendering time. This is especially common for WordPress sites. 

Luckily the fix is easy. Review your plugins and ditch the ones you don’t use. If better-optimized plugins can replace the ones you want to keep, make the switch.

9. Clean up your site

Minification, or the process of removing unnecessary and redundant data, can do wonders for your website’s performance. This process is a way to clean up any excessive code that is bogging down your JavaScript (JS), HTML, or CSS files. It can increase your site’s responsiveness instantly.

Minification

To begin minification, you will need to view your website’s code. You can do this in the element inspection menu in Chrome by pressing Ctrl + Shift + I or right-clicking on the page and selecting Inspect. If you notice a lot of unnecessary spaces and line breaks, there’s a good chance minification can make the webpage faster.

If you have a WordPress website, you can then use the HTML editor to pare back excess lines of code. If you’re unfamiliar with working on CSS files, JavaScript files, and HTML, you can always ask your web developer or use an agency as a resource.

The Trade-Off

Of course, there’s a slight trade-off with minification. Longer HTML files, JavaScript files, and other unminified code are often easier for developers to read and navigate.
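To make the trade-off concrete, here is the same toy CSS rule before and after minification:

```css
/* Before: spaced and commented, easy for a developer to read. */
.product-card {
  margin: 0 auto;
  padding: 16px;
  color: #333333;
}

/* After: identical rule, fewer bytes for the browser to download. */
.product-card{margin:0 auto;padding:16px;color:#333}
```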

Other ways you can clean up your site include:

Pruning outdated content, pages, and files. The dashboard’s GSC Insights and Site Audit tools provide you with a list of pages that are underperforming and those that are at risk of keyword cannibalization. 

Additionally, fix or remove broken links to boost how fast your website loads.

10. Enable browser caching

Every user’s browser has a cache. Within the cache is data from the version of your site the user last explored. When a browser is called upon again to load the same site, it will reach into its cache and retrieve the previous data, or static files, to display.

By enabling browsers, such as Chrome, to store your site’s static files, you’re improving your page speed metrics and providing a better user experience.

This tactic is a huge benefit for returning visitors; for cached resources, it can nearly eliminate load time.
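How you enable caching depends on your server. On nginx, for instance, you might tell browsers to keep static assets for 30 days (the file types and duration are illustrative, not a universal recommendation):

```nginx
# Let browsers cache static assets for 30 days.
location ~* \.(css|js|jpg|jpeg|png|gif|svg|woff2)$ {
    expires 30d;
    add_header Cache-Control "public, max-age=2592000";
}
```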

11. Optimize Your Website’s Fonts

While it may seem inconsequential, the font you choose for your site does affect page load speed. When you use a bespoke font (or web font) for branding purposes, you could be detracting from your website’s performance. Here’s how:

System fonts, such as Arial, Calibri, and Times New Roman do not require any data to be fetched from the server or elsewhere on the internet. They’re already stored in the user’s computer or mobile device.

Web fonts, on the other hand, require the visitor’s browser to wait for the font data to be fetched from the server. And this is the best-case scenario. Sometimes, web fonts must be fetched from another domain entirely, which can result in significant load time increases and a poor user experience.

Choosing a font from Google Fonts allows your website to optimize for speed without sacrificing style.
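If you do use a web font, two common mitigations are preloading the font file and showing a system fallback while it loads. A hedged sketch (the font name and path are illustrative):

```html
<!-- Fetch the font file early, in parallel with the rest of the page. -->
<link rel="preload" href="/fonts/brand-font.woff2" as="font"
      type="font/woff2" crossorigin>

<style>
  @font-face {
    font-family: "BrandFont";
    src: url("/fonts/brand-font.woff2") format("woff2");
    /* Show a system fallback immediately, then swap in the web font. */
    font-display: swap;
  }
</style>
```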

12. Give Your Header a Boost

An attractive, easy-to-navigate site is a must for e-commerce. This is where your header often earns its keep. However, it’s important to keep in mind that your header will load above the fold on every one of your web pages. Luckily there are some tricks to ensuring your header isn’t slowing down your page load speed. 

  • Load your JS scripts last
  • Move your CSS files to your footer and combine them into a single CSS file
  • (Again) delete unnecessary plugins
  • Optimize your fonts (see above)
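“Loading your JS scripts last” usually means deferring them so they don’t block the header from rendering. A minimal sketch (script paths are illustrative):

```html
<!-- defer downloads the script in parallel but runs it only after the
     HTML is parsed, so the header renders without waiting on it. -->
<script src="/js/menu.js" defer></script>

<!-- async suits scripts with no dependencies on the DOM or on other
     scripts, such as analytics snippets. -->
<script src="/js/analytics.js" async></script>
```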

SEO Software for Page Speed

As you likely know, when you optimize for speed, you will need to plan a multi-faceted approach. This can be quite a large investment of time, juggling different platforms such as Google Analytics, GSC, and other webmaster tools.

Our Dashboard allows you to track the overall performance of your website with a single sign-in. Because it’s built over Google’s API, users receive daily updates on their website’s SEO metrics, including page load speed.

Here’s a closer look at the analytics the dashboard provides:

Monitor your website speed, including on mobile devices, on a regular basis with the Site Audit tool. Other features of this tool include:

  • Page Speed: View page load speed data for mobile devices and desktop. See how your site compares to the average user experience and how many of your pages need speed optimization.
  • Redirect Reports: Identify pages with redirects in order to decrease server response time.
  • Overall Index Speed: View your website’s performance through the eyes of Google’s crawlers.
  • HTTP Status Codes Distribution: Identify whether switching servers will improve page speed.
  • Content Duplicates: Streamline site-wide content cleanup with the duplicate identifier.
  • Page Pruning: Reduce unnecessary URLs and strategically keep only those that are Boostable.

Free Page Load Speed Monitoring Options

At a minimum, track your desktop and mobile search ranking results, and check your PSI score if you see your site rankings lower on the SERPs.

You can use these steps to make continuous speed improvements:

  1. Use your initial speed test as your baseline metric and test current speed on mobile and desktop devices.
  2. Check Google PageSpeed Insights suggestions for recommended improvements.
  3. Based on your results and PSI recommendations, decide which tactics to use to improve the speed of your desktop and mobile site.
  4. Retest your page speed after completing each tactic to assess results.

Rinse and repeat as often as necessary to keep improving your page speed.

Improve Your Website’s Performance & Speed for Better Rankings

Whether you own an e-commerce website, make money off of ad revenue, or simply host a forum as a hobby, you want your page load speeds to be swift and seamless. We predict Google will continue to emphasize the importance of page speed as a ranking factor for search queries. So, begin optimizing your site for good page load time. While this requires revising a lot of factors, you can start the process by prioritizing the most pressing tasks or knocking out those that are easy fixes. 

Consider lazy loading images, eliminating plugins that don’t serve your site speed, switching to loading your JS files last, and enabling browser caching. Other quick fixes include switching your WordPress theme, using a minification tool, and investing in your own server or a CDN.

Global websites often have content in multiple languages or that is tailored to specific regional markets. When those web pages are discovered in Google, how does Google know to show searchers the content that was tailored just for them? The answer is hreflang tags.

Adding hreflang tags is a powerful international SEO strategy. It helps crawlers understand your multiple web pages and promote them accordingly. If your website is going international, hreflang tags are also a reliable way to improve conversion rates for the global visitors you earn from organic search.

Here is a complete guide on how to add hreflang tags to your pages. Also, we will detail the many benefits of this powerful technical optimization.

Hreflang Basics

For global brands, creating a multilingual site is a powerful digital strategy. A multilingual site can communicate to users whose IP addresses are not aligned with the regional page that there is a better version of the page for them to view.

What Are Hreflang Tags?

Hreflang tags are snippets of code that communicate to web crawlers web pages that have similar content but are tailored to different languages or regions. In HTML, the hreflang tag looks like this:
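For instance, an hreflang link element for a US English page (using a hypothetical example.com domain) looks something like:

```html
<link rel="alternate" hreflang="en-us" href="https://example.com/en-us/" />
```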

Basically, this code tells search engine crawlers that this is the English version of the page. Therefore, it should be shown to searchers in the United States.

Alternatively, the hreflang tag below tells search engine crawlers that this web page, although also in English, should be shown to searchers located in Hong Kong.
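For a hypothetical example.com domain, such a tag might look like:

```html
<link rel="alternate" hreflang="en-hk" href="https://example.com/en-hk/" />
```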

Hreflang can flag both the same content in different languages and the same language targeted at different countries. This method ensures that someone in the United States will be directed to the English version of a page, while someone in a different region will find the same exact content in their preferred language through an alternative URL.

Hreflang is considered an advanced meta tag, and it requires some more complex work on the backend of your website. However, these tags can be very powerful for any website that wants to expand its global reach or market share.

How does hreflang work?

First, Google will determine if it wants to rank the URL you have presented to it. Next, search bots or crawlers will check for any present hreflang annotations associated with that specific URL.

Then, Google will show searchers the variant of the URL that contains the right content for their language and geographic region. The searchers’ current location and the language settings they have specified on their devices are all taken into account.

These elements are then used to ensure the correct version of your site is presented to the right audience in the right language. If there is no specified language-country data in the searcher’s browser settings, Google will default to the version of the page that is specified as the “x-default.” 

Do I need hreflang tags on my website?

If you already have content in multiple languages on your website, then hreflang tags are essential to your SEO strategy.

If you are thinking about implementing an international SEO strategy, creating that content is the first step. This is far more time- and resource-intensive than the hreflang implementation. But once that content is live on your site, you’ll be ready to move forward with adding hreflang tags.

To run a successful international SEO campaign, you need to ensure that browsers are displaying the right content to the right audience. If not, the amount of users who exit your site after visiting the homepage will increase. 

Searchers in the United States are not very likely to stay and toggle language translation options on your homepage if it is written in another language. This is because there are billions of other sites to choose from that have already catered to them through the use of hreflang tags.

SEO Benefits of Hreflang Tags

Hreflang is an integral part of SEO for a website that hosts content aimed at different countries. As mentioned above, language codes can help search engines distinguish duplicate content that is actually written in a different language or geared towards different regions.

Some of the SEO benefits of hreflang include the following: 

  • Improved rankings for pages that are optimized for specific regions or languages
  • Protects your site from duplicate content issues
  • Decreased bounce rate (i.e. searchers are more likely to find your page relevant if it is tailored to them)
  • Better user experience for your global audience
  • Improved conversion rates and lead generation

How is Hreflang Different from Rel Canonical?

If you are familiar with rel canonical tags, hreflang may sound similar.

Rel canonicals tell Google to index a specific page and ignore others that are similar. They are particularly useful for websites that have multiple pages with similar content (like ecommerce sites with multiple product pages).

Rel canonical tags ensure that the master page is always promoted in search results, regardless of the searcher.

So how is hreflang different? 

Well, hreflang tells Google to index the page and all of the other similar pages with language variations. It then asks Google to promote those pages based on key information about the user including language preference and geographic location.

Your language variation pages each need to have their own self-referencing canonical. If any of the canonical tags on your foreign or regional pages point to the “x-default” or some other URL, those tags will actually work against each other. Working with skilled web developers who are familiar with technical SEO can prevent you from using these advanced tags incorrectly.

How to Implement Hreflang Tags on Your Website

Once you have your different web page variations created and published on your website, you can use hreflang to make sure Google promotes the right versions of the page to the right international searchers.

There are two ways to add hreflang tags. You can add tags to each individual variation of the page, or you can add hreflang tags via your sitemap.

Adding Hreflang to individual pages

The first step is to write your hreflang tags in a text editor. And make sure you do so properly! Each hreflang tag needs to have a specific HTML language and country code, so make sure to confirm the accuracy of your language and regional attributes.

You can also use a hreflang tag generator tool. Here’s one from Geotargetly you can try.   

Then, follow these steps.

  1. Specify which version of the page will be the default. Google crawlers will show the default page when they can’t determine language or region.
  2. Add all of your hreflang links to the <head> section of your default URL. You need to include return links for all of the language variations of the page.
  3. Add the same hreflang link elements to the <head> section of each page that you consider a language or regional variation of the default page. These are called reciprocal hreflangs.
  4. Double-check that any rel canonicals on those web pages are self-referencing so Google doesn’t ignore your hreflang tags.
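Putting these steps together, the <head> of the default page, and of each variation, might contain a set of link elements like this (the URLs and language variations are hypothetical):

```html
<head>
  <!-- The default for searchers with no matching language/region. -->
  <link rel="alternate" hreflang="x-default" href="https://example.com/" />
  <!-- Reciprocal hreflangs: every variation lists this same set. -->
  <link rel="alternate" hreflang="en-us" href="https://example.com/en-us/" />
  <link rel="alternate" hreflang="en-hk" href="https://example.com/en-hk/" />
  <link rel="alternate" hreflang="es" href="https://example.com/es/" />
</head>
```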

Adding hreflang tags to your sitemap

If you only have a few language variations, you can add your hreflang tags to your sitemap. Warning: If you have hundreds of page variations, this can take quite a bit of time. 

However, adding hreflangs to your sitemap reduces page weight by keeping all that additional markup out of your header. A step-by-step for adding hreflang to your sitemaps is as follows:

  1. Designate the language variations below the default URL in your sitemap using the xhtml:link attribute.

  2. All of the alternative urls that you specify with the xhtml:link attribute also need to be added to your sitemap.
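
Putting those two steps together, a sitemap with hreflang annotations (hypothetical URLs) would look something like the sketch below. Note that every variation gets its own <url> entry carrying the same set of xhtml:link elements:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://example.com/</loc>
    <xhtml:link rel="alternate" hreflang="x-default" href="https://example.com/"/>
    <xhtml:link rel="alternate" hreflang="en-us" href="https://example.com/us/"/>
    <xhtml:link rel="alternate" hreflang="de" href="https://example.com/de/"/>
  </url>
  <url>
    <loc>https://example.com/de/</loc>
    <xhtml:link rel="alternate" hreflang="x-default" href="https://example.com/"/>
    <xhtml:link rel="alternate" hreflang="en-us" href="https://example.com/us/"/>
    <xhtml:link rel="alternate" hreflang="de" href="https://example.com/de/"/>
  </url>
</urlset>
```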

Common Mistakes With Hreflang Implementation

Because hreflangs require quite a bit of HTML work, it’s easy to make mistakes. Beginners to the process should make sure they do their homework before adding metadata for SEO purposes. 

Here are some of the more common mistakes that webmasters make in their implementations. 

  1. Invalid attributes. Each hreflang value must use a two-letter language code, optionally combined with a two-letter region code. There are many country and language codes, so make sure you have the right hreflang codes in your tags.
  2. Improper use of rel canonicals: Many site owners use rel canonical tags incorrectly on pages where they have hreflangs. Remember that a rel canonical pointing away from a variation tells Google to ignore that page, and as a result, Google doesn’t know what to do with your hreflang annotations. When you use rel canonical incorrectly, Google will just choose whatever variation it thinks is best to show searchers.
  3. Not using reciprocal hreflangs (or return links): Each language or regional URL needs to have return links back to every other language or regional URL. Any site owner who tries to get around this will ultimately implement hreflang incorrectly and not see the desired results. 
  4. Slowing down server requests: This post only breaks down adding hreflang via header tags and sitemaps, but you can also add hreflang via http headers for non-HTML content. This can end up adding a lot of overhead to every request, slowing down the web page load times.
  5. Not consulting professionals: The best practice for implementing hreflang into your website is to consult with professionals and use expert tools for site optimization.

The Value of Hreflang and International SEO

International SEO can take a lot of time and consume a lot of resources. Nevertheless, the payoff is well worth it.

The primary goal behind optimization as a whole is to provide internet users with the best information possible. Providing tailored content for language preferences and regional areas shows you’re a webmaster that thinks about the user’s experience.

Google’s primary goal is also making search better for users. If you implement hreflang correctly, Google will ultimately reward you by promoting your pages more often to the right audience.

Just as traditional architecture determines how people will use a building or another structure, information architecture (IA) guides users in how they use information systems. And while there are many information systems out there, the most commonly used are websites. 

Unlike the architecture of bridges and buildings, though, information architecture has more moving parts, a more abstract form of ‘building materials,’ and has only been around for a few decades. Additionally, information systems like websites are more malleable and can be adjusted and improved over time.

If you can master the principles of information architecture, you can build a website that will stand the test of time. Whether you’re in the process of creating your website or want to revamp your user experience and content, this article will provide you insight into how you can transform your website into a shining example of well-designed information architecture.

What Is Information Architecture in Relation to a Website?

Information architecture refers to the process your users go through to gather information about your products or services through a website or other digital platform like an app. Information architecture provides people with a systematic way to navigate from point A to point B in order to achieve an action or gain knowledge. In other words, better information architecture promotes easier accessibility of information through intuitive navigation design.

The best information architecture not only streamlines the user’s journey and goals, but it fulfills specific user needs by organizing a vast amount of information into little, easily digestible categories.

From Where Does Information Architecture Originate?

Many of the methodologies, techniques, and principles used to understand and improve information architecture design come from Peter Morville. Morville is the founding architect of this branch of user experience (UX) and content inventory systems. While he was the first, there are now many experts in this discipline who develop IA best practices through the Information Architecture Institute and user research.

What Elements Does Information Architecture Include?

Before we dive into how to improve your information architecture, it’s important to have a good sense of what is included in this field of study in relation to your website. While information architecture can apply to library science, spreadsheet science, and even physical structures, we will be focusing on IA in relation to websites.

So where can you find examples of information architecture on a website?

All it takes is for a website to load in order to be flooded with examples of information architecture. Information architecture is the strategic organization and presentation of your website’s content. In fact, nearly every aspect of a website and web design is part of IA. Of course, there is good information architecture and subpar IA, but all of the following are important parts of an IA system that go into your site:

  • UX design/UI design
  • Written content or web copy
  • Graphic design and design patterns
  • Images
  • Buttons
  • Links
  • Layout features
  • Website nomenclature
  • Metadata tags
  • Accessibility features

Good IA comes into play in all of the above. And these elements are often categorized into UX design, content creation strategy, and homepage layout (UI design).

How Do Information Architecture and SEO Work Together?

Search engine optimization (SEO) and information architecture both benefit website owners and web users by improving the internet experience. SEO and IA make quality content easier to find, understand, and navigate. SEO and IA differ in where they fit into the website creation process.

Good IA Supports SEO

SEO has the goal of increasing a website’s visibility through the science of configuring content, front-end web development, and back-end web development in response to search engine algorithms. The result is a website that search engines can find and display as search results to web users’ inquiries. This is an ongoing process. SEO requires a proactive and reactive approach since algorithms often change. Additionally, search engines assign value to websites that regularly update their content.

SEO specialists regularly improve a website’s

  • Written content
  • Loading speed & responsiveness
  • Organization
  • Visual design
  • Graphics and photos

Information architecture often works best when established before active web design begins. IA establishes a framework that supports the efforts of SEO specialists for the lifetime of a website. With a well-strategized IA, a website will have a strong foundation of logical organization. This makes a website more enjoyable from the user’s perspective since they can find what they need easily. In turn, this improves the website’s reputation. A better reputation increases the website’s authority and pushes it higher on search engine results pages, so more people can find it.

Good information architecture only has to be designed once.

Like most systems, the best IA only has to be designed once. If an IA system is effective, it will allow a website to scale and respond to changes needed for the most current SEO strategies. As more blogs, products, or landing pages are added to a website for SEO, good IA already has a designated location and system to handle them.

Why is Information Architecture Important in UX?

As your local librarians will tell you, providing easy access to information is priceless. Information is both empowering and vital for the best individual experience and a better society. However, when it comes to your UX, IA has a more specific importance. It increases your brand’s value to potential clients while bolstering your sales.

Good IA structure based on set principles has the power to help people find what they are looking for within seconds. One of the simplest examples of this is concise and accurate folder labels in your Google Drive. This naming or navigation system allows you to access the files and information you’re looking for quickly and effortlessly, leading to less frustration and wasted time.

While more complicated, Google Maps also uses IA to help people find what they’re looking for in the physical world. For instance, if you type “food near me,” your search results will be full of nearby restaurants. This demonstration of IA is a perfect example of helping a user find what they are looking for, since the user is likely looking for businesses that provide food. 

How to Improve Your Information Architecture

Improving your information architecture can turn your website from an ordinary e-commerce page into a resource visitors enjoy using. These tips can guide you through how to improve your IA and help you prioritize which tasks to begin with.

1. Utilize wireframes in the prototype stages of your sitemap and IA design development.

Wireframes serve a multitude of purposes when it comes to developing strong IA and a sitemap. They work superbly as information architecture diagrams that can be moved around and changed before your design is finalized. 

At their very core, wireframes connect your IA to its UX design. In striking similarity to an architectural blueprint, a wireframe functions as a skeletal outline of a site or mobile app. However, this method of UX development is not limited to visual design, unlike a mockup. To accurately determine the logic of your site’s flow and the intended customer journey, this is a necessary step in your IA project timeline. Your site’s intended functions can best be evaluated through wireframing.

Through wireframing, you will have a solid idea of your visual hierarchy when you are ready to move your site to the content strategy phase. Common elements of a wireframe include:

  • Search fields
  • Breadcrumbs
  • Navigation systems
  • Headers and footers

Ideally, you would use wireframes during your initial UX/UI design process. However, you can still utilize them on an existing website.

Identify Paths with Wireframes

Aside from assessing functionality, wireframing is a particularly useful method of identifying paths between web pages. This critical phase of the IA process will allow you to visualize how much space should be allocated for specific content.

When Prototyping Your Visual Hierarchy, Start with a Sketch

Low-fidelity wireframe versions of a website are quick to develop and more abstract because their main focus is on the visual hierarchy of your site. These bare-bones prototypes often implement mock content (like Latin text) as filler for spatial visualization. However, they provide you with a guideline for content volume when the time comes.

Linking concepts to tangible images and links can be a complicated process, even for the seasoned designer. If you have trouble getting your ideas to match your result, consider implementing a mind mapping software like XMind. XMind is a productivity tool used professionally to solidify brainstorming.

Move from Broad to Detailed Wireframes

Conversely, high-fidelity wireframes are more detailed versions that include metadata about a particular page element, like its behavior or dimensions. These detailed versions are excellent blueprints for previewing your interaction design.

2. Keep your brand personas in mind throughout the UX design and content strategy process.

Unity and consistency across your brand are integral parts of a solid information architectural system. 

Your site is a reflection of your brand, from the elements of your visual design down to each blog post and product page. Accordingly, you should be keeping your brand personas in mind each time you implement a UX feature or post a new content piece. This ensures fidelity between your company and your target audience. Use your personas as a guide to help you, your design team, and your content strategist collaborate on your ideal user perception. 

Define and Implement Your Goal User Perception

Your goal user perception is the way you would like customers or potential customers to view your brand. Before making one of the many decisions IA requires, run your ideas through this line of questioning:

  • Does this align with the image I want to create for my brand?
  • Will this decision affect consistency across my site or organization?
  • Am I appropriately conveying the good qualities of my business?
  • Does this get us closer to our main goal?
  • How does this project fit into the future of our company?

Any content or design elements that do not hold up to this line of questioning can be eliminated. Not only can this process help you avoid inconsistencies, but it reduces the possibility of having too much content on your site. This benefits your web admins, especially those who keep up with content creation for SEO purposes.

3. Your visual hierarchy determines readability, so prioritize your content accordingly.

Visual hierarchy is a principle of laying out and sizing visual elements to denote their importance to the viewer. For example, alignment, texture, whitespace, and contrast are a few of the visual design concepts that can help draw users’ attention to the right content. An effective user interface design does more than simply provide information. A quality hierarchy can persuade and impress users.

There are a few aspects of visual hierarchy that are highly beneficial to apply when creating UX design based on cognitive psychology. 

Visual Hierarchy Principles to Keep in Mind:

1: Larger images are perceived as more important

2: Bright colors garner more attention

3: Elements that are aligned are more pleasing to the eye

4: Higher contrast demands more attention

5: Repetition tells the viewer that elements are related

6: Proximity (or closeness) denotes interconnectedness in topic

7: More white space around an element draws more attention to it

Visual unity isn’t just essential to your brand image, it is also a critical part of your UX design. Familiar colors, menu hierarchies, and diagrams promote consistency and fluid usability. Even small distractions like slow-loading graphics or unaligned text columns can interrupt the user experience.

There are several useful IA software tools that can assist you in your UI development process, like OmniGraffle. OmniGraffle is used to create visuals and graphics for use in prototypes and mockups. As mentioned above, high-fidelity site frameworks utilize these types of visuals and graphics to help designers strategize where to put information and why it belongs there.

Visual Tidiness Affects More Than Just Usability

If you have ever been to a site that was unattractive, cluttered, or disorganized, you likely formulated a negative opinion of that business or organization. Perhaps you even deemed the information to be less reputable due to the nature or design of the site. This is why it’s important to stick to simple, user-friendly design. Together, a pleasant UX and UI can boost user confidence and solidify your site’s credibility. 

In addition to building trust among your users, a quality UX also lets Google and other search engines know that your site is worthy of ranking.

4. Structure and categorization are fundamental.

One mistake many people make is putting all of their content in one place. Overstuffing information into a single URL causes your UI to suffer, since there is no hierarchy or sense of organization, and users take much longer to sort through the content to find a specific piece of information.

Users should be able to locate all desired information on your website quickly and easily. This requires a well-planned site map.

Category is… A Better User Experience

To create a better structure, you must first go through the process of categorization. Categorization is the process of organizing your content into a taxonomy system. Categorization is an integral part of navigation design because it has the ability to guide the user to the right content. 

Start by Finding Commonalities

Begin by grouping your content by similarities in content type. For example, you can group sources by format type (eBooks, blog posts, case studies, and videos).

The most common similarities should land higher on your sitemap since they’re usually the starting place for narrowing down the user flow for optimal navigation.

For example, if your website centers on pet care, you likely will want to first group your products or articles by pet species. From there, you may want to divide the information or products into what aspect of care they provide. As you can see, this would make navigation easier for cat owners looking for a technique or clippers to trim their cat’s nails.

Using tools like our dashboard can make long-term organization easier by allowing you to group pages into categories. This allows you to see the category performance, so you can target where you can eliminate or improve content.

Eliminate Unnecessary Content and Categories

While generating new content is extremely important, making sure you have room for this content on your site is also essential. It can be tempting to hold onto content that you have created, but it is best to let it go to make room for site updates.

Omitting unnecessary or irrelevant data can also enhance the user experience. So, don’t be shy to perform a content audit and delete pages that receive little-to-no traffic. A potential customer looking for a specific piece of information may become frustrated or lose interest in your digital product if it is too difficult to find.

5. Your homepage shouldn’t be the only local navigation point.

While the homepage may be the ideal destination page, users find nearly endless ways to land on a website. For this reason, the digital design of each page on your website should share the same functions as your homepage.

Your website will likely be backlinked on other websites to enhance reputability and SERP ranking once your business begins implementing a content strategy. Since backlinks incorporate relevant keywords that may bring visitors directly to content such as blogs or guides, you should ensure that every entry point of your website is as user-friendly and visually appealing as the homepage. That way, you make a good first impression and move visitors beyond the landing page.


For example, if a user enters your site through the contact page URL, it should be easy for them to find navigation elements that will take them to the homepage or digital product browsing section.

Provide tools to make finding resources easier

An efficient search system is the backbone of a great user interaction design. It allows any of your site visitors to find what they’re seeking in seconds rather than minutes.

Provide FAQs with links to more specific information. This gives users the choice of how much information they need and an easy way to access it.

Keep a navigation menu at the top of all your subpages. Subpages need to provide access points for other activities you offer. Otherwise, your users may never travel from a subpage to your sales funnel (or another offering on your main page).

6. Go through the customer journey, then map out a blueprint for improvements.


The best usability testing you can perform is going through the actions of a potential customer. You can do this yourself by going through your website manually. Mind maps can also make the task of mapping the customer journey easy.

For best results, anticipate how a user will engage with your interaction design. Once you have a clear blueprint of your users’ needs, you can create an information hierarchy and a sitemap. Your sitemap allows Google’s bots to crawl your URLs to identify information used for SERPs.

Keep Speed In Mind

In general, the online community values convenience and speed above all. A recent UX study demonstrated that 53% of visits are abandoned if a mobile app or site takes longer than three seconds to load. This means your web design has about three seconds to sort and present the piece of information or digital product the user is looking for.

This is to say that load time, page speed, and click response are essential parts of your information architecture and it is important to keep up with their performance. 

Artificial Intelligence and the Customer Journey

The behavior of an internet user is relatively predictable, and artificial intelligence technology can now mimic user activity for rapid results from AI user testing and other usability testing efforts. In conjunction with heatmaps, this lets you pinpoint where users tend to get hung up and where decision points turn into exit points.

Perform Regular Performance Audits and Fixes

The dashboard can make tracking and monitoring performance simple once your site is live. This can help you improve the customer journey by identifying navigation issues such as broken links.

Identify What Pages Visitors Use Most with GSC Insights

Finding which pages on your site visitors utilize most can help you prioritize their functionality when auditing your site’s performance. This also gives you insight into which categories of content your target audience is most interested in.

7. Make sure the information part of your information architecture is high quality.

Findability, usability, and graphic design are all essential elements of good IA. However, the content you are managing needs to be as relevant as it is organized. In the same way an information architect is well-versed in the science of organization, content strategists and content creators are experts in SEO and in improving content structure.

Reader engagement is a must when it comes to metrics like engagement time and scroll distance. The easiest way to improve your content and encourage deeper navigation is with clear headings that act as a road map to your content. The first thing many visitors will do is preview your headings and images for relevance to their search terms.

The quality of your metadata and headings will also drive more visitors to your site and reduce your bounce rate.

Structure Your Content for Usability and SEO

The Core Web Vitals update made the structure of content an even higher priority. This change takes into account how long it takes for users to access the most important aspects of your website. As a result, most IA designs now locate data-heavy elements below the page fold. And if these elements are vital assets to your brand, you need to give visitors a reason to scroll far enough to move beyond the fold. This is where the quality of your content comes in.

Information Architecture: The Science of Organizing the Customer Journey

The impact of well-strategized information architecture continues to become more and more profound. With information architects, UX experts, and content auditors, websites are better able to provide every user with easier access to their desired outcomes. Through the science of user behavior, cognitive psychology-based UI design, and strict hierarchy patterns, IA is improving the internet for all users.

Although a more technical part of SEO, robots.txt files are a great way to improve the crawling and indexing of your website.

This article will break down all of the details related to robots.txt and highlight common issues related to its implementation. 

If you run an SEO audit in the site auditor, you may see issues flagged related to your robots.txt file. You can use this article to troubleshoot and resolve those issues.

What are robots.txt Files?

The robots.txt file tells web crawlers which areas of your website they are allowed to access and which areas they are not allowed to access. It contains a list of user-agent strings (the name of the bot), the robots directive, and the paths (URLs) to which the robot is denied access.

When you create a website, you may want to restrict search engine crawlers’ access to certain areas or pages of your site and prevent specific pages from being indexed. Your robots.txt file is where web crawlers learn where they do and do not have access.

Robots.txt is a good way to protect your site’s privacy or to prevent search engines from indexing content that is not rank-worthy or ready for public consumption.

Also, if you don’t want your website to be accessed by other common web crawlers like Applebot, AhrefsBot, or others, you can prevent them from crawling your pages via your robots.txt file.
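
For illustration, a minimal robots.txt that shuts out one named crawler while leaving the site open to everyone else could look like the sketch below (the name must match the crawler’s user-agent token):

```text
User-agent: AhrefsBot
Disallow: /

User-agent: *
Disallow:
```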

Where is my robots.txt File Located?

The robots.txt file is a plain text file placed in the root directory of a website. If you don’t yet have a robots.txt file, you will need to create one and upload it to your site.

If the web crawler cannot find the robots.txt file at the root directory of your website, it will assume there is no file and proceed with crawling all of your web pages that are accessible via links. 

How you upload the file will depend on your website and server architecture. You may need to get in contact with your hosting provider to do so.

Why Should I Care About robots.txt?

The robots.txt file is considered a fundamental part of technical SEO best practice.

Why? Because search engines discover and understand our websites entirely through their crawlers. The robots.txt file is the best way to communicate to those crawlers directly.

Some of the primary benefits of robots.txt are the following: 

  • Improves crawling efficiency
  • Prevents less valuable pages from getting indexed (e.g., thank-you pages, confirmation pages)
  • Prevents duplicate content and any resulting penalties
  • Keeps low-value content away from searchers

How Does robots.txt Work?

When a search engine robot encounters a robots.txt file, it will read the file and obey the instructions. 

For example, if Googlebot comes across the following in a robots.txt:

User-agent: googlebot

Disallow: /confirmation-page/

 

Googlebot won’t be able to access that page to crawl and index it. It also won’t be able to access any of the other pages in that subdirectory, including:

  • /confirmation-page/meeting/
  • /confirmation-page/order/
  • /confirmation-page/demo/

If a URL is not specified in the robots.txt file, then the robot is free to crawl the page as it normally would.
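
You can sanity-check this matching logic with Python’s standard-library robots.txt parser. The sketch below feeds it the example directives above; the URLs are hypothetical:

```python
from urllib import robotparser

# The same rules as the robots.txt example above
rules = """\
User-agent: googlebot
Disallow: /confirmation-page/
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# The disallowed subdirectory, and everything under it, is blocked for googlebot
print(parser.can_fetch("googlebot", "https://example.com/confirmation-page/order/"))  # False

# URLs not mentioned in the file remain crawlable
print(parser.can_fetch("googlebot", "https://example.com/pricing/"))  # True
```

Polite crawlers apply exactly this check before fetching each URL; anything not matched by a Disallow rule is crawled as normal.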

Best Practices for robots.txt

Here are the most important things to keep in mind when implementing robots.txt:

  1. robots.txt is a text file, so it must be encoded in UTF-8 format
  2. robots.txt is case sensitive, and the file must be named “robots.txt”
  3. The robots.txt file must be placed at the root directory of your website
  4. It’s best practice to only have one robots.txt available on your (sub)domain
  5. You can only have one group of directives per user agent
  6. Be as specific as possible so as to avoid accidentally blocking access to entire areas of your website, for example, by blocking an entire subdirectory when you only meant to block a specific page located within it
  7. Don’t use the noindex directive in your robots.txt
  8. robots.txt is publicly available, so make sure your file doesn’t reveal to curious or malicious users the parts of your website that are confidential
  9. robots.txt is not a substitute for properly configuring robots tags on each individual web page

Common Issues Related to robots.txt

When it comes to website crawlers, there are some common issues that arise when a site’s robots.txt file is not configured properly.

Here are some of the most common ones that occur and will be flagged by your site audit report if they are present on your website.

1. robots.txt not present

This issue will be flagged if you do not have a robots.txt file or if it is not located in the correct place.

To resolve this issue, you will simply need to create a robots.txt and then add it to the root directory of your website.

2. robots.txt is present on a non-canonical domain variant

To follow robots.txt best practice, you should only have one robots.txt file for the (sub)domain where the file is hosted.

If you have a robots.txt located on a (sub)domain that is not the canonical variant, it will be flagged in the site auditor.

Non-canonical domain variants are those pages that are considered duplicate pages, or copies of master pages on your website. If your canonical tags are properly formatted, only the master version of the page will be considered the canonical domain, and that is the version of the page where your file should be located.

For example, let’s say your canonical variant is

  • https://www.website.com/

Your robots file should be located at:

  • https://www.website.com/robots.txt

In contrast, it should not be located at:

  • https://website.com/robots.txt
  • http://website.com/robots.txt

To resolve this issue, you will want to update the location of your robots.txt. Or, you’ll need to 301 redirect the other non-canonical variants of the robots.txt to the actual canonical version.

3. Invalid directives or syntax included in robots.txt

Including invalid robots directives or syntax can cause crawlers to still access the pages you don’t want them to access. 

If the site auditor identifies invalid directives in your robots.txt, it will show you a list of the specific directives that contain the errors.

Resolving this issue involves editing your robots.txt to include the proper directives and the proper formatting.

4. robots.txt should reference an accessible sitemap

It is considered best practice to reference your XML sitemap at the bottom of your robots.txt file. This helps search engine bots easily locate your sitemap. 

If your XML sitemap is not referenced in your robots file, it will be flagged in the Site Auditor.

To resolve the issue, add a reference to your sitemap at the bottom of your txt file. 
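
A resolved file ends with a single absolute sitemap URL on its own line, as in this sketch (the domain is hypothetical):

```text
User-agent: *
Disallow: /confirmation-page/

Sitemap: https://www.website.com/sitemap.xml
```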

5. robots.txt should not include a crawl directive

The crawl-delay directive instructs some search engines to slow down their crawling, which causes new content and content updates to be picked up later. 

This is undesired, as you want search engines to pick up on changes to your website as quickly as possible.

For this reason, the site auditor will flag a robots.txt file that includes a crawl-delay directive.

Conclusion

A properly configured robots.txt file can be very impactful for your SEO. However, the opposite is also true. Do robots.txt incorrectly, and you can create huge problems for your SEO performance.

So if you’re unsure, it may be best to work with professionals to properly configure your robots.txt. Connect with one of our SEO professionals to learn more about our technical SEO services.

As SEO is becoming an increasingly important factor for success, developers need to understand the common issues that can arise when coding with JavaScript (JS). Unfortunately, many developers struggle to ensure that their JavaScript-based sites are properly optimized for search engine visibility. Common mistakes can range from missing meta tags to slow page loading speeds, and these issues can make all the difference in how well a website is ranked on search engine results pages (SERPs). Keep reading to learn more about JavaScript SEO as well as how to address any issues.

What is JavaScript SEO?

JavaScript SEO is a type of technical SEO that’s focused on JavaScript optimization. JS is a popular programming language that allows developers to create interactive websites, applications, and mobile experiences. 

While JavaScript is a powerful tool for creating great user experiences, it can also cause issues for search engines when implemented incorrectly. JavaScript-heavy websites can also strain page load and performance, which reduces their functionality and negatively affects the user experience.

How Does JavaScript Impact SEO?

JavaScript directly impacts technical SEO because it affects a website’s functionality. Depending on how it is implemented, it can either slow down rendering or enhance site speed. Incorrect implementation of JavaScript content can be detrimental to your website’s visibility. 

Here are some of the main on-page elements that affect search engine optimization:

  • Page load times
  • Metadata
  • Links
  • Rendered content
  • Lazy-loading images

To rank higher on SERPs, JavaScript content must be optimized for crawling, rendering, and indexing. For Google and other search engines to fully index a website, they need to be able to access and crawl its content. 

JavaScript and SEO

JavaScript, however, can present issues for crawlers. Some of the primary issues include:

  • JavaScript makes it harder for crawlers to render and understand content: Because JavaScript is a dynamic language that requires extra resources to interpret and execute, search engine crawlers can sometimes fail to properly understand or access the content on a page. As a result, they are unable to index it.
  • Too much JavaScript impacts load times: A web page that contains too much JavaScript or very large JS files can take longer to load. In addition to lower rankings, slow loading times can even lead to an increase in bounce rate, as users will be more likely to leave a website if it takes too long to load.
  • JavaScript can block content from search engine crawlers: The code can be used to hide or limit the content that’s visible to search engines, which can prevent important pages from being indexed and ranked. This is known as cloaking and can lead to severe penalties from search engines. It’s vital that you don’t block access to resources that Googlebot needs to render pages correctly.

Overall, JavaScript SEO requires troubleshooting and diagnosing any ranking issues as well as ensuring that web pages are discoverable through high-quality internal links. This type of technical SEO involves streamlining the user experience on a webpage and improving page load times since both factors directly affect SERPs.

How Do I Know If My Website Uses JavaScript?

To determine if your website is using JavaScript, you can use a few different methods. 

The most accurate way is to open the developer tools and view the source code of the website. To do this, you can simply right-click on any part of the web page and select “view source” or “view page source.” This will open a new window with the source code of the website. Then, press Ctrl + F and search for “javascript”, or look for any lines of code or code snippets that mention JavaScript.
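The same check can be done programmatically. The sketch below uses Python’s standard html.parser to count script tags in a page’s source and list any external script files; the sample markup is made up for illustration:

```python
from html.parser import HTMLParser

class ScriptDetector(HTMLParser):
    """Counts <script> tags and records external script sources."""
    def __init__(self):
        super().__init__()
        self.script_count = 0
        self.external_sources = []

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            self.script_count += 1
            src = dict(attrs).get("src")
            if src:  # external script file
                self.external_sources.append(src)

# Sample HTML with one external and one inline script
sample_html = """
<html><head>
<script src="/static/app.js"></script>
</head><body>
<script>console.log("inline");</script>
</body></html>
"""

detector = ScriptDetector()
detector.feed(sample_html)
print(detector.script_count)      # 2
print(detector.external_sources)  # ['/static/app.js']
```

In practice you would feed this parser the HTML fetched from your own pages; any nonzero count means the site uses JavaScript.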

Another way to determine if a website uses JavaScript is to inspect the website’s elements. If the interface is interactive and responds to user input, this is a strong indication that the website is using JavaScript. Here are some key elements that you can look for:

  • Drop-down menus
  • Fly-out menus
  • Dynamic content
  • Pop-ups
  • Interactive elements

If you see these types of features on your website, then it’s likely that JavaScript is being used. Lastly, if a website is using a content management system (CMS) such as WordPress or Joomla, JavaScript is likely being used.

How Does Google Handle JavaScript?

Google handles JavaScript by processing the JavaScript code and rendering the content that’s visible to the user. Google’s crawler can access the page’s Document Object Model (DOM) tree and process the code to determine what content is visible.

Here are the three main steps on how Google handles a webpage and how it processes JS:

  1. Crawling: First, the Googlebot crawls the URLs for every web page. It makes a request to the server, and the server sends the HTML document.
  2. Rendering: Googlebot then decides what is necessary to render the main content.
  3. Indexing: After it has identified what is necessary to render the content, Googlebot can then index the HTML.

But how does Google execute this process? For starters, any unexecuted resources have to be processed by Google’s Web Rendering Service (WRS). Googlebot is likely to defer rendering any JavaScript until resources become available, and Google will only index the rendered HTML once the JavaScript has been executed.

Overall, Google has been able to successfully crawl and index JavaScript for many years, including over 130 trillion web pages. However, there are still common Javascript issues that can arise. 

Content that’s entirely dependent on JS may experience a delay in crawling since Googlebot has a crawl budget. This crawl budget is the rate limit that affects how often the bot can crawl a new page. Another hurdle for JavaScript-heavy pages involves the WRS: there is no guarantee that Google will actually execute the JS code that’s in the Web Rendering Service queue. That’s why it’s important to follow best practices when it comes to JavaScript SEO.

Why Does the Site Auditor Check for JavaScript?

If you are running a report in the site auditor, you may have pages in your report that are flagged due to JavaScript.

Because page performance is so important to ranking, the site auditor will flag pages with extra-large JavaScript files that are slowing down load times and responsiveness.

Common JavaScript SEO Issues

Poorly written or implemented JavaScript can interfere with a search engine’s ability to crawl and index a website, resulting in pages not appearing in search results as expected. This can lead to a decline in organic traffic, making it harder for businesses to reach their target audience. 

Some common JavaScript SEO issues include the following:

  • Indexing problems: These can occur if JavaScript is not properly implemented. Search engine crawlers need to be able to access the source code of a website to determine its content and relevance. If JavaScript is not properly configured, crawlers may not be able to access the content, and the website may not be indexed. 
  • Content duplication: This can occur when the same content is being rendered by both the server-side and client-side code, causing duplicate versions to be indexed by search engines and diluting how each version ranks. It’s critical to ensure that the content is unique and that there is no duplication.
  • Slow loading speeds: JavaScript code can be bulky and can slow down the loading speed of a website. Search engines consider loading speed as a factor in their ranking algorithm, so websites with slow loading speeds may not rank as well as those with fast loading speeds.
  • Crawlability: Search engine crawlers need to be able to access the source code of a website to index it. If the code is written in such a way that crawlers cannot access it, then the website may not be indexed. This can result in poor rankings and can prevent the website from appearing in organic search results.

How to Fix Common JavaScript Issues

To optimize your JavaScript files for SEO, you can fix the following issues that are common with JS:

Indexing

When it comes to JavaScript SEO, one of the most important aspects to consider is the structure of your source code. 

If you’re using JavaScript, it’s paramount to ensure the code is well-structured and organized. This means:

  • Formatting the code properly
  • Removing unnecessary characters
  • Linking external scripts properly
  • Minimizing the amount of JS used
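To illustrate the idea of removing unnecessary characters, here is a deliberately naive minification sketch in Python. A real build pipeline would use a dedicated JavaScript minifier; this only shows the concept of stripping comments and collapsing whitespace, and the sample script is made up:

```python
import re

def naive_minify(js: str) -> str:
    """Very naive minifier sketch: strips // comments, collapses whitespace.
    Not safe for production JS (it ignores strings and /* */ comments)."""
    no_comments = re.sub(r"//[^\n]*", "", js)       # drop line comments
    return re.sub(r"\s+", " ", no_comments).strip() # collapse whitespace

source = """
// toggle a menu on click
function toggleMenu() {
    document.querySelector('#menu').classList.toggle('open');  // flip class
}
"""

print(naive_minify(source))
```

The output is the same function on a single line with the comments removed, which is why minified files download and parse faster.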

Content Duplication

To prevent content duplication, webmasters should ensure that each page is served with a unique URL and that dynamic loading is used sparingly. 

Sometimes, content duplication can also be caused by third-party services. When a website uses external scripts, such as social media widgets, those scripts can cause the same content to be loaded multiple times. 

To prevent this JavaScript SEO issue, webmasters should ensure that external services are loaded asynchronously and that the content is not being reused across multiple pages.
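Loading an external script asynchronously is a one-attribute change. The standard `async` attribute lets the script download in parallel without blocking rendering (the domain below is a placeholder):

```html
<!-- async: downloads in parallel and executes when ready,
     without blocking the rest of the page from rendering -->
<script async src="https://widgets.example.com/share-buttons.js"></script>
```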

Slow Loading Speeds

There are a few common ways to address slow speeds. They include:

  • Use the latest version of the language (as well as any additional libraries that may be needed)
  • Use minification techniques to keep the JavaScript code as small as possible
  • Organize the JavaScript code properly
  • Separate code into small, manageable chunks, and use appropriate naming conventions
  • Use variable names that are relevant to the code they’re used in, which reduces clutter and allows for easier navigation
  • Ensure any extra resources loaded with the JavaScript are properly cached. Caching reduces the number of requests that need to be made to the server and the amount of data that needs to be loaded overall.
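As a sketch of the caching point above, a server can decide its Cache-Control header per file type. The helper below is hypothetical and the max-age values are illustrative assumptions, not recommendations:

```python
def cache_header_for(path: str) -> str:
    """Suggests a Cache-Control header value based on file type.
    Illustrative only: long-lived caching assumes fingerprinted filenames."""
    if path.endswith((".js", ".css")):
        # Static assets: cache aggressively so repeat visits skip the download
        return "public, max-age=31536000, immutable"
    if path.endswith((".png", ".jpg", ".webp")):
        return "public, max-age=86400"  # images: cache for a day
    return "no-cache"  # HTML: always revalidate so crawlers see fresh content

print(cache_header_for("/static/app.min.js"))
print(cache_header_for("/index.html"))
```

The design choice here is that JavaScript files change only when redeployed (so they can be cached for a long time), while HTML must stay fresh for both users and crawlers.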

Crawlability

To improve crawlability, it’s best to use progressive enhancement when developing a website. This ensures that all of the content is accessible to search engine crawlers without relying on JavaScript. 

Secondly, it’s vital to ensure all JavaScript is minified and compressed. This can help reduce the amount of time that it takes for the crawlers to read and index the content. It’s also important to use a content delivery network (CDN) to ensure all of the content is served quickly and efficiently to search engine crawlers. These steps can help improve crawlability and ensure that search engine results are accurate and up to date.
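As a sketch of progressive enhancement, the markup below keeps the content in the initial HTML so crawlers can read it even if the script never runs. Note that `enableLiveFilter` is a hypothetical helper used for illustration, not a real library function:

```html
<!-- The list is fully usable and crawlable without JavaScript -->
<ul id="product-list">
  <li><a href="/products/widget">Widget</a></li>
  <li><a href="/products/gadget">Gadget</a></li>
</ul>
<script>
  // Enhancement only: adds client-side filtering on top of working markup
  enableLiveFilter(document.getElementById('product-list'));
</script>
```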

Conclusion

Taking the time to optimize JavaScript for SEO can help improve the organic visibility of a website. If you need assistance, make sure to book a meeting with one of our technical SEO experts to learn how we can help you optimize your web pages for better SERP performance.

Duplicate content is a problem for any website that wants to rank in search engines because it complicates the user experience and confuses search engines. 

But most of the time, duplicate content is not intentional and is caused by technical issues or automatically generated pages. 

Resolving duplicate content can be an easy process, and this guide will cover the common causes of duplicate content and how to fix it before it negatively impacts your search engine rankings.

What is Duplicate Content in SEO?

Duplicate content is content that appears on the same website in more than one place. It can also refer to the same content appearing on multiple websites. This content can take the form of identical or near-identical content, and it is ultimately detrimental to SEO visibility.

Why is Similar Content a Problem?

Google wants webmasters to provide valuable and unique content to users. If your website has lots of similar pages, it doesn’t signal a website that provides a good user experience. 

The primary technical problem with duplicate content is that it causes search engines to have a more difficult time determining which version of the content should be indexed and displayed in search results.

When Google faces this problem, it usually promotes both versions of the content less often.

Does Duplicate Content Result in a Penalty?

You may have read about a possible duplicate content penalty, but Google will not penalize websites that make an honest mistake. Most often, webmasters do not intentionally publish duplicate content on their websites or are unaware that the same content is available on different URLs. 

But if a website is consistently copying content from other websites and publishing it as its own, that’s a different issue. Google will penalize and potentially de-index those domains that are copying content from other websites.

What Causes Duplicate Content?

There are a few possible explanations for why your website has duplicate content or duplicate pages.

  • Duplicate content can be created inadvertently, such as when different versions of a URL point to the same page
  • When the content is served with subdomains that are not the canonical domain, such as https://www.website.com/ versus https://website.com/
  • When content is syndicated across multiple websites 
  • Websites that are improperly set up can often cause duplicate content
  • Duplicate pages that do not include the proper canonical tags to tell Google that they are copies of a master version of the page

How Do I Avoid Duplicate Content?

In order to avoid duplicate content in SEO, it’s vital to ensure that all web pages on a website have unique and original content. 

Additionally, it’s worth taking the time to make sure that the same content is not accessible from multiple URLs. It’s also necessary to be aware of syndicated content and to implement measures to avoid duplicate content issues.

But the simplest way is to use a site auditing tool. When duplicate content is found on your website, it will be flagged in the site auditor.

Thankfully, duplicate content is a simple fix, and there are a few options for preventing it from appearing on your website.

How to Fix Duplicate Content in SEO

If you have a web page that is flagged in the site auditor for duplicate content issues, there are a few different options for resolving it. 

Update the Content on the Page

If the content or copy of the page is too similar to other pages, Google will not understand which page to promote in the SERPs for related keywords. In general, you want the different pages of your website to rank for different keyword clusters, and reoptimizing the page for a different set of keywords will help search engine crawlers more easily understand the content as unique.

Use a 301 Redirect

One option is a 301 redirect, which is a permanent redirect from one URL to another. This redirect lets search engines know that the original page has been permanently moved to the new page. This strategy prevents search engines from indexing duplicate pages.
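On an Apache server, for example, a 301 redirect can be declared in the site’s .htaccess file (the path and domain below are placeholders):

```text
# Permanently redirect the duplicate URL to the preferred one
Redirect 301 /old-page https://www.example.com/new-page
```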

Add Canonical Tags to Similar or Duplicate Pages

If your website has lots of product pages that are similar, you should get comfortable using canonical tags to communicate to Google which page is the master version of the content and should be indexed. This method helps search engines identify which version of the page should be ranked.
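A canonical tag is a single line in the `<head>` of the duplicate page pointing at the master version (the URL below is a placeholder):

```html
<link rel="canonical" href="https://www.example.com/products/widget" />
```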

Add a noindex tag

Another option is the noindex tag, which tells search engines not to index the page. This technique prevents them from indexing the duplicate page.
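The noindex directive goes in the `<head>` of the page you want kept out of the index, as a robots meta tag:

```html
<meta name="robots" content="noindex" />
```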

Consider a Content Delivery Network

Finally, you can use a content delivery network (CDN). A CDN is a network of servers that delivers web content to users based on their location. This network helps ensure that users are seeing the content from the original page and not a duplicate page. If you have different regional versions of a web page, a CDN may be the most viable solution.

Conclusion

Ultimately, fixing duplicate content in SEO is critical for any SEO strategy. To ensure that a website remains SEO-friendly, it’s necessary to identify and address duplicate content issues before they become a problem. Make sure you use the site auditor in your dashboard to identify and resolve duplicate content when it appears.