There’s so much more to creating an online presence than simply uploading content onto a webpage. Getting noticed in search results is the best way to grow a business and brand name, and creating good content is a key to ranking for multiple search queries. However, search engine crawlers use HTML tags to read and understand the web pages they index. This means that it is not just your content that is important, but how your content reads on both the frontend and the backend of your website.
It’s best to think of HTML as Google crawlers’ native language, and this guide will explain how to use your SEO HTML tags to better communicate to Google the relevance of your content to searchers.
HTML tags are the foundation of any website. They are small snippets of code that are embedded into the back-end of a website, and there are different HTML uses for different components on a webpage.
An HTML tag is characterized by an opening tag (<>) and a closing tag (</>) surrounding the word or phrase.
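For example, here is a minimal, hypothetical snippet showing an opening and closing paragraph tag wrapped around content:
<p>This sentence is wrapped in a paragraph tag.</p>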
In order to understand why HTML tags are important for SEO purposes, it is important to understand the fundamentals of how a search engine works.
Simply put, a search engine’s goal is not only to provide informative answers to its users, but also to surface relevant and timely content based on the searcher’s query.
There are over 200 ranking factors that go into how a search engine promotes relevant results in its search engine result pages (SERPs). In fact, Google regularly changes its algorithm to improve user experience and the quality of search engine results. Unfortunately, Google keeps these algorithm updates under lock and key, but following SEO HTML best practices is a sure-fire method of communicating information about each page so search engines can read your content accordingly.
One incredibly important SEO best practice is investing time and energy into learning exactly how to implement the right HTML tags on the backend of your website.
Not all HTML elements are created equal. Some are more important for showing up in search results, and you will create them depending on your target keyword and the specific topic of the given page.
Here we explain the most important HTML tags for directing your SEO efforts, along with best practices for each.
Ask any SEO expert out there and they will say that your page title, or title tag, is arguably the most important HTML snippet to include on your website. After all, if you don’t have a title, how will Google or users know what your page is about? HTML specifically tells Google “hey, here is the title of this page,” and once indexed, the title becomes the clickable headline in the SERPs.
The HTML code for a title tag is: <title>your title here</title>
Technically, Google can choose any snippet of text to be the page title in the SERPs. But in order to ensure Google indexes the proper SEO title, there are certain best practices you should follow for all your title tags.
Before you optimize any HTML element, the first step is to identify your focus keyword for the page. Then you will want to put your target keyword into the page title. Not only will this provide informational context for the reader, but it will also give an additional signal to the search engine crawlers about what each page is about.
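For example, a hypothetical page targeting the keyword “cutting wet grass” might use a title tag like this (the page and wording are made up for illustration):
<title>Cutting Wet Grass: Why You Should Never Do It</title>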
But do be careful about keyword stuffing and overusing keyphrases in your titles. As with the rest of the content on your page, too many similar keywords in one place will send warning signals to the search engine that your page may be spam.
Google will only show the first 50-60 characters of your SEO page titles. A good title tag will be short and sweet, preventing your title from being cut off and possibly confusing prospective customers. You’ll have a bunch more space in the headline tags and general content to expand.
At the end of the day, you want to be as helpful to your clients as you can. Your website should not only be a representation of your brand, but an informative resource for all website visitors. This means your page titles should be clear, concise, and adequately reflect what the content of the page is about.
So although a unique title that sparks the curiosity of searchers can seem like the right approach, in reality, users are looking to get the answer to their question as quickly as possible. In the long run, clear, relevant page titles will help improve CTR, which can help secure higher rankings for your website overall.
It is best to think of your page’s meta description as the synopsis on the back of a book. It consists of short, quick, easily digestible sentences that explain in more depth what the page content is about.
Where a page’s title grabs the attention of the user, a meta description adds more context and background information.
A meta description is found within the SERPs directly under the page’s clickable URL. Implementing a meta description will provide Google with the right information they need, without Google having to take snippets of text from the same page and create one themselves. When this happens, your user may not get the most accurate description, and it can cause them to lose interest in your brand name.
The HTML code for a meta description tag is: <meta name="description" content="your description">
Data shows that a well-crafted meta description undoubtedly entices users to click over to your page. According to Backlinko, website pages with meta descriptions had about a 6% higher CTR than those without.
Now, if you are thinking this isn’t that big of a percentage, consider it this way. If your page shows up 500 times in a Google search per month, that is 30 more clients clicking over to your page than if you didn’t have a simple meta description! Adding a meta description is an easy way to get new customers headed your way.
Even if you do have a meta description, there is a small risk that Google will choose another sentence or two from that page that they think is more specific and relevant. But to prevent this, you can follow some SEO best practices.
You really want to drive home to the search engine spiders and your consumer that your page is about a specific keyword or phrase. Where SEO page titles are used for rankings, meta descriptions are more user-focused. Google does have an expected CTR as a ranking factor, so with this in mind, it is crucial to keep the important keywords consistent throughout all the SEO HTML tags, title tag and meta description included.
Just like with SEO titles, you need to watch the length of your meta description. There’s only limited space available in the search engine result pages, so Google has to cut off meta descriptions around 150-160 characters. That’s not to say that you have to match your description up perfectly with the character count, but do your best so the description is easily understood.
Internet users have very short attention spans, so it’s a good idea to remind them of why they need to enter your site! Calls to action don’t have to be overly complex or unique; a simple “learn more here” or “contact us today” can work wonders for your click-through rate.
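Putting these tips together, a hypothetical meta description with the keyword, a reasonable length, and a call to action might look like this:
<meta name="description" content="Learn how to cut wet grass safely without damaging your lawn or mower. Read our step-by-step guide and contact us today!">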
Just like having a strategy for choosing which key phrases you want to incorporate on your page, you have to develop a plan for how you structure that information in subheadings. For maximum readability, for both users and search engine crawlers, you cannot just put a ton of information down on a page. There needs to be structure, and that’s where headlines, or header tags (h1 through h6), come in.
Users don’t always read the entire page of content; rather, they scroll through the page and see if any of the different sections answer their questions. They’ll browse briefly, read the section that appeals to them the most, then leave to complete another action. And if your page isn’t split into multiple sections and is instead one long, winding piece of content, the user will bounce from your page before they even start reading.
This is why headlines are so important; they are the foundation to your landing page or blog post’s construction.
A header tag looks like: <h1>your heading here</h1>
So how do you utilize keyphrases in your multiple headlines without keyword stuffing or sounding spammy? The answer comes with LSI keywords.
LSI stands for Latent Semantic Indexing; LSI keywords are synonyms related to the main keyword(s) you are trying to target. Sprinkling LSI keywords throughout your content makes it easier for search engines and users to get a general idea of what your content is about.
However, it is important to note that LSI keywords are not always synonyms of your keyword, rather LSI keywords are related phrases to your topic. For example, if your keyword is coat, a synonym would be jacket. LSI keywords for coat would be winter, spring, feather down, puffy, warm, light, etc.
Just as your headlines give structure to the page as a whole, LSI terms give more context to the content. Considering Google takes a look at your entire page before indexing and categorizing it, utilizing LSI keywords and synonyms in your SEO HTML tags will work to drive home the meaning and messaging of your content.
You can use tools like our landing page optimizer to identify related keyphrases and focus words for your heading tags. Just enter your keywords into the tool, and our software provides a list of keywords that have strong topical relevance to the keyphrase. The dropdown menu will provide you plenty of terms to choose from.
Thinking in terms of how search rankings work is pivotal for creating headlines that convert. Here are three tips on how to optimize your heading tags for SEO purposes.
There are multiple sizes of header tags you can use: H1, H2, H3, H4, H5, and H6. The higher the number, the smaller the text will be, and the less important it is to use a keyword phrase. Using more than one H1 tag can confuse search engines, as they see the H1 as the title of the page.
A quick note: H1 tags are not to be confused with title tags as mentioned above. Title tags are shown in the search engine result pages, whereas H1s are only shown on the web page itself.
You should write your headlines so they are consistent and concise. It is always a good practice to write your headlines in a way that if you were to remove all other content, the headings would read like a list.
Because headings are noticed and ranked by search engine bots, you should always use this space on your website to your advantage and write content that can help with rankings. Many users enter keyphrases as questions, so writing headlines that resemble queries a searcher would ask, or that give a helpful answer to a question, is an optimization strategy that works well.
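To illustrate, here is a hypothetical heading outline for a page targeting “cutting wet grass.” Note the single h1, the question-style subheading, and how the headings would read like a list on their own:
<h1>Cutting Wet Grass: Why You Should Never Do It</h1>
<h2>Is It Bad to Mow Wet Grass?</h2>
<h2>How Long to Wait After Rain Before Mowing</h2>
<h2>Alternatives to Mowing a Wet Lawn</h2>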
There are more SEO HTML tags that can also function as ranking signals. Here are other important elements of your HTML code to pay attention to.
An alt tag (alt attribute) is your own description or explanation of what an image on your website contains or is about. When you think from an SEO perspective, you know that crawlers can’t see your images, so a little bit of alt text is the only way they will understand the relevance of that image to the keyphrases that users enter into their Google search bar.
The goal of alt text is to allow Google to know what the image is about, but also to help the user in case they are visually impaired or the image does not load. There is more to an alt tag than an accessibility factor, however, as alt tags help search engine crawlers read the images themselves and index them. This is why you sometimes see images from multiple brands when you click the “image search” tab during a Google search.
So as a rule of thumb, make sure to use an alt tag for anything visual on your website.
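The alt attribute lives inside the image tag itself. Here is a short, hypothetical example (the file name and description are made up):
<img src="mowing-wet-grass.jpg" alt="Lawn mower cutting wet grass after a rainstorm">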
A website owner can set parameters for how search engine bots crawl their website via robots tags. These tags give direction on which pages can be crawled and which should be ignored from an indexing perspective. The nofollow attribute prevents Google crawlers from following internal links to other pages of your site. Robots tags are useful if you have some seasonal pages that may not always be relevant, or if you are currently working on updating a webpage.
A robots tag looks like: <meta name="robots" content="noindex, nofollow">
Google is very strict when it comes to unique content, and will penalize you if you have duplicate content or thin content on your page. A canonical tag ensures that this doesn’t happen.
A canonical tag is a tag that you can put on a page to label it as the “master” version. Multiple similar pages, such as product pages, can result in unique URLs for essentially the same content, which can confuse search engines about which page to show in the SERPs. You may have multiple pages for many different reasons, but you need to use your source code to tell Google which pages to crawl and rank in the search results pages.
Adding a canonical tag to a page will tell the search engine to ignore any other duplicate content on the website, which will prevent you from being docked in the rankings.
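The canonical tag goes in the <head> of the page. A minimal sketch, assuming a hypothetical master URL:
<link rel="canonical" href="https://example.com/master-product-page">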
As mentioned previously, there are plenty of search engine optimization benefits to optimizing the source code of your website.
While SEO HTML tags may seem overwhelming to webmasters at first, rest assured that with a little practice they will become much easier to implement. You can add tools like Yoast SEO Premium to your WordPress site to make sure you’re implementing your metadata correctly. In turn, your website will see more keyword rankings, growing your market share with each new post you publish.
Our experts are here to help your business stand out on the Internet. From link building to on-page SEO to keyphrase research, we can address all the important factors needed to get your website ranking in no time.
When it comes to search engine optimization, the details matter. Why? Search engines analyze and index page details and these details add up to a recipe for success in search engine results. While many people overlook the importance of URLs, we urge you not to underestimate them. Creating SEO-friendly URLs from the time of a page’s creation can eliminate compounding issues in the future and give you a competitive edge in the SERPs.
TLDR: URL best practices will save you time, money, and stress while improving your site’s crawlability. And this article will teach you how to optimize every URL you create.
Every URL, or uniform resource locator, is unique. Why? Because every URL is associated with only one web page. As the name implies, a URL is a unique address that a web user enters into a web browser (such as Google Chrome). After hitting ‘enter’ on the address bar, your web browser sends a request to the server where the URL’s data lives. The server then locates or retrieves the web page (or “resource”) and sends the page data to the browser to present to the user.
You likely noticed that every URL is a string of characters, often including letters, numbers, periods, slashes, and colons. Furthermore, you likely noticed that these characters follow a pattern. This pattern denotes specific information.
The first element of every URL is the protocol: either http:// or https://. This is the set of instructions for how the data sent between a web browser and a server is handled. HTTPS is a more secure information delivery system with data encryption.
A subdomain is a part of the main domain that delineates different versions or parts of a website.
For example, a website that has a shop may have a subdomain that denotes that the page the user is on is part of the online shop rather than the blog. You will also find language subdomains such as en, de, and es.
A domain name is the root of the URL and represents the overall website.
The TLD or top-level domain indicates the category or type of website (to an extent). For example, “com” stands for commercial.
Many URLs also contain subfolders (also called subdirectories); these often act as one more level of organization (similar to how folders in your Google Drive allow you to organize individual files).
The final element of the URL in our example is the page designation, path, or “slug.” This is where you must refine the text while optimizing your URL for SEO.
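Putting the pieces together, here is a hypothetical URL with each element labeled:
https://shop.example.com/guides/never-cut-wet-grass-again
protocol: https:// | subdomain: shop | domain name: example | TLD: .com | subfolder: /guides/ | slug: never-cut-wet-grass-again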
Search webcrawlers, including Google’s, use a wide array of elements on a page to better understand the content on that page and the overall website. Two elements of a web page that search engine bots analyze to index a page are the URL text and URL structure.
More importantly, writing a URL in a search-engine friendly manner the first time helps you avoid rewriting it in the future and requiring a redirect. Why avoid redirects? Redirects require a server to first look in one location then move onto another location to retrieve the web page data. And while this is often only a fraction of a second, this time can compound during a search engine crawl leading to site load speed issues that can worsen the user experience.
Luckily, writing SEO-friendly URLs is straightforward and easy once you understand the foundations. This section will outline how you can create evergreen URLs that webcrawlers can easily understand and that will last the lifetime of your site.
Keyword-driven content is the heart of good SEO. In fact, it’s so pivotal to SEO that your keywords should extend to your URL text. Your primary or target keyword should appear in your URL text. If it makes sense to put it at the beginning of the page locator, then do so, but do not force it.
You should also avoid keyword stuffing your URL. To do so, be sure you use a relevant keyword in relation to your page content. Descriptive keywords also result in a better experience for users and webcrawlers.
Examples:
Good: unlawn.org/never-cut-wet-grass-again
Bad (keyword stuffing a URL): unlawn.org/never-cut-wet-grass-again-avoid-cutting-wet-grass-stop-wet-grass-cutting
Shorter URLs are more user-friendly than long URLs. They also pare back any information that may be confusing to search engine crawlers. Using a shortened version of your page title often results in an accurate and short URL.
While Google denies URL length as a ranking factor, studies show that the highest-ranking search results often contain a total of 50 to 70 characters, including the root domain, subdomain, and page text.
When deciding what to include and what to omit when crafting your URLs, you will want to leave out extraneous words and characters, such as articles, conjunctions, special characters, and numbers.
Shorter URLs improve the user experience and make your pages easier for search engine crawlers to understand.
When words are shoved together without separation, web crawlers’ NLPs struggle to understand the individual words. To avoid this problem, place hyphens between words in your URL. This makes your URLs easier to read for webcrawlers and searchers.
Example:
Good: yourwebsite.com/blog/seo-outsourcing-guide/
Bad: yourwebsite.com/seooutsourcingguide
Why does readability matter for searchers? Your URL path appears in the SERPs, and searchers do read it in order to understand the page they’re about to click on. The hyphens in the URL improve the user experience of the person scrolling through search results.
Why not use underscores instead of hyphens in your slug? To put it simply, Google recommends using hyphens to make a URL easier for their bots to understand. Underscores add complexity which can confuse and slow down a crawler.
Why not use spaces in a slug? Spaces in a URL are converted into code. You may have even noticed them in the past. They transform into %20.
For example: example.com/how%20grooming%20your%20dog%20at%20home. As you can see, this switch renders the URL more difficult to read and much longer, neither of which is ideal for a web address.
Many people don’t realize that URLs are, in fact, case-sensitive. And while Google’s John Mueller contends that Google doesn’t care if you have capital letters in your URL when it comes to ranking signals, we argue that all lowercase is still best practice. And here’s why:
1) Having a mix of uppercase and lowercase letters looks bad and is just generally a little more difficult to read. It turns out all lowercase letters result in a more readable URL. (I’ve also noticed that this practice is often done by people who simultaneously do not use hyphens.)
2) Having all uppercase letters looks like the URL is yelling at the user.
3) People have to put in more effort to type with caps. (If you’re thinking “but they get to the same page either way,” keep in mind there are users that do not know that. And note our next point).
4) URLs with differing capitalization count as different pages, which can cause duplicate content issues to crop up unless you employ correct canonical tags. Without correct canonicals, you may get flagged by Google for duplicate content, or find the wrong version of your page appearing in the SERPs.
Just to help you better understand #1 and #2 above, here are some examples:
1) Good: yourwebsite.com/best-seo-practices-for-beginners
Bad: yourwebsite.com/Best-SEO-Practices-For-Beginners or yourwebsite.com/BestSEOPracticesForBeginners
2) Good: yourwebsite.com/best-seo-practices-for-beginners
Bad: yourwebsite.com/BESTSEOPRACTICESFORBEGINNERS
For a very long time, you would hear that you should always leave out “stop words” in your URLs because they just take up space. And this was true–for a while–and still is when it comes to articles and conjunctions.
However, before Google’s BERT algorithm (an NLP algorithm that processes human language), Google often ignored prepositions since they were seen as minor signals in relation to content meaning. Over time, the Google team realized that these words often add a lot of contextual meaning to searchers’ intent.
For example, someone looking for “restaurants nearby downtown Nashville” has a slightly different intent than “restaurant in downtown Nashville.”
So, if the word is important to understanding the category of the content on the page, keep it in your URL.
Special characters tend to clutter your URL. They’re also a bit more ambiguous in meaning than words. Therefore, it’s best to leave them out. Another way to think about this rule is that you never want a user to be confused if you were to tell them what to type.
For example: if you have the URL example.com/SEO-&-so-much-more, the user would likely type out “and”.
Why remove numbers from your URL text? Content re-optimization is a possibility for any blog post. When you have a number in your URL, you’re not leaving your content room to grow in the future. And if you do change the number of items on a list-style blog, you wind up needing to create a new URL and a redirect.
For example: numbers in a list (ex: 7-ways-to-improve-your-running-form)
When it comes to subfolder best practices, keep it simple. While a short URL is easier for searchers to scan, subfolders also tell Google more about the organization of your site through category names. However, Google does warn that a long chain of subfolders can lead to a long URL that isn’t as easy for users to understand at first glance from the SERPs. Having several subfolders in your URL structure also signals to Google that the content may be less important.
When creating subfolders, remember to keep your category names concise to reduce the risk of an overly long URL slug.
So, when it comes to creating URLs, use the guidelines above. However, we know that sometimes we don’t learn about the magic of SEO until we’re 100+ pages deep. We’ve got you covered. This section will go over re-optimizing existing URLs and answer those other questions many people have about URLs and SEO.
This question comes up time and time again: “Should I change my URLs just for SEO?” The answer is, “it depends.” When it comes to URLs, you have a few choices for re-optimizing (or not) existing URLs.
Does your domain name affect your rankings? No. Google doesn’t count a site’s domain name as a ranking factor, according to Google’s John Mueller. However, while search engine crawlers don’t care about your domain name, users do. A good domain name can improve brand awareness and visitors’ ability to remember your site. To create a good domain, avoid confusing spelling and unsafe characters, and make it memorable.
What about dynamic URLs? Dynamic URLs are URLs that change with each request, usually generated by a server-side scripting language. They are used to track user interactions and to pass data between pages. You will often see these as share URLs (for example, on Amazon and Zillow). The unique tracking parameter allows sites to better understand how the shared link is used.
Does anchor text matter for SEO? Yes. Google uses the anchor text, as well as the annotation text surrounding it, to better understand the content of your destination page. We recommend using diverse (but on-topic) anchor text and focus-term-rich content surrounding the link.
Whether you’re the site owner of a small business or a marketing content writer for an enterprise-level company, your URLs matter. A concise URL slug can help Google and other search engines better understand your content.
For the best search engine rankings, always refer to URL best practices and use the best SEO tools for content creation. High-quality content with a URL that reflects the topic is a winning combination for landing pages, blogs, and product pages.
To learn more about SEO tools and SEO strategy, check out our SEO starter guide. With SEO from the address bar to your internal links, you can begin to improve your site’s overall search engine rankings with confidence.
There are three directives (commands) that you can use to dictate how search engines discover, store, and serve information from your site as search results: noindex, nofollow, and disallow.
These directives allow you to control which of your site pages can be crawled by search engines and appear in search.
The noindex directive tells search crawlers, like googlebot, not to include a webpage in its search results.
Indexing is the process by which Google scans, or ‘crawls,’ the internet for new content that is then added to the search engine’s library of search-accessible content.
There are two ways to issue a noindex directive:
By using the noindex meta tag for a page, or issuing it as an HTTP response header, you are essentially hiding the page from search.
The noindex directive can also be used to block only specific search engines. For example, you could block Google from indexing a page but still allow Bing:
Example: Blocking Most Search Engines
<meta name="robots" content="noindex">
Example: Blocking Only Google
<meta name="googlebot" content="noindex">
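For the response-header method, the same directive can be sent as an X-Robots-Tag HTTP header. A minimal sketch of what the server would return for the page:
X-Robots-Tag: noindex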
Please note: As of September 2019, Google no longer respects noindex directives in the robots.txt file. Noindex now MUST be issued via an HTML meta tag or an HTTP response header. For more advanced users, disallow still works for now, although not for all use cases.
It’s the difference between storing content and discovering content:
noindex is applied at the page-level and tells a search engine crawler not to index and serve a page in the search results.
nofollow is applied at the page or link level and tells a search engine crawler not to follow (discover) the links.
Essentially the noindex tag removes a page from the search index, and a nofollow attribute removes a link from the search engine’s link graph.
Using nofollow at a page level means that crawlers will not follow any of the links on that page to discover additional content, and the crawlers will not use the links as ranking signals for the target sites.
<meta name="robots" content="nofollow">
Using nofollow at a link level prevents crawlers from exploring a specific link, and prevents that link from being used as a ranking signal.
The nofollow directive is applied at a link level using a rel attribute within the anchor (<a href>) tag:
<a href="https://domain.com" rel="nofollow">
For Google specifically, using the nofollow link attribute will prevent your site from passing PageRank to the destination URLs.
However, Google did announce that as of March 1, 2020, the search engine would begin to treat nofollow links as “hints” that contribute to a site’s overall search authority.
For the majority of use cases, you should not mark an entire page as nofollow – marking individual links as nofollow will suffice.
You would mark an entire page as nofollow if you did not want Google to view the links on the page, or if you thought the links on the page could hurt your site.
In most cases, blanket page-level nofollow directives are used when you do not have control over the content being posted to a page (ex: user-generated content can be posted to the page).
Some high-end publishers have also been blanket applying the nofollow directive to their pages to dissuade their writers from placing sponsored links within their content.
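As a side note, Google introduced the rel="sponsored" attribute alongside its 2019 nofollow changes for paid placements specifically. A hypothetical sponsored link (the URL is made up) could be marked up like this:
<a href="https://domain.com" rel="sponsored">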
Mark pages as noindex that are unlikely to provide value to users and should not show up as search results. For example, pages that exist for pagination are unlikely to have the same content displayed on them over time.
Domain.com/category/results?page=2 is unlikely to show a user better results than domain.com/category/results?page=1, and the two pages would only compete with each other in search. It’s best to noindex pages whose only purpose is pagination.
A page marked both noindex and nofollow will block a crawler from indexing that page, and block a crawler from exploring the links on the page.
Essentially, the image below demonstrates what a search engine will see on a webpage depending on how you’ve used noindex and nofollow directives:
If a search engine has already indexed a page, and you mark it as noindex, then next time the page is crawled it will be removed from the search results.
For this method of removing a page from the index to work, you must not be blocking (disallowing) the crawler with your robots.txt file.
If you are telling a crawler not to read the page, it will never see the noindex marker, and the page will stay indexed although its content will not be refreshed.
If you want to remove a page from the search index, after it has already been indexed, you can complete the following steps:
Confirm the page has been removed from search. Once you’ve requested the crawler revisit your webpage, give it some time, and then confirm that your page has been removed from the search results. You can do this by going to any search engine and entering the site: operator followed by the target URL (for example, site:domain.com/page-to-remove), like in the image below.
If the page you want removed from search is on a site that you own or manage, you can use the Webmaster URL Removal Tool.
The Webmaster URL removal tool only removes content from search for about 90 days. If you want a more permanent solution, you’ll need to use a noindex directive, disallow crawling via your robots.txt, or remove the page from your site. Google provides additional instructions for permanent URL removal here.
If you’re trying to have a page removed from search for a site that you do not own, you can request that Google remove the page from search if it meets certain criteria, such as exposing sensitive personal or financial information.
If the page does not meet such criteria, you can contact an SEO firm or PR company for help with online reputation management.
It is usually not recommended to noindex category pages, unless you are an enterprise-level organization spinning up category pages programmatically based on user-generated searches or tags and the duplicate content is getting unwieldy.
For the most part if you are tagging your content intelligently, in a way that helps users better navigate your site and find what they need, then you’ll be okay.
In fact, category pages can be goldmines for SEO as they typically show a depth of content under the category topics.
Take a look at this analysis we did in December, 2018 to quantify the value of category pages for a handful of online publications.
We found that category landing pages ranked for hundreds of page 1 keywords, and brought in thousands of organic visitors each month.
The most valuable category pages for each site often brought in thousands of organic visitors each.
Take a look at EW.com below: we measured the traffic to each page (represented by the size of the circle) and the value of the traffic to each page (represented by the color of the circle).
Monthly Organic Traffic to Page = Size
Monthly Organic Value of Page = Depth of Color
Now imagine the same charts, but for product-based sites where visitors are likely to make active purchases.
That being said, if your categories are similar enough to cause user confusion or to compete with each other in search, then you may need to make a change.
There are a few options to stop Google from indexing subdomains:
If your subdomains are for development purposes, then adding an .htpasswd file to the root directory of your subdomain is the perfect option. The login wall will prevent crawlers from indexing content on the subdomain, and it will prevent unauthorized user access.
Example use cases include development and staging environments (e.g., dev.domain.com or staging.domain.com).
If your subdomains serve other purposes, then you can add a robots.txt file to the root directory of your subdomain. It should then be accessible as follows:
https://subdomain.domain.com/robots.txt
You will need to add a robots.txt file to each subdomain that you are trying to block from search. Example:
https://help.domain.com/robots.txt
https://public.domain.com/robots.txt
In each case, the robots.txt file should disallow crawlers. To block most crawlers with a single command, use the following code:
User-agent: *
Disallow: /
The star (*) after User-agent: is called a wildcard; it will match any sequence of characters. Using a wildcard will send the following disallow directive to all user agents regardless of their name, from googlebot to yandex.
The forward slash tells the crawler that all pages off of the subdomain are included in the disallow directive.
If you would like some pages from a subdomain to show up in search, but not others, you have two options:
Page-level noindex directives will be more cumbersome to implement, as the directive needs to be added to the HTML or HTTP header of every page. However, noindex directives will stop Google from indexing a subdomain whether the subdomain has already been indexed or not.
Directory-level disallow directives are easier to implement, but will only work if the subdomain pages are not in the search index already. Simply update the subdomain’s robots.txt file to disallow crawling of the applicable directories or subfolders.
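For example, a hypothetical robots.txt that allows crawling of the subdomain generally but disallows one subfolder (the directory name is made up for illustration) might look like:
User-agent: *
Disallow: /internal-reports/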
Accidentally adding a noindex directive to a page on your site can have drastic consequences for your search rankings and search visibility.
If you find a page isn’t seeing any organic traffic despite good content and backlinks, first spot check that you haven’t accidentally disallowed crawlers from your robots.txt file. If that doesn’t solve your issue, you’ll need to check the individual pages for noindex directives.
WordPress makes it easy to add or remove this tag on your pages. The first step in checking for a noindex on your pages is simply toggling the Search Engine Visibility setting within the “Reading” tab of the “Settings” menu.
This will likely solve the problem, however this setting works as a ‘suggestion’ rather than a rule, and some of your content may end up being indexed anyway.
In order to ensure absolute privacy for your files and content, you will have to take one final step: password protecting your site using either cPanel management tools, if available, or a simple plugin.
Likewise, removing this tag from your content can be done by removing the password protection and unchecking the visibility setting.
Squarespace pages are also easily noindexed using the platform’s Code Injection capability. Like WordPress, Squarespace can easily be blocked from routine searches using password protection; the platform also advises taking this step to protect the integrity of your content.
By adding the NoIndex line of code within each page you want to hide from internet search engines and to each subpage below it, you can ensure the safety of secured content that should be barred from public access. Like other platforms, removing this tag is also fairly straightforward: simply using the Code Injection feature to take the code back out is all you will need to do.
Squarespace is unique in that its competitors offer this option primarily as part of the suite of settings in their page management tools; Squarespace departs here, allowing for personal manipulation of the code. This is interesting because you are able to see the change you are making to your page’s code, unlike with the others in this space.
Wix also allows for a simple and fast fix for NoIndexing issues. In the “Menus & Pages” settings, you can simply deactivate the option to ‘show this page in search results’ if you want to NoIndex a single page within your site.
As with its competitors, Wix also suggests password protecting your pages or entire site for extra privacy. However, Wix departs from the others in that the support team does not prescribe parallel action on both fronts in order to secure content from the crawler. Wix makes a particular note about the difference between hiding a page from your menu and hiding it from search criteria.
This is particularly useful advice for less experienced website builders, who may not initially understand the difference: removal from your site menu makes the page unreachable from the site’s navigation, but not from a well-aimed Google search.
How Google Determines Relevance
Search engines want to prioritize the most relevant content possible for different searches and search terms (also known as keywords). One of the ways search engines are able to tell that content on a webpage is relevant for a particular search term, is that the term appears in the body of the content on the page. You’ll hear the terms on-page content and on-site content used in the SEO world. On-page content refers more to the content that is visible on the page itself, while on-site content can sometimes be used more broadly to include content found in meta data or schema markup.
Onsite content refers to both VISIBLE (page copy) and INVISIBLE (meta data) content.
If you were interested in finding an Italian restaurant in NYC, you might type Italian Restaurant NYC into the search bar. Google could quickly find pages that included all of those terms, but there are going to be hundreds of pages that include the words “italian”, “restaurant”, and “NYC” (or variations on NYC, such as “new york” and “manhattan”).
To surface the most relevant results (as opposed to just a diner in NYC which has an italian sub on the menu), Google looks for additional “focus” keywords in the copy of the page to help determine that the ENTIRE page is relevant to the search term, as opposed to just one line on the page.
If I searched Italian Restaurant NYC, Google would expect a relevant page to have some terms on it like pasta and parmesan, maybe burrata.
The more terms included in the copy of a page that Google knows are relevant to Italian restaurants, the more likely Google is to prioritize that result in the search results compared to a page that has fewer focus keywords.
Search engines also take into account the frequency of certain terms on a page. For example, how often terms like entree or course are used can help a search engine understand if a page with the term italian is more relevant to a restaurant search as opposed to a sandwich search.
Focus keywords help search engines determine how relevant a page is to the term that was searched.
When optimizing content for search engines, you cannot be all things to all people. A page is unlikely to rank for both fancy birdhouses and Italian restaurants in NYC; those two terms will have almost no overlapping focus keywords.
This means that sites need to optimize content for the types of users (and searches) which are most likely to bring converting traffic to a website.
For a site where the profit model functions off of advertising and/or readership, it makes sense to prioritize keywords purely based on their volume. The more eyeballs on a site, the more likely content is to be shared, and the more advertisements will get viewed.
However, for most sites, search intent should be taken into account before volume.
For sites where the profit model functions off of purchase, or participation, we want to look at the exact search terms being used. When it comes to converting traffic, the intent behind keywords becomes much more important.
Take, for example, two searches: one is used jeep dealers south detroit and the other is used blue cars. The first term indicates a much clearer intent to purchase, as the person already knows the type of vehicle they’re looking for, and is looking for a physical location where they can see the vehicles.
For your own site, if the goal is to attract more converting users to the site, you want to target keywords that suggest an intent to engage, purchase, donate, or otherwise complete a goal relevant to your business.
If you sell umbrellas a converting user is a user who will buy an umbrella. The term rain might be related to umbrellas, and have a huge search volume, but it’s unlikely to be the term that a user looking to purchase an umbrella would use for a search. Take a look at the keywords below.
Rain has the highest search volume, but not the highest CPC. The other keywords also have much clearer search intent (looking for umbrellas, specific types of umbrellas). Given the organic ranking difficulty of the keyword, the search volume, the CPC, and the search intent — custom umbrella would be the best term to target. However, if you don’t offer custom printed umbrellas, the term would not be converting for your site because the searcher would not be able to find the product they were looking for.
You should only target keywords that are likely to bring in CONVERTING traffic to your site.
To have your site receive more converting traffic, we want to target terms that are likely to convert for the site, and then create content that Google (and other search engines) will recognize as hyper relevant for those searches.
What terms would you want to rank for? What types of keywords or search terms would a user looking for your business put into the search bar? Selection of these keywords is done through keyword research, where you look at where competitor sites are getting traffic, as well as how converting traffic currently comes into your domain.
Look at the keyword metrics to determine how difficult it will be for your site to rank for each of the identified terms. More difficult terms (terms with more competition) will have to have longer content, with more focus keywords included, for your site to be able to make it onto the first page of search results.
There are two parts to content optimization:
The first part is creating useful, meaningful, content that provides value to your desired audience. To get some ideas for what content might be useful to your audience, you can take a look at trending topics, frequently asked questions, and search volume for different topics or terms.
The second part is optimizing that content for search engines so that Google, Bing, and others know exactly what the content on the page is about and when to surface your content as a search result.
Search engines use internal links to help understand the subject matter and importance of pages within your site. The pages with the most internal links are viewed as the most important pages on your site.
Importance: Pages linked in your main navigation are linked to from every page on your site, and should be the ones you and your users consider most important.
Subject Matter: The anchor text that you use for internal links sends signals to search engines telling them the subject matter or topic of the page. If you call a page “SEO Services” in your main navigation, and that text links users to a page, both the users and search engines expect that page to be about “SEO Services.”
Search engines use external links, in part, to evaluate the quality of the content you’re providing users. If you’re writing a piece on a topic, and linking to sites that a search engine already knows are authoritative on the same topic, it demonstrates that you are providing good resources. This increases a search engine’s trust in the relevancy of your content.
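In HTML, both kinds of links use the same anchor tag; only the destination differs. A hypothetical sketch (the URLs are made up for illustration):
<a href="/services/seo">SEO Services</a> (internal link with descriptive anchor text)
<a href="https://respected-industry-site.com/ranking-study">industry study on ranking factors</a> (external link to an authoritative resource)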
To optimize content we need to: select a target keyword, identify the focus keywords that top-ranking pages share, incorporate those focus keywords naturally into our copy, and add relevant internal and external links.
Let’s pretend we own an Italian restaurant in NYC and we want to capture people who are looking to select an Italian Restaurant in NYC.
We’ll start by popping a few potential search terms into a keyword explorer (Ahrefs in this case):
The first thing to note is that we did pretty well with our initial keyword guesses! We can see we managed to include the parent topic. The parent topic is the keyword related to our search that gets the most monthly search volume.
Sometimes how YOU think about searching for something and how AVERAGE users think about searching for something will be different.
Pro-tip: Always keep an eye on the parent topic to see if there are additional keywords you could explore.
We’ll pick italian restaurants nyc because, out of this set of keywords, it offers the best balance of search volume, ranking difficulty, and search intent for our page.
Target Keyword Selected: italian restaurants nyc
How do we know which additional terms (focus keywords) need to be on the page for Google to recognize that our content is hyper relevant for a search (target keyword)? We run that search ourselves, and then look at the top 10 results, and the content/terms on each of those pages.*
*For this example, we’re going to use the dashboard’s FREE Content Optimizer tool.
Once we’ve identified our target keywords, the search terms we want our page to rank for, we’ll want to analyze the pages currently ranking in the top 20 search positions for those terms.
This will help us identify the topics and focus keywords that need to be on our page if we want it to rank. Incorporating focus keywords will improve the relevancy of our page for the target keyword and, as a result, our page’s position in the search engine results pages (SERPs).
Below you’ll find the current copy on our Italian restaurant’s home page. If we compare this copy to the copy on the first twenty search results for italian restaurants NYC we can see that our content currently only shares a few keywords with those pages (focus keywords are highlighted below in yellow).
Since the 19th century, townsfolk in Vatican City used to take a portion of all food brought in from local harvests to give to the needy. In 1935, the Aribetta family opened a restaurant for dinner in the nearby neighborhood, La Decima (The Tithe) in honor of the tradition and the people who continued to maintain it. Many iterations of their family later, in 2015, the descendants of the original founders of La Decima opened a satellite location in Brooklyn. Both restaurants adhere to two principles: stay close to the roots of their traditional recipes but with refreshed presentations, and local sourcing that gives back to the community.
TIME OUT NEW YORK
La Decima offers dishes like spaghetti cacio e pepe, roasted lamb and Italian ham served with hot, crispy mozzarella, accompanied by a list of all Italian wines.
NEW YORK OBSERVER
Even the pasta, which is hard to present in a way that gives proper credit to the effort needed to produce it, comes across well. The cacio e pepe, in which pecorino and Parmesan bind themselves to thick al dente strands of homemade spaghetti, is phenomenal.
NEW YORK TIMES
Where do you eat in Rome? Is it right that you love the Trastevere district and especially the La Decima restaurant, which has now opened a U.S. branch in the Park Slope section of Brooklyn, in New York City
FODOR’S – ONLINE
Following the lead of the many lauded NYC chefs who have opened second and third restaurants here, one of Rome’s most celebrated restaurants, La Decima, just opened its first stateside outpost not in Manhattan, but in the Brooklyn neighborhood of Park Slope.
Our menus feature homemade pastas, homemade desserts, the finest Italian imported cheeses and olive oils, and farm-to-table ingredients.
*No outside bottles of wine are permitted*
Review the focus keywords already incorporated.
Out of 92 potential focus keywords, our content only uses 6, and is not going to be recognized as very relevant for our target term.
Here are some additional terms (focus keywords) shared by a number of the pages ranking in the top 10 positions in google, that we can consider incorporating into our content:
Not all of these terms will make sense for our page, but we want to incorporate as many focus keywords as possible that will fit naturally into our page. Separate the list into two categories: off topic and on topic.
You may find that you see competitor brands as suggested focus terms, or terms which are not relevant for your product or service offerings. These terms will likely go into your “will not incorporate” category, unless you add them in with content or features like product comparison tables.
Below you’ll see the same copy from before, but this time with additional focus keywords incorporated into the copy. This content is now better optimized for search.
You can see the additional focus keywords we added in BLUE.
Since the 19th century, townsfolk in Vatican City used to take a portion of all food brought in from local harvests to give to the needy. In 1935, the Aribetta family opened an Italian restaurant for dinner in the nearby neighborhood, La Decima (The Tithe) in honor of the tradition and the people who continued to maintain it. Many iterations of their family later, in 2015, the descendants of the original founders of La Decima opened a satellite location in Brooklyn. Both restaurants adhere to two principles: stay close to the roots of their traditional italian recipes but with refreshed presentations, and local sourcing that gives back to the community, which has made it one of the best Italian Restaurants in New York City.
But where the original La Decima might be a bit far for your special occasion night out in NYC, if you’re looking for cozy romantic restaurants to offset the rush of a day out sightseeing in Central Park or Times Square, consider skipping Little Italy and heading out to the quieter Park Slope for authentic traditional Italian without the rush.
See what reviewers have to say about our cozy brick oven style Italian food and wine bar that make La Decima the perfect Italian spot for date night.
TIME OUT NEW YORK
La Decima offers dishes like spaghetti cacio e pepe, roasted lamb and Italian ham served with hot, crispy mozzarella, accompanied by a list of all Italian wines.
NEW YORK OBSERVER
Even the pasta, which is hard to present in a way that gives proper credit to the effort needed to produce it, comes across well. The cacio e pepe, in which pecorino and Parmesan bind themselves to thick al dente strands of homemade spaghetti, is phenomenal.
NEW YORK TIMES
Where do you eat in Rome? Is it right that you love the Trastevere district and especially the La Decima restaurant, which has now opened a U.S. branch in the Park Slope section of Brooklyn, in New York City
FODOR’S – ONLINE TRAVEL GUIDE
Following the lead of the many lauded NYC chefs who have opened second and third restaurants here, one of Rome’s most celebrated restaurants, La Decima, just opened its first stateside outpost not in Manhattan, but in the Brooklyn neighborhood of Park Slope.
Our menus feature homemade pastas all perfectly al dente, homemade desserts, the finest Italian imported cheeses and olive oils, and farm-to-table ingredients. Our rotating tasting menus and full cocktail bar make La Decima the perfect place whether you’re looking for drinks, tapas, or fine dining restaurants in New York. Our brick oven pizzas are great for lunch, and you’ll be glad you got out of the east villages / west villages to sate your olive oil cravings!
Curious about our full date night recommendations for New York City? Check out our neighborhood travel guide, and let us know ahead of time if you’re celebrating a special occasion.
*No outside bottles of wine are permitted* Please check out the selection from our wine bar instead.
Once you’ve added a number of additional focus keywords to your content, run it through a content tool again and see if your copy has a higher score, and is now capable of ranking on the first page of the search results.
We have now incorporated 26 focus keywords, and this content is capable of ranking in the top 10 google search results.
Sometimes simple edits are not enough to incorporate relevant focus terms. In some cases, missing a large swath of focus terms is an indication that you’re missing a block of content users would find helpful.
In our Italian restaurant example, if we wanted to incorporate more terms and get ourselves into the top 3 results in Google, we would need to add a section that allows us to mention more locations in Manhattan. A great way to do this would be to add a section about how to travel to our restaurant from different areas of the city.
To rank on the first page of the search results, your page copy needs to include your target keyword AND supporting focus keywords.
Once your content has been optimized for your target keyword(s) you should look for opportunities to insert links into your copy. You want to link internally to other pages using relevant anchor text, and externally to resources that will help your users.
For our Italian restaurant example, we might link internally to our menus and externally to our Yelp reviews or Open Table booking service.
There are two types of links that you’ll want to consider adding: internal links and external links.
We add internal links to: help users find other relevant content, prompt conversions, help search engines discover and understand our site, and distribute search equity, as discussed below.
When adding links to a page, the first thing you’ll want to do is think of other RELEVANT content you’ve created that a user might want to access. For example, if we had a page on our italian restaurant’s website talking about how we cater events, it would make sense to link to our catering menu. Basically, you want to make it easy for a user to find all relevant content and information on your site.
Review each page for conversion opportunities. Link to sign up forms, scheduling forms, prompt users to call you, email you, or make a purchase. Calls to Action (CTAs) should be placed wherever a user might find them helpful. For example on our catering page, it would make sense to add a CTA linking the user to a page where they could request a quote or place an order.
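For instance, a hypothetical CTA link on that catering page might be as simple as:
<a href="/catering/request-a-quote">Request a Catering Quote</a>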
Internal links also help search engines discover content on your site, and understand how that content is connected. Ideally all pages would be linked from the main navigation, and corresponding sub-navigations. Search engines discover site content in part by using site crawlers to map your site.
A site crawler is a program (also known as a bot) that lands on your site and then clicks into every link on the page it landed on, indexing each page it finds. For each new page it discovers, the crawler will repeat the same process: clicking every link and indexing any new pages it finds. Pages linked from the main navigation, and pages which are linked to frequently, are easy for crawlers to discover and index. Pages which are never linked to may not get indexed at all.
Lastly, internal links help distribute search equity through your site. If a search engine ranks one page on your site very highly, any other (relevant) pages you link to will also see a boost in the search results. This is partially because if Google thinks a page is relevant to users, it assumes that the additional content you link to will also be relevant to the user.
You may have heard a theory that linking out to other sites will cause you to lose search equity. While you don’t want to go crazy with links, being associated with other quality sites will increase your search equity.
It’s like the phrase “you’re known by the company you keep”: you are judged by the people you associate with. In search this is true; good quality sites link out to high-quality resources.
It’s the difference between someone suggesting a medical procedure and giving you the results of relevant clinical trials, and someone suggesting a medical procedure and linking to their friend’s instagram page who swears it works. The better the sources you reference, the more trust search engines will have in your content.
Good quality sites reference high quality content
Search engines view this as a trust signal for two reasons: it shows that you have done your research, and that you care about directing users to genuinely helpful resources.
Remember! Quality content is content that provides value to a user – it can be as simple as well curated comedy clips, or as complex as the results of cutting edge clinical trials. It all depends on what will best serve the user.
You need room for copy on your pages. You’ll need enough room to include your target keyword, and enough focus keywords to get your page ranking. The more competitive the keyword (aka the higher the keyword difficulty) the more focus keywords you will need to incorporate.
Remember: search engines can extract meaning from the use of synonyms, the context in which the keyword appears, and the frequency with which specific word combinations are mentioned.
Don’t go crazy, but help the user access the information they’re looking for as easily as possible.
Calls to Action help users engage more effectively with your business and site. Prompt them to sign up, call now, email you, contact you, learn more, get started, schedule an appointment, or buy now!
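Most of these CTAs are ordinary HTML links, and the tel: and mailto: schemes even let mobile users call or email you in a single tap. A quick sketch with placeholder contact details:
<a href="tel:+15555550123">Call now to book your event</a>
<a href="mailto:catering@example.com">Email us for a quote</a>
<a href="/schedule-a-tasting">Schedule a tasting</a>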
Content on your site should be:
Author Authority has become even more important in SEO to communicate the quality of your website content and the trustworthiness of your website as a whole.
But what is author authority and how is it understood?
Here is a guide to this content-quality standard and what it means for your SEO efforts.
Author authority is a measurement of how much credibility and expertise a given author has on the topic they are writing about.
Say you’re not feeling well, and you go to an article online to get medical advice. When you get to the bottom of the article, you see that rather than being written by a health professional, the author bio describes the writer as working in real estate.
Likely, you would feel the content you just read has less trustworthiness, because the author doesn’t have that specific topic as their area of expertise.
Author authority is particularly important in fields like healthcare, law, finance, or any more technical niche.
The bottom line is, when it comes to the issues that matter the most to us, we want to hear from experts.
Technically, author authority is not a ranking factor.
But the importance of author reputation has been growing in recent years. In their recently updated Search Quality Rater Guidelines, Google emphasized the importance of the author of the main content.
Section 2.6 states, “An important part of the PQ rating is understanding the reputation of the website. If the creator of the MC [main content] is different from the creator of the website, it’s important to understand the reputation of the creator as well.”
So although author authority is not used directly in Google's ranking algorithm, it is one way that Google understands quality content. And Google does evaluate content quality when deciding how to rank search results.
Who writes your content is something that searchers and Google’s quality raters do pay attention to, and thus webmasters who want to show up in search engines should pay attention to it as well.
So how does Google know whether or not the byline of a piece of content is an authoritative source?
Here are the factors that you should include in your content to make sure Google sees your content creators as authoritative.
Every blog post or in-depth article on your website should be accompanied by a clearly displayed author byline and bio.
Although your primary service or product pages don't need clearly displayed authorship, long-form SEO content that explores a topic or related subfields in depth should be accompanied by an author bio.
This also makes for more effective content. Per the previous example, you want your target audience to get to the bottom of an article and see that the particular author does have subject matter expertise.
For every content creator who has a blog post or article on your website, they should also have a bio page that communicates they are a trusted expert.
Author information on an author page can include any of the following: job title, education, areas of expertise, the kind of content they create, and mentions of other trustworthy websites where that creator has published content.
If the content creator has written multiple blog posts or articles for your website, it's good to link out to all of their contributions from their author page.
You should also be including links to your content creators’ social media profiles on their bio pages.
This makes it easy for search quality raters and website visitors to further research your authors and cross reference their content to better evaluate whether or not they are true experts.
Having social media links can also make it easier for your expert authors to get verified on social media websites like Twitter.
It can help them earn Google Author panels and help elevate their status as experts, and thus your reputation as a webmaster who features expert authorship.
Schema markup makes it easier for Google to extract specific information about authors and display it in their search results.
You’ll often see that author bios that appear in search engine results are pulled directly from bios on the publications where those authors appear.
You can use the Schema Creator in your dashboard to easily generate author schema and add it to your bio pages.
Simply select the "Person" option and complete all of the required JSON-LD properties. Then, copy and paste the markup into the <head> section of your author pages' HTML.
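As a rough sketch of what generated Person markup can look like (every value below is a hypothetical placeholder):
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Person",
  "name": "Jane Doe",
  "jobTitle": "Registered Dietitian",
  "url": "https://example.com/authors/jane-doe",
  "sameAs": [
    "https://twitter.com/janedoe",
    "https://www.linkedin.com/in/janedoe"
  ]
}
</script>
The sameAs property is where the social media profile links mentioned above belong.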
The more content a website publishes, the more website owners should focus on establishing the expertise of their authors in specific fields.
If your website publishes content on different topics, author authority is also very valuable.
It will give your target audience full confidence that they are reading reliable information when they discover your content through search engine results.
Internal links allow Google to rank your site more accurately and index your site more effectively.
Your website's internal links not only improve the user experience, they communicate to web crawlers your site architecture and how your web content interrelates.
Without a strong, strategic internal linking structure, your site may lose SEO value and struggle to rank in search engines.
Here is a guide on SEO best practices for internal links, and some mistakes you might be making that could be impacting your organic visibility.
An internal link is a hyperlink that points to a different page on the same website.
They are commonly used to help users navigate between different pages of a website, but can also be used for SEO purposes.
Internal links help to keep visitors on your website longer, which can improve your site’s SEO performance.
There are a few different types of internal links you likely have on your website right now.
Some of them will bring more SEO value than others, so it’s good to know the difference between each.
The links in your menu/navigation bar are some of the most important internal links. These links remain consistent no matter where a site visitor travels across your website.
They should point to the most important pages (e.g. product categories, primary services, blog, about, etc.) and should give users a high-level overview of what type of content is on your website.
Because the majority of your link equity is most likely on your homepage, these internal links will distribute a significant amount of page rank across your website, so make sure the pages linked there are the most important and the ones you want to rank.
The internal links you include here will also communicate to those users visiting your website for the first time where to go next.
Footer links are at the bottom of your web pages. Like the nav bar, the footer is like an anchor that remains consistent across your website.
There may be some repetition in the links you include in your navigation menu and your footer, and that’s okay. They also will be sending quite a bit of link equity from your homepage to the pages linked there.
If users reach the bottom of a web page and have not found a place to click next, you want them to find what they are looking for in the footer.
The internal links that you include on your buttons or CTAs are important for shaping the user or buyer journey across your website and for conversion rate optimization.
Most likely, CTA links are pointing to web pages that push users further down the conversion funnel, whether that is to a web page to book a meeting, request a demo, submit an email address, or add an item to a cart.
The anchor text of these internal links will be primarily user and conversion focused.
Sidebar links are often used to provide users options of relevant content or what page they could go to next.
For publishers that feature a lot of content on their website, sidebar links can help site visitors who are browsing your website without necessarily looking for something specific, but are just exploring the various content you offer.
Sidebar links are very common on news sites, recipe sites, or those that want the opportunity to show users multiple pages (and thus multiple advertisements).
In-article links are those that are included in the body of blog posts or long-form articles. They point to relevant content that can provide users with more context or information.
These types of links are very common because they have loads of SEO value.
If you are not linking to other relevant articles on your website within each blog post, you’re missing out on opportunities to improve your ranking positions and search engine visibility.
The SEO benefits of internal links are significant, and can improve your search engine visibility for a variety of reasons.
Internal links let Google know the most important content on your website. You can use internal links to help Google understand which pages to promote in the SERPs.
When indexing sites, search engine crawlers begin on your homepage and spread out from there, using internal links as their navigational guide.
When you have a strong internal linking system, Google is more likely to find and index all your URLs, so your newest content has ranking potential.
You may wonder how Google knows what your site and landing pages are about.
Google’s web crawlers use the anchor text from internal linking to understand the purpose and meaning of your content and its relevance to specific search terms.
Anchor text best practices can improve your SEO.
Strategic use of noindex and nofollow tags with your internal links can help you ensure that Google is crawling and indexing your most important pages.
For pages that don’t need to be indexed, like thank you or confirmation pages, internal links with nofollow directives can prevent low-value or low-converting pages from ending up in Google’s index.
It also leaves room in your website’s crawl budget for Google to index those pages that you do want to rank.
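For reference, a noindex directive lives in the page's <head>, while nofollow is set on individual links. A minimal sketch, with hypothetical page names:
<meta name="robots" content="noindex">
<a href="/order-confirmation" rel="nofollow">View your confirmation</a>
The meta tag keeps the page itself out of Google's index; the rel attribute tells crawlers not to pass equity through that specific link.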
Internal links also make your website a better place for site visitors.
Navigation links guide users along a conversion journey after they find you in the SERPs, and in-content links can point them to other relevant pages.
Interlinking your topically related pages can turn your website into a topical powerhouse.
Having lots of internal links in your blog posts to related topics or subtopics shows Google crawlers that your website has topical authority, and is a go-to expert source in a particular industry niche or topic area.
If you are not sure whether or not you have internal link issues on your website, a site crawler or site audit tool can help you identify any issues.
To run a site audit, do the following.
If you are not comfortable using our software on your own, you can also order an Internal Linking Analysis in our order builder. Our technical SEO experts will determine if there are any link issues on your site and provide a roadmap for how to optimize your internal linking profile for better organic visibility.
You can use the Site Auditor to see whether or not you are utilizing internal linking best practices.
Our report will flag any internal linking issues that may be preventing your web pages from earning higher keyword rankings in the SERPs.
One of the most common mistakes that new or unoptimized websites make is that they do not include enough internal links on their web pages.
If your web pages are failing to include the right amount of internal links, it will be flagged in your site audit report.
This may or may not be an easy fix, depending on the number of web pages you have on your website.
To resolve the issue, do the following:
Although you want to include internal links on your web pages, too many outlinks on a page (both external and internal) can appear like over-optimization to Google.
Make sure that you are only including links to relevant, helpful content. And don’t overdo it by stuffing your navigation menu or footer with too many internal links.
Reserve those links for the most important pages on your website – the ones you really want to rank in the SERPs.
Another very common issue that may be flagged in your site audit report is broken internal links.
A broken internal link occurs when you move or delete a page on your website, and you do not update previous internal links with the new destination url.
As a result, those internal links point to 404 pages. Sending Google crawlers and users to a dead page is not good for SEO or for the user experience.
Broken internal links are very common with large enterprise or ecommerce websites that are constantly updating their content.
To resolve a broken internal link, take one of the following actions:
Sometimes, webmasters may not be worried about internal links because they use 301 redirects whenever they move or delete a page.
Although 301 redirects are good for SEO in terms of the links from other websites that point to your web pages, internal links with 301 redirects are not considered SEO best practice.
Why? Because redirected internal links slow down your website and force Google crawlers to move through it at a slower pace.
Whenever you move a page, a part of your website maintenance needs to be updating any internal links with the new destination url.
This shows Google crawlers that you are an attentive webmaster, and thus makes them more likely to promote your pages.
The anchor text that you use to internally link your pages is also important to your keyword rankings and your user experience.
Anchor text lets Google know what your other web pages are about and how your content interrelates, and it highlights the many valuable pieces of content that live permanently on your website.
Your website’s internal link profile is essential to optimize if you want to rank for high-value keywords in your industry.
Taking the time to audit your internal links and repair any issues can make all the difference in your ranking positions.
We all know that linking to other pages within your own website architecture, also known as internal linking, matters for ranking purposes. It helps search engine crawlers index your site, and the more you link to a page internally, the more important search engines believe that page is to your site (the better that page’s chances are of being prioritized in search).
External linking can also be helpful to your SEO and ranking. However, many companies, agencies, and small businesses are still hesitant about linking to outside sources from their own pages for fear of losing users, or losing search equity.
The fear of losing search equity demonstrates a slight misunderstanding of how links work in terms of SEO.
The Hose Myth
Many people think of links like hoses that search equity flows through. In this mental model, search equity originates from users, who bestow it on a site by visiting and engaging, and that search equity then flows to other pages and sites via links. The problem with this idea is that it treats search equity as a strictly finite commodity, leading you to believe that you "lose" search equity every time you link to another site (this is false).
A Better Mental Model
Think of a link like a recommendation. One site is recommending another site to their users by linking to that site.
Let's take that metaphor a little further with a scenario: your friend asks you for recommendations on someone to hire. Consider two outcomes: in the first, you point them to a skilled, reputable professional; in the second, you recommend someone unqualified who does shoddy work.
In the first outcome your friend probably found you very helpful, and would come to you for help again. In the second outcome, your friend probably didn't find that useful at all, and they are unlikely to return to you for help.
Search engines are similar. If they see a site link out to high-quality, reputable resources, then they feel like that site is helpful, and they’ll reward that helpfulness in search.
To summarize, here are three key reasons why outbound links work for companies of any size.
The next major barrier most businesses face to outbound linking is the concern that they’ll lose converting users to other sites.
The Lost Traffic Concern
As pointed out by Moz, it’s true that by linking to another website, you’re directing some traffic away from your own page.
Benefits Outweigh the Cost
Most sites set external links to open in a new tab, reducing the chance of the user being truly pulled away from the site. Additionally, users who are still in the research phase were unlikely to convert during that session anyway.
In your site’s overall SEO strategy, each page is an opportunity to showcase your expertise and depth of knowledge about the topic, space, or industry through your content. When you reference other authoritative sources via outbound links it builds trust for your own website with users, and sends content quality signals to search engines. In this way, outbound linking helps improve the SEO health of your website and reduces behaviors that negatively impact SEO, like u-turns and bounces.
Furthermore, posting an outbound link to a site you find valuable is also a way of extending your hand for potential partnership. This can be a solid way to start building relationships with bloggers, writers, and businesses in the same niche, location, or complementary industry. If you’re a local business, suggesting or recommending other local businesses can even help search engines recognize your page better for local search. In a way, you’re asking Google and other search engines to associate your page with that of other related sites and their SEO efforts/attributes (like location or authority).
First, only ever select resources that will provide value to your user (informational value, entertainment value, etc).
The easiest place to figure out how to rank in Google? Google itself. Complete a preliminary search of your top keywords and see what Google currently thinks is worth promoting.
If there are sites that you already know about, and want to check on their authority, you can use a Domain Authority (DA) checker. DA is a metric created by Moz that scores websites based on a scale that goes up to 100. The higher the score, the more domain authority that website holds, making it a strong candidate for an outbound link.
Another great place to start is Ahrefs: using their Site Explorer, you can check the backlink profile of any site already ranking well for your target keyword(s), or of sites already linking to a source you know is authoritative. Look for sites with a high Domain Rating (DR) as a starting point.
Bonus: Screaming Frog, a free tool, can analyze your competitor’s site and provide you a list of their outbound links.
Here are a few questions to keep in mind when choosing sites to link out to:
Going through these questions can help you pinpoint whether or not another site is a good choice for a link.
The most valuable sites to link to are those with strong domain authority. Google prioritizes high authority and high organic traffic (OT) metrics. For example, if you were writing a page for your chiropractic business about post-car-accident back injuries, linking to research from the Mayo Clinic could be valuable because you’re backing up the statements you shared on your page with a trusted source of medical information. With Bing, the kings of content are sites that end in .gov and .edu. This is because they’re often associated with government agencies, research, and universities.
If a certain topic is trending in the news, linking to a site that has less domain authority but is the primary source of coverage for this topic can help you jump on the trend while the topic is still fresh in your readers’ minds.
A wonderful chance to link to outbound content while also building on your own traction is to link to other websites that have mentioned or profiled your company or your own site. Any form of earned media, such as a mention in a reported piece, a guest blog, or an interview on someone else’s podcast allows you to benefit from the other person’s link to your website and for you to write up a recap for your own site to link to theirs.
Over time, a strategy like this signals to the search engines that you’ve “shown up” as a trusted link by many others in your niche. This is also a much more organic way of building traffic and SEO traction than outdated spammy methods like link farms or linking parties.
You’ll sometimes see sites create an entire section for news or press to highlight earned media.
When you create quality content on your own site, you're also likely to become a hub for outbound links from other people, too! Establishing your site as a worthwhile resource and home for quality content means that over time, you'll continue to post outbound links to other valuable websites. Your own site might also pick up some backlinks of its own as other people connect to your content. At that stage, link-building becomes a cycle and it's much easier to build on your own results.
One method we discussed earlier in this article is content curation: creating a page that links to all the best resources on a topic, and helps users quickly navigate those resources by providing either brief color commentary or high-level organization. An example of this would be an article like "The 10 Best Places to Visit When Traveling to Arlington, VA" or "The 20 Best Resources for Getting Started with Inbound Marketing."
Avoid two-way backlinking schemes run by private blog networks. These are sometimes referred to as "linking parties." In Google's recent updates, they have been penalizing efforts to game the system with links shared between blogs. (Google calls these practices "link schemes.")
Worried about your site? If you haven’t participated in any of the following acts, you should be fine:
The best way to get on Google’s good side is to create unique, relevant content that your audience will genuinely love. We don’t care what you’ve heard, creating good content pays off. Remember, you’re in this for the long-game.
As pointed out in a July 2019 edition of #AskGoogleWebmasters, outbound linking is fine as long as you avoid link schemes and properly qualify outbound links that appear in user-generated content or ads.
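Google supports rel attribute values for exactly these cases, telling crawlers not to treat a link as an editorial endorsement. A quick sketch with placeholder URLs:
<a href="https://example.com/product" rel="sponsored">A paid or affiliate link</a>
<a href="https://example.com/profile" rel="ugc">A link left in a user comment</a>
<a href="https://example.com/page" rel="nofollow">Any other link you don't want to vouch for</a>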
The best way to become a trusted source in your niche is to publish regular high-quality content of your own. Forming relationships with other writers and bloggers in your niche by following their content and commenting can also open the doors for future link-building opportunities.
Remember, Google is evolving all of the time. The company isn't doing this to punish you or take away your hard-earned followers. The algorithm changes to filter out spammers. Rule of thumb? Do your research. Take the tips we've laid out in this article to heart. Google doesn't play, and it will penalize your site for a variety of reasons, including joining the wrong link directory, article marketing (which is spinning the exact same article multiple times in hopes of ranking), keyword stuffing, and unnatural anchor text. (You wouldn't want unnatural text repping your brand anyway, right?)
Your SEO efforts are best spent on pages that are not yet ranking on Page 1 for target search terms. Once you’ve seen what is working on your page, flex your new SEO muscles by selecting an under-performing page and test out how you can make improvements to that page. You can then track whether you’re able to boost your rankings for that page. We have a great article with advice on creating great on-page content.
Learning how to properly use 301 redirects for SEO can make it so your website maintains keyword rankings and organic traffic even as you make changes to your content or site architecture.
The reality is, our websites are constantly changing. Good and attentive webmasters will add new and updated content over time to make sure they are providing the highest quality content and page experience to users.
As a result, redirects become necessary to make sure users and search engine crawlers can find your content. But improper use of redirects can result in lost keyword rankings, lost link equity, and a poor user experience for your website visitors.
When implemented with SEO best practices, 301 redirects shouldn’t undermine your SEO efforts, but ensure that your search visibility is maintained. Here’s a guide to 301 redirects and how to implement them correctly.
Here are all of the redirects that you might want to know about, particularly if they are mentioned in your dashboard’s site auditor report.
As a general rule, if a page is important and you want it to rank, then you should use a 301 redirect if the page is ever moved.
301 redirects are used to tell browsers and search engines that a web page has been permanently moved to a new location.
For example, https://website.com/why-anchor-text-diversity-is-good-for-your-backlink-profile redirects to https://website.com/anchor-text-diversity
301 redirects ensure that users and search engines are always directed to the most current and relevant content. A 301 redirect tells the search engine that the page has been moved, and the old page can be safely removed from the search engine’s index, while the new page should be indexed instead.
There are a few ways that 301 redirects can impact your web pages' SEO performance.
There are some common issues that occur with redirects that can impact SEO performance. It’s possible that one or more of these issues will be flagged if you run an SEO audit using the Site Auditor in your dashboard.
Broken redirects are those that point to 404 or dead pages. When this happens, users and crawlers land on a "page not found" error instead of your content.
The negative impact of a broken page for users and search engines is pretty clear, so you want to avoid sending either to a dead page at all costs.
Unfortunately, broken redirects are hard to detect without the use of a site auditor. But if you’re a webmaster for an ecommerce website with thousands of product pages that are constantly being added or old pages being deleted, broken redirects are more common than you might think.
Here are the two ways you can resolve this issue.
Although redirects are good for SEO when used sparingly, they can also harm your SEO performance if used excessively.
A redirect chain occurs whenever more than one redirect sits between the original url and the final destination url.
Google does not want to see redirect chains on your website, as they slow down your website and make it take longer for Google to crawl your website.
Redirect loops are when your redirects point to urls with other redirects, sending spiders in a loop where they never arrive at a destination page at all.
As a general rule, it should never take more than one redirect to get to a destination page.
The best way to avoid excessive redirect chains is to make sure you use SEO friendly urls from the beginning. That means optimizing your urls from the start and sticking to them even after you update the content.
However if you do need to resolve a redirect chain or loop, take the following steps.
It’s important that HTTP pages always redirect to HTTPS protocols. HTTPS provides users with a safer browsing experience and it is a confirmed ranking factor.
For more info on getting an SSL certificate and redirecting an HTTP site to HTTPs, read our detailed guide on HTTPS.
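As a rough sketch, on an Apache server you can force HTTPS site-wide with a rewrite rule in your .htaccess file (confirm the exact setup with your hosting provider, since it varies):
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]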
If you have internal links that redirect, it is likely slowing down your website and costing you valuable link equity.
The site audit report will let you know if this issue is present on any of your pages.
After adding a new version of a page or deleting a page, a part of your regular website maintenance needs to be updating all of your internal links that previously pointed to those pages to the new destination url.
This can take some time, particularly if you have a lot of web pages and are using internal links to elevate your SEO performance.
But it shows Google that you’re an active webmaster that is doing the necessary work to make your website the best place for visitors.
There should be no pages in your XML sitemap that redirect to other destination urls. You should be updating your sitemap instead with the new destination so Google crawlers are directed straight to the newest version of the page that you want indexed.
There are a few other redirect issues that might be flagged in your Site Audit report.
Redirect urls should be lowercase.
And all of the protocol variants (HTTP, HTTPS) should redirect to the same destination url.
There are many ways to implement a redirect depending on your content management system. Some CMSs, like WordPress, will automatically set up a 301 redirect when you make changes to the url path of an existing page.
There are also many plugins that you can add to your WP site to help you follow on-page SEO best practices with redirects.
But to add a redirect manually, you will need to edit your .htaccess file.
A .htaccess file is a powerful website file used by Apache web servers. It is located in the root directory of your website. The root directory may be in a folder labeled public_html, www, htdocs, or httpdocs, depending on your hosting provider.
To edit the file, all you need is the old page’s URL and the new page’s URL.
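With those two URLs in hand, a single permanent redirect can be one line in .htaccess (the paths here are hypothetical):
Redirect 301 /old-page https://website.com/new-page
The first argument is the old path on your site; the second is the full destination url.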
If you have a large website and you haven’t been thinking about redirects until recently, there may be quite a bit of technical work you need to do to get your site on track.
If you are unable to resolve the redirect issues identified in your Site Audit report on your own, reach out to our technical SEO team.
Alt text often finds its way into SEO content optimization discussions. Designed as a means to increase a site's accessibility, these seemingly inconsequential alt attributes can have an impact on your site's SEO and usability. To help you make the most of your alt text, we will cover how to write alt text to maximize SEO potential and improve your site's accessibility.
Alt text, or alternative text, is a written image description placed in the ALT attribute of an image's IMG tag in HTML code.
Also referred to as “alt attributes” or “alt descriptions,” these text descriptions provide information about the appearance and function of images on a web page should the image not load or should the user be visually impaired.
These alt attributes may be at the forefront of on-page SEO checklists. However, the impetus for alt text began in 2006 when the United Nations audited the world’s most popular websites and found very few offered equal access to the information they provided for visually impaired users. Since then this text has primarily been utilized for:
Internet users with visual impairments from blindness to color-blindness rely on alt text to gain full access to a website’s content. Screen reader users and users of other assistive technology have alt text read aloud. This provides screen reader users with a clearer picture of all the information on the page.
Using a screen reader to explore sites can provide you with a better understanding of what a user would experience should they rely on a screen reader.
If an image file cannot load, its alt text will be displayed in its absence. This can be quite useful should a user have low bandwidth or choose to turn off their browser images to save data. Just as visually impaired users rely on alt text to fill them in on the purpose and content of an image, users with slower internet connections can rely on it so they do not miss out on the image's information, for an overall better user experience.
Additionally, when alt text stands in for an image, it enriches your content and provides the reader a more well-rounded understanding of the text.
Web crawlers use NLP to read the alt text HTML to better understand what the image is, the purpose of the image, and the context of the image for better indexing and better image search results.
This gives the crawler a better understanding of your web page and gives your image the opportunity to appear in a Google image search.
Ironically, understanding how to create good alt text often requires a show-don’t-tell approach. So, here are some examples of images with their alt tag texts:
alt="Beagle standing in a frosty field on a cold morning."
Here's what it looks like in the HTML (the file name below is hypothetical):
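<img src="beagle-frost.jpg" alt="Beagle standing in a frosty field on a cold morning.">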
alt="Dignity of Earth and Sky Statue"
alt="Clear evidence: Atlantic currents carry the Gulf Stream"
If you want to find out whether there is alt text on a web page, you can use an alt text tester to check.
Most CMSs will format your alt text into HTML for you. However, to implement alt text manually, you can add the alt attribute to your IMG tag like so:
<img src="file" alt="add text" width="" height="">
Writing good alt text doesn’t require expertise in creative writing or coding. It does require that you look at images through a new lens, though.
One way to do this is to imagine you’re describing the picture to someone over the phone. As you do so, keep in mind whether or not your listener would benefit from an explanation of the image’s purpose.
More descriptive alt text provides users with a better understanding of the image. As you construct your descriptive alt text, include what makes the image important and unique, and how it enriches the text.
We can all agree that representation matters. Screen reader users also want to know when a brand is inclusive in its imagery. So, be sure to include gender and ethnicity when it’s relevant within your descriptions.
When an image needs that much description, it's better served by the caption tag or long description tag instead.
The best alt text is a phrase or two at most, about a single line of text. When constructing your alt text, consider what's a given, what the informational priorities are, and how the image informs the webpage content. Reduce redundancy by omitting anything already included in the content.
Again, considering the purpose of the image and article for context is key.
Keep in mind that alt text is not a caption. If you need to provide source credit or a source citation, use a caption for that information.
If your target keyword is evident in the image, include it in your alt text. As we pointed out, web crawlers will read these attributes to gain a better understanding of your content.
Keep in mind that long-tail keywords are easier to rank for, even when it comes to image searches.
For example, instead of ranking for “whale shark,” you could try to rank for “whale shark with its mouth open.”
Keyword stuffing is never a good idea. Especially when it leads the user astray as to what the image depicts. Always aim for appropriate and informative alt text that will substitute meaning in place of images when required.
Additionally, keep in mind that Google NLP is great at figuring out semantic relationships between words, so if your image is related to your target keyword, your alt text should be, too, and the result should read as a natural signal to Google's indexing system.
For example, consider an image whose alt text mentions "heavy duty dog chew toys." Google can display that image for search queries like "dog toys for heavy chewers," which is semantically related to the original phrase.
Bad alt text = keyword stuffing: alt="custom dog tag, custom dog ID tag, customized dog ID."
One mistake many people make is to include "photo of," "picture of," or "image of" in their alt text. This is not needed. The IMG tag already indicates that it's an image, so these phrases just add unnecessary verbiage and redundancy.
There are times when an image benefits from a longer description within the alt text resulting in a better user experience. For example, an infographic that is not accompanied by a blog doesn’t add value unless explained clearly.
For these instances, you will want to use the longdesc="" attribute.
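The attribute takes a URL pointing to a page containing the longer description. A hypothetical sketch:
<img src="survey-results-infographic.png" alt="Infographic summarizing the survey results" longdesc="survey-results-description.html">
Note that longdesc has been dropped from the current HTML standard and support varies, so a visible caption or an adjacent text summary is a safer fallback.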
Buttons are often images with text embedded. These fall under the category of images of text, which means you need to let your user know what they say in order for them to be useful.
Provide your user with an accessible alternative for buttons with:
<input type="image" src="" name="" height="" width="" alt="text on button">
Typos and misspellings can hinder a screen reader's ability to correctly convey the meaning of your image, so proofread your alt text. Additionally, typos in your alt attributes can become an image SEO disaster if left unchecked.
While you don’t need to point out that you are describing an image, you may want to mention if the type of image is unique. Some image forms you may want to mention include:
Writing effective alt text will become second nature over time. However, knowing when to use image alt text, when to skip it, and other image best practices can also improve your site’s SEO and accessibility.
What not to do:
It can be tempting to add a screenshot, PNG, or JPEG of text. However, this text will never be read by web crawlers. Additionally, because you do not want to exclude the visually impaired from the information in an image, you will want to type that image’s text into the alt text tag.
Decorative images do not need to include alt text. This is because the content of the image doesn’t add to the meaning of the webpage’s content. However, you should include an empty or null alt attribute in your HTML. This null alt text will signal to the screen reader to not read a description of the image.
You can write a null alt attribute as: alt="" (an empty value, with nothing between the quotes)
You may also want to use a null alt attribute with an image that is a link with a text version beside it.
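A minimal sketch of both cases, with hypothetical file names:
<img src="divider-flourish.png" alt="">
<a href="/menu"><img src="menu-icon.png" alt="">Menu</a>
In the second line, the adjacent link text already says "Menu," so a screen reader would only repeat itself if the icon had its own description.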
Videos don't need alt text, but you will want to include a transcription of the video for hearing impaired users, those that speak other languages, and viewers who cannot play the video with audio on.
To read the alt text of an image, all you need to do is right click the image and select “Inspect” or “Inspect Element.” This will open the HTML and CSS element inspection tool. On a Mac, you can also use Control + click.
You can also use an accessibility checker for accessibility issues.
When it comes to providing thoughtful alt text, consider the purpose of the image. This also presents you with a few more opportunities to use your target keywords.
For example, if the purpose of a blog is to compare the quality of dog treats, your keyword is premium dog food, and your image is two bowls of dog food for comparison, you can use the alt text, “a bowl of premium dog food beside a bowl of lower quality for comparison purposes.” This allows you to smoothly integrate your keyword without stuffing.
No. Image captions are visible to site users whenever the image loads, while alt text only lives in your HTML. The purpose of a caption is to provide copyright information or an explanation that is needed to understand the content of the image.
Adding image alt text in WordPress is simple. When you upload an image, you can add your image alt text before inserting it into the page. Some versions of WordPress include the image alt attributes menu beside the image thumbnails. Others include the menu at the bottom of the thumbnail screen.
When it comes to alt text, there are varying levels of quality. You can settle for decent alt text or you can strive to provide the best alt text for your users and SEO. Here are some examples of basic alt text models:
Bad: alt="dog"
Better: alt="brown dog with leash"
Best: alt="Tan poodle happily playing in the grass with its leash still attached"
Bad: alt="people with books"
Better: alt="mother and son doing homework"
Best: alt="illustration of a black mother helping her son with his homework to demonstrate the power of involved parents"
Bad: alt="picture of a cup, napkin, and pen"
Better: alt="a blue coffee mug next to a napkin with writing and pen"
Best: alt="a blue coffee mug with coffee to the left, sitting on a wooden table with a pen opposite and a napkin between with the words set goals, not limits"
It can be easy to skip or rush constructing your images' alt text. But doing so would be a disservice to your webpage visitors and your SEO. We urge you to think of your alt text as a way to improve every webpage. Optimizing your images for search engines includes providing web crawlers with context through alt text. Additionally, many people rely on alt text to fully understand and interact with your website. Alt text increases accessibility by taking the place of the image should it fail to load or should a user have visual or cognitive disabilities.
As of 2020, over 58% of site visits come from mobile search traffic. If you aren't taking mobile into account heavily enough, it's likely hurting your business.
The use of mobile devices is rapidly changing the way customers are searching, engaging, and buying. Consumers have access to faster Internet while they’re on-the-go. That means Internet traffic is increasing through mobile devices. Beyond social engagement and consuming content, they’re also making buying decisions.
According to Morgan Stanley, 91% of adults keep their smartphones within arm’s reach. That’s ninety-one percent of ALL adults, and it’s shifting both business culture and research practices. Rather than dedicating time to research a topic, users now perform micro-searches on the go, and then follow-up on those initially discovered options or solutions later on.
How big is this trend? An IDG Global Solutions survey found 92% of senior execs own a smartphone used for business, 77% of those research business purchases from their mobile device with 95% then finalizing related purchases via laptop/desktop. That’s a huge portion of the B2B purchase pool starting their journey from mobile. Missing a user during their initial mobile-based exploration may mean your business is losing out on a huge portion of the market.
This trend is even more compounded for local businesses, as 58% of mobile users search for local businesses daily. What's more? 89% of those users search for a local business at least once per month. We also learn from HubSpot that, when consumers do a local search, 72% of them visit a store within five miles. What does this mean for businesses with an Internet presence? It's time to make it mobile-friendly.
Websites now need to be responsively designed so they can serve mobile users just as well as desktop users. Responsive design is a design that adapts to the size of the user’s viewport (i.e. screen), by changing font sizes, adjusting images, and even collapsing page elements to make navigation simpler. Responsive websites that follow modern design standards help users access and understand the information they need more quickly.
Additionally, users now view responsive functionality as a trust signal. A study conducted by socPub indicates that 57% of Internet users will not recommend a business that has a poorly designed mobile site.
Because mobile users comprise an increasing number of searches and site visits, they now represent the largest source of traffic in a slew of markets (new industry segments falling into this bucket each month). Our clients regularly pick up market share with simple mobile-friendly design updates, especially within industries that are traditionally late-adopters.
Your site is now your storefront. If your site looks terrible or functions poorly, users will leave instead of working to get at your information – it costs a user nothing to click the next result in search.
Google has switched over to mobile-first indexing, which means Google primarily uses the mobile version of a page for indexing and ranking. Even if your target consumers aren't heavy mobile users yet, your site still needs to be mobile-optimized if you want to show up higher in the search results (even for desktop-based searches).
With mobile devices rapidly changing the way consumers access information your offsite optimizations are also becoming critical. For example most users performing local searches never go past the search results themselves (aka they don’t actually click into websites anymore). Local search users are typically able to surface the information they want directly within the search results through features like the local Map Pack.
The first step toward reaching mobile users is having a mobile-friendly website. Currently, in 2021, responsive web design is the best design approach for mobile-friendliness.
Responsive design is an approach for creating web pages where layouts and content dynamically adapt to the size and orientation of the screen or viewport being used.
In a typical responsive layout, the desktop version of a site displays text and video side-by-side, while the mobile version stacks those same elements.
This responsive theme adjusts to the width of different devices from smartphones to tablets, even large wide-screen viewports, by rearranging and resizing the design elements.
There have been a few ways to handle mobile sites since the invention of smartphones, the first two mobile design waves were plagued with usability issues, and hard to maintain. Let’s take a look at what didn’t work, and why you should consider migrating to a responsive design if you’re still employing one of these outdated mobile design tactics.
The first wave of design involved creating a different site entirely to serve as the mobile site. This approach involved serving a mobile version of the site using a different URL, a mobile URL. For those of you who have been around long enough, you may remember pages you visited from a mobile device redirecting from domain.com to m.domain.com.
This approach required setting up canonical tags for every page, as each mobile web page contained content duplicative to the desktop page. This approach also split the search equity for each page as desktop users interacted with the desktop site, and mobile users interacted with the mobile website.
When users shared pages from the site, creating backlinks, those links were split between the mobile subdomain and the regular site domain because separate URLs were being served to each user group. It also meant that every time an edit was made to content on the desktop site, a second round of edits had to be made on the separate mobile site. Mobile pages under this paradigm often provided a worse user experience, as they typically served less content than the full desktop site did for desktop users.
The next wave of design consolidated pages under a single URL, but dynamically served cached pages based on the user's device type, signaled with the Vary HTTP response header.
This iteration of mobile design allowed sites to consolidate search equity between their desktop site and mobile site. It also did away with the need for canonical tags on virtually every site page.
However, it meant that every time a device came out with new dimensions, a new instance of the site had to be spun up, formatted, and tested before it could be served to users. This system became increasingly difficult to maintain as the market diversified and the dimensions of mobile screens became rapidly non-standard. Dynamically serving a mobile version of your site was plagued with issues, including repeated instances of the desktop version being served to mobile users.
Responsive design consolidates the mobile version of a webpage and the desktop version of a webpage under a single URL. It also serves the same instance of code, regardless of the size of the mobile screen or desktop viewport.
This allows site owners to combine their desktop SEO and Mobile SEO efforts, employing a single set of SEO best practices and strategies. Responsive design is easier to maintain as you don’t have to manage different content or code for a single page.
Instead all elements fluidly rearrange to suit mobile visitors and desktop visitors as needed. If a user switches from full screen to half-screen with their browser, the design elements will shift accordingly so the user experience is largely unchanged.
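Under the hood, responsive design typically pairs a viewport meta tag with CSS media queries. A minimal sketch (the class name and 768px breakpoint are arbitrary examples):
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  .hero { display: flex; }  /* wide viewports: text and video sit side-by-side */
  @media (max-width: 768px) {
    .hero { display: block; }  /* narrow viewports: the same elements stack */
  }
</style>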
In July 2019, there were over 1.69 billion more mobile searches than desktop searches performed in the US alone. Search itself has become mobile-first. The first place you’ll start when checking your site for mobile optimization is checking out how Google views your site.
Google holds over 90% of the market share for mobile search traffic in the U.S., because Google has spent years optimizing search specifically for mobile users. Many of Google’s search results are so well optimized, that mobile users don’t even need to click into an actual result to find the information they need.
Rich snippets and rich results now display enough information for users to take action based off of the search results alone, from finding movie times to the addresses of local businesses, to how to troubleshoot tech problems.
How did Google get so far ahead of the competition with mobile search? They started testing and prioritizing mobile features years ago, and as mobile search volume overcame desktop search volume, Google shifted to prioritizing mobile users over desktop users.
In 2015 Google rolled out mobile-friendly search results, serving a separate set of search results to mobile devices. This update, often called Mobilegeddon, prioritized mobile-friendly websites in the search results.
In 2016 Google began to experiment with mobile-first indexing, cataloging the mobile version of page content, rather than the desktop version.
In March of 2018 Google formally began rolling out mobile-first indexing, migrating over to the mobile version of pages for sites that it had already indexed as desktop versions. To quote Google themselves, "Mobile-first indexing means that we'll use the mobile version of the page for indexing and ranking, to better help our – primarily mobile – users find what they're looking for." Essentially the entire index is going mobile-first. This process of migrating over to indexing the mobile version of websites is still underway. Websites are notified in Search Console when they've been migrated under Google's mobile-first index.
In July of 2018 Google rolled out page speed as a mobile ranking factor, ranking sites with slow load times lower in the search results.
Over the past decade Google has also continually rolled out additional data-rich mobile-first search features from movie times, to reviews, to product images. Google often pivots when rolling out new features, as it continually tests and then prioritizes what works best for serving users the most valuable information.
For example, Google originally published a guide helping webmasters create separate mobile sites under the m.domain.com URL – a tacit approval of the process, only to pivot within a year to formally recommending responsive design under a single unified URL.
Similarly, the AMP (accelerated mobile pages) standard has been pushed heavily in the past few years. AMP pages, which load in a fraction of the time of normal pages, seem to be struggling with many of the issues that m.domain.com mobile pages had back in the day.
Sites using AMP pages are often managing two sets of page content, with one set slimmed down to meet the AMP standard. There are also challenges with AMP pages being served from a Google URL rather than the site’s own domain. While Google recently addressed some of these concerns with signed exchanges, it’s still causing questions around whether link equity is being split between the AMP viewer URL, the original AMP source, and the AMP cache URL.
Trends that are here to stay? Responsive design, quality content that gets right to the point, making sites as fast as humanly possible.
So what should you pay the most attention to in terms of Mobile optimization? If you already have a website, start with Google’s Mobile Friendly Test. This tool will give you an aggregate rating for whether or not Google thinks your site is mobile-friendly. The tool will also prompt you to view a full usability report in Google Search Console.
If you want to access this report on your own directly from Search Console, login to your account for the domain, and use the left-hand navigation to click into “mobile usability” under Enhancements.
Here you will find a list of the mobile issues that Google has detected on your site. Examples include text being too small to read, clickable elements being too close together, content being wider than the screen, etc.
Click into any of these issues, and you’ll see more granular information to help you improve your mobile SEO, such as the pages where the errors are found. You’ll also see a space to validate that the error has been fixed once you make adjustments to your site.
These are errors Google is specifically recognizing and calling out for your site. From a search rankings perspective, these should be at the top of your list to fix.
Google can’t serve pages in the search results that it can’t see. Make sure that Google is indexing your pages for search.
Check your robots.txt file, and make sure that it’s not blocking Googlebot. Your robots.txt file can be used to block certain types of bots and crawlers, but if you’re trying to rank highly in the SERPs, Googlebot should not be one of them.
To check if your robots.txt file is blocking Googlebot, you can either use a free robots.txt tester, or use the link inspection feature in the search console.
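For reference, the difference between locking Googlebot out and letting it crawl everything comes down to one character in robots.txt:
User-agent: Googlebot
Disallow: /
The rule above blocks Googlebot from the entire site. Remove the slash, leaving Disallow: with an empty value, and Googlebot is allowed to crawl everything.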
A few years ago you could check blocked resources straight from Google Search Console in a consolidated view, but as these issues became less prevalent Google dropped the aggregate view. Secondary tools like Screaming Frog can still give you a full list of NOINDEX and NOFOLLOW pages from your site. Alternatively, you can check the status of individual links straight from Search Console using the URL inspection tool.
This tool also allows you to manually submit links and request indexing of new pages, revised pages, and pages that crawlers have yet to discover.
Now that you’ve resolved a majority of the technical usability issues, it’s a good idea to check for issues mobile users face that may not have been caught by Google.
Start by taking a look at how your site appears on different devices. A free device emulator will let you select from a variety of mobile devices and desktops to give you a full sense of how your site looks on each.
You should quickly be able to see any major issues with formatting that could be hindering the mobile user experience, or making your site look unprofessional. Examples include poorly formatted text, grainy or stretched images, or overlapping page elements.
Work with your webmaster or web development team to clean up any design elements that aren’t displaying well on mobile. Once your site layout is mobile optimized, you’ll want to check that your site is compelling to mobile searchers on the Google search results page.
Users only click into a site from search if the rich snippet, page title, and/or meta description are compelling. The title tag for your page needs to front-load your target keyword(s), and your meta description should include the most pertinent information about your page first.
Page titles can be very similar between pages, so meta descriptions can often make the difference for which result or results site visitors click.
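Both tags live in the page's <head>. A sketch for our hypothetical Italian restaurant:
<title>Italian Catering in Arlington, VA | Example Ristorante</title>
<meta name="description" content="Full-service Italian catering for weddings and corporate events in Arlington. Browse our menu and request a quote online.">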
Also keep in mind that rich snippets can provide even less space for title tags and meta descriptions. In the example below you can see how each result only displays about 3-4 words from the page title.
If you use a major platform like WordPress there are SEO plugins that will help you manage your title tags and meta tags. If your site is custom, you may need to edit this information directly in the HTML code.
If you’re seeing a good amount of organic traffic from your target keywords, the next step is to make sure that traffic is actually seeing your mobile optimized content.
Over half of mobile searchers will abandon a page that takes longer than three seconds to load. Separately, for every additional second it takes a page to load, conversions fall by 12%.
To check your mobile page speed use Google’s PageSpeed Insights Tool, and see how quickly your site loads on a 4G connection. This tool will give you a granular breakdown of all speed issues you can address to improve your site speed.
Most major website platforms (Wordpress, Squarespace, Wix, etc) will have native features and plugins that will automatically optimize image files for mobile devices to reduce page load times.
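If your platform doesn't handle this automatically, the HTML srcset attribute lets the browser choose an appropriately sized file on its own (the file names and widths below are hypothetical):
<img src="pasta-dish-800.jpg" srcset="pasta-dish-400.jpg 400w, pasta-dish-800.jpg 800w, pasta-dish-1600.jpg 1600w" sizes="(max-width: 600px) 100vw, 800px" alt="Plated pasta dish from our catering menu">
A phone on a 4G connection can download the 400-pixel version instead of the full 1600-pixel file, cutting load time.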
Bounce rates are a great indicator that a page is not providing value to users. If you see bounce rates are much higher on specific pages for mobile users than for desktop users this is a sign that the page may have some issues with either mobile formatting, mobile load times, or that the relevant content may take too long to scroll to on mobile.
To check bounce rates, simply login to your Google Analytics dashboard. You’ll be able to view aggregate bounce rates for your site, bounce rates by page, and track how bounce rates change as you make adjustments to webpage content.
Intrusive and poorly designed pop ups can increase your bounce rates on mobile and tablet devices. Intrusive pop ups can also hurt your organic search rankings, especially with Google: an update Google rolled out in 2016 devalues mobile pages that have intrusive pop ups, lowering those pages' rankings in the search results.
There are two major pop up issues that can inflate bounce rates and get a page devalued in the SERPs. First, pop ups that have not been optimized for mobile traffic can be impossible to close on small screens, causing mobile searchers to bounce from your site. Second, pop ups that prevent a user from accessing content on page load will hurt your mobile SEO, because Google considers pop ups that block site visitors from content to be "intrusive."
Examples of intrusive pop-ups and interstitials:
That doesn’t mean you should abandon popups entirely. Used correctly, and designed with mobile UX in mind, pop ups can help improve your conversion rate. These pop ups are ones that help the mobile user along their journey, are contextually relevant to the content, or are a legal requirement. Pop ups that appear as a user is looking to complete the next step in their journey are generally fine as well.
Examples of pop-ups and interstitials that are okay:
A report issued by PwC states that, compared to conducting a traditional search, 71 percent of respondents prefer voice searching. Now that we know users prefer voice search, let’s look at how we can optimize our websites to reach them.
Building a mobile-friendly website means thinking through the customer's journey. Ask yourself these three questions:
Your main navigation should help users quickly and easily get what they want from your site, without a user needing to use site search or “click around.” Once you have a handle on your audience segmentation and goals, you should confirm that your users are not facing any major barriers along each journey.
There are a few ways to do that; here are two:
Your marketing shouldn't be only about what devices your potential customer is using; it should be about the journey they're taking. What are their lifestyles, habits, and device preferences? Conduct research, surveys, and interviews with your current audience. This is an excellent opportunity to develop a relationship with your existing customer base. Offer incentives and prizes to those who choose to participate.
Designing websites for mobile users means working with drastically less screen real estate, so minimalism is critical. The last thing a user wants to do is scroll through or resize your pages. According to a scrolling and attention study conducted by the Nielsen Norman Group, 74 percent of users' viewing time is spent on the first two screens of content. Responsive design is therefore the solution. You can accomplish this in a variety of ways, including:
Pro-Tip: For mobile users, one often overlooked detail is that tap areas need to be large enough for users to hit interactive elements (links, buttons, drop-downs) with precision.
For local businesses:
For all businesses:
Mobile search remains the leader because of the convenience of searching from a device that is always at hand. Your audience is busy, on the go, and living in a digitally driven world, so their mobile queries will continue to rise. Even though mobile searches are similar to desktop searches, your site must be optimized for how your audience actually visits it. Your site should be easy to use and support your customer's journey, and a mobile-friendly design that responds to the volume of mobile searches you receive should be your goal.
There are times when a PDF file is the type of content that brings the most value to your audience. And just like with your traditional HTML pages, SEO for PDFs can help them earn keyword rankings so your target audience can discover them through organic search.
Although PDFs are not always the greatest format for SEO, Google does index them, and sometimes ranks them, meaning your executive reports, white papers, survey results, and other PDF content should follow SEO best practices and be optimized for search.
PDF files are treated like regular pages by search engine crawlers. Google converts PDFs into HTML, and then indexes the HTML version of the page.
As Google has put it: "FWIW we convert PDFs & other similar document types into HTML for indexing too, so theoretically there wouldn't be too much difference."
In the SERPs, a PDF label will be visible next to the title of the result. This lets users know that they will be directed to a PDF file rather than a traditional web page.
Because Google does treat PDF files like regular web pages, that means all of the same on-page SEO best practices still apply.
Just like Google looks to file names of your images to understand their relevance to your web page content, your PDF file name should communicate to Google what your content is about.
Best practices for optimized file names include keeping the name short and descriptive, separating words with hyphens, including your target keyword, and avoiding generic names like "document1.pdf."
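For example, a white paper on mobile SEO would be better saved as mobile-seo-best-practices.pdf than as whitepaper_FINAL_v3.pdf (both file names here are hypothetical).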
SEO Meta Tags like the title and meta description of your PDF will be visible to users in the SERPs. They will be used by Google to understand the primary topic of your PDF.
Including your target keywords in these elements of your PDF, and those with Keyword Difficulty scores that are achievable for your website, will help improve your ranking potential.
If you are using Adobe Acrobat Pro, you can edit the title of the PDF via the Document Properties dialog (File > Properties), entering your title in the Title field.
Similarly, you can edit your meta description from the same dialog by updating the Description field.
Just like you use heading tags to help users navigate your web pages (and search engines understand the topical depth of your content), PDFs should also leverage headings.
Most likely, your white paper, report, or PDF document has some natural structure separated by headings. Make sure that you take the time to specify h1-h6 tags in Adobe Acrobat.
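As a sketch, the tagged heading structure of a report-style PDF might mirror the hierarchy you would use on a web page (the section names here are hypothetical):

<h1>2022 Industry Survey Report</h1>
  <h2>Key Findings</h2>
  <h2>Methodology</h2>
    <h3>Survey Demographics</h3>
  <h2>Conclusions</h2>

One h1 for the document title, with h2 and h3 tags nesting logically beneath it, gives both users and crawlers a clean outline.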
Having internal links will help direct users to other relevant pages on your website.
They also signal to Google the range of content that your website offers and what other topic areas you cover.
Google will follow the links in your PDFs, meaning you can use them to spread link equity to other pages.
Google is going to struggle to crawl and render the content of your PDF if it is saved as an image rather than as selectable text. Image-only PDFs also make it harder for users who want to highlight or copy text.
The Site Auditor will flag an issue if a web page is linking to a .doc file instead of a .pdf.
Linking to .doc and .docx files is not an SEO best practice, as they are far less reliably indexed by Google than PDFs, meaning missed opportunities to get your content in front of more audiences.
And because .doc files don't open universally in the browser the way PDFs do, linking to PDFs instead provides a better experience for website visitors.
In general, a web page is more likely to get in front of more audiences because of how easily it can be understood and indexed by search engines.
PDFs are generally seen as not great for SEO. But when you do have PDF content, you should make the most of it!
So the short answer is don’t make PDFs a big part of your SEO content strategy. But for content like annual reports and white papers that are in PDF form, optimize them!
In their everlasting quest to provide users with the best results for search queries, Google added Page Experience metrics to their ranking algorithms. The Google Page Experience Update made it so factors such as mobile-friendliness, web safety, interstitials, and a site’s overall UI/UX are officially ranking factors. The Page Experience Update rollout started in early June 2021 and ended on September 2nd. It was the first update to heavily focus on a user’s experience within each part of a web page.
Google’s motivation behind the update was to improve the overall search experience through the websites they promote in Google search. As a result, websites that prioritize creating a high-quality and engaging page experience saw an improvement in their overall rankings. Those that didn’t adapt, well, they dropped in their keyword rankings.
If you’re not sure whether your web pages provide a high-quality page experience for users, this article is made for you. Our guide will walk you through how websites that have maintained their search visibility have responded to the Page Experience Update. Then, you can replicate their strategy in your own website for improved SEO performance.
The Page Experience Update truly shook up the SEO world in 2021. Why? This update added a new layer to how SEO experts prioritize the usability of websites. As a result of the update, Google is not only focused on promoting relevant pages, but those that provide faster loading, less element shifting, and improved responsiveness. The value of a web page is not only in its relevance, but in how it performs for the user, and most experts agree this update is a change for the better.
This is not the first update Google has launched to its algorithms. Google has a long and varied history of updating its algorithm; in 2018 alone, it launched over 3,000 updates to how the search engine produces results. These updates range from large to small, and they usually include changes to indexation, data, search UIs, webmaster tools, and ranking factors.
All of these updates play into the many algorithms that power every search. Google uses algorithms to help fulfill a specific function, grouped into one larger, core algorithm. Sound complex? We promise it’s not. Here is a breakdown of the different types of ranking factors used by Google:
Simply put, user experience is the study of how users interact with your website. User experience targets potential users at all steps of their journey and helps you get into your customer’s minds before they come to your website, during their time on the site, and after they leave.
For many business owners, a good user experience equates to a pretty website. While it is always a good idea to have an aesthetically pleasing website, a few pretty graphics won't cause your customers to convert. Instead, your website's interface needs to be optimized with your consumers in mind.
Here are some user experience statistics that drive home the sheer importance of creating a good page experience:
When it comes to your website, there are likely hundreds, if not thousands, of competitors offering products and services similar to yours. With this in mind, you can’t risk that your potential customer’s first impression of you is impacted by low-quality UX. Staying on top of user experience trends and best practices has always been important to earning new customers, but it will now be essential to showing up in search results.
Unlike with many of its algorithm updates, Google released a lot of information and tools to help site owners prepare for and respond to this update, which is now considered one of its largest.
Due to trade secrets and proprietary information, Google only released some information about their updated algorithms. But as 2021 unfolded, web developers and SEO experts inferred how to make optimizations to best match the new ranking factors.
Luckily, we’ve done the heavy lifting for you by outlining the key information you need to know to ensure your website provides the kind of page experience that will be most valued by Google.
Google released Core Web Vitals, a set of metrics that measure a website's speed and loading time, responsiveness, interactivity, and visual stability. These metrics were released in May, fully functional by June, and remain the foundation of the 2021 algorithm release.
The Core Web Vitals include three benchmarks: Largest Contentful Paint (LCP), First Input Delay (FID), and Cumulative Layout Shift (CLS), which together help site owners measure a website's holistic user experience.
While we know that these new measures are subject to change and can still evolve, since June of 2021, they have remained consistent. Here’s the breakdown of the three basic metrics:
Largest Contentful Paint reports the render time of the largest image or block of text visible within a web page’s viewport. Simply put, it relates to the time it takes for your webpage to load the biggest piece of content on a page. An ideal LCP would be within 2.5 seconds of loading the page.
First input delay measures a consumer’s first impression of your website’s interactivity and responsiveness. It does so by monitoring how long it takes from when a user first interacts with a web page (i.e., clicking on a button) to how long it takes for the browser to respond to that action. Think of it as how long it takes for a user to press a button and for that information to appear. An ideal FID is under 100 milliseconds.
Have you ever been scrolling on a website and are just about to click on a button, when the layout moves and you are all of a sudden in a different portion of the page? That is a layout shift, and if your website has a lot of them, it can hamper your user experience. Cumulative layout shift measures the combined effect of this movement on one webpage.
Visual stability is exactly that—how stable the webpage is when it loads—and if the page stays steady throughout a consumer’s scroll. CLS measures how many times a user experiences unexpected layout shifts, with the ideal metric for this being less than 0.1.
As a best practice, Google evaluates each of these metrics at the 75th percentile of page loads, so aim to have at least 75% of visits to a page hit the recommended targets. It is important to understand that these Core Web Vitals are user-centered metrics that provide real-world data about how users actually experience your website.
A better page experience leads to deeper engagement and allows consumers to get more done. There are already existing page experience metrics that Google uses to help webmasters monitor their performance, including:
Mobile Friendliness: More searches now happen on mobile than on desktop, so your website should perform on mobile phones at the same level it does on desktops. This signal now factors more heavily into SEO.
Safe Browsing: This metric ensures the security and safety of your website, verifying there is not any harmful content on it.
HTTPS Security: Serving your website over HTTPS means connections to it are encrypted, so users' information isn't at risk of being intercepted.
Intrusive Interstitial Guidelines: Many websites have a ton of intrusive pop-ups that get in the way of a user finding the information they need. Because of this, Google has created a set of guidelines on how to include pop-ups on a webpage without severely hampering the user’s experience as a whole.
All this information on search engine functionality and algorithms may sound complicated, but don't worry. There are many easy steps anyone can take to prepare their website for the most important aspects of the Page Experience Update.
There are plenty of free tools available to you that will allow you to monitor these new ranking factors on your website. Using them to consistently monitor your own website will not only help your user experience metrics soar but bring more potential customers to convert. A few examples include:
The web-vitals JavaScript library: This tool measures all of the Core Web Vitals in JavaScript using standard browser APIs (see the sketch below).
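For developers, a minimal sketch of using the library looks like the snippet below. This assumes version 3 of the open-source web-vitals package loaded from the unpkg CDN; in production you would likely send the values to an analytics endpoint rather than the console.

<script type="module">
  // Load the Core Web Vitals helpers from a CDN (assumes web-vitals v3).
  import {onCLS, onFID, onLCP} from 'https://unpkg.com/web-vitals@3?module';

  // Log each metric as it becomes available during the page's lifetime.
  onCLS(console.log);
  onFID(console.log);
  onLCP(console.log);
</script>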
If you have both a smartphone and computer, then you likely know the way in which different devices load pages differently, both in terms of visuals and page speed. There are some tools that can help you audit your website without having to purchase a truckload of devices.
Google’s PageSpeed Insights (PSI) tool lets you know how well your website performs for both desktop and mobile browsers. It also provides detailed information that can be used to deliver a faster user experience. If you find that your PSI is scoring less than ideal (anywhere below a 90), then you’ll want to take some measures to boost your page speed. Here are some ideas to consider:
Implement Accelerated Mobile Pages (AMP): Originally used for news sites, AMP pages are essentially stripped-down versions of existing pages that can load up more quickly on mobile devices. While not necessary for pages loading optimally, AMP can be a boon to pages that are currently lagging. It’s likely you’ve already encountered AMP on your phone, noted by the little, encircled lightning bolt in the page’s corner.
4. Have a Benchmark
It is of the utmost importance to understand where your website stands before you make changes. We all know that having the top spot in the search engine result pages is our top goal, but, if anything, the rollout of this new algorithm means that it is time to shift focus to include a user’s experience.
So you need to test, test, and test! Use the free tools above on each page of your site and move slowly. Take note of what is working and what isn't so you're best prepared. This way, whenever you make changes, you'll be able to track your results easily and won't be sidelined by Google's next algorithm update.
Your website is nothing if not a place for your potential customers to gain information, so be sure to optimize your content, one of the most important Google search ranking factors.
The SEO Content Assistant is the best way to improve your on-page content (you can access it by setting up a free account). Using this tool, you can target up to five keywords and take immediate steps to give your content more topical-depth and authority.
But you can't just put content on your page without any organization; this is where header tags come in. Proper use of the title tag and header tags will not only segment your information into easily digestible chunks, it will also make your page easier for Google to crawl and index. The SEO Content Assistant will let you know which focus terms should appear in headings.
These subheadings do double duty. They’re also a great way to optimize your target keywords, as the more prominent they are on your page and your URL, the more Google will believe the information you are creating is valuable content.
Yes, it is important to have original written content on your website. However, it is much more important to diversify the types of content you use. Images are a significant ranking factor, in addition to how they engage the searcher and create a great page experience. Plus, you cannot appear in a Google search for images if you do not have optimized images.
The easiest way to use images is to put them at the top of the page, grabbing the user's attention as soon as they land on a specific page; just keep the Largest Contentful Paint metric in mind. Reduce image load times by compressing your files, and incorporate relevant keywords in the alt text where appropriate, so that visually impaired visitors, and anyone whose browser fails to load the image, can still understand what the photo shows.
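As a sketch of these recommendations in one place (the file name and dimensions are hypothetical):

<img src="weekday-brunch-menu.jpg"
     alt="Table set with pancakes and coffee for a weekday brunch"
     width="800" height="533" loading="lazy">

The explicit width and height let the browser reserve space before the image loads, protecting your Cumulative Layout Shift score, and loading="lazy" defers offscreen images; just avoid lazy-loading the hero image that counts as your Largest Contentful Paint.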
What did we learn from this rollout? Details and milliseconds matter, and updating your website in response to Google's Page Experience Update is a win/win for you and your web visitors. They receive a better user experience, and your website sends positive signals to Google's web crawlers.
Site owners who focus their efforts on following proper user experience best practices have sailed through the update without major negative impacts to their overall search visibility.
So, get started with amping up your website’s mobile friendliness, responsiveness, and other fixes for a great page experience.
It’s essential to closely monitor your website, even long after new metric rollouts. Be sure to keep tabs on ranking changes. It can take weeks (and sometimes months) for Google to register changes to a page and change your ranking for a Google Search, so you’ll want to check up on your GSC Insights reports.
What are users actually searching for when they type in a keyword into Google? This is the question that millions of marketers are trying to answer when they create a piece of content that they want to rank in the SERPs.
Similarly, Google also wants to understand what users want. We are all trying to understand “search intent,” or what a user is ACTUALLY looking for when they type something into the Google search bar.
Search intent optimization is the effort to create content that satisfies what users are actually looking for when they type a specific search term into Google. This definitive guide provides an in-depth look into what search intent optimization is, the different types of search intent, and how to use search intent to its full potential.
Keep reading to get an inside look into how to optimize your website for search intent.
Search intent is the true intention behind the user's search query. Providing content that matches the user's search intent helps the user find the exact content they are looking for quickly and easily, and leaves them with a "satisfying" experience. Satisfying search intent is considered one of the highest quality indicators by Google.
For example, let’s say a user types “brunch” into the Google search bar. What information are they actually looking for? Places in their neighborhood to get brunch? The history of brunch? Brunch recipes?
In this example, Google doesn’t seem to have a complete understanding of what the user’s intent is with this term. As a result, they provide SERP results that can satisfy all of those various intentions.
In contrast, if a user types the search term, “brunch places near me open during weekdays,” Google has a much better understanding of what the user is looking for. They show the user a list of restaurants in the Map Pack that are open on weekdays. In addition, the top-ranking web page result is a complete list of all of the weekday brunch restaurants in the searcher’s location.
In short, understanding search intent is important to both Google, which wants to give the user the best search experience, and marketers, who want their content to rank on page 1.
Figuring out what a user wants, and providing them EXACTLY that, is the foundation of search intent optimization.
Why do people even turn to Google when they need information? Because Google is so very good at giving users the exact information they are looking for.
This ability to satisfy the user’s intent has all sorts of beneficial outcomes for all of the various stakeholders involved in search engines:
By understanding search intent, it’s much more possible to create content that will meet the needs of the user.
But sometimes, that is easier said than done, because every user is different. However, thinking about the main types of search intent, and creating content accordingly, can help your website match the intention of the majority of users.
To understand and discuss search intent, digital marketers have developed four primary terms that can categorize all of the billions of searches that happen every day in search engines.
Informational search intent is when the user is looking for information on a certain topic.
This is the most common type of search intent. All of the below search terms can be categorized as informational:
Clearly, these users are looking for very specific answers to their questions. So for searches like these, Google will often show a featured snippet that provides the answer to the user very, very quickly.
In addition, Google is showing results that may answer the next questions users have about that same topic.
These types of searches often result in a zero-click search, where the user doesn't actually click through to any web page because they immediately see the answer they are looking for. However, sometimes users are not searching for a specific answer, but simply for more information about a given topic.
The below keyword queries, then, also fall under the “informational” category.
With informational searches like these, Google is likely to show a wider range of resources that explore the topic in-depth, giving the user the ability to choose the web page that appears most relevant to their intentions.
Navigational search intent is when the user is looking for a specific web page or website.
For example, if a user types in a company name, they are likely looking for that company’s website. Or, maybe the user visits the same website multiple times, and is looking for a specific page on that specific website.
Here are some examples of keywords that fall under the navigational type:
With navigational searches, Google will most likely rank the specific pages that the user appears to be looking for.
As seen by the examples above, navigational searches most often include the mention of a brand name or the name of a website.
But for enterprise brands that have a large online presence, branded searches may produce other types of results, like Wikipedia pages, news articles, stock prices, etc.
Also, large brands will often have a Knowledge Panel appear in their branded searches. This panel includes key information about the company that Google has gathered from various sources across its index.
Transactional search intent is when the user is looking to purchase something specific. In these cases, users are further down the sales funnel and closer to a purchase decision.
For these types of searches, Google most often ranks specific product pages, like this result for “2017 grand cherokee blue.”
Transactional searches will very often return rich results, meaning results beyond just the traditional blue link. They will include product images, prices, reviews, and other information about the product.
Also, because transactional searches often represent users with the intention to make a purchase, transactional searches are often returned with Google Search Advertisements.
Commercial search intent can resemble both informational and transactional searches. This category represents users who are looking to make a purchase or to educate themselves in order to make a future purchase.
For this example, a user might be doing research for something they plan to buy in the future. Some examples might include:
In the above cases, Google is likely to show web pages that compare products or detail features, reviews, or additional information on the products/services. They know that the user is not ready to buy a specific product but is still in the research phase.
And because Google knows from this search that the user is still in the consideration stage of the marketing funnel, Google might promote various types of results.
In the above case, Google is showing ads, specific products, as well as blogs that compare various features and benefits of multiple brands of hair growth supplements.
By doing so, Google is able to satisfy the various search intentions of different users.
By understanding the actual search intent of keywords, content marketers can create content that is more relevant and valuable to their target customers.
So how do you actually optimize for search intent?
Here are three basic steps that will help your content creators better satisfy search intent.
A keyword can tell you a lot about what a user wants.
By understanding the topics and queries that users are searching for, companies can create content that is more likely to rank and be seen by their target audience.
This can be done through a variety of tools, including keyword research tools and analytics, to identify the keywords that are most relevant to your target audience.
More specifically, look at the keyword variations provided by the dashboard tools to see if there are other similar keywords that may be more relevant to the content you are creating.
Not sure what the search intent of a keyword is? Look at the content that is already ranking on page 1.
Because satisfying search intent isn't only important to you, it's also important to Google. Looking at what Google is already promoting gives you a hint of the type of search intent it associates with the keyword. This should clue you in to the type of content to create if you also want to rank for that particular search term.
You can look at the Keyword Magic Tool to review the type of content that has been ranking on page one over time.
Google Analytics can tell you how long your organic visitors are staying on a particular page, whether they click on additional pages, or how many sessions those users have.
These are all vital pieces of analytics that can help you understand whether or not your content is satisfying search intent.
You can also use other behavior analytics tools like HotJar or Mouseflow to see how users are actually interacting with your content.
For your high-value, high-converting pages, these tools can give you essential insights to improve your content and user experience, and to better satisfy your users' intentions.
In conclusion, thinking about the search intent of keywords can help you create better content and reach your target audience more effectively!
Updated for 2022
For website owners, marketers, and business owners who want to improve a site's Google search visibility, there is no better data available than what comes from Google Search Console. What makes these metrics priceless to site owners and webmasters? They come straight from Google. So if you have yet to use Google Search Console to improve your site's impressions and clicks in Google search results, now's the time.
This guide will cover how to create a GSC account as well as how to make the most of your GSC account to improve your keyword rankings.
There are many SEO software platforms out there that track keyword rankings, but Google has the most accurate and up-to-date information about the keywords your site ranks for on a daily basis. In this way, Google, or software built over Google's API, is the only source of truth for how your URLs are performing in search right now.
If you want to earn more keyword rankings and drive organic traffic from search, Google Search Console and GSC-powered analytics are the best tools available. Beyond keyword tracking and crawl-request capabilities, the platform empowers you to perform key SEO actions that will improve site visibility and ensure that Google is crawling and indexing your site as efficiently as possible.
Google Search Console is a free platform provided by Google to measure website performance in search engine results pages.
Formerly known as Google Webmasters, Google Search Console allows site owners to check the indexing status of their web pages, track their keyword rankings, and improve the overall visibility of their websites in search engine results pages.
In September of 2019, Google Search Console went through an update that removed many of the old reports and dashboard options. So, if you read a reference to the "old" vs. "new" Google Search Console, the new GSC refers to the version introduced in 2019.
However, the new user interface still provides most of the same functionality and capabilities as the old version, just with new names. Google Search Console continues to improve search results for searchers and site owners alike by allowing site owners to see their site through the eyes of a Googlebot.
There are many features that Google Search Console offers webmasters to understand their search engine performance and elevate their usability to meet Google’s standards.
If you’re brand new to tracking your website in the GSC platform, setting up your account and getting started is simple. Here is a run-through of all of the first steps to get your account properly linked to your domain, adding users, and linking your GSC account with other useful platforms.
One of the biggest hindrances for many site owners is connecting their GSC account to their domain. While this preliminary step may seem intimidating, it’s not technically complicated. Additionally, you only need to do it once for each of your domains.
Once you’ve created an account with Google Search Console, open the drop-down menu and select “Add Property”.
You will be given the selection of five different types of properties, various ways to verify your property, as well as the subsequent steps for each verification method. These verification methods depend on the property type you’re adding.
All of the above, whether uploading an HTML file, adding an HTML meta tag, or verifying through Google Analytics, Google Tag Manager, or a DNS record, are simple verification methods, and Google provides detailed instructions for each in its webmaster documentation.
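The HTML tag method, for instance, simply asks you to paste a snippet like this into the <head> of your home page (the content value below is a placeholder for the token Google generates for you):

<meta name="google-site-verification" content="YOUR_VERIFICATION_TOKEN" />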
Add all domain versions to your GSC account and indicate which is your preferred version. This includes versions of your site with "www" and without. Otherwise, GSC may not give you a complete collection of your metrics. You can also set up a 301 redirect from your non-preferred domain to your preferred one.
After you’ve established your GSC account, you can begin exploring all the features. Spend time looking through all the reports and functions. Do not worry about ‘breaking’ your account. There are very few, if any, ways you can mess up your GSC account.
If you have multiple team members or individuals monitoring the same domain, it's easy to add additional users to your Google Search Console account. Property owners can designate additional owners, users (full or restricted), and associates, each with different permission levels. It's important to consider what level of access each user should be granted before adding them to the property.
If you’re a site owner that is working with an SEO or digital marketing agency to execute your SEO strategy, you can add your account owners or agency representatives as users or associates.
Verified Owner: There is only one verified owner per GSC property, and it should be a person who will remain with the company long term.
Added Owners: Can change and control all aspects of their connected GSC properties. They can also see all of the Google Search Console data and can take most actions that a property owner can, but only the verified property owner can make fundamental changes to the account, such as adding or removing property owners, changing an address, or submitting a sitemap.
Added owners can add more owners, users, and associates if they choose.
Users: Users have access to all data on a GSC property and can perform some actions. However, users cannot add more users. You will find there are restricted users and full users; as the name implies, restricted users have more limited access to data and actions.
Associates: This title can seem a bit misleading. It is for associating a Google Analytics property with a GSC account. This association gives you access to GSC reports within your GA account (and vice versa). See below for instructions on linking your GSC and GA accounts.
A GSC property can have up to 100 users and 1 GA property.
Here are the Google webmaster instructions for how to add a user to an account.
Google Analytics is another great free tool that Google provides to site owners and one you should already be using to understand your website traffic and other key performance indicators. Google Analytics focuses on metrics related to all traffic on your website, not just the users who arrive from organic search.
If you want to simplify your life and have all of your analytics in one location, you can link your Google Search Console account to your Google Analytics account. By doing so, Google will import all of your GSC data into your Google Analytics reporting.
To do so, you will need to have both accounts already set up. Then, log in to your Google Analytics account.
Once these steps are complete, you will have additional reporting available and your GSC data in your Google Analytics account.
While GSC data is still the best available, because it comes straight from Google, the dashboard and reports are rather limited compared to other SEO software. This is one of the few downsides of GSC. If you want a more comprehensive dashboard with more advanced data visualizations, our GSC Insights tool combines a more engaging user interface with the same powerful Google Search Console data.
Within the tool, you will also find a multitude of unique reports based on your GSC data, such as the economic value of your organic traffic (compared to how much that traffic would cost in a CPC campaign).
GSC Insights is built over Google's API, meaning you have access to the same real-time data you find in GSC, which is a great advantage.
To access the tool, create an account (you will also get access to our other SEO tools like our SEO Content Assistant and Keyword Researcher).
Select the “Google Search Console” tool on the left side of the dashboard, where you will be able to link a new account via the “Projects” tab.
Once you’ve added your domain property, users, and linked your primary Google accounts, you’re ready to start tracking your website’s performance in the SERPs.
Much of the learning curve related to GSC is getting used to the jargon. Here is an overview of the primary metrics that Google tracks organized by how they are structured in your Google Search Console Account.
The overview section is a summary of the three primary categories of metrics that Google Search Console Tracks: Performance, Coverage, and Enhancements. Here’s what they mean:
The URL inspection tool is a way to quickly inspect the indexed version of any given web page on your domain.
There are three ways to access and use the tool in your Google Search Console account.
Once you inspect a URL, GSC will provide you an overview of information from their last crawl attempt, including whether or not the URL appears in search results, whether it appears in search results but has issues, where the page was found, and more.
Watch the below video for more information on how to make the most of the URL inspection tool and the data it provides.
The Performance section of your Google Search Console account will provide all of the metrics related to your URLs’ search appearances. The primary data graph will provide four key metrics that quantify how your site is performing overall across all web pages.
If you want to filter these search performance metrics for specific keywords, landing pages, days, or some other factor, you can set your parameters from the table located below the primary graph.
The table will populate based on the filters or combination of filters you select. You can filter your clicks, impressions, average position, and average CTR by any of the following parameters.
How best to utilize all of this data will depend on your keyword targets, strategy, and SEO goals. Overall, the data is designed to provide you with both a big picture summary and many granular pictures of the strengths and weaknesses of your search performance.
Before Google can rank your URLs for keywords, it has to crawl and index your pages. The Index section of your Google Search Console account will give you all of the information you need to understand how Google is indexing your pages, as well as tools to improve how Google understands your web pages.
This feature of the Google Search Console tool allows site owners to see which of their landing pages have been indexed and whether Google bots encountered any problems when crawling them. Since 2018, Google has used mobile-first indexing data when available.
Ideally, there will be zero errors on your coverage report, but if not, Google will identify what pages had errors and the type of error so you can attempt to correct them.
To review your site errors, simply select Coverage under the Index section of the sidebar.
After the report opens, you can identify individual pages with crawl errors.
A sitemap is exactly what it sounds like — a roadmap for all of the web pages in your domain. It communicates to Google’s crawlers what the most important pages of your website are, even if you don’t have internal links pointing to all of your pages.
Essentially, a sitemap helps Google crawl your website more intelligently and efficiently, which benefits your keyword rankings in the long run. You can generate an XML sitemap using a site map generator (try the Yoast plugin). If you are unsure of which pages to include in your sitemap, consult one of our SEO strategists before generating your file.
You are certainly not required to add a sitemap to your GSC account, and for smaller sites with only a few landing pages, it may not be necessary. However, owners of larger websites (like e-commerce sites with lots of product pages) can really see their keyword rankings benefit from adding a sitemap.
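For context, an XML sitemap is just a list of URLs in a standard wrapper. A minimal sketch following the sitemaps.org protocol (the URLs are hypothetical):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2022-03-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/products/</loc>
  </url>
</urlset>

A generator or plugin will produce this file for you, but it helps to recognize the format when debugging.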
To upload your XML sitemap in your Google Search Console account, first select the "Sitemaps" tab.
You will be directed to a page where you can give Google's bots the link to your sitemap file and click submit.
In Google Search Console, you will be notified when there are any crawl or indexing errors on your website. Checking back in on your sitemap is important as Google won’t necessarily crawl every page on your website with the same frequency.
Also, as your website grows or anything changes with your overall site architecture, you will want to update your sitemap so Google continues to understand which pages are central to your site architecture.
One of the newest features of Google Search Console, the Removal tool allows site owners to both remove content from the SERPs and see which content has been removed due to the requests of third parties.
The removal tool offers three options to site owners: temporary removals, outdated content, and SafeSearch filtering.
The majority of site owners will only ever access the first feature of the tool. If for some reason you need to temporarily remove a URL from Google, you can submit your request via the removal tool. The request can be submitted for a specific URL or any URL containing the same prefix, and they will be removed from Google for six months.
This tool does not permanently remove web pages from Google's index. To do so, site owners need to either add a noindex robots meta tag to the page or delete the content from their site.
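The noindex tag itself is a single line placed in the <head> of the page you want kept out of the index:

<meta name="robots" content="noindex">

Google must be able to crawl the page to see this tag, so don't simultaneously block the same URL in robots.txt.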
In 2020, Google replaced its PageSpeed report with the Core Web Vitals report. This report is all about the technical performance of your website: Google's bots measure website performance and page experience through the Core Web Vitals metrics.
Google Search Console provides Core Web Vitals reports for both the mobile and desktop versions of your website.
The following metrics make up Core Web Vitals: Largest Contentful Paint (LCP), First Input Delay (FID), and Cumulative Layout Shift (CLS), each covered in more detail earlier in this guide.
Google recommends focusing on improving affected URLs that receive "Poor" and "Needs Improvement" ratings. Click on an item in the Details list to get more information about the issue type that is causing the subpar rating.
Once you’ve attempted to fix the issue, you can verify whether Google is seeing the issue as resolved across your entire site or notify Google of individual fixes.
Simply select the status of the fix under the Validation column in the Details menu beneath the Core Web Vitals report.
As you work through these fixes and check off that you’re ready for the fix to be validated, the item will drop to the bottom of the status table. Once all items are accounted for, Google will track usage data for 28 days to see whether the issue is fixed.
Some improvements you may be able to make on the backend of your WordPress site (or whatever CMS you use), but other errors may require you to work more closely with a developer.
The mobile usability report provides site owners with insights into how their website performs on mobile devices. If any of your pages do have mobile usability issues, the “Details” table will highlight the type and status of the error. Errors that may be noted on your mobile usability report include:
Some of these errors are simple to fix. Others are not. But it is important to address them as thoroughly as possible, particularly if your website traffic primarily comes from mobile devices. With mobile-first indexing, Google uses the mobile version of your website for indexing and ranking, so solving these issues is critical for earning higher rankings.
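For example, a frequent root cause of mobile usability errors is a missing viewport declaration. The standard fix is a single line in each page's <head>:

<meta name="viewport" content="width=device-width, initial-scale=1">

This tells mobile browsers to render the page at the device's width instead of a zoomed-out desktop layout.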
AMP pages are essentially a streamlined version of HTML that results in faster loading times on mobile devices. If you want to create accelerated mobile pages for your website, your web developer will need to follow the AMP HTML specification. Then, you can use Google's testing tool to confirm that your AMP pages are properly set up and recognized by Googlebot.
The enhancements section of the Google Search Console account provides site-wide insights and performance reports that can help site owners improve their rich results or those SERP results that include carousels, images, or other non-text elements.
Rich results can help your search results look more attractive to users, but your site can only appear in rich results if you have structured data markup, a standardized format of website code for classifying page content.
What you see in the “Enhancements” portion of the Google Search console toolbar will depend on what structured data types you have implemented across your site. You can work with your web developer or an SEO agency to implement structured data on the backend of your website.
Enhancements are essentially HTML improvements that elevate the performance of your pages for users and help Google's bots better understand the primary purpose of each of your web pages. It is worth the effort to use structured data markup if you want to improve your overall appearance in the search engine results pages.
First, explore Schema.org to find the structured data or schema markup that signals to Google that your site or pages contain a specific form of information.
These include types like breadcrumbs, logos, and local business markup, each covered below.
Breadcrumbs are a rich results type that help Google’s crawlers and users understand your site structure, or where your web pages are located in your website’s hierarchy.
Site owners in any industry can benefit from adding breadcrumbs to their websites for SEO purposes and to improve the user experience. However, large sites with many layers of landing pages stand to benefit the most.
The first page of your website that users arrive at through organic search may be a blog post or a resource page, not necessarily your homepage or primary category pages. With breadcrumbs, users (and Google bots) always have a sense of where they are located on your website.
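To make this concrete, here is a minimal sketch of breadcrumb markup in the JSON-LD format Google recommends (the page names and URLs are hypothetical):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {"@type": "ListItem", "position": 1, "name": "Home", "item": "https://www.example.com/"},
    {"@type": "ListItem", "position": 2, "name": "Blog", "item": "https://www.example.com/blog/"}
  ]
}
</script>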
Once you have implemented the breadcrumbs structured data markup, Google will provide a report that lets you know whether there are any errors so you can resolve them.
Logos: Improve your Brand Impression
The Logos structured data markup helps Google understand which logo is associated with your business or website, ensuring that your logo appears in branded search results and knowledge panels for your company.
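A minimal sketch of the markup, again in JSON-LD (the URLs are hypothetical), looks like this:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "url": "https://www.example.com",
  "logo": "https://www.example.com/images/logo.png"
}
</script>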
In the Google Search Console Tool, the logos report will let you know whether or not your logo is appearing correctly in rich results and if not, how to correct the issue.
The local business structured data markup is really important for any local or small business with a brick-and-mortar store that users will likely find via mobile search.
This structured data markup type provides essential information about your local business, like your physical address, hours of operation, and online reviews, and makes it easy for users to contact you by phone straight from the result. If there are any errors in your markup, Google will notify you in the GSC dashboard.
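As a hypothetical sketch for a small storefront, local business markup in JSON-LD might look like this:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Coffee Roasters",
  "telephone": "+1-555-010-0000",
  "openingHours": "Mo-Sa 07:00-18:00",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main Street",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  }
}
</script>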
Having a safe, secure website creates a higher-quality web experience for users, which is why Google provides warnings of any security issues it detects on your website. Google will flag any issues that could harm users it sends to your website and will notify you of which pages the security issues were detected on.
Google Search Console provides details on how to resolve the specific security issues it detects. It is the site owner’s responsibility to resolve those security issues in order for their rankings to improve.
In order to test whether the security issues have been resolved, simply click "Request Review" in your Security Issues report and provide the following information in your request:
It usually takes 2-3 weeks before Google Search Console notifies site owners by email of whether the issues have been resolved.
The manual actions report is meant to police websites that use black hat or unethical techniques to manipulate their search results. If your site is issued any manual actions, some or all of your site will not be shown in search results.
There are a variety of reasons why your website may be issued a manual action, but they all fall under the umbrella of failing to follow Google’s webmaster quality guidelines.
To fix the issue, follow the guidelines outlined in Google Search Console's help center.
Since the launch of the new Google Search Console, some legacy tools and reports do not have a replacement in the new dashboard. Google no longer makes the older version of GSC available, but site owners can still access these tools via the links Google provides in its help center:
The Links section of Google Search Console allows site owners to see their backlinks, internal links, top linking sites, top linked pages, as well as the most common anchor text other site owners use when linking to theirs. Site owners can export a CSV file of all of their backlinks by clicking the “Export External Links,” button to get a closer look at those sites that are linking to theirs.
If you have a large amount of low-quality links that you think might be impacting your rankings, you can submit a disavow file using the disavow tool. This will not remove the links, but instead, make it so Google no longer considers them in their evaluation of your website. You can use our backlink analyzer to generate disavow text for your disavow file.
Only advanced SEOs should be disavowing links, as you don’t want to accidentally harm your site by disavowing the wrong links. It’s also important to use this tool sparingly, as Google wants site owners to find other ways to get those links removed or to instead focus on earning more high-quality links to their site.
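For reference, a disavow file is a plain .txt file with one entry per line, using either full URLs or a domain: prefix, plus optional # comments. A minimal sketch (the domains here are hypothetical):

# Disavowed after manual link audit, March 2022
domain:spammy-directory-example.com
https://low-quality-blog-example.net/paid-links-page.html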
With so much data and performance reporting available in Google Search Console, it can be overwhelming for new site owners. How do you make the most of all that valuable information to improve your search results? Here are the primary ways to think about using the Google Search Console dashboard. In conjunction with our GSC tool and our software suite of tools, you can make the most of Google’s data to improve your search results more effectively and quickly.
The most straightforward purpose of using GSC is to see the specific keywords your site is ranking for. If you are doing the work of SEO correctly, you create and publish content on your site with specific keywords in mind. You can use GSC to confirm whether your content is high-enough quality to earn those rankings.
After you optimize your landing pages, use Google Search Console to measure whether your site performs well for those keyword targets. If you make additional on-page optimizations or pursue off-site backlink building for those pages, measure how your keyword rankings change.
By elevating your site’s presence with Search Console data, you can improve your search traffic across multiple relevant keywords for the long term.
Understanding the best and worst-performing pages of your website helps you determine your next step for improving your impressions. Whether you need to create new content, redirect specific pages, optimize an HTML tag, re-optimize relevant pages, or resolve keyword cannibalization, you can identify the different ways to improve your content from your Google Search Console data.
To automate and more easily track your SEO performance, GSC Insights makes tracking campaigns, keywords, and pages even easier.
GSC Insights and our dashboard can help you identify SEO potential based on competitor data, pages with potential keyword cannibalization (often caused by duplicate content), and indexing issues, including a sitemap report.
In addition to SERP data, SEO professionals encourage site owners to monitor off-site SEO metrics such as backlinks.
The likelihood of appearing as a top result in Google's SERPs depends in part on your site's domain rating, and backlinks are essential to building that rating.
Directing SEO efforts off-site can elevate your site's rankings more quickly than almost anything else. Backlinks are key in the algorithm and can improve your performance across different keywords, because your domain will be perceived as more authoritative.
To increase your domain rating, you need backlinks.
After you connect your Search Console account to GSC Insights, you are able to monitor your backlink profiles to ensure that your website is not keeping company with any questionable web properties.
Our Dashboard makes monitoring your backlinks easy, and the software suite provides you with backlink outreach opportunities, so you can become discoverable in more relevant queries.
For more advanced SEO specialists, our Dashboard is a great platform for running A/B tests on your website, no spreadsheets required.
Site events allow you to easily mark and then track changes you make to your site within the GSC Insights tool. Whether you change a minor element in a piece of content, such as a title tag, meta tag, or meta description, or edit a number of pages, you can create an event to mark the change.
From there, you can track months or even years of daily data to monitor the effectiveness of your efforts, and isolate metrics such as page rank, keyword changes, impressions, and CTR.
Because our Dashboard's keyword data is more up-to-date than tools like Ahrefs or Semrush, you can perform better keyword research and see more quickly and accurately whether your changes improve your site's search performance.
If you want to improve your site's SEO, the first thing you need is full access to the information that helps you track your search engine optimization efforts. Google Search Console covers all the basics and is the easiest way to track SEO and site settings for both large and small sites.
Paired with GSC Insights, you can understand Google results for your site's pages and keywords with even more granularity. For more pro tips on how to make sure Google's tools are working best for you, talk to one of our SEO specialists. We can help you understand your Search Console reports and provide managed SEO plans that will elevate your URLs in the search results pages.
Is your SEO strategy failing to drive enough conversion-oriented organic traffic to your e-commerce site? Or are you just getting started with a new e-commerce store?
This article covers e-commerce specific SEO strategies and considerations. By applying these advanced on page SEO tactics and best practices, you can expect your online business or online store to gain search visibility, better organic rankings, higher organic search traffic, and ultimately more online shoppers.
We’ll walk you through the same basic SEO elements you’ve undoubtedly read about countless times before, but address their application for an ecommerce store. These practical SEO tips will help you significantly increase your new customers and online sales by making it easier to reach your target audience without spending a dime on PPC advertising.
Category-based information architecture (IA) and site architecture are critical for ecommerce sites, and this IA should inform your main navigation.
Online users have no patience. If they can't find what they're looking for quickly, they'll bounce back and try the next site in the search results.
As your product catalog grows, you will need to put more effort into making it easier for users to find the right products!
Your site must have well-thought-out UX (user experience) elements such as filters, navigation links, breadcrumbs, and product categories and subcategories, as well as clear URL structures and product naming conventions.
Good UX makes it easy for a user to understand where they are in your site and what products and/or services your online store offers. For ecommerce SEO, site structure is typically based on product categories, product collections, and products/product filters.
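As a hypothetical sketch, a category-based URL for an online kitchenware store might look like:

https://www.example-store.com/coffee-makers/espresso-machines/barista-express

The path itself tells both the shopper and the search engine exactly where the product sits in your catalog hierarchy.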
Product categorization for your e-commerce website may need to be different from your product categorization operationally. Your ecommerce SEO strategy needs to reflect how your consumers view your products, not how your business views your products.
In general, your site structure should reflect how your customers think about your products and services, even down to your actual product names.
A common mistake that e-commerce sites make is organizing their products online the way they view those products from a production or operational perspective.
How everyday shoppers think about your products may be different than how you think about your products.
For example, you may think of a part your company makes as “Breville part #: BJE510XL/45”, but your everyday shopper may search for “Breville Filter Basket Replacement”.
So how do you establish and/or close the gap between how you think about your products, and how your prospective customers search for your products?
Pull from a usability best practice — have direct conversations with existing customers. Ask them how they’d describe your products or services, how they mentally categorize your offerings, or how they searched for your business to begin with. Their input will help you understand the language your customers are using, and how they think about your products or services.
If you need a starting place for understanding how your consumers view your products, you can have them complete a card sorting exercise. Card sorting is a UX (user experience) tactic that helps you prioritize and group information based on how your customers see it – by literally giving them all the elements and asking for their feedback.
Next, complete keyword research. Keep in mind that a handful of responses can be very helpful for gaining insights, but they may not reflect the broader market. Spot-check search volume for keywords using both the language your customers use and the language you would use, and be open to discovering additional ways the market overall searches for your products/services.
Keyword research helps you understand how the market thinks about products and services, and which search terms are likely to convert if you can attract those shoppers to your ecommerce site.
There are a number of tools to help you with keyword research. If you already use Google Ads, you could use the Google Keyword Planner tool as a starting point for establishing the best keywords (search terms) to target. You can also use our keyword volume or keyword tracker tools.
Your goal should be to identify a list of keywords that describe your products and have high search volume. This set of terms represents how your consumer base thinks about your products, and this is the language that you should use throughout your ecommerce website (ex: for product categories or collections).
You need to know which specific, long-tail keywords people use when searching for the exact products that you sell.
The types of keywords you would use to optimize an e-commerce site aren’t necessarily the types you would use in another niche. You want to attract users towards the bottom of the purchase funnel. To do this, you need to identify a user’s search intent.
As the name implies, the term “search intent” refers to the reason someone is performing a search. This also influences the words they choose when performing a search. Search intent can be broken down into four categories: informational, navigational, commercial, and transactional.
It’s important that you select keywords that have commercial or transactional search intent. These are the terms that will convert for your site. Do NOT simply pick keywords that have high search volumes. After all, your core goal is actual ecommerce sales. You want to attract more potential customers, not just generic organic traffic!
In general, ecommerce keywords belong to the commercial and transactional categories. It’s easy to understand why you’d want to focus primarily on researching transactional (and, to some degree, commercial) keywords. These are simply the types of keywords people frequently use when they are ready to make a purchase. This makes them ideal for ecommerce websites.
It’s also worth noting that ecommerce keywords are often very specific, or long tail. Long-tail keywords are keywords that have modifier terms around the basic keyword. Identifying these more specific keywords can be extremely valuable.
In the example below we see how more specific searches can have a higher cost per click. This is a market indicator that the term is higher-converting. You may also notice that there is less search volume on long-tailed, or specific, keywords. Broad searches have higher search volume because there is a much wider range of reasons those searches could be performed. Specific searches have much clearer intent.
A very specific search phrase, for example one that includes model type, size, brand name, or location, indicates a user knows exactly what they want and is ready to make a purchase. Very specific searches tend to be higher-converting and more impactful on your bottom line.
One more indicator that a keyword is likely to be used by potential customers and have a higher conversion rate is a high Pay Per Click (PPC) or Cost Per Click (CPC) value. These are terms that the market has already validated as conversion-oriented. However, you cannot rely on CPC alone; you still need to check the relevancy of terms against your own product list and/or services. Converting for the market will not always mean converting for your specific site or business.
Once you understand the search queries your customers are using, at both a category and product level, you’re ready to finalize your site architecture. Use broad, high-volume terms as product categories/product collections, and then more specific terms for subcategories and individual products.
You’ll use this architecture to set up the structure of your site, from your home page, to category-based landing pages, sub category landing pages, collection pages, and finally product pages. This architecture will also inform how your main navigation and sub navigations are structured.
Once you’ve determined your product categories and subcategories, consider creating related landing pages for each category and/or subcategory. A recent study has shown that sites which increase their landing pages from 10 to 15 see a 55% increase in leads. These pages can be used to target keywords that are broader than your product-specific keywords. For example, this page on Guitar Center for electric guitars:
This landing page (and URL) targets the broader and higher-volume term “electric guitars.” Category pages help your site capture traffic from higher-volume terms while the individual product pages target much more specific long-tail keywords (aka higher-converting keywords).
Category-based site structures also help users navigate quickly to relevant products/services right from the homepage. For ecommerce websites of any size, each category page can be a main nav or sub nav item. This strategy adds the related category keywords to every page on your site and increases the PageRank of these pages through internal linking, as the main navigation is repeated on every page of your site.
Take Guitar Center’s site for example: all of these sub-category pages for “guitars” are listed (linked) in the main navigation, and therefore all the terms you see here are “read” by search engines on every single page of Guitar Center’s site — not just the homepage. This site structure also makes it very easy for users to find the exact product they’re looking for and even discover new products. This boosts your pages’ odds of appearing in relevant search results.
For larger ecommerce sites, adding all product categories or subcategories to the main navigation may not be feasible. In this instance, breadcrumbs can provide an alternative method for leveraging internal linking and product pages to help users navigate deeper sites.
Guitar Center’s main nav does not display or internally link to any pages below the “electric guitars” category. However, there are additional product subcategories within the electric guitars product category. To improve the usability and discoverability of these additional subcategory pages, breadcrumb navigation has been added (highlighted in red below).
Breadcrumbs are especially useful on product pages, as they can help users discover a full product line, clarify the website structure, and provide a secondary navigation link to bounce a user back multiple site levels without having to press the back button multiple times.
Amazon, as another example, uses breadcrumbs to provide users with secondary navigation on almost every product page.
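As a rough sketch, a breadcrumb trail is usually a simple list of links, optionally paired with BreadcrumbList structured data so search engines can show the trail in the results. The URLs and names below are hypothetical:

<nav aria-label="Breadcrumb">
  <a href="/guitars/">Guitars</a> &gt;
  <a href="/guitars/electric-guitars/">Electric Guitars</a> &gt;
  <span>Yamaha GigMaker EG Electric Guitar Pack</span>
</nav>
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {"@type": "ListItem", "position": 1, "name": "Guitars", "item": "https://example.com/guitars/"},
    {"@type": "ListItem", "position": 2, "name": "Electric Guitars", "item": "https://example.com/guitars/electric-guitars/"},
    {"@type": "ListItem", "position": 3, "name": "Yamaha GigMaker EG Electric Guitar Pack"}
  ]
}
</script>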
Finally, category pages can be helpful for ecommerce sites because you can set up category-based redirects. Category-based redirects allow out-of-stock products to be redirected to the main category page. This improves the user experience and reduces the chances of Google, Bing, or other major search engines reading any of your pages as 404 errors.
A well-thought-through site structure will also enable you to programmatically generate custom product URLs that include relevant keywords (describing the product) and are easy for users to read and understand.
Keyword-based URLs are more “clickworthy” than URLs that consist of seemingly random characters such as product SKUs. URLs featuring keywords essentially “tell” a search engine algorithm more about what type of product is featured on the page.
In ecommerce SEO, the URL slug will typically be the main product keyword — usually the product name. Your main keyword for the product should also be included in your H1. Continuing with the earlier example of a user searching for a beginner guitar set, this page from Guitar Center demonstrates the right way to generate a product URL:
The URL slug is the name of the product. It’s also the H1 (not just stylistically but also with the HTML H1 tag applied). The result? This is the first page to appear in a Google SERP when users search for “yamaha gigmaker eg electric guitar pack.”
The URL also illustrates a popular ecommerce URL format: domain.com/category/product. Other options to consider include domain.com/collection/category/product and simply domain.com/product.
Determining which format to use requires deciding whether your products belong to specific categories and collections, or whether they stand on their own. The chosen format works in this example because the product is an electric guitar pack belonging to a specific brand.
Explore the traditional URL structure of various ecommerce platforms when reviewing your options, but keep in mind that you can often change the structure by choosing the right theme or directly editing the code.
I’m sure you’ve spent time optimizing your homepage already, but did you know it’s even more crucial to optimize the SEO of your individual product pages? These are the pages that need to appear in relevant search results, and you want them to be strong enough to convince visitors to make a purchase.
Once you’ve done some preliminary keyword research, begin to optimize your individual product pages by addressing your Technical SEO:
Page titles, also known as title tags, need to accurately represent what a product is. They also need to feature the primary keyword for which you most want to rank. Make sure this keyword or phrase is front-loaded in the title tag so it’s more noticeable on a small mobile screen, and the key information from the title tag still displays even in rich snippets.
Take a look at the examples below. In the first, only the first two words of the page title are visible. In the second, we see subcategories highlighted.
You should also include relevant keywords in the meta description for your product page. Your meta description should encourage the user to click into the search result, clarify what the user can expect from the page, and include product-relevant keywords to help your page rank.
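As a minimal sketch, here is how these elements might look in the HTML of the guitar-pack product page discussed earlier (the exact copy is hypothetical):

<head>
  <!-- Primary keyword front-loaded in the title tag -->
  <title>Yamaha GigMaker EG Electric Guitar Pack | Guitar Center</title>
  <!-- Meta description: clarifies what the page offers and includes product-relevant keywords -->
  <meta name="description" content="Shop the Yamaha GigMaker EG electric guitar pack, a complete beginner electric guitar bundle.">
</head>
<body>
  <!-- The H1 matches the main product keyword -->
  <h1>Yamaha GigMaker EG Electric Guitar Pack</h1>
</body>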
It’s important to keep in mind that technical SEO is primarily about helping users navigate online. The meta information on your product pages (including page titles, descriptions, and headers) serves much the same purpose as highway signs or signage at an airport: it helps users reach their intended destination. Thus, you should attempt to be as informative as possible, while also being brief and direct.
When thinking about what information to include in your title, meta description, and/or H1, it can help to think about product modifiers. These are terms (often included in long-tail keyword searches) that further describe the product, such as brand, model, size, color, or material.
Remember, many of the users you’re targeting know exactly what they’re looking for in detail. You thus need to provide them with information demonstrating you’re selling exactly what they’re looking for.
Including images of your products is crucial to ecommerce SEO.
Images allow you to display your products in dynamic ways. If you’re selling apparel, product images where items are being worn by models provide customers a better sense of how an article of clothing looks when worn.
Images can also provide context for products. For instance, maybe you sell furniture and fixtures. An image of a product in a room (ideally surrounded by a few other items) will give users a better idea of its size, and how it will look in their own homes.
Product images with small file sizes, which do not slow page load speeds, and which adjust responsively display well on mobile. Google is continuing to switch sites over to mobile-first indexing, and both Google and Bing noticeably reward sites with images and rich media in search.
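As a minimal sketch, an optimized product image might look something like this, with compressed responsive variants, explicit dimensions to prevent layout shift, and lazy loading (the file names and sizes are placeholders):

<img
  src="/images/gigmaker-eg-pack-800.jpg"
  srcset="/images/gigmaker-eg-pack-400.jpg 400w, /images/gigmaker-eg-pack-800.jpg 800w"
  sizes="(max-width: 600px) 400px, 800px"
  width="800" height="600"
  loading="lazy"
  alt="Yamaha GigMaker EG electric guitar pack with amp and accessories">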
Image ALT text tags are simply descriptions of images on your site. They also play a significant role in ecommerce SEO. Alt text tells a user what an image depicts when the image doesn’t load or when the user relies on a screen reader, but it also provides search engines more information about what an image itself is relevant for in search.
Where appropriate, ALT tags should feature the keywords you want to rank for without adding confusion to the image description itself. Keep in mind the point of alt text is still to help people with accessibility issues, so keep alt text relatively short (no more than 125 characters) and try to be specific about any key product features highlighted in the image (such as product name, size, materials, and any other relevant information). This is another way in which you can tell a search engine what type of content appears on the page.
Including a short product description right next to the product image is a smart way to capture a potential customer’s attention and improve the on page SEO. They’ll see both the product and the most important information about it at the same time.
Review this example to understand why this method is effective. The image shows off the product, while the copy provides a user with the basic essential information.
Scroll down, however, and you’ll find a lengthier product description. A longer product description section gives you the opportunity to include more keywords in your content and ensures thin content won’t hold you back from ranking.
Additionally, great content is more likely to engage customers and build brand awareness. When you only have thin content users spend less time on the page, and less time considering your product. Not only does more time spent on the page improve your rankings in the SERPs, but also great descriptions will help guests better understand why a product is valuable.
The right word count for descriptions depends on how much content exists on a blank product page: the sum of text in the navigation, header, footer, etc., before product information is added. Making sure product descriptions are longer than the sum of the base page content is a good starting point.
One thing to always note, though: never use the manufacturer’s product description. Ensure yours is unique and not copied from the manufacturer or another site. This is important because search engines won’t show the exact same content (duplicate content) from multiple sites. They instead only display content from the site that is deemed most “trustworthy.” This can end up being the site that has had the content up the longest, the site with the most traffic, the site with the most backlinks, or the site with the most users.
If you have multiple pages for essentially the same product (e.g., the same product in different colors or sizes), you’ll need to make some choices so that search engines are not confused by what are essentially duplicative pages. A common solution is to pick one version as the primary page and point the other variants to it with a canonical tag.
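A minimal sketch of a canonical tag, assuming hypothetical URLs where a color variant points back to the main product page:

<!-- Placed in the <head> of example.com/products/classic-hoodie-blue -->
<link rel="canonical" href="https://example.com/products/classic-hoodie">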
To avoid duplicate content when creating product descriptions, try breaking up the content into multiple sections. For example, one section could describe the story behind the product. Another could list its key features. Yet another could feature customer testimonials, just like the Biossance example above.
LSI keywords are terms that search engines expect to see on a page that is related to a particular topic. Often called focus terms, LSI keywords help Google understand the focus of a page. Read more about focus terms and content optimization here.
LSI keywords can help you tailor each page to rank better for the longtail keyword or term you’ve selected. We even have our own content optimization tool that you can use for free.
You should also consider that different people care about different product features. Breaking your descriptions up into sections makes it easier to appeal to all users. For example, if you were selling a garment, various users might care about such information as size, durability, warranties, shipping options, color options, special features (such as water-resistance), and more. Use your longer product descriptions to provide all information you believe potential buyers would be interested in. If your product descriptions are exceptionally long, you can even use internal links called jump links to help users navigate down to relevant content more quickly.
Schema markup – the structured data behind rich snippets – refers to HTML tags you can add to your content. When used correctly, schema can increase CTR by as much as 677% and boost traffic 20-30% by providing users with more valuable information about your content when it shows up in Google search results.
For example, maybe a user’s search results include one of your product pages. With schema markup, you could include customer ratings in your organic search result – or even show up in the product rich snippets that appear at the top of the search results. It’s also worth noting that Google’s own John Mueller has confirmed that schema is important to SEO.
Product schema can substantially improve your SEO. It’s also fairly easy to implement. The following are two simple ways to do so:
Do you use a major platform like Shopify or WooCommerce to manage your ecommerce site? If so, you can simply install a plugin for schema markup. It will allow you to add the necessary schema with ease.
Other platforms, like WordPress, have their own plugins, too, like these. So does Squarespace. While these platforms don’t allow for extensive schema markup on their own, plugins can expand their capabilities.
If you have a custom site that is NOT managed via WordPress, you could instead use SchemaApp, which allows you to organize your schema markup data on one platform. You can also use this tool if you host your e-commerce site through such platforms as Shopify, WooCommerce, BigCommerce, and Squarespace.
There’s also Google’s Structured Data Markup Helper, which you can follow along with after selecting “Products” from the main screen.
Does your organization have substantial in-house technical resources? If so, you can coordinate with a web developer to add schema markup to your site via Schema.org. This allows you to exercise a greater degree of control over the purpose of the schema you wish to add. It’s not an option for all businesses, but it’s worth considering if you have the necessary resources.
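For reference, here is a minimal sketch of Product schema in JSON-LD (the structured data format Google recommends); every value shown is a placeholder you would swap for your real product data:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Yamaha GigMaker EG Electric Guitar Pack",
  "image": "https://example.com/images/gigmaker-eg-pack-800.jpg",
  "description": "Complete beginner electric guitar bundle.",
  "brand": {"@type": "Brand", "name": "Yamaha"},
  "aggregateRating": {"@type": "AggregateRating", "ratingValue": "4.7", "reviewCount": "128"},
  "offers": {
    "@type": "Offer",
    "price": "299.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock",
    "url": "https://example.com/guitars/electric-guitars/gigmaker-eg-pack"
  }
}
</script>

The aggregateRating property is also what makes star ratings eligible to appear alongside your organic listing, as discussed above.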
Building a strong backlink profile is part of an effective SEO strategy for any site, and ecommerce sites are no exception. Inbound links (also known as backlinks) are critical for improving your SERP rankings and can help you bump terms stuck on the second search engine results page up to the first.
What are inbound links or backlinks? A backlink is when another (external) site links back to your site, referencing your products, services, or content. In essence, it’s another site referring its own users to your site because your site provides value, and search engines view other sites linking to your site as a positive reference from a real person. Link signals are weighted heavily in SEO. Each backlink your site receives increases the value of your site in the eyes of Google, and thus improves your rankings.
As more domains link back to your site, your own site’s domain authority will increase. As your domain authority increases, so does your site’s SEO value. Search engines use these ranking signals (backlinks) to determine which sites are most relevant online for related topics. Adding link authority boosts your page’s SEO value, and its ranking in the search results.
A strong backlink profile improves brand awareness and captures top of funnel web visitors who may encounter your site/brand via another initial source.
Link building starts with creating quality content that will be used and shared by people outside of your own site. Securing inbound links from reputable sites tells search engines your site is also reputable. These links may also provide additional opportunities for your products to display in rich snippets, and direct more traffic back to you.
There are several ways you can build backlinks for an ecommerce site. The following are a few methods we’ve found to be successful for e-commerce sites:
Many sites routinely post lists such as “Best Holiday Gifts for College Students,” “25 Life Changing Products under $25”, or “What to Get the Person Who Has Everything.” There are also sites specifically designed to help users discover new products, such as Uncommon Goods, Product Hunt, Pinterest, or Wish. Submitting your products to these sites boosts your odds of showing up on such lists.
A blog featuring valuable content can be a very useful tool for building backlinks. We recommend starting by identifying a list of ideas for blog posts. Each blog post should be tailored to a frequently asked question, or frequently searched topic, relevant to your business and your target consumer. Popular blog content, such as “Top 10 Gift Ideas for Father’s Day 2020,” can encourage others to link back to your site if the entry is fairly comprehensive.
Additionally, you could submit guest blogs to other sites, linking back to your products in the content. For further reading, SEMRush has a great guide to guest blogging as a linking strategy.
A more advanced approach would involve pitching product pages. Think about the kind of sites and publications that are likely to cover your products. Check their writer profiles and masthead to find their contact information, and submit a product for review. Each site will have its own process, so research publications and influencers or discuss this with an editor or other relevant individual before submitting your products blindly.
Image Above: Example of a Product Write-Up Included in a List
You may also want to coordinate with influencers in your niche. This guide gives an excellent intro to reaching out to influencers. Search for social media influencers in your industry. If a popular Internet personality recommends your products, that will generate more backlinks and drive overall interest in your brand.
Once you’ve optimized the basics (product descriptions, URLs, schema, etc.), make sure that your site has enough trust signals and social proof that consumers feel confident purchasing from your online store:
Customers typically trust user reviews on third-party platforms (such as reviews on Google My Business or Amazon) more than curated product reviews you post to your own site. However, testimonials that display the customer’s full name can still be a great first step, or a complement to pulling third-party customer reviews onto your own site.
Depending on how you’ve set up your product schema, you’ll also be able to display your aggregate rating, or star rating, directly in the search results (especially the Google search results).
These points are all important to keep in mind when developing an ecommerce SEO strategy for your site. The right combination of tools and techniques can be the key to ranking higher, attracting more organic traffic, improving your conversion rate, and ultimately increasing your e-commerce sales.
Many Shopify store owners are not leveraging their full SEO potential. You might think that you don’t need Shopify SEO or that it is too difficult to set up, but you could be missing out on a large number of potential customers browsing your online store. If you want to help your e-commerce store show up in Google, here are some Shopify SEO tips to help you get started.
Shopify SEO is the process of making your online store more visible in the search engine results pages (SERPs). This means that when people search for the product that you sell, you want to rank high so that you get more organic traffic and increased sales.
In order to optimize your Shopify store effectively, follow these 9 tips.
Tip #1: Optimize Your Category Pages
Category pages allow online shoppers to easily navigate your products. You’ll want to target keywords that your shoppers would be interested in.
You’ll first want to start with title tags and meta descriptions. As PracticalEcommerce states, “The title tag is the most influential on-page element that sets your page’s keyword theme and, combined with the meta description, influences the search terms the page ranks for.”
Another element you’ll want to optimize is your heading tags. Heading tags tell the reader (and search engine crawlers) what each section of the page contains.
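For example, a category page’s heading hierarchy might be sketched like this (the product names are placeholders):

<h1>Women's Cashmere Sweaters</h1>   <!-- one H1 carrying the page's main keyword -->
<h2>Crewneck Cashmere Sweaters</h2>  <!-- H2s introduce each major section -->
<h2>V-Neck Cashmere Sweaters</h2>
<h3>Petite Sizes</h3>                <!-- H3s nest under their parent H2 -->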
The Media Captain offers these tips for more effective headlines:
Tip #2: Use a Clear URL Structure
If you want to rank well, another thing you have to keep in mind is your URL structure. A clear URL makes it easy for visitors and search engines to navigate your site.
ContentKing states a URL is generally considered good if it’s:
This way, users can see the whole URL and its keywords in the search results before they click.
You should also try to include keywords in your URLs, so rather than something like this:
example.com/categories/jk13d3
You should have a URL that contains keywords from your product page. So if ID jk13d3 stands for “Levi Jeans” on your site, you could change the URL to:
example.com/categories/levi-jeans
Another tip is to make sure you separate the words in your URLs with hyphens. Google may not be able to parse your URLs (eg. example.com/soccershoes) if you run the words together. Keep in mind Google treats hyphens as spaces (eg. example.com/soccer-shoes) but treats underscores as characters that join words, so hyphens are the safer separator.
Tip #3: Choose Your Keywords Wisely
The keywords that you incorporate into your eCommerce store are incredibly important. That’s why it’s imperative to do research and see what your direct competitors are using. This can be done just by going to their websites and compiling data or using a keyword research tool. Here are just a few tools that might work for you:
When picking what keywords to target there are a couple of factors you’ll want to look at:
Usually, you want to target keywords with a high search volume, high CPCs, and less competition.
Once you know what keywords you’re targeting, start incorporating them into product descriptions, landing pages, and headlines. You’ll want to continue to monitor these keywords and see how they’re performing, and make adjustments when necessary.
Tip #4: Attract Buyers With Your Page Title
When searching Google, your page title is the first thing online shoppers will see, and it influences whether or not someone will click. And what’s the right length for a title tag? As Moz states, “While Google does not specify a recommended length for title tags, most desktop and mobile browsers are able to display the first 50–60 characters of a title tag.”
Here’s a formula for writing a great page title:
Keyword | Additional Keyword | Business Name
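Following that formula, a hypothetical title tag for a cashmere sweater page might look like:

<title>Cashmere Sweater | Women's Cashmere Crewneck | Example Boutique</title>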
Here’s what this looks like when you search “cashmere sweater” on Google:
As you can see, these page titles are short and sweet. It’s important not to stuff your titles full of keywords, as this can get you into trouble with search engines.
Tip #5: Create a Content Strategy for Your Audience
To create a great content strategy you can’t just create blog content, stuff in a bunch of keywords, and call it a day. Instead, you’ll want to start off by thinking about your target audience and what type of content resonates most with them.
Dive into your analytics and look at things like the age, gender, and location of people who buy your products. Another big factor is how your audience likes to consume your content. Do they use Instagram? Prefer videos or white papers? Short or longer-form content?
Start thinking about what sets you apart from your competition. From there, you can develop original content that aligns with your brand and your audience’s interests. Make sure to do keyword research, SEO competitor analysis, and look at keyword gaps. And finally, get started on writing your unique content! Content can be for:
Once published on your Shopify website, you’ll want to monitor performance to see what page content your audience is enjoying and interacting with. This will impact your content plan moving forward.
Tip #6: Watch Your Keyword Density
Keyword density – also called keyword frequency – is the number of times a keyword appears on a webpage compared to the overall word count. In the past, keyword stuffing, or putting as many keywords as possible into content, was very common. Now Google penalizes the page rankings of sites that keyword stuff.
Want to figure out your keyword density? It’s easy to do! Simply divide the number of times a keyword is used on your page by the total number of words on the page.
For example, say your content had 30 keywords and overall 2,000 words:
30 / 2,000 = 0.015
Multiply that by 100 to get the percentage, and you get 1.5%.
For keyword density, there’s no perfect amount although “…many SEOs recommend using approximately one keyword for each 200 words of copy.”
There are also some free tools to help you calculate keyword density if you want to streamline the process:
Tip #7: Build Quality Backlinks
What’s a backlink? Backlinks are links from one website to another. They’re important because these links tell Google and other search engines that your site has domain authority.
So when a bunch of other websites link to your website this increases your website’s domain authority with search engines, therefore also increasing your SEO.
Not all backlinks hold the same weight though. You want backlinks from websites that have high domain authority. For example, a backlink from a page on Shopify would be extremely valuable since they are an authoritative website.
There are a couple of strategies you can use when link building:
Tip #8: Optimize Your Images with Alt Text
Today, nearly 38% of Google’s SERPs show images. That’s why it’s important to capitalize on another source of organic traffic – images.
Alt text, or an alt tag, is the copy that will appear if an image fails to load. If someone is using a screen reading tool, the image will be described using this copy.
In order to add Alt text to your products Shopify provides these instructions:
Again, you won’t want to keyword stuff your alt text. Instead, focus on only one or two keywords and be descriptive of the product. If a product has text on it, make sure to include that.
A good example is the picture above of a bag of Doritos. In the Alt text they include the product name, flavor, size, and amount of product included. This tells the reader exactly what the image is without going overboard with copy.
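In markup, alt text like that might be sketched as follows (the file name is a placeholder):

<img src="/images/doritos-nacho-cheese-9-25oz.jpg"
     alt="Doritos Nacho Cheese flavored tortilla chips, 9.25 oz bag">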
Tip #9: Deliver a Great User Experience (UX)
You may have heard the term UX thrown around before – it stands for User Experience. For an eCommerce website, this means things like easy site navigation, visitors being able to find the products they’re looking for, and a simple process for adding items to a shopping cart and checking out.
How do SEO and UX work together? SEO drives traffic to your site, and UX gets that traffic to convert.
You should take these factors into consideration to create a great UX.
Mobile Friendly – In 2021, 72.9 percent of all retail e-commerce is expected to be generated via m-commerce. That’s why your site needs to work and look great on both desktop and mobile. Make sure your homepage, navigation, images, and copy all display correctly on mobile devices. Go through the process of purchasing a product and ensure that process is seamless. Shoppers will immediately switch to a different site if they experience any issues.
Deploying these tips across your Shopify store can be all the difference in driving real clicks to your website. If you are not quite ready to take your SEO strategy into your own hands, reach out to an e-commerce SEO expert to get started.
For anyone who wants to drive organic traffic to their website from search engines, it’s important to have a strong understanding of the role keywords play in getting your website in front of your target audience. There are thousands of ways that users might search for products or services like yours. So what are keywords and their role in SEO strategy?
Keywords are the words and phrases that people enter into search engines. Searchers use keywords to find what they are looking for, and search engine bots use keywords to understand what kind of content users want.
Another common way of thinking about keywords is the term search intent. Keywords communicate the search intent of the user, for example if they want to make a purchase, compare products, read reviews, or something else.
There are four primary types of search intent every business owner should attempt to address with their web page content:
Within these four categories, there are two popular subsets of keywords:
It’s important to note that more than 70% of all keyword searches are long-tail. The majority of people use conversational queries when using search engines. Publishing multiple pages on your website that target long-tail keywords can be an easy, affordable way to expand your market share.
When a user performs a search relevant to your business, we want that user to see your website listed in the search results. It doesn’t matter if you have the best products/services on the planet if a potential customer can’t find you.
There are two ways to use keywords to show up in searches:
You can show up in the search results for various keywords by creating search ads that will display whenever a user searches for the keywords you’ve selected.
Example of a paid search advertisement
After you create an ad or ad campaign, the platform you are using (ex: Google Ads) will ask you to enter your target keywords. Pro tip: if you’re just getting started with ads, less is more when it comes to keyword targeting; it’s best to start with just a handful of high-value terms.
The other way to show up in the search results for a keyword is to optimize your site for that keyword. This is a two-part approach: first, make sure your site has content relevant to that keyword; second, get some other sites linking back to yours and referencing that content.
Buying ads costs direct money (you pay per click), and optimizing for organic search costs indirect money (time and effort to create content and optimize pages). So we want to narrow down our focus to just the keywords that are likely to provide your business the most value.
Remember: ads can be quick wins, but will cost you more over time; they’re like renting space in the search results. SEO investments may take slightly more time, but you’re building search equity and earning your spot more permanently.
Let’s put search intent aside for a moment, and look at the other ways we can establish which keywords are good targets for your business.
Here are examples of important keyword metrics as seen in the Keyword Researcher. Let’s walk through what each of these metrics are and what they mean.
Keyword Difficulty (KD)
This is a metric that measures how difficult it is to rank in search results for this term organically. The scale goes from 1-100, with 100 being most difficult. Websites with higher Domain Authority stand a better chance of ranking for more competitive keywords.
How do I use this info?
The best way to use Keyword Difficulty is as a part of your decision for whether or not you’re going to target the keyword with organic optimization.
If the Keyword Difficulty score is really high, but the CPC is really low, you may be better off buying ads than trying to rank organically.
Example from the Above Keyword Searches
The keyword enterprise seo has a KD of 81. This is a very competitive keyword to rank for organically, so for a website just starting out, this keyword would likely be too competitive to target.
However, the keyword enterprise seo services has a KD of 54, meaning less competition to rank in organic positions but very similar search intent.
Cost Per Click (CPC)
This metric is how much advertisers are paying to target the keyword in Google Ads campaigns. A high CPC tells you that other businesses think searches using this keyword will result in a sale. The higher the CPC, the more valuable the sale is to other businesses.
How do I use this info?
For paid media, the benefit of low CPCs is obvious — lower costs. But for SEO campaigns, CPC can be used to help determine keywords that are good candidates for organic optimization.
Using CPC in conjunction with KD is a great way to find organic keyword targets that are realistic to rank for but still have strong buying intent.
Example from the Above Searches
The keyword enterprise seo services has lower keyword difficulty, but the highest CPC on the above list. If your brand invested in on-page SEO for your landing page, off-site SEO, and some supporting blog content, your brand could probably rank on the first page of search results for this term within 3-6 months. In the long run, that might be more cost effective than paying for search ads over time.
Search Volume
This metric tells you how many users are performing searches for the keyword each month. A high search volume tells you that if you rank on the first page of the search results for the keyword, you are likely to get a lot of users hitting your site.
How do I use this info?
If you’re looking at two keywords that have similar keyword difficulties, similar CPCs, similar search intent, but one has higher search volume – that’s the one you should target with organic search optimizations (i.e. improving your onsite content to target that term).
Example from the Above Searches
All of the above searches have a meaningful search volume. The significantly higher search volumes for the keywords enterprise seo and seo for enterprise likely explain their significantly higher Keyword Difficulty scores. Although ranking for those keywords would be great, remember it is more competitive. So for organic SEO purposes, targeting the terms with 480 and 500 search volumes can still have tremendous benefits.
The true VALUE of a keyword is the number of converting users that keyword can draw into your site. The number of users is measured by search volume and the conversion likelihood is measured by CPCs and search intent.
Remember – don’t get lost in the data! A keyword could have great volume, high CPC, low difficulty, and STILL be a bad choice for your specific business if the term has nothing to do with what you sell or offer or the content on your website.
High-value keywords are ones that bring CONVERTING users to your site.
Indicators that a keyword is likely to convert:
500 conversions are better than 50 conversions (remember: in the best-case scenario, position 1 of the search results will average a click-through rate of about 33% of all traffic for the keyword). For example, here is the traffic share by position for the keyword enterprise SEO.
Make sure you choose keywords that have meaningful search volume, otherwise you will be optimizing your website for no one.
You can use keyword research tools like Google Keyword Planner or our dashboard to do keyword research. Ideally, each one of the landing pages on your website should be targeting a different keyword, meaning you’ll need to find multiple relevant keywords to deploy a comprehensive SEO strategy.
You can also use the Content Researcher tool in your dashboard to identify whether or not a keyword is a realistic target for your website. Here’s a video tutorial walking you through the process.
If you are still in the early stages of determining whether to spend your digital marketing budget on SEO or paid ads, make sure you consider all of the important keyword information above. Ideally, you will likely want some combination of both strategies.
Invest in SEO Improvements in the following scenarios:
Low-Hanging Fruit
Quick wins.
Precision Strikes
Quick wins.
High-Return Investments
These are terms that will take you longer to rank for, but will have a huge impact on your business.
Consider buying Ads in the following scenarios:
Work Smarter Not Harder
While We Wait
If you’re interested in learning more about the best keywords to target for your site, sign up for a free account and check out our Keyword Researcher.
Here are some of the key metrics you can check with the help of the Keyword Researcher:
You can easily add keywords to a list and export to a spreadsheet to share with key members of your team.
If you want to target multiple keywords with every web page on your website, you may want to consider a keyword clustering strategy. If you need help with choosing the right keywords for your SEO campaigns, schedule a meeting with one of our SEO strategists.
If you’re looking for ways to improve the SEO of your website, content pruning is an effective method to consider.
Content pruning is all about removing any unnecessary content from your website, which can help your website rank higher in search engine results and can also improve user experience.
In this article, you will learn how to do content pruning for SEO, from identifying what needs to be pruned to taking steps to ensure that it’s done correctly. Keep reading to find out more.
Content pruning involves a thorough review of all content on a website to identify any content that could be seen as irrelevant or low quality from the perspective of search engine algorithms. Once identified, content pruning involves removing those low-quality pages from the website or replacing them with better content.
A part of regular website maintenance is making sure that all of the content on the website is up-to-date and relevant to the website’s goals. This can help ensure that a website has content that is useful to its visitors and provides value to users who arrive from search engines.
By removing low-quality or outdated content, websites can see improved search visibility for their highest-value, highest-converting web pages.
Content production takes time and resources. So you may be wondering: Why remove content from my website after all the work it took to create it?
Although it may feel counterintuitive to your content strategy, content pruning can actually have major benefits to your search engine performance. This is even more true for websites that have a robust SEO content strategy and are publishing new web pages on a regular basis.
Some of those benefits include the following:
Any content that sits on your website and doesn’t pull its weight in either traffic or conversions isn’t actually bringing value to your business.
By taking the time to regularly prune their content, content managers can ensure that their website is performing at its best. That’s why sometimes content pruning is the right choice for your content strategy.
Here are some of the qualities to look for when searching for content on your website that may need to be pruned.
Bad content can have a negative impact on a website’s rankings. Low-quality content can include pages with duplicate content or thin content, as well as pages that are not user-friendly or are low on useful information. Content pruning can help to identify and remove such pages, allowing search engines to easily index the more relevant, higher-quality content on the website.
Duplicate content is web pages with the same content or similar content that search engine crawlers do not identify as distinct. Google does not want to see duplicate content on a website unless it has been clearly identified as such via a canonical tag.
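For reference, a canonical tag is a single line in the page’s <head> that points search engines to the preferred version of the content (the URL here is hypothetical):

<link rel="canonical" href="https://example.com/blog/complete-guide">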
Thin content is often short-form content that doesn’t provide any real value to the user. Although there is no exact content length that Google privileges in search engine results, experiments have shown that longer, in-depth content tends to rank higher.
Most often, thin content can be combined with other pages on your website to provide a more comprehensive, in-depth answer to a topic, question, or keyword query. Combining content, or redirecting thin pages to more in-depth ones, is also a part of the content pruning process.
The reality is, the content on your website will become outdated over time. This is why creating evergreen content is important; however, it’s unlikely that your long-form content will last forever without the need for updating. Trends, technologies, and knowledge will change, and web pages should include the most up-to-date, useful information for search engines.
Also, outdated information can be confusing for visitors and lead to a poor user experience. Removing outdated content can ensure that visitors are presented with the most relevant and useful information.
If you have a web page on your website that does not get traffic or conversions, what value is it bringing your business? If the web page does not rank in search results, does not convert users, and is not a vital part of the buyer journey, it doesn’t really have a place on your website unless you take the time to improve its content.
You can use the Page Pruning tool in the dashboard to discover pages that may be eligible for content pruning.
To find the tool, navigate to Site Auditor > Page Pruning.
This tool will show you any pages that are eligible for pruning due to a variety of reasons:
Remember, just because a page appears on this list doesn’t mean that it has to be pruned/deleted, but that it may be eligible based on its performance metrics.
Once you have reviewed the software’s suggestions and confirmed that the pages are eligible for pruning, here are your next options.
The underperformance of the page may stem from the fact that the content is thin or is not providing a comprehensive answer to the user’s questions.
You can look to the “Boostable” tab in the Page Pruning tool to identify those pages that just might need a slight content score boost.
The URLs that are listed here are already earning organic impressions but are not seeing as much success in organic traffic. Most likely, Google sees those pages as relevant but is not ranking them on the first page as a result of the content.
You can use the SEO Content Assistant in your dashboard to discover ways to strengthen and improve your content. Or, use the on-page audit tool to see what on-page elements may be impacting your performance.
Follow the guided suggestions for focus terms, headings, questions, and internal links. Include them on the page to make the content more rank-worthy.
If your content covers trends or keywords that have seasonal search volume, that may explain their underperformance.
Consider updating the content with more evergreen information so the content has a longer shelf life in the SERPs.
Also, make sure that the information on the page is up-to-date with accurate, relevant information. Over time, links may break or content may become outdated. Updating your blogs and articles every 1-2 years should be a part of your regular website maintenance.
If both the SEO Content Assistant and the on-page audit tool confirm that your content has high scores and is rank-worthy, you may just need a bit more of a link boost.
Backlinks are Google’s number one ranking factor. If you don’t have very many backlinks pointing to your web page, that may be a reason why it is not ranking on the first page.
You can use the backlink research tool in your dashboard to see which of your web pages have the least amount of link equity.
Consider investing in a link-building campaign in order to improve the off-site signals of the content. Doing so is likely to improve the overall keyword rankings, impressions, and organic clicks.
Another possible explanation for your content’s poor performance may be keyword related.
Some keywords are more competitive than others. If you optimized the page for a keyword that is out of reach, reoptimization may be your next step.
When choosing keywords for SEO, you want to make sure your website has a realistic chance of ranking for the target keyword. Websites with higher Domain Authority will stand a better chance of ranking for competitive keywords.
We suggest targeting keywords with a difficulty score that is less than or equal to your website’s Domain Authority.
So once you find a more realistic goal, optimize for that keyword instead. This will likely involve changing metadata, website copy, and headings on the page. But it can make a huge difference in improving organic performance.
A page may be flagged for pruning because Google is ranking a more helpful piece of content on your website.
This is known as keyword cannibalization. It happens when two pieces of content are very similar and Google doesn’t know which to promote. If there is a page that ranks less often but is similar in relevance, you can do your “content pruning” by adding a 301 redirect from the less comprehensive page to the better-performing one.
If you have a series of pages that are thin on content but relate to a similar cluster of keywords, consider combining those pages into a more useful, long-form resource.
Why? Because Google likes to promote content that satisfies users’ search intent. That means not only answering their initial questions but all of the additional questions that might follow regarding that primary topic.
So before giving up on that set of keywords entirely, combine those various pages into one page. Then, see if the overall keyword rankings improve.
This is the last step you want to consider after you have concluded that none of the above steps help to elevate content performance.
The reality is, if a piece of content is not driving website traffic, converting users, or serving as an essential part of the buying journey, it doesn’t really deserve space on your website.
Take the ultimate step and trim that content branch off your tree.
Making content pruning a regular part of your website maintenance is a good habit to get into. This is especially true for websites that publish a lot of content and have a robust SEO strategy.
There is so much that goes into launching a new website. Purchasing a domain name, finding a great web designer, and writing website copy are all essential steps in a successful website launch. Most site owners invest a lot of money and time into the design and readability of their websites. But without any website traffic, no one will even see the high-quality website they have worked so hard to build.
That’s why SEO is so important for new websites. Executing a careful SEO process with your new URL can help you earn keyword rankings, protect you from Google penalties, and save you time in the long run. Starting with a good SEO plan can prevent site owners from having to make changes to foundational elements of their website down the road, like the URL structure, pillar pages, or overall site architecture.
Those new site owners who implement the best practices of search engine optimization are better situated to quickly show up in search results and start earning new site visitors and customers. For new site owners who want their hard work to be rewarded with organic traffic, these 10 SEO practices are the best approach for building your new website on a solid foundation.
If you don’t have the resources in-house, connect with SEO professionals to ensure your new website has what it takes to launch straight into the search results.
Keyword Research | Site Architecture | On-Page SEO | SEO Tracking | Page Experience | XML Sitemap | Local SEO | Link Building | PPC | Content Strategy
Any new business or website entering a niche market needs to have a concrete understanding of their target audience. In the world of SEO, businesses connect with their target audiences through keywords and search terms. Keyword Research is the process of identifying and selecting the search terms that your target audience is already using to search for products, services, and content like yours.
Although getting traffic from search is the ultimate goal, increased traffic is not helpful if the visitors who arrive at your new site aren’t actually in the market for the products or services your brand offers. If you select keywords that are irrelevant to your industry niche, organic traffic is unlikely to translate into increased conversions. If you select keywords with low search volume, it is unlikely you will drive many clicks simply from a lack of impressions.
The best keywords, then, will be those that your website stands a strong chance of ranking for and converting from. Keywords with higher CPCs are more likely to drive qualified traffic and leads to your website. Why? Because if there are companies out there willing to pay top-dollar to rank in a PPC campaign for specific keywords, it’s likely the individuals using those search terms are converting.
It’s also important to consider the keyword difficulty of your target keywords. One of the most common mistakes that new site owners make is targeting keywords that are far too competitive. Like any new business or product, your website has to prove its value and usefulness to internet users over time. Choosing keywords with SERPs that are dominated by well-established websites with huge backlink profiles is not a smart strategy for new websites. Instead, identify longer-tail keyword phrases that are less competitive and provide opportunities for your new website to start ranking right now.
In this example, “business intelligence services” is a great keyword target because it has lower organic difficulty, reasonable search volume, and a higher cost-per-click.
As you grow your backlink profile and domain authority over time, you can reoptimize your landing pages and web content for those more competitive search terms. The majority of searchers never make it past the first page of the search results, so if your website is unlikely to get to page 1 for your main keyword target, it’s best to go back to the drawing board and find a more realistic keyword.
Site architecture refers to the way your new website is structured. Not only is site architecture essential for users to navigate your website with ease, it’s also important for the search engine bots that crawl and index your landing pages. Because your site’s architecture is an essential part of whether or not your landing pages show up in Google search results, it’s important to identify the primary keywords you want to rank for and structure your website accordingly.
The homepage is the natural starting point of your new website, but it’s not necessarily the first page users will come across in search results. A strategic site architecture will give new site visitors strong footing regardless of which page they land on first. Users should always know where they are in your website and where they need to click for the content they seek next.
A common metaphor for site architecture is the pyramid. Your homepage forms the top of the pyramid, and beneath it, you will have the primary category pages of your site. These category pages should be able to encompass all of the content on your website. Beneath those category pages you will have the specific pages that naturally fit within each category (e.g., for ecommerce sites, these likely include product pages). Your URL paths should not only reflect your site architecture, but also be short and keyword-rich to be as SEO-friendly as possible.
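As an illustration, that pyramid might translate into URL paths like these (all hypothetical):

example.com/ (homepage)
example.com/services/ (primary category page)
example.com/services/seo-consulting/ (page within the category)
example.com/services/seo-consulting/local-seo-audit (specific page)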
The header and footer of your new site should include links to key pages so users can quickly navigate to important, valuable content. Your internal linking structure will provide the pathways for Google’s crawlers as well, and it will also distribute PageRank across your website, an important Google ranking factor. Having breadcrumbs on your site will also help users and Google bots understand your website’s architecture.
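As a simple sketch, a breadcrumb trail is often just a small navigation element near the top of each page; the page names below are hypothetical:

<nav aria-label="Breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/womens-shoes/">Women's Shoes</a> &gt;
  <span>Running</span>
</nav>

Marking breadcrumbs up with schema.org’s BreadcrumbList vocabulary can also make them eligible to display in your search snippets.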
All of these SEO site structure tips can be completed by your web developer on the backend of your website. Popular CMS platforms like WordPress make it simple to add breadcrumbs. If you don’t think about these structural elements early on, you could be forced to make changes down the road that require an extensive site migration with redirects, URL changes, or even a website redesign. This can cost you money and time, and can even lead to a loss of organic search rankings.
Once you have identified the keywords your ideal audience is using, you need to optimize your content for those keywords. On-Page content includes all of the visible and invisible content of your website. Google crawls all of it to understand the quality, relevance, and authority of your content in relation to your competitors.
Optimizing your landing page copy is easy with a tool like the Landing Page Optimizer. Enter your target keyword into the tool, and our software will scan the top-ranking SERP results to identify key terms and phrases that Google associates with the search term. Add the suggested Focus Terms into your content to improve its topical depth and overall content score, and to elevate your chances of ranking.
If you’re using a popular CMS like WordPress, Magento, or Shopify, optimizing the backend of your website is fairly simple. Still, make sure that the web developer you are working with has SEO knowledge and can ensure the HTML elements of your website, like your page titles, meta descriptions, and heading tags, are optimized for your target keywords. Even better, have them use schema.org markup on the backend of your new website so Google crawlers can more easily crawl and index your web pages and improve the quality of your search appearances.
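For reference, here is a minimal sketch of how those core elements appear in a page’s HTML, using the hypothetical “business intelligence services” keyword from earlier:

<head>
  <title>Business Intelligence Services | Example Co.</title>
  <meta name="description" content="Learn how our business intelligence services help growing companies turn raw data into decisions.">
</head>
<body>
  <h1>Business Intelligence Services</h1>
</body>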
As you add new landing pages to your website, optimize those pages for other relevant keywords or long-tail phrases that will expand your market and increase your overall keyword rankings. Also, make sure that each piece of content you create has good information architecture that includes internal links back to your primary category pages. By doing so, you will help Google better understand how all of your web pages relate.
Learn more about the best practices of on-page content optimization with our comprehensive ebook.
There are many different tools and software that you can use to track the performance of your website, but Google Analytics and Google Search Console are the primary tools every site owner should set up to track how their website is performing. With the key data and metrics these free tools provide, you can start understanding how users are interacting with your new site and how Google crawlers are ranking your web pages.
Google Analytics allows you to understand where your website traffic comes from, how long users stay on your website, how many pages they view, and other key metrics. With Google Search Console, you can measure your keyword rankings and search engine performance. Metrics like impressions, clicks, and click-through-rates can help you understand which content on your website is performing best in search engines and which pages are driving the most traffic to your website.
There are other tools and software that can help you monitor your overall SEO strategy. Once you begin to acquire backlinks for your new website, a tool like the backlink analyzer can help you identify spammy or toxic backlinks, perform competitor benchmarking, or identify new link building opportunities.
New Website SEO Tracking Checklist:
An optimized website with high-quality content and high-performing UI/UX is key to driving organic traffic to your site. Users are going to press the back button on their browsers if your web pages take too long to load, don’t perform well on mobile devices, or don’t provide a high-quality user experience as they click from landing page to landing page.
In 2021, Google began factoring the page experience of a website into its ranking algorithms. Google’s crawlers evaluate the page experience and usability of your new website primarily through its Core Web Vitals metrics, which measure speed, load times, interactivity, and visual stability.
You can monitor your new website’s Core Web Vitals in your Google Search Console account. GSC will provide you a score of “Good”, “Needs Improvement”, or “Poor” for every page of your website. If you see multiple pages with a “Poor” rating, you may need to make some upgrades to your web hosting, web design, or other technical aspects of your website.
The more new pages you add to your website, the more important an XML sitemap will be. Sitemaps communicate to Google crawlers which web pages of your website are the most important, make it easier for Google to find and crawl all of your pages, and help Google better understand the hierarchy of your website.
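At its core, an XML sitemap is just a structured list of your URLs. Here is a minimal sketch following the sitemaps.org protocol, with placeholder URLs and dates:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2021-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/womens-shoes/</loc>
    <lastmod>2021-01-10</lastmod>
  </url>
</urlset>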
There are tools that can help you make your own sitemap, but consulting with an SEO expert is a good idea to make sure you accurately identify the highest-performing, most important pages of your website.
For larger and enterprise-level websites with hundreds to thousands of landing pages, there may be pages that don’t generate conversions and don’t need to show up in search results. An SEO expert can implement noindex tags to ensure Google doesn’t rank those pages in its search engine results pages.
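The noindex directive itself is a single meta tag placed in the head of any page you want excluded:

<meta name="robots" content="noindex">

Google can still crawl a page carrying this tag, but once the tag is seen, the page is dropped from search results.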
Once you have your sitemap complete, upload it in Google Search Console. GSC will notify you of crawl errors so you can make adjustments accordingly.
After your on-page optimization is complete, the next step is to work on implementing an off-site SEO strategy for your new website. Off-site SEO is all about creating signals on other websites that show search engines like Google and Bing that your new website is trusted by lots of people on the internet. These off-site signals play a big part in Google’s ranking algorithm.
An easy way for your website to start building these signals is to get listed across business directories in your industry. For local businesses, local citations with up-to-date, accurate information about your business are particularly important for appearing in location-based searches. A local citation service can get your new website and business information listed across hundreds of business and online directories really quickly.
And for websites of all sizes, not just local businesses, creating profiles on the most important third-party review sites in your industry is essential for building your off-site signals. Google crawls and aggregates data from these sites and considers the quality of your reviews in its ranking algorithms. Make sure you are monitoring them regularly for any negative content.
It’s also a good idea to incentivize customers you’re confident had a positive experience to leave reviews. Point them toward the review sites that will have the most impact and are likely to show up in the top-ranking spots for searches with your brand name.
One of the most important search engine ranking factors is your overall site authority. When your new website launches, your Domain Authority score will be zero, because you have not yet earned a reputation among internet users, search engines, and other webmasters. If you want to improve your Domain Authority score, you’ll need to earn backlinks.
Google measures site authority primarily through backlinks, or links to your website from other web pages. The logic goes that if another website links to yours, they must find your content trustworthy and valuable. Earning links on other websites is one of the most important parts of any off-site SEO strategy, and it’s commonly known as link building.
Creating original content that includes links to your website, and pitching that content to other websites, is one of the best ways to earn backlinks in a way that is Google compliant. Webmasters are always on the hunt for good content, and links that come from reputable websites in your industry will be more valuable and move the needle on your site authority even faster.
There are many link building strategies out there. From guest blog posts to broken link building, getting creative is key in earning new links that are outside of your existing network. Building up your backlink profile is one of the most essential steps in improving your keyword rankings and driving organic traffic to your new site.
Link Building Checklist for New Websites:
The reality is, search engine optimization takes time. Getting your web pages ranking organically for competitive keywords does not happen overnight. It will likely take months for your new site to establish its reputation with search engines. So in the meantime, a carefully executed PPC campaign can help you start generating clicks, traffic, leads, and revenue while you continue to do the work of link building and content creation.
A good way to think about PPC campaigns is like “renting” traffic. Once your budget runs out, those clicks will no longer come. Overall, SEO is still a better strategy than Google Ads campaigns because owning those top spots means more traffic in the long term. For this reason, you can use your PPC campaigns to prospect traffic quality for SEO. If the keywords you pay to rank for result in conversions, it will be worth your effort to try to rank organically for those same terms.
PPC can get very expensive though, so before you devote the majority of your new website’s marketing budget to PPC, make sure you are working with a reputable digital marketing agency that has a proven track record of creating optimized PPC campaigns.
Most site owners launch their new website with a homepage, their primary category pages, and other key product pages. However, if you never add new pages to your website, you will struggle to drive traffic for the long-term. Internet users are always looking for new content, and if your website doesn’t offer anything new, users will quickly grow uninterested and will probably not return to your site.
Not only do new landing pages mean additional keyword rankings; multiple studies have shown that websites with over 40 landing pages have higher conversion rates. Having many long-form, high-quality landing pages that are valuable and useful to your site visitors gives those visitors more reasons to navigate your site without having to bounce back to the SERPs or search for content somewhere else.
Blog posts are an ideal way to constantly create new content that targets long-tail keywords, creates content marketing opportunities, and builds authority, expertise, and trustworthiness. A strong content strategy also benefits your off-site SEO, because great content assets that live permanently on your website give more reasons for others to link to your new site. Overall, a strong content development strategy is essential to moving your website from launch to long-term traffic.
We all know that keywords are essential when it comes to optimizing your website for better rankings in search engines. It was once common practice to choose one keyword to optimize each piece of content for. But with newer, more complex search engine technology, optimizing for keyword clusters has become the more standard SEO practice.
Google has gotten better at understanding content, so your keyword strategy also needs to get more advanced. Using a keyword cluster model to drive your overall content strategy can help bolster your search engine results and help you outrank competitors.
Here is a guide on everything you should know about keyword clustering.
In short, keyword clusters are groups of keywords that represent searches with similar consumer intent.
Also known as keyword groupings, these “bunches” of keywords are paired together because they represent the same overarching intention.
Users don’t always search for products, services, or answers to their questions in the same way. For example, say your business is an e-commerce brand that sells women’s athletic shoes. Some possible ways that users might search for your products include:
All of the above search queries display the same intent, which is to purchase athletic shoes for women. If you only optimize your content for one of these terms, you’ll miss out on thousands of users who are looking for products and services like yours.
The reality is, Google usually ends up ranking your web pages for multiple keyword phrases. With a keyword cluster model, you can be more strategic in making sure Google ranks your web pages for the similar terms that your target audience is using.
Creating a keyword cluster involves being more thorough with your research and more strategic about keyword targeting. It requires a strong understanding of your audience and the types of terms they use to find products, services, or content like yours.
Any SEO professional understands that before you start with any type of keyword targeting, you must do your research.
You not only need to see what keywords users are searching for around your search term or topic. You also need to know which of those terms are more valuable and display the greatest conversion potential.
Example of Keyword Research in Google Sheets
And when we say research, we don’t just mean finding a few keywords. Done correctly, extensive keyword research involves putting together a list of hundreds to even thousands of keywords that might bring potential customers to your website.
When thinking about what kinds of keywords to add to your list, ask yourself the following questions:
Once you have some ideas about what primary keywords and overarching topics you want to target, take the time to identify as many variations of the keyword and topic as possible.
This means all long-tail keyword phrases, pillar topics, synonyms, and related subtopics. While there is no perfect number to shoot for, when you finish your keyword research you should have a couple hundred keywords to work from. This will give you a good number to help you build out multiple keyword clusters.
To find essential keyword metrics for the many keywords you’ll need in your list, you will want to use a tool like our keyword researcher. Features like “Suggested Keywords” will help you find similar keywords quickly and then easily add them to a list that you can eventually export to a CSV file.
Our Keyword Researcher tool simplifies your keyword strategy by giving you all the data and insights you will need to create keyword clusters to improve your website’s content.
Our tool will provide the data for:
Once your list of keywords is complete, you’ll want to take your list and identify similar themes. Chances are, you may have already noticed some themes pop up while you were collecting your research.
The various patterns you might see will guide your keyword clusters. Some examples to look for are:
This goes back to natural language processing. Are there certain groups of words that are synonyms and share the same search intent? The more similar the keywords, the easier it is for Google to crawl your landing page and gain insight on its subject matter.
The core keywords in your clusters need to have a reasonable search volume. This shows that users are actually searching for those terms.
While long-tail phrases will naturally have lower search volume due to their specificity, make sure any long-tail terms you include in your cluster still display strong conversion potential in their cost-per-click metrics.
Most people know what keywords are, but understanding how to choose keywords can be a bit more mysterious. When presented with search volume, cost-per-click, and keyword difficulty, it’s easy to feel overwhelmed, especially when you are creating your content strategy. When it comes to keyword metrics, keyword difficulty is often considered the most challenging to understand.
So, what is keyword difficulty, and how does keyword difficulty factor into your SEO content strategy? This article will explain how this metric is calculated and how you can use keyword difficulty to improve your website’s search visibility.
Keyword difficulty is a metric used in search engine optimization that estimates how much of a challenge it would be to rank for that keyword. Some platforms use the terms SEO difficulty and keyword competition instead of keyword difficulty.
A higher keyword difficulty score indicates how difficult it would be to be displayed on the first page of Google’s search engine results page (SERP). A high score means that there is fierce competition among the websites that are already ranking for that keyword.
Keyword difficulty is measured on a scale of 0 to 100. The closer to 100, the more difficult the keyword would be to rank for. The nearer to 0, the easier the keyword is to rank for.
So, where does this number come from? Most SEO tools use a range of factors to calculate this score. Some of these include the competing domains’ ratings and your own domain rating.
We use a weighted formula that takes into account the rating of the domains that rank for the specific keyword, the traffic share that frequents the top-ranking SERPs, and other nuanced factors. Domain rating is calculated by the number and quality of backlinks to that particular website.
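To make the idea concrete, you can think of a keyword difficulty score as a weighted blend along these lines (a purely illustrative sketch, not the exact proprietary formula):

KD \approx w_1 \cdot \overline{DR}_{\text{ranking domains}} + w_2 \cdot \text{TrafficShare}_{\text{top results}} + w_3 \cdot \text{other factors}, \qquad w_1 + w_2 + w_3 = 1

The weights themselves vary from tool to tool, which is why the same keyword can receive different difficulty scores across platforms.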
You want to compete in your own weight class. If your competition is a domain rating giant, your content won’t be able to compete, and it will become buried in the SERPs. Keyword difficulty helps you target keywords that you can actually be discovered for. This, of course, leads to higher traffic and increased conversions.
Furthermore, keyword difficulty allows you to strategically invest your time, money, and other resources in efforts that will pay off. This makes creating an SEO content strategy easier.
Choosing the right keywords is key to SEO success. When selecting what keywords you can rank for, you will want to take into account keyword difficulty, search volume, CPC, and search intent.
The first step to creating a successful SEO content campaign is narrowing down where to focus your efforts. This is part of the beauty of keyword difficulty: it allows you to identify words you can realistically rank for, saving you time and earning you organic traffic.
Most SEO content specialists suggest starting with pillar pages, then clustering supporting target keywords around them. A pillar page is a long-form guide, landing page, or blog that focuses on a primary or core aspect of your business or website. Cluster content relates to the pillar topic but gives you the opportunity to expand upon smaller details or aspects of the pillar topic.
You should have a handful of pillar pages that relate to your industry. You can add cluster content as needed and as new keyword opportunities arise.
A pillar structure allows Google’s web crawlers to more easily assess your website’s semantic content and give you kudos for overall topical depth.
For example, if you run a spa, your pillar topics will feature the primary aspects of your business. These may be massage, skincare, acupuncture, hot stone treatments, and waxing. From there, you can expand upon the topic. For example, you can have cluster content pieces for the benefits of each individual type of massage.
Once you’ve defined your pillar topics, you can then begin to create cluster content ideas based on keyword research.
With established pillar topics, you can begin your keyword research for your content clusters. When performing your research, your goal is to find keywords, or search terms, that are the most relevant to your business and that you will be able to rank for. This will require research.
Often the simplest way to choose keywords is to list subtopics for your pillar pages. Then, plug those topics into a keyword research tool to explore potential keyword choices.
The SEO suite allows you to quickly perform research and nail down your supporting keywords. With the software suite, the Keyword Discovery tool is an excellent choice for this step.
Simply type what keyword you want to research into the search box. Then, explore suggested keywords by viewing all. If you have a brick-and-mortar business or only ship within one geographic zone, you will want to refine your keyword location.
Review the list of keywords and select keywords by Search Volume (SV) and Pay-Per-Click Difficulty (PPCD), which is similar to keyword difficulty but takes paid campaigns into account.
Select keywords with a difficulty score that is lower than your domain rating or domain authority. Keep in mind that low-competition keywords have low difficulty scores. This ensures you have the best chance of appearing in the search results for that keyword. Watch this video for more information on how to use Keyword Difficulty and Domain Authority in your keyword selection.
Once you have a list of keywords you want to create cluster pages around, you can use SEO content creation tools to optimize for those keywords.
In the suite, navigate to the Content Assistant tool. Then select create a new article. Enter your selected keyword into the Add keyword field. The Content Assistant will run that keyword through our proprietary analyses and display the keyword’s key metrics.
If you hover over the keyword, you can select the graph icon to display a full report of the keyword’s metrics.
You will also find suggested target keywords you can add to the same article. You can add up to 5 target keywords per piece of content.
From there, you will be given a list of focus terms to include that relate to each particular keyword. These terms help improve the value of your content for search engine algorithms.
With the keyword researcher tool, you will find the keyword difficulty scale prominently displayed below the keyword you are researching. This scale is color-coded by difficulty level and labels the difficulty score of the particular keyword.
0-25 = Easy
26-50 = Average
51-75 = Hard
76-100 = Very Hard
A simple search will return a wide array of keyword difficulty tools. While each one uses a slightly different method of calculating keyword difficulty scores, you will gain insight from whichever you choose. When comparing competitor software, keep in mind that you want to choose the best for your overall SEO needs. This often includes:
Our Dashboard offers a full suite of SEO tools, including a range of tools to help users select competitive keywords that they have the highest chance of appearing in the top organic search results for.
There, you will find:
There are a plethora of free keyword difficulty software choices available, including the Keyword Research Tool and the SEO Content Assistant.
Keyword Difficulty & Your Website
With the right keywords, you can take your content from underperforming and undiscoverable to Google’s first page of search results. Become the author of your site’s future by taking the next step toward mastering keyword selection, page authority, and SERP rankings. Our Dashboard makes keyword difficulty metrics easy to understand and guides you through the steps of producing the best content in your niche.
When you enter a phrase, question, or topic into the Google Search bar, Google returns relevant results that offer you an answer or solution. So how does Google know exactly what you’re asking for?
In order to improve the relevance of the SERPs, Google has had to get better at understanding human language. Thanks to natural language processing algorithms and machine learning, Google is one of the most literate robots out there.
So what can website owners do to leverage the power of Google’s NLP models? It’s called semantic SEO. It’s all about creating content that shows Google that your content has topical depth, relevance, and quality.
Semantic SEO is the process of creating your content around topics, instead of one or two keywords. It is when you build more meaning into the content you create. Also, it involves thinking about the true intent of your readers and how the various landing pages on your website interrelate.
A piece of content that is optimized properly for semantic SEO not only will answer the question a user has now but will answer the second, third, and fourth question they may have after reading. It is all about adding more depth, meaning, and reason to your content.
There are 3 problems that content creators face when dealing with content on a search engine:
Google is smart, but it’s still a robot. Several Google algorithm updates have focused on helping Google’s crawlers understand human language, but while their machine learning is quite advanced, the bots still can’t truly speak it. This is where semantic writing comes in handy.
The concept of semantic search can help with all 3 of the above problems. Using this method not only helps a content creator develop topics that answer their users’ intent, it also positions the content so that more than one question can be answered in more than one way. The goal is to add depth to your content and to frame it around longer queries and topic clusters rather than a literal keyword match.
To get a better understanding of what exactly semantic SEO is, it’s important to understand how the data behind language is processed.
Natural language processing (NLP) is how computers work to understand human language and infer meanings behind what is spoken.
NLP models are building blocks of communication between humans and computers. New advancements in NLP are happening all the time, like with SMITH and GPT-3.
With every new algorithm update, search engines like Google get better at understanding human language.
Google released an update back in 2019 that aided them in semantic analysis and NLP. The BERT update (Bidirectional Encoder Representations from Transformers), was one of the most important algorithm updates in years and affected 10% of all existing search queries.
In short, BERT helped Google better understand what the words in a sentence mean. It also helps Googlebot gain insight into the context around certain words.
Here are a few different examples of linguistic artificial intelligence capabilities that are used by NLP models like BERT.
Semantic mapping is the act of exploring connections between any word and phrase and a related set of words and concepts. A semantic map will visualize how terms work together, and the search intent of the consumer.
For example, if a person simply searches for the term “pizza for dinner,” they could be looking to order a pizza from a local restaurant. Or, they could be looking for a recipe to make one at home.
So how does Google know what the true search intent is? It’s all about the terms that the consumer searches alongside the main “pizza” keyword. If they are looking to make one at home, they could be searching for ingredients and proper oven temperatures. Or, if they’re looking to order, we can assume their search query will include the terms “near me” or descriptive words like “best” or “great.”
Semantic coding is the process of using code to better explain to Google what types of information can be found on each page.
One popular example of semantic coding is schema.org. Schema markup is a semantic vocabulary of tags, or microdata, that you can add to your HTML. It improves the way search engines read and represent your web pages in relevant search results. When you use schema markup, you tell Google exactly what is on your page, and how you want to present it.
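For example, a blog post might be marked up with JSON-LD, the schema.org format Google recommends; all of the values below are hypothetical:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "What Is Semantic SEO?",
  "author": {
    "@type": "Person",
    "name": "Jane Doe"
  },
  "datePublished": "2021-03-01"
}
</script>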
There are other vocabularies and tags that you can add to help Google understand your content. Header tags such as H1-H6 map out the multiple subheadings and breaks in the content.
There are also other semantic tags that can be used for emphasis, quotations, citations, acronyms, definitions, and thematic breaks.
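Here is a brief sketch of several of these semantic elements in action:

<h2>What Is NLP?</h2>
<p><dfn>Natural language processing</dfn> (<abbr title="natural language processing">NLP</abbr>) is how computers <em>understand</em> human language.</p>
<blockquote cite="https://example.com/source">Search engines are getting better at reading the way humans do.</blockquote>
<hr>

Each of these tags gives Google a little more context about the role a piece of text plays on the page.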
As mentioned before, there are many benefits to crafting content with natural language processing in mind. By crafting your content towards how NLP models work, you can earn more keyword rankings, better SERP positions, and more organic traffic.
Google wants site owners to add structured data to their website so they can better understand their content. If you have specific products, events, or jobs that you’re trying to promote, there is no reason not to add the appropriate markups so your pages can appear in rich results.
When a user clicks onto your website, they want to find a solution to their problem. They’re looking for the easiest and best answer, and if they have to spend time hunting around for information, they will most likely move on to another website.
That’s why you’ll want to provide as much relevant, and contextual information to them as soon as they land on your website.
This means not only using content to create meaning around a particular topic, but also including helpful internal links and anchor text.
You need to think of your customer’s journey. Instead of putting all the information they may need on one web page, a strong internal link structure can work wonders. It both answers users’ questions and boosts your own SEO. However, you will want to create links with relevant anchor text that matters to the reader, to ensure they click on them!
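In HTML terms, the difference comes down to the anchor text between the link tags; the URL below is hypothetical:

<!-- Vague anchor text: tells readers and crawlers nothing -->
<a href="/semantic-seo-guide/">Click here</a>

<!-- Descriptive anchor text: meaningful to both -->
<a href="/semantic-seo-guide/">our guide to semantic SEO</a>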
Keyword research is, in a nutshell, complex. Most digital marketers just do simple keyword research on individual keywords that are relevant to their products and services.
But just think of how many different types of words there are in the English language! Your keyword research should really aim to capture all the different ways that users may search for products or services like yours. For example, there are Chrome extensions that can help you better understand other ways users are searching for similar queries.
This includes verbs, adjectives, related questions and phrases, subtopics, and LSI keywords. LSI (latent semantic indexing) keywords are search terms that are relevant to the main keyword you are targeting.
When you enhance your keyword research and double down on content creation, you will have great success in creating relevant content. That content will also appear higher up in the search results for more relevant keyword phrases.
There are many benefits to writing content with topical depth. Although longer content is not technically a ranking factor, the advantages of creating more in-depth content are clear in the SEO research and the rankings.
Even if you only want to rank for one primary keyword, Google usually ends up ranking your web pages for multiple keyword phrases. Why not hit multiple birds with one stone when writing your content?
Pieces of content with topical depth tend to explore more subtopics and questions related to the primary keyword target. This broadens the reach of the content within the SERPs.
The idea is simple: the more you write about multiple related subtopics, the better your chances of improving your visibility across multiple search results.
Above anything else, optimizing for multiple keywords will ensure that you have more opportunities to drive traffic to your website.
Not only does Google read the content you promote on your website, they take a look at how people are consuming it.
In a quest to ensure the best user experience possible, Google only wants to promote high quality web pages. If they see that consumers are bouncing from your website almost as soon as they land there, they will believe that your website isn’t relevant or valuable.
Because in-depth content requires longer landing pages, your visitors will scroll further and spend longer on the page. That is, as long as the text, images, and rich media are quick to load.
Topical depth allows you to dive deep into the topic at hand. But as you add topical depth to the page, make sure that you don’t sacrifice other key parts of the user’s experience, including site architecture, links, and easy navigation elements like jumplinks.
Have you ever read a piece of web content that’s stuffed full of the keyword the site owner wants to rank for? Not only is it difficult to read, it devalues the entire experience.
As stated before, semantic SEO is not hyper-focused on one keyword. By writing with semantic SEO in mind, you actually improve the readability of your content.
When you use words related to your main topic, you’re able to give the reader more context, which improves readability tenfold. Readability matters for both user intent and SEO, as it improves engagement with the page.
Engagement is one of the most important metrics that Google pays attention to, and it ties directly into improved search rankings and visibility. And who wouldn’t want that?
Allow for Easy Conversions
The whole purpose of writing content is to get some sort of conversion, whether it is a phone call, an email subscriber, or a purchase.
So every bit of content you create will need to serve its own purpose. If your content is stagnant and doesn’t inspire users, then what is the point? Content ties all of your marketing efforts together, and fantastic content will allow for an easy conversion.
When users find content that is in-depth and answers their questions, they will be more likely to buy. They will also more likely see your brand as an industry authority.
Our SEO Content Assistant uses NLP algorithms and machine learning to help site owners practice semantic SEO.
With this tool, any digital marketer can take their target keyword and create high-quality, in-depth content. The higher the content score, the more likely the content will rank well in Google and push the user toward conversion.
Our SEO Content Assistant uses semantic technology. It compiles a wide variety of related phrases, LSI keywords, and shoulder topics from your specific keyword and main topic. Additionally, the software will suggest focus terms, multiple on-page elements to use, and keyword frequency that your content needs. This will help your web page outrank competitors for high-value keywords in your industry.
As a result, you will have dozens of Focus Terms to use in your content that help to enhance its topical depth. The SEO Content Assistant allows you to optimize your content for up to 5 keywords, but make sure those target keywords are related and have similar relevancy.
Our easy interface makes it seamless to import your content from existing URLs, export new content into Google Docs, and collaborate on your projects with other members of your team.
All in all, this tool will enhance your content and make it a valuable asset to your digital marketing goals.
Semantic SEO is a fantastic tool not only for search engine optimization purposes, but also for reaching your users at the perfect time for conversion.
By creating content that works to speak to a consumer in natural language, instead of stuffing keywords unnaturally, consumers are more likely to gain the overall context of your main topic.
Pillar pages and topic clusters have become increasingly popular in the world of SEO, offering a powerful way to boost your website’s search engine rankings. This approach to SEO content enables marketers to create comprehensive content hubs with multiple related topics, linking them together to create a powerful library of content that optimizes their website for more overall visibility. Continue reading to learn more about pillar pages and topic clusters as well as how our SEO agency can help you use a topic cluster model to your advantage.
For starters, topic clusters are a popular content marketing strategy used to create an information hierarchy that is easily navigable for readers. They are made up of the main page, known as a pillar page, and several other pages, known as cluster pages. The pillar page serves as an overview of a specific subject while the cluster pages provide more detailed information.
Topic clusters are connected by internal links, which allow readers to easily find other topically-related content. This also helps to keep readers on a website as well as improve their understanding of the subject.
For example, we have a topic cluster on our website centered on link building, which is one of our agency services. Our link building page is considered the “pillar page,” and we have created a variety of “cluster content” that explores that main topic more in-depth. Some of the blog posts in that cluster include:
All of these blog posts link back to the primary pillar page, helping create a cluster of content that Google understands is topically related and full of in-depth information about the topic of link building.
Topic clusters can help to increase website visibility in search engines, as their algorithms tend to favor websites that are well-organized, authoritative on a topic, and have a strong internal linking profile.
As an example, let’s say that your business is a personalized travel agency catered toward ethical tourism.
One of your pillar content pieces would include a page optimized for the keyword “ethical travel companies.” Within the pillar content pieces, you’ll also want to target keywords like “ethical travel agency” or “eco travel company.” The total number of topic clusters on a website will be determined by the services the business offers.
For example, you can create pages about sustainable travel packages or reasons why ethical travel is important. Your business can also create another content cluster around a larger topic, such as how to become a more “sustainable traveler.” This cluster could potentially explore the following topics:
This is just one example of how you can create a well-structured, easy-to-scan cluster. This makes it easier for search engines to crawl your cluster topics, and easier for potential customers to browse them. It’s all about diving deeper into the informational side of your services and showing a search engine that you can provide value to your target audience.
Yes. In the past, websites would simply create blog posts to match popular keyword phrases. Each web page would be optimized to rank for a specific keyword, and there was no attention placed on the content of the site in its entirety.
But search algorithms began to change. The updates made them better at determining the expertise of a website. Now, they search for in-depth content, shifting their focus from keywords to topic areas. As a result, content strategists have now moved toward using topic clusters to organize their content. By building a content plan and implementing it appropriately, strategists have learned how to create a significant impact on SEO results, improving overall site authority.
With the current state of algorithms, not using a topic cluster strategy would put your webpage at a disadvantage, particularly if your competitors are enterprise-level. Topic clusters exhibit the quality indicators that Google looks for, and creating content with these qualities will produce the best results.
It’s important to realize that topic clusters are a long-term strategy that allows your site to gain search rankings for broad, overarching keywords. Building a content cluster out on your website won’t happen overnight. In the beginning, topic clusters can help your website gain search traffic for longer tail keywords that are less competitive and elevate the performance of pillar content for more competitive, higher search volume keywords in the long term.
Topic clusters will also help your brand build E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) in certain topics so that Google sees your website as an industry leader.
To create a successful SEO topic cluster, it is important to first research relevant keywords and phrases using a keyword research tool. This will help identify the topics that are most relevant to your website’s target audience. By creating a cluster of related topics, the website’s SEO will improve, as it provides more opportunities for search engines to index and crawl the website’s pages.
Below, you’ll find step-by-step instructions on how you can use topic clusters to improve your search engine optimization.
If you want to build topic clusters on your website, here are the following steps to take:
Look below to find a more detailed breakdown of each step.
Begin by choosing a topic and doing some keyword research. This topic area should be relevant to your products, services, content, or areas of expertise. For example, if you are a real estate broker, some relevant topics may be “real estate agents,” “commercial real estate,” or “investment properties.”
You can use the content planner tool from the dashboard to help you identify possible subtopics for your topic cluster.
For example, the following keyword clusters were provided from the keyword input, “commercial real estate.”
You can also use the built-in AI content generator to help you develop a relevant title for your cluster content. For example, here is a suggested article title for the keyword cluster “small warehouse space.”
Keyword cluster and Topic Idea generated in the Content Planner
Content Creation
Once you’ve identified keywords and a blog topic idea, it’s time to start creating high-quality content. This is essential for every website page, and you’ll want to use the appropriate list of related keywords provided by the content planner. When creating any web page, it’s important to make sure that it’s well-optimized for your target keywords, so make sure to use a content optimizer like the SEO Content Assistant to confirm your content has strong ranking potential.
Content optimization in the SEO Content Assistant
This tool will show you recommended focus terms and internal links to include in your content in order to improve its topical depth. If it’s going to rank well, your cluster content will need to provide value and meet the search intent of your audience.
The final step is to link your cluster content to your pillar page. This means creating links from your pillar landing page to your other content pages, and from your other content pages back to your landing page. You can also link to other pages within that same cluster.
Internal linking is an important part of the SEO topic cluster optimization process, as it helps to establish the overall hierarchy of your content. By following these steps, you can create a well-optimized topic cluster for your website. And by doing so, you can improve your website’s search engine ranking and attract more website visitors.
You can continue building out topic clusters on your website for as many topic areas as are relevant to your products or services. For example, we have topic clusters related to our primary agency services, including link building, content marketing, technical SEO, and more.
Eventually, your website will look like this:
Topic Clusters Internal Linking Profile
Here are some things to keep in mind to improve the performance of your topic clusters even more.
Creating a pillar page is a bit different from creating a traditional blog post. This page aims to offer well-organized content that keeps visitors on the site longer and provides better signals to Google that your site is authoritative. It’s essential to create a well-thought-out pillar page to drive up engagement, increase page views, and appear as an expert on a particular subject.
The entire goal of a topic cluster is actually to elevate the rankings of your pillar pages. Those pillar pages should serve as the top of the cluster hierarchy, and you will want to be sending link juice from your homepage directly to them.
As a general rule, pillar pages should target more competitive keywords while your cluster pages target long-tail versions of keywords that represent subtopics of that topic area. Again, your pillar page is the most important, and you want your cluster content to elevate those pillar pages, not compete with them.
Linking to a pillar page of another topic cluster may confuse search engine crawlers about which page is the most important. Although you may want to link to pages outside of the topic cluster, avoid linking to a pillar page of a separate cluster.
Used incorrectly, a topic cluster strategy can sometimes result in keyword cannibalization. This occurs when multiple web pages on a single website target similar keywords or phrases, so Google doesn’t know which one it should promote. Search engines will then often prioritize the page with the most optimized content over the other pages, resulting in the other pages not appearing in search results.
When identifying cluster content, make sure that your subtopics are distinct enough for search engines to understand they need to be promoted for a different set of keywords. If you end up experiencing some type of keyword cannibalization, you can correct this by combining some of your similarly focused cluster content into one page.
Then, add 301 redirects from those old pages to the newer, more comprehensive page.
You can use the GSC Insights tool in your dashboard to track the performance of your topic clusters. This can help you get a good idea of how your website is performing for the key “topic areas,” that search engine crawlers associate with your website.
Navigate to the “Page Groupings” feature and start assigning your topic cluster content to the same group as the accompanying pillar page.
Tracking Topic Cluster Performance in GSC Insights
From the above data, I can see that our link building topic cluster is performing very well, meaning Google sees our website as authoritative in this topic area. If we add new link building content to our website, Google is more likely to promote it, as we have already proven ourselves authoritative in this topic area.
Adding more cluster content to our Technical SEO and Content Creation clusters, as well as investing in link building for those content clusters, will help Google start seeing us as more authoritative in those sub-niches of SEO as well.
Remember that SEO topic clusters do not happen immediately and require consistent content creation efforts. However, taking the time to build out all of these pieces of content can have drastic impacts on your organic rankings and help Google rank your web pages for years to come.
Domain Rating (DR) and Domain Authority (DA) are two metrics SEO experts rely on to understand the authority of websites. SEO strategists regularly use DA and DR as key performance indicators for their campaigns or to benchmark how websites measure up against competitors.
DA and DR are separate authority scores with their own unique calculations, but both are regularly used in the SEO industry to represent ranking potential. In general, high DA and high DR scores are seen as reliable predictors that a website will perform well in search results.
As a result, many SEO professionals have posed the question of which metric is more reliable. Is DA or DR more accurate in predicting better SEO performance? The study below doesn’t settle that debate, but we did set out to compare DR and DA and to better understand their relationship to other SEO performance indicators like organic traffic.
Domain Authority is a metric created by Moz that predicts the ranking potential of a website. It is scored on a 0-100 scale. According to Moz, the higher a domain’s DA score, the more likely web pages from that domain will rank well in search engine results.
DA is calculated by evaluating several factors, but most significantly, the inbound links pointing to a domain. The link data used to calculate DA scores comes from Moz’s Link Explorer web index.
Domain Authority is designed to be a comparative metric, meaning earning a 100 score is not the ultimate goal. Site owners can use DA to measure the ranking potential of their own website in comparison to other domains in their industries.
Domain Rating (DR) is a proprietary metric from Ahrefs that is designed to measure the strength of a domain’s backlink profile via total number of backlinks and total unique referring domains. In addition to quantity, Ahrefs also considers the quality of the inbound links pointing to a domain while calculating DR scores.
DR is on a logarithmic 0-100 scale with higher scores indicating stronger, more robust backlink profiles. The link data used to calculate DR scores comes from Ahrefs’ large link index.
Unlike DA, DR is best described as a representation of backlink profile strength rather than ranking potential. But because backlinks and unique referring domains are some of Google’s top ranking factors, Ahrefs’ DR scores are also often used in the SEO community to quantify how well a domain will rank in search engine results.
DA/DR and Google
It’s important to note that neither of these metrics are Google ranking factors. However, because both authority metrics rely on similar factors that Google uses in their ranking algorithm, higher scores often have a higher correlation with better search engine rankings.
For this DR vs. DA comparison, we used the below metrics for 9,739 active websites.
The below scatter plot charts Domain Rating and Domain Authority scores along with estimated monthly organic traffic. Organic traffic is charted on a logarithmic scale with green representing higher organic traffic and red lower organic traffic.
Based on the upward trend, we can make the following conclusions:
Where the data diverges from this trend is in the group of domains where DA scores are significantly higher than DR scores.
This outcome may be due to Ahrefs’ greater emphasis on higher-quality links and more authoritative referring domains. Ahrefs may simply not assign as much value to the links pointing to those domains as Moz does, resulting in higher DA scores but lower DR scores.
Regression analysis is a mathematical way to understand whether one variable impacts another. In the below regression analyses, we attempted to isolate DA and DR to understand their unique relationships to organic traffic.
Organic traffic as explained by DR | intercept = -141730 slope = 6709.6
As expected, there is a positive correlation between higher DR metrics and higher organic traffic.
Organic Traffic Explained by Moz DA | intercept = -295370 slope = 7908.4
Like DR, higher DA scores also correlate with higher organic traffic. DA showed a slightly stronger relationship with organic traffic than DR, though only minimally.
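Reading the reported coefficients as simple linear fits, the two models can be written as:

\text{Organic Traffic} \approx 6709.6 \cdot \text{DR} - 141730

\text{Organic Traffic} \approx 7908.4 \cdot \text{DA} - 295370

In this sample, each additional point of DR is associated with roughly 6,700 more monthly organic visits, and each additional point of DA with roughly 7,900. As with any regression, this describes correlation in aggregate, not a guarantee that raising a score causes more traffic.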
When comparing Domain Rating vs. Domain Authority, it is clear that both have earned their place in the SEO community as useful metrics for predicting SEO performance. DA and DR have a positive relationship with each other, as well as with higher organic traffic to a domain.
Utilizing both metrics for comparative and SEO analysis can be beneficial, and increasing domain authority and domain rating scores can have a positive relationship with other SEO metrics like total keyword rankings and increased organic traffic.
Have you ever heard SEOs use the term “backlink profile,” but are not quite sure what it means?
Improving the strength of your backlink profile can help Google perceive your website as more trustworthy and authoritative.
If you’re new to backlink profile analysis, this guide will cover all of the terminology related to backlink profiles and teach you how to analyze the strengths and weaknesses of your backlinks.
Then, you can leverage that information to develop a more impactful off-site SEO strategy.
A backlink is a link from another website that points to your website.
Backlinks are important because they are one of the main factors that search engines use to rank web pages. The more high-quality backlinks a web page has, the more likely that web page is to rank in the SERPs for relevant keywords.
If your website has lots of backlinks, it shows Google that other webmasters on the internet find your content valuable. It’s the primary way that search engine crawlers determine whether a website is trustworthy, reputable, and worthy of being promoted in the SERPs.
Let’s say I was writing an article about Google search advertising and wanted to mention a statistic about search ad conversion rates. To cite my claims and provide more information to my audience, I may choose to link to this informative list of marketing statistics from Hubspot.
By doing so, I just created a backlink for Hubspot.
If I were to link to another relevant page on my same website, that would be considered an internal link. Although good for SEO, internal links are an on-page SEO signal and should not be confused for backlinks.
An inbound link is only considered a backlink if it comes from another website.
A backlink profile is the complete list of all of the backlinks pointing to your website.
The number and quality of these backlinks is a key factor in how well your website ranks in search engine results pages (SERPs).
But there are other key factors that SEOs use to evaluate the overall strength of your backlink profile. To do backlink analysis successfully, it’s important you understand the terminology associated with this evaluation process.
Total backlinks is the total number of backlinks pointing to your website.
In general, websites with more backlinks will outrank websites with fewer backlinks.
However, it’s not just a numbers game. If your competitor has more backlinks than you, that doesn’t guarantee better keyword rankings.
The quality of the websites that link to yours is a big factor in determining backlink profile health and ranking potential.
When a website links to yours, that website is considered one of your website’s linking domains, or referring domains.
Essentially, that website can send you referral traffic because of the link they have included to your website in their content.
Say your website has a thousand backlinks, but they all come from the same three websites. Although that’s a good number of backlinks, it does not signal a strong backlink profile, because only three other websites on the internet find your content valuable and trustworthy enough to link to.
That’s why in addition to backlinks, the total unique referring domains in your backlink profile is another top Google ranking factor.
The more unique referring domains in your backlink profile, the more likely Google is to promote your web pages, as your content is clearly trusted by many people across the internet.
A quality indicator for referring domains is their Domain Authority or Domain Rating scores.
Domain Authority is a metric from Moz that estimates the site authority (or overall reputation) of a website.
Backlinks that come from authoritative domains with higher site authority scores will be even more valuable.
Backlinks from websites with very low DA or DR scores can actually weaken your backlink profile, as Google crawlers will see that your website is keeping company with questionable web properties.
If your referring domains also rank for multiple keywords and earn organic traffic, that also increases the value of a backlink.
Why? Because if those searchers are finding their content valuable, and that linking domain is linking to your content, then search engines understand that your content must also be valuable.
In layman’s terms: I know a guy, who knows a guy.
Every time a website links to another website, it sends along a portion of its PageRank (or link equity) to the linked page. The more PageRank the linking page has, the more link equity that backlink sends along to you.
PageRank is a patented metric that Google uses to understand the value of each individual page in its index. The more PageRank a page has, the more likely it is to rank in search engines.
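For the curious, the original PageRank formulation published by Google’s founders captures this link equity idea; modern ranking systems are far more sophisticated, so treat it as conceptual background only:

PR(A) = (1 - d) + d \cdot \left( \frac{PR(T_1)}{C(T_1)} + \cdots + \frac{PR(T_n)}{C(T_n)} \right)

Here T_1 through T_n are the pages linking to page A, C(T) is the number of outbound links on page T, and d is a damping factor, originally set around 0.85.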
Google used to let users know how much PageRank a web page had, but not anymore. Page Authority, then, is a metric SEOs now use to estimate PageRank.
Backlinks that come from referring domains with topical relevance to yours will be more valuable than backlinks from websites outside your industry.
For example, if you sell pet products on your website, links from veterinarian clinics, pet stores, pet blogs, or animal publications will all be topically relevant.
But if your backlinks come from websites focused on appliance repair, from the comment section of random blog posts, or other irrelevant sites, Google may suspect your website of purchasing those links or trying to falsely elevate your backlink profile strength.
When you start looking for link building opportunities, you will want to focus on websites that share content relevance.
The anchor text of your backlinks also influences the health of your backlink profile.
Although you do not have control over how other webmasters link to your site, Google will be looking at the anchor text of your backlinks to understand what your content is about.
The majority of your anchor text will include your brand or business name, but a healthy diversity of anchor text signals that the backlinks your website is earning are through organic, Google-compliant practices.
Google also considers where your backlinks are on the linking page. Is it in the body of the text? The comment section? In the caption for an image?
The linking location tells Google quite a bit about your website. If the webmaster of a high-quality website links to yours in the body of their article, they likely trust your reputation.
If a user links to your website in a comment on another webmaster’s blog post, Google considers that questionable linking and will trust your website less.
Toxic backlinks are backlinks that harm your backlink profile for any of the reasons listed above: low authority scores, irrelevant content, questionable placement, or manipulative anchor text.
A big part of backlink profile analysis is identifying any toxic links and trying to reduce their impact. Historically, SEOs have used Google’s Disavow tool to discount the impact of toxic links, but Google’s 2021 Link Spam update has helped Google crawlers better identify and nullify low-quality links.
This means that although the low-quality site may be linking to yours, it’s not harming your backlink profile health, because Google is not counting it against you.
So now that you understand the terminology associated with your backlink profile, you know how to successfully analyze it to determine its strengths, weaknesses, and overall health.
How can you then leverage that information to improve your own SEO performance?
If your competitors have stronger backlink profiles than yours, you are unlikely to outrank them for search queries with higher Keyword Difficulty scores.
By analyzing the total links and total referring domains of your backlink profile against your competitors’, you can get a sense of how aggressive you’ll need to be with your own link building strategy.
Seeing who is linking to your competitors can give you a list of websites that you may want to reach out to for link building.
If those referring domains are finding your competitors’ content valuable, they may be willing to link to your content as well (as long as your content is similarly high-quality).
You can use the Link Gap Analysis tool in your dashboard to identify common referring domains among your competitors.
This can help you easily and quickly get a list of outreach targets and then reach out to those relevant webmasters and bloggers.
One of the best ways to improve your website’s backlink profile is to actively pursue high-quality backlinks from authoritative websites–also known as link building.
You can do this by publishing high-quality content that other websites will want to link to, participating in relevant online communities, or engaging in more technical strategies like broken link building.
Now that you understand what a backlink profile is and how to analyze it, you can use a backlink profile analysis tool to evaluate your website’s backlinks more closely. You’ll be able to leverage your new knowledge and our software together to start building a healthier, stronger backlink profile.
If you care about SEO, you probably know how important backlinks are to your website’s ability to rank in search engine results. But what you might not know is the significant impact that the anchor text of those backlinks has on your SEO performance. Just as the Domain Authority and topical relevance of a linking website impact how Google perceives you, your backlink profile’s anchor text diversity, or lack of it, can influence your SEO performance.
According to Moz, anchor text is an important attribute that determines a link’s value. Google pays close attention to anchor text and relies on it to understand what web content is about.
When other websites link to yours, they don’t always link in the same way. That’s why a wider variety of anchor text is beneficial to your backlink profile, because it looks more natural and organic to Google crawlers.
In contrast, if all of your backlinks have the same anchor text, or keyword-rich anchors, that will appear as possible manipulation or black-hat link building. If Google suspects your website might be trying to unnaturally elevate your rankings through suspicious techniques, it can harm your SEO performance in the long term.
The biggest challenge of monitoring the anchor text in your backlink profile is that, for the most part, you have zero control over how other webmasters choose to link to your content.
Link building is already hard enough work as it is, and the added necessity of being strategic about anchor text can make it feel harder. But you should already be monitoring every backlink you earn–whether organically or through outreach–to ensure each one ends up helping you rather than causing harm.
So if you have already started earning backlinks but have not yet started paying attention to anchor text, it’s time to dive in. Here’s some introductory knowledge of how Google views anchor text, and also some best practices to make sure your backlinks don’t sink you into the internet void. We want to make sure you swim right up to the top of the SERPs.
Anchor text is the clickable text in a hyperlink that directs to another website or location. Google relies on anchor text and the words that surround it to understand the subject matter of the linked page.
But not all types of anchor text bring the same value to your backlink profile. In the eyes of Google, keyword-rich anchors, generic anchors, and naked URLs each have their own nuance.
As you begin to accumulate backlinks, it’s a good idea to familiarize yourself with the different types of anchor text so you can ensure that your backlink profile always has a healthy level of variety.
There are multiple types of anchor text that might be appearing in your backlink profile. You can find a more detailed run-down of each type here, but these are the most common types you will begin to see as you embark on your link building efforts.
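As a quick illustration of the categories you are likely to encounter, here is a short Python sketch. The type labels are the standard industry ones; the example anchors and the pet-store URL are hypothetical.

```python
# Common anchor text types, each paired with a hypothetical example
# for an imaginary pet-supply site at https://example-petstore.com.
anchor_text_types = {
    "exact match":   "pet supplies",                  # the exact target keyword
    "partial match": "affordable pet supply store",   # target keyword plus modifiers
    "branded":       "Example Petstore",              # the brand or business name
    "naked URL":     "https://example-petstore.com",  # the bare URL as the anchor
    "generic":       "click here",                    # no keyword or brand signal
}

for anchor_type, example in anchor_text_types.items():
    # Print each type as it would appear in an HTML link.
    print(f'{anchor_type:>14}: <a href="https://example-petstore.com">{example}</a>')
```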
These are some core SEO anchor text principles that you should apply to both the linking you do on your site and the links you accumulate from across the web.
The more relevant the anchor text, the more powerful the backlink.
This is because relevant anchor text not only communicates stronger relevance signals to Google, it also contributes to a better user experience.
Those users who follow an exact match anchor text will be annoyed if they arrive at a new web page that has nothing to do with the content implied by the link. And even if a user clicks on a less specific, generic anchor like “Read this,” they still have the expectation that their new destination will have close relevance to their previous one.
So although we are focusing on the importance of anchor text diversity, your anchor texts should never veer off the relevance road in terms of content or industry niche.
You want your anchor text to both accurately describe the linked site to Google and make your user interested in clicking on it. Contextual anchor text then — instead of generic anchor text like “click here,” or “learn more,” — can accomplish both. It provides Google and users more context about the subject of the linked page.
Google will also look to the text before and after the anchor text to understand the relevance of the linked content. So make sure that you are looking closely at the sentences where your anchor text appears and that you don’t miss out on any opportunities to provide contextual clues to Google.
In the early stages of your website’s growth, it seems logical to strive to earn backlinks that have anchor text with the keywords you want to rank for. But too many keyword-dense anchor texts can taint your link profile in the long run, as it may start to look as if your backlinks were not acquired naturally.
In the past, webmasters used exact match anchor text to manipulate their backlink profiles and catapult to the top of the SERPs, regardless of whether their site had any relevance to the search query. Algorithm updates like Penguin have enabled Google to easily identify anchor text manipulation, and advances like BERT mean Google is getting even better at contextualizing the natural language surrounding anchor text.
If you have an SEO professional guiding your link building campaign, they should ensure that any backlinks you earn come from white-hat techniques. Although link building campaigns allow for slightly more agency in the types of anchor text used, backlinks still ultimately come from third parties.
This means they should always have variety and not rely on the same keyword in order to look natural.
Building a diverse backlink profile of anchor texts can have a really positive impact on your SEO performance. In the below graphic, you can see that the SEO community agrees that the anchor text of both backlinks and internal links is a significant ranking factor.
So a regular part of your own website maintenance should involve reviewing backlinks to make sure they are not coming from low-quality, spammy sites, and to also ensure that even the best of your inbound links are using anchor text that helps build diversity.
Although there is no magic number for exactly how many of each anchor type you should have, Semrush offers some realistic benchmarks:
If you are not quite sure how the distribution of your backlink anchor text is impacting your own site, you do have some options via the help of SEO tools and professionals.
A backlink analyzer can give you some much-needed insight into not only the quality of your links, but on the type of anchor text being used. You can use a free backlink analyzer like ours to see where your top competitors earn backlinks and to compare their anchor text diversity with your own.
Frequently revisiting this backlink data provided by SEO tools should quickly become a regular part of your SEO practice. If building links is currently a part of your SEO campaign, you can harness the power of this data to shape the types of target keywords and target anchors that you go after.
Seeking out the help of professionals in the SEO world is always a good idea. An audit can identify technical SEO problems that you may not be able to identify on your own. It’s possible that your problems don’t necessarily lie in your link profile but in other areas like page speed, html tags, or attempting to rank for far too competitive keywords.
But the important thing is identifying the problems before they result in long term consequences. Optimizing is not easy work, but the good news is, there is always someone who can help.
If you learn through an SEO audit or tool that your backlink anchor text has too much keyword density, there are some immediate steps that you can take to change the impact it might be having on your search rank.
You want to resolve over-optimization quickly, because it’s possible that Google could red flag your website if there is too much anchor text optimization. The Penguin update specifically targeted the previously common practice of anchor text manipulation.
So although you can’t choose the anchor text other webpages use to link to you, you can certainly monitor and control the ratio of the different types of anchors in your link profile.
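As a rough sketch of what that monitoring can look like in practice, the Python below computes the share of each anchor type in a link profile. The anchor list and the classification rules are simplified and hypothetical; a real backlink analyzer does this classification for you.

```python
from collections import Counter

# Hypothetical anchors pulled from a backlink report.
anchors = [
    "Example Petstore", "Example Petstore", "pet supplies",
    "https://example-petstore.com", "click here",
    "affordable pet supply store", "Example Petstore",
]

def classify(anchor: str) -> str:
    """Very simplified classification rules, for illustration only."""
    if anchor.startswith("http"):
        return "naked URL"
    if "example petstore" in anchor.lower():
        return "branded"
    if anchor.lower() == "pet supplies":
        return "exact match"
    if "pet suppl" in anchor.lower():
        return "partial match"
    return "generic"

# Count each type and report its ratio within the profile.
counts = Counter(classify(a) for a in anchors)
for anchor_type, count in counts.items():
    print(f"{anchor_type}: {count / len(anchors):.0%}")
```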
You should always strive to remove toxic or spammy links that you acquire. Backlinks that come from sites with no logical correlation to yours should also be removed. Generic anchor text isn’t bad, but the average reputable site will more likely use your brand name or keyword-rich anchors when linking to your site.
As your link profile grows, you will have a lot more leeway. But in the early stages of your site, exact match anchors, or anchors without any relevance, can get you into trouble with Google.
Even if you are desperate for any link you can get your hands on, you need to practice discipline and evaluate how each can impact you in the long run. It’s better to be picky and choosy early on when your domain authority is low.
Your website will thank you later with traffic.
Before disavowing any backlinks, most individuals reach out to the webmaster of the referring domain and request that the link be removed.
So if you earn a quality link but are concerned about the anchor text, you can attempt a similar strategy. However, you should do so sparingly, and really only if there is an obvious error of some kind (like a misspelling of a brand name).
You can send a thank you for the backlink with a request for a slight anchor text change. If the content writer feels your concern is legitimate, they may gladly change it. If they think you’re just being picky and don’t like that they used an exact match anchor or generic word, they may get annoyed. They have the power to remove the link altogether, so use discretion.
This strategy should not be overused. If Google suddenly sees all of the anchor text of your backlinks changing, it’s going to look suspicious.
This is by far the best and smartest option for diversifying your backlink anchor text. However, it does require more time and financial investment. It is not a quick fix like the previous options, but it has much better ROI in helping you secure those top spots in the SERPs.
Link building campaigns give you the option of building links with target anchor texts in mind, whether keyword-rich anchors or branded anchors. Strategies like pitching and placing blog posts, long-form articles, or guest posts give you far more control over the anchor text you need to diversify your profile.
Since targeted link building campaigns are far and away the best strategy for diversifying your anchor text, we are going to take some time to break down the best practices for how to use anchor text while link building. When done correctly, you can target specific anchor texts and use them to build a natural backlink profile.
Long-tail keywords allow you to include exact match anchors without the risk of being flagged by that pesky Penguin.
For example, if you want to rank for the term “seo software,” you could use long-tail variations like “best seo software for small agencies,” “affordable seo software with rank tracking,” or “seo software for beginners.”
The more pages that use similar anchor text to link to your landing page, the more likely that page will rank for those keywords in search engine results. So this strategy allows you to target the exact match anchor through new links with less risk, while also adding anchor text diversity.
The more you know about your competition, the better. Looking at the types of anchor text that have helped your competitors can give you some insights into potential link building targets.
While you’re at it, check out what their ratios of anchor text types are. You can also see what sites they are earning links from, as this will give you some additional information for your link building campaign.
Take note though: Websites with higher domain authority have already built a good reputation with search engines. For this reason, they can have a lot of keyword-rich anchors or exact match anchors without causing any harm.
Remember, the lower your page authority, the more selective you should be about any backlink you earn.
The great bonus of Google getting so much better at understanding natural language is that good, thoughtful web writing is more often rewarded.
Lazy SEO tricks of the past like anchor text manipulation and keyword stuffing often resulted in content that was just unreadable. But now that our search engines can look at both our backlink anchor text and contextualize the words surrounding them, we actually have more options, not less, for how we choose our target keywords and increase our ability to appear for more queries.
So one way to avoid over-optimization and achieve more anchor text diversity during your link building campaign is to use synonyms, LSI terms, and related keywords for your anchor text. LSI terms are conceptually related to your target keyword. Google will understand their relevance, but will not mark them as “exact,” and therefore will not penalize you.
When you have accumulated too many exact match anchor texts, consider embracing your literary side and use some synonyms instead. Overall, this strategy will make for happier search engines, happier reading, and happier link profiles.
For both internal linking and link building, how you use anchor text should vary depending on whether you are linking to your homepage, a landing page, or a blog post.
In general, you want to use keyword-dense anchors on pages with higher PageRank. These are most likely the pages that promote your brand or service, and they therefore earn higher-quality links. Because those pages have already earned credibility and trust, Google is less likely to penalize a keyword-dense anchor on a well-performing page as over-optimization.
If you have landing pages like blog posts with good quality content, use more generic links, partial matches, and keyword variations. The purpose of a blog post is not usually to promote a product or service, but to provide useful information or in-depth knowledge. For this reason, less keyword-focused anchors fit the medium and look far more natural to crawlers.
Overall, more anchor text variety can give you a powerful SEO boost. Just like backlinks vouch for your reputation, anchor text provides an objective description to Google about your site’s content. Because anchor text is chosen by a neutral third party, Google values the input.
A quality link building campaign will focus not only on high-domain authority websites, but strategic, diverse anchor text. So keep anchor text in mind as you continue your journey in link building and website growth.
There are few SEO elements more complex than links. While it’s common knowledge that strong internal linking structures, anchor text, and high-quality backlinks are vital to strong SEO, many content writers tend to gloss over annotation text. This can be a long-lasting and costly mistake. Annotation text for SEO may be the key to better use of anchor text.
Let’s all commit to maximizing the value of our links with well-optimized annotation text. The process is pretty simple. In fact, as an SEO writer, you’re likely 90% of the way there. As for the remaining 10%, this article will walk you through how to write quality annotation text that helps pack every possible ounce of link equity into your internal links, external links, and backlinks.
Annotation text is the text that surrounds an outbound link that a web crawler can use to better understand the linked page. It is highly likely web crawlers will index text beyond the sentence level in relation to a hyperlink. This suggests annotation text can include words and phrases that span the section, paragraph, or even entire document where the hyperlink appears.
Most importantly, Google’s algorithm patent suggests this text is a ranking factor.
Annotation text is an SEO element developed from Bill Slawski’s research into an update of Google’s algorithm patent. Bill Slawski, who analyzed Google’s 2007 anchor text patent, realized there was an interesting discrepancy in how Google described its anchor text indexing. First, the patent describes that the anchor text indexing system will store at least one term in association with the outbound link. However, the patent goes on to explain that the annotation includes a text passage that is within a predetermined distance of an outbound link.
Bill Slawski noticed that Google added geo-semantic indexing for text surrounding the anchor text.
This means that the text within a certain distance of the anchor text is indexed to better understand the meaning of the linked page.
Google’s NLP algorithms use artificial intelligence to better understand human language. This provides Google with a better understanding of any given web page’s content. As Google’s bots crawl a page, they index these NLP signals in order to provide searchers with better search results.
While anchor text is a primary indicator of the content on the linked page, Google’s NLP algorithms also index information from the surrounding text as a secondary indicator.
By including surrounding text in Google bot indexing, Google is able to provide the best user experience since the surrounding text provides insight into the relevance of the outbound link. The better Google understands the linked page, the more precise its search results can be.
Providing Google’s web crawlers with more information about a page is always a better approach when trying to rank for target keywords, especially when it comes to off-site syndication and guest posts.
Furthermore, because most SEO content creators are solely focused on optimizing anchor text, you can gain a competitive advantage by optimizing for annotation text in addition to anchor text.
TLDR: Annotation text gives your outbound links contextual meaning.
This is where things get a little sticky. As we know, Google keeps all of the signals used for PageRank’s indexing secret. “But there’s a patent…” Well, just because Google registered a patent that describes annotation text doesn’t guarantee that PageRank uses this particular patent.
It’s likely that Google does since this patent was registered around the time they were developing their SMITH algorithm. However, Google has never confirmed it.
The word annotation means notes or in-text notation. So, when Google describes the process of pulling annotations through indexing, their crawler is likely pulling hints as to the link meaning from within the text. This, like taking notes on a literary piece, has an amorphous structure.
When it comes to the relationship between anchor text and annotation text, I often liken this to an atom. The anchor text is the dense center: the nucleus. The annotation text is the electrons circling this center. Both the center and the surrounding text define the atom; however, the anchor text is the most reliable in terms of its relationship to meaning, since the electrons can be more difficult to pin down.
This often means there is no perfect version of a page. The web crawler’s system will pull data from a “predetermined distance of an anchor tag.” So, common sense says to make sure the most important words are near your anchor text while you maintain the quality of your content throughout the page.
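To make the idea of a “predetermined distance” concrete, here is a small Python sketch that extracts the words surrounding an anchor. The window size and the sentence are made up; it only approximates the kind of text a crawler might treat as annotation.

```python
def annotation_window(words: list[str], anchor_start: int,
                      anchor_len: int, distance: int = 5) -> list[str]:
    """Return the words within `distance` tokens on either side of an anchor."""
    before = words[max(0, anchor_start - distance):anchor_start]
    after = words[anchor_start + anchor_len:anchor_start + anchor_len + distance]
    return before + after

text = ("Our review of lightweight camping gear covers the best "
        "ultralight tents for backpackers on a budget").split()

# Suppose "ultralight tents" (tokens 9-10) is the anchor text.
print(annotation_window(text, anchor_start=9, anchor_len=2))
# ['camping', 'gear', 'covers', 'the', 'best', 'for', 'backpackers', 'on', 'a', 'budget']
```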
Unsure of what anchor text is? It’s a simple concept. Anchor text is the clickable text presented within a webpage that, when clicked, directs the browser to a URL. Most often, the text is a description or explanation of the information found on the linked webpage.
For example, “an update of Google’s algorithm patent” in the previous section is the anchor text that links to the referenced patent in the Patent Database.
Google’s web crawlers use anchor text to determine the topic of the page to which the link is pointing. This information is gathered as the bot works its way through a sitemap.
No. While “annotation links” or “annotated links” have many meanings in the world of web development, the phrase is not interchangeable with annotation text.
Annotation links can be hyperlinks that allow video viewers to skip ahead to the section of the video that’s most relevant to their needs. Some people use “hyperlink annotation” as a synonym for “anchor link” when a photo, video, or audio file stands in place of text.
So, how can you make the most of annotation text for improved SEO? Most SEO content optimization software, like Yoast SEO, will ensure you have the foundations of keyword basics covered. However, these tools won’t ensure you’ve optimized your anchor text or the surrounding text. With some attention to the text surrounding your links, though, you can turn a basic paragraph into one that fully optimizes your links.
Annotation text gives you the opportunity to refine how Google understands a webpage. To make the most of your internal links, employ annotation text throughout your site whenever you internally link to another one of your pages.
Additionally, when you’re implementing an off-site backlinking campaign, pay careful attention to what you include in the surrounding text.
1. Determine Your Focus Terms
Like developing the best quality content, you want to determine the best Focus Terms to use. To do this, you will want to perform keyword research using SEO software.
2. Place Your Most Important Terms Near the Anchor Text
For this article, we used the dashboard’s SEO Content Assistant. Use the most important terms in the text near your anchor text for the biggest impact. When selecting the most important terms, consider the topical relevance of the Focus Term in relation to the linked page. Even if a Focus Term is your most important term, do not include it if it’s not topically related to the linked page.
3. Incorporate Your Focus Terms Naturally
It likely won’t come as a surprise to you that Google has an aversion to keyword stuffing. This continues to ring true when it comes to the best quality annotation text. While you want to include Focus Terms, you must be strategic about it.
Consider including the most impactful Focus Terms in your annotation text. These are often the most semantically related to your anchor text.
If you’re going to make the most of annotation text, you must consider where the anchor link falls within a piece of content. Furthermore, you must also develop the content surrounding the anchor link.
In order to do so, avoid:
When it comes to the best content and the best links, annotation text provides context to your readers about your link. And Google likely uses your annotation text in addition to anchor text to better understand the page you’re linking to. This results in a better experience for your readers and more depth for Google’s algorithms to parse.
When you optimize the anchor text and annotation text of your off-site outbound links, you have a better chance of ranking at #1 on the SERP.
Any business owner investing in SEO wants to improve their website’s chances of appearing at the top of the search engine result pages. But how do you measure SEO success?
Because SEO is not as immediate as other digital marketing channels like PPC or social media advertising, it can sometimes feel difficult to measure its impact. But it is incredibly important for site owners to understand how to evaluate and measure their SEO results in order to see what optimizations are working, what they need to improve upon, or what tactics they should retire. Also, understanding whether your SEO is effective is essential to quantifying the ROI of your digital marketing spend.
Measuring the impact of your SEO optimizations is actually very simple if you have the right tools. You can also start measuring the impact of your optimizations within 1-2 weeks. Here’s how to accurately evaluate and measure your website’s SEO performance as a whole.
If your business advertises a service but then fails to deliver on your promises, you will likely lose clients. Everyone wants to see results, because results prove the value of your efforts. This is just one reason for site owners to be closely monitoring the impact of their website optimizations, but there are several more.
Google owns about 75% of the Internet’s search market. Additionally, Google averages over 40,000 web searches every second, which equates to a whopping 1.2 trillion searches every single year. All of these statistics just go to show that your potential customers will most likely go to the Internet to find you first, making it incredibly important to ensure your website follows as many search engine optimization best practices as it can.
Because Google will most likely be your primary source of website traffic, taking the time to understand the details of how each page of your website is performing will give you the data you need to refine your SEO strategy and make improvements. The more refined your strategy gets, the more organic traffic that Google will drive to your website.
Sure, you want to make sure your website shows up in search engines. But that is the first half of the battle. Once potential consumers land on your website, you’ll want to do everything in your power to keep them there and ultimately convert them. Tracking your SEO performance not only helps you understand how Google sees your content, but it gives you a lot of information about whether users are finding your content valuable.
It is important to understand how Google works when deciding what to implement on your website. Google’s overall goal is to provide consumers with the best experience possible, so it has an army of web crawlers that index every website on the World Wide Web. Those crawlers figure out what each website is about and match its content to search queries so they can surface the most relevant websites for the user.
With this in mind, it is important for a website to have relevant, informative content for its users, not only for Google to find and index, but for your users to learn about you! With the right SEO metrics, site owners can understand which pages of their website users are finding the most valuable, and then use those pages as models for other pages on their site.
SEO best practices don’t really tend to change drastically, but Google updates its ranking algorithm several times a year. Having a more granular understanding of your landing pages’ performance will help you ride those waves of Core Updates without drastic changes to your keyword rankings. If you are consistent in evaluating your SEO strategy, you will be able to more easily find those components that need to be changed or tweaked as Google improves on its algorithm from year to year.
Keep the following factors in mind when you create a plan for measuring SEO success. You can access all of the below SEO metrics for your website in Google Search Console or Google Analytics.
Keywords are the basis of any SEO strategy, simply because they are what users type into search engines to find your website! So it is safe to say that a website’s keyword rankings offer a great snapshot of how your website is performing within the search engine result pages.
Usually, the first thing any website owner does when creating their website is to complete keyword research. Well, the same goes for measuring if your digital marketing is working or not– keywords say it all.
Your keyword rankings are the positions your web pages hold in the organic search results for a given query.
Under the “Search Performance” tab in Google Search Console, you are able to see the queries users made that allowed your website to show up in the results. It is a good practice to take a look at these every month, and make sure that you use this extra information to your advantage.
Maybe there are keywords you didn’t expect your landing page to rank for — that can be a good or a bad thing! If the search intent is related to your content, that’s great. But if not, you may need to rework the on-page content or HTML tags to communicate more clearly to Googlebots what the content is about.
When looking at keyword performance, it is important to look at growth over time. SEO is a long game, and it takes, on average, three to six months to see significant movement after making a change. You can monitor the progress of certain keywords and phrases, and these insights will tell you which content is working and showing up in the SERPs, and which content may need a little extra TLC.
Having the most relevant content for what your users are looking for is essential to seeing all of your SEO metrics improve. There is a lot that goes into using Google Search Console to track keyword rankings, so take the time to familiarize yourself with the “Search Performance” section of this tool.
Organic traffic is the traffic that comes to your website solely from search engines. It is a powerful metric that can tell you if your website is not only performing well, but if it is worthwhile traffic that has a higher potential to convert.
Consistency is key when it comes to SEO, so it is common to expect a consistent source of traffic, no matter your digital marketing efforts. However, if you notice a huge change, such as an influx of traffic, chances are Google has discovered a few new pages and is starting to rank them within the SERPs. On the other hand, if you notice a big decrease in traffic, your website could have been affected by an algorithm change or a penalty.
Ideally, as your content earns more keyword rankings, your organic traffic will continue to grow. However, if your organic traffic stays flat despite additional keyword rankings, your landing pages may not be ranking high enough, or your page titles and meta descriptions may not be enticing users to click. Paying close attention to which pages are not only ranking but also getting clicks will help you hone in on the on-page elements that satisfy both Google and users.
Not all traffic is created equally, and some traffic channels are more expensive than others. If you’ve ever run a PPC campaign on Google Ads, you know it can be expensive to generate clicks. Economic value of traffic helps you understand how much you would pay for those organic clicks if you had targeted them in a Google Ads campaign.
CPC often correlates to conversion potential. If advertisers are willing to pay a high price to target that keyword in a PPC campaign, it’s likely because the users who click have high search intent and are likely to convert. If you secure organic rankings for keywords with high CPCs, you are getting those same high-quality clicks, but at a much cheaper cost.
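As a worked example of that arithmetic, here is a short Python sketch estimating what your organic clicks would have cost as paid clicks. The keywords, click counts, and CPCs are all invented.

```python
# Hypothetical monthly organic clicks and the average CPC advertisers
# pay for the same keyword in Google Ads.
organic_keywords = [
    {"keyword": "seo software",        "monthly_clicks": 800,  "cpc_usd": 12.50},
    {"keyword": "what is anchor text", "monthly_clicks": 1500, "cpc_usd": 1.20},
    {"keyword": "backlink checker",    "monthly_clicks": 400,  "cpc_usd": 8.00},
]

# Economic value = what you would have paid for the same clicks via PPC.
economic_value = sum(k["monthly_clicks"] * k["cpc_usd"] for k in organic_keywords)
print(f"Estimated monthly value of organic traffic: ${economic_value:,.2f}")
# 800 * 12.50 + 1500 * 1.20 + 400 * 8.00 = $15,000.00
```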
As you evaluate the economic value of your traffic, it is important to crunch the numbers and ask yourself the following questions.
Yes, SEO requires a lot of work upfront and regular maintenance, but it is ultimately much cheaper than PPC and longer-lasting. You can incorporate these costs into your metrics and get a better grasp on the true value that organic traffic brings.
A website’s ultimate goal is to make sure the traffic you’re getting is traffic that actually converts. You need to view your conversions through the lens of quality vs. quantity: while it is always great to have a bunch of new users on your site, if only one visitor out of 100 converts, there is a bigger problem to solve.
If this is the case, take a step back and look at your website from the lens of a consumer. What is getting in the way of them converting? Could it be lack of information, a slow website, problems checking out, or your contact information being hard to find? Just determining the cause of the problem can be the difference between a handful of conversions a quarter and hundreds of conversions a month.
While the above factors are very important when you measure SEO success, the following metrics should also be taken into consideration as well.
Domain Authority (DA) is a metric developed by the SEO software company Moz that predicts how likely a website is to rank in the search engine result pages. DA is scored on a 0-100 algorithmic scale, and while it is not a metric Google uses officially, keeping tabs on your own website’s score can show you how you stand compared to other websites in your industry.
There are a lot of separate factors that go into compiling a DA score, including the number of linking root domains, the total number of backlinks, and the overall strength of a website’s backlink profile. Domain Authority is a great benchmarking tool and a good overall metric for your website’s ranking potential. You can check your domain authority score with our tool.
A website’s click-through rate (CTR) measures the percentage of people who clicked through to your page from the search results. A high click-through rate provides insight into whether your page title and meta description are properly optimized and enticing to potential customers. The higher the click-through rate, the better the chances that engaged, ready-to-convert customers are landing on your page.
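The calculation itself is simple. A minimal Python sketch, using made-up Search Console numbers:

```python
def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate: the share of searchers who saw your result and clicked it."""
    return clicks / impressions if impressions else 0.0

# Hypothetical numbers: 120 clicks out of 2,400 impressions.
print(f"CTR: {ctr(120, 2400):.1%}")  # CTR: 5.0%
```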
Bounce rate is the percentage of users who visited your website and left without interacting anywhere on the page. A high bounce rate indicates that users visited your website but didn’t find the information they were looking for, so they left. This can happen for a variety of reasons, from a poorly mapped site structure to content that fails to answer users’ questions before they get a chance to ask them.
To prevent a high bounce rate, do what you can to create content that grabs potential customers the minute they land on your page. Use multimedia and graphics to add some extra visual appeal.
Scroll depth is exactly what it sounds like: it measures how far down visitors scroll on each webpage. You can monitor each page’s scroll depth to see if users are actually getting to your most important content and reading what you have to say.
A way to encourage deeper scrolling is to add pictures, graphics, and larger headlines, and to break up long blocks of text. The more pizazz you can add to the page, the better!
Google views a link to your website as a measure of authority. Every time another website links to yours, Google sees it as a vote of trust and a signal that your website should show up in search results.
A healthy backlink profile includes backlinks from a variety of websites with a range of DA scores, as this versatility signals that your links were acquired naturally and organically. It is an SEO best practice to earn links from other websites, and a diverse backlink profile shows that multiple websites see you as an authority in your space.
Again, the most important part of SEO measurement is what the data is telling you about what you are doing right. If you don’t take the time to measure SEO efforts with as much detail as possible, you will never have an accurate picture or understanding of the value of your SEO efforts.
It is normal for your SEO strategy and goals to change as your business ebbs and flows. But as long as you stay consistent in your SEO efforts, your website will become a reflection of your business online, and you will be able to build a strong presence in the digital world.
You do not have to do your entire SEO strategy by yourself. Our SEO experts are here to understand your business needs and goals, and our team is here to see your business thrive online. Give us a call and we’ll get started creating a comprehensive SEO strategy to bring your business to new heights.
SEO reporting software is indispensable for keeping your SEO campaigns organized and on track.
The ideal SEO reporting software allows you to provide clients with clear, concise, and visually appealing reports. However, knowing which software does this best can be a time-consuming challenge.
Additionally, you want to invest in SEO software that will grow with your agency or business while supporting your search engine optimization needs. This includes providing you with real-time insight, automatically tracking key performance indicators (KPIs), and graphing analytics. So, with so many SEO reporting tools available, how can you decide which will work for you?
In this article, we will cover what to look for when testing and choosing an SEO reporting tool, what reports to provide your clients, and how top-rated software compares.
Search engine optimization isn’t the simplest service to explain and demonstrate to non-SEO professionals. This can present a hurdle when working with clients, especially those without a lot of knowledge in SEO. Being able to provide these clients with easy-to-understand reports is vital.
Just as college students can anticipate midterm grades and final grades, it’s important to establish when to present your clients with SEO progress reports. In fact, most agencies outline in their contracts when clients should expect updates. Below are the best times to create reports for your SEO campaigns, along with the metrics to provide at each stage.
Presenting reports when pitching to a potential client is a smart move. It’s difficult to beat a data-driven pitch when attempting to secure a new client. But how much data should you provide during the customer acquisition phase?
The keys to an SEO pitch report are to demonstrate your capabilities, show the client’s SEO potential, and avoid spending excessive time performing research. To achieve this, stick to providing the potential client with the basics.
Providing these shows the potential client that you’re willing to go above and beyond.
Not every SEO or agency has the time or resources to create custom reports for all of their pitches. Additionally, consumers now do a lot of research on their own before reaching out. This is when high-quality case studies can benefit your business.
Every agency should have case studies on hand or available to point to on its website.
When creating the visualizations and charts for your case studies, keep in mind that they need to be self-explanatory and attractive. You also want to highlight your greatest achievements and tell the story of how you improved your client’s business through your services.
When presenting a case study, you can essentially repurpose a year-end report from several of your most successful campaigns.
Once you’ve landed a client, it’s important to record the client’s current SEO standing since this will be your starting point. A broad analysis will give you the data you need to create a smart, actionable SEO strategy tailored specifically to your client.
On-boarding SEO reports should include:
After outlining your client’s current SEO status, it’s easier to create a strategy as well as measure growth. From here, you can go more in-depth in the areas where your specific client needs more work, like identifying and cleaning up toxic backlinks or finding opportunities to quickly increase keyword rank with new or refreshed content.
Active Campaign Reports
The majority of clients want to receive updates for active SEO campaigns. These can be monthly, quarterly, semi-annual, or annual. For these, report templates and report automation will save you time and energy while creating uniformity.
Instead of wrestling with spreadsheets and building a custom report each month, you can use SEO software like GSC Insights with integrated real-time data, KPI tracking, and exportable reports or report templates.
GSC Insights improves upon the data made available through Google Search Console. This software analyzes Google Search Console’s metrics to create easy-to-read and insightful SEO reports. This allows you to create beautiful growth visualizations for your clients while making it easy for you to notice fluctuations in traffic.
GSC Insights also allows you to customize your reports by date range, so you can demonstrate changes month-over-month (MoM) or quarter to quarter.
Note that this graph marks major Google algorithm updates. This can be quite useful should your client incur a drop in traffic as a result of these updates.
You can also include:
Depending on your contract, you can also utilize other dashboard tools to create Technical SEO reports for:
With GSC Insights, you can export a complete report that can be easily inserted into an email, Google Doc, or other software. You can also export a wide array of SEO performance analytics to CSVs, and any of these files can be saved or printed as PDFs.
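If you ever need to assemble a client-ready CSV from raw numbers yourself, a few lines of standard-library Python will do it. The metrics below are invented, and this is a generic sketch rather than a reflection of any tool’s internals.

```python
import csv

# Hypothetical keyword performance rows for a monthly client report.
rows = [
    {"keyword": "pet supplies", "position": 4, "clicks": 310, "impressions": 9200},
    {"keyword": "dog toys",     "position": 7, "clicks": 120, "impressions": 5400},
]

# Write the rows to a CSV that opens cleanly in Sheets or Excel.
with open("monthly_seo_report.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["keyword", "position", "clicks", "impressions"])
    writer.writeheader()
    writer.writerows(rows)
```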
Sometimes there’s a massive change in the SEO ecosystem. These changes could be a competitor dramatically raising PPC investment and therefore ad prices, a Google algorithm update requiring technical SEO, or regulatory changes in the client industry that impact social media.
When these disruptions happen, a standard weekly or monthly report won’t be enough. A full site audit would be too much. These client reports need to be focused on the SEO strategy that specifically addresses this new challenge.
As contracts come up for renewal, seasonal or temporary campaigns, or campaigns coming to an end, providing your clients with a clear report can have a major impact. A well-designed report can result in a successful renewal, service expansion, or a positive review and recommendation.
In these cases, a post-campaign audit can highlight not just your ongoing work but also show the client their ROI for the campaign. For these reports, you’ll want to focus on the value you provided.
Campaign metrics to highlight in a final report include:
We recommend also including a brief written summary of the campaign’s strategy and results along with how you could provide for the client’s SEO needs in the future.
This report needs to show how you took the client to the next level through your specific strategies and services.
Creating SEO reports for clients is only as easy as your software allows. For example, when using SEO tracking software that does not allow you to export data, you can anticipate spending hours each month manually transferring numbers into spreadsheets. With a platform like ours, you can simply hit “export to PNG” or “export to CSV” to create an automated report.
Here, we will cover how to create a variety of active campaign reports and customized reports using GSC Insights:
This report will include all data presented on the GSC Insights dashboard as seen below:
When your SEO software creates your reports for you, you’re able to maximize your productivity. However, not all SEO tools have these capabilities. So, when choosing the right SEO software for your needs, keep the following criteria in mind.
SEO reporting tools use different data sources. Ahrefs brags about the extensive web crawls that make up the source of its information. Google Data Studio obviously pulls from Google Analytics and Google Search Console. Our suite, meanwhile, draws data from a wide array of trusted sources, including Google Search Console.
Working with the latest data makes a major difference when it comes to SEO. Not only can you react and plan better when your metrics are up-to-date, but you can also share the most recent and accurate data with your clients.
It’s nearly impossible to perform many SEO tasks without the ability to access and use indexing data gathered from crawls. While some content editor tools, like Clearscope, leave this feature out, you will want to have another tool that makes up for it.
To really deliver the impact of your work, you need to show your results in a way that speaks to your clients. SEO reporting software should be able to convert the raw data, whether it comes from Google or from proprietary research, into engaging charts and graphs that non-SEOs understand.
Sometimes it takes a fresh insight to highlight your hard work. Therefore, you’ll want to look into how each SEO software tool allows you to approach the numbers in a different way.
For example, GSC Insights offers Page Groupings, which allows you to see how certain groups of pages perform. Instead of manually compiling data for, say, blog posts vs static pages, GSC Insights lets you group pages together to get impression and traffic data for those aggregates, saving you time.
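The aggregation itself is easy to picture in code. Here is a hedged sketch using pandas with hypothetical pages and traffic numbers; it mimics the idea of page groupings rather than any tool’s actual implementation.

```python
import pandas as pd

# Hypothetical per-page traffic data, tagged with a page group.
pages = pd.DataFrame([
    {"url": "/blog/how-to-groom-a-dog", "group": "blog",   "clicks": 420, "impressions": 11000},
    {"url": "/blog/best-cat-litter",    "group": "blog",   "clicks": 260, "impressions": 7400},
    {"url": "/products/dog-food",       "group": "static", "clicks": 900, "impressions": 15000},
])

# Aggregate clicks and impressions per page group.
print(pages.groupby("group")[["clicks", "impressions"]].sum())
```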
If your clients need local SEO, then you’ll want to keep this in mind when choosing your SEO tools, including your reporting tools. Your clients may appreciate being able to compare a list of keywords for each city or state, or seeing SERPs for distinct geographic areas.
As reports are used for different purposes, it’s helpful to have a reporting tool that supports multiple reporting formats. PDF reports are great for emails or in-person meetings. CSVs are a best practice when you’re combining information from multiple sources for a custom report.
Exporting to CSV for Google Sheets can be extremely convenient as well. Check that your provider offers exports in formats that you need.
Finally, SEOs need to be able to monitor the progress of all of their clients on a regular basis. Having one dashboard to access all of your clients is a must. Ideally, you can toggle between client accounts to easily produce your reports at designated intervals.
As a digital marketer, you want to spend time with clients or working on projects, not creating SEO reports. White-label reports can save you time. White label SEO reports are often automated reports that export to a PDF format. These effective SEO reports have great scalability and are perfect for clients with a basic understanding of good SEO.
You will find a wide variety of SEO tools out there. Each will have its own capabilities. However, you will want to be sure you invest in the one that provides you with unique insight as well as report options that clients will appreciate. Here’s how the most popular SEO software stack up:
Our dashboard is an all-in-one SEO suite that allows agencies, freelancers, and site owners to optimize their sites based on real-time data and analytics.
What sets our dashboard apart from its competitors is that it has a tool or feature for nearly every aspect of SEO, including:
Reporting options: We offer a wide array of reporting options with an export feature for CSVs and PNGs. You can also easily track your SEO campaigns alongside your SEO performance metrics with the Site Events feature.
SEMrush is a popular SEO software that provides users with a collection of tools and features for technical SEO and content SEO. SEMRush provides easy exporting to PDF and spreadsheets on most of its reports.
However, the biggest drawback of this platform is users must pay a hefty price for access to the full range of tools.
Reporting options: SEMrush allows you to export to PDF, XLSX, or CSV for more data.
Like us, Ahrefs provides all of its users with access to its full suite of tools. Unlike us, however, Ahrefs doesn’t have as many capabilities: notably, this software does not provide content creation tools or a local SEO tool.
Ahrefs is one of the few SEO software options that doesn’t offer a free trial.
Reports: Ahrefs allows you to export most reports to CSVs. Their reporting options include a quick-view, full-report, or custom report.
When it comes to SEO, most people are familiar with Moz Pro–and often for good reason. Moz Pro was one of the first extremely popular SEO software options. Additionally, Moz Pro’s dashboard is clean and easy to use.
However, Moz Pro does not support SEO content creation and lacks several of the other bells and whistles you will find in other suites. This means if you have a full-service SEO agency, you will need to invest in multiple SEO platforms.
Reporting options: (Not all available at the base-level subscription) Custom reports, templates, branded reports, automated reports, PDFs, CSVs
As an SEO agency or professional, your time is one of your most valuable assets. When you choose an SEO tool, it’s vital to review its exporting and report options. Keep in mind that you want the data to tell the story of your SEO skills seamlessly.
While there are many SEO tools out there, such as Ahrefs and Moz, we offer the greatest capabilities for all aspects of SEO. However, the easiest way to determine which software is best for your company is to choose a few and try them out.
We all know the importance of keywords to a business’s search engine optimization (SEO) efforts. Ranking for valuable keywords in your industry is a sustainable, affordable marketing strategy to boost your brand’s visibility online and drive leads and sales. The process of keyword tracking helps digital marketers measure whether their SEO strategy is succeeding.
Here is a breakdown of the importance of keyword tracking to your SEO strategy, how to choose keyword tracking software, and some of the top keyword tracking tools used by SEO professionals and digital marketers today.
Keyword tracking is the process of using software tools to monitor the organic keywords your website ranks for in search engine results. It also involves tracking the ranking positions for those keywords and monitoring how those rankings change over time.
But keyword tracking is not just about watching your keywords and hoping something good happens. Instead, the right keyword tracking tools will provide important data and metrics that offer insights into how to grow and scale your web presence in Google.
Your ideal keyword goal should be to rank in the number one position in the search engine results pages (SERPs) for the keywords that your target audience is using. However, certain keywords are more competitive than others, and other site owners will try to improve their content in order to outrank your brand.
As a result, you should constantly monitor your ranking fluctuations in order to respond when your ranking positions decrease or when your content stops being promoted entirely.
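A minimal sketch of that monitoring loop, assuming you already have rank snapshots from a tracking tool (the keywords and positions are invented), that flags keywords whose position dropped:

```python
# Hypothetical rank snapshots: keyword -> position on two check dates.
last_week = {"seo software": 3, "backlink checker": 8, "what is anchor text": 5}
this_week = {"seo software": 3, "backlink checker": 12, "what is anchor text": 4}

for keyword, old_pos in last_week.items():
    new_pos = this_week.get(keyword)
    if new_pos is None:
        print(f"'{keyword}' dropped out of the tracked rankings entirely")
    elif new_pos > old_pos:
        # Higher position number = lower ranking, so this is a drop worth investigating.
        print(f"'{keyword}' slipped from position {old_pos} to {new_pos}: time to investigate")
```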
To understand exactly why keyword tracking is an important factor within a successful SEO campaign, it’s critical to understand how the SERPs work, how target keywords tie into rankings, and the most important keyword metrics you’ll want to monitor.
Search engines work by crawling all the web pages on the Internet and then reading, categorizing, and indexing them into a large database. While there are a ton of different technical elements to a website that can make it easier for the search engine bots to find you, the keywords you include on your web pages help the bots understand the content on your web pages and whether or not to promote them in the SERPs.
Similarly, users rely on keywords when searching for new content or information about products and services. But users don’t always search the same way. There are a variety of keywords they might use to find relevant content. If your web pages provide high-quality information or answers to the questions users are asking, Google is more likely to promote it to users.
As mentioned before, your keywords are ideas and topics that define what your content and web pages are all about. They are a fundamental part of SEO. When used properly, keywords drive organic traffic to your website and can make or break your business.
The more often that you create useful, informative content, the more opportunities you’ll have to get your brand name in front of potential customers by showing up in Google searches for relevant keyword queries.
There are a few common keyword metrics that SEOs or site owners use to determine which keywords they want to rank for. They include search volume, keyword difficulty, and cost-per-click (CPC).
Site owners often aim to rank for keywords that have higher search volume and stronger conversion potential. A smart SEO strategy will also include keywords that a brand can realistically rank for so they can start ranking and driving clicks sooner.
In addition to the above metrics, search intent also plays into the value of keywords. Stronger search intent can mean a greater need for rank tracking.
The four ways that SEOs usually categorize search intent are: informational, navigational, commercial, and transactional.
Keyword searches that fall into the latter two categories of search intent are often more competitive because they have higher economic value. For businesses who use SEO to grow sales and revenue, they’ll more likely want to track keywords that fall into those categories.
Once you optimize your content for your target keywords and publish it on your site, there are four metrics to keep an eye on when rank tracking: impressions, average position, clicks, and click-through rate.
By using all of these factors to monitor and track your keywords, you’ll be better equipped to hone in on your strategy and promote successful SEO.
Arguably, keywords are the most volatile marketing element and change the most often. So much so that a keyword’s ranking can shift within hours or even minutes. This makes it necessary to monitor them often.
But each business is different. The word “often” can be subjective. There are a lot of factors to consider when deciding how to monitor your keyword rankings. Our rule of thumb is the more organic clicks that a keyword drives to your website, the more frequently you should check your ranking position for that keyword.
Why? Because for some keywords, a change in rank position can mean hundreds, if not thousands, of fewer clicks to your website.
For small to mid-sized businesses and those just getting off the ground in terms of content production, we recommend first giving your content time to be promoted on Google. On average, it can take Google up to 90 days to find, crawl, and index your web pages. If you are only starting out with your content implementation, have some patience.
Until Google has been able to find your page, there won’t be much movement in your rankings, so sit tight. Give the search engine time to crawl and index your pages, and once you have been found, we recommend checking your rankings weekly.
But for larger, more established websites, keyword rankings should be checked on a daily basis. When a keyword is high-performing (meaning it has strong conversion potential), other site owners will dedicate a lot of time to optimizing their content to rank on page 1.
Just because you earn a top ranking spot doesn't mean you'll stay there.
Just as you wouldn’t jump right into any other part of your content marketing strategy, you’ll need to have a game plan when it comes to your keyword rank tracking. Here are some steps that will help you get started.
The first step is to decide what keywords are the most important for your business. That list can include keywords driving traffic to your website right now and those keywords that you hope to rank for in the future.
Are you earning lots of organic clicks from a specific keyword? Do you earn more conversions from specific queries? If the answer is yes, you’ll want to make sure your content stays in top positions for those keyword terms.
Although the ultimate goal is always to rank in position 1, this is not always realistic. This is especially true when vying for the more competitive keywords in your niche. But because higher ranking positions have higher CTRs, just one position improvement for a keyword can mean loads of additional traffic to your website.
So in addition to the important keywords, you’ll also want to identify keyword position improvements that present huge organic traffic opportunities for your website. You’ll have to improve the quality of your content and page experience of the pages targeting those keywords to see changes in position. But, you’ll want to have those keywords clearly identified so you can include them in your keyword tracking.
This is particularly true for SEOs or digital marketers providing SEO services to clients. If one keyword position change produces huge influxes of traffic for a client, you’ll want to make sure you show it in your keyword tracking reports.
What rank tracker you choose will depend on the features that are most helpful to your own SEO strategy. Not all keyword rank tracker tools have the same accuracy or features. That’s why you’ll want to make sure that the software you ultimately choose provides the ranking data you need to stay on top of your keyword rankings and positions.
Remember, you want your keyword tracking software to work for you. So for the best results, the best rank tracker should include the metrics and functionalities you need in an easy-to-find place. The tool should also provide easily comprehensible data.
When looking for a rank tracker tool, you’ll want to ask yourself the following questions to ensure it is a good fit for your needs:
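- Does it track all of the keywords my website ranks for, or only a limited list?
- Does it update keyword positions daily?
- Does it pull data directly from Google, or does it scrape the SERPs?
- Is the reporting clear enough to share with stakeholders or clients?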
You might not be able to answer yes to all of those questions based on your current keyword tracking software.
However, there are some absolute musts that you will want to prioritize in order to do keyword rank tracking in the most impactful way.
For small businesses with only one primary service or product, you may not need a rank tracking software that provides position tracking for every single keyword your website ranks for.
But for enterprise organizations or digital marketing agencies, having 100% of your keyword data can be the difference between proving the value of SEO efforts to key stakeholders or clients.
Not all keyword tracking software provides comprehensive keyword tracking, and some tools charge higher prices for additional keywords. Make sure you choose software that provides access to as many keywords as possible.
Imagine this experience: your keyword tracking software states that your web page is in position 4 for a high-value keyword in your industry. But then, when you type that keyword into Google, your page shows up in position 11.
That's likely because your current keyword tracking software doesn't have daily position updates. Some keyword tracking tools only provide daily updates for keywords with higher search volumes, meaning keyword position data can be weeks, sometimes even months, out of date.
If you are in a highly competitive market or a niche industry with keywords with lower search volume, delayed position updates can be very problematic and prevent you from responding to rankings drops promptly and effectively.
The above two features are predicated on one primary factor: whether your rank tracking software scrapes the SERPs to gather data or uses Google's API.
Some keyword rank tracking tools use bots to scrape the SERPs and gather position information. Most of these tools don't have the capability to scrape the SERPs for every keyword every day.
Google, however, has the most complete keyword dataset in the world. If your keyword tracking software is using Google data, that means you can see more keywords and see changes on a daily basis.
Based on the above features, here are three different rank and position tracking tools that meet most of the above criteria. They vary from completely free to a higher price point. But they offer a range of features that empower site owners to make the most of keyword rank tracking.
Google Search Console is the original keyword tracking software. This free platform gives site owners a nice introduction to keyword tracking and its importance to SEO.
In addition to keyword tracking, this platform helps site owners troubleshoot page experience issues. You can also use it to monitor mobile usability, submit disavow files, and see your backlinks.
Unfortunately, Google’s keyword tracking platform isn’t the most user friendly and is pretty bare bones in terms of UI/UX. It is free, after all, so it’s no surprise that there are no advanced features here.
But in terms of accuracy and real-time position information, nothing can beat the power of Google’s platform.
Ideal for enterprise websites and agencies, GSC Insights is one of the most comprehensive keyword tracking platforms. Because the tool is built over Google’s API, it provides the same daily updates and data as Google Search Console, but with more advanced features and data visualizations.
By linking GSC Insights with your Google Search Console account, you get a fresh look at your keyword ranking data. The data visualizations that GSC Insights includes help you not only track your keyword rankings, but also make insightful decisions for improvement.
In addition to daily updates and full impressions data, GSC Insights also has these additional features.
For those who really want to leverage the power of keyword data to improve their SEO, GSC Insights is the ideal platform for position tracking. With various subscription levels, site owners and digital marketers can find the level that fits best for their team.
Moz’s Rank Tracker offers a very simple and straightforward keyword rank tracking platform. For non-SEOs or beginners to keyword tracking, Rank Tracker offers only the high-level metrics you absolutely need to understand whether your SEO performance is improving or not.
However, because Moz relies on SERP scraping to gather keyword ranking data, they only offer 300 keywords to track, and upcharge for an additional 200 keywords. As a result, this rank tracking software is a better option for smaller sites or local businesses with less robust content or keyword goals.
Every site owner executing an SEO strategy should be engaging in regular keyword tracking. Staying on top of keyword positions can be the difference between maintaining your organic web traffic or losing out on significant sales or lead generation.
All of the above keyword tracking tools offer trial versions so you can see which one is the right fit for your business or SEO strategy. When done well, keyword tracking can help you achieve more success and maintain your top positions well into the future.
We all know how important search engine optimization is, but ranking on the first page of the SERPs can only get your business so far. Getting organic clicks is how you get the most value from SEO, so trying to improve your organic click-through rates (CTR) will always be a worthwhile effort.
There is a strong relationship between SERP position and the overall amount of clicks a SERP result receives. Understanding that relationship, and doing your best to use it to your advantage, can be a surefire way to increase your overall organic traffic from search engines.
To completely understand the definition of organic click-through-rate, it is imperative to know the following SEO terms.
That brings us to our definition:
Organic click-through rate is the number of web searchers who click on your result in the SERPs divided by the number of impressions your result receives.
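For example, if your result appears 1,000 times in the SERPs (impressions) and 50 searchers click on it, your organic CTR is 50 ÷ 1,000 = 5%.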
So with that in mind, a higher CTR can make all the difference between capturing traffic and having your potential consumers click on your competitor's website. The name of the game is to shoot for a high CTR and to constantly improve upon it.
Improving your website's organic CTR isn't the easiest thing to do. In today's digital world, the quicker you can capture a consumer, no matter what stage of the marketing funnel they are in, the better. But since Google places paid search ads above organic listings, search ads earn over 40% of clicks.
This means you have to do what you can to squeeze out as much organic traffic as possible, and a high CTR will help you achieve this.
Whether click-through rate is an official ranking factor has been debated in the SEO industry for years.
There are plenty of CTR experiments out there that have tested out the relationship between CTR and ranking. Many have shown a strong correlation.
If a user clicks on your SERP result over a competitor that ranks higher than yours, it shows that your result is highly relevant or appealing to the user. A strong click-through rate shows Google that your result is meeting the needs and desires of users.
Regardless, it takes clicks to get customers. There is no downside to improving your CTR across all of your web pages.
Simply speaking, a higher CTR and a higher rank position go hand in hand. According to a 2020 study, the first organic search result has an average CTR of a whopping 28.5%.
The study analyzed over 80 million keywords and billions of search results to see how consumers engage with the SERPs. As a whole, they found that traffic and CTR declined significantly after the first organic search result.
All of this information isn’t particularly shocking. Most digital marketers are familiar with website SERP positions and their overall total number of clicks.
But what was most interesting about this study is that it examined how much the traffic generated by page-one websites varies by position. For example, a result in position 2 with a 15% CTR will generate roughly three times more clicks than the website in the sixth position.
So what does this mean? It means the relationship between CTR and rank position is exponential, and even if your website shows up on page 1 of the SERPs, a drastic influx of traffic is not guaranteed.
While online visibility is great, focus your efforts on increasing the number of clicks to your website by trying to rank in the first and second positions, rather than the easier-to-attain positions of 8, 9, and 10.
Here are some specifics about CTR and ranking position.
All of this evidence shows how important ranking position is to improve CTR and organic traffic. But to understand CTR and organic traffic for your website more specifically, turn to the right keyword tracking tools.
One of the best tools available for tracking your CTR across multiple different keywords is our GSC Insights Tool. GSC Insights is built over Google’s API. That means real-time, accurate data to help you accurately measure your SEO improvements on a daily basis.
With our Traffic by Rank Position feature, you'll be able to see how your CTR measures up against the standard. This chart tracks your organic traffic and CTR by rank position across all the keywords on your website, helping site owners see where they can make improvements to their keywords and thus lift their average click-through rate.
Let’s look at a few Traffic and CTR by Rank Position charts. They can show us how GSC Insights helps site owners strategize CTR and organic traffic improvements for their own web pages.
In this first example, we see a website with click-through rates that do not reflect the standard metrics. This website actually has a higher CTR for keywords where they rank in position 3 & 4 rather than positions 1 & 2.
Because we know most clicks go to the top 2 results, this site owner would want to review their page titles and meta descriptions for the keywords where they rank in the 1 & 2 positions to try to improve their relevance and click-ability.
In the next example, we have a website with CTRs that more closely follow expected outcomes. They have a higher CTR for keywords that rank in position 1 & 2. They also have relatively the same CTR for positions 7-10.
However, what’s most noticeable about the above chart is the large amount of traffic that this site is getting from keywords where they rank in position 3. Most likely, the keywords where they rank in position 3 have larger search volume and thus stand a better chance of earning more clicks.
If this website could get their webpages in position 2 instead of 3 for those same keywords, they would see a huge influx of organic traffic.
In the third example, we see a website that has fairly low click-through rates across all rank positions. This means that their page titles and meta descriptions are likely unoptimized. An SEO strategist working for this client could make an argument for how much they would benefit from an SEO campaign.
See all that traffic coming from the 10 & 11 rank positions? Imagine how much traffic this website would get if they could improve those same keyword rankings to positions 1-5.
In our final example, we see a website that has strong CTR metrics that reflect what is expected. But there are some positions where the CTR doesn’t meet expectations: positions 1 & 3.
Why does this website meet or exceed average CTR in position 2 but underperform in position 1 (where CTR should be 8% higher)? This site owner should compare the keywords where they rank in position 1 & 2. See what page titles or meta descriptions are pushing the higher percentage of clicks.
Do those meta descriptions include questions? More keywords? Are they the appropriate length? After identifying why position 2 drives more clicks, they should model their page titles and descriptions for the pages where they rank in position 1 accordingly.
So now that we know why CTR is an important metric to track and how to use GSC Insights to identify areas for improvement, how do you actually increase your click-through rates as they currently stand?
Here are some proven strategies to help you succeed in getting a good click through rate.
Your landing pages need to answer your target audience’s questions before they even have a chance to ask them. To see what they are already looking for, take a look at content that is already ranking for your target keywords.
What are potential consumers looking for when they enter that keyword? Use our SEO Content Assistant tool to review your competitors' content and to improve your web pages' relevance and topical depth.
Keyword cannibalization is when you have multiple pages that all rank for the same keyword. You want to avoid this, because those pages end up competing with each other and only a few of your website's pages benefit from your digital marketing efforts.
Instead, you'll want to spread your keywords across the pages of your website so that each page targets something distinct and you have multiple pages of relevant, important information. This way, you'll also spread out your organic clicks and conversions.
As we already mentioned, your page titles and meta descriptions play a very important role in your ratio of clicks. Make sure to create a headline with a CTA that prompts the consumer to take your desired action of clicking through to your website.
You’ll also want to include a CTA within your meta description, as these few sentences are the best first impression of your brand. Some helpful tips for making your SERP result stick out are:
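- Pose a question that your content answers
- Front-load your target keyword
- Use numbers, dates, or other specifics where relevant
- Stay within recommended character limits so nothing gets cut off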
Consumers want to know exactly what they are clicking on, and a specific link lets them do exactly that. Include as many specifics about the landing page as possible, and make it relevant to the page itself. As a best practice, if your title is short, make your URL match the title exactly. URLs are a ranking factor within the Google search network, so make sure not to forget this step!
When you are creating relevant content, you will need to use a whole variety of keywords, from shorter terms to long-tail keywords. Considering that consumers will sometimes search using longer phrases or even complete questions, it is a good idea to include as many of these as you naturally can without keyword stuffing.
Structured data, otherwise known as Schema Markup, helps you appear in rich results, so you’ll want to utilize it correctly!
Example of Products Rich Result on a SERP
As Google uses structured data to accurately present your website’s information, not including structured markup can be a detriment to enticing those extra clicks. The most common types of schema are:
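- Organization and LocalBusiness (contact details, hours of operation)
- Product and Offer (pricing and availability)
- Review and AggregateRating (star ratings)
- FAQ and Q&A
- Event and Recipe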
Using the above techniques will ensure your website achieves the highest CTR possible.
In summary, your website’s CTR is one of the most important metrics any website owner should track. Understanding your organic click-through rates and the strategies that can help you boost it is imperative for any business owner.
To the untrained eye, SEO might seem like a guessing game. You choose your keywords wisely, build some links, and––presto!––suddenly, your website sits atop the first page of a Google search. In truth, search engine optimization is anything but magic. When a Google bot scans your website, crawling through each of its pages, there are specific criteria it looks for, like content quality, site authority, and page responsiveness. What’s more, search engines are continually adjusting their algorithms, refining how they categorize and rank pages.
But what does this mean for your business? A digital agency that specializes in SEO can help your business maximize returns on your web presence. Much like the search engines they work with, SEO is a rapidly evolving industry, and the right agency can wield their expertise to suit your company’s needs, allowing you to focus on the rest of your business.
For starters, it's essential to understand that search engine optimization, while complex, is not magic but rather science. Through a deliberate approach of link building, content development, and web page optimization, new websites can use SEO to increase both organic traffic and conversions. The good news is you can remove the wand and crystal ball from your Amazon cart. The better news is that you can implement an effective SEO strategy with the help of a professional SEO company.
Once you consult with your team and decide to hire an SEO company, the question remains––Which SEO agency best suits my business? Currently, there are hundreds of options at your disposal, and more cropping up each day. Between SEO freelancers, in-house specialists, and full-fledged agencies, you’ll have plenty of options to choose from. Likely, you’ll start by browsing the web; however, we recommend asking your friends and colleagues. Which services have delivered results for them?
When you reach out to an agency, they’ll likely begin with a conversation, one in which they’ll assess the needs of your business and outline a plan of attack. During this call, be sure to ask them the following questions.
The goal of any SEO approach is, at its core, the same: to increase organic traffic to your website. With that said, there are many different ways of achieving this, so when consulting with an SEO agency, it’s important to consider the ways through which they plan to boost your search rankings.
Essentially, there are three main categories of SEO, and any SEO strategy worth its salt will account for each of them.
As the name implies, on-page SEO refers to the content visible on your page. This includes landing pages, rich media, blogs, your services pages, and other website content that your user will engage with. A solid SEO company will make sure to optimize all of this content for you, which can be done through a variety of means.
Keyword research consists of finding the best keywords to target on a page in order to boost its search engine rankings. Especially if your website is relatively new, you’ll want to select a company that targets keywords with attainable ranking difficulties. For example, if you’re hoping to boost traffic to your meal kit service, a good SEO company will find high-value keywords that drive qualified, converting traffic to your page, effectively carving a niche for your business, even in crowded markets.
Technical SEO refers to the backend of your website, that is, the parts not visible to the average user. While it's tempting to settle for optimized on-page content, what happens under the hood is of equal importance. User experience matters for human visitors, but they are not the only ones reviewing your site.
In order to categorize and rank your site, search engines crawl each of your pages, "reading" them and rating them on their own scales of readability, so to speak. Especially with Google's page experience update on the horizon, speed, mobile-friendliness, site architecture, and security all factor heavily into your page's ranking, and a good SEO service should account for this.
The third, and arguably most difficult type of SEO, occurs off-site. It consists of all things related to building your website’s reputation within the greater community of the internet. By building a network of links to your site, links from authoritative and relevant sources, you’ll ultimately bolster the authority of your own page, all of which ties back to higher SERP placements.
This is where an SEO service really comes in handy, as building a backlink profile is as difficult and time-consuming as it sounds. With this in mind, you'll want to look into a search engine optimization strategy that can see you through over the long haul.
Remember, not all SEO companies will offer solutions for all three of these elements, so be sure to assess the particular needs of your business before hiring their services.
Just as a library has the Dewey Decimal System, Google and other search engines have their own systems with which they crawl and index web pages. Unlike your average library, however, Google has the mind-boggling task of organizing the trillions of pages available on the internet. What’s more, Google must rank these pages. To accomplish this, Google has a set of rules and criteria that pages must abide by in order to show up in searches.
At first glance, it might sound tempting to manipulate these criteria. Cloaking content, stuffing irrelevant keywords into your copy, and plagiarism are all examples of what we’d call black hat SEO. While these methods may boost your rankings in the short term, they will ultimately damage how search engines regard your site.
Bonafide SEO specialists will use white hat SEO tactics that abide by Google’s webmaster guidelines, which are rooted in the quality of both content and user experience. Sure, strategies such as link-building and keyword analysis take a lot of time and effort, but if you’re going to shell out for an SEO company, then you’ll want to make sure they comply with Google; otherwise, the benefits of their services will be short-lived.
Although all SEO is centered around boosting organic traffic, SEO is not a one-size-fits-all type of service. Even if a search engine optimization company increases your traffic, there's no guarantee that this translates into increased revenue. A professional SEO company might land you at the top of the rankings for cloud computing software, but this will do you little good if you're in the business of CRM platforms.
Solid professional SEO services will be familiar with your industry. Certain industries, such as insurance, are highly competitive in terms of SEO. Others, like call center software, while less saturated, possess a high search intent, thus driving up the CPC for associated keywords. An SEO company should be able to understand the types of search idiosyncrasies in your industry and create a strategy accordingly. Armed with industry-specific knowledge, they’ll attract the type of potential customers that drive conversion.
There are two ways in which SEO experts utilize keywords: buying search ads and page optimization.
The more straightforward of keyword methods, buying search ads allows you to show up in the search results for specific keywords you’ve selected. While this can certainly be a way to vault atop the Google rankings, you’ll be charged for every click, which can quickly become a costly form of paid advertising, not to mention the perils of click-fraud.
Especially for small and medium-sized businesses, Google Ads can be a hefty price to pay with little guarantee of a return on the investment. In the short term, it will surely bring a larger audience to your page; however, this will only hold for as long as you keep paying for clicks. On the other hand, SEO, while more complicated, provides a long-term solution, ultimately boosting organic traffic at lower cost.
The more laborious option is to optimize your page for organic search results. This is done by first making sure your site has content that is both relevant and of high quality. Then you’ll have to get other websites to reference your content and link back to it. All of this plays into Google’s algorithm.
Most professional SEO services will likely focus more on optimization for organic search. This process begins with keyword research, the best way to figure out which terms to target. One might assume that a personal injury law firm should naturally target the term “personal injury”; however, with its high keyword difficulty rating (62) and high costs ($44 per click), there is likely a more effective approach. A true SEO expert will take the time to figure out with you, the client, alternative search terms that will both save you money and increase your likelihood of SERP visibility.
Nowadays, there are hundreds of digital marketing agencies that offer SEO services, and it can be difficult to figure out which one will suit your needs. A good place to start is by reviewing who they’ve worked with. Most SEO specialist services will list prior clients on their website. They provide detailed case studies of their clients, in which they outline their methods and provide their tangible measures of success. Take a look at who a company has worked with. If they have a history of success in your industry, then it’s likely they’ll be a good fit for you.
In addition to the specific type of industry, you’ll also want to look into their success metrics. Did the SEO firm simply boost traffic? How did this traffic affect conversion? Did they utilize social media, email campaigns, local SEO, graphic design, or competitor analysis? For the best results, you’ll want to have your SEO goals in mind so that you can better observe a similar pattern of success in professional SEO services.
Link building is the backbone of any content marketing strategy. It improves your position on major search engines and expands both your web presence and authority. With that said, not all links are created equally, and you should pay close attention to how an SEO firm obtains backlinks for their clients.
Simply put, the way to obtain backlinks is by creating quality content. A good SEO company will review the entirety of your website and determine which pieces of your online presence hold a high linkable value. From there, they will optimize your content for both overall quality and target keywords in addition to creating new content that they’ll work to place in authoritative publications. It’s important to note that all of this outreach should be 100% organic. While there are certainly ways that you can pay in order to place an article, the most effective (albeit time-consuming) method is through pitching content to publications. With scalable link building campaigns, their editorial team will draft original, high-quality content and pitch it directly to authoritative web publications, thus building the authority of your own web presence.
Some companies will obtain links that only last for 6-12 months, giving you little benefit in the long run. Since Google’s algorithm can detect other unnatural link building tactics (that is, black hat SEO), it’s important to make sure that a professional SEO company plays by the rules. Bad links are penalized by search engines, whereas high-quality links that appear in useful content can drive organic traffic to your page for years to come.
It can be scary, essentially handing over the keys to your company’s web presence, especially for those not well versed in the language of Google, Bing, and the almighty algorithm. A professional SEO company understands this, and they’ll take steps to keep you in the loop on their process and overall progress of your SEO campaign. This can be achieved through a variety of means such as weekly reports of keyword rankings, conversion rates, backlink numbers, and traffic quality.
With the right reporting platform, users can keep tabs on key analytics and better manage their SEO campaigns. For those looking to maximize their ROI, such a tool functions as a handy dashboard, providing insights into technical SEO, backlinks, meta tags, and keyword performance. Some companies portray SEO as a nebulous concept, a tendency you should be wary of. In truth, effective search engine optimization is driven by data, which is why we aren't afraid to quantify and display findings, providing clients with the utmost transparency.
While the cost of SEO services varies widely, based on the size of a project, its duration, and the specific SEO pros behind it, it will behoove you to consider a company’s pricing model. This includes initial charges, hourly rates, and monthly retainer fees for longer-term agreements.
Again, this all comes back to knowing the goals of your SEO strategy. If all you’re looking for is to establish a backlink profile, then this can likely be done in a couple of months. More in-depth tasks, such as overhauling your site’s user interface or producing original content, will likely take longer and, naturally, come with a higher price tag.
Many of the larger, more established professional SEO companies will require you to lock into a long-term contract of up to two years. This might not be a bad thing; however, many younger agencies and independent consultants will be more flexible with their SEO marketing services, allowing you to opt-out of your agreement in the event that you’re unsatisfied with their results. You and your team should establish some tangible KPIs and evaluate your progress accordingly. With that said, it’s important to remember that the most worthwhile SEO strategies take time to come to fruition. SEO work takes time, patience, and years of experience.
While SEO companies may all refer to themselves as such, there’s often a wide discrepancy in the services they offer. Some agencies specialize specifically in link building, whereas others offer full SEO overhauls, from web design to content creation and direct marketing. When taking stock of a company’s services, you should consider not only your goals but also the inner workings of your own operation.
If your interior design team is literally just you and your installers, then you’ll want to consider seeking an agency that can plan and execute an SEO campaign from the ground up. On the other hand, if your company is on the larger side (perhaps you already have a graphics team and web developer), then you can pick and choose the desired services, integrating them with your existing team.
In an ideal world, an SEO agency could handle everything without you having to lift a finger; however, you should be careful of companies that ask nothing of you. At the very least, they should ask you for access to your CMS, Google Analytics account, Webmaster Tools, and social media accounts. Otherwise, it’s hard to imagine they’re doing much of anything at all.
Ultimately, the best indicator of a good SEO company is transparency. From inception to execution, an agency should keep you fully aware of their research, strategies, implementation, and success metrics.
Unfortunately, it’s not so uncommon for SEO agencies to do more harm than good. Through black hat strategies, some companies are able to sell you on short term traffic gains and then leave you holding the bag when the algorithm recognizes the shoddy nature of their techniques and penalizes your page.
Like many B2B services, SEO marketing can become a fixture of your business. The road to the top of Google is both long and complicated; however, with a solid SEO company as your partner, you can get there.
Some webmasters can get intimidated when it comes to working on the backend of their websites. But the metadata you include on the page (and how it’s formatted) can have a significant impact on your SEO performance. Of the many types of SEO-friendly metadata, one of the most powerful is schema.org markup.
Schema markup is a form of structured data that helps search engines read your web pages better. It also improves the appearance and click-ability of your search result. Anyone can add schema.org markup to their website, and you don’t have to be a web developer to do it.
Here’s a complete guide to understanding the SEO power of this data markup and a detailed explanation about how to add it to your website.
In simple terms, schema markup is a type of semantic vocabulary code. You can place it on your website to help search engines create more informative and relevant results for users.
On the backend of your website, schema.org markup is a specific type of structured data in your HTML code. On the front end, that schema markup results in a rich result in Google, or a prominent SERP display that provides more information and context for your audience.
A normal snippet in the search engine result pages (also known as the SERPs) shows very basic information about the website, such as the title of the page, the URL, and the meta description.
A rich snippet is a bit more complex and includes additional information highly relevant to search intent that you want to appear within the SERPs. Some examples of rich snippet information include hours of operation, star ratings, event details, and ingredients for a recipe. Schema is the code that allows for the rich snippet to populate with this extra information on the search result pages.
In order to use schema markup properly, you need to use a specific vocabulary of data. Luckily, the major search engines (Google, Yahoo!, Bing, and Yandex) created this vocabulary together on a centralized website, schema.org. They did so in order to establish a shared standard of language so their search engines can read pages consistently.
This is a free resource and is used by digital marketing analysts to propel their website to better rankings and more clicks. On schema.org, you’ll be able to find plenty of tags, with specific categories, that can help you describe your business, products, reviews, job postings, and contact pages. We’ll get into this in more detail later on in this article.
There are many SEO benefits to utilizing schema.org vocabulary. Despite the benefits, it's estimated that only 33% of marketers are actually utilizing this powerful optimization. By adding schema markup to your site, you can gain a leg up on your competitors in a variety of ways. Here are some of the benefits:
Think of schema as a way to translate to the search engines what the data on your website means.
Search engines work through a process of crawling and indexing websites. Through this, they can populate those web pages within the SERPs when a specific keyword is entered into the search bar. However, there’s more to crawling a website than simply reading the text on a website.
Instead, you need to make sure your website's HTML code and format can be read correctly, so that the information you want about your website is displayed properly. Schema markup is a free way to do just that.
Consumers have very short attention spans. In order to stand out in Google, you'll have to give your prospective audience information in the format they want, when they want it. All this extra information, provided by schema vocabulary, creates what is known as an "enhanced search result."
Businesses, especially local businesses, only have a few seconds to make a good impression, and providing as much informative text as possible can mean all the difference when it comes to converting potential customers.
As stated above, the more informative your website is within the SERPs, the easier it is to improve one of the most important metrics for your website: your click-through rate. Creating multiple web pages can only take you so far if they aren't converting the consumers you need!
There’s more to digital marketing than creating content and putting it on a web page. You have to make sure each page works towards a specific marketing goal. Your about me page will have a different goal than your homepage, your blog posts, and your services page.
Schema is one of the easiest ways to help each page stand out on its own in Google search results. Since each page has a specific function, there are different schema types that relay different information in the rich search results. As a result, prospective consumers will be given more specific information for each web page they find. This increases the likelihood they will click through to your website and convert.
We all know how important it is for our websites to be mobile responsive, considering how many consumers use mobile devices to shop and scroll every single day. Rich snippets carry an extra benefit on mobile, as they take up more space within the mobile SERPs, where screen real estate is even more valuable.
When schema is implemented correctly, searches for certain types of local businesses, such as local restaurants and cafes, movie theaters, and small retail shops will pop up showing a full list of items within the rich snippet to educate their consumers.
These design elements are implemented in something that is known as a carousel, where the user can quickly scroll through and click to the right web page they are looking for. As a result, this type of metadata allows for your local business to take up a good chunk of the important mobile SERP real estate, boosting your brand authority and awareness.
Many businesses know about schema but don't always implement it. In fact, only about one-third of Google search results incorporate rich snippets, which means they use this type of source code. Across the other major search engines, fewer than one-third of results use any type of schema markup.
In other words, there are a ton of website owners out there – literally millions – that are missing out on this massive source of SEO potential. And if you use it, you’ll be on your way to standing out amongst your competitors in no time at all.
There are many different types of markups that you can use within the realm of the schema vocabulary. The goal is to match the markup type to one of three categories: people, places, or things.
The most popular types of schema are used to indicate the following item types:
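- Local businesses and organizations
- People
- Products and offers
- Reviews and ratings
- Events
- Recipes
- Articles and job postings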
Once added to your website, these pieces of microdata will then be turned into a rich snippet, or what is also known as a rich result.
One of the great details about schema code is that it is completely customizable to your brand and business no matter your industry. There is a lot of microdata that is implemented into schema code, so the above are just common themes. The following data vocabularies are more niche uses of schema, under the themes outlined above.
This is the library of markups used for multiple forms of creative content, such as books, movies, video games, and music, to name a few examples. A website about movies, for instance, would use schema with movie-specific elements that highlight the star rating, genre, and nearby theaters showing the film.
RDFa stands for Resource Description Framework in Attributes. It is a set of attributes added to the HTML code that already exists on your web pages, and you can use it in any HTML, XHTML, or XML-based document. Some examples of RDFa attributes include vocab (which declares the vocabulary in use, such as schema.org), typeof (the item type), property (a labeled piece of content), and resource (an identifier for the item being described).
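As a minimal sketch, here is what RDFa markup for a hypothetical product review might look like (all names and values below are placeholders):

```html
<!-- Illustrative RDFa markup; replace the values with your own content -->
<div vocab="https://schema.org/" typeof="Review">
  <span property="name">Acme Coffee Maker Review</span>
  Rating:
  <span property="reviewRating" typeof="Rating">
    <span property="ratingValue">4.5</span> out of
    <span property="bestRating">5</span>
  </span>
</div>
```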
Microdata is implemented much like RDFa, except it uses its own set of attributes. The microdata attributes you can use on your website include itemscope (which creates a new item), itemtype (the schema.org type), itemprop (a property of the item), itemid, and itemref.
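For comparison, here is the same hypothetical review marked up with microdata (again, all values are placeholders):

```html
<!-- Illustrative microdata markup -->
<div itemscope itemtype="https://schema.org/Review">
  <span itemprop="name">Acme Coffee Maker Review</span>
  Rating:
  <span itemprop="reviewRating" itemscope itemtype="https://schema.org/Rating">
    <span itemprop="ratingValue">4.5</span> out of
    <span itemprop="bestRating">5</span>
  </span>
</div>
```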
JSON-LD stands for JavaScript Object Notation for Linked Data. It is an annotation format that can simply be copied and pasted into the head or the body of a web document. All you have to do is use the "@context" and "@type" attributes to specify which schema.org vocabulary you want. SEO experts recommend using the JSON-LD format as often as possible, as it is considered the easiest way for beginners to implement schema markup.
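As a minimal sketch, a JSON-LD block for a hypothetical local business (the name, address, and hours below are all placeholders) would be pasted into the page like this:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Acme Coffee Roasters",
  "url": "https://www.example.com",
  "telephone": "+1-555-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main Street",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  },
  "openingHours": "Mo-Fr 07:00-18:00"
}
</script>
```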
In order to choose the right schema markup for your website, you will have to zoom out and consider your overall digital marketing strategy for each web page. You first need to figure out what web pages you will want to optimize, and what part of the schema.org vocabulary you’ll use to get the best organic traffic. But how?
The easiest way to think of schema is as a way of telling a story on your website, a story told across multiple similar pages that all relate back to your overall goals. Here are some tips that will help you decide which schema markup is best for you.
This may seem obvious, but in order to choose the right schema markup, you'll need to determine what your business is all about, what search terms you want to rank for, and how you want to tell the world about them. Typically, this includes your contact information, products, product reviews, FAQs, and thought-leadership pieces about what your business does. It's a good idea to make a list of every page type on your website, and then categorize them based on what "business purpose" they fit into.
Now take your list and map every single webpage to the proper schema.org vocabulary. There are a few tools that help you do this (we'll get into them later!), but for now, take the time to meticulously map out all your data so you have everything in one place.
This is different from mapping your pages to each schema.org vocabulary option because this step is about recurrence. To figure this out, simply ask yourself: "Does this page have content that is published somewhere else on the website?" If so, you'll need to use a different data format for your schema implementation. A good rule of thumb is that if your website has more than 5 pages of similar content, then that content theme is recurring. If the content only appears once, it can be classified as a single page.
You'll now need to connect the dots between your metadata so search engines aren't left with empty, contextless text strings. Your goal here is to create a knowledge graph so any search engine can easily read your website and understand how all of your content relates to one another.
When a search engine understands exactly who you are and what you do, you are sure to get an SEO boost. That’s because Google tends to show the most relevant information it can find within the first page of the organic search rankings for a query.
There are many tools that can help you connect your schema paths, such as this one from SchemaApp.
Luckily, there are a lot of fantastic online tools to use when creating your website’s schema. The Schema Markup Generator is one of these options and is an easy way to boost your SEO efforts overnight. In most cases, these tools will write all of the code snippets you need, including HTML tags, and all you have to do is place them in the backend of your website.
Our markup generator is quite easy to use. Follow these steps for the best results:
As a way to double-check your work, input your schema markup into Google’s rich results test tool. This test is a wonderful resource to use as it will identify if there are problems with your schema code, plus it will confirm whether or not Google is able to generate rich results from your markup.
In addition to Google’s data testing tool, here are some other options for checking your work:
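- The Schema Markup Validator at validator.schema.org, the official schema.org tool for checking that your structured data is syntactically valid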
With all the free tools available to you, it is surprising how many businesses do not take advantage of the rich results that come with implementing the different types of schema markup. Even though it may seem a bit intimidating to work with schema code at first, these tools, especially a Schema Markup Generator, can really help elevate your website to the next level and increase your rankings for multiple keywords. And what more could you ask for?
There are plenty of options available to you, as long as you stay dedicated to learning. Remember, SEO is similar to a stock market: the effort you put in equates to what you get out of it, and schema is one of the best ways to stand out among your competitors.
As always, our team of SEO experts and web developers are here to help you with any and all of your schema needs. Contact us today for more information about how we can bring your website to new heights.
One important element that many websites forget to include on their web pages is open graph meta tags. These tags are important for ensuring that your content is appealing and clickable when shared on popular social media sites.
Here is a guide to what open graph tags are, why they matter, and how to implement them across your web pages.
Open Graph tags are a set of HTML tags that help you control how your content is displayed on social media websites.
When someone shares one of your web pages on a social network like Facebook or LinkedIn, open graph communicates how to display and format the shared content.
On the frontend, open graph tags make your content appear with the title, image, and display that you prefer, like this:
But without Open Graph meta tags on your web pages, your content on social media sites may look like this:
Which post would you be more likely to click on?
Open Graph is important because it allows you to control how your content appears when it is shared on social media.
These tags allow platforms to pull in specific details about the website and its contents to create a richer, more engaging social media post, much like schema.org markup does for search results.
Open Graph is a great way to ensure that your content looks its best, displays your branding, and projects your content as valuable whenever it is shared online.
Open Graph tags are not a ranking factor that Google relies on when promoting web pages, but they can still indirectly impact your SEO.
How? Because social media is a great avenue for sharing content to grow brand visibility and site authority. If a webmaster sees your content on a social media site and enjoys it, they may choose to link to it on their own web pages, giving you a backlink and valuable link equity.
But if your content doesn’t utilize open graph meta tags, the content looks less appealing whenever shared on social networks or other popular sites, making users less likely to click on it, and webmasters less likely to link to it.
Open Graph Tags are used on major social media platforms, but some of the most important are:
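- Facebook (the creator of the Open Graph protocol)
- LinkedIn
- Pinterest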
Twitter has its own version of Open Graph called Twitter Cards. To learn more, check out our post on Twitter Cards best practices.
There are a number of different tags that you should be implementing on your web page.
If you do not have these open graph tags present on your web pages, they will be flagged in your Site Audit Report as “Missing”.
The title of the webpage is important for helping users understand the purpose and focus of your content. In HTML, the og:title tag looks like this:
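```html
<!-- Placed in the <head> of the page; the content value is a placeholder -->
<meta property="og:title" content="Your Page Title Here" />
```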
Best practices for og:titles include the following:
Similar to a meta description for SEO, the og:description should offer a brief description of the web page’s content to give the user more reason to click.
In HTML, the og:description looks like this:
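```html
<!-- The content value is a placeholder -->
<meta property="og:description" content="A short, compelling summary of the page's content." />
```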
Best practices for og:description include:
The og:image is arguably one of the most important open graph tags because of the visual nature of social media platforms.
The og:image tag includes the URL of an image that represents the content of the webpage. In HTML, it will look like this:
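```html
<!-- The image URL is a placeholder -->
<meta property="og:image" content="https://www.example.com/images/featured-image.jpg" />
```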
Ideally, you should also use the og:image:width and og:image:height tags to let the social media websites know the size of your image.
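For example, with placeholder dimensions:

```html
<meta property="og:image:width" content="1200" />
<meta property="og:image:height" content="630" />
```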
Best practices for og:image open graph include:
The og:url tag tells social media websites the url of the webpage. It looks like this in HTML:
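```html
<!-- The URL is a placeholder -->
<meta property="og:url" content="https://www.example.com/blog/sample-post/" />
```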
Best practices for og:url include the following:
The og:type tag helps social media sites understand the type of webpage (e.g., "article", "blog", "website", "video", etc.). In HTML, it looks like this:
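```html
<meta property="og:type" content="article" />
```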
Best practices for og:type include the following:
For international websites with content in multiple languages, the og:locale tag indicates the language of the web page content. In HTML, it looks like this:
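```html
<meta property="og:locale" content="en_US" />
```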
Best practices for og:locale are pretty simple:
There are a few common issues that webmasters make with open graph tags that will be flagged in your Site Auditor report.
The most common mistake that webmasters make is they simply do not include open graph tags on their web pages.
If an open graph tag is missing from a web page, you will see this issue displayed in your site audit report.
This is an easy fix, and simply involves adding the missing open graph tag into the HTML header of your web page (according to the best practices listed above).
For og:title and og:description, your character counts will be flagged if they do not follow best practices.
To resolve this issue, you will need to either shorten or lengthen the character count of your open graph tags.
This can be done by editing the HTML of your individual web pages.
There are two primary ways that you can add open graph tags to your web pages.
The first is utilizing a popular plugin. For WordPress websites, the one most commonly used by webmasters is Yoast SEO.
Other popular CMS like Wix, Shopify, and others have plugins that make adding open graphs simple.
You can also add open graph tags manually by editing your web page HTML code. Open graph tags should go in the <head> section of your website.
To ensure you have the required fields, you can use a tool like an open graph generator.
If you need assistance adding open graph to your website, our team of web development experts are here to help. Simply request a proposal from our team.
You can also run a site audit yourself after registering for a trial of our SEO software.
Including Twitter Cards on your web pages is a great way to optimize the performance of your content on the popular social media platform.
Similar to open graph tags, Twitter Cards make your content more engaging and clickable to Twitter users.
Here is a guide to adding Twitter cards to your web pages and how it can help with your overall online visibility.
Twitter Cards are a way to enhance your tweets whenever you (or someone else) shares a link to your web page content.
When you include a Twitter Card on a web page, it will show a preview of that content when shared on Twitter. Then, the user can click on the tweet to see the full content on your website.
Without Twitter Cards, your content will look less appealing when shared on Twitter.
For example, here is a web page that is missing an essential Twitter Card. As a result, it doesn’t populate the Twitter post in a way that makes it desirable for users to click or engage with.
Which of these tweets would you be more likely to click on?
The upside of Twitter Cards to your social media presence and visibility is fairly obvious.
But this meta tag can also have ancillary benefits to your SEO visibility as well.
How? Well if a Twitter User clicks on your content and finds the content valuable, they may choose to link to that content on their own website. That link can help bring valuable link equity and improve your website’s ranking potential.
But without that initial click, those users will never see your content. So although not a ranking factor, using Twitter Cards can make it so there are fewer barriers to other people finding, clicking, and sharing your content.
If you are missing essential Twitter cards or there are issues with implementation, the issue will be flagged in your audit report.
There are four Twitter Card types available to webmasters:
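- Summary Card
- Summary Card with Large Image
- App Card
- Player Card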
We recommend using the first or second option for your content, and you will find details on how to implement each below.
Here is an example of what the summary card looks like in Twitter:
This is the simplest Twitter Card and is very easy to implement with just two Twitter card properties: "twitter:card" and "twitter:title".
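A minimal implementation (the title below is a placeholder) looks like this:

```html
<meta name="twitter:card" content="summary" />
<meta name="twitter:title" content="Your Page Title Here" />
```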
The Twitter Summary Card with Large Image will make the image more prominent in the display.
If your content has great visual assets, you may want to consider this Twitter card. Here is an example of the Summary Card with Large Image from Search Engine Journal:
And here is what the Twitter card implementation looks like in HTML:
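```html
<!-- Illustrative example; the title and image URL are placeholders -->
<meta name="twitter:card" content="summary_large_image" />
<meta name="twitter:title" content="Your Page Title Here" />
<meta name="twitter:image" content="https://www.example.com/images/featured-image.jpg" />
```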
The required properties for this card are the same as the Summary Card, but you will need to specify a different twitter:card type.
There are only two required properties for a valid Twitter Card.
However, you may want to consider more to make your content more appealing when shared on Twitter.
Here is a rundown on each property.
This is the most important property because it tells Twitter which of the four types of Twitter cards you are using.
This is a required property in order to produce a valid Twitter Card.
The other required property for validation is twitter:title. This should be the title of your content and give users a good idea of what your content is about.
If you don't define your title, Twitter will use your open graph title (og:title) for the title of your content.
For best practices with this property:
The description will be visible to users below the title and provides a summary about your content.
Best Practices for this property include:
If you do not define twitter:description, Twitter will use your og:description first, then your meta description.
This property communicates the Twitter account associated with the website.
For best practices with this property:
If you don’t define twitter:site, you should define twitter:site:id. The id is the unique numeric value associated with your twitter account.
Arguably one of the most important properties, this will tell Twitter which image to display when the content is shared.
The size of the image is very important, and depending on whether you are implementing a Summary card or a Summary with a large image card, your dimensions will be different.
In addition to proper sizing, best practices with this property include:
The twitter:creator property defines the author of the content.
With author authority becoming even more important in communicating the quality of your content, we recommend including this property on your web pages.
Best practices with this property include:
There will be two types of issues related to Twitter Cards that you may see flagged in your site audit report.
The most common issue is that webmasters simply don't add Twitter Cards to their web pages, or are missing specific properties that make the implementation valid.
This is an easy fix and simply involves adding the required property to your HTML.
The second issue you will see flagged in the site auditor is when your twitter properties do not follow character count best practices.
To fix this issue, you will need to edit the length of your title or description in your Twitter Card implementation.
Although not an SEO ranking factor, adding Twitter Cards to your shareable content is worth the effort!
For more information on how to use Twitter Cards, visit the Twitter Developers website.
Full-scope search engine optimization can feel like juggling cats at times. Often, site owners have a knack for either the technical side or the content side, rarely both. And creating a sitemap for Google is definitely on the more technical side of things. It can also be one of the more time-consuming SEO tasks if you've never created one before.
This guide will eliminate the guesswork and teach you how to create an XML sitemap with just a few simple steps. We will also cover how to submit your sitemap in your Google Search Console account.
As the name implies, a sitemap is a map that tells Google’s webcrawlers what route to take through your website. This XML file helps web crawlers and search engines find and index the pages on your website.
More specifically, a sitemap is a list of the pages on a website with hierarchical signals, so Google understands the structure of your website and which pages are the most important. A sitemap should contain every URL that you want to appear in Google search results.
XML (Extensible Markup Language) is a markup language that defines a set of rules for encoding documents in a format that is both human-readable and machine-readable. It’s the language webcrawlers are fluent in, and XML is used for data storage and definitions.
XML was designed to be both simple and extensible, making it easy to create and interpret documents for people and search engine bots.
There are a few ways that web crawlers can use a sitemap file.
The most common way is to crawl through the links in the sitemap file to find pages to crawl on the website. This is especially helpful if the website has a lot of pages that are not linked to from the main navigation.
Another way that web crawlers can use a sitemap file is to find new pages to crawl on the website. If the website has recently been updated with new pages, the web crawler can identify and find those pages by looking through the sitemap file.
Finally, web crawlers can use a sitemap file to get an overview of the website. This is especially helpful if the website is very large and has a lot of pages. By looking at the sitemap file, the web crawler can see which pages are the most important and which pages need the most attention.
You can include up to 50,000 URLs (and up to 50MB, uncompressed) in a single sitemap file. If your website is larger than that, you can use multiple sitemap files to list the remaining pages.
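If you do need more than one file, the sitemap protocol defines a sitemap index format that lists your individual sitemap files. A minimal sketch, with placeholder URLs and dates:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-pages.xml</loc>
    <lastmod>2024-01-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-posts.xml</loc>
    <lastmod>2024-02-03</lastmod>
  </sitemap>
</sitemapindex>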
A sitemap is an important tool to help improve a website’s search engine optimization (SEO). It allows search engines to crawl and index your website’s pages more efficiently, resulting in better search engine results visibility for your website.
The structure of your website’s sitemap can have a significant impact on your website’s SEO. If your website’s pages are not well organized and easy to navigate, the search engines may have a difficult time indexing them. A well-constructed sitemap will help the search engines find and index all of your website’s pages, which will improve your website’s ranking in the search results.
A sitemap is also an important tool for improving the usability of your website. A well-organized sitemap will help your website’s visitors find the information they are looking for quickly and easily.
One of the most important aspects of SEO is the creation and optimization of a site’s sitemap. When your website’s sitemap is properly optimized, it can help improve your website’s search engine ranking and visibility. You will find instructions on how to optimize your sitemap further down in this article.
A sitemap is a list of the URLs on a website that you want search engines to find and index, along with optional metadata such as when each page was last modified. A robots.txt file, by contrast, tells web crawlers which URLs they are allowed to crawl and which should remain off the SERPs.
Your robots.txt can tell webcrawlers where to locate your sitemap (which comes in handy when you have multiple). You can also indicate which parts of your site crawlers should skip by using Disallow rules. For pages that you do not want to appear in search results at all, add a noindex meta tag to the pages themselves (Google does not support a noindex rule inside robots.txt).
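As a sketch, a robots.txt file that points crawlers to a sitemap and blocks a private directory might look like this (the domain and paths are placeholders):

User-agent: *
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml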
A sitemap lives in the website’s root directory. The root directory is the topmost directory of a website, which contains all of the other directories and files, including the sitemap index file. You can find your site’s root directory by looking at the website’s source code, or by using a web browser’s “view source” feature. However, root directories are not accessible to non-site owners.
For instructions on how to add your sitemap files to a WordPress site, see the instructions for Creating a Sitemap Manually below.
Yes! A sitemap is a great way to ensure that Google knows about all the pages on your website so they can be indexed and ranked properly. And while it may seem intimidating, creating a sitemap is easy when you have the right tools!
A Sitemap URL is a web page that contains a list of all the web pages on a website that should be indexed. The standard format is <domain>/sitemap.xml. This URL is not included in Search Console’s URL data because it is considered a noindex page by Googlebots.
There are many different ways to create a sitemap for your WordPress site. However, we recommend using the Yoast plugin. Why? Yoast is one of the most popular WordPress SEO plugins, so there’s a good chance you already use it for blog content, and in addition to simple on-page optimization, it includes a sitemap index file builder. This makes it easy to create a sitemap for your WordPress site.
To create a sitemap using the Yoast plugin, enable the XML sitemaps feature in the plugin’s settings (in current versions of Yoast SEO, this is a toggle in the plugin’s general settings or features menu).
That’s it! The Yoast plugin will automatically create a sitemap for your WordPress site.
Yoast also works for creating sitemaps in other e-commerce platforms, including Shopify, Squarespace, and BigCommerce. While Yoast SEO works in Wix, most users don’t find it extremely intuitive.
Google used to offer a sitemap creator as a free webmaster tool. However, they found that most people preferred Yoast and other third-party products, so they removed it from Google Webmaster Tools. Luckily, they didn’t leave site owners without alternatives: you can still submit and monitor your sitemap in Google Search Console.
If you’re using Search Console: open your property, navigate to the Sitemaps report, enter your sitemap’s URL, and click Submit.
The free version of Screaming Frog SEO Spider is a great tool for quickly creating a sitemap of your website.
To create a sitemap with Screaming Frog: enter your website’s URL and run a crawl, then choose Sitemaps > XML Sitemap from the top menu, select which pages to include, and export the file. Keep in mind that the free version caps crawls at 500 URLs.
If you’re looking to brush up on your technical SEO skills, you can create a sitemap.xml file without the use of any tools. This process is a bit more time-intensive, but it offers insight into your website’s structure. It’s also a great exercise if you’re looking to learn XML.
If you’re creating a sitemap manually: list every URL you want indexed, wrap each one in the sitemap protocol’s XML tags, and save the file as sitemap.xml (a minimal sketch follows).
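Here is a minimal sitemap.xml sketch following the sitemap protocol, with placeholder URLs and dates:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-02-03</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/about/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>

Only the <loc> tag is required for each URL; the other tags are optional hints.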
Then navigate to the root folder of your website in your hosting provider’s file manager (or via FTP), select “Upload,” and choose your sitemap.xml file. If you use an e-commerce site builder or CMS, such as Shopify or Wix, you will find the steps are quite similar.
Common Mistakes in Sitemap Creation
There are a few common mistakes that people make when creating a sitemap. And the unfortunate reality is that a simple mistake in your sitemap file could hide a page (or your entire site) from search results.
To avoid producing sitemap errors, steer clear of these common pitfalls: including URLs that redirect or return errors, listing pages that are blocked by robots.txt or marked noindex, letting the sitemap go stale after site updates, malformed XML syntax, and exceeding the 50,000-URL or 50MB limit for a single file.
Optimizing your sitemap can help your page perform better in Google search results. While most sitemap generators will do these for you, they’re still good to know.
There are a few key things to keep in mind when optimizing your website’s sitemap:
Make sure you are constantly updating your sitemap as you add new pages or make changes to your website. This will ensure that search engine crawlers are always aware of the latest changes on your website.
Your sitemap should be in XML format. This is the format that search engine crawlers prefer and will be best for helping them index your website.
Your sitemap should include the following information for each page on your website: the page’s full URL (the <loc> tag), the date it was last modified (<lastmod>), and, optionally, how frequently it changes (<changefreq>) and its priority relative to your other pages (<priority>).
Submitting your sitemap to Google is as easy as logging into Search Console and clicking a few buttons.
There are a few different plugin options for creating an XML sitemap for your WordPress site.
The Yoast SEO plugin has a sitemap feature that allows you to create an XML sitemap for your site. The plugin also includes features to help you optimize your site for SEO.
The Google XML Sitemaps plugin also allows you to create an XML sitemap for your WordPress site. The plugin automatically submits your sitemap to Google and other search engines, and it notifies you when your sitemap has been updated.
The WP Sitemap Page plugin allows you to create a sitemap for specific pages or posts on your site. The plugin also includes options to exclude certain pages or posts from your sitemap and to specify the priority of each page or post.
It’s a good idea to check that your sitemap is functioning properly. This ensures that all your different URLs can be indexed by a Googlebot and potentially wind up in the SERPs. SEO tools, such as our Dashboard, provide simple ways to ensure your canonical URLs are part of your sitemap index.
To run a sitemap report:
If you have yet to create and submit your sitemap to Search Console, now is the time to do it. Your sitemap, which is an XML file, tells Googlebots how you would like your site to be indexed. If you’re creating your own site map, remember to use taxonomies to distinguish different types of content and their differing importance.
A canonical tag, often referred to as rel=“canonical,” is an HTML tag that tells search engines which URL is the primary version or “master copy” of content. These straightforward tags give site owners the ability to suggest one URL for Google to designate as the preferred page to appear in searches. Canonical tags also prevent SEO issues that arise from duplicate content.
These simple HTML link elements play a major role in your site’s SEO. They’re also easy to use, but only work when used correctly. If you’re unfamiliar with canonical tags, this article will help you learn how, when, and why to use canonical tags and how to avoid canonical tag issues.
A canonical tag is an HTML link element inserted into a page’s header or <head>. These tags were developed by search engines and rolled out in 2009. They’re one of those great examples of search engines working with site owners to improve the quality of search results.
This tag tells the crawler to index the primary page rather than the duplicate. The canonical URL indicates which page Google should display in search engine results, telling the search engine that the primary version is the one that should receive the organic search visibility.
Keep in mind that while you can tell Google which URL to index, Google may not follow your recommendation.
A canonical tag looks like this:
<link rel="canonical" href="https://example.com" />
A canonical tag is also referred to as a canonical link element, which is a more intuitive label for this unique HTML code. Why? Because a canonical tag provides a canonical link and defines the relationship between the page and that link.
In HTML, rel tells the Googlebot that there’s a relationship between the page and a linked resource. In this case, the relationship identifies the canonical page which appears after the href attribute (href is a hypertext REFerence).
A canonical URL is the primary version of a webpage: the one that site owners want search engines to recognize and index as the correct source of the content.
Within the canonical tag, the canonical URL is the hyperlink reference element, appearing as href="canonical-URL". This denotes the exact URL that should be considered the canonical version of the source content.
When it comes to e-commerce sites and sites that generate ad revenue, you want to be sure you take every opportunity to put your best URL forward in the search engine results pages (SERPs). And canonicalization does just this by telling Google which site should be indexed. Not only can you gain more control over your site, but you’re also able to funnel users to the highest value page.
Even a webpage that may seem unique can be found under a variety of URLs. For example:
http://yourwebsite.com
https://yourwebsite.com
http://www.yourwebsite.com
https://www.yourwebsite.com
https://www.yourwebsite.com/index.html
While each of these URLs will display the same homepage, each is also technically its own URL. This can lead to the same issues as having duplicate content on a third-party website. Without a canonical tag on “yourwebsite.com”, search engine algorithms will not know which is the preferred URL to display to searchers.
Making things even more confusing to search engines, dynamic pages often have a wide array of tags, each of which has its own URL. Content management systems (CMSs) like WordPress often automatically embed tags into web pages, too. So, even a basic page will wind up with a multitude of URLs–each perfectly indexable by search engines.
So, your best bet is to place a self-referencing canonical tag within the canonical URL’s own header, as well.
Furthermore, as you’re tracking your search metrics, you want to compile all organic searches for one page under the same URL. Your canonical tags ensure only the specified page will receive search result metrics.
Many websites build backlinks through content syndication. Creating content is a time-consuming and costly investment, but through syndication relationships, you can provide users on third-party sites with your existing high-quality content while you continue to build your own content library and expand your brand visibility.
However, without canonical tags, search engines will not know whether to index your site or the third-party site for the article. Canonical tags allow you and your syndication partner to solve this problem simply. Note: you can also use the noindex tag on one of the pages to prevent duplication.
Duplicate content can cause various issues related to SEO. When Googlebots index webpages with identical or very similar content, it can: split link equity and ranking signals across multiple URLs, cause the wrong version of a page to rank, waste crawl budget on redundant pages, and scatter your search metrics across duplicates.
First and foremost, canonical tags are one of the few ways you can influence how Google presents your site to searchers. Canonicalization also prevents you from getting ‘docked’ in PageRank for having duplicate content. Although Google does not directly penalize duplicate content, it does prioritize original content that’s well-organized.
Finally, they also allow you to provide users beyond your website with excellent content for backlinking and brand building.
Duplicate content isn’t just copy-and-pasted text. It can be written text, images, and other media that are exactly the same, similar, or reordered. Google also considers placeholder text and images from a CMS to be duplicate content if they’re published to the web.
Basic information, such as copyright text, on every page on your site can even be flagged as duplicates.
Ultimately, for the best SEO results, you will want to use canonical tags throughout your website. Once you update your existing pages, you will want to continue to implement canonicalization best practices.
The first step is to identify which URL version of your site pages should be the canonical URLs. Google prefers if your canonical links are consistent in formatting. So, if you use the “www.” in your homepage’s canonical link, include it in your other canonical URLs.
For example, our site uses the “https” protocol in all of our canonical tags but does not include the “www.”
This will solve any issues with multiple URLs pointing to the same page.
Next, you will want to tag or eliminate any duplicate content within your site. You can do this with the Site Audit tool. It’s as simple as viewing your Content/Duplicates report.
Finally, you will want to find any remaining duplicate content on third-party sites. You can use a tool such as Copyscape to do so. Once you’ve identified your content elsewhere on the web, you will want to decide if the duplication is authorized (such as syndicated content) or unauthorized.
Then, you will want to respond with the corresponding solution: for authorized copies, ask your partner to add a canonical tag (or noindex tag) pointing to your original; for unauthorized copies, request removal.
When it comes to canonical tags, you can reduce duplicate content issues by always using them. However, if you’re updating an existing site, prioritize your highest-value and highest-traffic pages first.
Do you have to be a webmaster to implement canonical tags? Not necessarily. If you’re comfortable with working on your site’s HTML code, you can implement canonical tags on your own.
Here’s how to set up canonical tags:
The easiest way to employ your canonical tags is to insert and update the link element in your page’s HTML <head> section. With the tag in place, the <head> should look something like the example below.
Copy-and-paste version:
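A minimal sketch with a placeholder page title and URL:

<head>
  <title>Example Page</title>
  <link rel="canonical" href="https://www.example.com/example-page/" />
</head>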
That’s all there is to it. There’s no need to be a webmaster to link to the canonical version of a page.
To check if you’ve correctly implemented your canonical tag with the correct URL, you will need to view the source code of your webpage. This process is easy.
Google Search Console and GSC Insights are great tools for finding pages that have been incorrectly tagged. If you’re looking through your organic traffic stats and notice search traffic arriving at a non-canonical page, your canonical tags may be incorrect.
To fix these pages, you will want to navigate to the specific URL then inspect the page.
When creating or updating your sitemap, do not include duplicate URLs. You only need to include your canonical URLs. Your sitemap’s inclusion of the canonical version of a page will hint to Google’s bots to not crawl the duplicate version of the content.
You should not disallow duplicate pages in your robots.txt file. This would block Google from using these pages’ ranking signals. When you correctly implement canonical tags, ranking signals, such as engagements (clicks, scrolls, text entering) and content signals will count toward the canonical page’s metrics.
If you edit your site via a CMS platform such as WordPress, Shopify, Wix, or BigCommerce, you may not need to touch your HTML document directly. Most of these CMSs have specific instructions for adding canonical link tags without editing the code yourself. We will cover the most common CMS platforms.
Using Yoast SEO plugin for WordPress, Shopify, or Wix, you can easily edit and add the preferred URL as your canonical tag.
Canonical tags only work well when implemented correctly–and incorrect implementation can be a disaster. Luckily, there are common mistakes you can avoid to ensure your e-commerce site or ad revenue site makes the most of your next Google crawl.
If you notice that you’re receiving organic traffic to a non-preferred version of a page, you will want to check for the following problems:
Google and other search engines created canonical attributes to improve the organization of websites and the user experience. When you use a 301 redirect where a canonical tag would do, you increase your page load time, because the server must process the redirect before retrieving the other version of the page.
Additionally, when you opt for a redirect instead of a canonical attribute, you’re sending the wrong signal to Googlebots.
Do not select a page without any internal links pointing to it as your canonical version. Canonical tags are just hints to crawlers, and if your canonical URL doesn’t appear in your sitemap, there’s a good chance it will not be indexed.
There’s no need to prevent Googlebots from indexing your duplicate pages. In fact, you want your duplicate pages to pass their link equity and other quality signals onto your canonical page.
Noindex should be reserved for gated content and other content you want to hide from search results.
Be sure to enter the URL for your canonical link correctly. If you’re unsure of what version to use, consider making the absolute URL your default.
An absolute URL should include the protocol (HTTPS), the domain name (www.yourhomepage.com), and any subfolders (/subfolder). Remember that you want to use the HTTPS protocol to demonstrate your site has SSL security for your users.
And always check that your preferred URL has been spelled correctly. This is the most common reason for a 404 error.
When creating blog posts or guides that span multiple web pages, do not canonically link to the first page in the series from the subsequent pages. This will prevent Googlebot from indexing the full series. Instead, you will want to substitute rel="canonical" with rel="prev" and rel="next".
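As a sketch, the middle page of a three-page series might include the following link elements (the URLs are placeholders):

<link rel="prev" href="https://www.example.com/guide/page-1/" />
<link rel="next" href="https://www.example.com/guide/page-3/" />

Worth noting: Google announced in 2019 that it no longer uses rel="prev"/"next" as an indexing signal, though the markup remains valid HTML and other search engines may still read it.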
Hreflang tags tell Google that a page appears in multiple languages, to better serve a diverse, multi-regional audience. Differing language versions can otherwise be viewed as content duplicates, which is why Google asks that webmasters always use hreflang tags in conjunction with canonical tags.
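For example, a page available in English and Spanish might declare both versions in its <head>, alongside its canonical tag (the domain and paths are placeholders):

<link rel="canonical" href="https://www.example.com/page/" />
<link rel="alternate" hreflang="en" href="https://www.example.com/page/" />
<link rel="alternate" hreflang="es" href="https://www.example.com/es/page/" />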
An often overlooked issue is accidentally using more than one rel=canonical tag. This problem can arise when more than one person edits a page. Luckily, it’s easy to fix and easy to avoid if you’re aware of it.
If you insert a canonical tag, but notice organic traffic arriving at the non-preferred page, double-check that all elements are placed correctly. Note that one of the most commonly skipped characters is the end slash.
If you’re not using canonical tags, you’re likely missing out. Canonical tags can prevent a multitude of duplicate content issues that arise from URL variants, resulting in better SEO performance and a more organized site for Google to crawl. Furthermore, when you implement canonical tags, all of your search metrics will be compiled into one tidy page rather than countless variants.
Stay ahead on your search metrics and make the most of your consolidated data with the best keyword tracking tool available.
Whenever a user types a web page URL into their web browser and hits enter, they send a request to the web server to access that specific website. The web server responds with the requested page (plus any additional resources, like images or scripts, that the page needs), and the browser displays the page. The server also returns an HTTP status code with every response.
Most of the time, these HTTP status codes are not shown because the request was successful. However, when the server cannot access the requested resource, it will provide an explanation of why it wasn’t successful via a specific response status code.
This list of HTTP status codes will define the most common types of response codes you might see and those that may be impacting your SEO performance.
The HTTP status code is a three-digit number that tells the browser what happened when it tried to connect to the server. HTTP status codes communicate to the web browser and its user whether or not their request was successful.
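As a sketch, a simplified request-and-response exchange looks something like this (the domain and path are placeholders):

GET /about/ HTTP/1.1
Host: www.example.com

HTTP/1.1 200 OK
Content-Type: text/html; charset=UTF-8

The status line of the response, here “200 OK”, carries the three-digit code.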
HTTP status codes are a big part of SEO because successful requests to the origin server make for a better experience for search engine crawlers and website visitors alike.
In contrast, response status codes that indicate errors or a missing target resource can signal to users, and Google, that the website owner is not doing the necessary maintenance of their website.
There are five different series of status codes, and all status codes are three digits. The first digit indicates the type of response the server returned: 1xx codes are informational, 2xx codes indicate success, 3xx codes indicate redirection, 4xx codes indicate client errors, and 5xx codes indicate server errors.
There are 60+ possible status codes, but some are more common than others. Some are also particularly important when thinking about search engine crawlers and what happens when they follow links to various URLs on our websites.
Pages in the 200 series are what you are aiming for. They communicate that the request was successful: the standard 200 (OK) code means the server returned the requested page, while 201 means the request succeeded and a new resource was created. 2xx codes indicate that the server is working properly and the site visitor and client (or website) are all connecting properly.
Whenever a 200 status code is not found, the site auditor will flag it in your report.
Arguably one of the most important status codes for SEO purposes, 301 redirects communicate that a web page has been permanently moved to a new location or a new URL. When a user enters the URL in their browser, or clicks on a link with the old URL, they will be redirected to the new URL of the page.
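As a sketch, the server’s response to a request for the old URL looks something like this (the destination URL is a placeholder):

HTTP/1.1 301 Moved Permanently
Location: https://www.example.com/new-page/

The Location header tells the browser, or the crawler, where the page now lives.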
301 Redirects, when used properly, can help improve your SEO. They ensure that you do not lose link equity when moving or updating content on your website. For this reason, the Site Auditor does flag issues related to 301 redirects when crawling and analyzing your site.
Some issues related to 301s that you might see highlighted in your issues report include:
404: Not Found
Status codes in the 400 series are generally used when the client has made a request that the server can’t fulfill.
For example, the 400 status code (Bad Request) is used when the client sends a request the server cannot parse. The 401 status code is used when the client doesn’t have the appropriate authentication credentials. The 408 status code (Request Timeout) is used when the client’s request takes longer than the server is willing to wait. And the best-known of the series, 404 (Not Found), is returned when the requested resource doesn’t exist.
404s are not only bad for the user experience of your website, they are particularly bad for your SEO performance. If search engine crawlers are being repeatedly sent to unavailable or dead pages, Google is less likely to see your website as providing valuable content or a high quality page experience to users.
For this reason, 404 status code errors will be flagged in your site auditor report.
There are several potential reasons why a URL might return a 404: the page was deleted or moved without a redirect, the linking URL contains a typo, or the page never existed at that address. The fixes are correspondingly simple: restore the content, correct the link, or add a 301 redirect pointing to the page’s new location.
Status codes in the 500 series are general error messages. They are used when the server encounters an error while processing the request. These errors can often feel a bit like a mystery.
For example, the 500 status code (Internal Server Error) is a generic error returned when the server encounters an unexpected condition. The 501 status code (Not Implemented) is used when the server doesn’t support the functionality required to fulfill the request. The 502 status code (Bad Gateway) is used when a server acting as a gateway receives an invalid response from an upstream server, and a 503 (Service Unavailable) typically means the server is overloaded or down for maintenance.
If your web page is returning a 500 status code error, try the following fixes: check your server’s error logs, roll back any recent plugin or code changes, and contact your hosting provider if the error persists.
It’s important to investigate web page URLs on your website that are producing an invalid response. Why? Because they can prevent users from arriving at the requested resource.
Resolving them can mean better keyword rankings and fewer site visitors bouncing away from your website.
There are two primary ways that you can check the response codes of your web pages.
In your GSC account, navigate to Index > Pages.
You’ll find a display summarizing various errors related to indexing. Messages about 404s or 500 errors will appear in this list.
Click on the error to then analyze the impacted pages more closely.
The Site Auditor will check the HTTP response codes of your webpages. It will also flag any issues it identifies in relation to the status code.
After running your site audit, navigate to the Issues tab in the Site Auditor dashboard.
Click on the “Page URL” category.
Look for any error messages mentioning HTTP status codes, and then click, “See Affected Pages.”
You’ll see a complete list of all of the pages on your website that are not returning 200 status codes.
Hand this list to your web developer to resolve the issue, or connect with one of our SEO experts to determine your next steps for resolving the issues.
Now that you understand the most important HTTP status codes for SEO, you can hopefully resolve any errors on your web pages.
But if you are still not quite sure why your web page URLs are returning specific HTTP status codes, you may want to reach out to a technical SEO agency to see if they can help you resolve the issue.
A crawl budget may seem like a foreign concept when you’re first learning how search engine bots work. While not the easiest SEO concept, it’s less complicated than it may seem. Once you begin to understand what a crawl budget is and how search engine crawling works, you can begin to optimize your website for crawlability. This process will help your site achieve its highest potential for ranking in Google’s search results.
A crawl budget is the number of URLs from one website that search engine bots can index within one indexing session. The “budget” of a crawl session differs from website to website based on each individual site’s size, traffic metrics, and page load speed.
If you’ve gotten this far and the SEO terms are unfamiliar to you, use our SEO glossary to become more familiar with the definitions.
Google doesn’t devote the same amount of time or number of crawls to every website on the internet. Webcrawlers determine which pages they crawl, how often, and for how long based on factors such as the site’s size, how frequently its content is updated, its popularity and link profile, and how quickly its server responds.
The webcrawler indexing process makes search possible. If your content cannot be found and then indexed by Google’s webcrawlers, your web pages, and your website as a whole, will not be discoverable by searchers. This would lead to your site missing out on a lot of search traffic.
Googlebots systematically go through a website’s pages to determine what the page and overall website are about. The webcrawlers process, categorize, and organize data from that website page-by-page in order to create a cache of URLs along with their content, so Google can determine which search results should appear in response to a search query.
Additionally, Google uses this information to determine which search results best fit the search query in order to determine where each search result should appear in the hierarchical search results list.
Google allots a set amount of time for a Googlebot to process a website. Because of this limitation, the bot likely will not crawl an entire site during one crawl session. Instead, it will work its way through all the site’s pages based on the robots.txt file and other factors (such as the popularity of a page).
During the crawl session, a Googlebot will use a systematic approach to understanding the content of each page it processes.
The web crawler will also run a check to determine if the content on the page is a duplicate of a canonical. If so, Google will move the URL down to a low priority crawl, so it doesn’t waste time crawling the page as often.
Google’s web crawlers assign a certain amount of time to every crawl they perform. As a website owner, you have no control over this amount of time. However, you can change how quickly they crawl individual pages on your site while they’re on your site. This number is called your crawl rate.
Crawl demand is how often Google crawls your site. This frequency is based on the demand of your site by internet users and how often your site’s content needs to be updated on search. You can discover how often Google crawls your site using a log file analysis (see #2 below).
Because Google limits the number of times it crawls your site and for how long, you want to be aware of what your crawl budget is. However, Google doesn’t provide site owners with this data. That’s a problem if your budget is so narrow that new content won’t hit the SERPs in a timely manner, which can be disastrous for important new pages, like product pages, that could make you money.
To understand if your site is facing crawl budget limitations (or to confirm that your site is A-OK), compare the number of pages on your site with the number of pages Google is actually crawling, using your server log files (see #2 below) or the Crawl Stats report in Google Search Console.
When the time comes that your site has become too big for its crawl budget, you will need to dive into crawl budget optimization. Because you cannot tell Google to crawl your site more often or for a longer amount of time, you must focus on what you can control.
Crawl budget optimization requires a multi-faceted approach and an understanding of Google best practices. Where should you start when it comes to making the most of your crawl rate? This comprehensive list is written in hierarchical order, so begin at the top.
Google sends requests to multiple pages on your site simultaneously. However, Google tries to be courteous and not bog down your server, which would result in slower load times for your site visitors. If you notice your site lagging out of nowhere, this may be the problem.
To combat affecting your users’ experience, Google allows you to reduce your crawl rate. Doing so will limit how many pages Google can index simultaneously.
Interestingly enough, Google also allows you to raise your crawl rate limit, the effect being that Googlebot can pull more pages at once, resulting in more URLs being crawled simultaneously. That said, reports suggest Google is slow to respond to a crawl rate limit increase, and raising the limit doesn’t guarantee that Google will actually crawl more pages at once.
A log file analysis is a report from the server that reflects every request sent to the server. This report will tell you exactly what Googlebots do on your site. While this process is often performed by technical SEOs, you can talk to your server administrator to obtain one.
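As an illustrative sketch, a Googlebot request in an Apache-style access log looks something like this (the IP, timestamp, and path are placeholders):

66.249.66.1 - - [12/Mar/2024:10:15:32 +0000] "GET /products/blue-widget/ HTTP/1.1" 200 5123 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

Each line records who requested which URL, when, and what status code the server returned, which is exactly the data you need for the steps below.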
Once you have this information, you can use it to perform #3 through #7.
If your Log File shows that Google is spending too much time crawling pages you do not want appearing in the SERPs, you can request that Google’s crawlers skip these pages. This frees up some of your crawl budget for more important pages.
Your sitemap (which you can obtain from Google Search Console or our dashboard) gives Googlebots a list of all the pages on your site that you want Google to index so they can appear in search results. Keeping your sitemap updated with all the web pages you want search engines to find and omitting those that you do not want them to find can maximize how webcrawlers spend their time on your site.
Your robots.txt file tells search engine crawlers which pages you want and do not want them to crawl. If you have pages that don’t make good landing pages, you can disallow their URLs in your robots.txt file; for gated pages you want kept out of search results entirely, add a noindex meta tag to the page itself (Google no longer supports a noindex rule inside robots.txt). Googlebots will skip disallowed URLs and drop noindexed pages from search results.
In addition to freeing up the crawl budget by excluding unnecessary pages from search engine crawls, you can also maximize crawls by reducing or eliminating redirects. These will be any URLs that result in a 3xx status code.
Redirected URLs take longer for a Googlebot to retrieve since the server has to respond with the redirect then retrieve the new page. While one redirect takes just a few milliseconds, they can add up. And this can make crawling your site take longer overall. This amount of time is multiplied when a Googlebot runs into a chain of URL redirects.
To reduce redirects and redirect chains, be mindful of your content creation strategy and carefully select the text for your slugs.
The way Google often explores a site is by navigating via your internal link structure. As it works its way through your pages, it will note if a link leads to a non-existent page, which returns a 4xx error. (A related problem is the “soft 404,” where a missing page incorrectly returns a 200 status code.) It will then move on, not wanting to waste time indexing said page.
The links to these pages need to be updated to send the user or Googlebot to a real page. Or, though it’s rare, the Googlebot may have misidentified a page as a 4xx or 404 error when the page actually exists. When this happens, check that the URL doesn’t have any typos, then submit a crawl request for that URL through your Google Search Console account.
To stay current with these crawl errors, you can use your Google Search Console account’s Index > Coverage report. Or use the Site Audit tool to find your site error report to pass along to your web developer.
Note: New URLs may not appear in your Log File Analysis right away. Give Google some time to find them before requesting a crawl.
Search engine bots can move through a site at a rapid pace. However, if your site speed isn’t up to par, it can take a major toll on your crawl budget. Use your log file analysis and PageSpeed Insights to determine if your site’s load time is negatively affecting your search visibility.
To improve your site’s response time, follow Google’s Core Web Vitals best practices. This can include image optimization for media above the fold, as sketched below.
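As a sketch, you might preload and prioritize your hero image while lazy-loading images further down the page (the file names are placeholders):

<link rel="preload" as="image" href="/images/hero.webp" />
<img src="/images/hero.webp" alt="Hero banner" width="1200" height="600" fetchpriority="high" />
<img src="/images/gallery-1.webp" alt="Gallery photo" width="600" height="400" loading="lazy" />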
If the site speed issue is on the server side, you may want to invest in other server resources, such as a content delivery network (CDN), upgraded hosting or additional server capacity, and server-side caching.
These improvements will also give your user experience a boost, which can help your site perform better in Google search, since site speed is a ranking signal.
Duplicate content is frowned upon by Google, at least when you don’t acknowledge that the duplicate content has a source page. Why? Googlebot crawls every page it can reach unless told to do otherwise. However, when it comes across a duplicate page or a copy of something it’s familiar with (on your page or off-site), it will stop crawling that page. And while this saves time, you should save the crawler even more time by using a canonical tag that identifies the canonical URL.