SEO is basically a pitch to Google about why it should invest its resources in your website. After all, it takes a tremendous amount of time, money, and resources to crawl the countless websites available on the web. In order for Google to buy into your pitch, it needs to understand what your website is offering and why it’s the best option out of all the other websites in the world.

To do this, Google is generally believed to look at over 200 different factors when evaluating a website. Back in 2010, Google stated it used around 200 signals, and that number appears to have only grown over the last 12 years. The exact number, and the way these factors are weighted, are still shrouded in mystery. Many SEOs have run countless experiments over the years to uncover how all of these factors come together to form a ranked result, but the results can be misleading.

In fact, as I was researching for this article, I came across numerous top results that had completely incorrect information. Many claimed a certain factor was important for ranking, only for me to come across official Google documentation contradicting the claims made by the SEOs. For those, I added information that dispelled the myth. Additionally, many listed the same type of factor multiple times, just worded in a different way.

Due to this, I condensed some topics and linked only to official documentation as best I could. This means there are not actually 200 factors listed in this article. Nevertheless, just because we know many of the ingredients doesn’t mean we can see the recipe.

That is why so many SEOs disagree on various factors and also why there is a lot of confusing misinformation out there. One thing is for sure, these factors are ever-changing and often subtle, but they can have a significant impact on whether or not your website shows up in Google’s search results.

An important note to remember: Not only is your site competing against other sites in terms of products & sales, but you’ll also be competing against Google’s ranking algorithm and human reviewers who use an extensive guide to manually assess some of the factors that contribute to a page’s ranking.

While some of these factors are well known (e.g., content quality and keywords), others are much less transparent (e.g., click-through rate and dwell time). So, how can you convince Google that your site is worth the effort on their part? Let’s take a (painstakingly in-depth) look at all of the known ranking factors (in no particular order of importance) and try to shed some light on how they impact a page’s ability to rank on the SERPs.

Table of Contents

Domain-Level Factors

1. Age of the Domain

There is a common misconception among many SEOs that the older a domain, the more Google will trust it. This is not necessarily true. While the age of a domain may be one small factor in Google’s algorithm, it’s certainly not a major one. A new domain has ample opportunity to rank just as well as an older one, provided the content is high quality and there are enough backlinks pointing to it.

Google’s John Mueller explicitly stated that the age of the domain in and of itself basically means nothing in the “eyes” of the algorithm. While having a seasoned domain is good for other ranking metrics, the fact is that there are many other factors that are much more important when it comes to determining how much trust Google will place in a site.

Read More About Domain Age as a Ranking Factor >>

2. Domain History

Age and history are two separate things when it comes to domains. Just because a domain is old, doesn’t necessarily mean that it has a good history. In fact, there are many factors in a domain’s history that can negatively impact the trust Google places in the domain.

Some of these can include:

  • The number of times the domain has changed hands.
  • Whether or not the domain has been used for spammy activities, including whether it has been penalized in the past.
  • If the content on the domain has always been high quality, or if it has fluctuated wildly.

Read More About Domain History as a Ranking Factor >>

3. Exact Match Domain (EMD)

An exact match domain (EMD) is a domain that includes a target keyword in the URL. For example, if you wanted to rank for the keyword “blue snorkels”, you might want to purchase the exact match domain “www.bluesnorkels.com”.

Exact match domains used to be a major ranking factor in Google’s algorithm. However, this changed when Google rolled out its EMD update in 2012. Since then, exact match domains have had much less weight in the algorithm. In fact, Matt Cutts, Google’s former head of webspam, explicitly stated that exact match domains are no longer a major ranking factor.

While they may not be as important as they used to be, exact match domains can still give some benefits. These days, they are mostly used by brands as a way to get a leg up on the competition.

For example, if you have the exact match domain “www.bluesnorkels.com”, and your competitor has the domain “www.bestbluesnorkels.com”, you’re slightly more likely to rank higher for the keyword “blue snorkels”. That said, nothing is ever guaranteed when it comes to SEO, so don’t count on it.

Read More About Exact Match Domains & Ranking >>

4. Keywords in Top-Level Domain

Placing keywords in a top-level domain (TLD) has a similar effect to an EMD. There really is no significant ranking benefit to it, other than brand recognition.

5. Keywords in a Subdomain

According to independent research done by Moz, having keywords in a subdomain can give a slight ranking boost. However, this is likely not because of the keyword itself, but rather because it’s an indication of site structure.

For example, if you have a blog on your website, it would make sense to put it in a subdomain like “blog.example.com” or “shop.example.com”. This is because blogs and stores are typically separate from the rest of a website’s content, and thus would logically be on a separate subdomain. However, whether or not you use a subdomain for your blog or any other purpose is entirely up to you.

While having keywords in a subdomain can give a small ranking boost, it’s important to remember that over-optimizing your website for keywords can actually lead to penalties from Google. So, if you’re going to use keywords in a subdomain, make sure that it’s for a logical purpose and not just to game the system.

Read More About Subdomain Keywords as a Ranking Factor >>

6. Country TLD Extension

A country TLD (also called a country-code TLD, or ccTLD) is an extension that identifies a country, rather than a generic extension like .com or .net. For example, .ca is the country TLD for Canada, .uk is the country TLD for the United Kingdom, and .au is the country TLD for Australia.

If you’re targeting a specific country with your website, it can be beneficial to use a country TLD. This is because it signals to Google that your website is relevant for that country.

However, it’s important to note that using a country TLD does not guarantee top rankings in that country’s search engine results pages (SERPs). And, in fact, it might hurt your ability to rank globally. So, it is important to consider whether or not using a country TLD is right for your website before making the decision to do so.
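If registering a ccTLD for every market isn’t practical, hreflang annotations are Google’s documented alternative for signaling country and language variants of a page from a single site. A minimal sketch (the URLs and regional paths here are hypothetical):

```html
<!-- Placed in the <head> of each page variant; every variant should
     list all of its alternates, including itself -->
<link rel="alternate" hreflang="en-au" href="https://example.com/au/" />
<link rel="alternate" hreflang="en-gb" href="https://example.com/uk/" />
<!-- x-default is the fallback shown to users who match no listed region -->
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```

Note that hreflang annotations must be reciprocal: if the Australian page points to the UK page, the UK page must point back, or Google may ignore the annotations.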

7. Domain Registration Expiration

It’s up for debate whether the length of time between now and when your domain is set to expire can affect your rankings. The confusion over this started back in 2005 when GoDaddy claimed Google’s “Information retrieval based on historical data” patent application provided proof that domain expiry date is a ranking factor.

This is simply not true. Google’s John Mueller has stated multiple times that domain expiration date is not a ranking factor.

Regardless, if your domain is set to expire within a few months or even next year, it’s a good idea to renew it for multiple years. This will show users that you’re serious about your website and that you’re in it for the long haul, even if it has no effect on rankings.

8. Public vs. Private WhoIs

The WhoIs directory is a database that contains the contact information for every registered domain. This information includes the name, address, phone number, and email address of the domain’s owner.

The WhoIs directory is public by default, which means that anyone can look up the WhoIs record for any domain. However, most registrars offer the option to make a WhoIs record private.

If you’re not comfortable with your personal information being public, you can opt for a private WhoIs record. However, there is some evidence that having a public WhoIs record can actually improve your SEO. In particular, it seems to be a factor in local search rankings.

9. Previously Penalized WhoIs Owner

If the owner of a domain has been penalized by Google in the past, it could negatively affect the SEO of any new or existing domains that they register. This is because Google may view the owner as someone who is trying to game the system.

Therefore, if you’re considering buying a domain from someone, it’s a good idea to check their WhoIs record to see if they have a history of being penalized. Otherwise, you could be buying a domain that has a black mark against it, and ranking highly may be more difficult than you think.

10. Domain Trust/TrustRank

In addition to PageRank, Google is believed to use a metric known as “TrustRank,” which is a measure of the trustworthiness of a website based on factors like age, popularity, and webmaster interactions.

In general, websites with high TrustRank are more likely to rank higher than those with low TrustRank. Google’s TrustRank patent explicitly states in the Abstract section:

“A search engine system provides search results that are ranked according to a measure of the trust associated with entities that have provided labels for the documents in the search results. A search engine receives a query and selects documents relevant to the query. The search engine also determines labels associated with selected documents and the trust ranks of the entities that provided the labels. The trust ranks are used to determine trust factors for the respective documents. The trust factors are used to adjust the information retrieval scores of the documents. The search results are then ranked based on the adjusted information retrieval scores.”

So, while the exact weight is unknown, the patent suggests that trust plays a meaningful role in how results are ranked.

11. Parked Domains

Parked domains are domains that are registered but are not being used for an active website. Instead, they are “parked” by the registrar, typically with placeholder content and/or ads.

In 2011, Google stated its new algorithm can detect and, in most cases, hide parked domains from search results:

“New “parked domain” classifier: This is a new algorithm for automatically detecting parked domains. Parked domains are placeholder sites with little unique content for our users and are often filled only with ads. In most cases, we prefer not to show them.”

If you have recently purchased a parked domain, it’s best to wait until it’s been active for a while before you try to use it for SEO purposes.

12. Shared Web Hosting Flagged as Spam

If the IP address of your server has been flagged for spam, you may or may not also get flagged. As Matt Cutts explains:

“On the list of things that I worry about, that would not be near the top. Google understands that shared hosting happens and you can’t really control or help who else is on that IP address or Class-C subnet…I have seen this happen at least once, where if you have, say, thousands and thousands of spammy websites on one IP address, and then one normal website, that can look a little bad.”

But, in general, if there is a normal mix of websites on the IP address, it’s likely not something you have to worry about. If you want to avoid this situation entirely, consider dedicated hosting rather than a shared server. You can also check the Manual Actions and Security Issues reports in Google’s Search Console to see whether your site itself has been flagged. If your shared IP has a poor reputation, contact your hosting provider about cleaning it up or moving you to a different address.

Site-Level Factors

13. Site Architecture

Your site architecture is the structure of your website. It is the way that you organize your pages and connect them together. A well-designed site architecture makes it easy for users and search engines to find the content they’re looking for.

Some common elements of a well-designed site architecture include:

  • A clear hierarchy
  • Easy navigation
  • Logical URL structure

Not only does having clean architecture improve user experience (UX), but it also helps Google crawl the site more efficiently. The easier you make it for users and Google to find your content, the more likely it is that they will actually visit your site.

14. Site Usability

If your site is difficult to navigate, users will likely leave without finding the content they’re looking for. This often leads to a higher bounce rate. Likewise, if it’s hard for Google to navigate your site, it may never show up in the search results.

Therefore, it’s important to design a website that is both user-friendly and search engine friendly. Some common elements of a user-friendly website include:

  • Intuitive navigation
  • Logical site structure
  • 404 pages that redirect to relevant, updated content
  • Use of breadcrumbs
  • Easy to find contact information

15. Sitemap(s)

A sitemap is a file that tells Google (and other search engines) about the pages on your website. It is essentially a map of your website. Creating and submitting a sitemap helps to ensure that Google knows about all of the important pages on your site, and it can also help to speed up the indexing process.

A sitemap often includes content such as pages, posts, images, newsletters, etc. Sitemaps can also be used to specify the frequency with which a page is updated, the date when it was last updated, and the importance of a page.

There are two types of sitemaps:

  • XML sitemaps: These are designed for search engines, and they help to improve the crawling and indexing of your website.
  • HTML sitemaps: These are designed for users, and they help to improve the usability of your website.

Ideally, you should have both an XML sitemap and an HTML sitemap. However, this may not always be practical, so you may need to choose one or the other depending on your website.
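For reference, a minimal XML sitemap follows the sitemaps.org protocol and looks like this (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2022-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://example.com/blog/blue-snorkels/</loc>
    <lastmod>2022-05-15</lastmod>
  </url>
</urlset>
```

Worth noting: Google has said it largely ignores the `<changefreq>` and `<priority>` fields, and relies on `<lastmod>` when it is kept accurate.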

16. Unique Content

Your site needs to have high-quality, unique, and engaging content if you want to rank well on Google. This has been true for a long time, and it’s still true today. Google explicitly states that thin content, especially on affiliate sites, can be a reason for manual action.

To avoid being penalized, make sure that your site’s content is unique and valuable. This means writing original content that actually provides something of value to your readers.

One of the best ways to ensure that your site has high-quality content is to produce original research. This could take the form of data studies, surveys, unique product descriptions, and more.

17. E-A-T Principle

E-A-T is a principle Google asks site owners to follow when creating content. First and foremost, you want to focus on providing the users with high-quality content that is relevant to their search query.

In addition to this, you also want to focus on creating content that has a high level of Expertise, Authoritativeness, and Trustworthiness. This is often referred to as “E-A-T.”

To create content with a high level of E-A-T, you need to:

  • Be an expert on the topic you’re writing about
  • Have credentials that show you’re an authority on the topic
  • Be a source that can be trusted

Google has said that they use E-A-T as a ranking factor, so it’s important to focus on this when creating content for your website.

18. Reviews & Reputation

Google’s Search Quality Rater Guidelines state that reputation is especially important for YMYL (Your Money or Your Life) websites. These include, but are not limited to, sites in industries such as healthcare, finance, and e-commerce.

If you have a website in one of these industries, it’s important to focus on building up your reviews and reputation. Google uses reviews as a ranking factor, so the more positive reviews you have, the better.

There are a few different ways to get reviews, but the most common is to simply ask your customers or clients to leave a review on your Google My Business listing or on a third-party site like Yelp. Alternatively, e-commerce sites often have reviews built into their product pages.

Sites that are not e-commerce may opt to include information on executives and employees on their website. This is often seen in the “About Us” section and can help to build trust with potential customers or clients. Doing this also gives you a chance to showcase certifications, awards, and other credentials that can help to build your reputation.

Whenever possible, be sure to add an external link to sites that can verify credentials and awards. These include, but are not limited to, Better Business Bureau, Angie’s List, and the Chamber of Commerce.

19. Contact Us Page

The contact page is one factor that goes into building your TrustRank. A site that has a well-designed contact page with all of the necessary information is more likely to be trusted than one that doesn’t.

At a minimum, your contact page should include:

  • Your name
  • Your company name
  • Your address
  • Your email address
  • Your phone number
  • A map or directions to your location (if applicable)
  • A form for visitors to fill out (if you don’t want to display your email address)

20. Sitewide Links

Sitewide links are links that appear on every page of a website. Sitewide links are generally used for navigation. These can be links to internal pages, like the contact page or the About Us page, or they can be links to external websites.

As Google’s Matt Cutts has stated, sitewide links are compressed into a single link. This means that if you have a link in the header that appears on every page, it will be counted as one link.

21. Hacked Site

If your site gets hacked, you will likely drop off search results entirely until the security of your site can be verified by Google. This is because hacked sites often contain malicious code that can harm the users of your site.

22. Site Over-Optimization

Yes, there is such a thing as too much of a good thing. When it comes to optimizing your website for search engines, you need to strike a balance between making your site as relevant and keyword-rich as possible while still making it readable and user-friendly.

If you go too far with the keywords, you run the risk of being penalized for keyword stuffing. Likewise, if you don’t use enough keywords, you may not be able to rank as high as you’d like.

Therefore, it’s important to find a happy medium when optimizing your site for search engines. So, don’t overthink it. Keep your focus on creating amazing, unique, engaging content.

23. Affiliate Sites

Affiliate sites are sites that promote products or services in exchange for a commission. While there’s nothing inherently wrong with this, Google does not look favorably upon affiliate sites. This is because such sites are often filled with low-quality content and exist solely for the purpose of promoting products.

By building an affiliate site, you’re opening yourself up to enhanced scrutiny from Google. Therefore, if you want to avoid any potential penalties, it’s best to build an affiliate site that is rich in helpful content and low on promotions.

Additionally, don’t try to obscure the fact that you earn a commission in any way. Use the proper rel=sponsored tag for links and do not try to cloak your affiliate links. If you’re caught doing this, you will most definitely be penalized. Be as transparent as glass.
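For example, a transparently marked-up affiliate link might look like this (the URL and tracking parameter are hypothetical):

```html
<!-- Declare the commercial relationship so Google treats the link appropriately -->
<a href="https://example.com/snorkel?ref=aff123" rel="sponsored">Blue Snorkel Pro</a>
```

Google also accepts `rel="nofollow"` on these links, but `rel="sponsored"` is the attribute it specifically recommends for paid and affiliate placements.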

24. Blocked Sites

Once upon a time, Google allowed users to block sites using a Chrome extension. While this feature is no longer available, it’s possible that Google still uses that data, or a similar form of data, when determining rankings.

25. HTML/W3C Validation Errors

HTML and W3C validation errors happen when your site’s code doesn’t meet the standards set by the World Wide Web Consortium. These errors can range from small issues like missing closing tags to more serious issues like invalid code.

Validation errors won’t necessarily hurt your ranking directly (although this is up for debate), but they can make your site look unprofessional and cause problems for users. If users have a bad experience, it’s going to alert Google and that can indirectly hurt your ranking.

The best way to fix HTML/W3C validation errors is to use a tool like the W3C Markup Validation Service. It will scan your site’s code and provide you with a list of all the errors it finds. Once you have a list of errors, you need to fix them.

26. Manual Actions

A manual action is when Google takes action on a site that is in violation of its Webmaster Guidelines. This can happen for a number of reasons, such as spammy or artificial links, thin content, and more.

To determine if your site has been subjected to manual action, you can view the Manual Action Report in Search Console. If you receive a manual action, it’s important to take the necessary steps to fix the issue and submit a reconsideration request. Otherwise, your site could be severely penalized, which could lead to a dramatic drop in traffic.

27. Reconsideration Request

A reconsideration request is a formal way of asking Google to review your site after it has been penalized for violating its Webmaster Guidelines. If you receive a manual action, you will need to submit a reconsideration request in order to get your site back in Google’s good graces.

In order to submit a reconsideration request, you will need to fill out a form in Search Console. Once you have submitted the form, Google will review your request and determine whether or not to lift the manual action.

28. Google Dance & Sandbox

Sites that are suspected of violating Google’s guidelines may find themselves in the Google Sandbox or doing the Google Dance.

Google Sandbox is a temporary holding pen for websites that require review. It’s a way for Google to keep an eye on these sites and make sure they’re not up to any funny business. Once a site has proven itself to be trustworthy, it will be released from the sandbox.

The Google Dance is when a site’s ranking fluctuates wildly. Sometimes this term is used when rankings fluctuate after a major algorithm update, or as a way to determine whether a site is trying to manipulate its rankings. According to Google’s patent “Changing a rank of a document by applying a rank transition function,” the system:

“determines a first rank associated with a document and determines a second rank associated with the document, where the second rank is different from the first rank. The system also changes, during a transition period that occurs during a transition from the first rank to the second rank, a transition rank associated with the document based on a rank transition function that varies the transition rank over time without any change in ranking factors associated with the document.”

Technical Factors

29. Site Updates

Google strives to deliver the most timely, relevant information to its users. However, according to John Mueller, it doesn’t really pay special attention to the frequency of updating content. If an article from 10 years ago is still relevant and helpful today, it could still rank well.

However, if your website is outdated and no longer relevant, it will likely struggle to rank. So, this is likely where the idea of consistently updating content on the site comes from. Additionally, updating with new content can help your site really nail your topics, rank for new keywords, and more.

Or, it could also refer to updating the site’s technical specs, such as security patches, to make sure it’s up-to-date and secure. If you don’t regularly update plugins, PHP versions, and other things, your site could become vulnerable to attack. If your site is attacked, it could not only lose rankings but could also get blacklisted by Google for obvious reasons.

Despite the ambiguity in how it affects your rankings in search engines, keeping your website fresh and pertinent to users’ needs is still very important for indirect reasons. You simply don’t want to set up your website and then neglect it.

One way to ensure that your site is regularly updated is to add a blog. However, as time goes by, old blog posts end up in a “blog post graveyard” so to speak as new content pushes them down in the search results.

To avoid this, set up a content calendar that tracks the date of each blog post and the topics that you plan to cover. Routinely review older posts and update them as necessary with new internal links, information, and images. This will help to keep your site fresh and relevant, which is what users (and Google) want.

30. Server Location

If your SEO strategy relies heavily on searches from a particular country or region, it can help to have your server located in the same country (and, ideally, region) as your target market.

A common misconception is that server location only matters for international SEO. In reality, it can act as a geotargeting signal for any country-specific strategy, even if you only target a single market.

For example, if you’re targeting customers in Australia, your server should be located in Australia. This is because Google uses the server location to determine where a website is physically located.

Additionally, keep in mind that if you’re targeting multiple countries, it can be advantageous to have a separate server (and possibly even subdomain) for each one. This can also help you better localize your content to reflect the language, culture, and other factors of each target market.

31. Site Uptime

Your site’s uptime is the percentage of time that it’s accessible and working properly. If your site is down, users can’t access it, which obviously isn’t good for business.

Additionally, if your site is down when Google tries to crawl it, that can lead to errors in the indexing process, which can hurt your rankings. Therefore, it’s important to make sure that your site has a good uptime record.

A number of tools can monitor your site’s uptime and alert you when it goes down, so that you can take action to fix the issue quickly.

32. HTTP vs HTTPS

An SSL certificate is a digital certificate that authenticates the identity of a website and encrypts information sent to the server using SSL technology. It is what distinguishes between HTTP (non-secure) and HTTPS (secure) sites.

SSL stands for Secure Sockets Layer, which is a protocol for establishing an encrypted link between a web server and a browser. This link ensures that all data passed between the web server and browser remain private and secure.

To understand the difference, you may have tried to visit a site, only to be met with a screen warning you that the connection is not secure. However, sometimes the warning is more subtle, with the little lock icon in the browser bar being unlocked. Obviously, if users encounter these types of warnings, it’s not going to inspire a lot of confidence in your business.

Google has stated that they prefer websites that use SSL encryption and may give a slight ranking boost to those sites. Therefore, if you’re not using an SSL certificate on your site, it’s definitely something to consider.
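Once a certificate is installed, you’ll also want to redirect all HTTP traffic to HTTPS. A minimal sketch for an Apache server (this assumes mod_rewrite is enabled; nginx and other servers use their own syntax):

```apache
# Force HTTPS with a permanent (301) redirect, preserving the requested path
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

The 301 status matters here: it tells Google the move is permanent, so ranking signals consolidate on the HTTPS version.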

34. Metadata

Metadata, in this context, refers to the title tag and meta description: the HTML elements Google uses to generate the headline and snippet shown in the search results. The title tag is the main headline of the search result, while the meta description is a brief summary of the page’s content.

Both the title tag and meta description are important for SEO as they give users an idea of what your page is about and can influence their decision to click through to your site.

To optimize your title tags and meta descriptions, make sure that they are clear, concise, and relevant to the page’s content. In addition, using keywords in your title tags and meta descriptions can help to improve your click-through rate as it will make your listing more relevant to what users are searching for.
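In HTML, both elements live in the page’s `<head>`. A sketch, reusing the hypothetical snorkel store from earlier examples:

```html
<head>
  <!-- Title tag: the headline shown in the search results -->
  <title>Blue Snorkels for Kids &amp; Adults | BlueSnorkelCo</title>
  <!-- Meta description: the summary shown beneath the title -->
  <meta name="description"
        content="Shop durable blue snorkels with free shipping. Compare sizes, read reviews, and find the right fit." />
</head>
```

Keep in mind that Google may rewrite your description when it thinks a different snippet better matches the query, so treat it as a suggestion rather than a guarantee.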

35. Meta Description Spamming

Meta description spamming is the act of stuffing keywords into your meta tags in an attempt to manipulate search rankings. This is a form of black hat SEO and is looked down upon by Google.

Meta tag spamming used to be more common in the early days of SEO but has since fallen out of favor. Google has become better at detecting this type of spam and penalizing sites that engage in it.

Therefore, if you want to avoid being penalized, it’s best to refrain from stuffing keywords into your meta tags. Instead, focus on creating descriptions that accurately reflect the content on your page and are useful to searchers.

36. Breadcrumbs

Breadcrumbs are a type of navigation that shows the user’s current location within a website. They usually take the form of a list of links, with the last link being the current page. For example:

Home > Category > Subcategory > Page

Breadcrumbs are beneficial for both users and search engines as they provide a way to navigate a website and can help to improve the click-through rate from the search results. In addition, they can also help to improve the crawlability of a website as they provide an additional layer of navigation.

To implement breadcrumbs on your website, you can use a plugin or add the code to your theme. If you’re not comfortable doing this, you can also contact your web developer to help you out.
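Beyond the visible links, Google’s documentation recommends adding BreadcrumbList structured data so the trail can appear in the search results. A sketch with placeholder names and URLs:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Category", "item": "https://example.com/category/" },
    { "@type": "ListItem", "position": 3, "name": "Page" }
  ]
}
</script>
```

Per Google’s guidelines, the final item (the current page) can omit the `item` URL, since the user is already there.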

37. Core Web Vitals

Google’s Core Web Vitals are a set of metrics that measure the performance and user experience of a website. The three metrics are:

  • Largest Contentful Paint: measures the time it takes for the largest element on the page to load.
  • First Input Delay: measures the time it takes for the page to become responsive to user input.
  • Cumulative Layout Shift: measures the amount of unexpected layout shift on a page.

These metrics are important as they give an indication of how user-friendly your website is. In addition, Google has confirmed that Core Web Vitals are part of its page experience ranking signal.

To improve your Core Web Vitals, you need to focus on improving the performance of your website. This can be done by optimizing your images, reducing the amount of code, and using a faster web host.
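As a small example of the image-optimization point, explicit dimensions and selective lazy loading address two of the three metrics (the file name and alt text are hypothetical):

```html
<!-- width/height let the browser reserve layout space before the image loads,
     reducing Cumulative Layout Shift -->
<!-- loading="lazy" defers below-the-fold images; never lazy-load the hero
     image, as that delays Largest Contentful Paint -->
<img src="reef.jpg" width="1200" height="600"
     alt="Snorkeler swimming over a coral reef" loading="lazy" />
```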

38. Mobile Optimization

As the world increasingly moves to mobile devices, it’s important that your website is optimized for them. Mobile optimization includes making sure that your website can be easily viewed and used on a mobile device.

Some things to keep in mind when optimizing for mobile are:

  • Use a responsive design: this means that your website will automatically adjust to the screen size of the device.
  • Use large, easy-to-tap buttons: this will make it easier for users to navigate your website on a small screen.
  • Use legible fonts: this will ensure that your content can be easily read on a mobile device.
  • Use compressed images: this will help to improve the load time of your website on a mobile connection.

To check if your website is mobile-friendly, you can use Google’s Mobile-Friendly Test tool. This will give you a report on any issues that need to be fixed.

39. Keyword + Brand Searches

A keyword + brand search is when a searcher includes a brand name in their query. For example, a searcher might type “BlueSnorkelCo blue snorkels” rather than just “blue snorkels.” This applies to any type of website, not just eCommerce sites.

Keyword + brand searches are beneficial for two reasons:

  1. They indicate that the searcher is interested in your brand. This means that they are more likely to be qualified leads.
  2. They can help to improve your click-through rate as you’re appearing in a search that includes your brand name.

To ensure that you’re appearing in keyword + brand searches, you need to make sure that your brand is included in your title tags and meta descriptions. In addition, finding other (ethical) ways to maximize this effect can give you a significant boost in rankings.

Google's Products

40. YouTube

Google owns YouTube, the world’s largest video sharing website. If you want to use YouTube to market your business, you can create a channel and start uploading videos. You can also use YouTube to drive traffic to your website by including links in your video descriptions.

If your site is linked from a high-quality YouTube channel, it is very likely to benefit your search rankings. The keyword there is “high-quality.” In fact, if the videos are good enough, you may see them in the search results instead of your site.

41. Google Analytics & Search Console

Google Analytics and Search Console are valuable tools in their own right, but Google denies that using them will have a direct effect on your rankings. However, they can be useful in indirect ways.

Google Analytics can help you to track the traffic to your website and see how users interact with it. This information can be used to improve your website and make it more user-friendly.

Search Console can help you to identify any errors on your website that could be affecting your rankings. It can also help you to track your website’s performance in the search results over time.

Both of these tools are free to use and easy to set up. If you’re not already using them, it’s worth taking the time to do so.

42. Google Business Profile

Google Business Profile (formerly known as Google My Business) is a free listing service that allows businesses to manage their information on Google. This includes their address, phone number, business hours, and website.

Having a Business Profile is important for two main reasons:

– It helps your business to be found on Google Maps and in the local search results.

– It allows you to control what information is shown about your business.

If you haven’t already, you should claim your Business Profile and fill out as much information as possible. Having a brick-and-mortar address may provide a strong signal to Google that you are a legitimate company, as well.

43. Google Discover

Google Discover is a feed of personalized content that is shown to users when they open the Google app or visit the Google homepage. The content is based on the user’s interests and is updated over time.

Google Discover can be a great way to promote your content to a wider audience. If you create high-quality content that is relevant to your target audience, there’s a good chance it will be shown in the Discover feed.

To improve your chances of being included in Discover, you can use Google Web Stories. This is a format that is designed specifically for mobile users and is more likely to be shown in the Discover feed. While not a ranking factor in and of itself, being included in Discover can lead to an increase in website traffic.

Page-Level Factors

44. URL Length

The length of a URL appears to have an impact on search engine rankings. In general, shorter URLs tend to rank better than longer ones. This is likely because shorter URLs are easier to remember and share. They are also less likely to be affected by typos and other errors.

However, site structure can play a role in URL length. Factors such as whether the site uses www, categories, and subdomains can all affect the length of a URL. As such, you shouldn’t focus too much on URL length when optimizing your website. The goal is to have a clear, concise URL that is not trying to stuff keywords.

45. URL Path

This ties into the previous URL length. The distance from the root directory of a website to the page in question (i.e., the URL path) appears to have an impact on search engine rankings. In general, pages that are closer to the root directory (i.e., have a shorter URL path) tend to rank better than pages that are buried deep within the site.

However, even if you try to keep the URL short by using the shortest possible path, there are other factors that can come into play. For example, if your site is highly structured with multiple layers of categories and subcategories, the URL path may be longer even if the actual page is close to the root directory. For sites like this, shortening the category names won’t negate the depth of the URL path.

To paint a visual image, think of the page as a buried treasure. If it is buried just below the surface, it is easier to find than if it is buried deep underground.
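URL depth is easy to measure yourself: count how many path segments sit between the domain and the page. A small Python sketch (the example URLs are made up):

```python
from urllib.parse import urlparse

def url_depth(url):
    """Count how many directory levels separate a page from the site root."""
    path = urlparse(url).path
    # Split on "/" and drop the empty strings left by leading/trailing slashes
    return len([segment for segment in path.split("/") if segment])

# A page "buried just below the surface" vs. one "buried deep underground"
shallow = url_depth("https://example.com/blue-snorkels/")                 # 1 level
deep = url_depth("https://example.com/shop/gear/water/snorkels/blue/")    # 5 levels
```

Nothing here is a ranking tool, of course; it just makes the treasure-depth analogy concrete.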

46. URL String

Categories contained within the URL string (i.e., the part of the URL after the domain name) appear to have an impact on search engine rankings. In general, pages that contain relevant keywords in their URL tend to rank better than those that don’t.

However, this is just a general trend, and there are many exceptions. For example, if you try to stuff too many keywords into the URL string, it will look spammy and could hurt your rankings. The goal is to have a URL that is clear and concise, with one or two relevant keywords included.

47. Keyword in URL

Placing a keyword in the URL is, indeed, a ranking factor. However, it is very small. To better understand this, it is important to note that the URL is the first thing that a search engine sees when it crawls a page. Therefore, it makes sense that the URL would be one of the first places that a keyword would be found. But, the keyword in a URL is still just one small part of the larger picture.

48. Keyword in Title Tag

It used to be that having a keyword in the title tag was a significant ranking factor. However, while it is still important, its importance has diminished over time.

The title tag is the text that appears in the search engine results pages (SERPs) when a page comes up as a result of a particular query. It is also what appears in the browser tab when you visit a website.

49. Title Tag Starts with Keyword

Following the same logic as the URL being the first thing a search engine sees, it makes sense that having the keyword at the beginning of the title tag would be a good thing.

Various experiments have shown that pages that place the keyword at the beginning of the title tag tend to rank better than those that don’t. The weight this practice carries in ranking is unclear. Regardless of the weighting, it is obviously a very beneficial practice in the context of user experience and making it explicitly clear what the page is about.

50. Keywords in Meta Description

The meta description is the text that appears below the title tag in the SERP. It is meant to give the searcher a brief overview of what the page is about.

According to Google, the meta description is not a direct ranking factor. With that being said, however, good meta descriptions tell a user what the page is about. And, it can impact your click-through rate (CTR) from the SERPs, which can affect your rankings.

51. Page Categories

The categories you place your pages in can have an impact on their rankings. This is especially true if the category is a broad one that contains many pages. This is because categories send signals to search engines about the relevancy of the page.

The bottom line is that if you want your pages to rank well, it is important to make sure that they are placed in appropriate categories. While the boost may be small, effective use of categories can be the difference between a page that ranks on the first page and one that doesn’t.

52. Keywords in H1 Tag

The H1 tag is often the same as the page title and title tag. You can think of the H1 tag as a secondary title tag in terms of importance. Adding a keyword in the H1 tag does offer a small boost in rankings.

It used to be that having more than one H1 tag on a page was a bad thing. In fact, many SEO tools will alert you if a page has more than one H1. Generally speaking, it is still best practice to only have one H1 tag per page. Nevertheless, Google has stated that with advancements in the algorithm, having more than one H1 doesn’t carry the weight it once did.

There is one caveat to this, though. Heading tags are used by screen readers to help those with disabilities understand the structure and hierarchy of a page. Therefore, if you have multiple H1 tags, it can be confusing for screen readers.

53. Keywords in H2, H3 Tags

Similar to H1 tags, you want the structure of the page to flow in a hierarchical manner. That is, you want your H2 tags to be subheadings of your H1 tag, and your H3 tags to be subheadings of your H2 tags.

While Google has said that they don’t carry as much weight as they used to, they are still important for the structure of your page. And, as John Mueller states, a clear hierarchy is a factor in rankings. In addition, they help break up the content, which can improve the user experience.

Some website developers and designers are unaware of the impact headings can have. Therefore, they often use the tags out of order based on aesthetics. If you’re in the process of building a website, be sure to use heading tags properly.
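If you want a quick sanity check of a page’s heading hierarchy, something like the following Python sketch can flag skipped levels. It assumes you have already extracted the heading levels from the page in document order; a real audit would parse the HTML itself:

```python
def heading_order_issues(levels):
    """Flag places where headings skip levels (e.g., an H2 followed directly by an H4).

    `levels` is the sequence of heading levels (1 for H1, 2 for H2, ...) as they
    appear on the page. Moving back up the hierarchy is fine; only downward
    jumps of more than one level are flagged.
    """
    issues = []
    for prev, cur in zip(levels, levels[1:]):
        if cur > prev + 1:  # jumped more than one level deeper
            issues.append((prev, cur))
    return issues
```

For example, an H1 > H2 > H3 > H2 flow passes cleanly, while an H1 followed directly by an H3 gets flagged as a skipped level.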

54. Keyword Density

Keyword density is the number of times a keyword appears on a page divided by the total number of words on the page. (It is sometimes conflated with TF-IDF, or Term Frequency-Inverse Document Frequency, but TF-IDF is a different measure that also weighs how rare a term is across a whole collection of documents.)

For example, if a keyword appears five times on a page with 100 words, the keyword density would be 5%. In the past, SEOs would try to game the system by stuffing their pages with keywords in an attempt to increase their keyword density. This, of course, would result in a poor user experience.

Today, Google’s algorithm is much more sophisticated and can detect when a page is stuffed with keywords. In fact, keyword stuffing can result in a penalty from Google. The bottom line is that you should focus on creating high-quality content rather than stuffing your pages with keywords.
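The calculation itself is straightforward. A minimal Python sketch (single-word keywords only, for simplicity):

```python
import re

def keyword_density(text, keyword):
    """Occurrences of `keyword` divided by total word count, as a percentage."""
    words = re.findall(r"[\w'-]+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)

# 5 occurrences in a 100-word text -> 5.0, matching the example above
sample = " ".join(["snorkel"] * 5 + ["filler"] * 95)
density = keyword_density(sample, "snorkel")
```

Treat the number as a diagnostic, not a target: as the section says, chasing a particular density is exactly the behavior Google’s algorithm now penalizes.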

55. Keyword Placement

Keyword placement is the location of a keyword on a page. Placing a keyword in the first 100 words of a page can have a small, but noticeable impact on rankings.

With that said, it is important to note that you should not sacrifice the quality of your content in an attempt to place a keyword in the first 100 words. If a keyword doesn’t naturally fit in the first 100 words, it is probably best to leave it out.

56. Using “Near Me” Keywords

When it comes to local searches, some SEOs believe that using the term “near me” will help them get found easier. For example, instead of using a natural term like “blue snorkel” while describing a product, they would use “buy blue snorkel near me.”

However, there is no evidence to suggest that this actually helps. In fact, it is more likely to hurt your rankings as it looks unnatural and spammy. A better approach would be to focus on optimizing your website and off-page resources for local SEO.

57. Table of Contents

Using a table of contents can help both users and search engines understand the structure of your content. It can also improve the user experience, as users can quickly navigate to the section they are interested in. After all, imagine if I had written this ridiculously long article and not used a table of contents at the top. Yikes!

As a result, it is a good idea to use a table of contents if your page is long and has a lot of sections. There is no definitive proof of the weight a table of contents may carry in rankings. However, given the heavy focus on user experience and useful content, it is likely that it can have a positive impact, and most definitely will not hurt you in any way.

58. Content Hidden Behind Tabs

This is a tough one. The topic is a great example of how Google says one thing, but SEOs find evidence to the contrary. In 2014, Google’s John Mueller stated that hiding content behind tabs can have a negative impact on your site. Then, in 2016, Google’s Gary Illyes came out and stated they would grant full weight to content hidden behind tabs.

no, in the mobile-first world content hidden for ux should have full weight

— Gary 鯨理/경리 Illyes (@methode) November 5, 2016

 

Since then, however, many SEOs have conducted experiments and found that Google may not actually be giving full weight to content hidden behind tabs. This caused a bit of an uproar in the SEO community. So, in 2018, Gary Illyes once again stated:

AFAIK, nothing’s changed here, Bill: we index the content, its weight is fully considered for ranking, but it might not get bolded in the snippets. It’s another, more technical question how that content is surfaced by the site. Indexing does have limitations.

— Gary 鯨理/경리 Illyes (@methode) September 14, 2018

 

So, it is inconclusive at this time. It could very well be that with everything else being equal, the site with the content visible will rank higher. Perhaps it is best to err on the side of caution with this. If you don’t absolutely need tabs, don’t use them.

59. Bullets and Numbered Lists

While there is no direct impact on rankings and no known preference given, using bullets and numbered lists does help readers scan your content and find the information they are looking for. Because of this, it is a good idea to use them when it makes sense. And, as user attention spans continue to shrink, this will only become more important.

60. Bolded & Italicized Text

There are two ways that text can be bolded: using the <b> tag or the <strong> tag. There’s no difference between the two in how a search engine perceives them.

The same goes for italicized text: you can use the <i> tag or the <em> tag. Again, a search engine treats them the same.

When it comes to rankings, John Mueller has mentioned that the algorithm looks for text with emphasis tags such as these. A small amount of well-placed emphasis can help both the algorithm and users understand the main points of a page, and it may provide a slight boost in rankings.

As with everything in SEO, though, it’s best to not try to overdo it. If every other word is bolded or italicized, it looks spammy and will likely have a negative effect on your rankings. Likewise, emphasizing only keywords will hurt you. A couple of strategically placed words that naturally emphasize a point should be fine.

61. Content Recency

The more recent the content, the better, right? Well… yes and no. It depends on the type of content. Let’s take a job post, for example. If it is a job that has been filled, the post is no longer relevant, so there is no reason to update it. On the other hand, if you have a blog post about a current event, keeping the content up to date is essential because it keeps the post relevant and timely.

But, if the content covers an evergreen topic, recency may not have much influence. For example, a post about historical events will always be relevant, no matter when it was written. In this case, updating the content may not make sense because you don’t want to change the historical facts.

At this time, there is no clear-cut answer as to how often content should be updated. And, when it comes to rankings, the bottom line is content recency is not necessarily a ranking factor. But, it is important for the overall user experience of your site. So, in general, if you can keep your content fresh and relevant, it is a good idea to do so.

62. Length of Content

Industry studies have repeatedly found a correlation between content length and rankings. Generally speaking, longer content tends to rank higher, and the sweet spot seems to be around 2,000 words. But, that is not to say that you should force your content to be a certain length.

The most important thing is that the content is relevant, well-written, and provides value to the reader. If you can naturally write long-form content, great! But, if not, don’t force it. It is better to have shorter, well-written content than longer, poorly written content full of fluff and nonsense.

63. Duplicated Content

Whether it is pages or meta descriptions, duplicated content can definitely hurt your rankings. In fact, it is addressed directly in Google’s Webmaster Guidelines.

When it comes to pages, the general rule of thumb is to have each page be unique. That being said, there are a few exceptions. For example, you may have a “Contact Us” page that is very similar to your “About Us” page. In this case, it is not a big deal if the content is duplicated.

As for meta descriptions, you also want each one to be unique and relevant to the content of the page. That being said, Google will often show a snippet of text pulled from the page instead of the meta description if it better matches the user’s query. Even so, it is still a good idea to write unique meta descriptions, as they can influence click-through rate, which can in turn affect rankings.

64. Syndicated Content

Syndicated content may or may not be treated as duplicate content in terms of penalization. For example, if you post an article on your website, then post it to other websites such as LinkedIn or Medium, there is a high chance that your website’s post will just be lost in the shuffle and Google will choose the article from the other platform.

Syndicating content in this manner won’t penalize your site, but it also won’t directly help it. There may be some indirect benefits due to users clicking through to your site from the other platforms, but it is not likely to have a significant impact.

However, syndicating content that is not yours – entire articles copied from another indexed site, for example – can severely hurt your rankings. Additionally, it can open you up to issues with copyright infringement. So, if you are going to syndicate content, make sure it is original content that you have permission to post.

65. Autogenerated Content

Autogenerated content is when a computer program generates content without the involvement of a human. This content is usually of low quality and is not useful to searchers. As a result, Google typically penalizes sites with autogenerated content.

As John Mueller explains:

“For us, these would, essentially, still fall into the category of automatically generated content which is something we’ve had in the Webmaster Guidelines since almost the beginning… So from our point of view, if we were to run across something like that, if the webspam team were to see it, they would see it as spam.”

66. Gibberish Content

In line with the previous point, Google penalizes sites with gibberish content. This is defined as “Content with no meaningful relationship to the rest of the page or website.”

Gibberish content is usually generated by computer programs and is of low quality. Not only is it unhelpful to searchers, but it can also hurt your chances of ranking high on search engine results pages.

67. Doorway Pages

Doorway pages are pages created to rank for many similar queries while funneling users to the same destination, so that no matter which result a user clicks on, they end up on the same page. These pages are expressly forbidden by Google and can result in a penalty if you are caught using them.

Doorway pages are often created with little to no original content and are instead filled with keywords and links. They provide no value to searchers and only exist to game the system. So, if you want to avoid being penalized, it’s best to stay away from using them.

68. LSI Keywords in Metadata & Content

Latent Semantic Indexing (LSI) keywords are related terms that help to contextualize your content. For example, if your article mentions “apple,” LSI keywords help distinguish whether the article is about the fruit or the tech company.

You may hear a lot of SEOs talk about this, but it’s not a ranking factor. Google’s John Mueller has bluntly stated that there is no such thing as LSI keywords, and that anyone telling you otherwise is mistaken.

69. In-Depth Topics

Generally speaking, covering a topic in-depth is better than skimming the surface. This is especially true in terms of Google’s featured snippet boxes, which often pull from in-depth articles.

In-depth articles tend to rank better because they show that you are an expert on the topic and are able to provide a lot of value to the reader. So, if you want to improve your rankings, focus on writing in-depth articles that cover all aspects of the topic.

However, as we segue into the next two sections, it is important to note that in-depth doesn’t necessarily mean it is helpful or useful. For example, a scientific research paper is in-depth. But, the terms used may be so technical that it is difficult for the average person to understand. In this case, an in-depth article may not actually be helpful or useful and, as a result, may not rank well.

70. Useful vs. Quality Content

While often used interchangeably, there is a difference between quality content and useful content. Quality content is information that is accurate and well-written. Useful content goes a step further by also being easy to understand and actionable. So, when creating content, focus on making it not only accurate and well-written, but also easy to understand and actionable.

71. Reading Level

The reading level of your content is very important. At one time, Google would show webmasters the estimated reading level of content. Many SEO tools still have this feature. The reason why the reading level is important is that it needs to match the level of the searcher.

Using the previous example of a scientific paper, the terms may be too technical for the average person. As a result, it is not likely to rank well because it is not helpful or useful to the average person.

On the other hand, if you write at a fifth-grade level for a sophisticated audience, your content may come across as too simplistic and fail to rank for that reason.

To find the reading level of your content, you can use a plugin that offers this feature, or a tool like the Hemingway app. Ideally, you want to aim for a seventh- to eighth-grade reading level, which is the average reading level in the United States.
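If you’d rather estimate the reading level yourself, the Flesch-Kincaid grade formula is the one most tools use. A rough Python sketch (the syllable counter is a crude heuristic, so treat the output as an approximation):

```python
import re

def estimate_syllables(word):
    """Very rough syllable count: runs of vowels, minus a trailing silent 'e'."""
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def flesch_kincaid_grade(text):
    """Flesch-Kincaid grade = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
    sentences = max(len(re.findall(r"[.!?]+", text)), 1)
    words = re.findall(r"[\w'-]+", text)
    total_words = max(len(words), 1)
    syllables = sum(estimate_syllables(w) for w in words)
    return 0.39 * (total_words / sentences) + 11.8 * (syllables / total_words) - 15.59

simple_grade = flesch_kincaid_grade("The cat sat on the mat. It was a big cat.")
```

Short sentences built from one-syllable words score very low, while jargon-heavy academic prose scores far higher, which is exactly the mismatch the scientific-paper example above describes.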

72. Spelling & Grammar

It should go without saying that spelling and grammar are important. Not only do they make your content easier to read, but they also show that you are credible and trustworthy.

To ensure that your content is free of spelling and grammar errors, you can use a tool like Grammarly or Hemingway. You can also hire a professional editor to help you with this.

73. Citations, References, and Sources

Using citations, references, and sources can help to improve the quality of your content not only in the eyes of readers but also in terms of how Google views the quality of the content. While using these elements is not required, it can be helpful, especially if you are writing about a controversial or sensitive topic where accuracy is important.

However, it is important to note that linking to highly authoritative content as a citation is not where you will find the benefits, as Google has stated that external links are not a ranking factor. But, Google’s own content quality guidelines do make reference to the importance of using citations, references, and sources.

74. Supplementary Content

Supplementary content is often used to improve the user experience. This can be things like images, infographics, maps, diagrams, calculators, etc. Google has said that they use supplemental content as a way to understand the topic of a page, as well as assess the overall quality.

While this type of content is not required, adding supplementary content can help rankings in several ways. First, by boosting the quality of the page. Second, by giving other sites useful content they will want to naturally link to, which helps with link building. And, third, by engaging users and keeping them on the page longer, improving dwell time and reducing bounce rate – two engagement signals many SEOs believe influence rankings.

75. Multimedia

You can think of multimedia (videos, images, audio, etc.) in similar terms as supplementary content. Multimedia can be used to improve the user experience, as well as help Google to better understand the topic of a page.

Like supplementary content, multimedia is not required but can be beneficial in several ways. Adding videos, for example, can help improve dwell time and decrease bounce rate – both of which, as noted above, are widely believed to influence rankings.

76. User-Friendly Layout

If it hasn’t become abundantly clear yet, user experience is a very important ranking factor. Creating a user-friendly layout is one way to improve the user experience of your website. As Google’s quality guidelines state:

“The page layout on highest quality pages makes the Main Content immediately visible.”

A user-friendly layout is easy to navigate, visually appealing, and makes it easy for users to find the information they are looking for. Creating a user-friendly layout can be a challenge, but it is worth it as it can help improve the quality of your website, which will ultimately lead to better rankings.

77. Mobile-Friendly Updates & Usability

In 2015, Google took notice of the trend of users accessing the internet on mobile devices and rolled out the Mobile-Friendly Update. This update (often referred to as “Mobilegeddon”) gave a significant ranking boost to pages that were deemed to be mobile-friendly. On the flip side, those with poor mobile experiences saw their rankings drop.

Then, in 2021, Google announced it would be pushing forward and upping the ante for mobile-friendliness. However, after learning from the lessons of the past, Google was far more lenient this time around, giving website owners an extension to come into compliance, the date of which has yet to be announced as of the writing of this post. One thing is clear, however, if your site is not optimized for mobile, it will suffer in the rankings.

To be considered mobile-friendly, a page must meet certain criteria, such as avoiding software that is not common on mobile devices, using text that is readable without zooming, and sizing content to the screen so users don’t have to scroll horizontally. To determine if your site is mobile-friendly, you can use Google’s Mobile-Friendly Test tool.

78. Content “Hidden” on Mobile

According to John Mueller, content that is visible on desktops, but hidden from users on mobile devices may or may not affect rankings. In the video, John states that if it is “critical content” it should be visible to all users. So, if you’re thinking of hiding content on mobile devices in an attempt to improve your rankings, you may want to think again. It may not get indexed, or if it does, it may not carry the same weight as content that is visible to all users.

79. Page Loading Speed

The time it takes your page to load is a significant ranking factor. Slow pages frustrate users and can lead to higher bounce rates. In fact, Google has stated that they use site speed as a ranking factor, both on desktop and mobile devices.

To see how your site’s speed measures up, you can use Google’s PageSpeed Insights tool. This tool will provide you with a report that not only scores your page’s speed on a scale of 0-100 but also provides specific recommendations on how to improve it.

80. AMP

AMP (Accelerated Mobile Pages) is an open-source project led by Google that is designed to help improve the performance of web pages on mobile devices. AMP pages are essentially stripped-down versions of regular web pages that are designed to load faster on mobile devices.

At one time, the use of AMP was a ranking factor for mobile searches. In 2018, Google announced AMP pages would be given higher preference in indexing, as well (not necessarily ranking). However, in 2021, Google announced that AMP would no longer be required for the Top Stories carousel and would no longer receive preferential treatment. Some sites, like Twitter, have also begun dropping support for AMP.

81. Entity Match

In a 2016 patent, Google defines an entity as:

“[A]n entity is a thing or concept that is singular, unique, well-defined, and distinguishable. For example, an entity may be a person, place, item, idea, abstract concept, concrete element, other suitable thing, or any combination thereof.”

This part of the algorithm deals with connections between entities. So, if you’re trying to rank for a certain keyword, this algorithm looks at whether or not the keyword is associated with other entity match keywords. If it is, you may see a ranking boost.

82. Image Optimization

Image optimization can mean two separate things:

  • The image loads quickly because it is an appropriate size and format.
  • The image file is named descriptively and has relevant alt text.

Images can help break up content and make it more visually appealing and easy to digest. However, if they’re not optimized, they can also slow down your page’s loading speed, which can hurt your ranking.

File names and alt text have less impact on rankings unless you want the image to appear in the image search results. However, keep in mind that alt text also improves accessibility for visually impaired users who rely on screen readers.

Ideally, it’s better to optimize your images for both ranking and accessibility purposes.

83. Content Updates

When updating content to maintain relevancy, you may find yourself removing or changing large sections of content. You may also add new sections to the pages. This can cause a fluctuation in rankings as Google re-indexes the page.

If you see a drop in ranking after updating your content, don’t panic. Unless there are other alarmingly negative changes, this is to be expected and should level out relatively quickly. In some cases, it may actually provide a boost in rankings as the new content is seen as more relevant.

84. Frequency of Page Updates

In addition to how drastic the update was, how often you update your page can also affect your ranking. Depending on the topic, if a page is updated too frequently, it can be seen as spammy. On the other hand, if a page is never updated, it may be seen as outdated.

The key here is to find a balance. If your page is about something that frequently changes (like news), then you’ll want to update it more often. If your page is about something that doesn’t change much (like a history page), then you won’t need to update it as often.

85. Number of Internal Links Pointing to Page

The number of internal links pointing to a page is a signal to Google about the page’s importance. The more links pointing to a page, the more important Google sees it as. This is an easy one to control. If you want to increase the number of internal links pointing to a page, simply add more links to it from other pages on your site.
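Counting internal links is easy to automate. A rough sketch using Python’s standard-library HTML parser (the page snippet and domain below are hypothetical):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCounter(HTMLParser):
    """Tally internal vs. external links in an HTML document (a rough sketch)."""

    def __init__(self, site_domain):
        super().__init__()
        self.site_domain = site_domain
        self.internal = 0
        self.external = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        domain = urlparse(href).netloc
        # Relative URLs and links to our own domain count as internal
        if not domain or domain == self.site_domain:
            self.internal += 1
        else:
            self.external += 1

page = """
<a href="/pricing">Pricing</a>
<a href="https://example.com/blog">Blog</a>
<a href="https://other.com">Elsewhere</a>
"""
counter = LinkCounter("example.com")
counter.feed(page)
```

Run across a whole crawl of your site, tallies like these reveal which pages your internal linking is (or isn’t) signaling as important.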

86. Quality of Internal Links Pointing to Page

In addition to the number of links, the quality of those links is also a factor. A link from a high-quality page on your site carries more weight than a link from a low-quality page.

There are a few things you can do to improve the quality of your internal links:

  • Use descriptive anchor text that accurately reflects the page you’re linking to.
  • Link to pages that are high-quality and relevant to the content on the page you’re linking from.
  • Don’t link to pages that are low-quality or have little relevance to the content on the page you’re linking from.

87. Rel=Canonical

The rel=canonical tag is an HTML element that tells Google which URL is the preferred version of a page.

For example, if you have the same content available on multiple URLs, you can use the rel=canonical tag to specify which URL you want Google to index. This helps prevent duplicate content issues and ensures that Google is only indexing your most relevant content.
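As a minimal example, the tag goes in the `<head>` of the duplicate page and points at the version you want indexed (the URL here is hypothetical):

```html
<!-- Placed in the <head> of the duplicate page -->
<link rel="canonical" href="https://www.example.com/preferred-page/">
```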

If you’re not sure if you have duplicate content on your site, you can use Google Search Console to find out. Simply go to the “Coverage” report, click on the “Excluded” tab, and look for any pages that are marked as “Duplicate, Google chose different canonical than user.”

88. Number of External Links

Using external links can be a great way to add more relevant and high-quality content to your pages. However, too many external links can be seen as spammy or provide a poor user experience and hurt your ranking. In the Site Quality Guidelines, there is a specific area that addresses this:

“Some pages have way, way too many links, obscuring the page and distracting from the Main Content.”

The key here is to find a balance. Add external links when they’re relevant and add value to the page, but don’t go overboard.

Another thing to keep in mind is whether you should use the nofollow tag. This is an HTML element that tells Google not to follow a link. By default, all external links are followed. However, you can change this by adding the “rel=nofollow” attribute to the link.
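For reference, here is what the difference looks like in markup (URLs are hypothetical):

```html
<!-- A followed external link (the default behavior) -->
<a href="https://example.com/resource/">Useful resource</a>

<!-- The same link with the nofollow attribute added -->
<a href="https://example.com/resource/" rel="nofollow">Useful resource</a>
```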

89. External Link Quality

Even if you’re using the nofollow tag, the quality of the external links you’re using can still affect your ranking. Google looks at the quality of the site you’re linking to as well as the context of the link.

For example, if you have a page about cars and you link to a high-quality site about car parts, that’s going to be more beneficial than if you link to a low-quality site about car insurance.

90. External Link Theme

Keeping in line with the previous example, the theme of the site you’re linking to can also affect your ranking. If you have a page about cars and you link to a site about car parts, that’s going to be more beneficial than if you link to a site about cat food.

This is because Google looks at the theme of the site you’re linking to when determining the quality of the link. So, if the site you’re linking to is relevant to the topic of your page, that’s going to be seen as a positive signal.

91. Broken Links & 404s

Broken links and 404 pages are a big no-no. Not only do they frustrate users, but they also signal to Google that your site is low-quality and not well-maintained.

The best way to find broken links on your site is to use one of the many SEO tools available. It will crawl your site and provide you with a list of all the broken links it finds.

Once you have a list of broken links, you need to fix them. You can do this by either redirecting the old URL to a new one or by removing the link entirely. But, a word of caution here, as John Mueller explains:

301-redirecting for 404s makes sense if you have 1:1 replacement URLs, otherwise we’ll probably see it as soft-404s and treat like a 404.

🐝 johnmu.xml (personal) 🐝 (@JohnMu) June 25, 2017

92. Affiliate Links

Affiliate links can make or break a site. First and foremost, as mentioned above, affiliate links must be marked with the appropriate “sponsored” tag. If they’re not, Google will see them as paid links that are trying to game the system and they can result in a penalty.

The “sponsored” tag tells Google that the link is an affiliate link and it shouldn’t be given the same weight as a regular link. As long as you’re using the “sponsored” tag, there’s no reason not to use affiliate links. They can be a great way to monetize your site while still providing high-quality content.
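As a quick sketch, marking up an affiliate link looks like this (the URL and tracking parameter are hypothetical):

```html
<!-- Affiliate link marked with rel="sponsored" -->
<a href="https://example.com/product/?ref=affiliate123" rel="sponsored">Check price</a>
```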

In addition, affiliate links should be used sparingly and only when they’re relevant. If you have too many affiliate links on a page, it will look spammy and hurt your ranking.

93. Schema Markup

Schema markup is a code that you can add to your site to help search engines understand your content. It’s a fairly technical thing, but if you’re running a business, it’s definitely something you should be using.

Adding schema markup to your site is pretty simple. You just need to add a few lines of code to each page. To browse the full list of available schema types and learn more, visit Schema.org.

Once you’ve added the code, you can use Google’s Schema Markup Validator tool to make sure it’s working properly.
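To give a feel for what those few lines look like, here is a minimal JSON-LD sketch for a local business (every value is a hypothetical placeholder):

```html
<!-- Minimal JSON-LD structured data for a local business -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Coffee Shop",
  "url": "https://www.example.com",
  "telephone": "+1-555-555-5555",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  }
}
</script>
```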

Schema markup can definitely improve your ranking, and it will make your site more likely to show up in Rich Snippets. Rich snippets are those enhanced listings that you sometimes see at the top of the SERPs. They usually include things like reviews, images, and other information that can help your listing stand out. To see if your site is eligible for Rich Snippets, you can use the Rich Results Test.

94. Domain Authority

This is an issue that is talked about a lot. The term “Domain Authority” was actually created by Moz as an internal metric. It’s not an official metric used by Google and does not affect your chances of ranking highly.

That being said, Domain Authority is a good way to measure the quality of a site. The higher the Domain Authority, the more likely it is that the site is high-quality.

95. Legal Pages

Legal pages such as Terms of Service, Privacy Policy, Disclaimers, etc. are important for any website. They’re especially important if you’re running a business or collecting personal information from users.

These pages can help you avoid legal trouble and they can also build trust with your users. For example, when users see that you have a Privacy Policy, they’ll be more likely to trust you with their personal information.

96. PageRank

The formula that goes into PageRank is a closely-guarded secret. As such, be cautious of products out there claiming they can give you an accurate PageRank score. There is no way for them to know your real PageRank score.

What we do know, for sure, is that it’s one of the many factors Google uses to determine the quality of a site. PageRank is a link analysis algorithm developed by Google’s founders Sergey Brin and Larry Page. It looks at the number and quality of links pointing to a page to determine its importance. The more important a page is, the higher its PageRank will be.
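For context, the formula Brin and Page published back in 1998 is public, even though today’s algorithm has evolved far beyond it:

```latex
PR(A) = (1 - d) + d \left( \frac{PR(T_1)}{C(T_1)} + \cdots + \frac{PR(T_n)}{C(T_n)} \right)
```

Here, d is a damping factor (commonly set around 0.85), T₁…Tₙ are the pages linking to page A, and C(T) is the number of outbound links on page T. In plain terms: a page inherits importance from its linkers, diluted by how many other pages those linkers also vote for.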

97. Human-Reviewed Quality Ratings

Google has filed a patent that allows human editors to influence SERPs manually. While some SEOs are unsure whether this patent has ever been put into use, it seems pretty obvious that it has, given that there is an entire Search Quality Rater Guidelines manual.

However, we don’t know to what extent the human reviewer’s scores are used in the algorithm. What we do know is that they’re looking for certain things like whether the site has a high-quality design or whether the content is well-written and informative.

98. Sitemap Page Priority

Some SEOs believe that the priority number of a page in the sitemap file affects that page’s ranking. This is yet another topic that is up for debate. Google has explicitly stated that not all sites need a sitemap; small sites in particular can often do without one.

It appears that SEOs are interpreting the importance of sitemap page priority based on this statement:

“A Sitemap does not affect the actual ranking of your pages. However, if it helps get more of your site crawled (by notifying us of URLs we didn’t previously know about, and/or by helping us prioritize the URLs on your site), that can lead to increased presence and visibility of your site in our index.”
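For reference, this is what the optional priority hint looks like inside a sitemap file (URL and values are hypothetical; per the sitemaps protocol, priority is a relative hint, not a directive):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/important-page/</loc>
    <lastmod>2022-06-01</lastmod>
    <priority>0.8</priority>
  </url>
</urlset>
```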

So, while it doesn’t seem to directly influence your rankings, it appears that using a sitemap can definitely affect how many of your pages get indexed. That would mean there is an indirect benefit since the more pages in the index, the better your chances of getting traffic.

99. Age of Page

As pages age, they accumulate metrics like backlinks, social signals, and even PageRank. It’s no surprise, then, that older pages tend to rank better than newer pages.

However, there is a limit to how much of an advantage age can give you. Once a page gets too old, it may begin to lose relevancy and authority. Google knows this and adjusts its algorithms accordingly. Still, if you have an older page that’s relevant and well-optimized, it’s more likely to rank than a newer page, all other things being equal.

100. Excessive 301 Redirects

301 redirects are intended to send users to fresh content without losing most of the PageRank of the old page. In the past, as Matt Cutts explained, 301 redirects did not pass 100% of the PageRank value; a small fraction was lost in the transfer. However, in 2016, Google confirmed that redirects no longer lose PageRank, with at least one caveat: if the new page is completely different from the old page, it will not pass PageRank and will be treated as a soft 404.

But, if you’re creating redirect chains (redirecting one page to another, which is then redirected to another), that can lead to a compounded loss of PageRank and slow down your site, which makes your problems even worse.

In a Google Search Central video on 301 redirects, Matt Cutts explains that there is no cap on the number of 301 redirects at the site level, such as when migrating a site to a new domain. But there is a limit on redirect chains for the same page.

What this means is that if you are migrating Site A’s Page 1 to Site B’s Page 1, Site A’s Page 2 to Site B’s Page 2, and so on, there is no limit on the number of redirects, and the loss of PageRank will likely be minimal. But if you have a chain that goes from Page A version 1 to version 2 to version 3, and so on, Google (and browsers) will eventually stop following the chain and you’ll end up with a headache. Ideally, keep redirect chains to one or two hops, with an absolute maximum of three.
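As a sketch, assuming an Apache server with .htaccess support, collapsing a chain means pointing every old URL straight at the final destination (all paths here are hypothetical):

```apache
# Bad: a chain (v1 -> v2 -> v3) forces multiple redirect hops
Redirect 301 /page-v1 /page-v2
Redirect 301 /page-v2 /page-v3

# Better: point every old version directly at the final URL
Redirect 301 /page-v1 /page-v3
Redirect 301 /page-v2 /page-v3
```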

101. Link Location In Content

In several experiments, evidence suggests that links placed at the beginning of content are given slightly more weight than links placed at the end of the content.

Google’s patent for “Document ranking based on semantic distance between terms in a document” suggests that the location of links can influence their value. While Google hasn’t come out and said this explicitly, it’s reasonable to assume that links placed higher up on a page or within the body of content are given more weight than links placed lower down or on the side of the page. After all, users are often paying the most attention to the core of the content, and not so much to the sides or bottom of the page.

Anecdotally, I’ve seen this to be true as well. When I’m looking for a specific piece of information on a website, if there’s a relevant link in the first few sentences, I’m more likely to click it than if I have to scroll down to find it. This tells me that Google likely sees the same thing.

102. Brand Name Anchor Text

If you’re using your brand name for anchor text, it sends a very strong, positive signal to Google. It tells them that the linked-to page is very likely about your brand, since you’re linking to it using your brand name.

However, if you have too many instances of brand name anchor text, it can look spammy and unnatural. That’s why it’s important to vary your anchors when possible and to use other relevant keywords and phrases.

Here’s an example of what I mean. Say you have a website for your company, ExampleCo. You might be tempted to use “ExampleCo” as your anchor text every time you link to your website. But, if you do that too often, it’s going to look really spammy to Google.

User Interaction

103. UX Signals for Keywords

There is one key phrase from Google’s “How Search Works” page that really stands out here, and SEOs have seen it first-hand.

“We look for sites that many users seem to value for similar queries.”

This means that Google is looking at how users interact with sites after they click on them in the SERPs. If people quickly return to the search results and click on a different result, that’s a sign that they didn’t find what they were looking for.

On the other hand, if people stay on a site for a long time and visit several pages, that’s a good sign that they found what they were looking for. Google uses this data to help them determine the quality of a site for various related keywords. If people are consistently finding what they’re looking for on a site, that site will likely rank higher for related keywords, even if those keywords are not actually found on the site.

104. Direct Traffic

Direct traffic is when a user doesn’t discover your site in SERPs, but rather types your URL directly into their browser. This is an important metric to keep track of because it can give you insights into how well your brand is doing in other areas of advertising or offline. If you see a sudden spike in direct traffic, it might be a sign that your offline marketing efforts are paying off.

Additionally, it is generally believed that direct traffic is a significant ranking factor. Logically, it makes sense: if people are reaching your site without going through SERPs, it suggests your brand is strong enough that users seek it out directly. Google has never confirmed this, but there is enough evidence from case studies and experiments to make a strong argument.

105. Repeat Traffic

If people are coming back to your site again and again, especially through direct traffic, that’s a really good sign that you’re providing value. It might also be a sign that you have a loyal following or base of customers. Repeat traffic can also help your rankings in SERPs. Google sees this behavior and interprets it as a signal of trustworthiness and quality.

106. Bounce Rate

The Bounce Rate is the percentage of people who visit a page and then leave without viewing any other pages on the site. A high Bounce Rate is generally a sign that people are not finding what they’re looking for on your site.

This could be because your content is not relevant to the keyword they used to find your site, your page copy doesn’t entice them to visit other areas of the site, or it could be because your site is not user-friendly. Regardless of the reason, a high Bounce Rate could hurt your rankings.

On the flip side, though, a high Bounce Rate might not mean users aren’t getting the answers they need. In fact, if a user stays on the page for a while (Dwell Time), even if they don’t click to another page, that could be a sign that they found what they were looking for. Google takes this into account when determining rankings.

107. Dwell Time

Dwell Time is the amount of time a user spends on your site before returning to SERPs. A long Dwell Time usually indicates that the user found what they were looking for, while a short Dwell Time might indicate that the user didn’t find what they were looking for or that your content wasn’t relevant to their needs.

Dwell Time is an important ranking factor because it’s a good indicator of the quality of your content. If people are spending a long time on your site, that means they’re engaged with your content and are getting something out of it.

Conversely, if people are bouncing off your site quickly, that’s a sign that something is wrong. Maybe your content isn’t relevant to the keyword they used to find your site or maybe your site isn’t user-friendly. Either way, low Dwell Time could hurt your rankings.

To improve your Dwell Time, you can use tools that measure and record how a visitor interacts with your site. This information can help you identify areas where your content is falling short so you can make changes accordingly.

108. Pogo-Sticking

Pogo sticking is when a user clicks on a result in SERPs, doesn’t find what they’re looking for, and then clicks back to SERPs and clicks on a different result. This is generally seen as a bad sign by Google because it indicates that the user didn’t find what they were looking for on your site.

There’s really no way to know for certain whether your site suffers from a lot of pogo-sticking, because you can’t see how users behave on competing sites. However, if you see a sudden drop in traffic or rankings, it might be an indication that users are pogo-sticking away from your site.

109. Organic Click Through Rate for One Keyword

Click-Through Rate (CTR) can help you understand how well your site is optimized for a particular keyword. If you’re ranking in the top 3 positions for a keyword, but you’re getting less than 3% CTR, that means your listing isn’t as appealing as it could be. You can use this information to improve your title and description tags. A higher CTR will likely lead to a higher ranking, all other things being equal.
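As a quick worked example of the arithmetic behind that 3% figure:

```latex
\text{CTR} = \frac{\text{clicks}}{\text{impressions}} \times 100\%
\qquad \text{e.g. } \frac{150}{5000} \times 100\% = 3\%
```

So a listing shown 5,000 times that earns only 150 clicks sits right at the 3% threshold; anything below that, in a top-3 position, suggests the title and description could work harder.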

110. Organic CTR for All Keywords

Going a step further, you can also look at your overall organic CTR. This will give you an idea of how well your site is optimized as a whole. If you’re getting a low CTR, that means there’s room for improvement. Low CTRs on all keywords you’re trying to rank for could send a signal to Google that your site is not as relevant as it could be.

111. Bookmarked Sites in Chrome

Google does track bookmarks in Chrome. Chrome’s Privacy Notice states it will keep track of bookmarks for use in syncing information between devices. Whether or not Google uses this information in determining rankings is up for debate. After all, it would take a pretty large, coordinated effort to run experiments on this.

However, it’s not impossible to imagine that Google could possibly correlate this information with direct traffic to a site. If they saw that people were bookmarking a particular site more than others, it’s possible they could give that site a small ranking boost.

112. Comments

If your site allows comments, you definitely want to moderate them to ensure they’re high-quality. Google has stated in the past that they are not a fan of low-quality, spammy comments, and it’s possible that these could hurt your rankings. On the other hand, high-quality comments could potentially be a sign of engagement, which is a positive signal to Google.

113. RankBrain

Like PageRank, RankBrain is a proprietary algorithm developed by Google. However, unlike PageRank, RankBrain is a machine-learning algorithm that gets smarter over time.

Because RankBrain is constantly learning, it’s difficult to say definitively how it affects rankings. However, we do know that it plays a major role in ranking. RankBrain doesn’t replace BERT or MUM; it is used in conjunction with these systems to provide the best possible results.

114. Cloaking & Sneaky Redirects

Cloaking is when you serve one version of a page to search engines and another version to users. This is done in an attempt to manipulate search rankings by showing search engines different content than what users see.

Sneaky redirects are similar in that they also involve redirecting users to a different page than what was initially intended. A couple of examples of sneaky redirects, according to Google:

  • Search engines show one type of content while users are redirected to something significantly different.
  • Desktop users receive a normal page, while mobile users are redirected to a completely different spam domain.

Both of these practices are considered black hat SEO and are against Google’s Webmaster Guidelines. If you’re caught doing either, you could be penalized.

115. Links to Bad Sites

The last thing you want to do is to link to a bad site. Not only does this harm your own reputation, but it can also get you penalized by Google. Bad sites are those that are filled with spam, engage in black hat SEO practices, or are simply low-quality. By linking to such sites, you run the risk of being associated with them in the eyes of Google. Therefore, it’s important to be careful about who you link to. Make sure that you only link to high-quality sites that will not damage your own reputation.

116. PageRank Sculpting

Continuing the previous thoughts, and given what we have learned about outbound links and passing PageRank, you may find yourself tempted to use all nofollow or mix dofollow and nofollow external links on your pages. This practice is known as PageRank Sculpting. The purpose of it is to have more control over where you pass link juice.

For example, say you have a page with two outbound links and you make one nofollow and the other dofollow. Or you decide you don’t want to pass any PageRank to external links, so you mark them all nofollow. Or perhaps you’re linking to an iffy site and don’t want to take the time to figure out whether it’s actually a bad site, so you use a nofollow link.

While you’re not expressly forbidden to use PageRank Sculpting techniques, doing so may raise red flags with Google, especially if you’re using nofollow links to bad sites. Additionally, it’s generally not considered a good practice because it goes against the natural flow of PageRank.

117. Above the Fold Ads

When it comes to website layout, “above the fold” means the portion of the page that is visible without scrolling. Research has shown that people are more likely to engage with content that is above the fold.

While one ad at the top isn’t necessarily a bad thing, the majority of the above the fold space should be content. If you have too many ads, especially if they are intrusive, you will likely be penalized by Google. Therefore, it’s generally considered good practice to keep your most important content above the fold and place ads below the fold.

118. Popups, Interstitials, and Distracting Ads

In Google’s Search Quality Rating Guidelines, it states:

“6.4 Distracting Ads/SC

We expect Ads and SC [Supplementary Content] to be visible. However, some Ads, SC, or interstitial pages (i.e., pages displayed before or after the content you are expecting) make it difficult to use the MC [Main Content]. Pages with Ads, SC, or other features that distract from or interrupt the use of the MC should be given a Low rating.”

This is because such ads are often unexpected and interfere with the content on the page. While there are some exceptions, such as age verification popups, most popups and intrusive ads will result in a penalty from Google. Therefore, it’s best to avoid using such techniques on your website.

Backlink Factors

Something to bear in mind as we enter this often confusing and mysterious area of SEO: when it comes to backlinks, Google states:

“Google interprets a link from page A to page B as a vote by page A for page B. Votes cast by pages that are themselves ‘important’ weigh more heavily and help to make other pages ‘important.’”

And, in my very controversial opinion, site owners often place WAY too much importance on their backlink strategies in comparison to the other on-page and off-page SEO factors. There is often a very misguided and skewed perspective of what’s really important when it comes to a site’s link profile.

While they understand that link quality, not quantity, is what’s important, they focus too heavily on devising some big strategy that ultimately doesn’t work as well as they had hoped. Or they hire some backlink-building “guru” who delivers a bunch of crappy backlinks. That time, money, and effort are much better spent on improving your site.

With this in mind, I do not offer backlink building services, per se. Instead, I focus on creating unique, high-quality content such as supplemental content and tools that will organically acquire backlinks. I also focus on thought leadership articles and media outreach (not sponsored posts) that are more likely to build a client’s reputation online, which will, in turn, attract backlinks.

I hope you adopt a similar mindset because it is also Google’s mindset on this topic. To quote Google once more:

“When link building comes first, the quality of the articles can suffer and create a bad experience for users. Also, webmasters generally prefer not to receive aggressive or repeated “Post my article!” requests, and we encourage such cases to be reported to our spam report form.

And lastly, if a link is a form of endorsement, and you’re the one creating most of the endorsements for your own site, is this putting forth the best impression of your site? Our best advice in relation to link building is to focus on improving your site’s content and everything—including links —will follow (no pun intended).”

With that said, let’s forge ahead…

119. “Toxic” Backlinks

Let’s first address the elephant in the room: “Toxic” backlinks. There is a lot of confusion out there surrounding what a toxic backlink is. Basically, a toxic backlink is a backlink from a low-quality or spammy site that can potentially hurt your rankings.

When people engage in link schemes (which are against Google’s policies) in an attempt to artificially boost rankings, they will use spammy, low-quality sites to link to their main site. These links are easy to spot and are typically not from reputable sites.

Google’s algorithm is constantly getting better at detecting these kinds of links, and if they think you’re trying to game the system, they will penalize your site. And, that is where the fear comes in. Many SEOs are concerned that if there are a lot of these toxic backlinks, Google will just assume you’re participating in link schemes and penalize your site.

However, John Mueller has pushed back on this fear in a Reddit discussion.

So, this leads us to believe that there are measures in place to better recognize the difference between link schemes and random sites linking to yours. For example, if you’re getting 40 visitors a month, but gaining 2,000 backlinks a month (see Backlink Velocity below), that could be a sign you’re engaging in a link scheme. But, if you have low-quality links pointing to your site, it’s nothing to worry about. Google knows that not all sites are perfect and that some links are just going to be low quality.

120. Nofollow Backlinks

And, one more elephant to address: far too often, site owners approach SEOs and say they want to focus on building backlinks. They give an arbitrary set of guidelines for what they’re looking for (e.g., “I don’t want nofollow backlinks” or “Dofollow backlinks only, from sites with DA above 50”), but they don’t realize that nofollow links can have value of their own.

Google’s official stance is:

“Links marked with these rel attributes will generally not be followed.”

The key phrase there is “generally.” Typically, linking sites that use the nofollow tag are sites that don’t want to pass along their PageRank. However, that doesn’t mean they’re not valuable. For example, a nofollow link can still send traffic your way. Or, having nofollow backlinks from major outlets such as media, could indicate to Google that your site is a major source of information.

I often ask my clients “What do you think Google would pay more attention to? 15 nofollow (not sponsored) backlinks from major sources or 100 from small, random sites?” The truth is, we don’t really know for sure, but I would argue that the former is by far more valuable. At the end of the day, there’s no need to obsess over getting dofollow links. Just focus on building quality content and the rest will fall into place.

121. Linking Site’s Domain Age

Just like the age of your domain, the age of the domain backlinking to your site does not individually carry any weight. In other words, a backlink from a site that’s been around for 20 years is not going to necessarily give you any more link juice than one from a site that’s only been around for 6 months.

However, older sites have other established historical metrics (authority, trust, etc.) that could give them a slight edge. So while the age of a domain by itself is not going to have an impact on your rankings, it might be one small factor when taken in concert with all of the other Site Metrics.

122. Linking Site’s Domain Authority

As we previously discussed, the Domain Authority score was created by Moz, not by Google. However, it’s still a valuable metric to consider. Moz’s Domain Authority scoring system goes from 0 to 100. The higher the number, the more authoritative the domain is considered to be.

While not a highly reliable metric, it can give you a general idea of whether a backlink comes from a valuable site. So don’t get too hung up on Domain Authority. I have personally seen plenty of what I consider very low-quality sites with high DA scores that I would never approach for a backlink.

123. Homepage Authority

If the referring site has more backlinks to its own homepage, it is possible that links to your site could carry more weight than if it had very few homepage links.

It’s important to keep in mind that, while the homepage authority is one metric you can look at, it’s not the only one. The authority of the specific page linking to you is also important.

124. Page Authority

Continuing on this speculative trend, it’s also possible that the individual page linking to your site is more important than the homepage authority.

To further this idea, let’s say that Site A has a DA of 30 and PA of 40, while Site B has a DA of 60 and a PA of 70. Which site would you want to get a link from? I would pick Site B without hesitation, as it’s more authoritative overall. However, if both sites have an equal DA of 70, but Site A’s linking page has a PA of 80, and Site B’s linking page has a PA of 50, I would pick Site A in a heartbeat.

This is all speculative, but it’s important to keep in mind that the page linking to you is just as – if not more important – than the site’s authority as a whole.

125. Referring Domain’s TLD Country Code

Similar to the benefits of having a TLD with a specific country code, it’s also possible that having a referring domain with a country code could give you a boost in SERPs for that particular country.

For example, if you have a site targeted at the United States market, but your site’s TLD is .au, and you get a backlink from a .us TLD, that link may hold more weight than if it came from a .com or .org. This is, again, speculative. But it’s something to keep in mind when trying to build backlinks from other countries.

On the flip side, with certain countries being known for hacking and spamming, you may want to be careful about getting links from those TLDs. For example, I would be very wary of a link from a .ru or .cn domain. In general, if you see backlinks from a country code that is not in the same country as yours, use Google Translate and check to see what the site is about. Keep those that appear relevant and disavow those that are not.

126. Number of Linking Root Domains

There is a very strong correlation between the number of unique referring domains and higher rankings in SERPs. A referring domain is the root domain of the site linking to you. So, if a site links to you multiple times on different pages, it will only be counted as one referring domain. For example, if example.com links to your site twice on two separate pages, and example2.com links to your site once, that would be a total of two referring domains.

Why is this important? Because it’s not enough to just get backlinks from a bunch of different pages. You want those backlinks to come from a variety of unique domains. The more unique domains linking to you, the better.

However, it is this correlation that led to the rise of blogging networks used specifically for selling backlinks. These networks would post links to your site on a large number of different domains, which would inflate the number of referring domains, and therefore give the illusion of increased authority.

For this reason, Google began devaluing links from these types of networks, and they are no longer as effective as they used to be. In fact, if you are caught using one of these networks, your site could be penalized. So, it’s best to avoid them altogether.

127. Number of Linking Pages

Similarly, the number of linking pages from a single domain may also be a factor in SERPs. The logic here is that, if a site has more pages linking to your site, it’s likely that those links are more valuable than if the site only had one page linking to you.

Again, this metric can be easily manipulated by blog networks and other link schemes. So, while it may be a factor in SERPs, it’s not necessarily one that you should focus on.

128. Relevancy of the Linking Domain

If a site linking to you is within the same niche, there is evidence to suggest such a backlink carries more weight than one from a site unrelated to yours.

For example, if you have a site about SEO and you get a backlink from another SEO site, that link is likely to be more valuable than if you got a link from a site about cooking. This is because Google sees the link as being relevant to your niche, and therefore more valuable.

Of course, relevancy is not always easy to determine. If you are in a niche with a lot of sub-niches, it may be difficult to tell if a site is truly relevant or not. In these cases, it’s best to err on the side of caution and assume that the link is not as valuable as it could be.

129. Relevancy of Linking Page

While similar to the relevancy of the linking domain, the relevancy of the linking page may hold a different weight. For example, some sites cover a wide variety of topics in a certain niche, such as a site for hobbyists. In these cases, the site as a whole may not be 100% relevant to your niche, but the page linking to you might be. Again, this is another metric that can be difficult to judge.

130. Number of Links from Different IPs

With so many spambots out there cycling through multiple IP addresses, IP data alone can make it difficult to judge the value of a link.

That being said, there is some evidence to suggest that links from multiple IP addresses may be more valuable than links from a single IP address. This is because oftentimes, spammy blog networks will have the same Class C IP address.

Of course, this is not a foolproof metric, as spammers can get around this by using proxy servers. But it’s something to keep in mind when assessing the value of a link.

131. Number of Links Not From Ads

Links created from ads are generally not as valuable as links not created from ads. This is because, oftentimes, the links are of low quality and are not relevant to the site they’re linking to. Additionally, depending on the type and context of the link, it might be considered spammy.

132. Backlinks From .edu or .gov Domains

Many SEOs believe that if a .gov, .edu, or similar TLD links back to your site, it must be a sign of high quality. After all, these types of domains are not easy to get links from.

In line with this thought process, some SEOs also believe that there are certain types of sites that Google “expects” to link to your site, and your site won’t be considered “relevant” until that happens.

Both of these ideas are myths, and numerous studies have debunked them. In fact, Google’s Matt Cutts has stated that there is no special weight given to these types of links. So, while it can still be beneficial to get links from .edu or .gov domains, don’t get too hung up on this metric.

133. Diverse Types of Backlinks

This is separate from the types of link diversity above. If a large share of your backlinks comes from one type of site, such as forums or blog comments, those links may be seen as less valuable than a varied mix of backlink types.

For example, if you have 20 blog comment links, 18 forum post links, 24 social media links, 35 links from blog posts, 5 links from media articles, etc., that would be seen as more valuable than if you had 50 blog comment links and no other types of links.

The reason for this is that it’s more difficult to manipulate a diverse range of backlinks. So, if you can get a variety of different types of links pointing to your site, that’s generally seen as a good thing.

134. Competitor Backlinks

It’s not often that you will find a competitor linking to you, but it does happen. Most often, a competitor will write an article reviewing all of their competitors to outline why their product or service is different. This type of backlink may, indeed, carry more weight than a regular backlink, as it’s coming from a source that is highly relevant to your topic.

135. Guest Post Backlinks

A lot of site owners talk about getting guest post backlinks. As a link building tactic, it no longer works, so remove the idea from your mind; worse, you might find yourself penalized by Google for participating in a link scheme. The only exception is a guest poster you are personally willing to vouch for.

136. Backlinks from Real Sites vs. “Splogs”

A “splog” is a fake blog that is created solely for the purpose of spamming the internet with links to a particular site. These are generally low-quality sites with very little content.

While it’s not necessarily a bad thing to get links from these types of sites, they’re not going to carry as much weight as links from real, high-quality sites.

137. Backlinks From Sponsored or UGC Tags

There are a lot of services out there that offer to post your article in very well-known publications. What they don’t tell you is that the backlinks you get from those sources are marked with “rel=sponsored” or “rel=ugc” (user-generated content) tags and pass essentially zero direct value. These tags carry even less weight than a traditional nofollow.

The only benefit for these often highly expensive posts is perhaps name recognition and some brand value, but not SEO benefits. If you’re considering paying for these types of links, I would highly recommend doing your research first and seeing if they’re worth it for your particular situation. You may be better off using those funds to hire someone to improve your site.

As Google explains, they needed a way to take hints from nofollow links (remember what I said about there possibly being some value to nofollow links?), without allowing content like sponsored and UGC posts to muddy the waters:

“When nofollow was introduced, Google would not count any link marked this way as a signal to use within our search algorithms. This has now changed. All the link attributes—sponsored, ugc, and nofollow—are treated as hints about which links to consider or exclude within Search.”

While nofollow can still be used, it is recommended to specify between the three. If you do not explicitly disclose sponsored content with a rel=sponsored or rel=nofollow tag, you can get penalized. Once again quoting Google about sponsored content and ads:

“If you want to avoid a possible link scheme action, use rel=”sponsored” or rel=”nofollow” to flag these links. We prefer the use of sponsored, but either is fine and will be treated the same, for this purpose.”
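In markup, these hints are just values of the rel attribute on the anchor tag. A quick illustration (URLs are placeholders):

```html
<!-- Paid placement: disclose it, or risk a link scheme action -->
<a href="https://example.com/" rel="sponsored">Partner offer</a>

<!-- User-generated content, e.g. blog comments and forum posts -->
<a href="https://example.com/" rel="ugc">Commenter's site</a>

<!-- Generic "don't vouch for this" hint; values can be combined -->
<a href="https://example.com/" rel="nofollow sponsored">Ad link</a>
```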

138. Reciprocal Links

In some cases, reciprocal links are legitimate and useful. For example, if two organizations are working together on a project and decide to link to each other’s sites as a way to increase exposure. However, if you’re participating in reciprocal link schemes where you’re exchanging links with hundreds of other sites just to get a backlink, this will most definitely do more harm than good.

139. Backlinks from 301 Redirected Pages

As we learned previously, 301 redirects typically pass most of the PageRank to the new page (except in some cases). Similarly, some PageRank is passed along from Site A to Site B with a follow backlink, but some PageRank is also lost. In other words, any type of link intended to pass PageRank may or may not pass 100% of the PageRank score to the other page.

It used to be (and many SEOs still believe to this day) that some PageRank was lost in the transfer of 301 redirects within the same site. But, in 2018, Google announced that 301 redirects would generally pass PageRank without loss.

A 301 redirect and a direct link were treated roughly the same when it came to PageRank. So, in the past, in theory, if a page on Site A sat behind a 301 redirect that cut its PageRank value by 10-15%, and it then passed a dofollow backlink to Site B that cut another 10-15%, the losses would compound: the link would only pass about 70-80% of the original PageRank value.

No need to fret, though. While a backlink still does lose some of the PageRank value passed from the referring page, a backlink from a page that has already been redirected would not be compounded.

140. Backlinks to 301 Redirected Pages

Conversely, if old backlinks point to pages on your site that have been 301 redirected, those backlinks now count toward the new destination page. Depending on the number and quality of the old backlinks, the new page could get a boost in rankings. For example, if an “old” blog post with a lot of good links pointing to it is redirected to your home page (or some other, less relevant page), those links will help your home page rank better.
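On Apache servers, this kind of consolidation is often a one-line rule via mod_alias in an .htaccess file. A minimal sketch (paths are hypothetical):

```apache
# Send the old post - and every backlink pointing at it -
# to the most relevant live page with a permanent redirect.
Redirect 301 /old-blog-post/ https://example.com/new-blog-post/
```

Redirecting to a closely related page, rather than dumping everything on the home page, is what preserves the relevance of those old links.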

141. Link Title Popups with Relevant Keywords

When you hover over a link in Chrome, the full URL becomes visible in the bottom left corner of the page. And, if the link has a title attribute, you will also see a small black box pop up showing the title of the page you’re about to visit. This small popup is called a tooltip, and it’s something we all see almost every day without really noticing it.

The tooltip helps improve user experience by letting visitors know where they are being sent before they actually click on the link. If the title attribute contains the same keyword as the anchor text, it gives Google more context and relevance for that keyword, thereby potentially offering a slight boost.
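In markup, the tooltip comes from the title attribute on the anchor tag. A small illustration (URL and wording are made up):

```html
<!-- The browser renders the title attribute as a hover tooltip.
     Anchor text and title reinforcing the same keyword may add context. -->
<a href="https://example.com/roasting-guide"
   title="Coffee Bean Roasting Guide">coffee bean roasting guide</a>
```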

142. Backlink Anchor Text

Anchor text sends a strong signal to Google regarding relevancy and properly optimized anchor text can help boost rankings. However, over-optimization can result in penalties. The days of being able to stuff anchor text with keywords and get away with it are long gone. As Google explains:

“Here are a few common examples of unnatural links that may violate our guidelines:

  • …Links with optimized anchor text in articles or press releases distributed on other sites. For example:
  • There are many wedding rings on the market. If you want to have a wedding, you will have to pick the best ring. You will also need to buy flowers and a wedding dress.”

Instead of over-optimizing, you should focus on using a few different keyword variations for your anchor text, as well as branded terms and naked URLs (just the URL without any anchor text). A good rule of thumb is to keep your anchor text to less than 2% keyword density.
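One way to sanity-check this is a quick anchor text audit. The sketch below (sample anchors are invented) computes the share of backlinks whose anchor text exactly matches the target keyword, a crude proxy for over-optimization:

```python
from collections import Counter

def anchor_text_profile(anchors, keyword):
    """Return the share of backlinks whose anchor text exactly
    matches the target keyword (case-insensitive)."""
    counts = Counter(a.strip().lower() for a in anchors)
    return counts[keyword.lower()] / len(anchors)

anchors = [
    "coffee beans",            # exact-match keyword
    "Example Roasters",        # branded
    "https://example.com",     # naked URL
    "click here",              # generic
    "best arabica guide",      # keyword variation
]
share = anchor_text_profile(anchors, "coffee beans")
print(f"{share:.0%} exact-match anchors")  # 20%
```

If the exact-match share climbs well past the low single digits on a large profile, that is the kind of pattern worth diluting with branded and naked-URL links.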

143. Text Around Anchor Text

The text immediately before and after anchor text is also a strong signal to Google. This is commonly referred to as co-occurring text and it helps Google better understand the relevancy of the link for certain keywords.

For example, if you’re trying to rank for the keyword “coffee beans”, having the following text around your anchor text would be beneficial:

I love buying coffee beans from Amazon. They have the best selection and they are always fresh.

In this example, the words “coffee beans” are surrounded by relevant co-occurring text, which helps Google better understand the context of the link.

144. Backlinks From Hub Pages

Hub pages are pages on a website that serve as a central point of information for a certain topic. Getting a backlink from a hub page can be beneficial because it not only helps improve the relevancy of your site for certain keywords but also helps improve the authority of your site. When Google sees a backlink from a hub page, they see it as a vote of confidence for your site.

145. Backlinks From “Fake” Blogs vs. “Real” Blogs

Google can distinguish between “real” blogs and “fake” blogs. A “real” blog is typically updated frequently with new content, while a “fake” blog is usually just a page full of links to other sites. Fake blogs typically contain little to no original content and exist only for SEO. They are often thrown up quickly on free hosting platforms, such as “example.wordpress.com” or “example.blogspot.com”, rather than on a dedicated domain.

In general, backlinks from real blogs are more valuable than backlinks from fake blogs. This is because Google sees real blogs as more credible and trustworthy sources of information. As a result, links from real blogs tend to be given more weight by Google algorithms. Even if you’re not participating in a link scheme, links from fake blogs may reflect poorly upon you if they are the majority of your backlinks.

The concerning thing about this, though, is that it is easy to fall victim to negative SEO tactics when a competitor launches a massive backlink campaign in the hope it will torpedo your site. So, it is imperative to monitor your backlink profile and use Google’s Disavow Tool to disavow obviously fake blogs.

146. Alt Tag Backlinks (Images)

When an image is linked to your site, the alt tag text is used as the anchor text. This can be beneficial because it allows you to get backlinks with keyword-rich anchor text without having to worry about over-optimization penalties. In addition, if your image shows up in Google Image search results, the alt text will be used as the title text, which can help improve your click-through rate.

To get the most benefit from alt tag backlinks, make sure to use relevant and keyword-rich alt text.

Here’s an example of good alt text:

Fresh roasted coffee beans

In this example, the alt text is relevant to the image and it also includes a keyword (coffee beans). This will help improve the relevancy of the backlink for that keyword.

Here’s an example of bad alt text:

image

In this example, the alt text is not relevant to the image and it doesn’t include any keywords. This will not help improve the relevancy of the backlink.
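In markup, both cases look like this (URL and filenames are illustrative):

```html
<!-- Good: when an image is the link, Google reads the alt text
     much like anchor text, and it is relevant to the image. -->
<a href="https://example.com/shop/">
  <img src="coffee-beans.jpg" alt="Fresh roasted coffee beans">
</a>

<!-- Bad: no keyword, no context for the linked page -->
<a href="https://example.com/shop/">
  <img src="IMG_0042.jpg" alt="image">
</a>
```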

147. Wikipedia Backlinks

Wikipedia backlinks are coveted among some SEOs because they believe these links are typically high quality and editorially given. However, that couldn’t be further from the truth. Wikipedia is supported and edited by the community, which means anyone can create or edit a Wikipedia page.

As a result, Wikipedia pages are not always accurate and up-to-date. In addition, even though contributors strive to keep bad links off the pages, Wikipedia pages often include links to low-quality websites. Regardless, Wikipedia backlinks are marked as nofollow, which means they pass no direct link juice. So, while Wikipedia backlinks may be beneficial from a brand awareness standpoint, they are not going to directly help you rank higher in search engine results pages.

148. Age of the Backlink

According to Google’s patent “Information retrieval based on historical data,” the older a backlink, the stronger a trust signal it can be for ranking. As a general rule of thumb, age builds trust in nearly all things SEO. The longer a site has been around, the more likely it is to be trusted by Google.

Of course, age isn’t the only factor that determines trustworthiness. But, all else being equal, an older backlink will typically be more powerful than a newer one.

149. Upward Backlink Velocity

If your site is experiencing a steady upward trend in backlinks, it is a good sign. This is because it shows that your site is becoming more popular and authoritative over time.

However, if your backlink velocity is increasing too quickly, it could be a sign of link buying/spamming. In this case, Google may view your links as being less valuable and could penalize your site accordingly.

150. Downward Backlink Velocity

On the other hand, if the backlink velocity of your site is in a downward trend, it could signal to Google that the page is losing relevancy. This could result in a drop in rankings.

There are a number of reasons why this may happen, such as if you stop link building or if other sites remove their links to your site. In either case, it’s important to investigate the reason for the decline so you can take action to improve your link profile.

151. Natural Link Profile

There are several indicators Google uses to determine if a site has a natural link profile. These include, but are not limited to, the following:

  • Anchor text diversity
  • Link velocity
  • Link placement

If your link profile meets Google’s guidelines for a natural link profile, it will likely be viewed as more valuable and trustworthy than one that doesn’t. This could help improve your rankings on search engine results pages.

152. TrustRank of Linking Site

The TrustRank is a metric developed by Google in its patent “Search result ranking based on trust” to determine the trustworthiness of a website. It takes into account a number of factors, such as the age of the site, the number of backlinks, and the quality of those backlinks.

Generally speaking, sites with a higher TrustRank are more likely to be trusted by Google and thus may have an easier time ranking higher in search engine results pages.

153. Number of Outbound Links on the Referring Page

Touching back on PageRank, it is important to remember that the link juice a page passes is finite. If a referring page has lots of outbound links, it spreads the link juice too thin, and the PageRank passed through each link will be less valuable.

On the other hand, if a referring page has very few outbound links, it will pass more link juice and the PageRank will be more valuable.

Therefore, all else being equal, a backlink from a page with fewer outbound links will typically be more valuable than one from a page with more outbound links.

154. Word Count of Linking Content

Long-form content tends to perform better in search results. As a result, if the referring page’s content is long, Google will likely view it as more valuable than a shorter piece of content.

This is because long-form content typically contains more information and thus has the potential to be more helpful to searchers. Consequently, Google is more likely to rank it higher on search engine results pages.

155. Selling or Buying Backlinks

Selling or buying backlinks is expressly forbidden by Google and can result in severe penalties. If you are caught doing this, your site could be removed from Google’s search results entirely. Therefore, it is important to only acquire backlinks from reputable sources. If you come across a site that is selling or buying backlinks, you can report it to Google via their spam report form.

156. Widget Links

Widgets can boost user engagement and help you promote your content. However, widget links are generally considered spammy or, worse, might be viewed as an attempt to manipulate PageRank. As Google states:

“…some widgets add links to a site that a webmaster did not editorially place and contain anchor text that the webmaster does not control. Because these links are not naturally placed, they’re considered a violation of Google Webmaster Guidelines.”

157. Directory Links

Directories seem like a great way to get a few backlinks quickly. However, most directory sites are low-quality and not worth your time. In fact, Google has stated that they are a “common example of unnatural links that may violate our guidelines.” So, while you may get a few backlinks from these sites, they are unlikely to have a positive impact on your search engine rankings and may even hurt you in the long run. Therefore, it’s generally best to avoid directory sites altogether.

158. “Poison” Anchor Text

Going one step beyond “toxic” links, there are “poison” links. These links often include sensitive anchor text tied to spam-heavy niches, such as pharmaceuticals, gambling, and adult content.

If your site is linked to by one of these sites, it could negatively impact your rankings. As such, it’s important to disavow any links from these types of sites. Lists of pharmacy-related and adult content-related terms are widely available online.

159. Temporary Link Schemes

A temporary link scheme is a short-term effort to boost a site’s rankings by acquiring links from low-quality or spammy sites. These schemes are often used by black hat SEOs and are quickly caught and penalized by Google.

Social Media Signals

Social media profiles for your brand can provide valuable signals to Google. You won’t get a ranking benefit from the backlinks themselves – these links are typically tagged ugc or nofollow – and social media profiles will not directly improve your search engine rankings. Nevertheless, they can still be valuable in other ways, such as helping to build brand awareness and providing another avenue for people to find your site.

160. Site is Linked to a Facebook & Twitter Page

Solid, well-maintained Facebook and Twitter pages for your brand can be valuable signals to Google. As these pages are generally the result of a deliberate effort on your part, they may reflect positively on the quality of your site.

161. Official LinkedIn Company Page

LinkedIn is a popular social networking site for professionals. Therefore, having an official LinkedIn company page can be a valuable signal to Google, especially if your target audience is business people or other professionals.

162. Verified Social Media Profiles

As Google’s Eric Schmidt stated back in 2013 in a WSJ article:

“Within search results, information tied to verified online profiles will be ranked higher than content without such verification, which will result in most users naturally clicking on the top (verified) results.”

163. Verifying Legitimate Social Media Accounts

If you think buying fake followers, likes, and comments will help your Google ranking, think again. Not only will this not help your rankings, it could actually hurt you. While it is not verified to be in use, Google filed a patent in 2013 called “Method and system for detecting fake accounts in online social networks,” which suggests Google has worked on ways to detect fake social media accounts and devalue their signals.

In other words, if you’re buying fake social signals in an attempt to game Google, you’re not only wasting your money, you could also be damaging your site’s reputation. Therefore, it’s best to focus on building legitimate engagement with your target audience on social media.

164. Site is Mentioned in Prominent Social Media Profiles

If you’re lucky enough to have your site mentioned in a prominent social media profile – such as the Twitter profile of a major news organization – even without a link to your site, this can be a valuable signal to Google. Not only does this help to build brand awareness, but it can also result in a significant increase in traffic.

Algorithm Factors That Influence SERPs

As the web and world evolve, Google’s algorithm must also evolve to keep up. Over the years, there have been many algorithm updates – some major and some minor. In fact, it is said that there are, on average, at least three minor updates every day.

These updates can range from a small tweak to the way a certain type of content is ranked, or the addition of a new feature, to a complete overhaul of the way Google ranks sites. Needless to say, trying to keep up with all of these changes can be a full-time job in and of itself.

While we don’t have the space to go over every single algorithm update, below are some of the notable factors outside of optimization that can affect how a site is ranked in the SERPs.

165. User Browsing History

User browsing history is one factor that can influence SERPs. If a user has searched for a particular keyword in the past, Google may give preference to that keyword when the user searches for it again.

166. User Search History

In addition, if a user frequently visits a particular site, Google may give that site preference in the SERPs. This is because Google assumes that if a user keeps coming back to a particular site, it must be because they like what they see.

167. Top Stories Box

The top stories box is a feature that appears at the top of the SERPs for certain keywords. This box contains articles from various news sources related to the keyword being searched.

If your site is chosen to be featured in the top stories box, it can result in a significant increase in traffic.

168. Query Deserves Diversity

The query deserves diversity (QDD) algorithm is a way to add more variety to the SERPs. Prior to QDD, if a user searched for a particular keyword, they would often see results from the same few sites over and over again. With QDD, Google aims to show a wider range of results, including results from a variety of different sites.

169. Domain Diversity

Domain diversity is similar to query deserves diversity, but instead of showing a variety of results for the same keyword, it shows a variety of results from different domains.

170. Big Brand Preference

In general, Google gives preference to big brands. This is because, in most cases, big brands are more likely to have high-quality content and websites.

171. The Freshness Factor

Google also puts a high value on fresh content. This is why you’ll often see recent articles or blog posts appearing near the top of the SERPs.

172. Single Site Results for Brands

Another algorithm update that favors big brands is the single site results for brands update. This update ensures that when a user searches for a particular brand, they are more likely to see results from that brand’s website, rather than from other sites, even if those other sites have higher-quality content.

173. “YMYL” Keywords

“YMYL” is an acronym that stands for “your money or your life.” These are keywords that relate to sensitive topics that can have a major impact on a person’s well-being.

Some examples of YMYL keywords include “cancer,” “divorce,” and “debt.”

Google takes extra care when it comes to ranking sites for YMYL keywords, as they don’t want to send users to sites that could potentially harm them in some way.

174. Payday Loans Update

The payday loans update is an algorithm update specifically designed to target sites trying to game the system with heavily abused YMYL queries – such as “payday loans” – that led users to spammy sites.

175. Shopping Results

Another type of result you’ll see in the SERPs is shopping results. These are results that come from online retailers and are typically triggered by keywords that relate to shopping, such as “buy,” “price,” and “discount.”

176. Transactional Results

Transactional results are results that allow users to complete a transaction, such as buying a product or making a reservation. These results typically appear for searches that have commercial intent, such as “flight to/from” or “restaurant reservations.”

177. Image Results

Image results are another type of result that you’ll see in the SERPs. These are results that come from image-based search engines, such as Google Images. The intent behind the search is typically to find a particular image, rather than to find information about a topic.

178. Easter Egg Results

Easter egg results are Google’s fun way of hiding surprises in the search results. There have been many Easter eggs with various themes. For example, in 2018 at the height of popularity for the movie “Avengers: Infinity War,” when users searched for “infinity gauntlet,” they were treated to a Thanos-themed Easter egg that displayed a Knowledge Graph with the Infinity Gauntlet on it. Clicking on it caused its fingers to snap, disintegrating half of the links and images on the page, as well as counting down the number of results to half. 

179. Geo-Targeting

Location, location, location. Google takes into account the user’s location when determining which results to show. This is why you’ll often see results that are specific to your city or region.

180. Local Pack Results & Near Me Searches

Local pack results are a type of search result that is specifically designed to give users information about local businesses. These results typically appear for searches that have local intent, such as “pizza near me” or “coffee shops in San Francisco.”

Local pack results typically include the business name, address, phone number, and website, as well as a map of the business’s location. Users will often see the results in a map format, with the businesses being represented by little pins.

As discussed above, “near me” searches are becoming increasingly popular as more and more people use their mobile devices to find local businesses. It’s important to note that some SEOs try to optimize sites for the literal term “near me.” Whether this is effective is up for debate, but most often it comes across as spammy and unnatural in text. Additionally, “near me” results take into account far more than that keyword, so optimizing for it is likely a waste of time.

181. Voice Search

Voice search is a type of search that is conducted by speaking into a device, rather than typing. Voice search is becoming increasingly popular, as more and more people use voice-activated assistants, such as Amazon Alexa, Google Home, and Apple Siri.

One of the benefits of voice search is that it can be done hands-free, which can be convenient in many situations. For example, you can do a voice search while you’re cooking, driving, or walking.

Another benefit of voice search is that it can be more natural and conversational than a typed search, since users can ask questions the way they would speak to another person. These conversational queries also give Google’s Natural Language Processing (NLP) systems more to work with.

182. Query Deserves Freshness (QDF)

The Query Deserves Freshness (QDF) is an algorithm factor for certain types of queries, such as news, current events, and seasonal trends. The QDF algorithm is designed to give users the most up-to-date and relevant results for their queries.

This ranking factor is important for two reasons. First, it ensures that users are seeing the most up-to-date and relevant information. Second, it encourages content creators to keep their content fresh, which can help improve the overall quality of the web. This means adding new information, removing outdated information, and generally keeping your content relevant.

183. User Intent

User intent is Google’s best guess at what the user actually has in mind when they perform a search. Google is getting better by the day at distinguishing user intent. It incorporates many of the factors listed in this article, and it seems to be a signal that morphs and learns over time as Google gets better at understanding searcher behavior.

184. Featured Snippets

Featured snippets are a type of search result that is designed to give users quick and useful information in response to their query. These results typically appear at the top of the search results page, above the regular organic results.

Featured snippets can be in the form of a paragraph, list, table, video, calculator, etc. They are typically pulled from the content of a web page, though they can also be generated by Google algorithms.

185. Safe Search

SafeSearch is a feature that allows users to filter out explicit content from their search results. If SafeSearch is turned on, it can affect which sites are shown in the SERPs.

186. DMCA Complaints

DMCA stands for the Digital Millennium Copyright Act. This is a US law that protects copyright holders from having their content stolen or copied without permission. If a website receives a DMCA complaint, it means that someone has complained that the site is infringing on their copyright. Google takes these complaints seriously and may take action against the site, such as removing it from the SERP.

The End... Or Is It?

Whew! That is a lot to take in, I know. I did my best to dispel some myths, verify some facts, and give you a comprehensive guide to Google’s ranking factors. However, this is not an exhaustive list by any means. The SEO world is constantly changing and there are always new factors to consider.

While many of us think it would be great to know the exact “secret sauce,” the truth is, if everyone knew how Google’s algorithms worked, the playing field would be far from level. So, for now, we can all continue to experiment and try to keep up with the ever-changing landscape of SEO.

So, what’s next? Well, if you’re feeling ambitious, you can try to tackle all of these ranking factors. Or, you can focus on the ones that are most important to your business or website. If you’ve come to the realization that this is all too much, I’d like to help! I offer SEO & Thought Leadership strategy consulting services that can help you boost your website’s ranking. Want to learn more? Contact me!

Kimberly Forsythe

The motto for firms in the 21st-century 🖖🏼 should be "Disrupt or Be Disrupted." My passion is helping small businesses get started on the path to digital transformation and growth.

Connect on Upwork