Is Google trying to tell me something?!


Google’s algorithm doesn’t try to determine how good or bad your site is. It doesn’t even rank your “site” in search results. Google is much more granular than that: it ranks each individual page of your website using over 200 signals of relevance. Some of those signals are trust-based and off-page, and the same signal can carry more weight on one page than it does on another.

The key things to focus on are relevance and usefulness. The more relevant and useful your page, and the more relevant and useful the pages linking to it, the better your page will rank. Naturally, it’s Google’s definition of relevance and usefulness that matters; seek to understand how Google quantifies those values and you will have more clarity on why your pages rank where they do.
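To make the per-page idea concrete, here is a minimal sketch of page scoring. The signal names, values, and weights below are entirely hypothetical; Google’s roughly 200 signals and their weights are proprietary. The only point the sketch makes is that the score is computed per page, from that page’s own signals, not per site.

```python
# Hypothetical relevance/usefulness signals for one page,
# each normalized to the range 0..1. These names are invented
# for illustration; they are not Google's actual signals.
page_signals = {
    "query_term_match": 0.8,      # on-page relevance to the query
    "inbound_link_quality": 0.6,  # usefulness of the pages linking in
    "content_depth": 0.7,
    "page_trust": 0.5,
}

# Hypothetical weights. The real weighting is unknown; what matters
# is that the combination happens per page.
weights = {
    "query_term_match": 0.4,
    "inbound_link_quality": 0.3,
    "content_depth": 0.2,
    "page_trust": 0.1,
}

score = sum(weights[s] * v for s, v in page_signals.items())
print(f"page score: {score:.2f}")  # page score: 0.69
```

A different page on the same site would get its own score from its own signals, which is why two pages on one “site” can rank very differently for the same query.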

Occam’s Razor tells us the simpler explanation is usually the best, and the examples you cite are easily explained by well-known ranking factors without inventing additional, speculative ones.

It’s a matter of perspective. To use an analogy, early humans looked up at the sky and saw the heavenly bodies rotating around the earth in complicated patterns. That was their perspective because they had limited knowledge of orbital mechanics. Likewise, from the perspective of a webmaster, we see many of our pages drop in rank for many different keywords and feel as if our entire website has somehow been penalized. That’s a reasonable conclusion from the webmaster’s limited perspective. When we study how search engines actually work, we learn that they do things that don’t target your entire website, or even your website specifically, yet can still feel that way to the webmaster.

It’s well known that Google has an anti-webspam team that hunts down webspam and devalues the influence of links from those pages. If a webmaster has used webspam to bolster the ranking of their pages for certain keywords, and those backlinks are devalued, we would expect those pages to drop in ranking for those particular keywords. Additionally, if a page was receiving a significant amount of PR from backlinks that get devalued or removed, and was passing that PR on to other pages of the website, we would expect those pages to lose some of their ranking power as well.
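A small sketch can show how this plays out. The code below runs standard PageRank (power iteration, 0.85 damping) over a hypothetical six-page graph: first with three spam pages linking into a “money” page, then with those links devalued. The graph and page names are invented for illustration; the point is that both the money page and the downstream page it passes PR to lose rank.

```python
def pagerank(graph, damping=0.85, iterations=50):
    """graph: dict mapping page -> list of pages it links to."""
    pages = list(graph)
    n = len(pages)
    ranks = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for p, outlinks in graph.items():
            if outlinks:
                # Each page splits its rank evenly among its outlinks.
                share = ranks[p] / len(outlinks)
                for q in outlinks:
                    new[q] += damping * share
            else:
                # Dangling page: spread its rank evenly over the graph.
                for q in pages:
                    new[q] += damping * ranks[p] / n
        ranks = new
    return ranks

# "money" receives spam backlinks and passes PR on to "inner".
before = {
    "spam1": ["money"], "spam2": ["money"], "spam3": ["money"],
    "money": ["inner"], "inner": [], "other": ["inner"],
}
# After the webspam team devalues the spam links:
after = {
    "spam1": [], "spam2": [], "spam3": [],
    "money": ["inner"], "inner": [], "other": ["inner"],
}

for label, g in [("before", before), ("after", after)]:
    r = pagerank(g)
    print(label, {p: round(r[p], 3) for p in ("money", "inner")})
# Both "money" and the downstream "inner" page lose rank once the
# spam links stop passing PR -- no site-wide penalty required.
```

Notice that nothing in this model knows about “sites” at all; the drop propagates purely through page-to-page links.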

These well-known factors already account for the ranking-drop phenomenon without adding a theory that a particular website is being individually targeted for specific keywords.

I, for one, do not buy into the concept of site authority. I can see why so many have bought into it: from the limited perspective of a webmaster, it sounds very plausible. However, it’s a theory that doesn’t hold up under closer scrutiny.

The well-known PageRank algorithm, along with the lesser-known trust factors, as applied to individual pages, accounts for all the effects commonly attributed to “site authority”. Since “site authority” would dilute the relevancy of SERP results, it seems like a poor choice for a ranking factor; I can’t see Google using a signal that could only serve to lower the relevancy of its results.

I think this is a case of applying your own view of relevance rather than Google’s. The fact that you show up in the SERP at all is evidence that Google found your page relevant enough to include. And if not for those other 400+ pages that were also relevant, your pages would be top-ranked.

Again, this is a matter of perspective. Clearly Google is determining relevance differently than you expected. If you determined relevance exactly the way Google does, I’m sure you would find the pages you called “irrelevant” at the top of your own list.

The lesson here is that Google has a specific method for determining relevance, and the key is to learn what that method is. When we understand the signals of relevance Google uses, and have a general idea of how it applies them to rank pages, we can use that knowledge to improve the ranking of our own pages.

Search engines like Google do not rank “sites”; they rank individual pages. They use on-page factors as well as signals from the pages that link to each page. They see your page as part of a web, one that expands beyond the limits of any particular “site”, and the web your page is embedded in plays a major role in determining its relevancy score.

Naturally, anything that influences the web your page belongs to will have an impact on your page. And when all the pages of your website share a common set of webs, they also share the impact of changes in those webs. This is, in part, why diversity in your webs is important for achieving stable rankings.
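As a hypothetical illustration of that diversity point, compare a page whose inbound link value comes from a single web with a page whose identical total value is spread across three webs. The clusters and numbers below are invented; the sketch only shows that when one web is devalued, the concentrated page loses everything while the diverse page keeps most of its value.

```python
# Inbound link-value contributions, grouped by the "web" (cluster)
# each link comes from. Values and cluster names are hypothetical.
concentrated = {"cluster_a": [0.30, 0.25, 0.25]}                      # one web
diverse = {"cluster_a": [0.30], "cluster_b": [0.25], "cluster_c": [0.25]}

def link_value(inlinks, devalued=()):
    """Sum inbound contributions, skipping any devalued clusters."""
    return sum(v for c, vs in inlinks.items() if c not in devalued for v in vs)

for name, inlinks in [("concentrated", concentrated), ("diverse", diverse)]:
    before = link_value(inlinks)
    after = link_value(inlinks, devalued={"cluster_a"})
    print(f"{name}: {before:.2f} -> {after:.2f}")
# concentrated: 0.80 -> 0.00
# diverse:      0.80 -> 0.50
```

The same total link value behaves very differently under a devaluation event, which is why pages with diverse webs tend to hold their rankings more steadily.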
