
Knowledge-Based Trust In Search Rankings

Slashdot recently reported on a research paper published by a team of Google's own researchers describing a way to rework Google's algorithm so that the number of incorrect facts on a page counts against it, a measure of trustworthiness they call knowledge-based trust. Pages with a higher proportion of factually correct information may rise in the rankings, while pages with more incorrect facts, or facts that are less well supported, may fall. This reminds me of the GRE test questions that ask you to pick the best answer even when several correct answers are offered.
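
To make the idea concrete, here is a deliberately simplified sketch of how such a score could work: check a page's extracted facts against a reference knowledge base and compute the share that match. The actual paper uses a probabilistic model that also accounts for extraction errors, and the facts and knowledge base below are invented purely for illustration.

```python
# Naive sketch of the knowledge-based trust idea: score a page by the share
# of its extracted (subject, predicate, object) facts that agree with a
# reference knowledge base. The real model is probabilistic and handles
# extraction noise; this toy version and its sample data are illustrative only.

REFERENCE_KB = {
    ("barack obama", "born in"): "honolulu",
    ("eiffel tower", "located in"): "paris",
}

def knowledge_based_trust(page_facts):
    """Return the fraction of a page's checkable facts that match the KB."""
    checkable = [f for f in page_facts if (f[0], f[1]) in REFERENCE_KB]
    if not checkable:
        return None  # nothing we can verify, so no trust signal either way
    correct = sum(1 for s, p, o in checkable if REFERENCE_KB[(s, p)] == o)
    return correct / len(checkable)

page_a = [("barack obama", "born in", "honolulu"),
          ("eiffel tower", "located in", "paris")]
page_b = [("barack obama", "born in", "kenya"),
          ("eiffel tower", "located in", "paris")]

print(knowledge_based_trust(page_a))  # 1.0 -> more trustworthy
print(knowledge_based_trust(page_b))  # 0.5 -> less trustworthy
```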

This new method is contrasted with the link popularity signals Google has used in its algorithm since the search engine's inception, the same signals that gave rise to the well-known PageRank metric.

Google is building what it calls the Knowledge Vault, as distinct from its Knowledge Graph. The former extracts facts from across the entire web, whereas the latter draws on curated, crowdsourced sources such as Wikipedia and Freebase. The real question is: how does Google's Knowledge Vault decide that a statement qualifies as a fact, and how will it use that information?
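
I don't know Google's internals, but one plausible, heavily simplified answer is consensus: if many independent sources assert the same value for a claim, that value is more likely to be treated as a fact. The sketch below illustrates that intuition with made-up data; the real Knowledge Vault reportedly also weighs extractor confidence and prior knowledge from existing graphs.

```python
# Simplified guess at how a Knowledge Vault-style system might decide whether
# a candidate statement counts as a "fact": gather the values extracted from
# many independent sources for the same (subject, predicate) and measure
# agreement. The claims below are invented for illustration.

from collections import Counter

def fact_confidence(claims):
    """claims: object values extracted from different sources for one
    (subject, predicate). Returns the majority value and the share of
    sources that agree with it."""
    counts = Counter(claims)
    value, votes = counts.most_common(1)[0]
    return value, votes / len(claims)

claims = ["paris", "paris", "paris", "lyon"]  # "eiffel tower / located in"
print(fact_confidence(claims))  # ('paris', 0.75)
```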

While this knowledge-based trust concept has not been incorporated into the current algorithm, the news comes just a few weeks after Google's John Mueller recommended that websites not build links, and instead develop content that entices others to link to them. Now that we know Google is researching the factual accuracy of web pages, it is more important than ever to develop factually correct content that helps users. Which really should be a no-brainer.

This might fix several of the inherent problems in Google's algorithm. Links have long been treated as an endorsement of the page or site you're linking to, but context will tell you this is not always true. If you write a legitimate negative review of a product or service and link to it, Google has historically counted that link as a positive signal about that page, product, or service. Although the introduction of schema.org ratings markup has helped Google deliver star ratings on search result pages, this idea of using facts to determine the trustworthiness of a page could be a major game changer, potentially helping real experts gain the recognition their work deserves.
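
For reference, this is roughly what that ratings markup looks like in schema.org's JSON-LD form. The product name and numbers here are made up, and the generated JSON would normally be embedded in the page inside a script tag of type application/ld+json.

```python
# Build schema.org AggregateRating markup for a product as a Python dict and
# serialize it to JSON-LD. The product and figures are hypothetical examples.

import json

product_markup = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",  # hypothetical product
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "89",
        "bestRating": "5",
    },
}

print(json.dumps(product_markup, indent=2))
```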

In most cases, I would say not to react too quickly when information like this is published, especially since this ranking method has not been announced as part of the live algorithm just yet. However, delivering high-quality, factual content shouldn't be done to please the Googlebots, but to please your users. In other words, it should be common sense.

I predict that this will mean a lot for ecommerce as well. Would you make a purchase from a store that offers only a little information about a product, or from one that has invested the time and money in producing tons of information, including video tutorials and blog posts?

How will this work when it comes to facts that are still under dispute, such as the debates around climate change? I suspect this has been accounted for, and my guess is that when a page addresses such disputes directly, Google's Knowledge Vault will recognize content that treats multiple sides in a reasonable, respectful fashion.

I also wonder about situations where subjectivity is legitimate, such as analytical and critical works or reviews of art. Then again, let's go back to the original idea of applying knowledge-based trust to content: the more factual information you cite, the higher your potential rankings. Just as in a research paper, even one about an artistic work where subjectivity has a place, using as many facts as possible lets you make stronger deductions and produce higher-quality scholarship.

Another question that comes to mind: what happens if someone discovers new information (backed by facts) that overturns prior knowledge? Could this kind of revision create a confusing picture within the Knowledge Vault? Can knowledge-based trust make the correct call?

Lastly, will this completely diminish the value of links? I suspect the answer is no. Links will simply become one of the hundreds of factors that determine a website's rankings. If a link is generating quality traffic to your website, and especially if that traffic converts into sales or leads, Google's metrics may pick that up as a sign of quality.

The possibility that Google might fold factual accuracy into its algorithm is one of many recent developments aimed at making its product more powerful and more helpful to users. Either way, you should have cold, hard facts (backed up with sources where possible) on your site whether Google rewards them or not. It's just common sense.
