Google's new technology determines whether the 'Facts' published on internet are indeed true


With the overload of information on the web, Google, the popular search engine, has decided to rank results by truth rather than popularity.

What would this mean? Blog sites that funnel loads of information through Google currently get their rankings from inbound links and popularity. Recently, however, Google found that a high ranking does not guarantee correct or true information. It has therefore decided to develop a new technology that would base rankings on a “truth score” rather than popularity. The idea is still at the research-paper stage and has not yet been implemented.

Though the idea may seem unbelievable, Google says its recent research shows it is not impossible. To verify something you need two things: a fact, and a reference against which to evaluate the fact’s accuracy. Google has already started building its “Knowledge Graph,” which will act as that reference base, extracting information from Wikipedia, the CIA World Factbook, and Freebase. At the same time, Google’s internal research database, the “Knowledge Vault,” will extract separate information from the text of websites.

Both of these databases will produce “knowledge triples”: a combination of subject, relationship, and attribute. For example, if a user types “August 15 1990” and “birthday,” Google pops up “Jennifer Lawrence, birthday, August 15 1990.”
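A knowledge triple can be pictured as a simple three-field record. The sketch below is purely illustrative — the `Triple` class and field names are assumptions for this article, not Google’s actual data model:

```python
from typing import NamedTuple

class Triple(NamedTuple):
    """A hypothetical knowledge triple: subject, relationship, attribute."""
    subject: str
    relationship: str
    attribute: str

# The example triple from the article
fact = Triple("Jennifer Lawrence", "birthday", "August 15 1990")
print(fact)  # Triple(subject='Jennifer Lawrence', relationship='birthday', attribute='August 15 1990')
```

Because a `NamedTuple` compares equal to a plain tuple, triples extracted from different sources can be matched against each other directly.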

Now, when a blog or news story is posted to a website, Google will find the “knowledge triples” in its text and match them against the Knowledge Graph. If the data matches, the page will get a high score; otherwise it will get a low rank in Google search and thus lose visitors.
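The matching step described above can be sketched as follows. This is a deliberately simplified illustration under stated assumptions — a toy in-memory reference base and a score defined as the fraction of a page’s triples that agree with it. Google’s actual approach (described in its research as Knowledge-Based Trust) is a probabilistic model, not a simple fraction:

```python
# Toy reference base: (subject, relationship) -> attribute.
# Entries here are illustrative stand-ins for the Knowledge Graph.
knowledge_graph = {
    ("Jennifer Lawrence", "birthday"): "August 15 1990",
    ("Barack Obama", "birthplace"): "Honolulu",
}

def truth_score(page_triples):
    """Fraction of a page's triples whose attribute matches the reference base."""
    if not page_triples:
        return 0.0
    matches = sum(
        1 for subject, relationship, attribute in page_triples
        if knowledge_graph.get((subject, relationship)) == attribute
    )
    return matches / len(page_triples)

# A hypothetical page: one correct triple, one contradicting the reference.
page = [
    ("Jennifer Lawrence", "birthday", "August 15 1990"),  # matches
    ("Barack Obama", "birthplace", "Kenya"),              # does not match
]
print(truth_score(page))  # 0.5
```

A page whose extracted facts all agree with the reference would score 1.0 and rank well; a page full of contradicted claims would score near 0.0 and sink.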

Why would Google take on such complexity? Because a random check carried out by the researchers on pages ranked by Google’s current algorithm produced a shocking revelation: only 20 of the 85 pages examined had been given high scores, even though the other pages were factually true. This means users who rely on Google for information are actually being misled, while sites that give correct information are being pulled down by the link-based ranking method.

Similarly, for the past two years Google has come across many fake stories circulating in its news feed, and it has found a way to keep these hoaxes at bay by adding a warning to hoax stories indicating that they are fake. This will not affect genuine sites; it will only affect those posting fake stories, as Google found that some sites make money solely by uploading fake and hoax stories.

“We’ve now refined the signal in ways we expect to visibly affect the rankings of some of the most notorious sites,” Google’s senior copyright counsel, Katherine Oyama, said in October.

Indeed, this would be pretty cool if Google actually implements the technology: users could then rely on Google for factual, truthful information.

1 COMMENT

  1. Interesting… If this technology is implemented by Google, it will surely give new hope to website and blog owners by offering them an opportunity to rank their truthful, quality content. Content was once considered KING, but in the modern age of the web, obtaining a good Google ranking for any blog or website has become too difficult, even with “HIGH QUALITY” content. Let’s see what happens with this prediction 🙂
