Latest Algorithm For USA  

Posted by Ritesh Kumar Jha

Stemming


When a person looks at a cached copy of a page, Google states at the top how that page relates to the search which found it. When you search for coffee, Starbucks is the #1 result, even though the word "coffee" does not even appear on the page.

With some sites that reference has been way out to lunch, showing phrases such as "allinurl:blablabla" and "allinanchor:blablabla". When you search for something, it should not tell you how advanced operators are matching other pages when you did not use those operators - this is part of the reason why some say "Google is Broken."
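
To make the idea of stemming concrete, here is a minimal suffix-stripping sketch. It is not Google's stemmer (which is not public); it only shows how word variants such as "coffees" and "coffee" can collapse to a common form, so a query can match a page that uses a different form of the word.

```python
# Minimal suffix-stripping stemmer sketch -- not Google's actual algorithm,
# just an illustration of how word variants collapse to a shared stem.

SUFFIXES = ["ies", "ing", "ed", "s"]  # checked in order, longest first

def crude_stem(word: str) -> str:
    """Strip a common English suffix so variants map to one stem."""
    word = word.lower()
    for suffix in SUFFIXES:
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[: -len(suffix)]
    return word

def stems(text: str) -> set[str]:
    """Return the set of stems found in a piece of text."""
    return {crude_stem(token) for token in text.split()}

# A query for "coffees" can now match a page that only says "coffee".
query = "best coffees"
page = "Starbucks coffee shops near you"
print(stems(query) & stems(page))  # {'coffee'}
```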

Bayesian Spam Filter
In October it is believed that Google implemented a Bayesian spam filter. A Bayesian spam filter compares returned results to known examples of spam, breaking each page down into its simplest components and using a probability analysis to estimate whether the page as a whole is spam.
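
Google has not published the details of any such filter, so the sketch below only illustrates the general Bayesian idea with invented training data: estimate how likely each token is to appear in known spam versus legitimate pages, then combine those per-token probabilities into an overall spam probability for a new page.

```python
import math
from collections import Counter

# Toy naive Bayes spam scorer -- a sketch of the general Bayesian idea,
# not Google's (undisclosed) implementation. Training data is invented.

spam_docs = ["cheap viagra cheap pills", "free free free casino bonus"]
ham_docs = ["coffee brewing guide", "local coffee shop reviews"]

def train(docs):
    counts = Counter()
    for doc in docs:
        counts.update(doc.split())
    return counts

spam_counts, ham_counts = train(spam_docs), train(ham_docs)
vocab = set(spam_counts) | set(ham_counts)

def log_likelihood(tokens, counts):
    """Sum of log P(token | class) with add-one (Laplace) smoothing."""
    total = sum(counts.values())
    return sum(math.log((counts[t] + 1) / (total + len(vocab))) for t in tokens)

def spam_probability(text, prior_spam=0.5):
    """P(spam | text) via Bayes' rule, assuming token independence."""
    tokens = text.lower().split()
    log_spam = math.log(prior_spam) + log_likelihood(tokens, spam_counts)
    log_ham = math.log(1 - prior_spam) + log_likelihood(tokens, ham_counts)
    # Convert the two log scores back into a normalized probability.
    return 1 / (1 + math.exp(log_ham - log_spam))

print(spam_probability("free cheap pills"))     # high spam probability
print(spam_probability("coffee shop reviews"))  # low spam probability
```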

Signals which would appear as dead-on spam to a search engine would be the following (a rough scoring sketch follows the list):

high keyword density
high keyword proximity
only having reciprocal links
keyword-stuffed inbound links
most links coming from off-topic resources
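
As a rough illustration of how the first two on-page signals could be mechanized, the sketch below computes keyword density and a crude proximity measure for a chosen term. The thresholds are invented for illustration and are not anything a real engine is known to use.

```python
# Rough sketch of two of the on-page signals listed above: keyword density
# and keyword proximity. Thresholds are invented for illustration only.

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of the page's words that are the keyword."""
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

def min_keyword_gap(text: str, keyword: str) -> int:
    """Smallest number of words between two occurrences of the keyword
    (a crude proximity measure); returns the page length if it appears < 2 times."""
    words = text.lower().split()
    positions = [i for i, w in enumerate(words) if w == keyword.lower()]
    if len(positions) < 2:
        return len(words)
    return min(b - a - 1 for a, b in zip(positions, positions[1:]))

page = "coffee deals coffee coffee best coffee prices coffee"
density = keyword_density(page, "coffee")
gap = min_keyword_gap(page, "coffee")

# Invented thresholds: flag pages where the keyword dominates the text
# and repeats almost back to back.
if density > 0.3 and gap <= 1:
    print(f"looks stuffed: density={density:.2f}, min gap={gap}")
```
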
Local Reorganization of Results
Teoma is one of the five major crawlers on the web. It has a ranking system that orders results based on their local interconnectivity: after the user searches, the initial set of results is re-ranked based on links to and from topical hubs and authorities.
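
Teoma's actual formula is not public, so the sketch below only illustrates the general idea of local re-ranking: take the initial result set, count how many of the other results in that set link to each page, and boost pages that are well connected within the set. The boost weight is an invented parameter.

```python
# Sketch of re-ranking by local interconnectivity: boost results that are
# linked to by other pages in the same result set. The scoring weight is
# invented; Teoma's actual formula is not public.

def rerank(initial_results, link_graph, boost=0.5):
    """initial_results: list of URLs, best first.
    link_graph: dict mapping a URL to the set of URLs it links to."""
    result_set = set(initial_results)

    def local_inlinks(url):
        # Count links to this URL coming from other pages in the result set.
        return sum(1 for src in result_set
                   if src != url and url in link_graph.get(src, set()))

    def score(rank_and_url):
        rank, url = rank_and_url
        # A lower original rank is better; local inlinks push a page upward.
        return rank - boost * local_inlinks(url)

    return [url for _, url in sorted(enumerate(initial_results), key=score)]

results = ["a.com", "b.com", "c.com", "d.com"]
links = {
    "a.com": {"d.com"},
    "b.com": {"d.com"},
    "c.com": {"d.com"},
}
print(rerank(results, links))  # d.com moves up: well linked within the set
```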

Google's new algorithm  

Posted by Ritesh Kumar Jha

Commercial Bias of the Web

A good research paper has solid information in it, but its author typically cares little about what a search engine may think. The fact that the internet can provide essentially free distribution after the initial investment makes it appealing to marketers of all types.


The marketer wants to make money, frequently caring very little about what search engines or the end user think. In the past, webmasters worked hard to push the second-best result above the one that deserved to rank first. Recently Google has aimed to change this.

If Google were to make one simple change at a time, people like me could easily reverse engineer their every move. This is the reason I think they made multiple changes at the same time, and built the new features so they could be engaged selectively. Recently Google has added Stemming, a Bayesian Spam Filter, and perhaps local redistribution of search results.