01 Jun 09
16 Jun 10 2:02 pm
I heard that this is how it works:
The google bots crawl web pages, then index and rank them according to the algorithm programmed into them. Then, after an undisclosed period, the google wizards review the pages and reassign them to their appropriate ranks.
The first process is supposed to be fully automated (well, they do say bots, as in robots), so it doesn't really take the content into account the way human readers would, LSI or no LSI. That would be why it sometimes goes crazy.
The second pass is supposed to be more human-oriented and more thorough: the indexed pages move up or down the SERPs depending on how good they are. This could explain what we call the google sandbox, or the slap, or the dance.
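Nobody outside google knows what the real ranking algorithm looks like, but the automated first stage is widely believed to be built on something like PageRank, which scores pages purely from link structure without reading the content the way a human would. Just to show the flavor of that, here is a toy power-iteration sketch (the `pagerank` function, the damping value, and the tiny example web are all my own illustration, not anything from google):

```python
# Toy illustration only: a simplified PageRank-style power iteration.
# Google's real pipeline is secret; this just shows how a bot can rank
# pages from links alone, without understanding the content.

def pagerank(links, damping=0.85, iters=50):
    """links maps each page to the list of pages it links to."""
    pages = set(links)
    for targets in links.values():
        pages.update(targets)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal rank everywhere
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}
        for page, targets in links.items():
            if targets:
                # each page shares its rank equally among the pages it links to
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new[t] += share
            else:
                # dangling page with no outlinks: spread its rank evenly
                for t in pages:
                    new[t] += damping * rank[page] / n
        rank = new
    return rank

web = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(web)
# "c" comes out on top: it is linked from both "a" and "b"
```

The point of the sketch is just that the score is mechanical: a page linked by two others beats a page linked by one, regardless of what any of them actually say. That fits the "bots don't read like humans" part of the story above.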
I don't have details, though. It's all supposed to be top secret, and my source won't tell me more, "on pain of death".