On 3rd July a member on Digital Point posted an error message he got when a query for a cached page from a Google datacenter (.104) went wrong. The following is Google's rarely seen error code:
pacemaker-alarm-delay-in-ms-overall-sum 2341989
pacemaker-alarm-delay-in-ms-total-count 7776761
cpu-utilization 1.28
cpu-speed 2800000000
timedout-queries_total 14227
num-docinfo_total 10680907
avg-latency-ms_total 3545152552
num-docinfo-disk_total 2200918
queries_total 1229799558
e_supplemental=150000 --pagerank_cutoff_decrease_per_round=100 --pagerank_cutoff_increase_per_round=500 --parents=12,13,14,15,16,17,18,19,20,21,22,23 --pass_country_to_leaves --phil_max_doc_activation=0.5 --port_base=32311 --production --rewrite_noncompositional_compounds --rpc_resolve_unreachable_servers --scale_prvec4_to_prvec --sections_to_retrieve=body+url+compactanchors --servlets=ascorer --supplemental_tier_section=body+url+compactanchors --threaded_logging --nouse_compressed_urls --use_domain_match --nouse_experimental_indyrank --use_experimental_spamscore --use_gwd --use_query_classifier --use_spamscore --using_borg
I am not for one minute suggesting that I know anything about this code, but two things stand out to me: "use_experimental_spamscore" and "use_spamscore --using_borg". So as well as having an algorithm that scores an individual page based on linkage data and on-page factors, Google also seem to have a new scoring algorithm for spam. Could this be their reaction to the recent spam issues, or just a refinement of how they filter out spam?
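To make the speculation concrete: if there really were a separate spam score alongside the normal page score, the simplest way it could work is as a cutoff applied before ranking. This is a purely hypothetical sketch — every name, number, and structure here is invented for illustration and has nothing to do with Google's actual code:

```python
# Hypothetical two-stage scheme: filter by a separate spam score,
# then rank survivors by a relevance score (links + on-page factors).
# All names and values are made up for illustration only.

def rank_pages(pages, spam_cutoff=0.5):
    """pages: list of dicts with 'url', 'relevance', 'spam_score'."""
    # Stage 1: drop anything whose spam score crosses the cutoff.
    kept = [p for p in pages if p["spam_score"] < spam_cutoff]
    # Stage 2: order the remainder by relevance, best first.
    return sorted(kept, key=lambda p: p["relevance"], reverse=True)

results = rank_pages([
    {"url": "a.example", "relevance": 0.9, "spam_score": 0.8},  # filtered out as spammy
    {"url": "b.example", "relevance": 0.4, "spam_score": 0.1},
    {"url": "c.example", "relevance": 0.7, "spam_score": 0.2},
])
print([p["url"] for p in results])  # ['c.example', 'b.example']
```

Note how a high-relevance page can still vanish entirely if its spam score trips the cutoff — which is exactly the behaviour you would want against link-spammed pages, and exactly the risk the commenters below worry about for legitimate sites.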
Aside from this, Google are now using Borg technology. How they managed to get in touch with the Borg collective is beyond me, but Yahoo and MSN are in serious trouble now. Having said that, maybe Google and the Borg did some sort of Capt. Janeway-style collaborative work? Maybe Google will announce a deal with Unimatrix One to place AdWords in Borg cubes? Sorry, couldn't resist. Crap, that's another bad pun.
--use_experimental_spamscore has to be relevant to all the recent chatter about how a site got 5 million inclusions in Google for spam. I mean, this stuff must be pretty easy to score and remove. Can't wait to see it go live.
Yeah, it could be something to do with that. DaveN said on WebmasterRadio that Google probably wouldn't have the loophole fixed for a while, so this experimental spam scoring feature has maybe been working in the background for a while.
For all we know it could be linked to the ‘sandboxing’ phenomenon where new sites take a while to rank?
Pingback: Use_experimental_spamscore
Pingback: Consequentia Mirabilis » Blog Archive » Googlism: Google Rank and Spam
Pingback: Websoft Blog » Google Error Reveals Rarely Seen Code!
That's really interesting! Great to get an insight into how the big boys do it.
Pingback: Basement Tapes » Examining Google’s Error Code
Pingback: Klemnov Kot » 4.7.2006 >> zanimivo na internetu
Pingback: Loud Opinions | Blog » Blog Archive » Google Error Reveals Rarely Seen Code!
Pingback: S-O-S SEO Blog
seems 2 me that this error code explains G’s June 27 F…up
I perfectly agree that removing spam is key … but please, Google guys out there … don't remove decent, original, content-rich sites!!!!
speculation
Pingback: Error Forum Blog » Blog Archive » Use_experimental_spamscore
Pingback: Aquí estamos: El futuro de los buscadores y la publicidad