Google's War on Webmasters Continues, Tools Targeted

Story Text:

Classic SEO tool OptiLink has finally been directly targeted by Google in its ongoing, and futile, war on SEO. The email from OptiLink creator Leslie Rohde arrived this morning, but it was Aaron's post that made me open it...

From the email:

A few days ago, Google began to employ a "spyware detector" that will in some cases block OptiLink through the use of a cookie and a human visible "ransom note".

The use of Google from "normal" browsers is not affected -- it is only specialized programs such as OptiLink that are targeted by Google's change, with the result that OptiLink can be blocked from Google for two or more hours.

As Aaron points out in his post, you can't move for people saying that search engines are pretty much on a par with each other these days, and one wonders if Google have just got a bit bored with actual websites recently.

It would certainly seem that they're not content to filter out junk, improve their algorithm and work with those of us who work with them; instead they seem hell-bent on being nobheads about it.

Oh well, as Leslie said, with the linkage data they provide, the loss of the Google interface in OptiLink is a minor thing...


Google vs. Webmasters

Seems like Google doesn't want pesky professional webmasters messing up their spam-junk-adsense-utopia ;O)


Web Position has also been targeted to a point. When ranking reports are run, Google will block the IP from doing ANY searches without a password being entered manually. This is OK considering that Web Position allows for use of the API key!


It's actually funny to watch how fast Google react to "bombing" them with requests - and how much it changes over the time of day. And this is not new. I did some testing a few months ago on "throw away IPs" (open proxies). At some times of day (typically daytime in Europe) I could do about 1000 requests before being cut off (high-speed requests). In US daytime I could often not do more than 200. Based on my limited study I believe there is a server-load component to this, as the cut-off does not seem to be constant. At least, there seems to be some other variable than just the number of requests.
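For the curious, that kind of probing amounts to nothing more than pacing requests and watching for the block. Here's a minimal sketch; the URL, the request count and the one-second delay are made-up placeholder values for illustration, not Google's real thresholds:

```shell
#!/bin/sh
# Illustrative only: placeholder URL and limits, not real thresholds.
URL="http://www.google.com/search?q=example"
MAX=3

probe() {
    i=1
    while [ "$i" -le "$MAX" ]; do
        # A real probe would fetch here and stop once blocked, e.g.:
        #   wget -q -O /dev/null "$URL&start=$(( (i-1) * 10 ))" || break
        echo "request $i"
        sleep 1   # spacing requests out postpones hitting the cut-off
        i=$((i + 1))
    done
}

probe
```

Counting how many requests go through before the `break` fires, at different times of day, is all the "study" above really needs.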

do no evil, but don't help either

Google also returns a 403/Forbidden to wget if you are trying to access the cache. Not at high rates, but on the very first request.

The purpose of accessing the particular cached document with wget was to read about a particular root kit without actually opening the document or the cache in a browser, and so without risking being affected by any code on the page. All I wanted to do was download the document and open it in a text browser to read the claims made for the root kit, in order to help someone clean out a server. As there were only a handful of references to the root kit, it would have been helpful to be able to read the background information on it.

Perhaps no evil was done, but the 403/Forbidden restriction certainly was not helpful.


>>Google also returns a 403/Forbidden to wget if you are trying to access the cache

wget -U "UA STRING HERE" http://blah....
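Spelled out a bit (the UA string is just one example browser identifier, and the URL is a placeholder, not a real cache address): pass a browser-like User-Agent to get past the first-request 403, then read the saved file in a text browser so nothing on the page ever executes.

```shell
#!/bin/sh
# Placeholder values: swap in the real cache URL you're after.
UA="Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"
URL="http://www.google.com/search?q=cache:example.com/page.html"

# Build the fetch command; run it, then `lynx -dump cached.html`
# to read the page without executing anything on it.
CMD="wget -U \"$UA\" -O cached.html \"$URL\""
echo "$CMD"
```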

Rand's SEOmoz tools are also hit

Boy am I glad I only link to tools...

It's a shame though. Ok I'll stop now for fear of starting a whine fest :-)


That'll be interesting - he's a big fan of the "everyone should be friends" stuff...

Now maybe the point will hit home - If you optimize your site, in even the smallest way, Google hates you.

Simple as that.


>Now maybe the point will hit home - If you optimize your site, in even the smallest way, Google hates you.

I felt a bit naive when Google filtered my site out for a while. They are a business and have a job to do; they do what they feel they need to do.

I was just lazy.

> wget -U "UA STRING HERE" http://blah....

Yeah, I forgot the command-line switch and was too lazy to open the help file at the time.

I just thought it was somewhat paranoid on their part. They scrape every site in existence, but worry about a single wget request. I mean when I am serious about scraping, you cannot tell the difference between my bot and my mouse. And I don't use wget for those jobs :)
