Matt's Cloaking Fud


Matt posted (link) a googlified definition of IP delivery and cloaking. He said:

IP delivery: delivering results to users based on IP address.
Cloaking: showing different pages to users than to search engines.

IP delivery includes things like “users from Britain get sent to the co.uk, users from France get sent to the .fr”. This is fine–even Google does this.

It’s when you do something *special* or out-of-the-ordinary for Googlebot that you start to get in trouble, because that’s cloaking. In the example above, cloaking would be “if a user is from Googlelandia, they get sent to our Google-only optimized text pages.”

So IP delivery is fine, but don’t do anything special for Googlebot. Just treat it like a typical user visiting the site.
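Before getting into it, here is roughly what that distinction amounts to in server code -- a minimal sketch only, with invented IPs and example domains standing in for a real GeoIP lookup:

```python
# Minimal sketch of Matt's distinction. Everything here is illustrative:
# the "GeoIP lookup" is a hard-coded dict and the domains are examples.

def country_for_ip(ip):
    """Stand-in for a real GeoIP database lookup."""
    fake_geoip = {"203.0.113.5": "GB", "198.51.100.7": "AU"}
    return fake_geoip.get(ip, "US")

def ip_delivery(ip):
    """Route every visitor, bots included, by location -- the 'fine' case."""
    targets = {"GB": "https://www.example.co.uk/",
               "AU": "https://www.example.com.au/"}
    return targets.get(country_for_ip(ip), "https://www.example.com/")

def cloaking(ip, user_agent):
    """Special-case the crawler -- the 'Googlelandia' case Matt calls cloaking."""
    if "Googlebot" in user_agent:
        return "https://www.example.com/google-optimized-text-page"
    return ip_delivery(ip)
```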

Come on Matt, that definition of IP delivery vs. cloaking is terribly incomplete and just Google crud. There is no difference; technically they are the same damn thing, and there is no way for Google to know what's legit. I mean, let's take your example of the .com versus the .co.uk.

1. If I want to serve different content at the same URI for different users based on their locale, that's the same thing (see the sketch after this list). I am improving the user experience, but in Google's eyes I'd be seen as being manipulative because their bot can't get the proper content. Your suggestion that I use a redirect means that I'd be developing my site for Googlebot.

2. What do I do when Googlebot comes in that scenario? How do I know which bot is for the .co.uk index and which is for the .com index? Hrm... again, I don't, but no matter what I do I am telling Google this page is optimized for the US or the UK. Did I do that for SERPs or for user experience, and how does the Google algo know?

3. What about when I optimize my content for a user based on referrer by showing extra or tweaked content? Say, for instance, I notice the Almighty GooG keeps sending users to the wrong page, or just plain sends users to a bunk page on my site for what they were looking for (it happens). So I have a script that helps that user by presenting CONTEXTUALLY RELEVANT INFORMATION, much like your site does. Obviously I didn't show that content to Googlebot, because it has no referrer, so when you check, how are you going to know whether I did that for user experience or to optimize my page for search?
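To make points 1 and 3 concrete, here is a rough sketch of the kind of script I mean -- hypothetical names only, nothing from a real codebase:

```python
# One URL, content varied by locale and by referrer. Everything is illustrative.

def page_body(client_country, referer=None, serp_query=None):
    # Point 1: same URI, localized content instead of a redirect.
    body = "Prices in GBP" if client_country == "GB" else "Prices in USD"

    # Point 3: visitor arrived from a search result that doesn't really match
    # this page, so add a contextual pointer to the page they actually wanted.
    if referer and "google." in referer and serp_query:
        body += "\nLooking for '%s'? Try /the-right-page" % serp_query

    # Googlebot crawls from a US IP and sends no referrer, so it never sees
    # either variation -- and that is exactly what gets read as "cloaking".
    return body

print(page_body("GB", referer="https://www.google.co.uk/search", serp_query="widgets"))
print(page_body("US"))  # what the bot would get
```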

The real Google message is: you can be good, but don't be too good, because that's too much like us.

Comments

Nothing in this world is

Nothing in this world is perfect, and what I think Matt is trying to emphasize is webmaster education.

Of course there are a few exceptions (and you pinpointed some above), but the basic principle stands: the same content to G and to every other user.

I have such...

... a tough time with this concept. I mean, Google is a service to surfers AND webmasters. Google makes me money. They don't have to, and I don't have a RIGHT to get Google traffic.

On the other hand, aren't they using my data to make money? Aren't I providing a service to them? When are my terms ever considered? Google is not the internet.

It is pretentious to say that IP delivery is fine as long as I don't do anything special for Googlebot. It will get the content I feed it -- and it will like it.

The real message is do whatever you want, as long as you are willing to deal with the consequences. (Or happen to be a large brand, because it would reflect badly on us if you didn't show up first in the results.)

I am torqued because....

- Insinuating I am trying to manipulate google by helping my users pisses me off.
- Saying that using a technique they use is unacceptable for me to use is bullshit.

On the other hand, aren't

On the other hand, aren't they using my data to make money? Aren't I providing a service to them? When are my terms ever considered?

this is the biggest issue IMO -- each new member of the google network adds value to google (more information they can use to profile anything and everything), but the new member may not receive any value from google (or at the very least, the value is designated on google's terms and google's terms alone). personally i think the solution will be an organization that can make search more of a commons-type good -- something that is collectively owned, because its value is derived from collective participation. until someone finds a way to properly align value added to a search engine with profits extracted from a search business model, i think the problems with search are only going to grow.

why agonize?

That's what I just don't get... why all the pain over this stuff? Maybe you need a few more counseling sessions to get past the emotional baggage...the same baggage that is being exploited by webmaster fud, imho.

Repeat after me:

I (state-your-name) am a webmaster. I have a website. It is my site, and I design it the way I want. I think it's cool. My users like it. Life is good. Some people don't like it. That's ok, because it's not for everybody.

If you also make money from your websites, and you want search engine referral traffic from Google, take off your webmaster hat and put on your business hat. Ready? Repeat after me:

I (state-your-name) am a businessman. I have a website, advertisers, and mouths to feed. Google the impersonal non-human software program doesn't know how to serve my site to interested searchers. Google the corporation of employees is too busy or too preoccupied to care about my specific needs, or may have competing interests. How can I help google the machine serve my content to the people who would like to see it, without asking google employees to expend personal effort on my behalf?

Be creative. You can do it. I know you can. Serve your users, satisfy your advertisers, and feed your dependents. Fud? What fud?

Matt is usually a very smart

Matt is usually a very smart guy, but in this case he is playing a political game rather than a technically correct one. It's a hopeless fight, Matt; don't try to define the technical difference between IP delivery and cloaking, because there is none. Instead, just focus on the fact that you, and any engine, can decide whatever you think is spam and delete it if you don't like it. That simple. Don't use silly arguments for your editorial policies that make no sense when you have other perfectly good ones that work much better :)

John, I find it offensive

John, I find it offensive that a google rep misrepresents the truth via an oversimplification, much like your summary of how I feel about it. Also I resent their attempt to apply double standards to the web. That does not mean I allow them to make my business decisions.

okay

Quote:
John, I find it offensive that a google rep misrepresents the truth via an oversimplification

Understood, and appreciated. However, it may be that he is speaking not to you, but to a different audience. When a mechanic says the problem is in your engine, he's misrepresenting the truth through an oversimplification (the problem may have been with the ignition system, attached to the engine). No harm intended... or was there? It has to be considered in context.

Matt says:

Quote:
IP delivery: delivering results to users based on IP address.
Cloaking: showing different pages to users than to search engines.

Maybe here Matt is simplifying the labels that newbies (clients and practitioners) may find in the field: "cloaking" and "IP delivery" (no dash ;-)

If you look closely, a word is missing from Mr. Cutts' statement. I personally wonder how long he agonized over the edit, trying to find a replacement, before accepting that he had to leave the adjective out and thus leave his statement bereft of full meaning. That word is "different". It should be there... like this:

Quote:
IP delivery: delivering different results to users based on IP address.
Cloaking: showing different pages to users than to search engines.

Leaving it where it naturally belongs would invite a challenge to clarify... which he cannot address. So he takes the opportunity to define the terms and impose his spin. Why not? In my view, Matt's whole mission is PR to the webmasters and online community, not the skilled SEOs. Tomorrow's SEOs, if you will. Google has a problem, and it's the growth of SEO; they need to stop it or control it, and..... heeeeeeere's Matt!

Go in there, Mr. Cutts, and start labeling things (we need to label them before we can talk about them), start impression management (Google is trying real hard to be helpful, seo is SPAM, seos come in TWO KINDS and some are BLACK which is EVIL, etc.) and, most importantly, become a defining voice of how newbies to SEO (whether clients or tactical practitioners) define the field. What percentage of the Google user base will blame SEOs for crappy SERPs these days? And what was the percentage 2 years ago? And how does that compare with the truth?

Individuals will be managed separately... haven't we seen most of the high-profile SEOs visit the plex at one time or another in the past few years? By invitation?

Of course the counter to Matt Cutts would be a leader of the SEO/SEM community who knows how to write well and hold an audience, and is able to address each of Mr. Cutts' assertions with solid, understandable, clarifying language that is easy and interesting to read, follow and reference. I bet Mr. Cutts (or the machine behind him, if you will) manages that risk by assuming that profit-minded, competitive SEOs will remain fragmented and no single strong voice would emerge with sufficient force to override Mr. Cutts' PR machine... at least not with as broad an audience (due to the level of expertise required to follow a technical argument). Personally, I don't think it needs to be a technical discussion. A blogger-level approach would be much more effective on the masses. Tin foil hat conspiracy theorists need not apply... they are self-defining as "uncredible".

Each time Mr. Cutts introduces a concept piece, he also introduces an opportunity for someone to show how he is running an agenda. In this case, he appears to have avoided the word "different" above, possibly because it is impossible to separate cloaking from IP delivery in the context of dynamic content. But... opportunity missed to reach the Technorati 100 audience with that message, due to a lack of reward IMHO. As I think I recall reading Mikkel say (forgive me if I get this wrong) "I could tell you, but why would I?"

(Hi Matt :-)

..

I have always thought it should be "Cloaking: showing different pages to users than to search engines for the purposes of deception."

I honestly still don't

I honestly still don't understand why the engines spend so much time on "hunting" SEOs - do you see newspapers doing the same to (all the spammy) PR experts? Who is the biggest spammer - a political spin doctor or a geeky SEO? Search engines need to adapt to the real world, analyze it and present it to users in the format they like - it is NOT the job of the engines to change the world and make webmasters do what they want. It is completely upside down to me. Animal Farm.

Not so much FUD

As it is Matt playing Elmer Fudd...

"Be vewwy vewwy qwiet, we're hunting cwoakers...."

different content at the same URL

Webprofessor wrote

Quote:
If I want to serve different content at the same URI for different users based on their locale"

(my bold). By Matt's definition that's cloaking.
But if you detect a user's IP address and redirect them to another page, Matt's view is that it is (acceptable) IP delivery, e.g. an Australian types in www.google.com and gets redirected to www.google.com.au.

So isn't the difference (in Matt's definition) between cloaking and IP delivery whether you redirect to a new URL, or deliver different content at the same URL?

That's my take. FWIW. It's much clearer than the published guidelines.....
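If that reading is right, the whole distinction boils down to something like this hypothetical sketch (domains and helpers invented for illustration):

```python
# Hypothetical sketch of the redirect-vs-same-URL distinction. Domains invented.

def render(page, locale):
    """Stand-in for a real template engine."""
    return "%s page, localised for %s" % (page, locale)

def redirect_style(country):
    """Send the Australian to a different URL -- blessed as 'IP delivery'."""
    if country == "AU":
        return (302, "https://www.example.com.au/")
    return (200, render("home", "US"))

def same_url_style(country):
    """Serve localised content at one URL -- 'cloaking' by the literal
    definition, even though the intent is exactly the same."""
    return (200, render("home", country))
```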

IMHO, Matt's take is just

IMHO, Matt's take is just that the bots should see what the human users see. For most websites that isn't even a problem, but, say, if you use

- IP delivery or
- HTTP_ACCEPT_LANGUAGE or
- cookies or
- referrer based content or
- cross-site passing of parameters (eg aff. pages)

...or something else, then it's not always certain that the bot will see the same things as a user (or at least it may be treated as a US user). So there are more than a few possibilities for false positives.
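A small sketch of how each of those items can trip a bot-vs-user comparison without any intent to cloak (all values invented for illustration):

```python
# Why each signal in the list above can produce a "bot saw different content"
# false positive. All request values are invented.

def choose_variant(req):
    variant = []
    if req.get("country", "US") != "US":
        variant.append("non-US pricing")              # IP delivery
    if not req.get("accept_language", "en-US").startswith("en"):
        variant.append("translated copy")             # HTTP_ACCEPT_LANGUAGE
    if "returning_visitor" in req.get("cookies", set()):
        variant.append("welcome-back box")            # cookies
    if "google." in req.get("referer", ""):
        variant.append("contextual search help")      # referrer-based content
    if req.get("aff"):
        variant.append("affiliate banner")            # cross-site parameters
    return ", ".join(variant) or "default page"

# The crawler comes from a US IP with no referrer, cookies or parameters,
# so it always gets the default; almost any real human gets something else.
bot = {"country": "US", "accept_language": "en-US"}
human = {"country": "DE", "accept_language": "de-DE",
         "cookies": {"returning_visitor"},
         "referer": "https://www.google.de/search", "aff": "1234"}
print(choose_variant(bot))    # -> default page
print(choose_variant(human))  # -> non-US pricing, translated copy, ...
```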

I agree 100% with webprofessor and Mikkel that using technical arguments isn't the right thing to do, as the technology does not imply intent, and IP delivery is, after all, a cloaking technique, just as it can be used without cloaking.

Same thing with hidden text, really. In the current HTML world it is quite common to have hidden layers of information - it works just like preloaded pages, so when you click a link the information appears instantly and you don't have to wait for a page load. That stuff has nothing to do with SEs at all.

But then, it's really hard for Google to tell when this is an attempt to "cheat them" and when it's not, isn't it?

Intention

It seems to me that the only difference between the two definitions as used by Google is intention. What does the site owner intend? So, according to Matt, Google is able to algorithmically deduce whether you are using IP delivery for geolocation or for SEO purposes? That's sophisticated stuff, if true.

I swear, I am so sick of

I swear, I am so sick of cloaking coming up again and again. I've heard people debating what this is and isn't and if it is ok since 2000.

Quote:
- Insinuating I am trying to manipulate google by helping my users pisses me off.
- Saying that using a technique they use is unacceptable for me to use is bullshit.

I think that is the most intelligent thing I've heard about this topic in a long time.

Matt is manipulating newbies. Enough said.

Is it still worth living ...

I have to agree with grnidone..

At the end of the day, no matter how many ways we (I emphasize "we" as opposed to Google) define cloaking or classify good/bad IP redirecting, it results in zilch.

Just the other day I came across one of the highest-ranking accommodation websites in APAC for major terms participating in blatant cloaking, and it would seem they have been doing it for an eternity without consequences …

I want to live in an idealistic SEO world where all the bunny rabbits are white and playing happily together with big brother (a.k.a. Googlebot) looking over them… just a dream…

Under Matt's definition does

Under Matt's definition, does manipulating links on a page and meta information count? I mean, a popular webmaster website has been doing that for ages.

Don't Let The Pursuit

of perfection impede progress. It wasn't that long ago that the 'unofficial position' of G's favorite spokesman was a bit different. In a long and lengthy thread in which IP delivery was being discussed, I believe Matt said, and I'm paraphrasing, 'sure it works, until we find you'.

So at least now some 'cloaking' is considered fine, albeit Matt's definition is suspect. Truth be told, I haven't seen a site in ages that needed to cloak to rank well. However, many sites can use IP delivery to make the site more useful or provide local, targeted advertising.

As if they could

actually and properly "see" and discern what human visitors see - what a load of bull and time-worn SE spin again.

I mean, it's not as if search engine spiders were any good at Java content, or Flash, or Shockwave, or even animated GIFs, for that matter. Do they still get the hiccups when confronted with session IDs? Like hell. Are they still terminally challenged by most of the more complex content management systems' dynamic links? Sure they are, just like they were in 2000. Or in 1998. Some progress.

Plus - do they actually crawl every page now before featuring it on their SERPs? If so, how come you still get tons of 404 pages ranked on positions 1 thru 10? I don't recall ever hitting a site that was cloaked to generate 404 errors ... "Seeing what the user sees", eh?

Also, it's a totally linear and delusional view of things, way off reality. What if you rotate ads on your site (as Google's AdSense will, too)? Or news items? What if a second-hand bookshop shifts or rotates selections from their inventory every time a visitor hits their page? What if you have alternate pages (e.g. different sales letter versions) for the same product?

Technically, in this scenario every page view is 100% unique - so how would Googlebot notice the difference? Hell, it wouldn't even be cloaking/IP delivery or blatant "black hat", but it could be misinterpreted as such nevertheless ...
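A tiny sketch of that rotation point, with made-up inventory, just to show that two fetches of the same URL can legitimately differ:

```python
# Two fetches of the same URL differ by design, not deceit -- all data invented.
import random

ADS = ["ad: blue widgets", "ad: red widgets", "ad: green widgets"]
STOCK = ["1903 atlas", "signed first edition", "odd volume of Gibbon"]

def homepage():
    """Every request gets a rotated ad and a rotated pick from the inventory."""
    return "%s | featured today: %s" % (random.choice(ADS), random.choice(STOCK))

print(homepage())
print(homepage())  # usually different from the first call
```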

Fully agree with John - it's all about agenda-driven language control to dominate the discourse. Or, less kindly put, PROPAGANDA, pure and simple. (And when did any Big Brother outfit in history ever own up to "doing evil"?)

So: if they really want webmasters to stop trying to work around whatever they may be imagining as the binding parameters of an ideal search engine world, let them pay them - the original content providers, after all - and not vice versa: making a pile a minute from other people's labor without giving anything in return but self-righteous commands, regulations and dictates.

I love T shirt worthy

I love T shirt worthy quotes.

That last paragraph might have had a few too many words to make a clean print, but it is a T-shirt-worthy passage nonetheless.

Good to see ya again Fantomaster.

That'll be a big T shirt

..but I believe it'll fit me fine.

Think it'll fit

on GoogleMaps, too? :-)

And thanks, Aaron, for the thumbs up!
