Google Big Daddy = Belly Flop


WOW... what can I say? Has Google completely messed up, or is this some crazy plan?

Why, when I search for , do I get "Sorry, no information is available for the URL," but when I do I get pages?

Why did a site that had millions of pages drop to hundreds of thousands of pages?

Why are MFA sites so powerful in Google now?

Why, when I search for a domain name, do I get another domain name in the SERPs? (I thought Big Daddy fixed the hijacking issues... guess not.)

When will it all be fixed?



Dave, if you don't know,

I sure don't!

Google is busted

Other parts of Google, such as exact-phrase search, just don't work right anymore.

Been busted for weeks, and they don't care.

FWIW, they over-report pages per site anyway, which is probably why all their servers are "full": my site has about 40K pages; Yahoo gets it right, but Google says 260K or some crap.


When I do a search for, the SERP shows me the exact site name and description as it appears in DMOZ. Dave - are you searching for domains that aren't in DMOZ?

Google broke ;(

Well, they do say that incompetence rises to the top, and Google is living proof of that. Like someone just mentioned in another thread: "the worse it gets, the more money Google makes off AdWords."

Hopefully someday people will move on from the "cult of Google" and stop buying into all their harvesting crap.

And yet, Cutts and GG simply

And yet, Cutts and GG simply don't see it. All they see are BD crawling priorities, when clearly crawling has little or nothing to do with the issues. And no matter how many times people point this out to them, the messages apparently become invisible.

The worse they are on the

The worse they are on the left, the better they are on the right.


Every time there has been an update, dance, or reshuffle, all of my sites have been unaffected or have improved slightly.

This is the first time (sandbox notwithstanding) I have ever had problems in Google.
I thought the lost-pages issue was simply them rebuilding the index after BD, but I now think there is something very seriously amiss.

They enjoy the money and relish the virtual monopoly, but with that comes responsibility; they need to sort it, and sort it now!

It is a belly flop... tons

It is a belly flop... tons of pages now in Supplemental... sites missing... blah...


said it is just webmasters who can't count.

The results count has been a

The results count has been a "guesstimate" for the past few years, but granted, it's getting increasingly unreliable, especially with the weird fluctuations in the number of "supplementals".

I find it very interesting, and a bit disturbing, that the exact search seems to be disabled. I hope it's only temporary.

That, and the current malfunction of the "site:" search. Plus, of course, cached copies of pages as old as 2004 appearing (now only text from those caches).

nice to hear

Nice to hear the bigger boys are seeing the same things re: no results for domain. However, in such cases G is still serving up and ranking individual pages.

As for the broken site: command etc., it broke right around the time they promoted it as "insider info" to Sitemaps users, suggesting they could discover more about how Google has crawled, indexed, and interpreted their site. LOL

The guesstimate excuse...

... has always bewildered me. How hard can it be to keep track of the sites you've spidered? I refuse to believe that the farm of parallel machines can't handle count++!
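One plausible explanation for the "guesstimate" is not that count++ is hard, but early termination: a sharded engine may stop scanning each shard's postings once it has enough top results and extrapolate the total from the sample. Here is a purely illustrative sketch (this is NOT Google's actual code; the shard layout, scan fraction, and extrapolation rule are all assumptions) of how that can over-report:

```python
# Illustrative sketch (NOT Google's real code): why a sharded search
# engine might report an estimated hit count instead of an exact count.
# Assumption: each shard stops scanning its postings list early (to save
# time) and extrapolates its total from the fraction it actually scanned.

def estimated_hits(shards, scan_fraction=0.1):
    """Sum per-shard extrapolations from a partial scan of each shard."""
    total = 0
    for postings in shards:  # 1 = matching doc, 0 = non-match
        sample_size = max(1, int(len(postings) * scan_fraction))
        sample = postings[:sample_size]
        hit_rate = sum(sample) / len(sample)
        total += int(hit_rate * len(postings))  # scale sample to full list
    return total

# If matches cluster at the front of each shard (as they might when
# postings are ordered by some quality score), the sample looks far
# denser than the full list, and the estimate balloons.
shards = [
    [1] * 300 + [0] * 700,   # 300 true matches
    [1] * 100 + [0] * 900,   # 100 true matches
]
true_total = sum(sum(s) for s in shards)
print(true_total, estimated_hits(shards))   # 400 true vs 2000 estimated
```

Under these (assumed) conditions, a 5x over-report falls out naturally, which would square with a 40K-page site showing up as 260K.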

Good to hear

I haven't complained because, for some reason, I've had a lot of success with this Big Daddy update. I'm showing up for things I really have no business ranking for. I think Big Daddy has some major technical issues, as well as the typical baby-with-the-bathwater filtering on Google.

# of results

On some searches I've seen 'results 1-10 of about 40,000,000' on the first page, then gone to page 2 and it's 'results 11-20 of about 150,000,000'.
Hmmm, only a hundred-million-odd page discrepancy there.
Not cool.
And MSN is looking goooooood. Except in terms of market share of course. Hopefully that will change.


MSN walks away with it in my field.
Been saying so for months.


I think the guesstimate thing is officially about saving computing resources, and for speed. But then, what kind of Google user is harmed by this information being inaccurate? ;-)

(Actually, I think some linguistics researchers have used that kind of data for scientific research, so it's not just SEOs.)

Dazzlindonna, I wrote a

Dazzlindonna, I wrote a rather long reply to your comment. Here's what I said:

Dazzlindonna, believe me, lots of different people at Google have been reading that feedback. The Sitemaps team was reading it to make sure that Sitemaps has nothing to do with the issue. The crawl/index team checked into several reports and each time came up with other reasons why the site wouldn’t be crawled as much (e.g. the ‘next page’ url on one site wasn’t short; it was a total hairball with like 200 chars of params), and some supplemental results folks have been through the raw emails, which is how one of the site: changes was noticed. So far, about half of the feedback to the email isn’t about pages dropped. Of the other half, one factor is that several sites have spam penalties. Of the remaining feedback, the two site: changes were the only two that we noticed. We’re going to keep digging in, but people need to bear in mind that Bigdaddy does have different crawl priorities, so a site that had more pages indexed by the earlier Googlebot won’t necessarily have as many pages indexed in the future. But don’t get me wrong; we’re still going through the feedback to see if there’s anything else to be identified and improved.

I just spent another hour going through feedback, and a fair fraction of that was sites in co-op link exchanges not being crawled as much.

IncrediBILL, the examples that you mention at
work fine now; unusual phrase searches in supplemental results were done as unphrased searches a while back, but I believe that's all fine now.

low quality = low crawling

>I just spent another hour going through feedback, and a fair fraction of that was sites in co-op link exchanges not being crawled as much.

Would it be a reasonable conclusion to say sites that are deemed "low quality" can expect "light crawling"?

Others are broken

Matt, those are fixed but I sent you one in PM, it's still not quite right.

no turning back?

Thanks for the feedback, Matt. I wonder if you all are facing a point of no return now, when it comes to "spamming".

Every SERP I check for a client reveals a top page that is demonstrably spamming (very clearly, by your definitions). Every one! So here's why that is "strange":

I met with a client to talk about optimization. They want to be #1 for "new car". They don't want to break the rules. They have money for consultants, and they heard from others how I do well by Google. I show up and see that they have done their homework.

They show me SERP after SERP where a competitor is ranked at the top, who wasn't 6 months ago. It's all relevant, as they, too, would be. Yes, they also note that some big irrelevant pages are no longer in the SERPs, and they think that's because Google has done a "great job cleaning up". Now, they are trying to get their shot.

The problem is, even a quick look at those "winners" reveals lots of 'search engine spam". Alt text stuffing, hidden text of all sorts, doorways, and extensive link-matrix-footers abound. Titles that go Titles that go Titles that go on and on- Titles. Those few that pass a view source check usually show signs of cloaking and are often already banned in Yahoo! (probably for their elite "cononicalization skillz"?)

Just what should the SEO say to the client? The evidence is clear... grey is the new black. So what happens next? Ratting out those above you is the new grey?

It sure seems like a dead end to me.... you're either a scammer or a rat fink or you go out of business, and all the SEO can do is look at the same SERPs and say "yup, sure seems that way".


these google results are the WORST i've seen in years.

in addition to all the SPAM in virtually every single search i do, there are even signs of PREVIOUSLY banned (spam) domains coming back to life in 100K+ ranges and ranking.

also seems like the 'get trusted links' advice really translates to 'get MORE links, because the more you have, the odds are better that some will be trusted' .... from what i've seen link spammers have been dominating as of late.

so let's be honest...
is this really about relevance, or is it about the adwords?

Not working for me

I've had one of my super clean sites have only the homepage listed. A couple of my other squeaky clean sites have disappeared from the top serps. And my one and only 'darker' site I run for giggles is crawling up the serps on some competitive keywords - a site I never intended to rank in Google.

Makes a fella want to go blackhat.

One thing I can say about Big Daddy is that dropping oodles of 'low quality' pages in itself won't really hurt the SERPs. You only need the first two or three pages of good results to have great SERPs. Whether there's another 500K pages or 10 million pages behind those first three pages doesn't change the user experience. So dropping pages isn't going to hurt Google; the sites that rank in the first few pages are what count (i.e., it's the SERP quality, not the number of pages indexed).
