Google Loves Old Domains


Patrick has posted about some observations on increased trust in older domains over at the LinkBuildingBlog.

It seems Google has intensified the ranking factor of one key data point that is very tough to manipulate: the age of your domain name. I have been keeping an eye on trends with Google's search results and it seems that there is an increasing trend towards "trusted sites" owning page one. A "trusted site" is loosely defined by me as: an older domain, a nice mix of anchor text, links built over time, links coming in from all kinds of c class blocks, maybe a .edu snuck in there, etc.

There's a fair bit more to the post, including the theory that domains older than 2002 are tops for Google.



I have so noticed the same.

I took a client for the term "email hosting" from nowhere to second spot in 6 months' time, with not that much effort and a really small budget.

Website was from 1998.

I agree...

I see it more every day. New sites are a struggle.

Have a client who has a small 5-pager, but went from nowhere to no. 6 in a few weeks' time.

party like it's 1999

This was already a trend of course but Jagger cranked it up another notch.

I stand by my quote that older sites can get away with murder (tons of spammy recips, coop, what have you).

All the SERPs I watch have the same thing - domains prior to 2002 get inordinate amounts of extra boost, simply for being old.

Hasn't this been the rule for years now? Old domains have *always* done better.


The sandbox/adwords filter just seems kind of silly to me. It's amazing to see a site from 1998 with 50% keyword density, 90% links with the same anchor text, etc ranking high while good sites made in the last year are deemed "spam". Google is great if you don't care to find information on anything new in the past year or two.

So I guess the real question is: When a domain changes hands, is the clock re-set?

>party like it's 1999

'96


I think it was the patent that said something about losing some trust when a domain changes owners.

But the whois and Google historical data will be kept, so I guess that won't be disregarded.

Has anyone seen a surge in rankings after adding 10 years of registration to a domain?

I don't think it's just as simple as the whois changing. Even the PhDs at the 'plex had to plan for something as mundane as a company moving to a new office.

That said radically changing all of the whois, doubling the amount of pages, and changing the "theme" of a site isn't something I would advocate.

Mmmmm. My site from 1999 is the only one to be knocked back by Google in the past few weeks, and nothing has changed design-wise on it for over a year (it has an updated front page most days, too).

Jim also likes 'em old

But then, I'm not so sure he's talking about Web sites...


I understand the concept of giving old sites an edge, and actually agree in some degree. However, it is just far too overdone in Google to be helpful anymore. You can't find the official website for any new movie, artist, politician, celebrity, etc for over a year. That is just plain stupid if you ask me. Heck, the Bush-Clinton Katrina Fund website wasn't ranking in the top 1000 for the first month it was live.

My thoughts are still that it has to have something to do with Adwords. No other search engine has implemented, or even played around with the implementation of a system like this. Although old sites have their advantages, they also have some very strong disadvantages. Typically their designs are not as up-to-speed as new sites and their content is stale. If I search for information on Lyme Disease, I'd prefer a site that has new medical information, not one created 7 years ago.

What I'm saying is that age should play a very minor role in rankings.

It's too early for that kind of analysis

People need to wait until this update is finished before trying to figure out what has changed.

not too early at all

since this trend began several months ago and has been slowly, inexorably strengthening.


Certainly there is a trend that older domains have it easier for ranking purposes, and I've illustrated this before as well.

Personal observation, though, is that age of domain is not a Jagger1 factor so far - though other "authority" trust factors are certainly in the running.

Interesting that GoogleGuy in the WMW thread mentioned Jagger2 and Jagger3 having to move in a very particular manner:

Jagger1 was the sort of thing that we could launch at all data centers quickly, but Jagger2 and Jagger3 require switching over each data center individually.

>began several months ago

As I recall, it was first noticed as a significant factor a couple of years ago when some sites appeared to be bullet-proof during the Florida update and then those following it. The patent certainly seemed to confirm it, but that was also some months ago now. Whether the weight has been cranked up vs. other factors, that's the fun part.

Heh, all our old sites are king and the relatively new competition has been washed away. Thanks, G my man. Can just imagine them now, desolate, in a park with a bottle of booze in a brown paper bag, muttering continuously "jagger, jagger, jagger".

But on the serious side of things, a lot of people have been hurt in travel because of this age thing. I think that was the point: to stop people building dross. If you are not in the serps now, you are going to have to wait for the domain to mature or get some nice links from some nice sites.

They'll keep tightening...

...until we break it;) Then they'll study how we broke it and make it harder.

G is a great example of web 2.0 - all of the Public relations, marketing, Beta-testing, and soon mountains of content is USER GENERATED - what an awesome business model...have everybody else do the work while we play roller hockey and eat deadhead snacks:)

Echoing RC and Andy... this is definitely not a new thing... and they've gone quite overboard imho. Our newest tool shows wayback age (which I love)... I'm hard pressed to find a post-2003 site in the top 10 for many two-word phrases. At best, generally one or two in the '03 - '04 range.

Let the old site buying (with accompanying subtle changes) spree begin! :)


>>If you are not in the serps now you are going to have to wait for the domain to mature or get some nice links from some nice sites.

yeah but the sad thing is that evil nasty spammers will simply use aged domains to put their sites on. The nice innocent types who set up a new site and work hard to put good content on it won't be able to get links from nice sites, because no one will give a PR0 site a link any more. And they'll go out of business before they've aged for three years.

Or (and here's a novelty) they'll learn to live without Google :)


that's like life without red wine. hardly worth living!

The wayback tool

stuntdubl: The wayback tool was just great. Thanks for sharing!


Listen, we've all been asked how we'd fingerprint a bad site. Several of the data fields underlying a 2nd-level tld registration ARE signs of likely authority or lack of it. We all know it's there and I see no reason that anyone building an algo shouldn't know it. The patent just provided a punchlist to see how many we had guessed right, imho. Collateral damage? Yeah, sure, but the SEs are interested in the overall numbers. Unfair? Where were you three-four-five years ago? The long-haul sites are the antithesis of churn & burn and I suspect that the SEs see that as a good thing. This strategy fits hand-in-glove with the sandbox if you ask me.

Google's mission is to organize all the world's information and make it universally accessible and useful.
<fine print> as long as it's old</fine print>

Eric Schmidt says you should all be out of the sandbox in 300 years or so.

The problem is that the spammers are better funded than the legit newb sites. Buying old sites is not a problem at all.

On the non-spammy side of old site buying, I can think of a dozen Bay Area lead-gen firms and VC's that have been acquiring "underdeveloped prominent search engine real estate" since early 2004.

The net effect is that Google has made it harder to spam immediately, at the cost of a fresh set of results. In doing so, they've consolidated at least the business web back into the hands of capitalized media buyers.

I'm not passing judgement on that. It affects myself and probably most of the crowd here very little. But it's hurting a lot of people with some great ideas or content and no access to the means of distributing it. The Internet is all growns up, I guess.

Don't You Get It

Every site made in 2005 is spam in Google's eye.

My bad, Phenom! I guess I'm just bitter that my site is getting shafted in the SERPs so that some 1996 Tripod page can display its kickass under construction .jpg.

Multiple factors/patterns used to determine age

Personally I think “registered” age of a site is only one part of what is being factored into these old domains ranking so high. Instead, the engineers have figured out certain criteria to determine from multiple factors whether a site is an old authority. This would make it more difficult for people to manipulate. Let’s compare this to real life. Do you show an elderly person more respect because they say they are 75? Or do you show them more respect because they look and act 75? I suspect it is the latter.

I was doing some research a few months back about this topic and was focused on older sites ranking high and their percentage of deep links. Unfortunately, I had other personal projects come up before I could put together some solid data.

Here are some of the factors I think Google may use to determine if a site is an old authority.

- Very high number of deep links

- High number of deep links to individual pages

- Multiple links from the same site. Think about a good college research paper. Is a research project cited once in the paper? Most likely a research project will be cited multiple times, and different sections of the project will be referenced each time. Remember, algorithms are created by individuals who commonly have PhDs. Research papers are second nature to them. It only makes sense they will factor in what is familiar to them.

- Links from other authority sites developed over time. Meaning these sites tend to mention (link to) the site every few months or years. Similar to how you recommend your mechanic a few times a year to different people. If you only recommend them once, they are obviously not that great.

- Site architecture and pages are not optimized perfectly. Some pages have not been touched in years.

- Has links from .edu, .gov and .mil. Deep links and multiple links are probably given more weight.

I also suspect the old sites ranking high link out in a manner similar to that described above.

These factors could slowly be introduced into the algorithm one by one to be tested and tweaked. This could happen over the course of months without anyone being able to narrow it down to a specific factor. This would thereby make the system harder to decipher and manipulate.

Now the question is: what can you do to make your site appear more like that “75”-year-old lady ;)

Just my thoughts...
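To make the speculation above concrete, here is a toy sketch of that kind of composite "trust" scoring. Every field name, weight, and cap below is invented purely for illustration; this is a guess at the general shape of such a heuristic, not anything Google has confirmed.

```python
# Toy model of the speculated "aged authority" trust score.
# All weights and thresholds are made up for illustration only.

def trust_score(site):
    """Combine several hypothetical signals into one trust number."""
    score = 0.0
    # Older domains get a head start, capped so age alone can't dominate.
    score += min(site["age_years"], 10) * 1.0
    # A high share of deep links (links past the home page) suggests cited content.
    score += min(site["deep_link_ratio"], 1.0) * 5.0
    # Repeat citations from the same domains, accumulated over time.
    score += min(site["repeat_citing_domains"], 20) * 0.25
    # Links from .edu/.gov/.mil treated as stronger endorsements.
    score += site["edu_gov_links"] * 0.5
    return score

old_site = {"age_years": 8, "deep_link_ratio": 0.7,
            "repeat_citing_domains": 12, "edu_gov_links": 4}
new_site = {"age_years": 0.5, "deep_link_ratio": 0.2,
            "repeat_citing_domains": 1, "edu_gov_links": 0}

print(trust_score(old_site) > trust_score(new_site))  # the aged site wins
```

The point of the sketch is the thread's argument in miniature: with several capped signals blended together, a new site can't close the gap by maxing out any single factor, which is exactly why such a scheme would be hard to manipulate.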


Now you're talking! I think I still have an old GeoCities page from 1998 with the mailbox and flying e-mail logo circling around it (I know some of you remember that logo). Some online casino keywords and some links and I'm in business!

I don't want to knock it too much, I just find the whole age thing silly. If you take a step back and say "In Google's eyes, almost every site made in 2005 is not relevant", you have to laugh. I just never saw the reasoning behind a new movie's official site not being able to rank while some emo forum with 16-year-olds typing l33t and pwn3d can outrank it because it's been online since 1998.

As others have said, it doesn't matter to me. I haven't bought many new domains in the past year and most of us have found similar solutions to handle the problem. Those who depend on Google traffic will adjust and have, those who don't won't. In the end, SEOs will have this filter gamed and sandbox will turn into a sweet revenue boost for Adwords on new website owners.

Seriously, everyone focuses on age, but I'm really starting to feel it's more of a quality test. Intentionally or not, age turned out to be a factor in the quality test.

>step back and say "In Google's eyes, almost every site made in 2005 is not relevant"

Agree, except when you carry that logic forward it doesn't sound so bad... In Google's eyes, in 2007 almost every site made in 2005 (that is still around and has insert-xyz-factors-here) is relevant.

Okay, my mistake

I only scanned Patrick's article on the first read and missed the fact he was speaking of the trend (which several of us have addressed over the past year or so). I guess I'm getting a bit jaded by all the October 2005 Google Update intraupdate conclusion discussions.

We don't tend to do update threads till after they've happened - but to be frank, the last one came through cos a) a mate posted it and did a good job imo, and b) I was flat out with another project and really needed something on the homepage heh..

There is an upside

Every site made in 2005 is spam in Google's eye.

The (one) good thing about this is that people will start to learn other marketing techniques while they're waiting. At least we can hope that is true.

No one should ever rely on free Google listings for their bread and butter, and yet that's what hundreds of thousands of sites have been able to do. If they learn to do real marketing first, and use their G listings as the gravy, they'll be better off in the long run. maybe G is just trying to help people again! (LOL...just like autolink!)


Even the PhDs at the 'plex had to plan for something ...

and we've seen the results in all of their beta's.

Nonetheless, I keep the renewals paid on my portfolio.

It's like waterfront real estate, they ain't making anymore of it.


I agree with Jill. A business based solely on traffic from SEs is not a sound business plan. Well, unless you are a spammer lol


The googlites ALWAYS look at a combination of factors. Domain age is just one of those factors.

As has been brought up in this thread, when you look at Domain age combined with other factors like deep links, site growth (or lack of it), linking structure and numerous other factors, the googlites are able to set a "Trust" factor to give to each site.

To be honest, I like goog using 'age' as a factor to help determine "Trust".

I have a bunch of sites (Domains) that we have put a lot of work into over the years, kind of a sweat equity.

I like the fact that someone that just decided to get into competition with me is going to have to work damn hard and a damn long time in order to be a serious threat to my bottom line.

Me and mine worked long and hard to become successful. Why should any of these 'upstarts' expect after a month or even a year that they should be automatically on the first page for 'their' keywords?

A business based solely on traffic from SEs is not a sound business plan.

pfffft - As someone that runs several sites that only do search engine marketing, I can safely say that you don't know what you are talking about.

Of course, the search engine marketing approach is not for everyone. But it is a sound business model. Then again, any business model that stays in the green year after year is a sound and viable model.


>A business based solely on traffic from SEs is not a sound business plan.

...I didn't ask Jill before I started this 8 years ago, hhh.




I'm talking about one's entire livelihood wrapped up in one website's free search engine rankings. Surely, neither lotso nor rcjordan are silly enough to do that?

Please don't tell me that you think that's a good business model.


>Surely, neither lotso nor rcjordan are silly enough to do that?

Unless lotso and rcjordan are the same person. Then it's a great idea.

>one's entire livelihood

Glad you clarified, Jill. I diversified beyond a single revenue source starting about 10 years ago. In fact, going to the web in the first place was part of my diversification plan. As a rule, I don't like any one source to be more than 25% of my income --that includes web ventures taken as a whole (though it has crept up a little more than that). I get uncomfortable when I start to feel dependent upon things I don't own outright.


As someone that runs several sites...

I never rely on a single source of income. ;-)

Unless lotso and rcjordan are the same person.

rc wishes we were the same person; I am much better looking... LOL... I should point out that I have never met rc in person, just seen some pics.

(note: It is lots0, with a zero :-)

Same page, guys...

Then I think we're all on the same page.

Unfortunately, there are many people out there (the ones crying at update time) who have one site in which they've depended on free Google listings to make their money.

Silly, silly, silly business model!


and I am better looking than RC as well.

The age bias is particularly noticeable for terms that didn't exist before 2005.

I should have elaborated more when I made the statement about a business model based off of search traffic not being sound. That was a mistake on my part. I was referring to people with a "single" site selling stuff that gets all its traffic from search engines, who use no other marketing techniques to acquire customers. You know, the ones that quit their jobs and then lose their rankings for whatever reason. Then they are left with little income.
