Trust & Inheritance - Testing Hypotheticals and Google

Source Title:
Changes in Google Ranking Strategies
Story Text:

This is a follow-up to the previous thread about Michael Martinez's analysis of changes to Google's algorithm. It's particularly relevant in light of the new Google patent recently discussed...

I'm getting sort of rusty on my Google optimization, so I don't know that I've fully wrapped my mind around it, but it is interesting as heck.

Comments

Michael Martinez is a smart guy

IMO, this is a total must-read. His timeline fits a number of factors we noticed as well, and I think his analysis is right on.

 

Gave me a nosebleed, so I skipped most of the second page :(

What does it mean in practical terms?

 

...at the start of a document's life, its relevance is determined in part by the documents it links to. But as it accumulates inbound links, its relevance will be determined more and more by those inbound links (though never completely so -- and on-page factors will continue to be important). A disconnect between the document's outbound links and its inbound links, or a disconnect between the outbound links and the on-page content, could weaken the document's relevance score. A disconnect between inbound links and on-page content has always weakened relevance scores, but the practice of link bombing offsets that imbalance.

Yes yes yes. I agree, but I am not confident of how inbound links are weighted relative to outbound ones. I believe the weighting is modulated by niche, much like some other factors seem to be. Some pages never attract many inbound links, and their outbound links seem to continue to have a decent impact. Perhaps the ratio of inbound to outbound links is in play; a rough sketch of the idea follows below.
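
Here is a minimal Python sketch of the shifting weight the quoted passage describes. Every name, constant, and the saturation curve below is my own guess for illustration; the thread only says the weight moves from outbound toward inbound links as links accumulate, and never completely.

    def relevance_score(on_page, outbound_rel, inbound_rel, n_inbound, k=20.0):
        """Toy model: link relevance shifts from outbound to inbound signals.

        on_page      -- relevance from on-page factors (0..1)
        outbound_rel -- relevance implied by the page's outbound links (0..1)
        inbound_rel  -- relevance implied by accumulated inbound links (0..1)
        n_inbound    -- number of inbound links seen so far
        k            -- assumed half-saturation constant: the inbound weight
                       reaches 0.5 once k inbound links have accumulated
        """
        w = n_inbound / (n_inbound + k)        # grows toward 1 but never reaches it
        link_rel = (1 - w) * outbound_rel + w * inbound_rel
        return 0.5 * on_page + 0.5 * link_rel  # on-page factors stay in play

    # A brand-new page leans entirely on its outbound links...
    print(relevance_score(0.8, outbound_rel=0.7, inbound_rel=0.0, n_inbound=0))
    # ...while the same page with 200 inbound links leans mostly on those.
    print(relevance_score(0.8, outbound_rel=0.7, inbound_rel=0.9, n_inbound=200))

The in/out ratio idea would just be another input to the weight; the point is only that the blend shifts over a document's life, never flips entirely.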

Sorry about the complexity of the subject

I skimped on some of the details because each of them could fill an entire paper by itself.

Brad Enslen asked me for the bullet points and my first attempt at that was still pretty lengthy. Here is a shorter explanation.

The paper sets out to do several things:

1) Explain a methodology for determining the competitiveness of search expressions. This methodology is used to test some of the discussion examples.

2) Refine or correct some of the points I made in "On the Googleness of Being".

3) Compare what I have observed in Google's recent changes of behavior to some of the more interesting points of the multi-purpose patent application.

My analysis leads me to conclude that Google is developing (or has developed) a LINK OBJECT MODEL which they use to categorize data about a link's history. I believe they are analyzing document performance in search results, linkage evolution, and query modeling to determine what is NATURAL (what the majority of people do blindly without concern for optimization) and what is ARTIFICIAL (what search optimizers do with purpose).

To put it another way, I believe that Google has devised methods for measuring relevance so that they can distinguish between what they feel are natural (documents, links, queries) and what they feel are artificial (documents, links, queries).
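
To make the "link object model" idea concrete, here is a hypothetical Python sketch. None of these fields or heuristics come from the patent application or from Michael's paper; they are placeholders for the kind of per-link history data and natural-vs-artificial classification being speculated about.

    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class LinkRecord:
        """One entry in a hypothetical link object model (all fields assumed)."""
        source_url: str
        target_url: str
        first_seen: date
        anchor_texts: list = field(default_factory=list)  # anchors observed over time
        sibling_links_same_day: int = 0                    # same-day link burst size

    def looks_artificial(link):
        """Naive stand-in for a natural-vs-artificial classifier.

        The heuristics are invented for illustration: identical anchor text
        repeated across many links, or a large same-day burst of new links,
        are the sort of optimizer fingerprints the thread speculates about.
        """
        uniform_anchors = (len(link.anchor_texts) > 5
                           and len(set(link.anchor_texts)) == 1)
        bursty = link.sibling_links_same_day > 100
        return uniform_anchors or bursty

    link = LinkRecord("http://a.example/", "http://b.example/", date(2005, 6, 1),
                      anchor_texts=["buy cheap widgets"] * 8,
                      sibling_links_same_day=250)
    print(looks_artificial(link))  # True: uniform anchors plus a link burst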

fantomTip: “The Googleness of Being” Further Qualified

Michael Martinez has done it again: his revised and expanded version of his original take is easily the most important reverse-engineering-based analysis of Google's present and future ranking algorithm the world has seen in the past two years...

 

Thanks, Michael, for taking the time to write the original paper and the bullet points. I think your points are insightful.
