Web Trends reporting

6 comments

We've been using Web Trends 5.0 for the past two years to report our clients' monthly stats and will be moving to the new and improved (sic) Web Trends Enterprise 7.0 - the big Kahuna. I've noticed that when comparing stats results between 5.0 and 7.0, the figures reported by 7.0 seem higher by anywhere from 3 to 14 percent. I was wondering if anyone else has experienced this and, if so, whether you were able to find out why that is the case.
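To be concrete about what I mean by "higher by anywhere from 3 to 14 percent": I'm just computing the percent difference per metric between the two versions' monthly reports, along the lines of this minimal Python sketch (the metric names and numbers are purely illustrative, not our real client figures):

Code:
# Compare monthly figures from two report exports and print the percent
# difference per metric.  Numbers below are illustrative only.
wt5 = {"visits": 41_200, "page_views": 188_000, "unique_visitors": 29_500}
wt7 = {"visits": 44_100, "page_views": 205_700, "unique_visitors": 33_600}

for metric in wt5:
    old, new = wt5[metric], wt7[metric]
    pct = (new - old) / old * 100
    print(f"{metric:>15}: 5.0={old:>8,}  7.0={new:>8,}  diff={pct:+.1f}%")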

Comments

look very closely

Look very closely at how they track proxied visitors. I had a lot of similar troubles with WT upgrades. It's scary when you look too closely... hard to put much faith in the guestimates.
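To make that concrete: whether the analyzer identifies visitors by IP alone or by IP plus user agent can change the count noticeably for the same log, and that's exactly the kind of rule an upgrade can quietly change. A rough Python sketch with made-up hits (real tools also use cookies, session time-outs, etc. - this is not how WT actually does it):

Code:
# Made-up hits: three different people behind one corporate proxy,
# plus one visitor on a regular connection.
hits = [
    ("203.0.113.7", "Mozilla/4.0 (Windows NT 5.0)"),
    ("203.0.113.7", "Mozilla/4.0 (Windows 98)"),
    ("203.0.113.7", "Mozilla/4.0 (Windows NT 5.0)"),
    ("203.0.113.7", "Opera/7.23"),
    ("198.51.100.24", "Mozilla/4.0 (Windows NT 5.0)"),
]

by_ip = {ip for ip, _ in hits}
by_ip_and_agent = {(ip, ua) for ip, ua in hits}

print("visitors by IP only:        ", len(by_ip))            # 2
print("visitors by IP + user agent:", len(by_ip_and_agent))  # 4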

Hopefully they have config settings to match the old behavior so your numbers stay comparable.

"hard to put much faith in the guestimates"

"hard to put much faith in the guestimates"

Ain't that the truth. And apart from hits and bytes, most of the rest of logfile analysis is about guestimates.

That said, the last version of Webtrends I used - version 5 - consistently over-reported sessions by up to 20% compared with AWStats, Urchin and NetTracker.

If you haven't already, check out NetTracker before you upgrade Webtrends.

Web Trends 7.0

I actually had the opposite problem when using Web Trends 7.0: figures were consistently lower than those from Urchin and other log file analyzers.

7.0 does add some additional flexibility to reporting profiles. It seems to be a more stable system than in the past... however, I still don't recommend it.

Do yourself a favor

Move to ClickTracks. You'll find that the data is not only more consistent, but reporting is much, much easier.

clicktracks

I agree ClickTracks is nice, although I don't believe it fixes much of the problem (it does provide a great set of interactive tools, IMHO). It also depends on which version you use (JavaScript tracking or log file analysis) - they are very different animals.

In my view, any respectable scientist will normalize datasets before jumping to conclusions, and then put faith only in comparative analyses (especially after segmenting visitors). After that, the interpretation of the meaning of those analyses is really a business function - not science. And of course business is driven by risk management (not fact), so such guestimating is par for the course.
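By "normalize" I mean something as simple as indexing each tool's counts to its own baseline month, so you compare trends rather than absolute levels. A minimal Python sketch with made-up session figures:

Code:
# Index each tool's monthly sessions to its own first month (= 100),
# so the shape of the trend is what gets compared.  Figures are made up.
months = ["Jan", "Feb", "Mar", "Apr"]
webtrends = [40_000, 44_000, 43_000, 47_000]
urchin    = [34_000, 37_500, 36_600, 40_100]

def indexed(series):
    base = series[0]
    return [round(v / base * 100, 1) for v in series]

for name, series in [("WebTrends", webtrends), ("Urchin", urchin)]:
    print(f"{name:>9}:", indexed(series))

In that (made-up) example the absolute counts disagree by around 15%, but the indexed trends track each other closely - which is usually the question the business actually cares about.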

That said, I like that ClickTracks has Stephen Turner (Ph.D. in stats, author of Analog) and that CEO John Marshall was driven to create ClickTracks by his frustration with existing web analytics while neck-deep in real tech companies during the initial boom (Netscape and two others he never names ;-)

But ClickTracks has to make assumptions just like everybody else. Here's a quote from Marshall in a web analytics discussion:

Quote:
...the robots don't execute JavaScript so they don't appear as normal users. As for a list of robots, it's not that easy. There are increasing numbers of robots that have a normal mozilla useragent because they specifically don't want to reveal themselves, perhaps because they're harvesting email addresses. A log analyzer vendor is therefore having to constantly adapt the algorithms. For example a session starting with 'robots.txt' is obviously a robot and is excluded, as are users that repeatedly request the same page, or that make many page requests at regular intervals. All this gets thrown into some heuristics and it ends up being remarkably accurate, though not perfect. Code to do all this cannot be easily broken out and sold as a package. Nevertheless we encourage our customers to buy maintenance and updates so they can install the latest version any time and get such upgrades to the useragent list and the exclusions heuristics.

That's one good argument for a hosted solution (keeping up to date) as well as against a hosted solution (each change makes for a new experimental model, and may void historical comparisons).
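If you're curious, the exclusion rules he describes boil down to something like the rough Python sketch below - my own simplification of the quote, with made-up sessions, not ClickTracks' actual code:

Code:
from collections import Counter

# A "session" here is just a list of (timestamp_seconds, path) tuples.
def looks_like_robot(session):
    paths = [path for _, path in session]

    # Rule 1: a session that starts by fetching robots.txt is a robot.
    if paths and paths[0] == "/robots.txt":
        return True

    # Rule 2: repeatedly requesting the same page looks automated.
    if paths and Counter(paths).most_common(1)[0][1] >= 10:
        return True

    # Rule 3: many requests at nearly regular intervals look automated.
    times = sorted(t for t, _ in session)
    if len(times) >= 10:
        gaps = [b - a for a, b in zip(times, times[1:])]
        if max(gaps) - min(gaps) <= 1:  # almost perfectly regular spacing
            return True

    return False

crawler = [(i * 30, "/products.html") for i in range(12)]  # one hit every 30 s
human = [(0, "/"), (45, "/products.html"), (130, "/contact.html")]
print(looks_like_robot(crawler))  # True
print(looks_like_robot(human))    # False

The real point of the quote is the last bit, though: those thresholds drift as the robots change, so the heuristics need constant maintenance.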

Sawmill.

Sawmill.
