Best Tracking Software? What are you using?


As noted in another thread, there's still the occasional problem with Google Analytics. Add in the privacy concerns, and it's still not the perfect solution for everyone.

There are also the issues of hosted vs. not hosted, third-party systems vs. in-house only, and so on. I personally think online and self-hosted is the way to go. Unfortunately, the OSS alternatives are weak and the non-OSS packages are too expensive for my tastes.

So, time to share. What are you using to track your websites - and is it good, bad, or indifferent? Awstats? Webalizer? Clicktracks? Google Analytics? Something else?

Comments

track what?

They are quite different beasts... tracking visitors, traffic, ads... each needs specialized tools, imho.

Maybe it would be best to clarify:

Traffic Analysis (log-based, client-side script based, or both)
Visitor tracking (when it is more than traffic analysis)
Ad tracking (even that has much variety)

For me, it's just about everything: ClickTracks (Pro), WebTrends (Suite), Webalizer, and AWStats for log analysis; ClickTracks for visitor tracking and JS-based traffic; home-grown for server-side visitor tracking as well as ad click tracking... it really depends on the particular need.

Best are home-grown scripts for log analysis... fast and furious, and never any doubts.

Never, ever used Google Analytics. You have to pay me a lot more than nothing for access to my website data.

Summary and Visitors

Both do exceptionally good jobs very fast.

Visitors is open source and (from what I hear) very optimized ASM code; it is blazingly fast, doing in literally half a minute what AWStats would take hours on (500,000 lines per second). It supports a great many things, such as keyword/phrase tracking, visitor demographics and site paths, but alas, its *only* supported search engine is Google, which is one of its greatest disadvantages, besides it being GPL licensed, which precludes me contributing. Further, its graphs and reports are pure CSS (*zero* images), so you can *effortlessly* email them to upper management :-)

The one I personally use is Summary Plus, which costs $250 and is worth every penny. It has virtually everything WebTrends does and is very fast (80,000 lines/sec on a Sempron 3000+). It handles 500 MB log files (compressed) and a great many statistics. Its major downside is that the standard version does not have the option of saving to disk, so you have a memory hog that can use up maybe 500 MB of RAM on a non-optimised setting (this is for a huge website). Turning off the most memory-intensive things (such as user paths and spider crawl stats) drops it down to a modest 100 MB.

The standard version ($59) supports one domain and is quite good. Unofficially, you can run multiple instances for multiple domains, and I had no problem doing this, but after 4 months of Plus I'd never go back, even if I only had a single domain.

See http://www.summary.net/ for more info.

Sitecatalyst - How does it stack up?

I use Urchin so I am familiar with its feature set, and I always hear about Clicktracks so I know what it is capable of.

However I never see anything about Sitecatalyst.

I have heard some say it is at the top of the list, but I never see actual evidence of this. Has anyone here used it or seen a decent demo? Yes, I know it comes with a price tag, but as they say, “you pay peanuts, you get monkeys”.

questions first

Before deciding on which software to use, it's necessary to ask yourself some important questions:

1) do you want page-tagged tracking/analytics or a server log analyzer?
2) if you want page tagging, do you prefer a self-hosted system or a third-party service?
3) do you have a dedicated team to handle these things, or do you want everything managed for you?

I reckon most users only decide on the solution after careful consideration of the above points. Deciding the brand and the product comes a bit later.

I personally do not wish to suggest any one product, as each has its own unique features. I so far have not had any bad experience with G/Analytics, though I admit my interest in it is mainly academic. For that matter, my experience with 3rd-party page-tagged services has always been academic and experimental. Except for one... VisitLab.

VisitLab has gone open source and their code is available for download. Along with the usual click fraud detection and normal statistical service, one of its pluses is the VSI (the VisitLab scam index), which calculates a score and thus the probability of a scam. They are also trying to build a community of developers to work on a 'keyphrase position manager'.

RubberChicken : We've

RubberChicken: We've recently started using Omniture SiteCatalyst.

We decided on this solution as it was the only one (to my knowledge) that could do all of the following:

1. Add data from several sources, e.g. offline conversions, 3rd-party ad campaigns, PPC click data
2. Excel plugin - management tend to prefer Excel reports at regular intervals to checking a web-based system.
3. Good form abandonment analysis - our business is based on users filling out forms
4. Site overlay - this is pretty useful for testing new layouts/page designs. The plugin for this works in both IE and Mozilla, which is quite refreshing as well.
5. Segmentation

These are the reasons we chose them. We've not yet integrated all of the features though.

If anyone has any specific questions about SiteCatalyst drop me a PM.

Phil

Analytics Tidbit

Urchin, Google Analytics, etc. are all JavaScript-based and miss a bunch of stuff.

People with JavaScript disabled and ad blockers/privacy filters could skew your actual view of user stats by as much as 5%, conservatively. If you use JavaScript-enabled tools, then also embed an image load on all pages via img tags; that way you can count all actual page loads, and separate visitors with JavaScript disabled from bots and the rest.

Then it's a simple arithmetic exercise: JavaScript-tracked visitors plus the JavaScript-disabled/ad-blocker visitors who loaded the images gives you a clue of your true traffic; subtract that from ALL unique IPs and the remainder is bots and the rest.
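For illustration, here's that arithmetic spelled out (a minimal sketch in Python; the counts are made-up placeholders standing in for numbers you'd pull from your own logs):

    # Pixel-vs-JavaScript napkin arithmetic, as described above.
    # All three inputs are assumed placeholders -- in practice they come from
    # your raw server log, your JS analytics report, and your image-only hits.
    all_unique_ips = 100_000   # every unique IP in the raw server log
    js_tracked     = 80_000    # unique visitors counted by the JS tag
    img_only       = 5_000     # loaded the img pixel but never fired the JS

    true_visitors = js_tracked + img_only            # humans, JS or not
    bots_and_rest = all_unique_ips - true_visitors   # crawlers, prefetchers, etc.

    print(f"estimated human visitors: {true_visitors}")
    print(f"bots and the rest:        {bots_and_rest}")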

Choices with advantages and disadvantages

Basic Web Analytics tools usually fall into one of two categories:

  • Web server log based
  • JavaScript embedded page tags

Both have advantages and disadvantages.

By default, server logs contain much richer data than what is usually tracked by JavaScript page tagging. In the SEO field, web server logs will show you which pages have been crawled, and when, by each search engine crawler. JavaScript page tracking code does not trigger when a page is downloaded by automated robots. Proponents of JavaScript systems tout this as beneficial - their reports only track human activity. This is really just putting a brave face on a limitation: better web log analysis systems are able to separate human from non-human traffic.

JavaScript systems also fail when html programmers forget to embed tracking code in newly written pages. No tracking code, no data. JavaScript systems can slow down your pages if the tracking server is bogged down - that's been the case with Google's free hosted analytics service, and it's what launched this thread. Non-html media, such as word processor documents, pdf files, and images, won't be tracked unless specific JavaScript code is added to each link. And some users (not too many, it is true) disable JavaScript altogether.

JavaScript proponents also say they track more page views - their code is always executed when a user views a page. Implicit in this claim is the idea that web caching prevents a call to the site's web server, so the page view doesn't get logged. This is a false argument. Part of configuring a web server log based analytics system is ensuring proper web server cache directives. Simply telling a user's web browser to check with the server to see whether a page has been modified results in accurate web log data with minimal extra overhead. It also results in a more accurate browsing experience.
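As a minimal sketch of that cache-directive point (illustrative only - not any particular product's configuration), a page served with a "cache, but revalidate" policy still produces a log line on every view, even when the browser has it cached:

    # Sketch: serve a page that browsers may cache but must revalidate.
    # Every view then hits the server (a 304 if unchanged), so it lands in
    # the access log. The page body, date, and port are placeholder assumptions.
    from datetime import datetime, timezone
    from email.utils import format_datetime, parsedate_to_datetime
    from http.server import BaseHTTPRequestHandler, HTTPServer

    LAST_MODIFIED = datetime(2006, 1, 1, tzinfo=timezone.utc)

    class RevalidatingHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            ims = self.headers.get("If-Modified-Since")
            if ims and parsedate_to_datetime(ims) >= LAST_MODIFIED:
                self.send_response(304)   # tiny response, but still logged
                self.end_headers()
                return
            self.send_response(200)
            self.send_header("Cache-Control", "no-cache")  # cache, but check back
            self.send_header("Last-Modified", format_datetime(LAST_MODIFIED))
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<html><body>page body here</body></html>")

    HTTPServer(("", 8000), RevalidatingHandler).serve_forever()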

IT managers start to like JavaScript systems because web logs can be onerous to manage. The logs get very big very quickly for highly trafficked sites. A few "corrupted" lines in the middle of a file (usually due to attempts to exploit Windows buffer vulnerabilities) can stop lesser web log analysis systems in their tracks, and a junior systems administrator won't usually have the command line experience needed to find and remove the offending lines. Web analytics systems are by their nature resource intensive - a great candidate for outsourcing.
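For what it's worth, stripping those corrupted lines is a one-screen job. A rough sketch, assuming the common combined log format (the pattern is an assumption; adjust it to your own logs):

    # Copy a web log, dropping lines that don't look like combined-format
    # entries (e.g. binary junk left behind by buffer-overflow probes).
    import re

    # loose test for: host ident authuser [date] "request" ...
    GOOD_LINE = re.compile(r'^\S+ \S+ \S+ \[[^\]]+\] "')

    with open("access.log", errors="replace") as src, \
         open("access.clean.log", "w") as dst:
        for line in src:
            if GOOD_LINE.match(line):
                dst.write(line)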

So what to do? The devil is really in the details. Many organizations which have adopted one web analytics solution or another underutilize the tools due to a lack of trained personnel. A lack of training also leads to "less than optimal" implementations - and thus to poor and misleading data quality.

In general, open source tools are more primitive than their commercial counterparts. Commercial tools offer more intuitive reports and add functionality such as clickstream analysis and data drill-down capabilities. Yet sometimes the free tools provide more detail than commercial tools. In marketing (and SEO!), referral information is very important: the free hosted Google Analytics only provides referrer domain information, while AWStats offers page-level referrer URLs.

To avoid a mismatch between a tool and an organization's needs, a business should consider getting up to speed with web analytics before investing in a software solution. It can start by using a free tool, such as AWStats (web server log analysis) or Google Analytics (JavaScript page tags) - or even better, both. I've written a two-part guide to getting started with AWStats for O'Reilly. Businesses can also consider retaining a vendor-neutral consultant to help them in the solution selection process.

Sean Carlos

Click Tracks

Guys, if you haven't checked it out, I've always found that ClickTracks is excellent. Plus, it works with your raw log files, so it's a hell of a lot more accurate than other methods. It's desktop based and has a great GUI for viewing your site along with a ton of data at the same time.

Worth a try. I definitely like it and have recommended it to many people. If you do have major infrastructure backing your site(s) then you may want to inquire about their enterprise version.

sawmill.net nothing comes

sawmill.net

nothing comes close imho

Sawmill

I've also heard Sawmill is really good. Can you explain what it is so good at though?

Sure Phil. It is the

Sure Phil.

It is the intricate details it can give you that make it amazing: clickpaths through a site, exit pages, time on page, etc.

All of this adds together to help you see not only what works, but more importantly what doesn't so it can be improved. If I had to start again from scratch this is one tool I wouldn't want to be without.

Thanks Jason, I can see how

Thanks Jason,

I can see how that is useful for some sites, but is it good enough for e-commerce, where you want to be able to track where revenue is coming from, not just clicks? And also import other data, such as costs?

Unless I'm missing something, I don't see how AWStats or Sawmill can do this. (Granted, these features aren't needed by everyone - and maybe not by yourself.)

A Little Humor Because it's Friday

1. Dear God, it's me, Natasha. Please don't let Threadwatch become one of those sites where people ask How-To questions.

2. A great place to ask this question (and a super great resource) is the Web Analytics Discussion Group (which, I have noticed, has been getting more SEO-related questions over the last few months). I read that site almost as much as I read Threadwatch - lol

That Girl From Marketing

E-Commerce Needs

Phil,

Just to confirm: advanced organizations which want to integrate transaction data with their base web log or JavaScript data will need a commercial solution.

While AWStats does make its data available in an XML format for use by third-party applications, AWStats does not directly integrate with other data sources. Relative to commercial solutions, AWStats does not yet support clickstream analysis, and its visitor recognition capability (via host) is modest.

Sean

about latency ...

JavaScript systems can slow down your pages if the tracking server is bogged down ...

As noted earlier, Google Analytics has been suffering from this recently. Overload is also the cause of problems loading theregister.uk, because of the ad servers at falk.de. Unfortunately, most end users will attribute any delay in the loading of a page to the site itself.

Fortunately, frontend servers that are purpose-built for receiving tracking hits can have latency at the NIC of less than .005 seconds under load. They can also be built to be tolerant of backend outages. That means the end user never sees a delay. I have been working on/with such a beast for some time now. Public release has been, ermm... imminent for some time.
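The general pattern, for the curious, looks something like this (a toy sketch only - not the actual system, which isn't public yet): acknowledge the hit instantly, spool it locally, and let a separate process ship the spool to the backend whenever it's reachable.

    # Toy sketch of a backend-tolerant tracking frontend: every hit is
    # appended to a local spool file and answered immediately with a 1x1
    # GIF. Paths, port, and spool format are illustrative assumptions.
    import time
    from http.server import BaseHTTPRequestHandler, HTTPServer

    PIXEL = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\x00\x00\x00"
             b"!\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01"
             b"\x00\x00\x02\x02D\x01\x00;")   # 1x1 transparent GIF

    class TrackingFrontend(BaseHTTPRequestHandler):
        def do_GET(self):
            # spool first: a local append is cheap and survives backend outages
            with open("hits.spool", "a") as spool:
                spool.write(f"{time.time()}\t{self.client_address[0]}\t{self.path}\n")
            self.send_response(200)
            self.send_header("Content-Type", "image/gif")
            self.send_header("Cache-Control", "no-cache, no-store")  # bust caches
            self.send_header("Content-Length", str(len(PIXEL)))
            self.end_headers()
            self.wfile.write(PIXEL)

    HTTPServer(("", 8080), TrackingFrontend).serve_forever()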

Then it's a simple arithmetic exercise: JavaScript-tracked visitors plus the JavaScript-disabled/ad-blocker visitors who loaded the images gives you a clue of your true traffic; subtract that from ALL unique IPs and the remainder is bots and the rest.

umm... AOL proxy servers.

To be fair, most of the big names in the web analytics business include an img tag enclosed in a noscript block to account for the 5 percent or so of browsers without JavaScript enabled. Most of them also employ cache-busting techniques.

Proxy caching and browser caching are the biggest shortcomings of local log analysis, because page views that never actually touch the server cannot be counted.

maxi_million ..

I just checked out your site. Browsing the HTML-based CVS listing is fine, but where is the tarball?

The kitchen sink ...

Phil, cheers for the indication of SiteCatalyst's capabilities. I am interested in site analytics software more from the e-commerce perspective, but at the end of the day, doesn't everybody want everything? Or is that just me?

Sean, you seem to have your head around this subject. Of the commercial software out there for advanced organisations (who want the lot), where would you be putting your money?

Summary

I would echo hopeseekr; I also use Summary. I especially like the prompt support and the way you can set up as many customised HTML reports as you like for your clients.

Clicktracks will also

ClickTracks will also assimilate other data into its reports, though in the past it was a manual thing. I believe their newest version does it all automatically.

Another vote for ClickTracks

Have a few clients using it right now that couldn't be happier with it.

one big drawback

One of the big problems I have had with almost every "solution" is that they don't accommodate SEO very well.

When the vendors offer trial versions, they limit them to something like 50k page views (I don't recall the actual numbers) as well as a fixed time span of the logs. How can I test a product using only a few days of data? This was a big problem when I wanted to test ClickTracks, for example. It is a barrier for any hosted solution these days, because they seem to set pricing based on volume according to some non-SEO-style monetization scheme.

Maybe a "traditional website" that sees 500k+ page views per month can afford a $600/month analytics solution, but many SEO sites see that volume with relatively low levels of monetization. This is especially true of test sites where I really want to run a good analytics solution, and traffic sites that send the traffic somewhere else for monetization.

Sometimes it seems that the vendors are blocking use in the places where their products might shine the brightest.

napkin math

John,

Let's take the example of 1 million page views per day.

CTR of 1 percent, 10 cents per click.

So, 1,000,000 x 0.01 x $0.10 = $1000.00/million

If the vendor is charging $200.00/million, what's wrong with that? Especially if you can apply the information to driving the CTR or EPC higher? That *is* the ultimate goal, is it not?

I would argue that the income is volume related, so why not the costs? The traffic you throw at an analytics vendor is directly related to the vendor's costs. Like you, the vendor is interested in making some money at the end of the day.

Now, a gigabyte is cheap. But those terabytes cost a *great* deal more when sold by Sun or EMC. Furthermore, the horsepower required to deal with true volume skyrockets. Remember, everything looks great until the system becomes I/O bound because there is no more memory to cache the indexes. Then the only solution is more hardware, as there is a finite limit to what can be done by optimising the backend code. True, whitebox servers are cheap, but they don't have the required performance characteristics. Remember: you can have fast, cheap, or reliable. Pick any two. :)

What is it about webmasters that they want to keep everything for themselves? I know of one case involving 1M page views per day, a conversion rate of 1 in 3000 @ $20 per conversion. That's about $6600.00 per million pageviews. Yet, $6000/month was too much for him to part with out of that $6600/day.
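Spelled out, with the figures from both examples (the numbers come straight from the post, nothing else):

    # Napkin math from the post above.
    page_views = 1_000_000

    # Example 1: 1% CTR at $0.10 per click
    revenue_ctr = page_views * 0.01 * 0.10
    print(f"${revenue_ctr:,.2f} per million page views")      # $1,000.00

    # Example 2: 1 conversion per 3,000 views at $20 each, 1M views/day
    revenue_conv = page_views / 3000 * 20
    print(f"${revenue_conv:,.2f} per day")                    # ~$6,666.67

    # A $6,000/month tool against that income:
    share = 6000 / (revenue_conv * 30)
    print(f"tool cost: {share:.1%} of monthly income")        # ~3.0%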

Quick survey: what *is* everyone paying per million pageviews these days, anyway?


hmmm...

Quote:
So, 1,000,000 x 0.01 x $0.10 = $1000.00/million
If the vendor is charging $200.00/million, what's wrong with that?

You mean what is wrong with asking for 20% of gross revenue for the analytics software? Plenty.

For every dollar in value that analytics produces, I suspect 70% is from the expertise required to use it/interpret it. That value belongs to me, not the vendor. I don't recall bringing on a profit share partner when I purchased my analytics software. Perhaps those who live by the PPC game may be thinking that way these days, but not me.

I actually convert far better than $1000/million and still I'm not interested in $200/million for analytics. Look at it this way... if I can do SEO/strategy for a client and it leads to $1000/million conversion, will I get paid $200/million off the top? Monthly?

John, When the firm I worked

John,

When the firm I worked for needed new metrics software, we insisted on at least a full month of trial with the solution implemented before we said yes or no.

Hear! Hear!

Quote:
I don't recall bringing on a profit share partner when I purchased my analytics software

That's what's wrong with that pricing model and why I would never buy under such a plan. I'm buying software to do a task, not a rental agreement or a price per volume. There's no justification for that kind of model - it's not like CPU cycles are a big cost all of a sudden. This is just 'if they're bigger, we'll charge more for the same job'.

You mean what is wrong with

You mean what is wrong with asking for 20% of gross revenue for the analytics software? Plenty.

I'm actually quite happy spending 20% on something that can, over the course of a few months, add 50% to my revenue. There's also a cost saving involved with producing automated reports for management.

By increasing the conversion rate using high quality tools, you're also reducing the percentage of revenue that they cost you.

I think you get what you pay for.

i.e. AWStats and Google Analytics are both free, and I no longer use either of them.

so there's a market

So then there is a market for high-priced tools that offer, in exchange for 20% of gross revenue, a chance to increase that revenue 50% over 6 months. What then... stop using the tool? Stop handing over 50% of those revenues? Or perhaps the project hasn't been thought out that far...

Sorry, but I don't get the argument. I do understand that the tool vendors set that scaling price model as a starting point, and I assumed it was because they modeled it after typical e-commerce businesses, with an expectation to serve 80% and perhaps negotiate the remaining 20%. But then some of you say it's ok to pay 20% of gross for an analytics tool. I cannot understand that.

Then again, I never did agree with the large corporate environments, with their odd, often egocentric decision-making processes and disjointed, side-agenda-driven management styles. That could easily make up a large enough market for the high-priced tools backed by slick marketing. In that case, though, isn't it ironic that they are sold as "performance metrics"?

Apologies - should not post

Apologies - should not post when drunk.

I was using the percentages you gave as my example without doing the maths. Our costs are nowhere near 20% of gross revenue. They are nearer 1 or 2%.

We could have used cheaper or even free tools, but the fact of the matter is they just could not have provided the answers that we need.

The increased revenues should be continual as our sites constantly evolve.

There is also another factor, which is the time saved by being able to set up a number of different marketing and management reports.

nothing to do with gross ...

John,

The points I made had *nothing* to do with a share of the revenues.

It may happen that in the simple example I used, it works out to 20 points of the gross.

I doubt that there are any vendors who price according to the gross revenues of the client.

What I *am* pointing out is that in the hosted model the costs for the vendor are directly related to the amount of services used by the client. So, scaled pricing is perfectly reasonable.

At first, on re-reading your post, I thought that you were mostly interested in purchased software as opposed to hosted services, which is what my comments were directed at - to wit, the allusions to the cost of acquiring and supporting the backend. Then I came upon this:

It is a barrier for any hosted solution these days, because they seem to set pricing based on volume according to some non-SEO-style monetization scheme.

So, I reiterate: costs, and by implication pricing, are directly related to the volume of services rendered to the client. There is no way that a hosted service can be delivered on a flat-rate pricing model. And I will add that in most cases an analytics firm is not an SEO firm, although some are expanding into that area.

What I am stressing is that if a tool is useful, then its cost ought to be considered in light of its ability to improve total income. My mother puts it best: you gotta spend to make.

The extreme case that I pointed out at the bottom of the post was a buyer who refused to part with *3* percent of his monthly income for a tool to help him figure out how to improve his site. I came upon him after he had already been shopping for pricing for over a month. The only reason that I know his conversion numbers is because I have several years of background as an insider in his particular niche. I would not normally have that knowledge.

The example used at the beginning of the post was intended to illustrate how even at those abysmal numbers, it would be productive to use the tool. Indeed, if the numbers are that abysmal it might be that the use of the tool is strongly indicated.

My own *personal* gripe is the expectation that big users should receive preferential pricing. This effectively means that these users are subsidised by smaller users paying full pop. I call this the Walmart Strategy.

plumsauce...


...but where is the tarball?

Came back here a bit too late to notice your comment. I'm surprised you never came across the distro.
You will find it somewhere in the section called DeveloperDocs in the wiki.

plum

Fair enough to emphasize that increased usage should bring proportional costs. My experience was across both product types, where the trials are limited to N log entries *and* the hosted service is limited by hits. I took that to reflect price scaling based on traffic volume because, well, that is how it practically works out.

There are investors and there are bootstrappers, so not everyone sees the benefit of investing money to make money every time. When you invest, there are risks beyond your control, and it requires trusting third parties. When you bootstrap, you move more slowly and pass on some opportunities, yet you retain control. With all the monetization of the channel that goes on these days, it is not a simple matter of buying expensive tools and reaping proportional rewards.

There are plenty of downsides to hosted services these days. I think that those risks should reduce the cost as a balance to the assumed risk.

If I buy a WinXP product and the vendor sells out to Google, I have a perpetual license to run that code as long as I choose to support it and its legacy WinXP system (in other words, as long as my personal finance model requires me to run it). Another way to put it might be: I'll trade a reasonable reward for services (and give up my perpetual license) if the ASP gives me a reasonable SLA that mitigates the additional risk.

In your example of increased service use costing the vendor more, I agree, but I also note that the increased costs associated with *my* increased usage are much less than the scaled pricing suggests (I use more bandwidth and maybe some additional temporary storage, which is very cheap compared to other costs of scaling associated with product sales growth, geographic distribution, back-end administrative costs related to company operations, etc.). I bet SEOs consume far less tech support than Fortune 1000 clients.

I don't believe big users should get preferred pricing either. But I don't think current scaled pricing presents a value proposition for the typical SEO client who consumes relatively few services (tech support) and may have much higher traffic than typical businesses, or much greater traffic fluctuation.

Perhaps the problem lies in setting prices according to perceived market value rather than the actual marketed product. In my case, the analytics tool vendors' perception of market value is out of whack with reality. I can do more than they are doing for that price point, and keep the assets working for me for free for as long as I am willing to support them.

as a bootstrapper ...

John,

First let me thank you for being reasonable in considering both models.

I would suggest that there is a counterpoint, with respect only to purchased software, that the lower price point package might reflect a price that in no way would allow recovery of development costs, but is priced as a means of introducing the product. Or, of selling the product to those who could not otherwise afford it.

I also agree that certain users need less support than others. In a perfect world, the product would be so good that *no* support is needed.

Of course, bootstrapping is always admirable. And if a bootstrapper's price point cannot be satisfied, then they need to look at free or near-free offerings. But at the same time, they have to realise that they have decided, whether by nature or by circumstance, to be in bootstrap mode. As such, they should not be upset at other segments in the supply chain that are not catering to bootstrappers.

Mind you, there are always deals that can be done. Remember that WebTrends was in bootstrap mode at the beginning and came to prominence by trading tracking services to adult sites in exchange for banner space on them. The banner space was resold as inventory. Once WebTrends became mainstream, they disassociated themselves from their history. From reading their financial filings, this was actually a backwards move.

As far as using software forever is concerned, I am in total agreement. I am quite happy using NT4 on my workstation. What I don't understand are the clients who want source code for a $500 package because I might get run over by a car, train, boat, bicycle, or skateboard. It is just a checkmark on some corporate due diligence purchasing checklist. The real question is: does it do the job properly today? If it does, why would it change tomorrow? And if it did, even if they have the source code, do they have the skills to do anything with it? I think not. So, pffft... goes the argument for open source.

On the one hand, there is an argument that it is not fair for big organisations to expect lower per unit pricing. But then it is argued that the purely additional costs do not track with the additional income. Actually, they do. The question is whether the vendor is willing to accept a lower rate of return simply because someone is doing more volume. That's not a matter of fairness, but a negotiation point. What is the customer willing to give up in compensation? Because really, when a customer asks for a discount, he is asking the vendor to take money out of his own pocket and hand it over. He might need some convincing. He might be convinced by an offer of prepayment, or a guarantee of a certain volume. Maybe he wants an introduction to another site. Everyone wants something. The negotiator's job is to figure out what that is.

The guiding light in any negotiation is that there has to be enough left on the table for the other guy so that everyone is happy. Not ecstatic, but happy. If one of the parties gets ground into the dirt, then it's going to be a pretty rocky relationship. There is a reason that some businesses actually refuse business from Walmart. It's not worth the aggravation.

In *my* perfect world, pricing would be along the lines of $x/million rack rate (published rate), come one, come all. The world is not perfect; oh well.

I personally don't buy encoded scripts

What I don't understand are the clients who want source code for a $500 package because I might get run over by a car, train, boat, bicycle, or skateboard. It is just a checkmark on some corporate due diligence purchasing checklist. The real question is: does it do the job properly today? If it does, why would it change tomorrow? And if it did, even if they have the source code, do they have the skills to do anything with it? I think not. So, pffft... goes the argument for open source.

I disagree - programmers do sometimes disappear, or just take up a new profession and stop answering emails. Some companies do go bust.

Any code I use on my sites, I want the source, or I won't buy it.

My reasoning is this: security holes appear all the time. If I can't get an update from the programmer (for the reasons above), I'll arrange to have it done myself.

And it also gives me the option of having changes and additions done to the code.

Anything Zend or Ioncube encoded is always off the shopping list for me.

Sure luz I'll just give you

Sure, luz, I'll just give you my scripts for free so you can copy them and give them to your friends; perhaps you could run an obfuscator, encode them, and resell them.

That's totally cool by me.

Sure luz I'll just give you

Quote:
Sure, luz, I'll just give you my scripts for free so you can copy them and give them to your friends; perhaps you could run an obfuscator, encode them, and resell them. That's totally cool by me.

Sigh.

It's your code so it's your decision of course. I'm just saying that I don't buy encrypted code, and I'm sure others feel the same way.

Keeping the code open doesn't seem to have done vBulletin any harm.

somewhat relevant...NickW

Somewhat relevant... NickW just launched a tracking/metrics program for bloggers:
http://performancing.com/metrics/start

Microsoft, Oracle and IBM releasing source code ...

I think not.

and btw ...

And it also gives me the option of having changes and additions done to the code.

Not if you have not also acquired the right to do so as part of the purchase.

Furthermore, most of the arrangements I have seen involve source code escrow. The escrow agent will only release the source code under the circumstances permitted by the escrow agreement. Most clients drop even this approach when they are told what the cost will be. Hint: the escrow arrangement itself will cost a couple of grand, which goes to the escrow agent. Add yearly escrow fees, which the client is responsible for. Then add the value of the intellectual property itself. You are *not* going to see a total under $10k for even the simplest package.

I am not talking about some silly script, but about compiled software with a great deal of embedded intellectual property.

I still point out: if it works today, it will work tomorrow, in the same operating environment. If you have done your due diligence testing, you will know exactly what the limits of the software are. If you haven't, then source code access is a crutch.

Even large companies disappear off the face of the earth. Take, for example, Ashton-Tate. Subsequent to their demise, I saw the source code for DBIV; it was legally acquired under license for $X from Borland. There is *no* way anyone is going to be able to do anything successfully with that much source code without a tremendous investment in studying it. Tough nuts. You would be better off investing the time in rewriting the dependent systems which you *do* understand. Trust me, it's cheaper.

Of course, you are quite free to stay solely within the limits of open source. If what you want is available, then all should be well. If what you need is not available, please feel free to break out the compiler. Please remember to open source it. I like looking at source code, for educational reasons :)

Now I gotta run off and see what NickW is doing ...
