Web Analytics - Part 2

March 4, 2004
Jason OConnor


-----------------------------------------------------------------
TERMS OF REPRINT

You have permission to publish this article electronically or in print, free
of charge, as long as the bylines are included and you follow these rules:
* Email distribution of this article MUST be opt-in email only.
* You must forward a copy of the ezine or newsletter that
contains the article inside to the author at: mailto:jason@oakwebworks.com
* If you post this article on a website, you must set any URLs
in the body of the article, and most especially in the Author's
Resource Box, as hyperlinks. Please send us the URL.
-----------------------------------------------------------------

Web Analytics - Part 2
Jason OConnor
Copyright 2004

Not accessing and reviewing your vital website statistics is like never looking at your checking account activity and never knowing how much money you have in it.

In Part 1 of this two-part series I explained how to crunch relevant website statistical data to facilitate constant e-marketing initiative improvements. I explained what types of data are important, such as unique visits, click-thru numbers and percentages, and lead conversion rates, and how to process all these numbers. (You can read Part 1 at http://www.oakwebworks.com/articles/article-6-analytics-part-1.htm). Here in Part 2 I’ll explain how to obtain that data in the first place, and then provide a foolproof method for capturing click-thru statistics on your website.

The first thing you need to know is where your website lives. Every website sits on a server, a computer whose job is to wait for requests from clients (people’s personal computers, by way of a browser). Each server physically lives in one of two places. The first is at the website owner’s own company, which is called in-house, internal, or self hosting. If company A has an active website, owns the server the website sits on, and keeps that server at its own offices, then it falls into this first category.

The other place a website server can physically live is at an Internet Service Provider (ISP) or hosting company. There are a number of configurations the server can fall under in this category, but they are beyond the scope of this article. The main thing to keep in mind is that you first need to know where your website’s server is.

Once you know this, you can begin to assemble all the relevant site statistics. All servers automatically generate all the data you’ll ever need on an ongoing basis. They are relentless in their stats recording. They record all the data in what’s called server log files. Manually parsing through these log files is a horrible job that should only be wished on your worst enemy. They are huge laundry lists recording every site visitor’s IP address, browser type, referring site, time and date of visit, and much more.
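To give you a feel for what you would be wading through, a single entry in a typical Apache-style access log looks something like this (the exact format varies from server to server, and the values below are made up for illustration):

  192.0.2.10 - - [04/Mar/2004:22:15:03 -0500] "GET /products.htm HTTP/1.1" 200 5120 "http://www.referringsite.com/links.htm" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"

A line like that is written for every single file a visitor’s browser requests, which is why these files grow so large so quickly.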

Fortunately, there are software programs that can do this for you. One of the most popular is WebTrends (http://www.netiq.com/webtrends/default.asp). You feed your server log files to the WebTrends software, and it produces for you an excellent presentation of all your relevant (and some superfluous) website statistics.

If your website sits on a server that your company hosts in-house, then you need to purchase WebTrends or some similar software and locate your server log files. The files often end in .log. In other words, it’s up to you to get your website’s statistics, and you do this by locating your server log files and running them through software such as WebTrends.
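If you are curious what that processing boils down to, here is a minimal sketch in Python, offered only as an illustration and not a WebTrends replacement, that reads an Apache-style access log and counts successful requests per page:

  # count_page_hits.py -- a minimal, hypothetical sketch, not a WebTrends replacement.
  # Assumes an Apache-style access log; adjust the parsing for other log formats.
  import re
  import sys
  from collections import Counter

  # Pull the requested page and the HTTP status code out of each log line.
  REQUEST = re.compile(r'"(?:GET|POST) (\S+) HTTP/[\d.]+" (\d{3})')

  hits = Counter()
  with open(sys.argv[1]) as log:                 # usage: python count_page_hits.py access.log
      for line in log:
          match = REQUEST.search(line)
          if match and match.group(2) == "200":  # count only successful requests
              page = match.group(1).split("?")[0]
              hits[page] += 1

  # Print the most-requested pages, busiest first.
  for page, count in hits.most_common(20):
      print(f"{count:8d}  {page}")

Real packages such as WebTrends do far more (unique visitors, referrers, browsers, paths through the site), but the principle is the same: the raw data is already sitting in your log files.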

If your website sits on a server at an ISP, then you can either request the server log files from them and run them through your own software, or you can ask whether they provide an interface for reviewing your site statistics online. Most do provide this service. It’s often web based, and all you have to do is log onto their site to view your stats.

Now you’re armed with a lot of good data. But if all your e-marketing initiatives drive traffic to your homepage, how will you know which ones are working and which ones aren’t? If you send out emails to rented lists and the calls to action are all links that point to your homepage, then you’ll never know which emails are doing better than others. You may get a rough idea by seeing whether your overall traffic increased the day you sent out the email or posted the banner (even to determine this you’ll need your website stats), but to do it right you need exact data, and the web will provide it for you.

Some sites that you place banners on will offer click-thru counting services to you. Most email brokers also offer similar services, at a price. But what if they don’t offer tracking information? Or worse, what if you don’t trust their reporting?

The solution: Create, implement, utilize and manage your own unique tracking pages.

It’s relatively simple. For every e-marketing campaign you conduct, you create and assign a unique HTML page. The initiative’s call to action (a hyperlink) points to that unique page. After the campaign is done, you go to your website statistics, obtained through your website’s server log files, and see how many visits were logged for each unique tracking page.

For example, let’s say you send out an email to a list of 1000 email addresses. In the body of the email there is a call to action link that says, “Click Here to Buy Now”. This link points to a page on your website. But not just any page. It points to a unique tracking page you created earlier to track how many of the 1000 people clicked-thru from the email. It’s important that no users can get to this new page in any other way than through the email. Let’s say you named the page email-campaign1.htm. After the email campaign is done (I like to wait about 2-4 days), you go to your website statistics (the result of parsing the server log files through WebTrends or its equivalent) and search for the page called email-campaign1.htm. Finally, you view the page visits number. Let’s say the visits to this unique page totaled 200. That number is your click-thru number.
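Plugging the numbers from that example into the click-thru percentage discussed in Part 1:

  200 visits to email-campaign1.htm / 1,000 emails sent = 20% click-thru rate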

Now you can really start to fill in all the relevant data discussed in Part 1. This will enable you to determine how well each campaign is doing and whether you need to make adjustments.

If you don’t do the technical work for your site, you ought to consider giving Part 1 and Part 2 of this series to your technical web person so they can get a better handle on your website vitals. To help manage all these unique pages, keep them all in one subdirectory of your site.
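For example, the tracking pages from several campaigns might all sit under a single directory like this (the directory and file names here are purely hypothetical; any consistent naming scheme will do):

  /track/email-campaign1.htm
  /track/email-campaign2.htm
  /track/banner-campaign1.htm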

Until you know how well your website and e-marketing campaigns are doing, measured in visits, leads and sales, you can’t possibly maximize your operation and increase your bottom line. Now you have the information to make this happen.