Analytics has always been an important aspect of SEO, and as an SEO consultant it is something I have found myself immersed in over recent months. With all the changes hitting the SEO field at the moment, such as personalised search, it is becoming more and more important to look beyond rankings and focus on where your traffic is actually coming from and how to use it effectively.
I was going to leave my analytics tutorials until the very end; however, I feel it is important for you guys to understand these aspects sooner rather than later. The first thing you need to get to grips with when dealing with analytics is your data sources, and today I want to take an in-depth look at traffic data sources.
If you are new to SEO, or even just new to the analytical side of things, some of this may take a while to sink in, but please, please, please stick with it, as understanding these features is crucial to success. This is going to be a long one, but trust me when I say it will be worth it.
OK, let's start with the very basics. There are two main ways of getting visitor data about your web pages:
1 – Reading the visitor/server logs
2 – Using a real-time tracking system
Let’s start by breaking these two fundamentals down.
Whenever anyone requests a web page from a server, the server records the relevant data in its raw visit logs. To give you an idea, here is what a typical log entry looks like:
126.96.36.199 - Tim [10/Oct/2007:13:55:36 -0700] "GET /apache_pb.gif HTTP/1.0" 200 2456 "http://www.example.com/start.html" "Mozilla/4.08 [en] (Win98; I ;Nav)"
This is simply an example of the kind of data held in a visit log. There are various things you can extract from it:
1) IP Address
2) Password (When HTTP protected)
3) Date and Time
4) HTTP Request sent by browser
5) Outcome of the request, success or failure (200, meaning success, in the above example)
6) Size of the content
7) Where the visitor came from (the referrer)
8) User Agent
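To make the fields above concrete, here is a minimal sketch of how a log analyser might pull those fields out of the example entry. The regular expression below is a simplified take on the common Apache log layout, written for illustration rather than as production-grade parsing code:

```javascript
// The example log line from above, as a single string.
const line = '126.96.36.199 - Tim [10/Oct/2007:13:55:36 -0700] ' +
  '"GET /apache_pb.gif HTTP/1.0" 200 2456 ' +
  '"http://www.example.com/start.html" "Mozilla/4.08 [en] (Win98; I ;Nav)"';

// Simplified pattern: IP, ident, user, [timestamp], "request",
// status, size, "referrer", "user agent".
const pattern = /^(\S+) (\S+) (\S+) \[([^\]]+)\] "([^"]*)" (\d{3}) (\d+|-) "([^"]*)" "([^"]*)"$/;
const m = line.match(pattern);

const entry = {
  ip: m[1],          // 1) IP address
  user: m[3],        // 2) authenticated user (when HTTP protected)
  timestamp: m[4],   // 3) date and time
  request: m[5],     // 4) HTTP request sent by the browser
  status: Number(m[6]), // 5) outcome of the request (200 = success)
  bytes: Number(m[7]),  // 6) size of the content
  referrer: m[8],    // 7) where the visitor came from
  userAgent: m[9],   // 8) user agent
};
```

A log analyser such as Webalizer is essentially doing this same extraction across millions of lines, then aggregating the results.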
As you can see, the server log contains a lot of useful information about the visitor, and many programs, called log analysers, can display this information in a user-friendly form. One of the most popular is Webalizer (I made reference to it in my analytics tool list).
Now, collecting visitor data in this way is fine if all you want is a basic picture of the visitors to your site; however, it is far from what we need if we are involved in marketing. The reason is that log analysers cannot give us enough information about repeat visitors, uniques, transaction cycles, visitor browser info, operating systems, advertising analysis, keyword analysis, referrer analysis and so on. All these things are highly important to an SEO and search marketing campaign.
Even if you could extract this kind of information from a server log, it would take very complex software and a massive amount of integration time.
There are other reasons why log analysers cannot provide us with the precision we need as Internet marketers:
1) Users on dynamic IP addresses (e.g. dial-up users) will be recorded as multiple visitors
2) Visitors from a large corporate network sharing one IP address will be counted as a single visitor
3) A visit will be recorded even if the page fails to load in the user's browser, as the log records a hit as soon as the info is sent from the server.
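The second point is easy to demonstrate. In this invented example, three different people behind one corporate gateway all hit the site, but a naive IP-based count collapses them into a single "visitor":

```javascript
// Three real people, one shared corporate IP (203.0.113.10 is a
// documentation address, used here purely for illustration).
const hits = [
  { ip: '203.0.113.10', userAgent: 'Firefox' },
  { ip: '203.0.113.10', userAgent: 'Chrome' },
  { ip: '203.0.113.10', userAgent: 'Safari' },
];

// Counting "visitors" by unique IP, as a simple log analyser might do.
const uniqueIps = new Set(hits.map(hit => hit.ip));
console.log(uniqueIps.size); // 1 "visitor", even though three people visited
```

Dynamic IPs create the mirror-image error: one person dialling up twice gets two different addresses and is counted as two visitors.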
It is for these reasons that, in my opinion, we need a more effective way of tracking our visitors.
Real Time Visitor Tracking
Most, if not all, web browsers, including the likes of Internet Explorer, are able to execute browser-side scripts.
Let me just explain this in simple terms for the benefit of the less experienced readers.
- When a web page is composed in a browser, program code can be embedded in the page that instructs the browser not only to show the page on screen, but also to perform certain other actions.
These are the same scripts (JavaScript and VBScript) that are used for on-page animation and rollover effects, and that can process data entered into a form by the user.
Google Analytics is a great example of a browser-side script that reports various data about visitor page views to a remote database, and in my opinion it is the perfect tool for gathering marketing information.
So, when a person opens a page in their browser, the browser executes the tracking script embedded in the page. This script gathers every piece of information it can about the visitor and sends it to the tracking centre. Let me make something clear at this point: it will not pick up details such as the user's address, sex or name, but it will include information such as:
- screen resolution
- browser details
- cookie information
- time zone
- name of the page viewed
- keyword use
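To show roughly how that works, here is a stripped-down sketch of a tracking beacon. The endpoint and parameter names are invented for illustration; real tools such as Google Analytics use their own protocols, and in a live page the values would come from browser APIs like `screen.width`, `document.title` and `document.referrer`:

```javascript
// Build the URL of a 1x1 tracking pixel carrying the visitor data.
// Everything here is a hypothetical sketch, not any real tool's API.
function buildBeaconUrl(endpoint, data) {
  const params = Object.entries(data)
    .map(([key, value]) => `${encodeURIComponent(key)}=${encodeURIComponent(value)}`)
    .join('&');
  return `${endpoint}?${params}`;
}

// Example visitor data of the kind listed above.
const url = buildBeaconUrl('https://collect.example.com/pixel.gif', {
  res: '1920x1080',                // screen resolution
  tz: 'Europe/London',             // time zone
  page: '/start.html',             // name of the page viewed
  ref: 'https://www.google.com/',  // referrer (source of keyword data)
});
// In the browser, the script would then fire the request, e.g.:
// new Image().src = url;
```

The request for that tiny image is what carries the data back to the tracking centre; the visitor never sees anything happen.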
The whole thing remains 100% invisible to the visitor, and the information passed to the tracking centre is stored in a database. When you request the statistics for your site, they are put together in a user-friendly format.
It is called real-time tracking simply because the information is stored as soon as the visitor sees the page in their browser. The only things you need in order to use this form of tracking are access to the source code of your pages and privileges to upload files to your hosting server; that is the limit of the technical requirements.
OK, I am aware this has been a long post and you may need to read over it again to get to grips with it. If you are already familiar with traffic data sources then this may have been something of a refresher; however, I love the saying "repetition is the mother of skill".
I hope you have got something out of this, and please don't underestimate the importance of fully analysing your traffic.
Author: Tim (292 Articles)
Tim Grice is the owner and editor of SEO wizz and has been involved in the search engine marketing industry for over 7 years. He has worked with multiple businesses across many verticals, creating and implementing search marketing strategies for companies in the UK, US and across Europe. Tim is also the Head of Search at Branded3, an SEO agency in Leeds.