When we talk about “site architecture” in terms of search engine optimisation, what we are really discussing is the whole of your site's framework that supports your content. In particular, we are looking at how your site structure accommodates the search engine spiders, which has an impact on how your site is indexed. In basic terms, your architecture involves your navigation system, page layout and your file or directory system.
To make your structure search engine compatible you should use unique and relevant content, search-friendly design elements and a linking set-up that allows the spiders to crawl your site more effectively.
Below is a list of the basic aspects of site structure that affect search engine rankings.
- File/directory system
- File names and extensions
- Navigation menus
- Landing pages
- robots meta tag or robots.txt file
- Error pages
- Introduction pages
- Dynamic content
In this post I want to explore some of these areas a little further from an SEO perspective. I will also be adding this post to my tutorials, where you can find a complete SEO package.
Ideally you should have SEO in mind from the outset when designing your pages. In most cases, however, a webmaster will design a site with the physical user in mind and will have to redesign the structure later down the line to make it SEO compatible.
If you are looking to restructure your site I would always recommend using a broken link checker. This will assist you when implementing changes.
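If you prefer to roll your own rather than use an off-the-shelf tool, the core idea is simple: pull the links out of a page and flag any that no longer resolve. Here is a minimal sketch in Python; `find_broken_links` and `check_status` are hypothetical names, and the status check is injected so you can plug in whatever HTTP client you like.

```python
# Minimal broken-link checker sketch (hypothetical helper, not a specific tool).
# Extracts every href from an HTML page and reports links whose status
# check does not come back as 200 OK.
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def find_broken_links(html, check_status):
    """Return the hrefs whose status check is not 200.

    `check_status` is any callable mapping a URL to an HTTP status code,
    e.g. a wrapper around urllib.request.
    """
    parser = LinkExtractor()
    parser.feed(html)
    return [link for link in parser.links if check_status(link) != 200]


page = '<a href="/good.htm">ok</a><a href="/gone.htm">dead</a>'
statuses = {"/good.htm": 200, "/gone.htm": 404}
print(find_broken_links(page, statuses.get))  # ['/gone.htm']
```

In a real run you would replace the `statuses` dictionary with live HTTP HEAD requests; the stub keeps the sketch self-contained.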
Anyway, let’s get down to the real meat of this area…
It is commonly known that most search spiders will not dive deeper than 2 levels into your site, so important information that you want indexed should be kept within this limit, e.g. http://www.yoursite.com/L1/L2/page.htm
Now those of you who are up to date with the SEO industry will know that powerful search engines such as Google can dive as deep as 4 levels. However, it is my own personal opinion that you should still keep the most important pages no more than 2 levels deep, because pages that are closer to the root directory will be given a greater degree of importance by the search engines.
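To make the “levels” idea concrete, here is a small sketch that counts how many directories sit between the root and a page; `url_depth` is a hypothetical helper name, not part of any library.

```python
# Sketch: count how many directory levels deep a page sits.
# url_depth is a hypothetical helper for illustrating the 2-level guideline.
from urllib.parse import urlparse


def url_depth(url):
    """Number of directory levels between the domain root and the page."""
    path = urlparse(url).path
    segments = [s for s in path.split("/") if s]
    # The final segment is the page itself; everything before it is a level.
    return max(len(segments) - 1, 0)


print(url_depth("http://www.yoursite.com/L1/L2/page.htm"))  # 2
print(url_depth("http://www.yoursite.com/page.htm"))        # 0
```

By this count, pages you care most about should score 2 or less.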
If you are the owner of a particularly large site (100 – 200 pages), your most important pages should be kept in the root directory; for smaller sites it is adequate to leave all pages under the root, placing your most important pages at levels 1 and 2. Another key point to remember is the use of your keywords; as I said in a previous post, keep your file-name keywords separated by hyphens and do not stuff them. I find 4 – 5 keywords is the maximum before it starts looking spammy.
I also want to touch briefly on dynamic URLs. As a rule search spiders have no problem reading these types of URLs; however, Google's own guidelines admit that dynamically generated web pages such as .asp, .php, and URLs with question marks in them can cause problems and may be ignored.
There are some workarounds for dynamic URLs using .htaccess, but that is for another post. For the purposes of this post it is simply better to use .htm, .html and .txt. For images stick to .jpeg, .gif, .png and .bmp.
Google/Yahoo/MSN do index other types of files but these are the ones that carry the most SEO value.
Remember to always use the ALT attribute to describe your images, as this will reap benefits in the image search results.
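Auditing a page for missing ALT text is easy to script. The sketch below (a hypothetical `MissingAltFinder` helper, built on Python's standard HTML parser) lists the `src` of every image that has no descriptive `alt` attribute.

```python
# Sketch: find <img> tags with a missing or empty alt attribute.
# MissingAltFinder is a hypothetical helper for auditing a page's images.
from html.parser import HTMLParser


class MissingAltFinder(HTMLParser):
    """Collects the src of every <img> lacking a non-empty alt attribute."""

    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attributes = dict(attrs)
            if not attributes.get("alt"):
                self.missing.append(attributes.get("src", ""))


finder = MissingAltFinder()
finder.feed('<img src="a.jpg" alt="blue widget"><img src="b.png">')
print(finder.missing)  # ['b.png']
```

Running something like this over your templates is a quick way to catch images the image search results will never see properly.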
Site navigation is an important aspect of your site structure and I would like to touch on a few forms of it in this post.
DHTML pull-down menu – this is a popular form of navigation, but far from the best.
Navigation buttons – slightly better, but you can still improve from an SEO perspective.
Hypertext links – the best form of site navigation for web optimisation.
For large sites there is a system known as “breadcrumbs” which shows visitors where they are and gives a direct path back to where they have come from. It normally looks a little like this:
home > seo > seo tutorials > site structure
This is a perfect form of navigation for larger sites, not only from a usability point of view but also from an SEO perspective, as it enhances the link structure through effective use of relevant keywords.
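Because the breadcrumb trail simply mirrors the directory path, it can be generated automatically. Here is a rough sketch; `breadcrumb_trail` is a hypothetical helper, and it assumes your directories follow the hyphen-separated keyword convention mentioned earlier.

```python
# Sketch: build a breadcrumb trail from a directory-style path.
# breadcrumb_trail is a hypothetical helper; hyphens in directory names
# become spaces so the keywords read naturally in the trail.
def breadcrumb_trail(path, separator=" > "):
    """Turn a path like 'seo/seo-tutorials/site-structure' into a trail."""
    segments = [s for s in path.strip("/").split("/") if s]
    labels = ["home"] + [s.replace("-", " ") for s in segments]
    return separator.join(labels)


print(breadcrumb_trail("seo/seo-tutorials/site-structure"))
# home > seo > seo tutorials > site structure
```

On a real site each label would also be wrapped in a hypertext link back to that level, which is exactly where the SEO benefit comes from.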
There is much more to write about on site structure, such as robots.txt and sitemaps; I will try to touch on these a little more in later posts. Site structure is something that is often forgotten but can be the easiest thing to fix, and it can provide quick, effective results, so try to keep it in mind when designing and tweaking your site.
Until next time
Author: Tim (292 Articles)
Tim Grice is the owner and editor of SEO wizz and has been involved in the search engine marketing industry for over 7 years. He has worked with multiple businesses across many verticals, creating and implementing search marketing strategies for companies in the UK, US and across Europe. Tim is also the Head of Search at Branded3, an SEO agency in Leeds.