The ‘Daddy’ Search Roundup of 2011

Well that’s it, another year has come and gone, and what a year it’s been for all those involved in Search! Panda killing websites from February, Google removing our precious keyword data and the demise of Yahoo Site Explorer :(. Sounds pretty grim; however, I personally think this year has been transformational from a Search point of view. A lot of the theories spoken of in 2010 have now been confirmed: social signals are playing a bigger role, Places optimisation has become more important, Google+ is attempting to take on Facebook, the list really does go on.

In my opinion 2011 has created more opportunities within the industry; sure, it’s pushed us out of our comfort zone, but that’s a good thing. As SEOs we now have to become more focussed on page structure, content quality, click-through rates, bounce rates, trust factors and developing a site into a brand, not just building anchor text links.

Yes, Google still has quite a way to go in terms of search quality, but I think 2011 has seen positive changes in the SERPs, changes that make the job of an SEO more challenging and diverse.

So with all this in mind I thought it appropriate to put together a roundup of all the key elements that have made this year what it is, in my opinion.

Let me warn you before you continue, this is going to be a monster, I’m talking 3000 words +, please do not go any further without a warm drink and a few snacks.

The problem with detailing such a busy year is deciding where to start, so much has happened, however I’ll follow some advice from one of my favourite reads this year ‘Eat That Frog’ and, well, eat that frog/panda!

The Panda Bomb!!

So, way back at the beginning of the year Google released a post about a new update that had rolled out across US sites.

If your traffic looked like this;

panda penalty / filter

…You were no doubt hit by Panda and saw anywhere between a 15% – 60% drop in organic traffic from Google.

This update was designed to weed out sites with low quality pages; however, what defines ‘low quality’ is still very much up for debate. Here are some general guidelines:

> Keep content as unique as possible

> Try to add value instead of rewriting the news

> Delete old content that adds no value

> Increase social signals and trust

> Focus on usability and page structure

> Minimise ads and affiliate links above the fold

The update soon hit the UK and the rest of Europe during April/May, leaving many webmasters wondering what was going on. The confusion was based on a couple of simple misconceptions. Many thought Panda was set to destroy websites with duplicate content, so article directories and scraper sites, right? The problem was that it was hitting websites that were in no way duplications, not internally or externally, so what was going on? The second misconception was that this would hit on a page level, so if you had some poor pages somewhere on your site, then only those would be dropped. However, Panda was based on a threshold: if your website tipped it then your whole site was hit, not just the crap scraped articles from 2005.

The truth is Google only wants to deliver the most relevant, most trusted, most authoritative results. Panda analyses pages based on the above guidelines and decides where domains should rank depending on how they meet these guidelines.

In May I wrote about City Visitor and how they had been badly hit by Panda, this was based on referrals some of our clients get from them:

city visitor panda penalty

City Visitor is basically a directory with very little in terms of content; why would Google rank it when they believe they do a better job with their Places listings?

My guess is this was tactical, but it gives some insight into what types of content Google considers to be ‘low quality’.

As SEOs we need to understand these factors so we can best advise clients on how to structure the pages of their site, convince them to invest in content and help them design pages that will not only attract search traffic, but also add value to the web as a whole.

The JC Penney Disaster!

Towards the beginning of the year someone thought it would be a good idea to ‘out’ JC Penney for link spam. The story ran in the New York Times, and the same day Google hit them with a penalty across all page levels.

Now this was unlucky for JC Penney; had they been a smaller brand it wouldn’t have been front page news and Google would have been unlikely to take any action. However, this was a warning to all major brands out there who are actively utilising SEO as a marketing channel. It’s surprising how many brands are running the risk of being called out because they won’t give SEO the investment it needs; instead they pay peanuts and get a load of crap paid links.

The mistake JC Penney made was stupid, the links were in no way defensible, but the worst thing about the whole affair was that they paid an agency to go out there and do it!

Big brands need to take SEO seriously, invest in getting the right things implemented by the right people; short cuts simply aren’t acceptable when you are under the microscope.

Social Signals Really Work..

social media optimisation

In late 2010 we found that tweets which linked through to pages seemed to give a ranking boost, and we used this to influence rankings for a number of core terms with nearly a 100% success rate. This year we are seeing more and more that socially active domains are able to rank much faster and much more easily than sites that aren’t. It’s still a very difficult thing to measure, and allocating budget to it is a challenge, but building social referrals and mentions to the pages of your site is a sign of trust and does have a positive impact on your overall SEO strategy.

One thing we know for sure, Google wants to give weight to social and referral signals;

“Content recommended by friends and acquaintances is often more relevant than content from strangers. For example, a movie review from an expert is useful, but a movie review from a friend who shares your tastes can be even better. Because of this, +1’s from friends and contacts can be a useful signal to Google when determining the relevance of your page to a user’s query. This is just one of many signals Google may use to determine a page’s relevance and ranking, and we’re constantly tweaking and improving our algorithm to improve overall search quality. For +1’s, as with any new ranking signal, we are starting carefully and learning how those signals affect search quality.”

This statement is specifically referring to +1’s, however we have seen positive impacts from Stumble, Twitter, Facebook, Reddit and other social sites.

How these signals work is still not fully understood; there was talk of a ‘SocialRank’ around a year ago, however nothing more has been confirmed on the exact signals. My own view is that social signals seem to be noise signals: it’s a quantity game, and the more people recommend something the better the results.

Having said that, I do believe authority and relevance will work their way into the mix: how popular are the accounts pushing the content, how old are they, how well used are they?

This will have to be worked out before too much weight is given to social signals; otherwise it is too easily manipulated.

Google’s Social Push

google plus the social network

The last section brings us on nicely to Google’s attack on the social space and using referral data to drive their rankings.

Back in June Google launched ‘Google Plus’, a social platform looking to take on Facebook, with one subtle difference: Google understands that not every relationship is the same. You may choose to share things with family that you wouldn’t share with colleagues, and for this reason Google promotes Circles. Circles are groups of people within your network that you can separate and share with on different levels, a brilliant idea, although I believe a social network stands and falls on simplicity and this may hinder world domination.

Anyway, that’s enough of what Google + is, how is it going to affect our SEO efforts?

To be honest we are struggling to find any correlation between Google + shares and rankings at the moment. Google have said they will be slowly integrating the data into their algorithm but as yet I personally haven’t noticed a change in organic results. However, others have tried testing Google +1’s with some pretty compelling results.

I think there is quite a long way to go before Google are able to gauge enough information to drop the link graph, if they ever could at all. Google +1 sharing suffers the same problem as any other signal, it’s easily manipulated. 2012 will be interesting and no doubt Google + will become more and more important.

Googlebot Capability

googlebot capability

SEOs and JavaScript have never really mixed, due to Google’s inability to successfully crawl and index content behind Ajax and JavaScript. However, Matt Cutts announced around a month ago that Googlebot now has the ability to do this, and straight away Facebook and Disqus comments were popping up in the SERPs.

“Googlebots, or the spiders that crawl web pages, are now reading Facebook comments on websites just like any other text content and the more interesting part is that you can also search the text of these comments using regular Google search.”

Should we leave best practice and start dynamically inserting content with JS? Well there are some that think it’s not an issue anymore, however there are plenty that still consider good old html to be the best way to feed Google your content. As for me, I need to see a lot more evidence to consider abandoning best practices.

There was a good test carried out here which basically shows Google is capable of a lot more than we think; however, this isn’t always reflected in the SERPs, if at all.

Taking all this into account I think it is very important that we fully understand what Googlebot can and can’t do, and what information it is collecting and counting from our pages.

With 140+ thumbs up on SEOmoz you have probably read this already, but if not, take some time out to read Mike King’s post on SEOmoz about the ‘headless browser’; there’s some great insight and ideas, and the comments are also well worth a read.

The key takeaway from the post was that we need to be more focussed on the user experience; Google has the ability to understand what our pages look like, how a user may interact with them and, as we have known for a while, what sits above the fold.

Judging by the amount of rubbish in the SERPs, Google is not yet actively basing rankings on usability; however, you can bet that as Google gets more sophisticated and as the Panda algorithm develops, user experience is going to be a key factor.

Google Freshness

Freshness is massive at the minute in my opinion, but not because of the new results we’re seeing for certain terms, instead I think (and have done for 18 months now) that the freshness of pages we get links from has a massive impact on rankings.

I posted back in March about links in fresh content and then again about ‘FreshRank’.

Historically SEOs have used PageRank, MozRank, Link Authority or some other link based metric to decide where to build links. This is dated in my opinion; the new formula should be a mix of domain authority and quality, fresh content. These two factors have played a huge role in the rankings we have managed to achieve in the last couple of years.

I am pretty sure that Google is using a ‘freshness’ score to help decide how important links are, but the more important factor of fresh link building is that they seem to last!

Add social campaigns to your fresh link building campaign, plus a few pieces of viral content on top of that and you’ll almost have an unstoppable mix. Lazy SEO will almost always lose to this ..

New Google Analytics + Features

We got a brand new Analytics interface this year with some nifty new features to help us optimise our campaigns.

Mashable do a great job of breaking down the top 10 features, however there are a few that I particularly like.

Multi-Channel Funnels

Multi-channel funnels are a set of reports to help you understand the conversion journey. Historically, analytics packages have credited either the first or last interaction with the conversion; however, a visitor may have hit your site multiple times through multiple channels in the course of a conversion, and multi-channel funnels allow you to see all of this data!
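To make the difference concrete, here’s a rough Python sketch comparing last-click credit with a view that also counts assisting channels. The conversion paths are made up purely for illustration, not pulled from any real account:

```python
from collections import Counter

# Each conversion path is the ordered list of channels a visitor
# touched before converting (illustrative data, not real exports).
conversion_paths = [
    ["organic", "ppc"],
    ["organic", "email", "direct"],
    ["ppc"],
    ["organic", "direct"],
]

# Last-click attribution: only the final channel gets the credit.
last_click = Counter(path[-1] for path in conversion_paths)

# "Assisted or closed": a channel gets credit whenever it appears
# anywhere in the path, not just at the final interaction.
assisted = Counter(channel for path in conversion_paths for channel in set(path))

print("Last-click conversions:", dict(last_click))
print("Conversions assisted or closed:", dict(assisted))
```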

Since the day it landed we have been collecting data to understand the full value of SEO, not just the first or last conversion value but any assists, quickly finding out that natural search is anywhere between 30 – 60% more effective at driving leads/sales.

Read more about it here.

Speed Reporting As Standard

In times gone by you would have to add a little extra code to get site speed data from analytics, with the new version it comes as standard.

Not only is this important for making sure your site is fully optimised, but it is also excellent for auditing your site. You may be getting a high bounce rate on a certain page, or you may see a drop in conversions; in both cases speed may be an issue, and it is now easily established whether that is the case.

Social Engagement

With so many businesses eager to get on board the social train it’s great to see Google coming up with a tool that helps us analyse social interactions.

Google Analytics integrates with ShareThis and AddThis to itemise social activity, allowing you to see what content has been shared and how much, essential for optimising your social strategy.

Google Places

In my opinion many businesses are failing to unlock the full potential of Google Places; either they are dismissing it or they are too lazy to do a proper job of making it work. Over the past 12 months we have had significant success attracting location based traffic through Google Places, and with 53% of mobile searches having local intent it’s something you cannot afford to miss out on.

At the end of 2010 Google changed the layout of Google Places, giving it more prominence on first page results. Since then they have been gradually optimising it, making it more and more important for local businesses to be listed. Google has no intention of rolling any of this back, so we need to get on board and optimise.

There are a few basic elements of Places optimisation:

1 – Claim your business

2 – Name, Address and Phone Number must be accurate and match any citations

3 – Use relevant keywords within listing

4 – Build citations on relevant directories

5 – Get reviews on Google!

Some also claim that good on page optimisation on your main site and external links are still a factor, but the above are fundamental. You can read more about it in this great Google Places Guide.

Intellectual Page Selection

It used to be that whatever page you built links to, that page would rank, however towards the end of 2010 this changed and has been a consistent feature since.

Google will now take into account your internal site structure, specifically your internal anchor text, to decide which pages should rank. You may have a load of links with the anchor text ‘floppy slippers’ pointing to your homepage, but if you have a category page with the internal anchor text of ‘floppy slippers’, that is the page that is going to rank.
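A quick way to sanity-check which page your internal linking is nominating for a term is to tally anchor text from a crawl export. This is only a sketch; the CSV name and columns are assumptions, so adjust them to whatever your crawler produces:

```python
import csv
from collections import defaultdict

# internal_links.csv is assumed to have columns: source,target,anchor
# (the file name and layout are hypothetical - match them to your crawler's export).
anchor_targets = defaultdict(lambda: defaultdict(int))

with open("internal_links.csv", newline="") as f:
    for row in csv.DictReader(f):
        anchor_targets[row["anchor"].strip().lower()][row["target"]] += 1

# Which page gets the most internal links for a given phrase?
term = "floppy slippers"
counts = anchor_targets.get(term, {})
for target, count in sorted(counts.items(), key=lambda x: -x[1]):
    print(f"{count:4d} internal links -> {target}")
```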

I wrote about it here back in April: you need to focus on creating relevant hubs on your site, leaving your homepage to deal with all your brand anchor text.

Yahoo Site Explorer

Around 3 weeks ago Yahoo Site Explorer finally gave up the ghost….. A moment of silence please…….

Yahoo Site Explorer has been around since 2005 and although it wasn’t the most user-friendly of link tools, it was certainly the quickest to update and often gave great insights into competitor strategies.

It has been rolled up into Bing Webmaster Tools; however we’re still coming to terms with the loss and haven’t delved too deeply into Bing.

What Happened to Keyword Data!

In October Google decided they wanted to make life difficult for the SEO world by encrypting search sessions of anybody signed in to Google.com.

Apparently this was for privacy reasons … cough! BS! … but it is proving a bit of a nightmare since it rolled out. Across all the sites I work with, the average amount of keyword data affected is 12%; that’s a lot of data for websites that get hundreds of thousands of visits every day.

As it stands we don’t fully know what the impact will be, but it is still rolling out and the percentages could get higher!

It’s The Industry To Be In!

I honestly believe the Search & Digital Marketing industry is the most exciting place to be in this day and age, it is fast moving and a constant challenge to keep up to date with, but the diversity appeals to me and to a lot of other people who move into the industry on a monthly basis.

The recent Search Benchmark report suggested companies are looking to increase their online spend over the next 12 months in SEO, Social and PPC.

search marketing benchmark report

Where else can you reach an audience as targeted as with search marketing?

This is just one of the very positive industry reports with another suggesting SEO is the marketing channel to rule them all!

Small businesses were asked ‘if you had to put all your budget and time into one channel, which would it be?’ The answer from 32% of them? S E O..

seo to rule them all

This all makes for a very exciting 2012. I know at Branded3 we’re seeing massive growth in SEO, Social Media and Online PR, and as the market place becomes more educated, so the demand for transparent, quality services will grow.

Anyway, I think I better leave it there, this blog post is taking its toll but I hope it makes up a little for the lack of content in 2011; I have plans to be a lot more active in 2012.

So, if I have missed anything important (which I’m sure I have) please add it in the comments with any relevant links, but for now it’s goodbye until 2012. I hope everyone who celebrates it has a great Christmas and happy New Year! For those that don’t, have a great holiday!

Until next year……

Wow, how could I forget rich snippets!!!! Please see the following, massively important;

Rich Snippets Everywhere

Google’s Take

Rich Snippet Must Read

Negative & Positive Link Spikes – Tripping Anchor Text Filters

The term ‘link spikes’ has been around a long time, usually to describe unnatural link patterns that Google uses to put the smack down on overly aggressive link builders.

Using the term in this way is correct and yes, creating crazy link spikes in your link profile will put you in trouble.

What people don’t realise is that it isn’t just positive link spikes that can cause issues but also negative ones. Spikes from lost links can be as damaging as spikes from link growth.

Link Spikes

Of course losing links can mean lost rankings, however in my experience lost links can also trip filters and mean a specific page on your site will just never rank, no matter what you do.

Negative Link Spike Case Study

I had the unfortunate opportunity of experiencing a filter on an affiliate site of mine. After renting a few links for 6 months I decided to pull them down, knowing I had plenty in reserve to keep my rankings.

These links were site wide and all had the same anchor text; when they were removed I lost 52,000 links from 3 domains. Technically this should have been a good thing, as the links were completely unnatural and obviously paid.

Link Spike Link Loss

Two days after the removal of these links my site dropped rankings, not just for the target keyword but for any other search terms containing the target term. Elsewhere organic traffic was growing, and growing well, but this keyword and its variations weren’t even in the top 100; you can clearly see the effect on traffic below for the page where the links were pointing.

traffic link spike

The first thing I did was try and replace the lost links. I didn’t think this would work but wanted to see what happened; the answer? Rankings dropped even further. Now I had created 2 unnatural link spikes in a row and knew my rankings would never recover.

Sometimes when you begin working on a new site you may come across ugly site wide links that you think need cleaning up, however do this with extreme caution, negative link spikes are as dangerous as positive ones.

Is Fresh Rank More Important Than Page Rank?

First of all let me confess the term ‘fresh rank’ has been stolen from fellow SEO blogger Justin Briggs, I am going to refer to one of his excellent posts throughout the rest of this one.

You will no doubt know about Google’s new QDF upgrade, an algorithm tweak designed to get you to ‘fresh’ content quicker, rather than bringing up old static results.

You can see an example of it here;

QDF Serp

Those aren’t sitelinks but links to fresh content on the BBC for the search term ‘football’.

Google has stated that this affects around 35% of search queries (don’t get that mixed up with searches). Now that is all well and good, but from my point of view I want to know a few key things:

1) How does Google decide what is fresh?

2) Is the link graph involved when deciding ‘freshness’?

3) How do links from these ‘fresh’ pages influence rankings for the target website?

I wrote a really short post a few months ago based on fresh links vs text links vs links placed in old content. The results clarified that links in fresh content had a more significant impact on rankings.

However, I feel the need to delve more into this, as I think this strategy is one of the most important link building tasks you can undertake and will help you cement long term core rankings.

How Does Google Determine Freshness?

Justin wrote a great post on this on his blog and if you really want to delve into this you should definitely go take a read, I just want to touch on some of his points and then try to understand how we can use it for link building purposes.

Document Discovery

I think it is safe to assume that the discovery of a document by Google’s crawler for the first time is enough to indicate freshness. There is a little bit of debate around this, but nothing that affects the takeaways too much: does Google count a document as fresh when it is first crawled, first linked to, first mentioned in a social capacity, first indexed…? No one really knows, and the truth is it is probably a mixture of all of those factors.

Proportion of Change

Is a document only fresh if it hasn’t been discovered before, or is it also fresh if it has significantly changed since the last crawl? Does Google give the content a score on a sliding scale?

Fresh Rank Score

This is important to know: if the freshness of a document determines the power of a link from that document, then we need to know what this scale is. Google is never going to give anything away, and I doubt a new FreshRank toolbar is going to emerge, however we can test this through our link building efforts.

Fresh Rank

This brings us nicely onto ‘fresh rank’, a term I am fully attributing to Justin. The paper Systems and Methods for Determining Document Freshness describes a method of passing a freshness score between pages.

So just as PageRank is passed between pages, so is FreshRank: whether or not your landing page is ‘fresh’ will depend not only on the changing content on the page itself but also on the freshness of the pages linking in, hence we have some kind of freshness score.
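To picture the idea (and this is a toy illustration of the concept above, not the method described in the paper), a single propagation step might pass a share of each page’s freshness along its outbound links, much like a simplified PageRank calculation:

```python
# Toy single-step propagation of a "freshness" score along links.
# Pages, links and scores are all invented for illustration.
links = {
    "news-post": ["landing-page"],
    "old-static-page": ["landing-page"],
}
freshness = {"news-post": 0.9, "old-static-page": 0.1, "landing-page": 0.0}
damping = 0.85  # borrowed from the classic PageRank formulation

propagated = dict(freshness)
for source, targets in links.items():
    for target in targets:
        # Each source shares its freshness equally across its outbound links.
        propagated[target] += damping * freshness[source] / len(targets)

print(propagated)  # the landing page inherits far more from the fresh source
```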

Is Freshness More Influential Than PageRank?

I use the term PageRank very loosely and only to describe the authority of a domain or page based on the quality and quantity of links it has pointing to it.

We all know how PageRank travels around the web and through our sites; it’s long been the currency of the web: the more links a site or page has pointing to it, the more value a link from it will pass.

Now before I go into this any further, let me first declare this has not been tested or researched in any way, my opinion is based on working with an SEO department that manually builds over 5,000 unique links every month.

The PageRank Model

PageRank Model

Value is passed from page to page based on popularity.

FreshRank Model

Fresh Rank Model

I am a firm believer that the above model is already in place, and has been for some time, and that Google uses a combination of FreshRank and PageRank to determine the ranking of a given page.

No one knows the exact calculation, but it could be that a link from a fresher PR1 page is worth as much as one from a static PR6. Maybe that is overselling it a bit, but certainly getting multiple fresh links every month can be just as effective as acquiring one high PR link a month. I guess the ideal is to combine the two.
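Purely as a thought experiment, and with weights that are nothing more than guesses on my part, you could model the value of a link as a blend of the linking page’s authority and its freshness:

```python
def toy_link_value(pagerank, freshness, authority_weight=0.6, freshness_weight=0.4):
    """Illustrative only: blend a 0-10 PageRank-style score with a 0-1 freshness
    score (scaled to 0-10). The weights are guesses, not anything Google has published."""
    return authority_weight * pagerank + freshness_weight * (freshness * 10)

# A fresh PR1 page vs a static PR6 page under these made-up weights:
print(toy_link_value(pagerank=1, freshness=0.9))  # 4.2
print(toy_link_value(pagerank=6, freshness=0.1))  # 4.0
```

Under those made-up weights the fresh PR1 just edges out the static PR6, which is roughly the intuition I’m describing; the real calculation, whatever it is, will be far more involved.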

I’ve worked with many sites over the years and the best way to impact rankings is by creating a fresh link profile. The ultimate combination would be creating a fresh link from a high PageRank (authority) domain, combining high PageRank with a FreshRank strategy, is a sure way of dominating your industry SERP.

Diminishing Value

The one problem with a ‘fresh’ link building strategy is that it is likely to diminish in value over time, therefore efforts have to be ongoing.

PageRank flowing through a link is going to be more consistent than FreshRank flow; a document rarely keeps acquiring a large quantity of links naturally, however once a document has a certain amount of PageRank it generally keeps it, as long of course as the links remain live and the PageRank isn’t pushed through multiple 301s.

As you can see from the above model, links carry both PageRank and FreshRank; however, what about 6 months later?

Diminishing Fresh Rank

Over time the freshness of the linking documents will diminish; no doubt every time Google crawls them it will give them a new score. If only we could build lots of links from fresh, high domain authority pages that are going to be continually linked to for the rest of their existence ;)
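If you wanted to put a shape on that decay, a simple half-life curve is one way to think about it; the three-month half-life below is a pure guess on my part, not anything Google has confirmed:

```python
import math

def freshness_after(initial_score, months, half_life_months=3.0):
    """Hypothetical decay: the freshness score halves every few months."""
    return initial_score * math.exp(-math.log(2) * months / half_life_months)

for months in (0, 3, 6, 12):
    print(months, "months:", round(freshness_after(1.0, months), 3))
```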

Overall Thoughts

I have seriously thought about researching this in more depth, but from the results we see on a daily basis it is pretty much a given. SEO strategies that involve the consistent development of links from fresh pages will almost always achieve higher rankings than those that don’t.

Only Google will know exactly how this works, and one day they may be willing to shed a little more light on it, but that won’t stop me and many other SEOs implementing it.

Would be great to hear some debate on this or examples of tests/research.

Getting a Site Indexed and Monitoring Indexation Levels

OK, I’m going to cover some basics here but time and time again I get asked about monitoring indexation levels and what factors affect them.

There are quite a few misconceptions in this area and a lot of people rely on the wrong set of data, they also rely on old techniques to improve this, so let’s have a go at discussing some of the main areas.

HTML & XML Sitemaps

What does a sitemap really do? My opinion? Absolutely nothing in terms of getting a site indexed. It’s an auditing tool to monitor and test the architecture of your website; it doesn’t matter how many sitemaps you have, if the structure of your site is poor you’re going to have low indexation levels.

This is especially true of XML sitemaps; however what about HTML site maps?

Again, in my opinion HTML site maps are overrated, used either to cover up poor navigation or in the hope that they will create a magical ‘thumbs up’ signal to Google. The truth is most HTML site maps are not user friendly and consist of a pile of links spread out over many pages, especially if you have a large site. With Panda hitting websites hard, the last thing I would want on my site is a load of pages filled with HTML links tipping me over the low quality threshold.

The only time HTML site maps are effective, in my opinion, is when they genuinely help users navigate their way through a site; they should include descriptive text as well as links to the various areas of the site. However, if you own a website with 100,000 product pages the last thing you want to do is create a site map linking to them all. The way your site is structured should help users and Google find all those products easily and effectively.

So to round up: yes, use an XML sitemap to audit your site; don’t create an HTML site map unless it is user friendly and in place to help those users find their way to important parts of your site.

Try using multiple XML sitemaps to monitor the performance of different areas of your site.
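As a sketch of how that might look in practice, the snippet below writes one sitemap per section of a hypothetical site plus an index file tying them together; the URLs and file names are placeholders:

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def write_sitemap(filename, urls):
    # One sitemap per site section, so indexation can be monitored separately.
    urlset = ET.Element("urlset", xmlns=NS)
    for url in urls:
        ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = url
    ET.ElementTree(urlset).write(filename, encoding="utf-8", xml_declaration=True)

# Hypothetical sections of an example site.
write_sitemap("sitemap-products.xml", ["http://www.example.com/products/widget-1"])
write_sitemap("sitemap-blog.xml", ["http://www.example.com/blog/first-post"])

# A sitemap index ties the section files together for Webmaster Tools.
index = ET.Element("sitemapindex", xmlns=NS)
for fname in ("sitemap-products.xml", "sitemap-blog.xml"):
    ET.SubElement(ET.SubElement(index, "sitemap"), "loc").text = "http://www.example.com/" + fname
ET.ElementTree(index).write("sitemap-index.xml", encoding="utf-8", xml_declaration=True)
```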

Performing a Site Search

We’ve all done it and still probably do it, however this is a really inaccurate way of monitoring how many pages are actually indexed on your website. You can do one search and a search a minute later that fetches a different result.

However, if you know there are roughly 500,000 pages on your site, then doing a quick search can give you a very broad understanding of how well you are being indexed.

So go easy using the site: operator :)

Using Analytics

This is the best way of understanding not only how well your website is indexed but also the quality of those pages, it’s been spoken of many times before but let’s go over it again.

Log in to Google Analytics, go into Traffic Sources and select google / organic, then select a secondary dimension of ‘Landing Page’. This will show you how many pages Google sent traffic to over a certain period of time; monitoring this figure every month gives you a really good indication of the indexation levels of your website, and I would use and monitor this figure over anything else.

landing pages from google
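If you export that report to CSV, a few lines of Python will give you the unique landing page count each month. The file name and column header here are assumptions based on a typical export, so rename them to match yours:

```python
import csv

# Assumes a Google Analytics export with a "Landing Page" column,
# already filtered to the google / organic source.
landing_pages = set()
with open("organic_landing_pages.csv", newline="") as f:
    for row in csv.DictReader(f):
        landing_pages.add(row["Landing Page"])

print(f"Pages receiving Google organic traffic this period: {len(landing_pages)}")
```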

So there we have a few ways of monitoring but how do we get more of our site indexed?

Flat Architecture

Sorry to go back to basics, but I want to cover this for the sake of completeness.

First thing: don’t confuse getting your site crawled with getting it indexed; these are two completely separate things. Once Google finds out your site exists I have no doubt it will crawl your whole site at some point; however, in my experience getting good amounts of your site indexed comes down to one factor, and that is trust.

Having a flat architecture is all about creating the shortest route possible to the pages on your site, how many clicks are your major pages away from the top level?

flat architecture
(image from SEOmoz)

It’s simple really, the closer a page is to the top level the more trusted it is going to be and therefore has more chance of being indexed.
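If you want to put an actual number on click depth, a simple breadth-first search over your internal link graph does the job. The link graph below is invented for the example:

```python
from collections import deque

# Hypothetical internal link graph: page -> pages it links to.
site = {
    "/": ["/category-a", "/category-b"],
    "/category-a": ["/product-1", "/product-2"],
    "/category-b": ["/product-3"],
    "/product-1": [], "/product-2": [], "/product-3": [],
}

depth = {"/": 0}
queue = deque(["/"])
while queue:
    page = queue.popleft()
    for linked in site.get(page, []):
        if linked not in depth:          # first time we reach this page
            depth[linked] = depth[page] + 1
            queue.append(linked)

for page, clicks in sorted(depth.items(), key=lambda x: x[1]):
    print(f"{clicks} click(s) from the top level: {page}")
```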

It all comes down to links

It’s easy to get a page ranked by building lots of anchor text rich links to a page, however if you have done this before you will also have realised it has very little impact on the overall trust of a website.

I speak with clients about this a lot; we point out an overall increase in traffic and indexation levels and the client will say ‘but you were trying to rank us for X, not the other keywords that drove traffic; this was just natural growth that would have occurred anyway’.

Trusted, quality links will affect the overall organic traffic to a website and are absolutely essential for seeing a continual growth in organic traffic.

Yes by all means build optimised links to your site and go after rankings but you have to incorporate a strategy for getting links from the best sites in your industry, the most trusted sites, without them your website will never perform to its full potential.

The above are the main factors, but since Panda, site speed and quality of content are having more and more of an effect on the indexation of a website. Check the speed and make any necessary changes to rectify speed issues; also check for duplicate pages or pages with low quality content.

Remember, Panda is a ‘threshold’ based algorithm; removing as many low quality pages from your site as possible could be all it takes to sort out Panda issues.

As ever, I would love to hear your thoughts on this in the comments.

Baby SEO Launches – An SEO Company Focussed On Delivering Results For Small Businesses

baby seo

I have been working in SEO for over 7 years now and in that time I have had the opportunity to work with and discuss SEO services with many small business owners with limited budgets for investing in search engine optimisation. Many of them have invested £300 – £600 to find that the basics have not been covered and in effect they have been paying for a ranking report and some title tag optimisation.

This has always been a frustration of mine and with every client I have worked with I have always stressed the importance of link building and educated the client with regards to real ranking factors. Working with small businesses is a lot different to working with brand clients, both in terms of budget and in strategy.

As of today I am pleased to announce that Branded3 and I have teamed up to create Baby SEO, a company dedicated to delivering results on smaller budgets and being absolutely transparent about the work we do. As a consultant I have worked with many small businesses and have worked out efficient ways of delivering results on smaller budgets.

Today is the official launch; however, over the last month or two we have gained new clients and have a small business SEO team in place. Packages range from £300 – £1500 per month, giving any business an opportunity to progress their SEO work. If you have friends or clients that you think would benefit from this, we are currently offering a 20% referral fee and also offer a white label option, perfect if you’re a web designer or developer wanting to offer your client base genuine SEO services.

Please get in touch if you would like to know more about the service.

Thanks

Factors of a Natural Link Profile – It’s More Important Than Ever

Technically SEO could never be classed as natural, you’re undertaking activities to game the engines, whether you’re a spammer or a link bait specialist we’re all trying to achieve the same things, higher rankings and more traffic.

That said whatever activity you choose to undertake, you absolutely have to make it look natural, I don’t know why so many people struggle with this concept but with the recent changes and updates it’s more important than ever to come across as ‘genuine’.

So we’ve established SEO isn’t natural but we have to make it look like it is, how do we do it?

Over the years I have had the opportunity to work on all types of sites, corporate giants to brand new start ups, no matter what type of site you work on the results always last longer when you make things natural.

Let’s go through a few things that make up the natural mix:

Anchor Text

Good old anchor text! Not sure if you’ve all noticed recently, but Google has been tightening its filters: sites with a lack of brand anchors, variation and a mix of image vs text links have been getting hit. With this in mind it’s more important than ever to build your anchor text profile using all of these elements:

Keyword Variation

If you’re trying to rank for a particular term don’t focus all your anchor text on the exact term, you will get filtered out eventually. I find the best way to pick anchor text variations is by using Google Suggest and the Keyword Tool;

google suggest long tail variation

google keyword tool suggestions

Use multiple terms regularly to keep on the right side of Google; it will also help you rank for more long tail variations. In terms of getting the right mix it’s difficult to say, but I personally like to use no more than 30% exact match, unless there is a good reason to be more aggressive, such as a peak season.
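A quick way to check your own mix is to classify the anchors from a backlink export and look at the proportions. The classification rules and data below are purely illustrative (re-using the ‘floppy slippers’ example from earlier), with the ~30% exact match guideline taken from above:

```python
from collections import Counter

TARGET_TERM = "floppy slippers"   # hypothetical target keyword
BRAND = "example brand"           # hypothetical brand name

# Illustrative anchor text list - in practice pull this from a backlink export.
anchors = ["floppy slippers", "example brand", "click here",
           "cheap floppy slippers", "example brand", "fluffy floppy slippers"]

def classify(anchor):
    anchor = anchor.lower()
    if anchor == TARGET_TERM:
        return "exact match"
    if BRAND in anchor:
        return "brand"
    if TARGET_TERM.split()[-1] in anchor:
        return "partial / long tail"
    return "other"

mix = Counter(classify(a) for a in anchors)
for label, count in mix.items():
    print(f"{label}: {count / len(anchors):.0%}")  # aim for roughly 30% or less exact match
```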

Brand Anchor Text

I blogged about this last year and quite a few people rubbished it, however I am sure most have seen positive results from building brand into their profile.

Again not sure what the exact mix is but if your anchor text profile is less than 30% brand orientated then I’d say it’s something you need to look at.

It’s not a case of doing this alone and you get high rankings, however if you are stuck in position 5 – 10 and just can’t break into that top 3 pack, then maybe this is something you need to look at.

Location of Links

I am seeing more and more link profile issues relating to the location of  links. If your site is based in the UK, you only ever do business in the UK and all your links are from the US then do you really think that is going to go unnoticed forever?

Blekko is pretty good for understanding the location split of your links.

blekko suggestions

Site Wide vs In Content

Again, another obvious one: site wide links have been associated with link buying for a long time, and towards the end of last year they were hit pretty hard. My advice would be to only use them when relevant, and try to keep them brand based. In-content links are so much more natural. Think about it: when do people link? A brand new site or a new piece of content will obviously attract lots of new links, however an established site would rarely keep updating external links on an old page. If all you do is acquire links on static pages your strategy is seriously flawed; stick with getting fresh links on new pages.

The idea of a ‘freshness’ boost has been talked about for years but it works, I would take a fresh page on a decent domain over a PR9 link any day.

Authority Spread of Incoming Links

Again, I have written about this in the past and as far as I can tell it is still a major factor in establishing how natural the link profile of a website is:

blekkosuggestions

You can use Open Site Explorer to get the data, and here’s a guide to pulling it into Excel.

If all your links come from domains with an authority of less than 20, the chances are you’re only doing low quality publishing on article directories and other low quality blogs. If all your links are from domains above 50, you’re probably spending a fortune on high PR links on authority sites. Even if all your links are in the middle somewhere, you’re probably buying from a link broker.

What you should be trying to achieve is a very even spread of domain authority with links from every source possible, this to me would more likely be a natural profile.
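Once you have the export, bucketing linking domains into authority bands only takes a few lines. This assumes a CSV with a ‘Domain Authority’ column; the file name and header are placeholders for whatever your export actually uses:

```python
import csv
from collections import Counter

buckets = Counter()
# linking_domains.csv is a placeholder for your Open Site Explorer export.
with open("linking_domains.csv", newline="") as f:
    for row in csv.DictReader(f):
        authority = int(float(row["Domain Authority"]))
        band_start = (authority // 10) * 10          # e.g. 37 -> 30
        buckets[f"{band_start}-{band_start + 9}"] += 1

for band in sorted(buckets, key=lambda b: int(b.split("-")[0])):
    print(f"DA {band}: {buckets[band]} linking domains")
```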

Types of Links

This depends on the type of site you run, but I find it best to cover all bases. Be smart about it, though: if you run a 3 page site selling training services, you’re hardly going to be linked to by lots of fresh articles on blogs, but links from lots of business directories might be conceivable. If you run a blog then of course you should acquire more fresh links and social mentions.

However no matter what site you’re running I would always recommend a good mix, directories, press hubs, blogs, social sites and forums. Mix it up as much as possible.

Justified Links

If you have a site that never updates then your link velocity is going to be low, maybe a few partners and industry relevant websites every month but anything more than that is just going to look too unnatural. If you’re constantly increasing the content and pages on your site then you’re naturally going to warrant a higher link velocity.

Do things on page to justify the amount of links coming back to your site.

Natural Link Patterns

I think one of the worst link strategies you can adopt is ‘quality and relevance only’, why?

Think about it: if all the links your site ever acquires sit on high authority static pages that are relevant to your industry and have a certain number of links pointing at them, does this really look natural? Would a link profile growing like this ever look natural?

Let me share a good example for you;

I got a link a couple of weeks ago from a post on the SEOmoz blog; a great link, super relevant, and it even had good anchor text :) Happy days? Well yes, however a link on SEOmoz was not all I got: within 48 hours the post that the link was pointing at got 210 trackbacks, 210!!! These were from all kinds of sources: scraper sites, news hubs, blogs rewriting and republishing.

I think Google has to take this kind of link activity into account: one quality link and hundreds of crap ones within 24 hours? Do I get penalised? Is this unnatural? No and no. So why aren’t SEOs taking this into account when running link building campaigns? Yes, get quality links, but don’t just leave it there.

I’m sure there are many other linking factors to consider when trying to establish a natural link profile; however these, for me at least, seem to be the most prominent. Please feel free to discuss any others in the comments.

Understanding & Dealing with Pagination Issues for Google

Those of you who keep on top of the Google Webmaster blog will have no doubt read about their recent pagination fix, or at least their advice on how to deal with it and become more Google friendly.

What is pagination?

For those of you who aren’t sure what pagination is, in layman’s terms it’s a way of splitting a list of web data across multiple pages, most commonly seen on e-commerce sites displaying multiple products, like here; however it can be used for many different reasons, including:

Forums

Most forums have tons of replies to threads, from a user point of view you want to break these up (pagination), however Google would prefer all these replies or comments to be on the same page.

Blogs

Most blogs list a certain number of posts on the home page and then use pagination to allow the user to flick through historic posts, again it saves having a page 10 feet long, but Google don’t like having to trail through all the paginated pages.

E-commerce

Most e-commerce sites make use of pagination on product pages, if you have 900 products in a particular category, you hardly want to list them all on one page? So you use pagination to make it more user friendly, however Google is only ever going to crawl so much of the pagination.

Why is it an issue?

OK, we all understand what it is and why it is used; however, what you may not realise is that it can cause huge issues from an SEO perspective. Think about it: if one of your category/thread/blog pages has 25 pages of pagination for Google to crawl through, how far do you think they are going to crawl? Well, probably all the way, however why would Google index pages that are 25 clicks away from the top level?

This is the issue: helping Google understand that these paginated pages are important, inasmuch as they contain valuable content and links and, in effect, are all part of the same page.

The noindex and canonical mistakes

Historically pagination has never been dealt with very well; in particular, using the rel=canonical tag here is a big no no… Webmasters would use the canonical to point all the paginated pages back to the first page of the pagination, basically telling Google this is the only page that holds any importance. So what about all the content and links on the paginated pages, do you not want these indexed and followed?

Using the rel=canonical to sort this is not the way and will mean a large quantity of your site does not get indexed.

Some webmasters also opt to noindex everything but the first paginated page to avoid duplicate content issues; however, this has the same sort of problem, in that Google will not index all the pages linked to within the pagination.

So we want Google to index all the links and content in the paginated section of the site, but Google don’t want to have to index every page to get this done. Thankfully, Google have now offered a little advice on the subject.

Google’s solution to pagination woes!

Recently Google have come up with some recommendations around pagination and I just want to spend a little bit of time looking at them and how we implement them.

Rel=next / Rel=prev

Much like the rel=canonical, we can use rel=next and rel=prev to let Google know about pagination elements on your site and to indicate the relationships between the pages. So on page 1 of the pagination you would add a rel=next tag pointing to the next URL; every other page in the pagination needs to implement next and prev pointing at the previous and subsequent URLs (the final page only needs a prev). Read more here: Google Pagination.
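To make that concrete, here’s a small sketch that generates the tags a paginated series would need. The URL pattern and function are made up for illustration, and the optional canonical line anticipates the view-all recommendation further down:

```python
def pagination_links(base_url, page, total_pages, view_all_url=None):
    """Build the <link> tags for one page of a paginated series.
    base_url, view_all_url and the ?page= pattern are illustrative only."""
    tags = []
    if page > 1:
        tags.append(f'<link rel="prev" href="{base_url}?page={page - 1}" />')
    if page < total_pages:
        tags.append(f'<link rel="next" href="{base_url}?page={page + 1}" />')
    if page == 1 and view_all_url:
        # Optional: point the first page's canonical at the view-all page.
        tags.append(f'<link rel="canonical" href="{view_all_url}" />')
    return "\n".join(tags)

# Page 2 of a 25-page category on a made-up site:
print(pagination_links("http://www.example.com/widgets", page=2, total_pages=25))
```

Page 1 of the series would get only a rel=next (plus the optional canonical), the last page only a rel=prev, and everything in between gets both.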

Adding these tags in has a couple of effects:

A) It lets Google know that all the content and links can be attributed to the first page rather than being spread out along 25 pages

B) It helps Google to understand which page is the most useful, where to send users and ultimately which pages to index.

The View all Solution

According to Google, having a view all page that clumps all your pagination together is a better user experience. I’m not so sure personally, but let’s trust them on this occasion. However, apparently there is another reason to implement your view all page, and that’s because this is really what Google wants to index. So the recommendation is as follows:

a) Use the next/prev tags to indicate the relationship between paginated pages
b) Create a view all page as this is what Google really wants to index

I would also recommend using the rel=canonical on the first page in the pagination and pointing it to the view all page, this will prevent you confusing Google and getting mixed results across all your pagination.

How Can it Affect Quality Signals?

Google have stated that they will carry on trying to index pagination and deliver the best result to searchers regardless of the above advice. However, my feelings are that you need to follow the above advice in order to increase the quality of your site structure. We all know that Panda was rolled out to squash sites with low quality content, however from the examples I have seen this doesn’t just relate to the actual text but also relates to how pages/content is structured.

Pagination is popular among e-commerce websites, and so is having duplicate product descriptions that are used all across the web. In order to stay out of trouble you need to provide Google with as much information as possible and take on any little bits of advice that are thrown your way: make use of microformats, get listed in Google Places and implement the above pagination advice. I personally think reacting to these changes is a sign that your website is genuine and therefore puts you out of harm’s way.