Anchor Text Variation & Analysing Links in 2011

A lot has changed this year in terms of how we analyse competitors, and particularly their link profiles. It's no longer enough to pull the links of the top 20 sites; you need to dig deeper than that, thanks to recent changes in Google's SERPs.

The problem is quite straightforward: late last year Google decided that the page with the most links wasn't necessarily going to be the one that ranked. Instead, Google now picks what it thinks is the best page to rank on your domain, regardless of where the links are pointing. Many websites have seen deep pages with zero links spring to life thanks to Google's new emphasis on relevance. However, this leaves us with a few issues:

Where do we build links?


How do we know which links are helping our competitors rank?

To quickly see what I mean, you need only search the term 'seo': there are at least two listings with pretty much zero links other than internal ones, yet they are ranking for a hugely competitive term.

They are still ranking based on links; the only difference is that Google has picked a more relevant page, and seems to be deciding this based on content, URL structure and internal links.

So you want to rank for a term. To do so, you're going to need a picture of every single link to your competitor, not just the ones to the ranking page. The problem is that there isn't a tool that lets you export all the links to a given domain; I like Open Site Explorer, but even that limits you to 10,000 links.

For me, the best way around this is to use the site: search modifier to discover what Google thinks are the most relevant pages for the keyword you wish to target.

So, in a few simple steps:

1 – Find the competitors that rank for your target term

2 – Perform a site: search to find their most relevant pages

3 – Pull the links from the top ten pages into excel using a link tool of your choice

4 – Go have fun :)
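The steps above are easy to script around. Here's a rough Python sketch of steps 1–3, assuming you'll run the site: queries in Google by hand and export the link data from your tool of choice (the function names and example domains here are hypothetical, not part of any real tool's API):

```python
def site_query(domain, keyword):
    """Step 2: build the site: search that surfaces the pages Google
    considers most relevant for your target term on a competitor's domain."""
    return f"site:{domain} {keyword}"

def top_pages(result_urls, n=10):
    """Step 3: keep only the top n pages from the site: search, ready to
    feed into a link tool and pull into Excel."""
    return result_urls[:n]

# Step 1: the competitors you found ranking for your target term
competitors = ["competitor-one.co.uk", "competitor-two.com"]
queries = [site_query(d, "car insurance") for d in competitors]
print(queries)
# → ['site:competitor-one.co.uk car insurance', 'site:competitor-two.com car insurance']
```

From there, paste each query into Google, grab the top ten URLs, and run them through your link tool as in step 3.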

Nine times out of ten this will give you all the links you need to effectively target the term.

In terms of how you build links my advice hasn’t really changed:

  • Build links to the most relevant page
  • Build Brand and Brand + Anchor Text to the homepage
  • Build a good number of variations to your internal/targeted pages
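As a rough illustration of that mix, here's a minimal sketch; the variation types and phrasings below are my own examples, not a formula:

```python
def anchor_plan(brand, keyword):
    """Sketch of the mix above: brand and brand + keyword anchors for the
    homepage, plus a good number of looser variations for the
    internal/targeted pages."""
    homepage = [brand, f"{brand} {keyword}"]
    internal = [keyword, f"best {keyword}", f"{keyword} guide", f"cheap {keyword}"]
    return {"homepage": homepage, "internal": internal}

plan = anchor_plan("Acme", "widgets")
print(plan["homepage"])  # → ['Acme', 'Acme widgets']
print(plan["internal"])  # → ['widgets', 'best widgets', 'widgets guide', 'cheap widgets']
```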

This brings me on nicely to my next point…

Anchor Text Variation

Now, we have known for a long time that anchor text variation is good practice, sometimes for nothing more than dodging filters; however, I am increasingly beginning to think it is a major factor in its own right.

Last weekend there was an algorithm shift in the UK: not Panda, which hit on Monday, but something that happened over the weekend. On initial analysis, all we could see was that sites with a good mixture of anchor text that included the target terms were benefiting.

Over the past couple of days I have been pulling links for the top ten ranking sites for 10 core keywords we track, and the results seem to suggest a definite correlation between good anchor text variation and rankings:

[Chart: anchor text variation vs. rankings]

As you can see, top-ranking sites have better variation, as well as more anchor text from unique domains.

This is correlation, not causation, but I for one will be looking to spread my anchor text a little further.
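If you want to run the same check on your own data, here's a minimal sketch of the two measures, assuming you've exported (anchor text, linking domain) pairs from your link tool; the sample data is made up:

```python
def anchor_stats(links):
    """links: list of (anchor_text, linking_domain) pairs.
    Returns the two measures discussed above: how many distinct anchor
    texts a site has, and how many distinct domains those anchors come from."""
    anchors = {anchor.lower().strip() for anchor, _ in links}
    domains = {domain.lower() for _, domain in links}
    return {"unique_anchors": len(anchors), "unique_domains": len(domains)}

sample = [
    ("cheap widgets", "blog-a.com"),
    ("Cheap Widgets", "blog-b.com"),   # same anchor, different case
    ("widgets", "blog-b.com"),
    ("Acme", "news-site.co.uk"),
]
print(anchor_stats(sample))  # → {'unique_anchors': 3, 'unique_domains': 3}
```

Run it per competitor across your keyword set and you can eyeball whether the top-ranking sites really do show more variation.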

A Little on Panda

I don't know about everyone else, but this Panda update is crazy!

Google sold it as the 'content farming' killer; however, we are seeing some really strange results, with sites full of original content being hit and sites that publish nothing but duplicated content left untouched. Weird.

We currently have access to more than 100 analytics accounts and are trying to figure this out. Don't believe everything you read, either: a recent report on Search Metrics suggested a list of losers from their data, but I can tell you that 5 of the sites on their losers list actually saw traffic increases from Google!

If anyone is suffering from Panda please drop me a line, we’re looking into a number of cases free of charge in an attempt to get our heads around this update.

We know it relates to site quality, but we haven't identified the individual factors, given the random variation in the sites being hit, and as far as we can tell no one else knows the exact factors involved either.