
The End Of Google Rankings As We Know It! Rotational Rankings?


I always worry about posting new-found research, as some clever dick always pops up to tell you they discovered it in 1999. However, I have noticed something in the rankings recently that has made me rethink Google rankings.

The Story

OK so I decided a couple of weeks ago to rank for “SEO consultant”, just something I fancied really, not the most competitive term.

Anyway… I have obviously been keeping an eye on the rankings for this particular keyword, and a few more besides, over the past few weeks, and there have been some pretty dramatic trends in the rankings.

One day I checked and I was 7th for “seo consultant”; I refreshed and I was 15th, refreshed again and I was 4th!

It’s not just that keyword either. I searched “search engine optimisation” and I was 11th, refreshed and I was 26th, refreshed again and I was 7th.

Now rankings have always been up and down but this was a little crazier than usual.

Webmaster Tools Update

So… Google announced the Webmaster Tools update only a few days ago, which explains that they are now tracking click-through rates in the keyword sections of Webmaster Tools.

Now, I don’t know how accurate these figures are, and they seem to differ quite a bit from Google Analytics, but the data it presents is interesting.

Just take a look at the screenshot below:

[Screenshot: Webmaster Tools keyword data showing impressions by position]

Now I am only using “bing SEO” as the example but the data establishes a few things across multiple keywords.

The first thing you notice is that out of 320 impressions I only ranked first 260 times (these impression figures aren’t accurate, as I usually receive around 600 visitors a month on this keyword). This means I don’t simply rank number one; I only rank there around 80% of the time.
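
To make that arithmetic concrete, here’s a rough sketch of the calculation. The 320 and 260 figures are the ones quoted above; how the remaining 60 impressions split across other positions is purely a guess for illustration.

```python
# Rough sketch: turn per-position impression counts (the kind of data the new
# Webmaster Tools report exposes) into a "share of the time at each position".
# 320 total and 260 at position 1 are the figures quoted above; the split of
# the remaining 60 impressions is invented for illustration.
impressions_by_position = {1: 260, 2: 40, 3: 20}
total_impressions = sum(impressions_by_position.values())  # 320

for position, count in sorted(impressions_by_position.items()):
    share = count / total_impressions
    print(f"Position {position}: {count} impressions ({share:.0%} of the time)")
```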

The Old Theory

The old SEO theory determines your rank based on page optimisation and the number of links your page/domain has; the only thing affecting rankings would be the competition within Google’s algorithm. I now think this is outdated and wrong. Over the last 6 months I, and other SEOs, have seen massive fluctuations in rankings, suggesting a new way of ranking sites and determining traffic.

My Theory

Well, this is what I suspect. Google have confirmed nothing yet, and neither has any other SEO authority, but the evidence I have seen strongly suggests rotational rankings.

Let me explain… You do a search in Google on your main key term and you rank 4th, so you rank 4th? Wrong. You may rank 4th in that instance, but you could complete the same search 10 seconds later and rank 10th, a huge difference not only in ranking but in traffic too.

So… Google determines the rank of your site based on page factors and links, and then ranks you within a range of positions. It could be 5–10, or it could be position 2 to 4 pages back, depending on the person searching, their location, personalised search and the time of day.

Ultimately how well optimised your site is determines how often you rank in a top position.

So your aim is not to hold a fixed spot in the top 5, which is now impossible; your aim should be to rank there most of the time.
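
Put another way, under this theory “your ranking” is really a probability distribution over positions, and expected traffic falls out of that distribution. Here is a minimal sketch of the idea; the position probabilities, click-through rates and search volume below are all invented purely for illustration, not Google’s figures.

```python
# Sketch of the rotational-rankings idea: a page holds a distribution over
# positions rather than one fixed spot, and expected traffic is the sum of
# (chance of appearing at a position) x (assumed CTR at that position).
# Every number below is invented for illustration.
position_probability = {1: 0.60, 2: 0.20, 3: 0.10, 4: 0.10}  # how often you show there
assumed_ctr = {1: 0.35, 2: 0.15, 3: 0.10, 4: 0.06}           # guessed click-through rates
monthly_searches = 10_000                                     # hypothetical search volume

expected_clicks = sum(
    probability * assumed_ctr[position] * monthly_searches
    for position, probability in position_probability.items()
)
print(f"Expected monthly clicks: {expected_clicks:.0f}")
# Better optimisation shifts probability towards position 1 and the total rises.
```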

What Does This Mean?

It means quite a lot.

1 – Estimating the traffic you’ll get from a given position is now impossible

2 – Your ranking position can change every second, even if you’re searching from the same IP using an incognito window in Chrome

3 – You can never say “I rank number one”, only “I rank number one most of the time” (there’s a rough sketch of how you might measure that below)

You see, there are so many websites competing for the same space that Google has no choice but to use a rotational system like this, and for all we know they may use the click-through rate to decide whether your ranking gets better or worse. Maybe.
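
And if you want to put numbers on point 3, the only real option is to sample your ranking over and over and quote the share of checks at each position. A rough sketch of that kind of sampler follows; get_current_position() is a hypothetical stand-in for however you actually check the rank (a rank-tracking tool, manual checks logged to a spreadsheet, and so on).

```python
# Rough sketch: sample a keyword's ranking repeatedly and report how often the
# page shows up at each position. get_current_position() is a hypothetical
# placeholder for whatever rank-checking method you already use.
import random
from collections import Counter

def get_current_position(keyword: str) -> int:
    # Placeholder: swap in a real rank check. This just fakes the kind of
    # fluctuation described in the post.
    return random.choice([4, 4, 4, 7, 10, 15])

def sample_rankings(keyword: str, checks: int = 50) -> Counter:
    tally = Counter()
    for _ in range(checks):
        tally[get_current_position(keyword)] += 1  # space checks out over a day in real use
    return tally

if __name__ == "__main__":
    tally = sample_rankings("seo consultant")
    total = sum(tally.values())
    for position, count in sorted(tally.items()):
        print(f"Position {position}: {count / total:.0%} of checks")
```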

I’d love to know everyone’s thoughts on this and if anyone has noticed these changes.


Author: Tim (296 Articles)

is the owner and editor of SEO wizz and has been involved in the search engine marketing industry for over 9 years. He has worked with multiple businesses across many verticals, creating and implementing search marketing strategies for companies in the UK, US and across Europe. Tim is also the Director of Search at Branded3, a Digital Marketing & SEO Agency based in the UK.


Comments

David Alexander April 16, 2010 at 9:23 am

I’ve noticed this for several months, though I hadn’t analysed it at such a level; I presumed it was Webmaster Tools lying to me about my rankings. Sometimes they would be accurate, sometimes fairly accurate, and sometimes the link was further back, which I found frustrating but have come to accept. It would be interesting to use the Google Global Firefox plugin to investigate the geo side of it.


Tim April 17, 2010 at 5:49 am

Hi Dave,

I think geo plays a big part in any shuffle, but I think this is more of a ‘split testing’ exercise, seeing which sites get the clicks, maybe. Either that or there are simply too many sites of a similar strength and Google is switching it around to try and even the playing field.

Either way I am loving the switch, now ranking first page for “search engine optimisation” 40% of the time. Onwards and upwards :) Thanks for stopping by and the social promotion, appreciate it.


David Alexander April 17, 2010 at 6:37 pm

Hi Tim, yes, well, if they are doing it because, after trying to keep ahead of the SEO industry for some time, they have realised that a rotational system lets them test which sites get clicks (and are therefore relevant), level the playing field and reduce gross manipulation, it is probably a good thing for the web in general.

Glad you are receiving the benefits of it, very good to hear, and I must say that my organic traffic has constantly improved; for keyword hits we keep breaking our record pretty much every week. Not so much with quixotic, but GBC; I might do a case study on it once it reaches its first birthday, could make an interesting read. I shall let you know.

I’m by no means an SEO professional, but regardless of what a lot of people say about social media and the often irrelevant, high-bounce-rate traffic it generates, I do think it has an overall positive effect on your “chances” of getting better organic results in time.


Tim April 18, 2010 at 2:57 am

I think it is a well-known fact that I am not the biggest social media fan; however, I have seen the benefits of bookmarking in terms of ranking deep pages quickly. On top of this, I have seen fresh posts get more organic traffic when they are doing well on Stumble or Digg at the same time.

Social media does have its place in SEO, you just have to balance it out; over-indulgence in sites like Twitter, Stumble and Digg can mean you’re spending a lot of time driving non-paying visitors to your content. This can pay dividends, but if your main goal is sales or clicks on ads, it is never going to happen.

However, if you’re looking to increase brand awareness or simply build a readership, then there is no better way than social media; but if Google rankings are your goal, you need to be a little wiser about how you spend your time.

Thanks for the input Dave.


PotatoChef April 17, 2010 at 1:44 pm

I don’t see this as a completely negative thing. People who simply get a ton of links for a page and then move on, never to return, will have to change their ways.

The question is whether or not they are willing to change (read: do continuous work) to maintain a high ranking.

My guess is that a whole lot of people will not be willing to do the extra work. Sites that were put up 5 years ago and never touched again, but still rank, should start to diminish.

SEO is constantly changing; it is our job to change with it.

Good Eye Tim….


Tim April 18, 2010 at 2:52 am

Hi PChef,

I agree with you, this is not a negative thing at all; it makes any particular industry more competitive by having you aim for a section of the SERPs instead of a specific rank. It also means webmasters will need to be constantly working on their sites in an attempt to keep up the proportion of the time they rank for a particular key term.

For me, SEO at its base doesn’t change much; however, when it comes to techniques you have to be constantly trying new things in order to stay current and maintain rankings.


Kieran Flanagan April 19, 2010 at 5:38 am

There are so many variables affecting search now that it’s near impossible to measure success or failure by keyword rank. Traffic is the only true reflection of what is going on. I have noticed this as well for all clients: if I check their keyword positions after 6pm they are top 5, the next morning they are on page 2, and throughout the week it will fluctuate.
Looking at the new Google Webmaster Tools data highlights that keyword rank has little bearing on what traffic they are getting (if I assume GWT is accurate). A lot of this is due to personalized search, geo-targeting and a host of other factors.
If Google do take a lot more analytical data into account, then I would not be surprised if a lot of this is down to split testing, i.e. switching sites to see which return the most favorable data, such as organic CTR.
It’s quite frustrating at times, and you have to wonder how much of this is focused on getting people to spend cash on “predictable” PPC traffic.


Tim April 19, 2010 at 6:52 am

Hi Kieran,

Great analysis. I have definitely noticed an increase in the unpredictability of rankings over the last 4 weeks; at first I just thought it was the Caffeine update, but then I noticed rankings switching hourly on the exact same desktop. It is frustrating from a reporting perspective; however, we are looking to integrate Webmaster Tools data into our reporting. It also helps us prove to our clients that landing a particular ranking position is not the ultimate goal, developing traffic is…

Thanks for dropping in….


Jahnelle Pittman April 20, 2010 at 11:04 am

Love the analysis, Tim. I noticed the same, same ages ago… ;)

Seriously, I have to say I’m glad someone else spoke up; I thought my computer/browser/Google had gone on the fritz. Call up a client, “congrats, you’re in the #1 position!”, they pull it up and they’re in #6. On the client end it’s frustrating; on the SEO end, however, it’s exciting. Just one more piece of the puzzle we have to fit into the mix to get a complete picture for clients.

Okay, so we have to work harder. The downside may be convincing clients that long-term SEO is even more pertinent than ever. The upside may be that we, as SEOs, can spend oh so much more time focusing on all-important traffic rather than having clients complain because they aren’t in a specific spot – “how come we’re only #3?”. Ultimately, I agree with Kieran – we’re not in the business to get you ranking, we’re in the business to get you traffic.

We tell clients all the time: rank as high as you want, but if people aren’t searching for the term, there’s no traffic. Yes, ranking is important… (I’m going out on a limb here) somewhat. However, with these fluctuations, frankly I feel a huge amount of relief. Finally… FINALLY… maybe we can get past ranking and focus on what really matters to clients, if they could just see past all the crap they read – TRAFFIC!

Love the article and thanks for being brave enough to post it! Hate social media all you want; it’s going on Twitter!


Tim April 21, 2010 at 1:05 am

Hi Jahnelle,

Thanks for the comment. I think you’re right, the only thing this really affects is how we report or how we sell success to the client. It may also affect those who rely on pure rankings as a success metric.

As always with SEO, you are constantly having to re-educate yourself, your clients and those you work with.

p.s. I don’t hate social media, I just think people indulge in it to the point that they are losing money or time, which is one and the same thing in my opinion. All tweets, SUs, Diggs and Sphinns are appreciated! :)


Tom Girling April 21, 2010 at 3:37 am

Surely some of this is down to personalised search rather than rotational ranking?


Tim April 21, 2010 at 7:47 am

Hi Tom,

That was my first thought, that and geo-targeting; however, we have been monitoring over 100 keywords on .com, .co.uk and .au, and all are switching positions constantly across 12–15 industries. All ranking checks are carried out in Google incognito windows, which completely rules out personalised search and any other browsing-history issues.


Free Web Traffic April 21, 2010 at 10:09 am

You know, I have been thinking the same thing. A website is not ranked in the same spot all the time, and it really depends on the database that is pulling the info. You will not rank on the UK’s Google the same as you do worldwide. You may be right, it is a very interesting concept.

Kris,


Tim April 22, 2010 at 1:11 am

Thanks Kris,

It is interesting not because it changes much in terms of link building or on-site issues, but because it certainly changes people’s viewpoint on rankings.


don_7111 May 12, 2010 at 1:45 am

No one knows what Google is doing or will be doing. I don’t buy the rotational rankings, and here is why. Google’s goal is to get the most relevant sites listed for searches, so Google gives relevance and accuracy top priority over everything else. Rotational rankings would work against that goal. Does Google want to pull up less relevant sites from time to time just to make the whole thing interesting? If Google sacrifices relevance, that will be the end of this search engine.


Tim May 14, 2010 at 7:39 am

Hi Don,

The rotational rankings theory is not a myth, it is based on Google’s own data. Simply head over to Webmaster Tools and click on your keyword rankings individually; it will probably look a little something like this:

1st = 244 clicks = 22%

2nd = 34 clicks = 35%

3rd = 30 clicks = 40%

4 – 10 = 56 clicks = 5%

What I am saying is that Google is telling us, with this data, that you never permanently rank in a fixed position. We run ranking queries on over 500 sites and movement in the SERPs happens daily. Even our strongest brands only ever rank first 60% of the time, with the rest being divided between 2nd and 3rd.

The theory comes in when we consider why Google are doing this. Why so much switching around in the SERPs? Is it the Caffeine update, or is it something a little more systematic? Unfortunately there are thousands of relevant websites for any given search query, and even though Google has around 200 signals it uses to rank them, I am sure many come out pretty close together, so why not switch them around a little?

I think your line of thought, “Google is all about relevance and would only ever show the most relevant results”, is a little naive. Google gives a huge amount of weight to links, and this is how it determines relevancy; trust me when I say you don’t have to be the most relevant resource in order to rank well if you know how to build links effectively.
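
For what it’s worth, this per-position breakdown is also roughly what the ‘average position’ figure in Webmaster Tools summarises. A quick sketch of that calculation, using the counts quoted above as weights (a simplification, since Webmaster Tools weights by impressions rather than clicks, and the 4–10 bucket is collapsed to a rough midpoint):

```python
# Weighted-average position from the per-position counts quoted above. Treating
# click counts as "how often the page sat at that position" is a simplification,
# and the 4-10 bucket is collapsed to a rough midpoint of 7, purely for
# illustration.
counts_by_position = {1: 244, 2: 34, 3: 30, 7: 56}

total = sum(counts_by_position.values())
average_position = sum(pos * count for pos, count in counts_by_position.items()) / total
share_at_first = counts_by_position[1] / total

print(f"Average position: {average_position:.1f}")               # about 2.2 with these numbers
print(f"Share of the time at position 1: {share_at_first:.0%}")  # about 67%
```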


searchbrat May 14, 2010 at 8:40 am

I agree with this and it’s what I am seeing. The average organic position shown in GWT is pretty handy.


ben May 14, 2010 at 1:06 pm

I was 7th for the term ‘fix broadband’ and 10th for ‘repair broadband’ a few weeks ago.

However, I am now nowhere to be seen for those search terms, but am 11th when I search for ‘fix broadband problems’.

Do you guys use the nosnippet tag for Googlebot so the description doesn’t get shown in the search results, or do you not use it? I set my main site to use this tag yesterday to see if there is an improvement in ranks for different search terms.

I optimised every page of my websites (23 pages for one site) to score 100% on an SEO meta tag analyzer, but I have updated the site since and have not analysed it again, so this might be why it isn’t ranking as well as before.

I’m led to believe the different positions of URLs for search terms (obviously discounting history and when you are signed into Google) are because Google has so many servers and databases that do not update as regularly as each other, causing pages to rank differently sometimes.

It is annoying but something else to consider with SEO I guess.

I believe that inbound links aren’t as important as many people think they are, based on previous rankings: for searches with over 7 million results my site was on the first page despite having very few inbound links, probably 1 or 2, literally.

I also believe, from experience, that naming pages for search terms does SEO no end of good, as do XML sitemaps set to hourly with the last-updated date maintained. I know many people do not think this makes any difference, but I beg to differ.

Obviously I am in no position to say this when you are on the first page for ‘search engine optimisation’ <– that is amazing to be frank – nice work.


Tim May 17, 2010 at 1:17 am

Hi Ben,

Thanks for the in-depth comment. I personally like to use the meta description, well, on static pages anyway; I know it has no weight in terms of rankings, but I like to tailor the description for the user.

I think you are right in that on-page is very important, and having a clean, clear URL structure will help you get more traffic. However, in terms of rankings, inbound links are vital: you need a constant flow of inbound links from easy sources, but you also need some real authority links, links that pass a lot of trust. These are the hardest to find, but if you can get one a month it will help you rise up the rankings very quickly.

It took me just over a year to reach the first page for “search engine optimisation”; hours and hours of link building went into it, and I only hit the first page after getting a link (without the exact anchor text) from a highly trusted site. You just have to keep going.

With regards to your early rankings, I really can’t explain the drop. Often Google can enter new sites near the top of the SERPs, depending on competition and relevancy of course, however it is almost always short-lived as Google finds your site less and less when there are no inbound links pointing at it.


ben May 18, 2010 at 6:56 am

I have found that the description meta tag is how I got good rankings, rather than keywords. Keywords are something I tend to keep the same as the description, but with commas.

I still think a lot of it is down to keyword density, back-to-basics stuff, but the percentage of words in the text on the pages.

I did spend hours and hours optimising each page of my original site to 100% for description, keywords and title.

Is it best practice to keep the title under 80 characters, the description under 201 characters and the keywords under 20 words? Or is this an extreme limit, as I have seen a lot of sites with a 400-character description being ranked a lot better on Google.

I really need to look into getting high quality inbound links.

Do the links need to be reciprocal? My domain name was only registered 3 months ago and has gone through two different hosts, which obviously involved name server changes.

To get newly created sites on Google I have found adding them to Webmaster Tools and then Analytics helps, but not just this: making a HubPage with a bit of content from the page you want to link to, plus a link to the aforementioned page, helps a lot IMO. Doing this, I have gotten a website on Google within an hour of registering it, even though site:xxxxxxxxxxxx.co.uk showed no indexed pages.


ben May 18, 2010 at 7:01 am

Sorry, I meant I am 11th for ‘repair broadband problems’, not ‘fix’ any more, but for ‘fix broadband problems’ I am roughly 16th.

Sometime soon I am going to optimise properly for that search term as a test.


Tim May 18, 2010 at 7:23 am

Hi Ben,

With regards to word limits, you should look at no more than 60 characters for the title tag and 3–4 lines for the meta description. Meta keywords don’t have any impact on rankings. When link building, I suggest using article submissions and directory submissions to start with; then, when you have some authority, you could begin to look at some link partnerships, but don’t go overboard with it.

Submitting a sitemap to webmaster tools is one of the best ways to be found in my opinion, but getting indexed and getting rankings are 2 completely separate tasks with 100% different methods.
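
If it helps, a throwaway sketch of that kind of title/description length check might look like this. The exact cut-offs are assumptions (Google truncates snippets by pixel width rather than a fixed character count), so treat them as rules of thumb only.

```python
# Quick sanity check of title and meta description lengths against rough rules
# of thumb. The cut-offs are assumptions, not official limits.
TITLE_LIMIT = 60
DESCRIPTION_LIMIT = 160

def check_lengths(title: str, description: str):
    warnings = []
    if len(title) > TITLE_LIMIT:
        warnings.append(f"Title is {len(title)} chars (aim for {TITLE_LIMIT} or fewer)")
    if len(description) > DESCRIPTION_LIMIT:
        warnings.append(f"Description is {len(description)} chars (aim for {DESCRIPTION_LIMIT} or fewer)")
    return warnings

print(check_lengths(
    "Broadband Repair in North Devon | Example Ltd",  # hypothetical title
    "We diagnose and fix broadband faults across North Devon. Call for a free quote.",
))
```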


ben May 18, 2010 at 7:33 am

Thanks for the very informative reply, Tim.

I have been reading about directory submission and have submitted my site to one so far. I did find a site that would submit it to over 1,000, but obviously when I went through the pages it wanted payment, and I would not even want my site submitted to 1,000 directories in one go as I’d probably get kicked off Google!

I am going to submit my site to directories individually, as I do not want to pay for it.

I also believe sitemaps are very useful for good PR.

Do you think that updating the last-modified date in an XML sitemap helps at all? And while I’m on that topic, do you think the re-visit tag helps if it is set to 1 day? Because I don’t think Google even takes notice of that tag when set to 1 day.

Also, the http-equiv=no-cache tag – do you think that is a good or bad thing or neither?

Cheers,
Ben

mot test wakefield May 17, 2010 at 2:55 am

A very interesting article. I will have to try refreshing and see what happens.


judith May 17, 2010 at 9:03 am

Sounds like the best theory so far.


nocleg July 6, 2010 at 3:31 am

very good article thanks
jabolo


The Human CpU July 19, 2010 at 1:46 pm

I’ve noticed this as well. I would be looking at my stats and see a keyword pop up that is getting major attention. I’d do my own query for it and would see it pushed far back in the search again. Similar patterns on a lot of my niche sites.

I run an online storefront, however, that stays very consistent. But those rankings are based on more factors such as Google Base, ratings/relevance and some major backlinking.

I couldn’t put my finger on it exactly, but you seem to have hit the nail on the head. It all makes sense now. Cheers

CpU


Tim July 21, 2010 at 12:58 am

Hi there,

Yes, the ranking switches are difficult; I never rank in a permanent position for more than a week before it changes again. However, I firmly believe that my site appears on different pages depending on the user, location, etc.


TC October 19, 2010 at 2:38 pm

How interesting! Google hates affiliates and is NOW the biggest affiliate in the world rotating pages and testing CTR just like an affiliate to see which converts best! This is sure to maximize their content network revenue LOL!

Looks like I will be focusing more on titles and meta descriptions that show up in the SERPS too. Not just my linking profile.

I started following rankings over time using Authority Labs tool before launching a new link building campaign. After a couple weeks of collecting rankings I have started to see these fluctuations.

I am also somewhat relieved actually. I am thinking now I can worry more about creating the best content and user experience.


Tim October 20, 2010 at 12:56 am

Hi TC,

I think the ranking fluctuations will continue as Google is updating the index more regularly after Caffeine. You’re always going to need a strong link building campaign; however, as you say, great content and user experience are always going to pay dividends. Link building and ranking optimisation simply mean you’ll be at the top more of the time.


Ben May 29, 2012 at 12:38 pm

Hi there,

I have just found this article again from a backlinks checker, even though it’s nofollow.

To the point now. My website http://www.holsworthycomputing.com up until yesterday was first for “IT support north devon” on Google and also for “computer repairs north devon” and near the top for “IT support devon”

A page from the site that isn’t the homepage was top for “broadband repair” and “repair broadband”.

Suddenly for no apparent reason my site seems to be lower than it should be for the above search terms and I am clueless as to why this has happened.

I have a hundred and something backlinks, and the homepage is currently PR2 (not that it means much) and the rest of the internal pages are PR1.

Is there a chance Google have penalised me for having sites with similar content using different domain names? If so, how can this happen? Does Google automatically find them from crawling the sites and tie them together, or has someone reported it?

I have made an additional page targeting local areas for computer repair, and in 3 or 4 days it should be indexed.

Thanks for any help.

Ben


Tim June 12, 2012 at 8:29 am

Hi Ben,

If the content is very similar to the other sites, then there is a good chance Google have crawled the sites and associated them. Do the sites link to each other?

If it’s not the content, you will need to have a close look at your backlink profile; low-quality links have been hit badly in the last 3 months.

- blog commenting
- article directories
- blog networks
- forum spam and comment spam

Are all likely to get you hit. As well as this, look for any anchor-text-rich sitewide links; the Penguin update specifically targeted this type of link.


Ben June 12, 2012 at 9:47 am

Hi Tim,

Thanks for the reply. It has got me thinking that an older URL I have, with very similar content to the main one, is being ranked higher by Google than the main site, solely because it is an older URL, despite having a lower Alexa rank and PR.

My main site is only a PR2 site with roughly 150 inbound links and the sites don’t link to each other. I am just worried that the content is too similar and Google has caught up with this.

Would it help if I took the oldest site offline? I did have a lot of different domain names for my company but I am gradually letting them all expire apart from my main one.

I have a few backlinks from blogs, none in article directories, blog networks or forums; the vast majority are from other websites’ links pages, from PR0 up to PR3.

Thanks,

Ben


Tim June 12, 2012 at 12:31 pm

Hi Ben,

If you think this old domain has all the authority, and is therefore outranking your newer site, then I would consider using either a 301 redirect or a canonical link to let Google know which is the master copy.
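
For illustration only, this is what the 301 option looks like at the HTTP level, assuming (hypothetically) you could run a small Python process on the old domain. In practice you would normally do this with an .htaccess rule or a setting in your host’s control panel, and the canonical alternative is simply a <link rel="canonical" href="..."> tag on the duplicate pages pointing at the master copy.

```python
# Minimal sketch of a 301 (permanent) redirect from the old domain to the
# master copy. Hypothetical, for illustration: in practice this is usually an
# .htaccess rule or a host control-panel setting rather than a Python server.
from http.server import BaseHTTPRequestHandler, HTTPServer

MASTER_SITE = "http://www.example.com"  # hypothetical master copy

class PermanentRedirect(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(301)  # 301 tells Google the content has moved permanently
        self.send_header("Location", MASTER_SITE + self.path)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), PermanentRedirect).serve_forever()
```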

I would be careful with links from resource pages and other forms of link lists; ask yourself:

- are they relevant
- are the links genuinely helpful to users
- what anchor text is being used

We have seen Google pick sites up for links on giant link lists, even when the anchor text is brand-related.


Ben June 12, 2012 at 1:48 pm

Hi Tim,

The old domain has more authority for certain searches, and one thing I’ve noticed is that my homepage on the newer site doesn’t get picked up as often as before; the deeper pages seem to be getting higher rankings for certain searches, even when the keyword density is lower than on the homepage… some things like this I just cannot figure out.

With regard to the 301, would it be best to place a 301 from my old site to my newer site then? I know I can do this through webmaster tools.

Not all my links are relevant, and the outbound links on my links pages are not always going to computer-related pages either, so I guess I’ve made a big mistake there by linking to random sites and having them link to me.

Thanks,

Ben

Tim May 19, 2010 at 1:02 am

Hi Ben,

Keep your directory submissions relevant and, yes, don’t mass submit, especially not with a new site. XML sitemaps, in my opinion, only help with getting pages indexed; the re-visit tag will not speed up the process, and in fact I doubt it actually helps much at all. The more inbound links you get, the quicker your fresh content will be indexed.

The no-cache meta tag is only really of use on pages where you could have sensitive details such as credit cards and addresses. I wouldn’t recommend it on resource or content-heavy pages, as it can make crawling slower and slow down the website as a whole.


Ben July 6, 2010 at 9:19 am

Hi there,

I have submitted my site to a few directories now and it is doing okay on Google, Yahoo and Bing.

I am under the impression the re-visit tag is useless as Googlebot will just crawl it at random pretty much rather than go by a tag that it probably ignores completely.

The ‘last modified’ set to hourly in an XML sitemap – do you think that helps Google recrawl the page for new content? I personally see no difference in how sites get crawled with this set to weekly, monthly, daily or hourly; it seems to make no difference.

In England there is a site called Gumtree. I’ve found that if I post my business as a service on there, I can get the Gumtree link to the first page within an hour; but for the URL itself I have only found HubPages good at getting a new domain on Google very quickly, having used Digg, StumbleUpon etc. to no avail in that respect.

I am still working on getting better rankings for ‘repair broadband’ and ‘broadband repair’ and I am _usually_ 2nd for both.

The thing is, though, both searches normally return ~3.5 million results, but I searched for ‘broadband repair’ a few minutes ago and there were ~35 million results. It seems Google is very, very strange indeed.

I updated the page 5 days ago, as well as updating the sitemap last-modified date for the page, and Google has yet to re-index it with its new title.

It is using snippets from the page itself in the description. I am not sure if this is just because I am using the ‘noodp’ tag or there is another reason behind it, not that I mind much.

Do you know about ?

I use this tag because about 3 years ago I had a site that ranked very poorly, and two days after I added this tag it was on the first page of Google. But I expect nowadays it doesn’t help at all and Googlebot ignores it, or it always ignored it and my site fluked its way to the first page.

I’ve got nocache in the robots tag for a lot of my pages which I know is wrong but I guess Googlebot just ignores it.

Some of my pages have but I don’t seem to be getting any adverse effects from using it. The pages that tag is used on are not ones I need to rank well for SEO anyway.

Thanks for your help

Ben


Tim July 7, 2010 at 1:42 am

Hi Ben,

Wow long comment but I’ll try my best to break it down.

- Your sitemap setting will only really affect how often your sitemap is updated; Google’s crawl rate is based purely on PR and authority, and the more important your site becomes, the more it will check back for content.

- I think it’s a good idea to use user-generated content on HubPages and such to get your site indexed quickly.

- When assessing competition, completely ignore the ‘results returned’ figure; this has no bearing on competition. Check the top 3 pages and see which ones have the keyword in their title tag: these are your real competitors, the ones you should be looking to compete with (there’s a rough sketch of this check at the end of this reply).

- The noodp tag, which prevents Google from using the Open Directory description, should have no effect on rankings; however, just make sure your meta description is fully optimised. Even though this is not a ranking factor, it seriously impacts click-through rates.

Overall the best way to get indexed faster is by building more links, the more you have the faster your site will be indexed. Also have an audit of your main site architecture, make sure you are linking to your deeper pages from the home page, or at least from a second level file.
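
On the competition point above, here is a rough sketch of that title-tag check; the keyword and the list of top-ranking URLs are placeholders you would fill in yourself (gathered by hand or from whatever rank tracker you use).

```python
# Rough sketch of the competition check mentioned above: fetch each top-ranking
# URL and see whether the keyword appears in its <title> tag. The keyword and
# URL list are hypothetical placeholders.
import re
import urllib.request

KEYWORD = "broadband repair"                   # example keyword
COMPETITOR_URLS = ["http://www.example.com/"]  # hypothetical top results, gathered by hand

def page_title(url: str) -> str:
    with urllib.request.urlopen(url, timeout=10) as response:
        html = response.read().decode("utf-8", errors="ignore")
    match = re.search(r"<title[^>]*>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    return match.group(1).strip() if match else ""

for url in COMPETITOR_URLS:
    title = page_title(url)
    print(f"{url}: title={title!r}, keyword in title: {KEYWORD.lower() in title.lower()}")
```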


Tim June 13, 2012 at 1:01 am

Yes, I would definitely clean up the reciprocal linking; it’s a technique constantly being scrutinised by Google.

The 301 needs to go to whichever site you would prefer to rank; pick the one that is best optimised.

