03-18-2012, 09:40 AM
Google wants to see a natural mix of links.
That means the mix of links pointing to your site should include:
Different top level domains
Do-Follow and no-follow
Different anchor texts
Different platforms (WordPress, static HTML, etc)
Different types (articles, videos, etc)
Different PRs
Different ages of your links
Google knows the natural distribution of these elements. It also knows the variation between sites in your industry/niche (the standard deviation if you want the mathematical term).
So if your site's mix is roughly the same as most of your competitors, that's probably natural. And if it's wildly different, that's probably not natural. (There's no hard-and-fast rule here, though.)
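To make that "standard deviation" idea concrete, here's a minimal sketch of the kind of check you could run yourself: compare one element of your link mix (say, the share of exact-match anchor text) against a handful of competitors. All the numbers below are invented for illustration, not real data.

```python
from statistics import mean, stdev

# Hypothetical example: share of exact-match anchor text
# observed across several competitor sites (invented numbers).
competitor_shares = [0.08, 0.12, 0.05, 0.10, 0.07]

my_share = 0.45  # our site's exact-match anchor share (invented)

avg = mean(competitor_shares)
sd = stdev(competitor_shares)
z = (my_share - avg) / sd  # how many standard deviations from the niche norm

if abs(z) > 2:
    print(f"Outlier: {z:.1f} standard deviations from the niche average")
else:
    print("Within the normal range for this niche")
```

The exact threshold (2 standard deviations here) is an arbitrary choice for the sketch; the point is simply that a mix far outside the spread of your niche stands out.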
.com domains make up around two thirds of all domains registered.
It’s difficult to get hold of exact figures as the main comparison sites like Registrar Stats miss out country level domains but .com still has the lion’s share.
At the time of writing there were 97 million .com domains and some 45 million domains with other extensions such as .net (14 million), .org (9 million), .uk (9 million), .info (8 million) and .biz (2 million).
There were just over 8,400 .edu domains. That's less than a hundredth of one percent of all domains in existence, which makes links from them hard to come by. It also calls into question whether it's advisable to have a high percentage of your inbound links coming from .edu sources, because a mix like that just can't be natural.
The same kind of logic goes for all the other factors involved in the links pointing back to your site.
So if you’re using a network of private blogs as part of your backlink strategy, the mix in that network should be about right to protect you in the long term.
The best way to ensure a reasonably natural mix of backlinks, one that falls within Google's parameters for what looks natural, is to vary where you get your backlinks from.
It's human nature to repeat things over and over again, which is why so many people only build one or two types of link. That's a big mistake.
Let's see how you can prevent those mistakes, and take your SEO to another level!
Anchor Text
Google likes to see this vary because in the real world, not everyone would use the precise same words to describe your site:
They might mix the order of the words up
They might add or subtract words
They might just use the URL of the page rather than any other text
They might just say "click here"
They might mess up the code on their web page and not even link properly to your page
If you’re asking for links, it follows that you should mix up what’s in the anchor text if you have any influence over it.
I aim for a mix of anchor texts to mimic how they would get created in the real world.
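One easy way to see whether your own anchor texts look varied is to tally them from a backlink report and flag anything that dominates. This is just a sketch with an invented list of anchors and an arbitrary 40% threshold:

```python
from collections import Counter

# Invented sample of anchor texts pulled from a backlink report.
anchors = [
    "best dog toys", "best dog toys", "best dog toys",
    "dog toys", "click here", "www.example.com",
    "best dog toys", "this site", "best dog toys",
]

counts = Counter(anchors)
total = len(anchors)

for anchor, n in counts.most_common():
    share = n / total
    flag = "  <- heavy use, consider varying" if share > 0.4 else ""
    print(f"{anchor!r}: {n} ({share:.0%}){flag}")
```

In this made-up sample, "best dog toys" accounts for over half the anchors, exactly the kind of repetition that looks unnatural.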
Do-Follow/No-Follow
The one that most webmasters debate about is "Do-Follow" or "No-Follow".
This is partially a misnomer – Google has been shown to follow (and therefore index) links regardless of which instruction is given.
"No-follow" actually means "don't pass any authority with the link" and was originally designed to show when a link was a paid advertisement.
Nowadays it’s more likely to be used for different reasons:
Some webmasters like to conserve the authority of their pages and not spread it across tens or even hundreds of links.
Some sites use Do-Follow as a reward for good behavior and following their rules.
Some people suggest that pages such as your Terms and Conditions page should be marked No-follow but there’s no evidence either way as to whether this is good practice.
Some blog platforms such as WordPress default to making links in comments No-follow.
Personally, I don’t particularly care whether a link is marked No-follow or not.
I'm looking for a natural mix, and there would be nothing more unnatural than if almost every link pointing to my site was a Do-follow one (Do-follow is the default: if a link isn't marked No-follow, it's automatically Do-follow).
And I don't worry if the webmaster messes up the code and the link never ends up in a proper anchor tag at all. Google doesn't worry about this and seems to treat these as regular links.
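If you want to check your own link profile programmatically, the rel attribute is the only thing you need to inspect. Here's a small sketch using just the Python standard library; the HTML snippet is made up for the example:

```python
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Collect (href, is_nofollow) pairs from an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        rel = (attrs.get("rel") or "").lower().split()
        self.links.append((attrs.get("href"), "nofollow" in rel))

# Invented sample page with one followed and one no-followed link.
html = '''
<p><a href="https://example.com/a">followed link</a>
<a rel="nofollow" href="https://example.com/b">nofollow link</a></p>
'''

parser = LinkAudit()
parser.feed(html)
for href, nofollow in parser.links:
    print(href, "-> no-follow" if nofollow else "-> do-follow")
```

Run this over the pages that link to you and you can see your do-follow/no-follow split at a glance; as the article says, a profile that's 100% do-follow is itself a red flag.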
Link Footprints
A footprint is any common, detectable pattern shared by your backlinks. That could be as simple as only ever using YouTube videos to link back to your site.
Or commenting only on BlogEngine blogs because they have a higher chance of your comments being auto-approved.
Or only getting Do-Follow links.
Or using the exact same anchor text every time.
Or anything else that could give Google a clue that your link building isn’t natural.
Let’s take a look at private blog networks for a minute. They often claim to have no footprint but is that really the case?
To start with, they’d need most of the domains to be .com because that makes up roughly two thirds of the domains in existence.
If they’ve gone for a cheap private network then maybe they’ve chosen a lot of .info domains, even though these are less than 6% of total domains registered.
Then there’s the registrar they’ve chosen.
Love them or hate them, GoDaddy accounts for almost a third of all domain registrations. So if the private network has all (or none) of its domains with GoDaddy, that would be a footprint.
Then there's hosting. Figures for this are harder to come by, but companies like HostGator, 1&1, etc. are big. So if none of the domains in the private network were with any of the big hosts (and Google will know all this from the IP addresses, as well as the nameservers and other information it has access to), then that's a footprint as well.
These anomalies may well fly under the radar for a few years. It's happened before and will happen again. If Google think that a particular pattern or footprint is distorting their results, they'll make the necessary changes. (Remember the Panda 3.3 update that swept away some major blog networks?)
So the smaller the footprint your backlinks create, the better.
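One footprint you can sanity-check yourself is TLD skew: tally the extensions of your linking domains and compare them against rough global shares. The domain list below is invented, and the expected shares (the ~66% .com and ~6% .info figures from earlier in the article) plus the 3x-over-expected threshold are illustrative assumptions only:

```python
from collections import Counter

# Invented list of domains linking to a site.
domains = [
    "alpha.info", "beta.info", "gamma.info", "delta.com",
    "epsilon.info", "zeta.info", "eta.net", "theta.info",
]

tlds = Counter(d.rsplit(".", 1)[-1] for d in domains)
total = len(domains)

# Rough global shares mentioned earlier in the article (assumptions).
expected = {"com": 0.66, "info": 0.06}

for tld, n in tlds.most_common():
    share = n / total
    note = ""
    norm = expected.get(tld)
    if norm is not None and share > 3 * norm:
        note = f"  <- footprint? {share:.0%} vs ~{norm:.0%} globally"
    print(f".{tld}: {n} ({share:.0%}){note}")
```

In this made-up profile, .info makes up 75% of linking domains against a ~6% global share, exactly the cheap-network pattern described above.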
Quality? Quantity?
There’s always a heated debate whenever this subject comes up in link building: is it better to have a handful of carefully created backlinks or thousands of low value links?
If you do a search for almost any topic, you’ll find the answer: both methods work. There will almost certainly be sites in the results that have almost no backlinks and others that have thousands listed.
You can use tools such as Market Samurai or Traffic Travis to gather this information for you. The free version of Traffic Travis works fine for a quick snapshot if you don't use the SEO for Firefox plugin. These kinds of tools can help you find out how your competitors are ranking, but don't let them distract you from the actual task of getting backlinks.
It’s important to remember that even if you build thousands of links at once using some automated methods, it will take Google some time to find and index them.
Whilst it’s evaluating them, your listing may move up and down the search results. This is known as the Google Dance and is perfectly normal.
One of the components of Google’s algorithm takes care of working out whether something is breaking news or not.
It's perfectly natural for lots of links to appear when a predictable event like the Super Bowl or an unpredictable event like a major hurricane occurs. Google takes account of this, along with the information it has about the number of searches being performed (and umpteen other factors), and can give a temporary boost if everything meets the correct criteria. When the event is over, the site will drop down in the search results until the main algorithm works out its correct resting place.
Conclusion
This was the theoretical part. Next up, I'll be covering how to actually diversify links, avoid leaving a single footprint, and serve the Panda without being its slave.
It's REALLY important to understand this before we move on to the various (awesome) link building methods!