
Creating Link-Worthy Content and Link Marketing : Further Refining How Search Engines Judge Links

1/10/2011 11:14:47 AM
Many aspects are involved when you evaluate a link. As we just outlined, the most commonly understood ones are authority, relevance, trust, and the role of anchor text. However, other factors also come into play.

1. Additional Link Evaluation Criteria

In the following subsections, we discuss some of the more important factors search engines consider when evaluating a link’s value.

1.1. Source independence

A link from your own site back to your own site is, of course, not an independent editorial vote for your site. Put another way, the search engines assume that you will vouch for your own site.

Think about your site as having an accumulated total link juice based on all the links it has received from third-party websites, and your internal linking structure as the way you allocate that juice to pages on your site. Your internal linking structure is incredibly important, but it does little if anything to build the total link juice of your site.

In contrast, links from a truly independent source carry much more weight. Extending this notion a bit, it may be that you have multiple websites, perhaps with common data in their Whois records (such as the IP address or contact information). Search engines can use this type of signal to treat cross-links between those sites more like internal links than like inbound links earned on merit.

Even if the websites have different Whois records, if they all cross-link to each other, the search engines can detect this pattern easily. Keep in mind that a website with no independent third-party links into it has no link power to vote for other sites.

If the search engine sees a cluster of sites that heavily cross-link and many of the sites in the cluster have no or few incoming links to them, the links from those sites may well be ignored.

Conceptually, you can think of such a cluster of sites as a single site. The search engine can algorithmically treat the cluster as one site, with links between its members not adding to each other's total link juice score. The cluster as a whole would then be evaluated based on the inbound links it receives from outside the cluster.

Of course, there are many different ways such an approach could be implemented, but one thing is clear: building a large number of sites just to cross-link them with each other has no SEO value.
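
To make this concrete, here is a minimal sketch (in Python) of how cross-linked sites might be grouped by shared registration data and how links inside the resulting cluster could be discounted. The data structures, field names, and grouping rule are hypothetical illustrations, not an actual search engine implementation.

    from collections import defaultdict

    def build_clusters(sites):
        """sites: dict mapping domain -> {'ip': ..., 'contact': ...} (hypothetical Whois data).
        Groups domains that share an IP address or contact record."""
        key_to_domains = defaultdict(set)
        for domain, record in sites.items():
            key_to_domains[('ip', record['ip'])].add(domain)
            key_to_domains[('contact', record['contact'])].add(domain)
        cluster_of = {}
        for group in key_to_domains.values():
            merged = set(group)
            for domain in group:
                merged |= cluster_of.get(domain, set())
            for domain in merged:          # simplified merge of overlapping groups
                cluster_of[domain] = merged
        return cluster_of

    def countable_links(links, cluster_of):
        """links: list of (source_domain, target_domain) pairs.
        Keeps only links that cross cluster boundaries -- the ones that look
        like independent editorial votes rather than internal links."""
        return [(source, target) for source, target in links
                if target not in cluster_of.get(source, {source})]

In this toy model, a cluster whose sites have no inbound links from outside the cluster contributes no countable votes at all, which mirrors the point above.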

1.2. Linking domains

An editorially given link to your site from a third-party website is always a good thing. But if more links are better, why not get links from every page of these sites if you can? In theory this sounds appealing, but search engines do not count multiple links from the same domain cumulatively.

In other words, 100 links from one domain are not as good as one link from 100 domains, if you assume that all other factors are equal. The basic reason for this is that the multiple links on one site most likely represent one editorial vote. In other words, it is highly likely that one person made the decision. Furthermore, a sitewide link is more likely to have been paid for.

So, multiple links from one domain are still useful to you, but the value of each added link diminishes as the quantity of links from that domain goes up. One hundred links from one domain might carry only the total weight of single links from 10 different domains. One thousand links from one domain might not add any additional weight at all.

Our experimentation suggests that anchor text is not treated quite the same way, particularly if the links point to different pages on your site. In other words, those multiple links may communicate relevance through their anchor text in a way that is closer to linear. More links might not mean more importance, but the editorial vote regarding the topic of a page, expressed through the anchor text, remains interesting data for the search engines even as the link quantity increases.

Think about the number of unique linking domains as a metric for your site, or for any site you are evaluating. If you have a choice between getting a new link from a site that already links to you as opposed to getting a new link from a domain that currently does not link to you, go with the latter choice nearly every time.
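
As a rough illustration of this diminishing-returns behavior, the toy scoring function below values the first link from a domain fully and each additional link from the same domain progressively less, while tracking unique linking domains as its own metric. The curve (and its cap) is invented to match the rough numbers above; it is not a known search engine formula.

    import math
    from collections import Counter

    def domain_link_value(n_links):
        """Toy diminishing-returns curve: 1 link -> 1.0, 100 links -> 10.0,
        capped so that going from 100 to 1,000 links adds nothing further."""
        return min(math.sqrt(n_links), 10.0) if n_links else 0.0

    def site_link_score(linking_domains):
        """linking_domains: one entry per inbound link, naming its source domain."""
        per_domain = Counter(linking_domains)
        unique_domains = len(per_domain)               # the metric to track
        total_value = sum(domain_link_value(n) for n in per_domain.values())
        return unique_domains, total_value

    print(site_link_score(['example.com'] * 100))                  # (1, 10.0)
    print(site_link_score(['site%d.com' % i for i in range(100)])) # (100, 100.0)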

1.3. Source diversity

Getting links from a range of sources is also a significant factor. We already discussed two aspects of this: getting links from domains you do not own, and getting links from many different domains. However, there are several other aspects to consider.

For example, perhaps all your links come from blogs that cover your space. This ends up being a bit unbalanced. You can easily think of other types of places where you could get links: directories, social media sites, university sites, media websites, social bookmarking sites, and so on.

You can think about implementing link-building campaigns in many of these different sectors as diversification. There are several good reasons for doing this.

One reason is that the search engines value this type of diversification. If all your links come from a single class of sites, the reason is more likely to be manipulation, and search engines do not like that. If you have links coming in from multiple types of sources, that looks more like you have something of value.

Another reason is that search engines are constantly tuning and tweaking their algorithms. If you had all your links from blogs and the search engines made a change that significantly reduced the value of blog links, that could really hurt your rankings. You would essentially be hostage to that one strategy, and that’s not a good idea either.
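
If you want to monitor this kind of diversification in your own link profile, a simple report that buckets inbound links by source type and flags over-concentration is enough to start with. The source categories and the 80% threshold below are arbitrary assumptions chosen for illustration.

    from collections import Counter

    def diversity_report(link_sources, concentration_threshold=0.8):
        """link_sources: one label per inbound link, e.g. 'blog', 'directory',
        'news', 'university', 'social bookmarking'."""
        counts = Counter(link_sources)
        total = sum(counts.values())
        shares = {source: n / total for source, n in counts.items()}
        top_source, top_share = max(shares.items(), key=lambda item: item[1])
        warning = None
        if top_share >= concentration_threshold:
            warning = ("%.0f%% of links come from %s sites; an algorithm change "
                       "devaluing that class would hit hard" % (top_share * 100, top_source))
        return shares, warning

    shares, warning = diversity_report(['blog'] * 85 + ['news'] * 10 + ['directory'] * 5)
    print(shares)    # {'blog': 0.85, 'news': 0.1, 'directory': 0.05}
    print(warning)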

1.4. Temporal factors

Search engines also keep detailed data on when they discover the existence of a new link, or the disappearance of a link. They can perform quite a bit of interesting analysis with this type of data. Here are some examples:


When did the link first appear?

This is particularly interesting when considered in relationship to the appearance of other links. Did it happen immediately after you received that link from the New York Times?


When did the link disappear?

Some of this is routine, such as links in blog posts that start out on the blog's home page and then get relegated to archive pages over time. However, a link might instead disappear shortly after you rolled out a major new section on your site, which could be an entirely different type of signal.


How long has the link existed?

A search engine can potentially count a link for more, or for less, the longer it has been in place. Which way it leans could depend on the authority/trust of the site providing the link, or other factors.


How quickly were the links added?

Did you go from one link per week to 100 per day, or vice versa? Such drastic changes in the rate of link acquisition can also be a significant signal. Whether it is a good or a bad signal depends on context: if your site is featured in major news coverage, a spike could be good; if you start buying links by the thousands, it could be bad. Part of the challenge for the search engines is determining how to interpret the signal.
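
On your own side of the equation, you can at least track when your links are first discovered and watch the acquisition rate for drastic changes. The sketch below groups link discovery dates by week and flags sudden spikes; the ten-fold threshold is an arbitrary assumption, and how an engine would interpret such a spike is exactly the open question described above.

    from collections import Counter
    from datetime import date

    def weekly_velocity(discovery_dates):
        """discovery_dates: datetime.date objects, one per newly discovered link.
        Returns the number of links discovered per ISO week."""
        weeks = Counter(tuple(d.isocalendar()[:2]) for d in discovery_dates)
        return dict(sorted(weeks.items()))

    def flag_spikes(weekly_counts, factor=10):
        """Flags weeks whose count jumps more than `factor` times the previous
        week -- a drastic change that needs interpretation (news coverage vs.
        purchased links)."""
        weeks = sorted(weekly_counts)
        return [cur for prev, cur in zip(weeks, weeks[1:])
                if weekly_counts[prev] and weekly_counts[cur] > factor * weekly_counts[prev]]

    dates = [date(2011, 1, 3)] * 5 + [date(2011, 1, 10)] * 80
    print(flag_spikes(weekly_velocity(dates)))   # [(2011, 2)]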

1.5. Context/relevance

Although anchor text is a major signal regarding the relevance of a web page, search engines look at a much deeper context than that. They can look at other signals of relevance. Here are some examples of those:


Nearby links

Do the closest links on the page point to closely related, high-quality sites? That would be a positive signal to the engines, as your site could be seen as high-quality by association. Alternatively, if the two links before yours are for Viagra and a casino site, and the link after yours points to a porn site, that’s not a good signal.


Page placement

Is your link in the main body of the content? Or is it off in a block of links at the bottom of the right rail of the web page? Better page placement can be a ranking factor. This is also referred to as prominence, which has application in on-page keyword location as well.


Nearby text

Does the text immediately preceding and following your link seem related to the anchor text of the link and the content of the page on your site that it links to? If so, that could be an additional positive signal. This is also referred to as proximity.


Closest section header

Search engines can also look more deeply at the context of the section of the page where your link resides. This can be the nearest header tag, or the nearest text highlighted in bold, particularly if it is implemented like a header (two to four boldface words in a paragraph by themselves).


Overall page context

The relevance and context of the linking page are also factors in rankings. If your anchor text, surrounding text, and the nearest header are all related, that’s good. But if the overall context of the linking page is also closely related, that’s better still.


Overall site context

Last is the context of the entire site that links to you (or perhaps just the section of that site). For example, if hundreds of pages on the site are relevant to your topic and your link comes from a relevant page with relevant headers, nearby text, and anchor text, all of these add more to the impact than if there happens to be only one relevant page on the site.
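
To show how these signals might stack, here is a purely hypothetical scoring sketch in which each contextual signal contributes a weight, with the broader page-level and site-level matches worth more. The signal names and weights are invented for illustration; real engines combine far more factors in undisclosed ways.

    # Hypothetical weights for the contextual signals described above.
    CONTEXT_WEIGHTS = {
        'nearby_links_relevant': 1.0,
        'in_main_content': 1.5,          # page placement / prominence
        'nearby_text_relevant': 1.0,     # proximity
        'section_header_relevant': 1.0,
        'page_context_relevant': 2.0,
        'site_context_relevant': 3.0,
    }

    def context_score(signals):
        """signals: dict of signal name -> bool, describing one inbound link."""
        return sum(weight for name, weight in CONTEXT_WEIGHTS.items() if signals.get(name))

    print(context_score({'in_main_content': True,
                         'nearby_text_relevant': True,
                         'page_context_relevant': True}))   # 4.5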

1.6. Source TLDs

Indications are that search engines give no preferential treatment to certain top-level domains (TLDs), such as .edu, .gov, and .mil. It is a popular myth that these TLDs are an inherent positive ranking signal, but it would not make sense for search engines to treat them so simply.

Matt Cutts, the head of the Google webspam team, commented on this in an interview with Stephan Spencer (http://www.stephanspencer.com/search-engines/matt-cutts-interview):

There is nothing in the algorithm itself, though, that says: oh, .edu—give that link more weight.

And:

You can have a useless .edu link just like you can have a great .com link.

There are many forums, blogs, and other pages on .edu domains that spammers can easily manipulate to gain links to their sites. For this reason, search engines cannot simply grant a special level of trust or authority to a site because it is on an .edu domain. To prove this to yourself, search for buy viagra site:edu and see how spammers have infiltrated .edu pages.

However, it is true that .edu domains are often authoritative. This is a result of the link analysis that identifies a given college or university as a highly trusted site on one or more topics. The result is that a domain can be (and many are) authoritative on one or more topics in some sections of the site, while spammers actively abuse other sections of the same site.

Search engines deal with this problem by varying their assessment of a domain’s authority across the domain. The publisher’s http://yourdomain.com/usedcars section may be considered authoritative on the topic of used cars, but http://yourdomain.com/newcars might not be authoritative on the topic of new cars.

One technique that link brokers (companies that sell links) use is the notion of presell pages. These are pages on an authoritative domain for which the link broker has obtained the right to place and sell ad copy and links to advertisers. The link broker pays the domain owner a sum of money, or a percentage of the resulting revenue, to get control over these pages.

For example, the link broker may negotiate a deal with a major university enabling it to place one or more pages on the university’s website. The links from this page do have some of the inherent value that resides in the domain. However, the presell pages probably don’t have many (if any) links from other pages on the university site or from other websites. As with other forms of purchasing links, presell pages are considered spam by search engines, and pursuing these types of links is a high-risk tactic.

Ultimately, every site, every section of a site, and every page gets evaluated for the links it has, on a topic-by-topic basis. A certain link profile gives a page more authority on a given topic, making that page likely to rank higher on queries for that topic and making the links it passes to other websites on that topic more valuable.

Link and document analysis combine and overlap, resulting in hundreds of factors that can be individually measured and filtered through the search engine algorithms (the set of instructions that tells the engines what importance to assign to each factor). The algorithms then determine scoring for the documents and (ideally) list results in decreasing order of relevance and importance (rankings).

2. Determining a Link’s Value

Putting together a link campaign generally starts with researching sites that would potentially link to the publisher’s site and then determining the relative value of each potential linker. Although there are many metrics for evaluating a link, as we just discussed, many of those data items are hard for an individual link builder to determine (e.g., when a link was first added to a site).

It is worth taking a moment to outline an approach that you can use today without much in the way of specialized tools. Here are the factors you can look at:

  • The PageRank of the home page of the site providing the link. Note that Google does not publish a site’s PageRank, just the PageRank for individual pages. It is common among SEO practitioners to use the home page of a site as a proxy for the site’s overall PageRank, since a site’s home page typically garners the most links. You can also use the Domain mozRank from SEOmoz’s Linkscape tool to get a third-party approximation of domain PageRank.

  • The perceived authority of the site. Although there is a relationship between authority and PageRank, they do not have a 1:1 relationship. Authority relates to how the sites in a given market space are linked to by other significant sites in the same market space, whereas PageRank measures aggregate raw link value without regard to the market space.

    So, higher-authority sites will tend to have higher PageRank, but this is not absolutely the case.

  • The PageRank of the linking page.

  • The perceived authority of the linking page.

  • The number of outbound links on the linking page. This is important because the linking page can vote its passable PageRank for the pages to which it links, but each page it links to consumes a portion of that PageRank, leaving less to be passed on to other pages. This can be expressed mathematically as follows:

    For a page with passable PageRank n and with r outbound links:
    Passed PageRank = n/r

    This is a rough formula, but the bottom line is that the more outbound links a page has, the less valuable a link from that page will be.

  • The relevance of the linking page and the site.

Organizing this data in a spreadsheet, or at least being consciously aware of these factors when putting together a link-building campaign, is a must. For many businesses, there will be many thousands of prospects in a link campaign. With a little forethought you can prioritize these campaigns to bring faster results.
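
Here is a minimal sketch of that spreadsheet-style workflow: each prospect records the factors listed above, the passed PageRank is estimated with the rough n/r formula, and the rows are written to a CSV file sorted by a prioritization score. The score formula and field names are assumptions for illustration, not a known weighting.

    import csv
    from dataclasses import dataclass, asdict

    @dataclass
    class LinkProspect:
        site: str
        home_pagerank: float       # proxy for the site's overall PageRank
        page_pagerank: float       # PageRank of the would-be linking page
        perceived_authority: int   # your own judgment, e.g. 1-10
        outbound_links: int        # r: outbound links on the linking page
        relevance: int             # topical relevance, e.g. 1-10

        def passed_pagerank(self):
            # Rough n/r estimate from the list above.
            return self.page_pagerank / self.outbound_links if self.outbound_links else 0.0

        def score(self):
            # Hypothetical prioritization formula for sorting prospects.
            return (10 * self.passed_pagerank() + self.perceived_authority
                    + self.home_pagerank + 2 * self.relevance)

    prospects = [
        LinkProspect('example-blog.com', 5, 3, 6, 30, 8),
        LinkProspect('big-directory.com', 6, 4, 4, 400, 3),
    ]

    with open('link_prospects.csv', 'w', newline='') as f:
        writer = csv.DictWriter(f, fieldnames=list(asdict(prospects[0])) + ['score'])
        writer.writeheader()
        for p in sorted(prospects, key=lambda p: p.score(), reverse=True):
            writer.writerow({**asdict(p), 'score': round(p.score(), 2)})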
