6. Tracking the Blogosphere
The blogosphere is a very dynamic part of the Web—always fresh and
constantly updated. News tends to break in the blogosphere, and on Twitter, before it hits mainstream media or the Web at large. Tracking
the blogosphere will help you stay current on both your industry and the
SEO industry. Major influencers tend to hang out in the blogosphere as
well as have their own blogs. Identifying these influencers is the first
step to reaching them.
Here are four major uses for tracking blogs:
Reputation monitoring
A significant component of reputation monitoring relates to
the discussion in this section about domain and brand mentions
across the Web. Reputation monitoring takes it one step further: you look to identify problems and risks to your reputation as they materialize on the Web.
For example, knowing when someone starts to bad-mouth your
brand somewhere on the Web is important. Tracking down those
issues early and addressing them quickly is a wise thing to
do.
Tracking buzz and public relations campaigns
A closely related activity is tracking buzz and the response
to your PR campaigns. When you make a major press push of some
sort or you succeed in getting a write-up about your website on an
influential blog, you should monitor the ripple effect across the
rest of the Web.
Using the type of brand tracking we’ve already discussed can
be a great way to do that. For example, did the number of mentions
you received on the Web this week increase significantly over the
number of mentions the week before?
Identifying potential influencers
Identifying influencers is a key part of link building. Establishing yourself as a
recognized expert is an important part of your content strategy.
Part of that is creating world-class content, but this won’t mean
much unless you get the word out somehow. A great way to do that
is to reach out and develop relationships with the key
influencers.
Competitive analysis
If you can do this type of research for your own site, why
not do it for your competitors’ sites? You can see reputation
problems that are developing for them, or see the impact of their
media campaigns as they roll them out.
7. Tracking Your Blog(s)
Blogs offer many benefits to the online marketer. They are a great way to position yourself, or the team behind your site, as experts, and they let you reach major influencers and a broad audience at the same time. This can become a nice source of links as part of your link-building campaign.
7.1. Blog subscribers
Measuring the number of subscribers to your blog is one basic
way to monitor the blog's progress. One way to do this is to create your RSS feed with FeedBurner and point subscribers at that feed instead of the one that comes with your blog software. When users subscribe to the FeedBurner-based RSS feed, you can get much more granular statistics on your subscriber base, as shown in Figure 26.
As you can see in the figure, you get data on the growth of your readership over time, as well as a breakout of your subscribers showing which feed readers they are using. Be aware that although this measurement is fairly imprecise, it can still give you an idea of where you stand. In Figure 26, for example, Bloglines is used by about 10% of this blog's readers.
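If you want to capture these numbers programmatically rather than reading them off the FeedBurner reports, FeedBurner's Awareness API can return daily subscriber counts. The following is a minimal sketch, assuming the API has been enabled for the feed under the Publicize tab and that the endpoint, the dates parameter, and the circulation and hits attribute names match the API's documentation of the time; the feed name and date range are placeholders to replace with your own.

# Minimal sketch: pull daily subscriber counts from the FeedBurner
# Awareness API. Assumes the API has been enabled for the feed
# (Publicize, then Awareness API) and that the endpoint and attribute
# names match the API documentation; verify before relying on them.
from urllib.request import urlopen
from xml.etree import ElementTree

FEED_NAME = "YourFeedName"  # hypothetical feed name; replace with your own
URL = ("https://feedburner.google.com/api/awareness/1.0/GetFeedData"
       "?uri={}&dates=2009-11-01,2009-11-30".format(FEED_NAME))

with urlopen(URL) as response:
    tree = ElementTree.fromstring(response.read())

# Each <entry> element carries one day's circulation (subscriber estimate) and hits.
for entry in tree.iter("entry"):
    print(entry.get("date"),
          "circulation:", entry.get("circulation"),
          "hits:", entry.get("hits"))

The circulation figures returned are the same estimates shown in the FeedBurner interface, so treat them with the caution described above.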
You can also get data on how many of the people subscribing to your blog, or to your competitors' blogs, are using Google Reader or Bloglines to do so. Figure 27 shows an example of how such a report looks in Bloglines.
The figure shows that 394 people are subscribing to Google Blogoscoped (http://www.blogoscoped.com) using Bloglines. You can
perform a similar check in Google Reader to see how many people are
subscribing to the blog using Google Reader. Because there are many different feed readers out there besides Google Reader and Bloglines, this data is limited in scope, but it still tells you something about how many people subscribe to your blog, and it is useful for competitive analysis.
Note:
Another way to take this analysis deeper is to look for
specific reader user agents in your logfiles. For example, you could
use this to find out how many My Yahoo! readers you have.
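As a rough illustration of that approach, here is a minimal sketch that tallies feed-reader user agents in a web server access log. The logfile path and the user agent substrings used to identify Google Reader (Feedfetcher-Google), Bloglines, and My Yahoo! (YahooFeedSeeker) are assumptions to adapt to whatever strings appear in your own logs.

# Minimal sketch: count feed requests by feed-reader user agent.
# The logfile path and the user agent substrings are assumptions;
# check your own logs for the exact strings each reader sends.
from collections import Counter

READERS = {
    "Feedfetcher-Google": "Google Reader",   # Google's feed crawler
    "Bloglines": "Bloglines",
    "YahooFeedSeeker": "My Yahoo!",
}

counts = Counter()
with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        for marker, reader in READERS.items():
            if marker in line:
                counts[reader] += 1
                break

for reader, hits in counts.most_common():
    print(reader, hits)

Keep in mind that aggregators such as Google Reader and Bloglines fetch a feed once on behalf of all of their subscribers, so these are fetch counts rather than subscriber counts; many aggregators report their subscriber totals inside the user agent string itself, which you could parse out as a refinement.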
7.2. Blog links
There are a few ways to extract data on the number of links to
your posts. One is to use Technorati to see the number of Reactions (i.e., links) each post has received.
You can also use either Google Webmaster Tools or Bing Webmaster Tools
to track the number of links to your blog.
As we outlined in the discussion on link tracking, you can use
these tools to get link data on your website. You can also take a more
granular look at your data. For example, if you have a blog at
http://www.yourdomain.com/blog, you can pull a
report from Google Webmaster Tools (or your tool of choice)
and sort your spreadsheet on the URLs of the pages receiving the
links. When you are done with this, your data might look something
like Table 2.
You can then find where the links pointing to the blog begin and end in the sorted spreadsheet and see how many total links you have. For example, if the blog's links begin after row 8,437 and the last one is found on row 12,367, the total number of links to the blog is 12,367 – 8,437 = 3,930. That gives you a raw count of inbound links to the blog. To go further, you can also start analyzing which parts of the blog have the most links.
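If the export is large, a short script can do the same counting for you. The sketch below assumes a CSV export with the linked-to page in a column named "Your pages"; the filename, the column name, and the blog URL are illustrative assumptions, since exports differ from tool to tool.

# Minimal sketch: count inbound links that point at the blog section of a
# site, using a CSV export of link data. The filename and the "Your pages"
# column name are assumptions; adjust them to match your report.
import csv
from collections import Counter

BLOG_PREFIX = "http://www.yourdomain.com/blog"

total = 0
links_by_page = Counter()
with open("links_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        target = row["Your pages"]            # assumed column name
        if target.startswith(BLOG_PREFIX):
            total += 1
            links_by_page[target] += 1

print("Total links to the blog:", total)
for url, count in links_by_page.most_common(10):
    print(count, url)

This reproduces the raw count from the spreadsheet subtraction above and also surfaces which blog pages attract the most links.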
If you want additional data on the links to your blog, you can sign up to use Enquisite Optimizer, which will additionally allow you to track the traffic coming into your site from those links. You can organize it to track on a per-URL
basis, on a per-domain basis, or even on a custom group of URLs or
domains as a single item. Better still, you can also track the sales
that result from such links.
8. Search Engine Robot Traffic Analysis
Understanding how robots are spidering your site is another skill the expert SEO practitioner should have. For one thing, spidering frequency is a clue as to which pages on your site have the highest PageRank and trust, because Google crawls the Web in reverse PageRank order. Crawl data can also help you detect spidering problems on your site.
You can use the tools we discuss in this section to help you find
potential spidering issues, and analyze how important the search engines
consider your content to be. You should be looking for clues of SEO
problems, such as robots.txt
blocking the crawlers, architectural problems, or even signs of a
penalty (as might be signified by a big drop in crawling
frequency).
However, it is important to know that this data will not tell you
everything. For example, the fact that a web page was crawled does not
mean it will be placed in the index. The page will still need to pass
some additional tests to achieve such placement (such as the presence of
unique content and enough links to justify its inclusion).
For pages that are indexed, you can look at how often the spiders
visit/crawl your pages versus how often the engine actually shows a new
version of your page in the index. To do this, look at the last cached
date the search engine reported, and compare it with your crawling
data.
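Before turning to a full log analysis package, you can get a basic view of crawl activity with a short script over your raw access logs. The sketch below assumes an Apache-style log with the timestamp in square brackets and identifies Googlebot by a simple user agent substring; the logfile path, log format, and user agent string are assumptions to adapt to your own environment.

# Minimal sketch: tally Googlebot requests per day from an Apache-style
# access log to spot changes in crawling frequency. The logfile path,
# log format, and user agent substring are assumptions. A stricter check
# would verify Googlebot via reverse DNS, since user agents can be spoofed.
import re
from collections import Counter
from datetime import datetime

# Pulls the date portion (e.g., 10/Oct/2009) out of the bracketed timestamp.
DATE_RE = re.compile(r"\[(\d{2}/\w{3}/\d{4}):")

crawls_per_day = Counter()
with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        if "Googlebot" not in line:          # crude user agent match
            continue
        match = DATE_RE.search(line)
        if match:
            crawls_per_day[match.group(1)] += 1

for day, hits in sorted(crawls_per_day.items(),
                        key=lambda item: datetime.strptime(item[0], "%d/%b/%Y")):
    print(day, hits)

A sudden drop in these daily totals is exactly the kind of signal worth investigating.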
Tools that perform log analysis include Webtrends, Unica Affinium NetInsight,
and Lyris HQ
Web Analytics. These are well-known web analytics packages that
offer the option of analyzing your logfiles. Figure 28 shows a snapshot of a robot report
from NetInsight.
Note:
Google acquired Urchin in March 2005 (http://www.google.com/intl/en/press/pressrel/urchin.html).
Urchin’s JavaScript-based solution became Google Analytics. However,
the logfile analysis software version of Urchin continues to be
available as well (http://www.google.com/urchin/index.html).
A number of other logfile analysis programs are available as well.
8.1. Google Webmaster Tools
You can also get detailed information about spidering activity
on your own website using Google Webmaster Tools. Figure 29 shows a snapshot
for one site.
This provides a great visual snapshot. One question that emerges from this data, and that this publisher may want to consider, is why the time per page jumped from about 300 milliseconds to 650 milliseconds in mid-December. This may be the result of a change in the site architecture or a change in the hosting arrangements for the website. When you see this type of change, it can be a flag that something has happened with your website, and you should investigate it.