Thursday, June 17, 2010

How To Increase Blog Traffic

1: Find Your Blog Sitemap

A blog sitemap is an XML file listing every link available on your blog. By default most blogging platforms, including WordPress, Blogger, and Movable Type, come with an RSS (or Atom) feed preinstalled on your blog. A feed is made with XML too, and Google accepts it as a sitemap, so there is no setup, it’s already there!
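Not sure where your feed lives? The exact URL depends on your platform, but on a typical install it will look something like one of these (replace blogname with your own domain and double-check that the address loads in your browser):

http://www.blogname.com/feed/ (WordPress)
http://blogname.blogspot.com/feeds/posts/default (Blogger)
http://www.blogname.com/atom.xml (Movable Type)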

2: Load Your Blog Sitemap Into Google Webmaster Tools

Visit the Google Webmaster Tools home page and log in to your Google account. If you don’t have a Google account, register one real quick.

Once logged in to the Tools dashboard you will see a text box at the top allowing you to add your blog.

Next you will need to verify your blog with Google. Click the “Verify your site” link and follow the instructions given by Google.
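Google usually gives you a choice between uploading a small HTML file or adding a verification meta tag to your blog’s template. If you go with the meta tag, it looks roughly like the line below, pasted inside the head section of your template, with the content value being the unique code from your own account:

<meta name="google-site-verification" content="your-unique-code-here" />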

Once you have verified your blog, click the “Sitemaps” menu item on the left-hand side and click the “Add A Sitemap” link.

Select “Add General Web Sitemap” when asked to choose a type. You will then see a text box to enter your RSS feed URL, like this:
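For example, on a typical WordPress blog the feed URL you enter would look something like:

http://www.blogname.com/feed/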

3: Add Your Robots.txt File

A robots.txt file is a small text file placed on your blog’s web server that tells search engines what they are allowed, and not allowed, to index on your blog. Most blogs come with a robots.txt file preinstalled. Check to see if your blog has one by visiting the following URL on your blog:

http://www.blogname.com/robots.txt

Just replace blogname with the name of your site. We are going to create a new robots.txt file, so it doesn’t matter if your blog does not currently have one.
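If you prefer the command line, you can also check for an existing file with a quick curl request instead of opening the URL in your browser:

curl http://www.blogname.com/robots.txt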

Create a text file named robots.txt on your desktop. Enter the following code into your newly created file:

User-agent: *
Disallow:
Sitemap: http://www.blogname.com/sitemap.xml

User-agent: * – tells all search engines they are allowed to crawl your site.
Disallow: – indicates which URLs to exclude from being crawled by search engines. Since we are not entering a value to the right, we are telling all search engines to crawl everything on your blog (see the example after this list for how to exclude a folder).
Sitemap: http://www.blogname.com/sitemap.xml – tells the search engines where your sitemap is located. Use the sitemap (feed) URL from the previous tip.
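As an example of Disallow in action, the version below keeps search engines out of one folder while still letting them crawl everything else. The /wp-admin/ path is just an illustration; substitute whatever directory you actually want excluded:

User-agent: *
Disallow: /wp-admin/
Sitemap: http://www.blogname.com/sitemap.xml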

Save your file and upload it to the root directory of your web server. Google looks for a robots.txt file on your blog about once a day. You can verify that Google finds your file by visiting the following Webmaster Tools section:

Dashboard > Tools > Analyze robots.txt
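If you are not sure how to handle the upload step and you manage your own server, a command-line copy like the one below works; the username, host, and web root here are placeholders, so substitute your own (most shared hosts also let you upload through FTP or a file manager):

scp robots.txt user@blogname.com:/var/www/html/robots.txt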

That’s it! Now all search engines will know where your sitemap is located from your robots.txt file. The major benefit of this is to help search engines find your new posts and links more easily.

Google Webmaster Tools offers some really great statistics about how Google crawls and indexes your blog. Make sure you poke around in the other sections.

