
**Yes, I may be talking about Google here, but the essentials still count for Yahoo!, Bing and other search engines.

One of the questions I’m asked most often while wearing my SEO and Analytics hats is “How do I get my site to rank better in Google?”. To many users it seems to be the be-all and end-all, something of a Holy Grail.

Seeing your website listed in the top 10 on the Google front page for your sector is certainly something to aspire to, but it’s not something to live and die by.

There are certain best practices for promoting your site on Google. There’s no magic trick, and you can’t pay your way to the top (if you do, you won’t stay there long); you need to learn and do.

One of the key aspects of SEO is making sure that bots and spiders can crawl a site.

Search engines don’t just magically know that your content exists. Every search engine has what are called bots or crawlers, which scan sites for new content and report it back to the search engine.

Your site and the content on it have to be crawler-accessible. If your site isn’t accessible to bots, you might as well forget about search engine ranking.
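If you want to sanity-check this yourself, here’s a minimal sketch in Python (standard library only, assuming Python 3) that asks a site’s robots.txt file whether the GoogleBot is allowed to fetch a given page. The example.com addresses are placeholders for your own site and pages.

# A minimal sketch of checking crawler access against a site's robots.txt.
# The domain and page below are placeholders, not real addresses.
from urllib import robotparser

parser = robotparser.RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()  # download and parse the robots.txt file

# Ask whether the Googlebot user agent may crawl a given page
if parser.can_fetch("Googlebot", "https://www.example.com/some-page/"):
    print("Googlebot is allowed to crawl this page")
else:
    print("Googlebot is blocked, so this page won't be found by a crawl")

If that check comes back blocked, your robots.txt is telling the crawlers to stay away, and no amount of great content will get the page ranked.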

Google makes it remarkably easy to find information on how to help the GoogleBot take a look at your site. Think about it: it’s their business. If the GoogleBot isn’t looking at your site then Google really isn’t doing its job.

Your first stop (if you haven’t listened to me tell you this before) should be the Google Webmaster Tools site – http://www.google.com/webmasters – to submit your site and its sitemap, and to check the site’s crawler access.
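If you don’t have a sitemap to submit yet, a basic one is just an XML file listing your pages. Here’s a minimal sketch of generating one in Python; the URLs are placeholders for your own pages, and the finished file is what you’d upload to your site and then submit through Webmaster Tools.

# A minimal sketch of building a basic XML sitemap to submit to Google.
# The page addresses are placeholders; list your own URLs instead.
import xml.etree.ElementTree as ET

pages = [
    "https://www.example.com/",
    "https://www.example.com/about/",
    "https://www.example.com/blog/",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page

# Write the sitemap to disk, ready to upload and submit
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)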

You should take a look at this Google Slideshow too. It tells you what to do and what not to do to help crawlers do their job.
