How often do search bots surf your site?

This is a question I’m frequently asked by my clients, as well as by people who just want help with the subject. This article explains why bots/spiders surf your website and why that can be a real benefit for you. Spiders/bots are important to the success of your website. Look at AGn Designs, for example: I’m a PR5 with over 10,000 hits per day thanks to bots and word of mouth. This article should break down why those numbers matter.

Sites are crawled by search engine spiders daily, or even several times per day. These crawls happen because the sites are either very popular or they frequently update their content. If a website has new content on a regular basis, search engines visit often to see what’s changed. If a website is updated only once or twice per year, for example, a search engine won’t be encouraged to visit it very often, because chances are nothing has changed since the last visit.

Not all crawls are the same, and this is something to keep in mind. A spider will visit a homepage, or even a handful of homepages, just to see if there are any changes since its last visit and, if so, to get an idea of their magnitude. If a major change has been made, such as a redesign or new pages being added, then what’s known as a ‘deep crawl’ will come later on. The homepage could be crawled four times a week, while the first-level interior pages are crawled only about once per week. The second- or even third-level pages may only be crawled once or twice per month.
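The idea of crawl depth can be sketched as a simple breadth-first walk from the homepage. Here’s a minimal illustration in Python; the site map and page names are made-up examples, not a real crawler:

```python
from collections import deque

def crawl_depths(site_map, start):
    """Breadth-first walk of a site map, recording how many clicks
    each page is from the start page (its 'crawl depth')."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for link in site_map.get(page, []):
            if link not in depths:          # visit each page only once
                depths[link] = depths[page] + 1
                queue.append(link)
    return depths

# Hypothetical site: homepage links to two sections, which link deeper.
site_map = {
    "/": ["/about", "/blog"],
    "/blog": ["/blog/post-1", "/blog/post-2"],
    "/blog/post-1": ["/blog/post-1/comments"],
}

depths = crawl_depths(site_map, "/")
# Homepage is depth 0, /blog is a first-level page at depth 1,
# the posts sit at depth 2, and their comments at depth 3.
```

Pages at depth 2 or 3 are exactly the “second or third level” pages that spiders tend to revisit least often.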

Do not mistake the frequency of being crawled for high search engine rankings. There are many sites that haven’t been updated in years that still rank well. Ranking is primarily based on what your site has to offer and how well it’s offered.

A trick for many of my clients, and even those who just ask for my help, is making sure every possible thing that can be standards compliant is. From XHTML/HTML to CSS to RSS feeds and more, cleaner code means a spider can read more of the actual material instead of wading through countless div and br tags thrown in to fix something you didn’t personally understand.
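Since XHTML is required to be well-formed XML, you can sanity-check a fragment with any XML parser. A quick sketch in Python; the fragments are just examples, and note this checks well-formedness only, not full validation against the XHTML DTD:

```python
import xml.etree.ElementTree as ET

def is_well_formed(fragment):
    """Return True if an XHTML fragment parses as well-formed XML."""
    try:
        ET.fromstring(fragment)
        return True
    except ET.ParseError:
        return False

clean = "<div><p>Hello, spiders!</p></div>"
messy = "<div><p>Hello, spiders!<br></div>"   # unclosed tags, HTML-style <br>

print(is_well_formed(clean))  # True
print(is_well_formed(messy))  # False
```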

When it comes to ranking well on search engines, there are really only two things to focus on.

  • Get Crawled: You want search engines to crawl your website. It’s NOT a bad thing if half of your hits are coming from bots, trust me.
  • Improve Rankings: Proper code, frequent spider crawls, networking, and associating yourself with other sites of the same nature. Back links are a must!

Having your website crawled by a spider is not a challenge. Take blogging, for example. With WordPress, the script used exclusively on all of my domains, every time I post an entry I’m able to ping 50+ sites to announce that I’ve updated my website. These ping sites are visited by spiders, which see my link, follow it back to me, confirm that I’ve updated, and then continue to come back to me time and time again. In the meantime, users who frequent these ping sites will also see my link. This is what one would call back links (some would call it link baiting), because I’m being linked from websites with fairly high rankings, thanks to their steady stream of bots and human visitors.
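Under the hood, the ping itself is a tiny XML-RPC call to the weblogUpdates.ping method, which takes your site’s name and URL. A minimal sketch in Python; the blog name and URL are placeholders, and an actual ping would POST this payload to a ping service such as Ping-O-Matic:

```python
import xmlrpc.client

# Build the XML-RPC request body for a weblogUpdates.ping call.
# WordPress fires one of these automatically for every service listed
# under Settings -> Writing -> Update Services when you publish a post.
payload = xmlrpc.client.dumps(
    ("My Example Blog", "http://example.com/"),   # placeholder name and URL
    methodname="weblogUpdates.ping",
)

# To actually send it, you would call the ping service, e.g.:
#   server = xmlrpc.client.ServerProxy("http://rpc.pingomatic.com/")
#   server.weblogUpdates.ping("My Example Blog", "http://example.com/")
```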

What does this mean for me? It’s quite simple, really. I’m currently a PageRank 5 on Google. Based on my site-wide statistics, 30% of my daily visitors are bots/spiders. I’m also lucky enough that when someone types Krissy into Google, I’m literally the first site listed. Why? Because spiders visit frequently, and now I’m known as the number one Krissy on the net, by Google’s standards. It’s important to remember that keywords play an important role in your website’s success!
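A bot-percentage figure like that comes from looking at the User-Agent strings in your server’s access log. Here’s a rough sketch of how you could compute it yourself in Python; the sample log entries and the list of bot markers are simplified examples:

```python
def bot_percentage(user_agents):
    """Estimate what share of hits came from crawlers by looking for
    common bot markers in each User-Agent string."""
    markers = ("bot", "spider", "crawler", "slurp")   # e.g. Googlebot, Yahoo! Slurp
    hits = len(user_agents)
    bot_hits = sum(
        1 for ua in user_agents
        if any(m in ua.lower() for m in markers)
    )
    return 100.0 * bot_hits / hits if hits else 0.0

# Simplified sample of User-Agent strings pulled from a day's log:
sample = [
    "Mozilla/5.0 (Windows NT 10.0) Firefox/42.0",
    "Googlebot/2.1 (+http://www.google.com/bot.html)",
    "Mozilla/5.0 (compatible; Yahoo! Slurp)",
    "Mozilla/5.0 (Macintosh) Safari/537.36",
]

print(bot_percentage(sample))  # → 50.0
```

In practice you would feed in the User-Agent column from your real access log rather than a hand-written sample.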


A thirty-something code ninja + web diva. Former New Yorker who's passionate about web development, HTML/CSS, beautifying things and marketing.
