Google, MSN, Yahoo spiders crawling off my 'database universe'?

I recently figured out how to create a fairly complex Google Sitemap
file and am happy to share this code with anyone who asks. As I have a
highly nested database, a common URL for me will look something like:

The spiders come to my website, 'crawl' along, and increment these ID
sequences, which eventually puts them into:

and since that URL has gone off the edge of my database universe, my
exception_notification plugin sends me an email.
Is there a way to put logic somewhere so that when a spider (or a
person poking around) requests a URL that isn't there, a routine
kicks in, renders a view telling them "You've gone off the edge of my
database universe", and then takes them back to where they were?
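In a Rails app the usual place for this is a `rescue_from ActiveRecord::RecordNotFound` handler in `ApplicationController`. The sketch below models the same pattern outside Rails so it runs standalone; the record names, the `show` action, and the flash message are hypothetical stand-ins, not anything from the original post.

```ruby
# Sketch of an "off the edge of the database universe" guard.
# In a real Rails app this would be:
#   rescue_from ActiveRecord::RecordNotFound, with: :off_the_edge
# in ApplicationController. Everything below is a hypothetical stand-in.

class RecordNotFound < StandardError; end

# Stand-in for the database: only IDs 1..3 exist.
RECORDS = { 1 => "Mercury", 2 => "Venus", 3 => "Earth" }

def find_record(id)
  # fetch's block runs only when the key is missing
  RECORDS.fetch(id) { raise RecordNotFound, "no record with id=#{id}" }
end

# Controller-style action: look the record up; if the requester has
# wandered off the edge, send them back where they came from with a
# friendly notice instead of letting the exception escape.
def show(id, referer)
  { status: 200, body: find_record(id) }
rescue RecordNotFound
  { status: 302,
    location: referer,
    notice: "You've gone off the edge of my database universe" }
end
```

With a handler like this, a request for a nonexistent ID turns into a redirect back to the referer rather than an unhandled exception, so exception_notification stays quiet for this class of error.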

Thank you for any thoughts you may offer.

Well, you can set up a proper robots.txt and disallow certain paths
from being crawled on your site. That is the simplest way to put
restrictions on spiders.
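For illustration, a robots.txt at the site root might look like the following (the `/planets/` path is a hypothetical example, not from the original post). Note that robots.txt is advisory: well-behaved crawlers honor it, but it doesn't stop a person typing URLs by hand.

```
User-agent: *
Disallow: /planets/
```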

Hope this helps


Dhaval Parikh
Software Engineer