I always use .htaccess, which, if you use "Redirect permanent" (as Randall pointed out), works fine with Google. I've done a few reorgs, and I find Google quickly picks up on things.
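For anyone who hasn't set this up before, here's a minimal sketch of what that looks like in .htaccess (the paths are made up for illustration):

```apache
# Old URL on the left, new location on the right.
# Sends a 301 (permanent) redirect, which Google follows and indexes.
Redirect permanent /old-section/page.html /new-section/page.html
Redirect permanent /old-section/ /new-section/
```

The 301 status is what tells the search engines to transfer the old URL's standing to the new one, rather than treating it as a temporary move.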
I'm sooooo fed up with other bots that two weeks ago I rewrote our robots.txt so Google is the only one allowed. Unfortunately, it turns out that Google's image bot is too stupid to realize that it's barred, and their tech "support"'s reply to my bug report was "explicitly exclude it". [ we need a "grumble, grumble, grumble" smiley ]
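For reference, a robots.txt along those lines would look something like this (my sketch, not the poster's actual file), with the image bot barred by name as the support reply suggested:

```
# Allow only Google's main crawler.
User-agent: Googlebot
Disallow:

# Googlebot-Image reportedly ignores the catch-all rule below,
# so it has to be excluded explicitly.
User-agent: Googlebot-Image
Disallow: /

# Everyone else: keep out.
User-agent: *
Disallow: /
```

More specific `User-agent` records take precedence over the `*` record, which is why the image bot needs its own entry.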
Mandi, excellent point about the custom 404 page! We've still got a few external links that are wrong, so I regularly monitor that, and our 404 has direct links to all the moved pages.
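If you're on Apache, wiring up a custom 404 page is a one-liner in .htaccess (the filename here is just an example):

```apache
# Serve our own page, with links to the moved content,
# whenever a URL isn't found.
ErrorDocument 404 /404.html
```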
The easiest way I know of to create a site map is to do a word grep (including subdirs) of "title" from the root dir of your local site image, redirect the output to a file, then use a text transformation tool to change the contents of each title tag pair and filename into a link. If you have the right utilities, it takes about 5 minutes to set up the first time, then just a few seconds to re-run.
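The original pipeline was grep plus a text transformation tool; here's a rough Python equivalent of the same idea (filenames and output format are my own choices, not the poster's actual setup):

```python
import os
import re

def build_sitemap(root):
    """Walk `root`, pull the <title> out of each .html file, and
    return one "<li><a ...>" line per page -- the same
    grep-then-transform idea, in one script."""
    links = []
    for dirpath, _dirs, files in os.walk(root):
        for name in sorted(files):
            if not name.endswith(".html"):
                continue
            path = os.path.join(dirpath, name)
            with open(path, encoding="utf-8", errors="replace") as f:
                match = re.search(r"<title>(.*?)</title>", f.read(),
                                  re.IGNORECASE | re.DOTALL)
            if match:
                # Link target is the file's path relative to the site root.
                href = os.path.relpath(path, root)
                links.append('<li><a href="/%s">%s</a></li>'
                             % (href, match.group(1).strip()))
    return "\n".join(links)
```

As the poster says, the mechanical part is quick; curating the result is where the time goes.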
That gives you a simple and complete one, but the real time-consuming part is deciding what to exclude and how to lay it out.
MPaul, I'm sure you'd totally make Barry's day by letting his spider onto your site. Barry, I might even let you spider mine some day, purely to help you stress-test it (we're pushing 700 pages). We don't have a site map (our navigation is too easy), but we do have a complete "species list" that I used to generate using the technique above.