View Full Version : Best way to redirect
03-25-2004, 01:19 PM
I'm underway with my website redesign, and I'm almost finished. Since so much has changed (page names, extensions, locations), I'll need to redirect users (and bots). So I'm looking for the best redirection method, one that would work well for redirecting more than 50 pages.
Right now I know 3 ways of redirecting:
1) With the meta tag:
<meta http-equiv="refresh" content="0;url=http://www.mydomain.com/newpage.php">
2) With a PHP header() redirect
3) With .htaccess
I also don't want to make Google mad :o
03-25-2004, 01:49 PM
There are several good approaches - but I wanted to add that whenever I do this, I always make sure my custom 404 page makes reference to the recent redesign and points straight to a current site map, and also features an "Email us if you still can't find something" email link.
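In Apache, pointing the server at a custom 404 page is a one-liner in .htaccess (the /404.php path below is just a placeholder, use whatever your error page is actually called):

```apache
# Serve a custom page for any missing URL (path is a placeholder)
ErrorDocument 404 /404.php
```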
03-25-2004, 02:03 PM
Thanks Mandi, I was thinking about doing a custom 404 to help users. But about the sitemap, is there any script that can automatically create a sitemap for me, or do I have to do it myself?
03-25-2004, 02:24 PM
I believe there are some scripts that will spider your site and produce some sort of map - but I really prefer to do it by hand. Here's one I did that works well:
An improvement to that would be to use named anchors (<a name="...">) plus a # fragment in the URL, so the person could jump straight to the item on the appointed page.
03-25-2004, 02:27 PM
Hey, I almost missed my cue! I wrote a script that could create a site map for you, but I have to run it from my own computer. It's still under development, but it's usable. Here's a sample from a few months ago: http://www.polisource.com/PublicMisc/LinkStructor_Site_Map.html
The boxed parts on the bottom are easy to edit out if you don't want them. I'm looking for volunteers who will let me test it on their web sites, but I'm pretty aware of its abilities and issues.
03-25-2004, 02:55 PM
Is that the spider you've been talking about?
Well, I'll do both. I'll do a sitemap by hand now, because some parts of my redesign are still missing. And when I'm finished with everything, you can try your spider on my site :)
I'll give you the OK when everything's done (in about 2-3 weeks).
03-25-2004, 03:06 PM
Yeah, that's the spider. In two weeks it will probably obey robots.txt (forgot to mention it doesn't yet) and pretty up maps of sites that have a bunch of directories with only one file in each. It's surprising how many websites have that.
03-25-2004, 04:24 PM
A redirect permanent (http://httpd.apache.org/docs/mod/mod_alias.html#redirect) in .htaccess will tell Google that the page has been moved.
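For example (the paths and domain here are only placeholders):

```apache
# 301 each moved page to its new address, one line per page
Redirect permanent /oldpage.html http://www.mydomain.com/newpage.php
Redirect permanent /articles/old.html http://www.mydomain.com/articles/new.php
```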
03-25-2004, 04:54 PM
I always use .htaccess, which, if you use "Redirect permanent" (as Randall pointed out), works fine with Google. I've done a few reorgs, and I find Google quickly picks up on things.
I'm sooooo fed up with other bots, that two weeks ago I rewrote our robots.txt so Google is the only one allowed. Unfortunately, it turns out that Google's image bot is too stupid to realize that it's barred, and their tech "support"'s reply to my bug report was "explicitly exclude it". [ we need a "grumble, grumble, grumble" smiley ]
Mandi, excellent point about the custom 404 page! We've still got a few external links that are wrong, so I regularly monitor that, and our 404 has direct links to all the moved pages.
The easiest way I know of to create a site map is to do a word grep (including subdirs) of "title" from the root dir of your local site image, redirect the output to a file, then use a text transformation tool to change the contents of each title tag pair and filename into a link. If you have the right utilities, it takes about 5 minutes to set up the first time, then just a few seconds to re-run.
That gives you a simple and complete one, but the real time consuming part is :) deciding what to exclude, and how to lay it out.
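As a rough sketch of that grep-and-transform step (assuming a local copy of the site under site/ and one-line <title> tags, which is my setup, not necessarily yours):

```shell
# Hypothetical sketch: emit one HTML list item per page, linking each
# .html file under DIR and using its <title> text as the link text.
# Assumes each page's <title>...</title> fits on one line.
sitemap_from_titles() {
    dir=$1
    find "$dir" -name '*.html' | while read -r f; do
        # pull the text between <title> and </title>
        title=$(sed -n 's:.*<title>\(.*\)</title>.*:\1:p' "$f")
        # strip the local directory prefix so the href is site-relative
        printf '<li><a href="/%s">%s</a></li>\n' "${f#"$dir"/}" "$title"
    done
}

# usage: sitemap_from_titles site > sitemap-body.html
```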
MPaul, I'm sure you'll totally make Barry's day by letting his spider onto your site :) Barry, I might even let you spider mine some day, purely to help you stress-test it (we're pushing 700 pages). We don't have a site map (our navigation is too easy), but we do have a complete "species list" that I used to generate using the technique above.
03-25-2004, 06:02 PM
In two years my site has grown from 12 to over 80 pages. Google has always kept up well without any special prompting, though I seldom abandon file names; I branch and link new pages from existing ones. This keeps other, non-search-engine external links intact as well.
The several times I have used redirects, I've used meta-tag redirects, which seem to take effect much faster, and Google has handled those just fine for me.
When you move content and while Google is still pointing to the old page, the redirect will still send visitors to the new page from Google--it shouldn't matter. Google always catches up on indexing quickly though PageRank takes a little longer.
Just be sure the new location is generously linked everywhere using the new file name and Google will reindex at its usual speedy pace.
03-25-2004, 06:06 PM
I might even let you spider mine, some day, purely to help you stress-test it
1.4 MB of stress test from over a month ago: http://www.polisource.com/PublicMisc/zzzzRESULTS_exploratorium.html
I disabled the links. I figure people don't like to have their websites experimented on.
You could see that a lot of directories contain only one file. I determined a while back that those are true directories and not path data, but I don't remember which website I determined that for. They make for a sloppy, space-wasting site map (when made with LinkStructor), so one of the next things I'll do is have such single files grouped together, under the name of the next higher directory (or lower... I keep forgetting the terminology).
I think the kind of site map that LinkStructor creates would be considered a complete "species list." I'll eventually add a feature to only map to a certain directory depth.
03-25-2004, 06:16 PM
But most importantly, I have found a direct correlation between simplicity and use. People just exit quickly from complex maps. Use of my site map nearly doubled when I simplifed it, even though it gives fewer details.
03-25-2004, 09:56 PM
You could see that a lot of directories contain only one file.
With my website redesign I've actually used a lot more folders. Each category gets its own folder with its own "/images/" folder, "/data/" folder and an index page. Each category becomes like a sub-website. In the images folder of the www root I put general images (like headers, footers...) used by the whole website, while the images folder of each sub-website holds images specific to that category.
So that explains why a lot of my directories have 1 file. I find it actually makes things more organised. And one day I could let a friend manage one part of my site, and this way there wouldn't be anything forcing me to give him access to the whole site :)
And for redirecting, I think I'll go with the .htaccess method since I won't need to keep the 50 or more files there. It'll all be in the .htaccess. Oh, my next question, is there a limit on the .htaccess file? Would putting 50 redirection commands in it be a problem?
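For reference, if the old and new names map to each other by a pattern, a single RedirectMatch line can stand in for many Redirect lines (the pattern below is purely illustrative):

```apache
# e.g. every old .html page whose name survived the redesign as a .php page
RedirectMatch permanent ^/(.*)\.html$ http://www.mydomain.com/$1.php
```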
03-31-2004, 05:15 PM
03-31-2004, 05:17 PM
Don't know how it got there, but ignore that target bit in the above code.
03-31-2004, 05:27 PM
I thought the "target" part opened the redirect in a new browser window...
03-31-2004, 05:37 PM
My response was munched by using the board's php button; the code should have read:
header('HTTP/1.1 301 Moved Permanently'); // permanent, so Google updates its index
header('Location: http://www.mynewsite.com/'); // must be an absolute URL, or whatever your new address is
exit; // this should not be needed as you've gone
The caveat is that there must be no output to the client before these calls; headers must be sent before any data.
vBulletin® v3.6.8, Copyright ©2000-2013, Jelsoft Enterprises Ltd.