Originally Posted by phppete
Even if a valid user agent is returned, it still wouldn't maintain a session, so the token set when the form loads would not match the one checked when the form is submitted, which means the form isn't sent. So far I am having a 100% success rate for my clients against spam on web forms. One site was getting about 30 spam submissions a day; after updating to the above methodology it stopped completely.
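The session-token check phppete describes can be sketched roughly as follows. This is a Python illustration with hypothetical function names, not his actual PHP code: a token is stored in the server-side session when the form page loads and compared on submit, so a bot that posts directly without holding a session fails the check.

```python
import hmac
import secrets

def issue_token(session: dict) -> str:
    """Called when the form page loads: generate a one-time token,
    store it in the server-side session, and embed it in the form
    as a hidden field."""
    token = secrets.token_hex(16)
    session["form_token"] = token
    return token

def verify_token(session: dict, submitted) -> bool:
    """Called on submit: the posted token must match the one stored
    in the session. A bot that never maintained a session (or got a
    fresh empty one) has nothing stored, so it fails here."""
    expected = session.pop("form_token", None)  # one-time use
    if expected is None or submitted is None:
        return False
    return hmac.compare_digest(expected, submitted)

# A real browser loads the form (receiving a token) and posts it back:
browser_session = {}
token = issue_token(browser_session)
print(verify_token(browser_session, token))        # matches

# A bot posting directly, with no session, has no stored token:
print(verify_token({}, "whatever-it-guessed"))     # fails
```

The `hmac.compare_digest` call is a detail worth keeping even in a sketch: it compares the tokens in constant time, so an attacker can't learn the token byte-by-byte from response timing.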
Your method, which sounds similar in effect to a CAPTCHA, will stop the spam from getting through. However, that doesn't necessarily mean it stops the spamming; there is another side to it.
Once a URL is loaded into a botnet, it stays there for a long time (months). If you change the script that is receiving the spam, or even remove it, the botnets will keep pounding on it anyway, even though the attempts are no longer successful. The botmaster's resources are plentiful, and one URL more or less doesn't matter to him. I've seen sites that never even had comments or trackbacks enabled receive lots of spam traffic.
The server still has to deal with the (attempted) spamming: either the script has to be executed, or, if it was removed or disabled, a 404 or 403 response has to be sent.
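Since the botnet traffic won't stop, the practical goal is to make each rejected request as cheap as possible. A minimal WSGI sketch (the paths are assumed examples, not from the original post) that answers retired spam targets with a bare 404 before any database or framework work is done:

```python
# Paths of removed/disabled scripts that botnets still pound on.
# These names are hypothetical examples.
RETIRED_PATHS = {"/old-comment-form.php", "/trackback.php"}

def app(environ, start_response):
    """Tiny WSGI app: short-circuit known dead spam targets with a
    zero-body 404 before doing any expensive work, and fall through
    to normal handling for everything else."""
    path = environ.get("PATH_INFO", "/")
    if path in RETIRED_PATHS:
        # Cheapest possible rejection: static status, empty body,
        # no session, no database, no template engine.
        start_response("404 Not Found", [("Content-Type", "text/plain"),
                                         ("Content-Length", "0")])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"normal page\n"]
```

In practice the same idea is often pushed even earlier, into the web server or a firewall rule, so the application process is never started at all for those URLs.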