[UPHPU] throttling bots and spiders
phpninja at gmail.com
Fri Feb 22 15:06:13 MST 2008
Well, one idea could be using a mix of an XMLHttpRequest object and PHP. You
would have to rely on the user's interaction to trigger the session when they
visit the page, possibly with an onclick handler. That would assure it was a
real user who visited (because they physically clicked the page). It might
be handled with <body onclick="CheckSession();">. CheckSession() would
determine whether the user has clicked the page yet. You would probably
need some kind of counter variable to stop the check after one click,
because you don't need to keep re-checking the body on every click. Or
you could just reload the page with a header() call so it no longer includes
the CheckSession() hook once the session is known to be valid. Just some ideas.
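A minimal sketch of the server side of that idea, assuming a hypothetical AJAX endpoint named check_session.php (the endpoint name and the confirmed parameter are assumptions, not anything from the original post):

```php
<?php
// check_session.php -- hypothetical endpoint hit by the onclick-triggered
// XMLHttpRequest. The page itself never calls session_start(); only this
// endpoint does, so a session is created only after a physical click.

// Decide whether to start a session: the client must have confirmed
// human interaction (the onclick handler's request sends confirmed=1).
function shouldStartSession(array $request): bool
{
    return isset($request['confirmed']) && $request['confirmed'] === '1';
}

if (shouldStartSession($_GET)) {
    session_start();            // bots that never click never reach this line
    $_SESSION['human'] = true;
    echo 'session started';
} else {
    echo 'no session';
}
```

The page would then carry <body onclick="CheckSession();"> where CheckSession() fires an XMLHttpRequest at this endpoint with confirmed=1, so a spider that only fetches the HTML never triggers session_start().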
On 2/22/08, Wade Preston Shearer <lists at wadeshearer.com> wrote:
> > Do you want them to crawl, but just get throttled? Or no spider?
> Sorry, I didn't ask my question well. They are free to crawl as much
> as they would like. I don't want to assign them a session if they are a
> bot/spider. I am looking for a way (besides manually maintaining a
> user-agent list) to automatically disqualify them from getting a
> session assigned to them.