Is browser and bot whitelisting a practical approach?

Posted by Sn3akyP3t3 on Pro Webmasters
Published on 2012-06-22T19:25:31Z

With blacklisting, plenty of time goes into monitoring events to uncover undesirable behavior and then taking corrective action. I would like to avoid that daily drudgery if possible. I'm thinking whitelisting would be the answer, but I'm unsure whether that is a wise approach given its deny-all, allow-only-a-few nature; my fear is that eventually someone out there will be blocked unintentionally. Even so, whitelisting would also block plenty of undesired traffic to pay-per-use services such as the Google Custom Search API, as well as preserve bandwidth and my sanity.

I'm not running Apache, but I assume the idea would be the same: I would essentially be depending on the User-Agent header to determine who is allowed to visit.
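
Since the web server isn't specified, here is a minimal, server-agnostic Python sketch of what such a User-Agent whitelist check might look like. The pattern list is a placeholder and would have to cover every browser and bot you actually want to allow; keep in mind the header is trivially spoofed, so this only filters honest clients.

    import re

    # Placeholder whitelist of User-Agent patterns; a real list would need to
    # cover every browser and bot you intend to allow.
    ALLOWED_AGENT_PATTERNS = [
        re.compile(pattern, re.IGNORECASE)
        for pattern in (
            r"Firefox/\d+",
            r"Chrome/\d+",
            r"Safari/\d+",
            r"Googlebot",
            r"bingbot",
        )
    ]

    def is_whitelisted(user_agent: str) -> bool:
        """Allow the request only if the User-Agent matches a known pattern."""
        if not user_agent:
            return False  # deny-all default: no header, no access
        return any(p.search(user_agent) for p in ALLOWED_AGENT_PATTERNS)

    if __name__ == "__main__":
        print(is_whitelisted("Mozilla/5.0 (Windows NT 6.1; rv:13.0) Gecko/20100101 Firefox/13.0"))  # True
        print(is_whitelisted("UnknownScraper/1.0"))  # False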

I've tried to take accessibility into account, because some web browsers are geared toward users with disabilities, although I'm not aware of any specific ones at the moment.

I fully understand that the site shouldn't depend on whitelisting alone to stay out of harm's way; other protections still need to be in place. I intend to have a honeypot, a checkbox CAPTCHA, OWASP ESAPI, and a blacklist of previously known bad IP addresses.
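
For what it's worth, here is a rough Python sketch of how two of those layers, the IP blacklist and the honeypot, could work. The network ranges and the form field name are placeholders only; the checkbox CAPTCHA and OWASP ESAPI are separate components not shown here.

    import ipaddress

    # Placeholder set of previously observed bad networks; in practice this
    # would be loaded from a file or database maintained over time.
    BLACKLISTED_NETWORKS = [
        ipaddress.ip_network("203.0.113.0/24"),    # documentation range, placeholder
        ipaddress.ip_network("198.51.100.42/32"),  # single known-bad host, placeholder
    ]

    def is_blacklisted(remote_addr: str) -> bool:
        """True if the client address falls inside any previously flagged network."""
        addr = ipaddress.ip_address(remote_addr)
        return any(addr in network for network in BLACKLISTED_NETWORKS)

    def honeypot_tripped(form_data: dict) -> bool:
        """A honeypot field is hidden from humans with CSS; bots that fill it in
        give themselves away. 'website_url' is just an example field name."""
        return bool(form_data.get("website_url", "").strip())

    if __name__ == "__main__":
        print(is_blacklisted("203.0.113.7"))                  # True
        print(is_blacklisted("192.0.2.1"))                    # False
        print(honeypot_tripped({"website_url": "spam.com"}))  # True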
