http://www.perlmonks.org?node_id=506272


in reply to using HTTP::PROXY instead of SQUID as a company firewall

The difficulty is not configuring Squid or even HTTP::Proxy to filter out keywords or domains. That's easy. The real problem is maintaining the restriction list, and that is a can of worms and a nightmare all rolled into one. Even the professional companies, such as NetNanny, Websense, or Secure Computing, don't always get that right.
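
To show the "easy" part, here is a minimal sketch using HTTP::Proxy: a request filter that answers 403 for hosts matching a blocked-hosts pattern. The port number and the pattern are assumptions for the example; in real use you'd build the pattern from your black list file.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use HTTP::Proxy;
    use HTTP::Proxy::HeaderFilter::simple;
    use HTTP::Response;

    # Hypothetical black list pattern -- build it from a file in real use
    my $blocked = qr/(?:^|\.)example\.com$/i;

    my $proxy = HTTP::Proxy->new( port => 8080 );
    $proxy->push_filter(
        request => HTTP::Proxy::HeaderFilter::simple->new(
            sub {
                my ( $self, $headers, $message ) = @_;
                # Setting a response here makes the proxy answer directly,
                # without contacting the remote server
                $self->proxy->response(
                    HTTP::Response->new( 403, 'Forbidden' ) )
                    if $message->uri->host =~ $blocked;
            }
        ),
    );
    $proxy->start;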


_______________________________________________________
Remember that amateurs built Noah's Ark. Professionals built the Titanic.

Re^2: using HTTP::PROXY instead of SQUID as a company firewall
by BooK (Curate) on Nov 07, 2005 at 10:56 UTC

    The professional companies do it for everyone, while he'll be doing this for his own small company.

    Part of the job of maintaining the list can be done with yet another Perl script that looks at the daily proxy logs and shows only the new URLs (the already-seen ones being split between the black and white lists) and lets one decide which list each belongs to. After the first few painful weeks, you'd only get a few new sites a day, most of them probably harmless. Having a starting point for the black list would surely help.
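
    A rough sketch of such a script, assuming Squid's native access.log format (requested URL in the seventh field) and hypothetical blacklist.txt/whitelist.txt files of already-classified hosts, one per line:

        #!/usr/bin/perl
        use strict;
        use warnings;

        # Load the hosts we have already classified
        my %seen;
        for my $list ('blacklist.txt', 'whitelist.txt') {
            open my $fh, '<', $list or next;
            while ( my $host = <$fh> ) {
                chomp $host;
                $seen{$host}++;
            }
        }

        # Squid's native access.log puts the requested URL in field 7
        my %new;
        while (<>) {
            my @field = split ' ';
            next if @field < 7;
            my ($host) = $field[6] =~ m{^(?:\w+://)?([^/:]+)};
            $new{$host}++ if defined $host and not $seen{$host};
        }

        # Show unseen hosts, most requested first, for manual sorting
        print "$new{$_}\t$_\n"
            for sort { $new{$b} <=> $new{$a} } keys %new;

    Run it as perl newhosts.pl /var/log/squid/access.log and append each reported host to whichever list it belongs to.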

    Using Regexp::Assemble could probably boost the black/white list performance.
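
    For instance (a sketch; the sample hosts and the anchoring are assumptions), Regexp::Assemble collapses a whole list into a single optimized regex, so each URL is tested with one match instead of one match per list entry:

        use strict;
        use warnings;
        use Regexp::Assemble;

        # Hypothetical black list entries
        my @black = qw( ads.example.com tracker.example.net casino.example.org );

        my $ra = Regexp::Assemble->new;
        $ra->add( quotemeta $_ ) for @black;
        my $black_re = $ra->re;    # one combined, optimized regex

        print "blocked\n" if 'ads.example.com' =~ /^$black_re$/;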