using HTTP::PROXY instead of SQUID as a company firewall

by schweini (Friar)
on Nov 07, 2005 at 09:22 UTC [id://506260]

schweini has asked for the wisdom of the Perl Monks concerning the following question:

Salve!
Somebody asked me to implement a porn-filtering firewall for their company. I know that Squid has a lot of features for that kind of thing, but I recently played around with HTTP::Proxy and liked the fact that I could use my Perl knowledge to implement filters and rules. So my initial idea is to simply block all outbound traffic on the masquerading router and only allow HTTP traffic via a Perl proxy, which filters everything, blocking pages that contain any blacklisted words or live on a blacklisted domain. Is this feasible, or am I overlooking something obvious? I know that no filter will ever be perfect, but would this work for a company of about 20 employees, running on Pentium II or III hardware over a roughly 512 kbps link?
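
Something along these lines is what I have in mind. It's only a minimal sketch: the port and the word/domain lists are invented, and if I read the HTTP::Proxy docs right, a request filter can short-circuit a request by setting the proxy's response. Note also that body filters see the page in chunks, so a blacklisted word split across two chunks would slip through:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use HTTP::Proxy;
    use HTTP::Proxy::HeaderFilter::simple;
    use HTTP::Proxy::BodyFilter::simple;
    use HTTP::Response;

    # Invented blacklists, for illustration only.
    my %blocked_domains = map { $_ => 1 } qw( some-bad-site.example );
    my @blocked_words   = qw( badword1 badword2 );
    my $word_re         = join '|', map quotemeta, @blocked_words;

    my $proxy = HTTP::Proxy->new( port => 8080 );

    # Refuse requests to blacklisted domains outright.
    $proxy->push_filter(
        request => HTTP::Proxy::HeaderFilter::simple->new(
            sub {
                my ( $self, $headers, $message ) = @_;
                my $host = eval { $message->uri->host } || '';
                if ( $blocked_domains{$host} ) {
                    $self->proxy->response(
                        HTTP::Response->new(
                            403, 'Forbidden',
                            [ 'Content-Type' => 'text/plain' ],
                            "Blocked by company policy.\n"
                        )
                    );
                }
            }
        ),
    );

    # Replace HTML pages that contain a blacklisted word.
    $proxy->push_filter(
        mime     => 'text/html',
        response => HTTP::Proxy::BodyFilter::simple->new(
            sub {
                my ( $self, $dataref ) = @_;
                $$dataref = "<html><body>Blocked by company policy.</body></html>"
                    if $$dataref =~ /$word_re/i;
            }
        ),
    );

    $proxy->start;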

Replies are listed 'Best First'.
Re: using HTTP::PROXY instead of SQUID as a company firewall
by vagnerr (Prior) on Nov 07, 2005 at 10:27 UTC
    The difficulty is not configuring Squid, or even HTTP::Proxy, to filter out keywords or domains. That's easy. The real problem is maintaining the restriction list, and that is a can of worms and a nightmare all rolled into one. Even the professional companies, such as NetNanny, Websense, or Secure Computing, don't always get that right.


    _______________________________________________________
    Remember that amateurs built Noah's Ark. Professionals built the Titanic.

      The professional companies do it for everyone, while he'll be doing this for his own small company.

      Part of the job of maintaining the list can be done with yet another Perl script that looks at the daily proxy logs, shows only the new URLs (the already-seen ones being split between the blacklist and the whitelist), and lets one decide which list each new one belongs to. After the first few painful weeks, you'd only get a few new sites a day, most of them probably harmless. Having a starting point for the blacklist would surely help.
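
      A minimal sketch of such a log-review helper (the list file names are invented, and the URL extraction is deliberately naive; adjust it to whatever format the proxy actually logs):

          #!/usr/bin/perl
          # Print hosts seen in the proxy log that are on neither list yet,
          # most frequent first, so a human can assign them to a list.
          use strict;
          use warnings;
          use URI;

          my %known;
          for my $file (qw( blacklist.txt whitelist.txt )) {
              open my $fh, '<', $file or next;
              while ( my $host = <$fh> ) {
                  chomp $host;
                  $known{$host} = 1 if length $host;
              }
          }

          my %new;
          while (<>) {    # feed the proxy's access log on STDIN
              my ($url) = m{(https?://\S+)} or next;
              my $host = eval { URI->new($url)->host } or next;
              $new{$host}++ unless $known{$host};
          }

          print "$_\t$new{$_}\n"
              for sort { $new{$b} <=> $new{$a} } keys %new;

      Run it as "perl review_log.pl access.log" and append each printed host to one of the two list files.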

      Using Regexp::Assemble could probably boost the performance of matching against the blacklist and whitelist.
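
      For example, something like this (list entries invented) collapses many entries into a single assembled regex instead of looping over them for every request:

          use strict;
          use warnings;
          use Regexp::Assemble;

          # Combine all blacklist entries into one efficient regex.
          my $ra = Regexp::Assemble->new;
          $ra->add( quotemeta $_ ) for qw( badsite.example worse.example );
          my $re = $ra->re;

          my $host = 'www.badsite.example';
          print "blocked\n" if $host =~ /$re/;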

Re: using HTTP::PROXY instead of SQUID as a company firewall
by Anonymous Monk on Nov 07, 2005 at 10:49 UTC

    Instead of reinventing the wheel, I would suggest a slightly different solution: use squidGuard (Google for it) along with Squid. squidGuard uses regular expressions and domain/URL lists to filter web access. IIRC, it is a Perl solution. An added bonus is that Squid is a very well-tested and reliable product. A minimal configuration sketch follows below.

    Mahesh
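
    For reference, here is a minimal squidGuard configuration sketch (all paths, list files, and the redirect URL are illustrative, not canonical):

        # /etc/squid/squidGuard.conf (illustrative)
        dbhome /var/lib/squidguard/db
        logdir /var/log/squidguard

        dest porn {
            domainlist porn/domains
            urllist    porn/urls
        }

        acl {
            default {
                pass !porn all
                redirect http://intranet/blocked.html
            }
        }

    Squid of that era is then pointed at it in squid.conf (Squid 2.x syntax):

        redirect_program /usr/local/bin/squidGuard -c /etc/squid/squidGuard.conf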
