
Re: SSL on PerlMonks

by perl-diddler (Chaplain)
on Sep 12, 2017 at 22:39 UTC ( #1199238 )

in reply to SSL on PerlMonks


What's the benefit for a public discussion site? I know the deficits: it makes things slightly more complicated, slower, and likely harder to cache for proxies, but I've never really understood the benefits. It seems like asking whether or not the perl man pages should be encrypted in transit. I don't see a clear benefit given the deficits.

Can someone enlighten me?


Replies are listed 'Best First'.
Re^2: SSL on PerlMonks
by haukex (Abbot) on Sep 13, 2017 at 07:18 UTC
      Geez...that's more of an example of Google's "Let's be evil" new behavior than a case for using https:

      I know that using https seriously impaired the caching on my home squid-cache. On sites that switched to https, caching fell to zero, since the proxy only sees an opaque CONNECT tunnel and can't cache anything inside it.

      It used to be that, depending on the site and site-type, I might get a 20-30% speed boost from my home cache, mostly from lowering the number of requests for common items like icons, style sheets and pictures. On news sites, I saw as much as 30+%.

      I've restored some or most of that by using an SSL-bump proxy to decrypt & store... visiting a few news sites & looking at my caching rates: 25% (396/1557) requests were served via local cache, with 21% (20MB/94MB) of the traffic-by-bytes. Since I don't have Gigabit fiber @ home, that saves a noticeable chunk of time.
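      For anyone curious, the SSL-bump setup described above amounts to something like the following squid.conf fragment (a minimal sketch for squid 3.5+; the certificate and helper paths are illustrative, not the poster's actual config):

```
# Intercept HTTPS with a locally-trusted CA so responses can be cached.
http_port 3128 ssl-bump \
  cert=/etc/squid/squid-ca.pem \
  generate-host-certificates=on \
  dynamic_cert_mem_cache_size=4MB

# Helper that mints per-host certificates signed by the local CA
# (squid 4 ships this as security_file_certgen; older releases call it ssl_crtd).
sslcrtd_program /usr/lib/squid/security_file_certgen -s /var/lib/squid/ssl_db -M 4MB

# Peek at the TLS client hello, then bump (decrypt) everything else.
acl step1 at_step SslBump1
ssl_bump peek step1
ssl_bump bump all

# Allow large objects (CD/DVD images) to be cached, per the 2GB limit mentioned below.
maximum_object_size 2 GB
cache_dir ufs /var/spool/squid 20000 16 256
```

      Note that clients must trust the local CA certificate for this to work, which is exactly the "security is lowered" trade-off discussed in this thread.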

      My housemate noticed a major speed boost on You-Tube previews -- before, about 20 seconds/page; after, less than 2-3 seconds/page (mostly from the way it pages forward & back amongst the static preview images).

      One of the best examples, which I've hit more than once, is downloading large CD and/or DVD images from large SW vendors, since my max disk-cache object size is 2GB. One time I pulled down a 700+MB image from Microsoft -- *TWICE* -- having forgotten about the previous download nearly 2 months before. I couldn't figure out how I could download such a large file @ 200-300MB/s -- until I found it had been served from cache, and I eventually found the previous place I'd downloaded it to.

      My main gripe is that this appears to be more about tracking 'traffic' and 'hits' than about security, which is actually *lowered*: with so many sites switching to HTTPS, more proxy users are forced to decrypt it. Before, HTTPS marked "sensitive" sites -- financial and maybe medical -- but now it covers "casual reading" of news and social sites. To continue caching and work-place monitoring of web usage, decoding https seems like it's becoming a requirement. ;-(

      Anyway, no preference, for me, which way this site goes given my proxy, but for those who don't have such -- probably no big deal on this site (given that it's mostly text) anyway...

        than about security, which is actually *lowered*

        WAT? So, using https is making the web less secure? Care to elaborate?

Re^2: SSL on PerlMonks
by erix (Parson) on Sep 13, 2017 at 05:43 UTC

    Passwords can be more easily stolen.
