SSL on PerlMonks

by sapadian (Initiate)
on Aug 07, 2017 at 17:28 UTC ( #1196914=monkdiscuss )

Does anyone know if there are plans to encrypt PerlMonks site traffic with a secure certificate?

Replies are listed 'Best First'.
Re: SSL on PerlMonks
by marto (Bishop) on Aug 07, 2017 at 17:35 UTC

    I believe talks are underway with Pair, the host, regarding deployment of Let's Encrypt certs.

      That's great news! Thanks marto.

        In the meantime, use https and just store the exception for the “wrong” pair.com/pairsite.com cert.

Re: SSL on PerlMonks
by Anonymous Monk on Aug 07, 2017 at 17:50 UTC
    In the meantime https://perlmonks.pairsite.com/

      Just a note: that certificate will likely be removed soonish, together with the URL (I think). Instead, https://perlmonks.org (well, bsd_glob '{www.,}perlmonks.{com,net,org}', plus css.perlmonks.org) should be the ones with a working/proper certificate. If I understand how we can set up serving different HTTPS certificates with Pair.com, perlmonks.pairsite.com will remain; but my current understanding is that we can only have one certificate.
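      (For the curious, that glob shorthand really does expand to the six hostnames: File::Glob's default flags include GLOB_BRACE and GLOB_NOMAGIC, so the expanded patterns come back as-is even though no files by those names exist. A quick illustration:)

          use File::Glob qw(bsd_glob);

          # Brace expansion plus GLOB_NOMAGIC returns the expanded
          # patterns themselves; no matching files are required.
          print "$_\n" for bsd_glob '{www.,}perlmonks.{com,net,org}';

          # www.perlmonks.com
          # www.perlmonks.net
          # www.perlmonks.org
          # perlmonks.com
          # perlmonks.net
          # perlmonks.org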

Re: SSL on PerlMonks
by Corion (Pope) on Sep 13, 2017 at 07:51 UTC

    There is now (well, since Sunday) a certificate installed on two of our three IP addresses. So, statistically, you should be able to visit https://perlmonks.org successfully in two out of three requests.
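    (If you want to check which of the addresses already serves the new certificate, a quick probe along these lines should work; this is just a diagnostic sketch using IO::Socket::SSL, not part of the site's actual setup:)

        use strict;
        use warnings;
        use IO::Socket::SSL;
        use Socket qw(inet_ntoa);

        # Resolve all A records, then attempt a verified TLS handshake
        # against each address individually, sending SNI for the site.
        my (undef, undef, undef, undef, @addrs) = gethostbyname('perlmonks.org');
        for my $ip (map { inet_ntoa($_) } @addrs) {
            my $ssl = IO::Socket::SSL->new(
                PeerHost          => $ip,
                PeerPort          => 443,
                SSL_hostname      => 'perlmonks.org',   # SNI
                SSL_verifycn_name => 'perlmonks.org',
            );
            printf "%-15s %s\n", $ip,
                $ssl ? 'certificate verified'
                     : "handshake failed: $IO::Socket::SSL::SSL_ERROR";
        }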

    I still consider this functionality in the beta stage because we haven't addressed some of the open issues:

    • Roll out the certificate to the last machine which holds onto the pairsite.com certificate
    • Eliminate the protocol prefix from all site-internal links

    • Set up an automatic renewal process (basically done thanks to Crypt::LE; a sketch of such a run follows this list), but untested since it has only been 5 days since creation
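    For reference, a renewal run with Crypt::LE looks roughly like the module's documented synopsis. The sketch below is not our actual renewal job; the file names and the challenge directory are made up:

        use strict;
        use warnings;
        use Crypt::LE ':errors';

        my $le = Crypt::LE->new();
        $le->load_account_key('account.pem') == OK or die $le->error_details;
        $le->load_csr('perlmonks.csr')       == OK or die $le->error_details;
        $le->register                        == OK or die $le->error_details;
        $le->accept_tos                      == OK or die $le->error_details;
        $le->request_challenge               == OK or die $le->error_details;

        # Publish each token under .well-known/acme-challenge/ so that
        # Let's Encrypt can fetch it over plain http.
        $le->accept_challenge(sub {
            my $c = shift;
            my $path = "/var/www/.well-known/acme-challenge/$c->{token}";
            open my $fh, '>', $path or return 0;
            print $fh "$c->{token}.$c->{fingerprint}";
            close $fh;
            return 1;
        }) == OK or die $le->error_details;

        $le->verify_challenge    == OK or die $le->error_details;
        $le->request_certificate == OK or die $le->error_details;
        print $le->certificate;   # PEM, ready to install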

    There are also some other things that I consider issues but don't consider blocking:

    • PerlMonks will not send CORS headers for other sites, so including http:// resources like JavaScript from other sites will fail.

      This could be addressed by hosting some of the more interesting JavaScript on PerlMonks or by making the needed resources available through https://.
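      (As a crude illustration of the second option, here is a hypothetical one-off pass that turns a hard-coded http:// script source into a protocol-relative one; it assumes the third-party host actually serves https as well:)

          use strict;
          use warnings;

          # Hypothetical: rewrite hard-coded http:// script sources to
          # protocol-relative URLs so they also load on https pages.
          my $html = '<script src="http://example.com/lib.js"></script>';
          $html =~ s{src="http://}{src="//}g;

          print "$html\n";   # <script src="//example.com/lib.js"></script>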

      Thanks very much for this, Corion; another beer I owe you ;)

Re: SSL on PerlMonks
by perl-diddler (Hermit) on Sep 12, 2017 at 22:39 UTC
    Why?

    What's the benefit for a public discussion site? I know of the deficits: it makes things slightly more complicated, slower, and likely harder for proxies to cache, but I've never really understood the benefits. It seems like asking whether the perl man pages should be encrypted in transit. I don't see a clear benefit given the deficits.

    Can someone enlighten me?

    Tnx...

        Geez...that's more of an example of Google's "Let's be evil" new behavior than a case for using https:

        I know that using https seriously impaired the caching on my home squid cache: on sites that switched to https, my hit rate fell to zero.

        It used to be that, depending on the site and site type, I might get a 20-30% speed boost from my home cache, mostly by lowering the number of requests for common items like icons, style sheets and pictures. On news sites, I saw as much as 30+%.

        I've restored some or most of that by using an SSL-bump proxy to decrypt & store. Visiting a few news sites and looking at my caching rates: 25% (396/1557) of requests were served from the local cache, along with 21% (20MB/94MB) of the traffic by bytes. Since I don't have gigabit fiber at home, that saves a noticeable chunk of time.
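        (For anyone wanting to try the same, the relevant knobs in squid.conf look something like the sketch below; this assumes Squid 3.5+ and its standard peek-and-bump setup, the cert path is a placeholder, and your clients must trust the bump CA:)

            http_port 3128 ssl-bump \
                cert=/etc/squid/bump_ca.pem \
                generate-host-certificates=on \
                dynamic_cert_mem_cache_size=4MB
            acl step1 at_step SslBump1
            ssl_bump peek step1
            ssl_bump bump all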

        My housemate noticed a major speedup on YouTube relating to the previews: before, about 20 seconds/page; after, less than 2-3 seconds/page (mostly in the way it paged forward & back amongst the static preview images).

        One of the best examples, which I've hit more than once, is downloading large CD and/or DVD images from large SW vendors, since my max disk-cache object size is 2GB. One time I pulled down a 700+MB image from Microsoft *TWICE*, having forgotten about the previous download nearly 2 months before. I couldn't figure out how I was downloading such a large file at 200-300MB/s, until I found it had been served from cache, and I eventually found where I'd put the previous download.

        My main gripe is that this appears to be more about tracking 'traffic' and 'hits' than about security, which is actually *lowered*: with so many sites switching to HTTPS, more proxy-using sites are forced to decrypt it. Before, HTTPS marked "sensitive" sites, financial and maybe medical, but now it covers casual reading of news and social sites. To continue caching and workplace monitoring of web usage, decoding https seems like it's becoming a requirement. ;-(

        Anyway, no preference for me which way this site goes, given my proxy; but for those who don't have one, it's probably no big deal on this site (given that it's mostly text) anyway...

      Without it, passwords can be more easily stolen.
