Are you asking for proof that HTTPS makes web caching more difficult? I thought that was self-evident, but I forget that not everyone notices or measures such things. I've measured my own cache hit rates off and on for over 15 years, running my own proxy at home. To anyone paying attention to cache hit rates across their machines, it should be pretty evident that when you hit an SSL site, you see the start of an SSL session at the beginning of contact and an end-of-session at the end. You no longer see any individual objects to cache.
You have to break apart the SSL stream in order to cache the objects. If you set up your own Squid proxy you'll see this -- nothing subjective about it. That's just for content caching and speedup. Many private network operators (companies & institutions) also want to see what is accessed so they can control their networks. Blocking the net entirely isn't an option for most such operations, so they go the route of opening up the streams.
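For the curious, "opening up the streams" in Squid means SSL bumping. A minimal sketch of what that looks like in squid.conf (the paths, port, cache sizes, and helper location here are assumptions for illustration; the directive names come from Squid's SSL-bump support in 3.5 and later):

```
# squid.conf excerpt -- sketch only, adjust paths/port for your install
# Listen with ssl-bump enabled, signing forged site certs with our own CA
http_port 3128 ssl-bump \
  cert=/etc/squid/bumpCA.crt key=/etc/squid/bumpCA.key \
  generate-host-certificates=on dynamic_cert_mem_cache_size=4MB

# Peek at the TLS ClientHello first, then bump (decrypt) everything
acl step1 at_step SslBump1
ssl_bump peek step1
ssl_bump bump all

# Helper that mints the per-site certificates on the fly
sslcrtd_program /usr/lib/squid/security_file_certgen -s /var/lib/ssl_db -M 4MB
```

Once the stream is bumped, Squid sees ordinary HTTP objects again and the usual caching rules apply -- which is exactly the point being made above.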
If they allow people to bring their own devices in, they can easily tell them they need to install a new root cert to allow auditing of content, to comply with whoever sets the regulations & laws for them. Users of the proxy need to be aware that while no one is specifically reading the content of their traffic, their destinations and contact points are logged, and anyone accessing sensitive sites might not want to do it on premises.
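The root cert those users are asked to install is trivial to produce; generating one for the proxy is a single openssl command (filenames and the subject line here are purely illustrative):

```shell
# Create a CA key and self-signed root certificate for the bumping proxy.
# Users then import bumpCA.crt into their device's trust store.
openssl req -new -newkey rsa:2048 -days 365 -nodes -x509 \
  -keyout bumpCA.key -out bumpCA.crt \
  -subj "/CN=Example Proxy CA"
```

That low barrier is part of why this has become routine rather than exotic.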
I've been on the Squid list for 15 years or more, and it's easy to see an uptick in conversations related to SSL bumping and solving the related problems. The information in the Squid wiki has become very detailed about how to set this up, thanks to the number of those conversations. I don't think anyone is claiming that this is done "covertly", but it is becoming more "routine".
I can't see who would want to fund a study showing that increased HTTPS usage is associated with increased SSL bumping, so if you're looking for such stats, you might have to do your own research.
While many people may have been able to use semi-public computers at libraries and such for HTTPS sites 10-20 years ago with some privacy, now I wouldn't be so sure. But feel free to regard that as subjective. Everyone has their own level of comfort.
I've seen more than one case of companies and ISPs having root certs when, officially, they weren't supposed to. In most of those cases, the problematic access was said to have been closed, and the problem was brushed under the carpet... er, solved. Right. And with the US policing agencies having been caught multiple times with their hands in our traffic, do you think they'll have trouble copping a root cert these days, with every social site using HTTPS? Before, when HTTPS traffic was 99+% banks and the like for most users, spying on it might have raised a few more eyebrows, but today?